PLE & PLN – it’s us!

Open and Social Learning according to Alec Couros:

An open course entitled Education, Curriculum, and Instruction: Open, Connected, Social, delivered through the University of Regina using free and open source software, was implemented in 2008 (Couros, 2010, p. 109). It was based on personal learning networks, and participants quickly realised the value of sustainable knowledge networks. This led to a context built around a series of events which quickly drew participants into an engaged community of participation (Couros, 2010, p. 110).

The theoretical foundations of the course were:

The open movement (Couros, 2010, p. 111).

Complementary learning theories – social cognitive theory, social constructivism, andragogy, connectivism, and open teaching (Couros, 2010, pp. 112-115).

The primary learning environment was established collaboratively in the weeks preceding the course. The tools considered were:

WebCT (now Blackboard) – pros: familiar to students, and the university had a strong support infrastructure; cons: proprietary (modifications needed vendor support); directed learning favoured over constructivist approaches; expensive licensing fees.

Moodle – pros: free; open source; modifiable; strong community support; touts a constructivist and social constructivist approach; available. cons: needs PHP server infrastructure; requires technical expertise, leading to hidden costs; software not as available as hoped; course-centric rather than student-centric; top-down instructivist approach.

Ning – pros: ease of use; freely available in 2008; familiar functionality similar to Facebook; community and individual privacy levels; user-centric spaces; content aggregation; communication tools. cons: no wiki feature; awkward to add core content material.

Wikispaces – pros: the senior, best-known and most stable of the wiki providers; solid technical support; theme modification options; simple user interface – see http://eci831.ca/ (Couros, 2010, pp. 117-119).

The course required the establishment of a PLN, and it was mandatory that participants develop a personal blog/digital portfolio, participate in a collaborative wiki resource (no longer active, but it was located at http://t4tl.wikispaces.com; this is what happens when such a site is not paid for!) and complete a major digital project (sounds like INF 530!) (Couros, 2010, pp. 119-120).

The course was based on the following tools and interactions:

Synchronous activities: two events per week of between 1.5 and 2 hours in length; the first based on content knowledge (like our INF 537 colloquiums); the second on teaching skills (Couros, 2010, pp. 120-121).

Asynchronous activities: researching and blogging; shared bookmarking; artefact creation; participation in open professional development opportunities; creating content and uploading it to sites such as YouTube; microblogging; collaborative lesson design and contribution to the course wiki (Couros, 2010, pp. 121-122).

Knowledge Networks and Digital Innovation's forerunner? Just as in INF 530 and INF 536, students developed authentic, dynamic and fluid interactions both within the designated course spaces and in spaces they chose and shared themselves.

Defining Personal Learning Environments and comparing them to Personal Learning Networks was an exercise undertaken by Couros through Twitter and recorded at http://educationaltechnology.ca/couros/1156. Key agreement indicated that PLEs are the tools, artefacts, processes, and physical connections that allow learners to control and manage their learning (Couros, 2010, p. 125). PLNs explicitly include the human connections that result in the advancement and enabling of a PLE (Couros, 2010, p. 125).

Couros makes the following recommendations for those wishing to use PLNs for teaching and learning:

  • Immersion by participants
  • Social media literacy
  • Active contributions strengthen your PLN
  • Know your “followers” or “friends”
  • PLNs are central to learning for sustained and long-term growth in both facilitators and students (Couros, 2010, pp. 125-126).

The participatory learning communities developed by courses such as the one Couros describes continue to exist because they are not based around courses per se, but around communal learning (Couros, 2010, p. 127). Those of us taking the Knowledge Networks and Digital Innovation course can already attest to that: for many of us, the content of subjects we have already finished continues to be shared and discussed. If Couros is correct, this course will never have to end – now there's a challenge to my PLN!

Reference

Couros, A. (2010). Developing personal learning networks for open and social learning. In G. Veletsianos (Ed.), Emerging technologies in distance education (pp. 109-128). Edmonton, AB: AU Press.

Article Review

Intelligent Questionnaire Design for Effective Participant Evaluations, by Lisa Elias

Step 1: Before designing a survey, it is critical that the objectives are identified – what is to be achieved, and why is the survey necessary? This means considering the nature of the people who will be surveyed, those who will use the information, and the purpose of the task itself.

Step 2: Write the questions in a clear, well-thought-out manner based on the objectives outlined in Step 1. In this way, the data collected will be of high quality and applicable to the needs of all concerned (Elias, 2015, p. 8).

Ensure that questions:

Are clear and unambiguous

Are concise

Are neutrally worded

Avoid embarrassment – omit or minimise sensitive topics

Protect respondents' privacy

Select the question formats with the objectives clearly in mind. A mix of question types will elicit the best data (one way a mixed-format survey might be modelled is sketched after the list below).

Question types to consider:

Yes/no – quick response enabling simple comparisons

Multiple choice – only one selection or multiple selection?

Likert scales – demonstrate a rating per respondent on a common scale

Open-ended responses – time-consuming to analyse but rich qualitative data. Use sparingly.

Alternative responses – allow respondents to opt out or provide their own answer

Ordinal/ranking – a series of items that respondents are asked to rank (for example from 1 to 5, where 1 is most important and 5 least important; each number can only be used once)
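
As flagged above, here is a minimal Python sketch of how such a mix of question formats might be represented as a simple data structure. It is my own illustration rather than anything from Elias's article, and the question wording and options are invented:

```python
from dataclasses import dataclass, field
from typing import List

# Minimal sketch of a mixed-format survey. The question wording and options
# below are invented for illustration; they are not drawn from Elias (2015).

@dataclass
class Question:
    text: str
    qtype: str                    # "yes_no", "multiple_choice", "likert", "ranking", "open"
    options: List[str] = field(default_factory=list)
    allow_other: bool = False     # "alternative response" - respondent may supply an answer
    allow_multiple: bool = False  # multiple choice: one selection or several?

survey = [
    Question("Have you attended an online colloquium before?", "yes_no",
             options=["Yes", "No"]),
    Question("Which tools do you use for professional learning?", "multiple_choice",
             options=["Blogs", "Twitter", "Wikis", "Podcasts"],
             allow_multiple=True, allow_other=True),
    Question("The colloquium met my expectations.", "likert",
             options=["Poor (1)", "Fair (2)", "Average (3)", "Good (4)", "Excellent (5)"]),
    Question("Rank these features from 1 (most important) to 4 (least important).", "ranking",
             options=["Audio quality", "Chat", "Slides", "Recording access"]),
    Question("Any other comments?", "open"),  # open text used sparingly
]

for q in survey:
    print(f"[{q.qtype}] {q.text}")
```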

Format the survey by considering the most logical layout to achieve your aims. This avoids confusing the respondents and makes analysis easier.

Introduction – explain why the survey has been established and convince people that participation is valuable and worth their time and effort.

Order and group the questions according to the format you have deemed most logical.

Initial questions should be impersonal and easy to answer so that respondents continue.

Short is best (Elias, 2015, p. 9)…

But ensure the information will be adequate for the purpose.

Use contingency questions if applicable so that people do not have to answer questions irrelevant to them. A preliminary question should ascertain how many, if any, questions of the following set need to be tackled (a small skip-logic sketch follows this list).

Use a progress indicator for online surveys – it shows respondents how far they have to go.

Thank participants and provide your contact details.
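
To illustrate the contingency questions mentioned above, here is a short Python sketch of my own (the question wording is invented); it simply shows how a preliminary yes/no answer decides whether a follow-up set of questions is presented at all:

```python
# Hypothetical skip logic: a preliminary yes/no question decides whether a
# follow-up set of questions is shown. The question wording is invented.
def questions_to_ask(uses_social_media: bool) -> list:
    core = ["How satisfied are you with course communication overall?"]
    social_media_set = [
        "Which platforms do you use for course discussion?",
        "How often do you post about course content?",
    ]
    # Respondents who answered "no" to the preliminary question never see the
    # social media set, so they are not asked questions irrelevant to them.
    return core + (social_media_set if uses_social_media else [])

print(questions_to_ask(uses_social_media=False))
print(questions_to_ask(uses_social_media=True))
```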

Likert scales should (a short worked example appears after this list):

Be labelled, e.g. from Poor (1) to Excellent (5)

Consist of an odd-numbered scale so there is a mid-point – 5 or 7 options have proven best

Follow the same value pattern – either left to right or right to left

Use wording that allows for the full range of responses
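
To make these rules concrete, here is a brief Python sketch – mine, not Elias's – of a labelled five-point scale and one simple way to summarise responses to a single item (the response values are invented):

```python
from statistics import median, mode

# Hypothetical five-point scale following the advice above: labelled,
# odd-numbered so there is a mid-point, and running in one direction.
SCALE = {1: "Poor", 2: "Fair", 3: "Average", 4: "Good", 5: "Excellent"}

# Invented responses to a single question, for illustration only.
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

print(f"n = {len(responses)}")
print(f"median = {median(responses)}")
print(f"mode   = {mode(responses)} ({SCALE[mode(responses)]})")

# Distribution across the full range of the scale.
for value, label in SCALE.items():
    count = responses.count(value)
    print(f"{value} {label:<9} {'#' * count} ({count})")
```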

Elias provides a very helpful checklist to use when constructing questionnaires:

  1. Has the survey been test-driven?
  2. Do others find the layout clear?
  3. Is the purpose explicitly explained?
  4. Have respondents been thanked?
  5. Have anonymity and confidentiality of data been guaranteed?
  6. Are instructions clear and precise?
  7. No duplication?
  8. Are questions plain and unequivocal?
  9. Are all questions essential?
  10. Are questions correctly ordered?
  11. Will closed questions result in the expected numerical data required?
  12. Are open text options sparingly used?
  13. Is there sufficient time for completion? (Elias, 2015, p. 10)

Reference:

Elias, L. (2015, February). Intelligent Questionnaire Design for Effective Participant Evaluations. Training and Development, 8-10.

 

#4

Tim Klapdor

Online Learning Technology Leader, Charles Sturt University

Tim presented some challenging points in his colloquium entitled You Are Not in Control (Klapdor, You Are Not In Control, 2015). The manner in which his introductory slide was set up, with the NOT inserted in a different colour to the rest of the title, almost as an afterthought, was an indication that seat belts would need to be fastened for the journey.

He quickly moved from this somewhat disturbing title to explaining that the strongest networks have the best nodes, and that these nodes comprise the individuals whose knowledge and wisdom in relation to networking have made them the best. Having momentarily lulled us into a sense of security, he then went on to explain that the systems in which networking occurs do not allow individuals autonomy or ownership (Klapdor, You Are Not In Control, 2015).

The challenging question of who owns the data, controls our identities and defines who we are in online spaces was then posed (Klapdor, You Are Not In Control, 2015). Suddenly, networking looked less inviting – are we in control of anything or are the systems controlling us? Who actually connects us to our social groups – the social media we employ, or each of the users within the network?

Within our cohort, few have their own domain registered, although we are all very engaged in such spaces. Should we be worried? And anyone who has engaged with digital spaces for long enough has "lost" access to some of them. In my case I invested heavily in Ning when it was new and free, PBwiki (ditto), TakingITGlobal (when it was provided at work and then not renewed), and Moodle (before the administration at school decided to change to a different LMS). All the work I did in those places is lost to me. I have similar concerns about this blog once I finish this course.

Tim then proceeded to describe the rise of those who would confine us to some defined spaces, while locking us out of others, much in the way that the feudal lords enclosed the commons and made them exclusive (Klapdor, You Are Not In Control, 2015). He raised the roles of copyright and licensing and then offered the solace of the rise of the hackers.

On his blog, he offers some solutions to consider for those who want to mind and manage their own learning (Klapdor, Make Your Own Slogan MYOS, 2015). These are summarised in the image below.

[Image: a suggested range of solutions]

Overall, a thought-provoking presentation. I guess all we can do is think carefully about what we do online, where we do it, and the longevity or transience of our decisions. If something is special we need to try to future-proof it. Before we die, we need to consider the disposal, or dispersal, of our digital remains and decide whether we wish to endure on our networks or to be forgotten.

Thanks for the roller coaster ride Tim! You may now unbuckle your seatbelt.

References

Klapdor, T. (2015, June 16). Make Your Own Slogan MYOS. Retrieved from TimKlapdor: https://timklapdor.wordpress.com/2015/06/16/make-your-own-slogan-myos/

Klapdor, T. [Host]. (2015, August 13). You Are Not In Control. Online Colloquium 4.

#3

Blog post for Colloquium 3

What does ‘flat’ learning look like?

Flat connected learning incorporates aspects of collaboration, project-based learning, blended learning, flipped learning, and inquiry-based learning, established within a framework that combines Web 2.0 tools, leadership, pedagogy and learning design (Lindsay, n.d.). In many ways, this sums up the reality of teaching and learning in an era of rapid technological development and pedagogical change.

 

It also encapsulates the five-stage taxonomy of online global learning:

  1. Online interactions
  2. Real encounters
  3. Online learning
  4. Community of practice
  5. Learning collaboratives (Lindsay, 2015)

 

According to Julie, the norms of global collaboration begin with being prepared; depend on having a purpose; require the ability to paraphrase, perceive, and participate; entail a positive mindset and productive nature; and are based on the ability to detect the potential in situations (Lindsay, 2015).

Pedagogical change evolves from being able to approach learning design with a flexible attitude, engaging with professional learning in a progressive manner, and adopting the essential elements of conceptual change (Lindsay, 2015).

In this scenario the teacher is viewed as an activator and the student as an active participant in the process, while the school provides the conduit, and the community is seen as a partner in learning (Lindsay, 2015).

Once the technological requirements are in place, and teachers have both knowledge of new ways of meaningful engagement (through frameworks such as TPACK and SAMR) and the belief that such pedagogy is important, flat connections and global learning become realistic options for developing knowledge and wisdom (Lindsay, 2015). Such an approach leads to cosmogogy: the study of learning through connection to the world via the digital technologies available today. In such a scenario the context lies in learning with, not about, and geo-location is irrelevant (Lindsay, 2015).

This presentation was a great introduction to the peer presentations relating to selected chapters of Wang’s extensive tome (Wang, 2014). These expositions demonstrated a potential for school adaptation where senior secondary students could lighten the load for each other in collaboratively summarising text. It certainly was of benefit to our cohort in this subject.

Three colloquiums, three very different ways of doing business – and all of them useful and thought-provoking.

References

Lindsay, J. [Host]. (2015, August 6). Colloquium 3: Flat Classrooms.

Lindsay, J. (n.d.). Flat Learning. Retrieved August 11, 2015, from Flat Connections: http://www.flatconnections.com/flat-learning.html

Wang, V. (Ed.). (2014). Handbook of research on education and technology in a changing society. Hershey, PA: IGI Global.


ICT Horizons

The NMC Horizon Report 2015 K-12 and links to Wang and Weller readings:

The current edition of the Horizon report can be found here and a commentary on what it means for education can be found at the MindShift blog. It is always thought-provoking to investigate this report, and much of the content resonates with the subjects I have taken as part of my course.

This diagram gives a brief overview of this year’s findings:

[Diagram: overview of the 2015 findings, including the challenges identified in the report]

The report this year sees integrating technology into teacher education as solvable and cites the Finnish example of using Edukata (a participatory design model):

The more difficult, or wicked, challenge is scaling teaching innovations (Johnson, Adams Becker, Estrada, & Freeman, 2015, p. 1).


Wang Chp 25

Building Education and Technology Competencies for a Changing Society

This post is prompted by the need to master the chapter of the same name in Wang (2014, pp. 331-342), written by three women working at Auburn University in the United States (Witte, Wohleb, & Skinner, 2014). Whilst American in focus, the material is highly relevant to Australian K-12 and tertiary education.

American college students’ success is being challenged by 3 factors:

[Image: the 3 factors]

These factors are reflected in 3 trends:

[Image: the 3 trends]

To address this, K-12 educators need to incorporate the everyday use of technology into classrooms to build good habits that will support lifelong learning. Five habits have been demonstrated to have a positive effect on tertiary students:

[Image: the 5 habits]

Educators at both tertiary and secondary level need to teach 5 skills:

[Image: the 5 skills]

 

Competent tertiary students need to be:

  • Internationalist
  • Adaptable (Witte, Wohleb, & Skinner, 2014, p. 332)

And they need to graduate with the following 12 skills:

[Image: the 12 skills for graduates]

 

(Witte, Wohleb, & Skinner, 2014, p. 336)

These skills can be developed by K-12 teachers incorporating the following tools into their lesson design:

  • Web-based programs
  • Learning Management Systems
  • Virtual Chat Rooms
  • Web-cams
  • Skype or FaceTime (Witte, Wohleb, & Skinner, 2014, p. 332).

Technology tools should be capable of assisting teachers to instruct, monitor and assess within a learning environment that is both engaging and motivating for their students (Witte, Wohleb, & Skinner, 2014, p. 333).

The processes that result from such a scenario should be more:

  • Individualised
  • Differentiated
  • Specialised
  • Dynamic (Witte, Wohleb, & Skinner, 2014, p. 333).

In all cases the aim should be to reflect a positive attitude towards incorporating technology, and to link that incorporation explicitly to learning, as using appropriate tools is the critical link to student success (Witte, Wohleb, & Skinner, 2014, p. 335).

References

Wang, V. (Ed.). (2014). Handbook of research on education and technology in a changing society. Hershey, PA: IGI Global.

Witte, M. M., Wohleb, E., & Skinner, L. (2014). Building education and technology competencies for a changing society. In V. Wang (Ed.), Handbook of research on education and technology in a changing society (pp. 331-342). Hershey, PA: IGI Global.

 

#2

Learning Analytics: A Traveller’s Guide

Anyone participating in the learning journey that is INF537 would have been intrigued by the title of Colloquium #2 (Welsh, 2015). The content, while very different in delivery from Colloquium #1 (Astbury, 2015), was equally thought-provoking. Despite the title, data was not the only aspect covered, and the final comments indicated the incredible potential of learning analytics.

Simon’s opening comments related to his chosen title, as he pointed out that a traveller digs deeper than a tourist. He then commented that the interpretation and mining of data is an aspect of teaching and learning that is still sorting itself out.

For those who share an antipathy to using test scores to predict educational outcomes, Simon’s comments opened a door to improved educational futures. He explained that academic analytics are those used by institutions to aid with student management while learning analytics are interrogated to support learning and teaching for improved outcomes.

Investigating these concepts further indicates that data mining does not occur in a vacuum; it links to power and relationships; the capturing and sharing of data is in itself a development of knowledge capital (Weller, 2011, p. 43). Another aspect of such data is how it is managed and preserved (Weller, 2011, p. 43). Those generating the most data in a digital world are already privileged, and the rapidly expanding body of work is increasing the division between the haves and have-nots.

Simon referred to the example of the ATAR system and its use by schools to target areas where teaching needs to improve, compared with its use on the MySchool website, where visitors draw a very different interpretation. This illustrated the importance of context and intent in such data collection and its subsequent use (Welsh, 2015).

There are three aspects of simplistic data use that cause concern:

  1. What does it mean for a student to be monitored in this way – is it profiling or determinism, as Hyacinth posted in the accompanying chat?
  2. The ethics of such use – who actually owns the data?
  3. The fact that teachers are being asked to interpret such data without training in data literacy (Liz Eckert).

It is also important to know how reporting systems are being used and where the data is coming from in order to give appropriate advice based on the conclusions that are being drawn. Much of the data comes from the vendors of Learning Management Systems, who have set up metrics based on ease of use. Algorithms based on the number of clicks or the amount of time spent on any given task are not really a measure of learning and need to be carefully interpreted. There is a big difference between measuring quantities of clicks and measuring the quality of engagement (Welsh, 2015).
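
To illustrate this point in the simplest possible terms (this sketch is mine, not Simon's, and the log records and student names are invented), an "engagement" metric built purely from click counts looks something like the following – and it plainly measures activity rather than learning:

```python
from collections import defaultdict

# Invented LMS click-log records: (student, timestamp, action).
# A vendor-style "engagement" score built only from these counts measures
# activity, not the quality of the interaction.
log = [
    ("student_a", "2015-08-03 09:00", "view_forum"),
    ("student_a", "2015-08-03 09:01", "view_forum"),
    ("student_a", "2015-08-03 09:02", "view_forum"),
    ("student_b", "2015-08-03 21:15", "post_reply"),
    ("student_b", "2015-08-04 07:40", "download_reading"),
]

clicks = defaultdict(int)
for student, _timestamp, _action in log:
    clicks[student] += 1

for student, count in sorted(clicks.items()):
    # Student A "scores" higher despite never contributing anything, which is
    # exactly why such numbers need careful interpretation.
    print(f"{student}: {count} clicks")
```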

The example of using Virtual Learning Environments (VLEs) to capture and mine data was very interesting. VLEs are vendor-focussed and often simplistic in terms of the data they gather. Once an institution has invested in providing a VLE it can be stuck with that specific product, as migrating to another platform is expensive and time-consuming (a point noted and discussed by several classmates). Weller considers that introducing VLEs has led to educational institutions losing control of data to the manufacturer, and cites the example of Blackboard trying to patent many core e-learning concepts (Weller, Digital Resilience, 2011, pp. 170-171). Andrew asked whether other products, notably Moodle, which is open source, had been considered as a replacement.

An example Simon explored in some detail was the use of subject forums, such as those used within the Charles Sturt Blackboard environment and, in the case of my workplace, SIMON (School Information Management on the Net). If students have to participate in online forums within their VLEs, then a tool to measure this must be able to "read" the type of material being entered. In this way, within an hour of a comment being posted, a scaffold into deeper learning could be generated, problems appearing in comments across the group could be flagged to the educator, and extra reading could be suggested to those requiring additional explanation, or extension.
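
At its crudest, a tool that "reads" forum posts in this way could begin with simple keyword matching. The sketch below is my own illustration only – the trigger phrases and posts are invented, and a real learning-analytics tool would need far richer text analysis – but it shows the basic idea of flagging posts for follow-up:

```python
# Crude sketch of flagging forum posts that might need follow-up. The trigger
# phrases and example posts are invented; real tools would go well beyond
# keyword matching.
CONFUSION_MARKERS = ["don't understand", "confused", "lost", "not sure", "stuck"]
EXTENSION_MARKERS = ["already familiar", "go further", "more reading"]

posts = {
    "student_a": "I'm a bit lost with the difference between a PLE and a PLN.",
    "student_b": "This was already familiar to me - keen for more reading on analytics.",
    "student_c": "Great session, thanks everyone.",
}

def flag(post: str) -> str:
    text = post.lower()
    if any(marker in text for marker in CONFUSION_MARKERS):
        return "suggest a scaffold or extra explanation"
    if any(marker in text for marker in EXTENSION_MARKERS):
        return "suggest extension reading"
    return "no action needed"

for student, post in posts.items():
    print(f"{student}: {flag(post)}")
```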

This type of monitoring could lead to an easy citation mechanism for the resources utilised, which, as Greg commented, would be "referencing heaven". It is in these potentially positive contributions to learning that most teachers can see the real value of data mining, rather than in the click counting and tallies of visits which are so commonly applied. The resulting real-time adaptation of learning programs to personalise the student learning experience, development of metacognitive skills in learners, fast feedback into learning design and quick adaptation of technical equipment and systems would all be welcomed by educators (Welsh, 2015).

Weller warns of the potential risk of using data to analyse and improve results, suggesting it could lead to Google replacing human librarians and to user-generated "playlists" of information making teachers irrelevant (Weller, Digital Resilience, 2011, p. 171). This is a very broad claim, and one that Simon's colloquium session somewhat allayed.

As Rochelle commented, the link between educational data mining, decision support systems and expert systems is inextricable; Deborah's response, that the skill lies in using the power for good, sums up the feeling of most educators, whose primary focus is the overall well-being of the people in their classes.

While Simon's presentation assuaged some fears, it raised other issues of potential concern for teachers and students. Needless to say, we are living in revolutionary times, and, while a revolution may be bloodless, it is rarely painless (Weller, Digital Resilience, 2011, p. 168). The critical thing for scholars and teachers is that they stay involved, because they need to be in a position to determine what goes, what stays and what comes; passivity is not an option (Weller, Digital Resilience, 2011, p. 184).

References

Astbury, A. [Host]. (2015, July 21). ABC Splash Online Colloquium 1. Melbourne, Victoria, Australia.

Weller, M. (2011). Digital Resilience. In M. Weller, The Digital Scholar: How Technology is Transforming Scholarly Practice (pp. 168-184). London: Bloomsbury Collections. doi:10.5040/978184966275.ch-014

Weller, M. (2011). The Nature of Scholarship. In M. Weller, The Digital Scholar: How Technology is Transforming Scholarly Practice (pp. 41-51). London: Bloomsbury Collections. doi:10.5040/978184966275.ch-014

Welsh, S. [Host]. (2015, July 28). Learning Analytics: A Traveller's Guide; Online Colloquium 2. Albury, New South Wales, Australia.

Acknowledgements:

Fellow travellers’ comments from the Colloquium chat box are acknowledged in blue.