Appraising calibre, authenticating content (not AC/DC but AC/AC!)

We are living in an era where information is readily available, easily created, generally unedited or unmoderated, and widely shared. It is vital that readers have the capacity to appraise the calibre of the content they encounter. Yet it would appear that even students entering renowned universities cannot apply the most basic of filters to images or documents presented to them (Weinberg, 2016).

Some simple starting points:

[Image: Valenza-based checklist]
Based on Some Rules of Thumb – a guide to assessing online news and adapted to suit all types of information (Valenza, 2016).

Without applying a filter, or lens, to what we read, we run the risk of spreading misinformation, thereby perpetuating deliberately created and often specifically targeted fabrications which may destabilise governments or undermine individuals. Far from choosing to be part of such a process, many people pass these fabrications on inadvertently because they do not take the time to evaluate sources (Tiffany, 2016).

Teacher-librarians such as Valenza promote their role as critical in educating more news-literate and savvy information consumers, and Tiffany states that this education is more effective the earlier students encounter such educators (Tiffany, 2016).

Coupled with the relatively recent rise in the spreading of “untruthiness” is the concept, held by many, that a free press equates to neutral information (Valenza, 2016). History teachers are adept at demonstrating that the underlying perspective of the creator, or the interpretation of the historian, affects the way in which information is viewed. Much harder to teach, however, is how our own attitudes and biases affect the way we read, often leading us to ignore viewpoints that differ from our own (Valenza, 2016).

Teaching younger students about appraising calibre and authenticating content is made a little easier by using a resource such as the TED talk on “How to choose your own news” (Brown, 2014) – an engaging animation.

There is no doubt that there has been an exponential increase in the publication of extreme, untrue and misleading “fake news” since the rise of social media such as Twitter and Facebook, partly because the number of clicks may equate to real income for the posters (Garun, 2016). This poses a real issue for the founders of these platforms; Facebook’s Mark Zuckerberg, for one, has expressed concern at the site being forced into becoming an arbiter of truth (Liptak, 2016). The sites on which such “untruthiness” is spread have become known for fostering clickbait (Zimdars, 2016).

There have also been allegations that social platforms influenced election results in several countries in 2016 (Garun, 2016). This in itself may not be all bad, but it does indicate the serious need to teach readers how to negotiate the publications of our time: to understand the underlying purpose of the publications to which they are exposed, and to question the authenticity of what they read, in much the same way that the commercial principle of caveat emptor warns that the buyer must beware. It is critical that leading universities such as Stanford do not continue to find that their students are vulnerable to fake news (Weinberg, 2016).

It is crucial that Australian students are able to learn within their own context about the ways this can be an issue locally, rather than only seeing information relating to the United States. We need to be developing Australian resources to support teaching the necessary skills.

As a teacher-librarian and History teacher I am up for the challenge – are you? Join the conversation at #truthinessEDU

References

Brown, D. (Writer), & Harris-Norico, A. (Director). (2014). How to Choose Your News [Motion Picture]. TedED. Retrieved December 3, 2016, from http://ed.ted.com/lessons/how-to-choose-your-news-damon-brown

Garun, N. (2016, November 14). How social platforms influenced the 2016 election. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/14/13626694/election-2016-trending-social-media-facebook-twitter-influence

Liptak, A. (2016, November 13). Mark Zuckerberg warns about Facebook ‘becoming arbiters of truth’. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/13/13613566/mark-zuckerberg-facebook-misinformation-hoax-media

Tiffany, K. (2016, November 16). In the war on fake news, school librarians have a huge role to play. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/16/13637294/school-libraries-information-literacy-fake-news-election-2016

Valenza, J. (2016, November 26). Truth, truthiness, triangulation: A news literacy toolkit for a “post-truth” world. Retrieved December 3, 2016, from School Library Journal: http://blogs.slj.com/neverendingsearch/2016/11/26/truth-truthiness-triangulation-and-the-librarian-way-a-news-literacy-toolkit-for-a-post-truth-world/

Weinberg, S. (2016, November 26). Stanford Study Finds Most Students Vulnerable To Fake News. (K. McEvers, Interviewer) Retrieved from http://www.npr.org/2016/11/22/503052574/stanford-study-finds-most-students-vulnerable-to-fake-news

Zimdars, M. (2016, November 15). False, Misleading, Clickbait-y, and/or Satirical “News” Sources. Retrieved December 3, 2016, from http://d279m997dpfwgl.cloudfront.net/wp/2016/11/Resource-False-Misleading-Clickbait-y-and-Satirical-%E2%80%9CNews%E2%80%9D-Sources-1.pdf

 

Collaborative Networking with students

There are a number of ways to get students up and active, sharing note-taking and working together.

A3 paper:

This can be a wonderful tool for getting information out of students’ minds and into shared space. Some examples are below; they may be hard to read, but they give an idea of how it works. They come from a VCE History revision lesson in 2015. Preparation involved picking up on the skills that were lacking – in this case the provision of evidence to support contentions that had been made – and writing a target in the middle of each page. Each student then had a time frame of 2 minutes per page before they had to move on to the next page.

[Images: A3 speed note sharing 1 and 2]

Collaborative cube:

This was a recommissioned TV trolley which a colleague and I decided to use as a frame for four sheets of whiteboard material. We kept the wheels so that it could be moved as required. Students of all ages and genders really enjoy using the markers to complete tasks.

[Image: the collaborative cube]

This was an activity with mixed age groups from Years 5 – 8 and from a range of schools, where they had the opportunity to draw pirates!

A4 paper:

Getting students in pairs to list ideas relating to a topic within a really tight timeframe can provide an active change of pace, stimulate competition, and often draw out concepts that I had not connected to each other in quite the same way, enhancing multiple interpretations of the same content.

Plastic Blocks:

This activity has been blogged about here

In a time when some people are recommending standing desks, it provides a real alternative if you can get them up and moving!

Final Report

Case Study Research Report:

Learning Management System utilisation by teachers and students at a regional Victorian school.

How well are the affordances of the SIMON LMS being used by teachers and students in one specified school setting?

Executive summary:

 

Learning Management Systems (LMS) are web-based products which have been used by most universities for a significant period of time. Increasingly, schools, particularly at the secondary level, are also investing in such digital tools. This study compares the potential usage of a specific LMS in a small, regional, kindergarten to Year 12, Victorian Independent School with those aspects that teachers and students are actually using. Information was garnered through online surveys, and the findings suggest not only wide acceptance of some affordances by both teachers and students, but also ignorance of the potential of others. Primary Campus usage is minimal, for a number of reasons. The data were comparable with results obtained by researchers conducting LMS reviews in other institutions, predominantly universities.

 

The Nature and Context of the Case Study:

 

This case study report presents the results of an empirical inquiry investigating the extent to which the SIMON LMS (SIMON Solutions, 2009), as one example of a contemporary educational phenomenon, is being used to improve teaching and learning within the context of a specific regional Victorian school. The inquiry was framed to discover the degree of usage by teachers and students, the individual uptake of those functions offered by the LMS compared to features not adopted, and the perceived advantages, problems and potential of this specific educational software. The underlying purpose was to understand user needs and perspectives, thereby identifying aspects of usage with which users of this LMS may require support, in order to improve the school’s knowledge networking and opportunities for digital innovation.


Terminology:

LMS, also referred to as Virtual Learning Environments, Digital Learning Environments, Course Management Systems or Electronic Learning Environments, are web-based applications which are accessible wherever an Internet connection allows (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 688). While a significant amount of research has been conducted on the impact of such systems in universities, where, for example, uptake in Britain by 2005 was reported at ninety-five per cent (McGill & Klobas, 2009, p. 496), there are fewer examples focused on schools, and these are not K-12 settings. The circumstances of the chosen setting are therefore different from those of the institutions reported on in other academic literature.

 

SIMON is a learning management program created in 2000 by practising teachers at St Patrick’s College in Ballarat (Simkin, 2015 b). It is now owned by the Ballarat Diocese, and the original developers are still involved in managing its evolution. While originally used in Catholic schools within this Diocese, SIMON usage has extended to other educational jurisdictions and Australian states. The school on which this case study focuses was one of the first Independent Schools to adopt the program, moving to SIMON from Moodle about five years ago. It has also developed a relatively collaborative relationship with the founders of the LMS by suggesting possible changes; an aspect of the specific context that does not apply to many other schools using the same product.

 

Constraints:

The decision to focus on reviewing one LMS in a single school was made to meet the constraints of the timeframe available for conducting the research and the stipulations outlined for the writing of the research report. The chosen school has been using SIMON for six years; however, employment of the system has been observably inconsistent from both a teaching and a learning perspective. There is, therefore, potential to use the findings of this investigation to lead to improvement. Three forms of understanding are required before educational transformation can occur: a critique of the current, a vision of the desired, and a theory for guiding the situation from where it is to where it should be in order to achieve better outcomes (Robinson & Aronica, 2015, p. 58). This sentiment encapsulates the intention of this case study research, as investing in an LMS should result in a measurable return on investment (Leaman, 2015, para 1).

The Process:

Literature Review:

The process commenced with a review of relevant literature. Taking guidance from Thomas, the first criteria were the quality of the material and the publications in which it appeared; references to literature cited within the sources investigated were also followed up (Thomas, 2013, pp. 60-61). Most titles were retrieved from the university library, but one was provided through a Twitter connection which led to Professor Harland (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013), and another by a colleague: an intriguing and very specific research proposal highlighting issues which apply to segregated education, but which also served as a reminder of the challenges of mixing methodologies, and that awareness is not the same thing as use when it comes to an LMS (Algahtani, 2014, p. 16). These papers revealed some common themes surrounding LMS research, as outlined below.

 

Commonalities:

Research has predominantly considered the major LMS providers, notably Blackboard (which now incorporates WebCT and ANGEL (Islam, 2014, p. 252)), but also Dokeos, Smartschool (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 689), Sakai (Lonn & Teasley, 2009, p. 687), Desire2Learn (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82) and the popular open source Moodle.  Some papers analyse usage of several LMSs, while others compare the utility offered by different options such as Facebook (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013, p. 83), and SLOODLE (Moodle incorporated with Second Life as a 3D virtual learning environment) (Yasar & Adiguzel, 2010, pp. 5683 – 5685).

 

The majority of the literature was based on surveys, so the decision to collect information through online surveys was validated. Given that the SIMON interface is different for teachers compared to students, two surveys were required. These were constructed using Google Forms.

 

Limitations:

A Parent Access Module survey is being developed for future use to strengthen the evaluation process and enhance the practical application of the recommendations. Lack of time, and of access to this module for the researcher, precluded it from the case study. Use of SIMON by the Primary Campus would also benefit from further discussion. Analysing the purpose and style of the questionnaire was a vital starting point (Elias, 2015); therefore the main elements of Elias’ work informed the overall structure (Simkin, 2015 a).

 

Survey Methodology:

Qualitative and quantitative surveys elicit very different information, and the literature review resulted in the decision to incorporate both styles of questioning. Qualitative methodology enables detailed descriptions to be provided that are not constrained by the researcher, allowing respondents to elaborate on the things that matter to them (Ritchie, 2013, p. 4). The style of qualitative questions accessed aspects of critical theory, enabling an understanding of the intersection between material conditions and their influence on human behaviour (Ritchie, 2013, p. 12). For example, the last two questions on both surveys (Appendix pages 23-25; 34-35) were ontologically focussed, aiming to compare realistic responses with idealistic possibilities.

 

Selecting the right tool for the anticipated outcome also required quantitative data gathering: decisions followed about which topics to assess by checklists (Appendix pages 20 & ) and which items to evaluate through Likert scale questions (Appendix page) (Thomas, 2013, pp. 209-215). It was important to set such questions up in a neutral manner, rather than in a way that directed the result to meet preconceived ideas; commencing with the Likert-style questions, using a scale of one to five (with one the lowest and five the highest level of agreement), allowed participants to proceed quickly through the quantifiable elements while offering a nuanced range of responses (Thomas, 2013, p. 214). This style of “scale” question allows opinions to be presented easily; for those who like to explain in more detail, and to have open-ended and creative thoughts, the qualitative questions were provided later in the survey (Thomas, 2013, p. 215).

Questions needed to cover contextual, explanatory, evaluative and generative options (Ritchie, 2013, p. 27) to allow this report to describe and critique the current, suggest what might be possible, and enable recommendations that might be educationally transformational (Robinson & Aronica, 2015, p. 58). The final questions were designed to evoke creative responses and raise the potential for the future of the LMS for the next generation of learners, where the ideal system should be more of a learning environment or ecosystem, fitted together in the manner of building blocks to suit subject-specific requirements (Straumsheim, 2015, p. 7).
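As an illustration of the quantitative tallying described above, five-point Likert responses can be summarised with a short script. This is only a sketch: the function name and the sample responses are invented for demonstration and are not the survey instrument or its data.

```python
from collections import Counter

def summarise_likert(responses, scale=(1, 2, 3, 4, 5)):
    """Tally Likert responses: counts, percentages and mean on a 1-5 scale."""
    counts = Counter(responses)
    total = len(responses)
    distribution = {point: counts.get(point, 0) for point in scale}
    percentages = {point: round(100 * n / total, 1) for point, n in distribution.items()}
    mean = round(sum(responses) / total, 2)
    return distribution, percentages, mean

# Invented sample: ten respondents, 1 = lowest and 5 = highest agreement
sample = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]
distribution, percentages, mean = summarise_likert(sample)
print(distribution)  # {1: 0, 2: 1, 3: 2, 4: 4, 5: 3}
print(mean)          # 3.9
```

Reporting both the distribution and the mean reflects the neutrality principle above: the mean alone can mask a polarised spread of responses.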

 

In order to ensure clarity and precision (Thomas, 2013, p. 207), the surveys were trialled with fellow university students and work colleagues, including the school’s technical staff, who have strong knowledge of SIMON. Despite this, some elements that might have offered a different insight were omitted: the gender and year level of students, for example, and the teaching methods of the staff. Such omissions are typical of mixed-method research, and hard to avoid in short time frames, especially by relatively inexperienced researchers such as myself (Algahtani, 2014, p. 16).

 

The anticipated findings were targeted at establishing the overall satisfaction and learner engagement with SIMON’s functions in terms of organisation, pacing of work, access to resources, collection of materials, class discussion, and feedback, as outlined in the work of Rubin et al. (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82). The study had to identify the enabling functions as distinct from the hindrances, and whether they were impacting on the design of, and access to, course materials in a positive or negative manner (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82). Ease of navigation and a low number of clicks to access items can facilitate learning, while the inverse will frustrate users and lead to avoidance of features; this is particularly true of feedback. If Blackboard v.12 took twelve clicks to achieve something that Moodle could do with one, how did SIMON compare (Rubin, Fernandes, Avgerinou, & Moore, 2010, pp. 82-83)?

 

Critical Evaluation:

The Survey Findings:

The survey resulted in thirty-three teacher and sixty-eight student responses, or 47% and 31% respectively. All teaching staff were invited to participate, but only one Junior Campus teacher accepted the opportunity. Another emailed to say it wasn’t really relevant to them; given that these teachers generally use only a small number of SIMON’s features, this was not unreasonable. Teachers on small part-time loads were also not expected to participate; the result was therefore better than expected for the last week of term. Students from Years 9 to 12, who have one-to-one device access, were the target population, and the number of respondents in that busy last week was also pleasing.
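The response rates quoted above are simple arithmetic. As a minimal sketch, assuming invited totals of roughly 70 teachers and 219 students (these totals are back-calculated from the stated percentages, not figures reported in the study):

```python
def response_rate(responses, invited):
    """Response rate as a whole-number percentage of those invited."""
    return round(100 * responses / invited)

# Assumed totals, inferred from the reported 47% and 31% rates
print(response_rate(33, 70))   # 47
print(response_rate(68, 219))  # 31
```

Recording the invited totals alongside the raw counts in future surveys would make such rates reproducible without back-calculation.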

 

While every recommended avenue had been explored in terms of how to set up a valid survey instrument and pretesting had occurred (Elias, 2015), there were still unexpected outcomes. Omissions and problems arose from the survey’s construction. It would have been helpful to know the gender of recipients given that this has been a factor in a number of other research results, not only Algahtani’s where such issues would be expected (Algahtani, 2014). It would also have been helpful to ascertain the teaching areas of the staff, relative age group of each teacher (or years of experience), and the year levels of the students as was done by Kopcha (Kopcha, 2012, p. 1116).  Including questions to elicit this information would have enabled more targeted recommendations.

 

The use of Likert scale questions in the introductory part of the survey worked well, and respondents benefitted from having a five-point scale. The usage responses indicated that 28% of teachers (see teacher and student results below) believe that they use SIMON to some extent or a great extent, while 53% of students reported that their teachers used SIMON at this level. This is an example where interpretation of results would have been more meaningful if the subjects being taught were known. On balance, staff and students were more positive than negative about whether SIMON supported their teaching and learning.

 

Another anomaly of the type referred to above was revealed by the yes or no question relating to the uploading of work (see teacher and student results below), where 81% of teachers reported that they did not ask students to do this, yet 31% of students said that they did provide work to their teachers in this manner. An astonishing percentage of students reported receiving video and audio feedback (71%), while only 24% of teachers said that they provided this (see teacher and student results below). A follow-up question here on which subjects were making use of this facility would have been beneficial in terms of recommendations, especially if responding teachers had been asked to indicate their faculty.

 

Moving from Likert scale questions and yes or no option to open-ended responses proved valuable on both surveys, as had been anticipated. The number of respondents who completed these optional questions was very pleasing. The slight difference in questions between the two surveys was deliberate to allow for the differing access teachers have to the LMS compared to students. The responses to most of the common questions demonstrated a close correlation between teacher understanding and student use, with a couple of exceptions such as those outlined above.

 

A summation of feelings towards the LMS elicited by the surveys indicated a strong acceptance of the technology. This has been written about by researchers reviewing usage through technology acceptance models (TAM) (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 689). As the school community concerned is technologically experienced, this was expected. Results also demonstrated that while many users verbally describe a love-hate relationship with SIMON, the use of survey methodology produced more considered feedback (Straumsheim, 2015, para 3). Of the eighteen affordances Schoonenboom lists as desirable in an LMS, fourteen are possible using SIMON; only meetings, online examinations, peer feedback, and open hours are not possible in the manner she describes (Schoonenboom, 2014, p. 248). Interestingly, responses to the questions aimed at improving SIMON (17 – 19 for teachers and 18 – 19 for students) did not request that any of these aspects be made available.

The broad overview of the findings from the open-ended comments (Appendix page) indicated that teachers enjoy the reporting facility because it links to the assessment module and saves them work. The most frustrating facet for both teachers and students is the number of clicks it requires to access work (51%). Students highlighted the inconsistent usage of the LMS by their teachers, and sometimes indicated that components are being incorrectly used: all student work should be in the “curriculum documents” section but some teachers are placing it in “general documents”. While there is an historic reason that may have led to this, it should no longer occur. Reporting live through assessment tasks should indicate more clearly that work is linked to the curriculum module.

 

Facets identified:

Five equal facets should be provided by any LMS: interoperability, personalisation, analytics, collaboration and accessibility (Straumsheim, 2015, para 6), and, according to the findings, SIMON delivers all of these to some degree. Taking interoperability first, most e-learning tools that teachers currently use with their classes can be accommodated, either by linking a document to the system (such as collaborative OneNote notebooks), locating a file in the system, or providing a weblink.

 

Personalisation is also possible and has led to some confusion as evidenced in the responses.  The school concerned has added a number of links, for example, to library services, which some respondents find bewildering. It does however correspond to the findings reported by Horn et al that a range of “library help objects” and links to resources accounts for user differences and support needs (Horn, Maddox, Hagel, Currie, & Owen, 2013, pp. 238-240).

 

Analytics are available to teachers for administrative use, such as knowing whether a student is present or has submitted work, and also to the LMS administrator, for checking the number of log-ins, for example. The teacher who suggested that it would be good if SIMON could calculate School Assessment Percentages (currently done through Excel 365) would be surprised to know that, with teacher education in derived scores, it could.

 

Collaboration was not raised by respondents, although some referred to using the forum space. This is probably SIMON’s weakest suite, but looking at what is planned for the next software update, school, parent and student interaction should be improved (Simkin, 2015 b). Lonn and Teasley’s research indicates that few users used or rated the interactive tools, preferring to comment on the tools that push work out to, or collect work from, students (Lonn & Teasley, 2009, p. 693). Collaboration through internal communities of practice and self-organising networks should become more common in the near future as more teachers look to make global connections and the Senior Campus moves to a one-to-one device model in 2016 (Archer, 2006, p. 67).


In terms of accessibility, one teacher, who followed up on his survey by sending an email with more detailed information (Budenberg, 2015), found that most of his issues were due to a lack of instruction during orientation. In a meeting to resolve some of these issues, Tim commented that SIMON was like a Swiss army multi-purpose knife, citing almost word for word a comment from Alier et al. which alludes to the fact that numerous tools, while helpful, may not offer the best solution for the purpose (Alier, et al., 2012, p. 107). His prior experience with Daymap, an LMS with the ability to email the parents of a class with one click, was raised face-to-face; SIMON is a much cheaper solution.


Recommendations:

The case study has achieved its goal of leading to a number of recommendations for the school under evaluation. Given that no LMS will answer everyone’s needs, it is better to work with the one that is currently provided and maximise its strengths while minimising its weaknesses (Leaman, 2015, para 7). In this setting there is the added benefit of access to the developers.

The following recommendations will be passed to the designated, relevant groups.

For SIMON developers:

  1. While interoperability between a range of platforms and SIMON is good, retrieval of information is hindered by convoluted navigation (the number of clicks) and a lack of search functionality. This requires simplification in some form.
  2. Collaboration through a range of means – chat, peer assessment and an improved forum interface – would be regarded as beneficial to communities of practice.

For the School Executive:

  1. More effective mentoring of new teachers and ongoing in-servicing of all teaching staff would improve usage for students, thereby enhancing learning.
  2. Expectations for usage by teachers appear to be unclear; a clear and consistent statement is needed. Teachers need to model SIMON usage to students more effectively.

For the Teaching and Learning Committee:

Discussion is required to consider the following:

  1. Options for providing face-to-face assistance with SIMON mastery to teachers and students (beyond their faculty or subject teachers).
  2. Opportunities for learning new aspects of SIMON at relevant times, for example when software is upgraded.
  3. Which desirable LMS facets available in other systems are missing from SIMON.
  4. These survey findings – to enable improved practice.

 

For the Information Services Department:

  • That the location of library-related information within the LMS be revisited and evaluated in terms of the most effective location/s for accessing it.

 

Conclusion:

The school studied has been using the SIMON Learning Management System for several years, yet the uptake varies enormously. Some teachers and students rarely use it; others use all aspects of it really well. The reporting package is compulsory and has been effectively used and appreciated by most teachers, but usage of the other features has been inconsistent. This report reveals those elements that have been used, the users’ experience with the LMS, and the outcomes that have been enabled for them through such use. It is important to determine why some elements have been used and others avoided. Steps should be taken to improve use, and to consider the potential impact of change for learning.

References

Algahtani, M. (2014). Factors influencing the adoption of learning management systems in the Kingdom of Saudi Arabian Universities by female academic staff. Research proposal for confirmation of candidature (PhD) DR209 16th July 2014. Received by personal communication from Bradbeer, Susan, through a dropbox link provided by a lecturer at RMIT, 17 September 2015

Alier, M., Mayol, E., Casan, M. J., Piguillem, J., Merriman, J. W., Conde, M. A., . . . Severance, C. (2012). Clustering projects for interoperability. Journal of Universal Computer Science, 18(1), 106-222.

Archer, N. (2006). A Classification of Communities of Practice. In Encyclopedia of Communities of Practice in Information and Knowledge Management (pp. 21-29). Information Science Reference (an imprint of IGI Global).

Budenberg, T. (2015, September 16). personal email. A request for your assistance.

De Smet, C., Bourgonjon, J., De Wever, B., Schellens, T., & Valke, M. (2012). Researching instructional use and the acceptation of learning management systems by secondary school teachers. Computers & Education, 688-696. doi:10.1016/j.compedu.2011.09.013

Elias, L. (2015, February). Intelligent Questionnaire Design for Effective Participant Evaluations. Training and Development, 8-10.

Horn, A., Maddox, A., Hagel, P., Currie, M., & Owen, S. (2013). Embedded library services: Beyond chance encounters for students from low SES backgrounds. Australian Academic and Research Libraries, 44 (4), pp. 235 – 250. doi:10.1080/00048623.2013.862149

Islam, A. N. (2014). Sources of satisfaction and dissatisfaction with a learning management system in post-adoption stage: a critical incident technique approach. Computers in Human Behavior, 249-261. doi:10.1016/j.chb.2013.09.010

Kopcha, T. J. (2012). Teachers’ perceptions of the barriers to technology integration and practices with technology under situated professional development. Computers & Education, 1109 – 1121. doi:10.1016/j.compedu.2012.05.014

Leaman, C. (2015, August 20). What If Your Learning Management System Isn’t Enough? Retrieved from eLearning Industry: http://elearningindustry.com/learning-management-system-isnt-enough

Lonn, S., & Teasley, S. D. (2009). Saving time or innovating practice: Investigating perceptions and uses of Learning Management Systems. Computers & Education(53), 686–694. doi:10.1016/j.compedu.2009.04.008

Maleko, M., Nandi, D., Hamilton, M., D’Souza, D., & Harland, J. (2013). Facebook versus Blackboard for supporting the learning of programming in a fully online course: the changing face of computer education. Learning and Teaching in Computing and Engineering, pp. 83-89. doi:10.1109/LaTiCE.2013.31

McGill, T. J., & Klobas, J. E. (2009). A task-technology fit view of learning management system impact. Computers & Education, 496 – 508. doi:10.1016/j.compedu.2008.10.002

Ritchie, J. L. (2013). Qualitative research practice: A guide for social science students and researchers. Great Britain: Sage.

Robinson, K., & Aronica, L. (2015). Creative Schools: Revolutionizing Education From The Ground Up. Melbourne: Allen Lane.

Rubin, B., Fernandes, R., Avgerinou, M. D., & Moore, J. (2010). The effect of learning management systems on student and faculty outcomes. Internet and Higher Education, 82 – 83. doi:10.1016/j.iheduc.2009.10.008

Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why intsructors in higher education intend to use some learning management system tools more than others. Computers & Education, pp. 247 – 256. doi:10.1016/j.compedu.2013.09.016

Simkin, M. (2015 a, August 17). Article review. Retrieved from Digitalli: http://thinkspace.csu.edu.au/msimkin/2015/08/17/article-review/

Simkin, M. (2015 b, October 6). SIMON. Retrieved from Digitalli: http://thinkspace.csu.edu.au/msimkin/2015/10/06/simon/

SIMON Solutions. (2009). Retrieved from SIMON: http://www.simonschools.net/about-simon.html

Straumsheim, C. (2015, May 11). Brick by Brick. Retrieved from Inside Higher Ed: https://www.insidehighered.com/news/2015/05/11/educause-releases-blueprint-next-generation-learning-management-systems

Thomas, G. (2013). How To Do Your Research Project; A Guide For Students in Education and Applied Social Science. London: SAGE.

Yasar, O., & Adiguzel, T. (2010). A working successor of learning management systems: SLOODLE. Procedia Social and Behavioural Sciences 2, 5682 – 5685. doi:10.1016/j.sbspro.2010.03.928


Colloquium 5

Professional Learning and Leadership

The fifth and final visitor-led presentation came from Cathie Howe (https://www.linkedin.com/in/cathiehowe), who spoke to us about her work as Professional Learning and Leadership Coordinator, NSW DEC, and Manager of the Macquarie ICT Innovations Centre (http://www.macict.edu.au/).

MacICT

As we have come to expect from a colloquium, we learned of another example of knowledge networking and digital innovation that is shaping the skills of teachers throughout New South Wales. Cathie's work centres on meeting the needs of C21st learners by improving teaching and learning in the connected age. She shared this image to summarise this philosophy:

C21st learner

As with all of the colloquiums we have experienced, this presentation also revisited concepts from the first subject in our course (INF 530), such as the future work skills that are the motivation behind MacICT.

Pic link to 530

(Institute for the Future, 2011)

Cathie’s presentation was delivered with enthusiasm and she was happy to digress in order to answer our questions. The fact that the colloquium continued beyond the allocated time slot was testimony that she had engaged our class with her material.

Internet connectivity issues made themselves known in every session, in ways that were annoying rather than terminal. Even so, these sessions demonstrated the potential for anywhere, anytime learning in a form quite different from the methodology of a flipped classroom or Khan Academy. The latter are more akin to sage-on-the-stage teaching, while our colloquiums have seen one or more presenters engage with people in real time through oral and text connectivity.

Plenty of food for thought in terms of applications to “classroom” teaching where the room has no walls!

Thanks are due to all the presenters for the semester. Thanks too to Julie for finding such a range of fascinating presenters to further our educational program.

References

Howe, C. [Host] (2015, September 24). Colloquium 5.

Institute for the Future. (2011). Future Work Skills 2020. Retrieved October 4, 2015, from Institute for the Future: http://www.iftf.org/futureworkskills/

Macquarie University; Department of Education, New South Wales. (n.d.). MacICT. Retrieved October 4, 2015, from MacICT Macquarie ICT Innovations Centre: http://www.macict.edu.au/


PLE & PLN – it’s us!

Open and Social Learning according to Alec Couros:

An open course entitled Education, Curriculum, and Instruction: Open, Connected, Social, using Free and Open Source Software through the University of Regina, was implemented in 2008 (Couros, 2010, p. 109). It was based on personal learning networks, and participants quickly realised the value of sustainable knowledge networks. This led to a context built around a series of events which quickly absorbed participants into an engaged community of participation (Couros, 2010, p. 110).

The theoretical foundations of the course were:

The open movement (Couros, 2010, p. 111).

Complementary learning theories – social cognitive, social constructivism, andragogy, connectivism, and open teaching (Couros, 2010, pp. 112-115).

The primary learning environment was established collaboratively in the weeks preceding the course. The tools considered were:

WebCT (now Blackboard) – pros: familiar to students, and the university had a strong infrastructure of support. Cons: proprietary (modifications needed vendor support); directed learning favoured over constructivist approaches; expensive licensing fees.

Moodle – pros: free; open source; modifiable, strong community support; touts a constructivist and social constructivist approach; available. Cons: needs PHP server infrastructure; requires technical expertise leading to hidden costs; software not as available as hoped; course-centric not student-centric; top-down instructivist approach.

Ning – pros: ease of use; freely available in 2008; familiar functionality similar to Facebook; community and individual privacy levels; user-centric spaces; content aggregation; communication tools. Cons: no wiki feature; awkward to add core content material.

Wikispaces – pros: senior, best-known and most stable of wiki providers; solid technical support; theme modification options; simple user interface – see http://eci831.ca/ (Couros, 2010, pp. 117 – 119).

The course required the establishment of a PLN, and it was mandatory that participants developed a personal blog/digital portfolio, participated in a collaborative wiki resource (no longer active, but it was located at http://t4tl.wikispaces.com; this is what happens when such a site is not paid for!) and completed a major digital project (sounds like INF 530!) (Couros, 2010, pp. 119-120).

The course was based on the following tools and interactions:

Synchronous activities: two events per week of between 1.5 and 2 hours in length; the first based on content knowledge (like our INF 537 colloquiums); the second on teaching skills (Couros, 2010, pp. 120-121).

Asynchronous activities: researching and blogging; shared bookmarking; artefact creation; participation in open professional development opportunities; creating content and uploading it to sites such as YouTube; microblogging; collaborative lesson design and contribution to the course wiki (Couros, 2010, pp. 121-122).

Knowledge Networks and Digital Innovation's forerunner? Just like INF 530 and INF 536, students developed authentic, dynamic and fluid interactions both within the designated course spaces and in spaces they chose and shared themselves.

Defining Personal Learning Environments, and comparing them to Personal Learning Networks was an exercise undertaken by Couros through Twitter and recorded at http://educationaltechnology.ca/couros/1156. Key agreement indicated that PLEs are the tools, artefacts, processes, and physical connections that allow learners to control and manage their learning (Couros, 2010, p. 125). PLNs explicitly include the human connections that result in the advancement and enabling of a PLE (Couros, 2010, p. 125).

Couros makes the following recommendations for those wishing to use PLNs for teaching and learning:

  • Immersion by participants
  • Social media literacy
  • Active contributions strengthen your PLN
  • Know your “followers” or “friends”
  • PLNs are central to learning for sustained and long-term growth in both facilitators and students (Couros, 2010, pp. 125-126).

The participatory learning communities developed by courses such as the one Couros describes continue to exist because they are not based around courses per se, but around communal learning (Couros, 2010, p. 127). Those of us taking the Knowledge Networks and Digital Innovation course can already attest to that in terms of the subjects we have already finished because for many of us the content continues to be shared and discussed. If Couros is correct, this course will never have to end – now there’s a challenge to my PLN!

Reference

Couros, A. (2010). Developing personal learning networks for open and social learning. In G. Veletsianos (Ed.), Emerging technologies in distance education (pp. 109-128). Athabasca University: AU Press.

Article Review

Intelligent Questionnaire Design for Effective Participant Evaluations, by Lisa Elias

Step 1: Before designing a survey it is critical that the objectives are identified – what is to be achieved and why is the survey necessary? This means considering the nature of the people who will be surveyed, those who will gain information and the purpose of the task itself.

Step 2: Then, write the questions in a clear, well-thought-out manner based on the objectives outlined in Step 1. In this way, the data collected will be of high quality and applicable to the needs of all concerned (Elias, 2015, p. 8).

Ensure that questions are:

Clear and unambiguous

Concise

Neutrally worded

Avoid embarrassment – omit or minimise sensitive topics

Ensure respondents’ privacy

Select the question formats with the objectives clearly in mind. A mix of question types will elicit the best data.

Question types to consider:

Yes/no – quick response enabling simple comparisons

Multiple choice – only one selection or multiple selection?

Likert scales – demonstrate a rating per respondent on a common scale

Open-ended responses – time-consuming to analyse but rich qualitative data. Use sparingly.

Alternative responses – allow respondents to opt out or provide their own answer

Ordinal/ranking – a series of items that respondents are asked to rank (for example from 1 to 5, where 1 is most important and 5 least important; each number can only be used once)
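The ranking rule above – each number used exactly once – is simple to verify when cleaning survey data. A minimal sketch in Python (the function name and the list-of-integers representation are my own illustrative assumptions, not something Elias specifies):

```python
def is_valid_ranking(response, n_items):
    # A valid ranking is a permutation of 1..n_items:
    # every rank appears exactly once, none skipped or repeated.
    return sorted(response) == list(range(1, n_items + 1))
```

For example, `is_valid_ranking([3, 1, 2, 5, 4], 5)` is True, while `[1, 1, 2, 3, 4]` fails because rank 1 has been used twice.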

Format the survey by considering the most logical layout to achieve your aims. This avoids confusing the respondents and makes analysis easier.

Introduction – explain why the survey has been established and convince people that participation is valuable and worth their time and effort.

Order and group the questions according to the format you have deemed most logical.

Initial questions should be impersonal and easy to answer so that respondents continue.

Short is best (Elias, 2015, p. 9)…

But ensure the information will be adequate for the purpose.

Use contingency questions if applicable so that people do not have to answer questions irrelevant to them. A preliminary question should ascertain how many, if any, questions of the following set need to be tackled.

Use a progress indicator for online surveys – it shows respondents how far they have to go.

Thank participants and provide your contact details.

Likert scales should:

Be labelled – e.g. poor (1) ranging to excellent (5)

Consist of an odd-numbered scale so there is a mid-point – 5 or 7 options have proven best

Follow the same value pattern – either left to right or right to left

Use words that allow for the full range of responses
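These Likert-scale rules are easy to enforce when tallying responses. A minimal Python sketch, assuming responses are recorded as integers on the labelled scale (the function name and error message are mine, not from Elias):

```python
def summarise_likert(responses, scale=5):
    # Insist on an odd-numbered scale so a neutral mid-point exists.
    if scale % 2 == 0:
        raise ValueError("use an odd-numbered scale (5 or 7 work best)")
    # Tally every point on the scale, including points nobody chose.
    counts = {point: responses.count(point) for point in range(1, scale + 1)}
    mean = sum(responses) / len(responses)
    return counts, round(mean, 2)
```

So `summarise_likert([5, 4, 4, 3, 5])` tallies one "average", two "good" and two "excellent" ratings, with a mean of 4.2.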

Elias provides a very helpful checklist to use when constructing questionnaires:

  1. Has the survey been test-driven?
  2. Do others find the layout clear?
  3. Is the purpose explicitly explained?
  4. Have respondents been thanked?
  5. Have anonymity and confidentiality of data been guaranteed?
  6. Are instructions clear and precise?
  7. No duplication?
  8. Are questions plain and unequivocal?
  9. Are all questions essential?
  10. Are questions correctly ordered?
  11. Will closed questions result in the expected numerical data required?
  12. Are open text options sparingly used?
  13. Is there sufficient time for completion? (Elias, 2015, p. 10)

Reference:

Elias, L. (2015, February). Intelligent Questionnaire Design for Effective Participant Evaluations. Training and Development, 8-10.


#4

Tim Klapdor

Online Learning Technology Leader, Charles Sturt University

Tim presented some challenging points in his colloquium entitled You Are Not in Control (Klapdor, You Are Not In Control, 2015). The manner in which his introductory slide was set up, with the NOT inserted in a different colour to the rest of the title – almost as an afterthought – was an indication that seat belts would need to be fastened for the journey.

He quickly moved from this somewhat disturbing title to explaining that the strongest networks have the best nodes, and that these nodes comprise the individuals whose knowledge and wisdom in relation to networking have made them the best. Having momentarily lulled us into a sense of security, he then went on to explain that the systems in which networking occurs do not allow individuals to have autonomy or ownership (Klapdor, You Are Not In Control, 2015).

The challenging question of who owns the data, controls our identities and defines who we are in online spaces was then posed (Klapdor, You Are Not In Control, 2015). Suddenly, networking looked less inviting – are we in control of anything or are the systems controlling us? Who actually connects us to our social groups – the social media we employ, or each of the users within the network?

Within our cohort, few have their own domain registered, although we are all very engaged in such spaces. Should we be worried? Anyone who has engaged with digital spaces has “lost” access to some of them. In my case, I invested heavily in Ning when it was new and free, PBwiki (ditto), TakingITGlobal (when it was provided at work and then not renewed), and Moodle (before the administration at school decided to change to a different LMS). All the work I did in these places is lost to me. I have similar concerns about this blog once I finish this course.

Tim then proceeded to describe the rise of those who would confine us to some defined spaces, while locking us out of others, much in the way that the feudal lords enclosed the commons and made them exclusive (Klapdor, You Are Not In Control, 2015). He raised the roles of copyright and licensing and then offered the solace of the rise of the hackers.

On his blog, he offers some solutions to consider for those who want to mind and manage their own learning. This is summarised here:

A suggested range of solutions?

Overall, a thought-provoking presentation. I guess all we can do is think carefully about what we do online, where we do it, and the longevity or transience of our decisions. If something is special, we need to try to future-proof it. And before we die, we need to consider the disposal, or dispersal, of our digital remains and decide whether we want to endure on our networks or be forgotten.

Thanks for the roller coaster ride Tim! You may now unbuckle your seatbelt.

References

Klapdor, T. (2015, June 16). Make Your Own Slogan MYOS. Retrieved from TimKlapdor: https://timklapdor.wordpress.com/2015/06/16/make-your-own-slogan-myos/

Klapdor, T. [Host] (2015, August 13). You Are Not In Control.

#3

Blog post for Colloquium 3

What does ‘flat’ learning look like?

Flat connected learning incorporates aspects of collaboration, project-based learning, blended learning, flipped learning, and inquiry-based learning, established within a framework based on a combination of Web 2.0, leadership, pedagogy and learning design (Lindsay, n.d.). In many ways, this sums up the reality of teaching and learning in an era of rapid technological development and pedagogical change.


It also encapsulates the five stage taxonomy of online, global learning:

  1. Online interactions
  2. Real encounters
  3. Online learning
  4. Community of practice
  5. Learning collaboratives (Lindsay J., 2015)


According to Julie, the norms of global collaboration begin with being prepared; depend on having a purpose; require the ability to paraphrase, perceive, and participate; entail a positive mindset and productive nature; and are based on the ability to detect the potential in situations (Lindsay J., 2015).

Pedagogical change evolves from being able to approach learning design with a flexible attitude, engaging with professional learning in a progressive manner, and adopting the essential elements of conceptual change (Lindsay J., 2015).

In this scenario the teacher is viewed as an activator and the student as an active participant in the process, while the school provides the conduit, and the community is seen as a partner in learning (Lindsay J., 2015).

Once the technological requirements are in place, and teachers have knowledge of new ways of meaningful engagement through TPACK and SAMR, and the belief that such pedagogy is important, flat connections and global learning become realistic options for developing knowledge and wisdom (Lindsay J., 2015). Such an approach leads to cosmogogy: the study of learning through connection to the world via the digital technologies available today. In such a scenario the context lies in learning with, not about, and geo-location is irrelevant (Lindsay J., 2015).

This presentation was a great introduction to the peer presentations relating to selected chapters of Wang’s extensive tome (Wang, 2014). These expositions demonstrated a potential for school adaptation where senior secondary students could lighten the load for each other in collaboratively summarising text. It certainly was of benefit to our cohort in this subject.

Three colloquiums, three very different ways of doing business – and all of them useful and thought-provoking.

References

Lindsay, J. [Host] (2015, August 6). Colloquium 3: Flat Classrooms.

Lindsay, J. (n.d.). Flat Learning. Retrieved August 11, 2015, from Flat Connections: http://www.flatconnections.com/flat-learning.html

Wang, V. (Ed.). (2014). Handbook of research on education and technology in a changing society. London: IGI Global.


ICT Horizons

The NMC Horizon Report 2015 K-12 and links to Wang and Weller readings:

The current edition of the Horizon Report can be found here, and a commentary on what it means for education can be found at the Mind Shift blog. It is always thought-provoking to investigate this report, and much of the content resonates with the subjects I have taken as part of my course.

This diagram gives a brief overview of this year’s findings:

Challenges


This year the report sees integrating technology into teacher education as a solvable challenge and cites the Finnish example of using Edukata (a participatory design model):

The more difficult, or wicked, challenge is scaling models of teaching innovation (Johnson, Adams Becker, Estrada, & Freeman, 2015, p. 1).


New LMS

Learning Management and Knowledge Networking:

Coming to grips with a new learning management system, based on Blackboard, particularly after the long summer break, was a little confronting. Just when I thought I knew where to find what I needed, I found I had no idea.

Discussion forum

The topics were all there, but the email notifications for responses did not seem to work all the time, and it became a case of checking in at log-on and working out which threads had new messages.

Threads on Interact 2 were a little tricky

Yes, there were fewer students in the cohort, but few participated in this type of networked learning in the manner in which peers had communicated in INF530 and INF536.

A sub-group of the subject became very active in their own PLN, using Twitter for regular question and answer sessions, and touching base with issues and concerns.

Tweeting Example