Appraising calibre, authenticating content: (not AC/DC but AC/AC!)

We are living in an era where information is readily available, easily created, generally unedited or moderated, and widely shared. It is vital that readers have the capacity to appraise the calibre of content they encounter. Yet, it would appear that even students entering renowned universities cannot apply even the most basic of filters to images or documents presented to them (Weinberg, 2016).

Some simple starting points:

[Image: Valenza-based checklist]
Based on Some Rules of Thumb – a guide to assessing online news, adapted to suit all types of information (Valenza, 2016).

Without applying a filter, or lens, to what we read, we run the risk of spreading misinformation, thereby perpetuating deliberately created and often specifically targeted fabrications which may destabilise governments or undermine individuals. Far from choosing to be part of such a process, many readers pass these fabrications on inadvertently because they do not take the time to evaluate sources (Tiffany, 2016).

Teacher-librarians such as Valenza promote their role as critical in educating more news-literate and savvy information consumers. Tiffany argues that this education is more effective the earlier students encounter such educators (Tiffany, 2016).

Coupled with the relatively recent rise in the spreading of “untruthiness” is the belief, held by many, that a free press equates to neutral information (Valenza, 2016). History teachers are adept at demonstrating that the underlying perspective of the creator, or the interpretation of the historian, affects the way in which information is viewed. Much harder to teach, however, is how our own attitudes and biases affect the way we read, often leading us to ignore viewpoints that differ from our own (Valenza, 2016).

Teaching younger students about appraising calibre and authenticating content is made a little easier by using a resource such as the TED talk on “How to choose your own news” (Brown, 2014) – an engaging animation.

There is no doubt that there has been an exponential increase in the publication of extreme, untrue and misleading “fake news” since the rise of social media such as Twitter and Facebook, partly because the number of clicks may equate to real income for the posters (Garun, 2016). This poses a real issue for the founders of such sites, such as Facebook’s Mark Zuckerberg, who has expressed concern at the site being forced into becoming an arbiter of truth (Liptak, 2016). The sites on which such “untruthiness” is spread have become known for fostering clickbait (Zimdars, 2016).

There have also been allegations that social platforms influenced election results in several countries in 2016 (Garun, 2016). This of itself may not be all bad – but it does indicate the serious need to teach readers how to negotiate the publications of our time: to understand the underlying purpose of the publications to which they are exposed, and to question the authenticity of what they read, in much the same way that the commercial principle of caveat emptor warns that the buyer must beware. It is critical that leading universities such as Stanford do not continue to find that their students are vulnerable to fake news (Weinberg, 2016).

It is crucial that Australian students are able to learn within their own context about the ways this can be an issue locally, rather than only seeing information relating to the United States. We need to develop Australian resources to support teaching the necessary skills.

As a teacher-librarian and History teacher I am up for the challenge – are you? Join the conversation at #truthinessEDU

References

Brown, D. (Writer), & Harris-Norico, A. (Director). (2014). How to Choose Your News [Motion Picture]. TedED. Retrieved December 3, 2016, from http://ed.ted.com/lessons/how-to-choose-your-news-damon-brown

Garun, N. (2016, November 14). How social platforms influenced the 2016 election. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/14/13626694/election-2016-trending-social-media-facebook-twitter-influence

Liptak, A. (2016, November 13). Mark Zuckerberg warns about Facebook ‘becoming arbiters of truth’. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/13/13613566/mark-zuckerberg-facebook-misinformation-hoax-media

Tiffany, K. (2016, November 16). In the war on fake news, school librarians have a huge role to play. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/16/13637294/school-libraries-information-literacy-fake-news-election-2016

Valenza, J. (2016, 26 November). Truth, truthiness, triangulation: A news literacy toolkit for a “post-truth” world. Retrieved December 3, 2016, from School Library Journal: http://blogs.slj.com/neverendingsearch/2016/11/26/truth-truthiness-triangulation-and-the-librarian-way-a-news-literacy-toolkit-for-a-post-truth-world/

Weinberg, S. (2016, November 26). Stanford Study Finds Most Students Vulnerable To Fake News. (K. McEvers, Interviewer) Retrieved from http://www.npr.org/2016/11/22/503052574/stanford-study-finds-most-students-vulnerable-to-fake-news

Zimdars, M. (2016, November 15?). False, Misleading, Clickbait-y, and/or Satirical “News” Sources. Retrieved December 3, 2016, from http://d279m997dpfwgl.cloudfront.net/wp/2016/11/Resource-False-Misleading-Clickbait-y-and-Satirical-%E2%80%9CNews%E2%80%9D-Sources-1.pdf


Final Report

Case Study Research Report:

Learning Management System utilisation by teachers and students at a regional Victorian school.

How well are the affordances of the SIMON LMS being used by teachers and students in one specified school setting?

Executive summary:


Learning Management Systems (LMS) are web-based products which have been used by most universities for a significant period of time. Increasingly, schools, particularly at the secondary level, are also investing in such digital tools. This study compares the potential usage of a specific LMS in a small, regional, kindergarten to Year 12, Victorian Independent School with those aspects that teachers and students are actually using. Information was garnered through online surveys, and the findings suggest not only wide acceptance of some affordances by both teachers and students, but also ignorance of the potential of others. Primary Campus usage is minimal, for a number of reasons. The data were comparable with results obtained by researchers reviewing LMS use in other institutions, predominantly universities.


The Nature and Context of the Case Study:


This case study report presents the results of an empirical inquiry investigating the extent to which the SIMON LMS (SIMON Solutions, 2009), as one example of a contemporary educational phenomenon, is being used to improve teaching and learning within the context of a specific regional Victorian school. The inquiry was framed to discover the degree of usage by teachers and students, the individual uptake of those functions offered by the LMS compared to features not adopted, and the perceived advantages, problems and potential of this specific educational software. The underlying purpose was to understand user needs and perspectives, thereby identifying aspects of usage with which users of this LMS may require support, in order to improve the school’s knowledge networking and opportunities for digital innovation.


Terminology:

LMSs, also referred to as Virtual Learning Environments, Digital Learning Environments, Course Management Systems or Electronic Learning Environments, are web-based applications which are accessible wherever an Internet connection allows (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 688). While a significant amount of research has been conducted on the impact of such systems in universities – where, for example, uptake in Britain by 2005 was reported at ninety-five percent (McGill & Klobas, 2009, p. 496) – there are fewer examples focused on schools, and these are not K-12 settings. The circumstances of the chosen setting are therefore different from those of the institutions reported on in other academic literature.


SIMON is a learning management program created in 2000 by practising teachers at St Patrick’s College in Ballarat (Simkin, 2015 b). It is now owned by the Ballarat Diocese, and the original developers are still involved in managing its evolution. Whilst originally used in Catholic schools within this Diocese, SIMON usage has extended to other educational jurisdictions and Australian states. The school on which this case study focuses was one of the first Independent Schools to adopt the program, moving to SIMON from Moodle about five years ago. It has also developed a relatively collaborative relationship with the founders of the LMS, suggesting possible changes; an aspect of the specific context that does not apply to many other schools using the same product.


Constraints:

The decision to review one LMS in a single school was made to meet the constraints of the timeframe available for conducting the research, and the stipulations outlined for the writing of the research report. The chosen school has been using SIMON for six years; however, employment of the system has been observably inconsistent from both a teaching and learning perspective. There is, therefore, potential to use the findings of this investigation to drive improvement. Three forms of understanding are required before educational transformation can occur: a critique of the current, a vision of the desired, and a theory for guiding the situation from where it is to where it should be in order to achieve better outcomes (Robinson & Aronica, 2015, p. 58). This sentiment encapsulates the intention of this case study research, as investing in an LMS should result in a measurable return on investment (Leaman, 2015, para. 1).

The Process:

Literature Review:

The process commenced with a review of relevant literature, taking guidance from Thomas: the quality of the material and the standing of the publications in which it appeared were the first criteria, followed by tracking references cited within the sources investigated (Thomas, 2013, pp. 60-61). Most titles were retrieved from the university library, but one was provided through a Twitter connection which led to Professor Harland (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013), and another by a colleague: an intriguing and very specific research proposal highlighting issues which apply to segregated education, but which also served as a reminder of the challenges of mixing methodologies, and that awareness is not the same thing as use when it comes to an LMS (Algahtani, 2014, p. 16). These papers revealed some common themes surrounding LMS research, as outlined below.


Commonalities:

Research has predominantly considered the major LMS providers, notably Blackboard (which now incorporates WebCT and ANGEL (Islam, 2014, p. 252)), but also Dokeos, Smartschool (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 689), Sakai (Lonn & Teasley, 2009, p. 687), Desire2Learn (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82) and the popular open source Moodle.  Some papers analyse usage of several LMSs, while others compare the utility offered by different options such as Facebook (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013, p. 83), and SLOODLE (Moodle incorporated with Second Life as a 3D virtual learning environment) (Yasar & Adiguzel, 2010, pp. 5683 – 5685).


The majority of the literature was based on surveys, so the decision to collect information through online surveys was validated. Given that the SIMON interface is different for teachers compared to students, two surveys were required. These were constructed using Google Forms.


Limitations:

A Parent Access Module survey is being developed for future use to strengthen the evaluation process and enhance the practical application of the recommendations. Lack of time, and of researcher access to this module, precluded it from the case study. Use of SIMON by the Primary Campus would also benefit from further discussion. Analysing the purpose and style of the questionnaire was a vital starting point (Elias, 2015); therefore the main elements of Elias’ work informed the overall structure (Simkin, 2015 a).


Survey Methodology:

Qualitative and quantitative surveys elicit very different information, and the literature review resulted in the decision to incorporate both styles of questioning. Qualitative methodology enables detailed descriptions to be provided that are not constrained by the researcher, enabling respondents to elaborate on the things that matter to them (Ritchie, 2013, p. 4). The style of the qualitative questions drew on aspects of critical theory, enabling an understanding of the intersection between material conditions and their influence on human behaviour (Ritchie, 2013, p. 12). For example, the last two questions on both surveys (Appendix pages 23-25; 34-35) were ontologically focussed, aiming to compare realistic responses with idealistic possibilities.


Selecting the right tool for the anticipated outcome also required quantitative data gathering: decisions followed about which topics were appropriate to assess by checklists (Appendix pages 20 &  ) and which items needed to be evaluated through Likert scale questions (Appendix page) (Thomas, 2013, pp. 209-215). It was important to set such questions up in a neutral manner, rather than in a way that directed the result to meet preconceived ideas. Commencing with the Likert-style questions, using a scale of one to five (with one the lowest and five the highest level of agreement), allowed participants to proceed quickly through the quantifiable elements while offering a nuanced range of responses (Thomas, 2013, p. 214). This style of “scale” question allows opinions to be presented easily; for those who like to explain in more detail, and to have open-ended and creative thoughts, the qualitative questions were provided later in the survey (Thomas, 2013, p. 215).

Questions needed to cover contextual, explanatory, evaluative and generative options (Ritchie, 2013, p. 27) to allow this report to describe and critique the current situation, suggest what might be possible, and enable recommendations that might be educationally transformational (Robinson & Aronica, 2015, p. 58). The final questions were designed to evoke creative responses and raise the potential of the LMS for the next generation of learners, for whom the ideal system should be more of a learning environment or ecosystem, fitted together in the manner of building blocks to suit subject-specific requirements (Straumsheim, 2015, p. 7).
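The quantitative side of this mix can be illustrated with a small sketch. The response values below are hypothetical, not drawn from the actual surveys; the sketch simply shows how five-point Likert responses can be summarised as a mean and a distribution before the open-ended answers are analysed separately:

```python
from collections import Counter

# Hypothetical five-point Likert responses (1 = lowest, 5 = highest agreement);
# illustrative values only, not taken from the actual survey data.
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

mean_score = sum(responses) / len(responses)
distribution = Counter(responses)

# Share of respondents choosing each scale point
percentages = {point: 100 * count / len(responses)
               for point, count in sorted(distribution.items())}

print(f"Mean agreement: {mean_score:.1f}")  # Mean agreement: 3.9
for point, pct in percentages.items():
    print(f"  {point}: {pct:.0f}%")
```

Responses to the open-ended qualitative questions, by contrast, require thematic coding rather than this kind of numerical summary.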


In order to ensure clarity and precision (Thomas, 2013, p. 207), the surveys were trialled with fellow university students and work colleagues, including the school’s technical staff, who have strong knowledge of SIMON. Despite this, there were some elements that might have offered a different insight: the gender and year level of students, for example, and the teaching methods of the staff. Such omissions are typical of mixed-methods research, and hard to avoid in short time frames, especially by relatively inexperienced researchers such as myself (Algahtani, 2014, p. 16).


The anticipated findings were targeted at establishing overall satisfaction and learner engagement with SIMON’s functions in terms of organisation, pacing of work, access to resources, collection of materials, class discussion, and feedback, as outlined in the work of Rubin et al. (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82). The study had to identify the enabling functions as distinct from the hindrances, and whether they were impacting on the design of, and access to, course materials in a positive or negative manner (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82). Ease of navigation and a low number of clicks to access items can facilitate learning, where the inverse will frustrate users and lead to avoidance of features; this is particularly true of feedback. If Blackboard v.12 took twelve clicks to achieve something that Moodle could do with one, how would SIMON compare (Rubin, Fernandes, Avgerinou, & Moore, 2010, pp. 82-83)?


Critical Evaluation:

The Survey Findings:

The survey resulted in thirty-three teacher and sixty-eight student responses, or 47% and 31% respectively. All teaching staff were invited to participate, but only one Junior Campus teacher accepted the opportunity; another emailed to say it wasn’t really relevant to them. Given that these teachers generally use only a small number of SIMON’s features, this was not unreasonable. Teachers on small part-time loads were also not expected to participate; the result was therefore better than expected for the last week of term. Students from Year 9 to Year 12, who have one-to-one device access, were the target population, and the number of respondents for the busy last week was also pleasing.
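As a check on the figures above, the sizes of the invited populations can be back-calculated from the response counts and rates; the sketch below does this, so the invited totals are estimates implied by the reported percentages rather than figures from the study itself:

```python
# Back-calculating approximate invitation numbers from the reported
# response counts (33 teachers, 68 students) and rates (47%, 31%).
# The invited totals are estimates implied by those figures.
teacher_responses, teacher_rate = 33, 0.47
student_responses, student_rate = 68, 0.31

teachers_invited = round(teacher_responses / teacher_rate)  # ~70 staff
students_invited = round(student_responses / student_rate)  # ~219 students

print(f"Teachers invited: ~{teachers_invited}")
print(f"Students invited: ~{students_invited}")
```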


While every recommended avenue had been explored in terms of how to set up a valid survey instrument, and pretesting had occurred (Elias, 2015), there were still unexpected outcomes. Omissions and problems arose from the survey’s construction. It would have been helpful to know the gender of respondents, given that this has been a factor in a number of other research results, not only Algahtani’s, where such issues would be expected (Algahtani, 2014). It would also have been helpful to ascertain the teaching areas of the staff, the relative age (or years of experience) of each teacher, and the year levels of the students, as was done by Kopcha (Kopcha, 2012, p. 1116). Including questions to elicit this information would have enabled more targeted recommendations.


The use of Likert scale questions in the introductory part of the survey worked well, and respondents benefitted from having a five-point scale. The usage responses indicated that 28% of teachers believe that they use SIMON to some extent or a great extent, while 53% of students reported that their teachers used SIMON at this level (see teacher and student results below). This is an example where interpretation of results would have been more meaningful had the subjects being taught been known. Staff and students were generally more positive than negative about SIMON supporting their teaching and learning in some manner.


Another anomaly of the type referred to above was revealed by the yes/no question relating to the uploading of work (see teacher and student results below), where 81% of teachers reported that they did not ask students to do this, but 31% of students said that they did provide work to their teachers in this manner. An astonishing percentage of students reported receiving video and audio feedback (71%), where only 24% of teachers said that they provided it (see teacher and student results below). A follow-up question on which subjects were making use of this facility would have been beneficial in terms of recommendations, especially if responding teachers had been asked to indicate their faculty.


Moving from Likert scale and yes/no questions to open-ended responses proved valuable on both surveys, as had been anticipated. The number of respondents who completed these optional questions was very pleasing. The slight difference in questions between the two surveys was deliberate, to allow for the differing access teachers have to the LMS compared to students. The responses to most of the common questions demonstrated a close correlation between teacher understanding and student use, with a couple of exceptions such as those outlined above.


A summation of feelings towards the LMS elicited by the surveys indicated a strong acceptance of the technology, as has been written about by researchers reviewing usage through technology acceptance models (TAM) (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 689). As the school community concerned is technologically experienced, this was expected. Results also demonstrated that while many users verbally describe a love-hate relationship with SIMON, the use of survey methodology produced more considered feedback (Straumsheim, 2015, para. 3). Of the eighteen affordances Schoonenboom lists as desirable in an LMS, fourteen are possible using SIMON; only meetings, online examinations, peer feedback, and open hours are not possible in the manner she describes (Schoonenboom, 2014, p. 248). Interestingly, the questions aimed at improving SIMON (17 – 19 for teachers and 18 – 19 for students) did not request that any of these aspects be made available.

The broad overview of the findings from the open-ended comments (Appendix page) indicated that teachers enjoy the reporting facility because it links to the assessment module and saves them work. The most frustrating facet for both teachers and students (51%) is the number of clicks required to access work. Students highlighted the inconsistent usage of the LMS by their teachers, and sometimes indicated that components are being used incorrectly: all student work should be in the “curriculum documents” section, but some teachers are placing it in “general documents”. While there is an historic reason that may have led to this, it should no longer occur. Reporting live through assessment tasks should indicate more clearly that work is linked to the curriculum module.


Facets identified:

Five equal facets should be provided by any LMS: interoperability, personalisation, analytics, collaboration and accessibility (Straumsheim, 2015, para. 6), and, according to the findings, SIMON delivers all of these to some degree. Taking interoperability first, most e-learning tools that teachers currently use with their classes can be accommodated, either by linking the document to the system (such as collaborative OneNote notebooks), locating a file in the system, or providing a weblink.


Personalisation is also possible, and has led to some confusion, as evidenced in the responses. The school concerned has added a number of links, for example to library services, which some respondents find bewildering. This does, however, correspond to the findings reported by Horn et al. that a range of “library help objects” and links to resources accounts for user differences and support needs (Horn, Maddox, Hagel, Currie, & Owen, 2013, pp. 238-240).


Analytics are available to teachers for administrative use, such as knowing whether a student is present or has submitted work, and to the LMS administrator, for checking the number of log-ins, for example. The teacher who suggested that it would be good if SIMON could calculate School Assessment Percentages (currently done through Excel 365) would be surprised to know that, with teacher education in derived scores, it could.
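At its simplest, the calculation in question is a percentage conversion; the sketch below is a minimal illustration only, as the derived-score scaling actually used for School Assessment Percentages is not modelled here:

```python
# Minimal sketch: expressing a raw assessment mark as a percentage,
# the kind of calculation currently handled in Excel 365. The real
# School Assessment Percentage would also involve derived-score scaling,
# which is not modelled in this illustration.
def assessment_percentage(raw_score: float, max_score: float) -> float:
    """Return the raw score as a percentage of the maximum, to one decimal place."""
    return round(100 * raw_score / max_score, 1)

print(assessment_percentage(42, 60))  # 70.0
```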


Collaboration was not raised by respondents, although some referred to using the forum space. This is probably SIMON’s weakest suite, but looking at what is planned for the next software update, school, parent and student interaction should be improved (Simkin, 2015 b). Lonn and Teasley’s research indicates that few users used or rated the interactive tools, preferring to comment on the tools that push out to, or collect work from, students (Lonn & Teasley, 2009, p. 693). Collaboration through internal communities of practice and self-organising networks should become more common in the near future as more teachers look to make global connections, and as the Senior Campus moves to a one-to-one device model in 2016 (Archer, 2006, p. 67).


In terms of accessibility, one teacher, who followed up on his survey by sending an email with more detailed information (Budenberg, 2015), found that most of his issues were due to a lack of instruction during orientation. In a meeting to resolve some of his issues, Tim commented that SIMON was like a Swiss army multi-purpose knife, citing almost word for word a comment from Alier et al. which alludes to the fact that numerous tools, while helpful, may not offer the best solution for the purpose (Alier, et al., 2012, p. 107). His prior experience with Daymap, an LMS with the ability to email the parents of a class with one click, was raised face-to-face. SIMON is a much cheaper solution.


Recommendations:

The case study has achieved its goal of leading to a number of recommendations for the school under evaluation. Given that no LMS will answer everyone’s needs, it is better to work with the one that is currently provided and maximise its strengths while minimising its weaknesses (Leaman, 2015, para. 7). In this setting there is the added benefit of access to the developers.

The following recommendations will be passed to the designated, relevant groups.

For SIMON developers:

  1. While interoperability between a range of platforms and SIMON is good, retrieval of information in terms of convolution (number of clicks) and lack of search functionality is a hindrance. This requires simplification in some form.
  2. Collaboration through a range of means – chat, peer assessment and an improved forum interface – would be regarded as beneficial to communities of practice.

For the School Executive:

  1. More effective mentoring of new teachers and ongoing in-servicing of all teaching staff would improve usage for students, thereby enhancing learning.
  2. Expectations for teacher usage appear to be unclear; a clear and consistent statement is needed, and teachers need to model SIMON usage to students more effectively.

For the Teaching and Learning Committee:

Discussion is required to consider the following:

  1. Options for the provision of face-to-face assistance with SIMON mastery need to be provided for teachers and students (beyond their faculty or subject teachers).
  2. Opportunities for learning new aspects of SIMON at relevant times, for example when software is upgraded.
  3. Which desirable facets offered by other LMSs are missing from SIMON.
  4. These survey findings – to enable improved practice.


For the Information Services Department:

  • That the location of library-related information within the LMS be revisited and evaluated in terms of the most effective location/s for accessing it.


Conclusion:

The school studied has been using the SIMON Learning Management System for several years, yet uptake varies enormously. Some teachers and students rarely use it; others use all aspects of it very well. The reporting package is compulsory, and has been effectively used and appreciated by most teachers. Usage of the other features has been inconsistent. This report reveals those elements that have been used, the users’ experience with the LMS, and the outcomes that such use has enabled for them. It is important to determine why some elements have been used and others avoided. Steps should be taken to improve use, and to consider the potential impact of change on learning.

References

Algahtani, M. (2014). Factors influencing the adoption of learning management systems in the Kingdom of Saudi Arabian Universities by female academic staff. Research proposal for confirmation of candidature (PhD) DR209, 16th July 2014. Received by personal communication from Bradbeer, Susan, through a dropbox link provided by a lecturer at RMIT, 17 September 2015.

Alier, M., Mayol, E., Casan, M. J., Piguillem, J., Merriman, J. W., Conde, M. A., . . . Severance, C. (2012). Clustering projects for interoperability. Journal of Universal Computer Science, 18(1), 106-222.

Archer, N. (2006). A Classification of Communities of Practice. In Encyclopedia of Communities of Practice in Information and Knowledge Management (pp. 21-29). Information Science Reference (an imprint of IGI Global).

Budenberg, T. (2015, September 16). personal email. A request for your assistance.

De Smet, C., Bourgonjon, J., De Wever, B., Schellens, T., & Valke, M. (2012). Researching instructional use and the acceptation of learning management systems by secondary school teachers. Computers & Education, 688-696. doi:10.1016/j.compedu.2011.09.013

Elias, L. (2015, February). Intelligent Questionnaire Design for Effective Participant Evaluations. Training and Development, 8-10.

Horn, A., Maddox, A., Hagel, P., Currie, M., & Owen, S. (2013). Embedded library services: Beyond chance encounters for students from low SES backgrounds. Australian Academic and Research Libraries, 44(4), 235 – 250. doi:10.1080/00048623.2013.862149

Islam, A. N. (2014). Sources of satisfaction and dissatisfaction with a learning management system in post-adoption stage: A critical incident technique approach. Computers in Human Behavior, 249-261. doi:10.1016/j.chb.2013.09.010

Kopcha, T. J. (2012). Teachers’ perceptions of the barriers to technology integration and practices with technology under situated professional development. Computers & Education, 1109 – 1121. doi:10.1016/j.compedu.2012.05.014

Leaman, C. (2015, August 20). What If Your Learning Management System Isn’t Enough? Retrieved from eLearning Industry: http://elearningindustry.com/learning-management-system-isnt-enough

Lonn, S., & Teasley, S. D. (2009). Saving time or innovating practice: Investigating perceptions and uses of Learning Management Systems. Computers & Education(53), 686–694. doi:10.1016/j.compedu.2009.04.008

Maleko, M., Nandi, D., Hamilton, M., D’Souza, D., & Harland, J. (2013). Facebook versus Blackboard for supporting the learning of programming in a fully online course: the changing face of computer education. Learning and Teaching in Computing and Engineering, pp. 83-89. doi:10.1109/LaTiCE.2013.31

McGill, T. J., & Klobas, J. E. (2009). A task-technology fit view of learning management system impact. Computers & Education, 496 – 508. doi:10.1016/j.compedu.2008.10.002

Ritchie, J. L. (2013). Qualitative research practice: A guide for social science students and researchers. Great Britain: Sage.

Robinson, K., & Aronica, L. (2015). Creative Schools: Revolutionizing Education From The Ground Up. Melbourne: Allen Lane.

Rubin, B., Fernandes, R., Avgerinou, M. D., & Moore, J. (2010). The effect of learning management systems on student and faculty outcomes. Internet and Higher Education, 82 – 83. doi:10.1016/j.iheduc.2009.10.008

Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, 247 – 256. doi:10.1016/j.compedu.2013.09.016

Simkin, M. (2015 a, August 17). Article review. Retrieved from Digitalli: http://thinkspace.csu.edu.au/msimkin/2015/08/17/article-review/

Simkin, M. (2015 b, October 6). SIMON. Retrieved from Digitalli: http://thinkspace.csu.edu.au/msimkin/2015/10/06/simon/

SIMON Solutions. (2009). Retrieved from SIMON: http://www.simonschools.net/about-simon.html

Straumsheim, C. (2015, May 11). Brick by Brick. Retrieved from Inside Higher Ed: https://www.insidehighered.com/news/2015/05/11/educause-releases-blueprint-next-generation-learning-management-systems

Thomas, G. (2013). How To Do Your Research Project; A Guide For Students in Education and Applied Social Science. London: SAGE.

Yasar, O., & Adiguzel, T. (2010). A working successor of learning management systems: SLOODLE. Procedia Social and Behavioural Sciences 2, 5682 – 5685. doi:10.1016/j.sbspro.2010.03.928

Teacher surveys

Student surveys 

Hunting

Tracking down references:

The Journey to find authentic sources for a case study on LMS:

Research is an absorbing process, and it will take control of as much of your mind and your life as you allow. The starting point for my case study was Colloquium 2, led by CSU’s Simon Welsh http://thinkspace.csu.edu.au/msimkin/2015/09/06/lms-learning/ (Simkin, 2015). Until then I had not really engaged with why I was using our LMS, SIMON, beyond the elements we have to use, and the fact that, as most researchers have noted, it makes managing course material easier for teachers. This attitude arose partly because my initial passion at having access to an LMS had been expended on my first encounter with Moodle, into which I had invested heavily in terms of time and energy. The change to SIMON was non-negotiable, and when we made the move SIMON was only just developing the elements of our highly customised school-wide Moodle. However, taking stock of what has changed since then, I realise that I now have a much better opportunity due to the complete integration of resource bookings and reporting, as well as forums, course material of all types and access to student profile information, which integrates with our ability to email all the students in our classes with one click. Obviously this has brought economic benefits to our school as well.

Stemming from the colloquium (Welsh, 2015), the weekly modules added more fuel for the journey. Weller’s work in particular provided much food for thought around building and maintaining digital collections (Weller, 2011, p. 42). Issues such as ownership of data and appropriate ways of creating and sharing information for scholarly purposes led to this post http://thinkspace.csu.edu.au/msimkin/2015/09/06/assignment-2/ and to chasing more peer-reviewed work on the topic.

Sending out requests through the Twitter PLN resulted in a number of links. A cheeky question to @RMIT_CSIT resulted in a conversation and an interesting research paper (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013).

Initial tweets

And the conversation continued

Clarification

And was worthwhile
@hbaillie provided a link to her INF536 assessment discussing the product that was intended to be the ultimate LMS for the Victorian Education Department: http://thinkspace.csu.edu.au/hbailie/assessed-work/inf536-case-report/

@sbradbeer sent me an unpublished PhD confirmation-of-candidature paper, which, while very specific, was also enlightening in terms of challenges faced in some countries that are not an issue for us here (Algahtani, 2014).

I commenced by using the power of Primo.

Linked to CSU’s LMS

Then, ensuring that the titles I accessed were peer-reviewed and relatively recent, I selected initially on the basis of relevance to the aspects of LMS on which I wished to focus, then tracked through the references these authors had provided.

Selecting peer-reviewed titles

There was so much material that it became a necessity to call time in order to meet the impending due date.

 

References

Algahtani, M. (2014). Factors influencing the adoption of learning management systems in the Kingdom of Saudi Arabian universities by female academic staff. Research proposal for confirmation of candidature (PhD), DR209, 16 July 2014. Received by personal communication from Bradbeer, Susan, through a Dropbox link provided by a lecturer at RMIT, 17 September 2015.

Maleko, M., Nandi, D., Hamilton, M., D’Souza, D., & Harland, J. (2013). Facebook versus Blackboard for supporting the learning of programming in a fully online course: the changing face of computer education. Learning and Teaching in Computing and Engineering, pp. 83-89. doi:10.1109/LaTiCE.2013.31

Simkin, M. (2015, September 6). LMS and Learning. Retrieved from Digitalli: http://thinkspace.csu.edu.au/msimkin/2015/09/06/lms-learning/

Weller, M. (2011). The Nature of Scholarship. In M. Weller, The Digital Scholar, How Technology is Transforming Scholarly Practice (pp. 41-51). London: Bloomsbury Collections.

Welsh, S. (Host). (2015, July 28). Learning Analytics: A Traveller’s Guide; Colloquium 2. Albury, New South Wales. Retrieved from http://thinkspace.csu.edu.au/msimkin/2015/08/03/2/

SIMON

Let me introduce you to the Learning Management System (LMS) under scrutiny for my case study report.

SIMON (SIMON Solutions, 2009) has been used at the school being studied for about five years. It is a relatively local product, created 200 kilometres from the school by practising teachers. As will be demonstrated later in this blog post, this is a significant advantage compared to other products, most of which cost more to implement. The school had previously used a Moodle LMS developed to suit its purposes by an ex-student studying information technology at a university in Adelaide. SIMON was seen to offer more features and was inexpensive to introduce. As with any change in technology platforms, the early adopters bore the greatest impact of the decision.

Who is SIMON?

SIMON offers most of the functionality of other LMSs within a customisable framework.

This is the “home” screen, called “Work desk home”, which is customised for each school with its name and logo (covered in this image to protect the privacy of the school concerned).

work desk incognito

All other functions link to the work desk in one or more ways. Subject teachers rely on the Assessment and Homework sections:

Assessment view

Teachers can add documents and create folders and students can download tasks and upload their completed work for the teacher to assess:

Topic manager

Ultimately the results and comments from the Assessment module populate the reports – no further writing or reporting package expense is required. This is a great time-saving aspect of the LMS for teachers and of financial benefit to the school.

Teachers can also conduct Forums with their learning areas. Teachers’ icons have a mortarboard to identify their status, and both teacher and student icons represent the user’s gender.

Forums incognito

SIMON incorporates the school’s booking system

resource bookings

Like many features, this allows for reports of usage to be generated, although many of these are only accessible to the administrator for security reasons.

Bookings incognito

Two features that are heavily used at other schools but not at the school being studied are Behavioural Tracking (due to a sense that the systems in place prior to SIMON’s introduction were more personalised) and Commendations (acknowledged as good but not yet set up for general use).

Behaviour tracking

Commendations are shown in green while Behaviour Tracking is in red.

commendations

Other areas are available for population at the school’s discretion. The Library is represented in three different locations, two directly linked to the work desk home. This is on the left-hand side of the home screen.

Library links

The Knowledge Banks, which cover a range of topics, also include two collections put together by the Library staff (see the top left-hand folder in the image below).

Knowledge banks

Inside the Alexandra Library public folder, the Library staff can add items as requested:

Knowledge Bank Alexandra Library

The Junior Campus (Handbury) Library knowledge bank contains less information:

Knowledge Bank Handbury Library
For the school at the centre of the case study, the best aspect of using SIMON is that the teachers who developed the system continue to work on meeting the needs of schools. To facilitate this, they run regular user meetings where information is exchanged, and schools can request additions or alterations. These slides are from a recent meeting:

assistance

The need for readily available assistance has been noted and will be built into the next upgrade (due early this term). The underlying principle is to improve feedback to all stakeholders: parents, teachers and students:

Communication cycle

The Learning aspect of Learning Management Systems is considered crucial to SIMON’s success and the teaching background of those behind the product is evident:

Basis

Being able to talk to one of the developers, Kevin Brodie, was advantageous in terms of my analysis and in creating the surveys to evaluate teacher and student use of the product. It was also helpful to talk about the vision for the future and to be able to see what the next update will bring to the table.

Looking back over my subject material, I found this blog post from last semester: http://thinkspace.csu.edu.au/msimkin/2015/05/26/new-lms/. Getting to know updated systems in technology-rich environments does affect our acceptance of the technology itself!

References

Brodie, K. (2015, September 9). [Interview by M. Simkin].

PowerPoint created by SIMON developers for the May User Group Meeting, from which screenshots of relevant slides have been used with the permission of Kevin Brodie.

SIMON Solutions. (2009). Retrieved from SIMON: http://www.simonschools.net/about-simon.html

Colloquium 5

Professional Learning and Leadership

The fifth and final visitor-led presentation came from Cathie Howe (https://www.linkedin.com/in/cathiehowe), who spoke to us about her work as Professional Learning and Leadership Coordinator, NSW DEC, and Manager of the Macquarie ICT Innovations Centre (http://www.macict.edu.au/).

MacICT

As we have come to expect from a colloquium, we learned of another example of knowledge networking and digital innovation which is impacting on the skills of teachers throughout New South Wales. Cathie’s work is centred on meeting the needs of C21st learners by improving teaching and learning in the connected age. She shared this image to summarise her philosophy:

C21st learner

As with all of the colloquiums we have experienced, this presentation also revisited concepts from the first subject in our course (INF 530), such as future work skills, which are the motivation behind MacICT.

Pic link to 530

(Institute for the Future, 2011)

Cathie’s presentation was delivered with enthusiasm and she was happy to digress in order to answer our questions. The fact that the colloquium continued beyond the allocated time slot was testimony that she had engaged our class with her material.

Internet connectivity issues made themselves known in every session, in a range of ways that were annoying rather than terminal; even so, these sessions demonstrated the potential for anywhere, anytime learning in a way that differs from the methodology of a flipped classroom or Khan Academy. The latter are more akin to sage-on-the-stage teaching, while our colloquiums have seen one or more presenters engage with people in real time through oral and text connectivity.

Plenty of food for thought in terms of applications to “classroom” teaching where the room has no walls!

Thanks are due to all the presenters for the semester. Thanks too to Julie for finding such a range of fascinating presenters to further our educational program.

References

Howe, C. [Host] (2015, September 24). Colloquium 5.

Institute for the Future. (2011). Future Work Skills 2020. Retrieved October 4, 2015, from Institute for the Future: http://www.iftf.org/futureworkskills/

Macquarie University; Department of Education, New South Wales. (n.d.). MacICT. Retrieved October 4, 2015, from MacICT Macquarie ICT Innovations Centre: http://www.macict.edu.au/

What LMS should offer

What should an LMS offer?

By deciding to invest in a Learning Management System (LMS), educational institutions expect to see an impact on teaching and learning; they require that it generates a reasonable return for the money spent, that it is easy to use, and that it provides data that leads to improved learning outcomes (Leaman, 2015, p. 1). Specifications need to allocate uniform consideration to five necessary aspects: “interoperability, personalisation, analytics, collaboration and accessibility” (Straumsheim, 2015).

Often the reality of the implemented system falls short of expectations, and inherent limitations are hidden (Leaman, 2015, p. 2). This occurs because LMSs are often set up to treat learning as a series of isolated incidents rather than a continuous process that builds skills incrementally as the course progresses, and the delivery of learning may be generic rather than personalised (Leaman, 2015, p. 3). Instructors may not use many functions of the system, and students do not engage as anticipated, which compounds the issues as tangible learning is difficult to ascertain (Leaman, 2015, p. 4).

Viewing LMS in terms of learning enhancement needs to be undertaken with the understanding that an ecosystem of effective learning cannot be provided solely by the LMS, and educational institutions need to use such systems within their limitations (Leaman, 2015, p. 6). New iterations of LMS must focus on creating an environment where the parts fit together like a child’s building blocks (Straumsheim, 2015). Whatever the components (assessment modules, analytics, or others), support must be aimed at competency-based education (Straumsheim, 2015). Where there are weaknesses, educators need to augment the system by incorporating other tools, building onto what their LMS can achieve rather than replacing it with a different system (Leaman, 2015, p. 6). It is relatively common for faculty to approach their LMS with caution, in a manner similar to someone involved in a “love-hate relationship” (Straumsheim, 2015).

Schools and universities should be prepared to use systems that enable users to move freely between public and private (or open and closed) spaces, and acquiring evidence of collaborations from anywhere online should be made possible (Straumsheim, 2015). New versions of LMS should be centred on the requirements and preferences of the students, whose learning they are intended to support (Straumsheim, 2015).

References

Leaman, C. (2015, August 20). What If Your Learning Management System Isn’t Enough? Retrieved from eLearning Industry: http://elearningindustry.com/learning-management-system-isnt-enough

Straumsheim, C. (2015, May 11). Brick by Brick. Retrieved from Inside Higher Ed: https://www.insidehighered.com/news/2015/05/11/educause-releases-blueprint-next-generation-learning-management-systems

Why Use LMS?

18 Instructional Tasks for Which Instructors Might use an LMS Tool

Schoonenboom published a list of tasks for which instructors might use a Learning Management System (LMS) (Schoonenboom, 2014, p. 248). This will provide the starting point for a case study on the use of the SIMON LMS http://www.simonschools.net/about-simon.html in one school in regional Victoria.

  1. Meeting – defined as a session run through video conferencing software, which may be part of the same proprietary suite or a different medium, e.g. Skype for Business or Adobe Connect (such as our Colloquiums).
  2. Guest speaker – see above.
  3. Probing – using a digital tool such as TodaysMeet, SMS-poll or Poll Everywhere
  4. Student questions
  5. Office – fixed “open” hours for chat or discussion through mechanisms such as Skype
  6. Reference lists, or reading lists or information sources
  7. Self-testing using assessment software
  8. Exam – administer testing through digital software either in a controlled lab space or classroom or online
  9. Instructor feedback – e.g. through comments and/or reporting
  10. Portfolio – examine and comment on students’ acquired learning through their presentation of evidence in a digital portfolio system or tool, e.g. SharePoint or Class OneNote
  11. Student discussion – e.g. discussion forum
  12. Collaborative writing – e.g. through Class OneNote, wiki, blog, Google Docs
  13. Peer feedback – e.g. through Turnitin
  14. Blog – e.g. Blogger, WordPress
  15. PowerPoint – or other means of producing teacher-based material, e.g. teacher notebook with Class OneNote
  16. YouTube – link to videos on YouTube that might support in class learning programs
  17. Web Lecture – record lessons and make available online (using Office Mix record audio to go with slide presentation)
  18. Instruction – as above or other digital artefacts created specifically for the subject by the teacher

 

In constructing a survey, it will be important to raise potential uses as well as investigate the more obvious ones. There is a pressing need to elicit responses that will evaluate usefulness, ease of use and the LMS intention underpinning pedagogical development and methodology (Schoonenboom, 2014, p. 249).
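Purely as an illustration of how such survey responses might later be analysed, the sketch below averages Likert-style ratings per LMS task for two constructs (usefulness and ease of use). The task names, rating scale and numbers are invented for this example and are not taken from Schoonenboom’s instrument:

```python
# A minimal, hypothetical sketch: aggregating Likert-scale survey responses
# into per-task construct scores. All data here is invented for illustration.
from statistics import mean

# Each response maps an LMS task to 1-5 ratings for two constructs.
responses = [
    {"task": "Instructor feedback", "usefulness": 5, "ease_of_use": 4},
    {"task": "Instructor feedback", "usefulness": 4, "ease_of_use": 3},
    {"task": "Student discussion",  "usefulness": 3, "ease_of_use": 5},
    {"task": "Student discussion",  "usefulness": 2, "ease_of_use": 4},
]

def construct_means(responses, construct):
    """Average a construct's ratings per task, so tasks can be ranked."""
    by_task = {}
    for r in responses:
        by_task.setdefault(r["task"], []).append(r[construct])
    return {task: mean(scores) for task, scores in by_task.items()}

usefulness = construct_means(responses, "usefulness")
ease = construct_means(responses, "ease_of_use")
print(usefulness)  # {'Instructor feedback': 4.5, 'Student discussion': 2.5}
```

Ranking the resulting means would show which of the eighteen tasks respondents find most useful and easiest to use, which is the kind of comparison the survey is intended to support.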

References

Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, pp. 247 – 256.

 

Invisible but Vital

 

This is Our Challenge:

It is the human capacity of libraries that is critical: staff who are knowledge intermediaries, teacher-librarians, and information and data scientists. Such people work at the junction between development, information science and governance (Gregson, Brownlee, Playforth, & Bimbe, 2015, p. 6).

In today’s paradigm, libraries are responsible for providing the invisible infrastructure which enables access to information and informs research (Gregson, Brownlee, Playforth, & Bimbe, 2015, p. 22).

This is the challenge we face for Information Services at my workplace, and it’s probably common to others. We have been working through a period of transition for some time, and with the addition of 1:1 devices from Years 6 to 12 in 2016 this will accelerate.

fig 2.1

(Gregson, Brownlee, Playforth, & Bimbe, 2015, p.22)

This situation requires a change of focus, from the resources and their appropriate care and display, to the people so that what we provide suits individual needs, is accessible anytime and anywhere, and enables publishing as well as reading. We have been slowly working on this aspect as well.

table 2.2

 

References:

Gregson, J., Brownlee, J. M., Playforth, R., & Bimbe, N. (2015). Evidence Report No. 125: Policy Anticipation, Response and Evaluation: The Future of Knowledge Sharing in a Digital Age: Exploring Impacts and Policy Implications for Development. London: Institute of Development Studies.

 

 

LMS & Learning

Joining the Traveller’s Journey

(Thanks, Simon Welsh!)

While recently considering digital scholarship, and reflecting on Colloquium 2 (Welsh, 2015), the gap between the potential of Learning Management Systems and their actual usage presented itself as an issue worthy of academic investigation. Until hearing Simon speak passionately about the things many LMSs already measure, and those that could potentially be calculated and then applied to improving learning outcomes for students, I had not considered the possibilities (Welsh, 2015).

For many educators, the LMS is something that has been introduced into their working lives without explanation as to why it is needed or what it can do for learning. For secondary teaching colleagues, it has provided a platform for storing work for students, somewhere to host school-wide timetables, and more recently roll marking and report writing. Comparing the university LMS to those used at my recent schools has revealed some gaps, but the access to analytics referred to by Simon (Welsh, 2015) is not obvious to a learner in the former or a teacher in the latter.

Given that students have no say in the specific LMS required by their institution, to what extent do educators have choice in either system or what that system enables them to present (Islam, 2014, p. 253)? Do educators have freedom to create meaningful learning for their students or do the templates offered by the LMS constrain them; or is it incumbent on educators to build on what their LMS enables and augment the weaknesses (Leaman, 2015)?

Rekhari takes these concepts further by declaring that there is a chasm between learning design, technology and the LMS, due to a combination of ineffective use by educators and flaws in the design of the systems (Rekhari, 2015, p. 12). She further questions whether the benefits that LMSs are intended to deliver to educational design fail to enter praxis because developers make the software hard to use, or because educators do not proactively apply constructivist philosophies to their learning design (Rekhari, 2015, p. 13). She goes on to ask whether LMSs are themselves barriers to educational change (Rekhari, 2015, p. 13).

This publication has led to much questioning of my own practices as an educator using an LMS, and to the realisation that, beyond managing the storage and retrieval of coursework, I had not considered the other possibilities. In order to further my understanding of what our school LMS can do, I have requested time with one of the developers. To develop my understanding of practical analyses that already exist, I have turned to Twitter, where I have engaged in meaningful dialogue with several professors in the Computer Science and Information Technology Department at RMIT, who have sent me a document in which they compare Blackboard to Facebook in terms of supporting a specific online course in programming (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013). Additional reading has also been ongoing.

I “attended” the first colloquium with a degree of disinterest, predetermined on the basis of its description; yet, thanks to Simon’s predictions for the future, it intrigued me and started me on a learning journey I would never have predicted. This has proved not only interesting but potentially very useful, and will form the basis of my case study for Assignment 3. From passive user to captivated challenger, I am now wondering whether a different approach on my behalf could enable the development of a learning ecology for enhancing digital scholarship (Greenhow, Robelia, & Hughes, 2009, p. 248).

References

Greenhow, C., Robelia, B., & Hughes, J. E. (2009). Learning, teaching and scholarship in a digital age. Educational Researcher, 38(4), 246-259.

Islam, A. N. (2014). Sources of satisfaction and dissatisfaction with a learning management system in post-adoption stage: a critical incident technique approach. Computers in Human Behavior, pp. 249-261.

Leaman, C. (2015, August 20). What If Your Learning Management System Isn’t Enough? Retrieved from eLearning Industry: http://elearningindustry.com/learning-management-system-isnt-enough

Maleko, M., Nandi, D., Hamilton, M., D’Souza, D., & Harland, J. (2013). Facebook versus Blackboard for supporting the learning of programming in a fully online course: the changing face of computer education. Learning and Teaching in Computing and Engineering, pp. 83-89.

Rekhari, S. (2015, August). The Chasm – learning design, technology, and the LMS. Training and Development, pp. 12-13. Retrieved from Australian Institute of Training and Development: www.aitd.com.au

Simkin, M. (2015, August 3). #2. Retrieved from http://thinkspace.csu.edu.au/msimkin/2015/08/03/2/

Welsh, S. [Host] (2015, July 28). Learning Analytics: A Traveller’s Guide; Colloquium 2. Albury, New South Wales, Australia.

PLE & PLN – it’s us!

Open and Social Learning according to Alec Couros:

An open course entitled Education, Curriculum, and Instruction: Open, Connected, Social, using Free Open Source Software, was implemented through the University of Regina in 2008 (Couros, 2010, p. 109). It was based on personal learning networks, and participants quickly realised the value of sustainable knowledge networks. This led to a context built around a series of events which quickly absorbed participants in an engaged community of participation (Couros, 2010, p. 110).

The theoretical foundations of the course were:

  • The open movement (Couros, 2010, p. 111).
  • Complementary learning theories – social cognitive theory, social constructivism, andragogy, connectivism, and open teaching (Couros, 2010, pp. 112-115).

The primary learning environment was established collaboratively in the weeks preceding the course. The tools considered were:

WebCT (now Blackboard) – pros: familiar to students, and the university had a strong infrastructure of support; cons: proprietary (modifications needed vendor support); directed learning favoured over constructivist approaches; expensive licensing fees.

Moodle – pros: free; open source; modifiable, strong community support; touts a constructivist and social constructivist approach; available. Cons: needs PHP server infrastructure; requires technical expertise leading to hidden costs; software not as available as hoped; course-centric not student-centric; top-down instructivist approach.

Ning – pros: ease of use; freely available in 2008; familiar functionality similar to Facebook; community and individual privacy levels; user-centric spaces; content aggregation; communication tools. Cons: no wiki feature; awkward to add core content material.

Wikispaces – pros: senior, best-known and most stable of wiki providers; solid technical support; theme modification options; simple user interface – see http://eci831.ca/ (Couros, 2010, pp. 117 – 119).

The course required the establishment of a PLN, and it was mandatory that participants developed a personal blog/digital portfolio, participated in a collaborative wiki resource (no longer active, but formerly located at http://t4tl.wikispaces.com; this is what happens when such a site is not paid for!) and completed a major digital project (sounds like INF 530!) (Couros, 2010, pp. 119-120).

The course was based on the following tools and interactions:

Synchronous activities: two events per week of between 1.5 and 2 hours in length; the first based on content knowledge (like our INF 537 colloquiums); the second on teaching skills (Couros, 2010, pp. 120-121).

Asynchronous activities: researching and blogging; shared bookmarking; artefact creation; participation in open professional development opportunities; creating content and uploading it to sites such as YouTube; microblogging; collaborative lesson design and contribution to the course wiki (Couros, 2010, pp. 121-122).

Knowledge networks and digital innovation’s forerunner? Just like INF 530 and INF 536, students developed authentic, dynamic and fluid interactions, both within the designated course spaces and in spaces they chose and shared themselves.

Defining Personal Learning Environments, and comparing them to Personal Learning Networks was an exercise undertaken by Couros through Twitter and recorded at http://educationaltechnology.ca/couros/1156. Key agreement indicated that PLEs are the tools, artefacts, processes, and physical connections that allow learners to control and manage their learning (Couros, 2010, p. 125). PLNs explicitly include the human connections that result in the advancement and enabling of a PLE (Couros, 2010, p. 125).

Couros makes the following recommendations for those wishing to use PLNs for teaching and learning:

  • Immersion by participants
  • Social media literacy
  • Active contributions strengthen your PLN
  • Know your “followers” or “friends”
  • PLNs are central to learning for sustained and long-term growth in both facilitators and students (Couros, 2010, pp. 125-126).

The participatory learning communities developed by courses such as the one Couros describes continue to exist because they are not based around courses per se, but around communal learning (Couros, 2010, p. 127). Those of us taking the Knowledge Networks and Digital Innovation course can already attest to that in terms of the subjects we have already finished because for many of us the content continues to be shared and discussed. If Couros is correct, this course will never have to end – now there’s a challenge to my PLN!

Reference

Couros, A. (2010). Developing personal learning networks for open and social learning. In G. Veletsianos (Ed.), Emerging technologies in distance education (pp. 109-128). Athabasca University: AU Press.