Appraising calibre, authenticating content (not AC/DC but AC/AC!)

We are living in an era where information is readily available, easily created, generally unedited and unmoderated, and widely shared. It is vital that readers have the capacity to appraise the calibre of content they encounter. Yet it would appear that even students entering renowned universities cannot apply the most basic of filters to images or documents presented to them (Weinberg, 2016).

Some simple starting points:

Valenza-based checklist
Based on Some Rules of Thumb – a guide to assessing online news and adapted to suit all types of information (Valenza, 2016).

Without applying a filter or lens to what we read, we run the risk of spreading misinformation, thereby perpetuating deliberately created and often specifically targeted fabrications which may destabilise governments or undermine individuals. Far from choosing to be part of such a process, many people pass these fabrications on inadvertently because they aren’t taking the time to evaluate sources (Tiffany, 2016).

Teacher-librarians such as Valenza promote their role as critical in educating more news-literate and savvy information consumers. Tiffany argues that this education is more effective the earlier students encounter such educators (Tiffany, 2016).

Coupled with the relatively recent rise in the spreading of “untruthiness” is the concept, held by many, that a free press equates to neutral information (Valenza, 2016). History teachers are adept at demonstrating that the underlying perspective of the creator, or the interpretation of the historian, affects the way in which information is viewed. Much harder to teach, however, is how our own attitudes and biases affect the way we read, and how they often lead us to ignore viewpoints that differ from our own (Valenza, 2016).

Teaching younger students about appraising calibre and authenticating content is made a little easier by using a resource such as the engaging animated TED-Ed talk “How to choose your news” (Brown, 2014).

There is no doubt that there has been an exponential increase in the publication of extreme, untrue and misleading “fake news” since the rise of social media such as Twitter and Facebook, partly because the number of clicks may equate to real income for the posters (Garun, 2016). This poses a real issue for the founders of such sites, such as Facebook’s Mark Zuckerberg, who has expressed concern at the site being forced into becoming an arbiter of truth (Liptak, 2016). The sites on which such “untruthiness” is spread have become known for fostering clickbait (Zimdars, 2016).

There have also been allegations that social platforms influenced election results in several countries in 2016 (Garun, 2016). This of itself may not be all bad, but it does indicate the serious need to teach readers how to negotiate the publications of our time: to understand the underlying purpose of the publications to which they are exposed, and to question the authenticity of what they read, in much the same way that commercial transactions warn that the buyer must beware. It is critical that leading universities such as Stanford do not continue to find that their students are vulnerable to fake news (Weinberg, 2016).

It is crucial that Australian students are able to learn, within their own context, about the ways this can be an issue locally, rather than only seeing information relating to the United States. We need to be developing Australian resources to support teaching the necessary skills.

As a teacher-librarian and History teacher I am up for the challenge – are you? Join the conversation at #truthinessEDU

References

Brown, D. (Writer), & Harris-Norico, A. (Director). (2014). How to Choose Your News [Motion Picture]. TedED. Retrieved December 3, 2016, from http://ed.ted.com/lessons/how-to-choose-your-news-damon-brown

Garun, N. (2016, November 14). How social platforms influenced the 2016 election. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/14/13626694/election-2016-trending-social-media-facebook-twitter-influence

Liptak, A. (2016, November 13). Mark Zuckerberg warns about Facebook ‘becoming arbiters of truth’. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/13/13613566/mark-zuckerberg-facebook-misinformation-hoax-media

Tiffany, K. (2016, November 16). In the war on fake news, school librarians have a huge role to play. Retrieved December 3, 2016, from The Verge: http://www.theverge.com/2016/11/16/13637294/school-libraries-information-literacy-fake-news-election-2016

Valenza, J. (2016, November 26). Truth, truthiness, triangulation: A news literacy toolkit for a “post-truth” world. Retrieved December 3, 2016, from School Library Journal: http://blogs.slj.com/neverendingsearch/2016/11/26/truth-truthiness-triangulation-and-the-librarian-way-a-news-literacy-toolkit-for-a-post-truth-world/

Weinberg, S. (2016, November 26). Stanford Study Finds Most Students Vulnerable To Fake News. (K. McEvers, Interviewer) Retrieved from http://www.npr.org/2016/11/22/503052574/stanford-study-finds-most-students-vulnerable-to-fake-news

Zimdars, M. (2016, November 15). False, Misleading, Clickbait-y, and/or Satirical “News” Sources. Retrieved December 3, 2016, from http://d279m997dpfwgl.cloudfront.net/wp/2016/11/Resource-False-Misleading-Clickbait-y-and-Satirical-%E2%80%9CNews%E2%80%9D-Sources-1.pdf

 

Final Report

Case Study Research Report:

Learning Management System utilisation by teachers and students at a regional Victorian school.

How well are the affordances of the SIMON LMS being used by teachers and students in one specified school setting?

Executive summary:

 

Learning Management Systems (LMS) are web-based products which have been used by most universities for a significant period of time. Increasingly schools, particularly at the secondary level, are also investing in such digital tools. This study compares the potential usage of a specific LMS in a small, regional, kindergarten-to-Year-12 Victorian independent school with the aspects that teachers and students are actually using. Information was garnered through online surveys, and the findings suggest not only wide acceptance of some affordances by both teachers and students, but also ignorance of the potential of others. Primary Campus usage is minimal, for a number of reasons. The data were comparable with results obtained by researchers conducting LMS reviews in other institutions, predominantly universities.

 

The Nature and Context of the Case Study:

 

This case study report presents the results of an empirical inquiry investigating the extent to which the SIMON LMS (SIMON Solutions, 2009), as one example of a contemporary educational phenomenon, is being used to improve teaching and learning within the context of a specific regional Victorian school. The inquiry was framed to discover the degree of usage by teachers and students, the individual uptake of the functions offered by the LMS compared to features not adopted, and the perceived advantages, problems and potential of this specific educational software. The underlying purpose was to understand user needs and perspectives, thereby identifying aspects of usage with which users of this LMS may require support, in order to improve the school’s knowledge networking and opportunities for digital innovation.

 

 

Terminology:

LMS, also referred to as Virtual Learning Environments, Digital Learning Environments, Course Management Systems or Electronic Learning Environments, are web-based applications which are accessible wherever an Internet connection allows (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 688). While a significant amount of research has been conducted on the impact of such systems in universities, where, for example, uptake in Britain by 2005 was reported at ninety-five percent (McGill & Klobas, 2009, p. 496), there are fewer examples focused on schools, and these are not K-12 settings. The circumstances of the chosen setting are therefore different from those of the institutions reported on in other academic literature.

 

SIMON is a learning management program created in 2000 by practising teachers at St Patrick’s College in Ballarat (Simkin, 2015 b). It is now owned by the Ballarat Diocese, and the original developers are still involved in managing its evolution. While originally used in Catholic schools within this Diocese, SIMON usage has extended to other educational jurisdictions and Australian states. The school on which this case study focuses was one of the first independent schools to adopt the program, moving to SIMON from Moodle about five years ago. It has also developed a relatively collaborative relationship with the founders of the LMS by suggesting possible changes; an aspect of the specific context that does not apply to many other schools using the same product.

 

Constraints:

The decision to focus on reviewing one LMS in a single school was made to meet the constraints of the timeframe available for conducting the research and the stipulations outlined for the writing of the research report. The chosen school has been using SIMON for six years; however, employment of the system has been observably inconsistent from both a teaching and a learning perspective. There is, therefore, potential to use the findings of this investigation to lead to improvement. Three forms of understanding are required before educational transformation can occur: a critique of the current, a vision of the desired, and a theory for guiding the situation from where it is to where it should be in order to achieve better outcomes (Robinson & Aronica, 2015, p. 58). This sentiment encapsulates the intention of this case study research, as investing in an LMS should result in a measurable return on investment (Leaman, 2015, para 1).

The Process:

Literature Review:

The process commenced with a review of relevant literature. Taking guidance from Thomas, the first criteria were the quality of the material and the standing of the publications in which it appeared, followed by tracking references cited within the sources investigated (Thomas, 2013, pp. 60-61). Most titles were retrieved from the university library, but one was provided through a Twitter connection which led to Professor Harland (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013), and another came from a colleague: an intriguing and very specific research proposal highlighting issues which apply to segregated education, which also served as a reminder of the challenges of mixing methodologies, and that awareness is not the same thing as use when it comes to an LMS (Algahtani, 2014, p. 16). These papers revealed some common themes surrounding LMS research, as outlined below.

 

Commonalities:

Research has predominantly considered the major LMS providers, notably Blackboard (which now incorporates WebCT and ANGEL (Islam, 2014, p. 252)), but also Dokeos, Smartschool (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 689), Sakai (Lonn & Teasley, 2009, p. 687), Desire2Learn (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82) and the popular open source Moodle.  Some papers analyse usage of several LMSs, while others compare the utility offered by different options such as Facebook (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013, p. 83), and SLOODLE (Moodle incorporated with Second Life as a 3D virtual learning environment) (Yasar & Adiguzel, 2010, pp. 5683 – 5685).

 

The majority of the literature was based on surveys, so the decision to collect information through online surveys was validated. Given that the SIMON interface is different for teachers compared to students, two surveys were required. These were constructed using Google Forms.

 

Limitations:

A Parent Access Module survey is being developed for future use to strengthen the evaluation process and enhance the practical application of the recommendations; lack of time, and of researcher access to this module, precluded it from the case study. Use of SIMON by the Primary Campus would also benefit from further discussion. Analysing the purpose and style of the questionnaire was a vital starting point (Elias, 2015); the main elements of Elias’ work therefore informed the overall structure (Simkin, 2015 a).

 

Survey Methodology:

Qualitative and quantitative surveys elicit very different information, and the literature review resulted in the decision to incorporate both styles of questioning. Qualitative methodology enables detailed descriptions to be provided that are not constrained by the researcher, allowing respondents to elaborate on the things that matter to them (Ritchie, 2013, p. 4). The style of qualitative questions drew on aspects of critical theory, enabling an understanding of the intersection between material conditions and their influence on human behaviour (Ritchie, 2013, p. 12). For example, the last two questions on both surveys (Appendix pages 23-25; 34-35) were ontologically focussed, aiming to compare realistic responses with idealistic possibilities.

 

Selecting the right tool for the anticipated outcome also required quantitative data gathering: deciding which topics were appropriate to assess by checklists (Appendix pages 20 & ) and which items needed to be evaluated through Likert scale questions (Appendix page) (Thomas, 2013, pp. 209-215). It was important to set such questions up in a neutral manner, rather than in a way that directed the result to meet preconceived ideas. Commencing with the Likert-style questions, using a scale of one to five (with one the lowest and five the highest level of agreement), allowed participants to proceed quickly through the quantifiable elements while offering a nuanced range of responses (Thomas, 2013, p. 214). This style of “scale” question allows opinions to be presented easily; for those who like to explain in more detail, and to offer open-ended and creative thoughts, qualitative questions were provided later in the survey (Thomas, 2013, p. 215).

Questions needed to cover contextual, explanatory, evaluative and generative options (Ritchie, 2013, p. 27) to allow this report to describe and critique the current, suggest what might be possible, and enable recommendations that might be educationally transformational (Robinson & Aronica, 2015, p. 58). The final questions were designed to evoke creative responses and raise the potential of the LMS for the next generation of learners, where the ideal system should be more of a learning environment or ecosystem, fitted together in the manner of building blocks to suit subject-specific requirements (Straumsheim, 2015, p. 7).
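One advantage of the one-to-five scale is that exported responses are trivially quantifiable. As a minimal sketch (the question wording and response values here are invented for illustration, not actual survey data):

```python
# Sketch: summarising exported 1-5 Likert responses for one question.
# The question and the response values are illustrative only.
from collections import Counter

# "To what extent do you use SIMON?" (1 = not at all, 5 = a great extent)
responses = [4, 5, 3, 4, 2, 5, 4, 3]

distribution = Counter(responses)                 # count per scale point
mean_rating = sum(responses) / len(responses)     # central tendency
# share answering "some extent" or "a great extent" (4 or 5)
high_use = sum(1 for r in responses if r >= 4) / len(responses)

print(dict(sorted(distribution.items())))   # {2: 1, 3: 2, 4: 3, 5: 2}
print(f"mean: {mean_rating:.2f}")
print(f"share at 4-5: {high_use:.0%}")
```

The same summary can of course be produced directly from a Google Forms spreadsheet export; the point is that neutral, fixed-scale questions yield comparable numbers across the teacher and student cohorts.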

 

In order to ensure clarity and precision (Thomas, 2013, p. 207), the surveys were trialled with fellow university students and work colleagues, including the school’s technical staff, who have strong knowledge of SIMON. Despite this, there were some elements that might have offered a different insight: the gender and year level of students, for example, and the teaching methods of the staff. Such omissions are typical of mixed-methods research, and hard to avoid in short time frames, especially by relatively inexperienced researchers such as myself (Algahtani, 2014, p. 16).

 

The anticipated findings were targeted at establishing overall satisfaction and learner engagement with SIMON’s functions in terms of organisation, pacing of work, access to resources, collection of materials, class discussion, and feedback, as outlined in the work of Rubin et al. (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82). The study had to identify the enabling functions as distinct from the hindrances, and whether they were impacting on the design of and access to course materials in a positive or negative manner (Rubin, Fernandes, Avgerinou, & Moore, 2010, p. 82). Ease of navigation and a low number of clicks to access items can facilitate learning, while the inverse will frustrate users and lead to avoidance of features; this is particularly true of feedback. If Blackboard v.12 took twelve clicks to achieve something that Moodle could do with one, how did SIMON compare (Rubin, Fernandes, Avgerinou, & Moore, 2010, pp. 82-83)?

 

Critical Evaluation:

The Survey Findings:

The survey resulted in thirty-three teacher and sixty-eight student responses, or 47% and 31% of each population respectively. All teaching staff were invited to participate, but only one Junior Campus teacher accepted the opportunity; another emailed to say it wasn’t really relevant to them. Given that these teachers generally use only a small number of SIMON’s features, this was not unreasonable. Teachers on small part-time loads were also not expected to participate; this result was therefore better than expected for the last week of term. Students from Year 9 to Year 12, who have one-to-one device access, were the target population, and the number of respondents for the busy last week was also pleasing.

 

While every recommended avenue had been explored in terms of how to set up a valid survey instrument, and pretesting had occurred (Elias, 2015), there were still unexpected outcomes. Omissions and problems arose from the survey’s construction. It would have been helpful to know the gender of respondents, given that this has been a factor in a number of other research results, not only Algahtani’s, where such issues would be expected (Algahtani, 2014). It would also have been helpful to ascertain the teaching areas of the staff, the relative age group of each teacher (or years of experience), and the year levels of the students, as was done by Kopcha (Kopcha, 2012, p. 1116). Including questions to elicit this information would have enabled more targeted recommendations.

 

The use of Likert scale questions in the introductory part of the survey worked well, and respondents benefitted from having a five-point scale. The usage responses indicated that 28% of teachers (see teacher and student results below) believe that they use SIMON to some extent or a great extent, while 53% of students reported that their teachers used SIMON at this level. This is an example where interpretation of results would have been more meaningful had the subjects being taught been known. On balance, staff and students were more positive than negative about whether SIMON supported their teaching and learning.

 

Another anomaly of the type referred to above was revealed by the yes/no question relating to the uploading of work (see teacher and student results below), where 81% of teachers reported that they did not ask students to do this, but 31% of students said that they did provide work to their teachers in this manner. An astonishing percentage of students reported video and audio feedback being provided (71%), where only 24% of teachers said that they provided this (see teacher and student results below). A follow-up question on which subjects were making use of this facility would have been beneficial in terms of recommendations, especially if responding teachers had been asked to indicate their faculty.

 

Moving from Likert scale and yes/no questions to open-ended responses proved valuable on both surveys, as had been anticipated. The number of respondents who completed these optional questions was very pleasing. The slight difference in questions between the two surveys was deliberate, to allow for the differing access teachers have to the LMS compared to students. The responses to most of the common questions demonstrated a close correlation between teacher understanding and student use, with a couple of exceptions such as those outlined above.

 

A summation of feelings towards the LMS elicited by the surveys indicated a strong acceptance of the technology, a phenomenon written about by researchers reviewing usage through technology acceptance models (TAM) (De Smet, Bourgonjon, De Wever, Schellens, & Valke, 2012, p. 689). As the school community concerned is technologically experienced, this was expected. Results also demonstrated that while many users verbally describe a love-hate relationship with SIMON, the use of survey methodology produced more considered feedback (Straumsheim, 2015, para 3). Of the eighteen affordances Schoonenboom lists as desirable in an LMS, fourteen are possible using SIMON; only meetings, online examinations, peer feedback, and open hours are not possible in the manner she describes (Schoonenboom, 2014, p. 248). Interestingly, the questions aimed at improving SIMON (17 – 19 for teachers and 18 – 19 for students) did not request that any of these aspects be made available.

The broad overview of the findings from the open-ended comments (Appendix page) indicated that teachers enjoy the reporting facility because it links to the assessment module and saves them work. The most frustrating facet for both teachers and students is the number of clicks required to access work (51%). Students highlighted the inconsistent usage of the LMS by their teachers, and sometimes indicated that components are being used incorrectly: all student work should be in the “curriculum documents” section, but some teachers are placing it in “general documents”. While there is an historic reason that may have led to this, it should no longer occur. Reporting live through assessment tasks should indicate more clearly that work is linked to the curriculum module.

 

Facets identified:

Five equal facets should be provided by any LMS: interoperability, personalisation, analytics, collaboration and accessibility (Straumsheim, 2015, para 6), and, according to the findings, SIMON delivers all of these to some degree. Taking interoperability first, most e-learning tools that teachers currently use with their classes can be accommodated, either by linking a document to the system (such as collaborative OneNote notebooks), locating a file in the system, or providing a weblink.

 

Personalisation is also possible, and has led to some confusion, as evidenced in the responses. The school concerned has added a number of links, for example to library services, which some respondents find bewildering. It does, however, correspond to the findings reported by Horn et al. that a range of “library help objects” and links to resources accounts for user differences and support needs (Horn, Maddox, Hagel, Currie, & Owen, 2013, pp. 238-240).

 

Analytics are available to teachers for administrative use, such as knowing whether a student is present or has submitted their work, and also to the LMS administrator, for example for checking the number of logins. The teacher who suggested that it would be good if SIMON could calculate School Assessment Percentages (currently done through Excel 365) would be surprised to know that, with teacher education in derived scores, it could.
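At its core, the calculation that teacher currently performs in Excel is a weighted percentage. A hypothetical sketch (the task scores and weightings below are invented purely to illustrate the arithmetic, not taken from the school’s assessment scheme):

```python
# Hypothetical sketch of a School Assessment Percentage as a weighted
# mean of task results. Scores, maximums and weights are illustrative.
tasks = [
    # (score achieved, maximum score, weighting)
    (42, 50, 0.4),   # e.g. an assessment task worth 40%
    (18, 20, 0.3),   # e.g. a topic test worth 30%
    (27, 30, 0.3),   # e.g. an examination worth 30%
]

# each task contributes (score / maximum) * weight; sum and scale to 100
percentage = sum(score / maximum * weight
                 for score, maximum, weight in tasks) * 100
print(f"School Assessment Percentage: {percentage:.1f}%")
```

Whether SIMON exposes exactly this calculation would depend on how its assessment module is configured, but the arithmetic itself is straightforward once weightings are agreed.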

 

Collaboration was not raised by respondents, although some referred to using the forum space. This is probably SIMON’s weakest suite, but looking at what is planned for the next software update, school, parent and student interaction should be improved (Simkin, 2015 b). Lonn and Teasley’s research indicates that few users used or rated the interactive tools, preferring to comment on the tools that push work out to, or collect work from, students (Lonn & Teasley, 2009, p. 693). Collaboration through internal communities of practice and self-organising networks should become more common in the near future as more teachers look to make global connections and the Senior Campus moves to a one-to-one device model in 2016 (Archer, 2006, p. 67).

 

 

In terms of accessibility, one teacher, who followed up on his survey by sending an email with more detailed information (Budenberg, 2015), found that most of his issues were due to a lack of instruction during orientation. In a meeting to resolve some of his issues, Tim commented that SIMON was like a Swiss army multi-purpose knife, citing almost word for word a comment from Alier et al. which alludes to the fact that numerous tools, while helpful, may not offer the best solution for the purpose (Alier, et al., 2012, p. 107). His prior experience with Daymap, an LMS with the ability to email the parents of a class with one click, was raised face-to-face. SIMON is a much cheaper solution.

 

 

Recommendations:

The case study has achieved its goal of leading to a number of recommendations for the school under evaluation. Given that no LMS will answer everyone’s needs, it is better to work with the one currently provided, maximising its strengths while minimising its weaknesses (Leaman, 2015, para 7). In this setting there is the added benefit of access to the developers.

The following recommendations will be passed to the designated, relevant groups.

For SIMON developers:

  1. While interoperability between a range of platforms and SIMON is good, retrieval of information in terms of convolution (number of clicks) and lack of search functionality is a hindrance. This requires simplification in some form.
  2. Collaboration through a range of means (chat, peer assessment and an improved forum interface) would be regarded as beneficial to communities of practice.

For the School Executive:

  1. More effective mentoring of new teachers and ongoing in-servicing of all teaching staff would improve usage for students, thereby enhancing learning.
  2. Expectations for usage by teachers appear to be unclear; a clear and consistent statement is needed. Teachers need to model SIMON usage to students more effectively.

For the Teaching and Learning Committee:

Discussion is required to consider the following:

  1. Options for providing face-to-face assistance with SIMON mastery for teachers and students (beyond their faculty or subject teachers).
  2. Opportunities for learning new aspects of SIMON at relevant times, for example when software is upgraded.
  3. Which desirable facets offered by other LMSs are missing from SIMON.
  4. These survey findings – to enable improved practice.

 

For the Information Services Department:

  • That the location of library-related information within the LMS be revisited and evaluated in terms of the most effective location/s for accessing it.

 

Conclusion:

The school studied has been using the SIMON Learning Management System for several years, yet uptake varies enormously. Some teachers and students rarely use it; others use all aspects of it well. The reporting package is compulsory, and has been effectively used and appreciated by most teachers. Usage of the other features has been inconsistent. This report reveals the elements that have been used, the users’ experience with the LMS, and the outcomes that such use has enabled for them. It is important to determine why some elements have been used and others avoided. Steps should be taken to improve use, and to consider the potential impact of change for learning.

References

Algahtani, M. (2014). Factors influencing the adoption of learning management systems in the Kingdom of Saudi Arabian Universities by female academic staff. Research proposal for confirmation of candidature (PhD) DR209 16th July 2014. Received by personal communication from Bradbeer, Susan, through a dropbox link provided by a lecturer at RMIT, 17 September 2015

Alier, M., Mayol, E., Casan, M. J., Piguillem, J., Merriman, J. W., Conde, M. A., . . . Severance, C. (2012). Clustering projects for interoperability. Journal of Universal Computer Science, 18(1), 106-222.

Archer, N. (2006). A Classification of Communities of Practice. In Encyclopedia of Communities of Practice in Information and Knowledge Management (pp. 21-29). Information Science Reference (an imprint of IGI Global).

Budenberg, T. (2015, September 16). personal email. A request for your assistance.

De Smet, C., Bourgonjon, J., De Wever, B., Schellens, T., & Valke, M. (2012). Researching instructional use and the acceptation of learning management systems by secondary school teachers. Computers & Education, 688-696. doi:10.1016/j.compedu.2011.09.013

Elias, L. (2015, February). Intelligent Questionnaire Design for Effective Participant Evaluations. Training and Development, 8-10.

Horn, A., Maddox, A., Hagel, P., Currie, M., & Owen, S. (2013). Embedded library services: Beyond chance encounters for students from low SES backgrounds. Australian Academic and Research Libraries, 44 (4), pp. 235 – 250. doi:10.1080/00048623.2013.862149

Islam, A. N. (2014). Sources of satisfaction and dissatisfaction with a learning management system in post-adoption stage: a critical incident technique approach. Computers in Human Behavior, 249-261. doi:10.1016/j.chb.2013.09.010

Kopcha, T. J. (2012). Teachers’ perceptions of the barriers to technology integration and practices with technology under situated professional development. Computers & Education, 1109 – 1121. doi:10.1016/j.compedu.2012.05.014

Leaman, C. (2015, August 20). What If Your Learning Management System Isn’t Enough? Retrieved from eLearning Industry: http://elearningindustry.com/learning-management-system-isnt-enough

Lonn, S., & Teasley, S. D. (2009). Saving time or innovating practice: Investigating perceptions and uses of Learning Management Systems. Computers & Education(53), 686–694. doi:10.1016/j.compedu.2009.04.008

Maleko, M., Nandi, D., Hamilton, M., D’Souza, D., & Harland, J. (2013). Facebook versus Blackboard for supporting the learning of programming in a fully online course: the changing face of computer education. Learning and Teaching in Computing and Engineering, pp. 83-89. doi:10.1109/LaTiCE.2013.31

McGill, T. J., & Klobas, J. E. (2009). A task-technology fit view of learning management system impact. Computers & Education, 496 – 508. doi:10.1016/j.compedu.2008.10.002

Ritchie, J. L. (2013). Qualitative research practice: A guide for social science students and researchers. Great Britain: Sage.

Robinson, K., & Aronica, L. (2015). Creative Schools: Revolutionizing Education From The Ground Up. Melbourne: Allen Lane.

Rubin, B., Fernandes, R., Avgerinou, M. D., & Moore, J. (2010). The effect of learning management systems on student and faculty outcomes. Internet and Higher Education, 82 – 83. doi:10.1016/j.iheduc.2009.10.008

Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, pp. 247 – 256. doi:10.1016/j.compedu.2013.09.016

Simkin, M. (2015 a, August 17). Article review. Retrieved from Digitalli: http://thinkspace.csu.edu.au/msimkin/2015/08/17/article-review/

Simkin, M. (2015 b, October 6). SIMON. Retrieved from Digitalli: http://thinkspace.csu.edu.au/msimkin/2015/10/06/simon/

SIMON Solutions. (2009). Retrieved from SIMON: http://www.simonschools.net/about-simon.html

Straumsheim, C. (2015, May 11). Brick by Brick. Retrieved from Inside Higher Ed: https://www.insidehighered.com/news/2015/05/11/educause-releases-blueprint-next-generation-learning-management-systems

Thomas, G. (2013). How To Do Your Research Project; A Guide For Students in Education and Applied Social Science. London: SAGE.

Yasar, O., & Adiguzel, T. (2010). A working successor of learning management systems: SLOODLE. Procedia Social and Behavioural Sciences 2, 5682 – 5685. doi:10.1016/j.sbspro.2010.03.928

Teacher surveys

Student surveys 

Hunting

Tracking down references:

The Journey to find authentic sources for a case study on LMS:

Research is an absorbing process, and it will take control of as much of your mind and your life as you allow. The starting point for my case study was Colloquium 2, led by CSU’s Simon Welsh http://thinkspace.csu.edu.au/msimkin/2015/09/06/lms-learning/ (Simkin, 2015). Until then I had not really engaged with why I was using our LMS, SIMON, beyond the elements we have to use, and the fact that, as most researchers have noted, it makes managing course material easier for teachers. This attitude was partly because my initial passion for having access to an LMS had been expended on my first encounter with Moodle, in which I had invested heavily in terms of time and energy. The change to SIMON was non-negotiable, and when we made the move SIMON was only just developing the elements of our highly customised school-wide Moodle.

Taking stock of what has changed since then, however, I realise that I now have a much better opportunity due to the complete integration of resource bookings and reporting, as well as forums, course material of all types and access to student profile information, which integrates with our ability to email all the students in our classes with one click. Obviously this has brought economic benefits to our school as well.

Stemming from the colloquium (Welsh, 2015), the weekly modules added more fuel for the journey. Weller’s work in particular introduced much food for thought around building and maintaining digital collections (Weller, 2011 p.42). Issues such as ownership of data and appropriate ways of creating and sharing information for scholarly purposes led to creating this post http://thinkspace.csu.edu.au/msimkin/2015/09/06/assignment-2/ and chasing more peer-reviewed work on the topic.

Sending out requests through the Twitter PLN resulted in a number of links. A cheeky question to @RMIT_CSIT resulted in a conversation and an interesting research paper (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013).

Initial tweets
And the conversation continued

Clarification

And was worthwhile

@hbaillie provided a link to her INF536 assessment discussing the product that was intended to be the ultimate LMS for the Victorian Education Department: http://thinkspace.csu.edu.au/hbailie/assessed-work/inf536-case-report/

@sbradbeer sent me an unpublished PhD confirmation of candidature paper, which, while very specific, was also enlightening in terms of challenges faced in some countries that are not an issue for us here (Algahtani, 2014).

I commenced by using the power of Primo.

Linked to CSU’s LMS

Then, ensuring that the titles I accessed were peer-reviewed and relatively recent, I selected initially on the basis of relevance to the aspects of LMS on which I wished to focus, then tracked through the references these authors had provided.

Selecting peer-reviewed titles

There was so much material that it became a necessity to call time in order to meet the impending due date.

References

Algahtani, M. (2014). Factors influencing the adoption of learning management systems in the Kingdom of Saudi Arabian Universities by female academic staff. Research proposal for confirmation of candidature (PhD), DR209, 16 July 2014. Received by personal communication from Bradbeer, Susan, through a Dropbox link provided by a lecturer at RMIT, 17 September 2015.

Maleko, M., Nandi, D., Hamilton, M., D’Souza, D., & Harland, J. (2013). Facebook versus Blackboard for supporting the learning of programming in a fully online course: the changing face of computer education. Learning and Teaching in Computing and Engineering, pp. 83-89. doi:10.1109/LaTiCE.2013.31

Simkin, M. (2015, September 6). LMS and Learning. Retrieved from Digitalli: http://thinkspace.csu.edu.au/msimkin/2015/09/06/lms-learning/

Weller, M. (2011). The Nature of Scholarship. In M. Weller, The Digital Scholar, How Technology is Transforming Scholarly Practice (pp. 41-51). London: Bloomsbury Collections.

Welsh, S. (Host). (2015, July 28). Learning Analytics: A Traveller’s Guide; Colloquium 2. Retrieved from http://thinkspace.csu.edu.au/msimkin/2015/08/03/2/ Albury, New South Wales.


SIMON

Let me introduce you to the Learning Management System (LMS) under scrutiny in my case study report.

SIMON (SIMON Solutions, 2009) has been used at the school being studied for about five years. It is a relatively local product, created 200 kilometres from the school by practising teachers. As will be demonstrated later in this blog post, this is a significant advantage compared to other products, most of which cost more to implement. The school had previously used a Moodle LMS developed to suit its purposes by an ex-student studying information technology at a university in Adelaide. SIMON was seen to offer more features and was inexpensive to introduce. As with any change in technology platforms, the early adopters bore the greatest impact of the decision.

Who is SIMON?

SIMON offers most of the functionality of other LMSs within a customizable framework.


This is the “home” screen – called “Work desk home”, which is customised for each school with name and logo (covered in this image to protect the privacy of the school concerned).

work desk incognito

All other functions link to the work desk in one or more ways. Subject teachers rely on the Assessment and Homework sections:

Assessment view

Teachers can add documents and create folders and students can download tasks and upload their completed work for the teacher to assess:

Topic manager

Ultimately the results and comments from the Assessment module populate the reports – no further writing or reporting package expense is required. This is a great time-saving aspect of the LMS for teachers and of financial benefit to the school.

Teachers can also conduct Forums with their learning areas. Teachers’ icons have a mortarboard to identify their status, and both teacher and student icons represent the user’s gender.

Forums incognito

SIMON incorporates the school’s booking system

resource bookings

Like many features, this allows for reports of usage to be generated, although many of these are only accessible to the administrator for security reasons.

Bookings incognito

Two features that are heavily used at other schools, but not at the school being studied, are Behavioural Tracking (due to a sense that the systems in place prior to SIMON’s introduction were more personalised) and Commendations (acknowledged as good but not yet set up for general use).

Behaviour tracking

Commendations are shown in green while Behaviour Tracking is in red.

commendations


Other areas are available for population at the school’s discretion. The Library is represented in three different locations, two of which link directly to the work desk home. These links appear on the left-hand side of the home screen.

Library links


The Knowledge Banks, which cover a range of topics, also include two collections put together by the Library staff (see the top left-hand folder in the image below).

Knowledge banks

Inside the Alexandra Library public folder, the Library staff can add items as requested:

Knowledge Bank Alexandra Library

The Junior Campus (Handbury) Library knowledge bank contains less information:

Knowledge Bank Handbury Library


For the school at the centre of the case study, the best aspect of using SIMON is that the teachers who developed the system continue to work on meeting the needs of schools. To facilitate this, they run regular user meetings where information is exchanged and schools can request additions or alterations. These slides are from a recent meeting:

assistance

The need for readily available assistance has been noted and will be built into the next upgrade (due early this term). The underlying principle is to improve feedback to all stakeholders: parents, teachers and students:

Communication cycle

The Learning aspect of Learning Management Systems is considered crucial to SIMON’s success and the teaching background of those behind the product is evident:

Basis

Being able to talk to one of the developers, Kevin Brodie, was advantageous in terms of my analysis and in creating the surveys to evaluate teacher and student use of the product. It was also helpful to talk about the vision for the future and to be able to see what the next update will bring to the table.

Looking back over my subject material I found this blog post from last semester: http://thinkspace.csu.edu.au/msimkin/2015/05/26/new-lms/. Getting to know updated systems in technology-rich environments does affect our acceptance of the technology itself!

References

Brodie, K. (2015, September 9). (M. Simkin, Interviewer)

PowerPoint created by SIMON developers for the May User Group Meeting, from which screenshots of relevant slides have been used with the permission of Kevin Brodie.

SIMON Solutions. (2009). Retrieved from SIMON: http://www.simonschools.net/about-simon.html

What should an LMS offer?

By deciding to invest in a Learning Management System (LMS), educational institutions expect to see an impact on teaching and learning; they require that it generate a reasonable return for the money spent, that it be easy to use, and that it provide data that leads to improved learning outcomes (Leaman, 2015, p. 1). Specifications need to give uniform consideration to five necessary aspects: “interoperability, personalisation, analytics, collaboration and accessibility” (Straumsheim, 2015).

Often the reality of the system implemented falls short of expectations, and inherent limitations are hidden (Leaman, 2015, p. 2). This occurs because LMSs are often set up to treat learning as a series of isolated incidents rather than a continuous process that builds skills incrementally as a course progresses, and the learning delivery may be generic rather than personalised (Leaman, 2015, p. 3). Instructors may not use many functions of the system, and students do not engage as anticipated, which compounds the issues, as tangible learning is difficult to ascertain (Leaman, 2015, p. 4).

Viewing an LMS in terms of learning enhancement needs to be undertaken with the understanding that an ecosystem of effective learning cannot be provided by the LMS alone, and educational institutions need to use such systems within their limitations (Leaman, 2015, p. 6). New iterations of LMSs must focus on creating an environment where the parts fit together like a child’s building blocks (Straumsheim, 2015). Whatever the components (assessment modules, analytics or others), support must be aimed at competency-based education (Straumsheim, 2015). Where there are weaknesses, educators need to compensate by incorporating other tools and building onto what their LMS can achieve, rather than replacing it with a different system (Leaman, 2015, p. 6). It is relatively common for faculty personnel to approach their LMS with caution, in a manner similar to someone in a “love-hate relationship” (Straumsheim, 2015).

Schools and universities should be prepared to use systems that enable users to move freely between public and private (or open and closed) spaces, and acquiring evidence of collaborations from anywhere online should be made possible (Straumsheim, 2015). New versions of LMS should be centred on the requirements and preferences of the students, whose learning they are intended to support (Straumsheim, 2015).

References

Leaman, C. (2015, August 20). What If Your Learning Management System Isn’t Enough? Retrieved from eLearning Industry: http://elearningindustry.com/learning-management-system-isnt-enough

Straumsheim, C. (2015, May 11). Brick by Brick. Retrieved from Inside Higher Ed: https://www.insidehighered.com/news/2015/05/11/educause-releases-blueprint-next-generation-learning-management-systems


Why Use LMS?

18 Instructional Tasks for Which Instructors Might Use an LMS Tool

Schoonenboom published a list of tasks for which instructors might use a Learning Management System (LMS) (Schoonenboom, 2014, p. 248). This list will provide the starting point for a case study on the use of the SIMON LMS tool http://www.simonschools.net/about-simon.html in one school in regional Victoria.

  1. Meeting – defined as a session run through video-conferencing software, which may be part of the same proprietary suite or a different medium, e.g. Skype for Business or Adobe Connect (such as our colloquiums).
  2. Guest speaker – see above.
  3. Probing – using a digital tool such as TodaysMeet, SMS-poll or Poll Everywhere
  4. Student questions
  5. Office – fixed “open” hours for chat or discussion through mechanisms such as Skype
  6. Reference lists, or reading lists or information sources
  7. Self-testing using assessment software
  8. Exam – administer testing through digital software either in a controlled lab space or classroom or online
  9. Instructor feedback – e.g. through comments and or reporting
  10. Portfolio – examine and comment on students’ acquired learning through their presentation of evidence in a digital portfolio system or tool, e.g. SharePoint or Class OneNote
  11. Student discussion – e.g. discussion forum
  12. Collaborative writing – e.g. through Class OneNote, wiki, blog, Google Docs
  13. Peer feedback – e.g. through Turnitin
  14. Blog – e.g. Blogger, WordPress
  15. PowerPoint – or other means of producing teacher based material e.g. Teacher notebook with Class OneNote
  16. YouTube – link to videos on YouTube that might support in class learning programs
  17. Web Lecture – record lessons and make them available online (e.g. using Office Mix to record audio to accompany a slide presentation)
  18. Instruction – as above or other digital artefacts created specifically for the subject by the teacher


In constructing a survey, it will be important to canvass potential uses as well as to investigate the more obvious ones. There is a pressing need to elicit responses that evaluate usefulness, ease of use, and the intention underpinning pedagogical development and methodology (Schoonenboom, 2014, p. 249).
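To make the analysis stage concrete, the kind of summary such a survey could feed might be sketched as below. The task names, the 1–5 Likert scale and the `summarise` helper are illustrative assumptions only, not Schoonenboom’s instrument or the actual SIMON survey.

```python
# Hypothetical sketch: summarising per-task Likert responses on two of the
# dimensions named above (usefulness and ease of use). Task names and the
# 1-5 scale are assumptions for illustration.
from statistics import mean

# Each response maps an LMS task to a (usefulness, ease_of_use) pair.
responses = [
    {"Forums": (4, 5), "Assessment upload": (5, 4), "Resource booking": (3, 2)},
    {"Forums": (3, 4), "Assessment upload": (5, 5), "Resource booking": (4, 3)},
    {"Forums": (5, 4), "Assessment upload": (4, 4), "Resource booking": (2, 2)},
]

def summarise(responses):
    """Return (task, (mean usefulness, mean ease)) pairs, most useful first."""
    tasks = {}
    for response in responses:
        for task, (useful, easy) in response.items():
            tasks.setdefault(task, []).append((useful, easy))
    summary = {
        task: (mean(u for u, _ in scores), mean(e for _, e in scores))
        for task, scores in tasks.items()
    }
    return sorted(summary.items(), key=lambda item: item[1][0], reverse=True)

for task, (usefulness, ease) in summarise(responses):
    print(f"{task}: usefulness {usefulness:.1f}, ease of use {ease:.1f}")
```

Ranking tasks this way would show at a glance which LMS functions staff find both useful and easy, and which are candidates for training or for quiet retirement.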

References

Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, pp. 247 – 256.


Invisible but Vital


This is Our Challenge:

It is the human capacity of libraries that is critical: staff who are knowledge intermediaries, teacher-librarians, and information and data scientists. Such people work at the junction between development, information science and governance (Gregson, Brownlee, Playforth, & Bimbe, 2015, p. 6).

In today’s paradigm, libraries are responsible for providing the invisible infrastructure which enables access to information and informs research (Gregson, Brownlee, Playforth, & Bimbe, 2015, p. 22).

This is the challenge we face for Information Services at my workplace, and it is probably common to others. We have been working through a period of transition for some time, and with the addition of 1:1 devices from Years 6 to 12 in 2016 this will accelerate.

fig 2.1

(Gregson, Brownlee, Playforth, & Bimbe, 2015, p. 22)

This situation requires a change of focus, from the resources and their appropriate care and display, to the people so that what we provide suits individual needs, is accessible anytime and anywhere, and enables publishing as well as reading. We have been slowly working on this aspect as well.

table 2.2

 

References

Gregson, J., Brownlee, J. M., Playforth, R., & Bimbe, N. (2015). Evidence Report No. 125: Policy Anticipation, Response and Evaluation: The Future of Knowledge Sharing in a Digital Age: Exploring Impacts and Policy Implications for Development. London: Institute of Development Studies.


Assignment 2

Digital scholarship in education, in the context of interdisciplinary knowledge and research.

Across millennia, scholarship’s enduring, traditional form has focused on individuals acquiring knowledge from books and lecturers, within single disciplines, inside the walls of monolithic institutions which monopolise learning to create and maintain power (Buckley, 2012, pp. 333-334). Defining scholarship as acquiring scholastic knowledge within learning institutions (The Australian Pocket Oxford Dictionary, 1996, p. 969) is, however, currently being challenged. Modern academia is undergoing a gradual but inconsistent transformation due to opportunities provided by web-based communication and behaviour enabled by twenty-first-century digital affordances (Ayers, 2013, pp. 24-28). Digital scholarship, a term encompassing both scholarly communication and the use of digital media in research (Libraryowl, 2013), is increasingly being used to describe this shift, yet it is also a concept with a contested definition requiring deeper investigation (Scanlon, 2011, pp. 177-179).

Understanding digital scholarship, which partially results from economies of information scarcity transforming into profligacies of abundant learning resources, requires examination of the meaning of academia and the measures by which it has traditionally been evaluated (Weller, A pedagogy of abundance, 2011, pp. 85-86). Consideration of its implications, in terms of the future of both higher and school education, should assess whether such changes are, in fact, desirable, or indeed truly as different as some attest (Baggaley, 2015).

The critical difference between conventional and digital scholarship is connectivism (Veletsianos & Kimmons, 2012, p. 770). Traditionally, academic knowledge generated by staff employed by a single university has formed the largest percentage of an institution’s market value (Buckley, 2012, pp. 333-334). This ideology has been based on individual research, intra-faculty or, sometimes, inter-faculty across similar institutions, within a culture of monographic orientation; this model has allowed individual practitioners to add to the conversation around their specialty, while protecting them from departing from the norm (Ayers, 2013, p. 28). Connectivists, in contrast, view learning as negotiated, inter-connected, increasingly interdisciplinary, and social; they situate it in complex environments, embracing open values and peer-to-peer networking (Veletsianos & Kimmons, 2012, p. 770). This dichotomy poses a challenge to faculty members who perceive such an approach as diminishing the hard-won traditions of both scholarship and teaching, and also as risky (Ayers, 2013, p. 30).

Despite connectivism having existed for several decades, it is relatively uncommon for academics and school teachers to engage in it; the majority still need to be convinced of the inherent value such practices offer, let alone their inevitability (Scanlon, 2011, p. 177). There is a philosophical divide between those who have recognised, and are embracing, the potential of technological affordances, and those who are yet to investigate them to any degree (Scanlon, 2011, p. 178). Those who believe that digital scholarship merely implies copyright-free or open access to materials, email interaction, online libraries, and the employment of technology and some online tools present a diametric contrast to participants in communities of practice: those who have invested in developing or participating in Massive Open Online Courses (MOOCs); learners who collaborate on investigations; and scholars who publish their research in digital format, individually or together, and invite comment (Ayers, 2013, pp. 27-28). The former pursue the goal of publishing printed monographs in academic, peer-reviewed journals or theses; the latter consider achieving a doctorate through blogging (Ho, 2015).

Each aspect of digital scholarship, at its broadest definition, requires examination. Scanlon refers to the seismic shift in patterns of user behaviour, whereby relevant technological and online tools are utilised to enable new types of collaboration based on openness and interdisciplinarity (Scanlon, 2011, pp. 178-182). She identifies the skills of collection, curation, collaboration, creation, and publication as those enabled by digital scholarship, and links the need for such skills to both higher and secondary education (Scanlon, 2011, p. 180). These fit well with the twenty-first-century skills now considered so vital for school students that they are embedded in the Australian Curriculum (Australian Curriculum, Assessment and Reporting Authority, n.d.), and promoted by the International Society for Technology in Education standards for teaching and learning with technology (International Society for Technology in Education, n.d.).

Scanlon refers to an ecological approach to learning  (Scanlon, 2011, p. 179). Ayers promotes similar concepts: ongoing, ever-growing digital environments which generatively enhance the essential aspects of monographic erudition while simultaneously enabling things that could not have been done in print; networking is a prime example of this (Ayers, 2013, p. 34).

Networked participatory scholarship, an exemplar of generative digital ecology (Veletsianos & Kimmons, 2012), operates within communities of practice (Archer, 2006), using technologies of cooperation (Saveri, Rheingold, & Vian, 2005). It emerges from an understanding that digital scholarship goes beyond using information and communication technologies to research, teach and collaborate; it also embraces open values, ideology, the potential of peer-to-peer networking and so-called “wiki ways of working” (Pearce, Weller, Scanlon, & Ashleigh, 2010). Such scholarship engages with the emergent practice of using technologies that specifically favour participation in various forms of social media, not only to share concepts, but also to reflect upon them, invite criticism of them, seek suggestions for improving them, validate their worth, and take scholarship further through publication in media that allow for feedback (Veletsianos & Kimmons, 2012, p. 778). Communities of practice, in this sense, have developed in order to manage and grow knowledge as an asset, enabling knowledge exchange in order to improve understanding (Archer, 2006, p. 67). Archer identifies four classifications of such communities: internal, networked within organisations, formal, and self-organising (Archer, 2006, p. 67).

These organisational community-of-practice networks differ from personal learning networks in that the former entail a level of company or organisational direction while the latter are established by individuals. There have been a number of examples of open and social learning opportunities for individuals to more formally develop personal learning networks, and many of these have been offered by universities as MOOCs, such as The University of Melbourne’s Coursera course on the French Revolution, a subject entailing a blend of traditional and contemporary styles (McPhee, 2015). This is very different from the MOOC offered by the University of Regina, Education, Curriculum and Instruction, taught by Alec Couros, in that the former is content-driven while the latter is focused on process (Couros, 2010). This further illustrates the problem of defining exactly what digital scholarship entails.

The formality of organisationally directed networks is very different from the informality of personal learning networks, which perhaps would be better labelled professional learning networks. The latter often form organically within social media circles: Twitter, Facebook, Google Plus; they grow and shrink as people join or lose interest, they persist because of the efforts of the passionate, and they rely on open access to the platforms on which they depend, often leading to participation in MOOCs (Couros, 2010, pp. 111-112). Couros’ course demonstrated the potential for leveraging education through such courses by its cohort: twenty students registered, but more than two hundred others freely interacted with the material under discussion (Couros, 2010, pp. 109-110). Digital scholarship, as this example illustrates, enables the collection of information for investment in furthering collective knowledge, involves the sharing of appropriate tools for collecting and analysing the information found, and may result in the generation of new creating and authoring tools (Weller, The nature of scholarship, 2011, pp. 42-43).

Some key issues arise from these new ways of learning: the comparability of digital scholarship with the work of “scholarly primitives”; the comparison of open access and publishing with closed and monographic dissemination; the differing pedagogies or andragogies required to deliver them; and the tensions within academia that these cause (Weller, A pedagogy of abundance, 2011, pp. 41-47). The tasks traditionally undertaken by the “primitives”: discovering, annotating, comparing, referring, sampling and representing, share some similarities with those of digital scholarship; the biggest difference, however, lies in the greater sense of equality for scholars in the truly digital world (Weller, The nature of scholarship, 2011, p. 42). The integration of the knowledge gained (often referred to as emerging from liberated data), its application to wider circumstances, and the teaching that it enables, are seen as threatening the established understanding of knowledge capital as something residing in published, peer-reviewed articles with restricted circulation within the tertiary sector (Weller, The nature of scholarship, 2011, pp. 43-44).

The emancipation of data facilitates unexpected applications (often created in a fashion similar to crowdsourcing) and allows others to integrate the learning in new ways, sometimes using new or repurposed tools (Weller, The nature of scholarship, 2011, p. 44). Opening access to information and publishing the knowledge that is subsequently generated online is a quick and easy process, far removed from the time lag and cost of traditional dissemination of material, especially when subjected to the peer-review process (Weller, The nature of scholarship, 2011, p. 45). Speeding up this process offers advantages to universities, which can adjust their information generation methodologies, and facilitates an edge in the higher education market (Buckley, 2012, p. 334). Quicker broadcasting of new ideas, and of ways of collaborating to achieve them, in turn affects application, and all of these processes impact on teaching (Weller, The nature of scholarship, 2011, pp. 45-47).

Possible andragogies and pedagogies in universities and schools intending to adopt digital scholarly practices have been identified as resource-based learning; project-based learning; constructivism; communities of practice; and connectivism (Weller, A pedagogy of abundance, 2011, pp. 88-89); some institutions have also adopted flipped learning approaches, considered to be innovative (Baggaley, 2015). Educators adopting any one of these teaching styles, or a blended combination of two or more, have demonstrated digital resilience (Weller, Digital resilience, 2011, p. 168). Those who are reluctant or resistant are often suffering from techno-angst, risk-averse mindsets, or scepticism (Weller, Digital resilience, 2011, p. 168). Reasons for anxiety around innovative concepts and practices may be found in disengagement caused by ubiquitous learning management systems and virtual learning environments, through their implicit restrictions; and in the tenure system, whereby some staff have ongoing employment that they wish to keep, while others are contracted and know that the effluxion of time will end their role (Weller, Digital resilience, 2011, pp. 170-171). Pressure to achieve publication in the classic form of peer-reviewed journals or theses may be another factor (Weller, Digital resilience, 2011, pp. 170-171).

Issues such as these may be resolved by disassociating government funding from teaching practices; institutions, faculties and individuals must be assured of security if they innovate in terms of their knowledge sharing, development and generation (Buckley, 2012, p. 335). The first step in this process is the building of trust, critical for knowledge creation, and the crux of the long-term social relationships which enable powerful collaboration to this end (Buckley, 2012, p. 335). Educators in schools and universities need to develop and employ digital-age competencies, which requires mastery of information navigation, connectivity in its broadest sense, and critical evaluation of sources within a trusting ecology (Greenhow, Robelia, & Hughes, 2009, p. 249). While tertiary educators such as Couros (Couros, 2010) and teachers such as Gail Casey (Casey, 2013) embrace the concept of communities of practice and immerse themselves and their students in social and participatory networking, and others engage globally through flat connections (Lindsay, 2015) utilising the full extent of digital scholarship, they are still in the minority. The students lucky enough to encounter such educators at school or university will benefit from an education that will aid them to develop their digital identity, a recent cultural process made possible by the participatory web (Greenhow, Robelia, & Hughes, 2009, p. 251). The greatest benefit, identified by Couros’ student Jennifer, is that such learning is truly life-long (Couros, 2010, p. 127).

Digital scholarship is evolving as the technologies of cooperation increase in number and format, and adoption by universities and schools across the world slowly increases (Saveri, Rheingold, & Vian, 2005, p. 1). More research is required to assess and harness the presumed potential of digital technology, and to ensure that the processes being touted as new really are novel, and not just new terminology for older practices, as identified by Baggaley in his somewhat flippant assessment of flipped learning (Baggaley, 2015, pp. 4-5). His identification of self-promotion through registering websites and copyrighting terminology is a clear warning of the need for academic rigour (Baggaley, 2015, pp. 3-6).

More academics need to avoid passivity (Weller, Digital resilience, 2011, p. 170) and become organised participants, possibly by adopting a commando-style role in correcting errors in Wikipedia (Baggaley, 2015, p. 8). The history of the term digital scholarship in Wikipedia may be an example of such tactics (Libraryowl, 2013). By addressing critical issues of value in relation to risk, and actively engaging in the conversation relating to digital scholarship, academic writers and researchers have the potential to change the politics of educational technology provision and practice (Selwyn, 2010). Once universities endorse the best elements of social participatory networking and its ability to contribute meaningfully to knowledge generation, educators in schools will also embrace a learning-ecology perspective, benefitting from the fusion of formal and informal learning, spanning contextual boundaries for self-sustained learning (Greenhow, Robelia, & Hughes, 2009, p. 248).

References

21st Century Skills: Rethinking How Students Learn. (2010). Bloomington: Solution Tree.

Archer, N. (2006). A Classification of Communities of Practice. In Encyclopedia of Communities of Practice in Information and Knowledge Management (pp. 21-29). Information Science Reference (an imprint of IGI Global).

Ayers, E. L. (2013). Does digital scholarship have a future? Educause Review, 24-34.

Baggaley, J. (2015, May 29). Flips and flops. Distance Education, 1-10.

Buckley, S. (2012). Higher education and knowledge sharing: from ivory tower to twenty-first century. Innovations in Education and Teaching International, 49(3), 333-344.

Casey, G. (2013, September). Interdisciplinary literacy through social media In the Mathematics classroom: an action research study. Journal of Adolescent and Adult Literacy, 57(1), 60-67.

Couros, A. (2010). Developing personal networks for open and social learning. In Emerging Technologies in Distance Education (pp. 109-128). Athabasca University: AU Press.

Greenhow, C., Robelia, B., & Hughes, J. E. (2009). Learning, teaching and scholarship in a digital age. Educational Researcher, 38(4), 246-259.

Ho, C. (2015, August 26). Blogging Your Way To A PhD. Retrieved August 27, 2015, from The Thesis Whisperer: http://thesiswhisperer.com/2015/08/26/blogging-your-way-to-a-phd/

International Society for Technology in Education. (n.d.). ISTE Standards. Retrieved August 29, 2015, from ISTE: http://www.iste.org/standards

Libraryowl. (2013, July 30). Digital Scholarship. Retrieved August 30, 2015, from Wikipedia: https://en.wikipedia.org/wiki/Digital_scholarship

Lindsay, J. [host] (2015, August 6). Colloquium 3: Flat Classrooms.

McPhee, P. (2015). Class Central: The French Revolution. Retrieved August 29, 2015, from Coursera: https://www.class-central.com/mooc/1705/coursera-the-french-revolution

Pearce, N., Weller, M., Scanlon, E., & Ashleigh, M. (2010). Digital scholarship considered: how new technologies could transform academic work. in education, 16(1), 33-44.

Saveri, A., Rheingold, H., & Vian, K. (2005). Technologies of Cooperation. Palo Alto: Institute for the Future. Retrieved August 15, 2015, from www.iftf.org

Scanlon, E. (2011). Digital futures: changes in scholarship, open education resources and the inevitability of interdisciplinarity. Arts And Humanities in Higher Education, 177-184.

Selwyn, N. (2010). Looking beyond learning: notes towards the critical study of educational technology. Journal of Computer Assisted Learning, 26(1), 65-73. doi:10.1111/j.1365-2729.2009.00338.x

The Australian Pocket Oxford Dictionary. (1996). Melbourne: Oxford University Press.

Veletsianos, G., & Kimmons, R. (2012). Networked participatory scholarship: emergent techno-cultural pressures toward open and digital scholarship in online networks. Computers & Education, 58(2), 766-774.

Weller, M. (2011). A Pedagogy of Abundance. In M. Weller, The Digital Scholar: How Technology is Transforming Scholarly Practice (pp. 85-95). London: Bloomsbury Academic.

Weller, M. (2011). Digital Resilience. In The Digital Scholar: How Technology is Transforming Scholarly Practice (pp. 168-184). London: Bloomsbury Academic.

Weller, M. (2011). The Nature of Scholarship. In M. Weller, The Digital Scholar: How Technology is Transforming Scholarly Practice (pp. 41-51). London: Bloomsbury Academic.

 

LMS & Learning

Joining the Traveller’s Journey

(Thanks, Simon Welsh!)

In recently considering digital scholarship, and reflecting on Colloquium 1 (Welsh, 2015), the gap between the potential of Learning Management Systems and their actual usage has presented itself as an issue worthy of academic investigation. Until I heard Simon speak passionately about the things many LMSs already measure, and those that could potentially be calculated and then applied to improving learning outcomes for students, I had not considered the possibilities; his talk made them clear (Welsh, 2015).

For many educators, the LMS is something that has been introduced into their working lives without explanation of why it is needed or what it can do for learning. For my secondary teaching colleagues, it has provided a platform for storing work for students, somewhere to host school-wide timetables, and, more recently, a means of roll marking and report writing. Comparing the university LMS to the one used at my recent schools has revealed some gaps, but the access to analytics referred to by Simon (Welsh, 2015) is not obvious to a learner in the former or a teacher in the latter.

Given that students have no say in the specific LMS required by their institution, to what extent do educators have a choice in either the system or what that system enables them to present (Islam, 2014, p. 253)? Do educators have the freedom to create meaningful learning for their students, or do the templates offered by the LMS constrain them? Or is it incumbent on educators to build on what their LMS enables and compensate for its weaknesses (Leaman, 2015)?

Rekhari takes these concepts further, declaring that there is a chasm between learning design, technology and the LMS due to a combination of ineffective use by educators and flaws in the design of the systems (Rekhari, 2015, p. 12). She questions whether the benefits that LMSs are intended to deliver to educational design fail to enter praxis because developers make the software hard to use, or because educators do not proactively apply constructivist philosophies to their learning design (Rekhari, 2015, p. 13). She goes on to ask whether LMSs are themselves barriers to educational change (Rekhari, 2015, p. 13).

This publication has prompted much questioning of my own practices as an educator using an LMS, and the realisation that, beyond managing the storage and retrieval of coursework, I had not considered the other possibilities. To further my understanding of what our school LMS can do, I have requested time with one of the developers. To develop my understanding of practical analyses that already exist, I have turned to Twitter, where I have engaged in meaningful dialogue with several professors in the Computer Science and Information Technology Department at RMIT, who have sent me a document comparing Blackboard to Facebook in terms of supporting a specific online programming course (Maleko, Nandi, Hamilton, D’Souza, & Harland, 2013). Additional reading is also ongoing.

I “attended” the first Colloquium with a degree of disinterest predetermined by its description; yet, thanks to Simon’s predictions for the future, it has intrigued me and started me on a learning journey I would never have predicted. This has proved not only interesting but potentially very useful, and will form the basis of my Case Study for Assignment 3. From passive user to captivated challenger, I am now wondering whether a different approach on my part could enable the development of a learning ecology for enhancing digital scholarship (Greenhow, Robelia, & Hughes, 2009, p. 248).

References

Greenhow, C., Robelia, B., & Hughes, J. E. (2009). Learning, teaching and scholarship in a digital age. Educational Researcher, 38(4), 246-259.

Islam, A. N. (2014). Sources of satisfaction and dissatisfaction with a learning management system in post-adoption stage: a critical incident technique approach. Computers in Human Behavior, 249-261.

Leaman, C. (2015, August 20). What If Your Learning Management System Isn’t Enough? Retrieved from eLearning Industry: http://elearningindustry.com/learning-management-system-isnt-enough

Maleko, M., Nandi, D., Hamilton, M., D’Souza, D., & Harland, J. (2013). Facebook versus Blackboard for supporting the learning of programming in a fully online course: the changing face of computer education. In Learning and Teaching in Computing and Engineering (pp. 83-89).

Rekhari, S. (2015, August). The Chasm – learning design, technology, and the LMS. Training and Development, pp. 12-13. Retrieved from Australian Institute of Training and Development: www.aitd.com.au

Simkin, M. (2015, August 3). #2. Retrieved from http://thinkspace.csu.edu.au/msimkin/2015/08/03/2/

Welsh, S. [Host] (2015, July 28). Learning Analytics: A Traveller’s Guide; Colloquium 2. Albury, Victoria, Australia.


Zone of intervention – when to be sage in the classroom.

There are too many teachers who believe that their role is to direct learning from the front of the classroom and keep control over everything that occurs. (In my first Hamilton school (1980) there was a real “stage” at the front of each classroom, and you taught from behind a big desk which sat on the stage between you and the blackboard – one of those new roller-based ones that gave you almost endless space to deliver your words of wisdom – well separated from the students, who were way down on the lower deck.) I hated it, and quickly created opportunities for students to be on the stage, at the board, or for me to join them “down below”. Today, minus the board and the stage, this is what I still see in so many rooms as I move (occasionally) around the school.

Modern concepts of flipped classrooms focus on the sage role but place it outside the classroom, and leave class time for interaction around the information gained. This still leaves me uneasy.

The main reason I question the sage approach is that there are many things my students know that I do not, and they are all individuals, not a homogeneous body. If I assume the guru position, am I not locking them into the knowledge I have and failing to extend them beyond it?

The main reason I do assume the sage role at times is that, with six years of tertiary education and many, many years of teaching experience, there must be things that I know that they cannot know, or fully understand, without some intervention on my part. In both my History teaching and my teacher-librarian role, I tend to work along the lines of Ross Todd and Carol Kuhlthau’s zone of intervention; a PowerPoint on guided inquiry covering this topic can be found by searching for tldl.pbworks.com/f/Ross+Todd+Guided+Inquiry+Web+2.0.ppt

Dr. Ross Todd