
Francesco Marinucci
DETC 630 Section 9040
11/25/2012
Technology Report

Introduction

Distance educators rely on technologies to perform their multiple duties as online teachers effectively. A one-size-fits-all approach does not suit the current technological and educational landscape, where emerging technologies and the digital literacy of learners demand the right technology for the right purpose at the right moment (Prensky, 2001). It is therefore of crucial importance that distance educators have access to tools that allow them to quickly assess and compare multiple technologies. In particular, these tools should examine technologies with promising potential from both the instructor's and the learners' perspectives (Veletsianos, 2010). The purpose of this report is to present a scorecard developed to evaluate emerging technologies used in education by taking both perspectives into consideration. The scorecard, which aims to support distance educators in carrying out a preliminary screening of technologies, comprises eight major parameters, each scored from 1 to 3.

Interaction

The isolation of distance education students is mainly due to unsatisfactory levels of interaction, both instructor-student and student-student (Lee & McLoughlin, 2010). This first parameter therefore looks at a technology's capacity to be used as a collaboration tool, whether by mimicking face-to-face meetings or by enabling persistent micro-interactions (Lomas, Burke & Page, 2008). A high score for both instructors and students means that multiple options are available to satisfy multiple modalities of interaction.

Design

The overall layout and graphics of any online technology are likely to be the first factors that attract or repel distant learners. The opportunity instructors have to modify the design of a technology is very important, as is the capacity of students to personalize it (Kolowich, 2012). High scores for both instructors and students in this category mean that both parties can contribute to the final design.

Content management

The growing recognition of the value of ePortfolios urges that instructors and students have the same privileges in content management (Batson, 2012). Excellent content management enables a learning-centered approach and is achieved when both instructors and students score high in this category.

Usability

To improve instructors' capacity to use multiple technologies, it is important that they are easy to use, both in terms of the type and the complexity of their features. A technology with many features can disorient and frustrate both instructors and students. High scores for both instructors and students in this category mean that both parties can easily get used to the technology.

Accessibility

This parameter refers to the capacity of the technology to be accessed by as many people as possible, regardless of the device used. The growing ownership of electronic devices, which has driven the "bring your own device" approach ("Bring your own device," n.d.), involves learners more in the learning process and hence requires that the technology is always accessible (Park, 2011). High scores in both categories imply high accessibility to the technology.

Reusability

This refers to the opportunity to develop and use learning objects in different media formats with different technologies. Reusability is very important for saving instructors' time, but it also allows students to build their own ePortfolios effectively. High scores in this category mean that both instructor and learner can reuse the technology for multiple activities.

Cost

The cost of a new technology might be a serious limiting factor to its adoption. Nevertheless, if the technology fulfills all the previous criteria, it might be worth charging students an additional fee to compensate for the initial investment to purchase it. High scores for both users mean that the technology is available at no cost or that the initial expense is a worthwhile investment.

Scorecard

The following scorecard has been applied to evaluate nine technologies previously tested from both the instructor's and students' perspectives.

Figure 1. Scorecard comprising nine features to assess education technologies (I = instructor, L = learner)
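The tallying behind the scorecard can be sketched in code. This is a minimal illustration only, assuming a simple per-parameter sum: the parameter names follow the report, while the data structure, function name, and example ratings are hypothetical and are not taken from the original scorecard.

```python
# Minimal sketch of how scorecard totals could be tallied.
# Parameter names follow the report; everything else is hypothetical.

PARAMETERS = [
    "Interaction", "Design", "Content management", "Usability",
    "Accessibility", "Reusability", "Cost",
]

def total_score(ratings):
    """Sum the 1-3 ratings over all parameters for one perspective (I or L)."""
    for name, value in ratings.items():
        if name not in PARAMETERS:
            raise ValueError(f"unknown parameter: {name}")
        if not 1 <= value <= 3:
            raise ValueError(f"rating out of range: {name}={value}")
    return sum(ratings.values())

# Hypothetical ratings for one technology, instructor's perspective:
instructor = {
    "Interaction": 3, "Design": 2, "Content management": 3,
    "Usability": 3, "Accessibility": 2, "Reusability": 2, "Cost": 3,
}
print(total_score(instructor))  # 18
```

Computing one total per perspective makes the instructor (I) and learner (L) columns of the scorecard directly comparable for each technology.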

The blog Tumblr has among the highest scores for both instructor and learner. The slight difference between the two categories of users is due to the possibility of uploading audio files only. In the writer's opinion, the option of creating audio files directly within the same technology could benefit instructor-student interaction, while student-student interaction is unlikely to benefit much from this function. The software PmWiki has the lowest scores because of its very basic features in terms of Design, Interaction, and Usability. The latter was so challenging that the writer was not able to use it effectively. Even though it is open-source software, its free use is not immediate. Audacity is a sophisticated piece of software that requires some practice to use effectively. Developed for advanced recording and editing of audio files, it can also be used to create audio files for educational purposes, but it is not the ideal software for this. Skype has a high total score because of the multiple options it offers for communicating, interacting, and sharing files. Even though it was not developed with educational applications in mind, it can be used effectively and at no cost for any kind of interaction. The social network Facebook has too rigid a structure to effectively manage more formal instructor-learner interaction. Conversely, it is a great tool for enhancing student-student interaction due to the increased social presence it offers. Canvas, a learning management system, obtained the highest scores due to its flexibility and to the fact that it draws on a lot of content and software from the Internet. The scores from the instructor's and students' perspectives are both high because of the numerous features for content management and the option of creating ePortfolios.

Camtasia is excellent software that offers several options for screen recording and video editing. Its scores are relatively high and identical for instructor and students. GoogleDocs offers multiple features for sharing files and collaborating. It potentially has several applications in the education field, but it only offers asynchronous interaction. Similar to other wikis, PBWorks is very intuitive and easy to use. The higher score for instructors is due to the fact that its features are very useful for enhancing and managing online interaction and collaboration.

Conclusion

The scorecard was developed with some major features in mind for a quick and comprehensive screening of technologies. While a few of them can be evaluated just by checking the relevant webpages (e.g., cost, accessibility), the remaining parameters need to be explored by the user. In fact, only by accessing the websites or downloading the software is it possible to understand their limits and potential. This aspect is an advantage of the scorecard in cases where cost and accessibility are major limiting factors for further implementation. Comparing the nine technologies in this essay using the scorecard made it possible to highlight major differences among them. Some of the features clearly differed across platforms, but the platforms serve different purposes. Therefore, to really evaluate the effectiveness of the scorecard, it would be fruitful to use it to assess similar technologies. The grading scale from 1 to 3 proved very helpful in computing the final score. A different grading system with 5 options might have created better separation, but at the expense of a more objective evaluation. While it might be very helpful to differentiate between the instructor's and students' perspectives, really getting used to the software in both roles takes a lot of time. It might be useful to abandon this approach, since only a very few minor differences emerged. Besides, the different scores observed between the two perspectives were prone to subjective interpretation rather than the result of objective analysis. It might be appropriate to add some parameters that can be easily measured by anyone, e.g., the number of characters per message or the file size that can be attached. Such measurements might sound hard to find, but they can make the scorecard more standardized. In this regard, it is very important to first define the key features to be measured in order to avoid wasting time.

References

Batson, T. (2012). The wait is over: The LMS and the ePortfolio merge to serve a culture of learning.

Bring your own device. (n.d.). In Wikipedia. Retrieved November 24, 2012, from http://en.wikipedia.org/wiki/Bring_your_own_device

Kolowich, S. (2012). Cracking up the LMS. Retrieved from http://www.insidehighered.com/news/2012/01/11/what-does-lms-future-look

Lee, M., & McLoughlin, C. (2010). Beyond distance and time constraints: Applying social networking tools and Web 2.0 approaches in DE. In G. Veletsianos (Ed.), Emerging Technologies in Distance Education (pp. 61-87). Canada: AU Press.

Lomas, C., Burke, M., & Page, C. L. (2008). Collaboration tools. Educause Learning Initiative. Retrieved from http://www.educause.edu/library/resources/collaboration-tools

Park, Y. (2011). A pedagogical framework for mobile learning: Categorizing educational applications of mobile technologies into four types. The International Review of Research in Open and Distance Learning, 12(2), 78-102.

Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9(5), 1-6.

Veletsianos, G. (2010). A definition of emerging technologies for education. In G. Veletsianos (Ed.), Emerging Technologies in Distance Education (pp. 3-22). Canada: AU Press.
