IJITCS Vol. 6, No. 5, 8 Apr. 2014
Full Text (PDF, 371KB), PP.11-17
Index Terms: Implicit Browsing Behaviour, Task Specific Context, Information Seeking Behaviour
This paper focuses on how students access web pages during task-specific information retrieval. An investigation of how students search the web for their current needs was carried out, and the students’ behavioural characteristics as they surfed the internet to answer a set of online multiple-choice questions were collected. Twenty-three students participated in the study, and a number of behavioural characteristics were captured; Camtasia Studio 7 was used to record their searching activity. The results show that the students visited 328 web pages and that, among the parameters captured, the time spent on the search task correlated more strongly with the students’ performance than any other parameter. Time spent on a document can therefore serve as a good implicit indicator for inferring a learner’s interest in a context-based recommender system.
Stephen Akuma, "Investigating the Effect of Implicit Browsing Behaviour on Students’ Performance in a Task Specific Context", International Journal of Information Technology and Computer Science(IJITCS), vol.6, no.5, pp.11-17, 2014. DOI:10.5815/ijitcs.2014.05.02