
INTERNATIONAL MEDIA AND INFORMATION LITERACY SURVEY (IMILS)

There is also a Google Group http://groups.google.com/group/iils


 

The report, PRELIMINARY RESULTS FROM THE INTERNATIONAL MEDIA AND INFORMATION LITERACY SURVEY (IMILS) OF THE HABITS AND PRACTICES OF UNIVERSITY STUDENTS WHEN UNDERTAKING RESEARCH ASSIGNMENTS, is now available in PDF (IMILS.PDF) and EPUB formats.



 

WELCOME MESSAGE!

"Welcome to this website, which is devoted to covering key current events and background information relating to the International Information Literacy Survey (IILS) project.  This IILS initiative is a Survey of the Information and Media Literacy Behaviors of University Students in Selected Central Asian Countries, and is still, at the moment, in the very preliminary stage of being planned, organized and structured.  Countries in the central Asian region are welcome to participate, and I hereby invite institutions of higher learning to review the survey Draft Guidelines currently posted hereon, and then to contact me for the purpose of discussing joining the IILS survey.  We are especially interested in inviting a volunteer institution to take the lead in planning and coordinating a survey for their home country, working with those central Asian countries already committed to undertaking a survey, as well as other institutions in their own country, in a collaborative and collegial mode to exchange ideas, strategies, approaches, survey instruments, and so forth.

Yours sincerely,

 

Dr. Jagtar Singh, Chair, IILS Steering Committee
President, Indian Association of Teachers of Library and Information Science (IATLIS)
&
Professor and Head,
Department of Library and Information Science,
Punjabi University, Patiala
Pin - 147 002 (India)
Email: jagtar.kindu@gmail.com
Tel: +91 (0)175 304 6462 & 304 6179 (Work)
       +91 (0)175 228 2727 (Home)
Fax: +91 (0)175 228 3073

 


 



INTERNATIONAL MEDIA AND INFORMATION LITERACY SURVEY (IMILS) GUIDELINES

Stage One of a Three Stage Project to Design, Build, Develop, Test and Evaluate Media and Information Literacy (MIL) Indicators

Revised Draft – December 2, 2010

 

prepared by

Professors Albert K. Boekhorst and Forest Woody Horton, Jr.

Survey Facilitators

 

Note:  These guidelines are intended to be advisory, not prescriptive! Country Survey Coordinators, and their respective participating home institutions, may need to modify the guidelines in some respects in order to customize and adapt them to their own country's language(s), cultures, traditions and other circumstances.  However, major revisions should first be negotiated in advance with, and agreed to by, the Survey Steering Committee, because for the sake of comparability of country survey results it is important to maximize consistency in survey instrument content and format, as well as in the survey methodologies employed by each participating country.  Most minor changes are expected to be acceptable to, and approved by, the Committee.

 

These Guidelines are structured into six main headings:

  1. Background of Key Events Leading to the Survey;

  2. Roles and Responsibilities of Country Survey Coordinators and Host Institutions;

  3. The Survey Master Plan, Survey Organization, Survey Instrument, Survey Methodology and Timetable;

  4. Planning, Implementing and Conducting the Survey at Multiple Locations;

  5. Analyzing and Evaluating the Survey Results;

  6. Some Logistical, Financial and Administrative Considerations.

 

There are also several appendices containing more detailed, specialized and technical materials, such as the PIL Survey Instrument itself (Appendix A), and the PIL Survey Methodology (Appendix B). 

 

As of the above date, the following LIS professionals with a special interest in Information and Media Literacy have agreed to participate in the survey, to petition their home institutions to serve as host institutions for the survey, and to solicit the interest and support of other institutions, organizations and colleagues for the survey in their respective countries; more acceptances are expected from additional countries in Asia:

 

1. India, Dr. Jagtar Singh, Professor, Punjabi University, Patiala, and Chair of the International Information Literacy Survey Steering Committee

2. India, Dr. H.P.S. Kalra, Associate Professor, Punjabi University, Patiala, IILS Secretariat

3. Turkey, Dr. Serap Kurbanoglu, Professor, Hacettepe University, Ankara

4. Malaysia, Dr. Szarina Abdullah, Professor, UiTM, Kuala Lumpur

5. Sri Lanka, Dr. Pradeepa Wijetunge, Library, University of Colombo 

6. Bangladesh, Ms. Dilara Begum, Librarian, East West University, Dhaka

7. Pakistan, Dr. Mohammad Ramzan, Librarian, Lahore University of Management Sciences, Lahore

8. Thailand, Professor Aree Cheunwattana, Department of Library & Information Science, Srinakharinwirot University, Faculty of Humanities, Sukhumvit

9. Professor Albert K. Boekhorst, University of Amsterdam, the Netherlands, and University of Pretoria, Republic of South Africa, Facilitator

10. Professor Forest Woody Horton, Jr., Retired Fulbright Professor of Library and Information Science, Facilitator

 

1. BACKGROUND OF KEY EVENTS LEADING TO THE SURVEY

Emergence of Media and Information Literacy as critical 21st Century, Internet Age priorities

Over the last several decades the concepts of Information Literacy and Media Literacy have emerged prominently on the world stage, and UNESCO, IFLA and regional LIS associations and societies have provided leadership in co-sponsoring several major international Information Literacy (IL) Expert Meetings (e.g., Prague, Czech Republic in 2003; Alexandria, Egypt in 2005; Ljubljana, Slovenia in 2006; and, most recently, Bangkok in 2010), attended by a total of over 150 world experts from nearly 70 different countries.  Asian countries have always been enthusiastic participants in these world meetings, but important regional meetings have been held in many countries as well, including, in Asia, notably Malaysia, India, Turkey, Thailand and China.  In addition, all of the world’s other major geographic regions – the Middle East/North Africa, Sub-Saharan Africa, Oceania, Latin America/Caribbean, Europe, and North America – have each held their own regional, sub-regional and country-based IL and ML seminars, colloquia, conferences, meetings and workshops.  All of those meetings were held to afford IL and ML experts an opportunity to come together in one place, face-to-face, to exchange ideas, strategies, approaches and plans, and to debate key pedagogical issues and alternatives implicit in teaching and learning Information and Media Literacy.  Each region has launched many promising pilot IL and ML projects, and interchanged IL and ML teaching and learning “best practices” and pilot test experiences that were developed, tested and implemented in its own countries.

 

Some Differences in Theory, Definitions, Strategies and Approaches Remain.

In the aforementioned pioneering international and regional meetings that took place intensively between 2002 and 2010, the experts often succeeded in significantly narrowing their differences in Information Literacy and Media Literacy theories, definitions, and teaching and learning (pedagogy) approaches, techniques, and methods.  But, to confuse matters, other related terms and concepts have also been introduced and utilized by various authors and experts, including ICT Literacy, Digital Literacy and many others.  Moreover, even the term “literacy” itself has had competitors such as “fluency,” “competences” and “competencies.”  Experts sometimes succeeded in reaching a consensus on how to proceed in terms of next steps in the evolution of the concepts and their practice, often on a regional but sometimes even on a global basis.  However, it was not until another landmark meeting of global experts was convened in Bangkok in November 2010 that the inter-relationships between Information and Media Literacy were clarified to a degree that, it is hoped, now allows experts to proceed with the development of indicators that can be used effectively and accurately as tools to observe and measure IL and ML behaviors.

 

Media and Information Literacy: The November 2010 Bangkok Meeting

As mentioned above, UNESCO organized an important expert group meeting on the development of a global framework of indicators to measure media and information literacy (MIL) from 4 to 6 November 2010 in Bangkok, Thailand. The event brought together experts from all geographic regions (representing 17 countries) and from a wide range of domains and disciplines including: media, information, journalism, librarianship, education, curriculum development, psychometrics and statistics.

 

The meeting in Bangkok marked the beginning of a complex process and was specifically aimed at: 

  • reviewing and validating the conceptual foundations of this process, including the examination and validation of a generic set of media and information literacy competences (knowledge, skills, behaviors and attitudes) required for citizens; and
  • elaborating measurement methodologies for MIL indicators to be considered by UNESCO.

The articulation of MIL indicators is part of UNESCO’s global action to promote media and information literate societies. UNESCO believes that MIL indicators will enable governments and other stakeholders to: acquire a greater understanding of this field, assess the status of media and information literacy in their countries, ascertain required inputs, and measure their progress towards wide-scale media and information literacy take-up.

 

Where the International Information Literacy Survey Idea Came From and What It Is

The International Information Literacy Survey (IILS) is an initiative being undertaken by an informal consortium of selected Asian regional countries for the purpose of developing both individual country and (Asian) region-wide baselines of the information literacy knowledge, skills, habits, attitudes, behaviors and other characteristics and attributes of college students, as well as young adults in the workforce, in those countries and in that region.  The initiative was initially inspired by the pioneering, widely publicized and widely quoted work done at the Information School (iSchool) of the University of Washington in the USA during 2008-2010, under the leadership of Professor Michael Eisenberg and survey research expert Alison Head, formally called “Project Information Literacy,” or the PIL Survey.  Background and detailed technical information for the PIL can be viewed at http://projectinfolit.org/publications/.  See Appendices A, B, C and D.

 

The PIL Survey goals and objectives were stated in this way in one of their key documents:

 

“The purpose of the PIL large-scale student survey was to collect quantitative data about early adults' research approaches, practices, and styles, including sources used, methods for evaluation, and challenges encountered during the research process.  Our ongoing goal at PIL is to release practical and applicable findings, which inform an understanding of the student research process, especially what students experience when conducting research, for use by librarians, faculty, and administrators. Ideally, we hope for direct value to numerous constituents in academic settings, including professors, librarians, and administrators, who may also be trying to impart Information and Media Literacy skills, standards, and competencies to a growing population of students, who are heavily influenced by the convenience of a Google search and the ubiquity of the Web.”

 

Since the PIL survey materials are mostly open-source, they can serve as a unique, invaluable and easily accessible and usable online resource for Asian countries participating in the IILS to refer to, research and study, and then adapt and utilize for their own surveys.  However, professional ethics demand that we acknowledge and credit the PIL survey and its principals in appropriate contexts and at appropriate times, such as citing their materials in accordance with commonly accepted citation conventions.  This commitment has been made already to the iSchool at the University of Washington and we shall monitor the commitment as we proceed. Each country should be sensitive to this requirement.

 

Why and How the IILS was Initiated and Organized

After studying the PIL Survey data, findings, conclusions and recommendations, which were released in November 2010, several leading LIS professionals in various countries (including some who attended the aforementioned Bangkok meeting, which fortuitously took place at almost exactly the same time) informally discussed the idea of conducting a comprehensive, country-wide and well-disciplined survey of college students' information and media literacy knowledge, skills, habits, attitudes and behaviors, believing it would be a very timely and practical thing to do, given the just-released PIL survey results and the UNESCO and IFLA desires to develop MIL Indicators.

However, rather than attempt a survey in each country independently and disconnectedly, the group decided that collaborating and sharing ideas, plans, approaches, strategies, timetables and so forth in a coordinated effort across all three of the stages described below would make the end results much stronger, more credible and more actionable than if the survey were pursued in each country individually and disjointedly.

Moreover, rather than attempt to survey both the Information Literacy and the Media Literacy behaviors of students at the same time and in the same survey instrument, the group felt that it would be better to undertake a three stage project.  The first stage would be an IL survey, the second stage would be an ML survey, and the third and final stage would be devoted to an analysis of how the findings from the first two stages fit together.  That is, what commonalities and divergences existed in the results from the two stages, and what were the precise “intersection points” where the two sets of results came together or were essentially disparate - - in other words, what were the interdependencies, counterdependencies, and other kinds of inter-relationships in the two sets of data.  Then, finally, the analysis would lead to the design, construction, development and testing of MIL Indicators.

 

This document is primarily concerned with the first stage of the project - - the Information Literacy behaviors stage.

 

The group felt that, if carefully planned and rigorously executed, the survey could be of great value to their own countries, not only in improving student Information and Media Literacy behaviors, but also in formulating stronger educational, employment and workforce development, health, governance, general economic, socio-cultural and other important national policies and programs.  Ultimately the goal would be to help each country become more competitive in regional and world trade, and to better cope with its internal health, employment, governance, educational and learning, and other chronic and long-standing political, economic and social challenges.

 

Professor Serap Kurbanoglu of Hacettepe University in Ankara, Turkey, an academic renowned not only in her own country and the region but also internationally, must be credited with first coming up with the idea of such an initiative, although she did not attend the Bangkok meeting.  Currently she is moving rapidly to secure the support and interest of the educational sector in Turkey, and also to secure professional, governmental and private sector recognition and support, recognizing that all sectors of Turkish society would ultimately be stakeholders in the results. Turkey is uniquely situated at the crossroads of Europe and Asia, and it was not long after learning of Professor Kurbanoglu’s plans that other countries, including India, Malaysia, Pakistan, Bangladesh, Sri Lanka, Thailand and others, followed suit; the endeavor is well underway as of this writing in terms of the first stage - - preliminary planning.  Fortunately a number of distinguished LIS professionals in Asian countries who are keenly interested in Information and Media Literacy, such as Professor Jagtar Singh in India, who agreed to serve as Chair of the Survey Steering Committee, and Professor Szarina Abdullah in Malaysia, expressed an interest in the project; word of the project spread quickly, and more and more countries volunteered to join in.

 

A Survey Steering Committee is being established, composed of the two authors of this paper and each Country Survey Coordinator, and chaired by Professor Singh of India.  It was agreed that each major step in IILS planning and implementation, including review and approval of these Guidelines, should be discussed and approved by the Steering Committee.  Also, liaison should be established and maintained with appropriate officials of UNESCO and IFLA, with the various Asian regional LIS associations/societies, and with the individual participating countries' national LIS associations/societies.  Some countries may also wish to liaise with selected provincial and local LIS groups.

 

Finally, at the discretion of participating countries, selected government agencies and elements of the country’s private sector, as well as NGOs, foundations, and other elements of a country’s civil society, may be invited to play a supporting role. Sometimes those institutions have already demonstrated a supportive posture and friendly demeanor toward education and institutions of higher learning, and those partnerships are highly valued. Perhaps funds or “in kind” assistance of some sort could be solicited from some of those sources, or their advocacy assistance enlisted.

 

Relationship of IILS to Related UNESCO and IFLA Initiatives

As mentioned several times above, UNESCO has an important and major effort underway concurrently concerning the development of Media and Information Literacy (MIL) Indicators.  And even as these draft Guidelines are being written and circulated for review and comment, a final report from this UNESCO sponsored expert meeting is being prepared, and is expected to be released by the end of December 2010.  But until it is published, we will not be able to pinpoint, with any precision, the direct and/or indirect relationships between this IILS project and the UNESCO MIL Indicators project.  However, the authors and the IILS countries hope that UNESCO will see many potentially mutually beneficial relationships between the two initiatives, and trust that we will arrive at a point in the near future where the two activities can be inter-related more directly, formally and officially (the Bangkok final report will be added to this document as Appendix C when it is released).

 

With respect to IFLA, the Standing Committee on Information Literacy of IFLA has been advised of the IILS initiative, and asked to consider how IFLA wishes to relate to the survey - - that is, what role it might wish to play.  Many of the individual members of that IFLA Information Literacy Standing Committee, as well as IFLA itself, are involved in both the aforementioned UNESCO MIL Indicators project, as well as this IILS effort.  There has always, traditionally, been a high degree of collaboration between UNESCO and IFLA, and the two institutions often plan and implement communication, information, media, and library-related projects and tasks jointly.  Therefore the authors and the IILS survey countries trust that IFLA will also see potentially mutually beneficial relationships between the two initiatives, and hope that IFLA will play a pro-active role.

 

Where Things Stand Now

Even as these Guidelines are being written, the number of Asian countries participating in the survey continues to increase, and more are warmly welcome to join!  Those individuals designated as Country Survey Coordinators in the already-committed countries are beginning to study how the PIL survey was crafted and operated, including examining the detailed contents of the survey questionnaires utilized, and the survey methodologies selected, and the survey results and outcomes (see the appendices to this document).  They are also in the process of forming their country teams so that the survey effort is perceived as being inclusive in terms of involving all sectors of society in the countries participating.  This means not just the academic and research communities, but interested professional and educational associations and societies (e.g. LIS, educational, pedagogical, and so on), and also government agencies and the private sector.

2. ROLES AND RESPONSIBILITIES OF SURVEY COORDINATORS AND HOST INSTITUTIONS

Roles and Responsibilities of the Country Survey Host Institution

The Country Survey Host Institution (the university or college affiliated with the Country Survey Coordinator) is responsible for covering the expenses incurred as a result of participating in the IILS initiative.  Some expenses can presumably be absorbed within normal operating budgets, but additional support may be needed, in which case the Survey Host Institution will be responsible for seeking and identifying such outside support, including, for example, help from government agencies (e.g. grants), philanthropic foundations and other organizations, corporations and elements of the private sector, and other sources.  Alumni and “Friends of the University” groups may also be approached.

 

But, after exhausting all internal and external resources, if the Survey Host Institutions of participating countries are still unable to fully meet their estimated financial obligations for the survey, they should contact the organizers to seek assistance.  There is no guarantee, however, that outside support can be identified, but a good faith attempt will be made. 

One important benefit of participating in the IILS will be the strengthening of the Survey Host Institution’s teaching and its pure and applied research agendas in the Information Literacy, Media Literacy and Lifelong Learning areas, including curriculum reform.  Survey Host Institutions will very likely come to be recognized as “centers of Information and Media Literacy excellence” within the country (and perhaps the sub-region and the entire region), and from that recognition should follow other benefits such as increased undergraduate and graduate student enrollment, the attraction of more distinguished faculty, government recognition, and so on.  In short, participation helps establish the Survey Host Institution’s reputation, within the broader higher education and training communities of the country as a whole and of the region of which the country is a part, as a continuing “center of excellence” in Information Literacy, Media Literacy and lifelong education, training and research, including the mounting of workshops and conferences that attract experts and professionals from many economic and social sectors.

Another benefit to the Survey Host Institution will be the enlargement and strengthening of the network of IL and ML professionals within the country and the region, the members of which will be encouraged to establish and maintain contact with each other for the purpose of exchanging ideas, approaches, techniques and methods, including best IL and ML practices.

 

Roles and Responsibilities of the Country Survey Coordinator

In most cases participating countries have already identified a Country Survey Coordinator, and that individual is, de facto, a member of the IILS Steering Committee.  The Coordinator has the authority and responsibility for working closely with the survey organizers and facilitators, members of the Steering Committee, UNESCO, IFLA, regional, national and local LIS associations/societies, and other organizations such as survey co-sponsors and collaborators, to ensure that the survey is planned, implemented and later assessed in an efficient and expeditious manner. To the extent the information is not already available, the name, title, and contact information (including telephone number(s) and e-mail address) of each Coordinator should be furnished as soon as possible to the facilitators (see e-mail addresses below).

 

One of the first priority tasks of the Coordinator will be to form a country team and assign role(s) to each member.  The team should be composed of collegial faculty members at his/her own institution and at “sister” institutions with an interest in participating in the survey.  Since sampling of students from each major department and discipline is involved, a liaison faculty member from each department should be identified and placed on the country team.  Ideally, one or more members of the university’s administration (staff) may also be invited, so that higher level officials such as chancellors, rectors, vice presidents, etc., can more easily be advised of progress periodically, and/or assist the Coordinator in resolving bottlenecks.

 

Since the survey will involve many campuses from many different institutions of higher learning throughout the country, the Coordinator should identify a campus survey liaison person on each campus with whom s/he can coordinate details of how, when, where, etc., the survey will be administered on each campus.  Each of these liaison persons should be put onto the team. 

 

Since the main instrument of the survey is a questionnaire, and sampling will be involved, as well as statistical inference techniques, tools and methods, if the Coordinator is not already very familiar with, and experienced in, research survey methodologies, it is strongly suggested that such an expert be identified and invited to be a member of the survey team (perhaps a statistical expert from the statistics department).  There is also the important consideration of whether to utilize printed survey forms rather than attempt to administer the survey online, because many countries and many institutions still do not have adequate ICT technologies to support an online endeavor.  Additionally, as suggested above, it may make sense to invite selected government agencies to participate on an “as needed” basis, if only so that, through participation, they are better informed.  The same applies to alumni groups, “Friends of the University” groups, and so on, as well as to the private sector (especially those corporations that have a demonstrated record of supporting the university in activities of this kind), NGOs, foundations, professional associations, and so on. Finally, the university’s public affairs office and the media should not be overlooked, because they can be valuable allies in helping to advertise, promote and advocate for the survey.

 

The Coordinator should discuss these draft Guidelines with all of the appropriate team members who will play a role in the survey, and then confirm to the Steering Committee and the facilitators that the Survey Host Institution is prepared to discharge its responsibilities pursuant to these Guidelines.  If there are problems or exceptions to this commitment, they should be pointed out, and perhaps the Survey Coordinator may wish to recommend certain revisions to these Guidelines.

 

Ideally, the Survey Coordinator should also be the individual who has day-to-day operational management and control responsibility for administering the survey, compiling the results, and so on.  However, sometimes the Survey Coordinator may choose to delegate some of the operational responsibilities to a colleague if s/he is already faced with heavy teaching, research and/or other duties and responsibilities.

 

The Survey Coordinator should ensure that a broad cross-section of students to be surveyed is drawn from all of the surveyed campuses' major academic departments and fields of study. While there is no need to attempt a “statistically perfect” representation of students from all of the major departments and disciplines, the Coordinator should ensure that the final sampling algorithm employed is not skewed in favor of a heavy majority of students representing only one or a few major academic departments and fields of study (e.g. the social sciences). A minimal sketch of one way to do this appears below.
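By way of illustration only (these Guidelines do not prescribe any particular algorithm; the department names and counts below are hypothetical), a simple proportional stratified allocation in Python might look like this:

# Hypothetical enrollment counts per department (illustrative only).
enrollment = {
    "Social Sciences": 4200,
    "Humanities": 3100,
    "Pure Sciences": 2600,
    "Engineering": 2300,
    "Business": 1800,
}

def proportional_allocation(enrollment, total_sample, min_per_stratum=30):
    """Allocate a total sample across departments in proportion to
    enrollment, with a guaranteed minimum per department so that no
    field of study is effectively excluded.  Because of rounding and
    the minimums, the total may differ slightly from total_sample."""
    population = sum(enrollment.values())
    return {
        dept: max(round(total_sample * count / population), min_per_stratum)
        for dept, count in enrollment.items()
    }

for dept, n in proportional_allocation(enrollment, total_sample=1000).items():
    print(f"{dept}: survey {n} students")

Students within each department can then be chosen at random (for example, with random.sample over a student roster) up to the allocated number.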

The Survey Coordinator should work with the Host Institution’s website administrator and staff to prepare online postings that advertise, promote and publicize the survey, internally within the institution (students, faculty and staff), as well as to likely audiences who may have an interest in the survey (not so much in the technical details of how the survey is conducted as in the results achieved and how those results will affect the country as a whole).  Such interested groups include government, the private sector and civil society elements within the country and the region.

The Coordinator should work with the institution’s public affairs and/or other in-house publishing office staff, to prepare appropriate press releases for the media and other addressees on the institution’s mailing lists, and other appropriate forms of announcements, to advertise, promote and publicize the survey to targeted audiences.

The Coordinator should work with the institution’s computer lab, or other similar facility (if one exists), to secure and reserve the use of a suitable training room facility for the days scheduled for the survey; ideally the training room in which the survey is administered should have a data projector and desktop or laptop for use by the monitor and presenters, with a widescreen easily visible to all students, including those who may be sitting in the back rows, and individual PCs for students to use.  However, as mentioned above, perhaps some campuses will not have adequate ICT infrastructures available, in which case the survey will have to be administered “manually” as a hard copy printed form.

Survey Website.  The facilitators will bring up a special website to serve as an online focal point where important survey-related information and documents prepared by the Steering Committee, the individual Survey Coordinators, and other concerned and interested organizations and individuals (such as professional societies) will be posted and made available.  The purpose of this special website will be to facilitate the interchange of the ideas, approaches, strategies, methods, techniques and tools used by the participating countries, so that all may benefit from a common pooling and sharing of this information.  The URL for this website will be provided as soon as it comes up.

 

 
 

3. THE SURVEY MASTER PLAN, ORGANIZATION, INSTRUMENT, METHODOLOGY AND TIMETABLE

The Project Information Literacy (PIL) Survey Instrument and Survey Methodology as a Model or Template for the IILS

As mentioned above, the IILS organizers and facilitators are putting forward the PIL survey instrument and survey methodology as model templates for the IILS, but with the expectation that each participating country will choose to modify somewhat the survey questions asked and the methodology employed in order to meet its own unique culture, language, traditions and other circumstances.  Professors Michael Eisenberg and Alison Head of the Information School at the University of Washington have enthusiastically welcomed the IILS and have promised to informally and unofficially provide assistance as their busy schedules and limited time permit. But, at the same time, having acknowledged that reality, they, and the IILS organizers, hope that most questions and the basic methodological approach used by the PIL survey can be retained by the IILS, even if minor changes in wording or terms employed are needed, for the sake of comparability among the results of the individual country surveys, and also with the results of the PIL survey.  For the sake of that comparability, as a general rule (the “default position,” as it is sometimes called), rather than alter an existing question to accommodate a different nuance in the meaning of a word, term or expression used in a particular question, the coordinator should consider adding a new question.  Obviously, if the IILS survey questions and the methodology were significantly altered between the various participating IILS countries, and/or varied significantly between the IILS survey model and the PIL survey model, then the findings, conclusions and recommendations would thereby be called into question because of comparability considerations.

 

The Survey Master Plan, Organization and Timetable 

With respect to the question of a timetable, the IILS will be planned, conducted and implemented, the results analyzed and compiled, and a final report prepared for the region as a whole, by the end of 2011.  It is hoped that each participating country will be able to administer its survey by the end of the first quarter of 2011 (the target date is April 2011).  Based on PIL experience, the survey should be administered at as many different campuses as is economically and managerially feasible (i.e. conditioned on available budgets and college approval) over a two week period.  In the case of the PIL 2010 survey, 25 different campuses (both four year and two year institutions) were surveyed, involving a total sample size of over 8,200 students.  But IILS countries will most likely wish to sample a larger number of students because they will conduct the survey on a country-wide basis, and therefore the 25-campus scope used by PIL would be too small to be significant in these countries.  Moreover, a cross-section of different fields of study should be sought in the student sample (e.g. business, social sciences, pure sciences, engineering, arts and humanities, etc.).  A rough way of estimating the required sample size is sketched below.
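As a rough, purely illustrative aid (the Guidelines do not mandate any particular formula, and the population figure below is hypothetical), the standard sample-size formula for estimating a proportion at roughly 95% confidence, with a finite-population correction, can be computed as follows:

import math

def sample_size_for_proportion(population, margin_of_error=0.03, z=1.96, p=0.5):
    """Required sample size for estimating a proportion at ~95%
    confidence (z = 1.96), using the conservative p = 0.5 and a
    finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Hypothetical country-wide student population of 500,000.
print(sample_size_for_proportion(500_000))  # about 1,065 students

Note that this gives only a statistical floor; practical considerations (campus coverage, per-department minimums, expected non-response) will usually push the actual sample considerably higher.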

 

Then, sometime during the middle months of 2011 (June, July and August), the results should be analyzed, interpreted, and preliminary findings and conclusions drawn.  By September 1, 2011 survey results from individual participating countries should be interchanged between participating countries, and revisions to preliminary findings and conclusions for the region-wide final report made, including inter-country comparisons of results.  Perhaps sub-regional nuances (i.e. two or more countries that have a special political, economic or socio-cultural commonality between and among them) will be detected in the analysis and, if so, they should be highlighted.  Then, following that, by mid October 2011 a final, region-wide report should be drafted and sent to all participating countries for review, revision, and final approval.  A target date of publishing the final, overall IILS report by the end of December 2011 is hereby established.

 

With respect to organization, each participating country will establish its own country team composed of various individuals with a direct or indirect interest in the survey and its expected outcomes.  The suggested composition of the country teams is discussed elsewhere in this document. The Survey Coordinators will head their respective country teams, and will be members of the Steering Committee.

 

In sum, the Survey Master Plan will be closely patterned after that utilized by the PIL survey, but with a number of variables introduced as discussed above.

 

The Survey Instrument and Methodology

So, in view of the above overall IILS planning framework, each country should proceed to develop its own specific Survey Plan in consultation with its country team, containing the “who, what, where and how” details of how the survey will be conducted and implemented, who will be responsible for each step and task, etc. The key details of the PIL Survey instrument and methods, and the other details which will need to be incorporated into the country’s Survey Plan, should be studied carefully and then adapted to the country situation.  This information is contained in Appendix A and Appendix B of

“Assigning Inquiry: How Handouts for Research Assignments Guide Today's College Students,” Alison J. Head and Michael B. Eisenberg, Project Information Literacy Progress Report, University of Washington's Information School, July 13, 2010 (41 pages, PDF, 2.14MB).

Note:  Coordinators may wish to download and save this document to their computer hard drive in order to refer to it more easily than trying to access it as a hyperlink each time they need it.

4. PLANNING, IMPLEMENTING AND CONDUCTING THE SURVEY AT MULTIPLE LOCATIONS

The Country Survey Plan

The Country Survey Plan is perhaps the most important document Coordinators will produce, because it brings together all Survey activities into one framework, specifying who is responsible for each action, when and in what sequence each step should take place, and where, and how, each step will be executed.

 

Here is a checklist of many of the most important considerations and tasks to be taken into account in developing the Plan.  The tasks are not necessarily listed in any logical or priority order, and they will undoubtedly vary from country to country in terms of their order, priority, exact nature, etc.

 

a. Hold a general meeting with your Survey Team, to clarify how to proceed, including discussing specific roles and responsibilities of each member of the team in some detail to ensure everyone understands what they need to do;

b. Decide which campuses in your country you will invite to participate, communicate with them to identify a campus survey liaison person, and then communicate with that person to clarify their, and your, responsibilities, timetable, etc., including your estimate of how long the survey will take students to complete (perhaps a range of estimated minimum vs. optimal vs. maximum time would be desirable);

c. Decide whether you prefer to follow a “simultaneous” (concurrent) approach and survey students at all participating campuses at one time, or whether you prefer a sequential approach and survey campuses “in waves” of, say, a few at a time; prepare a master schedule indicating the dates and times each campus will be surveyed;

d. Study the PIL survey instruments carefully and decide which questions you will include virtually unchanged in both substance and style of expression, which questions you want to omit entirely, and which you want to change in style of expression; moreover, you may choose to add some new questions of your own which were not included in the PIL survey;

e. Translate the final survey instrument from English into your country’s language(s);

f. Prepare a Student Instruction Sheet to give to students who are selected to take the survey, explaining what you are doing, why you are doing it, what your expectations are for the results, how students will be kept informed of findings and recommendations, specifying any rules you want the students to follow in taking the survey, and thanking the students for their participation;

g. Pre-test the survey instrument with a small number of students from your own institution, ensuring that you select a broad cross-section of pilot test students enrolled in each of the major courses of study and disciplines (e.g. social sciences, pure sciences, engineering, business, etc.), as well as a cross-section of the year of study (i.e. first, second, third or fourth);

h. Revise your survey instrument based on the results of the pre-test; if the number of changes you make is large, you may decide to conduct a second pre-test before you finalize the survey instrument; the Student Instruction Sheet may also need to be revised based on the pre-test results;

i. Where adequate ICT infrastructures exist, including, for example a computer lab or other room equipped with PCs, decide whether you plan to administer the survey online or prefer to have each campus survey liaison official administer it as a hard copy printed document in a classroom on that campus; advise campuses they should select and reserve a room that is convenient and comfortable for the purpose of taking the survey, whether online or printed modality;

j. Set deadlines for the start of, and completion of each major task;

k. Post key information on your special website so that everyone involved can check the latest information online;

l. Consider which key information you wish to share with your colleague Coordinators in other countries and send it to Professor Boekhorst for posting on the IILS website;

m. Transmit appropriate information on your plans to various media outlets, both onsite at your institution and in the public media;

n. Arrange to periodically brief your superiors on the highlights of what decisions were reached at your meetings, and ask if they have any questions or problems with the decisions that were reached, and whether they might be interested and willing to personally participate in a survey kickoff event; and

o. Set a tentative date for the next team meeting. 

 

Pre-Testing the Survey Instrument

It is essential that the survey instrument be pre-tested in advance of administering it to the student populations to be surveyed for the purpose of identifying the following kinds of potential problems:

 

1. A word, term or phrase, or the writing style used for asking a question, may be unclear and confusing to students; Coordinators should ensure that simple words are used, consistent with clarity of meaning;

2. The number of questions asked may be too many to reasonably complete in the time allotted for taking the survey; on the other hand, sometimes the number of questions is too few;

3. The purpose of a survey, unlike an examination, is not to test a student’s level of knowledge but rather to solicit an honest and straightforward response to a question; therefore “tricky” questions, which are ambiguous, too complex or too wordy, are to be avoided;

4. The pre-test, like the survey itself, should be administered to a cross-section of students representing all of the different fields of study and disciplines offered on a campus;

5. Also like the survey itself, the pre-test should be administered to a sample of first, second, third and fourth year students;

6. If the answers expected depart significantly from the “correct” or “preferred” response, the Coordinator may wish to discard the question entirely, or modify it and re-administer it as part of the pre-test task;

7. It is sometimes desirable to ask one final question on the pre-test instrument to the effect “did you have any particular difficulties in responding to these questions, and if so, which ones, and why did you have a problem?”  You may also wish to add a general statement such as “apart from individual questions with which you had some difficulty understanding what was required, do you have any general observations you would like to share on the survey itself, and if so, what are they?”
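As a minimal illustrative sketch (assuming the responses to that final question have been compiled as, for each pre-test student, a list of the question numbers he or she flagged as difficult; all names and figures here are hypothetical), the flagged questions can be tallied in a few lines of Python:

from collections import Counter

# Hypothetical pre-test feedback: for each student, the question
# numbers he or she flagged as difficult.
flagged_by_student = [
    [3, 7],      # student 1 had trouble with questions 3 and 7
    [7],         # student 2
    [2, 7, 11],  # student 3
    [],          # student 4 reported no difficulties
]

tally = Counter(q for flags in flagged_by_student for q in flags)

# Questions flagged by, say, 20% or more of pre-test students are
# candidates for rewording or removal before the main survey.
threshold = 0.20 * len(flagged_by_student)
for question, count in tally.most_common():
    marker = " <- review" if count >= threshold else ""
    print(f"Question {question}: flagged by {count} student(s){marker}")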

 

 Finally, the Coordinator may wish to pre-test an entire campus rather than just the survey instrument.  If a campus volunteers for this purpose, such a course of action would have the virtue of allowing the Coordinator not just to test the survey instrument, but also the entire logistical arrangements and plans for administering the survey at the many campus locations.

 

Administering and Conducting the Survey

After the survey instrument has been modified based on the results of the pre-test, we are ready to proceed with administering and conducting the survey.  As mentioned above, by this time it will be expected that all of the campuses that will participate in the survey have completed their preparations - - students to be surveyed alerted as to time and place, room reserved, “paper and pencils” and other supplies provided for manual surveys and computers ready in the case of online surveys, food and drink ready to be provided if required, and so on. 

 

Monitors assigned to each survey room should be designated to deal with individual student respondent needs and problems, and while the reader may smile, too often very simple logistical needs such as the location of rest rooms, water and drink supplies, pencils and scratch pads, etc., are overlooked or are inconvenient.  

5. ANALYZING AND EVALUATING THE SURVEY RESULTS

Analysis is concerned with three stages:

 

· First, factually reporting the results of the survey.  Sometimes this first stage is called determining the Findings of the survey.  That is, how did the respondents answer each question, how many responded, and were there discernible differences in responses based on one or more variables, such as which field of study the student was enrolled in, what year s/he was in, and so on.  These are the facts, and they should be directly observable and measurable from the results of the survey.

· Second, drawing inferences or Conclusions from the facts observed and measured; in short, what do the results point to?  Some conclusions may be fairly obvious and intuitive, but most often the drawing of conclusions is a highly skilled endeavor and should involve the most knowledgeable and expert professionals available to undertake the chore;

· Third, making recommendations from the facts observed and measured, and the conclusions drawn; in short, given the results of the survey, how might conclusions point to various recommendations that might be made to improve and strengthen the Information and Media Literacy behaviors of students?  One might pre-suppose that a variety of recommendations could be crafted, such as modifications to curricula, changes in the way research is undertaken, new skill sets and practices introduced into the curricula, testing of various kinds, course assignments, etc.

 

It is imperative that the Coordinator enlist the expertise of research, survey and statistical experts at this stage, because analysis is perhaps the most difficult of all of the stages involved in a survey.  It is comparatively easy to design a survey instrument, pre-test it, administer it, compile the results, and so on; but synthesizing and analyzing the results in order to intelligently move forward and try to cope with dysfunctional behaviors is extraordinarily difficult.  A minimal sketch of the mechanics of the first (Findings) stage appears below.
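As a minimal sketch of the Findings stage (assuming the compiled responses have been entered into a table with one row per respondent, and assuming the pandas library is available; the column names and data below are hypothetical), cross-tabulations by field of study can be produced in a few lines:

import pandas as pd

# Hypothetical compiled responses: one row per respondent.
df = pd.DataFrame({
    "field_of_study": ["Humanities", "Engineering", "Humanities",
                       "Social Sciences", "Engineering"],
    "year_of_study": [2, 3, 1, 4, 2],
    "q5_wikipedia": ["Often", "Almost Always", "Sometimes",
                     "Often", "Rarely"],
})

# Finding: how often students consult Wikipedia, by field of study,
# expressed as percentages within each field.
findings = pd.crosstab(df["field_of_study"], df["q5_wikipedia"],
                       normalize="index") * 100
print(findings.round(1))

The same pattern, repeated per question and per variable of interest (year of study, campus, etc.), yields the factual tables from which conclusions are then drawn.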

 

Countries may wish to help each other bilaterally and multilaterally at this stage, because certain findings, conclusions and/or recommendations may not even be documented by one country, yet be observed and publicized by another.  Therefore, it is hoped that countries will be willing to post their results and share them with the other participants in the survey.

6. SOME LOGISTICAL, FINANCIAL AND ADMINISTRATIVE CONSIDERATIONS

Protocol Considerations.  

From a goodwill and protocol standpoint, Survey Coordinators may wish to invite one or more Survey Host Institution dignitaries (e.g. a president, chancellor, rector or dean in the case of a university serving as a Survey Host Institution), one or more local government dignitaries (e.g. a governor or mayor), and perhaps even some other distinguished public figure(s) whose public distinction and reputation are appropriate to the context of the survey, to deliver welcoming, opening and/or closing remarks at a kickoff event before the actual survey is administered to students, or perhaps at some appropriate time and event following the start of the survey at different campuses.

 

Webcasts, Podcasts, Videos, etc.

Coordinators may wish to produce one or more video products in advance of the actual survey to help explain the background, purposes, and operation of the survey to different audiences, including a “general” category, one for faculty and staff, and one for students who will be among the survey sample (or potentially willing to participate in the survey as respondents).  Such products, as the PIL Survey principals learned, can be very helpful in soliciting support for, and interest in, the survey.

 

Travel & Living Arrangements and Expenses.  

As indicated above, Survey Coordinators and their colleagues with assigned roles and responsibilities in the survey are responsible for making their own travel and living arrangements, and for arranging with the host institution for the reimbursement of their personal expenses while on official survey business (e.g. visiting another participating campus to help make arrangements for conducting the survey there).

 

General Meeting of Campus Liaison Officials.  

If the Survey Coordinator and host institution decide it would be desirable to hold a single general meeting to which all of the campus liaison officials participating in the survey are invited, then the host institution will need to plan and budget for the travel and living expenses of the approved and invited campus liaison officials.  The decision on whether to hold such a general meeting, or to rely instead on e-mail and informal communications, is entirely at the discretion of each Survey Coordinator.

 

Food and Drink and Room Arrangements.  

Survey Host Institutions will be expected to provide water and perhaps a snack for students taking the survey, and to make the necessary room arrangements so that students are comfortable and there are no interruptions while the survey is administered.

 



APPENDIX A - PIL SURVEY INSTRUMENT*

* Source: Project Information Literacy Progress Report, “Truth Be Told,” November 1, 2010, Head and Eisenberg.

Question 1.

 

What is the name of the institution where you are enrolled?

 

• Boise State University

• Colgate University

• College of William & Mary

• Colorado State University

• Corban College

• CSU Maritime

• Eastern Michigan University

• Felician College

• Gettysburg College

• Holy Names University

• Linfield College

• New Mexico State University

• Northern Kentucky University

• Northern Michigan University

• Ohio State University

• Purdue University

• St. Mary's College of Maryland

• Southern Nazarene University

• State College of Manatee-Sarasota

• Temple University

• University of Arizona

• University of Michigan

• University of Minnesota

• West Virginia University

• Winston-Salem University

 

Question 2.

 

Your current status as a student is:

 

• Freshman (skip to end of survey for contest entry, since freshmen are excluded from study population)

• Sophomore

• Junior

• Senior

• Does not apply to me

No response


 

Question 3.

 

Which one of the following disciplines does your major area of study fall under?

(Click ONLY ONE.)

 

• Architecture

• Art (includes Ceramics, Dance, Digital Arts, Drama, Industrial Design, Music, Photography, Sculpture, Art History)

• Business Administration (includes Finance, Accounting, and Management)

• Computer Science

• Education

• Occupational training (includes Paralegal, Radiology Technician, Electrician, Recreation Programs)

• Engineering (includes Aeronautical, Civil, Chemical, Electrical, Mechanical)

• Humanities (includes English, Languages, History, Geography, Literature, Communication, Philosophy, Religion)

• Social Sciences (includes Anthropology, Economics, Political Science, Psychology, Sociology)

• Sciences (includes Astronomy, Plant Biology, Chemistry, Physics)

• Mathematics (includes Statistics)

• Nursing

• Still undecided about my major area of study

• Other:

 

Question 4. Part One: Course-Related Research Assignments

In this part of the questionnaire, we want to learn about how you work on research assignments in humanities and/or social science courses you may have taken on this campus. First, what kinds of assignments have you had?

 

Over the last year, which of the following types of research assignments have you had for the social science and/or humanities courses you have taken?

(Click ALL that apply.)

 

• Papers that present an argument about an issue(s)

• Papers that present a historical analysis of an event(s)

• Papers that present a “close reading” or interpretation of a text

• Papers that present a case study analysis

• Papers that present a literature review

• Papers that present a proposed study

• Oral presentation

• Oral presentation and an accompanying paper

• Multimedia product that requires research (i.e., Web site, video)

• I have no experience writing course-related research papers on this campus

• Other:


 

Question 5. Some students use certain resources, but not others, when they are working on research assignments for humanities and social science courses. We want to find out which resources YOU use.

HOW OFTEN do you CONSULT THESE RESOURCES during your course-related research process? (If you do not consult these resources at all, let us know, too.)

 

Course readings

Blogs

Search engines (e.g., Google, Bing, Yahoo!, Ask.com)

Wikipedia

Governmental Web sites (.gov sites)

Research databases through the library Web site (e.g., EBSCO, JSTOR, ProQuest)

Librarians

Library shelves

Instructors

Encyclopedias (e.g., Britannica, either online or print)

Classmates

Friends/family

My personal collection (materials I already own or buy)

• Almost Always

• Often

• Sometimes

• Rarely

• Never

• Don't Know

 

Question 6. Evaluating What You Have Found

 

When you find a source through the LIBRARY (books or articles from library Web databases), DO YOU CONSIDER the following things?

 

Consider how current the library source is.

Consider an author's credentials (e.g., where he/she is faculty or works).

Consider whether the content acknowledges different viewpoints (i.e., not biased).

Consider whether the author gives credit for using someone else's ideas (e.g., footnotes).

Consider whether the library source has a bibliography.

If there are charts, consider whether they have vital information (i.e., not just attractive graphics).

Consider who the publisher of the library source is.

Consider whether a librarian mentioned using the library source.

Consider whether I have ever heard of the library source before.

Consider whether I have used the library source before.

• Almost Always

• Often

• Sometimes

• Rarely

• Never

• Don't Know

• No Experience with this Situation


 

Question 7. Now let's focus on sources that you find out on the WORLD WIDE WEB.

 

Let's say you find a source "out on the WEB" (e.g., .com or .gov sites), DO YOU CONSIDER the following things?

 

Consider how current the Web site is.

Consider a Web site author's credentials (e.g., where he/she is faculty or works).

Consider whether the Web site content acknowledges different viewpoints (i.e., not biased).

Consider whether the Web site gives credit for using someone else's ideas (e.g., footnotes).

Consider what the URL (i.e., Web site address) is and what it may mean.

Consider whether the Web site has links to other resources on the Web.

Consider whether the Web site has a bibliography.

If there are charts, consider whether vital information is added (i.e., not just attractive graphics).

Consider whether a librarian mentioned using the Web site.

Consider whether I have ever heard of the Web site before.

Consider whether I have used the Web site before.

Consider whether the Web site's design tells me it's a legitimate site.

• Almost Always

• Often

• Sometimes

• Rarely

• Never

• Don't Know

• No Experience with this Situation

 

Question 8. Some students ask people for help with evaluating different kinds of sources (i.e., Web and library sources), while other students do not.

Do you ask any of the following PEOPLE for ASSISTANCE with evaluating COURSE-RELATED sources? (If you don't ask any of the following people for help, we want to know this, too.)

 

Instructors

Librarians

Classmates

Friends and family

Writing Center staff

Licensed professionals (i.e., physicians, attorneys, therapists)

• Almost Always

• Often

• Sometimes

• Rarely

• Never

• Don't Know

• No Experience with this Situation


 

Question 9. What's YOUR "Research Style"? Students have different practices, routines, techniques, and workarounds for completing course-related research assignments. Below are statements different students have made about how they approach assignments.

 

How OFTEN do you use each of these research PRACTICES during YOUR OWN course-related research process?

 

Once I find the number of citations the instructor expects, I end my research process.

If I don't find something in one or two searches, I start over with a brand new topic.

I work my own perspective into the assignment, so the instructor knows what I think.

I come up with a thesis statement early on.

I develop an outline for how to proceed with the assignment (e.g., writing the paper).

If the assignment is a paper, I sit down and just start writing, without much of a plan for what I'm going to say at all.

One of the first things I do is to figure out search terms to use.

I develop an overall research plan to guide my research process.

I use a system for organizing the research sources I find along the way.

I use interlibrary loan or document delivery services if my library doesn't have what I need there.

I tend to use the same set of research resources from one assignment to the next.

I tend to write about the same topic from one assignment to the next.

I tend to spend the same amount of time on assignments.

• Almost Always

• Often

• Sometimes

• Rarely

• Never

• Don't Know

• No Experience with this Situation


 

Question 10. There are a lot of different productivity tools, some online and others not, which students can use to support various tasks during their own course-related research process.

 

Have YOU used any of these PRODUCTIVITY TOOLS for course-related research tasks in the LAST SIX MONTHS?

 

Highlighting feature for underlining text on a computer screen

Digital "sticky notes" for use with a computer (e.g., Post-It digital notes)

Citation-making programs (e.g., RefWorks, EndNote, EasyBib)

Social bookmarking (e.g., digg, delicious)

Alerting services (e.g., programs that send out automatic Web feeds for newly appearing content)

Microblogs (i.e., Twitter)

Document sharing programs (e.g., Google Documents)

Online time management programs with sharing (e.g., Google Notebook)

Wikis for creating and sharing Web content (other than Wikipedia)

Photo-sharing sites (e.g., Flickr, Photobucket)

Virtual research environments

Blogging (e.g., LiveJournal)

Voice over Internet Protocol (e.g., Skype)

An online forum where I can post a question and get an answer from someone

• Yes

• No

• Don't Remember

• Never Heard of this Before

 

Question 11. 

 

HOW IMPORTANT is each of the following to you when you are working on a course-related research paper?

 

Getting a good grade from the instructor.

Passing the course.

Getting the paper finished.

Meeting the paper-length requirement (if there is one).

Meeting the number of citations required (if a requirement exists).

Doing a comprehensive investigation about my research topic.

Finding answers I can insert into the paper to prove I've done research.

Improving my writing skills.

Improving my research skills.

Improving my analytical skills.

Integrating my own perspective into the paper.

Learning something new.

Impressing the instructor with my intellectual abilities.

Impressing my parents with the grade I end up receiving.

Having the chance to be creative with an assignment.

• Very important

• Important

• Moderately Important

• Of Little Importance

• Not Important

• Don't Know

• No Experience with this Situation


 

Question 12. Overall, when you think about the ENTIRE research process--from the moment you get the assignment until you turn in your research paper--what is DIFFICULT for you?

 

How strongly do you AGREE OR DISAGREE with each of the following statements about what is DIFFICULT about course-related research?

 

Getting started on the assignment is difficult.

Defining a topic for the assignment is difficult.

Narrowing down a topic is difficult.

Coming up with search terms is difficult.

Finding articles in the research databases on the library's Web site is difficult (e.g., EBSCO, JSTOR, ProQuest).

Finding sources to use "out on the Web" is difficult (e.g., Google, Wikipedia, government sites).

Determining whether a Web site is credible or not is difficult.

Figuring out where to find sources in different parts of the campus is difficult.

Finding up-to-date materials is difficult.

Having to sort through all the irrelevant results I get to find what I need is difficult.

Evaluating the sources I've found is difficult.

Reading through the material is difficult.

Taking notes is difficult.

Integrating different sources from my research into my assignment is difficult.

The writing part is difficult.

Knowing when I should cite a source is difficult.

Knowing how to cite a source in the right format is difficult.

Knowing whether my use of a source, in certain circumstances, constitutes plagiarism or not is difficult.

Deciding whether "I'm done" or not is difficult.

Knowing whether I've done a good job on the assignment or not is difficult.

• Strongly Agree

• Somewhat Agree

• Neither Agree nor Disagree

• Somewhat Disagree

• Strongly Disagree

• Don't Know

• No Experience with this Situation


 

Question 13. Part Two: Conducting “Everyday Life Research”

Now, we'd like to ask you about something entirely different. We'd like to know a little about your experiences with conducting what might be called “everyday life research.” Everyday life research consists of collecting materials for solving information problems that may occur during the course of your daily life.

 

Over the last six months, have you carried out EVERYDAY LIFE RESEARCH about one of these topics? (Click ALL that apply.)

 

Health/wellness issue (either for yourself or someone close to you)

News/current events

Purchasing something (e.g., product or service)

Something related to what I am asked to do at my job

Domestic life (e.g., figuring out where to live)

Work/career (e.g., salaries for certain types of professions, job openings)

Spiritual information (e.g., finding out about different religious beliefs)

Travel information (e.g., trip-planning)

Advocacy information (e.g., finding out about different political/social causes)

Social contacts (e.g., using a social networking site to find others with similar interests)

Searched for an expert of some kind (e.g., medical doctor)

Other:

 

Question 14. Some people use certain resources, but not others, to find everyday life information. What do you use?

 

HOW OFTEN do you CONSULT THESE RESOURCES during your EVERYDAY LIFE research process? (If you do not consult these resources at all, let us know, too.)

 

Blogs

Search engines (Google, Bing, Yahoo!, Ask.com)

Wikipedia (either from a Google result or a direct visit to the Wikipedia site)

Governmental Web sites (.gov sites)

Research databases on library Web site (e.g., EBSCO, JSTOR, ProQuest)

Librarians

Library shelves

Instructors

Encyclopedias (Britannica, either online or print)

Classmates

Friends/family

Social networking site (e.g., Facebook)

My own personal collection (e.g., materials I already own or buy)

• Almost Always

• Often

• Sometimes

• Rarely

• Never

• Don't Know


 

Question 15. Evaluating What You Have Found for Everyday Life Research

 

When you have found a source for EVERYDAY LIFE research on the Web, DO YOU CONSIDER the following things?

 

Consider how current the Web site is.

Consider a Web site author's credentials (e.g., where he/she is faculty or works).

Consider whether the Web site acknowledges different viewpoints (i.e., not biased).

Consider whether the Web site author gives credit for using someone else's ideas (e.g., footnotes).

Consider what the URL (i.e., Web site address) is and what it may mean.

Consider whether the Web site has links to other resources on the Web.

Consider whether the Web site has a bibliography.

If there are charts, consider whether vital information is added (i.e., not just attractive graphics).

Consider whether a librarian mentioned using the Web site.

Consider whether I have ever heard of the Web site before now.

Consider whether I have used the Web site before.

Consider whether the Web site's design tells me it's a legitimate site.

• Almost Always

• Often

• Sometimes

• Rarely

• Never

• Don't Know

• No Experience with this Situation

 

Question 16. 

 

Do you ask any of the following PEOPLE for ASSISTANCE when you are evaluating sources for EVERYDAY LIFE research? (If you don't ask any of the following people for help, we'd like to know this, too.)

 

Friends and family

Classmates

Librarians

Instructors

Licensed professionals (i.e., physicians, attorneys, therapists)

• Almost Always

• Often

• Sometimes

• Rarely

• Never

• Don't Know

• No Experience with this Situation


 

Question 17. Now, let's talk about difficulties with the ENTIRE EVERYDAY LIFE research process. What is DIFFICULT for you?

 

How much do you agree or disagree with each of the statements about what's difficult about EVERYDAY LIFE research?

 

Getting started on everyday life research is difficult.

Defining what I need during everyday life research is difficult.

Narrowing down a topic for everyday life research is difficult.

Coming up with search terms for everyday life research is difficult.

Finding articles for everyday life research in the research databases on the library's Web site is difficult (e.g., EBSCO, JSTOR, ProQuest).

Finding sources to use "out on the Web" for everyday life research is difficult (e.g., Wikipedia, Google, .gov sites).

Determining whether a source I find for everyday life research is credible or not is difficult (online or print).

Finding up-to-date materials for everyday life research is difficult (online or print).

Having to sort through all the irrelevant results I get to find what I need for everyday life research is difficult.

Evaluating the resources I find and may end up using for everyday life research is difficult.

Figuring out where to find sources for everyday life research is difficult.

Reading through material is difficult.

Knowing that the "answer" for everyday life research is online, but not being able to find it, is difficult.

Integrating information from different sources is difficult.

Deciding whether "I'm done" or not with my everyday life research is difficult.

• Strongly Agree

• Somewhat Agree

• Neither Agree nor Disagree

• Somewhat Disagree

• Strongly Disagree

• Don't Know

• No Experience with this Situation

 

Question 18. Tell Us a Little More About Yourself

Now just a few questions to find out a little more about you...

 

What is your GPA?

 

• Below 1.4

• 1.4 - 1.6

• 1.7 - 2.0

• 2.1 - 2.3

• 2.4 - 2.6

• 2.7 - 3.0

• 3.1 - 3.3

• 3.4 - 3.7

• 3.8 - 4.0

• Declined to State


 

Question 19.

 

What is your age?

 

• 18-20 years old

• 21-22 years old

• 23-25 years old

• Over 25 years old

• Declined to State

 

Question 20.

 

What is your gender?

 

• Male

• Female

• Declined to State

 

Question 21.

 

If you would be willing to participate in a follow-up interview (15-30 mins.) to tell us about your experiences conducting research, please provide us with a telephone number and your first name (only) for contacting you.

 

 

*Source:  From Appendix C, “Assigning Inquiry: How Handouts for Research Assignments Guide Today's College Students,” Alison J. Head and Michael B. Eisenberg, Project Information Literacy Progress Report, University of Washington's Information School, July 13, 2010 (41 pages, PDF, 2.14MB).

APPENDIX B – PIL SURVEY METHODOLOGY*

Appendix A: Methods

 

From October 1, 2009 through December 17, 2009, the Project Information Literacy (PIL) Team conducted a quantitative content analysis of 191 course-related research handouts. Instructors who taught undergraduates at 28 U.S. colleges and universities voluntarily submitted the handouts.

 

The goal of the content analysis was to find out what types of guidance and support instructors provide to undergraduate students for completing a course-related research assignment.

 

This content analysis study is part of PIL's ongoing research about how college students conceptualize and operationalize course-related and everyday life research. In light of PIL's previous research findings, we were interested in learning more about the coaching role instructors may play in the student research process and, specifically, in providing students with situational and information-gathering contexts.34

 

Our unit of analysis was the course-related research handout. We used instructors' handouts as one of the communication artifacts that instructors use to convey information about course-related research. In our prior research, students reported that written guidelines were useful to them in completing their research assignments.

 

For the purposes of the study, we defined a course-related research handout as an explanatory handout about a research assignment, prepared and distributed by a college instructor in the previous year.

 

The handout may have been distributed to students in class or through other methods (e.g., posted on a Blackboard site). A course-related assignment could result in a research paper or another deliverable (e.g., multimedia presentation). In either case, the assignment requires students to conduct some “outside research” and collect substantiating information from existing primary and secondary sources.

 

Research Liaisons

 

At each institution, we enlisted research liaisons, often librarians, who worked on campus and facilitated PIL's instructor recruitment process. Each liaison submitted instructor names and emails (approximately 15 faculty names per institution). PIL, in turn, emailed each instructor with study details. In exchange for their time and participation, instructors who submitted handouts were entered in a PIL drawing for a $100 bookstore gift card. To mitigate any “pro-library” bias, we asked liaisons to collect the names from sources other than themselves. That is, liaisons collected instructor names by asking a dean or department head on their campus to recommend an instructor for the study, instead of relying on their own contacts through, perhaps, library support and consultation.

 

Appendix Figure 1 shows baseline information about each institution where handouts were collected.

 

34 See page 5 of this report for a discussion of how PIL defines situational context, in light of PIL's typology of research contexts early adults seek when conducting course-related and everyday life research.


 

Appendix Figure 1: Institutions Participating in the Content Analysis Study

[Table of participating four-year colleges and universities, with columns for Institution, Type, Full-time Undergraduate Enrollment, Research Liaison, and Handouts Submitted from Faculty.]

(DETAILED LIST OF CAMPUSES SURVEYED OMITTED HEREIN)

 


 

Institutional Sample

 

Of the 28 institutions participating in the content analysis, 7 were community colleges (25%), 13 were four-year public colleges and universities (46%), and 8 were four-year private colleges and universities (29%).

 

By institutional type, we received the following numbers of handouts: 77 from four-year public institutions (40%), 54 from four-year private institutions (29%), and 60 from community colleges (31%). Appendix Figure 2 shows a breakdown by institutional type.

 

Appendix Figure 2: Handouts Submitted by Institutional Type

n = 191

Description of Instructors who Submitted Handouts

 

We collected handouts from instructors who taught sophomores, juniors, and seniors and came from a range of broad disciplines (e.g., humanities and arts, social sciences, sciences, architecture and engineering). During our analysis, individual disciplines were collapsed into these broad disciplinary categories so that we could fill cells for more meaningful comparisons. Appendix Figure 3 shows a breakdown of the handouts we received by discipline.

 

Appendix Figure 3: Handouts by Discipline

n = 191


 

The handouts in the PIL sample had been distributed to students within the last three semesters. Our sample did not include courses where the majority of the curriculum was focused on how to conduct library research. We also excluded syllabi from the sample. Our sample of handouts is a “voluntary sample.” We fully recognize that voluntary samples are always somewhat biased, since they are limited to people (and handouts) that are self-selected. Consequently, inferences from a voluntary sample are not as reliable as those from a random sample of an entire population, which was not a realistic option, given our study design.

 

The handouts in the sample varied in length. On average, handouts were 960 words, or nearly four single-spaced pages each. Appendix Figure 4 shows a breakdown of the handouts analyzed by word length.

 

Appendix Figure 4: Length of Handouts

n = 191
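The report does not publish the bin boundaries behind Appendix Figure 4, so the following is only an illustrative sketch of how such a word-length tally could be produced: the bin edges, the handouts/ folder of plain-text files, and the whitespace-based word count are our assumptions, not PIL's procedure.

    from collections import Counter
    from pathlib import Path

    # Hypothetical bin edges; the report does not publish the ones
    # actually used for Appendix Figure 4.
    def word_length_bin(n_words: int) -> str:
        if n_words <= 500:
            return "short (0-500 words)"
        if n_words <= 1000:
            return "medium (501-1,000 words)"
        return "long (over 1,000 words)"

    # Assumes a folder of plain-text handout files (illustrative only).
    tally = Counter(
        word_length_bin(len(path.read_text(encoding="utf-8").split()))
        for path in Path("handouts").glob("*.txt")
    )
    print(tally)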

In addition to collecting the handouts from instructors, we asked participants to voluntarily provide demographic information about the highest degree that they held.

 

The majority of the instructors who participated in the study had PhDs (79%), with fewer having Masters (20%) or JDs (1%) as their highest degree.

 

We collected data from instructors about how many years they had been teaching. The mode for teaching experience was 11-20 years, with 37% of the sample falling into this category. Appendix Figure 5 shows a breakdown of instructors' experience with college teaching.


 

Appendix Figure 5: Instructors and Years of Teaching Experience

n = 191

Human Subjects Review and Confidentiality

The Human Subjects Division at the University of Washington (UW) approved our research protocol on July 31, 2009 (Certification #36818). UW is the sponsoring institution for PIL's ongoing research study, which is based in the Information School.

UW's Human Subjects reviewers certified PIL's survey project as “exempt.” The exempt status was due to the no-risk nature of the methodologies used to collect data and to guarantee confidentiality. Our research protocol was also submitted to and approved at each of the 28 institutions where data was collected from instructors.

 

All measures were taken to protect any identifiable data about instructors who submitted handouts (e.g., each participant was assigned an identification code; all responses and code keys were stored separately in locked files or on secured computers). No participants or individual institutions were identified in this report.

 

Handout Coding Procedures

 

The content analysis coders were three working librarians, who generously donated their time. The coders were Sue Gilroy (Harvard), Sara Prahl (Colby College), and Sarah Vital (St. Mary's College of California). Coders who worked at institutions also in the handout sample were not allowed to code any of the “home base” handouts from faculty. Before the official coding process began, Sarah Vital, the Lead Coder, conducted a training session with coders Sue Gilroy and Sara Prahl. The codebook was also pilot tested with a sample of three handouts from St. Mary's College of California, a campus not in the sample.

 

Handouts were coded for 28 individual properties (the coding form used during the analysis is included in Appendix B). During the coding phase, the three coders systematically identified 26 manifest properties of wording and phrasing that appeared in the 191 handouts. When coding is conducted during content analyses, manifest describes what an author or speaker (or, in our case, an instructor) has definitely written right into the text. Manifest coding is different from latent coding, since latent coding requires the coder to make a qualitative and critical interpretation of inferred meanings in a text.

 

The PIL Team used latent coding for only two properties: guidelines provided to students for evaluating the currency and authority of resources. Since currency and authority are terms used in library and information science for characterizing resources, we had to infer how instructors who had not been trained in library and information science may have described similar concepts.
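To make the manifest/latent distinction concrete, here is a minimal sketch of how a manifest property can, in principle, be checked against wording that is literally present in a handout. The property, the pattern, and the function name are our illustrative inventions, not items from PIL's actual codebook.

    import re

    # Hypothetical manifest property: does the handout explicitly mention
    # the library or librarians? Manifest coding keys on wording that is
    # literally in the text, so a simple pattern match approximates it.
    LIBRARY_PATTERN = re.compile(r"\blibrar(?:y|ies|ian|ians)\b", re.IGNORECASE)

    def mentions_library(handout_text: str) -> bool:
        """Return True if the handout explicitly refers to the library or librarians."""
        return bool(LIBRARY_PATTERN.search(handout_text))

    print(mentions_library("Consult a reference librarian before you begin."))  # True
    print(mentions_library("Use at least five scholarly sources."))             # False

A latent property such as "gives guidance on evaluating the currency of sources" cannot be reduced to a keyword match of this kind: a human coder must judge whether phrasing like "use recent studies" expresses the concept, which is the interpretive step PIL's two latent-coded properties required.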

 

Intercoder Reliability

 

In order to measure intercoder reliability, we had each coder read the same 19 handouts in the sample. Krippendorff's alpha was used to measure the variation among the three coders' individual coding decisions.

 

The current version of PASW Statistics 17 (formerly SPSS) was used to test intercoder reliability and to measure the degree of variation among the three coders' decisions. Krippendorff's alpha is the most rigorous means of testing intercoder reliability. The statistic takes into account chance agreement among content analysis coders and adjusts for nominal, ordinal, interval, and ratio variables.

 

Although there is no universally accepted standard for intercoder reliability, communication research scholars have argued that a coefficient of .90 is “highly acceptable” and that .80 is “acceptable.”35 Overall, the intercoder reliability among the individual decisions was .80 and therefore within the “acceptable” range. This means that there was an 80% degree of reliability in the PIL Team's coding among the three coders' individual decisions.
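PIL ran this test in PASW; for readers who want to see what the statistic involves, below is a minimal, self-contained sketch of Krippendorff's alpha for nominal data. The function and the example ratings are our illustration (three coders, five units, one missing judgment), not PIL's code or data, and the sketch assumes at least two categories occur so that the expected disagreement is nonzero.

    from collections import Counter
    from itertools import permutations

    def krippendorff_alpha_nominal(units):
        """Krippendorff's alpha for nominal data.

        `units` is a list of lists: one inner list per coded unit,
        holding the value each coder assigned (None = missing).
        """
        # Keep only units with at least two coded values (pairable units).
        coded = [[v for v in unit if v is not None] for unit in units]
        coded = [unit for unit in coded if len(unit) >= 2]

        # Build the coincidence matrix o_ck from ordered value pairs.
        coincidences = Counter()
        for unit in coded:
            m = len(unit)
            for i, j in permutations(range(m), 2):
                coincidences[(unit[i], unit[j])] += 1 / (m - 1)

        n = sum(coincidences.values())      # total pairable values
        totals = Counter()                  # n_c: marginal total per category
        for (c, _), w in coincidences.items():
            totals[c] += w

        observed = sum(w for (c, k), w in coincidences.items() if c != k) / n
        expected = sum(totals[c] * totals[k]
                       for c in totals for k in totals if c != k) / (n * (n - 1))
        return 1.0 - observed / expected    # alpha = 1 - D_o / D_e

    # Three coders rating five units; None marks a missing judgment.
    print(krippendorff_alpha_nominal([
        ["yes", "yes", "yes"],
        ["yes", "yes", "no"],
        ["no", "no", "no"],
        ["yes", "no", None],
        ["no", "no", "no"],
    ]))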

 

Follow-Up Interviews with Instructors

 

Many of the results from our analyses provided some answers about the kinds of guidelines instructors provided to students in course-related research handouts. At the same time, the analysis raised new questions. In the hope of answering some of these questions and providing supplementary qualitative details to our content analysis, we conducted 15 follow-up interviews with instructors who had submitted handouts to our sample, drawn from the 174 instructors (91%) who agreed to be contacted.

 

The sample was segmented along three lines: (1) respondents from community colleges vs. those from four-year institutions; (2) disciplinary area of expertise, including a balance of humanities, social sciences, sciences, and business administration; and (3) instructors whose handouts recommended using library resources and services and those whose handouts did not.

 

Fourteen of the interviews were conducted by telephone and lasted for 15-30 minutes. One interviewee responded by email to the list of questions. A script with six open-ended questions was employed. The same interviewer was used throughout for consistency.36

35 K. A. Neuendorf (2002). The content analysis guidebook. Thousand Oaks, CA: Sage.

36 John Marino, a doctoral student in the University of Washington's Information School and a member of the PIL Research Team, conducted the interviews in April and May 2010.

 


 

The questions were as follows:

 

Q1a. Do you assign many course-related research assignments to students?

 

Q1b. What would you say the "research part" of the research assignments actually entails for students? What are your expectations of students for course work that involves researching a topic or issue?

 

Q2. When you ask students to complete a course-related research assignment, we know that you often distribute a handout of some sort that explains the assignment (that's how you ended up in our study, since you submitted a handout to us last fall). Would you say in the handout (or in class or one-on-one, e.g., face to face or via email) that you spend a lot of time discussing assignment particulars with students, or do you assume students know what course-related research involves by the time they enroll in your course?

 

Q3. From your experience, how much skill would you say students bring to the course-related research process? Are most students, in your experience, well prepared to conduct the level of research you expect of them? Would you say students are better at certain things than others when completing course-related research assignments? Please describe.

 

Q4. Do you tend to recommend other people on campus whom students may consult for help with finding information and conducting research? If so, who? Why? How about librarians? Do you recommend that students use librarians, or do you assume students already know about consulting with librarians? Why or why not?

 

Q5a. There are so many online sources available to students for conducting course-related research, as we both know. Do you ever make suggestions to students about what sources to use (or not to use) for course-related research? Why or why not? What sources are you likely to recommend or discourage? Do you find students have a fairly good idea of what sources to use?

 

Q5b. Would you say you, yourself, are up-to-date about the different research sources—online and in print—which students might use for one of your course-related research assignments?

 

Q6. Lastly, let's talk about plagiarism and course-related research assignments. By the time you have students enrolled in one of your classes, would you say students know what plagiarism is and what constitutes an act of plagiarism? How much is plagiarism a problem and what forms does it seem to take?