ICT Literacy Assessment
Information and Communication Technology
Educational Testing Service (ETS)
Teresa Egan, Project Leader, New Product Development

Topics
- What is ICT literacy, and why measure it?
- The development of the ICT Literacy Assessment
- The current test design
- Sample tasks
- Score reports

What is ICT literacy, and why measure it?

ICT literacy is the ability to use digital technology, communication tools,
and/or networks to define, access, manage, integrate, evaluate, create, and
communicate information ethically and legally in order to function in a
knowledge society.

[Diagram: the proficiency components (Evaluate, Create, Communicate) shown
along a spectrum from Technical to Ethical.]

Built on prior efforts
- International ICT Literacy Panel, Digital Transformation: A Framework for
  ICT Literacy (2002).
- Association of College and Research Libraries (ACRL), Information Literacy
  Competency Standards for Higher Education (2000).

ICT Literacy: a bridge between Information and Communication Literacy
- Can you find information on the web?
- Can you create a persuasive presentation?

[Diagram: Technical Literacy (database, word processing, presentation -
"Can you bold a word?", "Can you open a database?") bridged with
Information Literacy (access, evaluate, use - "Can you find information?",
"Can you evaluate authority?").]

Why measure ICT literacy?
- The nature, value, and availability of information have changed
  enormously, and this change affects the way we live, learn, and work.
- To succeed, it is not enough to master technical skills: you must know
  how to apply them within an information society.
- There is a lack of information about the ICT literacy of students, and
  debate about how best to address this issue in academic curricula.

The development of the ICT Literacy Assessment

It started with our higher education partners

Charter Clients
- California Community Colleges
- California State University
- UCLA
- University of Louisville
- University of North Alabama
- University of Texas
- University of Washington

Expanded Consortium
- Arkansas State University
- Bowling Green State University
- Miami Dade College
- Oklahoma State University/Defense Ammunition Center
- Portland State University
- Purdue University
- University of Memphis

Development timeline
- Spring/Summer 2003: higher education partners identified.
- July 2003: initial definitional work began.
- December 2003: test development began with charter client institutions.
- July 2004: field trials of tasks began.
- January 2005: operational version of the Large Scale Assessment released
  for use.

Next 6 months
- January-April 2005: multiple institutions are testing; the goal is to
  test 6,000-7,000 students.
- April-May 2005: more market research.
- May-June 2005: analysis and score reporting; score reports sent to
  institutions.
- June 2005: use the analysis to finalize decisions regarding:
  – design for the 2006 Large Scale Assessment and the individual
    reporting version
  – testing schedule (Fall and Spring? Just Spring?)
- Summer 2005: design a "workforce version" of the test.

The current test design

The ICT Literacy Assessment
- Two separate sets of results:
  – ICT Literacy Large Scale Assessment (2005): institutional profile only
  – ICT Literacy Individual Assessment (2006): option to receive feedback
    at the individual student level
- Use in higher education:
  – Institutions: accreditation, state accountability, curriculum guidance
  – Individuals: academic guidance, fulfillment of an information literacy
    or technology literacy requirement, certification

Basic Design Features
- Interactive tasks using simulated software with the look and feel of
  typical applications (databases, search engines, spreadsheets, email,
  etc.)
- 4-minute, 15-minute, and 30-minute tasks
- Real-life scenarios
- "Get back on track" mechanisms if a test-taker gets really lost
- Multiple scorable elements per task

Assessment Content
- Length of tasks:
  – Long (30 minutes)
  – Medium (15 minutes)
  – Short (4 minutes)
- Contexts:
  – Academic
  – Business
  – Personal
- Subject areas:
  – Humanities
  – Social Sciences
  – Natural Science
  – Practical Affairs
  – Popular Culture

Delivery and Scoring
- Delivery:
  – Two-hour assessment window
  – Web delivery at client institutions through a secure browser
- Scoring:
  – Automated scoring
  – Proficiency estimation via IRT (item response theory) analysis

Validity and reliability research
- Validity research:
  – "Chain of validity" method used during development to ensure content
    validity
  – Extensive questionnaire before the test, to see how results correlate
    with self-reported exposure to and confidence in specific proficiencies
  – Questionnaire after the test to check face validity among test takers
  – Larger validity study in progress with charter clients
  – Ongoing research planned
- Reliability efforts:
  – Analysis of the November field test
  – Analysis of the current 6,000-7,000 test-takers
  – Ongoing scrutiny
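The deck says proficiency is estimated via IRT analysis but does not show the model. As a rough illustration only (ETS's actual scoring model and item parameters are not given here), a minimal sketch of a two-parameter logistic (2PL) IRT ability estimate might look like this; the item parameters and responses below are invented for the example.

```python
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response at ability theta,
    given item discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_theta(responses, items, lo=-4.0, hi=4.0, steps=801):
    """Maximum-likelihood ability estimate via a simple grid search
    over theta in [lo, hi]."""
    best_theta, best_ll = lo, float("-inf")
    for i in range(steps):
        theta = lo + (hi - lo) * i / (steps - 1)
        ll = 0.0
        for x, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            ll += math.log(p) if x else math.log(1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta

# Hypothetical item parameters (a, b) and one examinee's 0/1 scores
# on four scorable elements.
items = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.2)]
responses = [1, 1, 1, 0]
theta_hat = estimate_theta(responses, items)  # MLE ability for this examinee
```

In an operational setting the estimation would be done with calibrated item parameters and a more robust optimizer, but the core idea is the same: each scorable element contributes to the likelihood of the examinee's ability estimate.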

Sample tasks

Sample Tasks
- Example 1: Comparing Information
- Example 2: Display Data

Comparing Information
- Simple task: four-minute duration
- Purpose: assess the test-takers' proficiency in integrating information
  – Integrate: the ability to interpret and represent information in an
    ICT context. This includes the ability to synthesize, summarize,
    compare, and contrast information from multiple sources.

Within the task
- Summarize information from three different types of sources
- Compare the information to reach a conclusion

In this task, examinees summarize information from a variety of sources
and then draw conclusions from their summary.

Examinees are presented with an information need and three different types
of information sources:

1. Webpage
2. Email
3. Advertisement

The three sources present information according to different conventions.
A successful candidate must be able to locate the relevant information in
each source.

Deciding how to compare the sources involves identifying the requirements
of the stated information need.

After filling in the table, examinees must interpret the summary to rank
the three sources correctly.

Display Data
- Simple task: four-minute duration
- Purpose: assess test-takers' proficiency in creating information
  – Create: the ability to generate information by adapting, applying,
    designing, or inventing information in ICT environments.

Within the task
- Visually represent data in a graph
- Interpret the graph to answer research questions

In this task, examinees create a visual representation of data to answer
two research questions.

Examinees select which variables to display on each of the axes.

Identifying the correct time span involves considering the implicit
requirements of the information need.

Identifying the correct dependent variable (y-axis) involves thinking
about how best to reflect "popularity."

Examinees have the opportunity to try out different graphs before settling
on their response, and this process is factored into their score.

Answering the research questions involves correctly interpreting the
graph.

The two research questions require different degrees of analytic skill.

Score reports

Score Reporting
- Score reporting will be online.
- Reports will come approximately eight weeks after the close of the
  testing window, with reporting time reduced for subsequent
  administrations.
- Group comparisons will be available if adequate sampling is provided.

Score Report Sample
- Performance levels by proficiency pairings
- Explanatory text to guide interpretation

Sample of explanatory text - High
A high performance on ACCESS indicates that the students efficiently and
effectively located information via browsing or searching. Students
identified appropriate information resources (e.g., the correct database)
for a particular research question or course assignment. At this level,
examinees tended to use search terms and queries that yielded useful
results without many irrelevant results (i.e., they conducted a precise
search). Examinees executed their web or database search strategy
carefully, going well beyond the specific words of the research statement
or assignment. For example, examinees may have identified useful synonyms
of key terms and constructed a search that takes into account both
provided terms and these synonyms. At this level, examinees formulated an
advanced search query that utilized Boolean operators and other syntactic
elements . . .

Summary of plans for the future
- Individual student feedback in 2006
- Workforce version of the assessment by 2006, so that colleges can show
  employers that their students are ICT-ready
- Considering assessments for international and/or K-12 markets

Questions? Comments?