Assessing Information Literacy In Engineering: Integrating A College-Wide Program With ABET-Driven Assessment
Donna Riley, Smith College, described how Smith formalized its information literacy (IL) program in fall 2003 and is moving toward an institution-wide program that incorporates assessment of discipline-specific measures. The program first tapped into the first-year writing-intensive course and has since grown into a discipline-by-discipline, curriculum-integrated approach based on the ACRL information literacy standards and the science and technology (ILST) standards. Around 40% of departments have already developed department-specific IL outcomes; others are working on theirs. They are now in phase two, collecting data to assess student learning. The engineering faculty wanted their outcomes to dovetail with ABET criteria, so they have hybridized the ACRL and ABET standards. The ILST performance indicators are highly detailed while the ABET outcomes are broader, so faculty needed to design an assessment plan that could work for both. The literature shows that each institution develops its own outcomes, even though some papers, including some within ASEE-ELD in past years, have dealt with aligning or mapping ABET to the ACRL standards.
At Smith, they decided to map their own standards to ABET/ACRL in several focus areas related to information literacy. The first is lifelong learning, for which Smith has developed more detailed, measurable performance goals. The second is ethics, stated broadly in ABET as “an understanding of professional and ethical responsibility,” which relates to ACRL 4/ILST 4; they developed their own outcome and, again, performance criteria that embody it. Communication is the third focus area; Riley showed specific performance criteria including “student exhibits clear writing style” (see the paper for details). The final focus area for Smith Engineering is experimentation, which does not map well to ABET, so they added the wording “finding and using information” to their performance criteria in addition to “data.”
Riley discussed how their use of e-portfolios allows assessment of the performance indicators in the aforementioned categories. Student assessment also occurs within courses; for instance, students produce a final portfolio instead of taking a final exam in her course. At the program level, assessment occurs after the sophomore year through review of the portfolio, and again near graduation, when a panel of faculty reviews the e-portfolios. Possible evidence of IL includes annotated bibliographies, ethics case analyses related to information, reports from design projects, and so on.
Riley argues that ABET should revise outcome 3(b) to include language addressing the need for information literacy; the paper suggests how this could be done.
Riley followed up by presenting another paper, Integrating Information Literacy Into A First-Year Mass And Energy Balances Course, co-authored with Smith College librarian Rocco Piccinino. Smith’s curriculum-integrated approach to IL is sequenced throughout students’ college careers; this paper, however, focuses on one specific course, a first-year, second-semester course required of all engineering majors. Course objectives include engineering calculations and mass/energy balances, as well as engineering ethics and information literacy, the latter revolving around a life cycle assessment project.
Riley assigned a reading and held a discussion with her students prior to the library research session. The reading (Graham, L. and Metaxas, P.T. (2003). “Of course it’s true, I saw it on the Internet”: Critical thinking in the Internet era. Communications of the ACM 46(5): 70-75.) reports that students’ performance on an information literacy test showed they tend to be overconfident in their research abilities; Riley felt this helped her students check their own overconfidence. She also assigned homework to practice information retrieval and access, and incorporated an IL question on her otherwise content-intensive midterm exam. She stressed the importance of faculty integrating and reinforcing IL skills throughout the course. She has her students put skills into immediate practice and holds them accountable by ensuring they use appropriate documentation, create annotated bibliographies, and so on, in an iterative way.
Assessment of student learning included several components. First, they used a one-minute paper at the end of the session, in which students rated the learning experience as excellent or good and mentioned that highlights were learning about databases, navigating the web site, the full-text icon, etc. Students did not mention the in-class group activity or the evaluation of sources, so the authors may revise the assessment to determine the value of these components.
Second, Riley ran focus groups: of the 24 students invited, 9 participated across 2 sessions. She used three guiding questions not specifically related to information literacy; even so, 4 of the 9 students mentioned information literacy and its value to their learning, and at least one related IL to critical thinking.
Third, a course survey, a student self-assessment against the initial course objectives, showed that IL rated on the high end; though “it wasn’t most central it made an impression on them.”
Fourth, analysis of student work, including homework, tests, and projects, showed that while students performed poorly on information literacy in an initial quiz, they showed significant improvement after the research session and projects. Still, one of their biggest difficulties was determining holdings for a journal available in both online and print formats.
Riley feels factors contributing to student learning include:
- Librarian inclusion
- Reinforcement by faculty member
- Integration – accountability across coursework
This approach may be more resource-intensive, but the authors recommend featuring and sharing faculty work in these areas, gaining faculty buy-in, and creating incentives for participation. Most important is developing the relationships to make this happen.