Conference Report
EDUCAUSE Mid-Atlantic Regional Conference
This conference,
Sean Carton, Dean, School of Design & Communication,
Carton’s main point was that current traditional undergraduates expect constant and steady Internet access and are not loyal to any one medium of communication. He cited the Pew Internet and American Life Project’s finding that 46% of 18- to 27-year-olds use instant messaging more than email. He then noted that at one institution, students were annoyed at official announcements sent via IM because they considered this a personal medium, not an official one. On the other hand, he also said some students are beginning to think of the web as a traditional medium and may consider it as outdated as print. Carton exhorted conference-goers to “be everywhere your users are in the form that best suits them at the time” and recommended that we publish in multiple channels (print, web, email, etc.), perhaps trying just a few official IM messages per semester to test the waters.
He cautioned, “Technology changes quickly, people change slowly.”
Comments: This is a difficult balance to establish and maintain: We have to innovate, or our users will lose us in the sea of ever-emerging technologies, but we can’t be too innovative, or we’ll lose our users. His multiple channels suggestion is a good one, but libraries often do not have the staff and financial resources for that kind of duplication.
Robert B. Kvavik, ECAR Senior Fellow & Associate Vice President,
Library-related points:
• Students self-report spending less than an hour a week “using a university library resource to complete a class assignment” but spending 3 to 5 hours a week on “classroom activities and studying using an electronic device.”
• On a scale of 1 to 4, students put their skill level with “online library resources” at 2.88.
• 94.8% of students who used course management systems used online readings (e-reserves), and 24.9% perceived online readings in a CMS as improving their learning.
A few other tidbits from the presentation:
• Most students felt the primary benefit of technology in classrooms was convenience. They were most comfortable with a moderate amount of technology in the classroom—not so little that the convenience benefits were removed, but not so much that the tech was creating extra work for them.
• Students also reported that PowerPoint presentations (and presumably those built with similar software) often put them to sleep, and they felt such presentations depersonalized the relationship between instructors and students.
• Researchers did not find that attitudes about technology correlated with gender or academic standing in this study, but attitudes did seem to be related to year and major. Year and major may be connected, since freshmen are often undeclared. One definite correlation: young males who spend a lot of time playing video games are likely to have worse grades.
Comments: The audience grilled Kvavik on the methodology of the study and he provided the details (available online in their key findings). ECAR is working on a similar study that includes more students and institutions. Kvavik said that we shouldn’t apologize for technology being a convenience, rather than something that exclusively, quantitatively, improves learning. Unfortunately, this can be a tough pitch during budget negotiations, especially when things like CMS had been billed by some as a tool to improve and enhance learning, rather than just making it easier for students to get their online readings in their pajamas. On the other hand, improving convenience can be a way to increase competitiveness, at least according to EDUCAUSE—see their Student Guide to Evaluating Information Technology on Campus. www.educause.edu/StudentGuidetoIT/873
Mary McAleer Balkun, Associate Professor, Chair, English Department, and Marta Deyrup, Assistant Professor, Librarian, both at
They initially simply included an email link for librarians within Blackboard in each first-year English class section, along with a brief in-class introduction of the librarian. They realized that, to be most effective, they needed to limit themselves to fewer faculty members in higher-level courses who would actively promote the service to students. Balkun gave Deyrup access to two Blackboard courses and added her to the list of participants. There were a small number of students, and Deyrup came to the class twice during the semester.
Based on the class list, students would select her name (the name on the list of participants they recognized as not belonging to a fellow student) and email her with questions. The four sample questions they showed were complex research questions, asking which resources would be best for specific topics. They said there was a range of complexity in the questions, but that questions were not frivolous and students liked having “anonymous” email exchanges, although sometimes they came into the library later for more in-depth conversations. Deyrup said this could also help to identify assignments or topics that were difficult or impossible to complete with the available library resources and to encourage more dialogue between librarians and instructors.
Comments: The library did not include emailed correspondence between Deyrup and students in the email reference question statistics, which surprised me. Setting guidelines might be mildly tricky—is it only the first emailed question, or subsequent followups?—but I would think counting these transactions would be important in libraries where reference statistics might otherwise be falling.
Rae S. Brosnan, Senior Information Technology Specialist, Donald Juedes, Librarian for Art History, Classics, & Philosophy, Milton S. Eisenhower Library, and Michael Reese, Assistant Director, Center for Educational Resources, all at The Johns Hopkins University, spoke during this session.
They worked to develop a model with a flow of information among faculty, librarians, instructional designers, and information technologists. The library received a grant to create the Center for Educational Resources. CER staff are instructional designers who provide technical expertise and project coordination. There were times, they noted, when not everyone needed to be at a particular meeting, and they had to learn to pull back from wanting to be involved with everything.
Juedes used the concept of librarians as concierges (rather than gatekeepers)—we know how to find and provide information on many topics. Librarians created subject guides for a WebCT course in art history and also assisted IT with GIS/map creation for instructional purposes. Juedes described the increase in emailed reference questions after posting the librarian’s email address in a WebCT course as “shell-shock,” but said that after a while there is a sameness to the questions.
“Buzzword Bistros,” weekly brownbags with specific topics, were hosted by different departments to increase cross-departmental communication.
Comments: I find it mildly disturbing at presentations of this type when audience members report that this all sounds very impressive but unworkable at their institution given the unwillingness of staff to cooperate. The curmudgeon in me can accept stagnation where the technology is too expensive or beyond the capacity of the existing staff to develop, but the optimist is disheartened at technology’s potential to assist users being defeated by departmental feuds. This was a reminder to try to set a good example.
Wendy Wigen and Garret Sern, EDUCAUSE, focused on the impact of Voice over IP (VoIP) technology and what federal regulations might be developed because of it. Specifically, VoIP may prompt changes or additions to the Telecommunications Act of 1996. Currently, there is a divide between regulations on telephones, TV, radio, and computers/the Internet. VoIP breaks the current molds because it can involve telephone-to-computer, computer-to-computer, telephone-to-computer-to-telephone, and computer-to-telephone interactions.
Wigen and Sern pointed out that EDUCAUSE is lobbying for the least possible amount of regulation (and the lowest fees and/or taxes), but they pleaded for the issue to be taken up by upper-level administrators at educational institutions (i.e., college and university presidents) as a lobbying point, since members of the U.S. Congress consider them a more important voice than EDUCAUSE.
The presentation is a good overview of the topic and worth a look. www.educause.edu/LibraryDetailPage/666?ID=MAC0512
William N. Dobbins,
TPMs (technology protection measures) are being developed so that the copyright holder may prevent use even when fair use principles would otherwise allow it. Dobbins used the example of Universal City Studios v. Corley, the case about the DeCSS code that decrypted one type of DVD copy prevention, to talk about fair use not being a guarantee of the ability to duplicate copyrighted materials.
Take a look back at Cites & Insights January 2003 and June 2003 for more on copyright and the DMCA.
Steven DeCaroli, Assistant Professor, Philosophy and Religion,
I don’t remember who said what, but these two quotes stuck with me.
• After referring to a basic telephone as a “transparent” technology: “Technology reveals itself when it breaks. Computers are not yet transparent.”
• “Outcomes assessment must include sustainability.” This was in reference to a project that introduced first-year students to a technology that was not used in higher-level courses. The technology didn’t produce the hoped-for increased retention in that field of study.
A theme in this conference overall was not just innovation, but supporting newly implemented technology over the long term.
Elena O’Malley, eom@post.harvard.edu, is the Head of Library Computer and Internet Services at
Session Report: MARS Hot Topics
Despite disclaimers that this was just a discussion with a few introductory remarks, it felt like a program, particularly since those introductory remarks took 40 minutes. The crowd started at perhaps 80 and grew to more than 100, I’d guess, including representatives from metasearch vendors, resource vendors (places like OCLC and RLG that are targets for metasearch), and lots of library people.
The opening speaker’s underlying belief: Metasearch will work perfectly when or if all the data is in one database, and won’t work perfectly until/unless that’s possible.
Improving metasearch turns out to be hugely complex. The goal is to help users find what they need while minimizing what they need to know. (He quoted a bunch of Tennant’s Tenets, his name for Roy Tennant’s pithy sayings, such as “Only librarians want to search; everyone else wants to find” and “Good enough is frequently just that.”)
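To make the “fan-out” half of that picture concrete, here is a minimal Python sketch (mine, not anything shown at the session) of a metasearch broadcast: one query sent to several resources in parallel, with the results collected into a single list. The connector functions and record fields are invented stand-ins; real connectors, and the work of maintaining them, are where most of the complexity lives.

    from concurrent.futures import ThreadPoolExecutor

    def search_catalog(query):
        # Stand-in connector: a real one would query a catalog and map the
        # response into this record shape.
        return [{"source": "catalog", "title": query + " (catalog hit)"}]

    def search_index_db(query):
        # Stand-in connector for an abstracting-and-indexing database.
        return [{"source": "index", "title": query + " (index hit)"}]

    CONNECTORS = [search_catalog, search_index_db]

    def metasearch(query):
        """Fan the query out to every connector in parallel and flatten the results."""
        with ThreadPoolExecutor(max_workers=len(CONNECTORS)) as pool:
            result_sets = list(pool.map(lambda connector: connector(query), CONNECTORS))
        return [hit for hits in result_sets for hit in hits]

    if __name__ == "__main__":
        for hit in metasearch("digital libraries"):
            print(hit)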
The NISO-MI Wiki (www.lib.ncsu.edu/niso-mi/index.php/Main_Page) is supposed to be the key repository for what the groups are doing. There’s still not a lot there, but it does include minutes for quite a few meetings.
Quick search for
Results are currently grouped by resource in BC’s implementation. When they go to version 3, they expect auto-deduping and tabular results. Use of metasearch may depend on discipline.
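As a rough illustration of what auto-deduping involves, here is a small Python sketch (my own, not BC’s or any vendor’s algorithm) that collapses results from several sources on a normalized title/author key; real matching has to cope with far messier metadata than this.

    import re
    from collections import OrderedDict

    def dedup_key(record):
        """Build a rough match key from a normalized title plus the first author's surname."""
        title = re.sub(r"[^a-z0-9 ]", "", record.get("title", "").lower())
        author = record.get("authors", [""])[0].lower().split(",")[0].strip()
        return (" ".join(title.split()), author)

    def merge_results(result_sets):
        """Collapse results from several sources into one de-duplicated list,
        keeping track of which sources reported each record."""
        merged = OrderedDict()
        for source, records in result_sets.items():
            for record in records:
                key = dedup_key(record)
                if key in merged:
                    merged[key]["sources"].append(source)
                else:
                    merged[key] = {**record, "sources": [source]}
        return list(merged.values())

    if __name__ == "__main__":
        sets = {
            "CatalogA": [{"title": "Digital Libraries", "authors": ["Arms, William"]}],
            "IndexB": [{"title": "Digital libraries.", "authors": ["Arms, W."]}],
        }
        for record in merge_results(sets):
            print(record["title"], record["sources"])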
She’s hoping that standards will make more resources searchable, make results more consistent, and build user and librarian confidence in metasearch.
While slow to start, this became interesting and sometimes mildly heated.
Peter Noerr made a big spiel for how great metasearch engines really are. He said that metasearch engines could (universally?) handle fielded searches, translating to less-specific searches as resources require. Relevance engines are still difficult. He seemed to suggest that only getting a few records from each resource was a good thing, as it didn’t overwhelm users the way Google can (but unless those few records from each resource all come from comparable relevance engines, and unless the resources are all of comparable richness for the search, I find it hard to agree with that stance). He said searching is moving away from Boolean logic.
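A toy Python sketch of the kind of translation Noerr described might look like the following; the query structure and output syntax are invented for illustration, since real engines target each resource’s own search grammar.

    def downgrade_query(fielded_query, target_supports_fields):
        """Return a query string the target resource can accept.

        fielded_query maps field names (e.g. 'title', 'author') to search terms.
        """
        if target_supports_fields:
            # Keep the field structure (the syntax here is invented for the example).
            return " AND ".join(
                "{}=({})".format(field, terms) for field, terms in fielded_query.items()
            )
        # Less-specific fallback: drop the fields and send everything as keywords.
        return " ".join(fielded_query.values())

    if __name__ == "__main__":
        query = {"title": "digital libraries", "author": "arms"}
        print(downgrade_query(query, target_supports_fields=True))
        print(downgrade_query(query, target_supports_fields=False))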
Someone seemed to say that library searching was, and should be, moving away from specificity in general, with stemming at the metasearch level. “Give ‘em something” seemed to be the theme here. Noerr clearly liked the idea that adding more words to a search would not penalize the searcher, and seemed to assert that this is true in Google (which I have not found to be the case).
How are results sorted? It depends.
Walt Crawford asked why metasearch engines did screen-scraping (a term Noerr despises, preferring “HTML parsing”) against resources with robust Z39.50 servers. Todd Miller from WebFeat gave a partially responsive answer: “We do whatever the clients want…. Most clients don’t ask for Z39.50.” Paraphrasing: clients want to see results in the display format of the original resource, so HTML parsing is preferable.
Noerr noted that some Z39.50 implementations are good, while some are not; the metasearch engine he represents parses HTML by preference. Crawford later raised the issue of resource overhead; Noerr asserted that Z39.50 searches might just as well represent more overhead for the resource. (This raises an interesting point: Maybe part of ZeeRex, the new method for explaining a resource, should be an assertion about the lowest-overhead and preferred protocol for searching. It’s certainly possible that HTML parsing represents lower overhead and use of resources for some databases than would Z39.50, although it seems unlikely for library vendors.)
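For readers unfamiliar with the technique, here is a minimal sketch of what “HTML parsing” means in practice: pulling hits out of a resource’s own results page rather than requesting structured records over a protocol such as Z39.50. The markup and field names below are invented; every real results page differs, which is part of why connectors are hard to build and maintain.

    from bs4 import BeautifulSoup  # third-party package: beautifulsoup4

    # Invented results page; every real vendor's markup is different.
    SAMPLE_RESULTS_PAGE = """
    <ul class="results">
      <li class="result"><span class="title">Digital Libraries</span>
          <span class="author">Arms, William</span></li>
      <li class="result"><span class="title">Library Web Design</span>
          <span class="author">Doe, Jane</span></li>
    </ul>
    """

    def parse_results(html):
        """Scrape title/author pairs out of a results page."""
        soup = BeautifulSoup(html, "html.parser")
        hits = []
        for item in soup.select("li.result"):
            hits.append({
                "title": item.select_one("span.title").get_text(strip=True),
                "author": item.select_one("span.author").get_text(strip=True),
            })
        return hits

    if __name__ == "__main__":
        for hit in parse_results(SAMPLE_RESULTS_PAGE):
            print(hit)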
Someone from JSTOR noted that most metasearch engines were commercial and wondered whether there were Open Source alternatives. A woman from Serials Solutions, now part of ProQuest, said it is working on its own metasearch engine; she noted, and several other vendor reps agreed, that “connectors [to resources] are tough.”
There was some discussion of relevance rankings, one of the great mysteries of info retrieval. Someone asked whether any sort of “popularity” measure was plausible for relevance within bibliographic databases. RedLightGreen uses the number of holding libraries as part of its relevance algorithm, and so does WorldCat.
Discussion included the possibility of using circulation count as part of relevance, at which point the problems with popularity equaling relevance emerged: It penalizes newer items, reference books, and unique items, which may be the most relevant.
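Purely as an illustration of how a popularity signal might be folded into relevance, and why the blend matters, consider this short Python sketch. The formula and weights are invented; they are not RedLightGreen’s or WorldCat’s algorithm.

    import math

    def relevance(text_score, holdings, weight=0.3):
        """Blend a text-match score (0 to 1) with a log-scaled holdings count.

        Log scaling keeps a record held by 5,000 libraries from swamping one
        held by 50; the weight keeps popularity from dominating text match.
        """
        popularity = min(math.log1p(holdings) / math.log1p(10000), 1.0)
        return (1 - weight) * text_score + weight * popularity

    if __name__ == "__main__":
        # A new or highly specialized title (few holdings) can still rank well
        # if its text match is strong -- the concern raised in the discussion.
        print(relevance(text_score=0.9, holdings=12))
        print(relevance(text_score=0.6, holdings=8000))

Log-scaling and capping the popularity term is one way to keep heavily held titles from drowning out newer or more specialized material, which is exactly the worry raised in the discussion.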
Cites & Insights: Crawford at Large, Volume 5, Number 5, Whole Issue 61, ISSN 1534-0937, a journal of libraries, policy, technology and media, is written and produced by Walt Crawford, a senior analyst at RLG.
Cites & Insights is sponsored by YBP Library Services, http://www.ybp.com.
Hosting provided by Boise State University Libraries.
Opinions herein may not represent those of RLG, YBP Library Services, or Boise State University Libraries.
Comments should be sent to wcc@notes.rlg.org. Cites & Insights: Crawford at Large is copyright © 2005 by Walt Crawford: Some rights reserved.
All original material in this work is licensed under the Creative Commons Attribution-NonCommercial License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc/1.0 or send a letter to Creative Commons.
URL: citesandinsights.info/civ5i5.pdf