Wednesday, November 24, 2010

Unit 11: Finding content: Discovery tools

At last, thanks to Anna's presentation, the talk from Judith, and the in-class demonstrations, I kind of know how FindIt works.

Important concepts:

Knowledgebase
OpenURL
LinkResolver

FindIt is a link resolver for OpenURLs. Its brand name is SFX, from Ex Libris. Link resolvers give libraries local control over OpenURLs by directing the user from the place where the citation appears (the source) to a copy of the resource (the target) that the library subscribes to. When a user clicks the FindIt button, a query is sent to a knowledgebase. The knowledgebase contains detailed information about the library's electronic resources, and the query finds its target there. The user is then directed to the electronic resource licensed by the UW-Madison libraries. The link can vary in granularity: sometimes you go straight to the article, but sometimes you are just taken to the journal and must search for the article. I think if there are multiple copies available, you are presented with a list of target providers. If no electronic resource exists, the link resolver provides a link to a MadCat query that looks for the item in library holdings.
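
To make the source-to-target flow concrete, here is a rough sketch of what a FindIt-style OpenURL looks like on the wire: the source database packs citation metadata into key/value pairs (the KEV format from the OpenURL 1.0 standard, Z39.88-2004) and aims them at the resolver. The base URL below is made up; a real SFX instance has its own address.

```python
from urllib.parse import urlencode

# Hypothetical link-resolver address -- a real SFX install has its own base URL.
RESOLVER_BASE = "https://sfx.example.edu/sfxlcl3"

def build_openurl(citation: dict) -> str:
    """Build a KEV-format OpenURL 1.0 query from citation metadata."""
    params = {
        "url_ver": "Z39.88-2004",                       # OpenURL 1.0 version tag
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",  # the "referent" is a journal item
    }
    # Citation fields travel as rft.* keys (rft = "referent", the thing wanted)
    params.update({f"rft.{key}": value for key, value in citation.items()})
    return RESOLVER_BASE + "?" + urlencode(params)

url = build_openurl({"jtitle": "The Serials Librarian",
                     "volume": "55", "issue": "3", "spage": "481"})
```

The resolver's job is then to match those `rft.*` fields against the knowledgebase and answer with whatever target links the library is entitled to.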

I also now understand what a DOI (digital object identifier) is. DOI syntax is a NISO standard that provides a uniform way for publishers to identify material at the article level. CrossRef is the biggest repository of DOIs. The DOI is used extensively in publishing and works well as an identifier in OpenURLs. However, DOIs are not used as much in libraries, because a DOI points to THE COPY--the authoritative version from the publisher--whereas libraries want to point to their own copies.
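
Resolving a DOI is much simpler than resolving an OpenURL, precisely because there is only one authoritative copy: you hand the identifier (which always begins with the directory code `10.`) to the doi.org proxy. A small sketch, using the DOI from this week's COUNTER reading:

```python
def doi_to_url(doi: str) -> str:
    """Turn a bare DOI (prefix/suffix) into a resolvable doi.org URL."""
    prefix, _, suffix = doi.partition("/")
    if not prefix.startswith("10.") or not suffix:
        raise ValueError("a DOI is '10.<registrant>/<suffix>'")
    return "https://doi.org/" + doi

print(doi_to_url("10.1629/9552448-0-3.23.1"))
# https://doi.org/10.1629/9552448-0-3.23.1
```

One identifier, one resolver, one copy--which is exactly why libraries, who want to route users to *their* licensed copy, lean on OpenURL instead.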

Monday, November 22, 2010

Unit 10: Data Standards and Silos

Name these standards...

[Image quiz: an onyx stone, a counter, sushi, an apple core, and shirtless Mark Twain--credits below.]
The readings this week focused on standards for data related to the management and use of electronic resources. Data standards facilitate cross-system transfer and storage of data. This takes us back to last week's unit and Tim Jewell. Based on Jewell's research indicating that libraries were growing their own ERMS, the DLF started the Electronic Resource Management Initiative. The major goal of the initiative was to develop a standard for keeping track of licensing details. The readings this week discuss some of the standards that arose through groups like DLF and NISO, and where they are used in the life cycle of an electronic resource.

Paoshan Yue's article "Standards for the Management of Electronic Resources (ER)" starts with a look at the DLF ERMI. Tim Jewell's research into the use of homegrown ERMS, and the Web Hub that he set up with Adam Chandler of Cornell University, prompted some big group meetings that included librarians, publishers, PAMS, vendors, and subscription agents. Everyone agreed that standards were needed in order for ERMS to be effective. DLF ERMI was formed to develop standards and define functional requirements for ERMS.

In addition to DLF ERMI, Yue talked about subsequent standardization initiatives for ERM:
  • ONIX (Online Information Exchange)--Commonly used in the publishing trade. Adopted as standard for license information
  • XML-based metadata--Identifying objects, turning bibliographic MARC data into XML data. Library of Congress standards include MARCXML, MODS, and MADS
  • OpenURL--Standard for dynamic linking, getting users to the right copy of content. Has a source, a target, and a link resolver.
  • NISO Metasearch Initiative--improving cross database searching.
  • International Standard Serial Number (ISSN) Revision--Unique identifiers are needed for electronic resources. The ISSN had flaws: it was not being used universally by publishers, and it struggled with the issue of format. An ISO working group is trying to fix those problems.
  • COUNTER (Counting Online Usage of Networked ER)--An international effort to track online usage. Vendors who meet COUNTER guidelines can register themselves as COUNTER compliant.
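
One concrete, checkable bit of the ISSN standard is its check digit: the eighth character is computed from the first seven by a weighted sum mod 11, with a remainder of 10 written as "X". A quick sketch, verified against The Serials Librarian's own ISSN, 0361-526X:

```python
def issn_check_digit(first_seven: str) -> str:
    """ISSN check digit: weight digits 8 down to 2, sum mod 11; 10 prints as 'X'."""
    total = sum(int(d) * w for d, w in zip(first_seven, range(8, 1, -1)))
    remainder = (11 - total % 11) % 11
    return "X" if remainder == 10 else str(remainder)

print(issn_check_digit("0361526"))  # X  (The Serials Librarian, ISSN 0361-526X)
```

This is why a single transposed digit in a serials record almost always gets caught.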
Oliver Pesch, Chief Strategist at EBSCO Information Services, writes about the "information supply chain," in which information about e-resources is transferred across multiple systems. Some of this information includes pricing, holdings details, bibliographic elements, and rights and permissions. For this information to be interoperable, it is necessary to avoid proprietary data formats.
Pesch provides detailed figures of the life cycle of an electronic resource, the information detail for each phase in the life cycle, and the standards being used to store and transfer the information. Life cycle phases include:
  • Acquire (Title lists, license terms, order information, etc.)
  • Provide access (A to Z lists, Proxy, Catalog, Link resolver etc.)
  • Administer (Usage rights and restrictions, claims, holdings changes)
  • Support (contacts, troubleshooting)
  • Evaluate (Usage data, costs data)
  • Renew (Title lists, business terms, invoices)
ONIX is big for lots of things--license terms, pricing information, title lists. MARC is used for bibliographic records, SERU for license terms (SERU! a return visit from Unit 4), OpenURL for linking, ICEDIS for order and invoice information, and COUNTER and SUSHI for usage data.

We read an article about COUNTER, which discussed some future directions for the project, such as a JUF (Journal Usage Factor), which would take usage measurements and use them to calculate a journal's relevance and popularity.
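
As I understand it, the JUF would work roughly like a usage-based impact factor: downloads over a period divided by the number of items published in that period. A sketch with made-up numbers (the real proposal debates details like time windows and medians, so this is only the basic idea):

```python
def journal_usage_factor(total_downloads: int, articles_published: int) -> float:
    """Rough sketch of a usage-based metric: downloads per article published.
    Analogous to the citation-based impact factor, but fed by COUNTER data."""
    return total_downloads / articles_published

# Made-up numbers for illustration
print(journal_usage_factor(12000, 300))  # 40.0
```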

References:

1. Todd Carpenter (2008) Improving Information Distribution Through Standards. Presentation at ER&L 2008. http://hdl.handle.net/1853/20877
2. Oliver Pesch (2008) "Library Standards and E-Resource Management: A Survey of Current Initiatives and Standards Efforts." The Serials Librarian, Vol 55, No 3, pp 481-486.
3. Paoshan W. Yue "Standards for the Management of Electronic Resources" in Mark Jacobs (Ed) Electronic Resources Librarianship and Management of Digital Information: Emerging Professional Roles. Binghamton, NY: Haworth, pp 155-171.
4. Peter T. Shepherd (2010) "COUNTER: Current Developments and Future Plans." Chapter in The E-Resources Management Handbook (2006-present), edited by Graham Stone, Rick Anderson, and Jessica Feinstein. http://uksg.metapress.com/openurl.asp?genre=article&id=doi:10.1629/9552448-0-3.23.1

Images:
ONYX image: By Simon Eugster --Simon 14:41, 11 April 2006 (UTC) (Own work) [GFDL (www.gnu.org/copyleft/fdl.html), CC-BY-SA-3.0 (www.creativecommons.org/licenses/by-sa/3.0/) or CC-BY-SA-2.5-2.0-1.0 (www.creativecommons.org/licenses/by-sa/2.5-2.0-1.0)], via Wikimedia Commons
COUNTER image: By Biol (Own work) [Public domain], via Wikimedia Commons
Shirtless Mark Twain: http://commons.wikimedia.org/wiki/File:Mark_Twain-Shirtless-ca1883.jpg
SUSHI: By Lionel Allorge (Own work) [GFDL (www.gnu.org/copyleft/fdl.html), CC-BY-SA-3.0 (www.creativecommons.org/licenses/by-sa/3.0/) or CC-BY-SA-2.5-2.0-1.0 (www.creativecommons.org/licenses/by-sa/2.5-2.0-1.0)], via Wikimedia Commons
Applecore: By Philippe Proulx (Own work (Photo personnelle)) [GFDL (www.gnu.org/copyleft/fdl.html), CC-BY-SA-3.0 (www.creativecommons.org/licenses/by-sa/3.0/) or CC-BY-SA-2.5-2.0-1.0 (www.creativecommons.org/licenses/by-sa/2.5-2.0-1.0)], via Wikimedia Commons

Thursday, November 18, 2010

Unit 9: Electronic Resources Management Systems

Electronic Resource Management Systems (ERMS) are confusing in their scope and variety. But they are necessary, because managing electronic resources is even more confusing in scope and variety. Managing large volumes of electronic resources is too complicated for spreadsheets, Word documents, and humans alone--which poses a problem for librarians. A cohesive system that can manage all the tasks related to electronic resource management is the ideal solution. Many different systems have evolved to manage these tasks, but none quite live up to the ideal, and almost all are named with an acronym of some sort.

What is an ERMS:
Wikipedia says it is a software system that keeps track of information about electronic resources.

What should an ERMS be able to do:

The Collins chapter and the Hogarth and Bloom chapter list some of the processes that can and should be handled by an ERMS:
  • Public Display
  • License Management
  • Collection Evaluation
  • Statistical Analysis (usage, etc)
  • Maintenance of multiple spreadsheets and databases
  • Ability to generate A to Z lists
  • Financial/purchasing (align publishers, handle renewals)
  • Track processes of electronic resources through acquisition, purchasing, and licensing.
  • Contact and Support
How ERMS evolved:
The Hogarth and Bloom chapter discusses the needs that began to arise in the early 2000s. Libraries needed to create and maintain patron-accessible, searchable lists of their electronic resources, to save and share information about licensing, and to track usage.

A variety of systems evolved to fill these needs, but information management needs kept getting bigger and more complex. It was important to not have to enter data in multiple places, and to be able to import it automatically. To do this, one needs standards!

Collins says that in the early 2000s the need for ERMS became pressing, as did the need to develop standards for those ERMS. Tim Jewell (University of Washington) did some key research into the issue of ERMS and standards for license information. He noticed that many libraries were cooking up their own ERMS, and that the variety of data formats was going to cause long-run interoperability problems. Jewell formed the Web Hub in 2001 to provide a place for interested parties to exchange information on ERMS. The DLF Electronic Resource Management Initiative was then formed, and has been working on developing data standards for license agreements and administrative details. Some goals of DLF ERMI are to:
  • Provide XML Schema
  • Create Data Dictionary
  • Describe functional requirements of an ERM
  • Identify and support data standards (like ONIX for license agreements, from EDItEUR)
What types of ERMS are available:
There are a variety of ERMS available for libraries to choose from. Each type has pros and cons associated with it. The Collins chapter lists some of these types and their strengths and weaknesses.

ERMS available from ILS vendors.
Examples: Endeavor's Meridian, Ex Libris' Verde, Innovative's ERM
Pros: Interoperability!
Cons: Overly dependent on a single vendor, and potential lack of a knowledge database

ERMS available from 3rd party vendors
Examples: CARL's Gold Rush, Serials Solutions, TDnet TERM
Pros: knowledgebase, A to Z lists, link-resolvers
Cons: Integration with ILS and with tools from other 3rd parties

Homegrown ERMS
Pros: Tailored and customized
Cons: Time and staff intensive to develop and need ongoing tech support
Note--there is a table on page 190 of the Collins reading that lays out information on assorted ERMS, and a checklist on p. 192 for determining the ERMS needs of a particular library.


Implementation of ERMS:
In my Health Information Systems class last semester, we discussed how adoption of an information system is often hard on a workplace, especially if it involves changes in workflow. Unrealistic expectations about how the system will be integrated and how well it will work often get in the way of successful implementation. This seems to be pretty applicable to Collins' discussion of implementation, and how important planning is for the process.

Some implementation complaints that Collins notes are:
  • overwhelming amount of manual data needs to be entered
  • hard to incorporate tool into workflow
  • not having enough staff involved
  • poor mapping
  • underappreciation of the value of the ERMS
Summary:
In class we discussed ERMS and read an issue of Against the Grain that covered them. A survey from Against the Grain noted that 75% of respondents use an ERMS in their library. The top uses are:
  • E-journal package management
  • Online database management
  • Access to license terms and conditions
Many librarians in the survey expressed frustration at the amount of manual data entry required. One librarian referred to it as "care and feeding" of the ERM. But there was a general feeling that ERMS are still a work in progress, and are improving, and are worth the effort.

UW-Madison libraries use Ex Libris's Verde ERM, the Voyager ILS, and the SFX OpenURL link resolver.


References:
1. Maria D.D. Collins "ERM Systems: Background, Selection and Implementation," Chapter 10 in Maria D.D. Collins and Patrick L. Carr (Eds) Managing the Transition from Print to Electronic Journals and Resources. New York: Routledge, pp 181-206.
2. Hogarth, M.; Bloom, V. "Panorama of Electronic Resource Management Systems," Chapter 17 in H. Yu and S. Breivold (Eds) Electronic Resource Management in Libraries: Research and Practice. Hershey, PA: Information Science Reference, 2008.
3. ERM Special Reports (2010) Against the Grain, Vol 22, No 2

Saturday, November 13, 2010

Unit 8: Technological protection measures

This unit is about:

  1. Which kinds of technological barriers stand between prospective users and the information they are seeking
  2. Which technological barriers are being used by libraries/museums/archives
  3. Which technological barriers are being used by publishers and vendors that provide licensed scholarly resources to librarians
  4. Why is it important for librarians to be aware of items 1, 2, and 3?


1) So, what stands between users and information? At the most basic level it is control of access.

The Millman article "Authentication and Authorization" discusses methods of information-system security, privacy, and access management through authentication and authorization.

Authentication is defined by Millman as the "process of validating an assertion of identity" (p. 229).
Authentication involves "telling" a computer system who you are, and the way the system decides whether you are telling the truth. This has grown increasingly complex since the early days of authentication (into the 1980s), when users had a more intimate, physical relationship with their computer system. The emergence of remote resources has prompted the evolution of a variety of authentication methods.

This makes me think about the many ways that I use authentication services during the day: school, banking, email, social networks, etc. I probably go through an authentication process about 10 times per day, at least. As the article points out, methods of authentication are becoming cumbersome--it is very difficult to remember all of one's passwords, usernames, security questions, etc.

The article discusses the various methods of authentication, including:
  • Passwords: This is a "what one knows" or "shared secret" type of authentication. It is the most common authentication method. It has security risks--passwords can be hacked or shared. The more frequently passwords change, the more secure they are.
  • Network topology: Systems identifying other systems based on where they sit in the network (i.e., IP addresses)
  • Biometrics (what one is): Comparing physical characteristics with information in a database. This is the most stringent method of authentication.
  • Public key cryptography (what one possesses?): I did not understand the Millman explanation for this. I looked it up on HowStuffWorks.com and I still don't really understand. Each user has two keys, public and private. The private key stays with the user; the public key is shared for transferring information. The keys are inverses of each other: what one key encrypts, only the other can decrypt. Extra note on public key cryptography--Windows Media Audio uses a form of public key cryptography in its DRM. The user gets an encrypted key and an unencrypted key to decrypt WMA files. A program called FairUse4WM was created in 2006, by an entity named "Viodentia," to strip the DRM from WMAs. Microsoft tried to sue, but could not discover Viodentia's identity.
  • Smart card (what one possesses): A tiny low-power computer that can store authentication information (for example, a private key)
  • Digital signatures: A small bit of data associated with a larger bit of data that "fingerprints" or identifies it. For example, a private key can leave a fingerprint when a user encrypts with it.
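
Since Millman's explanation of public key cryptography lost me, here is the smallest demonstration I could put together of the "keys are inverses of each other" idea: textbook RSA with the standard classroom numbers (far too small for real security, but the arithmetic is the real thing):

```python
# Toy RSA. n = 61 * 53; e and d are multiplicative inverses mod 3120 (= 60 * 52),
# which is exactly what makes the two keys "undo" each other.
n, e, d = 3233, 17, 2753

message = 65
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert recovered == message        # round trip: the keys are inverses

# Signing runs the same math the other way: transform a digest with the
# private key d, and anyone holding the public key e can verify it.
```

Seeing `pow(pow(m, e, n), d, n) == m` finally made the "inverse keys" phrasing click for me.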

The article also discusses authorization, or the "process of determining which operations are permitted between a given subject and object" (p. 233).
Authorization methods can be:
  • mandatory access control (MAC): An administrator assigns classifications to subjects and objects, representing levels of security.
  • discretionary access control (DAC): Owner determines access permissions
  • role-based access control (RBAC): Permissions change based on a subject's role
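
Of the three, RBAC is the easiest to picture in a library setting: permissions attach to roles, and subjects inherit whatever their role allows. A minimal sketch (the roles and permissions here are hypothetical, not from Millman):

```python
# Hypothetical role-to-permission table for a library system.
ROLE_PERMISSIONS = {
    "patron": {"search", "view"},
    "staff":  {"search", "view", "edit_holdings"},
    "admin":  {"search", "view", "edit_holdings", "manage_users"},
}

def is_permitted(role: str, operation: str) -> bool:
    """RBAC check: the decision follows from the role, not from per-user grants."""
    return operation in ROLE_PERMISSIONS.get(role, set())

is_permitted("staff", "edit_holdings")   # True
is_permitted("patron", "edit_holdings")  # False
```

The appeal is administrative: when someone changes jobs you change their role, not dozens of individual permissions.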

2) What technological barriers to access and use are commonly used by libraries, museums and archives?

The article "Technologies Employed to Control Access to or Use of Digital Cultural Collections: Controlled Online Collections" used surveys to evaluate the most popular types of Technological Protection Measures (TPM). The article defines TPM as "computer hardware and software based systems or tools that seek to limit access to a work or use of a work," where systems are "branded (often commercial) software packages involving numerous interrelated functionalities" and tools can appear in many different systems.

Access control was most commonly accomplished by use of authentication methods (usually networkID authentication), authorization systems, and IP ranges. Some use of terminal-restricted access was also reported.
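
IP-range access control is simple enough to sketch: the vendor trusts any request that arrives from an address inside the ranges listed in the license. The range below is hypothetical, standing in for whatever blocks an institution actually registers:

```python
import ipaddress

# Hypothetical licensed campus block; a real license lists the library's own ranges.
LICENSED_RANGES = [ipaddress.ip_network("144.92.0.0/16")]

def on_campus(addr: str) -> bool:
    """IP-range check: access is granted purely by network location."""
    ip = ipaddress.ip_address(addr)
    return any(ip in network for network in LICENSED_RANGES)

on_campus("144.92.10.5")  # True
on_campus("8.8.8.8")      # False
```

This is also why off-campus users need a proxy server: the proxy sits inside the licensed range and fetches on their behalf.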

Use control was most commonly accomplished by resolution limits, clips and thumbnails. Visible watermarking and click-through agreements were also used by a significant number of institutions.

No one reported using Biometric controls... :)

3) How are vendors and publishers restricting use?

For material to be valuable, users often need to be able to print it, save it, and cut and paste from it. The article "Every Library's Nightmare? Digital Rights Management, Use Restrictions and Licensed Scholarly Resources" discusses some of the ways that vendors restrict use of licensed scholarly resources.

The first type of restriction discussed in the article is the "soft restriction." Soft restrictions do not strictly prevent use, but deter certain uses by making it difficult to perform the needed operations. Six kinds of soft restrictions were discussed:
  • Extent of use: Vendors warn against excessive use of material, or set print/save batch limits
  • Restriction by frustration: Breaking content into chunks that must be saved/printed as a chunk rather than as needed. Common in ebooks.
  • Obfuscation: Interfaces that make finding print/save functions difficult
  • Interface omission: Interfaces that lack certain functions like print/email/copy/save/paste
  • Restriction by decomposition: When material breaks down into files when it is saved
  • Restriction by warning: Warning against certain uses, like saving, without preventing them technologically
The other kind of restrictions are "hard" restrictions, which strictly prevent use. The article describes two types:
  • Type 1: No copy/pasting/printing possible
  • Type 2: Secure container TPM in place
These were rarer than the soft restrictions; no examples of Type 2 were found in the article's surveys.

4. Why is it important for librarians to be aware of methods for preventing user access/use and employment of these methods in libraries and by vendors?

First of all, many librarians work with communities of users who must be authenticated before they can access the library's licensed resources. Levels of authorization are also common in these libraries, where different staff are given different permissions on the library's network. In class we discussed the Shibboleth system, an open-source authentication system that can identify which 'group' a user is in. This can affect libraries because it offers a way to restrict information to departments.

Secondly, librarians should be informed about what types of TPM are being used in libraries/archives/museums. They may be in a position to select TPM themselves, and even if not, should be aware of how TPM affect users (e.g., visible watermarking is detrimental to use).

Third, it is important for librarians to be aware that vendors may use soft restrictions that do not reflect their terms of use or signed licenses (for example, making saving difficult, when it is allowable under the terms of use).

As a user, I find it difficult to separate the TPM put in place by the library from the TPM used by vendors. Clearly the NetID-and-password authentication process runs through the UW system. Aside from login, the TPM I come across most often are those that prevent copying and pasting from articles accessed through licensed resources, which I presume is a vendor/publisher thing. Another example I can think of (and we discussed in class) that represents a soft restriction--by the library itself--is the GoPrint system, which uses ID cards as print cards. This makes it much more difficult for the public to print.

References:

1. Eschenfelder, K. R. (2008). "Every Library's Nightmare? Digital Rights Management and Licensed Scholarly Digital Resources." College and Research Libraries, 69(3), 205-225.
2. Eschenfelder & Agnew (2010) Technologies Employed to Control Access to or Use of Digital Cultural Collections: Controlled Online Collections. D-Lib Magazine. Vol 16, No 1. http://www.dlib.org/dlib/january10/eschenfelder/01eschenfelder.html
3. David Millman (2003) “Authentication and Authorization” Encyclopedia of Library and Information Studies, 2nd Edition.
4. Zhu, A.; Eschenfelder, K.R. (2010) Social Construction of Authorized Users in the Digital Age. College and Research Libraries. Anticipated publication date: November 2010. http://crl.acrl.org/content/early/2010/04/29/crl-62r1.full.pdf+html

Friday, November 5, 2010

Unit 7: Distance Education and TEACH Act

Prior to this unit I had not even realized that distance teaching would be subject to more rigorous copyright restrictions than classroom teaching. I suppose broadcast or digital material is treated as more sensitive than hard copy material.

An article by Tomas Lipinski was assigned for this unit. The TEACH Act is difficult to understand, and the article seemed...vague, so I had to refer to some more basic sources to get a grip on what we were talking about.

I referred to the Copyright Clearance Center Guide to the TEACH Act to get an overview of the TEACH Act.

TEACH is an acronym for "Technology, Education and Copyright Harmonization." Harmonization? The Merriam-Webster dictionary does not even consider "harmonization" to be a word. I looked in BusinessDictionary.com, which defined harmonization like this:
Adjustment of differences and inconsistencies among different measurements, methods, procedures, schedules, specifications, or systems to make them uniform or mutually compatible.
Which makes more sense to me...the TEACH Act is intended to make technology, education and copyright mutually compatible. OK.

Other facts about TEACH:

  • Modifies previous Copyright Law regulations (pre-digital) about distance teaching.
  • Applies to accredited, non-profit educational institutions (and some Government institutions).
  • Materials displayed must be for educational purposes directly related to the class, limited to students in the class, technologically protected, and only a "reasonable and limited portion" of the total work.
  • Does not apply to electronic course reserves, digital textbooks, document delivery
Dr. Lipinski's article discusses the TEACH Act in some legal detail.
The TEACH Act, according to Lipinski, is a very complex law legislated to update the pre-TEACH law, 17 U.S.C. 110(2), which was restrictive. The old law allowed unlimited performance of non-dramatic works and unlimited display of any other category of work. It was written in the mid-1970s, and its legislative language was meant to apply to analog transmissions (like television) to a physical classroom. A goal of the TEACH Act was to remove the concept of broadcasting to a physical classroom and provide for the use of digital materials in distance education beyond the classroom. While the TEACH Act expanded the rights to use digital materials, it also limited the amount. Pre-TEACH law allowed for unlimited display, but the TEACH Act limits display to what would normally be shown in a live classroom session.
Lipinski specifically addresses streaming films, saying:
It can further be argued that use of the entire video can never be used under TEACH. If Congress had desired educators to be able under section 110(2) to be able to use the entire video no such conditional language would have been included...
The second reading for the unit, the ARL/ALA issue brief "Streaming of Films for Educational Purposes," argues in the other direction. The authors suggest that fair use is a better guideline to follow than the TEACH Act when deciding how much of a film to stream. However, it seemed to me that the overall tone of the brief was hopeful rather than decisive. Like...you will PROBABLY be ok if you decide to stream a full film. It is LIKELY that the courts would decide in your favor if it came down to it. There is a definite risk implied in streaming a full film. It is interesting that fair use is a separate guideline from the TEACH Act, and that the authors prefer the fair use guidelines over the TEACH Act.

Mr. Lipinski came and gave a (well-attended) lecture to our class. He addressed the full-length streaming issue further, saying that it is very unclear how much of a video you can show--there is no set limit. You could, for example, show everything but the credits, and say you didn't show the whole film.

He spent some time discussing "ephemeral recordings," which has a lovely sound to me, like tapes of ghosts talking--which is not what it is. The U.S. Copyright Office defines an ephemeral recording as
Ephemeral Recording is a phonorecord created solely for the purpose of facilitating a transmission of a public performance of a sound recording under the limitations on exclusive rights specified by 17 U.S.C. 114(d)(1)(C)(iv) or under a statutory license in accordance with 17 U.S.C. 114(f), and subject to the limitations specified in 17 U.S.C. 112(e).
So, that is kind of old school. Now ephemeral recordings seem to be defined more broadly, as copies made for purposes of electronic distribution. The TEACH Act updated the ephemeral recordings guidelines so that instructors can make digital copies of teaching materials, if they follow TEACH guidelines (limited portion, time-limited, etc.).
Lipinski emphasized that intention is very important in copyright law. Being able to make the good-faith argument prevents you from having to pay damages. At the other extreme, willful copyright infringement brings the highest penalties. Staying within the conduct limits does not eliminate liability, but it does limit damages.

My take home message is that the TEACH Act is classic copyright law...confusing, hard to understand, open to interpretation.


References:

Tomas A. Lipinski (2003) "The Climate of Distance Education in the 21st Century: Understanding and Surviving the Changes Brought by the TEACH (Technology, Education, and Copyright Harmonization) Act of 2002." Journal of Academic Librarianship, 29(6), 362-374.

ARL Issue Brief: Streaming of Films For Educational Purposes
(http://www.arl.org/bm~doc/ibstreamingfilms_021810pdf.pdf)




Friday, October 8, 2010

Unit 6: Pricing models and consortial agreements

I think this was my favorite unit so far....out of many good ones.

Last week I was talking to a reference librarian at Steenbock, and she explained to me that the different campus libraries each have their own budget, which they use to purchase resources that are then used by the whole campus. I am curious how this fits in with bundling, though...it seems like there would be a lot of overlap between departments and database needs, and the big bundles include such a variety. Maybe there is an umbrella budget that covers the big-ticket items that contain a variety of journal types.

I took a few main messages from the readings this week.
The articles that we read about the Big Deal were really interesting because they touched on how bundles both help and hurt libraries. In particular, I liked Ken Frazier's 2001 piece "The Librarians' Dilemma: Contemplating the Costs of the 'Big Deal'" about the psychology of how individual parties lose sight of the big picture when it comes to short-term gain.
In Psychology 101, years ago, I read about an experiment similar to the one Ken Frazier describes, where participants play a game. In the game they can cooperate with each other and be guaranteed small success, or work against each other for the possibility of bigger success, at the cost of the greater good. And, just as in Frazier's article, the kicker is that they don't know what choices the other participants are making. It always stuck with me as a truism about the way people operate. It's not that we are not willing to work together, but no one wants to be the shmuck who gives things up when no one else is giving them up.
Frazier's point (that I took) is that libraries need to be willing to develop a trust relationship with each other that will permit them to take united stands against publishers when the time calls for it, and support alternative publishing.
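
The game I am remembering is the classic prisoner's dilemma, which the Big Deal scenario resembles. A sketch with the standard textbook payoffs (my numbers, not Frazier's):

```python
# Prisoner's dilemma payoffs (standard textbook values): each library either
# "cooperates" (holds the line with the others) or "defects" (cuts its own deal).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # everyone gets the modest, shared win
    ("cooperate", "defect"):    (0, 5),  # the cooperator is the shmuck
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # collectively the worst outcome
}

def play(choice_a: str, choice_b: str):
    """Return the (player A, player B) payoffs for one round."""
    return PAYOFFS[(choice_a, choice_b)]

play("cooperate", "cooperate")  # (3, 3)
play("defect", "defect")        # (1, 1)
```

Defecting always looks better for the individual, yet mutual defection leaves everyone worse off than mutual cooperation--which is Frazier's warning in miniature.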

The article about journal pricing in the 1930s and 1980s (Astle & Hamaker, 1988) was also very interesting, and it seemed to mirror some of the ideas in Frazier's piece (even though it was not written with electronic resources in mind). It echoed the message that libraries hurt themselves when they become complacent about pricing. A quote within the article previewed what Frazier described happening in later times: "American libraries, by their ability and willingness to pay, have enabled publishers to persist in charging exorbitant prices" (p. 12).
A longer quote from the Astle & Hamaker article ties into Frazier's concept of "disintermediation"--where libraries become passive vessels for publishers:
The academic and research library community must become actively involved in the development and implementation of alternative technologies for information distribution as an adjunct to print sources if they are to maintain their central place in the information chain (p. 31).


The other readings were also very good, going into more detail about pricing considerations, consortia, and publisher-library conflict.

In addition, two speakers from WILS (Wisconsin Library Services) came to discuss how WILS negotiates with vendors on behalf of libraries, and how ILL works.




Friday, October 1, 2010

Week 5: E-reserves, Fair Use, and GSU

The readings this week were about the fair use guidelines, how they apply to e-reserves, and a real-life case featuring Georgia State University.

In class we discussed the three sets of fair use guidelines covered in the readings:

1. Agreement on Guidelines for Classroom Copying (approved by Congress, 1976)
  1. Brevity--an article of 2,500 words or less, or an excerpt of 1,000 words or 10%
  2. Spontaneity--use must originate with the individual teacher, not a directive from the institution, and should be for late-breaking, current information
  3. Cumulative effects--a work can be used for only one course and not reused from term to term; only one article/work per author and no more than three from a periodical volume

2. ALA Recommendations (1982)

I. Presents the four fair use factors of Title 17, Section 107.
II. Unrestricted photocopying:
  a.) Writing published before Jan 1, 1978 that has not been copyrighted
  b.) Published works with expired copyright
  c.) Unpublished works (pre-Jan 1, 1978)
  d.) U.S. Government publications
III. Permissible photocopying of copyrighted works:
  a.) Research and preparing for teaching--a chapter, article, story, essay, poem, or chart
  b.) Classroom uses--one semester, only one copy per student, copyright notice, no profit
  c.) Library reserve uses

3. CONFU
The Educational Fair Use Guidelines for Educational Multimedia, which came out of CONFU (the Conference on Fair Use), is the only one of these three sets of guidelines actually formulated with digital works in mind. It is a long set, so better to follow the link.

We specifically discussed the case of Cambridge University Press, Oxford University Press, and SAGE Publications v. individuals at Georgia State University, over "systematic, widespread and unauthorized copying and distribution of a vast amount of copyrighted works." I thought perhaps the library could be protected by claiming the "good faith" defense. However, it seems that their original policy had been far enough out of the mainstream to make it a target.
Because there is no clear set of guidelines, libraries devise their own policies regarding fair use and distribution. The trick seems to be maintaining the push/pull balance between academic users (who need to take stands on fair use, because they are the major stakeholders in the survival of fair use practices) and academic publishers (who understandably do not want to give away all of their products for free). Giving in too much to publishers could mean erosion of fair use practice, but there are limits that will need to be respected.