Saturday, November 13, 2010

Unit 8: Technological protection measures

This unit is about:

  1. Which kinds of technological barriers stand between prospective users and the information they are seeking
  2. Which technological barriers are being used by libraries/museums/archives
  3. Which technological barriers are being used by publishers and vendors that provide licensed scholarly resources to librarians
  4. Why it is important for librarians to be aware of items 1, 2, and 3


1) So, what stands between users and information? At the most basic level it is control of access.

The Millman article "Authentication and Authorization" discusses methods of managing information-system security, privacy, and access through authentication and authorization.

Authentication is defined by Millman as the "process of validating an assertion of identity" (p. 229).
Authentication involves "telling" a computer system who you are, and the way the system decides whether you are telling the truth. This has grown increasingly complex since the early days of authentication (into the 1980s), when users had a more intimate, physical relationship with their computer system. The emergence of remote resources has promoted the evolution of a variety of authentication methods.

This makes me think about the many ways that I use authentication services during the day: school, banking, email, social networks, etc. I probably go through an authentication process about 10 times per day, at least. As the article points out, methods of authentication are becoming cumbersome--it is very difficult to remember all of one's passwords, usernames, security questions, etc.

The article discusses the various methods of authentication, including:
  • Passwords: a "what one knows" or "shared secret" type of authentication. This is the most common authentication method, and it carries security risks--passwords can be hacked or shared. The more frequently passwords are changed, the more secure they are.
  • Network topology: systems identifying other systems based on where they are in the network (i.e., IP addresses)
  • Biometrics (what one is): comparing physical characteristics with information in a database. This is the most stringent method of authentication.
  • Public key cryptography (what one possesses?): I did not understand the Millman explanation for this. I looked it up on HowStuffWorks.com and I still don't really understand. Each user has two keys, public and private. The private key stays with the user's computer; the public key is used to transfer information. The keys are inverses of each other: what one key encrypts, only the other can decrypt. Extra note on public key cryptography--Windows Media Audio uses a form of public key cryptography in its DRM. The user gets an encrypted key and an unencrypted key to decrypt WMA files. A program called FairUse4WM was created in 2006, by an entity named "Viodentia," to strip DRM from WMAs. Microsoft tried to sue, but could not uncover Viodentia's identity.
  • Smart card (what one possesses): a tiny low-power computer that can store authentication information (for example, a private key)
  • Digital signatures: a small piece of data, derived from a larger piece of data, that "fingerprints" or identifies it. For example, a private key leaves a verifiable fingerprint when a user encrypts (signs) with it.
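The public-key idea above can be made concrete with a toy example. This is a sketch using tiny textbook primes--not real cryptography (real RSA uses enormous primes and padding)--just to show that the two keys are mathematical inverses: what one key encrypts, only the other decrypts, and "encrypting" with the private key produces a verifiable signature.

```python
# Toy RSA with tiny primes, to show how the two keys are inverses.
# Illustration only -- real RSA uses huge primes and padding schemes.

p, q = 61, 53                 # two secret primes
n = p * q                     # modulus, shared by both keys
phi = (p - 1) * (q - 1)       # used to derive the key pair
e = 17                        # public exponent  (public key  = (e, n))
d = pow(e, -1, phi)           # private exponent (private key = (d, n))

message = 65                  # a message encoded as a number < n

# Encrypt with the public key; only the private key can reverse it.
ciphertext = pow(message, e, n)
assert pow(ciphertext, d, n) == message

# A digital signature works in the opposite direction: "encrypt" with
# the private key, and anyone holding the public key can verify it.
signature = pow(message, d, n)
assert pow(signature, e, n) == message
```

(`pow(e, -1, phi)` computes the modular inverse and needs Python 3.8 or later.)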

The article also discusses authorization, or the "process of determining which operations are permitted between a given subject and object" (p. 233).
Authorization methods can be:
  • mandatory access control (MAC): An administrator assigns classifications to subjects and objects, representing levels of security.
  • discretionary access control (DAC): Owner determines access permissions
  • role-based access control (RBAC): Permissions change based on a subject's role
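The three models are easier to tell apart in code. Here is a minimal RBAC sketch; the role names and permissions are made-up examples, not from the article:

```python
# Minimal role-based access control (RBAC) sketch: permissions attach
# to roles, and a user's allowed operations follow from the roles they
# hold. (Hypothetical roles and permissions, for illustration only.)

ROLE_PERMISSIONS = {
    "patron": {"read"},
    "staff":  {"read", "checkout"},
    "admin":  {"read", "checkout", "edit_catalog"},
}

USER_ROLES = {
    "alice": {"patron"},
    "bob":   {"staff", "admin"},
}

def is_authorized(user: str, operation: str) -> bool:
    """True if any of the user's roles grants the operation."""
    return any(
        operation in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )

assert is_authorized("alice", "read")
assert not is_authorized("alice", "edit_catalog")
assert is_authorized("bob", "edit_catalog")
```

In DAC the owner of each object would hold its permission table, and in MAC an administrator would compare fixed security classifications instead of role sets.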

2) What technological barriers to access and use are commonly used by libraries, museums and archives?

The article "Technologies Employed to Control Access to or Use of Digital Cultural Collections: Controlled Online Collections" used surveys to evaluate which types of Technological Protection Measures (TPMs) are most popular. The article defines TPM as "computer hardware and software based systems or tools that seek to limit access to a work or use of a work," where systems are "branded (often commercial) software packages involving numerous interrelated functionalities" and tools can be found across many different systems.

Access control was most commonly accomplished by use of authentication methods (usually networkID authentication), authorization systems, and IP ranges. Some use of terminal-restricted access was also reported.
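IP-range access control of this kind can be approximated in a few lines with Python's standard `ipaddress` module; the "licensed" network ranges below are made-up examples:

```python
# Sketch of IP-range access control: a request is allowed only if the
# client's address falls inside a licensed network range.
# (The ranges below are hypothetical, for illustration only.)
import ipaddress

LICENSED_RANGES = [
    ipaddress.ip_network("10.0.0.0/16"),      # hypothetical campus network
    ipaddress.ip_network("192.168.50.0/24"),  # hypothetical library terminals
]

def ip_allowed(client_ip: str) -> bool:
    """True if the client address is inside any licensed range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in LICENSED_RANGES)

assert ip_allowed("10.0.42.7")        # on the campus network
assert not ip_allowed("203.0.113.5")  # off-campus address is refused
```

Terminal-restricted access is essentially the same check narrowed to a handful of specific addresses.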

Use control was most commonly accomplished by resolution limits, clips and thumbnails. Visible watermarking and click-through agreements were also used by a significant number of institutions.

No one reported using Biometric controls... :)

3) How are vendors and publishers restricting use?

For material to be valuable to users, they often need to be able to print it, save it, and cut and paste from it. The article "Every Library's Nightmare? Digital Rights Management, Use Restrictions and Licensed Scholarly Resources" discusses some of the ways that vendors restrict use of licensed scholarly resources.

The first type of restriction discussed in the article is the "soft restriction." Soft restrictions do not strictly prevent use, but deter certain uses by making it difficult to perform the needed operations. Six kinds of soft restrictions were discussed:
  • Extent of use: Vendors warn against excessive use of material, or give print/save batch limits
  • Restriction by frustration: breaking content into chunks that must be saved/printed as a chunk rather than as needed. Common in ebooks.
  • Obfuscation: interfaces that make finding print/save functions difficult
  • Interface omission: interfaces that do not have certain functions like print/email/copy/save/paste.
  • Restriction by decomposition: material that breaks down into separate files when it is saved
  • Restriction by warning: warning against certain uses, like saving, but not preventing them with technology.
The other kind of restrictions are referred to as "hard" restrictions in the article; these strictly prevent use.
Two types are described in the article:
  • Type 1: No copy/pasting/printing possible
  • Type 2: Secure container TPM in place
These were rarer than the soft restrictions. No examples of Type 2 were found in the article's surveys.

4) Why is it important for librarians to be aware of methods for preventing user access/use and employment of these methods in libraries and by vendors?

First of all, many librarians work with communities of users who must be authenticated before they can access the library's licensed resources. Levels of authorization are also common in these libraries, where different staff are given different permissions to access the library's network. In class we discussed the Shibboleth system, an open-source authentication system that can identify which 'group' a user is in. This can affect libraries, because it offers a way to restrict information to departments.

Secondly, librarians should be informed about what types of TPM are being used in libraries/archives/museums. They may be in a position to select TPM themselves, and even if not, should be aware of how TPM affect users (e.g., visible watermarking is detrimental to use).

Third, it is important for librarians to be aware that vendors may use soft restrictions that do not reflect their terms of use or signed licenses (for example, making saving difficult, when it is allowable under the terms of use).

As a user, I find it difficult to separate TPMs put in place by the library from TPMs used by vendors. Clearly the NetID-and-password authentication process runs through the UW system. Aside from login, the TPMs I come across most often prevent copying and pasting from articles accessed through licensed resources, which I presume is a vendor/publisher thing. Another example I can think of (and we discussed in class) that represents a soft restriction--by the library itself--is the GoPrint system, which uses ID cards as print cards. This makes it much more difficult for the public to print.

References:

1. Eschenfelder, K. R. (2008). "Every Library's Nightmare? Digital Rights Management and Licensed Scholarly Digital Resources." College and Research Libraries, 69(3), 205-225.
2. Eschenfelder, K. R., & Agnew, G. (2010). "Technologies Employed to Control Access to or Use of Digital Cultural Collections: Controlled Online Collections." D-Lib Magazine, 16(1). http://www.dlib.org/dlib/january10/eschenfelder/01eschenfelder.html
3. Millman, D. (2003). "Authentication and Authorization." Encyclopedia of Library and Information Science, 2nd edition.
4. Zhu, A., & Eschenfelder, K. R. (2010). "Social Construction of Authorized Users in the Digital Age." College and Research Libraries. Anticipated publication date: November 2010. http://crl.acrl.org/content/early/2010/04/29/crl-62r1.full.pdf+html

Friday, November 5, 2010

Unit 7: Distance Education and TEACH Act

Prior to this unit I had not even realized that distance teaching would be subject to more rigorous copyright restrictions than classroom teaching. I suppose broadcast or digital material is treated as more sensitive than hard copy material.

An article by Tomas Lipinski was assigned for this unit. The TEACH Act is difficult to understand, and the article seemed...vague, so I had to refer to some more basic sources to get a grip on what we were talking about.

I referred to the Copyright Clearance Center Guide to the TEACH Act to get an overview of the TEACH Act.

TEACH is an acronym for "Technology, Education and Copyright Harmonization." Harmonization? The Merriam-Webster dictionary does not even consider "Harmonization" to be a word. I looked on BusinessDictionary.com, which defined Harmonization like this:
Adjustment of differences and inconsistencies among different measurements, methods, procedures, schedules, specifications, or systems to make them uniform or mutually compatible.
Which makes more sense to me...the TEACH Act is intended to make technology, education and copyright mutually compatible. OK.

Other facts about TEACH:

  • Modifies previous Copyright Law regulations (pre-digital) about distance teaching.
  • Applies to accredited, non-profit educational institutions (and some Government institutions).
  • Materials displayed must be for educational purposes directly related to the class, limited to students in the class, technologically protected, and only a "reasonable and limited portion" of the total work.
  • Does not apply to electronic course reserves, digital textbooks, document delivery
Dr. Lipinski's article discusses the TEACH Act in some legal detail.
The TEACH Act, according to Lipinski, is a very complex law legislated to update the pre-TEACH law, 17 U.S.C. 110(2), which was restrictive. The old law allowed unlimited performance of non-dramatic works, and unlimited display of any other category of work. It was written in the mid-1970s, and its legislative language was meant to apply to analog transmissions (like television) to a physical classroom. A goal of the TEACH Act was to remove the concept of broadcasting to a physical classroom, and to provide for the use of digital materials in distance education beyond the classroom. While the TEACH Act expanded the rights to use digital materials, it also limited amounts. Pre-TEACH law allowed for unlimited display, but the TEACH Act limited display to what would normally be shown in a live classroom session.
Lipinski specifically addresses streaming films, saying:
It can further be argued that the entire video can never be used under TEACH. If Congress had desired educators to be able under section 110(2) to use the entire video, no such conditional language would have been included...
The second reading for the unit, the ARL/ALA Issue Brief "Streaming of Films for Educational Purposes," argues in the other direction. The authors suggest that Fair Use is a better guideline to follow than the TEACH Act when it comes to deciding how much of a film to stream. However, it seemed to me that the overall tone of the brief was hopeful rather than decisive. Like...you will PROBABLY be ok if you decide to stream a full film. It is LIKELY that the courts would decide in your favor if it came down to it. There is a definite risk implied in streaming a full film. It is interesting that Fair Use is a separate guideline from the TEACH Act, and that the authors prefer the Fair Use guidelines over the TEACH Act.

Mr. Lipinski came and gave a (well-attended) lecture to our class. He addressed the full-length streaming issue further, saying that it is very unclear how much of a video you can show--there is no set limit. You could, for example, show everything but the credits, and say you didn't show the whole film.

He spent some time discussing "ephemeral recordings," which has a lovely sound to me, like tapes of ghosts talking--which is not what it is. The U.S. Copyright Office defines an ephemeral recording as
Ephemeral Recording is a phonorecord created solely for the purpose of facilitating a transmission of a public performance of a sound recording under the limitations on exclusive rights specified by 17 U.S.C. 114(d)(1)(C)(iv) or under a statutory license in accordance with 17 U.S.C. 114(f), and subject to the limitations specified in 17 U.S.C. 112(e).
So, that is kind of old school. Now ephemeral recordings seem to be more defined as a copy made for purposes of electronic distribution. The TEACH Act updated ephemeral recordings guidelines so that instructors could make digital copies of teaching materials, if they were following TEACH guidelines (limited portion, time-limited, etc).
Lipinski emphasized that intention is very important in copyright law. Being able to use the good-faith argument prevents you from having to pay damages. At the other extreme, willful copyright infringement brings the highest penalties. Conduct limits do not limit liability, but they do limit damages.

My take home message is that the TEACH Act is classic copyright law...confusing, hard to understand, open to interpretation.

References:

Tomas A. Lipinski (2003). "The Climate of Distance Education in the 21st Century: Understanding and Surviving the Changes Brought by the TEACH (Technology, Education, and Copyright Harmonization) Act of 2002." Journal of Academic Librarianship, 362-374.

ARL Issue Brief: Streaming of Films For Educational Purposes
(http://www.arl.org/bm~doc/ibstreamingfilms_021810pdf.pdf)




Friday, October 8, 2010

Unit 6: Pricing models and consortial agreements

I think this was my favorite unit so far....out of many good ones.

Last week I was talking to a reference librarian at Steenbock, and she explained to me that the different campus libraries all have their own budgets and use their individual budgets to purchase resources that are then used by the whole campus. I am curious to know how this fits in with bundling, though...it seems like there would be a lot of overlap between departments and database needs, and the big bundles include such a variety. Maybe there is an umbrella budget that covers the big-ticket items that contain a variety of journal types.

I took a few main messages from the readings this week.
The articles that we read about the Big Deal were really interesting because they touched on the concept of how bundles both help and hurt libraries. In particular, I liked Ken Frazier's 2001 piece "The Librarian's Dilemma: Contemplating the Costs of the Big Deal," about the psychology of how individual parties lose sight of the big picture when it comes to short-term gain.
In Psychology 101, years ago, I read about an experiment similar to the one Ken Frazier describes, where participants could play a game. In the game they could cooperate with each other and be guaranteed small success, or work against each other with the possibility of bigger success, at the cost of the greater good. And, just as in Frazier's article, the kicker is that they don't know what choices the other participants are making. It always stuck with me as a truism about the way people operate. It's not that we are not willing to work together, but no one wants to be the schmuck who gives things up when no one else is giving them up.
Frazier's point (that I took) is that libraries need to be willing to develop a trust relationship with each other that will permit them to take united stands against publishers when the time calls for it, and support alternative publishing.

The article about journal pricing in the 1930s and 1980s (Astle & Hamaker, 1988) was also very interesting, and seemed to mirror some of the ideas in Frazier's piece (even though it was not written with electronic resources in mind). It echoed the message that libraries hurt themselves when they become complacent about pricing. A quote within the article kind of previewed what Frazier discussed happening in later times: "American libraries, by their ability and willingness to pay, have enabled publishers to persist in charging exorbitant prices" (p.12).
A longer quote from the Astle & Hamaker article tied into Frazier's concept of "disintermediation"--where libraries become passive vessels for publishers:
The academic and research library community must become actively involved in the development and implementation of alternative technologies for information distribution as an adjunct to print sources if they are to maintain their central place in the information chain (p.31).


The other readings were also very good, and talked in more detail about pricing considerations, consortia, and publisher-library conflict.

In addition, there were two speakers from WILS (Wisconsin Library Services) who came to discuss how WILS negotiates with vendors on behalf of libraries and how ILL works.



Friday, October 1, 2010

Week 5: E-Reserves, Fair Use and GSU

The readings this week were about the fair use guidelines, how they applied to E-Reserves, and a real life case featuring Georgia State University.

In class we discussed three sets of fair use guidelines discussed in the readings:

1. Agreement on Guidelines for Classroom Copying (approved by Congress, 1976)
  1. Brevity--Article of 2,500 words or less or excerpt of 1000 words or 10%
  2. Spontaneity--Copying must be at the initiative of the individual teacher, not a directive from the institution. Use should be for late-breaking, current information
  3. Cumulative effects--Work can be used for only one course; not used from term to term; only one article/work per author and no more than three from a periodical volume

2. ALA Recommendations (1982)

I. Presents the four fair use factors of Title 17, Section 107.
II. Unrestricted photocopying:
a.) Writings published before Jan 1, 1978 that were not copyrighted
b.) Published works with expired copyright
c.) Unpublished works (pre-Jan 1, 1978)
d.) U.S. Government publications
III. Permissible photocopying of copyrighted works:
a.) Research and preparation for teaching--chapter, article, story, essay, poem, chart
b.) Classroom uses--one semester, only one copy/student, copyright notice, no profit
c.) Library reserve uses

3. CONFU (1996)
The Educational Fair Use Guidelines for Educational Multimedia is the only one of these three sets of fair use guidelines actually formulated with digital works in mind. It is a long set, so better to follow the link.

We specifically discussed the case of Cambridge University Press, Oxford University Press, and SAGE Publications vs. individuals at Georgia State University for "systematic, widespread and unauthorized copying and distribution of a vast amount of copyrighted works." I thought perhaps the library could be protected by claiming the "good faith" defense. However, it seems that their original policy had been out of the mainstream enough to make it a target.
Because there is no clear set of guidelines, libraries devise their own policies regarding fair use and distribution. The trick seems to be to maintain the push/pull balance between academic users (who need to take stands on fair use, because they are the major stakeholders in the survival of fair use practices) and academic publishers (who rightly do not want to give away all of their products for free). Giving in too much to publishers could mean erosion of fair use practice, but there are limits that will need to be followed.

Sunday, September 26, 2010

Unit 4, Part 2: Licensing

After reading Lesley Ellen Harris's book Licensing Digital Content, I learned lots of practical new things about licensing. She writes in a lovely, easy-to-read style, and I will make sure to keep this book around in case need should arise.

In order to further simplify the book in my mind, I have a little set of bullet points...

If I had to negotiate a license/contract/licensing contract (all the same thing) tomorrow I would remember to:
  • get a piece of paper and write down what my library needs
  • look up my library's licensing policy, if one exists
  • make sure that I am authorized to sign the license
  • make sure the person who I am licensing from warrants that he/she is authorized to license the content (shocking! I would never think there were rogue licensors running around)
  • know who my Authorized Users (sublicensees) will be, and make sure they are clearly included in the contract.
  • remember UCITA and make sure my library can't end up in court in Maryland or Virginia (but what if my library is IN Maryland or Virginia?)
  • know what I am getting-- what the content will be, what form it will take, how long I will get it for, whether it can be archived, and what rights will be granted to that content--view/print/save/share via ILL etc...
  • what my library's obligations will be, and not to promise to prevent unauthorized use, because that is really hard
  • payment, including currency
  • what territory will be covered
  • renewal vs. termination
  • remember to be nice and ask open-ended questions
I do realize, after looking at examples of licenses in class that my list is sadly oversimplified compared to real life, but I need a basic model that fits the big picture into my mind. A diagram would be nice too, but that is a future project.

One more note--this book was written pre-SERU. I wonder if Harris would have mentioned SERU as a license alternative if it had existed when she was writing this book?

*I wanted to use the image of the book cover of Licensing Digital Content. I want to say this falls under fair use, because this blog is a kind of a review of the book. However, after reading some online material about this, I think it is possible that it would be copyright infringement. The Kentucky Department for Libraries and Archives suggests that "fair use statutes may not apply to use of digital images in online publications open to the public."


Reference:
Harris, L. E. (2002). Licensing digital content: A practical guide for librarians. Chicago: American Library Association.

Thursday, September 23, 2010

Unit 4, Part 1: Licensing vs. Understanding

House of UCITA vs. House of SERU:
Two opposing philosophies of publisher-library relations.

UCITA: the Uniform Computer Information Transactions Act of 1999. Similar to the Uniform Commercial Code, but intended for software instead of goods. A proposed state contract law that attempts to standardize the way licensing is done. Libraries and others criticize UCITA as being skewed toward the interests of copyright holders.

SERU: Shared Electronic Resource Understanding, a project hosted by NISO that attempts to articulate a set of understandings that libraries and publishers can agree on. The idea is that the costly, time-consuming process of license negotiation can be forgone if participating libraries and publishers instead choose to operate on a level of trust based on common understandings.

Difference one:
Length and understandability:

UCITA is very long (200 pages) and very complicated--so much so that the American Bar Association said it needed to set up tutorial groups to help members understand what exactly UCITA was saying.

SERU is short and sweet (less than 10 pages) and written to be understandable.

Difference two:
Underlying principle/tone:

UCITA assumes that publishers are vulnerable and need to protect themselves from their users via ironclad licensing. The obsession with detail in UCITA sets a tone of combativeness:
UCITA effectively encourages publishers to maximize their short-term profits at the cost of longer term socially beneficial goals such as innovation, research and free speech (Franklin, 2003, p. 1)

SERU assumes that publishers and libraries need each other, and want to work together as smoothly as possible. SERU sets a tone of shared understanding between publishers and libraries, where socially beneficial goals get a place at the table.

Difference three:
To license or not to license:

UCITA stands for uniformed standardized licensing, licensing and more licensing.

SERU stands for license-free, baby (although license-free is not the same as contract-free).


Difference four:
Popularity...?

More than a decade after its 1999 proposal, only two states have adopted UCITA. Other states have adopted "bomb shelter" legislation to protect consumers from UCITA.

SERU was developed in 2007. In 2008 it was released as NISO Recommended Practice RP-7-2008. More than twenty publishers, forty academic libraries, and four consortia have signed up to participate (Lamoureux & Bernhardt, 2009, p. 152).

References:

Franklin, J. (2003). The perils of clicking I agree: UCITA and intellectual freedom. Alki, 19(1), 10-12.

Lamoureux, S., & Bernhardt, B. (2009). Innovations: Where are they now? The Serials Librarian, 56(1-4), 146-154.




*image taken from Wikimedia Commons. Drawing of "Romeo and Juliet Act 1, Scene 1" by Sir John Gilbert. In the public domain because it has been more than 70 years since death of creator.

Thursday, September 16, 2010

Unit 3: Green Paper, White Paper, Go.

This week's readings noted the erosion of the first sale doctrine and the doctrine of fair use by the introduction of 'copyright licensing' and its accompanying technological protection. The Green/White Paper Reports put out by the Lehman group to protect the rights of digital content owners circumvented issues of fair use by employing licensing strategies. Licensing combined with technology allows copyright holders to retain control of what is done with their product after it is in the possession of the user. Licensing can even restrict use that would normally be protected by fair use guidelines and/or the first sale doctrine. Sometimes the user owns the product (as when buying software), and sometimes the license is similar to a rental agreement (as with a license contract between a library and an electronic journal publisher).

One of the fun things about investigating the Lehman Group Papers is the 1990s HTML-ish blogs that Internet users were using at the time to communicate with the Internet community (I mean, the "National Information Infrastructure") about the Lehman Group findings. One blog, Teleread--still a functioning blog with lots of interesting copyright finds--had an entry from 1995 called "Lehman Panel's Report on Net Commerce in Final Phases of Tugs, Pulls and Faxes," which discussed some of the criticisms of the Green and White Papers. A particular quote from this blog illustrates Litman's point that a common strategy for creating copyright law was to claim it already was law (Litman, p. 95):
Discussion of court cases in the draft also seems filtered through the lenses of a copyright owner rather than users. There was one phrase, in particular, that irked many: "It has long been clear under U.S. law that the placement of a work into a computer's memory amounts to a reproduction of that work." In fact, that is a recent and controversial aspect of U.S. law.
The author of the blog noted that the Green Paper had been drafted with a pro-copyright holder bias, and noted that "One lawyer who knows Lehman says the patent commissioner has an unduly harsh fear of the dangers of technology, such as rampant infringement, without a corresponding appreciation for its upside."

The rationale that brought about the Lehman group's embrace of licensing was the old tried-and-true strategy of claiming that industry death would result without copyright protection from the voracious, industry-consuming public. Another excellent Teleread find was a link to a current paper: Lemley, Mark A., "Is the Sky Falling on the Content Industries?" (August 10, 2010). The author makes some very entertaining points about how industry stakeholders were always sure that the next invention would destroy their product, whether the product was books, radio, television, movies, etc. Lemley makes a great reference (p. 7) to a music industry campaign to block audio cassettes, on the grounds that recording music at home would destroy radio audiences. He also suggests that monks probably objected to the printing press, on the grounds that it would destroy print culture (p. 1). Both Lemley's paper and Litman's book agree on the point that creators keep creating regardless of changes in media. Infringement on the public's usage rights is possibly not as necessary as Lehman and worried industry stakeholders would have Congress, and the public, believe.

One last reading of the week, the ProCD v. Zeidenberg case, is a real-life example of the setting and enforcement of copyright law by litigation. This case provided a precedent for shrink-wrap licensing enforcement. The take-home message of the case seemed to be that companies can use contracts to push past the limits of copyright law. Companies can create contracts whereby users agree to relinquish rights in order to use the product--contracts and licenses can be used to secure restrictions above and beyond copyright law.
A few questions that I had after reading the case were:
  1. If a book came in a wrapper with fine print that said if you removed the wrapper you were agreeing to give up your right to use the book within the guidelines of the first sale doctrine, would that be legal?
  2. How can a consumer browse products at a store when there are hidden guidelines inside the packaging? What if there had been an equivalent product that did not have the same restrictions that Pro-CD had?

Note: The "Home Taping is Killing Music" image was taken from a Wikipedia reference. It was also used in the Lemley paper (with permission, one would assume). I am using it under the guidance of the four fair use factors:

  1. This is a blog for educational purposes, not for profit or of any commercial value, nor likely to be read by more than a very few unlucky people.
  2. The copyrighted work is an imaginative work--counts against fair use...
  3. But, it is only one small image used once.
  4. And, as it is now a defunct campaign of long ago, should have no effect on value of copyrighted work.

References:
Lemley, Mark A., Is the Sky Falling on the Content Industries? (August 10, 2010). Available at SSRN: http://ssrn.com/abstract=1656485

Litman, J. (2001). Digital copyright: Protecting intellectual property on the Internet. Amherst, N.Y: Prometheus Books.


ProCD v. Zeidenberg, 86 F.3d 1447 (7th Cir.1996)