Project Description

This project sought to improve information-access systems for large digital collections by incorporating metadata about documents' genres. Genre metadata facilitates information retrieval and use: people rarely have to read every word of a document to understand it. Instead, they start by identifying what kind of document they are faced with, then use each type of document in an appropriate way. A grant proposal, for instance, is used differently from a syllabus, a product brochure, or a bank statement. Information-access systems would be more useful for many tasks if they could similarly distinguish the form and purpose of documents (that is, their genres).

The project included a series of human-centered studies of information use by K-12 teachers, journalists, and medical residents, conducted to develop an understanding of genre and its role in information use. Based on these studies, the researchers developed a taxonomy of relevant genres and the features that identify each, and then built and preliminarily tested genre-enhanced interface prototypes for information-access tools. The project also yielded an understanding of the role of document genre in information-based tasks, evidence of how much genre metadata improves user performance on those tasks, and methods for studying genre.

Broader impacts of the study included a reusable classification of genres, which enhances the infrastructure for research, and contributions to education, both among the populations targeted by the studies and through the inclusion of students in the research.

See the grant proposal: How can document-genre metadata improve information-access for large digital collections? (NSF Grant IIS-0414482) and first annual report.
We also submitted a proposal for further research, but it was not funded: User-centered design for automated genre recognition.

Creative Commons License
The works on this website are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License.