Digital Annotation

 

This proposal explores digital annotation as an opportunity to make solitary reading practices more visible and social. I aim to modify an existing annotation tool for use in my English 220: Introduction to Writing about Literature class at Hunter College. As an instructor for this course, I learned how difficult it is to model, scaffold, and encourage active reading and close attention to language in the classroom. Because reading consists largely of isolated and invisible work, it is challenging to guide students toward becoming better readers. Doing so requires visibly modeling responses and provoking students’ participation. How can teachers effectively demonstrate active reading strategies in a way that all students can see and follow? In the classroom specifically, where instruction relies heavily on a digital projector, how can instructors make visible the attention to language (specific sentences, phrases, and words) that close reading requires? How might teachers encourage less confident and less talkative students to engage with class discussion and learning? And how might they stimulate the student’s reading process, the moments of insight and questioning that occur during the act of reading? My ideal annotation tool would address these problems of visibility and participation by offering more options for controlling the appearance of, and responding to, digital annotations. Specifically, the tool would support different sizes and colors of text, and would allow students to elaborate on their responses through tagging and categorization. There would be options for identifying questions, themes, and emotions that occur during the act of reading, in a way that is more provocative than prescriptive. The overall goal of these features is to scaffold students’ engagement with texts in a way that stimulates, rather than limits or determines, criticism.

The following are a set of user stories to demonstrate how and why users will use this tool.

Jonathan is a sophomore at Hunter College, majoring in Nursing. He has many commitments besides school, including caring for a younger family member and working a part-time job, which keep him constantly on the move between home, school, and work. To keep up with his studies, Jonathan must make the most of those moments during the day when he can focus and be productive. He tends to complete his reading on the subway and his writing assignments on campus, if he can find a quiet spot. Jonathan relies daily on his smartphone, his only consistent means of accessing the internet. An annotation tool that works on his phone would allow him to complete his reading anywhere he has internet access, without having to configure anything or worry about buying and carrying a heavy laptop.

Talia is a first-year math student at Hunter College. She is apprehensive about taking the required English courses, because English has always been her weakest subject. Moreover, she is naturally shy and refrains from speaking out in class; she worries about what to say and what others might think of her. After years of struggling in English, Talia has internalized her inability to understand the deeper meanings behind challenging authors like William Shakespeare and Virginia Woolf. She is convinced that these kinds of authors and their books will always elude her. With the annotation tool, however, Talia can participate from behind the safety of her keyboard. Additionally, she can see her teacher’s and classmates’ responses, which give her some guidance and spark some ideas.

This annotation tool will be integrated into a self-hosted version of WordPress. Before using the tool, users must first create an account and log in. They can then highlight the desired text and type a comment into a simple text box that appears. Once the comment is saved, the original text remains highlighted, and all users may view the annotation by hovering the mouse over the highlighted text or in a collapsible sidebar (depending on whether the source code is based on Hypothes.is or Annotator.js). By clicking on this text or the “reply” button, users can then respond to the comment, and their response will appear below the previous annotation as a reply. This basic scheme will be supplemented by further functionality, such as formatting for size and color and categorization/tagging for themes, emotions, and questions.
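As a rough sketch, the workflow above (a logged-in user highlights a span, attaches a comment, and others reply) implies a simple data model. All class and field names below are hypothetical illustrations, not the actual Annotator.js or Hypothes.is API:

```javascript
// Illustrative model of the annotation workflow: a logged-in user
// highlights a span of text, attaches a comment, and others reply.
// All names here are hypothetical, not the Annotator.js/Hypothes.is API.

class Annotation {
  constructor(author, quote, comment) {
    this.author = author;   // the logged-in user who wrote the comment
    this.quote = quote;     // the highlighted span of the source text
    this.comment = comment; // the text typed into the comment box
    this.replies = [];      // threaded responses from other users
  }

  // Clicking "reply" appends a response beneath the original annotation.
  reply(author, comment) {
    this.replies.push({ author, comment });
  }
}

class AnnotatedDocument {
  constructor(text) {
    this.text = text;
    this.annotations = [];
  }

  // Anchor a new annotation to the character range the user highlighted.
  annotate(author, start, end, comment) {
    const annotation = new Annotation(author, this.text.slice(start, end), comment);
    this.annotations.push(annotation);
    return annotation;
  }
}
```

In a real implementation, the highlighted range would be a DOM selection rather than character offsets, and annotations would be persisted to a server rather than held in memory.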

There are many situations in which a student might use the tool for an English class, including more guided uses (such as responding to specific prompts and questions in the reading) and more open-ended uses (such as responding to any aspect of the text that students find intriguing or confusing). One specific exercise, which I have already used in a class, is what I call “collaborative close reading.” This is a classroom assignment in which students work in groups to collaboratively construct a close reading of a specific passage and post it as an annotation on the online document. In their close reading, students are asked to comment on one or more literary devices (such as simile, metaphor, or personification), analyze the meaning and significance of the device, and connect their analysis to a theme from the work. After posting their annotations on the online document, the groups take turns presenting their findings while receiving constructive feedback from the teacher. The presentations are opportunities to strengthen students’ understanding of the literary terms, the importance of analysis, and the connection of analysis to themes. With the currently available tools, however, it is very difficult for students who do not have a computer or smartphone to see and follow the annotations on the board, which appear very small and cannot be reformatted.

My ideal version of the tool will modify one of the existing annotation tools for visibility and participation, making annotations more provocative and interactive. To do so, my implementation would offer extensive formatting options, allowing for different sizes and colors of text, so that students can read the annotations from a projected screen (in a classroom) and use color-coding to indicate different types of responses. Additionally, I would include a comprehensive system of tagging, which allows for “sentiment tags” (more on this below) and thematic tags that provoke students to think more deeply about their reactions to and analysis of the text. Finally, I would include social features, such as replying, voting, or “liking” buttons, to encourage students to engage with their classmates’ comments.
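One way to picture how these formatting, tagging, and social features might attach to each annotation is as a single record per annotation. The field names, default values, and sentiment vocabulary below are my assumptions for illustration, not an existing tool’s schema:

```javascript
// Hypothetical annotation record combining the proposed features:
// formatting (size, color), tagging (themes and sentiment), and "likes".

// A controlled sentiment vocabulary; these labels are illustrative.
const SENTIMENTS = ["clarification", "analysis", "emotion"];

function makeAnnotation(author, comment, options = {}) {
  const { size = "medium", color = "yellow", themes = [], sentiment = null } = options;
  if (sentiment !== null && !SENTIMENTS.includes(sentiment)) {
    throw new Error(`Unknown sentiment tag: ${sentiment}`);
  }
  return {
    author,
    comment,
    size,      // e.g. "large", so the class can read it on a projector
    color,     // color-coding distinguishes types of responses
    themes,    // open-ended thematic tags, e.g. "mortality"
    sentiment, // one of SENTIMENTS, or null if untagged
    likes: 0,  // social feedback from classmates
  };
}

// Clicking "like" on a classmate's annotation increments its count.
function like(annotation) {
  annotation.likes += 1;
}
```

Validating sentiment against a fixed list, while leaving thematic tags open-ended, mirrors the goal of scaffolding responses without fully prescribing them.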

In order to realize this project, I would borrow code and ideas from various existing and (mostly) open-source digital annotation projects. The basis of my code will come from either Annotator.js or Hypothes.is. Though they look different, both tools allow users to comment directly on digital text and to read each other’s comments. The complete code for both projects is on GitHub, and both Annotator.js and Hypothes.is encourage extension of their work. Annotator.js is particularly extensible, thanks to a modular structure designed to be built upon. Hypothes.is works similarly to Annotator.js but is much more developed, including options for different reading modes and for responding to existing comments, as well as a nicer interface for annotations. On Hypothes.is, users can control the visibility of their annotations by working in “public,” “private,” or “group” mode, and can make all annotations temporarily invisible to display a clean interface. Additionally, rather than rendering or “floating” the annotations over the text, Hypothes.is displays them in a sidebar, which the user can expand or minimize. The sidebar allows for threaded conversations and offers tagging functionality, so users can view tags and respond to specific comments.

Once I have selected the codebase, I can borrow code and ideas from other projects to develop the features described above. Annotator.js, though no longer maintained, has inspired a variety of modules, extensions, and plugins that provide the functionality I want. These include “Annotator-categories,” currently in use by Lacuna Stories, which allows users to categorize annotations by highlight color, and “Annotator-offline,” a system for storing annotations offline when the browser supports it. From an unrelated project, another interesting feature is “sentiment tags,” developed by Ponder (which is, unfortunately, proprietary), which allow students to tag their responses as clarification, analysis, or emotion.
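The category-by-color idea could be sketched as a simple lookup from highlight color to response type. The particular colors and category names below are my examples, not the actual configuration of the “Annotator-categories” plugin:

```javascript
// Illustrative color-to-category mapping in the spirit of "Annotator-categories".
// Colors and categories here are examples only.
const CATEGORIES = {
  "#ffff99": "literary device", // yellow: identify a simile, metaphor, etc.
  "#99ccff": "theme",           // blue: connect the passage to a theme
  "#ff9999": "question",        // red: flag a point of confusion
};

function categoryFor(highlightColor) {
  return CATEGORIES[highlightColor] || "uncategorized";
}
```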

As my comments above suggest, I would first have to decide whether to use Annotator.js or Hypothes.is as my codebase. Each has benefits and drawbacks. Annotator’s modular structure would make it theoretically more extensible for the features I want, but it is no longer maintained. Hypothes.is already has many of the features I want (responses to annotations, a basic tagging scheme), but it is much more developed and has a steeper learning curve to get started. As a plus, however, Hypothes.is has an extensive development community and support. Without further research, I cannot say for certain which is more practical, though I would prefer to build on Hypothes.is because of its wider use and relevance today.

To build this tool, I will have to familiarize myself with the existing codebase for Annotator.js and/or Hypothes.is and be comfortable enough with several coding languages to make changes. Though I have implemented both Hypothes.is and Annotator.js in my past teaching and research, I have never built or customized anything of this level of complexity, and I have certainly never written code from scratch. From browsing the documentation and GitHub repositories for these tools, I sense that I would need proficiency in HTML, CSS, JavaScript, Python, and the self-hosted WordPress.org platform. I feel comfortable enough with HTML and have a working knowledge of CSS, JavaScript, and Python. I have previously developed websites with WordPress.com and Drupal, but have never worked with WordPress.org. Therefore, at this point, I anticipate needing to strengthen my CSS, JavaScript, Python, and familiarity with WordPress.org before getting started. I also anticipate that much of my preparation will be spent researching other annotation tools and familiarizing myself with the research already done (for example, the excellent “Annotation Tools” website by previous ITP student Anke Geertsma). The entire process would take somewhere between 3 and 6 months, with 1 to 2 months spent on research and 2 to 4 months on building and testing.

The stripped-down version of the tool is actually the same tool, but with features prioritized (according to a modular process) and supplemented by OER teaching materials on annotation. Essentially, I would develop the tool according to Agile methodologies, which pursue a project step by step and assume that things will change. I would therefore prioritize my features in the following order: (1) formatting options (size and color); (2) tagging options (subdivided into tags for themes and for emotional/analytical reactions); and (3) interactive functionality (“liking” or “voting” buttons). If any of these proves too difficult to accomplish, I will scale back the technical aspects of the project and turn to creating teaching materials about using annotation for close-reading instruction and to discussing the wider use of annotation in the classroom.

Here, I will research and create lesson plans that set up methods for using the existing tool(s) and that question the significance of annotation as a strategy. These questions will broadly consider how reading online engages the emotions of the reader, and how annotation practice can both limit and open up critical interventions. I will ask how the scaffolded annotation structure might itself become limiting: how might the format guide the student along preset ways of responding to literature? I will also think through the “tool-like” versions of digital annotation versus the platform-based versions, which offer extensive features for tracking student progress through the text and assessing learning. A few platforms, like Lacuna Stories and Annotation Studio, make a point of following and analyzing a student’s progress through the text using visualization methods. At what point does the implementation of these tools actually reduce or flatten the critical interventions they are supposed to inspire?

This “minimum viable product” (the modular tool combined with OER materials) is scalable, or “fail-safe” against technological obstacles. It guarantees that I will actually produce a viable product. If the programming goes well, I will try to incorporate all of the features of the ideal version. If it goes badly, I will focus on writing supplementary teaching materials and analyzing digital annotation’s use for close reading in the classroom. Because the learning curve is the same, this version will take the same amount of time and require the same preparation as the ideal version. The only difference is that I may, at some point, direct my work toward building supplementary material rather than the tool itself.

Responses

  • Lauren Kehoe says:
I think this project is really worthwhile and I’d like to see what kind of student outcomes you observe upon implementation. Is their engagement heightened, is there more substantive reflection on the readings, etc., due to this tool? As I stated in my comment above, a tool like this would’ve really helped me in my undergraduate years to be more participatory, as I was very shy and regularly unsure of my contributions to class discussion.

    It’s obvious you’ve given serious thought to how you’d build this tool. Have you thought about the process for introducing it to your students–will you devote class time/labs for students to learn this tool? I also really like your scaled down version that allows for a prioritized roll out and agile development of the tool!
    -Lauren

  • Zohra Saed says:
    This is such a wonderful and necessary project! The case studies also are examples of typical students attending CUNY.
  • I think your project is fantastic Filipa! I am not very familiar with annotation software and appreciate your detailed breakdown of the tools here. I think they could be incredibly useful in the classroom. It’s often difficult to make sure students are reading (much less close-reading!) class material and this could serve as a way to get them engaged without having to resort to what has always struck me as a problematically “punitive” approach to learning (e.g., making students take an exam on a text to ensure they read it). Very excited to see your project develop.

