Mistakes Were Made: Computer History, Decompiled

On Friday, April 17, we attended Mistakes Were Made: Computer History, Decompiled. Sponsored by NYU’s Department of Media, Culture and Communication, the program was organized around a series of conversations that paired “emerging scholars” with practitioners in the field.

SAA: We’re All Digital Archivists: Forensic Techniques in Everyday Practice

At the recent Society of American Archivists annual conference, I was fortunate enough to present as part of a panel discussing the application of digital forensics in an archival setting. I touched on the work I’ve been doing with the D-Recs committee and on developing the forensics workflows that I’ve discussed previously. My co-presenters, Cal Lee, Don Mennerich, and Christie Peterson, discussed different aspects of digital forensics in archives, from learning forensic techniques to an overview of current research in the field. I highly recommend checking out the audio for the session, which is available on our shared drive.

The Signal Interview
As a result of my presentation, I was asked to do an interview with Trevor Owens for The Signal, the Library of Congress blog on digital preservation. The interview went live last week and touches on some points I made during my presentation as well as current and future D-Team projects. I hope you enjoy it!

SAA Report: Getting Things Done with Born-Digital

One of the first sessions I attended at this year’s SAA annual meeting was “Getting Things Done with Born-Digital Collections,” and it stuck with me as a great entry-level review of how to deal with born-digital materials in a variety of institutional environments. It also introduced tools to help archivists jump into their work, while offering advice for those looking to implement or expand born-digital programs. Many of the following tools and concepts may seem familiar from the work that we do here at the RAC.

The panel featured five speakers: Gloria Gonzalez, Jason Evans Groth, Ashley Howdeshell, Dan Noonan, and Lauren Sorensen. While the panelists covered slightly different experiences, there was one universal takeaway: preserving digital collections needs to be an institutional endeavor, and in many cases that endeavor is a constant work in progress, from tools to processes.


Legacy Digital Media Survey

In November 2013, the first phase of the Legacy Digital Media Survey began with an examination of collection information for the legacy collections at the Rockefeller Archive Center.

The aim of the Legacy Digital Media Survey is to gain intellectual and physical control over the digital media materials in our collections. The survey came about because of a backlog of unprocessed digital media, of unknown extent, that has accumulated over the past forty years of collection building at the RAC. We are taking initial steps to manage this backlog of born-digital content so that these at-risk items can be identified, separated, and made accessible.
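To make the idea of "intellectual and physical control" a bit more concrete, here is a minimal sketch (in Python) of the kind of item-level inventory record a survey like this builds up. This is not our actual schema; every class and field name below is hypothetical.

```python
import csv
from dataclasses import dataclass, asdict, fields

@dataclass
class MediaItem:
    """One physical carrier located during the survey; field names are hypothetical."""
    collection: str
    box: str
    media_type: str        # e.g., "3.5-inch floppy", "CD-R", "Zip disk"
    date_estimate: str
    label_text: str        # transcribed from the carrier itself
    at_risk: bool          # flag carriers that need immediate attention

def write_inventory(items: list[MediaItem], path: str = "media_survey.csv") -> None:
    """Dump the survey to a CSV so items can be sorted, counted, and reviewed."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(MediaItem)])
        writer.writeheader()
        for item in items:
            writer.writerow(asdict(item))
```

Even a flat list like this supports the survey's core goals: counting what exists, locating it physically, and sorting out which carriers are most at risk.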

MetaArchive and Distributed Digital Preservation

About a year ago, I started reviewing ways to secure our digital assets against potentially catastrophic losses due to disaster (natural or otherwise), technical error, hardware failure, and system attacks. I wanted a solution that offered geographically dispersed server space to minimize the risk of loss from disaster, regular fixity checks to help uncover potential hardware issues on those servers, and administrative separation between those servers and our own systems to mitigate the risks of technical error and system attack.

After a review of distributed digital preservation services (as of spring 2013), we selected MetaArchive. MetaArchive is a digital preservation network created and hosted by memory organizations such as libraries and archives. It uses LOCKSS software, which was developed by Stanford University and is used by about a dozen preservation networks worldwide. In a LOCKSS-based system, materials are ingested and stored on servers hosted by network members in disparate geographic locations, and the fixity of the materials is checked at regular intervals. This system helps prevent data loss caused by natural disasters or other emergencies, or by malicious or negligent actors. No single administrator has access to all copies of the data or can tamper with them without detection. A step-by-step review of how it works can be found here. Current MetaArchive member institutions span 13 states and 4 countries, and include many universities, the Library of Congress, and a few smaller libraries.
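For readers unfamiliar with fixity checking, here is a minimal sketch (in Python, and emphatically not MetaArchive's or LOCKSS's actual code) of what a distributed fixity check amounts to: hash every replica of a file and compare the digests. The replica paths are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical replica mount points; in a LOCKSS network these copies live on
# servers administered independently by different member institutions.
REPLICAS = [Path("/mnt/replica_east"), Path("/mnt/replica_west"), Path("/mnt/replica_south")]

def sha256_of(path: Path) -> str:
    """Hash a file in 1 MB chunks so large preservation masters fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fixity_ok(relative_name: str) -> bool:
    """True when every replica reports the same digest for the file.

    A disagreement means at least one copy is damaged; with enough replicas,
    the majority digest indicates which copies should be repaired.
    """
    digests = {sha256_of(replica / relative_name) for replica in REPLICAS}
    return len(digests) == 1
```

Because each member institution administers its own server, no single compromised administrator can silently alter all copies: the next round of checks would surface the mismatch.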

NYART event: Preserving and Archiving Electronically Generated Materials

Last week I spoke at the NYART event, Preserving and Archiving Electronically Generated Materials, which was sponsored by the Leon Levy Foundation. My slides are attached below. Slides from other presenters will be made available on the event website, and you can find the event schedule here.

Accessioning and Ingest of Electronic Records #1409

On October 25, 2013, I attended SAA’s DAS workshop, Accessioning and Ingest of Electronic Records. The workshop was hosted by Harvard and led by Erin Faulder, the Digital Archivist for Tisch Library at Tufts University.

The workshop covered key concepts and issues regarding policy decisions based on institutional mandates, suggestions for working with donors, key elements of digital transfers, and the need for digestible workflows and guidelines. There was a quick overview of the OAIS model, with a breakdown of SIPs, DIPs, and AIPs, and an emphasis on using the OAIS model as the foundation for the digital workflow each institution creates. Two major themes were emphasized throughout the workshop: 1) building trust and communication with donors, and 2) recognizing that accessioning digital material goes beyond traditional accessioning and incorporates elements of appraisal and processing.
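For anyone new to OAIS, it may help to see the three package types as stages in a pipeline: a SIP arrives from the donor, becomes an AIP for long-term preservation, and yields DIPs for access. The sketch below is only illustrative, not the workshop's material; all names and fields are hypothetical simplifications of the model.

```python
from dataclasses import dataclass, field

@dataclass
class SIP:
    """Submission Information Package: the content as transferred by a donor."""
    files: list[str]
    donor_metadata: dict

@dataclass
class AIP:
    """Archival Information Package: the version the repository preserves."""
    files: list[str]
    descriptive_metadata: dict
    checksums: dict[str, str] = field(default_factory=dict)

@dataclass
class DIP:
    """Dissemination Information Package: access copies delivered to researchers."""
    access_copies: list[str]

def ingest(sip: SIP) -> AIP:
    """Turn a SIP into an AIP. As the workshop stressed, this step folds in
    appraisal and processing decisions; it is not a one-to-one copy."""
    return AIP(files=list(sip.files), descriptive_metadata=dict(sip.donor_metadata))

def disseminate(aip: AIP) -> DIP:
    """Derive access copies from the preserved files."""
    return DIP(access_copies=[f"access/{name}" for name in aip.files])
```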

Workshop Report: Digitizing A/V Collections – to Outsource or Not to Outsource?

On Wednesday, October 2nd, I attended the workshop “Digitizing Audiovisual Collections – to Outsource or Not to Outsource,” hosted by METRO and the Moving Image Archiving and Preservation program at NYU. Three speakers from different institutions (Chris Lacinak, Jonah Volk, and Julie May) came together to speak about their experiences in deciding whether to digitize audiovisual materials in-house or to work with an outside vendor. Each speaker laid out criteria and considerations that will prove invaluable for anyone planning an audiovisual digitization project, as well as for those who need guidance when working with vendors. In my opinion, the presentations fell naturally into two categories: how to assess whether to outsource an audiovisual project, and the issues and concerns you must take into account when working with a vendor. I have listed below some of the more salient takeaways from the workshop that any institution should keep in mind before undertaking an audiovisual digitization project.