1968: The Ford Foundation Gets a Computer

Today’s post comes from Rachel Wimpee, Historian and Project Director in our Research and Education division. Rachel uncovered this story while working with the Ford Foundation archives held at the RAC, and asked if it might be worth posting here. I only had to quickly skim the text to see the relevance for this blog.

A couple of broad themes jumped out at me when I read this piece. The first is the durability of modes of speaking and thinking about technology, which seem to persist despite (or perhaps because of) rapid technological changes. Artificial intelligence and machine learning, both currently hot tech trends, figure heavily in this story from 1965. You’ll also notice efficiency being employed as the ultimate justification for technology, even in a situation where increasing the profit margin didn’t apply. This story is also an excellent illustration of the socially constructed nature of technology. As Rachel’s piece reveals, technology is the result of consensus and compromise. There are negotiations mediated by money, practicality, and personality. Not only that, but technology and underlying processes are often so intertwined as to be indistinguishable, and each is often blamed for things the other produces.

In many ways, this is a cautionary tale of what happens when we start with the shiny new thing rather than the needs of users (something that Evgeny Morozov and others have called “solutionism”). It’s not all bad, though. Rachel writes about the training plan implemented by the Ford Foundation at the same time staff began to use an IBM 360/30 mainframe for data processing in the late 1960s, as well as a regular process of evaluation and change implementation which lasted well into the 1970s. This reminded me of the importance of ongoing cycles of training and evaluation. New technologies usually require humans to learn new things, so a plan for both teaching and evaluating the effectiveness of that teaching should be part of any responsible technology change process. The D-Team is thinking a lot about training these days, particularly in the context of Project Electron, which will embed technologies into our workflows in a holistic way. Even though the project won’t be complete until the end of the year, we’re already scheduling training to amplify our colleagues’ existing skills and expertise so they can feel confident working with digital records.

Continue reading

Learning from Liberating Structures

Like many managers, I have a lot of meetings, so I’m always looking for ways to make sure I get the most out of them. Am I hearing from everyone at the table? Are a group’s best ideas being surfaced, or am I just hearing from the extroverts? How can I get my team engaged in strategic planning? Consequently, I’m always on the lookout for tools and techniques to make meetings – one-on-ones, team conversations, administrative updates and beyond – useful, engaging and inclusive.

A couple of years ago, the always-incredible Tara Robertson pointed me towards Liberating Structures. Although I’ve experimented with them a bit over the past few years, I’ve struggled to cut through some of the jargon (particularly the innovation-speak, which especially bugs me) and, let’s face it, the information architecture and graphic design of the official website. However, Tara encouraged me to look for a training opportunity, and after several years, I was finally able to attend a Liberating Structures training led by Fisher Qua at NYC’s Outward Bound headquarters in Long Island City. This two-day intensive workshop made all the difference in helping me understand what Liberating Structures are and how they can be used. Continue reading

Managing Technical Debt: Code4Lib 2018 report

I’m just back from this year’s Code4Lib conference, held in Washington D.C. As I’ve written here before, it’s an event that is, without fail, productive, provocative and exhausting. Over the years, the things that have stayed with me from the conference have changed (arguably the focus of the conference has changed as well) from technological tools for solving problems to values and frameworks for thinking through problems. Continue reading

SHOT 2017: expertise and power

A few weeks ago, I attended the annual meeting of the Society for the History of Technology in Philadelphia. Along with my colleague Eira Tansey, I presented a paper titled “For Good Measure: The Role of Regulatory Records in Environmental Maintenance,” which made the case that environmental regulation relies on the work of recordkeeping. Eira was fresh off of delivering the opening keynote at NDSA’s Digital Preservation 2017: Preservation is Political, a talk which covered many of the same themes. Continue reading

Virtual Vault: making access to digitized records easier

This month, we launched a system called Virtual Vault, which allows us to deliver digitized content to any user within the RAC network. It’s a temporary solution that we hope will help us better understand responsible access to digital archival records. Our thinking around this solution is motivated by one central question: given the limitations of copyright and donor agreement restrictions, what is the most and best access we can provide? Continue reading

Project Electron June Update

This month we’re excited to announce the release of the first version of a specification for transferring digital records to the RAC over a network connection. In line with our project value of supporting archival practices and standards, we’ve built many parts of this specification on existing standards and frameworks such as BagIt, BagIt Profiles, Activity Streams, and OAIS. We believe this approach will make the products we come up with more easily reproducible at other institutions, which is another one of our project values. Continue reading
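The transfer specification itself isn’t reproduced in this excerpt, but as a rough illustration of what building on the BagIt standard looks like in practice, here is a minimal sketch that packages files into a BagIt-style bag with a sha256 payload manifest. It uses only the Python standard library; the directory and file names are invented for the example, and a real implementation would add bag-info metadata and validate against a BagIt Profile.

```python
import hashlib
from pathlib import Path

def make_minimal_bag(bag_dir: Path, payload: dict) -> None:
    """Create a minimal BagIt-style bag: a data/ payload directory,
    a bagit.txt declaration, and a sha256 payload manifest."""
    data_dir = bag_dir / "data"
    data_dir.mkdir(parents=True, exist_ok=True)

    manifest_lines = []
    for name, content in payload.items():
        # Write each payload file under data/ and record its checksum
        (data_dir / name).write_bytes(content)
        digest = hashlib.sha256(content).hexdigest()
        # Manifest entries are "<checksum>  <path relative to bag root>"
        manifest_lines.append(f"{digest}  data/{name}")

    # The bag declaration identifies the BagIt version and tag-file encoding
    (bag_dir / "bagit.txt").write_text(
        "BagIt-Version: 1.0\nTag-File-Character-Encoding: UTF-8\n"
    )
    (bag_dir / "manifest-sha256.txt").write_text("\n".join(manifest_lines) + "\n")

# Example: bag a single invented record prior to transfer
make_minimal_bag(Path("transfer-001"), {"records.csv": b"id,title\n1,Annual Report\n"})
```

Because the manifest ties each payload file to a checksum, the receiving institution can verify on arrival that nothing was corrupted or lost in transit, which is part of what makes BagIt a good fit for network transfer.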

Project Electron May Update

Our major news for this month is that, after evaluating a number of existing solutions against our requirements for archival storage, we have decided to use Fedora as the repository solution for Project Electron. Although there were other systems that met many of our requirements – DSpace for example – in the end we felt that Fedora was the closest match for our needs both in terms of feature coverage and scope. It does what we want it to do without requiring us to support a lot of extra functionality or complexity. Continue reading

Project Electron April Update

As I mentioned last month, we’re moving forward with Project Electron on two fronts: defining the process by which digital records are transferred to the Rockefeller Archive Center and selecting a solution to provide archival storage for those records once they are in our custody. Continue reading

Maintainers II: performance, invisibility and professionalism

Last week, I attended Maintainers II: Labor, Technology, and Social Orders, a conference at Stevens Institute of Technology, and presented a talk which attempted to make the case that folk music is maintenance work by looking at the songs and methodologies of Woody Guthrie. This was a follow-up to last year’s conference, which you may remember from my rave review. This year’s conference matched the first in terms of relevance and resonance, while opening up some new ground for me related to maintenance and archives. Continue reading

Project Electron March Update

March was a busy month for the Project Electron team, with conference presentations at Code4Lib, attendance at LDCX, Born Digital Archiving eXchange and Personal Digital Archiving, and participation in the DACS Principles revision process. Despite this, we managed to make significant progress on Project Electron, specifically in developing requirements for archival storage as well as transfer of records from donor organizations to the Rockefeller Archive Center. Continue reading