This past week, we both attended the Code4Lib conference in Princeton, New Jersey. This year’s Code4Lib was exceptionally well programmed, in terms of both content and sequencing. Our reflections below offer a brief glimpse into our experience.


One of the main themes for me from this year’s conference was an emphasis on intentional slowness, in opposition to the “move fast and break things” mantra often heard at tech conferences. This played out across many talks, from an entire block of the conference devoted to project planning and process for technology teams to Danny Nanez’s excellent talk on UX/UI considerations for enslavement websites, in which he proposed several strategies for “toning down” the excitement (visual and otherwise) around the presentation of sensitive or offensive historical records.

This theme was perhaps best captured by a quote from Kate Zwaard (Associate Librarian for Discovery and Preservation at the Library of Congress) mentioned by one of the presenters: “Through the slow and careful adoption of tech, the library can be a leader.” Speed, in other words, is not what distinguishes our work; slowness and care do.

My favorite talk of the conference was a presentation Bohyun Kim (Associate University Librarian for IT, University of Michigan) gave on frameworks for planning technology work. I found her discussion of different models for planning absolutely fascinating, and very much connected to conversations I’ve had over the years with folks at the Wellcome Collection about their approach to managing technological change.


Like Hillel, I was thinking a lot about intentionality during Code4Lib: the intentionality of how we do our work as technologists, and the type of information we choose to convey in our discovery systems. One of the main themes I noticed throughout the conference was how important it is to show care for both the humans who do the technological work and the humans who will use our systems. I saw this theme reflected in Dr. Lydia Tang’s keynote on the importance of accessibility, Ryan McCarthy’s (Senior Software Engineer, Ithaka) presentation on making JSTOR resources as accessible as possible to incarcerated persons, Vickie Karasic and Robert-Anthony Lee-Faison’s presentation about tackling imposter syndrome in developers, and many more across all three days.

This conference centered the people both creating and consuming systems in a way that many conferences don’t, and I think that’s a product of the “slow and careful adoption of tech” that Zwaard spoke of, and that Hillel outlines above. Stepping away from the “move fast and break things” philosophy allows us, as technologists, to spend more time thinking about how to make the development cycle more comfortable and welcoming for all involved, and gives us time to think through some of the ethical implications and questions raised by our work and our creations. Some of this human-centered design thinking appeared in Jackson Huang’s (Digital Content and Collections Coordinator, University of Michigan) talk on using programmatic workflows to identify elements of description in need of human intervention in materials relating to the US occupation of the Philippines, and on helping Filipino users better engage with those materials.

I thought I had heard all I ever wanted to hear about AI coming into the conference, which is why I was so surprised that Andromeda Yelton’s (Senior Software Engineer, JSTOR Labs) presentation on debiasing search algorithms ended up being my favorite of the conference. Yelton didn’t have all the answers on how to fix the biases that pop up in search engine algorithms, but did spend their time talking frankly about how they attempted to translate their own personal values to a search engine interface. I thought the questions they raised were thought-provoking, and a necessary reminder that sometimes there are no easy answers.