Join Current for “Get with the Program!: Shows That Shaped Public Television”

2017 is the 50th anniversary of the Public Broadcasting Act. Join Current for Get with the Program!: Shows That Shaped Public Television, a series of online events looking at some of the most influential public TV programs of all time. First up: Firing Line, the legendary public affairs program hosted by conservative intellectual William F. Buckley Jr. Watch clips of Firing Line, courtesy of the Hoover Institution Archives, and discuss the impact of this groundbreaking show on American culture and on public TV itself. Guests include Heather Hendershot, author of “Open to Debate: How William F. Buckley Put Liberal America on the Firing Line,” and former ABC News analyst Jeff Greenfield. This free event is Wednesday, May 24, at 1 p.m. ET. Reserve your spot here: bit.ly/pba50-firingline.

Image courtesy Hoover Institution Archives

AAPB Launches Crowdsourcing Game

WGBH, on behalf of the American Archive of Public Broadcasting (AAPB) and with funding from the Institute of Museum and Library Services, is excited to announce today’s launch of FIX IT, an online game that allows members of the public to help AAPB professional archivists improve the searchability and accessibility of more than 40,000 hours of digitized, historic public media content.

For grammar nerds, history enthusiasts, and public media fans, FIX IT reveals the depth of historic events recorded by public media stations across the country and allows anyone and everyone to join together to preserve public media for the future. FIX IT players can rack up points on the game leaderboard by identifying and correcting errors in machine-generated transcriptions that correspond to AAPB audio. They can listen to clips and follow along with the corresponding transcripts, which sometimes misidentify words or introduce faulty grammar or spelling. Each error fixed brings a player points closer to victory.

Visit fixit.americanarchive.org to help preserve history for future generations. Players’ corrections will be made available in public media’s largest digital archive at americanarchive.org. Please help us spread the word!

PBS NewsHour Digitization Project Update: Ingest and Digital Preservation Workflows

In our last blog post on managing the PBS NewsHour Digitization Project, I briefly discussed WGBH’s digital preservation and ingest workflows. Though many of our procedures follow standard archival practice, I thought it would be worthwhile to cover them in more depth for those who might be interested. We at WGBH are responsible for describing, providing access to, and digitally preserving the proxy files for all of our projects; the Library of Congress preserves the masters. In this post I cover how we preserve and prepare to provide access to proxy files.

Before a file is digitized, we ingest the item-level tape inventory generated during the project planning stages into our Archival Management System (AMS). The inventory is a CSV that we normalize to our standards, upload, and then map to PBCore in MINT, or “Metadata Interoperability Services,” an open-source, web-based plugin designed for metadata mapping and aggregation. The AMS ingests the data and creates new PBCore records, which are stored as individual elements in tables in the AMS. The AMS also generates a unique ID (GUID) for each asset. We then export the metadata, provide it to the digitization vendor, and use the GUIDs to track records throughout the project workflow.
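
For illustration, a single normalized row of the inventory CSV might look something like this before upload (the column names and values here are invented, not the project’s actual schema):

```
identifier,title,date,format,duration,location
NH-1982-0715,The MacNeil/Lehrer Report,1982-07-15,1 inch videotape,00:58:30,Shelf 42B
```

In MINT, each column is then mapped to its corresponding PBCore element, such as pbcoreTitle or pbcoreAssetDate.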

Mapping a CSV to PBCore in MINT

For the NewsHour project, George Blood L.P. receives the inventory metadata and the physical tapes to digitize to our specifications. For every GUID, George Blood creates an MP4 proxy for access, a JPEG2000 MXF preservation master, sidecar MD5 checksums for both video files, and a QCTools XML report for the master. George Blood names each file after the corresponding GUID and organizes the files into an individual folder for each GUID. During the digitization process, they record digitization event metadata in PREMIS spreadsheets, which the AMS automatically harvests at regular intervals, inserting the metadata into the corresponding catalog records. With each delivery batch George Blood also provides MediaInfo XML saved in BagIt containers for every GUID, along with a text inventory of the delivery’s assets and their corresponding MD5 checksums. The MediaInfo bags are uploaded via FTP to the AMS, which harvests technical metadata from them and creates PBCore instantiation metadata records for the proxies and masters. WGBH receives the digitized files on LTO 6 tapes, and the Library of Congress receives theirs on rotating large-capacity external hard drives.
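
To make the deliverables concrete, a single GUID’s delivery folder might look something like the sketch below. The GUID and file names are hypothetical stand-ins, not the vendor’s actual naming scheme:

```
cpb-aacip-507-0000000001/
├── cpb-aacip-507-0000000001.mp4             # MP4 proxy for access
├── cpb-aacip-507-0000000001.mp4.md5         # sidecar checksum for the proxy
├── cpb-aacip-507-0000000001.mxf             # JPEG2000 MXF preservation master
├── cpb-aacip-507-0000000001.mxf.md5         # sidecar checksum for the master
└── cpb-aacip-507-0000000001.qctools.xml     # QCTools report for the master
```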

For those who are not familiar with the tools I just mentioned, I will briefly describe them. A checksum is a computer-generated cryptographic hash. There are different types of hashes, but we use MD5, as do many other archives. The computer analyzes a file with the MD5 algorithm and delivers a 32-character code. If a file does not change, the MD5 value generated will always be the same. We use MD5s to ensure that files are not corrupted during copying and that they stay the same (“fixed”) over time. QCTools is an open-source program developed by the Bay Area Video Coalition and its collaborators; it analyzes the content of a digitized asset, generates reports, and facilitates the inspection of videos. BagIt is a file packaging format developed by the Library of Congress and partners that facilitates the secure transfer of data. MediaInfo is a tool that reports technical metadata about media files; it’s used by many in the AV and archives communities. PREMIS is a metadata standard used to record data about an object’s digital preservation.
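
For example, on a Mac you can generate that 32-character value in Terminal with the built-in md5 command (the filename below is made up); running it again on an unaltered file always returns the identical hash:

```bash
$ md5 -q cpb-aacip-507-0000000001.mp4
79054025255fb1a26e4bc422aef54eb4
```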

Now a digression about my inventories – sorry in advance. ¯\_(ツ)_/¯

I keep two active inventories of all digitized files received. One is an Excel spreadsheet “checksum inventory” in which I track whether a GUID that was supposed to be delivered was not received, or whether a GUID was delivered more than once. I also use it to confirm that the checksums George Blood gave us match the checksums we generate from the delivered files, and it serves as a backup for checksum storage and organization during the project. The inventory has a master sheet with info for every GUID, and each tape has an individual sheet with an inventory and checksums of its contents. I set up simple formulas that report any GUIDs or checksums that have issues. I could use scripts to automate the checksum validation process, but I like having the data visually organized, and given the relatively small volume of fixity checking I’m doing, manual verification works fine for this project.
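
As a rough sketch of those formulas (assuming, for illustration only, GUIDs in column A, vendor checksums in column B, and locally generated checksums in column C of a tape’s sheet), the logic can be as simple as:

```
=IF(EXACT(B2, C2), "OK", "MISMATCH")
=IF(COUNTIF($A$2:$A$500, A2) > 1, "DUPLICATE", "")
```

The first formula flags any row where the two checksums differ; the second flags any GUID that appears more than once in the delivery.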

Excel “checksum inventory” sheet page for NewsHour LTO tape #27.

The other inventory is the Approval Tracker spreadsheet in our Google Sheets NewsHour Workflow workbook. The Approval Tracker manages reporting on each GUID’s ingest and digital preservation workflow status. I record in it when I have finished the digital preservation workflow on a batch, and I mark when the files have been approved by all project partners. Partners have two months from the date of delivery to report approvals to George Blood. Once the files are approved, they’re automatically placed on the Intern Review sheet for the arrangement and description phase of our workflow.

The Approval Tracker in the NewsHour Workflow workbook.

Okay, forgive me for that digression – now back to WGBH’s ingest and digital preservation workflow for the NewsHour project!

The first thing I do when we receive a shipment from George Blood is the essential routine I learned the hard way while stocking a retail store – always make sure everything you paid for is actually there! I do this for the physical LTO tapes, the files on the tapes, the PREMIS spreadsheet, the bags, and the delivery’s inventory. In Terminal I use a bash script that checks a list of GUIDs against the files present on our server to ensure that all bags have been correctly uploaded to the AMS. If we’ve received everything expected, I organize the data from the inventory, copying the submission checksums into each tape’s spreadsheet in my Excel “checksum inventory.” Then I start working with the tapes.
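
The script itself isn’t included in this post, but a minimal sketch of that kind of check might look like the following, where guids.txt (one GUID per line) and the server path are invented for illustration:

```bash
#!/bin/bash
# Report any expected GUID that has no corresponding bag on the server.
while read -r guid; do
  if [ ! -d "/Volumes/ams_uploads/$guid" ]; then
    echo "MISSING: $guid"
  fi
done < guids.txt
```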

Some important background: the AAPB staff at WGBH work in a Mac environment, so what I’m writing about works for Mac, though it could easily be adapted to other systems. The first step I take with the tapes is to check them for viruses. We use Sophos to do that in Terminal, with the sweep command. If no viruses are found, I use one of our three LTO workstations to copy the MP4 proxies, proxy checksums, and QCTools XML reports from the LTO to a hard drive. I do the copying in Terminal and leave it running while I attend to other work. When the tape is done copying, I use Terminal to confirm that the number of files copied matches the number I expected. After that, I use it to run an MD5 report (with the find command’s -exec flag and the md5 command) on the copied files on the hard drive. I put those checksums into my Excel sheet and confirm that they match the sums provided by George Blood, that there are no duplicates, and that we received everything we expected. If all is well, I put the checksum report onto our department server and move on to examining the delivered files’ specifications.
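
A rough sketch of those two Terminal steps, with an invented destination path:

```bash
# Count the copied files to compare against the expected total.
find /Volumes/work_drive/lto_27 -type f | wc -l

# Generate an MD5 report for every copied file.
find /Volumes/work_drive/lto_27 -type f -exec md5 {} \; > lto_27_md5_report.txt
```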

I use MediaInfo and MDQC to confirm that the files we receive conform to our expectations. Again, this is something I could streamline with scripts if the workflow required it, but MDQC gets the job done for the NewsHour project. MDQC is a free program from AVPreserve that checks a group of files against a reference file and passes or fails them according to rules you specify. I set the test to check that the delivered batch is encoded to our specifications. If any files fail the test, I use MediaInfo in Terminal to examine why. I record any failures at this stage, or earlier in the checksum stage, in an issue-tracker spreadsheet the project partners share, and report the problems to the vendor so that they can deliver corrected files.
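
When a file fails, a quick way to see why is to pull the relevant fields with MediaInfo. Here’s a sketch with an invented filename; the template string simply selects which technical fields to print:

```bash
# Print only the video characteristics relevant to our specs.
mediainfo --Inform="Video;%Format% %Width%x%Height% %BitRate% %ScanType%" cpb-aacip-507-0000000001.mp4

# Or dump the full technical metadata for a closer look.
mediainfo cpb-aacip-507-0000000001.mp4
```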

MDQC’s simple and effective user interface.

Next I copy the files from the hard drive onto other working hard drives for the interns to use during the review stage. I then skim a small sample of the files to confirm their content meets our expectations, comparing the digitizations to the transfer notes provided by George Blood in the PREMIS metadata. I also review a few of the QCTools reports, looking at the video levels. I don’t spend much time doing that, though, because the Library of Congress reviews the levels and characteristics of every master file. If everything looks good I move on, since all the proxies will be reviewed at an item level by our interns during the next phase of the project’s workflow anyway.

The last steps are to mark the delivery batch’s digital preservation complete and the files approved in the Approval Tracker; create a WGBH catalog record for the LTO; run a final MD5 manifest of the LTO and hard drive; upload preservation metadata (archival LTO name, file checksums, and the project’s internal identifying code) to the AMS; and place the LTO and drive in our vault. The interns then review and describe the records, after which the GUIDs move into our access workflow. Look forward to future blog posts about those phases!

Free Webinar Recordings: Strategies for Advancing Hidden Collections

The Council on Library and Information Resources (CLIR) recently completed a six-part webinar series to share best practices and lessons learned from its Cataloging Hidden Collections program. Sponsored through the generous support of The Andrew W. Mellon Foundation, the Strategies for Advancing Hidden Collections (SAHC) series aims to help those working in GLAM (Gallery, Library, Archive, Museum) organizations build the confidence they need to tackle the processing of hidden archival collections. This series may also be particularly useful for public media organizations that are planning preservation projects.

The complete series, including recordings, slides, and transcripts, is now freely available on the CLIR SAHC home page: https://www.clir.org/hiddencollections/sahc/sahc.

To supplement the series, an Online Resource Library was also created for increasing the visibility, usability, and sustainability of collections in the GLAM community: https://wiki.diglib.org/Strategies_for_Advancing_Hidden_Collections.

Voegeli and Setziol Radio Collections Added to the Online Reading Room

In the past few months, we’ve added several new radio collections to our Online Reading Room!

The Donald Voegeli collection preserves the music and memory of Don Voegeli, who wrote the theme music for NPR’s All Things Considered and made many other contributions to public radio over the course of a long and impressive career.

Variations on the All Things Considered theme make up just a fraction of the Don Voegeli collection. There’s also plenty of Voegeli’s other work to explore, from musical compositions in the vein of the ATC theme, like Swiss Clock Maker, to catchy educational jingles like Math Song (“you bisect an angle by using a ruler and compass / you bisect a compass by using a good sharp axe”).

Donald Voegeli’s son Jim Voegeli, a radio producer in his own right, has also contributed four audio documentaries of his own as a separate collection. “Speaking of Wilderness,” Jim’s first documentary, on the importance of conserving wild places, aired on NPR when he was only 16. Jim’s piece “Remembering Aldo Leopold,” a radio documentary essay on the life and legacy of the visionary conservationist and writer, went on to win an Ohio State Award.

Finally, for more award-winning environmental journalism, check out our newest collection of works by Ilsa Setziol, longtime environmental reporter for KPCC. Among other honors, Setziol has been recognized for Outstanding Beat Reporting in Radio by the Society of Environmental Journalists for pieces like this 2003 report on the environmental aftermath of fires in San Bernardino County, “Fire Recovery, Part 1.”

The archive of Setziol’s work for KPCC offers an invaluable record of environmental concerns and activism from the past 20 years, from reports on the projected devastating impact of global warming in California to stories of activists like Josh Quigley, who spent months sitting in an oak tree to try to save it from being cut down.

Browse the collections to listen to hundreds more great radio pieces:

Donald Voegeli
James F. Voegeli
KPCC (Ilsa Setziol)

PBS NewsHour Digitization Project Update: Workflow Management

In January 2016, the Council on Library and Information Resources awarded WGBH, the Library of Congress, WETA, and NewsHour Productions, LLC a grant to digitize, preserve, and make publicly accessible on the AAPB website 32 years of NewsHour and predecessor programs, from October 1975 to December 2007, that currently exist on obsolete analog formats. Described by co-creator Robert MacNeil as “a place where the news is allowed to breathe, where we can calmly, intelligently look at what has happened, what it means and why it is important,” the NewsHour has consistently provided a forum for newsmakers and experts in many fields to present their views at length, in a format intended to achieve clarity and balance rather than brevity and ratings. A Gallup poll found the NewsHour to be America’s “most believed” program. We are honored to preserve this monumental series and include it in the AAPB.

Today, we’re pleased to update you on our project progress, specifically regarding the new digitization project workflows that we have developed and implemented to achieve the goals of the project.

The physical work of digitizing the NewsHour tapes and ingesting the new files across the project collaborators has been moving forward since last fall and is now progressing steadily. Like many projects, ours started out as a great idea with many enthusiastic partners – and that’s good, because we needed some enthusiasm to help us sort out a practical workflow for simultaneously tracking, ingesting, quality checking, digitally preserving, describing, and making available at least 7,512 unique programs!

In practice the workflow has become quite different from what the AAPB experienced with our initial project to digitize 40,000 hours of programming from more than 100 stations. With NewsHour, we started by examining the capabilities of each collaborator and what they already intended to do regarding ingest and quality control of their files. That survey identified efficiencies: the Library of Congress (the Library) took the lead on ingesting preservation-quality files and conducting item-level quality control, while WGBH focused on ingesting the proxies and communicating with George Blood, the digitization vendor. The Library uses the Baton quality control software to individually pass or fail every file received. At WGBH, we use MDQC from AVPreserve to check that the proxy files we receive are encoded in accordance with our desired specifications. Both institutions use scripts to validate the MD5 file checksums the vendor provides. If any errors are encountered, we share them in a Google Sheet and WGBH notifies the vendor, who rectifies the errors and submits a replacement file. Once files are approved, it is time for WGBH to make them accessible on the AAPB website.
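
The exact scripts vary between institutions, but a minimal sketch of that kind of checksum validation, assuming a manifest of checksum-and-filename pairs (an invented format), might be:

```bash
#!/bin/bash
# Recompute each file's MD5 and compare it against the vendor's manifest.
# manifest.txt is assumed to contain lines of: <md5>  <filename>
while read -r expected file; do
  actual=$(md5 -q "$file")
  if [ "$actual" != "$expected" ]; then
    echo "FAILED: $file"
  fi
done < manifest.txt
```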

I imagined that making the files accessible would be a smooth routine – I would put the approved files online and everything would be great. What a nice thought that was! In truth, any one work (identified by a Globally Unique Identifier, or “GUID,” our work-level identifier) could have many factors that influence what actions need to be taken to prepare it to go online. When I started reviewing the files we were receiving, looking at transcripts, and trying to keep track of the data and where various GUIDs were in the workflow, I realized that the “some spreadsheets and my mind” system I intended to employ would result in too many GUIDs falling through the cracks and would likely necessitate far too much duplicate work. I decided to identify the possible statuses of GUIDs in the NewsHour series and every action that would need to be taken to resolve each status. After I stared at a wall for probably too long, my coworkers found me with bloodshot eyes (JK?) and this map:

(It seems appropriate that the fire alarm is in this picture of the map)

Some of the statuses I identified are:

  • Tapes we do not want captured
  • Tapes that are not able to be captured
  • GUIDs where the digitization is not yet approved
  • GUIDs that don’t have transcripts
  • GUIDs that have transcripts, but they don’t match the content
  • GUIDs that are not a broadcast episode of the NewsHour
  • GUIDs that are incomplete recordings
  • GUIDs that need redacting
  • GUIDs that passed QC but should not have

Every status has multiple actions that must be taken to resolve it and move the GUID toward being accessible. The statuses are not mutually exclusive, though some are contingent on or preclude others. It was immediately clear to me that this would be too much to track manually and that I needed a centralized, automated solution. The system would have to allow simultaneous users and would need to be low-cost and low-maintenance. After discussions with my colleagues, we decided that the best solution would be a Google Spreadsheet that everyone at the AAPB could share.

We’ve shared a copy of the NewsHour Workflow workbook we built. The workbook functions through a “Master List” with a row of metadata for every GUID; an “Intern Review” phase worksheet that automatically assigns statuses to GUIDs based on answers to questions; workflow “Tracker” sheets listing the actions that resolve each status; and a “Master GUID Status Sheet” that automatically displays the status of every GUID and where each one is in the overall workflow. Some actions in trackers automatically place the GUID into another tracker – for instance, if a reviewer working in the “No Transcript Tracker” identifies content in an episode that needs to be redacted, that GUID is automatically placed on the “Redaction Tracker.”

A broad description of our current project workflow: all of the project’s GUIDs are on the “Master GUID List,” and their presence on that list automatically puts them on the “Master GUID Status Sheet.” When we receive a GUID’s digitized file, staff put the GUID on the “Approval Tracker.” When a GUID passes both WGBH’s and the Library’s QC workflows, it is marked approved on the “Approval Tracker” and automatically placed on the “Intern Review Sheet.” Interns review each GUID and answer questions about the content and transcript, and the answers to those questions automatically place the GUID into different status trackers. We then use the trackers to track the actions that resolve each GUID’s statuses. When a GUID’s issues in all the status trackers are resolved, it is marked as “READY!” to go online and placed in the “AAPB Online Tracker.” When we’ve updated the GUID’s metadata, put the file online, and recorded those actions in the “AAPB Online Tracker,” the GUID is automatically marked complete. Additionally, any statuses that indicate a GUID cannot go online (for instance, a tape that was in fatal condition and unable to be captured) are marked as such in the “Master GUID Status Sheet.” This function helps us differentiate between GUIDs that will never be able to go online and GUIDs that are not yet online but should be when the project is complete.

Here is a picture of a portion of the “Master GUID Status Sheet.”

Right now there are a lot of red GUIDs in this picture of the Master sheet, but in the coming months they will be switching to green!

The workbook functions through cross-sheet references and simple logic. It is built mostly with “IF,” “COUNTIF,” and “VLOOKUP” statements. Its functionality depends on users entering the correct values in action cells and confirming that they’ve completed their work, but generally those values are locked in with data validation rules and sheet permissions. The workflow review I had conducted proved valuable because it provided the logic needed to construct the formulas and tracking sheets.
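
As a hypothetical illustration (the sheet names and column references here are invented, not the workbook’s actual layout), the cross-sheet logic looks something like:

```
=IF(COUNTIF('Redaction Tracker'!A:A, A2) > 0, "REDACTION PENDING", "OK")
=VLOOKUP(A2, 'Approval Tracker'!A:C, 3, FALSE)
```

The first formula checks whether a GUID appears on another tracker; the second pulls a GUID’s status from a column on the Approval Tracker.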

Building the workflow manager in Google Sheets took a few drafts. I tested the workflow with our first few NewsHour pilot digitizations, unleashed it on a few kind colleagues, and then improved it with their helpful feedback. I hope the workbook will save us time figuring out what needs to happen to each GUID and will help prevent any GUIDs from falling through the cracks or being put online incorrectly. Truthfully, the workbook struggles under its own weight sometimes (at one point in my design I reached the 2,000,000-cell limit and had to delete all the extra cells spreadsheet programs automatically create). Anyone conducting a project larger or more complicated than the NewsHour would likely need to upgrade to true workflow-management software or a program designed to work from the command line. I hope, if you’re interested, that you take some time to try out the copy of the NewsHour Workflow workbook! If you’d like more information, we can provide workflow documentation that further explains the workbook.

This post was written by Charles Hosale, WGBH.

AAPB launches new exhibit “Speaking and Protesting in America”

Image courtesy of the Library of Congress

The long history of Americans exercising their right to speak, assemble and petition is brought to life in a vibrant new online exhibition from the American Archive of Public Broadcasting (AAPB). “Speaking and Protesting in America” explores the role of dissent in American life, ranging from peaceful marches to acts of civil disobedience. This digital look into how Americans have demanded the attention of governing powers brings each movement to life through the rich collection of audio and visual materials preserved and digitized by AAPB, a collaboration between Boston-based public broadcaster WGBH and the Library of Congress.

The exhibit, curated by AAPB Digital Exhibits Intern Michelle Janowiecki, includes a diverse range of public radio and television content from 1956 to 2009, pulling from more than 40 historic radio call-in shows, local news broadcasts, raw footage, and interviews that document the profound impact of the First Amendment on American life.

The exhibit is accessible online at http://americanarchive.org/exhibits/first-amendment.

On Saturday, January 21, in conjunction with the exhibit’s launch, AAPB and PBS’ flagship history documentary series American Experience held a Facebook Live event to discuss how protests throughout American history have been documented and preserved. AAPB Project Manager Casey E. Davis Kaufman, exhibit curator Michelle Janowiecki, American Experience Historian in Residence Gene Tempest, and American Experience Managing Editor for Digital Content Lauren Prestileo participated in the “Documenting Protest” panel discussion, which was held at the WGBH Studio at the Boston Public Library. The recording of the event is available online at https://www.facebook.com/AmericanExperiencePBS/videos/10154919655949122/.

Listen to a sample recording from the exhibit, courtesy of WYSO-FM:

On March 8, 1973, women met at Antioch College in Yellow Springs, Ohio, to hold a rally celebrating International Women’s Day. The rally was part of an annual worldwide celebration recognizing the achievements of women and calling for an end to sexism in the workforce. Listen to the full recording online: http://to.wgbh.org/61838Ryuz

For more information and to explore the exhibit visit http://americanarchive.org/exhibits/first-amendment.

AAPB NDSR Resources Round-up

In 2015, the Institute of Museum and Library Services awarded a generous grant to WGBH on behalf of the American Archive of Public Broadcasting (AAPB) to develop the AAPB National Digital Stewardship Residency (NDSR). Through this project, we have placed seven graduates of master’s degree programs in digital stewardship residencies at public media organizations around the country.

AAPB NDSR has already yielded dozens of great resources for the public media and audiovisual preservation community – and the residents aren’t even halfway done yet! As we near the program’s midpoint, we wanted to catch you up on the program so far.

We started off in July 2016 with Immersion Week in Boston, which featured presentations on the history of public media and the AAPB, an overview of physical and digital audiovisual materials, an introduction to audiovisual metadata, and instructional seminars on digital preservation workflows, project management, and professional development. Attendees also participated in a full-day session on “Thinking Like a Computer” and a hands-on command line workshop.

Several sessions from Immersion Week were filmed by WGBH Forum Network.

In August 2016, the residents dispersed to their host stations, and began recording their experiences in a series of thoughtful blog posts, covering topics from home movies to DAM systems to writing in Python.

AAPB NDSR blog posts to date include:

“Digital Stewardship at KBOO Community Radio,” Selena Chau (8/9/16)

“Metadata Practices at Minnesota Public Radio,” Kate McManus (8/15/16)

“NDSA, data wrangling, and KBOO treasures,” Selena Chau (8/30/16)

“Minnesota Books and Authors,” Kate McManus (9/23/16)

“Snapshot from the IASA Conference: Thoughts on the 2nd Day,” Eddy Colloton (9/29/16)

“Who just md5deep-ed and redirected all them checksums to a .csv file? This gal,” Lorena Ramirez-Lopez (10/6/16)

“IASA Day 1 and Voice to Text Recognition,” Selena Chau (10/11/16)

“IASA – Remixed,” Kate McManus (10/12/16)

“Learning GitHub (or, if I can do it, you can too!),” Andrew Weaver (10/13/16)

“Home Movie Day,” Eddy Colloton (10/15/16)

“Snakes in the Archive,” Adam Lott (10/20/16)

“Vietnam, Oral Histories, and the WYSO Archives Digital Humanities Symposium,” Tressa Graves (11/7/16)

“Archives in Conversation (A Glimpse into the Minnesota Archives Symposium, 2016),” Kate McManus (11/15/16)

“Inside the WHUT video library clean-up – part 1: SpaceSaver,” Lorena Ramirez-Lopez (11/21/16)

“Is there something that does it all?: Choosing a metadata management system,” Selena Chau (11/22/16)

“Inside the WHUT video library clean-up – part 2: lots of manual labor,” Lorena Ramirez-Lopez (12/20/16)

“Just Ask For Help Already!” Eddy Colloton (12/22/16)

August also kicked off our first series of guest webinars, focusing on a range of topics of interest to audiovisual and digital preservation professionals. Most webinars were recorded, and all have slides available.

AAPB NDSR webinars to date include:

“Metadata: Storage, Modeling and Quality,” by Kara Van Malssen, Partner & Senior Consultant at AVPreserve

“Public Media Production Workflows,” by Leah Weisse, WGBH Digital Archive Manager/Production Archival Compliance Manager (slides)

“Imposter Syndrome,” by Jen LaBarbera, Head Archivist at Lambda Archives of San Diego, and Dinah Handel, Mass Digitization Coordinator at the NYPL (slides)

“Preservation and Access: Digital Audio,” by Erica Titkemeyer, Project Director and AV Conservator at the Southern Folklife Collection (slides)

“Troubleshooting Digital Preservation,” by Shira Peltzman, Digital Archivist at UCLA Library (slides)

“Studs Terkel Radio Archive: Tips and Tricks for Sharing Great Audio,” by Grace Radkins, Digital Content Librarian at Studs Terkel Radio Library (slides)

“From Theory to Action: Digital Preservation Tools and Strategies,” by Danielle Spalenka, Project Director of the Digital POWRR Project (slides)

Our first two resident-hosted webinars (open to the public) will be happening this month! Registration and more information are available on the AAPB NDSR site.

The residents also hosted two great panel presentations: first in September at the International Association of Sound and Audiovisual Archives (IASA) Conference, and then in November at the Association of Moving Image Archivists (AMIA) Conference. The AMIA session in particular generated a lot of Twitter chatter; you can see a roundup here.

To keep up with AAPB NDSR blog posts, webinar recordings, and project updates as they happen, follow the AAPB NDSR site at ndsr.americanarchive.org.

17k for 2017

AAPB is kicking off the new year by adding a lot more content to our Online Reading Room. We now have more than 17,000 historic public broadcasting programs available for anyone in the United States to watch or listen to on our site!

Highlights from the newly available recordings include:

Episodes of WHUT’s Evening Exchange, including this episode on The Future of the Black Family. Evening Exchange is a series featuring discussions with “writers, philosophers and newsmakers whose work offers insight into the black community.”

  • Episodes of the children’s radio series, Afield with Ranger Mac, which was broadcast on Wisconsin Public Radio as part of the Wisconsin School of the Air.
  • A speech by a United Mine Workers of America official recorded for the Appalshop documentary UMWA 1970: A House Divided.
  • Episodes of WFMU’s series Wasted Vinyl, including this interview with Joseph Shabalala, founder of Ladysmith Black Mambazo.
  • A locally produced chronicle of the Modoc War (1872–1873) and Modoc leader Captain Jack, from Southern Oregon Public Television’s collection.
  • Episodes of Iowa Press, including this one about Rural Poverty. Iowa Press is a news talk show, featuring an in-depth news report on one topic each episode, followed by a conversation between experts on the issue.

Overall, the new content in the ORR includes recordings from 23 different organizations across the country.

We are very excited to continue making more historic public media available again to the American public, helping to fulfill public media’s mission to enlighten, inspire, and educate its audiences.