Tanya Yule, Public Broadcasting Preservation Fellow at CAAM

 

Drives loaded up and ready to be sent to the AAPB!!

 

Hello, my name is Tanya Yule and I am one of the five fellows in the first cohort of the AAPB Public Broadcasting Preservation Fellowship. Later this month I will be receiving my master's in Library and Information Science and an advanced certificate in Digital Assets Management from San José State University, with an emphasis in archives and preservation.

When I began the program at SJSU, it was with a focus on photography preservation, a way to use my background in historic photographic practices to protect and preserve images for future generations. However, through my work at the Hoover Institution Archives (where I am an intern), I began to fall in love with working in all areas of archives, not just with photographs, and have had the fortunate experience of processing incredible collections that range from the Russian Revolution to the Vietnam War, each providing a unique glimpse of someone's life that I get to describe, organize, and preserve for future generations. When the fellowship was posted, I had a "this was made for me" moment and applied instantly. I have wanted to work with A/V media for quite some time but never had the opportunity, until now.

For the last three months I have been entrenched in material spanning the globe, each item as unique as the next, giving me more in return than I was prepared for. As I sit here trying to tap out a structure and synthesis of what the heck just occurred during the American Archive of Public Broadcasting's Preservation Fellowship, I am almost overwhelmed by the task.

 

Bay Area Video Coalition (BAVC) Set-up

 

The specialness of this particular fellowship has been the opportunity to work with at-risk magnetic media and multiple stakeholders, and to learn a very complex capture technique. I was fortunate to work with two amazing San Francisco-based nonprofit organizations that focus on representing the arts and culture of underrepresented communities and have been pillars in their fields for several decades. The collection I worked from came from the Center for Asian American Media (CAAM); CAAM isn't a traditional archives, but its holdings are significant and represent a wide range of diverse films and documentaries, many of which have appeared on local and national PBS stations over the years. The collection contained U-matic, Betacam, and Digibeta tapes, many of which haven't been viewed in decades. The majority of the fellowship was spent over at the Bay Area Video Coalition (BAVC), under the watchful (and extremely patient and knowledgeable) eye of Jackie Jay. I was fortunate to have my experience take place with the help of a staff that does this work daily and could help me capture and learn in the best possible situation. I would also like to give a shout-out to Morgan Morel for suffering through my lack of command-line knowledge; he has inspired me to take a Python class when this is all over.

What is in a name?

While inventorying the items in the collection at CAAM, I couldn't help but be curious about some of the titles: Anatomy of a Springroll; Dollar a Day, 10 Cents a Dance; A Village Called Versailles; and Sewing Woman, to name a few. Since all of the items are on some form of video (magnetic media), it isn't as easy as just popping a tape in a deck and taking a peek. While capturing in the dark room with my noise-cancelling headphones on, there were moments when I would literally laugh out loud, or cry. The subjects are heavy, as are the perspectives and history; my work at the Hoover Archives had helped prepare me for dealing with difficult collections, especially when it comes to visual materials regarding war and atrocities.

 

Many videos have some form of image error; the "watermark" above is a blemish on an old tape, visible for just 1/30 of a second. After capturing, I would go back to any discrepancy to investigate further.

 

Cleaning, cleaning, and some baking!

I soon learned that the majority of my time would be spent making sure that the decks and tapes were in tip-top shape before capturing. It is quite amazing how much time goes into cleaning tapes, cleaning the decks, baking tapes (in a really high-tech food dehydrator), re-cleaning tapes, and re-cleaning machines, as well as setting up levels and making sure that the item being digitized is as close to the original as possible. The cleaning ensures that there is no transfer of dust or debris from another tape, and that the output from the deck is precise. I am extremely fortunate to have had my digitization station at BAVC, as they understand the fundamentals of video preservation and digitization, and helped me learn more about the process than I thought I would be capable of in such a short time.

About the collection

As archivists, oftentimes we really don't know what a collection is "about" until the end. There are usually surprises, and most of the time these records don't come with a "read me" file, so I figured I would save this portion for the end as well. The collection as a whole speaks to the diversity of Asian American life, culture, and experiences, evoking the universal struggle of the human condition. When curating the featured films for the AAPB Special Collections page, it was difficult to choose; however, many of the films tell the history of women who have defied odds, been outspoken, or sacrificed so much for so little in return. I wanted to put these women up front and recognize their stories, and the people who decided to tell them.

 

CAAM Video Archive

 

Having this wonderful opportunity to participate in this fellowship while completing my degree allowed me to expand my technical and historical knowledge base, which I am forever grateful for. I would like to thank SJSU and my wonderful advisor Alyce Scott, James Ott and Davin Agatep at the CAAM for helping me out with the project, the entire preservation crew at BAVC for making sure I didn’t break anything, and of course the AAPB and all of the wonderful WGBH folks that made this fellowship happen.

If you are interested in learning more, here is a Q&A I did with CAAM when I started; you can also follow #aapbpf for photos of the stations and process.

 

 

Written by Tanya Yule, PBPF Spring 2018 Cohort

*******************

About PBPF

The Public Broadcasting Preservation Fellowship (PBPF), funded by the Institute of Museum and Library Services, supports ten graduate student fellows at University of North Carolina, San Jose State University, Clayton State University, University of Missouri, and University of Oklahoma in digitizing at-risk materials at public media organizations around the country. Host sites include the Center for Asian American Media, Georgia Public Broadcasting, WUNC, the Oklahoma Educational Television Authority, and KOPN Community Radio. Contents digitized by the fellows will be preserved in the American Archive of Public Broadcasting. The grant also supports participating universities in developing long-term programs around audiovisual preservation and ongoing partnerships with their local public media stations.

For more updates on the Public Broadcasting Preservation Fellowship project, follow the project at pbpf.americanarchive.org and on Twitter at #aapbpf, and come back in a few months to check out the results of their work.

AAPB Announces Collaboration with Dartmouth College Media Ecology Project

 


The American Archive of Public Broadcasting (AAPB) and Dartmouth College are pleased to announce a new collaboration in which AAPB’s Online Reading Room of public television and radio programming will now be accessible through the Media Ecology Project (MEP) at Dartmouth.

The Media Ecology Project is a digital resource directed by Dartmouth Associate Professor of Film and Media Studies Mark J. Williams. MEP provides researchers not only with online access to archival moving image collections but also with tools to participate in new interdisciplinary scholarship that produces metadata about the content of participating archives. By providing annotated knowledge about the archival materials, students and scholars add value back to the archives, making these materials more searchable in the future. The MEP aims to facilitate awareness and critical study of media ecology, helping to save and preserve at-risk historical media and contributing to our understanding of their role in the public sphere and in popular memory.

Through this new AAPB-Dartmouth collaboration, historic public broadcasting programs available in the AAPB Online Reading Room will be accessible through the MEP platform. Scholars, researchers and students using the MEP platform will be able to access AAPB collection materials for research, in-classroom presentations and other assignments as part of their academic and scholarly work. MEP scholarly participation spans the disciplines from Arts and Humanities to the Social Sciences, Computer Science and Medical Science. One topic that Williams will immediately pursue with students and colleagues is coverage of the civil rights era that exists in the collection.

While conducting their research via MEP, scholars will be able to give back to AAPB by creating time-based annotations and metadata under a public domain license. Basic descriptive metadata such as credit information for video and audio files is desired, but more granular time-based annotations that describe specific sub-clips within media files will designate more particular areas of scholarly interest. These sub-clips can then be utilized in research essays that are open to scholarly emphases across the academic disciplines. The annotations that students and scholars produce will be made available on the AAPB website for improved searching, navigation and discoverability across the collection and within individual digitized programs and recordings.
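Schematically, a time-based annotation on a sub-clip might look like the following sketch. The field names are illustrative assumptions, not MEP's or AAPB's actual data model; only the general shape (a GUID, a time range, a note, and a public domain license) comes from the description above.

```python
# Sketch of a time-based annotation of the kind MEP scholars would
# contribute back to the AAPB. All field names are hypothetical.
from dataclasses import dataclass, asdict
import json

@dataclass
class Annotation:
    guid: str             # AAPB record the annotation attaches to
    start: float          # sub-clip start, in seconds
    end: float            # sub-clip end, in seconds
    text: str             # the scholar's descriptive note
    license: str = "CC0"  # public domain dedication, per the collaboration

note = Annotation(
    guid="cpb-aacip-507-0r9m32nr3f",  # sample AAPB GUID, for illustration
    start=312.0,
    end=358.5,
    text="Segment on civil rights era coverage; useful for classroom discussion.",
)
print(json.dumps(asdict(note), indent=2))
```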

The American Archive of Public Broadcasting (AAPB) is a collaboration between the Library of Congress and the WGBH Educational Foundation to coordinate a national effort to preserve at-risk public media before its content is lost to posterity and provide a central web portal for access to the unique programming that public stations have aired over the past 70 years. To date, over 50,000 hours of television and radio programming by more than 100 public media organizations and archives across the United States have been digitized for long-term preservation and access. The entire collection is available on location at WGBH and the Library of Congress, and almost 31,000 programs are available online at americanarchive.org.
For more information or to request access to specific materials at either of the two sites, researchers can request a research appointment.

Making the AAPB more accessible, useable, and engaging for scholars, researchers and students furthers AAPB’s mission to facilitate the use of historic public broadcasting materials. Further, the capacity of participants in the MEP to generate and provide tagged annotations and metadata to the AAPB will support the archive in becoming a centralized web portal for discovery of the historic content created by public broadcasting over the past 70+ years.

Historic WRVR-FM Archives to be Digitized, Preserved and Made Available in the American Archive of Public Broadcasting

Historic WRVR-FM Archives Receives CLIR
Digitizing Hidden Special Collections and Archives Award

More than 4,000 hours of cultural and political radio programming from the 60s and 70s to be made public

 

Morningside Heights, NY – The Council on Library and Information Resources has awarded a grant of $330,000 to digitize, preserve, and make publicly accessible previously unavailable archives of the Peabody Award-winning radio station WRVR. Public Radio as a Tool for Cultural Engagement in New York in the 60s and early 70s: Digitizing the Broadcasts of WRVR-FM Public Radio is a joint project between The Riverside Church in the City of New York and the American Archive of Public Broadcasting, a collaboration between the Library of Congress and the WGBH Educational Foundation. The collection includes culturally significant non-commercial programming, including interviews, speeches, and musical interpretations on matters such as civil rights, war, and fine arts, from laypersons to famed scholars, including Martin Luther King, Jr., Malcolm X, and Pete Seeger.

Funded by the Andrew W. Mellon Foundation, the Council on Library and Information Resources' Digitizing Hidden Collections program supports the creation of digital representations of unique content of high scholarly significance. This award will support the preservation and digitization of 3,502 recordings representing 4,000 hours of programming from WRVR from the 1960s and early 1970s. Owned and operated by The Riverside Church from 1961 to 1976, WRVR was the first station to win a Peabody for its entire programming, in part for its coverage of the Civil Rights movement in 1963 Birmingham. In addition to featuring progressive religious and philosophical discussions with Riverside clergy, theologians, and scholars, such as Rev. Dr. Martin Luther King, Jr., WRVR programming included culturally significant topics, speakers, and performances, such as Langston Hughes' "Jericho-Jim Crow" directed by Alvin Ailey, and interviews and readings by Robert Frost, John Ashbery, and Allen Ginsberg. The station also featured the program "Just Jazz with Ed Beach," whose collection currently resides at the Library of Congress.

Preservation of these materials will enhance study in many disciplines, including theology/religion, political science, and communications, especially related to American Christianity, homiletics, progressive responses to the Civil Rights movement, contemporary issues of race and sexuality, the cultural impact of the 1960s, and public radio as a tool for cultural engagement and a precursor to social media.

These recordings will be made publicly available at the American Archive of Public Broadcasting (AAPB), a collaboration between the Library of Congress and WGBH. The AAPB coordinates a national effort to preserve at-risk public media before its content is lost to posterity and provide a central web portal for access to the unique programming that public stations have aired over the past 70 years.

Sample recordings include:

  • Arthur Miller, Statement for World Theater Day, March 27, 1963. Riverside Radio, WRVR; Riverside Archives (The Riverside Church). Arthur Miller remarks on theater's ability to speak universal truths and understanding in art, and on how this particular art form, above many others, informs society's response to war, politics, freedoms, and all matters of the human condition across nations and cultures.
  • "Listen! William Sloane Coffin Jr.: Conscience, Protest & War." Interview on WRVR, March 31, 1968. Riverside Radio, WRVR; Riverside Archives (The Riverside Church). William Sloane Coffin Jr., chaplain at Yale University (later Riverside Senior Minister, 1977-1987), discusses his indictment for conspiracy to encourage draft evasion and the politics of the Vietnam War; peace activism, civil rights, and Dr. King's Poor People's Campaign; and how his privilege informs his work as a clergyperson, activist, and American.

About The Riverside Church
Located in Morningside Heights on the Upper West Side, The Riverside Church in the City of New York is one of the leading voices of Progressive Christianity, influential on America’s religious and political landscapes for more than 85 years.  Built by John D. Rockefeller Jr. and currently led by The Rev. Dr. Amy Butler, the interracial, interdenominational, and international church has long been a forum for important civic and spiritual leaders, including Dr. Martin Luther King, Jr., Nelson Mandela, President Clinton, the Dalai Lama, and countless others.  Visit www.trcnyc.org or find us on social media to learn more about our rich history and the latest news and events.

About the American Archive of Public Broadcasting
The American Archive of Public Broadcasting (AAPB) is a collaboration between the Library of Congress and the WGBH Educational Foundation to coordinate a national effort to preserve at-risk public media before its content is lost to posterity and provide a central web portal for access to the unique programming that public stations have aired over the past 70 years. To date, over 50,000 hours of television and radio programming contributed by more than 100 public media organizations and archives across the United States have been digitized for long-term preservation and access. The entire collection is available on location at the Library of Congress and WGBH, and more than 30,000 programs are available online at americanarchive.org.

About WGBH
WGBH Boston is America’s preeminent public broadcaster and the largest producer of PBS content for TV and the Web, including Masterpiece, Antiques Roadshow, Frontline, Nova, American Experience, Arthur and more than a dozen other prime-time, lifestyle, and children’s series. WGBH also is a leader in educational multimedia, including PBS LearningMedia™, and a pioneer in technologies and services that make media accessible to the 36 million Americans who are deaf, hard of hearing, blind, or visually impaired. WGBH has been recognized with hundreds of honors: Emmys, Peabodys, duPont-Columbia Awards…even two Oscars. Find more information at www.wgbh.org.

About the Library of Congress
The Library of Congress is the world’s largest library, offering access to the creative record of the United States – and extensive materials from around the world – both on site and online. It is the main research arm of the U.S. Congress and the home of the U.S. Copyright Office. Explore collections, reference services and other programs and plan a visit at loc.gov, access the official site for U.S. federal legislative information at congress.gov and register creative works of authorship at copyright.gov.

About CLIR
The Council on Library and Information Resources is an independent, nonprofit organization that forges strategies to enhance research, teaching, and learning environments in collaboration with libraries, cultural institutions, and communities of higher learning.

About the Mellon Foundation
Founded in 1969, the Andrew W. Mellon Foundation endeavors to strengthen, promote, and, where necessary, defend the contributions of the humanities and the arts to human flourishing and to the well-being of diverse and democratic societies by supporting exemplary institutions of higher education and culture as they renew and provide access to an invaluable heritage of ambitious, path-breaking work. Additional information is available at mellon.org.

PBS NewsHour Digitization Project Update: “Asset Review” and Access and Description Workflows

I've previously written about developing and automating management of our workflows for the NewsHour project, and about WGBH's processes for ingesting and preserving the NewsHour digitizations. Now that the project is moving along, and over one thousand episodes of the NewsHour are already on the AAPB (with recently added transcript search functionality!!), I thought I would share more information about our access workflows and how we make NewsHour recordings available.

In this post I will describe our “Asset Review” and “Online Workflow” phases. The “Asset Review” phase is where we determine what work we will need to do to a recording to make it available online, and the “Online Workflow” phase is where we extract metadata from a transcript, add the metadata to our repository, and make the recording available online.

The goals and realities of the NewsHour project necessitate an item level content review of each recording. The reasons for this are distinct and compounding. The scale of the collection (nearly 10,000 assets) meant that the inventories from which we derived our metadata were generated only from legacy databases and tape labels, which are sometimes wrong. At no point were we able to confirm that the content on any tape is complete and correct prior to digitization. In fact, some of the tapes are unplayable before being prepared to be digitized. Additionally, there is third-party content that needs to be redacted from some episodes of the NewsHour before they can be made available. A major complication is that the transcripts only match 7pm Eastern broadcasts, and sometimes 9pm or 11pm updates would be recorded and broadcast if breaking news occurred. The tapes are not always marked with broadcast times, and sometimes do not contain the expected content – or even an episode of the NewsHour!

These complications would be fine if we were only preserving the collection, but our project goal is to make each recording and corresponding transcript or closed caption file broadly accessible. To accomplish that goal each record must have good metadata, and to have that we must review and describe each record! Luckily, some of the description, redaction, and our workflow tracking is automatable.

Access and Description Workflow Overview

As I've mentioned before, we coordinate and document all our NewsHour work in a large Google Sheet we call the "NewsHour Workflow workbook". The chart below explains how a GUID moves through sheets of the NewsHour workbook throughout our access and description work.

AAPB NewsHour Access and Description workflow chart

After a digitized recording has been delivered to WGBH and preserved, it is automatically placed in the queue on the "Asset Review" sheet of our workbook. During the Asset Review, the reviewer answers a series of questions about the GUID. Using these responses, the Google Sheet automatically places the asset into the appropriate workflow trackers in our workbook. For instance, if a recording doesn't have a transcript, it is placed in the "No Transcript tracker", which has extra workflow steps for generating description and subject metadata. A GUID can have multiple issues that place it into multiple trackers simultaneously. For instance, a tape that is not an episode will also not have a transcript, and will be placed on both the "Not an Episode tracker" and the "No Transcript tracker". The Asset Review is critical because the answers determine the work we must perform and ensure that each record will be correctly presented to the public when work on it is completed.

A GUID's status in the various trackers is reflected on the "Master GUID Status sheet", and is automatically updated when different criteria in the trackers are met and documented. When a GUID's workflow tasks have been completely resolved in all the trackers, it appears as "Ready to go online" on the "Master GUID Status sheet." The GUID is then automatically placed into the "AAPB Online Status tracker", which presents the metadata necessary to put the GUID online and indicates whether tasks have been completed in the "Online Workflow tracker". When all tasks are completed, the GUID will be online and our work on the GUID is finished.

In this post I am focusing on a workflow that follows digitizations which don’t have problems. This means the GUIDs are episodes, contain no technical errors, and have transcripts that match (green arrows in the chart). In future blog posts I’ll elaborate on our workflows for recordings that go into the other trackers (red arrows).

Asset Review

An image of a portion of our Asset Review spreadsheet

Each row of the "Asset Review sheet" represents one asset, or GUID. Columns A-G (green cell color) on the sheet are filled with descriptive and administrative metadata describing each item. This metadata is auto-populated from other sheets in the workbook. Columns H-W (yellow cell color) are the reviewer's working area, with questions to answer about each item reviewed. As mentioned earlier, the answers to the questions determine the actions that need to be taken before the recording is ready to go online, and place the GUID into the appropriate workflow trackers.

The answers to some questions on the sheet affect the need to answer others, and cells auto-populate with "N/A" when one answer precludes another. Almost all the answers require controlled values, and the cells will not accept input besides those values. If any of the cells are left blank (besides questions #14 and #15), the review will not register as completed on the "Master GUID Status Sheet". I have automated and applied value control to as much of the data entry in the workbook as possible, because doing so helps mitigate human error. The controlled values also facilitate workbook automation, because we've programmed different actions to trigger when specific expected text strings appear in cells. For instance, the answer to "Is there a transcript for this video?" must be "Yes" or "No", and those are the only inputs the cell will accept. A "No" answer places the GUID on the "No Transcript tracker", and a "Yes" does not.
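To make the routing concrete, here is a minimal Python sketch of the kind of logic the workbook implements with Google Sheets formulas. The field names and the "Technical Error tracker" name are illustrative assumptions; only the "Not an Episode tracker" and "No Transcript tracker" names come from the post.

```python
# Hypothetical sketch of the Asset Review routing that the workbook
# implements with spreadsheet formulas. Field names are illustrative,
# not the workbook's actual column headers.

CONTROLLED = {"Yes", "No"}  # the only values the answer cells accept

def validate(review: dict) -> bool:
    """Reject any answer outside the controlled vocabulary."""
    return all(value in CONTROLLED for value in review.values())

def route_to_trackers(review: dict) -> set:
    """Return the set of workflow trackers a GUID belongs on."""
    trackers = set()
    if review["is_episode"] == "No":
        trackers.add("Not an Episode tracker")
    if review["has_transcript"] == "No":
        trackers.add("No Transcript tracker")
    if review["av_as_expected"] == "No":
        trackers.add("Technical Error tracker")  # assumed tracker name
    return trackers

# A tape that is not an episode also has no transcript, so it lands
# on both trackers at once, as described above.
review = {"is_episode": "No", "has_transcript": "No", "av_as_expected": "Yes"}
assert validate(review)
print(route_to_trackers(review))
```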

To review an item, staff open the GUID on an access hard drive. We have multiple access drives containing copies of all the proxy files from delivered NewsHour digitizations. Reviewers are expected to watch between one and a half and three minutes of the beginning, middle, and end of a recording, and to check for errors while fast-forwarding through everything not watched. The questions reviewers answer are:

  1. Is this video a nightly broadcast episode?
  2. If an episode, is the recording complete?
  3. If incomplete, describe the incompleteness.
  4. Is the date we have recorded in the metadata correct?
  5. If not, what is the corrected date?
  6. Has the date been updated in our metadata repository, the Archival Management System?
  7. Is the audio and video as expected, based on the digitization vendor’s transfer notes?
  8. If not, what is wrong with the audio or video?
  9. Is there a transcript for this video?
  10. If yes, what is the transcript’s filename?
  11. Does the video content completely match the transcript?
  12. If no, in what ways and where doesn’t the transcript match?
  13. Does the closed caption file match completely (if one exists)?
  14. Should this video be part of a promotional exhibit?
  15. Any notes to project manager?
  16. Date the review is completed.
  17. Initials of the reviewer.

Our internal documentation has specific guidelines on how to answer each of these questions, but I will spare you those details! If you’re conducting quality control and description of media at your institution, these questions are probably familiar to you. After a bit of practice reviewers become adept at locating transcripts, reviewing content, and answering the questions. Each asset takes about ten minutes to review if the transcript matches, the content is the expected recording, and the digitization is error free. If any of those criteria are not true, the review will take longer. The review is laborious, but an essential step to make the records available.

Online Workflow

A large majority of recordings are immediately ready to go online following the asset review. These ready GUIDs are automatically placed into the “AAPB Online Status tracker,” where we track the workflow to generate metadata from the transcript and upload that and the recording to the AAPB.

About once a month I use the “AAPB Online Status tracker” to generate a list of GUIDs and corresponding transcripts and closed caption files that are ready to go online. To do this, all I have to do is filter for GUIDs in the “AAPB Online Status tracker” that have the workflow status “Incomplete” and copy the relevant data for those GUIDs out of the tracker and into a text file. I import this list into a FileMaker tool we call “NH-DAVE” that our Systems Analyst constructed for the project.
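In spirit, the monthly export amounts to the following sketch. In practice it is a manual filter-and-copy in Google Sheets; the column names and sample rows below are assumptions for illustration.

```python
# Sketch of the monthly export: pull GUIDs whose workflow status is
# "Incomplete" from the "AAPB Online Status tracker" and write them to
# a text file for import into NH-DAVE. Column names are assumptions.
import csv

rows = [
    # The first GUID is a sample AAPB identifier from the post;
    # transcript filenames here are made up for illustration.
    {"guid": "cpb-aacip-507-0r9m32nr3f", "transcript": "transcript_a.txt",
     "status": "Incomplete"},
    {"guid": "cpb-aacip-507-sample0000", "transcript": "transcript_b.txt",
     "status": "Online"},
]

# Filter for records not yet online, mirroring the spreadsheet filter.
ready = [r for r in rows if r["status"] == "Incomplete"]

# Write a tab-delimited list ready for ingest into the FileMaker tool.
with open("ready_for_nh_dave.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    for r in ready:
        writer.writerow([r["guid"], r["transcript"]])
```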

NewsHour_NHDAVE.png
A screenshot of our FileMaker tool “NH-DAVE”

"NH-DAVE" is a relational database containing all of the metadata that was originally encoded within the NewsHour transcripts. The episode transcripts provided by NewsHour contained the names of individuals appearing in each episode and subject terms for that episode, in marked-up values. Their subject terms were much more specific than ours, so we mapped them to the broader AAPB controlled vocabulary we use to facilitate search and discovery on our website. When I ingest a list of GUIDs and transcripts into "NH-DAVE" and click a few buttons, it uses an AppleScript to match metadata from the transcripts to the corresponding NewsHour metadata records in our Archival Management System and generate SQL statements. We use the statements to insert the contributor and subject metadata from the transcripts into the GUIDs' AAPB metadata records in the Archival Management System.
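A rough sketch of the mapping-and-SQL-generation step follows. NH-DAVE actually does this with AppleScript inside FileMaker; the table names, column names, subject mapping, and sample contributor below are all hypothetical, and only the general technique (map legacy terms to the broader AAPB vocabulary, then emit INSERT statements) comes from the post.

```python
# Hypothetical sketch of generating SQL from transcript metadata.
# Table/column names and the vocabulary mapping are illustrative.

SUBJECT_MAP = {
    # NewsHour's specific term -> broader AAPB term (made-up pairing)
    "School Desegregation": "Education",
    "Arms Control Negotiations": "Military Forces and Armaments",
}

def sql_for_guid(guid: str, contributors: list, subjects: list) -> list:
    """Emit INSERT statements for one GUID's contributor/subject metadata."""
    stmts = []
    for name in contributors:
        escaped = name.replace("'", "''")  # naive SQL quoting, sketch only
        stmts.append(
            f"INSERT INTO contributors (guid, name) VALUES ('{guid}', '{escaped}');"
        )
    for term in subjects:
        mapped = SUBJECT_MAP.get(term, term)  # fall back to original term
        stmts.append(
            f"INSERT INTO subjects (guid, term) VALUES ('{guid}', '{mapped}');"
        )
    return stmts

for stmt in sql_for_guid("cpb-aacip-507-0r9m32nr3f",
                         ["Robert MacNeil"], ["School Desegregation"]):
    print(stmt)
```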

Once the transcript metadata has been ingested we use both a Bash and a Ruby script to upload the proxy recordings to our streaming service, Sony Ci, and the transcripts and closed caption SRT files to our web platform, Amazon. We run a Bash script to generate another set of SQL statements to add the Sony Ci URLs and some preservation metadata (generated during the digital preservation phase) to our Archival Management System. We then export the GUIDs’ Archival Management System records into PBCore XML and ingest the XML into the AAPB’s website. As each step of this process is completed, we document it in the “Online Workflow tracker,” which will eventually register that work on the GUID is completed. When the PBCore ingest is completed and documented on the “Online Workflow tracker,” the recording and transcript are immediately accessible online and the record displays as complete on the “Master GUID Status spreadsheet”!
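As an illustration of that final export step, here is a minimal sketch of serializing a record to PBCore XML. It uses the real PBCore namespace but only a few elements, and the title and date are illustrative placeholders; the actual export from the Archival Management System is far richer.

```python
# Minimal sketch of exporting a record as PBCore XML for ingest into
# the AAPB website. Only a handful of PBCore elements are shown; the
# title and date values are illustrative, not from a real record.
import xml.etree.ElementTree as ET

NS = "http://www.pbcore.org/PBCore/PBCoreNamespace.html"
ET.register_namespace("", NS)  # serialize with PBCore as default namespace

def pbcore_record(guid: str, title: str, date: str) -> str:
    root = ET.Element(f"{{{NS}}}pbcoreDescriptionDocument")
    ET.SubElement(root, f"{{{NS}}}pbcoreAssetDate").text = date
    identifier = ET.SubElement(
        root, f"{{{NS}}}pbcoreIdentifier", source="http://americanarchive.org"
    )
    identifier.text = guid
    ET.SubElement(root, f"{{{NS}}}pbcoreTitle").text = title
    return ET.tostring(root, encoding="unicode")

xml = pbcore_record(
    "cpb-aacip-507-0r9m32nr3f",          # sample GUID from the post
    "The MacNeil/Lehrer NewsHour",       # illustrative title
    "1983-01-04",                        # illustrative broadcast date
)
print(xml)
```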

We consider a record that has an accurate full text transcript, contributor names, and subject terms to be sufficiently described for discovery functions on the AAPB. The transcript and terms will be fully indexed to facilitate searching and browsing. When a transcript matches, our descriptive process for NewsHour is fully automated. This is because we’re able to utilize the NewsHour’s legacy data. Without that data, the descriptive work required for this collection would be tremendous.

A large majority of NewsHour records follow the workflow I’ve described in this post in their journey to the AAPB. If, unlike those covered here, a record is not an episode, does not have a matching transcript, needs to be redacted, or has technical errors, then it requires more work than I have outlined. Look forward to blog posts about those records in the future! Click here to see a NewsHour record that went through this workflow. If you’re interested in our workflow, I encourage you to open the workbook and use “Find” to follow this GUID (“cpb-aacip-507-0r9m32nr3f”) through the various trackers. Click here to see all NewsHour records that have been put online!

WGBH Awarded $1 Million Grant by Andrew W. Mellon Foundation to Support American Archive of Public Broadcasting

Grant will bolster capacity and usability of the American Archive of Public Broadcasting

BOSTON (June 22, 2017) – WGBH Educational Foundation is pleased to announce that the Andrew W. Mellon Foundation has awarded WGBH a $1 million grant to support the American Archive of Public Broadcasting (AAPB). The AAPB, a collaboration between Boston public media station WGBH and the Library of Congress, has been working to digitize and preserve nearly 50,000 hours of broadcasts and previously inaccessible programs from public radio and public television’s more than 60-year legacy.

WGBH will use the grant funds to build technical capacity for the intake of new content, develop collaborative initiatives, build training and support services for AAPB contributors and foster scholarly use and enhance public access for the collection. These efforts will include the creation of advisory committees for scholars, stations and educators.

“The work of the American Archive of Public Broadcasting is crucial for preserving our public media history and making this rich vault of content available to all,” said WGBH President and CEO Jon Abbott. “I am grateful that the Mellon Foundation has recognized the invaluable efforts of our archivists to save these historic programs for the future. WGBH is honored to accept this generous grant.”

WGBH also will hire a full-time Engagement and Use Manager to lead outreach and engagement activities for the AAPB. Candidates can find the job posting on WGBH’s employment website: http://www.wgbh.org/about/employmentopportunities.cfm.

The AAPB is a national effort to preserve at-risk public media and provide a central web portal for access to the programming that public stations and producers have created over the past 60 years. In its initial phase, the AAPB digitized approximately 40,000 hours of radio and television programming and related materials selected by more than 100 public media stations and organizations across the country. The entire collection is available for research on location at WGBH and the Library, and currently more than 20,000 programs are available in the AAPB’s Online Reading Room at americanarchive.org to anyone in the United States.

###

About WGBH

WGBH Boston is America’s preeminent public broadcaster and the largest producer of PBS content for TV and the Web, including Masterpiece, Antiques Roadshow, Frontline, Nova, American Experience, Arthur, Curious George, and more than a dozen other prime-time, lifestyle, and children’s series. WGBH also is a leader in educational multimedia, including PBS LearningMedia, and a pioneer in technologies and services that make media accessible to the 36 million Americans who are deaf, hard of hearing, blind, or visually impaired. WGBH has been recognized with hundreds of honors: Emmys, Peabodys, duPont-Columbia Awards…even two Oscars. Find more information at www.wgbh.org.

About the Library of Congress

The Library of Congress is the world’s largest library, offering access to the creative record of the United States – and extensive materials from around the world – both on site and online. It is the main research arm of the U.S. Congress and the home of the U.S. Copyright Office.  Explore collections, reference services and other programs and plan a visit at loc.gov, access the official site for U.S. federal legislative information at congress.gov and register creative works of authorship at copyright.gov.

About the American Archive of Public Broadcasting

The American Archive of Public Broadcasting (AAPB) is a collaboration between the Library of Congress and the WGBH Educational Foundation to coordinate a national effort to preserve at-risk public media before its content is lost to posterity and provide a central web portal for access to the unique programming that public stations have aired over the past 60 years. To date, over 40,000 hours of television and radio programming contributed by more than 100 public media organizations and archives across the United States have been digitized for long-term preservation and access. The entire collection is available on location at WGBH and the Library of Congress, and more than 20,000 programs are available online at americanarchive.org.

About the Andrew W. Mellon Foundation

Founded in 1969, the Andrew W. Mellon Foundation endeavors to strengthen, promote, and, where necessary, defend the contributions of the humanities and the arts to human flourishing and to the well-being of diverse and democratic societies by supporting exemplary institutions of higher education and culture as they renew and provide access to an invaluable heritage of ambitious, path-breaking work. Additional information is available at mellon.org.

 

PBS NewsHour Digitization Project Update: Ingest and Digital Preservation Workflows

In our last blog post (click for link) on managing the PBS NewsHour Digitization Project, I briefly discussed WGBH’s digital preservation and ingest workflows. Though many of our procedures follow standard practices common to archival work, I thought it would be worthwhile to cover them more in-depth for those who might be interested. We at WGBH are responsible for describing, providing access to, and digitally preserving the proxy files for all of our projects. The Library of Congress preserves the masters. In this post I cover how we preserve and prepare to provide access to proxy files.

Before a file is digitized, we ingest the item-level tape inventory generated during the project planning stages into our Archival Management System (AMS – see link for the Github). The inventory is a CSV that we normalize to our standards, upload, and then map to PBCore in MINT, or “Metadata Interoperability Services,” an open-source, web-based plugin designed for metadata mapping and aggregation. The AMS ingests the data and creates new PBCore records, which are stored as individual elements in tables in the AMS. The AMS generates a unique ID (GUID) for each asset. We then export the metadata, provide it to the digitization vendor, and use the GUIDs to track records throughout the project workflow.

Screen Shot 2016-07-07 at 3.30.19 PM.png
Mapping a CSV to PBCore in MINT
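For readers who haven’t worked with PBCore, a drastically stripped-down description record looks roughly like this. The element names are standard PBCore 2.x, but the values are illustrative only, not an actual AMS export:

```xml
<pbcoreDescriptionDocument xmlns="http://www.pbcore.org/PBCore/PBCoreNamespace.html">
  <pbcoreIdentifier source="http://americanarchive.org">cpb-aacip-507-0r9m32nr3f</pbcoreIdentifier>
  <pbcoreTitle titleType="Series">NewsHour</pbcoreTitle>
  <pbcoreDescription>Asset-level description mapped from the inventory CSV.</pbcoreDescription>
</pbcoreDescriptionDocument>
```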

For the NewsHour project, George Blood L.P. receives the inventory metadata and the physical tapes to digitize to our specifications. For every GUID, George Blood creates an MP4 proxy for access, a JPEG2000 MXF preservation master, sidecar MD5 checksums for both video files, and a QCTools XML report for the master. George Blood names each file after the corresponding GUID and organizes the files into an individual folder for each GUID. During the digitization process, they record digitization event metadata in PREMIS spreadsheets. The AMS automatically harvests those sheets at regular intervals and inserts the metadata into the corresponding catalog records. With each delivery batch George Blood also provides MediaInfo XML saved in BagIt containers for every GUID, and a text inventory of the delivery’s assets and corresponding MD5 checksums. The MediaInfo bags are uploaded via FTP to the AMS, which harvests technical metadata from them and creates PBCore instantiation metadata records for the proxies and masters. WGBH receives the digitized files on LTO 6 tapes, and the Library of Congress receives theirs on rotating large-capacity external hard drives.

For those who are not familiar with the tools I just mentioned, I will briefly describe them. A checksum is a computer-generated cryptographic hash. There are different types of hashes, but we use MD5, as do many other archives. The computer analyzes a file with the MD5 algorithm and delivers a 32-character code. If a file does not change, the MD5 value generated will always be the same. We use MD5s to ensure that files are not corrupted during copying and that they stay the same (“fixed”) over time. QCTools is an open-source program developed by the Bay Area Video Coalition and its collaborators. The program analyzes the content of a digitized asset, generates reports, and facilitates the inspection of videos. BagIt is a file packaging format developed by the Library of Congress and partners that facilitates the secure transfer of data. MediaInfo is a tool that reports technical metadata about media files and is used by many in the AV and archives communities. PREMIS is a metadata standard used to record data about an object’s digital preservation.
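Conceptually, the MD5 fixity workflow reduces to two shell commands. A minimal sketch (the file name is hypothetical; this uses GNU `md5sum`, while a Mac workstation would use `md5`):

```shell
# Create a stand-in file and record its MD5 checksum in a sidecar file.
echo -n "example proxy content" > sample.mp4
md5sum sample.mp4 > sample.mp4.md5

# Any time later, verify the file is unchanged ("fixity"):
md5sum -c sample.mp4.md5   # prints "sample.mp4: OK" if the hash still matches
```

If even a single byte of the file changes, the recomputed hash differs and the check reports a failure.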

Now a digression about my inventories – sorry in advance. ¯\_(ツ)_/¯

I keep two active inventories of all digitized files received. One is an Excel spreadsheet “checksum inventory” in which I track whether a GUID that was supposed to be delivered was not received, or whether a GUID was delivered more than once. I also use it to confirm that the checksums George Blood gave us match the checksums we generate from the delivered files, and it serves as a backup for checksum storage and organization during the project. The inventory has a master sheet with info for every GUID, and each tape has an individual sheet with an inventory and checksums of its contents. I set up simple formulas that report any GUIDs or checksums that have issues. I could use scripts to automate the checksum validation process, but I like having the data visually organized for the NewsHour project. Given the relatively small volume of fixity checking I’m doing, this manual verification works fine for this project.

Screen Shot 2017-04-10 at 2.37.28 PM.png
Excel “checksum inventory” sheet page for NewsHour LTO tape #27.

The other inventory is the Approval Tracker spreadsheet in our Google Sheets NewsHour Workflow workbook (click here for link). The Approval Tracker is used to manage reporting on each GUID’s ingest and digital preservation status. I record in it when I have finished the digital preservation workflow on a batch, and I mark when the files have been approved by all project partners. Partners have two months from the date of delivery to report approvals to George Blood. Once the files are approved, they’re automatically placed on the Intern Review sheet for the arrangement and description phase of our workflow.

Screen Shot 2017-04-10 at 2.38.11 PM.png
The Approval Tracker in the NewsHour Workflow workbook.

Okay, forgive me for that, now back to WGBH’s  ingest and digital preservation workflow for the NewsHour project!

The first thing I do when we receive a shipment from George Blood is the essential routine I learned the hard way while stocking a retail store – always make sure everything you paid for is actually there! I do this for the physical LTO tapes, the files on the tapes, the PREMIS spreadsheet, the bags, and the delivery’s inventory. In Terminal I use a Bash script that checks a list of GUIDs against the files present on our server to ensure that all bags have been correctly uploaded to the AMS. If we’ve received everything expected, I then organize the data from the inventory, copying the submission checksums into each tape’s spreadsheet in my Excel “checksum inventory”. Then I start working with the tapes.
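That presence check can be approximated in a few lines of shell. Everything here is hypothetical (invented GUIDs, a local folder standing in for the server), but the logic – diff an expected list against what actually arrived – is the same:

```shell
# Expected GUIDs for this delivery (hypothetical IDs), sorted for comm.
printf 'cpb-aacip-507-0001\ncpb-aacip-507-0002\ncpb-aacip-507-0003\n' | sort > expected.txt

# Simulate a delivery folder that is missing one file.
mkdir -p delivery
touch delivery/cpb-aacip-507-0001.mp4 delivery/cpb-aacip-507-0002.mp4

# GUIDs actually present (file names minus the extension), sorted.
ls delivery | sed 's/\.mp4$//' | sort > received.txt

# Lines unique to expected.txt are GUIDs that never arrived.
comm -23 expected.txt received.txt   # prints cpb-aacip-507-0003
```

`comm -23` suppresses lines common to both files and lines unique to the second file, leaving only the missing GUIDs.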

Some important background: the AAPB staff at WGBH work in a Mac environment, so what I’m writing about works for Mac, but it could easily be adapted to other systems. The first step I take with the tapes is to check them for viruses. We use Sophos to do that in Terminal, with the Sweep command. If no viruses are found, I use one of our three LTO workstations to copy the MP4 proxies, proxy checksums, and QCTools XML reports from the LTO to a hard drive. I do the copying in Terminal and leave it running while I turn to other work. When the tape is done copying I use Terminal to confirm that the number of files copied matches the number I expected to copy. After that, I run an MD5 report (with the find, -exec, and md5 commands) on the copied files on the hard drive. I put those checksums into my Excel sheet and confirm they match the sums provided by George Blood, that there are no duplicates, and that we received everything we expected. If all is well, I put the checksum report onto our department server and move on to examining the delivered files’ specifications.
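The checksum-report step looks roughly like the following sketch (folder and file names are invented; shown with GNU `md5sum`, for which the macOS `find`/`-exec`/`md5` combination is the equivalent):

```shell
# Simulate files copied from an LTO tape to a working drive.
mkdir -p copied_proxies
echo a > copied_proxies/guid-0001.mp4
echo b > copied_proxies/guid-0002.mp4

# Checksum every copied file in one pass and save the report.
find copied_proxies -type f -exec md5sum {} \; > md5_report.txt

# Quick sanity check: one report line per copied file.
wc -l < md5_report.txt
```

The resulting report is what gets pasted into the Excel sheet and compared against the vendor's submitted checksums.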

I use MediaInfo and MDQC to confirm that the files we receive conform to our expectations. Again, this is something I could streamline with scripts if the workflow needed it, but MDQC gets the job done for the NewsHour project. MDQC is a free program from AVPreserve that checks a group of files against a reference file and passes or fails them according to rules you specify. I set the test to check that the delivered batch is encoded to our specifications (click here for those). If any files fail the test, I use MediaInfo in Terminal to examine why they failed. I record any failures at this stage, or earlier in the checksum stage, in an issue tracker spreadsheet the project partners share, and report the problems to the vendor so that they can deliver corrected files.

Screen Shot 2017-04-10 at 2.39.55 PM
MDQC’s simple and effective user interface.
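To make the pass/fail idea concrete, here is a toy version of the kind of rule MDQC automates – not MDQC’s actual mechanics or file format, just the concept of testing a file’s reported characteristics against a spec:

```shell
# A hypothetical, simplified text version of a MediaInfo report.
cat > report.txt <<'EOF'
Format        : MPEG-4
Width         : 720
Height        : 486
EOF

# Pass or fail the file against one encoding rule from the spec.
if grep -q 'Format *: MPEG-4' report.txt; then
  echo PASS
else
  echo FAIL
fi
```

MDQC applies a whole set of such rules (format, dimensions, frame rate, and so on) to every file in a batch at once.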

Next I copy the set of copies on the hard drive onto other working hard drives for the interns to use during the review stage. I then skim a small sample of the files to confirm their content meets our expectations, comparing the digitizations to the transfer notes provided by George Blood in the PREMIS metadata. I review a few of the QCTools reports, looking at the video levels. I don’t spend much time doing that, though, because the Library of Congress reviews the levels and characteristics of every master file. If everything looks good I move on, because all the proxies will be reviewed at an item level by our interns during the next phase of the project’s workflow anyway.

The last steps are to mark both the delivery batch’s digital preservation complete and the files as approved in the Approval Tracker, create a WGBH catalog record for the LTO, run a final MD5 manifest of the LTO and hard drive, upload some preservation metadata (archival LTO name, file checksums, and the project’s internal identifying code) to the AMS, and place the LTO and drive in our vault. The interns then review and describe the records and, after that, the GUIDs move into our access workflow. Look forward to future blog posts about those phases!

PBS NewsHour Digitization Project Update: Workflow Management

NewsHour_Project_LogosIn January 2016, the Council on Library and Information Resources awarded WGBH, the Library of Congress, WETA, and NewsHour Productions, LLC a grant to digitize, preserve, and make publicly accessible on the AAPB website 32 years of NewsHour predecessor programs, from October 1975 to December 2007, that currently exist on obsolete analog formats. Described by co-creator Robert MacNeil as “a place where the news is allowed to breathe, where we can calmly, intelligently look at what has happened, what it means and why it is important,” the NewsHour has consistently provided a forum for newsmakers and experts in many fields to present their views at length in a format intended to achieve clarity and balance, rather than brevity and ratings. A Gallup Poll found the NewsHour America’s “most believed” program. We are honored to preserve this monumental series and include it in AAPB.

Today, we’re pleased to update you on our project progress, specifically regarding the new digitization project workflows that we have developed and implemented to achieve the goals of the project.

The physical work of digitizing the NewsHour tapes and ingesting the new files across the project collaborators has been moving forward since last fall and is now healthily and steadily progressing. Like many projects, ours started out as a great idea with many enthusiastic partners – and that’s good, because we needed some enthusiasm to help us sort out a practical workflow for simultaneously tracking, ingesting, quality checking, digitally preserving, describing, and making available at least 7,512 unique programs!

In practice the workflow has become quite different from what the AAPB experienced with our initial project to digitize 40,000 hours of programming from more than 100 stations. With NewsHour, we started by examining the capabilities of each collaborator and what they already intended to do regarding ingestion and quality control on their files. That survey identified efficiencies: The Library of Congress (the Library) took the lead on ingesting preservation quality files and conducting item level quality control of the files. WGBH focused on ingestion of the proxies and communication with George Blood, the digitization vendor. The Library uses the Baton quality control software to individually pass or fail every file received. At WGBH, we use MDQC from AVPreserve to check that the proxy files we receive are encoded in accordance with our desired specifications. Both institutions use scripts to validate the MD5 file checksums the vendor provides us. If any errors are encountered, we share them in a Google Sheet and WGBH notifies the vendor. The vendor then rectifies the errors and submits a replacement file. Once approved, it is time for WGBH to make the files accessible on the AAPB website.

I imagined that making the files accessible would be a smooth routine – I would put the approved files online and everything would be great. What a nice thought that was! In truth, any one work (Global Unique Identifier or “GUID” – our unique work-level identifier) could have many factors that influence what actions need to be taken to prepare it to go online. When I started reviewing the files we were receiving, looking at transcripts, and trying to keep track of the data and where various GUIDs were in the workflow, I realized that the “some spreadsheets and my mind” system I intended to employ would result in too many GUIDs falling through the cracks, and would likely necessitate far too much duplicate work. I decided to identify the possible statuses of GUIDs in the NewsHour series and every action that would need to be taken to resolve each status. After I stared at a wall for probably too long, my coworkers found me with bloodshot eyes (JK?) and this map:

newshourworkflowwall
(It seems appropriate that the fire alarm is in this picture of the map)

Some of the statuses I identified are:

  • Tapes we do not want captured
  • Tapes that are not able to be captured
  • GUIDs where the digitization is not yet approved
  • GUIDs that don’t have transcripts
  • GUIDs that have transcripts, but they don’t match the content
  • GUIDs that are not a broadcast episode of the NewsHour
  • GUIDs that are incomplete recordings
  • GUIDs that need redacting
  • GUIDs that passed QC but should not have

Every status has multiple actions that need to be taken to resolve that issue and move the GUID towards being accessible. The statuses are not mutually exclusive, though some are contingent on or preclude others. It was immediately clear to me that this would be too much to track manually and that I needed a centralized, automated solution. The system would have to allow simultaneous users and would need to be low-cost and low-maintenance. After discussions with my colleagues, we decided that the best solution would be a Google Spreadsheet that everyone at the AAPB could share.

Here is a link to a copy of the NewsHour Workflow workbook we built. The workbook functions through a “Master List” with a row of metadata for every GUID, an “Intern Review” phase worksheet that automatically assigns statuses to GUIDs based on answers to questions, workflow “Tracker” sheets with the actions that resolve each status, and a “Master GUID Status Sheet” that automatically displays the status of every GUID and where each one is in the overall workflow. Some actions in trackers automatically place the GUID into another tracker – for instance, if a reviewer working in the “No Transcript Tracker” on an episode for which we don’t have a transcript identifies content that needs to be redacted, that GUID is automatically placed on the “Redaction Tracker”.

A broad description of our current project workflow: all of the project’s GUIDs are on the “Master GUID List”, and their presence on that list automatically puts them on the “Master GUID Status Sheet”. When we receive a GUID’s digitized file, staff put the GUID on the “Approval Tracker”. When a GUID passes both WGBH’s and the Library’s QC workflows, it is marked approved on the “Approval Tracker” and automatically placed on the “Intern Review Sheet.” Interns review each GUID and answer questions about the content and transcript, and the answers to those questions automatically place the GUID into different status trackers. We then use the trackers to record the actions that resolve each GUID’s statuses. When a GUID’s issues in all the status trackers are resolved, it is marked as “READY!” to go online and placed in the “AAPB Online Tracker.” When we’ve updated the GUID’s metadata, put the file online, and recorded those actions in the “AAPB Online Tracker,” the GUID is automatically marked complete. Additionally, any statuses that indicate a GUID cannot go online (for instance, a tape in fatal condition that could not be captured) are marked as such in the “Master GUID Status Sheet.” This function helps us differentiate between GUIDs that will not be able to go online and GUIDs that are not yet online but should be when the project is complete.

Here is a picture of a portion of the “Master GUID Status Sheet.”

newshourworkflowstatus
Right now there are a lot of red GUIDs in this picture of the Master sheet, but in the coming months they will be switching to green!

The workbook functions through cross-sheet references and simple logic, built mostly with “IF,” “COUNTIF,” and “VLOOKUP” statements. Its functionality depends on users inputting the correct values in action cells and confirming that they’ve completed their work, but generally those values are locked in with data validation rules and sheet permissions. The workflow review I conducted proved valuable because it provided the logic needed to construct the formulas and tracking sheets.
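To give a flavor of those formulas, here are two hypothetical examples of the kind used in the workbook (the sheet names and column references are invented for illustration, not the workbook’s actual layout):

```
=IF(COUNTIF('Redaction Tracker'!A:A, A2) > 0, "In Redaction", "Clear")
=VLOOKUP(A2, 'Approval Tracker'!A:C, 3, FALSE)
```

The first flags a GUID that appears anywhere on another tracker; the second pulls a status value from that tracker’s row for the same GUID.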

Building the workflow manager in Google Sheets took a few drafts. I tested the workflow with our first few NewsHour pilot digitizations, unleashed it on a few kind colleagues, and then improved it with their helpful feedback. I hope the workbook will save us time figuring out what needs to happen to each GUID and will help prevent any GUIDs from falling through the cracks or being put online incorrectly. Truthfully, the workbook sometimes struggles under its own weight (at one point in my design I hit the 2,000,000-cell limit and had to delete all the extra cells spreadsheet programs automatically create). Anyone conducting a project larger or more complicated than the NewsHour would likely need to upgrade to true workflow management software or a program designed to work from the command line. I hope, if you’re interested, that you take some time to try out the copy of the NewsHour Workflow workbook! If you’d like more information, a link to our workflow documentation that further explains the workbook can be provided.

This post was written by Charles Hosale, WGBH.

AAPB NDSR Resources Round-up

 

In 2015, the Institute of Museum and Library Services awarded a generous grant to WGBH on behalf of the American Archive of Public Broadcasting (AAPB) to develop the AAPB National Digital Stewardship Residency (NDSR). Through this project, we have placed seven graduates of master’s degree programs in digital stewardship residencies at public media organizations around the country.

AAPB NDSR has already yielded dozens of great resources for the public media and audiovisual preservation community – and the residents aren’t even halfway done yet! As we near the program’s midpoint, we wanted to catch you up on the program so far.

We started off in July 2016 with Immersion Week in Boston, which featured presentations on the history of public media and the AAPB, an overview of physical and digital audiovisual materials, an introduction to audiovisual metadata, and instructional seminars on digital preservation workflows, project management, and professional development. Attendees also participated in a full-day session on “Thinking Like a Computer” and a hands-on command line workshop.

Several sessions from Immersion Week were filmed by the WGBH Forum Network.

In August 2016, the residents dispersed to their host stations, and began recording their experiences in a series of thoughtful blog posts, covering topics from home movies to DAM systems to writing in Python.

AAPB NDSR blog posts to date include:

“Digital Stewardship at KBOO Community Radio,” Selena Chau (8/9/16)

“Metadata Practices at Minnesota Public Radio,” Kate McManus (8/15/16)

“NDSA, data wrangling, and KBOO treasures,” Selena Chau (8/30/16)

“Minnesota Books and Authors,” Kate McManus (9/23/16)

“Snapshot from the IASA Conference: Thoughts on the 2nd Day,” Eddy Colloton (9/29/16)

“Who just md5deep-ed and redirected all them checksums to a .csv file? This gal,” Lorena Ramirez-Lopez (10/6/16)

“IASA Day 1 and Voice to Text Recognition,” Selena Chau (10/11/16)

“IASA – Remixed,” Kate McManus (10/12/16)

“Learning GitHub (or, if I can do it, you can too!)” Andrew Weaver (10/13/16)

“Home Movie Day,” Eddy Colloton (10/15/16)

“Snakes in the Archive,” Adam Lott (10/20/16)

“Vietnam, Oral Histories, and the WYSO Archives Digital Humanities Symposium,” Tressa Graves (11/7/16)

“Archives in Conversation (A Glimpse into the Minnesota Archives Symposium, 2016),” Kate McManus (11/15/16)

“Inside the WHUT video library clean-up – part 1: SpaceSaver,” Lorena Ramirez-Lopez (11/21/16)

“Is there something that does it all?: Choosing a metadata management system,” Selena Chau (11/22/16)

“Inside the WHUT video library clean-up – part 2: lots of manual labor,” Lorena Ramirez-Lopez (12/20/16)

“Just Ask For Help Already!” Eddy Colloton (12/22/16)

August also kicked off our first series of guest webinars, focusing on a range of topics of interest to audiovisual and digital preservation professionals. Most webinars were recorded, and all have slides available.

AAPB NDSR webinars to date include:

“Metadata: Storage, Modeling and Quality,” by Kara Van Malssen, Partner & Senior Consultant at AVPreserve

“Public Media Production Workflows,” by Leah Weisse, WGBH Digital Archive Manager/Production Archival Compliance Manager (slides)

“Imposter Syndrome,” by Jen LaBarbera, Head Archivist at Lambda Archives of San Diego, and Dinah Handel, Mass Digitization Coordinator at the NYPL (slides)

“Preservation and Access: Digital Audio,” by Erica Titkemeyer, Project Director and AV Conservator at the Southern Folklife Collection (slides)

“Troubleshooting Digital Preservation,” by Shira Peltzman, Digital Archivist at UCLA Library (slides)

“Studs Terkel Radio Archive: Tips and Tricks for Sharing Great Audio,” by Grace Radkins, Digital Content Librarian at Studs Terkel Radio Library (slides)

“From Theory to Action: Digital Preservation Tools and Strategies,” by Danielle Spalenka, Project Director of the Digital POWRR Project (slides)

Our first two resident-hosted webinars (open to the public) will be happening this month! Registration and more info are available here.

The residents also hosted two great panel presentations, first in September at the International Association of Sound and Audiovisual Archives Conference, and in November at the Association of Moving Image Archivists Conference. The AMIA session in particular generated a lot of Twitter chatter; you can see a roundup here.

To keep up with AAPB NDSR blog posts, webinar recordings, and project updates as they happen, follow the AAPB NDSR site at ndsr.americanarchive.org.

Celebrating National Radio Day

BPT_NRD_Square_Graphic-01

August 20 is National Radio Day!

National Radio Day “is a time to honor one of the most longstanding electronic media and its role in our lives.” To celebrate National Radio Day, we have added more than 500 historic radio programs to the American Archive of Public Broadcasting (AAPB) Online Reading Room, now accessible from anywhere in the United States. With these new additions, there are now more than 14,000 historic public radio and television programs available for research, educational and informational purposes in the Online Reading Room.

The following radio series are now available for listening online:

Cross Currents from Vermont Public Radio (1978 – 1980)
Cross Currents is a series of recorded lectures and public forums exploring issues of public concern in Vermont.

Hit the Dirt from WERU Community Radio (1990s)
Hit the Dirt is an educational show providing information about a specific aspect of gardening each episode.

Herbal Update from WERU Community Radio (1990s)
Herbal Update is an educational show providing information about the health and nutrition benefits of a specific herb each episode.

The following series were contributed to the AAPB by the University of Maryland’s National Public Broadcasting Archives as part of the National Association of Educational Broadcasters (NAEB) collection. NAEB was established in 1934 from a precursor organization that formed in 1925. In 1951, NAEB established a tape duplication exchange system in Urbana, IL, where programs produced by university radio stations across the country were copied and distributed to member stations, an early networking scheme that influenced the history of later public radio and television systems. The more than 5,500 NAEB radio programs available in the AAPB were produced between 1952 and 1976, and include radio documentaries, coverage of events (hearings, meetings, conferences, and seminars), interviews, debates, and lectures on public affairs topics such as civil rights, foreign affairs, health, politics, education, and broadcasting.

WRVR | Riverside Church
The American People  (1964 – 1965)
Series examines contemporary issues through interviews and personal essays.

Automation and Technological Change (1964)
Documentary series on automation and technological change.

Conversations on Public Relations (1967)
Series of informal half-hour discussions on the nature and ethics of public relations.

WMUK | Western Michigan University
Where Minds Meet (1962 – 1963)
Discussions exploring the world of speech, conducted by Professors John Freund and Arnold Nelson of Western Michigan University.

WMUB | Miami University
As We See It: Vietnam ‘68 (1968)
Lecture/debate series on aspects of the war in Vietnam and Southeast Asia.

WBFO | SUNY Buffalo
The Only Way to Fly (1968)
Series about the safety aspects of commercial airlines and commercial air transport in the United States.

WUOM | University of Michigan
News in Twentieth Century America (1959)
A series of documentaries on the gathering, writing and dissemination of news in this country today, compiled from interviews with journalists.

Medical Research (1960)
Series about behavioral sciences and medicine.

Behavioral Science Research (1961)
Documentary series on the role of behavioral sciences.

The Challenge of Aging (1961)
Nine segments on aging within the series Behavioral Science Research.

Aspects of Mental Health (1962)
Documentary series about behavioral sciences and medicine research.

Wingspread Conference (1966)
Three programs of the major speeches given at the Wingspread Conference on Educational Radio as a National Resource, held Sept. 26-28, 1966, at the Johnson Foundation in Racine, Wisconsin.

The American Town: A Self-Portrait (1967)
Historical documentary series drawn from the recollections of senior citizens in a variety of American towns.

The Truth about Radio (1967)
Interview by Richard Doan with Edmund G. Burrows, chairman of NAEB and manager of WUOM at U. of Michigan. He discusses his station and educational radio and television programming.

Public Broadcasting Act of 1967 (1967)
Panel discussion on Public Broadcasting Act of 1967.

University of Iowa
Russia Revisited (1959)
An informal talk by John Scott, assistant to the publisher of Time, Life and Fortune, recounting his recent trip to the Soviet Union.

Space Science Press Conference (1962)
Press conference at the Univ. of Iowa at the conclusion of the 1962 Space Science Summer Study Program, hosted by the National Aeronautics and Space Administration.

University of Florida
Revolution in Latin America (1961)
Documentary series on problems facing Latin America.

University of Denver
Indian Country (1957)
The problems of social adjustment in the attitudes and through the words of the modern American Indian.

Michigan State University
The Tender Twigs (1958)
Discussions of problems affecting today’s youth: mental health, delinquency, crime, social pressures; it considers solutions.

Hold Your Breath (1963)
Series about the impacts of air pollution.

The Music Makers (1965 – 1966)
Distinguished Americans discuss their profession of music, from composition to criticism; the business of music and its current place in our national culture.

San Bernardino Valley College
Politics in the Twentieth Century (1957)
Moderated panel discussion on American political affairs in mid-20th century.

Man is not a Thing (1958)
Discussion of the discoveries and errors of Sigmund Freud and his impact on the American family, politics and religion.

WGUC | University of Cincinnati
Interview with Dr. Albert B. Sabin (1961)
Interview with Dr. Albert B. Sabin, developer of the anti-polio vaccine.

Metaphysical Roots of the Drama (1968)
Lectures given at the Hebrew Union College-Jewish Institute of Religion at Cincinnati by Robert Brustein, Dean of the Yale School of Drama.