The AAPB team is excited to share our Cataloging Guidelines. For those of you who aren’t versed in librarian speak, cataloging is the process of creating quality metadata for materials, such as those in the AAPB collection, in order to improve searching and discoverability — a process that goes beyond basic inventory-level gathering. Information typically captured in cataloging includes titles, dates, subjects, descriptions, genre information, creators and contributors, rights statements, etc. This set of guidelines is a working document that is being updated as needed during the course of our cataloging efforts.
The Cataloging Guidelines are based on PBCore, a metadata schema for describing audiovisual collections, and tailored to meet the needs of the AAPB project.
The Guidelines consist of required and recommended elements and attributes, or fields, to be recorded, their definitions, rules for data input, and examples of compliant data. Additionally, there is an appendix with controlled vocabularies, limited to terms being used in the AAPB, for:
Asset Date Type
Genres (describing both the format and the topic of the content)
These controlled vocabularies, subsets of PBCore and other vocabularies, reflect the types of content that we expect will be included in the collection. As we become more familiar with the actual content, we may expand these vocabularies to better fit the range of our collection.
That said, we may occasionally update this document. Anytime you access the guidelines, be sure to check the date it was last updated, which is recorded on the title page of the Guidelines.
When the AAPB started, most of the 2.5 million records gathered were inventory-level records, mainly created by looking at the tape labels of assets. Because playback was difficult, there wasn’t an easy way to look at or listen to content to create detailed records; that is, until the tapes were digitized. Now that we have 40,000 hours of content easily accessible as digital files, cataloging efforts can begin in earnest.
All of this digitized content will soon be available to researchers on location at WGBH and the Library of Congress; however, without cataloging, many of the assets will remain hidden because of the sparse or non-existent descriptive information.
AAPB metadata specialist Sadie Roosa has just started working with a team of library and information science student interns to catalog the digitized content. Because of the size and scope of the collection, we decided to focus on “minimally cataloging” all the content before fully cataloging it. We have defined a minimally cataloged record as one that contains an Asset Type, Title, Short Description, and Genre, as well as Asset Date, Creators, Contributors, and a Rights Statement, if known. Following this method, we’ll be able to minimally catalog all of the digitized assets much more quickly than we’d be able to fully catalog just a portion of them.
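A minimally cataloged record like the one described above could be sketched as a small PBCore XML document. The following is an illustrative sketch using Python’s standard library: the element names follow PBCore 2.0, but the values are hypothetical, and required pieces of the full schema (such as identifiers and strict element ordering) are omitted for brevity.

```python
import xml.etree.ElementTree as ET

# PBCore 2.0 namespace; registering it as the default keeps the output tidy.
NS = "http://www.pbcore.org/PBCore/PBCoreNamespace.html"
ET.register_namespace("", NS)

def q(tag):
    """Qualify a tag name with the PBCore namespace."""
    return f"{{{NS}}}{tag}"

# Build a minimal "minimally cataloged" record. All values are hypothetical.
doc = ET.Element(q("pbcoreDescriptionDocument"))
ET.SubElement(doc, q("pbcoreAssetType")).text = "Episode"
ET.SubElement(doc, q("pbcoreAssetDate")).text = "1974-05-02"
ET.SubElement(doc, q("pbcoreTitle")).text = "Example Program Title"
ET.SubElement(doc, q("pbcoreDescription")).text = "A short description of the episode."
ET.SubElement(doc, q("pbcoreGenre")).text = "Documentary"

creator = ET.SubElement(doc, q("pbcoreCreator"))
ET.SubElement(creator, q("creator")).text = "Example Station"

rights = ET.SubElement(doc, q("pbcoreRightsSummary"))
ET.SubElement(rights, q("rightsSummary")).text = "Rights status unknown."

xml_text = ET.tostring(doc, encoding="unicode")
print(xml_text)
```

Even a record this small carries every field needed for the minimal-cataloging pass, which is what makes the "minimal first, full later" strategy workable at scale.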
In addition to following the Cataloging Guidelines, catalogers have also reviewed the Listening and Viewing Guidelines, which outline our recommended practices for cataloging audio and video without having to listen or view assets all the way through in real time.
In the future we hope to engage volunteers through crowdsourcing efforts, through which anyone would be able to watch or listen to content in the archive and add data about it. Crowdsourced cataloging will help us obtain richer descriptive data to enhance discoverability across the collection. Since there is so much content in the archive, our limited resources prohibit us from being able to tackle it all on our own. We will definitely keep you posted on this opportunity!
If you have any questions about the guidelines or the AAPB cataloging efforts, please feel free to contact Sadie Roosa at sadie_roosa [at] wgbh [dot] org.
The following is a post by Karen Cariani, Director of the WGBH Media Library and Archives and Project Director for the American Archive of Public Broadcasting.
This past weekend a group of dedicated PBCore enthusiasts met prior to the Code4Lib conference in a suburban Portland, Oregon house for two days. It was a healthy mix of developers, archivists, and managers. The goal was to discuss how to move PBCore towards development of an RDF ontology. With the desire to fully utilize repositories like Fedora 4 and the desire to store data as RDF streams, users of PBCore were beginning to talk about building a PBCore ontology.
Before I continue, I want to sincerely thank everyone else who participated in the hackathon:
Jack Brighton, Illinois Public Media
Glenn Clatworthy, PBS
Laurence Cook, MetaCirque
Casey E. Davis, WGBH
Jean-Pierre Evain, EBU
Rebecca Fraimow, WGBH
Peggy Griesinger, Museum of Modern Art (MOMA)
Rebecca Guenther, New York University
Julie Louise Hardesty, Indiana University
Cliff Ingham, City of Bloomington
Andrew Myers, WGBH
Adam Wead, Penn State
PBCore is a metadata schema for audiovisual materials. Its original development in 2004 was funded by the Corporation for Public Broadcasting, with a goal of creating a metadata standard for public broadcasters to share information about their video and audio assets within and among public media stations. Since its inception, PBCore has been adopted by a growing number of audiovisual archives and organizations that needed a way to describe their archival audiovisual collections. The schema has been reviewed multiple times and is currently in further development via the American Archive of Public Broadcasting and the Association of Moving Image Archivists (AMIA) PBCore Advisory Subcommittee.
A number of PBCore users contribute to and are part of the Project Hydra community, a collaborative, open source effort to build digital repository software solutions at archives institutions. Hydra is built on a framework that uses Fedora Commons as the repository for storing metadata. Many users are seeking to update their Fedora repositories to the latest version (Fedora 4), which provides a great opportunity to develop an RDF data structure. If PBCore had an RDF ontology, it would be easier for PBCore users to take full advantage of Fedora 4 capabilities in managing data and encourage adoption of Fedora 4. In addition, managing data in RDF allows much more flexibility for data relationships and linking data to other repositories.
Knowing how much work building an ontology can be, the hope was to build upon existing work that is already well established. In particular, the EBUCore ontology is quite developed and established. EBUCore grew out of the European broadcasting community’s need to express audiovisual materials in common data structures for easier sharing. There seemed no need to develop something that already exists and does much of what we need it to do. In fact, the uses of EBUCore and PBCore are so similar that we began to wonder why the two exist separately and why we are not joining forces to develop one standard. Certainly in this day and age of limited resources and time, collaborating is more productive than working at odds with each other on different but similar paths.
We were graced with the presence of Jean-Pierre Evain from the European Broadcasting Union (EBU). He clearly showed us what EBUCore did, how similar it was to PBCore, and how far they had gotten with an RDF ontology. The gap between EBUCore and PBCore turned out to be narrower than we expected. Perhaps bridging that gap would be easier than building a brand new ontology based on PBCore. Within a day, many of the issues had been identified, and the remaining work felt doable in a reasonable time frame with a solid workplan in place.
The group quickly decided not to start from scratch by building a PBCore ontology, but instead to build a bridge between PBCore and EBUCore so that PBCore adopters could use the EBUCore ontology. We even talked about a new name for this new collaborative schema.
However, it was fully recognized that current PBCore users would need a path for migration, and some would not be interested in using an RDF ontology and therefore migrating. So how do we manage this community of diverse needs? There is certainly more work to do within the PBCore community around communication and education. And the PBCore community should speak up about this idea.
I am always amazed at how productive it is to gather dedicated people together, face to face. Had we not set aside the weekend to focus on this issue, the work and decisions would have lagged for months through biweekly one-hour phone calls and virtual meetings. The group more or less self-organized and stayed focused with great guidance from Casey Davis. By the end, most everyone was in GitHub making XSLT mappings from PBCore to EBUCore as we completed a gap analysis (still in progress). We finished the day with a plan to move forward and a group dinner.
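At its core, a crosswalk of the kind the group started is a lookup from PBCore elements to EBUCore properties. Here is a deliberately simplified sketch in Python that emits N-Triples statements from such a table; the EBUCore property names on the right are illustrative placeholders, not the subcommittee’s actual mapping, which lives in the hackathon’s GitHub work.

```python
# Illustrative PBCore-to-EBUCore crosswalk: PBCore XML element names
# mapped to (placeholder) EBUCore RDF properties, then serialized as
# simple N-Triples statements.
EBUCORE = "http://www.ebu.ch/metadata/ontologies/ebucore/ebucore#"

# The property names below are assumptions for illustration only.
CROSSWALK = {
    "pbcoreTitle": EBUCORE + "title",
    "pbcoreDescription": EBUCORE + "description",
    "pbcoreGenre": EBUCORE + "hasGenre",
    "pbcoreAssetDate": EBUCORE + "date",
}

def to_ntriples(subject_iri, pbcore_fields):
    """Map a dict of PBCore element names/values to N-Triples lines."""
    lines = []
    for element, value in pbcore_fields.items():
        prop = CROSSWALK.get(element)
        if prop is None:
            continue  # element has no mapping in this sketch
        escaped = value.replace('"', '\\"')
        lines.append(f'<{subject_iri}> <{prop}> "{escaped}" .')
    return lines

record = {"pbcoreTitle": "Example Program", "pbcoreGenre": "Documentary"}
triples = to_ntriples("http://example.org/asset/1", record)
print("\n".join(triples))
```

The real bridging work is of course harder than a flat table (structured sub-elements, attributes, and controlled vocabularies all need rules of their own), which is exactly why the gap analysis matters.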
The PBCore Schema Team is working on an updated version of PBCore (PBCore 2.1), consisting of minor tweaks and bug fixes and expected to be released in March 2015. The group agreed that this work should continue until 2.1 is released. At that point, work on the PBCore XML schema will freeze, and efforts will go into aligning with EBUCore: making sure elements can map across, that we all understand the mapping, and building tools to help with the mapping. The PBCore community needs to comment on this direction. Does it make sense? What are your concerns? The group that met is by no means the end of the discussion.
In the end, it was worth it. For the cost of some snacks and a homemade pasta dinner, we had 11 people from across the country working on a solution, coming to a consensus, and enjoying the camaraderie. I really want to thank everyone who participated and took the time to join us. It was a weekend after all.
The hackathon notes are documented here: http://wiki.code4lib.org/PBCore_RDF_Hackathon
The following is a guest post by Emily Halevy, Director of Media Management Sales at Crawford Media Services. In this blog post, Emily records her interview with Chip Stephenson, Crawford Project Manager, and David Braught, Crawford Logistics Coordinator. Crawford and the AAPB Project Team recently completed the American Archive of Public Broadcasting Digitization Project, funded by the Corporation for Public Broadcasting. Crawford’s role in the project was the coordination and digitization of approximately 40,000 hours of public broadcasting video and audio archival content, as well as the transcoding of approximately 20,000 born-digital files, contributed by more than 100 stations and organizations nationwide!
Now that the digitization is complete, the files will be preserved and made accessible as much as possible through the American Archive of Public Broadcasting, and the AAPB Project Team at WGBH and the Library of Congress is excited to begin working on these efforts. Continue reading below for an account of Crawford’s experience throughout the AAPB digitization project.
Happy New Year, Everyone! I’m delighted to be a guest blogger for the American Archive of Public Broadcasting, once again! As we come to the end of this migration project, I thought this time it would be fun to sit down with Chip Stephenson and David Braught and discuss some of the successes and challenges this project brought. It’s also a great time to reflect on the importance and value of the project as a whole.
Emily: What’s the first thing that comes to mind now that the project is over?
Chip: It’s over? What? We’ve been living it for over three years!
David: It’s hard to believe it’s over.
Chip: Well, it’s not quite over yet. We’re still wrapping up: the engineers are finalizing data, project management is compiling spreadsheets and financials. But we’re almost there.
David: I’ve never worked on anything like this before. The logistics, everything.
Chip: Logistics of shipping, receiving, and accounting for all of the content. And then the amount of data, file configurations, bags, copying files for the individual stations. Over 125 different spreadsheets (audio, video, born digital), plus over 100 stations, which sometimes had multiple spreadsheets. It was more like 100 individual projects than one big project.
David: And every station had its own set of quirks to deal with.
Chip: Every station required multiple phone calls and emails to set things up. It’s an amazing project. The stations were all great to work with and they all had an amazing amount of work to do to make it happen. Some like New Jersey Network and University of Maryland had an incredible amount of content.
David: I’m sure the stations wanted to kill me with the number of emails about checking their files so we could delete them from our system.
Chip: Our engineers were amazing.
David: I can’t say enough good things about our engineers. Guy (Boyd) was able to adapt and push through data, JP (Lesperance) handling all of the born digital, Nathan (Lewis) re-transcoding every single proxy to meet the requirements for the Library of Congress, Herve (Bergeron) and Dr. Dave (Wolaver) switching out and repairing decks.
Chip: And don’t forget the thousands of tapes baked and repaired by Dr. Dave as well.
David: It really was a tremendous team effort.
Emily: We really do have a great team, don’t we? And we can’t leave out the migrators.
Chip: At the peak we had 3 audio migrators running 5 days a week, 24 hours a day. We had 5 video migrators digitizing content, with one pod running 5 days a week, 16 hours a day, and the other pod running 24 hours a day, 5 days a week. For many months we even ran 7 days a week. There were also others just doing QC, and others handling born-digital content: copying files into working storage, checking that they worked, renaming them, and creating the proxy files.
David: Haha! So what was the question again?
Emily: The question was “What’s the first thing that comes to mind now that it’s over?”
David: Evidently everything! Haha!
Chip: You never understand the true complexity of the project until you look back and have time to reflect. Before the project even started, during a visit by Stephanie (Sapienza) and Caitlin (Hammer) from CPB, we were reviewing the process and we all started to realize how complex the overall project was going to be. Caitlin kept asking me, “How are you going to do this?” And my answer was “One station at a time.” Thinking about all of it at once was just overwhelming. So David and I sat down and thought about how we wanted to parse this project out. How do we want to think about this on a daily and weekly basis? So we came up with an operational spreadsheet, which then became two spreadsheets, which then became multiple spreadsheets. And there were times over the past year when we just took a deep breath and said, “Ok. 40 stations down, 60 to go.”
David: It was a constant balancing act. Nothing ended up being accurate in terms of tape counts: more audio, less video, double the ¾”, which is more time-consuming. We had to rearrange our thinking and the pods on a regular basis. And adjust accordingly.
Chip: But working with CPB, and then the transfer of the project host to WGBH, went incredibly smoothly. We had some discussions about what they thought and what we thought, but it was very easy moving through issues and problems as they came up.
David: And we always got great support from CPB and then WGBH.
Emily: What turned out to be the most challenging aspect of the project? (If you could name one thing.)
Chip: For me-
David: Oh! Born digital.
Chip: For me it was the born digital for a couple of reasons…
David: Well you take the issues we had with receiving the physical assets and multiply that times a million.
Emily: The born digital was one of the “orphan items” that wasn’t completely fleshed out when we got started.
Chip: We started the born digital about 8 months later than we’d hoped, and there were many more individual steps dealing with the stations and how they’d build their drives, name their files, and create their spreadsheets. So we had to develop ways to review the file names and correct them to make them legal: spaces had to be replaced with underscores, there could be no illegal characters, they all had to have file extensions, etc. Then we had to combine GUIDs for the project with the individual station’s file name. When you do this with thousands and thousands and thousands of files, it becomes complex. And then we had to create proxy files for all of them. And the process you use to create a proxy of one file type might be different from another file type. And then all of the files needed to be QC’d and compared to the master file. Some stations, when they built their initial hard drives, had a large number of bad files. Sometimes up to 50% of the files were bad. And we had to give the stations time to rebuild. Remember, the whole purpose of this project was to migrate, capture and acquire as many of these files as possible. Migrate as much as we could within the time frame we had to work with, and that time frame was closing in on us.
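The renaming rules Chip describes (spaces to underscores, no illegal characters, required extensions, a project GUID combined with the station’s file name) can be sketched as a small routine. The whitelist, default extension, and GUID format below are illustrative assumptions, not Crawford’s actual implementation.

```python
import re

def normalize_filename(guid, original_name, default_ext=".mov"):
    """Sketch of a born-digital filename cleanup: replace spaces with
    underscores, drop characters outside a conservative whitelist,
    ensure a file extension, and prefix the project GUID to the
    station's original name. Rules are assumptions for illustration."""
    name = original_name.strip().replace(" ", "_")
    # Keep only letters, digits, underscores, hyphens, and dots.
    name = re.sub(r"[^A-Za-z0-9._-]", "", name)
    if "." not in name:
        name += default_ext  # assume a default extension if none present
    return f"{guid}_{name}"

# Hypothetical GUID and station filename:
print(normalize_filename("cpb-aacip-123", "show #42 master.mov"))
# -> cpb-aacip-123_show_42_master.mov
```

Applied across thousands of files per station, even a routine this simple needs careful QC, since collapsing characters can create name collisions between files that were distinct before cleanup.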
Emily: Again- another area where we got great support from Casey and the American Archive team.
Were there any hurdles that turned out to be no big deal?
David: Just getting the content here.
Chip: In the beginning, logistics were slow. We were still trying to figure out the most efficient way to get stuff here.
David: And at the start the stations didn’t really know what they were getting into, but honestly, it went smooth.
Chip: We started to realize- let’s not worry about having too many tapes here, let’s worry about not having enough.
David: KQED for instance, they were ready to ship immediately. So we told Robert (Chehoski), “Alright, let’s bring it on!”
Chip: At one point, we had the equivalent of 65 pallets of assets in our crypt. And of course it was interesting shipping things from Alaska. But every single station helped us find a way to get their assets to us. And every single station, despite issues (time of year, staff reductions, etc.), worked their butts off. They all worked really hard to pull, barcode, pack and ship their tapes to us and make this a success. Between dozens of FedEx shipments, three semi-truck runs across the country and an airline delivery, we managed to get everything here and under budget!
Emily: What did you learn from the project?
Chip: Efficiency. Efficiency. Efficiency. Rethink everything you do and realize there might be a better way to do something. And if it sounds like there might be, try it. When David and I sat down and put a plan together we realized quickly we were too rigid. We needed to be flexible. We had to find compromises throughout the project. There were many times we’d get off the phone with a station and say to each other, “How is this going to work?” We could not be afraid to come up with new solutions for the stations. We had to be receptive to their ideas, especially when it came to timing.
David: It didn’t do any good to stick to a timeline that wouldn’t work for them.
Chip: Initially, our idea was to do all the beta tapes together, then all the DVCPro tapes together, but we ended up digitizing several formats simultaneously.
David: Sometimes even 6 video tape formats simultaneously.
Chip: We had a few stations that had only one or two formats, but most of them had a little of everything.
David: Halfway through the project we realized we were dealing with 20 stations at one time: shipping tapes, migrating, moving data, shipping delivery drives, bagging and backing up file data; literally tracking upwards of 30 stations in a given period.
Chip: So being as flexible as possible was important, because no matter how well you thought you had it figured out, it changed on you. And, honestly, at first we fought it, but then we realized that it just wasn’t going to work. So stop fighting it. We had to maintain the flow of tapes required in order to meet the deadline, and being rigid was not going to get us there.
David: I don’t know if a day went by without asking Dr. Dave to switch out tape decks to accommodate our revised workflows.
Emily: What was your favorite “found” item from the project?
David: For me, it was the famous Akira Kurosawa footage. One of our migrators found that the tape label didn’t match the content. It was labeled as a cooking show, but turned out to be an interview with Kurosawa and George Lucas and Francis Ford Coppola. I was like, “Give me that tape!” It turned out to be a program that was thought lost for many years at the station.
Chip: For me, at one point it was all hands on deck, so I had to QC several hundred files. The content just happened to be all the history of New York City and Boston and The Revolutionary War. WNET had a whole series on the history of Manhattan dating back to the revolution. Growing up in that area, I knew a lot of the city’s history, but I never really knew the intricate history of Manhattan and the Bronx and Queens. I didn’t know that Wall Street really was a wall. I learned there’s a fence in Bowling Green Park, which still exists to this day, that was erected in 1770 to protect a statue of George III. The history in this collection is amazing. Meanwhile, I was supposed to be spending 2-3 minutes QC’ing these files and 20 minutes later I had to stop myself and get back to work!
David: That happened all the time!
Chip: The programming is so great! From arts and symphonies to theatricals, history- everything you can think of from all across the country.
Emily: Hence the “American Archive” project!
Chip: Now that the project is coming to an end, I’m just dealing with the data and the files. We did massive shipments out in October and November. It was amazing. The last truck run went up in the first week of December. Right now we’re just pulling the little tidbits and reviewing everything and making sure we crossed all of our Ts and dotted all of our Is. We’re shipping out LTO tapes to the Library of Congress. And I’m a little sad it’s come to an end. On the other side, it’s a great sense of accomplishment. A year of planning and discussions. Two years of migration. Then changing all of the planning several times throughout. It all comes back to flexibility. Understanding you can’t be rigid.
Yesterday, Louisiana Public Broadcasting and the Louisiana Secretary of State’s Office officially launched the Louisiana Digital Media Archive (LDMA), the home of the Louisiana Public Broadcasting Digital Collection and the Louisiana State Archives Multimedia Collection! This is the first project in the nation to combine the media collections of a public broadcaster and a state archives.
LPB has been a participant in the AAPB since its inception, digitizing civil rights related material in the American Archive Pilot Project (AAPP), inventorying their collection during the American Archive Content Inventory Project (AACIP), and digitizing more than 500 hours as part of the most recent AAPB digitization project funded by the Corporation for Public Broadcasting. Since then, LPB archivist Leslie Bourgeois and her team have continued cataloging their archival material and have digitized more than 1,500 videos, including interviews with Louisiana civil rights heroes, notable political figures, war heroes, artists and literary icons. Thousands of other videos will be added to the collection in the coming years.
The following is a guest post by Producer/Writer Elizabeth Deane.
Every Picture Tells a Story had its premiere in the Great Hall of the Library of Congress in February, 2014, at the launch of the American Archive of Public Broadcasting (AAPB).
Sound and images from six decades of public media filled that stately space, giving the audience a six-minute tip-of-the-iceberg glimpse at some of the treasures that will be part of the AAPB collection.
We’d made the film drawing mostly on media that had already been digitized by the AAPB — the first wave of stories that I had come to think of as locked away, imprisoned on ¾” videotape, VHS and Betacam tapes, ¼” audio tape, DVCPRO and more —the dreaded “obsolete formats” that can be such a barrier to access.
Few stations maintain playback machines for them anymore, and the few machines still in existence can be tricky to maintain and risky to use; if they’re not working properly, they can damage the footage, sometimes irrevocably.
Worse, as Every Picture points out, old videotapes can deteriorate, and the images are lost forever.
I found it heartening to know that even as the launch ceremony unfolded on that wintery day in Washington, trucks containing thousands of video and audio tapes from public stations all over the country were rolling towards Atlanta, where Crawford Media Services would create multiple digital versions of each tape — television and radio shows, raw footage, even outtakes and experiments — in science, natural history, drama, children’s programs, arts, education, history, local lore, news, and more — the entire broad and inspiring realm of public media programming.
Master copies will be kept safe for future generations at the Library of Congress, with access copies going to WGBH to be added to the growing AAPB database, and made available on a forthcoming website, when rights permit, to a national audience – researchers and scholars, filmmakers, educators, students, and kids of all ages. In addition, all of the digitized materials will be made available to researchers who visit WGBH and the Library’s Moving Image and Recorded Sound Research Centers.
The film is a celebration of the American Archive of Public Broadcasting at its moment of birth, just beginning to tap into its vast collection. “As of this posting close to a year later, all of it has been digitized,” says AAPB Project Manager Casey Davis. “But much of it came with only a brief description. Now we have the pleasure of watching and listening, so we can improve our records and make this remarkable collection more discoverable for all.”
Watch for the new AAPB website, set to launch with the first batch of records in April 2015, with video and audio to follow in October.
WGBH, in collaboration with the Library of Congress, has been awarded a grant from the Council on Library and Information Resources (CLIR) to lead the National Educational Television (NET) Collection Catalog Project, the first project to build upon the American Archive of Public Broadcasting initiative. This project will create a national catalog of records documenting the existence of, and robustly describing, titles distributed by NET, public media’s first national network, whose programs are among public media’s earliest and most at-risk content.
The NET Collection includes 8,000–10,000 programs produced from 1952 to 1972, a period marked by societal and cultural shifts of great importance. Public television itself changed significantly during this time. From its early dedication to childhood and adult education, NET by 1963 had transitioned to serving adult audiences with documentaries exploring urgent issues of citizenship and with cultural programming dedicated to the arts, humanities and sciences.
The NET Collection is an invaluable record of non-commercial TV programming on public affairs, social issues, arts, culture, the humanities, science and education. NET programs, most of which were created by 30 public television stations across the US, often covered topics of international relevance. During this time period, public affairs documentaries and discussions explored the Civil Rights Movement, the Vietnam War, the Cuban Missile Crisis, poverty, student activism and issues such as radicalism, privacy, the environment, the elderly and welfare. The NET Collection includes reporting on the Vietnam War, interviews with American and Vietnamese leaders, public hearings and a controversial report from North Vietnam. Arts and cultural programming includes interviews with artists, poets, writers, filmmakers, actors and dancers. Science Reporter and other series cover issues such as the latest in medical advances, space exploration and the progressive steps that led to the 1969 moon launch. Educational programming includes materials for classroom use, innovative children’s programming and adult education programs. The catalog will help to prioritize titles for preservation and will make this hidden collection known to scholars, researchers, and the public.
Few NET titles are known to scholars because they are in unprocessed collections. WGBH, WNET, Indiana University and the Library of Congress hold the largest collections of NET materials, and copies are known to exist at some of the original producing stations. Currently, programs are scattered, descriptions are limited and buried in obscure sources, and there is no publicly accessible catalog of titles.
With the NET Collection inventoried and made accessible, television studies scholars can embark on in-depth studies of NET, access its innovative series, compare commercial and noncommercial television, and examine programs that deal with bias in newscasts, effects of television on politics, effects on children, and federal involvement in public broadcasting, with perspectives from FCC and NAB officials, network executives, critics and scholars.
A huge chunk of the project will be accomplished thanks to CLIR’s funding; however, the project team envisions more work that could be undertaken to enhance the NET collection catalog. This includes the incorporation of Indiana University’s collection of titles into the catalog and expanding the scope of description and access activities; the project team will be seeking additional funds for this work. The AAPB team will keep stations and others updated as we move forward with the project. We’re looking forward to being in touch with all NET-era stations in the next several months!
Happy Thanksgiving! We hope everyone’s celebrations will be wonderful this year.
In remembrance of Thanksgivings past, we want to share this historic audio clip with you today. On November 28, 1963, newly sworn-in President Lyndon B. Johnson delivered a speech honoring President John F. Kennedy, who had died just a week earlier. His message is one of thanks and hope, as he details the coming together of the American people and the people of the world in the face of such a tragedy. As part of his address, he says,
“A great leader is dead. A great nation must move on. Yesterday is not ours to recover, but tomorrow is ours to win or to lose. I am resolved that we shall win the tomorrows before us. So I ask you to join me in that resolve, determined that from this midnight of tragedy we shall move toward a new American greatness. More than any generation before us, we have cause to be thankful, so thankful, on this Thanksgiving Day. Our harvests are bountiful, our factories flourish, our homes are safe, our defenses are secure. We live in peace. The goodwill of the world pours out for us.”
Listen to the entire speech:
This audio came to the AAPB as part of a WGBH radio broadcast that was recorded onto ¼ inch audio tape. It was digitized as part of the 40,000 hours of digitization funded by CPB.
This post was written by Sadie Roosa, AAPB team member at WGBH.
These records were created during the CPB-funded and WGBH-managed American Archive Content Inventory Project (AACIP), an inventory effort to gather item-level PBCore data on legacy at-risk audiovisual assets from public media stations across the nation: from KEXP in Seattle to Unalaska Community Broadcasting to Ozarks Public Broadcasting. Public media stations then selected video and audio from their own collections for digitization, many of them local programs never before seen outside their immediate geographic communities.
While the American Archive of Public Broadcasting wraps up the digitization of these 40,000 hours of selected content, begins cataloging the digitized material, and develops our digital archive website, we’re excited to offer access to the almost 2.5 million records collected as part of the AACIP, now available through the Interim Access Portal.
Almost all of these records were created before stations or archivists had the capability of playing back the content stored on increasingly obsolete video and audio formats. The now-in-progress digitization of 40,000 hours of this content will allow catalogers to view and fully describe the content. So if you don’t find what you’re looking for now, that doesn’t mean it doesn’t exist. This data might be irregular at the moment, but we’re excited to expose it to the public for faceted browsing and to let you track our progress as we improve our records and expose content. In the spring, further work will be complete and normalized data will be exposed to the public via our online digital archive, currently in the works.
Please don’t hesitate to contact the project team with any questions and research requests.
On October 23, 2014, the AMIA PBCore Advisory Subcommittee’s Education Team offered a webinar titled “PBCore: A How-to and Why-to Webinar.” The presenters offered contextual background; explained the benefits and reasons why PBCore is well suited for managing audiovisual collections; offered step-by-step guidance on inventorying AV assets and getting started with PBCore; and described the use of PBCore in different settings, such as asset management, digital preservation, archival description, and use with other schemas such as PREMIS and METS.
The PBCore Advisory Subcommittee is encouraged by the renewed energy among archivists, librarians, and others managing media collections who are beginning to deal with their deteriorating AV collections, as well as their digital video and audio collections, and we are confident that PBCore has a place in these efforts. PBCore is uniquely suited to providing a standard way to record and manage metadata for video and audio.
We look forward to providing more opportunities like this webinar in the future, as well as improving the schema over the next few months, clarifying and improving documentation, creating our new website, and generating new PBCore resources.
Many thanks to all of those who attended the webinar, and if you have any questions, please don’t hesitate to reach out to the presenters whose email addresses I have listed below:
Casey E. Davis, WGBH | casey_davis [at] wgbh [dot] org
Maureen McCormick Harlow, PBS | mmharlow [at] pbs [dot] org
Sadie Roosa, WGBH | sadie_roosa [at] wgbh [dot] org
Morgan Oscar Morel, George Blood Audio Video Film | moran.morel [at] georgeblood [dot] com
Enjoy the recording and please feel free to share it among your colleagues and networks!
(The chat text is best readable when the video is viewed in full-screen.)