Europeana 1914-1918 thematic collection launches during Europeana Transcribathon Campus Berlin 2017

Officially launching the new Europeana 1914-1918 thematic collection, Europeana Transcribathon Campus Berlin 2017 marks the next milestone for the crowdsourcing digital archive dedicated to the historical conflict, and puts a spotlight on the involvement of its community.

On 22 and 23 June, the Berlin State Library will host the Europeana Transcribathon Campus Berlin 2017. Over two days, teams from three generations and several European countries will compete to digitally transcribe as many World War One documents as possible, and link them to other historical sources such as early 20th century newspapers. Transcribathons gather people from across Europe and online to create digital versions of handwritten items found on Europeana 1914-1918. These innovative events are the latest crowdsourcing initiative enriching the Europeana 1914-1918 digital archive. Since their launch in November 2016, several million characters across more than 12,000 documents, from love letters to poems, have been transcribed.

Frank Drauschke, of the Europeana 1914-1918 project team, says: “Most sources on Europeana 1914-1918 are written by hand, and often hard to decipher. Transcribathon aims to help us ‘polish’ a raw diamond by making private memorabilia readable online. We utilise the power of our community to transcribe as many private stories and documents as possible, from diverse languages and regions of Europe, and make them available to the public.”

These unique resources on Europeana 1914-1918 have been collected and digitized since 2011 during collection days and online uploads inviting people to submit their personal documents. During Europeana Transcribathon Campus Berlin 2017, Europeana 1914-1918, previously living on a separate website, will officially move platform and re-launch as a new Europeana thematic collection. This move onto the Collections site aims to broaden the current audience by opening up World War One related content to all Europeana visitors and to enrich their experience. People can now discover digital versions of testimonies handwritten 100 years ago, complemented by millions of digitized newspapers and documents provided by libraries and archives. Linking user-generated content with other historical sources makes it possible to view them within the bigger picture. And thanks to the ability to search across the Europeana platform, people can now also easily access related items from the other four thematic collections: Europeana Art, Music, Fashion and Photography.

Europeana Transcribathon Campus Berlin 2017 is organized by Europeana, Facts & Files and the Berlin State Library, in cooperation with the German Digital Library and Wikimedia.


Europeana is Europe’s digital platform for cultural heritage, collecting and providing online access to tens of millions of digitized items from over 3,500 libraries, archives, audiovisual collections and museums across Europe, ranging from music, books, photos and paintings to television broadcasts and 3D objects. Europeana encourages and promotes the creative reuse of these vast cultural heritage collections in education, research, tourism and the creative industries.

Europeana Collections are the result of a uniquely collaborative model and approach: the web platform is provided by Europeana, the content comes from institutions across Europe, while consortiums provide the theme and editorial expertise to bring the content alive for visitors through blogs and online exhibitions.

Europeana 1914-1918 is a thematic collection that started as a joint initiative between the Europeana Foundation, Facts & Files, and many other European partner institutions. It originates from an Oxford University project in 2008. Since 2011, over 200,000 personal records have been collected, digitized and published. Collection days have now expanded to over 24 countries across Europe, building up an enthusiastic community of about 9,000 people.

Europeana Transcribe is a crowdsourcing initiative that allows the public to add their own transcriptions, annotations and geo-tags to sources from Europeana 1914-1918. Developed by Facts & Files and Olaf Baldini, piktoresk, the website is free to use and open to all members of the public. New contributors can now register and submit their own stories within the Europeana Collections site.

Europeana Newspapers is making historic newspaper pages searchable by creating full-text versions of about 10 million newspaper pages. www.europeana-newspapers.eu

Europeana DSI is co-financed by the European Union’s Connecting Europe Facility.


International Survey on Advanced Documentation of 3D Digital Assets

The e-documentation of Cultural Heritage (CH) assets is inherently a multimedia process and a great challenge. It is addressed through digital representation of the shape, appearance and conservation condition of the heritage object, for which the 3D digital model is expected to become the standard representation. 3D reconstructions should progress beyond current levels to provide the necessary semantic information (knowledge/story) for in-depth studies and use by researchers and creative users, offering new perspectives and understandings. Digital surrogates can add a laboratory dimension to on-site explorations, opening new avenues in the way tangible cultural heritage is addressed.

The generation of high-quality 3D models is still very demanding, time-consuming and expensive, not least because modelling is carried out for individual objects rather than for entire collections, and because the formats used in digital reconstructions/representations are frequently not interoperable and therefore cannot easily be accessed, re-used or preserved.


This 15-20 minute survey aims to gather your advice concerning the current and future e-documentation of 3D CH objects. We would appreciate your taking the time to complete it.

Please access the survey HERE

Your responses are voluntary and will be confidential. Responses will not be identified by individual; all responses will be compiled together and analyzed as a group. The results of this survey will be published before the end of this year on the channels of Europeana Professional (pro.europeana.eu), CIPA (Comité International de Photogrammétrie Architecturale – http://cipa.icomos.org/), Digital Heritage Research Lab (http://digitalheritagelab.eu/dhrlab/lab-overview/) and Digitale Rekonstruktion (http://www.digitale-rekonstruktion.info/uber-uns/).


VIEW Journal Celebrates Fifth Anniversary with New Interface

VIEW Journal started five years ago as the first peer-reviewed, multimedia and open access e-journal in its field. The online open access journal now has a fresh new look. Its new interface makes reading and navigation easier. More importantly, it now offers room for discussion – with the possibility to leave comments and responses under every article. Articles still feature embedded audiovisual sources. The journal continues to provide an online reading experience fit for a 21st century media journal.


Fifth Anniversary

VIEW Journal was started by EUscreen and the European Television History Network. It is published by the Netherlands Institute for Sound and Vision in collaboration with Utrecht University, Université du Luxembourg and Royal Holloway, University of London. A heartfelt thank you goes to all the authors, the editorial board and the team, who have worked hard over the years to build a renowned journal.

For the past five years, VIEW has published two issues per year. The journal’s aim – to offer an international platform in the field of European television history and culture – still stands. It reflects on television as an important part of our European cultural heritage and is a platform for outstanding academic and archival research. The journal was and remains open to many disciplinary perspectives on European television, including but not limited to television history, television studies, media sociology, media studies, and cultural studies.

Issue 10: Non-Fiction Transmedia

With the new design, the journal also proudly presents its 10th issue, on Non-fiction Transmedia. This issue was co-edited by Arnau Gifreu-Castells, Richard Misek and Erwin Verbruggen. The issue offers a scholarly perspective on the emergence of transmedia forms; their technological and aesthetic characteristics; the types of audience engagement they engender; the possibilities they create for engagement with archival content; the technological predecessors that they may or may not have emerged from; and the institutional and creative milieux in which they thrive.

You can find the full table of contents for the new issue below. We wish you happy reading and look forward to your comments on the renewed viewjournal.eu.

 

Table of Contents

EDITORIAL

DISCOVERIES

EXPLORATIONS


MediaConch in action! Issue #1

Hey Eddy! Introduce yourself please.

Hey Ashley! I’ve recently become a “Denverite” and have started a new job as an Assistant Conservator specializing in electronic media at the Denver Art Museum (DAM). Before that, I was down in Baton Rouge, Louisiana working as a National Digital Stewardship Resident with Louisiana Public Broadcasting (LPB). When I’m not working, I like to listen to podcasts, read comics, play bass, and stare into the endless void that is “twitter” (@EddyColloton). I’m on a big H. P. Lovecraft kick right now so let me know if you have any recommendations (Dunwich Horror is my current fav, but At the Mountains of Madness is a close second).

What does your media ingest process look like? Does your media ingest process include any tests (manual or automated) on the incoming content? If so, what are the goals of those tests?

The ingest procedures I’m using for the Denver Art Museum are pretty different from the ones we worked out at Louisiana Public Broadcasting, for all kinds of reasons. The two institutions have very different types of collections, and they use their repositories very differently, too.

At the Denver Art Museum, ideally, material will enter the digital repository upon acquisition. Ingest procedures need to be tailorable to the eccentricities of a particular media artwork, and flexible enough to cover the wide array of media works that we acquire (websites, multi-channel video installations, software-based artworks, or just a collection of TIFF files). With this in mind, we’re using Archivematica for ingest of media into our digital repository. It allows us to automate the creation of METS-wrapped PREMIS XML documentation, while manually customizing which microservices we choose to use (or not use) as we ingest new works. Some of the microservices I use on a regular basis are file format identification through Siegfried, metadata extraction with MediaInfo, ExifTool, and Droid, and normalization using tools like FFmpeg.
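For a rough sense of what two of those microservices do under the hood, the equivalent standalone commands look something like this (a minimal sketch with a placeholder file name; Archivematica wires these tools together differently inside its pipeline):

# identify the file format against the PRONOM registry using Siegfried
sf -json artwork.mov > artwork.siegfried.json

# extract technical metadata with MediaInfo as an XML sidecar
mediainfo --Output=XML artwork.mov > artwork.mediainfo.xml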

Things couldn’t be more different at LPB. All completed locally produced programming automatically becomes part of the LPB archive. The LPB Archive is then responsible for preserving, describing and making that content accessible through the Louisiana Digital Media Archive (LDMA), located at http://www.ladigitalmedia.org/. LPB’s ingest procedures need to allow for a lot more throughput, but there’s much less variability in the type of files they collect compared to the DAM. With less of a need for manual assessment, LPB uses an automated process to create MediaInfo XML files, an MD5 checksum sidecar, and a MediaConch report through a custom watchfolder application that our IT engineer, Adam Richard, developed. That code isn’t publicly available unfortunately, but you can see the scripts that went into it on my GitHub.
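In spirit, that watchfolder pass boils down to a loop like the following (a hedged sketch, not LPB’s actual code; the folder path and policy file name are placeholders):

# for each new file, write a MediaInfo XML sidecar, an MD5 sidecar,
# and a MediaConch policy report alongside it
for f in /watchfolder/*.mov; do
  mediainfo --Output=XML "$f" > "$f.mediainfo.xml"
  md5sum "$f" > "$f.md5"
  mediaconch -p lpb_policy.xml "$f" > "$f.mediaconch.txt"
done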

Where do you use MediaConch? Do you use MediaConch primarily for file validation, for local policy checking, for in-house quality control, for quality testing for vendor files?

Primarily policy checking, as a form of quality assurance. At LPB, most of our files were being created through automated processes. Our legacy material was digitized using production workflows, to take advantage of existing institutional knowledge. This was very helpful, because we could then repurpose equipment, signal paths, and software. But using these well-worn workflows also meant that occasionally files would be encoded incorrectly, aspect ratio being one of the most common errors. We would check files against a MediaConch policy as a way of quickly flagging such errors, without having to invest time watching and reviewing the file ourselves.
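To give a flavour of catching that aspect-ratio error, a one-rule policy could be written and applied like this (a sketch only, not one of LPB’s actual policies; MediaConch rules test MediaInfo fields such as DisplayAspectRatio):

# write a minimal policy that expects a 4:3 display aspect ratio
cat > dar_4_3_policy.xml <<'EOF'
<?xml version="1.0"?>
<policy type="and" name="DAR is 4:3">
  <rule name="DAR equals 1.333" value="DisplayAspectRatio" tracktype="Video" occurrence="*" operator="=">1.333</rule>
</policy>
EOF

# check a capture against it; a fail flags the file for review
mediaconch -p dar_4_3_policy.xml capture.mov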

At the Denver Art Museum, we plan to use MediaConch in a similar way. The videotapes in the museum’s collection will be digitized by a vendor. Pre-ingest, I plan to do tests on the files for quality assurance. After fixity checks, I will check to make sure our target encoding and file format was met by the vendor using MediaConch. I intend to use the Carnegie Archive’s python script from their GitHub to automate this process. Once I know that the files are encoded to spec, I will be creating QCTools reports and playing back the files for visual QC. I’ve been following the American Archive of Public Broadcasting’s QC procedures with interest to see if there’s any tricks I can cop from their workflow.
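Strung together, that pre-ingest pass would look roughly like this (a sketch under assumptions: a vendor-supplied checksum manifest and a policy file for the target spec, both hypothetical names; the Carnegie script automates the MediaConch step, and qcli is QCTools’ command-line tool):

# 1. verify fixity against the vendor's manifest
md5sum -c vendor_manifest.md5

# 2. confirm the target encoding and wrapper were met
mediaconch -p target_spec_policy.xml vendor_masters/tape001.mkv

# 3. generate a QCTools report for visual QC
qcli -i vendor_masters/tape001.mkv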

At what point in the archival process do you use MediaConch?

Basically pre-ingest for both LPB and the DAM. When using MediaConch as a policy checker, my goal is to make sure we’re not bothering to ingest a file that is not encoded to spec.

Do you use MediaConch for MKV/FFV1/LPCM video files, for other video files, for non-video files, or something else?

I use MediaConch for MKV/FFV1/LPCM video files and for other types of video files as well. At LPB we were using MediaConch as a policy checker with IMX50-encoded files in a QuickTime wrapper and H.264-encoded files in a .mp4 wrapper. You can find the policies I created for LPB here, and I talk through the rationale of creating those policies in the digital preservation plan that I created for LPB, available here (MediaConch stuff on page 24, and page 45). I’m happy to report that LPB is currently testing a new workflow that will transcode uncompressed .mov files into lossless MKV/FFV1/LPCM files (borrowing heavily from the Irish Film Archive’s lossless transcoding procedures, as well as the CUNY TV team’s “make lossless” microservice).

At the DAM, we’ll be using MediaConch as a policy checker with QuickTime/Uncompressed/LPCM files, and for validation of our MKV/FFV1/LPCM normalized preservation masters.

My understanding is that MediaConch is going to be integrated into Archivematica’s next release. I’m really looking forward to that update, since at the DAM we have decided to create MKV/FFV1/LPCM files for any digital video in the collection that uses a proprietary codec, or an obsolete format. A lot of the electronic media in the museum’s design collection comes from the AIGA Archives, which the DAM collects and preserves. A ton of the video files from the AIGA Archives were created in the aughts, and they use all kinds of whacky codecs – my favorite so far is one that MediaInfo identifies as “RoadPizza” (apparently a QuickTime codec). Given that I don’t want to rely on the long-term support of the RoadPizza codec, we’re normalizing files like that to MKV/FFV1/LPCM through an automated Archivematica micro-service that uses the following FFmpeg script (which I cobbled together using ffmprovisr):

ffmpeg -i input_file -c:v ffv1 -level 3 -g 1 -slicecrc 1 -slices 16 -c:a pcm_s16le output_file.mkv
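For what it’s worth, those flags map onto FFV1’s preservation-oriented features: -level 3 selects FFV1 version 3, -g 1 keeps every frame intra-coded, -slicecrc 1 adds a CRC to each slice so corruption can be localized, -slices 16 splits each frame into 16 slices, and -c:a pcm_s16le stores the audio as 16-bit LPCM.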

To be clear, we are also keeping the original file, and just transcoding a second version of the file to be cautious.

Through that implementation we intend to use the Archivematica MediaConch microservice to validate the encoding of the video files that we have normalized for preservation.

Why do you think file validation is important to the field?

I wish this was a joke but it honestly helps me sleep better at night. MediaConch and melatonin make for a well rested AV archivist/conservator, hah. I like knowing that the video files that we are creating through automated transcoding processes are up to spec, and comply with the standards being adopted by the IETF.

Also, using MediaConch as a policy checker saves me time, and prevents me from missing bonehead mistakes, of which there are loads, because I work with human beings (and possibly some aliens, you never know).

Anything else you’d like to add?

Just want to offer a big thanks to the MediaConch team for everything that they do! I know there’s a pretty big overlap betwixt team Conch, CELLAR, and the QCTools peeps – you’re all doing great work, and regularly making my job easier. So thanks for that.

To read more from Eddy, check out his Louisiana Public Broadcasting Digital Preservation Plan


veraPDF 1.6 released

The latest release of veraPDF is available to download. The validation logic and test corpus of veraPDF 1.6 have been updated to comply with the resolutions of the PDF Association’s PDF Validation Technical Working Group (TWG). The TWG brings together PDF technology experts to analyse PDF validation issues in a transparent way. It also connects veraPDF to the ISO committee responsible for PDF/A.

The GUI and command line applications feature a new update checker which lets you know if you are running the latest version of the software. If you’re using the GUI application, select “Help->Check for updates”; command line users can type “verapdf --version -v” to ensure they have the latest features and fixes.
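Once you are up to date, a basic validation run from the command line looks something like this (a sketch with a placeholder file name; the report is written as XML to standard output):

# validate a file against the PDF/A-1b profile and save the report
verapdf --flavour 1b document.pdf > report.xml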

Other fixes and improvements are documented in the release notes: https://github.com/veraPDF/veraPDF-library/releases/latest

 

Download veraPDF

http://www.preforma-project.eu/verapdf-download.html

 

Help improve veraPDF

Testing and user feedback is key to improving the software. Please download and use the latest release. If you experience problems, or wish to suggest improvements, please add them to the project’s GitHub issue tracker: https://github.com/veraPDF/veraPDF-library/issues  or contact us through our mailing list: http://lists.verapdf.org/listinfo/users.

User guides and documentation are published at: http://docs.verapdf.org/.

 

PREFORMA International Conference – Shaping our future memory standards

To find out more about veraPDF and the PREFORMA project, join us at the PREFORMA International Conference in Tallinn on 11-12 October 2017. For more information see: http://finalconference.preforma-project.eu/.

 

About

The veraPDF consortium (http://verapdf.org/) is funded by the PREFORMA project (http://www.preforma-project.eu/). PREFORMA (PREservation FORMAts for culture information/e-archives) is a Pre-Commercial Procurement (PCP) project co-funded by the European Commission under its FP7-ICT Programme. The project’s main aim is to address the challenge of implementing standardised file formats for preserving digital objects in the long term, giving memory institutions full control over the acceptance and management of preservation files into digital repositories.


NEM Summit 2017 – call for abstracts

The NEM Summit is an international conference and exhibition, open to co-located events and organised every year since 2008 by the NEM Initiative (New European Media – European Technology Platform – www.nem-initiative.org) for all those interested in the broad area of Media and Content. Over the years, the NEM Summit has grown to become an annual not-to-be-missed event, providing attendees with a key opportunity to meet and network with prominent stakeholders, access up-to-date information, discover the latest technology and market trends, identify research and business opportunities, and find partners for upcoming EU-funded calls for projects.


The 10th edition of the NEM Summit conference and exhibition will be held in the Spanish capital, Madrid, at the exciting venue of the Museo Reina Sofía. Please reserve these dates to attend the NEM Summit 2017 and take part in discussions on the latest developments in European media, content, and creativity.

NEM Summit 2017 – Call for Extended Abstracts

  • Extended abstracts are expected to be two A4 pages long, with the possibility to provide further supporting information
  • Extended abstracts must be submitted by 26 June 2017
  • Fast-track evaluation of the received contributions will be applied, and results will be known by 26 July 2017
  • More information can be found in the attached Call for Extended Abstracts
  • The submission portal is available on the NEM Initiative website at www.nem-initiative.org

Further details about the NEM Summit 2017, further opportunities to participate and exhibit at the event, and online Summit registration will be provided soon on the NEM Initiative website at www.nem-initiative.org.

Download the call for abstracts (PDF, 91 kb)

 


Photoconsortium Annual Event, hosted by CRDI. Public seminar and general assembly.

Img. Ajuntament de Girona, Vista exterior de l’absis de la Catedral de Girona, Public Domain.

Hosted by Photoconsortium member CRDI, the 2017 Annual Event of Photoconsortium is organized in the beautiful city of Girona (Spain), seat of an important audiovisual archive of millions of photographs, films, hours of video and hours of sound recordings, mostly from private sources.

The archive is managed by CRDI, a body of the city’s Municipality created in 1997 with the mission to discover, protect, promote, provide and disseminate the cinematographic and photographic heritage of the city of Girona.

Photos and follow-up


Friday 9th June 2017

PUBLIC SEMINAR: PHOTOCONSORTIUM INFORMATIVE SESSION

Languages of the seminar: English, Catalan and Spanish

Chair: David Iglésias, Officer of Photoconsortium Association

09:45 Welcome message by Joan Boadas, Director of CRDI

10:00 Prof. Fred Truyen, KU Leuven. President of Photoconsortium Association. Presenting the Photography Collection in Europeana

10:15 Antonella Fresa, Promoter Srl. Vice-president of Photoconsortium Association. Photoconsortium, the expert hub for photography

10:30 Pierre-Edouard Barrault, Operations Officer at Europeana. Publishing in Europeana – Tools, Data & good practices

11:30 Coffee break and networking

12:00 Sílvia Dahl, Laia Foix. Photography deterioration terminology. A proposal to broaden the EuropeanaPhotography vocabulary.

12:30 Pilar Irala. The Jalon Ángel collection.

13:00 Debate

14:00 Lunch

15:00 – 16:00 Visit to the Cinema Museum


On the day before, 8th June, the General Assembly of Photoconsortium members took place.

 

 


PREFORMA hands-on sessions in Europe

Photo credit: CC BY-SA Sebastiaan Ter Burg.

During May 2017, successful hands-on sessions and workshops were organised by PREFORMA in several places to explain to participants what conformance checking means, why file format validation is so important in long-term digital preservation, how to create their own policy profiles, and how to download, install, configure and use the conformance checkers to analyse their files. These workshops invite participants to bring their own files and analyse them with the PREFORMA tools.

 

Barcelona, 10 May 2017

The first hands-on session was organised in Barcelona with members of the Official Association of Librarians (Col·legi Oficial de Bibliotecaris i Documentalistes de Catalunya – COBDC), to show the functionalities offered by the DPF Manager to check TIFF files.

It took place in the premises of COBDC on the 10th of May, and the session was conducted by Sònia Oliveras from the Girona City Council and Xavier Tarrés and Víctor Muñoz from Easy Innova. The attendees, who manage large amounts of TIFF files, came mostly from local and national memory institutions and were not previously aware of any file format conformance checker. The tools offered by the PREFORMA project were the first solution they had encountered for checking file format conformance.

 

Amsterdam, 28 May 2017


Photo credit: CC BY-SA Sebastiaan Ter Burg.

A second hands-on session was organised in Amsterdam in the framework of The Reel Thing XL workshop, focusing on the challenges of using FFV1 and MKV for film digitisation.

Presentations were delivered by Erwin Verbruggen (Netherlands Institute for Sound and Vision), introducing PREFORMA and the AV challenges; Jérôme Martinez (MediaArea.net), introducing MediaConch; Eva Verdoodt & Noortje Verbeke (VIAA), presenting their film digitisation workflow and considerations using FFV1/MKV; and Reto Kromer (reto.ch), presenting the latest developments in FFV1 standardisation as far as colour information for films is concerned.


Photo credit: CC BY-SA Sebastiaan Ter Burg.

The session was closed by a panel discussion, guided by the British Film Institute, on the practical thresholds for implementing FFV1/MKV in film digitisation. The discussion covered the importance of the standardisation effort, as well as the possibilities for crowdfunding further development of the standards and the MediaConch tool.

The workshop was attended by 25-30 participants, mainly from film archives and film scanning services, including the British Film Institute, the Irish Film Institute, INA, the Catalan Film Archive, the Austrian Film Archive, the German Film Archive, VIAA, Sound & Vision, and Picturae Digitisation Services.

 

Quedlinburg, 29 May 2017

The third hands-on session was organised in Quedlinburg by SPK in cooperation with the Museum Association of Saxony-Anhalt, focusing again on the DPF Manager.

The session was embedded in a general meeting of the Working Group Digitisation of the Museum Association (AG Digitalisierung MVSA). There were 22 participants, mainly from medium-sized and small museums: museum directors, curators and IT staff. There was a general introduction on file formats for digital preservation, especially for text and images, followed by an introduction to PREFORMA and the tools created in the project.

Participants brought their own laptops and installed the DPF Manager. After that, the functionalities available in the GUI version were explained, and participants were able to try both the conformance checker and the policy checker.

In the end, the participants agreed that the DPF Manager is a valuable tool for their digitisation work, not only for digital preservation but also for checking image files produced by external companies in the framework of a museum’s digitisation project.

 

Stockholm, 30 May 2017


Photo credit: CC BY-SA Sebastiaan Ter Burg.

Finally, the last hands-on session organised by PREFORMA in May was hosted at the National Archives of Sweden and focused on PDF/A and on the use of the veraPDF conformance checker. The session brought together 21 people working with archives and records management issues at public and private memory institutions, state and municipal agencies and organisations, and in SMEs.

The Riksarkivet team introduced the seminar, followed by a brief walkthrough of the PDF/A format. The hands-on part was divided into two main blocks: the first focusing on conformance checking, the second on policy checking. Each block began with a demonstration by the Riksarkivet team, followed by practical exercises in which the participants used veraPDF to check the conformance and policy compliance of their own sample files.

The seminar ended with an informal discussion on the results of the seminar and whether the participants’ initial expectations had been met. These expectations were mainly about learning more about PDF/A to better understand its “pros and cons”, but also about learning more about validation and the PREFORMA conformance checkers. The overall feedback from the participants was very positive, and participants were interested in continuing to follow the developments of the PREFORMA project, possibly as part of a Swedish informal reference group.

 

Additional workshops and seminars will be organised by the PREFORMA partners after the summer. Stay tuned at www.preforma-project.eu!


Preservation courses at the IS&T conference Archiving 2017

House of Blackheads and St Peter’s Church Tower at dusk, Riga, Latvia. Photo by David Iliff, CC BY-SA 3.0.

 

Riksarkivet (the Swedish National Archives) and PACKED, together with veraPDF and Easy Innova, were invited to the conference to arrange a course about formats for preservation on the 15th of May. The framework of the course was based on the work done in Riksarkivet’s research and development programme ArkivE 2.0 – fundamental principles for the selection of formats – which was applied within the PREFORMA project.

 

The participants were introduced to an abstract, generic overview of the meaning of “format” and of digital preservation. Within that framework, the value and importance of a conformance checker, and what makes a format appropriate in a specific use case, were explored. The course covered the technical, legal and archival challenges facing those who work with digital preservation and are handed the task of recommending and selecting “preservation formats”.

 

A demonstration of the difference between file identification and conformance checking provided the participants with a much-appreciated practical connection to the theoretical presentation. A more concrete and detailed view of formats was also given through the presentations of veraPDF on PDF/A, and Easy Innova on TIFF and TI/A.

 

The course had 12 pre-registered participants (mainly librarians, academics, photographers and digitisation companies), of whom 11 attended and 10 gave their evaluation through the IS&T feedback form. Participants’ backgrounds included the Swedish Media Conversion Center (formally part of Riksarkivet), the Latvian National Library, the Dutch City of Leiden Heritage Network and ancestry.com. The overall feedback was very positive.


MediaConch Newsletter #10 – June 2017

Updates

We are pleased to announce the progress we’ve made this year. There are new updates to MediaConch that expand its capabilities, user stories to share, and new events related to implementation checking and the standardization of open audiovisual formats.

The MediaConch team recently collaborated with VIAA on tests to migrate their JPEG2000/MXF collection to FFV1/Matroska. Learn more about our process and findings!

The development of MediaConch has been closely following the work of the IETF CELLAR working group which is creating specifications for EBML, Matroska, FFV1 and FLAC. Review the latest versions of those documents.

Some Highlights:

  • MediaConch optimized its FFV1 parser, which will allow upcoming versions to provide more comprehensive implementation checking of FFV1 against its specification.
  • We improved the Matroska checker, particularly to support the CELLAR working group’s development of the EBML Schema, which defines Matroska’s structure in a manner similar to an XML Schema.
  • Attachments to Matroska files may now be analyzed with other implementation checkers. For instance, MediaConch can now use veraPDF or DPF Manager to assess PDF or TIFF attachments.
  • In collaboration with the Tate Museum, we have added a TN2162 policy checker to MediaConch. This policy assesses uncompressed video in QuickTime against Apple’s list of additional requirements that affect that combination of formats.
  • Several bugs were reported and fixed. Thanks to our users for their reports!
  • See what’s new in MediaConch’s GUI and CLI!

Jérôme, Reto, and Kieran recently presented MediaConch and CELLAR standardization at the Reel Thing Conference in Amsterdam.


The presentations and conversations at the Reel Thing demonstrated a growing interest in the standardization process of Matroska and FFV1 as well as methods to integrate these formats into preservation environments. Conversations focused on use of these formats for film preservation challenges and collaborative work to develop more tools around lossless video.

Next steps:

  • “FrameMD5” computing in MediaConch, in order to compare the resulting file to the source file, pixel by pixel, after a conversion to Matroska/FFV1, provided the conversion includes a “FrameMD5” of the source file (see the sketch after this list).
  • Adding more information (specific location of the error in the bitstream) when FFV1 validation fails.
  • Stabilization of the software, with bug fixes and more automatic non-regression tests.
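To illustrate the idea behind the FrameMD5 item above, this is roughly what such a comparison looks like with FFmpeg today (a hedged sketch with placeholder file names; the planned MediaConch feature would build on the same per-frame hashes):

# hash every decoded frame of the source and of the FFV1/Matroska derivative
ffmpeg -i source.mov -f framemd5 source.framemd5
ffmpeg -i derivative.mkv -f framemd5 derivative.framemd5

# identical per-frame hashes mean the conversion was pixel-for-pixel lossless
diff source.framemd5 derivative.framemd5 && echo "lossless"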

Ashley recently interviewed Eddy Colloton from the Denver Art Museum about his process and experience using MediaConch. Read it here!

 

Upcoming Events

June 21st, 2017 4:45 – 5:30 pm PDT: Ashley will discuss MediaConch and format standards in her talk How Open Source Audiovisual Tools Help Archivists (And You Too!) at Open Source Bridge this week in Portland, OR.

July 10th, 2017 2:00 – 5:00 pm EDT: Dave will be hosting An Archivists’ Guide to Matroska workshop at the Metropolitan New York Library Council.

July 19th, 2017 15:20 – 16:50 CEST: CELLAR (Codec Encoding for LossLess Archiving and Realtime transmission) will hold its second face-to-face meeting during IETF 99 at the Hilton in Prague. The final agenda will be published on June 23rd, 2017. Tessa and Jérôme will be attending in person.

September 20th, 2017 from 9:45a – 11:15a CEST: Jérôme and Dave will host a workshop: Checking Audiovisual Conformance, for IASA at the Ethnological Museum of Berlin.

October 11th – 12th, 2017: The PREFORMA International Conference will be held at the National Library of Estonia, Tõnismägi 2, Tallinn.

 

Latest Downloads

Download MediaConch’s latest release or a daily build.

 

New Release Notes

What’s new in MediaConch 17.05

GUI/CLI/Server/Online

  • Less verbose output by default
  • CSV output (useful for automation with the command line)
  • Option for creating policy from a file directly in the policy editor
  • New policy example based on Apple’s TN2162 which defines requirements when storing uncompressed video in QuickTime
  • Analyze attachments in Matroska files (useful especially with PDF or TIFF plugin, for validating attachments)
  • Better support for some broken Matroska files, displaying more information about why a file is broken
  • More Matroska and FFV1 validity tests
  • Performance improvements
  • For Mac users, MediaConch is now available directly in the Mac App Store, with automatic updates

The MediaConch project has received funding from PREFORMA, co-funded by the European Commission under its FP7-ICT Programme.

 

Feedback

MediaArea is eager to build a community of collaborators and testers to integrate the software in their workflows and participate in usability testing. Please contact us if you’d like to be involved!