Workflow

Open a production

Via ABACUS

Books, periodicals and other productions are created from ABACUS, which exports an XML file to /opt/abacus/share/abac/out/SN_Madras2. A cron job on that machine runs the script /home/madras2/bin/abacus_madras2.sh, which picks up the XML and sends it to mdr2 via HTTP POST. If the XML is valid and the product number is unique, mdr2 creates a new production, i.e. adds it to the database and generates the necessary directories in the file system.
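
The real pickup is a shell script; the Python sketch below only illustrates the mechanics under stated assumptions: the mdr2 endpoint URL is a hypothetical placeholder, and authentication, locking and error reporting are omitted.

```python
#!/usr/bin/env python3
"""Illustrative sketch of the cron-driven XML pickup."""
import pathlib
import urllib.request

EXPORT_DIR = pathlib.Path("/opt/abacus/share/abac/out/SN_Madras2")
MDR2_URL = "http://mdr2.example.com/abacus"  # hypothetical endpoint

def push_exports() -> None:
    """POST every exported XML file to mdr2, removing it on success."""
    for xml_file in sorted(EXPORT_DIR.glob("*.xml")):
        req = urllib.request.Request(
            MDR2_URL,
            data=xml_file.read_bytes(),
            headers={"Content-Type": "application/xml"},
            method="POST",
        )
        # urlopen raises on HTTP errors, e.g. when mdr2 rejects the
        # XML as invalid or the product number as a duplicate.
        with urllib.request.urlopen(req):
            xml_file.unlink()

if __name__ == "__main__":
    push_exports()  # typically run from cron
```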

From the library

Productions for commercial audio books are started by uploading an XML file that has previously been exported from Vubis. If the XML is valid and the library number (PNX) is unique, a new production is created as above.
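
A minimal sketch of what creating a production might look like, assuming a SQLite-style table and the /var/lib/mdr2 root that appears in the Repair section below; the column names and the productNumber element are illustrative, not taken from mdr2.

```python
import pathlib
import sqlite3
import xml.etree.ElementTree as ET

PRODUCTION_ROOT = pathlib.Path("/var/lib/mdr2")
DIRS = ["structured", "recording", "recorded", "encoded", "iso", "split"]

def create_production(db: sqlite3.Connection, xml_blob: bytes) -> int:
    root = ET.fromstring(xml_blob)           # raises on invalid XML
    number = root.findtext("productNumber")  # hypothetical element name
    if number is None:
        raise ValueError("missing product number")
    # The product number (or library number PNX) must be unique.
    if db.execute("SELECT 1 FROM production WHERE number = ?",
                  (number,)).fetchone():
        raise ValueError(f"production {number} already exists")
    cur = db.execute(
        "INSERT INTO production (number, state) VALUES (?, 'new')", (number,))
    prod_id = cur.lastrowid
    # Generate the per-production directories in the file system.
    for d in DIRS:
        (PRODUCTION_ROOT / d / str(prod_id)).mkdir(parents=True, exist_ok=True)
    return prod_id
```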

Structuring

The production now waits for an upload of the title structure in the form of a valid DTBook file. A template with all the metadata for this production is provided via the web interface. When the structure file is uploaded, it is validated with the validator from DAISY Pipeline 1 and checked against the metadata in the database. If it is valid, the DTBook XML is stored in the structured directory and a config file for Obi is created in the same directory. The state of the production is set to structured.
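
The actual validation is done by DAISY Pipeline 1; the sketch below only illustrates the second check, comparing the DTBook metadata against the values expected for the production. The meta elements and namespace are standard DTBook 2005; which fields mdr2 actually compares is an assumption.

```python
import pathlib
import xml.etree.ElementTree as ET

DTBOOK_NS = "http://www.daisy.org/z3986/2005/dtbook/"

def metadata_mismatches(dtbook_path: pathlib.Path, expected: dict) -> list:
    """Return (name, expected, found) tuples for every mismatch."""
    tree = ET.parse(dtbook_path)  # raises on malformed XML
    found = {
        meta.get("name"): meta.get("content")
        for meta in tree.iter(f"{{{DTBOOK_NS}}}meta")
    }
    return [
        (name, value, found.get(name))
        for name, value in expected.items()
        if found.get(name) != value
    ]
```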

Recording

To record the audio, the DTBook file is imported into Obi. This also imports the config file, which sets up the Obi project to export with the right options and to the proper place in the file system. The Obi project is created in the recording directory. Once the recording is finished, it is exported, which places a DAISY202 book in the recorded directory.

Completion of recording

Via ABACUS

Once the recording has been completed, the user indicates this in ABACUS. This causes an export of an XML file, which is picked up by a cron job that invokes /home/madras2/bin/abacus_madras2.sh: immediately for periodicals, after office hours for all other productions. The status change is sent to mdr2 via HTTP POST. If the XML is valid, the production is in state structured, and there is a valid DAISY202 book in the recorded directory, mdr2 starts the further processing.
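
The precondition can be expressed compactly. In the sketch below the ncc.html test stands in for a full DAISY202 validation (every DAISY202 book has an ncc.html at its root, but a real check does much more); the directory layout follows the paths used elsewhere on this page.

```python
import pathlib

RECORDED = pathlib.Path("/var/lib/mdr2/recorded")

def ready_for_processing(prod_id: int, state: str) -> bool:
    """True if mdr2 may start processing the completed recording."""
    book = RECORDED / str(prod_id)
    # A real validation checks far more than the file's existence.
    return state == "structured" and (book / "ncc.html").is_file()
```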

First mdr2 updates the metadata in the database with the production data from the recording, such as audio length, depth and production dates. Then it makes sure the metadata in the exported production is correct by overwriting it with the data from the database, such as title and author. After that the production is put on a queue, where it is processed sequentially: the WAV files are encoded to MP3 (in the encoded directory), an iso file of the whole production is created in the iso directory, and the state is set to encoded.

If the production doesn't fit on one CD, the state is set to pending-split instead.
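
A sketch of this encoding step under stated assumptions: this page does not say which tools mdr2 uses, so lame and genisoimage stand in as plausible encoders, and the 700 MB CD capacity is an assumed threshold.

```python
import pathlib
import subprocess

CD_CAPACITY = 700 * 1024 * 1024  # roughly one 700 MB CD (assumed limit)

def encode(prod_id: int) -> str:
    """Encode the recorded WAV files to MP3 and build the iso."""
    recorded = pathlib.Path("/var/lib/mdr2/recorded") / str(prod_id)
    encoded = pathlib.Path("/var/lib/mdr2/encoded") / str(prod_id)
    iso_dir = pathlib.Path("/var/lib/mdr2/iso") / str(prod_id)
    encoded.mkdir(parents=True, exist_ok=True)
    iso_dir.mkdir(parents=True, exist_ok=True)
    for wav in sorted(recorded.glob("*.wav")):
        mp3 = encoded / wav.with_suffix(".mp3").name
        subprocess.run(["lame", str(wav), str(mp3)], check=True)
    size = sum(f.stat().st_size for f in encoded.rglob("*") if f.is_file())
    if size > CD_CAPACITY:
        return "pending-split"  # needs manual splitting, see below
    subprocess.run(
        ["genisoimage", "-o", str(iso_dir / f"{prod_id}.iso"), str(encoded)],
        check=True,
    )
    return "encoded"
```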

Via web interface

Commercial audio productions and repaired productions are completed via a button in the web interface that is only available for certain roles. The button is only visible if the production is in state structured and there is a valid DAISY202 book in the recorded directory. The ensuing processing is the same as above.

Splitting

If a production doesn't fit on one CD, manual intervention is required, as the automatic splitters in DAISY Pipeline 1 apparently do not produce DAISY books that play for our customers. A user therefore manually splits the exported DAISY202 book and places it in the split directory.

Cataloging

All productions in state encoded are listed in a web interface where library staff can assign a library signature to a production. This is only relevant for productions that are added to the library, i.e. productions of production type book. As soon as a signature is assigned, the book is queued for archiving.

Productions of production type periodical or other aren’t added to the library and hence do not need a library signature. They will be archived automatically as soon as they have been encoded.
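
The queuing rule can be summarized in a few lines. In this sketch the queue and the production record are illustrative; only the rule itself, books wait for a signature while periodicals and other productions are queued right after encoding, comes from this page.

```python
import queue

archive_queue: "queue.Queue[dict]" = queue.Queue()

def assign_signature(production: dict, signature: str) -> None:
    """Library staff assign a signature to an encoded book,
    which immediately queues it for archiving."""
    production["library_signature"] = signature
    archive_queue.put(production)

def on_encoded(production: dict) -> None:
    """Called when a production reaches state encoded."""
    if production["type"] in ("periodical", "other"):
        archive_queue.put(production)  # no signature needed
    # Books stay in the cataloging list until assign_signature() runs.
```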

Archiving

The archiving process takes productions off the archiving queue and processes them sequentially. In general, archiving entails copying the relevant files to a spool directory, creating an RDF file with the metadata of the production, and finally inserting or updating the archive database. For a production master the WAV files are copied; for a distribution master the iso file is copied. Depending on the production type, the behavior differs slightly:

book
Both production master and distribution master are archived and placed in the spool directory for books.
periodical
The production master is archived and the distribution master is placed in the spool directory for periodicals.
other
The production master is archived and the distribution master is placed in the spool directory for other productions.

Once the entry has been made in the archive database, the state of the production is moved to archived. The production is no longer listed among the open productions but instead appears in the list of archived productions.
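
A sketch of one archiving step for a distribution master, assuming plain Dublin Core as the RDF vocabulary and a hypothetical spool path; neither is confirmed by this page, which only states that files are copied to a spool directory, an RDF file is written and the archive database is updated.

```python
import pathlib
import shutil
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC = "http://purl.org/dc/elements/1.1/"

def write_rdf(meta: dict, dest: pathlib.Path) -> None:
    """Serialize the production metadata as a small RDF/XML file."""
    ET.register_namespace("rdf", RDF)
    ET.register_namespace("dc", DC)
    root = ET.Element(f"{{{RDF}}}RDF")
    desc = ET.SubElement(root, f"{{{RDF}}}Description")
    for key, value in meta.items():  # e.g. title, creator, date
        ET.SubElement(desc, f"{{{DC}}}{key}").text = value
    ET.ElementTree(root).write(dest, xml_declaration=True, encoding="utf-8")

def archive_distribution_master(prod_id: int, prod_type: str,
                                meta: dict) -> None:
    spool = pathlib.Path("/var/spool/mdr2") / prod_type  # hypothetical path
    spool.mkdir(parents=True, exist_ok=True)
    iso = pathlib.Path("/var/lib/mdr2/iso") / str(prod_id) / f"{prod_id}.iso"
    shutil.copy2(iso, spool)                             # distribution master
    write_rdf(meta, spool / f"{prod_id}.rdf")
    # Finally the archive database is inserted or updated and the
    # production state moves to archived (omitted here).
```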

Repair

Productions can be repaired, which means they are fetched from the archive and made available for re-recording. For that purpose the web interface allows selecting an archived production by DAM number, production number, library signature, title or author. Once the production has been selected, it is fetched from the archive, unpacked via a temporary directory (/var/lib/mdr2/structured/:id:_repair) and placed in the structured directory along with a corresponding config file. Now a new or improved structure file can be uploaded, or the production can be re-imported into Obi to fix just some of the audio.
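
A sketch of the fetch-and-unpack step. Only the temporary path pattern /var/lib/mdr2/structured/:id:_repair comes from this page; the archive file format (tar here) and the staging logic are assumptions.

```python
import pathlib
import shutil
import tarfile

STRUCTURED = pathlib.Path("/var/lib/mdr2/structured")

def repair(prod_id: int, archive_file: pathlib.Path) -> None:
    """Fetch an archived production and stage it for re-recording."""
    tmp = STRUCTURED / f"{prod_id}_repair"  # the :id:_repair directory
    with tarfile.open(archive_file) as tar:
        tar.extractall(tmp)                 # unpack the archived master
    dest = STRUCTURED / str(prod_id)
    dest.mkdir(parents=True, exist_ok=True)
    for item in tmp.iterdir():
        shutil.move(str(item), str(dest / item.name))
    tmp.rmdir()
    # A config file for Obi is created next, as in the structuring step.
```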

After the usual DAISY202 export, the production can be processed as above. The only exceptions are that the completion of the recording is done via the web interface and that there is no cataloging, as these productions already have a library signature.
