Scope: This procedure describes the pickup and scripted processing of order files from various vendors and from our own POOF! system, including automated vendor selection, the loading of bibliographic records, and the creation of purchase orders through Voyager's bulk import function.

Contact: Natalya Pikulik

Unit: Batch Processing

Date created: April 1, 2017

Date of next review: April 2019


Automated orders are processed from a variety of sources: POOF!, Coutts Oasis Ally, Harrassowitz, and others. We keep the processing steps for each source as similar as possible, creating variations only when necessary and customizations only when practical, so that new vendors can be added with a minimum of script adjustments.
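
To illustrate the design goal of adding a new vendor with minimal script changes, here is a hypothetical per-vendor configuration table sketched in Python. The paths, filename patterns, and bulk import profile names are placeholders (only the Coutts_Ally share path and the ".24.ord.mrc" suffix come from this procedure), and the production pickup is done by cron jobs and lstools scripts, not by this code.

    from pathlib import Path

    # Hypothetical per-vendor configuration: adding a vendor should mean adding
    # an entry here, not writing a new script. Values are placeholders.
    VENDOR_SOURCES = {
        "POOF": {
            "pickup": "cron (8:05 each weekday morning)",
            "path": r"\\input\vendorRecords\POOF",          # placeholder path
            "pattern": "*.mrc",                             # placeholder pattern
            "bulk_import_profile": "POOFORD",               # placeholder profile
        },
        "Coutts_Ally": {
            "pickup": "manual authorization, then lstools script",
            "path": r"\\input\vendorRecords\Coutts_Ally",   # path from this procedure
            "pattern": "*.mrc",                             # placeholder pattern
            "bulk_import_profile": "COUTTSORD",             # placeholder profile
        },
        "Harrassowitz": {
            "pickup": "cron scan of the Harrassowitz server",
            "path": r"\\input\vendorRecords\Harrassowitz",  # placeholder path
            "pattern": "*.24.ord.mrc",                      # suffix from this procedure
            "bulk_import_profile": "HARRORD",               # placeholder profile
        },
    }

    def files_to_process(vendor: str) -> list:
        """List new order files for one vendor using only its configuration entry."""
        cfg = VENDOR_SOURCES[vendor]
        return sorted(Path(cfg["path"]).glob(cfg["pattern"]))

Adding a vendor would then amount to adding one entry to such a table and, where needed, a matching bulk import profile in Voyager.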

  1. Pick up file from source:
    1. A POOF! file is picked up by a cron job scheduled to run at 8:05 each weekday morning.
    2. Coutts Ally orders must go through an authorization process to produce a file, which is downloaded to an LTS share space: \\input\vendorRecords\Coutts_Ally. That file is then processed by an lstools script. (NB: We are still working with Coutts to automate the authorization and file production process. When that is accomplished, we will institute a cron job for pickup.)
    3. Harrassowitz generates a file each day; titles are selected in OttoEditions. From there we will establish a cron job to process the file. (In process, 4/18.)
  2.  Processing of files:
    1. Both POOF! and Coutts Ally orders are processed through a set of rules commonly called the order matrix. Each line of the matrix is a separate rule, and the first time a record matches a rule's criteria, that rule's instructions determine how the record is handled. For example, early in the table there is a rule that "discards" records with an f or s in 008 position 28 (federal or state publications); those records are sorted out for manual examination as government publications before further processing. Another example of records removed from automated processing is bib records containing field 981 subfield n with the text REQ:. Those records are sorted to a separate file for probable Amazon ordering.
    2. After the initial special exceptions are handled, the country code in the 008 field is examined and a vendor assigned accordingly. (A sketch of this triage appears after this list.)
    3. Titles for which no vendor can easily be assigned automatically by country code are sent to another LTS share space: \\input\vendorFirmOrd\cts\Assign\  Depending on which rule applies, the titles are sorted and rules applied based on whether they are electronic records, serials, early imprint dates, requesters, etc.
    4. MARC records with data errors are placed in a separate folder, Assign_Vendor, for manual processing.
    5. Harrassowitz files, which end with the suffix ".24.ord.mrc", generate a Harrassowitz purchase order. A cron job scans the Harrassowitz server to pick up new order files, bulk imports them conditionally into the catalog, and creates purchase orders.
    6. In most cases, when a vendor creates a MARC file of ordered titles, bib records and purchase orders are generated via a bulk import profile established for that vendor.
  3. The bulk import profile will discard records that match existing records too closely. They are placed in vendor-named discard folders in the cts folder on the LTS share space established for each vendor mentioned above.
    1. Discarded "duplicates" are identified by a complete title match, ISBN, or LCCN. (A sketch of this check follows below.)
    2. Discards are manually examined to ensure the discarded title is indeed a duplicate.
    3. Discarded duplicates are placed in a folder, REJECTED_ORDERS, and are reported back to selectors the following morning by a cron job.
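
The following is a minimal sketch of the triage described in step 2, written in Python with the pymarc library. It is an illustration only: the production work is done by lstools scripts applying the full order matrix, which this does not reproduce, and the country-to-vendor table and output file names are hypothetical.

    from pymarc import MARCReader, MARCWriter

    # Hypothetical mapping from the 008 country code (positions 15-17) to a vendor.
    COUNTRY_TO_VENDOR = {
        "gw ": "Harrassowitz",   # Germany: illustrative assignment only
        "fr ": "Amalivre",       # France: illustrative assignment only
    }

    def triage(order_file: str) -> None:
        """Sort incoming order records roughly the way the order matrix does."""
        sinks = {name: MARCWriter(open(name + ".mrc", "wb"))
                 for name in ("govdocs_review", "amazon_review",
                              "assign_vendor", "auto_order")}
        with open(order_file, "rb") as fh:
            for record in MARCReader(fh):
                f008 = record.get_fields("008")
                fixed = f008[0].data if f008 else ""

                # Rule: 008/28 is f or s (federal or state document) -> set aside
                # for manual examination as a government publication.
                if len(fixed) > 28 and fixed[28] in ("f", "s"):
                    sinks["govdocs_review"].write(record)
                    continue

                # Rule: 981 $n contains REQ: -> sort out for probable Amazon ordering.
                if any("REQ:" in sub
                       for f in record.get_fields("981")
                       for sub in f.get_subfields("n")):
                    sinks["amazon_review"].write(record)
                    continue

                # Otherwise assign a vendor from the 008 country code, or send the
                # record on for manual vendor assignment.
                country = fixed[15:18] if len(fixed) >= 18 else ""
                if country in COUNTRY_TO_VENDOR:
                    sinks["auto_order"].write(record)   # real script notes the vendor
                else:
                    sinks["assign_vendor"].write(record)

        for writer in sinks.values():
            writer.close()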
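
The near-match test in step 3 is performed by the Voyager bulk import profile itself. The short sketch below only restates the three criteria named here (complete title, ISBN, or LCCN), assuming the existing catalog's values are available as lookup sets; it is not how the profile is implemented.

    def is_probable_duplicate(title, isbns, lccn,
                              known_titles, known_isbns, known_lccns):
        """True when an incoming record matches an existing record on complete
        title, ISBN, or LCCN, the three criteria named in this procedure."""
        if title and title.strip().lower() in known_titles:
            return True
        if any(isbn in known_isbns for isbn in isbns):
            return True
        if lccn is not None and lccn in known_lccns:
            return True
        return False

    # Records flagged this way land in a vendor-named discard folder; after manual
    # review, confirmed duplicates go to REJECTED_ORDERS and are reported back to
    # selectors by the overnight cron job.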

  Next steps: Follow LTS procedure #117 for firm order final processing.
