Minutes for 04/27/2016

Marmot Union Cataloging Committee Wednesday, April 27, 2016

Announcements:

  •  AUP update from LTI is ready to be loaded.

Action Items:

  • Action - Person(s) responsible - Completed
  • Request Mark Noble and Logan McDonald attend the next UCC meeting to discuss BIBFRAME -  Jimmy - Yes
  • Contact directors about possibility of special in-person acquisitions meeting. Report below.   Jimmy - Yes
  • Contact SkyRiver about reintroducing OCLC numbers to records - Jimmy/Lloyd - Yes
    • This is done, but there are not as many OCLC numbers in 035 fields as we had hoped.
    • Lloyd will continue to investigate what we can do with these numbers.
  • Establish monthly reminder emails for cleaning up Data Exchange, Create Lists, and Statistics results files. Lloyd - Yes
  • Develop procedures and training materials for dummy items to allow holds on on-order items. - Brandon/Tammy/Lloyd  - Yes
  • Remove 210 fields from journal title index. - Lloyd - Yes
    • Jamie says the Nursing students at CMU use the 210 field in their searches. He would not like this changed, so we will leave it alone.

Ongoing

  • Designate 907 and 917 fields for deletion  - Lloyd - No
  • Add info about the 958 “No A.C.” note to cataloging documentation; find out if anyone is still using the 500 field for this, and why. -  Lloyd -  No
  • Develop proposal for use of authority records in Pika  - Jimmy/Mark/Lloyd -  No
  • Create a list correlating all MARC tags and III field group tags, put up on the wiki. -  Lloyd -  No

New

  • Investigate possible re-indexing project. What else should be re-indexed? 019 and 079 are in the ISBN index and probably should not be. Look through old UCC minutes for any reference to indexing, and start a wiki page to keep track of re-indexing needs. -  Lloyd
  • Large list of duplicates in Sierra Create List 333. 11,000 pairs of dups. Everyone should work on their library’s holdings. -  Everyone
  • Find out from Backstage and Marcive about URIs in records (Logan will contact them and let Jimmy know what they say) -  Jimmy
  • Email UCC when new Z39.50 connections are working for PV/SS schools. -  Lloyd
  • Edit One-Click loader to insert 995 field to identify a record when it has been changed by this loader.  - Lloyd
  • Send email to acquisition libraries for short-term testing of One-Click loader. -  Lloyd
  • Find out about the possibility of an OCLC reclamation project: to what extent would it resolve our issue of records with no 001? How many Marmot libraries are due a free reclamation? What is the price of a full catalog reclamation? -  Jimmy
  • Train Salida on One-Click load table procedure. -  Lloyd
  • Investigate how we can take advantage of OCLC numbers in 035 of SkyRiver records. - Lloyd
  • Subcommittee on the 001 to meet and discuss what we want our load tables to do with this field. -  Lloyd/Jamie/Karen/Shelley
  • Determine if there is any need for more training for dummy items in order records to allow holds. -  Lloyd

Old Business

  • Discussion of BIBFRAME with Logan McDonald from Anythink.
    • Link http://link.anythinklibraries.org/
    • Jimmy: LibHub and Bibframe ETA is TBD as they need to evaluate costs.
    • Logan: Anythink libraries are not seeing local hits from implementing Bibframe with Zepheira.
    • Most of the search engine hits are being generated from LA, NY, and Mexico City. Based on the cost of Zepheira, Anythink is paying about $0.25 a hit, and the hits are mostly not from local users. They are not sure that is worthwhile; BIBFRAME appears to be expensive marketing.
    • SirsiDynix is extracting data from Anythink and sending it to Zepheira monthly till 3/16. Continuing that service would cost a third of Anythink’s total maintenance budget with SirsiDynix.
    • Mark has said there are snippets of code visible: title and author in Schema.org linked-data form. Mark showed Jimmy and Logan evidence of BIBFRAME activity in the Structured Data Testing Tool (using the title “The Girl on the Train”).
    • Karen doesn’t want linked data applied to entire catalog.
    • Logan is emailing Backstage about getting URIs in MARC records, *Jimmy to follow up (action item).
    • Jimmy asks Logan, “What can Marmot do?” Possible MUG item to get more people thinking about this.
  • Z39.50 and SkyRiver issues (Lloyd).
    • Two school libraries, Plateau Valley and Steamboat, have canceled SkyRiver and will use Z39.50 again next year. Z39.50 will be set up to not create so many duplicates.

New Business

  • Report from Acq/Dup meeting in New Castle
    • We made a change to the One-Click load profile, changing how it handles multiple matches. Previously, if it found more than one match, it would insert a new record, creating a duplicate. Now, if it finds more than one match, it attaches to the first match it sees, not creating a duplicate. The other change: previously it would first look for an ISN match, then a BIB UTIL match. We switched those, so now it looks for a BIB UTIL match first; with two or more matches, it grabs the first one.
    • The possible problem with this plan is that it will be more common to attach an order record to a bib of the incorrect format, because the bib has multiple ISBNs for different formats. Then, when the final bib with the item arrives from the vendor, it would overlay that bib, changing the record to the wrong format for other libraries. We will have to see whether creating fewer duplicates is worth the problems this creates.
    • It is suggested that libraries using One-Click keep track of order record loads for a month to see how common this problem is.
    • Lloyd can have the One-Click loader insert a field (“Loaded with m2btab.click in 2016”) into each record when it attaches an order to an existing bib. Reports can then be run on that field. Checking these records would increase workload, so this would be short-term testing to see if we should go back to the old method.
    • *Lloyd – Salida Regional Library needs training on load table procedure.
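The revised matching order described above can be sketched as pseudocode. This is a hedged illustration only: the function, field names, and sample records are invented for the example and do not reflect Sierra's actual load-table API.

```python
# Sketch of the revised One-Click matching order: check BIB UTIL before
# ISN, and on two or more hits attach to the first match instead of
# inserting a duplicate. Names and data here are illustrative.
def find_match(incoming, catalog):
    for key in ("bib_util", "isn"):          # BIB UTIL checked before ISN
        hits = [bib for bib in catalog if bib[key] == incoming[key]]
        if hits:
            return hits[0]                   # 2+ matches: grab the first one
    return None                              # no match: a new record is inserted

catalog = [
    {"rec": "b100", "bib_util": "111", "isn": "978-1"},
    {"rec": "b101", "bib_util": "222", "isn": "978-1"},
]
incoming = {"bib_util": "333", "isn": "978-1"}
print(find_match(incoming, catalog)["rec"])  # -> b100 (first ISN match)
```

Note how the format problem arises in this sketch: an incoming record with no BIB UTIL match but a shared ISBN attaches to the first ISN hit, even if that bib is a different format.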
  • 22,000 dups found, what do we do?
    • Using Excel to find duplicate BIB UTIL numbers and Sierra’s new feature allowing creation of a review file from a list of record numbers, we now have a file of 22,000 duplicate records.
    • They are in Create List bucket #333.
    • Everyone can work on their own holdings to clear these dups.
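The Excel step above amounts to counting BIB UTIL numbers and flagging any that occur more than once. A minimal sketch, with made-up record numbers and values:

```python
from collections import Counter

# Illustrative version of the duplicate-finding step: tally BIB UTIL
# numbers from an exported list and keep records whose number repeats.
records = [
    ("b1000001", "12345"),
    ("b1000002", "67890"),
    ("b1000003", "12345"),  # duplicate BIB UTIL number
]
counts = Counter(util for _, util in records)
dups = [rec for rec, util in records if counts[util] > 1]
print(dups)  # -> ['b1000001', 'b1000003']
```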
  • Do we want the same 001 behavior on all load tables?
    • Lloyd: we have two different primary behaviors for 001 fields when records are loaded. Some loaders leave the number alone and just load whatever is there; others are intended for use with OCLC: they strip the OCLC prefix and leading zeros, and any number that does not have an OCLC prefix is removed entirely.
    • If an OCLC record is loaded with the first kind of loader, then you get ocn and ocm prefixes in our records. If a non-OCLC record is loaded with the second kind, then the 001 number is lost.
    • Jamie: currently, with vendor records, there is no vetting of 001 fields. These records come in with whatever number the vendor sends.
    • Lloyd suggests we have all the loaders behave the same for the 001. All would strip OCLC prefixes and leading zeros, and bring in all other 001 numbers if they don’t have an OCLC prefix.
    • The one thing we would lose is the report of non-OCLC numbers that an OCLC table now gives. When you load non-OCLC records with an OCLC loader you get a list of the bad records in the error log.
    • Lloyd – Created subcommittee consisting of Lloyd, Jamie, Shelley, and Karen N. to look at the issue and experiment on different 001 loading options. Match 019/001 recon.
    • Karen – at least 8 of our member libraries are allowed a free OCLC reclamation project.
    • Jimmy to find out about the possibility of reclamation, and whether all or only a portion of libraries are entitled to it; he will ask for pricing of a full catalog reclamation.
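Lloyd's proposal for uniform 001 handling can be sketched as follows. This is an assumption-laden illustration, not a load-table implementation: it strips the ocm/ocn prefixes named in the discussion (plus the newer "on" prefix, an assumption) along with leading zeros, and passes any non-OCLC 001 through unchanged.

```python
import re

# Sketch of the proposed uniform 001 behavior: strip OCLC prefixes and
# leading zeros, bring in all other 001 values as-is. The "on" prefix is
# an assumption; the minutes mention only ocm and ocn.
def normalize_001(value: str) -> str:
    match = re.match(r"^(?:ocm|ocn|on)0*(\d+)$", value.strip())
    if match:
        return match.group(1)      # OCLC number: prefix and zeros removed
    return value.strip()           # non-OCLC 001: kept, not removed

print(normalize_001("ocm00012345"))  # -> 12345
print(normalize_001("ABC999"))       # -> ABC999
```

Under this scheme the only loss, as noted above, is the error-log report of non-OCLC numbers that the OCLC-style loaders currently produce.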
Meeting Date: Wednesday, April 27, 2016