Back to Harvesting

Urgent Notice


During the upgrade from 1.8.2 to 3.2, a bug report was submitted:

The command to completely clear out the cache does not work, because our Tomcat server runs as root and therefore has full access to all files in $HOME.

So I manually cleared the cache as follows:

cd $HOME/var/oai/requests
sudo rm *

Then completely rebuilt the OAI SOLR DB with the import command as follows:

$HOME/bin/dspace oai import -o -v

And it works.

Another solution is to disable the cache, see config setting in example below.
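For reference, disabling the cache is a one-line change in oai.cfg (the setting appears with its default further down in the sample):

```
# Cache enabled?
cache.enabled = false
```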


Also during the upgrade from 1.8.2 to 3.2, a second bug report was submitted:

Email sent:

Hi All

Regarding the following, is another patch required?

We use DSpace 3.2 with a SOLR DB for OAI.

This query seems to work.

The following is from Open Archives.

From:, I get the following (inline screenshot omitted):

Error message from Open Archives:

 [1] ListRecords response gave a noRecordsMatch error when it should have included at least the record with identifier The from and until parameters of the request were set to the datestamp of this record (2011-06-23T08:15:02Z). The from and until parameters are inclusive, see protocol spec section 2.7.1. The message included in the error response was: 'No matches for the query'

 Total exceptions improperly handled: 1 out of 15 Total error count: 1


Edit the following file:

nano $HOME/source/dspace/config/modules/oai.cfg
  1. Select whether storage will be the SOLR database or the PostgreSQL database
  2. Define OAI URLs.
  3. Define the OAI folder paths.
  4. Define harvester settings.

See sample below.


#--------------------XOAI CONFIGURATIONS------------------------#
# These configs are used by the XOAI                            #

# Storage: solr | database
storage = solr

# Base solr index
solr.url = ${default.solr.server}/oai

# OAI persistent identifier prefix.
# Format - oai:PREFIX:HANDLE
identifier.prefix =
# Base url for bitstreams
bitstream.baseUrl =

# Base Configuration Directory
config.dir = /home/dspace/config/crosswalks/oai

# Description
description.file = /home/dspace/config/crosswalks/oai/description.xml

# Cache enabled?
cache.enabled = true

# Base Cache Directory
cache.dir = /home/dspace/var/oai

#--------------OAI HARVESTING CONFIGURATIONS--------------------#
# These configs are only used by the OAI-ORE related functions  #

### Harvester settings

# Crosswalk settings; the {name} value must correspond to a declared ingestion crosswalk
# harvester.oai.metadataformats.{name} = {namespace},{optional display name}
# The display name is only used in the XMLUI; for the JSPUI there are entries
# in Messages.properties in the form harvester.oai.metadataformats.{name}
harvester.oai.metadataformats.dc = http://www.openarchives.org/OAI/2.0/oai_dc/, Simple Dublin Core
harvester.oai.metadataformats.qdc = http://purl.org/dc/terms/, Qualified Dublin Core
harvester.oai.metadataformats.dim = http://www.dspace.org/xmlns/dspace/dim, DSpace Intermediate Metadata

# This field works in much the same way as harvester.oai.metadataformats.PluginName
# The {name} must correspond to a declared ingestion crosswalk, while the
# {namespace} must be supported by the target OAI-PMH provider when harvesting content.
# harvester.oai.oreSerializationFormat.{name} = {namespace}

# Determines whether the harvester scheduling process should be started
# automatically when the DSpace webapp is deployed.
# default: false
#harvester.autoStart = false

# Amount of time subtracted from the from argument of the PMH request to account
# for the time taken to negotiate a connection. Measured in seconds. Default value is 120.
#harvester.timePadding = 120
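As an illustration of what timePadding does, subtracting the default 120 seconds from a harvest "from" datestamp can be sketched with GNU date (the datestamp is the one from the error report above; DSpace does this internally in Java):

```shell
# Subtract the default timePadding (120 s) from a harvest "from" datestamp.
# Requires GNU date; the input datestamp is just an example.
from="2011-06-23 08:15:02 UTC"
padded=$(date -u -d "$from 120 seconds ago" '+%Y-%m-%dT%H:%M:%SZ')
echo "$padded"
# → 2011-06-23T08:13:02Z
```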

# How frequently the harvest scheduler checks the remote provider for updates,
# measured in minutes. The default value is 12 hours (or 720 minutes).
#harvester.harvestFrequency = 720

# The heartbeat is the frequency at which the harvest scheduler queries the local
# database to determine if any collections are due for a harvest cycle (based on
# the harvestFrequency value). The scheduler is optimized to then sleep until the
# next collection is actually ready to be harvested. The minHeartbeat and
# maxHeartbeat are the lower and upper bounds on this timeframe. Measured in seconds.
# Default minHeartbeat is 30.  Default maxHeartbeat is 3600.
#harvester.minHeartbeat = 30
#harvester.maxHeartbeat = 3600

# How many harvest process threads the scheduler can spool up at once. Default value is 3.
#harvester.maxThreads = 3

# How much time passes before a harvest thread is terminated. The termination process
# waits for the current item to complete ingest and saves progress made up to that point.
# Measured in hours. Default value is 24.
#harvester.threadTimeout = 24

# When harvesting an item that contains an unknown schema or field within a schema what
# should the harvester do? Either add a new registry item for the field or schema, ignore
# the specific field or schema (importing everything else about the item), or fail with
# an error. The default value if undefined is: fail.
# Possible values: 'fail', 'add', or 'ignore'
harvester.unknownField  = add
harvester.unknownSchema = fail

# The webapp responsible for minting the URIs for ORE Resource Maps.
# If using oai, the dspace.oai.uri config value must be set.
# The URIs generated for ORE ReMs follow the convention below in both cases.
# format: [baseURI]/metadata/handle/[theHandle]/ore.xml
# Default value is oai
#ore.authoritative.source = oai

# A harvest process will attempt to scan the metadata of the incoming items
# (dc.identifier.uri field, to be exact) to see if it looks like a handle.
# If so, it matches the pattern against the values of this parameter.
# If there is a match the new item is assigned the handle from the metadata value
# instead of minting a new one. Default value: hdl.handle.net, handle.test.edu
#harvester.acceptedHandleServer = hdl.handle.net, handle.test.edu

# Pattern to reject as an invalid handle prefix (known test string, for example)
# when attempting to find the handle of harvested items. If there is a match with
# this config parameter, a new handle will be minted instead. Default value: 123456789.
#harvester.rejectedHandlePrefix = 123456789, myTestHandle
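The accepted/rejected handle screening described above can be sketched in shell (the URI and prefixes are illustrative; DSpace performs this matching internally in Java):

```shell
# Hypothetical sketch of the handle screening of harvested items.
uri="http://hdl.handle.net/123456789/42"   # value found in dc.identifier.uri
handle="${uri#*hdl.handle.net/}"           # strip the accepted handle server -> 123456789/42
prefix="${handle%%/*}"                     # handle prefix -> 123456789
if [ "$prefix" = "123456789" ]; then       # matches rejectedHandlePrefix
  echo "rejected prefix, minting a new handle"
else
  echo "reusing harvested handle $handle"
fi
# → rejected prefix, minting a new handle
```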

Initialise OAI database

Execute one of the following tasks to update the OAI database initially, and then schedule regular updates (for example from cron; see the Harvesting page).

If using the SOLR DB (solr)

sudo $HOME/bin/dspace oai import -c -v


If using the SQL DB (database)

sudo $HOME/bin/dspace oai compile-items
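To keep the OAI index current after the initial import, a daily cron entry along these lines is a common approach (the schedule, user and path are assumptions for a typical installation):

```
# m h dom mon dow  command
0 3 * * *  /home/dspace/bin/dspace oai import > /dev/null
```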