SUNScholar/Daily Admin/5.X

Back to Daily Admin

Step 1. Login
http://wiki.lib.sun.ac.za/index.php/SUNScholar/Prepare_Ubuntu/S01

Click on the link above to find out how to log in to the server, then return here.

Step 2. Create "dspace" user crontab
Edit the crontab, by typing the following in a terminal:

su - dspace

crontab -e

If asked to select an editor, choose nano.
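Each entry in the crontab below consists of five scheduling fields (minute, hour, day of month, month, day of week) followed by the command to run. As a quick sketch, the fields of one illustrative DSpace entry can be split apart and labelled like this:

```shell
# Split a sample crontab line into its five scheduling fields.
# (The entry is illustrative; single quotes keep $HOME literal.)
line='0 0,8,16 * * * $HOME/bin/dspace generate-sitemaps > /dev/null'
set -f                 # disable globbing so the "*" fields stay literal
set -- $line           # word-split the line into positional parameters
echo "minute=$1 hour=$2 dom=$3 month=$4 dow=$5"
```

This prints `minute=0 hour=0,8,16 dom=* month=* dow=*`, i.e. the task runs at minute 0 of hours 0, 8 and 16, every day.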

Sample crontab
Delete all of the contents, then copy and paste the following into the nano text editor and save.

# SAMPLE CRONTAB FOR A PRODUCTION DSPACE
# You obviously may wish to tweak this for your own installation,
# but this should give you an idea of what you likely wish to schedule via cron.
# NOTE: You may also need to add additional sysadmin related tasks to your crontab
# (e.g. zipping up old log files, or even removing old logs, etc).

#------------------
# GLOBAL VARIABLES
#------------------
# Deliver cron email to the system administrator
MAILTO="root"

#--------------
# HOURLY TASKS
#--------------
# (Recommended to be run multiple times per day, if possible)
# At a minimum these tasks should be run daily.

# Regenerate DSpace Sitemaps every 8 hours (12AM, 8AM, 4PM).
# SiteMaps ensure that your content is more findable in Google, Google Scholar, and other major search engines.
0 0,8,16 * * * $HOME/bin/dspace generate-sitemaps > /dev/null

#-------------
# DAILY TASKS
#-------------
# (Recommended to be run once per day. Feel free to tweak the scheduled times below.)

# Update the OAI-PMH index with the newest content (and re-optimize that index) at midnight every day
# NOTE: ONLY NECESSARY IF YOU ARE RUNNING OAI-PMH
# (This ensures new content is available via OAI-PMH and ensures the OAI-PMH index is optimized for better performance)
0 0 * * * $HOME/bin/dspace oai import -o > /dev/null

# Clean and Update the Discovery indexes at midnight every day
# (This ensures that any deleted documents are cleaned from the Discovery search/browse index)
0 0 * * * $HOME/bin/dspace index-discovery > /dev/null

# Re-Optimize the Discovery indexes at 12:30 every day
# (This ensures that the Discovery Solr Index is re-optimized for better performance)
30 0 * * * $HOME/bin/dspace index-discovery -o > /dev/null

# Cleanup Web Spiders from DSpace Statistics Solr Index at 01:00 every day
# NOTE: ONLY NECESSARY IF YOU ARE RUNNING SOLR STATISTICS
# (This removes any known web spiders from your usage statistics)
0 1 * * * $HOME/bin/dspace stats-util -i > /dev/null

# Re-Optimize DSpace Statistics Solr Index at 01:30 every day
# NOTE: ONLY NECESSARY IF YOU ARE RUNNING SOLR STATISTICS
# (This ensures that the Statistics Solr Index is re-optimized for better performance)
30 1 * * * $HOME/bin/dspace stats-util -o > /dev/null

# Send out subscription e-mails at 02:00 every day
# (This sends an email to any users who have "subscribed" to a Collection, notifying them of newly added content.)
0 2 * * * $HOME/bin/dspace sub-daily > /dev/null

# Run the media filter at 03:00 every day.
# (This task ensures that thumbnails are generated for newly added images,
# and also ensures full text search is available for newly added PDF/Word/PPT/HTML documents)
0 3 * * * $HOME/bin/dspace filter-media -q > $HOME/log/media-filter.log 2>&1

# Run any Curation Tasks queued from the Admin UI at 04:00 every day
# (Ensures that any curation task that an administrator "queued" from the Admin UI is executed
# asynchronously behind the scenes)
0 4 * * * $HOME/bin/dspace curate -q admin_ui > /dev/null

# Check for items to release from embargo in DSpace.
# (This applies to embargoes created with DSpace versions <= 3.2)
0 5 * * * $HOME/bin/dspace embargo-lifter > $HOME/log/embargo-release.log 2>&1

# Update the local ORCID database with the latest information from the external ORCID database.
# (This only applies to DSpace versions >= 5.2, if you enable ORCID lookups)
0 6 * * * $HOME/bin/dspace dsrun org.dspace.authority.UpdateAuthorities > $HOME/log/update-orcid-info.log 2>&1


#--------------
# WEEKLY TASKS
#--------------
# (Recommended to be run once per week, but can be run more or less frequently, based on your local needs/policies)

# Run the checksum checker at 04:00 every Sunday
# (This re-verifies the checksums of files stored in DSpace. If any files have been changed/corrupted, checksums will differ.)
# NOTE: The "-l" option tells DSpace to check *everything*, and "-p" prunes old results. LARGER SITES MAY WISH
# TO USE DIFFERENT OPTIONS: if your site is very large, you may only be able to check a portion of your content
# per week. The active task below ("-d 1h") checks all the content it can within *one hour*; the next week it
# starts again where it left off. The commented-out task would instead check every file on each run.
#0 4 * * * $HOME/bin/dspace checker -l -p > /dev/null
0 4 * * 0 $HOME/bin/dspace checker -d 1h -p > /dev/null

# Mail the results of the checksum checker (see above) to the configured "mail.admin" at 05:00 every Sunday.
# (This ensures the system administrator is notified whether any checksums were found to be different.)
0 5 * * 0 $HOME/bin/dspace checker-emailer > /dev/null

# Run DSpace statistical analysis tools (12 months takes approx 40 secs)
30 0 * * 0 $HOME/bin/dspace stat-general > /dev/null
35 0 * * 0 $HOME/bin/dspace stat-monthly > /dev/null

# Generate DSpace statistical analysis reports
00 1 * * 0 $HOME/bin/dspace stat-report-general > /dev/null
05 1 * * 0 $HOME/bin/dspace stat-report-monthly > /dev/null
#---------------
# MONTHLY TASKS
#---------------
# (Recommended to be run once per month, but can be run more or less frequently, based on your local needs/policies)

# Permanently delete any bitstreams flagged as "deleted" in DSpace, on the first of every month at 01:00
# (This ensures that any files which were deleted from DSpace are actually removed from your local filesystem.
#  By default they are just marked as deleted, but are not removed from the filesystem.)
0 1 1 * * $HOME/bin/dspace cleanup > /dev/null

# Remove all log files which are more than 30 days old, on the first of every month
01 0 1 * * find $HOME/dspace/log/*.log.* -mtime +30 -exec rm {} \;
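The `find ... -mtime +30 -exec rm` pattern used for log pruning can be tried safely on a throwaway directory first. The sketch below uses hypothetical file names and GNU `touch -d` to fake an old timestamp; the real task targets `$HOME/dspace/log/*.log.*`:

```shell
# Create a scratch directory with one "old" and one "fresh" log file,
# then apply the same prune pattern the crontab uses.
tmp=$(mktemp -d)
touch -d '40 days ago' "$tmp/dspace.log.2014-01-01"  # older than 30 days: removed
touch "$tmp/dspace.log.today"                        # recent: must survive
find "$tmp"/*.log.* -mtime +30 -exec rm {} \;
survivors=$(ls "$tmp")                               # only the fresh file remains
rm -rf "$tmp"
```

Note that `-mtime +30` matches files whose modification time is strictly more than 30 days in the past, so a file exactly 30 days old is kept.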
#--------------
# YEARLY TASKS
#--------------
# (Recommended to be run once per year)

# At 2:00AM every January 1, "shard" the DSpace Statistics Solr index.
# This ensures each year has its own Solr index, which improves performance.
# NOTE: ONLY NECESSARY IF YOU ARE RUNNING SOLR STATISTICS
# NOTE: This is scheduled here for 2:00AM so that it happens *after* the daily cleaning & re-optimization of this index.
0 2 1 1 * $HOME/bin/dspace stats-util -s > /dev/null


#--------------
# HOUSEKEEPING
#--------------
# (Recommended to be run daily)

# Delete any ~/config/*/*.old files more than 30 days old (created by "ant update")
0 2 1 * * find $HOME/config -name "*-*-*.old" -mtime +30 -exec rm {} \;

# Delete any ~/*.bak-*-*/ directories more than 30 days old (created by "ant update")
0 2 1 * * find $HOME/*.bak-*-* -maxdepth 0 -type d -mtime +30 -exec rm -rf {} \;
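The `-maxdepth 0` flag in the backup-directory task deserves a note: it restricts `find` to testing only the paths named on its command line (here, the shell-expanded `~/*.bak-*-*` globs) without descending into their contents. A small sketch with a hypothetical backup directory:

```shell
# With -maxdepth 0, find reports the matching top-level directory itself,
# not anything inside it.
tmp=$(mktemp -d)
mkdir -p "$tmp/dspace.bak-2014-01/inner"
found=$(find "$tmp"/*.bak-* -maxdepth 0 -type d)
rm -rf "$tmp"
```

Here `found` contains only the `dspace.bak-2014-01` path; `inner` is never visited, so the `-mtime` test in the real task is applied to the backup directory as a whole.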

Save and exit the editor.

System Log
To enable logging of cron events, edit the following file:

sudo nano /etc/rsyslog.d/50-default.conf

Enable the cron log by removing the hash (#) in front of cron.*.
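If you prefer a non-interactive edit, the same change can be sketched with sed. The transform is shown below on a sample string rather than the live config; on the server the identical expression would be applied with `sudo sed -i.bak 's/^#\(cron\.\*\)/\1/' /etc/rsyslog.d/50-default.conf` (the `.bak` suffix keeps a backup copy):

```shell
# Strip the leading "#" from the cron.* line, demonstrated on a sample
# string matching the stock Ubuntu rsyslog config.
sample='#cron.*  /var/log/cron.log'
edited=$(printf '%s\n' "$sample" | sed 's/^#\(cron\.\*\)/\1/')
echo "$edited"
```

The anchored pattern `^#\(cron\.\*\)` only touches a line that begins with `#cron.*`, so other commented lines in the file are left alone.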

See the example below.

# First some standard log files.  Log by facility.
#
auth,authpriv.*                /var/log/auth.log
cron.*                         -/var/log/cron.log
*.*;auth,authpriv.none         -/var/log/syslog

Now restart the syslog service as follows:

sudo service rsyslog restart