This repository contains a set of scripts used for maintaining the documents in the evoText article database. They may or may not work for you, be useful, or explode.
Some of the scripts are documented in this README file, and others are not. We’re aiming to improve the status of this documentation, but make no guarantees.
In our work, we often produce multiple versions of the same file -- for instance, `a.pdf` might be transformed into `a.json`, and then supplemented by `a.crossref.json`. This script walks through the current directory looking at every basename, ensuring that for each one, each of the provided file extensions can be found. If not, all of the available versions will be moved into a folder called
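A minimal Ruby sketch of this behavior (the required-extension list and the `incomplete` destination folder below are placeholders, not the script's actual values):

```ruby
require "fileutils"

# Hypothetical parameters -- the real script's extension list and
# destination folder name are not shown in this README.
REQUIRED_EXTS  = %w[.pdf .json .crossref.json]
INCOMPLETE_DIR = "incomplete"

# Group every file in `dir` by its basename (the part before the first
# dot), then move aside any group missing a required extension.
def sweep_incomplete(dir = ".")
  files  = Dir.children(dir).reject { |f| File.directory?(File.join(dir, f)) }
  groups = files.group_by { |f| f.split(".").first }

  groups.each do |base, versions|
    exts = versions.map { |f| f.delete_prefix(base) }
    next if REQUIRED_EXTS.all? { |e| exts.include?(e) }

    # At least one extension is missing: set this basename aside.
    FileUtils.mkdir_p(File.join(dir, INCOMPLETE_DIR))
    versions.each do |f|
      FileUtils.mv(File.join(dir, f), File.join(dir, INCOMPLETE_DIR, f))
    end
  end
end
```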
In a folder containing:
Running this script will remove all characters from filenames in the current directory other than `0-9`, dash, and underscore, leaving exactly one dotted file extension at the end of the filename.

Note that the assumption of exactly one file extension means that files which are supposed to have no extension at all, and files with double-dotted extensions (such as when `.pubmed.xml` is supposed to be "the file extension"), will produce unexpected behavior.
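The cleaning rule can be sketched as follows; note that keeping ASCII letters in the whitelist is an assumption on top of the characters listed above (stripping letters would leave most filenames empty):

```ruby
# Sketch of the cleaning rule. The letter classes A-Z/a-z are an
# assumption here; the README explicitly lists only 0-9, dash, and
# underscore as kept characters.
def clean_name(filename)
  base = File.basename(filename, ".*") # everything before the last dot
  ext  = File.extname(filename)        # the final ".xyz", or "" if none
  base.gsub(/[^A-Za-z0-9_-]/, "") + ext
end

clean_name("a b (copy).pdf") # => "abcopy.pdf"
clean_name("a.pubmed.xml")   # => "apubmed.xml" -- the double-extension caveat
```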
Directories that are too large tend to upset operating systems. Over about 10,000 files (at least in our testing), network shares and even basic local commands like `ls` stop being very responsive. This script is designed to fix this in a way that still allows one to quickly determine whether or not a file is present on disk. It will take a number of files with names like `journal-article-10_2307_382953.xml`, etc., and file them in folders corresponding to parts of the filename. For instance, the example files above could be placed in:
Here, the folder names have been extracted from the first "variable" parts of the filenames (
The script, then, works through the directories given and moves files into the output directory. If the output directory passes a given threshold size, it is split along the first non-ignored character. The process repeats, further subdividing folders as needed until all files have been moved.
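The bucketing step can be sketched as a function from filename and split depth to a destination path (a hypothetical helper with assumed parameter names, not the script's actual code):

```ruby
# Hypothetical helper: compute the folder a file belongs in. `depth` is
# how many times the tree has been split so far; each level consumes one
# "variable" character, after skipping `ignore_chars` constant ones.
def bucket_path(filename, ignore_chars, depth)
  variable = File.basename(filename, ".*")[ignore_chars..-1] || ""
  File.join(*variable[0, depth].chars, filename)
end

bucket_path("journal-article-10_2307_382953.xml", 24, 2)
# => "3/8/journal-article-10_2307_382953.xml"
```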
With files stored in this way, one can write a quick algorithm for determining whether or not a file is present on disk: start with the variable characters in the filename, walk down the matching folders present on disk until you run out, and then look for the presence or absence of the file you're searching for.
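That lookup can be sketched as follows (again a hypothetical helper; the name and parameters are assumptions):

```ruby
# Hypothetical helper: follow one-character folders matching the variable
# part of the name for as long as they exist on disk, then check for the
# file in the deepest folder reached.
def present_on_disk?(root, filename, ignore_chars)
  variable = File.basename(filename, ".*")[ignore_chars..-1] || ""
  dir = root
  variable.each_char do |ch|
    sub = File.join(dir, ch)
    break unless File.directory?(sub)
    dir = sub
  end
  File.exist?(File.join(dir, filename))
end
```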
`in_hashed_directories [--max-files NUM] [--ignore-chars NUM] [--output DIR] --main-extension EXT [directories to search]`
- `--max-files NUM`: Controls the maximum number of files that the script will allow within a given folder before splitting it. It defaults to 10,000.
- `--ignore-chars NUM`: Ignores a given number of characters as "constant" at the beginning of every filename before looking for "splitting" characters. It defaults to zero, skipping no characters.
- `--main-extension EXT`: The script will look for all files which share the same basename and move them all at once (that is, all files sharing the basename of `a.txt` will always wind up in the same folder). This parameter tells the script which extension should be the "primary" one to search for. It defaults to
- `--output DIR`: The output directory which will be the root of the hashed directory tree. Defaults to
`in_hashed_directories --max-files 5000 --ignore-chars 24 --main-extension .xml --output ~/FilesHashed ~/Files`

Move all files in `~/Files` to hashed subdirectories of `~/FilesHashed`, letting no directory grow larger than 5,000 files and ignoring the first 24 characters of every filename (in the example above, `journal-article-10_2307_`).
All scripts here, unless otherwise specified, are released under the Creative Commons CC0 license, making them as far as possible public domain content in every local jurisdiction.