This repo also includes the job queue, which previously lived in a separate repo: rfcx/arbimon-jobqueue. Documentation on how the python scripts for each job type work is in docs/.
To see the latest logs:

- SSH to the Bastion server:
  `ssh ec2-user@54.159.71.198 -i ~/.ssh/arbimon2-bastion.pem`
- From the Bastion server, SSH to the Job server:
  `ssh-job`
- Change user to arbimon2:
  `sudo su arbimon2`
- List the current active background processes:
  `forever list`

  You should see a list like this:

      info: Forever processes running
      data: uid command script forever pid id logfile uptime
      data: [0] XwGh /usr/bin/nodejs /var/lib/arbimon2/jobqueue/bin/jobqueue 1422 1428 /var/lib/arbimon2/.forever/XwGh.log 0:0:10:16.81

- Open the logfile with a preferred tool:
  `tail /var/lib/arbimon2/.forever/XwGh.log`

  To see the whole file, use:
  `cat /var/lib/arbimon2/.forever/XwGh.log`
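If you need to watch the log while a job runs, following the file works as well. A minimal sketch, assuming the same logfile path; the `XwGh` uid is just the example from `forever list` above and will differ on your server:

```sh
# Print the last 200 lines and keep following the job queue log in real time
tail -n 200 -f /var/lib/arbimon2/.forever/XwGh.log
```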
To restart the job queue:

- Repeat the first four steps above (SSH to the Bastion server, SSH to the Job server, change user to arbimon2, and run `forever list`).
- Get the uid of the job queue process and run:
  `forever restart XwGh`

  If the restart fails and the log shows errors like:

      error: Forever detected script exited with code: null
      error: Script restart attempt #1

  stop the process and start the queue manually:
  `forever stop XwGh`
  `cd /var/lib/arbimon2/jobqueue`
  `./start_queue-debug.sh`
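To confirm the queue came back up, list the forever processes again and tail the new log. A sketch, assuming the queue is again running under forever; the uid and logfile name will be different after a restart:

```sh
# Check that the jobqueue process is listed again, then follow its log
forever list
tail -f /var/lib/arbimon2/.forever/<new-uid>.log   # use the logfile path printed by `forever list`
```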
To free up disk space on the Job server:

- Open the directory with the logs:
  `cd /var/lib/arbimon2/.forever/`
- List the current active background processes:
  `forever list`
- Get the uid of the current active process and use it in the following command:
  `rm -v !("XwGh.log")`

  This deletes all log files except the one for the current process. Note that the `!(...)` pattern needs bash's extglob option; see the sketch after this list.
- Go to the jobs temp directory and delete the folders that belong to old jobs (an example follows this list):
  `cd /mnt/jobs-temp/`
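A minimal sketch of both cleanup steps, assuming bash and that job folders under /mnt/jobs-temp/ are safe to delete once they have been untouched for more than 7 days (adjust the age cutoff to your needs):

```sh
# Delete all forever log files except the current one (substitute the uid from `forever list`)
cd /var/lib/arbimon2/.forever/
shopt -s extglob              # enable extended globbing so the !() pattern works
rm -v !("XwGh.log")

# Delete job temp folders not modified in the last 7 days
cd /mnt/jobs-temp/
find . -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +
```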
Dependencies (quick setup):

- python 2.7 - comes with Ubuntu
- All dependencies in one line:
  `sudo apt-get install -y python-pip libmysqlclient-dev python-dev gfortran libopenblas-dev liblapack-dev libpng12-dev libfreetype6-dev libsndfile1 libsndfile-dev libsamplerate-dev python-virtualenv r-base r-base-dev libfftw3-3 libfftw3-dev r-cran-rgl bwidget`
- Install all python dependencies, create the python virtual environment and build:
  `./setup/setup.sh`
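After setup, a quick way to check that the core python dependencies import cleanly. A sketch, assuming the virtualenv created by setup.sh is already activated (otherwise the imports resolve against the system python):

```sh
# Each import corresponds to a package from the dependency list below
python -c "import numpy, scipy, sklearn, MySQLdb, matplotlib, boto; print('python deps OK')"
```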
Dependencies (step by step):

- python 2.7 - comes with Ubuntu
- pip - python dependencies:
  `sudo apt-get install pip` or `sudo apt-get install python-pip`
- R statistics:
  `sudo apt-get install r-base r-base-dev libfftw3-3 libfftw3-dev libsndfile1-dev r-cran-rgl bwidget`
- R packages (tuneR, seewave):
  `sudo Rscript scripts/setup/r-packages.R`
- MySQL-python dependencies:
  `sudo apt-get install libmysqlclient-dev python-dev`
- python virtualenv:
  `sudo apt-get install virtualenv`
- scipy dependencies:
  `sudo apt-get install gfortran libopenblas-dev liblapack-dev`
- libsamplerate:
  `sudo apt-get install libsamplerate-dev`
- matplotlib dependencies:
  `sudo apt-get install libpng12-dev libfreetype6-dev`
- scikits.audiolab dependencies:
  `sudo apt-get install libsndfile-dev`
- node global dependencies (`sudo npm install -g <package>`):
  - bower
  - grunt-cli
- individual python dependencies (`sudo pip install <package>`; a combined install command follows this list):
  - numpy
  - scipy
  - MySQL-python
  - scikit-learn
  - boto
  - pypng
  - matplotlib
  - wsgiref
  - argparse
  - virtualenv
  - joblib
  - scikits.audiolab
  - Pillow
  - networkx
  - scikit-image
  - scikits.samplerate
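For convenience, the packages above can also be installed together. A sketch: numpy and scipy go first because some of the other packages (for example scikits.audiolab) need them at build time; drop `sudo` if you are installing into the project virtualenv:

```sh
# Install numpy/scipy first, then the rest of the python dependencies listed above
sudo pip install numpy scipy
sudo pip install MySQL-python scikit-learn boto pypng matplotlib wsgiref argparse \
    virtualenv joblib scikits.audiolab Pillow networkx scikit-image scikits.samplerate
```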
If you have access to the AWS ECR for RFCx, build the image on top of the latest production job queue:
`docker build -t rfcx-arbimon-jobs -f build/Dockerfile .`
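Pulling the production base image from the private registry requires logging docker in to ECR first. A minimal sketch, assuming AWS CLI v2; `<aws-account-id>` and `<region>` are placeholders for the real RFCx registry values, which are not given here:

```sh
# Authenticate docker against the RFCx ECR registry before building
aws ecr get-login-password --region <region> | \
    docker login --username AWS --password-stdin <aws-account-id>.dkr.ecr.<region>.amazonaws.com
```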
Otherwise, you will need to build the job queue image manually (see [rfcx/arbimon-jobqueue](https://github.com/rfcx/arbimon-jobqueue) and follow its docker build command to tag the image as `rfcx-arbimon-jobqueue`) and then pass in a build arg to set the base image:
`docker build -t rfcx-arbimon-jobs --build-arg base=rfcx-arbimon-jobqueue -f build/Dockerfile .`
Test it:
`docker run -it --rm --env-file config/config.env.sample --env-file ../rfcx-arbimon-jobqueue/config/config.env.sample rfcx-arbimon-jobs`
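For a more realistic run, copy the samples to real config files and point the container at those instead. A sketch; the copied paths are assumptions, and the real values must be filled in by hand:

```sh
# Create real config files from the samples, edit them, then run the jobs image against them
cp config/config.env.sample config/config.env
cp ../rfcx-arbimon-jobqueue/config/config.env.sample ../rfcx-arbimon-jobqueue/config/config.env
docker run -it --rm \
    --env-file config/config.env \
    --env-file ../rfcx-arbimon-jobqueue/config/config.env \
    rfcx-arbimon-jobs
```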