# Getting Started: Developer Guide
Sharead is currently hosted on Heroku at sharead.org. The version under active development is MVP v0.2. Our technical stack is Heroku + Tornado + Redis + S3 for the backend, and CoffeeScript + Mustache + CSS/HTML for the frontend.
The repository layout:

- `browser/`: the Chrome plugin, which lets users pin papers to Sharead.
- `server/`:
  - `frontend/`: the frontend, built with CoffeeScript, Mustache, and CSS/HTML/JavaScript (we might migrate to React.js in the future).
    - `static/`: static files such as JavaScript, CSS, and icons.
    - `template/`: Tornado templates for pages.
  - `sharead/`: the main backend module, written in Python Tornado. It uses `redis` for the in-memory data store and `S3` to store static files.
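To make the record/file split concrete, here is a minimal sketch of how a paper upload might touch both stores. The key names and the `save_paper` helper are hypothetical, not Sharead's actual schema, and the stores are stubbed with dicts so the sketch is self-contained; the real backend would use the Redis and S3 client libraries instead.

```python
# Hypothetical sketch: metadata goes to Redis (one record per paper),
# while the PDF bytes go to S3. Both stores are stubbed with dicts here;
# the key naming is illustrative only, not Sharead's real schema.

redis_store = {}   # stand-in for the Redis data store
s3_bucket = {}     # stand-in for the S3 bucket

def save_paper(paper_id, title, pdf_bytes):
    # the record (metadata) goes to the in-memory store
    redis_store["paper:%s" % paper_id] = {"title": title}
    # the file itself goes to blob storage
    s3_key = "papers/%s.pdf" % paper_id
    s3_bucket[s3_key] = pdf_bytes
    return s3_key
```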
First, install all the Python dependencies. The easiest way to do so is:

```
sudo pip install -r requirements.txt
```

Sharead also depends on pdf2htmlEX. See https://github.com/coolwanglu/pdf2htmlEX/wiki/Building for detailed instructions. On macOS, this step is as simple as:

```
brew install pdf2htmlEX
```
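pdf2htmlEX is presumably used to render uploaded PDFs as HTML. A minimal wrapper the server might use is sketched below; the function names here are hypothetical (Sharead's actual integration may differ), though `--dest-dir` is a real pdf2htmlEX option.

```python
# Hypothetical wrapper around the pdf2htmlEX CLI; the real server code may differ.
import shutil
import subprocess

def build_pdf2htmlex_cmd(pdf_path, out_dir):
    """Build the pdf2htmlEX command line: write output into out_dir."""
    return ["pdf2htmlEX", "--dest-dir", str(out_dir), str(pdf_path)]

def pdf_to_html(pdf_path, out_dir):
    """Convert a PDF to HTML; requires pdf2htmlEX on the PATH."""
    if shutil.which("pdf2htmlEX") is None:
        raise RuntimeError("pdf2htmlEX not found; see the build instructions above")
    subprocess.run(build_pdf2htmlex_cmd(pdf_path, out_dir), check=True)
```

Keeping command construction separate from execution makes the wrapper easy to unit-test without the tool installed.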
After installing the required modules, start the server:

```
cd server
python server.py
```

Then go to http://localhost:5000/ and test out Sharead.
We use Travis CI for unit and integration tests, hosted at travis-ci.com/strin/sharead. After you've made changes to the code, make sure to run, under `server/`:

```
./test.sh
```

This invokes `py.test` and runs all `test_*.py` files in the codebase.
Test layout:

- `server/tests/*.py`: tests for the server routing paths.
- For a module, its tests are put under `tests/test_xxx.py`. For example, the tests for `sharead.storage.s3` are under `sharead.storage.tests.test_s3.py`.
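As a concrete (hypothetical) example of this convention, a module's test file only needs functions named `test_*` for `py.test` to collect and run them; the helper under test below is a toy, not real Sharead code:

```python
# tests/test_example.py -- a hypothetical test file following the convention above.
# py.test collects files named test_*.py and runs every function named test_*.

def normalize_title(title):
    """Toy helper standing in for real module code under test."""
    return " ".join(title.split()).lower()

def test_normalize_title():
    assert normalize_title("  Deep   Learning ") == "deep learning"

def test_normalize_title_empty():
    assert normalize_title("") == ""
```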
To run the tests on Travis CI, just push to origin:

```
git push origin mvp0.2
```
Our backend is hosted on Heroku. Server provisioning is done through the buildpack at github.com/strin/heroku-buildpack-apt. To deploy the code to sharead.org, just run:

```
git push heroku mvp0.2:master
```

This automatically updates the code on the Heroku server. Since records are stored in Redis and files are in S3, a deployment will not flush that data.
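One Heroku detail worth noting: dynos receive their listening port in the `PORT` environment variable, so `server.py` presumably binds that rather than a hard-coded port (5000 being the local default above). A sketch of that convention, assuming this is how the server picks its port:

```python
import os

def server_port(default=5000):
    # On Heroku the dyno's port arrives in $PORT; locally we fall back to 5000,
    # matching the http://localhost:5000/ address used during development.
    return int(os.environ.get("PORT", default))
```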