Merge pull request #37 from natashabatalha/dev
Final merge to master
natashabatalha authored Jul 12, 2021
2 parents 624c399 + 12035bd commit ab0ccfd
Showing 100 changed files with 69,908 additions and 647 deletions.
3 changes: 1 addition & 2 deletions .travis.yml
Original file line number Diff line number Diff line change
Expand Up @@ -2,7 +2,6 @@ language: python

matrix:
include:
- python: 3.6
- python: 3.7

git:
Expand Down Expand Up @@ -31,4 +30,4 @@ script:
- python -c 'import picaso'

after_success:
- test $TRAVIS_BRANCH = "master" && conda install conda-build && conda install anaconda-client && bash conda/conda_upload.sh
- test $TRAVIS_BRANCH = "master" && conda install conda-build && conda install anaconda-client && bash conda/conda_upload.sh
11 changes: 11 additions & 0 deletions HISTORY.rst
Original file line number Diff line number Diff line change
Expand Up @@ -3,6 +3,17 @@
History
-------

2.2 (2021-7-12)
~~~~~~~~~~~~~~~~~~
* Add evolution tracks
* Add ability to use pre-mixed c-k tables
* Expand chemistry to include new Visscher tables
* Add ability to pull out contribution from individual species without running full RT
* Young planet table from ZJ Zhang
* Separate workshop notebooks for Sagan School 2020, 2021 and ERS
* Add explicit "hard surface" term for thermal flux boundary condition for terrestrial calculations
* Minor bug fixes/improvements

2.1 (2020-11-02)
~~~~~~~~~~~~~~~~~~

Expand Down
5 changes: 3 additions & 2 deletions docs/conf.py
Original file line number Diff line number Diff line change
Expand Up @@ -44,6 +44,7 @@
nbsphinx_allow_errors = True
nbsphinx_execute = 'never'

#bibtex_bibfiles = ['refs.bib']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

Expand All @@ -65,9 +66,9 @@
# built documents.
#
# The short X.Y version.
version = '2.1'
version = '2.2'
# The full version, including alpha/beta/rc tags.
release = '2.1'
release = '2.2'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
Expand Down
193 changes: 193 additions & 0 deletions docs/contribution.rst
Original file line number Diff line number Diff line change
@@ -0,0 +1,193 @@
Contributing to ``PICASO``
==========================

PEP 8 Style Guide
-----------------

We generally follow `PEP 8 <https://www.python.org/dev/peps/pep-0008/#descriptive-naming-styles>`_ styling. Below we emphasize the "must haves" of our code.

- Your code should read like a book. If you have ~10 lines of code without a comment you are commenting too infrequently.
- It is really important to style function headers uniformly using `NumPy style Python Docstrings <https://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_numpy.html#example-numpy>`_. This enables `sphinx <http://www.sphinx-doc.org/en/master/>`_ to auto read function headers. Below is an example, but `look here <https://numpydoc.readthedocs.io/en/latest/format.html#sections>`_ for a full list of headers.

.. code-block:: python

    def foo(in1, in2, in3=None):
        """
        Describe function thoroughly.
        Add any relevant citations.

        Parameters
        ----------
        in1 : int
            This variable is for something cool.
        in2 : list of float
            This variable is for something else cool.
        in3 : str, optional
            (Optional) Default=None, this variable is optional.

        Returns
        -------
        int
            Cool output
        float
            Other cool output

        Examples
        --------
        This is how to use this.

        >>> a = foo(5, [5.0, 4.0], in3='hello')

        Warnings
        --------
        Garbage in garbage out
        """
- Variable names should explain what the variable is. Avoid undescriptive names like ``thing1`` or ``thing2``.
- If you have an equation in your code it should be referenced with a paper and equation number. For example:

.. code-block:: python

    #now define terms of Toon et al 1989 quadrature Table 1
    #https://agupubs.onlinelibrary.wiley.com/doi/pdf/10.1029/JD094iD13p16287
    #see table of terms
    sq3 = sqrt(3.)
    g1 = (sq3*0.5)*(2. - w0*(1.+cosbar)) #table 1
    g2 = (sq3*w0*0.5)*(1.-cosbar) #table 1
    g3 = 0.5*(1.-sq3*cosbar*ubar0) #table 1
    lamda = sqrt(g1**2 - g2**2) #eqn 21
    gama = (g1-lamda)/g2 #eqn 22

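In practice, a snippet like this often ends up inside a documented helper, which also makes it testable. A minimal sketch (the function name is hypothetical and not part of ``PICASO``'s API; ``math.sqrt`` stands in for the NumPy import used in the real code):

```python
from math import sqrt

def toon_coefficients(w0, cosbar, ubar0):
    """Two-stream quadrature terms from Toon et al. 1989, Table 1.

    Parameters
    ----------
    w0 : float
        Single scattering albedo.
    cosbar : float
        Asymmetry parameter.
    ubar0 : float
        Cosine of the incident angle.

    Returns
    -------
    tuple of float
        (g1, g2, g3, lamda, gama)
    """
    sq3 = sqrt(3.)
    g1 = (sq3*0.5)*(2. - w0*(1. + cosbar))  # table 1
    g2 = (sq3*w0*0.5)*(1. - cosbar)         # table 1
    g3 = 0.5*(1. - sq3*cosbar*ubar0)        # table 1
    lamda = sqrt(g1**2 - g2**2)             # eqn 21
    gama = (g1 - lamda)/g2                  # eqn 22
    return g1, g2, g3, lamda, gama
```

With the citation in the docstring and each line tagged with its table or equation number, a reviewer can check every term against the paper.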
Github Workflow
---------------

Before contributing, consider submitting an issue request. Sometimes we may already be aware of an issue and can help you fix something faster.

1) Clone the repository
^^^^^^^^^^^^^^^^^^^^^^^

Clone the repository that you are interested in working on.

.. code-block:: bash

    git clone https://github.com/natashabatalha/picaso.git

This will download a copy of the code to your computer. You will automatically be in the ``master`` branch upon downloading.

**Side note: Important distinction between ``master`` and ``dev``**

``master`` always represents the released production code. Here is the workflow we will follow. All major development will be done on branches off of ``dev``. The only exceptions are what we call "hotfixes", which can go directly from the fixed branch to master, and minor bugs that can be directly fixed on ``dev``. See the overall schematic below.

.. image:: github_flow.jpg


2) Create a branch off of ``dev`` with a useful name
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

It's likely you will be working on a specific subset of a bigger code project. Any changes you make on a new branch will not affect ``master`` or ``dev``, so you can feel free to beat up the code without damaging anything that is fully tested.

.. code-block:: bash

    git checkout -b myfeature dev

3) Work work work work...
^^^^^^^^^^^^^^^^^^^^^^^^^
Let's pretend that ``myfeature`` entails working on ``file1.py`` and ``file2.py``. After you are happy with an initial change, commit and push your changes.

.. code-block:: bash

    #commit changes
    git add file1.py file2.py
    git commit -m 'added cool thing'
    #switch to dev branch
    git checkout dev
    #merge your changes
    git merge --no-ff myfeature
    #delete old branch
    git branch -d myfeature
    #push to dev
    git push origin dev

Many people ask: "How often should I commit?" Choose a cadence that works for you and stick to it. I try to work on small, individual tasks and commit when I have finished something. If you try to do too much at once, your commit messages won't correspond well to what you have actually done. Remember, eventually someone will have to review your commits; if they are hard to parse, it will delay the merge of your work.

4) Final merge to ``master``
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
``master`` is generally a protected branch, so talk to the admin or the team before proceeding. In general, merges to ``master`` are easiest done through `Github Online <https://github.com/natashabatalha/picaso>`_. Near where the branches are listed, go to "New Pull Request", write a description of the new ``dev`` capability, and request a merge to ``master``. Once it is approved, you are done!

Using Conda Environments
------------------------

Package and version control is a pain. To make sure everyone gets the same results, it is beneficial if we all work in the same environment. Here are the most pertinent commands you need to know.

Create your own environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^
To create your own environment with a specific name and python package:

.. code-block:: bash

    conda create --name your_env_name python=3.7 -y

If you have specific environment variables that need to be tied to this environment, you can specify them here. For example, ``PICASO`` uses the environment variables ``picaso_refdata`` and ``PYSYN_CDBS``:

.. code-block:: bash

    conda activate your_env_name
    cd $CONDA_PREFIX
    mkdir -p ./etc/conda/activate.d
    mkdir -p ./etc/conda/deactivate.d
    touch ./etc/conda/activate.d/env_vars.sh
    touch ./etc/conda/deactivate.d/env_vars.sh

Edit ``./etc/conda/activate.d/env_vars.sh``:

.. code-block:: bash

    #!/bin/sh
    export MY_VAR='path/to/wherever/you/need'

And edit ``./etc/conda/deactivate.d/env_vars.sh``:

.. code-block:: bash

    #!/bin/sh
    unset MY_VAR

Now whenever you activate your environment, your variable will be set. Whenever you deactivate your environment, it will go away.
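As a quick sanity check, you can confirm from Python that the variable is visible while the environment is active. This is only a sketch; ``check_env`` is a hypothetical helper, not part of ``PICASO``:

```python
import os

def check_env(name):
    """Return the value of an environment variable, raising a helpful error if unset."""
    value = os.environ.get(name)
    if value is None:
        raise OSError(
            f"{name} is not set. Activate your conda environment, or add an "
            "export line to ./etc/conda/activate.d/env_vars.sh"
        )
    return value

# e.g. check_env('picaso_refdata') or check_env('PYSYN_CDBS')
```

Raising early with a pointer to the ``activate.d`` script saves debugging time compared to a bare ``KeyError`` deep inside the code.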

Export your environment
^^^^^^^^^^^^^^^^^^^^^^^
Another great aspect of conda environments is that they can be shared with others.

.. code-block:: bash

    conda env export > my_environment.yml

Create an environment from a ``.yml`` file
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If someone passes you an environment file, you can easily create an environment from it!

.. code-block:: bash

    conda env create -f environment.yml
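For reference, an exported ``.yml`` file looks roughly like this (a minimal, hypothetical example; a real ``conda env export`` pins every package and version in the environment):

```yaml
name: your_env_name
channels:
  - defaults
dependencies:
  - python=3.7
  - pip
  - pip:
      - picaso
```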
Binary file added docs/github_flow.jpg
4 changes: 3 additions & 1 deletion docs/index.rst
Original file line number Diff line number Diff line change
Expand Up @@ -60,8 +60,10 @@ Resources

Installation <installation>
The Tutorials <tutorials>
The Derivations <https://natashabatalha.github.io/picaso_dev>
The Code <picaso>
Contributing <contribution>
Workshop Material <workshops>
The Derivations <https://natashabatalha.github.io/picaso_dev>
Github <https://github.com/natashabatalha/picaso>
The Paper <https://arxiv.org/abs/1904.09355>

Expand Down
44 changes: 36 additions & 8 deletions docs/installation.rst
Original file line number Diff line number Diff line change
@@ -1,6 +1,11 @@
Installation
============

Python Version
--------------

Right now ``PICASO`` is only supported via Travis CI on Python 3.7 because of its dependency on the STScI routine ``photutils``.

Install with Pip
----------------

Expand All @@ -21,7 +26,7 @@ Install with Git
Download and Link Reference Documentation
-----------------------------------------

1) Download the `Reference Folder from Github <https://github.com/natashabatalha/picaso/tree/master/reference>`_. You may have this already if you did a Git clone.
1) Download the `Reference Folder from Github <https://github.com/natashabatalha/picaso/tree/master/reference>`_. You may have this already if you did a Git clone. **Make sure that your reference folder matches the version number of** ``PICASO``. Check the version number in the file ``reference/version.md``.

2) Download the `Resampled Opacity File from Zenodo <https://doi.org/10.5281/zenodo.3759675>`_. Place in the `Opacities reference Folder you downloaded from Github <https://github.com/natashabatalha/picaso/tree/master/reference>`_ (see below in step 3)

Expand All @@ -46,16 +51,28 @@ Should look something like this
ls
base_cases config.json opacities
Your opacities folder shown above should include the file `opacities.db` `file downloaded from zenodo <https://doi.org/10.5281/zenodo.3759675>`_. This is mostly a matter of preference, as PICASO allows you to point to an opacity directory. Personally, I like to store something with the reference data so that I don't have to constantly specify a folder path when running the code.
Your opacities folder shown above should include the ``opacities.db`` `file downloaded from zenodo <https://doi.org/10.5281/zenodo.3759675>`_. This is mostly a matter of preference, as PICASO allows you to point to an opacity directory. Personally, I like to store everything with the reference data so that I don't have to constantly specify a folder path when running the code.

Download and Link Pysynphot Stellar Data
----------------------------------------

In order to get stellar spectra, you will have to download the stellar files used by PySynphot:

1) Download the `stellar spectra from here <https://pysynphot.readthedocs.io/en/latest/appendixa.html>`_. The Defulat for `PICASO` is Castelli-Kurucz Atlas: `ck04models <https://archive.stsci.edu/hlsps/reference-atlases/cdbs/grid/ck04models/>`_.
1) PICASO uses the `Pysynphot package <https://pysynphot.readthedocs.io/en/latest/appendixa.html>`_, which has several download options for stellar spectra. The default for ``PICASO`` is the Castelli-Kurucz Atlas: `ck04models <https://archive.stsci.edu/hlsps/reference-atlases/cdbs/grid/ck04models/>`_.

You can download them by doing this:

.. code-block:: bash

    wget http://ssb.stsci.edu/trds/tarfiles/synphot3.tar.gz

When you untar this you should get a directory structure that looks like ``<path>/grp/redcat/trds/grid/ck04models``. Some other people have reported a directory structure that looks like ``<path>/grp/hst/cdbs/grid/ck04models``. **The full directory structure does not matter**, only the last portion, ``grid/ck04models``. You will need to create an environment variable that points to where ``grid/`` is located. See below.

.. code-block:: bash

    wget -r https://archive.stsci.edu/hlsps/reference-atlases/cdbs/grid/ck04models/

2) Create environment variable
2) Create environment variable via bash

.. code-block:: bash
Expand All @@ -65,14 +82,25 @@ And add this line:

.. code-block:: bash
export PYSYN_CDBS="/path/to/data/files/grp/hst/cdbs"
export PYSYN_CDBS="<your_path>/grp/redcat/trds"
Should look something like this
Then always make sure to source your bash profile after you make changes.

.. code-block:: bash
cd /path/to/data/files/grp/hst/cdbs
source ~/.bash_profile
Now you should be able to check the path:

.. code-block:: bash

    cd $PYSYN_CDBS
    ls
    grid

Where `grid` contains whatever `pysynphot` data files you have downloaded (e.g. a folder called `ck04models`).
Where the folder ``grid/`` contains whatever ``pysynphot`` data files you have downloaded (e.g. a folder called ``ck04models/``).

.. note::

    STScI serves these files in a few different places, with a few different file structures. **PySynphot only cares that the environment variable points to a path with a folder called** ``grid``, so do not worry if ``grp/hst/cdbs`` appears different.
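If you want to verify the layout programmatically, here is a small sketch (``check_pysyn_cdbs`` is a hypothetical helper, not part of ``pysynphot``) that checks the environment variable points at a directory containing ``grid/``:

```python
import os

def check_pysyn_cdbs(path=None):
    """Return True if `path` (default: $PYSYN_CDBS) contains a grid/ folder."""
    path = path or os.environ.get("PYSYN_CDBS", "")
    return os.path.isdir(os.path.join(path, "grid"))
```

This mirrors what PySynphot needs at import time: only the ``grid/`` folder under ``$PYSYN_CDBS`` matters, not the directories above it.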
