[REVIEW]: RAMP: stochastic simulation of user-driven energy demand time series #6418

Closed
editorialbot opened this issue Feb 27, 2024 · 106 comments
Labels: accepted · published (Papers published in JOSS) · Python · recommend-accept (Papers recommended for acceptance in JOSS) · review · TeX · Track: 3 (PE) Physics and Engineering

Comments

@editorialbot

editorialbot commented Feb 27, 2024

Submitting author: @FLomb (Francesco Lombardi)
Repository: https://github.com/RAMP-project/RAMP
Branch with paper.md (empty if default branch): joss-paper
Version: 0.5.2
Editor: @AdamRJensen
Reviewers: @FabianHofmann, @trevorb1
Archive: 10.5281/zenodo.11526597

Status

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/e87d43a5f8747d2b6e8d65f98446baf3"><img src="https://joss.theoj.org/papers/e87d43a5f8747d2b6e8d65f98446baf3/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/e87d43a5f8747d2b6e8d65f98446baf3/status.svg)](https://joss.theoj.org/papers/e87d43a5f8747d2b6e8d65f98446baf3)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@noah80 & @FabianHofmann, your review will be checklist-based. Each of you will have a separate checklist that you should update when carrying out your review.
First of all you need to run this command in a separate comment to create the checklist:

@editorialbot generate my checklist

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @AdamRJensen know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks at the very latest.

Checklists

📝 Checklist for @noah80

📝 Checklist for @FabianHofmann

📝 Checklist for @trevorb1

@editorialbot

Hello humans, I'm @editorialbot, a robot that can help you with some common editorial tasks.

For a list of things I can do to help you, just type:

@editorialbot commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@editorialbot generate pdf

@editorialbot

Software report:

github.com/AlDanial/cloc v 1.88  T=1.10 s (79.0 files/s, 367152.5 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          23            762           1110           2629
reStructuredText                43            999           1917            566
Jupyter Notebook                10              0         395383            478
TeX                              1             21              0            201
Markdown                         3             77              0            155
YAML                             5             19             15            116
DOS Batch                        1              8              1             26
make                             1              4              7              9
-------------------------------------------------------------------------------
SUM:                            87           1890         398433           4180
-------------------------------------------------------------------------------


gitinspector failed to run statistical information for the repository

@editorialbot

Wordcount for paper.md is 647

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/j.esr.2023.101171 is OK
- 10.1016/j.esd.2023.05.004 is OK
- 10.1016/j.enconman.2023.117223 is OK
- 10.1016/j.segan.2023.101043 is OK
- 10.1016/j.segy.2022.100088 is OK
- 10.1016/j.apenergy.2022.118676 is OK
- 10.1016/j.esd.2021.10.009 is OK
- 10.1016/j.esd.2020.07.002 is OK
- 10.1016/j.energy.2019.04.097 is OK
- 10.1016/j.segan.2023.101120 is OK
- 10.1016/j.joule.2022.05.009 is OK
- 10.3390/app10217445 is OK
- 10.5281/zenodo.10275752 is OK

MISSING DOIs

- 10.24251/hicss.2023.097 may be a valid DOI for title: Sustainable Energy System Planning in Developing Countries: Facilitating Load Profile Generation in Energy System Simulations

INVALID DOIs

- None

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@noah80

noah80 commented Mar 1, 2024

Review checklist for @noah80

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/RAMP-project/RAMP?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@FLomb) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@noah80

noah80 commented Mar 1, 2024

@FLomb: Could you clarify the individual contributions of the authors?

@AdamRJensen

@editorialbot add @trevorb1 as reviewer

@editorialbot

@trevorb1 added to the reviewers list!

@FLomb

FLomb commented Mar 5, 2024

@FLomb: Could you clarify the individual contributions of the authors?

Hi @noah80, the contributions are the following (thanks for asking; we didn't know they were requested):

  • F. Lombardi: conceptualisation, code development, code maintenance, code review, documentation, project coordination
  • PF. Duc: code development, code maintenance, code review, documentation, project coordination
  • M.A. Tahavori: code development, code maintenance, code review, documentation
  • C. Sanchez-Solis: code development, code testing, code review
  • S. Eckhoff: code development (interface), code testing
  • M.C.G. Hart: code development (interface), code testing
  • F. Sanvito: conceptualisation (mobility), code development (mobility), code testing, project coordination
  • G. Ireland: code testing, code review, feature coordination
  • S. Balderrama: code testing, code review, project coordination
  • S. Quoilin: conceptualisation, code review

@AdamRJensen

@FabianHofmann, @trevorb1 👋 Don't hesitate to reach out if you need help getting started with your reviewer checklist 😄

@FabianHofmann

FabianHofmann commented Mar 15, 2024

Review checklist for @FabianHofmann

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/RAMP-project/RAMP?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@FLomb) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@FabianHofmann

Comment and Questions

Hey @FLomb, very nice project! To get the ball rolling, here is the first round of comments from my side.

General checks

  1. Some of the authors do not appear in the contributors list of the GitHub repository. Could you clarify their roles? https://github.com/RAMP-project/RAMP/graphs/contributors
  2. JOSS asks for reproducibility of the figure in the paper (Data Sharing & Reproducibility). Is the code online somewhere? I could not see it in the joss-paper branch of the repo.

Functionality

  1. The README proposes creating a Python environment with python=3.8. This Python version will soon be outdated. Does the software support newer Python versions? If yes, I would suggest updating the Python version in the README.
  2. After following the installation instructions, which work fine, I had a look at the example at https://github.com/RAMP-project/RAMP?tab=readme-ov-file#building-a-model-with-a-python-script. What I find confusing is that I cannot display the household_1 object (see the sketch below). The error message is clear (there are no appliances), but for newbies this might be confusing. Also, empty objects that are this basic to the whole software should be displayable.
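
For reference, here is a minimal sketch of what I tried, following the README example (the constructor arguments are my reading of the docs and only illustrative):

from ramp import User

# Create a user category with no appliances yet, as in the first step of the README example
household_1 = User(user_name="Household 1", num_users=10)

# Displaying the still-empty object fails, because no appliances have been added yet
print(household_1)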

Documentation

The Readthedocs documentation is great with all its examples. A few minor remarks:

  1. Some examples, like the fixed flat-appliance example, do not have an intro at the beginning. Such intros are, however, very valuable for understanding the context.
  2. I find the navigation in the left sidebar (with the partial TOC) a bit counter-intuitive; I easily get lost without knowing where I am. If you agree, I would suggest increasing the scope of the sidebar's TOC.
  3. What is the test coverage of the project? Since there are not that many tests, it would be good to check that a minimum coverage is achieved.
  4. I cannot find contribution guidelines as required by JOSS.

Software paper

  1. The "State of the field" section is missing. It would be interesting to know what alternative software exists and how RAMP compares.

Again, nice collaborative project!

@FLomb

FLomb commented Mar 18, 2024

Hi @FabianHofmann,

Thanks for the first round of comments! We will work on those as soon as possible. In the meantime, I am providing below some answers to questions that have a quick solution:

General checks

  1. Author contributions: I have clarified the author contributions above in this thread in reply to a comment by another reviewer. Please let me know if you see any remaining issues with those, and if you have advice on how to best make those explicit for future reference. Most authors did contribute to the code. The few exceptions should align with JOSS's guidelines, which state that "financial and organisational contributions are not considered sufficient for co-authorship, but active project direction and other forms of non-code contributions are".

Documentation
4. Contribution guidelines should be accessible as part of the docs (https://rampdemand.readthedocs.io/en/latest/intro.html#contributing). Is this sufficient, or are we missing something?

Thanks again for the valuable comments; we'll be in touch soon about the rest.

@Bachibouzouk

@AdamRJensen - We wanted to ask how we should address the review comments. Should we commit them directly on the review-branch? Or should we open PRs onto the review-branch?

Thanks for editing our paper :)

@AdamRJensen

@AdamRJensen - We wanted to ask how we should address the review comments. Should we commit them directly on the review-branch? Or should we open PRs onto the review-branch?

Personally, I would prefer a PR, as this gives the reviewers a good overview of what changes you're making to address their comments (you can tag the reviewers in the relevant PRs). Also, if there are multiple major changes, I would make multiple PRs.

A pleasure to serve as your editor 😄

@Bachibouzouk

Personally, I would prefer a PR, as this gives the reviewers a good overview of what changes you're making to address their comments (you can tag the reviewers in the relevant PRs). Also, if there are multiple major changes, I would make multiple PRs.

Thanks for your suggestion, we will address the comments in PRs then! :)

If I understood the JOSS docs correctly, once the review process is over we create a new tagged release that encompasses the changes made in the context of the review, and this new release will then be associated with the JOSS publication. Is that correct?

@AdamRJensen

If I understood the JOSS docs correctly, once the review process is over we create a new tagged release that encompasses the changes made in the context of the review, and this new release will then be associated with the JOSS publication. Is that correct?

That is correct: it's the final version at the end of the review process that will be associated with the JOSS paper. This way we get all the good reviewer feedback included.

@trevorb1

trevorb1 commented Mar 27, 2024

Review checklist for @trevorb1

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at https://github.com/RAMP-project/RAMP?
  • License: Does the repository contain a plain-text LICENSE or COPYING file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@FLomb) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?
  • Data sharing: If the paper contains original data, data are accessible to the reviewers. If the paper contains no original data, please check this item.
  • Reproducibility: If the paper contains original results, results are entirely reproducible by reviewers. If the paper contains no original results, please check this item.
  • Human and animal research: If the paper contains original data research on human subjects or animals, does it comply with JOSS's human participants research policy and/or animal research policy? If the paper contains no such data, please check this item.

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of need' that clearly states what problems the software is designed to solve, who the target audience is, and its relation to other work?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

@trevorb1

Comments

Hi @FLomb; thanks so much for submitting RAMP and suggesting me as a reviewer! The collaboration, documentation, and existing use in the literature of the RAMP project are super nice! Please find my comments below.

General Checks

  1. Same question as @FabianHofmann: is there code with which I can reproduce Figure 1?
  2. Your reference "Generating high-resolution multi-energy load profiles for remote areas with an open-source stochastic model" is a 2019 paper in Energy describing the first version of RAMP and validating its functionality, I believe? If so, can you please describe (at a high level) how this submission differs from that paper? This doesn't necessarily need to go in the paper; it's just for my reference to clarify the differences. (Sidenote: I know other packages like pyam are published in different journals - ORE and JOSS in that case - so I'm not trying to attack this submission or anything! Just trying to understand the differences.)

Functionality

  1. The "Using real calendar days to generate profiles" example walks through how to use the CLI to generate a number of profiles. I tried to use the Excel file generated from the "Using tabular inputs to build a model" example as an input, but received a matplotlib error for the second plot. The command I ran and the error I received are shown below. I can attach the full traceback if that helps. Not sure if I am just misunderstanding what my input file should be?
$ ramp -i example_excel_usecase.xlsx -n 10
ValueError: x and y must have same first dimension, but have shapes (1440,) and (1,)
  2. I couldn't find where package dependencies are listed. When I tried to run the tests, I got a ModuleNotFoundError: No module named 'scipy' error. The only place I saw dependencies was in the ramp/__init__.py file, but scipy is not listed there. If I pip install scipy into my environment and run pytest tests/, I get a ModuleNotFoundError: No module named 'nbconvert'. I had to pip install nbconvert as well to get the tests to run. Clarifying the dependencies for using vs. developing would help!

  3. If I run tests (with the extra pip installs), I get one failure:

FAILED tests/test_switch_on.py::TestRandSwitchOnWindow::test_coincidence_normality_on_peak - AssertionError: The 'coincidence' values are not normally distributed.

assert 0.029748765751719475 > 0.05

Documentation

  1. In the Quick Start section, appliances are added through the method User.Appliance(...). Later on, in the examples section, appliances are added through the method User.add_appliance(...). Based on the API reference, the Appliance(...) method should only be used when working with legacy code? If users are encouraged to use the add_appliance(...) method, updating the docs to reflect this would be good (see the sketch after this list)! (The same issue also appears in the "Appliances with multiple cycles" example.)

  2. I find it a little confusing that the documentation site's contribution guidelines and the CONTRIBUTING.md file in the repository don't match.

  3. It may be worth noting in the contributing guidelines that developers should follow Black formatting, and that contributions will be checked for it with your GitHub Actions (although I do see you have the Black badge on the README.md). Alternatively, having a pre-commit file, or specifying in the contributing guidelines how to install Black in VSCode (or similar) to autolint contributions, would be good!

  4. In the contribution guidelines, it may be good to specify what to include in a new issue ticket and PR (or even create templates for these). For example, for a new issue, do you want to know the OS the user is running, the version of RAMP they are running, etc.?

  5. In the contribution guidelines on the docs site, you mention contributors should perform qualitative testing only. The repository's CONTRIBUTING.md file, however, mentions also running tests via pytest. Please clarify what tests contributors should be running.

  6. At the bottom of the introduction page, there is a note that says "This project is under active development!". Can you please clarify what "active development" means? I just want to ensure this does not mean we can't trust the results. (Sorry, I know this one is a little pedantic!)

  7. The examples are great at walking the reader through the many different functions of RAMP! However, I think in the first example it would be beneficial to explicitly write out what all the different arguments in the User(...) and add_appliance(...) calls are doing. This could maybe be done through a top-level description like in some of your other examples (as also suggested by @FabianHofmann). At first, especially for the add_appliance(...) method, it wasn't clear to me what the different arguments did. While I eventually did find the info I was looking for in the Appliance.__init__ API reference, this was not the first place I intuitively thought to look (which was the add_appliance(...) method API reference).
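
To illustrate points 1 and 7, here is a rough sketch of the two styles as I currently understand them from the docs; the parameter names and values below are illustrative and may not match the actual API exactly:

from ramp import User

household = User(user_name="Household", num_users=1)

# Quick Start / legacy style (positional arguments, harder to read):
# indoor_bulb = household.Appliance(household, 6, 7, 2, 120)

# Examples-section style, which seems to be the recommended one:
indoor_bulb = household.add_appliance(
    number=6,       # number of identical appliances per user
    power=7,        # rated power of each appliance in W
    num_windows=2,  # number of daily usage windows
    func_time=120,  # total functioning time per day in minutes
)

If add_appliance(...) is indeed the preferred entry point, pointing readers to Appliance.__init__ (or duplicating its argument documentation under add_appliance) would make the first example much easier to follow.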

Software Paper

  1. I agree with @FabianHofmann; a comparison of similar software/tools is needed.
  2. Expanding on your sentence that RAMP "features several degrees of customisations" to explicitly describe some of these customisations would be great, I think! For example, being able to generate different load profiles for different uses (EVs, hot water, cooking, etc.) is a valuable point for readers, I believe.

@Bachibouzouk

@FabianHofmann, @trevorb1 thanks for your insightful and helpful comments! We will start addressing them in PRs from the joss-paper branch (https://github.com/RAMP-project/RAMP/tree/joss-paper) :)

@Bachibouzouk

3. If I run tests (with the extra pip installs), I get one failure:

@trevorb1 This is actually a known issue; this test fails randomly: RAMP-project/RAMP#99. I will use this review to push myself to investigate it further. If you have good suggestions of methodologies/articles about testing stochastic code, they are welcome :)
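
One option I am considering is to repeat the check over several fixed seeds and only fail if the normality hypothesis is rejected more often than the significance level would suggest. A rough sketch (the data generation below is a stand-in for illustration, not the actual coincidence computation):

import numpy as np
from scipy import stats

def test_coincidence_normality_on_peak():
    n_runs = 20
    alpha = 0.05
    rejections = 0
    for seed in range(n_runs):
        rng = np.random.default_rng(seed)
        # Stand-in for the stochastic 'coincidence' values produced by the model
        coincidence = rng.normal(loc=0.5, scale=0.1, size=200)
        _, p_value = stats.shapiro(coincidence)
        if p_value < alpha:
            rejections += 1
    # With alpha = 0.05, roughly one rejection in 20 runs is expected by chance;
    # tolerate a couple before declaring the values non-normal.
    assert rejections <= 2, "'coincidence' values deviate from normality too often"

Fixing the seeds would also make any remaining failure reproducible when it does occur.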

@AdamRJensen

AdamRJensen commented Jun 9, 2024

MISSING DOIs

  • No DOI given, and none found for title: RAMP: stochastic multi-energy demand profiles
  • 10.24251/hicss.2023.097 may be a valid DOI for title: Sustainable Energy System Planning in Developing C...

INVALID DOIs

  • https://doi.org/10.1016/j.esd.2016.01.005 is INVALID because of 'https://doi.org/' prefix

@FLomb could you fix these? Also, instead of citing the GitHub repo, I think you could just put the link in parentheses.

@FLomb

FLomb commented Jun 10, 2024

@FLomb could you fix these? Also, instead of citing the GitHub repo, I think you could just put the link in parentheses.

@AdamRJensen I'm sorry about the missing/invalid DOIs. We had checked them, but these issues must have slipped through. What is the best way to fix them? Should we make a new minor release with the corrected paper?

@AdamRJensen

AdamRJensen commented Jun 10, 2024

@AdamRJensen I'm sorry about the missing/invalid DOIs. We had checked them, but these issues must have slipped through. What is the best way to fix them? Should we make a new minor release with the corrected paper?

I don't see any need to make a new release; if you could just make the changes to the branch with the paper, that would be fine. The version and DOI refer only to the code part and not the paper part, as far as I understand.

@FLomb

FLomb commented Jun 11, 2024

@AdamRJensen Thanks for the clarification. I have fixed the DOI issues in the joss-paper branch.

@AdamRJensen

@editorialbot generate pdf

@editorialbot

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@AdamRJensen

@editorialbot check references

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/j.esr.2023.101171 is OK
- 10.1016/j.esd.2023.05.004 is OK
- 10.1016/j.enconman.2023.117223 is OK
- 10.1016/j.segan.2023.101043 is OK
- 10.1016/j.segy.2022.100088 is OK
- 10.1016/j.apenergy.2022.118676 is OK
- 10.1016/j.esd.2021.10.009 is OK
- 10.1016/j.esd.2020.07.002 is OK
- 10.1016/j.energy.2019.04.097 is OK
- 10.1016/j.segan.2023.101120 is OK
- 10.1016/j.joule.2022.05.009 is OK
- 10.3390/app10217445 is OK
- 10.24251/hicss.2023.097 is OK
- 10.5281/zenodo.10275752 is OK
- 10.17028/rd.lboro.2001129.v8 is OK
- 10.1016/j.esd.2016.01.005 is OK
- 10.1186/s42162-021-00180-6 is OK
- 10.21105/joss.03574 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@AdamRJensen

@editorialbot recommend-accept

@editorialbot

Attempting dry run of processing paper acceptance...

@editorialbot

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1016/j.esr.2023.101171 is OK
- 10.1016/j.esd.2023.05.004 is OK
- 10.1016/j.enconman.2023.117223 is OK
- 10.1016/j.segan.2023.101043 is OK
- 10.1016/j.segy.2022.100088 is OK
- 10.1016/j.apenergy.2022.118676 is OK
- 10.1016/j.esd.2021.10.009 is OK
- 10.1016/j.esd.2020.07.002 is OK
- 10.1016/j.energy.2019.04.097 is OK
- 10.1016/j.segan.2023.101120 is OK
- 10.1016/j.joule.2022.05.009 is OK
- 10.3390/app10217445 is OK
- 10.24251/hicss.2023.097 is OK
- 10.5281/zenodo.10275752 is OK
- 10.17028/rd.lboro.2001129.v8 is OK
- 10.1016/j.esd.2016.01.005 is OK
- 10.1186/s42162-021-00180-6 is OK
- 10.21105/joss.03574 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@editorialbot

👋 @openjournals/pe-eics, this paper is ready to be accepted and published.

Check final proof 👉📄 Download article

If the paper PDF and the deposit XML files look good in openjournals/joss-papers#5485, then you can now move forward with accepting the submission by compiling again with the command @editorialbot accept

editorialbot added the recommend-accept label on Jun 11, 2024
@kyleniemeyer

Hi @FLomb, it looks like one of your coauthors (Sergio Balderrama) is missing from the Zenodo author/contributor list. Can you correct that? You can edit the Zenodo archive metadata without needing a new version/DOI.

@FLomb

FLomb commented Jun 12, 2024

Hi @FLomb, it looks like one of your coauthors (Sergio Balderrama) is missing from the Zenodo author/contributor list. Can you correct that? You can edit the Zenodo archive metadata without needing a new version/DOI.

Hi @kyleniemeyer, thank you for spotting it; that was quite a mistake! Sergio Balderrama has now been added to the Zenodo author list.

@kyleniemeyer

@editorialbot accept

@editorialbot

Doing it live! Attempting automated processing of paper acceptance...

@editorialbot

Ensure proper citation by uploading a plain text CITATION.cff file to the default branch of your repository.

If using GitHub, a Cite this repository menu will appear in the About section, containing both APA and BibTeX formats. When exported to Zotero using a browser plugin, Zotero will automatically create an entry using the information contained in the .cff file.

You can copy the contents for your CITATION.cff file here:

CITATION.cff

cff-version: "1.2.0"
authors:
- family-names: Lombardi
  given-names: Francesco
  orcid: "https://orcid.org/0000-0002-7624-5886"
- family-names: Duc
  given-names: Pierre-François
  orcid: "https://orcid.org/0000-0001-8698-8854"
- family-names: Tahavori
  given-names: Mohammad Amin
  orcid: "https://orcid.org/0000-0002-7753-0523"
- family-names: Sanchez-Solis
  given-names: Claudia
  orcid: "https://orcid.org/0000-0003-2385-7392"
- family-names: Eckhoff
  given-names: Sarah
  orcid: "https://orcid.org/0000-0002-6168-4835"
- family-names: Hart
  given-names: Maria C. G.
  orcid: "https://orcid.org/0000-0002-1031-9782"
- family-names: Sanvito
  given-names: Francesco
  orcid: "https://orcid.org/0000-0002-9152-9684"
- family-names: Ireland
  given-names: Gregory
- family-names: Balderrama
  given-names: Sergio
- family-names: Kraft
  given-names: Johann
- family-names: Dhungel
  given-names: Gokarna
- family-names: Quoilin
  given-names: Sylvain
contact:
- family-names: Lombardi
  given-names: Francesco
  orcid: "https://orcid.org/0000-0002-7624-5886"
doi: 10.5281/zenodo.11526597
message: If you use this software, please cite our article in the
  Journal of Open Source Software.
preferred-citation:
  authors:
  - family-names: Lombardi
    given-names: Francesco
    orcid: "https://orcid.org/0000-0002-7624-5886"
  - family-names: Duc
    given-names: Pierre-François
    orcid: "https://orcid.org/0000-0001-8698-8854"
  - family-names: Tahavori
    given-names: Mohammad Amin
    orcid: "https://orcid.org/0000-0002-7753-0523"
  - family-names: Sanchez-Solis
    given-names: Claudia
    orcid: "https://orcid.org/0000-0003-2385-7392"
  - family-names: Eckhoff
    given-names: Sarah
    orcid: "https://orcid.org/0000-0002-6168-4835"
  - family-names: Hart
    given-names: Maria C. G.
    orcid: "https://orcid.org/0000-0002-1031-9782"
  - family-names: Sanvito
    given-names: Francesco
    orcid: "https://orcid.org/0000-0002-9152-9684"
  - family-names: Ireland
    given-names: Gregory
  - family-names: Balderrama
    given-names: Sergio
  - family-names: Kraft
    given-names: Johann
  - family-names: Dhungel
    given-names: Gokarna
  - family-names: Quoilin
    given-names: Sylvain
  date-published: 2024-06-12
  doi: 10.21105/joss.06418
  issn: 2475-9066
  issue: 98
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 6418
  title: "RAMP: stochastic simulation of user-driven energy demand time
    series"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.06418"
  volume: 9
title: "RAMP: stochastic simulation of user-driven energy demand time
  series"

If the repository is not hosted on GitHub, a .cff file can still be uploaded to set your preferred citation. Users will be able to manually copy and paste the citation.

Find more information on .cff files here and here.

@editorialbot

🐘🐘🐘 👉 Toot for this paper 👈 🐘🐘🐘

@editorialbot

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.06418 joss-papers#5495
  2. Wait five minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.06418
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...

editorialbot added the accepted and published labels on Jun 12, 2024
@kyleniemeyer

Congratulations @FLomb on your article's publication in JOSS! Please consider signing up as a reviewer if you haven't already.

Many thanks to @FabianHofmann and @trevorb1 for reviewing this, and @AdamRJensen for editing.

@editorialbot

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.06418/status.svg)](https://doi.org/10.21105/joss.06418)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.06418">
  <img src="https://joss.theoj.org/papers/10.21105/joss.06418/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.06418/status.svg
   :target: https://doi.org/10.21105/joss.06418


We need your help!

The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:

@FLomb

FLomb commented Jun 13, 2024

Wonderful! Many thanks, @trevorb1, @FabianHofmann, @AdamRJensen and @kyleniemeyer, for contributing to this process in your different roles!

@FabianHofmann

Congrats! Was a pleasure!
