
Commit

Update opportunities.html
victoriaBrook committed Oct 17, 2024
1 parent 27334b5 · commit edc71c0
Showing 1 changed file with 4 additions and 3 deletions.
opportunities.html: 7 changes (4 additions & 3 deletions)
@@ -20,7 +20,7 @@ <h3 id="ai_safety"><a href="aisafety">Learn more context on large-scale risks fr
 <section>
 <div class="inner">
 <h1>Opportunities</h1>
-<p><i>Last updated: 10/15/24</i></p>
+<p><i>Last updated: 10/17/24</i></p>
 <!-- <p style="margin-left:5%"><i>Opportunities relevant to <a href="aisafety">reducing large-scale risks from advanced AI</a>.</i></p> -->
 
 
@@ -114,11 +114,12 @@ <h3 id="funding">Funding Opportunities</h3>
 <li><a href="https://futureoflife.org/grant-program/mitigate-ai-driven-power-concentration/">How to Mitigate AI-driven Power Concentration</a> <i>(deadline: 10/31/24)</i></li>
 </ul>
+<li><a href=https://www.cooperativeai.com/contests/concordia-2024>Cooperative AI Foundation Concordia Contest 2024</a><i> (deadline: 11/1/2024)</i></li>
+<li><a href="https://taskdev.metr.org/bounty/">METR Evaluation Task Bounty</a></i> (<i>related: <a href="https://metr.github.io/autonomy-evals-guide/">METR's Autonomy Evaluation Resources</a></i>)</li>
+<li><a href="https://www.schmidtsciences.org/safe-ai/">Schmidt Sciences: Safety Assurance through Fundamental Science in Emerging AI</a> <i>(deadline: 11/08/2024)</i></li>
 <li><a href="https://www.aisi.gov.uk/grants">UK AI Safety Institute: Systemic AI Safety Grants</a><i> (deadline: 11/26/24)</i></li>
 <li><a href="https://www.mlsafety.org/safebench">SafeBench Competition</a> (<i>deadline: 2/25/2025; $250k in prizes</i>)</li>
 <li><a href="https://new.nsf.gov/funding/opportunities/secure-trustworthy-cyberspace-satc">NSF Secure and Trustworthy Cyberspace Grants</a></li>
 <li><a href ="https://foresight.org/ai-safety/">Foresight Institute: Grants for Security, Cryptography & Multipolar Approaches to AI Safety <i>(quarterly applications)</i></a></li>
-<li><a href="https://www.schmidtsciences.org/safe-ai/">Schmidt Sciences: Safety Assurance through Fundamental Science in Emerging AI</a> <i>(deadline: 11/08/2024)</i></li>
-<li><a href="https://taskdev.metr.org/bounty/">METR Evaluation Task Bounty</a></i> (<i>related: <a href="https://metr.github.io/autonomy-evals-guide/">METR's Autonomy Evaluation Resources</a></i>)</li>
 <li><a href="https://funds.effectivealtruism.org/funds/far-future">Long-Term Future Fund</a></li>
 <!-- https://jobs.80000hours.org/?refinementList%5Btags_area%5D%5B0%5D=AI%20safety%20%26%20policy&refinementList%5Btags_role_type%5D%5B0%5D=Funding -->
 

