Update opportunities.html
victoriaBrook committed Sep 19, 2024
1 parent e616d20 commit 557de21
Showing 1 changed file with 11 additions and 5 deletions.
16 changes: 11 additions & 5 deletions opportunities.html
@@ -20,7 +20,7 @@ <h3 id="ai_safety"><a href="aisafety">Learn more context on large-scale risks fr
<section>
<div class="inner">
<h1>Opportunities</h1>
<p><i>Last updated: 09/09/24</i></p>
<p><i>Last updated: 09/19/24</i></p>
<!-- <p style="margin-left:5%"><i>Opportunities relevant to <a href="aisafety">reducing large-scale risks from advanced AI</a>.</i></p> -->


@@ -84,6 +84,7 @@ <h3>Job Opportunities</h3>
<li class="expandable"><a href="connections">Postdoctoral Positions and PhDs</a></li>
<ul>
<li>You can use the <a href="https://airtable.com/appWAkbSGU6x8Oevt/shr70kvYK7xlrPr5s">filtered view</a> of our database to find professors with open positions of any seniority, or the <a href="connections">unfiltered view</a> to find potential collaborators.</li>
<li>We'd also like to highlight the Future of Life Institute's funding for <a href="https://futureoflife.org/grant-program/phd-fellowships/">Technical PhD Fellowships</a> and <a href="https://futureoflife.org/grant-program/postdoctoral-fellowships/">Technical Postdoctoral Fellowships</a>.</li>
</ul>
<!-- <li class="expandable" data-toggle="closed_orgs"><i>Currently Closed Opportunities</i></li>
<ul>
@@ -109,13 +110,18 @@ <h3 id="funding">Funding Opportunities</h3>
<li><a href="https://www.cooperativeai.com/grants/cooperative-ai">Cooperative AI Research Grants</a> (<i>deadline: 10/6</i>)</li>
<li><a href="https://taskdev.metr.org/bounty/">METR Evaluation Task Bounty</a></i> (<i>related: <a href="https://metr.github.io/autonomy-evals-guide/">METR's Autonomy Evaluation Resources</a></i>)</li>
<li><a href="https://www.mlsafety.org/safebench">SafeBench Competition</a> (<i>deadline: 2/25/2025; $250k in prizes</i>)</li>
<li class="expandable"><a href="https://futureoflife.org/">Future of Life Institute:</a> </li>
<ul>
<li><a href="https://futureoflife.org/grant-program/mitigate-ai-driven-power-concentration/">How to Mitigate AI-driven Power Concentration</a></li>
<li><a href="https://futureoflife.org/grant-program/phd-fellowships/">PhD Fellowships</a><i> (deadline: 11/20/24)</i> </li>
<li><a href="https://futureoflife.org/grant-program/postdoctoral-fellowships/">Postdoctoral Fellowships</a> <i>(deadline: 01/06/25)</i></li>
</ul>
<li class="expandable"><a href="https://www.anthropic.com/news/a-new-initiative-for-developing-third-party-model-evaluations">Anthropic Model Evaluation Initiative</a></li>
<ul>
<li>Note that although proposals are welcome, they will not be assessed until the round 1 proposals have been processed (date TBD).</li>
</ul>
<li><a href="https://new.nsf.gov/funding/opportunities/secure-trustworthy-cyberspace-satc">NSF Secure and Trustworthy Cyberspace Grants</a></li>
<li><a href ="https://foresight.org/ai-safety/">Foresight Institute: Grants for Security, Cryptography & Multipolar Approaches to AI Safety <i>(quarterly applications)</i></a></li>
<li><a href="https://futureoflife.org/grant-program/mitigate-ai-driven-power-concentration/">Future of Life Institute: How to Mitigate AI-driven Power Concentration</a></li>
<li class="expandable"><a href="https://www.aria.org.uk/programme-safeguarded-ai/">ARIA's Safeguarded AI Program</a> <i>(deadline: 10/2)</i></li>
<ul>
<li>Safeguarded AI aims to provide quantitative safety guarantees for AI. Their current funding round is for demonstrations that AI systems with such guarantees are useful and profitable in safety-critical contexts (e.g. optimising energy networks, clinical trials, or telecommunications).</li>
@@ -161,10 +167,10 @@ <h3 class="expandable" id="visitor-programs">AI Safety Programs / Fellowships /
<ul>
<li><a href="https://www.constellation.org">Constellation</a> is offering year-long salaried positions ($100K-$180K) at their office (Berkeley, CA) for experienced researchers, engineers, entrepreneurs, and other professionals to pursue self-directed work on one of Constellation's <a href="https://www.constellation.org/focus-areas">focus areas</a><a href="https://airtable.com/appEr4IN5Kkzu9GLq/shr3LgseSOaRxA2mQ">Apply here</a>. See here for <a href="https://www.constellation.org/programs/residency">more details</a>.</li>
</ul>
<!-- <li class="expandable" data-toggle="mats_description"><a href="https://www.matsprogram.org/">MATS Winter Program</a> (<i>Neel Nanda and Arthur Conmy's streams only. Deadline: 8/30/24. Aimed primarily at students</i>)</li>
<li class="expandable" data-toggle="mats_description"><a href="https://www.matsprogram.org/">MATS Winter Program</a> ( Deadline: 10/6/24</i>)</li>
<ul>
<li>The <a href="https://www.matsprogram.org/">ML Alignment & Theory Scholars (MATS)</a> Program is an educational seminar and independent research program that aims to provide talented scholars with talks, workshops, and research mentorship in the field of AI alignment and safety. We also connect them with the Berkeley alignment research community. Our Winter Program will run from early Jan, 2025. General applications have now closed for the Winter 2024-5 cohort, but you can still apply to Neel Nanda or Arthur Conmy's stream until August 30th. Follow the instructions on the <a href="https://www.matsprogram.org/">MATS homepage</a> to apply.</li>
</ul>-->
<li>The <a href="https://www.matsprogram.org/">ML Alignment & Theory Scholars (MATS)</a> Program is an educational seminar and independent research program that aims to provide talented scholars with talks, workshops, and research mentorship in the field of AI alignment and safety. The Winter Program will run Jan 6 - Mar 14, 2025. Follow the instructions on the <a href="https://www.matsprogram.org/">MATS homepage</a> to apply.</li>
</ul>
<li id="spar"><a href="https://supervisedprogramforalignment.org/">Supervised Program for Alignment Research (SPAR) Fall Program <i>(expression of interest)</i></a></li>
</ul>

