Alignment Research Center: Theory Project

GiveWiki · Accepting donations
Total raised: $5,016,000 (5 donations)
Support score: 217
Tags: Organization · Eliciting latent knowledge (ELK) · AI safety

The Theory team is led by Paul Christiano and furthers the research initially laid out in our report on Eliciting Latent Knowledge. At a high level, we’re trying to figure out how to train ML systems to answer questions by straightforwardly “translating” their beliefs into natural language, rather than by reasoning about what a human wants to hear. We expect to use funds donated via this fundraiser to support the Theory project.

[This project page was created by the GiveWiki team. Please visit the organization’s website for more information on their work.]
