11 - Attainable Utility and Power with Alex Turner

Published: Sept. 25, 2021, 9:13 p.m.

Many scary stories about AI involve an AI system deceiving and subjugating humans in order to gain the ability to achieve its goals without us stopping it. This episode's guest, Alex Turner, will tell us about his research analyzing the notions of "attainable utility" and "power" that underlie these stories, so that we can better evaluate how likely these scenarios are and how they might be prevented.
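
To make "attainable utility" concrete: Alex's Attainable Utility Preservation (AUP) method, from "Conservative Agency via Attainable Utility Preservation" (linked below), penalizes an agent for changing how well it could optimize a set of auxiliary reward functions, relative to doing nothing. Here is a minimal Python sketch of that penalty; the function and variable names are illustrative rather than taken from Alex's code, and it assumes you already have a Q-function for each auxiliary reward.

```python
def aup_reward(state, action, primary_reward, aux_q_fns, noop_action, lam=0.1):
    """AUP-style penalized reward, roughly following 'Conservative Agency
    via Attainable Utility Preservation' (arxiv.org/abs/1902.09725).

    aux_q_fns: list of Q-functions, one per auxiliary reward function;
        each maps (state, action) to the attainable utility of that goal.
    noop_action: the "do nothing" action used as the baseline.
    """
    # Average absolute change in attainable auxiliary utility, compared
    # to the no-op baseline: how much this action shifts what the agent
    # could achieve.
    penalty = sum(
        abs(q(state, action) - q(state, noop_action)) for q in aux_q_fns
    ) / len(aux_q_fns)
    # Subtract the scaled penalty from the task reward.
    return primary_reward(state, action) - lam * penalty
```

The weight `lam` trades off task performance against conservatism: the larger it is, the more the agent prefers actions that leave its attainable utilities, and hence its power, roughly unchanged.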


Topics we discuss:

- Side effects minimization

- Attainable Utility Preservation (AUP)

- AUP and alignment

- Power-seeking

- Power-seeking and alignment

- Future work and about Alex


The transcript: axrp.net/episode/2021/09/25/episode-11-attainable-utility-power-alex-turner.html


Alex on the AI Alignment Forum: alignmentforum.org/users/turntrout

Alex's Google Scholar page: scholar.google.com/citations?user=thAHiVcAAAAJ&hl=en&oi=ao


Conservative Agency via Attainable Utility Preservation: arxiv.org/abs/1902.09725

Optimal Policies Tend to Seek Power: arxiv.org/abs/1912.01683


Other works discussed:

- Avoiding Side Effects by Considering Future Tasks: arxiv.org/abs/2010.07877

- The "Reframing Impact" Sequence: alignmentforum.org/s/7CdoznhJaLEKHwvJW

\xa0- The "Risks from Learned Optimization" Sequence: alignmentforum.org/s/7CdoznhJaLEKHwvJW

- Concrete Approval-Directed Agents: ai-alignment.com/concrete-approval-directed-agents-89e247df7f1b

- Seeking Power is Convergently Instrumental in a Broad Class of Environments: alignmentforum.org/s/fSMbebQyR4wheRrvk/p/hzeLSQ9nwDkPc4KNt

- Formalizing Convergent Instrumental Goals: intelligence.org/files/FormalizingConvergentGoals.pdf

- The More Power at Stake, the Stronger Instrumental Convergence Gets for Optimal Policies: alignmentforum.org/posts/Yc5QSSZCQ9qdyxZF6/the-more-power-at-stake-the-stronger-instrumental

- Problem Relaxation as a Tactic: alignmentforum.org/posts/JcpwEKbmNHdwhpq5n/problem-relaxation-as-a-tactic

- How I do Research: lesswrong.com/posts/e3Db4w52hz3NSyYqt/how-i-do-research

- Math that Clicks: Look for Two-way Correspondences: lesswrong.com/posts/Lotih2o2pkR2aeusW/math-that-clicks-look-for-two-way-correspondences

- Testing the Natural Abstraction Hypothesis: alignmentforum.org/posts/cy3BhHrGinZCp3LXE/testing-the-natural-abstraction-hypothesis-project-intro