Alignment of Intentions

What dynamics are really driving the rate of AI innovation?

Humans are hard-wired for status games, and successful networks often create scarcity so that people can signal status.

Alignment of intent and goodwill is the hardest part of any project.

The problem with PhDs is that they have big reputations to defend.

Ego Status Games

Danger: not your model, not your mind.

Consensus

If you cannot reach consensus on what success looks like, how can you possibly hope to achieve it?

  • What does the perfect week of human and AI cohesion look like?
  • What do people do for purpose and meaning?
  • How is that sense of purpose qualified?
  • How is that sense of purpose quantified?
  • How is success shared?
  • How fairly is success distributed?

Who exactly are the people imagining our future? Why should we trust them?