Takeaways:
- Framework for assessing long-term impact
- Significance
- Persistence
- Contingency
- Changes in moral values have a high impact on people's lives over the long term
	- We should focus on more general moral principles
- Value lock-in can happen
	- Possible events that could lock in values: artificial general intelligence, world government, space settlement
- We should:
		- Leave options open (delay lock-in, support conservation)
- Experiment and increase cultural diversity
		- Structure society to guide us toward better moral values (certain forms of free speech, fairly free migration)
- Risks of the future
- Extinction
		- The Spaceguard program has found 98% of extinction-threatening asteroids
		- Top 2 risks: AI and engineered pathogens
- Civilization collapse
		- Humans have historically been resilient in recovering from collapse, but all-out nuclear war would be on a different level
		- Runaway climate change is unlikely, but fossil fuel depletion would make recovery very difficult
- Stagnation
- Increases the risks of extinction and collapse
		- Technological advancement will get harder since we have already picked the lower-hanging fruit
		- AI could be the remedy
		- Historically we relied on an ever-increasing number of people working on research. There is a limit to this, and declining birth rates will make it harder
- Population ethics
- Having one extra person in the world is good in and of itself, if that person is overall happy
	- Total view: one population is better than another if it contains more total wellbeing (rather than higher average wellbeing)
	- We should also consider animals and non-wellbeing goods (e.g. art, ecosystems themselves)
- What to do
	- Donation
	- Political activism
	- Spreading good ideas
	- Having children
	- Working on some of the major problems of our time
More notes: