Over the past century, our world and human society have gone through great changes.[1] These changes have dramatically increased humanity’s power over its environment. They have benefited most of us with blessings such as a higher standard of living and longer lifespans. But they have also brought new risks. Among other things, the welfare of future humans and non-human animals is threatened by climate change and the risk that powerful new technologies will be used for destructive purposes.
The happiness of future generations is not worth less than the happiness of contemporary sentient beings. Therefore, long-term planning and responsibility are natural starting points for the GHO. In fact, we are the only organization in the world that explicitly works to increase the happiness of future generations. Thus the primary goal of the **Future Happiness** Unit is to ensure that long-term perspectives are not lost in the pursuit of short-term profit.
Many researchers believe that some very powerful technologies will be developed during the coming decades, such as advanced biotechnology,[2] nanotechnology[3] and artificial intelligence.[4] These bring unmatched opportunities for increasing global happiness. Biotechnology can be used to improve human health and welfare as well as to enhance our emotional capacities.[5] Nanotechnology can help us solve environmental problems, cure diseases and reduce poverty. Through in vitro meat, even the pain-driven “livestock industry” could be replaced.[6] The Future Happiness Unit tries to draw attention to, investigate, and encourage the use of these opportunities.
In addition to great promise, future technologies unfortunately bring great risks. The worst of these risks threaten the very existence of mankind.[7] Some well-known experts on the subject estimate that the risk of outright human extinction is significant, perhaps even as high as 25–50%.[8] A catastrophe of such magnitude would not only kill all living humans; it would also ensure that countless generations of future humans are never born. That would amount to an astronomical loss of future lives.[9]
The Future Happiness Unit seeks to draw attention to these risks so that humanity may reduce them. We continually gather and process information about the potential benefits and risks posed by emerging technologies, drawing on the latest research from leading experts in the field. We then focus on spreading this information to the public, politicians and policymakers.
Manager: Jesper Östman
- 1. Kurzweil, Ray, The Singularity Is Near, Viking (2005). (http://en.wikipedia.org/wiki/The_Singularity_Is_Near)
- 2. Bostrom, Nick and Savulescu, Julian (eds.), Human Enhancement, Oxford University Press (2009). (http://ukcatalogue.oup.com/product/9780199299720.do)
     Bostrom, Nick and Sandberg, Anders, "Converging Cognitive Enhancements", Annals of the New York Academy of Sciences, Vol. 1093 (2006), pp. 201-207.
- 3. Productive Nanosystems: A Technology Roadmap, Foresight Institute (2007). (http://www.foresight.org/roadmaps/Nanotech_Roadmap_2007_main.pdf)
- 4. Bostrom, Nick and Sandberg, Anders, "Whole Brain Emulation: A Roadmap", Future of Humanity Institute (2008). (http://www.philosophy.ox.ac.uk/__data/assets/pdf_file/0019/3853/brain-e…)
     Kurzweil, Ray, The Singularity Is Near, Viking (2005). (http://en.wikipedia.org/wiki/The_Singularity_Is_Near)
- 5. Walker, Mark, "In Praise of Bio-Happiness", IEET (2006). (http://ieet.org/archive/IEET-02-BioHappiness.pdf)
- 6. Jones, Nicola, "A taste of things to come?", Nature, Vol. 468, pp. 752-753 (2010). (http://www.nature.com/news/2010/101208/full/468752a.html)
- 7. Bostrom, Nick, "Existential Risks", Journal of Evolution and Technology, Vol. 9, No. 1 (2002). (http://www.jetpress.org/volume9/risks.html)
- 8. Martin Rees, in Our Final Hour, Basic Books (2004) (http://en.wikipedia.org/wiki/Our_Final_Hour), estimates the risk at 50% for the next century; John Leslie, in The End of the World, Routledge (1996) (http://geoff.mcnicoll.net/wp-content/uploads/2009/11/R-1997a-Review-of-…), estimates it at 30% over the next five hundred years; Nick Bostrom (2002) (http://www.jetpress.org/volume9/risks.html) estimates a total risk of at least 25%.
- 9. Parfit, Derek, Reasons and Persons, pp. 453-454, Oxford University Press (1984). (http://en.wikipedia.org/wiki/Reasons_and_Persons)
     Bostrom, Nick, "Astronomical Waste", Utilitas, Vol. 15, No. 3 (2003), pp. 308-314. (http://www.nickbostrom.com/astronomical/waste.html)