Longtermism identifies the mitigation of existential risks—events that could severely undermine human potential or lead to human extinction—as a critical priority. These risks include advanced artificial intelligence, biotechnology, nuclear warfare, and climate change.

On the longtermist view, this mitigation is a moral imperative: because the future could contain vastly more people than are alive today, actions taken in the present that affect existential risk carry outsized moral weight.