Dominionism

Concept

The belief that humans are called to exercise dominion over the earth, derived from the cultural mandate in Genesis 1:28 and the Great Commission.