Southern United States
English
Proper noun

[Map caption: The Southern United States as defined by the United States Census Bureau.]
- An expansive region encompassing the southeastern and south-central parts of the United States, typically defined as including the states of Texas, Oklahoma, Louisiana, Arkansas, Mississippi, Alabama, Tennessee, Kentucky, Florida, Georgia, North Carolina, South Carolina, West Virginia, and Virginia.
Usage notes
The term Southern United States is defined more by shared culture and history than by strict geography. Although located in the extreme south of the United States, southern California, New Mexico, and Arizona are not considered part of it. In contrast, Virginia and West Virginia, though located in the middle of the East Coast, are considered part of it.
Synonyms
- American South, Dixie (informal), the South