Do you have to take economics in college?

Whether you have to take economics depends on your college major and its degree requirements. Economics is a common requirement for business, finance, and many social science majors, but it is not universally required of all college students. Check with your institution to see whether economics is required for your degree.