Thursday, April 19, 2012

The College Majors That Make You Richest: Report

From an early age, we begin thinking about the career we want to pursue. Some choose nursing, some choose teaching. Whatever the field, one of the factors we consider is how much money we'll make after graduating from college.


The Huffington Post takes a look at which college majors earn the most money after graduation. See if your major makes the list!


To read that article, click here.
