My friend's family all majored in liberal arts, and are struggling to find jobs after college. I'm trying to understand why anyone would do that.
I minored in sociology. I didn't major in it because I know you can't get a job with it after you graduate. I think if you're set on a liberal arts major, you should at least double major in something else.
Are those people chasing a pipe dream, or do they not care about money? Thoughts?