Alrighty, so I'm driving myself crazy trying to figure out the right path to take in life. I've always been told that going to college and getting a degree will guarantee you a good-paying job. Now that I'm a sophomore in college, I'm finding out that a degree doesn't necessarily guarantee anything but a pile of debt. Obviously, there are some degrees that lead to great careers, such as medicine, science, or business, but I've never excelled in those courses, so I don't plan on pursuing a degree in those fields.

Right now, I'm thinking about majoring in English and minoring in journalism/communications. My ideal job would be working as some sort of magazine editor. My fear is that once I get out of college, I'll be nothing more than a poor, unemployed college grad. So that's my first option. My second option is to go back to a community college and work toward an associate's degree in either X-ray technology or dental hygiene. I know health care will always be in demand, so what do you think? Get the college experience and a degree, or work in health care?
--------------------
X-ray techs are NOT in demand... schools are putting out too many graduates.
Source
What is your opinion?
Posted by Bobby vaizZ at 6:15 AM, Thursday, March 10, 2011