A new report shows that, even after businesses announce dropping college degree requirements for jobs, most hire college graduates anyway.
This article outlines the opinion that organizations either tried skills-based hiring and reverted to degree-required hiring because it was warranted, or they never adapted their process despite executive vision.
Since the article isn't industry-specific, what are your observations or opinions about the technology sector? What about the general business sector?
Should employees in first-world countries be required to obtain degrees if they reasonably expect a business-related job?
Do college experiences and academic rigor reveal higher-achieving employees?
Is undergraduate education a minimum standard for a more enlightened society, or a way to maintain separation between classes of people and their status?
Is a master's degree the new way to differentiate yourself, the way an undergrad degree was before?
Edit: fixed multiple typos. I guess that's proof that I should have done more college 😄
For a lot of jobs that want bachelor's degrees, people with lots of experience will do. But for jobs requiring master's degrees and doctorates, it's a different story.