A new report shows that, even after businesses announce they're dropping college degree requirements, most still hire college graduates anyway.
The article's take is that organizations either tried skills-based hiring and reverted to degree-required hiring because it was warranted, or they never actually adapted their process in spite of executive vision.
Since the article isn't industry-specific, what are your observations or opinions about the technology sector? What about the general business sector?
Should employees in first-world countries be required to obtain degrees if they reasonably expect a business-related job?
Do college experiences and academic rigor reveal higher-achieving employees?
Is undergraduate education a minimum standard for a more enlightened society? Or a way to maintain separation between classes of people and status?
Is a master's degree the new way to differentiate yourself, the way the undergrad degree was before?
Edit: multiple typos, I guess that's proof that I should have done more college 😄
Using them incorrectly would be incorrect. Without an example, it’s hard to tell.
But, pretty much everyone was doing the web “wrong” back in the day. Server-side HTML generation? Gag me. Or worse, inserting PHP into HTML?! Shudder. But that’s how it was for many backend languages.
IMO, nowadays, if it’s not a reactive JS front end using the backend as an API, it’s doing it wrong. But I’m sure in 10 years we will all be laughing at how seriously we were taking JavaScript.
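Roughly what I mean, as a minimal sketch: the browser owns the rendering and the server just answers JSON. The /api/todos endpoint and the Todo shape here are made up for illustration, not any particular framework.

```typescript
// Hypothetical data shape returned by a hypothetical /api/todos endpoint.
interface Todo {
  id: number;
  title: string;
  done: boolean;
}

async function renderTodos(): Promise<void> {
  // The backend is just an API: fetch JSON, never server-rendered HTML.
  const res = await fetch("/api/todos");
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  const todos: Todo[] = await res.json();

  const list = document.getElementById("todo-list");
  if (!list) return;
  // Re-render the list from data, entirely client-side.
  // (Real code would escape t.title before putting it in innerHTML.)
  list.innerHTML = todos
    .map((t) => `<li>${t.done ? "✓" : "✗"} ${t.title}</li>`)
    .join("");
}

renderTodos().catch(console.error);
```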
It makes me shudder to think how the modern web is just treating browsers as JavaScript application environments. Converting a little backend load into a massive frontend headache is the exact opposite of where we thought we were headed twenty years ago.
Well, it’s not a massive front end headache if you do it right. And, by passing off a lot of the easy stuff to the browser, your server can handle more load. As a bonus, it’s easier to decouple your architecture. Not only is this more efficient, but it’s easier to maintain, test, and deploy.
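For what the decoupled side of that can look like, a bare-bones sketch using only Node's built-in http module (no framework assumed): the server speaks nothing but JSON, which is part of why it's easier to test and deploy on its own.

```typescript
// Minimal decoupled backend sketch: serves JSON only, leaves all
// rendering to the browser. Data is hardcoded for illustration.
import { createServer } from "node:http";

const todos = [
  { id: 1, title: "Ship the API", done: false },
  { id: 2, title: "Let the browser render", done: true },
];

createServer((req, res) => {
  if (req.method === "GET" && req.url === "/api/todos") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(todos));
  } else {
    res.writeHead(404, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ error: "not found" }));
  }
}).listen(3000);
```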
It's sacrificing efficiency on the frontend for the backend. It makes the backend easier to test, while making the frontend more complex. It significantly jacks up requirements for the clients while reducing them for the host.
You backend people are forgetting that there are devices on the other end that need to process and render this bullshit. It sells more new iPhones, though, so who the fuck cares?
I’m equally proficient on the front end. I don’t have any problem making front end code that doesn’t require the latest and greatest processor.
Inefficient JavaScript and abusive CSS animation are the cause of all that. Preventing event flooding is crucial and often overlooked. And ffs, not everything has to be animated. If the fan kicks on, that developer is a moron.
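The event-flooding point in concrete terms: a debounce wrapper (sketch below, nothing framework-specific) collapses a burst of scroll/resize/keystroke events into one handler call after the burst settles.

```typescript
// Debounce: delay fn until waitMs has passed with no new calls,
// so a flood of events triggers the expensive work only once.
function debounce<T extends unknown[]>(
  fn: (...args: T) => void,
  waitMs: number
): (...args: T) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: T) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  };
}

// Without this, scroll can fire many times per frame of input;
// with it, the handler runs once per pause.
window.addEventListener(
  "scroll",
  debounce(() => {
    // ...expensive layout or fetch work here...
  }, 150)
);
```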
My point is that the JavaScript is inherently inefficient.
The possibility that you might suck less than someone else doesn't fix that fact, or the fact that the modern web can bring a ten-year-old tablet to its knees.
JavaScript doesn’t run on a Commodore 64 either, but that doesn’t mean we shouldn’t use it.
I’ll still argue that an efficient web app is a significantly better experience than waiting for full page loads, even on a 10-year-old tablet.
And to support that, I do most of my mobile testing on my old iPhone 6—which is, coincidentally, 10 years old. I don’t have trouble with JavaScript on that.
I think what it comes down to is that there are a lot of unskilled developers out there who misuse JavaScript… and PHP.