Jul 29, 2020

An important trend began in early 2018 that has been changing the foundation of higher education. More than 15 of the biggest names in business publicly announced that they would no longer require employees to have a college degree.

Google, Apple, and IBM were just a few, and more employers seem to jump on this bandwagon every day. Companies are dropping the college-degree prerequisite because they increasingly realize that time spent in academic courses doesn’t necessarily translate to a strong work ethic, competency, or talent in the workplace. This is especially the case in IT and other technology roles.

The New-Collar Workforce

New-collar employees, individuals who develop the skills needed to work in tech through nontraditional paths, are not your typical demographic. Caught in today’s massive unemployment situation, many new-collar workers feel a high sense of urgency to find work. And because time is money, they focus on earning credentials as quickly as possible to qualify for high-demand jobs, rather than enrolling in years-long degree programs.

Meanwhile, employers are realizing that college degrees may not map neatly to the skills they seek in candidates. For example, our internal research identified at least six job fields projected to experience high growth. They pay at least $65,000 a year, and 40% of the people currently in those roles do not have a four-year degree. Clearly, there is no need for an artificial barrier limiting tech jobs to degree-holders when so many people already doing those jobs lack a college degree.

Still, just because these jobs may not require a degree does not mean they are easy. Indeed, the training it takes to land them can’t be found on YouTube or in low-to-no-cost MOOCs (Massive Open Online Courses). Moreover, having the right skills isn’t enough on its own. Employers want to verify competencies, often via certification from an authorized training provider.

Certifications vs. Certificates

Put another way, certifications have become the new college degree in technology.

Notably, certifications are not the same as certificates. Certificates tend to be a subset of academic courses drawn from degree programs, which means they may not be true markers of full professional competency. To many professionals, a certificate is simply a piece of paper.

Certifications, on the other hand, go beyond theoretical knowledge and the ability to pass a test. They confirm evidence-based application of skills combined with practical experience. IT certifications give hiring managers confidence that they are evaluating candidates on specialized knowledge, such as vendor-specific skills rather than general technology skills.

As such, employers should ensure that a certification comes from an authorized provider, such as a Microsoft Teams certification backed by Microsoft itself. In other words, a digital badge from Dave’s Training Barn in Indiana may look nice, but it may not be worth much.

Training for Certifications

Meanwhile, training for certifications will continue to follow employer demand for qualified applicants and their adoption of new technologies. Industry-specific IT certifications, for instance, are increasingly important. Take health care: as organizations implement electronic health records systems, candidates must understand more than databases like MS SQL. They also need to understand industry-specific regulations pertaining to HIPAA, PII, and other rules.

For many hiring leaders, one of the most valuable lessons of the last decade, and for the decade ahead, is that the talent pipeline can reach beyond traditional college sources. The pace of technology adoption continues to accelerate with no end in sight. And as the nature of work continues to evolve, so too will the requirements we seek in candidates.
