What Machines Know: Can Algorithms Predict A Career Path?

December 2nd, 2013 by David Rothschild

Algorithms are quickly shaping and defining our world.

In a TED Talk from 2011, Kevin Slavin points out the algorithms that already affect our daily lives in “How Algorithms Shape Our World.” Most of us are at least partially aware of how algorithms are used in the stock market–buying and selling at astronomically fast rates–but may not be fully aware of how heavily they are used in our culture and day-to-day activities.

The Language of Machines

Sensors and software have begun to track how we work, move, play and shop. Machines are being taught to follow our every move, using algorithms to discover the best ways to sell, advertise and operate. Cleaning bots in our homes compute the most efficient ways to sweep a room; our web history is tracked and mined for our interests; and everything from elevators to movie recommendation sites is being programmed to stay one step ahead of us in our culture’s capitalist quest for ever-growing convenience, speed and efficiency.

This doesn’t stop with how we purchase–it is reaching into the very heart of how we are recruited, hired and promoted. In “They’re Watching You at Work,” The Atlantic’s Don Peck writes:

Until quite recently, however, few people seemed to believe this data-driven approach might apply broadly to the labor market. But it now does. According to John Hausknecht, a professor at Cornell’s school of industrial and labor relations, in recent years the economy has witnessed a ‘huge surge in demand for workforce-analytics roles.’

From Ivy League to Social Media Analytics

Pedigree has long counted for something: an Ivy League graduate with high marks and an impressive resume is recruited left and right as a sought-after candidate for high-level jobs. But what if candidates who are better suited for those jobs are falling through the cracks? Companies are beginning to use algorithmic programs and tests that gauge the productivity, creativity and professional promise of individuals based on everything from social media usage to how they play specially designed gaming apps.

Knack is a company doing just that. Its gaming apps, such as Wasabi Waiter, have been shown in testing to predict a candidate’s competency after just 20 minutes of play-time. The Atlantic notes:

How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality.
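The kind of telemetry described above can be pictured with a small sketch. The class below is purely illustrative (Knack’s actual system is not public): it logs timestamped gameplay actions and derives simple features such as hesitation gaps and mistake counts, the raw material a scoring model might consume. The class name, event fields and feature names are all assumptions for the example.

```python
from statistics import mean


class GameplayLogger:
    """Hypothetical sketch: record every in-game action with a
    timestamp so traits like hesitation and error recovery can
    be estimated afterwards."""

    def __init__(self):
        self.events = []  # list of (timestamp_seconds, action, correct)

    def log(self, timestamp, action, correct=True):
        self.events.append((timestamp, action, correct))

    def hesitation_gaps(self):
        """Gaps between consecutive actions - a rough proxy for hesitation."""
        times = [t for t, _, _ in self.events]
        return [b - a for a, b in zip(times, times[1:])]

    def features(self):
        """Summarize the session into a small feature dictionary."""
        gaps = self.hesitation_gaps()
        mistakes = sum(1 for _, _, ok in self.events if not ok)
        return {
            "actions": len(self.events),
            "mean_hesitation": mean(gaps) if gaps else 0.0,
            "mistakes": mistakes,
        }


# A short simulated session of a waiter-style game:
log = GameplayLogger()
log.log(0.0, "serve_dish")
log.log(1.5, "read_expression")
log.log(2.0, "serve_dish", correct=False)  # a mistake, served quickly
log.log(4.0, "serve_dish")                 # a longer pause before recovering
print(log.features())
```

A real system would feed many such per-session features into a statistical model trained against known outcomes; this sketch only shows the logging-and-featurization step.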

Reason for Concern

It’s easy to worry about machines intruding into our lives and judging our potential. This concern, however, overlooks the flaws of our current system: study after study has shown that we judge candidates with (often unconscious) bias and produce results rife with human error.

Gender, race, appearance and even personality are subject to our partiality and personal preference. Numerous studies show that our society is still far from unbiased hiring practices. From the Atlantic:

Tall men get hired and promoted more frequently than short men, and make more money. Beautiful women get preferential treatment, too—unless their breasts are too large. According to a national survey by the Employment Law Alliance a few years ago, most American workers don’t believe attractive people in their firms are hired or promoted more frequently than unattractive people, but the evidence shows that they are, overwhelmingly so.

Hiring the Underdog

Our inability to remain completely objective should make us open to letting machines and their formulas help predict which candidates will succeed. Undervalued candidates who are better suited to the jobs we need to fill will be found. This, of course, raises the question: will programmers and algorithm writers be able to keep bias out of their formulas? And in what ways will candidates try to beat the system?
