Archive for December, 2013

Are Your Hiring Managers Biased?

December 13th, 2013

When you are in a position to assess people for employment at your organization, you may think that you have an open mind as you consider each applicant. However, it can be easy to hold biases that you are unaware of, according to a recent post by Lou Adler at Business Insider. By keeping possible bias in mind while conducting interviews, you will stand a better chance of finding the best people for the positions you seek to fill.

For example, you may be guilty of anchoring, which happens when you attribute too much value to the initial information you receive during an interview and then come to a conclusion before getting all the information you need.

Adler recommends that hiring managers strive to delay making any yes-or-no decisions for about 45 minutes, ensuring they will give as much weight to details they learn at the end of the interview as they do at the beginning.

Confirmation bias is another problem that hiring managers face: they look for evidence to confirm their initial impression of a person and then fail to see any information that conflicts with it.

A hiring manager might make an effort to find “proof” that an applicant they don’t like is simply incompetent, while ignoring facts that demonstrate competence. A good approach here is to pause for a moment during the interview and seek out details that counter the first impression.

Time pressure can also create bias in the form of a perceived need for closure. When hiring managers feel rushed to reach a conclusion, they do their company and the applicant a disservice. Instead of worrying about how long the interview is taking, make a point of asking questions until you have all the facts you need to make the best possible decision.

Another problem with bias has to do with the concept of sunk costs. As hiring managers spend more time making a decision about applicants, they will feel the weight of how much time they’ve already invested doing interviews.

The result is a tired manager who will just settle for the next applicant who seems right for the job. To avoid this problem, remind yourself just how important it is to keep interviewing candidates and giving them all your full consideration. The future success of your company may very well depend on the decisions you make. To keep yourself objective, exercise your curiosity to discover the special skills and knowledge that each applicant brings to the table.

Ignorance of personal bias can lead to an increasing number of bad hires at your organization, as well as missed opportunities to bring in highly qualified applicants. By checking yourself for bias, you’ll have a leg up over other organizations whose hiring managers are less aware of their own bias.

What Machines Know: Can Algorithms Predict A Career Path?

December 2nd, 2013

Algorithms are quickly shaping and defining our world.

In his 2011 TED Talk, “How Algorithms Shape Our World,” Kevin Slavin points out the algorithms that already affect our daily lives. Most people are at least partially aware of how algorithms are used in the stock market, buying and selling at astonishing speed, but may not be fully aware of how heavily they are used in our culture and day-to-day activities.

The Language of Machines

Physics and programming have begun to track how we work, move, play and shop. Machines are being taught to follow our every move, using algorithms to discover the best ways to sell, advertise and operate. Cleaning robots in our homes calculate the most efficient way to sweep a room, web history is tracked and mined for our interests, and everything from elevators to movie-recommendation sites is being programmed to stay one step ahead of us in our culture’s capitalist quest for ever-growing convenience, speed and efficiency.

This doesn’t stop with how we purchase; it is heading into the very heart of how we are recruited, hired and promoted. In the article “They’re Watching You at Work,” The Atlantic writer Don Peck writes:

Until quite recently, however, few people seemed to believe this data-driven approach might apply broadly to the labor market. But it now does. According to John Hausknecht, a professor at Cornell’s school of industrial and labor relations, in recent years the economy has witnessed a ‘huge surge in demand for workforce-analytics roles.’

From Ivy League to Social Media Analytics

Pedigree has long meant something. When an Ivy League graduate with high marks and an impressive resume seeks a job, companies line up to recruit such a sought-after candidate for high-level positions. But what if candidates who are better suited for the job are falling through the cracks? Companies are beginning to look at algorithmic programs and tests that can gauge the productivity, creativity and professional promise of individuals based on everything from social media usage to how they play specially designed gaming apps.

Knack is a company that is doing just that. Its gaming apps, such as Wasabi Waiter, have been shown in testing to produce an accurate read on a player’s competencies after just 20 minutes of play. The Atlantic notes:

How long you hesitate before taking every action, the sequence of actions you take, how you solve problems—all of these factors and many more are logged as you play, and then are used to analyze your creativity, your persistence, your capacity to learn quickly from mistakes, your ability to prioritize, and even your social intelligence and personality.
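To make the passage above concrete, here is a minimal, purely hypothetical Python sketch of how gameplay telemetry might be reduced to simple behavioral features; the event names, features, and structure are illustrative assumptions, not a description of Knack’s actual system.

    # Hypothetical sketch: turning logged gameplay events into behavioral features.
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class GameEvent:
        timestamp: float  # seconds since the session began
        action: str       # hypothetical action name, e.g. "serve_dish"

    def behavioral_features(events):
        # Gaps between consecutive actions stand in for "hesitation."
        gaps = [b.timestamp - a.timestamp for a, b in zip(events, events[1:])]
        return {
            "actions_taken": len(events),
            "avg_hesitation_sec": mean(gaps) if gaps else 0.0,
            "distinct_actions": len({e.action for e in events}),
        }

    session = [GameEvent(0.0, "serve_dish"),
               GameEvent(1.8, "clear_table"),
               GameEvent(2.4, "serve_dish")]
    print(behavioral_features(session))

Features like these would then presumably feed a scoring model; the modeling step is where claims about creativity or social intelligence would come in, and it is not shown here.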

Reason for Concern

It’s easy to worry about machines intruding into our lives and judging our potential. This concern, however, overlooks the shortcomings of our current system: it has been shown over and over that we judge candidates with (often unknowing) bias and produce results rife with human error.

Gender, race, appearance and even personality are subject to our partiality and personal preference. Numerous studies show that our hiring practices are still not as unbiased as we expect them to be. From The Atlantic:

Tall men get hired and promoted more frequently than short men, and make more money. Beautiful women get preferential treatment, too—unless their breasts are too large. According to a national survey by the Employment Law Alliance a few years ago, most American workers don’t believe attractive people in their firms are hired or promoted more frequently than unattractive people, but the evidence shows that they are, overwhelmingly so.

Hiring the Underdog

The inability of humans to remain completely objective forces us to be open to the idea of machines and their formulas helping to predict candidates’ hiring potential. Undervalued candidates who are better suited for the jobs we are looking to fill may finally be found. This, of course, raises the question: Will programmers and algorithm writers be able to keep bias out of their formulas? And in what ways will candidates try to beat the system?