
Does blind hiring still make sense in 2026?

Discover how blind hiring impacts diversity, equity, and inclusion efforts. This article explores whether anonymized resumes truly reduce bias, and what else needs to change to build a fairer hiring process.
February 8, 2026

    The TL;DR

    Blind hiring isn’t a bias off-switch: even “anonymous” resumes leak identity through language, and algorithms can amplify those patterns unless they’re actively retrained and audited.
    The real bias fix is system design—structured interviews with standardized scorecards and consistent questions beat gut-feel hiring, improving both fairness and prediction quality.
    Most “diversity” problems start before screening: referrals and narrow sourcing channels recycle the same demographics, so expanding and diversifying top-of-funnel pipelines matters more than anonymizing applications.

    In theory, blind recruitment is a no-brainer.

    Remove names, reduce certain biases. Strip away photos, hometowns, and alma maters, and you make the initial screen more consistent. The goal: let qualifications speak louder than identity markers. It’s structured. It’s data-informed. It’s what many believe the future of recruitment should look like.

    And yet, a consistent theme emerged from conversations with practitioners running anonymized hiring experiments, building assessment platforms, and researching screening algorithms:

    Blind hiring is a useful tool. But it’s not enough. In fact, in some contexts, it might be distracting us from the bigger problem.

    1. The first myth is that blind = unbiased

    The appeal of blind recruitment is obvious: reduce visual and identity-based bias in screening, and let real qualifications shine.

    Tools and techniques range from anonymizing names on resumes to hiding voices in interviews to removing institutional pedigrees altogether. It all sounds good in theory, and in some cases, it works.

    But there's a problem.

    Even when resumes are anonymized, language patterns still leak identity. In a landmark study presented by Oleeo’s Jeanette Meister, researchers found they could predict a resume’s gender with 99% accuracy—even when names were removed.

    Why? Because men and women write differently. Men list nouns like “equity,” “capital,” and “finance.” Women use softer language—“organized,” “social,” “volunteer.” These patterns were consistent across finance, tech, retail, and consulting.

    This doesn’t just trip up biased humans. It impacts machines too. Unless you intentionally retrain the models, the algorithm becomes a bias amplifier.

    So yes, anonymization can reduce some types of bias. But it doesn't eliminate them. And if you don't dig deeper into how candidates present themselves, and how your team interprets that, you might just end up swapping one form of bias for another.

    2. The future of recruitment isn’t blind, it’s systemic

    Charlotte Melkert, CEO of Equalture, put it plainly: “Anonymizing resumes might increase your top-of-funnel diversity, but without structured interviews, bias creeps right back in.”

    If all we do is interrupt bias at the resume stage, without addressing how we evaluate, decide, and promote, then we're not fixing the system. We're just delaying the moment bias kicks in.

    One blind hiring experiment eliminated candidate identity until the very final offer decision. No video, no names, no voice. Just assessments, tasks, and outcomes. As Mykhailo Kats explained, “This isn’t just about hiding identities. It’s about testing whether bias—positive or negative—is baked into every hiring decision we make.”

    Their goal? Not to hit diversity targets. Not to game the system. Just to test one idea: What if we actually hired the best person for the job, regardless of who they were?

    It’s a powerful question, and one that, ironically, might scare companies more than it excites them.

    3. Why many companies resist full-scale blind hiring

    A common pattern emerged across industry discussions. Organizations are quick to say they want fairness, but slow to give up control.

    Why?

    Because full anonymity challenges the foundation of most hiring processes. It removes gut instinct. It forces teams to define what “good” looks like in advance.

    It asks leaders to trust a system they don’t understand and give up levers they’ve always relied on: where someone went to school, how they present themselves in interviews, even whether they “feel like an environment fit.”

    In some cases, companies fear the optics. What if we go fully blind and don’t end up hiring any women? Or any people of color? Or anyone “diverse”? What if it makes us look bad?

    But diversity hiring shouldn’t be about optics. It should be about creating a system that reduces any bias, whether it favors or penalizes someone, through structure and consistency.

    That’s a tough pill to swallow if your company’s DEI strategy is built around quotas and KPIs, not systems design.

    4. The unsexy secret to fairer hiring

    Everyone we've spoken with—no matter how bullish or skeptical they were on anonymization—championed structured interview scorecards. This is where tools like AI-assisted screening platforms can help: by surfacing consistent information (interview summaries, assessment results, match scores) that maps to your scorecard criteria, so every candidate gets evaluated on the same dimensions.

    Sample interview scorecard

    Criteria           | Score (1–5) | Notes
    -------------------|-------------|--------------------------------------------------
    Technical skills   | __          | e.g. relevant experience, proficiency
    Communication      | __          | e.g. clarity, listening, responsiveness
    Environment fit    | __          | e.g. alignment with team values and work preferences
    Problem-solving    | __          | e.g. logic, creativity, examples
    Overall impression | __          | Would you move this candidate forward?

    Ask every candidate the same questions. Score them the same way. Avoid improvisation, gut feeling, and "chemistry checks" that tend to favor the familiar.

    The goal isn't to remove personality from the process. It's to remove the variability that invites bias.

    This doesn’t have to be robotic. You can still build rapport. You can still make space for questions. But when you anchor every decision to a consistent set of criteria, you reduce noise. You give underrepresented candidates a real shot at being evaluated fairly. You take “likeability” out of the driver's seat.

    And here's the kicker: structured interviews don't just help with diversity. They also provide more consistent, comparable data across candidates than unstructured ones.

    The problem? They require upfront work. Hiring managers have to agree on what matters. They have to score answers independently. And they have to give up their favorite question—“So, walk me through your background.”
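    To make the idea concrete, here is a minimal sketch of how a structured scorecard might be represented in code: fixed criteria, per-interviewer scores validated against the same 1–5 scale, and averages compared on identical dimensions. Everything here (the `Scorecard` class, the criteria list, the sample scores) is hypothetical and for illustration only, not taken from any tool mentioned in this article.

```python
from dataclasses import dataclass, field
from statistics import mean

# Illustrative sketch only: the same fixed criteria for every candidate,
# scored independently by each interviewer on the same 1-5 scale.
CRITERIA = [
    "Technical skills",
    "Communication",
    "Environment fit",
    "Problem-solving",
]

@dataclass
class Scorecard:
    candidate: str
    interviewer: str
    scores: dict          # criterion -> score from 1 to 5
    notes: dict = field(default_factory=dict)

    def __post_init__(self):
        # Enforce the structure: no skipped criteria, no out-of-range scores.
        missing = [c for c in CRITERIA if c not in self.scores]
        if missing:
            raise ValueError(f"unscored criteria: {missing}")
        for criterion, score in self.scores.items():
            if not 1 <= score <= 5:
                raise ValueError(f"{criterion}: score {score} outside 1-5")

def candidate_average(cards):
    """Average each criterion across interviewers for one candidate."""
    return {c: mean(card.scores[c] for card in cards) for c in CRITERIA}

# Two interviewers score the same (hypothetical) candidate independently.
a = Scorecard("Candidate A", "Interviewer 1",
              {"Technical skills": 4, "Communication": 3,
               "Environment fit": 4, "Problem-solving": 5})
b = Scorecard("Candidate A", "Interviewer 2",
              {"Technical skills": 5, "Communication": 4,
               "Environment fit": 3, "Problem-solving": 4})
avg = candidate_average([a, b])
```

    The point of the sketch is the constraint, not the arithmetic: because every interviewer must fill in every criterion on the same scale, candidates become comparable on identical dimensions, and "gut feel" has nowhere to hide.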

    5. Referrals and sourcing are where the real bias begins

    Most companies treat referrals as sacred. If a trusted employee vouches for someone, that candidate often skips the line.

    The problem? That’s bias in action. People refer people like them. So if your team isn’t diverse, your referrals won’t be either.

    The same goes for sourcing. Where you look matters. If you're only sourcing on LinkedIn, you're limiting your pool to a specific demographic—one that may not reflect the diversity you're trying to build.

    Alternative channels, like niche job boards, community organizations, or grassroots outreach, require more work. But they also surface talent you’d otherwise miss.

    Blind hiring can help once someone applies. But if your sourcing pool is homogenous, no amount of anonymity will solve your diversity problem.

    6. The uncomfortable truth about blind hiring and identity

    Anonymization is supposed to help candidates. But sometimes, it does the opposite.

    Isn’t it unfair to ask candidates to erase their identity just to be seen fairly? It’s a fair critique.

    Removing names and backgrounds might protect people from bias, but it also strips away the very traits that make them unique. Their culture, their language, their lived experience, all silenced to protect them from discrimination.

    That’s not equity. That’s damage control.

    The real goal, several people we spoke with argued, isn’t to build processes that hide identity. It’s to build systems strong enough not to be swayed by it. Anonymization is just the first mile of a much longer journey.

    So what does the future of recruitment actually look like?

    The future of recruitment is not just blind. It’s not just structured. It’s not just tech-driven.

    It’s all of the above, and more:

    • A future where anonymization is used as a tool, not a crutch
    • Where structured interviews replace gut instinct
    • Where every step of the process, from sourcing to onboarding, is designed to reduce bias
    • Where screening tools use AI to surface information, not make decisions—and where assessments measure what AI can't fake, like personality tendencies, situational judgment, and work environment preferences
    • And where identity can be celebrated, not hidden

    As DEI fatigue grows and buzzwords wear thin, companies have a choice. Keep doing the minimum, or rebuild the system to reflect the future of work.

    Blind hiring won’t save us. But it might just be the wake-up call we need.

    Author: Sean Griffith
    Sean began his career in leadership at Best Buy Canada before scaling SimpleTexting from $1MM to $40MM ARR. As COO at Sinch, he led 750+ people and $300MM ARR. A marathoner and sun-chaser, he thrives on big challenges.
    You posted a role and got 426 applicants. Now what — read all of their resumes and phone screen 15 of them?

    Try Truffle instead.
    Start free trial