Q2 2026 Report

The AI interview myth: what hundreds of candidate reviews actually reveal

We read hundreds of voluntary candidate reviews submitted after a Truffle video interview last year. 93% were positive. The 'candidates hate AI interviews' story doesn't survive contact with the data.

93% positive · 11 months of data · Hundreds of reviews · Every major role category
Chapter 1

The myth

You've seen the take. AI video interviews are dehumanizing. Candidates hate them. The whole format is something companies are forcing on people who have no choice.

It's a great LinkedIn post. It performs well. It feels right.

It's also, mostly, wrong.

Across hundreds of voluntary candidate reviews from the last 11 months, 93% rated their video interview experience positively. Across customer service, sales, marketing, content, operations, technical screens, and senior leadership. Across thousands of completed interviews where the only candidates who saw the survey were the ones who finished it.

The single-digit percentage who left negative reviews almost never named the format as the problem. They named UX bugs (which we fix as they surface), retake limits, and their own preparation. We'll show you exactly what they said.

Chapter 2

What we did

After every interview Truffle hosts, candidates see an optional one-question survey. Thumbs up or thumbs down. One free-text comment. Whatever they write gets piped into an internal Slack channel that our team uses to monitor product health.

We pulled every entry from that channel between April 2025 and March 2026. Every candidate who chose to submit a review. Every comment, unedited. We also pulled the surrounding context: the type of role each candidate was applying for, the month, and the broad industry of the hiring company.

This is candidate self-reported feedback. Not interviewer scores. Not aggregated NPS. Real comments, written immediately after the candidate finished the interview, before they knew whether they got the job.

Chapter 3

The numbers

93%
positive sentiment across hundreds of voluntary reviews

The negative reviews are a single-digit percentage of the total.

The reviews come from candidates applying across the full range of roles Truffle's customers hire for: customer service and support, sales and business development, marketing and content, technical and engineering screens, administrative and operations roles, hospitality, and senior leadership.

Across that range, sentiment is consistent. Candidates applying for hourly customer service roles report positive experiences. Candidates applying for senior content roles report positive experiences. Candidates applying for technical screens report positive experiences.

The negatives are the interesting data.

Chapter 4

What candidates actually say

We expected the positive reviews to be polite filler. Most of them are short. "Good." "Great." "It was good." Nothing to learn from those.

The longer ones are revealing.

"Very thorough interview and questions. It actually challenges me to think more..."
Marketing applicant · October 2025
"The retakes helped me to avoid misunderstanding interview questions."
Sales applicant · 2026
"It was quick and comfortable to get through!"
Marketing applicant · October 2025
"Easy and smooth process"
Marketing applicant · October 2025
"Loved it! Was really convenient."
Marketing applicant · October 2025
"Very simple process thank you for the opportunity..."
Customer service applicant · May 2025
"It was seamless."
Senior leadership applicant · October 2025

The themes that show up repeatedly in the positive reviews are practical, not philosophical:

  • Time to think before answering
  • Lower pressure than a live interview
  • Convenience and ease of use
  • A second chance through retakes

Notice what's missing. Nobody calls the format "humanizing." Nobody says it's better than meeting a real person. They say it's faster, easier, and less stressful. Those are practical benefits. Candidates aren't praising the format. They're saying it didn't get in the way.

That distinction matters. The narrative about AI interviews assumes candidates are being asked to choose between "talk to a human" and "talk to a machine." Most candidates are choosing between "do this video interview right now from my couch" and "wait three weeks for a phone screen that gets rescheduled twice." In that comparison, the video interview wins, and they say so.

Chapter 5

The negatives are not what you think

The interesting data is in the small group of thumbs-down reviews. Here's what they're actually about.

"the camera angle was wrong"
Technical screen applicant · February 2026
"i am reading the question but I miss the eye contact with the interviewer"
Technical screen applicant · February 2026
"3 attempts is too little i messed up a little bit on the last one and wasnt able to do one more take"
Operations applicant · June 2025

The majority of negative reviews are about specific friction: camera framing, eye-contact guidance, retake limits, and the candidate's own preparation. None of them say "AI interviews are bad." They say "this specific moment in this specific tool didn't work for me."

A small number are emotional reactions worth taking seriously. We'll come to those.

Chapter 6

The signal that does matter

The negative reviews that aren't about UX are the ones to pay attention to. There are only two of them in the entire dataset.

Both came from senior roles, and both from the same month. They are the rare reviews that name the format itself as the problem, and both are for jobs where the candidate's expectation is "I'll meet a real human first."

That's a useful signal. It tells us that candidates evaluating senior or relationship-driven roles expect a different process.

For roles with lots of applicants, a video interview is a relief. For a director-level role where the candidate is also evaluating you, it can read as cheap.

What to do about it
  • Use video interviews for high-volume, lower-friction screens.
  • Don't use them as the only stage for senior-level evaluation.
  • The hundreds of other candidates already told you it works in the contexts where you should be using it.
Chapter 7

What candidates actually want fixed

Across every review, the recurring product requests were about interview design, not objections to the category:

  • More retake attempts
  • A pause button between questions
  • More think-time between questions
  • Better camera framing guidance

This is the right kind of feedback to get. Candidates accept the format. They want it to be better.

Chapter 8

What this means for your hiring stack

The 'candidates hate AI interviews' narrative is loud, but the candidate-side data doesn't support it. What it does support is more nuanced and more useful:

  1. Candidates accept video interviews when the role is a sensible fit. Customer service, technical screens, content, sales, operations, admin support. Almost universally fine.
  2. Candidates resist video interviews when the role expects relationship-building. Director-level marketing, senior leadership, relationship-driven admin. The format can read as cheap when the role itself is relational.
  3. The complaints that do exist are almost all interview design and process limits, not philosophical objections.

So the design question is not "should we use AI video interviews." It's "for which roles, with what settings, and at which stage."

If you've been holding back because of LinkedIn discourse, this is your data point. The discourse is overweighted by a small, vocal minority. The candidates who actually go through the process, across hundreds of voluntary reviews and thousands of completed interviews, with no incentive to flatter their employer, report positive experiences at a rate of more than 13 to 1.

If you're already running video interviews and you're worried about candidate experience, focus on the levers candidates actually mention. More retakes. Longer think-time. Clearer UX. The format isn't the problem.

A note on methodology

How to read this data

Source. Hundreds of voluntary candidate reviews submitted via Truffle's post-interview survey between 2025-04-29 and 2026-03-18. The survey is optional. Only candidates who completed the interview saw it, and most candidates skip the question entirely. The reviews analyzed in this report represent the subset of candidates who chose to leave written feedback.

Sample. Candidates of Truffle's customers, hiring across customer service, sales, marketing, content, technical screens, administrative and operations roles, hospitality, and senior leadership. The customer base skews toward US-based companies hiring for high-volume positions. All comments are unedited. Typos preserved.

Anonymization. Candidate names and individual customer identities have been removed. Reviews are referenced by role category and month. Internal product test entries were excluded from the analysis.

Limitations. Selection bias means candidates who abandoned the interview before completing it don't appear in the data. Format scope: Truffle's video interviews are one-way. Candidates record their answers; recruiters watch later. This differs from real-time AI interviewing where the candidate interacts with a bot mid-interview. Some criticism of "AI interviews" specifically targets that real-time format. This report does not address it.

Candidates aren't rejecting AI video interviews. They're rejecting bad processes. Get the format right, and the data is on your side.

A report by Truffle. Candidate screening software.

Data: Hundreds of voluntary candidate reviews submitted to Truffle's post-interview survey between April 2025 and March 2026. Anonymized by role category and month. Comments unedited; typos preserved. Internal product test entries excluded.
