Recruiters Who Embrace AI in Hiring Build More Human Candidate Experiences

Credit: Outlever

Key Points

  • As AI becomes a baseline workplace expectation, hiring teams are struggling to respond consistently when candidates use those same tools in interviews, exposing gaps in how recruiting processes are designed and communicated.

  • Chris Helvajian, Senior Recruiter at Included Health, says the friction is not a candidate problem but a design problem, and that rethinking the approach leads to stronger, more human hiring experiences.

  • The fix starts with clearer job descriptions, transparent compensation, and structured interviews that test for genuine depth while embracing AI as a tool that, used openly, reveals exactly the fluency modern roles demand.

Companies expect candidates to be proficient with AI in their day-to-day work, yet those same tools are often unwelcome in the interview process. That is where the disconnect shows up.

Chris Helvajian

Senior Recruiter
Included Health

Hiring teams are screening for AI proficiency while candidates are using the same tools to compete in an increasingly crowded job market. That dynamic is prompting a fresh look at talent acquisition, challenging organizations to design recruiting processes that are clearer, more human, and better equipped for the way work actually happens today. For HR leaders, the question is no longer whether candidates will use AI, but whether the hiring process is built to respond thoughtfully when they do.

Chris Helvajian, Senior Recruiter at Included Health, a digital health company, has spent more than 15 years building recruiting solutions across higher education and the tech sector. He sees the friction around AI in interviews not as a candidate problem, but as a design problem. “Companies expect candidates to be proficient with AI in their day-to-day work, yet those same tools are often unwelcome in the interview process. That is where the disconnect shows up,” says Helvajian.

  • A numbers game: The more pressing issue is not the rare bad actor, but the authentic candidate who feels pressure to gain any edge possible. AI-assisted manipulation is easy to spot, but understanding why real candidates feel compelled to use it points to a more systemic problem. “Candidates realize they are one in a thousand for a mid-level position that gets 700 to 1,000 applications within the first day. It would be inauthentic on our side to expect a candidate to just put up with that,” he says. 

  • The transparent copilot: Rather than treating AI use as grounds for disqualification, Helvajian sees this moment as a catalyst for better recruiting design. That means setting clear expectations upfront and, in some cases, actively welcoming AI when candidates use it transparently to show their thinking rather than replace it. The same principle applies beyond technical roles into corporate functions, where a candidate might use AI to help build out a presentation for a given scenario, demonstrating the kind of tool fluency companies want to see on the job. “One candidate was upfront about using a copilot tool, and as they worked through the problem, they talked us through their entire thought process, explaining their methodology, the trade-offs of their approach, and the potential risks. That kind of use is encouraged because it demonstrates true depth of knowledge,” says Helvajian.

The most practical path forward has less to do with policing AI use and more to do with rebuilding the human experience of hiring from the ground up. Small, deliberate changes at each stage of the funnel can have an outsized impact, not just on candidate experience, but on the business outcomes that matter most to HR leaders: longer tenure, lower attrition, and stronger team fit.

  • Show them the money: Transparency at the top of the funnel extends to compensation. Posting a salary range eliminates a major source of friction for candidates and reduces the likelihood of misaligned expectations derailing an offer late in the process, saving time for everyone involved. “Always post a salary range with your role so that candidates have a very clear set of expectations. That is just good practice,” says Helvajian.

  • Reframing the rejection: For candidates who make it to later rounds, a brief debrief call can transform a disappointing outcome into a positive lasting impression. When hundreds of qualified people compete for a single opening, the difference often comes down to fit and timing rather than ability, and communicating that directly reframes the experience in a way that candidates rarely forget. “When you frame the feedback that way, the conversation is no longer about why they didn’t get the job. It is about helping them understand that there were other equally qualified people in the running, which gives them the sense of having successfully made it through a competitive process,” he says.

  • Pay it forward: That goodwill does not have to stop at the debrief. Staying connected to strong candidates and offering to make introductions within a broader network costs very little but builds the kind of recruiter reputation that attracts better talent over time. “I tell candidates who make it to the later stages that if they find another role at a company where I have a connection, I will always send a warm intro on their behalf. I am just trying to connect people to the opportunity they are looking for, even if it is not where I am.”

The same critical thinking that makes a great recruiter also makes a savvy technology buyer. As the market for AI-powered hiring tools continues to expand, HR leaders have a real opportunity to build smarter, more efficient recruiting processes, provided they invest in tools that are genuinely aligned with how their teams work. The key is approaching those decisions with the same rigor and clarity that good hiring demands. “Look for vendors that clearly understand your specific process and your talent needs, not just the one with the shiniest new AI tool. Rigorously question what problem it is solving and whether that is a problem you truly need to solve,” Helvajian says. 
