2 Sources
[1]
I run the world's largest employee mental health company. Leaders are treating AI adoption as a tech problem. It's not | Fortune
As CEO of ComPsych -- the world's largest provider of employee mental health services -- I spend a lot of time thinking about what's making workers anxious. Right now, AI is at the top of that list. According to a 2025 Pew Research Center report, workers are more worried than hopeful about AI in the workplace. And the consequences of that anxiety go well beyond morale: research shows that when employees believe they are likely to lose their job, the risk of serious psychological distress rises considerably -- along with the likelihood of mental health leave, disengagement, and burnout.

Leaders are rolling out AI tools at speed. Most are treating this as a technology challenge. It isn't. It's a people challenge -- and getting it wrong has a measurable cost. Job insecurity driven by AI fear manifests in two damaging patterns. The first is disengagement -- distracted, uncollaborative workers doing the minimum to get by. The second is its opposite: workers who over-compensate by becoming hypersensitive and over-extended, eventually burning out. Both patterns hurt team performance. Both are symptoms of the same underlying failure: leaders who haven't addressed what their employees are actually afraid of.

To counteract this, leaders need to focus on building genuine trust. Trust doesn't just happen -- it's earned. The most effective way to build it is through early, open, and consistent communication. The impulse to wait until everything is figured out before communicating is understandable. But in the absence of communication, rumors form, narratives harden, and fear sets in. Sharing relevant information as soon as it's available -- being transparent about what's still being determined and when more details will follow -- allows leaders to inform and reassure simultaneously. In the case of AI, this can be as simple as defining a clear company philosophy.
Articulating that AI may change parts of a person's job but does not diminish their value gives leaders the opportunity to set the tone, reduce fear, and support emotional well-being before anxiety takes hold.

Clarity about where AI should -- and shouldn't -- have the final word is one of the most important things a leader can establish early. I remind our teams constantly: AI has to earn our trust. If you were collaborating with a new colleague for the first time -- one with only a few years of real-world experience -- you would never take their work, assume it was fully correct, and move forward without carefully reviewing it, providing feedback, or challenging the portions you disagreed with. The same scrutiny applies to anything produced with AI assistance.

Beyond quality checks, there are categories of work where human judgment, ingenuity, and creativity must remain the final word. Every organization will need to define its own guardrails. As a company providing mental health services to millions of people worldwide, we've been explicit with our teams: the human expertise of our clinicians is paramount. We are embracing AI to enhance operations and aid patient navigation -- but we will never defer to a large language model when someone is in crisis and needs human-centered care.

Setting clear expectations is necessary but not sufficient -- organizations must also actively ensure their employees are growing alongside the technology, not being hollowed out by it. Research from MIT's Media Lab found that heavy reliance on AI tools can atrophy the independent thinking and problem-solving skills people need most. It is incumbent on leaders to ensure teams are using these tools to vault their creative and strategic thinking to new heights -- not as crutches they eventually won't be able to function without.
This means actively encouraging imaginative, out-of-the-box thinking, reinforcing the value of individual skills, and investing in continuous learning and development programs. Upskilling is not optional: the skills needed to thrive alongside AI, and the tasks that make up our workdays, will change enormously. The leaders who get this right won't just be the ones who deploy AI fastest. They'll be the ones who brought their people along -- reducing fear, building trust, and preserving the human judgment that no model can replicate. The goal isn't to help your workforce adapt to the future of work. It's to help them build it.
[2]
AI Is Making Leaders Question Their Worth. Here's the Psychological Shift They Must Make.
They will also be the ones who stay steady when comparison rises and tolerate not being the smartest in the room without interpreting it as a threat.

Artificial intelligence is not just changing workflows. It is destabilizing our identity. Most discussions around AI focus on efficiency, automation and competitive advantage. But working with founders and executives, I am seeing something quieter and more personal happening beneath the surface. AI is confronting leaders with a question many have never had to answer: Who am I if I am not the differentiator? We live in a society where our career consciously or subconsciously becomes our identity -- who do you become when there is technology at everyone's fingertips that can be you?

For years, high performers have regulated their confidence through competence. They built authority by being the most creative, data-driven and strategic thinker in the room. Now, large language models can draft strategy outlines in seconds. Data can be analyzed instantly. Creative assets can be generated on demand. This is not just a technological shift. It is an identity disruption, and that feels scary.

From my perspective as an executive psychology coach, the brain is designed to reduce uncertainty through prediction. The mind is a prediction-making machine. It constantly uses past experience and our sensory present to anticipate what will happen next. When prediction becomes difficult, the nervous system increases vigilance. This doesn't necessarily show up as panic. It can feel like what I call "background noise" and show up behaviorally as vigilance.

You might see it in leaders who suddenly feel the need to be in every AI-related conversation, even when it's not necessary.
In executives who are checking competitors' AI announcements late at night, not because it's strategic, but because it's hard to turn off the scanning. In founders who insist on personally reviewing every AI-generated output, tightening control instead of expanding trust. It can also show up as pushing teams to "move faster" without a clear integration plan or dismissing AI publicly while privately feeling unsettled. Another example that I see happening often with seasoned leaders is this quiet comparison to younger, more tech-native talent, and wondering if you're falling behind.

When identity has been unconsciously tied to exceptional performance, disruption feels like exposure. This is not about ego. It is about conditioning. Many high achievers developed early patterns that linked belonging and safety to output. Achievement became more than success. It became a way to regulate how they feel about themselves. For instance: If I perform, I am valued. And if I am valuable, I am safe. Safe in what way? Safe financially, safe in relationships and safe in identity.

When a system can now perform parts of your role faster than you can, that equation destabilizes. Leaders who are unaware of this dynamic will compensate. They may overwork. Over-assert. Over-correct. They may dismiss AI publicly while privately feeling threatened. They may double down on productivity instead of expanding adaptability.

This is where executive psychology matters. Performance is not only cognitive. It is physiological. A chronically activated nervous system narrows perception. It reduces cognitive flexibility. It increases black-and-white thinking. It subtly shifts leaders into defensive strategy rather than creative expansion. In volatile markets, defensive leadership becomes expensive.

The competitive edge in the AI era will not belong to the leader who can outproduce a machine. It will belong to the leader whose identity is not destabilized by one. That requires internal differentiation.
The leaders I see thriving right now are not the loudest adopters nor the fiercest resistors. They are the most internally stable. They can integrate new tools without unconsciously fighting for relevance. That stability comes from understanding your own patterns. If achievement has historically been your regulator, AI will amplify that pattern. If control has been your safety strategy, unpredictability will stress it.

A word of hope: Disruption also offers opportunity. When performance is no longer the sole differentiator, leadership shifts toward integration, discernment and relational intelligence. Toward the capacity to hold uncertainty without collapsing into urgency. In other words, the edge becomes psychological.

This is where maturity in leadership begins to separate itself from mere competence. When you are no longer competing on output alone, you are forced to compete on presence. On judgment. On the ability to regulate yourself in environments that do not offer clear answers.

Artificial intelligence may transform industries, but the leaders who outperform will be the ones who transform their internal operating system first. The ones who can notice their own patterns before those patterns start driving decisions. The ones who can stay steady when comparison rises. The ones who can tolerate not being the smartest voice in the room without interpreting it as a threat. There has never been a more critical time to do the inner work, mentally, emotionally and psychologically, than right now. This has to be a non-negotiable, and it can't be a "when I get to it" or something you do for luxury. It is now your leadership responsibility. Because in a world that is accelerating, steadiness becomes power.
Leaders are racing to deploy AI tools, but they're missing the real challenge. Workers fear job loss while executives question their own worth as machines replicate their skills. According to ComPsych's CEO, this isn't a technology problem—it's a people problem with measurable costs in burnout, disengagement, and mental health.
As organizations accelerate AI adoption, leaders are making a critical mistake: treating it purely as a technology challenge. According to Richard Chaifetz, CEO of ComPsych, the world's largest provider of employee mental health services, this approach ignores the profound human impact unfolding across workplaces [1]. A 2025 Pew Research Center report reveals that workers are more worried than hopeful about AI in the workplace, and this employee anxiety carries measurable consequences. When employees believe they're likely to lose their job, the risk of serious psychological distress rises considerably, along with mental health leave, disengagement, and burnout [1].
Job insecurity driven by AI manifests in two damaging patterns that directly impact team performance. The first is disengagement: distracted, uncollaborative workers doing the minimum to get by. The second is its opposite: workers who over-compensate by becoming hypersensitive and over-extended, eventually burning out. Both patterns stem from the same underlying failure: leaders who haven't addressed what their employees are actually afraid of [1].

While employee anxiety dominates workplace conversations, a quieter psychological shift is destabilizing leaders themselves. AI is confronting executives with a question many have never had to answer: Who am I if I am not the differentiator? [2] For years, high performers have regulated their confidence through competence, building authority by being the most creative, data-driven, and strategic thinker in the room. Now, large language models can draft strategy outlines in seconds, data can be analyzed instantly, and creative assets can be generated on demand [2].
This identity disruption manifests in behavioral patterns that executive psychology coaches are observing across organizations. Leaders suddenly feel the need to be in every AI-related conversation, even when unnecessary. Executives check competitors' AI announcements late at night, not because it's strategic, but because it's hard to turn off the scanning. Founders insist on personally reviewing every AI-generated output, tightening control instead of expanding trust [2]. When self-worth has been unconsciously tied to exceptional performance, disruption feels like exposure.

To counteract the anxiety surrounding the future of work with AI, leaders need to focus on building genuine trust. Trust doesn't just happen; it's earned through early, open, and consistent communication. The impulse to wait until everything is figured out before communicating is understandable, but in the absence of communication, rumors form, narratives harden, and fear sets in [1].

Sharing relevant information as soon as it's available, and being transparent about what's still being determined and when more details will follow, allows leaders to inform and reassure simultaneously. In the case of AI, this can be as simple as defining a clear company philosophy. Articulating that AI may change parts of a person's job but does not diminish their value gives leaders the opportunity to set the tone, reduce fear, and support emotional well-being before anxiety takes hold [1].
.Clarity about where AI should—and shouldn't—have the final word is one of the most important things leaders can establish early. Chaifetz reminds his teams constantly: AI has to earn our trust. If you were collaborating with a new colleague for the first time—one with only a few years of real-world experience—you would never take their work, assume it was fully correct, and move forward without carefully reviewing it
1
.Beyond quality checks, there are categories of work where human judgment and creativity must remain the final word. As a company providing mental health services to millions of people worldwide, ComPsych has been explicit: the human expertise of clinicians is paramount. They are embracing AI to enhance operations and aid patient navigation—but will never defer to a large language model when someone is in crisis and needs human-centered care
1
.Related Stories
Setting clear expectations is necessary but not sufficient. Organizations must actively ensure their employees are growing alongside the technology, not being hollowed out by it. Research from MIT's Media Lab found that heavy reliance on AI tools can atrophy the independent thinking and problem-solving skills people need most [1].

Leaders must ensure teams are using these tools to vault their creative and strategic thinking to new heights, not as crutches they eventually won't be able to function without. This means actively encouraging imaginative, out-of-the-box thinking, reinforcing the value of individual skills, and investing in continuous learning and development programs. Upskilling is not optional: the skills needed to thrive in the AI era, and the tasks that make up our workdays, will change enormously [1].
The competitive edge in the AI era will not belong to the leader who can outproduce a machine. It will belong to the leader whose identity is not destabilized by one. That requires internal differentiation and an understanding of your own patterns [2]. When performance is no longer the sole differentiator, leadership shifts toward integration, discernment, and relational intelligence, toward the capacity to hold uncertainty without collapsing into urgency.

The leaders who get this right won't just be the ones who deploy AI fastest. They'll be the ones who brought their people along, reducing fear, building trust, and preserving the human judgment that no model can replicate. The goal isn't to help your workforce adapt to the future of work. It's to help them build it [1]. When automation increases efficiency but leaders maintain adaptability and focus on team performance, organizations can navigate this transition without sacrificing their most valuable asset: their people.

Summarized by Navi