Shauli Zacks
SafetyDetectives recently spoke with Lesley Carhart, Principal Industrial Incident Responder at Dragos and a well-known voice in cybersecurity. In this in-depth conversation, Lesley shares their unique journey into the world of digital forensics and incident response, reflects on industry mentorship and mental health, and discusses the evolving skill sets cybersecurity defenders will need to combat today’s increasingly human-driven threats.
Prior to joining Dragos, Lesley was the incident response team lead at Motorola Solutions. Following four years as a Principal Incident Responder for Dragos, Lesley now manages a team of incident response and digital forensics professionals across North America who perform investigations of commodity, targeted, and insider threat cases in industrial networks. Lesley is also a certified instructor and curriculum developer for Dragos’ incident response and threat hunting courses.
Can you introduce yourself and talk about what led you to specialize in digital forensics and incident response?
My name is Lesley Carhart, and what I do professionally is incident response for industrial systems at a company called Dragos. I’ve been doing this work for well over a decade now. Before Dragos, I did similar work at Motorola Solutions.
As for my background, I started out pretty early. I grew up on a farm in Illinois. We didn’t have a lot of money or many exciting things to do. In the ’80s, my dad got a computer to do inventory for the farm—he leased it, which was a big deal back then. I had a choice: I could either learn to farm or learn to use the computer. And since I incinerate in sunshine, I chose the computer.
I learned to program when I was seven or eight years old. I got my first job as a programmer around 15, which was possible back then during the dot-com boom. As I got deeper into computers, I started reading about a developing field in the ’90s—computer forensics. I saw in magazines and bulletin boards that people were recovering deleted data from hard drives. It was fascinating. There was even early speculation about memory forensics.
It was much harder to get into the field back then. If you didn’t look a certain way or know the right people, finding a mentor or internship was nearly impossible. So I joined the military, got a degree, and took a long route. I eventually made my way into the field through network engineering and operations. I worked in aviation and finally found someone who took a chance on me. But it was very, very challenging. If you weren’t the “right” person, nobody would call you back for cybersecurity—especially digital forensics.
What inspired you to create Lesley Carhart’s Cybersecurity Blog, and what kind of impact do you hope your writing has on the industry?
Exactly what I just spoke about. I had a very hard time finding mentorship myself. And as much as I love my technical work, I’ve always had a strong motivation to make sure others don’t struggle the way I did to get into cybersecurity.
It’s becoming that way again—it’s getting very challenging to break into the field. That motivates me even more to put content out there and provide mentorship. I run free career clinics and virtual conferences, and I do one-on-one mentoring online. I do it because it was hell trying to break in myself.
Now, I see gatekeeping returning. We’re seeing job shortages in the U.S., and I don’t want to see another generation go through what I did—struggling for 10 years to get a foot in the door.
What are some common mistakes that organizations make during incident responses, and how can they be better prepared?
I work in a very specific niche of incident response, and it’s not like traditional enterprise work. I respond to real-world physical things that have been hacked—like power plants, manufacturing equipment, and trains.
What I want people to understand is how different that space is. Almost every organization has industrial control systems (ICS) in their environment. Even if it’s just data centers with big air conditioners or buildings with elevators and water systems—those are all ICS.
These systems do crucial things, and when they fail, it affects life, safety, and operations. The equipment we work with is often very non-standard and quite old—sometimes it’s embedded computers or Windows NT. Even well-funded organizations struggle with this. They’re trying to come up with separate incident response plans for systems that might get ransomed or sabotaged.
So I encourage organizations to think about what ICS they have—even if it’s just building automation or generators. Think about how the response must be different. You can’t use the same playbook you use for modern enterprise environments.
Forensics on Windows XP is very manual. ICS protocols are different. The tools are different. Consider it. Make a plan.
You’ve worked extensively with critical infrastructure. What unique cybersecurity challenges exist in that space compared to traditional IT environments?
I mentioned a few already. First, the age of the equipment. It’s not because people in OT are being negligent—it’s because these systems have long life cycles. They’re built and tested to work together safely, and upgrading them can break that system.
Upgrades aren’t trivial. They can make systems unsafe if you do them without the vendor or OEM’s blessing. So we deal with very old systems that often aren’t on the same domain, use shared logins, service accounts, and so on.
Operationally, it’s different too. You can’t just shut things down. These systems may be keeping people alive or safe. You have to understand the whole process—what everything does, how it fits together, and what happens if it breaks.
In cybersecurity, it’s easy to focus on the “hacker-y” stuff: domain controller exploits, Metasploit. But in industrial environments, it’s more like, “What if someone accidentally disables the safety system that stops a machine when someone’s arm is caught in it?” That’s the kind of risk we deal with.
You have to think differently—not like a hacker, but like someone who understands physical processes and the consequences of failure.
Burnout and mental health are recurring themes in your work. How can the cybersecurity industry do a better job supporting its professionals?
Honestly, with the market the way it is, I don’t expect companies—especially in the U.S.—to invest more in employee well-being anytime soon.
Therefore, as an industry and as individuals, we have to be more aware of burnout. We’re working with fewer people, fewer resources, and there’s a constant sense of economic dread hanging over a lot of cybersecurity professionals. Meanwhile, the threats are only getting worse. The criminal ecosystem is growing. Adversaries have more resources, better tools, and they’re targeting more effectively.
Attackers are getting more sophisticated, and the bar for entry is lower thanks to new tools. I don’t think employers are going to step up to support mental health anytime soon. So we need to support ourselves and each other.
Recognize the signs of burnout—when you lose interest, feel overwhelmed, don’t want to get out of bed, or can’t finish projects. That’s when you need to talk to someone—professionals, peers, mentors, family. You need to find healthy coping mechanisms. Know when to walk away. Have hobbies that help you disconnect and relax.
And we need to watch out for each other. Things are really, really bad right now, and if we don’t take care of ourselves and each other, it’s only going to get worse.
As threats evolve, what skills or mindsets do you think will define the next generation of effective cybersecurity defenders?
It’s interesting—everyone talks about AI and how it’s going to replace analysts. But adversaries still use people. They have huge budgets now. Yes, they use tools and AI, but their tactics rely heavily on human beings with some training doing all kinds of attacks—ransomware, sophisticated ICS hacks, all of it.
As their attacks become more human-driven, our tools and AI become less effective. Computers aren’t good at spotting novel human behavior. That means the next generation of analysts can’t just be doing what computers do.
If a task can be automated, it will be—especially in this economy. If your job is clicking buttons, a machine’s going to take over.
But attackers using these human-driven tactics means we need human defenders who know how to hunt threats, think creatively, and adapt quickly. Complex purple teaming, new tactics—that’s where we need people. Analysts will still use tools, but the tools will handle the boring stuff. You don’t want to be doing monotonous work.
You need to be the defender brain that outsmarts the bad actors’ brains. That’s the future of cybersecurity.