Redefining Detection Engineering: Part I
An Engineering Discipline
The Definition
Detection engineering is a structured, collaborative process that involves designing, building, and continuously improving threat detection capabilities to identify and respond to security threats quickly.
- Splunk, “Detection Engineering Explained”
Vendor definitions are fine, but they're always generic. In my opinion, this one oversimplifies the scope of detection engineering. That kind of framing leads professionals to believe the function is nothing more than writing rules in a SIEM when, in reality, it is an engineering discipline - complete with a philosophy, lifecycle, practices, and pipelines.
So, what is the true definition of detection engineering? I’d argue it’s something close to this:
Detection engineering is the application of software engineering concepts to transform raw data into high-fidelity, actionable alerts that are measurable, manageable, and reliable through repeatable techniques, such as version control, testing, and optimization.
This distinction isn’t academic; it is the difference between a detection program that scales and one that drowns in false positives and manual processes.
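To make the "software engineering concepts" part concrete, here is a minimal detection-as-code sketch: the rule is a plain function that lives in version control and ships with its own test, so it can be reviewed, versioned, and verified in CI before it ever reaches production. The event fields, function names, and threshold are my own illustrative assumptions, not any particular SIEM's schema.

```python
# A minimal "detection-as-code" sketch. The rule is ordinary code:
# it can be diffed, code-reviewed, and tested like any other software.
# Event field names ("action", "user", "src_ip") are assumptions.

def detect_brute_force(events, threshold=5):
    """Flag (user, src_ip) pairs with `threshold` or more failed logins."""
    counts = {}
    for event in events:
        if event.get("action") == "login_failure":
            key = (event["user"], event["src_ip"])
            counts[key] = counts.get(key, 0) + 1
    return [key for key, n in counts.items() if n >= threshold]

# A repeatable test that runs in CI on every change to the rule.
def test_detect_brute_force():
    noise = [{"action": "login_success", "user": "bob", "src_ip": "10.0.0.2"}]
    attack = [{"action": "login_failure", "user": "alice", "src_ip": "10.0.0.1"}] * 5
    assert detect_brute_force(noise) == []
    assert detect_brute_force(noise + attack) == [("alice", "10.0.0.1")]

test_detect_brute_force()
```

The point isn't the brute-force logic itself; it's that the detection is measurable, manageable, and reliable because it moves through the same pipeline as any other engineered artifact.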
To really understand what detection engineering is, sometimes it helps to clear up what it isn’t:
Just writing SIEM queries
Writing as many detections as possible
Copy and pasting indicators of compromise (IOC) into alerts
Tuning pre-canned detections from third parties
Why I’m Writing
Initially, I planned to turn this material into a course on detection engineering. However, after continually seeing the confusion around detection roles, the lack of program maturity across the industry, and vendor marketing muddying the definition, I decided to do something a little different.
I’ve never been one to appreciate paywalled knowledge. So, I’m open-sourcing everything here, in blog form, for anyone who wants to rebuild detection engineering from the ground up. The way I believe it was meant to be practiced.
The Philosophy
Detection engineering isn’t about chasing today’s attacker. It’s about building logic that will catch attackers tomorrow, next month, and next year - no matter who they are.
This philosophy is built on guiding principles that come from practice rather than vendor playbooks.
The Generalizability Principle
Good detections don’t catch specific IOCs; they catch attack patterns. Behaviors > signatures.
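The contrast between the two approaches can be sketched in a few lines. The process-creation event shape, field names, and hash value below are all hypothetical; the point is that the behavioral check survives a change the IOC check does not.

```python
# An IOC match vs. a behavioral pattern, using a hypothetical
# process-creation event. All field names and values are illustrative.

KNOWN_BAD_HASHES = {"d41d8cd98f00b204e9800998ecf8427e"}  # illustrative hash

def ioc_detection(event):
    # Brittle: one recompile changes the hash and the detection dies.
    return event["sha256"] in KNOWN_BAD_HASHES

def behavioral_detection(event):
    # Resilient: flags any Office process spawning a shell, regardless
    # of which malware family (or hash) is behind it.
    office = {"winword.exe", "excel.exe", "powerpnt.exe"}
    shells = {"cmd.exe", "powershell.exe"}
    return event["parent_image"] in office and event["image"] in shells

event = {
    "sha256": "ffffffffffffffffffffffffffffffff",  # never-before-seen payload
    "parent_image": "winword.exe",
    "image": "powershell.exe",
}
assert not ioc_detection(event)     # the IOC match misses the new sample
assert behavioral_detection(event)  # the behavior still fires
```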
Assume Breach Mindset
Assume you will be breached; your detections shouldn’t just focus on the perimeter of your network.
Data-Driven Decision Making
If you can’t measure the value of a detection, it isn’t a detection - it’s a guess, and guessing isn’t engineering.
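One way to make a detection measurable is to track analyst dispositions per rule and compute fidelity from them. The rule names and counts below are invented for illustration; the calculation is the point.

```python
# Measuring a detection's value from alert outcomes. True/false
# positive counts come from analyst dispositions; all numbers here
# are made up for illustration.

def precision(true_positives, false_positives):
    """Fraction of a rule's alerts that were real threats."""
    total = true_positives + false_positives
    return true_positives / total if total else 0.0

# Hypothetical 30-day dispositions for two rules:
rules = {
    "brute_force_v2":    {"tp": 18, "fp": 2},    # high fidelity: keep
    "rare_parent_child": {"tp": 1,  "fp": 199},  # noise generator: fix or retire
}

for name, r in rules.items():
    print(f"{name}: precision={precision(r['tp'], r['fp']):.2f}")
```

With numbers like these in hand, "is this detection worth its alert volume?" becomes an engineering decision instead of a guess.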
Fail Fast, Learn Faster
While you will test detections, you shouldn’t spend an obsessive amount of time crafting “perfect” logic. Deploy, measure, refine.
Signal vs. Noise Obsession
We don’t want more detections; we want better detections.
These principles aren’t just theory - mine or anyone else’s. I’m giving them to you now because most organizations are failing at detection engineering today. Don’t believe me? Well, let’s take a look at the numbers behind their failures.
The Problem
Detection engineering didn’t just “appear” one day. It was born out of failure - the failure of organizations and their tools to operationalize detection. Even today, most organizations continue to fail. The numbers prove it.
Every year, CardinalOps conducts a study analyzing thousands of production SIEM environments to assess how companies are actually performing in terms of detection. Their 2024/2025 reports will sober you up (with some added SANS survey stats):
21% of MITRE ATT&CK techniques covered, despite collecting enough data to cover 90%.
Imagine missing 4 out of 5 behaviors entirely.
13% of rules are broken and will never fire due to data source issues, missing fields, and parsing errors.
Imagine writing 100 detections and not knowing that 13 of them will never fire.
60% or more of attacks go undetected due to detection gaps.
Imagine crafting great detections but missing 60% of attack vectors.
80% or more of detections are untested or misconfigured.
Imagine deploying detections without knowing how they’re affecting your customers (e.g., SOC analysts).
64% of teams cite false positives as their biggest issue.
Imagine creating so many low-fidelity detections that your work becomes your team’s #1 issue.
Some of you might be thinking: “Sure, but my company doesn’t have numbers that bad.” Maybe. Have you ever actually checked? How? Was it thoroughly inspected against a framework? Or was it just checked enough to meet a compliance audit, without really proving anything?
These aren’t just numbers. They represent missed ransomware infections, unchecked insider threats, and attackers still living rent-free in an unknown number of production environments. This is the reality that you, as a detection engineer, are paid to fix.
Now, let’s try to imagine a world where this level of engineering failure is allowed:
20% of cars have functioning brakes.
10% of planes risk falling out of the sky.
Are you getting the point? This level of failure is unthinkable.
Things To Think About
If the average set of detections in a SIEM only covers 21% of techniques, what is it not seeing? What about in your environment?
How do false positives affect the professionals who have to deal with them? How would it affect you?
Why is detection engineering still this broken in 2025?
How Do We Fix It?
These failures are primarily due to a systemic misunderstanding of what detection engineering is, the responsibilities that should be assigned to the roles involved, how maturity is measured, and an inability to evaluate which metrics actually matter.
There’s only one way to fix it - setting aside preconceived notions of what you believe detection engineering is, and relearning it from the ground up. Congratulations on taking that first step. I’m glad to have your confidence.
It’s time to start rebuilding detection engineering as the engineering discipline it was always meant to be.
What’s Next in Part 2?
Responsibilities of a Detection Engineer

