E59 - 150,000 Incidents Prove Time Is the Variable That Matters
Posted on June 17, 2025 • 4 min read • 745 words
By FIR Risk Advisory | Cybersecurity Fraud Intelligence
Weekly Risk Intelligence Brief
Source: Cyentia Institute, IRIS 2025 (Information Risk Insights Study) — 150,000+ incidents, 2008–2024
The 30-Second Brief
Cyentia Institute analyzed over 150,000 cyber incidents spanning 2008 to 2024. The overarching finding: time matters. Incident frequency has increased sixfold over 15 years. Financial losses have grown 15x in absolute terms. And despite SMBs generating more incidents in absolute terms, the targeting skew runs the other way — large firms ($100B+) are 620x more likely to be targeted relative to their population size.
Static risk models built on historical averages are increasingly misleading. The threat landscape is evolving too fast for backward-looking frameworks.
The Data: 150,000+ Incidents Over 15 Years
Sixfold Increase in Significant Incidents
The volume of significant cyber incidents has grown sixfold over the study period. This isn’t just better reporting — it’s a genuine acceleration in attack activity across sectors and geographies.
INTEL [TREND]: Significant cyber incidents have increased sixfold over 15 years. Risk models calibrated to historical averages are underestimating current exposure. Organizations should weight recent data more heavily in risk assessments and recalibrate annually.
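One way to "weight recent data more heavily" is exponential recency weighting with a tunable half-life. The sketch below is illustrative only — the yearly counts and the three-year half-life are placeholder assumptions, not IRIS figures.

```python
# Hypothetical sketch: estimate current incident frequency while
# exponentially down-weighting older years. Data and half-life are
# illustrative assumptions, not from the IRIS study.
import math

def recency_weighted_rate(yearly_counts, half_life_years=3.0):
    """Weighted average of yearly counts.

    yearly_counts: list of (years_ago, incident_count) pairs.
    half_life_years: age at which a year's weight falls to 0.5.
    """
    decay = math.log(2) / half_life_years
    weighted_sum = 0.0
    weight_total = 0.0
    for years_ago, count in yearly_counts:
        w = math.exp(-decay * years_ago)  # newer years get weight near 1
        weighted_sum += w * count
        weight_total += w
    return weighted_sum / weight_total

# Illustrative history: rising frequency, 0 = most recent year.
history = [(0, 120), (1, 95), (2, 70), (3, 40), (4, 20)]
print(round(recency_weighted_rate(history), 1))
```

With rising frequency, the weighted rate lands well above the plain five-year mean (69 here), which is exactly the gap a historical-average model hides.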
Large Enterprises: 620x More Likely to Be Targeted
Firms with over $100 billion in revenue are 620 times more likely to be targeted relative to their population size. The targeting is disproportionate — and intentional. Attackers go where the value is.
INTEL [SECTOR ALERT]: Large enterprises ($100B+ revenue) are 620x more likely to be targeted. Despite higher SMB incident counts in absolute terms, enterprise targeting intensity far exceeds smaller organizations. Enterprise security investments should reflect this disproportionate threat exposure.
Financial Impact: $2.9M Median, $32M Tail Risk
- Median loss: $2.9 million per incident
- Tail losses: $32 million (95th percentile)
- Loss growth: 15x increase in absolute terms; 8x proportional to revenue
- Professional services: 25x surge in median losses
- Media-reported incidents: $28.5 million median — 30x higher than historical baseline
The gap between median and tail losses is where risk quantification fails most organizations. Planning for the median means being unprepared for the tail.
INTEL [VULNERABILITY]: Median cyber losses of $2.9M mask tail risk of $32M at the 95th percentile. Professional services saw a 25x surge in median losses. Organizations using average loss figures for risk planning are systematically underestimating their exposure. Use percentile-based models, not averages.
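A percentile-based model can be anchored directly to the two published figures. The sketch below fits a lognormal curve through the $2.9M median and $32M 95th percentile — the lognormal form is a common modeling assumption on our part, not something the study prescribes.

```python
# Hedged sketch: fit a lognormal loss distribution to the two published
# anchors (median $2.9M, P95 $32M), then read off other percentiles.
# The lognormal shape is our assumption, not from the IRIS study.
import math
from statistics import NormalDist

MEDIAN = 2.9e6
P95 = 32e6

mu = math.log(MEDIAN)             # lognormal median = exp(mu)
z95 = NormalDist().inv_cdf(0.95)  # standard normal 95th-pct z-score
sigma = (math.log(P95) - mu) / z95

def loss_percentile(p):
    """Loss at cumulative probability p under the fitted lognormal."""
    return math.exp(mu + sigma * NormalDist().inv_cdf(p))

for p in (0.50, 0.90, 0.95, 0.99):
    print(f"P{round(p * 100):02d}: ${loss_percentile(p) / 1e6:,.1f}M")
```

Planning against P95 or P99 of a curve like this, rather than the median, is what closes the median-versus-tail gap described above.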
Attack Techniques Are Shifting
The data shows technique evolution over time:
- Valid accounts remain the dominant initial access vector
- Application exploits jumped to 38% of incidents
- Misconfigurations rose to 30%
Static control frameworks that don’t account for technique proliferation become obsolete quickly.
INTEL [ATTACK TECHNIQUE]: Valid accounts remain the top initial access vector, but application exploits (38%) and misconfigurations (30%) are surging. Control investments should align with current technique proliferation patterns, not historical distributions. Review MITRE ATT&CK mappings against your control coverage quarterly.
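The quarterly ATT&CK review can start as something as simple as a technique-to-control lookup. In this sketch, T1078 (Valid Accounts) and T1190 (Exploit Public-Facing Application) are real ATT&CK technique IDs matching the vectors above; the control inventory is a hypothetical placeholder, and T1552 (Unsecured Credentials) is used as a rough stand-in for misconfiguration exposure.

```python
# Illustrative quarterly coverage check: map top initial access
# techniques (real MITRE ATT&CK IDs) to deployed controls.
# The control inventory below is a hypothetical placeholder.
TOP_TECHNIQUES = {
    "T1078": "Valid Accounts",
    "T1190": "Exploit Public-Facing Application",
    "T1552": "Unsecured Credentials",  # stand-in for misconfig exposure
}

# Hypothetical control inventory: technique ID -> deployed controls.
CONTROL_COVERAGE = {
    "T1078": ["MFA", "credential rotation"],
    "T1190": ["WAF", "patch SLA"],
}

for tid, name in TOP_TECHNIQUES.items():
    controls = CONTROL_COVERAGE.get(tid, [])
    status = ", ".join(controls) if controls else "NO COVERAGE"
    print(f"{tid} ({name}): {status}")
```

Running this each quarter against an updated technique list surfaces gaps (here, the uncovered third technique) before the distribution shifts under you.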
Seven Core Risk Insights
- Rising incident frequency warrants updated defensive strategies
- Enterprise targeting exceeds SMB exposure despite higher absolute SMB incident counts
- Threat patterns shift rapidly — static frameworks become obsolete
- Normalized financial metrics are essential across business scales
- Sectoral variation is substantial — risks differ markedly by industry
- Attack techniques are evolving — valid accounts, app exploits, and misconfigurations dominate
- Data recency matters — historical datasets may obscure emerging threats
What Leaders Should Do Now
Recalibrate risk models — Weight recent data more heavily. The threat landscape of 2024 looks nothing like 2019. Risk models should reflect that.
Slice analysis by size, sector, and time — Aggregate statistics obscure meaningful variation. Benchmark against your peer group, not the global average.
Blend historical and real-time intelligence — Actuarial data provides the baseline. Real-time threat intelligence provides the current picture. You need both.
Align control investments with technique trends — Valid accounts, app exploits, and misconfigurations are the current top three. Ensure your controls map to them.
Normalize impact to revenue — A $2.9M loss means different things to a $10M company and a $10B company. Revenue-normalized metrics enable meaningful board-level reporting.
The Bottom Line
150,000 incidents over 15 years tell one story: time is the variable that matters most. Incident frequency is up sixfold. Losses are up 15x. Attack techniques are shifting faster than most risk frameworks can track.
The organizations that thrive in this environment will be the ones that treat risk modeling as a living discipline — updated quarterly, weighted toward recent data, and normalized to their specific size and sector. Static risk is dead.
Find all editions on FIR Risk Tuesday | GitHub