Your cyber stack has 47 tools. Three of them are working.

Tool count is the vanity metric of most enterprise cyber programmes. Here are the three tests we run on day one of a readiness engagement that separate the tools from the programme.

January 2, 2026 By Rohit Khirapate

The average enterprise cybersecurity stack we see on a vCISO engagement runs somewhere between 40 and 70 tools. Endpoint, email, SIEM, SOAR, CSPM, CNAPP, IAM, PAM, vulnerability, threat intel, DLP, ZTNA, and a long tail of free trials that nobody remembers approving.

Tool count is not the programme. Tool count is the evidence the programme was never designed. When we open a readiness engagement, we run three tests. They are small, unglamorous, and they expose more about an organisation's posture than a months-long architecture review.

Test 1: The named-owner test

Pull the tool inventory. For every tool, ask three questions: who is the named owner, where is their runbook, and when did they last exercise it?

The result is usually the same. A third of the tools have a named owner and a runbook. A third have a named owner and no runbook. A third have no named owner at all.

The tools in the last two buckets are not working. They are producing telemetry nobody reads, consuming a licence nobody is justifying, and propping up a compliance claim that will not survive a pen test.

The fix. Rationalise. Any tool without a named owner and a runbook goes on a 90-day track: either it gets an owner and a runbook, or it is decommissioned. We have yet to run this exercise without saving a client at least 15% of their cyber software spend in the first quarter.
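The triage itself is small enough to script. A minimal sketch, assuming the inventory can be exported with owner and runbook fields (the tool names, field layout, and helper are illustrative, not a real client export):

```python
from datetime import date, timedelta

# Hypothetical inventory export: (tool, owner or None, runbook or None)
inventory = [
    ("EDR",        "a.patel", "runbook/edr.md"),
    ("SIEM",       "j.chen",  None),
    ("Legacy DLP", None,      None),
]

def triage(rows, today=None):
    """Bucket tools; anything without both an owner and a runbook
    goes on the 90-day remediate-or-decommission track."""
    today = today or date.today()
    deadline = today + timedelta(days=90)
    buckets = {"operated": [], "no_runbook": [], "unowned": []}
    for tool, owner, runbook in rows:
        if owner and runbook:
            buckets["operated"].append(tool)
        elif owner:
            buckets["no_runbook"].append((tool, deadline))
        else:
            buckets["unowned"].append((tool, deadline))
    return buckets

result = triage(inventory, today=date(2026, 1, 2))
# Tools in the last two buckets carry a dated deadline, not a reminder.
```

The point of the deadline field is that the output is a decision list, not a report: every tool in the last two buckets has a date by which it is either operated or gone.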

Test 2: The alert-to-action test

Pull the last 30 days of alert volume. Pull the last 30 days of analyst dispositions. If the ratio is not 1:1, meaning not every alert ends in a documented disposition, the SOC is not operating the stack. The stack is operating the SOC.

The fix. Measure three numbers weekly: total alerts, alerts with dispositions, mean time to disposition. Publish them. Anything measured weekly and published starts to move. Anything not measured doesn't.
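The weekly roll-up needs nothing more than the alert export. A sketch, assuming alerts come out with a raised timestamp and an optional disposition timestamp (the alert IDs and schema are invented for illustration):

```python
from datetime import datetime

# Hypothetical export: (alert_id, raised_at, dispositioned_at or None)
alerts = [
    ("A-101", datetime(2026, 1, 5, 9, 0),  datetime(2026, 1, 5, 10, 30)),
    ("A-102", datetime(2026, 1, 5, 9, 15), datetime(2026, 1, 5, 9, 45)),
    ("A-103", datetime(2026, 1, 5, 9, 20), None),  # open: drags the ratio off 1:1
]

def weekly_numbers(rows):
    """The three numbers to measure and publish every week."""
    total = len(rows)
    done = [(raised, closed) for _, raised, closed in rows if closed is not None]
    mean_minutes = (
        sum((closed - raised).total_seconds() for raised, closed in done)
        / len(done) / 60
        if done else 0.0
    )
    return {
        "total_alerts": total,
        "with_disposition": len(done),
        "mean_minutes_to_disposition": round(mean_minutes, 1),
    }

metrics = weekly_numbers(alerts)
```

Publishing the gap between the first two numbers is the whole mechanism: once the un-dispositioned count has a name and a weekly audience, it starts to shrink.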

Test 3: The board-question test

Ask the CISO three questions a board member will ask at the next meeting:

1. What are our top three cyber risks, in dollars?

2. What are we spending to reduce each of them?

3. What does the trendline look like over the last four quarters?

If any of these three answers requires "I'll get back to you" or a week-long data pull, the tool stack is not feeding the programme. The programme is being narrated out of a tool stack.

The fix. A one-page risk register, refreshed monthly, signed by the CISO. Three risks, three dollar estimates, three mitigation budgets, four quarters of trend. That is the whole artifact. It is also the artifact every CFO and board member will read and remember.
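The register is small enough to keep as data and render on demand. A sketch of the shape, with every risk name and dollar figure invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    name: str
    exposure_usd: int           # the dollar estimate the board will ask about
    mitigation_budget_usd: int  # what we are spending to reduce it
    trend: list = field(default_factory=list)  # last four quarters, $M

# Illustrative register: three risks, three estimates, three budgets.
register = [
    Risk("Ransomware on OT estate",  4_000_000, 350_000, [5.2, 4.9, 4.4, 4.0]),
    Risk("Third-party data breach",  2_500_000, 200_000, [2.1, 2.3, 2.6, 2.5]),
    Risk("Privileged-account abuse", 1_200_000, 150_000, [1.6, 1.4, 1.3, 1.2]),
]

def one_pager(risks):
    """Render the register as the single page the board will actually read."""
    lines = [f"{'Risk':<28}{'Exposure':>12}{'Budget':>10}  Trend ($M/qtr)"]
    for r in risks:
        trend = " -> ".join(f"{t:.1f}" for t in r.trend)
        lines.append(
            f"{r.name:<28}{r.exposure_usd:>12,}{r.mitigation_budget_usd:>10,}  {trend}"
        )
    return "\n".join(lines)

page = one_pager(register)
```

Whether the numbers live in a dataclass, a spreadsheet, or a GRC tool matters far less than the constraint the structure enforces: three risks, a dollar figure and a budget for each, four quarters of trend, one page.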


None of these three tests requires new tooling. All of them require that the existing tooling be connected to the operating cadence. That is almost always the gap.

The regulated-enterprise and government programmes we've rescued in the last two years all had the same shape: a sophisticated tool inventory and a thin operating layer. Closing the gap is rarely a procurement question. It is a staffing, cadence, and accountability question.

Next step. Our vCISO practice runs a structured 30-day assessment against these three tests and returns a rationalisation plan with named owners, runbook gaps, and quantified cost impact. Book a scoping call and you'll hear back from a senior practitioner within 48 hours. Not a sales rep. Not a chatbot.