Before You Build, Talk to Someone Who Did
I recently sat down with the CTO of a $7.5 billion bank who has spent four years building his institution’s enterprise data warehouse from the ground up. They pull from 25 data sources, have dedicated analysts embedded in every line of business, and have built an AI agent that takes questions in plain English, writes the SQL, self-corrects when a query fails, and returns results with the original query attached so report writers can audit the output. By almost any measure, he’s done the work right.
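The retry-and-audit loop he described can be sketched in a few lines. To be clear, this is a hypothetical illustration, not his bank's implementation: the `fake_llm` stub stands in for whatever model generates the SQL, and the in-memory SQLite table stands in for the warehouse. The essential ideas are that a failed query's error message is fed back into the next generation attempt, and the final query travels with the results so a report writer can audit it.

```python
import sqlite3

def run_with_self_correction(conn, question, generate_sql, max_attempts=3):
    """Generate SQL for a question, retry on errors, and return rows
    together with the final query so the output can be audited."""
    error = None
    for _ in range(max_attempts):
        sql = generate_sql(question, previous_error=error)
        try:
            rows = conn.execute(sql).fetchall()
            # The query is attached to the answer, mirroring the audit
            # trail described in the article.
            return {"rows": rows, "sql": sql}
        except sqlite3.Error as exc:
            # Feed the database error back so the next attempt can self-correct.
            error = str(exc)
    raise RuntimeError(f"gave up after {max_attempts} attempts: {error}")

# Illustrative setup: a tiny in-memory "warehouse".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deposits (branch TEXT, balance REAL)")
conn.executemany("INSERT INTO deposits VALUES (?, ?)",
                 [("Main", 1200.0), ("North", 800.0)])

def fake_llm(question, previous_error=None):
    # Hypothetical stub: the first attempt misspells the table name,
    # and the retry (which sees the error) is corrected.
    if previous_error is None:
        return "SELECT SUM(balance) FROM deposit"
    return "SELECT SUM(balance) FROM deposits"

result = run_with_self_correction(conn, "total deposits?", fake_llm)
```

Here `result["rows"]` holds the answer and `result["sql"]` holds the query that produced it, which is what makes the output auditable.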
And the first thing he told me was that most banks shouldn’t try to do what he did.
We have some version of this conversation almost every week at KlariVis. A bank CEO or CFO tells us they’re thinking about building their own data platform. They’ve got a data analyst on staff, maybe two. They figure they’ll stand up a warehouse, connect some sources, build some dashboards, and they’ll be off to the races.
Eighteen months and several hundred thousand dollars later, they call us.
Listen, I’ve been there. I know what it feels like to want to own the thing: to control it, to build exactly what you need. That instinct is reasonable, but the execution is where it falls apart, and it falls apart in ways that aren’t obvious until you’re already deep into it.
He walked me through three places where he’s watched banks get stuck. None of them are where you’d expect.
The warehouse is the easy part.
Sounds counterintuitive, but standing up a SQL Server or Snowflake instance and piping data into it isn’t actually the bottleneck. He’s watched banks go to competitors who hand over a data warehouse and then leave them with nothing tangible at the end. As he put it, the proof is in the dashboards, not the warehouse. If a vendor can’t show you analytics that actually mean something to a lender or a CFO, they probably haven’t built something useful underneath either.
This is what we see consistently. Banks end up with infrastructure that nobody uses because nobody translated the data into something a branch manager or a CFO can act on first thing Monday morning.
Data engineers are not business analysts.
This came up several times in our conversation, and it’s the point worth sitting with. A data engineer can write clean queries and build beautiful schemas, but do they know what a lender actually needs to see at 7am before calling on a relationship? Do they know how a CFO reconciles balance sheet anomalies across lines of business? Can they build a profitability view that accounts for how finance actually treats the numbers?
Usually, no. That’s not a knock on data engineers. It’s a recognition that understanding the banking business and understanding data architecture are two different skill sets. He has the resources to staff both at a $7.5 billion institution. Your average $2 billion community bank does not.
The breakeven keeps moving up.
What most banks don’t account for is that the target keeps moving. A few years ago, you could argue that banks above $1 billion in assets could build in-house if they committed to it. Now, with the complexity of data sources, the demand for real-time intelligence, and the AI layer sitting on top of everything, he’s watching that threshold shift toward $2 billion, $3 billion, and higher.
Jerry Bradley, our Chief Product Officer, noted that we talk to banks well above $10 billion that have invested millions over years and still can’t get past ad hoc reporting. The hidden costs of building — the internal subject matter experts pulled off other priorities, the ongoing maintenance, the inevitable scope creep — have a way of compounding long after the initial budget conversation is over.
SVB is worth noting here, not for the politics of it, but for what it revealed about the operational gap between knowing your institution and assuming you do. SVB failed in less than a week, and the entire industry’s response was the same: we cannot wait a month for a report from finance to know how this bank is doing.
His bank had already moved to real-time balance monitoring before that moment. They set up event-driven alerts so relationship managers are notified when a client has a major balance drop or an unusual outbound wire, going from static daily reports to proactive, real-time intelligence. He built that. It took years, dedicated headcount, and a CTO with both the technical skill and the organizational authority to make it happen.
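The alerting logic itself is conceptually simple; what's hard is the plumbing behind it. As a minimal sketch, and with the caveat that the event shape, the 30% drop threshold, and the $250,000 wire threshold are all illustrative assumptions rather than his bank's actual rules, an event-driven alert check might look like this:

```python
from dataclasses import dataclass

@dataclass
class AccountEvent:
    """Illustrative event shape; a real feed would carry far more fields."""
    client: str
    kind: str            # e.g. "balance_update" or "outbound_wire"
    amount: float        # new balance, or wire amount
    prior_balance: float = 0.0

def alerts_for(event, drop_pct=0.30, wire_threshold=250_000.0):
    """Return the alert messages a relationship manager would be sent
    for a single event. Thresholds are hypothetical defaults."""
    alerts = []
    if event.kind == "balance_update" and event.prior_balance > 0:
        drop = (event.prior_balance - event.amount) / event.prior_balance
        if drop >= drop_pct:
            alerts.append(f"{event.client}: balance down {drop:.0%}")
    if event.kind == "outbound_wire" and event.amount >= wire_threshold:
        alerts.append(f"{event.client}: outbound wire of ${event.amount:,.0f}")
    return alerts
```

The point of the sketch is the gap it hides: these few lines only work once every core, wire, and deposit system is feeding clean, timely events into one place, which is the years-and-headcount part of the build.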
Most community banks aren’t starting from that position. And that asymmetry is the real heart of the build vs. buy question. The banks that most need the intelligence are often the ones least equipped to build it. A $1.5 billion bank typically can’t simultaneously staff a data engineering team and a business analytics team, but they need the same quality of insight as the $7.5 billion bank. In some ways they need it more, because they have less margin for error and fewer people to absorb the cost of a wrong decision.
He said something during our conversation that I’ve been trying to articulate about KlariVis for years. He put it better than I have: “You’re delivering data that happens to come with a data warehouse. A lot of vendors are delivering a data warehouse and then don’t really give guidance on what to actually do with it.”
That’s the distinction. Our team is made up of former bankers. Jerry ran lending operations before he ran product. I spent 25 years in banking, including as a CFO, working with our CORO Gill Hundley. We built this company because we lived the problem, and we knew the solution wasn’t going to come from data engineers who’d never sat across from a regulator or presented to a board.
He also made a point about banks trying to skip ahead. They hear about AI, they hear about natural language analytics, and they want to jump straight to the future without building a solid data foundation first. He’s right, and we tell banks the same thing. You can’t run AI on messy data. You can’t do natural language queries against a warehouse with ungoverned schemas. You can’t get consistent answers if every report writer is interpreting the same fields differently.
But the nuance he added is worth holding onto: you can’t leapfrog, but you can move faster. The way you move faster is to bring in a partner that has already solved the data problem for more than 150 banks and can compress your timeline from years to weeks.
That’s the real build vs. buy calculation. Not whether you can build it — at a certain size and with the right resources, some banks are absolutely equipped to. It’s whether you want to spend the next three to four years getting to a place that already exists, has been tested across 150 institutions, and can be running in your bank in 90 days. And perhaps more importantly, it’s worth asking what your institution won’t be doing while that build is underway — what strategic initiatives get deprioritized, what talent gets consumed, what competitive ground gets ceded in the meantime.
The person who spent four years building his own platform would tell you the same thing. He told me.