Fintech Nexus spoke with Pat Utz, Co-Founder and CEO of policy intelligence firm Abstract, about how businesses are navigating the new regulatory normal.
With the passage of the “One Big Beautiful Bill Act,” business leaders have added another task to their plate: understanding the impact on their bottom line. For many, it’s a constant stressor in a system that shifts every few years and requires tremendous diligence to manage. With more than two million regulatory and bill changes across the local, state, and federal levels each year, businesses confront dynamic compliance frameworks that extend beyond the scope of their regulatory radar.
It’s these regulatory grey areas that NYC-based Abstract aims to capture and synthesize through its automated policy intelligence and workflow platform. In an interview with Fintech Nexus, Pat Utz, Abstract’s Co-Founder and CEO, elaborates on current regulatory uncertainties, outlines how enterprises and law firms are using automated regtech to chart strategic courses, and reads the Trumpian regulatory tea leaves.
The following has been edited for length and clarity.
Where does the utility lie in something like Abstract?
We use AI to identify what’s a relevant shift in regulation or legislation that may impact a company, or, in the case of a law firm, may impact their clients. The key here is using the reasoning capability of AI to essentially reason through these dense documents and updates and then calculate how they may present risks or opportunities to these companies.
So for the financial industry, this can take on many forms. One of the more traditional forms is in the bucket of compliance, such as state-level consumer data storage requirements. If you’re out of compliance, that leads to fines, which is the risk. On the flip side of risk and compliance, there’s opportunity for product development: That involves understanding the existing rules and laws to figure out how a company may amend its product roadmap so that it can actually take advantage of how the laws are currently set up. That may be for tax reasons or for an actual feature within the product to align more closely with the existing law. I would say most of our clients use us to proactively identify potential risks across changes in government policy.
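[Editors’ note: For illustration only, here is a minimal sketch of how an LLM might be prompted to reason about whether one policy update is a risk or an opportunity for a given company. The function, prompt wording, and `llm` callable are hypothetical assumptions, not Abstract’s actual prompts or code.]

```python
# Hypothetical sketch: classify one regulatory update against a company profile.
# Everything here (names, prompt, llm callable) is illustrative, not Abstract's code.
def assess_update(update_text: str, company_profile: str, llm) -> str:
    """Ask an LLM to label an update as RISK, OPPORTUNITY, or NOT RELEVANT."""
    prompt = (
        "You are a policy analyst. Given the company profile and the regulatory "
        "update below, classify the update as RISK, OPPORTUNITY, or NOT RELEVANT "
        "for this company, and explain your reasoning in two sentences.\n\n"
        f"Company profile:\n{company_profile}\n\n"
        f"Regulatory update:\n{update_text}\n"
    )
    # `llm` is any callable that takes a prompt string and returns generated text.
    return llm(prompt)
```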
Why work with enterprises and law firms specifically?
We were previously working closely with contract lobbying firms, chambers of commerce, and nonprofits. We grew that business to a couple hundred entities, and it was through two or three years of running that business that we started talking to the corporate side and the big law firms. We went through a few hundred phone screens, just going through pain points and seeing how we could take that legacy product we had and transform it into what we have now.
It partially came from a natural shift in the market. ChatGPT came out on GPT-3.5 and we said, Hey, we can actually take this and really iterate on it. We wish we’d had that in 2019, but the models we had then were way simpler.
There have been several significant changes to regulatory regimes on the financial front over the past six months. To what extent have you seen the way fintech operators use something like Abstract change during that time?
Two main things come to mind here. One, deregulation at the federal level, or an unwillingness to set any precedent at the federal level, has created an uptick in the volume of regulatory activity at the state level. So there’s just a lot more to process now, hence the need for tooling to make sense of it all. And we can see this across various states, like California, New York, Colorado, et cetera.
Second, something I’ve been hearing on the labor side, from working closely with people in policy, labor, and employment, is that there’s a shift towards more of a populist agenda within the current administration. It manifests in prioritizing the individual, which is partly why we’re seeing a lot more of these fines and less friendly activity towards the big companies. So I think that’s something worth noting, such as what we’ve been seeing with the CFPB recently.
Could you clarify that second bit: about the Trump administration being less friendly toward big companies? What examples come to mind there?
I mean, recently, there was a $175 million fine towards Block because they weren’t doing enough on fraud protection. [Editors’ note: The CFPB levied this fine during the Biden Administration and before the CFPB’s dismantling/atrophying under Russ Vought and the Department of Government Efficiency.]
And in general, on the labor side, there’s been a shift towards supporting labor unions more and the employer less. This goes sort of against what traditional policy experts thought this administration would do. We’ll see if it continues to play out, but I think that’s worth noting.
One of the conclusions of your recent Open Banker op-ed centered around creating evergreen compliance frameworks. What does that mean mechanically — does that mean building for the most conservative regulatory regime that’s out there and anticipating that that’s the ceiling?
Precisely. We can make a safe assumption that anything set up by California or New York is usually going to be the most rigorous framework out there. The general recommendation is to map out your framework to meet that and then you should be good for the rest of the states. Of course, there will be nuances, but this leads into my other viewpoint, which is, for smaller companies, I think it’s going to be really hard for these government bodies to enforce these measures, not only for consumer data and privacy, but also just AI in general. The bigger companies, of course, with larger cash flows and balance sheets, are going to have a lot more of a microscope on them. But smaller companies are going to continue to fly under the radar.
However, something like SOC 2, I think, is going to come into play. It doesn’t matter what stage of company you’re at: if you want to sell enterprise-level software, the standard is that you have to get SOC 2 compliance. Now that usually costs $20,000 to $50,000 just to get started. So I can envision a flavor of SOC 2 for these sorts of compliance measures, where, adding to the SOC 2, you have your AI sanitization, or whatever you want to add to it. I think that’s going to be inevitable in the next couple of years, and I think that’s how the smaller companies are going to be regulated.
Where do you see the greatest uncertainty for fintech operators?
For a lot of fintechs, LLMs are going to be involved in some way, shape, or form, whether it’s an LLM or some flavor of a transformer model: these deeper reasoning models where most of the industry is heading for processing data. I think the greatest uncertainty still lies in how you can process that data and how you can use it to fine-tune a model or create a separate knowledge graph for the model to reference. But again, going back to my prior point, looking at what a state like California does, I think if you adhere to that framework, you’ll be good for the rest.
You mentioned the reasoning capabilities of Abstract’s LLM when it comes to processing millions of regulations in the US. How did you tune Abstract to translate the variables of a legal document into something that is qualitatively useful for a customer?
So we’re applying AI to the legal and public policy space, meaning we take models like GPT-4o and Llama and whatnot, and we’ve made our own flavor of them that we can update quickly. These foundational models are getting better as we speak, so you don’t want to create your entire model in-house, because you’re not going to innovate as quickly as OpenAI does.
Now, in order to make the models really adept at your specific craft, that’s where the data and the context come into play. In our case, a lot of the data revolves around bills, regulations, and historical records. We’ve spent over six years scraping and structuring all this messy data from over 140,000 governments in the US. So it’s a big data-scraping and structuring problem that you have to solve for, and then, once you have that data, you’re able to start using it to iterate on and perfect what that model is tasked with.
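[Editors’ note: As a rough illustration of the structuring step Utz describes, here is a minimal sketch that maps one messy scraped record into a shared schema. The schema and field names are hypothetical assumptions, not Abstract’s actual data model.]

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class BillRecord:
    """Hypothetical normalized schema for one scraped legislative item."""
    jurisdiction: str               # e.g. "CA", "NYC", "US-Federal"
    bill_id: str                    # identifier as published by the source body
    title: str
    status: str                     # e.g. "introduced", "in committee", "enacted"
    last_action_date: Optional[date]
    full_text: str                  # cleaned body text used for downstream analysis


def normalize(raw: dict) -> BillRecord:
    """Map one messy scraped payload into the shared schema.

    The keys on `raw` are assumptions for illustration; in practice every
    source government publishes its own format, which is the hard part.
    """
    return BillRecord(
        jurisdiction=raw.get("state", raw.get("locality", "unknown")).strip(),
        bill_id=str(raw["number"]).upper(),
        title=" ".join(raw.get("title", "").split()),
        status=raw.get("status", "unknown").lower(),
        last_action_date=date.fromisoformat(raw["last_action"]) if raw.get("last_action") else None,
        full_text=raw.get("text", ""),
    )
```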
How do you sustain a live stream of new regulations on the platform?
We are checking twice a day across any of the government entities that our clients pay for, and this is all done with agents and automation. No need for manual processes anymore.
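[Editors’ note: For readers curious what twice-daily automated checking can look like mechanically, here is a minimal, hypothetical polling sketch. The source URLs, interval, and placeholder fetch function are illustrative assumptions, not Abstract’s agents.]

```python
import time
from datetime import datetime, timezone

# Hypothetical monitored sources; in practice each client's subscribed
# government bodies would come from configuration.
SOURCES = [
    "https://example-legislature.gov/api/bills",
    "https://example-agency.gov/rules.rss",
]

CHECK_INTERVAL_SECONDS = 12 * 60 * 60  # twice a day


def fetch_updates(source_url: str) -> list[dict]:
    """Placeholder: pull new or changed items from one source.

    A real implementation would handle pagination, caching headers, and
    per-source formats (HTML, RSS, JSON APIs).
    """
    return []


def run_forever() -> None:
    """Check every source on a fixed schedule and hand items downstream."""
    while True:
        started = datetime.now(timezone.utc)
        for url in SOURCES:
            for item in fetch_updates(url):
                # Downstream steps would structure and analyze each item.
                print(f"[{started.isoformat()}] new item from {url}: {item.get('id')}")
        time.sleep(CHECK_INTERVAL_SECONDS)


if __name__ == "__main__":
    run_forever()
```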
Was it initially a big manual effort?
There was a lot of manual checking and setup in the first couple of years.
You work in a high-stakes, sensitive field. How do you prevent hallucination and mitigate the risk of using your tool?
Long story short, we use methods like RAG, retrieval-augmented generation, to ensure that whatever the model outputs is tied to some sort of document that we’ve structured. So, if you look at how Abstract is architected, it’s always pulling from something that we’ve checked already. And then there are manual checks that we perform as well.
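[Editors’ note: Below is a minimal retrieval-augmented generation (RAG) sketch showing the general idea of tying a model’s output to retrieved source documents. The toy keyword retriever and `llm` callable are illustrative assumptions, not Abstract’s architecture.]

```python
from dataclasses import dataclass


@dataclass
class Passage:
    doc_id: str   # e.g. a structured bill or regulation identifier
    text: str


def retrieve(query: str, corpus: list[Passage], k: int = 3) -> list[Passage]:
    """Toy keyword-overlap retriever; production systems typically use vector search."""
    q_terms = set(query.lower().split())
    scored = sorted(corpus, key=lambda p: -len(q_terms & set(p.text.lower().split())))
    return scored[:k]


def answer(query: str, corpus: list[Passage], llm) -> str:
    """Build a prompt that forces the model to ground its claims in retrieved passages.

    `llm` is any callable that takes a prompt string and returns text; which
    model sits behind it is an implementation detail.
    """
    passages = retrieve(query, corpus)
    context = "\n\n".join(f"[{p.doc_id}] {p.text}" for p in passages)
    prompt = (
        "Answer the question using ONLY the sources below. Cite the [doc_id] "
        "for every claim; if the sources don't cover it, say so.\n\n"
        f"Sources:\n{context}\n\nQuestion: {query}"
    )
    return llm(prompt)
```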
What’s the vision for Abstract from here?
Right now, and at least for the next couple of years, we’re really focused on being the go-to platform to distill and abstract any updates across regulatory bodies, government bodies, and any of the news and social media around them. Really having a pulse on what’s going on.
We’re currently in the US, but thereafter the plan is to expand internationally. There’s a lot going on in the EU, LatAm… it’s a big world. There’s a growing focus there because the EU, for example, now has standards on AI that we don’t, so it’s important for companies to understand that. The vision for Abstract is to use AI to make sense of what’s hard to understand. Case law is really hard to understand; general trends are too. So we want to be that translation layer for any external noise that may impact the company. We want to be able to abstract and offer guidance for the heads of each business unit. So that’s the end vision. But, for now, we’re very much focused on legal, public policy, and compliance.