1. Brex’s AI Challenge: Outdated Procurement vs Fast Innovation
Brex, like many companies, initially followed a traditional, months-long procurement process for adopting new tools. But as AI began evolving rapidly, that process broke down: by the time legal checks and evaluations were complete, teams had lost interest and newer tools had already reached the market. This realization led Brex to rethink how it approached AI adoption.
2. The Turning Point: Realizing the System Was Too Slow
At the HumanX AI conference, Brex CTO James Reggio admitted that their internal procurement controls were too slow for AI innovation. Teams requesting new tools would abandon them by the time approvals were complete. This bottleneck was hurting the company’s ability to experiment and innovate at the speed required by AI’s fast development cycle.
3. New Framework for Faster AI Tool Vetting
Brex introduced a new internal framework to simplify legal validations and data agreements, making it quicker to onboard and test AI tools. This allowed tools to be tried much sooner, ensuring engineers could engage with them while interest and relevance were still high, greatly improving the company’s agility.
4. Superhuman Product-Market-Fit Test
To determine whether an AI tool was worth adopting company-wide, Brex used what it calls a “superhuman product-market-fit test.” This method relies on employee feedback and real-world use cases to identify tools delivering extraordinary value. If the tool enhanced productivity significantly, it moved past the pilot stage into broader deployment.
5. Empowering Engineers with Monthly Budgets
Brex decentralized software selection by giving engineers a $50 monthly budget to spend on a list of pre-approved tools. This gave individuals the freedom to experiment with AI tools without waiting for top-down approvals. It promoted faster iteration, reduced friction, and ensured tools were selected based on practical usage rather than executive mandate.
6. Usage-Based Licensing Decisions
Brex used actual usage data to make licensing decisions. If only a couple of engineers used a tool, individual licenses sufficed. But if dozens adopted it and reported measurable improvements, the company scaled up. This strategy helped avoid overpaying for enterprise licenses with limited utility and ensured investments matched actual needs.
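The decision rule described above can be sketched as a small function. This is purely illustrative: Brex has not published any such code, and the threshold, field names, and tier labels below are hypothetical assumptions.

```python
# Illustrative sketch of a usage-based licensing rule. All names and
# thresholds are hypothetical; Brex has not published this logic.
from dataclasses import dataclass

@dataclass
class ToolUsage:
    name: str
    active_users: int           # engineers using the tool this month
    reported_improvement: bool  # did users report measurable gains?

def licensing_decision(usage: ToolUsage, enterprise_threshold: int = 20) -> str:
    """Map observed usage to a licensing tier: a handful of users keep
    individual seats; broad adoption with measurable improvement
    justifies scaling up to an enterprise license."""
    if usage.active_users >= enterprise_threshold and usage.reported_improvement:
        return "negotiate enterprise license"
    if usage.active_users > 0:
        return "keep individual licenses"
    return "drop the tool"

print(licensing_decision(ToolUsage("CodeAssist", active_users=35,
                                   reported_improvement=True)))
# prints: negotiate enterprise license
```

The point of the sketch is that the inputs are observed usage data, not upfront forecasts, which is what keeps the company from overpaying for enterprise seats nobody uses.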
7. Embracing the “Messiness” of AI Innovation
Reggio emphasized that AI adoption isn’t a clean, linear process. Mistakes will happen, and that’s acceptable; trying to perfect every decision before acting can be more damaging than the occasional misstep. Brex embraced experimentation even though it later had to unwind roughly five to ten larger deployments of tools that didn’t pan out. The goal was continuous learning and fast adaptation.
8. Shorter Lifecycles Demand Faster Action
AI technology moves fast—often evolving within months. Brex recognized that taking six to nine months for evaluation meant falling behind. Their solution was “stepwise fast iteration”—small-scale testing over a week or two, gathering feedback, and making quick decisions. This kept them agile and able to stay on top of evolving tools.
9. Practical Example: Big Bank vs Brex Approach
Imagine a large bank taking nine months to approve a fraud detection AI tool—it risks that tool becoming outdated before launch. Brex, by contrast, can test and assess a tool in three weeks. This agility lets Brex stay relevant and make informed, timely decisions where larger enterprises often stall.
10. Key Lessons from Brex’s AI Strategy
Brex’s approach offers key takeaways: speed up procurement, give teams autonomy, focus on real usage data, and accept failure as part of innovation. Their strategy suggests that in the AI era, it’s better to act fast and adjust than to get stuck in analysis paralysis. This mindset is crucial for staying ahead in tech-driven markets.