Inter & Co, which serves more than 20 million clients, brought its funding costs down to 65.6% of CDI (Brazil's interbank benchmark rate), setting an industry benchmark among Brazilian financial institutions. The digital bank's AI-driven credit assessment lets it grow faster than competitors while keeping its cost of capital low.
Nu Holdings deployed its nuFormer AI model to production in 2025. The system helped the bank hold risk-adjusted net interest margins steady quarter-over-quarter, excluding FGTS regulatory headwinds, according to CFO Guilherme Lago's Q4 2025 earnings call.
The performance gap between AI-native and traditional banks centers on credit risk prediction accuracy. Machine learning models process borrower behavior patterns across millions of transactions, identifying default probability signals that rules-based systems miss. This allows digital banks to price risk more precisely and extend credit to segments traditional banks reject.
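To make the mechanism concrete, here is a minimal sketch of behavioral default scoring and risk-based pricing. The features, weights, and thresholds are invented for illustration; production systems at these banks are far larger models (Nubank's nuFormer is a transformer) trained on millions of accounts, not a three-feature logistic score.

```python
import math

# Hypothetical behavioral features and weights -- illustrative only, not any
# bank's actual model. Signs encode the intuition: missed payments and high
# utilization raise risk; active transacting lowers it.
WEIGHTS = {
    "missed_payments_12m": 0.9,
    "txn_volume_monthly_k": -0.15,
    "utilization_pct": 0.02,
}
BIAS = -3.0

def default_probability(borrower: dict) -> float:
    """Logistic score: map behavioral signals to a probability of default."""
    z = BIAS + sum(WEIGHTS[k] * borrower[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def price_spread(pd_est: float, lgd: float = 0.6, margin: float = 0.02) -> float:
    """Risk-based spread over the funding rate: expected loss (PD x LGD)
    plus a fixed margin -- finer PD estimates mean finer pricing."""
    return pd_est * lgd + margin

active = {"missed_payments_12m": 0, "txn_volume_monthly_k": 8.0, "utilization_pct": 30}
risky  = {"missed_payments_12m": 3, "txn_volume_monthly_k": 1.0, "utilization_pct": 90}

for name, borrower in (("active", active), ("risky", risky)):
    pd_est = default_probability(borrower)
    print(f"{name}: PD={pd_est:.3f}, spread={price_spread(pd_est):.3f}")
```

The point of the sketch is the pricing step: a rules-based system that can only bucket borrowers coarsely must charge both profiles similar rates or reject the risky one outright, while a model that separates them can serve both at spreads matched to expected loss.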
Inter's newer client cohorts begin transacting sooner and transact more frequently than earlier cohorts, generating richer data for model training. This feedback loop improves underwriting decisions as the customer base scales, reinforcing the cost advantage.
Funding costs at 65.6% of CDI sit 34.4 percentage points below the 100% of CDI that traditional banks typically pay on deposits. Applied across Inter's deposit base, that spread translates into substantial annual interest savings that can fund growth or widen margins.
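The arithmetic behind that spread is straightforward. The CDI rate and deposit base below are round hypothetical figures, not Inter's reported numbers; only the 65.6%-of-CDI ratio comes from the article.

```python
# Illustrative funding-cost arithmetic. cdi_rate and deposits are
# assumptions chosen for round numbers, not reported figures.
cdi_rate = 0.12      # assume CDI at 12% per year
deposits = 50e9      # assume BRL 50 billion in deposits

cost_at_full_cdi = deposits * cdi_rate * 1.000   # traditional bank: 100% of CDI
cost_at_inter    = deposits * cdi_rate * 0.656   # Inter: 65.6% of CDI
annual_savings   = cost_at_full_cdi - cost_at_inter

print(f"Funding cost at 100% of CDI:  BRL {cost_at_full_cdi / 1e9:.2f}bn/yr")
print(f"Funding cost at 65.6% of CDI: BRL {cost_at_inter / 1e9:.2f}bn/yr")
print(f"Annual savings:               BRL {annual_savings / 1e9:.2f}bn/yr")
```

Under these assumed inputs the 34.4-point spread is worth roughly BRL 2bn a year; the actual figure scales linearly with the deposit base and the prevailing CDI rate.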
Brazilian digital banks face a critical test period in 2026-2027 as rising interest rates stress credit portfolios. Banks with deployed ML models should show lower non-performing loan rates if their risk predictions prove accurate under changing economic conditions. The efficiency ratio and risk-adjusted return comparisons will reveal whether AI credit models deliver durable advantages or optimize only for recent low-rate environments.
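The comparison the article anticipates can be sketched with two metrics: risk-adjusted net interest margin (NIM minus credit losses) and the efficiency ratio (operating expenses over revenue). All figures below are invented to illustrate the stress scenario; they are not reported numbers for any bank.

```python
# Hypothetical stress-test comparison -- all inputs are illustrative.
def risk_adjusted_nim(nim: float, credit_loss_rate: float) -> float:
    """Net interest margin net of credit losses."""
    return nim - credit_loss_rate

def efficiency_ratio(opex: float, revenue: float) -> float:
    """Operating expenses as a share of revenue (lower is better)."""
    return opex / revenue

# Assumed scenario: under a rate shock, the traditional bank's loss rate
# rises more because rules-based underwriting misprices marginal borrowers.
ai_native   = {"nim": 0.095, "loss_base": 0.030, "loss_shock": 0.038,
               "opex": 1.2, "revenue": 4.0}
traditional = {"nim": 0.080, "loss_base": 0.032, "loss_shock": 0.052,
               "opex": 2.4, "revenue": 4.8}

for name, bank in (("AI-native", ai_native), ("traditional", traditional)):
    base  = risk_adjusted_nim(bank["nim"], bank["loss_base"])
    shock = risk_adjusted_nim(bank["nim"], bank["loss_shock"])
    eff   = efficiency_ratio(bank["opex"], bank["revenue"])
    print(f"{name}: RA-NIM {base:.3f} -> {shock:.3f}, efficiency {eff:.0%}")
```

If the AI models hold up, the shocked risk-adjusted NIM gap should widen as sketched here; if they were merely overfit to the low-rate period, the two loss rates converge and the advantage evaporates.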
Traditional banks can access similar AI technologies but face legacy system integration challenges and organizational resistance to algorithmic decision-making. The question is whether first-mover advantages in model training data create a persistent competitive moat or if technology diffusion will compress the performance gap.

