New AI Metering Forces MSPs to Rethink Contracts and Prove Control Over Usage

The episode reveals a structural shift in the technology landscape: artificial intelligence is becoming a new layer of managed consumption, with measurable impact on infrastructure, contract terms, and operational accountability. This shift is illustrated by leading technology platforms explicitly metering AI usage through compute tokens, storage footprints, and local model deployments. Companies such as Alphabet, Amazon, and Microsoft are integrating AI not only as features but as quantifiable workload layers, raising economic and governance questions about who controls consumption and who assumes the risk of overage or misuse.

The most consequential development discussed is the rapid, capital-intensive scaling of AI infrastructure by leading hyperscalers. Alphabet raised its 2026 capital expenditure guidance to as much as $190 billion; Amazon's AWS revenues rose 28% year-over-year to $37.6 billion, with quarterly capital expenditures reaching $44.2 billion. Both moves are directly tied to AI infrastructure investment. At the same time, endpoint and storage vendors such as Apple and Backblaze are experiencing elevated demand from AI workloads. On the software side, companies like Anthropic are explicitly raising API rate limits and deploying features to formalize the measurement and orchestration of AI-driven processes.

Supporting developments include the migration of management and control functions into enterprise platforms and endpoint environments. Microsoft Agent 365 is now generally available, offering admins centralized policy controls over AI agents across cloud and local machines, with integration into Intune for granular restriction and monitoring. Google's Chrome browser can automatically download a 4GB Gemini Nano model to support local AI functions, raising new operational considerations around storage, policy management, and user approval. These developments anchor the thesis that AI is no longer a passive toolset but a consumption and policy domain that requires active oversight.

Operationally, MSPs and IT service providers face heightened exposure to contract and governance risk. The presence of invisible AI consumption, in the form of storage expansion, token overages, unauthorized agent actions, or degraded endpoint performance, requires explicit clauses in client agreements and new monitoring capabilities. Providers unable to demonstrate control over AI usage, policy enforcement, and exception handling may inherit both support burdens and unresolved liability. The practical implication is clear: future margins and contract viability will increasingly depend on the ability to meter, document, and govern AI-related activities, rather than simply enabling client access.

00:00 AI Infrastructure Surge 

04:17 Control Layer Wins

06:41 MSP Liability Shift

10:50 Why Do We Care? 

Supported by: 

ScalePad 
CometBackup 
Moovila 

 

💼 All Our Sponsors

Support the vendors who support the show:

👉 https://businessof.tech/sponsors/

 

🚀 Join Business of Tech Plus

Get exclusive access to investigative reports, vendor analysis, leadership briefings, and more.

👉 https://businessof.tech/plus

 

🎧 Subscribe to the Business of Tech

Want the show on your favorite podcast app or prefer the written versions of each story?

📲 https://www.businessof.tech/subscribe

 

📰 Story Links & Sources

Looking for the links from today’s stories?

Every episode script — with full source links — is posted at:

🌐 https://www.businessof.tech

 

🎙 Want to Be a Guest?

Pitch your story or appear on Business of Tech: Daily 10-Minute IT Services Insights:

💬 https://www.podmatch.com/hostdetailpreview/businessoftech

 

🔗 Follow Business of Tech

 

LinkedIn: https://www.linkedin.com/company/28908079

YouTube: https://youtube.com/mspradio

Bluesky: https://bsky.app/profile/businessof.tech

Instagram: https://www.instagram.com/mspradio

TikTok: https://www.tiktok.com/@businessoftech

Facebook: https://www.facebook.com/mspradionews


Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

[00:00:02] AI is the next vendor lock-in layer because the meter is now compute, tokens, storage, and embedded local models. The provider who only enables the tool gets the ticket volume. The provider who measures consumption, limits exposure, and provides business output gets the control point. The economic question is becoming simple. Who owns the meter and who eats the overage?

[00:00:27] This is the Business of Tech. I'm Dave Sobel. We're seeing a specific pattern across the largest tech platforms. AI is not just a feature being added to products, it's becoming measurable demand in capital spending, cloud capacity, endpoint hardware, storage growth, token limits, and local model footprints. Let's start with infrastructure.

[00:00:52] CNBC reports Alphabet raised its 2026 capital expenditure guidance to as much as $190 billion, with 2027 spending expected to increase significantly. Amazon is showing the same build cycle. Quartz reports AWS revenue rose 28% year-over-year to $37.6 billion, its fastest growth in 15 quarters, while Amazon's quarterly capital expenditures reached $44.2 billion,

[00:01:19] explicitly tied to AI infrastructure investment. That tells us AI demand is not theoretical. The largest platforms are committing balance sheet-level spending to support it. But the signal is not only in hyperscale data centers. It's also moving closer to the client environment. TechCrunch reports Apple delivered $8.4 billion in Mac revenue in its March quarter, with executives pointing to demand from customers running AI workloads locally.

[00:01:48] And Blocks and Files reports Backblaze's B2 cloud storage business grew 24% year-over-year to $22.4 million, with AI-related bookings called out as a meaningful driver. So AI is pulling demand in two directions at once, upward into cloud infrastructure and outward into endpoints and storage. The constraints are becoming more visible, too.

[00:02:12] ITPro reports Anthropic expanded Claude Code usage limits after a compute partnership added more than 300 megawatts of capacity and over 222,000 NVIDIA GPUs, and raised API rate limits dramatically, including input token limits moving from 30,000 to 500,000 tokens per minute for some tiers. That is the meter becoming explicit. And the most everyday signal may be the most operationally important.

[00:02:41] The Register reports Chrome can automatically download a 4GB local model file, Gemini Nano, onto user machines unless settings or enterprise policy disables it. Put those pieces together and the signal is clear. AI is becoming a managed consumption layer. It shows up as cloud capex, endpoint demand, storage growth, token ceilings, and local files on machines your clients may not even realize are changing.

[00:03:09] In a backup market that keeps consolidating, MSPs are asking a smart question. Who's actually going to be there for the long haul? Comet Backup has stayed focused. Same flexibility, same MSP-first pricing, same commitment to letting you run your business your way. No surprises. If you want a backup platform built for how MSPs actually operate, go to cometbackup.com.

[00:03:39] This episode is brought to you by Control Map. Growing MSPs are using Control Map to build recurring revenue by expanding their GRC services. Starting now, Control Map is offering a free plan for MSPs looking to get started with providing compliance as a service. Create a free account and run an assessment. Track key items like policies, risks, and evidence in one place. It's a practical way to prove value to a client before deciding to expand your compliance offering.

[00:04:07] Try Control Map for free today. Visit scalepad.com slash Dave to get started. That's scalepad.com slash Dave. The mechanism is not simply that organizations want AI. It's that they want measurable output from AI before they have the operating model to control cost, quality, permissions, and handoffs behind that output.

[00:04:34] PYMNTS reports that 34% of CFOs at billion-dollar companies cite increased output as the main reason for adopting AI, with productivity taking priority over cool tools. But only 12% say they feel very prepared for the workforce changes that AI will bring. That gap matters.

[00:04:54] Leadership is asking AI to produce measurable gains, but the organization often has not defined who approves usage, who measures consumption, what quality standard counts as acceptable output, when a human must intervene, or who owns the cost when automation expands beyond the original plan. So, vendors are turning operational control into the product.

[00:05:18] VentureBeat reports Anthropic is rolling out Claude managed agents features such as dreaming, outcomes grading, and multi-agent orchestration. The important point is not the feature list. It's the packaging. Anthropic is trying to convert one-off AI use into repeatable work: tasks specified in advance, outputs graded against rubrics, and execution delegated across specialized agents. This is what happens when AI moves from experimentation to expected business output.

[00:05:46] The value shifts from access to the model toward the system that can define, meter, evaluate, and constrain the work. ServiceNow makes the same positioning explicit in its own newsroom release. Enterprises have AI chaos, and the answer is a governed platform for autonomous work. In other words, the control layer becomes the sellable layer. Put those pieces together and the mechanism is straightforward.

[00:06:14] As AI becomes expected infrastructure, the bottleneck moves from can we use it to can we govern it. The provider that supplies the control layer owns the meter. The provider that does not is left supporting usage, cost, and behavior it cannot see. If you're listening to this and you haven't hit follow yet on Apple Podcasts, search Business of Tech. It takes five seconds and you'll get tomorrow's show automatically.

[00:06:42] Once automation moves into the everyday tools your clients already run, the MSP problem changes. It's no longer just can we support the application, it becomes can we govern what the automation is allowed to do and can we prove we governed it? Microsoft is already making that shift visible.

[00:07:01] Thurrott reports that Microsoft Agent 365 is now generally available as an agent management layer designed to let admins discover, monitor, and control both cloud-based and local AI agents. The important detail is the control model. Microsoft is tying agent oversight into Intune policy, including the ability to block or restrict classes of agents and sync in agents from other ecosystems.

[00:07:26] That tells you where the industry is heading. Agents are becoming a managed estate. They are not just features inside applications. They are autonomous behaviors that have to be inventoried, permissioned, monitored, and constrained. The same way MSPs already manage identities, endpoints, configurations, and access policies. Google shows the same shift at the endpoint level.

[00:07:50] Thurrott reports Google clarified that Chrome has been downloading on-device Gemini Nano models, consuming up to 4GB of local storage, to power AI features such as summarization, rewriting, tab organization, and certain security checks. Again, the question is not whether the feature is useful. The operational question is whether the MSP knows it's there, whether policy allows it, whether the client approved it, and whether the configuration can be verified.

[00:08:17] That is the consequence for MSPs. AI issues will not arrive as clean AI tickets. They will arrive as storage growth, slow endpoints, unexpected usage, unauthorized workflow behavior, data handling concerns, bad automated output, or a user asking why a tool behaved differently than expected. The exposure chain is direct.

[00:08:40] If an AI feature consumes tokens, compute, storage, or endpoint resources without a defined approval path, the client experiences it as cost, performance impact, or uncontrolled data handling. If an agent acts outside its intended scope, the failure is not just AI made a mistake. It's a permissions failure, a workflow control failure, or an audit failure.

[00:09:04] And if the MSP contract does not say who approves AI consumption, who sets caps, who owns policy exceptions, and who pays for overages, the MSP gets the escalation before anyone has a clean legal answer. That's where the meter becomes the business model.

[00:09:22] The MSP that can show last month's AI consumption, identify which users and tools drove it, prove which agents were allowed to act, document which local models were deployed, and tie those controls back to contract language, can price AI as governed infrastructure. The MSP that cannot measure those things is left supporting invisible consumption inside agreements priced for seats, tickets, and devices.

[00:09:50] One of the hardest problems in managed services isn't technology. It's delivering projects predictably and profitably. Every MSP has lived this moment. You estimate a project at 40 hours, and it ends up taking much longer. Not because your team isn't capable, but because projects have dependencies, shared engineers, shifting priorities, and timelines that change constantly. That's where Moovila comes in.

[00:10:14] Moovila uses automation and AI-driven scheduling to build accurate project timelines and continuously adjust them as conditions change. That means you know with certainty when a project will actually finish, when engineers will become available, and when you can safely take on new work. For MSPs trying to run a more mature, predictable operation, that kind of visibility is a big deal. If you want to deliver projects without the constant overruns, visit moovila.com slash MSP radio.

[00:10:44] That's M-O-O-V-I-L-A dot com slash MSP radio to learn more. Why do we care? Platform vendors are establishing default control architectures inside managed client environments right now. Not as a roadmap item, but as shipping behavior. And the billing meter has already shifted from seats to compute, tokens, and local model footprints.

[00:11:09] MSPs without a governance layer between that infrastructure and the client are holding liability proximity to a question that has no legal answer yet, on contracts priced for a model that no longer exists. So what to consider? Pull current contracts and identify any that lack AI consumption carve-outs, token overages, storage growth driven by AI workloads, and compute spikes not covered by flat fee language written before 2024.

[00:11:40] Build a consumption monitoring capability, even a basic one, before positioning it as a service. The credibility gap between we can govern AI spend and here's your actual token consumption last month is the difference between a retained client and a churned one. And don't sell AI enablement. The hyperscalers have that covered at a price point no MSP can match.

[00:12:05] The defensible position is AI accountability, policy, audit reporting, and liability proximity management. Frame it so that client conversations happen now, before a competitor does. If this trend continues, by 2027 the margin difference in managed AI services will not come primarily from resale discounts. It will come from whether the MSP can prove control over consumption and exposure.

[00:12:33] A provider that can report token use, storage growth, local model deployment, agent permissions, policy exceptions, and workflow output will have the basis for carve-outs, usage-based pricing, and governance fees. A provider that cannot will absorb AI-related tickets and the disputes with them inside flat fee contracts built for a pre-AI operating model. This is the Business of Tech.

[00:13:02] Want more from the Business of Tech? Join Business of Tech Plus for ad-free episodes, early interviews, extended cuts, subscriber-only shows, and exclusive member perks and analysis. Sign up at businessof.tech slash plus. And follow this show in your podcast app. And if you're on YouTube, hit subscribe and the bell so you never miss a story. Reviews and comments help spread the word too. Interested in advertising?

[00:13:30] Head to mspradio.com slash engage. The Business of Tech is written and produced by me, Dave Sobel, under ethics guidelines posted at businessof.tech. Thanks for listening. I'll see you on the next episode. Part of the MSP Radio Network.