By Kerry Robinson
Last week Mark Zuckerberg let slip a mind-bender on Dwarkesh Patel’s podcast:
“If the AI can handle 90 percent of a WhatsApp support call, we’ll probably hire more customer-service people, not fewer.”
He was riffing on the idea that – right now – Meta could never afford to build a customer service center.
He guessed it would cost $20 billion!
Not feasible.
But WhatsApp’s sister organization, Meta AI, is building and open-sourcing high-performing models. Their latest ‘Llama 4’ release includes smart, fast, and cheap models. Zuck claimed:
“Scout and Maverick [offer] some of the highest intelligence-per-cost you can get … designed to run on one host, very low latency … we build what we need, then open-source it.”
Zuck has talked about voice AI a lot in his recent interviews, and the new Meta AI app showcases the latest Llama real-time voice AI.
He sees it as the natural way to interact with AI – whether via smart Ray-Ban glasses, or just talking about the content of your feed while doom-scrolling Facebook or Instagram, just as you might discuss a post with a friend or family member.
But if Meta AI succeeds in building a smart, fast, cheap voice AI model, it also opens up the possibility of customer services for WhatsApp and other Meta properties.
And for their customers.
That’s pretty much every B2C business, globally!
Zuck specifically noted that they’ll probably just bundle the customer services capability into the WhatsApp platform – he wants to make WhatsApp the Western world’s answer to WeChat in China, and to the way WhatsApp itself is already used in places like Thailand.
But he recognizes that the cost of customer services is too high for most businesses to offer it via WhatsApp.
Right now it can cost upwards of a dollar a minute to provide human-agent support, and north of 20c per minute to deploy top-notch voice AI in customer service scenarios.
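To make that gap concrete, here’s a back-of-the-envelope sketch in Python. The per-minute rates come from the figures above; the call volume and handle time are purely illustrative assumptions, so swap in your own numbers.

```python
# Back-of-the-envelope support-cost comparison.
# Per-minute rates are from the article; volume and handle time are
# illustrative assumptions, not real data.

MINUTES_PER_CALL = 6          # assumed average handle time
CALLS_PER_MONTH = 100_000     # assumed monthly call volume

HUMAN_COST_PER_MIN = 1.00     # "upwards of a dollar a minute"
VOICE_AI_COST_PER_MIN = 0.20  # "north of 20c per minute"

def monthly_cost(cost_per_min: float) -> float:
    """Total monthly cost at the assumed volume and handle time."""
    return cost_per_min * MINUTES_PER_CALL * CALLS_PER_MONTH

print(f"Human agents:     ${monthly_cost(HUMAN_COST_PER_MIN):,.0f}/month")     # $600,000
print(f"Top-end voice AI: ${monthly_cost(VOICE_AI_COST_PER_MIN):,.0f}/month")  # $120,000
print("Bundled WhatsApp voice AI: $0/month")
```

The absolute numbers don’t matter; the order-of-magnitude gap between the three lines is the point.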
Bundling a smart, fast, cheap voice AI solution inside of WhatsApp solves that issue.
Zuck literally wants to give away voice AI. Bundle it as an on-ramp for businesses to use WhatsApp for customer services.
So is that it? Are we done?
No more contact centers? No more contact center agents?
Nope, not at all.
Because once Tier-0 is an always-on robot, thousands of brands that never offered phone support can suddenly afford Tier-1 humans for the messy 10 percent.
Just like Zuck plans for WhatsApp.
It’s the age-old paradox: commoditize the core, and demand explodes at the edge.
Commodities crush price — and simultaneously expand the market.
In the past I’ve talked about Generative AI as ‘Electricity for knowledge work’, but my colleague Michael Fisher (affectionately known as ‘Fish’) has taken this analogy a whole lot further in his Substack article: The Possible, the Practical, and the Profitable.
Fish reminds us that:
“Individual components, no matter how impressive in isolation, provide little value without successful integration into a complete system.”
Voice AI is cool. And at near-zero cost, hard to ignore. This is an example of what’s ‘Possible’.
But is it Practical?
Profitable?
Fish argues every tech wave passes three gates:
1 – Possible – magic-show demos: e.g. a perfect AI call in the lab, like we showed at the launch of our vision for the AI First Contact Center.
2 – Practical – first real use cases: e.g. an outbound voice AI payment-collections bot, like we delivered with our conversational AI platform partner, Parloa (read the case study here).
3 – Profitable – systemic scale: now we’re talking bot + telephony + contact center + CRM + SLA + humans. That’s what we do, every single day!
Guess where most of those shiny voice agents currently sit?
Somewhere between 1.5 and 2.25.
Fish’s call-out is that real money accrues when you tackle the last mile of AI deployments: addressing API failures, compliance issues, latency gremlins, and the long list of embarrassing things that generative AI can do when you’re not looking.
Edison needed transformers, sockets, and safety codes. You’ll need SIP trunking, observability, a brand voice guide the bot can actually read, and a plethora of services it can leverage to get stuff done for your customers.
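To make “last mile” less abstract, here’s a minimal, hypothetical sketch of the kind of plumbing involved: a latency budget, a retry, and a human fallback wrapped around a single bot turn. The generate_reply function is a placeholder for whichever model or platform you actually use, and the thresholds are assumptions, not recommendations.

```python
import time

LATENCY_BUDGET_S = 1.5   # assumed: callers notice pauses much beyond ~1.5 seconds
MAX_ATTEMPTS = 2         # assumed: retry once on failure, then hand off

def generate_reply(utterance: str) -> str:
    """Placeholder for the real model/platform call (Llama, Parloa, etc.)."""
    raise NotImplementedError

def handle_turn(utterance: str) -> tuple[str, bool]:
    """Return (reply, escalate_to_human) for one caller utterance."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        start = time.monotonic()
        try:
            reply = generate_reply(utterance)
        except Exception:
            continue  # API failure: retry, then fall through to a human
        elapsed = time.monotonic() - start
        if elapsed > LATENCY_BUDGET_S:
            # Latency gremlin: log it so someone actually sees it (observability)
            print(f"Slow turn ({elapsed:.2f}s) on attempt {attempt}")
        return reply, False
    # The messy 10 percent: hand the caller to a Tier-1 human
    return "Let me connect you with a colleague who can help.", True
```

Real deployments layer observability, compliance checks, and brand-voice guardrails on top of this – which is exactly where Fish argues the money is.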
The article is worth 10 minutes of your time and a caffeine top-up: check it out.
Fish has some other thought-provoking takes that are worth signing up to his Substack for, so why not click ‘Subscribe’ while you’re there?
Three takeaways for CX Leaders:
Imagine a world where AI is free: Because inside WhatsApp (and ultimately many platforms) it will be. Shift focus and spend to orchestrating the AI and feeding it with the content and context it needs to help your customers.
Who can you serve when AI does 90% of the serving: Like Zuck is contemplating, could you open up whole new service channels and target customers who were previously unprofitable to serve?
Treat integration talent as your differentiator: The model weights are open-sourced; the plumbers who connect them to your IVR are not. Recruit staff and partners accordingly.
Kerry
PS: If you want a more regular dose of insights, follow or connect with me on LinkedIn for regular posts on conversational AI, mindset, and egg juggling, among other things!
PPS: You are building with GenAI right now, aren’t you? If not, what’s stopping you? Check out our blog on Gen-AI blockers, or sign up for a complimentary Strategy Workshop to help you get started.
If someone forwarded this to you, please subscribe yourself for weekly insights that’ll make you think differently about your IVR, voice, and chatbots.
Helping you get maximum ROI from conversational AI — whatever the platform.