I’m just taking a moment to digest all the news from last week. And honestly, if you’re building anything with AI right now, you should too — because the landscape shifted pretty significantly.
The Big Deal: Function Calling
With the addition of function calling in the OpenAI API — and similar functionality now available through Hugging Face — we’ve crossed a threshold that I don’t think most people have fully processed yet.
Here’s what it means in plain terms: anyone who can code now has a way to have two LLMs cooperate on accomplishing a task.
Not just generate text. Not just answer questions. Actually cooperate — like two colleagues working through a problem together.
What Two AIs Working Together Actually Looks Like
Think about what happens when you pair two powerful models and let them interact. They can proofread each other’s work. They can ideate together to improve their text or coding outputs. One can draft, the other can critique, and they can iterate — without a human sitting there babysitting every step.
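The draft-critique-iterate pattern above is easy to sketch in code. This is a minimal, illustrative loop, not any official API: `draft_model` and `critic_model` are stand-ins for real LLM calls, and the toy functions at the bottom exist only so the loop runs end to end.

```python
def refine(task, draft_model, critic_model, rounds=2):
    """One model drafts, the other critiques, and they iterate.

    draft_model(prompt) -> draft text
    critic_model(task, draft) -> feedback string, or None when satisfied
    """
    draft = draft_model(task)
    for _ in range(rounds):
        critique = critic_model(task, draft)
        if critique is None:  # the critic is satisfied, so stop early
            break
        draft = draft_model(f"{task}\nRevise this draft: {draft}\nFeedback: {critique}")
    return draft


# Toy stand-ins: the "drafter" produces a v2 once asked to revise,
# and the "critic" asks for exactly one revision, then approves.
state = {"critiques_left": 1}

def drafter(prompt):
    return "draft v2" if "Revise" in prompt else "draft v1"

def critic(task, draft):
    if state["critiques_left"] > 0:
        state["critiques_left"] -= 1
        return "tighten the intro"
    return None

result = refine("write a product blurb", drafter, critic)  # -> "draft v2"
```

Swap the stand-ins for real chat-completion calls and the same loop becomes the "two colleagues" workflow: no human in the middle until the critic signs off.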
But here’s where it gets REALLY interesting. Function calling means these models can also reach out and invoke external programs through API integrations. That massively expands what they can actually accomplish in the real world.
To put this another way — you can now ask two powerful AIs to get on a call together and hash things out, AND make adjustments in Salesforce, update inventory, or make reservations and bookings. They can do just about anything we can do on the Internet.
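Mechanically, this works because the model doesn’t run code itself; it emits a structured request (a function name plus JSON arguments) that your code routes to a real integration. The schema below follows the shape OpenAI’s chat API expects in its `functions` parameter, but the handler (`update_inventory`) and the in-memory inventory are hypothetical stand-ins so the plumbing is runnable here.

```python
import json

# Schema the model is shown, in the format OpenAI's `functions` parameter uses.
functions = [{
    "name": "update_inventory",
    "description": "Adjust the stock count for a SKU",
    "parameters": {
        "type": "object",
        "properties": {
            "sku": {"type": "string"},
            "delta": {"type": "integer", "description": "Change in units"},
        },
        "required": ["sku", "delta"],
    },
}]

# Hypothetical local stand-in for a real inventory system.
INVENTORY = {"WIDGET-1": 10}

def update_inventory(sku, delta):
    INVENTORY[sku] = INVENTORY.get(sku, 0) + delta
    return INVENTORY[sku]

HANDLERS = {"update_inventory": update_inventory}

def dispatch(function_call):
    """Route a model's function_call message to the matching local handler.

    The model responds with the function name plus a JSON string of arguments.
    """
    handler = HANDLERS[function_call["name"]]
    args = json.loads(function_call["arguments"])
    return handler(**args)

# Simulate the message a model would emit when it decides to act:
new_count = dispatch({"name": "update_inventory",
                      "arguments": '{"sku": "WIDGET-1", "delta": -3}'})  # -> 7
```

The same dispatch layer works whether the caller is a single model or two models negotiating with each other; the handlers are where Salesforce, inventory, or booking integrations would plug in.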
So, that’s pretty cool.
We Don’t Need the Middleware Anymore
I’ve been thinking about knowledge ingestion, about AutoGPT’s ability to build things autonomously, about the trajectory of AI capability. One thing I kept running into was the intermediary problem — the need for some system sitting between the AI and the real world, translating intent into action.
Function calling essentially dissolves that problem. The AI doesn’t need a human-built translation layer anymore. It can directly interact with APIs, databases, and services. We are no longer working on building an intermediary system. The AI IS the intermediary system.
If you’ve been following the AutoGPT experiments I’ve written about, you already saw hints of this — an AI that started building its own visualizations, that tried to use a credit card. Those were early signs of agency. Function calling is the infrastructure that makes that agency reliable and scalable.
The Real-World Implications Are Immediate
This isn’t theoretical. This is available NOW. A developer can set this up TODAY. And the implications cut across pretty much every business function:
- Sales teams can have AI agents that don’t just draft emails but actually update CRM records, schedule follow-ups, and adjust pipeline stages
- Operations can have agents that monitor inventory levels and place reorders when thresholds are hit
- Customer service can have AI that doesn’t just suggest responses but books appointments, processes returns, and updates account information
The gap between “AI that talks about doing things” and “AI that does things” just closed significantly.
My Advice: Invite AI to Listen
Here’s my last thought, and I think it’s the most practical takeaway. I’d invite AIs to listen in on your calls moving forward. Not as some creepy surveillance tool — as a productivity amplifier.
An AI that can listen to a business call can create a follow-up plan. And now, in many circumstances, it can actually EXECUTE that plan. Update the CRM. Send the follow-up email. Schedule the next meeting. Create the task list. Flag the items that need human approval and handle the ones that don’t.
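That split — handle what’s safe, flag what isn’t — is itself just a small piece of logic sitting in front of the function-calling dispatch. Here’s a minimal sketch; the action names and the `AUTO_SAFE` allowlist are illustrative assumptions, not a prescribed policy.

```python
# Actions the agent may execute directly; everything else waits for a human.
# This allowlist is an illustrative assumption -- tune it to your risk tolerance.
AUTO_SAFE = {"update_crm", "schedule_meeting", "create_task"}

def triage(plan):
    """Split a follow-up plan into auto-executed and approval-needed actions."""
    executed, needs_approval = [], []
    for action in plan:
        if action["type"] in AUTO_SAFE:
            executed.append(action)        # would dispatch via function calling
        else:
            needs_approval.append(action)  # e.g. refunds, quotes, anything with money
    return executed, needs_approval

done, held = triage([
    {"type": "update_crm", "note": "logged call summary"},
    {"type": "issue_refund", "amount": 120},
    {"type": "schedule_meeting", "when": "Tuesday 10am"},
])
# done has 2 actions; held has 1 (the refund, awaiting approval)
```

The interesting design choice is that the boundary between "AI does it" and "human approves it" becomes a one-line policy you can tighten or loosen over time, rather than a rebuild of the system.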
We’ve spent years talking about AI as an assistant. Function calling is the moment it actually becomes one.
What I’m Watching Next
The speed here is what gets me. I was talking about AutoGPT’s autonomous behavior just weeks ago, and already the official API infrastructure has caught up to enable that kind of agency in a structured, reliable way. The gap between “experimental hack” and “production feature” is compressing fast.
If you’re a founder, a developer, or anyone building products — this week’s announcements aren’t just incremental updates. They’re the plumbing that makes autonomous AI agents viable at scale. The building blocks are all on the table now. The question isn’t whether AI agents will handle real-world tasks. It’s how quickly you’re going to let them start.