Apple executives have said in interviews that the company isn't building a public chatbot. You can actually create one with a new action in iOS 26. Here's how.
There's a new action that gives you direct access to Apple Intelligence, and it responds to prompts just like ChatGPT or Claude, even though it wasn't a headline feature at WWDC 2025.
In fact, Craig Federighi, Apple's senior vice president of Software Engineering, and Greg Joswiak, senior vice president of Worldwide Marketing, have said Apple isn't trying to build a chatbot.
But with Shortcuts, you can make Apple Intelligence act like a chatbot. If you're running the iOS 26 developer beta, you can try it yourself. It only takes a few taps.
How to use Apple Intelligence in Shortcuts
To get started, open the Shortcuts app and create a new shortcut.
- Tap the plus button to add an action.
- Search for the action called Use Model.
- Under Model, choose On This iPhone, Private Cloud Compute, or ChatGPT (if you have the ChatGPT extension enabled).
- Enter your prompt.
That's it. When you run the shortcut, it'll generate a response based on your prompt. You can show that output as a notification, save it to Notes, or use it in another step of the shortcut.
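For developers, the same on-device model behind the Use Model action is exposed through the Foundation Models framework Apple announced at WWDC 2025. Here's a minimal sketch of the equivalent in Swift, assuming iOS 26 and Apple Intelligence-capable hardware; the type and method names follow Apple's announced framework, but treat the exact signatures as assumptions rather than a definitive implementation:

```swift
import FoundationModels

// Check that the on-device model is ready before prompting it.
// Availability can fail on unsupported hardware or while the
// Apple Intelligence model assets are still downloading.
let model = SystemLanguageModel.default
guard model.availability == .available else {
    fatalError("Apple Intelligence is not available on this device")
}

// A session plays the role of the Use Model action:
// send a prompt, get a generated response back.
let session = LanguageModelSession()
let response = try await session.respond(
    to: "Summarize my day in one upbeat sentence."
)
print(response.content)
```

Note that the framework only covers the on-device model; routing a request to Private Cloud Compute, as the Shortcuts action can, isn't part of this API.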
On-device versus cloud model
Apple's own model comes in two versions. The first is the on-device model, which runs entirely on your iPhone. It's faster and more private, but limited in how complex or creative it can be.
The second option is the Private Cloud Compute model, which runs in Apple's data centers using more advanced models. This version can handle longer and more nuanced prompts, but it requires a network connection.
If you attempt something that the on-device model cannot handle, the shortcut will notify you and offer to switch to the cloud model.
It's mostly a chatbot...
It's not a full chatbot. There's no memory between prompts, no chat history, and no name or personality. But it answers clearly and naturally when you give it a prompt.
In testing, it performed decently with creative writing, summaries, brainstorming ideas, and even simplifying text. I'd say the cloud model performs around the level of GPT-3.5.
It also has basic safeguards. When asked to write a fake doctor's note, even one framed as part of a fictional story, it refused. That's a good sign.
But it made factual errors in other cases. When asked to list Aristotle's lost and rediscovered works, it made up names and added made-up details about Renaissance scholars finding them.
The entire response was fiction, delivered with confidence. Or, as the AI companies call it, a hallucination.
Apple says it's not a chatbot
During WWDC interviews, Federighi and Joswiak reiterated that Apple isn't trying to make a chatbot. The company's focus is on building helpful features, not chat companions.
But the behavior of the Shortcuts action tells a different story. It may not be a chatbot by name, but you can still chat with it. And it responds like one, especially when you give it open-ended prompts.
The shortcut is the most direct way to test Apple's language model right now. You don't need to go through Siri or wait for new system updates. You just write a prompt, run the shortcut, and see how the model responds.
It's also one of the few places where you're in full control. You can send any prompt you want, choose which model runs it, and decide what to do with the response.
Even if Apple doesn't call it a chatbot, it's close enough for now.
1 Comment
This is a really nice illustration that Apple's claim about not building a chat bot is just a distraction based on semantics.
The "chatbot" is just a product that provides a lightweight interface for using a combination of integrated tools, the most notable of which is the LLM. But if you have any experience using ChatGPT, you know that it's not just an LLM -- there are other tools in play, such as tools for creating various types of files for download (.docx, .pptx, .xlsx, .csv, .md, etc.).
So all Apple is really saying is that they want to integrate the LLM along with other components into a product that they will not call a 'chat bot' but instead whatever it is that they end up calling it (Siri, Siri+, or whatever). And I could imagine that Apple's product(s) that integrate the LLM end up being truly much better than the current chatbots out there, because Apple has control of multiple platforms.
I guess what it boils down to is that Apple is behind in two respects. First, they do not have a home-grown LLM that is as capable as other competitors. Second, they have not yet figured out how to integrate an LLM with other components into products that customers value.
It's the second thing that's the bigger deal, because they could always use somebody else's LLM until their own is up to snuff (just like they used Intel processors for many years). ChatGPT offers a product (the chatbot) that businesses and consumers alike are willing to pay actual $$ for. Apple is not offering an LLM-based product -- whatever they might call it -- that anybody would pay for.