Apple Intelligence Live Translation, Workout Buddy and Shortcuts Features Unveiled at WWDC 2025
Foundation Models framework for developers
Apple's Senior Vice President of Software Engineering, Craig Federighi, announced that the company has opened up access to its on-device foundation models for third-party app developers. These are the same AI models that power many Apple Intelligence features. Through the Foundation Models framework, developers can use them to build new features inside their existing apps or to create entirely new apps.
Apple emphasized that these are on-device models, so the AI capabilities work even without an internet connection, and user data never leaves the device. Developers also avoid any cloud application programming interface (API) costs, since no server inference is involved. The framework has native Swift support, making the models easy to access from app code, and it offers additional capabilities such as guided generation and tool calling.
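To illustrate, a Swift sketch along the lines of Apple's announced Foundation Models framework might look like the following. This is an illustrative example based on the API names shown at WWDC 2025 (`LanguageModelSession`, the `@Generable` and `@Guide` macros); the exact surface on shipping SDKs may differ.

```swift
import FoundationModels

// Guided generation: the @Generable macro asks the on-device model
// to return a typed Swift value instead of free-form text.
@Generable
struct TripIdea {
    @Guide(description: "A short, catchy trip title")
    var title: String
    var destination: String
}

func suggestTrip() async throws -> TripIdea {
    // A session wraps the on-device model; no network call,
    // API key, or cloud cost is involved.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a weekend trip for a hiking enthusiast.",
        generating: TripIdea.self
    )
    return response.content
}
```

Because the model runs entirely on-device, a call like this works offline, and the prompt and generated result never leave the user's hardware.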
New Apple Intelligence features
Federighi revealed that the more advanced AI-powered Siri teased at last year's WWDC will not arrive until 2026, when Apple will share more information. In the meantime, the Cupertino-based tech giant is planning to roll out several more Apple Intelligence features this year.
Live Translation
The AI-based Live Translation feature is being integrated into the Messages, FaceTime and Phone apps, so users can easily converse with people who speak a different language. Because it runs on-device, conversations never leave the user's device. In the Messages app, texts are translated automatically. On FaceTime calls, the feature shows live captions in the user's own language, while during phone calls it translates what the other person is saying in real time.
Visual Intelligence
Apple is also updating Visual Intelligence. iPhone users can now ask ChatGPT questions about whatever their device's camera is pointed at. The OpenAI chatbot recognizes what the user is looking at and uses that context to answer their questions. The feature can also search across apps such as Google and Etsy to find matching images and products; users can highlight a product in the camera view and search for it online.