Apple Boosts Siri's Intelligence with New Onscreen Awareness API in iOS 18.2

Apple is taking steps to enhance Siri's capabilities with a new feature called "onscreen awareness." This upcoming functionality aims to make Apple's virtual assistant more intelligent and responsive to content displayed on the device's screen.

In preparation for this advancement, Apple has introduced a new API in the latest iOS 18.2 beta. This API allows developers to make onscreen content in their apps accessible to Siri and Apple Intelligence, paving the way for more sophisticated interactions between users and their devices.
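Based on Apple's beta developer documentation, adoption centers on describing an app's content as App Intents entities and associating the entity currently onscreen with the view's `NSUserActivity`. The sketch below illustrates the general shape of this; `DocumentEntity`, `DocumentQuery`, and the activity type string are hypothetical, and the `appEntityIdentifier` / `EntityIdentifier` names come from the iOS 18.2 beta and may change before the feature ships.

```swift
import AppIntents
import UIKit

// A hypothetical App Intents entity representing a document shown onscreen.
struct DocumentEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Document"
    static var defaultQuery = DocumentQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

struct DocumentQuery: EntityQuery {
    func entities(for identifiers: [DocumentEntity.ID]) async throws -> [DocumentEntity] {
        // Resolve entity IDs against the app's own store (stubbed here).
        []
    }
}

final class DocumentViewController: UIViewController {
    var document: DocumentEntity?

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard let document else { return }
        // Advertise the onscreen entity to Siri and Apple Intelligence via
        // the NSUserActivity property introduced in the iOS 18.2 beta.
        let activity = NSUserActivity(activityType: "com.example.viewDocument")
        activity.title = document.title
        activity.appEntityIdentifier = EntityIdentifier(for: DocumentEntity.self,
                                                        identifier: document.id)
        userActivity = activity
        activity.becomeCurrent()
    }
}
```

With the entity advertised this way, Siri can in principle resolve phrases like "this document" to the item the user is currently viewing once the full onscreen awareness feature launches.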

The new feature will enable Siri to understand and act on information visible on the screen. For instance, a user who receives a text message containing an address could simply tell Siri to "add this address to the sender's contact card," and the assistant would complete the task automatically.

While the full implementation of onscreen awareness is not expected until a future iOS update, possibly iOS 18.4 in spring 2025, Apple is providing developers with early access to the necessary tools. This strategy allows app creators ample time to integrate the new functionality into their products, ensuring a smooth user experience when the feature officially launches.

In the current iOS 18.2 beta, users can already experience a taste of enhanced Siri capabilities through ChatGPT integration. This allows Siri to answer questions about photos, PDFs, and other documents displayed on the screen. For example, users can ask, "What's in this photo?" and Siri will capture a screenshot to analyze using ChatGPT.

It's worth noting that this ChatGPT integration is distinct from the planned onscreen awareness feature, though both fall under Apple's broader Apple Intelligence effort. The full extent of Siri's onscreen awareness capabilities has yet to be revealed, but it promises to make interactions with Apple devices more intuitive and efficient.

As Apple continues to refine and expand Siri's abilities, the introduction of this new API marks an important step towards a more intelligent and context-aware virtual assistant. Developers and users alike can look forward to exciting new possibilities in how they interact with their Apple devices in the near future.