Apple Introduces API for Siri’s Onscreen Awareness Feature
On the Apple Developer website, the company has published documentation (via MacRumors) for a new API titled 'Making onscreen content available to Siri and Apple Intelligence'. It is designed to give Siri and Apple Intelligence access to an app's onscreen content, enabling them to understand what the user is currently viewing.
If a developer adds support for the onscreen content API, their app will provide the contents of the screen to Siri and Apple Intelligence only when the user explicitly requests it, according to the company. The information on the user's screen can then be shared with a third-party service (such as OpenAI's ChatGPT).
Apple has also provided an example of Siri accessing onscreen content. While browsing the web, a user can say or type "Hey Siri, what's this document about?" to ask Siri to provide a summary of the document.
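To support a request like that, an app associates the content currently onscreen with an App Intents entity. The sketch below illustrates the general shape of such an adoption in Swift; the entity type, activity type string, and view controller are hypothetical examples, and the exact property and initializer names should be checked against Apple's published documentation rather than taken as authoritative.

```swift
import AppIntents
import UIKit

// Hypothetical entity representing a document the user has open.
struct DocumentEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Document"
    static var defaultQuery = DocumentQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Minimal query so DocumentEntity satisfies the AppEntity requirements.
struct DocumentQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [DocumentEntity] {
        []  // Lookup against the app's own store would go here.
    }
}

final class DocumentViewController: UIViewController {
    var document: DocumentEntity?

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard let document else { return }

        // Publish an NSUserActivity describing what is onscreen, and link it
        // to the entity so Siri/Apple Intelligence can reference the content
        // when the user explicitly asks about it.
        let activity = NSUserActivity(activityType: "com.example.viewDocument")
        activity.title = document.title
        activity.appEntityIdentifier =
            EntityIdentifier(for: DocumentEntity.self, identifier: document.id)
        userActivity = activity
        activity.becomeCurrent()
    }
}
```

The key idea is that the app never pushes screen contents anywhere on its own; it only advertises the current activity and its associated entity, and the system mediates access when the user makes an explicit request.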
Developers can also add support for onscreen awareness in browser, document reader, file management, mail, photo, presentation, spreadsheet, and word processing apps. Apple says this list isn't exhaustive, so more app categories should be able to take advantage of the API in the future.
It's worth noting that iOS 18.2 will not bring support for the new Siri, which is expected to offer greatly improved functionality. That is anticipated to arrive in iOS 18.4 along with support for in-app actions, which Apple will reportedly release in April 2025, giving developers ample time to integrate the API into their apps.