Apple is levelling up Siri with ‘onscreen awareness,’ enabling it to interact with what’s on your screen
- Apple is developing an “onscreen awareness” feature to allow Siri to understand and interact with the content currently displayed on your screen
- Apple will also provide APIs to developers for integrating onscreen awareness into their third-party apps and is currently testing ChatGPT integration, allowing Siri to answer questions based on screenshots
- While not available in the iOS 18.2 beta, “onscreen awareness” may arrive in time for iOS 18.4 in 2025
Among the digital assistants, Siri has fared fairly well (certainly compared to Cortana, the ill-fated assistant from rival Microsoft), and now Apple is working on making Siri even smarter by giving it a better sense of what you’re looking at on your screen, calling it ‘onscreen awareness.’
Apple has gone into detail about the development of this feature in an Apple Developer Documentation page, which also notes that it’s due to be included in various upcoming Apple operating system (OS) beta releases for testing.
Apple originally showcased onscreen intelligence in June 2024 and this is a pretty solid indication that it’s still in development.
The core idea of onscreen awareness is pretty straightforward – if you’re looking at items on your screen, say a document or a browser with a page open, and you have a question about something you’re looking at, you can ask Siri (equipped with Apple Intelligence). Siri should then be able to respond to your question with relevant information or perform an action asked of it, like sending content to a supported third-party app.
If it works as intended (and that’s a big ‘if’), it will result in a smarter Siri that won’t need you to describe what you want it to do as extensively as you probably do right now. For example, you could have a document open and ask for a summary without having to paste in the document’s contents yourself.
Apple’s plans for Siri’s foreseeable future
MacRumors reports that Apple has provided APIs for developers, enabling them to make the content in their apps available to Siri and Apple Intelligence. The idea is to provide developers with this API many months in advance of its release so that you can use onscreen awareness with third-party apps when it’s officially rolled out.
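Apple hasn’t spelled out exactly what this integration will look like in shipping apps, but the general pattern in the existing App Intents framework is to describe a piece of in-app content as an entity the system can read. The sketch below is illustrative only: the type names (ArticleEntity, ArticleQuery) are assumptions, not Apple’s final onscreen-awareness API, and it simply shows how an app might model a document so Siri and Apple Intelligence could resolve it as structured content.

```swift
import AppIntents
import CoreTransferable

// Hypothetical sketch: exposing an in-app document to Siri / Apple Intelligence
// via the App Intents framework. "ArticleEntity" and "ArticleQuery" are
// illustrative names, not part of Apple's documented onscreen-awareness API.
struct ArticleEntity: AppEntity, Transferable {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Article")
    static var defaultQuery = ArticleQuery()

    var id: UUID
    var title: String
    var body: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    // Transferable lets the system hand the content off as plain text, for
    // example to Siri or (via the ChatGPT integration) to an external model.
    static var transferRepresentation: some TransferRepresentation {
        ProxyRepresentation(exporting: { $0.body })
    }
}

// A minimal query so the system can resolve the entity by identifier.
struct ArticleQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [ArticleEntity] {
        // In a real app this would look the articles up in your data store.
        []
    }
}
```

Presumably, an app would then associate an entity like this with whatever is currently visible, so that when you ask Siri about “this document,” the assistant can work from structured content rather than a raw screenshot.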
Currently, we know Apple is trialing ChatGPT integration in the latest iOS 18.2 beta (among other Apple OS betas), and ChatGPT coupled with Siri will enable you to ask questions about items on screen, like images, PDFs, and videos. Siri will take a screenshot and pass it on to ChatGPT to answer your query, which means that, for now, this functionality is limited to screenshots of your screen.
However, as MacRumors points out, onscreen awareness is a little different, as it’s intended to be integrated in a more direct way.
Siri’s onscreen awareness is supposed to be the capacity to directly analyze, interpret, and interact with the content on your screen. If someone messages you their number and you’d like to save it, you could tell Siri to create a new contact without having to give additional instructions or waiting for Siri to perform many intermediate steps.
Will Siri survive and even thrive in the AI age?
Apparently, onscreen awareness isn’t actually in the iOS 18.2 developer beta just yet, and MacRumors speculates that it’s one of multiple Siri features we won’t see for a while, but this news is still promising. One prediction is that onscreen awareness could arrive in iOS 18.4, expected to be released in the first half of 2025.
If this pans out, Siri could become a more helpful digital assistant, and with Apple’s flair for design, it might become the digital assistant of choice for many. This development reminds me of what Microsoft is aiming for with Copilot, which hasn’t been received very well from what we’ve seen so far, leaving the goal open for Apple.