Apple Releases iOS 18.3 With New AI Features For All iPhones

Apple has released the stable versions of iOS 18.3 and iPadOS 18.3. The update brings AI-powered features to supported devices, a step toward expanding the availability of Apple Intelligence tools.

The headline feature is Visual Intelligence for the iPhone 16 series. It lets users add events to the Calendar app by scanning details from posters, flyers, and similar material, and it can also be used to visually identify plants and animals.

iOS 18.3 also introduces support for AI-generated news summaries. The feature is currently unavailable, however: Apple suspended it after it repeatedly produced misleading summaries that misrepresented the underlying stories. The summaries themselves are shown in italics, and their display can be configured through a dedicated menu accessible from the device’s lock screen, which also lets you quickly disable the feature for any given app.

The developers also restored the Calculator app’s ability to repeat the last operation by pressing the equals key. Two bugs were fixed as well: one that could make the on-screen keyboard disappear when invoking Siri’s text mode, and another that allowed Apple Music to continue audio playback to the end of a track even after the app was closed.


Apple Unveils Visual Intelligence AI Search - A Response to Google Lens

All four new iPhone models received a dedicated touch-sensitive button on the side of the case for controlling the main camera. With the release of iOS 18, it also enables quick searches for information about objects in the camera’s view, a capability Apple calls Visual Intelligence. The feature is integrated with the company’s Apple Intelligence system. The new side button on the iPhone 16 family recognizes several kinds of input, including single and double presses, press-and-hold, and sliding a finger up or down, with each gesture mapped to a corresponding function. Swiping up or down on the button, for example, zooms the image in or out.

As The Verge notes, pressing and then holding the button invokes a search for information about the objects in the main camera’s field of view. The system looks up contextual information about whatever the camera is pointed at: aim it at a cafe, for instance, and it will surface the opening hours and menu. The feature is also useful for pulling details from paper ads and flyers to quickly create a reminder for an event. In many ways, it resembles Google Lens.

In the future, Apple plans to integrate Visual Intelligence with third-party services such as Google. Company representatives emphasize that images are not stored on Apple’s servers when they are used only to search for related information.