Apple is set to introduce a range of new accessibility features for its devices, including iPhones, iPads, and Mac computers, later this year. One of these features, Personal Voice, is aimed at people who have difficulty speaking clearly or confidently. Delivered through a free software update, it lets users type what they want to say, such as a message or an order, on their iPhone and hear it spoken aloud in their own voice or a similar synthesized voice.
Personal Voice eliminates the need for additional apps or accounts and allows users to save frequently used sentences or phrases as shortcuts for quick playback. The feature assists in face-to-face conversations and integrates spoken audio into phone calls and FaceTime.
For a more personalized result, users can create a Personal Voice model of their own. This involves recording about 15 minutes of spoken samples, which can be done in short sessions at the user's convenience. The device then processes the samples overnight, after which typed messages can be read back in the user's own voice.
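Apple's announcement does not describe a developer API for Personal Voice, but the typed-text-to-speech flow it describes can be sketched with AVFoundation's existing AVSpeechSynthesizer. The personal-voice lookup and authorization calls below are assumptions based on the iOS 17 SDK and may differ in the shipping release; treat this as a minimal illustration rather than Apple's implementation.

```swift
import AVFoundation

let synthesizer = AVSpeechSynthesizer()

/// Speak typed text aloud. Prefers a Personal Voice if the user has shared
/// one with the app (assumption: the iOS 17 `isPersonalVoice` voice trait);
/// otherwise falls back to the default system voice.
func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    if let personal = AVSpeechSynthesisVoice.speechVoices()
        .first(where: { $0.voiceTraits.contains(.isPersonalVoice) }) {
        utterance.voice = personal
    } else {
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    }
    synthesizer.speak(utterance)
}

// Ask for permission to use the Personal Voice, then speak a typed phrase.
// (Assumption: requestPersonalVoiceAuthorization ships with iOS 17.)
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    if status == .authorized {
        speak("I'd like a medium coffee with oat milk, please.")
    }
}
```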
It’s worth noting that a Personal Voice model is device-specific by default: the training process must be repeated on each additional device unless the user explicitly allows the model to be shared across their devices.
Another accessibility feature, Assistive Access, is designed for individuals with cognitive impairments. It simplifies the user interface by removing unnecessary visual elements, allowing for more straightforward interaction with an iPhone or iPad. For instance, favorite contacts can be set up for quick access to voice or video calls, streamlining the calling process. Assistive Access also provides a simplified messaging experience.
Apple’s Magnifier app will gain a feature called Point and Speak. It uses the device’s camera and built-in LiDAR sensor to read text aloud when the user points a finger at it, for example the labels on a microwave keypad. However, Point and Speak will only be available on Apple devices with a LiDAR sensor, which is currently limited to the Pro models of the iPhone and iPad.
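Point and Speak is built into the Magnifier app rather than exposed as an API, but the underlying idea of recognizing text in a camera frame and speaking it can be approximated with Apple's Vision and AVFoundation frameworks. Below is a minimal sketch under that assumption; the LiDAR depth ranging and finger tracking that isolate the text the user is pointing at are omitted, and the camera frame is assumed to be supplied by the caller.

```swift
import Vision
import AVFoundation

let synthesizer = AVSpeechSynthesizer()

/// Recognize any text in a camera frame and read it aloud.
/// This only approximates Point and Speak: the LiDAR-based depth ranging
/// and finger tracking that select the pointed-at text are not shown.
func readTextAloud(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Join the top candidate string from each detected text region.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: " ")
        guard !text.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: text))
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```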
An official launch date for these new accessibility features has not been announced, but such features typically arrive with the new iOS, iPadOS, and macOS releases in the fall. They aim to make Apple devices more accessible and inclusive, helping people with speech impairments or cognitive disabilities interact with the world and communicate more easily.
Niharika is a technical consulting intern at Marktechpost. She is a third-year undergraduate pursuing her B.Tech at the Indian Institute of Technology (IIT), Kharagpur. She is a highly enthusiastic individual with a keen interest in machine learning, data science, and AI, and an avid reader of the latest developments in these fields.