June 23, 2020
Points from the WWDC 2020 Keynote, June 23, 2020
The most exciting announcements for me, in a nutshell:
- New touch accommodations allow a double tap or triple tap on the back of the iPhone to trigger any of 23 actions, including Accessibility features or Shortcuts
- New FaceTime feature will recognise people using Sign Language and make their screen bigger in chats
- New AirPods feature to enhance verbal sounds around you
- Microsoft’s Xbox Adaptive Controller will work with Apple TV
- VoiceOver Recognition: On-device intelligence recognises key elements displayed on your screen to add VoiceOver support for app and web experiences that don’t have accessibility support built in.
But there is so much more I am going to unpack in the area of cognitive accessibility over the coming weeks as well.
Links below - love your thoughts.
No doubt more to share soon!
Accessibility features include Headphone Accommodations, which amplifies soft sounds and tunes audio to help music, movies, phone calls, and podcasts sound crisper and clearer, and sign language detection in Group FaceTime, which makes the person signing more prominent in a video call. VoiceOver, the industry’s leading screen reader for the blind community, now automatically recognizes what is displayed visually onscreen so more apps and web experiences are accessible to more people.
Sign Language Prominence: FaceTime can now detect when a participant is using sign language and make the person prominent in a Group FaceTime call.
Picture-in-Picture: With Picture in Picture for FaceTime, you can continue viewing your call while multitasking.
Headphone Accommodations: This new accessibility feature is designed to amplify soft sounds and adjust certain frequencies for an individual’s hearing, to help music, movies, phone calls, and podcasts sound more crisp and clear. Headphone Accommodations also supports Transparency mode on AirPods Pro, making quiet voices more audible and tuning the sounds of your environment to your hearing needs.
Audio sharing for Apple TV: Connect two sets of AirPods to your Apple TV 4K so you can enjoy movies and shows with someone else without disturbing others.
Automatic Switching: Seamlessly move between devices without manually switching your AirPods. If you finish a phone call on your iPhone and pick up your iPad to watch a movie, AirPods automatically switch over. (Also works with macOS Big Sur.)
iOS 14 Accessibility: VoiceOver Recognition: On-device intelligence recognizes key elements displayed on your screen to add VoiceOver support for app and web experiences that don’t have accessibility support built in.
• Image Descriptions: VoiceOver reads complete-sentence descriptions of images and photos within apps and on the web.
• Text Recognition: VoiceOver speaks the text it identifies within images and photos.
• Screen Recognition: VoiceOver automatically detects interface controls to aid in navigating your apps, making them more accessible.
- New accessibility feature in iOS 14 that can perform quick actions through taps on the back of an iPhone. The Back Tap feature, which can be found in Accessibility settings, can be used to instantly pull up Control Center, summon Siri, or even run Shortcuts.
There are currently 23 actions in total (aside from any user-created Shortcuts), and users can assign separate actions to double taps and triple taps. Back Tap can be used at any time, although it currently works only while the device is unlocked.
The Home app makes smart home control even easier with new automation suggestions and expanded controls in Control Center for quicker access to accessories and scenes. Adaptive Lighting for compatible HomeKit-enabled lights automatically adjusts the color temperature throughout the day, and with on-device Face Recognition, compatible video doorbells and cameras can identify friends and family. The Home app and HomeKit are built to be private and secure, so all information about a user’s home accessories is end-to-end encrypted.
Adaptive Lighting: Supported lighting accessories can now automatically adjust color temperature throughout the day to maximize comfort and productivity. Ease into the morning with warmer tones and remove blue light in the evening as you wind down for the night.
Face Recognition and Activity Zones are part of HomeKit Secure Video, the feature that brings video from your camera accessories right to your Home app. It’s secure and private, with all video analysis done on the Apple devices in your home — not in the cloud.
Maps takes elevation into account to let you know if you’re in for an uphill workout or a leisurely, flat ride. You’ll be alerted if there are steep passages along the route or if you’ll need to carry your bike up stairs. You can also choose a route that avoids stairs or busier roads altogether.
Automatic language detection transcribes the original and translated text on the appropriate sides of the screen, followed by translated audio. Translate uses advanced on-device machine learning and the powerful Apple Neural Engine to enable natural-sounding conversations.
Enlarge translated text in landscape view, making it easier to read and more effective at getting someone’s attention.
• Customers can now use Siri to translate many languages conveniently from the wrist. Dictation is handled on device with the power of the Apple Neural Engine for faster and more reliable processing when dictating messages and more, and Apple Watch now supports Announce Messages with Siri.
• The Shortcuts app is also now available on Apple Watch and can be accessed as a complication.
• The bold X-Large face now has an option to add a rich complication.
Following the introduction of the Noise app in watchOS 6 that measures ambient sound levels and duration of exposure, watchOS 7 adds further support for hearing health with headphone audio notifications. Customers can now understand how loudly they are listening to media through their headphones using their iPhone, iPod touch, or Apple Watch, and when these levels may impact hearing over time.
When total listening with headphones has reached 100 percent of the safe weekly listening amount, Apple Watch provides a notification to the wearer. This amount is based on World Health Organization recommendations that, for instance, a person can be exposed to 80 decibels for about 40 hours per week without an impact to hearing abilities. Customers can also see how long they have been exposed to high decibel levels each week in the Health app on iPhone and can control the maximum level for headphone volume. No audio from the headphone audio notification feature is recorded or saved by the Health app or Apple Watch. (See image at bottom)
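The WHO figure above follows the equal-energy principle: each 3 dB increase in level roughly halves the safe listening time. A quick Swift sketch of that arithmetic (the function name and the 3 dB exchange rate are my assumptions for illustration, not anything from Apple):

```swift
import Foundation

// Rough sketch of the WHO equal-energy rule: 80 dB is considered safe
// for about 40 hours per week, and every +3 dB halves the safe time
// (the "3 dB exchange rate"). Illustrative helper, not an Apple API.
func safeListeningHoursPerWeek(atDecibels level: Double) -> Double {
    let referenceLevel = 80.0   // dB
    let referenceHours = 40.0   // hours per week at 80 dB
    return referenceHours * pow(2.0, (referenceLevel - level) / 3.0)
}
```

So 83 dB works out to about 20 hours per week, and 89 dB to about 5 hours, which matches the way the Health app's weekly exposure numbers scale.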
Scribble: With Scribble, you don’t have to put Apple Pencil away to do other things. Now you can write by hand in any text field across iPadOS, and your words automatically convert to text. Use Apple Pencil to write a quick message or search for something in Safari. Your handwriting will automatically transform to typed text, so you can get back to what you were doing without interrupting your flow.
Notes: Powered by advanced machine learning that distinguishes writing from drawing, Smart Selection lets you select handwritten text using the same gestures you’ve always used for typed text.
Paste handwriting as text: Simply select your handwritten notes and copy them as text. When you paste them into another app, like Pages, they’ll be converted to typed text.
macOS Big Sur:
Control Center for Mac:
Designed just for Mac, the new Control Center consolidates your favorite menu bar items into a single place to give you instant access to the controls you use most. Just click the Control Center icon in the menu bar and adjust Wi-Fi, Bluetooth, AirDrop, and other settings — without opening System Preferences. Add controls for the apps and features you use most, like Accessibility or Battery.
Xbox Adaptive Controller support with Apple TV
Shortcuts got some very cool updates in iOS/iPadOS 14:
- Disable confirmation for automations
- New compact UI for lists, input dialogs, running shortcuts in share sheet
- Automatic categories for share sheet/Watch
- Copy & paste actions (!)
- New automation triggers
- Shortcuts also run on Apple Watch
WWDC Accessibility Developer Sessions
Make your App Visually Accessible
When you design with accessibility in mind, you empower everyone to use your app. Discover how to create an adaptive interface for your app that takes a thoughtful approach to color, provides readable text, and accommodates other visual settings to maintain a great experience throughout. We've designed this session like our user interfaces — to be accessible to all. If you'd like to learn even more about accessibility and design, you may also enjoy “Visual Design and Accessibility,” “Accessibility Inspector,” “Building Apps with Dynamic Type,” and “Introducing SF Symbols.”
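As a rough illustration of the adaptive-interface ideas this session covers, here is a minimal UIKit sketch. The view controller itself is hypothetical; the system colors and Dynamic Type APIs it uses are real UIKit:

```swift
import UIKit

// Hypothetical screen showing two of the session's themes:
// semantic system colors and Dynamic Type.
final class ArticleViewController: UIViewController {
    let bodyLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Semantic colors adapt automatically to Dark Mode and
        // Increase Contrast, so you get readable contrast for free.
        view.backgroundColor = .systemBackground
        bodyLabel.textColor = .label

        // Dynamic Type: use a text style and let the font scale
        // with the user's preferred reading size.
        bodyLabel.font = UIFont.preferredFont(forTextStyle: .body)
        bodyLabel.adjustsFontForContentSizeCategory = true
        bodyLabel.numberOfLines = 0
        view.addSubview(bodyLabel)
    }
}
```

The point is that neither feature needs per-setting code: semantic colors and text styles respond to the user's visual settings automatically.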
Create a seamless speech experience in your apps
Augment your app's accessibility experience with speech synthesis: Discover the best times and places to add speech APIs so that everyone who uses your app can benefit. Learn how to use AVSpeechSynthesizer to complement assistive technologies like VoiceOver, and when to implement alternative APIs. And we'll show you how to route audio to the appropriate source and create apps that integrate speech seamlessly for all who need or want it. To get the most out of this session, you should be familiar with AVFoundation and the basics of speech synthesis. For an overview, watch “AVSpeechSynthesizer: Making iOS Talk.”
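A minimal sketch of the kind of speech synthesis the session covers. The wrapper type and its names are invented for illustration; AVSpeechSynthesizer, AVSpeechUtterance, and AVSpeechSynthesisVoice are the real AVFoundation APIs:

```swift
import AVFoundation

// Hypothetical wrapper around AVSpeechSynthesizer for spoken feedback.
final class Announcer {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String, languageCode: String = "en-US") {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: languageCode)
        // A slightly slower-than-default rate; this multiplier is a
        // tunable assumption, not a value from the session.
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate * 0.9
        synthesizer.speak(utterance)
    }
}
```

As the session notes, this complements rather than replaces VoiceOver: an app should still check whether an assistive technology is already speaking before adding its own synthesized audio.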
Accessibility design for Mac Catalyst
Make your Mac Catalyst app accessible to all — and bring those improvements back to your iPad app. Discover how a great accessible iPad app automatically becomes a great accessible Mac app when adding support for Mac Catalyst. Learn how to further augment your experience with support for mouse and keyboard actions and accessibility element grouping and navigation. And explore how to use new Accessibility Inspector features to test your app and iterate to create a truly great experience for everyone. To get the most out of this session, you should be familiar with Mac Catalyst, UIKit, and basic accessibility APIs for iOS. To get started, check out “Introducing iPad apps for Mac” and “Auditing your apps for accessibility.”
VoiceOver efficiency with custom rotors
Discover how you can integrate custom rotors and help people who use VoiceOver navigate complex situations within your app. Learn how custom rotors can help people explore even the most intricate interfaces, explore how to implement a custom rotor, and find out how rotors can improve navigation for someone who relies on VoiceOver. To get the most out of this session, you should be familiar with general accessibility principles and VoiceOver accessibility APIs on iOS and iPadOS. For an overview, watch “Making Apps More Accessible with Custom Actions.”
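To make the idea concrete, here is a hedged sketch of a custom rotor that lets VoiceOver users jump between headlines on a screen. The view controller, the headlineViews array, and the rotor name are hypothetical; UIAccessibilityCustomRotor is the real UIKit API:

```swift
import UIKit

// Hypothetical feed screen exposing a "Headlines" rotor so VoiceOver
// users can flick straight between headline elements.
final class FeedViewController: UIViewController {
    var headlineViews: [UIView] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        let headlineRotor = UIAccessibilityCustomRotor(name: "Headlines") {
            [weak self] predicate in
            guard let self = self, !self.headlineViews.isEmpty else { return nil }
            // Find where the rotor currently is, then step forward or
            // backward through the headline elements.
            let current = predicate.currentItem.targetElement as? UIView
            let index = current.flatMap { self.headlineViews.firstIndex(of: $0) } ?? -1
            let nextIndex = predicate.searchDirection == .next ? index + 1 : index - 1
            guard self.headlineViews.indices.contains(nextIndex) else { return nil }
            return UIAccessibilityCustomRotorItemResult(
                targetElement: self.headlineViews[nextIndex],
                targetRange: nil)
        }
        accessibilityCustomRotors = [headlineRotor]
    }
}
```

Returning nil from the search block tells VoiceOver there are no more items in that direction, which is how the rotor signals the edges of the list.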
App accessibility for Switch Control
Switch Control is a powerful accessibility technology for anyone with very limited mobility. The feature is available natively on iOS, and you can create an even better Switch Control experience in your app with tips, tricks, and a few APIs. We'll walk you through how people use Switch Control, as well as provide best practices for supporting it in your app effectively. To get the most out of this session, you should be familiar with general accessibility principles and VoiceOver accessibility APIs. Check out "Making Apps More Accessible With Custom Actions," "Writing Great Accessibility Labels," and "VoiceOver: App Testing Beyond The Visuals" for more information.
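One pattern that helps Switch Control users is grouping a row into a single element and exposing its secondary operations as custom actions, so scanning does not have to step through every subview. A minimal, hypothetical sketch (the cell, label text, and action names are invented; UIAccessibilityCustomAction is the real UIKit API):

```swift
import UIKit

// Hypothetical message cell configured for efficient Switch Control
// (and VoiceOver) scanning.
final class MessageCell: UITableViewCell {
    func configureAccessibility() {
        // Present the cell as one element so scanning treats the row
        // as a single stop instead of visiting each subview.
        isAccessibilityElement = true
        accessibilityLabel = "Message from Sam: Lunch at noon?"

        // Expose secondary operations as custom actions rather than
        // separate buttons, keeping them one scan step away.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Reply") { _ in
                // Hypothetical reply handler.
                return true
            },
            UIAccessibilityCustomAction(name: "Delete") { _ in
                // Hypothetical delete handler.
                return true
            }
        ]
    }
}
```

Returning true from a handler tells the system the action succeeded; the same actions also appear in the VoiceOver actions rotor, so one setup serves both technologies.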
Meg Frost using a Whill