Amid a flood of Amazon-branded tablets and Alexa-powered tech, Amazon Devices and Services SVP Dave Limp announced that the company’s digital assistant will soon tap into a purpose-built large language model (LLM) on nearly every new Echo device.
Amazon said it designed the LLM around five core competencies. One of those is making sure conversations actually feel conversational: the company claimed it “studied what it takes to have a great conversation. It’s not just words: It’s body language, it’s eye contact, it’s knowing who you’re addressing.” Amazon has yet to add eye contact or hand gestures to its Echo devices. Not that we’ve seen, anyway.
However, based on the demo at Amazon’s showcase, the assistant still has some work to do. When Limp asked Alexa to compose a quick message inviting friends to a BBQ, the assistant requested his friends’ presence for “BBQ Chicken and Sides,” which is exactly how we humans invite each other to dinner, right? Alexa also completely ignored the Amazon SVP’s requests at several points during the presentation, but I’ll chalk those issues up to the unpredictable nature of demoing a voice assistant in a live setting. We’ve put together all of Amazon’s announcements right here.
– Matt Smith
The biggest stories you may have missed
Freedom from touching your screen.
With the Apple Watch Series 9, Apple is introducing a new way to interact: double tap. It’s also introducing on-device Siri processing, which will let you ask the assistant about your health data and log your daily stats. Double tap obviously won’t help when both of your hands, or at least the hand your watch is on, are busy; you’ll need at least your thumb and forefinger free to perform the pinch. But when Engadget’s Cherlynn Low is cleaning her apartment, holding a side plank, lifting a dumbbell or reading a book, it makes her life easier. It’s also worth noting the Apple Watch Series 9 and Ultra 2 are the company’s first carbon-neutral products. Read on for our full verdict.
Continue reading.
But the full extent of the damage isn’t yet clear.
All MGM Resorts hotels and casinos are back to normal operations nine days after a cyberattack forced the company to shut down its systems. The ALPHV ransomware group took credit for the attack shortly after the systems went offline, claiming all it took was a little LinkedIn research and a short phone call to gain access to the casino operator’s critical systems. Worryingly, the attack originated through identity management vendor Okta, and at least three other Okta customers have been hit by similar cyberattacks, Reuters reports.
Continue reading.
Amazon is also bringing on-screen translation of Alexa calls to its smart displays.
Amazon announced two new accessibility features coming to its devices later this year. The first is Eye Gaze on Alexa, which will let people with mobility or speech disabilities use their gaze to perform preset tasks on the Fire Max 11 tablet. This is the first time that Amazon has worked on gaze-based navigation of its devices, and it will use the camera on the Max 11 to keep track of where the user is looking. Pre-defined actions include smart home control, media playback, and making calls. Eye Gaze will be available at no extra cost on the Max 11 later this year, though the company hasn’t otherwise detailed how Eye Gaze actually works.
Amazon is also adding a new Call Translation feature that will transcribe and translate Alexa calls on Echo Show devices, converting captions between more than 10 languages, including English, French, Spanish and Portuguese. This feature will also launch later this year.
Continue reading.