Amid the flood of Amazon-branded tablets and Alexa devices, Dave Limp, Amazon's senior vice president of devices and services, announced that the company's digital assistant will soon be powered by a custom-built large language model (LLM) that nearly every new Echo device will use.
Amazon aims to design the LLM around five core functions. One of these is making sure interactions are "conversational," and the company says it has "studied what it takes to have a good conversation" – which apparently involves eye contact and hand gestures. I'm still waiting for Amazon to release an Echo that can manage those.
Based on the demo at Amazon's showcase, there's still work to be done. When Limp asked Alexa to write a short message inviting friends to a barbecue, the assistant invited them to a "BBQ Chicken and Sides" party — because that's exactly how we invite people to dinner, right? At one point during the presentation, Alexa completely ignored the Amazon SVP's request, but I'd attribute these issues to the fraught nature of demoing a voice assistant in a live environment. We've compiled all of Amazon's announcements here.
– Matt Smith
The biggest stories you might have missed
Freedom to not touch your screen.
With the Apple Watch Series 9, Apple is introducing a new type of interaction: Double Tap. On-device Siri processing is also arriving, allowing you to ask the assistant for your health information and to log your daily stats. If both of your hands — or at least your watch hand — are busy, Double Tap obviously isn't helpful, and the pinching gesture requires at least a free thumb and index finger. But when Engadget's Cherlynn Low cleans her apartment, holds a plank, lifts a dumbbell or reads a book, it makes her life easier. It's also worth mentioning that the Apple Watch Series 9 and Ultra 2 are the company's first carbon-neutral products. Read on for our full verdict.
Continue reading.
The full extent of the damage from the attack is still unclear, however.
All MGM Resorts hotels and casinos are back to business as usual, nine days after a company-wide cyberattack crippled its systems. The ALPHV ransomware group claimed responsibility shortly after the systems went offline, saying it used social engineering techniques — a little LinkedIn research and a quick phone call — to gain access to critical systems at the casino operator. Worryingly, the attack was reportedly launched through identity management company Okta, and at least three of its other customers were also affected, according to a Reuters report.
Continue reading.
It also offers call translation for Alexa calls on its smart displays.
Amazon has announced two new accessibility features coming to its devices later this year. The first is Eye Gaze on Alexa, which allows people with mobility or speech impairments to use their gaze to perform predefined actions on the Fire Max 11 tablet. This is the first time Amazon has built vision-based navigation into its devices, using the Max 11's camera to track where users are looking. Preset actions include smart home control, media playback and making calls. Eye Gaze will be available on the Max 11 later this year at no additional cost, although the company hasn't otherwise explained how the feature will actually work.
Amazon is also adding a new call translation feature that transcribes and translates Alexa calls on Echo Show devices into more than 10 languages, including English, French, Spanish and Portuguese. That feature will also launch later this year.
Continue reading.