Yesterday, I demoed the Amazon Echo Look again, to the bewilderment of a guest. She, too, seemed to think it was a bizarre device. What's strange about the Look is how niche it is for a consumer hardware product: it's a device specifically for fashionistas who are also Amazon fans. As a quick review, you use the Look and its app to try out different looks, and Amazon can then suggest outfits or explain why one look works better than another.
Amazon has done a lot with cameras.
First, there was the Fire Phone, which had four front-facing cameras for creating a 3D window effect through face tracking and perspective shifting. Then there were the cameras on the Fire tablets.
There's also the Amazon Cloud Cam (its answer to the Nest Cam), its billion-dollar acquisition of Ring, and now the Amazon DeepLens camera. The latter is very interesting because it gives developers the ability to build apps using Edge AI (AI residing on the device itself).
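To make the "Edge AI" idea concrete, here's a toy sketch of what on-device inference means: the model weights live on the camera and every frame is classified locally, so no image ever has to travel to the cloud. The labels and weights below are hypothetical stand-ins, not the actual DeepLens API.

```python
# Toy illustration of Edge AI: all inference happens on the device.
# LABELS and WEIGHTS are made-up placeholders for a real trained model.

LABELS = ["person", "dog", "cat"]
WEIGHTS = [            # one weight vector per label (hypothetical 3-feature model)
    [0.9, 0.1, 0.0],   # "person"
    [0.1, 0.8, 0.1],   # "dog"
    [0.0, 0.2, 0.8],   # "cat"
]

def classify_on_device(features):
    """Score each label with a dot product and return the best match.

    Runs entirely locally -- no network call, which is the whole point
    of putting the model on the edge device.
    """
    scores = [sum(w * f for w, f in zip(ws, features)) for ws in WEIGHTS]
    return LABELS[scores.index(max(scores))]

# A frame whose (made-up) features lean heavily toward "person":
print(classify_on_device([1.0, 0.2, 0.1]))
```

The trade-off this sketch hints at: edge inference keeps latency low and raw footage private, at the cost of being limited to models small enough to fit on the device.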
A few things that are interesting:
- Why can't the Echo Look's capabilities be added to other Amazon camera devices (Echo Show, Echo Spot, Fire tablets, etc.)?
- What about adding voice capabilities to Cloud Cam / Ring?
- If the Look is tracking looks, why not also track weight gain or loss, skin tone, or other health characteristics?
There are plenty of camera concepts the Look could borrow from.
My Look sits on the shelf most of the time, passively waiting for a false trigger when I try to talk to my Echo Dot (and then shutting down nicely thanks to Echo Spatial Perception). I wish it could passively do more.