When we were creating different UX elements for the Ubi, it soon became apparent that there were thousands of different areas where we could add some spunk or personality.
Our setup music was Thus Spoke Zarathustra. Turning green indicated mute (green means safe, right?), flashing pink meant setup mode (flashing, for those who couldn't distinguish colours), and when a new lesson was created, we dropped the spoken "updated" entirely and replaced it with silence.
One of the processes I'd love to see inside Amazon and Google is how these different UX elements are selected for different regions. Why does the mic have to announce that it's muted when I've pressed the button on the Google Home? (I like the Echo's light chime when it's muted and louder chime when it's unmuted.) How about the sounds when the volume is changed? (I prefer the Home's Geiger-counter-like clicks to the Echo's chimes.)
Instead of a committee determining these elements, why not open them up to the community? I could see a dev community forming around advanced UX for the Echo or the Home.
Another avenue to test this out is through AVS or Google Assistant-enabled products. Maybe allow for interactions that deviate from the standard Alexa / Google Assistant experience?