You've seen this with your portable music player and your camera: as they've shrunk in size, they've grown in capabilities and battery life. You've seen it even more dramatically with your mobile devices, whose size, shape, and interface have been significantly transformed by user desires, needs, and experience, not to mention the competitive marketplace. The underlying principle is simple: user experience continuously reshapes the way we think about our devices and the way we interact with them in our daily lives. As a new generation of Artificial Intelligence (AI) begins to emerge in the consumer market, it is worth asking what kind of user experience it will soon enable on our next mobile device.
Controlling the experience
The present anticipates the future in revealing ways. It's certain, for instance, that dramatic changes are in store for the device-user interface, just as tablets brought a new dimension to touch screens and Siri brought voice. Developing new controls that complement the next mobile hardware innovation will be a major challenge for AI in the coming years. The hardware design of Google Glass, for example, limits the use of touch; the controls for wearable technology like these glasses will therefore likely combine a trackpad on the side of the frame with voice commands and gestures or movements.
Simplifying the experience
The new user experience will rely on AI components to add an open-ended, intelligent dimension to your interactions with your mobile device. Soon it won't suffice to work within an interface that returns ten blue URL links as your search results. Searching in the near future will provide more immediate and precise answers. Already, mobile interfaces have shifted toward less content and more information per answer. AI will be behind these answers and interactions, learning from your behavior and profile to predict the right answer for you, and only you. Once powered by AI, the UI can also offer multidimensional answers: visually structured sets of results (e.g., one type of answer grouped at the top left of the screen, another at the bottom right) that make complex responses easier to navigate. This is, in fact, closer to the visual experience of a shop, where people navigate by looking at objects rather than at a list on a screen.
A new on-the-go experience
Augmented Reality (AR) will be advanced by integrating AI into the user experience. As our hardware becomes more hidden, and therefore wearable, augmented reality interfaces will project multi-dimensional screens of information about what we have in front of us. This new interface may be most helpful for in-the-moment information (a local subway station, as you look at it, alerts you that it's closed) and for searching instantly for a product while on the go.
The future of UI/UX is already here, as seen in leading apps that apply AI to improve the experience, whether it's hidden in the predictions shown to users or surfaced as an obviously new, intelligent interaction. As companies try to differentiate themselves through their mobile apps, this is only the beginning.