This talk covers the design of fuzzy things. We’re all familiar with designing for default, typical, and error states. But how do we design for our users’ state of mind? Or their physical state? Or for the state of the art in machine vision? Two powerful trends are pushing design into fuzzier territory: ubiquitous sensors and machine learning. Our iPhones and Watches capture more and more about our physical activity and environment. How can we use this data to design great experiences?

The first half of this talk looks at the design of heuristics, with common approaches and tools for fast prototyping. The second half offers strategies for wrestling with machine vision and machine learning results in your UI design, including how to map “confidence levels” to UI and how to avoid Clippy-isms like “It looks like you’re writing a letter”. It concludes with real-world app examples to inspire new levels of ML integration.
I'm an interaction designer and new developer! I previously co-founded plasq (makers of Comic Life) and Skitch, which was acquired by Evernote. I've designed for all manner of things, including IoT and drones. :-)