News/Reviews

Until 15 February 2015, visitors to the Frans Hals Museum in Haarlem, Netherlands can find an interactive art installation by Sightcorp as part of the ongoing exhibition. The application was developed in close collaboration with the University of Amsterdam and the Museum, and offers visitors the opportunity to play with their emotions while trying to mimic the facial expressions portrayed in famous paintings shown on a screen. The exhibition “Emoties” is dedicated to paintings from the Dutch Golden Age (16th and 17th centuries), in which Dutch Old...
Cooper’s new Design the Future series of posts opens the door to how we think about and create the future of design, and how design can influence changing technologies. Join us in this new series as we explore the ideas behind agentive technology, and summon a metaphor to help guide us to the next interface. Part 1: Toward a New UX If we consider the evolution of technology—from thigh-bones-as-clubs to the coming singularity (when artificial intelligence leaves us biological things behind)—there are four supercategories of tools that influence the nature of what’s to come:  ...
Natural User Interfaces are generating serious buzz these days – and not without reason. Natural User Interfaces (NUIs) literally change the way people interact with computers. They create new patterns, new means of communication, and new business opportunities. We can now play football without a controller. We can create personalized 3D models of the human body. We can accurately track finger movements. Computers can even understand our voice and what we mean. A few days ago, Microsoft announced HoloLens: an innovative way of viewing and interacting with holograms. Guess what: if I was...
IBM Watson is a system for reasoning over unstructured information. Initially, all this information came in as text and all interactions were typed or GUI-based. Information was presented back to the user via a GUI – no hands-free operation, no spoken interactions. We are pleased to take our first steps in bringing the ability to recognize speech (“Speech to Text”) and produce speech (“Text to Speech”) to IBM Watson developers. These services allow you to build applications that take speech as input and return speech as output. They use exactly the same programming models as other cognitive services...
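The excerpt above describes the services only at a high level. As a rough illustrative sketch of how an application might assemble a request for a speech-recognition service of this kind (the endpoint URL, parameter names, and request layout below are assumptions made for illustration, not the documented Watson API surface):

```python
# Illustrative only: the endpoint and parameter names are hypothetical,
# not taken from the actual Watson Speech to Text documentation.
HYPOTHETICAL_STT_URL = "https://api.example.com/speech-to-text/v1/recognize"

def build_recognize_request(audio_path, content_type="audio/wav", model="en-US"):
    """Assemble the parts of a speech-recognition HTTP request.

    Reads the audio file and returns a dict describing the request;
    an HTTP client (e.g. the `requests` library) could then send it
    with requests.post(**request).
    """
    with open(audio_path, "rb") as f:
        audio_bytes = f.read()
    return {
        "url": HYPOTHETICAL_STT_URL,
        "headers": {"Content-Type": content_type},
        "params": {"model": model},
        "data": audio_bytes,
    }
```

The point of the sketch is simply that such services follow the same request/response programming model as other cognitive services: the client posts raw audio with a content type and model choice, and receives a transcript (or, for Text to Speech, audio) in response.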
The Internet of Things is evolving and the number of connected devices is growing, but each new device and form factor brings some novel way to interact with it – different inputs and outputs. As designers, we have an exciting opportunity in front of us to inform how speech, alongside other modalities, manifests itself on these different devices, and what that ultimately means for the people using them. When you consider IDC’s projection of 15 billion connected devices by 2015 – a number ballooning to a massive 200 billion by 2020 – it is apparent...