Events

06/20/16
Touchscreens are how we interact with many digital products - smartphones, tablets, cars, point-of-purchase systems, and more. Yet the experience is incomplete because all we feel is lifeless glass. Tanvas has haptic touchscreen technology that provides the same finger-positioning data as today's screens but also delivers a rich touch experience that lets users physically interact with the digital world. The effect is based on electrostatics, or static cling: a small amount of capacitive coupling pulls the finger into the glass to create an increase in friction. By controlling the amount, frequency, and...
05/16/16
Alexa, the voice service that powers Amazon Echo, Echo Dot, Amazon Tap, and Amazon Fire TV, provides a set of built-in abilities, or skills, that enable users to interact with devices more intuitively using voice. Examples of these skills include the ability to play music, answer general questions, set an alarm or timer, and more. Users can access new skills simply by asking Alexa a question or giving a command. This event will be a walk-through of the latest Alexa Skills Kit (ASK) and will teach you how to build your own skills for Alexa-enabled devices. You will also learn...
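To give a feel for what "building a skill" involves ahead of the walk-through, here is a minimal sketch of a custom-skill handler. It assumes an AWS Lambda entry point and a hypothetical "HelloWorldIntent", and returns responses in the ASK custom-skill JSON format; the event itself may use the official ASK SDK instead.

```python
# Minimal sketch of a custom Alexa skill handler (illustrative only).
# Assumes an AWS Lambda entry point and a hypothetical "HelloWorldIntent";
# the response shape follows the Alexa Skills Kit custom-skill JSON format.

def build_response(speech_text, end_session=True):
    """Wrap plain-text speech in the ASK response envelope."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }

def lambda_handler(event, context):
    """Route launch and intent requests to simple spoken replies."""
    request = event["request"]
    if request["type"] == "LaunchRequest":
        return build_response("Welcome. Ask me to say hello.", end_session=False)
    if request["type"] == "IntentRequest" and request["intent"]["name"] == "HelloWorldIntent":
        return build_response("Hello from your first Alexa skill.")
    return build_response("Sorry, I didn't catch that.")
```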
04/18/16
With the explosion of mobile and IoT technology over the past few years, voice has become a primary interaction method for a growing and diverse range of apps and devices. Accompanying this growth is a shift to a self-service model of voice design and development, enabled by a host of new tools and platforms. Designers, developers, and product owners are finding themselves positioned to build their own cutting-edge natural language interfaces while learning the disciplines of voice design and development at the same time. In this discussion, Tanya Kraljic and Adam Emfield of Nuance will share...
03/07/16
Affective computing is a flourishing branch of the digital realm that recognizes, interprets, and even fosters human emotional interactions. While emotion is typically analyzed through facial imagery, Beyond Verbal has developed a specialized approach that evaluates the human voice. Extracting, decoding, and measuring emotions this way introduces a whole new dimension of emotional understanding, which they call voice-driven Emotions Analytics. It has the potential to transform the way we understand ourselves, our emotional well-being, our interactions with machines, and, most importantly, the way we communicate with...
02/22/16
We're kicking off 2016 (our 4th year!) with a new name and an event that takes a deep dive into Humanized User Interface™ (HUI™) - a more accurate description of the technologies typically called NUI. This presentation will explore the amazing advancements available today that enable apps and devices to interact with us the way we do with other people. We'll help attendees understand the difference between NUI and HUI, what exactly HUI is, where it's headed, how best to use it in projects and products, how to create amazing HUI-based user experiences, and touch on some of the...
