#GestureUI
voiceandgesture · 11 years ago
More than touch: understanding how people use skin as an input surface for mobile computing

Abstract: This paper contributes results from an empirical study of on-skin input, an emerging technique for controlling mobile devices. Skin is fundamentally different from off-body touch surfaces, opening up a new and largely unexplored interaction space. We investigate characteristics of the various skin-specific input modalities, analyze what kinds of gestures are performed on skin, and study which input locations are preferred. Our main findings show that (1) users intuitively leverage the properties of skin for a wide range of more expressive commands than on conventional touch surfaces; (2) established multi-touch gestures can be transferred to on-skin input; (3) physically uncomfortable modalities are deliberately used for irreversible commands and for expressing negative emotions; and (4) the forearm and the hand are the most preferred locations on the upper limb for on-skin input. We detail users' mental models and contribute a first consolidated set of on-skin gestures. Our findings provide guidance for developers of future sensors as well as for designers of future applications of on-skin input.
voiceandgesture · 11 years ago
Still pushing pixels by hand? You shouldn't be. Imagine combining this technology (demonstrated in 1979!) with the subvocal speech recognition tools developed by NASA. Then imagine repetitive motion injuries disappearing.

“Put-that-there”: Voice and gesture at the graphics interface by Richard A. Bolt, MIT Media Lab (PDF)
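Bolt's central mechanism is easy to sketch in code: each spoken word carries a timestamp, and the deictic words "that" and "there" are resolved against wherever the user was pointing at the moment the word was uttered. A minimal illustration of that fusion idea (all names and data below are invented for this sketch, not from the paper):

```python
# Sketch of "Put-that-there"-style deictic resolution: pronouns in the
# speech stream are bound to the object or location being pointed at
# when the word was spoken.

def resolve_command(tokens, pointer_track, objects):
    """tokens: list of (word, t); pointer_track: dict t -> (x, y);
    objects: dict name -> (x, y). Returns (object_name, target_xy)."""
    def nearest_object(xy):
        return min(objects, key=lambda n: (objects[n][0] - xy[0]) ** 2 +
                                          (objects[n][1] - xy[1]) ** 2)
    subject, target = None, None
    for word, t in tokens:
        if word == "that":          # bind pronoun to the pointed-at object
            subject = nearest_object(pointer_track[t])
        elif word == "there":       # bind pronoun to the pointed-at location
            target = pointer_track[t]
    return subject, target

# "Put that there": at t=1 the user points near the circle, at t=2 at (8, 3)
tokens = [("put", 0), ("that", 1), ("there", 2)]
pointer = {0: (0.0, 0.0), 1: (2.1, 2.0), 2: (8.0, 3.0)}
objects = {"circle": (2.0, 2.0), "square": (5.0, 5.0)}
print(resolve_command(tokens, pointer, objects))  # ('circle', (8.0, 3.0))
```

The speech channel carries the verb and the structure; the gesture channel fills in the referents. That division of labor is why the combination beats either modality alone.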
voiceandgesture · 11 years ago
Consumed endurance: a metric to quantify arm fatigue of mid-air interactions

Abstract: Mid-air interactions are prone to fatigue and lead to a feeling of heaviness in the upper limbs, a condition casually termed the gorilla-arm effect. Designers have often associated limitations of their mid-air interactions with arm fatigue, but do not possess a quantitative method to assess and therefore mitigate it. In this paper we propose a novel metric, Consumed Endurance (CE), derived from the biomechanical structure of the upper arm and aimed at characterizing the gorilla-arm effect. We present a method to capture CE in a non-intrusive manner using an off-the-shelf camera-based skeleton tracking system, and demonstrate that CE correlates strongly with the Borg CR10 scale of perceived exertion. We show how designers can use CE as a complementary metric for evaluating existing and designing novel mid-air interactions, including tasks with repetitive input such as mid-air text-entry. Finally, we propose a series of guidelines for the design of fatigue-efficient mid-air interfaces.
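The shape of the CE metric is simple: estimate how long a given arm posture can be sustained, then express interaction time as a fraction of that endurance time. A rough sketch of the idea, using a static gravitational torque model and a Rohmert-style endurance curve; all constants here are illustrative stand-ins, and the paper's actual biomechanical model and calibration differ:

```python
import math

# Hedged sketch of a Consumed-Endurance-style arm-fatigue metric.
ARM_MASS_KG = 3.5           # assumed mass of the whole arm
ARM_COM_M = 0.3             # assumed distance to the arm's center of mass
MAX_SHOULDER_TORQUE = 40.0  # assumed maximum shoulder torque, N*m
G = 9.81

def shoulder_torque(elevation_deg):
    """Static gravitational torque for an extended arm held at the given
    elevation above the resting (vertical) position."""
    return ARM_MASS_KG * G * ARM_COM_M * math.sin(math.radians(elevation_deg))

def endurance_seconds(torque):
    """Rohmert-style endurance curve: how long a contraction at a given
    fraction of maximum strength can be held (illustrative parameters)."""
    frac = 100.0 * torque / MAX_SHOULDER_TORQUE
    if frac <= 15.0:  # below ~15% of max strength, treat endurance as unbounded
        return float("inf")
    return 1236.5 / (frac - 15.0) ** 0.618 - 72.5

def consumed_endurance(elevation_deg, interaction_s):
    """CE = interacting time / endurance time, as a percentage."""
    return 100.0 * interaction_s / endurance_seconds(shoulder_torque(elevation_deg))

# A 60 s task with the arm raised to horizontal consumes far more endurance
# than the same task with the arm held low at 30 degrees.
print(consumed_endurance(90, 60), consumed_endurance(30, 60))
```

Even this toy version makes the design lesson visible: lowering the interaction zone toward the relaxed arm position drives the torque fraction below the endurance threshold and the fatigue cost collapses.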
voiceandgesture · 11 years ago
MouStress: Detecting stress from mouse motion

Abstract: Stress causes and exacerbates many physiological and mental health problems. Routine and unobtrusive monitoring of stress would enable a variety of treatments, from break-taking to calming exercises. It may also be a valuable tool for assessing effects (frustration, difficulty) of using interfaces or applications. Custom sensing hardware is a poor option, because of the need to buy/wear/use it continuously, even before stress-related problems are evident. Here we explore stress measurement from common computer mouse operations. We use a simple model of arm-hand dynamics that captures muscle stiffness during mouse movement. We show that the within-subject mouse-derived stress measure is quite strong, even compared to concurrent physiological sensor measurements. While our study used fixed mouse tasks, the stress signal was still strong even when averaged across widely varying task geometries. We argue that mouse sensing "in the wild" may be feasible, by analyzing frequently-performed operations of particular geometries.
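The core intuition needs very little machinery: a stiffer (more tense) arm behaves like a stiffer spring, so the pointer's small settling oscillations around a target are faster, and stiffness can be read off the oscillation frequency via k = m·ω². A toy sketch along those lines; the mass value and the zero-crossing heuristic are invented for illustration, and the paper's actual dynamics model is more careful:

```python
import math

# Toy MouStress-style stiffness estimate: treat hand + mouse as a lightly
# damped mass-spring and recover stiffness from settling-oscillation frequency.
HAND_MASS_KG = 0.5  # assumed effective mass of hand + mouse

def estimate_stiffness(positions, dt):
    """positions: 1-D pointer displacement samples around the target (m);
    dt: sampling interval (s). Returns stiffness in N/m, or None if no
    oscillation is detected."""
    # zero crossings of the displacement mark half-periods of the oscillation
    crossings = [i for i in range(1, len(positions))
                 if positions[i - 1] * positions[i] < 0]
    if len(crossings) < 2:
        return None
    half_period = (crossings[-1] - crossings[0]) * dt / (len(crossings) - 1)
    omega = math.pi / half_period       # rad/s
    return HAND_MASS_KG * omega ** 2    # k = m * omega^2

# Synthetic settling motion: a 10 Hz decaying oscillation (stiff, tense
# movement) versus a 4 Hz one (relaxed movement).
def synth(freq_hz, dt=0.005, n=200):
    return [math.exp(-3 * i * dt) * math.sin(2 * math.pi * freq_hz * i * dt)
            for i in range(n)]

stiff = estimate_stiffness(synth(10.0), 0.005)
relaxed = estimate_stiffness(synth(4.0), 0.005)
print(stiff, relaxed)  # the 10 Hz motion yields a much larger stiffness
```

Because the estimate depends only on the shape of each movement, not on what the user was doing, it is plausible to compute it opportunistically "in the wild", as the authors suggest.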
voiceandgesture · 11 years ago
FingerSense: A way to think with your hands
What if we stopped creating new gestures and started using more body parts? FingerSense from Qeexo can tell the difference between your finger pad, fingernail, and knuckle, and can differentiate between the ends of a stylus without using Bluetooth the way Pencil does.
FingerSense Overview (by Qeexo)
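Under the hood, this kind of system is a classifier over the vibro-acoustic signature of each tap: a fingernail strike is sharp and high-frequency, a soft pad strike is damped and low-frequency. A toy nearest-centroid sketch of that idea; the features and all numeric values below are made up for illustration, since Qeexo's real pipeline is proprietary:

```python
# Toy FingerSense-style touch-type classifier over two invented
# vibro-acoustic features of a single tap.
# (spectral_centroid_hz, impact_duration_ms) centroid per touch type
CENTROIDS = {
    "pad":     (400.0, 12.0),   # soft, damped impact: low frequency, long
    "nail":    (2500.0, 3.0),   # hard, sharp impact: high frequency, short
    "knuckle": (1200.0, 6.0),
}

def classify_touch(spectral_centroid_hz, impact_duration_ms):
    """Nearest-centroid classification of a single tap."""
    def dist(c):
        ref = CENTROIDS[c]
        # scale the frequency axis so both features contribute comparably
        return ((ref[0] - spectral_centroid_hz) / 100.0) ** 2 + \
               (ref[1] - impact_duration_ms) ** 2
    return min(CENTROIDS, key=dist)

print(classify_touch(2300.0, 4.0))   # -> nail
print(classify_touch(500.0, 10.0))   # -> pad
```

The appeal for designers is that each touch type becomes a free extra input channel on hardware users already own, with no new gesture vocabulary to learn.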
voiceandgesture · 11 years ago
In dance, there is a practice called "marking". When dancers mark, they execute a dance phrase in a simplified, schematic or abstracted form. Based on our interviews with professional dancers in the classical, modern, and contemporary traditions, it is fair to assume that most dancers mark in the normal course of rehearsal and practice. When marking, dancers use their body-in-motion to represent some aspect of the full-out phrase they are thinking about. Their stated reason for marking is that it saves energy, avoids strenuous movement such as jumps, and sometimes facilitates review of specific aspects of a phrase, such as tempo, movement sequence, or intention, all without the mental and physical complexity involved in creating a phrase full-out. It facilitates real-time reflection.
If you're a designer of gestural interfaces, you likely spend a lot of time marking, using your body to think about the interactions you have with a screen or camera or any machine space. This paper can give you a framework for thinking about marking and argues for the importance of going beyond 'imagining' interaction.
The opportunity to reconceptualize things is something that mental simulation does not offer. It is a major reason why "externalizing" what is in mind is a more powerful strategy than working with things in the mind alone.
voiceandgesture · 11 years ago
Imagine interactive electronic books that never need batteries or charging.
... a new energy harvesting technology that generates electrical energy from a user's interactions with paper-like materials. The energy harvesters are flexible, light, and inexpensive, and they utilize a user's gestures such as tapping, touching, rubbing and sliding to generate energy. The harvested energy is then used to actuate LEDs, e-paper displays and other devices to create interactive applications for books and other printed media.
Paper Generators: Harvesting Energy from Touching, Rubbing & Sliding (by DisneyResearchHub)
voiceandgesture · 11 years ago
“I know that this isn’t an animal,” said Pierre Carter, 62, smiling down at the robot he calls Fluffy. “But it brings out natural feelings.”
voiceandgesture · 11 years ago
D4 is Xbox One's Weirdest Game, and That's Awesome
What feelings are triggered when you wash your face at a sink? Do you wake up a bit? Do you feel refreshed? Are you ready to face a challenge?
The designers of D4: Dark Dreams Don't Die, an episodic adventure game, want players to feel empathy during gameplay, and they created a world best suited to exploration via Kinect. The game features plenty of punching and some gruesome comedy, but it is a gem for the interactions incorporated into the story.
voiceandgesture · 11 years ago
A great listen (it's 6 minutes long), this interview with Melissa Wagner is a good introduction to the sorts of gestures available to us as we design new interaction methods.
Buy a copy of the book super-cheap on Amazon
voiceandgesture · 11 years ago
Human interface guidelines specific to Kinect. The site also includes information about speech, skeletal tracking, and data streams.
Direct download for PDF
voiceandgesture · 11 years ago
Founded in 2002, the International Society for Gesture Studies (ISGS) is the only international scholarly association devoted to the study of human gesture.
International Society for Gesture Studies (ISGS)
voiceandgesture · 11 years ago
The design principles for Android Wear are a great place to start thinking about wearable gestural UIs and how to offer user support and adapt to user needs.
voiceandgesture · 11 years ago
iOS 7 Head Gesture Controls Hands-On
voiceandgesture · 11 years ago
The Moto X "Lazy Phone" ad shows the "Quick Capture" gesture, which wakes the phone and launches the camera app with two flicks of the wrist and a tap.
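That kind of wrist-flick gesture is straightforward to prototype from gyroscope data: two quick twists appear as alternating-sign spikes on the roll axis. A hedged sketch with illustrative thresholds and window size; Motorola's production detector runs on a dedicated low-power sensor core and is certainly more robust:

```python
# Sketch of a Quick-Capture-style double-wrist-twist detector over
# gyroscope roll-rate samples. All constants are illustrative.
FLICK_THRESHOLD = 5.0   # rad/s of roll rate counted as a "flick"
WINDOW_SAMPLES = 50     # samples in which the whole gesture must complete

def detect_double_twist(roll_rates):
    """roll_rates: gyroscope roll-axis samples. True if the window contains
    at least 4 alternating-sign spikes above the threshold (twist-out,
    twist-back, twice), i.e. two full wrist flicks."""
    spikes = []
    for r in roll_rates[-WINDOW_SAMPLES:]:
        if abs(r) > FLICK_THRESHOLD:
            sign = 1 if r > 0 else -1
            if not spikes or spikes[-1] != sign:   # count alternations only
                spikes.append(sign)
    return len(spikes) >= 4

calm = [0.2, -0.1, 0.3] * 10
twist = [0, 8, 0, -8, 0, 8, 0, -8, 0]   # out-back, out-back
print(detect_double_twist(calm), detect_double_twist(twist))  # False True
```

Requiring the spikes to alternate in sign is what keeps ordinary arm swings, which spin the wrist in one direction, from triggering the camera.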