Dynamic User Input:
Interfacing Eye-Hand Behavior during
Motion Sensing/Motion Capture
Nathaniel Bobbitt
ac551@rgfn.epcc.edu
Simon Fraser University
(Sponsored by School of Kinesiology Seminar Series)
Input-Based Aspects of Sensorimotor Behavior:
Aiming
[Feed Through]......[Particle Systems]......[GUI for Time Motion Study]
Note: Links Require CD-ROM
Photographic and Cinematic Media Obstacles to Time Motion Studies
Frame Based Study of Articulated Motion
Linear Sequence
Cel Animation or Biological Motion?
Concentricity and Biological Motion
Aiming:
Visualization of a Pouring Motion
Concentricity of Sensorimotor Behavior
Aiming (With Grabbing):
Get & Grasp-Hold-Relocate/Rotate
Performer-Handled Object-Receptacle
Media Production and Interface Implementations
Dance Related Interfaces
Dancer Controlled Object
Mouse-Related
Eye-Hand Motion with Color Feedback:
From Grief Project on Teenage High School Cafeteria Shooting
Grief and Virtual Tactile Therapy
Emotional Processing and Eye-Hand Behavior
Healing Interface
On Hand Trace
Dynamic User Input
Concentricity and Cross Modal Behavior
Dynamic User INPUT
1. Slide the Mouse to the RIGHT of the STARS
2. Move the Mouse along the TOP of the Box
3. Pause the Mouse Movement
If the Background is Black, Start Here ***
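A minimal sketch of how such a mouse-sensitive, color-feedback interface could be built is given below. The region names, coordinates, and pause threshold are assumptions for illustration, not values taken from the original interface.

// Minimal sketch (assumed layout): track the pointer over two sensitive
// regions -- to the right of the stars, and along the top edge of a box --
// and turn the background black once the pointer pauses inside a region.

interface Region { name: string; x: number; y: number; w: number; h: number; }

// Region coordinates are placeholders, not taken from the original interface.
const regions: Region[] = [
  { name: "right-of-stars", x: 320, y: 40,  w: 120, h: 60 },
  { name: "box-top-edge",   x: 100, y: 200, w: 240, h: 12 },
];

const PAUSE_MS = 400;          // assumed "pause" threshold
let pauseTimer: number | undefined;

function regionAt(x: number, y: number): Region | undefined {
  return regions.find(r => x >= r.x && x <= r.x + r.w && y >= r.y && y <= r.y + r.h);
}

document.addEventListener("mousemove", (e: MouseEvent) => {
  // Any movement cancels a pending "pause" so only a genuine stop counts.
  if (pauseTimer !== undefined) window.clearTimeout(pauseTimer);
  const hit = regionAt(e.clientX, e.clientY);
  if (hit) {
    // Pointer is over a sensitive area: arm the pause detector.
    pauseTimer = window.setTimeout(() => {
      document.body.style.background = "black";   // the "start" cue
    }, PAUSE_MS);
  }
});

A pause over a sensitive area is what arms the color change; any further movement cancels it, so only a genuine stop produces the black background.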
Media Eye-Hand Controls
This interface combines eye-hand motion (tracked through mouse movement). The user input in this interface consists of:
Aiming of Mouse
Vision
The challenge in this interface is to integrate the following (a sketch of these channels follows the list):
Hand
Eye (Vision)
Mousepad Reference
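One way to make these channels concrete is to log each hand (mouse) sample together with an annotation of where the eye is presumed to be allocated and which reference frame is in play at that moment. The sketch below is only an illustrative data model; the type names, labels, and example values are assumptions rather than part of the original interface.

// Illustrative data model (assumed names): one sample of dynamic user input,
// pairing the hand channel (mouse) with a presumed eye-focus label and the
// reference frame in play at that moment.

type EyeFocus = "on-cursor" | "on-target" | "off-screen";     // assumed labels
type Reference = "screen" | "mousepad" | "layered";           // assumed labels

interface InputSample {
  t: number;          // timestamp (ms)
  handX: number;      // mouse position: the hand channel
  handY: number;
  eye: EyeFocus;      // where the eye is presumed to be allocated
  reference: Reference;
}

// Example: eye focus splits away from the cursor while the hand keeps moving.
const trace: InputSample[] = [
  { t: 0,   handX: 110, handY: 210, eye: "on-cursor",  reference: "screen" },
  { t: 120, handX: 140, handY: 208, eye: "on-target",  reference: "layered" },
  { t: 240, handX: 171, handY: 205, eye: "off-screen", reference: "mousepad" },
];

console.log(`samples with split eye focus: ${trace.filter(s => s.eye !== "on-cursor").length}`);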
Standard eye-hand exchanges at a computer superimpose eye (focus), hand, and reference:
My interface shows cases in which user input is more dynamic due to:
Splitting of Eye (Focus)
Requirement of Hand Mobility
Layering of Reference (Eye-Hand, Feedback_Outcome)
Direct: Haptic Design Implications
User input is dynamic in this interface given the real-time interaction of mouse movement over sensitive areas (On-Screen, Mousepad Texture) and the layering or splitting of eye/hand resources during the movement of the mouse.
Layering of eye-hand motion leads to mouse drift and reliance on semi-automatic (intuitive) options. [See: Visual Interface] This interface integrates:
Layering of Eye-Hand Motion
Invisibility of the Target (Mouse-Sensitive) Areas
This interface is also useful for considering aiming gestures based on eye-hand coordination. The screen episodes in all of the cases considered here are Aiming without Grabbing. As part of a New Media residency at the Banff Centre for the Arts, an Aiming with Grabbing prototype was designed.
Aiming without Grabbing is useful because it lets us consider the phenomenon of mouse drift that arises from layering the user's allocation of eye or motoric focus.
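Mouse drift of this kind can be quantified directly from a captured trace: collect the cursor samples recorded during an interval in which the user is supposed to be pausing (eye focus allocated elsewhere) and measure how far the cursor wandered anyway. The function below is a minimal sketch under that assumption; the sample format and the path-length measure are illustrative choices, not the project's implementation.

// Minimal sketch: quantify mouse drift over an interval in which no
// intentional movement is expected (e.g. while the user pauses and eye
// focus is presumed to be allocated elsewhere).

interface Sample { t: number; x: number; y: number; }   // timestamped cursor sample

function driftDuring(trace: Sample[], t0: number, t1: number): number {
  // Keep only samples that fall inside the pause interval [t0, t1].
  const pause = trace.filter(s => s.t >= t0 && s.t <= t1);
  if (pause.length < 2) return 0;

  // Drift = total path length the cursor wandered while it should have been still.
  let drift = 0;
  for (let i = 1; i < pause.length; i++) {
    drift += Math.hypot(pause[i].x - pause[i - 1].x, pause[i].y - pause[i - 1].y);
  }
  return drift;
}

// Example: roughly 7.6 px of drift accumulated during a 300 ms pause.
const pauseTrace: Sample[] = [
  { t: 1000, x: 200, y: 150 },
  { t: 1100, x: 203, y: 151 },
  { t: 1300, x: 199, y: 153 },
];
console.log(driftDuring(pauseTrace, 1000, 1300).toFixed(1), "px of drift");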
Before moving on, first try the interface to the right. [When using the interface, have the browser at the maximum size possible.]
Interface Hidden Color Setup:
Eye-Hand Interface
Examples of Dynamic Input and Mouse Movement
Condition                     Eye-Hand      Hand          Eye Focus         Reference
Conventional Mouse Movement   Synchronized  -             Overlap (Hand)    On Screen Only
Stretched Screen Vision       Mobile        Mobile        Split             Split
Screen-Off Screen Vision      Fixed         Slight Drift  Split             Layered (Eye)
Layered Eye-Hand Activity     Layered       Mobile        Split and Mobile  Layered (Eye, Hand)
Eye-Hand Interface 1
Eye-Hand Interface 2
Eye-Hand Interface 3
I. Transmission vs. Reception
Transmission:
Excitation (Allocation)
Control
Input (Energy)
Reception:
Meaning
Symbolism
Image
II. Sensorimotor Behavior and Excitation
[Song and Respiratory Feed]
Dissipation/Replenishment
Guidelines for Processing Articulatory Behavior
[Isolate] Engagement of Sensory Resources
[Amplify] Visualization of Intangible Performance Behavior
[Explore] Redirection or Filtration of Streams of Behavior
Refinement of Tasked Behavior:
[Grasp]
[Hold]
[Sort]
[Pour]
[Unavoidable Delay]
[Getting Ready]
Advantages of Excitation/Performance Model
Performer Modulation:
- Layering of Performer Energy with Performer Articulated Activity
Example: Exchange of Respiration and Beating Spectra
- Multi-Resolution: Feed and Incremental Task Realization
On Physical Playback
- Dynamic User Input [Passive Capture of Hand Motion and Decision-Making]
Temporal Interaction:
[Performer, Material Object]......[Brief Decay Rates or Replenished Excitation]
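Read as a model, "Brief Decay Rates or Replenished Excitation" suggests an excitation value that is topped up whenever hand motion is passively captured and that dies away between events. The class below is one minimal interpretation of that idea; the half-life and the per-event increment are assumed values, not figures from the project.

// Minimal interpretation (assumed constants): excitation replenished by
// passively captured hand-motion events and dissipating exponentially
// between them -- "brief decay rates or replenished excitation".

class Excitation {
  private level = 0;
  private lastT = 0;

  constructor(
    private readonly halfLifeMs = 250,   // assumed brief decay rate
    private readonly perEvent = 1.0,     // assumed replenishment per captured event
  ) {}

  // Decay the stored level forward to time t, then return it.
  valueAt(t: number): number {
    const dt = t - this.lastT;
    this.level *= Math.pow(0.5, dt / this.halfLifeMs);
    this.lastT = t;
    return this.level;
  }

  // A captured hand-motion event at time t replenishes the excitation.
  replenish(t: number): void {
    this.valueAt(t);               // dissipate up to the event first
    this.level += this.perEvent;
  }
}

// Example: two quick events keep excitation up; a long gap lets it dissipate.
const e = new Excitation();
e.replenish(0);
e.replenish(100);
console.log(e.valueAt(200).toFixed(2));   // still elevated (~1.33)
console.log(e.valueAt(2000).toFixed(2));  // largely dissipated (~0.01)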
Development of Amplifiable Visual Content:
- Zoom-In Capable
- Rewrite Visual Systems [Meyer and Marriott 1997]
Shaping Feed: Insert or Delete Warping (Time Warping)
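Insert-or-delete warping of a feed can be illustrated as resampling: stretching a recorded feed inserts interpolated samples, while compressing it deletes samples. The function below is a small sketch of that operation, with linear interpolation as an assumed choice; it is not the project's implementation.

// Small sketch: warp a recorded feed in time by a factor -- factor > 1
// inserts (interpolated) samples to stretch the feed, factor < 1 deletes
// samples to compress it. Linear interpolation is an assumed choice.

function warpFeed(feed: number[], factor: number): number[] {
  if (feed.length === 0 || factor <= 0) return [];
  const outLen = Math.max(1, Math.round(feed.length * factor));
  const out: number[] = [];
  for (let i = 0; i < outLen; i++) {
    // Map each output index back onto the original feed's index space.
    const pos = (i * (feed.length - 1)) / Math.max(1, outLen - 1);
    const lo = Math.floor(pos);
    const hi = Math.min(feed.length - 1, lo + 1);
    const frac = pos - lo;
    out.push(feed[lo] * (1 - frac) + feed[hi] * frac);   // insert by interpolation
  }
  return out;
}

// Example: a short pouring-motion trace stretched to twice its length,
// then compressed to roughly half.
const feedTrace = [0, 2, 5, 9, 14];
console.log(warpFeed(feedTrace, 2));    // 10 samples, values interpolated
console.log(warpFeed(feedTrace, 0.5));  // 3 samples: [0, 5, 14]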
COPYRIGHT 1998 NATHANIEL BOBBITT