
ExtruFace

A gesture-controlled interface for multi-meter-long industrial extrusion machines

Web · HTML · JavaScript · jQuery · UX · UI · User Research · Prototyping · Development
Timeline 2015 to 2016
Role UX Design & Development
Status Research Project
In a Nutshell

Applied research project with a UI that dynamically adapts between near-field hand gestures and far-field body gestures based on operator distance. Informed by Fitts' Law analysis and operator workflow observation.

Overview

ExtruFace was an applied research project developing the concept and prototype of a gesture-controlled user interface for industrial extrusion machines. I was involved in all project phases: from requirements analysis through implementation to final acceptance.

The Challenge

Extrusion machines are multi-meter-long, heavy industrial machines used to form pipes. Operators sometimes need to control them from a distance and often wear protective gloves, which makes the existing touchscreen solution impractical. Key challenges included:

  • Adapting modern UX/UI principles to an industrial machinery context.
  • Designing for two fundamentally different interaction modes: near-field hand gestures and far-field body gestures.
  • Balancing gesture innovation with operator familiarity and safety requirements.
  • Ensuring reliability in production environments.

The Process

  • Requirements Analysis: Interviews with machine operators to understand workflows and identify which controls are needed at various distances.

  • Gesture Design: Findings from my bachelor's thesis significantly influenced the gesture vocabulary; far-mode gestures were designed in collaboration with machine operators.

  • Low-Fidelity Prototyping: Wireframes and flowcharts mapping the full interaction flow between near and far mode.

  • High-Fidelity Prototype: Functional web application demonstrating both interaction modes.

  • User Evaluation: Prototype testing in a laboratory setting with real operators.

The Solution

Key features designed and built:

  • Adaptive Interface: UI elements scale dynamically as the operator moves closer to or farther from the monitor, using Microsoft Kinect distance tracking (see the first sketch after this list).

  • Far Mode: Only the controls that are relevant at a distance are shown, a subset identified in the operator interviews; they are operated via body gestures (arm poses) recognized by the Microsoft Kinect.

  • Near Mode: Touchless hand-gesture control via a Leap Motion Controller for fine-grained interaction (see the Leap Motion sketch below).

  • Fitts' Law-Optimized Menus: Circular menus place every option at the same short distance from the hand cursor, keeping pointing times low for gesture input (see the circular-menu sketch below).

  • Touchscreen Fallback: Traditional touch control available as an alternative input method.
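
The distance-driven behaviour could be sketched roughly as below: a minimal JavaScript/jQuery illustration assuming a hypothetical WebSocket bridge that streams the Kinect's tracked operator distance. The endpoint, thresholds, and class names are placeholders, not values from the project.

```javascript
// Sketch: switch between near and far mode and rescale critical controls
// based on the operator's distance to the monitor.
// Assumes a hypothetical WebSocket bridge that forwards the Kinect skeleton
// distance in metres as JSON, e.g. {"distance": 2.4}.
var FAR_THRESHOLD = 1.5; // metres; beyond this the UI switches to far mode
var MIN_SCALE = 1.0;     // element scale at or below the threshold
var MAX_SCALE = 2.5;     // element scale at the far end of the assumed tracking range

var socket = new WebSocket('ws://localhost:8181/kinect'); // assumed bridge endpoint

socket.onmessage = function (event) {
  var distance = JSON.parse(event.data).distance;

  // Toggle the two control layouts via CSS classes on the body.
  $('body').toggleClass('far-mode', distance > FAR_THRESHOLD)
           .toggleClass('near-mode', distance <= FAR_THRESHOLD);

  // Scale distance-relevant controls linearly so they stay legible from afar.
  var t = Math.min(Math.max((distance - FAR_THRESHOLD) / 3.0, 0), 1);
  var scale = MIN_SCALE + t * (MAX_SCALE - MIN_SCALE);
  $('.critical-control').css('transform', 'scale(' + scale + ')');
};
```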
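
Near mode could be driven by the Leap Motion JavaScript API (leapjs), which reports recognized gestures per tracking frame. The mapping below is purely illustrative and the helper functions are hypothetical; the actual gesture vocabulary came from the thesis findings and the operator collaboration described above.

```javascript
// Sketch: react to completed Leap Motion gestures in near mode.
Leap.loop({ enableGestures: true }, function (frame) {
  frame.gestures.forEach(function (gesture) {
    if (gesture.state !== 'stop') return; // act only on completed gestures

    if (gesture.type === 'swipe') {
      // Horizontal swipes page through parameter screens (hypothetical helpers).
      var horizontal = Math.abs(gesture.direction[0]) > Math.abs(gesture.direction[1]);
      if (horizontal) {
        if (gesture.direction[0] > 0) { showNextScreen(); } else { showPreviousScreen(); }
      }
    } else if (gesture.type === 'keyTap') {
      confirmSelectedControl(); // hypothetical helper: confirm the highlighted option
    }
  });
});
```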
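
The Fitts' Law reasoning behind the circular menus: pointing time grows with the distance to a target, so placing every option on a ring around the hand cursor keeps that distance short and identical for all options. A rough layout sketch with illustrative selectors and radius:

```javascript
// Sketch: distribute menu items evenly on a circle inside an absolutely
// positioned menu container centred on the gesture cursor.
function layoutCircularMenu($menu, radius) {
  var $items = $menu.find('.menu-item');
  var step = (2 * Math.PI) / $items.length;

  $items.each(function (i) {
    var angle = i * step - Math.PI / 2; // first item at the top, then clockwise
    $(this).css({
      position: 'absolute',
      left: 'calc(50% + ' + radius * Math.cos(angle) + 'px)',
      top: 'calc(50% + ' + radius * Math.sin(angle) + 'px)',
      transform: 'translate(-50%, -50%)' // centre each item on its point
    });
  });
}

// Example: an 80 px-radius menu opened at the cursor position.
layoutCircularMenu($('#context-menu'), 80);
```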

Results & Impact

  • Delivered a functional prototype that served as a source of ideas for the further development of the existing system.
  • Demonstrated that touchless gesture control is viable for industrial machine operation.
  • The adaptive near/far mode concept addressed the core problem: operating the machinery from varying distances while wearing protective equipment.
