Carnegie Mellon University

Designing Real-Time, Adaptive Learning Environments for Origami

thesis
posted on 2025-06-03, 18:12 authored by Yiming Jiao

Origami can be intuitive, yet traditional tutorials often fail to convey spatial and sequential steps clearly—especially in complex patterns. This thesis presents an interactive origami learning system that integrates a real-time camera, hand gesture recognition, and projection-based feedback to support embodied and responsive folding experiences.

The system operates in three distinct modes: Create Mode allows users to construct custom crease patterns through freeform folding and gesture confirmation; Learner Mode provides step-by-step projection of predefined models with real-time feedback on fold accuracy; and Auto Mode explores data-driven inference by matching user folds to known patterns and suggesting possible next steps.
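The three modes could be organized as a simple mode dispatcher. The sketch below is illustrative only (the thesis does not publish its code); the function and parameter names are hypothetical, and it assumes creases are matched by identifier:

```python
from enum import Enum, auto

class Mode(Enum):
    CREATE = auto()   # freeform folding, gesture-confirmed creases
    LEARNER = auto()  # step-by-step projection with accuracy feedback
    AUTO = auto()     # match observed folds to known patterns

def handle_fold(mode, observed_crease, known_patterns):
    """Hypothetical dispatcher: route a detected fold to mode-specific logic.

    known_patterns is assumed to be a list of crease-id collections,
    one per predefined model.
    """
    if mode is Mode.CREATE:
        # record the user's crease into the custom pattern
        return f"record {observed_crease}"
    if mode is Mode.LEARNER:
        # compare the detected fold against the current tutorial step
        return f"check {observed_crease} against current step"
    # AUTO: suggest a next step from the best-matching known pattern
    match = next((p for p in known_patterns if observed_crease in p), None)
    return f"suggest from {match}" if match else "no match"
```

A projection loop would call a dispatcher like this once per detected fold and render the returned instruction onto the paper.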

Technically, the system tracks paper geometry and hand movement using computer vision, builds a dynamic crease graph, and projects instructions and feedback directly onto the paper surface. Users interact entirely through hand gestures, enabling continuous focus on the material without additional devices.
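A dynamic crease graph of the kind described above might be represented as vertices (points on the sheet) connected by edges labeled mountain or valley. This is a minimal sketch under that assumption, not the thesis's actual data structure:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Vertex:
    """A tracked point on the paper surface, in sheet coordinates."""
    x: float
    y: float

@dataclass
class CreaseGraph:
    """Dynamic crease graph: edges map vertex pairs to a
    mountain ('M') or valley ('V') assignment."""
    edges: dict = field(default_factory=dict)

    def add_crease(self, a: Vertex, b: Vertex, assignment: str) -> None:
        assert assignment in ('M', 'V')
        self.edges[(a, b)] = assignment

    def creases_at(self, v: Vertex):
        """All creases incident to vertex v."""
        return [(e, kind) for e, kind in self.edges.items() if v in e]

# Example: two creases meeting at a shared vertex
g = CreaseGraph()
p0, p1, p2 = Vertex(0, 0), Vertex(1, 0), Vertex(1, 1)
g.add_crease(p0, p1, 'M')
g.add_crease(p1, p2, 'V')
print(len(g.creases_at(p1)))  # prints 2
```

Updating such a graph as new folds are detected lets the projector overlay each crease with its assignment directly on the sheet.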

A Miura-ori pattern was used as a case study to demonstrate the system’s capabilities in both Create and Learner Modes. The results show that the system successfully supports fold tracking, feedback projection, and gesture-driven interaction, while revealing design challenges related to fold detection resolution, gesture robustness, and projection alignment. This work contributes a novel framework for embodied origami learning and offers insights into the integration of computational interaction with craft-based practices.
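For readers unfamiliar with the case-study pattern: Miura-ori tessellates a sheet with parallelograms, formed by zigzag horizontal crease rows (each entirely mountain or entirely valley, alternating row by row) crossed by near-vertical creases. A simplified generator is sketched below; the zigzag is approximated by offsetting alternate vertex rows, and the mountain/valley assignment of the vertical creases is illustrative rather than a verified flat-foldable one:

```python
def miura_creases(m, n, w=1.0, h=1.0, offset=0.3):
    """Sketch of a Miura-ori-style crease pattern on an m x n cell grid.

    Vertices: v[i][j] = (j*w + (i % 2)*offset, i*h), i.e. alternate rows
    are shifted to approximate the zigzag. Returns a list of
    ((x1, y1), (x2, y2), kind) tuples with kind in {'M', 'V'}.
    """
    v = [[(j * w + (i % 2) * offset, i * h) for j in range(n + 1)]
         for i in range(m + 1)]
    creases = []
    # Interior horizontal rows: each whole row is mountain or valley,
    # alternating from row to row.
    for i in range(1, m):
        kind = 'M' if i % 2 else 'V'
        for j in range(n):
            creases.append((v[i][j], v[i][j + 1], kind))
    # Interior vertical creases: assignment alternates along each row
    # (illustrative choice, not checked for flat-foldability).
    for i in range(m):
        for j in range(1, n):
            creases.append((v[i][j], v[i + 1][j], 'M' if (i + j) % 2 else 'V'))
    return creases

# A 2 x 3 grid yields 3 interior horizontal + 4 interior vertical creases
print(len(miura_creases(2, 3)))  # prints 7
```

A pattern generated this way could seed the Learner Mode step sequence, with each crease projected in order onto the sheet.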

History

Date

2025-05-09

Degree Type

  • Master's Thesis

Department

  • Architecture

Degree Name

  • Master of Science in Computational Design (MSCD)

Advisor(s)

Daragh Byrne
