Carnegie Mellon University

OmniSense: A Collaborative Sensing Framework for User Context Recognition Using Mobile Phones

Journal contribution, posted on 2010-01-01, authored by Heng-Tze Cheng, Senaka Buthpitiya, Feng-Tso Sun, Martin L. Griss
Context information, including a user's locations and activities, is indispensable for context-aware applications such as targeted advertising and disaster response. Inferring user context from sensor data is intrinsically challenging because of the semantic gap between low-level signals and high-level human activities, and implementing recognition on mobile phones adds further challenges from limited resources. While most existing work focuses on context recognition with a single mobile phone, collaboration among multiple phones has received little attention, and single-phone recognition accuracy is susceptible to changes in phone position and ambient conditions. Simply putting a phone in one's pocket can muffle the microphone and render the camera useless. Furthermore, the naïve statistical learning methods used in prior work are insufficient to model the relationship between locations and activities.

History

Date: 2010-01-01
