Poster in Affinity Workshop: WiML Workshop 1
Exploiting Hyperdimensional Computing and Probabilistic Inference for Reasoning Across Levels of Abstraction in Dynamic Biosignal-Based Applications
Laura Isabel Galindez Olascoaga
Hyperdimensional computing (HDC) has recently emerged as a well-suited approach for efficient biosignal processing and classification. Inspired by the observation that the brain's computations rely on massive circuits of neurons and synapses, HDC operates on pseudo-random hypervectors (HVs) and a small set of well-defined arithmetic operations. These traits have been most notably exploited in electromyogram (EMG)-based hand gesture classification, often used for prosthetic interfaces. Previous works have shown that HDC can classify hand gestures with over 90% accuracy by projecting the EMG signals onto >1000-dimensional bipolar HVs that represent their spatiotemporal properties, and by performing nearest-neighbor search over prototype class HVs learned from data. However, to extend the functionality of HDC beyond static input-output mappings, representation and reasoning capabilities at higher levels of abstraction are needed. We propose a hybrid scheme that hierarchically represents the application's different levels of abstraction: at the lowest level, it relies on HDC and other machine learning models to encode and classify spatiotemporal features from biosensors; at the highest level, it relies on a Dynamic Bayesian Network (DBN) to probabilistically encode the temporal relations between user intent and the information provided by the layer below.
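As a point of reference for the low-level HDC pipeline described above, the following is a minimal Python sketch of projection-based hypervector classification, assuming a random bipolar projection encoder, class prototypes formed by bundling (summing and binarizing) training HVs, and nearest-neighbor search by cosine similarity. The dimensionality, feature sizes, and random stand-in data are illustrative and do not reproduce the spatiotemporal EMG encoder used in the referenced works.

    import numpy as np

    D = 10000                                   # hypervector dimensionality (illustrative)
    rng = np.random.default_rng(0)

    def encode(x, proj):
        """Project a real-valued feature vector onto a bipolar {-1, +1} hypervector."""
        return np.sign(proj @ x)

    def train_prototypes(X, y, proj, n_classes):
        """Bundle training HVs per class (sum, then binarize) into prototype HVs."""
        protos = np.zeros((n_classes, D))
        for x, label in zip(X, y):
            protos[label] += encode(x, proj)
        return np.sign(protos)

    def classify(x, proj, protos):
        """Nearest-neighbor search: return the class whose prototype is most similar."""
        hv = encode(x, proj)
        sims = protos @ hv / (np.linalg.norm(protos, axis=1) * np.linalg.norm(hv))
        return int(np.argmax(sims))

    # Hypothetical usage with random data standing in for EMG features.
    n_features, n_classes = 64, 5
    proj = rng.choice([-1.0, 1.0], size=(D, n_features))
    X_train = rng.standard_normal((200, n_features))
    y_train = rng.integers(0, n_classes, size=200)
    protos = train_prototypes(X_train, y_train, proj, n_classes)
    prediction = classify(rng.standard_normal(n_features), proj, protos)

The abstract specifies the high-level layer only at the architecture level; under the simplest (assumed) reading, the DBN reduces to an HMM-like model in which hidden user intent evolves through a transition model and is observed through the low-level classifier's output. The sketch below shows forward filtering under that assumption; the intent states, transition matrix, and observation (confusion) matrix are hypothetical placeholders rather than the proposed model.

    import numpy as np

    # Hypothetical two-layer setup: hidden user intent observed through
    # the low-level (e.g., HDC) gesture classifier's noisy label stream.
    intents = ["rest", "grasp", "release"]             # hypothetical intent states
    T = np.array([[0.90, 0.05, 0.05],                  # P(intent_t | intent_{t-1})
                  [0.05, 0.90, 0.05],
                  [0.05, 0.05, 0.90]])
    O = np.array([[0.85, 0.10, 0.05],                  # P(classifier label | intent)
                  [0.10, 0.80, 0.10],
                  [0.05, 0.10, 0.85]])

    def forward_filter(labels, prior=None):
        """Recursively update the belief over user intent from classifier outputs."""
        belief = np.full(len(intents), 1.0 / len(intents)) if prior is None else prior
        beliefs = []
        for z in labels:                               # z: index of the predicted gesture
            belief = T.T @ belief                      # predict: propagate through transitions
            belief = belief * O[:, z]                  # update: weight by observation likelihood
            belief = belief / belief.sum()             # normalize to a proper distribution
            beliefs.append(belief.copy())
        return np.array(beliefs)

    # Example: a run of noisy low-level predictions smoothed into an intent posterior.
    posterior = forward_filter([0, 0, 1, 1, 1, 2])
    print(intents[int(np.argmax(posterior[-1]))])

In the proposed hybrid scheme, the observation model would be driven by the HDC layer's gesture decisions (or similarity scores), a detail the abstract leaves open.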