The CHIST-ERA-funded Bitscope project, led by Professor Tomás Ward, has just got underway at Insight. Bitscope stands for Brain-Integrated Tagging for Socially Curated Online Personalised Experiences.
The video above uses the example of a virtual visit to an art gallery and explains how the visit might be enhanced and personalised depending on a user's reactions to different paintings. The user won't have to express an affinity for particular works; rather, their responses will be measured, analysed and used to build a picture of what they like and respond to.
Essentially, the researchers envisage a future in which the attention, memorability and curiosity elicited in virtual worlds will be measured without requiring "likes" or other explicit forms of feedback. Instead, users of their improved Brain-Computer Interface (BCI) technology can explore online experiences, leaving behind an invisible trail of interest signatures derived from neural data. This data, collected passively without interrupting the user and refined in quality through machine learning, can then be used by standard social sharing algorithms, such as recommender systems, to create better experiences.
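To make the idea concrete, here is a minimal sketch of how passively measured engagement scores could replace explicit "likes" in a standard recommender. The users, paintings and scores are entirely invented for illustration, and the user-based collaborative filtering shown here is just one conventional technique such scores could feed; it is not the project's actual pipeline.

```python
# Hypothetical sketch: passively measured engagement scores (0..1), standing
# in for explicit "likes", feed a simple user-based recommender.
# All names and numbers below are illustrative, not project data.
from math import sqrt

# engagement[user][item] = interest score inferred from passive signals
engagement = {
    "alice": {"monet": 0.9, "turner": 0.7, "rothko": 0.1},
    "bob":   {"monet": 0.8, "turner": 0.6, "pollock": 0.9},
    "carol": {"rothko": 0.9, "pollock": 0.8},
}

def cosine(u, v):
    """Cosine similarity between two {item: score} vectors."""
    shared = set(u) & set(v)
    num = sum(u[i] * v[i] for i in shared)
    den = (sqrt(sum(x * x for x in u.values()))
           * sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def recommend(target, k=2):
    """Rank items the target hasn't seen by similar users' engagement."""
    sims = {u: cosine(engagement[target], v)
            for u, v in engagement.items() if u != target}
    scores = {}
    for u, sim in sims.items():
        for item, score in engagement[u].items():
            if item not in engagement[target]:
                scores[item] = scores.get(item, 0.0) + sim * score
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # → ['pollock']
```

The point of the sketch is that once interest is expressed as a numeric score per item, any off-the-shelf collaborative-filtering method can consume it unchanged; the novelty in the project lies in deriving that score from neural data rather than clicks.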
Technically, the work concerns the development of a passive hybrid BCI (phBCI). It is hybrid because it augments electroencephalography (EEG) with eye-tracking data, galvanic skin response, heart rate and movement in order to better estimate the mental state of the user.
You can read about it in more detail here.