Tutorials

Attention versus Consciousness: Independent or Conjoined?

Description

There is presently a growing controversy over the relationship between attention and consciousness. A growing band of experimenters claims, on the basis of an increasing set of experiments, to show that attention and consciousness are independent. On the other hand, there is a solid body of evidence indicating that without attention to a stimulus there is no consciousness of it. The experimental data probe the fringes of attention and consciousness and, in the process, do a good job of clarifying the nature of both (both when consciousness is present in processing and when it is absent).

The tutorial will describe the crucial experiments being invoked and how they might be interpreted in support of the independence thesis. Other data (inattentional blindness, priming effects, the attentional blink and others) that also seem initially to support the thesis will be considered.

A brief survey will then be given of the various neural network models of consciousness presently of interest (including Global Workspace, Higher-Order Thought, Complexity and the Primitive Core, and attractor-based theories) [1], as well as that of the author, the attention-based CODAM model [2]. The pros and cons of these theories will be considered, along with how they might accommodate the new experimental data outlined above, which apparently imply the separation of attention and consciousness.

In particular, creativity and the 'creativity effect' [3] will be seen as an important component that may help bridge the gap between attention and consciousness in cases where consciousness seems to have floated free of attention.

Possible further experiments and an overall summary will conclude the tutorial.

Presenter

J. G. Taylor,
Dept. of Mathematics, King’s College London

He is presently European Editor-in-Chief of the journal Neural Networks and was President of the International Neural Network Society (1995) and the European Neural Network Society (1993/4). He is Emeritus Professor of Mathematics at King's College, London, after 25 years there as Professor of Mathematics, and is Director of the Centre for Neural Networks at King's College.

He has published over 500 scientific papers (in theoretical physics, astronomy, particle physics, pure mathematics, neural networks, higher cognitive processes, brain imaging and consciousness) since he started scientific research in 1953, and has authored 12 books and edited 13 others, including When the Clock Struck Zero (Picador Press, 1994), Artificial Neural Networks (ed., North-Holland, 1992), The Promise of Neural Networks (Springer, 1993), Mathematical Approaches to Neural Networks (ed., Elsevier, 1994), Neural Networks (ed., A Waller, 1995) and The Race for Consciousness (MIT Press, 1999). His latest book is The Mind: A User's Manual, published in 2006 by Wiley.

He has worked in various parts of the world: Paris (the Institut des Hautes Études Scientifiques), Princeton, NJ (the Institute for Advanced Study), the Baltimore Research Institute for Advanced Studies, the US universities of Rutgers NJ, College Park MD, Berkeley CA and Los Angeles CA, as well as the UK universities of Cambridge, Oxford, King's College London, Queen Mary College London and Southampton.

He has given many plenary addresses and tutorials at international conferences, in particle physics, cosmology and string theory as well as, more recently, in neural networks, consciousness, attention and related areas, on which he has lectured intensively at many levels and through all media (press, TV, radio, theatre, poetry).

He started research in neural networks in 1969 and has contributed to all areas of the field, as well as spending 2 years working on brain-imaging experiments in Germany. His present research interests are: financial and industrial applications; control as applied to cognitive processes; dynamics of learning processes; stochastic neural chips and their applications (the pRAM chip); and higher cognitive brain processes, including natural language processing, attention and consciousness. He has recently had research projects funded by the EC (on building hybrid attention agents, in the IST project ORESTEIA, and on constructing an emotion recognition system, in the project ERMIS) and by the EPSRC (on natural language processing).

He has just finished research on an attention modelling project (in a UK EPSRC project) with N Taylor, as well as a BBSRC (UK) collaborative project with 3 experimental partners on the interaction of attention and emotion (with N Fragopanagos). He was the co-ordinator of the EC GNOSYS project, which attempted to build a cognitive robot with reasoning powers (2004-2007), and is currently involved in the EC HUMAINE Network of Excellence on emotion understanding. He has also just finished work on the EC MATHESIS project with M Hartley, creating a model of how humans learn movements from others (in the 'mirror neuron' system). He is presently engaged in developing a higher cognitive level model of consciousness, using the most recent results on attention to model attention as an engineering control system; this leads to the CODAM (COrollary Discharge of Attention Movement) class of models of consciousness, whose most appropriate internal structure he is attempting to justify by use of the latest experimental data on attention and emotion. He is also developing a new research project involving a more complete global-level experimental and modelling analysis of attention, as well as further projects moving the GNOSYS system on to language understanding and development and more advanced motor control.

His most recent award was the IEEE CIS Pioneer Award in 2009 for contributions to neural network models of the brain.

References

[1] Taylor JG (2010) A Review of Models of Consciousness, in The Perception and Action Cycle, eds V Cutsuridis & JG Taylor. Berlin: Springer
[2] Taylor JG (2007) CODAM: A Model of Attention Leading to the Creation of Consciousness. Scholarpedia 2(11):1598
[3] Taylor JG (2010) The Creativity Effect: Consciousness versus Attention. (submitted)


User modelling and machine learning for affective and assistive computing

Description

The advent of ubiquitous sensors and computing power and, especially, of natural interfaces in the form of speech-based commands or hand-held devices enables users to interact with computers and machines in a human-like fashion, surpassing the conventional paradigm of keyboards and mice. This emerging paradigm opens up new means of non-verbal communication: users can shrug their shoulders to indicate indifference, nod when agreeing or shout when angry, producing feedback which computing systems can take advantage of. In addition, users with conditions ranging from dyslexia or autism to physical disabilities can benefit from these natural interaction capabilities, as well as enjoy better care in their home or work environments.

However, in order to bridge the gap from low-level signals (audio, video or biosignals) to affective and behavioural cues, one needs to map extracted features or cues to user characteristics, taking into account background information and user and environment context: a smile from the user may be interpreted as positive feedback by a search engine, while a frown may indicate that the user did not get what he/she expected. Knowledge technologies can be of great assistance here, offering useful qualities such as alignment and consistency checking, while concepts from cognitive theories, e.g. theory of mind, can prove valuable when trying to reason about the beliefs, desires and intentions of the user. As a result, research and development of natural, intuitive interfaces is not confined to a single discipline, but instead composes an exciting and challenging interdisciplinary field.
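As a minimal sketch of this kind of context-dependent mapping (all cue names, context labels and rules here are illustrative assumptions, not part of any particular system), an interpreter might condition the reading of a detected cue on the current interaction context:

```python
# Hypothetical sketch: mapping detected affective cues to an
# interpretation, conditioned on the interaction context.
# Cue names, contexts and rules are illustrative assumptions.

def interpret_cue(cue: str, context: str) -> str:
    """Map a detected cue to user feedback, given the context."""
    rules = {
        ("smile", "search_results"): "positive feedback",
        ("frown", "search_results"): "expectation not met",
        ("shrug", "dialogue"): "indifference",
        ("nod", "dialogue"): "agreement",
    }
    # Fall back to a neutral reading when no rule matches.
    return rules.get((cue, context), "no interpretation")

print(interpret_cue("smile", "search_results"))  # positive feedback
print(interpret_cue("smile", "dialogue"))        # no interpretation
```

The same cue ("smile") yields different interpretations in different contexts, which is the point the paragraph makes; a real system would replace the rule table with learned models and a richer context representation.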

The interdisciplinary nature of this field has been recognized by authorities such as the European Union's Directorate General for Information Society and Media (DG INFSO), which has introduced affective computing across different research areas: ICT, e-Health, Technology-Enhanced Learning and, recently, Digital Content and Libraries. Another indication of the interest in affect-related research is that papers on affective computing appear in more than 90 conferences across disciplines, and almost 30 special issues in high-impact journals have been published or are in preparation; the momentum is such that more than 500 researchers participate in the Humaine Association (http://emotionresearch.net/), a follow-up initiative of the Humaine Network of Excellence, which produces the IEEE Transactions on Affective Computing, while Springer and other major publishers are in the process of launching specialised journals on assistive computing.

Intended audience

Researchers in the fields of:

  • Machine learning/Neural networks
  • Human factors, assistive computing
  • Human-computer interaction, affective computing
  • Image processing/computer vision/speech processing
  • Knowledge technologies

Tutorial outline

The tutorial will cover the following concepts:

  • User modelling
    • User models for assistive and affective computing
    • Requirements for natural and intuitive interfaces
    • Concepts from psychology: beliefs, desires and intentions
    • Cognitive architectures for interaction modelling
    • User, environment and interaction context
  • Emotion, affect and behaviour
    • The characteristics and intricacies of multimodal interaction
    • Available modalities in natural/ubiquitous human-computer interaction
    • Semantics of modality fusion
    • Handling uncertainty/noise
    • Behavioural cues, non-verbal interaction
    • From signals to signs of emotion
    • From emotional episodes to understanding behaviour
    • What else can we recognize? What can't we?
  • Lessons learned
    • Affect in the classroom: the case of the Agent-Dysl project
    • Aesthetic vs. expressed emotions: the Callas project on emotion in arts and entertainment
    • Human-like qualities in robots: Feelix Growing
    • Behavioural cues for medical monitoring: Metabo

Presenter

Dr. Kostas Karpouzis, Associate Researcher,
Image, Video and Multimedia Systems Lab Institute of Communication and Computer Systems (ICCS/NTUA)

Dr. Kostas Karpouzis is an associate researcher at the Institute of Communication and Computer Systems (ICCS) and holds an adjunct lecturer position at the University of Piraeus. His current research interests lie in the areas of human-computer interaction, affective and assistive computing, multimodal sensing, understanding and reasoning about emotion, and virtual reality. Dr. Karpouzis has published more than 110 papers in international journals and the proceedings of international conferences, and is a member of the editorial boards of the Journal on Multimodal User Interfaces and the Journal of Personal and Ubiquitous Computing; he is also a co-editor of the 'Humaine Handbook on Emotion Research' and a contributor to K. Scherer's 'Blueprint for Affective Computing: a Sourcebook'. Since 1995 he has participated in more than twelve research projects at Greek and European level, most notably the Humaine Network of Excellence, within which he completed his post-doc in the field of mapping signals to signs of emotion, and the FP7 Siren project on helping children develop conflict resolution skills, which he coordinates.

He is also a national representative in IFIP Working Groups 12.5 'Artificial Intelligence Applications' and 3.2 'Informatics and ICT in Higher Education' and a member of the Executive Committee of the Humaine Association. Recently, Dr. Karpouzis was invited to present affective computing as a potential technology at the 'Social Inclusion and Related Technologies' workshop organized by the ICT for Inclusion unit. He is the keynote speaker of the EvoGames conference to be held in Istanbul later this year.
