Towards 'Useful' Social Cognitive Robots
My research interests lie broadly in the overlap between (developmental) cognitive robotics and social human-robot interaction, with particular interest in the domains of education and social/care assistance. My research can therefore be summarised as work on Social Cognitive Robots.
My overall goal is to develop robotic and autonomous systems that can assist people through social interaction. To achieve the desired outcomes, I believe it is necessary to go beyond the surface aspects of these autonomous systems (e.g. appearance and reactive behaviours) and address the human-centric cognitive aspects of behaviour. As an inherently multi-disciplinary problem, I seek to integrate principles and methods from computer science (including robotics and machine learning) and psychology (including social and cognitive psychology and human factors) to develop autonomous systems that are validated by and tested in human-centred empirical evaluations.
This approach is both interesting and necessary for the two application areas that I am primarily interested in: (1) making social robots more useful in assisting people in their lives, and (2) making use of social cognitive robots as exploratory tools/platforms to investigate cognition and social behaviour to better understand ourselves as humans.
My research is loosely organised around three inter-related themes, introduced below: (1) learning from humans; (2) learning with humans; and (3) long-term social human-robot interaction. In general, I seek to introduce, develop, and evaluate autonomously operating technologies not merely for the sake of it, but with the intention of either directly helping people, or establishing and understanding principles that may facilitate such help in the future.
By learning "from" humans, I mean learning from the general characteristics, competencies, and mechanisms of people: not of any particular individual, but of human beings as the prime example we have of 'intelligent' (or cognitive), adaptive, and flexible behaviour. For example, it is known that human gaze serves many social functions besides supporting the ability to see the environment - how can this be applied to robots to improve their social interactivity, and to what extent can this be achieved? How do people recognise the intentions of others, and how do they use this to inform their own behaviour? How is it that we as humans are capable of such complex integration of information over time, enabling flexible, directed behaviour that is appropriate to the context, and over multiple time-scales? It is these questions, and many more, that are the focus of this theme of my research. To do this, I seek to learn from psychology and the cognitive and developmental sciences (and indeed on occasion attempt to feed something useful back) in order to improve the 'usefulness' of robotic autonomous systems.
The flexibility and appropriateness of human behaviour, across a huge range of social and non-social contexts, is an ability that current autonomous systems can only dream of. One key component of this ability, and one that virtually all autonomous systems attempt to make use of, is learning. When dealing with robots that are intended to interact with people (the entire field of Human-Robot Interaction!), why not try to learn from the humans who are currently present, be they partners, supervisors, bystanders, etc.? Learning from individual people in this way, in real-time, from knowledgeable/expert but perhaps inconsistent humans, is an important source of information for robots. It is, however, a difficult challenge, especially with people who may not be experts in robotics, but who are nevertheless experts in the domains/environments in which our robots are trying to assist them. There are a range of principles and techniques that I try to incorporate here, including learning with humans-in-the-loop, adaptive robot social behaviours, and robotic cognitive memory (a topic of particular and special interest to me).
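As a purely illustrative sketch of the human-in-the-loop idea (a toy example, not any particular system of mine; the action names, feedback values, and update rule are all assumptions for illustration), a robot might maintain value estimates over candidate social actions and nudge them toward possibly noisy, inconsistent human ratings:

```python
import random


class FeedbackLearner:
    """Toy human-in-the-loop learner: the robot picks a social action
    (e.g. a greeting style) and updates its value estimates from scalar
    human feedback, which may be noisy or inconsistent."""

    def __init__(self, actions, learning_rate=0.3):
        self.values = {a: 0.0 for a in actions}
        self.lr = learning_rate

    def choose(self, epsilon=0.1):
        # Mostly exploit the best-rated action; occasionally explore.
        if random.random() < epsilon:
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)

    def give_feedback(self, action, reward):
        # Move the estimate toward the human's (possibly inconsistent) rating.
        self.values[action] += self.lr * (reward - self.values[action])


# Hypothetical interaction: a person repeatedly rates two greeting styles.
learner = FeedbackLearner(["wave", "nod", "verbal_greeting"])
for _ in range(20):
    learner.give_feedback("nod", 1.0)    # human approves nodding
    learner.give_feedback("wave", -0.5)  # human mildly disapproves waving

print(learner.choose(epsilon=0.0))  # prints "nod"
```

The exponential-moving-average update is one simple way of coping with inconsistent feedback: no single rating dominates, but a consistent tendency in the ratings does eventually shape the robot's behaviour.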
Two aspects of this work are particularly important: (1) believable any-depth interaction, which means (for example) that the robot system is robust to low-quality sensory information, and the resulting uncertain knowledge of human/environment state, and can nevertheless maintain an engaging interaction (this is a very difficult problem, requiring the integration of the first two themes above!); and (2) a strong commitment to testing and evaluation of systems 'in the wild', i.e. in settings that are typical for the people we are trying to help, not for ourselves as roboticists, with my work involving schools, hospitals, museums, etc. (which can be extremely difficult practically, methodologically, and ethically).
Publications are one of the main outputs of academic research (and one of the means by which we are typically assessed, both by our employing institutions and by our research communities). Research publications are not always the most public-friendly, but I would contend that most 'good' publications should be readable by as wide an audience as possible.
For an up-to-date look at citations and metrics on my publication record, please take a look at my Google Scholar Profile. The metrics are far from perfect (not handling self-citations very well, in my view, for example), but they provide something of an overview of my research publication activity and (to a much lesser extent) influence.
The default view groups publications by year; to group by Journal (article) and Conference (inproceedings) papers instead, sort by "Type".
Links and other resources may be placed here where relevant. The first place to check for information on any taught module though is Blackboard.
For current students who would like to arrange a meeting with me (personal tutees, UGT/PGT project students, SEPS PGR students, and/or students taking the modules in which I am a member of the delivery team), please select a meeting time using my meeting booking system.
If none of these times are suitable, please just contact me by email or on MS Teams, and we can arrange another time if possible. I am happy to meet either face-to-face or on MS Teams, depending on your preference.
Over the years, I have developed a range of guidance and supporting documents in relation to various of my teaching activities. Some of these are gathered below, in case they may be of some use to any current students (or indeed anyone else).
Please see drop-downs below for the modules I am (and have previously been) involved in:
Modules:
Applied Programming Paradigms (CMP2801M)
User Experience Design (CMP2805M)
Advanced Robotics (CMP9764M)
Research Methods (CMP9139M)
Frontiers of Robotics Research (CMP9766M)
Modules (in addition to UGT Final Year Projects and PGT Research Projects):
Advanced Programming (CMP2801M)
User Experience Design (CMP2805M)
Advanced Robotics (CMP9764M)
Research Methods (CMP9139M)
Frontiers of Robotics Research (CMP9766M)
Modules (in addition to Final Year Project):
Advanced Programming (CMP2801M)
User Experience Design (CMP2805M)
Autonomous Mobile Robotics (CMP3103M)
Advanced Robotics (CMP9764M)
Frontiers of Robotics Research (CMP9766M)
Modules (in addition to Final Year Project and Group Project supervisions):
User Experience Design (CMP2805M)
Autonomous Mobile Robotics (CMP3103M)
Advanced Robotics (CMP9764M)
Frontiers of Robotics Research (CMP9766M)
Modules (in addition to Final Year Project and Group Project supervisions):
Human-Computer Interaction (CMP2019M)
Object-Oriented Programming (CMP2090M)
Autonomous Mobile Robotics (CMP3103M)
Advanced Robotics (CMP9764M)
Frontiers of Robotics Research (CMP9766M)
Modules (in addition to Final Year Project and Group Project supervisions):
Human-Computer Interaction (CMP2019M)
Object-Oriented Programming (CMP2090M)
Autonomous Mobile Robotics (CMP3103M)