Department of Computer Science


Designing Safe & Explainable Neural Models in NLP

Primary supervisor

  • Andre Freitas

Additional supervisors

  • Lucas Cordeiro

Funding

  • Competition Funded Project (Students Worldwide)

This research project is one of a number of projects at this institution and is in competition with them for funding. Usually, funding is awarded to the project that attracts the strongest applicant. Applications for this project are welcome from suitably qualified candidates worldwide; however, funding may be available only to a limited set of nationalities, so you should read the full department and project details for further information.

Project description

The construction of complex neural Natural Language Processing (NLP) systems that provide both explainability and safety guarantees is fundamental to the adoption of these models in real-world applications. Despite the recent evolution of safe-AI methodologies in computer vision, approaches specific to NLP are still limited.

In this project we will design new safe-AI methods suitable for complex natural language inference tasks. We will propose new methodologies for designing semantically controlled embedding spaces for explainable inference tasks in NLP. The project will explore the interplay between emerging methodologies for systematically assessing the desirable semantic and consistency properties of embedding spaces in NLP (e.g. semantic probing, metamorphic testing, attribution) and the design of embedding spaces with well-defined geometric-semantic properties (e.g. better disentanglement).

Topics related to this theme include:

  • Neuro-symbolic representations for natural language inference.
  • Dialogue between explainability and safety.
  • Semantic probing.
  • Metamorphic testing in NLP.
  • Synthetic data for NLP.
  • Disentangled representations.
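As a rough illustration of one of these topics, the sketch below shows the basic shape of a metamorphic test in NLP: a meaning-preserving transformation of the input (the metamorphic relation) should leave a model's prediction unchanged. The `classify` model and `synonym_swap` relation here are hypothetical toy stand-ins, not part of the project; a real study would apply relations such as paraphrasing or negation-consistency checks to an actual NLP model.

```python
# Minimal sketch of metamorphic testing for a black-box NLP classifier.
# All names below (classify, synonym_swap) are illustrative assumptions.

def classify(text: str) -> str:
    """Toy stand-in for an NLP model: flags negative sentiment by keyword."""
    negative = {"bad", "terrible", "awful"}
    return "neg" if any(w in text.lower().split() for w in negative) else "pos"

def synonym_swap(text: str) -> str:
    """Metamorphic relation: a meaning-preserving word substitution."""
    return text.replace("movie", "film")

def metamorphic_failures(texts):
    """Return inputs on which the relation changes the model's output."""
    return [t for t in texts if classify(t) != classify(synonym_swap(t))]

texts = ["the movie was terrible", "a good movie overall"]
# An empty failure list means the model respects this relation on these inputs.
print(metamorphic_failures(texts))
```

The value of this style of testing is that it needs no labelled ground truth: only the consistency of the model under the relation is checked, which is why it pairs naturally with probing and attribution methods for assessing embedding spaces.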

This is an ARM industrial CASE (iCASE) project and will involve collaboration between a multi-disciplinary academic team, Andre Freitas (NLP, Inference, Explainability) and Lucas Cordeiro (Software Testing and Verification), and an industrial supervisor appointed by ARM. The successful candidate will have the opportunity to collaborate on the EPSRC Enncore (End-to-End Conceptual Guarding of Neural Architectures) project.

The successful candidate will receive funding for 4 years, and a supporting budget for conference attendance.

The Candidate

Required attributes:

  • BSc in computer science or a related area (with a solid mathematical basis).
  • Fluent programmer in Python (evidenced by existing projects).
  • Fluent English.

At least one of the following attributes:

  • MSc in AI, Data Science or a related area.
  • Scientific publications.
  • Mathematical background in machine learning.
  • At least 2 years of industrial experience.
  • An outstanding academic record.


The Supervision Team and Research Environment

Manchester saw the birth of computer science, with the creation of the world's first stored-program computer. We continue to work on pioneering research with widespread activity and strength in a range of key aspects of computer science from hardware through to user interaction.

The Department of Computer Science at the University of Manchester is the longest established department of Computer Science in the United Kingdom and one of the largest. The University of Manchester is a member of the Russell Group, the N8 Group, and the worldwide Universities Research Association. The University of Manchester has 25 Nobel laureates among its past and present students and staff, the fourth-highest number of any single university in the United Kingdom.

Applications

The deadline for applications is 20 April 2022. Qualified applicants are encouraged to contact Andre Freitas (andre.freitas@manchester.ac.uk) and Lucas Cordeiro (lucas.cordeiro@manchester.ac.uk) informally, with a CV and transcripts, to discuss the application before applying.

Person specification

For information

Essential

Applicants will be required to evidence the following skills and qualifications.

  • You must be capable of performing at a very high level.
  • You must have a self-driven interest in uncovering and solving unknown problems and be able to work hard and creatively without constant supervision.

Desirable

Applicants will be required to evidence the following skills and qualifications.

  • You will have good time-management skills.
  • You will possess determination, which is often more important than qualifications, although you'll need a good amount of both.

General

Applicants will be required to address the following.

  • Comment on your transcript/predicted degree marks, outlining both strong and weak points.
  • Discuss your final-year undergraduate project work and, if appropriate, your MSc project work.
  • How well does your previous study prepare you for undertaking Postgraduate Research?
  • Why do you believe you are suitable for doing Postgraduate Research?