Supporting non-visual manipulation of algebraic equations

Primary supervisor

Additional supervisors

  • Robert Stevens


Funding

  • Competition Funded Project (Students Worldwide)
This research project is one of a number of projects at this institution. It is in competition for funding with one or more of these projects. Usually the project that attracts the best applicant will be awarded the funding. Applications for this project are welcome from suitably qualified candidates worldwide; however, funding may only be available to a limited set of nationalities, and you should read the full department and project details for further information.

Project description

Manipulating notations such as algebra is physically straightforward with a combination of pencil, paper, eyes and a brain. Remove eyes from this list and the physical manipulation of algebra becomes difficult; this is particularly true when synthetic speech is the presentation medium.

Solving an equation such as 3x + 4 = 7 for x may involve subtracting 4 from each side, dividing each side by 3, and arriving at the solution x = 1. On paper this can be done in separate lines of notation or with simple annotations of the original formula (crossings out, insertions, substitutions, and so on), but in each case the external memory of the paper supports the mental activity; the "movement" of the +4 to the "other side" and the "changing of its sign" help the working out and make explicit the "3x = 3" stage and the "obvious" solution.
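Written out step by step, the working described above looks like this (an illustrative sketch in standard notation; the intermediate line 3x = 3 is the stage that the on-paper annotations make explicit):

```latex
\begin{align*}
3x + 4        &= 7              \\
3x + 4 - 4    &= 7 - 4          && \text{subtract 4 from each side} \\
3x            &= 3              \\
\tfrac{3x}{3} &= \tfrac{3}{3}   && \text{divide each side by 3} \\
x             &= 1
\end{align*}
```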

Such manipulations are harder without sight: not being able to "see" the stages easily makes every stage cognitively more demanding.

Solutions have been presented for rendering and exploring algebra notation using non-speech sound, synthetic speech and various input devices to control the flow of information. Mobile devices with touch screens, which use gestures to manipulate objects on the screen, offer interesting research opportunities for exploring notations such as algebra and then manipulating that notation. We can imagine scenarios in which fingers are used to touch elements of an equation in order to "read" them, and then to grasp elements and "move them about" to achieve various manipulation tasks; this would exploit proprioceptive memory. For example, each term in 3x + 4 = 7 is touched to find out its form, dragged "over to the other side", merged with an adjacent term, substituted, and so on.
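As a concrete, purely hypothetical illustration of the interaction being described, the sketch below models an equation as a pair of term lists, where touching a term speaks its form and dragging it across the equals sign flips its sign. All names in this sketch are invented for illustration; they are not part of any existing system or of the proposed design.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Term:
    """One term of a linear equation, e.g. 3x or +4 (hypothetical model)."""
    coefficient: int
    variable: Optional[str] = None  # None for constant terms

    def spoken_form(self) -> str:
        """What synthetic speech might say when the term is touched."""
        sign = "plus" if self.coefficient >= 0 else "minus"
        magnitude = abs(self.coefficient)
        return (f"{sign} {magnitude} {self.variable}"
                if self.variable else f"{sign} {magnitude}")


@dataclass
class Equation:
    """The two sides of an equation such as 3x + 4 = 7."""
    left: list
    right: list

    def drag_across(self, term: Term, from_left: bool) -> None:
        """The 'drag to the other side' gesture: move a term and flip its sign."""
        source, target = (self.left, self.right) if from_left else (self.right, self.left)
        source.remove(term)
        term.coefficient = -term.coefficient
        target.append(term)


# Example: 3x + 4 = 7
eq = Equation(left=[Term(3, "x"), Term(4)], right=[Term(7)])
four = eq.left[1]
print(four.spoken_form())                    # "plus 4" -- touching reads the term
eq.drag_across(four, from_left=True)         # now 3x = 7 - 4
print([t.spoken_form() for t in eq.right])   # ['plus 7', 'minus 4']
```

Merging adjacent terms (7 - 4 becoming 3) and dividing out the coefficient would be further gestures in the same vein; the research questions below ask whether such gestures can be made usable without sight.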

Such a scenario raises, but is not limited to, the following research questions, which may have wide-reaching implications for non-visual technology design:
- Is proprioception with audio fine grained enough that it can be exploited in such a scenario?
- What are the necessary gesture-based languages for reading and manipulating algebra?
- How do we evaluate such a language?
- What does this tell us about how people interact with complex information?
