There are many things we take for granted in the world. Our loved ones, having a roof over our head, weekend lie-ins… they’re all things that we are happy for, yet it’s more often that we enjoy them at face value rather than sit and consider how much we actually appreciate them. But there is one thing that we rely on in life more than anything – so much so that it is hard to picture our lives without it – and that is how we interpret the world with our senses. Touches. Sounds. Smells. And certainly, sight.

Sight? Who needs that!

Daniel Hajas lost his sight at the age of 17, for reasons that still elude his doctors. Born in Hungary, and living in Croatia at the time, he saw his sight begin to deteriorate around the age of six, and it declined steadily from there. Even the smallest things that people take for granted – crossing the street, sending a text, pouring a drink – were taken away from him, and he had to relearn how to do them in new, inventive ways.
However, Daniel didn’t let this enormous change stop him from pursuing his dreams. In 2013, he packed two suitcases and moved to Brighton, UK, to attend the University of Sussex. Around the time he lost his sight, he had begun to fall in love with science and physics, and he was set on continuing down that road. He was accepted to do a Bachelor’s and Master’s in Theoretical Physics – a degree he completed in summer 2017.

Daniel at age 6 in a school setting

One for all, all for one

In his last year before university, Daniel had teachers and his parents to help him with his studies, but he knew things would be different at Sussex, with a large undergraduate cohort and no parents on hand. Whilst he had all the support he could have asked for, his time there made him realise how comparatively little help was available to blind and visually impaired people studying science, technology, engineering, and mathematics subjects. Existing assistive technology enables independent reading, writing, and communication without relying on sight, but dealing with mathematical notation is still not easy. Tools like screen readers are remarkably helpful for literature-based subjects, but if your textbook says “As seen in figure 1…”, then what? How do you move on from there?

Daniel at age 23 in a graduation gown.

The genesis

In 2014, one of Daniel’s computing lecturers, Prof. Kathy Romer, set the class an assignment to think of a hypothetical project using the programming they had learnt that term. This was the first time that Daniel started to put down on paper the frustration that had followed him since he first lost his sight: how can we use technology to make graphs and images a useful extra resource rather than a source of frustration? How can we make students more independent, more able?

Prof Kathy Romer with telescopes in the background

In the past, Daniel had to wrestle with expensive, bulky, slow, and outdated equipment to emboss images onto paper; or, if he was lucky, he could sometimes rely on friends to sit down and describe the images he was looking at. Now, everyone could have that friend to be there for them at any hour of any day – and that friend would be called Grapheel.

Taking off

As the idea took shape, word spread; since then, several other members have joined the team as Grapheel began to pick up speed. David, Tim, and Linn were all fellow physics students with Daniel at the University of Sussex, and together they have helped Daniel build his idea into the company Grapheel is today.

Daniel jogging with his guide runner.

Keeping both mind and body fit

Daniel is now thriving. After graduating from his undergraduate degree in summer 2017, he wasted no time and lined up a PhD to start immediately afterwards, still at Sussex. He’s an avid runner, and runs with a group in Brighton several times a week. During his time as an undergrad at Sussex, he also established a sporting group called Blind Football, which welcomes both BVI and sighted people: the only differences are that the sighted players wear blindfolds, and there’s a bell in the ball.


Group photo of the blind football team with members wearing blindfolds.

Daniel helps organise the Sussex Universe lecture series, where postgrads and faculty members from Sussex’s Physics department hold weekly lectures about a topic of choice, and he also teaches Hungarian GCSE/high school students maths and physics over Skype.


Journey to academia

Whilst thermodynamics and quantum mechanics will always be close to Daniel’s heart, his interests are shifting from fundamental physics to human-computer interaction, the topic of his PhD. Having studied science using assistive technology, he developed an interest in how scientific information is stored, communicated, and used, as well as in concepts like sensory substitution devices. These, alongside other technologies, can be used to represent information in a more fun, interactive, multi-sensory way.

Daniel standing with a poster, displaying physical quantities and their connections.

Connecting and Enabling

Grapheel isn’t meant to just be a service. It’s not just a blunt tool, whose only reason for existence is to complete a task. The core of it, and the reason behind its conception, is to explore how we can reach out to our fellow humans and give everyone the chance to do what they love. It’s about giving everyone, regardless of their background, the chance to pursue what they want to do without being hindered by their physical abilities — to give not the same opportunities, but equal opportunities. To expand the community, and to make sure that no one feels alone in their disability. It’s about understanding that different people learn in different ways, and that seeing things from a new perspective is something to be encouraged, not dismissed. It’s about furthering our understanding of how people learn, so that we in turn can understand how to teach using all of our senses, together.

The saying goes that “a picture paints a thousand words”. Now, we will be able to paint a thousand pictures, graphs, and diagrams with only one word: Grapheel. And we have Daniel to thank for that.