AI Can Translate. Belonging Is Human.
- Tim Scannell
- 4 days ago
- 3 min read
For years, Deaf people have adapted to systems that were never truly designed for us.
We learned to navigate poor subtitles.
Missed announcements.
Phone-only services.
Rooms where access depended on luck, kindness, or whether someone remembered to book an interpreter.
Now AI is changing accessibility faster than many people expected.
Live captions are improving.
Translation tools are becoming smarter.
Hospitals, workplaces, transport systems, and customer services are beginning to explore sign language technologies in ways that once felt impossible.
This progress matters.

But there is something important the accessibility industry still risks misunderstanding:
Access is not the same as belonging.
A person can technically “access” a meeting and still feel excluded from the conversation.
A Deaf employee can receive captions and still feel invisible in workplace culture.
A Deaf patient can receive information and still leave a hospital feeling isolated, anxious, or emotionally exhausted.
Technology can reduce barriers.
But belonging is built through human behaviour.
Recently, I watched an episode of See Hear on BBC iPlayer about AI and Deaf communities. One point stayed with me afterwards:
Technology can recognise information. Human beings recognise emotion.
But real communication is more than information delivery.
A skilled human interpreter notices confusion, hesitation, stress, humour, emotion, body language, and cultural context. Communication is not only about accuracy. It is also about reassurance, trust, and feeling understood.
This is why I welcome the new communication options AI may create for Deaf people, but I do not believe AI should replace human sign language interpreters or translators.
In many situations, human communication support remains essential.
For example:
- airport and aeroplane announcements
- train disruption information
- emergencies
- healthcare appointments
- MRI or body scan explanations
- mental health support
- legal conversations
- education environments
Many AI systems still focus mainly on one-way communication.
But Deaf communication is two-way.
Understanding information is only part of accessibility. People also need opportunities to ask questions, express concerns, clarify misunderstandings, and communicate emotions back in their own language.
That human interaction becomes especially important during stressful or high-risk situations.
Sometimes the greatest risk is not technical failure.
It is false confidence.
Captions may appear accurate while important meaning is lost.
A translation may look impressive while emotional context disappears.
An automated system may provide information without providing reassurance.
This is why AI should support human accessibility - not quietly replace it.
I hope governments, businesses, healthcare providers, and public services continue investing in:
- sign language interpreters
- translators
- communication professionals
- Deaf-led accessibility consultants
- emotional wellbeing support
- human-centred care
- inclusive communication systems
Accessibility is not simply a software feature.
It is about dignity, well-being, trust, safety, and human connection.
Belonging is when communication feels natural instead of negotiated.
It is when Deaf people are included early, not added later.
It is when sign language is respected as a language, culture, and lived experience - not simply a technical problem to solve.
The future of accessibility should not only ask:
“Can people access this?”
It should also ask:
“Do people feel safe, respected, and understood here?”
Because AI can help open doors.
But people create belonging.