AI, BSL, and Language Equity: A Reflective Journal on Inclusion, Ethics, and the Future of Communication
- Tim Scannell
- Jul 6
An AI company manager struggled to learn BSL Level 1, but still asked his tutor to move him forward to Level 2. Meanwhile, the company's founder launched an avatar-based sign language system for wayfinding without Deaf input. Unsurprisingly, the Deaf community pushed back.
In another development, a non-AI tech company announced it had stopped developing its own audio and video voice systems - AI had taken over that work. A business soon entered into a deal with the AI company, investing and paying for licenses without any negotiation, user testing, or feedback involving native BSL users. No input was gathered from people with lived experience, cultural roots, or linguistic knowledge. This raises a serious question: Whose language is it, and who benefits from it?

The Right to Language, the Right to Compensation
Look at how Air Jordan works: the athlete behind the brand receives a percentage of every pair sold under a clear IP agreement. Shouldn’t native signers whose language, expressions, and cultural capital are being harvested to train AI also receive royalties or payments every time their signing is used?
Yet, we often see native BSL users left unpaid, uncredited, and unconsulted.
A Growing Sense of Displacement
AI companies are chasing investment, avoiding share dilution, merging quickly, and reshaping teams. But this fast pace has left interpreters, developers, and Deaf professionals burned out and pushed aside. Meanwhile, salaries rise in tech, but what about accessibility, inclusion, and trust?
BSL users whose first language is not English are especially left behind. Government bodies and private companies have begun replacing BSL interpreters with AI, without notice or consultation, leaving the community fearing for its future. Interpreters who provide voiceovers and translations now see AI avatars doing their job. It’s hard not to feel replaced, especially when these avatars lack accuracy, nuance, and emotional depth.
A Real-World Example: Easter and Mistranslation
During the Easter period, many observed AI-generated BSL translations in public campaigns. The result? A rollercoaster of emotions. The context was off. The signs didn’t match the meaning of the audio or the English text. It was an unsettling reminder: sign language cannot be auto-generated like subtitles.
So, how do we fix this?
Restoring Trust Through Transparency
AI-generated sign language must be verified by native BSL users and Deaf-led teams.
Signers should be paid not just once, but every time their signing is used in a dataset or output.
A legal framework is urgently needed: a declaration protecting sign language data, consent, and performance rights.
Universities and labs must be involved in verifying the linguistic accuracy and context of sign language use in AI.
Think of it like Hollywood - actors get royalties. Signers should, too.
We Need Tools That Truly Include Everyone
Currently, companies pay to license AI avatars, but not photorealistic avatars built from real human BSL signers, and not native interpreters themselves. Website tools like Google Translate or ReciteMe exist for spoken and written languages, but there’s no equivalent for real-time BSL translation with visible, synchronised text, interactive features, or trusted verification.
Imagine this:
Sign language support on digital billboards
BSL on cinema screens, public intercoms, defibrillators, websites, radio, QR codes
Sign language on all interfaces, not just subtitles
Where is the WCAG, W3C, or WAI requirement for sign language access? Beyond a rarely applied Level AAA criterion in WCAG, it is still missing.
Human-Centred AI: The Only Ethical Path Forward
I’m encouraged to see universities exploring “human-centred language and technology.” AI should never exist without people - it should exist because of people. Humans must shape the training data, define the boundaries, and make the decisions.
We must also ask: Do we want short-term hype or long-term success?
True success includes:
Deep collaboration with native sign language users
Respectful partnerships
Fair pay
Data accuracy
Trust-building
Transparency in development and deployment
Sign Language Is Not a Dataset - It’s a Culture
BSL cannot and must not be “reformed” or simplified to suit AI structures. It is a complex, rich language with:
Facial expressions
Non-manual features (NMFs)
Multi-channel communication
Classifiers
Glosses
Phonology
Nuances
Spatial grammar
Body language
Handshape codes
Movement and orientation rules
Sign language is visual poetry - a living culture that cannot be flattened or approximated by tech without harm.
What Gives Me Hope
I’ve seen some technical improvements: better frame rates, smoother gestures, and less robotic movement. That’s progress. People appreciate AI’s potential to support productive and receptive sign language skills, and to help with turn-taking in conversations between Deaf and hearing users.
But technology can’t move faster than trust. It must move with the community, not ahead of it.
Final Thoughts
The Deaf community deserves transparency. Signers deserve compensation. Interpreters deserve inclusion. And BSL deserves protection, not approximation.
Let’s build technology with people, not on people.
💬 I'd love to hear your thoughts:
Do you work with sign language, accessibility, or AI? What challenges do you see? What ethical questions are you asking?
Feel free to share or respond. Let’s build a better, more inclusive future together.