A team from a Chinese university has developed an AI platform designed to translate written text into sign language in real time. The project aims to narrow communication gaps, promote inclusion for deaf and hard-of-hearing people, and facilitate their integration into wider communities.
Dr Su Jionglong, deputy dean of the School of AI and Advanced Computing at Xi’an Jiaotong-Liverpool University (XJTLU) in Suzhou, is leading the project. Dr Su’s main research focus is applying technology to social causes to help reduce discrimination, provide emotional support, and speed up medical diagnoses.
His latest AI-powered platform translates written text to and from sign language in real time using digital avatars to perform the signs. The tool aims to foster inclusion for people with hearing impairments and bridge everyday communication gaps. It is part of a start-up called Limitless Mind, which Dr Su is currently developing with his students. They describe it as a “socially responsible, freely accessible academic initiative”.

This lightweight approach not only optimises data efficiency and reduces reliance on high-end hardware, but also improves translation accuracy, positioning Limitless Mind as one of China’s most advanced sign language AI initiatives.
According to XJTLU, the team now boasts China’s largest sign language translation dataset, comprising the most extensive video vocabulary and 3D skeletal point collection.
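XJTLU has not published the details of its pipeline, so the sketch below is only a rough illustration of why skeleton-based sign language systems tend to be lighter than video-based ones: a signed phrase stored as a sequence of 3D keypoints occupies a small fraction of the memory of the equivalent raw video. The frame rate, resolution and keypoint count are assumptions chosen for the example, not figures from the team.

```python
# Illustrative sketch only: the article does not describe XJTLU's actual
# pipeline. It compares the storage cost of raw video frames with that of
# 3D skeletal keypoints, one common reason skeleton-based sign language
# models are lighter than video-based ones. All numbers are assumptions.

FPS = 30                  # assumed frame rate
SECONDS = 3               # assumed length of one signed phrase
FRAMES = FPS * SECONDS

# Raw video: 1280x720 RGB frames, one byte per colour channel (assumed).
video_bytes = FRAMES * 1280 * 720 * 3

# Skeleton: 75 body/hand keypoints per frame, each an (x, y, z) triple
# stored as a 32-bit float (assumed keypoint count and precision).
keypoint_bytes = FRAMES * 75 * 3 * 4

print(f"raw video : {video_bytes / 1e6:8.1f} MB")
print(f"keypoints : {keypoint_bytes / 1e6:8.3f} MB")
print(f"reduction : ~{video_bytes // keypoint_bytes}x smaller")
```

A representation that compact is what would allow such a model to run as a phone app or on smart glasses, as Su describes below.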
China’s vast population makes it well suited to developing and testing AI technologies at scale. Researchers there also have access to high-quality datasets from hospitals across the country, which often share anonymised medical data for research purposes.
In an interview with the South China Morning Post, Su described Limitless Mind as “an inclusive communication platform to help overcome barriers between people who use sign language and those who do not”. He said the lightweight model could be used as an app on mobile phones – essential for everyday access – but could also be integrated into smart glasses.

Su added that his team was “innovating technologies to meet real demands in society”. “Disabled students can learn better in class, patients can have more succinct and meaningful conversations with doctors, and workplace discrimination can be minimised,” he said. He has also reportedly expressed interest in commercialising the project.
The team is also exploring next-generation assistive technologies, such as lip-movement-to-text conversion and brain–computer interfaces (BCIs) capable of translating EEG brain waves into written language.
Su is collaborating with Mind with Heart Robotics, a Shenzhen-based company that specialises in electronic pets and childlike humanoid robots designed to provide emotional support. These tools have been successfully trialled with autistic children, as the machines can assess their emotional state and share this information with their caregivers.
This new app follows a study published in the peer-reviewed Chinese Medical Journal, which predicts that, by 2060, 240 million people in China will have some degree of hearing loss – twice the number recorded in 2015.












