Chameleon: changing the future of Deaf Bible translation
By Deb Fox | Wycliffe Today Spring Edition 2020
Saul and Rebecca Thurrowgood are Wycliffe members who have just welcomed the birth of their fifth child. They are also excited about the arrival of a program Saul and his team have spent many years perfecting in partnership with the Deaf Bible Society: Chameleon. Rebecca explains that, just like the chameleon’s ability to adapt and change in order to communicate, ‘the goal of Chameleon is bringing the gospel to the Deaf in a new way that protects the people involved by changing their appearance.’
Currently, less than two per cent of the world’s Deaf identify as followers of Jesus. Many do not have access to God’s Word in a language they understand: their own sign language. More than twenty-five sign languages have portions of Scripture available on video, but significant barriers stand in the way of filming real people to translate the remaining sign languages.
In many regions around the world, persecution is a daily occurrence, so filming a real person signing Scripture in their local sign language can be dangerous. Another barrier that often arises in small Deaf communities is denominational difference among Christians. Unlike a printed Bible translation, which is anonymous, a signed translation carries the face of the signer. If their character, past life or community becomes an issue, it risks overshadowing the message of the gospel. Using animated characters removes these risks and also allows the translation work to be accelerated.
How does the technology work?
Chameleon is a form of motion-capture technology that uses artificial intelligence: the team has trained neural networks[1] to track a signer’s movements so that an avatar (an animated character) can copy them.[2] To create the neural networks, they have had to source movements from as many places as possible, including videos already available from the Deaf Bible Society and live recordings filmed in a studio.
Saul says:
We need the computer to track the person, regardless of their shape, size, ethnicity and gender. We need to train the computer to recognise the various parts of the body. We have trained separate neural networks to recognise different locations: one for the body, one for the hand, and five different networks for the face. Hundreds of thousands of images are fed into the computer in order for it to recognise various shapes. Once it can remember specific movements, the avatar can be asked to perform a number of sign movements.
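Chameleon’s own models have not been published, but the kind of tracking Saul describes, with separate networks for the body, the hands and the face, can be sketched using the open-source MediaPipe Holistic models. The sketch below is illustrative only: the video filename is a placeholder, and Chameleon’s actual pipeline may work quite differently.

```python
# Sketch only: Chameleon's models aren't public, so this uses the
# open-source MediaPipe Holistic pipeline to illustrate the idea of
# tracking body, hand and face landmarks in a video of a signer.
import cv2                      # pip install opencv-python
import mediapipe as mp          # pip install mediapipe

mp_holistic = mp.solutions.holistic

cap = cv2.VideoCapture("signer.mp4")   # hypothetical input video
frames = []                            # per-frame landmark sets

with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # The models expect RGB images; OpenCV reads frames as BGR.
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        frames.append({
            "pose": results.pose_landmarks,          # body keypoints
            "face": results.face_landmarks,          # face mesh keypoints
            "left_hand": results.left_hand_landmarks,
            "right_hand": results.right_hand_landmarks,
        })

cap.release()
print(f"Tracked {len(frames)} frames of signing")
```

Each landmark is a normalised x, y, z coordinate, and it is this stream of coordinates, rather than the video of the signer, that an avatar can be asked to replay.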
Rebecca explains: ‘We’re trying to get the computer to recognise the movements. Perfect copying means a better data output, and the better that is, the better the outcome.’
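Rebecca’s point about copying quality can be made concrete with a small sketch: once the movements have been captured as coordinates, driving an avatar means turning those coordinates into joint rotations for its rig. The example below computes a single elbow angle from three tracked points; the landmark values are invented for illustration, and this is not Chameleon’s actual retargeting code.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by points a-b-c,
    e.g. shoulder-elbow-wrist for an elbow rotation."""
    ba = np.asarray(a) - np.asarray(b)
    bc = np.asarray(c) - np.asarray(b)
    cos_angle = np.dot(ba, bc) / (np.linalg.norm(ba) * np.linalg.norm(bc))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

# Hypothetical tracked landmarks for one frame (normalised coordinates).
shoulder = (0.45, 0.30, 0.00)
elbow    = (0.50, 0.45, 0.02)
wrist    = (0.42, 0.55, 0.05)
print(f"Elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```

The more faithfully the tracked points follow the real signer, the more closely angles like this, and therefore the avatar’s movements, reproduce the original signing.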
After many years of setbacks and trials, Chameleon 1.0 is almost ready for release. Saul says he and his colleagues were excited to discover that a team in South-East Asia had been using the pre-release version of the program, and that it worked better than expected.
Rebecca adds:
To know that this technology is being used for its intended purpose is a huge blessing. We are so grateful knowing that this product will be a way to get the gospel out to places where it otherwise may have been impossible to create a sign language translation safely.
[1] Neural networks: A set of algorithms, modelled loosely on the human brain, that are designed to recognise patterns.
[2] Avatar: An electronic image that represents, and may be manipulated by, a computer user.