
Tatum Robotics: A helping hand


With a cobot hand sufficiently dexterous to perform sign language, Tatum Robotics hopes to help DeafBlind people around the world achieve improved powers of communication


It’s hard to comprehend a world where your entire means of communication is based on touch alone, but this is the reality for those with combined sight and hearing loss.

A tactile form of sign language – tactile signing, in which the DeafBlind person places their hands on top of another person’s signing hands to feel and read their movements – is key to their ability to converse and learn about the world around them.

As an engineering student at Northeastern University in Boston, Massachusetts, Samantha Johnson took a class in American Sign Language (ASL). That brought her into contact with members of the city’s deaf community and, in particular, local DeafBlind people.

“I began chatting with one of the women there through tactile signing. So essentially, she held onto my hand while I was signing with her. And I was just really amazed that she was receiving communication through just touching my hand,” Johnson recalls. “I asked her, how would we talk if we weren’t right next to each other right now?”

The answer was simple – it’s only possible in person. That seemed crazy to Johnson, who stayed in contact with some of the DeafBlind people she’d met after the course finished.

When the Covid-19 pandemic hit, isolation at home put a stop to interpreter visiting services for such people, leaving an already isolated and forgotten community even more neglected, says Johnson. In some cases, the visits ceased without any explanation, leaving DeafBlind people with no clue as to what was happening and no way to follow the news or reach out for assistance.


Staying in touch

Keen to help, Johnson reached out to a DeafBlind centre in Boston and chatted with other advocates about what they could do to support the community during this time. “We started thinking about communication tools and started developing this really early-stage tactile signing system,” she says.

Johnson’s aim was to build a cobot that could help supplement the DeafBlind community’s ability to communicate remotely, relaying phone conversations or simply reading the news. Starting by understanding existing technologies and their limitations, the team began to 3D-print prototypes and assess what would be needed to develop a text-to-sign device.


“My outlook on robotics, and a lot of technologies in general, is that if we just took one step further, we could use all the existing technologies and make them more accessible,” says Johnson.

While some redesign work is often required, the real skill in many projects like this lies in knowing how to incorporate universal design in order to get a result that is more inclusive.

About to graduate from her studies, Johnson was approached by the Canadian National Institute for the Blind, which had learnt about the project’s early progress and provided her with a grant to start a company around the cobot technology.

Today, Johnson is CEO of Tatum Robotics, developing the T1 Fingerspelling Hand and leading a team headquartered at MassRobotics, Boston – a robotics innovation hub and automation start-up accelerator in the city.

The company works not only with Boston’s DeafBlind groups, but also similar groups in New York, Connecticut and Washington DC. The goal is to keep expanding, to get as much feedback as possible from the widest possible community of users.

This feedback is critical to the design process, Johnson explains. “There’s very little literature about how you design technology for the DeafBlind – designing user interfaces for people who can’t see or hear them. So, we’ve been needing to play a lot with haptics and with the different sorts of environmental variables that we can adjust, and we’re able to iteratively prototype with 3D printing.”

As Tatum Robotics started to bring in more mechanical engineers, it made the decision to join Onshape’s start-up programme, giving it access to PTC’s cloud-based CAD software to model the cobot hands.

“Once we started building out a team, I realised that our work needed to be collaborative. We needed to all be able to use the same documents, we needed to have versions, and we needed to be able to revise. And especially, as I mentioned, we get feedback constantly, so we have to iterate really quickly.”

Johnson says that using Onshape has been a positive experience so far. “It’s been really helpful, allowing a lot of hands to be in the same document, while not making too much of a mess. And it allowed us to build this team really early on,” she says.

“Using Onshape allows us to have a virtual team and really stay on pace with each other, and we’ve been really excited to be a part of its start-up programme and see how the product has grown since.”

The team is building new prototypes every day, from new finger concepts to tactile gloves that cover the inner workings, so that new ideas can quickly be tested. “We have two DeafBlind folks here in Boston that are in our office every week, really validating what we’re designing, giving me feedback,” says Johnson.

Manual dexterity

The team at Tatum Robotics quickly realised that it needed a highly dexterous robotic hand capable of performing ASL, which would require many different joints moving independently. Today, the company has developed an 18-degree-of-freedom hand that is undergoing beta testing.
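
As a rough illustration of what fingerspelling asks of such a hand, the short Python sketch below maps each letter to a set of 18 joint angles and streams the poses at a configurable pace. The pose values, joint ordering and timing are hypothetical placeholders for illustration, not Tatum Robotics’ actual control code.

```python
# Minimal sketch of text-to-fingerspelling pose streaming for a hypothetical
# 18-degree-of-freedom hand. All joint angles and timings are invented for
# illustration; they are not Tatum Robotics' real handshape data.
import time

# Hypothetical pose table: each letter maps to 18 joint angles in degrees.
# Only two letters are shown; real ASL handshapes are far more nuanced.
POSES = {
    "a": [90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 90, 0, 10, 20, 0, 0, 0],
    "b": [0] * 12 + [90, 45, 0, 0, 0, 0],
}
REST = [0] * 18  # neutral pose used between words and for unknown characters


def fingerspell(text, seconds_per_letter=0.8):
    """Yield one (character, 18-value pose) pair per letter, pacing the output."""
    for ch in text.lower():
        pose = POSES.get(ch, REST)
        yield ch, pose
        time.sleep(seconds_per_letter)


if __name__ == "__main__":
    # In a real device each pose would be sent to the hand's joint controllers;
    # here the poses are simply printed.
    for ch, pose in fingerspell("ab", seconds_per_letter=0.1):
        print(ch, pose)
```

The adjustable pacing reflects the fact that tactile readers keep a hand on the device throughout, so each letter has to be held long enough to be felt before the next one begins.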

Aside from the usual concerns of cobot design, the hand must also account for the fact that its users cannot hear it or see it moving. In most cases, they have little to no experience with technology.

“So we really needed to make sure this was safe to use under all circumstances – the epitome of collaborative robotics – since users are completely in contact with it the whole time,” says Johnson.

The design couldn’t use traditional linkages in the joints, because the back profile of the hand might then have connections where the user’s fingers could be pinched. This led the team to start building the hand with flexible materials, testing different durometers of 3D-printed flexible elastomeric polyurethanes (EPUs) and thermoplastic polyurethanes (TPUs), but also working with local suppliers on tooling for silicone and urethane castings.

So far, the cobot hands have been tested by over 50 DeafBlind people and the feedback has been remarkable. “Folks are using it with almost no training and they’re getting 95% to 98% accuracy without any training on the device,” says Johnson. “And that is the goal. They put their hand on it, and it moves like a hand that they’re used to.”

Tatum Robotics’ software for the hands is cloud-based and can be customised for each user, continues Johnson. “You can customise the hand shapes, the speeds, the grammar it’s using, to really fit that use case for those DeafBlind folks. So iterative prototyping has really been our best friend, as we are trying to figure out what that should look like.”
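
As a simple illustration of what that per-user customisation might look like, the sketch below gathers signing speed, handshape variant and grammar settings into a profile object. The field names and defaults are assumptions made for illustration, not Tatum Robotics’ actual configuration schema.

```python
# Illustrative per-user profile for a fingerspelling hand. Field names and
# default values are assumptions, not Tatum Robotics' real settings.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    name: str
    letters_per_minute: int = 60           # pacing of fingerspelled letters
    pause_between_words_s: float = 1.0     # rest-pose hold between words
    handshape_variant: str = "standard"    # e.g. a regional or personal letter style
    grammar: str = "ASL"                   # target grammar/word order for output
    custom_handshapes: dict = field(default_factory=dict)  # per-letter pose overrides


# A profile tuned for a slower, more deliberate signing pace.
profile = UserProfile(name="demo user", letters_per_minute=45,
                      pause_between_words_s=1.5)
print(profile)
```

Because the software is cloud-based, such a profile could follow a user across devices and be refined as feedback comes in.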

As the current version of the T1 Fingerspelling Hand moves into beta testing, the company is bringing in DeafBlind people and training them to use the devices completely independently, to see how they get on without another person present.

By putting more devices in the homes of DeafBlind users, the team aims to learn more about what they can be used for and how often. Above all, the Tatum Robotics team hopes that opening new avenues of communication will transform lives and ensure that DeafBlind people are never again left behind.


This article first appeared in DEVELOP3D Magazine
