3D-printed objects that sense how a user is interacting with them

September 14, 2021   |   Innovation

Engineers create 3D-printed objects that sense how a user is interacting with them
Researchers from MIT have developed a method to integrate sensing capabilities into 3D-printable structures composed of repeating cells, enabling designers to rapidly prototype interactive input devices. Credit: Massachusetts Institute of Technology

MIT researchers have developed a new method to 3D print mechanisms that detect how force is being applied to an object. The structures are made from a single piece of material, so they can be rapidly prototyped. A designer could use this method to 3D print “interactive input devices,” like a joystick, switch, or handheld controller, in one go.

To accomplish this, the researchers integrated electrodes into structures made from metamaterials, which are materials divided into a grid of repeating cells. They also created editing software that helps users build these interactive devices.

“Metamaterials can support different mechanical functionalities. But if we create a metamaterial door handle, can we also know that the door handle is being rotated, and if so, by how many degrees? If you have special sensing requirements, our work enables you to customize a mechanism to meet your needs,” says co-lead author Jun Gong, a former visiting Ph.D. student at MIT who is now a research scientist at Apple.

Gong wrote the paper alongside fellow lead authors Olivia Seow, a graduate student in the MIT Department of Electrical Engineering and Computer Science (EECS), and Cedric Honnet, a research assistant in the MIT Media Lab. Other co-authors are MIT graduate student Jack Forman and senior author Stefanie Mueller, who is an associate professor in EECS and a member of the Computer Science and Artificial Intelligence Laboratory (CSAIL). The research will be presented at the Association for Computing Machinery Symposium on User Interface Software and Technology next month.

“What I find most exciting about the project is the capability to integrate sensing directly into the material structure of objects. This will enable new intelligent environments in which our objects can sense each interaction with them,” Mueller says. “For instance, a chair or couch made from our smart material could detect the user’s body when the user sits on it and either use it to query particular functions (such as turning on the light or TV) or to collect data for later analysis (such as detecting and correcting body posture).”

Embedded electrodes

Because metamaterials are made from a grid of cells, when the user applies force to a metamaterial object, some of the flexible, interior cells stretch or compress.

The researchers took advantage of this by creating “conductive shear cells,” flexible cells that have two opposing walls made from conductive filament and two walls made from nonconductive filament. The conductive walls function as electrodes.

When a user applies force to the metamaterial mechanism—moving a joystick handle or pressing the buttons on a controller—the conductive shear cells stretch or compress, and the distance and overlapping area between the opposing electrodes changes. Using capacitive sensing, those changes can be measured and used to calculate the magnitude and direction of the applied forces, as well as rotation and acceleration.
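For a rough sense of the underlying physics, the sketch below treats a conductive shear cell's two opposing electrode walls as an idealized parallel-plate capacitor, so a change in measured capacitance can be mapped back to a change in electrode spacing. The dimensions, values, and function names are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' implementation): a conductive shear cell's two
# opposing electrode walls behave roughly like a parallel-plate capacitor, so a
# change in capacitance can be mapped back to a change in plate spacing or overlap.

EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(area_m2: float, gap_m: float, eps_r: float = 1.0) -> float:
    """Idealized parallel-plate capacitance C = eps_r * eps_0 * A / d."""
    return eps_r * EPSILON_0 * area_m2 / gap_m

def estimate_gap(c_measured: float, area_m2: float, eps_r: float = 1.0) -> float:
    """Invert the same model to estimate electrode spacing from a capacitance reading."""
    return eps_r * EPSILON_0 * area_m2 / c_measured

# Example: a 5 mm x 5 mm electrode pair whose gap shrinks from 2 mm to 1.5 mm
# under load produces a measurable rise in capacitance.
area = 5e-3 * 5e-3
c_rest = plate_capacitance(area, 2e-3)
c_pressed = plate_capacitance(area, 1.5e-3)
print(f"rest: {c_rest*1e15:.2f} fF, pressed: {c_pressed*1e15:.2f} fF")
```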

To demonstrate this, the researchers created a metamaterial joystick with four conductive shear cells embedded around the base of the handle, one in each direction (up, down, left, and right). As the user moves the joystick handle, the distance and overlapping area between the opposing conductive walls change, so the direction and magnitude of each applied force can be sensed. In this case, those values were converted to inputs for a “PAC-MAN” game.
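As a hypothetical illustration of that mapping (the paper's actual signal pipeline is not described here), the sketch below turns four per-direction capacitance readings into a two-axis joystick value suitable for game input. The baselines, dead zone, and readings are invented for the example.

```python
# Hypothetical sketch: turn four per-direction capacitance readings (up, down,
# left, right) into a 2D joystick vector. Baselines and thresholds are made up.

def joystick_vector(readings: dict, baselines: dict, deadzone: float = 0.05):
    """Return (x, y) in [-1, 1] from normalized per-cell capacitance changes."""
    def delta(direction):
        base = baselines[direction]
        return max(0.0, (readings[direction] - base) / base)  # relative increase

    x = delta("right") - delta("left")
    y = delta("up") - delta("down")
    # Clamp and apply a small dead zone so sensor noise does not move the character.
    x = 0.0 if abs(x) < deadzone else max(-1.0, min(1.0, x))
    y = 0.0 if abs(y) < deadzone else max(-1.0, min(1.0, y))
    return x, y

baselines = {"up": 110.0, "down": 110.0, "left": 110.0, "right": 110.0}  # fF at rest
readings  = {"up": 112.0, "down": 110.0, "left": 110.0, "right": 146.0}  # handle pushed right
print(joystick_vector(readings, baselines))  # -> roughly (0.33, 0.0)
```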

By understanding how joystick users apply forces, a designer could prototype unique handle shapes and sizes for people with limited grip strength in certain directions.

The researchers also created a music controller designed to conform to a user’s hand. When the user presses one of the flexible buttons, conductive shear cells within the structure are compressed and the sensed input is sent to a digital synthesizer.
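As a hedged illustration of that last step, the sketch below uses the open-source mido MIDI library to translate a button's capacitance rise into note-on and note-off messages for a software synthesizer. The threshold, note mapping, velocity scaling, and port name are assumptions; the article does not specify how the controller communicates with the synthesizer.

```python
# Hypothetical sketch using the mido MIDI library: when a button's conductive
# shear cells compress past a threshold, send a note-on to a software synth.
# Threshold, note numbers, and velocity scaling are illustrative assumptions.

import mido

NOTE_FOR_BUTTON = {0: 60, 1: 62, 2: 64, 3: 65}   # C4, D4, E4, F4
PRESS_THRESHOLD = 0.15                            # relative capacitance increase

def handle_buttons(readings, baselines, port, pressed_state):
    for button, value in readings.items():
        rel = (value - baselines[button]) / baselines[button]
        is_pressed = rel > PRESS_THRESHOLD
        if is_pressed and not pressed_state.get(button, False):
            velocity = min(127, int(rel * 400))   # harder press -> louder note
            port.send(mido.Message("note_on", note=NOTE_FOR_BUTTON[button], velocity=velocity))
        elif not is_pressed and pressed_state.get(button, False):
            port.send(mido.Message("note_off", note=NOTE_FOR_BUTTON[button]))
        pressed_state[button] = is_pressed

# Usage (port name depends on the synthesizer installed on the machine):
# port = mido.open_output("My Synth")
# handle_buttons({0: 130.0}, {0: 110.0}, port, pressed_state={})
```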

This method could enable a designer to quickly create and tweak unique, flexible input devices for a computer, like a squeezable volume controller or bendable stylus.

A software solution

MetaSense, the 3D editor the researchers developed, enables this rapid prototyping. Users can manually integrate sensing into a metamaterial design or let the software automatically place the conductive shear cells in optimal locations.

“The tool will simulate how the object will be deformed when different forces are applied, and then use this simulated deformation to calculate which cells have the maximum distance change. The cells that change the most are the optimal candidates to be conductive shear cells,” Gong says.
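The selection idea Gong describes can be sketched as follows, with the deformation simulation itself stubbed out and the cell identifiers and data layout invented for illustration: rank the cells by how much their electrode gap changes under the simulated load and keep the top candidates.

```python
# Sketch of the selection step: given simulated per-cell electrode gaps at rest
# and under load, pick the cells whose gap changes most as candidates for
# conductive shear cells. The simulator is not shown; cell IDs are assumptions.

def rank_shear_cell_candidates(rest_gaps, deformed_gaps, top_k=4):
    """rest_gaps / deformed_gaps: dict of cell_id -> electrode gap (mm).
    Returns the top_k cell IDs with the largest absolute gap change."""
    changes = {
        cell: abs(deformed_gaps[cell] - gap)
        for cell, gap in rest_gaps.items()
    }
    return sorted(changes, key=changes.get, reverse=True)[:top_k]

rest     = {"c01": 2.0, "c02": 2.0, "c03": 2.0, "c04": 2.0, "c05": 2.0}
deformed = {"c01": 1.4, "c02": 1.9, "c03": 1.6, "c04": 2.0, "c05": 1.8}
print(rank_shear_cell_candidates(rest, deformed, top_k=2))  # -> ['c01', 'c03']
```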

The researchers endeavored to make MetaSense straightforward, but there are challenges to printing such complex structures.

“In a multimaterial 3D printer, one nozzle would be used for nonconductive filament and one nozzle would be used for conductive filament. But it is quite tricky because the two materials may have very different properties. It requires a lot of parameter-tuning to settle on the ideal speed, temperature, etc. But we believe that, as 3D printing technology continues to get better, this will be much easier for users in the future,” he says.
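Purely as an illustration of the kind of per-nozzle tuning Gong mentions, a dual-material print profile might separate parameters along these lines. The values are placeholders, not the authors' settings, and real numbers depend on the specific filaments and printer.

```python
# Illustrative dual-extruder profile (placeholder values only): conductive
# filaments often print hotter and slower than the nonconductive base material.

PRINT_PROFILE = {
    "nonconductive_pla": {"nozzle_temp_c": 205, "print_speed_mm_s": 50, "flow_pct": 100},
    "conductive_pla":    {"nozzle_temp_c": 225, "print_speed_mm_s": 30, "flow_pct": 105},
    "shared": {"bed_temp_c": 60, "layer_height_mm": 0.2, "purge_volume_mm3": 60},
}
```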

In the future, the researchers would like to improve the algorithms behind MetaSense to enable more sophisticated simulations.

They also hope to create mechanisms with many more conductive shear cells. Embedding hundreds or thousands of conductive shear cells within a very large mechanism could enable high-resolution, real-time visualizations of how a user is interacting with an object, Gong says.


Provided by Massachusetts Institute of Technology

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

Citation:
3D-printed objects that sense how a user is interacting with them (2021, September 14)
retrieved 14 September 2021
from https://techxplore.com/news/2021-09-3d-printed-user-interacting.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.


