17 March 2017. A system that captures and interprets the spoken words of individuals and groups can track their moods, which in turn predict eating behaviors that lead to obesity. A team from the University of Southern California in Los Angeles and the University of Virginia in Charlottesville presented its findings at the annual meeting of the American Psychosomatic Society, now underway in Seville, Spain.
Obesity is a continuing problem in the U.S., where nearly 4 in 10 adults (37%) and almost 1 in 5 children (17%) are considered obese, according to the Centers for Disease Control and Prevention. Curbing obesity can help prevent chronic medical conditions including heart disease, stroke, and type 2 diabetes, as well as some forms of cancer.
Researchers from USC’s Center for Economic and Social Research and colleagues are seeking to better understand factors that contribute to unhealthy eating, particularly those brought about by mood and social interactions. The team, led by psychologist Donna Spruijt-Metz, director of the center’s mHealth Collaboratory, notes that current methods for tracking eating behavior, which ask individuals to keep a log of their food intake, are not adequate.
“The three-day multiple pass dietary recall that asks people to remember what they ate is the gold standard for measuring food intake, but we can’t accurately measure someone’s diet or food intake,” says Spruijt-Metz in a USC statement. “We really have no idea what people eat, because people lie. People don’t remember.”
Instead, Spruijt-Metz and her USC and Virginia colleagues developed a system called Monitoring and Modeling Family Eating Dynamics, or M2FED, to study eating habits in families with remote sensors and wearable devices. The project, funded in 2015 by the National Science Foundation, combines in-home sensors and microphones with smart-watch devices to capture conversations. The smart watches also record eating behavior from wrist motions, including the length of time spent eating and the speed of food consumption. Those conversation and eating-behavior data are then evaluated by algorithms refined over time with artificial intelligence and machine learning.
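The M2FED software itself has not been published, so the sketch below is only a minimal illustration, in Python with scikit-learn, of the general idea of assigning a mood label to short conversation snippets with a machine-learning classifier. The example snippets, the text features, and the logistic-regression model are all assumptions made for illustration, not the team’s actual method.

```python
# Hypothetical sketch only: M2FED's pipeline is not public, so invented
# snippet texts and a generic scikit-learn classifier stand in for the
# acoustic/linguistic features and models the real system may use.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

MOODS = ["anger", "annoyance", "anxiety", "boredom", "happiness", "sadness"]

# Transcribed 2-5 second snippets (invented) with human-coded mood labels.
snippets = [
    "I can't believe you did that again",   # anger
    "this meeting is dragging on forever",  # boredom
    "that was such a fun afternoon",        # happiness
]
labels = ["anger", "boredom", "happiness"]

# Bag-of-words features plus logistic regression: a simple stand-in for
# whatever features and model the actual system refines over time.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(snippets, labels)

print(model.predict(["dinner tonight was really great"]))
```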
The paper presents early data evaluating M2FED’s ability to accurately interpret conversations and detect different types of moods and stress related to eating behavior. The researchers recruited 10 volunteers, 5 male and 5 female, to test the system. M2FED recorded 535 snippets of conversation from the volunteers, lasting 2 to 5 seconds each. At the same time, a research assistant observing the volunteers coded the interactions for comparison.
In both cases, the pieces of conversation were evaluated for the 6 mood types most associated in the literature with unhealthy eating: anger, annoyance, anxiety, boredom, happiness, and sadness. The results show M2FED correctly identified happiness and sadness in 89 percent of conversations, and anger, annoyance, anxiety, and boredom in nearly all cases (95 to 98 percent).
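The paper does not spell out how this agreement was computed, but per-mood accuracy against human-coded labels is a straightforward calculation. The sketch below, using invented labels and a hypothetical helper function, shows one simple way to compare the system’s output with the research assistant’s codes.

```python
# Illustrative only: per-mood agreement between system output and the
# research assistant's codes, computed on invented example labels.
from collections import defaultdict

def per_mood_accuracy(system_labels, human_labels):
    """Fraction of human-coded snippets of each mood that the system matched."""
    hits, totals = defaultdict(int), defaultdict(int)
    for sys_label, human_label in zip(system_labels, human_labels):
        totals[human_label] += 1
        if sys_label == human_label:
            hits[human_label] += 1
    return {mood: hits[mood] / totals[mood] for mood in totals}

human  = ["anger", "anger", "happiness", "boredom", "sadness"]
system = ["anger", "anger", "sadness",   "boredom", "sadness"]
print(per_mood_accuracy(system, human))
# -> {'anger': 1.0, 'happiness': 0.0, 'boredom': 1.0, 'sadness': 1.0}
```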
“The culture at home, within the family, can affect how people eat,” adds Spruijt-Metz. “We can now reliably measure that with sensors. Forget measuring dietary intake.”
* * *