Let's pick up our in-depth exploration of body fat measurement with a look at 3D body scanning and the future of body composition.
I recently attended Bodies… The Exhibition, a traveling homage to the intricacies of the human body that shows visitors dissected cadavers in a variety of displays. As someone who has dedicated many years to the study of the human form, I found it a humbling reminder of the body's complexity – and that the best way to learn about the body is to interact with it.
The previous body composition methods we’ve discussed in this series are missing just that – the interactive component. The ability to see what’s happening on the inside as a direct reflection of what is happening on the outside... The capability to spin and zoom in on body parts and see how they change over time. 3D body scanning, although still in its infancy, is changing the way we think about and interact with body composition.
What are 3D scanners?
3D scanners come in a variety of shapes and sizes, but they are all quite similar in principle. A 3D scanner takes a number of either static two-dimensional images or three-dimensional depth maps of the body and then uses these images to create a 3D avatar of the body. The 3D avatar itself is then used to extract a variety of metrics, such as body fat and circumferences. Although the majority of 3D scanners are designed and marketed for health and fitness, a wide array of applications is possible for this technology, ranging from custom clothing to ergonomic design.
How does 3D scanning work?
A number of 3D scanners have been developed to date, each with slightly different hardware, sensors, and processing pipelines. The original 3D body scanners used lasers to identify the location and shape of the body – and cost a hefty $100K. Most contemporary body scanners – Naked included – use 3D depth sensors. If you’ve ever played Xbox with Kinect, the sensors used to track your movement as you hit imaginary tennis balls are not unlike those used in 3D body scanners. In fact, Naked’s first ever prototype used Microsoft’s Kinect depth sensors!
Depth sensor technology, first commercially available in the early 2000s, projects either a known or random infrared pattern onto your body. In Naked, this pattern — completely invisible to the human eye — is detected by sensors embedded behind the mirror. Because the infrared light is captured by two cameras in known and static positions, the depth and shape of the object in view can be computed through a process called deformation analysis.
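The underlying geometry can be sketched with a toy triangulation calculation. This illustrates the general two-camera principle, not Naked's actual processing: once the same feature of the infrared pattern is located in both views, its pixel offset (disparity) between the two images determines its depth.

```python
# Illustrative sketch: recovering depth from two cameras in known, fixed
# positions. Structured-light pipelines match the projected infrared
# pattern between views; once a pattern feature is located in both
# images, its horizontal pixel offset (disparity) yields its depth.

def depth_from_disparity(focal_px: float, baseline_cm: float, disparity_px: float) -> float:
    """Depth (cm) of a point seen by two rectified cameras.

    focal_px     -- focal length in pixels (assumed identical cameras)
    baseline_cm  -- distance between the two camera centers, in cm
    disparity_px -- horizontal pixel offset of the point between views
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible in both views")
    return focal_px * baseline_cm / disparity_px

# A pattern dot shifted 40 px between views, with a 600 px focal
# length and a 6 cm baseline, sits 90 cm from the cameras:
print(depth_from_disparity(600, 6, 40))  # prints 90.0
```

Note the inverse relationship: nearby points shift more between the two views than distant ones, which is exactly what makes depth recoverable.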
Most of the commercial and consumer-grade 3D scanners use deformation analysis to reconstruct the 3D avatar. The means by which they do so, however, varies from technology to technology. Some scanners use a single depth sensor that moves up and down while the user rotates on a platform. Others use multiple depth sensors with a rotating platform (like Naked!). A final hardware approach has the user stand still while sensors are either mounted on multiple sides of the user or the sensors themselves rotate around that user.
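All of these hardware configurations reduce to the same software problem: bringing per-view point clouds into a single common frame before reconstructing a surface. Below is a minimal sketch of the rotating-platform case, assuming the platform angle at each capture is known. This is illustrative only – real pipelines also refine the alignment (for example with ICP) – and none of it reflects Naked's actual implementation.

```python
import math

# Minimal sketch of merging scans from a rotating platform: each depth
# view yields 3D points in the sensor's frame; if the platform angle at
# capture time is known, rotating every view back by that angle brings
# all points into one common frame, ready for surface reconstruction.

Point = tuple  # (x, y, z), with y as the vertical axis

def rotate_about_y(p: Point, angle_rad: float) -> Point:
    """Rotate a point about the vertical (y) axis by angle_rad."""
    x, y, z = p
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)

def merge_views(views: list) -> list:
    """views: list of (platform_angle_rad, [points]) captured per rotation step."""
    cloud = []
    for angle, points in views:
        # undo the platform rotation so all views share one frame
        cloud.extend(rotate_about_y(p, -angle) for p in points)
    return cloud
```

The same math covers the other hardware layouts too: sensors that orbit a stationary user just swap which side of the transform is "moving."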
There are pros and cons to each of these approaches: having the hardware move or having multiple cameras surrounding the user can make for easier processing, while having the user rotate generally makes for a better experience. We've opted for the latter, because we are confident in our ability to solve challenging problems with software — and we believe it makes for a more versatile product in the long term.
How does Naked’s 3D scanning work?
Naked, as it exists today, collects its depth maps using three Intel® RealSense™ depth sensors embedded behind the mirror (fun fact: we’re the first commercial product to ever use this generation of Intel depth sensors!). As your body rotates on the Naked scale, these depth maps are collected in the mirror’s processor and stitched together to create your 3D body model. That body model is sent to the cloud, where Naked uses proprietary algorithms to extract meaning from the avatar, which translates to the metrics – like circumferences and body fat – you see in the Naked App.
The depth maps we collect are accurate to within 1/10th of an inch (about 2.5 millimeters), which translates to an accuracy of approximately 1.5 centimeters on large circumferences (such as waist, stomach, and hips) and 0.5 centimeters on small circumferences (such as thighs, calves, and biceps). Unlike the approach your tailor or personal trainer may use, Naked uses concave circumferences, meaning the circumference is taken directly along the surface of the body, accounting for any indentations or protrusions. This will likely translate to slightly different circumferences from what you’d get using a tape measure, but we believe this provides the most accurate values for measurements – and this method may also translate to future features coming down the pike!
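One way to picture a surface-following ("concave") circumference: slice the 3D model at a given height, then sum the distances between consecutive boundary points of that cross-section. A hypothetical sketch – the point ordering and units here are assumptions for illustration, not Naked's algorithm:

```python
import math

# Illustrative: a "surface" circumference follows the body's actual
# cross-section outline, so it is the perimeter of the ordered boundary
# points at a given height -- indentations included. A tape measure
# instead spans indentations, like a convex hull around the same points.

def surface_circumference(boundary: list) -> float:
    """Perimeter (cm) of an ordered, closed loop of (x, z) cross-section points."""
    total = 0.0
    for i, (x1, z1) in enumerate(boundary):
        x2, z2 = boundary[(i + 1) % len(boundary)]  # wrap around to close the loop
        total += math.hypot(x2 - x1, z2 - z1)
    return total

# A 10 cm x 10 cm square cross-section has a 40 cm perimeter:
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(surface_circumference(square))  # prints 40.0
```

Because a tape measure bridges over indentations while this polyline dips into them, the surface value can only be equal to or larger than the taped one for the same slice.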
Naked among the body comp landscape
Naked derives body fat from a series of circumferences taken from the body’s surface in conjunction with user information (such as biological sex, ethnicity, and age) using an algorithm derived from DXA. In its current state, our algorithm is accurate within +/- 2.5% and consistent within +/- 1%, assuming correct user behavior (specifically dress, hair, lighting, and scan pose). Over time, in tandem with our academic partners, we will continue to refine this algorithm and use increasingly sophisticated methods to estimate and track your body composition.
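Naked's actual DXA-derived algorithm is proprietary, but the general shape of a circumference-based estimator can be sketched as a regression over surface measurements plus user information. Every coefficient and variable choice below is invented purely for illustration:

```python
# Purely illustrative sketch of a circumference-based body fat
# estimator. The coefficients are made up to show the general form of
# such a model -- a weighted combination of circumferences, height,
# and demographics -- and are NOT Naked's proprietary algorithm.

def estimate_body_fat(waist_cm: float, hips_cm: float, neck_cm: float,
                      height_cm: float, age: int, is_male: bool) -> float:
    """Hypothetical linear model; every coefficient here is invented."""
    pct = (0.45 * waist_cm + 0.25 * hips_cm - 0.60 * neck_cm
           - 0.15 * height_cm + 0.05 * age - (8.0 if is_male else 0.0))
    # clamp to a physiologically plausible range
    return max(2.0, min(pct, 60.0))
```

A real model of this class is fit against a reference method (DXA, in Naked's case) so that the regression's output tracks the criterion measurement across a study population.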
Why not use volume and the Siri equation, you might ask? Some 3D body scanning techniques do! That said, Naked did a series of validation studies to test the best approach for body fat and found that while volume-based methods yielded relatively accurate results with perfect use, even slight variation in clothing or hairstyle resulted in highly inconsistent body fat results. As we learned more about this method, we found that the volatility of volume in combination with the inherent shortcomings of a 2-compartment model (e.g. the Siri equation) made a DXA-based approach more viable and desirable.
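For context, the Siri equation is a published 2-compartment formula that maps whole-body density to a body fat percentage. The toy numbers below (chosen for illustration, not from Naked's studies) show why volume volatility matters: a half-liter of apparent extra volume – loose clothing, a different hairstyle – shifts the estimate by several percentage points.

```python
# The Siri equation converts whole-body density into a body fat
# percentage -- a classic 2-compartment model (fat mass vs. everything
# else). Density comes from mass and volume, so any error in the
# measured volume flows straight into the body fat estimate.

def siri_body_fat(mass_kg: float, volume_l: float) -> float:
    """Body fat percentage via Siri: %BF = 495 / density - 450 (density in g/cc)."""
    density = mass_kg / volume_l  # kg/L is numerically equal to g/cc
    return 495.0 / density - 450.0

# A 70 kg person measured at 67 L of body volume:
print(round(siri_body_fat(70, 67.0), 1))  # prints 23.8
# The same person with just 0.5 L of extra apparent volume:
print(round(siri_body_fat(70, 67.5), 1))  # prints 27.3
```

That swing of roughly 3.5 percentage points from a half-liter of apparent volume is the kind of inconsistency our validation studies surfaced with volume-based approaches.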
Why use DXA as a reference method? As we’ve stated in our DXA blog article, DXA is not a perfect method for body fat, and by using DXA as our criterion method, we are subject to some of the same limitations. That said, short of very expensive and inaccessible MRI scans, DXA is currently the gold standard for body composition for both medical centers and academic research, and we believe it's the best approach for 3D scanning.
Naked in the wild
Through extensive testing, both with our beta participants and university partners, we’ve learned quite a bit about how 3D scanning in the home has the potential to change the way we think about our bodies.
Perhaps the most important observation is that body fat percentages and body metrics in general can be misleading. We’ve seen firsthand that two people of the same height, same weight, and same body fat can look incredibly different. To date, no universal metrics exist that can fully encapsulate differences in body shape. At Naked, we believe that the visual representation of the body – through the 3D body model – provides the richest data for understanding your progress; the body fat and measurements serve to provide additional context to the body model, but as standalone metrics they aren’t as powerful.
Scanning the future
For tracking health and fitness, the weight scale has been a fixture in homes and medical settings for centuries. Since its invention in the 1700s, the weight scale has evolved to show slightly richer information, such as bioimpedance-based body fat, BMI, lean mass, fat mass, and total body water. But think about other technologies – like computers, TVs, or cars – that have advanced much further in a far shorter span of time.
This technology will continue to evolve dramatically over the coming years. In the next year alone, we will be unveiling a whole host of fitness-related features and integrations. In the slightly longer term, there will be endless other applications for this technology, including custom clothing, visualizing clothing on your body before you buy (hello, perfectly fitting pants!), playing yourself in a video game (no joke, our CEO Farhad has done this…), and so many others. Naked and the broader 3D scanning landscape are moving us away from a world built for the ‘average’ person – and toward a world customized just for you.