Feb 18, 2015 · 2 minutes

There was some debate last week over whether people should buy dedicated fitness trackers or if they should just let smartphones count their steps from inside their pockets. It was prompted by a study which, depending on who you ask, either proved smartphones are more accurate than fitness trackers or showed that a FitBit is still better than an iPhone.

Mother Jones made the first argument in a blog post titled "Science Says FitBit is a Joke." Its interpretation of the study led it to believe that relying on a smartphone to monitor activity is more convenient, more accurate, and much cheaper than buying a separate fitness tracker.

Outside tore apart that argument in a post titled "No, Your FitBit Isn't Actually Lying to You." (It's interesting how both headlines presume that "FitBit" is shorthand for "fitness tracker" in consumers' minds.) Its reading of the same study led to these conclusions:

[W]hile the Nike Fuelband was indeed inaccurate (which is something we’ve known for years), and the aging Fitbit Flex wasn’t so great either, the Jawbone UP24 actually out-performed all the smartphones in the 1,500-step test. But that’s just the wrist-worn trackers. When you look at the two trackers you wear on your hip or in your pocket (the Fitbit One and the Fitbit Zip, both of which are now more than two years old), you can see that these trackers were actually more than three times more accurate than the phones.
The result is a much more nuanced takeaway: some products are better than others, and consumers prioritize different things (heart rate sensors, not having to wear an irritating bracelet), which makes a consensus on this issue seem pretty unlikely.

But I think the biggest problem with the back-and-forth over this study is that it assumes most people who buy fitness trackers, or use their smartphones to count their steps, need them to be accurate. They don't. All they need is a number that rises when they walk around.

Craig Mod explained the appeal of a FitBit in a Morning News article, recounting his own experience with the itty-bitty fitness tracker:

Active for one person is inactive for another. The only way to understand your definition is to give it personal form. One way to make real such abstractions is to ground them in a number. Any number. Steps work well.

'What kind of day was today?' you ask. The steps answer.

You learn to suss out the qualitative feeling of worn comfort after a 20,000-step day, and know the anxiety wrought by an idle 5,000-step day.

Those numbers don't have to be accurate to provide someone with an idea of their general activity level. As long as they're at least moderately consistent, and they're used as guides instead of absolute values, the numbers probably don't matter to most consumers.

And if the numbers don't matter, how much does the tool really matter? Different things work for different people. Someone might want to wear a fitness tracker because seeing it reminds them to get off their asses. Someone else might prefer to use their iPhone.

I suspect the accuracy of these tools won't influence either consumer. So as interesting as this study is, these various interpretations are ultimately meaningless. People like what they like. Nobody but the most fastidious of athletes really cares about all this crap.

[illustration by Brad Jonas]