Key Takeaways

  • 90% of 117 wearable devices monitor Health & Wellness metrics, making it the most widely tracked data category.
  • 63% of the devices analyzed record location data either through built-in GPS or connected GPS via a smartphone.
  • 21% of the major wearable brands analyzed (7 of 33) explicitly share or sell personal data to marketing partners or third-party advertisers.
  • 55% of these brands share de-identified biometric data with outside researchers.

Your smartwatch is more than just a fitness tracker — it’s a window into your daily life. It monitors your heart rate, logs your sleep patterns, detects signs of stress, and even records your alcohol consumption. But where does all that sensitive data go? Who gets to see it and what do they do with it?
In this investigation, we at vpnMentor dive deep into what wearable devices are truly collecting. We’ll uncover how that data is being used, shared, and in many cases, quietly monetized. If you own a smartwatch, fitness tracker, or any health-focused wearable, you may be sharing far more than you bargained for.

What These Devices Collect

In this study, we examined 117 wearable devices from 33 major brands, including Apple, Fitbit, Samsung, and Huawei. Overall, we identified 22 different types of devices, ranging from popular trackers and smartwatches to more specialized products like EEG headbands and bike computers.

For each device, we noted the specific types of data it collects, and then we looked into what each company is doing with all this data. You can find the full details of our methodology further down.

Health & Wellness

Our research shows that 90% of wearable devices monitor at least one Health & Wellness metric, making it the most widely tracked category. However, some of these metrics are more commonly measured than others.

For instance, 71% of devices track the user’s heart rate, 56% monitor blood oxygen, and 29% assess skin temperature variation. More specialized features—such as calories burned and glucose monitoring—are available in only a select few devices.

Tracking health-related metrics can help users assess their physical condition, detect early signs of illness, and manage chronic diseases. Having said that, it’s important to keep in mind that sensitive health information is vulnerable to unauthorized sharing or misuse, which may have consequences beyond privacy infringement.

For example, in 2020, the City of Chicago’s wellness program required thousands of employees and their spouses to submit biometric screening data. According to the City of Chicago, the data was gathered to improve employee health and manage health conditions.

However, many employees reported their data was shared with multiple companies without clear consent. This led to a class-action lawsuit that cited violations of privacy and discrimination laws. Some claimed that the program discriminated against employees with disabilities or genetic conditions by forcing them to disclose sensitive health information under threat of financial penalties.

Fitness & Activity

Wearable devices have proven very effective at encouraging users to do more physical activity. A 2022 analysis of nearly 400 studies involving over 160,000 participants found that wearables can increase daily activity by roughly 40 minutes of walking and about 1,800 extra steps.

Our research shows that step tracking is nearly universal among modern wearables, being monitored by 98 of the 117 devices we looked into. In fact, 87% of the devices in our research collect data related to fitness and activity, ranking it as the second most common category.

Although the number of steps you take or the distance you travel may not be as sensitive as other types of data, companies should still make sure fitness and activity records are handled and stored securely.

For example, back in 2021, a database belonging to GetHealth (a health and wellness data unification provider) was discovered online without password protection. The database exposed 61 million fitness records pertaining to users of Fitbit, Apple HealthKit, and other wearable devices.

GetHealth secured the database quickly after being notified, but the incident highlighted the risks of storing large amounts of data without proper security. In this case, the exposed records matched the fitness datapoints (like speed, weight, and heart rate) to information that could help identify the user they belonged to (name, date of birth, gender, and even time zone).

If the data were misused, it could lead to targeted advertising, phishing attacks, or even blackmail. We’ll dive deeper into the potential risks in our case study further down.

Location

We also found that roughly 63% of the wearable devices analyzed record location data. Specifically, 49 devices have integrated GPS, while 28 rely on a smartphone’s GPS to capture location.

On the one hand, GPS tracking can be highly effective at improving safety, particularly for workers in hazardous environments like mining and construction. On the other, if it is misused, the collection of location data can be extremely invasive and even put people’s safety at risk.

For instance, in 2018, Strava—a fitness app that displays the aggregated activity of its users on a global heatmap—unintentionally revealed the locations of secret U.S. military bases in Afghanistan and Syria. The incident led militaries worldwide to restrict the use of fitness apps by personnel. Strava responded by updating its privacy settings and excluding known military sites from its heatmaps.

That same year, fitness technology company Polar faced a similar scandal. Its public activity map allowed anyone with a Polar account to see other users’ personal data, such as their names, home addresses, and time-stamped GPS records of their workout sessions.

Polar claimed that only data from users who had agreed to public sharing was exposed in the map, but the ease with which sensitive information could be accessed led to the removal of the public location-tracking feature.

Sleep

Sleep data is also widely captured by wearable devices, with 83% of those we analyzed monitoring this type of information. The most common feature in this category is a sleep log, which is available on 90 devices.

Like other data collected by wearable devices, sleep data can raise privacy concerns. For instance, companies often share it with third parties without the users’ explicit consent or awareness. Most of the time, this results in targeted advertising, but the data could also be used for research purposes or monitored by employers as part of a wellness program.

Stress & Mindfulness

Metrics related to stress and mindfulness are less prevalent, tracked by a little over half (55%) of all devices analyzed. Specifically, out of 117 devices, 31 track stress levels, while 11 offer a feature called Meditation Minutes, designed to help users monitor and manage their meditation practice.

Only select wearables—like the Muse headband and the Empatica EmbracePlus—provide more specialized mindfulness-related data, such as meditation biofeedback or stress readings based on electrodermal activity (EDA).

Advanced Running

Specialized running metrics, which provide detailed performance insights (such as recovery time or running power), are the least common in our data set. They are monitored by just 21% of devices.

Generally, advanced running metrics are a distinctive feature of high-end sports watches, such as the Coros Pace 3 and the Suunto 9 Peak Pro.

Where Does All the Data Go?

Many wearable technology companies claim they don’t sell your data — and while this may technically be true, some still share your information (even if it is sometimes anonymized) with service providers, research groups, or advertisers.

Out of 33 major wearable brands analyzed, 7 companies explicitly share or sell personal data to marketing partners or third-party advertisers. These are Samsung, Huawei, Xiaomi, Amazfit, Meta Quest, Ray-Ban Meta, and Tile/Life360.

For example, Samsung’s Galaxy Watch series shares data within its ecosystem and uses it for personalized advertising; Xiaomi’s Mi Band and Amazfit devices do the same. Meanwhile, Meta’s devices—including Meta Quest VR headsets and Ray-Ban Meta smart glasses—use biometric data (such as gaze-tracking) and media uploads to personalize ads and train AI. User opt-out options are limited on these devices.

Huawei, on the other hand, acknowledges in its privacy policy that it may obtain information about you from third-party sources (like the social media account you use to log into its platform) and share it with its business partners for advertising purposes. This data is stored on cloud servers, some of which may be located outside the U.S.

Having said that, some companies are more careful with the data they share — or so they claim. Google, for instance, explicitly states that it does not use health data for advertising, whereas Apple shares data only with user permission. Meanwhile, other companies emphasize GDPR and HIPAA compliance.

EmbracePlus and Sensoria, for instance, claim to adhere to HIPAA and to share data only under research contracts, focusing on confidentiality and user consent. Oura, Withings, Microsoft, Empatica, WHOOP, Dexcom, and MiniMed make similar claims. All in all, over half (55%) of the companies we considered share de-identified biometric data with outside researchers.

While it’s reassuring that the majority of the companies examined in this study refrain from sharing users’ personally identifiable information, it’s concerning that major players like Samsung, Huawei, and Xiaomi openly disclose personal data for advertising purposes — particularly given their significant influence in the wearable tech market.

The following color-coded table ranks the 33 companies included in our analysis based on their data-sharing practices, categorizing them as Excellent, Good, Moderate, or Poor. These ratings were derived from a thorough review of each company’s privacy policy, focusing specifically on how they handle user data, their transparency, and their data-sharing practices. You can see more details regarding our methodology at the end of the article.

[Table: data-sharing ratings (Excellent, Good, Moderate, Poor) for the 33 companies analyzed]

Employer and Insurer Programs: Wellness or Surveillance?

Life360 used to sell precise location data to data brokers until a few years ago. Although the company claims to have stopped this practice in 2022, it currently allows users to optionally share driving data with Arity, a subsidiary of Allstate, for personalized insurance quotes.

Like Life360, other wearable technology companies now share health data with insurance companies and HR teams as part of wellness incentive programs. The stated aim is to improve employee health and reduce healthcare costs by tracking metrics such as heart rate, sleep quality, and activity levels.

For example, WHOOP offers a sophisticated platform that streams continuous physiological data (like heart rate, heart rate variability, skin conductivity, temperature, and motion) to a cloud-based analytics system. This system provides coaches, trainers, and potential employers with dashboards that reveal how much strain an individual is under, their recovery status, and sleep efficiency.

WHOOP’s team dashboard also allows employers or trainers to monitor an individual’s health and performance in real time. Although this can be beneficial for athletes aiming to optimize training and recovery, such detailed monitoring raises privacy concerns in non-athletic environments.

How much does your employer need to know about you? Are they entitled to know how fast your heart is beating or if you have trouble sleeping? What about your insurer? What decisions are they making based on the data they get about you?

The privacy risks of these programs are underscored by watchdog groups like Mozilla, which flags six major wearable product lines—the Google Pixel Watch, Samsung Galaxy Watch, Huawei Watch, WHOOP Strap, Amazfit, and Xiaomi Mi Band—with its “Privacy Not Included” warning. This label signals persistent red flags regarding data security, transparency, and user control, even among market leaders.

Despite this, wellness incentive programs are becoming increasingly common. According to Shortlister’s 2025 Workplace Wellness Trends Report, almost 85% of large U.S. employers offer wellness programs.

Case Study: The Potential Risks of Exposed Health Data

For this research, we also conducted a detailed analysis of Apple’s HealthKit framework, which serves as the central platform for health and fitness data on iOS devices such as iPhones and Apple Watches.

In total, we identified 134 distinct data categories covering a broad range of health information, from body measurements and fitness and activity to reproductive health, alcohol consumption, and mental health data.

All health and fitness data stored in HealthKit is encrypted on the device. It can only be accessed when the device is unlocked using a passcode or Face ID. Apple also employs end-to-end encryption when users sync their health data to iCloud and requires explicit user permission before any app can read or write health data.
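
To make this permission model concrete, here is a minimal Swift sketch of how an iOS app must request HealthKit authorization before it can read anything. The specific data types requested (heart rate and sleep analysis) are illustrative choices on our part, not part of the analysis:

```swift
import HealthKit

// Minimal sketch of HealthKit's permission model: an app must request
// authorization for each data type before it can read that data.
// The types below are illustrative choices, not from the study.
let healthStore = HKHealthStore()

let typesToRead: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .heartRate)!,
    HKObjectType.categoryType(forIdentifier: .sleepAnalysis)!
]

healthStore.requestAuthorization(toShare: nil, read: typesToRead) { granted, error in
    // HealthKit never tells the app whether the user denied read access;
    // queries for denied types simply return no data.
    if let error = error {
        print("Authorization request failed: \(error.localizedDescription)")
    } else {
        print("Authorization prompt completed: \(granted)")
    }
}
```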

Although Apple has excellent data-sharing practices, it’s important to note that HealthKit data can be highly detailed and often personally identifiable. This means that, if compromised, it could lead to serious privacy violations, discrimination, or even legal risks.

For example, variables like atrialFibrillationBurden and irregularHeartRhythmEvent reveal information about cardiac health conditions. This sensitive medical data could be exploited by insurers to adjust premiums or even deny coverage, leading to potential discrimination based on an individual’s health status.

Reproductive health data (recorded through variables like ovulationTestResult and pregnancyStatus) could be subpoenaed in investigations related to abortion, placing users at legal risk. This is especially true in the U.S. following the Dobbs v. Jackson Women’s Health Organization Supreme Court decision, which gave individual states the power to impose strict abortion bans or restrictions.

Data entries in the sexualActivity field also expose highly intimate aspects of a user’s private life. If this data were leaked or illegally accessed, it could lead to reputational damage or even blackmail. Moreover, data stored under HKWorkoutRouteTypeIdentifier captures precise workout routes, enabling the reconstruction of a user’s exact locations and movements. If such detailed data were to fall into the wrong hands, users’ physical safety could be compromised.
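
For illustration, the sketch below shows how an app that has already been granted read access could query one of these sensitive categories using standard HealthKit APIs. The choice of the sexualActivity category and the 90-day window are arbitrary examples for this sketch:

```swift
import HealthKit

// Illustrative sketch only: reading a sensitive HealthKit category
// (sexual activity logs) after the user has granted read access.
let healthStore = HKHealthStore()

guard let sexualActivityType = HKObjectType.categoryType(forIdentifier: .sexualActivity) else {
    fatalError("Category identifier unavailable on this OS version")
}

// Arbitrary example window: the last 90 days.
let start = Calendar.current.date(byAdding: .day, value: -90, to: Date())
let predicate = HKQuery.predicateForSamples(withStart: start, end: Date(), options: [])

let query = HKSampleQuery(sampleType: sexualActivityType,
                          predicate: predicate,
                          limit: HKObjectQueryNoLimit,
                          sortDescriptors: nil) { _, samples, _ in
    // Every returned sample is time-stamped, which is exactly why a leak of
    // this kind of data would expose intimate details of a user's private life.
    print("Returned \(samples?.count ?? 0) sexual-activity samples")
}

healthStore.execute(query)
```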

The examples above are hypothetical risk scenarios meant to illustrate how the data collected by wearable devices could be misused should it fall into the wrong hands. We are not stating or implying that any user is currently at risk of these types of threats. We only aim to highlight the potential implications of a privacy breach and raise awareness of the importance of safeguarding sensitive personal data.

Recommendations & Conclusions

Wearable devices have brought us numerous benefits, allowing us to understand our bodies better and encouraging a more active lifestyle. But safeguarding privacy is no simple feat when it comes to wearable technology. Companies, users, and legislators must all do their part.

First off, it is crucial that companies implement variable-level transparency, allowing individuals to understand and control exactly what data is collected and how it is used. Wherever possible, sensitive data should be processed locally on the device itself to reduce the risk of leaks.

Current regulations, like HIPAA, provide strong protections for health data in clinical settings. However, these protections should be extended to cover consumer wearables as well, ensuring the same privacy and security standards for all health data.

As for consumers, we suggest choosing brands that offer clear opt-out options, undergo regular privacy audits, and follow strict data minimization policies — collecting only the data necessary for their services and limiting sharing with third parties.

By pushing for these measures, we can balance the benefits of wearable technology with stronger privacy protections, empowering users to maintain control over their personal health information in an increasingly connected world.

Our Methodology

In this study, we examined 117 wearable devices from 33 major brands such as Apple, Fitbit, Samsung, and Huawei. Overall, we identified 22 different types of devices, ranging from popular trackers and smartwatches to more specialized products like EEG headbands.

Smartwatches made up the largest category, accounting for 38% of the devices analyzed, while trackers represented 36%. Outdoor watches ranked as the third largest group, with just 5 devices. The other 19 categories are far more specialized, each represented by only 1 or 2 devices, including wearables like rugged watches, heart rate sensors, bike computers, and herding aids.

For each device, we identified the specific types of data it collects, including:

  • Location: Data obtained via connected GPS and on-device GPS. 
  • Fitness & Activity: Metrics such as steps taken, distance traveled, active minutes, swimming activity, and altitude changes.
  • Sleep: Data related to sleep patterns, including sleep logs, sleep stages, and sleep score.
  • Stress & Mindfulness: Measurements of stress levels and features like Meditation Minutes, which help users track their meditation practice.
  • Advanced Running: Specialized running metrics such as Running Power and Wrist-Based Running Power, which provide detailed performance insights.
  • Health & Wellness: Vital signs and health indicators including heart rate, blood oxygen saturation, glucose monitoring, and other metrics.
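
As a purely hypothetical illustration of how such per-device records could be organized, the Swift sketch below uses made-up field names and example values; it is not the actual dataset schema used in this study:

```swift
// Hypothetical sketch of how a per-device record in a study like this
// might be structured; names and values are illustrative only.
enum DataCategory: String, CaseIterable {
    case location = "Location"
    case fitnessActivity = "Fitness & Activity"
    case sleep = "Sleep"
    case stressMindfulness = "Stress & Mindfulness"
    case advancedRunning = "Advanced Running"
    case healthWellness = "Health & Wellness"
}

struct WearableDevice {
    let brand: String
    let model: String
    let deviceType: String                        // e.g., "Smartwatch", "Tracker", "EEG headband"
    let trackedMetrics: [DataCategory: [String]]  // category -> specific metrics observed
}

// Example entry (values illustrative only):
let example = WearableDevice(
    brand: "ExampleBrand",
    model: "ExampleWatch",
    deviceType: "Smartwatch",
    trackedMetrics: [
        .healthWellness: ["Heart rate", "Blood oxygen"],
        .location: ["On-device GPS"],
        .sleep: ["Sleep log", "Sleep stages"]
    ]
)
```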

Afterwards, we looked into the data-sharing practices of the 33 companies included in our analysis. After a thorough review of each company’s privacy policy, we rated them based on the following criteria:

  • Excellent: The company offers clear transparency, strong user control, minimal data sharing, and robust security practices.
  • Good: The company provides reasonable transparency and user controls, with some disclosed third-party data sharing.
  • Moderate: The company shows limited transparency and user control, with broader or less clear data sharing practices.
  • Poor: The company lacks transparency, shares data extensively or unclearly, and provides minimal user control or data protection.

Transparency around data practices was identified as a key limitation in this study. Among the companies analyzed, only Apple offers a detailed, granular list of all data variables it may collect. In contrast, privacy policies from other brands tend to be broad and general, lacking sufficient detail to enable direct comparisons.

Additionally, many wearables integrate with third-party applications outside the manufacturer’s control, introducing potential risks of data misuse by these external entities. However, this investigation focused exclusively on the primary device manufacturers and did not evaluate data practices or potential misuse by third-party apps.

The table below provides a comprehensive list of all devices included in our study, along with some examples of the types of data each one collects. If you’re interested in learning more, please don’t hesitate to contact us.

[Table: all devices included in the study, with examples of the data each one collects]