brandonb 2 days ago

I worked on one of the first wearable foundation models in 2018. The innovation of this 2025 paper from Apple is moving up to a higher level of abstraction: instead of training on raw sensor data (PPG, accelerometer), it trains on a timeseries of behavioral biomarkers derived from that data (e.g., HRV, resting heart rate, and so on).

They find high accuracy in detecting many conditions: diabetes (83%), heart failure (90%), sleep apnea (85%), etc.

  • teiferer a day ago

    What is an "accuracy" of 83%? Do 83% of predicted diabetes cases actually have diabetes? Or did 83% of those who have diabetes get diagnosed as such? It's about precision vs. recall. You can improve one by sacrificing the other. Boiling it down to one number is hard.
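    A toy confusion-matrix example makes the tradeoff concrete (all numbers invented, not taken from the paper):

```python
def precision_recall(tp, fp, fn):
    """Precision: of the cases flagged positive, how many are real.
    Recall: of the real positive cases, how many got flagged."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# Hypothetical screen: 100 people flagged as diabetic, 83 truly are
# (17 false alarms), and 20 actual diabetics were missed entirely.
p, r = precision_recall(tp=83, fp=17, fn=20)
print(f"precision={p:.2f}, recall={r:.2f}")  # precision=0.83, recall=0.81
```

    Same model, and depending on which number you quote, it looks like an "83%" or an "81%" detector.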

  • crorella 2 days ago

    Insurance companies, especially health insurers, must be super interested in this research and its applications.

    • jeron 2 days ago

      I'm sure they're also interested in the data. Imagine raising premiums based on conditions they detect from your wearables. That's why it's of utmost importance to secure biometric data.

      • brandonb 2 days ago

        At least in the US, health insurers can’t raise rates or deny coverage based on pre-existing conditions. That was a major part of the Affordable Care Act.

        • abenga 2 days ago

          The ACA will not survive the next couple of years.

          • daveguy a day ago

            That's what they said in 2016.

            • abenga 20 hours ago

              This time he has control over all the arms of government + the support of much of the private sector though. The last time there was at least some push-back.

      • apwell23 17 hours ago

        How would that work? I pay a flat rate through my employer.

    • autoexec 2 days ago

      There are so many companies across many industries who are salivating at the thought of everyone using wearables to monitor their "health" and getting their hands on that data. Including law enforcement, lawyers, and other government agencies.

      • teiferer a day ago

        It's industry leaders that are salivating the most.

  • throwaway314155 2 days ago

    Had the phrase "foundation model" become a term of art yet?

    • brandonb 2 days ago

      By 2018, the concept was definitely in the air since you had GPT-1 (2018) and BERT (2018). You could argue even Word2Vec (2013) had the core concept of pre-training on an unsupervised or self-supervised objective leading to performance on a downstream semantic task. However, the phrase "foundation model" wasn't coined until 2021, to my knowledge.

      • throwaway314155 12 hours ago

        I guess I just find the whole "foundation model" phrasing to be designed to pat the backs of the "winners", who would of course be those with the most money. I'm sure there are foundation models from groups that aren't e.g. OpenAI, but the origins felt egotistical, and asserting that you made one prior to the phrase's inception only feels more so.

        Had you merely called it an early instance of pretraining, I'd be fine with it.

LPisGood 2 days ago

Is anyone else surprised by how poor the results are for the vast majority of cases? The foundation model, which had access to sensor data and behavioral biomarkers, actually _underperformed_ the baseline predictor that just uses nonspecific demographic data in almost 10 areas.

In fact, even when the wearable foundation model was better, it was only marginally better.

I was expecting much more dramatic improvements with such rich data available.

  • bumby a day ago

    I wonder how much of that is driven by poorly performing behavioral models. There was an HN article from a few weeks back where a model had only about 70% accuracy at determining whether someone was awake or asleep. I would guess that the secondary behavioral biomarkers used in this paper (like cardiovascular fitness) are much harder to predict from raw sensor data than being awake or asleep.

  • Herring 2 days ago

    I worked with similar data in grad school. I'm not surprised. You can have a lot of data, but sometimes the signal (or signal quality) just isn't present in that haystack, and there's nothing you can do about it.

    Sometimes you just have to use ultrasound or MRI or stick a camera in the body, because everything else might as well be reading tea leaves, and people generally demand very high accuracy when it comes to their health.

vibecodermcswag 2 days ago

I love this because I build in medtech, but the big problem is that there are no open weights and no open data.

you can export your own apple XML data for usage and processing, but if you want to create an application and request apple XML data from users, that likely crosses into clinical research territory with data security policy requirements and de-identification needs.

  • piratesAndSons 2 days ago

    Trusting your health data to AI bros is... extremely ill-advised.

    I don't even trust Apple themselves, who will sell your health data to any insurance company any minute now.

    • autoexec 2 days ago

      They might not sell "your" data outright, but it doesn't mean they won't sell inferences/assumptions that they make about you using your data.

      The reality is that no matter how ethical the company you trust with that data is, you're still one hack or pissed off employee away from having that data leaked, and all of that data is freely up for grabs to the state (whose 3 letter agencies are likely collecting it wholesale) and open to subpoena in a lawsuit.

    • kridsdale1 2 days ago

      What do you base that suspicion on?

      • autoexec 2 days ago

        If a corporation can make money hand over fist by doing something, they will do it. It doesn't matter if it's illegal or unethical. As long as it's still highly profitable it will be done.

aanet 2 days ago

Thanks for posting this. This looks promising...

I have about 3-3.5 years worth of Apple Health + Fitness data (via my Apple Watch) encompassing daily walks / workouts / runs / HIIT / weight + BMI / etc. I started collecting this religiously during the pandemic.

The exported Fitness data is ~3.5GB.

I'm looking to do some longitudinal analysis - for my own purposes first, to see how certain indicators have evolved.

Has anyone done something similar? Perhaps in R, Python? Would love to do some tinkering. Any pointers appreciated!

Thanks!!
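A pure-Python starting point, sketched against Apple's export.xml layout (the HK record-type string below matches the export format, but verify it against your own file): iterparse streams the file, so memory stays flat even on a multi-GB export.

```python
import xml.etree.ElementTree as ET
from collections import defaultdict
from statistics import mean

def monthly_averages(path, record_type="HKQuantityTypeIdentifierRestingHeartRate"):
    """Stream export.xml and average one metric per calendar month."""
    buckets = defaultdict(list)
    for _, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == "Record" and elem.get("type") == record_type:
            month = elem.get("startDate", "")[:7]  # "YYYY-MM"
            try:
                buckets[month].append(float(elem.get("value")))
            except (TypeError, ValueError):
                pass  # some record types carry non-numeric values
        elem.clear()  # free processed elements as we go
    return {m: mean(v) for m, v in sorted(buckets.items())}
```

From there a longitudinal view is one plotting library away, and nothing leaves your machine.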

  • kridsdale1 2 days ago

    It might actually be worth writing your analysis in Swift with the actual HealthKit API and visualization libraries.

    Bonus: when you’re done, you’ll have an app you can sell.

    • aanet 2 days ago

      :thumbs_up.gif:

      My sentiments, exactly.

      Though I'm looking to scratch my own itch for now...

  • brandonb 2 days ago

    FWIW, we're working on something similar (you wouldn't necessarily need to write R or Python). Feel free to email me at bmb@empirical.health and I can add you to a beta once we have it ready!

    • aanet 2 days ago

      Thanks, I'll reach out.

      I am curious to do my own analysis, for two main reasons:

      - some data is confidential (I'd hate for it to leave my devices)

      - wanna DIY / learn / iterate

      Will ping you in any case. Thanks

memming 2 days ago

Interesting to see contrastive loss instead of a reconstruction loss.
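For readers unfamiliar with the distinction: a contrastive objective scores matched pairs against mismatched ones instead of reconstructing the input. A minimal InfoNCE-style sketch in plain Python (the generic form, not necessarily the paper's exact recipe):

```python
import math

def info_nce(anchors, positives, temperature=0.1):
    """InfoNCE contrastive loss: each anchor embedding should score high
    against its own positive and low against the other positives in the
    batch. Lower loss means better-aligned pairs."""
    def cos(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
        return dot / norm

    total = 0.0
    for i, a in enumerate(anchors):
        logits = [cos(a, p) / temperature for p in positives]
        log_denom = math.log(sum(math.exp(z) for z in logits))
        total += log_denom - logits[i]  # -log softmax of the true pair
    return total / len(anchors)
```

With perfectly matched pairs the loss is near zero; shuffle the positives and it jumps, which is exactly the training signal.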

fiduciarytemp 2 days ago

Has anyone seen the publishing of the weights or even an API release?

  • brandonb 2 days ago

    In the paper, they say they can't release the weights due to terms of consent with study participants (this is from the Apple Heart and Movement study).

dyauspitr 2 days ago

Is there a way to run this on your own data? I’ve been wearing my Apple Watch for years and would love to be able to use it better.

  • brandonb 2 days ago

    Not yet -- this one is just a research study. Some of their previous research has made it into product features.

    For example, Apple Watch VO2Max (cardio fitness) is based on a deep neural network published in 2023: https://www.empirical.health/blog/how-apple-watch-cardio-fit...

    • pricklyprice 2 days ago

      Apple was reporting VO2max for a very long time (well before 2023). I wonder what the accuracy was back then? Maybe they should offer users the option to re-compute those past numbers with the latest and greatest algorithm.

    • llm_nerd 2 days ago

      Apple's VO2Max measures are not based upon that deep neural network development, and empirical seems to be conflating a few things. And FWIW, just finding the actual paper is almost impossible as that same site has SEO-bombed Google so thoroughly you end up in the circular-reference empirical world where all of their pages reference each other as authorities.

      Apple and Columbia did recently collaborate on a heart rate response model -- one which can be downloaded and trialed -- but that was not related to the development of their VO2Max calculations.

      Apple is very secretive about how they calculate VO2Max, but it is likely a pretty simple calculation (e.g., how much your heart responds to the level of activity inferred from your motion, method of exercise, and movements). The most detail they provide is in https://www.apple.com/healthcare/docs/site/Using_Apple_Watch..., which is mostly a validation that it provides decent enough accuracy.
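      For flavor, one well-known estimate in that "pretty simple calculation" spirit is the Uth et al. heart-rate-ratio formula; to be clear, there's no evidence Apple uses this, it just shows how far two numbers can go:

```python
def vo2max_hr_ratio(hr_max, hr_rest):
    """Uth et al. heart-rate-ratio VO2max estimate, in ml/kg/min.
    Illustrative only: Apple's actual method is unpublished."""
    return 15.3 * hr_max / hr_rest

print(vo2max_hr_ratio(hr_max=190, hr_rest=60))  # ~48.45
```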

      • brandonb 2 days ago

        What’s your source on Apple not using the neural network for VO2Max estimation? They’ve been using on-device neural networks for various biomarkers for several years now (even for seemingly simple metrics like heart rate).

        FWIW, the article above links directly to both the paper and a GitHub repo with PyTorch code.

        • llm_nerd 2 days ago

          >FWIW, the article above links directly to both the paper and a GitHub repo with PyTorch code.

          Neat, though the paper and the GitHub repo have nothing to do with Apple's VO2Max estimations. They're related to health, and touch on VO2Max and health sensors, but the only source claiming any association at all is that Empirical site. And given that this research came out literally years after Apple added VO2Max estimates to their health metrics, it seems pretty conclusive that it is not the source of Apple's calculations. It's neat research on predicting heart rate response to activity (which might come into play for filling in measurement gaps that happen when a device isn't tight enough during activity, etc.).

          >What’s your source on Apple not using the neural network for VO2Max estimation?

          You're asking me to prove a negative. Apple never claims that they do any complex math or deep neural networks to derive VO2Max, and from my own observations of its estimates, it seems remarkably trivial.

          Trivial can still be accurate. But it hardly seems complex. Like, guess people's A1c based upon age, body fat percentage, and demographics, and you'll likely be high-90s accurate with trivial algebra.

          >even for seemingly simple metrics like heart rate

          Deriving heart rate from a green light imperfectly reflecting off skin, watching for tiny variations in colour change, is actually super complex! Doing it accurately is pretty difficult, which is why wearable accuracy is all over the place, though Apple is one of the leaders and has been for years. Guessing a number based upon HR and activity level isn't quite as complex.
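          The core periodicity-recovery idea can be shown on a clean synthetic signal; a real PPG pipeline also has to fight motion artifacts, ambient light, and skin-tone variation, all of which this toy ignores:

```python
import math

def estimate_bpm(signal, fs):
    """Estimate pulse rate via autocorrelation: find the lag, within a
    plausible 40-200 bpm window, where the signal best matches a
    shifted copy of itself."""
    mu = sum(signal) / len(signal)
    x = [s - mu for s in signal]
    lo, hi = int(fs * 60 / 200), int(fs * 60 / 40)  # lags for 200..40 bpm
    best_lag, best_score = lo, float("-inf")
    for lag in range(lo, min(hi, len(x) // 2)):
        score = sum(x[i] * x[i + lag] for i in range(len(x) - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return 60.0 * fs / best_lag

# Ten seconds of a synthetic 72 bpm pulse sampled at 50 Hz
fs, bpm = 50, 72
sig = [math.sin(2 * math.pi * bpm / 60 * t / fs) for t in range(10 * fs)]
print(round(estimate_bpm(sig, fs)))  # within a beat or two of 72
```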

MangoToupe 2 days ago

Can someone explain what "wearable foundation" means?

  • compiler-guy 2 days ago

    It's a "Foundation Model" for wearable devices. So "wearable" describes where it is to be used, rather than describing "foundation".