> "that want live videos of different angles of your face"
Hetzner (outsourcing to Idenfy) dared to demand this of me, three years ago. I'm still mad about it.
> "When that data eventually leaks,"
Indeed, my understanding is these sensitive biometrics are generically (i) uploaded in full to a remote server, where they're (ii) retained for a nontrivial amount of time, because they need to be (iii) manually QA'd by humans. It's nothing like an iPhone's local-only biometrics enclave. My understanding's based on the specific case of Idenfy, and an ex-Idenfy HN'er explaining its workflow[0].
Hetzner uses some kind of AI (the old kind) to assign risk scores to customers. In my case they only wanted a photo of my passport, but that was years ago. Some people are denied outright no matter what they upload; others sail right through.
[0] https://news.ycombinator.com/item?id=33863625#33864440