I don't get the point of this. I really don't. While far from perfect, web fonts have largely been a solved problem for a while now, so why are Google reinventing this wheel? This seems to happen all the time with the web; people taking old ideas and reinventing them again and again, in yet more bloated ways, breaking things for millions.
They've changed their "typeface rendering engine" (for want of a better description), which is why many people can no longer see their typefaces in full when once they could.
Heya. Other people have already replied with versions it fails on, but for completeness, I'm on Firefox 47.0 (package version: 47.0+build3-0ubuntu0.16.04.1) on Ubuntu 16.04.
Arch Linux user reporting in. Works fine in… uh… the latest Chromium (51.0.2704.84-1), but glyphs render as blocks in Firefox (47.0-1). Epiphany (3.20.2-1) crashes horribly, so badly that even the dev tools stop working.
However, they are replacing what they think is a missing glyph with an image. For example, if I type the letter "r" while keeping the DOM inspector open, I see the text node containing the letter for a moment before it is replaced with an image element:
This is a Google-specific problem. YouTube Gaming launched only working on Chrome as well. For a while, one of Google's web UI frameworks wrapped text by breaking words of any size mid-letter if you weren't on Chrome. (It would split even "the" across lines if it felt like it during a word wrap.)
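The mid-letter wrapping described above is exactly what CSS `word-break: break-all` produces; a minimal sketch of the difference, assuming the hypothetical class names below:

```css
/* word-break: break-all lets the browser split ANY word between any
   two characters, so even short words like "the" get cut mid-word. */
.aggressive-wrap {
  word-break: break-all;
}

/* overflow-wrap: break-word only breaks a word when it would
   otherwise overflow its container, which is usually what you want. */
.safe-wrap {
  overflow-wrap: break-word; /* formerly word-wrap: break-word */
}
```

If a framework applied something like `.aggressive-wrap` only when it failed to detect Chrome, it would match the behaviour described.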
My browser, Edge, is simply blocked on Google Fonts.
It seems like Google is more and more dependent on automated testing for all of its deployments; perhaps because everyone internally uses Chrome, nobody notices when those tests fall short on other browsers. Google would do well to reintroduce "humans" into their development process, and maybe open each new website once in each major browser before pushing a release.
In my department, every developer uses a different primary browser: I use only Firefox, another uses only Chrome, and another uses only Safari.
That's a very smart way to enforce ad-hoc testing in every browser. Probably prevents a lot of "works on my machine" issues.
I've worked for companies with no defined browser support, and for clients who required us to test in IE6-9, FF, Chrome, Safari, Mac FF, Mac Chrome, and Mac Safari. It took FOREVER, so we ended up automating a lot of it.
From my experience with front-end devs, it's not that they refuse to test in Firefox; rather, they only test in Chrome. I've worked with FE devs who insisted it was unreasonable to expect them to test in all the major browsers.