Google, not content with existing image formats that support 10-bit like HEIC (used by Apple) and AVIF (based on AV1, a codec that Google helped design and which beats JPEG), decided in all their wisdom to make Ultra HDR for Android phones: an incompatible standard built on top of JPEG, and separate from JPEG XL.
Now Samsung has released Super HDR, without any information about that standard or if it relates to Google's Ultra HDR. Sigh.
Edit: I forgot about WebP/WebP2, which was also developed by Google as a JPEG replacement.
Considering JPEG XL and Ultra HDR are both based on JPEG, couldn't they be combined into one standard? Wouldn't it be better for everyone if the whole industry could eventually agree on a single standard?
Apple's HEIC is very annoying since it's not really supported by anything non-Apple. Would certainly be nice to see that go away.
It's crucial to distinguish JPEG the file format (now retronymed as JPEG 1) from JPEG the standardization group.
JPEG XL is officially blessed by JPEG but otherwise irrelevant here, even though non-progressive JPEG 1 files can be losslessly recompressed into JPEG XL by design. The main role of JPEG here was to specify explicit goals for JPEG XL proposals [1]. Interestingly enough, JPEG 1 recompression was not part of that call for proposals back then. JPEG XL is otherwise a completely different format with much better compression algorithms, so it should become the image format for pretty much all uses once popularized.
Ultra HDR [2] is an extension to the JPEG 1 format, which depends on XMP and the CIPA Multi-Picture Format. This is not the first extension of its kind; even JPEG the group itself had a similar one called JPEG XT, which never became popular! (If you don't know much about JPEG, APNG was a similarly designed extension to PNG which eventually became part of PNG.) Naturally, Ultra HDR files cannot be smaller than ordinary JPEG 1 files, so they can't fulfill Samsung's needs.
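To make the gain-map mechanism behind Ultra HDR (and similar gain-map formats) a bit more concrete: an SDR base JPEG is stored alongside a small secondary image of gains, and an HDR display multiplies the two in linear light. The NumPy sketch below shows only that core idea; the function and parameter names are invented for illustration, and the real spec adds per-channel min/max gains, gamma, offsets, and display-adaptive weighting.

```python
import numpy as np

def apply_gain_map(sdr_linear, gain_map, hdr_headroom_stops):
    """Toy gain-map reconstruction: boost the SDR base image in linear light.

    sdr_linear         : HxWx3 float array, SDR base image in linear light (0..1)
    gain_map           : HxW float array in 0..1, the encoded recovery/gain map
    hdr_headroom_stops : how many stops of headroom the target display offers
    """
    # Map the normalized gain map to a log2 gain, capped by the display headroom.
    log2_gain = gain_map * hdr_headroom_stops
    return sdr_linear * np.exp2(log2_gain)[..., None]

# Example: a display with ~2 stops of headroom (4x SDR white) would call
# apply_gain_map(sdr_linear, gain_map, hdr_headroom_stops=2.0).
```

On an SDR display the gain map is simply ignored and the base JPEG is shown as-is, which is why the format degrades gracefully but can never be smaller than a plain JPEG.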
I proposed lossless JPEG 1 recompression as a functionality of JPEG XL after PIK/FUIF were chosen as the platform. The committee saw great value in this, and together we decided to make it an additional requirement after the competition was completed. We had a lot of experience with this from brunsli and were sure we could deliver a good solution for it.
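For anyone who wants to try the recompression described above, here is a minimal sketch that shells out to the libjxl reference tools cjxl/djxl. The exact flag names and defaults depend on your libjxl version, so treat the command lines as assumptions to check against `cjxl --help`.

```python
import subprocess
from pathlib import Path

src = Path("photo.jpg")            # any non-progressive JPEG 1 file
jxl = Path("photo.jxl")
back = Path("photo_restored.jpg")

# Recent cjxl builds transcode JPEG input losslessly by default;
# --lossless_jpeg=1 makes that explicit (assumption: flag as in libjxl 0.8+).
subprocess.run(["cjxl", "--lossless_jpeg=1", str(src), str(jxl)], check=True)

# djxl reconstructs the original JPEG bitstream when the .jxl file
# carries JPEG reconstruction data and the output name ends in .jpg.
subprocess.run(["djxl", str(jxl), str(back)], check=True)

# A bit-exact round trip is the whole point of the feature.
print("bit-identical:", src.read_bytes() == back.read_bytes())
```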
Ultra HDR is a relatively unambitious standard. JPEG XL already supports HDR, but it also has a ton of other nice features (extra channels, bigger maximum resolution, high bit depth, better entropy coding).
> On November 22, 2016, HEVC Advance announced a major initiative, revising their policy to allow software implementations of HEVC to be distributed directly to consumer mobile devices and personal computers royalty free, without requiring a patent license.
Regardless of that, macOS, Windows, iOS and Android all have OS-level support, via a combination of hardware and software decoders. Ubuntu (and Debian) provide libraries in their official repos.
So you're saying that neither Chrome nor Firefox support HEIC because of patents, despite arguably 90%+ of the environments they run in having OS-level support for the format?
Sure that sounds likely. I'm totally sure that's the reason Google doesn't support this. I'm positive it's nothing at all to do with pushing the format they control, and omitting or removing support for formats they don't control. That would be such an obvious dick move, it's inconceivable they would do that.. right.. RIGHT?
Mate, Apple has been using HEIC for over 6 years, but they didn't add it to Safari until less than a year ago. And Apple doesn't even want you to use HEIC on the web! They added it to Safari so that app devs can use it in WebViews.
I think this would work best by only sharing the HDR image (JPEG XL) and then viewing it on SDR through a great local tone mapping algorithm. That would reduce the data size to transmit and give more of a guarantee that two people downloading the same file will see the same content.
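As a point of reference, even a trivial global tone mapping operator makes the idea concrete; a minimal Reinhard-style sketch in NumPy follows. Real local tone mapping adapts this kind of curve per region of the image, which is what the comment above is asking for, so treat this as the simplest possible stand-in.

```python
import numpy as np

def reinhard_tonemap(hdr_linear, white_point=4.0):
    """Global Reinhard tone mapping: compress HDR luminance into 0..1.

    hdr_linear  : HxWx3 float array in linear light; values may exceed 1.0
    white_point : luminance that should map to pure white (assumed parameter)
    """
    # Rec. 709 luma weights for the luminance estimate.
    lum = hdr_linear @ np.array([0.2126, 0.7152, 0.0722])
    lum = np.maximum(lum, 1e-6)
    # Extended Reinhard curve: L * (1 + L / Lw^2) / (1 + L).
    mapped = lum * (1.0 + lum / white_point**2) / (1.0 + lum)
    # Scale RGB by the luminance ratio, then clip for an SDR buffer.
    return np.clip(hdr_linear * (mapped / lum)[..., None], 0.0, 1.0)
```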
Honest question: what’s so awful about WEBP? Though it’s worse than the gen arriving now, it’s better than or the same as the one before it for the use cases it supports, and free and open. I get the impression people associate it with brokenness and low quality, but the brokenness is just a lack of support, and the quality a creator choice. Maybe its original sin was not trying to be suitable for original data, leading to the low quality association and making support less desirable outside of the web, but if that’s the case then AVIF is in the same boat.
Lossy WebP has some crappy limitations. For example, it's limited to 4:2:0 chroma subsampling, which makes it unsuitable for high quality images. Hell, even if you restrict yourself to 4:2:0 JPEG, JPEG still comes out as having a higher quality ceiling than WebP. And yes, WebP does beat MozJPEG and libjpeg-turbo in file size sometimes, but newer JPEG encoders like jpegli crush WebP on size across the entire quality spectrum.
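To see the 4:2:0 limitation in practice, here is a small Pillow sketch: JPEG gives you a knob to keep full-resolution chroma, while lossy WebP gives you no such option because libwebp always subsamples. The parameter values are Pillow's encoder options, and the file names are placeholders.

```python
from PIL import Image

img = Image.open("source.png").convert("RGB")  # placeholder input

# JPEG: subsampling=0 keeps 4:4:4 chroma, so sharp red/blue edges survive.
img.save("out_444.jpg", quality=90, subsampling=0)

# Lossy WebP: there is no subsampling option to pass -- libwebp
# internally converts to 4:2:0 regardless of the quality setting.
img.save("out.webp", quality=90)
```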
I associate it with Google, and I've come to associate Google with bad and untrustworthy behavior. Also, Google's hostility towards JPEG XL makes me even more apprehensive of their format and the way they tried to push it so hard.
Google Research develops and maintains JPEG XL; the current main focus is on improving streaming encoding, i.e. using much less memory during the encoding process.
Google Chrome added and then removed experimental support for JPEG XL. This has caused discussion in the related bugs.
It is not really awful. It is just way overhyped, overpromised, and underdelivered. WebP was better than standard JPEG, but JPEG also improved via many other encoders such as MozJPEG. It wasn't obvious what the advantage was, or it was so small that it really shouldn't have been included as a standard feature. Mozilla made their case back then with plenty of testing and data points.
Feature-wise, maybe (transparency, animation). Compression-wise, not really (better at low qualities because the artifacts are somewhat less objectionable than JPEG’s, but worse at high qualities).
Lossless WebP (practically a separate codec) is nice though, albeit 8-bit only.
The original lossy WebP format is more or less an intra-coded frame format from VP8. This "I-frame-as-image" approach was considered good enough for some time; for example, AVIF largely follows the same structure.
The WebP lossless format (internally "VP8L") is the second codec supported by libwebp but otherwise independent. This format can be thought of as an optimized PNG: it uses the main PNG algorithm but then tweaks various bits for better compression. For this reason it is very rare to see PNG files smaller than efficiently encoded lossless WebP files.
JPEG XL is very different from both. There are some technical resemblances, as the author of lossless WebP also worked on JPEG XL (and Brotli), but the overall architecture is completely revamped. The lossy portion of JPEG XL is a (much larger) superset of JPEG 1, while the lossless portion is based on learnable context modelling represented as a binary decision tree. And then everything got sandwiched between yet another set of components.
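To make "learnable context modelling represented as a binary decision tree" less abstract, here is a toy sketch of the idea: split on properties of already-decoded neighbors to pick an entropy-coding context. The real JPEG XL (modular mode) trees are learned per image, use many more properties, and also drive predictor choice; the thresholds and context names below are invented.

```python
def choose_context(left, top, topleft):
    """Toy stand-in for one node of a modular-mode decision tree.

    Splits on a local-gradient property of already-decoded neighbors and
    returns which entropy-coding context to use for the current pixel.
    """
    gradient = abs(left - topleft) + abs(top - topleft)
    if gradient > 32:      # busy texture: residuals tend to be large
        return "context_busy"
    if left == top:        # locally flat: residuals tend to be tiny
        return "context_flat"
    return "context_smooth"
```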
It would be more correct to say that PNG uses the GIF algorithm than that WebP uses the PNG algorithm.
WebP's Select predictor and some other predictors are used in JPEG XL. The same goes for the 2D locality map in backward references.
Both are able to use delta palettes. JPEG XL goes further by having a mixed mode where some entries are deltas for smooth gradients while others are usual palette entries.
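For the curious, the Select predictor mentioned above is simple enough to sketch: make the classic gradient guess left + top - topleft, then return whichever neighbor is closer to it. Exact tie-breaking and clamping details differ between WebP lossless and JPEG XL's modular mode, so this is an approximation rather than either spec's definition.

```python
def select_predict(left, top, topleft):
    """Per-channel 'Select'-style prediction for one pixel."""
    gradient = left + top - topleft  # the classic gradient guess
    # Return whichever causal neighbor sits closest to the gradient guess.
    return left if abs(gradient - left) <= abs(gradient - top) else top
```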
The main difference is that WebP is a last-gen video format (VP8) adapted to support images, while JPEG XL is an image format first. This may not sound important, but it actually matters a lot, because videos and images are viewed very differently and therefore have different constraints to optimize for.
1. A frame of video is only viewed for 1/30th to 1/60th of a second. Quality standards are lower than for an image, which will often be looked at for several seconds. One example is that WebP always uses chroma subsampling, which is a fairly major quality tradeoff.
2. Image sizes vary a lot more. They range from 32x32 pixels to enormous, while videos are pretty much always between 360p and 4K (WebP supports 16k by 16k, vs roughly 1 billion by 1 billion for JPEG XL).
3. Videos don't care about progressive loading, but it makes sense for images (especially for users on slow connections).
4. Video formats care about encode time because videos are massive. Image formats are fine with a 100x slower encode if it makes the file smaller and decoding faster.
I don't understand how Google can be consistently on the back foot of the "tech world hivemind" for going on 7 (?) years now and have zero shakeup, not just of culture, but at least of PR.
No one gave a crap about this format on this site until Google decided not to add it to Chrome. No one used it, no posts were upvoted. It only became a thing when it was yet another reason to rant at Google.
Note how no one is asking Mozilla why Firefox won't support it or actually building websites using it.
Actually, lots of people were talking about it, and lots of people at big companies like Facebook were excited when Google added it behind a flag. Shopify even rolled out JXL to their storefronts not long after JXL was added to Chrome.
> Note how no one is asking Mozilla why Firefox won't support it or actually building websites using it.
People do ask why Firefox isn't supporting it, but the answer is obvious: because Chrome dropped it, and Firefox has what, 5% market share?
And people do use it on websites, you just didn't notice because companies like Nike don't tend to write up blog articles about how they're using a cool new image format.
Just reminding everyone that it is now 2024, and it is still impossible to send an HDR still image to a group of people who are not all in the same ecosystem.
Apple for example “supports” the JPEG XL format, but decodes it to sRGB SDR irrespective of the source image gamut.
As of today, Adobe Lightroom running on an Apple iDevice can edit a RAW camera image in HDR, can export the result in three formats… none of which can be viewed as HDR on the same device.
Windows 11 with all the latest updates can open basically nothing, and will show garbage half the time when it can open the new formats.
Linux is still stuck in the teletype era and will catch up to $399 Aldi televisions from China any decade now.
I should create an “Are we HDR yet?” page and track this stuff.
These are trillion dollar companies acting like children fighting over a toy.
“No! Use my format! I don’t want to play with your format! It’s yucky!”
I've edited photos with HDR support in Photoshop and Lightroom and was able to immediately view them in Preview on my M1 Mac mini. Of course it looked different than when I viewed it in multiple different browsers, or even the Canary, Dev, beta, and release branches of those browsers if they had support for the image format at all. But they definitely did display correctly, because I made sure the images weren't professional ones... If viewed on something that doesn't handle HDR correctly, you just see a whole bunch of blown-out whiteness, but if it displays correctly then you can actually see that there was something in that full-bright section of the image.
I also verified this by transferring the images to my NAS and then grabbing them on my Pixel 5 at the time, and also on the Pixel Fold that I use now; on both of those you could tell immediately when the HDR transform kicks in. There's a split second as the image shows up on the screen where you can tell that it's tone mapping it or engaging the HDR display mode or something.
And I know we aren't talking about video, but way back when Doom Eternal came out I recorded a full playthrough of it using a capture card that lets me capture in H.265 with proper HDR metadata. It was a messy setup because my only HDR monitor is my TV and my computer's in my living room, so I had to string an HDMI cable from the TV to the capture box input, then another HDMI cable to my computer monitor, along with the USB-C cable to my computer. So I beat the entire game in the AVerMedia preview window, then edited each level in DaVinci Resolve and exported it with all the correct settings, after reading the like 4,000-page manual just to make sure I was doing it right. The entire time I was editing I wasn't sure it was going to come out right, because my computer monitor is a 6-bit panel with dithering to make it 8-bit and isn't an HDR monitor at all.

But in the end, my M1 Mac mini was able to watch it on YouTube in HDR in 4K. My TCL 4K HDR TV was able to watch it using the built-in YouTube app. Basically anything I had that I could attach to a screen that would enable HDR mode would play that video correctly, including the Pixel 5. And I did move things around in my living room just to make sure my Windows 10/11 machine (I say that because I was on Insider Preview around the transition, so it was kind of a hybrid of both) was also able to watch the video natively and on YouTube correctly.
I think things are more compatible than the compatibility listings suggest. If you have a modern computer, even with parts that are 7 years old, running a modern operating system, and, and this is the kicker, a screen with a 10-bit or 12-bit panel that has an actual rating of 1000 nits, then you have something that can legitimately display the minimum standard for most HDR technical specifications.
If you're trying to look at HDR content and you say it's not working correctly, then you might not actually have a monitor that meets the video or film industry's minimum standard. It's okay if you have a cheap TV like I do, an 8-bit panel that uses advanced dithering to act like 10-bit; mine for whatever reason can even go up to 12-bit in Windows. But you have to have that 10-bit minimum, and you have to have Rec. 2020 / ST 2084 (PQ) and DCI-P3 D65, along with 1000 nits of peak brightness.
A gamer monitor calling itself an HDR display at something like 600 nits isn't a standard; it's a marketing term that company made up so they could say it's HDR, because all laypeople think HDR means is "brighter". What's the point of having 1,024 levels of brightness per color if your screen's brightness can't show that full dynamic range?
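To put numbers on that: the SMPTE ST 2084 (PQ) transfer function maps code values to absolute nits, so you can check how much of a 10-bit signal a 600-nit panel can actually reproduce. Below is a hedged NumPy sketch of the standard EOTF (full-range code values assumed; broadcast video is usually limited range).

```python
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """Convert a full-range PQ code value to absolute luminance in cd/m^2."""
    e = code / (2 ** bits - 1)                       # normalized signal 0..1
    p = np.power(e, 1 / M2)
    y = np.power(np.maximum(p - C1, 0) / (C2 - C3 * p), 1 / M1)
    return 10000 * y                                 # PQ tops out at 10,000 nits

print(round(pq_to_nits(512)))   # -> ~93 nits, near SDR reference white
print(round(pq_to_nits(769)))   # -> ~1000 nits, the level discussed above
```

In other words, roughly the top quarter of the 10-bit PQ code range is reserved for highlights above 1000 nits, which a 600-nit panel simply cannot show.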
Now send it to anyone, in any way. iMessage? Doesn’t work. Received on an iPhone? Won’t work. Android? Definitely not. Etc…
There is no one file format that works across ecosystems. Apple is even internally fragmented, with some formats working on macOS that don't on iOS.
> Of course it looked different
That is broken!!
This is precisely what I mean: HDR is often incorrectly decoded as SDR in Apple operating systems. This is worse than just sending an SDR JPG because the HDR-to-SDR conversion is unpredictable, and making the HDR image was a waste of time and bits.
Note that I didn’t say HDR video! I meant specifically HDR still images, the type that JPEG XL can encode.
Right now, in 2024, if I want to send someone HDR anything, the only robust method is to make it into a video and send them a YouTube link to it.
I mean, JPEG XL is designed as an all-encompassing file format that also subsumes RAW and DNG. On the other hand, DNG 1.7.0.0 added JPEG XL as one of its compression options back in 2022 (!), which might be what Samsung is actually using.
The parent commenter is incorrect. Yes, the JXL authors do want to try to use JXL as a raw format, but they are focusing on other priorities right now, so the specification for how JXL could be used as a raw format doesn't exist yet. But the format is flexible enough to allow for it, without backwards-incompatible changes.
Ah, yeah, the current JPEG XL is not enough for the full RAW coverage. (I was talking more about per-channel color spaces.) There is indeed a reserved extra channel type for specific filter layouts among others, but its specification doesn't exist yet.
For some, possibly most, uses of raw, having YUV 4:4:4, 14+ (or 16+) bits of precision, and lossless or near-lossless compression (with strong guarantees on the maximum error of individual pixels) is supposedly acceptable. I base this interpretation mostly on digital photography discussions I have seen on the internet. I am not a photographer myself.
(12 bits per component already requires adapting to the current lighting and can supposedly slow down professional photography.)
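A quick way to sanity-check the "maximum error on individual pixels" guarantee of any near-lossless pipeline is to measure the peak absolute error yourself. A minimal NumPy/Pillow sketch follows; the file names are placeholders, and for real 14-16-bit data you would need a reader that preserves the full bit depth.

```python
import numpy as np
from PIL import Image

# Placeholders: the original developed image and the near-lossless decode.
original = np.asarray(Image.open("original.png"), dtype=np.int32)
decoded = np.asarray(Image.open("decoded.png"), dtype=np.int32)

# Peak absolute error per sample -- the number a near-lossless mode with a
# max-error guarantee promises to keep below a fixed bound.
peak_error = np.abs(original - decoded).max()
print("peak per-pixel error:", peak_error)
```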