Detail from the comments, about the QR-code-like blocks that appear between sprocket holes:
> The blocks also carried software, so that the decoding equipment (such as the DA-10 and DA-20) could have their firmware automatically upgraded in the field, just by playing a newly-released film! Pretty advanced for early 1990s electronics.
I found another oblique reference to this in the manual for the Dolby DA20 sound processor, page 8-6 here, where it refers to "dynamic loading": http://www.film-tech.com/warehouse/manuals/DOLBYDA20.pdf
However, I can't find anything else online about this. This raises the possibility of putting a film-borne virus in a movie or trailer, though I'm not totally sure what you could really do by pwning a sound processor.
Dynamic loading was indeed a software update of the processor using the data carried on film. If a film carried an update, it would be placed at the start of the film. The processor would detect this, read the software update, reboot (reverting to mono analogue audio while it did so), and then carry on using the updated software.
Out of every 128 blocks on the film, 3 carry the software update (and/or other data channels, or are unused). They're also used for something called the "splice cache", which I can discuss if you want. The rest are the audio blocks.
As far as I'm aware, Dolby only used this once - to update from what they called version EC9 to EC11. I've also heard that it didn't work well with the DA10 - their first processor - and this may be part of the reason it was never used again.
Is the "splice cache" used to cover for discontinuities (e.g. reels spliced together for the entire film, cuts to remove damaged sections...)? I'm just guessing from the name here.
Indeed it is! Films come on 20-minute reels, and from the late 80s onwards were spliced together onto one big reel (or platter) for playback. However, the splice often wasn't perfect, and the reel ends are the most likely places to pick up dust, dirt and scratches, which could knock out the digital audio and revert playback to the analog track. This obviously isn't ideal, so Dolby had the idea of also encoding the data from the end of each reel redundantly throughout the reel, and storing it in a cache in the processor. Then, when the end of the reel is reached (you can tell because each block has a sequence number), the processor uses the cached data to ensure a nice changeover.
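A minimal sketch of what that sequence-number-based fallback might look like (the class, API, and fallback behaviour here are illustrative guesses, not Dolby's actual implementation):

```python
# Hypothetical splice-cache logic, as described above. Block structure
# and API are invented for illustration.

class SpliceCache:
    def __init__(self):
        self.cache = {}  # sequence number -> audio block bytes

    def store(self, seq: int, block: bytes) -> None:
        # Reel-end blocks are redundantly encoded throughout the reel;
        # stash each copy as it streams past the reader.
        self.cache[seq] = block

    def resolve(self, seq: int, scanned: bytes | None) -> bytes | None:
        # Prefer the freshly scanned block; fall back to the cached copy
        # if the scan failed (dirt or scratches at the splice). Returning
        # None is what would force a revert to the analog track.
        return scanned if scanned is not None else self.cache.get(seq)
```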
The Wii used this technique - game discs included the latest version of the system software, so that consoles would get updated by buying new games even if they were never connected to the internet.
I suspect the author doesn't know much about digital audio and theatrical audio.
The Dolby Digital signal is AC3, a lossy codec for 5.1 audio. I believe it's the same AC3 that DVDs use. Bitrates (and audio quality) are roughly similar to MP3. Both were developed around the same time.
> I suspect audio data is stored uncompressed.
A little bit of math: 96 squares a second, at an (assumed) 48000/second sample rate, works out to 500 samples/block. If it's 5776 bits a block (and that's a strong if), that's ~960 bits / block / channel, or ~2 bits a sample. There's definitely a LOT of compression going on.
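Spelling out that arithmetic (the 48 kHz sample rate is an assumption; 5776 bits/block is the patent's figure, discussed further down the thread):

```python
# Back-of-envelope figures from the comment above.
blocks_per_sec = 24 * 4                      # 24 fps x 4 perforations = 96
samples_per_block = 48_000 / blocks_per_sec  # 500 samples/block
bits_per_channel = 5776 / 6                  # ~963 bits/block/channel
bits_per_sample = bits_per_channel / samples_per_block
print(bits_per_sample)   # ~1.9 bits/sample vs. 16 for CD-quality PCM
```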
> I couldn’t find any public references to the dolby matrix encoding format
The author is confusing terms here. "The Dolby Matrix Encoding Format" is what's going on in the analog audio tracks. In audio, "matrix encoding" is how multi-channel audio is derived from two-channel audio. The analog audio is "Dolby Stereo," where a 5.1 mix is "matrixed" into a 2-channel recording, and then "de-matrixed" on playback. (In theatrical audio, "stereo" means what home audio calls "surround" or "multichannel.")
Are you sure Dolby Digital is that low in quality? From what I understood, it was lossy compression, but mainly limited by the bandwidth of things like optical cables, and it could reach some pretty decent levels. Just because it runs at a similar bitrate to MP3 does not mean the quality is the same.
edit: This explains the differences pretty well - Dolby Digital (original version) can go higher than 320 kbps
From my experience, "low quality" MP3s had little to do with limitations of the MP3 codec. They came about because whoever encoded them chose low bitrates, used a bad encoder, or chose bad settings when making them. And, typically whoever was encoding an MP3 wasn't a trained professional.
In contrast, AC3 encoding was performed by professionals who understood what they were doing.
How does professionalism help if they don't have any choice in parameters? AC3 streams on DVD are constant bitrate and there's only one encoder, as far as I know. It's technically fairly similar to MP3, and mostly gets away with that by being used at higher bitrates.
Stereo AC3 encodes for DVDs were commonly 128kbps with some in the 224kbps-256kbps range with 5.1 surround in the 320kbps-384kbps range. IIRC, AC3 could go to 512kbps, but rarely did that happen as it took away bandwidth from the video. All of the video/audio/subtitle streams in DVD VOB had a total bandwidth of 9.8Mbps. If you had multiple 5.1 @ 320kbps and a default stereo as 224kbps, you've already robbed just under 1Mbps. Subtitle streams were small, but added up as well when supporting multiple languages. Avg desired bitrate for video was 8.5Mbps, but with overhead and all of the audio, it was often much lower. All of that math is just for the demuxer and does not take runtime into consideration.
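For illustration, that budget arithmetic spelled out (the subtitle figure is a rough placeholder; the rest are the numbers above):

```python
# DVD mux budget using the figures from the comment above.
TOTAL_MUX_KBPS = 9800                  # ceiling for all streams in a VOB
audio = 2 * 320 + 224                  # two 5.1 AC3 tracks + default stereo
subtitles = 4 * 10                     # several subtitle streams add up too
video_ceiling = TOTAL_MUX_KBPS - audio - subtitles
print(audio)           # 864 kbps - "just under 1Mbps" robbed from video
print(video_ceiling)   # ~8.9 Mbps left, before container overhead
```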
I am mainly thinking about this for modern sources - I haven't run a DVD player in years and years. But I do have a Roku with a built-in Dolby Digital hardware encoder that encodes anything 5.1 from Netflix/HBOMax/Amazon etc. and sends it over Toslink to my somewhat older receiver. It sounds pretty damn good, so I'm hoping it defaults to the highest bitrate it can, but who really knows. I also have my Chromecast Audio using Toslink, which I assume is doing stereo AC3.
TOSLINK is just an optical connector system; it's not a system for encoding. Just like "RCA" is a type of connector, you can run whatever signal you want over it.
Optical audio in consumer equipment is very limited. It seems like it should be high-bandwidth because it's optical, but it falls short. For consumer audio equipment, ordinary RCA cables and TOSLINK optical cables do the same job, and the optical cables don't have better bandwidth or let you do longer cable runs.
You get two channels of PCM audio, or more channels of lossy audio. If you're running stereo or 2.1, I think it makes sense to configure your Roku to send a stereo S/PDIF signal over TOSLINK.
I understand how optical works - it's just that my receiver only has optical or RCA. Early 2000s receiver. The Roku only has HDMI out and Toslink. Trust me - definitely the best (and I believe ONLY) way to get 5.1 to this thing is using Toslink. No S/PDIF connections on either device.
That's because S/PDIF isn't a type of connector. It's commonly carried over coax cables (with RCA connectors) or optical cables (with TOSLINK connectors).
I had to switch to a TOSLINK cable for my soundbar. My LG TV and LG soundbar are really meant to connect via ARC using an HDMI cable. However, mine had serious problems with going out of sync and playing distorted audio for 10-15 seconds before correcting itself. It was very, very annoying. I switched to a TOSLINK cable and no longer suffer that issue.
Yea, that is a somewhat common issue with HDMI and audio - if the device passing it through can't keep up, then there can be sync issues. Or the source device can also be a problem (or an app on the source device).
I have found Toslink and S/PDIF to be very reliable in this regard, though (probably because they are such mature technologies at this point).
Make sure your TV is actually outputting the correct format over Toslink to your soundbar - many times they default to 2.1 instead of 5.1, or to the wrong format (assuming you have a soundbar with more than just 2.1). A lot of TVs also downmix to 2.1 audio if you pass a source through them and then use the TV's Toslink output to a receiver (say, Apple TV --> TV --> Receiver).
I can't remember ever having a receiver fail. Every receiver I've owned still functions. If I visit my dad, he still uses a receiver that he got in the 1980s.
Older receivers (1980s) can blow an output transistor, which is a pain (fixable if you are skilled). Newer receivers are often built with integrated power amps that have extensive protection circuitry... they're truly impressive, and hard to accidentally damage.
I've done some amplifier repair work. The amplifiers I repaired were typically much older (1950s, 1960s). The most common failure modes are fuses blowing, potentiometers failing, capacitors failing, and PCB-mounted jacks failing.
I still have a 1970s model head unit that sounds just as good today as when my dad passed it down to me as a teenager in the 90s. The thing I loved about it so much is it had A, B, A+B mode so that it could drive 4 speakers. It wasn't quad, but I could put 2 sets of stereo speakers in different rooms.
First, I wouldn't call MP3 "low-quality". There are some limitations to MP3, and it has lower fidelity than other codecs at the same bit rate, but MP3s are crystal clear when done right.
AC3 has higher fidelity at the same bit rate, and can be used at higher bit rates, but the difference between AC3 and MP3 is not exactly night and day. AC3 is roughly similar to MP3 in terms of quality and bitrate, but that's not a bad thing.
As mentioned elsewhere in the thread, the audio format in cinema wasn't technically AC3 but something quite similar.
MP3 has fatal flaws on some audio and can't compress it transparently at any bitrate - this notably happens with cymbals. Some of this is because there's a maximum bitrate above which it's technically out of spec.
There is a reason we don't use it anymore. AAC/Opus don't have these problems.
Valid reasons... but definitely not "low-quality".
When we're talking about "transparency" here, we're talking about finding a small percentage of samples that can be ABX'd by people who are specifically listening for artifacts. For plenty of source material, you get transparency with MP3.
Yes, for almost all source material this is true. Unfortunately, there are definitely killer samples out there that can be distinguished with no effort at all, even at 320 kbit/s with the latest LAME.
Also, rather than increasing complexity I think the newer better codecs are actually simpler than MP3, since we learned which parts of it helped and which didn't.
Was that ever released for film use? I don't recall seeing any prints referring to ProLogic or ProLogic II during my time as a student film projectionist. If so, it was highly backwards compatible since our decoder was only SR (analog 4.0 matrix).
The prints would generally come with SR (analog 4.0) and SR-D (Bitmaps between the sprockets). Most of the time you'd also get DTS CDs and about 40-50% of the prints we got (in Stockholm) had SDDS, though I think there were maybe like 5 SDDS theatres in Sweden.
The analogue soundtrack on 35mm is recorded optically and also used as a fallback in case the digital soundtrack fails for some reason. It often used Dolby Stereo which could be played as a regular stereo track but also as four channels including centre and surround.
70mm film was far superior with 6 discrete magnetic soundtracks. But it was also far more expensive and required entirely new equipment.
Digital soundtracks were a solution to retrofit discrete multi-channel audio to existing 35mm technology. I love how they found space next to and between the perforations. It reminds me of how colour TV works. DTS is one that used the timecodes and came on an external CD which would also be far cheaper and something cinemas could easily upgrade to without replacing existing equipment.
The LFE channel appeared with these formats and is another example of supporting retrofitting. The idea with LFE is cinemas could simply buy and install some subwoofers and feed the LFE channel directly to them. No expensive bass processing required. No need to touch the existing main channels. Easy upgrade. Now every cinema has bass management so it's not really necessary but it's still there.
Dolby Digital could be used without replacing much existing equipment: the sound head looked like it bolted right onto the 1940s-vintage projector in the one installation I've seen. It would look like it belonged, if the projector weren't gray wrinkle paint and the sound head smooth black finish. Yes, several parts have been updated, including the light source and a platter system, but again, none of it looks like it involved doing any violence to the actual projector.
There were several projectors (such as the Century JJ) which could show 35 and 70mm but were stuck with only 35mm capability after the Dolby Digital upgrade; the mounting location blocked the 70mm egress slot. There were workarounds involving hacksaws, Dremels, and a new roller, but not very good ones.
Depending on your projector and the amount of space, you can fit one or two digital heads in there. We modded our 1950s-era Bauer U2s to add DTS readers upstream of the analog sound readers, but at the expense of the 70mm magnetic sound heads. Took about an afternoon.
As I recall you also had mag strips on some 35mm films but it wasn't very widely used. (And it may have been all films that were originally shot in 70mm.)
The SDDS soundtrack (blue strips on the outer edges of the film) had severe reliability issues as prints got older. That part of the film is the bit that touches the rollers while the film is being transported from the platter to the projector and back, so unless everything was squeaky clean and perfectly aligned, it would get scratched and SDDS would begin to drop out (falling back to other formats, usually analogue, depending on your config), resulting in a bad experience.
If you got a second-hand print from somewhere you’d just never bother to run SDDS.
Since it's digital data, I wonder if anyone ever captured the data the very first time it was played back, and then just played the captured version on subsequent runs.
It would eliminate the problem with it wearing out.
Commercially? Probably never happened. It is just easier to use more reliable formats like Dolby Digital or DTS (which is shipped with the print on CDs and synchronised using a timecode… some DTS units had HDDs in them you could load the soundtracks in to so you didn’t have to swap the discs over when you changed what movie was showing).
Maybe it was different elsewhere, but SDDS was pretty much a non-starter for us for any kind of effort.
I've captured Dolby Digital data to rip the audio from the film before. I don't (yet) have a SDDS reader, but when I get one it's something I want to try.
Sound on film is fascinating, and demonstrates a funny tension in analog formats: there's visible grain in the frames themselves, and yet a precise light source can encode massive amounts of data in just the spare emulsion between the sprockets.
Some old "Super 8" 8mm handheld movie cameras recorded audio magnetically on the side of the film, which is weird because the film itself is not magnetic at all. There's just a thin magnetic strip running along the edge. I've seen it on recordings from the 1970s, but it probably dates to the 1960s.
There were also stereo optical recordings, but optical was typically mono. You could hold the film up to a light source and see the inverse of the waveform. There was one place in town that I was aware of that had stereo optical on their telecine unit. It wasn't that hi-fi, though.
> The optical soundtrack on a Dolby Stereo encoded 35 mm film carries not only left and right tracks for stereophonic sound, but also—through a matrix decoding system (Dolby Motion Picture matrix or Dolby MP[1]) similar to that developed for "quadraphonic" or "quad" sound in the 1970s—a third center channel, and a fourth surround channel for speakers on the sides and rear of the theater for ambient sound and special effects. This yielded a total of four sound channels, as in the 4-track magnetic system, in the track space formerly allocated for one mono optical channel. Dolby also incorporated its A-Type noise reduction into the Dolby Stereo system.
That sounds like you might be thinking of the earliest form of surround sound, as was used for 2001 and some other films. This was developed to become Dolby Stereo in the late 70s, and Dolby Pro Logic for home cinema in the 80s.
Makes you wonder what resolution of video could be encoded digitally if you used the entire frame. With compression algorithms and a good encoding method, surely the image quality could be improved over the standard analogue print.
There is actually a reason to do this - and people do. For long-term archival, film is king - it's store-and-forget, unlike hard drives, which require migration every few years. So people write digital data to film, either in the form of text documents as images, or some digital encoding of the data, and then it's put into long-term storage.
GitHub did this with its Archive Program https://archiveprogram.github.com/, and Piql https://www.piql.com/ is a company that specialise in doing this. They're the company that created the Cinevator, a high speed data to film transfer machine that was originally used for creating film prints, but since film is kinda dead these days, they've pivoted to archival storage.
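As a toy illustration of the idea (real systems like Piql's add framing, error correction and calibration marks; this is just the core "bytes as black-and-white frames" notion):

```python
# Render bytes as a 1-bit bitmap, the way a film recorder might expose
# them. Purely illustrative - real archival formats are far more involved.
from PIL import Image

def bytes_to_bitmap(data: bytes, width: int = 512) -> Image.Image:
    bits = [255 * ((byte >> (7 - i)) & 1) for byte in data for i in range(8)]
    bits += [0] * (-len(bits) % width)            # pad to full rows
    img = Image.new("1", (width, len(bits) // width))
    img.putdata(bits)
    return img

bytes_to_bitmap(b"store and forget " * 100).save("frame.png")
```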
> Dolby Digital cinema soundtracks are optically recorded on a 35 mm release print using sequential data blocks placed between every perforation hole on the sound track side of the film. A constant bit rate of 320 kbit/s is used. A charge-coupled device (CCD) scanner in the image projector picks up a scanned video image of this area, and a processor correlates the image area and extracts the digital data as an AC-3 bitstream. The data is then decoded into a 5.1 channel audio source. All film prints with Dolby Digital data also have Dolby Stereo analogue soundtracks using Dolby SR noise reduction and such prints are known as Dolby SR-D prints.
All correct (as far as I'm aware), except for the unfortunately common inaccuracy that Dolby Digital on film is AC3. Sadly it's not - it's an earlier iteration of that technology that's incompatible. It must be pretty similar, though.
Sorry, I didn't mean to comment on your expertise, but rather the epistemology of who we believe.
> My knowledge comes from someone who helped develop Dolby Digital on film so I'm pretty sure it's correct.
That is certainly better information than anyone else here has, and you seem to actually understand the technology. I wish HN was 90% comments like yours and almost none of the rest, especially Wikipedia!
But while we're talking about epistemology! Decades-old memories go wrong, and when vested interests are involved, memories can drift toward desired beliefs or familiar narratives. That doesn't make the comment a bad one - it's very valuable, actual knowledge. But it's not certain to be correct either, or precise.
Absolutely true and I agree with you. Something else to back up that it's not quite AC3 is that I've captured the raw signal that goes into the decoder box's "AC3-decoding card", and it's not quite what I'd expect for AC3. You can learn more about that and download the raw signal capture at https://fanrestore.com/thread-2633-post-73179.html#pid73179
Holy cow, what an amazing-looking community! What is going on there? Who is behind it? And when people restore a work, can they share it with other members?
TLDR: If you know about audio encoding and/or want to help reverse engineer this format, I'd be very interested in hearing from you!
----
Oh, a topic I know a bit about!
To add a few things onto the stuff mentioned in the article:
- A lot of places mention Dolby Digital on film being AC3 encoded. This is sadly not true - ish. Dolby were developing AC3 for film and other applications, but the version that ended up going on film isn't quite AC3 - although I suspect it's very close. To quote someone at Dolby:
> The audio coding used is actually a pre-cursor of what became AC3, and in fact, two different versions of the codec were used during the lifetime of Dolby Digital on film. We referred internally to these as EC9 and EC11. I don’t remember the exact characteristics of these codecs but unfortunately they are not compatible with AC3
- The article mentions Reed-Solomon encoding being used - this is indeed the case, but more specifically it's a concatenated code, meaning that two Reed-Solomon instances are applied one after the other. However, because nothing is nice and simple here, some of the bits are only protected by one of the R-S layers and not both, but I'm not sure which at the moment.
- The article questions how the bytes are stored - this is actually mentioned in the patent. Each byte is arranged in a 2x4 group, i.e. 2 bits horizontal, 4 vertical, and therefore each block is 38 bytes wide and 19 bytes high (76x76 bits, which matches the patent's 5776 bits per block). Interestingly, the picture in the article is rotated - you can tell by the Dolby logo being the wrong way round.
- Something not mentioned in the article or the patent - before the Reed-Solomon decoding is applied, all the bits are XORed against a constant in order to break up large black or white areas, which would be hard to detect. Because the same value is used during encoding as decoding, the original data is recovered. Sadly, I don't know what this constant value is. (A rough sketch of the layout and whitening steps follows below.)
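To make the above concrete, here's a hedged sketch of decoding one block. The 2x4 layout is from the patent; the whitening constant, the bit order within each 2x4 cell, and the Reed-Solomon parameters are all unknown, so those parts are placeholders:

```python
# Sketch of decoding one 76x76-bit data block. Unknowns are marked.
WIDTH, HEIGHT = 76, 76      # bits; 76 * 76 = 5776 bits = 722 bytes
WHITENING = 0x00            # real constant unknown - placeholder

def deinterleave(block) -> bytearray:
    # block[y][x] is one bit (0 or 1). Each byte occupies a 2-wide x
    # 4-tall cell, giving 38 byte columns x 19 byte rows. The bit order
    # within each cell is a guess.
    out = bytearray()
    for by in range(HEIGHT // 4):          # 19 byte rows
        for bx in range(WIDTH // 2):       # 38 byte columns
            byte = 0
            for i in range(8):
                x, y = bx * 2 + i % 2, by * 4 + i // 2
                byte = (byte << 1) | block[y][x]
            out.append(byte ^ WHITENING)   # undo the XOR whitening
    return out

def decode_block(block) -> bytearray:
    data = deinterleave(block)
    # Two concatenated Reed-Solomon decodes (inner, then outer) would go
    # here; their parameters aren't public, and some bits are protected
    # by only one of the two layers.
    return data
```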
I would love there to be an open source decoder for Dolby Digital on film, but at the moment I'm at the limit of my knowledge of AC3 and audio signal processing. If anyone knows more, or wants to help, I'd be delighted! You can reach me on email `fergusondavid6@gmail.com` or discord `davidferguson#2018`
I picked up some 35 mm feature film at an auction, thinking I'd be able to find a cheap projector at some point and it could be a fun weird hobby... But then I realized the cheap projectors are not 35 mm. Also, I only got half the film (reels 2, 4 and 6)... so if anyone wants half of Twilight: Breaking Dawn - Part I, my email is in my profile.
This is funny, you've got half mathematically, but logically you've got the second, fourth and sixth sixths of the feature!
When I was still working as a projectionist, for big releases (like Twilight) the film courier would often bring three reels on Monday and three reels on Tuesday, so you could make it up Tuesday night and test before the film opened on Thursday (or Wednesday night for midnight releases). I suspect that is why you have what you have...
Most theaters moved to digital around 10 years ago, and most of them got rid of their 35mm projectors (some kept one or two, just in case). I worked on the provider side of the migration to digital cinema at the time. In theory there should be an abundance of cheap 35mm projectors on the market, but maybe the demand wasn't there, so most of the projectors were just tossed? I know some projectors were shipped to third-world countries as well.
There was no demand, because you can't buy 35mm prints, unlike 16mm and 8mm. Additionally, in order to finance the purchase of digital projectors, many cinema owners went with something called "Virtual Print Fees" (VPFs), where the cost was paid off over time. However, most of these mandated that the film projectors were either removed and destroyed, or disabled so they couldn't run film. Why? Well, this ensured that everyone switched to digital and couldn't go back, so the VPFs kept being paid. It's very sad hearing about all those machines being damaged.
I worked as a projectionist in an old theatre in my teens that used 35mm film and the analog sound stripe. This was early 2000s and was very dated for the time. Projectionist used to be a trade, but by this point they gave the job to kids with minimal training for minimum wage. I loved threading the film through the projector and across the room on pulleys to the platter that wound it back up, the clickity clack when you turned it on.
One neat thing I remember: we applied these little foil bits on the edge of the film to control the house lights. They were applied at either end to dim the lights when the movie started and bring them back up at the end.
I think I still have a Star Wars Ep. III trailer on 35mm kicking around.
The Dolby Digital track is actually compressed, not uncompressed as the author speculates. Back then, "Dolby Digital" branding was still synonymous with the Dolby AC-3 codec, which offers lossy audio compression.
The article miscalculates the data rate as "584KB/s (Kilobytes/s)" when the numbers actually give 554kb/s (using the patent's 5776 bits per square). This is kilobits per second, not kilobytes per second. The incorrect kilobytes number is more than enough for 6 channels of uncompressed CD quality audio, but the real number requires lossy compression.
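Spelled out, using 24 frames/s x 4 perforations per frame:

```python
# The corrected data-rate arithmetic from the comment above.
raw_rate = 24 * 4 * 5776        # 554,496 bit/s ~= 554 kbit/s (raw, pre-ECC)
cd_quality = 6 * 44_100 * 16    # 4,233,600 bit/s: 6 channels of 16-bit PCM
print(raw_rate / cd_quality)    # ~0.13 - lossy compression is unavoidable
# The mistaken 584 KB/s = 4,672 kbit/s *would* have fit 6 uncompressed
# CD-quality channels; the real raw rate clearly cannot.
```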
Interestingly, E-AC-3 with Joint Object Coding (Dolby Digital Plus JOC) is also what Apple Music uses to deliver Spatial Audio, although the OS makes it difficult to decode to any layout beyond 5.1.