Whatever the cost per MB, why has it become completely normal that websites are multiple megabytes in size? It's mostly text, after all. Boggles my mind.
This is why I only use old.reddit.com. If they ever stop providing the old Reddit, I doubt I'll keep using it. Sure the old one isn't perfect, but I find the new one borderline unusable, even on a modern laptop.
There's a furniture chain here with a website so terrible that I just gave up on buying what I wanted from them. So much happens in JavaScript to make things appear on the page. I have to imagine it's also terribly written, since it freezes the tab most of the time and Firefox suggests killing the script.
I don't think I've ever been asked to accept a cookie policy on Reddit. I normally only browse Reddit while logged in, so that could be part of it. Is this a new thing?
Yes, it (new Reddit) even brings my i5 to its knees... Way over-bloated with JavaScript. Also in Firefox, by the way. Perhaps Chrome is better, but I won't use it.
I relate to this a lot. reddit.com struggles to load on my 2015 MacBook Air, or on a Windows machine with a 9th-gen i5 and 8 GB of RAM.
I have a Firefox extension that always loads old Reddit. There is an account-specific setting to always load old Reddit, but there are times when I browse without an account.
Reddit on mobile is a disaster. It's clearly designed with one goal in mind: pushing as many people to their app as possible. You can't even read threads without logging in.
Especially Reddit, which has such a simple UI that no frameworks are needed.
If you drop support for MSIE 11 and only support modern browsers, modern JavaScript is surprisingly compact.
But then your CV looks stupid: no frameworks on the list. How would you ever get a job again?
How do you get any respect if you say "I just write JavaScript directly for the browser"? It sounds like you're too stupid to learn a framework.
You will only get respect from the very best developers, not from the copy/paste 'developers', and none at all from the HR department or mediocre managers.
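For what it's worth, the frameworkless approach really can be tiny. Here's a minimal sketch of rendering a Reddit-style post list with plain modern JavaScript (the function names and post shape are invented for illustration):

```javascript
// Escape user-supplied text so it is safe to interpolate into HTML.
function escapeHtml(s) {
  return s.replace(/[&<>"']/g, c => ({
    '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;'
  }[c]));
}

// Turn an array of posts into an HTML fragment using template literals.
function renderPosts(posts) {
  return posts.map(p => `
    <article>
      <h2><a href="${escapeHtml(p.url)}">${escapeHtml(p.title)}</a></h2>
      <p>${p.score} points, ${p.comments} comments</p>
    </article>`).join('\n');
}

// In a browser you would then do something like:
//   document.querySelector('#posts').innerHTML = renderPosts(posts);
```

Template literals, arrow functions, and fetch cover most of what a list-and-detail site like Reddit needs, and the whole thing gzips to well under a kilobyte.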
Use of frameworks is often blamed for bloat, but the two are completely independent.
React is about 110 kB; Vue is 33 kB. It's perfectly possible (and really not that hard) to write fast, small web apps with the big popular JS frameworks. It's the thoughtless stuff people do, like pulling in huge libraries for tiny features or mandating the use of huge analytics packages, that causes the bloat.
Whether it should all just be server-rendered is a whole other argument, but if you're going to build something the size and scale of Reddit as a client-rendered app, you'd be crazy not to use one of the existing frameworks, IMO, if for no other reason than that you're likely to end up re-creating a worse version of them.
I think bloat really means two different things. One is the obvious file-size issue, and as you mention, it's quite possible to write small sites even using heavyweight frameworks.
But a lot of the complaints in this thread boil down to modern sites being slow to run on computers that are only a few years old. If modern frameworks are to blame for that, it's not because they're a hundred (or more) kB. It's because they require the end user to run far more JavaScript in the browser and consume far more memory than a site with minimal JavaScript needs to accomplish the same thing.
As a random example, Firefox's about:performance page says that the three YouTube tabs I have open are each consuming about 50 MB of my computer's memory. That's obscene. (None of the tabs are open on a video, so this is not the result of storing videos in RAM.)
Also, I did a Lighthouse analysis of the Reddit home page in Chrome (on desktop, but targeting mobile). The page scores less than 20/100 on performance and takes almost twenty seconds to become interactive!! The biggest problem Lighthouse sees with the page is not the file size but the amount of JavaScript being executed: there's ~25 seconds of "main thread work" being done in my test.
> If you drop support for MSIE 11 and only support modern browsers the modern javascript is surprisingly compact.
On the contrary, I'm pretty sure you could make Reddit a usable site in everything down to text-based browsers (static HTML + small bits of JS), and still have it be faster and smaller than what the new version is today.
"Modern" development is a horrible bloated mess.
I agree with the rest of your comment, however; perhaps the solution is to not hire only "web developers": those whose full-time job isn't to work on that stuff seem to be far better at not adding "padding" for the sake of "CV-driven development" when they are actually asked to work on websites.
The people who develop said sites live in a bubble where high-speed computers and internet are the norm, and many of them are too young to remember when computers were actually slow, so they don't really understand how unbelievably wasteful their product is.
Indeed. Compare to https://teddit.net (no-js reddit front end), and you can see the vast difference in bandwidth and performance.
Some of the cause of all this has to do with market forces and prioritizing speed of development over customer experience. But I don't think that's the whole story.
Many of them, thanks in part to dynamic linking. (I don't think excluding DLLs is too unfair, after all the browser constitutes an enormous Javascript runtime environment.)
VLC is one of the more complex programs out there, and yet the VLC binaries and the libvlc libraries they ship with total only 1.295 MiB on my computer.
The main application for Scid, a chess database program, is only 1.1 MiB. (The program also comes with a simple test chess engine and some opening books, so the overall package is somewhat larger.)
The binary and libraries for hexchat, an IRC application, total 1.22 MiB.
The program files for Pinta, an image editing program, come in right at 1.3 MB.
The vast majority of programs are under 10 MiB total. Moreover, I've tried to pick relatively full featured desktop apps, and desktop apps as a rule do more than any 1 MB web page. And these applications come in highly compressed archives, which means that their download size is often significantly less.
Furthermore, I would argue that web pages are rarely comparable to applications. When you're browsing Reddit, if the home page is 1.3 MiB on mobile, chances are the first individual post you click on is also going to be 1.3 MiB, and the one after that, and so on. You don't get the benefits of Reddit by downloading an "application" one time and getting tiny updates after that. If you're lucky most of the libraries being pulled in will get cached, but frequently it doesn't work out that way. On desktop Reddit, a warm refresh of the site still used more than 2.5 MiB for me, and that's with an ad blocker enabled.
It's possible to create GUI applications under 10 kB in size if you link dynamically and use the raw Win32 API. I'm pretty sure my personal wxWidgets stuff was well under 100 kB too, back in the day. I never used GTK directly on Linux, but again, 1 MB is an indicator that you're either linking statically or doing something wrong.
How much signal is there in that 1.3 MB, though? We don't think about the weight of websites much nowadays because we have insane bandwidth, but how much of that is actual content? The text on one page can't be more than a few kilobytes, and images don't have to be much more data either if they're just thumbnails.