Here is an example of what I'm talking about: https://www.heavensgate.com/

This is a website from a cult whose members committed mass suicide in the 1990s. That part is irrelevant to the thread, though.

When you click on their links, everything is very snappy. Why is that, exactly?

death to america

  • Parzivus [any]
    ·
    2 years ago

    marxists.org and marxism.org are both good examples of lightweight websites and the average marxist's graphic design passion

  • Llituro [he/him, they/them]
    ·
    2 years ago

    Old websites like this are almost nothing but HTML, and most of the styling is done through that as well. HTML is basically just a description of a document's formatting, so like the header and body and title and stuff. Browsers today support more complicated styling via CSS, which means they spend more time making things pretty. Browsers today also support JavaScript, so a website can basically run arbitrary code on your machine. That enables all kinds of modern features and stuff, but the costs are that you need to download way more data, and then your computer has to spend more time actually running that code. All of that takes time, especially if things aren't optimized very well. So the old internet is basically loading text, and the new internet is running a program.
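    To make that concrete, here's a toy sketch in Python (the page and the title-extractor are made up for this example): an old-style page is a tiny, complete text document that the browser just has to read and lay out.

```python
from html.parser import HTMLParser

# A complete 1990s-style page: nothing but HTML describing a document.
PAGE = """<html>
<head><title>My Home Page</title></head>
<body bgcolor="black" text="white">
<h1>Welcome</h1>
<p>No stylesheets, no scripts; the browser only has to lay out text.</p>
</body>
</html>"""

class TitleGrabber(HTMLParser):
    """Read the document structure the way a browser does and grab <title>."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

parser = TitleGrabber()
parser.feed(PAGE)
print(parser.title)                 # the page's title
print(len(PAGE.encode()), "bytes")  # the whole "download" is a few hundred bytes
```

    The entire page, markup included, is smaller than a single icon on a modern site.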

    • Antiwork [none/use name, he/him]
      ·
      2 years ago

      CSS doesn’t slow down websites that much either, though. It’s when you start adding a ton of different CSS plus JavaScript just to load an image, and all the plugins that call out to other sites to track visitors across the web.

      • Ideology [she/her]
        ·
        2 years ago

        Yeah, iirc, a CSS file is normally less than 1kb if you're handwriting it for a small static website. Should load instantly.
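        For scale, here's a quick check in Python (the stylesheet itself is made up, just to illustrate): a handwritten stylesheet for a small static site really does come in at a few hundred bytes.

```python
# A plausible complete stylesheet for a small static site (made up for
# illustration). Measuring it shows it's nowhere near 1 KB.
CSS = """\
body { max-width: 40em; margin: auto; font-family: sans-serif; }
h1, h2 { color: #333; }
a { color: #06c; }
img { max-width: 100%; }
"""

size_bytes = len(CSS.encode("utf-8"))
print(size_bytes, "bytes")
```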

      • Llituro [he/him, they/them]
        ·
        2 years ago

        Yeah, I guess that's more what I was getting at. I am not a web dev by trade, I'm just aware that stylesheets are often another of the things that gets downloaded.

  • HornyOnMain
    ·
    2 years ago

    linking the heavensgate website as an example lmao

    :michael-laugh:

      • InevitableSwing [none/use name]
        ·
        2 years ago

        "We've determined that you and Tom are the two people who get the honor of doing it."

        "But... I don't get to go to paradise?"

        "Sure you do. And Tom too. Just not next month. Somebody's gotta stay here in hell - just kidding - and run the website to make the remainers understand."

  • ToastGhost [he/him]
    ·
    2 years ago

    modern websites load a billion gigabytes of tracking and ads into your browser.

    • Beaver [he/him]
      ·
      2 years ago

      For example: when you go to the heaven's gate site, your browser downloads 13kb. If you go to a site like Facebook, your browser has to download literally a thousand times as much shit.

    • forcequit [she/her]
      ·
      2 years ago

      Need to emulate the 112k connection when loading images tho

      • D61 [any]
        ·
        2 years ago

        Watching the image resolve, line by line, and it taking a full minute.

    • sgtlion [any]
      ·
      edit-2
      2 years ago

      Shout out to the Gemini protocol, which is basically HTML 1.0: no cookie bullshit, no fancy apps, no scripts. Just text and links. It's adorable.

      https://gemini.circumlunar.space/
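      Gemini pages aren't even HTML; they're "gemtext", a format so simple that a link extractor fits in a few lines. A sketch in Python (the sample page is made up; gemtext link lines start with `=>`):

```python
def gemtext_links(doc):
    """Collect (url, label) pairs from gemtext link lines ("=> url label")."""
    links = []
    for line in doc.splitlines():
        if line.startswith("=>"):
            parts = line[2:].strip().split(maxsplit=1)
            if parts:
                url = parts[0]
                label = parts[1] if len(parts) > 1 else url
                links.append((url, label))
    return links

sample = """# A gemtext page
Just text and links, exactly as advertised.
=> gemini://gemini.circumlunar.space/ Project Gemini
"""

print(gemtext_links(sample))
```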

  • makotech222 [he/him]
    ·
    edit-2
    2 years ago

    I develop websites for a living. Basically, it's because new websites are actually apps, not just simple HTML documents

    Old websites:

    1. Receive request
    2. Generate HTML (if there's any sort of customization; otherwise just return the requested HTML file) (< 500kb)
    3. User receives fully formed webpage without any dynamic logic

    New websites:

    1. Receive request
    2. Deliver js bundle containing app and all dependencies (at least 5mb, usually way more)
    3. User receives bundle, now must run the code bundle
    4. Code bundle is loaded showing a blank page
    5. Blank page makes a request to actually get the data that belongs on the page
    6. Server generates data
    7. User receives the page they want

    Usually on a new website, the first four steps only happen on the first visit, since the bundle is cached. Nonetheless, every interaction with the webapp runs JS code locally on your computer, which takes additional time vs. the old websites, where every interaction was purely a server request for a new page.
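    The two flows above can be written out as a toy comparison in Python (no real networking; the numbers are the rough estimates from the comment, not measurements):

```python
# Toy model of the two request flows described above. No real networking;
# the sizes are the rough estimates from the comment, not measurements.

def old_site_visit():
    """Server-rendered page: one round trip, ready to read on arrival."""
    round_trips = 1
    payload_kb = 500        # fully formed HTML
    ready_on_arrival = True # no client-side code has to run first
    return round_trips, payload_kb, ready_on_arrival

def new_site_first_visit():
    """SPA first load: fetch the JS bundle, run it, then fetch the data."""
    round_trips = 2          # 1: the js bundle, 2: the API call for the data
    payload_kb = 5000 + 50   # ~5 MB bundle plus the actual page data
    ready_on_arrival = False # blank page until the bundle has executed
    return round_trips, payload_kb, ready_on_arrival

print(old_site_visit())
print(new_site_first_visit())
```

    Caching helps with the bundle on repeat visits, but the extra round trip for data and the local JS execution stay.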

  • PasswordRememberer [he/him]
    ·
    2 years ago

    I don't know shit about websites but I want to congratulate you on joining the death to America signature club :07:

    Death to America

  • infuziSporg [e/em/eir]
    ·
    2 years ago

    The pressures under capitalism are not to make sites that are efficient and usable, but to make sites that keep users on them and generate revenue.

    As the power of the average consumer's computing device has risen, the demands of the content have risen in tandem. So as connections and processors got faster, we went from banner GIF ads to Flash ads to video ads, and also invisible trackers that log all your clicks and mouseovers and even how much time you spend looking at something before scrolling.

    Many prominent sites deliberately add a couple of extra seconds of "loading" animation even when all the data has already arrived, just to convince the user that the site is hard at work processing something.