• rottingleaf@lemmy.world · 10 hours ago

    I personally think it’s not about Mozilla. It’s about the Web.

    You always need to see the bigger picture.

    The Web, as a global system of hypertext documents served from different computers, is fine.

    The Web wasn’t intended as a platform for platforms serving global applications.

    It’s used as one because that allows a certain kind of person to gather power. Networked personal computers made civil society too powerful. A solution was needed.

    Why the Web, and not just a “Facebook native application” and a “Google native application”? Well, a hypertext document system turned into an application platform is hard to maintain. That limits competition. It also lets the popularity of Facebook and Google drive the popularity of web browsers and web technologies: if those services don’t work in a browser, that browser is doomed.

    Meanwhile, the verticals and monopolies themselves allow thieves and murderers in governments to control the Internet.

    So, if you think about it, there weren’t that many websites requiring any particular web technology at the moment it came into existence. Those requirements mostly appeared specifically for the services and/or policies of Google, Facebook, etc. Say, HTML5 to phase out the Netscape plugin API, which was presented as phasing out Flash (everybody hated Flash).

    Mozilla followed those policies and appeared neutral, yes.

    But in general, the moment using Dillo or NetSurf or Links became plainly, completely not an option for the Web, the matter was decided. A world standard that has only a handful of compliant implementations is not a standard. It’s an oligopoly.

    So, getting back to hypertext: Flash was hated by some because it didn’t let you turn the whole webpage into an application, but that wasn’t its purpose. JS was a mistake, I think. Any interpreted content should have been embedded in its own clearly delimited place, separate from the rest of the page, with its own plugin, similar to Flash applets (see the sketch below). But one can accept that in 1996 they didn’t foresee such consequences.
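
    A minimal sketch of the kind of containment I mean, using today’s sandboxed iframe as a stand-in for a plugin region (the function name and the sizes are just for illustration, not an existing API of any kind):

    ```typescript
    // Hypothetical helper: confine interpreted content to one clearly
    // delimited box on the page instead of letting scripts touch the
    // whole hypertext document. A sandboxed <iframe> is the closest
    // modern analogue of a plugin/applet region.
    function embedApplet(container: HTMLElement, appletUrl: string): HTMLIFrameElement {
      const frame = document.createElement("iframe");
      frame.src = appletUrl;              // the interpreted content lives only here
      frame.sandbox.add("allow-scripts"); // scripts may run, but only inside this frame
      frame.width = "400";
      frame.height = "300";
      container.appendChild(frame);       // one well-defined place on the page
      return frame;
    }
    ```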

    And big remote services not being standardized was also a mistake. I’ve written a bit on that here from time to time; it gets tiring to repeat: a lot of what the server side of many applications does is just routing to another client, computation, and storage. One could devise a standard for such remote services, so that local applications would differ but would use the same pooled infrastructure, found and announced via trackers similar to torrents. With global identifiers for entities to allow interoperability, so that “post #12435324646dasgtshdryh” would be the same text on any such storage service (that has it) and at any point in time.
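
    A rough sketch of the global-identifier part, assuming (my assumption, not an existing standard) that a post’s ID is simply a hash of its canonical bytes, so any storage service holding it serves exactly the same text and the client can verify that. The StorageService interface and the function names are made up for illustration:

    ```typescript
    import { createHash } from "node:crypto";

    // Hypothetical interface for one pooled storage service
    // (found and announced via trackers, torrent-style).
    interface StorageService {
      name: string;
      fetchPost(id: string): Promise<string | null>; // null if this service doesn't hold the post
    }

    // Content addressing: the global ID is derived from the content itself.
    function postId(content: string): string {
      return createHash("sha256").update(content, "utf8").digest("hex");
    }

    // Ask the pooled services until one returns content whose hash matches
    // the requested ID: the same post, no matter which service answered.
    async function resolvePost(id: string, services: StorageService[]): Promise<string | null> {
      for (const service of services) {
        const content = await service.fetchPost(id);
        if (content !== null && postId(content) === id) {
          return content;
        }
      }
      return null; // not reachable through any known service right now
    }
    ```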

    That, of course, is a bit late. In our current world, things like Briar and other mesh networks are probably a better direction. One can have what I described over them too, but that will also require managing bandwidth, bottlenecks, and nodes that aren’t directly reachable.