Well-written article, manages not to sound rant-y while describing the problem well.
I feel like part of the blame for the situation is that JavaScript has always lacked a standard library which contains the "atomic architecture" style packages. (A standard library wouldn't solve everything, of course.)
What functionality is still missing from the JS standard library? The JS standard library seems massive these days - especially in modern runtimes like Node and Bun.
I really think writing dependency-free JavaScript is the way to go nowadays. The standard library in JS/CSS is great. So are static analysis (TypeScript can check JSDoc), imports (ES modules), UI (web components), etc.
People keep telling me the approach I am taking won't scale or will be hard to maintain, yet my experience has been that things stay simple and easy to change in a way I haven't experienced in dependency-heavy projects.
I’ve been exploring this for years, even made a tutorial website about building sites and apps without dependencies (plainvanillaweb.com). What I’ve learned is that many of the things the frameworks, libraries and build tools do can be replaced by browser built-ins and vanilla patterns, but also that making things that way is at present an obscure domain of knowledge.
I think this is because the whole web dev knowledge ecosystem of youtubers and tutorial platforms is oriented around big frameworks and big tooling. People think it is much harder than it actually is to build without frameworks or build tools, or that the resulting web app will perform much worse than it actually will. A typical react codebase ported to a fully vanilla codebase ends up just as modular and around 1.5x the number of lines of code, and is tiny in total footprint due to the lack of dependencies so typically performs well.
To be clear though: I’m not arguing the dependencies are bad or don’t have any benefits at all or that vanilla coding is a superior way. Coding this way takes longer and the resulting codebase has more lines of code, and web components are “uglier” than framework components. What I’m saying is that most web developers are trapped in a mindset that these dependencies must be used when in reality they are optional and not always the best choice.
Thanks for creating and sharing that resource! I'm reading through it now, and it looks fantastic. I'll share it the next time someone asks where to get started with web dev.
Come to think of it, I should write up the techniques I use, too...e.g. I have simple wrappers around querySelector() and createElement() with a bit of TypeScript gymnastics in a JSDoc annotation to add intellisense + type checking for custom elements.
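To give the flavour, the wrappers are roughly this (a minimal sketch, not my exact code; the HTMLElementTagNameMap lookup is what gives per-tag types, and you can merge your custom elements into that map via a small .d.ts so they type-check too):

  /**
   * Typed querySelector: $('button') is HTMLButtonElement | null.
   * @template {keyof HTMLElementTagNameMap} K
   * @param {K} selector
   * @param {ParentNode} [root]
   * @returns {HTMLElementTagNameMap[K] | null}
   */
  export function $(selector, root = document) {
    return root.querySelector(selector);
  }

  /**
   * Typed createElement with properties and children.
   * @template {keyof HTMLElementTagNameMap} K
   * @param {K} tag
   * @param {Partial<HTMLElementTagNameMap[K]>} [props]
   * @param {...(Node | string)} children
   * @returns {HTMLElementTagNameMap[K]}
   */
  export function h(tag, props = {}, ...children) {
    const el = document.createElement(tag);
    Object.assign(el, props);
    el.append(...children);
    return el;
  }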
Would you be open to a pull request with a page on static analysis/type checking for vanilla JS? (intro to JSDoc, useful patterns for custom elements, etc.) If not, that's totally OK, but I figure it could be interesting to readers of the site.
And agreed on vanilla/dependency-free not being a silver bullet. There aren't really one-size-fits-all solutions in software, but I've found a vanilla approach (and then adding dependencies only if/when necessary) tends to help the software evolve in a natural way and stay simple where possible.
I've been doing JS for nearly a couple decades now (both front and back) and I landed on the same approach a few years ago. Pick your absolutely minimal set of dependencies, and then just make what you need for everything else. Maybe counter-intuitive to some, but I feel like I'm more comfortable maintaining a larger codebase with fewer people.
What's more, given the tools we have today, it fits really well with agentic engineering. It's even easier to create and understand a homegrown version of a dependency you may have used before.
Did this for a project in 2022. Haven't had any drama related to CVEs, haven't had any issues related to migrating from some version of something to another.
The client has not had to pay a cent for any sort of migration work.
Is the lack of CVE because the implementations you wrote are better written and safer than those in the standard libraries or because no one has checked?
Presumably the latter. However, mindlessly bumping package versions to fix bullshit security vulnerabilities is now industry standard practice. Once your client/company reaches a certain size, you will pretty much have to do it to satisfy the demands of some sort of security/compliance jarl.
There are certainly security benefits to keeping things in-house. Less exposure to supply-chain attacks (e.g. shai-hulud malware) and widespread security bugs (e.g. react server components server-side RCE). Plus it's much easier to do a complete audit and threat model of the application when you built and understand everything soup-to-nuts.
Of course, it also means you have to be cautious about problems that dependencies promise to solve (e.g. XSS), but at the same time, bringing in a bunch of third-party code isn't a substitute for fully understanding your own system.
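(On the XSS point specifically, the vanilla discipline is mostly "textContent by default, never innerHTML for untrusted input". A minimal sketch:)

  // Render untrusted input without ever parsing it as HTML.
  function renderComment(container, userName) {
    const name = document.createElement('b');
    name.textContent = userName; // textContent never parses markup
    container.replaceChildren(name, ' commented');
    // The tempting one-liner is the XSS hole:
    // container.innerHTML = `<b>${userName}</b> commented`;
  }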
been doing something similar. the projects ive been building recently use as few dependencies as possible and honestly the maintenance burden dropped significantly. when something breaks you actually know where to look instead of digging through 15 layers of node_modules. people said the same thing to me about it not scaling but the opposite turned out to be true.
yeah, plus stack traces, debuggers, and profiling tools are easier to use when all of the non-essential complexity is stripped out. which in turn means it's possible to work productively on software that solves more complex problems.
that's in contrast with the sort of stuff that invariably shows up when something falls over somewhere in a dependency:
  cannot access property "apply" of null
      at forEach()
      at setTimeout()
      at digest()
      at callback()
      at then()
      ...
it's not fun to step through or profile that sort of code either...
Depends on what cryptography you're talking about; the Web Crypto API has existed for quite some time, so I'd say that (usually) fits in with "The standard library in JS/CSS is great".
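For example, hashing no longer needs a dependency. A sketch using SubtleCrypto (available in modern browsers, and in Node as a global since v19, or via node:crypto's webcrypto export before that):

  // SHA-256 of a string, hex-encoded, with only built-ins.
  async function sha256Hex(text) {
    const bytes = new TextEncoder().encode(text);
    const digest = await crypto.subtle.digest('SHA-256', bytes);
    return [...new Uint8Array(digest)]
      .map((b) => b.toString(16).padStart(2, '0'))
      .join('');
  }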
Is there no room for describing the setting? Must every utterance that sets the atmosphere also advance the plot or reveal character? Is there no room for mood?
describing the setting should (ideally) be done through a character's interaction with the setting.
if you're developing some sort of dystopia where everyone is heavily medicated, better to show a character casually take the medication rather than describe it.
of course, that's not a rule set in stone. you can do whatever the fuck you want.
He's very efficient with prose and I find it a joy to read (well, given what he's writing about it's not always joy, but still). I'm not sure he's following that rule 100% of the time, but it's close. Depending on the setting, you can often describe it through characters' actions or how it shapes them.
Some authors rarely describe a place objectively. We see a space through the eyes of the characters - and in doing so, we learn about our characters as we learn about the space they inhabit.
Setting provides the context for action or characterisation to occur in a meaningful way, or provokes them, so it is a necessary part of both (when done for either of those purposes). Given that, the charitable interpretation would be to provide only enough description of the setting for that.
All software has bloat, but npm packages and web apps are notorious for it.
Do you think it could be inherent to the language?
JavaScript seems to be unique in that you want your code to work in browsers of the past and future—so a lot of bloat could come from compatibility, as mentioned in the article—and it's a language for UIs, so a lot of bloat in apps and frameworks could come from support for accessibility, internationalization, mobile, etc.
The problem JS development is facing is the same one most languages go through: the "magic" that promises to solve all problems, and frameworks and solutions that solve small issues at great cost.
Lots of developers don't even say they are JS devs but React devs or something. This is normal given that the bandwidth and power of targets are so large nowadays. Software is like a gas, it will fill all the space you can give it since there is no reason to optimize anything if it runs ok.
I've spent countless hours optimising JavaScript and CSS to work across devices that were slow and outdated but still relevant (the IE7, 8 and 9 years were rough). Cleverness breeds in restrictive environments where you want to get the most out of them. Modern computers are so powerful that it's hard to hit the walls when doing normal work.
Every C++ app I install on Linux requires 250 packages.
Every Python app I install and then pip install requirements uses 150 packages.
10GB of build artifacts for the debug target.
You should try compiling other game engines and comparing them; Unreal Engine is a fun one with the source available, take a look at how big its artifacts are :)
With that said, there are plenty of small game engines out there, but couple Rust's somewhat slow compile times with the ecosystem's preference for "many crates" over "one big crate", and yeah, even medium-scale game engines like Bevy take a bunch of time and space to compile. But it is a whole game engine after all, so maybe not representative of general development in the community.
I wouldn't say every Rust app does, but I do think it has become more normal for Rust apps to have 200-600 dependencies. However when I look at the list, they usually all make sense, unlike with NPM. There are rarely any one-line crates. Actually I haven't seen any yet (except joke ones of course).
There's no way the average C++ app uses 250 packages though. It's usually more like 5. C++ packaging is a huge pain so people tend to use them only when absolutely necessary, and you get huge libraries like Boost primarily because of the packaging difficulty.
I would say Python varies but 150 sounds high. Something more like 50-100 is typical in my experience.
"Do you think it could be inherent to the language?"
Not to the language, but to its users. Not to bash them, but most of them did not study IT at a university, did not learn about the KISS principle, etc.
They just followed some tutorials to hack together stuff, a process now automated via LLMs.
So in a way the cause is the language, as it is so easy to use. And the ecosystem grew organically from users like this; yes, the ecosystem is full of bloat.
(I think Claude nowadays is a bit smarter, but when building standalone HTML files without agents, I remember always having to tell ChatGPT to explicitly NOT pull in yet another library but to use plain vanilla JS for a standard task, which usually works better and cleaner with the same number of lines, or maybe 2 or 3 more, in most cases. The default was to use libraries for every new piece of functionality.)
> All software has bloat, but npm packages and web apps are notorious for it. Do you think it could be inherent to the language?
It sure seems like it is because JS devs, by and large, suck at programming. C has a pretty sparse standard library, but you don't see C programmers creating shared libraries to determine if a number is odd, or to add whitespace to a string.
> you don't see C programmers creating shared libraries to determine if a number is odd, or to add whitespace to a string.
Believe me, if C had a way to seamlessly share libraries across architectures, OSes, and compiler versions, something similar would have happened.
Instead you get a situation where every reasonably big modern C project starts by implementing their own version of string libraries, dynamic arrays, maps (aka dictionaries), etc. Not much different really.
Everyone trash talking the JS ecosystem without contributing the slightest to the conversation would benefit a lot if they read https://www.artmann.co/articles/30-years-of-br-tags in order to understand the evolution of the language and its tooling.
Nobody argues what we currently have is great and that we shouldn't look to improve it. Reducing it to "JS developers bad" is an embarrassing statement and just shows ignorance, not only of the topic at hand, but of an engineering mindset in general.
I find the mindset of trying to understand and accept the bad fine in moderation, but defeatist when taken too far. It doesn't matter why JS is bad: it will harm your future prospects if you approach it with too much acceptance. We always need to be examining the practice in front of us and the theory that would be a better replacement for it, and trying to make the leaps at the right times, so we keep getting paid without becoming part of the problem ourselves.
"Science advances one funeral at a time" applies to software too, just at a faster pace, so a good software engineer needs to fake a few funerals, or really be senior at 4 years and dead by 7.
I found it to be a nice post that documents why things sometimes are bad. It didn’t feel accusatory at the developers themselves, but seemed to serve as a reasonable critique of the status quo?
“Alternatively, what’d be really nice is if they upgraded“
Easy enough for y’all with techie salaries, but as one of the millions of poor folks whose paychecks barely (or don’t even) pay the bills, it’d be really nice if we didn't have to junkheap our backbreakingly expensive hardware every few years just cuz y’all are anorexically obsessed with lean code, and find complex dependencies too confusing/bothersome to maintain.
They're talking about people still running ES3 browser engines, like IE8, which was released 15+ years ago and went EOL 10+ years ago. The author could have done a better job clarifying this, but they're not pushing for a world with 2y device lifetimes.
A lot of this basically reads to me like hidden tech debt: people aren't updating their compilation targets to ESx, people aren't updating their packages, package authors aren't updating their implementations, etc.
Ancient browser support is a thing, but ES5 has been supported everywhere for like 13 years now (as per https://caniuse.com/es5).
And weird browser support.
People use the oddest devices to do "on demand" jobs (receiving a tiny amount of money for a small amount of work). Although there aren't that many, I've seen user agents from game consoles, TVs, old Androids, iPod touch, and from Facebook and other "browser makers", with names such as Agency, Herring, Unique, ABB, HIbrowser, Vinebre, Config, etc. Some of the latter look to be Chrome or Safari skins, but there's no way to tell; I don't know what they are. And I must assume that quite a few devices cannot be upgraded. So I support old and weird browsers. The code contains one externally written module (stored in the repository), so it's only a matter of the correct transpiler settings.
Sometimes it's a result of unforeseen consequences of design decisions.
All pre-signal Angular code must be compiled down to JS that replaces native async/await with Promise-based code.
Why is that so? For a long time Angular's change detection worked by overriding native functions like setTimeout, addEventListener etc. to track these calls and react accordingly. `async` is a keyword, so it's not possible to override it like that.
Signals don't require such trickery and also allow to significantly decrease the surface area of change detection, but to take advantage of all of that one has to essentially rewrite the entire application.
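For anyone who hasn't seen it, the patching trick is roughly this (a toy sketch of the zone.js idea, not Angular's actual code):

  // Wrap setTimeout so the framework gets a hook after every
  // async callback, which is when bindings may have changed.
  const realSetTimeout = window.setTimeout;
  window.setTimeout = function (callback, delay, ...args) {
    return realSetTimeout(() => {
      callback(...args);
      runChangeDetection(); // re-check component bindings
    }, delay);
  };

  function runChangeDetection() {
    // diff component state against the DOM, patch what changed...
  }

  // A native `async` function resumes via the microtask queue
  // directly; there is no global function to wrap, which is why
  // the code has to be transpiled down to patchable Promises.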
The desire to keep things compatible with even ES6, let alone ES5 and before, is utterly bizarre to me. Then you see folks who unironically want to maintain compatibility with node 0.4, in 2025, and realize it could be way worse....
Ironically, what often happens is that developers configure Babel to transpile their code to some ancient version, the output is bloated (and slower to execute, since passes like regenerator have a lot of overhead), and then the website doesn't even work on the putatively supported ancient browsers because of the use of recent CSS properties or JS features that can't be polyfilled.
I've even had a case at work where a polyfill caused the program to break. iirc it was a shitty polyfill of the exponentiation operator ** that didn't handle BigInt inputs.
Maybe I didn't look hard enough, but there's no obvious switch to "just turn off all the legacy stuff, thnx".
Also, there has been a huge amount of churn on the tooling side, and if you have a legacy app, you probably don't wanna touch whatever build program was cool that year. I've got a react app which is almost 10 years old, there has to be tons of stuff which is even older.
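For what it's worth, the closest thing to that switch lives in your Babel/browserslist targets; something like this (a sketch, assuming @babel/preset-env) drops the ES5 transforms and regenerator entirely:

  // babel.config.js: target only browsers with native ES module
  // support, so preset-env emits few or no transforms.
  module.exports = {
    presets: [
      ['@babel/preset-env', {
        targets: { esmodules: true },
        bugfixes: true,
      }],
    ],
  };

But as you say, finding that out means wading through the tooling churn first.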
> Maybe I didn't look hard enough, but there's no obvious switch to "just turn off all the legacy stuff, thnx".
There is. Break compatibility for it, and whatever poor bastard that is still maintaining software that is targeting a PalmPilot is free to either pin to an older version of your library, or fork it. Yes, that's a lot of pain for him, but it makes life a little easier for everyone else.
Just how old an Android device in the developing world do you not want to support? Life's great at the forefront of technology, but there's a balancing act between supporting older technology and chasing the bleeding edge.
I like the sentiment, but building a website that can actually function in that setting isn't a matter of mere polyfills. You need to cut out the insane bloat like React, Lottie, etc., and just write a simple website, at which point you don't really need polyfills anyway.
In other words, if you're pulling in e.g. regenerator-runtime, you're already cutting out a substantial part of the users you're describing.
Android phones update to the latest version of Chrome for 7 years. As long as you're using browser features that are Baseline: Widely Available, you'll be using features that were working on the latest browsers in 2023; those features will work on Android 7.0 Nougat phones, released in 2016.
Android Studio has a nifty little tool that tells you what percentage of users are on what versions of Android. 99.2% of users are on Android 7 or later. I predict that next year, a similar percentage of users will be on Android 8 or later.
3.9 billion Android users means that 0.8% is 31 million people, and for a very small number of developers, most of their users will be from that slice. For most developers... yeah, go ahead and assume your audience is running a reasonably up-to-date OS.
Websites built with tons of polyfills are likely not run on these devices anyway: they will run out of RAM first, or take minutes to load because of CPU limitations, on top of not loading at all because their X.509 certs are outdated and the bandwidth these devices support is not suitable for MB-sized pages.
I’ve been very lost trying to understand the ecosystem between es versions , typescript and everything else. It ends up being a weird battle between seemingly unrelated things like require() vs import vs async when all I want to do is compile. All while I’m utterly confused by all the build tools, npm vs whatever other ones are out there, vite vs whatever other ones are out there, ‘oh babel? I’ve heard the name but no idea what it does’ ends up being my position on like 10 build packages.
This isn’t the desire of people to build legacy support, it’s a broken, confusing and haphazard build system built on the corpses of other broken, confusing and haphazard build systems.
Honestly, Vite is all you need. :) It's super flexible compared to the status quo of require vs. import etc. For example, I recently wanted to ship a WASM binary along with the JS rather than making it a separate download (to avoid having to deal with the failure case of the JS code loading and the WASM not fetching). All I had to do was import `a.wasm?url` and it did the base64 embedding and loading automatically.
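From memory it was roughly this (a sketch; the inlining happens when the asset is under Vite's assetsInlineLimit, or you raise that limit):

  // The ?url import becomes a base64 data: URL when inlined,
  // so the wasm ships inside the JS bundle.
  import wasmUrl from './a.wasm?url';

  const bytes = await (await fetch(wasmUrl)).arrayBuffer();
  const { instance } = await WebAssembly.instantiate(bytes);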
This sentiment is all well and good, but when you end up in a new-to-you JS codebase with a list of deps longer than a Costco receipt, using some ancient Webpack with its config split into 5 or so files, then no-one is letting you upgrade to Vite unless the site is completely down.
this is exactly where i landed too. i build docker images that bundle node tooling and every time i think i understand the build system something changes. require vs import, cjs vs esm, babel vs swc vs esbuild, then half your dependencies use one format and half use the other. the worst part is when you containerize it because now you need it all to work in a clean linux environment with no cached state and suddenly half the assumptions break.
The newer version is often even more bloated. This whole article just reinforces my opinion of "WTF is wrong with JS developers" in general: a lot of mostly mindless trendchasing and reinventing wheels by making them square. Meanwhile, I look back at what was possible 2 decades ago with very little JS and see just how far things have degraded.
Don't confuse "one idiot who wants to support Node 0.4 in 2026" with "JS developers". Everybody hates this guy and he puts his hands into the most popular packages, introducing his junk dependencies everywhere.
If everyone hates him and thinks his dependencies are junk, why would anyone let him introduce them to popular packages? Clearly there are at least some people who are indifferent enough to let the dependencies get added.
Then I wish there were more of these "idiots who want to support Node 0.4 in 2026". Maybe they're the ones with the common sense to value stability and backwards compatibility over constantly trendchasing the new and shiny and wanting to break what was previously working in the misguided name of "progress".
NodeJS has a clear support schedule for releases. Once a version of nodejs is EOL, the node team stops backporting security fixes. And you should really stop using it. Here's the calendar:
https://nodejs.org/en/about/previous-releases
Here's a list of known security vulnerabilities affecting old versions of nodejs:
https://nodejs.org/en/about/eol
In my opinion, npm packages should only support maintained versions of nodejs. If you want to run an ancient, unsupported version of nodejs with security vulnerabilities, you're on your own.
You wouldn't if you looked more deeply at this. He doesn't push for simplicity but for horrible complexity, with an enormous stack of polyfills, ignoring language features that would greatly reduce all that bloat.
The other problem is that this is a bit of a circular path: with deps being so crap and so numerous, upgrading existing old projects becomes a pain. There are A LOT of old projects out there that haven't been updated simply because the burden of doing so is so high.
"some people apparently exist who need to support ES3 - think IE6/7, or extremely early versions of Node.js"
Seriously what kind of business today needs to support ES3 browsers? Even banking sites should refuse to run on such old devices out of security concerns.
This is 100% teams who set up their build tooling back in 2015 and haven't updated since. There are plenty of widely used apps and libs that date this far back, and back then IE8 compat was still considered pretty important, especially for products targeting enterprise/government customers.
Upgrading eg Webpack and Babel and polyfill stacks and all that across multiple major versions is a serious mess. Lots of breaking changes all around. Much better to just ship features. If it ain't broke, don't fix it!
I think on the first point, we have to start calling out authors of packages which (IMO) have built out these deptrees to their own subpackages basically entirely for the purpose of getting high download counts on their github account
Like seriously... at 50 million downloads maybe you should vendor some shit in.
https://www.npmjs.com/package/has-symbols
https://www.npmjs.com/package/is-string
Packages like this which have _7 lines of code_ should not exist! The metadata of the lockfile is bigger than the minified version of this code!
At one point in the past, like 5% of create-react-app's dep list was all from one author (https://github.com/ljharb) who had built out their own little depgraph in a library they controlled. That person also included download counts on their Github page. They have since "fixed" the main entrypoint to the rats nest though, thankfully.
> entirely for the purpose of getting high download counts on their github account
Is this an ego thing or are people actually reaping benefits from this?
Anthropic recently offered free Claude to open source maintainers of repositories with over X stars or over Y downloads on npm. I suppose it is entirely possible that these download statistics translate into financial gain...
I'm completely apathetic about spicy autocomplete for coding tasks and even I wonder which terrible code would be worse.
The guy who wrote is-even/is-odd for ages used a specifically obscure method that made it slower than % 2 === 0, because JS engines were optimising the latter but not his arcane bullshit.
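For reference, the boring version that engines do optimise:

  // (minus the package's input validation, which you can add
  // yourself if you actually need it)
  const isOdd = (n) => n % 2 !== 0;
  const isEven = (n) => n % 2 === 0;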
The article and (overall) this comments section has thankfully focused on the problem domain, rather than individuals.
As the article points out, there are competing philosophies. James does a great job of outlining his vision.
Education on this domain is positive. Encouraging naming of dissenters, or assigning intent, is not. Folks in e18e who want to advance a particular set of goals are already acting constructively to progress towards those goals.
from a security perspective this is even worse than it looks. every one of those micro packages is an attack surface. we just saw the trivy supply chain get compromised today and thats a security tool. now imagine how easy it is to slip something into a 7 line package that nobody audits because "its just a utility." the download count incentive makes it actively dangerous because it encourages more packages not fewer.
I remember seeing this one guy who infiltrated some gh org, and then started adding his own packages to their dependencies or something to pad up his resume/star count. Really escapes me who it was.
As usual, there's a cultural issue here. I know it's entirely possible to paste those seven lines of code into your app. And in many development cultures this will be considered a good thing.
If you're working with Javascript people, this is referred to as "reinventing the wheel" or "rolling your own", or any variation of "this is against best practice".
I think the fact that everyone cites the same is-number package when saying this is indicative of something though.
Like I legit think that we are all imagining this cultural problem that's widespread. My claim (and I tried to do some graph theory stuff on this in the past and gave up) is that in fact we are seeing something downstream of a few "bad actors" who are going way too deep on this.
I also dislike things like webpack making every plugin an external dep but at least I vaguely understand that.
The point isn't that everyone necessarily needs to write the same code manually. It's that an author could easily combine the entire tree of seven-line packages into the one package that create-react-app uses directly. There's no reason to have a dozen or so package downloads, each with seven lines of code, instead of one that's still under a hundred lines; that's still a pretty small network request, and it's not like dead code analysis to prune unused functions isn't a thing. If you somehow find yourself in a scenario where you would be happy to download seven lines of code but downloading a few dozen more would be an issue, that's when you might want to consider pasting the seven lines in manually, but I honestly can't imagine when that would be.
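i.e. one vendored module instead of a dozen installs; a sketch (simplified versions, the real packages also handle things like numeric strings):

  // utils.js: the whole "atomic" subtree in one place.
  export const isNumber = (v) => typeof v === 'number' && Number.isFinite(v);
  export const isString = (v) => typeof v === 'string';
  export const isOdd = (n) => n % 2 !== 0;
  export const isEven = (n) => n % 2 === 0;
  export const padLeft = (s, n, ch = ' ') => String(s).padStart(n, ch);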
As usual, he's copying someone else who's been doing this for years:
https://www.npmjs.com/package/is-number - and then look and see shit like is odd, is even (yes two separate packages because who can possibly remember how to get/compare the negated value of a boolean??)
Honestly, for how much attention JavaScript has gotten in the last 15 years, it's ridiculous how shit its type system really is.
The only type-related "improvement" was adding the class keyword, because apparently the same people who don't understand "% 2" also don't understand prototypal inheritance.
That's a good point, it's only been around for 30 years, and used on 95% of websites. It's not really popular enough for a developer to take an hour or two to read how it works.
The word "used" is doing some heavy lifting there. Not all usage is equal, and the fact that it's involved under the hood isn't enough to imply anything significant. Subatomic physics is used by 100% of websites and has been around for billions of years, but that's not a reason to expect every web developer to have a working knowledge of electron fields.
Let's compromise and say that whoever is responsible for involving (javascript|electron fields) in the display of a website, should each understand their respective field.
I don't expect a physicist or even an electrical engineer or cpu designer to necessarily understand JavaScript. I don't expect a JavaScript developer to understand electron fields.
I do expect a developer who is writing JavaScript to understand JavaScript. Similarly I would expect the physicist/etc to understand how electrons work.
This is how it should be for internal stuff! Corporate IT wants everyone to update anyway so there really isn’t a downside.
One thing I kinda understand is users who want to use a more performant browser (safari really does sip memory I’ve found compared to chrome) but that’s kind of a side point. But if your company decides this is the browser(s) we support, then it makes sense and is the right way to go about it.
It's interesting how we've reached a point where 'vanilla' is seen as an obscure domain of knowledge. The 'gravity' of frameworks like React is so strong that for many new developers, the framework IS the web. Breaking out of that mindset often reveals that the browser has actually evolved quite a bit and can handle a lot of what we used to reach for libraries for, especially with Web Components and CSS Grid/Flexbox being so mature now.
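A lot of people have genuinely never seen how little it takes. A counter as a custom element, no build step, no framework (a sketch):

  customElements.define('click-counter', class extends HTMLElement {
    #count = 0;
    connectedCallback() {
      const root = this.attachShadow({ mode: 'open' });
      const button = document.createElement('button');
      button.textContent = 'clicks: 0';
      button.addEventListener('click', () => {
        button.textContent = `clicks: ${++this.#count}`;
      });
      root.append(button);
    }
  });
  // usage in HTML: <click-counter></click-counter>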
The most frustrating thing about the "atomic architecture" pillar of tiny packages is how obviously stupid it is. Any borderline sane person should look at isOdd/isEven and see that it's an awful idea.
Instead they've elevated it to a cultural pillar and think they've come up with a great innovation. It's like talking to antivaxxers.
The philosophy was kinda refreshing in the early days. There was a really low barrier to publishing and people were encouraged to build and share tools rather than hoard things. It was probably somewhat responsible for the success of npm and the node ecosystem, especially given the paltry standard lib.
Of course, like most things, when taken to an extreme it becomes absurd and you end up with isOdd.
I think the issue is that the JavaScript ecosystem is so large that even the strangest extremes manage to survive. Even if they resonate with just 0.1% of developers, that’s still a lot of developers.
The added problem with the atomic approach is that it makes it very easy for these fringes to spread throughout the ecosystem. Mostly through carelessness, and transitive dependencies.
It's because it has a smart-sounding name. Some people are shallow and performative; some nice-looking blog post says they can have "atomic architecture", then the trend starts and everybody wants to show how enlightened they are.
A third argument is that it was because of aliens from the planet Blotrox Prime. But I suppose without evidence we'll just have to accept that all three theories are equally probable.
Interesting how you decided to switch to hyperbole instead of providing evidence for your claim. Backing up your viewpoint would have easily shut me down, putting the ball in my court to do the same. Instead you gave a knee-jerk childish response.
I've seen some juniors writing risoni code like that. They've heard that you shouldn't write big functions, so obviously they forcefully split things until they can't be split anymore.
The cross-realm argument for packages like is-string is the one I find hardest to dismiss, but even there the math doesn't add up. The number of projects actually passing values across realms is tiny, and those projects should be the ones pulling in cross-realm-safe utilities, not every downstream consumer of every package that ever considered it.
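(For anyone wondering what the cross-realm case even is: `x instanceof String` fails for a boxed String created in another realm, such as an iframe, because each realm has its own String constructor. The realm-safe check is a one-liner anyway; a sketch:)

  // typeof covers primitives; the toString tag covers boxed
  // String objects from any realm (spoofable via
  // Symbol.toStringTag, like most such checks).
  const isString = (v) =>
    typeof v === 'string' ||
    Object.prototype.toString.call(v) === '[object String]';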
The deeper problem with Pillar 2 is that atomic packages made sense as a philosophical argument but broke down the moment npm made it trivially easy to publish. The incentive was "publish everything, let consumers pick what they need", but the reality is consumers never audit their trees; they just install and forget. So the cost that was supposed to be opt-in became opt-out by default.
The ponyfill problem feels most tractable to me. A simple automated check ("does every LTS version of Node support this natively?") could catch most of these. The e18e CLI (https://github.com/e18e/cli) is a good start, but it still requires someone to run it intentionally. I wonder if something like a Renovate-style bot that opens PRs to remove outdated ponyfills would move the needle faster than waiting for maintainers to notice.
For the old version support. Why not do some compile time #ifdef SUPPORT_ES3? That way library writers can support it and if the user doesn't need it they can disable it at compile time and all the legacy code will be removed
Two problems:
- people would need to know how to effectively include dependencies in a way that allows them to be tree-shaken; that's a fragile setup
- polyfills often have quirks and extra behaviours (e.g. the extra functions on early promise libraries come to mind) that people start relying on, making the switch to built-ins not so easy
Also, how is this going to look over time with multiple ES versions?
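The closest JS equivalent today is a compile-time constant plus dead-code elimination, e.g. with esbuild's --define (a sketch; SUPPORT_ES3 is a made-up flag name):

  // Library code keeps the legacy path behind a constant:
  if (SUPPORT_ES3) {
    // polyfill-heavy path for ancient engines
  } else {
    // modern path only
  }

  // A consumer then builds with:
  //   esbuild app.js --bundle --define:SUPPORT_ES3=false
  // which turns the test into `if (false)`, so the minifier
  // drops the whole branch.

Which runs straight into the first problem above: every consumer's bundler has to be configured correctly for the legacy branch to actually disappear.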
Yeah, I'm in the same boat here: I really don't like the dependency sprawl of Rust. I understand there are tradeoffs, but I really wanna make sure we don't end up like npm.
the docker side of this is painful too. every extra dependency in any language means a bigger image, more layers to cache, more things that can break during a multi-arch build. ive been building images that are 4GB because of all the node and python tooling bundled in. micro packages make it worse because each one adds metadata overhead on top of the actual code.
Indeed Rust has a runtime. I'm not sure where the whole "Rust has no runtime" claim comes from; I keep seeing it repeated from time to time, but I can't find its origin, and I don't think it's ever been true.
I'm assuming you're referring to an async runtime like Tokio. In my opinion the dependency problem exists with or without Tokio. Tokio is probably one of the best dependencies.
What do you use to build programs, then? Or maybe you're not a software developer, in which case I can understand not fully knowing how a program gets built. But otherwise, languages will be needed for as long as we need programs.
I wonder if this means there could be a faster npm install tool that pulls from a registry of small utility packages that can be replaced with modern JS features, and skips installing them.
Not sure about faster, but you could do something with overrides, especially pnpm overrides since they can be configured with plugins. Build a list of packages that can be replaced with modern stubs.
It couldn't inline them, but it could replace ponyfills with wrappers around native implementations and drop the fallback. It could provide simple modern implementations of is-string, and dedupe multiple major versions, though that begs the question of what breaking change led to a new major version, and why.
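Concretely, I'd imagine something like this in package.json (a sketch; the @modern-stubs scope is made up):

  {
    "pnpm": {
      "overrides": {
        "is-string": "npm:@modern-stubs/is-string@^1.0.0"
      }
    }
  }

where the entire stub package is `module.exports = (v) => typeof v === 'string' || v instanceof String;`. I believe the e18e folks maintain lists of replacement candidates that could seed the override table.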
It would be interesting to extend this project where opt-in folks submit a "telemetry of diffs," to track how certain dependencies needed to be extended, adapted, or patched; those special cases would be incorporated as future features and new regression tests.
Someday, packages may just be "utility-shaped holes" which are filled in and published on the fly. Package adoption could come from 80/20 agents [1] exploring these edges (security notwithstanding).
However, as long as new packages inherit dependencies according to a human author's whims, that "voting" cycle has not yet been replaced.
There is a clear and widespread cultural problem with javascript. Sites should think seriously hard about server side rendering, both for user privacy (can't port the site to i2p if you drop 5MB every time they load a page) and freedom. Even this antibloat site smacks you with ~100KB and links to one that smacks you with ~200KB. At this rate if you follow 20 links you'll hit a site with 104 GB of JS.
An underappreciated source of bloat is module duplication stemming from code splitting. SPAs have a bad rep because you don't expect to download an entire app just to load one page on the web. You can solve this by code splitting. But if you just naively split your app by route, you'll end up with duplicate copies of every shared module.
Bundlers handle this by automatically creating bundles for shared modules. But if you optimize to avoid all shared modules, you end up with hundreds of tiny files. So most bundlers enforce a minimum size limit. That's probably fine for a small app. But one or more of these things happens:
1. Over time everybody at the company tends to join one giant SPA because it's the easiest way to add a new page.
2. Code splitting works so well you decide to go ham and code split all of the things - modals, below-the-fold content, tracking scripts, etc.
Now you'll run into situations where 20 different unrelated bundles happen to share a single module, but that module is too small for the bundler to split out, and so you end up downloading it N times.
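This is part of why bundlers expose manual chunking. With Rollup/Vite you can force shared modules into a common chunk regardless of the size heuristic (a sketch):

  // vite.config.js: everything from node_modules goes into one
  // shared 'vendor' chunk, so N route bundles don't each inline
  // their own copy of the same small module.
  export default {
    build: {
      rollupOptions: {
        output: {
          manualChunks(id) {
            if (id.includes('node_modules')) return 'vendor';
          },
        },
      },
    },
  };

That's a blunt instrument, of course; the hard part is picking chunk boundaries that match how pages actually share code.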
The primary cause of JS bloat is assuming you need JS or that customers want whatever you're using it to provide.
For $client we've taken a very minimal approach to JavaScript, particularly on customer facing pages. An upcoming feature finally replaces the last jquery (+ plugin) dependent component on the sales page, with a custom implementation.
That change shaved off ~100K (jquery plus a plugin removed) and for most projects now that probably seems like nothing.
The sales page after the change is now just 160K of JS.
The combination of not relying on JS for everything and preferring use-case-specific implementations where we do, means we aren't loading 5 libraries and using 1% of each.
I'm aware that telling most js community "developers" to "write your own code" is tantamount to telling fish to "just breathe air".
160K total is impressive. most landing pages i see are shipping 2-3MB of js before the first paint. the "write your own code" approach gets laughed at but when you actually do it the result is faster, easier to debug, and you dont wake up one morning to find out one of your 200 dependencies got compromised.
Look at Python - similar story. Once a reasonably usable global package registry exists, this is exactly what happens. Languages and standard libraries evolve, shipped code more often than not doesn't.
Yes, of course the tiny packages cause some of the bloat. As mainly a Java developer being pretty paranoid about my dependency tree (I'm responsible for every byte of code I ship to my users, whether I wrote it or not), I'm always blown away by JS dependency trees. Why would you reach for a library for this three-line function? Just write it yourself, ffs.
But the real cause of JS bloat is the so-called "front-end frameworks". Especially React.
First of all, why would you want to abstract away the only platform your app runs on? What for? That just changes the shape of your code but it ends up doing the same thing as if you were calling browser APIs directly, just less efficiently.
Second of all, what's this deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object with the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?
Seriously, I don't understand modern web development. Neither does this guy who spent an hour and some to try to figure out React from the first principles using much the same approach I myself apply to new technologies: https://www.youtube.com/watch?v=XAGCULPO_DE
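To make the "just modify your real DOM straight from your networking code" point concrete, a sketch (the /api/price endpoint is made up):

  // No store, no virtual DOM: the response handler updates the
  // one element that actually changed.
  async function refreshPrice(symbol) {
    const res = await fetch(`/api/price/${symbol}`);
    const { price } = await res.json();
    const cell = document.querySelector(`[data-symbol="${symbol}"] .price`);
    cell.textContent = price.toFixed(2);
  }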
That's like asking "why would you use Swing when you can use Graphics2D". Sometimes you want something higher level. The DOM is great and very powerful, but when you're building a highly interactive web app you don't want to be manually mutating the DOM every time state changes.
I am a core maintainer of Astro, which is largely based around the idea that you don't need to always reach for something like React and can mostly use the web platform. However even I will use something like React (or Solid or Svelte or Vue etc) if I need interactivity that goes beyond attaching some event listeners. I don't agree with all of its design decisions, but I can still see its value.
> Second of all, what's this deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object with the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?
Yea, honestly you probably just don't understand. FE frameworks solve a specific problem and they don't make sense unless you understand that problem. That TSoding video is a prime example of that - it chooses a trivial instance of that problem and then acts like the whole problem space is trivial.
To be fair, React is an especially wasteful way to solve that problem. If you want to look at the state of the art, something like Solid makes a lot more sense.
It's much easier to appreciate that problem if you actually try to build complex interactive UI with vanilla JS (or something like jQuery). Once you have complex state dependency graph and DOM state to preserve between rerenders, it becomes pretty clear.
I can't help but think that whenever we have these discussions about dependency hell in the JS ecosystem that the language moves too slowly to add things to stdlib. For me, this is where bun fills the gap and they continue to pump out core stdlib packages that replace widely used dependencies. I'd really like to see Node at least do this more.
Fallback support is a legitimate reason for additional code being in the bundle, but it's not 'bloat' because it's necessary. In an ideal world every website would generate ES5, ES6, and ES2025 bundles and serve the smallest one that's necessary for the app to run based on the browser capabilities, but that is genuinely quite hard to get right and the cost of getting it wrong is a broken app so it's understandable why devs don't.
The other two, atomic architecture and ponyfills, are simply developer inexperience (or laziness). If you're not looking at the source of a package and considering whether you actually need it, you're not doing your job well enough. And if you've added code in the past that your metrics about visitors' browsers now show isn't needed any more, then you're not actively maintaining and removing things when you can. That's not putting the user first, so you suck.
Bloat is mostly added by package authors, not website authors. And they can't know who's running it and can't look at the metrics. I doubt many website authors directly use isEven or polyfills.
Depending on the use case, minimizing dependencies can also decrease attack vectors on the page/app.
What do you use for model updates?
https://en.wikipedia.org/wiki/Ajax_(programming)
The idea of reactivity started in the 1990s, in production.
When Gmail was released, this technology is what made a website behave like a desktop app (plus the huge amount of storage).
If we were to look into today's equivalent of doing this, it might be surprising what exists in the standard libraries.
From human society's PoV, you sound like a 10X engineer and a wonderful person.
But from the C-suite's PoV... yeah. You might want to keep quiet about this.
The main cause of bloat is not polyfills or atomic packages. The cause of bloat is bloat!
I love this quote by Antoine de Saint-Exupéry (author of the Little Prince):
"Perfection is achieved, not when there is nothing left to add, but nothing to take away."
Most software is not written like that. It's not asking "how can we make this more elegant?" It's asking "what's the easiest way to add more stuff?"
The answer is `npm i more-stuff`.
> Every sentence must do one of two things—reveal character or advance the action.
Or Quintilian's praise of Demosthenes and Cicero: "To Demosthenes nothing can be added, but from Cicero nothing can be taken away."
You mean the character of a place?
Here's the schedule, if anyone hasn't seen it. Node 18 is EOL. Node 20 goes EOL in a bit over a month.
https://nodejs.org/en/about/previous-releases
So that's my cutoff.
It would take a well-respected org pushing a standard library that has clear benefits over "package shopping."
The incentives are pretty clear: more packages, more money.
The guy who wrote is-even/is-odd spent ages using a deliberately obscure method that was slower than % 2 === 0, because JS engines were optimising the obvious form but not his arcane bullshit.
As the article points out, there are competing philosophies. James does a great job of outlining his vision.
Education on this domain is positive. Encouraging naming of dissenters, or assigning intent, is not. Folks in e18e who want to advance a particular set of goals are already acting constructively to progress towards those goals.
Really escapes me who it was.
If you're working with JavaScript people, this is referred to as "reinventing the wheel" or "rolling your own", or some variation of "this is against best practice".
Like, I legit think we're all imagining that this cultural problem is widespread. My claim (and I tried to do some graph-theory analysis on this in the past and gave up) is that what we're seeing is actually downstream of a few "bad actors" who go way too deep on this.
I also dislike things like webpack making every plugin an external dep but at least I vaguely understand that.
The problem is not imagined.
Just as the cloud is simply someone else's computer, a package is just someone else's reinvented wheel.
The problem is half the wheels on npm are fucking square and apparently no one in the cult of JavaScript realises it.
https://www.npmjs.com/package/is-number - and then look and see shit like is-odd and is-even (yes, two separate packages, because who could possibly remember how to negate a boolean??)
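For reference, here's roughly the entire surface area being outsourced (a sketch; the real packages add edge-case handling for things like numeric strings):

    const isNumber = (x) => typeof x === 'number' && Number.isFinite(x);
    const isOdd = (x) => Math.abs(x % 2) === 1;
    const isEven = (x) => !isOdd(x);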
Honestly, for how much attention JavaScript has gotten in the last 15 years, it's ridiculous how shit its type system really is.
The only type-related "improvement" was adding the class keyword, because apparently the same people who don't understand % 2 also don't understand prototypal inheritance.
Let's compromise and say that whoever is responsible for involving (javascript|electron fields) in the display of a website should understand their respective field.
I don't expect a physicist or even an electrical engineer or cpu designer to necessarily understand JavaScript. I don't expect a JavaScript developer to understand electron fields.
I do expect a developer who is writing JavaScript to understand JavaScript. Similarly I would expect the physicist/etc to understand how electrons work.
One thing I kinda understand is users who want a more performant browser (Safari really does sip memory compared to Chrome, I've found), but that's kind of a side point. If your company decides "these are the browsers we support", then that makes sense and is the right way to go about it.
Instead they've elevated it to a cultural pillar and think they've come up with a great innovation. It's like talking to anti-vaxxers.
Of course, like most things, when taken to an extreme it becomes absurd and you end up with isOdd.
The added problem with the atomic approach is that it makes it very easy for these fringes to spread throughout the ecosystem. Mostly through carelessness, and transitive dependencies.
https://github.com/e18e/cli
That's awesome. It could be hooked in as a pre-commit step for agents, to do the grunt work of migration.
For personal projects I always prompt the AI to write plain JS directly, and never introduce the Node.js stack unless I absolutely have to.
Turns out you don't always need Node.js/React to make a functional SPA.
I’d take vibe coded vanilla js slop over npm dependency hell every day of the week.
Also, how is this going to look over time with multiple ES versions?
And we're seeing Rust happily going down the same path, especially with the micro-packages.
Unless you're talking about an "environment", e.g. Node or the like.
It couldn't inline them, but it could replace ponyfills with wrappers for native impls and drop the fallback. It could provide simple modern implementations of is-string, and dedupe multiple major versions, though that begs the question of what breaking change led to a new major version, and why.
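As a sketch of that transform (the module here is hypothetical, but this is the shape most ponyfills have):

    // before: ponyfill exports either the native impl or a fallback
    module.exports = typeof Object.assign === 'function'
      ? Object.assign
      : function assign(target) {
          for (var i = 1; i < arguments.length; i++) {
            for (var key in arguments[i]) target[key] = arguments[i][key];
          }
          return target;
        };

    // after: every maintained runtime has the native impl, so a tool could
    // rewrite call sites to plain Object.assign and drop the package entirely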
Someday, packages may just be "utility-shaped holes" that get filled in and published on the fly. Package adoption could come from 80/20 agents [1] exploring these edges (security notwithstanding).
However, as long as new packages inherit dependencies according to a human author's whims, that "voting" cycle has not yet been replaced.
[1] https://news.ycombinator.com/item?id=47472694
Bundlers handle this by automatically creating bundles for shared modules. But if you optimize to avoid all shared modules, you end up with hundreds of tiny files. So most bundlers enforce a minimum size limit. That's probably fine for a small app. But one or more of these things happens:
1. Over time everybody at the company tends to join one giant SPA, because it's the easiest way to add a new page.
2. Code splitting works so well that you decide to go ham and code-split all of the things: modals, below-the-fold content, tracking scripts, etc.
Now you'll run into situations where 20 different unrelated bundles happen to share a single module, but that module is too small for the bundler to split out, and so you end up downloading it N times.
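For the curious, this is the knob in question in webpack, for example (minSize defaults to roughly 20 KB in webpack 5); a shared helper smaller than that stays inlined into every chunk that uses it:

    // webpack.config.js (sketch)
    module.exports = {
      optimization: {
        splitChunks: {
          chunks: 'all',
          minSize: 20000, // bytes; shared modules below this aren't split out,
                          // so 20 bundles sharing a tiny module ship it 20 times
        },
      },
    };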
For $client we've taken a very minimal approach to JavaScript, particularly on customer facing pages. An upcoming feature finally replaces the last jquery (+ plugin) dependent component on the sales page, with a custom implementation.
That change shaved off ~100K (jQuery plus a plugin removed), which for most projects these days probably seems like nothing.
After the change, the sales page is just 160K of JS.
The combination of not relying on JS for everything, and preferring use-case-specific implementations where we do, means we aren't loading 5 libraries and using 1% of each.
I'm aware that telling most of the JS community's "developers" to "write your own code" is tantamount to telling fish to "just breathe air".
Updating dependencies is a task a person does, followed by committing the changes to the repo.
I am aware a lot of these ideas are heretical to a lot of software developers these days.
But the real cause of JS bloat is the so-called "front-end frameworks". Especially React.
First of all, why would you want to abstract away the only platform your app runs on? What for? That just changes the shape of your code but it ends up doing the same thing as if you were calling browser APIs directly, just less efficiently.
Second of all, what's the deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object against the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?
Seriously, I don't understand modern web development. Neither does this guy, who spent an hour and change trying to figure out React from first principles, using much the same approach I apply to new technologies: https://www.youtube.com/watch?v=XAGCULPO_DE
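To be concrete, this is the kind of thing meant by modifying the real DOM straight from your networking code (the endpoint and element id are made up):

    async function refreshInbox() {
      const res = await fetch('/api/messages');      // hypothetical endpoint
      const messages = await res.json();
      const list = document.getElementById('inbox'); // hypothetical element
      list.replaceChildren(...messages.map((m) => {
        const li = document.createElement('li');
        li.textContent = m.subject;
        return li;
      }));
    }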
You can also use your underpants as a hat. It doesn't mean it's a good idea.
I am a core maintainer of Astro, which is largely based around the idea that you don't need to always reach for something like React and can mostly use the web platform. However even I will use something like React (or Solid or Svelte or Vue etc) if I need interactivity that goes beyond attaching some event listeners. I don't agree with all of its design decisions, but I can still see its value.
https://youtu.be/Q9MtlmmN4Q0?t=519&is=Wt3IzexiOX4vMPZf
Also, why do you use SQL and databases? Couldn’t you just modify files on the filesystem?
To be fair, React is an especially wasteful way to solve that problem. If you want to look at the state of the art, something like Solid makes a lot more sense.
It's much easier to appreciate that problem if you actually try to build a complex interactive UI with vanilla JS (or something like jQuery). Once you have a complex state dependency graph and DOM state to preserve between rerenders, it becomes pretty clear.
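A sketch of the failure mode that pushes people toward diffing in the first place: the naive vanilla move is to re-render everything, which throws away live DOM state.

    function render(items) {
      const app = document.getElementById('app'); // hypothetical container
      app.innerHTML = items
        .map((item) => `<li><input value="${item}"></li>`)
        .join('');
      // focus, caret position, scroll offsets, in-flight transitions: all gone,
      // because every node was just thrown away and rebuilt
    }

Avoiding that means hand-writing a targeted update for every state transition, which at scale turns into an ad-hoc version of exactly what the frameworks automate.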
There will be almost no bloat to worry about.
The other two, atomic architecture and ponyfills, are simply developer inexperience (or laziness). If you're not looking at the source of a package and considering whether you actually need it, you're not doing your job well enough. And if you added code in the past that your browser-usage metrics now show isn't needed, and you aren't actively removing it, you're not really maintaining the project. That's not putting the user first, so you suck.