Saving the planet (and your UX) with green development
The internet is littered with slow websites: unoptimised images, unminified JavaScript, and frameworks with thousands of weighty dependencies. Each one carries ludicrous bandwidth requirements (spare a thought for those of us on limited data plans!).
Technology has moved on: our phones and computers are more powerful than ever, and our connections are faster. So much so that much of the above often isn't noticeable.
Developers built the internet as we know it, and I'm no exception: I've been guilty of every sin listed above. The problem is that many of us don't test with devices of varying specs. Our machines tend to be powerful and our connections fast... So when we test our web apps or sites, the deadly sins of the bloated internet go unnoticed.
On my machine (an M1 MacBook Pro with 16GB of RAM and a 300Mbit connection), this site works beautifully! But can the same be said for Sally, with her iPhone 8, a patchy 4G connection and a limited data plan? These problems are often missed and reach production, with users like Sally lamenting the UX.
"But what does any of this have to do with the climate?!" I hear you cry. Well, that's the next thing:
The AI era is here, and compute is getting ever more expensive. Whether it's the dwindling supply of RAM and storage or limited bandwidth, we're more reliant than ever on increasingly scarce resources that AI is consuming. Worse yet? Too many of these data centres are polluting our air, guzzling our water and running on fossil fuels. A recent study even found temperature increases of up to 9°C within a 9 mile radius of AI data centres. More devastating still, these data centres often pollute worst in areas where marginalised people live.
To be clear: I don't want to greenwash this issue. A significant amount of the environmental devastation increasingly comes from AI data centres rather than from individual developers. But whether or not we care about taking direct action on the issue of AI, it'll catch up to us eventually.
Whether you're thinking of compute expenses in a cloud server market riddled with scarcity, or worried about the ecological impact of data centres on our climate and vulnerable neighbours: developers must step up to the challenge, for our users, our communities, the environment and (if your priorities don't lie in the former) our operational costs! If we don't, our climate will suffer, our users' experiences will degrade and our compute costs will haemorrhage. We must learn to do more with less.
So what do we do about this? I'm glad you asked... In no particular order (we'll get to the AI elephant in the room later...):
Cleaner, faster code
The more efficient your code is, the less compute it uses, which means less energy and lower emissions. Efficient code should still be maintainable and readable, though. The art of coding is finding the balance between something well optimised and something still understandable to your future self and your colleagues! Striking this balance is vital: recompiling and slamming your CI/CD pipeline with code you're struggling to work with will only waste more time and resources.
Some things to think about:
Have you implemented caching?
Caching data that doesn't need a rapid TTL can dramatically reduce load on databases. Databases live on disk, while caches typically live in RAM (much faster!). The benefit is twofold: fewer queries on the database mean less load on your servers (and potentially lower operational costs!), plus a much snappier UX.
Better still if you're using external APIs that charge per request. Say you have a Node backend that integrates with Firebase, and an API endpoint that racks up hundreds of thousands of requests a day... Caching that endpoint will significantly reduce operational costs, because the Redis cache (or whatever you use) will absorb a good chunk of those requests between refreshes.
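The idea can be sketched in a few lines of Node. This is a minimal in-memory version (names like `withCache` and `fetchUser` are mine, purely for illustration); in production you'd likely reach for Redis or similar, but the principle is identical:

```javascript
// A minimal in-memory TTL cache. Illustrative only: in production you'd
// likely use Redis, but the principle is the same either way.
function withCache(fn, ttlMs) {
  const cache = new Map(); // key -> { value, expires }
  return function (key) {
    const hit = cache.get(key);
    if (hit && hit.expires > Date.now()) {
      return hit.value; // served from RAM: no database round trip
    }
    const value = fn(key); // cache miss: do the expensive work once
    cache.set(key, { value, expires: Date.now() + ttlMs });
    return value;
  };
}

// Hypothetical expensive lookup, standing in for a database or Firebase call.
let queries = 0;
function fetchUser(id) {
  queries++;
  return { id, status: "active" };
}

const cachedFetchUser = withCache(fetchUser, 60000); // 60 second TTL
cachedFetchUser("sally"); // miss: hits the "database"
cachedFetchUser("sally"); // hit: served from the cache
console.log(queries); // 1
```

A real backend would make `fetchUser` asynchronous and think about cache invalidation, but even this much turns thousands of identical requests into one.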
How about efficient SQL queries?
For data that does require a rapid TTL, or for infrequently requested data, efficient queries are essential. For example, a query like this:
SELECT * FROM users WHERE status = 'active';
is wasteful on two counts. First, it fetches every field: if you only need the uid, email and avatar, selecting everything wastes bandwidth. Better to ask for exactly what you need:

SELECT uid, email, avatar FROM users WHERE status = 'active';

Second, there's indexing. If you're filtering on the status field, add an index for it:
CREATE INDEX index_users_status ON users(status);
Without the index, MySQL has to scan every single row to evaluate the WHERE status = 'active' clause. With an index, it can jump straight to the matching rows, because the index is built specifically on the users.status column. Less load on the server, faster responses and less energy used!
However, it's always worth measuring before and after adding an index; many factors play into whether you'll actually benefit.
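One easy way to check, assuming you're on MySQL: prefix the query with EXPLAIN and look at the plan it prints.

```sql
-- If the `key` column shows index_users_status, the index is in use;
-- if it's NULL and `type` is ALL, you're still doing a full table scan.
EXPLAIN SELECT uid, email, avatar FROM users WHERE status = 'active';
```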
Optimising content
Are you serving up 20MB .png images in your articles? Loading every single image, regardless of the user's position on the page? You can dramatically cut down on bandwidth with a little optimisation. Lower bandwidth requirements are better for your bottom line and your UX! Better yet? Less bandwidth sent and received means lower carbon emissions. Better for your budget, faster for your users and healthier for our planet!
The other day, I loaded up a news article from the BBC and was shocked to discover it weighed 4MB... just to read an article with a couple of pictures! Here's how we can avoid that horror show:
More efficient file formats
Developing this portfolio taught me the wonders of the AVIF file format. It reduced one of my PNG files from 100KB down to 8KB. That's a 92% reduction! If you have a page with 5 or 6 images like that, you've cut your bandwidth by around half a megabyte. For a user on a train journey with bad 4G signal, this is a godsend. Better yet? Less storage is used. Everyone wins!
You can convert all your images to AVIF in 11ty, like I did, using the eleventy-img plugin. Once configured, it takes any images you put into your 11ty site and converts them into any format you'd like. Before these optimisations, this site sat at 400KB. Now? It's 100KB.
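For reference, a typical eleventy-img setup looks something like this. It's a sketch, not this site's exact configuration: the shortcode name, widths and paths here are assumptions.

```javascript
// .eleventy.js — sketch of an eleventy-img image shortcode.
const Image = require("@11ty/eleventy-img");

module.exports = function (eleventyConfig) {
  eleventyConfig.addShortcode("image", async function (src, alt) {
    const metadata = await Image(src, {
      formats: ["avif", "webp", "jpeg"], // AVIF first, with fallbacks
      widths: [320, 640, 1280],          // responsive sizes
      outputDir: "./_site/img/",
    });
    // generateHTML emits a <picture> element with a <source> per format,
    // so browsers without AVIF support fall back gracefully.
    return Image.generateHTML(metadata, {
      alt,
      sizes: "100vw",
      loading: "lazy",
      decoding: "async",
    });
  });
};
```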
The magic of lazy loading
Another thing to consider: is there any reason to load images that some users may never scroll down to see? It's a tricky question. If you have a lot of users on unreliable connections, they might lose signal, scroll down... only to find a bunch of missing images. But if you have content with lots of images, lazy loading can dramatically reduce bandwidth usage, and it makes initial load times faster.
Better yet? If you're developing an 11ty site like this one, support is baked into the eleventy-img plugin! Amazing stuff. Play around with how it feels; try it with throttling enabled in dev tools.
For those developing without 11ty, more information on lazy loading can be found here on MDN.
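In its simplest form, native lazy loading is just the loading attribute (the path and dimensions below are placeholders):

```html
<!-- The browser defers fetching this image until the user scrolls near it. -->
<img src="/img/photo.avif" alt="A description of the photo"
     width="640" height="480" loading="lazy">
```

Setting explicit width and height alongside it also stops the page jumping around as images arrive.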
A word on minifying and compression...
Minification is a great way of cutting bandwidth usage, particularly if you target JavaScript. Minifying HTML and CSS, however? Not so much. For more, read Shiv's fantastic blog post on the topic. The gains (over just using gzip or Brotli compression) are so small that the obfuscation of your HTML and CSS outweighs the benefit. The internet is built on budding developers learning from others' code; let's not make that any harder than it needs to be! Especially in this climate.
Compression, on the other hand, is where it's at. There is computational overhead involved, obviously, but the bandwidth savings can be well worth it. Nginx has gzip support built in, and it's easy to enable. An easy win for users, developers and the planet!
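Enabling it is a handful of directives in your nginx config (the types and levels below are sensible starting points, not gospel):

```nginx
# Inside the http block of nginx.conf:
gzip on;
gzip_comp_level 5;    # balance CPU cost against bandwidth savings
gzip_min_length 256;  # skip tiny responses where compression isn't worth it
gzip_types text/css application/javascript application/json image/svg+xml;
# text/html is always compressed once gzip is on, so it needn't be listed.
```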
Technology choices
Ask yourself... Does your portfolio really need to be a web app? Could you build your website with an SSG in mind? Consider the libraries and frameworks you use: the default bundle for a React Vite template is 1MB out of the box. How much more would it be with other libraries on top?
Obviously, "dynamic" sites aren't inherently bad for any of these goals. But the more static something is, chances are, the more efficient and better for the planet it will be (and probably better for your users too). Static site generation isn't for every website, and yes, I'm aware that Next.js is a thing. Some sites do require user data, dynamic page generation and so on. But for blogs and portfolios? I can't help but wonder if more of them could be built with SSGs.
SSGs are amazing. They generate static HTML files from markdown content files; all you have to do is upload them to your web server, which then serves the files exactly as they are. No assembling pages and templates on every request. No worrying about server-side security holes or bad configurations. Just good, old-fashioned HTML.
Addressing the AI elephant in the room: Hitting them where it hurts...
As previously mentioned, AI has a devastating ecological impact. This impact only worsens as the adoption of AI grows. It's a big part of why the development tips above will be more relevant as time goes on. But these AI mega corporations can only grow if they have content to crawl and train their models on.
Unfortunately for us, the widespread adoption of AI has also polluted our internet with LLM-written content. Most Google searches contain tonnes of results full of AI-written articles, blog posts and pages. It's so widespread that it's ironically having a negative impact on these AI companies via "model collapse". The snake is eating its own tail: the models have consumed so much of humanity's hard work that they increasingly have only their own outputs left to train on.
To add insult to injury, these AI companies crawl our websites and apps in ways so aggressive they practically amount to a DoS attack. In an interview with The Register, cloud services company Fastly said they're seeing a significant increase in operational costs from these crawlers.
Thankfully, developers can take advantage of tools like Anubis. Anubis is a firewall utility that watches incoming requests to your website and looks for signs of crawlers. If it suspects a bot, it will present a challenge, hopefully preventing the bots from crawling. Alternatively, you can use Cloudflare to achieve a similar end. If you have a static site hosted on Vercel, Netlify or Cloudflare Pages, Albireo is another alternative.
Of course, we can also add a robots.txt to our sites. But it can't be relied upon: many crawlers simply don't respect it.
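For the crawlers that do respect it, a few lines go a long way. GPTBot and CCBot are real crawler user agents, but the list you'll want changes over time, so treat this as a starting point rather than a complete blocklist:

```
# robots.txt — politely ask known AI crawlers to stay away.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```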
These tools can help protect our sites and products from being maliciously crawled, which protects us from unwanted bills and prevents our work from being used to harm the planet. Unfortunately, none of these will be a silver bullet, but we've got to work with what we have!
On using LLMs...
It goes without saying that limiting, or altogether stopping, the use of LLMs in your work will have a positive environmental impact. A University of California study found that a single 100-word prompt uses around 500ml of water.
The downsides don't stop there, either. If we're all solving our coding problems with AI rather than with each other, we're locking our troubleshooting and solutions away inside closed-off AI chatbots. Think about it: none of that is indexable by search engines. We used to ask questions on forums (like Stack Overflow), exchange expertise, and share our solutions with the world. As more people use LLMs to troubleshoot, fewer of us are making the time to talk about it on forums or blogs.
Another thing to worry about is skill atrophy. A lot of us have become so reliant on LLMs that we're losing our problem-solving and research skills. AI companies want us all dependent on their platforms, unable to code without them! Don't let them win.
If you are going to use LLMs, at least try to minimise usage. Make them a last resort rather than a default. Hell, if you can, run one locally. And when you do have to use one? Make sure you learn and understand the solution it gives you, and (here's the important part) verify it. Hopefully, you won't have to come back to the LLM for the same problem again. Then, maybe write about it on your blog or toot about it: put the answer out there for everyone to see. You can also prompt your LLM to respond in the least conversational way possible, using fewer tokens and therefore less energy.
Another facet of this conversation I don't see mentioned enough: none of these companies is making a profit by providing LLMs this cheaply. They're acquiring users. Once they have us trapped in the prison of convenience, they'll jack the prices up. Sure, you're paying £10 for your unlimited premium account right now... But once you can't code without it, it's a core part of your workflow and these companies want profit? They'll raise that £10 to whatever number they feel like, remove your unlimited tokens, and you'll have no say in the matter.
A closing note:
I also recognise that some workplaces will mandate working with AI. In those situations, your options are limited. I've been there, it's frustrating, especially with all the problems it brings. If this is you, you can try advocating against AI use, highlighting how it might not be as productive or helpful as it looks. But if we're being real: you probably won't be listened to. I'm not trying to preach to these people with this message. If this is you, I'm sorry you're stuck here, and I hope your work can shift away from being so AI reliant soon.
In any case, I hope this has made it obvious that the impacts aren't just environmental and social (although those alone should be enough): they directly affect your skills and workflows too. As more people's skills atrophy through AI use, keep refining your craft and keep learning to be a better developer. When this bubble bursts (and it will), you'll be miles ahead of everyone else.
The internet is a human built, delicate ecosystem of interconnected information. Don't let bots destroy that.
Conclusion
You don't have to care about the environment, the impact of LLMs, or even user experience; hopefully it's clear that your bottom line will benefit from reduced resource usage alone. But together, all these things have a huge impact on everyone, so what do you have to lose? Even taking two of these suggestions seriously will have a positive impact on something.
Again, I don't want to make out that you, as an individual developer, are solely responsible for the collapse of the Amazon rainforest or the world's droughts. But if we can do something that benefits our users, our code and our bottom line, as well as the planet... What's not to love?
Energy is only getting more expensive and scarce, and what limited energy we have is being funnelled into AI, with little left over for the rest of us. As it grows scarcer still, learning to adapt will be one of the most important skills of the decade. Will your infrastructure survive? Will your skills and resolve rise to the challenge? You decide.