4.6 Use automation wisely
Hosting, Infrastructure, and Systems
Automate recurring tasks only where doing so saves resources, for example by scaling services to reduce consumption or by filtering suspicious activity.
Criteria
- Automate tasks: Human-testable
Automate recurring tasks, such as deployment, testing, and compilation, in alignment with continuous integration and continuous delivery best practices (a sketch of such a pipeline follows the resource list below).
- AWS WAF – SEC11-BP02 – Automate testing throughout the development and release lifecycle
- AWS WAF – SEC05-BP03 – Automate network protection
- AWS WAF – SEC11-BP06 – Deploy software programmatically
- Continuous integration
- Continuous integration vs. delivery vs. deployment
- Data-driven Algorithm Selection for Carbon-Aware Scheduling (PDF)
- GPF – General Policy Framework (PDF) – 3.2 – Architecture (Resource Tailoring)
- GPF – General Policy Framework (PDF) – 6.1 – Front-End (Download Limits)
- GPF – General Policy Framework (PDF) – 8.10 – Hosting (Asynchronous Requests)
- How AI and automation make data centers greener and more sustainable
- Implementing CI/CD in Web Development Projects
- Microsoft Azure WAF – Security considerations
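For illustration only, a minimal Python sketch of the programmatic test-build-deploy flow these resources describe. The pytest, docker, and deploy.sh commands are hypothetical placeholders for a project's actual toolchain, which a hosted CI/CD service would normally orchestrate.

```python
#!/usr/bin/env python3
"""Minimal CI-style pipeline sketch: test, then build, then deploy."""
import subprocess
import sys

# Each stage is a command; all commands here are placeholders.
STEPS = [
    ["pytest", "--quiet"],                         # automated test suite
    ["docker", "build", "-t", "app:latest", "."],  # build/compile step
    ["./deploy.sh", "staging"],                    # hypothetical deploy script
]

def main() -> int:
    for step in STEPS:
        print(f"running: {' '.join(step)}")
        if subprocess.run(step).returncode != 0:
            # Fail fast: never deploy when an earlier stage has failed.
            print("step failed; aborting pipeline")
            return 1
    print("pipeline complete")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```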
- Qualify tasks: Human-testable
Run automated tasks only when necessary to reduce unnecessary resource utilization; the change-detection sketch after this list shows one way to qualify a task before running it.
- AWS WAF – SEC11-BP02 – Automate testing throughout the development and release lifecycle
- AWS WAF – SEC05-BP03 – Automate network protection
- AWS WAF – SEC11-BP06 – Deploy software programmatically
- AWS WAF – SUS02-BP03 – Stop the creation and maintenance of unused assets
- AWS WAF – SUS02-BP06 – Implement buffering or throttling to flatten the demand curve
- Automation in Web Development
- GPF – General Policy Framework (PDF) – 3.2 – Architecture (Resource Tailoring)
- GPF – General Policy Framework (PDF) – 6.1 – Front-End (Download Limits)
- GPF – General Policy Framework (PDF) – 8.10 – Hosting (Asynchronous Requests)
- How Crawl Optimization Improves Website Sustainability
- Microsoft Azure WAF – Security considerations
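As one way to qualify a task before running it, the sketch below hashes a job's inputs and skips the run when nothing has changed, so scheduled automation does no redundant work. The src/ directory, .build-hash stamp file, and make build command are assumptions for the example.

```python
"""Skip an automated job when its inputs have not changed (sketch)."""
import hashlib
import pathlib
import subprocess

SRC = pathlib.Path("src")            # assumed input directory
STAMP = pathlib.Path(".build-hash")  # stores the hash of the last run

def tree_hash(root: pathlib.Path) -> str:
    """Hash every file under root so any change is detected."""
    digest = hashlib.sha256()
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digest.update(path.as_posix().encode())
            digest.update(path.read_bytes())
    return digest.hexdigest()

def main() -> None:
    current = tree_hash(SRC)
    if STAMP.exists() and STAMP.read_text() == current:
        print("inputs unchanged; skipping run to save resources")
        return
    subprocess.run(["make", "build"], check=True)  # placeholder task
    STAMP.write_text(current)

if __name__ == "__main__":
    main()
```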
- Automated scaling: Human-testable
Use automated scaling to promptly adjust server capacity up or down based on demand, ensuring efficient resource allocation. Implement buffering and throttling to manage load and maintain performance without overprovisioning (see the scaling sketch after the resource list below).
- 2020 Best Practice Guidelines for the EU Code of Conduct on Data Centre Energy Efficiency (PDF)
- AWS WAF – SUS02-BP01 – Scale workload infrastructure dynamically
- AWS WAF – SUS03-BP01 – Optimize software and architecture for asynchronous and scheduled jobs
- Code of Conduct on Data Centre Energy Efficiency
- EMPower: The Case for a Cloud Power Control Plane (PDF)
- GR491 – 3-8028 – Memory Usage
- GR491 – 3-8029 – CPU Usage
- GR491 – 3-8031 – Unused CPU Cores
- GR491 – 3-8036 – Provisioning and Deprovisioning
- How Green is Your Data Center?
- If only data centers would participate in demand response
- Load shifting of computing can lower emissions and soak up surplus renewables. Except when it doesn’t
- Microsoft Azure WAF – Sustainable workloads
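A minimal sketch of a demand-based scaling decision with hysteresis. The thresholds, replica bounds, and utilization readings are illustrative assumptions; in practice the platform's autoscaler (for example Kubernetes HPA or an AWS Auto Scaling policy) applies this kind of rule.

```python
"""Demand-based scaling decision with hysteresis (sketch)."""
from dataclasses import dataclass

@dataclass
class Scaler:
    min_replicas: int = 1
    max_replicas: int = 10
    scale_up_at: float = 0.75    # add capacity above 75% utilization
    scale_down_at: float = 0.30  # remove capacity below 30% utilization

    def decide(self, replicas: int, utilization: float) -> int:
        """Return the new replica count for an observed utilization."""
        if utilization > self.scale_up_at and replicas < self.max_replicas:
            return replicas + 1
        if utilization < self.scale_down_at and replicas > self.min_replicas:
            return replicas - 1
        return replicas

if __name__ == "__main__":
    scaler, replicas = Scaler(), 2
    for load in (0.5, 0.8, 0.9, 0.4, 0.2, 0.1):  # sample readings
        replicas = scaler.decide(replicas, load)
        print(f"utilization={load:.0%} -> replicas={replicas}")
```

The gap between the two thresholds is a small-scale form of buffering: brief fluctuations are absorbed rather than triggering scale churn, which flattens the demand curve the infrastructure has to follow.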
- Suspicious activity filtering: Machine-testable
Restrict unwanted and unnecessary third-party crawlers, suspicious user agents, unwanted users, bots, and scrapers from accessing or downloading your content. Follow best practices, such as server access rules and security tools, while ensuring your content remains accessible to users, search engines, and any helpful, welcome crawlers. Consider that scrapers may be used to inform and train large language models. A user-agent filtering sketch follows the resource list below.
- 63% of Websites Receive AI Traffic
- AI bots are destroying Open Access
- AI crawlers cause Wikimedia Commons bandwidth demands to surge 50%
- Block the Bots that Feed AI Models by Scraping Your Website
- Blockin' bots
- Blocking AI Bots
- Bot traffic: What it is and why you should care about it
- Cleaning up bad bots (and the climate)
- Distribution of bot and human web traffic worldwide from 2014 to 2021
- How and Why To Prevent Bots From Crawling Your Site
- How crawlers impact the operations of the Wikimedia projects
- How to Eliminate Bots From Your Website
- I use Zip Bombs to Protect my Server
- No bots allowed?
- Open source devs say AI crawlers dominate traffic, forcing blocks on entire countries
- Poisoning Well
- Thousands of creatives join forces to combat AI data scraping
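As one way to apply server access rules, a minimal WSGI sketch that refuses requests from a blocklist of crawler user agents while serving everyone else. The agents named are illustrative, not an authoritative list; production setups would maintain the list from published crawler directories and enforce it at the web server, CDN, or WAF, alongside robots.txt directives and rate limiting.

```python
"""Refuse requests from unwanted crawler user agents (WSGI sketch)."""
from wsgiref.simple_server import make_server

# Illustrative sample only; maintain a real blocklist from current sources.
BLOCKED_AGENTS = ("GPTBot", "CCBot", "Bytespider")

def app(environ, start_response):
    agent = environ.get("HTTP_USER_AGENT", "")
    if any(bad in agent for bad in BLOCKED_AGENTS):
        # A tiny 403 response: blocked crawlers get no content to scrape.
        start_response("403 Forbidden", [("Content-Type", "text/plain")])
        return [b"Forbidden"]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Welcome"]

if __name__ == "__main__":
    with make_server("", 8000, app) as server:
        server.serve_forever()
```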
Benefits
- Economic
Maximizing the number of tasks carried out rapidly by machine not only minimizes power use and carbon emissions, but also brings down maintenance and infrastructure costs.
- Environment
Optimizing workflows can reduce the amount of energy used during peak periods, when it may be most costly or unsustainable to run.
- Operations
Automating repetitive tasks allows humans to focus on valuable, novel, and creative tasks that can offer greater job satisfaction and expand skills.
- Security
Blocking unwanted bots, crawlers, and similar users protects websites from harm and avoids potential downtime.
GRI
- Materials: Low
- Energy: Low
- Water: Low
- Emissions: Low