3.18 Include Files That Are Automatically Expected
Search engines and browsers routinely request certain files from websites by default, expecting them to exist. If those files are missing, each request produces an error response, causing avoidable errors and emissions, when creating the files would instead offer SEO, user-experience, and other benefits to visitors.
- Expected File Formats: Take advantage of the favicon.ico, robots.txt, opensearch.xml, site.webmanifest, and sitemap.xml documents.
Search engines and browsers request certain files by default; ensuring they are in place reduces loading errors and may improve how efficiently visitors find and interact with a site.
OpenSearch lets the browser's built-in search box, rather than a custom solution, be integrated with your website's search, which may aid accessibility.
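As a sketch of the idea, an OpenSearch description document (commonly served as opensearch.xml; the site name and URL template below are placeholder assumptions) might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <!-- Name shown in the browser's list of search engines -->
  <ShortName>Example Site</ShortName>
  <Description>Search example.com</Description>
  <!-- {searchTerms} is replaced by the browser with the user's query -->
  <Url type="text/html" template="https://example.com/search?q={searchTerms}"/>
</OpenSearchDescription>
```

The document is advertised from each page with a link element such as `<link rel="search" type="application/opensearchdescription+xml" href="/opensearch.xml" title="Example Site">`.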
Expected files will generate HTTP requests whether or not they exist; providing them satisfies the products making those requests and can reduce repeat requests once the files are discovered.
Robots and sitemap files can be utilized by search engines to make your website more findable, which could lead to more visitors and potentially more customers.
A robots.txt file can also target specific search engines, helping to ensure content is correctly indexed and well placed so that visitors can find you easily.
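A minimal sitemap.xml, following the Sitemaps Protocol listed in the resources below (the URLs and date are placeholder assumptions), could be sketched as:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page you want indexed -->
  <url>
    <loc>https://example.com/</loc>
    <!-- Optional: last modification date helps crawlers prioritize re-visits -->
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>
```

The sitemap's location can be announced in robots.txt with a `Sitemap: https://example.com/sitemap.xml` line, or submitted directly through search engine tools.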
- materials: Low
- energy: Low
- water: Low
- emissions: Low
```
User-agent: *
Disallow: /cgi-bin/
```
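A minimal site.webmanifest covering the basic fields browsers look for might be sketched as follows (the names, colors, and icon path are placeholder assumptions):

```json
{
  "name": "Example Site",
  "short_name": "Example",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#ffffff",
  "icons": [
    { "src": "/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```

The manifest is referenced from each page with `<link rel="manifest" href="/site.webmanifest">`.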
- About /robots.txt
- Build and submit a sitemap
- Define a favicon to show in search results
- Favicon Generator
- How Google interprets the robots.txt specification
- OpenSearch Protocol
- Sitemaps Protocol
- The Carbon Impact of Web Standards (PDF)
- Web Application Manifest