SEO (Public-facing sites only)

Avoid all "black hat" SEO practices, such as keyword stuffing, dark patterns, unnecessary linking, redirecting legitimate site errors, domain spoofing, etc.

- All SERP-identifiable pages should include: title tag, meta description, og:url, og:author, og:site-author, og:image
- Pages that should not be identifiable by SERPs should use robots.txt exclusions to prevent them from being crawled
- Core page content should always load with the initial DOM
- Shadow DOM elements should never be relied on for SEO values
- Do not use the "keywords" meta tag
- All links, including those in content, use the https protocol (when available)
- All images should use alt text and title attributes
- Clean, semantic markup
- Clear titles for all pages
- Meta descriptions for all pages
- Clear H1 on all pages
- Sitemap set up and submitted to webmaster tools
- Standard robots.txt for the relevant backend system
- Robot-friendly fallback content for JS-powered features
- Basic sitemap
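The per-page tags called for above might look like the following sketch. All values and the example.com domain are placeholders; note that og:author and og:site-author are not part of the core Open Graph vocabulary, and are included here only because this checklist asks for them:

```html
<head>
  <!-- Title and meta description: shown in SERP snippets -->
  <title>Example Page Title</title>
  <meta name="description" content="A one-to-two sentence summary of this page.">

  <!-- Open Graph tags per the checklist -->
  <meta property="og:url" content="https://example.com/example-page/">
  <meta property="og:author" content="Jane Doe">
  <meta property="og:site-author" content="Example Co.">
  <meta property="og:image" content="https://example.com/images/example-page.jpg">
</head>
```

Note there is deliberately no `<meta name="keywords">` tag, per the rule above.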
- Minimize JavaScript usage to what cannot be handled in native HTML. Google delays rendering of JS content, which has a big impact on JAMstack sites in particular, and unnecessary JavaScript bloats site load speed and creates an unnecessary burden on the user's browser.
- Avoid redirect chains. Provide the shortest (ideally zero-hop) redirect paths from old content to new. Consider changes in URL structure carefully.
- Provide alt text for all media content.
- Avoid duplicate content. Use relational links, canonical links, and a simplified sitemap to avoid this issue.
- Pagination URLs use canonical links, in addition to "prev" and "next" identifiers.
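The pagination rule above can be sketched as the following head fragment for a hypothetical page 2 of a blog listing (URLs are placeholders). Each page carries a self-referencing canonical plus prev/next links to its neighbors:

```html
<!-- On https://example.com/blog/page/2/ -->
<link rel="canonical" href="https://example.com/blog/page/2/">
<link rel="prev" href="https://example.com/blog/">
<link rel="next" href="https://example.com/blog/page/3/">
```

The self-referencing canonical tells crawlers each paginated page is its own indexable URL rather than a duplicate of page 1.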
- Nuanced sitemap including pinpoint priority values
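A sitemap with per-URL priority values, as described above, follows the standard sitemaps.org XML schema. A minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/blog/</loc>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://example.com/legal/terms/</loc>
    <priority>0.2</priority>
  </url>
</urlset>
```

Priority is a relative hint (0.0 to 1.0, default 0.5) for crawlers about which pages on the site matter most; it does not affect ranking against other sites.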
- Extensive JSON page representation (structured data)
- Strategic use of nofollow for pages we want to exclude
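The JSON page representation above is commonly implemented as JSON-LD structured data using the schema.org vocabulary. A minimal sketch for an article page (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2024-01-15",
  "image": "https://example.com/images/cover.jpg"
}
</script>
```

This block lives in the page head or body and lets search engines read page metadata without parsing the rendered content.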