Updated:
by Wayne Smith
Technical SEO is one layer of the full SEO stack. It focuses on hosting configurations and on-page coding elements. While it does not typically involve content or keyword strategy, it does cover meta tags, duplicate content resolution (often at the server level), structured data (schema), site navigation, and clean URL structures. Technical SEO supports SEO performance without altering the visible design or content of a website.
On-Page Technical SEO
Technical SEO is rarely involved in the actual content on the page; that is not to say that on-page content is unimportant, but content is a different aspect of SEO. Meta information, such as schema, canonical tags, robots tags, and link tags, falls under technical SEO rather than content.
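As a sketch, the meta-level elements mentioned above live in the page's head; the URLs and values below are placeholders, not recommendations for any particular site:

```html
<head>
  <!-- Robots meta tag: asks engines not to index this page but to follow its links -->
  <meta name="robots" content="noindex, follow">

  <!-- Canonical link: points search engines at the preferred URL for this content -->
  <link rel="canonical" href="https://www.example.com/preferred-page/">

  <!-- Link tags such as alternate/hreflang declare other versions of the page -->
  <link rel="alternate" hreflang="es" href="https://www.example.com/es/preferred-page/">
</head>
```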
Advanced Structured Data (Schema)
Structured data (schema.org markup, typically delivered as JSON-LD) can be used both to amplify or disambiguate content for improved visibility and to qualify for search features that improve how a site is listed or gain access to product feeds.
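A minimal illustration, assuming schema.org vocabulary embedded as JSON-LD; the product details are placeholders invented for the example:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "Placeholder product used to illustrate structured data.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```

Markup like this is what makes a page eligible for rich results; it does not guarantee them.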
Rel attribute = nofollow, ugc, sponsored, me ...
The rel attribute is part of technical SEO best practices ... it informs search engines about the nature and source of the links on a page so the engines can properly discern how to process them.
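A few hedged examples of how these rel values appear on anchor tags; the URLs are placeholders:

```html
<!-- A paid placement: sponsored tells engines the link was compensated -->
<a href="https://advertiser.example.com/" rel="sponsored">Partner offer</a>

<!-- User-generated content, such as a link left in a comment section -->
<a href="https://commenter.example.com/" rel="ugc nofollow">Commenter's site</a>

<!-- rel="me" asserts that another profile belongs to the same person or entity -->
<a href="https://social.example.com/@example" rel="me">My profile</a>
```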
The Canonical Link: A Broken Promise
Best practice for SEO involves using the canonical link element to prevent duplicate-content issues. Google, however, treats the canonical as a hint rather than a directive, so it needs to be specifically checked to confirm which URL Google has actually selected as canonical.
Back-End Technical SEO
SEO on the back end focuses on technical errors at the server level and on signals from website structure that affect SEO.
Advanced Robots.txt file
The robots.txt file is an important component of technical SEO and is often considered part of full-stack SEO practices. While technically optional, it is widely regarded as a best practice. In essence, the file informs web crawlers and search engines which URLs they may or may not crawl.
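A sketch of a robots.txt for a site that wants to keep crawlers out of its internal search results while allowing everything else; the paths and sitemap URL are placeholders:

```
User-agent: *
Disallow: /search/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing; a URL blocked here can still appear in results if other sites link to it, so keeping a page out of the index calls for a robots meta tag or X-Robots-Tag header instead.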
Is disavow dead or obsolete?
Over a decade ago, uploading a disavow file to Google was considered a best practice. But much has changed in SEO since then -- especially in how Google evaluates link quality and interprets link text to associate pages with keywords.
Problem-Free URL Structure for SEO
Optimized URLs are SEO-friendly, human-readable, and clean, free of cryptic parameters. A good site structure helps avoid content-duplication problems and makes it easier to measure KPIs for different types of content on the site.
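A before-and-after sketch of the idea; both URLs are invented for illustration:

```
# Hard to read: cryptic parameters give no hint of the content or its type
https://www.example.com/index.php?id=7823&cat=4&sess=9f2c

# Clean and human-readable: the section is obvious, which also makes
# per-section KPI reporting (e.g. all /blog/ traffic) straightforward
https://www.example.com/blog/technical-seo-checklist/
```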
Using https is an SEO best practice
HTTPS is a secure connection and over the years Google has pushed sites to adopt HTTPS ...
Google lists sites in its search results that are not HTTPS with a "not secure" warning, although it does not specifically bury them in the search results. The Chrome browser also flags sites that are not HTTPS as not secure in the address bar.
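One common way to move a site to HTTPS, assuming an Apache server with mod_rewrite enabled; this is a .htaccess sketch, not the only approach:

```apache
RewriteEngine On
# If the request did not arrive over HTTPS ...
RewriteCond %{HTTPS} off
# ... issue a permanent (301) redirect to the HTTPS version of the same URL
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status matters for SEO: it tells search engines the move is permanent so ranking signals consolidate on the HTTPS URLs.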
Best Practices for SEO: Soft 404 errors
A soft 404 is a technical SEO problem that wastes bandwidth, consumes a website's crawl budget, provides a poor user experience, and serves unhelpful content. Soft 404s, and some tactics related to not presenting a 404 error, can also be exploited to harm the website in search engines.
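A soft 404 happens when a "page not found" message is served with a success status code. Sketched as raw HTTP status lines for a hypothetical missing URL:

```
GET /missing-page HTTP/1.1

# Soft 404 - the body says "Page not found" but the status claims success,
# so crawlers treat it as a real, indexable page and keep re-crawling it:
HTTP/1.1 200 OK

# Correct response - the status code itself reports the missing page,
# so engines drop the URL instead of spending crawl budget on it:
HTTP/1.1 404 Not Found
```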
... Solution Smith tests SEO tactics so you don't have to ...
Full-stack SEO has a lot of moving parts, as does technical SEO, and it takes time and effort to understand the nuances and co-dependencies involved. The costs associated with testing are part of Solution Smith's overhead ... resulting in real savings in time and effort for clients ... Why reinvent the wheel?