Website Technical Audit
November 2019
Table of Contents
I. Page Speed
II. Crawlability and URL Errors
III. Rich Snippets and Schema
IV. Sitemaps
V. Progressive Web App (PWA)
VI. Site Audit Issues (Tool – Deepcrawl)
VII. Site Audit Issues (Tool – Sitebulb)
VIII. Mobile First Indexing
IX. Miscellaneous Pending Tasks
X. Website Platform Suggestion
Page Speed
Search engine rankings can be impacted by the back-end performance of the website. This includes network connections, CDNs, web servers, back-end applications, database services, page size, number of HTTP requests, and so on. Competition for the first page of the SERPs is cut-throat, and the website that is the most optimized and provides the best user experience stands to win the race. We highly recommend identifying and correcting the issues that cause slow page load times.
Fix
Minify JavaScript, CSS and HTML
Compacting JavaScript, CSS and HTML code can save many bytes of data and speed up downloading, parsing, and execution time. Using external CSS is recommended, as sketched below. Check the JavaScript files listed below, which are increasing the loading time; irrelevant JavaScript should be removed, or at least minimized in size and loading time.
Example affected page: solid-sleevless-winter-jacket-21311
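A minimal sketch of the recommended setup, assuming hypothetical file names (styles.min.css, app.min.js); minified assets are served as external files and non-critical scripts are deferred:

    <head>
      <!-- External, minified stylesheet instead of inline style blocks -->
      <link rel="stylesheet" href="styles.min.css">
      <!-- Minified script, deferred so it does not block HTML parsing -->
      <script src="app.min.js" defer></script>
    </head>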
Eliminate render-blocking resources
Resources are blocking the first paint of the page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles. Fast page loads result in higher user engagement, more pageviews, and improved conversions. You can improve your page load speed by inlining the links and scripts that are required for first paint and deferring those that aren't.
More information - https://developers.google.com/web/tools/lighthouse/audits/blocking-resources
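A minimal sketch of this pattern, assuming hypothetical file names; critical above-the-fold CSS is inlined, the full stylesheet is loaded without blocking first paint via the preload-onload technique, and scripts are deferred:

    <head>
      <style>
        /* Critical above-the-fold CSS inlined here */
      </style>
      <!-- Load the full stylesheet asynchronously, then apply it -->
      <link rel="preload" href="styles.css" as="style"
            onload="this.onload=null;this.rel='stylesheet'">
      <noscript><link rel="stylesheet" href="styles.css"></noscript>
      <!-- Defer non-critical JavaScript -->
      <script src="app.js" defer></script>
    </head>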
Related Lighthouse audit references (image formats, image optimization, JavaScript boot-up time and DOM size):
More information - https://developers.google.com/web/tools/lighthouse/audits/webp
More information - https://developers.google.com/web/tools/lighthouse/audits/optimize-images
More information - https://developers.google.com/web/tools/lighthouse/audits/bootup
More information - https://developers.google.com/web/tools/lighthouse/audits/dom-size
Minimize Critical Requests Depth
Consider reducing the length of chains, reducing the download size of resources, or deferring the download of unnecessary resources to improve page load.
More information - https://developers.google.com/web/tools/lighthouse/audits/critical-request-chains
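One common way to shorten a chain, sketched with a hypothetical font file that would otherwise only be discovered after the stylesheet downloads; preloading lets the browser fetch it in parallel instead of at the end of the chain:

    <head>
      <!-- Without this hint, fonts.css must download and parse before the font request starts -->
      <link rel="preload" href="fonts/brand-font.woff2" as="font" type="font/woff2" crossorigin>
      <link rel="stylesheet" href="fonts.css">
    </head>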
More information - https://developers.google.com/web/tools/lighthouse/audits/registered-service-worker
Avoid front-end JavaScript libraries with known security vulnerabilities
Intruders have automated web crawlers that can scan your site for known security vulnerabilities. When such a crawler detects a vulnerability, it alerts the intruder, who then only needs to figure out how to exploit it on your site.
More information - https://developers.google.com/web/tools/lighthouse/audits/vulnerabilities
Desktop and Mobile PageSpeed results (screenshots)
Sitemaps
The sitemap needs to be updated. An up-to-date sitemap is important so that Google can discover and crawl all of the site's pages.
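For reference, a minimal XML sitemap sketch; the entry and date are hypothetical:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- One <url> entry per indexable page -->
        <loc>https://www.mustardfashion.com/</loc>
        <lastmod>2019-11-01</lastmod>
      </url>
    </urlset>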
Progressive Web App (PWA)
Progressive Web Apps are web applications that load like regular web pages or websites but can offer functionality such as working offline, push notifications, and access to device hardware traditionally available only to native applications. PWAs combine the flexibility of the web with the experience of a native application.
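The starting point for a PWA is a registered service worker (see the registered-service-worker audit linked above); a minimal registration sketch, assuming a hypothetical /sw.js file at the site root:

    <script>
      // Register the service worker if the browser supports it
      if ('serviceWorker' in navigator) {
        navigator.serviceWorker.register('/sw.js')
          .then(function (reg) { console.log('Service worker registered:', reg.scope); })
          .catch(function (err) { console.log('Registration failed:', err); });
      }
    </script>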
Site Audit Issues (Tool – Deepcrawl):
Probable Solution – These URLs may be very slow, which could be a temporary or a permanent problem. If the URLs appear to function normally, it may be a temporary performance issue, potentially a result of the crawl itself. If the URLs consistently fail, it may indicate permanent server performance issues caused by the content, or broken URLs that the web server is unable to handle. These pages can be better optimized for crawling performance.
Probable Solution - Slow pages can negatively impact crawl efficiency and user experience. These slow pages can be optimized for improved performance.
3. Pages without Valid Canonical Tag: Indexable pages that are missing a canonical tag or that have conflicting canonical URLs in the HTTP header and the HTML head. There are 43 pages with this issue.
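A canonical tag names the preferred version of a page in the <head>; a sketch using a hypothetical full URL built from a page slug mentioned earlier in this audit:

    <head>
      <!-- Points search engines at the single preferred URL for this content -->
      <link rel="canonical" href="https://www.mustardfashion.com/solid-sleevless-winter-jacket-21311">
    </head>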
Probable Solution - These pages can be indexed in search engines but don't have a significant amount of content, so they should be reviewed to see whether the content can be expanded, whether the pages should be made non-indexable, or whether they should be removed completely.
5. Maximum HTML size: Pages that exceed the maximum HTML size specified in Deepcrawl. By default, the HTML size of a page should be less than or equal to 204.8 KB. There are 27 URLs with this issue.
Probable Solution - Large HTML pages can be slow to load and may not be fully indexed by search engines, negatively impacting both usability and SEO performance. Review whether the HTML size of these pages can be reduced.
6. HTTP pages: Pages that use the non-secure HTTP scheme. There are 24 URLs with this issue.
Probable Solution - Pages using the HTTP protocol are at a disadvantage when competing against secure pages in Google Search, and have been marked as Not Secure in the Chrome browser since July 2018. Review these pages and migrate them to HTTPS, ideally with a server-side 301 redirect from each HTTP URL to its HTTPS equivalent.
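Alongside the server-side redirect, internal references should point at HTTPS directly; the standard CSP directive below additionally asks browsers to upgrade any remaining insecure subresource requests (shown here as a sketch):

    <head>
      <!-- Upgrade http:// subresource requests (images, scripts, styles) to https:// -->
      <meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
    </head>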
Probable Solution - Pages with a small amount of content can be classified as 'thin content' and may not be indexed, or may devalue your site quality. These pages can be updated to include more unique content, or prevented from being crawled to improve crawl efficiency.
Probable Solution - Search engines may ignore links when there are too many on a given page. Review the pages that exceed the Max Links value and reduce the number of links.
Site Audit Issues (Tool – Sitebulb):
1. URLs with Duplicate Content: Because of the redirection issue, all of the site's URLs are affected by duplicate content. These are URLs whose HTML content is identical to that of at least one other indexable URL. This sort of duplication is a relatively serious issue, whereby URLs with identical content are accessible to search engine crawlers. If it results in large-scale duplicate content across the site, it could trip quality algorithms like Google's Panda, which can depress organic search traffic to the site as a whole.
2. URLs with Duplicate Page Titles: Because of the redirection issue, all of the site's URLs are affected by duplicate page titles. These are URLs that have exactly the same page title as at least one other indexable URL. If multiple pages have the same title, it can be difficult for search engines to differentiate the 'best' page for a given search query, which can result in keyword cannibalization (multiple pages on your own site competing for the same search terms and hurting each other's rankings).
URL - https://www.mustardfashion.com/topschiffon-dobby-tunic-with-sequence-black-18925
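Each page should carry a unique, descriptive <title>; a hypothetical example for the URL above:

    <head>
      <!-- Unique per-page title instead of one shared site-wide title -->
      <title>Chiffon Dobby Tunic with Sequence (Black) | Mustard Fashion</title>
    </head>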
6. Critical (Above-the-fold) CSS was not found: URLs that do not include critical CSS content in the <head>. Optimize the critical rendering path by inlining critical CSS in the <head>, which allows above-the-fold content to render in the browser without waiting for the rest of the CSS to load (see the critical CSS sketch in the Page Speed section above). All the URLs are affected by this.
7. Style sheets are not loaded first in the <head>: URLs that contain stylesheet <link> elements in the <head> which are loaded after other resources, such as scripts.
Probable Solution – Load style sheets first in the <head> on all affected URLs, as sketched below.
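A minimal ordering sketch with hypothetical file names; the stylesheet comes first so rendering is not held up, and the script is deferred:

    <head>
      <!-- Stylesheet first: the browser can start building the render tree immediately -->
      <link rel="stylesheet" href="styles.css">
      <!-- Script after the stylesheet, deferred so it does not block parsing -->
      <script src="app.js" defer></script>
    </head>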
Probable Solution – Compress text content with gzip to reduce bandwidth usage.
Mobile First Indexing:
1. Clickable elements too close together: Currently, 170 URLs are affected by this issue on mobile devices. This needs to be resolved for better usability on mobile devices (see the sketch after this list).
3. Text too small to read: Currently, 2 URLs are affected by this issue on mobile devices. This needs to be resolved for better readability on mobile devices.
4. Viewport not set: Currently, 2 URLs are affected by this issue on mobile devices. This needs to be resolved for better usability on mobile devices.
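A minimal sketch addressing these mobile usability issues; the .nav selector and sizes are hypothetical, based on the commonly recommended 48px tap-target size:

    <head>
      <!-- Viewport not set: declare a responsive viewport -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <style>
        /* Text too small to read: keep a legible base font size */
        body { font-size: 16px; }
        /* Clickable elements too close together: enlarge and space out tap targets */
        .nav a {
          display: inline-block;
          min-height: 48px;
          min-width: 48px;
          padding: 12px;
          margin: 4px;
        }
      </style>
    </head>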
Miscellaneous Pending Tasks: