Website Technical Audit

The document provides a technical audit of a fashion website called Mustard Fashion. It summarizes key metrics like page speed scores, issues detected by various site audit tools, recommendations for improving performance and user experience. Specific issues mentioned include high page load times, unnecessary JavaScript, non-optimized images, lack of service workers and caching policies. The audit also notes errors in schema markup, outdated sitemaps and opportunities to implement progressive web app features.

Uploaded by Deepak Shetty

Brief Technical Audit of Website

November 2019

Table of Contents

I. Page Speed
II. Crawlability and URL Errors
III. Rich Snippets and Schema
IV. Sitemaps
V. Progressive Web App (PWA)
VI. Site Audit Issues (Tool – Deepcrawl)
VII. Site Audit Issues (Tool – Sitebulb)
VIII. Mobile First Indexing
IX. Miscellaneous Pending Tasks
X. Website Platform Suggestion

Page Speed

Search engine rankings can be impacted by the back-end performance of the website. This includes network connections, CDNs, web servers, the back-end application, database services, page size, number of HTTP requests, etc. There is cut-throat competition for the first page of SERPs, and the website that is the most optimized and provides the best user experience stands to win the race. We highly recommend correcting the issues that cause slow page load times.

| Type of Page   | URL                                             | Desktop Score | Mobile Score | Page Size | Load Time |
|----------------|-------------------------------------------------|---------------|--------------|-----------|-----------|
| Home Page      | https://www.mustardfashion.com/                 | 68            | 11           | 2.41 MB   | 5.0s      |
| Category Pages | https://www.mustardfashion.com/new-arrivals     | 54            | 9            | 1.62 MB   | 7.8s      |
| Category Pages | https://www.mustardfashion.com/tops             | 69            | 8            | 1.70 MB   | 7.0s      |
| Category Pages | https://www.mustardfashion.com/bottoms          | 72            | 5            | 1.37 MB   | 5.7s      |
| Category Pages | https://www.mustardfashion.com/jackets-sweaters | 68            | 8            | 1.64 MB   | 4.1s      |
| Category Pages | https://www.mustardfashion.com/dupatta-stoles   | 57            | 5            | 1.44 MB   | 4.9s      |
| Category Pages | https://www.mustardfashion.com/sale             | 72            | 9            | 1.64 MB   | 4.0s      |
| Category Pages | https://www.mustardfashion.com/plus-size        | 58            | 9            | 1.66 MB   | 4.0s      |

Fix
 Minify JavaScript, CSS and HTML
Compacting JavaScript code, CSS and HTML can save many bytes of data and speed up download, parsing, and execution time. Using external CSS is recommended.
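To illustrate why minification saves bytes, here is a deliberately naive sketch of a CSS minifier (in practice a dedicated build-tool minifier would be used; this only demonstrates the idea of stripping comments and whitespace):

```javascript
// Naive CSS minifier sketch: strips /* */ comments and collapses
// whitespace. A real minifier does much more (safe renaming,
// dead-code removal), so treat this purely as an illustration.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')   // drop comments
    .replace(/\s+/g, ' ')               // collapse runs of whitespace
    .replace(/\s*([{};:,])\s*/g, '$1')  // no spaces around punctuation
    .trim();
}

const input = `
/* header styles */
.header {
  color: mustard;
  margin: 0 auto;
}
`;
const output = minifyCss(input);
console.log(output); // .header{color:mustard;margin:0 auto;}
console.log(input.length, '->', output.length); // bytes saved
```

The same principle applies to JavaScript and HTML: the compacted file is semantically identical but smaller to download and faster to parse.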

Check the JavaScript files on the pages below, which are increasing loading time. Irrelevant JavaScript should be removed, or at least minimised in size and loading time.

Home page URL – https://www.mustardfashion.com/
Category page URL – https://www.mustardfashion.com/tops
Sub-category page URL – https://www.mustardfashion.com/jackets-sweaters/jackets
Product page URL – https://www.mustardfashion.com/mustard-grey-black-solid-sleevless-winter-jacket-21311
 Eliminate render-blocking resources
Resources are blocking the first paint of the page. Consider delivering critical JS/CSS inline and deferring all non-critical JS/styles. Fast page loads result in higher user engagement, more pageviews, and improved conversion.

You can improve your page load speed by inlining links and scripts that
are required for first paint, and deferring those that aren't.

More information - https://developers.google.com/web/tools/lighthouse/audits/blocking-resources
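A sketch of what this looks like in the page `<head>` (the file names below are illustrative, not taken from the site):

```html
<head>
  <!-- Inline only the CSS needed for above-the-fold content -->
  <style>/* critical, above-the-fold rules here */</style>

  <!-- Load the full stylesheet without blocking first paint -->
  <link rel="preload" href="/css/site.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">

  <!-- Defer non-critical JavaScript until the document is parsed -->
  <script src="/js/app.js" defer></script>
</head>
```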

 Serve images in next-gen formats
Image formats like JPEG 2000, JPEG XR, and WebP often provide better compression than PNG or JPEG, which means faster downloads and less data consumption.

More information -
https://developers.google.com/web/tools/lighthouse/audits/webp
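A `<picture>` element can serve WebP where the browser supports it, with a JPEG fallback (image paths here are hypothetical):

```html
<picture>
  <source srcset="/images/jacket.webp" type="image/webp">
  <img src="/images/jacket.jpg" alt="Mustard winter jacket">
</picture>
```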

 Preconnect to required origins
Add preconnect or dns-prefetch resource hints to establish early connections to important third-party origins.
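The hints are single `<link>` elements in the `<head>`; the origins below are illustrative examples, not taken from the audit:

```html
<!-- Open the connection (DNS + TCP + TLS) before the resource is requested -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<!-- Cheaper hint: resolve DNS only -->
<link rel="dns-prefetch" href="https://www.google-analytics.com">
```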

 Efficiently encode images
Optimized images load faster and consume less cellular data.

More information - https://developers.google.com/web/tools/lighthouse/audits/optimize-images

 Minimize main-thread work
Consider reducing the time spent parsing, compiling and executing JS. You may find delivering smaller JS payloads helps with this.

 Reduce JavaScript execution time
Consider reducing the time spent parsing, compiling, and executing JS. You may find delivering smaller JS payloads helps with this.

More information -
https://developers.google.com/web/tools/lighthouse/audits/bootup

 Serve static assets with an efficient cache policy
A long cache lifetime can speed up repeat visits to your page, since HTTP caching lets the browser reuse local copies instead of re-downloading them. An ideal cache lifetime here is 90 days.

When a browser requests a resource, the server providing the resource can tell the browser how long it should temporarily store or cache the resource. For any subsequent request for that resource, the browser uses its local copy, rather than going to the network to get it.
More information - https://developers.google.com/web/tools/lighthouse/audits/cache-policy
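A 90-day lifetime for static assets could be expressed in server configuration along these lines (a sketch in nginx syntax; the site's actual server setup is not known from the audit, so treat the location pattern as an assumption):

```nginx
# Cache static assets for 90 days (7,776,000 seconds)
location ~* \.(css|js|png|jpg|jpeg|webp|woff2)$ {
    add_header Cache-Control "public, max-age=7776000";
}
```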

 Avoid an excessive DOM size
The page currently has 983 DOM nodes, with up to 105 child elements under a single parent. Ideally there should be fewer than 60 children per parent element. Eliminating irrelevant DOM nodes is recommended.

More information -
https://developers.google.com/web/tools/lighthouse/audits/dom-size
 Minimize Critical Requests Depth
Consider reducing the length of chains, reducing the download size of
resources, or deferring the download of unnecessary resources to
improve page load.

More information - https://developers.google.com/web/tools/lighthouse/audits/critical-request-chains

 Does not register a service worker
The service worker is the technology that enables your app to use many Progressive Web App features, such as offline, add to homescreen, and push notifications.

More information - https://developers.google.com/web/tools/lighthouse/audits/registered-service-worker
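A minimal registration sketch (the `/sw.js` file name is hypothetical; the guard simply makes the function a no-op in environments without service worker support):

```javascript
// Register a service worker if the environment supports it.
// Returns the registration promise, or null where unsupported
// (e.g. under Node, or in an old browser).
function registerServiceWorker(scriptUrl) {
  if (typeof navigator === 'undefined' || !('serviceWorker' in navigator)) {
    return null;
  }
  return navigator.serviceWorker
    .register(scriptUrl)
    .catch((err) => console.error('SW registration failed:', err));
}

// In the browser this would typically be called on page load:
registerServiceWorker('/sw.js');
```

The service worker script itself would then implement caching and offline behaviour for the PWA features discussed later in this audit.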

 Does not use passive listeners to improve scrolling performance
Consider marking your touch and wheel event listeners as `passive` to improve your page's scroll performance.
More information - https://developers.google.com/web/tools/lighthouse/audits/passive-event-listeners

 Includes front-end JavaScript libraries with known security vulnerabilities
Some third-party scripts may contain known security vulnerabilities that are easily identified and exploited by attackers.

Intruders have automated web crawlers that can scan your site for
known security vulnerabilities. When the web crawler detects a
vulnerability, it alerts the intruder. From there, the intruder just needs to
figure out how to exploit the vulnerability on your site.

More information - https://developers.google.com/web/tools/lighthouse/audits/vulnerabilities

 Browser errors were logged to the console
Errors logged to the console indicate unresolved problems. They can come from network request failures and other browser concerns; in this case, 404 errors and syntax errors were logged.


Crawlability and URL Errors


Pages Indexed: 1,515

Desktop

To view the affected URLs - Click here

Mobile

To view the affected URLs - Click here



Rich Snippets and Schema


Currently, there is a “(markup: schema.org)” error on 191 URLs, which needs to be resolved.

To view the affected URLs - Click here

Sitemaps

The sitemap needs to be updated; this is very important so that Google will crawl all the pages.

Progressive Web App (PWA)

Progressive Web Apps are web applications that load like regular web pages or websites but can offer the user functionality such as working offline, push notifications, and device hardware access traditionally available only to native applications. PWAs combine the flexibility of the web with the experience of a native application.

Building a high-quality Progressive Web App has incredible benefits, making it easy to delight your users, grow engagement and increase conversions.

More information - https://developers.google.com/web/progressive-web-apps/checklist

Site Audit Issues (Tool – Deepcrawl):
1. Failed URLs: URLs which were crawled, but did not respond within the
Deepcrawl timeout period of 9 seconds. 884 URLs failed.

Probable Solution – These URLs may be very slow, which could be a temporary
or permanent problem. If the URLs appear to function normally then it may be
a temporary performance issue, potentially a result of the crawl itself. If the
URLs consistently fail, then it may indicate permanent server performance
issues caused by the content, or broken URLs which the web server is not able
to manage. These pages can be optimized better for crawling performance.

To view the affected URLs - Click here

2. Maximum Fetch Time: URLs exceeding the maximum fetch time specified in Deepcrawl. There are 28 URLs with this issue. It means that these URLs took 3 or more seconds to fetch.

Probable Solution - Slow pages can negatively impact crawl efficiency and user
experience. These slow pages can be optimised for improved performance.

To view the affected URLs - Click here

3. Pages without Valid Canonical Tag: Indexable pages which are missing a
canonical tag or with conflicting canonical URLs in the HTTP header and
HTML head. There are 43 pages with this issue.

Probable Solution - A canonical tag should be included on every page to prevent duplication issues. If a page contains 2 different canonical URLs, they will both be ignored. Update all the pages which are missing a canonical tag or have conflicting canonical tags.

To view the affected URLs - Click here
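For reference, a canonical tag is a single `<link>` element in the `<head>`; here it is shown self-referencing one of the category URLs from this audit:

```html
<!-- Declare the preferred URL for this page's content -->
<link rel="canonical" href="https://www.mustardfashion.com/tops">
```

Each page should carry exactly one such tag, and it must not conflict with any canonical URL sent in the HTTP headers.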


4. Empty Pages: Indexable pages with less content than the content size setting specified in Deepcrawl. There are 10 pages with this issue.

Probable Solution - These pages can be indexed in search engines, but don't
have any significant amount of content, so they should be reviewed to see if
the content can be increased, if the pages should be made non-indexable, or
removed completely.

To view the affected URLs - Click here

5. Maximum HTML size: Pages that exceed the maximum HTML size specified in Deepcrawl. The HTML size of a page should be less than or equal to 204.8 KB. There are 27 URLs with this issue.

Probable Solution - Large HTML pages can be slow to load and may not be fully
indexed by search engines, negatively impacting both usability and SEO
performance. Review if the size of HTML can be reduced on these pages.

To view the affected URLs - Click here

6. HTTP pages: The pages which are using the non-secure HTTP scheme.
There are 24 URLs with this issue.

Probable Solution - Pages using the HTTP protocol are at a disadvantage when competing against secure pages in Google search, and have been marked as Not Secure in Chrome browsers since July 2018. Review these pages and convert them to HTTPS.

To view the affected URLs - Click here


7. Thin Pages: Pages with less than the minimum content size specified in
Deepcrawl. Content size of the page should be more than 3.07 KB. There
are 5 pages with this issue.

Probable Solution - Pages with a small content size can be classified as 'thin
content' and may not be indexed or devalue your site quality. These pages can
be updated to include more unique content or prevented from being crawled
to improve the crawl efficiency.

To view the affected URLs - Click here

8. Maximum links: Pages exceeding the maximum number of links specified in Deepcrawl. The number of links on a page should be less than 250. There are 2 pages with this issue.

Probable Solution - Search engines may ignore links when there are too many on a given page. Review the pages that exceed the Max Links value and reduce the number of links.

To view the affected URLs - Click here

Site Audit Issues (Tool – Sitebulb):

1. URLs with Duplicate Content: Because of the redirection issue, all the URLs of the site are flagged as duplicate content. These are URLs that have identical HTML content to at least one other indexable URL. If this sort of duplication occurs, you have a relatively serious issue, whereby URLs with identical content are accessible to search engine crawlers. If this results in large-scale duplicate content issues on the site, you could trip quality algorithms like Google's Panda, which can depress organic search traffic to the site as a whole.

Probable Solution – Fix the redirection issue.

2. URLs with Duplicate Page Titles: Because of the redirection issue, all the URLs of the site are flagged as having duplicate page titles. These are URLs that have the exact same page title as at least one other indexable URL. If multiple pages have the same title, this can make it difficult for search engines to differentiate the 'best' page for a given search query, which can result in keyword cannibalization (multiple pages on your own site competing for the same search terms, and hurting each other's rankings).

Probable Solution – Fix the redirection issue.

3. Broken Internal URLs: Broken URLs are considered unhealthy, as they result in a poor user experience, and can also have a negative SEO impact, depending on the type and scale of the issue. It might be Not Found, Error, Forbidden or Timeout.

URL - https://www.mustardfashion.com/topschiffon-dobby-tunic-with-sequence-black-18925

Probable Solution – Need to fix this particular URL


4. Missing viewport <meta> tag in the <head>: URLs that do not contain
the viewport <meta> tag in the <head>. Without a viewport, mobile
devices will render the page at a typical desktop screen width, scaled to
fit the screen. All the URLs of the website are affected by this.

Probable Solution - The website should be made mobile friendly
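The missing tag is a single line in the `<head>`; this is the standard responsive viewport declaration:

```html
<!-- Render at device width instead of a scaled desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```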

5. Character Set Not Specified in Head or Headers: URLs which do not specify the character set in the <head> or HTTP headers. It is important to specify a character set early (ideally in the HTTP response headers) to allow the browser to begin parsing HTML and executing scripts immediately. Without a specified character set, the browser is left to figure it out on its own, which takes time and therefore negatively impacts page load times. All the URLs are affected by this.

Probable Solution – Specify character set in the HTTP response headers.

6. Critical (Above-the-fold) CSS was not found: URLs that do not include
critical CSS content in the <head>. Optimize the critical rendering path
by inlining critical CSS in the <head>, which allows above-the-fold
content to render in the browser without waiting for the rest of the CSS
to load. All the URLs are affected by this.

Probable Solution – Include Critical CSS content in the <head>.

7. Style sheets are not loaded first in the <head>: URLs that contain stylesheet <link> elements in the <head> which are loaded after a <script> element. CSS blocks rendering, so if it is not the first thing loaded on the page, this will cause the page to render more slowly. All the URLs are affected by this.

Probable Solution – Make style sheets load first in all the URLs.

8. Uncompressed text content resources: URLs that contain text content which has not been compressed. Text compression minimizes the byte size of network responses that include text content. Compressing text content with gzip saves download bytes and reduces bandwidth for the user. All the URLs are affected by this.

Probable Solution – Compress the text content with gzip to reduce the bandwidth.

9. Server response too slow with a Time-to-First-Byte greater than 600ms: URLs that had a Time-to-First-Byte (TTFB) greater than 600ms. TTFB is a measure of how long it takes to receive data from the server, and a high TTFB is a cause of slow page load. 412 URLs are affected by this.

Probable Solution – Reduce the TTFB for the affected set of URLs.

To view the affected URLs - Click here


Mobile First Indexing
Google has now started to roll out mobile-first indexing to various domains, as a result of which search results will now depend on the mobile version of the website. So, it is better to have a separate mobile version of the site for better crawling on mobiles.

Mobile Usability errors logged in Search Console (Webmaster):

1. Clickable elements too close together: Currently, 170 URLs are affected by this issue on mobile devices. This needs to be resolved for better usability on mobile devices.

To view the affected URLs – Click here


2. Content wider than screen: Currently, 170 URLs are affected by this issue on mobile devices. This needs to be resolved for better visibility on mobile devices.

To view the affected URLs - Click here

3. Text too small to read: Currently, 2 URLs are affected by this issue on mobile devices. This needs to be resolved for better readability on mobile devices.

To view the affected URLs - Click here

4. Viewport not set: Currently, 2 URLs are affected by this issue on mobile devices. This needs to be resolved for better usability on mobile devices.

To view the affected URLs - Click here

Miscellaneous Pending Tasks:

1. Plus size sitemap update
2. Launch pending/updated Product Pages
3. Product filter optimization
4. Image compression to improve page speed for mobile
5. Add to Cart CTA needs to be implemented
6. Footer content placement
7. Size filter update
8. Title and Meta Description update
9. Reviews and rating facilities for each product
10. Blog site to be changed from HTTP to HTTPS, the more secure version

Website Platform Suggestion: Currently, the website is built on the Magento platform. Given recent trends in e-commerce website platforms, it is suggested to adopt a React- or Angular-based front end for better website functionality and improvements.

