The error ERR_CONTENT_DECODING_FAILED (sometimes displayed as Error 330 in older Chrome versions) appears when Google Chrome cannot decode the content received from a web server. This typically means the server’s HTTP response headers indicate the content is compressed (usually with gzip), but the actual response body is either not compressed, compressed differently, or corrupted. This article explains the root causes, how to diagnose the issue, and how to fix it across different web server platforms.

Understanding the Error

When a browser sends a request to a web server, it includes an Accept-Encoding header indicating which compression methods it supports:

Accept-Encoding: gzip, deflate, br

If the server decides to compress the response, it includes a Content-Encoding header in the response:

Content-Encoding: gzip

The browser then expects to decompress the response body using the specified algorithm. ERR_CONTENT_DECODING_FAILED occurs when there is a mismatch — the header says gzip but the body is not valid gzip data.
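That consistency check can be sketched in a few lines of Python. Every gzip stream begins with the two-byte magic number 0x1f 0x8b, so comparing the header claim against the first bytes of the body exposes the mismatch (the function name here is illustrative, not part of any browser API):

```python
import gzip

GZIP_MAGIC = b"\x1f\x8b"  # first two bytes of every gzip stream

def encoding_matches_body(content_encoding, body):
    """Return True if a 'Content-Encoding: gzip' header is consistent
    with the response body actually being gzip data."""
    if content_encoding.lower() != "gzip":
        return True  # other encodings are not checked in this sketch
    return body[:2] == GZIP_MAGIC

# Correct pairing: header says gzip, body really is gzip
ok_body = gzip.compress(b"<html>hello</html>")
assert encoding_matches_body("gzip", ok_body)

# The ERR_CONTENT_DECODING_FAILED case: header says gzip, body is plain text
assert not encoding_matches_body("gzip", b"<html>hello</html>")
```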

Common Causes

Double Compression (Most Common)

This is the most frequent cause, especially with CMS platforms like WordPress. It happens when:

  1. A WordPress caching or optimization plugin enables gzip compression at the application level
  2. The web server (IIS, Nginx, Apache) is also configured to compress responses
  3. The server compresses the already-compressed content, producing invalid gzip data
  4. Chrome receives the double-compressed data and fails to decode it
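The failure mode above can be reproduced directly in Python: compress twice, decode once (as the browser does), and the result is another gzip stream rather than the original HTML:

```python
import gzip

original = b"<html><body>Hello</body></html>"

once = gzip.compress(original)   # application/plugin compresses
twice = gzip.compress(once)      # web server compresses again

# The browser sees Content-Encoding: gzip and decompresses exactly ONCE.
decoded_once = gzip.decompress(twice)

# Instead of HTML it gets another gzip stream (magic bytes 1f 8b),
# which cannot be rendered as a page.
assert decoded_once != original
assert decoded_once[:2] == b"\x1f\x8b"
assert gzip.decompress(decoded_once) == original  # a second pass would be needed
```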

Proxy or CDN Interference

A reverse proxy or CDN between the client and origin server may:

  • Strip the compressed content but leave the Content-Encoding: gzip header intact
  • Recompress an already-compressed response
  • Cache a compressed response and serve it with incorrect headers

Corrupted Response

Network issues, truncated responses, or server-side errors can produce a response where the Content-Encoding header is set but the body is incomplete or corrupted.
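A truncated stream fails in the same way the browser experiences it: the gzip header may parse, but decompression cannot reach the end-of-stream marker. A small Python sketch simulating a cut-off transfer:

```python
import gzip
import zlib

body = gzip.compress(b"x" * 10000)
truncated = body[: len(body) // 2]  # simulate a connection dropped mid-transfer

try:
    gzip.decompress(truncated)
    decoded = True
except (EOFError, OSError, zlib.error):
    # Python raises EOFError when the stream ends before the
    # gzip end-of-stream marker; browsers surface the same
    # condition as a decoding failure.
    decoded = False

assert decoded is False
```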

Antivirus or Security Software

Desktop antivirus software that inspects HTTPS traffic can sometimes modify response bodies without updating the content encoding headers, causing the browser to fail decoding.

Diagnosing the Issue

Using Chrome Developer Tools

  1. Press F12 to open Chrome Developer Tools
  2. Go to the Network tab
  3. Reload the page that produces the error
  4. Click on the failed request
  5. Examine the Response Headers:
    • Look for Content-Encoding: gzip
    • Check Transfer-Encoding
  6. Click the Response tab — if the response is garbled or empty, the encoding is wrong


Using curl from the Command Line

# Request with gzip encoding and see response headers
# (note: some servers skip compression for HEAD requests)
curl -H "Accept-Encoding: gzip" -I https://example.com

# Download the response and check if it is valid gzip
curl -H "Accept-Encoding: gzip" -o response.gz https://example.com
file response.gz    # Should report "gzip compressed data"
gunzip response.gz  # Should decompress without errors

# Request without compression to see if the page itself works
curl -v https://example.com

# Or let curl negotiate compression and decompress the response itself
curl --compressed -v https://example.com

Checking with a Different Browser

If the page loads in Firefox or Edge but not in Chrome, the issue may be related to Chrome’s cache or a Chrome-specific extension. If it fails in all browsers, the problem is server-side.
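To reproduce the server-side scenario without a browser at all, the mismatch can be simulated end to end with Python's standard library: a throwaway local server that claims gzip encoding but sends plain text, and a client that fetches the raw bytes and tries to decode them (the handler class name is illustrative):

```python
import gzip
import http.server
import threading
import urllib.request

class LyingHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html>plain, not gzipped</html>"
        self.send_response(200)
        self.send_header("Content-Encoding", "gzip")  # header lies
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)  # body is NOT gzip data

    def log_message(self, *args):  # keep output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), LyingHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    encoding = resp.headers.get("Content-Encoding", "")
    raw = resp.read()  # urllib does not decompress automatically

try:
    gzip.decompress(raw)
    decodable = True
except (EOFError, OSError):
    decodable = False

server.shutdown()
assert encoding == "gzip" and not decodable  # the browser's exact situation
```

This is the condition Chrome reports as ERR_CONTENT_DECODING_FAILED: the header promises gzip, the bytes are not gzip.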

Fixing the Issue on IIS

Disable Double Compression

If a WordPress plugin or PHP application handles compression, disable IIS dynamic compression:

Using IIS Manager:

  1. Open IIS Manager
  2. Select the site or server node
  3. Double-click Compression
  4. Uncheck Enable dynamic content compression
  5. Click Apply

Using web.config:

<configuration>
  <system.webServer>
    <urlCompression doStaticCompression="true"
                    doDynamicCompression="false" />
  </system.webServer>
</configuration>

Or Let IIS Handle All Compression

Alternatively, disable compression in the application and let IIS manage it:

  1. Disable gzip in WordPress plugins (W3 Total Cache, WP Super Cache, etc.)
  2. Enable both static and dynamic compression in IIS:
<configuration>
  <system.webServer>
    <urlCompression doStaticCompression="true"
                    doDynamicCompression="true" />
    <httpCompression>
      <dynamicTypes>
        <add mimeType="text/*" enabled="true" />
        <add mimeType="application/javascript" enabled="true" />
        <add mimeType="application/json" enabled="true" />
        <add mimeType="*/*" enabled="false" />
      </dynamicTypes>
      <staticTypes>
        <add mimeType="text/*" enabled="true" />
        <add mimeType="application/javascript" enabled="true" />
        <add mimeType="*/*" enabled="false" />
      </staticTypes>
    </httpCompression>
  </system.webServer>
</configuration>

Fixing the Issue on Nginx

Verify gzip Configuration

Check your Nginx configuration for proper gzip settings:

http {
    gzip on;
    gzip_vary on;
    gzip_proxied any;
    gzip_comp_level 6;
    gzip_min_length 1000;

    gzip_types
        text/plain
        text/css
        text/xml
        text/javascript
        application/json
        application/javascript
        application/xml
        application/xml+rss
        application/x-javascript
        image/svg+xml;
}

Avoid Compressing Already-Compressed Content

If your application (PHP, Node.js) sends pre-compressed responses, tell Nginx not to compress them again:

location ~ \.php$ {
    # Pass through without additional compression
    gzip off;
    proxy_pass http://backend;
}

Or use gzip_proxied to control when Nginx compresses proxied responses:

# gzip_proxied controls when proxied responses are compressed at all,
# based on their Cache-Control headers; note that Nginx already skips
# responses that arrive with a Content-Encoding header set
gzip_proxied no-cache no-store private expired auth;

Fixing the Issue on Apache

Check for Conflicting mod_deflate Configuration

# In .htaccess or httpd.conf
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/xml
    AddOutputFilterByType DEFLATE text/css text/javascript
    AddOutputFilterByType DEFLATE application/json application/javascript
    AddOutputFilterByType DEFLATE application/xml application/xhtml+xml

    # Do not compress images or already-compressed files
    SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png|gz|zip|bz2)$ no-gzip
</IfModule>

If a PHP plugin is also compressing output, either disable mod_deflate or disable the plugin’s compression.

WordPress-Specific Solutions

Since this error frequently occurs with WordPress setups, here are targeted fixes:

W3 Total Cache

  1. Go to Performance > Browser Cache
  2. Under HTTP compression, uncheck Enable HTTP (gzip) compression
  3. Save settings
  4. Let the web server handle compression instead

WP Super Cache

  1. Go to Settings > WP Super Cache > Advanced
  2. Uncheck Compress pages so they’re served more quickly to visitors
  3. Save settings

wp-config.php Override

You can add this to wp-config.php to prevent PHP from compressing WordPress output at the zlib level:

// Disable PHP's zlib output compression
@ini_set('zlib.output_compression', 'Off');

Client-Side Troubleshooting

If you are experiencing this error as a user (not a server administrator):

Clear the Browser Cache

  1. Press Ctrl+Shift+Delete in Chrome
  2. Select Cached images and files
  3. Set the time range to All time
  4. Click Clear data

Disable Browser Extensions

Some extensions (ad blockers, privacy tools) can interfere with response handling:

  1. Open chrome://extensions/
  2. Disable all extensions
  3. Reload the failing page
  4. If it works, re-enable extensions one by one to find the culprit

Check Antivirus HTTPS Scanning

If your antivirus scans HTTPS traffic, try temporarily disabling that feature to see if it resolves the error.

Flush DNS Cache

On Windows:

ipconfig /flushdns

On macOS:

sudo dscacheutil -flushcache
sudo killall -HUP mDNSResponder

Summary

The ERR_CONTENT_DECODING_FAILED error in Chrome is caused by a mismatch between the Content-Encoding header and the actual response body. The most common cause is double compression, where both a CMS plugin and the web server attempt to gzip the response. The fix is straightforward: ensure only one layer handles compression. Use Chrome Developer Tools and curl to diagnose which layer is causing the conflict, then adjust your IIS, Nginx, or Apache configuration accordingly. For WordPress specifically, disable compression in caching plugins and let the web server handle it.