An isolated environment for exploring how webpack configuration works when compressing assets with various algorithms, and how these compression plugins interact with the rest of a webpack setup and its other plugins.
- Fork this project in the GitHub UI
- Clone your forked copy: `git clone https://github.com/[YOUR-USERNAME]/webpack-compression-exploration`
- Change into the directory: `cd webpack-compression-exploration`
- Use node version 10.8.0 and have npm installed
- Run `npm install` to add module dependencies
- Build the bundle in one terminal window with `npm run build`
- Open the server in a second terminal window with `npm start`
- Your default browser will open the application at `localhost:8080`
- When finished with the application, close both processes with `^c`
- Understand how to set up a project correctly for webpack to compress bundle assets
- Know how webpack plugins work and where in the webpack process they hook in
- Explore how a range of plugins play together
- Look at how compression assists with web performance (and the trade-offs involved) and caching strategies.
As a developer
So that I can implement asset compression and caching strategies
I need an isolated project to build a test case
- Cloned a small dummy React project with very basic webpack config in place.
- Explored suggestions for compression from Google, with the aim of enabling long-term caching.
- Implement the gzip compression algorithm via `compression-webpack-plugin` (config sketch after this item). Results:
- `dist/main.js` 123 KiB
- `dist/main.js.gz` 39.1 KiB
- a roughly 68% decrease in overall `.js` file size following gzip for JavaScript (the uncompressed file is ~214% larger than the compressed one).
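A minimal sketch of the config for this step, assuming `compression-webpack-plugin` (option names vary between plugin versions, so check the version installed):

```js
// webpack.config.js (sketch): gzip each emitted asset alongside the original.
// Only the plugins array is shown; entry/output/loaders are unchanged.
const CompressionPlugin = require('compression-webpack-plugin');

module.exports = {
  // ...existing entry/output/module config...
  plugins: [
    new CompressionPlugin({
      algorithm: 'gzip',             // the plugin's default algorithm
      test: /\.(js|css|html|svg)$/,  // which emitted assets to compress
    }),
  ],
};
```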
- Alternative compression algorithm: why Brotli? Based on stats listed by Google, it is:
- 14% smaller than gzip for JavaScript
- 21% smaller than gzip for HTML
- 17% smaller than gzip for CSS
- Implement brotli compression via `brotli-webpack-plugin` (config sketch after this item). Results:
- `dist/main.js` 123 KiB
- `dist/main.js.br` 34.3 KiB
- `dist/main.js.gz` 39.1 KiB
The difference between the brotli file and the gzip file is not dramatic, but the brotli asset is about 12.3% smaller than the gzip-compressed version of the same asset that the webpack gzip plugin emitted (equivalently, the gzip file is ~14% larger than the brotli one).
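A sketch of the plugin setup, using the options shown in the `brotli-webpack-plugin` README:

```js
// webpack.config.js (sketch): emit .br files next to the originals.
const BrotliPlugin = require('brotli-webpack-plugin');

module.exports = {
  // ...existing config...
  plugins: [
    new BrotliPlugin({
      asset: '[path].br[query]',     // name of the emitted compressed asset
      test: /\.(js|css|html|svg)$/,  // assets to compress
      threshold: 10240,              // skip assets smaller than 10 KiB
      minRatio: 0.8,                 // only emit if compressed/original <= 0.8
    }),
  ],
};
```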
- Bump node version to >=11.7.0
- Use native support for brotli by implementing an instance of `compression-webpack-plugin` with the brotli compression algorithm (config sketch after the note below). Results:
- `dist/main.js` 123 KiB
- `dist/main.js.br` 34.3 KiB
- `dist/main.js.gz` 39.1 KiB
NOTE: You need to include the `filename` property in the `compression-webpack-plugin` options object when using `algorithm: 'brotliCompress'`. This isn't needed for gzip, but without it the `.br` file is not emitted.
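A sketch of the native approach, including the `filename` property from the note above. The `compressionOptions` shape assumes the options are passed straight through to `zlib.brotliCompress`; confirm against the plugin version in use:

```js
// webpack.config.js (sketch): brotli via node's built-in zlib (node >= 11.7.0).
const zlib = require('zlib');
const CompressionPlugin = require('compression-webpack-plugin');

module.exports = {
  // ...existing config...
  plugins: [
    new CompressionPlugin({
      filename: '[path].br[query]',  // required for the .br files to be emitted
      algorithm: 'brotliCompress',   // delegates to zlib's native brotli
      test: /\.(js|css|html|svg)$/,
      compressionOptions: {
        // forwarded to zlib.brotliCompress; 11 is the maximum quality
        params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 },
      },
    }),
  ],
};
```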
- The file size of the compressed output is the same when using the separate brotli plugin vs native support for brotli. If using node >=11.7.0 the latter is preferable, as it is one less development dependency to support.
- With "on the fly" compression at the server or CDN level you need to carefully consider which compression algorithm and how high to set the compression level for each asset type because high compression rates (especially with Brotli) takes time and results in latency. The end user thus waits longer for the assets to be delivered to the browser. This is still an issue for dynamically generated content (e.g. news stories)
- Static compression of assets at build lengthens the overall build time but removes the above debate. For static content this seems like a natural choice.
- When you handle the compression yourself (rather than a server library or CDN configuration performing "on the fly" compression) you need to be sure to set the `Content-Encoding` and `Content-Type` headers correctly so the receiving client can process the file: "When present, the Content-Encoding value indicates which encodings were applied to the entity-body. It lets the client know how to decode in order to obtain the media-type referenced by the Content-Type header."
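As an illustration only (this server is not part of the project), a minimal Node sketch that serves the pre-compressed bundle with both headers set, naively negotiating on the client's `Accept-Encoding`:

```js
// serve.js (hypothetical): pick the best pre-compressed variant of main.js.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  const accepts = req.headers['accept-encoding'] || '';
  // Naive substring check, fine for a sketch; real negotiation handles q-values.
  const ext = accepts.includes('br') ? '.br'
            : accepts.includes('gzip') ? '.gz'
            : '';
  if (ext) {
    // Content-Encoding: how the body is encoded on the wire...
    res.setHeader('Content-Encoding', ext === '.br' ? 'br' : 'gzip');
  }
  // ...Content-Type: the media type of the decompressed entity.
  res.setHeader('Content-Type', 'application/javascript');
  fs.createReadStream(`./dist/main.js${ext}`).pipe(res);
}).listen(8081);
```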
- What metadata is attached to the compressed assets? To send a file to the browser it needs to know what the `Content-Type` property is. To try to answer this and mirror the deploy system at work, I manually added the emitted `/dist` files to an S3 bucket and reviewed the metadata:
  - `main.js` => `Content-Type: application/javascript`
  - `main.js.gz` => `Content-Type: application/x-gzip`
  - `main.js.br` => `Content-Type: binary/octet-stream`
- How do you programmatically change the `.gz` and `.br` files to have `Content-Type: application/javascript`? Is this needed? Yes. To serve the statically compressed assets, they need to be uploaded to an AWS S3 bucket and have the `Content-Type` property correctly set to represent the decompressed state of the file (see the upload sketch at the end of this section). Considerations:
- Would you want the upload process to run in all environments?
- AWS provide an SDK as well as a CLI.
- Should this be a Bash script or a JavaScript process? What are the pros and cons of each? The intention is for it to run in CircleCI.
- What is the correct `Content-Type` for a source map? It seems not to matter for the dev tools.
- Is a default of `Content-Type: application/octet-stream` safe? Are there edge cases where this causes problems?
- Do you need to handle the `Content-Type` and `Content-Encoding` properties separately (e.g. the former at upload and the latter at the CDN when sending the response to the client)? An AWS S3 object can hold both properties:
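A hypothetical upload sketch using the AWS SDK for JavaScript (mentioned above) to set both properties on the S3 object; the bucket name is a placeholder:

```js
// upload.js (sketch): push a compressed bundle to S3 with explicit metadata.
const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();

s3.putObject({
  Bucket: 'your-bucket-name',             // placeholder
  Key: 'main.js.br',
  Body: fs.readFileSync('dist/main.js.br'),
  ContentType: 'application/javascript',  // media type of the decompressed file
  ContentEncoding: 'br',                  // encoding applied to the body
}, (err) => {
  if (err) throw err;
  console.log('Uploaded main.js.br with Content-Type and Content-Encoding set');
});
```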