Tuesday, 11 November 2025
Over the past year, I've stumbled, experimented, broken things, and gradually modernised the frontend build pipeline for this blog. This wasn't some masterplan executed flawlessly—it was trial and error, lots of error, and stubborn persistence driven by a clear vision of what I wanted to achieve.
I'm not a frontend guru or a Webpack wizard. I'm a .NET developer who got frustrated with the limitations of CDN-based dependencies and decided there had to be a better way. This article documents the messy, iterative journey from simple <script> tags pointing to unpkg to a modern bundling pipeline that actually works.
I made plenty of mistakes along the way (which I'll document), spent hours debugging cryptic Webpack errors, and probably reinstalled node_modules more times than I care to admit. But persistence paid off, and I learned a tremendous amount through sheer determination to get this right.
If you're a .NET developer staring at Webpack configuration files wondering what on earth you've gotten yourself into—this article is for you.
I'm an old web performance nutter. In the early days of dial-up, the first site I built for money (that wasn't a porn backend in Perl... a story for another time) was a snow-conditions site that spun off from an automated phone line.
In those days the concerns were quite different. JavaScript was practically unknown, even divs were RARE (IE had div while Netscape had layer) and you'd be LUCKY if your users had a 56K connection. So I learned to optimise everything, down to things like tiled image backgrounds for menu items.
Later at Microsoft I even worked on an automated image sprite tool in Web Forms - it was awesome: it extracted small images from a page, generated CSS and a sprite image automatically. But alas, like my time at Microsoft, it was not to be.
In short, it's an obsession that I've carried my entire career. Nobody likes a slow website!
To be fair, CDNs aren't inherently evil.
Full disclosure: This blog is deliberately overengineered. It doesn't need all this complexity. But building it this way let me learn modern frontend tooling properly and, crucially, gave me something real to write about and teach. Sometimes the best way to learn is to build something slightly ridiculous and document the journey.
Initially, my approach was straightforward:
<!-- Old CDN-based approach -->
<script src="https://unpkg.com/alpinejs@3.x.x/dist/cdn.min.js"></script>
<script src="https://unpkg.com/htmx.org@2.x.x"></script>
<script src="https://cdn.jsdelivr.net/npm/easymde/dist/easymde.min.js"></script>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/easymde/dist/easymde.min.css">
Whilst this works, it has several drawbacks that became increasingly frustrating:
- No control over exactly what ships: you get the whole library, with no tree-shaking
- Loading order depends on several third-party servers responding promptly
- Any CDN outage or slowdown degrades the whole site
So I wanted control... control of exactly what my site used and NEEDED to work. I wanted to bundle only what I needed, ensure reliable loading order, and optimise for performance. This meant moving away from CDNs and towards a bundling approach.
CDNs are simpler to set up, and with HTTP/2 and HTTP/3 multiplexing, the overhead of multiple requests matters much less than it used to. For many projects, especially small ones or prototypes, CDNs remain a perfectly sensible choice. But for a production site where I wanted control, reliability, and optimisation, bundling made more sense.
My current setup bundles all JavaScript dependencies through Webpack and processes CSS through PostCSS.
Before diving into the technical details, I should address the elephant in the room: why am I using Webpack when Vite, Rollup, esbuild, and other modern bundlers exist?
The honest answer is simple: I already knew Webpack.
I'd used Webpack extensively when teaching my "Beginning Web Development" course, and it was the tool I understood. When I decided to modernise this blog's build pipeline, I was already facing a steep learning curve—understanding tree-shaking, code splitting, module systems, PostCSS pipelines, and how to integrate all of this with ASP.NET Core.
It's generally good practice when working on projects to limit the 'new' to the 'manageable'. It's WAY easier to limit uncertainty that way.
Adding "learn an entirely new build tool" on top of that seemed unnecessary. Webpack works. It's mature. It has excellent documentation and a massive ecosystem. Most importantly, I could focus my learning energy on the concepts of modern bundling rather than the idiosyncrasies of a particular tool.
Is Vite faster? Absolutely. Vite's development server with native ES modules and esbuild-powered bundling is significantly faster than Webpack. For large projects with hundreds of modules, the difference is dramatic.
Should you use Vite for a new project? Probably, yes. If you're starting fresh and don't have existing Webpack knowledge, Vite is likely the better choice. It's faster, simpler to configure, and represents the modern approach to frontend tooling.
Do I regret using Webpack? Not at all. It got me to where I needed to be. The lessons I learned about bundling, code splitting, and optimisation are transferable to any build tool. And honestly, for this blog's scale, the performance difference between Webpack and Vite is negligible—we're talking milliseconds in development rebuild times.
It's a key aspect of how I build stuff; start with what's EASY and build from that foundation.
The broader lesson here is that progress beats perfection. I could have spent weeks researching the "best" bundler, comparing benchmarks, reading comparison articles, and agonising over the decision. Instead, I picked the tool I knew, got it working, and moved forward. That pragmatism kept me shipping rather than endlessly deliberating.
Maybe one day I might migrate to Vite. Maybe I won't. Either way, this blog has a modern, optimised build pipeline that works reliably—and that's what matters.
The package.json now maintains two distinct dependency groups. This structure took me several attempts to get right—NuGet and npm are KINDA similar, but npm is a lot less friendly when adding packages.
{
"dependencies": {
//NOTE - This is a pre-release of these enhancements, the 1.0.0 release is OUT NOW!
"@mostlylucid/mermaid-enhancements": "^1.0.0-alpha0",
"alpinejs": "^3.14.1",
"codemirror": "5.65.13",
"core-js": "^3.39.0",
"easymde": "2.20.0",
"flatpickr": "^4.6.13",
"highlight.js": "^11.10.0",
"highlightjs-cshtml-razor": "^2.1.1",
"html-to-image": "^1.11.13",
"htmx.org": "^2.0.1",
"mermaid": "^11.0.2",
"regenerator-runtime": "^0.14.1",
"svg-pan-zoom": "^3.6.2"
},
//These are just used for build; not needed to RUN the app so we have a separate place for 'em
"devDependencies": {
"@babel/core": "7.26.9",
"@babel/preset-env": "7.26.9",
"@tailwindcss/aspect-ratio": "^0.4.2",
"@tailwindcss/forms": "^0.5.7",
"@tailwindcss/typography": "^0.5.12",
"autoprefixer": "10.4.21",
"babel-loader": "10.0.0",
"cpx": "1.5.0",
"css-loader": "7.1.2",
"cssnano": "7.0.6",
"daisyui": "^4.12.10",
"mini-css-extract-plugin": "^2.9.4",
"npm-run-all": "4.1.5",
"postcss": "8.5.3",
"postcss-cli": "11.0.1",
"postcss-import": "^16.1.0",
"rimraf": "6.0.1",
"style-loader": "4.0.0",
"tailwindcss": "3.4.17",
"terser-webpack-plugin": "^5.3.10",
"webpack": "^5.91.0",
"webpack-cli": "^5.1.4"
}
}
Dependencies are libraries needed at runtime (Alpine.js, HTMX, EasyMDE, etc.), whilst devDependencies are build tools (Webpack, Babel, PostCSS processors, etc.).
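One practical consequence of the split: anything the browser bundle imports must live in dependencies, never devDependencies. A toy sanity check (a hypothetical helper with inline data, not part of the blog's build) sketches the rule:

```javascript
// Nothing imported by the runtime bundle should live in devDependencies.
const pkg = {
  dependencies: { 'alpinejs': '^3.14.1', 'htmx.org': '^2.0.1' },
  devDependencies: { webpack: '^5.91.0', tailwindcss: '3.4.17' },
};

const runtimeImports = ['alpinejs', 'htmx.org']; // what main.js pulls in

// Anything in this list is a packaging mistake: it would break `npm ci --omit=dev`.
const misplaced = runtimeImports.filter(name => name in pkg.devDependencies);
```

If `misplaced` is ever non-empty, the fix is to move that package into dependencies.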
The package.json scripts have evolved significantly:
{
"scripts": {
"clean": "rimraf ./.tmp ./wwwroot/css/dist ./wwwroot/js/dist",
"copy:static": "cpx \"src/css/{raleway,easymde-overrides}.css\" \"wwwroot/css/dist\"",
"copy:highlight": "cpx \"src/css/highlight/*.min.css\" \"wwwroot/css/highlight\"",
"copy:all": "npm-run-all --parallel copy:static copy:highlight",
"copy:watch": "cpx \"src/css/{raleway,easymde-overrides}.css\" \"wwwroot/css/dist\" --watch & cpx \"src/css/highlight/*.min.css\" \"wwwroot/css/highlight\" --watch",
"tw:dev": "postcss ./src/css/main.css -o ./wwwroot/css/dist/main.css",
"tw:prod": "postcss ./src/css/main.css -o ./wwwroot/css/dist/main.css --no-map --verbose",
"tw:watch": "postcss ./src/css/main.css -o ./wwwroot/css/dist/main.css --watch",
"js:dev": "webpack --env development",
"js:prod": "webpack --mode production",
"js:watch": "webpack watch --mode development",
"dev": "npm-run-all clean --parallel copy:all tw:dev js:dev",
"watch": "npm-run-all clean copy:all --parallel copy:watch tw:watch js:watch",
"build": "npm-run-all clean copy:all --parallel tw:prod js:prod"
}
}
This setup separates concerns:
- rimraf removes the previous build output before each build
- cpx copies static CSS files that need no processing
- The npm-run-all package enables parallel execution for faster builds
You can set npm run build to run automatically during your local build, but it's a bit of a pain for CI (you need to make sure it's disabled there, etc.).
<Project Sdk="Microsoft.NET.Sdk.Web">
<PropertyGroup>
<TargetFramework>net8.0</TargetFramework>
<SpaRoot>ClientApp\</SpaRoot>
</PropertyGroup>
<!-- Run npm install only when package.json changes -->
<Target Name="NpmInstall" Inputs="$(SpaRoot)package.json" Outputs="$(SpaRoot)node_modules" BeforeTargets="Build">
<Message Importance="high" Text="Running npm install in $(SpaRoot)" />
<Exec WorkingDirectory="$(SpaRoot)" Command="npm ci" />
</Target>
<!-- Run npm build before the .NET build -->
<Target Name="NpmBuild" DependsOnTargets="NpmInstall" BeforeTargets="Build">
<Message Importance="high" Text="Running npm run build in $(SpaRoot)" />
<Exec WorkingDirectory="$(SpaRoot)" Command="npm run build" />
</Target>
</Project>
To do it, just add the targets above to your csproj. You can even restrict them to Debug builds, but I find that messy; I prefer to run the build manually. npm run watch does just that: when any CSS/JS file changes, it re-runs the build.
NOTE: In the JS world they almost always use hot-reload dev servers, which is pretty slick and makes you kinda hate ASP.NET Core's weak attempt at the same.
The Webpack configuration is where the magic happens—and where I spent most of my time troubleshooting. This didn't spring into existence fully formed. It's the result of countless iterations, Stack Overflow searches, and reading through Webpack documentation at 2am trying to understand why my build was generating 47 chunk files.
Here's the current configuration (from webpack.config.js) with detailed explanations of what each part does and why it's there:
const TerserPlugin = require('terser-webpack-plugin');
const path = require('path');
module.exports = (env, argv) => {
const isProduction = argv.mode === 'production';
return {
mode: isProduction ? 'production' : 'development',
entry: {
main: './src/js/main.js',
},
output: {
filename: '[name].js',
chunkFilename: '[name].[contenthash].js',
path: path.resolve(__dirname, 'wwwroot/js/dist'),
publicPath: '/js/dist/',
module: true,
clean: true,
},
experiments: {
outputModule: true,
},
module: {
rules: [
{
test: /\.css$/i,
use: ['style-loader', 'css-loader'],
},
{
test: /\.js$/,
exclude: /node_modules/,
use: {
loader: 'babel-loader',
options: {
presets: [
['@babel/preset-env', {
targets: '> 0.25%, not dead',
modules: false,
useBuiltIns: 'usage',
corejs: 3,
}],
],
},
},
},
],
},
resolve: {
extensions: ['.js', '.mjs'],
alias: {
'@mostlylucid/mermaid-enhancements$': path.resolve(__dirname, 'node_modules/@mostlylucid/mermaid-enhancements/dist/index.min.js')
}
},
optimization: {
splitChunks: {
chunks: 'all',
minSize: 20000,
maxSize: 100000,
name: false,
},
runtimeChunk: {
name: 'runtime',
},
minimize: isProduction,
minimizer: isProduction ? [
new TerserPlugin({
terserOptions: {
ecma: 2020,
compress: {
drop_console: true,
passes: 3,
toplevel: true,
pure_funcs: ['console.info', 'console.debug'],
},
mangle: {
toplevel: true,
},
format: {
comments: false,
},
},
extractComments: true,
}),
] : [],
},
devtool: isProduction ? false : 'eval-source-map',
performance: {
hints: isProduction ? 'warning' : false,
}
};
};
optimization: {
splitChunks: {
chunks: 'all',
minSize: 20000,
maxSize: 100000,
name: false,
},
runtimeChunk: {
name: 'runtime',
},
}
This configuration automatically splits your code into smaller chunks: shared code is pulled out into its own files (respecting the 20KB minSize and 100KB maxSize thresholds), and the Webpack runtime is extracted into a separate runtime chunk so vendor chunk hashes stay stable between builds.
In practice, this generates multiple files in wwwroot/js/dist/:
- main.js - Your application entry point
- runtime.js - Webpack module loading logic
- [vendor].[contenthash].js - Automatically split vendor chunks

{
test: /\.js$/,
exclude: /node_modules/,
use: {
loader: 'babel-loader',
options: {
presets: [
['@babel/preset-env', {
targets: '> 0.25%, not dead',
modules: false,
useBuiltIns: 'usage',
corejs: 3,
}],
],
},
},
}
Babel transpiles modern JavaScript to support older browsers: preset-env targets browsers with more than 0.25% usage share, and useBuiltIns: 'usage' pulls in only the core-js polyfills my code actually needs.
This means I can write modern JavaScript (async/await, optional chaining, nullish coalescing) whilst maintaining broad browser support.
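For instance, this is the kind of syntax I can now write freely; Babel rewrites it for the configured browser targets (a small illustrative snippet, not taken from the blog's actual source):

```javascript
// Optional chaining (?.) and nullish coalescing (??) are modern syntax
// that @babel/preset-env transpiles away for older browser targets.
const user = { profile: { name: 'Scott' } };

const name = user.profile?.name ?? 'anonymous';  // 'Scott'
const theme = user.settings?.theme ?? 'light';   // no settings, falls back to 'light'
```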
minimizer: isProduction ? [
new TerserPlugin({
terserOptions: {
ecma: 2020,
compress: {
drop_console: true,
passes: 3,
toplevel: true,
pure_funcs: ['console.info', 'console.debug'],
},
mangle: {
toplevel: true,
},
format: {
comments: false,
},
},
extractComments: true,
}),
] : []
Terser aggressively minifies production builds:
- Drops all console.log() statements (drop_console: true), plus console.info and console.debug via pure_funcs
- Mangles top-level names and strips comments

On this blog, this typically reduces JavaScript bundle sizes by 60-70% compared to unminified code.
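To make those Terser options concrete, here's the kind of code they act on (illustrative only; this isn't the minifier itself):

```javascript
// With drop_console: true, the console.log call disappears from the
// production bundle; console.debug is also erasable because it is listed
// in pure_funcs. The actual computation is kept.
function doubleWithLogging(x) {
  console.log('input was', x);      // stripped in production
  console.debug('about to double'); // stripped (pure_funcs)
  return x * 2;                     // kept: observable behaviour
}
```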
Tailwind CSS is processed through PostCSS with several plugins. This configuration (from postcss.config.js) is mercifully simple compared to Webpack:
// postcss.config.js
module.exports = {
plugins: {
'postcss-import': {},
tailwindcss: {},
autoprefixer: {},
cssnano: { preset: 'default' }
}
}
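Conceptually, each plugin in that list is just a CSS-to-CSS transform applied in order. A toy model (stand-in functions, not the real plugins) shows the composition:

```javascript
// Toy stand-ins for postcss-import and cssnano, purely to illustrate that
// the pipeline is ordered function composition over a CSS string.
const inlineImports = css => css.replace('@import "reset.css";', 'body{margin:0;}');
const minify = css => css.replace(/\s+/g, '');

const pipeline = [inlineImports, minify];
const run = css => pipeline.reduce((out, step) => step(out), css);

const result = run('@import "reset.css";\nh1 { color: red; }');
// result: 'body{margin:0;}h1{color:red;}'
```

The real pipeline works the same way, with Tailwind and Autoprefixer slotted in between.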
The pipeline works as follows:
- postcss-import inlines @import statements in CSS files
- tailwindcss generates the utility classes
- autoprefixer adds vendor prefixes
- cssnano minifies the result

The tailwind.config.js file specifies where Tailwind should look for class names. Getting the content paths right was critical—miss a path and Tailwind won't generate classes for those files:
module.exports = {
content: ["./Views/**/*.cshtml", "./EmailSubscription/**/*.cshtml"],
safelist: ["dark", "light"],
darkMode: "class",
theme: {
fontFamily: {
body: ["Raleway", "sans-serif"],
},
extend: {
colors: {
"custom-light-bg": "#ffffff",
"custom-dark-bg": "#1d232a",
primary: "#072344",
secondary: "#00aaa1",
// ... more custom colours
},
},
},
plugins: [
require("@tailwindcss/aspect-ratio"),
require("@tailwindcss/typography"),
require("daisyui"),
],
};
Key features:
- content scans .cshtml files for class names (including Razor views and email templates)
- safelist keeps the dark and light classes that are toggled at runtime
- darkMode: "class" enables class-based theme switching

Tailwind scans these files at build time, extracting only the utility classes you actually use. This is why the final CSS file is much smaller than the full Tailwind library.
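Why dynamically built class names get purged becomes obvious if you model the scanner as dumb text matching (a toy regex here, far simpler than Tailwind's real extractor):

```javascript
// Tailwind never executes your code; it just extracts class-like tokens
// from the source text. A concatenated name never appears literally, so
// it gets purged unless you safelist it.
const toyExtract = source => source.match(/[A-Za-z0-9:_/-]+/g) ?? [];

const src = 'el.className = `bg-${colour}-500`;';
const tokens = toyExtract(src);
// 'bg-blue-500' never appears literally in the source, so the scanner can't find it
```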
The main.js file is where everything comes together. This file imports and initialises all dependencies, and it's grown organically as I've added features to the blog:
// src/js/main.js
import hljsRazor from "highlightjs-cshtml-razor";
import mermaid from "mermaid";
import Alpine from 'alpinejs';
import htmx from "htmx.org";
import hljs from "highlight.js";
import EasyMDE from "easymde";
import 'easymde/dist/easymde.min.css';
import flatpickr from "flatpickr";
import 'flatpickr/dist/flatpickr.min.css';
// Expose libraries globally for Razor views
window.Alpine = Alpine;
window.hljs = hljs;
window.htmx = htmx;
window.mermaid = mermaid;
window.EasyMDE = EasyMDE;
window.flatpickr = flatpickr;
// Import custom modules
import { typeahead } from "./typeahead";
import { submitTranslation, viewTranslation } from "./translations";
import { codeeditor } from "./simplemde_editor";
import { globalSetup } from "./global";
import { comments } from "./comments";
// Attach to namespace
window.mostlylucid = window.mostlylucid || {};
window.mostlylucid.typeahead = typeahead;
window.mostlylucid.comments = comments();
window.mostlylucid.translations = {
submitTranslation: submitTranslation,
viewTranslation: viewTranslation
};
// Initialise Alpine
Alpine.start();
This approach offers several benefits:
- A single bundled entry point replaces a pile of script tags
- Libraries exposed on window remain usable from inline scripts in Razor views
- Custom modules live under one mostlylucid namespace

In _Layout.cshtml, I now reference only bundled assets:
<head>
<link href="/css/dist/main.css" asp-append-version="true" rel="stylesheet" />
</head>
<body>
<!-- Content -->
<script src="~/js/dist/main.js" type="module" asp-append-version="true"></script>
</body>
Note that I specify type="module" to let the browser know what kind of JS file this is, and asp-append-version, a neat little ASP.NET tag helper that appends a hash of the file contents to the query string, effectively cache-busting whenever the file changes.
The only remaining CDN dependencies are:
Here's how the entire build process flows from source files to production assets:
graph TD
A[Source Files] --> B[npm run build]
B --> C[Clean Task]
C --> D[rimraf wwwroot/css/dist wwwroot/js/dist]
B --> E[Copy Task]
E --> F[cpx: Copy static CSS files]
F --> G[wwwroot/css/dist/raleway.css]
F --> H[wwwroot/css/dist/easymde-overrides.css]
B --> I[Tailwind Task]
I --> J[PostCSS Pipeline]
J --> K[postcss-import]
K --> L[Tailwind CSS]
L --> M[Autoprefixer]
M --> N[cssnano]
N --> O[wwwroot/css/dist/main.css]
B --> P[Webpack Task]
P --> Q[Entry: src/js/main.js]
Q --> R[Babel Transpilation]
R --> S[Module Resolution]
S --> T[Tree Shaking]
T --> U[Code Splitting]
U --> V[Terser Minification]
V --> W[wwwroot/js/dist/main.js]
V --> X[wwwroot/js/dist/runtime.js]
V --> Y[wwwroot/js/dist/vendor chunks]
O --> Z[ASP.NET Core Static Files]
W --> Z
X --> Z
Y --> Z
G --> Z
H --> Z
Z --> AA[Browser]
style A stroke:#0ea5e9,stroke-width:3px
style Z stroke:#f59e0b,stroke-width:3px
style AA stroke:#10b981,stroke-width:3px
It looks pretty complex, but really that's just what Webpack does—it handles most of this itself (in a complicated way that the likes of Vite don't need, but still).
Again, performance isn't really the POINT here. But the site does load like an absolute bullet now. You can still hit issues with Cloudflare's 'Rocket Loader' (which mangles page load events somewhat), but it's TIGHT.
Beyond performance, the modern build pipeline significantly improves the development workflow:
Bundling through Webpack enables proper JavaScript module resolution, which means:
WHEN the build breaks (with npm run watch) I know INSTANTLY; combined with the (few but growing) JS unit tests, that means much tighter feedback loops.
During development, npm run watch enables near-instant feedback:
npm run watch
This runs Webpack in watch mode, rebuilding only the modules that changed.
npm's lock file (package-lock.json) ensures consistent builds across environments:
npm ci # Clean install from lock file
This guarantees the same dependency versions in development, CI/CD, and production environments.
This is referred to as 'pinning' in the JS world: the dependency versions are set in stone. You CAN do without a package lock, but then you're depending on package authors (often HUNDREDS of different people) not screwing up some minor release.
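To make the pinning point concrete, here's roughly what a caret range like ^3.14.1 permits (a simplified sketch assuming plain x.y.z versions with major >= 1; real npm semver handles many more cases):

```javascript
// '^3.14.1' means: same major version, and at least 3.14.1.
// An exact version string like '3.14.1' (no prefix) is pinned instead.
function satisfiesCaret(range, version) {
  const [maj, min, pat] = range.slice(1).split('.').map(Number);
  const [vMaj, vMin, vPat] = version.split('.').map(Number);
  if (vMaj !== maj) return false;       // a major bump is never allowed
  if (vMin !== min) return vMin > min;  // any later minor is fine
  return vPat >= pat;                   // same minor: patch must not regress
}
```

So without a lock file, "^3.14.1" can silently pull in 3.15.0 the day it's published; package-lock.json is what nails every transitive version down.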
The organised npm scripts make common tasks easy:
npm run dev # One-time development build
npm run watch # Continuous development builds
npm run build # Production build with optimisations
npm run clean # Remove build artefacts
Some libraries need global exposure for use in Razor views or inline scripts:
// Make library available globally
import Alpine from 'alpinejs';
window.Alpine = Alpine;
Alpine.start();
Then in .cshtml files:
<div x-data="{ open: false }">
<!-- Alpine.js works because it's on window -->
</div>
Some libraries include CSS that needs importing:
import EasyMDE from "easymde";
import 'easymde/dist/easymde.min.css'; // Imported CSS is processed by Webpack
Webpack's css-loader and style-loader handle these imports:
- css-loader resolves the CSS and its url()/@import references
- style-loader injects the result via <style> tags (or it can be extracted to separate CSS files)

For libraries only needed on specific pages, use dynamic imports:
// Only load Mermaid when needed
async function initMermaid() {
const mermaid = await import('mermaid');
mermaid.default.initialize({ startOnLoad: true });
}
// Call when needed
if (document.querySelector('.mermaid')) {
initMermaid();
}
Webpack automatically code-splits dynamic imports into separate chunks, loaded on-demand.
Some older libraries use CommonJS instead of ES6 modules:
// CommonJS require syntax
const hljs = require('highlight.js');
// Or use dynamic import
import('highlight.js').then(hljs => {
// Use hljs
});
Webpack handles both module systems transparently, converting CommonJS to ES6 where needed.
Ensure devtool is configured in webpack.config.js:
devtool: isProduction ? false : 'eval-source-map',
This generates source maps in development for easier debugging.
If Tailwind removes classes you're using, check the content configuration:
// tailwind.config.js
module.exports = {
content: [
'./Views/**/*.cshtml',
'./Components/**/*.cshtml',
// Add any other paths where classes are used
],
}
Alternatively, use the safelist for dynamic classes:
safelist: [
'bg-blue-500',
'text-red-600',
{
pattern: /bg-(red|green|blue)-(400|500|600)/,
}
]
If Webpack can't resolve a module, check:
- Relative imports start with ./ or ../
- Add .mjs to resolve.extensions if needed
- Use resolve.alias for complex paths

resolve: {
extensions: ['.js', '.mjs', '.json'],
alias: {
'@components': path.resolve(__dirname, 'src/js/components/'),
}
}
If builds become slow:
- Use thread-loader for multi-threaded transpilation
- Enable Webpack's filesystem cache:

// Enable Webpack caching
cache: {
type: 'filesystem',
},
Let me share some of the mistakes I made along this journey, so you can avoid them:
My first attempt involved ripping out all CDN references in one go and trying to bundle everything through Webpack. The build broke spectacularly. I learned that incremental migration is your friend—move one library at a time, test thoroughly, then move to the next.
I wasted hours trying to figure out why certain libraries wouldn't load. Turns out, mixing CommonJS (require()) and ES6 modules (import) without understanding how Webpack handles them leads to cryptic errors. The experiments: { outputModule: true } configuration wasn't in my initial setup, and I couldn't work out why my modules weren't loading.
The solution came from reading through GitHub issues on the Webpack repository at midnight.
At one point, my configuration was generating chunks for everything. I had minSize: 10000 (10KB), which meant Webpack was creating separate files for tiny utility functions. Then I over-corrected and went to 2MB... Page load became either a waterfall of 50+ tiny chunk requests or a hang while huge chunks downloaded. I learned that code splitting is good, but you need sensible thresholds. That's why my current config uses minSize: 20000 (20KB).
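The thresholds are easy to reason about as a simple decision rule (a toy model of splitChunks, not Webpack's actual algorithm):

```javascript
// minSize: 20000 and maxSize: 100000 from the config: below minSize a chunk
// isn't worth its own HTTP request; above maxSize it should be split further.
function chunkDecision(bytes, minSize = 20000, maxSize = 100000) {
  if (bytes < minSize) return 'merge into a neighbour';
  if (bytes > maxSize) return 'split further';
  return 'own chunk';
}
```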
Some npm packages distribute ES6 modules with the .mjs extension, which Webpack won't resolve by default. I spent an embarrassingly long time debugging why @mostlylucid/mermaid-enhancements wouldn't load before discovering I needed to add .mjs to resolve.extensions.
Simple fix once you know it, but finding that out? That took time.
Early builds took 30-60 seconds because I hadn't enabled Webpack's filesystem cache. Once I added caching, rebuild times dropped to 2-3 seconds. This was a game-changer for development workflow but took me weeks to discover.
My favourite debugging experience: CSS classes that worked fine in development suddenly disappeared in production. Tailwind was purging them because they were generated dynamically in JavaScript. The solution was the safelist configuration, but only after I'd wasted hours wondering if I was going mad.
I ALSO use 'dummy blocks' (for example in the mostlylucid.pagingtaghelper project), where I sidestep Tailwind config changes by just having a hidden block that lists the classes:
<!--
Preserve Tailwind & DaisyUI classes used in embedded pager views.
Without this, TailwindCSS tree-shaking removes classes from the embedded library views.
-->
<span class="hidden
btn btn-sm btn-active btn-disabled join join-item badge badge-sm select select-bordered select-sm label label-text
px-3 py-2 py-1 text-sm font-medium border rounded whitespace-nowrap cursor-not-allowed
text-gray-700 text-gray-600 text-gray-400 text-white bg-white bg-gray-100 bg-blue-600 border-gray-300 border-blue-600
hover:bg-gray-50 hover:bg-blue-700
dark:bg-gray-800 dark:bg-gray-700 dark:bg-blue-500 dark:text-gray-300 dark:text-gray-500 dark:text-gray-200
dark:border-gray-600 dark:border-blue-500 dark:hover:bg-gray-700 dark:hover:bg-blue-600">
</span>
When Tailwind builds, it scans the cshtml files for classes; any class it doesn't see gets removed. Having this hidden block ensures it sees the classes I need even when they're only used in embedded library views. JS developers commonly add JS/TS files to the content globs for the same reason, so classes referenced in scripts aren't purged.
module.exports = {
content: ["./Views/**/*.cshtml", "./EmailSubscription/**/*.cshtml"],
//^^^^ This is where Tailwind knows to look for classes during the tree-shake.
...
future: {...}
}
As mentioned, the 'Tailwind Approved' approach is to add the classes to a safelist... but I'm an ASP.NET guy, so HTML won.
The one thing I did right was refusing to give up. Every error message was a learning opportunity. Every broken build taught me something new about how these tools work. I had a clear vision—bundled, optimised assets that load fast—and I kept iterating until I got there.
Migrating from CDN-based dependencies to a modern bundling pipeline has been transformative for this blog, but it wasn't easy. The performance improvements are substantial, the developer experience is significantly better, and I have much greater control over optimisation—but it took months of trial and error to get here.
Key takeaways:
For ASP.NET Core developers accustomed to simple CDN includes, this will feel overwhelming at first. That's normal. I felt the same way. Start small, migrate one dependency at a time, and don't be afraid to break things in development. You'll learn more from debugging broken builds than you ever will from reading documentation.
The configuration I've shared here represents months of iteration. Your journey will be different, and that's fine. Use this as a starting point, not gospel. Adapt it to your needs, experiment, and don't be discouraged by failures—they're part of the process.
Is this blog overengineered? Absolutely. Could I have kept using CDNs and spent my time on other things? Sure. But I wouldn't have learned half as much, and I wouldn't have this article to share with you. Sometimes overengineering isn't about the destination—it's about what you learn along the way and being able to teach others from that experience.
If you're still relying on CDNs for your frontend dependencies, I encourage you to give bundling a try. Start with one library. See how it feels. Iterate from there. The ecosystem has matured significantly, and whilst the learning curve is real, the payoff is worth it.
© 2025 Scott Galloway — Unlicense — All content and source code on this site is free to use, copy, modify, and sell.