Bundle Analysis, Code Splitting, and Lazy Loading

Runtime profiling finds work you already shipped. Bundle analysis finds work you never should have sent over the wire in the first place. For React apps written in TypeScript, the two lenses combine: a fast reconciler cannot outrun a multi-megabyte main chunk that delays parsing, competes with LCP, and pushes INP higher by postponing interactivity. In 2026, with automatic memoization increasingly handled by the React Compiler, shrinking and slicing bundles remains one of the most reliable human-led wins—because it reduces fixed costs the framework cannot optimize away.

This section shows how to visualize what your bundler emits, how to split code along route and feature boundaries with React.lazy and Suspense, and how to defer heavy third-party modules until a user actually needs them. It closes with dependency hygiene and CI guardrails so performance does not regress quietly.

See the bundle before you argue about it

Opinions about “large dependencies” are cheap; treemaps are expensive to ignore. For Vite, rollup-plugin-visualizer emits an interactive stats.html that maps chunk size, gzip size, and module inclusion. Install it as a dev dependency, wire it into vite.config.ts, and run a production build.

npm install --save-dev rollup-plugin-visualizer

import { defineConfig } from "vite";
import { visualizer } from "rollup-plugin-visualizer";

export default defineConfig({
  plugins: [
    visualizer({
      filename: "stats.html",
      open: true,
      gzipSize: true,
    }),
  ],
});

After npm run build, open the report and look for rectangles that dominate the area: charting libraries, date utilities imported too eagerly, icon packs pulled in wholesale, and duplicated helpers across chunks. The goal is not zero bytes; it is awareness—know which imports move the needle before you rewrite working code.

Webpack ecosystems have parallel tools (webpack-bundle-analyzer); the workflow is the same: build for production, inspect, then decide whether to split, replace, or dynamically import.
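
For webpack, the equivalent wiring is a plugin entry in webpack.config.js. A minimal sketch, assuming webpack-bundle-analyzer is installed as a dev dependency:

```javascript
// webpack.config.js (sketch): emit a static report during production builds
const { BundleAnalyzerPlugin } = require("webpack-bundle-analyzer");

module.exports = {
  // ...existing entry/output/module configuration...
  plugins: [
    new BundleAnalyzerPlugin({
      analyzerMode: "static", // write report.html instead of starting a local server
      openAnalyzer: false,    // inspect on demand rather than on every build
    }),
  ],
};
```

Run the production build, then open report.html and read it the same way as the Vite treemap.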

Route-based splitting with lazy and Suspense

The most natural split boundary in many apps is the router. Each top-level route becomes its own async chunk, so visiting /dashboard does not force the user to download the settings screen’s dependencies up front.

import { lazy, Suspense } from "react"
import { Routes, Route } from "react-router-dom"
import { PageLoader } from "./components/PageLoader" // lightweight skeleton fallback; path illustrative

const Dashboard = lazy(() => import("./pages/Dashboard"))
const Settings = lazy(() => import("./pages/Settings"))
const Reports = lazy(() => import("./pages/Reports"))

function App() {
  return (
    <Suspense fallback={<PageLoader />}>
      <Routes>
        <Route path="/dashboard" element={<Dashboard />} />
        <Route path="/settings" element={<Settings />} />
        <Route path="/reports" element={<Reports />} />
      </Routes>
    </Suspense>
  )
}

TypeScript infers types through the dynamic import() cleanly as long as each page module default-exports its component—React.lazy expects a module with a { default: Component } shape. Keep fallbacks lightweight: skeleton placeholders that mirror the final layout reduce CLS compared to spinners that appear in arbitrary positions.
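
When a page exports only a named component, the loader can be adapted to the { default } shape that React.lazy expects. A small sketch; pickExport and ReportsPanel are illustrative names, not library APIs:

```typescript
// Adapt a named export to the { default: Component } module shape
// that React.lazy expects. Generic over the module type, so TypeScript
// still checks that the requested key exists.
function pickExport<M, K extends keyof M>(
  load: () => Promise<M>,
  key: K,
): () => Promise<{ default: M[K] }> {
  return () => load().then((m) => ({ default: m[key] }));
}

// Usage sketch:
//   const Reports = lazy(pickExport(() => import("./pages/Reports"), "ReportsPanel"));
```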

Nested routes can lazy-load too, but avoid creating so many tiny chunks that latency dominates. The right granularity is usually feature-sized, not every component file.

Component-level splitting for heavy widgets

Routes are not the only boundary. A dashboard might embed a charting library that is irrelevant until the user expands a panel. Splitting at the component level keeps the initial JS smaller and moves parse cost to the moment the feature is actually engaged.

import { lazy, Suspense, useState } from "react"
import { ChartSkeleton } from "./components/ChartSkeleton" // skeleton fallback; path illustrative

const RevenueChart = lazy(() => import("./components/RevenueChart"))

function Dashboard() {
  const [showChart, setShowChart] = useState(false)
  return (
    <div>
      <button type="button" onClick={() => setShowChart(true)}>
        Show Chart
      </button>
      {showChart && (
        <Suspense fallback={<ChartSkeleton />}>
          <RevenueChart />
        </Suspense>
      )}
    </div>
  )
}

Pair this pattern with honest UX: if the user opens the chart often, prefetch the chunk on hover or after idle time so the Suspense boundary does not flash on every visit.
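
One way to implement that prefetch is to memoize the import thunk so hover, idle callbacks, and the lazy boundary all share a single in-flight request. A sketch; prefetchable and the component paths are illustrative:

```typescript
// Memoize a dynamic-import thunk: the first caller starts the request,
// later callers reuse the same promise, so the chunk loads at most once.
function prefetchable<T>(load: () => Promise<T>): () => Promise<T> {
  let pending: Promise<T> | undefined;
  return () => {
    if (!pending) pending = load();
    return pending;
  };
}

// Usage sketch:
//   const loadChart = prefetchable(() => import("./components/RevenueChart"));
//   const RevenueChart = lazy(loadChart);
//   <button onMouseEnter={() => loadChart()} onClick={() => setShowChart(true)}>
```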

Dynamic imports for conditional heavy work

Sometimes the heavy code is not a component but a library invoked from an event handler—PDF generation, video transcoding helpers, or office file parsers. Static imports hoist that cost into the initial graph even when most sessions never export a document.

type ReportData = { title: string; rows: ReadonlyArray<Record<string, string>> };

async function exportToPDF(data: ReportData) {
  const { generatePDF } = await import("./lib/pdf-generator");
  return generatePDF(data);
}

The dynamic import() returns a promise, and TypeScript resolves the target module's types as usual. Wrap the triggering UI in a loading state and surface failures: users tolerate a progress indicator on export far more than a sluggish first paint.
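
The surrounding loading and error handling can be factored into a small helper. A sketch; runWithStatus and the callback names are illustrative, not part of any library:

```typescript
// Run an async task (such as a dynamic import plus the exported work)
// while toggling a busy flag and routing failures to an error handler.
async function runWithStatus<T>(
  task: () => Promise<T>,
  setBusy: (busy: boolean) => void,
  onError: (error: unknown) => void,
): Promise<T | undefined> {
  setBusy(true);
  try {
    return await task();
  } catch (error) {
    onError(error); // chunk-load failures land here too, not only runtime errors
    return undefined;
  } finally {
    setBusy(false);
  }
}

// Usage sketch:
//   runWithStatus(() => exportToPDF(data), setExporting, showErrorToast);
```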

Common bundle culprits and pragmatic replacements

Libraries are not guilty until the treemap says they are, but a few repeat offenders show up across React codebases:

Date utilities. Import specific functions rather than barrels when your toolchain benefits from it. Newer date-fns versions tree-shake well when you stick to modular imports like import { format } from "date-fns". Avoid dragging Moment into new code—it is large and largely superseded by modern alternatives or, over time, the Temporal API as platform support improves.

Lodash. Prefer lodash-es for better ESM friendliness or replace hot paths with native Array / Object helpers when the semantics are simple.

Icon packs. Importing “all icons” through a single entry point can explode size. Libraries like Lucide support granular paths such as lucide-react/dist/esm/icons/icon-name when you need surgical imports.

Images. Unoptimized raster assets hurt LCP and bandwidth. Prefer modern formats like WebP (or AVIF where appropriate), serve responsive sizes, and always specify width and height (or CSS aspect-ratio) so layout stays stable—tying this file’s theme back to CLS and LCP, not only JS bytes.

None of these swaps require abandoning productivity. They require defaulting to the smallest import that works and verifying with the analyzer.

Guarding bundle size in CI

Local treemaps catch mistakes during development; CI catches teammates’ mistakes during review. Many teams wire size-limit or bundlesize to fail builds when main exceeds a threshold, or when growth exceeds a percentage budget. A minimal pattern with Next.js-style analysis might expose an environment flag:

{
  "scripts": {
    "analyze": "ANALYZE=true next build"
  }
}

Adapt the flag and bundler to your stack; the invariant is the same: treat bytes like tests. A red CI line is cheaper than a week of INP firefighting because someone imported a second charting stack accidentally.
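
A minimal size-limit configuration can live in package.json. The path glob and budget below are illustrative and should match your actual build output:

```json
{
  "size-limit": [
    {
      "path": "dist/assets/index-*.js",
      "limit": "250 KB"
    }
  ],
  "scripts": {
    "size": "size-limit"
  }
}
```

Run npm run size in CI after the production build; the check fails when the matched files exceed the budget.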

Splitting as a complement to compiler optimizations

The React Compiler can make individual renders cheaper, but it cannot conjure network out of thin air. Profile the main thread, measure vitals in the field, and inspect the bundle when those signals point to JavaScript or loading—not before. That order keeps TypeScript React applications fast without turning every component into a manual performance puzzle.