Lesson 16: Mainstream Framework Performance Optimization Source Code Analysis

Modern front-end frameworks significantly enhance the performance of complex applications through innovative architectural designs and low-level optimization strategies. Understanding the source-level optimization mechanisms of React, Vue, and Angular not only helps developers write efficient code but also enables quick identification of performance bottleneck root causes. This tutorial deeply analyzes the core performance optimization principles of these three mainstream frameworks, providing a comprehensive interpretation with key source code implementations.

React Performance Optimization Source Code Analysis

Fiber Architecture and Scheduling Algorithm

Introduced in React 16, the Fiber architecture revolutionized the rendering process by breaking synchronous rendering into interruptible asynchronous task units. Fiber nodes, as the new virtual DOM representation, contain rich information such as component type, state, and side-effect markers, forming the foundation of the scheduling system.

The core of the Fiber architecture lies in the Reconciler’s implementation. When an update is triggered, React builds a Fiber tree from the root node, creating corresponding Fiber objects for each node. This process is divided into two phases: reconciliation and commit. The reconciliation phase calculates virtual DOM differences, while the commit phase applies changes to the real DOM.

The scheduling algorithm relies on time slicing and priority scheduling. Conceptually this resembles the browser's requestIdleCallback API, but React's Scheduler package implements its own loop (built on MessageChannel, falling back to setTimeout where unavailable), because requestIdleCallback fires too infrequently and is inconsistently supported. Rendering work is split into small units, and the remaining time budget is checked before each unit so that high-priority tasks, like user interactions, are never blocked for long.

The Scheduler module in the source code implements this mechanism:

// Simplified scheduler core logic
function scheduleWork(fiber, expirationTime) {
  // Mark fiber as needing update
  fiber.expirationTime = expirationTime;

  // Request scheduling opportunity
  requestHostCallback(performWork);
}

function performWork(deadline) {
  while (nextFlushedExpirationTime !== NoWork && 
         (deadline.timeRemaining() > 0 || nextFlushedExpirationTime === Sync)) {
    // Execute available rendering work
    workLoop(deadline);
  }

  // If work remains, continue scheduling
  if (nextFlushedExpirationTime !== NoWork) {
    requestHostCallback(performWork);
  }
}

This architecture allows React to execute low-priority updates during browser idle periods, prioritizing high-priority tasks like user interactions, thus improving application responsiveness.
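The cooperative-yielding idea can be demonstrated without React. The sketch below uses our own names, not the Scheduler's API: it processes work units until a per-slice time budget runs out, then reports whether more work remains so a caller could schedule another slice.

```javascript
// Illustrative work loop (our own names, not the Scheduler's API):
// process units until a per-slice time budget runs out, then yield.
function createWorkLoop(units, frameBudgetMs = 5) {
  let index = 0;
  const chunks = []; // Results grouped by the slice that produced them

  function flush(now = Date.now) {
    const start = now();
    const chunk = [];
    // Keep working while there are units and budget left in this slice
    while (index < units.length && now() - start < frameBudgetMs) {
      chunk.push(units[index]());
      index++;
    }
    chunks.push(chunk);
    return index < units.length; // true -> schedule another slice
  }

  return { flush, chunks };
}

// Usage: three cheap units complete inside a single 5ms slice
const loop = createWorkLoop([() => 1, () => 2, () => 3]);
const hasMore = loop.flush(); // false: all work finished
```

A real scheduler would call `flush` once per frame and interleave higher-priority work between slices; the budget check is what keeps the main thread responsive.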

Hooks Performance Optimization (useMemo, useCallback)

React Hooks leverage closures to manage state and side effects in functional components. useMemo and useCallback, as performance optimization tools, use memoization to avoid unnecessary computations and function recreation.

The implementation of useMemo relies on dependency array change detection:

function useMemo(nextCreate, deps) {
  const hook = mountWorkInProgressHook();
  const nextDeps = deps === undefined ? null : deps;

  if (hook.memoizedState === null) {
    // Initial render, compute and cache result
    const nextValue = nextCreate();
    hook.memoizedState = [nextValue, nextDeps];
    return nextValue;
  } else {
    // Subsequent renders, check if dependencies changed
    const [prevValue, prevDeps] = hook.memoizedState;
    if (nextDeps !== null && areHookInputsEqual(nextDeps, prevDeps)) {
      // Dependencies unchanged, return cached value
      return prevValue;
    } else {
      // Dependencies changed, recompute
      const nextValue = nextCreate();
      hook.memoizedState = [nextValue, nextDeps];
      return nextValue;
    }
  }
}

useCallback is implemented similarly but memoizes function references instead of computed results:

function useCallback(callback, deps) {
  return useMemo(() => callback, deps);
}

Proper use of these Hooks can prevent unnecessary child component re-renders. For example, in list rendering, wrapping event handlers with useCallback ensures stable function references, and when combined with React.memo, significantly boosts performance.
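The dependency-array contract can be reproduced outside React. This standalone sketch uses hypothetical names; `areHookInputsEqual` here is modeled on React's `Object.is`-based comparison, and each "cell" caches one value the way a hook slot does.

```javascript
// Standalone sketch of the dependency-array contract (hypothetical
// helper names; the comparison mirrors React's Object.is semantics)
function areHookInputsEqual(nextDeps, prevDeps) {
  if (prevDeps === null || nextDeps === null) return false;
  if (nextDeps.length !== prevDeps.length) return false;
  for (let i = 0; i < nextDeps.length; i++) {
    if (!Object.is(nextDeps[i], prevDeps[i])) return false;
  }
  return true;
}

function createMemoCell() {
  let state = null; // [value, deps], like a hook's memoizedState slot
  return function memo(create, deps) {
    if (state !== null && areHookInputsEqual(deps, state[1])) {
      return state[0]; // Deps unchanged: hand back the cached value
    }
    const value = create(); // Deps changed (or first call): recompute
    state = [value, deps];
    return value;
  };
}

// Usage: the second call with equal deps returns the same reference
let computations = 0;
const cell = createMemoCell();
const first = cell(() => { computations++; return { items: [1, 2, 3] }; }, [3]);
const second = cell(() => { computations++; return { items: [1, 2, 3] }; }, [3]);
// computations === 1 and first === second
```

The stable reference is exactly what lets React.memo skip a child render: the child's shallow prop comparison sees the same object as last time.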

Virtual DOM Diff Algorithm Optimization

React’s Diff algorithm is central to its efficient updates. Traditional tree comparison has an O(n³) time complexity, but React reduces it to O(n) with two key optimizations:

  1. Same-level Comparison: Compares nodes only at the same level, not across levels.
  2. Type Checking: If component types differ, destroys and rebuilds nodes directly.

The key attribute further optimizes list comparisons by identifying reusable nodes during order changes, avoiding unnecessary destruction and creation.

The reconcileChildrenArray function in the source code illustrates the core list comparison logic:

function reconcileChildrenArray(
  returnFiber: Fiber,
  currentFirstChild: Fiber | null,
  newChildren: Array<*>,
  expirationTime: ExpirationTime,
): Fiber | null {
  let resultingFirstChild: Fiber | null = null;
  let previousNewFiber: Fiber | null = null;

  let oldFiber = currentFirstChild;
  let lastPlacedIndex = 0;
  let newIdx = 0;
  let nextOldFiber = null;

  for (; oldFiber !== null && newIdx < newChildren.length; newIdx++) {
    if (oldFiber.index > newIdx) {
      nextOldFiber = oldFiber;
      oldFiber = null;
    } else {
      nextOldFiber = oldFiber.sibling;
    }

    const newFiber = updateSlot(
      returnFiber,
      oldFiber,
      newChildren[newIdx],
      expirationTime,
    );

    if (newFiber === null) {
      // Key mismatch, exit loop
      if (oldFiber === null) {
        oldFiber = nextOldFiber;
      }
      break;
    }

    if (shouldTrackSideEffects) {
      if (oldFiber && newFiber.alternate === null) {
        // Matched the slot but did not reuse the old fiber; delete it
        deleteChild(returnFiber, oldFiber);
      }
    }

    lastPlacedIndex = placeChild(newFiber, lastPlacedIndex, newIdx);

    if (previousNewFiber === null) {
      resultingFirstChild = newFiber;
    } else {
      previousNewFiber.sibling = newFiber;
    }
    previousNewFiber = newFiber;
  }

  // Handle remaining nodes...
}

This algorithm performs excellently in most real-world scenarios, especially when list changes are minimal, enabling efficient node reuse.
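Why keys matter can be shown with a generic keyed diff, deliberately simpler than React's two-pass algorithm (all names below are illustrative). A key map lets reordered items be reused instead of destroyed and recreated.

```javascript
// Generic keyed list diff, deliberately simpler than React's two-pass
// algorithm, showing why keys let nodes survive reordering
function diffKeyedChildren(oldChildren, newChildren) {
  const oldByKey = new Map(oldChildren.map(child => [child.key, child]));
  const ops = [];
  for (const next of newChildren) {
    if (oldByKey.has(next.key)) {
      ops.push({ type: 'reuse', key: next.key }); // Keep the DOM node
      oldByKey.delete(next.key);
    } else {
      ops.push({ type: 'create', key: next.key }); // Genuinely new item
    }
  }
  for (const key of oldByKey.keys()) {
    ops.push({ type: 'delete', key }); // No longer present
  }
  return ops;
}

// Usage: 'c' and 'a' moved, 'd' is new, 'b' disappeared
const ops = diffKeyedChildren(
  [{ key: 'a' }, { key: 'b' }, { key: 'c' }],
  [{ key: 'c' }, { key: 'a' }, { key: 'd' }],
);
// ops: reuse c, reuse a, create d, delete b
```

Without keys, index-based matching would have compared `a` against `c` in slot 0 and replaced every node; with keys, only `d` is created and only `b` is deleted.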

React.lazy and Suspense Implementation

Code splitting is critical for improving large application load performance. React.lazy combined with Suspense enables dynamic component imports, splitting applications into on-demand loaded chunks.

React.lazy is built on dynamic import() syntax:

function lazy(ctor) {
  const payload = {
    _status: -1,   // -1 uninitialized, 0 pending, 1 resolved, 2 rejected
    _result: ctor, // Holds the ctor, then the promise, then module/error
  };

  return {
    $$typeof: REACT_LAZY_TYPE,
    _payload: payload,

    // Called by the reconciler when the component is rendered
    _init(payload) {
      if (payload._status === -1) {
        // First render: start loading
        const promise = payload._result();
        promise.then(
          module => {
            payload._status = 1; // Resolved
            payload._result = module.default;
          },
          error => {
            payload._status = 2; // Rejected
            payload._result = error;
          },
        );
        payload._status = 0;     // Pending
        payload._result = promise;
      }

      if (payload._status === 1) {
        return payload._result;  // Return the loaded component
      }
      // Pending: throw the promise so Suspense can show a fallback;
      // rejected: throw the error to the nearest error boundary
      throw payload._result;
    },
  };
}

Suspense handles fallback UI display during component loading. It is not implemented as an ordinary function component; instead, the reconciler catches promises thrown while rendering a boundary's children. Conceptually:

// Conceptual sketch: in the real source this logic lives in the
// reconciler's throw handling, not in a function component
function renderSuspenseBoundary(props, renderChildren) {
  const { fallback, children } = props;
  try {
    // Attempt to render the children normally
    return renderChildren(children);
  } catch (thrown) {
    if (thrown !== null && typeof thrown.then === 'function') {
      // A child threw a promise: show the fallback now and retry
      // rendering the boundary once the promise settles
      thrown.then(scheduleRetryOnBoundary);
      return createElement(fallback);
    }
    throw thrown; // Real errors propagate to error boundaries
  }
}

This implementation simplifies code splitting while providing robust loading state management.
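The contract shared by lazy and Suspense, "throw a promise while data is pending", can be demonstrated standalone. No React is involved below, and `createResource` is an illustrative name, not a React API.

```javascript
// Standalone demo of the "throw a promise" contract shared by lazy and
// Suspense (no React involved; createResource is an illustrative name)
function createResource(promise) {
  let status = 'pending';
  let result;
  const suspender = promise.then(
    value => { status = 'success'; result = value; },
    err => { status = 'error'; result = err; },
  );
  return {
    read() {
      if (status === 'pending') throw suspender; // Boundary shows fallback
      if (status === 'error') throw result;      // Error boundary territory
      return result;                             // Data ready: render it
    },
  };
}

// Usage: reading before the promise settles throws the promise itself
const resource = createResource(Promise.resolve(42));
let thrown = null;
try {
  resource.read();
} catch (caught) {
  thrown = caught; // The pending promise a Suspense boundary would await
}
```

The boundary awaits the caught promise, then re-renders; on the retry, `read()` returns the resolved value and rendering completes.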

React Performance Monitoring Tool (Profiler)

The React Profiler, an official performance analysis tool, helps developers identify rendering bottlenecks by collecting component rendering times and reasons, offering visualized performance data.

A Profiler element simply tags a subtree; timing data is accumulated on fibers while rendering and reported during the commit phase:

// Simplified: a Profiler element tags a subtree for measurement
function createProfilerElement(id, onRender, children) {
  return {
    $$typeof: REACT_ELEMENT_TYPE,
    type: REACT_PROFILER_TYPE,
    props: { id, onRender, children },
  };
}

// In the commit phase the reconciler visits Profiler fibers and
// reports the timings accumulated while rendering their subtrees
function commitProfilerUpdate(fiber, commitTime) {
  const { id, onRender } = fiber.memoizedProps;
  const phase = fiber.alternate === null ? 'mount' : 'update';

  if (typeof onRender === 'function') {
    onRender(
      id,
      phase,
      fiber.actualDuration,   // Time spent rendering this commit
      fiber.treeBaseDuration, // Estimated render time without memoization
      fiber.actualStartTime,  // When React began rendering this update
      commitTime,             // When React committed this update
    );
  }
}

Developers can analyze this data to identify unnecessary renders and performance bottlenecks, enabling targeted optimizations.

Vue Performance Optimization Source Code Analysis

Reactive System Dependency Tracking

Vue’s reactive system, a core feature, uses Object.defineProperty (Vue 2) or Proxy (Vue 3) to detect data changes automatically. The dependency tracking mechanism ensures only components truly dependent on a piece of data re-render when it changes.

Dependency collection in Vue 2:

// Simplified dependency collection logic
export function defineReactive(
  obj: Object,
  key: string,
  val: any,
  customSetter?: ?Function,
  shallow?: boolean
) {
  const dep = new Dep(); // Dependency collector

  // Respect any pre-existing getter on the property
  const property = Object.getOwnPropertyDescriptor(obj, key);
  const getter = property && property.get;

  // Recursively observe nested objects to make them reactive too
  let childOb = !shallow && observe(val);

  Object.defineProperty(obj, key, {
    enumerable: true,
    configurable: true,
    get: function reactiveGetter() {
      const value = getter ? getter.call(obj) : val;
      if (Dep.target) { // Current watcher being computed
        dep.depend(); // Collect dependency
        if (childOb) {
          childOb.dep.depend(); // Nested object dependency collection
          if (Array.isArray(value)) {
            dependArray(value); // Array dependency collection
          }
        }
      }
      return value;
    },
    set: function reactiveSetter(newVal) {
      // ...Handle value change
      dep.notify(); // Notify all dependencies to update
    }
  });
}

The Dep class manages Watcher instances, triggering updates via the notify method when data changes.
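The Dep and Watcher collaboration can be sketched in isolation. The following is a minimal illustration of the pattern, not Vue's actual classes:

```javascript
// Minimal illustration of the Dep/Watcher pattern (not Vue's classes)
class Dep {
  constructor() { this.subs = new Set(); }
  depend() { if (Dep.target) this.subs.add(Dep.target); }
  notify() { this.subs.forEach(watcher => watcher.update()); }
}
Dep.target = null; // The watcher currently being evaluated

class Watcher {
  constructor(getter) {
    this.getter = getter;
    this.value = undefined;
    this.get();
  }
  get() {
    Dep.target = this;          // Expose self so getters collect us
    this.value = this.getter(); // Reading reactive values calls depend()
    Dep.target = null;
  }
  update() { this.get(); }      // Re-evaluate when a dep notifies
}

// Usage: a single reactive value wired up by hand
const dep = new Dep();
let raw = 1;
const source = {
  get value() { dep.depend(); return raw; },
  set value(v) { raw = v; dep.notify(); },
};
const watcher = new Watcher(() => source.value * 10);
source.value = 4; // notify() makes the watcher recompute: watcher.value is 40
```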

Vue 3’s Proxy-based approach improves this mechanism, addressing Vue 2’s limitations with arrays and new properties while offering better performance.
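A minimal sketch of the Proxy-based track/trigger cycle follows. The names `reactive` and `effect` mirror Vue 3's public API in spirit, but this is an illustration, not Vue's implementation:

```javascript
// Minimal sketch of Proxy-based reactivity (illustrative, not Vue's code):
// track property reads during an effect, re-run the effect on writes.
let activeEffect = null;
const targetMap = new WeakMap(); // target -> key -> Set of effects

function reactive(obj) {
  return new Proxy(obj, {
    get(target, key, receiver) {
      if (activeEffect) {
        let depsMap = targetMap.get(target);
        if (!depsMap) targetMap.set(target, (depsMap = new Map()));
        let subs = depsMap.get(key);
        if (!subs) depsMap.set(key, (subs = new Set()));
        subs.add(activeEffect); // Track: remember who read this key
      }
      return Reflect.get(target, key, receiver);
    },
    set(target, key, value, receiver) {
      const ok = Reflect.set(target, key, value, receiver);
      const depsMap = targetMap.get(target);
      const subs = depsMap && depsMap.get(key);
      if (subs) subs.forEach(fn => fn()); // Trigger: re-run the readers
      return ok;
    },
  });
}

function effect(fn) {
  activeEffect = fn;
  fn(); // The first run collects dependencies through the get trap
  activeEffect = null;
}

// Usage: the effect re-runs automatically when tracked state changes
const state = reactive({ count: 0 });
let doubled;
effect(() => { doubled = state.count * 2; });
state.count = 5; // doubled becomes 10
```

Because the Proxy traps whole-object access, added properties and array index writes are observed too, which is exactly the Vue 2 limitation this design removes.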

Virtual DOM Diff Algorithm Optimization

Vue’s virtual DOM implementation resembles React’s but includes distinct optimizations. Its Diff algorithm, also based on same-level comparisons, incorporates additional heuristic rules for enhanced performance.

Vue’s patch process:

function patch(oldVnode, vnode, hydrating, removeOnly) {
  if (isUndef(vnode)) {
    if (isDef(oldVnode)) invokeDestroyHook(oldVnode);
    return;
  }

  let isInitialPatch = false;
  const insertedVnodeQueue = [];

  if (isUndef(oldVnode)) {
    // Initial render
    isInitialPatch = true;
    createElm(vnode, insertedVnodeQueue);
  } else {
    const isRealElement = isDef(oldVnode.nodeType);
    if (!isRealElement && sameVnode(oldVnode, vnode)) {
      // Same node, perform patch
      patchVnode(oldVnode, vnode, insertedVnodeQueue, null, null, removeOnly);
    } else {
      // Different node, replace
      if (isRealElement) {
        // ...Handle real DOM to vnode conversion
      }
      // Create new node
      const parent = oldVnode.elm.parentNode;
      createElm(vnode, insertedVnodeQueue, parent, oldVnode.elm);
      // Remove old node
      if (isDef(parent)) {
        removeVnodes(parent, [oldVnode], 0, 0);
      } else if (isDef(oldVnode.tag)) {
        invokeDestroyHook(oldVnode);
      }
    }
  }

  // Execute insert hooks
  for (let i = 0; i < insertedVnodeQueue.length; i++) {
    invokeInsertHook(insertedVnodeQueue[i], null, isInitialPatch);
  }
  return vnode.elm;
}

The patchVnode function implements efficient node comparison and update strategies, including direct text node updates and optimized list comparisons.

Asynchronous Update Queue (nextTick)

Vue’s asynchronous update mechanism ensures multiple data changes trigger only one render, avoiding redundant computations and DOM operations.

nextTick implementation:

const callbacks = [];
let pending = false;

function flushCallbacks() {
  pending = false;
  const copies = callbacks.slice(0);
  callbacks.length = 0;
  for (let i = 0; i < copies.length; i++) {
    copies[i]();
  }
}

let timerFunc;

// Select optimal async implementation based on environment
if (typeof Promise !== 'undefined') {
  const p = Promise.resolve();
  timerFunc = () => {
    p.then(flushCallbacks);
  };
} else if (typeof MutationObserver !== 'undefined') {
  // ...MutationObserver implementation
} else if (typeof setImmediate !== 'undefined') {
  // ...setImmediate implementation
} else {
  // Fallback to setTimeout
  timerFunc = () => {
    setTimeout(flushCallbacks, 0);
  };
}

export function nextTick(cb?: Function, ctx?: Object) {
  let _resolve;
  callbacks.push(() => {
    if (cb) {
      try {
        cb.call(ctx);
      } catch (e) {
        handleError(e, ctx, 'nextTick');
      }
    } else if (_resolve) {
      _resolve(ctx);
    }
  });
  if (!pending) {
    pending = true;
    timerFunc();
  }
  // $flow-disable-line
  if (!cb && typeof Promise !== 'undefined') {
    return new Promise(resolve => {
      _resolve = resolve;
    });
  }
}

This implementation ensures optimal asynchronous updates across environments while providing a Promise interface for ease of use.
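The batching behavior can be reproduced standalone: queue jobs in a Set (so the same watcher is queued only once) and flush them in a single microtask. This is a minimal sketch, not Vue's scheduler:

```javascript
// Minimal sketch of an async update queue (not Vue's scheduler): jobs
// queued in a Set are deduplicated and flushed once, on a microtask.
const queue = new Set();
let pending = false;
let flushCount = 0;

function queueJob(job) {
  queue.add(job); // The same watcher queued repeatedly runs only once
  if (!pending) {
    pending = true;
    Promise.resolve().then(flushJobs); // One flush per "tick"
  }
}

function flushJobs() {
  pending = false;
  flushCount++;
  const jobs = [...queue];
  queue.clear();
  jobs.forEach(job => job());
}

// Usage: three synchronous "changes" collapse into a single render
let renders = 0;
const render = () => renders++;
queueJob(render);
queueJob(render);
queueJob(render);
// After the microtask queue drains: renders === 1, flushCount === 1
```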

Vue Router Lazy Loading Implementation

Vue Router’s lazy loading uses dynamic import syntax to split route components into separate chunks for on-demand loading.

Route configuration example:

const router = new VueRouter({
  routes: [
    {
      path: '/about',
      component: () => import('./views/About.vue') // Lazy loading
    }
  ]
})

Underlying implementation principle:

// Simplified lazy loading handling
function lazyLoad(route, resolve) {
  const { component } = route;
  if (typeof component === 'function') {
    component().then(module => {
      resolve(module.default || module);
    }).catch(err => {
      // Error handling
    });
  } else {
    resolve(component);
  }
}

Combined with Webpack’s code-splitting capabilities, this significantly reduces initial load time.

Vue Performance Monitoring Tool (Vue DevTools)

Vue DevTools offers component tree inspection, state management, and performance analysis. Its performance analysis helps identify rendering bottlenecks.

Performance analysis implementation principle:

// Simplified performance tracking
const perf = {
  start: {},
  end: {},
  measures: {},

  startTrace(name) {
    this.start[name] = performance.now();
  },

  endTrace(name) {
    this.end[name] = performance.now();
    this.measures[name] = this.end[name] - this.start[name];
  },

  getMeasures() {
    return this.measures;
  }
};

// Integrate into Vue lifecycle hooks
Vue.mixin({
  beforeCreate() {
    if (this.$options.performance) {
      perf.startTrace(this.$options.name || 'anonymous');
    }
  },
  mounted() {
    if (this.$options.performance) {
      perf.endTrace(this.$options.name || 'anonymous');
    }
  }
});

Developers can use this data to identify components with excessive rendering times for targeted optimization.

Angular Performance Optimization Source Code Analysis

Change Detection Strategies (OnPush, Default)

Angular’s change detection system is central to its performance optimization. The Default strategy checks all component bindings, while OnPush only checks when input properties change or events are triggered.

Key aspects of OnPush strategy implementation:

// Simplified change detection logic
function markForCheck(component: any) {
  let curr: any = component;
  while (curr) {
    if (curr.changeDetectorRef) {
      curr.changeDetectorRef.markForCheck();
    }
    curr = curr._parent;
  }
}

class ChangeDetectorRef {
  constructor(private _cd: any) {}

  markForCheck() {
    this._cd._markPathToRootAsCheckOnce();
  }

  // Other methods...
}

OnPush components, marked with ChangeDetectionStrategy.OnPush, are skipped during change detection unless an @Input reference changes, an event bound in the component's template fires, an async pipe receives a new value, or markForCheck() is called explicitly.
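The skip decision can be illustrated with a standalone sketch. `shouldCheckOnPush` is a hypothetical name; Angular's real logic lives in its change detection internals, but the reference-equality idea is the same:

```javascript
// Hypothetical sketch of the OnPush skip decision (illustrative names;
// Angular's real logic lives in its change detection internals)
function shouldCheckOnPush(view, newInputs) {
  if (view.dirty) return true; // markForCheck() or a bound event set this
  for (const key of Object.keys(newInputs)) {
    if (!Object.is(view.inputs[key], newInputs[key])) {
      return true; // A new reference (or primitive) arrived on an input
    }
  }
  return false; // Same references everywhere: skip this subtree
}

// Usage: the same array reference is skipped; a fresh array is checked
const view = { dirty: false, inputs: { items: [1, 2] } };
const skipped = !shouldCheckOnPush(view, { items: view.inputs.items });
const checked = shouldCheckOnPush(view, { items: [1, 2] });
```

This is why OnPush pairs naturally with immutable data: mutating an array in place keeps the same reference, so the component is never re-checked.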

Lazy Loading Module Implementation

Angular’s lazy loading is achieved through route configurations, leveraging Webpack’s code-splitting capabilities.

Route configuration example:

const routes: Routes = [
  {
    path: 'lazy',
    loadChildren: () => import('./lazy/lazy.module').then(m => m.LazyModule)
  }
];

Underlying implementation principle:

// Simplified: the router resolves loadChildren during navigation
function loadModuleFactory(route: Route): Promise<NgModuleFactory<any>> {
  // loadChildren returns the promise produced by dynamic import()
  return Promise.resolve(route.loadChildren()).then(moduleType => {
    // With AOT the factory already exists; JIT compiles it here
    return compiler.compileModuleAsync(moduleType);
  });
}

// When navigation first reaches a lazy route, the router loads the
// module, instantiates it, and caches its child route configuration
function activateLazyRoute(route: Route, parentInjector: Injector) {
  return loadModuleFactory(route).then(factory => {
    const moduleRef = factory.create(parentInjector);
    route._loadedConfig = moduleRef; // Reused on later navigations
  });
}

This reduces initial bundle size, improving application startup speed.

Ivy Compiler Optimization

Ivy, Angular’s new compiler architecture, delivers significant performance improvements and smaller bundle sizes.

Key Ivy optimizations:

  1. Localized Recompilation: Recompiles only affected components, not the entire app.
  2. Smaller Runtime: Removes much compile-time generated code.
  3. Better Tree Shaking: Easier for bundlers to eliminate unused code.

Ivy’s template compilation process:

// Simplified template compilation
function compileTemplate(template: string, directives: any[]): CompiledTemplate {
  // Parse template to AST
  const ast = parseTemplate(template);

  // Transform AST to rendering instructions
  const instructions = transformAst(ast, directives);

  // Generate rendering code
  const code = generateCode(instructions);

  return {
    code,
    dependencies: getDependencies(ast, directives)
  };
}

This design enables Angular apps to run more efficiently with reduced bundle sizes.

Angular Universal SSR Optimization

Angular Universal enables server-side rendering (SSR), addressing SEO and first-screen performance issues.

Key SSR implementation:

// Simplified SSR process (using @angular/platform-server)
async function renderUniversal(url: string): Promise<string> {
  // renderModule bootstraps AppServerModule on the server and
  // serializes the rendered application into an HTML string
  const html = await renderModule(AppServerModule, {
    document: indexHtml, // The index.html template
    url: url,            // The route to render
  });
  return html;
}

Angular Universal also implements hydration, allowing the client to take over server-rendered DOM without re-rendering.

Angular Performance Monitoring Tool (Augury)

Augury, a widely used Angular debugging extension (since superseded by the official Angular DevTools), helps identify change detection issues and rendering bottlenecks.

Performance analysis implementation principle:

// Simplified performance tracking
class PerformanceTracker {
  private _marks: Map<string, number> = new Map();

  mark(name: string) {
    this._marks.set(name, performance.now());
  }

  // `name` labels the measurement; the timing comes from the two marks
  measure(name: string, startMark: string, endMark: string): number {
    const start = this._marks.get(startMark);
    const end = this._marks.get(endMark);
    if (start !== undefined && end !== undefined) {
      return end - start;
    }
    return 0;
  }
}

// Integrate into Angular lifecycle
@Directive({
  selector: '[perfTrack]'
})
export class PerfTrackDirective implements OnInit, OnDestroy {
  constructor(private _tracker: PerformanceTracker) {}

  ngOnInit() {
    this._tracker.mark('componentInitStart');
    // Initialization logic...
    this._tracker.mark('componentInitEnd');
  }

  ngOnDestroy() {
    const duration = this._tracker.measure(
      'componentLifecycle',
      'componentInitStart',
      'componentInitEnd'
    );
    console.log(`Component initialization took ${duration}ms`);
  }
}

Developers can use this data to pinpoint performance bottlenecks for targeted optimization.

By deeply understanding these mainstream frameworks’ performance optimization source code implementations, developers can write more efficient code and quickly diagnose and resolve performance issues. These optimization techniques are not only applicable to current projects but also provide valuable insights for future framework upgrades and architectural designs.
