Three.js 3D Model Loading
Basic Model Loading
Import Libraries
First, include the Three.js library in your HTML file. For a CDN, use:
<script src="https://cdn.jsdelivr.net/npm/three@0.145.0/build/three.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/three@0.145.0/examples/js/loaders/GLTFLoader.js"></script>
GLTFLoader is used to load .gltf or .glb model formats.
Initialize Scene, Camera, and Renderer
Set up the core Three.js elements in JavaScript:
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
Load Model
Use GLTFLoader to load the model file:
const loader = new THREE.GLTFLoader();
loader.load('path_to_model.gltf', function(gltf) {
const model = gltf.scene;
// Add model to scene
scene.add(model);
// Handle skeleton if needed
const skeletonHelper = new THREE.SkeletonHelper(model);
skeletonHelper.visible = false; // Hide skeleton by default, adjust as needed
scene.add(skeletonHelper);
}, undefined, function(error) {
console.error(error);
});
The load method takes the model file URL plus success, progress, and error callbacks; on success, the gltf object contains all model data (the undefined argument above is the unused progress callback).
Set Up Animations (If Available)
If the model includes animations, retrieve and apply them from gltf.animations:
const mixer = new THREE.AnimationMixer(model);
gltf.animations.forEach((clip) => {
mixer.clipAction(clip).play();
});
Update and Render
Update the mixer and render the scene each frame:
const clock = new THREE.Clock();
function animate() {
requestAnimationFrame(animate);
const delta = clock.getDelta(); // Seconds since the last frame
mixer.update(delta);
renderer.render(scene, camera);
}
animate();
Event Listeners
Add mouse or touch event listeners as needed for interactive model control.
Note that different model formats (e.g., .fbx, .obj) require specific loaders like FBXLoader or OBJLoader. Ensure the correct loader is included and used. Adjust model paths based on deployment to ensure browser accessibility.
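Picking the right loader for a format is simple bookkeeping; a minimal sketch (the helper name and extension table are illustrative, not a Three.js API):

```javascript
// Illustrative helper: map a model file's extension to the name of the
// Three.js loader class the text mentions.
function loaderForExtension(path) {
  const ext = path.split('.').pop().toLowerCase();
  const table = {
    gltf: 'GLTFLoader',
    glb: 'GLTFLoader',
    fbx: 'FBXLoader',
    obj: 'OBJLoader',
  };
  const loader = table[ext];
  if (!loader) throw new Error(`No loader registered for .${ext}`);
  return loader;
}
```

For example, `loaderForExtension('scene.glb')` returns `'GLTFLoader'`; an unknown extension throws, which is a useful early failure before attempting a network fetch.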
3D Model Operations
Set Up Lighting
Add light sources to enhance model realism:
const ambientLight = new THREE.AmbientLight(0x404040); // Soft white light
scene.add(ambientLight);
const directionalLight = new THREE.DirectionalLight(0xffffff, 1);
directionalLight.position.set(0, 1, 1).normalize();
scene.add(directionalLight);
Scale Model
Adjust the model’s size via the scale property:
model.scale.set(0.5, 0.5, 0.5); // Scale to 50% of original size
Move Model
Reposition the model:
model.position.set(0, -1, -2); // Move to XYZ coordinates (0, -1, -2)
Rotate Model
Set the model’s rotation angle:
model.rotation.set(0, Math.PI / 2, 0); // Rotate 90 degrees around Y-axis
Interactive Controls
Enable user interaction (rotation, panning, zooming) with OrbitControls:
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';
const controls = new OrbitControls(camera, renderer.domElement);
controls.update();
Custom Materials and Colors
To modify a model’s material or color, understand Three.js’s material system. Materials belong to individual meshes, and a loaded glTF scene is a Group, so traverse it and update each mesh. For meshes using MeshStandardMaterial:
model.traverse(node => {
if (!node.isMesh) return;
const materials = node.material;
if (Array.isArray(materials)) {
materials.forEach(mat => mat.color.setHex(0xff0000)); // Set all materials to red
} else {
materials.color.setHex(0xff0000); // Set single material to red
}
});
These are common operations for loading and manipulating 3D models in Three.js. Adjust based on project requirements.
Feature Optimization
Load Textures
Some models require texture maps, loaded with TextureLoader and applied to materials:
const textureLoader = new THREE.TextureLoader();
textureLoader.load('path_to_texture.png', function(texture) {
const material = new THREE.MeshStandardMaterial({ map: texture });
// Apply material to model, may require traversal based on model structure
model.traverse(node => {
if (node.isMesh) {
node.material = material;
}
});
});
Shadows
Enable shadows for added realism:
renderer.shadowMap.enabled = true;
renderer.shadowMap.type = THREE.PCFSoftShadowMap;
// Ensure light and shadow-receiving objects support shadows
directionalLight.castShadow = true;
// castShadow/receiveShadow are per-mesh flags, so traverse the model
model.traverse(node => {
if (node.isMesh) {
node.castShadow = true;
node.receiveShadow = true;
}
});
Performance Optimization
- LOD (Level of Detail): Use LOD to load varying detail levels based on camera distance for large models.
- Batch Rendering: Combine similar geometries into one batch to reduce draw calls.
- Culling: Set a material’s side property (THREE.FrontSide or THREE.BackSide) so unneeded faces are culled.
- Caching and Reuse: Avoid reloading or recreating identical resources.
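Three.js ships THREE.LOD for the first bullet; the selection it performs can be sketched standalone (level names and distance thresholds here are illustrative):

```javascript
// Pick the detail level whose threshold the camera distance has passed.
// `levels` must be sorted by ascending distance, as THREE.LOD expects.
function selectLOD(levels, cameraDistance) {
  let chosen = levels[0];
  for (const level of levels) {
    if (cameraDistance >= level.distance) chosen = level;
  }
  return chosen.name;
}

const levels = [
  { distance: 0, name: 'high' },
  { distance: 50, name: 'medium' },
  { distance: 200, name: 'low' },
];
// selectLOD(levels, 10)  -> 'high'
// selectLOD(levels, 120) -> 'medium'
```

In a real scene you would call lod.addLevel(mesh, distance) per level and let Three.js run this check every frame against the camera position.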
Animation Frame Rate Control
Use THREE.Clock with requestAnimationFrame to control animation frame rate:
const clock = new THREE.Clock();
function animate() {
const delta = clock.getDelta(); // Get time difference
mixer.update(delta);
renderer.render(scene, camera);
requestAnimationFrame(animate);
}
animate();
Responsive Design
Adjust camera and renderer size on window resize:
window.addEventListener('resize', () => {
camera.aspect = window.innerWidth / window.innerHeight;
camera.updateProjectionMatrix();
renderer.setSize(window.innerWidth, window.innerHeight);
}, false);
Error Handling and Logging
Capture and log errors during loading or rendering for debugging and optimization.
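One common pattern is wrapping the callback-style loaders in a Promise so every failure funnels through a single catch point. A minimal sketch (the wrapper name is illustrative; `loader` is any object with a load(url, onLoad, onProgress, onError) method, like GLTFLoader):

```javascript
// Promise wrapper around a callback-style Three.js loader, with
// centralized error logging.
function loadAsync(loader, url) {
  return new Promise((resolve, reject) => {
    loader.load(
      url,
      (asset) => resolve(asset),
      undefined, // progress callback unused here
      (error) => {
        console.error(`Failed to load ${url}:`, error);
        reject(error);
      }
    );
  });
}
```

Usage: `const gltf = await loadAsync(new THREE.GLTFLoader(), 'model.gltf');` inside a try/catch, so a missing file or bad path surfaces as one logged, catchable error instead of a silent failure.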
Three.js Particle Systems and Effects
Particle System Basics
Basic Particle System
Implemented using THREE.Points (formerly THREE.ParticleSystem) with THREE.PointsMaterial:
// Import Three.js library
import * as THREE from 'three';
// Initialize scene, camera, and renderer
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
// Create particle geometry
const particleCount = 1000;
const particles = new THREE.BufferGeometry();
const positions = new Float32Array(particleCount * 3);
for (let i = 0; i < particleCount; i++) {
positions[i * 3] = Math.random() * 200 - 100;
positions[i * 3 + 1] = Math.random() * 200 - 100;
positions[i * 3 + 2] = Math.random() * 200 - 100;
}
particles.setAttribute('position', new THREE.BufferAttribute(positions, 3));
// Create particle material
const particleMaterial = new THREE.PointsMaterial({
color: 0xFFFFFF,
size: 10,
map: new THREE.TextureLoader().load('path_to_particle_texture.png'), // Optional texture
blending: THREE.AdditiveBlending,
transparent: true
});
// Create particle system
const particleSystem = new THREE.Points(particles, particleMaterial);
scene.add(particleSystem);
// Update particle positions (simple example, may need animation loop)
function updateParticles() {
const positions = particles.attributes.position.array;
for (let i = 0; i < particleCount; i++) {
positions[i * 3 + 1] -= 0.1; // Move particles downward
if (positions[i * 3 + 1] < -100) {
positions[i * 3 + 1] = 100; // Reset position
}
}
particles.attributes.position.needsUpdate = true; // Flag for update
}
// Render scene
function animate() {
requestAnimationFrame(animate);
updateParticles();
renderer.render(scene, camera);
}
animate();
This creates a basic particle system: particles are randomly distributed in 3D space, drift downward, and reset to the top once they cross the bottom boundary. The motion here is deliberately simple; real-world systems may require more complex animation logic.
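That more complex logic usually starts with per-particle velocities stored in a parallel array. A Three.js-free sketch of the integration step (in a scene these would be the arrays backing the BufferGeometry attributes, with needsUpdate set after each call):

```javascript
// Integrate particle positions from per-particle velocities, wrapping
// vertically the same way the example above does.
function integrateParticles(positions, velocities, delta, bound = 100) {
  for (let i = 0; i < positions.length; i += 3) {
    positions[i] += velocities[i] * delta;
    positions[i + 1] += velocities[i + 1] * delta;
    positions[i + 2] += velocities[i + 2] * delta;
    if (positions[i + 1] < -bound) positions[i + 1] = bound; // respawn at top
  }
}
```

Scaling by delta (from THREE.Clock) keeps motion frame-rate independent, unlike the fixed 0.1 step in the example above.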
Custom Shader for Particle Explosion Effect
Define Shaders
Create particleVertexShader and particleFragmentShader to define particle behavior and appearance:
// particleVertexShader.glsl
uniform float time;
attribute vec3 velocity;
void main() {
vec3 newPosition = position + velocity * time;
vec4 mvPosition = modelViewMatrix * vec4(newPosition, 1.0);
gl_PointSize = 10.0;
gl_Position = projectionMatrix * mvPosition;
}
// particleFragmentShader.glsl
uniform vec3 color;
void main() {
gl_FragColor = vec4(color, 1.0);
}
Load Shaders and Create Material
// Load shaders (assumes shaders are in HTML script tags)
const particleVertexShader = document.getElementById('particleVertexShader').textContent;
const particleFragmentShader = document.getElementById('particleFragmentShader').textContent;
// Create particle material
const particleMaterial = new THREE.ShaderMaterial({
uniforms: {
time: { value: 0 },
color: { value: new THREE.Color(0xffffff) },
},
vertexShader: particleVertexShader,
fragmentShader: particleFragmentShader,
blending: THREE.AdditiveBlending,
transparent: true,
depthWrite: false,
});
Create Particle System
// Create particles
const particleCount = 1000;
const positions = new Float32Array(particleCount * 3);
const velocities = new Float32Array(particleCount * 3);
const colors = new Float32Array(particleCount * 3);
// Initialize particle positions, velocities, and colors
// ... Depends on desired particle behavior
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('velocity', new THREE.BufferAttribute(velocities, 3));
geometry.setAttribute('color', new THREE.BufferAttribute(colors, 3));
const particleSystem = new THREE.Points(geometry, particleMaterial);
scene.add(particleSystem);
Update Particle System
function animate() {
requestAnimationFrame(animate);
const time = performance.now() / 1000; // Current time in seconds
particleMaterial.uniforms.time.value = time; // Update time uniform
// Update particle positions, velocities, colors, etc.
// ... Depends on desired particle behavior
renderer.render(scene, camera);
}
animate();
three-emitter Library
The three-emitter library provides an Emitter class to simplify particle emitter creation, allowing specification of particle lifespan, velocity, color changes, etc.
Install three-emitter
npm install three three-emitter
Use three-emitter
import * as THREE from 'three';
import { Emitter, Particle } from 'three-emitter';
// Initialize scene, camera, and renderer
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
// Create particle material
const particleMaterial = new THREE.PointsMaterial({
color: 0xffffff,
size: 10,
blending: THREE.AdditiveBlending,
transparent: true
});
// Create particle emitter
const emitter = new Emitter({
rate: new Emitter.Constant(1), // Emit 1 particle per second
life: new Emitter.Range(2, 4), // Particle lifespan 2-4 seconds
position: new Emitter.Box(new THREE.Vector3(-50, 0, -50), new THREE.Vector3(50, 100, 50)), // Emission range
velocity: new Emitter.Range(new THREE.Vector3(-10, 20, -10), new THREE.Vector3(10, 20, 10)), // Initial velocity range
acceleration: new Emitter.Constant(new THREE.Vector3(0, -10, 0)), // Gravity
color: new Emitter.Range(new THREE.Color(0xff0000), new THREE.Color(0x00ff00)), // Color range
size: new Emitter.Range(1, 5), // Particle size range
});
// Create particle geometry and add emitter
const particleGeo = new THREE.BufferGeometry();
emitter.create(particleGeo, particleMaterial);
// Add to scene
const particleSystem = new THREE.Points(particleGeo, particleMaterial);
scene.add(particleSystem);
// Update and render
function animate() {
requestAnimationFrame(animate);
emitter.update(0.01); // Update particle state
renderer.render(scene, camera);
}
animate();
This creates an emitter where particles spawn randomly within a box, with random initial velocities, colors (red to green), and sizes. A downward acceleration simulates gravity, and particles have lifespans of 2-4 seconds. Adjust the rate parameter as needed.
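Rate-based emission needs a fractional accumulator so low rates still emit over time instead of rounding to zero every frame. A sketch of the idea (names are illustrative, not the three-emitter API):

```javascript
// Returns a function that, given the frame delta in seconds, says how
// many particles to emit this frame, carrying the fractional remainder.
function makeSpawner(rate) {
  let accumulator = 0;
  return function spawnCount(delta) {
    accumulator += rate * delta;
    const count = Math.floor(accumulator);
    accumulator -= count;
    return count;
  };
}
```

With `const spawn = makeSpawner(1)` (1 particle per second), three 0.4 s frames emit 0, 0, then 1 particle, so the long-run emission rate is exact even though no single frame is long enough on its own.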
Common Particle System Effects
Spark Effect
Sparks use white or warm-toned particles, moving upward or outward.
const sparkMaterial = new THREE.PointsMaterial({
color: 0xffaa00,
size: 1,
map: new THREE.TextureLoader().load('spark_texture.png'),
blending: THREE.AdditiveBlending,
transparent: true,
});
const sparkCount = 1000;
const sparkPositions = new Float32Array(sparkCount * 3);
// ... Initialize spark positions and velocities
const sparkGeometry = new THREE.BufferGeometry();
sparkGeometry.setAttribute('position', new THREE.BufferAttribute(sparkPositions, 3));
const sparkPoints = new THREE.Points(sparkGeometry, sparkMaterial);
scene.add(sparkPoints);
// Update sparks
function updateSparks() {
// ... Update spark positions based on velocity and time
}
Smoke Effect
Smoke is typically grayish-white, using THREE.Points with a smoke texture.
const smokeMaterial = new THREE.PointsMaterial({
color: 0xaaaaaa,
size: 2,
map: new THREE.TextureLoader().load('smoke_texture.png'),
blending: THREE.NormalBlending, // Additive blending over-brightens; smoke reads better with normal alpha blending
transparent: true,
});
const smokeCount = 1000;
const smokePositions = new Float32Array(smokeCount * 3);
// ... Initialize smoke positions and velocities
const smokeGeometry = new THREE.BufferGeometry();
smokeGeometry.setAttribute('position', new THREE.BufferAttribute(smokePositions, 3));
const smokePoints = new THREE.Points(smokeGeometry, smokeMaterial);
scene.add(smokePoints);
// Update smoke
function updateSmoke() {
// ... Update smoke positions based on velocity and time
}
Rain/Snow Effect
Rain and snow use similar methods but differ in texture and animation logic.
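The differing animation logic can be sketched without Three.js: rain falls straight and fast, while snow falls slowly with a sinusoidal horizontal drift. The function below is an illustrative version of the per-frame update a `updateSnow`-style function would perform on the position attribute's backing array (speeds and amplitudes are made-up defaults):

```javascript
// Per-frame snow update: gentle horizontal sway plus a slow fall,
// wrapping particles back to the top when they leave the volume.
function updateSnow(positions, time, fallSpeed = 0.2, driftAmp = 0.5) {
  for (let i = 0; i < positions.length; i += 3) {
    positions[i] += Math.sin(time + i) * driftAmp * 0.01; // sway on x
    positions[i + 1] -= fallSpeed;                        // fall on y
    if (positions[i + 1] < -100) positions[i + 1] = 100;  // respawn at top
  }
}
```

A rain variant would drop the sine term, raise fallSpeed, and typically use a streak texture instead of a flake.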
const snowMaterial = new THREE.PointsMaterial({
color: 0xffffff,
size: 0.5,
map: new THREE.TextureLoader().load('snow_texture.png'),
blending: THREE.AdditiveBlending,
transparent: true,
});
const snowCount = 1000;
const snowPositions = new Float32Array(snowCount * 3);
// ... Initialize snow positions and velocities
const snowGeometry = new THREE.BufferGeometry();
snowGeometry.setAttribute('position', new THREE.BufferAttribute(snowPositions, 3));
const snowPoints = new THREE.Points(snowGeometry, snowMaterial);
scene.add(snowPoints);
// Update snow
function updateSnow() {
// ... Update snow positions based on velocity and time
}
Flame Effect
Flames involve complex color transitions and animations, often using THREE.Points with custom shaders.
// Define flame shaders
// ... Similar to particle shaders but include flame color transitions
const flameMaterial = new THREE.ShaderMaterial({
uniforms: {
// ... Include flame color, time, etc. uniforms
},
vertexShader: flameVertexShader,
fragmentShader: flameFragmentShader,
blending: THREE.AdditiveBlending,
transparent: true,
});
const flameCount = 1000;
// ... Initialize flame particle positions, colors, and velocities
const flamePoints = new THREE.Points(flameGeometry, flameMaterial);
scene.add(flamePoints);
// Update flames
function updateFlame() {
// ... Update flame colors, positions, etc.
}
Explosion Effect
Explosions use multiple particle layers radiating outward and fading, often with multiple systems and color transitions.
// Create multiple particle systems for different explosion stages
const explosionMaterials = [/* ... */];
const explosionGeometries = [/* ... */];
for (let i = 0; i < explosionMaterials.length; i++) {
const explosionPoints = new THREE.Points(explosionGeometries[i], explosionMaterials[i]);
scene.add(explosionPoints);
}
// Update explosion particles
function updateExplosion() {
// ... Update each system’s positions, colors, opacity, etc.
}
Halo Effect
Halos simulate glowing or blurred effects, using translucent particles and custom shaders.
const haloMaterial = new THREE.ShaderMaterial({
uniforms: {
glowColor: { value: new THREE.Color(0x00ffff) },
glowIntensity: { value: 1.0 },
blur: { value: 1.0 },
resolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) },
},
vertexShader: haloVertexShader,
fragmentShader: haloFragmentShader,
blending: THREE.AdditiveBlending,
transparent: true,
});
const haloCount = 1000;
// ... Initialize halo geometry
const haloPoints = new THREE.Points(haloGeometry, haloMaterial);
scene.add(haloPoints);
// Update halo
function updateHalo() {
// ... Update halo colors, intensity, blur, etc.
}
Jet Stream Effect
Jet streams simulate flows like water or fire.
const jetMaterial = new THREE.PointsMaterial({
color: 0x00ff00,
size: 0.5,
map: new THREE.TextureLoader().load('jet_texture.png'),
blending: THREE.AdditiveBlending,
transparent: true,
});
const jetCount = 1000;
const jetPositions = new Float32Array(jetCount * 3);
// ... Initialize jet positions and velocities
const jetGeometry = new THREE.BufferGeometry();
jetGeometry.setAttribute('position', new THREE.BufferAttribute(jetPositions, 3));
const jetPoints = new THREE.Points(jetGeometry, jetMaterial);
scene.add(jetPoints);
// Update jet
function updateJet() {
// ... Update jet positions based on velocity and time
}
Vortex Effect
Vortices simulate rotating airflow or whirlpools.
const vortexMaterial = new THREE.PointsMaterial({
color: 0x990000,
size: 1,
map: new THREE.TextureLoader().load('vortex_texture.png'),
blending: THREE.AdditiveBlending,
transparent: true,
});
const vortexCount = 1000;
const vortexPositions = new Float32Array(vortexCount * 3);
// ... Initialize vortex positions and velocities
const vortexGeometry = new THREE.BufferGeometry();
vortexGeometry.setAttribute('position', new THREE.BufferAttribute(vortexPositions, 3));
const vortexPoints = new THREE.Points(vortexGeometry, vortexMaterial);
scene.add(vortexPoints);
// Update vortex
function updateVortex() {
// ... Update vortex positions and rotation
}
Three.js Post-Processing and Shaders
Post-Processing
In Three.js, post-processing effects like Bloom (glow effect), SSAO (screen-space ambient occlusion), and Depth of Field are achieved using THREE.EffectComposer, related passes, and the WebGL renderer’s post-processing pipeline.
Initialize Effect Composer
Start by initializing a THREE.EffectComposer instance, which serves as a container for all post-processing effects and links to your scene and renderer.
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';
// Initialize scene, camera, renderer, etc.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
// Create Effect Composer
const composer = new EffectComposer(renderer);
Add Render Pass
RenderPass renders the base scene to a texture, forming the foundation for subsequent post-processing.
const renderPass = new RenderPass(scene, camera);
composer.addPass(renderPass);
Bloom Effect
Bloom enhances the glow of bright areas in the scene.
import { UnrealBloomPass } from 'three/examples/jsm/postprocessing/UnrealBloomPass.js';
const bloomPass = new UnrealBloomPass(
new THREE.Vector2(window.innerWidth, window.innerHeight),
strength, // Glow intensity
radius, // Glow blur radius
threshold // Brightness threshold; pixels below this value are excluded
);
composer.addPass(bloomPass);
SSAO Effect
SSAO enhances shadows and depth by simulating indirect lighting.
import { SSAOPass } from 'three/examples/jsm/postprocessing/SSAOPass.js';
const ssaoPass = new SSAOPass(scene, camera);
ssaoPass.renderToScreen = true; // Output directly to screen (set this only on the final pass of a chain)
composer.addPass(ssaoPass);
Depth of Field Effect
Depth of Field mimics a camera’s focus, blurring parts of the scene to highlight the focal point.
import { BokehPass } from 'three/examples/jsm/postprocessing/BokehPass.js';
const bokehPass = new BokehPass(scene, camera, {
focus, // Focal distance
aperture, // Aperture size, affects blur intensity
maxblur, // Maximum blur amount
});
bokehPass.renderToScreen = true;
composer.addPass(bokehPass);
Combined Usage
For rendering each frame, call composer.render() instead of renderer.render() to apply all post-processing effects.
function animate() {
requestAnimationFrame(animate);
// Update scene objects...
// Render post-processing effects
composer.render();
}
animate();
Parameters like the bloom strength, radius, and threshold and the depth-of-field focus, aperture, and blur limit must be tuned for specific needs. Ensure all required post-processing modules are imported correctly.
Combining and Optimizing
In projects, you may combine effects, such as Bloom and SSAO without Depth of Field. Pass order matters: passes execute in the order they are added to the composer. Effects like Bloom should run after passes such as SSAO so the glow does not wash out the shading beneath it.
// Add SSAO Pass
composer.addPass(ssaoPass);
// Add Bloom Pass
composer.addPass(bloomPass);
Here, SSAO applies first, followed by Bloom. If bloomPass preceded ssaoPass, the SSAO shading could be diminished by the glow.
Performance Tuning
Post-processing increases rendering complexity, impacting performance. Optimization strategies include:
- Reduce Resolution: Lower the EffectComposer render target resolution below screen resolution to reduce computation.
- Adjust Parameters: Tune effect intensity, blur radius, etc., based on device performance.
- Toggle Passes: Use a pass’s enabled property to disable unneeded effects temporarily.
- Avoid Redundant Rendering: Render only when the scene or camera changes, not every frame.
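The last bullet is a dirty-flag pattern; a minimal sketch, with `composer` standing in for composer.render() (the factory name is illustrative):

```javascript
// Render-on-demand loop: camera controls and animations call
// invalidate(); frame() only renders when something actually changed.
function makeOnDemandLoop(composer) {
  let dirty = true; // render the first frame
  return {
    invalidate() { dirty = true; },
    frame() {
      if (!dirty) return false; // frame skipped
      dirty = false;
      composer.render();
      return true; // frame rendered
    },
  };
}
```

Wired up, OrbitControls would trigger it via `controls.addEventListener('change', loop.invalidate)`, while the requestAnimationFrame loop calls `loop.frame()` every tick and pays for a render only when needed.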
Shaders
In Three.js, shaders are written in GLSL (OpenGL Shading Language), enabling GPU-level customization of graphics rendering. Three.js wraps GLSL code in JavaScript objects for easier integration.
Shader Types
Vertex Shader: Defines how vertices move and deform in 3D space, processing inputs (attributes and uniforms) and outputting the clip-space vertex position (gl_Position), plus any varyings passed on to the fragment stage.
void main() {
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
Fragment Shader: Determines each pixel’s color, processing vertex shader outputs and other data to output the final color (gl_FragColor).
void main() {
gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // Output red pixel
}
ShaderMaterial and Uniforms
Create custom ShaderMaterial to pass vertex and fragment shader code. Uniforms allow dynamic updates from JavaScript.
const uniforms = {
uTime: { value: 0.0 },
uResolution: { value: new THREE.Vector2() },
};
const material = new THREE.ShaderMaterial({
uniforms,
vertexShader: document.getElementById('vertexShader').textContent,
fragmentShader: document.getElementById('fragmentShader').textContent,
});
ShaderLib and ShaderChunk
Three.js’s ShaderLib contains predefined shaders, like those for MeshStandardMaterial. These are split into reusable ShaderChunk blocks for modularity.
For example, MeshPhysicalMaterial’s vertex shader might include:
#include <common>
#include <uv_pars_vertex>
#include <uv2_pars_vertex>
#include <color_pars_vertex>
#include <displacementmap_pars_vertex>
#include <fog_pars_vertex>
#include <morphtarget_pars_vertex>
#include <skinning_pars_vertex>
#include <shadowmap_pars_vertex>
#include <logdepthbuf_pars_vertex>
#include <clipping_planes_pars_vertex>
void main() {
// ... Implementation ...
}
#include directives are replaced with corresponding ShaderChunk content during compilation.
Physically Based Rendering (MeshStandardMaterial)
MeshStandardMaterial uses physically based rendering (PBR), accounting for reflection, diffusion, metalness, and roughness. Its shader code handles ambient light, specular reflections, and Fresnel effects.
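The Fresnel term mentioned above is almost universally approximated in PBR shaders with Schlick's formula, F = F0 + (1 − F0)(1 − cosθ)^5. A plain-JS version for a single channel (in GLSL this runs per fragment with vec3 reflectance):

```javascript
// Schlick's approximation of the Fresnel reflectance.
// cosTheta: cosine of the angle between view direction and surface normal
// f0: base reflectance at normal incidence (~0.04 for dielectrics)
function fresnelSchlick(cosTheta, f0) {
  return f0 + (1 - f0) * Math.pow(1 - cosTheta, 5);
}
```

Viewed head-on (cosTheta = 1) a dielectric reflects only its base reflectance F0; at grazing angles (cosTheta → 0) reflectance approaches 1, which is why roughness and view angle interact so strongly in MeshStandardMaterial.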
uniform vec3 diffuse;
uniform float opacity;
#include <common>
#include <packing>
#include <dithering_pars_fragment>
#include <color_pars_fragment>
#include <uv_pars_fragment>
#include <uv2_pars_fragment>
#include <map_pars_fragment>
#include <alphamap_pars_fragment>
#include <aomap_pars_fragment>
#include <lightmap_pars_fragment>
#include <envmap_pars_fragment>
#include <cube_uv_reflection_fragment>
#include <fog_pars_fragment>
#include <bsdfs>
#include <lights_pars_begin>
#include <lights_physical_pars_fragment>
#include <shadowmap_pars_fragment>
#include <logdepthbuf_pars_fragment>
#include <clipping_planes_pars_fragment>
void main() {
// ... Implementation ...
}
For correct use, provide appropriate textures (e.g., color, metalness, roughness maps) and light sources.
Instancing
For rendering many similar objects, instancing reduces draw calls by sharing vertex data and varying attributes like position or color.
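The per-instance data itself is just a flat typed array; a Three.js-free sketch of laying instances out on a grid, the kind of buffer that would back an InstancedBufferAttribute (or be turned into InstancedMesh matrices). The function name and grid layout are illustrative:

```javascript
// Build a flat XYZ offset array for cols * rows instances on a grid.
function gridOffsets(cols, rows, spacing) {
  const offsets = new Float32Array(cols * rows * 3);
  let i = 0;
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      offsets[i++] = c * spacing; // x
      offsets[i++] = 0;           // y
      offsets[i++] = r * spacing; // z
    }
  }
  return offsets;
}
```

In a scene this would feed `geometry.setAttribute('instanceOffset', new THREE.InstancedBufferAttribute(gridOffsets(10, 10, 5), 3))`, with a vertex shader adding the offset to each vertex position.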
// InstancedMesh shares one geometry and material across many instances
const count = 1000;
const material = new THREE.MeshStandardMaterial({ color: 0x4080ff });
const mesh = new THREE.InstancedMesh(originalGeometry, material, count);
// Assign each instance its own transform
const dummy = new THREE.Object3D();
for (let i = 0; i < count; i++) {
dummy.position.set(Math.random() * 100 - 50, 0, Math.random() * 100 - 50);
dummy.updateMatrix();
mesh.setMatrixAt(i, dummy.matrix);
}
mesh.instanceMatrix.needsUpdate = true;
// Per-instance color is also available via mesh.setColorAt(i, color)
scene.add(mesh);
Particle System with Custom Shaders
Combine Points geometry and ShaderMaterial for custom particle systems.
// Create particle geometry
const geometry = new THREE.BufferGeometry();
const positions = new Float32Array(numParticles * 3);
for (let i = 0; i < numParticles; i++) {
positions[i * 3] = Math.random() * 2 - 1;
positions[i * 3 + 1] = Math.random() * 2 - 1;
positions[i * 3 + 2] = Math.random() * 2 - 1;
}
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
// Custom shader
const uniforms = {
time: { value: 0.0 },
sizeAttenuation: { value: 1.0 },
particleColor: { value: new THREE.Color(0x4080ff) },
};
const material = new THREE.ShaderMaterial({
uniforms,
vertexShader: document.getElementById('particleVertexShader').textContent,
fragmentShader: document.getElementById('particleFragmentShader').textContent,
blending: THREE.AdditiveBlending,
transparent: true,
});
// Create particle object
const particles = new THREE.Points(geometry, material);
scene.add(particles);
The shader might handle particle position, size, and color with time-based effects:
// Vertex shader
uniform float time;
void main() {
vec3 newPosition = position + vec3(0.0, sin(time), 0.0);
gl_PointSize = 5.0; // Can be a uniform
gl_Position = projectionMatrix * modelViewMatrix * vec4(newPosition, 1.0);
}
// Fragment shader
uniform vec3 particleColor;
void main() {
gl_FragColor = vec4(particleColor, 1.0);
}
Advanced Shader Techniques
- Blending Modes: Adjust blending, blendEquation, blendSrc, and blendDst for varied effects.
- Depth Testing: Control with depthTest to determine pixel occlusion.
- Clipping Planes: Use THREE.Plane instances in a material’s clippingPlanes array (with clipIntersection and clipShadows) for localized masking.
- Shadows: Implement via ShadowMaterial or custom shaders with shadow mapping.
Optimization
- BufferGeometry: Use for large geometries to reduce memory and data transfer.
- Batching: Merge similar geometries and materials to minimize draw calls.
- Shader Optimization: Reduce computations, use efficient algorithms, and avoid unnecessary texture sampling or math operations.
Three.js Physics Simulation
Adding physics simulations to Three.js typically requires integrating third-party physics engines like Cannon.js or Ammo.js. These engines provide core components such as rigid bodies, constraints, and collision detection.
Cannon.js Physics Engine
Install Cannon.js
First, ensure you have installed the Cannon.js library via npm:
npm install cannon
Then, import Cannon.js and any relevant Three.js plugins into your Three.js scene:
import * as THREE from 'three';
import * as CANNON from 'cannon';
import { CannonJSPlugin } from 'three-cannon-plugin'; // If used
Create Physics World, Rigid Bodies, and Colliders
// Create physics world
const world = new CANNON.World();
world.gravity.set(0, -9.81, 0); // Set gravity
// Create rigid body
const sphereBody = new CANNON.Body({ mass: 1 }); // Body mass
const sphereShape = new CANNON.Sphere(1); // Sphere radius
sphereBody.addShape(sphereShape);
world.addBody(sphereBody); // Don't forget to add the body to the world
// Create sphere geometry and material
const sphereGeometry = new THREE.SphereGeometry(1, 32, 32);
const sphereMaterial = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
// Create sphere object
const sphereMesh = new THREE.Mesh(sphereGeometry, sphereMaterial);
scene.add(sphereMesh);
// Link Three.js object to Cannon.js object
const cannonMesh = new CannonJSPlugin(world, sphereMesh);
cannonMesh.body = sphereBody;
Set Up Collision Detection and Update Physics World
// Create ground
const groundBody = new CANNON.Body({ mass: 0 }); // Mass 0 makes the body static
const groundShape = new CANNON.Plane(); // Infinite plane, faces +Z by default
groundBody.addShape(groundShape);
groundBody.quaternion.setFromAxisAngle(new CANNON.Vec3(1, 0, 0), -Math.PI / 2); // Rotate so the plane faces up
world.addBody(groundBody);
// Update physics world each frame
function animate() {
requestAnimationFrame(animate);
world.step(1 / 60); // Step per frame
cannonMesh.update(); // Apply Cannon.js updates to Three.js object
renderer.render(scene, camera);
}
animate();
In real projects, you may need to handle additional details like collision detection callbacks, constraints, and dynamics. Cannon.js and Ammo.js offer rich APIs for complex physics interactions. Always refer to official documentation for complete features and examples.
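One detail worth pinning down is the `world.step(1 / 60)` call above: real frame times vary, so robust loops accumulate elapsed time and advance physics in fixed increments. A sketch of that fixed-timestep pattern (`world` here is any object with a step(dt) method; the factory name and maxSteps cap are illustrative):

```javascript
// Fixed-timestep stepper: accumulate frame time, advance the physics
// world in fixed dt increments, cap catch-up steps to avoid spirals.
function makeFixedStepper(world, fixedDt = 1 / 60, maxSteps = 5) {
  let accumulator = 0;
  return function advance(frameDelta) {
    accumulator += frameDelta;
    let steps = 0;
    while (accumulator >= fixedDt && steps < maxSteps) {
      world.step(fixedDt);
      accumulator -= fixedDt;
      steps += 1;
    }
    return steps; // how many physics steps ran this frame
  };
}
```

A slow 30 fps frame then runs two 1/60 s physics steps, keeping simulation speed independent of render speed; Cannon.js's `world.step(dt, timeSinceLastCall, maxSubSteps)` overload performs the same accumulation internally.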
Ammo.js Physics Engine
For physics simulations with Ammo.js, you must load its WebAssembly module, then initialize the physics world, rigid bodies, and colliders.
Install Ammo.js
Ensure you have installed the ammo.js library via npm:
npm install ammo.js
Import Ammo.js and Three.js Plugins
import * as THREE from 'three';
import * as ammo from 'ammo.js/dist/ammo.wasm.js';
import { AmmoJSPlugin } from 'three-ammojs-plugin'; // If used
Load WebAssembly Module and Initialize Ammo.js
// Load WebAssembly module
let Ammo;
ammo().then((module) => {
Ammo = module.Ammo;
// Initialize physics world
const collisionConfiguration = new Ammo.btDefaultCollisionConfiguration();
const dispatcher = new Ammo.btCollisionDispatcher(collisionConfiguration);
const overlappingPairCache = new Ammo.btDbvtBroadphase();
const solver = new Ammo.btSequentialImpulseConstraintSolver();
const dynamicsWorld = new Ammo.btDiscreteDynamicsWorld(dispatcher, overlappingPairCache, solver, collisionConfiguration);
dynamicsWorld.setGravity(new Ammo.btVector3(0, -9.81, 0)); // Set gravity
// Start physics simulation
startPhysics(dynamicsWorld);
});
Create Rigid Bodies and Colliders
function createRigidBody(shape, transform) {
const body = new Ammo.btRigidBody(new Ammo.btRigidBodyConstructionInfo(1, null, shape));
body.setWorldTransform(transform);
dynamicsWorld.addRigidBody(body);
return body;
}
// Create sphere
const sphereShape = new Ammo.btSphereShape(1);
const sphereTransform = new Ammo.btTransform();
sphereTransform.setIdentity();
sphereTransform.setOrigin(new Ammo.btVector3(0, 1, 0));
const sphereBody = createRigidBody(sphereShape, sphereTransform);
// Create ground
const planeShape = new Ammo.btStaticPlaneShape(new Ammo.btVector3(0, 1, 0), 1);
const planeTransform = new Ammo.btTransform();
planeTransform.setIdentity();
planeTransform.setOrigin(new Ammo.btVector3(0, -1, 0));
const planeBody = createRigidBody(planeShape, planeTransform);
Create Three.js Object and Link to Ammo.js Object
// Create sphere geometry and material
const sphereGeometry = new THREE.SphereGeometry(1, 32, 32);
const sphereMaterial = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
// Create sphere object
const sphereMesh = new THREE.Mesh(sphereGeometry, sphereMaterial);
scene.add(sphereMesh);
// Link Three.js object to Ammo.js object
const ammoMesh = new AmmoJSPlugin(dynamicsWorld, sphereMesh);
ammoMesh.body = sphereBody;
Update Physics World Each Frame
function startPhysics(world) {
function step() {
requestAnimationFrame(step);
world.stepSimulation(1 / 60, 10, 1 / 60);
ammoMesh.update();
renderer.render(scene, camera);
}
step();
}
Ammo.js’s API differs from Cannon.js but shares similar concepts. In projects, you may handle additional details like collision callbacks, constraints, and joints. Ammo.js offers a more robust physics engine but requires more learning and debugging.
Advanced Ammo.js Physics Simulations
Collision Detection and Response
Ammo.js provides methods like btCollisionWorld’s rayTest(), convexSweepTest(), and contactTest() for collision detection. Set up collision callbacks to handle specific events.
// Example: query contacts for one body with contactTest (callback class
// names vary slightly between ammo.js builds; treat this as a sketch)
const contactCallback = new Ammo.ConcreteContactResultCallback();
contactCallback.addSingleResult = function(cp, colObj0Wrap, partId0, index0, colObj1Wrap, partId1, index1) {
// Handle a collision contact point
};
world.contactTest(sphereBody, contactCallback);
Dynamics Constraints
Constraints like hinges, sliders, or ropes connect objects. Ammo.js supports types like btHingeConstraint and btSliderConstraint.
// Create hinge constraint
const hingeConstraint = new Ammo.btHingeConstraint(bodyA, bodyB, pivotInA, pivotInB, axisInA, axisInB);
hingeConstraint.enableAngularMotor(true, 0, 1);
world.addConstraint(hingeConstraint);
Rigid Body Types
Beyond static and dynamic bodies, kinematic bodies are unaffected by forces but can be moved by setting position or velocity.
// Bullet constants (ammo.js does not always expose them by name, so numeric values are common)
const CF_KINEMATIC_OBJECT = 2;
const DISABLE_DEACTIVATION = 4;
const kinematicBody = new Ammo.btRigidBody(new Ammo.btRigidBodyConstructionInfo(0, null, shape));
kinematicBody.setCollisionFlags(kinematicBody.getCollisionFlags() | CF_KINEMATIC_OBJECT);
kinematicBody.setActivationState(DISABLE_DEACTIVATION); // keep it from being put to sleep
world.addRigidBody(kinematicBody);
Animation and Controllers
To control object motion (e.g., following mouse or keyboard), manipulate rigid body position and velocity or create a controller.
function updateBodyPosition(body, position) {
const trans = body.getWorldTransform();
trans.setOrigin(position);
body.setWorldTransform(trans);
}
Contact Callbacks
Set contact callbacks to handle object interactions, like detecting ground contact.
// Sketch only: contact-added callback wiring varies by ammo.js build.
// Tag bodies beforehand with numeric IDs, e.g., body.setUserIndex(OBJECT_ID).
const OBJECT_ID = 1, GROUND_ID = 2;
world.setContactAddedCallback((cp, colObj0Wrap, partId0, index0, colObj1Wrap, partId1, index1) => {
const body0 = Ammo.castObject(colObj0Wrap.getCollisionObject(), Ammo.btRigidBody);
const body1 = Ammo.castObject(colObj1Wrap.getCollisionObject(), Ammo.btRigidBody);
if (body0.getUserIndex() === OBJECT_ID && body1.getUserIndex() === GROUND_ID) {
// Handle ground-object contact
}
});
Performance Optimization
Optimize collision detection precision, reduce unnecessary bodies, or lower simulation frame rates based on scene needs.
Web Workers
To reduce main thread load, run Ammo.js in Web Workers, using postMessage and onmessage for synchronization.
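One common pattern (sketched here with illustrative names) is to have the worker step the simulation and post back all body transforms in a single Float32Array, which can be sent as a transferable to avoid structured-clone copies:

```javascript
// Pack body transforms into one flat Float32Array for postMessage; the
// buffer can then be transferred (postMessage(arr, [arr.buffer])) instead of cloned.
const FLOATS_PER_BODY = 7; // x, y, z, qx, qy, qz, qw

function packBodies(bodies) {
  const out = new Float32Array(bodies.length * FLOATS_PER_BODY);
  bodies.forEach((b, i) => {
    out.set([
      b.position.x, b.position.y, b.position.z,
      b.quaternion.x, b.quaternion.y, b.quaternion.z, b.quaternion.w,
    ], i * FLOATS_PER_BODY);
  });
  return out;
}

function unpackBody(buffer, i) {
  const o = i * FLOATS_PER_BODY;
  return {
    position: { x: buffer[o], y: buffer[o + 1], z: buffer[o + 2] },
    quaternion: { x: buffer[o + 3], y: buffer[o + 4], z: buffer[o + 5], w: buffer[o + 6] },
  };
}
```

On the main thread, the onmessage handler unpacks each body and copies its position and quaternion onto the matching Three.js mesh.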
Three.js Performance Optimization
Merge Meshes (Geometry Merge)
Combining multiple small meshes into one large mesh reduces draw calls, improving performance.
// Assuming two geometries: geometry1 and geometry2
// BufferGeometryUtils ships in examples/ (e.g., examples/js/utils/BufferGeometryUtils.js)
const mergedGeometry = THREE.BufferGeometryUtils.mergeBufferGeometries([geometry1, geometry2]);
const mergedMesh = new THREE.Mesh(mergedGeometry, material);
scene.add(mergedMesh);
Instancing
For many repeated objects, instancing boosts performance. Each instance shares the same geometry and material but can have unique positions, rotations, and scales.
const geometry = new THREE.BoxGeometry(1, 1, 1);
const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
const instancedMesh = new THREE.InstancedMesh(geometry, material, numberOfInstances);
scene.add(instancedMesh);
for (let i = 0; i < numberOfInstances; i++) {
const offset = new THREE.Vector3(Math.random() - 0.5, Math.random() - 0.5, Math.random() - 0.5);
const quaternion = new THREE.Quaternion().setFromAxisAngle(new THREE.Vector3(0, 1, 0), Math.random() * Math.PI * 2);
const scale = new THREE.Vector3(1, 1, 1);
instancedMesh.setMatrixAt(i, new THREE.Matrix4().compose(offset, quaternion, scale));
}
instancedMesh.instanceMatrix.needsUpdate = true;
Use LOD (Level of Detail)
LOD dynamically switches model complexity based on distance from the camera, using high-detail models up close and low-detail models farther away.
const lod = new THREE.LOD();
lod.addLevel(meshHighDetail, 0); // Used when the camera is closest
lod.addLevel(meshMediumDetail, 100); // Used from 100 units away
lod.addLevel(meshLowDetail, 200); // Used from 200 units away
lod.position.set(0, 0, 0);
scene.add(lod);
Simplify Models and Textures
- Use Draco compression to reduce .gltf or .glb file sizes.
- Decrease texture resolution and use efficient formats (e.g., compressed .jpg or .png).
- Employ texture atlases to minimize texture switches.
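For texture atlases, the key operation is remapping each mesh's [0, 1] UVs into its tile of the shared texture. A minimal sketch (the tile-addressing scheme is an assumption of this example):

```javascript
// Remap a [0,1] UV coordinate into tile (col, row) of a cols x rows atlas,
// so many meshes can share one texture and therefore one material.
function atlasUV(u, v, col, row, cols, rows) {
  return [(col + u) / cols, (row + v) / rows];
}

console.log(atlasUV(0.5, 0.5, 1, 0, 4, 4)); // center of tile (1, 0) in a 4x4 atlas
```

Applied to every vertex of a mesh's uv attribute, this lets otherwise identical materials be merged, cutting texture binds and draw calls.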
Limit Light Sources and Complexity
- Reduce the number of light sources, especially point lights and spotlights.
- Use baked light maps instead of real-time lighting.
Optimize Post-Processing Passes
- Enable or disable passes as needed.
- Lower pass resolution below screen resolution.
- Arrange pass order to minimize unnecessary computations.
Dynamic Resource Loading and Unloading
- Use THREE.LoadingManager for on-demand resource loading.
- Remove and free resources for objects no longer visible.
Update Loop and Rendering
- Sync animations with browser refresh rates using requestAnimationFrame.
- Avoid redundant rendering calls for unchanged frames.
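Skipping redundant renders is often done with a dirty flag: render only when something actually changed. A minimal sketch (names are illustrative):

```javascript
// Dirty-flag render loop: render only on ticks where state changed.
let needsRender = true;
function requestRender() { needsRender = true; }

function tick(renderFrame) {
  if (!needsRender) return false; // nothing changed: skip this frame
  needsRender = false;
  renderFrame();                  // e.g., renderer.render(scene, camera)
  return true;
}
```

Camera controls and animations call requestRender() whenever they mutate the scene, and tick runs inside the requestAnimationFrame loop.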
GPU Optimization
- Leverage GPU instancing and vertex attributes.
- Optimize shader code to reduce complexity and texture sampling.
Optimize Textures and Materials
- Set minFilter and magFilter on textures from THREE.TextureLoader for efficient sampling: linear filtering (THREE.LinearFilter) for smoother quality, nearest-neighbor filtering (THREE.NearestFilter) for lower cost.
- Apply THREE.LinearMipmapLinearFilter as the minFilter for smooth mipmap transitions.
- Choose texture wrapping (THREE.ClampToEdgeWrapping or THREE.RepeatWrapping) to avoid redundant sampling.
const textureLoader = new THREE.TextureLoader();
const texture = textureLoader.load('texture.jpg');
texture.minFilter = THREE.LinearFilter;
texture.magFilter = THREE.LinearFilter;
texture.wrapS = THREE.ClampToEdgeWrapping;
texture.wrapT = THREE.ClampToEdgeWrapping;
Optimize Vertex and Fragment Shaders
- Minimize shader computations, avoiding redundant math and texture sampling.
- Use flat shading for efficient color handling, especially with color maps.
- Employ discard to skip unneeded pixels, e.g., mostly transparent fragments in cutout effects.
// GLSL ES 3.0 fragment shader
precision mediump float;
in vec4 vColor;
out vec4 fragColor;
void main() {
if (vColor.a < 0.5) {
discard; // Skip pixels with alpha < 0.5
}
fragColor = vColor;
}
Use Web Workers
For complex tasks like physics or pathfinding, use Web Workers to offload work to background threads, reducing main thread load.
// Create Web Worker
const worker = new Worker('worker.js');
worker.postMessage({ data: /* ... */ });
worker.onmessage = function(event) {
const result = event.data;
// Process result
};
Batch Rendering (Batching)
A THREE.Group organizes objects under one node, but grouping alone does not reduce draw calls; for objects sharing a material, actual batching means merging their geometries (see Geometry Merge above) or using InstancedMesh.
const group = new THREE.Group();
for (let i = 0; i < objects.length; i++) {
group.add(objects[i]);
}
scene.add(group);
Optimize Camera Viewport
- Adjust the camera viewport to avoid rendering objects outside it.
- Use frustum culling to exclude objects outside the camera’s view frustum.
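Three.js performs frustum culling automatically for objects with bounding volumes, but the underlying test is easy to state: an object is culled when its bounding sphere lies entirely behind any frustum plane. A plain-JS sketch (the plane layout is an assumption of this example):

```javascript
// Sphere-vs-frustum test: planes have inward-pointing normals (nx, ny, nz)
// and offset d, so points inside the frustum satisfy n.p + d >= 0.
function sphereInFrustum(planes, center, radius) {
  for (const p of planes) {
    const dist = p.nx * center.x + p.ny * center.y + p.nz * center.z + p.d;
    if (dist < -radius) return false; // fully behind this plane: cull it
  }
  return true; // inside or intersecting the frustum
}

// Toy "frustum": a single near plane keeping the half-space z <= -1
const planes = [{ nx: 0, ny: 0, nz: -1, d: -1 }];
console.log(sphereInFrustum(planes, { x: 0, y: 0, z: -5 }, 1)); // true
console.log(sphereInFrustum(planes, { x: 0, y: 0, z: 3 }, 1));  // false
```

A real camera contributes six such planes (near, far, left, right, top, bottom), which Three.js extracts from the combined projection-view matrix.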
camera.layers.enable(1); // Render only objects assigned to layer 1
Optimize Shadow Maps
- Adjust shadow map size to balance quality and performance.
- Enable shadows only for objects that need them.
light.shadow.mapSize.width = 1024;
light.shadow.mapSize.height = 1024;
light.castShadow = true;
object.receiveShadow = true;
WebGL and Three.js In-Depth
WebGL Basics Review
Context Creation
Initialize a WebGL context on an HTML5 canvas element.
<canvas id="glCanvas" width="640" height="480"></canvas>
const canvas = document.getElementById('glCanvas');
let gl = canvas.getContext('webgl');
if (!gl) {
console.log('WebGL not supported, falling back on experimental-webgl');
gl = canvas.getContext('experimental-webgl');
}
if (!gl) {
alert('Your browser does not support WebGL');
}
Vertex and Fragment Shaders
Vertex and fragment shaders, written in GLSL, handle geometry attributes and colors.
Vertex Shader:
const vertexShaderSource = `
attribute vec4 a_position;
void main() {
gl_Position = a_position;
}
`;
Fragment Shader:
const fragmentShaderSource = `
precision mediump float;
void main() {
gl_FragColor = vec4(1, 0, 0, 1); // Red color
}
`;
Buffers and Vertex Arrays
Store geometry data like positions, colors, and texture coordinates.
Create and Bind Buffer:
const vertexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
const vertices = new Float32Array([
-0.5, -0.5,
0.5, -0.5,
-0.5, 0.5,
0.5, 0.5
]);
gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
Shader Program
Combines vertex and fragment shaders for rendering.
Compile Shaders:
function createShader(gl, type, source) {
const shader = gl.createShader(type);
gl.shaderSource(shader, source);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
console.error(gl.getShaderInfoLog(shader));
gl.deleteShader(shader);
return null;
}
return shader;
}
const vertexShader = createShader(gl, gl.VERTEX_SHADER, vertexShaderSource);
const fragmentShader = createShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
Create Shader Program:
function createProgram(gl, vertexShader, fragmentShader) {
const program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
console.error(gl.getProgramInfoLog(program));
gl.deleteProgram(program);
return null;
}
return program;
}
const shaderProgram = createProgram(gl, vertexShader, fragmentShader);
gl.useProgram(shaderProgram);
Uniforms and Attributes
Data passed to shaders; uniforms are global, attributes are per-vertex.
Set Vertex Attributes:
const positionAttributeLocation = gl.getAttribLocation(shaderProgram, 'a_position');
gl.enableVertexAttribArray(positionAttributeLocation);
gl.vertexAttribPointer(positionAttributeLocation, 2, gl.FLOAT, false, 0, 0);
Draw Scene:
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
Complete Code
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>WebGL Basic Example</title>
<style>
body { margin: 0; }
canvas { display: block; }
</style>
</head>
<body>
<canvas id="glCanvas" width="640" height="480"></canvas>
<script>
const canvas = document.getElementById('glCanvas');
let gl = canvas.getContext('webgl');
if (!gl) {
console.log('WebGL not supported, falling back on experimental-webgl');
gl = canvas.getContext('experimental-webgl');
}
if (!gl) {
alert('Your browser does not support WebGL');
}
const vertexShaderSource = `
attribute vec4 a_position;
void main() {
gl_Position = a_position;
}
`;
const fragmentShaderSource = `
precision mediump float;
void main() {
gl_FragColor = vec4(1, 0, 0, 1); // Red color
}
`;
function createShader(gl, type, source) {
const shader = gl.createShader(type);
gl.shaderSource(shader, source);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
console.error(gl.getShaderInfoLog(shader));
gl.deleteShader(shader);
return null;
}
return shader;
}
const vertexShader = createShader(gl, gl.VERTEX_SHADER, vertexShaderSource);
const fragmentShader = createShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
function createProgram(gl, vertexShader, fragmentShader) {
const program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
console.error(gl.getProgramInfoLog(program));
gl.deleteProgram(program);
return null;
}
return program;
}
const shaderProgram = createProgram(gl, vertexShader, fragmentShader);
gl.useProgram(shaderProgram);
const vertexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
const vertices = new Float32Array([
-0.5, -0.5,
0.5, -0.5,
-0.5, 0.5,
0.5, 0.5
]);
gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
const positionAttributeLocation = gl.getAttribLocation(shaderProgram, 'a_position');
gl.enableVertexAttribArray(positionAttributeLocation);
gl.vertexAttribPointer(positionAttributeLocation, 2, gl.FLOAT, false, 0, 0);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
</script>
</body>
</html>
GLSL Shading
Syntax
GLSL (OpenGL Shading Language) is a C-like language for GPU shader programs, divided into vertex and fragment shaders.
Types
GLSL supports:
- Scalars: float, int, bool
- Vectors: vec2, vec3, vec4 (float vectors); ivec2, ivec3, ivec4 (integer vectors)
- Matrices: mat2, mat3, mat4
- Texture samplers: sampler2D, samplerCube
Input/Output
- attribute: CPU-to-vertex-shader data, per vertex, used only in vertex shaders.
- uniform: Global CPU-to-shader data.
- varying: Data passed from the vertex shader to the fragment shader.
Control Flow
Supports conditionals (if, else) and loops (for, while, do-while).
Mathematical Operations
Handles vector and matrix operations extensively.
Example Code
Vertex Shader
Computes per-vertex positions.
attribute vec4 a_position; // Input vertex position
attribute vec4 a_color; // Input vertex color
uniform mat4 u_modelViewMatrix; // Model-view matrix
uniform mat4 u_projectionMatrix; // Projection matrix
varying vec4 v_color; // Color passed to fragment shader
void main() {
gl_Position = u_projectionMatrix * u_modelViewMatrix * a_position; // Compute position
v_color = a_color; // Pass color to fragment shader
}
Fragment Shader
Computes per-pixel colors.
precision mediump float;
varying vec4 v_color; // Color from vertex shader
void main() {
gl_FragColor = v_color; // Set pixel color
}
Basic Usage Example
<canvas id="glCanvas" width="640" height="480"></canvas>
const canvas = document.getElementById('glCanvas');
let gl = canvas.getContext('webgl');
if (!gl) {
console.log('WebGL not supported, falling back on experimental-webgl');
gl = canvas.getContext('experimental-webgl');
}
if (!gl) {
alert('Your browser does not support WebGL');
}
const vertexShaderSource = `
attribute vec4 a_position;
attribute vec4 a_color;
uniform mat4 u_modelViewMatrix;
uniform mat4 u_projectionMatrix;
varying vec4 v_color;
void main() {
gl_Position = u_projectionMatrix * u_modelViewMatrix * a_position;
v_color = a_color;
}
`;
const fragmentShaderSource = `
precision mediump float;
varying vec4 v_color;
void main() {
gl_FragColor = v_color;
}
`;
function createShader(gl, type, source) {
const shader = gl.createShader(type);
gl.shaderSource(shader, source);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
console.error(gl.getShaderInfoLog(shader));
gl.deleteShader(shader);
return null;
}
return shader;
}
const vertexShader = createShader(gl, gl.VERTEX_SHADER, vertexShaderSource);
const fragmentShader = createShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
function createProgram(gl, vertexShader, fragmentShader) {
const program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
console.error(gl.getProgramInfoLog(program));
gl.deleteProgram(program);
return null;
}
return program;
}
const shaderProgram = createProgram(gl, vertexShader, fragmentShader);
gl.useProgram(shaderProgram);
const positionBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
const positions = new Float32Array([
-0.5, -0.5,
0.5, -0.5,
-0.5, 0.5,
0.5, 0.5
]);
gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);
const colorBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);
const colors = new Float32Array([
1, 0, 0, 1, // Red
0, 1, 0, 1, // Green
0, 0, 1, 1, // Blue
1, 1, 0, 1 // Yellow
]);
gl.bufferData(gl.ARRAY_BUFFER, colors, gl.STATIC_DRAW);
const positionAttributeLocation = gl.getAttribLocation(shaderProgram, 'a_position');
gl.enableVertexAttribArray(positionAttributeLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.vertexAttribPointer(positionAttributeLocation, 2, gl.FLOAT, false, 0, 0);
const colorAttributeLocation = gl.getAttribLocation(shaderProgram, 'a_color');
gl.enableVertexAttribArray(colorAttributeLocation);
gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);
gl.vertexAttribPointer(colorAttributeLocation, 4, gl.FLOAT, false, 0, 0);
const modelViewMatrix = mat4.create(); // mat4 comes from the gl-matrix library
const projectionMatrix = mat4.create();
mat4.perspective(projectionMatrix, 45 * Math.PI / 180, canvas.width / canvas.height, 0.1, 100.0);
mat4.translate(modelViewMatrix, modelViewMatrix, [-0.0, 0.0, -6.0]);
const uModelViewMatrix = gl.getUniformLocation(shaderProgram, 'u_modelViewMatrix');
gl.uniformMatrix4fv(uModelViewMatrix, false, modelViewMatrix);
const uProjectionMatrix = gl.getUniformLocation(shaderProgram, 'u_projectionMatrix');
gl.uniformMatrix4fv(uProjectionMatrix, false, projectionMatrix);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
WebGL Rendering Pipeline
The WebGL rendering pipeline consists of multiple stages, each handling specific rendering tasks.
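Before looking at each stage, it helps to trace one vertex through the whole pipeline numerically. A plain-JS sketch of the projection transform, perspective divide, and viewport mapping (matrices are column-major 4x4 arrays, as in gl-matrix and WebGL):

```javascript
// Apply a column-major 4x4 matrix to a point (implicit w = 1).
function transformPoint(m, p) {
  const [x, y, z] = p;
  return [
    m[0] * x + m[4] * y + m[8] * z + m[12],
    m[1] * x + m[5] * y + m[9] * z + m[13],
    m[2] * x + m[6] * y + m[10] * z + m[14],
    m[3] * x + m[7] * y + m[11] * z + m[15],
  ];
}

// Perspective projection matrix, matching mat4.perspective's layout.
function perspective(fovy, aspect, near, far) {
  const f = 1 / Math.tan(fovy / 2);
  const nf = 1 / (near - far);
  return [
    f / aspect, 0, 0, 0,
    0, f, 0, 0,
    0, 0, (far + near) * nf, -1,
    0, 0, 2 * far * near * nf, 0,
  ];
}

// Perspective divide to NDC, then viewport mapping to pixel coordinates.
function toScreen(clip, width, height) {
  const [x, y, , w] = clip;
  const ndcX = x / w, ndcY = y / w;
  return [(ndcX + 1) / 2 * width, (1 - ndcY) / 2 * height];
}

const proj = perspective(Math.PI / 4, 640 / 480, 0.1, 100);
const clip = transformPoint(proj, [0, 0, -6]); // a point 6 units in front of the camera
console.log(toScreen(clip, 640, 480)); // [320, 240], the center of the canvas
```

The GPU performs these same steps: the vertex shader produces clip-space coordinates, fixed-function hardware does the divide by w, and the viewport transform produces window coordinates.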
Model
Models include geometry data (vertices, faces) and material information. Geometry is stored in buffers, while materials define surface properties like color and texture.
Create a Cube Model:
const vertices = new Float32Array([
-1.0, -1.0, 1.0, // Vertex 0
1.0, -1.0, 1.0, // Vertex 1
-1.0, 1.0, 1.0, // Vertex 2
1.0, 1.0, 1.0, // Vertex 3
-1.0, -1.0, -1.0, // Vertex 4
1.0, -1.0, -1.0, // Vertex 5
-1.0, 1.0, -1.0, // Vertex 6
1.0, 1.0, -1.0 // Vertex 7
]);
const indices = new Uint16Array([
0, 1, 2, 1, 3, 2, // Front
4, 5, 6, 5, 7, 6, // Back
4, 5, 0, 5, 1, 0, // Bottom
6, 7, 2, 7, 3, 2, // Top
4, 6, 0, 6, 2, 0, // Left
1, 5, 3, 5, 7, 3 // Right
]);
// Create and upload buffers
const vertexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
const indexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);
View
The view defines the camera’s position and orientation.
Set Camera View Matrix:
const cameraPosition = [0, 0, 5];
const target = [0, 0, 0];
const up = [0, 1, 0];
const viewMatrix = mat4.create();
mat4.lookAt(viewMatrix, cameraPosition, target, up);
const uViewMatrix = gl.getUniformLocation(shaderProgram, 'u_viewMatrix');
gl.uniformMatrix4fv(uViewMatrix, false, viewMatrix);
Projection
Projection maps 3D space to a 2D screen.
Set Projection Matrix:
const fieldOfView = 45 * Math.PI / 180; // Radians
const aspect = gl.canvas.clientWidth / gl.canvas.clientHeight;
const zNear = 0.1;
const zFar = 100.0;
const projectionMatrix = mat4.create();
mat4.perspective(projectionMatrix, fieldOfView, aspect, zNear, zFar);
const uProjectionMatrix = gl.getUniformLocation(shaderProgram, 'u_projectionMatrix');
gl.uniformMatrix4fv(uProjectionMatrix, false, projectionMatrix);
Clipping
Clipping against the view frustum happens automatically after the vertex shader; primitives outside the frustum are discarded or trimmed. A related optimization you control explicitly is back-face culling:
Enable Back-Face Culling:
gl.enable(gl.CULL_FACE);
gl.cullFace(gl.BACK);
Normalized Device Coordinates
Transforms clipped coordinates to the [-1, 1] range, handled in the vertex shader.
Vertex Shader Code:
attribute vec4 a_position;
uniform mat4 u_modelMatrix;
uniform mat4 u_viewMatrix;
uniform mat4 u_projectionMatrix;
void main() {
gl_Position = u_projectionMatrix * u_viewMatrix * u_modelMatrix * a_position;
}
Fragment Generation
The fragment shader processes each pixel’s color.
Fragment Shader Code:
precision mediump float;
uniform vec4 u_color;
void main() {
gl_FragColor = u_color;
}
Color Blending
Applies transparency and blending rules.
Enable Color Blending:
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
Complete Code Example
<canvas id="glCanvas" width="640" height="480"></canvas>
const canvas = document.getElementById('glCanvas');
let gl = canvas.getContext('webgl');
if (!gl) {
console.log('WebGL not supported, falling back on experimental-webgl');
gl = canvas.getContext('experimental-webgl');
}
if (!gl) {
alert('Your browser does not support WebGL');
}
const vertexShaderSource = `
attribute vec4 a_position;
uniform mat4 u_modelMatrix;
uniform mat4 u_viewMatrix;
uniform mat4 u_projectionMatrix;
void main() {
gl_Position = u_projectionMatrix * u_viewMatrix * u_modelMatrix * a_position;
}
`;
const fragmentShaderSource = `
precision mediump float;
uniform vec4 u_color;
void main() {
gl_FragColor = u_color;
}
`;
function createShader(gl, type, source) {
const shader = gl.createShader(type);
gl.shaderSource(shader, source);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
console.error(gl.getShaderInfoLog(shader));
gl.deleteShader(shader);
return null;
}
return shader;
}
const vertexShader = createShader(gl, gl.VERTEX_SHADER, vertexShaderSource);
const fragmentShader = createShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
function createProgram(gl, vertexShader, fragmentShader) {
const program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
console.error(gl.getProgramInfoLog(program));
gl.deleteProgram(program);
return null;
}
return program;
}
const shaderProgram = createProgram(gl, vertexShader, fragmentShader);
gl.useProgram(shaderProgram);
const vertices = new Float32Array([
-1.0, -1.0, 1.0,
1.0, -1.0, 1.0,
-1.0, 1.0, 1.0,
1.0, 1.0, 1.0,
-1.0, -1.0, -1.0,
1.0, -1.0, -1.0,
-1.0, 1.0, -1.0,
1.0, 1.0, -1.0
]);
const indices = new Uint16Array([
0, 1, 2, 1, 3, 2,
4, 5, 6, 5, 7, 6,
4, 5, 0, 5, 1, 0,
6, 7, 2, 7, 3, 2,
4, 6, 0, 6, 2, 0,
1, 5, 3, 5, 7, 3
]);
const vertexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);
gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
const indexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);
const positionAttributeLocation = gl.getAttribLocation(shaderProgram, 'a_position');
gl.enableVertexAttribArray(positionAttributeLocation);
gl.vertexAttribPointer(positionAttributeLocation, 3, gl.FLOAT, false, 0, 0);
const modelMatrix = mat4.create();
const viewMatrix = mat4.create();
const projectionMatrix = mat4.create();
mat4.perspective(projectionMatrix, 45 * Math.PI / 180, canvas.width / canvas.height, 0.1, 100.0);
mat4.lookAt(viewMatrix, [0, 0, 5], [0, 0, 0], [0, 1, 0]);
const uModelMatrix = gl.getUniformLocation(shaderProgram, 'u_modelMatrix');
const uViewMatrix = gl.getUniformLocation(shaderProgram, 'u_viewMatrix');
const uProjectionMatrix = gl.getUniformLocation(shaderProgram, 'u_projectionMatrix');
const uColor = gl.getUniformLocation(shaderProgram, 'u_color');
gl.uniformMatrix4fv(uModelMatrix, false, modelMatrix);
gl.uniformMatrix4fv(uViewMatrix, false, viewMatrix);
gl.uniformMatrix4fv(uProjectionMatrix, false, projectionMatrix);
gl.uniform4fv(uColor, [1, 0, 0, 1]);
gl.enable(gl.CULL_FACE);
gl.cullFace(gl.BACK);
gl.enable(gl.BLEND);
gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_SHORT, 0);
Three.js Direct Interaction with WebGL
Three.js is a JavaScript library built on WebGL, encapsulating complex operations with a simplified, high-level interface for 3D graphics.
Encapsulation
Three.js provides high-level APIs to streamline WebGL usage, such as creating a basic 3D scene with renderer, scene, camera, and objects.
Create a Basic Three.js Scene:
// Create scene
const scene = new THREE.Scene();
// Create camera
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;
// Create renderer
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
// Create cube
const geometry = new THREE.BoxGeometry();
const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
const cube = new THREE.Mesh(geometry, material);
scene.add(cube);
// Render function
function animate() {
requestAnimationFrame(animate);
cube.rotation.x += 0.01;
cube.rotation.y += 0.01;
renderer.render(scene, camera);
}
animate();
Material System
Three.js offers predefined material types like MeshBasicMaterial and MeshPhongMaterial to define object appearances.
Use Different Materials:
// MeshBasicMaterial
const basicMaterial = new THREE.MeshBasicMaterial({ color: 0xff0000 });
const basicCube = new THREE.Mesh(new THREE.BoxGeometry(), basicMaterial);
scene.add(basicCube);
// MeshPhongMaterial
const phongMaterial = new THREE.MeshPhongMaterial({ color: 0x00ff00, shininess: 100 });
const phongCube = new THREE.Mesh(new THREE.BoxGeometry(), phongMaterial);
scene.add(phongCube);
// Add light source
const light = new THREE.PointLight(0xffffff);
light.position.set(10, 10, 10);
scene.add(light);
Geometry
Three.js provides built-in shapes like BoxGeometry and SphereGeometry for quick 3D object creation.
Create Different Geometries:
// BoxGeometry
const box = new THREE.Mesh(new THREE.BoxGeometry(), new THREE.MeshBasicMaterial({ color: 0x00ff00 }));
scene.add(box);
// SphereGeometry
const sphere = new THREE.Mesh(new THREE.SphereGeometry(1, 32, 32), new THREE.MeshBasicMaterial({ color: 0x0000ff }));
scene.add(sphere);
// CylinderGeometry
const cylinder = new THREE.Mesh(new THREE.CylinderGeometry(1, 1, 2, 32), new THREE.MeshBasicMaterial({ color: 0xff00ff }));
scene.add(cylinder);
Loaders
Three.js includes loaders for models and textures, such as GLTFLoader and TextureLoader.
Load Texture:
const textureLoader = new THREE.TextureLoader();
const texture = textureLoader.load('path/to/texture.jpg');
const texturedMaterial = new THREE.MeshBasicMaterial({ map: texture });
const texturedCube = new THREE.Mesh(new THREE.BoxGeometry(), texturedMaterial);
scene.add(texturedCube);
Load GLTF Model:
const loader = new THREE.GLTFLoader();
loader.load('path/to/model.glb', function(gltf) {
scene.add(gltf.scene);
}, undefined, function(error) {
console.error(error);
});
Animation System
Three.js offers a robust animation system for managing object motion and animations.
Simple Animation:
const clock = new THREE.Clock();
const mixer = new THREE.AnimationMixer(gltf.scene);
const action = mixer.clipAction(gltf.animations[0]);
action.play();
function animate() {
requestAnimationFrame(animate);
const delta = clock.getDelta();
mixer.update(delta);
renderer.render(scene, camera);
}
animate();
Complete Code
A complete Three.js application demonstrating a 3D scene with animations and textures.
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>Three.js Example</title>
<style>
body { margin: 0; }
canvas { display: block; }
</style>
</head>
<body>
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r128/three.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/three@0.128.0/examples/js/loaders/GLTFLoader.js"></script>
<script>
// Create scene
const scene = new THREE.Scene();
const clock = new THREE.Clock();
// Create camera
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;
// Create renderer
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
// Create cube
const geometry = new THREE.BoxGeometry();
const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
const cube = new THREE.Mesh(geometry, material);
scene.add(cube);
// Add light source
const light = new THREE.PointLight(0xffffff);
light.position.set(10, 10, 10);
scene.add(light);
// Texture loader
const textureLoader = new THREE.TextureLoader();
const texture = textureLoader.load('path/to/texture.jpg');
// Textured material
const texturedMaterial = new THREE.MeshBasicMaterial({ map: texture });
const texturedCube = new THREE.Mesh(new THREE.BoxGeometry(), texturedMaterial);
scene.add(texturedCube);
// GLTF model loading
const loader = new THREE.GLTFLoader();
loader.load('path/to/model.glb', function(gltf) {
scene.add(gltf.scene);
const mixer = new THREE.AnimationMixer(gltf.scene);
const action = mixer.clipAction(gltf.animations[0]);
action.play();
function animate() {
requestAnimationFrame(animate);
const delta = clock.getDelta();
mixer.update(delta);
cube.rotation.x += 0.01;
cube.rotation.y += 0.01;
renderer.render(scene, camera);
}
animate();
}, undefined, function(error) {
console.error(error);
});
</script>
</body>
</html>
Combining Custom Shaders with Three.js Materials
Three.js’s ShaderMaterial class enables custom shader creation, using uniforms and attributes to pass data.
Vertex and Fragment Shaders
Vertex Shader:
const vertexShader = `
varying vec3 vPosition;
void main() {
vPosition = position;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
`;
Fragment Shader:
const fragmentShader = `
precision mediump float;
varying vec3 vPosition;
uniform vec3 uColor;
void main() {
gl_FragColor = vec4(vPosition * uColor, 1.0);
}
`;
Create Custom ShaderMaterial
const uniforms = {
uColor: { value: new THREE.Color(0x00ff00) }
};
const material = new THREE.ShaderMaterial({
vertexShader: vertexShader,
fragmentShader: fragmentShader,
uniforms: uniforms
});
Instantiate Geometry and Apply Material
const geometry = new THREE.BoxGeometry();
const cube = new THREE.Mesh(geometry, material);
scene.add(cube);
Complete Code
A Three.js example using a custom shader for a dynamic-colored cube.
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>Three.js Custom Shader Example</title>
<style>
body { margin: 0; }
canvas { display: block; }
</style>
</head>
<body>
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r128/three.min.js"></script>
<script>
// Create scene
const scene = new THREE.Scene();
// Create camera
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;
// Create renderer
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
// Custom vertex shader
const vertexShader = `
varying vec3 vPosition;
void main() {
vPosition = position;
gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
`;
// Custom fragment shader
const fragmentShader = `
precision mediump float;
varying vec3 vPosition;
uniform vec3 uColor;
void main() {
gl_FragColor = vec4(vPosition * uColor, 1.0);
}
`;
// Define uniforms
const uniforms = {
uColor: { value: new THREE.Color(0x00ff00) }
};
// Create ShaderMaterial
const material = new THREE.ShaderMaterial({
vertexShader: vertexShader,
fragmentShader: fragmentShader,
uniforms: uniforms
});
// Create geometry and apply material
const geometry = new THREE.BoxGeometry();
const cube = new THREE.Mesh(geometry, material);
scene.add(cube);
// Render function
function animate() {
requestAnimationFrame(animate);
cube.rotation.x += 0.01;
cube.rotation.y += 0.01;
renderer.render(scene, camera);
}
animate();
</script>
</body>
</html>
Leveraging WebGL2 Features
WebGL2 introduces features such as vertex array objects in core, multiple render targets (multiple fragment shader outputs), floating-point textures, and transform feedback.
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>WebGL2 Features</title>
<script src="https://cdnjs.cloudflare.com/ajax/libs/gl-matrix/2.8.1/gl-matrix-min.js"></script>
</head>
<body>
<canvas id="webgl-canvas" width="600" height="400"></canvas>
<script>
const glCanvas = document.getElementById("webgl-canvas");
const gl = glCanvas.getContext("webgl2");
if (!gl) {
console.error("WebGL2 not supported");
}
// Vertex shader code ("#version 300 es" must be the very first characters of the source)
const vertexShaderSource = `#version 300 es
in vec2 position;
void main() {
gl_Position = vec4(position, 0.0, 1.0);
}
`;
// Fragment shader code: with more than one output, layout locations are required
const fragmentShaderSource = `#version 300 es
precision mediump float;
layout(location = 0) out vec4 fragColor1;
layout(location = 1) out vec4 fragColor2;
void main() {
fragColor1 = vec4(1.0, 0.0, 0.0, 1.0); // Red
fragColor2 = vec4(0.0, 0.0, 1.0, 1.0); // Blue (written only with an MRT framebuffer bound)
}
`;
// Create vertex shader
const vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, vertexShaderSource);
gl.compileShader(vertexShader);
// Create fragment shader
const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, fragmentShaderSource);
gl.compileShader(fragmentShader);
// Create shader program
const shaderProgram = gl.createProgram();
gl.attachShader(shaderProgram, vertexShader);
gl.attachShader(shaderProgram, fragmentShader);
gl.linkProgram(shaderProgram);
gl.useProgram(shaderProgram);
// Create and bind vertex array object
const vao = gl.createVertexArray();
gl.bindVertexArray(vao);
// Create and bind vertex buffer
const vertices = new Float32Array([
-0.5, -0.5,
0.5, -0.5,
0.0, 0.5
]);
const vbo = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
// Set vertex attribute pointer
const positionAttribLocation = gl.getAttribLocation(shaderProgram, "position");
gl.enableVertexAttribArray(positionAttribLocation);
gl.vertexAttribPointer(positionAttribLocation, 2, gl.FLOAT, false, 0, 0);
// Clear canvas
gl.clearColor(0.0, 0.0, 0.0, 1.0);
gl.clear(gl.COLOR_BUFFER_BIT);
// Draw triangle
gl.drawArrays(gl.TRIANGLES, 0, 3);
</script>
</body>
</html>
- Polygon Offset: Not used in this example, but it resolves Z-buffer conflicts (z-fighting) between overlapping surfaces.
- Floating-Point Textures: Not used here, but WebGL2 supports storing floating-point values in textures, useful for scientific computing.
- Multiple Outputs: The fragment shader defines two color outputs (fragColor1, fragColor2), each targeting a different draw buffer.
- Vertex Array Objects (VAOs): gl.createVertexArray() and gl.bindVertexArray(vao) store vertex attribute configurations, improving efficiency.
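As a hedged sketch of how polygon offset and floating-point textures are used in practice, the function below shows the relevant calls on an existing WebGL2 context; the function name is illustrative, not part of the example above.

```javascript
// Sketch: polygon offset and a floating-point texture on a WebGL2 context `gl`.
function applyWebGL2Extras(gl) {
  // Polygon offset: push filled polygons slightly back in depth to
  // avoid z-fighting with coplanar geometry (e.g., decals on a wall).
  gl.enable(gl.POLYGON_OFFSET_FILL);
  gl.polygonOffset(1.0, 1.0); // factor, units

  // Floating-point texture: WebGL2 accepts RGBA32F as an internal format,
  // useful for scientific data or HDR intermediate results.
  const data = new Float32Array([0.5, 1.5, -2.0, 1.0]); // one RGBA texel
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA32F, 1, 1, 0, gl.RGBA, gl.FLOAT, data);
  // Float textures are not linearly filterable without an extension; use NEAREST.
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  return tex;
}
```

Note that rendering *into* a float texture additionally requires the EXT_color_buffer_float extension; uploading and sampling one, as above, is core WebGL2.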
WebGPU and the Future of Three.js
WebGPU is a low-level, high-performance Web API for direct GPU access, offering greater flexibility and performance. Though still evolving, its impact on Three.js is emerging.
WebGPU API Basics
WebGPU uses GPUDevice, GPUQueue, GPUBuffer, and GPUShaderModule for GPU interaction, requiring manual memory management, command scheduling, and pipeline creation—more low-level than WebGL.
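A minimal sketch of that flow, assuming a browser that exposes navigator.gpu; none of this is Three.js-specific, and the function name is illustrative:

```javascript
// Sketch: requesting a GPUDevice and uploading vertex data to a GPUBuffer.
// Guarded so it stays inert outside a WebGPU-capable browser.
async function initWebGPU() {
  const adapter = await navigator.gpu.requestAdapter();
  const device = await adapter.requestDevice();

  // Manual memory management: buffer size and usage flags are explicit.
  const vertexData = new Float32Array([-0.5, -0.5, 0.5, -0.5, 0.0, 0.5]);
  const buffer = device.createBuffer({
    size: vertexData.byteLength,
    usage: GPUBufferUsage.VERTEX | GPUBufferUsage.COPY_DST,
  });
  // Data transfer is scheduled through the device's queue,
  // rather than through implicit global state as in WebGL.
  device.queue.writeBuffer(buffer, 0, vertexData);
  return { device, buffer };
}

if (typeof navigator !== 'undefined' && navigator.gpu) {
  initWebGPU();
}
```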
Three.js WebGPU Adaptation
To leverage WebGPU, the Three.js community must:
- Develop an abstraction layer to map existing APIs to WebGPU, minimizing code changes.
- Rewrite rendering logic to fit WebGPU’s pipeline and resource management.
- Expose new features like compute shaders, texture views, and variable-rate shading.
Performance Benefits
WebGPU offers:
- Lower abstraction for direct hardware control.
- Parallel computing via compute shaders.
- Fine-grained memory and data transfer management.
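To make the compute-shader point concrete, here is a hedged sketch of a WGSL kernel that doubles every element of a buffer in parallel; the workgroup size, binding layout, and function names are illustrative assumptions, not an established Three.js API.

```javascript
// WGSL compute shader: doubles each element of a storage buffer in parallel.
const doubleWGSL = `
@group(0) @binding(0) var<storage, read_write> data: array<f32>;

@compute @workgroup_size(64)
fn main(@builtin(global_invocation_id) id: vec3<u32>) {
  if (id.x < arrayLength(&data)) {
    data[id.x] = data[id.x] * 2.0;
  }
}
`;

// Dispatch sketch (requires a GPUDevice from a WebGPU-capable browser).
async function runDouble(device, input) {
  const buf = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_DST | GPUBufferUsage.COPY_SRC,
  });
  device.queue.writeBuffer(buf, 0, input);
  const pipeline = device.createComputePipeline({
    layout: 'auto',
    compute: { module: device.createShaderModule({ code: doubleWGSL }), entryPoint: 'main' },
  });
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer: buf } }],
  });
  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(input.length / 64)); // one thread per element
  pass.end();
  device.queue.submit([encoder.finish()]);
  return buf;
}
```

Nothing like this is possible in WebGL without abusing fragment shaders and render targets, which is why compute shaders are a headline feature for Three.js on WebGPU.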
Future Trends
- Gradual migration to support both WebGL and WebGPU.
- Community-driven evolution through feedback and contributions.
- Increased tutorials and examples for WebGPU in Three.js.
WebGPU Introduction and Outlook
WebGPU is an emerging Web API for low-level graphics and compute access, inspired by Direct3D 12, Vulkan, and Metal, aiming for native-like performance.
Performance Enhancements
- Low-Level Interface: Direct GPU control reduces overhead.
- Multithreading: Concurrent task execution.
- Efficient Resource Management: Optimized memory and data transfer.
Security and Stability
- Sandbox Environment: Restricts harmful operations.
- Validation: Strict checks during compilation and execution.
Cross-Platform Compatibility
- Standardization: W3C standard for consistent browser support.
- Hardware Support: Targets diverse devices, from mobile to desktop.
Advanced Graphics Features
- Variable-Rate Shading: Adjusts rendering quality per region.
- Compute Shaders: General-purpose GPU computing.
- Texture Samplers: Advanced sampling modes.
Developer Tools
- Debugging and profiling tools to optimize GPU code.
Progressive Enhancement
- Backward compatibility with WebGL fallback layers.
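The fallback idea can be sketched as a small detection helper; the function names are illustrative:

```javascript
// Pick the best available rendering backend, falling back gracefully.
function pickBackend(hasWebGPU, hasWebGL2, hasWebGL) {
  if (hasWebGPU) return 'webgpu';
  if (hasWebGL2) return 'webgl2';
  if (hasWebGL) return 'webgl';
  return 'none';
}

// In a browser, the capability flags would be detected like this:
function detectBackend() {
  const canvas = document.createElement('canvas');
  return pickBackend(
    typeof navigator !== 'undefined' && 'gpu' in navigator,
    !!canvas.getContext('webgl2'),
    !!canvas.getContext('webgl')
  );
}
```

An application would then construct the matching renderer (e.g., a WebGPU-based renderer when available, WebGLRenderer otherwise) behind a common interface.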
Future Applications
- AR/VR with complex 3D scenes.
- High-performance games and graphics apps.
- Scientific visualizations for large datasets.
Using Three.js Experimental WebGPU Support
Three.js is exploring WebGPU support, but it’s unstable, and APIs may change. Below is a simplified example assuming a WebGPU-compatible Three.js version and browser.
import * as THREE from 'three';
import { WebGPURenderer } from 'three/examples/jsm/renderers/webgpu/WebGPURenderer.js';
// Scene setup
const scene = new THREE.Scene();
// Camera setup
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;
// Geometry and material
const geometry = new THREE.BoxGeometry(1, 1, 1);
const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
// Mesh
const cube = new THREE.Mesh(geometry, material);
scene.add(cube);
// Renderer setup - WebGPU
const renderer = new WebGPURenderer(); // Experimental WebGPU renderer
renderer.setClearColor(0x000000);
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
// Render loop
function animate() {
requestAnimationFrame(animate);
cube.rotation.x += 0.01;
cube.rotation.y += 0.01;
renderer.render(scene, camera);
}
animate();
Notes:
- Import WebGPURenderer.js from Three.js's experimental WebGPU directory. WebGPURenderer replaces WebGLRenderer for WebGPU interaction.
- Some Three.js features may not work or may behave differently due to WebGPU-WebGL differences.
- Check Three.js’s latest documentation for updates on WebGPU support.