Friday, December 12, 2025

Bending Light: Simulating General Relativity in the Browser with Raw WebGL

As software engineers, we often work within the comfortable constraints of Euclidean geometry: grid layouts, vector positions, and linear interpolations. But what happens when the coordinate system itself is warped?

Recently, I built a real-time visualization of a Schwarzschild black hole using raw WebGL and JavaScript. The goal wasn't just to create a pretty image, but to solve a complex rendering problem: How do you perform ray casting when light rays don't travel in straight lines?

This project explores the intersection of high-performance graphics, general relativity, and orbital mechanics, all running at 60 FPS in a standard web browser.

The Challenge: Non-Euclidean Rendering

Standard 3D rendering (rasterization or standard ray tracing) assumes light travels linearly from a source to the camera. In the vicinity of a black hole, extreme gravity bends spacetime. Light follows geodesics—curves defined by the spacetime metric.

To visualize this, we cannot use standard polygon rasterization. Instead, we must use Ray Marching within a Fragment Shader, numerically integrating the light path for every pixel in real time.

Architecture: The "No-Framework" Approach

While libraries like Three.js are excellent for production 3D apps, I chose the raw WebGL API for this simulation.

Why?

  1. Performance Control: I needed direct control over the rendering pipeline and GLSL shaders without framework overhead.

  2. First-Principles Engineering: Understanding the low-level buffer binding and shader compilation (sketched after this list) ensures we aren't relying on "black box" abstractions for critical math.

  3. Portability: The entire engine runs in a single HTML file with zero dependencies.
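To ground points 1 and 2, here is a minimal sketch of the boilerplate a framework would normally hide. It is illustrative, not the project's actual code: the shader sources (VERT_SRC, FRAG_SRC) and the attribute name aPos are placeholders.

// Minimal zero-dependency WebGL setup (illustrative; names are placeholders)
const gl = document.querySelector('canvas').getContext('webgl');

function compile(type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader));
  }
  return shader;
}

const program = gl.createProgram();
gl.attachShader(program, compile(gl.VERTEX_SHADER, VERT_SRC));
gl.attachShader(program, compile(gl.FRAGMENT_SHADER, FRAG_SRC));
gl.linkProgram(program);
gl.useProgram(program);

// A fullscreen quad: every pixel it covers runs the ray-marching fragment shader
const quad = new Float32Array([-1, -1, 1, -1, -1, 1, 1, 1]);
gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
gl.bufferData(gl.ARRAY_BUFFER, quad, gl.STATIC_DRAW);
const loc = gl.getAttribLocation(program, 'aPos');
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);

Everything above fits comfortably inline in a single HTML file, which is what makes the zero-dependency packaging possible.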

The Physics Stack

1. Relativistic Ray Marching (The Shader)

The core logic lives in the Fragment Shader. For every pixel on the screen, we fire a ray. However, instead of a simple vector addition (pos += dir * step), we apply a gravitational deflection force at every step of the march.

We approximate the Schwarzschild metric by modifying the ray's direction vector ($\vec{D}$) based on its distance ($r$) from the singularity:

// Inside the Ray Marching Loop
float stepSize = 0.08 * distToCenter; // Adaptive stepping
vec3 gravityForce = -normalize(rayPos) * (1.5 / (distToCenter * distToCenter));

// Bend the light
rayDir = normalize(rayDir + gravityForce * stepSize * 0.5);
rayPos += rayDir * stepSize;

Optimization Strategy: Note the stepSize. I implemented adaptive ray marching: we take large steps when the photon is far from the black hole (low compute cost) and micro-steps when close to the event horizon (high precision required). This keeps the loop iteration count low (~120 iterations) while maintaining visual fidelity.

2. The Event Horizon & Accretion Disk

We define the Schwarzschild Radius ($R_s$).

  • If a ray's distance drops below $R_s$, it is trapped. The pixel returns black (the shadow).

  • If the ray intersects the equatorial plane ($y \approx 0$) within specific radii, we render the Accretion Disk.

To simulate the Doppler Beaming effect (where the disk looks brighter on the side moving toward the camera), I calculated the dot product of the disk's rotational velocity and the ray direction.
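To make those branches concrete, here is the per-step decision logic, sketched in JavaScript for readability (the production version lives in the GLSL fragment shader). The constants RS, DISK_INNER, DISK_OUTER, and K_BEAM, and the rotation sense of the disk, are illustrative assumptions, not the project's actual values.

// Per-step hit tests, sketched in JS; constants and rotation sense are assumed
const RS = 1.0, DISK_INNER = 2.5, DISK_OUTER = 7.0, K_BEAM = 0.5;

const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const len = (a) => Math.hypot(a[0], a[1], a[2]);

function shade(prevPos, rayPos, rayDir) {
  if (len(rayPos) < RS) return { hit: 'shadow' };   // captured: pixel stays black
  if (prevPos[1] * rayPos[1] < 0) {                 // crossed the y ~ 0 plane this step
    const rho = Math.hypot(rayPos[0], rayPos[2]);   // radius within the disk plane
    if (rho > DISK_INNER && rho < DISK_OUTER) {
      // Disk material orbits tangentially about the y-axis
      const vel = [rayPos[2] / rho, 0, -rayPos[0] / rho];
      // Material moving toward the camera (against the ray) is brightened
      const doppler = 1.0 - K_BEAM * dot(vel, rayDir);
      return { hit: 'disk', brightness: doppler };
    }
  }
  return { hit: null };                             // keep marching
}

Testing the sign change of y between consecutive steps catches the crossing even when a single step jumps over the plane entirely.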

3. Keplerian Orbital Mechanics (The JavaScript)

The background stars aren't simply animated along linear paths. They follow Kepler's Third Law of Planetary Motion ($T^2 \propto r^3$).

I engineered the JavaScript layer to handle the state management of the celestial bodies:

  • Blue Giant: Far orbit, period set to exactly 10.0s.

  • Red Dwarf: Near orbit.

  • Math: I calculated the inner star's period dynamically based on the ratio of the semi-major axes to ensure physical plausibility.

// Keplerian ratio preserved: T^2 proportional to r^3, so T2 = T1 * (r2 / r1)^1.5
const r2 = 7.5;                       // inner star's semi-major axis (scene units)
const ratio = Math.pow(r2 / r1, 1.5); // r1 = outer star's semi-major axis
const T2 = T1 * ratio;                // T1 = 10.0 s (the Blue Giant's period)

Procedural Generation: Noise without Textures

Loading external textures introduces HTTP requests and cross-origin issues. To keep the architecture monolithic and fast, I generated the starfield and galaxy band procedurally using GLSL hash functions (pseudo-random noise).

By manipulating the frequency and amplitude of the noise, I created a "soft" nebula effect that adds depth without the GPU cost of high-res texture sampling.
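For reference, here is the classic hash idiom this approach relies on, ported to JavaScript for illustration; the project's version runs in GLSL, and its exact constants may differ.

// JS port of the common GLSL hash: fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453)
function hash2(x, y) {
  const s = Math.sin(x * 12.9898 + y * 78.233) * 43758.5453;
  return s - Math.floor(s); // fract(): fractional part in [0, 1)
}

// Value noise: smoothly interpolate hashes at the four surrounding lattice points
function noise2(x, y) {
  const xi = Math.floor(x), yi = Math.floor(y);
  const xf = x - xi, yf = y - yi;
  const u = xf * xf * (3 - 2 * xf); // smoothstep fade curves
  const v = yf * yf * (3 - 2 * yf);
  const lerp = (a, b, t) => a + (b - a) * t;
  return lerp(
    lerp(hash2(xi, yi), hash2(xi + 1, yi), u),
    lerp(hash2(xi, yi + 1), hash2(xi + 1, yi + 1), u),
    v
  );
}

Summing a few octaves of noise2 at increasing frequency and decreasing amplitude produces the soft nebula banding described above.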

UX: Spherical Camera System

A visualization is only useful if it can be explored. I implemented a custom camera system based on Spherical Coordinates ($\rho, \theta, \phi$) rather than Cartesian vectors. This prevents "gimbal lock" and allows the user to orbit the singularity smoothly using mouse drags or touch gestures.
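A minimal sketch of such an orbit camera follows, with assumed names and sensitivity values (convention: $\rho$ is the radius, $\theta$ the azimuth, $\phi$ the polar angle).

// Orbit-camera state in spherical coordinates (names and constants are illustrative)
const cam = { rho: 12.5, theta: 0.0, phi: Math.PI / 2 };

function onDrag(dx, dy) {
  cam.theta += dx * 0.005; // azimuth wraps freely
  // Clamp the polar angle just short of the poles so the view never flips
  cam.phi = Math.min(Math.PI - 0.01, Math.max(0.01, cam.phi + dy * 0.005));
}

function cameraPosition() {
  const { rho, theta, phi } = cam;
  return [
    rho * Math.sin(phi) * Math.cos(theta),
    rho * Math.cos(phi),
    rho * Math.sin(phi) * Math.sin(theta),
  ];
}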

The state is logged to the console for debugging specific views:

Camera Pos: [-3.34, 0.10, -12.06], Zoom (Radius): 12.51

Conclusion

This project demonstrates that modern browsers are capable of heavy scientific visualization if we optimize the pipeline correctly. By combining physics-based rendering, adaptive algorithms, and low-level WebGL, we can simulate general relativity in real-time on consumer hardware.
