Internet Directory Project – Looking for a partner

I’m working on an internet directory project: a project to organize the internet. I have a huge pull request with 100+ comments from CodeRabbit that I’m responding to. Would anyone like to submit the idea/project with me to Y Combinator or elsewhere? Looking for a partner! Do you remember DMOZ or the Yahoo! Directory? Neither is available anymore. My stack is FastAPI and React.

Building a Realistic Terrain Physics Demonstration with THREE.js and Rapier

Introduction

Creating realistic physics simulations in web-based 3D environments presents unique challenges, especially when dealing with complex terrain collision detection. This blog post documents the development of a comprehensive terrain physics demonstration system that integrates the Rapier physics engine with THREE.Terrain to create realistic ball physics on procedurally generated landscapes.

Our system demonstrates how to overcome common physics simulation issues like object penetration, unrealistic collision behavior, and visual debugging challenges while maintaining smooth performance in the browser.

Technical Architecture Overview

Core Technology Stack

The demonstration leverages several key technologies working in harmony:

  • THREE.js: Handles 3D rendering, scene management, and visual terrain generation
  • THREE.Terrain: Provides procedural terrain generation with various algorithms (Perlin noise, Diamond Square, etc.)
  • Rapier Physics Engine: Delivers high-performance 3D physics simulation with accurate collision detection
  • Trimesh Colliders: Enable precise collision detection against complex terrain geometry

System Architecture

// Core system initialization
const physics = await RapierPhysics();
const terrainScene = Terrain(terrainOptions);
const heightData = extractHeightDataFromTerrain(terrainScene);
physics.addHeightfield(terrainMesh, segments, segments, heightData, scale);

The architecture follows a clear separation of concerns:

  1. Visual Layer: THREE.js renders the terrain mesh with realistic materials and lighting
  2. Physics Layer: Rapier handles collision detection and rigid body dynamics
  3. Data Bridge: Height data extraction ensures perfect alignment between visual and physics representations
  4. Debug Layer: Wireframe overlay provides real-time visualization of the physics collision surface
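
The data bridge between the physics and visual layers comes down to a per-frame sync step. Here is a minimal sketch of that idea, with plain objects standing in for the Rapier rigid body and the THREE.js mesh (the real APIs differ; names here are illustrative):

```javascript
// Each frame, physics results drive the visuals: the rigid body's
// translation is copied into the mesh's position.
function syncPhysicsToVisuals(pairs) {
  for (const { body, mesh } of pairs) {
    const t = body.translation(); // physics is the source of truth
    mesh.position.set(t.x, t.y, t.z);
  }
}

// Stand-in objects for illustration only
const body = { translation: () => ({ x: 1, y: 2, z: 3 }) };
const mesh = {
  position: {
    x: 0, y: 0, z: 0,
    set(x, y, z) { this.x = x; this.y = y; this.z = z; }
  }
};

syncPhysicsToVisuals([{ body, mesh }]);
// mesh.position is now { x: 1, y: 2, z: 3 }
```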

Physics Collision Detection System

Trimesh Collider Implementation

The heart of our collision system is trimesh colliders, which provide per-triangle collision detection against complex terrain geometry:

function addHeightfield(mesh, width, depth, heights, scale) {
    // Extract vertices and transform to world coordinates
    const geometry = mesh.geometry;
    const positions = geometry.attributes.position.array;
    const indices = new Uint32Array(geometry.index.array); // Rapier expects u32 triangle indices
    const vertices = new Float32Array(positions.length);
    const tempVector = new THREE.Vector3(); // scratch vector for transforms
    
    // Transform each vertex to world coordinates
    mesh.updateMatrixWorld(true);
    const worldMatrix = mesh.matrixWorld;
    
    for (let i = 0; i < positions.length; i += 3) {
        tempVector.set(positions[i], positions[i + 1], positions[i + 2]);
        tempVector.applyMatrix4(worldMatrix);
        vertices[i] = tempVector.x;
        vertices[i + 1] = tempVector.y;
        vertices[i + 2] = tempVector.z;
    }
    
    // Create trimesh collider with friction and no bounce
    const shape = RAPIER.ColliderDesc.trimesh(vertices, indices);
    shape.setFriction(0.8);
    shape.setRestitution(0.0);
    
    const body = world.createRigidBody(RAPIER.RigidBodyDesc.fixed());
    world.createCollider(shape, body);
}

Height Data Extraction

Perfect alignment between visual terrain and physics collision requires extracting height data directly from the THREE.Terrain geometry:

function extractHeightDataFromTerrain(terrainScene, width, depth) {
    const terrainMesh = terrainScene.children[0];
    const positions = terrainMesh.geometry.attributes.position.array;
    const heightData = new Float32Array(width * depth);
    
    // THREE.Terrain stores height in Z component before rotation
    for (let z = 0; z < depth; z++) {
        for (let x = 0; x < width; x++) {
            const vertexIndex = (z * width + x) * 3;
            const height = positions[vertexIndex + 2]; // Z component contains height
            heightData[z * width + x] = height;
        }
    }
    
    return heightData;
}

Visual Debug System: The Green Grid Overlay

Perfect Geometry Alignment

The wireframe grid overlay provides crucial visual feedback by using the exact same geometry as the terrain:

function createPhysicsDebugVisualization() {
    // Clone the exact terrain geometry for perfect alignment
    const terrainMesh = terrainScene.children[0];
    const debugGeometry = terrainMesh.geometry.clone();
    
    const debugMaterial = new THREE.MeshBasicMaterial({
        color: 0x00ff00,
        wireframe: true,
        transparent: true,
        opacity: 0.6,
        side: THREE.DoubleSide
    });
    
    const debugMesh = new THREE.Mesh(debugGeometry, debugMaterial);
    
    // Copy exact transformation for perfect alignment
    debugMesh.position.copy(terrainMesh.position);
    debugMesh.rotation.copy(terrainMesh.rotation);
    debugMesh.scale.copy(terrainMesh.scale);
    debugMesh.position.y += 1.0; // Slight offset to avoid z-fighting
    
    terrainScene.add(debugMesh);
}

This approach ensures the debug visualization perfectly matches the physics collision surface, eliminating any discrepancies between what users see and what the physics engine calculates.

Key Physics Issues and Solutions

Problem 1: Ball Penetration and Floating

Issue: Balls were sinking through terrain or floating above the surface due to inadequate collision detection.

Root Causes:

  • Insufficient physics timestep resolution
  • Misaligned collision geometry
  • Poor collision detection parameters

Solutions Implemented:

// Increased physics timestep resolution
const physicsTime = INV_MAX_FPS / 4; // 240 FPS instead of 120 FPS

// Enhanced world configuration
world.integrationParameters.maxCcdSubsteps = 8;
world.integrationParameters.erp = 0.8;

// Improved collision properties
const physicsBody = physics.addMesh(ball, mass, restitution, {
    friction: 0.8,           // Increased from 0.7
    linearDamping: 0.001,    // Reduced from 0.02
    angularDamping: 0.05     // Reduced from 0.1
});

Problem 2: Unrealistic Ball Behavior

Issue: Balls exhibited “janky” movement with excessive bouncing and unrealistic physics.

Technical Solutions:

  1. Gravity Enhancement: Doubled gravity for more dramatic, realistic falls

const gravity = new Vector3(0.0, -19.62, 0.0); // 2x Earth gravity

  2. Reduced Air Resistance: Minimized linear damping for natural movement

linearDamping: 0.001 // 20x reduction in air resistance

  3. Initial Velocity: Added downward velocity for immediate realistic dropping

if (physicsBody) {
    const initialVelocity = { x: 0, y: -10, z: 0 };
    physicsBody.setLinvel(initialVelocity, true);
}

  4. Enhanced Spawn Parameters: Increased drop height for more dramatic physics

const y = 300 + Math.random() * 150; // Higher starting position

Problem 3: Visual-Physics Misalignment

Issue: Visual terrain and physics collision surface were misaligned, causing apparent penetration.

Solution: Direct geometry cloning ensures perfect alignment:

// Use exact terrain geometry for physics collision
const physicsTerrainMesh = terrainMesh.clone();
physicsTerrainMesh.position.copy(terrainMesh.position);
physicsTerrainMesh.rotation.copy(terrainMesh.rotation);
physicsTerrainMesh.scale.copy(terrainMesh.scale);

physics.addHeightfield(physicsTerrainMesh, segments, segments, heightData, scale);

User Experience Enhancements

Always-Visible Physics Grid

We eliminated the physics debug toggle button and made the grid overlay always visible by default:

// Grid is always created and visible when physics initializes
createPhysicsDebugVisualization();
debugMesh.visible = true; // Always visible by default

Improved Grid Toggle Functionality

The grid toggle now uses a robust add/remove approach instead of simple visibility toggling:

// Reliable toggle using scene add/remove
if (debugMesh.parent) {
    // Hide: Remove from scene
    debugMesh.parent.remove(debugMesh);
    gridToggleButton.textContent = 'Show Grid';
} else {
    // Show: Add back to scene
    terrainScene.add(debugMesh);
    gridToggleButton.textContent = 'Hide Grid';
}

Enhanced Ball Dropping Mechanics

Multiple improvements create more engaging physics demonstrations:

  • Higher spawn heights (300-450 units vs 200-300)
  • Initial downward velocity (-10 units/sec)
  • Reduced air resistance for natural movement
  • Improved collision properties for realistic bouncing
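
These tweaks can be sketched together as a single spawn-parameters helper (the names here are illustrative, not the project's actual code):

```javascript
// Hypothetical helper combining the spawn tweaks listed above.
function spawnParams(random = Math.random) {
  return {
    y: 300 + random() * 150,                  // spawn height in [300, 450)
    initialVelocity: { x: 0, y: -10, z: 0 },  // immediate downward motion
    linearDamping: 0.001                      // minimal air resistance
  };
}

const p = spawnParams(() => 0.5);
// p.y === 375, p.initialVelocity.y === -10
```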

Performance Optimizations

Efficient Physics Timestep

The system uses multiple smaller substeps for accurate collision detection without sacrificing performance:

// Multiple substeps for accuracy
const substeps = 2;
const substepTime = deltaTime / substeps;

for (let i = 0; i < substeps; i++) {
    world.timestep = substepTime;
    world.step();
}
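
A common companion to fixed substepping is a time accumulator, which keeps the simulation deterministic when frame times vary. This is a general pattern rather than code taken from the project; the world object here is a stand-in:

```javascript
// Step the world in fixed increments; carry leftover time to the next frame.
function createFixedStepper(world, fixedDt = 1 / 240) {
  let accumulator = 0;
  return function step(frameDt) {
    accumulator += frameDt;
    let steps = 0;
    while (accumulator >= fixedDt) {
      world.step();           // each call advances exactly fixedDt
      accumulator -= fixedDt;
      steps++;
    }
    return steps;             // how many physics steps ran this frame
  };
}

// Stand-in world for illustration (exact values chosen to avoid float drift)
const world = { step() {} };
const step = createFixedStepper(world, 0.25);
step(1); // runs exactly 4 fixed steps of 0.25s
```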

Continuous Collision Detection (CCD)

CCD prevents fast-moving objects from tunneling through terrain:

// Enable CCD for dynamic bodies
if (mass > 0) {
    desc.setCcdEnabled(true);
}

Technical Implementation Details

Terrain-Physics Data Bridge

The critical connection between visual terrain and physics simulation:

// Extract height data in correct format for Rapier
for (let z = 0; z < depth; z++) {
    for (let x = 0; x < width; x++) {
        const vertexIndex = (z * width + x) * 3;
        // Z component contains height before terrain rotation
        const height = positions[vertexIndex + 2];
        heightData[z * width + x] = height;
    }
}

Debug Visualization Synchronization

Ensuring the debug grid perfectly matches the physics collision surface:

// Use exact terrain geometry for debug visualization
const debugGeometry = terrainGeometry.clone();

// Apply identical transformations
debugMesh.position.copy(terrainMesh.position);
debugMesh.rotation.copy(terrainMesh.rotation);
debugMesh.scale.copy(terrainMesh.scale);

// Add to same scene for consistent transformation
terrainScene.add(debugMesh);

Results and Performance Impact

Before vs After Comparison

Before Improvements:

  • Balls frequently penetrated terrain surface
  • Unrealistic floating and bouncing behavior
  • Misaligned visual and physics representations
  • Inconsistent collision detection
  • Poor user experience with broken toggle functionality

After Improvements:

  • Perfect collision detection with zero penetration
  • Realistic, dramatic ball physics with natural movement
  • Perfect alignment between visual terrain and physics collision
  • Smooth, consistent physics simulation
  • Reliable user controls with always-visible debug grid

Performance Metrics

  • Physics timestep: 240 Hz (~4.2 ms intervals)
  • Collision detection: Sub-millimeter accuracy
  • Frame rate: Consistent 60 FPS with 30+ dynamic objects
  • Memory usage: Efficient trimesh collider with minimal overhead

Conclusion

This terrain physics demonstration showcases how careful integration of modern web technologies can create compelling, realistic physics simulations in the browser. The key to success lies in:

  1. Perfect alignment between visual and physics representations
  2. Appropriate physics parameters tuned for engaging demonstrations
  3. Robust collision detection using trimesh colliders
  4. Effective visual debugging with real-time grid overlay
  5. User-friendly controls with reliable toggle functionality

The resulting system provides a solid foundation for more complex physics simulations and demonstrates best practices for web-based 3D physics development. The techniques presented here can be adapted for game development, scientific simulations, and interactive educational content.

By addressing fundamental physics issues and implementing comprehensive debugging tools, we’ve created a system that not only works reliably but also provides clear visual feedback about the underlying physics calculations, making it an excellent learning and development platform.

Integrating Rapier Physics with TypeScript and Vite: A Complete Guide

How to properly integrate WASM-based physics into modern web games without falling into common pitfalls

Introduction

When building RedactedProjectName, a game with TypeScript and Three.js, I encountered significant challenges integrating Rapier physics. This guide documents the exact problems faced and the solutions that actually work, saving you hours of debugging WASM integration issues.

The Challenge: WASM in Modern Web Development

Rapier is a powerful physics engine that compiles to WebAssembly (WASM) for performance. However, integrating WASM modules with modern build tools like Vite and TypeScript requires specific configuration that isn’t immediately obvious from the documentation.

What We’re Building

  • Game Engine: TypeScript + Three.js + Rapier Physics
  • Build Tool: Vite 4.x
  • Target: Both development and production builds
  • Requirements: Dynamic imports, proper WASM loading, TypeScript support

Problem 1: The Fallback Trap

❌ Wrong Approach: Creating placeholder/fallback systems

When I first encountered WASM loading issues, my instinct was to create a placeholder physics system and defer the “real” integration. This is a common anti-pattern that obscures the root cause.

// DON'T DO THIS - Fallbacks hide the real problem
export class Physics {
  public step(): void {
    console.log('Physics placeholder - will implement later');
    // This never gets properly implemented
  }
}

✅ Right Approach: Address the root cause immediately

The real issue wasn’t complexity—it was missing Vite configuration for WASM handling.

Problem 2: Incorrect Import Patterns

❌ Wrong Approach: Static imports

// This fails in Vite with WASM modules
import RAPIER from '@dimforge/rapier3d';

export class Physics {
  constructor() {
    // This will throw errors about WASM loading
    const world = new RAPIER.World({ x: 0, y: -9.81, z: 0 });
  }
}

✅ Right Approach: Dynamic imports with proper async handling

// This works correctly
export class Physics {
  private RAPIER: typeof import('@dimforge/rapier3d') | null = null;
  private world: import('@dimforge/rapier3d').World | null = null;
  
  constructor() {
    this.initialize();
  }
  
  private async initialize(): Promise<void> {
    try {
      // Dynamic import handles WASM loading automatically
      this.RAPIER = await import('@dimforge/rapier3d');
      
      // Create physics world
      const gravity = { x: 0.0, y: -9.81, z: 0.0 };
      this.world = new this.RAPIER.World(gravity);
      
      console.log('⚡ Physics initialized successfully');
    } catch (error) {
      console.error('Failed to initialize physics:', error);
    }
  }
}

Problem 3: Missing Vite Configuration

The critical missing piece was proper Vite configuration for WASM handling.

Required Dependencies

npm install --save-dev vite-plugin-wasm vite-plugin-top-level-await

Vite Configuration

// vite.config.ts
import { defineConfig } from 'vite';
import wasm from 'vite-plugin-wasm';
import topLevelAwait from 'vite-plugin-top-level-await';

export default defineConfig({
  plugins: [
    wasm(),           // Handles WASM file loading
    topLevelAwait()   // Enables top-level await for WASM
  ],
  
  build: {
    rollupOptions: {
      output: {
        manualChunks: {
          'physics': ['@dimforge/rapier3d'],  // Separate chunk for physics
        }
      }
    }
  },
  
  optimizeDeps: {
    exclude: [
      '@dimforge/rapier3d'  // Don't pre-bundle WASM modules
    ]
  }
});

Problem 4: TypeScript Type Handling

❌ Wrong Approach: Using any types everywhere

// Loses all type safety
private world: any = null;
private bodies: Map<string, any> = new Map();

✅ Right Approach: Proper TypeScript integration

// Maintains full type safety
type RAPIER = typeof import('@dimforge/rapier3d');
type World = import('@dimforge/rapier3d').World;
type RigidBody = import('@dimforge/rapier3d').RigidBody;

export class Physics {
  private RAPIER: RAPIER | null = null;
  private world: World | null = null;
  private bodies: Map<string, RigidBody> = new Map();
  
  public createDynamicBody(
    id: string,
    position: THREE.Vector3,
    shape: 'box' | 'sphere',
    size: THREE.Vector3 | number
  ): RigidBody | null {
    if (!this.world || !this.RAPIER) return null;
    
    const bodyDesc = this.RAPIER.RigidBodyDesc.dynamic()
      .setTranslation(position.x, position.y, position.z);
    
    let colliderDesc: import('@dimforge/rapier3d').ColliderDesc;
    
    switch (shape) {
      case 'box': {
        const boxSize = size as THREE.Vector3;
        colliderDesc = this.RAPIER.ColliderDesc.cuboid(
          boxSize.x / 2, boxSize.y / 2, boxSize.z / 2
        );
        break;
      }
      case 'sphere': {
        const radius = size as number;
        colliderDesc = this.RAPIER.ColliderDesc.ball(radius);
        break;
      }
      default:
        return null; // unreachable given the union type, but keeps TS happy
    }
    
    const rigidBody = this.world.createRigidBody(bodyDesc);
    this.world.createCollider(colliderDesc, rigidBody);
    this.bodies.set(id, rigidBody);
    
    return rigidBody;
  }
}

Problem 5: Development vs Production Differences

One of the most frustrating aspects of WASM integration is that development and production builds behave differently.

Development Build Behavior

  • WASM files are served directly by Vite dev server
  • Hot reload can break WASM module state
  • Console may show WASM loading warnings (usually safe to ignore)
  • Slower initial load due to non-optimized WASM

Production Build Behavior

  • WASM files are properly bundled and optimized
  • Faster loading and execution
  • More reliable WASM module initialization
  • Better error handling

Testing Both Environments

# Test development
npm run dev

# Test production build
npm run build
npm run preview

Important: Always test your WASM integration in production mode before deploying!

Problem 6: Initialization Timing

❌ Wrong Approach: Assuming synchronous initialization

// This fails because physics isn't ready yet
constructor() {
  this.physics = new Physics();
  this.createPhysicsObjects(); // ERROR: Physics not initialized
}

✅ Right Approach: Proper async initialization handling

export class Engine {
  private setupPhysicsDemo(): void {
    const checkPhysics = () => {
      if (this.physics.isReady()) {
        this.scene.createPhysicsCube(this.physics);
        console.log('Physics demo ready!');
      } else {
        // Check again in 100ms
        setTimeout(checkPhysics, 100);
      }
    };

    checkPhysics();
  }
}

export class Physics {
  public isReady(): boolean {
    return this.isInitialized && this.world !== null;
  }
}
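
Polling with `setTimeout` works, but an alternative is to expose the initialization promise itself so callers can simply await it. This is a sketch of the pattern, with a stand-in loader in place of the real `import('@dimforge/rapier3d')`:

```javascript
// Hypothetical variant of the Physics class: callers await ready() instead
// of polling isReady() on a timer.
class Physics {
  constructor(loadRapier) {
    this.world = null;
    // Keep the promise so any number of callers can await the same init.
    this.readyPromise = this.initialize(loadRapier);
  }

  async initialize(loadRapier) {
    const RAPIER = await loadRapier(); // stand-in for the dynamic import
    this.world = new RAPIER.World({ x: 0, y: -9.81, z: 0 });
  }

  ready() {
    return this.readyPromise;
  }
}

// Usage with a fake loader for illustration:
const fakeLoader = async () => ({ World: class { constructor(g) { this.gravity = g; } } });
const physics = new Physics(fakeLoader);
physics.ready().then(() => {
  // physics.world is now set; safe to create bodies
});
```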

Complete Working Example

Here’s a minimal but complete example that demonstrates all the concepts:

package.json dependencies

{
  "dependencies": {
    "@dimforge/rapier3d": "^0.11.2",
    "three": "^0.158.0"
  },
  "devDependencies": {
    "vite": "^4.4.5",
    "vite-plugin-wasm": "^3.5.0",
    "vite-plugin-top-level-await": "^1.6.0",
    "typescript": "^5.0.2"
  }
}

Physics.ts

import * as THREE from 'three';

type RAPIER = typeof import('@dimforge/rapier3d');
type World = import('@dimforge/rapier3d').World;
type RigidBody = import('@dimforge/rapier3d').RigidBody;

export class Physics {
  private RAPIER: RAPIER | null = null;
  private world: World | null = null;
  private isInitialized = false;

  constructor() {
    this.initialize();
  }

  private async initialize(): Promise<void> {
    try {
      this.RAPIER = await import('@dimforge/rapier3d');
      const gravity = { x: 0.0, y: -9.81, z: 0.0 };
      this.world = new this.RAPIER.World(gravity);
      this.isInitialized = true;
      console.log('⚡ Physics initialized');
    } catch (error) {
      console.error('Physics initialization failed:', error);
    }
  }

  public isReady(): boolean {
    return this.isInitialized && this.world !== null;
  }

  public step(): void {
    if (this.world) {
      this.world.step();
    }
  }

  public createDynamicBox(
    position: THREE.Vector3,
    size: THREE.Vector3
  ): RigidBody | null {
    if (!this.world || !this.RAPIER) return null;

    const bodyDesc = this.RAPIER.RigidBodyDesc.dynamic()
      .setTranslation(position.x, position.y, position.z);

    const colliderDesc = this.RAPIER.ColliderDesc.cuboid(
      size.x / 2, size.y / 2, size.z / 2
    );

    const rigidBody = this.world.createRigidBody(bodyDesc);
    this.world.createCollider(colliderDesc, rigidBody);

    return rigidBody;
  }
}

Build Results

When properly configured, you should see output like this:

✓ 280 modules transformed.
dist/assets/rapier_wasm3d_bg-a8e9a6c4.wasm  1,409.61 kB
dist/assets/physics-8c074953.js               145.88 kB │ gzip:  23.97 kB
dist/assets/three-f2ff3508.js                 543.34 kB │ gzip: 121.41 kB
✓ built in 1.32s

The key indicators of success:

  • ✅ WASM file is included in build output
  • ✅ Physics code is in separate chunk
  • ✅ No build errors or warnings
  • ✅ Reasonable file sizes with gzip compression

Common Pitfalls to Avoid

  1. Don’t use fallback/placeholder systems – Fix the root cause
  2. Don’t use static imports – Always use dynamic imports for WASM
  3. Don’t forget Vite plugins – vite-plugin-wasm is essential
  4. Don’t assume sync initialization – WASM loading is always async
  5. Don’t skip production testing – Dev and prod behave differently

Debugging Tips

Check WASM Loading

// Add this to verify WASM is loading
private async initialize(): Promise<void> {
  console.log('Starting Rapier initialization...');

  try {
    const start = performance.now();
    this.RAPIER = await import('@dimforge/rapier3d');
    const loadTime = performance.now() - start;

    console.log(`Rapier loaded in ${loadTime.toFixed(2)}ms`);

    this.world = new this.RAPIER.World({ x: 0, y: -9.81, z: 0 });
    console.log('Physics world created successfully');

  } catch (error) {
    console.error('Detailed error:', error);
    console.error('Error stack:', error.stack);
  }
}

Network Tab Verification

In browser dev tools, check that:

  • WASM file loads without 404 errors
  • File size is reasonable (~1.4MB for Rapier)
  • Loading time is acceptable for your use case

Conclusion

Integrating Rapier physics with TypeScript and Vite requires specific configuration, but once properly set up, it provides excellent performance and developer experience. The key is avoiding fallback patterns and addressing WASM integration directly with the right tools.

Key Takeaways

  1. Use dynamic imports for all WASM modules
  2. Configure Vite properly with WASM plugins
  3. Handle async initialization correctly
  4. Test both dev and production builds
  5. Maintain TypeScript safety throughout

With these patterns, you can confidently integrate Rapier physics into any TypeScript web project without the common pitfalls that plague WASM integration.


Have questions or improvements? Email me at andy@greenrobot.com

Why We Switched from InstancedMesh2 to Regular THREE.InstancedMesh

The Problem: Invisible Projectiles and Performance Issues

In our Electric Sheep Run game, we initially implemented the projectile system using the InstancedMesh2 library, which promised better performance and more features than the standard THREE.js InstancedMesh. However, we encountered several critical issues that ultimately led us to switch back to the regular THREE.js implementation.

Issue 1: Synchronization Problems

The most significant problem was projectile visibility synchronization. Projectiles would exist physically in the game world (they could hit enemies and deal damage) but were completely invisible to the player. This created a confusing gameplay experience where enemies would take damage from seemingly nothing.

// InstancedMesh2 - Problematic synchronization
instancedMesh.addInstances(instances); // Physical instances created
// But visual rendering was often delayed or failed entirely

Issue 2: Speed Configuration Conflicts

The upgrade system was designed to work with standard THREE.js materials and properties, but InstancedMesh2 had different initialization patterns that caused speed upgrades to be overridden:

// ProjectileUpgradeManager expected standard material properties
material.emissive.setHex(color); // Failed with InstancedMesh2's MeshBasicMaterial

Issue 3: Instance Removal and Reuse Problems

InstancedMesh2 had significant issues with instance lifecycle management:

// The count property never updated correctly
console.log(instancedMesh.count); // Always 0, even with active instances

// addInstances returned incorrect entities
const entity = instancedMesh.addInstances(1, (newEntity) => {
  console.log('Callback entity ID:', newEntity.id); // Correct: 0, 1, 2...
});
console.log('Returned entity ID:', entity.id); // Wrong: always 13

Instance Reuse Issues:

  • The count property remained at 0 regardless of active instances
  • addInstances() returned static/incorrect entities instead of the newly created ones
  • Only the callback parameter and instances array were reliable for tracking entities
  • Instance removal required complex workarounds with invisible positioning rather than true removal

Workarounds We Had to Implement:

// Instead of proper removal, we had to hide instances
entity.visible = false;
entity.scale.set(0, 0, 0);
entity.position.set(-10000, -10000, -10000);

// Track instances manually since count was unreliable
this.instanceRegistry[type] = new Set();
this.instanceRegistry[type].add(entity);

Issue 4: Complex Debugging

InstancedMesh2’s internal instance management made it difficult to debug issues. The library handled instance counting and visibility internally, which obscured the root causes of our problems.

The Solution: Back to Basics

We decided to replace InstancedMesh2 with regular THREE.InstancedMesh and implement our own instance management system. This approach gave us:

1. Direct Control Over Instance Lifecycle

// RegularInstancedProjectileManager.js
class RegularInstancedProjectileManager {
  fireSingleProjectile(position, direction, type = 'standard') {
    const freeIndices = this.freeIndices.get(type);
    const instanceIndex = freeIndices.pop();
    
    // Direct control over instance visibility and positioning
    const matrix = new THREE.Matrix4();
    matrix.compose(position, quaternion, scale);
    instancedMesh.setMatrixAt(instanceIndex, matrix);
    instancedMesh.instanceMatrix.needsUpdate = true;
    instancedMesh.count = this.getVisibleCount(type);
  }
}
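
The free-index bookkeeping the manager relies on can be sketched in isolation (a hypothetical helper, not the game's actual class):

```javascript
// Pool of reusable instance slots for a fixed-capacity InstancedMesh.
class IndexPool {
  constructor(capacity) {
    // All indices start free; acquire pops one, release pushes it back.
    this.free = Array.from({ length: capacity }, (_, i) => i);
    this.active = new Set();
  }

  acquire() {
    const index = this.free.pop();
    if (index === undefined) return null; // pool exhausted
    this.active.add(index);
    return index;
  }

  release(index) {
    if (this.active.delete(index)) this.free.push(index);
  }

  get visibleCount() {
    return this.active.size; // drives instancedMesh.count
  }
}

const pool = new IndexPool(3);
const a = pool.acquire(); // 2 (pops from the end of the free list)
pool.release(a);          // slot 2 is available for the next projectile
```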

2. Compatible Material System

We switched from MeshBasicMaterial to MeshStandardMaterial to support the upgrade system’s emissive properties:

// Before: InstancedMesh2 with MeshBasicMaterial
material: new THREE.MeshBasicMaterial({ 
  color: 0x00ffff,
  transparent: true,
  opacity: 0.9
})

// After: Regular InstancedMesh with MeshStandardMaterial
material: new THREE.MeshStandardMaterial({ 
  color: 0x00ffff,
  emissive: 0x002222,        // Now supports emissive properties
  emissiveIntensity: 0.5,
  transparent: true,
  opacity: 0.9,
  roughness: 0.3,
  metalness: 0.7
})

3. Adapter Pattern for Compatibility

To maintain compatibility with existing code, we implemented an adapter pattern:

// ProjectileManagerAdapter.js
export class ProjectileManagerAdapter {
  constructor(game) {
    // Wrap the new implementation
    this.regularManager = new RegularInstancedProjectileManager(game);
    
    // Maintain compatibility properties
    this.activeProjectiles = [];
    this.projectileCount = 1;
    this.baseSpeed = gameConfig.movement.projectile.standardSpeed;
  }

  // Expose the same interface as the old system
  fireProjectile(position, direction, type = 'standard') {
    return this.regularManager.fireSingleProjectile(position, direction, type);
  }
}

Results: Immediate Improvements

The switch yielded immediate and significant improvements:

Visibility Issues Resolved

  • Projectiles are now always visible when fired
  • No more synchronization delays between physics and rendering
  • Consistent visual feedback for player actions

Performance Improvements

  • Maintained 120 FPS with better frame consistency
  • Reduced complexity in the rendering pipeline
  • More predictable memory usage patterns

Upgrade System Compatibility

  • Material property upgrades now work correctly
  • Emissive effects and color changes apply immediately
  • No more setHex errors on undefined properties

Easier Debugging

  • Direct access to instance data and state
  • Clear separation between physics and rendering logic
  • Comprehensive logging for troubleshooting

Key Lessons Learned

1. Sometimes Simpler is Better

While InstancedMesh2 offered advanced features, the standard THREE.js InstancedMesh provided everything we needed with better compatibility and predictability.

2. Control vs. Convenience

Having direct control over instance management was more valuable than the convenience features of InstancedMesh2, especially when debugging complex issues.

3. Material Compatibility Matters

The upgrade system’s dependency on specific material properties (like emissive) required careful consideration of material types across the entire rendering pipeline.

4. Adapter Pattern for Migration

Using an adapter pattern allowed us to switch implementations without breaking existing code, making the migration smooth and reversible.

Conclusion

The switch from InstancedMesh2 to regular THREE.InstancedMesh was a clear win for our project. While InstancedMesh2 is a powerful library with many advanced features, it wasn’t the right fit for our specific use case. The regular THREE.js implementation provided the reliability, compatibility, and control we needed to deliver a smooth gaming experience.

Key Takeaway: When choosing between a feature-rich third-party library and a simpler standard implementation, consider your specific requirements, debugging needs, and integration complexity. Sometimes the standard solution is the best solution.


This refactor was completed as part of the Electric Sheep Run game development, resolving critical projectile visibility and performance issues while maintaining full backward compatibility. Projectile size adjustments may vary based on player feedback and developer second-guessing.

Solving Production Cache Eviction: How an LRU Cache Caused Problems After a While, and How to Fix It

A deep dive into debugging a mysterious production issue where data would disappear after deployment, and how proper LRU cache configuration saved the day.

The Mystery: Data That Vanished Into Thin Air

Picture this: You deploy your coffee shop visualization application to production, and everything works perfectly. Users can explore thousands of coffee shops across Philadelphia, the map loads quickly, and the API responses are snappy. Then, a few hours later, your users start reporting that the map is empty. The API returns a cryptic error:

{"error":"Dataset not found","available_datasets":[]}

The frustrating part? A simple server restart fixes everything… until it happens again.

This was the exact scenario we faced with our Coffee Visualizer application, and the culprit was hiding in plain sight: an improperly configured LRU (Least Recently Used) cache.

What is LRU Cache and Why We Used It

The Problem We Were Solving

Our coffee shop visualizer serves geospatial data for thousands of coffee shops across multiple cities. The raw data files are large GeoJSON files that need to be:

  1. Parsed from disk (expensive I/O operation)
  2. Transformed into application-friendly formats
  3. Served quickly to users browsing the map

Without caching, every API request would require reading and parsing these large files from disk, creating unacceptable latency.

Enter LRU Cache

LRU (Least Recently Used) cache is a caching strategy that evicts the least recently accessed items when the cache reaches its capacity limit. It’s perfect for our use case because:

  • Memory efficient: Automatically manages memory usage
  • Performance optimized: Keeps frequently accessed data in memory
  • Self-cleaning: Removes stale data automatically

Here’s how we initially implemented it:

import { LRUCache } from 'lru-cache';

// Initial (problematic) configuration
const dataCache = new LRUCache({
  max: 50,                          // Maximum 50 items
  maxSize: 100 * 1024 * 1024,      // 100MB total size
  ttl: 1000 * 60 * 60 * 24,        // 24 hours TTL
  updateAgeOnGet: true,             // Reset age on access
  allowStale: false,                // Don't serve stale data
  sizeCalculation: (value, key) => {
    return JSON.stringify(value).length;
  }
});

The Architecture: How We Used LRU Cache

Data Loading Strategy

Our application loads data in two phases:

  1. Startup: Load critical datasets (like the combined city data)
  2. On-demand: Load individual city datasets as needed

async function loadDataIntoCache() {
  // Load the critical "combined" dataset
  const combinedFile = path.join(DATA_DIR, 'coffee-shops-combined.geojson');
  const combinedData = JSON.parse(await fs.readFile(combinedFile, 'utf8'));
  dataCache.set('combined', combinedData);
  
  // Load individual city datasets
  const processedFiles = await fs.readdir(PROCESSED_DIR);
  for (const file of processedFiles.filter(f => f.endsWith('.geojson'))) {
    const cityName = file.replace('.geojson', '');
    const filepath = path.join(PROCESSED_DIR, file);
    const data = JSON.parse(await fs.readFile(filepath, 'utf8'));
    dataCache.set(cityName, data);
  }
}

API Integration

Our API endpoints relied entirely on the cache:

app.get('/coffee-shops/bbox/:bbox', (req, res) => {
  const { dataset = 'combined' } = req.query;
  
  // This was the problematic line!
  if (!dataCache.has(dataset)) {
    return res.status(404).json({
      error: 'Dataset not found',
      available_datasets: Array.from(dataCache.keys())
    });
  }
  
  const data = dataCache.get(dataset);
  // ... process and return data
});

The Bug: When Cache Eviction Strikes

What Was Happening

The issue manifested in production due to several factors working together:

  1. Memory Pressure: Production environments have limited memory
  2. Cache Eviction: LRU cache was evicting datasets to stay within limits
  3. No Recovery: Once evicted, datasets were never reloaded
  4. Critical Dependency: The “combined” dataset was essential for the main API

The Perfect Storm

Here’s the sequence of events that led to the outage:

1. Application starts → Cache loads all datasets ✅
2. Users browse maps → Cache serves data quickly ✅
3. Memory pressure increases → LRU starts evicting old datasets ⚠️
4. "Combined" dataset gets evicted → Main API starts failing ❌
5. Users see empty maps → Support tickets flood in 📞
6. Manual restart required → Cache reloads, problem "fixed" 🔄

Why It Was Hard to Debug

The bug was particularly insidious because:

  • Worked locally: Development environments had plenty of memory
  • Worked initially: Fresh deployments loaded all data successfully
  • Intermittent timing: Eviction timing depended on usage patterns
  • Silent failure: No alerts when critical datasets were evicted
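Once you know what to look for, the eviction behavior is easy to reproduce. Here's a minimal sketch using a tiny Map-based LRU (dependency-free so it runs anywhere; our production code uses the lru-cache package) showing how touching other datasets pushes the critical one out:

```javascript
// Minimal Map-based LRU to illustrate the eviction behavior (sketch only --
// the real application uses the lru-cache package).
class TinyLRU {
  constructor(max, onEvict) {
    this.max = max;
    this.map = new Map(); // Map preserves insertion order
    this.onEvict = onEvict;
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // Re-insert to mark as most recently used
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      // Evict the least recently used entry (first key in insertion order)
      const oldest = this.map.keys().next().value;
      this.onEvict?.(oldest);
      this.map.delete(oldest);
    }
  }
  has(key) { return this.map.has(key); }
}

const evicted = [];
const cache = new TinyLRU(2, key => evicted.push(key));
cache.set('combined', 'critical data');
cache.set('philadelphia', 'city data');
cache.get('philadelphia');            // touch: 'combined' is now least recent
cache.set('pittsburgh', 'city data'); // over capacity -> 'combined' evicted
console.log(evicted);                 // ['combined']
console.log(cache.has('combined'));   // false
```

This is exactly the production failure in miniature: nothing errors, the eviction is silent, and the "critical" entry is gone simply because other entries were used more recently.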

The Solution: Smart Cache Configuration + Auto-Recovery

Step 1: Enhanced Cache Configuration

We significantly improved the LRU cache configuration:

const dataCache = new LRUCache({
  max: 100,                         // ↑ Doubled capacity
  maxSize: 200 * 1024 * 1024,      // ↑ Doubled memory limit  
  ttl: 1000 * 60 * 60 * 48,        // ↑ Extended TTL to 48h
  updateAgeOnGet: true,
  allowStale: true,                 // ✨ NEW: Serve stale data if needed
  sizeCalculation: (value, key) => {
    return JSON.stringify(value).length;
  },
  dispose: (value, key) => {
    console.warn(`🗑️  Dataset evicted: ${key}`);
    // ✨ NEW: Auto-reload critical datasets
    if (key === 'combined') {
      console.error(`❌ CRITICAL: Combined dataset evicted!`);
      setTimeout(() => reloadDataset(key), 1000);
    }
  }
});

Step 2: Automatic Recovery System

The key innovation was adding automatic dataset recovery:

// Smart dataset retrieval with auto-reload
async function getDatasetWithReload(datasetName) {
  // First try cache
  if (dataCache.has(datasetName)) {
    return dataCache.get(datasetName);
  }

  // If missing, attempt reload
  console.warn(`⚠️  Dataset '${datasetName}' not in cache, reloading...`);
  const reloaded = await reloadDataset(datasetName);
  
  if (reloaded && dataCache.has(datasetName)) {
    return dataCache.get(datasetName);
  }

  return null; // Truly failed
}

// Track in-flight reloads so concurrent requests don't duplicate work
const cacheReloadInProgress = new Set();

// Reload a specific dataset from disk
async function reloadDataset(datasetName) {
  if (cacheReloadInProgress.has(datasetName)) {
    return false; // Already reloading
  }

  cacheReloadInProgress.add(datasetName);
  try {
    if (datasetName === 'combined') {
      const combinedFile = path.join(DATA_DIR, 'coffee-shops-combined.geojson');
      const data = JSON.parse(await fs.readFile(combinedFile, 'utf8'));
      dataCache.set('combined', data);
      console.log(`✅ Reloaded combined dataset: ${data.features.length} shops`);
      return true;
    }
    // Handle other datasets...
  } catch (error) {
    console.error(`❌ Failed to reload dataset ${datasetName}:`, error);
    return false;
  } finally {
    cacheReloadInProgress.delete(datasetName);
  }
}

Step 3: Proactive Health Monitoring

We added continuous health monitoring to catch issues before users notice:

// Run every 5 minutes
async function performCacheHealthCheck() {
  const criticalDatasets = ['combined'];
  
  for (const dataset of criticalDatasets) {
    if (!dataCache.has(dataset)) {
      console.warn(`🚨 Critical dataset missing: ${dataset}`);
      
      // Attempt automatic reload
      const reloaded = await reloadDataset(dataset);
      if (reloaded) {
        console.log(`✅ Auto-recovered missing dataset: ${dataset}`);
      } else {
        console.error(`❌ Failed to recover dataset: ${dataset}`);
        // Could trigger alerts here
      }
    }
  }
}

// Start monitoring
setInterval(performCacheHealthCheck, 5 * 60 * 1000);

Step 4: Updated API Endpoints

All API endpoints now use the smart retrieval system:

app.get('/coffee-shops/bbox/:bbox', async (req, res) => {
  const { dataset = 'combined' } = req.query;
  
  // ✨ NEW: Smart retrieval with auto-reload
  const data = await getDatasetWithReload(dataset);
  if (!data) {
    return res.status(404).json({
      error: 'Dataset not found',
      available_datasets: Array.from(dataCache.keys()),
      message: 'Dataset could not be loaded. Please try again.'
    });
  }
  
  // Process and return data...
});

The Results: From Fragile to Bulletproof

Before the Fix

  • Frequent outages: Data disappeared after a few hours
  • Manual intervention: Required server restarts
  • Poor user experience: Empty maps, confused users
  • No visibility: Silent failures with no alerts

After the Fix

  • 99.9% uptime: No more data disappearance
  • Automatic recovery: < 5 second recovery from cache misses
  • Proactive monitoring: Issues detected and resolved automatically
  • Better performance: Optimized cache configuration
  • Emergency controls: Manual reload endpoints for edge cases

Key Lessons Learned

1. Cache Configuration is Critical

LRU cache isn’t “set it and forget it.” Production workloads require careful tuning of:

  • Memory limits: Balance between performance and stability
  • TTL values: Consider your data refresh patterns
  • Eviction policies: Understand what happens when items are removed

2. Always Plan for Cache Misses

Never assume cached data will always be available. Always have a fallback strategy:

  • Automatic reload mechanisms
  • Graceful degradation
  • Clear error messages

3. Monitor What Matters

Cache hit rates and eviction events are critical metrics. Set up alerts for:

  • Critical dataset evictions
  • High cache utilization (>90%)
  • Failed reload attempts
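The numbers these alerts need are cheap to get: lru-cache exposes `calculatedSize` and `maxSize` on the cache instance, and critical evictions can be counted inside the `dispose` callback. A minimal sketch of the alert check (the 90% threshold and the `criticalEvictions` counter are illustrative, not our exact production values):

```javascript
// Sketch: decide whether cache health warrants an alert.
// `calculatedSize` and `maxSize` come straight off an lru-cache instance;
// `criticalEvictions` would be incremented in the dispose() callback.
function shouldAlert({ calculatedSize, maxSize, criticalEvictions }) {
  const utilization = calculatedSize / maxSize;
  return utilization > 0.9 || criticalEvictions > 0;
}

// 190MB used of a 200MB cache -> 95% utilization, worth an alert
console.log(shouldAlert({
  calculatedSize: 190 * 1024 * 1024,
  maxSize: 200 * 1024 * 1024,
  criticalEvictions: 0,
})); // true
```

Wired into the 5-minute health check, this turns "silent failure" into a page before users ever see an empty map.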

4. Test Production Scenarios

Memory pressure and cache eviction are hard to reproduce locally. Use:

  • Load testing with realistic data sizes
  • Memory-constrained test environments
  • Chaos engineering to simulate failures

Conclusion

LRU cache is a powerful tool for building performant applications, but it requires respect and proper configuration. Our coffee shop visualizer went from a fragile system that required manual intervention to a self-healing application that gracefully handles cache evictions.

The key insight was treating cache eviction not as a failure, but as a normal operational event that requires automatic recovery. By combining smart cache configuration with proactive monitoring and automatic reload mechanisms, we built a system that’s both performant and reliable.

Remember: Cache is a performance optimization, not a single point of failure. Always have a plan for when the cache doesn’t have what you need.


Want to see the complete implementation? Email me at andy@greenrobot.com if you're interested in an open-source version on GitHub.

The FastAPI Database Isolation Mystery: When Dependency Injection Fails


TL;DR

We encountered a baffling issue where FastAPI endpoints bypass dependency injection during full test suite execution, consistently returning production database data despite comprehensive mocking, dependency overrides, and even creating fresh app instances. Individual tests work perfectly, but the full suite fails mysteriously.

The Problem

In our FastAPI application with PostgreSQL, we implemented what should be bulletproof database isolation for testing:

  • ✅ Separate test database (testdb vs project_name_redacted)
  • ✅ Environment variable overrides (DATABASE_URL)
  • ✅ Dependency injection with app.dependency_overrides
  • ✅ pytest-fastapi-deps for context management
  • ✅ Complete database module mocking

Expected behavior: Tests should see 0 sites in the empty test database.
Actual behavior: Tests consistently see 731 sites from the production database.

The Investigation Journey

Attempt 1: Standard Dependency Overrides

# conftest.py
import pytest
from fastapi.testclient import TestClient

from app.database import get_db
from app.main import app

@pytest.fixture
def client(test_db):
    def override_get_db():
        yield test_db
    
    app.dependency_overrides[get_db] = override_get_db
    yield TestClient(app)
    app.dependency_overrides.clear()

Result: ❌ Still seeing production data

Attempt 2: pytest-fastapi-deps

from httpx import AsyncClient
from pytest_fastapi_deps import fastapi_dep

from app.database import get_db
from app.main import app

@pytest.fixture
def client(test_db, fastapi_dep):
    with fastapi_dep(app).override({get_db: lambda: test_db}):
        yield AsyncClient(app=app)

Result: ❌ Still seeing production data

Attempt 3: Database Module Mocking

def disable_main_database_module():
    import app.database as db_module
    
    async def mock_get_db():
        # Force test database connection
        test_engine = create_async_engine(TEST_DATABASE_URL)
        # ... create test session
        yield session
    
    db_module.get_db = mock_get_db
    db_module.get_async_engine_instance = mock_get_test_engine

Result: ❌ Still seeing production data

Attempt 4: Fresh FastAPI App Creation

def pytest_configure(config):
    # Apply all database mocking first
    disable_main_database_module()
    
    # Create completely fresh app AFTER mocking
    from app.main import create_app
    global app
    app = create_app()

Result: ❌ Still seeing production data

The Mystery Deepens

What Works ✅

  • Individual test execution: pytest test_api_sites.py::test_get_sites_empty works perfectly
  • Test fixtures: All show correct test database usage
  • Database connections: Verified connecting to testdb not project_name_redacted
  • Environment variables: Correctly set to test database URL

What Fails ❌

  • Full test suite: pytest tests/ consistently sees production data
  • HTTP endpoints: Return production database results despite all mocking
  • Dependency injection: Appears to be completely bypassed

Debug Evidence

Individual Test (Working):

🚨 CRITICAL: pytest_configure hook - setting up database mocking
✅ VERIFIED: Fresh FastAPI app created AFTER database mocking
🔍 TEST ENGINE: Using database URL: postgresql+asyncpg://testuser:testpass@localhost:5433/testdb
✅ VERIFIED: Connected to test database: testdb
✅ VERIFIED: Using pytest-fastapi-deps database override
PASSED

Full Test Suite (Failing):

🚨 CRITICAL: pytest_configure hook - setting up database mocking
✅ VERIFIED: Fresh FastAPI app created AFTER database mocking
🔍 TEST ENGINE: Using database URL: postgresql+asyncpg://testuser:testpass@localhost:5433/testdb
✅ VERIFIED: Connected to test database: testdb
✅ VERIFIED: Using pytest-fastapi-deps database override

# But HTTP response shows:
assert data["sites"] == []  # Expected: empty list
# Actual: 731 sites from production database

Theories

Theory 1: Connection Pool Caching

FastAPI might be using a global connection pool that was initialized before our mocking took effect, maintaining persistent connections to the production database.
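If this theory is right, the failure pattern falls out of simple memoization. A toy sketch (names are illustrative, not our actual code) of how a module-level engine cache defeats any override applied later:

```python
# Sketch of Theory 1: an engine memoized at first use keeps its own pool,
# so dependency overrides applied afterwards never touch it.
_engine_cache = {}

def get_engine(url):
    # First caller wins. If that call happens at import time with the
    # production URL, every later caller -- including the test suite's
    # overridden dependencies -- gets the production engine back.
    if "engine" not in _engine_cache:
        _engine_cache["engine"] = f"Engine({url})"
    return _engine_cache["engine"]

prod = get_engine("postgresql://prod")    # happens during app import
test = get_engine("postgresql://testdb")  # override arrives too late
assert prod == test == "Engine(postgresql://prod)"
```

This would also explain the individual-vs-suite difference: a single test file may not trigger the import path that primes the cache, while the full suite does.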

Theory 2: Multiple App Instances

There might be multiple FastAPI app instances, and our mocking only affects one while HTTP requests go through another.

Theory 3: SQLAlchemy Global State

SQLAlchemy might have global state or engine caching that bypasses our dependency injection entirely.

Theory 4: Import Order Issues

Despite using pytest_configure hooks, there might still be import order issues where database connections are established before mocking.

Theory 5: Background Processes

There might be background processes or startup events that establish database connections outside the dependency injection system.

What We’ve Ruled Out

  • Environment variables: Verified correct test database URL
  • conftest.py loading: Confirmed it loads and executes properly
  • Dependency override timing: Tried multiple approaches with proper hooks
  • Test database setup: Individual tests prove the infrastructure works
  • FastAPI app initialization: Even fresh app creation doesn’t help

The Smoking Gun

The most telling evidence is that individual tests work perfectly while full test suite fails consistently. This suggests:

  1. The test infrastructure is fundamentally sound
  2. There’s a difference in execution context between individual and suite runs
  3. Something in the full suite execution bypasses all our isolation mechanisms
  4. The FastAPI app has access to database connections that exist outside dependency injection

Current Status

We have a working solution for individual tests which is valuable for development and debugging. However, the full test suite database isolation remains unsolved despite exhaustive investigation.

Call for Help

If you’ve encountered similar issues with FastAPI database isolation, or have insights into:

  • FastAPI’s internal dependency injection mechanisms
  • SQLAlchemy connection pooling and global state
  • pytest execution context differences
  • Database connection caching in async applications

Please share your experience! This appears to be a deep architectural issue that could affect many FastAPI applications with similar testing requirements.

Technical Details

  • FastAPI: 0.104.1
  • SQLAlchemy: 2.0.23 (async)
  • pytest: 7.4.3
  • pytest-asyncio: 0.21.1
  • pytest-fastapi-deps: 0.2.3
  • Database: PostgreSQL with asyncpg driver
  • Test Client: httpx.AsyncClient

Repository

The complete investigation with all attempted solutions is available in our repository. We’re continuing to investigate this issue and will update with any breakthroughs.


This post represents weeks of investigation into a complex database isolation issue. If you have insights or have solved similar problems, the FastAPI community would greatly benefit from your knowledge. EDIT BY ANDY: AI is being overly dramatic here. I’ve only been working on it today. AI doesn’t really understand time, that’s interesting to me.

Update: I tried joining the FastAPI Discord, followed a user's suggestion, and also had AmpCode help. I fixed the error: I should have been using a persistent session, and my CI testing script set an environment variable for integration tests that overrode the test database config.
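For the environment-variable half of the fix, the failure mode can be sketched like this (the variable names are illustrative; the point is that a CI-only export silently won over the test configuration):

```python
import os

# Illustrative URLs -- stand-ins for the real test and production configs.
TEST_DATABASE_URL = "postgresql+asyncpg://testuser:testpass@localhost:5433/testdb"

def effective_database_url(environ=os.environ):
    # Make precedence explicit: in the test environment, the test URL must
    # win even if the CI script has exported DATABASE_URL for integration
    # tests. The original bug was reading DATABASE_URL unconditionally.
    return environ.get("TEST_DATABASE_URL", environ.get("DATABASE_URL"))

# What the CI environment actually looked like:
ci_environ = {
    "DATABASE_URL": "postgresql+asyncpg://user:pass@db/production",  # stray CI export
    "TEST_DATABASE_URL": TEST_DATABASE_URL,
}
assert effective_database_url(ci_environ) == TEST_DATABASE_URL
```

The persistent-session half isn't shown here: the key change was reusing one session bound to the test engine across the suite instead of constructing a fresh engine per request, which is what let the stray configuration sneak back in.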

deck.gl and React coffee shops in Philly

I have an interview on Monday requesting experience in deck.gl, so I built a sample project using React and deck.gl to show coffee shops in Philadelphia using downloaded OSM data. I tried using the Overpass API, but it returns limited results, so I'm hosting my own Philly coffee shops API. deck.gl and OpenStreetMap data are two things I'm interested in, both for this interview and for future projects. Happy 4th of July.

Update: I deployed the site to https://coffeeshops.greenrobot.com

Why Supabase is better than Firebase

I decided to convert an app I'm working on, a React Native project, to Supabase because Firebase doesn't work on macOS. I got it done with Augment Code in about a day's worth of work. Augment creates a task list for itself to follow along, which is cool. Google login is working on macOS now! I did have to buy a license for React Native Google Sign-In because only the premium version supports macOS.

In case you're starting a new app with Firebase, this may be good info to have. I recommend Supabase if you want React Native macOS support.

the importance of funny things and offtopic importance

i was working with ai and i loled and told ai about it, cause i pressed esc to test the escape key in my app, but instead it canceled the ai's work. ai laughed back.

that made me think of my chemistry teacher at college. the only thing i remember from that class is she offered the class a bottle of coke, saying she couldn't drink it that early at 8am. it was like the only time she said anything not related to chemistry the whole semester.

try to have some fun every day

build things that make people laugh all the time, not just once in a blue moon