#lidar #self-driving #autonomous-vehicles #tesla #waymo

@nikolatesla | 2026-04-29 11:39:52
# How LiDAR Works in Self-Driving Cars — And Why Tesla Avoids It

Walk past any Waymo robotaxi and you'll notice the spinning dome on the roof. That's LiDAR — one of the most debated sensors in the autonomous vehicle industry. Meanwhile, Tesla insists it's unnecessary. Who's right?

## What LiDAR Actually Does

LiDAR stands for **Light Detection and Ranging**. It works by firing thousands of laser pulses per second in every direction, then measuring how long each pulse takes to bounce back. From that timing data, it constructs a precise **3D point cloud** of the surrounding environment — accurate to within a few centimeters.

The result is a live depth map. Unlike a camera, which captures color and texture but has to infer depth, LiDAR knows *exactly* how far away every object is.

## Why It Matters for Autonomy

Cameras struggle in low-light conditions, glare, or when objects are ambiguous (is that a white truck or a bright sky?). LiDAR doesn't care — it bounces photons off physical surfaces regardless of ambient light.

Key advantages:

- **Depth accuracy**: Direct distance measurement, not estimated
- **Night performance**: Works in complete darkness
- **Object separation**: Can distinguish pedestrians, cyclists, and cars even when visually overlapping

## Why Tesla Doesn't Use It

Elon Musk famously called LiDAR a "crutch" and claimed vision-only systems would surpass it. Tesla's argument:

1. Humans drive using eyes (cameras), not lasers — so cameras should be sufficient
2. LiDAR adds cost ($5,000–$15,000 per unit for automotive-grade hardware)
3. LiDAR returns sparse data in rain or heavy dust, losing its advantage

Tesla's Full Self-Driving (FSD) relies on 8 cameras + neural networks trained on billions of miles of data. The bet is that **scale of data replaces physical sensor redundancy**.

## The Reality in 2026

Neither side has definitively won. Waymo's LiDAR-based fleet continues to operate in Phoenix, San Francisco, and Austin with near-zero accidents.
Tesla's FSD v13 performs well on highways but still struggles with complex urban intersections.

The emerging middle ground: **4D radar** (companies like Arbe, Oculii) offers 3D point clouds at lower cost than LiDAR, and handles weather better. Several new OEMs are betting on radar+camera fusion instead of LiDAR.

## Bottom Line

LiDAR is the more robust sensor for today's autonomous driving. The question is whether neural networks trained on enough camera data can eventually close the gap. In 2026, the answer is: almost, but not yet.
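---

*Appendix:* the time-of-flight principle described earlier can be sketched in a few lines of Python. This is a simplified illustration, not any vendor's API — the function names and the `(round_trip_time, azimuth, elevation)` return format are assumptions made for the example. The core idea is exactly the one above: distance is the round-trip time of a light pulse, times the speed of light, divided by two; combining that distance with the laser's firing angles yields one point in the 3D point cloud.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def pulse_to_point(round_trip_time_s, azimuth_rad, elevation_rad):
    """Convert one laser return into an (x, y, z) point in meters.

    The pulse travels to the object and back, so the one-way
    distance is half of (speed of light * round-trip time).
    """
    distance = SPEED_OF_LIGHT * round_trip_time_s / 2.0
    # Spherical -> Cartesian, with the sensor at the origin.
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

def build_point_cloud(returns):
    """Turn a batch of (time, azimuth, elevation) returns into xyz points."""
    return [pulse_to_point(t, az, el) for t, az, el in returns]

# A pulse with a ~66.7 ns round trip hit something roughly 10 m away.
point = pulse_to_point(66.7e-9, 0.0, 0.0)
```

At ~10 m, a timing error of just 1 nanosecond shifts the measured distance by about 15 cm — which is why real LiDAR units need picosecond-class timing electronics to hit centimeter accuracy.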