Arrays
The simplest arrangement: data in a row, accessed by position.
🤔 Why does this exist?
The Story
The Computer's Memory
Memory is a long row of numbered slots
To a running program, RAM appears as a flat sequence of billions of byte-sized slots. Each has a unique number: its address.
The CPU can jump to any address instantly
Given an address, the CPU reaches it in one step. This is called "random access".
Calculating an address is nearly free
Simple arithmetic: base_address + (index × item_size). One multiplication, one addition.
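A minimal C sketch of that formula (the array and its contents are made up for illustration): for each index, the address the compiler uses for arr[i] matches base_address + (index × item_size) computed by hand.

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    int arr[5] = {10, 20, 30, 40, 50};     /* five items stored back-to-back */
    uintptr_t base = (uintptr_t)&arr[0];   /* base_address of the block      */

    for (int i = 0; i < 5; i++) {
        /* base_address + (index × item_size): one multiply, one add */
        uintptr_t computed = base + (uintptr_t)i * sizeof(int);
        printf("arr[%d] is at %p, formula says %p\n",
               i, (void *)&arr[i], (void *)computed);
    }
    return 0;
}
```

The two addresses printed on each line agree: indexing never searches, it only does this arithmetic.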
The Problem Before Arrays
Imagine storing 1000 student grades. Early computers had to track each grade's address separately—a second list just to find the first list.
Naive Approach: Store each item wherever there's space
❌ Finding item #500 meant searching from the beginning. No pattern = no shortcut.
Cost: on average, a lookup checked half the items (about 500 checks for 1,000 grades)
"Why can't we just say "give me item 500" and get it immediately?"
The Array: Contiguous Memory + Position
💡 Key Insight: If items are stored back-to-back with no gaps, we can calculate any item's location instantly.
The Pattern: Reserve a contiguous block of memory. Item N lives at: start + (N × size). Done.
Instant access to any position—just calculate the address
Fixed size, and inserting or removing an item means physically shifting every element after that position.
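A short C sketch of that trade-off (the values and capacity are illustrative): reading arr[3] is a single address calculation, while inserting at index 0 has to slide every existing element one slot to the right first.

```c
#include <stdio.h>
#include <string.h>

#define CAPACITY 8

int main(void) {
    int arr[CAPACITY] = {10, 20, 30, 40, 50};  /* 5 items used, room for 8 */
    int count = 5;

    /* Read: one address calculation, no matter which index we ask for. */
    printf("arr[3] = %d\n", arr[3]);

    /* Insert 5 at index 0: every existing element must shift right by one. */
    memmove(&arr[1], &arr[0], count * sizeof(int));  /* O(count) copying */
    arr[0] = 5;
    count++;

    for (int i = 0; i < count; i++)
        printf("%d ", arr[i]);                   /* prints: 5 10 20 30 40 50 */
    printf("\n");
    return 0;
}
```

The read costs the same wherever the item sits; the insert gets more expensive the closer to the front it lands and the more items the array holds.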
Array Memory Layout
Formula: address = 0x1000 + (index × 4)
arr[3] lives at 0x1000 + 12 = 0x100C, which holds the value 45