Digital Photography Study Guide
📖 Core Concepts
Digital photography – captures light with an electronic photodetector array, converts it to a digital signal (via an ADC), and stores the result as a file (RAW, JPEG, TIFF).
Image sensor – the “eye” of the camera; an array of pixels that measure light intensity. Two main types: CCD (charge‑coupled device) and CMOS (complementary metal‑oxide‑semiconductor) active‑pixel sensors.
Bayer color filter array – a mosaic of red, green, and blue filters placed over a monochrome sensor; the camera’s processor demosaics the data to produce a full‑color image.
Pixel count (megapixels) – total photosites:
$$n = w \times h$$
where w = horizontal pixels, h = vertical pixels.
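The formula is easy to sanity-check with a short calculation (the helper name `megapixels` is ours, for illustration):

```python
# Pixel count n = w * h, expressed in megapixels.
def megapixels(w: int, h: int) -> float:
    """Total photosites divided by one million."""
    return (w * h) / 1_000_000

# A typical 24 MP sensor: 6000 x 4000 photosites.
print(megapixels(6000, 4000))  # 24.0
```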
Dynamic range – the span of luminances a sensor can record without clipping shadows (under‑exposed) or highlights (over‑exposed).
Mirrorless camera – uses an electronic viewfinder/LCD that shows the sensor’s live image; no reflex mirror, making the body smaller and quieter.
---
📌 Must Remember
CCD invention: 1969–1970 by Willard S. Boyle & George E. Smith.
CMOS dominance: most consumer cameras now use CMOS active‑pixel sensors (lower power, faster read‑out).
Bayer filter: each 2×2 block contains 2 green, 1 red, 1 blue pixel → green gets higher weight for luminance.
Megapixel formula: $n = w \times h$ (e.g., 6000 × 4000 = 24 MP).
Larger sensor + same pixel count → lower noise & better low‑light performance.
JPEG: lossy compression based on the discrete cosine transform; introduced in 1992 by the JPEG committee.
RAW files retain full sensor data; JPEGs are already demosaiced & compressed.
HDR techniques: (1) sensor‑level dual‑photodiode design, or (2) exposure bracketing + software merge.
Mirrorless advantages: compact, quiet, real‑time exposure preview.
Mirrorless drawbacks: shorter battery life, sometimes limited native lens selection.
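The 2×2 Bayer tiling above can be sketched in a few lines of Python. An RGGB layout is assumed here for illustration; real sensors may start the tile with a different corner (GRBG, GBRG, BGGR):

```python
# Minimal sketch of the Bayer mosaic: each 2x2 tile holds
# 2 green, 1 red, 1 blue photosite (RGGB layout assumed).
def bayer_color(row: int, col: int) -> str:
    """Return the filter color at a given photosite (RGGB tiling)."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Count colors over one 2x2 tile: green appears twice,
# which is why green dominates the luminance estimate.
tile = [bayer_color(r, c) for r in range(2) for c in range(2)]
print(tile.count("G"), tile.count("R"), tile.count("B"))  # 2 1 1
```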
---
🔄 Key Processes
Capture – Light passes through the lens → falls on sensor pixels.
Conversion – Each pixel’s charge → analog voltage → ADC → digital number (DN) proportional to the photo‑electron count.
Color reconstruction – Bayer‑filtered data → demosaicing → full‑color RGB image.
Image processing – Noise reduction, white‑balance, sharpening, tone‑mapping.
File generation –
RAW: save unprocessed sensor data.
JPEG: apply processing, compress with DCT, write to storage.
Storage & Transfer – Write to SD/CF card → later USB/Wi‑Fi/card‑reader transfer.
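The conversion step can be illustrated with a toy ADC. The 12‑bit depth and 1.0 V reference below are illustrative assumptions, not the spec of any real camera:

```python
# Toy sketch of the ADC step: map an analog voltage to a digital number.
def adc(voltage: float, v_ref: float = 1.0, bits: int = 12) -> int:
    """Quantize a 0..v_ref voltage into a digital number (12-bit assumed)."""
    levels = 2 ** bits
    dn = int(voltage / v_ref * (levels - 1))
    # Clipping at the ends is the digital analogue of blocked
    # shadows and blown highlights.
    return max(0, min(levels - 1, dn))

print(adc(0.0))   # 0     -> deep shadow
print(adc(0.5))   # 2047  -> mid-tone
print(adc(1.5))   # 4095  -> clipped highlight
```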
HDR via bracketing:
Capture ≥3 exposures (e.g., -2 EV, 0 EV, +2 EV).
Align frames → merge pixel‑wise, preserving details in shadows & highlights.
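The merge step can be sketched per pixel. This is a toy model, not a production tone-mapper: the triangle weighting (trust mid-tones, distrust near-black and near-white values) is one common choice, and the pixel values and EV offsets are illustrative:

```python
# Toy sketch of bracketed HDR merging for a single pixel:
# each exposure is normalized back to the 0 EV frame, then
# averaged with weights favoring well-exposed (mid-tone) values.
def merge_hdr(exposures, evs):
    """Weighted average of bracketed pixel values (0..1 scale)."""
    def weight(v):  # triangle weight: highest at mid-grey (0.5)
        return max(1e-6, 1.0 - abs(v - 0.5) * 2.0)
    radiances = [v / (2.0 ** ev) for v, ev in zip(exposures, evs)]
    weights = [weight(v) for v in exposures]
    return sum(r * w for r, w in zip(radiances, weights)) / sum(weights)

# The same scene radiance seen at -2, 0, +2 EV; the clipped +2 EV
# value gets near-zero weight, and the merge recovers ~0.25.
print(round(merge_hdr([0.0625, 0.25, 1.0], [-2, 0, 2]), 3))  # 0.25
```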
---
🔍 Key Comparisons
CCD vs. CMOS – CCD: high uniformity, lower noise, slower, higher power. CMOS: lower power, faster read‑out, integrated circuitry.
Mirrorless vs. DSLR – Mirrorless: electronic viewfinder, smaller, quieter, real‑time preview; DSLR: optical viewfinder, larger battery, often broader native lens ecosystem.
RAW vs. JPEG – RAW: full sensor data, flexible post‑processing, larger files. JPEG: processed, lossy compressed, ready‑to‑share, smaller files.
Larger sensor vs. smaller sensor – Larger → better low‑light, higher dynamic range, lower noise. Smaller → cheaper, more compact, higher pixel density may increase noise.
Bayer vs. Multispectral sensor – Bayer: 3‑color (R,G,B) per pixel, common in consumer cameras. Multispectral: >3 color bands, used for scientific/industrial imaging.
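The sensor-size comparison becomes concrete via pixel pitch. The sensor dimensions below are nominal assumptions (full frame ≈ 36 × 24 mm; a 1/2.3″ compact sensor ≈ 6.2 × 4.6 mm):

```python
# Illustrative pixel-pitch comparison at the same 24 MP resolution.
def pixel_pitch_um(sensor_width_mm: float, horizontal_pixels: int) -> float:
    """Approximate center-to-center photosite spacing in micrometres."""
    return sensor_width_mm * 1000 / horizontal_pixels

full_frame = pixel_pitch_um(36.0, 6000)  # nominal full-frame width
compact    = pixel_pitch_um(6.2, 6000)   # nominal 1/2.3" width
print(round(full_frame, 2), round(compact, 2))  # 6.0 1.03
# Each full-frame photosite gathers roughly (6.0/1.03)^2 ~ 34x
# more light, which is why it wins in low light at equal MP count.
```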
---
⚠️ Common Misunderstandings
“More megapixels = better image.” Sensor size, lens quality, and noise performance matter more than raw count.
“JPEG is lossless.” JPEG uses lossy DCT compression; repeated saves degrade quality.
“Digital cameras don’t need power.” Sensors, processors, and electronic viewfinders require batteries; only mechanical film cameras can run without electricity.
“HDR always needs special hardware.” Most cameras achieve HDR with software bracketing; sensor‑level dual‑photodiodes are an extra option.
“Mirrorless lacks interchangeable lenses.” Modern mirrorless systems have growing native lens lines and adapters for DSLR lenses.
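The “JPEG is lossless” trap can be demonstrated with a simplified 1‑D analogue of JPEG’s pipeline (real JPEG uses 8×8 2‑D blocks and per-frequency quantization tables; the single step size `q` here is an illustrative stand-in):

```python
import math

# Toy illustration of JPEG-style loss: a 1-D 8-sample DCT,
# coarse quantization of the coefficients, then the inverse.
def dct(block):
    n = len(block)
    out = []
    for k in range(n):
        s = sum(x * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i, x in enumerate(block))
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        out.append(scale * s)
    return out

def idct(coeffs):
    n = len(coeffs)
    out = []
    for i in range(n):
        s = 0.0
        for k, c in enumerate(coeffs):
            scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
            s += scale * c * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
        out.append(s)
    return out

samples = [52, 55, 61, 66, 70, 61, 64, 73]      # one row of pixel values
q = 10                                          # coarse quantization step
quantized = [round(c / q) * q for c in dct(samples)]
restored = [round(x) for x in idct(quantized)]
print(samples)
print(restored)  # close, but not identical: data was discarded
```

Without the quantization step the transform round-trips exactly; quantization is where the loss happens, and it cannot be undone by re-saving.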
---
🧠 Mental Models / Intuition
Pixel‑bucket analogy: each pixel is a tiny bucket that collects photons; a bigger bucket (larger photosite) collects more photons before filling, so the random shot‑to‑shot variation (noise) is smaller relative to the signal in low‑light scenes.
Bayer mosaic as a checkerboard: imagine a chessboard where green squares are twice as common – the processor uses this to estimate luminance more accurately.
Dynamic range as a window: the wider the window, the more of the bright‑to‑dark spectrum you can see without “blinding” (highlights) or “blacking out” (shadows).
---
🚩 Exceptions & Edge Cases
High‑MP on a tiny sensor – can produce excellent detail in bright light but suffers severe noise in low light.
HDR sensor with extra low‑sensitivity photodiodes – provides single‑shot HDR but may have reduced overall sensitivity.
Mirrorless battery life – electronic viewfinder draws significant power; using the LCD continuously can halve shooting time compared with a DSLR.
Aliasing artifacts – repetitive patterns (e.g., fabric) can interact with the pixel grid, creating false colors or moiré despite demosaicing.
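Aliasing has a clean 1‑D analogue: a stripe pattern finer than half the sampling rate (the Nyquist limit) is indistinguishable from a much coarser pattern once sampled. The frequencies below are illustrative:

```python
import math

# Toy illustration of aliasing: sampling a fine repeating pattern
# above the Nyquist limit yields a false low-frequency pattern,
# the 1-D analogue of moire on fabric or fine grilles.
def sample_pattern(freq: float, n_pixels: int):
    """Sample a sinusoidal stripe pattern at one sample per pixel."""
    return [math.sin(2 * math.pi * freq * i) for i in range(n_pixels)]

# Nyquist limit at 1 sample/pixel is 0.5 cycles/pixel.
# A 0.9 cycles/pixel pattern aliases to 0.9 - 1.0 = -0.1 cycles/pixel:
fine = sample_pattern(0.9, 8)
alias = sample_pattern(-0.1, 8)
print(all(abs(a - b) < 1e-9 for a, b in zip(fine, alias)))  # True
```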
---
📍 When to Use Which
RAW vs. JPEG: Choose RAW when you anticipate major exposure or color adjustments; choose JPEG for quick sharing, limited storage, or when post‑processing will be minimal.
Mirrorless vs. DSLR: Pick mirrorless for travel, silent shooting, and real‑time exposure feedback; pick DSLR when you need long battery life or a vast native lens collection.
CCD vs. CMOS: Opt for CCD in specialized scientific imaging where uniformity is critical; opt for CMOS for consumer, high‑speed, or video work.
High‑dynamic‑range approach: Use sensor‑level HDR (dual‑photodiode) when shooting a single frame in high contrast; use bracketing + software when the camera lacks that hardware.
---
👀 Patterns to Recognize
Moiré/aliasing appears as shimmering patterns on fine repetitive textures – a hint that the sensor’s pixel pitch is interacting with the subject’s spatial frequency.
JPEG artifacts (blocking, ringing) become visible near high‑contrast edges; suspect over‑compression.
Noise pattern that resembles film grain often indicates a small sensor at high ISO.
Lens distortion cues – straight lines bending near frame edges → check lens quality or apply correction.
---
🗂️ Exam Traps
Confusing sensor type with file format – CCD/CMOS describe how the image is captured; RAW/JPEG describe how it is stored.
Assuming “more MP = better low‑light” – actually, larger pixel size (sensor area per pixel) controls low‑light performance.
Mixing up dynamic range with exposure latitude – dynamic range is a sensor property; exposure latitude is the range of exposures that yield usable images.
Attributing “instant preview” to film cameras – only digital cameras provide live electronic viewfinder feedback.
Thinking JPEG compression is reversible – once a JPEG is saved, discarded data cannot be perfectly recovered.
---