3D Computer Graphics Study Guide
📖 Core Concepts
Three‑dimensional computer graphics – Digital images created from 3‑D geometric data that are ultimately displayed on a 2‑D screen (or VR headset).
3‑D model – A mathematical representation (vertices + polygons) of an object; not an image until rendered.
Rendering – Converting a 3‑D model + scene (lights, camera) into a 2‑D image.
Production workflow – Three phases: Modeling → Layout & Animation → Rendering.
Material – Defines how a surface interacts with light; supplied with textures (color/albedo, bump/normal, displacement).
Projection – Mathematical transform that maps 3‑D coordinates onto a 2‑D plane for display.
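The projection concept can be sketched in a few lines of Python (a toy pinhole-camera example; the `project` function and focal length `f` are illustrative, not from any real library):

```python
# Minimal perspective projection sketch: a 3-D point is mapped onto a
# 2-D image plane at distance f (focal length) by dividing x and y by depth z.
def project(point, f=1.0):
    x, y, z = point
    if z <= 0:
        raise ValueError("point must be in front of the camera (z > 0)")
    return (f * x / z, f * y / z)

# A point twice as far away lands twice as close to the image center:
print(project((2.0, 2.0, 1.0)))  # (2.0, 2.0)
print(project((2.0, 2.0, 2.0)))  # (1.0, 1.0)
```

The divide-by-depth is what produces perspective foreshortening; orthographic projection simply drops `z` instead.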
📌 Must Remember
Vertices are points; polygons are flat surfaces formed by ≥3 vertices.
Keyframe animation records poses; the system interpolates between them.
Inverse kinematics (IK) solves for joint angles to reach a target point.
Bump/normal maps fake surface detail; displacement maps actually modify geometry.
Realistic rendering → simulates light transport; non‑photorealistic rendering → artistic style.
Common file formats: .blend, .obj, .fbx, .gltf (note: DirectX 9/11 are graphics APIs, not file formats).
2.5D / isometric graphics = a 3‑D world shown from fixed angles using orthographic projection, so there is no perspective distortion.
GPU accelerates geometry processing and rasterization.
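The keyframe-plus-interpolation idea above can be sketched as a hypothetical linear interpolator (not a real engine API):

```python
def lerp(a, b, t):
    """Linearly interpolate between values a and b at parameter t in [0, 1]."""
    return a + (b - a) * t

def pose_at(frame, key_a, key_b):
    """Interpolate a value between two keyframes given as (frame_number, value)."""
    (fa, va), (fb, vb) = key_a, key_b
    t = (frame - fa) / (fb - fa)  # how far frame lies between the two keys
    return lerp(va, vb, t)

# Keyframes at frames 0 and 10; the system fills in frame 5 automatically.
print(pose_at(5, (0, 0.0), (10, 90.0)))  # 45.0
```

Spline interpolation replaces `lerp` with a curve through the keys, giving smoother ease-in/ease-out motion.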
🔄 Key Processes
Modeling
Scan real object or create procedurally → generate vertices → connect into polygons.
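The vertex → polygon structure produced by modeling can be represented minimally like this (a hypothetical layout; real formats such as .obj store essentially the same two arrays):

```python
# A minimal mesh: vertices are 3-D points, polygons are index lists
# into the vertex array (here: two triangles forming a unit quad).
vertices = [
    (0.0, 0.0, 0.0),  # 0
    (1.0, 0.0, 0.0),  # 1
    (1.0, 1.0, 0.0),  # 2
    (0.0, 1.0, 0.0),  # 3
]
polygons = [
    (0, 1, 2),  # first triangle
    (0, 2, 3),  # second triangle
]
# Each polygon needs at least 3 vertices to define a flat surface.
assert all(len(p) >= 3 for p in polygons)
```

Sharing vertices via indices (vertex 0 and 2 appear in both triangles) is what keeps meshes compact.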
Layout & Animation
Place objects, lights, cameras → define spatial relationships.
Add animation:
Record keyframes → interpolate (linear, spline).
Apply IK for articulated figures.
Import motion‑capture data if available.
Run physical simulation (gravity, collisions) when needed.
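The physical-simulation step above can be sketched as explicit Euler integration of gravity with a ground-plane collision (the 60 fps timestep and 50% bounce damping are assumed values for illustration):

```python
GRAVITY = -9.81  # m/s^2

def step(y, vy, dt=1.0 / 60.0):
    vy += GRAVITY * dt          # gravity accelerates the object downward
    y += vy * dt                # position follows velocity
    if y < 0.0:                 # collision with the ground plane at y = 0
        y, vy = 0.0, -vy * 0.5  # bounce with 50% energy loss (assumed)
    return y, vy

# Drop an object from 2 m and simulate 4 seconds at 60 fps.
y, vy = 2.0, 0.0
for _ in range(240):
    y, vy = step(y, vy)
print(y)  # near 0: the object has bounced and come to rest
```

Production simulators use more stable integrators and proper collision response, but the loop structure (accumulate forces → integrate → resolve collisions) is the same pattern.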
Rendering
Choose material → attach appropriate textures.
Compute lighting (realistic vs artistic).
Project 3‑D scene onto 2‑D viewport.
Output image; optionally send to a render farm for speed.
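The "compute lighting" step above, for the simplest realistic case, is Lambertian diffuse shading: brightness is proportional to the cosine of the angle between the surface normal and the light direction (a toy sketch; `diffuse` is an illustrative name):

```python
import math

def diffuse(normal, light_dir, albedo=1.0):
    """Lambertian diffuse term: albedo * max(N·L, 0) with N, L normalized."""
    n = math.sqrt(sum(c * c for c in normal))
    l = math.sqrt(sum(c * c for c in light_dir))
    ndotl = sum((a / n) * (b / l) for a, b in zip(normal, light_dir))
    return albedo * max(ndotl, 0.0)  # surfaces facing away receive no light

# Light directly overhead on an upward-facing surface: full brightness.
print(diffuse((0, 1, 0), (0, 1, 0)))  # 1.0
# Light at 90 degrees to the surface: no contribution.
print(diffuse((0, 1, 0), (1, 0, 0)))  # 0.0
```

Normal maps plug into exactly this formula: they perturb `normal` per pixel, changing the shading without moving any vertices.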
🔍 Key Comparisons
3‑D vs 2‑D graphics – 3‑D uses geometry, lighting, and projection; 2‑D relies on flat sprites or vector paths.
Bump map vs Normal map – Both fake detail; normal maps store surface normals directly → more accurate lighting.
Displacement map vs Normal map – Displacement actually moves vertices; normal only alters shading.
Realistic vs Non‑photorealistic rendering – Realistic = physics‑based light transport; NPR = stylized, often ignores physics.
Keyframe vs Motion Capture – Keyframe is artist‑driven; mocap records real motion data.
⚠️ Common Misunderstandings
“A 3‑D model is already a picture.” → It’s just data; rendering creates the picture.
“Bump maps change geometry.” → They only affect shading; geometry stays flat.
“All GPUs do the same work.” → Some are optimized for rasterization; others add dedicated ray‑tracing hardware.
“2.5D is the same as 3‑D.” → 2.5D limits camera angles and often uses pre‑rendered sprites.
🧠 Mental Models / Intuition
“Pipeline” analogy – Think of the workflow as a factory line: raw material (scan/procedural) → shaping (modeling) → positioning & motion (layout/animation) → finishing (rendering).
Light‑surface interaction – Materials are “rules” the light follows: albedo = base color, bump/normal = tiny hills, displacement = real hills.
🚩 Exceptions & Edge Cases
Displacement maps are costly; use only when true geometry change is needed (e.g., close‑up shots).
IK solvers can produce unnatural poses if joint limits aren’t enforced.
Isometric projection removes perspective; good for strategy games but not for realistic scenes.
📍 When to Use Which
Model source – Scan for real objects; procedural for repetitive or algorithmic shapes.
Texture type – Use albedo for base color, normal map for fine detail, displacement only when silhouette matters.
Rendering style – Choose realistic for product visualization or film; NPR for stylized games or art.
File format – .obj for simple geometry exchange; .fbx for animation data; .blend for full Blender projects.
👀 Patterns to Recognize
Vertex → Polygon → Mesh hierarchy appears in every modeling question.
Keyframe → Interpolation → Motion pattern in animation problems.
Material → Texture → Light interaction pattern in shading/rendering questions.
Projection matrix → 2‑D screen coordinates pattern in any camera‑related problem.
🗂️ Exam Traps
Distractor: “Bump maps alter geometry.” – Wrong; they only affect shading.
Distractor: “All 3‑D graphics require a GPU.” – Some simple renders can be CPU‑only.
Distractor: “Isometric graphics use perspective projection.” – Incorrect; they use orthographic projection.
Distractor: “Keyframe animation records every frame.” – Only selected frames; the rest are interpolated.
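The orthographic-vs-perspective trap above is easy to verify directly (hypothetical helper functions for illustration):

```python
# Orthographic projection drops z entirely; perspective divides by it.
def orthographic(p):
    x, y, z = p
    return (x, y)                  # depth does not affect screen position

def perspective(p, f=1.0):
    x, y, z = p
    return (f * x / z, f * y / z)  # farther points shrink toward the center

near, far = (1.0, 1.0, 2.0), (1.0, 1.0, 4.0)
print(orthographic(near) == orthographic(far))  # True: no perspective distortion
print(perspective(near) == perspective(far))    # False: size changes with depth
```

This is why isometric games can reuse the same sprite at any depth: under orthographic projection, distance never changes apparent size.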
---
Study this guide, visualize the pipeline, and you’ll be ready to tackle any 3‑D graphics exam question!