Study Guide

📖 Core Concepts

- Client–Server Model – A messaging pattern where clients request services and servers provide them. Clients start the conversation; servers wait for requests.
- Request–Response Pattern – The basic communication flow: the client sends a request, the server returns a response. Think of it as a doorbell (request) and the answer from inside the house (response).
- Server-Side vs. Client-Side – Server-side code runs on the remote server; client-side code runs on the user's device. Each side has its own security, performance, and latency considerations.
- Inter-Server Communication – Servers can talk to each other (e.g., to sync data). This is still a request–response exchange, but between two servers.
- Load Balancing & Failover – A load balancer spreads incoming requests across multiple servers; failover automatically switches to a backup server if the primary fails.
- Centralized Computing vs. Rich Clients – Centralized: most work is done on a few powerful servers. Rich client: a powerful local machine that can operate largely on its own.

---

📌 Must Remember

- Clients initiate; servers respond.
- Server classification is based on the service offered (web server, file server, etc.).
- Security rule: always encrypt sensitive data in transit (e.g., TLS/HTTPS).
- Server-side vulnerabilities – SQL injection, OS exploits.
- Client-side vulnerabilities – XSS, malware, keystroke capture.
- Load balancer location: sits between clients and the server farm, forwarding each request to an available server.
- Failover provides high availability; it is triggered when the primary server becomes unreachable.
- Rich client ≠ "no server"; it may still use servers for data but performs most processing locally.

---

🔄 Key Processes

Standard Request–Response Cycle
1. Client builds the request (method, headers, body).
2. Client sends the request over the network using a protocol (e.g., HTTP).
3. Server receives, schedules, and processes the request, then creates a response.
4. Server sends the response back; the client parses it and acts.
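The four-step cycle above can be sketched end to end with Python's standard library: a tiny HTTP server plays the "occupant," and a client plays the "doorbell." The port number and response body are arbitrary choices for this example.

```python
# Minimal request-response sketch: a stdlib HTTP server in a background
# thread, plus a client that initiates the request and parses the reply.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Server side: receive the request, build and send a response.
        body = b"hello from the server"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence default per-request logging

# The socket is bound and listening as soon as HTTPServer is constructed.
server = HTTPServer(("127.0.0.1", 8901), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: initiate the request, then parse and act on the response.
with urlopen("http://127.0.0.1:8901/") as resp:
    status, data = resp.status, resp.read().decode()
print(status, data)

server.shutdown()
```

Note that the client blocks until the response arrives; every protocol-based exam question (HTTP, FTP, etc.) follows this same initiate-then-reply shape.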
Load Balancing Workflow
1. Client request → load balancer.
2. Load balancer checks the health of the backend servers.
3. It chooses a server based on an algorithm (round-robin, least-connections, etc.).
4. It forwards the request; the selected server processes it and returns the response via the load balancer to the client.

Failover Activation
1. A health monitor detects primary server failure.
2. Traffic is rerouted automatically to standby server(s).
3. Optional: DNS update or virtual IP switch.

Server-Side Operation Decision
1. Identify data or functionality unavailable on the client (e.g., secure DB access).
2. Move that logic server-side for security, speed, or reliability.

Client-Side Operation Decision
1. Identify tasks that can run locally without exposing data (e.g., UI rendering).
2. Implement them locally to reduce latency and network load.

---

🔍 Key Comparisons

Client vs. Server
- Client: initiates requests, consumes services, rarely shares resources.
- Server: waits for requests, provides shared resources, may serve many clients simultaneously.

Client-Server vs. Peer-to-Peer (P2P)
- Client-Server: centralized server(s) serve many clients; scaling is done by adding servers.
- P2P: decentralized peers share resources directly; there is no dedicated server.

Centralized Computing vs. Rich Client
- Centralized: heavy processing on a few servers; thin clients.
- Rich client: heavy processing on the local machine; servers mainly store data.

Load Balancer vs. Simple Single Server
- Load balancer: distributes load and improves uptime, but adds complexity.
- Single server: simpler, but becomes a bottleneck and a single point of failure.

---

⚠️ Common Misunderstandings

- "Client and server are always separate machines." – They can run on the same host (e.g., a web server and a local client app).
- "Load balancing eliminates the need for server scaling." – It distributes load; you still need enough server capacity to handle peak traffic.
- "All security is handled on the client." – Sensitive validation and data protection must be performed server-side; client-side checks can be bypassed.
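The load-balancing and failover workflows above can be combined into one small sketch: a round-robin balancer that skips backends whose health check fails. The backend names and the `is_healthy` flag are illustrative assumptions, not a real load-balancer API.

```python
# Round-robin load balancing with failover: pick the next healthy backend;
# if a backend's health check fails, traffic automatically skips it.
from dataclasses import dataclass
from itertools import cycle

@dataclass
class Backend:
    name: str
    is_healthy: bool = True  # stand-in for a real health monitor

class RoundRobinBalancer:
    def __init__(self, backends):
        self.backends = backends
        self._order = cycle(backends)  # round-robin rotation

    def pick(self):
        # Steps 2-3: check health, choose the next available server.
        for _ in range(len(self.backends)):
            backend = next(self._order)
            if backend.is_healthy:
                return backend
        raise RuntimeError("no healthy backends (service unavailable)")

pool = [Backend("app-1"), Backend("app-2"), Backend("app-3")]
lb = RoundRobinBalancer(pool)

healthy_picks = [lb.pick().name for _ in range(4)]
print(healthy_picks)  # rotates: app-1, app-2, app-3, app-1

# Failover: app-2's health check fails, so it is skipped automatically.
pool[1].is_healthy = False
after_failover = [lb.pick().name for _ in range(3)]
print(after_failover)  # app-2 never appears
```

When every backend is unhealthy, `pick()` raises instead of returning, which mirrors the "service unavailable" responses clients see under DoS-style rate limiting.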
- "Rich clients don't need servers at all." – They still need servers for data persistence, synchronization, and multi-user features.

---

🧠 Mental Models / Intuition

- Doorbell Analogy – The client is a person ringing a doorbell (request); the server is the occupant who answers (response).
- Traffic Cop (Load Balancer) – A traffic cop directs cars (requests) to the least-busy lane (server).
- Safety Vault (Server-Side Security) – Sensitive data lives in a vault (the server); the client only gets a receipt (response) after proper verification.

---

🚩 Exceptions & Edge Cases

- Server-to-server communication can bypass client-side security checks, so trust boundaries must be explicitly defined.
- Denial-of-Service (DoS) protection – Servers may deliberately limit request rates, causing legitimate clients to receive "service unavailable" responses under heavy load.
- Mixed-mode applications – Some operations may be duplicated both client-side and server-side for resilience (e.g., offline caching).

---

📍 When to Use Which

Choose server-side processing when:
- Data must stay secret (e.g., passwords, financial records).
- The operation requires resources unavailable on the client (large DB queries, heavy computation).
- Consistency across many clients is essential.

Choose client-side processing when:
- Immediate UI feedback is needed (e.g., form validation).
- Reducing network traffic is a priority.
- The task can run with locally available data (e.g., image preview).

Deploy a load balancer when:
- Expected traffic exceeds the capacity of a single server.
- High availability is required (the service must survive server failures).

Implement failover when:
- Service downtime has critical business impact.
- You have redundant hardware or cloud instances ready to take over.

---

👀 Patterns to Recognize

- "Client initiates → server replies" appears in every protocol-based question (HTTP, FTP, etc.).
- "Server-side vulnerability = input not sanitized" → look for SQL injection or command injection clues.
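The "input not sanitized" pattern is easiest to recognize in code. A minimal sketch using Python's built-in sqlite3 module: the unsafe version builds SQL by string interpolation, so attacker input rewrites the query; the safe version uses a parameterized query. The `users` table and its rows are invented for the example.

```python
# SQL injection vs. parameterized queries, using an in-memory SQLite DB.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "nobody' OR '1'='1"

# UNSAFE: string interpolation lets the input become part of the SQL itself,
# so the WHERE clause matches every row.
unsafe_sql = f"SELECT secret FROM users WHERE name = '{attacker_input}'"
leaked = db.execute(unsafe_sql).fetchall()
print("unsafe query leaked:", leaked)  # alice's secret, despite the wrong name

# SAFE: a parameterized query treats the input as data, never as SQL.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print("parameterized query returned:", safe)  # no rows
```

This is also why client-side validation alone is never enough: an attacker can send `attacker_input` directly to the server, bypassing any checks in the UI.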
- "Load balancer sits between" – any diagram showing a middle component handling traffic signals a load-balancing scenario.
- "Rich client = local processing + remote data fetch" – questions mentioning heavy UI work but occasional server calls follow this pattern.

---

🗂️ Exam Traps

- Distractor: "Clients share resources with servers." – False; clients request resources but do not share their own.
- Distractor: "Load balancers improve security, not availability." – Load balancers mainly aid availability and scalability; security is a separate concern.
- Distractor: "P2P is just another name for client-server." – P2P lacks a central server and is decentralized.
- Distractor: "All server-side code runs on the same physical machine." – Servers can be distributed; inter-server communication is common.
- Distractor: "Rich clients eliminate the need for any server." – Rich clients still rely on servers for data storage and multi-user coordination.