LinkedIn Amazing Articles 1
𝗖𝗮𝗰𝗵𝗶𝗻𝗴
Temporarily store frequently accessed data in memory to speed up retrieval.
Benefits:
Fast Data Retrieval: Tools like Redis and Memcached allow quicker access than querying
databases.
Efficient Content Delivery: Static assets like images, CSS, and JS files can be cached to
minimize origin server requests.
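To make the cache-aside idea concrete, here is a minimal Python sketch using the redis-py client; the connection settings and the fetch_user_from_db helper are illustrative assumptions, not part of the original post.

import json
import redis  # pip install redis

cache = redis.Redis(host="localhost", port=6379, db=0)

def fetch_user_from_db(user_id):
    # Hypothetical stand-in for a real database query.
    return {"id": user_id, "name": "example"}

def get_user(user_id, ttl_seconds=300):
    key = f"user:{user_id}"
    cached = cache.get(key)                    # 1. try the cache first
    if cached is not None:
        return json.loads(cached)
    user = fetch_user_from_db(user_id)         # 2. fall back to the database
    cache.setex(key, ttl_seconds, json.dumps(user))  # 3. cache it with a TTL
    return user

The TTL keeps stale entries from living forever; how long to cache is a per-workload decision.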
𝗟𝗼𝗮𝗱 𝗕𝗮𝗹𝗮𝗻𝗰𝗶𝗻𝗴
Distribute traffic evenly across servers to prevent overload.
Benefits:
Higher Availability: Traffic is routed away from failed or overloaded servers, improving fault tolerance.
Better Scalability: Servers can be added to or removed from the pool as demand changes.
𝗔𝘀𝘆𝗻𝗰𝗵𝗿𝗼𝗻𝗼𝘂𝘀 𝗣𝗿𝗼𝗰𝗲𝘀𝘀𝗶𝗻𝗴
Offload long-running tasks to background processes to keep the main thread free.
Benefits:
Non-Blocking Operations: Tasks like email sending or file processing run independently, allowing faster response times for users.
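A minimal sketch of this pattern with only the Python standard library; send_email is a hypothetical stand-in for any long-running task.

import queue
import threading

tasks = queue.Queue()

def send_email(address):
    # Hypothetical slow operation (e.g., talking to an SMTP server).
    print(f"email sent to {address}")

def worker():
    while True:
        address = tasks.get()     # blocks until a task is queued
        send_email(address)
        tasks.task_done()

threading.Thread(target=worker, daemon=True).start()

# The request handler can return immediately after enqueueing the work.
tasks.put("user@example.com")
tasks.join()                      # only here so the demo waits before exiting

In production this role is usually played by a dedicated queue and worker system (e.g., RabbitMQ or Celery) rather than an in-process thread.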
𝗖𝗼𝗻𝘁𝗲𝗻𝘁 𝗗𝗲𝗹𝗶𝘃𝗲𝗿𝘆 𝗡𝗲𝘁𝘄𝗼𝗿𝗸 (𝗖𝗗𝗡)
Serve content from servers located close to your users.
Benefits:
Lower Latency: Content is served from the nearest CDN server, reducing travel time.
Edge Caching: Static and dynamic content is cached at the edge, improving delivery speed.
𝗗𝗮𝘁𝗮𝗯𝗮𝘀𝗲 𝗦𝗵𝗮𝗿𝗱𝗶𝗻𝗴
Split data across multiple database shards so no single instance carries the full load.
Benefits:
Parallel Query Execution: Queries are distributed and executed concurrently across shards.
Improved Scalability: Reduces performance issues by distributing database load.
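One common way to route a query to the right shard is stable hashing of the key; this is a toy sketch and the shard count is an arbitrary assumption.

import hashlib

NUM_SHARDS = 4  # assumed number of shards, for illustration only

def pick_shard(key: str) -> int:
    # A stable hash keeps the same key on the same shard across processes.
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS

print(pick_shard("user:42"))  # queries for this user go to the returned shard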
𝗗𝗮𝘁𝗮𝗯𝗮𝘀𝗲 𝗢𝗽𝘁𝗶𝗺𝗶𝘇𝗮𝘁𝗶𝗼𝗻
Enhance database performance through indexing, optimized queries, and better schema
design.
Benefits:
Faster Retrieval: Indexing reduces the need to scan entire tables, accelerating query
execution.
Reduced Latency: Fewer network hops mean quicker communication between services.
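To illustrate the indexing point above, here is a self-contained sketch using Python's built-in sqlite3 module; the table and column names are made up.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("CREATE INDEX idx_users_email ON users(email)")

# The query plan reports an index search instead of a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?", ("a@b.com",)
).fetchall()
print(plan)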
Harness the power of System Design to elevate your engineering prowess and unlock
new career opportunities in the tech industry.
What does an API gateway do?
The steps below walk through the details.
Step 1 - The client sends an HTTP request to the API gateway.
Step 2 - The API gateway parses and validates the attributes in the HTTP request.
Step 3 - The API gateway checks the request against its allow-list and deny-list rules.
Step 4 - The API gateway talks to an identity provider for authentication and authorization.
Step 5 - The rate limiting rules are applied to the request. If it is over the limit, the request is rejected (a minimal rate-limiter sketch follows this step list).
Steps 6 and 7 - Now that the request has passed basic checks, the API gateway finds the
relevant service to route to by path matching.
Step 8 - The API gateway transforms the request into the appropriate protocol and sends
it to backend microservices.
Steps 9-12 - The API gateway handles errors gracefully and applies circuit breaking when a fault takes longer to recover. It can also leverage the ELK (Elasticsearch-Logstash-Kibana) stack for logging and monitoring, and data is sometimes cached in the API gateway.
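As a rough sketch of the rate-limiting step (Step 5), here is a tiny token-bucket limiter in Python; the capacity and refill rate are arbitrary illustrative values.

import time

class TokenBucket:
    def __init__(self, capacity=10, refill_per_sec=5):
        self.capacity = capacity
        self.tokens = capacity
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to the time elapsed since the last call.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # over the limit: the gateway would reject (e.g., HTTP 429)

bucket = TokenBucket()
print(bucket.allow())  # True until the bucket drains faster than it refills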
Over to you: 1) What’s the difference between a load balancer and an API gateway?
2) Do we need to use different API gateways for PC, mobile and browser separately?
--
How to choose the right database for your project
Choosing the right database is essential for your project's success and scalability. Here
are the key factors to consider:
1. Data Structure
- Use relational databases like MySQL or PostgreSQL for structured data.
- Opt for NoSQL databases such as MongoDB or Cassandra for unstructured data.
Popular database choices include relational databases like MySQL and PostgreSQL,
NoSQL options such as MongoDB and Redis, NewSQL databases like CockroachDB, and
graph databases like Neo4j.
Selecting the right database involves balancing your data needs, scalability,
performance, and budget. Make an informed choice to support your application's growth
and success.
Load Balancers, Reverse Proxies, Forward Proxies, and API
Gateways
Load Balancer
What: Distributes incoming traffic across multiple servers to enhance availability, scalability, and reliability, operating at either the transport layer (Layer 4) or the application layer (Layer 7).
Use Cases: Ideal for balancing web or app traffic, preventing server overload, and
ensuring fault tolerance in high-traffic environments.
Question: How does your current setup handle traffic spikes, and do you think a load
balancer could optimize it further?
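As an illustration only (production load balancers such as NGINX or AWS ELB add health checks, weighting, and much more), round-robin distribution can be sketched in a few lines of Python; the server addresses are made up.

import itertools

servers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]  # hypothetical backend pool
rotation = itertools.cycle(servers)

def pick_server():
    # Each call hands the next request to the next server in the pool.
    return next(rotation)

for _ in range(5):
    print(pick_server())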
Reverse Proxy
What: An intermediary that forwards client requests to the appropriate servers, often
enhancing security and load balancing at the application layer (Layer 7).
Use Cases: Shields internal servers, handles SSL/TLS encryption, and balances incoming
requests to improve performance and security.
Question: Is your application protected by a reverse proxy? What benefits have you
observed, or would you like to see?
Forward Proxy
What: An intermediary for clients accessing external resources, masking client identity
and offering caching and content filtering at the application layer.
Use Cases: Provides client anonymity, controls internet access, and optimizes bandwidth
by caching frequently accessed content.
Question: Have you considered using a forward proxy for better client privacy or content
control in your organization?
API Gateway
What: A central entry point for managing APIs, with features like authentication, rate
limiting, and logging, operating at the application layer.
Use Cases: Secures and manages APIs, enforces policies, limits abuse, and provides
logging for analytics and compliance.
Question: Are you leveraging an API gateway for your microservices? How has it
impacted your API security and performance?
In essence, Load Balancers focus on traffic distribution, Reverse Proxies boost server
security and performance, Forward Proxies manage client access and anonymity, and
API Gateways secure and streamline API management.
Which of these components could bring the most benefit to your current system
architecture?
To better understand how an API gateway works, 𝗹𝗲𝘁'𝘀 𝗹𝗼𝗼𝗸 𝗮𝘁 𝗵𝗼𝘄 𝗶𝘁 𝗽𝗿𝗼𝗰𝗲𝘀𝘀𝗲𝘀 𝗮
𝗿𝗲𝗾𝘂𝗲𝘀𝘁:
𝟭) 𝗥𝗲𝗾𝘂𝗲𝘀𝘁 𝗿𝗲𝗰𝗲𝗽𝘁𝗶𝗼𝗻
Client requests are sent to the API gateway, which acts as the entry point for all incoming API traffic, rather than directly accessing the backend services.
𝟮) 𝗥𝗲𝗾𝘂𝗲𝘀𝘁 𝘃𝗮𝗹𝗶𝗱𝗮𝘁𝗶𝗼𝗻
The API gateway processes and validates the request’s attributes to ensure it’s correctly
formatted.
𝟯) 𝗦𝗲𝗰𝘂𝗿𝗶𝘁𝘆 𝗰𝗵𝗲𝗰𝗸𝘀
It then performs checks against allow-lists and deny-lists to filter out unauthorized or
harmful requests.
𝟰) 𝗔𝘂𝘁𝗵𝗲𝗻𝘁𝗶𝗰𝗮𝘁𝗶𝗼𝗻 𝗮𝗻𝗱 𝗮𝘂𝘁𝗵𝗼𝗿𝗶𝘇𝗮𝘁𝗶𝗼𝗻
The API gateway checks for proper authentication (e.g., verifying tokens or credentials) and ensures the client has the necessary permissions (authorization) to access the requested resources.
𝟱) 𝗥𝗮𝘁𝗲 𝗹𝗶𝗺𝗶𝘁𝗶𝗻𝗴
Rate limiting rules are enforced; if the request exceeds the allowed limit, it’s rejected.
𝟲) 𝗥𝗼𝘂𝘁𝗶𝗻𝗴
Once the request passes these basic checks, the API gateway finds the relevant service to route the request to by matching the path.
𝟳) 𝗣𝗿𝗼𝘁𝗼𝗰𝗼𝗹 𝘁𝗿𝗮𝗻𝘀𝗹𝗮𝘁𝗶𝗼𝗻
The API gateway transforms the request into the appropriate protocol and sends it to the
service.
𝟴) 𝗥𝗲𝘀𝗽𝗼𝗻𝘀𝗲 𝗮𝗴𝗴𝗿𝗲𝗴𝗮𝘁𝗶𝗼𝗻
If the request requires data from multiple services, the API gateway aggregates the
responses. It sends requests to each relevant service, collects the results, and composes
them into a single, cohesive response.
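A hedged sketch of this aggregation step, using Python threads to call two hypothetical backend services concurrently and compose their results; the fetch functions stand in for real HTTP calls.

from concurrent.futures import ThreadPoolExecutor

def fetch_user_profile(user_id):
    # Hypothetical call to the user service.
    return {"id": user_id, "name": "example"}

def fetch_order_history(user_id):
    # Hypothetical call to the order service.
    return [{"order_id": 1, "total": 42.0}]

def aggregate(user_id):
    with ThreadPoolExecutor() as pool:
        profile = pool.submit(fetch_user_profile, user_id)
        orders = pool.submit(fetch_order_history, user_id)
        # Compose the partial responses into one payload for the client.
        return {"profile": profile.result(), "orders": orders.result()}

print(aggregate(7))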
𝟵) 𝗥𝗲𝘀𝗽𝗼𝗻𝘀𝗲 𝗱𝗲𝗹𝗶𝘃𝗲𝗿𝘆
The gateway sends the processed response back to the client, ensuring it’s delivered in
the expected format and within an optimal timeframe.
Throughout this process, the API gateway logs each request and response and monitors key metrics such as latency, error rates, and throughput. These logs and metrics help troubleshoot, scale, and optimize the system. It also handles faults through circuit breaking and provides response caching.
An API gateway is a powerful tool that not only simplifies client interactions with
microservices but also enhances security, performance, and reliability through
comprehensive request processing & monitoring.
𝐀𝐖𝐒 𝐖𝐞𝐛 𝐀𝐩𝐩𝐥𝐢𝐜𝐚𝐭𝐢𝐨𝐧 𝐑𝐞𝐟𝐞𝐫𝐞𝐧𝐜𝐞 𝐀𝐫𝐜𝐡𝐢𝐭𝐞𝐜𝐭𝐮𝐫𝐞❗
The AWS Network Architecture is a vast and complex system. But it is also one of the
most powerful tools available to cloud network specialists.
With it, you can create a network tailored to your application's specific needs.
✅Virtual Private Cloud (VPC): The foundation of AWS network architecture is the VPC. It's
a logically isolated section of the AWS cloud where you can launch resources in a virtual
network that you define. VPC enables you to control IP address ranges, subnets, route
tables, security groups, and network gateways.
✅Subnets: Within a VPC, you create subnets to segment the IP address range. Subnets can be public (accessible from the Internet) or private (not accessible from the Internet). They help organize and control network traffic flow (a small boto3 sketch follows this list).
✅Route Tables: Route tables define how traffic is routed between subnets and to external
networks. They determine the paths that network traffic takes within the VPC.
✅Security Groups: Security groups act as virtual firewalls for instances. They control
inbound and outbound traffic based on rules you define. Each instance can be associated
with one or more security groups.
✅VPC Peering: VPC peering allows you to connect multiple VPCs together, enabling direct
communication between them. Peered VPCs can route traffic between them as if they
were part of the same network.
✅Elastic Load Balancing (ELB): ELB distributes incoming application traffic across multiple
instances for better availability and fault tolerance.
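As a hedged sketch of the VPC and subnet pieces above (not an official reference implementation), creating them with the boto3 SDK might look like this; the region and CIDR ranges are illustrative assumptions, and valid AWS credentials are required.

import boto3

# Assumes AWS credentials are configured; the region is an arbitrary choice.
ec2 = boto3.client("ec2", region_name="us-east-1")

vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# Carve a subnet out of the VPC's address range.
subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")
print(vpc_id, subnet["Subnet"]["SubnetId"])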
𝐀𝐳𝐮𝐫𝐞 𝐒𝐨𝐥𝐮𝐭𝐢𝐨𝐧𝐬 𝐚𝐫𝐜𝐡𝐢𝐭𝐞𝐜𝐭 𝐌𝐢𝐧𝐝 𝐌𝐚𝐩❗
Do you want to improve your Azure abilities or pursue a career as a cloud architect?
This post explores a powerful tool - mind maps - to simplify understanding and
communication of complex Azure Solutions Architecture.
The Azure Solutions Architecture mind map is a great example of how mind maps can be
used to visualize complicated information. You may greatly enhance your understanding
and communication of cloud designs by including multiple sections in your mind map.
✔ Workload Types: To inform your service and architectural decisions, determine the
kinds of workloads that are already being executed on Azure (web applications, batch
processing, etc.).
✔ Monitoring: To proactively identify and resolve problems, use Azure Monitor to track
the availability, health, and performance of Azure services.
✔ Connection: Create a safe and effective infrastructure by learning about virtual private
networks (VPNs) and other networking concepts.
✔ Security: Make use of Azure’s built-in security tools to implement methods to protect
cloud resources and data.
✔ Identity: Utilise Azure Active Directory to control user identities and access for single
sign-on (SSO), multi-factor authentication (MFA), and authentication.
𝐖𝐚𝐲𝐬 𝐭𝐨 𝐎𝐩𝐭𝐢𝐦𝐢𝐳𝐞 𝐘𝐨𝐮𝐫 𝐂𝐥𝐨𝐮𝐝
𝐃𝐚𝐭𝐚 𝐒𝐭𝐨𝐫𝐚𝐠𝐞
Implement tiered storage for data based on how frequently it's accessed.
Use data compression and deduplication to minimize storage costs.
Archive infrequently accessed data to lower-cost storage solutions.
𝐂𝐚𝐜𝐡𝐢𝐧𝐠 𝐒𝐭𝐫𝐚𝐭𝐞𝐠𝐢𝐞𝐬
Use caching to temporarily store data for quicker retrieval.
Employ edge caching to deliver data faster by storing it near users.
In-memory caching stores data in server memory, speeding up applications.
𝐀𝐮𝐭𝐨-𝐒𝐜𝐚𝐥𝐢𝐧𝐠
Automatically adjust resources in response to demand changes.
Create scaling policies for managing resources during different usage periods.
Save costs and enhance performance with dynamic resource allocation.
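The idea of a scaling policy can be illustrated with a toy decision function; the thresholds and step size are arbitrary, and real platforms (e.g., AWS Auto Scaling) add cooldowns and many more signals.

def desired_instances(current: int, cpu_percent: float,
                      scale_up_at: float = 70.0, scale_down_at: float = 30.0,
                      minimum: int = 2, maximum: int = 20) -> int:
    # Scale out under load, scale in when idle, always staying within bounds.
    if cpu_percent > scale_up_at:
        return min(current + 1, maximum)
    if cpu_percent < scale_down_at:
        return max(current - 1, minimum)
    return current

print(desired_instances(current=4, cpu_percent=85.0))  # -> 5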
𝐌𝐚𝐧𝐚𝐠𝐞𝐝 𝐒𝐞𝐫𝐯𝐢𝐜𝐞𝐬
Delegate routine management tasks to your cloud provider.
Leverage provider expertise for improved performance and reliability.
Free up internal resources to focus on core business activities.
𝐂𝐥𝐨𝐮𝐝-𝐍𝐚𝐭𝐢𝐯𝐞
Develop applications using microservices for better scalability.
Allow microservices to scale independently for precise resource allocation.
Benefit from efficient and resilient cloud-native architectures.
𝐂𝐨𝐬𝐭 𝐌𝐚𝐧𝐚𝐠𝐞𝐦𝐞𝐧𝐭
Monitor your cloud expenditure closely.
Use cloud provider tools for cost management.
Set up budget alerts to keep spending under control.
Regularly review and optimize your pricing plans and reserved instances.
𝐈𝐧𝐟𝐫𝐚𝐬𝐭𝐫𝐮𝐜𝐭𝐮𝐫𝐞-𝐚𝐬-𝐂𝐨𝐝𝐞
Manage cloud infrastructure through code for consistency.
Automate infrastructure provisioning to reduce errors.
Simplify scaling by updating infrastructure as code.
𝐏𝐞𝐫𝐟𝐨𝐫𝐦𝐚𝐧𝐜𝐞 𝐌𝐨𝐧𝐢𝐭𝐨𝐫𝐢𝐧𝐠
Use performance monitoring tools to track resource usage and application performance.
Set up alerts for unusual activity or resource usage spikes.
URL FLOW
Ever wondered what happens when you type a URL into your browser’s address bar?
Here’s a breakdown of the journey behind the scenes:
𝗗𝗡𝗦 𝗟𝗼𝗼𝗸𝘂𝗽: The browser needs to find the IP address for the domain you entered (like
"google.com"). It queries the Domain Name System (DNS), which acts like the internet’s
phonebook, converting the domain name into a numerical IP address.
𝗦𝗲𝗻𝗱𝗶𝗻𝗴 𝗮𝗻 𝗛𝗧𝗧𝗣 𝗥𝗲𝗾𝘂𝗲𝘀𝘁: The browser sends an HTTP request to the server, asking
for the content of the webpage. This request usually contains details like the browser
type and any cookies associated with the site.
𝗦𝗲𝗿𝘃𝗲𝗿 𝗣𝗿𝗼𝗰𝗲𝘀𝘀𝗶𝗻𝗴: The server receives the request, processes it, fetches any
necessary data, and generates the response. This can involve running backend scripts,
retrieving database records, or loading static files.
𝗥𝗲𝗰𝗲𝗶𝘃𝗶𝗻𝗴 𝘁𝗵𝗲 𝗥𝗲𝘀𝗽𝗼𝗻𝘀𝗲: The server then sends back a response, which typically
includes HTML, CSS, and JavaScript files. This data tells the browser what to display and
how to style it.
𝗥𝗲𝗻𝗱𝗲𝗿𝗶𝗻𝗴 𝘁𝗵𝗲 𝗣𝗮𝗴𝗲: Finally, the browser processes the files, executes scripts, and
renders the web page on your screen. It takes all these steps to transform a simple URL
into the fully loaded page you see.
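The DNS lookup and HTTP request steps above can be reproduced with the Python standard library; this simplified sketch ignores HTTPS, redirects, and caching, and example.com is just a convenient test host.

import socket
import http.client

host = "example.com"

# 1. DNS lookup: resolve the domain name to an IP address.
ip_address = socket.gethostbyname(host)
print(host, "->", ip_address)

# 2. HTTP request: ask the server for the page (plain HTTP for simplicity).
conn = http.client.HTTPConnection(host, 80, timeout=10)
conn.request("GET", "/", headers={"Host": host, "User-Agent": "demo"})
response = conn.getresponse()
print(response.status, response.reason)  # e.g., 200 OK
html = response.read()                   # the document the browser would render
conn.close()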
It’s a fascinating process, with many components working together to deliver content
almost instantly. Understanding this helps developers optimize performance and
troubleshoot effectively.
𝐂𝐨𝐦𝐩𝐨𝐧𝐞𝐧𝐭𝐬 𝐨𝐟 𝐌𝐢𝐜𝐫𝐨𝐬𝐞𝐫𝐯𝐢𝐜𝐞𝐬 𝐀𝐫𝐜𝐡𝐢𝐭𝐞𝐜𝐭𝐮𝐫𝐞
1. 𝐂𝐥𝐢𝐞𝐧𝐭
These are the end-users who interact with the application via different interfaces like
web, mobile, or PC.
3. 𝐋𝐨𝐚𝐝 𝐁𝐚𝐥𝐚𝐧𝐜𝐞𝐫
It distributes incoming network traffic across multiple servers, ensuring no single server
becomes a bottleneck and improving the application's availability and reliability.
4. 𝐀𝐏𝐈 𝐆𝐚𝐭𝐞𝐰𝐚𝐲
An API Gateway acts as an entry point for all clients, handling tasks like request routing,
composition, and protocol translation, which helps manage multiple microservices
behind the scenes.
5. 𝐌𝐢𝐜𝐫𝐨𝐬𝐞𝐫𝐯𝐢𝐜𝐞𝐬
Each microservice is a small, independent service that performs a specific business
function. They communicate with each other via APIs.
6. 𝐌𝐞𝐬𝐬𝐚𝐠𝐞 𝐁𝐫𝐨𝐤𝐞𝐫
A message broker facilitates communication between microservices by sending
messages between them, ensuring they remain decoupled and can function
independently.
7. 𝐃𝐚𝐭𝐚𝐛𝐚𝐬𝐞𝐬
Each microservice typically has its own database to ensure loose coupling. This can mean different databases for different microservices.
8. 𝐈𝐝𝐞𝐧𝐭𝐢𝐭𝐲 𝐏𝐫𝐨𝐯𝐢𝐝𝐞𝐫
This component handles user authentication and authorization, ensuring secure access
to services.
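To make the message-broker idea (component 6 above) concrete, here is a toy in-memory publish/subscribe sketch; real brokers such as RabbitMQ or Kafka add persistence, delivery guarantees, and network transport.

from collections import defaultdict

class InMemoryBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver the message to every service subscribed to the topic.
        for callback in self.subscribers[topic]:
            callback(message)

broker = InMemoryBroker()
broker.subscribe("order.created", lambda msg: print("billing saw:", msg))
broker.subscribe("order.created", lambda msg: print("shipping saw:", msg))
broker.publish("order.created", {"order_id": 123})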
Let's break down the most commonly used Linux commands by category:
1. Process Management
At the heart of Linux system administration is process management. Essential
commands include:
top -u username            # show running processes owned by a specific user
kill 1234                  # send SIGTERM to the process with PID 1234
killall firefox            # terminate all processes named "firefox"
htop -s PID                # interactive process viewer, sorted by the PID column
2. User Management
Controlling who can log in and which groups they belong to:
passwd caesar              # change the password for user "caesar"
userdel -r username        # delete a user and remove their home directory
groupadd developers        # create a group named "developers"
groupdel developers        # delete the "developers" group
groups username            # list the groups a user belongs to
3. System Information
Understanding your system's status is vital for troubleshooting and maintenance:
uname -a                   # kernel name, version, and machine architecture
df -h                      # disk usage per filesystem, human-readable
du -sh /path/to/directory  # total size of a directory
free -m                    # memory usage in megabytes
lscpu --json               # CPU details in JSON format
lshw -c network            # hardware details for the network class
lsblk -d                   # list block devices without their partitions
4. Package Management
Different distributions use various package managers, but common commands include:
apt-get/apt: Debian/Ubuntu package management
5. Networking
Checking connectivity and inspecting interfaces and sockets:
ifconfig eth0 up           # bring the eth0 interface up (legacy; "ip link set eth0 up" is the modern equivalent)
ping google.com            # test reachability of a host
ss -tlnp                   # list listening TCP sockets with their owning processes
Start with the basics: learn core system design concepts before tackling advanced topics.
- Mayank Ahuja
- Rajat Gajbhiye
- Nikki Siapno
- Saurabh Dashora
- Rocky Bhatia
- Alexandre Zajac
- Raul Junco
𝐄𝐧𝐠𝐢𝐧𝐞𝐞𝐫𝐢𝐧𝐠 𝐋𝐞𝐚𝐝𝐞𝐫𝐬𝐡𝐢𝐩
- Anemari Fiser
- Irina Stanescu
- Bhavana Hindupur
- Gregor Ojstersek
𝐂𝐚𝐫𝐞𝐞𝐫 𝐆𝐫𝐨𝐰𝐭𝐡:
- Omar Halabieh
- Hina Arora
- Brooke Sweedar
- 🌻 Anna Miller
- Jade Wilson
- Callie Buruchara
Follow Ashish Misal for insightful content on System Design, JavaScript, and other MERN technologies!
2. 𝗔𝗽𝗽𝗹𝗶𝗰𝗮𝘁𝗶𝗼𝗻 𝗦𝗲𝗿𝘃𝗲𝗿: Hosts and runs web applications, providing business logic to
client applications (e.g., Tomcat, JBoss).
4. 𝗙𝗶𝗹𝗲 𝗦𝗲𝗿𝘃𝗲𝗿: Stores and provides shared access to files (e.g., FTP, Samba).
5. 𝗠𝗮𝗶𝗹 𝗦𝗲𝗿𝘃𝗲𝗿: Sends and receives emails (e.g., Microsoft Exchange, Postfix).
6. 𝗣𝗿𝗼𝘅𝘆 𝗦𝗲𝗿𝘃𝗲𝗿: Acts as an intermediary between client requests and other servers to
enhance performance or security (e.g., Squid).
8. 𝗩𝗶𝗿𝘁𝘂𝗮𝗹 𝗦𝗲𝗿𝘃𝗲𝗿: Allows multiple virtual servers to run on one physical machine (e.g.,
VMware, Hyper-V).
Evaluate Monolithic Structure: Identify the monolithic parts that can be split into
services.
Set Goals: Why are you moving to microservices? Scalability, agility, maintainability?
Identify Business Domains: Map the functionality of your application to specific business
domains (Domain-Driven Design).
3. Design Phase
Service Boundaries: Define the boundaries for each service based on business
functionality (e.g., user service, payment service).
Database Design: Plan whether each microservice will have its own database
(recommended) or use a shared one. Consider data consistency challenges.
API Design: Design clear and consistent APIs for service communication.
Security: Plan for authentication (OAuth, JWT, etc.), authorization, and secure
communication between services.
Programming Languages & Frameworks: Decide on the tech stack (e.g., Java, Node.js,
Python).
API Gateways: Use API gateways (like Kong, Istio) to handle cross-cutting concerns such
as security, routing, and throttling.
Service Discovery: Implement service discovery tools like Consul or Eureka for locating
services.
Containers & Orchestration: Use Docker and Kubernetes for packaging and orchestrating
services.
Version Control & CI/CD Pipelines: Use Git for version control and set up CI/CD pipelines
to automate testing, building, and deployment (e.g., Jenkins, GitLab CI).
Automated Testing: Unit tests, integration tests, and contract testing for microservices
interactions.
Monitoring: Use tools like Prometheus and Grafana for real-time monitoring.
Tracing: Implement distributed tracing (e.g., Jaeger, Zipkin) to track requests as they
move through services.
Rate Limiting and Circuit Breakers: Implement patterns like circuit breakers (e.g., Hystrix) to handle failures gracefully (a minimal sketch follows this checklist).
Security Hardening: Ensure that services are secure with practices like TLS, API key
management, and secure service-to-service communication.
Failover & Replication: Build redundancy and failover mechanisms for resilience.
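As a hedged illustration of the circuit-breaker item in this checklist (thresholds and timings are arbitrary; libraries like Hystrix or Resilience4j implement the pattern far more completely):

import time

class CircuitBreaker:
    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, func, *args, **kwargs):
        # While the circuit is open, fail fast instead of hammering a sick service.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None   # half-open: let one trial call through
            self.failures = 0
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0
        return result

breaker = CircuitBreaker()
# breaker.call(call_payment_service, order_id)  # hypothetical usage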
Microservice Architecture: Key Components and Leading
Technologies
1. Distributed Transactions: Ensures data consistency across services.
Examples: Saga Pattern, Apache Kafka
Which of these technologies have you found most impactful in your microservices
journey?
-Format
-Keywords
-Projects
-Work Experience
-Enterprise Architect
-Cloud Architect
-Delivery Manager
Interview
-System Designs
Ques: