

REDIS FOR MICROSERVICES

Build resilient and highly available microservices

Try for free



CHOOSE A REAL-TIME DATA LAYER FOR YOUR MICROSERVICES ARCHITECTURE

Microservice architectures make it possible to launch new products faster, scale
them more easily, and respond better to customer demands. With multiple modern
data models, fault tolerance at every level of the deployment, multi-tenancy for
isolation, and the flexibility to deploy across multiple environments, Redis
lets developers and operators optimize their data layer for a microservices
architecture.



Since Redis and microservices [removed] the constraints of our old architecture,
it [increased] our speed of deployment 2x – 3x within the first year.

Paul Kurmas

Director of Strategic Product Development, Mutualink

When we receive a telephone call, we need to make a decision on how a call
should be routed, whether it should be allowed or blocked. All of that process
is done via interactions with Redis and it has to be done in real time.

Alec Fenichel

Senior Software Architect, TransNexus



WHAT IS A MICROSERVICES ARCHITECTURE?

As defined by Chris Richardson, noted microservices expert, microservices
architecture is an architectural style for structuring apps as a collection of
loosely coupled services that are highly maintainable and testable,
independently deployable, bounded by specific business domains, and owned by
small teams. A microservice architecture enables the rapid, frequent, and
reliable delivery of large, complex apps. 


WHY MICROSERVICES MATTER

Microservices-based apps support strategic digital transformation & cloud
migration initiatives

Microservices is an architecture style that has helped development teams create
better software faster while minimizing the costs and complexity of app
modernization. As a result, microservices architectures have been adopted across
all industries, both for projects that justifiably can be labeled “digital
transformation initiatives” and for more mundane but important tasks such as
bootstrapping cloud deployments.

This architecture style and its related software development culture enable
microservices development teams to operate on their own release cycles, embrace
end-to-end product ownership, and adopt a DevOps framework built on continuous
integration/continuous delivery (CI/CD). The result is that enterprises can
reduce time-to-market for new service development, often cutting projects from
months to days.

Microservices also accelerate data-tier cloud migrations, because they rely
primarily on cloud-native NoSQL databases. According to a 2021 IDC InfoBrief
survey, NoSQL databases are replacing on-premises relational databases that were
built neither for the cloud nor for independent release cycles.

In addition, some organizations cannot migrate their legacy monolith apps to
cloud-native all at once. Microservices enable incremental migration of
subdomains from a monolithic architecture to modern technology stacks.


A PERFECT SOLUTION FOR MICROSERVICES


PERFORMANCE AT MICROSERVICES SCALE

In a microservices environment, services that need to run in real time must
compensate for networking overhead. Enterprise-grade Redis delivers
sub-millisecond latency for all Redis data types and models, and it scales
instantly and linearly to almost any throughput needed.




DESIGNED FOR FAULT TOLERANCE & RESILIENCE

To ensure your apps are resilient to failures, Redis uses a shared-nothing
cluster architecture. It is fault tolerant at all levels, with automated
failover at the process level, for individual nodes, and even across
infrastructure availability zones. It also includes tunable persistence and
disaster recovery.




REDUCE COMPLEXITY WITH FAST & FLEXIBLE DATA MODELS

Redis allows developers to choose the data model best suited to the performance
and data-access requirements of their microservices architecture and
domain-driven design, while retaining isolation with multi-tenant deployment on
a single data platform.




SIMPLIFY OPERATIONS WITH NATIVE KUBERNETES DEPLOYMENT

Redis provides a unified operational interface that reduces technology sprawl,
simplifies operations, and reduces service latency. The Redis Operator for
Kubernetes gives you consistent, automated deployments to reduce risk. That lets
dev teams focus on innovation and business value.


ADAPTABLE ACROSS CLOUDS & GEOGRAPHIES

Choose where your database should run. Redis can be deployed anywhere: on any
cloud platform, on-premises, or in a multicloud or hybrid cloud architecture.




DESIGN PATTERNS FOR MICROSERVICE ARCHITECTURES


MICROSERVICE QUERY CACHING

Isolation, or bounded context, is an important characteristic of a microservice
architecture. As part of domain-driven design, each service can have a dedicated
database with its own data model and its own service-level agreement (SLA)
performance goals. Query caching, a cache pattern commonly used to reduce
microservice response times, works by deploying a Redis cache alongside each
microservice to deliver data that is needed within a single business context.
(That is, it serves only one microservice.)
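
Below is a minimal cache-aside sketch of this pattern in Python with redis-py.
The connection settings, key names, TTL, and the fetch_order_from_db helper are
hypothetical stand-ins for a service's own database access.

import json
import redis

# Cache deployed alongside this microservice (hypothetical connection settings).
r = redis.Redis(host="localhost", port=6379, decode_responses=True)

CACHE_TTL_SECONDS = 60  # hypothetical freshness window for cached entries

def fetch_order_from_db(order_id: str) -> dict:
    """Placeholder for the service's own (slower) database query."""
    return {"id": order_id, "status": "shipped"}

def get_order(order_id: str) -> dict:
    """Cache-aside read: try Redis first, fall back to the database on a miss."""
    key = f"orders:cache:{order_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)
    order = fetch_order_from_db(order_id)
    # Cache the result with a TTL so stale entries expire on their own.
    r.set(key, json.dumps(order), ex=CACHE_TTL_SECONDS)
    return order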

Redis Smart Cache is an open source library that seamlessly adds caching to any
JDBC-compliant platform, app, or microservice, improving query performance while
reducing operational complexity, with no changes to app code required. Redis
supports multiple data models that can easily be deployed multi-tenant yet
remain isolated, all without sacrificing performance.

Learn more




CACHING CROSS-DOMAIN SHARED DATA VIA CQRS

Microservices need fast access to data, but that can be a challenge when dozens
or hundreds of microservices try to read from the same slow disk-based database.
Cross-domain data needs to be available to each microservice in real time,
without breaking the scope of its focused business context and goal.

Command Query Responsibility Segregation (CQRS) is a pre-fetch cache pattern,
common in microservice architectures, that decouples reads (queries) from writes
(commands). It enables an app to write data to a slower disk-based SQL database
while pre-fetching and caching that data in Redis, using integrated Change Data
Capture (CDC), for lightning-fast reads. Doing so makes the data immediately
available to other microservices that need it.
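
The read side of this pattern can be sketched in Python with redis-py as below.
The stream name, field layout, and key names are hypothetical, and the
hand-rolled consumer loop is only a simplified stand-in for an integrated CDC
pipeline such as Redis Data Integration, which is configured declaratively
rather than coded by hand.

import redis

r = redis.Redis(decode_responses=True)

CHANGES_STREAM = "cdc:customers"  # hypothetical stream carrying change events

def apply_change_events(last_id: str = "0-0") -> str:
    """Materialize change events into per-customer hashes for fast reads.

    Simplified stand-in for a CDC pipeline: events are assumed to already
    be in a Redis Stream, each carrying a customer_id plus changed fields.
    """
    entries = r.xread({CHANGES_STREAM: last_id}, count=100, block=5000)
    for _stream, messages in entries:
        for msg_id, fields in messages:
            customer_id = fields.pop("customer_id")
            r.hset(f"customer:{customer_id}", mapping=fields)
            last_id = msg_id
    return last_id

def get_customer(customer_id: str) -> dict:
    """Query side of CQRS: reads never touch the slower write database."""
    return r.hgetall(f"customer:{customer_id}")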

Learn more


API GATEWAY CACHING FOR GLOBAL DATA

Microservices apps can cache globally accessed data at the API gateway level to
distribute and speed up data that is accessed by all services. Typically this is
session data (such as user ID and preferences) and authentication data (tokens,
authorization status, permissions). Doing so makes frequently needed data
available in real time to all services, reducing app latency without breaking
the bounds of each microservice's business context.

Rate limiting, which meters the number of API requests allowed within a given
timespan, can also be implemented at the API gateway using Redis. This prevents
the system from being overloaded and helps mitigate DDoS attacks.
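
As an illustration, here is a minimal fixed-window rate limiter in Python with
redis-py. The window length, request budget, and key naming are hypothetical
choices; production gateways often use more elaborate schemes such as sliding
windows or token buckets.

import redis

r = redis.Redis(decode_responses=True)

WINDOW_SECONDS = 60   # hypothetical rate-limit window
MAX_REQUESTS = 100    # hypothetical per-client budget within one window

def allow_request(client_id: str) -> bool:
    """Fixed-window limiter: count requests per client, reject over budget."""
    key = f"ratelimit:{client_id}"
    count = r.incr(key)  # atomic increment of this client's counter
    if count == 1:
        # First request of the window: start the expiry clock.
        r.expire(key, WINDOW_SECONDS)
    return count <= MAX_REQUESTS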




ASYNCHRONOUS MESSAGING FOR INTER-SERVICE COMMUNICATION 

Microservices must communicate state, events, and data with one another without
breaking isolation, and they have to stay decoupled. A common solution is to
bring a publish/subscribe messaging broker into the architecture, that is, to
make inter-service communication event-driven and eventually consistent, and to
treat every message between microservices as an event.

A Redis Stream is an immutable, time-ordered log data structure that lets a
service (the producer) publish asynchronous messages to which multiple consumers
can subscribe. Streams can be configured for different delivery guarantees and
support consumer groups, features comparable to Apache Kafka topic partitions.
Redis Streams also make it straightforward to build reporting, analytics,
auditing, and forensic analysis on the backend.
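
A minimal producer/consumer sketch in Python with redis-py follows; the stream
name, consumer-group name, and event fields are hypothetical.

import redis

r = redis.Redis(decode_responses=True)

STREAM = "orders:events"   # hypothetical stream shared by producer and consumers
GROUP = "billing-service"  # hypothetical consumer group for one subscriber

def publish_order_created(order_id: str, total: str) -> str:
    """Producer side: append an event to the stream and return its entry ID."""
    return r.xadd(STREAM, {"type": "order_created", "order_id": order_id, "total": total})

def consume_events(consumer_name: str = "billing-1") -> None:
    """Consumer side: read new events on behalf of a consumer group."""
    try:
        r.xgroup_create(STREAM, GROUP, id="0", mkstream=True)
    except redis.exceptions.ResponseError:
        pass  # group already exists
    entries = r.xreadgroup(GROUP, consumer_name, {STREAM: ">"}, count=10, block=5000)
    for _stream, messages in entries:
        for msg_id, fields in messages:
            print(f"processing {msg_id}: {fields}")
            r.xack(STREAM, GROUP, msg_id)  # acknowledge once the event is handled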

Learn more


REDIS FEATURES FOR MICROSERVICE ARCHITECTURE


ACTIVE-ACTIVE REPLICATION

A microservices architecture has many connected services, yet it faces the same
performance demands as a monolithic app. To minimize latency, data should reside
as close to the services as possible. You also need to ensure databases remain
consistent with one another in the event of failures or conflicting updates.
Redis can be deployed as an Active-Active, conflict-free replicated database
that handles updates from multiple local installations of your services without
compromising latency or data consistency, while providing continuity in the
event of failures.


MULTIPLE DATA MODELS

Redis provides multiple data structures (hashes, strings, Streams, lists, etc.)
and models including JSON, search, time-series, and graph that let you choose
the data model best suited for your microservice domain, performance, and
data-access requirements. And it’s all in a single data platform.
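
For illustration, the sketch below shows a few of these models side by side in
Python with redis-py. All key names and values are hypothetical, and the JSON
call assumes the JSON capability is available on the target deployment.

import redis

r = redis.Redis(decode_responses=True)

# Hash: flat profile data for a session or user service (hypothetical fields).
r.hset("user:42:profile", mapping={"name": "Ada", "plan": "pro"})

# Sorted set: a leaderboard for a gaming service, scored by points.
r.zadd("leaderboard:global", {"player:42": 1500})

# Stream: an append-only event log for inter-service messaging or auditing.
r.xadd("audit:events", {"actor": "user:42", "action": "login"})

# JSON document: nested order data (assumes the JSON model is enabled).
r.json().set("order:1001", "$", {"items": [{"sku": "A1", "qty": 2}], "status": "new"})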


MULTI-TENANT DATABASES

In a microservices database design, a single Redis cluster can provide databases
to many different services, each with its own isolated instance tuned for its
workload. Each database instance is deployed, scaled, and modeled independently
of the others while sharing the same cluster environment, isolating data between
services without increasing operational complexity.


FLEXIBLE ACROSS CLOUDS

Microservices provide a great deal of technology flexibility, and choosing where
you want to run your database should be no exception. Redis can be deployed
anywhere: on any cloud platform, on-premises, or in a multicloud or hybrid-cloud
architecture. It is also available on Kubernetes, Pivotal Kubernetes Service
(PKS), and Red Hat OpenShift.


NATIVE KUBERNETES CONTAINER ORCHESTRATION AND MANAGEMENT

Containers are closely aligned with microservice apps and help enterprises
implement them. Kubernetes is the de facto standard platform for container
deployment, scheduling, and orchestration. Redis is the top database technology
running on containers, with over two billion Docker Hub launches. Redis Operator
for Kubernetes provides automatic scalability, persistent storage volumes,
simplified database endpoint management, and zero-downtime rolling upgrades. It
is available on multiple Kubernetes platforms and cloud managed services,
including Red Hat OpenShift, VMware Tanzu Kubernetes Grid (formerly Enterprise
PKS), upstream Kubernetes, Azure Kubernetes Service (AKS), Google Kubernetes
Engine (GKE), and Amazon Elastic Kubernetes Service (EKS).


RELATED RESOURCES

VIDEO


Deploying a Microservice Data Layer on Kubernetes

Learn more

VIDEO


Cache and Message Broker for Microservices

Learn more

VIDEO


How Redis Fits with a Microservices Architecture

Learn more

VIDEO


Redis and Kafka – Advanced Microservices Design Patterns Simplified

Learn more

VIDEO


Understanding Streams in Redis and Kafka – A Visual Guide

Learn more

VIDEO


How Redis Enterprise Powers TransNexus’ Microservices Architecture to Help Fight
Robocallers

Learn more

VIDEO


Microservices and the Data Layer—a New IDC InfoBrief

Learn more

VIDEO


How to Use Redis as an Event Store for Communication Between Microservices

Learn more

VIDEO


Adopting a Microservices Architecture? Don’t Forget the Data Layer!

Learn more

VIDEO


How Mutualink Uses Redis to Support a Life-Saving Microservices Architecture

Learn more

VIDEO


Building Microservices With Redis

Learn more

VIDEO


Microservices With Redis Enterprise on Kubernetes

Learn more

VIDEO


How to Use Redis in Infrastructure Microservices

Learn more

VIDEO


How to Synchronize Data Across Microservices

Learn more

VIDEO


How to Embed Redis into Your Continuous Integration and Continuous Deployment
(CI/CD) Process

Learn more

VIDEO


Redis-Powered Microservices Architecture Proves Its Worth for Z3 Works

Learn more


FAQ

 * What are microservices?
   * Microservices architecture (often shortened to microservices) refers to an
     architectural style for developing applications. Microservices allow a
     large application to be separated into smaller independent parts, with each
     part having its own realm of responsibility. To serve a single user
     request, a microservices-based application can call on many internal
     microservices to compose its response.
 * What is the difference between monolithic architecture and microservices
   architecture?
   * In a monolithic architecture, processes are tightly coupled and run as a
     single deployable artifact. While this is relatively simple to begin with,
     scaling or modifying one part of the application requires redeploying the
     entire artifact, resulting in inefficient scaling and increasing complexity
     as the codebase grows.
     * Microservices architecture involves a collection of loosely coupled
       services that can be independently updated and scaled by smaller teams.
       Because individual services are easier to build, deploy, and manage than
       a single monolithic application, microservices enable more frequent
       deployments, data store autonomy, and increased flexibility.
     * Organizations are transitioning their entire applications to
       microservices architecture in order to drastically decrease time to
       market, more easily adopt new technologies, and respond faster to
       customer needs.
 * What is Kubernetes?
   * Kubernetes, also known as k8s, is an open-source orchestration system for
     automating deployment, scaling, and management of containerized
     applications, typically used as part of microservice and cloud native
     architectures. 
 * What are Docker containers?
   * Docker container images are lightweight, standalone, executable packages of
     software that include everything needed to run an application.
 * What is an API gateway?
   * An API gateway is a software application for API management that sits
     between a client and a set of backend microservices. The API gateway serves
     as a reverse proxy, accepting API calls from the client application and
     forwarding the traffic to the appropriate service.

