Microsoft Azure Cosmos DB Developer Specialty Quick Facts (2025)

A comprehensive overview of Exam DP-420 for the Microsoft Azure Cosmos DB Developer Specialty, summarizing the exam domains, format, cost, and study tips, along with hands-on topics such as data modeling, partitioning, integration, optimization, and maintenance, to help you prepare for and pass the certification.

The Microsoft Azure Cosmos DB Developer Specialty certification empowers you to design and implement world-class solutions on one of Microsoft’s premier databases. This overview highlights the essentials, ensuring you know what to expect and how to focus your preparation with clarity and confidence.

How does the Azure Cosmos DB Developer Specialty certification help you grow?

The Azure Cosmos DB Developer Specialty (exam DP-420) validates your ability to design and implement cloud-native applications by leveraging the performance, scalability, and flexibility of Azure Cosmos DB. This certification is ideal for developers who want to demonstrate expertise in building modern, globally distributed applications, including skills in data modeling, partitioning, security, and integration across Azure services. By earning it, you showcase your ability to create high-performing applications that use Azure Cosmos DB at scale, while also equipping yourself with the expertise to contribute to projects involving mission-critical, data-intensive workloads.

Exam Domain Breakdown

Domain 1: Design and implement data models (38% of the exam)

Design and implement a non-relational data model for Azure Cosmos DB for NoSQL

  • Develop a design by storing multiple entity types in the same container
  • Develop a design by storing multiple related entities in the same document
  • Develop a model that denormalizes data across documents
  • Develop a design by referencing between documents
  • Identify primary and unique keys
  • Identify data and associated access patterns
  • Specify a default time to live (TTL) on a container for a transactional store
  • Develop a design for versioning documents
  • Develop a design for document schema versioning

Summary: This section emphasizes how to shape data models that align with Cosmos DB's non-relational design approach. You will learn how to structure data in containers and documents, balance denormalization with referencing, and anticipate future schema evolution. A major focus is on ensuring efficient storage while aligning with the access patterns of your applications.

You will also explore strategies for ensuring longevity and clarity in schema management. This involves document versioning, the use of TTLs for transactional scenarios, and keeping designs adaptable to evolving workloads. At its core, this topic builds the foundation for performance and maintainability.
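To make these patterns concrete, here is a minimal, hypothetical Java sketch of a document class; the class and field names are illustrative, not part of the exam outline. It shows a type discriminator for storing multiple entity types in one container, embedded related entities, an explicit schema version, and an optional per-document ttl that overrides the container default.

```java
// Hypothetical POJO illustrating common NoSQL modeling patterns:
// multiple entity types share a container, related data is embedded,
// and a per-document "ttl" overrides the container default.
import java.util.List;

public class CustomerDocument {
    public String id;                 // unique within the logical partition
    public String customerId;         // partition key value
    public String type = "customer";  // discriminator for mixed-entity containers
    public int schemaVersion = 2;     // explicit document schema versioning
    public List<Address> addresses;   // embedded (denormalized) related entities
    public Integer ttl;               // optional per-document TTL in seconds

    public static class Address {
        public String street;
        public String city;
    }
}
```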

Design a data partitioning strategy for Azure Cosmos DB for NoSQL

  • Choose a partitioning strategy based on a specific workload
  • Choose a partition key
  • Plan for transactions when choosing a partition key
  • Evaluate the cost of using a cross-partition query
  • Calculate and evaluate data distribution based on partition key selection
  • Calculate and evaluate throughput distribution based on partition key selection
  • Construct and implement a synthetic partition key
  • Design and implement a hierarchical partition key
  • Design partitioning for workloads that require multiple partition keys

Summary: This section explores the critical design choice of partitioning. You will learn how to select and apply partition keys while carefully balancing throughput, transactions, and data distribution. The focus is on making informed partitioning decisions that allow Cosmos DB to scale predictably and efficiently.

Alongside partitioning basics, you will also experiment with advanced strategies like synthetic and hierarchical keys for complex workloads. These design considerations ensure that the database supports both performance and scalability as workloads evolve.
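The synthetic-key idea is easiest to see in code. Below is a minimal sketch, assuming the Azure Cosmos DB Java SDK v4 (com.azure:azure-cosmos); the container name, property names, and throughput value are hypothetical.

```java
import com.azure.cosmos.CosmosDatabase;
import com.azure.cosmos.models.CosmosContainerProperties;
import com.azure.cosmos.models.ThroughputProperties;

public class SyntheticKeyExample {
    public static void createOrdersContainer(CosmosDatabase database) {
        // Container partitioned on a synthetic key property that the app computes.
        CosmosContainerProperties props =
                new CosmosContainerProperties("orders", "/partitionKey");
        database.createContainerIfNotExists(
                props, ThroughputProperties.createManualThroughput(400));
    }

    // The application concatenates two properties to spread data more evenly
    // than either property would on its own.
    public static String syntheticKey(String tenantId, String orderDate) {
        return tenantId + "_" + orderDate;   // e.g. "contoso_2025-01-31"
    }
}
```

Hierarchical partition keys serve a similar purpose natively when the workload consistently filters on the same two or three properties.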

Plan and implement sizing and scaling for a database created with Azure Cosmos DB

  • Evaluate the throughput and data storage requirements for a specific workload
  • Choose between serverless, provisioned and free models
  • Choose when to use database-level provisioned throughput
  • Design for granular scale units and resource governance
  • Evaluate the cost of the global distribution of data
  • Configure throughput for Azure Cosmos DB by using the Azure portal

Summary: This section teaches you how to anticipate and manage resources for different workloads. You will evaluate when to use serverless versus provisioned modes and explore how to configure database-level throughput. Understanding request units (RUs) and their relationship to consumption is essential here.

Cost-awareness is equally important. You will learn to balance scale, governance, and global distribution decisions with cost efficiency to ensure sustainable use of Cosmos DB resources.
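On the provisioned side of the serverless-versus-provisioned decision, the sketch below (Java SDK v4, hypothetical names and values) creates a container with standard manual throughput and shows the autoscale alternative.

```java
import com.azure.cosmos.CosmosDatabase;
import com.azure.cosmos.models.CosmosContainerProperties;
import com.azure.cosmos.models.ThroughputProperties;

public class ThroughputExamples {
    public static void create(CosmosDatabase db) {
        CosmosContainerProperties props =
                new CosmosContainerProperties("events", "/deviceId");

        // Standard (manual) provisioned throughput: a fixed 400 RU/s.
        db.createContainerIfNotExists(props,
                ThroughputProperties.createManualThroughput(400));

        // Autoscale alternative: scales between 10% of the max and the max
        // (here 400-4,000 RU/s).
        // db.createContainerIfNotExists(props,
        //         ThroughputProperties.createAutoscaledThroughput(4000));
    }
}
```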

Implement client connectivity options in the Azure Cosmos DB SDK

  • Choose a connectivity mode (gateway versus direct)
  • Implement a connectivity mode
  • Create a connection to a database
  • Enable offline development by using the Azure Cosmos DB emulator
  • Handle connection errors
  • Implement a singleton for the client
  • Specify a region for global distribution
  • Configure client-side threading and parallelism options
  • Enable SDK logging

Summary: This section highlights how developers connect to Cosmos DB for both production and development. You will evaluate connectivity modes, such as gateway and direct, and implement them effectively to ensure smooth client interactions. The Cosmos DB emulator is introduced as a valuable tool for local testing.

Equally significant is mastering robustness. You will handle transient errors, implement singletons for efficiency, define client regions, and manage threading. These practices help applications remain reliable and performant.
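A minimal sketch of these connectivity practices follows, assuming the Java SDK v4; the environment variable names and region are placeholders. It wires up a single shared client with direct mode, a preferred region, and session consistency.

```java
import java.util.List;

import com.azure.cosmos.ConsistencyLevel;
import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;

public final class CosmosClientFactory {
    // One CosmosClient per application lifetime: the client is thread-safe and
    // manages its own connection pools, so creating it repeatedly wastes resources.
    private static final CosmosClient CLIENT = new CosmosClientBuilder()
            .endpoint(System.getenv("COSMOS_ENDPOINT"))   // emulator or real account
            .key(System.getenv("COSMOS_KEY"))
            .directMode()                                 // TCP direct; use gatewayMode() behind strict firewalls
            .preferredRegions(List.of("West US 2"))       // route to the closest replica
            .consistencyLevel(ConsistencyLevel.SESSION)
            .buildClient();

    private CosmosClientFactory() { }

    public static CosmosClient getClient() {
        return CLIENT;
    }
}
```

Reusing one client across the application is one of the most common SDK performance recommendations.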

Implement data access by using the SQL language for Azure Cosmos DB for NoSQL

  • Implement queries that use arrays, nested objects, aggregation, and ordering
  • Implement a correlated subquery
  • Implement queries that use array and type-checking functions
  • Implement queries that use mathematical, string, and date functions
  • Implement queries based on variable data

Summary: With SQL as the query language of choice in Cosmos DB, this section focuses on shaping complex queries. It teaches how to work with nested objects, arrays, and aggregations for insightful data operations. Advanced constructs like correlated subqueries are also covered.

Beyond structural querying, you will explore functions for math, strings, and dates. Combined with variable data handling, you gain the skillset to implement expressive, efficient queries that support diverse scenarios.
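These query constructs can be combined in a single statement. The sketch below (Java SDK v4, hypothetical container and property names) runs a query that joins into a nested array, uses array and type-checking functions, and orders the results.

```java
import com.azure.cosmos.CosmosContainer;
import com.azure.cosmos.models.CosmosQueryRequestOptions;
import com.fasterxml.jackson.databind.JsonNode;

public class QueryExamples {
    public static void run(CosmosContainer container) {
        // JOIN flattens the nested "items" array; ARRAY_CONTAINS and IS_NUMBER
        // show the array and type-checking functions; ORDER BY handles ordering.
        String sql =
            "SELECT o.id, i.productName, o.orderDate " +
            "FROM orders o JOIN i IN o.items " +
            "WHERE ARRAY_CONTAINS(o.tags, 'priority') AND IS_NUMBER(i.quantity) " +
            "ORDER BY o.orderDate DESC";

        container.queryItems(sql, new CosmosQueryRequestOptions(), JsonNode.class)
                 .forEach(row -> System.out.println(row.toPrettyString()));
    }
}
```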

Implement data access by using Azure Cosmos DB for NoSQL SDKs

  • Choose when to use a point operation versus a query operation
  • Implement a point operation that creates, updates, and deletes documents
  • Implement an update by using a patch operation
  • Manage multi-document transactions using SDK Transactional Batch
  • Perform a multi-document load using Bulk Support in the SDK
  • Implement optimistic concurrency control using ETags
  • Override default consistency by using query request options
  • Implement session consistency by using session tokens
  • Implement a query operation that includes pagination
  • Implement a query operation by using a continuation token
  • Handle transient errors and 429s
  • Specify TTL for a document
  • Retrieve and use query metrics

Summary: This section highlights programmatic access with SDKs. You will differentiate between operations, learn to perform both point and bulk actions, and understand advanced capabilities like patching and multi-document transactions.

You will also gain techniques to ensure reliability and performance. This includes concurrency management, consistency control, error handling, pagination, and retrieving query insights via metrics. These skills are critical for fluid, production-grade solutions.
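Two of these reliability techniques, optimistic concurrency with ETags and handling 429 throttling, are sketched below using the Java SDK v4; the document shape and property names are hypothetical.

```java
import com.azure.cosmos.CosmosContainer;
import com.azure.cosmos.CosmosException;
import com.azure.cosmos.models.CosmosItemRequestOptions;
import com.azure.cosmos.models.CosmosItemResponse;
import com.azure.cosmos.models.PartitionKey;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class OptimisticConcurrencyExample {
    public static void updatePrice(CosmosContainer container, String id, String pk) {
        CosmosItemResponse<ObjectNode> read =
                container.readItem(id, new PartitionKey(pk), ObjectNode.class);

        ObjectNode doc = read.getItem();
        doc.put("price", 42.0);

        // Replace only if the document is unchanged since we read it (ETag match).
        CosmosItemRequestOptions options =
                new CosmosItemRequestOptions().setIfMatchETag(read.getETag());
        try {
            container.replaceItem(doc, id, new PartitionKey(pk), options);
        } catch (CosmosException e) {
            if (e.getStatusCode() == 412) {
                // Precondition failed: another writer updated the document; re-read and retry.
            } else if (e.getStatusCode() == 429) {
                // Throttled: honor the suggested back-off before retrying.
                System.out.println("Retry after " + e.getRetryAfterDuration());
            } else {
                throw e;
            }
        }
    }
}
```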

Implement server-side programming in Azure Cosmos DB for NoSQL by using JavaScript

  • Write, deploy, and call a stored procedure
  • Design stored procedures to work with multiple documents transactionally
  • Implement and call triggers
  • Implement a user-defined function

Summary: This section introduces server-side programming within Cosmos DB. You will create stored procedures that process multiple documents transactionally within a single logical partition.

Additionally, you will enhance solutions with triggers and custom functions. Together, these server-side constructs extend Cosmos DB’s flexibility, allowing developers to embed logic directly inside the database for streamlined application design.
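A minimal sketch of deploying and calling a stored procedure through the Java SDK v4 follows; the procedure name, body, and partition key value are illustrative. The server-side logic itself is still JavaScript, passed to the service as a string.

```java
import java.util.List;

import com.azure.cosmos.CosmosContainer;
import com.azure.cosmos.models.CosmosStoredProcedureProperties;
import com.azure.cosmos.models.CosmosStoredProcedureRequestOptions;
import com.azure.cosmos.models.CosmosStoredProcedureResponse;
import com.azure.cosmos.models.PartitionKey;

public class StoredProcedureExample {
    // Server-side logic is JavaScript; the SDK only deploys and calls it.
    private static final String BODY =
        "function hello(name) {" +
        "  var response = getContext().getResponse();" +
        "  response.setBody('Hello, ' + name);" +
        "}";

    public static void deployAndRun(CosmosContainer container) {
        container.getScripts().createStoredProcedure(
                new CosmosStoredProcedureProperties("hello", BODY));

        // Stored procedures execute transactionally within a single logical
        // partition, so a partition key is required when calling them.
        CosmosStoredProcedureRequestOptions options =
                new CosmosStoredProcedureRequestOptions()
                        .setPartitionKey(new PartitionKey("tenant-1"));
        CosmosStoredProcedureResponse response = container.getScripts()
                .getStoredProcedure("hello")
                .execute(List.of("Cosmos"), options);
        System.out.println(response.getResponseAsString());
    }
}
```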

Domain 2: Design and implement data distribution (8% of the exam)

Design and implement a replication strategy for Azure Cosmos DB

  • Choose when to distribute data
  • Define automatic failover policies for regional failure for Azure Cosmos DB for NoSQL
  • Perform manual failovers to move single-master write regions
  • Choose a consistency model
  • Identify use cases for different consistency models
  • Evaluate the impact of consistency model choices on availability and associated request unit (RU) cost
  • Evaluate the impact of consistency model choices on performance and latency
  • Specify application connections to replicated data

Summary: This section explores replication design with Cosmos DB. You will detail when and how to distribute data globally and implement strategies that maintain high availability during outages with failover mechanisms.

A major dimension of this section is consistency. You will analyze models like strong or eventual consistency and weigh their impact on cost, performance, and latency. Applications are guided to connect appropriately for resilient, well-balanced outcomes.

Design and implement multi-region write

  • Choose when to use multi-region write
  • Implement multi-region write
  • Implement a custom conflict resolution policy for Azure Cosmos DB for NoSQL

Summary: Multi-region writes enable resilient and scalable apps. You will discover when multi-region write configurations are the right choice and how to configure them in practice.

Conflict resolution is highlighted. Custom policies can help ensure consistency when multiple regions serve as write points, making your solutions global-ready and fault-tolerant.
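Enabling multi-region writes from the client side is a one-line opt-in, as in this Java SDK v4 sketch (region names are placeholders, and the account itself must also be configured with multiple write regions).

```java
import java.util.List;

import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;

public class MultiRegionWriteExample {
    public static CosmosClient buildClient(String endpoint, String key) {
        return new CosmosClientBuilder()
                .endpoint(endpoint)
                .key(key)
                // Opt in to multi-region writes; the SDK then writes to the
                // nearest available listed region instead of a single write region.
                .multipleWriteRegionsEnabled(true)
                .preferredRegions(List.of("West Europe", "East US"))
                .buildClient();
    }
}
```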

Domain 3: Integrate an Azure Cosmos DB solution (8% of the exam)

Enable Azure Cosmos DB analytical workloads

  • Enable Azure Synapse Link
  • Choose between Azure Synapse Link and Spark Connector
  • Enable the analytical store on a container
  • Implement custom partitioning in Azure Synapse Link
  • Enable a connection to an analytical store and query from Azure Synapse Spark or Azure Synapse SQL
  • Perform a query against the transactional store from Spark
  • Write data back to the transactional store from Spark
  • Implement Change Data Capture in the Azure Cosmos DB analytical store
  • Implement time travel in Azure Synapse Link for Azure Cosmos DB

Summary: This section focuses on enabling real-time and analytical insights. You will use Azure Synapse Link or Spark connectors to create a seamless bridge between Cosmos DB and analytics platforms.

Concepts such as custom partitioning, Change Data Capture, and time travel build a toolkit to embed analytics pipelines seamlessly into transactional applications. This makes Cosmos DB solutions both operational and analytical.
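Enabling the analytical store on a new container can be done from the SDK as well as the portal. The sketch below assumes the Java SDK v4 and a hypothetical container; Synapse Link must already be enabled on the account.

```java
import com.azure.cosmos.CosmosDatabase;
import com.azure.cosmos.models.CosmosContainerProperties;
import com.azure.cosmos.models.ThroughputProperties;

public class AnalyticalStoreExample {
    public static void createContainerWithAnalyticalStore(CosmosDatabase db) {
        CosmosContainerProperties props =
                new CosmosContainerProperties("sales", "/storeId");

        // -1 keeps analytical data indefinitely; a positive value expires it
        // after that many seconds.
        props.setAnalyticalStoreTimeToLiveInSeconds(-1);

        db.createContainerIfNotExists(props,
                ThroughputProperties.createManualThroughput(400));
    }
}
```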

Implement solutions across services

  • Integrate events with other applications by using Azure Functions and Azure Event Hubs
  • Denormalize data by using Change Feed and Azure Functions
  • Enforce referential integrity by using Change Feed and Azure Functions
  • Aggregate data by using Change Feed and Azure Functions, including reporting
  • Archive data by using Change Feed and Azure Functions
  • Implement Azure AI Search for an Azure Cosmos DB solution

Summary: This section reveals the synergy between Cosmos DB and external Azure services. You will integrate solutions with Functions, Event Hubs, and AI Search to extend data capabilities.

Change Feed plays a central role in linking your database events to downstream tasks like data aggregation, archiving, and real-time processing. This cross-service collaboration elevates Cosmos DB into a powerful ecosystem player.

Domain 4: Optimize an Azure Cosmos DB solution (18% of the exam)

Optimize query performance when using the API for Azure Cosmos DB for NoSQL

  • Adjust indexes on the database
  • Calculate the cost of the query
  • Retrieve request unit cost of a point operation or query
  • Implement Azure Cosmos DB integrated cache

Summary: This section teaches optimization practices for queries. You will adjust indexes to support workload access patterns and measure RU costs effectively.

You will also learn to employ integrated cache and optimize queries for balance between cost efficiency and speed. This ensures queries remain fast while resource usage remains predictable.
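Retrieving RU charges is straightforward with the SDK. The sketch below (Java SDK v4, hypothetical ids and query) reads the request charge of a point operation and of each query result page.

```java
import com.azure.cosmos.CosmosContainer;
import com.azure.cosmos.models.CosmosItemResponse;
import com.azure.cosmos.models.CosmosQueryRequestOptions;
import com.azure.cosmos.models.PartitionKey;
import com.fasterxml.jackson.databind.JsonNode;

public class RequestChargeExample {
    public static void measure(CosmosContainer container) {
        // RU cost of a point read.
        CosmosItemResponse<JsonNode> read =
                container.readItem("order-1", new PartitionKey("customer-1"), JsonNode.class);
        System.out.println("Point read RU: " + read.getRequestCharge());

        // RU cost of a query, reported per page of results.
        container.queryItems("SELECT * FROM c WHERE c.status = 'open'",
                        new CosmosQueryRequestOptions(), JsonNode.class)
                .iterableByPage(100)
                .forEach(page -> System.out.println(
                        "Query page RU: " + page.getRequestCharge()));
    }
}
```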

Design and implement change feeds for Azure Cosmos DB for NoSQL

  • Develop an Azure Functions trigger to process a change feed
  • Consume a change feed from within an application by using the SDK
  • Manage the number of change feed instances by using the change feed estimator
  • Implement denormalization by using a change feed
  • Implement referential enforcement by using a change feed
  • Implement aggregation persistence by using a change feed
  • Implement data archiving by using a change feed

Summary: This section covers how to harness change feeds in applications. You will use triggers, SDKs, and estimators to process and scale change feed usage effectively.

Beyond consumption, you will implement patterns like aggregation persistence, data archiving, denormalization, and referential enforcement using change feeds. These use cases highlight its versatility.
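A minimal change feed processor sketch with the Java SDK v4 follows; the host name and container references are placeholders, and the handler simply logs each changed document.

```java
import java.util.List;

import com.azure.cosmos.ChangeFeedProcessor;
import com.azure.cosmos.ChangeFeedProcessorBuilder;
import com.azure.cosmos.CosmosAsyncContainer;
import com.fasterxml.jackson.databind.JsonNode;

public class ChangeFeedExample {
    public static ChangeFeedProcessor start(CosmosAsyncContainer monitored,
                                            CosmosAsyncContainer leases) {
        ChangeFeedProcessor processor = new ChangeFeedProcessorBuilder()
                .hostName("worker-1")            // unique per compute instance
                .feedContainer(monitored)        // container whose changes we consume
                .leaseContainer(leases)          // coordinates work across instances
                .handleChanges((List<JsonNode> docs) ->
                        docs.forEach(d -> System.out.println("Changed: " + d.get("id"))))
                .buildChangeFeedProcessor();

        processor.start().subscribe();           // non-blocking; runs until stop() is called
        return processor;
    }
}
```

Downstream patterns such as denormalization or archiving simply go inside the handler, which is also where an Azure Functions trigger would slot in.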

Define and implement an indexing strategy for Azure Cosmos DB for NoSQL

  • Choose when to use a read-heavy versus write-heavy index strategy
  • Choose an appropriate index type
  • Configure a custom indexing policy by using the Azure portal
  • Implement a composite index
  • Optimize index performance

Summary: Successful indexing is all about balance. This section guides you to create tailored policies for either write-heavy or read-heavy workloads.

Composite indexes, custom policies, and optimization techniques allow you to fine-tune query performance further. Each strategy ensures queries can be executed at scale without unnecessary resource consumption.
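The sketch below (Java SDK v4, hypothetical paths) shows a custom indexing policy with an excluded path to cheapen writes and a composite index to support a two-property ORDER BY.

```java
import java.util.List;

import com.azure.cosmos.CosmosDatabase;
import com.azure.cosmos.models.CompositePath;
import com.azure.cosmos.models.CompositePathSortOrder;
import com.azure.cosmos.models.CosmosContainerProperties;
import com.azure.cosmos.models.ExcludedPath;
import com.azure.cosmos.models.IncludedPath;
import com.azure.cosmos.models.IndexingPolicy;
import com.azure.cosmos.models.ThroughputProperties;

public class IndexingPolicyExample {
    public static void create(CosmosDatabase db) {
        IndexingPolicy policy = new IndexingPolicy();
        policy.setIncludedPaths(List.of(new IncludedPath("/*")));
        // Excluding write-heavy, never-queried paths lowers per-write RU cost.
        policy.setExcludedPaths(List.of(new ExcludedPath("/payload/*")));

        // Composite index to support: ORDER BY c.category ASC, c.price DESC
        CompositePath category = new CompositePath().setPath("/category")
                .setOrder(CompositePathSortOrder.ASCENDING);
        CompositePath price = new CompositePath().setPath("/price")
                .setOrder(CompositePathSortOrder.DESCENDING);
        policy.setCompositeIndexes(List.of(List.of(category, price)));

        CosmosContainerProperties props =
                new CosmosContainerProperties("products", "/category");
        props.setIndexingPolicy(policy);
        db.createContainerIfNotExists(props,
                ThroughputProperties.createManualThroughput(400));
    }
}
```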

Domain 5: Maintain an Azure Cosmos DB solution (28% of the exam)

Monitor and troubleshoot an Azure Cosmos DB solution

  • Evaluate response status code and failure metrics
  • Monitor metrics for normalized throughput usage by using Azure Monitor
  • Monitor server-side latency metrics by using Azure Monitor
  • Monitor data replication in relation to latency and availability
  • Configure Azure Monitor alerts for Azure Cosmos DB
  • Implement and query Azure Cosmos DB logs
  • Monitor throughput across partitions
  • Monitor distribution of data across partitions
  • Monitor security by using logging and auditing

Summary: This section ensures you are fully equipped to observe and maintain Cosmos DB. You will monitor metrics for throughput, latency, replication, and logs, enabling proactive detection of issues.

You will also implement alerts and evaluate response codes for robust troubleshooting. Combined, these practices ensure resilience, availability, and transparency in production Cosmos DB environments.
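On the client side, the status codes and diagnostics that feed those metrics are exposed through the SDK. A small Java SDK v4 sketch, with hypothetical ids, is shown below.

```java
import com.azure.cosmos.CosmosContainer;
import com.azure.cosmos.CosmosException;
import com.azure.cosmos.models.PartitionKey;
import com.fasterxml.jackson.databind.JsonNode;

public class StatusCodeExample {
    public static JsonNode safeRead(CosmosContainer container, String id, String pk) {
        try {
            return container.readItem(id, new PartitionKey(pk), JsonNode.class).getItem();
        } catch (CosmosException e) {
            // Status codes map to the failure metrics you monitor in Azure Monitor:
            // 404 not found, 429 throttled, 503 service unavailable, and so on.
            System.err.println("Status: " + e.getStatusCode()
                    + ", sub-status: " + e.getSubStatusCode()
                    + ", retry after: " + e.getRetryAfterDuration());
            // Diagnostics include latency and contacted regions, useful alongside portal metrics.
            System.err.println(e.getDiagnostics());
            return null;
        }
    }
}
```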

Implement backup and restore for an Azure Cosmos DB solution

  • Choose between periodic and continuous backup
  • Configure periodic backup
  • Configure continuous backup and recovery
  • Locate a recovery point for a point-in-time recovery
  • Recover a database or container from a recovery point

Summary: This section focuses on data protection. You will configure and use both periodic and continuous backups to safeguard databases.

Practical steps include locating recovery points and restoring specific containers or databases. The emphasis is on ensuring applications remain resilient and data remains safe.

Implement security for an Azure Cosmos DB solution

  • Choose between service-managed and customer-managed encryption keys
  • Configure network-level access control for Azure Cosmos DB
  • Configure data encryption for Azure Cosmos DB
  • Manage control plane access to Azure Cosmos DB by using Azure role-based access control (RBAC)
  • Manage control plane access to Azure Cosmos DB Data Explorer by using Azure role-based access control (RBAC)
  • Manage data plane access to Azure Cosmos DB by using Microsoft Entra ID
  • Configure cross-origin resource sharing (CORS) settings
  • Manage account keys by using Azure Key Vault
  • Implement customer-managed keys for encryption
  • Implement Always Encrypted

Summary: Security is a fundamental focus here. You will explore RBAC configurations, encryption methods, and identity integrations with Microsoft Entra ID to protect both control and data plane access.

You will also configure encryption strategies such as customer-managed keys and Always Encrypted to ensure compliance and trust. Combined with CORS and key vault integrations, this delivers comprehensive, layered protection.
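Key-less, Entra ID-based data plane access looks like the following sketch, assuming the Java SDK v4 plus the azure-identity library; the endpoint is a placeholder, and the identity still needs a Cosmos DB data plane RBAC role assignment on the account.

```java
import com.azure.cosmos.CosmosClient;
import com.azure.cosmos.CosmosClientBuilder;
import com.azure.identity.DefaultAzureCredentialBuilder;

public class EntraIdClientExample {
    public static CosmosClient build(String endpoint) {
        // No account keys in the application: the client authenticates with
        // Microsoft Entra ID, and data plane RBAC role assignments on the
        // Cosmos DB account decide what the identity can read or write.
        return new CosmosClientBuilder()
                .endpoint(endpoint)
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();
    }
}
```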

Implement data movement for an Azure Cosmos DB solution

  • Choose a data movement strategy
  • Move data by using client SDK bulk operations
  • Move data by using Azure Data Factory and Azure Synapse pipelines
  • Move data by using a Kafka connector
  • Move data by using Azure Stream Analytics
  • Move data by using the Azure Cosmos DB Spark Connector
  • Configure Azure Cosmos DB as a custom endpoint for an Azure IoT Hub

Summary: This section is all about enabling data flow. You will plan strategies and apply data movement techniques ranging from bulk SDK operations to integrations with Azure Data Factory, Synapse, and Stream Analytics.

You will also connect Cosmos DB with platforms like Kafka and IoT Hub. This ensures seamless integration for ecosystems that thrive on dynamic, real-time or batch data flows.
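Bulk loading through the SDK is sketched below with the Java SDK v4; the partition key property and input documents are hypothetical. The reactive Flux pipeline lets the SDK batch operations to fill the available throughput.

```java
import java.util.List;

import com.azure.cosmos.CosmosAsyncContainer;
import com.azure.cosmos.models.CosmosBulkOperations;
import com.azure.cosmos.models.CosmosItemOperation;
import com.azure.cosmos.models.PartitionKey;
import com.fasterxml.jackson.databind.JsonNode;
import reactor.core.publisher.Flux;

public class BulkLoadExample {
    public static void load(CosmosAsyncContainer container, List<JsonNode> documents) {
        // Each document becomes a bulk create operation keyed by its partition key value.
        Flux<CosmosItemOperation> operations = Flux.fromIterable(documents)
                .map(doc -> CosmosBulkOperations.getCreateItemOperation(
                        doc, new PartitionKey(doc.get("customerId").asText())));

        // The SDK batches and dispatches the operations against available RU/s.
        container.executeBulkOperations(operations)
                 .doOnNext(result -> System.out.println(
                         "Status: " + result.getResponse().getStatusCode()))
                 .blockLast();   // wait for completion in this simple sketch
    }
}
```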

Implement a DevOps process for an Azure Cosmos DB solution

  • Choose when to use declarative versus imperative operations
  • Provision and manage Azure Cosmos DB resources by using Azure Resource Manager templates
  • Migrate between standard and autoscale throughput by using PowerShell or Azure CLI
  • Initiate a regional failover by using PowerShell or Azure CLI
  • Maintain indexing policies in production by using Azure Resource Manager templates

Summary: DevOps practices empower maintainable Cosmos DB solutions. You will explore declarative and imperative choices, integrating ARM templates into deployment workflows for stable, repeatable resource management.

You will also automate throughput migrations, regional failovers, and maintenance of indexing policies using CLI or PowerShell. These practices streamline long-term operational stability.

Who is the Microsoft Azure Cosmos DB Developer Specialty Certification best suited for?

This certification is perfect for developers and technology professionals who want to validate their expertise in building cloud-native applications that leverage Azure Cosmos DB. It is especially well-suited for those who design, implement, and monitor applications that need to handle complex and large-scale data scenarios.

You may be a great candidate if you are:

  • A cloud developer creating scalable web and mobile apps
  • A database engineer who wants to master NoSQL solutions in Azure
  • A software developer integrating backend systems with Cosmos DB
  • A data solutions consultant guiding clients towards modern database architectures

Even if your role is broader, such as an architect, dev lead, or IT professional, this specialty demonstrates your ability to harness Cosmos DB to meet security, performance, and resilience demands in real-world cloud applications.


What types of roles can I pursue after earning this Azure Cosmos DB Developer Specialty certification?

With this certification under your belt, you open the door to multiple opportunities in the cloud development and data engineering space. Common roles include:

  • Azure Developer or Cloud Developer
  • Cosmos DB Developer or NoSQL Engineer
  • Application Architect focusing on distributed data stores
  • Database Consultant who helps organizations modernize systems
  • Big Data Engineer working with analytics and real-time systems

Beyond the technical roles, this certification can also boost positions in solution architecture and pre-sales engineering, where expertise in Cosmos DB is a strong differentiator.


What is the current version and exam code for this certification?

The latest certification exam is known as Exam DP-420: Designing and Implementing Cloud-Native Applications Using Microsoft Azure Cosmos DB. This is the single exam you need to pass in order to earn the Microsoft Certified: Azure Cosmos DB Developer Specialty credential.

The DP-420 exam ensures you are tested on practical capabilities with Cosmos DB, including design considerations, query optimization, integration, and monitoring—reflecting skills required in modern cloud-based development.


How much does the DP-420 certification exam cost?

The cost of the Microsoft Azure Cosmos DB Developer Specialty (DP-420) certification exam is $165 USD. Pricing may vary depending on your country or region due to exchange rates and local taxes.

It’s worth remembering that Microsoft occasionally offers discounted exam vouchers, student discounts, or bundle offers with practice tests and retake options, so be sure to check your Microsoft Learn certification dashboard for promotions.


How many questions should I expect on the Microsoft DP-420 exam?

The DP-420 exam typically includes around 60 questions. These questions come in various formats such as multiple choice, multi-select, and case studies.

Some unscored experimental questions are included to help Microsoft develop future exams, but these are indistinguishable from scored questions. You can rest easy knowing that your performance is judged fairly, with the majority of the questions measuring your skills across the five exam domains.


How long do I have to complete the Azure Cosmos DB Developer exam?

You will be given 100 minutes to complete the exam. This is more than enough time if you pace yourself and become comfortable with the scenario-based style in advance.

Many questions dive into real-world development scenarios, so make sure you set aside time for heavier case study questions. Microsoft also provides an Exam Sandbox where you can practice the interface and question types beforehand.


What score do I need to pass the Microsoft DP-420 exam?

To pass, you’ll need a scaled score of 700 out of 1000. Microsoft uses a compensatory scoring model, meaning you do not need to pass each section individually—your overall score determines success.

This scoring approach gives you flexibility: even if one section is more difficult for you, you can still pass by performing well in other areas. Always aim to strengthen your knowledge evenly across all domains, but know that your success depends on the aggregate.


What exam languages are supported for the Microsoft Azure Cosmos DB Developer Specialty?

The DP-420 exam is available in an excellent range of languages, making it accessible to professionals around the world. You can take the exam in:

  • English
  • Japanese
  • Chinese (Simplified & Traditional)
  • Korean
  • German
  • French
  • Spanish
  • Portuguese (Brazil)
  • Italian

If the exam isn’t offered in your preferred language, you can request additional exam time as an accommodation.


What are the main domains and weightings of the DP-420 exam?

The DP-420 exam is structured around five key domains, each with a specific weight toward your overall score:

  1. Design and implement data models (35–40%)
  2. Design and implement data distribution (5–10%)
  3. Integrate an Azure Cosmos DB solution (5–10%)
  4. Optimize an Azure Cosmos DB solution (15–20%)
  5. Maintain an Azure Cosmos DB solution (25–30%)

These percentages indicate areas to prioritize in your study plan. Most of the weight is on designing models and maintaining solutions, which directly impact real-world Cosmos DB application development.


Are there any prerequisites for taking the Azure Cosmos DB Developer exam?

There are no formal prerequisites for this certification. However, Microsoft strongly recommends that you have:

  • Experience developing apps on Azure
  • Solid understanding of Azure Cosmos DB NoSQL APIs
  • Familiarity with JSON, SQL queries, and SDK programming
  • Comfort working with either C# or Java, plus some PowerShell knowledge

Hands-on practice with Cosmos DB will be the most important step in preparation.


What kind of questions can I expect in the DP-420 exam?

The DP-420 exam includes multiple-choice, multi-select, and case study-based questions. You can expect to see detailed scenarios where you need to choose design approaches, partitioning strategies, or optimization steps.

Some questions test theoretical knowledge, such as consistency models, while others are more practical, like writing queries or selecting indexing strategies. Practicing real-world use cases will help you feel confident with both styles.


What knowledge areas should I focus on while preparing?

You should prioritize your studies on major themes like:

  • Data modeling: Denormalization, partitioning, schema designs
  • Query development: SQL for Cosmos DB, SDK usage, batch operations
  • Optimization: Query cost analysis, RU (Request Unit) efficiency, caching, and indexing strategies
  • Integration: Change feed with Azure Functions, Synapse Link integration
  • Maintenance and security: Backup, recovery, data encryption, and monitoring with Azure Monitor

By focusing on these, you will cover both the conceptual knowledge and the applied skills Microsoft is assessing.


Is the Microsoft Azure Cosmos DB Developer exam considered difficult?

The DP-420 exam is highly practical and aligned with real-world development tasks. It does require a solid foundation in Cosmos DB, but with structured study and hands-on practice, most learners find it manageable and rewarding.

Think of it as a way to validate skills you are likely already applying in your projects. Completing guided labs, tutorials, and excellent certification practice exams for Microsoft Azure Cosmos DB Developer Specialty can significantly improve both your confidence and performance.


How often will I need to renew the Cosmos DB Developer Specialty certification?

The certification must be renewed every 12 months. Microsoft provides a free online renewal assessment through Microsoft Learn, which you can complete from anywhere.

Renewal ensures your credential stays up to date with evolving Azure services, safeguarding the value of your certification throughout your career.


How can I best prepare for Exam DP-420?

To maximize your preparation success, combine several methods:

  • Microsoft Learn Learning Paths: Free, self-paced content tailored to exam topics.
  • Instructor-led training: Great if you prefer structured classroom-style learning.
  • Hands-on Labs: Use the Cosmos DB Emulator or Azure portal to gain experience.
  • Documentation & Whitepapers: Learn best practices directly from Microsoft’s published materials.
  • Community discussions & Q&A: Connect with peers working toward the exam.

The more you blend theory with practical application, the more natural the exam will feel.


What mistakes should I avoid during exam prep?

Some common missteps include:

  • Focusing entirely on theory instead of practicing with the Cosmos DB SDK or portal
  • Overlooking key areas like monitoring, security, or backup solutions
  • Ignoring details about partition keys and indexing, which are tested often

Keep your study plan well-rounded, and leverage both labs and practice tests to sharpen your readiness.


How long does the Microsoft Certified: Azure Cosmos DB Developer Specialty credential stay valid?

Your certification remains valid for one year. Microsoft requires yearly renewals, ensuring professionals are knowledgeable about the most recent enhancements to Cosmos DB.

This keeps your resume sharp and your skills aligned with industry demands.


Can I take the DP-420 exam remotely?

Yes, you can take the exam online via remote proctoring or in person at a Pearson VUE testing center.

For remote proctoring, you’ll need a webcam, stable internet, and a private environment. In-person options are great if you prefer a more structured test center atmosphere.


Where can I register for the Microsoft Azure Cosmos DB Developer Specialty exam?

You can register directly through the official Microsoft Certification page for Azure Cosmos DB Developer Specialty.

After registration, you’ll choose your preferred testing option, pay the fee, and schedule your exam time. Make sure your Microsoft profile is tied to your learning account so your achievements are tracked accurately.


How does this certification enhance my career path?

This certification showcases your expertise in NoSQL design, distributed data handling, real-time integrations, and security for cloud-native apps. It sets you apart as a developer capable of solving modern, large-scale data challenges in Azure.

It also builds your foundation to pursue more advanced Azure certifications, such as Azure Developer Associate or Azure Solutions Architect, extending your career into senior technical or architect-level positions.


Will the Microsoft Azure Cosmos DB Developer Specialty help me work with other Azure services?

Yes. The certification not only deepens your capabilities with Cosmos DB but also strengthens your understanding of its integration with other Azure services. Examples include:

  • Azure Functions for serverless compute
  • Event Hubs for scalable messaging
  • Synapse Analytics for big data reporting
  • Azure AI Search for intelligent query capabilities

Together, these integrations allow you to design complete, resilient, and intelligent cloud-native solutions.


The Microsoft Certified: Azure Cosmos DB Developer Specialty is a fantastic step forward for developers who want to validate their skills in distributed systems, scalable data applications, and cloud-native design. By investing your time in this certification, you gain a powerful credential that is recognized industry-wide and reinforces your ability to create future-ready data solutions on Azure.
