Confluent brings more secure, reliable data streaming in the cloud

News Desk


Confluent, Inc., the data streaming platform that puts data in motion, announced several new capabilities as part of its Q2 ’22 launch that provide granular security, enterprise-wide observability, and mission-critical reliability at scale. These capabilities are becoming increasingly important as more businesses move their data streaming workloads to the cloud to avoid the operational overhead of infrastructure management.

“Every company is in a race to transform their business and take advantage of the simplicity of cloud computing,” said Ganesh Srinivasan, Chief Product Officer, Confluent. “However, migrating to the cloud often comes with tradeoffs on security, monitoring insights, and uptime guarantees. With this launch, we make it possible to achieve those fundamental requirements without added complexity, so organizations can start innovating in the cloud faster.” 

New role-based access controls (RBAC) enable granular permissions on the data plane level to ensure data compliance and privacy at scale

Data security is critical in any organization, but it is especially important when migrating to public clouds. Organizations must ensure that only the right people have access to the right data in order to operate efficiently and securely. However, controlling access to sensitive data all the way down to individual Apache Kafka topics has traditionally taken significant time and resources because of the complex scripts required to set permissions manually.

Confluent introduced RBAC for Confluent Cloud last year, allowing customers to streamline this process for critical resources such as production environments, sensitive clusters, and billing details, and making role-based permissions as simple as clicking a button. With today’s release, RBAC extends to individual Kafka resources such as topics, consumer groups, and transactional IDs. Organizations can now define clear roles and responsibilities for administrators, operators, and developers, giving each access to only the data required for their jobs across both the data and control planes.
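To make the idea concrete, here is a minimal sketch of resource-level role bindings in plain Python. The principal names, role names, and permitted operations are illustrative assumptions for this example, not Confluent Cloud's actual API or role definitions.

```python
# Illustrative sketch only: models granular, resource-level role bindings
# (in the spirit of Confluent Cloud RBAC). Names and roles are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class RoleBinding:
    principal: str      # e.g. a service account
    role: str           # e.g. "DeveloperRead", "DeveloperWrite"
    resource_type: str  # "Topic", "ConsumerGroup", "TransactionalId"
    resource_name: str

# Simplified mapping of roles to the operations they permit (assumed).
ROLE_OPERATIONS = {
    "DeveloperRead": {"read"},
    "DeveloperWrite": {"read", "write"},
    "ResourceOwner": {"read", "write", "alter", "delete"},
}

def is_allowed(bindings, principal, operation, resource_type, resource_name):
    """Return True if any binding grants `operation` on the named resource."""
    for b in bindings:
        if (b.principal == principal
                and b.resource_type == resource_type
                and b.resource_name == resource_name
                and operation in ROLE_OPERATIONS.get(b.role, set())):
            return True
    return False

bindings = [
    RoleBinding("sa-orders-app", "DeveloperWrite", "Topic", "orders"),
    RoleBinding("sa-analytics", "DeveloperRead", "Topic", "orders"),
]

print(is_allowed(bindings, "sa-analytics", "read", "Topic", "orders"))   # True
print(is_allowed(bindings, "sa-analytics", "write", "Topic", "orders"))  # False
```

The point of the sketch is the scoping: a binding names a specific topic or consumer group, so an analytics account can read `orders` without gaining write access anywhere.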

Expanded Confluent Cloud Metrics API delivers enterprise-wide observability to optimize data streaming performance across the entire business

Businesses must have a solid understanding of their IT stack in order to deliver the high-quality services their customers expect while efficiently managing operating costs. The Confluent Cloud Metrics API already gives customers the simplest, fastest way to understand their usage and performance across the platform. Today, Confluent is introducing two new insights that provide even more visibility into data streaming deployments, along with an expansion of its third-party monitoring integrations to ensure these critical metrics are available wherever they are needed:

  • Customers can now easily understand organizational usage of data streams across their business and sub-divisions to see where and how resources are used. This capability is particularly important to enterprises that are expanding their use of data streaming and need to manage internal chargebacks by business unit. Additionally, it helps teams identify where resources are over- or underutilized, down to the level of an individual user, in order to optimize resource allocation and improve cost savings.
  • New capabilities for consumer lag monitoring help organizations ensure their mission-critical services are always meeting customer expectations. With real-time insights, customers are able to identify hotspots in their data pipelines and can easily identify where resources need to be scaled to avoid an incident before it occurs. Additionally, with records exposed as a time series, teams are equipped to make informed decisions based upon deep historical context when setting or adjusting SLOs.
  • A new, first-class integration with Grafana Cloud gives customers deep visibility into Confluent Cloud from within the monitoring tool they already use. Along with recently announced integrations, this update allows businesses to monitor their data streams directly alongside the rest of their technology stack through their service of choice. 
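As a rough sketch of how the consumer-lag records above might be pulled programmatically, the snippet below builds a query payload for the Confluent Cloud Metrics API. The endpoint, metric, and label names follow the public v2 API as I understand it, but should be verified against the current API reference; the cluster ID is a placeholder.

```python
# Sketch: building a consumer-lag query for the Confluent Cloud Metrics API
# (v2). Endpoint, metric, and label names are assumptions based on the public
# API and may change; the cluster ID below is a placeholder, not a real one.
import json

METRICS_ENDPOINT = "https://api.telemetry.confluent.cloud/v2/metrics/cloud/query"

def build_lag_query(cluster_id, start, end):
    """Build a query for consumer lag per group and topic over an interval."""
    return {
        "aggregations": [
            {"metric": "io.confluent.kafka.server/consumer_lag_offsets"}
        ],
        "filter": {"field": "resource.kafka.id", "op": "EQ", "value": cluster_id},
        "granularity": "PT1M",
        "group_by": ["metric.consumer_group_id", "metric.topic"],
        "intervals": [f"{start}/{end}"],
    }

query = build_lag_query("lkc-abc123", "2022-05-01T00:00:00Z",
                        "2022-05-01T01:00:00Z")
print(json.dumps(query, indent=2))

# To execute for real, POST the payload with a Cloud API key, e.g.:
#   requests.post(METRICS_ENDPOINT, json=query, auth=(api_key, api_secret))
```

Because the results come back as a time series grouped by consumer group and topic, the same query shape supports both real-time hotspot detection and the historical context mentioned above for setting SLOs.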

To enable easy and cost-effective integration of more data from high-value systems, Confluent’s Premium Source Connector for Oracle® Change Data Capture (CDC) is now available for Confluent Cloud. The fully managed connector enables users to capture valuable change events from an Oracle database and see them in real time within Confluent’s leading cloud-native Kafka service without any operational overhead. 
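To illustrate what consuming change events looks like downstream, here is a small sketch that applies CDC-style events to an in-memory table. The record shape (operation type plus before/after row images) is a common CDC convention used here for illustration, not the exact schema emitted by the Oracle CDC connector.

```python
# Sketch: applying change-data-capture events to a local view of a table.
# The event shape (op + before/after images) is an illustrative convention,
# not the Oracle CDC connector's actual output schema.

def apply_change(table, event):
    """Apply one CDC event (insert/update/delete, rows keyed by 'id')."""
    op = event["op"]
    if op in ("insert", "update"):
        row = event["after"]
        table[row["id"]] = row
    elif op == "delete":
        table.pop(event["before"]["id"], None)
    return table

events = [
    {"op": "insert", "after": {"id": 1, "status": "NEW"}},
    {"op": "update", "before": {"id": 1, "status": "NEW"},
     "after": {"id": 1, "status": "SHIPPED"}},
]

table = {}
for e in events:
    apply_change(table, e)
print(table)  # {1: {'id': 1, 'status': 'SHIPPED'}}
```

Replaying events in order like this is how a downstream system keeps a real-time materialized view of the source database.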

New 99.99% uptime SLA for Apache Kafka® provides comprehensive coverage for sensitive data streaming workloads in the cloud

One of the biggest concerns with relying on open source systems for business-critical workloads is reliability. Downtime is unacceptable for businesses operating in a digital-first world. It not only causes negative financial and business impact, but often leads to long-term damage to a brand’s reputation. Confluent now offers a 99.99% uptime SLA for both Standard and Dedicated fully managed, multi-zone clusters. Covering not only infrastructure but also Kafka performance, critical bug fixes, security updates, and more, this comprehensive SLA allows organizations to run even the most sensitive, mission-critical data streaming workloads in the cloud with high confidence.
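To put 99.99% in concrete terms, the downtime budget it implies can be computed directly:

```python
# What a 99.99% uptime SLA allows as a maximum downtime budget.

def downtime_budget_minutes(sla, period_hours):
    """Maximum allowed downtime, in minutes, for a given SLA and period."""
    return period_hours * 60 * (1 - sla)

per_year = downtime_budget_minutes(0.9999, 365 * 24)
per_month = downtime_budget_minutes(0.9999, 30 * 24)
print(f"{per_year:.1f} min/year")    # 52.6 min/year
print(f"{per_month:.2f} min/month")  # 4.32 min/month
```

In other words, four nines leaves under an hour of allowable downtime per year, versus roughly 8.8 hours at 99.9%.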

Aside from reliability, engineering teams and developers must grapple with the new programming paradigm of stream processing and the various use cases it enables. Confluent is introducing Stream Processing Use Case Recipes to help users get started with stream processing use cases. This collection of over 25 of the most popular real-world use cases, sourced from customers and validated by experts, can be launched in Confluent Cloud with the click of a button, allowing developers to quickly begin unlocking the value of stream processing.
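As a flavor of the paradigm these recipes teach, here is one classic stream-processing pattern, a tumbling-window event count, sketched in plain Python for illustration; the actual recipes run on Confluent Cloud rather than code like this.

```python
# Sketch: tumbling-window counts, a classic stream-processing pattern,
# shown in plain Python purely for illustration of the concept.
from collections import Counter

def tumbling_window_counts(events, window_seconds):
    """Count events per fixed, non-overlapping time window."""
    counts = Counter()
    for timestamp, _payload in events:
        window_start = timestamp - (timestamp % window_seconds)
        counts[window_start] += 1
    return dict(counts)

events = [(t, "click") for t in (3, 12, 14, 27, 33)]
print(tumbling_window_counts(events, 10))  # {0: 1, 10: 2, 20: 1, 30: 1}
```

Each event lands in exactly one fixed window, so the output is a continuously updating aggregate rather than a one-off batch result, which is the shift in thinking that stream processing requires.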

