Accelerate your app delivery with Kubernetes and Istio on GKE
GM & VP, Cloud Runtimes
Director of Product Management, Google Cloud
It’s no wonder so many organizations have moved all or part of their IT to the cloud; it offers a range of powerful benefits. However, making the jump is often easier said than done. Many organizations have a significant on-premises IT footprint, aren’t quite cloud-ready, and are constrained by regulations or by the lack of a consistent security and operating model across on-premises environments and the cloud.
We are dedicated to helping you modernize your existing on-premises IT and move to the cloud at a pace that works for you. To do that, we are leading the charge on a number of open-source technologies for containers and microservices-based architectures. Let’s take a look at some of these and how they can help your organization prepare for a successful journey to the cloud.
Toward an open cloud stack
At Google Cloud Next ‘18, we announced Cloud Services Platform, a fully managed solution based on Google open-source technologies. With Cloud Services Platform, you have the tools to transform your IT operations and build applications for today and the future, using containerized infrastructure and a microservices-based application architecture.
Cloud Services Platform combines Kubernetes for container orchestration with Istio, the service management platform, helping you implement infrastructure, security, and operations best practices. The goal is to bring you increased velocity and reliability, as well as to help manage governance at the scale you need. Today, we are taking another step towards this vision with Istio on GKE.
Think services first with Istio
We truly believe that Istio will play a key role in helping you make the most of your microservices. One way Istio does this is by providing improved visibility and security, making it easier to work with containerized workloads. With Istio on GKE, we are the first major cloud provider to offer direct integration of Istio with a managed Kubernetes service, along with simplified lifecycle management for your containers.
Istio is a service mesh that lets you manage and visualize your applications as services, rather than individual infrastructure components. It collects logs, traces, and telemetry, which you can use to set and enforce policies on your services. Istio also lets you add security by encrypting network traffic, all while layering transparently onto any existing distributed application—you don’t need to embed any client libraries in your code.
Istio securely authenticates and connects your services to one another. By transparently adding mTLS to your service communication, all information is encrypted in transit. Istio provides a service identity for each service, allowing you to create service-level policies that are enforced for each individual application transaction, while providing non-replayable identity protection.
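As a concrete sketch of what this looks like in practice, the Istio 1.0 authentication policy API lets you require mTLS mesh-wide with a `MeshPolicy`, paired with a `DestinationRule` so clients originate Istio mutual TLS. The resource names below are illustrative, and this assumes a cluster that already has Istio installed:

```shell
# Sketch (Istio 1.0-era APIs): require mTLS for all services in the mesh,
# and configure clients to use Istio mutual TLS when calling them.
# Assumes kubectl is pointed at a cluster with Istio installed.
cat <<EOF | kubectl apply -f -
apiVersion: authentication.istio.io/v1alpha1
kind: MeshPolicy
metadata:
  name: default
spec:
  peers:
  - mtls: {}
---
apiVersion: networking.istio.io/v1alpha3
kind: DestinationRule
metadata:
  name: default
  namespace: istio-system
spec:
  host: "*.local"
  trafficPolicy:
    tls:
      mode: ISTIO_MUTUAL
EOF
```

Because the sidecar proxies handle the TLS handshake and certificate rotation, no application code changes are needed to get encryption in transit.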
Out of the gate, you can also benefit from Istio’s visibility features thanks to its integration with Stackdriver, GCP’s native monitoring and logging suite. This integration sends service metrics, logs, and traces to Stackdriver, letting you monitor your golden signals (traffic, error rates, and latencies) for every service running in GKE.
Istio 1.0 was a key step toward helping you manage your services in a hybrid world, where multiple workloads run in different environments—clouds and on-premises, in containerized microservices or monolithic virtual machines. With Istio on GKE, you get granular visibility, security, and resilience for your containerized applications, with a dead-simple add-on that works out-of-the-box with all your existing applications.
Using Istio on GKE
The service-level view and security that Istio delivers are especially important for distributed applications deployed as containerized microservices, and Istio on GKE lets you deploy Istio to your Kubernetes clusters with the click of a button.
Istio on GKE works with both new and existing container deployments. It lets you incrementally roll out features, such as Istio security, bringing the benefits of Istio to your existing deployments. It also simplifies Istio lifecycle management by automatically upgrading your Istio deployments when newer versions become available.
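For example, enabling the add-on is a single flag at cluster creation time, or an update on an existing cluster. This is a sketch using the Beta-era `gcloud` syntax; the cluster name and zone are illustrative:

```shell
# Sketch: create a new GKE cluster with the Istio add-on enabled,
# with mTLS in permissive mode (cluster name and zone are illustrative).
gcloud beta container clusters create my-istio-cluster \
    --addons=Istio \
    --istio-config=auth=MTLS_PERMISSIVE \
    --zone=us-central1-a

# Or enable the add-on on an existing cluster:
gcloud beta container clusters update my-istio-cluster \
    --update-addons=Istio=ENABLED \
    --istio-config=auth=MTLS_PERMISSIVE \
    --zone=us-central1-a
```

Permissive mode accepts both plaintext and mTLS traffic, which is what makes the incremental rollout to existing deployments possible; you can tighten to strict mTLS service by service once everything is in the mesh.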
Today’s Beta availability of Istio on GKE is just the latest of many advancements we have made to make GKE the ideal choice for enterprises. Try Istio on GKE today by visiting the Google Cloud Platform console. To learn more, please visit cloud.google.com/istio or the Istio on GKE documentation.
Enhancing GKE networking
Earlier this year we announced many new networking features for GKE, including VPC-native clusters, Shared VPC, container-native load balancing and container-native network services for applications running on GKE and self-managed Kubernetes in Google Cloud.
- With VPC-native clusters, GKE natively supports VPC features such as larger cluster scale, IP address management, security controls, and hybrid connectivity.
- Shared VPC lets you delegate administrative responsibilities to cluster admins while ensuring your critical network resources are managed by network admins.
- Container-native load balancing lets you program load balancers with containers as endpoints directly, for more efficient and accurate load balancing.
- Network services let you use Cloud Armor, Cloud CDN and Identity Aware Proxy natively with your container workloads.
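To illustrate container-native load balancing, annotating a Service tells GKE to create network endpoint groups (NEGs) so the HTTP(S) load balancer targets Pod IPs directly instead of node ports. This is a sketch; the Service name, labels, and ports are illustrative:

```shell
# Sketch: the cloud.google.com/neg annotation enables container-native
# load balancing, so the load balancer sends traffic straight to Pods.
# Service name, selector, and ports below are illustrative.
cat <<EOF | kubectl apply -f -
apiVersion: v1
kind: Service
metadata:
  name: my-service
  annotations:
    cloud.google.com/neg: '{"ingress": true}'
spec:
  type: ClusterIP
  selector:
    app: my-app
  ports:
  - port: 80
    targetPort: 8080
EOF
```

Skipping the extra node-level hop avoids a second round of load balancing inside the cluster, which generally means more even traffic distribution and simpler troubleshooting.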
We also announced new features to help simplify the configuration of containerized deployments, including backend and frontend configuration enhancements. These improvements simplify everything from identity and access management for network resources to fine-grained controls for CDN, Cloud Armor, and load balancing, making application delivery easier.
Improving GKE security
GCP helps you secure your container environment at each stage of the build-and-deploy lifecycle with software supply chain and runtime security tools. These include integrations to tools from multiple security partners, all on top of Google’s security-focused infrastructure and security best practices. New features like node auto-upgrade and private clusters increase the security options available to GKE users. You can read more about new security features in GKE in “Exploring Containers Security: This year it’s about security.”
Delivering Kubernetes apps via GCP Marketplace
Enterprises usually work with a number of partners within their IT environments, whether it’s in the cloud or on-premises. Six months ago, we introduced Kubernetes applications delivered through GCP Marketplace. Kubernetes apps offer more than just a container image; they are production-ready solutions that are integrated with GKE for simple click-to-deploy launches. Once deployed to GKE, Kubernetes apps are managed as full applications, simplifying resource management. You can also deploy Kubernetes apps to non-GKE Kubernetes clusters, whether they’re on-premises or in the cloud, for quick deployment that’s billed alongside other GCP spend.
With Kubernetes, your cloud, your way
If you use containers and Kubernetes, you already know how they can optimize infrastructure resources, reduce operational overhead, and improve application portability. But by standardizing on Kubernetes, you’ve also laid the foundation for improved service management and security, as well as simplified application procurement and deployment, across clouds and on-prem. Stay tuned in the coming months for more about Kubernetes, microservices, and Cloud Services Platform.