About the Company
Principal Responsibilities
• Manage and optimize the Kubernetes-based infrastructure supporting Pega CDH and its external components, ensuring seamless integration and operational efficiency in cloud-native environments.
• Lead or support the migration of MQ systems to Kafka, ensuring a smooth transition, minimal downtime, and optimized performance.
• Identify and resolve performance bottlenecks across Kubernetes clusters, cloud services, and data pipelines, ensuring optimal resource utilization.
• Optimize Kubernetes configurations, cloud-native service performance, and data modeling for improved scalability and efficiency.
• Participate in capacity planning, disaster recovery design, and performance tuning for Kubernetes and cloud-native systems.
• Automate infrastructure operations using Kubernetes-native tools and industry best practices.
• Troubleshoot issues across the stack (storage, network, OS, application, etc.) and provide deep diagnostics.
• Maintain high standards of documentation and knowledge sharing within the team.
Required Qualifications
• 10+ years of hands-on experience with Apache Cassandra and Kafka in production environments.
• Strong knowledge of data replication, partitioning, schema design, and Kafka consumers/producers.
• Experience in cloud-native deployments, ideally on Google Cloud Platform (GCP).
• Solid experience managing and deploying services using Kubernetes (GKE or equivalent).
• Experience with Elasticsearch – building and managing clusters, indexing strategies, and integration with data pipelines.
• Familiarity with Pega CDH and its external component ecosystem is a significant plus.
• Strong scripting or automation experience (e.g., Terraform, Helm, Ansible, Bash, Python).
• Ability to work independently and collaboratively in a global, distributed team environment.
• Excellent problem-solving and communication skills.
Preferred Qualifications
• Experience with observability tools such as AppDynamics, Prometheus, Grafana, or GCP Operations Suite.
• Exposure to DevOps practices and CI/CD pipelines.
• Understanding of data governance, privacy, and compliance in financial or regulated environments.
If you are a passionate infrastructure engineer with deep expertise in Kubernetes, Kafka, and cloud-native systems and a desire to make a significant impact within our organization, we encourage you to apply for this exciting opportunity!