HSBC Software · Group Data Technology

Consultant Specialist : 0000L943

Salary negotiable  /  Xi'an

Updated 2025-05-29


Position Details

Recruitment type: Experienced hire
Employment type: Full-time

Job Description

Principal responsibilities

Design and Develop ETL Processes:

  • Lead the design and implementation of ETL processes, using batch and streaming tools to extract, transform, and load data from various sources into GCP.
  • Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs.
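By way of illustration, the extract-transform-load pattern described above can be sketched with standard-library Python; the source data, column names, and target table here are hypothetical stand-ins for the batch/streaming sources and GCP targets this role works with:

```python
import csv
import io
import sqlite3

# Minimal ETL sketch: extract from a flat-file (CSV) source, normalize
# the records, and load them into a target table. All names are
# illustrative, not part of any real pipeline.

RAW_CSV = """id,amount,currency
1,10.50,usd
2,3.20,eur
"""

def extract(text):
    """Extract: parse rows from the raw CSV source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize types and casing before loading."""
    return [
        (int(r["id"]), float(r["amount"]), r["currency"].upper())
        for r in rows
    ]

def load(rows, conn):
    """Load: write transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
currencies = [c for (c,) in conn.execute("SELECT currency FROM payments ORDER BY id")]
```

In a production GCP pipeline the same three stages would typically read from Cloud Storage or Pub/Sub and write to BigQuery rather than SQLite.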

Data Pipeline Optimization:

  • Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows.
  • Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks.

Data Integration and Management:

  • Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency.
  • Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting.
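The data-quality checks applied when merging records from diverse sources can be sketched as follows; the field names and validation rules are hypothetical examples of the kind of consistency enforcement this responsibility involves:

```python
# Minimal data-quality sketch for records merged from multiple sources
# (databases, APIs, flat files). Field names and rules are illustrative.

def validate(record, seen_ids):
    """Return a list of data-quality problems found in one merged record."""
    problems = []
    if not record.get("id"):
        problems.append("missing id")
    elif record["id"] in seen_ids:
        problems.append("duplicate id")
    else:
        seen_ids.add(record["id"])
    if record.get("amount") is not None and record["amount"] < 0:
        problems.append("negative amount")
    return problems

seen = set()
source_a = {"id": "r1", "amount": 10.0}   # e.g. from a database
source_b = {"id": "r1", "amount": -5.0}   # e.g. from an API, conflicting
issues = [validate(r, seen) for r in (source_a, source_b)]
```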

GCP Dataflow Development:

  • Write Apache Beam-based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy.
  • Collaborate with data analysts and data scientists to prepare data for analysis and reporting.
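The per-element transform logic such a Dataflow job applies can be sketched in plain Python; in an actual Apache Beam pipeline a function like this would be wrapped in `beam.Map` or a `DoFn`. The event schema is hypothetical:

```python
import json

# Sketch of a per-element transform of the kind a Beam/Dataflow job
# applies to each streamed event. The schema and derived fields are
# illustrative assumptions, not a real pipeline's contract.

def parse_and_enrich(line):
    """Parse one raw JSON event and derive fields for downstream analysis."""
    event = json.loads(line)
    return {
        "user_id": event["user_id"],
        "amount_cents": round(event["amount"] * 100),  # integer cents avoid float drift
        "is_large": event["amount"] >= 1000.0,
    }

records = [parse_and_enrich(l) for l in [
    '{"user_id": "u1", "amount": 12.34}',
    '{"user_id": "u2", "amount": 2500.0}',
]]
```

Keeping the transform a pure function like this makes it unit-testable independently of the Beam runner.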

Automation and Monitoring:

  • Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention.
  • Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs.

Data Governance and Security:

  • Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies.
  • Collaborate with security teams to implement data protection measures and address vulnerabilities.

Documentation and Knowledge Sharing:

  • Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members.
  • Conduct training sessions and workshops to share expertise and promote best practices within the team.


Qualifications

Education:

  • Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience:

  • Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP.
  • Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development.

Technical Skills:

  • Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering.
  • Experience with cloud-based solutions, especially on GCP; cloud-certified candidates are preferred.
  • Experience with big data processing in both batch and streaming modes; proficient in the big data ecosystem, e.g., Hadoop, HBase, Hive, MapReduce, Kafka, Flink, Spark.
  • Familiarity with Java and Python for data manipulation on cloud/big data platforms.

Analytical Skills:

  • Strong problem-solving skills with a keen attention to detail.
  • Ability to analyze complex data sets and derive meaningful insights.

Benefits:

  • Competitive salary and comprehensive benefits package.
  • Opportunity to work in a dynamic and collaborative environment on cutting-edge data projects.
  • Professional development opportunities to enhance your skills and advance your career.


Position Requirements

  • Language requirement: English

Company Benefits

  • Social insurance and housing fund
  • Paid annual leave
  • Regular health checkups
  • Flexible working hours
  • Well-established management practices