Company Profile
• Technical Delivery: Build and deliver high-quality big data pipelines to drive business value.
• Technical Expertise: Solve complex technical challenges, optimize product performance, and ensure scalability.
• Quality Control: Ensure data pipelines and systems meet quality standards and implement robust testing processes.
• Process Improvement: Continuously refine procedures and systems for efficiency, cost-effectiveness, and quality.
• Mentorship: Provide technical guidance and mentorship to junior team members, fostering their growth and development.
• Collaboration: Work closely with cross-functional teams, including business stakeholders, to understand requirements and deliver data solutions that align with organizational goals.
• Proven experience with the Hadoop and Spark ecosystems using Python (e.g., HDFS, Hive, Pig, Kafka).
• Solid understanding of data modeling, ETL processes, and data warehousing concepts.
• Experience with CI/CD pipelines and version control systems (e.g., Git, Jenkins, Ansible).
• Excellent problem-solving skills and ability to work in a fast-paced environment.
What additional skills would be good to have?
• Hands-on experience with cloud platforms such as GCP or Alibaba Cloud.
• Familiarity with Java and React JS programming.