Job description
We are looking for a dynamic team player to support various High-Performance Computing (HPC) service offerings, join several HPC service-providing teams, and work closely with colleagues in both North America and Asia-Pacific. You will assist software developers, data scientists, and data analysts daily in using GCP Dataproc and Dataflow services.
Required skills
- PySpark and Python
- Knowledge of Apache Beam
- Apache Airflow
- Proficiency in Terraform
- Experience with Tekton pipelines
- Shell, Java, and SQL programming
- Knowledge of Docker and Kubernetes
Responsibilities
- Assist software engineers, data analysts, and data scientists daily in using GCP Dataproc and Dataflow services, e.g., creating jobs and clusters and handling errors.
- Assist in debugging problems on the Hadoop and GCP platforms, including Dataproc, Dataflow, Hive, Spark, Oozie, Kafka, and NiFi, among others.
- Carry out proofs of concept (PoCs) and testing for various Dataproc and Dataflow use cases.
- Take part in the monthly on-call rotation.
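The day-to-day Dataproc and Dataflow assistance described above is typically done through the gcloud CLI. A minimal sketch of the kind of commands involved (the cluster name, bucket, script path, and region below are hypothetical placeholders, and running them requires an authenticated GCP project):

```shell
# Create a small Dataproc cluster (name and region are illustrative).
gcloud dataproc clusters create example-cluster \
    --region=us-central1 \
    --num-workers=2

# Submit a PySpark job to that cluster (the GCS script path is hypothetical).
gcloud dataproc jobs submit pyspark gs://example-bucket/wordcount.py \
    --cluster=example-cluster \
    --region=us-central1

# List active Dataflow jobs when debugging pipeline issues.
gcloud dataflow jobs list --region=us-central1 --status=active
```

This is only an outline of the workflow, not a runnable script as-is; in practice, cluster sizing, networking, and IAM settings would come from the team's Terraform configuration rather than ad-hoc flags.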
- Function: Analytics & Business Intelligence – Other
- Industry: Automotive Parts
- Division: Analytics & Data Science
- Employment Type: Permanent, Full-Time
- Role Category: Analytics & Business Intelligence
Education
- UG: Any graduate
- Postgraduate: Any degree