Kloud9 - Data Engineer - Google Cloud Platform, Bangalore/remote

India, Bangalore (remote)
Last update 2024-12-05
Expires 2025-01-05
ID #2453219339
Modified November 14, 2024

Description

Location: Bangalore
Experience: 3-5 years
Mode: Hybrid (2 days per week in the office)

Job Summary: We are seeking a highly skilled Data Engineer with deep expertise in Google Cloud Platform (GCP), Apache Spark, and SAP S/4HANA data ingestion to design and implement scalable, high-performance data lake solutions.

The ideal candidate will have extensive experience building data ingestion pipelines, managing big data processing with Apache Spark, and integrating SAP S/4HANA with GCP infrastructure.

Key Requirements:

- Over 3 years of professional experience in data engineering, including the delivery of large-scale enterprise data engineering projects with current technologies.

- Over 2 years of hands-on experience in GCP technologies.

- Build and optimize large-scale data pipelines using Apache Spark on GCP (via Dataproc or other Spark services), ensuring high performance and scalability in Spark-based data processing workloads (a minimal PySpark sketch follows this list).

- Experience integrating SAP S/4HANA data with GCP for real-time and batch data processing.

- Manage data extraction, transformation, and loading (ETL) processes from SAP S/4HANA into cloud storage and data lakes.

- Develop and manage scalable data ingestion pipelines for structured and unstructured data using tools like Cloud Dataflow, Cloud Pub/Sub, and Apache Spark.

- Implement both real-time streaming and batch processing pipelines using Apache Spark, Dataflow, and other GCP services to meet business requirements.

- Implement data governance, access controls, and security best practices to ensure the integrity, confidentiality, and compliance of data across systems.

- Optimize Apache Spark jobs for performance, scalability, and cost-efficiency, ensuring that the architecture can handle growing data volumes.
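
As an illustration of the pipeline work described above, here is a minimal PySpark batch-ingestion sketch, assuming raw SAP S/4HANA extracts have already landed as CSV files in a Cloud Storage bucket; the bucket paths, dataset, and column handling are hypothetical placeholders, not part of the role description.

from pyspark.sql import SparkSession, functions as F

# Hypothetical paths for illustration only; adjust to the actual SAP extract layout.
RAW_PATH = "gs://example-landing-zone/sap_s4hana/sales_orders/*.csv"
CURATED_PATH = "gs://example-data-lake/curated/sales_orders/"

spark = SparkSession.builder.appName("sap-sales-orders-ingest").getOrCreate()

# Read the raw extract; schema inference is used here for brevity,
# though an explicit schema is preferable in a production pipeline.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv(RAW_PATH)
)

# Light cleanup: normalize column names, drop exact duplicates, stamp a load date.
curated = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
    .dropDuplicates()
    .withColumn("load_date", F.current_date())
)

# Write partitioned Parquet into the curated zone of the data lake.
(
    curated.write
    .mode("overwrite")
    .partitionBy("load_date")
    .parquet(CURATED_PATH)
)

spark.stop()

On GCP, a script like this would typically be submitted to an existing cluster with gcloud dataproc jobs submit pyspark; tuning partitioning, file sizes, and cluster sizing is where most of the performance and cost-efficiency work mentioned above happens.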

Technical Expertise:

- Expert-level programming proficiency in Python, Java, and Scala.

- Extensive hands-on experience with big data technologies, including Spark, Hadoop, Hive, YARN, MapReduce, Pig, Kafka, and PySpark.

- Proficient in Google Cloud Platform services such as BigQuery, Dataflow, Cloud Storage, Dataproc, Cloud Composer, Pub/Sub, and Cloud Functions.

- Expertise in Apache Spark for both batch and real-time processing, as well as proficiency in Apache Beam, Hadoop, or other big data frameworks.

- Experienced in using Cloud SQL, BigQuery, and Looker Studio (formerly Google Data Studio) for cloud-based data solutions.

- Skilled in orchestration and deployment tools like Cloud Composer, Airflow, and Jenkins for continuous integration and deployment (CI/CD); see the DAG sketch after this list.

- Expertise in designing and developing integration solutions involving Hadoop/HDFS, real-time systems, data warehouses, and analytics platforms.

- Experience with DevOps practices, including version control (Git), CI/CD pipelines, and infrastructure-as-code (e.g., Terraform, Cloud Deployment Manager).

- Strong background in working with relational databases, NoSQL databases, and in-memory databases.

- Strong knowledge of security best practices, IAM, encryption mechanisms, and compliance frameworks (GDPR, HIPAA) within GCP environments.

- Experience in implementing data governance, data lineage, and data quality frameworks.

- In-depth knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development, and big data solutions.

- Excellent debugging and problem-solving skills.

- Retail and e-commerce domain knowledge is a plus.

- Positive attitude with strong analytical skills and the ability to guide teams effectively.
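
For the orchestration skills listed above, here is a minimal Cloud Composer (Airflow) DAG sketch that schedules a daily Dataproc PySpark job; the project, region, cluster, and script URI are hypothetical placeholders, and the operator used is the standard DataprocSubmitJobOperator from the Airflow Google provider package.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

# Hypothetical identifiers for illustration only.
PROJECT_ID = "example-project"
REGION = "asia-south1"
CLUSTER_NAME = "example-dataproc-cluster"

# Dataproc job config in the API's dict form, pointing at a staged PySpark script.
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/sap_sales_orders_ingest.py"},
}

with DAG(
    dag_id="sap_s4hana_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the Spark ingestion job to Dataproc once per day.
    ingest = DataprocSubmitJobOperator(
        task_id="submit_pyspark_ingest",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )

In practice this is where the CI/CD skills above come in: DAGs and job scripts are version-controlled in Git and deployed to the Composer environment through an automated pipeline.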

Preferred Qualifications:

- GCP certifications, such as Professional Data Engineer or Professional Cloud Architect.

- Apache Spark and Python certifications.

- Experience with data visualization tools such as Tableau and Power BI.

(ref:hirist.tech)

Job details:

Job type: Full time
Contract type: Permanent
Salary type: Monthly
Occupation: Kloud9 - Data Engineer - Google Cloud Platform
