Published on April 11, 2025. Modified on May 12, 2025.
Requirements:
- Must-have skills: Spark, Python, PL/SQL, NoSQL, SQL
- Expertise in cluster computing technologies - Apache Spark
- Well versed in at least one of the major cloud platforms, e.g., AWS, Azure, GCP
- Investigates and analyzes the feasibility of system requirements and develops system specifications.
- Identifies methods and solutions, and provides project leadership and management to deliver a high level of service to the department's customers.
- Acts as a subject matter expert and mentors/educates members of the team.
Responsibilities:
- Experience with columnar data storage formats such as Apache Parquet
- Familiarity with relational and big-data technologies such as PL/SQL, PostgreSQL, MySQL, Aurora, DynamoDB, and similar
- Familiarity with Git and build automation tools such as Maven.
- Experience in unit testing techniques
- Passion for finding and solving problems.
- Excellent communication skills and a proven ability to convey complex ideas in a concise and clear manner.
- Software Development Life Cycle (SDLC) experience, including planning, design, development, testing, and debugging