BUSINESS TITLE : Data Engineer
LOCATION : Gurugram
POSITION DESCRIPTION :
WHAT YOU'LL DO :
- Architect, design, document, and implement data pipelines that feed data models for downstream consumption in Snowflake, using dbt and Airflow.
- Ensure the correctness and completeness of data transformed through engineering pipelines for end consumption in analytical dashboards.
- Actively monitor and triage technical challenges in critical situations that require immediate resolution.
- Evaluate viable technical solutions and share MVPs or PoCs in support of the research.
- Develop relationships with external stakeholders to maintain awareness of data and security issues and trends.
- Review work from other tech team members and provide feedback for growth.
- Implement data performance and data security policies that align with governance objectives and regulatory requirements.
YOU'RE GOOD AT :
- You have experience in data warehousing, data modeling, and building data engineering pipelines.
- You are well-versed in data engineering methods, such as ETL and ELT techniques, through scripting and/or tooling.
- You are good at analyzing performance bottlenecks and providing enhancement recommendations; you have a passion for customer service and a desire to learn and grow as a professional and a technologist.
- Strong analytical skills related to working with unstructured datasets.
- Collaborating with product owners to identify requirements, define desired outcomes and deliver trusted results.
- Building processes supporting data transformation, data structures, metadata, dependency and workload management.
- This role is heavily focused on SQL; an ideal candidate must have hands-on experience with SQL database design, plus Python.
- Demonstrably deep understanding of SQL (level: advanced) and analytical data warehouses (Snowflake preferred).
- Demonstrated ability to write new code that is well-documented and stored in a version control system (we use GitHub & Bitbucket).
- Extremely talented in applying SCD, CDC, and DQ/DV frameworks.
- Familiar with JIRA & Confluence.
- Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake (a minimal orchestration sketch follows this list).
- Desire to continually keep up with advancements in data engineering practices.
- Knowledge of AWS cloud and Python is a plus.
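For illustration, a minimal sketch of the kind of dbt-plus-Airflow orchestration against Snowflake described above might look like the following. The DAG id, schedule, and dbt project paths are hypothetical assumptions, not details from this posting, and the sketch assumes Airflow 2.4+ with the dbt CLI available on the worker.

```python
# Minimal sketch: an Airflow DAG that runs and tests dbt models in Snowflake.
# dag_id, schedule, and project/profiles paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",              # daily at 06:00 UTC (Airflow 2.4+ syntax)
    catchup=False,
    tags=["dbt", "snowflake"],
) as dag:
    # Build the dbt models that feed the analytical layer in Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    # Validate the transformed data before dashboards consume it.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test
```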
YOU BRING (EXPERIENCE & QUALIFICATIONS) :
Essential Education :
- Bachelor's degree or equivalent combination of education and experience; a Bachelor's degree in information science, data management, computer science, or a related field is preferred.
Essential Experience & Job Requirements :
- 5 years of IT experience with a major focus on data warehouse/database-related projects.
- Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
- Experience with data platforms: Snowflake, Oracle, SQL Server, MDM, etc.
- Expertise in writing SQL and database objects: stored procedures, functions, and views.
- Hands-on experience in ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies, e.g., dbt, Attunity, GoldenGate, APIs, Apache Airflow, etc.
- Experience in data modeling and relational database design.
- Well-versed in applying SCD, CDC, and DQ/DV frameworks (an illustrative SCD sketch appears at the end of this posting).
- Demonstrated ability to write new code that is well-documented and stored in a version control system (we use GitHub & Bitbucket).
- Good to have experience with cloud platforms such as AWS, Azure, GCP, and Snowflake.
- Good to have strong programming/scripting skills (Python, PowerShell, etc.).
- Good to know about developing financial models and forecasting to support financial planning and decision-making processes.
- Experience analyzing and interpreting financial data to provide valuable insights and support strategic decision-making.
- Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs).
- Excellent written and oral communication and presentation skills to present architecture, features, and solution recommendations.
YOU'LL WORK WITH :
- Global functional product portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, and functional area teams across levels.
- Global Data Product Portfolio Management & teams (Enterprise Data Model, Data Catalog, Master Data Management).
- Consulting and internal Data Product Portfolio teams across the organization.
YOUR TRAVEL :
Mostly from the office, but occasionally work at other offices.
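For illustration, a minimal sketch of the SCD/CDC pattern referenced in the requirements, driven from Python via the Snowflake connector, might look like the following. The account, table, and column names are hypothetical; an SCD Type 2 load would still need a follow-up insert of the new versions of changed rows.

```python
# Rough sketch of an SCD Type 2 style change capture in Snowflake from Python.
# Connection parameters, table names, and columns are hypothetical.
import os

import snowflake.connector

# MERGE closes out changed current rows and inserts brand-new customers;
# a follow-up INSERT would add the new versions of the changed rows.
MERGE_SQL = """
MERGE INTO dim_customer AS tgt
USING stg_customer AS src
    ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
WHEN MATCHED AND tgt.address <> src.address THEN UPDATE SET
    is_current = FALSE,
    valid_to   = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT
    (customer_id, address, valid_from, valid_to, is_current)
    VALUES (src.customer_id, src.address, CURRENT_TIMESTAMP(), NULL, TRUE);
"""

conn = snowflake.connector.connect(
    account="my_account",                        # hypothetical
    user="etl_user",                             # hypothetical
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="CORE",
)
try:
    conn.cursor().execute(MERGE_SQL)
    conn.commit()
finally:
    conn.close()
```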