Published on June 11, 2025. Modified on June 12, 2025.
Experience: 10+ years.
Location: BKC, Mumbai.
Responsibilities:
- Evangelize, motivate, and enable our customers on their Enterprise Data Cloud journey.
- Participate in the pre- and post-sales process, helping the sales, professional services, and product teams to interpret customer use cases.
- Use your deep domain expertise and broad technical and business knowledge to help customers define their data strategy, use-case success criteria, and frameworks for successful implementations.
- Design and implement Enterprise Data Cloud architectures and configurations for customers.
- Identify and grow professional services engagements and support subscriptions by clearly demonstrating the value we bring to our customers.
- Design, create, and recommend best-practice design patterns for distributed data pipelines and analytical computing architectures.
- Plan and deliver presentations and workshops to customer and internal stakeholders.
- Write and produce technical documentation, blogs, and knowledge base articles.
Skills and Experience:
- The candidate must have 10+ years of experience.
- Extensive customer-facing/consulting experience with large-scale distributed data and computing solutions.
- A strong business understanding of how Cloudera technologies solve real world business problems.
- Appreciation of the commercial business cases that drive customer data platform initiatives.
- Experience managing project delivery and leading a technical team.
- Strong experience designing, architecting, and implementing software solutions in an enterprise Linux environment, including a solid foundation in OS and networking fundamentals.
- Strong experience with Hadoop or related technologies including deployment & administration.
- Excellent communication skills, experience with public speaking, and the ability to present to a wide range of audiences.
- Proven knowledge of big data/analytical use cases and best-practice approaches to implementing solutions for them.
- Strong experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Experience with open-source ecosystem programming languages (e.g., Python, Java, Scala, Spark).
- Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
- Experience designing frameworks for implementing data transformation and processing solutions on Hadoop or related technologies (e.g., HDFS, Hive, Impala, HBase, NiFi).
- Strong understanding of authentication (e.g., LDAP, Active Directory, SAML, Kerberos) & authorization configuration for Hadoop-based distributed systems.
- Deep knowledge of the Data/Big Data business domain.
- Familiarity with BI tools and Data Science notebooks such as Cloudera Data Science Workbench, Apache Zeppelin, Jupyter, IBM Watson Studio etc.
- Knowledge of scripting tools such as bash shell scripts, Python, or Perl.
- Familiarity with DevOps methodology & toolsets, and automation experience with Chef, Puppet, Ansible, or Jenkins.
- Ability to travel ~70%.