Primary Responsibilities
- Define and align on strategic initiatives pertaining to Data and Analytics Architecture.
- Design and develop data lakes, managing data flows that integrate information from various sources into a common data lake platform through an ETL tool, supporting near real-time use cases.
- Design repeatable and reusable solution architectures and data ingestion pipelines for bringing in data from ERP source systems like SAP.
- Manage data integration with tools like Databricks and Snowflake, or equivalent data lake and data warehouse platforms.
- Design and develop data warehouses for scale.
- Design and evaluate data models (Star, Snowflake, and Flattened).
- Design data access patterns for OLTP and OLAP-based transactions.
- Triage, debug, and fix technical issues related to data lakes and data warehouses.
- Serve and share data through modern data warehousing tools and practices.
- Coordinate with business and technical teams through all phases of the software development life cycle.
- Participate in making major technical and architectural decisions.
- Prototype new technology solutions hands-on in collaboration with cross-functional teams.
You Must Have:
- 5+ years of experience on AWS Cloud building data lake and data warehousing architectures.
- 5+ years of experience building data warehouses on Snowflake, Redshift, HANA, Teradata, Exasol, etc.
- 3+ years of experience with AWS data services like S3, Glue, Lake Formation, EMR, Kinesis, RDS, DMS, and others.
- 3+ years of experience with data modeling.
- 3+ years of working knowledge in Spark or equivalent big data technologies.
- 3+ years of experience building Delta Lake architectures using technologies like Databricks.
- 3+ years of working experience with ETL tools like Talend, Informatica, SAP Data Services, etc.
- 3+ years of experience in any programming language (Python, R, Scala, Java).
- 3+ years of experience working with ERP systems like SAP, focusing on data integration.
- Bachelor’s degree in computer science, information technology, data science, data analytics, or a related field.
- Experience working on Agile projects and Agile methodology in general.
- Excellent problem-solving, communication, and teamwork skills.
- Exceptional presentation, visualization, and analysis skills.
Job Types: Full-time, Contract
Pay: $80.00 - $82.00 per hour
Expected hours: 40 per week
Schedule:
- 8-hour shift
- Monday to Friday
Experience:
- AWS Cloud building data lakes & warehouses: 1 year (Required)
- Data Warehouses on Snowflake, Redshift, HANA, Teradata: 1 year (Required)
- Data Modeling: 1 year (Required)
- ETL tools like Talend, Informatica, SAP Data Services: 1 year (Required)
Work Location: Remote