and should be able to translate business requirements into technical requirements for solutions.
Required years of experience, training, technical skills, and other requirements for job performance:
- Minimum 11 years of experience in Programming Languages: Python, SQL, Scala, Java, C++, COBOL, Shell Script
- Minimum 8 years of experience in Cloud Platforms and Big Data Technologies: AWS, S3, AWS Glue, Athena, Lambda, Redshift, EMR, EC2, Airflow, Azure, ADLS, ADF, Databricks, Synapse, Apache Spark, Hadoop
- Minimum 10 years of experience in Databases: Oracle, SQL Server, Sybase, MongoDB, Cosmos DB, MySQL, PostgreSQL, Hive
- Minimum 8 years of experience in Data Warehousing: Databricks, Redshift, Synapse, Snowflake
- ETL Tools: Informatica PowerCenter, Informatica Cloud, DataStage
- Data Modeling: Star Schema, Snowflake Schema, Data Marts, Dimensional Modeling
- Data Governance: Unity Catalog
- DevOps: CI/CD, Kubernetes
- Other Tools: GitHub, Jenkins, Terraform, AutoSys, SFTP, MFT, MOVEit, Control-M, Power BI
Certifications:
- AWS Cloud Practitioner
- Microsoft Azure Fundamentals (AZ-900)
- SnowPro Core
Educational Requirements:
- A Bachelor's degree in Engineering from an accredited institution, or a foreign equivalent, is required. Three years of progressive, relevant work experience may also be considered in lieu of each year of education.
The job entails sitting and working at a computer for extended periods. The candidate must be able to communicate by telephone, email, or face-to-face. Travel may be required based on job needs.