Will consider three years of progressive experience in the specialty in lieu of every year of education.
- All applicants authorized to work in the United States are encouraged to apply.
- At least 4 years of Information Technology experience
- At least 3 years of experience with the Hadoop platform
- At least 3 years of experience in PySpark development
- At least 4 years of experience using Python
- At least 4 years of experience with the Azure or AWS cloud platform
- At least 2 years of hands-on experience with Apache Spark and Spark SQL development
Preferred Qualifications:
- Good understanding of data integration, data quality and data architecture
- Good general knowledge of end-to-end implementation of data warehouses and data marts
- Experience in Relational Modeling, Dimensional Modeling and Modeling of Unstructured Data
- Experience in the banking domain
- Strong communication and analytical skills
- Ability to work in a diverse, multi-stakeholder environment comprising business and technology teams
- Experience and desire to work in a global delivery environment
The job entails sitting and working at a computer for extended periods of time. Candidates should be able to communicate by telephone, email, or face to face. Travel may be required based on job requirements.