Candidate must be located within commuting distance of Newport, NJ or be willing to relocate to the area. This position may require travel to project locations.
- Bachelor's degree or foreign equivalent from an accredited institution required. Three years of progressive experience in the specialty will also be considered in lieu of each year of education.
- At least 11 years of Information Technology experience
- All applicants authorized to work in the United States are encouraged to apply
- Must have deep experience architecting Big Data solutions using Hadoop, Spark, Hive, Kafka, and Scala
- Must have in-depth knowledge of the Hadoop ecosystem
- Should have a strong data background, including knowledge of data warehousing, ETL, and reporting
- Good experience in end-to-end implementation of data warehouses and data marts
- Experience with and detailed knowledge of Master Data Management, ETL, data quality, metadata management, data profiling, micro-batches, and streaming data loads
- Experience designing and implementing complex solutions for distributed systems
- Experience in leading and mentoring teams
Preferred Qualifications:
- Good understanding of data integration, data quality, and data architecture
- Experience in Relational Modeling, Dimensional Modeling and Modeling of Unstructured Data
- Strong knowledge of and hands-on experience with SQL and Unix shell scripting
- Good understanding of Agile software development frameworks
- Strong communication and analytical skills
- Ability to work in a diverse, multi-stakeholder environment comprising business and technology teams
- Experience and desire to work in a global delivery environment
The job entails sitting and working at a computer for extended periods of time. Candidates should be able to communicate by telephone, email, or face to face. Travel may be required per job requirements.