yments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee can be a part of something bigger and change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.
Join a fast-growing team
As a Senior Data Engineer on the Data Engineering & Analytics team, you will develop data and analytics solutions that sit atop vast datasets gathered by retail stores, restaurants, banks, and other consumer-focused companies. The challenge will be to create high-performance algorithms, cutting-edge analytical techniques (including machine learning and artificial intelligence), and intuitive workflows that allow our users to derive insights from big data that in turn drive their businesses. You will have the opportunity to create high-performance analytic solutions based on data sets measured in billions of transactions, along with front-end visualizations that unleash the value of big data. You will also develop innovative, data-driven analytical solutions, identify opportunities to support business and client needs quantitatively, and facilitate informed recommendations and decisions through activities such as building ML models and automated data pipelines, designing data architectures and schemas, and running jobs on big data clusters using various execution engines and programming languages such as Hive/Impala, Python, Java, Kafka, Spark, and R.
• Hands-on developer who writes good quality, secure code that is modular, functional, and testable.
• Drive the evolution of Data & Services products/platforms with a focus on data science and engineering impact.
• Design and implement scalable data architecture and data pipelines.
• Solve complex problems with multi-layered data sets, and optimize existing machine learning libraries and frameworks.
• Provide support for deployed data applications and analytical models; act as a trusted advisor to Data Scientists and other data consumers by identifying data problems and guiding issue resolution with partner Data Engineers and source data providers.
• Ensure proper data governance policies are followed by implementing or validating data lineage, quality checks, classification, etc.
• Discover, ingest, and incorporate new sources of real-time, streaming, batch, and API-based data into our platform to enhance the insights we get from running tests and expand the ways and properties on which we can test.
• Experiment with new tools to streamline the development, testing, deployment, and running of our data pipelines.
• Participate in the development of data and analytic infrastructure for product development.
• Continuously innovate and determine new approaches, tools, techniques & technologies to solve business problems and generate business insights & recommendations.
• Partner with roles across the organization including consultants, engineering, and sales to determine the highest priority problems to solve
• Evaluate trade-offs between many possible analytics solutions to a problem, taking into account usability, technical feasibility, timelines, and differing stakeholder opinions to make a decision
• Break large solutions into smaller, releasable milestones to collect data and feedback from product managers, clients, and other stakeholders.
• Evangelize releases to users, incorporate feedback, and track usage to inform future development.
• Work with small, cross-functional teams to define the vision and establish team culture and processes.
• Consistently focus on key drivers of organizational value and prioritize operational activities accordingly.
• Escalate technical errors or bugs detected in project work
• Maintain awareness of relevant technical and product trends through self-learning/study, training classes, and job shadowing.
• Support the building of scaled machine learning production systems by designing pipelines and engineering infrastructure.
Ideal Candidate Qualifications:
• Working proficiency in Python/Scala, Spark (job tuning), SQL, and Hadoop platforms to build Big Data products & platforms.
• Good programming skills in Java, Spring Boot, and JUnit.
• Knowledge of software development test approaches & frameworks.
• Familiarity with RESTful APIs and microservices architectures.
• Experience working with CI/CD.
• Experience working with SQL databases such as Postgres and Oracle.
• Hands-on experience with Hadoop big data tools (Hive, Impala, Spark) preferred.
• Experience with data pipeline and workflow management tools such as NiFi and Airflow.
• Comfortable in developing shell scripts for automation.
• Good troubleshooting and debugging skills.
• Proficient in standard software development practices, such as version control, testing, and deployment.
• Demonstrated basic knowledge of statistical analysis techniques, coding, and data engineering.
• Ability to quickly learn and implement new technologies
• Ability to solve complex problems with multi-layered data sets.
• Ability to innovate and determine new approaches & technologies to solve business problems and generate business insights & recommendations.
• Ability to multi-task, with strong attention to detail.
• Flexibility to work as a member of matrix-based, diverse, and geographically distributed project teams.
• Good communication skills, both verbal and written, and strong relationship, collaboration, and organizational skills.
The following skills will be considered a plus:
• Experience with performance tuning of database schemas, databases, SQL, ETL jobs, and related scripts.
• Experience working with cloud APIs (e.g., Azure, AWS).
• Experience participating in complex engineering projects in an Agile setting (e.g., Scrum).
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: