#2025-112375
for keeping up to date on new technologies.
What you'll do:
Design, develop, and maintain scalable data streaming pipelines using Java, Spring, and GCP-native services such as Pub/Sub and Dataflow, or alternatives like Kafka and RabbitMQ.
Develop and unit test high-quality, maintainable code; partner with QA to ensure comprehensive test coverage and zero-defect production releases.
Design, develop, and manage a front-end self-service portal using React.
Build reliable batch ingestion jobs to integrate HR data from multiple upstream sources into the Operational Data Exchange (ODX) database.
Streamline, simplify, and performance-tune batch and streaming data loads to improve throughput and minimize latency.
Collaborate closely with business stakeholders and upstream application teams to understand requirements, align on data contracts, and build trusted relationships.
Work with Production Support and Platform Engineering teams to triage and resolve production issues promptly, while ensuring data security and platform reliability.
Follow agile and release management best practices to ensure smooth deployments and prevent production install failures.
Stay current with evolving technologies and trends; continuously learn and apply modern patterns for data engineering and streaming.
Communicate effectively across technical and non-technical audiences; demonstrate ownership, adaptability, and a collaborative mindset.
What you have
What you must have:
Minimum of 7 years of hands-on development experience with parallel processing databases such as Teradata and Google BigQuery.
Must have 5+ years' experience with Java and Spring Boot; experience with Google Cloud Platform and Informatica IICS is preferred.
Must have 2+ years' experience in developing front-end applications using React.
Experience with data streaming technologies such as Kafka and RabbitMQ.
Experience with all aspects of data systems, including database design, ETL, aggregation strategy, and performance optimization.
Experience setting best practices for designing and building code, with strong Java and SQL skills to develop, tune, and debug complex applications.
Expertise in schema design, developing data models, and proven ability to work with complex data is required.
Hands-on programming experience in Java, Python, or Spark.
Hands-on experience with Linux and shell scripting.
Hands-on experience with CI/CD tools like Bamboo, Jenkins, Bitbucket, etc.
What's in it for you
At Schwab, we're committed to empowering our employees' personal and professional success. Our purpose-driven, supportive culture and focus on your development mean you'll get the tools you need to make a positive difference in the finance industry. Our Hybrid Work and Flexibility approach balances our ongoing commitment to workplace flexibility, serving our clients, and our strong belief in the value of being together in person on a regular basis.
We offer a competitive benefits package that takes care of the whole you - both today and in the future: