#R-208745
management for analytics. Additional responsibilities include designing interfaces, workflows, and process models, and deploying integrations in development and production environments, ensuring compliance and operational excellence.
This position is perfect for a collaborative, detail-oriented professional passionate about enhancing clinical operations through advanced engineering and integrations.
Roles & Responsibilities:
Collaborate closely with product owners, data architects, business SMEs, and engineers to develop and deliver high-quality solutions, enhancing and maintaining integrations across clinical systems.
Design and architect the next-generation metrics engine on modern infrastructure to support operational analytics leveraging cloud technologies.
Design and implement a new data governance capability incorporating essential features like metadata and reference data management for analytics.
Take ownership of complex software projects from conception to deployment, managing scope, risk, and timelines.
Utilize rapid prototyping skills to quickly translate concepts into working solutions and code.
Leverage modern AI/ML technologies to enable predictive analytics, NLP/NLQ capabilities, and enhance the overall data analytics process.
Analyze functional and technical requirements of applications, translating them into software architecture and design specifications.
Develop and execute unit tests, integration tests, and other testing strategies to ensure software quality and reliability.
Integrate systems and platforms to ensure seamless data flow, functionality, and interoperability.
Provide ongoing support and maintenance for applications, ensuring smooth and efficient operation.
Collaborate on building advanced analytics capabilities to empower data-driven decision-making and operational insights.
Provide technical guidance and mentorship to junior developers, fostering team growth and skill development.
Basic Qualifications and Experience:
Doctorate Degree OR
Master's degree with 4 - 6 years of experience in Computer Science, IT or related field OR
Bachelor's degree with 6 - 8 years of experience in Computer Science, IT or related field OR
Diploma with 10 - 12 years of experience in Computer Science, IT or related field OR
Diploma with 14 - 18 years of experience in Computer Science, IT or related field
Functional Skills:
Must-Have Skills:
Strong understanding of cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker, Kubernetes)
Hands-on experience writing SQL with any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.)
Hands-on experience in programming and markup languages (e.g., SQL, C++, JavaScript, XML)
Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, Spark SQL), including workflow orchestration and performance tuning of big data processing
Good-to-Have Skills:
Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting
Experience with Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow
Experience building data flow pipelines to extract, transform, and load data from a variety of sources and formats, including custom ETL pipelines
Experience working in an agile environment (e.g., user stories, iterative development)
Soft Skills:
Excellent analytical and troubleshooting skills
Strong verbal and written communication skills
Ability to work effectively with global, virtual teams
High degree of initiative and self-motivation
Ability to manage multiple priorities successfully
Team-oriented, with a focus on achieving team goals
Strong presentation and public speaking skills