based on the unique data and cultural characteristics of each region.
Responsibilities:
- Responsible for the deployment, configuration, monitoring, and maintenance of big data platforms.
- Implement and optimize the infrastructure for big data storage, processing, and computation.
- Responsible for architectural design, performance tuning, and troubleshooting of big data platforms.
- Analyze and resolve complex system performance and stability issues to ensure system reliability and stability.
- Write technical documentation to record system configurations and operational procedures.
- Update software, enhance existing software capabilities, and develop and direct software testing and validation procedures.
Qualifications:
Minimum Qualifications:
- Bachelor's degree in Computer Science or a related discipline, with 2 years of relevant software engineering experience.
- 2 years' experience in big data platform operations; familiarity with technologies such as Hadoop, Spark, and ClickHouse.
- Solid programming skills, proficient in at least one programming language (e.g., Java, Scala, Python).
- Strong foundation in computer science, including operating system principles, data structures, and algorithms.
- Compliance Requirement: Familiarity with and adherence to international data protection regulations; ability to formulate and execute compliance-oriented strategies.
- Global Multi-Environment Deployment and Operations: Experience in deploying and maintaining big data platforms across multiple global locations; familiarity with cross-geographical operational challenges.
Preferred Qualifications:
- Excellent teamwork and communication skills; experience in project management.
- Strong problem analysis and resolution skills, with the ability to respond quickly to emergencies.