Responsibilities:
• Design, implement, and manage AliCloud CI/CD pipelines to facilitate automated build, test, and deployment processes for applications and services.
• Be familiar with code governance, release management, artifact management, and the integration toolchain across the SDLC.
• Continuously improve and optimize the performance and scalability of cloud-based infrastructure and services.
• Troubleshoot and resolve issues related to the AliCloud platform, application deployments, and performance; participate in on-call support as needed.
• Provide analysis and support to development teams to improve and automate the build/release/deployment processes.
• Participate in and report on cost optimization exercises for cloud resources.
• Provide not only development capabilities but also operational expertise across infrastructure, IT service management, disaster recovery, patching, and monitoring for our self-owned systems.
• Our stakeholders are spread across the APAC region; as a DevOps Engineer, fluency in both written and spoken English is preferred.
Requirements:
• Bachelor's degree in Computer Science, Engineering, or Software Engineering.
• Strong experience with Linux-based infrastructures, Linux/Unix administration, and Kubernetes.
• Strong experience with AliCloud (e.g., ACK, ARMS, KMS, Cloud-Native API Gateway).
• Solid understanding of the DevSecOps process and CI/CD pipeline management platforms such as GitHub, Jenkins, Artifactory, Snyk, etc.
• Strong scripting skills in languages such as Shell, Python, Groovy, and Bash.
• Experience with monitoring tools (e.g., New Relic, ARMS).
• More than five years of experience in a DevOps Engineer (or similar) role; experience in software development or infrastructure development is a plus.
• Experience working with Azure is preferred.
• Nice to have: experience working with infrastructure as code (Terraform, Ansible).
• Nice to have: AliCloud certification (e.g., ACA, ACP, ACE).
Updated on 2025-12-21
Amazon Global Selling has been helping individuals and businesses increase sales and reach new customers around the globe. Today, more than 50% of Amazon's total unit sales come from third-party selection. The Global Selling team in China is responsible for recruiting local businesses to sell on Amazon's 19+ overseas marketplaces and supporting local sellers' success and growth on Amazon. Our vision is to be the first choice for all types of Chinese businesses going global.
The Amazon Global Selling Analytics, Intelligence, and Technology (AGS-AIT) team serves as the research, automation, and insight arm of the International Seller Service data hub, enabling rapid delivery of growth insights through strategic investments in regional data foundations, self-service business intelligence solutions, and artificial intelligence tools.
The AGS-AIT team is positioned to establish AI-ready foundational capabilities across the AGS organization while maintaining excellence in business insight generation and self-service BI/AI application development.
AGS-AIT is looking for a Data Engineer to collaborate with cross-functional teams to design and develop data infrastructure and analytics capabilities for AGS AI Automation initiatives.
Key job responsibilities
• Design and implement end-to-end data pipelines (ETL) to ensure efficient data collection, cleansing, transformation, and storage, supporting both real-time and offline analytics needs.
• Develop automated data monitoring tools and interactive dashboards to enhance business teams' insight into core metrics (e.g., user behavior, AI model performance).
• Collaborate with cross-functional teams (e.g., Product, Operations, Tech) to align data logic, integrate multi-source data (e.g., user behavior, transaction logs, AI outputs), and build a unified data layer.
• Establish data standardization and governance policies to ensure consistency, accuracy, and compliance.
• Provide structured data inputs for AI model training and inference (e.g., LLM applications, recommendation systems), optimizing feature engineering workflows.
Basic qualifications
1+ years of data engineering experience
Experience with data modeling, warehousing, and building ETL pipelines
Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
Experience with one or more scripting languages (e.g., Python, KornShell)
Preferred qualifications
Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, Datastage, etc.
Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
Updated on 2026-01-26