Company:      Siemens

Website:      www.siemens.com

Eligibility:  Computer Science Degree

Experience:   3 - 8 yrs

Location:     Pune

Job Role:     DevOps Engineer – SBS Data Analytics

JOB SUMMARY:

Company Profile:

Siemens is a global powerhouse focusing on the areas of electrification, automation and digitalization. One of the world’s largest producers of energy-efficient, resource-saving technologies, Siemens is a leading supplier of systems for power generation and transmission as well as medical diagnosis.

Job Description:
1. Take responsibility and ownership of the end-to-end operation of a data analytics environment. Build and maintain the platform that hosts our data-driven services, allowing us to process and utilize the wealth of data we have. It is mainly a back-end solution, with a limited front-end side as well.

2. Integrate data analytics services into global offerings by packaging Python code produced by data scientists so that it is compatible with our R&D infrastructure and with the target products that will consume the service (see the packaging sketch after this list).

3. Track the implementation, drive the integration across various departments, and monitor the usage of data analytics products.

4. The core of this job is software engineering; however, the role sits within a global data science environment.

5. Work closely with global, cross-functional teams, including Architects, Data Scientists, DevOps engineers and Product Managers, to understand and implement the solution requirements.

6. Make sure the numerous APIs are well maintained, and test and validate the environment with end users.
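As a rough illustration of what packaging a data scientist's Python code behind a maintainable API can involve (the posting does not describe Siemens' actual infrastructure; the predict function, the /predict route and the port below are hypothetical), a minimal Flask-based sketch might look like this:

    # Minimal sketch: exposing a data scientist's Python function as a service.
    # The predict() logic, the /predict route and port 8080 are hypothetical;
    # the posting does not specify Siemens' actual R&D infrastructure or APIs.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def predict(features):
        # Placeholder for model code handed over by a data scientist.
        return {"score": sum(features) / max(len(features), 1)}

    @app.route("/predict", methods=["POST"])
    def predict_endpoint():
        payload = request.get_json(force=True)
        return jsonify(predict(payload.get("features", [])))

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)

A container image built around such a service would then flow through the CI/CD and orchestration tooling listed under Skills below.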

Candidate Profile:
Education:
A university degree in Computer Science or a comparable education; we are flexible as long as a high quality of code is ensured.

Experience:
3 to 8 years.

Skills:
1. Our team needs someone who can hit the ground running, bringing along a few years of professional experience in software engineering, ideally with Python.

2. Proven experience with common DevOps practices such as CI/CD pipelines (GitLab), container orchestration (Docker, Kubernetes, Helm) and infrastructure as code (Terraform).

3. Along with hands-on DevOps experience, you must also have experience in the domain of data analytics or MLOps.

4. Familiarity with AWS services beyond EC2 (e.g., Fargate, Batch, RDS, SageMaker) is expected, as you will be working with these from day one.

5. Initial experience with, or willingness to explore, big data pipeline and compute tooling such as Luigi, Airflow, Beam, Spark and Databricks (a minimal Airflow sketch follows this list). Knowledge of agile software development processes is also highly valued.
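Purely as an illustration of the pipeline tooling named above (the team's actual pipelines are not described in this posting; the DAG id, task names and schedule are hypothetical), a minimal Airflow 2.x DAG wiring two tasks together could look like this:

    # Minimal sketch of an Airflow 2.x DAG; the DAG id, task names and
    # daily schedule are hypothetical examples, not Siemens' pipelines.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pulling raw data")

    def transform():
        print("cleaning and aggregating")

    with DAG(
        dag_id="example_analytics_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        extract_task >> transform_task

Comparable pipelines can be expressed in Luigi, Beam or Spark; Airflow is shown here only because its DAGs are plain Python.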