Contribute to the team’s deliverables and address scalability and latency challenges
You will need a product-focused mindset: understand requirements and deliver systems that can grow and extend to accommodate those needs
Diagnose and debug distributed systems, develop and document technical solutions, and sequence work to make fast, iterative deliveries and improvements
Envision and complete projects to improve user experience
Deliver data-driven approaches, system designs, and software implementations
Review code developed by other developers and provide feedback to ensure best practices (e.g., style guidelines, check-in practices, accuracy, testability, and efficiency)
Automate cloud infrastructure, services, and observability
Develop CI/CD pipelines and testing automation
2-5 years of design & development experience
Experience with Java and the AWS platform
Experience with API development best practices
Experience with open-source tools and platforms such as GitHub, Redis, Puppet, Terraform, Swagger, Spinnaker, Docker, and Kubernetes
Experience developing, debugging, and operating resilient distributed systems that run across hundreds of compute nodes in multiple data centers.
Bachelor’s degree in Computer Science, Engineering or related field, or equivalent training, fellowship or work experience
Experience with Spark, Hive and/or Snowflake
Experience with Hadoop 2.0, Airflow, and Looker
Experience with Python
To apply for this job, please visit autodesk.wd1.myworkdayjobs.com.