JOB BOARD

Featured: Software Engineer II, Data & ML Platform at Zoro

Location: Chicago

Job Type: Full-Time

Description:

Company Summary:
Zoro is an eCommerce company that's on a mission to help business owners get everything they need to run their businesses and thrive, from office supplies to power tools. But we're way more than a website. We're a team of great people with an award-winning culture. Check us out and see for yourself!

Job Summary:
Zoro is seeking a Software Engineer II to join our Data and Machine Learning Platform team. In this role, you will contribute to the design and development of scalable tools and systems that enable teams across the organization to ingest, create, share, and use data. By helping develop self-service solutions and streamlining data workflows, you will support data pipelines, machine learning workflows, and advanced analytics, empowering teams to make data-driven decisions and work more efficiently.

In this role, you will collaborate with senior engineers to enhance the platform’s ability to support diverse data products with minimal friction. Your contributions will help Zoro’s business partners leverage data more effectively, allowing them to innovate quickly and drive business growth. Additionally, this role is crucial to maintaining the platform’s reliability, scalability, and adaptability, giving you opportunities to develop your skills while working on high-impact projects.

Duties and Responsibilities:
- Contribute to the design, build, and maintenance of scalable, self-service tools and automated workflows that enable partners to easily ingest, create, share, and use data, optimizing data processes and enhancing platform usability.
- Collaborate proactively with cross-functional teams (data engineers, data scientists, software engineers) to identify requirements, align platform solutions with partner needs, and continuously enhance the user experience to support scalable, reliable, and user-friendly data products.
- Implement monitoring and alerting systems to ensure platform reliability, identifying and resolving issues before they impact users.
- Contribute to the continuous improvement of platform infrastructure, ensuring it is robust, scalable, easy for partners to adopt, and able to accommodate evolving business needs.
- Research and integrate new tools and technologies to drive innovation and enhance platform performance. Ensure the platform remains cutting-edge, adaptable, and aligned with industry best practices, while continuously developing your technical skills.
- Assist in creating and updating technical documentation and user guides for the team and platform users, ensuring that platform tools, workflows, and processes are clearly documented for easy adoption and internal knowledge sharing.

Minimum Qualifications:
- 3+ years of experience in platform engineering, software development, or a related field.
- Demonstrated platform thinking: designing scalable, reusable, and user-friendly tools that reduce complexity, remove friction, and empower teams to innovate faster through self-service capabilities.
- Strong proficiency in at least one programming language (preferably Python), with demonstrated ability to write clean, well-documented code.
- Working experience with cloud platforms such as AWS, Azure, or GCP (GCP preferred).
- Understanding of software development and software engineering principles, including version control, modularization, API development, and integrations between front-end and back-end layers.
- Ability to write clear documentation and convey technical concepts to peers.
- Experience using containerization tools like Docker and Kubernetes in production environments.
- Exposure to automated build tools (e.g., Jenkins) and continuous integration / continuous delivery (CI/CD) principles.

Preferred Qualifications:
- Experience with orchestration tools such as Airflow, Argo, Flyte, or Kubeflow.
- Proficiency with infrastructure as code (IaC) technologies and automated infrastructure management/deployment patterns (e.g., Terraform, Ansible, Helm)
- Familiarity with monitoring and logging using tools like DataDog or Prometheus + Grafana.
- Familiarity with batch and streaming data tools (e.g., Spark, Hadoop; Kafka, Pub/Sub) for large-scale data processing and real-time workflows.
- Understanding of machine learning processes, model types, workflows, and terminology.

Company Website: https://www.zoro.com

Job Posted by: Zax Rosenberg

How To Apply:

Apply at https://www.zoro.com/careers/jobs?gh_jid=4414659006