Featured: Sr. Data Engineer at Zoro

Location: Chicago

Job Type: Full-Time


Company Summary:
Zoro offers millions of products — an endless aisle with everything you need to run your business. We offer fast and free shipping, no-hassle returns, and exceptional customer service. We've grown quickly and continue to do so while aggressively growing our revenue. We are excited to be part of an award-winning culture — we have been named a Great Place to Work for multiple years in a row, among other local and national accolades. We think Zoro is a pretty amazing place to work and grow, and think you will too!

Primary Function:
The Senior Data Engineer will collaborate within Data Engineering and with other IT groups, business partners, and external service providers, playing a key role in building, maintaining, and supporting our new analytics platform, the "Zoro Data Platform" (ZDP). They will also provide direction to, and oversee the activities of, various service providers and junior engineers.

Job Responsibilities Include:
*Primary responsibility for Zoro Data Platform (ZDP):
-Ongoing data model design & development
-Conceptual, logical and physical design (database, ODS, aggregates, etc.)
-Database administration
-Capacity analysis & management

*MDM Lead
-Identify key domains that would benefit from an MDM approach (e.g. Product, Customer), along with the best data sources & necessary attributes, and integrate them into the ZDP
-Define governance strategy with associated roles & responsibilities (e.g. Data Steward, Quality Specialist)
-Define & implement Policies & SOPs
-Monitor operations, develop and report quality metrics to key stakeholders

*Data Pipeline development:
-Participate in Requirements Gathering: work with key business partner groups (e.g. Product Management) and other Data Engineering personnel to understand department-level data requirements for the ZDP
-Design Data Pipelines: work with other Data Engineering personnel on an overall design for flowing data from various internal and external sources into the ZDP
-Build Data Pipelines: leverage standard toolset and develop ETL/ELT code to move data from various internal and external sources into the ZDP

*Support Data Quality Program: work with Data QA Engineer to identify automated QA checks and associated monitoring & alerting to ensure the ZDP maintains consistently high-quality data
*Support Operations: triage alerts channeled to you and remediate as necessary
*Technical Documentation: leverage the templates provided and create clear, simple, and comprehensive documentation for your development

*Key contributor to defining, implementing and supporting:
-Data Services
-Data Dictionary
-Tool Standards
-Best Practices
-Data Lineage
-User Training

*Define Best Practices and Guidelines for other Data Engineering team members
*Lead the team in developing the new technical skills necessary for a cloud-native data engineering platform
*Explore new technologies
*Share and document learnings
*Productionalize proofs of concept

Skills & Qualifications:
-Expert-level Python
-Expert-level data modeler (back-end and semantic layer)
-Expert-level ETL/ELT designer/developer
-Strong database administration and operations experience & proficiency
-Strong SQL
-Structured & unstructured data expertise
-Cloud environment development & operations experience (Google Cloud Platform/GCP experience a plus)
-Excellent verbal and written communication skills
-Strong team player
-Working knowledge of eCommerce data a plus
-Prior experience with Git, Terraform, GCP Deployment Manager, CI/CD, Docker, Kubernetes, Apache Airflow, Apache Beam, or Apache Spark a plus

Success Criteria:
-Expert knowledge of data modeling concepts and data relationships
-Advanced analytical thinking and problem-solving skills
-Solid experience in architecture, advanced reporting and dashboards
-Strong SQL skills and experience with performance tuning are required
-“Get it done” attitude with a high degree of autonomy, ownership and responsibility
-Superior communication and business-technical interaction skills

To qualify, you must possess the following skills:
-Bachelor’s degree in computer science, management information systems, or a related discipline
-10+ years hands-on data warehouse / data modeling experience
-10+ years hands-on database admin/ops experience
-10+ years hands-on ETL/ELT design/development experience
-Key resource on team(s) that have delivered successful enterprise-level analytics platforms

Company Website:

Job Posted by: Bernard Estanislao

How To Apply:

Please apply by accessing this link: