Data Engineer III
Location
United States
Posted
15 hours ago
Salary
$120,000 - $150,000
Job Description
Role Description
We are seeking an analytical, experienced, and solution-oriented Data Engineer III. The Data Engineer III is responsible for developing and optimizing the company’s data pipelines, integrations, and reporting solutions to ensure efficient and reliable data operations. This role requires independent problem-solving, proactive improvement of processes, and collaboration across teams to deliver impactful data solutions.
- Design, build, and maintain scalable data pipelines and ETL workflows.
- Write advanced SQL queries and implement optimization techniques for performance.
- Leverage Microsoft Azure and Fabric, Spark, and Python to automate complex workflows.
- Lead engineering efforts on machine learning and artificial intelligence projects.
- Develop and maintain robust web APIs (SOAP, REST) to support seamless data integration across internal and external systems.
- Collaborate with external vendors to ensure the integrity and functionality of integrations.
- Monitor and troubleshoot data processes, ensuring high availability and minimal downtime.
- Proactively identify and resolve bottlenecks or inefficiencies in data pipelines and integrations.
- Manage and prioritize work using the ticketing system while maintaining regular communication in stand-ups and stakeholder meetings.
- Conduct code reviews to ensure adherence to best practices and high-quality deliverables.
- Contribute to technical documentation for processes, tools, and workflows.
- Provide tier 3 support to end users of integrated systems, such as reporting and accounting platforms.
- Partner with business owners to identify areas for improvement and gather requirements.
- Mentor other team members by sharing knowledge, conducting training sessions, and providing guidance on best practices.
- Take ownership of complex projects, ensuring timely delivery and alignment with business objectives.
Qualifications
- Bachelor’s degree in Computer Science, Information Systems, or another relevant field, or equivalent professional experience.
- Expert knowledge of relevant languages and tools, such as SQL, dbt, Python, and/or C#.
- Expert knowledge of at least one data pipeline orchestration tool, such as Azure Data Factory.
- Expert understanding of data modeling and ETL concepts.
- Experience with version control systems (e.g., Git) and best practices.
- Strong problem-solving skills and the ability to work independently on complex tasks.
Typical Behaviors & Working Style
- Versatile and adaptable, flexing to meet the needs of the situation.
- Maintains a people orientation, even if reserved by nature. Must be helpful and service-oriented, with a strong focus on repeatable, high-quality results.
- Decision-making is collaborative, but meticulous, requiring consideration of facts, established procedures, and proven processes.
- Communicates based on the task or technical needs at hand, defining clear team roles.
- Leads according to specialty or expertise. Will act with conviction to ensure quality standards, rarely delegating.
Preferred Working Environment & Job Characteristics
- A complex, senior-level data engineering environment with high expectations for independence and ownership.
- A high-standards, reliability-focused setting where availability, performance, and data integrity are critical.
- A fast-paced, multi-priority workload balancing delivery, operational support, and cross-team collaboration.
What success in this role looks like
- Data pipelines and integrations operate reliably at scale, supporting business-critical systems and analytics.
- Operational issues are resolved proactively, with minimal downtime and continuous improvement.
- The data engineering function grows stronger over time, through high-quality delivery, documentation, and ownership of complex initiatives.
Working Conditions
This is a work-from-home position. All required technology will be provided.
Training
- Orientation via a mix of live remote and pre-recorded video sessions.
- IT security training.
- Internal development process and procedures.
- Company-approved AI technology.
Salary
$120,000 - $150,000. Actual compensation within this range will be determined by multiple factors, including candidate experience and expertise.