Diaconia

Creative IT Solutions | An Officially Great Place To Work!

Senior Data/Systems Analyst

Systems Engineer | Full Time | Remote | Team: 51-200 | Founded: 2020 | H1B: No Sponsorship

Location

United States

Posted

6 days ago

Salary

$130K - $150K / year

Bachelor's Degree | 8 yrs exp | English | Amazon Redshift | AWS | Cloud | ETL | Java | Kafka | Scala | Spark | SQL

Job Description

  • Lead the design, implementation, and management of data integration and data platform solutions with a focus on performance, scalability, and reliability
  • Establish and enforce best practices for data integration, ETL/ELT, automation, monitoring, and data quality
  • Design and maintain cloud and/or on-premise data warehouses, data lakes, and data access APIs
  • Oversee data modeling efforts, including dimensional modeling, denormalized structures, and OLAP concepts
  • Drive delivery of data engineering initiatives, including structured and unstructured data pipelines
  • Apply data extraction, transformation, and loading techniques to integrate large datasets from multiple sources
  • Support analytical use cases by enabling data availability, quality controls, and reporting frameworks
  • Provide technical leadership, mentorship, and direction to data engineering teams
  • Provide required technical documentation for all relevant deliverables

Job Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field
  • 8+ years of experience in data engineering, including leading teams and delivering scalable data platforms
  • Strong experience designing and developing ETL/ELT pipelines
  • Hands-on experience with data modeling, relational databases, and application development
  • Expert-level SQL skills
  • Strong problem-solving, analytical, and troubleshooting abilities
  • Excellent communication and organizational skills with the ability to manage multiple priorities
  • U.S. Citizenship is required by the Federal Client
  • Must have, or be able to obtain, a DoD Public Trust clearance
  • 3+ years of AWS data engineering experience (e.g., S3, Redshift, Kafka, and related services) (preferred)
  • Experience building data pipelines using Spark with Java or Scala (preferred)
  • Strong Java development experience in enterprise environments (preferred)
  • Familiarity with DevOps tools such as Git, Artifactory, and CI/CD pipelines (preferred)
  • Expert understanding of data movement, data exchange formats, and database design concepts (preferred)