Senior Data Engineer

Data Engineer · Full Time · Remote · Team 11–50 · Since 2021 · No H1B Sponsorship

Location: United States

Posted: 3 days ago

Salary: $154K–$196K / year

dbt, SQL, Snowflake, Apache Airflow, Python, Terraform, AWS Lambda, Git, CI/CD, Data Modeling, ETL, ELT, Data Warehousing, Data Pipelines, DDL, RBAC, Cost Optimization, Data Quality, GitHub Actions, S3, CloudFormation, Serverless

Job Description

Local or 100% Remote

About Point

✨ Real Impact, Real People: Our mission at Point is to make homeownership more valuable and accessible. Your work directly helps homeowners access their wealth, achieve financial flexibility, and realize life-changing goals.

✨ Funding: With over $175M raised from top investors like Andreessen Horowitz, WestCap, Greylock, and Prudential, we’re scaling fast! You have the opportunity to join us at a pivotal stage.

✨ Game-changing Product: We're building a category-defining company in home equity. We've earned a 4.7 Trustpilot rating and an A+ from the BBB, a testament to the value we provide to our 20,000+ customers.

✨ Great Place to Work: Our employees love working here! We are a Certified Great Place to Work and one of Fortune's Best Workplaces in the Bay Area.

✨ Remote First Culture, Genuine Connection: Work from anywhere in the U.S., while staying closely connected through virtual collaboration, team gatherings, and a people-first culture.

 

About the role

Point is looking for a Senior Data Engineer to join a lean, high-impact data team that powers analytics for 15+ business functions across a home equity investment company. You'll own the development and maintenance of a 1,600+ model dbt platform, build and optimize ETL pipelines in Airflow, manage Snowflake infrastructure (DDL deployments, permissioning, cost optimization), and ensure the reliability of 40+ data source integrations. This is a hands-on engineering role where your work directly shapes how the business makes decisions, and you'll also have the opportunity to contribute to emerging AI initiatives as the team expands into Cortex AI and LLM-powered workflows.

 

Your responsibilities

  • Develop, maintain, and optimize dbt models across 20+ data marts serving business functions including marketing, production, servicing, finance, compliance, investor operations, and data science
  • Build and maintain ETL/ELT pipelines using Airflow (Astronomer), including custom operators, hooks, and sensors for data ingestion from 40+ sources
  • Manage Snowflake DDL deployments via schemachange — databases, schemas, warehouses, users, roles, network policies, integrations, and UDFs
  • Maintain and extend Terraform-managed RBAC grants (200+ .tf files), ensuring secure, auditable access provisioning across the Snowflake environment
  • Drive Snowflake cost optimization — warehouse sizing, multi-clustering configuration, query performance tuning, and monitoring to target 15–25% cost reduction
  • Support CI/CD pipelines for dbt Cloud, Astronomer deployments, and automated data quality testing
  • Develop and maintain AWS Lambda functions supporting data integrations, APIs, and serverless data processing workflows
  • Perform code reviews, enforce engineering standards, and contribute to data model documentation (YAML configs, markdown docs) across the platform
  • Collaborate with analysts and data scientists to build and maintain data marts, feature engineering tables, and semantic layer metrics that enable self-service analytics
  • Contribute to the team's AI/ML initiatives — including Cortex AI semantic models, data preparation for LLM workflows, and infrastructure supporting data science deployment pipelines 
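As a rough illustration of the warehouse cost-optimization work described above, the sketch below estimates monthly Snowflake warehouse spend and the savings from right-sizing. The credits-per-hour figures follow Snowflake's standard doubling by warehouse size; the price per credit varies by edition and region, so the `price_per_credit` default here is an assumption, not a quoted rate.

```python
# Hypothetical sketch: estimating Snowflake warehouse spend to frame a
# cost-reduction target like the 15-25% mentioned above.
# Credits/hour double with each warehouse size (standard Snowflake scaling);
# price per credit is an assumed placeholder and varies by edition/region.

CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}

def monthly_cost(size: str, active_hours_per_day: float,
                 price_per_credit: float = 3.00, days: int = 30) -> float:
    """Estimated monthly dollar cost for one warehouse."""
    return CREDITS_PER_HOUR[size] * active_hours_per_day * days * price_per_credit

def savings_from_downsize(old_size: str, new_size: str,
                          active_hours_per_day: float,
                          price_per_credit: float = 3.00):
    """Dollar savings and fractional reduction from resizing a warehouse."""
    old = monthly_cost(old_size, active_hours_per_day, price_per_credit)
    new = monthly_cost(new_size, active_hours_per_day, price_per_credit)
    return old - new, (old - new) / old

# Example: downsizing an underutilized L warehouse to M halves compute
# cost -- a 50% reduction, well past a 25% target.
saved, pct = savings_from_downsize("L", "M", active_hours_per_day=8)
```

In practice, resizing is only one lever; auto-suspend timeouts, multi-cluster scaling policies, and query tuning contribute as well, but arithmetic like this is how a percentage target gets translated into concrete warehouse changes.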

 

About you

  • 5+ years of experience in data engineering or analytics engineering, with significant hands-on dbt and SQL development experience
  • Deep Snowflake expertise — DDL management, role-based access control, warehouse configuration, performance tuning, and cost optimization
  • Strong experience with Apache Airflow (or similar orchestration tools) — building DAGs, custom operators, scheduling, and failure handling
  • Proficiency in Python for data pipeline development, scripting, Lambda functions, and CI/CD automation
  • Experience with Infrastructure as Code — Terraform for access management, schemachange or Flyway for database migrations, version-controlled deployments
  • Solid understanding of data modeling — dimensional modeling, fact/dimension tables, data mart design, and enterprise data architecture
  • AWS experience — Lambda, S3, SAM/CloudFormation, and serverless architecture patterns
  • Strong Git workflows — branching strategies, pull request reviews, CI/CD integration (GitHub Actions or similar)
  • Excellent troubleshooting skills — ability to diagnose pipeline failures, data quality issues, and performance bottlenecks across a complex platform
  • Clear communicator — can work with non-technical stakeholders to translate business requirements into data models and explain technical tradeoffs

 

Our benefits 

  • Generous health benefits: We provide comprehensive medical, dental, and vision plans with options for flexible spending accounts (FSA) and health savings accounts (HSA).
  • Unlimited paid time off: Recharge with unlimited paid time off and 10 company holidays. 
  • Flexible remote and onsite work: Our teams work from many different locations and time zones. We support fully remote work and also have an amazing in-person environment in our downtown Palo Alto, CA HQ. 
  • Fully paid parental leave: Point will supplement state Paid Family Leave (PFL) so employees receive 100% of their regular base pay, plus two additional weeks of fully paid leave after state PFL ends. In states without PFL, Point offers up to 8 weeks of paid parental leave. In addition, employees also receive 4 weeks of fully paid transition time, during which you may work 2–3 days per week while receiving full base pay.
  • Equity: We offer meaningful equity because we believe in sharing the value you help create. Your contributions directly impact our growth, and your equity gives you a stake in our future success. 
  • Financial wellness: We provide 401(k) retirement plans for employees, as well as guaranteed life insurance and short- and long-term disability coverage. 
  • Extra work/life benefits: We provide monthly stipends for internet, mobile plans, wellness perks, and a one-time home office reimbursement. 

 

Compensation at Point is determined by skills, experience, and geographic location. Point has identified the expected annual base salary for this role and level based on market data, grouped into tiers (Tier | Locations | Salary Range):

  • Tier 1 | San Francisco Bay Area, New York, and Seattle | $177,650 - $196,350 
  • Tier 2 | Chicago, Austin, Denver, Boston, Washington DC, San Diego, Portland, Sacramento, Philadelphia, Los Angeles & Santa Barbara | $161,500 - $178,500
  • Tier 3 | All other US metro areas | $154,850 - $171,150

This does not include any other potential components of the compensation package, including equity, benefits, and perks outlined above. At the launch of each position, we benchmark compensation to the appropriate role and level utilizing competitive compensation data from various data sources as references. At the offer stage, we use the signal we received from our interviews, coupled with your experience, location, and other job-related factors, to determine final compensation.

 

Location Requirement: This is a remote position. However, candidates must reside in one of Point’s states of operation: AL, AZ, AR, CA, CO, CT, DC, FL, GA, IL, KS, KY, MA, MD, MI, MN, MO, NH, NV, NJ, NY, NC, OH, OR, PA, SC, TN, TX, UT, VA, WA, WI.

 

Point is proud to be an equal-opportunity employer. We provide employment opportunities regardless of age, race, color, ancestry, national origin, religion, disability, sex, gender identity or expression, sexual orientation, veteran status, or any other protected class. Each individual at Point brings their own perspectives, work experiences, lifestyles, and cultures with them, and we believe that a more diverse team creates more innovative products, provides better services to customers, and helps us all grow and learn. 
