Senior Data Engineer
Mercator AI
About Mercator AI
Mercator AI is a fast-moving, data-first startup transforming how construction companies identify and pursue new business. We aggregate, normalize, and enrich massive volumes of construction data to unlock early signals that fuel high-quality business development decisions. We're proudly early-stage and proudly ambitious; our team thrives on ownership, problem-solving, and speed.
About the Role
We’re looking for a Senior Data Engineer who can own complex problems from beginning to end. This isn’t a role for someone who just pushes data through a pipeline. You’ll be expected to investigate problems, make sound recommendations, and deliver high-quality solutions that directly impact the business. This role sits at the core of our operations, helping us expand and scale our data infrastructure, pipelines, and enrichment strategies.
What You’ll Do
- Design, build, and optimize scalable data pipelines that power our application, delivering impactful insights to our customers.
- Build and manage data infrastructure on Google Cloud Platform (GCP) using services such as BigQuery (or a similar data warehouse) and Cloud Storage.
- Collaborate across teams, supporting our product and data acquisition teams and contractors by answering data-related inquiries, delivering enhancements, and building supporting tools.
Must Haves
- Proven ability to own and solve complex problems end-to-end (not just execute tickets), with an emphasis on customer-facing deliverables and user impact.
- Strong proficiency in Python for data engineering tasks.
- Experience with Apache Airflow for workflow orchestration.
- Hands-on experience with dbt for data modelling and transformations.
- Familiarity with data ingestion from APIs and/or files at scale.
- Familiarity with working within a data-mesh architecture.
- Ability to work independently and communicate progress clearly in a startup environment.
- Startup mindset: comfort with ambiguity, urgency, and rapid iteration.
Nice to Haves
- Experience writing web scrapers using Playwright or BeautifulSoup.
- Familiarity with LLM-based enrichment using tools like Gemini or Google Search APIs.
- Background in geospatial data (real estate, construction tech, etc.).
- Experience at other early-to-mid-stage startups in data-heavy environments.
- Familiarity with data normalization, schema management, and taxonomy design.
- Experience with data quality and monitoring systems such as Great Expectations or similar.
- Experience working with DevOps infrastructure and development tools such as Terraform, ArgoCD, GitHub Actions, or similar.
Why Mercator AI?
- Impact: You’ll directly influence how our product scales and delivers value.
- Ownership: You’ll get real autonomy to solve real problems and be trusted to deliver.
- Collaboration: Small, senior team that values working together, creative thinking, and fast execution.
- Purpose-built tech: We’re doing more than storing data; we’re turning it into insight and advantage.