Portfolio

Camille Lebrun

I'm Camille Lebrun, an Analytics Engineer with over two years of experience turning complex data into strategic assets. Working with tools like dbt and BigQuery, I architect scalable data solutions that drive business growth. My expertise lies in optimizing data workflows, implementing robust data quality frameworks, and delivering actionable insights through modern analytics practices, so organizations can make confident, data-driven decisions.

About me

As an Analytics Engineer, I specialize in building robust data solutions that drive business value. My technical foundation in dbt, SQL, and Python enables me to create scalable data models and automate complex workflows, significantly improving operational efficiency.


My expertise spans the modern data stack, from cloud data warehouses to transformation tools and visualization platforms (Metabase, Lightdash, Looker). I design end-to-end data solutions that empower stakeholders to make data-driven decisions confidently.


As a non-binary professional and advocate for diversity in tech, I bring a unique perspective to problem-solving and team collaboration. I am passionate about fostering inclusive environments where diverse voices contribute to innovation, making the tech industry more accessible and welcoming to underrepresented groups.

Work experience

Analytics Engineer
Hanalytics, Paris, France
Jan. 2024 - Present

I build scalable data solutions across the retail, e-commerce, and media industries, delivering strategic insights through modern data architecture.

Core Responsibilities:

  • Data Architecture: Design and implement end-to-end data pipelines using BigQuery and dbt
  • Quality & Monitoring: Develop data quality checks and KPI alerting systems with Slack integration (see the sketch after this list)
  • Business Solutions: Create reusable dbt packages and transformation models for complex business logic
  • Stakeholder Success: Transform business requirements into efficient technical solutions
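
As a hedged illustration of the alerting bullet above, here is a minimal Python sketch of a KPI alert posted to Slack through an incoming webhook. The webhook URL, metric name, and threshold are placeholders, not the production setup.

  import os
  import requests

  # Placeholder: incoming-webhook URL stored in an environment variable.
  SLACK_WEBHOOK_URL = os.environ["SLACK_WEBHOOK_URL"]

  def send_kpi_alert(metric: str, value: float, threshold: float) -> None:
      """Post a warning to Slack when a KPI falls below its threshold."""
      if value >= threshold:
          return  # KPI is healthy, nothing to report
      text = f":warning: *{metric}* dropped to {value:,.2f} (threshold {threshold:,.2f})"
      resp = requests.post(SLACK_WEBHOOK_URL, json={"text": text}, timeout=10)
      resp.raise_for_status()

  send_kpi_alert("daily_net_sales", 8250.0, threshold=10000.0)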

Key Achievements:

  • Multi-industry Impact: Delivered solutions for retail, e-commerce, and media companies
  • Automation: Implemented automated e-commerce tracking and data pipelines
  • Data Democratization: Built self-service analytics enabling independent business insights

Technical Stack:

BigQuery, dbt Core/Cloud, Kestra, Python, SQL, Looker Studio, Metabase, Slack API

Analytics Engineer Apprentice
Lucky Cart, Paris, France
Nov. 2023 - Present

At Lucky Cart, I leverage a robust data stack—dbt, BigQuery, and Python—to deliver innovative solutions that streamline workflows, enable proactive decision-making, and align data strategies with business objectives.

Core Responsibilities:

  • Data Quality: Maintain rigorous data standards ensuring clean, accurate, and reliable insights
  • Pipeline Optimization: Audit and enhance data pipelines, reducing errors and improving maintainability
  • Automation: Develop end-to-end data applications with Streamlit and Python, reducing manual effort (see the sketch after this list)
  • Stakeholder Support: Enable fast, actionable insights for business teams
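
As referenced in the Automation bullet, a minimal sketch of what such a Streamlit data app can look like; the load_orders helper and its sample data are stand-ins for the real BigQuery-backed sources.

  import pandas as pd
  import streamlit as st

  def load_orders() -> pd.DataFrame:
      """Stand-in for a BigQuery query; returns sample data here."""
      return pd.DataFrame({
          "order_date": pd.date_range("2024-01-01", periods=90, freq="D"),
          "net_sales": [100.0 + i for i in range(90)],
      })

  st.title("Sales overview")
  orders = load_orders()
  days = st.slider("Days to display", min_value=7, max_value=90, value=30)
  recent = orders.tail(days)
  st.metric("Net sales", f"{recent['net_sales'].sum():,.0f}")
  st.line_chart(recent.set_index("order_date")["net_sales"])

Saved as app.py, this runs locally with: streamlit run app.py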

Key Achievements:

  • Model Optimization: Refactored dbt models for improved performance and maintainability
  • Efficiency Gains: Delivered measurable cost savings through pipeline optimization
  • Reporting Automation: Streamlined reporting workflows with automated data models

Technical Stack:

dbt, BigQuery, Python, Google Sheets, Looker Studio, Git

Analytics Engineer Intern
My Job Glasses, Paris, France
June 2023 - Nov. 2023

At My Job Glasses, I leveraged a robust data stack (dbt, BigQuery, and Metabase) to create impactful solutions that drove data-informed decision-making, streamlined workflows, and aligned data strategies with business objectives.

Core Responsibilities:

  • Data Modeling: Designed and implemented scalable dbt models to streamline workflows and deliver actionable insights
  • Dashboard Creation: Developed dynamic Metabase visualizations enabling confident, data-driven decisions
  • Code Quality: Designed dbt macros and enforced best practices for maintainable, efficient data pipelines
  • Stakeholder Collaboration: Partnered with teams to align data solutions with business priorities

Key Achievements:

  • Process Optimization: Created modular, reusable dbt macros enhancing efficiency across data models
  • Strategic Impact: Delivered comprehensive analyses driving operational improvements
  • Customer Success: Designed tailored data solutions improving satisfaction and retention

Technical Stack:

dbt, BigQuery, Metabase, SQL, Git

Data Analyst Intern
Click & Boat, Paris, France
May 2022 - Oct. 2022

At Click&Boat, the world's leading platform for boat rentals connecting private individuals and professional rental companies worldwide, I leveraged a cutting-edge stack—dbt, Snowflake, and Tableau—to enhance data-driven decision-making and optimize the customer experience.

Core Responsibilities:

  • User Segmentation: Designed and implemented segmented user groups for enhanced CRM integration
  • Dashboard Development: Created dynamic Tableau visualizations for customer acquisition and marketing metrics
  • Behavior Analysis: Conducted in-depth studies of user interactions and developed booking prediction models
  • Data Integration: Streamlined CRM data transfer to the warehouse for a unified data ecosystem

Key Achievements:

  • CRM Enhancement: Implemented user segmentation enabling targeted customer engagement
  • Analytics Impact: Developed scoring models for repeat bookings and cross-device interactions (see the sketch after this list)
  • Data Quality: Designed advanced dbt models improving overall data processing efficiency
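
For the scoring bullet above, a hedged sketch of the general approach: a simple classifier that scores users on their likelihood to rebook. The features and data are illustrative, not Click&Boat's actual model.

  import pandas as pd
  from sklearn.linear_model import LogisticRegression

  # Illustrative features; the real feature set is not shown here.
  users = pd.DataFrame({
      "past_bookings":   [0, 1, 3, 0, 2, 5, 1, 0, 4, 2],
      "days_since_last": [400, 30, 12, 220, 45, 7, 90, 300, 20, 60],
      "device_count":    [1, 2, 2, 1, 1, 3, 2, 1, 2, 1],
      "rebooked":        [0, 1, 1, 0, 1, 1, 0, 0, 1, 1],  # label
  })

  X = users[["past_bookings", "days_since_last", "device_count"]]
  model = LogisticRegression().fit(X, users["rebooked"])
  users["rebook_score"] = model.predict_proba(X)[:, 1]  # P(repeat booking)
  print(users[["past_bookings", "rebook_score"]])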

Technical Stack:

dbt, Snowflake, Tableau, SQL, Python

Education

Mastère in Data & AI
HETIC, Montreuil
2022 - 2024

MSc in Structural and Functional Biochemistry
Université Toulouse III - Paul Sabatier, Toulouse
2018 - 2020

Skills

Specialized in data transformation and pipeline automation with modern data stack technologies.

Data Transformation

Expertise in dbt Core/Cloud for data modeling and testing, with focus on incremental models and custom macros development.

Data Integration

Hands-on experience with Airbyte configurations, custom Python extractors for APIs, and implementing data quality checks at ingestion.
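
A minimal sketch of such an extractor, assuming a paginated JSON API; the endpoint, parameters, and "id" field are placeholders.

  from typing import Iterator
  import requests

  def extract_records(base_url: str, page_size: int = 100) -> Iterator[dict]:
      """Yield records page by page from a paginated JSON API."""
      page = 1
      while True:
          resp = requests.get(
              base_url, params={"page": page, "per_page": page_size}, timeout=30
          )
          resp.raise_for_status()
          records = resp.json()
          if not records:
              break  # empty page: no more data
          for record in records:
              # Quality check at ingestion: reject rows missing a primary key.
              if record.get("id") is None:
                  raise ValueError(f"Record without id on page {page}")
              yield record
          page += 1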

Pipeline Orchestration

Practical experience with Kestra for workflow automation, including error handling and retry mechanisms for reliable data pipelines.
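
Kestra configures retries declaratively in its YAML flows; as an illustration of the underlying pattern rather than Kestra's own mechanism, here is retry-with-exponential-backoff in plain Python.

  import time

  def run_with_retries(task, max_attempts: int = 3, base_delay: float = 2.0):
      """Run `task`, retrying on failure and doubling the delay each attempt."""
      for attempt in range(1, max_attempts + 1):
          try:
              return task()
          except Exception as exc:
              if attempt == max_attempts:
                  raise  # out of attempts: surface the error to the orchestrator
              delay = base_delay * 2 ** (attempt - 1)
              print(f"Attempt {attempt} failed ({exc}); retrying in {delay:.0f}s")
              time.sleep(delay)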

Data Quality

Implementation of dbt tests, data validation frameworks, and automated monitoring for data freshness and accuracy.
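
A sketch of an automated freshness check against BigQuery, complementing dbt's built-in source freshness checks; the project, table, and timestamp column names are placeholders, and a TIMESTAMP column is assumed.

  from datetime import datetime, timedelta, timezone
  from google.cloud import bigquery

  def check_freshness(table: str, ts_column: str, max_lag_hours: int = 24) -> None:
      """Fail when the newest row in `table` is older than the allowed lag."""
      client = bigquery.Client()
      rows = client.query(f"SELECT MAX({ts_column}) AS latest FROM `{table}`").result()
      latest = next(iter(rows)).latest
      if latest < datetime.now(timezone.utc) - timedelta(hours=max_lag_hours):
          raise RuntimeError(f"{table} is stale: latest row at {latest}")

  check_freshness("my_project.analytics.orders", "loaded_at")  # placeholder names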

Business Intelligence

Development of self-service analytics solutions using Looker Studio and Metabase, enabling stakeholders to access insights independently.

Data Modeling

Design of dimensional models and fact tables, with focus on performance optimization and maintainable code structure.

Collaboration

Experience in cross-functional teams, translating business requirements into technical specifications and maintaining clear documentation.

Modern Data Stack

Practical experience with BigQuery and Snowflake, including cost optimization and performance tuning.
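
One concrete cost-optimization habit, sketched below: a BigQuery dry run that estimates bytes scanned before a query is executed. The table name is a placeholder, and the price per TiB assumes current on-demand list pricing.

  from google.cloud import bigquery

  client = bigquery.Client()
  config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
  job = client.query(
      "SELECT * FROM `my_project.analytics.orders`",  # placeholder query
      job_config=config,
  )
  gib = job.total_bytes_processed / 1024**3
  # Assumes $6.25 per TiB on-demand pricing; adjust to your contract.
  print(f"Would scan {gib:.2f} GiB (~${gib / 1024 * 6.25:.4f})")

Wiring this into code review or CI makes expensive queries visible before they reach production.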

Process Improvement

Track record of identifying and implementing automation opportunities, reducing manual work and improving data reliability.

Explore my projects

Discover how I've leveraged my skills at Hanalytics to create impactful data solutions.

Modern Data Foundation & Orchestration

Description:

Built a scalable data foundation with Kestra orchestration, consolidating multiple data sources into a single source of truth. Implemented end-to-end data pipelines using medallion architecture, enabling comprehensive funnel analysis and business monitoring.

Technical Achievements:

Developed modular data models with marts layer for analytics exposure. Implemented Kestra workflows for reliable pipeline orchestration, with automated data quality testing and proactive monitoring.

Business Impact:

Enabled comprehensive B2B and B2C activity monitoring through systematic tracking of key metrics: net sales, bundle performance, upsell rates, and customer acquisition costs. Implemented cohort analysis for churn prediction and customer behavior insights. Automated reporting workflows significantly reduced manual effort while providing actionable insights for strategic decision-making.
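
As a hedged illustration of the cohort analysis mentioned above, a pandas sketch that pivots order events into a monthly retention table; the input columns and data are placeholders.

  import pandas as pd

  events = pd.DataFrame({
      "user_id": [1, 1, 1, 2, 2, 3, 3, 3],
      "order_date": pd.to_datetime([
          "2024-01-05", "2024-02-10", "2024-03-02", "2024-01-20",
          "2024-03-15", "2024-02-03", "2024-02-25", "2024-04-01",
      ]),
  })

  events["order_month"] = events["order_date"].dt.to_period("M")
  events["cohort"] = events.groupby("user_id")["order_month"].transform("min")
  events["months_since"] = (events["order_month"] - events["cohort"]).apply(lambda d: d.n)

  retention = events.pivot_table(
      index="cohort", columns="months_since", values="user_id", aggfunc="nunique"
  )
  print(retention)  # rows: first-order cohort; columns: months since first order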

Data Monitoring
Slack-Based Analytics Alert System

Description:

Developed a comprehensive Slack monitoring system after analyzing business and technical alerting needs. Created a solution that bridges the gap between data insights and team communication.

Key Features:

Implemented customizable alerts for business KPIs, data quality issues, and pipeline status. Built user-friendly commands for data access and automated daily metrics reporting, significantly improving data accessibility.
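
A sketch of the decision logic behind such alerts, kept deliberately simple: flag a daily metric that deviates more than three standard deviations from its trailing history. The thresholds and window length are illustrative.

  import statistics

  def is_anomalous(history: list[float], today: float, z_threshold: float = 3.0) -> bool:
      """Return True when `today` falls outside the normal band of `history`."""
      mean = statistics.fmean(history)
      std = statistics.stdev(history)
      if std == 0:
          return today != mean
      return abs(today - mean) / std > z_threshold

  recent_net_sales = [980, 1020, 1005, 995, 1010, 990, 1000] * 4  # 28 days
  if is_anomalous(recent_net_sales, today=650.0):
      print("Trigger Slack alert: net_sales outside expected range")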

Impact:

Reduced time to detect data issues by 70% and enabled immediate access to key metrics. Improved cross-team collaboration through a centralized data communication channel.

dbt Development
Reusable Analytics Package

Description:

Created a reusable dbt package from our internal analytics product, enabling rapid deployment and standardization across client implementations.

Technical Features:

Developed modular transformations with configurable parameters, comprehensive testing suite, and detailed documentation. Implemented best practices for maintainability and scalability.

Business Value:

Reduced client implementation time by 60% through standardized analytics models. Ensured consistent data modeling practices across all client deployments.

Latest Articles


Check out my latest articles about Data Engineering and Analytics on Medium.