Databricks Engineer at Clear Tech
Job Description
At Clear Tech, we specialize in Data, Analytics, and Artificial Intelligence, helping companies around the world transform their data into real business value. Our team combines highly skilled talent in Latin America with global best practices across cloud technologies, data engineering, data science, and business intelligence solutions. We work with agile methodologies to deliver end-to-end projects, staff augmentation models, and training programs that prepare professionals for today’s market challenges.
Databricks Engineer
We are looking for a highly skilled Databricks Engineer with strong analytical thinking, hands-on experience with Databricks Unity Catalog, and a deep understanding of business metrics and dimensional modeling. This role focuses on building scalable, well-governed data architectures on Databricks for business intelligence and advanced analytics. The ideal candidate combines engineering rigor with business understanding: building robust data pipelines while translating complex data into meaningful metrics and dashboards for non-technical stakeholders. You will play a key role in enabling data-driven decision-making across the organization.
Key Responsibilities:
• Design and implement data pipelines using Databricks, PySpark, and Delta Lake, aligned with medallion architecture principles.
• Work closely with business stakeholders and analysts to understand KPIs and enable analytical reporting via curated data models.
• Model and structure data using dimensional modeling techniques (e.g., star schemas, fact and dimension tables) to support BI tools like Tableau.
• Ensure data quality, consistency, and performance across cloud data platforms.
• Orchestrate data workflows with tools like Airflow, while applying governance best practices via Unity Catalog.
• Leverage SQL, Python (Pandas), and Databricks notebooks for ETL/ELT transformation and feature engineering.
Required Qualifications:
• Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or related field.
• 3-5 years of professional experience in data engineering, data analytics, or data modeling roles.
• Strong analytical mindset with the ability to understand business metrics and KPIs.
• Proficiency in Databricks, PySpark, SQL, and Delta Lake.
• Experience building data structures for business intelligence and analytics tools like Tableau.
• Practical knowledge of dimensional modeling, data warehouse concepts, and Lakehouse architecture.
• Working knowledge of Airflow and exposure to Scala is a plus.
• Solid understanding of distributed computing, cloud data platforms (AWS, Azure, or GCP), and big data frameworks.
• Strong communication skills to collaborate effectively with both technical and non-technical stakeholders.
Preferred Qualifications:
• Databricks certifications (e.g., Databricks Certified Associate Developer for Apache Spark).
• Experience in machine learning workflows or MLOps on Databricks.
• Familiarity with CI/CD pipelines, DevOps practices, and Agile development methodologies.
• Experience integrating data platforms with BI tools and supporting data self-service environments.
We offer a contractor model with USD compensation and a strong benefits package, including paid time off, US holidays, a welcome equipment bonus, company-provided laptop, performance and compensation reviews, certifications, continuous learning opportunities, referral program, parental leave, and People Care support.
We look forward to receiving your application!
