Virtustant

Databricks Data Engineer



Date Posted: Aug 29, 2025

Work Type: Full-time

Job Role: Data Engineering

Salary

Description

Job Title: Databricks Data Engineer

 

About the Company:

Our Client offers a suite of innovative artificial intelligence solutions designed to revolutionize how businesses tackle challenges. They specialize in developing and managing workforce-focused online education programs in partnership with over 100 public and private universities.

 

Job Description:

Our Client is seeking a Databricks Data Engineer to support their data engineering and analytics initiatives. The ideal candidate will have experience building scalable data pipelines and integrating Databricks with cloud services, with a focus on AI-driven projects.

 

Responsibilities:

• Design and deliver scalable, cloud-based data solutions in collaboration with customers.

• Execute complex ad-hoc queries using Databricks SQL.

• Develop robust data transformation workflows using PySpark and SQL (a brief illustrative sketch follows this list).

• Build ETL/ELT workflows with AWS and Azure services.

• Optimize Spark jobs for performance and cost efficiency.

• Collaborate with data scientists and ML engineers to support AI and machine learning use cases.

• Implement CI/CD pipelines for Databricks jobs.

• Ensure data quality, lineage, and compliance using relevant tools.

• Troubleshoot and maintain production data pipelines.

• Mentor team members and share best practices across teams.
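
To illustrate the kind of PySpark and Databricks SQL work described above, here is a minimal, hypothetical sketch of a transformation workflow. The table names (raw_events, curated.daily_events) and columns are placeholders for illustration only, not details from the client's environment.

```python
# Minimal, hypothetical sketch of a Databricks transformation workflow.
# Table and column names are placeholders, not real client assets.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks/jobs

# Read raw data, filter out incomplete records, and aggregate per day.
raw = spark.read.table("raw_events")

daily = (
    raw.where(F.col("event_ts").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Persist the curated result as a table for downstream analytics.
daily.write.mode("overwrite").saveAsTable("curated.daily_events")

# The same aggregation could also be expressed as a Databricks SQL query:
spark.sql("""
    SELECT to_date(event_ts) AS event_date, event_type, COUNT(*) AS event_count
    FROM raw_events
    WHERE event_ts IS NOT NULL
    GROUP BY to_date(event_ts), event_type
""")
```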

 

Required Experience and Qualifications:

• Bachelor’s degree in Computer Science, Engineering, or related field.

• 5+ years of experience in software/data engineering.

• 2+ years of experience with Databricks and Apache Spark.

• Strong proficiency in Python, SQL, and PySpark.

• Deep understanding of AWS and Azure cloud services.

 

Preferred Skills:

• Familiarity with the Databricks Lakehouse, Databricks Workflows, and Databricks SQL.

• Solid grasp of data lakehouse and data warehousing architecture.

• Experience supporting AI/ML workflows.

• Familiarity with infrastructure-as-code tools like Terraform or CloudFormation.

• Excellent analytical and troubleshooting skills.

 

Personality:

• Strong collaboration skills for interfacing with both technical and non-technical stakeholders.

• Clear communication skills with strong documentation habits.

• Comfortable leading discussions and mentoring others.

 

Software & Tools:

• Databricks.

• AWS and Azure services.

• GitHub Actions, Azure DevOps, or Jenkins.
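
For context only: a deployment pipeline built with any of the CI/CD tools above would typically end by triggering or validating a Databricks job. A minimal sketch using the databricks-sdk Python package (an assumption; the client may use a different mechanism) might look like this, with the job ID and credentials as placeholders:

```python
# Hypothetical CI/CD step: trigger an existing Databricks job and wait for it.
# Assumes the databricks-sdk package and DATABRICKS_HOST / DATABRICKS_TOKEN
# environment variables; the job ID below is a placeholder.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()                 # reads credentials from the environment
run = w.jobs.run_now(job_id=123)      # start the job (placeholder ID)
result = run.result()                 # block until the run reaches a terminal state
print(result.state.result_state)      # e.g. SUCCESS or FAILED
```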

 

English Level:

C1, C2, or Native.

 

Schedule:

9-5 EST.

 

Salary and Benefits:

Payment in USD or local currency, according to the candidate's preference.