
Databricks Data Architect – Remote

Data Science & Analytics Job ID: 1771621


Our client is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using a proprietary, award-winning platform, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, they help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance.

They serve the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. The Analytics practice provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, they take an industry-specific approach to transform clients’ decision making and embed analytics more deeply into their business processes. Their global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization.


The Databricks Data Architect will leverage their computer science and design skills to review and analyze the organization’s data infrastructure, plan future databases, and implement solutions to store and manage data for the organization and its users.


  • Develop and optimize ETL pipelines from various data sources using Databricks in the cloud (AWS, Azure, etc.)
  • Implement standardized pipelines with automated testing, Airflow scheduling, Azure DevOps for CI/CD, Terraform for infrastructure as code, and Splunk for monitoring
  • Continuously improve systems through performance enhancements and cost reductions in compute and storage
  • Data processing and API integration: use Spark Structured Streaming for real-time data processing and integrate data outputs with REST APIs
  • Lead data engineering projects to manage and implement data-driven communication systems
  • Apply Scrum and Agile methodologies to coordinate global delivery teams, run scrum ceremonies, manage backlog items, and handle escalations
  • Integrate data across different systems and platforms
  • Communicate clearly, verbally and in writing, in client discussions


  • 8+ years of experience developing and implementing ETL pipelines from various data sources using Databricks in the cloud
  • Experience with insurance-domain data is a must
  • Programming languages – SQL (including materialized views) and Python; Power BI and Salesforce experience
  • Technologies – IaaS (AWS, Azure, or GCP), Databricks platform, Delta Tables, Delta Lake storage, Spark (PySpark, Spark SQL)

Good to Have:

  • Airflow, Splunk, Kubernetes, Git, Azure DevOps
  • Project Management using Agile, Scrum
  • B.S. degree in a data-centric field (Mathematics, Economics, Computer Science, Information Systems, Engineering, or another science field)
  • Excellent communication & leadership skills, with the ability to lead and motivate team members
  • Ability to work independently with some level of ambiguity and juggle multiple demands




Artemis invites you to subscribe to our Job Alerts and “The Hunt” blog for free insights on hiring and career development.

Artemis Referral Bonus – $1000! If you know someone for this job, please join our Referral Bonus Program.
