Databricks Solution Architect
Are you the kind of architect who doesn't just design systems, but shapes data strategy at the highest level?
We're looking for a seasoned Databricks Solution Architect to embed within our most strategic client accounts and become the trusted technical authority behind their data, analytics, and AI transformation. This is a highly visible, high-impact role where you'll sit at the intersection of business vision and technical execution, ensuring clients unlock the full power of the Databricks platform.
You won’t be operating from the sidelines. You’ll be embedded, influential, and instrumental in driving architecture decisions, implementation excellence, and long-term platform adoption across enterprise-scale environments.
What You’ll Own
Your mission: ensure the end-to-end technical success of your client’s data initiatives — from ingestion to advanced analytics and machine learning deployment.
Architect for Scale & Performance
- Design and lead scalable, secure, high-performance architectures on the Databricks Lakehouse Platform
- Leverage core capabilities such as Delta Lake, Unity Catalog, and serverless compute
- Build future-ready data foundations that support both analytics and AI use cases
Deliver Hands-On Technical Excellence
- Guide development of advanced data engineering pipelines using Apache Spark (PySpark, Spark SQL, Scala)
- Set the bar for code quality, performance optimization, and architectural best practices
- Diagnose and resolve complex performance bottlenecks in large-scale distributed systems
Be the Strategic Technical Advisor
- Serve as the trusted advisor to engineering, data science, and infrastructure leaders
- Drive best practices in data governance, security, reliability, and MLOps
- Translate executive-level business goals into robust, production-grade solutions
Enable & Elevate Client Teams
- Deliver workshops and tailored enablement sessions
- Mentor client engineers to accelerate platform mastery
- Help organizations build independence and long-term architectural maturity
Accelerate Innovation
- Introduce and drive adoption of advanced platform capabilities such as Photon and Databricks SQL
- Optimize integrations across major cloud providers including Amazon Web Services, Microsoft Azure, and Google Cloud
- Ensure seamless integration across storage, networking, identity, and security layers
What Makes You Exceptional
Deep Technical Mastery
- 5+ years in data engineering, data warehousing, or software architecture
- 3+ years of hands-on architecture and delivery on Databricks
- Expert-level proficiency with Apache Spark and distributed data systems
- Strong experience across AWS, Azure, or GCP infrastructure (S3, ADLS Gen2, IAM/security models)
- Advanced Python or Scala and sophisticated SQL expertise
- Familiarity with containerization (Docker, Kubernetes) and CI/CD pipelines
Executive Presence & Consulting Mindset
- Exceptional client-facing communication skills: equally comfortable with engineers and executives
- Ability to work independently in an embedded/resident model
- Skilled at building consensus across diverse stakeholder groups
- Proactive, consultative problem-solver who thrives in ambiguity
- Able to balance multiple priorities while maintaining technical excellence
Nice to Have
- Databricks certification (Associate, Professional, or higher)
- Cloud certifications (AWS, Azure, or GCP)
- Experience in MLOps and production ML deployment
- Prior consulting, residency, or customer-facing architecture experience
Why This Role Stands Out
This isn’t just an architecture position — it’s a strategic partnership role. You’ll shape enterprise data platforms from the inside, influence executive decisions, and directly impact how organizations unlock value from data and AI.
If you have exceptional Databricks skills, thrive in high-trust environments, enjoy blending strategy with hands-on engineering, and want to operate at the forefront of modern data architecture, this role was built for you.
