Specialist Solutions Architect - Data Warehousing & Data Governance
This role can be remote, specifically targeting candidates in the Central Time Zone.
As a Specialist Solutions Architect (SSA) - Data Warehousing & Data Governance, you will guide customers in 1) accelerating their cloud data warehousing transformation with Databricks across a wide variety of use cases, and 2) adopting Unity Catalog (a unified data governance solution) and Delta Sharing (a secure data sharing platform). This is a customer-facing role, working with and supporting Solution Architects, that requires hands-on production experience with large-scale data warehousing and data governance technologies, as well as expertise in other modern data technologies such as Apache Spark™. SSAs help customers through evaluations and successful planning of data warehousing workloads while aligning their technical roadmap for expanding usage of the Databricks Data Intelligence Platform. As a deep go-to expert reporting to the Specialist Field Engineering Manager, you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and establish yourself in the data warehousing and data governance specialty - including performance tuning, data modeling, winning competitive evaluations, and production migration planning.
The impact you will have:
- Provide technical leadership to guide strategic customers to successful cloud transformations on:
  - Large-scale data warehousing workloads - ranging from evaluation to architecture design to production deployment
  - Unity Catalog and Delta Sharing - ranging from advisory and design to deployment to troubleshooting
- Prove the value of the Databricks Lakehouse Architecture on customer workloads by architecting production-level workloads, including end-to-end pipeline load performance testing and optimization
- Become a technical expert in an area such as data management, cloud platforms, or architecture
- Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing, and custom architectures
- Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
- Contribute to the Databricks Community
What we look for:
- 5+ years of experience in a technical role with expertise in data governance and data warehousing - such as query tuning, performance tuning, troubleshooting, and debugging MPP data warehouses or other big data solutions
- Production deployment experience with data governance solutions and hands-on experience with cloud data lakes
- Experience with design and implementation of data warehousing technologies including NoSQL, MPP, OLTP, and OLAP
- Deep specialty expertise in scaling big data workloads to be performant and cost-effective, including technologies such as Delta Lake
- Experience with the AWS, Azure, or GCP clouds
- Production programming experience in SQL and Python or Scala
- 2 years of professional experience with Big Data technologies (e.g., Spark, Hadoop, Kafka) and architectures
- 2 years of customer-facing experience in a pre-sales or post-sales role
- Can meet expectations for technical training and role-specific outcomes within 6 months of hire
- Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent work experience
- Ability to travel up to 30% when needed
Benefits
- Medical, Dental, and Vision
- 401(k) Plan
- FSA, HSA and Commuter Benefit Plans
- Equity Awards
- Flexible Time Off
- Paid Parental Leave
- Family Planning
- Fitness Reimbursement
- Annual Career Development Fund
- Home Office/Work Headphones Reimbursement
- Employee Assistance Program (EAP)
- Business Travel Accident Insurance
- Mental Wellness Resources
Pay Range Transparency
Databricks is committed to fair and equitable compensation practices. The pay range(s) for this role is listed below and represents base salary range for non-commissionable roles or on-target earnings for commissionable roles. Actual compensation packages are based on several factors that are unique to each candidate, including but not limited to job-related skills, depth of experience, relevant certifications and training, and specific work location. Based on the factors above, Databricks utilizes the full width of the range. The total compensation package for this position may also include eligibility for annual performance bonus, equity, and the benefits listed above.
Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.