Work Life Limbo

Maturing and expanding our current data ecosystem will require significant technical experience with IaaS and PaaS implementations in enterprise environments, specifically within Microsoft Azure on a delta lake platform (preferably Databricks). The right candidate will understand the importance of leveraging automation at full scale within serverless delta lakes, shifting the heavy lifting into the shared IaaS and PaaS layers, to minimize data movement and manual ETL effort while maximizing the business's ability to extract value from organizational data assets and adhering to best-in-class data security principles.

Leveraging emerging technologies to deliver a performant, secure, and cost-effective solution is critical in this role. As an enterprise organization, we have no shortage of transactional and deep analytical data needs. These business needs stretch across our organization, and the internal appetite for accurate, actionable data will only grow.

Our platform (currently Databricks) must support all business outcomes, and this role will require more than just technical skills. The right candidate will bring curiosity and a passion for moving the business forward by engaging with stakeholders ranging from business analysts to executive team members. In this role, you'll work directly with business analysts, data scientists, business end users, engineering and application teams, and our own data and platform engineering teams. A consultative communication style is critical, as shared outcomes across technology and business are the expectation.

Responsibilities         

  • Lead the development of current and target state data eco-system architectures covering all aspects of data including data platform, data management, data security, data ingestion, data transformation, data analytics, and AI/ML
  • Develop data architecture frameworks and blueprints and provide guidance to engineering and product teams
  • Provide leadership and expertise in the development of standards, architectural governance, design patterns, and practices
  • Lead the design of all aspects of our data strategy, including translating data needs into data architecture designs; create supporting artifacts such as diagrams, playbooks, and other technical documents
  • Provide technical leadership, thought leadership, and guidance to engineering teams for data architecture and design
  • Ensure recommended architectural designs, patterns, principles, and guidelines are adhered to as part of solution delivery
  • Act as a liaison with other architects (product, infrastructure, security, etc.) and with product teams working primarily in an Agile (Scrum/Kanban) manner; provide design deliverables that can be estimated by the appropriate delivery teams
  • Research, design, test, and evaluate modern technologies, platforms, and third-party products
  • Work with technology teams to prototype and prove the viability of innovative solutions, capabilities, and patterns
  • Socialize information on technology trends and products, such as authoring white papers to provide a viewpoint on new and emerging solutions and industry trends
  • Partner with operations teams to create controls for compliance

About You

  • Bachelor's degree in MIS, computer science, engineering, or a related field, OR 7+ years of equivalent verifiable experience, skill set, and record of accomplishment
  • At least 5 years of architecture or engineering experience, including progressive experience from data warehouse/ETL to enterprise data lake, with direct hands-on and leadership experience on delta lake platforms
  • Working experience with the following platforms and tools: Azure (Data Factory, Functions, Logic Apps, Synapse), Databricks, Delta Lake, SQL, Oracle, Spark, Hive, Google Cloud, and Kafka
  • Experienced in all aspects of pre-processing:
    • Data Governance as Code
      • Categorizing data assets with metadata
      • Hands-on and leadership experience with Collibra, Purview, and other enterprise governance tools
      • Leveraging Azure Active Directory as a function of platform architecture
      • Expert-level understanding of "garden walls," with enterprise experience in departmental utilization and cost reporting for each department's space
    • Secure Cloud Planning and Infrastructure Execution
      • Must have relevant DevOps experience to ensure scalability of automation within the Databricks platform (Ansible, Helm, Kubernetes, etc.)
      • Multi-AZ and multi-region data strategy that adheres to cloud security principles while maximizing performance
    • Delta Automation
      • Familiarity with automated connector suites (both source and target) to minimize manual ETL efforts
      • Direct hands-on experience deploying data models within the delta platforms and minimizing the need to model data within the application / reporting layer
    • Layered Data Structure
      • Must have hands-on and leadership experience designing and deploying a layered data structure within a delta platform (L1–L3 / bronze through gold)
  • Experienced in all aspects of post-processing:
    • Reporting Layer / Reporting Tools
      • Hands-on and leadership experience utilizing single cloud data endpoints at the appropriate layer to enable reporting at scale with minimal effort for business end users (PowerBI, Tableau, etc.)
      • Must have experience leveraging models as a function of the data within the platform to minimize modeling effort for disparate data sources
    • SQL / Java / Python / Scala Jupyter Notebooks
      • Must be a SQL expert with hands-on experience deploying SQL through Jupyter notebooks or other platform-level tools directly in the analytics layer of a data platform
      • Must have senior-level Python skills and hands-on experience deploying Python through Jupyter notebooks directly into the analytics layer of a data platform
      • Must have senior-level Scala skills and relevant experience deploying packaged Scala pipelines
    • ML Ops
      • Must have hands-on experience turning machine learning algorithms into Plain Old Java Objects (POJOs) and deploying those objects as a function of the data within a delta platform
      • Must have leadership experience utilizing Databricks for the above deployment
  • Ability to design and document how data platforms and tools should be scaled, sized, and deployed
  • Experience and skills to troubleshoot and solve data pain points across operational and analytical platforms
  • Strong problem-solving skills with the ability to see past the immediate answer to the simple, elegant solution; ability to identify and implement patterns that enable solution simplification
  • Skilled at designing and implementing data backup and recovery in cross-datacenter scenarios
  • Hands-on experience, team orientation, flexibility, innovative thinking, problem solving, conflict management, and self-motivation
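The "Data Governance as Code" qualification above means classifying data assets with metadata kept in version-controlled code rather than maintained by hand in a UI. In practice this would target Collibra or Microsoft Purview through their APIs; the sketch below is a minimal, self-contained illustration of the idea, and the catalog structure, asset names, and tags are assumptions for illustration, not a real governance schema.

```python
# Illustrative "governance as code" catalog: data assets tagged with
# ownership and classification metadata, stored in reviewable source code.
# Asset names, classifications, and column tags are hypothetical.

CATALOG = {
    "gold_sales": {
        "owner": "analytics-team",
        "classification": "internal",
        "columns": {"region": ["business-key"], "total": ["metric"]},
    },
    "silver_orders": {
        "owner": "data-engineering",
        "classification": "confidential",
        "columns": {"order_id": ["identifier"], "amount": ["financial"]},
    },
}

def assets_with_classification(catalog, level):
    """Return the sorted names of assets carrying a given classification."""
    return sorted(name for name, meta in catalog.items()
                  if meta["classification"] == level)

print(assets_with_classification(CATALOG, "confidential"))  # ['silver_orders']
```

Because the catalog lives in code, classification changes go through pull requests and can be pushed to the governance tool by a CI job, which is what makes departmental access and cost reporting auditable.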
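The layered data structure requirement above follows the bronze/silver/gold progression used on delta platforms: land raw data, clean and conform it, then publish business-ready aggregates. The sketch below shows that layering with plain Python dicts standing in for Delta tables and Spark jobs, so it stays self-contained; the record shapes and cleansing rules are illustrative assumptions.

```python
# Minimal bronze -> silver -> gold layering sketch. In a real Databricks
# deployment each function would write a Delta table; here plain lists and
# dicts stand in so the example runs anywhere.

RAW_EVENTS = [
    {"order_id": "1001", "amount": "25.00", "region": "east"},
    {"order_id": "1001", "amount": "25.00", "region": "east"},  # duplicate
    {"order_id": "1002", "amount": "bad",   "region": "west"},  # malformed
    {"order_id": "1003", "amount": "40.50", "region": "west"},
]

def to_bronze(raw):
    """Bronze: land the raw records as-is, tagged with a layer marker."""
    return [dict(rec, _layer="bronze") for rec in raw]

def to_silver(bronze):
    """Silver: deduplicate on order_id and enforce types, dropping bad rows."""
    seen, silver = set(), []
    for rec in bronze:
        if rec["order_id"] in seen:
            continue
        try:
            amount = float(rec["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine this record
        seen.add(rec["order_id"])
        silver.append({"order_id": rec["order_id"], "amount": amount,
                       "region": rec["region"], "_layer": "silver"})
    return silver

def to_gold(silver):
    """Gold: business-level aggregate, ready for the reporting layer."""
    totals = {}
    for rec in silver:
        totals[rec["region"]] = totals.get(rec["region"], 0.0) + rec["amount"]
    return totals

gold = to_gold(to_silver(to_bronze(RAW_EVENTS)))
print(gold)  # {'east': 25.0, 'west': 40.5}
```

The point of the layering is that reporting tools only ever read gold, so modeling effort happens once in the platform rather than in each downstream application.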
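The notebook-SQL qualification above amounts to running queries against gold-layer tables directly from an analytics notebook. On Databricks that would typically be `spark.sql(...)` or a `%sql` cell against Delta tables; the self-contained sketch below uses Python's built-in sqlite3 module as a stand-in, and the table and column names are assumptions.

```python
# Stand-in for a notebook cell querying a gold-layer table. sqlite3 replaces
# the Databricks SQL endpoint so the example is runnable anywhere; on the
# platform the same query would run via spark.sql() or a %sql cell.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE gold_sales (region TEXT, total REAL)")
conn.executemany("INSERT INTO gold_sales VALUES (?, ?)",
                 [("east", 25.0), ("west", 40.5)])

# Typical analytics-layer query: regional totals ranked for a BI dashboard.
rows = conn.execute(
    "SELECT region, total FROM gold_sales ORDER BY total DESC"
).fetchall()
print(rows)  # [('west', 40.5), ('east', 25.0)]
```

Keeping the SQL at this layer, against a single gold endpoint, is what lets PowerBI or Tableau consume the same tables without per-report modeling.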
