BeiGene Associate Director, Data Platform and Solution Engineering in Emeryville, California

General Description:

Join BeiGene's Global Data Strategy and Solutions team to implement and operationalize a cutting-edge, fully integrated Enterprise Data Platform that accelerates our journey from data to insights. This role reports to the Director of Platform and Solution Engineering and is responsible for architecting, implementing, and operationalizing our cloud-based enterprise data platform and foundation. The role will also oversee the delivery of data solutions and capabilities built on top of the platform in partnership with cross-functional teams. The Associate Director of Platform and Solution Engineering must be an expert in modern data platform and cloud technologies, able to design a scalable, high-performance data platform and capabilities that empower our organization to leverage data effectively. The ideal candidate will possess strong technical knowledge and experience in cloud data platform architectures, big data processing, data engineering, and analytics, coupled with the ability to collaborate cross-functionally to drive data-driven decision-making across the organization.

Essential Functions of the job:

The individual in this position should expect significant day-to-day variability in tasks and challenges.

Primary duties include, but are not limited to, the following:

  • Platform Architecture and Engineering:

  • Architect and implement a fully integrated data platform that supports modern data management and data engineering approaches to acquire, curate, and transform data into insights in a scalable manner, enabling multiple federated data teams globally.

  • Integrate technologies such as Databricks, Azure cloud services, and other COTS products (Informatica, Collibra, etc.) to provide a high degree of self-service for data engineers, data scientists, AI engineers, and data stewards to create high-quality data and analytical (including Gen AI) assets and solutions.

  • Develop repeatable processes and capabilities, including but not limited to provisioning workspaces, catalogs, and security and governance controls, to deliver a robust platform for data teams with automated governance.

  • Develop reference architectures that provide solution blueprints for common patterns and use cases.

  • Develop a strategy and roadmap to enhance and scale MLOps and LLM enablement capabilities.

  • Socialize the future architecture with data teams and enable their adoption and effective use of the platform. Provide advice on best practices and solution design options, and resolve issues where necessary.

  • Establish data management frameworks, optimizing ETL/ELT processes and data models for performance and accuracy.

  • Integration:

  • Partner with data engineers, scientists, and business stakeholders to develop seamless data pipelines prioritizing data integrity and usability.

  • Operationalize and integrate tools that enable rapid ingestion and transformation of data while minimizing cost and overhead using serverless and modern cloud approaches.

  • Seamlessly integrate data cataloging and technical metadata harvesting as core components of the data platform.

  • Implement and uphold data governance practices that enhance data accessibility while ensuring compliance with regulations.

  • Platform Optimization:

  • Monitor and enhance system performance, employing tools and methodologies to optimize data processing and storage solutions.

  • Develop automated reporting and showback capabilities that attribute platform consumption and costs to the different functional areas.

  • Troubleshoot and resolve data and platform related issues to maintain optimal system functionality.

Qualifications:

  • Proven experience (8+ years) in platform/data architecture or in a similar role, with extensive experience in Databricks and cloud-based data solutions (preferably Azure).

  • 8 years of experience in solution engineering, architecture, or related roles, preferably in platform development and enablement.

  • Strong proficiency in Databricks, Apache Spark, Unity Catalog, Python, SQL, and data processing frameworks.

  • Experience with APIs and with integrating diverse technology systems.

  • Familiarity with modern development frameworks, DevOps methodologies, and CI/CD processes.

  • Experience with data warehousing solutions, Delta Lake, and ETL/ELT processes.

  • Familiarity with cloud environments (AWS, Azure) and their respective data services.

  • Solid understanding of data governance, security, and compliance best practices.

  • Excellent communication and interpersonal skills, with an ability to articulate complex technical concepts to diverse audiences.

  • Experience leading cross-functional teams and projects is a plus.

  • Education:

  • Bachelor’s degree in Data Science, Computer Science, or a related field is required.

  • Advanced degree (Master’s) in a relevant field is a plus.

Preferred Skills:

  • Databricks certifications or hands-on experience with Delta Lake and its cloud architecture is preferred.

  • Familiarity with machine learning, AI frameworks, and data visualization tools (e.g., Tableau, Power BI, Spotfire).

  • A proactive approach to learning and implementing new technologies and frameworks.

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
