Description
At Danaher, our work saves lives. And each of us plays a part. Fueled by our culture of continuous improvement, we turn ideas into impact – innovating at the speed of life.
Our 63,000+ associates work across the globe at more than 15 unique businesses within life sciences, diagnostics, and biotechnology.
Are you ready to accelerate your potential and make a real difference? At Danaher, you can build an incredible career at a leading science and technology company, where we’re committed to hiring and developing from within. You’ll thrive in a culture of belonging where you and your unique viewpoint matter.
Learn about the Danaher Business System which makes everything possible.
Danaher’s India Development Center (IDC) is a research and development center with the vision of accelerating product roadmaps across various Danaher business segments. Started in 2014, the center now hosts 700+ associates for multiple Danaher operating companies focusing on the Diagnostics, Life Sciences, and Environmental and Applied Sciences segments. The operating companies include Beckman Coulter, Radiometer, Leica Biosystems, Digital Teams, Leica Microsystems, HemoCue, Phenomenex, SCIEX, and Cepheid.
The IDC workforce comprises various product engineering teams working on the development of software and hardware components of cutting-edge products for immunoassay, chemistry, hematology, molecular diagnostics, oncology, neurosurgery, and more. IDC has evolved into a center of excellence for cloud and data analytics, with significant contributions to key informatics solutions. The teams consist of highly skilled software and hardware engineers and development managers, supported by local product managers as well as quality, regulatory, and intellectual property specialists.
The in-house teams work in close coordination with other global R&D centers in the US, France, Germany, Japan, Australia, Denmark, and Sweden. Located at the center of Bangalore’s IT hub, IDC is housed in a state-of-the-art facility.
The Senior Data and AI Engineer on Danaher’s Enterprise AI team supports various initiatives and advances business objectives through rigorous data analysis, data engineering, and the implementation of data solutions for scientific, technical, and operational functions, especially in the Service and Support area. This role is based onsite in Bangalore, India, and will report through the aligned business and organizational structure.
In this role, you will have the opportunity to:
+ Design, develop, and maintain robust, scalable, and efficient data pipelines to ingest, transform, and serve data for AI/ML and analytics workloads.
+ Architect and maintain scalable, secure, and low-latency data pipelines and systems to support agentic AI applications. Partner with Data Science teams to support model training, feature engineering, and deployment processes.
+ Collaborate with data scientists, project managers, software engineering teams, and Service and Support stakeholders to understand data needs and translate business requirements into technical solutions. Develop and manage data architecture, data lakes, and data warehouses supporting Service AI use cases (e.g., ticket analytics, customer interaction insights, predictive maintenance, resource planning).
+ Optimize data storage, compute, and retrieval strategies for structured and unstructured data (including logs, text, images, and telemetry data). Support MLOps workflows by enabling model deployment, monitoring, and versioning pipelines.
The essential requirements of the job include:
+ Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field, with 8+ years of experience
+ 5+ years of experience in data engineering or data platform development, preferably supporting AI/ML workloads
+ Strong proficiency in Python, SQL, and data processing frameworks (e.g., PySpark, Spark, Databricks, Snowflake). Experience with cloud data platforms (AWS, Azure, GCP, or Snowflake) and related services (e.g., S3, Redshift, BigQuery, Synapse)
+ Hands-on experience with data pipeline orchestration tools (e.g., Airflow, Prefect, Azure Data Factory). Familiarity with data lakehouse architectures and distributed systems
+ Working knowledge of containerization and CI/CD (Docker, Kubernetes, GitHub Actions, etc.). Experience with APIs, data integration, and real-time streaming pipelines (Kafka, Kinesis, Pub/Sub). Familiarity with creating reports and visualizations using Power BI, Power Apps, Power Automate, or a similar BI tool.
It would be a plus if you also possess:
+ Experience building and deploying equipment failure prediction models at scale
+ Familiarity with enterprise-scale Service and Support data (CRM, ServiceNow, Salesforce, Oracle, etc.)
+ Strong understanding of data security, compliance, and governance frameworks
Join our winning team today. Together, we’ll accelerate the real-life impact of tomorrow’s science and technology. We partner with customers across the globe to help them solve their most complex challenges, architecting solutions that bring the power of science to life.
For more information, visit www.danaher.com.
Operating Company: Cepheid





