Stefan Moser
Associate Director, R&D Data Analytics, Amyris
Below is the video transcript.
In today's world, there's more and more focus on sustainability and the changing climate. If you think about all the things we wear or eat, or the laundry detergent you're using, those chemicals have to come from somewhere. A lot of them come from plants, from animal sources, or often from petroleum, and producing them often takes expensive or dirty chemistry.
And so, the idea is: can you use biological systems, in our case, microbes or yeast, to manufacture those chemicals? Let the biology do the heavy lifting and produce the things that the world needs. And ideally you can make it more sustainably at better cost, but you could potentially also make things that you wouldn't be able to make otherwise.
Amyris is a synthetic biology company. The company started out of a lab at UC Berkeley on a grant from the Bill and Melinda Gates Foundation. The terms of that grant were that we would use our technology to make an antimalarial drug and license off the manufacturing for free. In turn, we would keep the IP and lab infrastructure that were built.
We have a variety of specialty chemicals and ingredients that we now manufacture at scale, producing in ton quantities, from flavors and fragrances to cosmetic ingredients.
There's a very popular and powerful emollient, a common cosmetic ingredient, called squalane. Traditionally it's sourced from shark livers, a practice driven by the industry in Japan; it was outlawed in Europe. Producers started making it from olives, but the product was inferior. Amyris found a way to make it with our technology and produce squalane that was as good and as pure as shark squalane but didn't require killing sharks.
I work in a larger department that includes software engineering and data science and our teams are supporting this high-throughput, scaled R&D pipeline. My team specifically is really focused on data and process quality. How do we ensure that scientists can make decisions on signals rather than noise?
I think we're seeing a movement away from the traditional idea of a scientist. Most people imagine what a scientist does is they take an experiment from cradle to grave. They're ideating, they're designing, they're executing, they're generating the data, they're analyzing that data, making decisions.
What you see more and more is: how can we enable scientists to do more? How can we have them test 50 hypotheses instead of 2 every month? To do that, you have to bring in modern technology: data systems, software applications, lab automation. In some cases, scientists might not even have to step into a lab to execute an experiment. There's automation to do that, and there may be dedicated teams that have optimized that process to be really efficient.
We are definitely seeing a movement towards this model of high-throughput, scalable labs that support scientists in being able to test more ideas. It is really ultimately enabling, I think, for scientists, but it presents new challenges. Now a scientist has 50 times more data to deal with.
Data literacy depends on the support we have in terms of tooling and the time people can dedicate internally to building data skills, statistical skills, competencies. You need both, right? You can have a community of people who are really passionate about statistics and data, but at the end of the day, you need an element of top-down leadership to help drive that. You also need an element of accountability to make data literacy pervasive within an organization. That's something you have to continuously invest time in, with the expectation that, from a business perspective, there is some ROI.
The beauty of JMP is that it opens doors to things we usually wouldn't pursue otherwise. Given how long it's been developed, it has so many features, so many ways to analyze data.
The Workflow Builder not only makes people more effective and lets them work quicker, it allows you to standardize a workflow. That gets you to the point of saying: without this, we wouldn't have been able to make this decision; we wouldn't have been able to make this change.
If you can demonstrate that from a data perspective, which JMP allows us to do, you can very clearly point to the cost impact, right? We have to buy 10,000 fewer analytical vials a year; that's $25,000. We need one fewer analytical instrument, at $180,000, because we don't need to run as many samples. So we also, especially within my team, try to tie it back to that direct impact. And at the end of the day, I think the data speaks.
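The cost math above can be sketched out in a few lines. This is a rough illustration only: the per-vial price is inferred from the transcript's figures (10,000 fewer vials saving $25,000 implies about $2.50 per vial), and treating the avoided $180,000 instrument as a single combined saving alongside the annual vial saving is an assumption for simplicity.

```python
# Rough sketch of the cost-impact figures quoted in the transcript.
# Per-vial price is inferred (hypothetical): $25,000 / 10,000 vials.
vials_saved = 10_000
cost_per_vial = 25_000 / 10_000    # $2.50 per vial, implied by the transcript
instrument_cost = 180_000          # one fewer analytical instrument (one-time)

vial_savings = vials_saved * cost_per_vial
combined_savings = vial_savings + instrument_cost
print(f"Combined savings: ${combined_savings:,.0f}")
```

In practice a recurring consumable saving and a one-time capital avoidance would be reported separately, but the point of the transcript stands: the impact is directly quantifiable from the data.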
For the scientific world, and the world in general, analytics is only going to become more and more important. Resources might be limited, but data certainly is not. Hiding within that data are answers and insights. We can have sustainable solutions, but if only you or I use them, what is the impact on the planet?
So it has to be scalable, not only from a technology perspective but also from an economic one. In a perfect world, consumers can drive this change, maybe without even knowing they're driving it. That's why scalability is so important.