
Solving the challenges of standardizing cell counting to ensure reproducibility in experiments, assays and manufacturing processes



In this podcast, we talked with Christian Berg, Global Product Manager at Chemometec, about the importance of standardizing cell counting. While often overlooked, standardization is essential for reproducibility in experiments, assays and manufacturing processes. Cell counting is a challenging technique with many pitfalls that can delay entire projects. We discussed how new technologies are solving these challenges and enabling standardization of cell counting across organizations.

Show Notes

I began our interview by asking Mr. Berg to tell listeners why cell counting is so important. He explained that in some applications the importance is obvious, for example when cells are used to manufacture pharmaceuticals. In biomanufacturing, cell counting is needed to perform bioassays for toxicology testing, for quality testing of products and to measure the activity of the final product, using cells to report the drug's activity. Most importantly, these cell counting methods must be standardized.

He went on to say that with the emergence of cell therapies, you have completely new processes and new requirements for cell counting and companies must rethink how they count cells and what quality parameters they should focus on.

In the context of research, cell counting is used as a tool to maintain cells in culture while setting up experiments. Here, too, standardization is critical, because cell density strongly influences how cells function. Cell density drives powerful signaling pathways that can affect any biological function under study, so standardizing cell counting in a lab is key to achieving reproducibility of a specific assay. Another challenge in cell counting is data management. With cell therapies, for instance, the process must be scaled tremendously, so a facility might have hundreds of cell counting units, and it becomes important that the incoming data is easy to organize.

Next, I asked Christian how cell counting needs have evolved with new research and manufacturing demands. He said that in research the trend is toward larger experiments, such as cell-based screening assays. With large experimental setups, it is very important that the system is optimized with a consistent counting method to reduce day-to-day variation. He also discussed the reproducibility crisis in research, in which many articles published in peer-reviewed journals cannot be reproduced by other researchers. There have been many suggestions for how to correct this problem, and standardization is one of them.

He said that at Chemometec, they are seeing a lot of interest from research leaders looking to standardize laboratory methods. When Chemometec visits labs that use manual counting, the team typically asks a handful of researchers to perform manual counts on the same sample. It is not uncommon to see forty percent variation between researchers. Variation of that magnitude is a big problem within a research group, and it also makes collaboration with outside partners more difficult.
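To put that figure in context, inter-operator variability is often summarized as a coefficient of variation (CV) across counts of the same sample. Below is a minimal sketch with made-up counts (illustrative numbers, not data from the interview) showing how quickly a handful of manual counts can produce a spread comparable to the one described above:

```python
import statistics

# Hypothetical manual counts (cells/mL) from five operators counting
# the same sample; the numbers are illustrative only.
counts = [0.6e6, 1.0e6, 1.5e6, 0.8e6, 1.35e6]

mean = statistics.mean(counts)
stdev = statistics.stdev(counts)    # sample standard deviation
cv_percent = 100 * stdev / mean     # coefficient of variation

print(f"mean = {mean:.2e} cells/mL, CV = {cv_percent:.0f}%")
```

With a spread like this, two labs seeding "the same" cell density could differ enough to change assay outcomes, which is exactly the reproducibility problem that standardized counting addresses.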

Then we discussed the challenges of cell counting. Christian described how the most significant challenges today can be split into two groups: technical challenges with the performance of the cell counting itself, and practical challenges that arise when cell counting is embedded in a specific operation or research setting.

With bioprocessing, customers typically use older generations of cell counters that are mechanically complex and contain tubing, which is inherently unstable and causes instruments to break down quite often. If a cell counter breaks down, it is time consuming and expensive to fix. This is a technical challenge.

Conversely, he said that data management is a practical challenge that can be clearly seen in virus and cell therapy manufacturing. For these types of manufacturing, you cannot build a bigger steel tank if you want to increase production. Instead, you are forced to scale out the production setup, which means that you need more analytical equipment to support the production. He shared that Chemometec has customers that have hundreds of cell counters to support manufacturing needs and having good reliable data integration is essential to be able to control these processes. Therefore it is important to consider the performance of the cell counter as well as the implementation of the cell counting method in your process.

I then asked Christian to talk about Chemometec's recently launched next generation of their popular NucleoCounter NC-200, the NucleoCounter NC-202. He explained that the NC-202 is actually a third-generation counter and represents a general overhaul of the hardware, the cassette and the software. They have updated the optics, the electronics, the cassette and the camera. They have also improved the light sources, which improves the quality of the images. This quality improvement permits greater extraction of detail about the sample and has led to improvements in the performance of the instrument. For example, they increased the dynamic range to ten million cells and at the same time reduced the time it takes to conduct a cell count to thirty seconds. The improved data quality also permits viewing of much smaller particles and the ability to quantify cellular debris for increased robustness.

He went on to state that cell therapy places very high demands on the scalability of analytical instrumentation, and Chemometec developed the NC-202 to meet these demands by using modern tools to centralize data. He expects a large increase in the use of robotics, and automation efforts will probably revolve around manufacturing execution systems (MES) that allow operators to integrate different analytical and manufacturing instrumentation on a single platform for more effective control of manufacturing processes.

I followed up by asking about specific industry segments and the ways in which cell counting can improve their processes. We started with biologics manufacturing. Christian explained that in biologics manufacturing, cell lines are used to produce therapeutic proteins, so cell counting is employed to control these processes. There are different modes in which the cells can be grown, but in all cases the cell count is a very important analytical parameter for decision making. The cell lines used in biologics manufacturing are not the hardest to count, but the stability of the automated cell counting unit can be a challenge. The problem lies in the instability of many older-generation cell counters, which are mechanically complex and contain internal fluidics that can clog the system. A breakdown during a process is particularly unfortunate because replacement instruments quite often measure differently, causing a systematic difference in the cell count. This affects the data being monitored and can also affect the evaluation of the quality of the final product. One important feature of the NC-202 is that all units measure the same regardless of production year. Chemometec achieves this through very rigorous calibration during manufacturing.

Then I asked if he could talk about cell therapy and virus manufacturing processes. He explained that, compared to biologics manufacturing, the production of virus and cell therapies is much more difficult to scale. In virus manufacturing, the cell lines used to expand the virus are quite diverse because specific viruses have preferences for specific cell types; no single cell line can be used for the expansion of all viruses. In addition, most of the cell lines used are adherent, which are much more difficult to scale than suspension cells like CHO cells. To overcome the scalability problem with adherent systems, companies use microcarriers, which allow the cells to be grown in bioreactors. However, counting cells on microcarriers is not trivial because you cannot count the cells while they sit on the microcarriers. You need to strip the cells off the microcarriers prior to counting, and that process together with the actual count can take up to thirty minutes.

He described a recent case where a large virus manufacturer came to Chemometec and asked for help setting up a cell counting method for primary cells. They used manual cell counting in which operators would manually sort the cells into five categories. This counting process was very labor intensive and took more than thirty minutes for a single cell count. Using the NC-202, Chemometec provided the manufacturer with an assay that performs the cell count automatically, reducing the analysis time from thirty minutes to thirty seconds.

Next I asked Christian to describe what the implementation of the NucleoCounter looks like for scientists interested in incorporating this into their workflow. He explained that the NucleoCounter is very easy to use and that is particularly important if you want to standardize a process. The cassette in the NucleoCounter replaces three workflow steps still present in other counting methods – the addition of dye, the loading of the cells into a counting chamber and the focusing required before performing the cell count.

He summarized by saying that the simplicity of the NucleoCounter's operation makes it much easier to implement in any process, because the standard operating procedures are shorter and it is easier to train new people on the instrument. Another important advantage of the NC-202 is that it can easily be deployed in clean rooms. It is easy to clean and requires no maintenance; regular validation is enough to ensure consistent performance of the instrument.

I closed the interview by asking Christian if he had anything he would like to add for listeners. He said Chemometec is still operational, so companies that would like to try the NucleoCounter should contact them. Normally they would send out field application scientists to get companies started, but because of these unusual times they have made a video demonstrating how to set up the instrument and get started. Typically a week is sufficient to compare the performance of the NucleoCounter against other cell counting systems.

For more information about the NucleoCounter, please see


A Look Towards the Future of 3D Cell Culture – A panel discussion



Introduction and Overview

Debbie King

Researchers have used 2D cell culture since the early 1900s, but we know that growing cells on planar surfaces has some drawbacks. Cells grown in vitro in 2D space don't behave like cells found in vivo. They lack the critical cell-cell and cell-matrix interactions that drive their form, function and response to external stimuli, which limits their predictive value. More recently, 3D cell culture techniques have become popular because the cell morphology, interactions and tissue-specific architecture more closely resemble those of in vivo tissues. Spheroids, organoids and more complex 3D tissue systems, such as 'organ-on-a-chip', are examples of 3D cultures used by researchers to model native tissues.

Spheroids are simple, widely used 3D models that form based on the tendency of adherent cells to aggregate and can be generated from many different cell types. The multicellular tumor spheroid model is widely used in cancer research.

Organoids are more complex 3D aggregates, more like miniaturized and simplified versions of an organ. They can be tissue or stem cell-derived with the ability to self-organize spatially and demonstrate organ-specific functionality.

More complex yet are technologies like organ-on-a-chip, a multi-channel 3D microfluidic cell culture system that mimics whole-organ function with artificial vasculature. Cells are cultured in continuously perfused micrometer-sized chambers that recreate physiologically relevant levels of fluidic shear force and allow for gas, nutrient and waste transport to the cells, just as is observed in vascularized tissues in vivo.
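For a sense of how those shear levels are engineered (a textbook relation, not something from the panel), the wall shear stress $\tau_w$ in a thin rectangular microchannel of width $w$ and height $h$ (with $w \gg h$), perfused at volumetric flow rate $Q$ by a fluid of viscosity $\mu$, is approximately

$$\tau_w = \frac{6 \mu Q}{w h^2}$$

so designers can tune the flow rate and channel geometry to match, for example, the few dyn/cm² of shear experienced by venous endothelium.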

How are spheroids impacting cancer research and what do you see as future applications for the technology?

Audrey Bergeron

Spheroids can be an improved model for cancer in the lab compared to standard 2D cell culture. When cancer cells are cultured as spheroids, they are able to maintain the shape, polarity, genotype, and heterogeneity observed in vivo (1). This allows researchers to create models that are much more reflective of what’s going on in the body. For a simple example, if you think about drug penetration into a 2D monolayer of cells it’s completely different from drug penetration into a solid tumor. In a 2D monolayer each cell is exposed to the same concentration of drug whereas in a spheroid, like a solid tumor, there are gradients of drug exposure.
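To make the gradient argument concrete, here is a classic reaction-diffusion sketch (textbook material rather than anything the panel presented). At steady state, a drug with diffusivity $D$ entering a spheroid of radius $R$ while being consumed at a first-order rate $k$ follows the profile

$$C(r) = C_s \, \frac{R}{r} \, \frac{\sinh(\phi r / R)}{\sinh \phi}, \qquad \phi = R\sqrt{k/D}$$

where $C_s$ is the concentration at the spheroid surface. At the core, $C(0)/C_s = \phi/\sinh\phi$; for $\phi = 5$, cells at the center see only about 7% of the surface concentration, whereas every cell in a 2D monolayer sees the full dose.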

More and more, we're seeing researchers move away from cancer cell lines toward specialized cancer models such as patient-derived models. The hope is to find the appropriate therapies for each individual patient.

(1) Antoni, D., Burckel, H., Josset, E., & Noel, G. (2015). Three-dimensional cell culture: a breakthrough in vivo. International journal of molecular sciences, 16(3), 5517–5527. doi:10.3390/ijms16035517.

What tools and technologies are needed to fully realize the potential of spheroid culture models?

Debbie King

One of the key parameters for success with spheroid culture is controlling the size of the spheroids. It can be very difficult to get consistent, reproducible results if the starting spheroid culture is not uniform in size and shape. Cell culture tools available on the market now, such as ultra-low attachment plates, facilitate the formation of uniformly sized spheroids for many research applications from low to high throughput modalities.

Hilary Sherman

There always needs to be a balance between throughput and complexity when creating models for research. That's why there are so many options available for 3D research. Low-attachment products such as Corning® spheroid microplates and Elplasia® plates are great for creating high throughput 3D models, but can lack some complexity. Organ-on-a-chip and hydrogel models add biological complexity, but are typically not as high throughput.

What is the most interesting achievement so far in using organoids?

Elizabeth Abraham

Personalized medicine. Organoids differ from spheroids in their unique capacity for unlimited self-renewal. Organoids can be made from patient-derived stem cells in a selective medium containing Corning® Matrigel® matrix. These organoids can then be exposed to a range of drugs to identify the best treatment for that particular cancer, thus personalizing medicine to treat disease. Taking this idea even further is the ability to repair genes in cells that can form organoids, then use those organoids to understand treatment regimens. Organoids thus serve as a converging platform for gene editing, 3D imaging and bioengineering. The therapeutic potential of organoids in modeling human disease and testing drug candidates is, in my opinion, the most interesting achievement thus far.

Audrey Bergeron

There has been a recent report of researchers at the Cincinnati Children's Hospital Medical Center (2) developing the world's first connected tri-organoid system, the human hepato-biliary-pancreatic (HBP) organoid. This is a remarkable achievement because it moves the field from individual organoid research to connected organoid systems, which more physiologically mimic the interplay between human tissues. Current liver organoid approaches have also struggled to recapitulate bile duct connectivity, which is important for liver development and function. The authors describe optimized methods to create the multi-organ model from differentiated human pluripotent stem cells via the formation of early-stage anterior and posterior gut spheroids, which fuse together and develop into hepatic, biliary and pancreatic structures. This is an exciting new basis for more dynamic, integrated, systems-based in vitro organoid models to study organogenesis, for use in research and diagnostic applications, and for potential applications in precision medicine and transplantation studies.

(2) Hiroyuki Koike, Kentaro Iwasawa, Rie Ouchi, Mari Maezawa, Kirsten Giesbrecht, Norikazu Saiki, Autumn Ferguson, Masaki Kimura, Wendy L. Thompson, James M. Wells, Aaron M. Zorn, Takanori Takebe. Modelling human hepato-biliary-pancreatic organogenesis from the foregut–midgut boundary. Nature, 2019; DOI: 10.1038/s41586-019-1598-0.

Debbie King

The generation of cerebral organoids is one of the most interesting achievements. These "mini-brains" derived from pluripotent stem cells can self-organize into functioning neural structures. Neural cells are notoriously difficult to culture in vitro, and obtaining sufficient cells for experiments can be challenging. Cerebral organoids offer a way to study neural tissues, replicating aspects of human brain development and disease that were once impossible to observe in the laboratory. Scientists have used them to make discoveries about neurological disorders like schizophrenia and autism. These organoids have also been useful models for examining fetal brain microcephaly caused by Zika virus infection.

What technologies have helped scientists overcome the biggest challenges to using organoids?

Elizabeth Abraham

3D extracellular matrices, like Matrigel® matrix, provide the appropriate scaffold for generating and growing organoids. The matrix can also be remodeled by matrix metalloproteases secreted by cells within the organoid as it grows and differentiates.

The discovery of Wnt signaling is also important, as the Wnt pathway is at the heart of organoid technology.

Lastly, 3D imaging, the ability to see inside 3D structures such as organoids, has also enabled scientists to learn how to use organoids in disease modeling.

Audrey Bergeron

One of the biggest challenges is the lack of vasculature in organoid systems, which hinders in vivo-like expansion and limits organoid size. Technologies have been developed (and are continuing to be developed) which improve long-term culture conditions and the delivery of nutrients and gaseous exchange to the developing organoid. These include spinner flasks and bioreactors to increase “flow” in the culture system and microfluidics-based platforms for efficient nutrient diffusion, oxygenation and waste metabolite disposal (a key example is cerebral organoids).

It is also interesting to see the evolution of permeable membranes, such as Transwell® permeable supports and other semi-permeable membrane materials, being integrated into perfusion systems and 3D bioprinting techniques to improve nourishment of the organoid during maturation. These technologies have helped extend the lifespan and utility of organoids to months.

Another challenge in high throughput pharmacological and toxicity screening applications has been the formation of reproducible, single organoids per well. Ultra-Low Attachment (ULA) surface cultureware and microplates, coupled with established biological hydrogels such as Corning Matrigel matrix, have provided a platform to generate uniformly sized organoids compatible with HTS applications. Concurrent advancements in high content screening platforms have also helped to elucidate the 3D complexity of organoids through multi-parameter imaging and quantitative analysis.

What advancements do you see with organoids in the next five years?

Elizabeth Abraham

Organoids filling the need for donor organs for patients awaiting transplantation, and organoids serving as a diagnostic tool to detect and treat cancers.

Audrey Bergeron

More complex, vascularized multi-organoid systems will continue to be developed to move precision and regenerative medicine closer to transplantable organs. I also think that protocols and models will continue to be optimized to generate data and improve the clinical predictivity of organoid models in pharmacological and toxicity testing; this could potentially mitigate the need for animal models during drug development.

Debbie King

Researchers are also looking to combine genome-editing technologies, CRISPR-Cas9 in particular, with patient cell-derived organoids. For monogenic diseases, this opens up the possibility of performing gene correction prior to autologous transplantation as a curative solution. It has already been shown that the defective CFTR gene in cystic fibrosis patient-derived organoids can be corrected using CRISPR/Cas9 homologous recombination (3).

(3) Schwank G, Koo BK, Sasselli V, Dekkers JF, Heo I, Demircan T, Sasaki N, Boymans S, Cuppen E, van der Ent CK, Nieuwenhuis EE, Beekman JM, Clevers H. Functional repair of CFTR by CRISPR/Cas9 in intestinal stem cell organoids of cystic fibrosis patients. Cell Stem Cell. 2013 Dec 5;13(6):653-8. doi: 10.1016/j.stem.2013.11.002.

What technologies will enable those achievements?

Hilary Sherman

I think more defined reagents, such as ECMs and media, will help make organoid culture easier and more consistent. Also, better bioprinters with higher resolution will aid in generating more complex 3D structures.

Debbie King

Automation platforms will allow for precise control of culture conditions and enable high-throughput screening in drug discovery workflows. High-content imaging technology will also be key to capturing the morphological and gene expression data needed to study organoids. Live cell imaging within organoids will allow us to visualize, for example, early events in human development in real time. Overall, the field would also benefit from standardization of protocols, reagents (such as the type of culture media and ECM to use) and the best cell sources, so that comparisons between labs can be made to help advance research.

What areas of research do you think will be most impacted by 3D culture systems?

Audrey Bergeron

Cancer research: better modeling of cancer in vitro, both to understand cancer biology and to enable personalized medicine.

Hilary Sherman

Regenerative medicine, a branch of therapy that involves engineering biological tissues or organs to restore normal function. This includes the hope of someday being able to 3D print organs.

Debbie King

3D culture systems will continue to have a large impact on developmental biology. The study of human development has largely been limited to observational studies of pre-implantation embryos or of progenitor cells isolated from fetal tissues and cultured in vitro. The advent of organoid models derived from iPSCs opens up the ability to study human embryonic development in a way we couldn't before. As well, organ-specific progenitors generated from iPSCs provide a wealth of insight into the morphogenesis of different organ systems.


Computer Aided Biology Platform Helps Companies Meet the Challenges of 21st Century Biomanufacturing



In this podcast, we interviewed Markus Gershater, Chief Scientific Officer at Synthace, about computer-aided biology and how it addresses several common biomanufacturing challenges. We also discussed ways to build a common culture between science and software.

I began the interview by asking Mr. Gershater to describe the concept of Bioprocessing 4.0 and what it means to the industry. Markus explained that the term 4.0 refers to the industrial revolutions that have happened throughout history. The first began when steam was introduced as a power source; next came electrification and the production line. 3.0 refers to the incorporation of automation, and 4.0 is the connection of different devices and automation through digital technology, including cloud computing for data storage, computing and analysis. This is particularly important in a complex industry like bioprocessing that requires sophisticated knowledge and control. Bioprocessing 4.0 will enable the industry to progress to the next level.

Next I asked Markus to explain the solutions that Synthace provides in this area. He described how Synthace started as a bioprocessing company looking for a way to conduct more sophisticated, automated experiments. The result was the creation of their software Antha, which can auto-generate instructions for biological protocols. This means that scientists can specify the protocol they want to run, and Antha works out all the details down to every step of the run. It then converts those detailed actions into scripts for each automated device. The user hits go and the robot runs the specified protocol with the instructions Antha generated. This automatic generation of scripts makes automation more user friendly and powerful. Lab automation has historically been held back by the complexity of programming it; Antha makes complex lab automation implementable.

Markus went on to say that the beneficial knock-on effect is digital integration. The devices used in protocol automation are only a small part; there are also analytical devices that produce data, and what is needed is a way of structuring data from all of these diverse pieces of equipment. Since Antha generates all the detailed instructions that go into a particular protocol, it also holds the detailed structure of the experiment. At the end of any chain of actions, Antha can therefore provide the provenance of every data point and automatically structure the data within the context of the experimental design.

As the industry runs more complex and high throughput experiments, the bottleneck shifts to data structuring. Antha has become a tool that automates both the lab processes and the data processing from those lab processes, so the data pipeline updates dynamically as the structure of the experiment changes.

We then discussed the technology behind the product. Markus explained that the first step is to identify a specific protocol. Then, for example, Antha specifies the samples that need to be diluted and provides a framework with specific parameters. Next, you look conceptually at how liquids can be moved around to fulfill this design: what equipment do you need to run, and what consumables? Once you have those, Antha generates the lower-level, tedious details. This allows users to change one detail of the experiment, and Antha will calculate a new set of instructions. Antha then passes these instructions to devices through the Antha hub, which communicates with the equipment. Once users are satisfied that the equipment has been set up properly, they hit go and the experiment runs.
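As an illustration of this pattern, here is a hypothetical sketch (not Antha's actual API; all names are invented) of how a high-level dilution intent can be expanded into explicit liquid-handling instructions, with each instruction carrying the provenance metadata that later makes automatic data structuring possible:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Transfer:
    """One low-level liquid-handling instruction for an automated device."""
    source: str
    destination: str
    volume_ul: float
    provenance: dict = field(default_factory=dict)  # experimental context for data structuring

def serial_dilution(sample: str, diluent: str, steps: int,
                    factor: float, working_volume_ul: float) -> List[Transfer]:
    """Expand a high-level 'serially dilute this sample' intent into explicit transfers."""
    carry_volume = working_volume_ul / factor
    instructions: List[Transfer] = []
    previous = sample
    for i in range(1, steps + 1):
        well = f"well_{i}"
        # Dispense diluent first, then carry material over from the previous stage.
        instructions.append(Transfer(diluent, well, working_volume_ul - carry_volume,
                                     provenance={"step": i, "role": "diluent"}))
        instructions.append(Transfer(previous, well, carry_volume,
                                     provenance={"step": i, "role": "sample",
                                                 "cumulative_dilution": factor ** i}))
        previous = well
    return instructions

# Changing a single parameter (e.g. factor=3) regenerates the whole instruction set,
# which is the flexibility described above.
for t in serial_dilution("stock_A", "buffer", steps=4, factor=10, working_volume_ul=200):
    print(t)
```

In a real system these instructions would be translated into device-specific scripts; the point of the sketch is that because the software generates every step, it also knows the experimental context of every resulting data point.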

I asked if there were any case studies that could be shared to show how this works in a real-life setting. He described how their case studies range from relatively simple workflows, like automating ELISA assays, to extremely complex experiments. They recently co-published a study with Biomedica in which an experiment to improve the process for generating lentiviral vectors increased viral vector titer tenfold over the course of just two very sophisticated Antha experiments.

Markus shared that Biomedica is good at automation and at programming automation. When they looked at the scripts generated by Antha, they determined that it would have taken them a week to program each experiment that Antha generated on the fly. He says this illustrates why automation often isn't used: scientists don't have time to spend a week programming automation for an experiment they might only run once, so there is not sufficient return on the time invested in programming.

Synthace has also generated case studies around automated data structuring. In this example, he explained that with bioprocessing you have bioreactors and sensor data that must be aligned with sample data to provide a full picture. Antha enables this data structuring.

Next, I asked Markus if he could talk a bit about the vision for computer-aided biology and how he sees the space evolving over the next five years. He explained that computer-aided biology is a vision of how we can use twenty-first-century tools to help us pick up on the complexities of biology. It can give us insights that would not have been possible without applying machine learning. This doesn't mean replacing scientists and engineers with AI, but rather flagging things they may have missed in highly complex data sets.

He said that at conferences there has been a growing swell of excitement around using these methods for drug discovery. However, these techniques are just as important in the lab for interpreting bioprocessing results. Reaching this sort of future, one that includes AI-augmented insight, requires routinely producing highly structured data sets with every experiment, so that results can be compared experiment to experiment.

Frequently there is an expectation that scientists and engineers should do the data structuring themselves, but it is highly onerous, and the techniques used vary widely from company to company. There needs to be a system in place that incorporates as much automation as possible. This will open up the opportunity for an ecosystem of hardware and software working together.

This led me to ask the next question, on building a common culture between science and software. Markus explained that this is interesting because scientists and software engineers tend to think in fundamentally different ways. Biologists are used to a large amount of ambiguity because they deal with such complex systems on a day-to-day basis. For software engineers, on the other hand, things are a lot more defined and a lot more predictable, and they are used to making things happen in a powerful way very quickly.

He said that it is fun to see them work together to discover what is possible and what’s not possible and learn from each other. He goes on to say that what is nice about the Antha system is that both sides can understand it – scientists want to use it to automate the protocol and software engineers can see the logic within the protocol because it is highly defined.

He then told a story about hearing a speaker from Amgen discuss this same point and she said the common culture is “just happening naturally” as more digital tools are available and scientists are shifting their mindset about how to conduct their science.

To learn more about Synthace and computer-aided biology, please see…


Balancing Risk, Cost and Speed During Clinical Development While Still Maintaining Quality



In this podcast, we talked with Thierry Cournez, Vice President, BioReliance® End-to-End Solutions, MilliporeSigma. We discussed effective ways for emerging biotechs to produce material quickly and cost-effectively for pre-clinical and clinical studies, and how to balance the need to move quickly against cost and quality.

Show Notes

I began the interview by asking Mr. Cournez how MilliporeSigma works with emerging biotechnology companies. He said that developing and implementing a clinical material process can be time consuming and complex. MilliporeSigma acts as a CDMO partner for startups and small biotechs developing commercial biological drugs. They help these companies balance speed and cost by providing a comprehensive suite of products and services to accelerate clinical biopharmaceutical development, scale up production processes, and design and implement single-use commercial production facilities. They do not focus only on the technical and quality phases of drug development; they also prepare supporting information for regulatory agencies, making sure companies get all the support they need to explain what is in the dossier and answer agency questions. Most importantly, they want to make sure their customers are successful.

Next I asked Mr. Cournez how working with a CDMO differs from working with multiple suppliers. He said that after finalizing their strategy, small biotechs need to decide how to structure their team. They have to ask whether they have all the expertise they need in-house or whether they need a partner to cover the considerable needs of process development; chemistry, manufacturing and controls; filling; and regulatory affairs. Many small biotech companies will choose to work with a partner, and choosing one should not be taken lightly; Mr. Cournez feels it is key to the success of the project. Ideally, the CDMO partner can interface with most of the individual suppliers and cover most of the requirements, but the biotech company still needs an in-house person to work with the CDMO. At MilliporeSigma, they have a dedicated project manager for each of their clients.

I then asked how small biotechs can produce material quickly and cost-effectively to reach the pre-clinical stage. He said this is where a CDMO can offer real benefit. At MilliporeSigma, he explained, they learn from previous projects where time and money can be saved. He said they recently identified a two-month time savings in the development of a molecule for a client.

Next we talked about how a CDMO can be instrumental during analytical development. He explained that analytical development is a critical part of the development process. Molecules can be really complex, and there is no way of knowing in advance what issues may arise. This is why analytical methods development and process development must work side by side to create a seamless process.

I then asked Thierry what he thinks are the key considerations for small biotechs working toward clinical development. He said that clinical development for a biotherapeutic is long and very challenging. Companies need to move quickly through the early clinical phases while demonstrating safety and efficacy. At the same time, they need a reliable process for producing clinical materials that ensures they will meet all quality and regulatory requirements. Long term, companies need to consider the final cost of their drug at commercial scale. Because of the complexity of this process, relying on an experienced partner can really help them navigate these decisions early and stay informed of everything that could impact decisions down the road.

I then asked how companies can balance cost, risk and speed during clinical development while still maintaining quality. He said that biotech companies all want the process done well, cheaply and fast, but under no circumstances can quality be sacrificed. MilliporeSigma employs a flexible approach to each biotech company's needs and constraints. Again, experience is very important, as skilled partners can suggest new approaches and solutions. One example of an innovation born of experience is MilliporeSigma's integrated plug-and-play upstream development services. This service eliminates the need to work with multiple vendors for upstream development, thereby reducing bottlenecks and lowering time to clinic by three months.

I closed the interview by asking if there was anything else Thierry would like to add for listeners. He said that these are very interesting times, with many novel therapeutics in development and many groups working together to get these products into the hands of patients. MilliporeSigma is looking forward to these collaborations.