Optimizing Protein Expression and Purification using Mass Spectrometry Analysis

In this podcast, we conducted a panel discussion with experts from Peak Proteins on common protein challenges and the use of mass spec to overcome expression and purification issues. We also discussed how mass spec can be used when working with complex proteins and for in-process modifications.

Show Notes

I began the podcast by asking about what types of organizations use Peak Proteins’ services. Dr. Abbot explained that these are companies and organizations involved in the discovery of new medicines, including biologics, vaccines and small molecules.

Then I asked him if he could describe the kind of work that they are doing and how they are helping their customers with discovery projects. He said that they use purified engineered recombinant proteins as research reagents to support drug screening and structural biology using X-ray crystallography. Many screening efforts rely on specifically designed proteins, whether for high throughput screening of a few million compounds or for biophysical methods such as surface plasmon resonance applied to smaller compound sets. For the medicinal chemist, being able to design drugs based upon detailed knowledge of how existing molecules interact is very useful. Peak Proteins generates this information using X-ray crystallography of highly purified proteins.

Next I asked Dr. Abbot what are the most common protein challenges that they encounter. He explained that because every protein is an individual molecule, the way it is designed, made and purified has to be individualized to that protein. They see a vast diversity of proteins every year: membrane proteins, protein complexes, flexible enzymes, stable cytokines and many more. They begin by thinking about how best to design each protein and, once it is purified, how to characterize it. This requires a huge toolbox of different methodologies. Engineering the protein to make it more stable, soluble and fit for purpose is critical. Challenges include proteins that express poorly, are physically unstable or sensitive to proteases, or fold slowly or incorrectly. Mark pointed out that the cell host and how it is cultured are only part of the process. It is very important to consider the design of the protein and how it is to be purified, and, once purified, how it will be characterized.

Next I asked if they could describe the cell culture systems that they use to express proteins and how they are used. Mr. Elvin described the several types of expression systems used for protein production, including mammalian, insect, bacterial, plant, yeast and cell-free expression systems. At Peak Proteins they use three different systems to express the various proteins that clients request: a transient HEK293 mammalian suspension expression system, a baculovirus-infected insect cell expression system and various E. coli expression host strains. Each system offers unique advantages and disadvantages relative to the others.

He went on to say that it’s very important to choose the right expression system for your protein. If the target is a “simple” protein which doesn’t require much post-translational modification, then it would be better to go for expression in an E. coli-based system. This has the major advantage of being cheaper, as the growth medium is generally inexpensive and can often be made in house. At Peak Proteins they routinely use T7-based expression vector systems to maximize soluble expression and can also refold proteins from E. coli-derived inclusion bodies if required.

If the proteins are more complex, then you would look to produce them in mammalian systems such as HEK293 or CHO cells. In these systems protein folding is much more efficient and protein secretion into the surrounding growth media is good, making the purification process much simpler, especially if used in conjunction with affinity tags and solubility tags. The main reason to use mammalian cells to express the protein of interest is the complex post-translational modifications that can be added to the proteins. However, the cell culture media are generally more expensive and there is a reliance on CO2-gassed incubators to grow these types of cell lines. At Peak Proteins they assess expression of proteins in their transient HEK293 suspension system after the transfection of cells with plasmids harboring the gene of interest, which can be scaled up to a reasonable level. They routinely work with secreted and intracellular proteins, as well as many difficult-to-express proteins.

He continued with insect cell expression systems and how they are becoming more popular as well. At Peak Proteins they use a flashBAC baculovirus insect cell system for expressing secreted, intracellular and membrane proteins. This baculovirus-based expression system again offers advantages such as correct post-translational modifications, while not as complex as mammalian systems, and scalability. However, like mammalian systems, the growth media costs are higher relative to bacterial expression systems.

Next, I asked how mass spectrometry helps them overcome the challenges of expressing so many different and diverse proteins. Ms. Rowlinson described how the Sciex ExionLC coupled to the X500B mass spec is an ideal system to deliver fast results and address the challenges of protein expression. The system seamlessly switches between intact mass and peptide mapping analysis. Intact mass analysis (LC-MS) is conducted via a C4 column, and the run times are routinely only five minutes. Peptide mapping (LC-MS/MS) uses a C18 column, and run times are around ten minutes. Both techniques use reversed-phase LC, and the system switches between the two columns depending on the LC method selected.

She said that it enables them to work out exactly what they have or have not made in a way that SDS-PAGE or western blots alone cannot do. It is much more definitive. It is primarily an “in process” check that affects the decisions they make on expression systems, how they use them and how they purify the proteins.

I followed up by asking about a few specific areas of work with LC-MS and whether the panel could share how they have used it in each of these areas.

Protein characterization

Mark described that one of the most common post-translational modifications for secreted proteins is glycosylation. There are two primary forms of glycosylation on secreted proteins: N-linked glycosylation of asparagine residues and O-linked glycosylation of serine and threonine residues. He then explained why glycosylation is a real problem for X-ray crystallography and how mass spectrometry provides details that offer an opportunity to understand glycosylation better. He then provided an example of how they were able to use this to successfully crystallize a protein and said that there were many examples of how mass spec has allowed them to better understand protein glycosylation.

Troubleshooting difficult purification

Mark provided a protein refolding example involving a disulfide-bonded heterodimeric protein expressed in E. coli as inclusion bodies. The two separate chains were each denatured and then refolded under conditions that they thought would allow the correct disulfide bonds to form. Once refolded, the protein was purified using chromatography and then analyzed on reducing and non-reducing gels. The non-reducing Coomassie-stained gel showed two bands, while the reducing gel showed a single band. They determined that the reason for this discrepancy was that two species were present and that the disulfide bonds had become scrambled during refolding. This was a problem that could not have been identified without using mass spectrometry.

In-process modifications

Next Mark provided a couple of examples of when they use mass spectrometry for in-process evaluations. In one example it was used to identify contaminant bands through peptide mapping of the bands. In another example it was used to determine whether the protein of interest was present when there was poor expression and 30-40 other proteins present. He explained that in some instances just a small amount of protein is enough and can be purified using multiple chromatography steps. However, before that work is done, it is important to know that the protein is actually there.

Working with complex proteins

Mark explained that they frequently work with membrane proteins, which are among the most complex proteins to work with and are often very unstable. As a result, they are very difficult to express. LC-MS is very important as it allows them to clearly identify whether the membrane protein of interest is present. They use LC-MS/MS and peptide mapping to confirm that Coomassie-stained bands really are the membrane protein.

They also work with multi-protein complexes, and LC-MS is critical for identifying all components, any modifications, and the mass of the whole complex.

I then shared that I had recently covered a presentation by Janssen on the use of SCIEX’s QTRAP LC-MS system to analyze spent media metabolites in CHO cells and the information that could be gained with this approach. I asked if they had used mass spec to improve the growth or productivity of their culture or for process optimization. Mr. Elvin shared that at Peak Proteins they haven’t as yet used mass spec to improve the growth or productivity of their mammalian or insect cell lines or indeed for process optimization.

However, it’s true that the insights into the metabolome can offer a powerful indication of “cellular status” or “cellular health” which can provide information on the molecular events that determine, regulate or limit cell specific functions such as cell growth and/or recombinant protein production.

Historically, MS-based metabolomics methods have been utilized to improve the bioprocessing capacity of mammalian cells, and these datasets have subsequently been used to rationalize the design and improvement of chemically defined media, to optimize cell line-specific feeding regimes in order to boost productivity and improve product quality, to define metabolic markers of high productivity and, more recently, to define targets for specific cell line engineering. All of this has allowed mammalian cells, especially CHO cells, to produce recombinant protein in the grams per liter range instead of the microgram or milligram range.

Much less is known about the metabolome of the various recombinant baculovirus-expressing insect cell lines, but hopefully, with more metabolomics studies being performed as a matter of routine, we will see an improvement in the bioprocessing capability of this system in the near future. This is something I am very interested in following up here at Peak Proteins in order to increase the bioprocessing capabilities of our own insect cell expression system.

Next I asked when and how they use LC-MS to change culture conditions. Dr. Abbot shared another example of how they used LC-MS when working with a recombinant protein that was primarily insoluble. LC-MS identified the band as belonging to the insoluble fraction, and after affinity chromatography they weren’t able to see any protein of interest. This told them definitively that moving forward with that construct in that cell line wasn’t possible, so they decided to change hosts and move to the baculovirus insect cell system.

In another example with a predominantly insoluble protein, they were able to purify enough protein to confirm by LC-MS that it was the protein of interest. Since they knew the protein was present, they were able to reduce the temperature of the E. coli process to improve protein solubility.

Lastly, they used LC-MS to identify when a proteolytically sensitive protein produced in an insect cell line should be harvested. Typically harvest would occur at 48 hours, but they found that there was much less proteolysis at 36 hours.

Next, I asked how the LC-MS system has fit into their process workflow and how implementation of the system went. Dr. Abbot explained that they use it throughout the process, from the first column, to check on what they have purified, to a final check on the product that is delivered. He reiterated that it is particularly useful for identifying problems and deciding the best way forward. Implementation of the system was easy, and everyone in the lab runs their own intact mass and peptide mapping LC-MS/MS samples.

I closed the interview by asking what advice they had for companies that are considering adding an LC-MS system to their lab. Dr. Abbot said that LC-MS has helped them to troubleshoot, problem solve and conduct final QC on the hundreds of proteins that they work with each year. He said that many think of LC-MS as just a tool for QC, but they think of it as much more than that. He believes that it is an invaluable tool to have in the toolkit of any protein biochemistry or structural biology lab.

For more information, please see the webinar “LC-MS platform a wonderful cure for protein science headaches.”

New Lab Set Up – Best Practices for a Successful Start

In this podcast, we spoke with Dr. Ann Rossi Bilodeau, Senior Bioprocess Applications Scientist, and Dr. Catherine Siler, Field Applications Scientist, both with Corning Life Sciences, who shared insightful tips for setting up a new lab. We discussed how to create a lab plan, maximize lab space, and stay within budget and timelines. They also shared their experience implementing lab safety and training as part of the new lab launch.

Show Notes

We began the podcast by talking about how setting up a new lab can feel overwhelming, the best place to start and the first steps to take. The advice given was to start with four simple questions:

  1. What do I need? Take your aims and break them into experiments and then break those experiments down to steps to get a framework for the types of equipment and consumables that you will need to conduct your work.
  2. What do I have?
  3. What can I borrow or share? Are there other labs in the same building that could share equipment with you if it is something that you don’t need often?
  4. Who will help purchase? Are there other labs that would be willing to split the cost of equipment?

Once these questions are answered, you will have a good list of what you need to buy and whether there are any opportunities to share those costs. Other advice included taking pictures and measurements of the space. It is important to understand the space layout and amount of space to be able to plan for lab set up. Additionally, it is helpful to know your administrators, as they may have contacts with vendors or other labs that could help with the purchasing.

Next, I asked who they thought should be involved in the new lab planning and set up. They said that there would always be unexpected voices, but there are certain people who can really contribute to your success. They recommended connecting with other principal investigators at your university, as they will be familiar with any quirks of the building or space. Secondly, use your vendors. They will have knowledge of the equipment or consumables you need, and since they see many other labs, they may be able to provide additional advice when it comes to set up of the lab space and specifically their equipment in that space.

It is important that all personnel have some input, as they understand how specific protocols are run and any associated ergonomic challenges or space requirements. It is also important to designate a lab manager or principal investigator to take all opinions, weigh them and make final decisions to keep things organized and on track.

We then talked about maximizing lab space. Both Cat and Ann felt that creating zones for areas of equipment was an important consideration in making the lab more efficient. Zones can be created through the lens of safety requirements, which makes it easier for technicians to comply and makes the process overall more efficient. It is also helpful to think about the size of the equipment and, if possible, group large pieces of equipment together in a common space to ensure there is enough room to navigate around the equipment. This will leave more functional lab space elsewhere and will simplify traffic patterns. Function is key, so it is important to really consider how the lab will function.

I then asked them if they had advice on how to stay within budget when purchasing equipment. They had some great tips. Think about how much you will actually use a piece of equipment. If it isn’t very often, think about borrowing or sharing with another lab. You can also look into whether your university has a core facility that you could use on a pay-as-you-go basis.

This is another good place to use vendors, as they might offer new lab bundles that can save money and time. If a piece of equipment is a workhorse in your lab, it is important to consider spending a bit more and getting the best equipment for your application. Also, service contracts with preventative maintenance plans can really save time and money in the long run. It is the worst to lose time and even experiments due to equipment issues.

Next I asked about staying within budget for consumables. Ann and Cat had great thoughts here too. If a consumable is unique to your work and no other lab around you would stock it, then it is critical to have it on hand at all times and stock up. If it is something that you could borrow if you run low, then maybe don’t worry about stocking up in advance. It is also good to take advantage of promotional deals that your vendor might be running, but be sure to consider where to store back stock. The same pricing principle applies here: it might be worth purchasing the better quality consumable if it means that your experiments will be more consistent and reproducible.

We then discussed balancing being well stocked vs. overbuying. They both suggested putting the vendor to work here. Vendors will have insight into how long it takes to get products that are critical to your work. If there is any lead time on these products, then maybe you want to stock up. If it is something that they always have in inventory, then you don’t need to buy as much. It is also important to decide who in the lab will have responsibility for ordering. It is good to have one person in charge, so that they can be sure that the lab always has what it needs and there aren’t any miscommunications.

Next we talked about lab safety and ensuring the best safety practices. Cat explained that what was helpful to her in the past was to set up “Guardians” of each piece of equipment. They were then responsible for training new employees for that piece of equipment.

When discussing lab training, there were other key tips. Since training is ongoing, especially in labs with high turnover, a lab manual is very helpful. It can be electronic and could include trainings, quizzes, etc. Another helpful resource is on-demand webinars. For example, Corning has webinars on basic cell culture techniques and other basic lab introductory topics. General topics like these can be used as training tools that do not have to be created internally.

Another important aspect of training that shouldn’t be ignored is training for administrative, maintenance or safety tasks. These are critical to lab functioning, and it is important that more than one person knows how to order supplies or maintain equipment, for example.

I closed the interview by asking if they had any additional advice. The takeaways were to use your vendors and distributors to help you navigate the process. There are lots of choices when it comes to products, so tell them what is important to you (performance, price, brand, etc.) and then let them present a list of options based on those needs. They can suggest promos and other ways to save money and order efficiently. For example, Corning is currently running a New Lab Promo: stock your lab with brands you know and trust and we’ll give you free lab supplies equal to 25% of your total purchase. Some restrictions apply.

Lastly, planning is so important; be sure to take the time to plan upfront.

For more information, please see the following resources:

How to Set Up a New Lab From Scratch

Demystifying the FBS Selection Process – A guide for evaluating product quality, origination and cost consideration

In this podcast, we talked with Chris Scanlon, Global Marketing Development Manager at Thermo Fisher Scientific about how to effectively evaluate which FBS product is right for each application. This includes weighing product quality levels and country of origin. Chris also shares strategies for maximizing purchasing options and new FBS products on the horizon.

We began the interview by talking about how FBS has evolved to include several different product quality levels to meet the needs of end users. Chris shared that in the 14 years he has been working with FBS, there has been an increasing number of tests performed to meet specific application requirements. Twenty to thirty years ago there were far fewer tests conducted; now up to 96 tests are run on Gibco sera to give serum a more defined scope.

Next, I asked Chris about serum origin as one way to differentiate serum, how important it is, and what it tells consumers. He explained that there are two areas where origin can be important. The first is with respect to viruses. There are some countries that still have viruses like foot-and-mouth disease and bluetongue, for example, so it can be important to understand the origin.

The other area where origin is important is in regulatory requirements. FBS from certain countries can’t enter other countries due to regulations, thus it is important to be mindful of serum origin and the FBS import requirements for your country.

He went on to describe a recent study that Thermo Fisher Scientific conducted. The study took 18 months to complete and involved over 500 researchers. In the study, they surveyed researchers about the specifications that were most helpful to them in determining what they would purchase. Fifteen specifications came back as most important, with origin ranked number 12. This was a surprise, as they expected origin to be higher on the list. It turned out that endotoxin, hemoglobin, total protein and filtration quality made the top of the list, but origin wasn’t in the top 10.

I thought this study was very interesting and asked Chris what some of the characteristics of FBS were that end users were looking for most. He said that endotoxin was number one, as it can really define the quality of serum. Endotoxin is measured at collection, and it demonstrates how well the raw serum was collected: for instance, whether it was collected using a closed system and aseptic technique. Once you have that endotoxin number, it can’t be changed, so it gives researchers a real look at how carefully collection was conducted. This in turn is an indication of whether the serum was exposed to any other contaminants that would hinder research. As such, endotoxin serves as a real quality marker. In addition to endotoxin, hemoglobin, total protein, osmolarity, pH and filtration were listed as specifications that help researchers decide what to buy.

Next I asked Chris about Thermo Fisher Scientific’s FBS categories and how they are designed to help researchers find the best product for their needs. He said that they recently changed their categorization from five categories to three. They made the change because with five categories there was too much overlap between product specifications and it was difficult to understand the product differences. The team looked at the entire portfolio and compared this to the responses that they had received in the survey to develop a system that was easier to navigate. The new categories are Value (up to 50 tests), Premium (up to 96 tests) and Specialty, where products are delineated for specific applications. This makes it easy for customers to decide among the three based on the culture requirements of the cells they are using and their specific application.

I followed up by asking about Specialty FBS and their custom options. Chris explained that they do have custom options available and a team that works with researchers to find the serum that best meets their needs, timeline and budget. However, he also shared that before going custom, customers should look at the Gibco portfolio of products to see if there is a non-custom match. He also described their iMatch program, a sera lot matching tool. Researchers can enter their FBS requirements and the program will search current inventory to find a match. If the match is 80% or greater, Thermo Fisher Scientific considers it an iMatch guarantee, which means they are confident that customers will see the same consistency as with past lots of FBS with the same requirements.
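The 80% threshold behind the iMatch guarantee can be pictured as a simple spec-matching calculation. The sketch below is purely illustrative, not part of any Thermo Fisher tool: the function name, specification names and acceptable ranges are all hypothetical assumptions.

```python
# Hypothetical sketch of lot matching: given a researcher's requested
# specification ranges and a candidate lot's measured values, compute the
# fraction of specs the lot satisfies. Names and ranges are illustrative.

def spec_match_fraction(requested, lot):
    """requested: dict of spec -> (low, high) acceptable range;
    lot: dict of spec -> measured value."""
    met = sum(
        1 for spec, (low, high) in requested.items()
        if spec in lot and low <= lot[spec] <= high
    )
    return met / len(requested)

requested = {
    "endotoxin_EU_per_mL": (0.0, 1.0),
    "hemoglobin_mg_per_dL": (0.0, 25.0),
    "total_protein_g_per_dL": (3.0, 4.5),
    "osmolarity_mOsm_per_kg": (280.0, 340.0),
    "pH": (7.0, 7.6),
}
lot = {
    "endotoxin_EU_per_mL": 0.5,
    "hemoglobin_mg_per_dL": 18.0,
    "total_protein_g_per_dL": 3.8,
    "osmolarity_mOsm_per_kg": 400.0,  # out of range
    "pH": 7.2,
}
fraction = spec_match_fraction(requested, lot)
print(fraction)               # 4 of 5 specs met -> 0.8
print(fraction >= 0.8)        # at or above the 80% guarantee threshold
```

With four of five hypothetical specs in range, this lot lands exactly at the 80% boundary described in the interview.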

We then discussed cost and factors that can affect pricing. Chris explained that the cost and supply dynamic for FBS can be difficult to predict. He recommends that researchers work closely with suppliers to see if they have any insight on pricing changes to help plan purchasing for the future. Of course there are factors like drought, storms or geopolitical issues that can’t be predicted, but buying in larger quantities for a year or 18 month supply can help to combat pricing fluctuations. In addition, with larger purchases there are more discounts available and cost is locked in for the supply purchased.

We then talked about how selecting an FBS can be quite confusing and I asked Chris what he recommends to researchers who are unsure about finding the right FBS for their application. He agreed and said that one thing that can make selection even more challenging is that there isn’t a standardized Certificate of Analysis (CofA) or naming convention to help customers compare serums across brands. This is something that the International Serum Industry Association (ISIA) is working on, but in the meantime he recommends that researchers really look at the CofA for each serum to understand the details behind brand names or categories. It is important to be sure that researchers are comparing two like products. It is also good practice to compare the new product’s CofA to your previous FBS lot’s CofA to best understand how the new product will perform in your cell culture system.

I closed by asking Chris if he had anything else to add for listeners, specifically any new products or technologies on the horizon. He shared that Gibco has launched two new FBS products which were designed to give researchers even more information about the serum they are purchasing.

The first product is their Tet System Approved FBS, which is functionally tested to ensure optimal induction for protein and gene expression systems. Tet System Approved FBS is specially tested for cell culture applications using tetracycline- and tetracycline-derivative-regulated gene expression systems. Cell-based assays that detect the presence of tetracycline and tetracycline analogs are used to select FBS that delivers the highest range of induction. The Tet System Approved FBS will be available in all the innovative packaging options, including the 500 mL Boxy bottle and the One Shot 50 mL bottle.

The second product is their Premium Plus FBS with 96 quality tests, including functionality testing in six commonly used cell lines to test the consistency of cell proliferation. To be released, Premium Plus FBS must be greater than 80% of the control for cell proliferation. They also have growth, cloning and plating release specifications that must be greater than 80% of the control. The endotoxin specification is best in class at ≤1 EU/mL. Thermo Fisher Scientific developed this product to answer the demand for a more stringent quality serum, which is ideal for vaccine, therapeutic, diagnostic, and other demanding research applications.
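The release logic described above (functional assays greater than 80% of control, plus the ≤1 EU/mL endotoxin spec) can be sketched as a simple check. This is an illustrative reading of the interview, not Thermo Fisher's actual QC code; the function and assay names are assumptions.

```python
# Illustrative sketch of the Premium Plus release criteria described in the
# text: each functional assay must exceed 80% of its control value, and
# endotoxin must be at or below 1 EU/mL.

def passes_release(lot_results, control_results, endotoxin_eu_per_ml,
                   threshold=0.80, endotoxin_limit=1.0):
    """lot_results/control_results: dict of assay -> measured value."""
    functional_ok = all(
        lot_results[assay] > threshold * control_results[assay]
        for assay in control_results
    )
    return functional_ok and endotoxin_eu_per_ml <= endotoxin_limit

control = {"proliferation": 100.0, "growth": 100.0,
           "cloning": 100.0, "plating": 100.0}
lot = {"proliferation": 92.0, "growth": 88.0,
       "cloning": 85.0, "plating": 90.0}
print(passes_release(lot, control, endotoxin_eu_per_ml=0.4))  # True
```

A lot failing any single assay, or exceeding the endotoxin limit, would fail release under this sketch.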

Learn more about Gibco sera’s commitment to quality and innovation since 1962 at thermofisher.com/fbs

Reducing Fill Risk in Drug Product Manufacture Utilizing New State-of-the-Art Systems and Platforms

In this podcast we talked with DQ Wang, PhD, Vice President of Formulation, Fill and Finish at WuXi Biologics, about their DP4 multi-product fill & finish facility featuring the Vanrx SA25 robotic, gloveless, isolator-based filling system. The system significantly reduces drug product fill risk and provides greater aseptic assurance. The facility is the first in China to use this technology platform, and the system fits perfectly with “scale-out” manufacturing paradigms. This highly flexible platform can easily transition between various Container Closure Systems (CCS) such as vials, pre-filled syringes and cartridges, including the new Ready-To-Use (RTU) formats.

We began the interview by talking about the completion of pre-filled syringe drug product runs in one of WuXi Biologics’ drug product fill facilities in Wuxi, China. What made these runs so unique is that they were produced in a new state-of-the-art facility that utilizes the Vanrx SA25 system. The system allows WuXi Biologics to greatly reduce risk for each fill and the amount of human intervention compared to traditional automated fill lines. Dr. Wang went on to say that it also allows them to perform fills for the first time using pre-filled syringes. He noted that adding this capability is another huge step towards their goal of building open-access technology platforms with the most comprehensive capabilities and capacities in the global biologics industry.

I followed up by asking DQ how human intervention can be reduced even further than with the automated fill lines already prevalent in the industry. He explained that this gloveless, isolator-based system performs fully programmable, robotic functions for all aspects of the fill. These include VHP sterilization of the container closures, liquid dispensing into the CCS of choice using a single flow path, capping, and delivery of the batch. After those steps, the Clean in Place (CIP) function is also programmed into the run. Lastly, there is integrated, programmable, robotic air and particle sampling, among other in-process checks, and an integrated electronic batch record (EBR). All of this is done without human intervention, thereby removing one of the key areas where mistakes can be made during drug product manufacture.

Next I asked about the CCS chosen, pre-filled syringes, and whether the unit was dedicated solely to this type of configuration. He clarified that the unit is highly flexible and can handle a wide variety of CCS types and sizes, such as vials and cartridges, in addition to pre-filled syringes. Additionally, the unit is able to handle the new simplified, two-component Ready-to-Use (RTU) CCS. These RTU formats reduce in-run risks: by design, they reduce rejection rates caused by particles and other part defects that are more common in traditional rubber stopper and aluminum crimp seal configurations. He added that the system is also capable of performing inert gas overlays and, for PFS and cartridges, offers a servo-driven vacuum plunger design for bubble-free fills. Thus, the system is very adaptable to a wide range of CCS types.

We transitioned the discussion to the primary drivers behind their decision to implement the Vanrx SA25 system in this new DP facility. DQ explained that their focus is always on patient safety, and there are many reasons why the system is ideal from that perspective. There were other time and economic drivers as well. WuXi Biologics’ manufacturing goal is to quickly and safely advance their customers’ remarkable therapies to market. They knew that the system’s modular design and ease of installation would accelerate the process of adding DP filling capacity, especially when compared to conventional filling systems. The qualification and validation of the process also takes less time due to the design properties of the modular system. This system has more standardization throughout the different filling operations, and this standardization can reduce costs and time to clinic or market. It also fits perfectly with WuXi Biologics’ “scale-out” manufacturing paradigm.

Then I asked DQ if he could elaborate on the cost savings that can be achieved with this system. He said that this is actually one of the key benefits of implementing a filling unit like this. Overall, a system like this reduces the cost of drug product operations because it requires less space and is therefore less capital intensive. It requires fewer operators (2 versus 10 on traditional filling lines) and less training. With automated EM and RCS sampling and an integrated electronic batch record, the workload is lighter, which means reduced overall operating expenses and lower cost on a per-run basis. Because the design uses only single-use components and runs an automated CIP after every batch, changeover is also faster. With changeovers in as little as 45 minutes, more runs can be completed per unit of time, and the yearly cleaning and qualification shut-down costs are cut in half.
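The changeover arithmetic behind "more runs per unit of time" can be sketched with hypothetical numbers; the 8-hour fill duration and 120-hour weekly operating window below are illustrative assumptions, not figures from the discussion:

```python
# Illustrative (hypothetical numbers): how shorter changeovers increase the
# number of fill runs an aseptic line can complete in a fixed time window.

def runs_per_week(fill_hours, changeover_hours, hours_available=120):
    """Count complete fill-plus-changeover cycles that fit in the window."""
    cycle = fill_hours + changeover_hours
    return int(hours_available // cycle)

# Assume an 8-hour fill. A 45-minute changeover vs. an 8-hour changeover:
fast = runs_per_week(8, 0.75)  # 120 / 8.75 -> 13 runs
slow = runs_per_week(8, 8.0)   # 120 / 16   -> 7 runs
print(fast, slow)              # prints: 13 7
```

Under these assumed values, the shorter changeover nearly doubles weekly throughput on the same line.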

I then asked DQ if he could explain the “scale-out” paradigm and how the Vanrx SA25 system fits that purpose. To help explain the paradigm, he used WuXi Biologics’ drug substance (DS) manufacture as an example. For DS manufacture, WuXi Biologics uses single-use bioreactors. This allows them to obtain the scale needed without scaling up to larger stainless steel tanks. There are multiple benefits to this approach, including reducing scale-up risks to product quality. It also greatly reduces the process validation effort required during late-phase manufacturing, because the scale of manufacturing, from a bioreactor perspective, has not changed.

DQ then explained that, similarly, for the Vanrx system, the small footprint, modular design, lower capital requirements, and speed of installation and validation mean that additional Vanrx systems can be added to scale out DP production. This is much less costly and time-consuming than installing and validating a new dedicated commercial-scale fill line. As in the bioreactor example, late-phase process validation is greatly reduced by simply using another identical Vanrx system. He added that WuXi Biologics is in the process of installing more Vanrx systems in several of their new manufacturing facilities in China and the USA.

I wanted to understand a bit more about the risk reduction element of the system. DQ told me how being gloveless and robotic means less human intervention and thus greater aseptic processing control. The robotic functions are simply more reliable and precise, so filling is done more accurately, with fewer mistakes and less rejected product during QA review. The system is isolator-based and uses only single-use components, which is ideal from an aseptic processing perspective. WuXi Biologics has achieved 10X less variability during dispensing using the system’s advanced peristaltic pump compared to traditional peristaltic pumps. There is a single flow path for the drug product into the container, reducing risk further. With the integrated EBR, they are even able to reduce the risk of human mistakes in the generation and review of batch records.

I then asked how long WuXi Biologics has been utilizing the system and what the performance has been thus far. DQ described how the system came online and was GMP-ready in early Q3 2019. They used the system to successfully fill four batches of pre-filled syringes (PFS) with acceptance rates of up to 99.1%. For vial filling, their acceptance rate is even higher, at 99.6%. Since their first GMP run with the system in July 2019, they have had zero EM excursions, have passed all four media fills, have completed over 20 batches of drug product in RTU vials, and have passed 7 client quality audits with no critical findings. He added that the system has allowed them to expand their capacity and helped keep all projects on their intended timelines. With many runs now scheduled through the end of 2020 and into next year, they are continuously adding and validating more CCS types and expect to have cartridges validated for use in the system by 2021.

We then discussed whether regulatory agencies had approved drug product from these systems. DQ said that they know multiple clinical trials are now being conducted using DP manufactured in this system, and several BLAs are pending. Large pharma is also steadily adopting and installing these systems. RTU container closures are now in widespread use in the industry.

I wanted to follow up on the utilization of RTUs and why the industry is moving toward these new CCS. DQ explained that the material used in RTUs results in a reduction in particulates, which is an advantage. With RTUs, there is no glass-to-glass contact during fills, and the potential for breakage or glass particulates is eliminated. There are advantages at the clinical site as well: these configurations tend to open more easily and carry less risk of damaging doctors’ gloves as they open or work with the CCS.

In terms of cost, fewer components result in lower cleaning and sterilization costs. Two-part CCS also help reduce per-run costs, storage costs and inventory space. RTUs provide greater flexibility and efficiency because most manufacturers use the same tubs for various sizes of RTUs, which makes loading and unloading very efficient. There is also the added benefit of high cosmetic quality.

With so many advantages to this system, I asked DQ if they were moving entirely to this type of DP filling system. He said that from a CMO perspective, there are significant advantages to using the Vanrx system, especially for multi-use, multi-product and multi-CCS facilities. Product types like orphan products or other products that need to be made “on-demand” are also ideal for a system like this. However, these systems cannot completely replace the traditional fill line yet. There are some economies of scale for which traditional fill lines still have an advantage, for example with blockbuster drugs. There are also many drug companies that still need to use a particular CCS, which requires a traditional fill line. At WuXi Biologics, they still operate traditional isolator-based automated fill lines when they are a better fit for clients and are needed. However, the Vanrx system, along with the use of RTU CCS, has fulfilled a regulatory agency quality-by-design mission of continuous improvement and implementation of state-of-the-art technologies to reduce risk, lower costs and increase efficiency in biologics drug production. In the end, what benefits the patient is what really matters.

For more information, please see …

The Challenge of Staying Current with Regulatory Changes – How one company is providing a solution

In this podcast, we talked with Ken Chen, MBA, Senior Director, Regulatory Affairs, WuXi Biologics about staying current with regulatory changes. We discussed how WuXi Biologics recently began publishing a quarterly summary of regulatory updates on new or revised guidance documents from the various global regulatory agencies and how this is a valuable resource for anyone in the biological drug development arena.

I began the podcast by asking Ken what prompted WuXi Biologics to begin publishing these regulatory updates. Ken explained that WuXi Biologics works with companies from around the globe in all facets of product discovery, development and GMP manufacturing, across the full drug development continuum from preclinical to commercialization. Due to the nature of the services they provide, they need to offer an optimal regulatory CMC strategy and thus remain up-to-date with the relevant global regulatory expectations. They work proactively to minimize regulatory risks by identifying regulatory changes or new hurdles in advance, so that, if needed, they can rapidly perform a gap analysis and formulate a strong risk mitigation strategy.

We then discussed how having these updates provides a win-win for both WuXi Biologics and their clients. Ken described how the companies working with WuXi Biologics are filing INDs and BLAs in various jurisdictions, and WuXi Biologics must provide results and documentation that will be a part of those filings. Hence, WuXi Biologics must adhere to the quality and GMP standards required in those various geographic venues. He went on to say that since they are such a large organization, and because they provide a one-stop, single-source development platform, it was imperative that they develop an internal mechanism to keep their entire staff up-to-date with the relevant, wide-ranging and rapidly evolving regulatory updates and expectations.

I asked Ken about other resources available with regulatory updates. He said that there are other resources available, and that they use some of these to help put their update together, but many are not organized by agency and topic. In addition, most are not as wide-ranging, from a global perspective, as what they felt their team needed. They decided that a quarterly update was just the right size for teams to digest and, if necessary, act on to stay current without disrupting normal daily operations. They also thought that offering translations of relevant NMPA updates and documents would be unique among current industry resources.

Next I asked Ken whether the updates were just for the regions in which WuXi Biologics has operations: China, Europe and the United States. He clarified that they work with clients to file INDs in many countries around the world, not just in the U.S., China and the countries currently or formerly governed by the European Medicines Agency (EMA). Some of those other countries include Australia, Korea, Singapore and Japan, so their updates need to include the regulatory requirements from those countries as well. They also review updates coming from Health Canada, ICH, WHO and PIC/S, amongst others, when relevant to biological therapeutics and vaccine development.

I was curious about why they decided to make these updates available to the greater industry. Ken explained that many of the leaders at WuXi Biologics came from drug development companies, both large and small, from around the globe, and they wished they had had a similar consolidated and comprehensive update. Ken said that although many larger companies have teams assembling this information, many smaller companies do not have the same resources. WuXi Biologics thought that their update would be valuable for them, especially for companies wishing to file in multiple jurisdictions. They also provide a translated update of the new Chinese regulations, and they believe this provides value even to large organizations looking to bring drugs into the Chinese marketplace.

I then asked where they have seen these updates provide the most benefit. He said that it is really a means to start a dialog, internally or with a client, on how a new update or guidance document will impact an organization. Also, being alerted to a new Draft Guidance, especially when a given agency wishes to receive comments from industry, is extremely useful. The more the industry can provide feedback to regulatory agencies, the more effective the resulting guidance will be. He added that they also use the information from the updates in their discussions and collaborations with clients and partners to ensure everyone is on the same page with global requirements. They are often consulted on how best to file the same product globally in multiple jurisdictions because of their extensive experience and their unique understanding of the China regulatory landscape.

Next we talked about the newsletter’s coverage and whether it was more specific or broad in scope. Ken explained that it was designed to be broad, covering all aspects of biotherapeutics and vaccine drug development, but given the recent impact of COVID-19 and Brexit, they are also providing updates focused specifically on those timely topics. The idea is to keep staff and industry informed of the many relevant regulatory updates as they occur. He said, “There is so much information to keep track of, we felt a summary would be useful for specific topics as well.”

I followed up by asking why they chose COVID-19 and Brexit as topics to follow more closely. Ken stated that COVID-19 has a worldwide impact, and finding vaccines and treatments is possibly the number-one issue for our industry at the moment. This means that regulatory agencies around the globe are continuously meeting and issuing updates and guidance for this initiative. WuXi Biologics is trying to do its part in helping keep people up-to-date. Brexit is more layered, and its impact reaches beyond Europe as the EMA is relocated and the UK breaks away from EMA oversight. This could affect regulatory issues, the timing of drug approvals and international relations as well, so they thought it would be good to provide a summary of this critical change.

I asked Ken if there were any caveats to these updates. He said yes: they really only capture updates, guidance documents and regulations that impact biotherapeutics and vaccines. They do not address changes related to medical devices or diagnostics, or updates dealing solely with small molecule drugs. Also, the updates are meant as a tool and another resource for anyone hoping to stay current from a regulatory perspective; they should not be construed as regulatory advice, nor as representing any regulatory agency. Thus, they cannot state that the updates are truly comprehensive of all updates from the regulatory agencies, and the updates are for informational purposes only. In more “legal speak,” the content is provided “as is” without any warranty.

I closed by asking Ken where we can find the most recent regulatory update and how often it is published. Ken said that they publish it quarterly and that it is located on their website, where they also keep an archive of past updates. Readers can find updates and sign up for quarterly updates via email at https://www.wuxibiologics.com/regulatory-updates-archive/

Solving the challenges of standardizing cell counting to ensure reproducibility in experiments, assays and manufacturing processes

In this podcast we talked with Christian Berg, Global Product Manager at Chemometec about the importance of standardizing cell counting because while often overlooked, it is essential for reproducibility in experiments, assays and manufacturing processes. Cell counting is a challenging technique, with many pitfalls, that can delay entire projects. We discussed how new technologies are solving these challenges and enabling standardization of cell counting across organizations.

Show Notes

I began our interview by asking Mr. Berg if he could tell listeners why cell counting is so important. He explained that there are some applications where it is clear why cell counting is important, for example when cells are used for manufacturing pharmaceuticals. In biomanufacturing, you need cell counting to perform bioassays for toxicology testing, for use in quality testing of products and to measure the activity of the final product by using cells to report the activity of the drug. Most importantly these cell counting methods must be standardized.

He went on to say that with the emergence of cell therapies, you have completely new processes and new requirements for cell counting and companies must rethink how they count cells and what quality parameters they should focus on.

When you look at cell counting in the context of research, it is used as a tool to maintain the cells in culture while setting up experiments. However, the need for standardization is critical as cell density is very important for how cells function. Cell density influences powerful signaling pathways that can impact any biological function under study, so standardizing the cell counting in a lab is key to achieving reproducibility of a specific assay. Another challenge in cell counting is data management. For instance with cell therapies one must scale the process tremendously, so you might have hundreds of cell counting units, thus it will be important that the incoming data is easy to organize.

Next, I asked Christian if he could talk about how cell counting needs have evolved with new research and manufacturing demands. He said that in research the trend is toward larger experiments, such as cell-based screening assays. With large experimental setups, it is very important that the system is optimized with a consistent counting method to reduce day-to-day variation in the experimental setup. He also discussed the reproducibility crisis in research, where many published articles in peer-reviewed journals cannot be reproduced by other researchers. There have been many suggestions about how to correct this problem, and standardization is one of them.

He said that at Chemometec, they are seeing a lot of interest from research leaders looking to standardize laboratory methods. When Chemometec visits labs that are using manual counting, the team typically asks a handful of researchers to perform a manual count on the same sample. It is not uncommon to see a forty percent variation between the researchers. That kind of variation is a big problem within a research group, and it also makes collaboration with other partners more difficult.
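The operator-to-operator spread described here is usually quantified as a coefficient of variation (CV = standard deviation / mean). A minimal sketch, with made-up sample counts chosen only to illustrate a spread of roughly forty percent:

```python
# Hypothetical example: quantifying operator-to-operator variation in manual
# cell counts using the coefficient of variation (CV = stdev / mean).
import statistics

def coefficient_of_variation(counts):
    """Sample standard deviation divided by the mean of the counts."""
    return statistics.stdev(counts) / statistics.mean(counts)

# Five researchers count the same sample (cells/mL); the values are
# invented to illustrate the ~40% spread described above.
manual_counts = [1.0e6, 1.5e6, 0.7e6, 1.7e6, 0.8e6]
cv = coefficient_of_variation(manual_counts)
print(f"CV = {cv:.0%}")
```

A standardized automated method aims to drive this CV down to a few percent, which is what makes results comparable across operators and labs.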

Then we discussed the challenges of cell counting. Christian described how the most significant challenges in cell counting today can be split into two groups – technical challenges with performance of the cell counting and practical challenges when cell counting is used in a specific operation or research.

With bioprocessing, customers typically use older generations of cell counters that are mechanically complex and contain tubing, which is inherently unstable and causes instruments to break down quite often. If a cell counter breaks down, it is time consuming and expensive to fix. This is a technical challenge.

Conversely, he said that data management is a practical challenge that can be clearly seen in virus and cell therapy manufacturing. For these types of manufacturing, you cannot build a bigger steel tank if you want to increase production. Instead, you are forced to scale out the production setup, which means that you need more analytical equipment to support the production. He shared that Chemometec has customers that have hundreds of cell counters to support manufacturing needs and having good reliable data integration is essential to be able to control these processes. Therefore it is important to consider the performance of the cell counter as well as the implementation of the cell counting method in your process.

I then asked Christian to talk about Chemometec’s recently launched next-generation version of their popular NucleoCounter NC-200, the NucleoCounter NC-202. He explained that the NC-202 is actually a third-generation counter and represents a general overhaul of the hardware, the cassette and the software. They have updated the optics, the electronics, the cassette and the camera. They have also improved the light sources, which improves the quality of the images. This quality improvement permits greater extraction of detail from the sample and has led to improvements in instrument performance. For example, they increased the dynamic range to ten million cells and at the same time reduced the time it takes to conduct a cell count to thirty seconds. The improved data quality also permits viewing of much smaller particles and the ability to quantify cellular debris for increased robustness.

He went on to state that cell therapy has very high requirements for the scalability of analytical instrumentation. Chemometec developed the NC-202 to meet these demands by using modern tools to centralize data. He expects a large increase in the use of robotics, with automation efforts likely revolving around MES systems that allow operators to integrate the different analytical and manufacturing instrumentation on a single platform for more effective control of the manufacturing processes.

I followed up by asking about specific industry segments and the ways in which cell counting can improve these processes. We started with biologics manufacturing. Christian explained that in biologics manufacturing the cell lines are used to produce therapeutic proteins, so cell counting is employed to control these processes. There are different modes in which the cells can be grown, but in all cases the cell count is a very important analytical parameter for decision making. The cell lines used in biologics manufacturing are not the hardest to count, but the stability of the automated cell counting unit can be a challenge. The challenge lies in the instability of many of the older generations of cell counters, which are mechanically complex and contain internal fluidics that can clog the system. When a cell counting instrument breaks down during a process, that can be very unfortunate, because quite often instruments measure differently. This can cause a systematic difference in the cell count, which will affect the data you are monitoring. It can also affect the evaluation of the quality of the final product. One important feature of the NC-202 is that all units measure the same regardless of production year. Chemometec achieves this through very rigorous calibration during manufacturing.

Then I asked if he could talk about cell therapy and virus manufacturing processes. He explained that, in comparison to biologics manufacturing, the production of virus and cell therapies is much more difficult to scale. In virus manufacturing, the cell lines used to expand the virus are quite diverse because specific viruses have preferences for specific cell types. As a result, there is no single cell line that can be used for the expansion of all viruses. In addition, most of the cell lines used are adherent, which are much more difficult to scale than suspension cells like CHO cells. To overcome the scalability problem with adherent systems, companies use microcarriers, which allow the cells to be grown in bioreactors. However, counting cells on microcarriers is not trivial, because you cannot count the cells while they sit on the microcarriers. You need to strip the cells off the microcarriers prior to counting, and that process, together with the actual counting, can take up to thirty minutes.

He described a recent case in which a large virus manufacturer came to Chemometec and asked for help setting up a cell counting method for primary cells. They had been using manual cell counting, with operators sorting the cells into five categories by hand. This counting process was very labor intensive and took more than 30 minutes for a single cell count. Using the NC-202, Chemometec provided the manufacturer with an assay that performs the cell count automatically, reducing the analysis time from thirty minutes to thirty seconds.

Next I asked Christian to describe what the implementation of the NucleoCounter looks like for scientists interested in incorporating this into their workflow. He explained that the NucleoCounter is very easy to use and that is particularly important if you want to standardize a process. The cassette in the NucleoCounter replaces three workflow steps still present in other counting methods – the addition of dye, the loading of the cells into a counting chamber and the focusing required before performing the cell count.

He summarized by saying that the simplicity of the NucleoCounter operation makes it much easier to implement in any process, because the standard operating procedures are shorter and it is easier to train new people on the instrument. Another important advantage of the NC-202 is that it can easily be deployed in clean rooms. It is easy to clean and doesn’t require any maintenance; regular validation is enough to ensure consistent performance of the instrument.

I closed the interview by asking Christian if he had anything he would like to add for listeners. He said Chemometec is still operational, so companies that would like to try the NucleoCounter should contact them. Normally they would send out field application scientists to get companies started, but because of these unusual times they have made a video demonstrating how to set up the instrument and get started. Typically a week is sufficient to compare the performance of the NucleoCounter against other cell counting systems.

For more information about the NucleoCounter, please see

A Look Towards the Future of 3D Cell Culture – A panel discussion

Introduction and Overview

Debbie King

Researchers have used 2D cell culture since the early 1900s, but we know that growing cells on planar surfaces has some drawbacks. Cells grown in vitro in 2D space don’t behave like cells found in vivo. They lack critical cell-cell and cell-matrix interactions that drive their form, function and response to external stimuli, and this limits their predictive capabilities. More recently, 3D cell culture techniques have become popular because the cell morphology, interactions and tissue-specific architecture more closely resemble those of in vivo tissues. Spheroids, organoids and more complex 3D tissue systems, such as ‘organ-on-a-chip’, are examples of 3D cultures used by researchers to model native tissues.

Spheroids are simple, widely used 3D models that form based on the tendency of adherent cells to aggregate and can be generated from many different cell types. The multicellular tumor spheroid model is widely used in cancer research.

Organoids are more complex 3D aggregates, more like miniaturized and simplified versions of an organ. They can be tissue or stem cell-derived with the ability to self-organize spatially and demonstrate organ-specific functionality.

More complex yet are technologies like organ-on-a-chip, a multi-channel 3D microfluidic cell culture system that mimics whole-organ function with artificial vasculature. Cells are cultured in continuously perfused micrometer-sized chambers that recreate physiologically relevant levels of fluidic shear force and allow for gas, nutrient and waste transport to the cells, just as is observed in vascularized tissues in vivo.
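As a rough illustration of the shear forces mentioned above (not part of the panel's remarks), the wall shear stress in a wide, shallow microfluidic channel is commonly estimated with the parallel-plate approximation tau = 6*mu*Q/(w*h^2); the viscosity, flow rate and channel dimensions below are assumed example values:

```python
# Back-of-the-envelope sketch (assumed values, not from the discussion):
# wall shear stress in a rectangular microchannel via the parallel-plate
# approximation tau = 6 * mu * Q / (w * h^2), valid when width >> height.

def wall_shear_stress(mu_pa_s, q_m3_s, width_m, height_m):
    """Wall shear stress in Pa for a wide, shallow rectangular channel."""
    return 6.0 * mu_pa_s * q_m3_s / (width_m * height_m ** 2)

tau_pa = wall_shear_stress(mu_pa_s=1.0e-3,           # water-like medium, Pa*s
                           q_m3_s=10.0 / 60 * 1e-9,  # 10 uL/min in m^3/s
                           width_m=1.0e-3,           # 1 mm wide channel
                           height_m=100e-6)          # 100 um tall channel
print(f"{tau_pa * 10:.2f} dyn/cm^2")  # 1 Pa = 10 dyn/cm^2; ~1 dyn/cm^2 here
```

Values on the order of 1-10 dyn/cm^2 are typically cited as physiologically relevant for vascularized tissue, which is the regime these assumed channel parameters land in.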

How are spheroids impacting cancer research and what do you see as future applications for the technology?

Audrey Bergeron

Spheroids can be an improved model for cancer in the lab compared to standard 2D cell culture. When cancer cells are cultured as spheroids, they are able to maintain the shape, polarity, genotype, and heterogeneity observed in vivo (1). This allows researchers to create models that are much more reflective of what’s going on in the body. For a simple example, drug penetration into a 2D monolayer of cells is completely different from drug penetration into a solid tumor. In a 2D monolayer, each cell is exposed to the same concentration of drug, whereas in a spheroid, like a solid tumor, there are gradients of drug exposure.

More and more, we’re seeing researchers move away from cancer cell lines and toward specialized cancer models, such as patient-derived models. The hope here is to find the appropriate therapies for each individual patient.

(1) Antoni, D., Burckel, H., Josset, E., & Noel, G. (2015). Three-dimensional cell culture: a breakthrough in vivo. International journal of molecular sciences, 16(3), 5517–5527. doi:10.3390/ijms16035517.

What tools and technologies are needed to fully realize the potential of spheroid culture models?

Debbie King

One of the key parameters for success with spheroid culture is controlling the size of the spheroids. It can be very difficult to get consistent, reproducible results if the starting spheroid culture is not uniform in size and shape. Cell culture tools available on the market now, such as ultra-low attachment plates, facilitate the formation of uniformly sized spheroids for many research applications from low to high throughput modalities.

Hilary Sherman

There always needs to be a bit of a balance between throughput and complexity in creating models for research. That’s why there are so many options available for 3D research. Low-attachment products such as Corning® spheroid microplates and Elplasia® plates are great for creating high-throughput 3D models, but can lack some complexity. Organ-on-a-chip and hydrogel models add biological complexity, but are typically not as high throughput.

What is the most interesting achievement so far in using organoids?

Elizabeth Abraham

Personalized medicine. Because of their unique capacity for unlimited self-renewal, organoids are different from spheroids. Organoids can be made from patient-derived stem cells in a selective medium containing Corning® Matrigel matrix. These organoids can then be exposed to various drugs to identify the best treatment for fighting that particular cancer, thus personalizing medicine to treat disease. Taking this idea even further is the ability to repair genes in cells that can form organoids, then use those organoids to understand treatment regimens. Organoids thus serve as a converging platform for gene editing, 3D imaging and bioengineering. The therapeutic potential of organoids in modeling human disease and testing drug candidates is, in my opinion, the most interesting achievement thus far.

Audrey Bergeron

There has been a recent report of researchers at the Cincinnati Children’s Hospital Medical Center (2) developing the world’s first connected tri-organoid system, the human hepato-biliary-pancreatic (HBP) organoid. This is a remarkable achievement, since it moves the field from individual organoid research to connected organoid systems, which more physiologically mimic the interplay between human tissues. To date, current liver organoid approaches have failed to adequately recapitulate bile duct connectivity, which is important for liver development and function. The authors describe optimized methods to create the multi-organ model from differentiated human pluripotent stem cells via the formation of early-stage anterior and posterior gut spheroids, which fuse together and develop into hepatic, biliary and pancreatic structures. This is an exciting new basis for more dynamic and integrated in vitro systems-based organoid models to study organogenesis, for use in research and diagnostic applications, and for potential applications in precision medicine and transplantation studies.

(2) Hiroyuki Koike, Kentaro Iwasawa, Rie Ouchi, Mari Maezawa, Kirsten Giesbrecht, Norikazu Saiki, Autumn Ferguson, Masaki Kimura, Wendy L. Thompson, James M. Wells, Aaron M. Zorn, Takanori Takebe. Modelling human hepato-biliary-pancreatic organogenesis from the foregut–midgut boundary. Nature, 2019; DOI: 10.1038/s41586-019-1598-0.

Debbie King

The generation of cerebral organoids is one of the most interesting achievements. These “mini-brains” derived from pluripotent stem cells can self-organize into functioning neural structures. Neural cells are notoriously difficult to culture in vitro, and obtaining sufficient cells for experiments can be challenging. Cerebral organoids offer a way to study neural tissues, replicating aspects of human brain development and disease that were once impossible to observe in the laboratory. Scientists have used them to make discoveries about neurological disorders like schizophrenia and autism. These organoids have also been useful models for examining the fetal brain microcephaly caused by Zika virus infection.

What technologies have helped scientists overcome the biggest challenges to using organoids?

Elizabeth Abraham

3D extracellular matrices, like Matrigel® matrix, provide the appropriate scaffold to generate and grow organoids. The matrix can also be remodeled by matrix metalloproteases secreted by cells within the organoid, allowing it to grow and differentiate.

The discovery of Wnt signaling is also important, as the Wnt pathway is at the heart of organoid technology.

Lastly, 3D imaging, the ability to see inside 3D structures such as organoids, has also enabled scientists to learn how to use organoids in disease modeling.

Audrey Bergeron

One of the biggest challenges is the lack of vasculature in organoid systems, which hinders in vivo-like expansion and limits organoid size. Technologies have been developed (and are continuing to be developed) which improve long-term culture conditions and the delivery of nutrients and gaseous exchange to the developing organoid. These include spinner flasks and bioreactors to increase “flow” in the culture system and microfluidics-based platforms for efficient nutrient diffusion, oxygenation and waste metabolite disposal (a key example is cerebral organoids).

It is also interesting to see the evolution of permeable membranes, such as Transwell® permeable supports and other semi-permeable membrane materials, being integrated into perfusion systems and 3D bioprinting techniques to improve nourishment to the organoid during maturation. These technologies have helped to extend the lifespan and utility of organoids to months.

Another challenge in high-throughput pharmacological and toxicity screening applications has been the formation of reproducible, single organoids per well. Ultra-Low Attachment (ULA) surface cultureware or microplates, coupled with established biological hydrogels such as Corning Matrigel matrix, have provided a platform to generate uniformly sized organoids compatible with HTS applications. Concurrent advancements in high content screening platforms have also helped to elucidate the 3D complexity of organoids in terms of multi-parameter imaging and quantitative analysis.

What advancements do you see with organoids in the next five years?

Elizabeth Abraham

Organoids could fulfill the need for organ donors for patients awaiting transplantation, and organoids could serve as a diagnostic tool to detect and treat cancers.

Audrey Bergeron

More complex, vascularized multi-organoid systems will continue to be developed to advance precision and regenerative medicine closer toward transplantable organs. I also think that protocols and models will continue to be optimized to generate data and improve the clinical predictivity of organoid models in pharmacological and toxicity testing, which could potentially mitigate the need for animal models during drug development.

Debbie King

Researchers are also looking to combine genome-editing technologies, CRISPR-Cas9 in particular, with patient cell-derived organoids. For monogenic diseases, it opens up the possibility of performing gene correction through gene editing prior to autologous transplantation as a curative solution. It has already been shown that the defective CFTR gene in cystic fibrosis patient-derived organoids can be corrected using CRISPR/Cas9 homologous recombination (3).

(3) Schwank G, Koo BK, Sasselli V, Dekkers JF, Heo I, Demircan T, Sasaki N, Boymans S, Cuppen E, van der Ent CK, Nieuwenhuis EE, Beekman JM, Clevers H. Functional repair of CFTR by CRISPR/Cas9 in intestinal stem cell organoids of cystic fibrosis patients. Cell Stem Cell. 2013 Dec 5;13(6):653-8. doi: 10.1016/j.stem.2013.11.002.

What technologies will enable those achievements?

Hilary Sherman

I think more defined reagents, such as ECMs and media, will help make organoid culture easier and more consistent. Also, better bioprinters with higher resolution will aid in generating more complex 3D structures.

Debbie King

Automation platforms will allow for precise control of culture conditions and enable high-throughput screening in drug discovery workflows. Also, high-content imaging technology will be key to capturing morphological and gene expression data to study organoids. Live cell imaging within organoids will allow us to visualize, for example, early events in human development in real time. Overall, the field would also benefit from standardization of protocols, reagents (such as the type of culture media and ECM to use) and the best cell sources, so that comparisons can be made between labs to help advance research.

What areas of research do you think will be most impacted by 3D culture systems?

Audrey Bergeron

Cancer research: better in vitro models of cancer will deepen our understanding of cancer biology and support personalized medicine.

Hilary Sherman

Regenerative medicine, a branch of therapy that involves engineering biological tissues or organs to restore normal function. This includes the hope of someday being able to 3D print organs.

Debbie King

3D culture systems will continue to have a large impact on developmental biology. The study of human development has largely been limited to observational studies of pre-implantation embryos or of progenitor cells isolated from fetal tissues and cultured in vitro. The advent of organoid models derived from iPSCs opens up the ability to study human embryonic development in a way we couldn't before. As well, organ-specific progenitors generated from iPSCs provide a wealth of insight into the morphogenesis of different organ systems.

Computer Aided Biology Platform Helps Companies Meet the Challenges of 21st Century Biomanufacturing

In this podcast, we interviewed Markus Gershater, Chief Scientific Officer with Synthace about computer aided biology and how it addresses several common biomanufacturing challenges. We also discussed ways to build a common culture between science and software.

I began the interview by asking Mr. Gershater to describe the concept of Bioprocessing 4.0 and what it means to the industry. Markus explained that the term 4.0 refers to the industrial revolutions that have happened throughout history. The first began when steam was introduced as a power source; next came electrification and the production line. 3.0 refers to the incorporation of automation, and 4.0 is the connection of different devices and automation through digital technology. This involves cloud computing, which enables data storage, computing and analysis. This is particularly important in a complex industry like bioprocessing that requires sophisticated knowledge and control. Bioprocessing 4.0 will enable the industry to progress to the next level.

Next I asked Markus to explain the solutions that Synthace provides in this area. He described how Synthace started as a bioprocessing company looking for a way to conduct more sophisticated, automated experiments. The result was the creation of their software Antha, which can auto-generate instructions for biological protocols. This means that scientists can specify the protocol they want to run, and Antha works out all the details, down to every step of the run. It then converts those detailed actions into scripts for each automated device to run the protocol. The user hits go, and the robot runs the specified protocol with the instructions that Antha generated. This automatic generation of scripts makes automation more user friendly and powerful. Lab automation has historically been held back by the complexity of programming it; Antha makes complex lab automation implementable.

Markus went on to say that a beneficial knock-on effect is digital integration. The devices used in protocol automation are only a small part; there are also analytical devices that produce data. What is needed is a way of structuring data from all of these diverse pieces of equipment. Since Antha generates all the detailed instructions that go into a particular protocol, it also holds the detailed structure of the experiments. So at the end of any chain of actions, Antha can provide the provenance of all data points, and it can auto-structure data into the context of the experimental design.
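The idea of carrying provenance alongside each data point can be illustrated with a minimal sketch. This is not Antha's real API; the class and function names below are hypothetical, and the sketch only shows the principle that a planner which generated every step can attach that chain of steps to each measurement it collects.

```python
# Illustrative sketch (hypothetical names, not Antha's API): because the
# planner generated every step, each measurement can be stored together
# with the ordered chain of actions that produced its sample.
from dataclasses import dataclass, field

@dataclass
class Measurement:
    value: float                                     # e.g. an absorbance reading
    instrument: str                                  # which analytical device produced it
    provenance: list = field(default_factory=list)   # ordered actions applied to the sample

def record(value: float, instrument: str, actions: list) -> Measurement:
    """Attach the generating protocol steps to a raw data point."""
    return Measurement(value, instrument, list(actions))

# The same structured record works for any device, so results from different
# instruments can be compared within the context of the experimental design.
steps = ["seed well A1", "add reagent X 10 uL", "incubate 30 min"]
m = record(0.42, "plate reader", steps)
```

Because the step list is a copy, later edits to the protocol do not silently rewrite the history of data points already recorded.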

As the industry runs more complex and high throughput experiments, the bottleneck shifts to data structuring. Antha has become a tool that allows the automation of lab processes as well as the data processing from those lab processes. This permits dynamic updating as the structure of the experiment updates.

We then discussed the technology behind the product. Markus explained that the first step in getting started is to identify a specific protocol. Then, for example, Antha specifies the samples that need to be diluted and provides a framework with specific parameters. Next, you need to look conceptually at how you can move liquids around to fulfill this design: what equipment do you need to run, and what consumables? Once you have those, Antha can generate the lower-level, tedious details. This allows users to change one detail of the experiment, and Antha will calculate a new set of instructions. Antha then passes these specific instructions to devices through the Antha hub, which communicates with the equipment. Once users are satisfied that the equipment has been set up properly, they can hit go and the experiment will run.
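The expansion from a high-level design into low-level liquid-handling details can be sketched in a few lines. This is an illustrative example only, assuming a simple dilution protocol; the function and parameter names are made up for this sketch and do not reflect Antha's actual implementation.

```python
# Hypothetical sketch of auto-generating liquid-handling instructions:
# the scientist states the design (stock concentration, target
# concentrations, final volume) and the planner expands it into
# per-well pipetting steps. Change one parameter and the entire
# instruction set is recomputed, as described in the interview.
def dilution_plan(stock_conc: float, target_concs: list, final_vol: float) -> list:
    """Expand a list of target concentrations into pipetting steps."""
    steps = []
    for i, target in enumerate(target_concs):
        stock_vol = final_vol * target / stock_conc  # C1*V1 = C2*V2
        steps.append({
            "well": f"A{i + 1}",
            "stock_uL": round(stock_vol, 2),
            "diluent_uL": round(final_vol - stock_vol, 2),
        })
    return steps

# Editing any one input regenerates every instruction consistently.
plan = dilution_plan(stock_conc=100.0, target_concs=[50, 25, 10], final_vol=200.0)
```

In a real system these steps would then be translated into device-specific scripts, which is the part Markus describes as too tedious and error-prone to program by hand for each experiment.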

I asked if there were any case studies that could be shared to show how this would work in a real-life setting. He described how their case studies range from programming relatively simple workflows, like automating ELISA assays, to extremely complex experiments. They recently co-published a study with Biomedica in which they ran an experiment to improve the process for generating lentiviral vectors and were able to improve viral vector titer tenfold over the course of just two very sophisticated Antha experiments.

Markus shared that Biomedica is good at automation and at programming automation. When they looked at the scripts generated by Antha, they determined that it would have taken them a week to program each experiment that Antha generated on the fly. He said this illustrates why automation often isn't used: scientists don't have time to spend a week programming automation for an experiment they might only run once. There is not sufficient return on investment for the programming time.

Synthace has also generated case studies around automated data structuring. In this example, he explained that with bioprocessing you have bioreactors and sensor data that must be aligned with sample data to provide a full picture. Antha enables this data structuring.

Next, I asked Markus if he could talk a bit about the vision for computer-aided biology and how he sees the space evolving over the next five years. He explained that computer-aided biology is a vision of how we can use twenty-first-century tools to help us pick up on the complexities of biology. These applications can give us insights that would not have been possible without machine learning. This doesn't mean replacing scientists and engineers with AI, but instead flagging things they may have missed in highly complex data sets.

He said that at conferences there has been a growing swell of excitement around using these methods for drug discovery. However, these techniques are just as important in the lab for interpreting bioprocessing results. Reaching this sort of future, one that includes AI-augmented insight, requires the routine production of highly structured data sets with every experiment, so that results can be compared experiment to experiment.

Frequently there is an expectation that scientists and engineers should be doing the data structuring themselves, but it is highly onerous, and a wide range of techniques is used from company to company. There needs to be a system in place that incorporates as much automation as possible. This will open up the opportunity for an ecosystem of hardware and software working together.

This led me to ask the next question, on building a common culture between science and software. Markus explained that this is interesting because scientists and software engineers tend to think about things in fundamentally different ways. Biologists are used to a large amount of ambiguity because they deal with such complex systems on a day-to-day basis. For software engineers, on the other hand, things are a lot more defined and a lot more predictable. They are used to making things happen in a powerful way very quickly.

He said that it is fun to see them work together to discover what is and isn't possible and to learn from each other. What is nice about the Antha system, he added, is that both sides can understand it: scientists want to use it to automate protocols, and software engineers can see the logic within the protocol because it is highly defined.

He then told a story about hearing a speaker from Amgen discuss this same point; she said the common culture is "just happening naturally" as more digital tools become available and scientists shift their mindset about how to conduct their science.

To learn more about Synthace and computer-aided biology, please see…

Balancing Risk, Cost and Speed During Clinical Development While Still Maintaining Quality

In this podcast, we talked with Thierry Cournez, Vice President, BioReliance® End-to-End Solutions, MilliporeSigma. We discussed effective ways for emerging biotechs to collect material quickly and cost-effectively for pre-clinical and clinical studies. We also discussed balancing the need to move quickly with cost and quality.

Show Notes

I began the interview by asking Mr. Cournez if he could talk about how MilliporeSigma works with emerging biotechnology companies. He said that developing and implementing a clinical material process can be time-consuming and complex. He explained that MilliporeSigma is a CDMO partner for startups and small biotechs that are developing commercial biological drugs. They help these companies balance speed and cost by providing a comprehensive suite of products and services to accelerate clinical biopharmaceutical development, scale up production processes, and design and implement single-use commercial production facilities. In addition, they do not just focus on the technical and quality phases of drug development; they also prepare supporting information for regulatory agencies. They make sure companies get all the support they need to explain to agencies what is in the dossier and to answer their questions. Most importantly, they want to make sure that their customers are successful.

Next I asked Mr. Cournez how working with a CDMO differs from working with multiple suppliers. He said that after finalizing their strategy, small biotechs need to decide how they want to structure their team. They have to ask whether they have all the expertise they need in-house or whether they need to work with a partner to cover the considerable needs of process development, chemistry, manufacturing, control, filling and regulatory affairs. Many small biotech companies will choose to work with a partner, and choosing a partner should not be taken lightly. Mr. Cournez said he feels it is key to the success of the project. Ideally the CDMO partner can interface with most of the individual suppliers and cover most of the requirements, but the biotech company does need an in-house person to work with the CDMO. At MilliporeSigma, they have one dedicated project manager for each of their clients.

I then asked how small biotechs can collect material quickly and cost-effectively to reach the pre-clinical stage. He said that this is where a CDMO can offer a real benefit. At MilliporeSigma, he explained, they learn from previous projects where time and money can be saved. He said they recently identified a two-month time savings in the development of a molecule for a client.

Next we talked about how a CDMO can be instrumental during analytical development. He explained that analytical development is a critical part of the development process. Molecules can be really complex with no way of knowing what issues may arise. This is why analytical methods development and process development must work side by side to create a seamless process.

I then asked Thierry what he sees as the key considerations for small biotechs working toward clinical development. He said that clinical development for a biotherapeutic is long and very challenging. Companies need to move quickly through the early clinical phases while demonstrating safety and efficacy. At the same time, companies need a reliable process for producing clinical materials that ensures they will meet all quality and regulatory requirements. Long term, companies need to consider the final cost of their drug at commercial scale. Because of the complexity of this process, relying on an experienced partner can really help in navigating these decisions early and staying informed of everything that could impact decisions down the road.

I then asked how companies can balance cost, risk and speed during clinical development while still maintaining quality. He said that biotech companies all want the process done well, cheaply and fast, but under no circumstances can quality be sacrificed. MilliporeSigma employs a flexible approach to each biotech company's needs and constraints. Again, experience is very important, as skilled partners can suggest new approaches and solutions. One example of an innovation born of experience is MilliporeSigma's integrated plug-and-play upstream development services. This service eliminates the need to work with multiple vendors for upstream development, thereby reducing bottlenecks and lowering time to clinic by three months.

I closed the interview by asking if there was anything else that Thierry would like to add for listeners and he said that these are very interesting times with many novel therapeutics in development. Many groups are working together to get these products into the hands of patients. At MilliporeSigma, we are looking forward to these collaborations.

Bi-Specific Antibodies – The development, manufacture and promise of these cutting-edge therapeutics

In this podcast, we conducted a panel discussion with experts from Selexis and KBI Biopharma on bi-specific antibodies. We examined bi-specific antibody development and manufacturing, including current challenges and key solutions. We also discussed the promise of these cutting-edge therapeutics and their future in medicine.