By: Frank Corden
The explosion of readily available data is all around us. Silicon Valley bombards both us and those who would like to sell to us with data that purportedly makes our lives easier. Whether it's the articles in your news feed or the research you do before buying the next gadget, the current strategy seems to be to conceive of a potential use for the data and throw it at us to see what sticks. Too often, though, the data isn't timely or even useful.
Take my commute into Boston this morning. The warning about the traffic congestion came about five minutes before I hit the slowdown. Since several of the exits are more than five miles apart, there wasn't time to get off the Mass Pike. And even if the data had been timely, it wasn't actionable: once you're on I-90 heading into town, there really isn't an alternative route.
A major theme of this year's BioProcessing International Conference (#BPIConf) is making informed decisions with data. Whether it's online monitoring data, laboratory data, or process analytical technology (PAT) data, data rules. But what are the rules around data, and how do we make it useful? What are the "rules" for using process data in bioprocessing?
I'm sitting in the 8:15 a.m. presentation, "Evaluation of Continuous Manufacturing in a Downstream Process." I guess I wasn't the only one who headed into town early; the room is pretty full. It's great to feel the energy and enthusiasm of the group first thing in the morning.
The introduction to the Recovery and Purification track, delivered by Marc Bisschops of Pall Life Sciences, was provocative. He challenged us to move from batch manufacturing to continuous manufacturing. The benefits clearly depend on our ability to balance the throughput of the various unit operations as material moves through the process.
Kudos to the first presentation, "Data-Based Comparison of Capture and Polishing Steps in a Continuous mAb Process." The authors, from the chromatography company ChromaCon, compared continuous and batch approaches with hard data analysis. By evaluating throughput, cost, and resource requirements, the analysis demonstrated that changing the manufacturing paradigm from batch to continuous chromatography can deliver impressive benefits.
Along with a better quality outcome (an increase in purity from the mid-70s to the mid-80s percent), you can cut chromatography resin usage by one-third. For the resin selected, that reduction translated into savings of about $190,000 per year. In the pilot facility studied, the investment in continuous chromatography broke even after the transfer of only two molecules. Clearly, the dollar savings in a full production facility would be significantly greater.
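To put those numbers in perspective, here is a rough back-of-envelope sketch in Python. Only the one-third resin reduction and the roughly $190,000/year savings come from the presentation; the baseline spend is simply what those two figures imply, and the upfront investment value is a made-up placeholder for illustration, not a number from the talk.

# Back-of-envelope sketch of the resin-savings arithmetic reported in the talk.
# Only the one-third reduction and the ~$190,000/year figure come from the presentation;
# the investment figure below is a placeholder assumption for illustration.

annual_resin_savings = 190_000      # USD per year, reported for the resin selected
resin_reduction = 1 / 3             # fraction of resin usage eliminated

# Implied annual resin spend before the switch (savings / fraction saved).
implied_baseline_spend = annual_resin_savings / resin_reduction
print(f"Implied baseline resin spend: ~${implied_baseline_spend:,.0f}/year")

# Hypothetical payback: how long an assumed upfront investment takes to recoup
# from resin savings alone (the talk framed breakeven as two molecule transfers).
assumed_investment = 300_000        # USD, hypothetical capital cost for illustration
print(f"Payback period at that rate: {assumed_investment / annual_resin_savings:.1f} years")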
The decision to shift away from the tried-and-true manufacturing approaches we use today is a difficult one. We all realize the risk of getting it wrong is what keeps us up at night. A delay in the release of a product affects not only our companies but also the patients who depend on these products to keep them healthy, or in some cases alive.
Hard data to help make a difficult decision; now that's data that rules.