Data Integrity in Life Sciences Podcast

by Jim Cahill | Mar 18, 2024 | Control & Safety Systems, Life Sciences & Medical

Jim Cahill, Chief Blogger, Social Marketing Leader

For pharmaceutical and biopharmaceutical manufacturers, the data that accompanies production is as critical as the product itself. The integrity of this data is paramount to releasing products for sale that comply with all regulatory requirements.

In this Emerson Automation Experts podcast, Michalle Adkins and Hilary Mills-Baker join me to discuss the challenges of achieving the required data integrity and solutions to drive performance improvements.

Give the podcast a listen and visit the Life Sciences and Medical section on Emerson.com for more on the technologies and solutions to help you drive data integrity improvements in your manufacturing operations.

Transcript

Jim: Hi everyone. This is Jim Cahill with another Emerson Automation Experts podcast. Data integrity is still a hot topic in the life sciences industry. We’ll be discussing that here today, and I’m thankful to be joined by Michalle Adkins and Hilary Mills-Baker, who in fact just presented a webinar on this topic in late 2023.

Welcome, Michalle and Hilary.

Michalle: Thanks, Jim.

Hilary: Hi, Jim.

Jim: Hey, it’s great having you both. I guess to get things kicked off, Michalle, can you give us a little bit of your background and path to your current role here at Emerson?

Michalle: Sure. Thanks, Jim. I started out my career at Merck after graduating from Penn State University with a chemical engineering degree, and I worked in the instrumentation and automation department.

I spent some time doing automation support as well, and then I worked in manufacturing, in purification and formulation, in vaccine manufacturing. I also did scheduling and planning. Then I wound up starting at Emerson in a consulting group, where I supported MES and DeltaV-related consulting and managed a team of consultants. I’m currently the director of our life sciences strategy.

I’m looking after where the industry is going, the direction, and how we develop technologies to support the future direction of the industry.

Jim: Well, that’s a great, wide-ranging background with a lot of experience in there. Hilary, your turn. Give us a little bit about your background and path to where you are today at Emerson.

Hilary: Oh, thanks, Jim. Well, I started by doing a broad-based engineering degree and a master’s in quality engineering, back in the 80s. I was sponsored by a company called Courtaulds, and I worked for them in their control and instrumentation department. And I moved into Emerson because I wanted to do more systems-based work and get into automation.

So I started with Emerson in 1990. We were using PROVOX at the time, and then I moved into projects with DeltaV, where I was lead engineer on some of those projects. We were heavily involved in the nuclear industry, another regulated industry. By the 2000s, we were doing a lot of pharmaceutical projects out of the U.K., and I moved into supporting the validation activities, because as a supplier we want to be able to align with our customers in terms of validation and make sure we understand their quality requirements. And so over the years, I’ve set up a department in Europe of quality and validation engineers, computer system validation engineers,

so that we can support our customers on life science projects and make sure that we’re really building quality in right from the start of the automation.

Jim: Well, that’s really great to have all this experience here all together. So let’s dive into the topic of data integrity. Michalle, let’s start with a concise definition. What is data integrity?

Michalle: Well, Jim, I would say in short, data integrity is about the completeness, consistency, and accuracy of data. More specifically, there’s an acronym that is often used to describe data integrity, and that is ALCOA+, which stands for data that is attributable, legible, contemporaneous, original (or a true copy), and accurate. And then the plus part adds complete, consistent, enduring, available, and retrievable as well.
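As a purely illustrative aside, here is a minimal Python sketch of a data record carrying the core ALCOA attributes. The class and field names are hypothetical and are not drawn from any Emerson product; the point is only to show how each attribute can map to a concrete field.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: an original record is never mutated in place
class AlcoaRecord:
    """Hypothetical record illustrating the core ALCOA attributes."""
    value: float           # Accurate: the measured or entered value
    units: str             # Legible: unambiguous, human-readable units
    recorded_by: str       # Attributable: who generated the data
    recorded_at: datetime  # Contemporaneous: captured when the event occurred
    source: str            # Original: the system or instrument of first capture

record = AlcoaRecord(
    value=37.2,
    units="degC",
    recorded_by="operator_jsmith",
    recorded_at=datetime.now(timezone.utc),
    source="reactor_101_temp_sensor",
)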

Jim: Well, I’m glad they made an acronym out of it ’cause that’s a mouthful of things to try to track down there. So let’s dig into that a little more. Hilary, why is data integrity of such paramount importance in the life sciences industry?

Hilary: Wow. I think you just caught it, Jim. All of those words, all of those things that we’re trying to make sure we’re building into data, actually make data integrity quite difficult to achieve. On the other hand, data is all that our life sciences customers have to use to say whether a batch of medicine is suitable to be released to the public so that we can consume it and hopefully get well. So, you know, if we had a poor batch of medicine, it could be harmful to somebody, or it might not treat their life-threatening condition. All that we have is this data to tell us if it’s right or wrong.

And so it’s a really important tool. And because of that, if a company is found by the regulators to have poor data integrity that they’re not managing well, and hence could inadvertently release poor-quality medicine, the consequences to that drug manufacturer can be things like product recalls and loss of reputation, because they’ve got a warning letter from the regulator. And there can be financial consequences as well, in a falling share price and lack of sales. So, I said data integrity is hard to achieve. To give you a few examples of the sorts of things that can go wrong: let’s say while a batch is being manufactured, we’re manually collecting the data that will be used for batch release. We all know we can write the wrong number down, or we can forget to do it, or those records could even be falsified.

So that’s why data integrity in a manual environment can be difficult. One FDA warning letter, the FDA being a regulator in the US, found evidence of an employee apparently verifying that tasks had been done on a day when he wasn’t even at work. And so, you know, it’s easy to falsify records if you have a mind to.

Conversely, we also have data that’s generated by a computerized system like DeltaV during manufacturing. And that can be incorrect because maybe the system wasn’t tested properly in the first place, so the software’s doing something wrong. Or we’ve got untrained people logged on, being allowed to generate data without really knowing how to do that properly. So, data integrity is hard to achieve, and we also need to have it right. A big, important topic.

Jim: Wow. I think you highlighted the stakes of making sure it’s done properly. So Michalle, what are some key points that companies should remember when it comes to data integrity?

Michalle: Well, I’m thinking about the comment Hilary just made regarding the example with technology. Often when I talk to people, I have to remind them, or we work through the fact, that it’s about more than just technology. Yes, technology can be an important piece of how you’re managing data integrity, but we have to remember that we’re managing the data throughout the entire lifecycle, the data lifecycle and the product lifecycle, and this is going to include things like governance and management, procedural and technical controls, as well as other human factors.

So it’s about company culture also. It’s about people and how they perform their work. It’s about work processes. All of this comes in addition to the technology, and frankly, management really sets the tone. And there have to be procedures in place to be followed.

There need to be reviews, and there’s change management. All of these kinds of things are covered in the governance and management processes and those procedural and technical controls, and each one is an important piece of the data integrity puzzle.

Jim: Wow. You’re right, it sounds like it’s so much more than just the technology. And I guess before we start talking about the technology then, is there anything else we should note about data integrity?

Michalle: I would say that one key thing to note is, you know, we generate a huge amount of data, and maybe not all of that data actually falls under the GxP regulations, such as GMP or GLP. So we only really need to ensure that the data categories tied to regulatory compliance, the data supporting GMP and other GxP processes, are properly managed to ensure data integrity, from a regulatory perspective that is.

Jim: Well, that sounds like some good news, that not all of that data needs to be captured as part of the data integrity process. So I guess now let’s move on to technology, and Hilary, let me turn this over to you. Do you have some initial thoughts in the technology realm?

Hilary: Well, yes, I think firstly it’s important to understand that the FDA, the regulator in the US, is really encouraging the use of technology. So data integrity can be supported by technology, and we can see big advances in quality.

You know, really helping to reduce errors and optimise resources, and, because of those factors, also reducing risk to the person taking the drug at the end of the day. So the FDA has recognised the potential for these technologies to provide significant benefits to the industry. So in a world where, okay, maybe we could still do everything manually, we’re definitely being encouraged to do things supported by good technology.

Jim: Well, that’s good. Technology does play an important role in that, because that’s some of what we do here at Emerson. So, speaking of Emerson technologies, Hilary, how do they address the data integrity questions?

Hilary: Michalle used that acronym ALCOA+, and so I’m going to give you a few examples of how our DeltaV system is designed to actually build data integrity into the data that’s generated while a batch of material is being produced. So let’s take a few examples. The first A in ALCOA+ stands for attributable.

And the regulations are telling us we must know who generated or changed the data. And so within DeltaV there is a mechanism that’s recording actions and events all the time, along with the identity of the operator who may be involved. So for instance, if a valve gets opened during production, which could cause some sort of issue with what’s actually being produced, we will know that.

We will know when that happened, and we will know who did that activity. And that can be very important when investigating whether a batch is good to be released or not. The next letter of the ALCOA+ list that we’re going to look at is contemporaneous. So data must be recorded contemporaneously, at the time the event occurred.

And again, DeltaV systems are designed to be NTP compatible. They use the Network Time Protocol, a standard communication protocol that allows computers to synchronize with a time server across a network. So this means that for any timestamps being put against the data, we know they’re going to be accurate to the master time that’s being used in the area.
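As a purely illustrative aside, the idea behind NTP can be demonstrated in a few lines of Python. This sketch uses the third-party ntplib package (pip install ntplib) and a public time server to estimate the local clock’s offset; it illustrates the protocol concept only, and is not how DeltaV implements its time synchronization.

import ntplib

# Query a public NTP server and estimate how far the local clock drifts from it.
client = ntplib.NTPClient()
response = client.request("pool.ntp.org", version=3)

# offset: estimated difference between the local clock and the server clock,
# in seconds. A timestamp is only as trustworthy as this offset is small.
print(f"Clock offset vs. NTP server: {response.offset:+.6f} s")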

Just one more example. Data must be accurate. You know, it sounds like that’s a given, doesn’t it? But data can be inaccurate for many, many reasons. DeltaV is really reliant on all of the input sensor data coming from across the plant, and we do provide asset management within DeltaV, which can be used to make sure those sensors, those devices, have been maintained well and so are giving us good, accurate input data to work with. So those are just a few of the examples, Jim. I could go on.

Jim: Well, I think you described very well the attributable, contemporaneous, and accurate in there. Well, Michalle, let me turn it over to you. Are there other technologies to highlight?

Michalle: Sure. I think one that I will highlight is MES, our manufacturing execution systems.

These solutions are also a great way to capture data with context to ensure data integrity. So our DeltaV MES solution, which was formerly called Syncade, is also a great technology for capturing data contemporaneously and with context. Additionally, because of the nature of MES, there are other mechanisms in place to ensure that the data is complete, accurate, legible, and attributable as well.

Finally, of course, the system itself has been designed with regulatory compliance in mind, and you can find white papers and other information regarding electronic signatures, electronic records, et cetera, which are also very relevant to this topic.

Jim: Well, those technologies sound very helpful in all of this data integrity work.

Hilary, do you have any other thoughts on the technology front?

Hilary: Yeah, I think our DeltaV system has the necessary data integrity functionality to be compliant with the regulations. But really, the DeltaV product, or indeed the DeltaV MES product, is just a platform that we start to work on during a project, when we’re supplying the customization of that system that lets a particular manufacturer make its drugs. So DeltaV itself, or Syncade itself, is the foundation with data integrity built in. But then we have to customize that, and the regulations make it very clear that during that process, computer system validation underpins data integrity. So making sure that you’ve got the right designs, that you’re implementing the code well, and that you’re testing it properly is all very important to making sure that the data at the end of the day is correct.

So within Emerson’s project teams, we have validation specialists who can work with the customers to align to their specific validation strategy, trying to make sure that it’s absolutely lean, lean validation, but good validation. So we don’t want to have repeats of testing during the project and on site. We want to maximize the leverageability of the work Emerson has done.

But we also want the computer system validation to have been done correctly. So that’s one way: actually using the systems, and making sure they’re being used in the right way, is a good starting point that helps data integrity. And finally, I’d like to say that during our project execution, Emerson is developing electronic project execution tools that are going to help improve the quality and consistency of that final delivered DeltaV system and all of its supporting documentation.

So again, all of that is supporting this concept of data integrity.

Jim: Well, I think you’ve really described the lifecycle of data integrity, from the project phase you were talking about through continuous operations. I guess, Michalle, one other quick question. You noted ALCOA+, but I’ve heard something about FAIR, the F-A-I-R data principle. How does that compare?

Michalle: You know, that’s a really good question. I’ve had some interesting conversations along the way about that very topic. Well, first of all, I would say that FAIR stands for Findable, Accessible, Interoperable, and Reusable, and FAIR was really bred out of the R&D data infrastructure needs around associated metadata and the capture and storage of that data.

The ALCOA+ principles were really more specific to data that’s associated with quality testing and manufacturing processes, in order to ensure data integrity and to properly document actions, processes, and the data associated with all the activities in, again, manufacturing and quality testing.

There’s certainly a bit of overlap between ALCOA+ and FAIR, since Findable and Accessible in FAIR actually relate to available and retrievable in ALCOA+. So it’s kind of a similar concept. And then Interoperable and Reusable are also related to enduring. So you can see there’s a lot of overlap there.

So essentially, I would say that both ALCOA+ and FAIR are important principles for data integrity. They just have a little bit of a different origin or focus.

Jim: Well, that makes a lot of sense. Well, I hope our listeners have learned as much today as I have. I feel that, at least on the data integrity front, I’m smarter than I was a few minutes ago before we started with all this.

For our listeners, if you do a search on Emerson Life Sciences, or Emerson Life Sciences and Medical, we have a reference to the webinar, the white papers, and a whole lot more information about these topics and more, on making sure that your data integrity is as high as it can be and satisfies the needs of the regulatory bodies and everything else.

So Michalle and Hilary, thank y’all so much for joining us today.

Hilary: Thank you. Thanks, Jim.

Michalle: Thanks Jim. It was great to be here today.

-End of transcript-
