Methodology for Assessing Information Maturity

Chris Duncalf from BRG’s Global Applied Technology (GAT) team discusses methodology for assessing information maturity (AIM), healthcare informatics, and how information is valued as an asset.

Learn more about assessing, understanding, and improving information maturity.


TRANSCRIPT

MJ [00:00]             Hi, everyone. This is Michael Jelen, and welcome to the BRG Global Applied Technology podcast. The GAT team, as we call ourselves, is a globally distributed team of software engineers, data scientists, graphic designers, and industry experts who serve clients through our BRG DRIVE(TM) analytics platform. We're helping some of the world's largest and most innovative companies and governments transform data into actionable insights.

Today, I'll be speaking with Chris Duncalf from our BRG GAT team. Chris is an information professional with twenty-five years of experience in healthcare informatics spanning both the private and public sectors. We'll be talking about Chris's methodology for assessing information maturity, which we refer to often by its abbreviation, AIM.

As always, if you've any questions or comments, please email us at gat@thinkbrg.com. And with that, please join me and Chris Duncalf. Hi, Chris. Are you there?

CD [00:52]              Hi. So how are you doing?

MJ [00:54]             Doing great. How about yourself?

CD [00:56]             Yeah. Not too bad at all. A little bit windy and rainy where I am, but I can't complain too much.

MJ [01:01]             And where is that actually, just to let people know?

CD [01:06]              So I live in Wigan. So for those that know of it, home of pies and rugby, one of which I like more than the other. I'll leave it to everyone else to guess which. Just west of Manchester, about twenty miles away from Manchester in the northwest of England.

MJ [01:22]             Wonderful. Wonderful. Well, I'm excited to talk today, so thanks for making time for me. And before we get into the details, I'd love if you could just give a quick introduction and explain a little bit about how you came to be on this team here at GAT at BRG.

CD [01:38]              Yeah. Absolutely. No problem at all. So I'm Chris Duncalf. And I guess my healthcare background, in fact my work background, started all the way back in 1995, when I joined Wigan and Leigh health authority, a public-sector health authority, on a six-week temporary contract, which some people will certainly recognize as the way in a lot of the time. And seventeen-and-a-half years later, I left the public sector. So a six-week temporary contract turning into seventeen-and-a-half years is some feat. In that time, essentially, I've had a number of roles across information and informatics. I've been an information analyst, a SQL developer, and a business analyst embedded in a clinical division, the medicine division. And ultimately, my final role in the public sector, in the UK NHS, was as head of information, where I had responsibility for the business intelligence and analytics team, system administration, clinical coding, information governance, and so the list continues. Quite a large portfolio, really.

So that, as I say, occupied me for about seventeen-and-a-half years in the public sector, and in that time I survived the National Programme for IT, which certainly some people will be familiar with. Eventually, I was looking for a change, really, and I joined Dr Foster Intelligence, as it was at the time, the UK healthcare benchmarking company, which people will be aware of, and spent six-and-a-half years at Dr Foster, again in a few different roles. I joined as an information consultant to begin with, and finally I was head of strategic products, bringing new products to the market.

That took me through to the end of 2019, at which point I moved on again, and I found myself here at BRG in the GAT team in a consultancy role, again across informatics, information, and technology-based products and solutions. So that's what's brought me to where I am today.

MJ [03:43]             That's great. And the breadth of experience that you've brought to the table has really been an asset to us, so it's been a pleasure seeing what you've learned along the way to now service our clients with that knowledge. The thing that we've been working on most recently and the thing I wanted to talk about today is the AIM product that you're working on, assessing information maturity. Would you be able to give us a quick overview of what that is and how you came to create that?

CD [04:09]              Yes. Certainly. So I'll give you a brief synopsis of the intent, and then obviously we can drill into as much detail as you need to, really. But I guess in a nutshell, and you've given the title there, it is about assessing information maturity. And fundamentally, for me, this is about how information is valued as an asset. Certainly my background, as you just heard, is predominantly healthcare. But of course, all organizations, all sectors, generate huge swathes of data. I think healthcare is particularly adept at generating huge swathes of data collected in different forms, be they technological or paper-based forms.

But actually, it's about looking at how that's leveraged: what's involved in leveraging data and information as assets. Much like any product that any company would bring to market, data itself has a value chain, ultimately. And really, the end of that value chain for data is that data being used to make a decision and take action, whether that be an actual physical change that is then implemented or an assessment that what we're doing is the correct course of action. Either way, it's got to be utilized to be of value. But of course, that all assumes that the product, data in this instance, and the analytics it's been turned into are fit for purpose in the first place to make decisions on.

So it's all geared around acknowledging that data, from creation to consumption, for want of a better phrase, involves not only those people who create it and those people who process it but also those people who consume it, and trying to find where that balance is and where those challenges and hurdles are for organizations: Are we producing the right information? Are we specifying it correctly? Is our analysis as mature as it could be? Are we doing purely descriptive analytics? Could it be predictive or prescriptive analytics?

But then on the flip side of that, I would challenge where we're pitching the analytics. What's the data literacy of the consumers of that information? Are they able to consume it? Have they got the skills that they need? And that's not just technological skills; that's about data interpretation. So before I dive into too much of the granular detail, I'll come back to you and let you reflect on that, Michael.

But I think it is very much about data as a product and an asset, and almost leveraging some of those product management strategies, really: thinking about things like innovation and value propositions. That's not to overengineer it and make it a very paper-based and document-heavy process, but actually thinking about whether we are leveraging the right value from this and where those opportunities are, really.

MJ [06:59]             Yeah. And I think that is a challenge that every organization faces largely because the speed at which information maturity—or I should say information management—changes is very, very rapid. So in a lot of situations, I think clients may say, "We don't know what we don't know, and we don't know what the best practices are or how to even accelerate to that point." Practically speaking, if someone was interested in understanding where they fall on the spectrum of information maturity, how would they go about assessing that? Is that something that we would do through a platform? Is that interview based? How do we approach that?

CD [07:35]             So there's a mix of methodologies, quantitative and qualitative approaches, and for good reason, really. And you've hit the nail on the head there, Michael, in that aspect of not knowing what you don't know. And that can come from two angles. It could be that we're sat on so much data that we're only leveraging two key data flows because we're under capacity and that's all we can handle. Or there may be other data flows in the organization that the people who produce the analytics, the BI, don't know about.

So trying to get to all of that, as you can imagine, can be quite a complex process, and it certainly has to involve a number of stakeholders, really. Our approach, in essence, is to start with a broader assessment across an entire organization, if that's what we want to do, to capture those stakeholders, but with a more high-level survey to look at some of those perceptions around information provision and, indeed, information consumption.

If we talk about perceptions, we split this assessment into five domains. We look at leadership and people, policies and governance, standards, technology, and performance. Briefly, within that, if we talk about people, we'll be looking at the skills involved in articulating the vision and the strategy, certainly from the provider side of that information. From the consumer side, and this is where we get those two angles, we'll be looking at those individuals that champion data literacy, champion data as an asset, and promote training and engagement. Those are the two perspectives we're looking for as we go through each of those domains.

To start with, we would essentially be leveraging the DRIVE survey application, which is our online survey application where we can build bespoke surveys. We have a set number of questions within each of these five domains, along with some contextual information as well, to start to get a first pass of the perceptions in the organization: how well information is provided and consumed, what policies and governance might be in place, how engaged people are, how KPIs and metrics are set.
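The first-pass roll-up Chris describes can be sketched in a few lines. This is purely illustrative: the five domain names follow the conversation, but the 1–5 scoring scale, the simple averaging, and the follow-up threshold are assumptions, not how the DRIVE survey application actually scores responses.

```python
from collections import defaultdict

# The five assessment domains named in the conversation.
DOMAINS = ["leadership and people", "policies and governance",
           "standards", "technology", "performance"]

def summarize_survey(responses, follow_up_threshold=3.0):
    """Average per-domain scores across respondents and flag
    low-scoring domains as themes for semi-structured interviews."""
    totals = defaultdict(list)
    for respondent in responses:
        for domain, score in respondent.items():
            totals[domain].append(score)
    averages = {d: sum(v) / len(v) for d, v in totals.items()}
    # Domains whose average perception falls below the (assumed)
    # threshold become targets for face-to-face follow-up.
    themes = sorted(d for d, avg in averages.items()
                    if avg < follow_up_threshold)
    return averages, themes

# Two hypothetical respondents scoring each domain 1-5.
responses = [
    {"leadership and people": 4, "policies and governance": 2,
     "standards": 3, "technology": 4, "performance": 2},
    {"leadership and people": 3, "policies and governance": 3,
     "standards": 2, "technology": 4, "performance": 3},
]
averages, themes = summarize_survey(responses)
print(themes)
```

In practice the survey output would feed into choosing interviewees and interview themes, as described in the next answer, rather than producing a score for its own sake.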

CD [09:57]              So that would be our first pass, if you like, to start to give us some themes, some domains, and some idea of where we can start to target individuals, be they executive-level individuals, individuals in the information team, or service managers. The next level would then be face-to-face interviews, though we can do them virtually, certainly at the moment and in some areas, to dig into some of those themes in more of a semi-structured interview, really. And to start to see what it is those individuals are getting, what they're not getting, how involved they are in that process. For example, do they own their own data? If they're a clinician, do they get regular sight of their data?

We certainly acknowledge that within healthcare organizations, the creators of this data are quite often the consumers of the data, but on many occasions they won't see it once they've input it into a system or written it into a case note; they don't see it again until it's used in a KPI or metric, and they potentially aren't involved in the assessment of the quality of that information. So we very much funnel, if you like, those themes down so that we can get all the way to that level of detail of what's affecting people in each area, and that will be, as I say, service managers, business managers. This certainly isn't just something for healthcare; it absolutely can be sector agnostic. It's looking at those people who consume that information and also at the people who provide that information, really. So it's a two-pronged approach. Within that, we will absolutely review things like existing strategies and policies, look at communication, look at branding. It is essentially everything that's wrapped around that lifecycle of the data, to use the phrase from before, from creation to consumption.

MJ [11:53]             That makes sense. It's perfect. So at a higher level, we would conduct a survey, gather information from different people, and then know where to focus effort when following up with interviews and things like that. That seems like a great approach to be able to dig in and get as deep as you need to. I did want to spend one minute and just go into the consumer and provider a little bit more in relation to data. One thing that I know we've seen a lot of times with different clients is that the people who are using data on a day-to-day basis may not be at the managerial level. The granularity, the system, and the interface that someone on the front lines uses might be perfect and give them exactly what they need to do their job very effectively. But from a management perspective, often that data doesn't roll up effectively or provide the metrics or dashboards that would be required to make large strategic decisions for the entire organization.

Could you talk a little bit about the consumer and provider relationship within data and how those two can be reconciled? Because we're coming into a situation where I've seen many, many different people touching and consuming data at very different levels of the organization. So how would you approach that, or how would you tailor some of your questioning to understand whether a firm is dealing with that effectively?

CD [13:17]              Yeah, absolutely. So a strong theme, really, within that relationship is the fact that a lot of organizations will have a proliferation of metrics and indicators, and again, I can speak from experience here that healthcare has many of those, quite often for good reason, to be fair. And it is essential that when we're identifying these metrics and KPIs and indicators, however we want to term them, we're actually going back to understanding what our overall goal and strategy and vision is for whichever sector we're in. There will always be a necessity for a swathe of mandated metrics, whichever way you cut it and whichever sector you're in, but certainly for those we're focusing on to actually drive the business forward, achieve specific goals, and ultimately deliver our vision and strategy, that's where the work needs to start. We need to ensure that those metrics are indeed going to deliver the outcomes we're trying to achieve, setting those from the [] and bringing the consumers, which at a strategic level may well be the directors and the executives, together with those data-literate people and those analytics experts, where we need to actually say: What would those measures look like? What would be a realistic definition, a realistic algorithm, that in the end will give us an outcome?

An outcome we want, as opposed to an output, is something that tangibly tells us we are achieving or not achieving our goals. So ultimately, where possible, it is absolutely about ensuring that those metrics align with your business goals and your strategy. Not, as I have seen in the past, and I'll speak from a local perspective here, a proliferation of indicators that are just measuring whether things have gone up or down because they've been requested, not necessarily because they are a measure of anything other than a number going up or a number going down. You could take something like an outpatient follow-up, right? Say outpatients have an average follow-up rate of two-and-a-half follow-ups to every first attendance. Well, okay, so if two-and-a-half goes up to seven, is that a good thing? Is that a bad thing? It very much depends on what service you're looking at and what kind of experience the patients had. It is all about ensuring that we can tie metrics back to tangible outcomes, and to the outputs where we need them, outcomes that we've described: What is it we're trying to achieve, and does this metric actually tell us that that is getting better or worse and give us the ability to make a decision? And if it doesn't, then there is a different question about how valuable it is, really.
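Chris's outpatient example is worth making concrete: the arithmetic of the ratio is trivial, and that is exactly the point, because the number alone carries no judgment. A minimal sketch, with the figures invented for illustration:

```python
def follow_up_ratio(follow_ups: int, first_attendances: int) -> float:
    """Follow-up-to-first-attendance ratio for an outpatient service."""
    if first_attendances == 0:
        raise ValueError("no first attendances recorded")
    return follow_ups / first_attendances

# Two readings of the same hypothetical service, before and after a change.
before = follow_up_ratio(250, 100)  # 2.5 follow-ups per first attendance
after = follow_up_ratio(700, 100)   # 7.0 follow-ups per first attendance

# The metric has moved, but whether that is good or bad is unknowable from
# the number itself: it depends on the service and the patient experience.
print(f"ratio moved from {before} to {after}")
```

The point of the sketch is that the computation is the easy part; tying the movement back to a described outcome is where the assessment work lies.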

MJ [15:58]             So this approach sounds very logical to me, but are there areas where you've received push back? Or is there anything that you think has to be overcome within an organization, whether that's the status quo or any sort of political challenges that are involved in getting people at different levels to participate in an assessment like this?

CD [16:18]              I think the difficulty a lot of the time—not all of the time—is about education and awareness and understanding. And that's not to be critical of anyone involved. I think it's quite easy to get dragged into a mandated set of metrics that we must measure, and so we must measure them as quickly as possible. Certainly, I have experience of that in the past, and I'll withhold names to protect the, maybe, not so innocent. I've had requests in the past, when I was in a senior information role in the health service, where we were producing certain metrics, I think weekly, and I was asked by a then senior individual to produce them daily. My pushback was, "Well, until I see any evidence that you're actually taking any action based on the weekly figures, what is the point of me or my team producing them daily? Ultimately, it will just tell you sooner that you are still a poor performer, because you're not taking any action." So I think it's difficult when they're mandated metrics that someone demands, whether that's central government or an arm's length body. But a lot of the time there's almost a fear, sometimes, I think, around some of those experimental metrics. And I say experimental in the sense that until we try them, we don't know that they actually tell us what they should.

And I think the other challenge, really, is the quantity of data and trying to pull some of that together. I'm not certain everyone is on board with the fact that the more variables we can consider that may affect an outcome, the more we can build in, and potentially the more accurate the metric. Sometimes that can be overengineered; sometimes it can be very simplistic. Some of the time, I think, it's down to getting the voices of those analytical experts, those data experts, those technology experts heard, so that they can push the message that actually we could do something unique, something a bit more leading edge. Certainly something that would be better than a gut feeling or an assumption, which we do still see quite a lot. And that is part of the challenge: getting that education across that, whilst data alone won't have all of the answers, it will certainly signpost us a lot better to where the answers might be.

So I certainly think there's a promotion that needs to be had, a push that needs to be had. I think it lies with everyone in that respect. But certainly, get those data and analytics and technology people more to the fore, involved at day one in those conversations when we're thinking about changing a service or changing a process. Involve them at day one, not after the event when we've made a change, really.

MJ [19:19]             Bringing data and technology people to the forefront and pairing that up with industry expertise also sounds like something that we at BRG have always made a core tenet of who we are. It's always been an expert-driven firm. And I think what we've seen inside of our teams, since we are a little bit more tech-focused and data-focused, being the purveyors of the DRIVE platform, is that we've now been able to unlock knowledge and expertise that perhaps otherwise wouldn't be applied to very, very large datasets. I'm thinking about people that maybe have thirty or forty years of experience in a given industry, be it healthcare or oil and gas. They have typically been working with Excel sheets and manageable data volumes throughout the course of their careers, and now we're entering an era where that is simply not possible anymore.

How do you see this fitting with overall trends, with data volumes? And do you think that this tool could be very useful in situations where we need to essentially liberate that data to all decision makers at all levels and do so in a nontechnical manner so that people without any programming expertise are able to use information effectively? How do you see that—how do you see that playing into organizations both now and tomorrow?

CD [20:37]              Yes. So I think, really, it's what you've described. There are a couple of areas that we've come across before, through my experience and through applying this as a kind of review-and-assessment-type solution. On the one hand, we do need specialists, and we certainly don't want to head too far down the route of creating teams of generalists, because specialists, both with operational knowledge and with technology and analytical knowledge, absolutely have a role to play. Bringing those together is key, so that we can actually solve some of those challenges cross-functionally across those individuals. I certainly don't think teams of generalists is the way forward, and certainly in my experience in healthcare, that doesn't work. You have to have individuals with expertise. That gives you other challenges, I accept, in terms of cross-cover and backup, but we are getting to a point, whether it be the complexity of the data, the volume of the data, or the number of disparate data sets, where we do have to retain that expertise. But there's also the aspect of what you would call organizational business intelligence versus, or complemented by, personal business intelligence.

Organizational BI being the more—I don't want to say on rails—but the more contained set of curated data sets, for want of a better phrase, and a coherent set of technologies ultimately providing a complete BI solution: pre-canned, preset reports with filters and some level of interactivity. But then you've also got those individuals who are more technically capable, where we can roll out more of that granular-level detail, and you're getting into the personal BI space, where you are looking for a level of expertise to exist with those individuals. That can take some of that self-serve effort, if you like, away from some of those expert individuals on the BI side.

But of course it also introduces aspects of risk. There are pros and cons with both of these: if not properly controlled, you could end up with a proliferation of local data sets and no longer a single version of the truth. So there's definitely a balance to be had. That transition has to occur, but I think we need to be careful of the difference between self-service and self-sufficiency. Being able to self-serve is one thing; being self-sufficient, I think, is something slightly different.

You are looking for that level of skill, that level of understanding, that level of expertise, to be able to take that forward and indeed to give everyone confidence that you would release more of that data for the ability to mine it and for people to draw more insightful conclusions. So it's definitely a fine balance, but I think the technology is certainly there to do it. And the days of locking people behind permanent permissions, where they can only access ten static reports, are gone; it's just not possible to do that anymore, I think. Data is being generated at such a huge rate, and indeed technology moves at such a huge rate, that we've got to start to look at these data literacy skills along with the technology skills, having some form of baseline and almost a competency-based assessment, in the same way we would expect those technologists and data people to understand the operational side of the business. And I've certainly been a great advocate of that in the past. Whilst it's not impossible to be a BI professional with no understanding of what that operational data actually means, it brings a tenfold benefit if there is some knowledge there. So that cross-pollination and that cross-knowledge, I think, is really required, without trying to turn everyone into generalists, if that indeed makes sense.

MJ [24:40]             Yeah. I think data literacy is becoming increasingly important for anyone at all levels of the organization. It's such a critical piece of decision making, so everyone really has to have a good understanding of how to use it to drive forward their individual function. Since you're at the confluence of both healthcare and technology, what if I just take a step back really quickly and understand from you more broadly: What would be the things that you're most excited about and most looking forward to in the industry? My guess is maybe data will play a role in that. But very broadly and open ended, based on what you're seeing, what excites you? What are the things that you're looking forward to in the next, call it, five years or so?

CD [25:22]              I think, for me, and again I will lean a little towards healthcare to start with, certainly. I mentioned the National Programme for IT back at the beginning of our conversation, and those that are aware of it will certainly know that the ultimate goal there was to try to achieve what I would term the holy grail: one integrated electronic patient record, really. So that, say, in the UK, for example, if you were in a hospital in Southampton on the south coast one day and you ended up in Edinburgh the following day, they'd be able to access the same record and know exactly what was done.

And whilst the National Programme achieved a lot of things, we didn't quite get there, and I'm not about to be critical of the National Programme at all. We're starting to see that again through a slightly different regime, in terms of the likes of Allscripts, Cerner, and the other big EPR players, and through the digital exemplars, starting to achieve the same thing. I think that opens up a wealth of data that we just have not had access to before in a technological fashion. If we go back to paper-based case notes (I say go back; we don't have to go back, we still have them in a lot of areas), the things that are written down in those paper-based case notes are certainly going to get digitized. And I don't mean an electronic PDF copy; I mean they're going to start to get properly digitized.

We're going to start to be able to turn that into some kind of structured data and to pull a lot more information from a patient's record, which just opens up a world of possibilities: thinking around case-mix-adjusted metrics and those types of risk predictors. We can already do a lot with the data that's available, but to start to be able to pull through some of those nuances, supported by the speed at which technology moves on and the ease of access to processing power, and to be able to run what can truly be big data sets, means we could potentially get as far as supporting clinical decisions with almost real-time data. And there's a whole host of governance concerns with that, before I go too far down that track.

That is really powerful. The bit that concerns me a little, I guess, is that I've just said all that, but I've also just mentioned a proliferation of KPIs and indicators that potentially aren't fit for purpose and don't follow strategy.

So we can't just suddenly hail new technology and hail new big data if we can't get some of the basics right. We've really got to go at pace, but we've got to bring the people with us as well. Technology is moving on apace, and I think data literacy is getting much better, but that also has to move on at pace. So certainly in the healthcare space, that is something really exciting to look forward to. I know some of that is already going on, but I think there's a lot further to go, especially when you think about the data that can be accessed through wearables and mobile tech, again with suitable governance in place.

I think we only need to look at what's still going on with COVID-19 and where some of those opportunities might have been if we were further on with that, and I know people can already see some of those opportunities, and some of the challenges we've still got to address. So as I said, in the healthcare space that's certainly something that excites me in respect of what could be achieved. But there is a lot of work to do. There's an awful lot of governance in all arenas, but more so in that arena, and understandably so. We need to find a way through some of that as well; governance can't be a blocker to achieving better patient outcomes, really. So striking that balance is going to be hard, and I'm certainly not about to get into a debate about information governance.

MJ [29:23]             Well, with all of this new data coming online (wearables, antibody tests, all of the things we're seeing rise very rapidly), assessing the information maturity of a given organization sounds like a critical reality check that needs to happen every so often to ensure that we are marching in the correct direction, interpreting information well, and using it to drive data-driven decisions. So I think that framework is a great way to apply it.

Chris, just being mindful of time, I was going to ask one final question: What would you want to leave people with, and where do you see this moving in the future?

CD [30:07]              I think, for me, it would be getting a handle, and I know a lot of organizations will have a handle on this, on the consumer and provider relationship. Quite often the analytics, the data, and to some extent the technology (although technology reaches a little bit further, through clinical system provision) should no longer be seen as back-office support services, really, but as partners: in healthcare, partners in delivering better patient care; in other sectors, more timely indicators, more insightful indicators, predictive and prescriptive analytics as opposed to descriptive and diagnostic. And I think that partnership works in understanding what the baseline is, because I certainly know from experience there's a lot of good people out there doing a lot of really interesting, useful stuff with data and analytics.

But to circle back around to the beginning of our conversation, if that isn't able to be used, for whatever reason, whether it's the way it's presented, the way it's delivered, or the way it's interpreted and understood, then actually it's pretty much all for nothing. So I think it's definitely about trying to find out where that baseline is and whether that relationship is in place and on track. Is the trust there? Is the engagement there? And from that point, building more meaningful indicators, metrics, and solutions together, really.

MJ [31:35]             Perfect. Well, thank you so much, Chris. It has been a pleasure. I've learned so much. I really am looking forward to the same things that you are as the healthcare industry is continually revolutionized by data and we continue to use it in a positive way. If people want to hear more about you or find you somewhere, where should they look? How can they get in contact with you?

CD [31:56]              Well, I'm certainly available on LinkedIn. So if people search for me on LinkedIn, Chris Duncalf, I will certainly be on there. And we will be posting something up there within a few weeks, specifically around assessing information maturity, so that should be a nice thing for people; it will have my contact details on it, so that certainly shouldn't be a problem. I'm not hard to find. Let's put it that way.

MJ [32:20]           Wonderful. Well, I hope everything for the remainder of the day goes well up in northern England. You guys get some sunshine. And looking forward to connecting the next time. Thanks so much, Chris.

The views and opinions expressed in this podcast are those of the participants and do not necessarily reflect the opinions, position, or policy of Berkeley Research Group or its other employees and affiliates.