Simcox will be a part of Vator's Healthcare in Politics salon on October 7
With the election right around the corner, Vator and HP will be hosting their latest salon, called Healthcare in Politics, on October 7 (register for the event here!), where multiple panels of experts, policy makers and lawmakers will be on hand to discuss topics related to healthcare policy and decision making.
One of the panels will center specifically on data, including whether the government is handling data correctly, what new data telehealth has created, and what new technologies are being created to help us track the virus.
I spoke to one of our panelists, Ed Simcox, former CTO of the US Department of Health and Human Services and current Chief Strategy Officer at precision health company LifeOmic, about what the US has learned about health data collection thanks to COVID, what's different about HHS and the CDC, and what we can do to prevent another pandemic from getting this far.
VatorNews: Tell me about yourself, your background and your work at HHS.
Ed Simcox: I've spent a large part of my career at the intersection of healthcare, innovation and data. Prior to joining LifeOmic, I served as the Chief Technology Officer at the US Department of Health and Human Services, where I was in charge of efforts across HHS to leverage data, technology and innovation to improve the lives of the American people, as well as the performance of HHS’ 29 offices. While CTO, I also served as the Acting Chief Information Officer at HHS, where I was responsible for the department's IT modernization efforts, IT operations, cybersecurity, and the IT congressional oversight response.
The Office of the CTO is the front door to the innovation and startup communities in the health sector across the United States. I had the privilege to visit multiple startup communities, and we held events called HHS Startup Days in 12 different communities across the US. Through that experience, I came to have a deeper understanding that, for so long, the health care system in the United States has been broken. The system has been paying for “sick care” in a reimbursement model that's predominantly fee-for-service. I came to firmly believe that to fix health care, innovators need to be creatively disrupting the status quo. We need to stop rewarding players for treating sickness on a fee-for-service basis and we need to focus on preventing sickness from happening in the first place.
The other positive trend I saw was a new generation of companies that are using advanced technologies, like genomics, artificial intelligence and machine learning, and implementing things that we would think of as “science as code” and “medicine as code,” to better treat complex conditions and be more precise in their diagnosis and treatment. That's the concept of “precision medicine.” Combining wellness and prevention with precision medicine is what LifeOmic does. We call that “precision health and wellness” and this focus attracted me to LifeOmic. This combination is really what will lead to material improvements in our healthcare system.
VN: You left HHS, the largest civilian government agency in the world, for LifeOmic, which is a small healthcare startup. Tell me about that shift.
ES: Creative disruption is happening across the health sector. LifeOmic is at the center of this creative disruption. We started by using genomics to advance the diagnosis and treatment of very complex types of cancer, and we're doing that very successfully in the research setting. Along the way, we realized that what we were building had a much broader applicability and could be applied to wellness and prevention. We now have solutions to address not only cancer but have moved upstream to address people's health status in a real way. We can get to the root cause of illness, and do it with much more precision than what we have seen in the past from other so-called wellness companies. That was a big attraction to me.
VN: How did the US do when it came to handling data during COVID? What did we do right and what did we do wrong?
ES: COVID has taught us a lot in the federal, state, and local public health data community. It underscored how important it is to have timely, complete, accurate and actionable data available during a public health emergency. We can't manage what we can't measure, and we can't improve what we can't manage. To really combat the coronavirus and COVID, and learn as we go, we need data. The data are collected by each state, each health care provider, each independent lab, and even local and county health departments. The systems that collect these data are largely bespoke, unique and disparate, meaning they don't talk to one another and they don't exchange data programmatically.
To add to that complexity, state health departments manage and analyze their data differently from one another. They all have unique systems, and this makes it difficult for the CDC and other parts of the federal government to gain accurate insights from population-level roll ups of that data to inform policy and help combat the epidemic. Because the virus is novel, our response has been novel, and our response in the data community has had to be novel too. We're learning a lot about how to respond as we go, and it’s requiring us to really reexamine our underlying data infrastructure across the entire continuum of the US public health system.
The CDC and the states have been collecting public health data for a long time. The states have really good public health data systems, but they were not engineered to meet the specific requirements of the current pandemic, such as allowing systems to talk to one another or to the CDC’s systems programmatically. And so, the pandemic underscored how important it is to get health data interoperability right. The 21st Century Cures Act requires that software companies and healthcare providers exchange data in meaningful ways and in a bulk fashion, using what we call an API. The API standard chosen for this is called FHIR (Fast Healthcare Interoperability Resources). FHIR APIs allow content to move seamlessly from system to system. Rather than working on manual translations of data for every single system, we can focus on creating a standardized, population-level, clinical data set, and focus on the higher-order task of turning that data into insights.
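To make the FHIR idea concrete: a FHIR server answers a search request (e.g. `GET [base]/Patient`) with a standardized JSON container called a Bundle, so any system that speaks FHIR can parse the response the same way. The sketch below parses a minimal, invented Bundle and extracts patient display names; the sample data is made up for illustration and is far simpler than what a real server returns.

```python
import json

# A minimal, invented FHIR searchset Bundle, shaped like what a server
# might return from GET [base]/Patient -- real bundles carry many more fields.
bundle_json = """
{
  "resourceType": "Bundle",
  "type": "searchset",
  "total": 2,
  "entry": [
    {"resource": {"resourceType": "Patient", "id": "p1",
                  "name": [{"family": "Rivera", "given": ["Ana"]}]}},
    {"resource": {"resourceType": "Patient", "id": "p2",
                  "name": [{"family": "Chen", "given": ["Wei"]}]}}
  ]
}
"""

def patient_names(bundle: dict) -> list:
    """Pull 'Given Family' display names out of a FHIR searchset Bundle."""
    names = []
    for entry in bundle.get("entry", []):
        resource = entry.get("resource", {})
        if resource.get("resourceType") != "Patient":
            continue  # a searchset can also carry other resource types
        for name in resource.get("name", []):
            given = " ".join(name.get("given", []))
            names.append(f"{given} {name.get('family', '')}".strip())
    return names

bundle = json.loads(bundle_json)
print(patient_names(bundle))  # ['Ana Rivera', 'Wei Chen']
```

Because every conformant system emits this same Bundle shape, the parsing code above works unchanged against any FHIR server, which is the interoperability point Simcox is making.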
Coincidentally, the federal regulations for the 21st Century Cures Act were released in the middle of our initial response to the pandemic in the spring, but I wish that the Act’s implementation would have been a couple of years in advance of the pandemic, because our response, from a data perspective, would have been a lot better. We would have had a standardized way to transact data across the ecosystem. I encourage HHS and CMS to keep the pressure on the industry to implement interoperability and eliminate information blocking.
VN: What are some of the other challenges that you've seen in terms of data and data collection during the pandemic, and how are we solving those?
ES: One of the things that the federal government, in cooperation with the states, is trying to prepare for is the proper collection and dissemination of immunization data associated with the impending COVID vaccine. Every state has an immunization information system but, unfortunately, these systems do not talk to each other or to the CDC. The systems themselves are unique and bespoke. While they do serve a great purpose for each individual state, they don’t support roll ups of this data so that we can track the distribution and administration of the vaccines that will eventually come out. Nor do they support important interstate sharing of immunization data.
The CDC and my former office, the HHS Office of the CTO, partnered to collaborate on a system called the Immunization Gateway to connect every state’s immunization information system (IIS). We started that work two years ago, way before the coronavirus. The idea is to allow for the data to reside in the native state IISs and to securely connect them together so that data can be transmitted to any authorized party in the ecosystem that has a need for that data to advance public health. This avoids the need for new systems and substantial rewrites of existing systems. This gateway is a great example of CDC/HHS collaboration, and it’s being developed as we speak.
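The gateway's internal design isn't described here in detail, but the federated pattern Simcox outlines — records stay at rest in each state's IIS while a connecting layer routes queries — can be sketched as a toy model. The state records, identifiers, and API shape below are all invented for illustration.

```python
# Toy model of a federated lookup: immunization records stay in each
# state's own store; a gateway routes queries rather than keeping a
# central copy. All data and identifiers here are invented.

STATE_IIS = {
    "IN": {"patient-42": ["MMR 2019-04-01"]},
    "OH": {"patient-42": ["Tdap 2021-06-15"],
           "patient-7": ["HepB 2020-01-09"]},
}

def gateway_lookup(patient_id: str) -> dict:
    """Fan a record request out to every connected state IIS and merge
    the results, leaving each state's data at rest in its own system."""
    results = {}
    for state, records in STATE_IIS.items():
        hits = records.get(patient_id)
        if hits:
            results[state] = hits
    return results

print(gateway_lookup("patient-42"))
# {'IN': ['MMR 2019-04-01'], 'OH': ['Tdap 2021-06-15']}
```

The design choice to route rather than centralize is what avoids "new systems and substantial rewrites": each state keeps its existing IIS, and only the gateway needs to speak to all of them.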
VN: So, we talked about how to collect the data, but I want to ask about who is collecting the data. The administration abruptly changed the way that hospitals report data, giving it directly to HHS instead of the CDC. What is different about HHS versus the CDC? How do they handle data differently?
ES: There are really two categories of data that are important: there's the locally-collected public health and lab data, which helps us figure out the per capita infection rate, the transmissibility of the virus, the mortality rate of the virus, etc. And then there's what you alluded to in your question, which is hospital data, like percentages of ICU beds available in hospitals, the number of ventilators available for COVID patients, PPE on-hand, etc. While this data has historically been gathered at state and local levels, it has never been comprehensively, systematically gathered and reported to the CDC or HHS.
The CDC has a system called the National Healthcare Safety Network, which allows hospitals to report hospital-acquired infections, among other things. When the coronavirus hit, it was decided that this system could also be used to track coronavirus metrics because the hospitals already had access to this system. The CDC began adding functionality to the system and ran into some snags which caused delay. Plus, the data being reported from states and providers was incomplete and was two weeks old or more. So, a decision was made by HHS, because of the national health emergency, to work with current vendors to create a temporary reporting system allowing hospitals to report that data directly to allow a more complete and timely picture.
The idea was that we don't want to know the trailing numbers from two weeks ago; we want to know yesterday's numbers and how we're trending, because that's a really good indication of how we should deploy resources to stay in front of the curve of infection as far as beds, ventilators, masks, PPE, etc. HHS collaborated with the CDC on the transition, and the CDC continues to develop functionality in their various surveillance systems with the idea that, at some point, the collection and reporting responsibility will shift back to the CDC.
VN: Was that decision made because HHS was able to handle the amount of data and CDC didn't really have the infrastructure to do that?
ES: I don't think it's fair to say that CDC didn’t have the infrastructure. CDC has good systems, but some are older, legacy systems. It's hard to create new functionality in those systems in an agile way. The HHS system uses modern data systems and “agile” development methodologies to rapidly create a comprehensive reporting environment for this data.
VN: In the future, I would hope that the CDC would also get more up-to-date technology that would allow it to handle that data. Is the CDC getting technology akin to what HHS has, so it can be implemented for future pandemics?
ES: Yes, we are all learning as we go and I will tell you that the IT shop at the CDC is extremely capable and sophisticated. They are earnestly learning as they go, as well as solving for the future. In fact, the CDC recently held a great listening session on the importance of using the new 21st Century Cures Act regulations to promote public health data interoperability. They generated a public report detailing great insights they gained and recommendations that can be used to accelerate modernization of their systems. That's a very promising and hopeful advance and that's just one example of the great work that's happening at the CDC.
VN: The next time an event like COVID happens, what do you believe would be the ideal way for us to handle data, so we make sure that a pandemic doesn't get to this point again?
ES: Those of us in health IT and health data are learning from this and preparing for the future. We're all realizing that we have a lot of work to do to modernize the federated public health data in the United States, which still suffers from being largely paper- and fax-based. Post-pandemic, we need to be introspective and should convene a 9/11-like commission to look at how to do just that. How do we modernize the IT that supports our public health system? By doing that, and by promoting things like the 21st Century Cures Act rules, we're going to emerge from the pandemic stronger and better.
VN: Is there anything else I should know?
ES: They say, “all politics is local.” I would add “all public health is local.” The federal government plays a very important convener role but, at the end of the day, we rely on public health workers on the ground and in contact with citizens, patients, and providers. They're doing a tireless and amazing job. Most states have solid data collection systems but we live in a federated world where states operate independently from the federal government. We need to design a modern system that allows for a federated approach and that also cares for security and privacy, and I think we're well on the way to doing that.
(Image source: redpointglobal.com)