How a Massive NIH Consortium Is Uniting Scientists to Expand the Toolbox for Gene Editing Therapies
The National Institutes of Health (NIH) just announced the next steps for its ongoing Somatic Cell Genome Editing (SCGE) program, a massive, years-long initiative that is intended to update and advance gene-editing technologies and the underlying science that makes clinical gene therapies possible.
The SCGE consortium, which is made up of scientists contributing to 45 new research projects, was granted $190 million to pursue these goals over 6 years while keeping a keen eye on data transparency along the way.
A long list of prominent scientists, all collaborators or Principal Investigators (PIs) in the consortium's various projects, published a perspective article in the journal Nature on Wednesday to explain where the field is today, where it's going, and specifically how their research will get it there.
Erik Sontheimer is one of the authors of this perspective piece. He kindly took the time to share with CRISPR Medicine News what the consortium wants to accomplish and where he thinks the field will go from here.
Leading scientists focusing on somatic cell editing
-Before we get into the details of this perspective and this initiative, let’s talk about the list of authors. There are many prominent scientists here — what brought everyone together into this particular group?
In early 2018, the NIH put out a series of requests for applications under a number of different initiatives. A whole bunch of people applied, and they got reviewed by study sections in the usual way. Some got funded, some didn't. The list that you're looking at on the paper is the PIs of the various grants that were funded after that round of review. So yes, there are many eminent leaders on that list. At the same time, there are others out there who are not listed in this perspective, who have ancillary roles with projects but aren't PIs.
-Before we get into details, one thing that struck me is how the perspective almost starts with a warning, with emphasis on editing somatic cells versus germline cells. I understand why that is a critically important distinction, but can you talk a bit about that decision and where those concerns stem from?
Sure. I mean, this preceded the project itself. That was an actual organising principle from the NIH. I think [NIH director] Francis Collins in particular was quite adamant that they did not want any He Jiankui-type scenarios on their hands.
Basically, there are many unmet technological needs even if we just focus on somatic editing. That avoids the ethical landmines while still addressing the technological needs that already exist. I think it just made for a very clear picture that to whatever extent a societal debate needs to happen, regarding whether and how to move forward, that's just not going to be part of this program. There have been various prominent published calls for germline editing moratoria, and so forth, and Francis Collins has signed on to some of those. I think that this is just consistent with that overall framework.
Delivery and third-party testing are major priorities
-The NIH allocated $190 million for some 45 projects over six years. I'm curious in a very concrete, tangible sense: what do you expect this group to have accomplished after those six years?
Some of the projects are about developing new genome-editing platforms — new things that are like Cas9, but different. In addition, there is a general sense that we need to deepen our understanding of the potentially adverse consequences of editing. Are they in fact adverse? Or are we worrying too much about some things and not enough about others?
But by far the biggest unmet need they recognised for therapeutic somatic editing, and the largest part of the program both in the number of funded groups and in the dollars going to it, is delivery. This is not exactly a controversial insight. Pretty much everybody recognises that delivery is currently the main limitation. I think there were 20 awards issued for delivery research, and those were a bit larger than most of the other ones.
-This programme goes into specific and precise unmet delivery issues, right?
Yeah, that's right. One of the projects that I'm part of, in the delivery initiative, will de-emphasise certain tissue targets — for instance, the liver, the eye, and ex vivo editing. It's not like those are abolished or banned; it's okay to be looking at them, but they are definitely not the consortium's priority. It's not because they're not important — they are — but because that's where the field is already furthest along. They wanted to focus on what they think is the longer-term unmet need.
Another aspect of the delivery initiative that I think is particularly noteworthy is the requirement for third-party testing. There's been a lot of press over the last 15 years about rigour and reproducibility. The pharma company Amgen undertook a widely publicised study about 12 years ago, where they tried to reproduce the results reported in a substantial number of pre-clinical publications. They could reproduce about 7% of them. It was a shockingly low percentage. Ever since then, the NIH has been pushing very hard to promote rigour and reproducibility because 7% is an embarrassment and it's not acceptable. To help promote that, they set up the delivery part of the consortium to involve these testing centers.
Each group is promised three years of funding at the outset. By the end of the third year, these groups have to send their materials to a separate group that performs small animal testing, e.g., in mice. Testing centers have to take the materials provided by these other groups and use them themselves with their own hands on their own premises. They have to show that what other groups developed in their own labs actually works somewhere else. It's only if you hit your milestones with this third-party testing that you progress to the fourth and fifth years of funding, which is where it can be translated into large animals.
The third-party testing thing, I think, is a particularly innovative aspect of the consortium.
-I'm also interested in the aspect of data transparency that came up a few times in the perspective. What kind of commitments have consortium members made about sharing results, techniques, and so forth with each other and the public?
Everybody recognises that this is a space with commercial interests. We all work at particular institutions with tech transfer offices, IP filings, and so on and so forth. So, it's not like everything [will be shared] immediately. However, there are provisions for a SCGE toolkit, which will be an online portal. Groups are required to deposit their data into this toolkit. There are various data tiers in the toolkit and some won't be public-facing. But eventually, and certainly by the time findings get presented and published, all the data can be accessed on that portal.
I don't want to overstate the transparency, because the toolkit site is not going to go live externally until early 2022. It has to be built, the data has to be generated, and everybody has to have a chance to prepare whatever IP filings, publications, presentations, and so on that they need to do. Eventually, the idea is to make this toolkit accessible to the broader scientific community so that the technologies that emerge from the consortium can be disseminated.
A toolbox of methodologies for gene-editing therapies
-We talked earlier about how targeting the liver or the retina, or using ex vivo editing, is now well-established. When all is said and done, do you have any predictions about what the new go-to methodologies in the field might be?
You know, I don't think there's going to be a go-to technology. I think it will be all of the above, depending on the target tissue. There are some groups in the consortium developing lipid nanoparticles for either RNP or messenger RNA delivery. Others are developing new viral vectors. There are others again that are doing straight up RNP delivery to target the lung or the central nervous system. Some are doing local, while others are doing systemic administration. I'm running through this whole roster because it's really going to be case-by-case.
There won’t be one single technology that serves all needs. The SCGE is trying to cover as much ground as they reasonably can, given that time, personnel and dollars are not infinite. They have to make some choices, but at the same time, they are really trying to cover a lot of territory on the delivery side.
Need to understand immune responses and consequences of off-target edits
-You talked about delivery already, but what other urgent unmet needs are there in the field that must be tackled in the immediate term?
I think that immune responses against the editing machinery or against a viral vector, for instance, are certainly crucial considerations.
Part of the question, though, is understanding how much of a problem it really is. What are the potential workarounds? There are some delivery modalities, such as mRNA and RNP, which by their very nature make immune responses likely to be a transient issue. Those delivery modalities would be expected to be lower risk with respect to immunogenicity because they'll be a one-and-done type of thing. You might not have to repeatedly dose, so even if you have an immune response against an initial transient delivery system, you won't need to face that again with subsequent administrations. We'll see, but that's part of the goal.
There's certainly also interest in understanding whether a certain degree of off-target editing is tolerable. So, in other words, you might design a CRISPR treatment to generate a small indel, and in many instances it does. But to what extent do you also get much larger indels, or rearrangements or translocations, or things like that? That's actually a surprisingly difficult question to answer. One of the initiatives is about biological systems for understanding the consequences of adverse events.
Method development a major focus for this consortium
-This consortium seems primarily focused on improving gene therapy methodologies, safety, efficacy, and improved targeting. Aside from the restriction against germline editing, have other conversations come up around research ethics in general?
There's an organisational hub called the Dissemination and Coordinating Center. They're important for general organisation and for developing the toolkit. But they also have team members who are focused on research ethics, and they have components of the so-called outreach subcommittee that is considering those things. You're right: ethics is not at the core of this perspective. I think that was also part of the decision to sidestep the question of germline editing.
-Because germline editing would also raise many ethical questions?
Yes, that's a big potential stumbling block that we made a non-issue, because it's simply not going to happen under this consortium. Yes, there are components related to research ethics, but development of the methodology is the primary focus.
It is intended to eventually translate into the clinic. But there aren't any aspects of the consortium that really deal with human subjects.
Research during a pandemic
-This initiative really began several years ago. How has it changed during the pandemic?
It's been quite a trip trying to make this consortium work during a global pandemic with lab shutdowns. We're certainly not unique in that regard, but that's an aspect of this that nobody counted on.
Institutions shut down our labs for periods of time. A lot of these grants are so-called cooperative agreements with specific milestones and dates attached to them. The reality is that some of these deadlines had to be relaxed a little bit and pushed back and so forth. There was so much uncertainty at the beginning of the pandemic, wondering, "Are we going to get back into our labs anytime soon?", but with some adjustments we've managed to keep most things on the intended track. Now we're hopeful that, like everybody else, we can come out the other end and really get back to full speed.
Link to original article in Nature:
Dan Robitzski is a science journalist and former neuroscientist based in Los Angeles.
Note: this interview was condensed for length and clarity.