Hello everyone. I'm Subin MV. I lead the Life Sciences Tools and Diagnostics team at Guggenheim Securities. Thank you for joining us today for our inaugural Healthcare Innovation Conference. It is my pleasure to be hosting Seer. Joining us is David Horn, CFO. Thank you for being here.
Thanks, Subin.
We'll start with a presentation from the team and then move on to Q&A.
Sounds great. Thanks, Subin. Thanks so much for having us here, and thanks to the whole Guggenheim team. It's a pleasure. I'll just run through a quick overview of Seer. I'm David Horn, the President and CFO. Seer was founded in 2017. We launched our flagship product, the Proteograph Product Suite, for limited release in January of 2021 and for broad commercial release in January of 2022. Since then, we've made great progress toward our mission, and today I'm excited to share an update on that, along with some of the customer data that's been generated with our technology. Just a word of note that we will be using forward-looking statements in this presentation. Seer's mission is to redefine what's possible by pioneering new ways to decode the biology of the proteome to improve human health.
Similar to how next-generation sequencing transformed the world of biology, we believe proteomics is the next frontier in understanding biology, and it will have an even more profound impact. Seer's technology is uniquely positioned to overcome the limitations that previously prevented the scientific community from discovering the biological insights embedded in the proteome. Over the last 20 years, there have been significant advances in large-scale access to deep genomic information, resulting in the identification of over 1.3 billion genetic variants across studies. Yet we still don't have functional context, at the protein level, for the majority of that genomic information. Biology is complex. As we move from approximately 20,000 genes in the genome to the proteome, it is estimated that we will encounter millions of different protein variants with distinct functions that originate from those same 20,000 genes.
Variants of proteins originating from the same gene can have vastly different functions, and in some cases, opposite functions. These findings have been reported in a large and growing body of peer-reviewed publications, exemplified by the papers on the right. Protein variants range from single amino acid differences to alterations in entire protein domains to post-translational modifications. We believe the best way to discover the biology embedded in the proteome is through deep, unbiased proteomics at scale. Large-scale access to deep, unbiased proteomics will help decode the complexity of the proteome and annotate the function of genetic variants, meaningfully advancing our biological insight. Seer is uniquely positioned to power unbiased deep proteomics. Plasma is the most accessible biosample for population-scale studies.
Prior to Seer, deep, unbiased plasma proteomics at population scale was simply impractical: the largest unbiased study comprised only 48 samples, and the deepest study, published by researchers at the Broad Institute, reached a depth of 5,300 proteins prior to the launch of the Proteograph Product Suite. Since the launch of our Proteograph, we've seen exponential progress in the size and depth of the studies our customers are running. We shipped our first Proteograph to a customer in late 2020, and by 2022, just two years later, we were seeing multiple studies of over 1,000 samples completed, with the deepest customer study running at over 6,000 proteins. And we're not done yet. In 2023, our customer studies delivered an average of 7,400 proteins per study, with over 10,600 proteins observed across studies.
And the scale of studies was also beginning to rise sharply, with one of our customers, PrognomiQ, initiating a study of 15,000 samples. Importantly, the number of proteins our customers see largely represents the number of protein groups in their studies, meaning that in large part they're not accounting for protein variants arising from single amino acid changes, small insertions and deletions, and PTMs. If you accounted for those, the number would be much larger, exactly as we saw with genomic variants. I believe we'll see a growing body of evidence from our customers demonstrating the biological insight that is uniquely delivered by deep, unbiased proteomics at scale, and I'll highlight two examples toward the end of the presentation. So we believe we're well positioned to become the definitive tools leader in proteomics.
Our Proteograph Product Suite, which includes the Proteograph XT Assay, the SP100 automation instrument, and the Proteograph Analysis Suite, enables nearly any lab to take on large-scale proteomics through a fully automated solution for deep, unbiased proteomics studies at scale. The Proteograph XT Assay increases sample throughput by two and a half times over our previous assay without compromising performance, all with less than one hour of mass spec time, enabling further scaling of studies. A single technician can now run hundreds of samples per week with minimal hands-on time, and researchers can analyze 10,000 samples per year with one Proteograph XT and one leading mass spec system, enabling large-scale deep, unbiased proteomics. So Seer solves some of the key challenges of conventional deep, unbiased proteomics by addressing the wide dynamic range and complexity of proteins in biological samples.
These include limitations of scalability, cumbersome workflows, and the need for extensive equipment, manual labor, and time. Seer has uniquely solved these problems through our proprietary engineered nanoparticles, removing complexity and enabling access to proteomic content at a scale, speed, and depth previously not possible. As a result of these attributes working together, our customers can obtain highly accurate, reproducible, quantitative measurements of the proteome across a broad dynamic range. Importantly, these measurements are made at a 1% false discovery rate. Our technology is applicable to a wide range of sample types and, importantly, works with any species, including the model organisms typically used in medical research and drug development. The Proteograph enables differentiated biological insights, including protein isoforms, protein variants, and more robust pQTLs, and can be used in a range of applications such as biomarker and drug target discovery, model organism research, and QC in biomanufacturing.
We're at the beginning of large-scale proteomics, and these types of studies are just the tip of the iceberg of what we can enable, so our market opportunity is large and growing. The proteomics market is estimated to be about $27 billion, and the Proteograph Product Suite can be used to accelerate our understanding of this biology and human health. With the discovery of novel proteomic content, uniquely enabled by the Proteograph, and the demonstration of its biological value, we expect entire ecosystems and end markets could be created and expanded, much as we've seen in the genomics market. We launched the Proteograph XT Assay kit at the ASMS conference in June of 2023, and the feedback on this product has been fantastic.
It provides, like I said, two and a half times the throughput of our first product while concurrently reducing the mass spec time per sample by an equivalent factor. A portion of the data on the right was presented by Thermo Fisher Scientific at the ASMS conference, where they showed impressive performance with their newly launched Orbitrap Astral, detecting approximately 760 proteins in neat plasma. The same plasma sample, when processed with the Proteograph XT, results in the identification of over 6,000 proteins, representing an 8x improvement, with protein abundance CVs equivalent to those without the Proteograph, confirming a high degree of reproducibility for our workflow. Notably, the approximately 6,000 protein identifications are on a per-sample basis. When you look at proteins observed across about 2,500 samples, well over 10,000 protein groups were detected using the Proteograph XT with the Orbitrap Astral.
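The 8x figure quoted above follows directly from the two protein counts; a quick sanity check, using the transcript's approximate round numbers:

```python
# Sanity check of the ~8x fold improvement quoted above.
# Both counts are the approximate figures from the presentation.
neat_plasma_proteins = 760    # Orbitrap Astral on neat plasma
with_proteograph_xt = 6000    # same sample processed with Proteograph XT

fold_improvement = with_proteograph_xt / neat_plasma_proteins
print(f"fold improvement: {fold_improvement:.1f}x")  # prints "fold improvement: 7.9x"
```

With the round numbers given, the ratio works out to roughly 7.9, consistent with the ~8x improvement cited.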
It's exciting to see how the Proteograph consistently improves mass spec performance. We work with any mass spec type, and over the years, the performance of mass spec platforms in terms of depth of protein coverage has been improving; I expect it will continue to do so over time. The dark blue bars in this chart represent the performance improvement across generations of mass specs from Thermo Fisher, Bruker, and SCIEX. The teal bars represent the performance improvement of these mass specs with the addition of the Proteograph Product Suite. First, you see that the fold improvement in performance that the Proteograph Product Suite provides is preserved across mass spec generations. And second, you see that the lion's share of the overall performance, in terms of depth of protein coverage, can be attributed to the Proteograph Product Suite.
Here you see a visual representation of the potential for biological insight when we compare the coverage of the Proteograph XT to a commercially available high-plex affinity-based panel. As I mentioned earlier, over 10,000 proteins are detectable in plasma across about 2,500 samples. These proteins comprise over 150,000 peptide measurements covering over 1,900 Reactome biological pathways, which map to the 29 categories shown on the slide. When we look at the coverage of these categories by the Proteograph, in teal, and compare it to the coverage from the commercially available high-plex affinity-based panel, in dark blue, you can see that the Proteograph covers these pathways far more completely. In addition, we're focused on reducing barriers to adoption for customers who may want to leverage a service model, either for their mass spec work or for the entire workflow.
In 2023, we announced the launch of our Seer Technology Access Center, or STAC, where we provide access to the Proteograph XT and Thermo Fisher Orbitrap Astral. This year, we opened an additional STAC in Germany to serve our European customers. We now have served 66 organizations, including 10 large pharma customers, with a continued strong pipeline of opportunity for the balance of this year and going into 2025. Additionally, we have an average of approximately 6,000 protein groups per plasma study and a 6x average fold improvement over neat plasma. STAC was an important step to enhance the accessibility of the Proteograph product suite. In addition, last week, we announced the expansion of our relationship with Thermo Fisher Scientific and the signing of a co-marketing and sales agreement.
We're delighted that Thermo Fisher, the leading mass spec provider, is partnering with Seer to co-market and sell the Proteograph Product Suite alongside their Orbitrap Astral mass spec. Under the non-exclusive agreement, starting in early 2025, Thermo Fisher's global sales force will have the ability to quote and sell the Proteograph, enhancing the accessibility of this innovative technology to life science researchers worldwide. The new collaboration makes it possible for customers to acquire the Proteograph in conjunction with their acquisition of the Orbitrap Astral. We'll also conduct joint marketing activities, including conference promotions and seminars, and collaborate on joint research studies, including population-scale studies, to showcase the combined power of the proteomics platforms. Changing the status quo is an enormous undertaking; when we started, there was a huge mountain to climb in front of us.
Over the last few years, we've been enabling the discovery of content and scaling the studies and throughput, making progress, climbing the mountain. Now, we believe we're on the cusp of an inflection toward widespread adoption and revenue growth, as our customers around the globe begin to demonstrate the power of the biological insight that's uniquely enabled by the Proteograph. To that end, we have growing exemplification of our technology, with over 19 peer-reviewed publications and 10 additional manuscripts on preprint servers such as bioRxiv, and these articles are published in high-impact journals such as Nature. So as the body of evidence for the power of our Proteograph continues to grow, we believe there will continue to be advancement toward adoption. Just two quick examples. This was a study performed by PrognomiQ using deep, unbiased proteomics as part of a multi-omics-based blood test for early-stage lung cancer detection.
The study was a large case-control study that included those at high risk for lung cancer. The compliance rate for current low-dose CT lung cancer screening is only 5%-10%. PrognomiQ is developing a blood test to help close this compliance gap by identifying patients at risk for lung cancer for subsequent evaluation. With samples from approximately 2,500 individuals, they used the Proteograph to measure over 8,300 proteins, which were then combined with over 200,000 RNA transcripts and 1,000 metabolite measurements to drive a multi-omics classifier. As you can see on the left, the proteomics data alone performs extremely well, with an AUC of 0.91; adding the other omics data increased the AUC to 0.96. This classifier represents potential best-in-class performance, achieving a sensitivity of 80% for stage one and 89% for all stages of lung cancer at 89% specificity.
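For readers less familiar with the metrics quoted above, sensitivity and specificity are computed from a classifier's confusion matrix; here is a minimal sketch using hypothetical counts (not the study's actual data) chosen only to mirror the 89%/89% figures:

```python
# How sensitivity and specificity come out of a confusion matrix.
# Counts are hypothetical, picked to mirror the 89%/89% figures above.
true_positives = 89    # cancer cases the test correctly flags
false_negatives = 11   # cancer cases the test misses
true_negatives = 89    # healthy individuals the test correctly clears
false_positives = 11   # healthy individuals incorrectly flagged

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)
print(f"sensitivity={sensitivity:.0%}, specificity={specificity:.0%}")
# prints "sensitivity=89%, specificity=89%"
```

The AUC, by contrast, summarizes sensitivity and specificity across every possible decision threshold, which is why it is reported as a single number between 0.5 and 1.0.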
The middle of the slide shows the top features contributing to the classifier, ranked by importance. The top features are heavily enriched for those derived from the Proteograph assay's unbiased proteomics. When we look at where these proteins fall on a standard plasma concentration curve in the right-hand panel, you can see they span the entire dynamic range of the proteome. Only with the capabilities of the Proteograph platform, providing deep, unbiased proteomics at scale, could this classifier have been developed. Finally, this slide highlights a study we performed with Alzheimer's disease researchers from Massachusetts General Hospital. Using a $2 million SBIR grant to Seer and MGH, this study looked at 1,800 plasma samples comprising both controls and individuals diagnosed with cognitive decline, including AD.
The samples were collected over a 10-year period, allowing us to investigate proteins associated with progression of, or protection against, cognitive decline. Comparing the AD-affected individuals with controls, we identified 138 proteins that were up- or down-regulated in AD individuals. Of these 138 proteins, only 44 had previously been associated with AD; the remaining 94 represent putative biomarkers of AD, potentially highlighting new biological insight. We used the clinical information to identify the point of significant cognitive decline and determined the proteins that separated the population into fast and slow decliners. We identified eight such significant proteins and are now investigating how these results may be advanced to develop a score indicating the likelihood of cognitive decline in a particular timeframe.
The reason I say such studies are uniquely made possible by deep, unbiased proteomics at scale using the Proteograph is that 55% of these 138 proteins are not present on a commercially available high-plex affinity-based panel, so researchers would not have known to look for them. And there is no practical way to do deep, unbiased proteomics across over 1,800 plasma samples other than leveraging the Proteograph. So this study represents the largest deep, unbiased Alzheimer's disease proteomics study completed to date, with highly differentiated and novel biological insight. Turning to the financial overview: we continue to grow product and STAC service revenue year over year, and we have an extremely strong capital position, with approximately $312 million in cash, cash equivalents, and investments, and no debt, as of September 30, 2024.
We continue to be focused on stewarding investors' capital by continuing to reduce our cash burn each year. In addition, given our strong cash position, our board authorized a $25 million open market share repurchase program in the second quarter of this year. As of September 30th, we have repurchased a total of 5.7 million shares at an average cost of $1.80, reducing our total shares outstanding by approximately 9%. So we feel very good about where we are from a capital structure standpoint. With that, I'll stop and take any questions you have.
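The repurchase figures quoted above imply roughly how much of the $25 million authorization has been deployed; a rough, illustrative calculation from the transcript's approximate numbers:

```python
# Rough arithmetic on the buyback figures quoted above. All inputs are
# the approximate figures from the transcript; totals are illustrative.
authorized = 25_000_000          # board-authorized program size, USD
shares_repurchased = 5_700_000   # shares bought back as of Sept 30
avg_cost = 1.80                  # average cost per share, USD

spent = shares_repurchased * avg_cost   # ~$10.26M deployed
remaining = authorized - spent          # ~$14.74M still authorized
print(f"spent ~${spent / 1e6:.2f}M of $25M; ~${remaining / 1e6:.2f}M remaining")
```

This suggests a little under half of the authorization had been used as of the date cited, though the actual remaining capacity would depend on the precise (unrounded) figures.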
That was a great overview, David. Thank you for that. Congratulations on the co-marketing and sales partnership with Thermo. How does that partnering fit into your commercial strategy going forward? And could you quantify the tailwind you expect to see in 2025?
Sure. So we're really excited about the Thermo partnership. I think what enabled it is that we originally partnered with them through our STAC: they sent us Astrals when they first launched the Astral last year for our STAC center in Redwood City, and they've subsequently provided some Astrals for our STAC center in Germany. What we found is that the data produced by the combination of the Proteograph and the Astral is just phenomenal, and customers are really excited about it. So we wanted to continue to ride that excitement about the combination of the two technologies. And Thermo is also excited, in the sense that they can now provide the full workflow solution to their customers.
The ability to offer that to their customers who are interested in doing these large population-scale studies, to really see deep into the proteome, is really what motivated it from their perspective. From our perspective, they're obviously a much larger organization, with a geographic breadth and diversity that will certainly help extend our own reach in terms of placing the Proteograph.
Perfect. And after investing to significantly expand your own sales force this year, will this partnership be accretive, or will your sales force shift its focus to a different area?
Yeah. No, it's definitely going to be accretive. So I think Thermo's primary motivation, obviously, is that they want to provide the complete workflow to their customers with the Astral. The way the partnership's structured, they will do the initial sale of placing the Proteograph, in conjunction with our reps as well if we can be helpful. After they make that sale, they'll turn the account over to us for the Proteograph piece of it, so we'll continue to follow up, and all the kit and consumable sales will come directly through us. So it's really just the tip of the spear to get us into the accounts and place a Proteograph, and after that, we'll help drive the usage.
Thermo has the biggest infrastructure.
Right.
Can you talk through your current commercial strategic focus, including some goals? You touched on this a little bit, but can you also speak to the scalability of the STAC program, especially with the Germany site opening?
Yeah. So really, our goal has been adoption of our technology, and given it's a new technology, we're really trying to make it as easy as possible for customers to access. We've done that in a couple of ways. One is the STAC program, which is essentially a service program. Customers who want to run the samples in-house on the Proteograph can run them in-house, but if they don't have access to a mass spec, we will provide the mass spec service for them on the Astral through our STAC. Alternatively, there are a lot of customers who either want to see data before they purchase or simply don't bring these technologies in-house, so they want to consume it as a service, and we will also provide the end-to-end workflow, where they send us the samples and we send them back data.
So it's been very powerful from that perspective. So really, it's about getting as much data out there as we can. It's also been a great strategic asset for us because, as we've said on some of our earnings calls, we've invested with some key collaborators to do STAC services for a reduced price so that we can get data to these key collaborators. And we're working on some really interesting studies now where they will publish the paper, publish the data, and create more excitement around the technology. The other thing we've done is we've done what we call our strategic instrument placement program. So this is where we'll place an instrument with an upfront consumable purchase. So if people purchase the consumables upfront, we'll place the instrument, and then they'll be able to run the instrument and generate data.
This is good for a couple of reasons, primarily around budget. Certainly for academics, they need to get the data to write the grant proposal to be able to buy the instrument, so this gives them a head start: they can get it in-house, start working on it, and then within six to 12 months have the funding to buy the instrument. It's also helpful for pharma, in that they may say, "Well, I'm budgeting for this next year, but I want to get started now." We've had some good success with that, and we've actually had quite a number of conversions of SP instruments that we initially placed with just that upfront consumable purchase, where the customer has now gone on to buy the instrument.
Has that looked different in academic versus biopharma end markets?
Pharma, at least from a budget perspective, tends to move a little faster because they can find the money sometimes if they want to. It's been a challenging environment in pharma, though, certainly from a macro perspective. Academics tend to have to get the data to write the grant proposal and make the case for why they need the funds, so it just tends to take a little more time on the academic side.
Got it. And you mentioned this a couple of earnings calls ago: the sales pipeline is split 60% academic, 40% pharma, but the revenue is the other way around, where 60% of revenue comes from biopharma and 40% comes from academic. How are you prioritizing the conversion of academic interest into meaningful revenue contributions?
Yeah. So what's really interesting, what we found is that, if you look back to the genomics example, academics were really the first to adopt genomics technology and explore what it could do, and pharma, I think, was initially a little skeptical about it. Fast forward 15 years, and with the introduction of these proteomics technologies, we saw the pharma companies really lean in, because they now recognize the benefit of these large-scale studies, especially from a multi-omics perspective. And that's thanks to all the work that the genomics companies, primarily Illumina, have done to drive that. So we saw pharma adopt quickly, or quicker, I should say, than academics, because they understood they had all this genetic information that they weren't able to functionalize.
They have been customers that, like I said, have been able to find the budget. They have the sample numbers, and they have the motivation to try to find novel biological insights. They have leaned in, which is why our current revenue base and installed base is about 60% pharma versus 40% academic. Academic is coming around; they are writing grants and making proposals, and I think over time we'll probably get to a 50/50 split. What we're starting to see now is academic centers getting the grant funding to purchase the instrument.
Got it. Share repurchases have been a capital allocation priority throughout this year. Do you see that continuing in 2025, and how are you thinking about further capital deployment heading into 2025?
Yeah. So we see a fundamental dislocation in our stock. I mean, we have over $5 of cash per share, we were trading around $2, and we're trading north of that now. So we saw it as a great opportunity to reduce the dilution caused by our IPO, our follow-on, and the various option issuances, at a price that is really attractive. And we feel like we're in a great capital position; we continue to reduce our burn each year. So really, the capital is going to be deployed to continue to drive sales and adoption. But given our strong balance sheet and no debt, we felt it was a prudent use of capital to return some funds to shareholders while still leaving us plenty of money to execute on our operating plan.
That also shows a sign of confidence. One final question from Guggenheim: three years from now, what will investors wish they had realized about Seer today?
Yeah. I think it's an incredible buying opportunity. Beyond that, I think people are starting to appreciate the power of unbiased deep proteomics at scale. You have to do in proteomics what was done in genomics, right? You have to study thousands and thousands of samples in an unbiased, hypothesis-free way to really get the novel biological insights that are going to drive improvements in health and cures for disease. I think that's starting to be realized. And Seer is really the only platform today that enables that, to reproducibly study thousands of samples at depth and scale in an unbiased way. The affinity-based panels are biased; it's essentially the GeneChip versus Illumina comparison, right? One's a panel.
You kind of go in with a hypothesis, because you're only going to see what's actually on the panel, versus looking at it in an unbiased way. So I think the notion that you have to do unbiased proteomics at scale to really get novel biological insight is going to become more and more clear. We certainly started to see that over the last couple of years, and certainly over the last few quarters.
Perfect. Thank you for that, David. It was great having you. Thank you, guys, for coming in too. Thank you.
Yeah. Thanks very much.