It looks like we can go ahead and get started. Hi, everyone. I'm Elizabeth Kozlosky. I'm on the Tools and Diagnostics team at GS. I'm here with Sujal Patel from Nautilus. Thank you for joining us.
Thanks, Evie. Thanks for having us.
I think just to get started, it would be helpful to set the stage a little bit, give us a brief overview of the business, particularly the role Nautilus is playing within proteomics and the technology you're in the process of developing.
Great. Well, Nautilus Biotechnology is a six-and-a-half-year-old development stage proteomics company. You know, before I kind of talk about what Nautilus does, maybe just to level set, just for a quick second, what proteomics is and why it's important. Proteomics is the study of proteins in any sample, whether it be blood, tissue, cells, and that is a pretty large unsolved problem in the world today. Genomics is a solved problem. If I take a drop of blood, drop it on an Illumina sequencer, I can get you 99% of the genome in a day for very little money. It's a commodity. For the proteome, that is completely different. If I take that same drop of blood, and I want to understand what proteins are in the sample, the very best technologies on the planet are really insensitive.
They see a small fraction of the proteins that are there. Because of that, there's a dramatic difficulty in leveraging proteins. Leveraging those proteins is really important. 95% of the FDA-approved drugs target proteins. Most molecular diagnostics, even with this huge revolution in liquid biopsy, still target proteins. The inability to measure proteins really hampers drug development, both speed and efficacy, really hampers diagnostic development, which has really slowed the wave of precision and personalized medicine. You know, in this world today, there are a number of new approaches that are being attempted to change that state of the art.
Nautilus is bringing to market in 2024 a very different approach, which is based on an instrument and consumable model that essentially, we're building a product that allows you to measure substantially all of the proteins in a sample with very high sensitivity, like a commodity, like Illumina's done for the genome. For, you know, a few thousand dollars in a day.
Yeah, very exciting opportunity. What is Nautilus's value proposition, and what makes them different from other competitors in the proteomics space?
Yeah, so maybe to describe the value proposition real quick, you have to understand what our customers do today. If you look at the broader proteomics market and the analyst estimates, it's roughly a $30 billion market. It's growing pretty rapidly, 12%, 15% CAGR, depending on who you read. Proteomics is used in a wide range of research applications. If you took large pharma, for example, the process of building a drug has lots of different phases. I have to take populations of sick samples and healthy samples. I have to look at the cells, figure out what are the differences, where are the proteins that I could potentially up or down-regulate using a new compound? If I develop a compound, how does that compound affect the cell? What's the mechanism of action?
Is there toxicity with other types of cells in the body? All of these questions are protein questions. Today, the gold standard in the proteomics world is to use a complex workflow that sits around the mass spectrometer, which is an instrument that, in the proteomics world, is sold by Thermo Fisher, Danaher, Bruker, and others. That complex workflow attempts to understand what's going on at the cell level, but it does an incomplete job of that.
You know, when we're talking to large pharmaceutical organizations, the value proposition for them is that what you have today is an incomplete answer with a very complex process, and what Nautilus is building is an easy push-button instrument that enables you to get, you know, complete coverage of the proteome in terms of the proteins that are in the sample, but more importantly, hundreds to thousands of times more coverage of the sample and more sensitivity. That, in turn, yields better biological insight, which would dramatically improve the efficacy and speed of drug development.
It increases the coverage for development of new diagnostics, and the ease of use really enables the proteomics research to move from what is a very specific set of people who understand how to use mass spectrometers today to what we think is broad-scale adoption across the scientific community, right? Our mantra inside the company is that our mission is that any scientist in any lab in the world who wants a proteome can get one, and that's certainly not true today, because only one in 100 of those labs can figure out how to use the mass spectrometer to effectively measure proteins.
Could you talk a little bit more about the pricing of the instrument and what potential pull-through on consumables could look like as you get towards a more consistent run rate post-developing your install base?
Yeah. The first thing I'll say with respect to pricing is that we're using broad brush strokes today because we're still at development stage, so we haven't launched our formal pricing yet, but it's in the ballpark. An instrument deal is roughly going to be about $1 million. An instrument deal is the instrument, it's a service and support plan, it's the software that runs it. It's a complete package to get the customer up and running. That $1 million is comparable to what mass spectrometers that are used in the discovery proteomics world cost. You know, anywhere from $800,000 to close to $2 million is what those machines cost. In terms of pull-through, our instrument is designed with a number of different sample throughput capacities.
You can run one sample a day, four samples, or up to 12 samples a day, and each of those samples is going to cost a few thousand dollars per sample. If you look at what we think pull-throughs will look like early on as customers start to use it, we think that it's pretty exciting that even with modest utilization of the platform, you can get to consumable pull-through that comes close to $1 million a year, and then ultimately can surpass $1 million a year. In the proteomics world, if you look at the workhorse mass spectrometers, and you look at other products throughout the marketplace and other names, that sort of pull-through is not unusual in the market that we're in.
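The back-of-envelope arithmetic behind that pull-through estimate can be sketched as follows. The per-sample price and utilization figures here are illustrative assumptions consistent with the rough numbers quoted above ("a few thousand dollars" per sample), not formal guidance:

```python
# Back-of-envelope consumables pull-through per installed instrument.
# Figures are illustrative assumptions, not company guidance.

def annual_pull_through(samples_per_day: int,
                        price_per_sample: float,
                        run_days_per_year: int) -> float:
    """Annual consumable revenue per installed instrument."""
    return samples_per_day * price_per_sample * run_days_per_year

# "Modest utilization": 4 samples/day at ~$2,500/sample, ~100 run days/year
modest = annual_pull_through(4, 2_500, 100)

# Fuller utilization: 12 samples/day, 200 run days/year
high = annual_pull_through(12, 2_500, 200)

print(f"modest utilization: ${modest:,.0f}/yr")  # ~ $1,000,000/yr
print(f"fuller utilization: ${high:,.0f}/yr")
```

With those assumed inputs, even modest usage lands at roughly the $1 million a year figure mentioned, and higher multiplexing or utilization pushes well past it.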
Great. And you just mentioned the proteomics market being around $30 billion, growing 12%-15%. In the short, mid, and long term, where do you see Nautilus fitting into this market? Are there key areas within proteomics that you feel you can compete best?
Yeah. 30 billion is that broad proteomics market. If you think about how that market breaks down, about 50% of that market is pharmaceutical research and diagnostic development. Those are the commercial enterprises that are looking to build drugs, technology, tests that are going to make their way into the clinic. About 30% of that market is academic and research, the rest is applied, agriculture, environmental, those types of things. For us, the initial focus is squarely on the academic and research customers and on that 50%, the pharmaceutical and diagnostic customers. Inside that $30 billion, that's a big swath in and of itself. The primary area that we want to focus on is customers that have discovery proteomics environments. What does that mean?
It means these are customers that are looking to analyze a sample and figure out what is in there and profile it at the greatest depth that they can, and most of those customers today are using mass spectrometers. In fact, large pharmaceutical organizations have, you know, tens to hundreds of these mass spectrometers, and they're using them every single day and using them to analyze samples and look at what proteins are in them.
What's your view on the trend in research dollars moving more towards proteomics, and do you believe this will offer a tailwind for the sector, for some time?
I think this is a really exciting piece of the proteomics space's story. The overall space I mentioned earlier is growing, like 12%-15% year-over-year. Pharma budgets globally are growing at about 4% or 5% year-over-year. There's quite a bit of share shift moving towards proteomics. I think that is occurring because there is broad recognition across the scientific community that if you want to understand cellular behavior, using the genome or using the transcriptome is absolutely insufficient to understand what's really going on inside of a cell. Cellular behavior is everything with respect to new therapeutic development, to developing diagnostics.
There's a broad recognition in the scientific community that we need to have better proteomics, and there just aren't great solutions out there, and that's really, frankly, why we exist as a company.
Great. Do you have any views on what impact a change in the NIH funding specifically would have on your launch?
Yeah, it's a good question. I mean, if you look at NIH outlays right now, I think we're up like 9%, 10% so far this year. I know the May numbers look pretty good as well. I think that, you know, that's all great because more NIH dollars, the better. As a new entrant who's coming in with a highly disruptive technology, you know, the NIH dollars never go to zero. They go up or down 5% to 10%. From our standpoint, it really doesn't affect us very much. The value proposition for us when we get to a customer is: Hey, look, you're already using many different tools for proteomics. You already have mass spectrometers in your environment.
Instead of investing in another one for growth capacity, instead of refreshing that one that's five years old and buying the new one, take some of those dollars, buy our instrument. It's much faster, it's much cheaper when you look at the total cost of running samples through it, and it gives you a much more complete data set. You know, from talking to hundreds of customers in the commercial parts of our market, so pharmaceutical organizations and diagnostic companies, now, if the product can do what we've designed it to do, we see a great deal of interest from those customers.
That's great. In terms of the academic market, is it easier to apply for a grant, if there's this new technology coming out? Like, post your launch, would it be easier for an academic customer to, like, get a grant approved because of that?
For sure. I mean, a lot of times when you're applying for a grant, you're making the pitch that you're going to make a discovery that's different than what's out there today. A big part of that might be leveraging new equipment like ours to look at biomarkers in a more detailed way, to get a different view of a problem that's been looked at before. I think that certainly we intend to support our academic and nonprofit research partners in making those grant proposals. One of the things that I think is really exciting about Nautilus is that initially, when we get out to market, we will not just be dependent on those academic and research customers. That's different than a company like Illumina, if you went back to 2007 when they launched their first NGS products.
No one in the commercial world knew what to do with the genome, so all the customers were academic and research, and they were beholden to that grant cycle to start with. For us, yes, we're going to go after those customers. Yes, we're going to support their grant proposals, but half of the market is commercial enterprises, and in our conversations, they are absolutely chomping at the bit for better technology to deeply profile the proteins in a sample.
Great. If you had a crystal ball to kind of look into the future, could you talk about Nautilus' expectations for scalability, technology, turnaround time, throughput, and direction of research and development through the Nautilus platform? In, say, 3 years or so, where does Nautilus stand versus larger competitors like Thermo, Bruker, Danaher?
Yeah. I think the thing that you have to understand with proteomics, first and foremost, is that there is a, you know, there's a significant continuum of biological information that you can give the customer, and a lot of it's very useful. You asked for, like, a three-year time frame. If you looked at just, what do we want to launch first? We want to launch a platform that provides nearly complete coverage of the proteome with single-molecule sensitivity, as high as sensitivity could be, and a very wide dynamic range, meaning that we analyze the sample very deeply. For all of the customers in the types of applications I've been talking about, that is very, very transformative. If you look at where they want to go from there, they have a set of more questions beyond just what is the protein.
The next question they want to answer is, how are those proteins modified? You know, unlike the genome, once a protein is expressed, it's modified in many different ways, and those modifications affect the protein's behavior, they affect its conformation, they affect where it's distributed inside of the cell, and affect its messaging. If you want to understand the biological function of proteins in cells, you have to understand those modifications as well. We have a platform that is built to start with simple modification information, and we'll go much deeper over the course of, you know, the three to five years that you're describing. After that point, customers want to understand what cell were those proteins in. Single cell becomes important. After that, spatial becomes important.
That same thing that you're seeing in other areas of the market, that continuum is a continuum that we intend to go down as well. In addition to that, much like you've seen with other instruments, in the three-to-five-year time frame, we're going to look at building lower-capacity instruments at lower cost, and higher-capacity instruments for those higher-throughput applications. Our customers have been talking about us increasing multiplexing so that we can not just do 12 samples a day, but maybe go up to 24, 48, and 96. That enables them to do much larger cohort studies without having to spend as much money and without having to spend as much time. Those are all things that we see are really natural extension areas for us.
The last area, and one that I think is probably the most exciting, is that the information in the proteome is hundreds of times greater than what you have at the genome. You know, you roughly have 37 trillion cells in you, and every one of them, for the most part, has the exact same genome. Every one of them has a different proteome, and that is important to understand, to understand what's going on inside the biology that defines who you are and what's going on in your body. If we can help customers through software, analyze that information, make sense of it, use machine learning and AI types of technologies to help deliver insights instead of just raw data, we can help move up the value chain in the solution that we provide.
We think that over the course of three to five years, that in and of itself is a huge opportunity for us and one that we intend to capitalize on. You know, myself and Parag Mallick, my co-founder, are both computer scientists, academic degrees in computer science. We think about this from a computing and data science perspective first, and a biochemistry perspective second.
We've seen recent improvements in the workflow for mass spec within proteomics, some of which was highlighted at ASMS last week.
Yes.
Could you talk through how Nautilus wins versus mass spec, post your launch of your instrument?
Yeah. ASMS, just to define it for the folks who are listening, is the American Society for Mass Spectrometry. Their major congress was last week, and that's where all the new mass specs come out from our vendors. I think that what we saw at ASMS is just a continuation of what we've seen for many years out of the mass spec vendors. You know, it's just been a couple of decades that mass specs have really been pushing into the protein discovery environment. The mass spec is an instrument that is great at a ton of things. It's great at food safety, it's great at metallurgical analysis, it's great at metabolomics. It's not great at looking at proteins in complex samples. What we saw out of the current crop is incremental improvements on that.
Faster, maybe a little bit more sensitivity, but no real change to enable them to deliver the types of biological insights that we know customers are looking for. From my standpoint, I think that it was great to see that they continued to make progress, but really no material change relative to the wide value proposition that we have with our product.
Great. Just to switch gears a little bit, would love to hear any insight on your manufacturing capacity and supply chain dynamics as you work towards the full commercial launch, specifically on, like, reagent and instrument assembly capabilities.
Yeah, that's an interesting question, right? I mean, from our standpoint, supply chain is something that we have to be exceptional at, and I know it's on investors' minds. I know that there have been a number of newer names that have come public that have had difficulties on the supply chain side. For us, there are actually a bunch of pieces to supply chain, not just the reagent side. There's the supply chain that goes into building the instrument. There's the supply chain for our chip and flow cell, which is a particular sub-assembly that we build that uses semiconductor manufacturing processes and is bonded to glass. Then we have our reagents. In each of them, we have a strategy that is focused on making sure that we mitigate risk.
My head of operations and supply chain, a woman named Mary Godwin, she's got four decades of experience, and she's worked with me for 16 years, for a very long time, across three companies at this point. Our strategy has been to mitigate risk, make sure that we have multiple vendors for all the major pieces, make sure that we're stockpiling things that have longer lead times associated with them or that have some additional risk. Making sure that we outsource to well-known contract manufacturers and CROs as much as we can. Our entire instrument build is already handled by a third-party contract manufacturer, a large publicly traded company. Our chip and flow cell assembly is done outside to our exact specifications, but done outside.
Reagents is a split between our own internal manufacturing and external, we have multiple vendors that we've brought on in antibodies and buffers and so forth. All of that is really focused on mitigation of risk and scale. I feel really good about our ability as we start to get into our commercial launch next year and into 2025, that we'll be able to scale effectively.
Great. Last quarter, you added five new patent applications and four new U.S. patents, bringing the total to 12 granted U.S. patents.
Yep.
Can you talk about any potential patents you have in the pipeline or anything specifically, patent-wise, you're excited about?
What I'm excited about patent-wise really goes back to the fact that, we didn't talk about it very much, but the approach that we use to measure proteins is very, very different from anything that's ever been conceived before. You know, just to give you a couple of sentences for color, there's the standard way to measure proteins outside of the mass spectrometer, and the mass spectrometer is one way, right? Break up proteins into pieces, measure their masses. The standard way is using antibodies. I have an antibody that knows how to detect an EGFR protein.
I observe that there's some kind of binding event, an interaction, and then I say, "Hey, there's some EGFR in the sample." Our approach is very different, and it's based on a data science type of approach, where we take individual molecules, we spatially separate them, and we probe them over and over again, gathering more and more information about the molecule until we can make a high-confidence call on what the molecule is. That type of idea, when we got started in 2016, was never conceived of. The first thing we started doing was building big patent families in each of the major innovations that would be necessary in order to bring this idea to fruition. There's the algorithm that we use to make protein calls. There's the method that we use to create a spatially separated single molecule array.
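The general idea of repeated probing can be sketched in miniature: each probe binds some subset of candidate proteins, and every observed binding (or non-binding) event narrows a posterior over candidates until one identity dominates. This is a toy illustration of iterative evidence accumulation only, not Nautilus's actual decoding algorithm; the probe sets, proteins, and error rate below are made up:

```python
# Toy sketch: identify a single molecule by repeatedly probing it and
# updating a posterior over candidate proteins (hypothetical data).

candidates = {
    # protein -> set of probes assumed to bind it (made-up landscape)
    "EGFR": {"p1", "p3", "p4"},
    "TP53": {"p2", "p3"},
    "ALB":  {"p1", "p2", "p4"},
}
ERR = 0.05  # assumed probe error rate (false bind / missed bind)

def decode(observations, threshold=0.99):
    """Return (protein, confidence) once one candidate passes threshold."""
    posterior = {p: 1.0 / len(candidates) for p in candidates}
    best = max(posterior, key=posterior.get)
    for probe, bound in observations:
        # Bayesian update: each binding event re-weights every candidate.
        for protein in posterior:
            expects_bind = probe in candidates[protein]
            likelihood = (1 - ERR) if bound == expects_bind else ERR
            posterior[protein] *= likelihood
        total = sum(posterior.values())
        posterior = {p: v / total for p, v in posterior.items()}
        best = max(posterior, key=posterior.get)
        if posterior[best] >= threshold:
            return best, posterior[best]  # confident call, stop probing
    return best, posterior[best]

# A molecule that binds p1, p3, p4 but not p2 resolves to "EGFR".
obs = [("p1", True), ("p2", False), ("p3", True), ("p4", True)]
protein, conf = decode(obs)
print(protein, round(conf, 4))
```

The key property the sketch shows is that each additional probe multiplies down the probability of the wrong candidates, so confidence in the correct identity compounds quickly even with noisy individual measurements.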
There's the technology and the machine learning behind building the probes and building the antibodies. In each of those areas, for the last 6.5 years, we've been building, you know, very meaty patent families, and we're starting to see, as you pointed out from our core numbers last quarter, we're starting to see a lot of them make their way through the system and starting to get granted. For me, that's really exciting. You know, the IP is what's going to give us a significant protection for this idea throughout the world. That's one piece of the protection.
The other piece of the protection is that it is really hard to do what we're trying to do. We didn't try to talk about it, but the types of people that we've had to hire and the types of talent that we need to execute on this idea is few and far between. We've built an exceptional team to get this done.
Great. Could you give us an update on the findings with your First Access Challenge and any feedback you've heard from researchers in relation to your platform?
Maybe I'll take a quick step back and describe what the First Access Challenge is. One of the things that we have done, I think, really well as a company is from very, very early on in the company's life, we have spent a ton of time talking to customers, tuning the product market fit, understanding the requirements incredibly well, and that's helped us build really close relationships. Over the course of the last couple of years, we've formed a number of early collaborations with folks like Genentech and Amgen and MD Anderson. Those are some of the folks who are getting the first look at data coming off of our platform. Last year, we decided to host something that we call the First Access Challenge.
What we did was, we invited researchers around the world to submit proposals that essentially look like a grant proposal, to have free access to our platform at about the same time as those early collaborators do. We were incredibly excited to have received a significant number of proposals, and a lot of work went into it. We've chosen some interesting winners of that challenge. For example, the Buck Institute, which studies aging, is going to be focused particularly on acute kidney injury and understanding how the proteome changes during the early stages of that injury. We've got other customers who are focused on brain cancer, for example, and other folks in that First Access Challenge that represent a pretty interesting, diverse set of applications.
Once we get a little further in our platform development, our early collaborators and those First Access Challenge winners will be the first early customers that all have access to the platform. We'll use the data there, one, to get them excited, and, more importantly, to publish and get the scientific community as a whole excited about the types of things you can see on our platform and why that's important for research.
All very exciting.
Yeah.
You touched on it a little bit, but how do you expect these projects to highlight the Nautilus platform?
It's a great question. One of the lenses that we used when reviewing the submissions that we got was trying to find applications that would highlight the incredible sensitivity and dynamic range of our platform. Dynamic range is defined as how deeply we can see into the sample. In each of the applications that we've chosen, the wide dynamic range is necessary to see the minute quantities of markers that will be present in these samples. That's an application that really cannot be tackled with mass spectrometry or any other approach that's out there. For me, those would be some of the most exciting things that'll come out of that First Access Challenge.
You know, we're already starting to see the early fruits of our labor here with Genentech, which is an even earlier collaborator. At a few conferences last year, we did a joint talk with them, and we've done poster presentations where we show how we've been able to look at one particular protein, in their case, the tau protein, which is a key biomarker in neurological disorders like Alzheimer's. We've been able to look at the modification landscape in a way that can't be done with any other platform that's out there, because none of them can take one molecule and continually probe it to gather information. For them, that's incredibly exciting. They're looking at these modifications and trying to figure out, are they indicative of therapeutic response?
Therapeutic response is very poorly understood, and if we understood it much better, we'd be able to figure out how to titrate quantity of a drug or figure out if a drug's gonna work or not. For them, it's a really exciting area of research.
Great. Moving more towards the financial profile. Ahead of the full commercial launch expected in mid-2024, how are you managing your cash runway effectively with the right balance of organic and inorganic investments, while also remaining prudent in spend to ensure cash runway can be maintained well through 2025, which is what you guided to?
Yeah, that's right. You know, cash management for us is incredibly important, and I think that, you know, if I look around at my peers, I think we probably do a better job of it than most, right? If you looked at our cash balance at the end of last quarter, it was $302 million. Last year's burn was about $48 million of cash. We have been working very, very efficiently. I know that everyone else tells you that, but, I mean, I can tell you from my past as well, right? I mean, I was founder and CEO of a company in the tech space called Isilon, back 23 years ago.
That company went public in 2006, and before we sold it in 2010 for $2.6 billion, we had our first profitable year. We had the business at a 20% positive operating margin when we sold. The year after acquisition, it was at a 40% cash margin. Like, myself and Anna Mowry, who's our CFO here and was with me on that last journey, spend a huge amount of time and energy making sure that culturally, the company is spending every dollar in the absolute most effective way that it can. You know, we know, just like with the Lehman Brothers bankruptcy last time, that times turn and they turn quickly, and you've got to make sure you've got a cash balance that supports the development, commercialization, the launch, and all of those sorts of things.
We feel really great about the cash balance and expect to continue managing it very tightly.
Absolutely. Can you give us a reminder on how you expect OpEx to trend post the commercial launch?
Certainly. Not just after commercial launch, but heading into commercial launch, as we continue to build new capabilities around manufacturing and the development team, we expect that the burn is going to go up. Through this year, you should continue to expect that the OpEx is going to modestly go up. Certainly, as we head into commercialization, we have a commercial team build in front of us, and we intend to use the current cash that we have on hand to complete development, build that team, and get into the ramp before we really have to go out and tap the market for capital. That's, for us, that's the plan.
The one thing I'll mention on the commercial build, one of the biggest levers that determines whether your commercial build is gonna be efficient or not, is really the transaction size and the pull-through, right? With a transaction size of roughly $1 million for one instrument and a pull-through of, you know, what we'll get to, hopefully, closer to $1 million as the customer starts to operationalize the equipment, you've got the recipe for a very efficient sales motion. You know, in the long run, we expect our gross margins to trend to 70%. That together creates the pieces that you need for what will ultimately be a profitable business model.
Great. You mentioned it earlier, but as you look towards launch and like, 50% academic, 50% biopharma, how do you view the spend environment in biopharma right now, going into your launch?
I mean, I think the spend environment is quite good right now. Yes, the CFOs in all of these larger organizations have clamped down. Yes, there are incrementally one more approver or two more approvers, but the spend is there. Is it modulated down 3%, 4%, 5%, 6%, 7%? Maybe. For a disruptive technology that can have an incredible outsized impact on research, those dollars will be there. From my standpoint, I think that if this environment persists into our early commercialization period, which we think it probably will, yeah, it'll be incrementally a little bit more difficult to get a deal done, but I think the dollars will be there, and I think that for technologies that are disruptive, like we are bringing to market, there's always an opportunity.
Great. Just to wrap up, can you talk about any key milestones or metrics that we should be paying attention to throughout the next year?
Yeah, that's a good question. You know, we haven't been giving a lot of intermediate milestones, and with the equity markets the way they are, there's not a lot of reason to want to do that, and so we haven't done a lot. What investors have been doing is following along with the journey that we're taking the scientific community on. Let me try to double-click and explain that. My co-founder, Parag Mallick, Stanford faculty, he has been in the proteomics world for his whole career, 25 years. He's a key opinion leader in the proteomics world.
He spent a lot of time with all of the KOLs, trying to understand how would they want to see us explain our platform and show our progress in a way that they can be supportive of what we're doing. What we learned from them was that they really wanted to follow along on our journey at each step. Years ago, we started with publications, focused on foundational pieces of our platform. How is it going to work? Last year, we started to add data. We started to show more and more data through our posters and our conference presentations. We're taking the scientific community along on this journey. In Q4, for example, we were at the Human Proteome Organization conference. We showed 24 of the different reagents that we need in our platform.
We showed decoding of those model proteins in a very simple sample. As we move through the course of the next year, we're going to get to a milestone at some point here where we will be able to decode a few thousand proteins out of a human cell sample. It's at that point, really, where there's a significant inflection point for us. For us, in order to develop the complete platform, like 75%, 80% of the work is just getting to that point, and then from there, there's a little bit of additional development and reagents that gets us the rest of the way there. We know that a lot of investors are watching for that publication as a key inflection point and a catalyst in our business.
I think that, you know, for us, internally, it's a catalyst as well. By the time we can demonstrate that data and publish it, we'll really be looking to sign up early access customers for our launch as well.
Great. Thank you so much for the time.
Thanks, Evie. Thanks for having us.