Butterfly Network, Inc. (BFLY)

TD Cowen 44th Annual Health Care Conference 2024

Mar 5, 2024

Joshua Thomas Jennings
Managing Director and Senior Analyst, TD Cowen

Good morning. Josh Jennings from the TD Cowen Medical Devices team, and we are going to kick off day two of the 44th annual TD Cowen Healthcare Conference Medical Devices Track with executives from the Butterfly Network team. We're really excited about this year for Butterfly, and we are really appreciative of Heather Getz, CFO, and Joe DeVivo, CEO, for joining the conference this year. I'm gonna hand it over to Joe to walk us through a presentation, and if there's time at the end, we'll take some questions. Joe, thanks so much for joining this year, and I'll hand it over to you.

Joe DeVivo
Chairman, President, and CEO, Butterfly Network

Josh, thank you so much, and thanks to everyone at TD Cowen for inviting us. It's an amazing conference. So let's give you an update on Butterfly. We ended the year a bit stronger than we thought. For those of you that don't know Butterfly, we are in an awesome market with an incredible set of products, and there's a huge opportunity. But last year, we felt there was a need to reset. There was a need, given the capital markets, to restructure, to preserve our cash, and give ourselves a longer runway. Also, as you'll learn, Butterfly is a handheld ultrasound company. I'll unpack the technology for you in a moment. But it's a company that can do a lot of things. It can work in all different clinical specialties. It's an all-in-one probe that is digitally driven.

You can also run into the trap of trying to do too much. So we decided to focus: focus on our core business, focus on our core markets, focus R&D, focus sales. And we've emerged at the end of 2023 with some really nice sales momentum. We announced our fourth quarter a couple weeks ago. We are seeing that the changes we've made in our model are working and that our business is reaccelerating. I'll walk through some products in a moment. But as we enter 2024, we're lean. We have a very strong management team. We have some wonderful catalysts for growth. And 2024 is gonna be an absolutely great year for Butterfly. We're gonna kinda rip the rearview mirror off the windshield, and we're looking forward because the future's bright.

So, in this upcoming year, we did guide that we will see double-digit growth. We've reduced our burn substantially, to an investing burn of around $50-$60 million this year. We have about $130 million of cash. And we are really excited about what we're gonna do, because even as we've focused ourselves, we still invest heavily in R&D. So for those who are new to the story, let me unpack the technology for a moment. A standard ultrasound device uses piezo-based crystals as a lens, a transducer to focus energy, and then power to generate that energy. With the standard devices, as you see on the right, there are different frequencies that you need for different parts of the body.

If I wanna do an abdominal scan, I'm gonna use one type of probe that delivers a certain depth and shape of the energy. If I'm gonna do musculoskeletal, I'll use a different probe. If I'm gonna do cardiac and pulmonary, I'll use a different probe. That's what's standard. If you ever look at an ultrasound cart in a hospital, it has four different handles, four different transducers, and it's very typical that each of those is used for a different part of the body. Well, about a decade ago, it was found that semiconductors can be used to generate this energy. Semiconductors can form these curvatures and can form these wavelengths, and you can do it all with one semiconductor. So this innovation is based upon a MEMS technology that sits on top of a silicon wafer.

On top of the MEMS, there are 9,000 little sensors on this rectangle. And then we can direct – that's a great ringtone.

Heather Getz
Executive Vice President, Chief Financial and Operations Officer, Butterfly Network

My daughter.

Joe DeVivo
Chairman, President, and CEO, Butterfly Network

So she's fun. Now, on this wafer, there are 9,000 sensors, and we can control all of those 9,000 sensors, just like a pixel on the LED screen of a TV. So what is the benefit? Well, the benefit is we can now have one probe that can do all the different use cases that all the different handheld ultrasounds would have. If I'm an emergency room doctor and I don't know what I'm gonna encounter, instead of having to identify which device I would go grab, I'm able to always have one device plugged in. I identify which part of the body I'm scanning, and whether I'm scanning abdominal, cardiac, or pulmonary, it tunes the device specifically to that application, and I go.
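
To make that idea concrete, here is a minimal sketch of one programmable array being retuned in software for different exam types. All names and values are illustrative assumptions, not Butterfly's actual firmware or parameters:

```python
# Hypothetical sketch: one programmable transducer array retuned per preset.
# Preset names and values are illustrative, not Butterfly's actual parameters.

PRESETS = {
    "abdominal": {"freq_mhz": 3.5, "depth_cm": 18, "beam_shape": "curved"},
    "cardiac":   {"freq_mhz": 2.5, "depth_cm": 16, "beam_shape": "phased"},
    "msk":       {"freq_mhz": 10.0, "depth_cm": 4, "beam_shape": "linear"},
}

class UltrasoundOnChip:
    """One semiconductor array standing in for several fixed-crystal probes."""

    def __init__(self, n_elements: int = 9000):
        # Each element is individually addressable, like a pixel on a screen.
        self.n_elements = n_elements
        self.config = None

    def apply_preset(self, name: str) -> None:
        # Retune every element in software instead of swapping probes.
        self.config = PRESETS[name]
        print(f"{self.n_elements} elements tuned for {name}: {self.config}")

probe = UltrasoundOnChip()
probe.apply_preset("cardiac")    # same hardware...
probe.apply_preset("abdominal")  # ...different acoustic behavior
```

The design point is that the probe choice becomes a software setting rather than a hardware purchase.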

Within three seconds, I plug the phone in, and I'm scanning exactly what I'd wanna have. We have over 1,000 of these, for example, in Ukraine, where medics are using the device because they simply aren't gonna carry four different devices or have a cart. They're gonna have one device 'cause you don't know what you're gonna encounter. That's why we do so well in emergency rooms, and we do so well with individual doctors in primary care practices. And so by tuning our digital capability, we have built a whole market and have grown. Point-of-care ultrasound has been around for a long time. Initially, old carts were repurposed in the emergency room, and doctors were starting to look at things that they never looked at before. They found tremendous benefit.

Since then, handhelds have come out to make a portable device, but Butterfly was the first all-in-one device. And the all-in-one device is not only powerful, but it has the ability to grow in its processing, and I'll talk about that in a bit. But moving from a cart to one portable device is a lot of change management. It's change management because if you're gonna get an ultrasound normally, you would have an appointment, you'd have an ultrasonographer in a high-cost facility, you'd then get that scan, and then a specialist would read that scan. But imagine if you could train a doctor how to read the ultrasound.

Imagine, just as comfortably as they take their stethoscope out and listen for pulmonary sounds, how easy it would be if they could just pull a device out and scan your lung. And if you have abdominal pain, they can just scan your abdomen to see if you have appendicitis, a bowel obstruction, or some other potential challenge. So that is the ultimate goal. So what is our model? Our model is we sell probes. We sell annual subscriptions to the software. We also sell enterprise middleware software to help manage ultrasound in hospitals. And we sell professional services. But our business model is also evolving, because we've been adding some things in 2023 that are very exciting.

Many people believe that one of the ways to bridge from an ultrasonographer who's an expert in ultrasound to training doctors and even nurses on how to do ultrasound is AI: using AI tools to help automate the capture of an image, and then using AI to potentially even interpret that image and spit out data. Well, what we have done is we have created an environment called Butterfly Garden.

Butterfly Garden is very similar to an app store. We have opened up our platform to allow third parties to develop their ultrasound AI, so a customer can capture an image with a Butterfly, plug it into their phone, open up a specific new AI app, and then run the read that comes in from Butterfly through the unique algorithm created by that AI. So, I'll talk to you a little bit more about that, but this over time is gonna become a nice contributor to our top-line growth and to our gross margin.

We've also launched Butterfly Certified, where we have advanced certification training, 'cause the bridge from the ultrasonographer knowing how to do a scan to having a doctor or a nurse trained is a very big step and a very big leap. And so we invest a lot in education. And then, on our semiconductors, we have over $500 million of invested capital in the development of our semiconductor technology. We see that that particular semiconductor has value outside of point-of-care ultrasound, and we are now starting to sign licensing deals to empower this technology to be used in new markets. So in how we look at the market, the vision of the company is one device per doctor.

If we can replace the stethoscope or augment the stethoscope and every doctor can have a probe, then that would be a very large market. It could be used in multiple places, whether it's in acute care facilities, outpatient, or home. These are the three areas that we are targeting in the evolution of our technology. So how big is that market? Well, there's about 1 million doctors in the US, about 7 million to 9 million worldwide. And so at a rough $2,000 per device, you can see the size of that market. And as an example of what AI tools enable, a year ago we trained 500 midwives in Kenya to determine the fetal position of the baby.

'Cause if they know the position of the baby prior to childbirth, then they can actually help reposition that baby for a much higher chance of a successful birth. And 20,000 women a year die in childbirth, so it's a very, very big issue. With the help of the Gates Foundation's funding, we were able to train 500 midwives to do this. So the concept that ultrasound can be used not just by an ultrasonographer, but by medics in the field, ambulance drivers, emergency room doctors, and even midwives is very possible. But how do we reach them? In the U.S., there are over 4 million nurses, 27 million worldwide. So imagine if we crack that code and are able to reach that many people. But then also, ultrasound is needed to help diagnose chronic diseases.

There is a path for us, with the continuing development of our semiconductor technology, to dramatically grow our market opportunity if we are ultimately able to get into the home, where there are 30 million chronic care patients in the U.S. and 150 million chronic care patients worldwide. Every single health system in the world is looking at how to manage patients outside of high-cost environments. Whether it's, A, I wanna deliver care to a rural area 'cause they don't have access to care, or B, I'm managing a high-cost patient and I wanna keep them in a lower-cost environment, every health system around the world is trying to do that. And there are many companies with remote patient monitoring and other ways of helping manage these patients in the home.
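
For concreteness, here is the back-of-the-envelope arithmetic behind the market figures quoted above, using only the numbers from the talk and the rough $2,000 device price; it counts devices only and ignores the software subscriptions:

```python
# Back-of-the-envelope market sizing from the figures quoted in the talk.
price = 2_000  # rough price per device, USD

segments = {
    "US doctors": 1_000_000,
    "Worldwide doctors (low)": 7_000_000,
    "Worldwide doctors (high)": 9_000_000,
    "US nurses": 4_000_000,
    "Worldwide nurses": 27_000_000,
    "US chronic care patients": 30_000_000,
    "Worldwide chronic care patients": 150_000_000,
}

for name, people in segments.items():
    print(f"{name}: ${people * price / 1e9:g}B")
# US doctors alone work out to $2B; worldwide chronic care patients, $300B.
```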

We believe we have an opportunity there. So, the product that we have is not just a probe. We have our probe, but we have a lot of software, with a lot of capabilities, because our chip gives us access to a lot of data. We are architected in a very modern way. Today we have over 145,000 devices around the world, and all of those devices are connected into our cloud. And so the imaging data that we get on a daily basis adds up to ultimately the largest ultrasound repository in the world. And from that repository comes a lot of learning. First of all, our customers have access to their data, and we integrate into EMRs, and we push data into DICOM for them.

We also have a large repository on which to run very sophisticated AI algorithms to identify trends and commonalities, which then informs the development of very sophisticated AI tools. We also have a middleware, called Compass, that helps manage proficiency. On average, from the data that we see, a hospital is only capturing 35% of its scans into the EMR for reimbursement, because they don't have an effective middleware. We converted 123 hospitals onto our middleware last year, and we're seeing their reimbursement capture increase significantly. Now, we launched our iQ, our first generation, in 2018. We launched our second generation, the iQ+, in 2020, and today it is the workhorse.
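
Going back to the Compass point for a moment: the 35% figure is a documentation-capture ratio. A minimal sketch of the metric, with hypothetical numbers and field names rather than Compass's actual data model:

```python
# Illustrative documentation-capture metric; numbers are hypothetical.

def capture_rate(scans_performed: int, scans_in_emr: int) -> float:
    """Share of performed scans that reach the EMR and can be billed."""
    return scans_in_emr / scans_performed

before = capture_rate(scans_performed=10_000, scans_in_emr=3_500)
after = capture_rate(scans_performed=10_000, scans_in_emr=8_000)

print(f"Before middleware: {before:.0%}")  # 35%, the average quoted above
print(f"After middleware:  {after:.0%}")   # hypothetical improvement
# Every uncaptured scan is care delivered but never billed, which is why
# routing images and worksheets into the EMR matters financially.
```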

You know, again, we have sold more devices, on a unit basis, in the point-of-care handheld space than any other company in the world. And that includes all the bigs. And we've done it by creating a direct channel to our customers as well as building a global infrastructure. But if there was a rub, if there was a barrier for us to get into more hospitals, it was image quality. Hey, you have that semiconductor. Hey, it's great that you can have all these different probes in one probe. But when I wanna get to that cardiac image or when I wanna get to that advanced OB image, the hospitals or those in the hospital other than the emergency room would push back.

So we built, again, a very large market and a very strong opportunity directly with doctors in all different types of places around the world. But one of the things that we benefit from is the progress in semiconductor computing. Those devices that exist today, the crystals, the transducers, are very analog. And we believe that, image processing aside, from an image capture standpoint that technology has kinda reached its ceiling. There's only so much power you can put through a crystal before the user's hand gets burned, because heat is the artifact that comes from it. So just a few weeks ago, we launched our third generation technology, called iQ3, which is shown here on the screen. iQ3 is smaller. It has more than double the battery life.

Our past processor ran at 4.8 gigahertz. Our new processor runs at 9.6 gigahertz, which is equivalent to 20,000 4K movies running concurrently. That's how much processing power and speed we have, which now allows us to get into an entirely different generation. In my view, the first phase of Butterfly was showing that we can do it, that our chip and our MEMS technology can be equivalent to any handheld made by any company out there. With this third generation, within a degree of freedom, whether we're a little bit better or a little bit less, we are right there with every other handheld on the market. The next phase is how we can take an image smarter and how we can do things differently.

And so first of all, on image quality, you can see the slides that are identified as iQ+; that is our second generation. And you can see the step function in image quality of iQ3. Now, we did a blinded study with images from our iQ+, our iQ3, and GE's Vscan, GE being the biggest ultrasound company in the world. On a blinded basis, 475 doctors looked at the images, and by far, the majority selected the iQ3. So for the first time, a digital capture device is at minimum equivalent. Imaging is all in the eyes of the beholder, but in many people's eyes, on a blinded basis, they selected iQ3. So we think we've achieved a major accomplishment. And here, in pulmonary scanning, on the left, you see GE.

You see our second generation iQ+ in the middle, and then you see iQ3. So we think it's go time. We think we've accomplished the goal of having not only an all-in-one device, but an all-in-one device that is now as good as any other device out there. And, by the way, it's less expensive than those other devices. Our competitors charge between $5,000 and $7,000 for each of their devices. Times 4, that's $20,000 to $28,000, which is what you pay for a cart. Of course, that's the model. So if you replace a cart with their handhelds, it's really the same cost to the hospital. Whereas our iQ3 is $3,899. So, less than $4,000 for one digital probe that does everything that the handhelds do.

So now that we've taken image quality as a competitive knock against our company out of the equation, there's no reason for any hospital to wanna purchase a suite of handhelds when you can have one cloud-based digital device with advanced digital tools and processing. So we are really committed to developing this market. In doing so, we are very committed to education, because education is literally the biggest barrier to the growth rate of this market. The more people who learn, the bigger that market gets. Interestingly, 60% of medical schools in the United States have Butterflies. And I think over 70% have curriculums specifically to develop point-of-care ultrasound. So let's think about that for a moment.

If over the last several years, 60%-70% of medical students are being trained in ultrasound just like they were trained to use a stethoscope, do you think it's inevitable that doctors are gonna be using diagnostic ultrasound in their practice? I think so. 42% of residency programs have them today. We're right now in medical school season, and we're excited to be talking to schools, 'cause ultimately, while we are in that many schools, not as many have a one-to-one model. They might have a few devices that they use to teach in a course. But if you truly want to learn ultrasound, the student has to have their own. They have to walk around with it. They have to be able to scan when they want to and practice and practice and practice.

Among the training tools that we've developed over time is our Butterfly Academy. That's our University of Phoenix: you wanna learn how to do ultrasound, you can go online, learn the anatomy and physiology, learn how to read scans and where to position the probe, take tests, and self-learn. On the far right is our new Butterfly Certified program. Butterfly certification is an in-person course, 6-12 weeks depending upon the number of scans you learn, that is near the level of getting a CME without the CME. But we provide that; we have a partner, and we have our methodologies. We're very committed to the development of the education. Then we've recently launched an AI tool called ScanLab. This is ScanLab.

ScanLab is a tool where you first choose which scan you would like to do, whether it's a cardiac scan, a pulmonary scan, or an abdominal scan. Once you choose your scan, it will tell you where to position the probe. It'll then show you what your image should look like at that probe position. And then when you plug your probe in, it'll show you, of course, the image based on that position. But then there are AI tools that tell you what you're looking at. It'll put a label on the mitral valve. It'll put a label on the left ventricle. It'll put a label on the septum.

Now that white abyss of cloudy ultrasound starts to make sense to you. By having your device with you and having ScanLab as an AI tool, students can learn, practice, and develop their skills much, much more easily. In the development of the market, AI tools are very, very important, not only for education but also to automate the capture of an image. We now have an AI tool that can tell you how wet a lung is. If someone has pneumonia, or it's a congestive heart failure patient getting more and more fluid in their lungs, our new AI tool on the left, called Auto B-line Counter, can do that assessment.
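
As a rough sketch of what tools like these do conceptually, here is the shape of such a pipeline: frames in, labels or a count out. The trivial placeholder logic stands in for a real trained model; nothing here is Butterfly's actual algorithm:

```python
# Conceptual sketch of an ultrasound AI assist: frames in, findings out.
# Placeholder logic stands in for a real trained model.
from dataclasses import dataclass

@dataclass
class Finding:
    label: str         # e.g. "mitral valve", "left ventricle"
    confidence: float  # model score in [0, 1]

def label_cardiac_frame(frame: dict) -> list[Finding]:
    # A real tool runs a detection/segmentation network here and overlays
    # labels like "left ventricle" or "septum" on the live image.
    return [Finding("left ventricle", 0.97), Finding("mitral valve", 0.93)]

def count_b_lines(frames: list[dict]) -> int:
    # A real B-line counter detects vertical comet-tail artifacts in the
    # lung view and aggregates them into a "wetness" assessment.
    return sum(1 for f in frames if f.get("vertical_artifact"))

frames = [{"vertical_artifact": True}, {}, {"vertical_artifact": True}]
print(label_cardiac_frame(frames[0]))    # labeled anatomy for learners
print(count_b_lines(frames), "B-lines")  # 2
```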

We also have an AI tool that can look at the bladder and tell you how full it is and what its reading is. But, you know, regardless of how Butterfly is leading this market, there are hundreds of companies now developing AI tools for ultrasound to make diagnosis easier, whether it's for third world countries or simply to make things easier generally. And we are now signing up partners who are paying us to have access to that install base, paying us to be able to plug Butterfly into their application. And then we revenue share when they sell their product into our customers. The first of these are gonna launch in April or May of this year.

And so with the first partners, we're now gonna monetize it. The way it works is a partner, say ThinkSono, for example, will put an app in the App Store. Then they'll sell that app to one of our customers. Our customers will download the app, open it, and plug their Butterfly into that app. And then the unique capabilities of their AI algorithm run on the input from Butterfly. So by creating this ecosystem, we become more valuable to our customers, because we're creating more and more solutions for them. And we're also valuable to the partners, 'cause we're giving them a distribution channel where their customers don't have to go buy a new ultrasound in order to use their application.
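
A minimal sketch of that flow, with invented class and method names (this is not Butterfly's actual API):

```python
# Hypothetical sketch of the Butterfly Garden flow described above.
# Class and method names are invented for illustration.

class ButterflyProbeStream:
    """Stands in for the image feed a plugged-in probe gives an app."""
    def frames(self):
        yield from ({"frame_id": i} for i in range(3))

class PartnerAIApp:
    """A third-party app (e.g. a DVT-assessment tool) with its own model."""
    def __init__(self, name: str):
        self.name = name

    def run(self, stream: ButterflyProbeStream) -> None:
        # The partner's unique algorithm runs on Butterfly's image input.
        for frame in stream.frames():
            print(f"{self.name} analyzing frame {frame['frame_id']}")

# Customer downloads the partner's app, plugs the probe in, and the app
# consumes the probe's feed; Butterfly revenue-shares on the app sale.
PartnerAIApp("partner DVT app").run(ButterflyProbeStream())
```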

And then we have also monetized our chip with two developments, with Forest and Mendaera, who are actively using our hardware in other markets, and it's very, very exciting. But probably the one thing to leave you with is that what will absolutely change the future is how images are captured. Standard ultrasound is like a flashlight. The image comes straight out of the handle, out of a fixed lens, and as I move the lens and move the flashlight, I move the beam. That's how ultrasound is captured. Now, for less trained users, we actually have a new ability. Remember those 9,000 sensors on the chip? I can control all of them. So for the first time in history, we are actually moving the beam without moving the device. We can now place the ultrasound device over a kidney, and a nurse can press a button.

The device, like Star Trek's tricorder, just goes and scans. And not only does it scan, but if you look at the right – this is a slice – it'll deliver 46 different images at the bottom. The doctor just scans through to identify which image they want, and then it does the measurements on that image. It's very similar to taking an MRI: if you take an MRI, the doctor is delivered a package of thousands of images. This is like doing a quick snapshot. So this is the next generation of ultrasound. We've already passed them now 'cause we're equivalent on imaging. Now we're changing the game using our digital tools on image capture. In 1995, the first digital camera was launched by Kodak. It failed because the image quality wasn't good enough, and they jettisoned the program.
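
The principle behind moving the beam without moving the device is classic phased-array steering: give each element a slightly different time delay and the combined wavefront tilts. A minimal sketch with an idealized 1-D geometry and illustrative numbers, not Butterfly's actual design:

```python
# Phased-array beam steering: per-element delays tilt the wavefront, so the
# beam sweeps electronically while the probe sits still over the organ.
# Idealized 1-D geometry with illustrative numbers.
import math

C = 1540.0      # speed of sound in soft tissue, m/s
PITCH = 200e-6  # element spacing, m (illustrative)
N = 128         # one row; a 2-D array has thousands of elements

def steering_delays(angle_deg: float) -> list[float]:
    """Per-element transmit delays (s) that tilt the beam by angle_deg."""
    theta = math.radians(angle_deg)
    delays = [i * PITCH * math.sin(theta) / C for i in range(N)]
    shift = min(delays)  # keep all delays non-negative
    return [d - shift for d in delays]

# Sweep across 46 angles to capture a stack of slices, like the 46 images
# described in the talk, with no hand motion at all.
slices = {a: steering_delays(a) for a in range(-45, 46, 2)}
print(len(slices), "slices captured without moving the probe")  # 46
```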

Toshiba, Canon, and Fuji all developed digital cameras. They were committed to that space. What allowed them to get better? Semiconductor technology. They went from 1 megapixel per image to 2, 3, 5, 7. Today, I think it's 45. In 2003, when image quality was recognized as equivalent, digital cameras outsold the analog ones out there. That is gonna happen now. Butterfly is in the process of completely revolutionizing ultrasound and handheld ultrasound. It'll look just like this does for an MRI. We are now able to scan this. But it doesn't just stop there. It gets to a point where – what camera do you guys use today? You use your iPhone. Well, why?

Because the semiconductors have not only gotten your image quality good enough, they've miniaturized the sensors so they can be embedded into a common device. That's what Butterfly will do next. We will create a device that is no bigger than this. It'll go on the patient at home, and a doctor from the hospital can come in and scan the body and check: is the tumor growing inside the kidney? Is the fluid getting worse? What's the cardiac function? This is not just a handheld ultrasound company. We will completely, unequivocally revolutionize imaging and ultrasound. Thank you for your time. So I don't know, do we have any time for questions? The clock went to 0, so I stopped. Is that it?

Joshua Thomas Jennings
Managing Director and Senior Analyst, TD Cowen

All right. Maybe, Joe, just to highlight the Investor Day coming up in New York City – just get that out there and remind people of the agenda.

Joe DeVivo
Chairman, President, and CEO, Butterfly Network

So on March 18th, at the New York Stock Exchange – Monday, March 18th – we're having an Investor Day. There we will unpack the technology more. We'll have workstations. We'll talk a little bit more clearly about our roadmap over the next several years. And we'll unpack our business and our technology and be able to really let you touch and feel all the things that we're working on. So thank you so much, Josh. We really appreciate it.
