Micron Technology, Inc. (MU)
Analyst Day 2017
Feb 2, 2017
Alright. Welcome, everybody. Hopefully, you enjoyed that brief video. As Ivan said, I think we've got a great program for you today, a lot of good information to cover.
So I'm going to move right along. But before I jump into specifically what's going on at Micron, I'd like to just back up a little bit and take a look at what's going on in the economy: why is data so critical to the new data economy, and why is what Micron's doing really relevant? Why do we find ourselves in this really pretty miraculous market environment where demand is growing for all our products and all our technologies across an increasing set of end-market applications? When you think about the new data economy, it's really astounding to me, the types of things we do today with the data that we have that we didn't do just a few years ago.
Think about autonomous cars. We're not all driving autonomous cars yet, but many of us, particularly in this room, I'm sure, are driving cars with ADAS systems and collision avoidance, and then, coming pretty quickly, we're going to have autonomous cars. I know when I buy a car for my teenage daughter sometime in the next year, it's gonna have all the collision avoidance stuff I can figure out how to get on there.
But these types of applications drive real value. They save lives. They make a big difference in people's lives. And we're at the beginning of that. You think about autonomous vehicles in 2020: those things will be generating an enormous amount of data.
And Jeff Bader, when he comes up later and talks a little bit about our embedded business unit, will focus particularly on the automotive segment and tell you some of the things that are going on there. You think about the beginning of cognitive computing and artificial intelligence. IBM's Watson has now downloaded into the system practically all the medical literature in history, and is now ingesting real-time patient data from patient monitoring systems. And it's now been proven that they can predict diabetic episodes in advance of those things actually happening. So real life-saving applications are starting to happen in the realm of artificial intelligence.
And we're at the beginning of that cycle. And then you think about scientific research and what's going on in discovering how the universe actually works: discovering the Higgs boson and all the things that are going on in international labs and supercolliders and things of that nature. Just immense amounts of data being generated. And all of that's got to get sorted and processed in real time so that you can identify the one particle that you actually care about and what's going on with its interactions with the particles around it. Just an explosion in our understanding of the universe, all driven by the new data economy. And those are, again, just the tip of the iceberg. So all of these things really are driving 3 major trends that I think impact Micron's business and that we should be cognizant and aware of. First, of course, is just the data proliferation.
Just the enormous growth in the amount of data and in the locations of that data. It's not just all in your PC anymore or all in your smartphone; it's in the data centers in the cloud, and it's all the way out to the extreme edge of networks all over the world. Data is growing exponentially. By 2020, they talk about some number of quintillion bits. I don't even know what that is.
But the way I think about it is that in a couple of years, in 2020, in about 3 years, it'll be over 5000 gigabytes per person of new data created every year. It's just a phenomenal number, and it's distributed all over the world now. It's not just in the United States and the first world; already today, in 2017, over half of that data is being generated in emerging economies. So the geographic distribution of where this data is being generated, where it needs to be processed, and the people that are benefiting from it is really starting to change in exceptional ways.
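As a rough back-of-envelope check on that figure (the population number here is my own assumption, not from the talk), 5000 gigabytes per person per year implies a staggering global total:

```python
gb_per_person_per_year = 5000   # figure quoted in the talk, circa 2020
world_population = 7.7e9        # assumed ~2020 world population (not from the talk)

total_gb = gb_per_person_per_year * world_population
total_zb = total_gb / 1e12      # 1 zettabyte = 1e12 gigabytes (decimal units)

print(f"Total new data per year: {total_zb:.1f} zettabytes")
```

That lands in the tens of zettabytes per year, which is the scale the industry forecasts of that era were talking about.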
You think about your smartphone and the high-density videos that you can stream to it now, etcetera. That is becoming less and less important; what's becoming more and more important is all the data getting generated out in the Internet of Things and all these sensor networks at the extreme edge of the network starting to talk to each other and use that data in new ways. That is just incredible.
And it all has to be analyzed in real time. Not all of it, but a lot of it, in order to be useful. A lot of this data now has to get structured in some way so that you can operate on it, and it's got to be used in a lot of applications relatively quickly. Take, for instance, Micron fabs. We now have thousands of pieces of equipment per fab and thousands of data streams that come with the equipment we buy from the equipment suppliers, generating data every few milliseconds. And then we add our own sensors into that equipment and generate additional data. And we try to look at all of that in real time and understand it in real time, not when the wafer gets to the next process step or when we get to a metrology step downstream. We're trying to monitor all those things in real time so we can drive higher yields and higher quality. It all means we've got to get that data into some sort of real-time cognitive computing system in order to get the value out of it.
Just a phenomenal change in the way we think about our ability to recognize signals. We can now see signals from our tools and signals from our defect measurements, etcetera, that we could never see before. Think again about the autonomous driving application or your ADAS application. You gotta get data from your visual sensors, you gotta get data from your radar, you gotta get data from other sensor systems within the car, and then figure out how you're going to react, what's going to happen with the brakes, etcetera. And that's all going to happen in real time.
So real-time analytics is becoming more and more important, and that drives certain things in our business that we'll talk about as we go through the day. And finally, all of this is only important because it drives economic advantage. When you think about the really valuable companies out there in our economy today, you think of the Googles and the Facebooks, and what are they? They're repositories for your data. And your data is valuable. But beyond that, the data is valuable in all these cognitive computing applications and in all these artificial intelligence applications. So if we can save people's lives with cognitive computing, or we can add cognitive computing or artificial intelligence to all our business analytics...
And by the way, by 2020, I believe the stats are that over half of all business analytics will have some sort of cognitive computing functionality. Then you can drive real value in the economy. And then that just drives the insatiable demand that we're seeing in all these end-market applications. So Micron's technologies are critical to all of this. We absolutely are the engine underneath this new economy, driving it in many, many different ways.
First of all, data availability. Last year, Google said that memory latency is now much more critical to them than their processor speed. When we go out and talk to industry consortiums now about new I/O standards for memory, whether it's with the OEMs or the data center or the enablers, there's a whole new openness and interest and acceptance of the fact that memory is a critical player at the table in terms of defining all these interfaces. Because at the end of the day, if people can't get the data, they can't get to the solution they need in order to drive the value in the end economy. And so, really, just huge benefits coming from data access, whether it's persistent memory, scratchpad memory, storage, or wherever else that memory resides and needs to be processed.
When we talk about storage, it's really all about near storage today. Storage that's out cold somewhere on a tape or a hard drive is much less useful than data that sits proximate to the central processing or proximate to whatever the end application is. So when we look at what's going on in the data center today, there's this rapid transition from hard drives to solid state drives, and that curve is really going exponential. A lot of it's driven by SSDs today, but that's just the very tip of the iceberg on what's going to happen relative to storage and how close it can get to the application and to the memory, as we start deploying new memory architectures for storage, as we start figuring out how to use NAND in a really optimized software environment, and as we start adding in some of these new technologies like 3D XPoint. So all of these end applications and the ability to use the data can be driven by new storage architectures. And finally, we've always talked about bandwidth and power and the memory bottleneck, and how that is really critical to the advancement of applications and computing. But it's never been truer than when we think about the high performance computing tasks we try to do today as they get more and more complicated, and we talk about the size of the datasets and the vast wealth of information we're trying to bring together to drive those end applications. It's never been more critical that we have the right solutions from a high-bandwidth perspective or from a data-access perspective.
And you'll hear more about that from Tom in CNBU when he starts talking about some of the things that are going on in the graphics market, as well as from all the other business units, because all these trends really overlay all our businesses.
So this was sort of the memory environment when I started at Micron in 1984. It was all PCs. Pretty simple. You had a processor. You had memory.
You had storage. It's completely different today. The end applications are everywhere, and they're all different, and the opportunity to really tune the solution to the application, and provide some of the collateral software and even processing that needs to come with it, has never been greater. Many of you may have seen, as you came in the room, the virtual reality application that our good friends, collaborators, and partners at NVIDIA were running out there, running on our GDDR5X graphics memory. Just one example of how memory is becoming critical to these end applications and driving a new world.
But it goes way beyond that. It goes to things like labs out in the marketplace to work with automotive and handset developers to make sure that the memory is custom-tuned to their environment and providing all the performance they want. It goes to things like a compute service center in Austin to take our SSDs and use those in concert with our customers to optimize software solutions. So really a complete sea change in the way our products are used and the opportunities that now accrue to Micron as a result of that change.
Excuse me.
Alright, let's bring it a little closer to home.
Demand for Micron's products: somebody was asking me earlier, do I remember a cycle where demand was as strong for both DRAM and NAND simultaneously as it is today? And I'm not sure I know the answer to that. I'd have to go back and take a look at the data to really fine-tune it, but it is certainly very, very robust.
And the growth is still exponential. We're still running over a 20% CAGR for DRAM and a 45% CAGR for NAND. So exponential growth, rapid growth, across a diversifying set of end-market segments, really providing a great opportunity for the company as we move forward. So that's just the short-term view. When you think a little further out, what's it going to look like in 2030?
All these new data economy applications are starting to proliferate. It's harder to know with any precision exactly where all these bits are gonna be. But what we do know is that a significant amount of the total will be in applications that don't even exist today. They'll be in applications where memory has been repartitioned and rearchitected in order to deliver against those imperatives of near storage, high bandwidth, and potentially even approximate computing. And so it can't get more exciting than it is in memory today, and the promise, I think, has really never been greater in terms of what we can do to drive value. The current environment is very benign, and in both DRAM on the left and NAND on the right of this graphic, supply is being outstripped by demand.
And so as we think about the future here, we've got great opportunities to continue to develop advanced products and advanced technologies in both the system main memory applications as well as NAND applications, as well as some of the emerging memories that Scott's going to come and talk to you about here in a minute. And in both large sections of the market today, we see a very virtuous supply-demand balance that is only made better by the fact that the end markets are differentiated and that the products are becoming stickier and more customized to the end application. Stepping back and taking the long view, this is the memory market in aggregate since 1990.
There are a couple
of things to note. First of all, when you look at the total market, it continues to grow. It has been volatile, but you see that the cycles are becoming less volatile, and in fact, I think it's safe to say that we're out of the 2016 bottom now, and that was the least volatile cycle in history. When you decompose it into the 2 big parts of the memory business today and look at DRAM, you see that DRAM really has historically driven most of the volatility, but that, again, is becoming less of an issue as these end markets have diversified in a significant way and the supply base has consolidated.
And then a lot of the growth has been driven by NAND. We certainly anticipate that as we continue to go forward, we'll see this trend continue. Now I know there are a lot of questions out there in the investment community about, well, how long does NAND look like this? And is there a wave of oversupply coming? I think it's tough to know.
None of us can know the future with any certainty, but we do know that there are these huge drivers, and we do know that demand is elastic: once you reach a certain price point, there is pent-up demand yet to come. And so what I would say is, while whenever you have technological disruption and change, things can get out of balance for a while, I think the situation with NAND is probably as it has been: the market will generally be a growth market, and to the extent there is some volatility, it'll probably be very short-duration volatility, because elasticity will kick in and take it back to an equilibrium point. So long term, it's a great market. When you think about memory over the next 3 to 5 years, it's probably going to grow at 2.5x the growth rate of the rest of the semiconductor industry, ex-memory.
So you think about semiconductors at maybe 4% per annum and memory at maybe 10% per annum. Memory is a pretty good place to be if you want to have exposure in semiconductors, and semiconductors, I think, are a pretty good place to be if you want to participate in the data economy. Okay. We had a pretty good year; I would say we had a very good year in 2016. We started the year with a lot of challenges.
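To make those growth rates concrete, here is a small illustrative sketch using the figures just mentioned: compounding 10% versus 4% over 5 years, plus the doubling times implied by the roughly 20% DRAM and 45% NAND bit-demand CAGRs from earlier. These are my back-of-envelope numbers, not figures from the slides.

```python
import math

def compound(rate: float, years: int) -> float:
    """Total growth multiple after compounding `rate` for `years` years."""
    return (1 + rate) ** years

def doubling_time(cagr: float) -> float:
    """Years for demand to double at a given compound annual growth rate."""
    return math.log(2) / math.log(1 + cagr)

# ~4% per annum for semis ex-memory vs ~10% for memory, per the talk
print(f"Semis ex-memory, 5 years: {compound(0.04, 5):.2f}x")
print(f"Memory, 5 years:          {compound(0.10, 5):.2f}x")

# Bit-demand CAGRs mentioned earlier: ~20% DRAM, ~45% NAND
print(f"DRAM bits double about every {doubling_time(0.20):.1f} years")
print(f"NAND bits double about every {doubling_time(0.45):.1f} years")
```

The striking part is the doubling times: at a 45% CAGR, NAND bit demand doubles roughly every 2 years, which is why supply keeping up is such a persistent question.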
We took a step back and identified the key priorities, as we always do, and then we went to work executing on them. And I think we can really say mission accomplished on all these operating priorities that I showed you last year. The 20-nanometer ramp for DRAM went very well. We're well on our way to 1x-nanometer manufacturing enablement.
Scott will talk more about all of these when he comes up in a minute. The 3D NAND 32-layer Gen 1 ramp went very well; we're very happy with our yields. We hit crossover in terms of 3D NAND bit output recently, in the fall of 2016. And we continue to make good progress on our 3D XPoint technology development and enablement with key customers.
And then in the background, we still did a lot of work, both internally and externally, on our controller capability and our firmware and software capability, and made good progress across a number of different business units in that arena, as well as with the Tidal acquisition and the great work those guys are doing for us for the future. We brought on some great new talent in advanced packaging development and manufacturing, and have a good plan there that we'll continue to work on as we move through 2017 and 2018. And really, on this emerging memory topic, we can't talk too much about it, but I can tell you I am so excited, not only about what's happening with 3D XPoint, but about some of the other new emerging memory technology that Scott and his team have been working on. I just think it's really transformational stuff. So 2017 is a new set of priorities.
But they're not all that different in terms of the focus. We're still focused here on making sure we've got all the wood behind the arrow in terms of driving new technology into the manufacturing space and getting the manufacturing efficiency out of it. This year, it'll be all about ramping the 1x node in DRAM and making sure that the 1y process is mature to come behind that. We'll continue ramping TLC on our 3D Gen 2 NAND technology. We are very excited about how strong our 3D NAND bit performance is and the characteristics of that, so we'll be stretching that on to Gen 3, but we'll also be working on QLC enablement so that we can take advantage of the density advantages that come with QLC for certain cloud providers moving into the future. And then we'll continue to work on technology development, and you'll hear more about all of these things from the rest of the team as they come up and talk over the next hour. And
Scott will be first. He'll talk about technology and product enablement. Then we'll have all 4 of the BUs. I can tell you, I have never been prouder of the Micron team, what they've accomplished in the last year, and the way we're positioned for the future. We've got the right technologies, we've got the right customers, we've got the right market environment, and hopefully you're gonna hear that story from them. And then at the end, I'll get to come back up and take some Q&A from you. But for now, I'm gonna turn it over to Scott, and he's gonna talk about technology.
Thank you. Okay. Good morning.
I'll follow up on a few of the different topics Mark mentioned. There's a lot to talk about from both last year and the upcoming year. So I'll start off today and go into a little more depth on our 2016 performance and how that sets us up for success in the upcoming year. Then I'm going to give you a little more detail on how we're making progress on DRAM, what our development model has evolved into, and how that's leading to success over both this year and some of the previous results that you're starting to see now. On the NAND technology side, I think we have a great story here.
Both in terms of how effective our 1st generation of 3D NAND technology is, as well as some results on our next generation that we're really proud of. And then I'll finish up with some detail on our view of emerging memory technology, both in terms of where that is today and also how it evolves in the future and how we're positioned to take advantage of that. Okay, so first on the 2016 topic. We have migrated to mature yields now on our 20-nanometer technology. Last year at this time, we had projected we would get about a 25% or greater cost reduction in moving from 25-nanometer technology to 20. We've executed to that now,
and we're pleased with where that transition has ended up. When we look at our yield execution, our fabs are well positioned now, with the ability to take matched technology and implement it consistently across sites. And as a result of the efforts on the DRAM model that I'm going to talk about in a second, and other things, our yield execution strength is looking very solid going forward. When we look at the progression from 25 to 20 and the early 1x results now, we're seeing progressively better results, both in terms of yield ramp capability, so the time to get to mature yield, and also the mature yield level. So it's getting better as we go through these nodes, and I'm feeling really good about the progress there.
And then I'll have quite a bit more detail for you on our 1x-nanometer technology, so I'll save that for a later slide. On the 3D NAND front, with our 32-tier technology, last year at this time we projected we'd get more than a 25% improvement in cost relative to our last planar node. When we worked through everything, we actually wound up with more than a 30% improvement between that last planar node and our first 32-tier technology node. So, as I mentioned, we're now at mature yields on our 32-tier technology. And a couple of other interesting points on that: one, for really the first time, I think, we're seeing matched yield on TLC and MLC.
So there's no yield difference between our TLC and MLC products. And the second thing is really on the performance front. Without getting too deep here, the interesting thing about our 3D NAND technology is that if you look at a comparable density of die for our 3D NAND technology versus our planar NAND technology, our performance on the TLC product is actually better on 3D than it was on planar for an MLC product. So from a performance point of view, that's pretty outstanding: to look at a TLC product on this generation of technology and say that it's actually better than what we used to do on a 2-bit-per-cell product.
So this 3D NAND technology is really a pretty amazing balance of cost improvement and performance improvement at the same time. And then I'll have some more information on our 64-tier, but just to start with, we're looking at a good cost reduction again as we transition to 64-tier technology, and we're making great progress on it. So next, shifting over to the DRAM development model. Coming out of the Elpida acquisition several years ago, our challenge really was: what's our DRAM development model going forward? And I think we've landed in a place now with a good balance of making use of our global talent and turning that into a strength.
We have development centers in the U.S. and in Japan, and those 2 development centers, below 20 nanometer now, are alternating development focus from node to node. The 1x-nanometer technology that we've now deployed into manufacturing is the first node that has come from that consistent development model. And it's not just the fact that we have 2 sites.
It's also that we have aligned management and an aligned set of capabilities and methodologies across these development centers, which leads to more rapid execution. When we look forward, our 1y-nanometer technology and our 1z-nanometer technology are now both resourced and moving forward. The 1y-nanometer technology is more near term; you can think of it as about where 1x was last year when we were here at the same time. So the 1y-nanometer technology is the focus.
It's the next node, and then 1z is farther out in the future, but we were able to start working on it earlier than we could historically. So we're getting an earlier start on these nodes. Altogether, we think this development model will provide a substantial reduction in our time between nodes, and we're seeing good early evidence of that now and growing confidence. So on the DRAM technology front, when we look at the results of the model I just talked about, from the time when we acquired Elpida, we have significantly improved our competitive position relative to 2013. We're in a much stronger position, with our 20-nanometer rolled out and our 1x in the situation it's in now.
And again, a much stronger DRAM position than when I was standing here a few years ago. We will have meaningful output on our 1x-nanometer node by the end of fiscal year 2017. On the bottom of this slide, you can see a yield picture of our 1x-nanometer technology. In addition to the pure cost and performance benefit, this technology node is critical to us for some of the higher density mobile products that Mike's going to be talking about, because it really enables some form factors and package capability. In addition to that, Tom's focus today is on graphics.
High density graphics and server memory really benefit from the transition to this node as well. And as I mentioned previously, we're able to focus on more nodes at once right now, so we have our 1y technology at the moment in development in Boise in the U.S. The picture at the bottom is kind of the way it looks at the moment. This technology will be moving into a manufacturing startup mode in the second half of calendar 2017. Okay. On the 3D NAND technology side, as I mentioned, we're at mature yields on our 32-tier node.
Competitively, an interesting thing, which isn't obvious, is that from a competitive position, that 32-tier node is actually a strong competitor with other suppliers' 48-tier nodes. That's because of a combination of factors: the architecture that I'll talk some more about, which we call CMOS under the array, a very innovative approach that makes our die density, or bits-per-wafer density, much higher than a comparable competitor die right now. That technology makes our bit density only slightly less than the 48-tier, a much smaller gap than you would think given the 32-to-48 number, while the process complexity is significantly lower on a 32-tier node relative to the higher stack, along with the associated costs. So our position on 32-tier at maturity is strong.
Our movement now is to 64-tier technology, and that ramp has gone extremely well over the past year. There's a yield map here; we're in a strong yield position on 64-tier right now, coming off of 32-tier, where our mature yield level on 32-tier was comparable to the mature yield levels that we had seen on planar nodes previously. So we feel we have a strong capability to yield this 3D NAND technology. 64-tier is on a good path, with meaningful output again by the end of fiscal year 2017.
And a good cost reduction trajectory. Our 64-tier technology again gives us a really nice cost benefit, a gigabit-per-wafer benefit, shown in the graph on the bottom left, versus 32-tier. And we're now, from a technology development point of view, focused on moving to the generation beyond that, which provides not quite the benefit that the 64-tier does, but still a good cost and density benefit versus the previous generation. As Mark mentioned, we're focused on QLC, 4-bit-per-cell technology, and aligning that to market requirement timing. We have good confidence that the combination of our CMOS under array and cell technology lines up very well with enabling 4-bit-per-cell technology. All these generations benefit from what we call CMOS-under-array technology, and I'll talk a little bit more about that on the next slide.
So this slide is all about our 64-tier technology. As I mentioned, it's yielding very well right now; we're ahead of what we had set and thought were very aggressive internal targets. For the deployment of this technology, it's out with customer sampling now and ramping to that meaningful output towards the end of fiscal 2017. This technology enables more than 25% more bits per wafer relative to any competitor 64-tier technology.
What we have is the world's smallest 256-gigabit die; it's 59 square millimeters. This accomplishment is enabled through what I've been referring to as CMOS under the array technology. Effectively, what that is is a set of process and design innovations that allow us to move the circuitry that drives the memory array predominantly all underneath the memory array, so there's effectively no use of extra silicon. Most memory technologies, DRAM, our competitors' NAND, and others, have the CMOS periphery driving into the array, and it uses up silicon space on the wafer.
This technology, shown on the right, has the memory array, which is the 64 tiers, with all the CMOS circuitry underneath it. So this is a pretty big accomplishment and something we're very proud of, and it enables us to be well positioned competitively for the years to come on 64-tier. Now moving over to new memory technology. As we've talked about before, Intel and Micron developed this technology together, and we're running it in manufacturing now on the 1st generation. There's been a lot of discussion over many years about new memory technologies coming in, whether from startups or competitors or different things.
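For context, here is a quick back-of-envelope on what that 256-gigabit, 59-square-millimeter die implies. The bit density follows directly from the quoted numbers; the wafer math is my own rough illustration and ignores edge loss, scribe lines, test die, and yield, so real counts would be lower.

```python
import math

die_capacity_gb = 256   # gigabits per die, quoted in the talk
die_area_mm2 = 59.0     # die size quoted in the talk

# Bit density implied by the quoted die
density = die_capacity_gb / die_area_mm2
print(f"Bit density: {density:.2f} Gb/mm^2")

# Rough upper bound on gross dies per 300 mm wafer (ignores edge loss,
# scribe lines, and partial dies, so real counts are lower)
wafer_area_mm2 = math.pi * (300 / 2) ** 2
gross_dies = wafer_area_mm2 // die_area_mm2
print(f"Gross dies per wafer (upper bound): {gross_dies:.0f}")

# Raw capacity upper bound per wafer, in terabytes
print(f"Raw capacity upper bound: {gross_dies * die_capacity_gb / 8 / 1024:.1f} TB")
```

Even as an upper bound, that's tens of terabytes of raw NAND per wafer, which is what makes the bits-per-wafer comparison with competitor 64-tier nodes so consequential.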
I want to give a little perspective on this, and on the next slide, about how difficult it actually is to bring in a new memory technology and establish it in the market. There are very few that actually happen. So this should be thought of from the perspective that this is actually the only commercially ready new memory technology in the market right now. And we are focused with our key customers on enabling this high-capacity, low-latency solution for both storage and memory applications, and there's a lot of interest in this.
We'll have some more to say about that later, I believe. From a technology development point of view, we're focused on the next 2 generations at this point, and those 2 generations are going to give substantial density and performance increases versus the 1st generation that we're running now. We have high confidence in that roadmap, and this is a really exciting technology for the future. From a Micron-specific point of view, our internal development efforts are focused on another space, which is high performance memory: more DRAM-like performance, with a cost scaling opportunity that sets us up for the future.
So I'm going to talk a little bit about that on the next slide. If you look at existing memory technology, one of the things that's sometimes missed is that the most significant challenge for any new memory technology is not trying to hit DRAM performance today; it's trying to find an intercept point for that new memory technology where it's actually meaningful when it gets there. And if you look at kind of where the commercially available technologies sit today, and then think about where they are going to be in 5 years or so, you see a couple of trends here. One is that for the memory technologies down in this bottom quadrant of the performance-versus-cost-per-bit graph, both NAND and 3D XPoint have very significant cost reduction paths over the next few years. So this is very much a moving target: trying to intercept something down in that space with a cost profile that actually makes it significant and enables the market at the same time. So the opportunity there for, say, prospective new kinds of technologies is challenging.
On the high performance side, with DRAM somewhat slowing in its scaling path, and so its cost reduction path, we believe there's an opportunity to come in with a new memory technology that has near-DRAM-like capability but fits a cost structure and performance space that's interesting for many of our customers. So from a technology development point of view, that's where much of our effort is. Okay. And then, closing today: we're feeling much better about our DRAM competitive position, and we feel like there's been some great execution across all of our operational pieces related to DRAM.
The manufacturing and technology development teams have really come together and put us in a strong competitive DRAM position. On 3D NAND, I think our yield position on 32-tier, and now our world-leading die size position in combination with some great demonstrated yields on 64-tier, are going to position us with great products and great technology for years to come. On new memory, I think we're thoughtfully positioning ourselves to take advantage of major transitions in the memory market over the next decade, with technologies that can really change the way memory impacts all kinds of systems. Mark mentioned this, and I know Ernie is going to talk about it later: packaging is a focus of some of our investment. Package technology is a critical enabler across the whole spectrum of memory technology, and it's becoming more and more important as form factors become a very key cog in enabling certain applications. So we brought in new leadership, and we've put a lot of our focus into making sure that we're leaders in the package technology area. And then finally: last year at this time, I told you all that this technology we were putting together was driving us to go look at starting a fab expansion for our R&D space. Today, I can tell you the R&D cleanroom expansion is nearing completion, and we're going to be occupying it in the first half of this year.
So thanks.
Thanks, Scott. So I'm going to spend the next few minutes talking about both the growth drivers across all of CNBU's segments and then doing a little bit more of a deep dive into graphics: both the segment itself, as well as how the technologies and capabilities that we're driving in that segment can help CNBU thrive a little bit more broadly. As a reminder, CNBU services the Micron portfolio into PC, what we call client; traditional server OEMs, what we call enterprise; cloud and hyperscale, which we put under the cloud moniker; networking; and graphics customers. The one portfolio exception is SSDs, and the high density NAND flash components that go into SSDs; those are serviced by Darren's storage business unit. And to be clear, CNBU revenue is substantially DRAM dominated.
Okay, so let's talk a little bit about some of those growth drivers across our portfolio. In the graphics space, probably the largest opportunity we see is the growth in augmented and virtual reality. IDC forecasts that between 2015 and 2020, revenue in this space is going to grow at more than a 150% annual CAGR and reach over $150 billion by the end of the decade. Now, a lot of that benefit is actually going to flow into Mike's business, and he'll talk about that.
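For a sense of what a 150% annual CAGR means in practice, here's a quick sanity check of the forecast math, sketched in Python. The $150 billion 2020 endpoint is the IDC figure quoted above; the implied 2015 base is back-derived here purely for illustration and is not a number from the talk.

```python
# Hedged illustration of the compounding behind a CAGR claim.
# 150%/yr and the ~$150B 2020 endpoint are the figures quoted in the talk;
# the 2015 base is back-derived, not a sourced number.

def project(base, cagr, years):
    """Value after `years` of compounding at rate `cagr` (1.5 means 150%)."""
    return base * (1 + cagr) ** years

target_2020 = 150e9
implied_base_2015 = target_2020 / (1 + 1.5) ** 5   # work backwards 5 years
print(f"implied 2015 base: ${implied_base_2015 / 1e9:.2f}B")   # ~$1.54B

# Forward again as a check: the base recompounds to the $150B target.
print(f"2020 projection:   ${project(implied_base_2015, 1.5, 5) / 1e9:.0f}B")
```

The point of the back-derivation is just that a 150% CAGR implies roughly a 100x expansion over five years, which is why a small 2015 base can reach $150 billion by 2020.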
But a very significant portion of it also benefits PC and console graphics, which will provide the most immersive VR experience. Switching to the cloud: we're at a little bit of a tipping point. The software providers that are walking into Micron's CIO's office, and his peers' offices, are developing their solutions for the cloud first. And so we're seeing a very significant spike in growth coming from both public and hybrid cloud deployments out of the Fortune 500, on top of the startup deployments that really drove the original growth.
When you add to this the ongoing growth in infrastructure to support the core search, e-commerce, and social networking, you put all that together and it makes cloud our fastest growing business segment opportunity going forward. In enterprise, a big driver is in-memory databases. Mark talked about the need for real-time analytics on a host of business problems, and if you look at applications like SAP HANA, they have a nearly insatiable appetite for the highest possible density memory configuration that they can get. We're also seeing very rapidly increasing interest in what is the very first form of persistent memory.
It's actually something called the nonvolatile DIMM: a hybrid DRAM and NAND solution on a DIMM that ensures that data is never lost in the case of power loss. To use one example, Microsoft is showing about a 100% performance improvement in certain database applications when they deploy NVDIMMs. Networking is really all about video, and in particular peak video-on-demand watching by eyeballs around the world. Between 2015 and 2020, that's going to drive between a 4 and 5x growth in peak Internet bandwidth requirements. As well, more towards the end of the decade as 5G rolls out, it's going to further enable the many, many billions of connected devices that are, and will continue to be, the Internet of Things.
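The nonvolatile DIMM behavior described a moment ago can be sketched conceptually. This is a toy model only: real NVDIMM-N hardware does the DRAM-to-NAND save with an onboard controller and backup power, not software, and the class and method names here are invented for illustration.

```python
# Toy model of the NVDIMM idea: run at DRAM speed, but flush the DRAM image
# to on-DIMM NAND on power failure and restore it at power-on.
class ToyNVDIMM:
    def __init__(self):
        self.dram = {}   # volatile working memory (fast path)
        self.nand = {}   # persistent backing store

    def write(self, addr, value):
        self.dram[addr] = value          # normal writes touch DRAM only

    def read(self, addr):
        return self.dram[addr]

    def power_fail(self):
        self.nand = dict(self.dram)      # save the DRAM image to NAND
        self.dram = {}                   # DRAM contents are gone

    def power_on(self):
        self.dram = dict(self.nand)      # restore the image into DRAM

d = ToyNVDIMM()
d.write(0x10, "database log record")
d.power_fail()
d.power_on()
print(d.read(0x10))   # the record survived the power loss
```

The database speedup claim comes from treating the DIMM as always-persistent: the application can skip slower durability paths because the hardware guarantees the save-and-restore cycle above.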
So let's talk a little bit more quantitatively. If you look at CNBU's three largest market segments, those would be client, enterprise, and cloud. In aggregate, they are growing at roughly the same pace as the overall DRAM market, in that plus or minus 20 percent per year DRAM bit CAGR. However, within those segments, the growth rates vary tremendously. In client, while there is ongoing modest growth in DRAM content per box, all that does is slightly offset the declining unit market we see there, and that makes client our smallest growth opportunity going forward. Enterprise is much healthier, and again overwhelmingly driven by content growth per box, on top of the modest growth in server unit shipments that we see in that space. And when that density is combined with the need for the highest performance, what you're seeing is quite a bit of interest in TSV-stacked DDR solutions, one of the advanced packaging activities that Scott referenced earlier, and we're going to be ramping those products later this year. The networking space actually starts from a much smaller base.
However, it has healthy bit growth, and there's a much richer mix of premium, either high bandwidth and/or low latency, solutions. These are things like RLDRAM, HMC, and GDDR-class memories that make it a sticky segment, and stickiness makes it that much more attractive. And graphics: it's not the overall fastest growing, it's number two in terms of bit growth, but I am going to focus on that from a deep dive perspective, because we see it becoming increasingly important to the broader CNBU business, we think we have a good position in the marketplace, and we think the technologies and capabilities we're developing there can be more broadly applied, beneficially, to our entire product portfolio and across all of our segments. So I talked already about how VR and AR are a major driver behind PC and console graphics, but there's another significant contributor, which is the growth of gaming in general and esports in particular. Per Activate, by the end of the decade there will be almost 500 million esports fans.
And there will be more people watching major esports championships than any other North American professional championship, with the exception of the Super Bowl. And game consoles, while perhaps not quite as high quality or immersive an experience as the VR/AR space, are still a very high quality experience, and in a number of cases can bring that capability at a slightly lower cost point and bring it out to a broader marketplace. The other big factor in consoles is the refresh rate: driven by much faster upgrades in terms of content, in terms of display technology, and in terms of AR and VR capability, that refresh rate has sped up dramatically. They're doing refreshes every two or three years, as opposed to what used to be five to seven. And so you add all that up, and from a GDDR-class memory perspective, we see a 4x growth in bit demand between 2015 and 2020, with bandwidth improvement between 2.5 and 3x. So this looks at performance from a bandwidth perspective, comparing traditional mainstream compute DRAMs, DDR3, DDR4, and going forward DDR5, with graphics-specific technology. The first graphics memory that was very broadly deployed was something called GDDR5. When that was introduced, it was at 4 gigabits per second per pin, about a 2.5x bandwidth benefit over what at the time was the comparable alternative, which was DDR3.
The clock has kept speeding up on that; it's gotten to 8 gigabits per second per pin, so that's now about a 4 to 5x benefit versus the comparable mainstream DDR4 solution. Last year, working with NVIDIA, we introduced a new generation of technology called GDDR5X, initially at a 10 gigabit per second per pin frequency, about a 25% performance improvement over GDDR5.
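The per-pin data rates just quoted line up as follows. This is a small sketch using only the talk's own figures; the DDR3/DDR4 multipliers depend on which mainstream part you assume, so only the GDDR-to-GDDR ratios are computed here.

```python
# Per-pin data rates quoted in the talk, in Gbps per pin.
rates_gbps = {
    "GDDR5 at introduction": 4,
    "GDDR5 today": 8,
    "GDDR5X at introduction": 10,
}

base = rates_gbps["GDDR5 today"]
for name, r in rates_gbps.items():
    print(f"{name}: {r} Gbps/pin ({r / base:.2f}x of GDDR5 today)")

# GDDR5X at 10 vs GDDR5 at 8 is the "about 25%" improvement mentioned:
print(f"GDDR5X uplift over GDDR5: {rates_gbps['GDDR5X at introduction'] / base - 1:.0%}")
```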
For those that are interested, I'd encourage you to go outside. We've got a demonstration of GDDR5X technology applied to a dual 1080 NVIDIA system; it's pretty amazing what that can do in terms of gaming and VR capability.
So as we go forward, we're going to keep turning the clock up on GDDR5X, and later this year or early next, we'll introduce an extension to that technology called GDDR6. Over the balance of the decade, that's going to ramp up to 16 gigabits per second per pin, about 2x where GDDR5 is now. And we expect that's going to be 8 to 10 times the bandwidth of the comparable mainstream device, which will be in the transition between DDR4 and DDR5 at that time. Now, all this performance, of course, is going to get applied to the benefit of PC and console graphics opportunities. But we can also take this dramatically enhanced performance, and leverage the economies of scale coming out of the enormous volumes that graphics drives, to move this technology into other segments as well. So just taking a look at a couple of examples: networking
applications
tend to be relatively more in need of bandwidth than they are of raw memory density. And so by moving solutions from DDR4 over to a graphics-specific technology, you can lower the bill of materials without impacting the performance of the system. When you look at some of the highest volume switching and routing solutions today, they use a networking-tailored form of GDDR5 that we call GDDR5N, which applies the environmental and longevity requirements of the networking space to that GDDR5 technology. Going forward, we anticipate doing the same thing with GDDR6, and we see future very high volume opportunities in switching and routing there as well.
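The bill-of-materials argument can be made concrete with a back-of-the-envelope device count. All numbers below are hypothetical placeholders, not figures from the talk: the point is only that a higher bandwidth-per-device memory needs fewer packages to hit a fixed aggregate bandwidth.

```python
import math

def devices_needed(target_gbps, gbps_per_pin, pins_per_device):
    """How many memory packages are needed to reach a target aggregate bandwidth."""
    per_device = gbps_per_pin * pins_per_device
    return math.ceil(target_gbps / per_device)

target = 2048  # hypothetical buffer bandwidth for a switch line card, in Gbps

ddr4_parts = devices_needed(target, 3.2, 16)   # e.g. DDR4-3200, x16 part (assumed)
gddr5_parts = devices_needed(target, 8, 32)    # GDDR5 at 8 Gbps/pin, x32 part

print(ddr4_parts, gddr5_parts)   # 40 vs 8: far fewer packages on the board
```

Fewer packages means less board area, routing, and cost for the same bandwidth, which is the trade-off being described for bandwidth-bound, density-light networking designs.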
High performance computing, by the way, is increasingly being delivered not only through the traditional enterprise channel but also as a cloud-based service. One of the very fast growing opportunities there is machine learning training, which is very often a GPU-based solution, and a lot of those solutions use GDDR5. We can leverage our relationships with our graphics chipset partners, as well as our presence in the high performance computing marketplace, to take advantage of these opportunities. And going forward, we see growing opportunities using GDDR5X and/or GDDR6 as well.
So we look at the overall strengths that we bring across a range of areas. I'm going to start with development expertise, right? We've had long-standing development capability in this area that we have kept laser focused on graphics technology for over a decade. They were the team that developed, enabled, and ramped the GDDR5X technology last year, and they're working hard on turning the clock up on that, as well as on bringing out the next generation GDDR6 technology that will be coming later this year or early next. Ecosystem partners: by working with the SoC partners and the OEMs in this space, we're able to both co-define technologies like GDDR5X and GDDR6 as well as enable them in the marketplace, which allows for a very rapid ramp. Now, those ecosystem partners are really important in terms of the systems environment, but we also replicate quite a bit of that internally as well.
We also leverage internal test capability. It is not easy to cost-effectively test a device that's running at 10 gigabits per second per pin and higher, and so we get quite a bit of benefit there as well. But last, but certainly not least, is the leading edge technology. The GDDR5X product we have on the market is the highest bandwidth single-die memory that's available today. We're going to be applying some of the 1x nanometer technology that Scott talked about to continue to advance the performance and the cost effectiveness of that going forward.
And as I said, we'll be bringing out the GDDR6 technology later this year or early next. So overall, we're really pleased about the strength of our position in graphics technology and capability as we apply it to the segment, as we increasingly apply it to additional segments in our portfolio, and, perhaps most importantly, as an indicator of how we can take the lessons and the capabilities that we're developing with this technology to the benefit of the broader portfolio, from both a product and a segment perspective. And so with that, thank you, and I'd like to turn things over to Jeff Bader, who's going to spend a few minutes talking about our embedded business.
Thanks, Tom. So first, I'd like to explain a little bit about what the embedded business is and talk about the different segments that we cover. First on that list, it's got to be automotive. And while self-driving cars, or autonomous vehicles, certainly get all the headlines,
there are a number of other trends going on in automotive right now that are really important in driving electronics. First among those is connectivity. Today, connectivity is literally becoming a purchase criterion for the car: not just connectivity, but connectivity and the services, data access, and application access that it enables, right? It's part of the purchase criteria today for a car. It's no longer horsepower; it's "will it work with my phone?"
"And do I get LTE? And when will I get 5G?", right? The next trend, obviously, as Mark talked about, is the implementation of these advanced driver assistance systems, so-called ADAS, and the eventual deployment in that same space of autonomous vehicles, right? And I'll talk in some detail in the later slides about that. And then finally, for a few years now we've been talking about so-called cluster and infotainment fusion,
meaning the merging of those two systems. You're seeing that deployed today, and you'll see it increasingly being deployed, which is essentially a configurable cockpit: your ability to move content across multiple display surfaces within the car is going to continue to increase. And all of those are going to drive significant growth, which I'll cover on the next slide. In the industrial space, it's really the deployment of the so-called Internet of Things. Mark talked about this in the context of the data economy, but you think about the efficiency gains he talked about in our own fab network as we've deployed connectivity and sensor networks: the payoff on that investment is huge, right?
We also see, increasingly, the movement of that analytics and that intelligence out to the edges of the network, and we think that trend will continue going forward in the IoT space. And with that connectivity comes the risk on cybersecurity. We've seen significant, major hacks recently that leverage IoT networks, in particular cameras in the most recent one. And so we're seeing increasing interest from our customer base on how we can address that and what role memory plays in that.
In the consumer and connected home space, the 4K deployment in both video display and video delivery is really the main trend that we see. And what's happening is that that 4K deployment, which on its own drives growth in DRAM, is typically being bundled with other advanced features. So smart TV functionality and high dynamic range content are driving increased memory footprint as well. And we're seeing, as the third bullet there, the OTT, or over-the-top, so-called IP set-top box: huge growth in this as the number of streaming services and carriers deploy these platforms as a means to continue to grow the interest in, and the eyeballs on, their content. And all of those are sort of underlined.
I talked about it in some of the individual segments, but connectivity across the board, across every segment today, is driving with it the ability to accumulate and process data and to drive efficiencies in each one of those applications. And of course, associated with that connectivity as well is a growing need for cybersecurity, again across almost all of these applications. So, looking at what type of growth rates that drives and what this market looks like overall: I'll start at the bottom here, in the consumer and connected home space. Again, I mentioned this 4K trend.
The other major trend that we see happening in this space is the adoption of mobile architectures in these applications, in particular in applications like smart home, in wearables, action cameras, and so on, drone platforms for that matter. The adoption of a mobile architecture means re-leveraging that architecture from an SoC perspective, but also from a memory footprint perspective, where we're leveraging the multi-chip module packages that combine DRAM and NAND in the same package. And Mike will talk quite a bit about that trend within the mobile space itself, but we're seeing the deployment of those technologies into this consumer space as well. And then the other point on there is the smart home opportunity, and the growth in smart home right now that's being enabled and encouraged, let's say, by the deployment of these natural language assistant AI systems, like the Alexa Echo products or Google Home products, right?
And we see growth in the market itself for that natural language appliance, but also the integration of that functionality within other devices in the home, which is also driving the growth that you see in the TAM there. In the industrial space, this is where that data economy really comes into play, and it's really where the payoff is, right? So we're going to see the implementation of this connectivity, the implementation of that data, and the capability to make even relatively small improvements in overall efficiency is going to pay off very, very strongly in the industrial space in particular. And we see those deployments happening as well.
And here, again, we see the adoption of mobile architectures playing a role, in things like IoT gateways, in robotics, in surveillance markets. The movement of analytics to the edge, which Mark talked about and I talked about on the previous slide, we see happening here, again in the surveillance market in particular. Think of this as going from just simple motion detection in a camera to true gesture analysis and threat detection in the IP surveillance cameras, right? And so that moves the analytics much closer to the data and much closer to where the decision needs to happen. And of course, it brings along with it the memory and storage growth associated with that. And then finally, automotive.
So automotive, of course, is undergoing a substantial transformation right now with the adoption of technology, the adoption of electronics in the automobile. And we see that in the in-car comfort features, call it infotainment and cluster; that is driving up substantially the number of display surfaces, and the amount of resolution associated with those display surfaces, that needs to be managed and driven by the car. We see it, of course, in the sensor network: as cars start to adopt these advanced driver assistance features, the sensor network in the car is growing tremendously and generating a tremendous amount of data, which again needs to be processed and/or stored. The transition to 3D maps, again, is substantially increasing the amount of storage that's going into a car. And Micron today is a clear leader in the automotive business.
We're about twice the size of our nearest competitor; we have about 40% share of this market today. And it's a very, very important market for us. And looking at the growth there, that's the reason why I'll spend the rest of the time talking a little bit about what is going on in this space and what the opportunity is for Micron. So when you look at a car today, there is electronics now literally everywhere in the car.
Right, whether it's the connectivity modules, the sensor network, the CPUs driving these advanced driver assistance systems, going into the cluster and infotainment, electronic data recorders or so-called black box recorders: all of those are growing today in the amount of memory that's needed to support those applications. The complexity and functionality of those applications is increasing, and with it, the memory footprint and the storage footprint that support them continue to grow. The other phenomenon that's happening, that's driving the growth you see down in the bottom right, is the adoption rate itself of these features across the car OEM lines. These functions and features used to be deployed only in, let's say, the very high end cars.
Today, we see that penetration reaching much lower into the vehicle makeup. For example, we talked about connectivity. Connectivity is projected to reach almost 90% penetration by 2020, which means it's well down into the entry level. ADAS features, which I'll talk about in detail on the next slide, are projected to reach almost 50% by 2020. So again, we'll be well into the mainstream portion of the market, right?
And those two factors in combination, increasing footprint and increasing penetration rate, are driving this tremendous growth that we see in automotive. So let's talk about autonomous driving. Before I do that, it's useful to talk about the motivation for autonomous driving: why are we going in this direction? You've seen some of the stats here: 1.3 million fatalities a year globally.
Most of all, human costs, right? And the U.S. number is about 35,000 deaths. In addition to that, there are another 4 million or so injury accidents, and another 10 million or so non-injury accidents. The National Highway Traffic Safety Administration did a study a few years back on just the economic impact of that, right, and found that the cost of that series of accidents and fatalities was nearly $250 billion, or 1.6% of the U.S.
GDP, right? Huge, huge motivation for why we're moving toward this autonomous driving capability. It's maybe good now to pause and talk about autonomous driving in terms of these levels that I've mentioned. It's typically classified from level 1 to level 5. So think of level 1 and level 2 autonomy as essentially driver assist or driver advice, but fundamentally requiring the driver to be in control.
Moving forward to levels 4 and 5, that partially or completely removes the driver dependency from the equation. The car needs to be fully independent, or at least situationally independent: you can be on highways and not need a driver. What we see in 2018 is really the deployment of these level 1 and level 2 systems in significant volume, right?
And what kind of applications are those? Think of things like adaptive cruise control, which is pretty widespread probably even today, right? But on top of that, not just lane departure warning, but lane keeping systems. Forward collision warning, yes, but also forward collision avoidance systems, automatic e-braking, parking assist, and so on. It's that class of feature set that's going to be fairly widely deployed in 2018.
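The level classification just walked through can be summarized in a small lookup. The descriptions follow the talk's framing (level 3, which the talk skips over, is included only for completeness), and the dictionary and function names are illustrative conventions, not an official taxonomy.

```python
# Autonomy levels roughly as described in the talk (SAE-style 1-5 scale).
AUTONOMY_LEVELS = {
    1: "driver assist: adaptive cruise control, lane departure warning",
    2: "partial automation: lane keeping, forward collision avoidance, "
       "automatic e-braking, parking assist",
    3: "conditional automation (not discussed in the talk)",
    4: "high automation: situationally independent, e.g. highway driving",
    5: "full automation: no driver dependency at all",
}

def driver_must_stay_in_control(level):
    """Per the talk, levels 1-2 fundamentally keep the driver in control."""
    return level <= 2

print(driver_must_stay_in_control(2), driver_must_stay_in_control(4))
```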
Along with that in the car, as I mentioned, there's this cluster and infotainment fusion going on, or virtual cockpit, which again is dramatically increasing the number of display surfaces managed there. That drives the memory footprint that you see at the top there. In rough terms, that's about a 2x to 3x growth from a high end car platform you'd see today in 2016. The big step function, of course, is moving fully to this autonomous level 4 or level 5 capability, and you see the footprint at the top is really astounding, right?
And what drives that? So again, in this case, you need to not only have a complete sensor network, but you now need the machine learning, or inference, engine that can decide, and the control to act on that. So you need that across the board. And when you look at the number of sensors, the resolution of those sensors, and the analysis behind that to interpret it all in essentially a machine learning implementation, it drives tremendous growth not only in density, which is captured at the top, but also in performance. So we're seeing a performance need driven by automotive that's now pushing to really the extremes of what we can reach with our LP4 capability today and our LP5 capability in the future.
It's going to be a very, very high performance, and also very high temperature and high quality, of course, application that we're going to continue to see feed and drive tremendous growth
in the memory business as well.
So why do we win? Again, Micron is number one in this market today, so the story for me is really how we maintain that and how we keep winning, right? And we really do that through a very, very deep understanding of this application, and a deep commitment to the unique requirements of this application. So you think about this: automotive today is expecting and requiring the so-called zero-defect mindset.
You have to be delivering an extremely high quality product. Fast forward to these level 4 and level 5 applications where the car is driving itself: the tolerance for defects and reliability issues is nonexistent, right? So we have to continue to drive that across our entire product portfolio, the NOR, NAND, and DRAM product lines that we're serving into this market. We think we have the industry's broadest portfolio right now, which is one of the reasons why we have the market share that we have. The investment we're putting into product innovation is very, very high. These requirements, as I said, are driving performance requirements near the very fastest speed grades that we have within the company.
They're driving temperature requirements beyond anything we've delivered before; we introduced a number of products with 125-degree capability last year, and we're going to continue to expand the temperature range in order to meet these very high speed and high performance needs. And we're doing that in collaboration with our partners. We've invested in a number of external, customer-facing labs where we're bringing in the OEMs and the SoC vendors, and we're working together to define these platforms, basically for this 2020 timeframe; we're working with them on the architecture. And we intend to continue to win and to continue to lead in what is a dramatic implementation and evolution of the automotive market.
With that, I'll turn it over to Darren Thomas, who'll talk about the storage business.
Okay. So, I'm sandwiched in between the two business units that have all the cool toys. I'm gonna try to make mine a little bit more interesting here for you. First of all, there are a lot of storage trends going on. We service literally four markets in the SSD world.
If you look on the left, you'll see the consumer market, and then the next one is the client market, then the cloud market, and then the traditional enterprise. Starting with the consumer, the big trend there is that they're fast followers to the client market. This is the group of individual users who will upgrade their systems by buying SSDs directly and installing them, or having a reseller or VAR do that for them. And they follow the client market, so I'm going to talk a little bit more about the client market now.
The client market's really driven by this insatiable desire to make a laptop thinner, with longer battery life. Everybody wants that. But a couple of new things have come into play here in the last year. One of them is security. You've heard security over and over again.
These are mobile devices. They can be snatched from you or stolen from your car, and the desire to have those products literally be unusable by anybody who steals them requires security in the devices. We have security, which I'll talk about in a little bit, that's pretty exceptional. And then the other thing is ruggedization.
It's become pretty clear to the modern user that a laptop is a much more rugged device when it has an SSD in it, and make no mistake about it, they're building laptops to be more rugged because they are lasting longer. And that's what the customers want. That market's really driven by the OEMs. The OEMs in the client space do the customer interface and figure out exactly what sells, and it's a pretty commercial market.
Moving over to cloud. Cloud's interesting because at one time traditional enterprise and cloud were the same thing: the traditional enterprise built big data centers, and then there became a group of people who built even bigger data centers. And that group of people found that it was very cost-ineffective to build large scale data centers using traditional architectures. So they started investing a lot of energy into developing these new architectures that we now know as the cloud architectures. What they're interested in is scale and this thing, TCO.
And I know TCO has been used a lot in our industry, and I don't want to overuse the term, but make no mistake about it: if you're building a large cloud environment, how much you have to pay for all the hardware, and how much you have to pay for all the firmware, the software, and all that, makes a big, big difference. And so these cloud architectures are really based on very scalable, but different, systems. And as a result, they have grown very large, and they are fueling all that IoT, all that mobile information, all those photographs, a lot of the applications we see today. And so we break them out, because they act and behave very differently than the traditionals. Then you have the traditionals on the right.
The big thing that drives them is always performance. They're the group of people that typically put 1,000 to 5,000 employees on a single server, or they put 10 or 20 applications on a single server, and they want more and more performance. And that in great part drives the additional requirements to move the NAND, the storage devices, closer to the CPU, as well as the creation of some new technologies there. So those are the big macro trends. If we look at what the drivers are in this space, it's pretty simple.
The client customer is trying to extend the life of their laptop. They have an older laptop, and it's not at all uncommon now to keep a laptop for much more than the two or three years that was the old standard. And if you're doing that, you can extend the life by putting in an SSD and literally make that laptop last almost another three years.
So that's the consumer driver. Laptops, the very lightweight ones, the ones that you can carry through the airport and that don't weigh 5 pounds: that market has probably passed its tipping point, and more than half of that industry uses SSDs. Matter of fact, if you search most of the major laptop providers' websites, you will see that they embrace and show you their thinnest, lightest weight, fastest, longest battery life laptops on the front page. The cloud is basically fueling the social media juggernaut that we see going on, the massive amount of mobile data. Mike's gonna talk in a minute about how mobile phones are getting bigger, while people are still uploading and downloading to the cloud.
You also have all these IoT devices, and they connect to the cloud, and almost all modern apps have a cloud application connected to them or running in parallel to what they do. So the big driver for the cloud is the fact that there's a lot of demand going on. We talked about the quintillion bits of data being created; well, the cloud is the most economical place to store that. And so I'm going to dive into the cloud here in just a little bit. And then you see the traditional enterprise.
The traditional enterprise, they still want performance, and security has become a bigger issue for them; I'm going to talk about security in just a minute. But they also care about consistency of performance, and they care about reliability. They have had this reliability, availability, serviceability model for years. So that's been their driver. What I want to look at: if you look at the size of the bit growth on all of these, they're all fairly large.
And if you look at the size of the TAMs now, they're all fairly large. So there was a time we had fast bit growth, but it was based off of smaller markets. These are now big markets, so this bit growth now means something very significant, because it's a big deal to take a $5,200,000,000 market and grow it at 60%. And that's what we're going to talk about here.
So, the first thing I want to do is set the tone for the cloud. The cloud players are the early adopters. These are the people that go out ahead of the enterprise. The enterprise really can't, and does not want to, experiment: with the data that's in your bank, or the data that is running a financial institution, they're not great experimenters. But the cloud group is. So what the cloud group does is go out to the bleeding edge, as far out as they can, and bring the technology back. And as they refine that technology and make it work, the enterprise begins to adopt it as well, usually on premises.
Why does that matter? Because everything we do in the cloud has an early-adopter model, and the enterprise customers quickly follow those technologies. So that's why the cloud is so important to us, not the least of which is that it's the fastest-growing market. Notice it has a 60% CAGR, the same as enterprise, but it's off a much larger number: about $5,200,000,000 versus $3,600,000,000. So cloud has grown very fast.
The second thing is, you notice this number: IT growing at 18%. That's a big number. IT doesn't always grow that fast; as a matter of fact, we've come off of maybe a decade or more where IT grew in the single digits. This is a big change, but I want to give you a point of reference: the cloud portion of that is closer to 30%.
So cloud is growing very, very fast within IT. Spinning media can't handle this. Now, spinning media has handled it in the past, so I know that's a bold statement, and I want to explain it. With spinning media, on average, you put a lot of drives in to get the performance that the customer needs. The performance required is 10,000 IOPS; each hard drive does about 200.
So you have to have a lot of hard drives to get to 10,000, and then you add them together; that's what the array controllers do. In the case of an SSD, we can achieve that number with a single drive. It's not that hard, but there are architectural requirements to go in and make those things happen. So the cloud is making those architectural changes, and they have now exceeded what the spinning media can do.
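To make the drive-count arithmetic above concrete, here is a minimal sketch; the 10,000 IOPS target and the roughly 200 IOPS per hard drive are the figures quoted in the talk, and the ceiling-division step simply counts how many spindles an array controller would have to aggregate:

```python
# Rough sketch of the spindle-count arithmetic quoted above.
# One enterprise hard drive delivers on the order of 200 random IOPS,
# so a 10,000 IOPS target behind an array controller takes dozens of
# spindles, while a single datacenter SSD clears the target on its own.

TARGET_IOPS = 10_000
HDD_IOPS = 200                    # per-spindle figure from the talk

hdds_needed = -(-TARGET_IOPS // HDD_IOPS)   # ceiling division
print(hdds_needed)                # 50 hard drives aggregated together
```

That 50-to-1 replacement ratio is what makes the "architectural changes" in the cloud worth doing, even at a higher per-drive price.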
And in doing so, with their large random I/Os and their multi-tenancy, and hopefully everybody understands that term, but it basically means you have a single server and you put more than one application on it, and each application may bring 1,000 customers with it. So if you have a single server and you put 10 applications on it, each with 1,000 customers, that's a big number of users, and that's what we refer to as multi-tenancy, and that's what clouds do. They do everything like that, at scale. So when they do that, they kind of break the rules and they break the technology. We just released the new 5100 drive, and I'm going to give you some of those numbers as an example here.
Cloud guys want the product to operate the same way from the day they bought it to the day they put it away. They can't build a cloud, then go work on something else, and have that cloud slowly slow down. That's just not acceptable. This thing they call QoS, or quality of service, requires that the drive behave the same way over its entire life. Ours is 99% better than a hard drive, and hard drives did that very well.
The next piece is best-in-class capacity. It wasn't just a few years ago that an SSD was always smaller than a hard drive. As a matter of fact, most of you that have an SSD in your laptop probably have a smaller one than you could have had if you'd gotten a spinning drive, but you made that sacrifice. Well, in the cloud space, you don't have to make that sacrifice anymore, because we are now four times larger than the nearest spinning drive. Now, four times larger is significant, because our price point is not four times more. And so as a result, you can now start replacing drives.
Replace spinning drives with SSDs, and you actually get the last bullet here, which is a lower cost profile; we've demonstrated 66% savings. So the cloud is not only the advanced design and architecture group for the enterprise, they have crossed the line sooner, demanding SSDs much quicker. Now let me follow up: why do we win? Well, this is important.
Scott talked about our technology. We have 3D NAND that, in its TLC form, is a 48-gigabyte device. The next nearest one to us is 50% smaller. And that's important because I can make an 8-terabyte drive. So our devices come out from a quarter terabyte to 8 terabytes. That's a huge scale.
And at the same time, using some architectural capability, we made it go from less than 1 fill per day, which is the measure of durability of the drive, to more than 3 fills per day. So our customers have a large choice off of a single chip: large drives, small drives, a high-performing drive, a lower-performing drive, a higher-capacity drive, all of those dimensions. And then my bullet here about storage-tuned media: one of the beautiful things about that device, and Scott also talked about the CMOS under the array, is that we can shrink it down and make a much smaller, much lower-capacity device, like a 16-gigabyte device. And because it's CMOS under the array, it's the smallest chip in the industry, and mobile can use that part.
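As a hedged illustration of what "fills per day" (drive writes per day) implies for endurance: the 8-terabyte capacity and the 3-fills-per-day figure come from the talk, while the 5-year service life below is an assumption added purely to make the units concrete:

```python
# Illustrative endurance arithmetic for "fills per day".
# Capacity and fills/day are the figures mentioned in the talk;
# the 5-year service life is an ASSUMED value, not a quoted spec.

capacity_tb = 8                   # 8 TB drive, per the talk
fills_per_day = 3                 # "more than 3 fills per day"
service_years = 5                 # assumption for illustration only

total_writes_tb = capacity_tb * fills_per_day * 365 * service_years
print(total_writes_tb)            # 43800 TB written over the drive's life
```

Tripling fills per day triples that lifetime-writes figure, which is why the jump from under 1 to over 3 fills per day matters to cloud buyers.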
So one technology literally leads in the storage world and also makes the best device for mobile, a very, very compelling argument for our technology. Moving on to the SSDs' flexible architecture: we now do what we call flex architecture. We let the customer pick the capacity. If they buy a drive from us and it's 512 and they want it to be 510 or 505, they can move it themselves. They can do that.
We're the only company that has introduced that flexible architecture. We also, if you look across the page, have all those security capabilities. We actually allow the customer to pick the security they want without sending the drive to us or making a unique part number; they can pick which security they want, lock the drive down to only that security, and then sell it or build that security into their architecture. One single SKU to us, many SKUs to them. And then, customer alignment: we've simplified our offerings, and as I just said, some of our flexibility allows them to do things that make one SKU, one qual.
And if you're a large partner of ours, like a large OEM, you'd like to get 5 or 6 years of drive requirements out of a single qual. That would be a great thing, and we work to accomplish that. And then of course, quicker qual time, because we do fewer of the quals. Moving over to the right side: comprehensive portfolio.
As we introduced our client drive, we actually made the drive so it would appeal to the lower end of the cloud market. We did that with some firmware features that the client market didn't care about, but the cloud market did. So we have one drive that actually crosses into other portfolios. Why is that important? Well, because these markets ebb and flow at different speeds, we can move our NAND between mobile and storage, we can move our NAND between client and cloud, and we can move our NAND between enterprise and consumer.
And that's an important aspect, because it allows us to play the portfolio. Industry-leading security: we're the only company that offers FIPS, which is a federal standard for security. We also have AES-256. It's effectively unbreakable encryption.
If you steal a drive that has that, it'll be completely worthless to you, because not only is it 256-bit encryption, it is also hashed against the serial number, which makes it practically impossible to break the code. If a customer loses his own password, we can't fix it. The drive is that secure. And that has become an important aspect. And then finally, differentiation in the ecosystem: we actually run our systems against applications like Oracle.
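To illustrate the idea of a key "hashed against the serial number", here is a minimal sketch. This is purely illustrative: the drives' actual key-derivation scheme is not disclosed in the talk, and the use of PBKDF2, the iteration count, and the sample password and serial number are all assumptions for the example:

```python
# Illustrative ONLY: derive a per-drive 256-bit key by mixing the user
# password with the drive's serial number, so the data is useless if the
# media is moved to another drive. NOT the vendor's actual scheme.
import hashlib

def derive_drive_key(password: str, serial: str) -> bytes:
    # The serial number acts as a salt, binding the derived key to this
    # specific drive; PBKDF2 parameters here are assumed, not quoted.
    return hashlib.pbkdf2_hmac(
        "sha256", password.encode(), serial.encode(), 100_000
    )

key = derive_drive_key("correct horse battery staple", "MU-5100-0001")
print(len(key) * 8)   # 256, i.e. a 256-bit key
```

The point of the design is the last sentence of the talk: because the key depends on the password, there is no vendor back door; lose the password and the data is gone.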
We run them with partners like Facebook, and we run them with VMware. And in doing so, we come out with architectural references that tell our customers: if you deploy our system using this many of this drive, this many of that drive, and set the capacities like this, you'll get the highest performance at the lowest cost. So all of that combined makes us a very compelling argument in the storage space. With that, I'd like to turn it over to my good friend, Mike Rayfield, for mobile.
Thank you.
Good morning. It's nice that this event finally lines up with Groundhog Day, but this Groundhog Day feels a lot better than last year's Groundhog Day, and we'll talk a little bit about that. So, I'm going to talk about the mobile business. I'm going to talk about 3 things. First, I'll talk about what's driving it.
We've talked a long time about more DRAM, more NAND. We'll talk a little bit about what is driving that. We'll talk about what the market looks like now because it has morphed and it's morphed in a direction that I'm pretty happy with. And then we'll talk about what it means to Micron because I think we've got some unique opportunities. So we've all heard about AR and VR.
Remember, the first place you get to do those things is somebody sells you a phone and you snap it into your goggles to go off and experience virtual reality for the first time. And whether you know it or not, you use augmented reality every day, right? If you use Google Maps, if you use Yelp, if you use any of those things that sort of tie everything together, it's sort of become part of our lives. And that requires a staggering amount of data, a staggering amount of responsiveness, and ultimately storing a lot of material. So that drives a lot. The next thing is, if you wander through the market and look at all the boxes for cell phones and tablets, people talk about the amount of NAND and DRAM.
That never used to be the case. It used to be display size; it used to be the number of cores. The reality is, from a performance standpoint, what people care about is the amount of storage and the amount of DRAM. We'll talk a little bit more about that as well.
And then finally, because we use these devices so much, because they're so immersive, we've got to have low-power technologies, whether it be next-generation memory or storage, or finding ways to do creative things with LP4, so that you can use your device all day long and not worry about it dying at the end of the day or having to go off and charge it. So all of those things are pushing this market in a direction that's very good for Micron, for both memory and storage. So there's a couple of things here you'll notice. The first thing is there are 4 distinct markets. And we've talked a lot in the past about 3 markets.
And now we've got a 4th one, and they're all very big. If you notice, they're all growing faster than the overall market, and so that's a good sign for us. And NAND is outstripping DRAM on all of them. And if you've got any teenage kids, you'll realize why that is.
Right, 256 gigabytes of storage is sort of the minimum people are asking for now as they put content there. The next thing you'll notice is that nowhere on this slide is there any mention of units. Every year, for the last 4 years that you and I have talked, the first question is, you know, it's not growing very fast, the world's over, it's only going to grow at 5% versus 8%.
It's a tertiary effect. The reality is, if you really do double or triple the amount of NAND and DRAM, then whether the market grows at 5% or 8% or 12% just doesn't matter. And that's why we've been so excited about this: the most important components in these mobile devices have become what we do, which is memory and storage. We have to play in all four of these markets. So if you look at the top one, there's an awful lot of learning there, right?
There's learning about storage technologies, learning about DRAM technologies, learning that we do and learning that our customers do, and that automatically sort of slides down the market, if you will. The high-end market is basically the same as flagship, but by a bunch of folks that sell it for about half the price.
So when the top guys come out with a phone that's 256 gigabytes, the high-end market does the same thing. They're fast followers with that technology, and that forces sort of an upgrade, if you will, of the whole market. The last one I'll talk about is the low end, which, the last time we met, was basically a feature phone. And what's going to happen, and we've had this conversation before, is that the next 1 or 2 billion people are going to buy one of these very, very inexpensive smartphones. It's going to be their first phone, their first computer, their first internet connection. And by the way, when you look at the cost of those devices, probably the largest part of the bill of materials is memory and storage, because that's what people care about.
It's got to be able to operate, it's got to be able to perform, and you've got to be able to store user-generated content. So that's really exciting for us. So, some of the things that are driving storage. And the reason I point this out is because it overlaps nicely with where the technology is going and the stuff that Scott talked about. Over the last 3 or 4 years, storage has basically been driven by the social content we create.
We take pictures, we do Instagram, we send Snapchats back and forth. We've got a whole bunch of user-generated data and small stuff we pull off the web. That pushes us to sort of the 32- or 64-gigabyte need. We're generating a lot more user-generated video now, right? When people are sending you their videos of skiing or their adventures, it's on their phone.
It's relatively high resolution and they want to keep it, so I need a lot of storage to keep it here. Darren needs a lot of storage because I'm going to back it up. And ultimately, it's going to drive the stuff that Jeff Bader talked about, the car, because I'm going to want to replicate that there again. And then as we get to 3D NAND and the ability to put in higher and higher density, it's going to allow us to do high definition, both user-generated, but also films or movies you can download over 5G or WiFi; they're going to be in the neighborhood of 5 gigabytes apiece.
And again, people want it on their device. Backup is great, but this instantaneous ability to show your friends and show people is something that's going to continue to drive that. And you've already seen it: your mobile device, there's a reasonable chance it's comparable in terms of memory, storage, and compute horsepower to your notebook computer. And it makes total sense. You use it just as much, and you create a lot more content with it.
So the thing that's changed a little bit, and that I think plays very nicely into Micron's portfolio, is what we call heterogeneous memory, or MCPs, which is basically NAND, DRAM, controller, and firmware on a single substrate. And out in time, there are going to be next-generation memories on that substrate as well. Historically, that was a relatively low-end technology, low density for the low and mid range. What we've seen is that the technology has moved to higher and higher densities. You can get 3 or 4 gigabytes of DRAM and 32, 64, or 128 gigabytes of NAND on an eMCP, and you can get the next-generation technology, which is uMCP, or UFS, on it. And all of a sudden, that sort of China Inc., which has created a pull for this because of ease of manufacturing and ease of use, has now made that market huge.
The numbers here in the high end, mid tier, and low end are just the eMCP, or heterogeneous memory, component of that. Before, we used to talk about it as a little piece of the business, mainly DRAM and NAND; now it's the majority of the business. And the great thing for Micron is, while a lot of times you don't want to brag about having a small market share, the reality is we're below 10% market share on all of those, which means the headroom is staggering.
We can grow at a pretty amazing rate. One of the things that Darren talked about was our new NAND technology. We have NAND that is tailored for mobile; it can also be used in storage, and it can also be used in automotive. So with these market shares, we can flex in and out of the markets that make more sense to us.
If it looks better to be in some of Darren's markets, we move capacity there; if it looks better in some of the mobile markets, we move it back. That's a level of flexibility we've never had in NAND. We've had it in DRAM, and it's going to do a lot to stabilize, I believe, the market and our ability to go to market with great products.
So, on how this is growing: before, when I talked about high-end, high-density memory and storage devices, they were sort of household U.S.-based companies. Now, if you look at this, this is the who's who of companies in China that build amazing devices. These used to be devices that were inexpensive and considered not the highest quality; now they are amazing devices. They're starting to show up in Europe and North America, that will continue to happen, and ultimately what they will do is take the high-end capabilities from the flagship devices and force them into this mid range.
It is also much easier from a manufacturing standpoint. We can start to build solutions that solve a problem. Let's say somebody has a phone where what they really care about is high-definition video: I may have an MCP that's got more NAND, maybe it's got next-generation storage on it as well, plus some DRAM, and the customer really doesn't care what's on there. It's solving a problem, which ultimately makes us more valuable. And it's much easier to procure, right?
It's sort of a one-stop shop. And the great part for us is there's a limited number of people that have both NAND and DRAM. And as we add next-generation technologies like 3D XPoint, it again limits the people that can solve that customer problem, potentially to just us, which is great. So why do we win? We've talked a lot about this, but having a broad portfolio that can go all the way from the bottom to the top is a huge advantage. In mobile, we've sort of got this saying: we don't pick winners.
We spend time with all the teams in all the companies, in China, North America, Europe, everywhere, and work on design wins so that as different people win, because it's hard to call who will, we've got a solution for them. There's a lot of work with the ecosystem. We've talked a lot in the past about getting qualifications and how important that is. We've got labs around the world. We work very closely with the SoC suppliers, the processor suppliers.
So as handset people adopt that technology, we're automatically in a position to go off and start supplying solutions. We basically measure the percentage of the sockets in the world that we're capable of supplying, and we track that on a regular basis. Both Darren and Scott talked about 3D NAND, and we've had a pretty good eMCP business for the last 3 years. Having custom-tailored NAND for mobile now is an unbelievable advantage for us. The ability to move in and out of markets, and ultimately the ability to have, I think, the best solution for the marketplace, is something we haven't had, and I think it will make us much more successful. And then finally, our objective is to be the best partner possible. We don't compete with our customers.
We spend time with all of them locally, whether it be in our labs or in their labs, to try to come up with the best solution. And the combination of that partnership, what I think is the best portfolio in the industry, and the ability to mix and match this unique technology is a huge opportunity for us. So we're looking forward to the year ahead, and we'll come back and talk to you again soon. Thank you. Thanks, Mike.
February 2nd, so he could lead in with that Groundhog Day reference. But the reality is, there's a lot of truth in what he said: this year feels a lot better than last year, because last year we were telling you what we were going to do, and this year we've demonstrated what we did. And we're going to tell you a little bit more in my section about how we're looking at 2018, and maybe a little bit beyond that time. So I actually have the easiest job here. These guys have done the hard work; I get to tell you what it all means, as well as share some other hard work from our operations team, which isn't represented here today in terms of a presentation.
So, some key updates here, and then I'll move into some capital management topics. We have shared with you the information on the left-hand side of the graphic for about 18 months now; we first debuted it to you in August of 2015. And you can see by the check marks that we have met every single one of those deliverables. Now of course, 2017 isn't over yet, but we're getting near the halfway point, so we're feeling pretty comfortable about that.
And we've now extended that outlook through 2018, and I need to add a disclaimer here, in the sense that we are in the very early stages of our business planning process for 2018. So we are giving you a glimpse of data that is less mature than what we would typically provide to you. This is based on a relatively normalized level of CapEx that we're planning for 2018, but you can see here how we compare in terms of bit growth CAGR as well as cost reduction across the 2 key technologies of the company during that time. And we've been more specific about 2017. So if you go and do the math after we're done here, you're going to see that for 2018, we are planning bit growth that is roughly aligned with the industry estimates that Mark provided to you a little bit earlier in his presentation, and you're going to see cost-per-bit reductions that are actually a little bit greater than what you might expect for those bit growth projections. And that's the result of all the things that Scott has talked to you about today relative to the progress on the 1X node as well as the 2nd generation of 3D NAND.
So, some good opportunity ahead for the company to continue, in technology and bit growth, the success that we've had in 2017 as we go forward. First and foremost, we have an ongoing effort around fab optimization, and that is essentially getting the most wafers we can out of the existing manufacturing capacity. We've had a lot of success with that across our DRAM network. With the recently completed acquisition of Inotera, we'll have the opportunity to take that learning and apply it there, and we'd expect some benefit coming from that activity. In terms of assembly and back-end improvements, you've heard that referred to a few times during the presentation today.
But essentially, think about it as a couple of things: one, moving those two activities closer to each other, and two, making sure that we eliminate redundancies that exist across the network. Is it going to automatically switch over? Okay, perfect. And so, we always want to have our products manufactured at at least two locations, for redundancy's sake.
We don't need them manufactured at 4 or 5 locations. So as we have the opportunity to consolidate and add some efficiency there, you're going to see some benefit that is in addition to the bit growth and cost reduction that we talk about. Secondly, you've heard today how the company's progress and leadership position, particularly in the NAND space, is allowing us to reenter markets where we have had less presence, particularly the storage market, and that will allow us to retailor the margin profile of a big piece of our business on a fairly regular basis, where we try to balance the most near-term profit opportunities, the longer-term strategic customer opportunities, as well as whatever might be happening in the spot or the transactional markets. And together, we think that at full run rate, this would be an opportunity of about $500,000,000 a year. These opportunities would predominantly be reflected in the revenue and gross margin lines of the company.
Obviously, the segmentation pieces at the bottom of the page are more revenue related, and some of the fab optimization is more cost-of-goods-sold related, but that's where you're going to see it. And some of those are already rolling through the company's P and L, and we expect that they'll roll through at greater levels as we go through the next few quarters. Secondly, I've been talking for a while at investor conferences about the need to make improvements in working capital. We're working on 3 primary activities there. The first of them is inventory reductions.
Each of the business units and I have worked together to develop an inventory goal for their business that will allow them to meet their customers' needs while, at the same time, trimming down and optimizing the amount of working capital tied up in that inventory. On supply chain process improvements: as we have more and more success in the storage business, we'll have an opportunity to shorten that supply chain, which will free up working capital. And finally, manufacturing cycle time improvements: you heard Darren in his presentation talk about having one SKU for us that satisfies a variety of customer needs. Having one SKU is tremendously impactful in terms of not having multiple SKUs, which in turn drive multiple inventory inefficiencies and multiple cycle time expansions and issues. Together, we think these will help us generate, over the long term, about $400,000,000 worth of working capital improvement.
All of those things together mean that if we encounter a business cycle similar to what we encountered in the second half of 2015 and the first half of 2016, the worst of the last cycle, if you will, we think that we have an opportunity for dramatically different performance. This gray bar is representative of our performance over the course of that period of time. If you look at what we've done in terms of incremental bit output and incremental cost downs, those two factors alone would close more than half the gap, from a working capital perspective, relative to that last-cycle performance. We then have scale, both from the Inotera acquisition as well as from the expansion of our 3D NAND capacity in Singapore, and we have not double-counted that in those first two bars. So that scale expansion is represented there under the scale line, and the bit growth from technology is represented under the first bar. Then you add some of the operational efficiency activities that I've just spoken of, and the result is that we have an opportunity for very meaningfully differentiated cash flow performance should we encounter situations similar to those of the second half of 'fifteen and the first half of 'sixteen. So a real opportunity here for us to view the future differently than what has occurred in the past.
Those things are also benefiting 2017's free cash flow outlook, as you can see here. Although we're not giving an update on the current business environment, it obviously supports a good amount of free cash flow; we now expect that to be in excess of $1,500,000,000 for fiscal year 2017. We've talked for a while now about using a significant portion of that toward debt reduction, so we're being more specific here: somewhere between 50% and 75% of that will be targeted for debt reduction. And as always, we continue to explore ways to strengthen both the company's capabilities and the solid foundation of a balance sheet that will provide a platform for the company on a going-forward basis. A quick wrap-up here.
This is the capital management framework of the company. We've shown this on a fairly consistent basis, and you can see here, relative to our fiscal 2017 performance, we're on track for all of those items. If you're very detail oriented, you'll notice that we've made a slight change on the leverage portion, in the bottom right-hand side of the slide: we've now provided a range where we previously provided a point estimate. The range is designed to acknowledge the fact that we're in a business that goes through strong periods and weaker periods, and you can't look at a single-point debt-to-EBITDA number as reflective across a set of business circumstances as wide as we sometimes encounter. So we have widened that range.
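For readers less familiar with the metric, a debt-to-EBITDA leverage ratio is just total debt divided by EBITDA. The figures below are invented purely to illustrate the arithmetic and are not Micron's actual numbers:

```python
# Hypothetical numbers purely to illustrate a debt-to-EBITDA leverage
# ratio; these are NOT the company's actual figures.
debt = 9.0      # total debt in $B, assumed
ebitda = 5.0    # trailing EBITDA in $B, assumed

leverage = debt / ebitda
print(round(leverage, 2))   # 1.8, i.e. north of the 1.5x mentioned below
```

Because EBITDA swings with the memory cycle while debt moves slowly, the same debt load can read as comfortable leverage in a strong period and stretched leverage in a weak one, which is the rationale given for quoting a range rather than a point.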
We are currently sitting somewhere north of the one-and-a-half times; we'd actually like to be below that, so that just reinforces our commitment to, and acknowledgement of, the fact that we have some work to do there. Sorry about that. We don't have an update to our capital plans for the year; they're about the same as we articulated going into the year. So, total investment nominally of $5,000,000,000, with a small range around that: 40% to 50% for DRAM.
30% to 40% for nonvolatile, and the balance for technology and product enablement. And finally, we are continuing to execute on the technology and operational enhancements, and those will allow us to deliver improved free cash flow over the course of any sort of business cycle, particularly in difficult business cycles, when every dollar of free cash flow becomes incredibly important to the company. And we will deliver on our commitment to reduce leverage during the course of fiscal year 2017. With that, I will turn it back over to Mark for some closing comments.
Alright. Thank you, Ernie. I have one more piece of new information I'd like to share with you before I wrap it up here, and then I'll turn it over to Q And A. A number of months ago, I talked to our board about my desire to retire from the company at some point in the future. And we discussed that, for me and for the board, nothing was more important than making sure that we did it in the right way and at the right time.
And hopefully, as we've gone through our discussion today, you've come to the conclusion that this company is really firing on all cylinders, that we have very strong tailwinds in the marketplace, and that we really have a very, very bright future. So I think it's fair to say that now is a very good time to facilitate a CEO transition in the company. Let me tell you right away: the board and I take this succession planning responsibility, and the execution of a smooth transition, as one of our most important responsibilities to the company, and it's something that I personally, having been here with the company through my entire career, take very, very seriously on a going-forward basis. So you have my assurance that we will have a strong succession plan, that we will do this in the right way, and that throughout, we will maintain the very high-caliber executive team that you've seen some examples of today. The board has formed a special committee to oversee and execute a CEO search.
And it's important to realize there's no deadline. There's no timeframe for the transition. I will be here throughout the process and participate in the process. And most importantly, we'll be very focused on ongoing business success and driving our strategy forward as we move through that period and through a transition once the successor is identified.
The board has full confidence in the company and in our strategy. I think we've done a very good job over the 5 years that I've been CEO identifying where the company needed to go early, putting the pieces in place in order to engage the market opportunities that are coming into place today. And the board and I will make sure that we continue to execute against that strategy. There are no changes planned in the short term, really just good, solid business execution and ongoing deployment of resources against our existing plan.
So with that, I'd like to
wrap it up. It's bittersweet for me, leaving the company at this time, because really, I believe the memory market has never been more exciting. There have never been better opportunities for us to grow the company and to add value for our customers and our shareholders. But we have a great team in place. We're executing against all our priorities.
We're working in markets that are requiring more and more of our products and more and more of our technologies, in more interesting ways and more complicated solutions. And so we have all the pieces in place. We've got long term collaborative relationships with our customers. We've got innovative new memory technologies going into innovative new end market solutions, and the company is executing well against all of those. So, thank you for your continued support of the company.
At this point, I would like to move to Q And A. And if the questions are hard, I'm going to direct them to my teammates over here to help me answer some of these questions. So let's take it from there. Thank you.
Thank you very much.
Do you have all the R and D capabilities you have inside the company or are there technologies and capabilities you would look to acquire going forward?
I think we've done a pretty good job bringing all the pieces together. You never say you have everything, but as we alluded to, we continue to build on some of these fundamental underlying technologies that we think are becoming more important to our future, things like
some of
the advanced packaging technologies, controller expertise, as well as augmenting our firmware and software capability, etcetera. You're never done with all of that, but we believe we've got as many or more of the right pieces as anybody we compete against.
Can you just talk about the continued tightness in the PC market, when you expect that to be alleviated, and then maybe share any comments on the server and the mobile phone markets as well?
Yes, let me pitch that to Tom Eby for PC and server and Mike Rayfield for mobile.
Guess I'm mic'd up. Okay, good.
Yes, so I think, as I talked about, in the PC space, longer term, that is actually our slowest growing market. We have seen, over the latter half of 'sixteen and as we continue into 'seventeen, that that decline has not been at the level that people had anticipated. And so that's certainly been an uplift to the market, and I don't think we've seen a material change in that trajectory. Again, the big driver is on the server side, and that's to a lesser extent on the enterprise side, to a greater extent on the cloud side.
I would say we continue to see a kind of very healthy dynamic in that marketplace, again, primarily driven by the density needs, they're just putting more and more and more DRAM in every server. And so we see that quite healthy.
Mike? Yes, from a mobile standpoint, it continues to be driven by what we talked about here on increased memory in all the devices. And then also, as Jeff talked about in automotive, that high functionality and high memory and storage content subsystem, if you will, is also being picked up in numerous places in an automobile. So the 2 together, it feels like the low power and the mobile DRAM and NAND have got a pretty good market going forward.
Let me just add one comment to all of that, which is, as best we can tell, in all these markets today, customer inventory is very, very lean, and we know our inventory is lean as we move through the quarter and on into the next quarter. The aggregate supply and demand in the overall marketplace, we believe, is going to continue to be short. And we have the ability to move these components back and forth between the various memory segments. And so we think that actually that disaggregation means that it can never be completely in balance, and that actually you have to have even more of a surplus in order to make sure you don't have those spot pieces of various markets that are short. So, as we look at the market going forward, it looks very strong to us.
Bill.
Oh, sorry. How about back here?
And then we'll come to you. Thanks, Mark Delaney from Goldman Sachs. First, Mark, congratulations on all the accomplishments of your career, from those of us that have followed the company for a long time and observed how much you put into it. Thank you.
Two questions, one on NAND, one on DRAM. So first on NAND, I think trade gross margins are running in the mid-20s. Some of your competitors are over 40. Is that all just the bit density on NAND, and does getting to 64 layer close that gap down? And then the second question I'll just ask now is on the 16 nanometer transition.
I think last year at the Analyst Day, Scott had talked about the number of bits per wafer going from 20 to 16 nanometer actually being a bigger increase than you got from going from 25 to 20. The slides today showed the cost down from 25 to 20 as actually maybe larger than it's going to be from 20 to 16. So any sort of elaboration on the cost downs at 16 nanometer versus 20 in DRAM would be helpful. Thank you.
Yes. Okay. So the first part again, Mark, was NAND gross margins in the mid-20s, and does getting to 64 layer close that gap?
So relative to NAND, hopefully, as we went through the presentation today, you picked up that there are a number of things we think will be beneficial to us as we move forward relative to how our NAND gross margins stack up. First of all, there's the new technology. There's the increased mix of TLC. There is the transition, and we continue to see some cost benefit as we have more volume flushing through on the Gen 1 transition that we've been working on, but a significant cost down as we move to Gen 2 beyond that.
And then finally, there's sort of a remix in terms of the value-added segments, our ability to deploy this leading edge technology in the mobile and various segments of the storage sector. So we're feeling very, very good about our ability to continue to drive relative improvement with our NAND technologies on a go forward basis. The second part, relative to cost downs going from, I think it was, 25 to 20 versus 20 to 16: we've got a very rich portfolio mix at the 1x node for DRAM. And so, as I think was alluded to during the conversations, we'll have graphics components, we'll have mobile components, we'll have compute components, and we'll have different cost downs associated with all those different products.
Some of the follow-on 1x products will be larger shrinks and drive larger cost downs. So in aggregate, we had a mix of things going on previously: over the last year, we had some transition from 25 to 20, and we also had some transitions going on from 30 to 20. As we go forward, the 20 to 1x cost down, we believe, will be substantial and in line with what Scott showed you, but there will be a mix of different products, some of which are bigger cost downs than he demonstrated today. Thanks,
Mark. A couple of questions. First of all, I'd like you to discuss the tug of war between Ernie's point about wanting to bring inventory down over time and the point that you just made of wanting to make sure you have enough inventory, as these markets segment, to not short individual applications. And then secondarily, how would you characterize Micron's ability versus your competitors' ability to flex between individual products within DRAM and within NAND? I mean, in the past there has been this question about flexing between NAND and DRAM, but I'm really thinking within the same memory segments.
Okay, great. So first of all, relative to the inventory question, there are things that we can do, that we've identified and are working on, that will bring greater flexibility and drive the need for less inventory while maintaining flexibility relative to our ability to service our customers and drive the right service levels to meet end market demand. So there are some things we can just get better at internally, and that's what Ernie was talking about. Some of that comes also from the second part of the question, which is what can we do internal to our operation to drive better flexibility between individual products. And cycle time turns out to be a big piece of that, and the way that we structure our backend turns out to be a big piece of that. So Ernie alluded to improvements in the backend. Part of that has to do with how we configure our backend, how we streamline that network. Part of it has to do with our ability to drive better flexibility within given manufacturing sites in order to manage that backend inventory in a more cost effective way.
Doug Thomas. I just wanted to say, I guess you've been threatening to retire for so long, I'll believe it when I finally see it, but I just wanted to add my congratulations to you. And I want to ask you something. Mike mentioned this; Steve used to talk about this too: the fact that you don't compete with your customers. And over time, we've seen some of your competitors go in and out of certain businesses.
Obviously, Samsung just buying Harman. I just wondered, over the years, have you seen benefits accrue to the company based on the fact that you have maintained this real sense of independence? And going forward, in terms of how you and the board would like to see the company develop over the years, do you rule out any potential vertical integration, particularly in automotive or any of those end markets that you see as so substantially important to your strategic plan?
Well, we're not going to get in the car business anytime soon. But no, I wouldn't rule out some level of vertical integration. We are primarily, and really wholly, focused on building memory subsystems and systems for our customers. At some point, you start to blur the line when you talk about memory systems between what is a memory system where Micron can add more value versus one where others might play in. We'll have to work our way through that. But I think there are benefits that accrue from not competing with your customers.
There are engagements you can have and there's an intimacy that you have, and that we've experienced, that is important and that has that non-competition as a sort of foundational element. But I think in the new world that we're entering, that will really become more and more important, because in order to deliver the value, we really have to be much closer to the end customer and into their end applications. And there's just an ongoing acceptance of Micron as the partner of choice; really, that's a piece of why. Part of it is technology, and part of it is becoming more of an advantage, I think, on a go forward basis. Part of it is we're truly global. So there are a number of things that come into that, but I think the ability to be really close to our customers, to get into their end applications and understand how we tailor the solution for them, is becoming rapidly more important, and all of those pieces are sort of foundational.
Yes, Liz. Thank you. I guess two questions. First one, as you think about the cost benefits that you anticipate with the move to 1x, where do you think you'll be, looking out 6 to 12 months, vis-a-vis Samsung in terms of cost per bit? And then the second question, in terms of capital intensity targeted at 30%, how do you see the mix there between spend on DRAM, NAND and other?
So first of all, relative to Samsung, obviously a tough competitor. I think that as we move through time, with the technology roadmap we've talked about and the other improvements we talked about today, I do think that we will continue to see some improvement in our cost structure, but I'd hate to try and put a time frame to that, because I don't know exactly what they're going to do. Sorry, the second part was... Oh, capital intensity. Yeah.
You know, it's going to move through time. We've got a NAND market that's growing very rapidly and a technology strength, the Gen 2, that drives significant manufacturing efficiency. But we have to be cognizant also of what we think might happen out there in the supply environment, and we've got very strong dynamics in DRAM as well. And we're going to have to take a look at supply/demand and end market demand dynamics and be somewhat flexible as we think about how big we want to be in those 2 technologies and what's the relative merit of spending a dollar in the DRAM environment versus the NAND environment, knowing that over time, hopefully, we'll be improving our gross margin profile relative to competition in both those technologies.
I guess following up on Mark's question on NAND profitability, as we talk about the 64 tier being 25 percent smaller die size than everyone else, that seems like a powerful statement. Does that translate to cost, or are there other things, with sort of floating gate or the CMOS underneath, that change the cost structure relative to the peers?
Yes. I mean, Scott can talk to it here in a second. There are obviously some complexities; a piece of it is what your depreciation load looks like, etcetera, etcetera. And we never have complete precision relative to what our competitors look like.
But Scott, do you want to add anything to that?
Well, it absolutely translates to cost. And process complexity differences between our competitors and us are hard to quantify exactly. But I mean, you can think of it as like for like on 64 tier: if you have so many wafers running in each facility, we have a substantial cost benefit on the 64 tier technology.
And I guess your competitors say it's harder to yield floating gate versus charge trap. Is there any truth to that?
I don't... Actually, maybe that's what they tell you. But hopefully, we left you with a feeling that we're very confident in our ability to yield floating gate technology the way we build it. I think we're on a great trajectory with 32 tier, already hitting some really unexpected yields relative to where we were when we started this technology development, and 64 is looking very solid also. So we have no yield concerns on this technology.
Harlan Sur with JP Morgan. Mark, back at our investor conference in May, you had talked about inserting your first EUV tool in your Boise R&D fab towards the end of 2016. Wanted to get an update from you: is the EUV process module currently installed? Any insights in terms of early performance, and more importantly, when does the team plan to intercept EUV along its technology roadmaps?
Scott, you want
to take that one? Okay. So the progress is as we had projected. We installed the tool; it's actually under installation now, and it still takes a while once you put those tools in. So it's in Boise, it's in the R and D fab now. From a roadmap interception point, it's really dependent on the performance of the technology. It's not an absolute enabler, for a technological reason, of any of our roadmaps at this moment, but absolutely we watch it from a cost benefit point of view and we'll continue to watch it.
And as the performance becomes cost competitive with the lithography methods we're using now, then we'll certainly roadmap it.
Mark, early on, you were talking about different markets driving demand, especially on the storage side, but you also talked about elasticity. Yep. To me, over the past so many decades, elasticity, for the most part, has been driven by customers getting you guys overly excited; you guys add capacity, and then demand doesn't really deliver, but the customer benefits from elasticity. And in that context, every time we asked earlier about margin profile, lack of visibility on ASPs is a big factor in not being able to discuss margins 1 or 2 quarters down the road. So what I want to hear from you is, how do you manage or think about elasticity to drive incremental demand versus preserving margins, especially with the increased focus on capital return?
Yes. We're not really adding any wafers anywhere. So the first thing to keep in mind is we are interested in staying on an efficient operating frontier with the technologies we have deployed. So if a technology deployment can help improve our manufacturing cost efficiency, we always take a look at that. It has to make sense on an ROIC basis, etcetera, etcetera.
But that's not a pure supply decision; that's really a manufacturing efficiency decision. And then we roll in behind that a sense of elasticity, which really has to be looked at sort of market segment by market segment. And we never have complete precision as to what that looks like, but we do have fairly good information from our ongoing interaction with our customers as to how they tend to think about decisions relative to more memory, and we bake that into our decision making as well. So we will be unlikely, I believe, to be the ones to upset the apple cart, so to speak, relative to bringing new wafers to the marketplace. We will really be focused on making sure we have enough new technology deployed to be manufacturing efficient and to meet the performance and form factor, quality, bandwidth and latency requirements of our customers on a go forward basis.
Let me just rephrase. Right now, we're seeing some sort of a de-speccing, especially in the consumer PC. You're seeing better than expected demand for hard disk drives. What happens a year from now when some of your server customers at data centers are just going to say, look, your prices are not going down, SSD elasticity is not there.
So we're going to put things on hold. How are we going to think through this? Because, again, the flip side of elasticity is lower prices.
Yes. So, well, it can go both ways, right? So I think one of the important bullets on Darren's slide was a bullet that talked about the fact that he can already deliver, at today's pricing, a significant cost of ownership advantage to data center customers with SSDs. So I can't tell you how much SSD pricing would need to go up in order to discourage them from doing that. But at today's pricing, we can do very, very well, thank you very much, and the customers can do very, very well as well.
So we're not really to that point where elasticity would even begin to kick in in the storage market. Relative to PCs, where unit growth is slow, certainly that's a segment where, you know, thankfully, it's a much smaller and smaller piece of our business on a go forward basis.
Doesn't that require you to be more vertically integrated, to cut out some of the margin sharing so that you could pass that on to the new customers, the so-called data centers? Well, there is great growth in the data center, and we need to be vertically integrated where we can add something of value. If we're not adding value and somebody else can do it more cost effectively, we didn't really solve anything. We just moved the bogey around to become our problem instead of somebody else's problem.
Oh, okay. Sorry. Thank you. Thanks for taking my question. I'm with Credit Suisse. I had a question on NAND. If we look at your technology that you showed on 64 layer, a significantly smaller die, that should give you, it seems like, better gross margins and operating margins than your competitors if you implemented it in production sooner.
So why are you not more aggressive in implementing the technology and becoming a leader in it? Because sometimes what we have seen in the past is that you take too long for a node transition, and during that time the competitors kind of caught up or moved ahead. Yes. So there are a lot of things that
go into that decision. Obviously, supply and demand balance is one. What are the other opportunities to deploy that capital inside the company, whether it's other NAND opportunities or DRAM opportunities or 3D XPoint opportunities? What do we want our balance sheet to look like? Where do we want that to be
long term, in order to have a solid foundation to drive growth throughout ongoing cycles? So there are a lot of different things that go into those decisions. And we will, over time, reallocate capital back and forth between different technologies and different parts of our business in order to generate the best return for our customers
and for our shareholders.
Thanks very much.
Steve Fox of Cross Research.
So, a couple of times during the presentation, it was mentioned that you guys don't feel like you have to pick the winners in your markets. But if demand and supply continue to remain tight, what are the chances that you may have to reconsider that and make some more customer-specific bets down the road, especially on some high volume targets like in auto and cloud? And then just one for Ernie: working capital was a positive to cash flow in Q1, and you just talked about some improvements going forward on working capital.
Should we see some of that flow through increasingly during the fiscal year? Any color on
that would be helpful. Thanks.
So on the first question, the discussion around picking winners: it's try not to pick winners too soon. What we really want to do is have a lot of excess demand and then pick and choose between it. Once we have that demand firmed up, we've got the design wins, we know that the customer wants that product from us; it's a much better thing to have too much demand than not enough demand. And yes, of course, we always have to think strategically about which market segments we want to make sure that we're in for the long term, which customers within those segments we need to be more aligned with or less aligned with in order to make sure that we have the right dynamics. And it could be across all sorts of different things, by the way, not just necessarily short term gross margin, but ability to innovate and create better technologies for the future, etcetera.
So we have to think about alignment of our customers, we have to think about alignment of segments, but we want to maintain flexibility within that construct. Ernie, you want to take the working capital and cash flow question?
Yes, thanks. So we were actually about free cash flow neutral in our fiscal quarter 1, maybe a slight deficit. We expect that we will generate significant positive free cash flow here in our FQ2, and that is a manifestation, or partial manifestation, of the things that I talked about. So for example, just per the discussion around the NAND gross margins, they are improving, and that's a reflection of both the cost position as well as the segmentation that I talked about. You've seen our inventory make progress over the course of the last couple of reported quarters. So those things are making their way into the P and L and the cash position of the company, and we expect that they will continue to do so over the course of the fiscal year.
Okay. Over here.
Just on the DRAM side, you guys talked about second half twenty seventeen seeing the 1y transition. What's the cost benefit you get from going to 1y from 1x? And on the NAND side, you talked about 64 layer 3 d NAND ramping. As you exit 2017, what percent of bits should be on 64 layer 3 d? Thanks.
Scott, you want to take a shot on that? Obviously, it's always a sliding target, 1y cost over time versus yield and volume and the ramp, and costs always start up higher and then come down. But within that construct...
So, a couple of comments on that. First, on 1y, just to clarify: 1y, that's our manufacturing introduction in the second half of fiscal twenty seventeen. So that won't be meaningful output; it won't output in fiscal 2017 for Micron. The meaningful output will be on the 1x node before the end of the fiscal year.
So 1y is really, as I mentioned, at the same kind of stage that 1x was last year at this time. The second part of that question was... Sorry. Oh, sorry. Yeah. So as we mentioned, we'll have meaningful output on 64 tier, and today we're not commenting on percentage.
Hi, Karl Ackerman on for Timothy Arcuri at Cowen. There have been several new memory fabs announced in China, and Yangtze Memory recently broke ground on a fab costing, I think, $7,500,000,000 for phase 1 to make 32 layer 3 d NAND. They say they're getting IP from Spansion, but whatever the source of IP, do you think they or other memory fabs in China could have an impact on supply and make a competitive product? And I had a follow-up, please.
So, setting aside the question of how much help Spansion might be to a new startup in China, I believe that eventually there can be some impact of significance, depending on how much money gets spent and how much capacity actually gets put in place over what time frame. I don't believe it'll be significant anytime in the next year or 2, and it probably would not be competitive in the next 5 years unless there's some major licensing activity from an incumbent producer in the industry today. Just as a follow-up,
do you think supply and demand fundamentals across both DRAM and NAND could remain exceptionally tight for a longer period of time because the memory player in Korea wants to drive industry profitability higher and maybe limit anyone from striking a deal with Chinese sovereign funds? Thanks.
I think you'd have to ask the memory player in Korea what their strategy is and what the rationale behind it might be. The players today are rational actors. I think they are going to act to try and maximize their profits over the long haul. And I feel very good about Micron's ability to compete with them in that type of marketplace. Maybe we'll take one more in the back of the room there.
And then we probably need to wrap it up.
Okay. Thanks, Romit Shah. I wanted to just follow up on the long term CapEx target of 30%. It actually looks like you'll come in a little bit below that, maybe 25% to 30% of sales for fiscal 2017. So after growing CapEx meaningfully the last few years, you're basically saying that CapEx will grow no faster than sales from here.
And if you look at NAND, it seems like you guys spent a lot of capital to buy new equipment for 1st generation 3 d, and now you're adding layers from here, so I would assume the capital intensity goes down. And then on DRAM, your product portfolio today looks like it hasn't been this competitive versus Samsung in quite a while. And I guess the last point would be that we don't hear you saying that you're planning to add a lot of capacity. So my question is, why can't the company do better than 30% CapEx to sales?
So, those are sort of long range targets. I think the reality is that over the long haul, the industry is becoming less capital intensive. We showed you bit growth over the next couple of years that we think is maybe slightly ahead of where the overall industry market will be. And that's reflective, frankly, of the fact that we have a great NAND technology position, and it takes capital to move that forward. We are excited about our ability to deploy a wide range of 1x nanometer DRAM products, but that requires a capacity that has a certain footprint in order to accomplish it.
And so while we think over time our CapEx intensity will become less, we have some things that we want to accomplish over the next year or 2 or 3 that we think may generate the types of returns that warrant those types of spends.
If next year is down in the memory industry, do you have the ability to bring down CapEx dollars? Do you have that flexibility?
We always have that flexibility, yes.
Okay. Thanks.
All right. Thank you all very much for coming. Again, we couldn't be more bullish about both our internal execution, in the past year and on a go forward basis, and the opportunities we're facing going forward. And we thank you for your interest in Micron and your ongoing support of the company.
Thank you.