2017 ASM

The following is my report of the 2017 eMagin ASM. Many thanks to both my wife and Coveman for reviewing it to ensure there were no discrepancies among our memories.

************************************************************************************

The ASM was moved this year to the offices of Goodwin Procter, LLP, eMagin’s new law firm. The switch in firms was made “in light of the greater complexity and sophistication of our legal issues in our deal structuring and negotiations with prospective strategic partners.” This is a much bigger firm --- it seems to have an entire floor of the New York Times building, and this is just its New York office. Our meeting had been moved back an hour to 9:00 A.M., which made our schedule very tight. Naturally the train was late, it was pouring rain, and there were no taxis. It was only Jeff Lucas’ gracious and effective effort to pre-alert security and speed our building-entry clearance that allowed us to reach the meeting essentially on time. We are grateful to him.

In contrast to the previous law firm’s office, which was much smaller and more compact, this office had wide hallways and huge areas of empty open space. We were in a large room spread out around a huge U-shaped table, which seemed even bigger since there were so few of us. There was no mingling --- people stayed in their seats. In the past all the directors were usually there, as well as some of the host lawyers and other people. This year some of the directors were missing. In addition to Andrew and Jeff, Chris Brody, Paul Cronson, and General Seay were present for the whole meeting. Jill Wittels presided over the formal meeting, which lasted seven minutes, and then left for another commitment. Les Polgar and Ellen Richstone were also there for part of the time. There was also someone named Marc from eMagin, although I don’t know his position in the firm. There was one other man I didn’t know, but I don’t think he was a lawyer.
Unlike the previous meetings there were no lawyers from the firm. This year Coveman and his son were also there, so we finally had a year where we were not the only shareholders.

In previous years the conversation proceeded with our asking questions of Andrew, so the questioners guided the order. This year, to facilitate that, I organized the questions like a script, with questions on similar subjects grouped together. But the meeting procedure also changed, with Andrew showing a series of slides and questions discussed in the order of the slides. My questions were completely jumbled as I hunted for the ones that corresponded to the current slide, and I only had time to scribble brief notes on the answers to those I did find and ask. As a result I missed some questions I had planned to ask.

The first slide was plastered with a mass of names of the many companies with which eMagin did business. I started by asking Andrew what he meant when he used the phrase “Tier 1 company”. Did he mean a major conglomerate such as Apple, Google, Microsoft, Samsung, Facebook, or Amazon, or could it mean just a leader in a specific area such as AMD, Intel, or Nvidia in electronics or Oculus, HTC, or ODG in headsets? His answer was, “All of them”.

In the Q117 CC Andrew said, “Our goal is to get funding for the next-generation display design as a precursor to establishing the mass production capacity that the consumer market would require”. He described a three-step strategy, saying, “In summary, we ….. develop the technology, are working on the next-generation display design, and working towards mass production with partners”. With respect to the first step of providing the next-generation technology he said in the Q117 CC, “We started this quest by listening to what the AR, VR, HMD companies needed.
That is high brightness, OLED microdisplays.” and “It is clear to us that our direct patterning technology is the game changer in AR and VR.” I asked if this meant that direct patterning represents eMagin’s key technical advantage and that DPD chips will ultimately dominate eMagin’s product line. He answered that ultimately they would but, with the possible exception of some monochrome versions, not until the new next-generation DPD manufacturing facilities were operational.

With respect to the second step of developing next-generation chips he said in the Q117 CC, “we’ve had great success here. One company is funding a next-generation display that we are currently developing, another is actively involved in discussions and for whom we have done preliminary work and the third is keenly interested in our current 2Kx2K design.” I asked if any of these three companies was one of the Tier 1 companies with which eMagin had announced an agreement and he answered, “No”. I asked if one of these three companies was the one that paid the licensing fee and if so which one. He said, “No” but didn’t elaborate further.

With respect to the third step of establishing manufacturing facilities he said in the Q117 CC, “we are actively talking to high volume manufacturers to join our commercial display partners and us to fund and build the production capacity to handle the volumes required for the commercial market. Our commercial display partners are also introducing potential mass production partners to us”. I asked if the word “precursor” meant that the next-generation display design had to be completed before beginning to establish mass production capacity. He said yes --- one had to know the design characteristics of the chips to be built before designing machines to build them. I said that could make it years before actual mass production was in full swing. He agreed.
He said the prerequisite chip design cycles could take a year, and then it would take at least nine more months to get the machine built and another three to get it installed. I said I thought such a machine schedule sounded optimistic --- it was similar to the original one with the SNUP machine and we all know how that turned out. Moreover there’s another half-year of qualification runs and 1000-hour life tests, etc. before actual full production can start. So it seems to me it’s at least 2020 before we see mass-produced next-generation display chips bringing in revenue. Andrew mentioned there were alternatives to SNU Precision in Korea, as well as a couple of companies in Japan which he said looked promising.

It seems amazing that the comment, “The future is glorious, but it is always the future!” has held true for almost twenty years, but it appears now that it will have to continue to do so for still another few. Andrew also pointed out that in the interim eMagin still had its current “best of breed” product line, which many customers are finding attractive. In addition, the 2K ULT chip available soon has key properties of the next generation, i.e., direct patterning and the advanced backplane. It’s true that the next generation of these displays hasn’t been built yet, but it’s also true that the next generation of competing displays hasn’t either.

Rafal asked a question about “how far the discussion between a potential manufacturer and potential tier 1 customer is going”. Other than vague comments that things were progressing well, Andrew made it clear he was not going to elaborate on any details of that subject. Rafal also asked if all eMagin’s potential is in direct patterning. With the exception of Blaze, discussed below, the answer appears to be yes. It’s clear that the mass-production facilities being developed are for direct patterning. If the current products are Generation N then direct patterning will be Generation N+1.
Andrew did not address whether these facilities could also produce the current line, but given the ability of direct-patterned displays to trade off between brightness and power there doesn’t seem to be any basis, other than possibly the need for a specific older physical size, for producing the older chips. He did not say whether earlier displays would be discontinued. This dependency on direct patterning could leave eMagin vulnerable to a possible breakthrough in a competitive technology such as HoloLens, Oculus Focal Surfaces, or Magic Leap, but even if that occurred it would probably be well into the future, and direct patterning would have a big head start on moving to Generation N+2, which I believe, as discussed below, will be foveated rendering.

In a related question Lurker wanted to know what market the XGA display was re-developed for. Andrew said it had not been re-developed --- there was no XGA II. Any new interest in it was simply because “it has available properties people like”. This would indicate that eMagin has no aversion to dusting off an earlier product if there’s a demand for it. There was also some comment about Sony’s XGA and he said he really didn’t know what Sony was doing with it.

FBL and Akimed commented on the fact that though Karl Guttag had written that Sony was the “go-to” company for OLED microdisplays, Doug Lanman had switched from using Sony when working on the Nvidia prototype in 2013-2014 to using eMagin for the current Oculus Focal Surface Display prototype, and wanted to know eMagin’s reaction. Andrew made some vague comment along the lines of eMagin’s products being attractive, but he had a smile like the cat that just ate the canary.

Acy asked about the position of Sony in the competitive landscape. Andrew didn’t seem worried about it. In addition to the above comments on the XGA and the Oculus Focal Surfaces prototype, Coveman remembers Andrew repeating his comment that Sony only made displays for itself, not for 3rd parties.
I remember his saying that one handicap Sony would have if it tried to sell to third parties was that it was a headset manufacturer itself, and that other headset manufacturers would be nervous about having their supply chains at the mercy of a competitor.

MW asked about micro LED. Andrew said that it did have some attractive properties, particularly high brightness, but that it was more amenable to big screens. He said Sony had tried it and had trouble. Current pixel densities were about 300 ppi and, for reasons I didn’t understand, it was very tough to put the LED dot in the right place. He gave the impression that if it ever were a threat it would be well into the future.

Garce had questions about Blaze. Andrew said that consumer sales were slow compared to commercial ones, i.e., first responders, etc. This doesn’t seem surprising --- I would have thought the Z800 would have shown eMagin that consumer merchandising was not its strong point. Apparently all those hunters Dan Cui was hanging out with keep a tight grip on their wallets. The commercial response was good, however, and the company will be concentrating on the commercial sector. He said that among other things it had cheaper marketing costs, steadier demand, and higher margins. Although it was not explicitly stated, I got the impression the emphasis was mostly on the Torch rather than the Spark. He also said that in addition to the IR sensor it contained a special high-tech prism, and that eMagin would be amenable to licensing the IP for both of these.

Lurker expressed concern about unpatented trade-secret protection in a manufacturing-partnership environment. I had asked Andrew this question years ago with respect to SNUP, and he said that just building the machine did not necessarily spell out how it would be used.
He had said eMagin might ask to have a temperature sensor installed in an unusual place, for example, but that didn’t tell SNUP why it was at that location, at what point in the process it should be monitored, or what readings were acceptable. I said that wouldn’t be applicable in a manufacturing partnership, since the partner would be actually producing the product and thus know every step. He agreed and said eMagin had to lock it up via the contracts and agreements, and though difficult it was possible. I guess that’s one reason eMagin switched to such a high-powered law firm. He emphasized that patents expire in 20 years while trade secrets never do. (I thought the patent term was 17 years, but that was the old rule --- since 1995 U.S. patents run 20 years from the filing date.) I thought this was an optimistic view on how long one could keep such secrets from leaking.

I asked about human limits. There is a quest for ever-higher levels of brightness, for example, and there have been references to monochrome levels of 24,000 nits. I asked at what level we begin burning people’s eyes out and what was the point of going further. Andrew said that if you looked directly at a 10,000-nit source you would be very uncomfortable. He added, however, that the brightness levels quoted were the level leaving the display, and that the reflectors, splitters, etc. in the optical path each take a chunk of this energy. Only a small fraction (I think he said 20% or less, but I’m not sure) ever reaches the eye. So you might need 50,000 nits at the chip to reach the “very uncomfortable” level of 10,000 nits at the eye.

In a similar vein I asked about limits with respect to resolution. The human eye can only detect down to a resolution of one arc-minute. This means that a desirable FOV of 120 degrees would require 120 times 60, or 7,200, pixels. On a 1-inch chip this corresponds to a pixel size of about 3.5 µm.
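As a sanity check, the back-of-envelope arithmetic above can be worked through in a few lines of Python. All figures come from the discussion; the ~20% optical throughput is the one I was unsure of, so treat it as an assumption:

```python
# Sanity check of the brightness and resolution arithmetic above.
# The ~20% optical throughput is an assumption (the figure I was unsure of).

# Brightness: nits needed at the chip for 10,000 nits at the eye.
throughput = 0.20                   # fraction of chip light reaching the eye
nits_at_chip = 10_000 / throughput  # 50,000 nits

# Resolution: one pixel per arc-minute across a 120-degree FOV.
pixels_h = 120 * 60                 # 7,200 pixels

# Pixel pitch needed to fit them on a 1-inch chip.
INCH_UM = 25_400                    # 1 inch in micrometers
pitch_um = INCH_UM / pixels_h       # ~3.5 um

print(nits_at_chip, pixels_h, round(pitch_um, 2))
```

The numbers agree with what was said at the meeting: 50,000 nits at the chip, 7,200 pixels, and a roughly 3.5 µm pitch.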
Andrew said the optical system can spread these out to obtain a wider FOV, and with this pixel size the optical system would have to spread them out by a factor of two (he didn’t say if there was a “spreadability” limit). Sony has achieved 8 µm and eMagin is currently at 9.6 µm, so there is a long way to go to reach the 3.5 µm limit. Even if such a small size were attainable, however, it might not be practical, as Andrew has said in the past that independent of the problem of producing such a small pixel, it’s already a problem fitting the required backplane circuitry under such a small area. One solution is to increase the size of the chip, and this may be the impetus behind the 35 mm chip project. For a 35 mm chip a 120-degree FOV would require a pixel size of 5 µm, which might be possible. Alternatively, with an 8 µm pixel size the resolution of such a display would be 1.6 arc-minutes, which is less than double the limit of the eye and probably good enough for most people.

Garce had asked about work on the 35 mm chip and I was eager, myself, to find out about this, but unfortunately in the confusion the question got buried and I forgot to ask it. My apologies.

Even if the manufacturing problems could be solved there is another problem in reaching the limit of the eye. An FOV of 120 degrees horizontal and 100 degrees vertical would require 7,200 times 6,000, or around 43 megapixels. An image refresh rate of 120 hertz would then require a pixel refresh rate of over 5 gigahertz. This far exceeds the display computing power available today. A solution to both the manufacturing and computing problems is foveated rendering. The limit of 1 arc-minute applies only to the fovea in the center of the retina, where vision is the sharpest. The resolving power of the eye falls off sharply with increasing distance from the center.
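The 35 mm and bandwidth figures above work out the same way (a quick sketch; every input is a figure from the discussion):

```python
# Check of the 35 mm chip and bandwidth arithmetic above.
pixels_h = 120 * 60                 # 7,200 pixels across a 120-degree FOV
pixels_v = 100 * 60                 # 6,000 pixels across a 100-degree FOV

# Pixel pitch for a 1 arc-minute, 120-degree FOV on a 35 mm chip.
pitch_35mm_um = 35_000 / pixels_h   # ~4.9 um, i.e. roughly the 5 um quoted

# Angular resolution if the 35 mm chip were built at an 8 um pitch instead.
res_8um_arcmin = (120 * 60) / (35_000 / 8)   # ~1.6 arc-minutes

# Total pixels and the raw pixel rate at a 120 Hz image refresh.
total_px = pixels_h * pixels_v      # 43.2 million
pixel_rate_hz = total_px * 120      # ~5.2 billion pixels per second

print(round(pitch_35mm_um, 1), round(res_8um_arcmin, 2),
      total_px, pixel_rate_hz)
```

So the 5 µm, 1.6 arc-minute, 43 megapixel, and 5 gigahertz figures are all mutually consistent.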
Because the eye’s resolving power falls off away from the center, if eye-tracking is available the chip can display the point looked at in a small central area populated densely enough with pixels to reach the 1 arc-minute limit. The optical system can then spread the remaining outer chip pixels much more sparsely to fill the FOV. This drastically reduces the number of pixels required on the chip and reduces the need for both pixel-size reduction and computing power. Andrew said at last year’s ASM that eMagin was already looking at foveated rendering, and the fact that the advantages are so significant is why I said above that I believe the N+2 generation following the direct-patterned generation will be the foveated-rendering generation.

Akimed wanted to know about “the Oculus Focal Surfaces Thingy” and there was quite a discussion about it. There’s a lot of information about this topic via Google, so the following is just my summary of what I found. There are two ways the brain determines the distance to an object. The first is accommodation, which is the flexing of the eye muscles to change the shape of the lens so that the object is in focus. If an object is close the lens becomes thicker, and if it’s far away the lens becomes slimmer. The shape of the properly focusing lens is thus an indicator of the distance. The second is convergence, which is the degree to which the eyeballs must be angled towards each other to have them both point at the object. If an object is close the eyeballs are angled to a greater degree, and if farther to a lesser degree. The degree of convergence is thus also an indicator of the distance. Since these two cues are redundant the brain expects them to agree, and in normal reality they do.
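The convergence cue is simple geometry. A toy calculation (assuming a typical interpupillary distance of 63 mm --- my illustrative number, not one from the meeting) shows how strongly the required eye angle varies with distance, which is exactly what stereoscopic rendering manipulates:

```python
import math

# Vergence (convergence) angle vs. object distance.
# The 63 mm interpupillary distance is a typical adult value,
# assumed here for illustration only.
IPD_M = 0.063

def vergence_deg(distance_m):
    """Total angle between the two eyes' lines of sight, in degrees."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

# Nearby objects demand much more convergence than distant ones:
for d_m in (0.5, 1.0, 2.0, 10.0):
    print(f"{d_m:5.1f} m -> {vergence_deg(d_m):.2f} degrees")
```

An object at half a meter demands roughly four times the convergence of one at two meters, so the brain gets a strong, easily manipulated distance signal from this cue --- while the accommodation cue stays pinned to the display plane.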
Since the headsets provide separate images to the right and left eyes, for VR the system can control the degree of convergence by modifying the images stereoscopically, causing the convergence to shift from the value for the display image plane to the correct value for the desired virtual distance. Unfortunately there is currently no corresponding technology for shifting accommodation. The eye remains focused on the display image plane, so the lens shape remains constant at the correct shape for this distance. The two supposedly redundant cues do not agree, and the Oculus research indicated that this was a source of confusion and stress. The Oculus Focal Surfaces are a technology that allows the system to modify the image so that the lens is shaped to focus on the desired virtual distance rather than the display image plane. The two depth-perception cues are then in agreement.

In the military, convergence is not a factor because soldiers only use one eye in shooting. In the discussion General Seay explained that the military could cope with accommodation by training soldiers to shift their vision focus very quickly between front and rear sights. Paul Cronson was of the definite opinion that the ability to control the accommodation cue of depth perception would be a very desirable feature. Andrew said that if this technology does pan out, he thought it lies well into the future. The Oculus team termed it research and said there’s a long way to go. IMHO it would probably be an N+3-generation factor at the earliest.

Rafal asked about Intevac offering 2K x 2K chips for Elbit headsets. I always thought eMagin sold chips directly to Elbit. I was not familiar with Intevac, so I thought he was mentioning it as a competitor to eMagin offering its own 2K x 2K chips. Since I had never heard of it as a direct competitor I thought it was relatively unimportant, and as time was short I skipped the question. Big mistake!
Thinking back, I think I remember (but I’m not sure) the name Intevac on the first slide crowded with the names of companies with which eMagin was involved. When I got home and checked the Intevac website I found the statement, “We are the sole source provider of integrated digital imaging systems for most U.S. military night vision programs.” I immediately thought of the biggest of these, ENVG III, by BAE and DRS. I have always assumed eMagin sells chips directly to these companies, but this statement implies that it sells them to Intevac, which builds them into subassemblies which it then sells to BAE and DRS. This would be analogous to Sagem’s FELIN program, where eMagin did not sell chips to Sagem but to Thales, which built them into its Minie D, which it then sold to Sagem for incorporation into FELIN. I’m embarrassed by my ignorance in this area and would appreciate any clarification people can offer.

Lurker also asked about comments a few years ago concerning a large 10-year aviation helmet upgrade program. I did not have time to ask about this, but if you are referring to the F35 helmet program I can offer some comments. This program was initiated at a time before eMagin OLED chips were a viable candidate, so it went with LCD. The helmet was deliberately designed, however, to be upgradeable with respect to the display system. I asked Andrew about this a few ASMs ago and he said eMagin had progressed to the point where its chips were now a viable alternative, and if an opportunity presented itself they would be ready. The F35 program has had a history of trouble with the display, particularly with effects of “green glow”, and while there have been continual improvements, the problem persists. I’ve read a recent report that said the system is now as good as it can be made with the current hardware, so if the Air Force really wants to eliminate the problem it will probably have to upgrade to OLED. If it’s the F35 program, however, I believe it’s fairly small.
Andrew originally mentioned it in contrast with a helicopter program and used its smallness to say, indirectly, that the helicopter program, by contrast, would be large. I read a recent report that said the F35 helmet would involve 3,000 units, but I also read another that said 20,000, so who knows.

Coveman asked why the stock offering was necessary. Jeff answered that the company wanted to improve its cash position so that the balance sheet would look better.

As the meeting was breaking up I had a chance to talk to Andrew directly. I was interested in his feelings about the future path of the company. It seemed to me that eMagin could try to grow into the key supplier of microdisplays, selling to the various VR/AR players in the manner of Intel’s becoming the key supplier of CPU chips to the various PC makers, where “eMagin Equipped” became analogous to “Intel Inside”. Alternatively, it could be bought out and become part of a vertically integrated major. I asked whether at 66 he was interested in trying to climb on top of a leviathan, worrying about business factors such as financing the tooling of new manufacturing generations, when to discontinue support for older products, etc. He said what he enjoyed most was building and working with teams to solve problems. I got the impression that he wasn’t looking to drive another leviathan version of Intel down the highway, but that’s only my impression.

As to a buyout, that would depend on Mort Sackler. Chris Brody, Mort’s representative on the BOD, has said at past ASMs that Mort was not averse to a buyout “but not at fire-sale prices!” It’s JMHO, but the most logical candidate to me would be Facebook. It is the only major player without its own vertically integrated stack of suppliers. Except for Oculus as a headset maker, Facebook is limited to software.
If it acquired eMagin for microdisplays and AMD with its Radeon platform and CPU/GPU chips, it would have a complete vertical stack and be on an equal footing with the other major players. That would be expensive, but Facebook is not a poverty pocket.

We left the meeting area by different routes, but I bumped into Andrew again in the coat room. I asked him, “Do you have enough cash to see you through all this?” He answered, “I’m comfortable with our cash position” with a contented smile on his face. I asked, as Rafal had also, whether the military ramp-up and other increases would be enough to break even. He repeated, “I’m comfortable with our cash position” with the same contented smile. So we may see a return to breakeven or profitability soon. OTOH, it could be there are plans for another stock sale, an imminent buyout, or something in between.