OLED Association
Musing-Weekly Newsletter

The Impact of Machine Learning
November 27, 2017
 
It's becoming critical that phone companies start working on their own machine learning solutions now in order to remain competitive, because machine learning threatens to become essential and core to the user experience as early as next year. Chinese companies may iterate on hardware at ludicrous speed, but the rules change when the thing you're trying to replicate is months or years of accumulated machine learning.

One of the most impressive expressions of machine learning in consumer tech to date is the camera on Google's Pixel and Pixel 2 phones. Its DSLR-like performance in low-light conditions is astonishing. Google's imaging software transcends the traditional physical limitations of mobile cameras (namely, the shortage of physical space for large sensors and lenses) using a combination of algorithms and machine learning. Google has turned a light problem into a data problem, and few companies are as adept at processing data as Google. Even if Google had done nothing whatsoever to improve the Pixel camera between the Pixel and Pixel 2 launches, the simple accumulation of machine learning time would have made the camera better. Time is the added dimension that makes machine learning even more exciting: the more resources applied to a machine learning setup, the better its output becomes, and time and processing power (both on the device itself and in Google's vast server farms) are crucial.

Google Assistant is not a differentiating feature for hardware, since Google wants Assistant running on every device possible. But the Assistant serves as a conduit for funneling users into Google search and the rest of the company's services, practically all of which benefit from some variety of machine learning. What Assistant does for the mobile market is enhance Google's influence over its hardware partners: pity the manufacturer that tries to ship an Android phone in 2018 without either the Google Play Store or Assistant on board.
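The "light problem into a data problem" framing can be made concrete with a toy sketch of multi-frame merging: averaging a burst of noisy low-light readings cuts sensor noise roughly in proportion to 1/√N. This is purely illustrative; it is not Google's actual imaging pipeline, and every number below is invented.

```python
import random
import statistics

# Toy sketch of multi-frame merging (illustrative only -- not Google's
# actual pipeline; brightness, noise level, and burst size are invented).
random.seed(0)

TRUE_BRIGHTNESS = 100.0   # the "real" luminance of one pixel
NOISE_SIGMA = 20.0        # per-frame sensor noise in dim light
N_FRAMES = 16             # burst size

def capture_frame():
    """Simulate one noisy low-light reading of the pixel."""
    return TRUE_BRIGHTNESS + random.gauss(0.0, NOISE_SIGMA)

def trial():
    """Return (single-frame error, merged-burst error) for one scene."""
    single_err = abs(capture_frame() - TRUE_BRIGHTNESS)
    merged = statistics.fmean(capture_frame() for _ in range(N_FRAMES))
    return single_err, abs(merged - TRUE_BRIGHTNESS)

results = [trial() for _ in range(200)]
single_mae = statistics.fmean(e for e, _ in results)
merged_mae = statistics.fmean(e for _, e in results)

print(f"mean error, single frame: {single_mae:.2f}")
print(f"mean error, 16-frame merge: {merged_mae:.2f}")
```

The merged estimate's error should come out several times smaller than the single frame's, which is the sense in which more data (and more processing) substitutes for more light.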
 
On the Apple side, machine learning already permeates much of the software running on the iPhone, and the company's Core ML tools are making it easy for developers to add to that library. But the big highlight feature of the new iPhone X, the thing everyone notices, is the notch at the top of its display and the technology contained within it. Up in that monobrow section sits a full array of infrared and light sensors, something tantamount to a miniature Microsoft Kinect system, which facilitates the new Face ID authentication method. It remains an open question how well Face ID strikes the balance between security and convenience (especially without the fallback of Touch ID's fingerprint recognition). The system is robust enough to work in the dark and, thanks to machine learning, it adapts to changes in the user's appearance. Strip away all the usual incremental upgrades and design tweaks, and the Face ID system is the iPhone X's defining new feature. And it relies on machine learning to work its technological magic.

It may still be early for machine learning enhancements to truly be the key selling point for mass-market phones. Face ID is of secondary importance to iPhone X purchasers more attracted by the new, bezel-phobic design. While Google's camera is the best reason to own a Pixel, there are still few Pixel owners, and the well-publicized problems with the Pixel 2 XL's pOLED display don't help. Outside of Apple and Google, Huawei has been the biggest proponent of implementing machine learning and AI in mobile devices. The company's latest phone and processor are both marketed as having "the real AI" smarts. Huawei is moving in the right direction with this AI push. However, unlike Apple and Google, both of which have turned machine learning into tangible, obvious and (literally) user-facing features, Huawei's approach is to dig into the far less marketable sphere of using machine learning to optimize Android performance over the course of long-term use.
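The appearance adaptation attributed to Face ID above can be illustrated with a toy template-update scheme. Apple has not published Face ID's internals, so everything here, the embedding vectors, the exponential-moving-average update, and the adaptation rate, is an assumption for illustration only.

```python
import math

# Toy adaptive biometric template (illustrative only -- not Apple's
# actual Face ID algorithm; vectors and update rule are invented).
# Each successful unlock nudges the stored face embedding toward the
# new scan, so gradual appearance changes are absorbed over time while
# a single unusual scan moves the template only slightly.

ALPHA = 0.1  # adaptation rate per successful unlock

def cosine_similarity(a, b):
    """Match score between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def update_template(template, scan, alpha=ALPHA):
    """Exponential moving average of embedding vectors."""
    return [(1 - alpha) * t + alpha * s for t, s in zip(template, scan)]

template = [1.0, 0.0, 0.0]   # embedding captured at enrollment (toy 3-D)
drifted = [0.8, 0.6, 0.0]    # the same face as it looks months later

before = cosine_similarity(template, drifted)
for _ in range(20):          # twenty successful unlocks later...
    template = update_template(template, drifted)
after = cosine_similarity(template, drifted)

print(f"match score before adaptation: {before:.3f}")
print(f"match score after adaptation:  {after:.3f}")
```

The design point is that a small per-unlock alpha lets the stored template track slow drift (a beard, glasses, aging) while any single anomalous scan barely moves it.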
However, it's hard to imagine Huawei's behind-the-scenes optimization being a true differentiator when people are comparing shiny new phones in a store. Huawei is also putting some marketing weight behind a "camera AI" that tries to automatically enhance images by detecting what is being photographed, but it is nowhere near as effective as Google's Pixel camera.
Huawei's example shows that machine learning itself is not the unique selling point; the unique selling points are, and will be, built on top of machine learning. The OLED display on the iPhone X is impressive, but as pricey and exclusive as it may be, that panel comes from Samsung and is not exclusive to Apple. Every new hardware tweak from Apple seems targeted at making the manufacturing of its devices trickier and more technical, such as the Taptic Engine for haptic feedback, the 3D Touch interaction on iPhone displays, and the Touch Bar on the newest MacBooks, but all of those are ultimately systems that can be reverse-engineered and replicated by others. The days of phone makers being able to secure a major hardware advantage for longer than a few months are gone. At this late stage of the evolution of smartphones, machine learning is the only path toward securing meaningful differentiation. Google's camera is widely underrated, mostly owing to Google's chronic inability to distribute Pixel devices widely enough. Face ID will be copied, likely badly, by a whole slew of aspiring competitors. But the distinguishing line between the true mobile innovators and the fast copycats, which had until recently been blurring and fading, is being redrawn by machine learning. Apple's ARKit is positioned as the forerunner to the new world of immersive computing and takes machine learning to another level:
  • TrueDepth Camera -- iPhone X and ARKit enable a revolutionary capability for robust face tracking in augmented reality apps. Using the TrueDepth camera, an app can detect the position, topology, and expression of the user's face, all with high accuracy and in real time, making it easy to apply live selfie effects or use facial expressions to drive a 3D character.
  • Visual Inertial Odometry -- ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around it. VIO fuses camera sensor data with Core Motion data. These two inputs allow the device to sense how it moves within a room with a high degree of accuracy, and without any additional calibration.
  • Scene Understanding and Lighting Estimation -- With ARKit, iPhone and iPad can analyze the scene presented by the camera view and find horizontal planes in the room. ARKit can detect horizontal planes like tables and floors, and can track and place objects on smaller feature points as well. ARKit also makes use of the camera sensor to estimate the total amount of light available in a scene and applies the correct amount of lighting to virtual objects.
  • High Performance Hardware and Rendering Optimizations -- ARKit runs on the Apple A9, A10, and A11 processors [iPhone 6s series and later]. These processors deliver sufficient performance to enable fast scene understanding and support detailed and compelling virtual content on top of real-world scenes. A related web page, which provides guidelines for developers, includes reminders that AR experiences on the iPhone are not optimal if they bother the consumer.
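The Visual Inertial Odometry bullet above describes fusing camera data with Core Motion data. The core idea can be sketched with a one-dimensional toy filter; this is not Apple's implementation, and the gain, bias, and fix schedule below are invented for illustration.

```python
# Toy sketch of the sensor-fusion idea behind Visual Inertial Odometry
# (illustrative only -- not Apple's implementation). Dead-reckoned
# motion-sensor position is smooth but drifts; occasional camera-based
# fixes are absolute; blending the two bounds the drift.

def fuse_track(imu_steps, camera_fixes, gain=0.3):
    """Integrate per-tick IMU displacements, pulling toward camera fixes.

    imu_steps:    per-tick displacement estimates (each carries drift bias)
    camera_fixes: dict tick -> absolute position fix from the camera
    """
    position = 0.0
    track = []
    for tick, step in enumerate(imu_steps):
        position += step                      # dead reckoning (drifts)
        if tick in camera_fixes:              # vision supplies a fix
            position += gain * (camera_fixes[tick] - position)
        track.append(position)
    return track

# Device moves 1 unit per tick along a line; the IMU over-reads by 5%.
TRUE_SPEED, BIAS, TICKS = 1.0, 0.05, 40
imu_steps = [TRUE_SPEED + BIAS] * TICKS
# A camera fix every 5 ticks, placed exactly on the true trajectory.
camera_fixes = {t: TRUE_SPEED * (t + 1) for t in range(4, TICKS, 5)}

fused = fuse_track(imu_steps, camera_fixes)
imu_only = fuse_track(imu_steps, {})

true_final = TRUE_SPEED * TICKS
print(f"IMU-only final error: {abs(imu_only[-1] - true_final):.2f}")
print(f"fused final error:    {abs(fused[-1] - true_final):.2f}")
```

Dead reckoning alone accumulates the 5% bias into a large terminal error, while the occasional camera fixes keep the fused estimate's error bounded, which is essentially why VIO can track motion accurately without additional calibration.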
The high degree of AAPL's enthusiasm for AR -- AAPL's CEO, Tim Cook, was unreservedly effusive about this technology in his prepared remarks during the Q3 conference call, saying:
  • The launch of iOS 11 also made iOS the world's largest platform for augmented reality. There are already over 1,000 apps with powerful AR features in our App Store today, with developers creating amazing new experiences in virtually every category of apps aimed at consumers, students and business users alike. Put simply, we believe AR is going to change the way we use technology forever.
  • We're already seeing things that will transform the way you work, play, connect and learn. For example, there are AR apps that let you interact with virtual models of everything you can imagine, from the human body to the solar system. And of course, you experience them like they're really there. Instantly, education becomes much more powerful when every subject comes to life in 3D. And imagine shopping when you can place an object in your living room before you make a purchase, or attending live sporting events when you can see the stats on the field. AR is going to change everything.
That's unequivocal about how AAPL looks at AR's future. It is also important to note that the huge size of the iPhone market, and the speed with which so many iPhone users upgrade to the latest iOS version, may well make iOS ideal for AR software developers.
Where AAPL wants to go to take AR to the next level -- A TechRadar article from last month, Augmented reality is ready for the big time - but it needs one more thing, explains the practical problem and where things have to go for AR to be more useful. From the final section of its article:
  • EyePhones -- But there's still a slight issue - holding the phone up in front of your face still feels weird. The space to prod is small, even with phablets, but tablets are cumbersome to carry around. The idea that you can watch a movie played out in 3D on your dining table is amazing, but you'll soon tire of holding a device out... Throughout all the demonstrations, it became clearer and clearer where the watershed moment lies with AR: a pair of glasses that enhances everything you're looking at.
From an interview that Tim Cook gave last month in London with the Independent, we get a good look at AAPL's thinking:
  • Cook claims that shopping will be changed "entirely" by augmented reality, and says that he doesn't think "anything will be untouched." He is really pumped about AR.
  • AAPL believes that AR can provide it with a large competitive advantage. From the same newspaper report:
  • "Our competitors are trying to mimic what we've done," says Greg Joswiak, Apple's vice president for iOS, iPad and iPhone marketing. "But they just don't have that scale we bring to it."
  •  [Apple's] competitors "don't control the hardware and software", Cook says. "It goes to what Apple is about - the integration of those two things, with the App Store on the server side. I think it's going to be hard for other folks [i.e., Android]."
  • AAPL is talking tough here. Again, AAPL under Tim Cook has tried to double down on secrecy, so I take what it does say at the CEO level seriously.
 
Elsewhere in this interview, Cook overtly compares the ramp of AR to the ramp of the App Store itself: a slow starter, and underestimated, but then huge. However, there's a big "but":
  • "There are rumors and stuff about companies working on those - we obviously don't talk about what we're working on," Cook says...
  • "But today I can tell you the technology itself doesn't exist to do that in a quality way. The display technology required, as well as putting enough stuff around your face - there's huge challenges with that.
  • "The field of view, the quality of the display itself, it's not there yet," he says. And as with all of its products, Apple will only ship something if it feels it can do it "in a quality way".
  • So, AAPL is looking and, I infer, researching matters with avidity, but it's not "there" yet.
Apple Inc., seeking a breakthrough product to succeed the iPhone, aims to have the technology for an augmented-reality headset ready in 2019 and could ship a product as early as 2020. Unlike the current generation of virtual reality headsets that use a smartphone as the engine and screen, Apple's device will have its own display and run on a new chip and operating system, according to people familiar with the situation. The next step after ARKit, creating a headset with a built-in display capable of streaming 3D video without draining the battery, is much more complicated – see the Apple article later in this Musing. Referring to the challenges of creating displays, Chief Design Officer Jony Ive told a tech panel last month that "there are certain ideas that we have and we are waiting for the technology to catch up with the idea."

There is a good deal of detail stuffed into that brief article, which I will not quote. One item must be mentioned: according to Bloomberg's information, because Apple doesn't have a fully operational headset of its own, engineers have begun using HTC Vive headsets for testing purposes. If this is true, then I doubt that anyone can really put a time frame on what will be ready at any specific time. But this does lay down a marker for us as AAPL investors or potential investors.
AAPL needs a new dream to really get its P/E above 20X in this market.
  • For starters, it would be great if more AR applications arrive for the iPhone and iPad. They will help increase sales, with the iPhone X (and future latest-greatest models) the greatest beneficiary. Better still, if AR on iDevices really takes off, the App Store will sell that many more AR apps, at higher prices. Thus, AAPL's services segment should be a big beneficiary, especially given its high profit margins.
  • However, if AR were merely to drive extra iPhone and app sales, it would not be necessary for AAPL's CEO to make such high-profile endorsements of where AR is going. This makes me think that AAPL wants the entire AAPL ecosystem to become increasingly invested in AR, from app developers to customers looking forward to the next hit AR-based games, to various practical uses... and then to a more comfortable, more capable system. In other words, Google Glass (NASDAQ:GOOG) (NASDAQ:GOOGL), but much more functional and attractive.
Which gets to the AR dream that could help AAPL power higher. AAPL wants to take AR far beyond the Google Glass concept, but do it with the level of fit and finish, and commercial potential, that is AAPL's hallmark. Based on prior and sometimes lesser AAPL opportunities, such as the large-screen TVs that Gene Munster and others latched onto, watch for additional leaks, rumors, and what-have-you about AAPL's interest and progress in AR. Remember, AAPL never said much publicly about the Car; that was a media thing, because AAPL knew the Car was a difficult, uncertain project on which it did not want the CEO to take a stand. That's the opposite of how it is handling AR.
The lowest-cost iPhone that Apple is expected to launch next year is a device with a full-face 6.1-inch liquid crystal display (LCD). The two higher-priced phones expected to debut alongside it should have 5.85-inch and 6.46-inch organic light-emitting diode (OLED) displays, respectively. On social media, independent technology analyst Kurt Marko argued that the manufacturing cost difference between the 6.1-inch LCD iPhone and the 5.85-inch OLED iPhone won't be that large, making the idea of a lower-end 6.1-inch LCD iPhone questionable. What should help Apple with OLED costs is the arrival of a second and third supplier, most likely in 2019.


Contact Us

Barry Young
barry@oled-a.org

Neo Kim
neo@oled-a.org

Sungeun Kim
sungeun@oled-a.org

Visit us at OLED-A.org



COPYRIGHT 2022 OLED ASSOCIATION. ALL RIGHTS RESERVED.