<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Ian Uymatiao]]></title><description><![CDATA[I write what I think, then do it all over again]]></description><link>https://ianuymatiao.com/</link><image><url>http://ianuymatiao.com/favicon.png</url><title>Ian Uymatiao</title><link>https://ianuymatiao.com/</link></image><generator>Ghost 3.17</generator><lastBuildDate>Sat, 15 Nov 2025 16:46:51 GMT</lastBuildDate><atom:link href="https://ianuymatiao.com/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Public Domain Day 2024: It's D-Day (D as in Disney)]]></title><description><![CDATA[Public Domain Day in 2024 stands out above most other years due to an important milestone in the history of copyright law.]]></description><link>https://ianuymatiao.com/public-domain-day-2024-d-day/</link><guid isPermaLink="false">6591aa1bb16a7e06206542a6</guid><category><![CDATA[movies]]></category><category><![CDATA[public domain]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Mon, 01 Jan 2024 06:46:00 GMT</pubDate><media:content url="http://ianuymatiao.com/content/images/2023/12/Steamboat-Willie-2.jpeg" medium="image"/><content:encoded><![CDATA[<img src="http://ianuymatiao.com/content/images/2023/12/Steamboat-Willie-2.jpeg" alt="Public Domain Day 2024: It's D-Day (D as in Disney)"><p></p><p>January 1st is one my favorite times of the year. Not just because it's a public holiday and a good time to make New Years Resolutions, but because it's also an important day for human knowledge. Every January 1st, a new set of creative works enter the public domain, where they are allowed to be copied, remixed and enhanced by anyone without any legal repercussions.</p><!--kg-card-begin: markdown--><p>After a highly controversial 20 year freeze on this phenomenon that lasted from 1998 to 2018 (due to <a href="https://en.wikipedia.org/wiki/Copyright_Term_Extension_Act">this travesty of a bill</a>), we've been enjoying a steady stream of new works entering the public domain for the last five years.</p>
<!--kg-card-end: markdown--><p>But 2024 has been a highly anticipated year for the public domain for a very specific reason. It's this cartoon:</p><!--kg-card-begin: html--><iframe src="https://commons.wikimedia.org/wiki/File:Steamboat_Willie_(1928)_by_Walt_Disney.webm?embedplayer=yes" width="640" height="480" frameborder="0"></iframe><!--kg-card-end: html--><p></p><!--kg-card-begin: markdown--><p>For those not familiar, this is Steamboat Willie. It's a historically significant cartoon short film for multiple reasons. It's the first cartoon with fully synchronized sound, it's the first cartoon to star world-famous icon Mickey Mouse (and Minnie Mouse), and it's arguably the creative work that kickstarted the Walt Disney empire (if you don't count the earlier <a href="https://en.wikipedia.org/wiki/Alice_Comedies">Alice Comedies series</a>).</p>
<!--kg-card-end: markdown--><p>As of January 1st, 2024, Steamboat Willie is officially in the public domain. That means the whole world now has full creative freedom to do whatever they wish with Disney's first Mickey Mouse cartoon. In fact, I just exercised my creative freedom by embedding this previously copyrighted work into this blog post. Disney previously had the right to sue me for distributing this cartoon without their permission; now they don't. That's the power of the public domain.</p><!--kg-card-begin: markdown--><p>Steamboat Willie entering the public domain is also of symbolic importance. For the longest time, Disney has been one of the staunchest advocates of copyright extension. They're one of the biggest reasons why copyright terms (which previously lasted 56 years) now last a whopping 95 years. People have literally lived entire lives and died within the copyright span of this black-and-white Mickey Mouse cartoon. After copyright terms had already been extended multiple times in the past, Steamboat Willie could have entered the public domain back in 2004. But Disney spearheaded <a href="https://en.wikipedia.org/wiki/Copyright_Term_Extension_Act">that aforementioned travesty of a bill</a> that prevented new creative works from entering the public domain for 20 long years. Given the timing of it all, that bill being informally called the &quot;Mickey Mouse Protection Act&quot; makes sense.</p>
<!--kg-card-end: markdown--><p>Because of Disney's actions in helping extend the copyright term to an absurd duration, Steamboat Willie had become a symbol of the uphill battle that the public domain faces against the interests of big corporations. To see Steamboat Willie finally enter the public domain closes a highly frustrating chapter in the ongoing struggle for creative freedom and free-flowing human knowledge.</p><!--kg-card-begin: markdown--><p>But there is a reason why Disney fought to hold on to Steamboat Willie's copyright for so long. It's not just the cartoon that's in the public domain. Without sounding hyperbolic, <strong>Mickey Mouse is in the public domain as well</strong>.</p>
<p>To temper expectations, it's not a total free lunch just yet. Only the Steamboat Willie version of Mickey Mouse is in the public domain, whereas subsequent incarnations (such as his colored design or his modern design with updated eyes) still remain in copyright. Artists also have to deal with the fact that Disney owns the <strong>trademark</strong> on Mickey Mouse. Trademarks are supposed to protect the <strong>brand</strong> of the trademark owner, so Disney (in theory) has the power to go after anyone who publishes a Mickey Mouse work that can be mistaken as coming from Disney itself. We have no idea how legally enforceable this trademark ownership is in the context of derivative creative works, but hopefully it's not some legal loophole that can be weaponized as an infinite copyright glitch.</p>
<!--kg-card-end: markdown--><p>But no matter what happens from here on out, this is still a monumental Public Domain Day to celebrate. Steamboat Willie is in the public domain, as is Mickey Mouse, Minnie Mouse and Bad Pete. As far as creative freedom is concerned, we're in uncharted territory now, and I couldn't be more excited.</p>]]></content:encoded></item><item><title><![CDATA[In Peace for All Mankind]]></title><description><![CDATA[On the 50th anniversary of Apollo 11, it's worth thinking about how we celebrate this accomplishment, and why we should do better for the next era of space.]]></description><link>https://ianuymatiao.com/in-peace-for-all-mankind/</link><guid isPermaLink="false">5d32bbf387bd62041a99f9dc</guid><category><![CDATA[space]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Sat, 20 Jul 2019 08:17:08 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1446941611757-91d2c3bd3d45?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=1080&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1446941611757-91d2c3bd3d45?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="In Peace for All Mankind"><p>50 years ago, two members of the Apollo 11 crew, Neil Armstrong and Buzz Aldrin, set foot on an alien world for the very first time. There shouldn't be any doubt that the Moon landing is one of the greatest achievements of the human race. With that said, on the 50th anniversary of Apollo 11, it's worth thinking about why and how we remember such an occasion.</p><p>In this day and age, whether it's humans going to the ISS or space probes heading out into the far reaches of the solar system, it's been clear that space travel has been an enormous boon for science, technology and overall progress for humanity. Apollo 11 is no different in that plenty of science equipment was brought to the surface of the Moon in order to study Earth's one and only natural satellite. And while the scientific community is eternally grateful to the Apollo program for bringing back Moon rocks to study and for leaving behind the still-functioning <a href="https://www.lpi.usra.edu/lunar/missions/apollo/apollo_11/experiments/lrr/">Laser Ranging Retroreflector</a>, it's still worth mentioning that Apollo wasn't exactly a purely scientific endeavor.</p><p>The race for supremacy of the sky was a defining trait of the Cold War. Once the Soviets launched Sputnik into orbit, the two superpowers of the era tried to one-up the other in an attempt to show which ideology was superior in conquering space. While a Moon landing sounds like a logical conclusion to any space race, it was also a geopolitical ploy by the United States to make sure the Soviet Union had to work much harder for the last laugh. The first few years of the space race in low-Earth orbit were defined by total Soviet dominance, so by making the end goal of this race the Moon, the United States effectively bought themselves additional time at a point when they were clearly losing. Think of it like trying to beat Usain Bolt in a race by stretching the 100m goal into a cross-country marathon; suddenly the odds sound much more even.</p><p>It was a political gamble that ultimately worked. By the end of the 1960s, Americans were walking on the Lunar surface, while the Soviet Lunar program floundered, and was ultimately cancelled in the 1970s.
It should come as no surprise that the US stopped sending astronauts to the Moon around the same time. A lot more science could have been done on the Lunar surface, but it didn't matter. The United States got the last laugh.</p><p>Things are certainly different these days, though. With more governments and private companies now capable of sending both objects and humans into space, the odds of any one entity having full supremacy of the sky are slimmer than ever. With the ever-lowering cost of access to space, there is an emerging possibility that there will be humans permanently stationed on other worlds before the end of the century. As we begin to move towards this reality, whether it's NASA's <a href="https://www.nasa.gov/artemis">Artemis program</a>, or Elon Musk's ever-lofty goals of <a href="https://www.spacex.com/mars">conquering Mars</a>, it's worth internalizing how we can build and improve upon the Apollo program from 50 years ago.</p><p>Certain things are bound to improve over Apollo, such as the inclusion of female astronauts as a cornerstone of the Artemis program. Still, I do worry that this is just going to be another space race, another program where geopolitics trumps everything else. Hopefully, given the generally lower cost of getting into space, we won't have to enter another space era defined by supremacy.</p><p>When Neil Armstrong and Buzz Aldrin first landed on the Lunar surface, they left a plaque saying that "We Came in Peace for All Mankind". While I'm thankful that no death laser was ever installed on the Moon, it's a little difficult to take that statement at face value given what we know about the US-Soviet rivalry at the time. My hope is that this time we will return to the Moon and explore other planetary bodies on their own terms. I want the next era of space travel to be one of peace, for peace.</p>]]></content:encoded></item><item><title><![CDATA[Breaking the 95 Year Prison]]></title><description><![CDATA[New works have once again entered the public domain, the first time since 1998. It's a momentous occasion, and the return of a yearly tradition.]]></description><link>https://ianuymatiao.com/95-year-prison/</link><guid isPermaLink="false">5c2a27ee8990670fd9b92e76</guid><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Tue, 01 Jan 2019 08:01:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1517770413964-df8ca61194a6?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=1080&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="https://images.unsplash.com/photo-1517770413964-df8ca61194a6?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=1080&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Breaking the 95 Year Prison"><p>This January 1st, 2019, something really exciting just happened. For the first time since 1998, creative works in the US have once again entered the public domain. What this means is that a handful of old works can now be freely distributed, remixed and referenced without repercussions. That means no copyright strikes, no infringement lawsuits, and no accusations of piracy.</p>
<p>This is meant to be a yearly tradition. By law, no copyright is meant to last forever. When a work's copyright term is officially over, that work enters the public domain on New Year's Day. Most of us consider January 1st to be a day of renewal for ourselves, but the same can also be said of these creative works. By entering the public domain, these works can be given a second life, a different perspective. If you want an example of this in action, look no further than the countless fairy tales that have been turned into animated Disney movies.</p>
<p>Unfortunately, this copyright term has been extended multiple times by vested interests. Copyright terms used to last only a few decades; now they can last over a century (especially if it's a book). During the last copyright term extension in 1998, works that should have entered the public domain were given a blanket 20-year extension. In addition, every work published before 1978 was automatically given a 95-year copyright term. Even though F. Scott Fitzgerald passed away more than 70 years ago, meaning all of his literary works <em>should</em> be in the public domain by now, some of his famous works like The Great Gatsby still remain in copyright.</p>
<p>As a result of these copyright laws and numerous term extensions, it has taken an inordinately long time for old works to finally return to the people, where they belong. Take for instance today, when works from 1923 have finally entered the public domain: their 95-year terms ran through the end of 2018, so they only became free on January 1st, 2019. For context, the year 1923 was:</p>
<ol>
<li>Just five years after World War 1</li>
<li>Before the Great Depression</li>
<li>Before sound and Technicolor film</li>
<li>Before the United Nations</li>
<li>Still in the midst of Jim Crow</li>
<li>When jazz was relatively new and still considered the devil's music</li>
<li>Before the breakup of the British Empire</li>
</ol>
<p>If that sounds like a long time ago, that's because it is. The world moves so fast that it's silly that it's only now that the public has free access to such old works. It should be obvious by now that 95 years is far too long to keep important works in copyright purgatory, especially since many of them are no longer monetizable by publishers anyway. You may think that your work is worthy of a 95+ year monopoly, but not every book or movie gets to be as enduring and as profitable as The Lord of the Rings.</p>
<p>It's probably too late to dial back this copyright term now. A long-duration copyright term is pure upside for the billion-dollar media companies of today. I hope, though, that they're sane enough to realize that the buck should stop here. Any further extensions to copyright at this point would be pure lunacy, and would be a big disservice to those trying to preserve our human heritage, including the works no longer worth much money to their respective publishers.</p>
<p>Yes, copyright <em>does</em> have value. Creators should be able to monetize their works for a reasonable period of time. But works should eventually return to the people. They should be given new lives that the original creators couldn't have dreamed of. As humanity continues to grow, so should its collection of <em>public</em> knowledge, rather than that knowledge being perpetually kept in the hands of a few individuals.</p>
<p>While there's still plenty to be frustrated about regarding the state of copyright, I don't want to distract too much from today's momentous occasion. Today is an amazing day for the public domain, and should wiser heads continue to prevail, this day every year will have something else to celebrate in addition to our yearly resolutions.</p>
<hr>
<p>Is one of your New Year's Resolutions to read more books? Try out these books that have just entered the public domain<sup class="footnote-ref"><a href="#fn1" id="fnref1">[1]</a></sup>:</p>
<ol>
<li>The Prophet -- Kahlil Gibran</li>
<li>The Murder on the Links -- Agatha Christie</li>
<li>Men Like Gods -- H.G. Wells</li>
<li>The Lighthouse at the End of the World -- Jules Verne</li>
<li>The Able McLaughlins -- Margaret Wilson</li>
</ol>
<p>You have the <em>right</em> to read, distribute and remix these books without consequence. If you don't find these books freely available yet, you can bet that they will show up on Project Gutenberg fairly soon.</p>
<p>Want more high-quality ebooks of Public Domain works? Check out <a href="https://standardebooks.org">Standard Ebooks</a>!</p>
<hr class="footnotes-sep">
<section class="footnotes">
<ol class="footnotes-list">
<li id="fn1" class="footnote-item"><p>These books have entered the public domain under the current US copyright system, where works published before 1978 have a copyright term that lasts 95 years. Some countries have terms shorter than this, however digital distribution remains tricky, so most online resources effectively have to wait on US copyright expiration. If you live in a country that has even longer copyright terms, then you may have to wait longer for such works to enter the Public Domain in your country. <a href="#fnref1" class="footnote-backref">↩︎</a></p>
</li>
</ol>
</section>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Android's Update Problem: 2018 Edition]]></title><description><![CDATA[With Project Treble bearing fruit, and Android Jetpack ushering a new era for backwards compatibility, it's a good time to revisit Android upgrades in 2018.]]></description><link>https://ianuymatiao.com/android/</link><guid isPermaLink="false">5c28ad8d8990670fd9b92dab</guid><category><![CDATA[android]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Sun, 30 Dec 2018 13:50:08 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><blockquote>
<p>This is an annual blog post I write about the state of Android updates. <a href="https://ianuymatiao.com/androids-update-problem-2017-edition">Click here for last year's edition</a></p>
</blockquote>
<p>I'm happy to say that 2018 has mostly been upside for the state of Android updates. The Android Team promised that under-the-hood changes in the OS's foundations would lead to a better update cadence, and after I/O 2018 I'm definitely inclined to believe them. The venerable Support Library, which exposes newer features through backward-compatible APIs, has also seen a big revamp, so that's something worth talking about. Lastly, Google is taking a stance on older versions of Android, with the reluctant(?) assistance of its app developers.</p>
<h4 id="projecttreblebearsfruit">Project Treble Bears Fruit</h4>
<p>Last year, Google announced Project Treble, a massive engineering effort that provided Android with a standardized hardware communication layer. This essentially allowed any hardware implementation that adhered to these standards to run a new release with almost no modifications. It's not a panacea by any means, since Treble can define new standards that future Android releases will require, leaving outdated Treble implementations behind. Regardless, the engineering required to make existing hardware work with newer releases has been immensely reduced.</p>
<p>In addition, any Android phone that shipped with Oreo was <em>required</em> to implement the Treble standard, meaning all of these phones had the <em>potential</em> to run Android 9.0 and above with little to no additional effort. We saw the fruits of this labor in action at I/O 2018, when the pre-release version of Android 9.0 was immediately testable not just on Pixel devices, but also on phones from <a href="https://www.androidcentral.com/project-treble-could-turn-out-be-more-important-we-thought">seven different manufacturers</a>.</p>
<p>It's honestly breathtaking to see phones from disparate OEMs suddenly able to run a brand new Android release, but the seemingly impossible was made possible thanks to Project Treble. It's important to keep in mind that Project Treble only really fixes the chip-support issue when it comes to compatibility. This doesn't mean that OEMs will suddenly have OS updates on Day 1, since OEM customization is still a thing on Android. Even if you were willing to go rogue and install a Treble-compatible custom ROM for your phone, you're still at the mercy of manufacturers providing unlocked bootloaders before you can take the situation into your own hands.</p>
<p>But let's not end on a negative note. Project Treble is fantastic, and my hope is that even more phones will be able to test and run Android Q in 2019.</p>
<h4 id="introducingandroidjetpack">Introducing Android Jetpack</h4>
<p>Google has long provided the Support Library so that some features of newer Android releases are also made available to older OSes. In 2018, we were introduced to <a href="https://developer.android.com/jetpack/">Android Jetpack</a>. The change isn't earth-shattering by any means, but it does allow the Support Library to be something more than just a backwards-compatible API.</p>
<p>Jetpack is essentially the Support Library cleaned up, and for good measure it adds components that make it easier to build Android apps. It contains APIs that allow for cleaner, more resilient code, and even provides convenience APIs that allow for better-flowing applications.</p>
<p>We're starting to see a more distinct separation between the actual components of Android and the libraries that make it easy to interact with these components. While Android Pie may add new components like Slices, arguably the best way to interact with these components will be through Jetpack. Fortunately, Jetpack can be upgraded more frequently than the OS and even the older Support Library, so developers will always have access to APIs that improve the development experience even on older OSes.</p>
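<p>To make the backwards-compatibility idea concrete, here's a minimal sketch (my own illustration, not an official Jetpack sample) using NotificationCompat from the androidx.core artifact. Notification channels only exist on Oreo and above, yet the same code path works on every supported OS version:</p>
<pre><code>import android.content.Context
import androidx.core.app.NotificationCompat
import androidx.core.app.NotificationManagerCompat

// "updates" is a hypothetical channel id. On Oreo+ the channel must also be
// registered once at startup; on older OSes the channel id is simply ignored.
fun notifyUpdateAvailable(context: Context) {
    val notification = NotificationCompat.Builder(context, "updates")
        .setSmallIcon(android.R.drawable.stat_sys_download_done)
        .setContentTitle("Update available")
        .setContentText("A new version is ready to install.")
        .build()
    // NotificationManagerCompat picks the right platform behavior at runtime.
    NotificationManagerCompat.from(context).notify(1, notification)
}</code></pre>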
<h4 id="raisingthefloor">Raising the Floor</h4>
<p>App support plays a big part when it comes to the support window of a specific Android version, and Google plans on tackling this issue on two fronts.</p>
<p>Newer SDKs that accompany every OS release often come with new security measures built-in, something that any app can take advantage of by setting its <em>target</em> SDK level. While the <em>minimum</em> SDK level determines the oldest OS that the app can run on, the <em>target</em> SDK level determines the newest set of features the app can take advantage of if it happens to be running on that version of Android.</p>
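<p>In practice, both levels are just two fields in the app's build configuration. As a quick sketch (written here in the Gradle Kotlin DSL, with purely illustrative version numbers; the classic Groovy build.gradle uses the same fields):</p>
<pre><code>// build.gradle.kts (module-level) -- illustrative values only
android {
    compileSdkVersion(28)
    defaultConfig {
        minSdkVersion(16)    // oldest OS the app can be installed on (Jelly Bean)
        targetSdkVersion(28) // newest behavior set the app opts into (Pie)
    }
}</code></pre>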
<p>Sometimes, apps will purposefully <em>not</em> target a recent SDK release in order to opt-out of these security measures, which is problematic as Android tries to add additional privacy features to better protect users from malware.</p>
<p>At the end of 2017, the Android team <a href="https://android-developers.googleblog.com/2017/12/improving-app-security-and-performance.html">announced a new mandate</a> stating that apps <em>must</em> target newer SDK releases, in order to stop them from getting away with behaviors that compromise security and battery life. This is a win for users on reasonably updated phones, since any security and privacy features on your phone will very likely apply to any app you install from the Play Store.</p>
<p>Google is also targeting the minimum SDK issue. While the oldest versions of Android continue to fade from existence, there are some apps that still need to support these OSes. This means Google needs to write additional code so that Google Play Services and Jetpack are able to work on these older phones at all. Eventually, however, these releases will be just too darn old to support <em>any</em> new features, and Google will have to say goodbye to them.</p>
<p>Over the last couple of years, Google dropped support for Gingerbread and Honeycomb from Google Play Services and the Support Library. Soon, <a href="https://android-developers.googleblog.com/2018/12/google-play-services-discontinuing.html">Ice Cream Sandwich will join the graveyard</a>. This means that Android Jelly Bean is now the oldest version an app can support if it wants to remain a modern Android app with access to the latest Google APIs and features. If you're developing an app that is completely severed from Google Play Services, it looks like Jetpack still supports ICS for now; however, I expect this to change in the near future, since ICS's market share <a href="https://developer.android.com/about/dashboards/">is getting really close to zero</a>.</p>
<h4 id="lookingforwardtoanupwardtrajectory">Looking Forward to an Upward Trajectory</h4>
<p>2018 has overall been a pretty good year for Android updates. Project Treble essentially removes a large burden from silicon manufacturers, meaning OEMs and carriers are now the remaining bottlenecks. My hope is that further modularization of the operating system will allow OEMs and carriers to ship their customizations without requiring deep changes to the OS. I personally would prefer that no customizations be allowed at all, but there's no way we can enforce this ideal in a world where Samsung is king.</p>
<p>On the security side, I'd still prefer that more drivers be allowed to use the APK method of updating, but hopefully Project Treble's modularity can at least ease the pains in this space. Apple's model of security updates remains the gold standard, and I hope Google at least uses iOS as a compass towards day-one security patches for all parts of the phone.</p>
<p>Given the enormous strides that Android updates made in 2018, I'm going to have higher expectations for next year. While Android Pie did bring cool features such as Slices and Digital Wellbeing, I suspect that more of the features that impact developers and users will be coming through the always-updatable Jetpack library. Even if some of the best features of Android Q will require a full OS upgrade, my hope is that Project Treble and further efforts from Google will set a better norm than we had in the last decade of Android's existence.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[What I Want from the Next iPhone SE]]></title><description><![CDATA[The iPhone SE has remained unchanged since it debuted in March 2016. If Apple decided to update their budget iPhone at all, what would it be like?]]></description><link>https://ianuymatiao.com/what-i-want-from-the-next-iphone-se/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d65</guid><category><![CDATA[apple]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Mon, 30 Jul 2018 11:14:40 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>I am one of those people who still really values having highly capable phones in small sizes, though the options right now are very, very slim. Nearly all Android OEMs have abandoned small-screen phones at the high end, only paying lip service to this size category with entry-level devices if at all. That means anyone who wants to buy a tiny phone rocking powerful hardware really only has one choice: Apple.</p>
<p>Sure, the iPhone SE is hardly a flagship when compared against the mighty iPhone X, but because nobody else has a viable contender in this space, the iPhone SE punches waaaay above its weight. I don't want to sound like I begrudgingly use an iPhone SE just because it's the only option, because on the contrary I really love this device. It's got the most powerful processor of the one-handed phones, it arguably has the best camera in the sub-$400 category, and it flies on iOS 11 even though it uses hardware dating back to 2015. Apple makes plenty of phones at different price points for different kinds of people, but if you don't mind the small size and the admittedly aged components here and there, the iPhone SE is easily the best-value iPhone currently sold today.</p>
<p>And the best-value iPhone sold today needs an upgrade.</p>
<p>When the SE first debuted way back in March 2016, I had a feeling that this &quot;entry-level&quot; iPhone would not get upgrades as frequently as its premium siblings. It's a device optimized around price, and the more mileage Tim Cook got out of this iPhone, the more revenue was extracted over its lifetime even as costs continued to fall. With that said, I think it's finally time for the SE to get some things updated so that it gets back its competitive edge, and Apple can finally start retiring production lines that are still churning out technology that is now over five years old.</p>
<p>As much as I love the iPhone SE, it does have some growing pains that need updatin', so let's get started.</p>
<h4 id="anewprocessorupgrade">A New Processor Upgrade</h4>
<p>Because of the iPhone SE's status as a &quot;budget&quot; phone, Apple can afford to make the excuse that the A10 will suffice. To be fair, the A10 is still a very capable mobile processor, and can easily compete with Android chips in all but highly multithreaded workloads. However, if Apple does want the SE to also be great in these multithreaded tasks, then the A11 is the right way to go.</p>
<p>The A12 is coming in September, and given that it's likely to use the latest manufacturing process, the A12 is going to be a big jump over the A11, completely leaving the A10 in the dust. Again, Apple can justify the SE getting the &quot;low-end&quot; chip, but it comes off as poor taste when you consider that the original SE used the A9 chip, the same one used by the then &quot;latest and greatest&quot; iPhone 6S.</p>
<h4 id="upgradeddisplay">Upgraded Display</h4>
<p>I don't think the resolution will increase, considering that the iPhone 8 still has a pixel density of <a href="https://www.gsmarena.com/apple_iphone_8-8573.php">326 ppi</a>. However, I'd like to believe that Apple will at least give the new iPhone SE the same display lamination technology as contemporary iPhones, which allows the pixels to sit so much closer to the glass surface.</p>
<h4 id="anewcamera">A New Camera</h4>
<p>This is something to be optimistic about. Apple's next best camera after the iPhone 6S camera (the same camera as the iPhone SE) is the iPhone 7 camera, which comes with optical image stabilization! If Apple were to actually upgrade the rear camera on the iPhone SE (crossing fingers!) then the iPhone 7 camera is the logical next step compared to making a weird middle-of-the-road camera just for this relatively niche iPhone.</p>
<p>The really pressing matter, however, is the front camera. The SE still uses the same front camera as the iPhone 5S, which is only <a href="https://www.gsmarena.com/apple_iphone_se-7969.php">1.2 megapixels</a>! It's possible that Apple could use the iPhone 7's front camera on the new SE as well, but if there's one place where Apple can cut costs while still boasting of a substantial upgrade over the previous model, it's the use of the iPhone 6S's front camera instead. Personally, I'd bet on the iPhone 6S front camera for now, and allow myself to be pleasantly surprised if the SE gets something better.</p>
<h4 id="upgradedtouchid">Upgraded Touch ID</h4>
<p>This is really a no-brainer. The iPhone SE is literally the last iPhone to use the old Touch ID sensor. If Apple were keen to finally close down the production line responsible for Touch ID 1.0, then an upgrade to the latest Touch ID sensor in the next iPhone SE is certainly in order.</p>
<h4 id="3dtouch">3D Touch</h4>
<p>Not everyone is fully convinced of the practicality of 3D Touch, but if Apple is going to insist that it's an integral part of the iOS experience, then the iPhone SE is very likely gonna get it. Considering that 3D Touch has come standard with flagship iPhones since the iPhone 6S, this feature addition is certainly overdue.</p>
<h4 id="waterresistance">Water Resistance</h4>
<p>There's a certain fear that the inclusion of water resistance would automatically mean the removal of the headphone jack. For that piece of speculation, I have two things for you. First, the idea that a headphone jack and water resistance are fundamentally incompatible is <a href="https://www.samsung.com/us/support/answer/ANS00047867/">utter BS</a>. Second, Apple is going to remove the headphone jack whether we like it or not, so there's no point in believing that we should hold back water resistance for the next iPhone SE just to keep the headphone jack around.</p>
<p>If the iPhone SE does get water resistance, then that means the entire iPhone lineup for Fall 2018 will have this feature, so I'm really hoping this happens.</p>
<h4 id="willitevenhappen">Will It Even Happen?</h4>
<p>This is probably the most important question of all: Will Apple even bother to update the iPhone SE at all? Instead of upgrading the iPhone SE and discontinuing the iPhone 6S, why not keep the 6S around to serve emerging markets and the low-price segment? Considering that iOS 12 is expected to support models all the way back to the iPhone 5S, Apple can easily sell and support the 6S for a couple more years.</p>
<p>There's also the fact that big-screen phones utterly dominate the market today. Apple claimed in March 2016 that there's still a big body of consumers clamoring for a smaller smartphone (Hi, Tim!). However, the cultural tide looks increasingly unfavorable to people like me. More people are converting to bigger screens than the other way around, and the demand at the low end is for big-screen budget phones from brands such as Xiaomi and Huawei. Apple probably initially thought that low-end consumers who wanted to buy in to the Apple brand were willing to use a 4-inch device. However, considering that the demand for big-screen phones is increasing, and the price of older iPhones such as the 6S continues to drop, what purpose does the SE serve in 2018?</p>
<p>I have a guess as to what the answers to those questions will be, but I honestly hope I'm wrong, since I love the iPhone SE's form-factor too much. It would be a shame if the flagship-grade small phone went away for good, but considering that Apple needs to sell products at scale, not all niche things can last forever. I sure hope that's not the case, and maybe Apple disagrees with the mainstream sentiment to a certain extent. We'll just have to see, and hopefully Christmas comes early for me when the September keynote rolls in.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Columbia]]></title><description><![CDATA[On the 15th anniversary of the tragedy that claimed the lives of seven NASA astronauts, I share a personal story of horror, heartbreak and growing up.]]></description><link>https://ianuymatiao.com/columbia/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d66</guid><category><![CDATA[space]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Thu, 01 Feb 2018 05:00:00 GMT</pubDate><media:content url="http://ianuymatiao.com/content/images/2018/11/Space-Shuttle-Columbia.jpg" medium="image"/><content:encoded><![CDATA[<!--kg-card-begin: markdown--><img src="http://ianuymatiao.com/content/images/2018/11/Space-Shuttle-Columbia.jpg" alt="Columbia"><p>I think I speak for a lot of people in my generation when I say that many of us wanted to be astronauts when we were children. As a kid I was surrounded by science books about space, the planets, and the people in funny suits whom we looked up to as heroes. The allure of the final frontier was so immense that it’s hard to imagine the childhood version of myself not wanting to become a member of an elite crew pushing the boundaries of exploration. For a period of time in my life, dreaming of becoming an astronaut felt great.</p>
<p>It's easy in hindsight to look back at our childhood aspirations as naive and rose-tinted. Of course being an astronaut is <em>extremely</em> difficult, and so few people go to space for a good reason: <em>it’s really dangerous</em>. These are things every person with a passing knowledge of the space program will eventually learn one way or another. Most kids were probably outright told how dangerous being an astronaut was by the grown-ups, or were told that being one wasn’t all it was cracked up to be. I wasn’t like most kids.</p>
<p>I’m sure my parents were very well aware of the hazards that came with becoming an astronaut. However every time I told them I wanted to be an astronaut when I grow up, bless their hearts, they just smiled and encouraged me to follow my dreams. They probably didn’t want me to lose my childhood innocence so soon; Santa Claus was already an early casualty just a year earlier.</p>
<p>Luckily for them they never had to deliver the bad news. I was going to figure it out on my own.</p>
<p>It was February 2, 2003, Sunday morning. I had just finished eating breakfast, and was going to my parents’ room for reasons I can no longer remember.</p>
<p>This was playing on the television:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/_6OTlK8LVu8?start=860" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe>
<p>The Columbia Disaster was devastating news to hear on multiple levels. It woke me up to the idea that this dream profession of mine was potentially deadly, something that wasn't emphasized much in the books of my childhood. Even though I was aware of the tragedies involved in previous space program mishaps, I didn't think tragedy in space would follow humankind into the 21st century; I was wrong. Loss of life was still very much a possibility, and the grotesque image of the once majestic Space Shuttle Columbia disintegrating across the sky remains a painful memory.</p>
<p>In the wake of the investigation, the disaster also brought the heartbreaking revelation that NASA was a flawed organization, one whose mismanagement could persist right up until the unforeseen deaths of seven promising astronauts. I was given the impression up until that point that NASA was an invincible institution, where the brightest minds always triumphed over internal squabbling in order to accomplish incredible human feats. I still, to this day, believe that NASA does incredible work in the service of science and exploration, but the Columbia tragedy permanently changed my personal relationship with the space agency, one that was less pantheonic.</p>
<p>All of this was a lot to take in as a 10-year old kid who loved space, but for all the trauma it caused, Columbia may well have changed my life.</p>
<p>It may be odd that I am placing so much significance on this particular moment in my life, but I do genuinely believe that the Columbia tragedy was a major personal inflection point. It forced me to question everything I thought I knew about the people and the institutions I looked up to or was told to look up to. It humanized both the astronauts and the thousands of employees at NASA, providing me the much needed reality check that they are just as imperfect as the other people you hear about on the news. Most importantly though, it made me take a cold, hard look at my life choices. Like many kids before me, I ultimately decided to not be an astronaut; though it took an unforeseen tragedy to get to that realization.</p>
<p>All of these painful lessons are reasons why I can’t help but look back at this pivotal moment in the history of human spaceflight. In my case, to witness the Columbia disaster was quite literally to grow up, even if just a little bit. Everyone has critical moments in their upbringing that shaped their journey to adulthood, to the real world. Columbia was most certainly one of mine.</p>
<hr>
<h2 id="rememberingcolumbia">Remembering Columbia</h2>
<p><img src="http://ianuymatiao.com/content/images/2018/11/STS-107-Crew.jpg" alt="Columbia"></p>
<p><em>Official photo of the STS-107 crew. From left-to-right: David Brown, Rick Husband, Laurel Clark, Kalpana Chawla, Michael Anderson, William McCool &amp; Ilan Ramon</em></p>
<p>Looking through archival footage covering the events of Columbia, I stumbled upon this interview of some of the astronauts expressing their enthusiasm for the mission. They were also talking about how they were looking forward to future missions after STS-107. I struggled to hold back tears the whole time.</p>
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/uuPVkb-LL88?rel=0" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Android's Update Problem: 2017 Edition]]></title><description><![CDATA[This entry in the annual series looks back at a year of WiFi snooping and Bluetooth bombs, but 2017 gave us Project Treble, a massive change for Android.]]></description><link>https://ianuymatiao.com/androids-update-problem-2017-edition/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d64</guid><category><![CDATA[android]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Tue, 26 Dec 2017 04:00:00 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><blockquote>
<p>This is an annual series on the ongoing issue of Android OS updates. <a href="https://ianuymatiao.com/androids-update-problem-2016-edition/">Click here to see last year's edition.</a></p>
</blockquote>
<p>In many ways 2017 may be the worst year when it comes to Android OS updates. While Google continues to push more important additions through the Support Library, providing backwards compatibility to previous versions of Android, there still remain so many things that rely on good ol' AOSP. Standing for <em>Android Open Source Project</em>, AOSP is the open source operating system that remains the bedrock of all things Android. When Google advertises a new dessert update every year, it's actually branding around what is essentially a new AOSP release. Unfortunately for us Android users, AOSP is the piece of Android that is in the hands of chip vendors, manufacturers and carriers to push to your phone, if they do at all.</p>
<p>2017 in particular has been a harsh year for AOSP, with serious security vulnerabilities found, and with them an equally harsh reality check that potentially millions of Android phones may remain vulnerable forever. But 2017 is also a year of hope for Android. A herculean engineering effort from Google has literally changed the way that Android is integrated into the phones of the near future, and with it comes a hope that the worst aspects of Android updates may finally be behind us. There's a lot to look back on in 2017, so let's get started.</p>
<h4 id="alookbackonandroidsnewbetaprocess">A Look Back on Android's New Beta Process</h4>
<p>Starting with Android Nougat, Google decided to start previewing their dessert releases extra early. This meant that developers now had more time to plan their apps for the next SDK release (every OS update comes with a new SDK revision), and manufacturers had more time to tinker with the new release so it could be integrated with their OEM-specific features sooner.</p>
<p>It was a pretty exciting time, to be honest. While it meant that Android announcements had less excitement to them during Google's annual keynote since they were announced much earlier, the idea that developers and enthusiasts could get their hands on a new version of Android much earlier (assuming you had a Nexus or Pixel phone) was a far better prospect to get hyped over. I also think that some OEMs have successfully managed to make the best of these earlier alpha releases. In 2016 LG managed to <a href="https://www.digitaltrends.com/mobile/lg-v20-news/">release the V20 with Android Nougat out of the box</a>, and this year we got instant Oreo releases for the <a href="https://www.androidauthority.com/sony-xperia-xz1-hits-amazon-android-8-0-oreo-box-801694/">Sony Xperia XZ1</a> and <a href="https://www.androidauthority.com/huawei-mate-10-pro-807480/">Huawei Mate 10</a>.</p>
<p>It looks like having to wait the following year for a phone shipping with the latest Android version is now firmly a thing of the past. If you happen to own a Pixel phone or one of the latest flagships from Sony or Huawei, 2017 has been a pretty good year overall. For the rest of us, this year has been the complete opposite.</p>
<h4 id="androidsdriversfromhell">Android's Drivers from Hell</h4>
<p>It's been clear for the past few years now why these downstream companies shouldn't have such decisive control over the roll-out of Android updates; serious security issues have been found in parts of Android that currently do not fall under Google's jurisdiction. Last year's elephant was MediaServer, an open source component of AOSP responsible for powering all things media, such as audio and video. Turns out MediaServer was a cesspool of critical vulnerabilities that have been around since the earliest days of Android. These vulnerabilities, collectively known as <a href="https://www.xda-developers.com/stagefright-explained-the-exploit-that-changed-android/">StageFright</a>, took an enormous <a href="https://source.android.com/devices/media/framework-hardening">overhaul of MediaServer</a> in Android 7.0 Nougat to stomp out most of them for good.</p>
<p>This year's Android vulnerability story is much worse.</p>
<p>There really isn't one thing responsible for all the bad vulnerabilities of 2017, but rather a collection of similar mistakes that allow for device exploitation. This year we had to contend with <em>bad drivers</em>.</p>
<p>It turns out it is really difficult to build truly secure drivers for a myriad of reasons. One such reason is that they reside in kernel space, which in layman's terms means that they have almost unfettered access to the device in exchange for remaining fast and efficient. Should these drivers have any programming errors that allow malformed input (say, a rogue WiFi packet) to break out of their walled garden, that incoming input could then execute its malicious payload with the nearly unfettered access that kernel space provides.</p>
<p>Android has suffered severe vulnerabilities this year based on execution of dangerous code coming from bad data packets, namely Bluetooth and WiFi packets. <a href="https://www.armis.com/blueborne/">Blueborne</a> was a family of vulnerabilities that affected nearly all Bluetooth devices, taking advantage of flaws in either Bluetooth's design or its implementation to be able to run malicious code. <a href="https://www.krackattacks.com/">KRACK</a> exploited a WiFi design flaw that allowed attackers to break the WiFi encryption and snoop in on wireless communications. Android was actually the worst victim of KRACK, since a flaw in a Linux library used by Android allowed KRACK to make the device use a key of zero, which is nearly the equivalent of unencrypted communication. There was also <a href="https://blog.exodusintel.com/2017/07/26/broadpwn/">Broadpwn</a>, a severe bug found in a specific Broadcom WiFi chip that allowed for remote code execution on affected devices. What makes these vulnerabilities so dangerous is that they require almost zero interaction from the user; no phishing necessary. That means you can be infected just by being at the wrong place, at the wrong time.</p>
<p>Something to keep in mind about all of these vulnerabilities is that they are <strong>not</strong> exclusive to Android. In fact nearly all of the vulnerabilities I just mentioned affected iOS as well. But there is a distinct advantage in iOS's favor: Apple actively patches these vulnerabilities while the Android ecosystem at large does not. As you read this, hundreds of millions of Android devices remain vulnerable to these deadly attacks because their version of Android is too old, or their manufacturer or chip vendor has deemed these devices obsolete.</p>
<p>It's a terrible situation to be in, as we have to come to grips with manufacturers who stop supporting their phones' software the instant it flies off the shelf, as well as manufacturers who can't update their phones because they can't get the relevant drivers for the latest version of the OS.</p>
<p>Google has known for a while now that the big dessert updates (that by the way also come with important security architecture changes) are too time-consuming and too expensive to carry out sustainably. Even the patches coming from <a href="https://source.android.com/security/bulletin/">the security bulletin</a> are too much of a hassle for manufacturers that can't or won't spare the resources. Clearly something had to change. Luckily that is precisely what 2017 gave us.</p>
<h4 id="anewhopeprojecttreble">A New Hope: Project Treble</h4>
<p>All Android devices that ship with Android 8.0 Oreo must now implement an Android architecture that has been given the codename <a href="https://source.android.com/devices/architecture/treble">Project Treble</a>. It is a major modularization effort by the Android Engineering Team that separates AOSP from vendor-specific implementations. This is a major shift from how it worked before, where chip vendors like Qualcomm would have to make a Qualcomm-flavored version of the newest Android OS for each of their processors every year. It is both a taxing and inefficient process, and it should come as no surprise that companies like Qualcomm drop support for older chips surprisingly early (<a href="https://www.xda-developers.com/in-depth-capitulation-of-why-msm8974-devices-are-excluded-from-nougat/">R.I.P. Snapdragon 801</a>).</p>
<p>There is plenty of mumbo-jumbo going around that explains how Project Treble works, but the simplest way I could explain it is that Project Treble defines what is essentially a set of hardware standards. Chip vendors and hardware makers can design their hardware however they like, but at the end of the day their final implementation must pass these standards in order to run Android at all. As long as they are met, and future versions of Android are built to use these standards, the Android OS will <strong>just work</strong> on these chips without any additional effort from the hardware makers.</p>
<p>There are still some caveats to be mindful of, though. OEMs are still free to tweak AOSP to their liking before pushing an update to their users, so we're still not at iOS levels of speed here, mind you. There is also the fact that <a href="https://arstechnica.com/gadgets/2017/09/android-8-0-oreo-thoroughly-reviewed/2/#h2">Treble's hardware standards will have future revisions as well</a> (think v1.1, v1.2, v2.0) to support future hardware features such as Bluetooth 6 or gigabit WiFi. Hardware makers will have to update their hardware code to support these newer standards, though it's not clear how pressured they will be to do so for their current chips or even for future hardware. The good news is that future versions of Android can be backwards compatible with older hardware standards, so it's not instant death for a specific piece of hardware just because it doesn't support the newest standard. Still, old hardware standards will eventually have their support dropped, but how long that will take remains up in the air.</p>
<p>Another cool thing brought about by Project Treble's modularization is that drivers can now be updated separately from AOSP. Starting with Android Oreo, <a href="https://www.xda-developers.com/android-o-users-will-update-graphics-drivers-through-play-store/">graphics drivers can be updated through the Google Play Store</a>, meaning performance and security patches for the GPU can bypass the OEMs and carriers and be installed on the user's phone as though it was just an app, because it is! Considering that we may not have heard the last of Bluetooth and WiFi vulnerabilities, I hope that future work on Treble will allow wireless drivers to be updated over the Play Store as well.</p>
<h4 id="lookingtoahopefullybrighter2018">Looking to a (Hopefully) Brighter 2018</h4>
<p>So yes, 2017 was an absolute mess for Android, but I also want to be optimistic as we start rolling into 2018. Make no mistake that Project Treble is a game-changer for Android updates. Even if you end up with a phone from a lazy manufacturer that never wants to update your phone at all, Treble already opens some big doors for those willing to go the <a href="https://www.xda-developers.com/how-project-treble-revolutionizes-custom-roms-android-oreo/">custom ROM route</a>, assuming the bootloader is unlocked, obviously. There is also some promise from the fact that GPU drivers can be updated on the phone as an APK. While we can't truly prevent vulnerabilities like Blueborne or Broadpwn, the least we can do is fix them as quickly as possible, and the Play Store is as frictionless as they come. Hopefully the GPU drivers are only the beginning of a better security story for Android.</p>
<p>The last real bottlenecks left going into 2018 are the OEMs and the carriers. While I suspect that Touchwiz is still going to stall a lot of updates for Samsung devices in the near future, hints of a <a href="http://www.androidpolice.com/2017/08/24/developer-discovers-turn-android-8-0s-theme-support-without-root-bit-janky/">theming engine</a> in Android Oreo gives me hope that Google is tackling the manufacturer bottleneck head-on as well.</p>
<p>As for me, I'm kinda excited about what my next Android phone could be. I've been rocking an iPhone since my last Android phone, the Sony Xperia Z3 Compact, was afflicted with touch disease. While the iPhone is wonderful (and also more secure), I do still want to keep an Android device around for the unique things it can do. I may hold off on getting a new Android device until Project Treble is more widespread on low-end and mid-level devices. That way I can get a phone that lasts way longer than even the manufacturer's own intentions, thanks to the custom ROM community.</p>
<p>By all accounts 2018 should be a better year than the one we're about to leave behind. Let's hope nobody botches it.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[I Tried the New iPad Pro Display for the First Time]]></title><description><![CDATA[I decided to check out the new iPad Pro and its ProMotion 120Hz screen in an Apple Store. I entered the store a skeptic; I exited the store a believer.]]></description><link>https://ianuymatiao.com/i-tried-the-new-ipad-pro-display-for-the-first-time/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d63</guid><category><![CDATA[apple]]></category><category><![CDATA[hardware]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Sat, 12 Aug 2017 10:34:23 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Last Friday I decided to drop by an Apple reseller to try out the new iPad Pro, and especially its 120 Hz display. I was initially skeptical about the need for refresh rates above 60 Hz, as 60 fps motion looks pretty freaking smooth as is. With that said, after I started interacting with the iPad Pro, I couldn't stop scrolling pages in amazement for at least 30 minutes.</p>
<p>It's kinda hard to describe how 120 Hz feels without trying the new display for yourself, but I'll do my best. The new smoothness on the new iPad Pro is, for lack of a better word, intoxicating. It's hard to imagine that motion can get smoother than 60 fps, but it can -- it <strong>totally</strong> can. 60 fps now feels adequate, yet frustratingly wanting. It's certainly enough for everyday use, but against 120 fps motion it suddenly struggles to be good enough in the eyes of, well, your eyes!</p>
<p>It's also massively beneficial in other ways. Back when I tested the Apple Pencil on an older iPad Pro, I felt that the drawing responsiveness was decent, but nothing groundbreaking, personally finding it more or less equal to the Galaxy Note S-Pen. But with the new iPad Pro, my jaw just dropped. The Apple Pencil felt <strong>extremely</strong> responsive. Pencil lag was far harder to detect, only really showing itself when making fast strokes that go from one end of the screen to the other. I'm convinced that the iPad Pro + Apple Pencil combo is the new gold standard for digital drawing tools. While the new Surface Pro + Pen has similar latency, the 120 Hz display on the iPad Pro is going to be Apple's ace up its sleeve for this shootout.</p>
<p>So yeah, I am massively blown away by the new iPad Pro display. I have not envied a display like this since the first time I saw a retina screen. It's at that caliber.</p>
<p>I'm not necessarily going to buy an iPad Pro after this, but boy am I going to start saving up for one of those 144 Hz computer monitors. I am now a convert; high refresh displays are freaking phenomenal. You should honestly try out the new iPad Pro in a store somewhere and scrutinize the smoother motion. You may just get blown away too!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[The 2016 Macbook Pros Part 4: The Macbook Escape, Kaby Lake and Price]]></title><description><![CDATA[With the Macbook Pros recently updated, it's time to wrap up the series by talking about the peculiar Macbook Escape and the crazy pricing situation.]]></description><link>https://ianuymatiao.com/the-2016-macbook-pros-part-4-the-macbook-escape-kaby-lake-and-price/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d62</guid><category><![CDATA[apple]]></category><category><![CDATA[hardware]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Tue, 27 Jun 2017 07:30:41 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>In case you missed it, you can check out <a href="https://ianuymatiao.com/thoughts-on-the-2016-macbook-pros-part-1-the-design/">Part 1</a>, <a href="https://ianuymatiao.com/the-2016-macbook-pro-part-2-the-specs/">Part 2</a>, and <a href="https://ianuymatiao.com/the-2016-macbook-pros-part-3-usb-c-thunderbolt-3-and-the-touch-bar/">Part 3</a>.</p>
<p><em>Disclaimer: These are thoughts about a line of products I have not yet used. While they are still opinions, please keep in mind that this is NOT a review.</em></p>
<p>It's perhaps been too long since the last entry in the series, so it's time to wrap this up.</p>
<h4 id="kabylake">Kaby Lake</h4>
<p>Just recently at WWDC, Apple updated the Macbook Pros to come with the latest Kaby Lake processors from Intel. If you happen to have bought the new Macbook Pro recently, no need to fret; the Kaby Lake update isn't really dramatic. It does add some niceties such as a higher clockspeed and better support for next-generation video technology, but apart from those two items it's pretty much a better Skylake.</p>
<p>I also think it's at least worth commending Apple for being more aggressive with the update cadence of their Mac lineup. While it may be a bummer for early adopters that their shiny new Macbook Pro is now &quot;old&quot; nine months after launch, this is much, much better than before, when Apple laptops could go <em>years</em> without an update. I very much prefer that Apple closely follow Intel's release schedule rather than go dark for a couple of years at a time.</p>
<h4 id="themacbookescape">The Macbook Escape</h4>
<p>With that out of the way, I want to take some time to talk about what I think is the most peculiar model of the Macbook Pro lineup, the 13-inch Macbook Pro without Touch Bar. Now that's a mouthful for a name, but luckily the geekosphere has already come up with a better name for this little machine: <a href="http://atp.fm/episodes/193">the Macbook Escape</a>.</p>
<p>What makes the Macbook Escape so interesting is that it's pretty much a new laptop category for Apple. While the Touch Bar models can be argued to have replaced their older 13-inch and 15-inch versions, the Macbook Escape doesn't really replace anything. Or does it?</p>
<p>You see, the most likely explanation for the Macbook Escape's existence is that it is supposed to be a replacement for the aging Macbook Air. Fans have been clamoring for a &quot;Retina Macbook Air&quot; for a long time now, and I think the Macbook Escape is the closest thing we'll ever get to that. With its lower-power CPU, fewer USB-C ports (two instead of four) and lack of a Touch Bar, the Macbook Escape is noticeably segmented from its more expensive siblings.</p>
<p>Despite being less feature-packed, there are a few reasons to consider the Macbook Escape over the Touch Bar models. The most obvious one is the continued existence of the function keys on the Macbook Escape. It seems really clear that Apple is going to continue to invest in the Touch Bar no matter what users think, and it's possible that function keys on laptops may not be long for this world. But if you're determined to fight the Touch Bar tide till your last breath, the Macbook Escape will be a good holdover.</p>
<p>Another thing to consider is battery life. As mentioned earlier, the Macbook Escape comes with a lower-power CPU than its Touch Bar sibling, and that comes with a few ramifications depending on your workload. Both CPUs have a similar boost clock, which means if you are performing work that uses the CPU in big bursts such as compiling code, file exports or compression, there will very likely be little difference between the two models. If, however, your workload is constant and sustained over long periods of time such as graphics, gaming, video encoding or certain scientific simulations, CPUs can't stay on the boost clock for very long and have to operate at a lower clockspeed in order to prevent overheating. Low-power CPUs such as the ones found on the Macbook Escape tend to throttle their clockspeeds harder, which means noticeably reduced performance as the workload keeps going and going. If constant high performance is what's required, the Macbook Escape may not make the grade. With that said, being able to operate at a lower clockspeed during light workloads can be a boon to low-power CPUs, improving battery life.</p>
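<p>To make the burst-versus-sustained distinction more concrete, here's a quick toy model in Python. To be clear, every number in it (the clocks, the thermal budgets) is made up purely for illustration; these are not the actual specs of any Macbook.</p>
<pre><code class="language-python"># Toy model: average clock speed over a stretch of continuous load.
# The chip runs at its boost clock until its thermal budget runs out,
# then falls back to a lower sustained clock for the rest of the work.

def effective_clock_ghz(load_s, boost_ghz, sustained_ghz, boost_budget_s):
    boosted_s = min(load_s, boost_budget_s)  # time spent at the boost clock
    throttled_s = load_s - boosted_s         # time spent throttled
    return (boost_ghz * boosted_s + sustained_ghz * throttled_s) / load_s

# Hypothetical chips: both share a 3.1 GHz boost clock, but the low-power
# part has a smaller thermal budget and a lower sustained clock.
for load_s in (10, 60, 600):
    low = effective_clock_ghz(load_s, 3.1, 2.0, boost_budget_s=30)
    mainstream = effective_clock_ghz(load_s, 3.1, 2.7, boost_budget_s=60)
    print(f"{load_s:>4}s of load: 15W part ~{low:.2f} GHz, 28W part ~{mainstream:.2f} GHz")
</code></pre>
<p>Run it and both hypothetical chips deliver identical clocks for a 10-second burst, while the gap widens the longer the load is sustained, which is exactly the dynamic described above.</p>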
<p>Okay, so the Macbook Escape is an interesting laptop and all, but there remains an elephant in the room: If the Macbook Escape is the de facto replacement for the Macbook Air, then why the hell is Apple still selling the Macbook Air (with recent CPU speed bumps!) to this day?!</p>
<p>The price.</p>
<h4 id="price">Price</h4>
<p>What I find most frustrating about the Skylake Macbook Pro refresh is the sudden and dramatic increase in the average price of these new models. The move to a new design, a better display and the Touch Bar have all made an already expensive notebook line even more expensive.</p>
<p>Now, to be fair, this is neither completely unexpected nor unprecedented. When the Macbook Pro first went retina in 2012, it also came with a similar price increase before eventually lowering in price to more reasonable levels. But this time the price adjustment hurts more for a couple of reasons.</p>
<p>The first reason is that, Macbook Escape aside, this is the highest price that a Macbook Pro has ever sold for. At $1799, the new price floor makes this Macbook Pro a hard sell for Mac users, especially in countries like mine where the local currency has weakened against the dollar. The second reason is that the &quot;old&quot; models have not been updated even though Apple continues to sell them. Back in 2012, the old non-retina Macbook Pros got CPU refreshes along with the shiny new retina ones. Even though these models were on their way out, Apple still had the courtesy to give them one last refresh before they were sunset. Even if you didn't choose to buy the retina Macbook Pro back in 2012, $1199 still got you a better Mac notebook than the year prior. That is <em>still</em> not the case in 2017. $1299 for the &quot;old&quot; retina Macbook Pro buys you the exact same notebook as someone who bought it in 2015. I think updating the old models to Skylake or Kaby Lake could have made this latest refresh less of a bitter pill to swallow for those looking at the low-end of the Macbook Pro price range.</p>
<p>I think Apple is at least aware of the problematic pricing right now, and probably made the Macbook Escape as a response to the issue. At $1499 with 256 GB of storage, the Macbook Escape was not too shabby as a brand new no-frills Macbook Pro compared to the Touch Bar models with bleeding-edge tech. Still, $1499 ain't no $1299, which is why during WWDC Apple added a 128 GB Macbook Escape that retails for $1299, at long last reaching price parity with the old retina model.</p>
<p>I actually want to talk a bit about this 128 GB model, because there is a positive and negative way to look at this $1299 computer. The positive outlook is that the Macbook Escape is finally a proper replacement for the old retina model, unless you really prefer those legacy ports. Aside from both notebooks having 128 GB of storage, the Escape is better in nearly every way: better display, better speakers, newer ports and faster processors (especially graphics), with the only controversies being the keyboard and the loss of MagSafe.</p>
<p>As for the negative outlook? Well, it <em>still</em> comes with 128 GB of storage! I can only shake my head in disbelief that, as we enter the <em>sixth</em> year of the Retina Macbook Pro, we are still stuck with 128 GB of storage as the baseline. And you thought Apple took too long to kill the 16 GB iPhone! What's crazy is that the last 5 years saw a drastic drop in the price of SSD storage, so it's disappointing to see that the notebook redesign has effectively swallowed all of the SSD cost savings that Apple could have used to bump the storage at the low end.</p>
<p>I do hope that this awkward pricing situation will improve in time, just like with the old model in the years after 2012. For now, I'm at least glad that the Macbook Pro is finally back in sub-$1300 territory.</p>
<h4 id="conclusion">Conclusion</h4>
<p>So there you have it! A new generation of Macbook Pros with big changes and big ramifications for the future. Skylake and Kaby Lake have pulled these notebooks to the present, while USB-C is a glimpse of the future today. The new keyboard is going to remain a hot topic for years to come, while the pricing structure hopefully won't be for long.</p>
<p>It's good to see Apple re-commit to the Mac in the form of more frequent updates, and I am at least a little more optimistic about looking Apple's way again when it's time to replace my current Macbook Pro.</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[The 2016 Macbook Pros Part 3: USB-C, Thunderbolt 3 and the Touch Bar]]></title><description><![CDATA[Apple updated their latest Macbook Pros with a bunch of new technology, and it's got everyone in a frenzy! Here's a guide to the future of tech today.]]></description><link>https://ianuymatiao.com/the-2016-macbook-pros-part-3-usb-c-thunderbolt-3-and-the-touch-bar/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d61</guid><category><![CDATA[apple]]></category><category><![CDATA[hardware]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Thu, 23 Feb 2017 14:10:39 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>In case you missed it, you can check out <a href="https://www.ianuymatiao.com/thoughts-on-the-2016-macbook-pros-part-1-the-design/">Part 1</a> and <a href="https://www.ianuymatiao.com/the-2016-macbook-pro-part-2-the-specs/">Part 2</a>.</p>
<p><em>Disclaimer: These are thoughts about a line of products I have not yet used. While they are still opinions, please keep in mind that this is NOT a review.</em></p>
<p>In the world of Apple design, major changes to iconic products are renowned for what is added, as well as what gets removed. Old technology is always being replaced by the new, but I think it's fair to say that Apple is more aggressive than most in this regard, to the delight of futurists or the chagrin of power users. The 2016 Macbook Pros are the latest in Apple's relentless march of progress, and they also happen to carry some of the biggest changes in Apple's Mac lineup in a long time. We've already talked about the updated trackpad and keyboard, but those can be considered tweaks to an existing convention. This time I'm going to talk about the things that are actually new to the Macbook Pro, as well as the tech that Apple is simultaneously killing in the process.</p>
<h4 id="usbc">USB-C</h4>
<p>Perhaps sparked by the debut of Apple's proprietary Lightning connector back during the launch of the iPhone 5 in 2012, USB-IF, the consortium that runs the USB standard, decided to update the beloved USB connector after nearly two decades in service. Enter <strong>USB-C</strong>, a new USB connector standard that is designed to literally replace every other USB port currently in existence. It is one small, reversible connector that is meant for every device out there from phones to desktops; no more Regular, Mini, Micro nonsense. It can also carry a lot of data around, up to 10 Gbps when equipped with the USB 3.1 specification. The most interesting feature of USB-C, though, is that it can deliver up to 100W of power to connected devices. That's right, USB-C can charge laptops!</p>
<p>Apple has shown itself to be highly enthusiastic about the potential of USB-C on the Mac (especially since the Mac is unlikely to ever replace standard USB with Lightning), so much so that the 12-inch Macbook only comes with one USB-C port that handles data and charging, a controversial design decision that remains to this day. Thankfully Apple has included four of these forward-looking ports in their Pro lineup (though you only get two with the Macbook Escape), but like the Macbook these are the <em>only</em> ports you will ever see on these notebooks, apart from the headphone jack that interestingly survived. That means no more old-school USB, no more SD card slot, no more DisplayPort or HDMI and no more MagSafe!</p>
<p>You've probably heard a lot of uproar from the Mac community over this change, since there is nearly zero backwards compatibility with existing devices (except for headphone jack accessories). That means if you want to connect your flash drives, wired mice, external hard drives, external monitors or even your freaking iPhone to the new Macbook Pro, you're going to need new wires or adapters that connect to USB-C, and those currently don't ship free with phones, hard drives, monitors and such. It also doesn't help that these adapters are rather pricey, especially if you buy them directly from Apple.</p>
<p>I personally understand the anger and frustration that has emerged from the Mac community over this hard (rather than soft) transition that has resulted in instant (rather than gradual) obsolescence of other stuff. It really sucks to have to buy adapters just to do something perfectly normal today, especially on top of the higher price tag of these new Macbook Pros (more on price in the last part of this series), but I am so bullish on the future of USB-C that it's hard for me to really get mad at Apple over this. It really is going to be a painful transition, but I believe so much in the upside of a truly universal, reversible USB connector that this one-time transition pain over the next year or two will be worth it. USB-C is going to happen no matter what; the real question is how much of the pain you can bear. If you can't afford to bear it, you may want to stick with your existing Macbook (or get a refurbished version of a 2015-era Macbook) until USB-C accessories and cables start arriving in droves.</p>
<p>What I will miss, however, is MagSafe. I think the vast majority of Macbook users will attest that MagSafe has saved their notebook's life on at least one occasion. It was also a pretty elegant solution, glowing a different color depending on charge status without the user having to open their computer to check. What I was hoping Apple would do is keep MagSafe, yet also allow the USB-C ports to charge the Macbook in case you don't have the AC adapter with you. Unfortunately that didn't happen, and I don't look forward to hearing horror stories of premature Macbook Pro deaths due to tripping.</p>
<h5 id="thunderbolt3">Thunderbolt 3</h5>
<p>Even though Thunderbolt really hasn't taken off in a dramatic way since its debut in 2011, Apple remains a supporter of the technology, and this time around they've doubled down on the number of Thunderbolt 3 ports they're bringing to their notebook line. In the 2016 Macbook Pros, <em>each USB-C port is also a Thunderbolt 3 port</em>, something that was made possible because Thunderbolt 3 dropped Mini-DisplayPort as its connector of choice in favor of USB-C. It looks to me like the push this time is for Thunderbolt 3 to be the best USB port there is, supporting all USB speeds and standards up to USB 3.1 and allowing Thunderbolt 3 speeds of up to 40 Gbps! This does have the potential to be a bit confusing, since an external device/accessory also needs to support the Thunderbolt 3 standard in order to take advantage of Thunderbolt 3 speeds, and it would need to connect to these Macbook Pros via a Thunderbolt 3 cable. That's not really easy to figure out when that accessory on the outside looks like any other USB-C device, not to mention a Thunderbolt 3 cable looks exactly like a USB-C cable.</p>
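<p>If the gap between 10 Gbps and 40 Gbps feels abstract, a little back-of-envelope math helps. This snippet compares best-case transfer times at each link's theoretical maximum; the 100 GB figure is just an arbitrary example, and real-world throughput will be lower due to protocol overhead.</p>
<pre><code class="language-python"># Best-case transfer time: bits to move divided by link speed.
def transfer_seconds(gigabytes, link_gbps):
    return gigabytes * 8 / link_gbps  # 1 GB = 8 gigabits

for name, gbps in (("USB 3.1", 10), ("Thunderbolt 3", 40)):
    print(f"{name}: ~{transfer_seconds(100, gbps):.0f} seconds for 100 GB")
# USB 3.1: ~80 seconds for 100 GB
# Thunderbolt 3: ~20 seconds for 100 GB
</code></pre>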
<p>Something else that must be noted is that for some reason these Macbook Pros will not support <a href="https://www.macrumors.com/2016/11/03/new-macbook-pros-thunderbolt-3-compatibility/">earlier versions of currently existing Thunderbolt 3 devices</a>, or at least will not under macOS. This issue appears to stem from the use of incompatible controller chips in older accessories, despite the fact that they are still certified as Thunderbolt 3. The reason for Apple to close off these &quot;older&quot; Thunderbolt 3 accessories is not entirely clear, but if I were to take a casual guess it could be related to software stability issues. I did hear somewhere that you can get these &quot;incompatible&quot; devices to work with the Macbook Pro if it's booted under Windows via Boot Camp, but I suggest you not rely on this method to work forever.</p>
<p>If you're on the fence about purchasing Thunderbolt 3 accessories for your Macbook Pro, I suggest you stay on the fence for another year or so until we have a clear picture of what Thunderbolt 3 compatibility is really like. Right now we are still in the weeds on this one.</p>
<h4 id="thetouchbarmoreandlessfeel">The Touch Bar: More and Less Feel</h4>
<p>The most dramatic addition to the Macbook Pros would surely be the <strong>Touch Bar</strong>. Gone is the function row of keyboard keys (including the Escape and power keys), now replaced with a monolithic OLED touchscreen that spans the entire top part of the Macbook Pro keyboard. There is actually a lot of serious technology packed into this strip of glass, and it is very likely the reason behind the new Macbook Pro's increased markup.</p>
<p>For starters, the Touch Bar is basically an entire computer on its own. It has its own dedicated chip called the Apple T1, which handles the input and output to the OLED screen while simultaneously communicating with the Intel chip and Mac operating system. Even though it only powers a small part of the Macbook Pro, the Apple T1 is no dumb chip. It has enough CPU capability to run <a href="http://appleinsider.com/articles/16/11/21/developer-brings-classic-fps-doom-to-touch-bar-on-apples-macbook-pro">an entire game of Doom on the Touch Bar on its own</a> (it's not very playable, though), and it includes the Secure Enclave, the dedicated security component first found on iOS devices, which is what makes Touch ID on the Mac possible.</p>
<p>While I'm sure numerous OEMs have previously tried to come up with &quot;intelligent&quot; function rows on their keyboards, there's good reason to believe that the Touch Bar as a technology has a better chance to take off. After all, Apple's vertical integration works strongly in the Touch Bar's favor. It's difficult enough to make such a responsive and smooth secondary computer without first creating a chip as capable as the Apple T1. Even if laptop OEMs could create their own ARM chip to power the same capability, they would have to ask Microsoft for permission as to what parts of the OS their touchscreen keyboard can touch, while Apple can make these well-optimized hooks themselves and come up with a standard API that remains consistent across multiple Macbook Pros and multiple revisions of the Touch Bar. If you're a Windows developer, it's going to be a herculean task to implement your app's touchscreen keyboard feature across multiple implementations of such a technology from different manufacturers, each of which will likely have its own separate API.</p>
<p>The Touch Bar's story isn't 100% rosy, however, since removing the hardware function keys does have its own downsides. The presence of hardware keys that never move and provide tactile feedback lends itself to blind muscle memory, a power user skill that is being disrupted now that its replacement is a touchscreen. There is definitely going to be a painful transition period for developers and professionals who have relied on such consistent keyboard shortcuts and macros to get the most out of their productivity. Even though the Touch Bar still includes traditional function keys for the apps and users that need them, the tactility of the old physical keys will still be missed.</p>
<p>It is still the very early days for the new Touch Bar, but there could already be good reason for you to have it in your new Macbook Pro if you're willing to pony up what is admittedly a painful premium. The inclusion of Touch ID could be a game-changer for the Mac ecosystem, since its sheer convenience easily outdoes the cumbersomeness of the traditional password security model, meaning more users would be willing to lock down their personal computers for the sake of privacy now that it's easier to achieve. As for both 1st and 3rd party applications, things are still in flux, and nobody, not even Apple, has the right answers for how the Touch Bar should be implemented in aid of the user. If you're not one for leading-edge technology that also costs more, maybe wait a couple more Macbook Pro iterations, by which time the Touch Bar ecosystem will be healthier (and also hopefully cheaper).</p>
<hr>
<p>Next time I wrap up the series by talking about the peculiar Macbook Escape as well as pricing. Stay tuned!</p>
<p><a href="https://ianuymatiao.com/thoughts-on-the-2016-macbook-pros-part-1-the-design/">You can check out Part 1, where I talk about the Macbook Pro's overall new design.</a></p>
<p><a href="https://ianuymatiao.com/the-2016-macbook-pro-part-2-the-specs/">You can also check out Part 2, where I talk about its hardware and performance.</a></p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[The 2016 Macbook Pro Part 2: The Specs]]></title><description><![CDATA[In the second part of the Macbook Pro 2016 series, I take a look at the notebook line's new CPUs and GPUs to see if Skylake was worth the wait. ]]></description><link>https://ianuymatiao.com/the-2016-macbook-pro-part-2-the-specs/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d60</guid><category><![CDATA[apple]]></category><category><![CDATA[hardware]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Wed, 04 Jan 2017 17:15:56 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>In case you missed it, you can check out Part 1 <a href="https://ianuymatiao.com/thoughts-on-the-2016-macbook-pros-part-1-the-design/">here</a>.</p>
<p><em>Disclaimer: These are thoughts about a line of products I have not yet used. While they are still opinions, please keep in mind that this is NOT a review.</em></p>
<p>Apple is not known to fixate on the underlying specs that power their Mac lineup of computers. On the one hand,  the fact that Apple doesn't need to market their Macs based on specs is a testament to their marketability over commodity PCs. If specs were all that mattered, Macs should have been extinct a long time ago. Instead, Macs continue to sell in profitable numbers, riding on a marketing story that focuses on the experience of using a computer rather than CPU core counts and RAM size. Apple was successfully able to leverage their design chops and hardware-software synergies to come up with a line of desktops and laptops that were greater than the sum of their parts. Other PCs, by comparison, <em>were</em> the sum of their parts.</p>
<p>On the other hand, it is still super handy to care about the specs of the Mac. After all, the Mac's languishing specs are why we nerds knew a refresh was coming in the first place, and why it was better to wait. Even the Mac's software strengths must stand on a strong hardware foundation, and as macOS continues to evolve and become more sophisticated, so must the underlying hardware that powers it all.</p>
<p>In this second part, I will tackle more PC-like concerns such as the CPU and GPU to see if they are a worthy upgrade over previous iterations. Let's start by tackling a misconception.</p>
<h4 id="whynotkabylake">Why Not Kaby Lake?</h4>
<p>This has been one of the primary complaints lobbed at the new Macbook Pro, and I will do my best to explain why these complaints are completely unfounded.</p>
<p>For those who aren't yet familiar, <strong>Kaby Lake</strong> is the codename for Intel's lineup of processors for 2016. The complaint is that Apple used Intel's 2015 lineup of processors, codenamed <strong>Skylake</strong>, instead. The accusation, therefore, is that Apple is behind the PC curve because they are using last-gen processors to power their newest Macs.</p>
<p>What's actually going on is that <strong>the Kaby Lake lineup is incomplete</strong>, while Skylake provides the most up-to-date processors that Apple actually uses in its Macbook Pros.</p>
<p>Keep in mind that Intel's mobile processors need to target specific TDPs (<a href="https://en.wikipedia.org/wiki/Thermal_design_power">Thermal Design Power</a>) in order to be suitable for specific form factors or hit specific performance targets. High TDP processors can achieve high performance, but require plenty of cooling to do so, while low TDP processors are more performance-constrained, but run cool enough to be used in thinner and lighter devices. Intel's mobile lineup can be divided into four tiers, each representing a different form factor and performance target:</p>
<ul>
<li><strong>Tier 1:</strong> TDP of 5 W; Dual core; For fanless systems</li>
<li><strong>Tier 2:</strong> TDP of 15 W; Dual core; For ultraportables</li>
<li><strong>Tier 3:</strong> TDP of 28 W; Dual core; For mainstream portables</li>
<li><strong>Tier 4:</strong> TDP of 45 W; Quad core; For high performance systems</li>
</ul>
<p>As you can probably guess, you can find a Macbook for every tier on the list. <strong>Tier 1</strong> powers the 12-inch Macbook line, <strong>Tier 2</strong> powers the Macbook Air and the newly released <em>Macbook Escape</em>, <strong>Tier 3</strong> powers the 13-inch Macbook Pro, and <strong>Tier 4</strong> powers the 15-inch Macbook Pro.</p>
<p>So what does this have to do with Skylake and Kaby Lake? At the time that the new Macbook Pros were designed, announced and shipped, <strong>the Kaby Lake lineup only included Tier 1 and 2 CPUs</strong>. That means the Touch Bar-equipped 13-inch and 15-inch Macbook Pros are forced to use Skylake chips, but the Macbook Escape can't use Kaby Lake's Tier 2 chips either because of a notable omission: <strong>Intel Iris</strong>.</p>
<p>Even though Apple doesn't like including a discrete GPU in their sub-15-inch notebook lineup, they still had the decency to pack as powerful an integrated GPU as they could. When it comes to Intel's GPUs, the <em>Iris</em> lineup is the cream of the crop (more on Intel Iris in the next section). As you can probably guess, Iris GPUs have not yet arrived in Kaby Lake's Tier 2 lineup. By sticking with Skylake, Apple sacrificed CPU clock speed (the main improvement in Kaby Lake) in order to provide a huge GPU boost to the Macbook Escape. Now that the Macbook Air's successor has a Retina display, every bit of GPU improvement counts.</p>
<p>If there's any Mac that can be upgraded to Kaby Lake right now, it would be the 12-inch Macbook. I suspect Apple will update that notebook to Kaby Lake once they manage to fit the new Butterfly Switch 2.0 keyboard into that tiny chassis. As for the Macbook Pros, <a href="http://www.anandtech.com/show/10959/intel-launches-7th-generation-kaby-lake-i7-7700k-i5-7600k-i3-7350k">Kaby Lake's mobile lineup is practically complete as of CES 2017</a>. My hope is that Apple will update the Pros pretty soon, but the Mac's slowing update frequency in recent years isn't a good sign that this will be the case.</p>
<h4 id="whatskylakebringstothemacbookpros">What Skylake Brings to the Macbook Pros</h4>
<p>Usually the reason to wait for a notebook lineup to be updated with a new generation of processors is for performance reasons. The pace of CPU innovation in portables was so dramatic that it was deemed foolish to purchase a notebook (PC or Mac) with year-old specs when a new spec sheet was around the corner.</p>
<p>For CPUs those days seem to be over. In regular use mobile CPUs have been <em>good enough</em> for several years now, and when it comes to benchmarks and high performance applications Skylake has a noticeable but unremarkable performance advantage over previous generation CPUs used in older Macbook Pros. But there are non-CPU improvements in Skylake that make the wait very much worth it.</p>
<p>There is of course native support for Thunderbolt 3, a blazingly fast data transfer standard that we'll talk about in a future post.</p>
<p>Although not explicitly tied to Skylake's feature set, many notebooks in the Skylake generation (including the new Macbook Pros) also support the NVMe standard, which allows for the fastest SSD transfer speeds on the market.</p>
<p>Perhaps the most important update in Skylake, in my opinion, is the set of changes to the Iris GPU lineup. The updates to Intel Iris don't affect the 15-inch lineup since those models use AMD GPUs anyway, but Skylake is a crucial update to the 13-inch line for the graphics alone.</p>
<p>Beginning with the Skylake update, all Intel processors that come with Iris GPUs also come with 64 MB of very high-bandwidth RAM called <em>embedded DRAM</em> (eDRAM). Technically both the CPU and GPU have access to this piece of memory, but everyone including Intel knows that the main beneficiary of eDRAM is the Iris GPU. In practice this little cache effectively becomes the Iris GPU's video memory, and the inclusion of <em>any</em> video memory makes a <em>huge</em> difference in performance.</p>
<p>eDRAM used to only be found in quad-core Tier 4 CPUs that came with the higher-end <em>Iris Pro</em> GPU, but Skylake fixes this by bringing the benefits of eDRAM to Tier 2 and Tier 3 chips used in the 13-inch Macbook Pros, reducing memory bottlenecks and dramatically boosting their graphics performance. Pro applications that use the GPU should be much faster in the new 13-inch Pros, and certain categories of PC games should be more viable to play as well, but this doesn't make the 13-inch Pro a gaming machine by any modern definition.</p>
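<p>For a rough sense of why a fast local memory pool matters so much, consider the bandwidth that just the framebuffer of a Retina display demands. The resolution below is the 13-inch panel's real native resolution, but the overdraw factor is an assumption I picked for illustration, not a measured figure.</p>
<pre><code class="language-python"># Illustrative estimate of framebuffer traffic on a 13-inch Retina panel.
WIDTH, HEIGHT = 2560, 1600   # native resolution of the 13-inch Macbook Pro
BYTES_PER_PIXEL = 4          # 32-bit color
FPS = 60
OVERDRAW = 6                 # assumed pixel reads+writes per frame (made up)

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1e6
traffic_gbs = frame_mb * FPS * OVERDRAW / 1e3
print(f"One frame is ~{frame_mb:.0f} MB; traffic is ~{traffic_gbs:.1f} GB/s at {FPS} fps")
# One frame is ~16 MB; traffic is ~5.9 GB/s at 60 fps
</code></pre>
<p>The takeaway: a single frame (~16 MB) fits comfortably inside the 64 MB eDRAM pool, so much of that traffic never has to compete with the CPU for shared system memory.</p>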
<h4 id="amdpolarisonthe15inchpro">AMD Polaris on the 15-inch Pro</h4>
<p>The previous two generations of the 15-inch Macbook Pro saw the lowest end model feature the aforementioned Iris Pro GPU as its one and only graphics processor, while more expensive configurations would also throw in a dedicated GPU from either Nvidia or AMD. The newest 15-inch refresh breaks this trend by removing this previously existing price tier (around $1999), and including an AMD GPU in every 15-inch model you can buy. Curiously there are three possible AMD GPUs you could end up with depending on how much you pay for your notebook. These are the AMD Radeon Pro 450, 455 or 460, in ascending order of performance and all based on AMD's latest Polaris architecture.</p>
<p>The Polaris architecture itself is a welcome change for the Macbook Pro, bringing support for more contemporary display standards, power efficient encoding and decoding of the latest H.265 video codec, and the ability to drive up to two 5K displays at 60 Hz, on top of increased performance over previous generations for the same amount of power consumption.</p>
<p>Some professional users and gamers are upset that these Radeon GPUs are nowhere near as powerful as, say, the mobile Nvidia GTX 1060 found in the 2016 Razer Blade. The truth is that with the current state of mobile dedicated GPUs, this really is the best Apple could do, and just like with the CPUs it's all about our old friend TDP. On AMD's side, nothing is more powerful than the Radeon Pro 460, and if there were such a GPU it would easily exceed the 460's TDP of 35W<sup class="footnote-ref"><a href="#fn1" id="fnref1">[1]</a></sup>, and would likely draw more power than Apple's charger can provide and generate more heat than the notebook can vent. Nvidia has the opposite problem: the lowest-end mobile GPU based on their latest Pascal architecture, the GTX 1060, has a minimum TDP of 75W. To contain that GPU would mean making a notebook that consumed <em>a lot more</em> power than its predecessor, a precedent Apple staunchly refuses to set. Nvidia could come up with a GPU with a TDP of 35W to 45W, but that's still in the future, and it remains to be seen if that GPU would be faster than the Radeon Pro 460 that ships today.</p>
<p>These aren't bad GPUs by any means, but certain design decisions made by Apple a long time ago mean the 15-inch line doesn't have the best graphics performance per gram compared to the competition. For doing Adobe CC work inside a Mac environment, I think these GPUs will work perfectly fine. As a gaming machine, it's a rather weak value proposition.</p>
<hr>
<p>Next time I talk about other aspects of the Macbook Pro that have somehow managed to stir <em>even more</em> controversy, such as the futuristic Thunderbolt 3 ports, and of course, the widely publicized Touch Bar. Stay tuned!</p>
<p><a href="https://ianuymatiao.com/thoughts-on-the-2016-macbook-pros-part-1-the-design/">You can check out Part 1, where I talk about the Macbook Pro's new design</a></p>
<hr class="footnotes-sep">
<section class="footnotes">
<ol class="footnotes-list">
<li id="fn1" class="footnote-item"><p>As far as I can tell there is no officially labelled TDP for any of the Radeons found on the 15-inch Macbook Pros. The closest I can find is a <a href="http://www.amd.com/en-us/press-releases/Pages/radeon-pro-400-2016oct27.aspx">PR post from AMD</a> saying the <em>power draw</em> is 35W. This isn't the same as TDP, but given how processors work we can assume that the TDP would be a value close to 35W. <a href="#fnref1" class="footnote-backref">↩︎</a></p>
</li>
</ol>
</section>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[The 2016 Macbook Pros Part 1: The Design]]></title><description><![CDATA[I've got a lot on my mind about the new Macbook Pros, and in this post I give my initial (not hands-on) thoughts on the notebook's thinner design.]]></description><link>https://ianuymatiao.com/thoughts-on-the-2016-macbook-pros-part-1-the-design/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d5d</guid><category><![CDATA[apple]]></category><category><![CDATA[hardware]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Tue, 06 Dec 2016 08:00:57 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p><em>Disclaimer: These are thoughts about a line of products I have not yet used. While they are still opinions, please keep in mind that this is NOT a review.</em></p>
<p>After what felt like a year-and-a-half of deafening silence on the state of the Mac, Apple has finally given us an update to one of their most important notebook computers: the Macbook Pros. Apart from <em>much</em> needed upgrades to the internal hardware, these new Pros also sport a new design. Apple showed us their vision of what a notebook computer should be with the release of the 12-inch Macbook, and these new Macbook Pros are a continuation of that vision, for better and worse.</p>
<p>Sadly only the Macbook Pros got an update this time around. Many other Macs are in dire need of updates, and Apple hasn't been forthcoming with the state of the rest of their lineup. The iMac is still using last year's processors, the Macbook Air has an uncertain future, and the Mac Mini and Mac Pro are practically in purgatory at this point. Apple's silence on these products is a continuing source of worry and frustration for its customers, and it's in Apple's best interest to clear out the uncertainty soon lest their relationship with their customers further worsen.</p>
<p>But enough about the other Macs. Let's talk about the new Macbook Pros, shall we? I have a lot to say about these new machines, so I'm going to have to separate my overall thoughts over multiple parts. This first part is going to cover the overall physical design, including the form factor, trackpad and keyboard. Enjoy!</p>
<h4 id="thenewdesign">The New Design</h4>
<p>Apple has shown an obsession with thinness and lightness, so much so that they extend this design ethos even to their professional lineup. They have slowly shaved off the thickness from about 1 inch in 2008, to about 0.78 inches in 2012, and now around 0.6 inches in 2016. Innovations in the trackpad and keyboard (more on those in a bit) as well as thermal management have allowed Cupertino to shrink their pro notebooks into smaller and smaller packages, allowing them to fit better inside bags and cause less strain on the human back during trips.</p>
<p>While I'm sure many professionals would question why thinness and lightness should take precedence over the sheer performance that professional work requires, as a mostly non-professional user I can't help but like Apple's priorities when it comes to industrial design. Sure, chasing ever thinner enclosures may mean these notebooks will never pack the highest end processors available, but I like that Apple has a vision of notebooks becoming more portable over time while also gaining performance boosts, even if they're minor.</p>
<p>Apple has been very consistent for at least a decade about particular products targeting a particular performance class, even as they've redesigned the Macbooks again and again. Even as these notebooks get smaller, the CPU and GPU always incrementally improve with every iteration. I don't think we're at risk of Apple <em>downgrading</em> future Macbook Pros in the name of thinness.</p>
<p>As for other aspects of the design, I am very happy that they have finally gotten a metal hinge onto the Macbook Pros. It always bothered me that such a sensitive part of the notebook was made of flimsy plastic, so it's good to know that the entire body is sturdy now.</p>
<p>As for the death of the glowing Apple logo, I am surely going to miss it, but it is by no means a dealbreaker. I can easily move on from this.</p>
<p>Finally, I cannot comment on the sound quality of these new Macbook Pros because I haven't actually used one myself. For what it's worth, other reviewers have said that the sound quality has vastly improved compared to previous iterations.</p>
<h4 id="thelargertrackpad">The Larger Trackpad</h4>
<p>Apple has been using the new Force Touch trackpad across almost their entire notebook lineup for a while now, but the new Macbook Pros are further evidence for why Apple was right to move to a simulated trackpad click instead of a mechanical one.</p>
<p>Apple has taken further advantage of this electromagnet technology and has allowed the Macbook Pro's trackpad to supersize itself. Not only does the Force Touch Trackpad allow for uniform clicking across the entire glass surface (unlike traditional trackpads where the top remains unclickable due to its hinge) and avoid physical wear and tear, but the simplicity of its technology also allows it to extend to bigger sizes without certain mechanical compromises.</p>
<p>Of course, it goes without saying that if you haven't tried the Force Touch Trackpad for yourself you should give it a try before you commit to a new Mac notebook. From personally trying the 12-inch Macbook I think the Force Touch Trackpad feels pretty good, even if it could use a stronger click sensation from the electromagnet. I would gladly give up macOS's Force Touch feature if it meant I could repurpose that deeper press as just a regular click. Somewhere around there is a click setting that's perfect for me.</p>
<h4 id="thekeyboardandbutterflyswitch20">The Keyboard and Butterfly Switch 2.0</h4>
<p>The keyboard is easily the most controversial aspect of the 12-inch Macbook, and I expect these new Macbook Pros to be a very similar story. Apple's Butterfly Switch keyboards have drastically reduced key travel compared to traditional notebook keyboards, and try to make up for it with higher resistance and an almost clicky sensation upon activation. This new design has polarized many people since its inception, and I have previously expressed my dislike for its drastic loss in tactility.</p>
<p>My beef with the Butterfly Switch is that it doesn't make itself obvious when it's been properly pressed down, while many other keyboard switches do. Mechanical keyboards click, laptop keyboards bottom out, and even touchscreen keyboards have a feedback where your finger touching the screen sends a signal to your brain that you just registered a key press. Butterfly Switches in my opinion lack a distinctive sensation to indicate actuation.  The key travel is too small for distance to be a mental indication, and the click sensation isn't pronounced enough to tell my punchy finger that it has given way and has registered the key press. My brain knows I <em>hit</em> a key, but very little about the old butterfly switch told me that I actually <em>pressed</em> the key.</p>
<p>While Apple is steadfast in not increasing the key travel in Butterfly Switch 2.0 inside the new Macbook Pros, they did promise to increase the sense of <em>click</em> when the finger successfully overwhelms the keyboard's resistance. Apple VP Phil Schiller said these new switches offer a &quot;greater sense of key travel&quot;, and <em>sense</em> is the word that needs emphasis here. These new switches may not let your finger press farther, but it will at least trick you into thinking you did, which may very well work if the Force Touch Trackpad is any indication.</p>
<p>If there's any room for optimism here, many online reviewers who previously hated Butterfly Switch 1.0 have <a href="http://www.theverge.com/2016/11/14/13618142/walt-mossberg-macbook-pro-fast-slim-tweener">come around to liking version 2.0</a>. The same could happen to you or me, but this is one of those cases where you <strong>must</strong> try the new keyboard out in a store somewhere before you make any purchase decision.</p>
<hr>
<p>Next time I talk about the Macbook Pro's updated specs, to see if the new CPU and GPU are worth the wait. Stay tuned!</p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Macbook Pro Predictions Scorecard]]></title><description><![CDATA[Earlier this year I made some predictions as to what we would find in the 2016 Macbook Pro. See how well I did.]]></description><link>https://ianuymatiao.com/macbook-pro-predictions-scorecard/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d5f</guid><category><![CDATA[apple]]></category><category><![CDATA[hardware]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Thu, 24 Nov 2016 09:41:26 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>As I promised, I am going to grade myself on the predictions I made earlier this year for the <a href="https://apple.com/macbookpro">2016 Macbook Pro</a>. There are going to be some impressions and opinions expressed as I evaluate my predictions, but if you're worried that there isn't going to be much meat in this post, don't worry. I will follow up this post with a more comprehensive, more opinionated impressions article. You can look forward to that if you want me to tell you whether or not these new notebooks from Apple are worth considering. For now, though, it's mostly going to be me in the hot seat, so let's get to scoring!</p>
<h4 id="thespecs">The Specs</h4>
<p>Of all of the predictions I made, this was the one where I felt most confident, even coming up with specific parts that we could anticipate seeing. Overall I think I nailed the high-level predictions, such as the use of Intel Skylake chipsets and the TDP (thermal design power) for both the 13-inch and 15-inch models, but was hit-or-miss when it came to specific clock speeds, not to mention a huge curve ball I didn't see coming.</p>
<p>I am glad to say I was absolutely spot-on for the regular 13-inch model's chipsets. The lowest configuration did use the <strong>Core i5 6267U</strong> chip, while the maxed out model used <strong>Core i5 6567U</strong>. I also managed to guess that 8 GB of RAM would be standard, as I thought it was too soon to start at 16 GB. One prediction that I am pleased to say I got wrong was storage. I predicted it would start at <strong>128 GB</strong>, but Apple managed to go the extra mile and make <strong>256 GB</strong> the lowest configuration, something I am really happy about.</p>
<p>The 15-inch model predictions didn't do so well, however. I thought Apple would do the same thing as last time by making the entry-level 15-inch model use exclusively the highest-end integrated GPU from Intel, while reserving the discrete GPUs for the more expensive configurations. Boy, was I wrong. There is <strong>not a single Intel Iris Pro GPU</strong> to be found on any of these 15-inch models. Instead <strong>a different discrete AMD GPU is included with every configuration</strong>, including the lowest one! This meant Apple could save on costs by just opting for cheaper quad core chips that don't include the more powerful integrated GPUs, thus the use of <strong>Core i7 6700HQ</strong> at the bottom and <strong>Core i7 6920HQ</strong> at the top. The good news is that I correctly predicted the RAM, the storage and the use of AMD Polaris GPUs, so that's my consolation prize.</p>
<p>Then there is the absolute curve ball that is the 13-inch Macbook Pro sans the Touch Bar, which I'm going to call from here on out the <a href="http://atp.fm/episodes/194"><em>Macbook Escape</em></a>. Spec-wise the Macbook Escape is basically a Macbook Pro with Macbook Air specs. It includes an ultrabook-class Core <strong>i5 6360U</strong> for the base model, with an optional <strong>Core i7 6660U</strong> at the top, also ultrabook-class. What's nice about these chipsets is that they come with the eDRAM-equipped <strong>Intel Iris 540 GPUs</strong>, which is highly qualified to power these retina displays and even some light gaming on the side. Please note that these chipsets only have a TDP of 15W (versus 28W on the regular 13-inch model), meaning the CPU and GPU can't clock as high or perform as well, so expect some choking in high performance applications such as games. On the plus side the Macbook Escape shares the same 8 GB of RAM and 256 GB of storage as the regular 13-inch Pro, which is great.</p>
<p><strong>Score: B</strong></p>
<p><em>Some high-level predictions were correct, but wrong 15-inch specs and storage predictions take away some of the thunder. The Macbook Escape was an anomaly that I couldn't have predicted, but still.</em></p>
<h4 id="theports">The Ports</h4>
<p>My predictions for ports took a beating this year. Despite my best efforts to be forward-thinking yet also restrained, Apple basically went full-Apple this year and outright massacred numerous ports from their Pro lineup. Here's what I predicted earlier this year (for both 13-inch and 15-inch):</p>
<ul>
<li>MagSafe Power Connector</li>
<li>3 x Thunderbolt 3 Ports</li>
<li>1 x 3.5mm Headphone Jack</li>
<li>1 x SDXC Card Slot</li>
</ul>
<p>Here's what we actually got:</p>
<ul>
<li>4 x Thunderbolt 3 Ports (2 Ports on Macbook Escape)</li>
<li>1 x 3.5mm Headphone Jack</li>
</ul>
<p>Ouch.</p>
<p>That means we lost both MagSafe and the SD card slot, but kept the headphone jack for one more generation and gained a total of 4 USB-C ports. While I got the number of ports wrong, I at least correctly predicted that every single USB port would support Thunderbolt 3, since that would be a very Apple-y thing to do.</p>
<p><strong>Score: C+</strong></p>
<p><em>I underestimated Apple's ruthlessness when it comes to ports. Kudos for predicting the full merger of USB and Thunderbolt, and for the sparing of the headphone jack.</em></p>
<p>And now for a bit of rapid-fire scoring.</p>
<h4 id="thinnerlighterdesignandnewcolors">Thinner, Lighter Design and New Colors</h4>
<p>I said both were likely. What we got was a design that was thinner than the Macbook Air, and certainly much lighter than previous Pros. We also got a new Space Grey color.</p>
<p><strong>Score: A</strong></p>
<p><em>Spot-on prediction, but I can't give an A+ because I hedged my bet by saying likely instead of certainly.</em></p>
<h4 id="touchid">Touch ID</h4>
<p>I didn't fully commit to the prediction because I wasn't sure the Secure Enclave would be allowed on an Intel chipset. Turned out it was, so now we have Touch ID on Macs!</p>
<p><strong>Score: A</strong></p>
<p><em>Also a spot-on prediction, but no A+ due to lack of commitment.</em></p>
<h4 id="oledtouchbar">OLED Touch Bar</h4>
<p>I said it was highly improbable.</p>
<p><strong>Score: F</strong></p>
<p><em>Way off the mark.</em></p>
<h4 id="butterflyswitchkeyboard">Butterfly Switch Keyboard</h4>
<p>No actual predictions were made. Instead I pleaded with Apple not to change the keyboard, knowing a change was very likely to happen. Alas, they did. They said they made improvements to the butterfly switch, so hopefully it's much better than the one in the 12-inch Macbook.</p>
<p><strong>Score: D</strong></p>
<p><em>Did not commit to a prediction.</em></p>
<h4 id="finalscorec">Final Score: C+</h4>
<p><em>See you next time for the Macbook Pro impressions!</em></p>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Why Mac Gaming Sucks]]></title><description><![CDATA[It's time to revisit one of the most controversial aspects of the Mac platform: the games. Why is Mac gaming terrible, and is there still hope for it?]]></description><link>https://ianuymatiao.com/why-mac-gaming-sucks/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d5c</guid><category><![CDATA[apple]]></category><category><![CDATA[video games]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Mon, 24 Oct 2016 15:28:16 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Now that the Mac update dry spell is about to end <a href="http://www.apple.com/apple-events/october-2016/">this coming Thursday</a>, I thought it was time to revisit a topic that is near and dear to my heart: <strong>PC Gaming on the Mac</strong>. Ever since I started coveting a Mac back in the mid-2000s, the topic of Mac gaming has always stuck with me. I can't help it; I love PC games. My childhood memories are filled with PC platformers, strategy games and educational puzzlers. When I started high school my love for PC gaming was reignited with games like Warcraft III and its most famous mod DotA. It shouldn't come as a surprise that I wanted to have my cake and eat it too, continuing my PC gaming hobby even as I was switching from Windows. Even almost a decade into the Apple ecosystem, one thing still hasn't changed: <strong>Mac Gaming Still Sucks</strong>.</p>
<p>Despite genuinely enjoying using the Mac, I was still a PC gamer at heart, and seeing the Mac's gaming prowess go almost nowhere for almost a decade is heartbreaking. Maybe the Mac was never meant to game; maybe PC gaming is too much of a niche on top of an existing niche. Those are valid points, but it will always bother me that the potential to have the perfect PC platform for me (Gaming + Mac) was there, but never realized.</p>
<h4 id="lackofhardware">Lack of Hardware</h4>
<p>There are two components of a computer that make PC gaming remotely possible: the CPU and the GPU. If you wanted an especially rich and immersive gaming experience, having a good GPU is an absolute must. As most people who criticize the Mac will tell you, Macs don't deliver on the GPU front.</p>
<p>Ever since Macs started to use Intel processors in the mid-2000s, Apple has always stuck to using integrated graphics in their mainstream Macs. Intel's integrated GPU was enough to handle movie playback and other basic media experiences, but left a lot to be desired when it came to gaming. Adding a dedicated GPU that was actually capable of gaming would have added power consumption, weight and cost. Apple is unwilling to add any of these to their mainstream lineup, and instead reserves dedicated GPUs for their professional lineup. It is a complete lie that Macs are $2000 Facebook machines, but calling them $2000 gaming machines is right on the money.</p>
<p>To be fair, things are a lot better now than they were before for those with integrated graphics. The early-2010s saw Intel renew their commitment to integrated graphics. Integrated GPUs suddenly found themselves with more silicon, and power that previously went exclusively to the CPU was now redirected to graphics. Nowadays an integrated GPU can play some modern games at low settings, whereas before it was nigh impossible.</p>
<p>Still, PC gaming continues to be unkind to integrated GPUs and the Mac. Some big budget titles flat out require the massive amounts of graphics horsepower that only a separate graphics chip can provide. Unfortunately this is a physics problem that is extremely difficult to overcome. As long as the average gaming PC continues to run hardware that consumes many times more power than a Mac notebook to run its games, PC games will continue to target the kind of computers that are antithetical to what makes the Mac so great. For there to be a level playing field between Macs and gaming PCs, either the Mac needs to add more power consumption to access more firepower, or the PC needs to lower its definition of &quot;high performance&quot; to include low-power computers like the 13-inch Macbook Pro. Neither is budging.</p>
<h4 id="lackofsoftware">Lack of Software</h4>
<p>It's bad enough that the Mac is locked out of running a certain class of games based on hardware alone, but the software aspect of game development makes things even worse. Microsoft's DirectX API contributed to Windows' dominance in the PC gaming market, and as the industry standard it very much ensured that PC games targeted Windows first, and often exclusively.</p>
<p>Developers wishing to support other platforms had to rewrite their games to support an alternative graphics API like OpenGL. While OpenGL is an open standard and is supported across multiple platforms (including Windows and Mac), its ancient heritage meant it wasn't optimized for modern computer architectures, and lacked game-specific features that made DirectX an industry powerhouse to begin with. It also didn't help that OpenGL was designed by committee, meaning its development was bogged down by bureaucracy. Microsoft, on the other hand, can iterate DirectX as quickly as it wants.</p>
<p>The legacy cruft and slow development of OpenGL can be blamed for the Mac's lackluster support for PC games that could have actually run on its hardware. Combine that with the fact that DirectX was (arguably) more developer-friendly, was reliably supported by every graphics vendor, and immediately reached over 90% of the PC market, and is it any surprise that Mac gaming is seen as an afterthought?</p>
<h4 id="otherlittlethings">Other Little Things</h4>
<p>There are other little things that make Mac gaming an even worse proposition. Support for controllers is abysmal, meaning games like platformers don't have plug and play support on the Mac. Apple also doesn't support the latest version of OpenGL, making the already herculean task of porting games to the Mac even more difficult, since its cross-platform API is behind other platforms when it comes to features.</p>
<p>Lastly there's the fact that Apple no longer updates its Macs as aggressively as it used to. That means the GPUs in our Macs are falling behind in the gaming race simply because they are getting older, but Apple isn't providing updated hardware to keep up with the times. The fact that Apple's high-end Mac Pro doesn't have the hardware to support VR should say a lot. This might get partially resolved with this week's keynote, but it doesn't excuse the agonizing waiting game that Apple made us play these past couple of years.</p>
<h4 id="glimmersofhope">Glimmers of Hope</h4>
<p>Things are in dire straits right now for Mac gamers, but no matter how much it sucks, I still see hope for Mac gaming going forward.</p>
<p>The first piece of optimism I have is the possibility for external GPUs. The Macs that are expected to be introduced this Thursday will most likely come with Thunderbolt 3, a data transfer protocol with enough bandwidth to support external graphics cards. This means Apple can continue to make their Macs thinner and lighter, while a gamer can use an external GPU to turn her Mac into a viable gaming machine. Right now the technology is <a href="http://www.razerzone.com/store/razer-core">still really expensive</a>, but hopefully that will change a few years from now.</p>
<p>Another reason to get excited is the meteoric rise of third-party game engines. The increased horsepower of mobile devices has allowed companies like Unity and Epic Games to scale their game engines from smartphones all the way up to consoles and gaming rigs. This desire to embrace cross-platform development has benefited the Mac as well. For the first time in a long time the Mac is now a first-class citizen of the Unreal Engine.</p>
<p>Lastly, I think Apple's Metal API is going to help the Mac in the long run.<sup class="footnote-ref"><a href="#fn1" id="fnref1">[1]</a></sup> When the Unity Engine and Unreal Engine finally support Metal, there will finally be the potential for high quality games that run very well on the Mac. Many games now run on either of these engines, and adding Metal support could actually legitimize the Mac as a viable gaming platform.</p>
<p>Not all of these things are set in stone, so a Mac gaming renaissance is far from guaranteed. Still, it would be nice to finally live in a future where Mac gaming is a thing. This week's keynote is unlikely to immediately solve the Mac's gaming pains, but maybe it doesn't have to. Maybe all that Apple needs to do is continue to make the Mac a platform worth investing in, and should these upcoming developments come true, everything else will hopefully fall into place.</p>
<hr class="footnotes-sep">
<section class="footnotes">
<ol class="footnotes-list">
<li id="fn1" class="footnote-item"><p>To be clear I would still prefer that Apple add Vulkan support to iOS and macOS in addition to Metal. For developers making their own game engine, the idea of supporting a third graphics API doesn't do the Mac any favors. At least iOS has market leverage; macOS doesn't. <a href="#fnref1" class="footnote-backref">↩︎</a></p>
</li>
</ol>
</section>
<!--kg-card-end: markdown-->]]></content:encoded></item><item><title><![CDATA[Android's Update Problem: 2016 Edition]]></title><description><![CDATA[Another Android update. Another Android update problem. We wish it would go away, and yet here it is, eight years after launch. How has it changed in 2016?]]></description><link>https://ianuymatiao.com/androids-update-problem-2016-edition/</link><guid isPermaLink="false">5bfea3b18990670fd9b92d5b</guid><category><![CDATA[android]]></category><dc:creator><![CDATA[Ian Uymatiao]]></dc:creator><pubDate>Tue, 13 Sep 2016 10:44:15 GMT</pubDate><content:encoded><![CDATA[<!--kg-card-begin: markdown--><p>Android 7.0 Nougat is <a href="https://www.android.com/versions/nougat-7-0/">finally upon us</a>! It comes with awesome new features such as split-screen apps, native VR support and better power management. On the backend side of things Nougat also introduces some much-needed bug fixes to prevent future security exploits like Stagefright. It's looking to be a pretty nifty update, but what's also upon us is yet another year where we have to contend with the recurring elephant in the room that is Android's OS update problem.</p>
<p>Every yearly Android update is bittersweet for several million mobile users. Unlike the reliably updated iOS, Android updates start out as complex situations and only get worse from there. This conundrum has been a notable part of Android's story since its rise to prominence, and despite herculean efforts by Google there appears to be no end in sight.</p>
<p>Expect future editions of this series in the coming years.</p>
<h4 id="googlebypassesthemonolithicupdates">Google Bypasses the Monolithic Updates</h4>
<p>It was in Google's best interest to keep Android up to speed in order to ensure the best Android experience possible. A less fragmented install base meant easier (and cheaper) maintenance, and happy users meant a reliable number of users to sell ads to. The problem is that the business model of an OEM or a carrier is intrinsically at odds with software updates (especially free ones). That means the old way of bringing in new features through monolithic, over-the-air updates wasn't going to cut it.</p>
<p>Google essentially came up with two big solutions that still positively affect the Android ecosystem today. The first solution was <a href="https://en.wikipedia.org/wiki/Google_Play_Services"><strong>Google Play Services</strong></a>, an Android superapp that has been installed on every Android phone made since 2010 and contains all the latest Google APIs. Since Play Services updates itself through the Play Store (this is how it adds new Google APIs) without having to wait on OEMs or carriers, even third-party apps running on old versions of Android have access to the latest Google features. A prime example of Google Play Services at work is <a href="https://developers.google.com/nearby/">Nearby</a>, a new set of proximity-aware APIs that Google delivered to existing phones without any OS update at all.</p>
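<p>To make the mechanism concrete, here's a minimal sketch of the pattern (the <code>PlayServicesCheck</code> class is my own hypothetical helper, though <code>GoogleApiAvailability</code> is the standard entry point). An app verifies at runtime that Play Services on the device is present and recent enough, and if it isn't, it points the user to the Play Store; the Android version itself never enters the picture:</p>
<pre><code class="language-java">// Hypothetical sketch: gate Google API usage on the Play Services version,
// not the OS version. Play Services updates itself through the Play Store.
import com.google.android.gms.common.ConnectionResult;
import com.google.android.gms.common.GoogleApiAvailability;

public class PlayServicesCheck {
    private static final int REQUEST_PLAY_SERVICES = 9000;

    public static boolean ensurePlayServices(android.app.Activity activity) {
        GoogleApiAvailability availability = GoogleApiAvailability.getInstance();
        int result = availability.isGooglePlayServicesAvailable(activity);
        if (result == ConnectionResult.SUCCESS) {
            return true; // The latest Google APIs are ready, even on an old OS build.
        }
        // Play Services is missing or outdated; prompt the user to update it
        // from the Play Store instead of waiting on an OEM or carrier.
        availability.getErrorDialog(activity, result, REQUEST_PLAY_SERVICES).show();
        return false;
    }
}
</code></pre>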
<p>The second solution Google came up with was the <a href="https://developer.android.com/topic/libraries/support-library/index.html"><strong>Android Support Library</strong></a>. Google's idea for how an Android app should look and behave is constantly changing, and the Support Library was its way of allowing developers to implement new Android app design on older phones. Instead of making apps rely on features included with a specific Android version, these apps can now ask the Support Library for those same features, and the Library itself will take care of implementing those features over multiple versions of Android.</p>
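<p>As a rough illustration of the same idea (the helper class below is hypothetical, not Google's code), an app can build a notification through the Support Library's <code>NotificationCompat</code> instead of the framework's own builder. The library then maps newer styling features onto whatever the device's Android version actually supports:</p>
<pre><code class="language-java">// Hypothetical sketch: one code path that works across Android versions
// because NotificationCompat handles the per-version differences internally.
import android.app.Notification;
import android.content.Context;
import android.support.v4.app.NotificationCompat;

public class NotificationHelper {
    public static Notification buildSyncNotice(Context context, int iconRes) {
        return new NotificationCompat.Builder(context)
                .setSmallIcon(iconRes)
                .setContentTitle("Sync complete")
                .setContentText("Your data is up to date.")
                // Expanded big-text styling is quietly degraded on very old releases.
                .setStyle(new NotificationCompat.BigTextStyle()
                        .bigText("Your data is up to date across all of your devices."))
                .build();
    }
}
</code></pre>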
<p>The result of both of these Google solutions is that even phones running a version of Android made in 2011 can run an app that was built, designed and released in 2016.</p>
<h4 id="areandroidupdatesstillimportant">Are Android Updates Still Important?</h4>
<p>Despite Google's efforts to decouple vital parts of the Android experience from the monolithic updates of the old world, some things can really only be updated through the operating system. Things like support for new Bluetooth standards, updated core Android behavior (like multi-window apps), and security updates to the Linux kernel will remain things that OEMs and carriers can neglect to push to customers. Even the Android Support Library has its limits, as parts of the new Android app experience such as smooth transitions and new camera APIs remain locked behind newer versions of Android. As cool as it is to backport Material Design to phones running Android Gingerbread, it would be far easier for Google, developers and users alike if those phones could just upgrade to Android Nougat.</p>
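<p>For the features that can't be backported, developers are left with runtime version checks, something like the hypothetical sketch below. A phone stranded on an old release simply never gets the newer API, no matter how current the app is:</p>
<pre><code class="language-java">// Hypothetical sketch: the workaround for features the Support Library
// cannot backport is to branch on the OS version at runtime.
import android.os.Build;

public class CameraApiSelector {
    public static String pickCameraApi() {
        // The camera2 API shipped with Android 5.0 (Lollipop) and lives in the
        // OS itself, so a phone stuck on KitKat will never see it.
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
            return "android.hardware.camera2";
        }
        return "android.hardware.Camera (the deprecated legacy API)";
    }
}
</code></pre>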
<p>While you can argue that having support for the latest version of Bluetooth isn't very important, security continues to be a thorn in Android's side. There are many vulnerabilities that Google Play Services simply can't patch on its own, Stagefright being the most notable one. While Google Play Services and the Google Play Store are excellent at preventing malware from being installed on users' phones, there are still many ways to hack into an Android phone.</p>
<p>Attacks like <a href="https://en.wikipedia.org/wiki/Stagefright_(bug)">Stagefright</a> use specially crafted audio and video files to perform a hostile takeover of one's phone, and there's no shortage of media on the open web. The worst examples of these kinds of attacks don't even need the user to do anything; <a href="http://arstechnica.com/security/2016/09/two-critical-bugs-and-more-malicious-apps-make-for-a-bad-week-for-android/">merely being sent a text with a malformed video</a> is all it takes to activate some exploits.</p>
<p>Google has done its best to make security patches more accessible, even if they have to go through the monolithic update process. There are now <a href="https://source.android.com/security/bulletin/">much smaller Android updates available to OEMs</a> that focus on security patches and hopefully only need minor testing before being sent to users. Nevertheless trust issues with OEMs still remain.</p>
<p>Google is also pursuing some longer-term solutions. Problems like Stagefright largely stem from a specific Android module called <em>mediaserver</em>, which has been responsible for handling video and audio for Android since the beginning. Google decided that it needed to rewrite <em>mediaserver</em> if it wanted to close off malformed media as an attack vector for good. That's why a <a href="http://android-developers.blogspot.com/2016/09/security-enhancements-in-nougat.html">new and improved <em>mediaserver</em></a> shipped with the Android Nougat update. There's just one issue...</p>
<h4 id="thegreatandroiddivideof2016">The Great Android Divide of 2016</h4>
<p>The problem is that a large number of Android phones, many of them flagships, will never see Nougat arrive, not officially at least. While the details remain the realm of speculation, word on the street is that <a href="http://www.androidheadlines.com/2016/08/rumor-snapdragon-800-and-801-wont-get-android-nougat.html">Qualcomm will not provide Nougat drivers for the Snapdragon 800 and 801 chips</a>. It sounds harmless at first blush until you realize that these chips power a non-trivial number of flagships that really aren't that old. These flagship phones include the Google Nexus 5, the Samsung Galaxy Note 3, the Samsung Galaxy S5, the HTC One M8, the LG G3, the Sony Xperia Z1, Z2 and Z3, and the OnePlus One. These are all still perfectly usable, performant phones, but a business decision upstream means that they may remain vulnerable to exploits still residing in legacy code.</p>
<p>The problem with Android updates is that all involved parties (and there are a lot of them) need to put their stamp of approval on an update before it ever reaches your phone. What's so damning about this year is that for all of the flagship phones I just mentioned, the Nougat update didn't even get past the first hurdle (the chipmaker, a.k.a. Qualcomm).</p>
<p>Oh, and there's still the fact that an even larger number of users with equally old non-flagship Android phones have been left out by extension as well.</p>
<h4 id="wheredowegofromhere">Where Do We Go From Here?</h4>
<p>A part of me wants to say that we shouldn't feel too bad about not getting Nougat (I personally own a Sony Xperia Z3 Compact), but the spectre of Stagefright (and all its sinister sibling exploits) looms too large for me to ignore. Android is a platform with many layers, and while Google has firm control over the top layers, too many parties with little skin in the game have too much control over the lower ones.</p>
<p>It's hard to tell if Google will be able to expand Google Play Services' powers to cover more parts of Android like <em>mediaserver</em> or the Linux kernel, but I wouldn't hold my breath on that.</p>
<p>As for the Snapdragon 800/801 conundrum, while it's painful to see these chips dropped so soon, it's probably because they didn't support newer features and APIs<sup class="footnote-ref"><a href="#fn1" id="fnref1">[1]</a></sup>, the kind that modern chips thankfully already support. My hope is that we won't see another sharp divide like this in a long time.</p>
<p>If you are an owner of an unsupported phone like the ones mentioned earlier and are concerned with currently existing vulnerabilities, it may be time to exercise your right as an Android user and look into custom ROM options. Hopefully your phone manufacturer is amenable to unlocking your phone's bootloader so that this option is even possible in the first place. If not, hopefully the security problems plaguing Android right now will make them realize that unlocking abandoned Android phones is a moral obligation.</p>
<hr class="footnotes-sep">
<section class="footnotes">
<ol class="footnotes-list">
<li id="fn1" class="footnote-item"><p>I'm specifically referring to the Vulkan graphics API and native support for encryption. The Snapdragon 800 and 801 are <a href="http://arstechnica.com/gadgets/2016/08/why-isnt-your-old-phone-getting-nougat-theres-blame-enough-to-go-around/">possibly being left behind</a> because they support neither of these. <a href="#fnref1" class="footnote-backref">↩︎</a></p>
</li>
</ol>
</section>
<!--kg-card-end: markdown-->]]></content:encoded></item></channel></rss>