Apple’s App Store Issues

If you’ve been under a technology rock, you might have missed the kerfuffle Apple’s been in for the past few months. We’ve seen a few high-profile dust-ups over Apple’s control of what goes on the App Store (HEY, Microsoft’s xCloud, Fortnite). The arguments vary for each of these, but the common issue is that Apple seeks to control how developers build their apps, wants to take a cut of all revenue coming into their apps regardless of how much value the store provides, and restricts many types of apps based on what tend to be arbitrary standards.

There’s a good read on Stratechery about this same issue from an economic / antitrust angle that I recommend you check out for more detail.

If Apple isn’t careful, they’re going to wade into antitrust regulation that could potentially strip the company of a lot of control over their store. If they get ahead of it, they can set the terms. Here’s what I wish they’d do:

Reduce App Store Commission

Reduce the cut for iOS purchases. You would see fewer complaints about other problems with the App Store if the cut Apple took was closer to 10-15%. I doubt that will happen without government intervention, but one can dream.

Sideload apps

Apple should allow users to sideload apps like they can on a Mac. Here’s what a user sees in the security panel on MacOS:

Apple could require all apps to be signed to maintain a level of “break in case of emergency” control. Even if iOS required users to plug into a computer and load an .ipa rather than a more seamless TestFlight-like experience, that’d solve for apps that are categorically not allowed (xCloud), apps that want to do their own thing payment-wise (Fortnite), and fringe jailbreak-like apps.

Clearer Rules

Next up, Apple should revamp their rules to reflect the world of 2020, not 2007. Clearer rules for developers, with an escape hatch to sideload if push comes to shove, would make most folks happy. As it currently stands, many developers are fearful of investing time and money into an app that may be rejected on a technicality.

In-app Links

Apple should allow apps like Netflix, Kindle and Fortnite to send users to an in-app web view where they can purchase in-app content or sign up for the service. Apple would not get a cut of these purchases. Let the better experience and safety of Apple’s IAP compete to win out over a popover web view.

Will Any of This Happen?

I don’t anticipate they’ll do any of these, unfortunately, especially the commission cut. I do worry that Apple is stifling innovation on their platform, and if they do it enough times you could see entire categories of users start to choose Android over iOS because there are important things they just can’t do on iOS. Most of the things Apple has gotten into hot water for lately are not policies that put customers first. Instead, they are things that solidify Apple’s ability to make money, protect their interests or keep things “simple”. Given the push to present the iPad Pro as a computer, their limitations on the types of things that are allowed on iOS make me reconsider how much I’d like to invest in iPads or iPhones.

Even Apple’s privacy push could have an unintended outcome. A lot of apps rely on advertising to make their money and if Apple makes it prohibitively difficult for developers to monetize their apps, they may choose to slow or stop development on the platform. It is a difficult tightrope to walk but I trust that Apple can do it. Whether they will is another story.

Album-focused Music Apps

Call me old fashioned, but I love queueing up albums and listening to them all the way through. Nowadays, playlists are all the rage, but because listening to albums on a CD changer was the way I grew up listening to music, I still enjoy hearing an entire album from start to finish. For me, it tends to evoke more memories than a random song showing up on a playlist. While one of the primary reasons I switched to Apple Music earlier this year was better album support, the app (especially on iOS) still could use some work to make albums feel like first-class citizens.

But there’s good news! As the Apple Music API has gotten more robust, more apps have been released to deliver niche music experiences on iOS. In the past month, 2 such apps have come out – Albums and Longplay. To my delight, both focus on allowing users to play their library in a way that’s “album first” – sorting albums based on certain criteria and playing them in their entirety. Both of these apps do a lot of similar things, but I thought it’d be worthwhile to highlight the pros and cons of both.

Albums 3.0

Albums is over a year old but the 3.0 release is a big one. Adam Linder, the developer of the app, has added a ton to the latest version. You have a few views at your disposal:

The main view in Albums 3

  • Albums – the traditional grid-based layout that lets you perform basic filtering based on album play count, date added, etc. Tapping on any album starts playing it.
  • Library – a more granular breakdown that allows you to drill down by genre, decade, artist and more.
  • Insights – ‘smart playlists’ of albums that meet criteria like unplayed albums, old favorites, only listened to once, and more.
  • Stats – dashboards that let you see which albums have been played the most.

The good:

  • Super active development gives me hope that the stability issues (see below) will be worked out eventually.
  • I love all of the ways you can sort and visualize albums.
  • Tons of settings you can adjust to your style.
  • You can view different sorts of stats for an album (play ranking for the year, compared to other albums by the same artist, etc).
  • The now playing view gives you a track listing, album metadata as well as stats about the album. The progress bar is also very interesting, as it shows you each song’s progress as part of the album.
The now playing screen

The not so good:

  • It’s pretty glitchy. The app crashes a decent amount, things jump around at times (especially on an iPad, where I use it in split view from time to time), there’s a lot of room to improve the UX, and the visual consistency is lacking.
  • It’s yet-another-subscription if you want all of the features. It’s only a buck a month but the mental overhead of subscribing for yet another app isn’t ideal for me. Still, I signed up for a 1-year subscription ($10) to see where things are going and to support development.
  • Due to some limitations in the way the Apple Music API works, a lot of the play recency stats seem to be tied to your device. You may have out-of-sync sorting between the iPhone and the iPad.

Longplay

Longplay is an app I just found out about in the past few days. This app is a lot simpler but approaches the job in a similar fashion. There are no stats or advanced sorting options, so this is a bit more like what Albums 2.0 was. Still, there’s a lot to like here.

This is the extent of the UI, but it gives you about everything that you need

The good:

  • It’s only $2.99. Sold.
  • Playlists are included along with albums!
  • You can long press and hide an album or playlist from the wall of art.
  • Visually, it’s very clean.

The not so good:

  • This app is really basic right now. The now playing screen is essentially a blown up version of the album art.
  • I initially thought it was iPad-only, but it’s on the phone too, so scratch that from the list.

Anyway, I’d recommend either of these apps if you’re looking for a way to sort through, rediscover and shuffle your albums.

WWDC 2020 Initial Thoughts

The WWDC 2020 “pandemic edition” is now behind us, and it was one of the better ones I’ve seen in quite some time. Apple announced a lot in the 2-hour presentation, with iOS and MacOS getting the bulk of the attention this year. What follows is a quick rundown of my thoughts after watching the keynote last night. If you want to dive deep, you should follow MacStories this week. They have a ton of content already.

Overall

  • The presentation style was great – it was tight, dense and well paced. Some of the zooming around campus stuff was kinda cheesy, but I approve of most of the dad humor they use these days. Hopefully this is the future of the keynote, although I doubt it.
  • The Music app seems to be getting way better search, filtering within lists and a redesigned start view that will replace “For You”.

iOS

iOS got a TON of attention this year. I was very impressed with this part of the presentation.

  • The App Library looks fantastic. I’ll be hiding everything but my first screen when iOS 14 is out.
  • The “Smart Stack” suggested widgets on your home screen could be neat … but so could the Siri watch face on the Apple Watch.
  • I hope App Clips catch on. Can’t wait to delete a lot of the parking & other one-off apps from my phone. The restaurant-specific pages within an app like Yelp are interesting.
  • Based on the screenshots I saw during the presentation, it appears that the Apple notes texture background is gone!
  • The Siri redesign looks fantastic. I’m interested to see if the Siri enhancements are only skin deep, however. The on-device changes to dictation will hopefully speed things up so my voice commands to turn off the lights don’t need to go to space and back.
  • Maps got cycling directions! I hope a basic version works everywhere at launch, as I don’t live in a big city. I’m more interested in time/elevation data when planning a bike ride.
  • Tons of Messages group chat enhancements, pinning convos, threading and mentions. And all on the Mac.
  • Emoji search!
  • 3rd party email and browser support should spur more innovation in those areas.
  • The minimal incoming call UI is much-welcomed.
  • In iOS 14, when apps ask for access to your Photos app, you can give them access only to select photos rather than the entire Photo Library.
  • Dictation is now on-device. I hope this is also for Siri commands in general.

iPadOS

iPadOS got some updates, but nothing like last year. That said, if we can even see incremental, iPad-focused additions yearly, I’m okay with that.

  • Apple Pencil features – shape detection and copy/paste from written text will increase my pencil use by a lot.
  • FaceTime eye correction
  • It doesn’t appear that iPadOS will allow the App Library or widgets alongside the grid. Why?
  • Adding sidebars and context menus alone will help those in the “desktop replacement” crowd.
  • The search changes look fantastic.

MacOS

The highlights of this part of the presentation were the iPadification of the UI/UX and the announcement of the ARM … err “Apple Silicon” … transition.

  • The new macOS UI looks really nice. Appreciate Apple bringing things together but allowing each platform to do its own thing.
  • Catalyst updates are appreciated, but it still has so far to go. I feel like some developers might just skip the whole thing and put their iPad apps in the Mac App Store once the ARM transition is in flight.
  • Some of the Big Sur Dock icons are … horrific.

WatchOS

  • Finally, you can add multiple complications from the same app.
  • The watch/iPhone wind down functionality integrated with sleep tracking and battery notifications seems to be exactly what I’m looking for. I think the market for sleep apps will probably need to evolve depending on how advanced the native functionality is, but apps that give more data ABOUT your sleep will probably surge. I love AutoSleep, but if the built-in stuff is better I’ll go with it.

Misc

  • tvOS got a lot of polish, especially around the Home integration. I’ve definitely tried to invest in HomeKit stuff around the house and am tempted to get a few cameras now that they’re more integrated with HomeKit.
  • The AirPods features look amazing. I’ll be curious to see how clever it tries to be, however. The accelerometer work to keep the surround sound in sync is mind-blowing. I have gen 1 AirPods, but I’m looking forward to getting some Pros next year.
  • HomePod 3rd party music support! I hope they allow folks to set a 3rd party as default.
  • A new toggle has been added for time-based shortcut automations. Now these kinds of automations can be executed automatically without tapping on a notification first.
  • Did anyone else notice the small HomePod icon on one of the slides?
  • iOS 14 adds a new Accessibility feature that allows you to perform different actions by tapping on the back of your iPhone. For instance, you can make it such that when you double tap the back of your iPhone, you are taken to the home screen, or open the camera or even run a shortcut!
  • I heard the word “private” about a million times. I love that privacy has really become ingrained in every decision the company makes. Using ‘approximate location’ for weather apps that only need your zip code should help kneecap a lot of the tracking apps out there.
  • Speaking of privacy, it looks like tracker blocking support for app analytics and things like Google analytics is coming to iOS and MacOS.

How’d my wishlist fare?

About a month ago, I posted a wishlist for WWDC. How’d Apple nerd Christmas work out for me?

On first read, I think I got 5 of the iOS updates, 1 of the iPadOS updates, and 2 of the miscellaneous ones. Some will reveal themselves over time, but I’m still pretty happy with the first glance from yesterday’s keynote.

Craig Federighi on Apple’s WWDC privacy news

From Michael Grothaus at Fast Company:

“We think we’re showing the way to the industry, to the customer, that they can demand more–they should expect more–about the protection of their privacy, and that we can help move the industry into building things that better protect privacy.”

[…]

“I think the protections that we’re building in, to intimately say that the customer’s device is in service of the customer, not of another company or entity–the customer is the one who is in control of their data and their device–is what’s most compatible with human rights and the interest of society,” Federighi says. “And so that’s what we’re going to keep trying to support–our customers being in control of their privacy.”

Glad this is getting more mainstream attention. The biggest features mentioned in this article are:

  • Approximate location, sharing which quadrant of a worldwide grid you’re in, not your exact location. This is something that’s gotten more attention lately, and I’m really pleased they’re doing this.
  • Cross-tracking prevention. Advertisers and data brokers have used these techniques to build a profile on all of us over the years.
  • Categorized data that’s being tracked, broken up by “type” (up to 31 types!) in the App Store.
  • Better password security notifications
  • Enhanced tracker blocking in Safari
  • Enhanced Safari extension support and security controls around permissions
  • Camera and mic notifications to let users know when either are active
  • Photo selection security

I believe that Apple’s stance on this has moved Google and Facebook in a better direction when it comes to security and privacy. Regardless of your opinion on their products, you should be thankful they’re pushing so hard on this.

Casting Google’s Speakers Aside

See what I did there?

As mentioned recently, I have switched over to Apple Music from Spotify. Part of the decision was based on personal preferences around the 2 services, but the reason I was reluctant to drop Spotify in the first place was the lock-in I had with Google’s Chromecast ecosystem. As it turns out, by looking to invest in nicer speakers I ended up switching services and voice assistants along the way. I thought it’d be worth discussing why I decided to move to Sonos from the Chromecast setup we had, and some of the pros and cons I’ve noticed in the past few months.

Google stops playing (and sounding) nice

Something funny happened in the past year or so. Google, long known as the ‘open’ ecosystem, became a bit less so. With continued integration between the Nest and Google lines, it’s becoming less open and more of an ecosystem play with Google’s products. That’s fine, but it’s not why I initially bought Chromecasts, Google (now Nest) Hubs, etc. I was hopeful they’d give me the best shot of buying nearly any smart home product and they’d work.

Combine that with an increasing discomfort with Google’s data collection across more and more areas and mediocre sound quality on the Google Homes (and especially the Nest Hub & Home Minis), and I was interested in checking out a different approach to whole-home audio.

A few months ago I posted an article about Google slowly locking down their smart assistant ecosystem and how I felt like it was time to explore a change. My home setup was a few Google Homes & Minis, 2 Chromecast Audios plugged into existing speaker setups on our deck and patio areas, and a Google Nest Hub in our kitchen. We used Spotify for the most part, but I missed the feeling I used to have when using iTunes / Apple Music in years prior. Specifically, I’ve always been more interested in albums, and Spotify is very playlist and “mood” centric. I think there’s a time and place for that, but in general I was questioning the value of paying for Spotify despite its strengths compared to Apple Music.

Outside of the Google Home stuff, most of our “smart home” stuff is pretty platform agnostic:

  • 2 Nest thermostats
  • A bunch of Wemo and iHome smart plugs
  • MyQ garage door
  • A Roomba
  • A HomePod (obviously the biggest outlier)

I’ve mostly relied on Homebridge running on a Raspberry Pi to stitch everything together so that we can use HomeKit scenes to automate our routines (morning, evening, leaving & arriving home). We don’t really automate a ton, but I like being able to make sure the garage is closed if we’re both away for a certain period of time, the lights are off if we’re away, or that they come on if we are home and it’s almost sunset. Overall, pretty basic stuff – I’ve grown kind of sour on most of the stuff “smart” home devices offer these days, so we’ve kept things pretty simple at our new house.
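For the curious, a Homebridge setup like this mostly boils down to a single config.json on the Pi. Here’s a rough sketch of its shape – the platform names and fields (“Nest”, “BelkinWeMo”, “myQ”) come from whichever community plugins you install and vary by plugin version, so treat this as illustrative rather than copy-paste ready:

```json
{
  "bridge": {
    "name": "Homebridge",
    "username": "CC:22:3D:E3:CE:30",
    "port": 51826,
    "pin": "031-45-154"
  },
  "platforms": [
    {
      "platform": "Nest",
      "access_token": "<token from the Nest developer console>"
    },
    {
      "platform": "BelkinWeMo"
    },
    {
      "platform": "myQ",
      "email": "<myQ account email>",
      "password": "<myQ account password>"
    }
  ]
}
```

Once Homebridge pairs with the Home app, everything it bridges shows up as native HomeKit accessories, which is what makes the scenes and automations possible.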

If we were going to ditch the Google Homes, we needed something to replace them with something that provided great sound, integrated with whatever music service we wanted, and worked in multiple rooms. Enter Sonos.

Why did I choose Sonos?

I’d been thinking about getting Sonos speakers for years now, as I wanted to get something that was service and platform agnostic. Sonos nails that – they integrate with all of the major streaming services, podcast services, audiobook vendors and even offer multiple options for voice assistants (Google Assistant and Alexa). Throw in Airplay 2 support and it was a no-brainer to upgrade most of our Google Home devices with Sonos Ones. One of my favorite things about the Sonos ecosystem is that you can control the speakers via their app or most services’ default apps (Apple Music is an exception, no huge surprise there).

There was a catch with our house – we have outdoor speakers that wouldn’t be easy to hook up to a Sonos speaker. To get our deck wired up, we replaced the Chromecast Audios we were using with 2 Airport Express units that I bought off of eBay. They’re AirPlay 2 compatible, so I was able to plug them straight into the amps for the 2 outdoor speakers we have and we had an Airplay 2 optimized home. Instead of spending hundreds for a Sonos amp, I was able to get something “good enough” for around $45.

Comparing AirPlay 2 to Casting

Previously, we had an entire setup that was all Google Cast powered, so we could ask any speaker to play music and it’d start playing Spotify wherever we wanted. With Sonos speakers, we introduced some small trade offs for the additional flexibility and sound quality. Some of the key differences between Airplay 2 and Casting:

  • Casting isn’t tied to your device at all, while AirPlay 2 still relies on a source device streaming to each speaker. That means if you stray too far from your WiFi while controlling music, it’ll eventually stop playing. That’s only the case for AirPlay 2 based streams, though, not native Sonos ones.
  • Native iOS integration of Airplay 2 means that management of whole-home audio is much easier than it was from Spotify or the Google Home app (from control center or the Apple Watch now playing screen you can control any speaker that’s playing music)
  • Google Cast allows you to create named groups to send music to, while Airplay 2 uses your house layout to dictate grouping. Invoking an entire floor is pretty easy on both platforms but if I want to only call on a subset of speakers I could name that subset with Cast, where on Airplay I’d need to ask for each room when invoking that subset. Hoping I can eventually use HomePod shortcuts integration to fix this.
  • I use apps to invoke music way more than by voice now. This is actually a good thing because previously I’d typically ask for the same few playlists over and over. It’s similar to how I panic and order the same meal every time at a restaurant when pressed. Now, I find myself queueing up different albums and playlists all the time.

Add a dash of HomePod

The AirPlay 2-only devices don’t show up inside the Sonos system itself, so I have to control them with my phone or iPad if I want to play music everywhere, but this really isn’t a big deal. If we ever want to go 100% into the Sonos world, we can always get something like the Sonos Amp, but I can’t really imagine that happening, to be honest. The only time we really need whole-home audio is if we’re having some sort of group gathering and want to play music everywhere. For now, if I want to play anything from our Sonos setup, the outdoor speakers and my office don’t fit into the picture. But as previously mentioned, Sonos speakers are all AirPlay 2 compatible, so if I want to play a song everywhere I just have to invoke the music from my phone, iPad or Mac.

Or a HomePod.

Another purchase I made about a year ago was a HomePod. They were on sale at Best Buy, so I picked one up, figuring I’d either return it or sell it eventually. The sound is fantastic, filling my office with very rich sound and serving as a HomeKit hub. Obviously, there are limitations to using a HomePod as well – currently it’s very ecosystem-limited. You can AirPlay nearly anything to it, but as far as native integration goes, it’s Apple Music or the highway. Still, it’s by far the best sounding speaker I own. It has smarts to auto-tune itself for the room it’s in, and it shows.

For a while, I just used it when I was working from home but once we made the Sonos switch, I started thinking more about moving to Apple Music. Originally, moving to Sonos wasn’t really about moving away from Spotify. That happened after messing around with the possibilities of an AirPlay 2 based whole-home audio setup. With HomePod + AirPlay 2 you can use your phone to control the HomePod and make that the primary audio source, sending music to the other speakers throughout the house. That way, you don’t run into most of the limitations that AirPlay 2 has compared to Chromecast. Since the HomePod is streaming music to all of the other speakers in our house instead of my phone, it’s really the best of both worlds. If Apple ends up allowing Spotify as a native HomePod integration later this year, it’ll be an even more elegant solution.

Google Assistant to Alexa

My original goal was to replace the Google Homes with better sounding speakers but leave nearly everything else intact. However, once that original choice was set into motion I found myself making other tweaks as I went – integrating with the HomePod, focusing more on AirPlay 2, and then switching the default assistant on the Sonos speakers to Alexa.

The reason is simply the cascading effects of moving to Apple Music. Alexa works with Apple Music, while Google Assistant does not. It’s still too early to have a ton of observations about Alexa vs Google Assistant, but I will say that the UX of the Alexa app is light years better than the nested-options hellscape Google has put out.

Conclusion

I’ve definitely added a little bit of short term complexity to how we were playing music in our house by making this switch. I know my wife has had a few instances where she throws her hands up with my constant experimentation with this sort of stuff. However, the trade offs have been worth it so far for me:

Pros

  • Way better sounding speakers overall.
  • More choices & service integration.
  • I’ve been really happy with Apple Music as a Spotify convert.
  • More music variety as a result of me invoking music via apps instead of voice.
  • Moving to Alexa puts my tech eggs in more baskets, and reduces my dependence on Google.

Cons

  • The previous setup was more streamlined compared to what we have right now. We could invoke music to any speaker via voice and it just worked.

I’ll be interested to see what Apple has in store for the HomePod as opening it up will further improve the flexibility of what we can play across the entire home. If Apple ends up releasing a mini version or one with a screen (my dream product), then we’d really be cooking.

Why is my own data least important in search?

From Tech Reflect:

I don’t know if this is a macOS or iOS specific thing, but it’s a trend on those platforms in recent years that is very frustrating. It’s hard enough finding things on the internet but once you find them, it should be easy to find them again.

The order in which iOS shows you Siri search results is indeed puzzling. I get there’s a privacy vs. convenience tradeoff argument that can be made, but it’s not as though this data isn’t on your device in these instances. I feel the pain of this whenever I dabble with Apple Maps in particular. Addresses of people I’ve taken the time to create contact cards for, or from areas it knows I’ve been to, should be prioritized in search results, yet they rarely are (Apple has a TON of information about my travels on my local device and seems to completely squander it).

Apple Weighs Letting Users Switch Default iPhone Apps to Rivals

Bloomberg:

Apple Inc. is considering giving rival apps more prominence on iPhones and iPads and opening its HomePod speaker to third-party music services after criticism the company provides an unfair advantage to its in-house products.

The technology giant is discussing whether to let users choose third-party web browser and mail applications as their default options on Apple’s mobile devices, replacing the company’s Safari browser and Mail app, according to people familiar with the matter. Since launching the App Store in 2008, Apple hasn’t allowed users to replace pre-installed apps such as these with third-party services. That has made it difficult for some developers to compete, and has raised concerns from lawmakers probing potential antitrust violations in the technology industry.

That would be fantastic news! If Apple can find a way to make a cheaper version of the HomePod that can compete more with the lower-end speakers on the market and also allow them to independently play from a music service other than Apple Music, you’d see sales take off. We’re not going to see HomePod become a market leader by any stretch, but a lot of Apple users who are on the fence between a Sonos One and a HomePod might choose differently than they do today.

Doesn’t fix the fact that Siri on the HomePod is no match for the Assistant/Alexa setup on the Sonos One, but some folks are okay with that.

As far as iOS defaults go, I think that’s a great start. Allowing users to choose defaults for a few things like mail, web, mapping, messaging and music would be a huge win. Still a rumor at this point.

Twelve Million Phones, One Dataset, Zero Privacy

The New York Times:

EVERY MINUTE OF EVERY DAY, everywhere on the planet, dozens of companies — largely unregulated, little scrutinized — are logging the movements of tens of millions of people with mobile phones and storing the information in gigantic data files. The Times Privacy Project obtained one such file, by far the largest and most sensitive ever to be reviewed by journalists. It holds more than 50 billion location pings from the phones of more than 12 million Americans as they moved through several major cities, including Washington, New York, San Francisco and Los Angeles

One easy solution on the phone maker side would be new granular location permission levels. For example, most apps just need to know what city you’re in to offer weather, restaurant or event info. The default could report back a fuzzy location. Other than mapping apps, not many iOS apps really need my precise coordinates.
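To make that concrete, here’s a toy sketch (plain Python, purely illustrative – this is my mental model of the idea, not how any OS actually implements it) of what a fuzzy location could look like: snap the precise coordinates to the center of a coarse grid cell before an app ever sees them.

```python
def fuzz_location(lat: float, lon: float, cell_deg: float = 0.25):
    """Snap precise coordinates to the center of a coarse grid cell.

    A 0.25-degree cell is roughly city-sized (about 17 miles of
    latitude), which is plenty for weather or restaurant apps but
    useless for tracking someone's exact movements.
    """
    fuzzy_lat = (lat // cell_deg) * cell_deg + cell_deg / 2
    fuzzy_lon = (lon // cell_deg) * cell_deg + cell_deg / 2
    return (round(fuzzy_lat, 6), round(fuzzy_lon, 6))

# Two precise points across town from each other land in the same
# cell, so an app only learns the general area.
home = fuzz_location(40.7128, -74.0060)
coffee_shop = fuzz_location(40.7484, -74.0037)
print(home, coffee_shop, home == coffee_shop)
```

The real implementation would presumably be far more sophisticated, but the principle is the same: apps that only need “what city am I in?” never receive the precise coordinates.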

Google Home & Pixel XL

About 6 months ago, Google announced a slew of consumer-grade products geared squarely at Apple and Amazon. At the event, they presented the Home, the Pixel, and Google WiFi, and they all caught my eye for different reasons. I’m intrigued by the concept of mesh networking rather than throwing a router in one corner of the house, I’ve had my eye on the connected home being controlled by voice, and I’ve been waiting for a truly premium heir to the Nexus line of phones to see if it was really worth making the switch (again). I alluded to this in a recent post about my slow breakup with the Apple ecosystem, but I’ve been slowly making purchasing decisions based on what works best for me and my family, not what works best just with Apple stuff. A few examples of this are Todoist instead of OmniFocus, Spotify instead of Apple Music, Roku Streaming Sticks instead of Apple TV, and so on. At this point I’m heavily invested in Apple hardware (MacBook Pro, iPhone, Apple Watch), but from an ecosystem angle I’m pretty well spread out amongst a number of services. So, what was it like to try out Google’s latest and greatest?

Google Home

I’ve been interested in a connected home setup for some time but wasn’t sold on the Amazon Echo, given the price point and lack of integration with the way I listen to music at home – we have a number of Chromecast Audios hooked up to speakers throughout the home as well as Chromecasts on our TVs. When the Google Home was demoed at 2016 I/O, I was definitely interested if the price was right (even though some of that functionality isn’t baked in yet). When they were announced at $129 each, that was all I needed to know.

I immediately bought 2 Homes – one for our kitchen and one for our bedroom. We use these things constantly for tasks as simple as setting timers and controlling our Nest thermostats but also for things like controlling multi-room audio, getting general trivia and weather from the web, and turning lights on and off. The voice recognition works very well, even when music or TV audio are playing, and it gets my commands right a vast majority of the time. We’ve gotten into the habit of using it pretty frequently when in the kitchen or getting ready for work. It’s really been a joy to use, and the capabilities are improving every week.

That said, it’s got a long way to go before it can truly challenge the Echo on the number of features it has. But for me, I wanted something that looked good in our house and has the potential of getting smarter over time with a company like Google backing it. The thing that really sold me was the Chromecast integration – instead of buying a Sonos system, we saved $2k by just hooking existing speakers up to Chromecast Audios. I also play a lot of podcasts throughout the house, which I love to do on the weekends. It’s been freeing to have smart home products from multiple vendors that all work together. Sometimes it’s not as easy as just logging into your iCloud account, but you have more choices.

A few things I hope make their way into the Home are the ability to queue music better, multi-account functionality (so my wife and I could each do Google account specific stuff), the ability to send messages, and a way to have voice feedback set to one volume and media set to another. If you’re looking to get into voice controlled assistants or even just want something to play music on, this is a great option at $129.

Google Pixel XL

I also took the plunge on a 128GB Pixel XL. I had 14 days to return it, so I figured I’d give it an honest look to see whether the battery life, camera, OS features and build quality made it worth it to switch.

The short answer is that the Google Pixel XL is better than my iPhone 6s Plus in nearly every measurable way. Now I know that isn’t the fairest comparison, as the 6s Plus was released in September 2015 and the Pixel XL shipped last November, but the only truly unfair comparisons would be camera quality and performance. That said, I’m floored by how great Android 7.0 is now compared to iOS 10 and how fantastic the camera is on the Pixel.

The longer answer is a bit more complicated.

Build and screen quality of the Pixel are on par with the iPhone – it’s nothing flashy, with bezels similar to the current Apple offerings, but it’s fine. I don’t mind the fingerprint sensor being on the back, though I do find it faster to have it on the front. It can be annoying to have to unlock with your PIN when your phone is lying flat on a desk, but it’s not the end of the world. The saving grace is Android’s “Smart Lock” feature, which lets you set trusted locations, devices, voices and more, so you don’t need a PIN or fingerprint when, say, you’re paired to your car’s Bluetooth or you’re at home.

The actual feel of the hardware is great, to the point where I don’t need a case. Battery life is a tough one – the Pixel’s standby time was fantastic compared to my current phone, but it was slightly worse on days of heavy screen use. I never struggled to get through the day, but I was constantly around 30% by the end of it, compared to maybe 40–50% with my 6s Plus. Fast charging makes up for any issues here, though, as a good 20–30 minutes can easily take you from 30% to 85%. Knowing that’s an option removes any battery anxiety.

The camera is the best phone camera I’ve ever used, and the phone’s “smart burst” functionality means you always get a really good shot when dealing with quick-moving targets like a kid or two. I already use Google Photos as a backup for my photo library, so getting free “for life” storage for anything shot on the Pixel XL is something Apple should be doing for iPhones.

On the software side, I firmly believe that Android is now better than iOS for my needs. The way notifications work and are grouped, the organization of the home screens, the default keyboard and overall UX make my time on my phone much more pleasurable. Things have evolved to the point where visually I like the look and feel of Android as well from a color, animation and layout perspective. Little things add up, too. Persistent notifications for chat conversations and media playback mean it’s very easy to switch contexts. After using Android for a few weeks, it feels like everything in iOS takes a few extra taps to accomplish. The app ecosystem really isn’t a problem anymore, either. There are a few apps here and there that I’ll miss from iOS (Day One, Reeder, Fantastical and Pennies come to mind) but it’s not a deal breaker like it was for me 3 years ago.

Beyond the apps mentioned above, the biggest things I missed from iOS were a good messaging solution (iMessage is so amazing, and I have no idea why Google can’t find a way to merge Allo, Hangouts and SMS into a unified “thing”) and iCloud photo sharing (though I could still do that from my computer, so no huge loss). My biggest gripes with Android and the Pixel were losing my watch integration (time to start shopping for an Android Wear device!), average battery life under heavy use, having to use Pushbullet to get text notifications on my work computer (a great service, just not as nice as a native app like Messages on the Mac), and the location and volume of the single bottom-firing speaker. Lift to wake also wasn’t super reliable in my experience – a feature that’s rock solid on iOS.

Having a phone with a voice assistant that responds well, is more open (creating tasks in Todoist was dead simple), and gives good contextual answers is really a game changer. I’ve found myself using voice for a ton of tasks over the past few months because of the Google Home, and I was always disappointed by Siri in comparison. Having a seamless system that truly works everywhere is fantastic.

As an aside, not having notifications on my wrist for messages and other important apps was a big negative. Next time I go for an Android device, I’ll have to get a smartwatch as well.

As the two weeks came to a close, I started thinking hard about whether this phone – or any phone – is worth the $400–500 I’d have to spend to keep it (after selling my current iPhone to recoup some costs). I think the answer is no, but I’m sad to return the Pixel XL and move back to iOS. Other than some battery gripes, it’s really better in every way. So in the short term I’m happy enough with my iPhone 6s Plus and iOS in general not to invest $900 in a new phone with new hardware less than six months out. If I were buying a new phone today, I’d get the Pixel XL, and I can recommend it wholeheartedly to anyone looking for one. I’ll be watching Google I/O, WWDC and the fall hardware announcements from each company with a sharper eye than ever before. If I were a betting man, when it comes time to replace my current iPhone I’ll be buying the Pixel XL 2 – or whatever it’s called – unless Apple really wows me with their hardware and software. The things Apple needs to do with iOS 11 and the next iPhone aren’t out of reach, but I’m not super confident they’ll deliver.

At a higher level, it’s fascinating to me how good Google has gotten at walking and chewing gum at the same time. I’ve been using more and more of their services, and with most of their hardware offerings looking so good, it’s not hard to imagine that a year from now the only Apple product I own will be my three-year-old MacBook Pro.