Apple does have relatively good live collaboration in its iWork apps. Perhaps there’s a future API there?
Does it? As much as I like the iWork apps, my experience (and my impression of the general sentiment) is that Google Docs et al. continue to blow the pants off iWork in that regard.
I don't expect I am alone in this observation, but the list of software companies they highlighted during the M1 debut was very short, and to be honest I had not heard of half of them before then and don't remember them now.
So to me it matters not how much faster AS is; what matters is whether I can run what I want to run. I am not going to own two separate machines to do what I want to do. If AS machines cannot do all I need, I will keep my current Mac till support runs out and look again.
By having separate lines, Apple can sell the 12 for less than the 12 Pro, and those who are willing to pay for the 12 Pro camera can do so. I don’t see any muddying here.
Debatable. If anything, modern Android designs are cleaner than iPhone looks.
> aesthetics
Again, debatable.
> and, most importantly, profit margin
This is the key factor, and it's tied to something you missed.
Apple products and iPhones are ahead in profit margins because Apple consistently delivers reasonable quality goods, with few disappointments, so that users trust them. They've gained user trust despite obvious "design, aesthetics" mis-steps such as the notch or the touch bar.
The key words are: consistent delivery, reasonable quality and few disappointments. That's how they hook users in. Apple mostly delivers on time something very close to what they promised and that thing doesn't have catastrophic flaws. That's a much taller bar than you'd think, in the tech sector.
I think Ben is missing something here: that the speed and specialist hardware (e.g. neural engine) on the new SoCs again give developers of native apps the ability to differentiate themselves (and the Mac) by offering apps that the competition (both web apps and PCs) can't. It's not just about running web apps more quickly.
Their software is shit though, and their walled garden, and insistence on using Apple programming languages and IDEs for development, practically ensure that third party software will either not exist or be shit as well. There are only a handful of software shops that make decent software for Apple, and they are all fully specialized on Apple, so they do not make software that plays nice with collaborators on other computers, nor software that can be used on a cloud server, etc. And if there actually exists better software from third parties that competes with Apple's software, you can forget about it ever being fully integrated. "Hey Siri, navigate to city hall using Google Maps." Yeah right.
Mac has always lagged because the market wasn't big enough to warrant investment. Catalina wouldn't have cratered your gaming library if there was enough incentive for the developers to update the game. Now that iOS apps (read: games) can run on Mac with relatively low effort, and we already see PC games being easily ported to iOS (eg: Fortnite, legal challenges notwithstanding), I feel that this is the best thing to happen to Mac gaming in years.
I don’t see this at all. The Mac has long had a culture of extremely dedicated developers who have fully acculturated themselves into a design-oriented, performance-focused culture that presents a stark contrast to the cultures of Windows or web development. Third party software on the Mac has generally been extremely thoughtful and well designed.
There is plenty of shitty software on iOS, but that has a lot more to do with the amount of shovelware thrown into the store than anything inherent to Objective C.
I would agree that collaboration is a little smoother in G Suite, however in my experience this is mostly about ease of sharing. Once you've gotten another Apple user to understand that they can "receive a shared document from you" and work on it, then usually collaboration itself is smooth.
Fortunately those performance issues will probably only affect games released in a fairly narrow band of time, new enough that they’re still resource intensive but predating the transition. It would maybe be a 5 or 6 year span I guess depending on how performant the ARM system in question is?
“Keeps working regardless” is a great promise Apple has continued to make, but they only hold that up as long as it takes you to get the latest thing.
Design is about more than the way apps & interfaces look. I recently switched from Android, and the thing that most struck me is that iPhone usability is more consistent. I was able to do everything I wanted with Android and honestly I loved it, but it's not as intuitive as iOS.
3 companies then: Hardware, software, and media
There's no significant preference for native; it's just that we won't tolerate bad apps. I'd guess that VSCode is the most popular editor among users here by a long way, and that Google Docs has far more users than MS Word. HN readers don't pick native apps when there's a good Electron or web-based alternative.
Unfortunately I feel that the current trend towards more restriction is making many of those talented developers tap out, and is burying older software that would otherwise be just as excellent today if it were still allowed to run. But the possibility of those wins keeps me on macOS through all the losses.
I may have a gambling problem.
For me, the reasons are that I love my current iPad Pro, I don't have a good reason to upgrade it, and the pricing isn't advantageous.
I change my MacBook often for work-related reasons (i.e. when changing jobs I need to get a new one).
I change my iPhone often because the new ones have a big enough differentiator (for me at least), but more importantly because of the way the pricing is structured with my cellphone provider, I almost get it for free. (I.e. I had to upgrade my plan to get more gigs/month and a few other features, so I only had to pay $200 for the new iPhone 12.)
But for the iPad Pro? I'd need to pay the full $1,500 upfront, and the new one doesn't offer a big enough value add compared to the previous ones.
It’s not ‘yeah, the new iPhone is great! You gotta have it.’ It’s “so what do you want to do mostly?” — no one knows.
That is what I believe the 90s curse really means — your evangelists are no longer as effective because they give potential customers an overwhelming amount of information that slows down, and sometimes prevents, a sale.
Your point is exactly right - Apple has decided to harvest the demand curve over making something undeniably great.
Edit: let me clarify, the Watch and HomePod mini are currently in the 'just get it' category of products. This is only a critique of the iPhone line.
Time will tell...
Ben can write paeans to this new "cloud" business model. But at the end of the day, the question for us, the users is simple.
Do we own what we buy?
When I buy a Mac (and I've bought several), I am buying a computer. A general purpose computational device. And by selling it to me, the company is selling me a general purpose computational device.
What right does the company have to stop me from installing/modifying my device in any way that I see fit? Sure, they may refuse support/warranty, that is their prerogative, but what gives them the right to stop me from having someone else repair it? Or, to boot into Linux? Or, to open my own computer?
I have a MacBook Pro from 2016. Recently, I wanted to give it a thorough cleaning, so I took out my specialty screwdriver and unscrewed the screws for the bottom plate.
It wouldn't budge.
It was then I realized that I needed suction cups and strength to move the plate downwards to unlatch something inside to make it "pop".
This design serves no engineering purpose. It exists to make it harder for me, the device owner, to access the device I've purchased without sacrificing dollars at the altar of Apple.
And this was their most "open" product. Prior to the M1 announcement, you couldn't boot into another OS - or significantly alter - your iOS device. And now we can't do so with our Macs. We seem to have collectively decided to blur the line of ownership.
A device we buy isn't ours even after purchase. No, we must continuously give our money to the corporation for the benefit of their revenue projections.
Which returns us to this,
> Sketch, to be sure, bears the most responsibility for its struggles; frankly, that native app piece reads like a refusal to face its fate.
With Sketch you own your data, and thanks to the open format, you can port that data to other mediums.
With Sketch you own a copy of the tool that allows you to do your job.
With these other, less powerful but "collaborative" software, you don't truly own your data or the tools to access it. You merely rent it.
Should there be an event where Figma is acquired or goes out of business, then (in all likelihood) every user of this platform will lack the ability and the choice to preserve their work for future generations (and for their business).
What are the odds of Figma staying as it is, in the control of founders, chugging along as a profitable business a year from now? 5 years from now? A decade? Two decades?
I do not wish to single out Ben, but this post is an exemplar of the shift in thinking being pushed by the current crop of tech cognoscenti. They have been building an argument that the future is one without ownership, one where you don't own your devices, you rent them. And they assure us that's the future, and because that's the future, it's going to be amazing.
But that sounds like dystopia to me. It is one thing to trade owning a few songs on vinyl for access to all the songs in the world; it is quite another to have the tools of your trade abstracted away.
Spotify and Netflix aren't essential services to me. My computer is. My vector design software is. My ability to write code is. My ability to make things is.
They argue that there are benefits to "collaboration" with the "cloud", but that doesn't need to be so. The only reason these tools operate in the browser is that the tradition of web apps started there. There is no reason why every other application can't collaborate natively, with combined local + server-based data storage, with other apps across the world.
Video games do it all the time! Games like Counter-Strike are in some ways far more collaborative than a Figma file. The state of what occurs in the world depends on every other person in the world, with the context being time sensitive and the state being additive. And it works beautifully.
If it can work for non-essential entertainment, why should we accept the reduced paradigm for our essential tools? Why should we buy into crippled software that is limited by the fact it runs in the browser? Why should we buy into the abusive business model of having to rent our ability to do work from another company? Why should we buy into the idea that we don't own the fruits of our labor? And that we don't get to have a copy of our work nor access it without paying the toll?
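To make the native-collaboration point concrete, here is a bare-bones sketch of the idea (the endpoint, wire format, and names are all invented for illustration): a native app keeps a full local copy of the document, so it works offline, and simply relays edits through a server so collaborators stay in sync.

    import json, urllib.request

    LOCAL_DOC = {"shapes": [], "revision": 0}   # full local copy, editable offline
    SERVER = "https://example.com/doc/42"       # hypothetical sync endpoint

    def push_edit(edit):
        """Apply an edit locally first, then relay it to the server."""
        LOCAL_DOC["shapes"].append(edit)
        LOCAL_DOC["revision"] += 1
        req = urllib.request.Request(
            SERVER,
            data=json.dumps(edit).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # collaborators receive this edit from the server

    push_edit({"type": "rect", "x": 10, "y": 20, "w": 100, "h": 50})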
Bohemian Coding, if you ever read this, don't go down with this ship. Add support for Windows. Or, Linux. It will save your company.
Do they also allow you to switch the default browser too?
N.B., "SubEthaEdit had this for years!" - I know.
As a whole, it could be argued that where Wave failed, Slack - and predecessors like Basecamp - succeeded.
iWork has always seemed like it has a different user in mind with its collaborative features and never really had much traction in the market, which is already served by offerings whose entire reason for being is collaboration, not just as a general productivity suite.
That's how things turned out, but not how they were originally meant to be. Jobs' reluctance to port iTunes to Windows was so obvious that even the Isaacson bio made it clear. And it wasn't simply an atavistic impulse, it was faithfulness to his "Digital Hub" strategy which saw smart-device integration as a way to sell Macintoshes. Jobs was a late and reluctant convert to the idea of a post-PC era. Repasting an earlier comment of mine https://news.ycombinator.com/item?id=9470925 :
>> If you look at Apple's trajectory over the last 15 years, you can see the vision was consistently outlined from the very beginning—the Digital Hub strategy ([link working in 2020: https://youtu.be/AnrM4n6S3CU?t=2585 ]).
> That's not really the case though. As Jobs outlined it in that video, the Digital Hub strategy was to sell Macintoshes by positioning them as something you could dock your consumer-electronics gadgets, mostly from third parties, into. It was a plan to sell hubs, not spokes. This strategy had limited viability for Apple, because a typical consumer wasn't likely to think "I've just bought this $500 camcorder, so now I need to spend twice that or more on a Mac in order to offload and edit the video". If they were going to use any computer as the digital hub for their camcorder, it was probably going to be their Windows PC. That's probably why iTunes for Windows was such a difficult and long-drawn-out decision for Jobs: because it was a decision to mostly abandon the Digital Hub approach in favour of selling more of the spokes. Then Apple's slow and initially reluctant embrace of the "post-PC era" with over-the-air iDevice updates and cloud storage to partly displace iTunes means that it's increasingly taking nearly the opposite of Jobs' 2001 stance: "We're clearly migrating away from the PC as the centrepiece" and "We don't think of it in terms of the PC business anymore" are things that Tim Cook could say today without really startling anyone.
At his 2007 joint interview with Gates at D5 https://www.wsj.com/video/bill-gates-and-steve-jobs-at-d5-fu... Jobs is coming round to the post-PC agenda, reluctantly.
Uh, it's 3 options.
Small size, best camera option, remaining option.
I didn't know this - I just went to Apple's website, clicked iPhone and it has a single page that presents all this very clearly.
So um, yeah. This is by the way how I do 'tech advice' to anyone who ever needs it - I open google, I type in the question and the first link has the answer 95% of the time.
Phones haven't been in 'gotta have it' category since iPhone 6 when they released a bigger size that a lot of people wanted. Since then, it has been 'better camera' yearly releases, oh and 'better chip', as if anyone needs a supercomputer to browse Instagram.
The general view in the 1990s into the early 2000s was that you would have a centralized home computer/storage hub/etc. that everything else connected to.
It also requires me to have a working internet connection to be able to use it. Sketch or Affinity Designer don't have these problems. I hope they will fix that problem with Figma.
There is clearly a difference between my iPhone 6 and iPhone X but I've never been on a particularly frequent upgrade cadence. Under normal circumstances, I'd probably upgrade to this year's model but there's not a lot of point until I get out and about a lot again.
Apparently there are approximately 3.5 billion smartphone users today in total. https://www.bankmycell.com/blog/how-many-phones-are-in-the-w...
iPhones could be 30% of the market.
For a user, it may seem like 'why can't they just add sync'; for a programmer, it's a little more complicated :)
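A toy illustration of why (this isn't how any particular app does it, just the shape of the problem): two devices edit the same note while offline, and on reconnect there is no single obviously correct merge, so the app has to pick a policy (last-writer-wins, manual merge, CRDTs, ...), each with trade-offs.

    # Two offline edits to the same field; "version" alone can't tell us
    # which user's change should survive.
    phone  = {"title": "Groceries + milk",    "version": 2}
    laptop = {"title": "Groceries for party", "version": 2}

    def last_writer_wins(a, b):
        # Naive policy: higher (or equal) version wins; one user's work is
        # silently discarded.
        return a if a["version"] >= b["version"] else b

    print(last_writer_wins(phone, laptop))  # phone "wins", the laptop edit is lost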
- iPhone 12 Mini, the small one
- iPhone 12 Pro Max, the huge one
And then there are these two: iPhone 12, iPhone 12 Pro.
They are exactly the same phone in every regard that a "regular" user cares about. They are so identical that Gruber lumped them together in the benchmarks in his review at Daring Fireball.
The only difference is the cameras, which matter to a very small number of people. And even then it doesn't make sense to make two different models instead of one with the new camera setup.
Some problems are straightforward and have final solutions in software, and that should be seen as a good thing.
Them going to Mongo is actually the best possible outcome - by replacing their dead end custom database with MongoDB, they make MongoDB into a more compelling product, making document-based databases more batteries-included than ever before, which is excellent news.
This doesn't give ARM chips an advantage over Intel CPUs at executing Javascript for obvious reasons, once you know why they added it.
I do prefer that alternative and if you take that away, I lose instead of gain.
> the HN crowd generally prefers native apps
The reason is that it is hard to write a good Electron / web app that is actually performant and easy to use. Which is why many people here are wary of new Electron apps.
They seem to be attempting to avoid newer, patented instruction sets (Intel issued a strongly-worded statement about companies that attempt to use their ISA that seemed aimed squarely at Apple), but that means that SSE3+ (and maybe SSE2) are also avoided, which could mean a lot more software doesn't work than they'd like to admit.
Wait for the actual benchmarks to come out. Especially real world workloads. I'll be very surprised if desktop Ryzens (or even the Intel CPUs) don't run circles around the M1 in a laptop.
Why? The alternatives are a smaller screen or terrible audio when using one’s phone as a phone. It’s different. But it makes perfect sense and made perfect sense the first day.
So the 'latest' options on Apple's website are presented as iPhone 12 Pro (best camera) and iPhone 12 (not best camera) - 2 options.
Then each option has 2 sizes, that makes it a total of 4.
I guess they could've named things better - it's still the same old 'best camera = more expensive' formula from years prior, with the addition of 'smaller size' to the mix.
Then it’s not just big app companies that need to adapt; Homebrew and short-lived CLI utilities will also need ARM-compatible versions, or we’ll be wrapping each individual command in a Rosetta 2 launcher.
It might not be that bad, but I’d sure want to wait and see how it pans out, as it doesn’t look like a trivial transition.
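For what it's worth, that wrapping might look something like the following; a hedged sketch assuming macOS's arch(1) utility on an Apple Silicon machine, with a made-up binary name:

    import subprocess

    def run_under_rosetta(cmd):
        """Force an x86-64 binary to run via Rosetta 2 translation."""
        return subprocess.run(["arch", "-x86_64", *cmd], check=True)

    # Hypothetical Intel-only tool that has no arm64 build yet.
    run_under_rosetta(["./legacy-intel-tool", "--version"])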
Apple sees App Store apps, and web apps as having different advantages, and it is in their interest to have the best platform for both.
It’s not just me saying this, they keep saying it too, and proving it by investing in making web apps run better.
That doesn't exist anymore. Emulation of an architecture for the Intel -> ARM switch can't rely on big generational gains to cover the cost of emulation.
I'd have been more excited if they had provisions for some hardware acceleration of Intel instructions. I'm guessing they can't do direct instruction -> micro-op translation in the ARM silicon; there are probably a thousand patents blocking that.
Why am I suspicious? THERE IS ABSOLUTELY NO WAY THAT A 5W PART LIKE THE A14 IS FASTER THAN A 100W PART LIKE THE i9-10900k! I understand they are comparing single threaded speed. I'll accept that the A14 is more power efficient. I'll acknowledge that Intel has been struggling lately. But to imply that a low power mobile chip is straight up faster than a high power chip in any category makes me extremely suspicious that the benchmark isn't actually measuring speed (maybe it's normalizing by power draw), or that the ARM and x86 versions of the benchmark have different reference values (such that a 1000 score on ARM is not the same speed of calculation as a 1000 score on x86). It just can't be true that a tablet with a total price of $1k can hang with a $500 CPU that has practically unlimited size, weight, and power compared to the tablet, and when the total price to make it comparable in features (motherboard, power supply, monitor, etc.) makes the desktop system more expensive.
Regardless of whether it's an intentional trick or an oversight, I don't think that the benchmark showing the mobile chip is better than a desktop chip in RAW PERFORMANCE is true. And that means that a lot of the conclusions that they draw from the benchmark aren't true. There is no way that the A14 (nor the M1) is going to be faster in any raw performance category than a latest generation and top-spec desktop system.
(I would argue it's very simple: "Get the iPhone 12 in the size and colour you like", the Pro models are an up-sale for the people who want different screen-size or slightly better cameras)
You can run open source software on iOS or macOS. There’s nothing in these operating systems preventing apps from working with local files and having options to export and import data. There’s nothing preventing the author of a web application from doing the same (though it’s hard to see a lot of motivation for someone selling a saas subscription to do so). There are self-hosted open source collaboration tools like NextCloud, OnlyOffice, and Gitlab. The fact that VSCode is web based doesn’t change how it’s open source and completely open to tinkering.
What you’re doing here is confusing business model with technology.
And I think what you’re failing to do is look at this software from the perspective of enterprise customers.
A business isn’t worried about the things a consumer is worried about. They have access to their data ensured through a contract. They have legal assurances that they won’t be left out in the cold.
Software is, essentially, business logic. And often, what’s valuable about software isn’t the code itself, it’s the people who are supporting, patching, and improving that code. To a business, a software purchase often feels a lot more like a consulting contract. Businesses could run on 100% free software, but what they really need is someone else to spend the time working out issues and making it “just work.” They need to waste as little of their employees’ time on cost centers, because labor is the highest cost. Ownership doesn’t matter to a business because the only thing a business needs to own is its own core products and services.
You compared this situation to video games - interestingly enough, on that subject, the majority of game consoles have been 100% locked down for almost 4 decades now. Video games are almost entirely closed source. Modded games are mostly banned from online play. Nobody cares that they can’t open up their Xbox to upgrade/replace the components because it’s simply not a priority. Video Games are essentially media content, art, not business logic.
For x86-64, SSE2 came with the original x86-64 release in 1999/2000 (according to Wikipedia, it was announced in 1999 and the full specification became available in August 2000), so if patents are limited to 20 years, it's already out of patent.
https://www.anandtech.com/show/16226/apple-silicon-m1-a14-de...
The only previously valid point was that earlier iPhones/iPads couldn't sustain these high clock rates for long since they were fanless; fortunately, adding a fan is not very hard.
Sure, the 5950X will probably win the multi-core benchmarks since it has twice the cores, but it does not look promising for single-core speed.
FWIW, the 100W i9-10900k isn't even Intel's fastest single-threaded chip: that's the i7-1165G7 (a 28W part). Intel's desktop stuff is ancient: they're effectively still just Skylake (2015) derivatives; for single-threaded stuff the more modern designs they're shipping on mobile (Ice Lake and later) beat the desktop parts because their IPC is that much faster.
Power doesn't really help single-threaded performance, aside from wide vector units, nowadays.
We have similar performance jumps in cryptocurrency mining: GPUs are orders of magnitude faster than CPUs, and ASICs are orders of magnitude faster than GPUs, for the same power consumption.
If you buy hardware, you own it in the sense that you can do what you want with it assuming you have the technical expertise.
It doesn’t matter what software comes installed or how locked down it is.
Whether it is iOS or PostmarketOS, we are dependent on thousands of other people making design decisions that support us in meeting our particular needs.
The only question is who serves your needs better.
I think it's fair to say that a company shouldn't be allowed to sue you or have you arrested for doing whatever you want to a device that you have purchased. For example, see the John Deere lawsuits where they are trying to DRM repairs to tractors and make it illegal to work on them.
What I think is less fair is arguing that a company has a requirement to make it easy or possible for you to work on your device. You're entering the world of engineering and end-user trade-offs and starting to talk about forcing companies to add or remove features that might be in the end-user's best interest.
Would a screw with a proprietary screw-head be ok if the company could argue that it made assembly easier? Would you require that they prove there is some tangible engineering benefit to all decisions? What's the line between "general purpose computing device" and "electronic toy"?
You could probably write a law that made it illegal to add restrictions whose sole purpose was to prevent the user from modifying or repairing their device but the only way you'd ever be able to enforce it is if someone was caught writing incriminating emails. Pretty much anything could have some imaginary engineering justification behind it.
If you can crack a proprietary protocol or re-create a proprietary screwdriver then more power to you, you shouldn't be arrested or sued. But telling a company they're not allowed to use a proprietary screw-head is a messy road to start down, and that's what you're saying when you're asking for companies to be forced to allow you to do what you want on your device.
Vote with your wallet and if the choice you vote for doesn't win then you're free to go create your own. But don't outlaw business models or legislate engineering choices. You should have a right to repair, you should have a right to get your data, you shouldn't have a right to tell me what my engineering choices need to be.
It’s also true that Apple has had a pattern of first locking things down, and then finding new, more secure and reliable ways of opening them back up again. Extensions are an obvious example of this pattern.
This is just a consequence of moving from a benign network environment to one in which hostile adversaries are trying to both socially engineer as well as exploit any computer or individual that is vulnerable.
We all have to adapt to this reality.
I don't see why you say there is 'no way' an M1 (with appropriate cooling etc.) is going to outperform a desktop class system - if you read the Anandtech review it's clearly superior to competing architectures in many respects and is built on a better process technology.
[Edit] Apple is specifically doing this frayed product line to harvest the demand curve, lower supply costs, and amortize R&D across a large unit base... they are playing to the current low interest rate world in this product line. It's an intentional business choice, not a product choice.
Will the new Macbooks with M1 chips compare favorably against Intel laptops with low power and fanless designs? Yeah.
Is the existing A14 chip faster than a 10900k (even in single threaded performance)? No way. There is something in the benchmark that is messed up to the point where you can't compare them.
But I do think this division is an essential step.
The method to pin a webapp to the home screen is substantially worse than it used to be.
There was a period where javascript running in Safari ran significantly faster than javascript running in a webapp opened from the home screen, or in any other app opening a web view. Is that still the case, or did they decide to share that function?
I know most people use a case, anyway, but I'd rather have an extra mm or 2 and a flat back, and either get more battery or just empty space...
it's good to remind ourselves that stratechery is a marketing vehicle for consulting services, not an inquiry into truth. these blog posts are examples of standard strategic analyses learned in any decent MBA program. absent relevant experience, these analyses may seem oracular, but they're not. they're decent and competently researched, but not amazingly insightful--by design: why give away the real crown jewels (if there are any)?
it's pretty clear that the trend in software has been toward subscriptions (renting software) for the last decade or two, and that subscriptions are more advantageous for profit-seeking businesses, so connecting two very well-known dots provides all the "insight" here. a third well-known dot, that apple has a lead in this regard, provides context.
1. Do we know what the contracts between Verizon-Apple look like? Verizon may get discounts for buying in bulk or for being a strategic partner.
2. Cell providers (like Verizon) may be funding OP's iPhone 12 with the monthly-payments of other customers that don't exercise their ability to upgrade their phone exactly every 24-months. I think this group may actually be rather large.
3. Interest rates for financing your iPhone via your cell provider's 2-year contract plan are likely lower than financing an iPad on a credit card or other third-party financing.
All of the above likely combine to make the iPhones proportionately cheaper than iPads, possibly significantly.
"RIM thought iPhone was impossible in 2007": https://web.archive.org/web/20150517013510/http://www.macnn....
In the presentation, Johny Srouji seemed to place a bigger emphasis on the reduced power consumption than he did on the speed, saying things like, “this is a big deal” and “this is unheard of”.
In my mind, the argument of wattage seems analogous to saying, “There is no way a low wattage LED bulb will ever outshine a high wattage filament bulb.” I have assumed that we’ve been able to make leaps and bounds in CPU technology since the dawn of computers while also reducing power consumption.
But maybe there is some critical information I am missing here. I’m certainly no expert and would love to hear more about why the wattage aspect holds weight.
OTOH ARM and Apple have tailored their chips to the workload of ~10 years ago (running Javascript at single-digit watts) which is far more tailored to what actually gets used these days.
There are videos on youtube demonstrating that you can edit large video files on an iPhone connected to an external monitor and do it smoother than a much larger PC.
Here is an example of using an iPhone SE strapped to an external screen, editing 4K footage: https://www.youtube.com/watch?v=LmbrOUPFDvg
Notice how smooth everything is.
> 2x faster CPU performance
> M1 delivers significantly higher performance at every power level when compared with the very latest PC laptop chip. At just 10 watts (the thermal envelope of a MacBook Air), M1 delivers up to 2x the CPU performance of the PC chip.
Which is precisely why they can make the move with Rosetta 2.
1 year ago, I renewed with the same provider, who offered me the same plan for £10/month.
1 month ago, same provider offered me 12GB data for £10/month. Shopping around indicated I could get 15-20GB for below £15/month if I jumped networks, and stuck on a sim-only plan.
Instead, I got an iPhone SE 2020 128GB (RRP £449) and 40GB monthly data for £70 up front and £26/month.
Total cost of ownership over the two years is £694.
Cost of the iPhone if I purchased it from Apple is £449.
That means the remaining spend, (£694 - £449) over 24 months, is £10.21/month for the 40GB of data.
I definitely got something discounted.
Your parent comment seemed to imply that the leap was due to an architecture shift like CPU -> GPU. It's not, it's just better CPU design.
... if (forced) integration with Google's web services is a plus, that is.
> (Apple) The company has the best chips in the world, and you have to buy the entire integrated widget to get them.
Does it? Those claims of 3x faster seem carefully cherry picked.
And it's not just Figma's collaborative features. Figma made fundamentally better decisions about design tool feature set. Better vector editor. Better concept of "symbol" as a component. Much better approach to auto-layout. Much better approach to shared colors / text styles.
At every step, Figma is just a better designed design tool. And a large part of why it's taking the design world by storm is exactly that. Most design work is done alone, not collaboratively dragging elements on the screen. Figma is just a great tool. Sketch is trying to catch up, but they would need to modify a LOT of their past decisions to get to the spot Figma is at.
Well, no point in arguing here. You may be right, but the machines will be in the hands of users next week. It would be stupid for Apple to make those claims if they weren't true. We'll see soon enough.
Assuming the claims are true, we shouldn't forget that Intel per core performance improvements have been incremental at best for several years. They've really run into some major problems with their fab process development in recent years. TSMC (Apple Silicon foundry) is well ahead. It has been kind of hard to watch since that has historically been such a strength for Intel. They're a strong company, they'll get it together.
A lot of people have iPads. They have replaced PCs for many tasks. Many I know only use their PCs for word processing or work. Other things we do on a computer are in the browser.
In 2 years I'll be replacing 2 Windows computers in my family. Given the choice of a Mac with the familiar iOS interface and access to the App Store, or a Windows PC only good for word processing and browsing the web, I choose Mac/iOS. I am familiar with iOS but not so much with the Mac (I've owned 2 but could never get used to the UI). I reckon millions will be faced with the same choice.
I predict that in 5 years Dell, HP and Lenovo will be struggling. Time will tell...
Oh, and of course the M1 today ain't compelling enough. But wait until it evolves. Apple has a track record of iterating year over year. Fascinating...
Purpose built architecture could mean many things, like having efficient cores and high performance cores, codec specific hardware, the way that the memory is accessed, cache configuration, co-processors, signal processors.
Everything counts.
I mean open to customization and the development of a wider range of user experiences - which is what old school apps were about.
So the CLI should be the least problem with Big Sur. Unless they somehow cripple it on purpose so you can't use common unix command line stuff any more.
If you want to know more about this limitation, I suggest looking at a way of organizing computation that avoids this issue called "reversible computing"[1]. As I said, it won't be of practical significance for classical computing for a long while, but it's actually pretty fleshed out theoretically.
Even if it reaches a low percentage of the total world population, it does command an important share of profits and dictates trends.
However, the creation of the M1 wasn't free. There's a significant R&D cost, both initial and ongoing that isn't calculated as part of the Cost of Goods Sold.
My car has a digital speedometer that reads in kph only. I want it to read mph, but there is no software update available to do this, and nobody seems to know how to hack it. I still own the car.
Everything produced by a big company works in the interests of the big company. There are no exceptions to this, ever, unless the company is simply failing.
Any time we buy something, we do so because our interests are sufficiently aligned with the interests of those who made the product. The alignment is always partial.
If you don’t want hardware that works in the interest of a big company, the only way to achieve that is not to get it from a big company. The same is true for software.
I’m totally in support of this.
I think we can't ignore that the iOS install base has been larger than the macOS install base. Start looking at it that way and the iOS way of doing things is the norm in the eyes of Apple, and macOS is the odd one out.
However, I am pretty sure a lot of people with less money than you would like to buy a cheap phone and put Apple's software on it.
I'm curious about next week's launch and benchmarks, Apple's claims compare it to a 1.2GHz i7, which I expect to quickly throttle. That's why I also expect the parent comment to be right, current desktop CPUs will still be faster.
I believe the point is that it's one example (of hundreds, maybe thousands?) of performance paper-cuts addressed by Apple hardware that result in significant performance-per-watt advantages over devices not using Intel CPUs.
The benchmark is true but misleading. It compares 'Intel vs Apple Top Performance', meaning essentially the speed it could go at max. It is not a real world number or result and exists purely in a vacuum. If your phone ran at that speed for an extended period of time, I guarantee it would melt. I think the only conclusion to be drawn is that Apple's mobile CPUs are very capable and well designed, and ARM has a lot of untapped potential.
Shoving the payments onto the carrier isn't always a bad thing.
For example, I don't always have $1,100 lying around doing nothing, and these days I'd rather be liquid in case of sudden job loss or other emergency. So keeping that $1,100 in my emergency fund and shoving the payments onto the carrier at 0% interest is a better idea than paying up-front for a phone.
An A14 is both faster and lower power than a 6502.
Also, why are you shouting? It's just computers. It's not important.
Basically, higher wattage makes chips create more heat in a shorter time.
Heat in general is destructive, as with cooking, camp fires, or when a car "overheats" and stops working. Silicon chips are very detailed and any small change can make them stop working, so the heat applied needs to stay below some threshold (i.e. don't let it get too hot).
If the chip needs more wattage, it creates more heat, and that heat needs a "heat sink" and fan to protect the chip from degrading.
Heat sinks and fans require a lot of space and high surface-area-to-volume ratios. Take a look at the PS5 teardown: 90% of the insides are a heat sink. Laptops and phones don't have a lot of room for heat sinks or fans.
Therefore, if the chip can use less wattage then it will get less hot, meaning it can work better in "fan-less" devices like the MacBook Air and the iPhone and iPad.
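A back-of-the-envelope way to see it (toy numbers, not Apple's): dynamic switching power scales roughly with capacitance times voltage squared times frequency, so modest drops in voltage and clock give outsized drops in heat output.

    def dynamic_power(c, v, f):
        """Approximate switching power in watts: C (farads) * V^2 * f (Hz)."""
        return c * v ** 2 * f

    # Hypothetical operating points, chosen only to show the scaling.
    desktop = dynamic_power(c=1e-9, v=1.3, f=5.0e9)  # high voltage, high clock
    mobile  = dynamic_power(c=1e-9, v=0.9, f=3.0e9)  # lower voltage, lower clock
    print(f"desktop ~{desktop:.1f} W, mobile ~{mobile:.1f} W, ratio {desktop / mobile:.1f}x")
    # -> desktop ~8.5 W, mobile ~2.4 W, ratio 3.5x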
FWIW, I first saw collaborative editing over a network done on an Amiga. So I guess this has been a thing for "decades."
I mean, most casual users don't use laptops already, preferring their phones or tablets. Those that do need them for work (or, for a nicher crowd, gaming). And there's no way in hell the primary buyers of Dell, HP and Lenovo computers (companies), with no reason to pay the Apple tax, will pay for expensive $1,000+ MBPs for each employee.
The new Tim Cook Apple philosophy is doing what Apple fans used to criticise other tech companies for, which is needlessly fragmenting their product line to try and milk more money out of it.
It's things like this that make the difference between OK and good, and between good and great.
And they were absolutely correct: battery life on the original iPhone was abysmal. But it turns out that consumers didn't care.
For example, we just saw an article rise to the top of HN in the last couple days about the pathetic state of Apple's developer documentation. Their focus seems to be less providing integrations into their hardware, and more providing integrations into their services. Meanwhile, developers increasingly distrust Apple because of bad policies and press around App Store review. It's a mess.
I agree that Apple could and should help app developers use this cool new hardware. I'm sure there are good people at Apple who're trying. But the company as a whole seems to be chasing other squirrels.
The perf vs power charts on that website also put to rest the mistake of thinking perf simply increases with watt consumption.
As for performance vs heat, well, you'd expect even better results from the chip consuming much less power. How does that 100W chip perform with a phone-miniaturized heat sink? Or the power-sipping chip with a double-tower fan cooler?
Apple's performance boosts will reward those companies who never valued performance. Why would they change their approach now?
"This moment has been brewing for years now, and the new Apple Silicon is both shocking, but also very much expected. In the coming weeks we’ll be trying to get our hands on the new hardware and verify Apple’s claims."
Anandtech doesn't have the ability to say either way since they are also going by the marketing data... once they benchmark it for real, they'll be able to tell either way, but your statement is premature.
The Mac Mini also isn't the paragon of active cooling. I've worked with one of the current gen Intel Mac Minis, and that thing gets really hot! Like 60-80 degrees Celsius. The insides must be cooking if the outside is that hot.
The 2013 Mac Pro also had heating design issues, only corrected with the current gen Mac Pro.
I'd say active cooling is a consistent weakness of the entire modern Mac hardware design division.
What you are describing is taking this even further in the direction of an information appliance.
I am unconvinced that there is any benefit that your model provides that Apple does not already.
You can already just buy a Mac with AppleCare and install MS office from the App Store.
People may want their choices to be simplified, but they are also going to need to be able to use whatever important new thing comes along. E.g. Zoom or Slack.
One is iOS's arbitrary separation of settings and other app functionality. It makes no sense. I'll never remember what goes where.
Secondly, Android's back button is simple. I can use it without thinking even though you can probably find a lot of inconsistencies in its behaviour. iOS has multiple inconsistent one-off solutions for going back that cause a lot more mental friction.
Windows 10 compatibility is an excellent achievement. But it comes at a cost of a significantly worse user experience (in my opinion, at least). Like legacy menu hell.
For ARM support on Linux, my understanding is that pure server stuff is completely fine and battle tested; hobby/side-project-level utilities are a riskier bet. I remember hitting some of those on a Raspberry Pi a while ago, but the situation might be way better already with all the buzz surrounding ARM nowadays.
For reference the github issue tracking page: https://github.com/Homebrew/brew/issues/7857
Believe whatever you like.
But this means you are also choosing to ignore absolutely enormous investments at every level of the stack that they have made to increase web performance, adopt standards and improve web user experience.
Could be, but as far as the new cross-platform frameworks are shaping up right now, it looks like their strategy is slightly different. Apple is seemingly creating a developer ecosystem to loosely describe interfaces and share them between platforms while they ultimately decide how your UI is rendered. Maybe you're right and one day that means flicking a switch and everything is unified. I also look at something like iPadOS, for instance, which started as extremely similar to iOS and has now diverged and become its own thing, different from both the Mac and iPhone.
As a designer, your work must be detached from workflow sentiments. Tools are tools. Nothing more. I can make a great UI with Inkscape any day I want.
As software development is changing, UI design tools are changing. Inkscape has no concept of Symbols or Auto-layout. Sure, you CAN establish a design system in Inkscape, but it will be mostly copy-paste and lots of resizing. Figma takes this a step further. Design components can be connected to JS components using Storybook. Localization can be automatically applied to designs, and elements will automatically adjust because of a powerful, flexbox-like auto-layout.
Sure, you CAN make the same design using Inkscape. But Figma enables previously impossible workflows and makes the end result -- user experience with product -- better.
If I remember correctly, that's what happened with f.lux.
However I still believe it’s a bit more aggressive than necessary.
Over the past couple of years, coremltools [1], which is used to convert models from Tensorflow and other frameworks to run on Apple hardware (including the neural engine when available), has gone from a total joke to being quite good.
I had to get a Keras model running on iOS a few months ago, and I was expecting to spend days tracking down obscure errors and writing lots of custom code to get the conversion to work -- but instead it was literally 3 lines of code, and it worked on the first try.
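For what it's worth, the happy path really is that short; a minimal sketch, assuming coremltools 4+ and a TensorFlow/Keras model (the file names here are placeholders):

    import coremltools as ct
    import tensorflow as tf

    # Load the trained Keras model (placeholder path).
    keras_model = tf.keras.models.load_model("my_model.h5")

    # Convert to Core ML; when the model runs on device, Core ML decides
    # whether to dispatch to CPU, GPU, or the Neural Engine.
    mlmodel = ct.convert(keras_model)
    mlmodel.save("MyModel.mlmodel")  # drop this into the Xcode project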
This is made possible due to Apple being in control of the hardware and software. If there's a way to do this as easily with a mix of Linux, Windows, and Android devices, I haven't seen it.
Linux laptops can't even do something as seemingly simple as sleep and wake reliably because it's actually not simple, and the hardware and software people don't talk to each other to make it work.
This is the kind of stuff I don't want to lose.
I remember updating to Jaguar ended up killing my mouse drivers somehow and I had to manually restore my computer using only the keyboard. I hadn’t actually learned any UNIX or the command line yet back then, so that was a fun exercise.
He's not giving away the Crown Jewels, the vast majority of his writing is behind a pay wall, and, to my knowledge, he's full time and only funded via that pay wall to write Stratechery (https://stratechery.com/about/ indicates this as well).
The free articles are an example of his writing to generate traffic to maybe convince folks to sign up and pay him for the rest of the content (which is pretty good, consistently).
CADD is an 800 lb angry gorilla sitting between you and your design.
Yes, it's a poor craftsperson who blames their tools. And, sometimes it's nice when the tool doesn't fight you every step of the journey.
I imagine the same is true for graphic design.
I think the hidden truth here is that "boring boxes" solve most problems pretty damn well.
Unless your product is literally art/design, then you don't need anything custom, and boring boxes are probably the correct choice.
There is some wiggle room here - It's easier to attract customers with pretty designs, and you can make it easier to onboard a new user with some nice effects and design flourishes. But if you go overboard you differentiate yourself too much and make your product much harder to reason about and interact with.
As a user trying to get value out of a product I mostly don't care what it looks like. I do care a lot if slow animations or videos keep getting in my way. Nice the first time, fucking miserable on the hundred and first.
So if they…
- Add a full offline mode
- Ditch Electron for a more focused/lightweight webassembly+canvas implementation
- Open and document their file format to allay lockin concerns
…I'd be much more inclined to use it instead of Sketch.
I don't know about that. Whenever there's an article about Vim, it gets lots and lots of useful, insightful comments. If you search for Vim on HN, you'll see the catalog of Vim posts and threads.
I haven't seen the same thing regarding VS Code.
The next one or two, though, had really bad battery life (iPhone 3G?). I mean, 3-5 hours of active use, down from 7-8 (which meant you usually needed to charge it in the morning or evening).
Remember, the original iPhone had data, but it was pretty slow (but still amazing).
Disclosure: I work at Google on JS libraries, and at one point was in the Chrome org, but my opinions are my own.
https://github.com/webmachinelearning/webnn/blob/master/expl...
Not all of them, mind you, but you need a boulder of salt.
They may have reached a limit where it is hard to make the single-threaded speed of the chip higher without increasing the power draw.
Setting up all these features on a PC seems like such a kludge compared to Apple's integrated hardware/software stack.
https://developer.nvidia.com/maxine
>Maxine includes APIs for the latest innovations from NVIDIA research such as face alignment, gaze correction, face re-lighting and real time translation in addition to capabilities such as super-resolution, noise removal, closed captioning and virtual assistants.
Some comments here are rehashing the native vs web app debate. I don't see the connection.
Stratechery's aggregation theory is the most cogent, prescriptive analysis of one facet of our current business ecology. I don't recall if Ben has made a similar analysis of Apple's integration strategy.
Here's my take...
Apple bringing everything in house continues to surprise, startle, and delight me. It's so contrary to all the business trends, up to maybe the 2010s.
The book Design Rules: The Power of Modularity is an economic model (applying NPV to modularity) describing the divide & conquer strategy of prior juggernauts. GM, "Wintel", etc. I took it as gospel.
And then Apple rejected it. Cite cliche about there being only two ways to make big money: bundling or unbundling.
Tesla has adopted Apple's strategy. Rejecting the automobile manufacturer's conventional wisdom. And I think it'll prove just as successful (if Musk can avoid giving himself an aneurysm. ahem.).
This fully integrated strategy is different from the very successful conglomerate strategy (GE, Samsung, many others). I'm still trying to articulate how. Maybe mostly in focus.
Yes, Apple's integration allows better fit and finish. That's not the part that impresses me.
What most surprises me is that Apple doesn't have to share the benefits (profits, time to market) with anyone else. For contrast, during the Wintel ascendency, advances were widely shared across the industry, as evidenced by the economic success of clones and peripherals.
My hunch is this explains how and why Apple is able to capture and protect their margins, in a way no one else is able to replicate, because the other strategies don't permit it.
I still really can't fathom what's going on. Surely there are still interfaces, modules, common parts, right? So that Apple doesn't have to reinvent everything from scratch every product cycle. But those design choices are no longer publicly visible, extensible, and substitutable by 3rd parties. Witness the whole right-to-repair debate. (I have another weird theory for Apple's unorthodox strategy around product group management, based just on guesses.)
Tesla is doing the exact same thing for their product category. (Edit: Actually, Tesla eschewing model years, and doing continuous improvements is nearly what I think Apple is doing internally. What we see externally are just snapshots of those products in their continuous evolution and development.)
One possible explanation for Apple's financial success is Wright's Law. It's like the Moore's Law of manufacturing. https://en.wikipedia.org/wiki/Experience_curve_effects h/t Sandy Munro's analysis of Battery Day. I now would love for somebodies to deconstruct a bunch of other apparent juggernauts to see if Wright's Law applies. Or not. AWS's data centers and solar panel manufacturing are two that pop into my mind.
Anyway, back to Ben's article. If Apple has a customer facing differentiator, I think it's mostly their integration that allows them to capture and maintain fantastic margins, plowed back into product development, a virtuous cycle.
This was in 2015 I think. Anyway, large businesses will look at TCO rather than purchase price. If Apple can perform better in that metric, they will be preferred, and vice versa.
https://www.computerworld.com/article/3131906/ibm-says-macs-...
That said, it would be silly of them not to in some of these most obvious cases: a flux/redshift comparable feature is now built into most OS’s as we’ve become attached to our devices, and Sherlock was argued by critics of the term to be a natural progression of iterating in their file indexing capabilities.
[1]: https://en.wikipedia.org/wiki/Sherlock_(software)#Sherlocked...
But comparing against 40 year old technology has nothing to do with comparing against current offerings.
We'll just have to wait a week to see how it fares compiling Chrome.
You'll probably know better than me but Apple's work on Webkit was presumably worthwhile enough for Google to fork it into Blink (no problem with that but maybe worth acknowledging that fact).
The i9 has a density of ~44 MTr/mm², versus the M1's 134 MTr/mm² (3x).
The i9 has ~9.2B transistors, compared to the M1's 16B (174%).
The M1 is two generations ahead on lithography and has a more sophisticated CPU design than Intel. It'll do fine.
That reminds me of this classic bit of technology humor, the Apple Product Cycle[1]. It doesn't ring quite as true today as when it was first posted, but the broad strokes are still similar. Specifically it appears we're on the stage where "The haters offer their assessment. The forums are ablaze with vitriolic rage. Haters pan the device for being less powerful than a Cray X1 while zealots counter that it is both smaller and lighter than a Buick Regal. The virtual slap-fight goes on and on, until obscure technical nuances like, “Will it play multiplexed Ogg Vorbis streams?” become matters of life and death."
[1] https://web.archive.org/web/20061028040301/http://www.mister...
There's probably lots of novel applications of AI/ML that remain to be built because of this limitation. Probably also good fodder for backing your way into a startup idea as a technologist.
Google forked Blink because for whatever reason they were unsatisfied with the state of Webkit -- nominally because they wanted to take a different approach to multi-process, but there may have been other technological and project direction/pace disagreements. Since then, a number of browsers have switched from Webkit to Blink/Chromium as their engine, and arguably Safari is falling behind on new features and overall quality (weird quirks that require web devs to work around).
That’s a result of Apple putting effort into hardware + software to make that happen.
If Apple's focus is on getting better power consumption and memory use (esp on mobile) then that's still investing and arguably that does as much if not more for users and the web than adding more features.
PS Let's not forget that Apple are still standing behind WebKit when Microsoft have given up on their own rendering engine so let's give them some credit for helping to avoid a Chrome only web.
Actually, that's exactly how Figma is built—their desktop app just wraps their web version while hooking into file system APIs provided by Electron/NodeJS. See https://www.figma.com/blog/webassembly-cut-figmas-load-time-....
And I'm sure there are power users who killed their original iPhone by noon in 2007, and I'm sure there are power users who do the same today.
What the fuck, guys? Do you just not care about astronomers? Why is it that no one has properly implemented all of f.lux's features?
Now, you may not be a fan of their limited market power and subsequent inability to dictate everything else about the platform to the software ecosystem or adjacent devices. That would be a fair criticism. Apple's animosity towards open standards for accessing device functionality or cross-device communication does allow them to move faster. At the expense of the rest of the market.
I know people who upgrade their Mac every time there is a speed bump and just sell the old one. They would presumably be candidates for this.
I wouldn’t be surprised to see this.
I just don’t see any reason Apple would tie the leased hardware to a limited software bundle.
It’s hardly obvious that this is best for users in the long run.
I point this out not to say it’s wrong for them to attempt this, but because it makes no sense to use Google’s strategic decisions as evidence of Apple’s intentions or investment.
A screwdriver and an electric drill are both tools.
PowerPC’s biggest flaw was power efficiency, a situation which became critical as Apple’s sales skewed toward laptops. The G5 was a beast but it ate power like a beast too; it was never going to work in a laptop, and so Apple had to switch.
Otherwise, it's great: the status indicators you want anyway go up in the corners and don't take out a full row of the "actual" screen.
having read this free piece and a number previously posted here, the analyses tend to be fairly average in comparison to mba-level work. perhaps you can enlighten me—who pays, and why? who has interest but also doesn’t otherwise have access to mba-level analyses or analytical tools?
a solid position, line of reasoning, or conclusion is difficult to draw from this article. he seems to want badly to say something, anything, insightful about apple’s new silicon to not miss the short window of opportunity afforded by the recent announcement. but what did he say other than repeat some numbers from anandtech and sidetrack onto sketch vs figma?
apple's strategy isn't new or surprising, and this chip is one (comparatively small) part of jobs' original vision of ubiquitous consumer computing, with apple at the center of it. the switch to arm isn't even a strategic surprise. they wanted ever smaller and more powerful chips (which intel could have owned but f-ed up) to drive that ubiquitousness and be at the profitable forefront of it (that's also why they're so obsessed with thinness). apple has always wanted to own consumer computing. that's it. that strategy is not that hard to make sense of. but somehow this article badly flubs that low hurdle.
I'm looking forward to benchmarks and seeing how well the new machines work - compiling speed, lightroom, responsiveness, how well Chrome works with max 16GB RAM :).
Although I find that the biggest performance impact is not Electron but that now you don’t have a local database and a lot of requests are going through the network before the caching mechanisms kick in.
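To make the difference concrete, here's a hedged sketch of the cache-first lookup a local database gives you; the endpoint, data shape, and function names are hypothetical.

    // Hypothetical cache-first read: serve from a local store when possible,
    // only hit the network on a miss. Without a local database, every read
    // below the cache layer becomes a network round trip.
    const localCache = new Map<string, unknown>();

    async function getNote(id: string): Promise<unknown> {
      const cached = localCache.get(id);
      if (cached !== undefined) return cached; // local hit, no network

      const res = await fetch(`https://api.example.com/notes/${id}`); // placeholder endpoint
      const note = await res.json();
      localCache.set(id, note);
      return note;
    }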
In my opinion, it's the correct move to make, but incredibly hard. Now that they have launched v10, they have at most an additional billing cycle (12 months) to start cranking out compelling features and polishing the product.
My concern with EN is that they bleed so many paying users that they end up being unsustainable. Time will tell.
Yes. For one, there seems to be no unified API for GPU computing in the PC market, and Microsoft doesn't seem to be interested in doing a DirectX version of one (for a sense of what such an API could look like, see the WebGPU sketch after this comment).
And for NPUs, which are increasingly important for things such as speech recognition (in many parts of the world where languages aren't easily typed, voice is the default input method on phones), photo face recognition without using the cloud, and growing use in graphics, video, and audio productivity apps, there isn't even any dedicated hardware on the PC market (that is why Intel is desperate to push Xe as a co-processor). And it will be years, if not a whole decade, before a PC part shows up and reaches a large enough market volume.
And it sort of makes you wonder: 3-4 years down the road, when the M1 (excluding memory) becomes a $20 SoC, will we see a variant of the Mac cheap enough to hit back at the 1.5B-unit Windows PC market, where Apple currently has roughly 110M Macs?
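As mentioned above, for illustration of what a cross-vendor compute API could look like, here's a hedged sketch using WebGPU (one browser-led attempt in that direction). The shader, buffer setup, and workgroup size are illustrative, and it assumes a runtime that actually ships WebGPU.

    // Hedged sketch: doubling an array on the GPU via WebGPU,
    // one candidate for a cross-vendor compute API. Illustrative only.
    async function doubleOnGpu(input: Float32Array): Promise<void> {
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) throw new Error("WebGPU not available");
      const device = await adapter.requestDevice();

      // Upload the data into a storage buffer the shader can read and write.
      const buffer = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
        mappedAtCreation: true,
      });
      new Float32Array(buffer.getMappedRange()).set(input);
      buffer.unmap();

      const pipeline = device.createComputePipeline({
        layout: "auto",
        compute: {
          entryPoint: "main",
          module: device.createShaderModule({
            code: `
              @group(0) @binding(0) var<storage, read_write> data: array<f32>;
              @compute @workgroup_size(64)
              fn main(@builtin(global_invocation_id) id: vec3<u32>) {
                if (id.x < arrayLength(&data)) {
                  data[id.x] = data[id.x] * 2.0;
                }
              }`,
          }),
        },
      });

      const bindGroup = device.createBindGroup({
        layout: pipeline.getBindGroupLayout(0),
        entries: [{ binding: 0, resource: { buffer } }],
      });

      // Record and submit the compute pass.
      const encoder = device.createCommandEncoder();
      const pass = encoder.beginComputePass();
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);
      pass.dispatchWorkgroups(Math.ceil(input.length / 64));
      pass.end();
      device.queue.submit([encoder.finish()]);
    }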
If you use the native webview, it’ll probably use less memory but be slower because it’s basically running Safari instead of Chrome. It’s probably the wrong tradeoff for Figma because the browser’s memory usage and JS heap memory is pretty negligible compared to the amount of memory the user’s document uses, especially large ones with a lot of images. There’s way more room for optimization there and that has nothing to do with Electron.
It’s fun to think about what the performance would be if it were 100% C++ given infinite resources, but realistically it’d be way less productive and more bug-prone than React. I’ve written UIs in C++ before; I would not repeat it. That time would be better spent optimizing actual bottlenecks, like rendering the design file (where the GPU is the bottleneck).
We actually have a native (not WASM) build with the native webview that we use internally for debugging with Xcode. No, the performance isn’t enough better to warrant dealing with Safari issues and shipping that over the Electron + WASM build.
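For readers less familiar with that setup, here's a hedged sketch of how a compiled WASM core gets loaded and called from the JS side; the file name, imports, and exported function are hypothetical, not Figma's actual module interface.

    // Hedged sketch of loading a C++-compiled WASM module from JS.
    // "engine.wasm" and the exported "render_document" function are made up.
    async function loadEngine(): Promise<void> {
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch("engine.wasm"),
        { env: {} } // imports the module expects would be wired up here
      );
      const render = instance.exports.render_document as () => void;
      render(); // hand rendering off to the compiled core
    }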
On-device detection (“edge ai”) is gaining steam. Apple recently purchased a company called xnor.ai which specialized in optimizing models for low power conditions.
Of course, set against that, they've lost the ability to run x86 Windows in a VM for legacy business apps (though I'm not sure many were doing that anyway!).
As to who pays? People who have things to do, don't have the time to do the 'mba-level' work it takes to track all these different trends, and just want a short overview of 'what's important' and 'who's moving where'.
And your 'obvious' take is not obvious if you look at where Apple is spending money, effort, hiring, etc.
And another aspect of this is that he's been talking about some of this stuff for many years, which isn't at all obvious from one of his pieces stand-alone. The weekly updates blend together and consistently reference stuff he's previously discussed, and he also has no problem calling out where he's gotten it wrong.
Funnily enough, I've read Emotional Design. It's been a long time (I think it was back in 2006/2007), but my memory is that most of the book's focus was on the design of physical objects. I don't really think website design has the same freedom. That doesn't invalidate all of his points, but it certainly limits how they apply.
But there's a different angle here that I'd ask you to consider - over the last 20 years, websites have been eating up interactions that used to be conversations.
You might have walked to a bank and talked to a teller - Now you use their website.
You might have driven to home depot and asked a store associate a question - now you shop online.
You might have gone to blockbuster and rented a movie from the clerk - Now you browse netflix.
You might have gone to the post office to get some mail - Now that content is in an email instead.
Each of those interactions was just a conversation, literally just sounds coming out of a mouth, but they all achieved useful side effects. While you might have a pleasant chat every now and then, the goal was not reflective/emotional investment. The goal was the utility provided by the service.
I think approaching the design of a site with the goal of evoking an emotional or visceral reaction (ESPECIALLY from the literal appearance of the site) is actually turning the advice of the book on its head - put the user first!
If I'm interacting with your site to achieve a useful side effect, whether that's ordering an item, getting the news, seeing my mail, depositing a check, watching a show, etc. - then my emotional reaction is heavily biased towards how well and how quickly I can achieve my goal. My emotions don't give a flying fuck whether your button is red/blue/green or if your gray is #d3d3d3 or #878787. And I certainly don't want to have to navigate a crazy custom design, just like I don't want to hit a detour while driving home - even if it happens to be scenic.
I do care, a whole lot, about consistently easy to use services, with a low barrier to entry. On the web, that mostly means boring boxes.
---
As a thought experiment: I'm sure you've been to the DMV before (I'm not actually sure, since you might not be US-based, but you probably have an equivalent).
Ever had that DMV trip that took 3 hours waiting in line before finally getting seen?
Not happy were you?
Ever had that DMV trip where it was basically empty and you got seen immediately?
I bet you felt thrilled. (probably an overstatement, but at least pleasantly surprised)
It's the same building, same carpets, floors, columns, windows, roof. The only change was how quickly and efficiently you accomplished the goal you had. But your emotional responses were miles apart.
Apply that to websites. I don't want to be looking at my bank's website - I do it because I need to move money or use their services. Make that the priority. Make it with boring, easy to use boxes, and I will love it.
Bury it in menus, or add 10 clicks because "that page looks a little cramped," and I will not be happy.
The last time I bought a phone (2016) there was no bundle that was cheaper than buying the phone and a sim-only plan.
This time around, the inverse is true.
The only thing I’m locked into is my plan for 2 years. I can do whatever I like with the handset, and intend to offload it in a year assuming Face ID is seamlessly usable in public again by then.
As to strategy groups: I've worked with strategy teams from a big chunk of the Fortune 100 and did time at a Big 4, and I think your expectations of the level of critical analysis are... overshooting reality :)
Ben's definitely not some strategy messiah, no question - I doubt he'd even claim to be in the top percentage. But he's consistent, thoughtful, and lets me not have to chase tons of other sources on a regular basis.
i'm not suggesting he's a crank, but that adoration should be tempered. we humans tend to get wrapped up in popularity rather than substance. the evaluation of analytical stances should be heavily tilted toward substance.
Mind blown.
(I was born in '86, albeit late in the year, so I find this especially hilarious.)