    292 points by kaboro | 16 comments
    klelatti ◴[] No.25058716[source]
    > it is possible that Apple’s chip team is so far ahead of the competition, not just in 2020, but particularly as it develops even more powerful versions of Apple Silicon, that the commoditization of software inherent in web apps will work in Apple’s favor, just as its move to Intel commoditized hardware, highlighting Apple’s then-software advantage in the 00s.

    I think Ben is missing something here: the speed and specialist hardware (e.g. the neural engine) on the new SoCs again give developers of native apps the ability to differentiate themselves (and the Mac) by offering apps that the competition (both web apps and PCs) can't match. It's not just about running web apps more quickly.

    replies(8): >>25058922 #>>25058980 #>>25058990 #>>25059055 #>>25059382 #>>25061149 #>>25061376 #>>25067968 #
    1. verisimilidude ◴[] No.25061149[source]
    It's a nice idea in theory, but I don't see Apple putting in the effort to make this fruitful.

    For example, just in the last couple of days we saw an article rise to the top of HN about the pathetic state of Apple's developer documentation. Their focus seems to be less on providing integrations with their hardware and more on providing integrations with their services. Meanwhile, developers increasingly distrust Apple because of bad policies and bad press around App Store review. It's a mess.

    I agree that Apple could and should help app developers use this cool new hardware. I'm sure there are good people at Apple who're trying. But the company as a whole seems to be chasing other squirrels.

    replies(4): >>25062114 #>>25062633 #>>25064115 #>>25065146 #
    2. klelatti ◴[] No.25062114[source]
    Very largely agree (and the chasing-squirrels analogy made me laugh!) but of course the speed comes without any extra effort from Apple - so if your native app becomes attractive because it's now that much quicker - say, some form of video editing - then you're good to go.
    replies(1): >>25062402 #
    3. fxtentacle ◴[] No.25062402[source]
    Not quite. If your native app becomes attractive, Apple might replace you with a built-in clone and then use that as the reason to kick you out of the App Store.

    If I remember correctly, that's what happened with f.lux.

    replies(2): >>25062566 #>>25062590 #
    4. daxelrod ◴[] No.25062566{3}[source]
    f.lux was never allowed in the iOS App Store, because it needs private APIs to change the screen color temperature.

    Was it on the macOS App Store at one point and then kicked off?

    replies(1): >>25063536 #
    5. ◴[] No.25062590{3}[source]
    6. jonas21 ◴[] No.25062633[source]
    There are some areas where Apple is prioritizing getting developers on board with their hardware, and the neural engine seems like one of them.

    Over the past couple of years, coremltools [1], which is used to convert models from TensorFlow and other frameworks to run on Apple hardware (including the neural engine when available), has gone from a total joke to being quite good.

    I had to get a Keras model running on iOS a few months ago, and I was expecting to spend days tracking down obscure errors and writing lots of custom code to get the conversion to work -- but instead it was literally 3 lines of code, and it worked on the first try.

    [1] https://github.com/apple/coremltools
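
    For what it's worth, the conversion the parent describes really can be that short with coremltools' unified converter; a minimal sketch (the toy Keras model and output filename below are illustrative stand-ins, not the actual model from the comment):

        import tensorflow as tf
        import coremltools as ct

        # Stand-in Keras model; in practice this would be your trained model.
        keras_model = tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation="relu", input_shape=(128,)),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])

        # The conversion itself: one call, then save to a Core ML file.
        mlmodel = ct.convert(keras_model)
        mlmodel.save("Model.mlmodel")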

    replies(1): >>25063455 #
    7. 411111111111111 ◴[] No.25063455[source]
    You're earning money with a model deployed on an iOS device? Now that's an achievement. It's rare enough to actually get a productive model in the first place, but then doubling down on less powerful hardware than you could get with AWS is just mind-blowing to me in a production context.
    replies(2): >>25063769 #>>25068652 #
    8. samtheprogram ◴[] No.25063536{4}[source]
    The term the GP is referencing is Sherlocked [1]. As someone familiar with the iOS jailbreaking ecosystem circa 2010, I'd say you could definitely extend the term to apps from outside the walled garden.

    That said, it would be silly of them not to in some of the most obvious cases: a f.lux/Redshift-comparable feature is now built into most OSes as we’ve become attached to our devices, and critics of the term argued that Sherlock was a natural progression of Apple iterating on its file-indexing capabilities.

    [1]: https://en.wikipedia.org/wiki/Sherlock_(software)#Sherlocked...

    replies(2): >>25063904 #>>25064686 #
    9. dclusin ◴[] No.25063769{3}[source]
    It's the age-old thin-client vs. fat-client debate repeating itself again. It seems like as the chips and tools mature, we'll see more and more model deployments on customer hardware. Transmitting gigabytes of sensor/input data to a nearby data center for real-time results just isn't feasible for most applications.

    There's probably lots of novel applications of AI/ML that remain to be built because of this limitation. Probably also good fodder for backing your way into a startup idea as a technologist.

    10. kergonath ◴[] No.25063904{5}[source]
    I am not aware of a Sherlocked app being kicked out of the App Store for duplicating features of Apple’s version, though. That was quite a bold claim, asserted without any example.
    11. discordance ◴[] No.25064115[source]
    One clear example of this is audio-related apps. iOS has a rich ecosystem of DAWs and VSTs because the platform is much better at low-latency audio. You don’t find the same on Android.

    That’s a result of Apple putting effort into hardware + software to make that happen.
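
    To put rough numbers on what "low latency" means here, a back-of-envelope sketch (the buffer size and sample rate are common real-time audio settings, not Apple's published figures):

        # Rough audio-latency arithmetic (illustrative numbers).
        sample_rate = 48_000   # samples per second
        buffer_size = 128      # frames per callback; a common low-latency setting

        buffer_ms = buffer_size / sample_rate * 1000
        print(f"{buffer_ms:.2f} ms per buffer")        # ~2.67 ms
        print(f"~{2 * buffer_ms:.2f} ms round trip")   # input + output buffers, ~5.33 ms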

    12. Der_Einzige ◴[] No.25064686{5}[source]
    It's not properly built in. OLEDs support full conversion to red-only light, allowing you to preserve your night vision. No other app or built-in implementation except f.lux and cf.lumen allows turning off all colors except red. This is the main reason I jailbreak my Android phone (a OnePlus). Not ad blocking, not sideloading apps, but because I don't want my eyes destroyed every night when I go to the bathroom and use my phone as an impromptu flashlight...

    What the fuck, guys? Do you just not care about astronomers? Why is it that no one has properly implemented all of f.lux's features?

    replies(1): >>25065068 #
    13. LexGray ◴[] No.25065068{6}[source]
    On iOS, have you tried Settings > Accessibility > Display & Text Size, turning on Color Filters, and sliding intensity and hue to the far right? Maybe set the triple-click shortcut to Color Filters?
    replies(1): >>25067353 #
    14. cactus2093 ◴[] No.25065146[source]
    In the keynote they showed DaVinci Resolve running on a MacBook Air with impressive performance. They could have easily stuck to demos using Final Cut like they often do, so this seems like a pretty good sign that from day 1 they do care about third-party software running well. They've also been showing more and more games, which are obviously also performance-sensitive. I fully expect TensorFlow models and other major libraries will be able to take native advantage of the Neural Engine in the near future as well.
    15. ◴[] No.25067353{7}[source]
    16. shrimpx ◴[] No.25068652{3}[source]
    Suppose you want to do object detection on a phone’s live camera stream. Running your model on AWS is probably infeasible: you’re killing the user’s data plan streaming frames to your remote model, and network latency kills the user experience.
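
    A rough back-of-envelope sketch of the data-plan cost (the frame size and frame rate are illustrative assumptions, not measurements):

        # Cost of streaming camera frames to a remote model (illustrative numbers).
        fps = 30          # frames per second from the camera
        frame_kb = 100    # a JPEG-compressed 720p frame, roughly

        mb_per_min = fps * frame_kb * 60 / 1024
        gb_per_hour = mb_per_min * 60 / 1024
        print(f"~{mb_per_min:.0f} MB/min, ~{gb_per_hour:.1f} GB/hour")
        # ~176 MB/min, ~10.3 GB/hour -- before adding per-frame network latency.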

    On-device detection (“edge AI”) is gaining steam. Apple recently purchased a company called xnor.ai, which specialized in optimizing models for low-power conditions.