
> because these companies don't want to put in the engineering hours to build and support an app on a niche platform

Like who would at this point, until it's established? Look at all the time and effort wasted on Apple Watch apps and Apple TV apps, for very little return.

Apple needs to realize that expecting developers to write apps for 5 different operating systems to get coverage across their ecosystem is absurd in both engineering time and money.

At the end of the day, the $3,500 Vision Pro needs Netflix more than Netflix needs the Vision Pro. Netflix already had a VR app that Apple could have supported, but Apple chose to build something completely new and bespoke, requiring all devs to write completely new code.




> can't justify wasting that much time and stress on a platform that clearly is more concerned with meeting the needs of casual users and media professionals rather than developers

It’s because Apple prioritizes users over developers that they have so many users.


> Apple already spends tons of money to make an OS and a dev SDK for it.

Developers have to spend tons of money building and maintaining cross-platform apps no matter how they're written: native, hybrid, or with a VM the way NativeScript does.

In the real world, budgets are limited and going native isn't always an option. Apple acts as if native were the only viable option, and as if iOS were the only platform we had to develop for.


> the apps I write are really small, really secure, really fast, accessible, highly usable, use very few system resources, leverage the latest Apple tech, have almost no external dependencies, work extremely well, and I write them very, very quickly.

And they're only available for one proprietary platform [1], no? Excluding half of the user base for phones (in the US), and probably much more for personal computers, seems like a really bad business move in many cases. And depending on the type of application, it could be considered a different type of accessibility problem. It's depressing, but so often we have to compromise technical excellence and even user experience for economic reasons; in this case, using a cross-platform technology to develop a suckier app that can reach all host platforms is often the smart business move.

[1]: Well, one platform for each form factor.


>Xcode development, the Apple Developer Program, and the whole experience of trying to ship your application on the Apple ecosystem is the fucking plague.

That's because you're a developer who works across different platforms ("Apple, Android, Windows, Linux"). You probably spend most of your time working on some other platform, so doing macOS work is just an annoyance for you: a context switch to another process, etc.

Apple, though, wants dedicated macOS developers (and users prefer apps from people dedicated to macOS). That's because such vendors (e.g. Panic, BareBones, Omni, and so on) won't do lowest-common-denominator apps; they keep up with the platform, they understand it, they hook into new platform-only capabilities, etc.

>Apple's desktop market share being what it is, I cannot comprehend how they're generally getting away with this.

Apple's market share is what it is in part because they enforce things like this. It means they can add new OS/hardware features and devs will add support for them fast (instead of relying on 1000s of half-ported apps, with functionality shared between Windows and Mac, and horrible cross-platformy UIs).

That's also why (by forcing more people to use Xcode, and thus the official APIs, Bitcode, and so on) they can introduce a whole new architecture like Apple Silicon and have macOS apps support it from day one, or have old x86 apps run smoothly under translation.
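To make that concrete, here's a minimal Swift sketch of my own (the function name is just illustrative, not an Apple API): source built with the official toolchain is mostly architecture-neutral, and the rare arch-specific path can branch on compile-time conditions, so recompiling the same code into a universal binary for a new architecture is usually cheap:

    import Foundation

    // Most app code needs no changes at all for a new CPU architecture;
    // only the rare arch-specific path needs a compile-time branch.
    func currentArchitecture() -> String {
        #if arch(arm64)
        return "arm64 (native on Apple Silicon)"
        #elseif arch(x86_64)
        return "x86_64 (Intel, or translated by Rosetta 2)"
        #else
        return "unknown"
        #endif
    }

    print("Running as \(currentArchitecture())")

The same file compiles unchanged for both slices of a universal binary; the branch only changes which string gets baked into each one.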


> If anyone had the resources to hire top tier software engineers to write or audit everything in house it would be Apple. That they don’t do it is almost… greedy and shows a lack of foresight imo.

Maybe it shows that even Apple doesn't have the resources to do it, and therefore nobody does.


>Personally I think Apple missed a bit in the marketing by positioning this so heavily on average consumer use cases

Yeah, especially with it bringing the iOS ecosystem; it means that for (general) developers it's not interesting.

It would be different if they were demoing a shell, a code editor, etc. But if it's iOS, it's just too locked down.

Developers are actually good first-generation buyers. They can likely make their boss pay for it.

So it's such a waste that they're excluded because of iOS.


> Apple has more money than they can spend, but you can only get so much done within a given timespan.

That applies to attempts to compress something that takes around six months into one second by hiring 16 million programmers for one second; it doesn't apply to attempts to do something that takes around six months in around six months.

> Plus, they probably don't want to add the necessary API. The law says to treat other apps equally, not to provide the user with a good experience.

This is the actual reason, and hence the criticism.


> it's massively expensive, both in terms of actual dev time per feature (easily 10x the cost), and also in finding specialist developers who know these dated technologies

What are you basing this on? There are plenty of iOS developers, and Mac and iOS development are more alike than pretty much any other two dev platforms you can name.
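For instance, a hypothetical sketch (the names PlatformColor and accentColor are my own, not from any framework): a single Swift source file can compile for both platforms, with only a thin conditional shim where AppKit and UIKit diverge:

    import Foundation

    #if os(macOS)
    import AppKit
    typealias PlatformColor = NSColor   // AppKit color type on the Mac
    #elseif os(iOS)
    import UIKit
    typealias PlatformColor = UIColor   // UIKit color type on iOS
    #endif

    // Shared logic: identical source code on macOS and iOS.
    func accentColor() -> PlatformColor {
        return .systemBlue
    }

    print(accentColor())

The language, tools, and even most type names carry straight over, which is hard to say for, e.g., Windows vs. Android.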


> and there's zero good reason that Apple couldn't provide their build tools for other platforms

How do you figure? Then Apple would have to maintain their build tools for other platforms. Waste of resources for something that ultimately isn't going to make Apple money. It's no different than Microsoft intentionally gimping Excel on Mac.

I've never seen anyone complain that they have to own a Windows computer to develop Windows apps. Or a Playstation to develop Playstation games. I'm not sure why Apple is such an exception in your eyes.


> Why does this keep happening?

Apple has invested in developing the full stack for their products. Not only do they have the full stack of components for the products but the entire toolchain to develop those products. This gives them a very strong foundation for pretty much any product they want to pursue.

The AppleTV and HomePod both use older A-series SoCs and run iOS with a custom shell on top. They get all of the iOS media and peripheral handling capability "for free". Both projects can focus on TV or speaker features since the base OS is largely a solved problem for them. If they need some special consideration from somewhere in SWE they just file a Radar. They don't just get binary blob dumps of firmware from outside vendors and have to beg for bug fixes and hope their contract is big enough to get some consideration.

The Vision Pro leverages their ARM SoCs, base OS, and all the motion coprocessors that have been in their phones and watches for a decade. Novel improvements from the Vision Pro's development will just feed back to those components and make it into the next phone, watch, or whatever.

Most other companies don't actually own their whole product stack. Even Microsoft is at the mercy of their suppliers with the Surface line. They get what Intel, NVIDIA, and AMD have to offer. Smartphone manufacturers grab Qualcomm and Samsung SoCs, which are collections of Cortex cores, then slap Android on top, hoping that Google's latest version is better than the previous one.

It's hard to really make leapfrog products when you're shipping the same shit as your competitors and trying to compete on price.


> ... and frankly most indie developers can't shell out potentially several thousand dollars on OSX/iOS devices and licenses just to do some testing ...

Sounds like these so-called “indie devs” who cannot afford to build for iOS devices should not tackle projects/clients that require building for iOS. Or, if a client is in the mix, bill the client a large enough fee to cover the cost of testing on real devices. That’s not a problem Apple is responsible for solving.

I would never rely on developing, testing, and releasing an Android app on a simulator alone. I don’t want to buy a bunch of Android devices. So I don’t take on work that is meant for Android, or I hire people who can properly test on devices. Pretty simple—and it’s both my choice and a matter of professional responsibility and accountability to ship work I can stand behind.

Apple isn’t going to change any time soon. I’m so tired of the disingenuous moaning from “indie devs” who want to take on projects for and make money from iOS, but can’t be bothered to get over their own personal anti-Apple feelings to buy a device.

The Apple ecosystem is hardware and software together. The simulators and build tools are never enough. You wouldn't ship an app for Apple Watch without testing it on a watch, would you? Or would you ship it relying only on one friend with an Apple Watch to test it? That sounds lazy and unprofessional; if an indie dev can't do the job right, they shouldn't take on the job.


> Apple only cares about developers who are developing apps for their platforms

IDK, it often feels like they don't care much about them either. I mean, because of the users, developers are kinda forced to put up with Apple anyway even if they don't like it.

I mean, I work for a company developing an app (with a web version), and if there is a target-specific problem, most of the time it's Apple:

- The first version(s) of our web version won't support Safari, as it's technically simply impossible.

- We currently have to live with some dev annoyances on all targets due to Apple taking too long to update their software stack (Xcode/LLVM).

- The fact that (unlike with Android) you can't simply emulate iOS or install Safari on Linux/Windows can make testing harder (on CI we do have Mac/iOS tests, so it's only harder rather than impossible).

- Even some ad-hoc scripts around the dev workflow ran into problems because OS X lags years (should we say over a decade by now?) behind when it comes to updating basic CLI/shell tools.

- Compared to Windows, documentation of some (especially new) things can be quite crappy in my experience.

- Long update cycles can mean you have to wait forever for bug fixes.

- ...

It often feels like, where adding Windows/Linux/Android support to a Linux/Windows/Android program written mostly with cross-platform tooling would add 10% to the workload, adding Apple support doubles the workload, or might not even be possible.

Oh, and none of the problems we ran into had anything to do with privacy...


> Most of the issues you've described have nothing to do with the "mythical man-month thinking"

Hence the "Besides..."

> And the talent pool at the level of a single company is practically infinite unless you indeed introduce artificial constraints like not wanting to pay to tap into that wide pool

I just explained how that's untrue, and it feels like you completely ignored what I said.

You can't just say that Apple is "a single company" like any other company. Apple is the largest platform vendor in the world. How many other single companies have platforms like iOS, macOS, watchOS, tvOS, and now VisionOS? How many other single companies have even one bespoke programming language, much less two (Swift and Objective-C)? Perhaps only Google and Microsoft are comparable in their massive need for platform docs.


> Funnily enough, I’d say this is the biggest rationale for Apple going ARM for Macs, which is to make the development environment exactly the same as their deployment targets, namely iOS, iPadOS, Apple Watch, Apple TV, and soon, MacOS.

No, it's not. This is a negligible advantage for iOS developers (I am one and don't care about this at all). Improving the developer experience a bit does not bring enough value to Apple to justify such a risky and expensive move.


> You think there are as many developers who want to develop for the Vision Pro as there would have been if Apple was still as popular with devs as they were 10 years ago?

Apple now charges smaller developers 15%, a tier that didn't exist before. And the rules are far clearer about what is and isn't allowed.

As someone who built apps 10 years ago and builds them now, the situation is much better now.

It's ridiculous for people to talk about developer demand for the Vision Pro when there hasn't even been hardware for developers to test on.

Simulators are useful but you can't ship apps until there is real hardware to test on.


> In the grand scheme of things, do you realize how niche software development is?

During the Mac Pro redesign, Apple said developers are by far their largest group of "pro" users. Then they built a $6000+ workstation (with a $6000+ display) that appears to be primarily aimed at video editors. I don't see why adding some real programmability to iOS would be too niche.

I think the more likely answer is that Apple is suffering from tunnel vision in their conception of what "software development" means.


> Apple needs to do what Microsoft did surrounding Linux on Windows. Allocate some engineers for a few years to make life easier for developers on their platform.

Apple will most likely continue to do what it always does: make billions of dollars while largely ignoring Linux.

iOS and macOS developers will continue to use Xcode the way they always have. (And perhaps Swift Playgrounds on iOS.)


>However this only works if your product is damn-near perfect. And Apple is infamously imperfect when it comes to software/services.

Which is an issue not just with developer tooling; Apple seems to have a problem with scaling. When they were focused on the iPhone, literally every other part of their business got completely neglected.

There's lots of low-hanging fruit, not just in developer tools, but they don't seem to care. Instead they spend massive resources on stupid shows for Apple TV+. Every time I see hundreds of millions spent on shows, and that heartless attempt called Apple Music, I just feel Apple needs a taste of the awful medicine that Steve Jobs once had.


> If Apple devoted their focus to products in proportion to their revenue, then they would be putting 12x as much effort into the iPhone than they would for the entire Mac lineup.

True. However, creators and developers (who, in my experience, almost always use a desktop) are important for the iOS platform. Someone has to write those native apps. Therefore it doesn't make sense to ignore them for too long.
