
Affinity Studio now free

(www.affinity.studio)
1199 points dagmx | 2 comments
pentagrama No.45762521
I used Affinity for several years, so let me add some background here:

Serif is the company that originally built this software.

--------

2014–2024

Serif developed the Affinity suite, a collection of three independent desktop apps sold with a one-time payment model:

- Affinity Designer: vector graphic design (Adobe Illustrator equivalent)

- Affinity Photo: digital image editing (Adobe Photoshop equivalent)

- Affinity Publisher: print and layout design (Adobe InDesign equivalent)

They were solid, professional tools sold without a subscription, unlike Adobe's, which is a big reason why many designers loved them.

-------

2024

Canva acquired Serif.

-------

2025 (today)

The product has been relaunched. The three apps are now merged into a single app, simply called Affinity, and it follows a freemium model.

From what I’ve tested, you need a Canva account to download and open the app (you can opt out of some telemetry during setup).

The new app has four tabs:

- Vector: formerly Affinity Designer

- Pixel: formerly Affinity Photo

- Layout: formerly Affinity Publisher

- Canva AI: a new, paid AI-powered section

Screenshot: https://imgur.com/a/h1S6fcK

Hope this helps!

replies(16): >>45762570 #>>45763276 #>>45763555 #>>45763695 #>>45763766 #>>45763807 #>>45764042 #>>45764560 #>>45765389 #>>45765538 #>>45765942 #>>45767528 #>>45769728 #>>45769747 #>>45770368 #>>45770565 #
alt227 No.45763276
This is such a shame IMO. The Serif suite was great, and I used to try to get every designer I could to dump Adobe and switch to Serif.

Now that it has switched to a freemium model trying to get you to subscribe to AI, I won't be using this or telling other people about it any more. Their priorities have changed. No longer are they trying to beat Adobe at their own game; they are just chasing AI money like everyone else.

replies(9): >>45763916 #>>45764127 #>>45765466 #>>45766273 #>>45767409 #>>45767470 #>>45767559 #>>45767730 #>>45774320 #
derefr No.45763916
To push back against this sentiment: “chasing AI money” isn’t necessarily their thought process here; i.e. it’s not the only reason they would “switch to a freemium model trying to get you to subscribe to AI.”

Keeping in mind that:

1. “AI”-driven (i.e. large-ML-model-driven) features are in demand (if not among existing users, then among not-yet-users, serving as a TAM-expansion strategy)

2. Large ML models require a lot of resources to run. Not just GPU power (which, if you have less of it, just translates to slower runs) but VRAM (if you don't have enough of it, the runtime of these models multiplies by 10-100x; and if you also don't have enough main memory, you can't run the model at all); and also plain old storage space, which can add up if there are a lot of different models involved. (Remember that the Affinity apps have mobile versions!)

3. Many users will be sold on the feature-set of the app, and want to use it / pay for it, but won't have local hardware powerful enough to run the ML models — and if you just let them install the app but then reveal that they can't actually run the models, they'll feel ripped off. And those users either won't find the offering compelling enough to buy better hardware; or they'll be stuck with the hardware they have for whatever reason (e.g. because it's their company-assigned workstation and they're not allowed to use anything else for work.)

Together, these factors mean that the "obvious" way to design these features in a product intended for mass-market appeal (rather than a product designed only "for professionals" with corporate backing, like VFX or CAD software) is to put the ML models on a backend cluster, and have the apps act as network clients for said cluster.
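Concretely, under that design the desktop app is little more than a thin network client for the inference service. A minimal sketch of what the client side might look like (the endpoint URL, payload shape, and auth scheme here are my own assumptions for illustration, not anything Canva has published):

    # Minimal sketch: the app uploads inputs and downloads the rendered result;
    # all the heavy ML work happens on the backend cluster.
    # The URL, auth header, and response format are hypothetical placeholders.
    import requests

    INFERENCE_URL = "https://api.example.com/v1/generative-fill"  # hypothetical

    def generative_fill(image_path: str, mask_path: str, prompt: str, token: str) -> bytes:
        with open(image_path, "rb") as img, open(mask_path, "rb") as mask:
            resp = requests.post(
                INFERENCE_URL,
                headers={"Authorization": f"Bearer {token}"},
                files={"image": img, "mask": mask},
                data={"prompt": prompt},
                timeout=120,
            )
        resp.raise_for_status()
        return resp.content  # rendered image bytes, ready to composite locally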

Which means that, rather than just shipping an app, you're now operating a software service, which has monthly costs for you, scaled to aggregate usage, for the lifetime of that cluster.

Which in turn means that you now need to recoup those OpEx costs to stay profitable.

You could do this by pricing the predicted per-user average lifetime OpEx cost into the purchase price of the product… but because you expect to add more ML-driven features as your apps evolve, which might drive increased usage, calculating an actual price here is hard. (Your best chance is probably to break each AI feature into its own “plugin” and cost + sell each plugin separately.)

It's much easier to avoid setting a one-time price based on lifetime OpEx by just passing on OpEx as OpEx (i.e. a subscription); and it's much friendlier to customers to avoid pricing in things they don't actually want, by only charging that subscription to the people who actually want the features that require the backend cluster to work.
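To put rough numbers on why the one-time price is the harder bet, here's a back-of-envelope comparison (every figure below is an invented placeholder, purely for illustration):

    # Back-of-envelope: one-time pricing vs. a subscription, with made-up numbers.
    cost_per_inference = 0.02      # USD of cluster OpEx per AI operation (assumed)
    uses_per_month     = 150       # average AI operations per active user (assumed)
    expected_lifetime  = 5 * 12    # months a one-time buyer keeps using the app (assumed)

    monthly_opex_per_user  = cost_per_inference * uses_per_month          # $3.00
    lifetime_opex_per_user = monthly_opex_per_user * expected_lifetime    # $180.00

    # One-time model: the entire expected lifetime OpEx has to be baked into the
    # sticker price today, and you're exposed if usage or lifetime grows later.
    one_time_ai_surcharge = lifetime_opex_per_user

    # Subscription model: pass OpEx through month by month, plus some margin.
    subscription_price = monthly_opex_per_user * 1.5                      # $4.50/month

    print(f"one-time AI surcharge needed: ${one_time_ai_surcharge:.2f}")
    print(f"break-even-ish subscription:  ${subscription_price:.2f}/month")

If usage doubles after you ship a new AI feature, the subscription simply scales with it; the one-time price can't be retroactively corrected.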

replies(4): >>45764245 #>>45764341 #>>45765384 #>>45766696 #
aleph_minus_one No.45766696
> and if you just let them install the app but then reveal that they can't actually run the models, they'll feel ripped off.

Just release a simple, free "test application" that checks whether the computer satisfies the system requirements and does something "simple" (but relevant to the user), so that users want to try out this free test application and are motivated to upgrade their hardware so that it can run.

Now, once users have been incentivized to upgrade their hardware so that they can run the cool test application, you can upsell them to the "full software experience". :-)
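A sketch of what such a checker might probe (the thresholds are invented, and the psutil/torch calls are just one plausible way to do it):

    # Rough sketch of a free "can your machine run this?" checker.
    # Minimum-requirement numbers are invented for illustration.
    import shutil
    import psutil

    MIN_RAM_GB  = 16   # assumed requirement for running the ML models locally
    MIN_DISK_GB = 20   # assumed space needed for the model weights

    def check_system() -> list[str]:
        problems = []
        ram_gb = psutil.virtual_memory().total / 1e9
        if ram_gb < MIN_RAM_GB:
            problems.append(f"only {ram_gb:.0f} GB RAM, need {MIN_RAM_GB} GB")
        free_gb = shutil.disk_usage("/").free / 1e9
        if free_gb < MIN_DISK_GB:
            problems.append(f"only {free_gb:.0f} GB free disk, need {MIN_DISK_GB} GB")
        try:
            import torch  # optional: only needed for the GPU/VRAM probe
            if torch.cuda.is_available():
                vram_gb = torch.cuda.get_device_properties(0).total_memory / 1e9
                if vram_gb < 8:
                    problems.append(f"only {vram_gb:.0f} GB VRAM, need 8 GB")
            else:
                problems.append("no CUDA-capable GPU detected")
        except ImportError:
            problems.append("could not probe the GPU")
        return problems

    issues = check_system()
    print("Ready for the full app!" if not issues else "Upgrade needed: " + "; ".join(issues))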

replies(1): >>45766824 #
tracker1 No.45766824
Seems like a waste when you know most computers are laptops that won't meet minimum requirements.
replies(1): >>45767431 #
aleph_minus_one No.45767431
This depends a lot on the group of users that you are talking about.