
Most of marketing at traditional companies can be automated by ML. This is already happening, but there are huge F500 companies where the employees have an incentive not to automate.



ML is just as much marketing as AI. The machine isn't actually learning.

There are surely some startups for which this is bullshit. But the good version of it is:

- take some valuable task that's never been successfully automated before

- do it manually (and expensively) for a while to acquire data

- build an automated system with some combination of regular software and ML models trained on the data

- now you can do a valuable task for free

- scale up and profit

The risk is that it's hard to guess how much data you'll need to train an accurate, automated model. Maybe it's very large, and you can't keep doing it manually long enough to get there. Maybe it's very small and lots of companies will automate the same task and you won't have any advantage.
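One rough way to get a handle on that data question is a learning curve: train on growing fractions of the manually-collected examples and see whether held-out accuracy is still climbing. A minimal sketch, assuming a tabular classification task, labels from the manual phase, and scikit-learn available:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import learning_curve

    def data_need_estimate(X, y):
        # Cross-validated accuracy at increasing training-set sizes.
        sizes, _, val_scores = learning_curve(
            RandomForestClassifier(n_estimators=200, random_state=0),
            X, y,
            train_sizes=np.linspace(0.1, 1.0, 8),
            cv=5,
            scoring="accuracy",
        )
        for n, score in zip(sizes, val_scores.mean(axis=1)):
            print(f"{int(n):6d} labeled examples -> validation accuracy {score:.3f}")
        # If the curve is still climbing at the largest size, more manual
        # labeling is probably worth paying for; if it has flattened, it isn't.

It won't tell you exactly where the curve tops out, but it's cheaper than guessing.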

I think there'll be some big successes with this model, and many failures. So be skeptical -- ML isn't a magic bullet. But if a team has a good handle on how they're going to automate something valuable, it can be a good bet.

As an investor, you may well face the situation down the line: "We've burned through $10M doing it manually, and we're sure that with another $10M we can finish the automation." Then you have to make a hard decision. With some applications, like self-driving cars, it might be $10B.


I know someone who worked at a marketing "machine learning" company. The ML really needs to be in quotes, because if you looked under the hood you'd see the people behind the curtain (literally -- Amazon Mechanical Turk workers). But VCs loved them, customers loved them. The blind selling to the blind... Marketers had money to spend, ML is the hotness, everyone makes money! (My friend no longer works there.)

AI is the new marketing term. Machine learning is old.

Yes, an ML model. If you built the machinery to connect marketing actions to business outcomes, as the OP complains about, you could recommend what actions to take. It's a straightforward machine learning task (supervised learning).
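To make that concrete, here's a minimal sketch of that kind of supervised model, assuming you already log marketing actions joined to an attributed business outcome (the file and column names are hypothetical):

    import pandas as pd
    from sklearn.compose import ColumnTransformer
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import OneHotEncoder

    # One row per past marketing action, joined to its measured outcome.
    actions = pd.read_csv("marketing_actions.csv")
    X = actions[["channel", "audience", "spend"]]
    y = actions["attributed_revenue"]

    model = Pipeline([
        ("encode", ColumnTransformer(
            [("cat", OneHotEncoder(handle_unknown="ignore"), ["channel", "audience"])],
            remainder="passthrough")),
        ("regress", GradientBoostingRegressor()),
    ])
    model.fit(X, y)

    # "Recommend" by scoring candidate actions and taking the best predicted outcome.
    candidates = pd.DataFrame([
        {"channel": "email",  "audience": "existing", "spend": 5000},
        {"channel": "search", "audience": "new",      "spend": 5000},
    ])
    print(candidates.assign(predicted_revenue=model.predict(candidates)))

The modeling is the easy part; the machinery that attributes outcomes to actions is where most of the work is.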

I am skeptical of ML not because it will automate away human control of production (which would be quite a good thing in many respects) but because it is likely to reinforce arbitrary or implicit human biases while being sold as the epitome of rational objectivity. A lot of technologists I meet are socially and culturally illiterate and do not seem competent to implement such systems. I would further argue that employers have strong economic incentives to select for such illiteracy when hiring.
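Even a crude check helps here. Comparing a model's positive-prediction rate across groups (demographic parity) takes a few lines; the decisions table and column names below are hypothetical:

    import pandas as pd

    def demographic_parity_gap(df: pd.DataFrame, pred_col: str, group_col: str) -> float:
        # Difference between the highest and lowest positive-prediction rates
        # across groups; 0 means every group is treated the same on average.
        rates = df.groupby(group_col)[pred_col].mean()
        return float(rates.max() - rates.min())

    # e.g. gap = demographic_parity_gap(decisions, pred_col="approved", group_col="group")
    # A large gap that mirrors historic human decisions suggests the model is
    # encoding the bias in its training data rather than being "objective".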

In my view, ML/AI jobs are more likely to be automated than back-end or front-end jobs.

I want to strongly disagree with the conclusion, based on one-off personal anecdata.

Last year my working group within our company secured a major contract, the largest for our company that year. We expected we would have to hire 20+ entry-level positions to execute on the contract. While the process was fundamentally based on ML, we knew a large amount of human labor would also be needed to deliver on time.

My colleagues and I, instead of going on a hiring spree, asked the overlords for 3 months of overhead time to develop 'intelligent automation' to reduce the number of new / temporary hires. The gains were substantial enough that we didn't have to bring on any new hires at all, and we completed the work ahead of schedule and way under budget with our existing crew.

It was rethinking how we do our work and incorporating ML at multiple levels of our system to guide our process that allowed us to eliminate 60% of the new positions this work would previously have generated.

100% ML is coming for you.


Thanks for your reply, though I'm not sure I understand your comment correctly. I agree that marketing is super important, but if I had an idea for a SaaS product that requires a certain ML model, I'd need to decide whether to build it myself or use somebody else's APIs.

> We found that as our AI got worse, our product got better.

Interesting. Could you elaborate on this too?


Ironically 'machine learning' has tremendous marketing.

You can improve lead generation by 6.8% without ML too. A/B testing a few changes can easily accomplish the same thing.
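And verifying a lift like that doesn't need ML either: a two-proportion z-test on the A/B split is enough. A sketch using only the standard library (the conversion counts are made up, picked to give roughly a 6.8% relative lift):

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
        # Pooled two-sided z-test for a difference in conversion rates.
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return p_b - p_a, p_value

    lift, p = two_proportion_z_test(conv_a=4000, n_a=100_000, conv_b=4270, n_b=100_000)
    print(f"absolute lift: {lift:.4f}, p-value: {p:.4f}")
    # With these made-up numbers the lift comes out statistically significant.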

The AI could be optimizing to attract people who are willing to give out their email address rather than actual potential customers, so it doesn't help the business. We tried using Google Ads to optimize for leads before, and our ads ended up on sketchy websites promising people free stuff.

Lead generation was a bad KPI to use, but it was chosen because we can't optimize for sales: the sales volume is too low to train the ML models. This is the reality for many businesses that experiment with AI, and they produce junk.

ML does work well for certain domains where you have the scale of data to pull it off, but now the hype has spread where it is not appropriate.


There are likely very specific areas where ML can have a real impact, and other areas that are nice and interesting but not really a priority.

From my experience, a large part of the demand also needs to be driven by the AI and ML execs' ability to sell to the business.


Yes, but let's say they are just exploring the problem and haven't solved it yet.

You can see that they are breaking the problem into various cost estimates and other features, and just running a simple regression to figure out which features they should target for automation.
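Something like the following, presumably -- a rough sketch of that kind of simple regression, assuming a hypothetical table of tasks with cost estimates and a few candidate features (standardizing first makes the coefficients roughly comparable):

    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import StandardScaler

    tasks = pd.read_csv("task_cost_estimates.csv")  # hypothetical
    features = ["manual_hours", "error_rate", "monthly_volume"]
    X = StandardScaler().fit_transform(tasks[features])
    y = tasks["estimated_cost"]

    reg = LinearRegression().fit(X, y)
    # Rank features by how strongly they drive cost; the biggest drivers are
    # the most promising automation targets.
    for name, coef in sorted(zip(features, reg.coef_), key=lambda kv: abs(kv[1]), reverse=True):
        print(f"{name:16s} standardized coefficient {coef:+.2f}")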

The problem with ML is that you don't know up front whether it will work. You just have an intuition that, eventually, one of these areas will be automatable.

You notice that their approach to the front end is to make various modules (login, etc.) that the AI can automatically deploy in various starter packages.

Kind of simplistic at the moment, but maybe it will yield something?


Sadly our marketing tests show that when we say our product uses ML we get far less engagement than when we say AI. I don't know that net-conversion is better with AI but ML sure doesn't capture people's imagination. Sigh.

Btw, reminds me of the old joke that goes something like this: AI for marketing, ML for recruiting, Regression for design, multiplication for implementation.


Well, IBM has 350k employees. If training an LLM on curated data costs tens of millions of dollars but ends up reducing headcount by 50k, it would be a massive win for any CEO.
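The back-of-envelope arithmetic (the per-employee cost below is my assumption; the other figures are from above):

    training_cost = 50_000_000       # "tens of millions of dollars"
    headcount_reduced = 50_000
    cost_per_employee = 150_000      # assumed average fully-loaded annual cost

    annual_savings = headcount_reduced * cost_per_employee
    print(f"annual savings: ${annual_savings / 1e9:.1f}B")                  # ~$7.5B/year
    print(f"payback period: ~{training_cost / annual_savings * 365:.0f} days")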

You have to understand that all the incentives are perfectly aligned for corporations to put this to work, even spending tens of millions in getting it right.

The first corporate CEO who announces that his company used AI to reduce employee costs while increasing profits is going to get such a fat bonus that everyone will follow along.


Really? Because most product managers I meet are the ones going “how can we put AI and Machine Learning into <some task that doesn’t need it>?”

I 100% agree that most companies don't need nearly as much ML as they think, though.


There is a lot of old tech out there that could be implemented and would probably provide ROI, but isn't. So many businesses are still running on paper! My point is that industry isn't just sitting around waiting for new tech so they can get higher efficiency; there are reasons why not everyone is doing the latest and greatest (including ML), and they're usually valid.

It's important to note that despite the recent excitement about LLMs, which is still an emerging technology, "AI" is not a new market by any means, nor are major companies only now investing in it. For the better part of a decade, ML has been widely adopted across industries, and the average person uses an "AI" system many times in a given day.

For example, if you open the home screen on the average smartphone right now, you'll see apps like:

- Ride-hailing and delivery apps like Uber, Lyft, etc., whose recommendations, ETA predictions, driver matching, and more are built on ML.

- Media apps like YouTube, Netflix, etc., all of which rely on models for recommendations.

- Email apps like Gmail, whose filtering (both spam and categorization) and text completion are based on ML.

- Photo apps like Instagram, Snapchat, and even your phone's basic Camera app, all of which use computer vision.

If you Google anything, you're perusing the output of a model. If you're being recommended something on basically any platform, you're interacting with ML. If you ever use speech-to-text, you're using a neural network. Your bank uses ML for fraud detection, your posts on social media are moderated by ML-based content moderation, and if you have a car with any recent-ish sort of lane departure assistance, you're driving with help from a neural network.

Most of these companies have large, mature ML teams, whose outputs represent massive amounts of revenue. Hence, they represent a legitimate market for selling picks and shovels.


But the point of adding ML in the first place is often not to win with a better product but to appeal to investors or managers. If you create a better product but it doesn't have "AI" in it, then it has failed in that respect. What's needed is a set of things that can be sold as AI or ML but aren't.
