> Well, one of the measures is, is anybody actually making money with this AI yet, in the sense of mining gold and not selling shovels?
With any new technology, or when starting anything new, it's rare to be profitable within the first five years anyway. There is certainly revenue coming in from many places; I spend a lot on AI services myself, from all the current popular LLMs, like Claude, GPT, and Perplexity, to music-generation tools like Suno. Some of them I use for fun, but for many of them my productivity and value output have increased by more than I pay. Others I use for experimenting or just out of curiosity. So there is a lot of revenue coming in, but it also makes sense that, right now, costs exceed what these companies actually make. On top of that, I have directly increased my income thanks to AI tools: I do my usual work faster, and I freelance on the side at an hourly rate that is far more than I spend on those tools. Without AI tools I couldn't work as fast or have the energy to produce this much.
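To sanity-check that claim, the break-even arithmetic is simple. The numbers below are hypothetical placeholders, not my actual figures, purely to show the shape of the calculation:

```python
# Break-even check: AI tool spend vs. extra freelance income enabled by the tools.
# All figures are hypothetical placeholders for illustration.

monthly_tool_cost = 100.0   # subscriptions: LLMs, Suno, etc.
hourly_rate = 80.0          # freelance hourly rate
extra_hours_billed = 10     # extra billable hours per month thanks to the tools

extra_income = hourly_rate * extra_hours_billed
net_gain = extra_income - monthly_tool_cost
print(net_gain)  # 700.0 -> the tools pay for themselves several times over
```

Even with modest assumptions, the subscriptions only need to enable an hour or two of extra billable work per month to come out ahead.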
> How, exactly, does an AI assistant on Amazon's shopping site "make money"?
It depends on how the assistant is built. I have a lot of thoughts about shopping UX, and I think this is fundamentally a UX question: better UX increases e-commerce conversion. I don't want to go too deep into it here, but I can definitely imagine ways AI could improve the UX so that it finds matching products for the customer much faster than a standard interface would. That provides value twice over: it takes less time to find a product, and the match is likely to be higher quality.
> Evidence that it has helped programmers is decidedly mixed at best. For everyone saying it has given them superpowers we have an awful lot of reports of buggy generated code and more bugs making it into final code as a result.
I know it has definitely helped me a lot. I don't know whether it's a skill issue, a workflow issue, or something else that keeps some people from seeing it as a valuable multiplier for their productivity, but I haven't noticed myself producing buggier work because of it.
> You might say "it's not all about the money", which is true, but again, this is about hype, not social utility. I don't see AI living up to the hype. All the moneymakers are the shovel-sellers. If AI was living up to the hype, somebody ought to be making money by now.
I wouldn't say that. I do think it eventually has to make money, but with new tech there's always a period when it makes strategic sense to lose money, just as with starting any new company. And some definitely are making money. Again, I individually make more money: I can work more, and I can translate that into freelance work, where the gains are rewarded more directly than they would be in a purely salaried job. AI also helps me spend fewer hours on my day job.
> Part of the reason it has not lived up to the hype is the sky-high bar the hype has set. The stock market bubble is not pricing companies like nVidia to make some decent money on AI over the next few years. It's priced like they're going to be the only company that can do AI and that everyone has not yet begun to spend on AI. But if returns don't start coming back on the AI spend, that valuation is going to prove to be premature.
Evaluating this argument requires specific numbers, and market valuation is a very nuanced thing.
> It can help to look back at a previous example of this exuberance to see what I'm talking about: The Dot-Com boom. The reality is, basically everything that the Dot Com boom promised happened! Even the thing people mocked for years, "selling pet food online", happens now.
Yeah, but isn't that an argument for AI rather than against it?
> But it happened 20 years later. Far too late for a company founded in 1998 that absolutely depended on having 2020-levels of internet infrastructure.
The major players right now, unlike those 1998 startups, have a lot of funds to keep going with it, though.
> AI isn't going to disappear and we may even be underestimating the change it will bring in the long term. But that doesn't mean that the curve is going to smoothly slope up over the next 50 years. These things often get out "over their skis". AI seems badly over its skis to me, not because it won't be as useful in 20 years as it is promised, but because it is not as useful right now as is promised.
We don't know the future or the shape of the curve, and no one can predict the timelines for certain. But based on what we do know, the money currently going into AI makes sense, given present capabilities and how fast those capabilities are improving. If we estimated a 50% chance of AGI by 2035, then people are absolutely not putting enough money in right now: if AGI really were to arrive by 2035, it would make sense for very many actors to go all in on it.
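To make that last point concrete, here is a rough expected-value sketch. The probability is the 50%-by-2035 figure from above; the payoff and cost numbers are entirely made up for illustration:

```python
# Crude expected-value sketch for an "all in on AGI" bet.
# Only the 0.5 probability comes from the argument above;
# payoff and cost are hypothetical, in arbitrary units (say, trillions).

def expected_value(p_agi: float, payoff: float, cost: float) -> float:
    """EV of an investment that returns `payoff` if AGI arrives, else nothing."""
    return p_agi * payoff - cost

# Suppose a 1-unit investment returns 10 units if AGI arrives (50% chance)
# and is written off otherwise.
ev = expected_value(p_agi=0.5, payoff=10.0, cost=1.0)
print(ev)  # 4.0 -> strongly positive EV under these assumptions
```

Under assumptions like these, current spending looks like under-investment rather than over-investment; of course, the whole argument hinges on the probability and payoff estimates, which is exactly where reasonable people disagree.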