• 0 Posts
  • 51 Comments
Joined 3 years ago
Cake day: June 15th, 2023

  • The reality is that it’s often stated generative AI is an inevitability: that regardless of how people feel about it, it’s going to happen and become ubiquitous in every facet of our lives.

    That’s only true if it turns out to be worth it: if the cost of using it is lower than the alternative, and there’s actually a market willing to pay for it. If the current cloud-hosted tools cease to be massively subsidized, and consumers choose to avoid them, then it’s inevitably a historical footnote, like turbine-powered cars, Web 3.0, and LaserDisc.

    Those heavily invested in it, either literally through shares of Nvidia or figuratively through the potential to deskill and shift power away from skilled workers at their companies, don’t want that to be a possibility; they need to prevent consumers from having a choice.

    If it were an inevitability in its own right, if it were just as good and easily substitutable, why would they care about consumers knowing before they paid for it?



  • They have a near monopoly on GPUs for cloud genAI data centers. They don’t make the semiconductors themselves; they hand the chip designs to TSMC and then sell what TSMC makes for them. The vast majority of their revenue right now comes from selling hardware to new genAI data centers; if those stop getting built, they lose 80% of their revenue. And their current valuation is based on the assumption of an order of magnitude more such data centers being built year over year.

    I think it’s very likely that demand for new chips will drop to zero, because the data centers already using their chips are overbuilt relative to any realistic demand. No one other than Nvidia is making money on these data centers, and there is no path to profitability.





  • I’d say in general, the advantages of Nvidia cards are fairly niche even on Windows. Multi-frame generation (fake frames) and upscaling are questionable value-adds most of the time, and most people probably aren’t going to be doing any ML work on their computer.

    AMD in general offers better performance for the money, and that’s doubly so given Nvidia’s lackluster Linux support. AMD has put in the work to get their hardware running well on Linux, both through their own team and through collaboration with the open source community.

    I can see why some people choose Nvidia cards, but I think, even on Windows, a lot of people who buy them would have been better off with AMD. And outside of some fringe edge cases, there is no good reason to choose them when building or buying a computer you intend to mainly run Linux on.





  • The current situation is a bubble, based on an overhyped extension of the cloud compute boom: nearly a trillion dollars of capital expenditure over the past five years from major tech companies chasing this white whale and filling new data centers with Nvidia GPUs, against revenue capping out at maybe 45 billion annually across all of them for “AI” products and services. And that’s before even talking about ongoing operating costs, such as power for the data centers, wages for the people running them, or the wages of the people developing services to run on them.

    None of this is making any fucking profit, and every attempt to find new revenue either increases their costs even more or falls flat on its face the moment it actually ships. No one at higher levels wants to call it out, because NVIDIA is holding up the whole fucking stock market right now, and it crashing out because everyone stopped buying new GPUs would hurt everyone else’s growth narrative.


  • See, that’s the kicker: Windows has so many “are you sure” pop-ups that most people just click through them without reading the fine print. People get desensitized and ignore them, or maybe they just assume Microsoft is trying to sell them on a feature they don’t care about.

    And in this case it didn’t move the files to the Recycle Bin, I imagine because OneDrive treated it as syncing the local files with the cloud copy, not as the user deleting local files.


  • I had a colleague at work who had to redo several days of work because of the OneDrive thing.

    The long and short of it is that they noticed their connection was being super slow, opened Task Manager to see if anything was eating bandwidth, saw OneDrive, went into it, correctly diagnosed that it was uploading files and eating up bandwidth, and then deleted all the files in OneDrive to stop it.

    OneDrive decided this meant they wanted all the local copies of the files deleted as well. On the one hand, that’s not the correct way to stop the behavior; on the other, it’s the kind of thing a lot of people would try, and having it then delete all the local files in turn is an unintuitive outcome.


  • Because it’s something where the current government can claim they’re “doing something” or “addressing a real problem,” but one that doesn’t threaten the rich and powerful.

    Going after Facebook would threaten the rich and powerful, for whom it is an important tool for manipulating people, and who think they can use it to mold culture into what they want it to be by breaking the minds of children.

    The current UK government is desperate to say to the public that they’re governing and fixing problems, but they also really don’t want to piss off the rich and powerful.