• 0 Posts
  • 36 Comments
Joined 1 year ago
Cake day: June 15th, 2023



  • I’m guessing you weren’t around in the 90s then? Because the amount of money set on fire on stupid dotcom startups was also staggering.

    The scale is very different. OpenAI needs to raise capital at a valuation far higher than any other startup in history just to keep the doors open another 18-24 months. And then continue to do so.

    There’s also a very large difference between far-ranging bad investments and extremely concentrated ones. The current bubble is distinctly the latter. There hasn’t really been a bubble so completely dependent on massive capital investment from a handful of major players before.

    There’s OpenAI and Anthropic (and by proxy MS/Google/Amazon). Meta is a lesser player. Musk-backed companies are teetering on the edge of being also-rans, and there’s a huge cliff for everything after that.

    It’s hard for me to imagine investors who don’t understand the technology now, after getting burned by it, being enthusiastic about investing in a new technology they don’t understand that promises the same things but is “totally different this time, trust me.” Institutional and systemic trauma is real.

    (took about 15 years because 2008 happened).

    I mean, that’s kind of exactly what I’m saying? Not that it’s irrecoverable, but that losing a decade-plus of progress is significant. I think the disconnect is that you don’t seem to think that’s a big deal as long as things eventually bounce back. I see it as potentially losing a generation’s worth of researchers, and one of the largest opportunity costs associated with the LLM craze.


  • Sure, but those are largely the big tech companies you’re talking about, and research tends to come from universities and private orgs.

    Well, that’s because the hyperscalers are the only ones who can afford it at this point. Altman has said GPT-4 training cost somewhere in the neighborhood of $100M (largely subsidized by Microsoft). The scale of capital being set on fire in the pursuit of LLMs is just staggering. That’s why I think the failure of LLMs will have serious knock-on effects on AI research generally.

    To be clear: I don’t disagree with you re: the fact that AI research will continue and will eventually recover. I just think that if the LLM bubble pops, it’s going to set things back for years because it will be much more difficult for researchers to get funding. It won’t be “LLMs fail and everyone else continues on as normal”; it’s going to be “LLMs fail and there’s significant collateral damage to the research community.”


  • There is a real risk that the hype cycle around LLMs will smother other research in the cradle when the bubble pops.

    The hyperscalers are dumping tens of billions of dollars into infrastructure investment every single quarter right now on the promise of LLMs. If LLMs don’t turn into something with a tangible ROI, the term “AI” will become every bit as radioactive to investors in the future as it is lucrative right now.

    Viable paths of research will become much harder to fund if investors get burned because the business model they’re funding right now never solidifies beyond “trust us bro.”






  • As somebody who’s a paying Kagi user and generally happy with the service, it’s interesting to see exactly where the tradeoffs are.

    While I’d say Kagi pretty much universally returns better results for technical information or things like recipes, where it deprioritizes search spam, it’s also pretty clear that there are areas where the absence of targeting hurts results. Any type of localized result, e.g., searching for nearby restaurants or other businesses, tends to be really hit or miss, and I tend to fall back to Google there.

    Of course, that’s because Kagi avoids targeting to the point where they don’t even use your general location to prioritize results. It’s an interesting balancing act and I’m not quite sure they’ve hit the sweet spot yet, at least for me personally, but I like the mission and the results for most searches, so I’m happy with the overall experience currently.


  • Searches are supposed to be fast at giving you the answer you’re looking for. But that is antithetical to advertising.

    And we have evidence that this is exactly why it happened, too:

    https://www.wheresyoured.at/the-men-who-killed-google/

    While I’d highly recommend giving either the article a read or the companion podcast a listen because Ed Zitron did some fantastic reporting on this, the tl;dr is that a couple of years ago, there was direct conflict between the search and advertising wings of Google over search query metrics.

    The advertising teams wanted the metrics to go up to help juice ad numbers. The search team rightly understood that there were plenty of ways they could do so, but that it would make for a worse user experience. The advertising team won.

    The head of the advertising team during this was a man named Prabhakar Raghavan. Roughly a year later, he became the head of Google Search. And the timing of all this lines up with when people started noticing Google just getting worse and worse to actually use.

    Oh, and the icing on the cake? Raghavan’s previous job? Head of Yahoo Search, just before that business cratered to the point that Yahoo decided to just become a Bing frontend.

    Zitron is fond of saying that these people have names and it’s important that we know who’s making the decisions that are actively making the world of tech worse for everyone; I tend to agree.


  • Tree nested communication is much more superior than traditional thread based communication

    Heavily depends, IMO.

    Nested threads are great for temporary discussion of a specific story or idea. They’re absolutely miserable for long-running discussions. New posts get lost in the tree and information ends up scattered across multiple threads as a result.

    It’s also been my personal experience that the nested-thread format just doesn’t seem to build communities in the same way forums did. I have real-life friendships that were made on forums decades ago, and I never had that experience with Reddit despite being a very early user.

    I don’t think that’s entirely due to the ephemeral format, but I do think it plays a part. A deep thread between two people on Reddit might last a few hours and a dozen replies before it falls off the page. On forums, threads running for months or years were pretty common, and that kind of engagement with the same people certainly changes how your relationships with them develop.



  • In a vacuum, sure, but it also completely tracks with Sam Altman’s behavior outside of OpenAI.

    Employees at previous companies he’s run have expressed very similar concerns about Altman acting in dishonest and manipulative ways. At his highest-profile gig before OpenAI, Paul Graham flew from London to San Francisco to personally (and quietly) fire him from Y Combinator because Altman had gone off the reservation there too. The guy has a track record of doing exactly the kind of thing Toner is claiming.

    What we know publicly strongly suggests Altman is a serial manipulator. I’m inclined to believe Toner on the basis that it fits with what we otherwise know about the man. From what I can tell, the board wasn’t wrong; they lost because Altman’s core skill is being a power broker and he went nuclear when the board tried to do their job.



  • Like how Ferrari cars are designed for 20 year olds but only 80 year olds can afford to buy them.

    I mean, making the comparison to motorsports just emphasizes how cheap gaming is as a hobby.

    Autocross is as entry-level as you can get, and a typical ~$50 entry fee gets you maybe 10 minutes of seat time, with a 2-3 hour drive each way to the event being common. That’s before you start adding in things like a $1500 set of tires lasting a season or two at most, suspension and brake upgrades easily running a couple of thousand dollars, etc.

    Start dipping into actual track time and fees jump to more like $250-750 per event, plus around that much again for track insurance. And the upgrades needed for the car to hold up on track are more expensive still. And this is all ignoring the purchase price of the car and potentially needing to trailer a dedicated track car.

    I’ve almost certainly spent far less on PC gaming in the last 5 years combined than I have on motorsports in the past 3 months. I’m on the upper end of spending for most gamers and a dabbler at best when it comes to the cars.

    The insanity of the GPU market since COVID has put some upward pressure on things, but A. the proliferation of great indie titles means you can get incredible value without breaking the bank on the highest-end equipment, and B. even then, the money I spent literally tonight ordering just brake pads and rotors would buy you a 4070 all day long. And I went cheaper than I could have.

    Gaming dollars go a long, long way. It’s a hobby that was affordable even when I was younger and broke. It’s still relatively affordable compared to many, many other hobbies.



  • It’s why I’ve avoided anything smarthome tied to any particular vendor.

    My endpoint devices are almost entirely Z-Wave or Zigbee/Matter based. I started out with a SmartThings hub but migrated it all to Home Assistant last year. HA has honestly had easier integrations than SmartThings did and supports almost anything under the sun.

    I don’t have to worry about suddenly losing control of my devices and the only ‘subscription’ associated with it all is $15/year for a domain name to make setting up remote access easier. This approach requires a little more research, but it opens up the ability to mix and match devices however you’d like. Absolutely zero regrets.
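    If it helps make the no-lock-in point concrete, below is a minimal sketch of what local control looks like once everything sits behind Home Assistant: a couple of calls to HA’s REST API from Python. The hostname, token, and light.living_room entity are placeholders for whatever your own instance uses, not anything specific to my setup.

    ```python
    # Minimal sketch: talking to a local Home Assistant instance over its REST API.
    # Assumes HA is reachable at the default homeassistant.local:8123, that you've
    # created a long-lived access token from your HA user profile, and that a
    # hypothetical "light.living_room" entity exists -- swap in your own values.
    import requests

    BASE_URL = "http://homeassistant.local:8123"  # local address, no cloud dependency
    TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"
    HEADERS = {
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    }

    # Read the current state of a device (works the same for Z-Wave, Zigbee, etc.)
    state = requests.get(
        f"{BASE_URL}/api/states/light.living_room", headers=HEADERS, timeout=5
    ).json()
    print(state["state"], state.get("attributes", {}))

    # Call a service to control it
    requests.post(
        f"{BASE_URL}/api/services/light/turn_on",
        headers=HEADERS,
        json={"entity_id": "light.living_room", "brightness_pct": 50},
        timeout=5,
    )
    ```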


  • Free Stars is being made by the original creators of the series, Paul Reiche and Fred Ford. They had nothing to do with SC3 or Origins.

    The reason it’s not using the Star Control name is that the IP ownership around the whole thing is messy. The short version is that Paul and Fred owned the rights to the universe, but Atari owned the rights to the Star Control name.

    When Atari went bankrupt, Stardock bought the name. They thought they’d bought the universe. This resulted in Stardock spending the next couple of years trying to use the courts to bully Paul and Fred into turning over the rights to them and generally being dickheads.

    This finally ended in a settlement and work on Free Stars has been happening quietly for the last couple of years.


  • commandar@lemmy.world to PC Gaming@lemmy.ca · ASUS Scammed Us

    For 3D printers, they’re subpar.

    Noctua fans are typically 12v and tuned for lower speeds to keep noise down; in 3DP you’re generally looking for 24v fans* with the highest CFM:static pressure ratio you can get, which will generally mean a louder, higher-RPM fan.

    They’ll work, but you can generally get industrial fans from Delta, Sunon, etc. that are a better fit for the application, often for less money.

    * - 5v and 12v fans are getting more common simply because they tend to be more available. The preference for a high CFM:static pressure ratio holds true regardless.