• 0 Posts
  • 77 Comments
Joined 1 year ago
Cake day: June 15th, 2023




  • I’m guessing you weren’t around in the 90s then? Because the amount of money set on fire on stupid dotcom startups was also staggering.

    The scale is very different. OpenAI needs to raise capital at a valuation far higher than any other startup in history just to keep the doors open another 18-24 months. And then continue to do so.

    There’s also a very large difference between far-ranging bad investments and extremely concentrated ones. The current bubble is distinctly the latter. There hasn’t really been a bubble completely dependent on massive capital investments by a handful of major players like this before.

    There’s OpenAI and Anthropic (and by proxy MS/Google/Amazon). Meta is a lesser player. Musk-backed companies are pretty much teetering at the edge of also-rans, and there’s a huge cliff for everything after that.

    It’s hard for me to imagine investors who don’t understand the technology now, and who get burned by it, being enthusiastic about investing in a new technology they don’t understand that promises the same things but is “totally different this time, trust me.” Institutional and systemic trauma is real.

    (took about 15 years because 2008 happened).

    I mean, that’s kind of exactly what I’m saying? Not that it’s irrecoverable, but that losing a decade plus of progress is significant. I think the disconnect is that you don’t seem to think that’s a big deal as long as things eventually bounce back. I see that as potentially losing out on a generation worth of researchers and one of the largest opportunity costs associated with the LLM craze.


  • Sure, but those are largely the big tech companies you’re talking about, and research tends to come from universities and private orgs.

    Well, that’s because the hyperscalers are the only ones who can afford it at this point. Altman has said GPT-4 training cost in the neighborhood of $100M (largely subsidized by Microsoft). The scale of capital being set on fire in the pursuit of LLMs is just staggering. That’s why I think the failure of LLMs will have serious knock-on effects on AI research generally.

    To be clear: I don’t disagree with you re: the fact that AI research will continue and will eventually recover. I just think that if the LLM bubble pops, it’s going to set things back for years because it will be much more difficult for researchers to get funded for a long time going forward. It won’t be “LLMs fail and everyone else continues on as normal,” it’s going to be “LLMs fail and have significant collateral damage on the research community.”


  • There is real risk that the hype cycle around LLMs will smother other research in the cradle when the bubble pops.

    The hyperscalers are dumping tens of billions of dollars into infrastructure investment every single quarter right now on the promise of LLMs. If LLMs don’t turn into something with a tangible ROI, the term AI will become every bit as radioactive to investors in the future as it is lucrative right now.

    Viable paths of research will become much harder to fund if investors get burned because the business model they’re funding right now doesn’t solidify beyond “trust us bro.”








  • The BSG reboot really suffered from being a product of its era.

    It’s when shows were first really dipping their toes into telling an overarching narrative, but writers’ rooms were still very much geared toward producing stories of the week. The result was that a lot of shows at the time would start incredibly strongly, set up a lot of really interesting premises, and then just meander along because the writers were literally making it up as they went, with no coherent plan.

    Know how Game Of Thrones fell apart in the last couple of seasons when they outran the preplanned narrative of the books? That’s how a lot of TV ended up in the early 2000s. BSG and Lost are probably the two most prominent examples from around that time, but it was a pretty common problem as the format of TV shows was starting to change.







  • Article III only lays out that there will be a Supreme Court and a Chief Justice, and makes Congress responsible for establishing them. It does not lay out the makeup or structure of that court. The current body of 9 justices is set by federal statute and could be changed by a simple act of Congress.

    Article III also explicitly states that whatever Justices are appointed hold their office as long as they maintain good behavior (i.e., as long as they haven’t been impeached) and that Congress cannot reduce their pay.

    Term limits are explicitly unconstitutional.

    Setting the number of judges is explicitly within Congress’ constitutional powers.

    Randomized panels would probably be challenged just because it’s never been tested, but the language in the Constitution re: Congress establishing the Supreme Court is vague. That said, Congress has already established inferior Federal courts that operate in this manner, so there’s precedent.


  • I think you’re missing the point.

    As things stand now, you get cases that are tailor-made to the whims of specific people because there’s a 100% chance it ends up in front of those specific people. That’s an absolutely massive problem.

    The point is that you’re less likely to have cases that are specifically aimed at stroking any given individual’s brand of crazy when there’s only a ~1 in 3 chance they’ll even hear it. A panel of 9 from a pool of 26 means that you go from a 100% chance that, say, Alito and Thomas hear a case together to around 11%. That’s a huge gamble when it takes years and a massive amount of money to get a case in front of SCOTUS.
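    That back-of-the-envelope math is easy to check. A minimal sketch (the 9-justice panel and 26-justice pool are the proposal’s numbers; the justice names are just the example above):

```python
from math import comb

POOL = 26   # proposed pool of justices
PANEL = 9   # justices randomly drawn to hear any given case

# Chance that one specific justice sits on the panel: 9/26, roughly 1 in 3
p_one = PANEL / POOL

# Chance that two specific justices (e.g., Alito and Thomas) both land on
# the same panel: fix both on the panel, then choose the remaining 7 seats
# from the other 24 justices (a hypergeometric count)
p_both = comb(POOL - 2, PANEL - 2) / comb(POOL, PANEL)

print(f"one specific justice on the panel: {p_one:.1%}")   # 34.6%
print(f"both specific justices together:   {p_both:.1%}")  # 11.1%
```

    The two-justice figure simplifies to (9 × 8) / (26 × 25) ≈ 11%, which is where the number above comes from.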

    No, it doesn’t solve all conceivable problems with the court. But it’d help address the fact that SCOTUS justices are entirely too powerful as individuals and it can be done via simple act of Congress.

    Appointees should just be subject to term limits and yearly affirmation votes by members of the BAR association to renew or revoke their qualifications

    Not going to happen. SCOTUS terms are life appointments constitutionally. That means you’ve gotten into amendment territory, which just plain is not realistic right now.