I haven’t kept up with the prompt injection memes, but I wonder if anyone has bothered making a frontend or Ollama plugin that just uses the thousands of random public chatbots to get free high-token-rate usage lol.
Let’s get McDonald’s support to make it
Writing a linked list in Python just feels wrong. I know it’s probably homework.
If you must, the stdlib has `collections.deque`, which is going to be faster and simpler than rolling your own. Plus no reference semantics to deal with.
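For the homework case, here’s a minimal sketch of `collections.deque` covering the usual linked-list operations (variable names are just for illustration):

```python
from collections import deque

# CPython's deque is a doubly-linked list of fixed-size blocks under the hood,
# so pushes and pops at either end are O(1) without rolling your own nodes.
tasks = deque(["read", "write"])
tasks.appendleft("plan")   # O(1) insert at the head
tasks.append("review")     # O(1) insert at the tail
first = tasks.popleft()    # O(1) remove from the head

print(first)   # plan
print(tasks)   # deque(['read', 'write', 'review'])
```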
It’s a meme job interview question iirc.
Yeah, linked lists are rarely a good idea. Modern memory optimization, where contiguous regions of memory are loaded into CPU caches, means that array-backed lists have better performance in virtually all situations.
In a way, I’d argue you should only ever roll your own linked lists, because the only time a linked list makes sense is when you’re not working in-memory, i.e. when array-backed lists aren’t an option to begin with.
You really need frequent middle insertion (insert joke here) for the linked list to become better than an array list.
Oh damn, I was just about to reply to your reply to @[email protected] (which is literally directly above this comment, on my screen) suggesting exactly this. Glad that Piefed initially failed to register my clicking “reply”.
What would you use if you don’t know how much space you’re going to need in advance, and you’re only going to read the data once for every time the structure gets created?
Array list/vector types usually have dynamic resizing built in, and if you can benchmark it, that always helps.
Yes, but dynamic resizing typically means copying all of the old data to the new allocation, which a linked list never has to do. Reading a large quantity of data into a linked list is O(N); reading it into an array can end up O(N^2) with a naive fixed-increment growth strategy, and even with geometric growth (amortized O(N)) you’re still paying to re-copy elements on every resize.
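To make the copying cost concrete, here’s a toy counting sketch (my own illustration, not a real benchmark): it tallies how many element copies a growable array performs while appending N items under two growth strategies.

```python
def copies_when_appending(n, grow):
    """Count element copies a dynamic array makes while appending n items."""
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:
            copies += size            # a resize copies every existing element
            capacity = grow(capacity)
        size += 1
    return copies

N = 100_000
print(copies_when_appending(N, lambda c: c + 64))  # fixed increment: ~N^2/128 copies
print(copies_when_appending(N, lambda c: c * 2))   # doubling: < 2N copies, amortized O(N)
```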
You can make the items in your list big chunks so that you don’t pay much of a cache-performance penalty.
I thought of another good example situation: a text buffer for an editor. If you use an array, then on large documents inserting a character at the beginning of the document requires you to rewrite the rest of the array, every single character, to move everything up. If you use a linked list of chunks, you can cap the amount of rewriting you need to do at the size of a single chunk.
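Something like this rough sketch (chunk size and names are made up for illustration): keep the document as a linked list of small string chunks, so an insert near the front only rewrites the one chunk it lands in.

```python
CHUNK_SIZE = 64

class Chunk:
    """One node of the buffer: a short string plus a link to the next chunk."""
    def __init__(self, text, nxt=None):
        self.text = text
        self.next = nxt

def build_buffer(document):
    """Split the document into a linked list of CHUNK_SIZE-character chunks."""
    head = None
    for start in reversed(range(0, len(document), CHUNK_SIZE)):
        head = Chunk(document[start:start + CHUNK_SIZE], head)
    return head

def insert(head, pos, ch):
    """Insert one character at pos; only the chunk containing pos is rewritten."""
    node = head
    while pos > len(node.text):
        pos -= len(node.text)
        node = node.next
    node.text = node.text[:pos] + ch + node.text[pos:]

buf = build_buffer("hello world " * 10)   # 120 characters -> two chunks
insert(buf, 0, "!")                       # rewrites only the first 64-char chunk
print(buf.text[:12])                      # !hello world
```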
I’m surprised they went for such a high performance chat bot as Claude
Paying for a needlessly expensive LLM makes leadership think they’re “investing in innovation”
Is this even a real screenshot? Seems fake to me. The bot on the support page I tried is much dumber and doesn’t seem LLM-based at all. More like the chat bots we had before LLMs fucked up the world.
They fired their IT manager and hired an AI agent.
How can you be sure that they do?
Could be using a cheaper model like haiku, which is still “claude”
Personally, I’ve been using Ecosia as a coding assistant for shit my local LLM on my SteamDeck can’t figure out without triggering thermal threshold warnings.
It’s… theoretically more eco-friendly than, uh, Ronald McDonald teaches Python.
Wait, Ecosia has an AI assistant? One - that sounds ridiculously counterproductive to their goal of being eco, Two - I need to check that out haha
It has an AI assisted search feature.
Which just also happens to have basically no guidelines preventing it from analyzing and rewriting code that you copy paste to it.
lol.
But uh, they have some kind of scheme where AI usage or search queries = a pledge to plant trees.
… ???
Most models I’ve run locally will at least try to read code, tell you how it works, suggest improvements, etc.
You could probably copy-paste a block of random code into any AI ‘thing’, ask it to help you with it, and there’s a pretty good chance it would try to.
Bro Grimace is just 1337
Isn’t there a free claude option?
Yeah, but McDonald’s isn’t paying for those tokens.
Add whitespace to taste
ChatGPT is also free.
At least until their subscription model implodes
Would like to see the original image