I’m beautiful and tough like a diamond…or beef jerky in a ball gown.

  • 10 Posts
  • 33 Comments
Joined 9 months ago
Cake day: July 15th, 2025

  • I suppose I’m afraid that having a dog myself would be like a magnet for other dogs while on walks that I might be uncomfortable with or that my being nervous could make a normal meet and greet go poorly.

    Yes. Also, your dog will pick up on your nervousness and either get nervous themselves or become defensive, neither of which is ideal and could make for a bad situation if you’re ever at a park or out for a walk. Dogs are little copycats when it comes to mirroring their owner’s anxieties and behaviors, and even if you deal with your anxiety, the dog may have adopted it in the meantime and you’d have to work to repair that damage.

    Basically, you’re smart to be asking these questions before taking on the responsibility of adoption. I’d recommend waiting until you’ve worked out your issues before potentially passing them on to your four-legged friend.

  • Disclaimer: All of my LLM experience is with local models in Ollama on extremely modest hardware (an old laptop with NVIDIA graphics), so I can’t speak to the technical reasons the context window isn’t infinite, or at least larger, on the big players’ models. My understanding is that the context window is basically the model’s short-term memory. In humans, short-term memory is also fairly limited in capacity. But unlike humans, the LLM can’t really see (or hold) the big picture in its mind.

    But yeah, everything you said is correct. Expanding on that, if you try to get it to generate something long-form, such as a novel, it’s basically just generating infinite chapters, using the previous chapter (or as much of the history as fits into its context window) as reference for the next. This means, at minimum, it’s going to be full of plot holes and will never reach a conclusion unless explicitly directed to wrap things up. And, again, given the limited context window, the ending will be full of plot holes and essentially based only on the previous chapter or two.
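    That rolling-window effect can be sketched in a few lines of Python. This is just a toy (the chapter text and the character budget are made up, and real models count tokens rather than characters), but it shows why earlier chapters silently drop out of view as generation goes on:

```python
# Toy sketch of a rolling context window: the model only ever "sees"
# the most recent chunks that still fit in its budget. All numbers here
# are invented for illustration; real LLMs measure tokens, not characters.

def fit_context(chapters, budget):
    """Keep the most recent chapters whose combined length fits the budget."""
    kept, total = [], 0
    for chapter in reversed(chapters):          # walk backwards from the newest
        if total + len(chapter) > budget:       # next-oldest chapter won't fit
            break
        kept.append(chapter)
        total += len(chapter)
    return list(reversed(kept))                 # restore chronological order

chapters = ["chapter one " * 50, "chapter two " * 50, "chapter three " * 50]
context = fit_context(chapters, budget=1300)

# Chapter one has already fallen out of the window, which is why long-form
# output drifts: chapter N is drafted from only the last chapter or two.
print(len(context))
```

    In this toy run only the two most recent chapters survive the cut, so anything established in chapter one is simply invisible when the next chapter gets written.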

    It’s funny because I recently found an old backup drive from high school with some half-written Jurassic Park fan fiction on it, so I tasked an LLM with fleshing it out, mostly for shits and giggles. The result is pure slop that seems like it’s building to something and ultimately goes nowhere. The other funny thing is that it reads almost exactly like a season of Camp Cretaceous / Chaos Theory (the animated kids’ JP series), and I now fully believe those are also LLM-generated.