thanks for clarifying. it was hard for me to dignify such a comment with a response.
you’re also going to run into hardware acceleration issues, since Metal isn’t available under a Linux kernel. i don’t really see a need to containerize these workloads these days anyway with tools like uv.
it’s a big pain in my ass at times trying to do web dev work with an aarch64-darwin dev env vs the target x86_64-linux. adding in hardware acceleration issues just sounds painful.
i also just personally don’t like containers. feels like a bludgeon of a solution.
oh i see. embedded systems make sense. i wouldn’t even try to go beyond the factory recommendation for systems like that. maybe for fun. there are likely kernel modifications or modules required for those systems.
i’m baffled GoDaddy still exists. i’ve never heard good things about them, but every normie i know mentions them first when the idea of buying a domain comes up
saying “Linux does dynamic linking and Windows does static linking” is both false and a mischaracterization. Windows absolutely does dynamic linking with its dynamic-link libraries (.dll). how dependencies are linked is up to the developer and whatever hardware constraints apply. one reason i like Rust is that it prefers static linking, and a lot of toolchains are moving in that direction. the reason Linux distros push people toward their internal package management tools (eg apt) is to have tighter control over dynamic linking.
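a quick way to see the dynamic side of this in practice (a minimal, Linux-only sketch; it just reads /proc/self/maps to list the shared objects the running Python interpreter has pulled in, the same .so files a distro’s package manager is trying to keep consistent):

```python
# list the shared objects (.so files) mapped into this process.
# a fully statically linked binary would have none of these; on Windows
# the moral equivalent would be the .dll files loaded alongside an exe.
import sys

with open("/proc/self/maps") as maps:
    shared_objects = sorted({
        line.split()[-1]                      # last column is the mapped path
        for line in maps
        if line.rstrip().endswith(".so") or ".so." in line
    })

print(f"{sys.executable} maps {len(shared_objects)} shared objects:")
for path in shared_objects:
    print(" ", path)
```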
and we’re also glossing over Scoop and Chocolatey and winget and Docker.
but that’s where you get to stuff like Flatpak and Snap and Nix that try to contain the dynamic dependencies.
i don’t think downloading exes and just running them, hoping that Windows has stuffed enough DLLs into the OS, is a better solution.
super fair. i am a Linux guy normally. i’m just being honest. i wish there was a better more open alternative.
if you want to go with the Linux alternative it’s going to cost. get at least 32GB of RAM and at least a 4090 to run the kind of models you’re asking for. it’s the way she goes
honestly it’s hard to beat Macs these days in this space for two reasons:
unified memory means that you don’t have to load up on RAM just to load the model and then also shell out for a video card with barely enough VRAM to fit a basic language model
their supply chain is solid and has mostly avoided the constraints that other OEMs and parts manufacturers are struggling with
pricing is tough. sure, crypto is on its way out, but GPUs are still the platform of choice for most neural net workloads (outside of SoCs like Apple M-series). i built a PC in late 2024, and it’s easily worth twice what i paid for it.
i guess it would be nice, but packages being a few months out of date is pretty normal for Ubuntu, in my experience. i’m not sure what their testing process is like, but part of using something like Ubuntu is the stability guarantees. if they felt like they couldn’t do that for newer versions for whatever reason (resource constraints, lack of downstream interest from stakeholders, etc) they’re not necessarily obligated to.
there’s a world of options. this is an LTS distro. use Arch or Nix or whatever if you want the latest packages. i actually switched to NixOS because the CUDA drivers were too new on Arch, and i wanted a better way to pin versions.
or i dunno keep publicly complaining about it until someone does the work for you
as someone who has been watching far too much Food Network on the treadmill: just give em some freakin time to cook. the best things i’ve made personally are either low and slow, from-scratch pasta, or slaw that sat in the fridge overnight. the 15-45 minute time frame has produced so many undercooked or otherwise mangled $80 steaks. like, even for a weeknight dinner i’m using things i marinated overnight or whatever. and in a kitchen setting you literally have all morning to prep, in addition to doing overnight prep or even coming in super early to start your fresh bread. the format precludes entire classes of dishes.
i can generate this story, at least structurally, using Claude. given the other headline about how MAGA is easily fooled by AI slop, i’d be surprised if this was any more than the unholy rage bait/confirmation bias slop that has old men congratulating AI bimbos in red hats for being “a real American woman”
guaranteed any company worth more than a handful of salt does not want this. my company would throw a library of books at them for using data in any way that isn’t 100% explicit. for the longest time they blocked me from running Ollama on my laptop cuz the lawyers didn’t understand how neural networks work and thought i was exfiltrating data.
this is only going to hurt companies that probably shouldn’t be using Atlassian products anyway (ie any company with more agility than a boomer era corporate dinosaur)
“changed nothing” can’t really be true. sure, productivity might have been unchanged on the whole, but in my experience that’s because of the clusterfuck of half-finished and poorly understood work that all of a sudden gets a pass and then takes real experts to come in, mitigate, and clean up. measuring productivity with lines of code was always a bad metric.
Microsoft is running out of moat to commit this type of developer abuse. Linux numbers just this year have shown that Windows can bleed, and we’re starting to enter a world where the software that people need isn’t on Windows when the converse used to be a given.
semantic search is a great use case. get a good embedding model and set up Postgres with pgvector, and i can semantic search my Obsidian D&D notes
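roughly what i mean, as a rough sketch (assumes psycopg 3, sentence-transformers, and a local Postgres with the pgvector extension; the table name, model choice, and connection string are just placeholders):

```python
# toy semantic search over notes with Postgres + pgvector.
# embeddings come from a small sentence-transformers model (384 dims).
import psycopg
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # any embedding model works

def to_vec(text: str) -> str:
    # pgvector accepts the literal form '[0.1,0.2,...]'
    return "[" + ",".join(str(x) for x in model.encode(text)) + "]"

with psycopg.connect("dbname=notes") as conn, conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS notes (
            id serial PRIMARY KEY,
            body text,
            embedding vector(384)
        )
    """)

    # index a note
    body = "session 12: the party bargains with the lich in the sunken library"
    cur.execute(
        "INSERT INTO notes (body, embedding) VALUES (%s, %s::vector)",
        (body, to_vec(body)),
    )

    # semantic search: nearest notes by cosine distance (the <=> operator)
    cur.execute(
        "SELECT body FROM notes ORDER BY embedding <=> %s::vector LIMIT 5",
        (to_vec("where did we meet the undead wizard?"),),
    )
    for (hit,) in cur.fetchall():
        print(hit)
```

swap the connection string and model for whatever you actually run; for a bigger pile of notes you’d also add an HNSW or IVFFlat index on the embedding column so the lookups stay fast.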