
  • You check the clock. You check again, because you didn't actually read the time; you were so absorbed in the process of checking the clock that you forgot to check the clock.

    You check the clock again. You have a new email. You consider checking the clock again, but give up and accept your fate because checking the clock a (second? Third? Tenth? First?) time is just too much right now, you're already running late anyways so it was kind of all procrastinating in the first place. You don't even know what you were supposed to be checking it for. Just wait and see, it's probably not that important. Maybe you'll check the clock and see if it sparks your memory.

    You check the clock. You finally see the time. The bus drives past you.

  • Hmm... Nothing off the top of my head right now. I checked out the Wikipedia page for Deep Learning and it's not bad, but it's quite technical and jumps around the timeline, though it does go all the way back to the 1920s with its history as jumping-off points. Most of what I know came from grad school and researching creative AI around 2015-2019, plus being a bit obsessed with it before and during my undergrad.

    If I were to pitch some key notes: the page details lots of the cool networks that dominated from the 60's to the 2000's, but it's worth noting that there were lots of competing models besides neural nets at the time. Then in 2011, two things happened at right about the same time: the ReLU (a simple activation that helps preserve the learning signal through many layers, allowing more complexity), which, while established in the 60's, only swept deep learning in 2011; and, majorly, Nvidia's cheap graphics cards with parallel processing and CUDA, which were found to massively boost the efficiency of running networks.
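    The ReLU itself is almost embarrassingly simple for the impact it had. A minimal sketch in Python (illustrative only, not any particular library's API):

```python
# ReLU: pass positive activations through unchanged, zero out negatives.
def relu(x: float) -> float:
    return max(0.0, x)

# Its derivative is exactly 1 wherever the unit is active, so the learning
# signal isn't repeatedly shrunk as it propagates back through many layers
# (sigmoid/tanh derivatives are always < 1, which compounds with depth).
def relu_grad(x: float) -> float:
    return 1.0 if x > 0 else 0.0

print(relu(2.5), relu(-1.3))            # 2.5 0.0
print(relu_grad(2.5), relu_grad(-1.3))  # 1.0 0.0
```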

    I found a few links with some cool perspectives: Nvidia post with some technical details

    Solid and simplified timeline with lots of great details

    It does exclude a few of the big popular-culture events, like Watson on Jeopardy in 2011. To me it's fascinating because Watson's architecture was an absolute mess by today's standards: over 100 different algorithms working in conjunction, mixing tons of techniques together to get a pretty specifically tuned question-and-answer machine. It took 2880 CPU cores to run, and it could win about 70% of the time at Jeopardy. Compare that to today's GPT models: while ChatGPT requires massive amounts of processing power to run, the structure is otherwise elegant, and I can run awfully competent models on a $400 graphics card. I was actually in a gap year, waiting to start my undergrad in AI and robotics, during the Watson craze, so seeing it and then seeing the 2012 big bang was wild.

  • Yeah I probably should have added the /s to that one.

  • Oh for sure. And it's a great realm to research, but pretty dirty to rip apart another field to bolster your own. Then again, string theorist...

  • For me, it's the next major milestone in what's been roughly a decade-long trend of research, and the groundbreaking part is how rapidly it has accelerated. We saw a similar boom in 2012-2018, and now it's accelerating again.

    Before 2011/2012, if your network was too deep (too many layers), it would just break down and give pretty random results; it couldn't learn, so networks had to perform relatively simple tasks. Then a few techniques were developed that enabled deep learning: the ability to really stretch the number of patterns a network could learn, given enough data. Suddenly, things that were jokes in computer science became reality. Image recognition, for example, took about five years to go from roughly 35-40% incorrect classification down to 5%, with the error rate halving about every year along the way. That's the same stuff that powered all the hype around AI beating Go champions and professional Starcraft players.
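    To give a feel for why depth used to break things: backprop multiplies in one derivative factor per layer, and with saturating units like the sigmoid that factor is at most 0.25, so the learning signal shrinks exponentially with depth. A toy illustration (the numbers are the textbook best case, not from any real network):

```python
import math

def sigmoid_grad(x: float) -> float:
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

depth = 30

# Best-case gradient surviving 30 sigmoid layers: 0.25 ** 30, about 8.7e-19.
vanished = sigmoid_grad(0.0) ** depth

# A ReLU unit that stays active contributes a factor of exactly 1 per layer.
intact = 1.0 ** depth

print(f"sigmoid: {vanished:.1e}, relu: {intact}")
```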

    The Transformer (the T in GPT) came out in 2017, around the peak of the deep learning boom. Two years later, GPT-2 was released, and while it's funny to look back on now, it practically revolutionized temporal data coherence and showed that throwing lots of data at this architecture didn't break it, like it had previous ones. Then they kept throwing more and more data at it, and it kept improving. With GPT-3 about a year later, just like in 2012, we saw an immediate spike in previously impossible challenges being destroyed, and the models seemingly haven't degraded with more data yet. While it's unsustainable, it's the same kind of puzzle piece that pushed deep learning into the forefront in 2012, and the same concepts are being applied to other domains like image generation, which has also seen massive boosts thanks in part to the 2017 research.

    Anyways, small rant, but yeah: its hype lies in its historical context, for me. The chat bot is an incredible demonstration of the underlying advancements in data processing made over the past decade, and if working out patterns from massive quantities of data is a pointless endeavour, I have sad news for all folks with brains.

  • Windows 11 has a tabbed file explorer, a package manager, it's quick, the interface looks and feels nice, and it's been really stable for me. I don't know where the complaints are coming from; it's been great. All they need to do is roll back all of the ads-in-your-OS stuff from 10, and bring back a start menu that doesn't hang for 30 seconds looking something up online before showing you your installed programs.

  • I understand that he's placing these relative to quantum computing, and that he is specifically a scientist deeply invested in that realm. It just seems too reductionist from a software perspective: yes, we are ultimately limited by the architecture of our physical computing paradigm, but that doesn't discount the incredible advancements we've made in the space.

    Maybe I'm being too hyperbolic over this small article, but does this basically mean any advancements in CS research are basically just glorified (insert elementary mechanical thing here) because they use bits and von Neumann architecture?

    I used to adore Kaku when I was young, but as I got into academics, saw how attached he was to string theory long after its expiry date, and watched him build his popularity on pretty wild and speculative fiction, I've struggled to take him too seriously in this realm.

    In my experience, which comes from years in labs working on creative computation, AI, and NLP, these large language models are impressive and revolutionary, but, quite frankly, for dumb reasons. The transformer was a great advancement, but seemingly only once we piled obscene, previously unimagined amounts of data onto it. Now we can train smaller bots off the data from these bigger ones, which is neat, but it still comes down to that mass of data.

    To the general public: yes, LLMs are overblown. To someone who spent years researching creativity-assistance AI and NLP: these are freaking awesome, and I'm amazed at our current ability to create code that can do qualitative analysis and natural language interfacing. But the model is unsustainable unless techniques like Orca come along and shrink the data requirements. That said, I'm running pretty competent language and image models on a relatively cheap consumer video card with 12GB of VRAM, so we're progressing fast.

    Edit to add: I do agree that we're going to see wild stuff with quantum computing one day, but that can't discount the excellent research being done by folks working with existing hardware, and it's upsetting to hear a scientist balk at a field like that. And I recognize I led with speaking down on string theory, but string theory pop science (including Dr. Kaku's) wreaked havoc on people taking physics seriously.

  • I'm not sure if it's relevant here, but I'd recommend taking a look at the book Adult Children of Emotionally Immature Parents. I picked up the audiobook from my library and it really helped me understand myself, my development, and my parents a lot better and to have a healthier outlook on our relationship. I always understood my parents had their own baggage, but I didn't realize the specifics I could be on the lookout for, the specific reactions I'd had that could be linked to it, and how to move forward.

    It could at least be a good start. Best of luck!

  • This is the start of the use cases I wanted to see take off with Mastodon/Lemmy/Kbin. Much like the previous era of distributed content with user-hosted voice servers and forums, having larger communities/organizations run their own instances and avoid trying to treat the space as one big pool of content is the real use case here. The fact that you can cross-instance subscribe and post makes it viable long-term.

    It also gives "free" verification of information's sources based on the domain, the same way that (modern) email gives you an extra layer of confidence when you see a verified domain. I would love to see the Government of Canada, CBC, and universities all starting their own instances and utilizing them in unique and interesting ways. With enough adoption, official provincial/municipal instances could pop up to make organized communities easier.

    It feels to me like a starting move away from the autocracy that the platform economy has created. It's not universal, but I absolutely push back against too many instances trying to be "general purpose Reddit replacements" because that seems like a fleeting use case for what it can eventually become, and it just confuses the whole abstraction of what these decentralized socials afford.

  • Soap. 100%.

  • I know this post and comment might sound shilly, but switching to more expensive microfibre underwear actually made a big impact on my life and motivated me to start buying better-fitting, better-material clothes.

    I'd always bought cheap and thought anything else was silly. I was wrong. They're so much more comfortable; I haven't had a single pair even begin to wear down, I sweat less and feel cleaner, they fit better, and they haven't been scrunchy or uncomfortable once, compared to the daily issues of that cheap FotL life. This led to more expensive, longer-lasting socks with textures I like better, and better-fitting shoes that survive more than one season.

    It was spawned by some severe weight loss and a need to restock my wardrobe. My old underwear stuck around as backups to tell me I needed to do laundry, but going back to the old ones was bad enough that I stopped postponing laundry.

    Basically, I really didn't appreciate how much I absolutely hated so many textures I was constantly in contact with until I tried alternative underwear and realized you don't have to just deal with that all the time.

  • It depends what "from scratch" means to you, and I don't know your level of programming or your interests. You could be talking about making a game from beginning to end, and you could be talking about...

    • Using a general-purpose game engine (Unity, Godot, Unreal) and pre-made assets (e.g., Unity Asset Store, Epic Marketplace)?
    • Using a general-purpose game engine almost purely as a rendering+input layer with a nice user interface, and building your own engine on top of that?
    • Using frameworks for user input and rendering images, but not necessarily ones built for games? They're more general-purpose, so you'll need to write a lot of game code to pull it all together into your own engine before you even start "making the game", but they offer extreme control over every piece, so you can make something very strange and experimental, at the cost of lots of technical overhead before you get started.
    • Writing your own frameworks for handling user input and rendering images? The same as the previous, except you'll spend 99% of your time reinventing the wheel and trying to get it to go as fast as any off-the-shelf replacement.

    If you're new to programming and just want to make a game, consider Godot with GDScript - here's a guide created in Godot to learn GDScript interactively with no programming experience. GDScript is similar to Python, a very widely used language outside of games, but GDScript itself is exclusive to Godot, so those skills will need some translating elsewhere. You can also use C# in Godot; it's a bigger learning curve, but it's very general and used in a lot of games.

    I'm a big Godot fan, but Unity and Unreal Engine are solid too. Unreal might have a steeper learning curve. Godot is a free and open-source project with a nice community, but it doesn't have the extensive userbase and forum repository of the other two; Unity is so widely used that there's lots of info out there.

    If you did want to go really from scratch, you can try something like Pygame in Python or Processing in Java, which are entirely code-created (no visual editor) but offer lots of helpful functionality for making games purely from code. Very flexible. That said, they'll often run slow, they'll take more time to get a project started, and you'll very quickly hit a ceiling on how much you can realistically do in them.
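    The core thing those code-first frameworks hand you is the game loop: poll input, update the world, draw, repeat. A bare-bones sketch of that structure in plain Python (the Ball class and all the numbers are made up for illustration; Pygame would handle the actual windowing, events, and drawing):

```python
class Ball:
    """Toy game object: moves right and bounces off a wall at x = 100."""
    def __init__(self) -> None:
        self.x = 0.0
        self.vx = 64.0  # units per second

    def update(self, dt: float) -> None:
        self.x += self.vx * dt
        if self.x >= 100.0:
            self.x = 100.0
            self.vx = -self.vx  # bounce off the wall

def run(frames: int, dt: float = 1 / 64) -> Ball:
    """The classic loop: handle input -> update state -> render, once per frame."""
    ball = Ball()
    for _ in range(frames):
        # In Pygame this is where you'd poll the event queue and draw to the
        # screen; here we only run the simulation step.
        ball.update(dt)
    return ball

print(run(128).x)  # 128 frames = 2 simulated seconds; the ball has bounced back
```

Every engine in the list above is, at its heart, a much fancier version of this loop with rendering, audio, and physics bolted on.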

    If you want to go a bit lower, C++ with SDL2, learning OpenGL, and learning how games are rendered is great: it will be fast, and you'll learn the skills to modify Godot, Unreal, etc. to do anything you'd like. But similar caveats to the previous option apply; there's likely a low ceiling on the quality you'll be able to put out, and high overhead to get a project started.

  • My cynical guess is that's what they're hoping the community will do ("like lemmings, I tell you!" - spez, probably) to drive higher traffic numbers before some announcement or meeting.

  • Yeah, I really think it's important to not see Lemmy as one singular community, or a lot of important use cases will go ignored.

  • Not the OP, but in Canada at least, I think you would legally be expected to, because common law is (as far as I'm aware) very nearly marriage and is entirely implied by time living together in a conjugal relationship. The actual property laws might be determined provincially, though.

    I don't have a firm opinion here, but I think the key difference in your case is that a conjugal relationship has some expectation of... Oh I don't know, mutuality? A landlord tenant relationship is a lease agreement. If your roommate didn't sign any kind of lease agreement, they might have a legal case to just not pay you and suffer no consequences (I don't know), but they're not in a conjugal relationship, so there's also no implication of shared ownership.

    Without a signed lease agreement, and being in a conjugal relationship, I think there is a pretty fair case that expecting shared ownership is a reasonable assumption.

    That all said, it's also really up to the individuals to figure that out early, and the deception in the meme suggests that the agency to have that discussion wasn't available, and that's really the part I find problematic here.

  • That and expropriation/eminent domain, etc. Even if you pay your taxes, if the government needs it, they have processes to take it.

    I'm not saying it's an inherently bad thing, but it's another one of those important things to realize is already present if anyone wants to argue for/against certain government reforms.

  • I certainly used to, and used to think it was essentially gender neutral, but again - in certain contexts like a male dominated classroom, the women/nb students could easily feel excluded by it. Outside of that, I also recognized my trans friends had a lot of thoughtless people intentionally misgendering them on the regular just to be mean, and finding small ways to reduce that reinforcement felt better than not. It was also surprisingly not that tough for me to adopt the more neutral language, so if it's a subtle help with no skin off my back it just seems very win-win.

  • I know it's controversial, but moving away from "guys" when I address a group and more or less defaulting to "they" when referring to people I don't know.

    "They" was practical, because I deal with so many students exclusively via email, and the majority of them have foreign names where I'd never be able to place a gender anyway if they didn't state pronouns.

    Switching away from "guys" was natural, but I'm in a very male-dominated field, and I'd heard from women students in my undergrad that they did feel just a bit excluded in a class setting (not so much in social settings) when the professor addressed a room of 120 men and 5 women with "guys", so it more or less fell to the side in favour of folks/everyone.

  • Only when it's intentionally censored and trained to react in a particular way. When it's not, you're reminded that it was trained on random internet content.