• 0 Posts
  • 25 Comments
Joined 3 years ago
Cake day: July 20th, 2023

  • TLDR: I think AI is coming for teaching (for better or worse), but not coming to replace us, because the teacher:student ratio is already as poor as can be.


    I’m a teacher. Giving a serious answer: AI is likely going to be very involved in this industry over the next decade, purely for its ability to track and scaffold individual learning better than one adult doing the same for 30 people. But that would require a shift to even more digital learning, which takes the “how” of teaching out of the hands of a teacher in a way it currently is not.

    That said, I don’t actually think it’s coming for my job, precisely because there is 1 teacher to every 30 students. If you compare us to how cashiers have been replaced by self-service tills, teachers have already been stripped to a minimal coverage of the classroom, and you cannot have 30 students working independently, because children and teenagers are predominantly motivated to avoid working. In this regard, it can only supplement our job, as they can’t meaningfully cut the adult-to-student ratio further for safety reasons.

    Also, although I think it’ll start to be seen in the next 10 years, I’m not sure where it would come in. State schools do not have the budget, energy or time to experiment with individualised AI learning support, and private schools prefer to maintain older styles of teaching for a long time, as they prioritise the development of attitude and trust over academic scores; not only does that supplement academic scores, it is what the corporate employers of privately educated students seek above merit.


  • Funnily enough I actually have Firefox open by default whenever I boot up my PC.

    I have no taskbar or desktop items. I always default to a specific workflow of pressing the Windows key (or whatever we call it on Linux) and searching for everything. I have since early Windows 10.

    I realised that 90% of the time, I was opening Firefox, so now it just opens. I have a pretty minimal toolbar setup for it, so it’s basically just an address bar that automatically focuses when I start typing.

    One day I’ll set up something where I have multiple search hotkeys for web search, file search, application search, music etc, that will sort of replace this.
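
    Something like this minimal Python sketch is roughly what I mean (purely hypothetical: the mode names, search engine and commands are placeholders, and it assumes a Linux desktop where each mode can be bound to a hotkey that runs the script):

    #!/usr/bin/env python3
    """Hypothetical multi-mode launcher sketch: launcher.py <mode> <query>."""
    import subprocess
    import sys
    import urllib.parse
    import webbrowser
    from pathlib import Path

    def web(query: str) -> None:
        # Web search in the default browser (the engine is just an example).
        webbrowser.open("https://duckduckgo.com/?q=" + urllib.parse.quote(query))

    def files(query: str) -> None:
        # Case-insensitive filename search under $HOME via coreutils find.
        subprocess.run(["find", str(Path.home()), "-iname", f"*{query}*"])

    def apps(query: str) -> None:
        # Launch an application by name; assumes it is on PATH.
        subprocess.Popen([query])

    MODES = {"web": web, "files": files, "apps": apps}

    if __name__ == "__main__":
        if len(sys.argv) < 3 or sys.argv[1] not in MODES:
            sys.exit("usage: launcher.py [web|files|apps] <query>")
        MODES[sys.argv[1]](" ".join(sys.argv[2:]))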


  • 27 here, back to university too for similar reasons and seeing the same thing.

    I don’t actually blame the lecturers or teachers. A huge part of higher education is self-motivated learning with access to people who are incredibly knowledgeable, who also happen to be your teachers / lecturers. Many lectures are there to guide the topics of independent learning.

    Until a certain point, the purpose of most education was education itself. The latter half of the 20th century into today has seen a shift towards the purpose of university being employment on the other side. This is an enormous difference: it no longer appeals only to people who are passionate about the subject. If 70% of the lecture theatre is there not to learn but to graduate, it changes the learning itself. People by nature want to optimise their tasks to reach their goal; if the goal is to be as educated on the subject as possible, then you’re motivated across the board. If the goal is to get a job and the degree is a checkbox in the process, or even if you’re going because “that’s what you do”, then the motivation is to pass. There is no bare minimum to learning; there is to graduating.

    The goalposts move on difficulty too. Universities are for-profit companies that sell qualifications. Inevitably the difficulty of the qualification will creep downwards, as the expectation of difficulty from the learner does the same.

    I think this has been happening for long enough that in all but the most prestigious or passionate corners of higher education, the staff and teachers also first entered higher education in establishments where everyone was motivated by either employment or profit.

    Don’t get me wrong, I do believe plenty of people in higher education are motivated by education for the sake of it, but it’s no longer the default expectation.


  • I’m guilty of using LLMs from time to time, and more guilty of finding it gradually replacing what I used to Google search.

    If it’s something that Wikipedia can help me with, that’s still my first port of call, but gradually, for anything problem solving related, I just ask an LLM.

    Even a year or two ago, I was googling things with reliable websites for advice at the end, like reddit, but clearly that has decayed as a reputable source for support.

    Googling things that require more than just knowledge is difficult now, and asking the sometimes wrong machine is consistently more useful.


  • I have an ADHD diagnosis, and I do think this is 60% just being better at diagnosing it, but I do also believe ADHD is sort of on the rise.

    There is an incredible book called Scattered Minds by Gabor Maté, which is the significant book on ADHD in the same way that The Body Keeps the Score is for trauma, which delves into the potential ADHD causes beyond it being hereditary.

    Of course modern dopamine-consumerist culture is part of the problem, but it largely makes ADHD symptoms obvious; various unmet attention needs in early childhood are significantly more linked to developing ADHD, and that’s not to fault the parent or other caregiver, who may not have the availability or ability to provide that attention due to modern societal demands. It’s been some years since I read it but I really remember one part clearly: it’s basically impossible to test nature vs nurture in separated-at-birth twins, because the act of separating twins at birth spikes the likelihood of having ADHD so much.

    But honestly I think the largest contributor to increased ADHD cases is not that we’re better at diagnosing it, it’s that modern society increasingly warrants its diagnosis. 12,000 years ago ADHD traits weren’t a disorder, any more than having different physical strength or height to your peers is. Modern capitalist society demands efficiency of its workforce, and ADHD is an inherently inefficient trait, so it suddenly warrants treatment.

    Don’t get me wrong, medication is incredible, and has turned days I’ve barely been able to get out of bed into productive days, but that’s still valuing being productive.


  • When I was still using Instagram reels, I was always amazed how quickly the algorithm figured me out. If I hesitated for even a second on a reel, it would amend my next ones immediately. I assume the real trick is comparing it to the average time spent on a reel: everyone spends longer on a wall-of-text reel, but when I stop on a Linux reel for an extra second, I’m immediately in the top 1% for engagement.

    I read something years ago about how your phone keyboard tracks your recommended words: it knows if you’re more likely to type apple or Apple, or if you type soup more than average, and any app that gets that data and compares it to the baseline has an instant, in-depth profile on you.


  • I’ve seen quite a lot recently saying a particularly distracting aspect of phones isn’t that they’re a screen and a visual stimulus, but a tool and a haptic stimulus.

    An increasingly popular way to combat checking your phone while watching TV is to busy your hands with something. If this works and is widely adopted, we won’t need shows to have second-screen writing repetition; our brains tell our hands to use the tool, and it just so happens that the tool is full of text and speech and occupies the language center of our brains, meaning we stop listening to the show.


    Also, a whole separate thing I often think about: before 2010, there were very few high-budget TV shows. TV was made on a much smaller budget than film, the writing often took a hit too, and that was just the reality of watching TV. Shows were also designed to hook people who were clicking around channels, with lots of recaps and narrative refreshers for people tuning in halfway through; that is like the second-screen writing issues we complain about now, on steroids. Straight-to-TV movies were also terrible for this.

    Movies that were designed for cinema revenue weren’t impacted by this of course, but even DVD-revenue movies often have simpler plots and reiterate their narratives for people who are half watching while chatting or stoned or whatever.



  • Compared to crypto and NFTs, there is at least something in this mix, not that I could identify it.

    I’ve become increasingly comfortable with LLM usage, to the point that myself from last year would hate me. Compared to projects where I used to be deep into Google, Reddit and Wikipedia, ChatGPT gives me pretty good answers much more quickly, and far more tailored to my needs.

    I’m getting into home labs, and currently everything I have runs on ass-old laptops and phones, but I do daydream of the day where I can run an ethically and sustainably trained LLM myself that compares to current GPT-5, because as much as I hate to say it, it’s really useful to my life to have a sometimes incorrect but overall knowledgeable voice that’s perpetually ready to support me.

    The irony is that I’ll never build a server that can run a local LLM due to the price hikes caused by the technology in the first place.





  • As much as I don’t disagree, I think the “Apple is closest to Nazism” comment touches on something different. Other massive American companies have awful practices but they don’t care particularly how their way of making money looks. Apple wields a specific aesthetic power that generally dictates a hegemonic uniformity, one that straddles the line of being to their detriment at times. I don’t think any other big tech company would care in the same way if not for their desire to copy Apple.




  • Blurry photos are fine as a stylistic choice. The 2019 movie The Lighthouse was styled to look like a 1920s film; before modern music intentionally used bitcrushing, it used vinyl crackle; boomer shooters made in this decade intentionally look like 1990s Doom clones.

    When a medium’s shortcoming is patched by technology, it ultimately becomes an artifact of the era where it was accidental. Once a few years have passed, it becomes more synonymous with the era than the mistake.

    It’s not necessarily nostalgia, Gen Alpha and the younger half of Gen Z never grew up without smartphones, so they don’t miss the era of poor film photography. Although every generation does this simulation of forgotten mistakes, it’s particularly poignant now, where the high quality, perfectly lit, professional feeling photos convey something artificial, i.e. smartphone software emulating camera hardware, faces tuned with filters or outright AI generated content. Even if it’s false imperfection, the alternative is false perfection.

    Art using deliberate imperfections that were unavoidable in the past is romanticising something perceived as before commercialism, and that’s admirable.


  • Making it up as you go along isn’t inherently bad. Nine times in ten I prefer a story which is planned out, but basically any medium that’s open to additional seasons, novels, sequels, etc. is capable of falling into this category.

    It’s only really a sin when the medium promises a long-form mystery while doing this, hence the fact Lost is #1 here. Sherlock Holmes was written as episodic mystery, and Arthur Conan Doyle clearly never planned future stories as he went, and nobody minded. Togashi, the manga author of Hunter x Hunter, stumbled into his most famous arc just because he’d made his metaphysics and societies up as he went and the stars aligned, leading to the Chimera Ant arc. The Simpsons rarely ever changes its status quo between episodes, and therefore can be made up as it goes along, because it’s going nowhere. Breaking Bad literally changed the ending of season one to not kill Jesse, partly due to the writers’ strike and subsequent shortening of the season, and Mike as a character exists because Bob Odenkirk was busy.

    Any medium that deceives the audience, promising a well-reasoned, long-form mystery without any planning of what that mystery is, is bad. Perhaps you’ll strike gold and have an epiphany as to how to bring the plot together perfectly, but that’ll just be luck. Ultimately this is an expression of consumerism: baiting the expectations of art and narrative to deceive the audience for nothing more than engagement, and therefore money.



  • I’m trying to make my own smart watch as a hobby experiment at the moment, and one of my most important features is NFC payments. It’s a nightmare, although I understand why. Currently my plan is to buy another smart watch or smart ring and take the NFC chip from it, which is maddening, but more or less my only option due to contactless payment security.

    To do contactless payments, your bank must effectively permit the specific device; otherwise you go through GPay or Apple Pay, who in turn just do the permitting themselves. Anything outside of the standard ecosystem just gets overlooked.

    The best workaround while avoiding these companies is to find a smart watch or ring that has compatibility with a proxy card, such as Curve. But beyond halving the price of the accessory, this is pretty much an arbitrary decision.