• accideath@lemmy.world
    12 days ago

    Most people do know how to use a computer though. Windows and macOS have been around for a very long time by now, and neither has required you to use the CLI for anything but very extreme cases in more than 25 years. You’re not starting with a blank slate. They know how a GUI is supposed to work; it’s self-explanatory to them. Shoving them towards a CLI means making them relearn stuff they already knew how to do. There’s a reason a lot of Windows migrants end up with KDE or Cinnamon: it’s familiar, it’s easy. Most people do in fact associate a cog with settings. CLIs aren’t familiar to most people and are thus a much larger hurdle.

    Also, I’m not talking about fixing problems. The CLI is a perfectly valid tool to fix problems. Not everything has to be graphical, just enough that you don’t need the CLI unless something breaks.

    • Nalivai@lemmy.world
      3 days ago

      Most people do know how to use a computer though.

      That was kind of true for a brief period of time, and even then it wasn’t entirely true. Now most people first encounter a computer when they enter the workforce. They know shit about shit; they never had to tinker with computers, and most of them never had one outside of some Chromebook that let them render two web pages. In most cases they start from a basically blank slate.

      Most people do in fact associate a cog with settings.

      Most people don’t know that it’s a cog. Most people don’t know it’s a button. Most people don’t have the concept of a button in mind. Most people entering the workforce right this moment have never used a mouse to press a cog button in their life, unless they’re in IT or engineering.

      Also, I’m not talking about fixing problems

      That is usually when you’re required to use the console on Linux; that’s why I was talking about it.

      But my broader point was against the so-called intuitive, self-explanatory nature of menus you have to click through with your mouse.

      • accideath@lemmy.world
        3 days ago

        Of course they know how to use a computer. They don’t know a thing about how a computer works but that doesn’t mean they can’t use it. Heck, my 8 y/o cousin can figure out how to open and play Minecraft on his tablet. No need for him to know about commands, programming languages and bits n bytes.

        Most people these days know how to use their phones, at the very least, and even there cog = settings. Most people don’t know how to use a CLI or how a spreadsheet program works, but they certainly can use a browser on a computer. Which is also a form of using a computer.

        And maybe they don’t explicitly know it’s a button. But they know if they tap or click on a cog it takes them to settings.

        And even figuring out how a mouse works takes only a few seconds, if all you’ve used before was a touchscreen (or even nothing at all). There’s a reason mice took off in the first place.

        Although, if someone truly has never used a computer in any shape or form before (no smartphone, no tablet, not even a smart TV), you’d probably have a point that it’s not much more difficult for them to learn the common iconography than it would be to learn the CLI. But people rarely start with such a blank slate today.

        Don’t get me wrong, I don’t think it’s a good thing that people are less and less tech-literate these days. But my point is, tech illiteracy doesn’t mean they have never used any computer ever and don’t know what an app or settings icon is. I’d wager it’s more the other way around: people are so used to their devices working and their UIs looking pretty (and very samey) that iconography like a cog for settings is especially self-explanatory to them. It’s the same on their phone, tablet and even TV, after all.