• bridgeburner@lemmy.world · ↑18 · 14 hours ago

      Works in theory, but not in practice, as there are no tools that can tell 100% reliably whether something was written by AI. The best way, IMO, to test students is via oral exams: let them explain the topics they allegedly wrote about in their thesis. That way you can quickly see who actually bothered to learn and understand, and who just let AI write their thesis.

        • laranis@lemmy.zip · ↑5 ↓1 · 8 hours ago

          This is it. Stop it with these take-home tests. Homework was always bullshit.

          But then professors and teachers would have to think for themselves instead of regurgitating the same lesson plans and worksheets from two decades ago.

          Hot take incoming: good public school teachers are criminally underpaid. Most teachers are paid exactly what they’re worth.

      • dustyData@lemmy.world · ↑15 ↓1 · 13 hours ago

        There was a cool initiative by a professor friend of mine. He would assign a pretty standard homework, but the additional instructions were to complete it using an LLM. The students then had to write, by hand, an analysis of everything the LLM got wrong or could have done better, and then discuss their analysis in class. Participating in the discussion with actually meaningful arguments was half the points; the other half was the quality of the handwritten analysis.

        It was more work, but at least the fuckers quickly appreciated that the machine was actually shit at doing their homework, and that even if it could pass, it would be with the bare minimum. It also separated the students who actually wanted to learn from the slackers who were just wasting their parents' money.

      • Atomic@sh.itjust.works · ↑5 · 13 hours ago

        It’s also important for parents to genuinely take an interest in their children’s education. Help them understand why we don’t use AI for schoolwork, and be there for them when they need help, so they don’t have to resort to AI when it feels hopeless.

        I remember a study we all read when I was working as a substitute teacher (kids aged 7–12): how much time does an average parent spend talking to their child on an average day? Giving commands didn’t count as talking for the purposes of the study.

        Five minutes. It was 5–6 minutes. That explains a lot, doesn’t it?

        • forkDestroyer@infosec.pub · ↑1 · 7 hours ago

          There are plenty of negligent parents out there, but also plenty who don’t have the time because they gotta pay those bills that crept up on them, especially in this economy.

    • Jankatarch@lemmy.world · ↑4 ↓1 · edited 5 hours ago

      They are fine with cheating as long as you are using Microsoft products.

      My uni gives everyone multiple chatbot accounts. (Paid for with our tuition money, no doubt.)

      Also, every general programming event since I became a student has been “Make an app for X purpose, but with AI in it.”

      I’ve seen a TA open up their Grok history when I asked why my answer to an exam question was wrong.

      Universities don’t actually give a shit about “academic integrity”; they simply want to buy a bunch of Microsoft products and call it a day.

      Related rant:
      A professor recently tried a “group exam,” and I ended up working with people who needed ChatGPT for “that one recursive Fibonacci assignment.”
      I think I have changed as a person.
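
      For context, the assignment being mocked here is presumably the classic recursive Fibonacci exercise (the thread doesn’t show the actual assignment, so this is a hypothetical sketch), which fits in a few lines of Python:

      ```python
      def fib(n: int) -> int:
          """Return the n-th Fibonacci number (fib(0) == 0, fib(1) == 1)."""
          if n < 2:
              return n  # base cases: fib(0) = 0, fib(1) = 1
          # naive recursion: exponential number of calls, but that's the point
          # of the textbook exercise
          return fib(n - 1) + fib(n - 2)

      print(fib(10))  # → 55
      ```

      Which is to say: the kind of thing a student is usually expected to be able to write unaided.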