• KittyBobo [he/him, comrade/them]@hexbear.net
    31 points · 8 months ago

    I mean, without AI they’d just be using bad Photoshop. Heck, if they got someone who was actually good at Photoshop and could make realistic propaganda, that’d be worse than AI output that can easily be picked apart.

      • DamarcusArt@lemmygrad.ml
        13 points · 8 months ago

        They didn’t even bother to rotate and angle it properly. It’s like they just copy-pasted the text and called it a day 😂

    • gramathy@lemmy.ml
      1 point · 8 months ago

      I wonder if the forensic techniques used to identify photoshopped images or altered audio also work on AI-generated media?

      I know you can timestamp audio to a specific point in time by matching the frequency of the background electrical hum (if it’s available) against records of the power grid’s frequency over time, so if that hum should be there but isn’t, it could indicate a video or audio clip is fake. I also know that differences in image grain/quality can identify patchwork images. But do those kinds of artifacts also show up in AI-generated images?
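
      For the hum-matching part (usually called electrical network frequency, or ENF, analysis), the extraction step might look roughly like the sketch below. This is just an illustration, assuming Python with numpy/scipy, a mono WAV file, and a 50 Hz grid; the function name, file name, and parameters are placeholders, not anything from the thread.

      ```python
      # Hypothetical sketch of ENF (mains-hum) extraction, not a reference tool.
      import numpy as np
      from scipy.io import wavfile
      from scipy.signal import butter, sosfiltfilt, stft

      def enf_trace(path, mains_hz=50.0, band=1.0):
          """Return (times, peak_frequency) for the hum near the mains frequency."""
          rate, audio = wavfile.read(path)
          if audio.ndim > 1:                 # fold stereo down to mono
              audio = audio.mean(axis=1)
          audio = audio.astype(np.float64)

          # Narrow band-pass around the nominal grid frequency (assumed 50 Hz here).
          sos = butter(4, [mains_hz - band, mains_hz + band],
                       btype="bandpass", fs=rate, output="sos")
          hum = sosfiltfilt(sos, audio)

          # Short-time spectrum; pick the strongest bin near the mains frequency
          # in each window to get a frequency-vs-time trace.
          freqs, times, spec = stft(hum, fs=rate, nperseg=int(rate) * 4)  # 4 s windows
          in_band = (freqs >= mains_hz - band) & (freqs <= mains_hz + band)
          peak = freqs[in_band][np.abs(spec[in_band, :]).argmax(axis=0)]
          return times, peak

      # times, trace = enf_trace("clip.wav")  # compare the trace against grid-frequency logs
      ```

      A flat, missing, or discontinuous trace where hum ought to exist is the kind of inconsistency described above; whether fully AI-generated audio reproduces a plausible trace at all is exactly the open question.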