A U.K. woman was photographed in front of a mirror in which her reflections didn’t match, but not because of a glitch in the Matrix. Instead, it’s a simple iPhone computational photography mistake.

  • e0qdk@kbin.social · 2 years ago

    This story may be amusing, but it’s actually a serious issue if Apple is doing this and people are not aware of it because cellphone imagery is used in things like court cases. Relative positions of people in a scene really fucking matter in those kinds of situations. Someone’s photo of a crime could be dismissed or discredited using this exact news story as an example – or worse, someone could be wrongly convicted because the composite produced a misleading representation of the scene.

    • falkerie71@sh.itjust.works · 2 years ago

      I see your point, though I wouldn’t take it that far. It’s an edge case that has to happen within a very short window.
      Similar effects can be achieved with traditional cameras that use a rolling shutter.
      If you’re only concerned about the relative positions of different people during a time frame, I don’t think you need to be that worried. Being aware of it is enough.

      • Blackmist@feddit.uk · 2 years ago

        With all the image manipulation and generation tools available to even amateurs, I’m not sure how any photography is admissible as evidence these days.

        At some point there’s going to have to be a whole bunch of digital signing (and timestamp signatures) going on inside the camera for things to be even considered.
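
The in-camera signing idea above can be sketched in a few lines. This is a hypothetical illustration, not any real camera’s scheme — real provenance standards such as C2PA use public-key signatures rather than a shared secret, and every name here is invented:

```python
import hashlib
import hmac
import json
import time

# Illustrative only: a key the camera would hold in secure hardware.
CAMERA_KEY = b"secret-key-in-secure-enclave"

def sign_capture(image_bytes: bytes) -> dict:
    """Bind the image content to a capture timestamp with a signature."""
    record = {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "timestamp": int(time.time()),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    """True only if neither the image nor its timestamp was altered."""
    claim = {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "timestamp": record["timestamp"],
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

photo = b"...raw image bytes..."
rec = sign_capture(photo)
assert verify_capture(photo, rec)             # untouched photo verifies
assert not verify_capture(photo + b"x", rec)  # any edit breaks the signature
```

Any tampering with either the pixels or the timestamp invalidates the signature, which is the property a court would care about.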

  • estutweh@aussie.zone · 2 years ago

    Seriously? She almost vomited because the photos didn’t match? Give me a fucking break!

    • elint@programming.dev · 2 years ago

      You think that’s absurd? Have you never gotten married? Wedding photos are extremely important and while “she almost vomited” may be hyperbole, I can definitely understand being very pissed off if that was the only version of the photo. Our wedding photographer whitened our teeth in our photos and we requested that they undo that so we look like ourselves. The sentiment was nice, but we didn’t want that. I would have been pretty unhappy if they hadn’t held onto the originals and were unable to revert our teeth back to their normal shades. Photos of our bridal showers and dress hunting were nearly as important as the wedding photos themselves. I can understand being upset with this undesired result.

  • slaacaa@lemmy.world · 2 years ago

    Uhm, ok?

    The way the girl’s post is written, it’s like she found out Apple made camera lenses from orphans’ retinas (“almost made me vomit on the street”). I assumed it was well known that iPhone takes many photos and stitches the pic together (hence the usually great quality). Now the software made a mistake, resulting in a definitely cool/interesting pic, but that’s it.

    Also, maybe stop flailing your arms around when you want your pic taken in your wedding dress.

  • NaoPb@eviltoast.org · 2 years ago

    Ah yes, I remember noticing it would make like a short video instead of one picture, back when I had an iPhone. I turned that function off because I didn’t see the benefits.

    • KairuByte@lemmy.dbzer0.com · 2 years ago

      That’s not what this is. I also turned that off, it’s called “Live Photo” or something like that. Honestly I find it to be a dumb feature.

      What this is, is the iPhone taking a large number of images and stitching them together for better results.
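
The basic payoff of that stitching can be shown with a toy sketch: averaging several noisy captures of a static scene reduces random sensor noise. This is a deliberate simplification — real pipelines also align frames and merge per region, which is exactly where mismatched moments can sneak into one output image:

```python
import random

random.seed(0)  # deterministic toy run

TRUE_SCENE = [100, 150, 200, 250]  # "true" brightness of four pixels

def noisy_frame(scene):
    """One capture: the scene plus random per-pixel sensor noise."""
    return [p + random.uniform(-20, 20) for p in scene]

def stack(frames):
    """Merge captures by per-pixel averaging (stand-in for real merging)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

single = noisy_frame(TRUE_SCENE)
stacked = stack([noisy_frame(TRUE_SCENE) for _ in range(64)])

err_single = max(abs(a - b) for a, b in zip(single, TRUE_SCENE))
err_stacked = max(abs(a - b) for a, b in zip(stacked, TRUE_SCENE))
# err_stacked comes out far smaller than err_single: the stack is cleaner.
```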

      • jol@discuss.tchncs.de · 2 years ago

        It’s not dumb. It lets you select the best moment within a 1–2 second margin before or after you took the picture.

        • KairuByte@lemmy.dbzer0.com · 2 years ago

          No, these are literally just short videos. You interact with them like photos, you see them as photos, half the time people sending them think they are photos, but when you tap all the way into them they are a short video. They are absolutely not presented as a “choose your exact frame” pre-photo things, they are presented as photos.

          • Blue and Orange@lemm.ee · 2 years ago

            Yeah “Live photo” really is just an Apple marketing term. You interact with them in a certain way on iOS and they are presented in a certain way, but anywhere else they’re just very short videos.

          • locuester@lemmy.zip · 2 years ago

            Wrong. Pretty crazy, but it does let you change which frame is the photo. Tap Edit, then tap the Live Photo icon next to “Cancel”.

            • KairuByte@lemmy.dbzer0.com · 2 years ago

              That isn’t the point of a Live Photo; that’s just a feature. Similar to how YouTube lets you choose a thumbnail for a video, but that’s not really the point of YouTube.

              • locuester@lemmy.zip · 2 years ago

                Per Apple support:

                With Live Photos, your iPhone records what happens 1.5 seconds before and after you take a picture. Then you can pick a different key photo, add a fun effect, edit your Live Photo, and share with your family and friends.

                So it’s actually the first example of what Live Photo is for.

                If you didn’t even know about this, don’t feel bad. I’m an Apple fanboy and my daughter just showed me that it allowed you to do this “different key photo” last month. Kids are good for that.
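
The “pick a different key photo” flow Apple describes can be sketched as scoring each frame in the roughly three-second buffer and choosing the best one. The scoring here — a crude neighbor-difference contrast measure — is invented for illustration; Apple’s actual heuristics are not public:

```python
# Toy sketch: frames are 1-D pixel rows; "best" means highest contrast,
# so a motion-blurred (flat) frame loses to a sharp one.

def contrast(frame):
    """Crude sharpness proxy: total difference between neighboring pixels."""
    return sum(abs(a - b) for a, b in zip(frame, frame[1:]))

def pick_key_frame(frames):
    """Return the index of the frame with the highest contrast score."""
    return max(range(len(frames)), key=lambda i: contrast(frames[i]))

frames = [
    [10, 10, 11, 10],   # motion-blurred: nearly flat
    [0, 200, 5, 190],   # sharp: strong edges
    [50, 60, 55, 52],
]
print(pick_key_frame(frames))  # 1
```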

                • KairuByte@lemmy.dbzer0.com · 2 years ago

                  I’m aware that it’s possible, but it isn’t part of the onboarding or anything. What I mean is, it’s an add-on. It was never part of the original iteration, which was just “look, moving Harry Potter photos.”

                  It’s a gimmick that doesn’t even work cross device, because it’s literally just a short video.

  • jtk@lemmy.sdf.org · 2 years ago

    Who wants photos of a fake reality? Might as well just AI generate them.

    • hitmyspot@aussie.zone · 2 years ago

      A photo is a fake reality. It’s a capture of the world from the perspective of a camera that no person has ever seen.

      Sure we can approximate with viewfinders and colour match as much as possible but it’s not reality. Take a photo of a light bulb, versus look at a light bulb, as one obvious example.

      This is just one more way the timing of different parts of a photo can be inconsistent, in exchange for a capture that overall better matches what we want to see.

      • dan1101@lemm.ee · 2 years ago

        However, I think most cameras and most people have traditionally wanted the most accurate photos possible. If the camera is outputting fiction, that can be a big problem.

        • nyan@lemmy.cafe · 2 years ago

          Oh, dear. No, in most cases people seem to want the prettiest photos possible. Otherwise digital filters wouldn’t be so popular.

    • Chozo@kbin.social · 2 years ago

      To their credit, it’s not “fake”. This isn’t from generative AI, this is from AI picking from multiple different exposures of the same shot and stitching various parts of them together to create the “best” version of the photo.

      Everything seen in the photo was still 100% captured in-lens. Just… not at the exact same time.

    • ByGourou@sh.itjust.works · 2 years ago

    It’s not the case, as someone already explained, but also, who cares about the photo being fake? People take photos to show other people and to keep a memory, and a photo looking better than reality is usually not an issue. I would still prefer a choice with a toggle somewhere, which we will never get with an Apple product.

  • restingboredface@sh.itjust.works · 2 years ago

    I may have missed this in the comments already, but it’s really important to note that the article says the photo was taken in panorama mode, which is why computational photography is even an issue here. If you have ever used panorama mode, you should go in expecting some funkiness, especially if someone in the shot is moving, as the bride apparently was.

  • aeronmelon@lemm.ee · 2 years ago

    It’s a really cool discovery, but I don’t know how Apple is supposed to program against it.

    What surprises me is how much of a time range each photo has to work with. Enough time for Tessa to put down one arm and then the other. It’s basically recording a mini-video and selecting frames from it. I wonder if turning off things like Live Photo (which retroactively starts the video a second or two before you actually press record) would force the Camera app to select from a briefer range of time.

    Maybe they could combine facial recognition with post-processing to tell the software that, if it thinks it’s looking at multiple copies of the same person, it needs to time-sync the sections of frames chosen for the final photo. It wouldn’t be foolproof, but it would be better than nothing.
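
That proposed fix can be sketched as a post-processing pass over the merger’s choices. Everything here — the function name, the region model, the per-region frame indices — is hypothetical; it only illustrates the “snap all regions of one person to one frame” idea:

```python
def sync_person_regions(chosen_frame, person_regions):
    """chosen_frame: region name -> source frame index picked by the merger.
    person_regions: lists of regions that each cover the same person.
    Returns a corrected mapping where each person uses a single frame."""
    fixed = dict(chosen_frame)
    for regions in person_regions:
        frames = [chosen_frame[r] for r in regions]
        # Snap to the frame most of this person's regions already use.
        anchor = max(set(frames), key=frames.count)
        for r in regions:
            fixed[r] = anchor
    return fixed

# The mirror mishap in miniature: the right-hand "reflection" region was
# taken from a different frame than the direct view and left reflection.
chosen = {"direct": 3, "mirror_left": 3, "mirror_right": 7}
people = [["direct", "mirror_left", "mirror_right"]]  # all one person
print(sync_person_regions(chosen, people))
# {'direct': 3, 'mirror_left': 3, 'mirror_right': 3}
```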

    • Petter1@lemm.ee · 2 years ago

      Or maybe just don’t move your arms for the split second while the photo(s) is/are taken… Letting your arms fall by gravity takes less than a second. It’s a funny pic nonetheless.

  • kirklennon@kbin.social · 2 years ago

    This person is an actress and comedian. This is not an iPhone error; it’s just a manually-edited photo from three separate takes that she pretended came out of the phone as-is. It’s a hoax for laughs/attention.

  • RocketBoots@programming.dev · 2 years ago

    Folks, the semiconductor devices in these cameras just work this way. CCD chips (are those still what’s used?) have been a thing for years. They take a bunch of smaller images, stitch them together, and then do some basic post-processing on the device. This is just how these devices function. No one’s writing code to rewrite reality. At most, there is some simple AI being used to touch up things like lighting and skin tone, but it’s the same sort of tech being used on you on a Zoom call.

    The real question to be asked here is did she say yes to the dress.