
We have 3 videos:

2014-05-19 (RegicideAnon, “satellite”, stereo):

direct MP4: here

2014-06-12 (RegicideAnon, “drone”, infrared):

direct MP4: here

2014-08-25 (Area-Alienware, “satellite”, less cropped):

I went looking for oddities which could reveal artificiality.

Frame rate

I don’t really have any video editing software so I split the RA “satellite” video (1) into individual frame images for easier analysis. This command gives 666 MB of output:

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -frames:v 1643 '%04d.png'

(I specified the frame count limit because the second minute of the video is blank anyway.)

The video is encoded at a film standard 24 fps, but the plane and orbs and background noise only update exactly once every fourth frame. I.e., the original satellite(s) apparently captured video at exactly 6 fps. This value seems like an unusual choice regardless of whether it is a real satellite recording or a 3D render.

The cursor, GPS coordinates, and screen panning all move at the full 24 fps. This difference is remarkable. It implies either that a real screen recorder captured playback of a real source video, in which case the frame rate difference between the two is natural, or that someone went to the specific and deliberate extra effort of rendering the fake scene’s motion at 6 fps and then faking the screen panning at 24 fps. It’s odd.

On the other hand, I notice that the plane video updates only on frames whose index is a multiple of 4 (remainder 0). For example, the very first source frame is displayed for exactly 4 frames, not 1, 2, or 3. If it were a natural screen capture, would the timing of the screen recording and the source video be aligned like that? Unless someone knows a natural technical reason why they should align, it has only a 1 in 4 chance of happening by accident. That’s more consistent with a rendered fake.
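I checked this cadence by eye in the PNG dump, but it can also be automated: hash each extracted frame and note where the hash changes. A minimal sketch, with toy hashes standing in for the real PNG bytes:

```python
import hashlib

def change_points(frame_hashes):
    """Indices (0-based) where the displayed image actually changes."""
    return [i for i in range(1, len(frame_hashes))
            if frame_hashes[i] != frame_hashes[i - 1]]

# Toy stand-in for the real frames: a 24 fps stream whose content only
# updates every 4th frame, phase 0, like the satellite video.
frames = [hashlib.md5(str(i // 4).encode()).hexdigest() for i in range(24)]

changes = change_points(frames)
period = changes[1] - changes[0]   # 4 -> source ran at 24/4 = 6 fps
phase = changes[0] % period        # 0 -> updates land on multiples of 4
```

On the real data you would hash the bytes of each numbered PNG from the ffmpeg split instead of these toy strings.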

The Vimeo video (3) was encoded at 30 fps (actually NTSC 29.97003). However, it does not have any new/interesting/different frames; it is simply the same 24 frames duplicated to re-encode it as 30 fps.

The infrared “drone” video (2) is encoded at 24 fps. For the first 1:02, each frame is unique, so this video shows much smoother motion of the orbs. (After 1:02, the video is slow motion replays.) I split the first 1:02 to frames:

ffmpeg -i 'UAV-Captures Airliner and UFOs.mp4' -frames:v 1490 -vf crop=960:720 '%04d.png'

I didn’t really learn much from the drone video. For some reason, I couldn’t synchronize the two videos. I tried measuring the time between two identifiable moments, the arrival of orb #3 and the teleport flash, but the drone video seemed to run about 5% faster (27.29 s vs 28.67 s). I’m not sure if that’s my error or not.
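For what it’s worth, the discrepancy from those two measurements works out like this:

```python
drone = 27.29      # seconds between orb #3's arrival and the flash (drone video)
satellite = 28.67  # the same interval measured in the satellite video

ratio = satellite / drone
print(f"drone interval is shorter by a factor of {ratio:.4f}")  # ~1.05, i.e. ~5% faster
```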

Aspect ratio

The satellite videos are 16:9. The RA version on YouTube contains two stereo images side-by-side, which should be 16:9 each, but are presented squashed to half width. So for the full effect, play it back at 32:9.

The drone video is 4:3, except that the edits or encoding have pillarboxed it into a 16:9 frame (black bars at the sides).

Cropping

The RA video is slightly “cropped”. Or more precisely, there are black bars over the left and right edges of the video when compared with the Vimeo version. (The top and bottom edges show no such difference.)

This blacking out of the edges removes some of the clouds and, notably, the “NROL” text.

This blacking out is done identically in each stereo half.

The width of the blacked out edges is a round number: if each stereo half is displayed at its correct aspect at 1280×720, the black edge bars are exactly 50 pixels wide.
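The bar width can be measured rather than eyeballed: decode a frame to grayscale, threshold it, and count fully-black columns inward from each edge. A numpy sketch (the toy frame stands in for a real decoded 1280×720 stereo half):

```python
import numpy as np

def edge_bar_widths(gray, threshold=8):
    """Count fully-black columns at the left and right edges of a
    grayscale frame (2-D uint8 array)."""
    dark = (gray < threshold).all(axis=0)  # True where an entire column is black
    width = gray.shape[1]
    left = 0
    while left < width and dark[left]:
        left += 1
    right = 0
    while right < width and dark[width - 1 - right]:
        right += 1
    return left, right

# Toy 1280x720 frame with 50-pixel black bars on each side.
frame = np.full((720, 1280), 120, dtype=np.uint8)
frame[:, :50] = 0
frame[:, -50:] = 0
```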

In my opinion this “cropping” cannot be any accident. Note that if the original satellite software displayed a stereo image side-by-side, the screen recording software would not preserve information about the layout of the two stereo parts. So to manually black out the same portion of both halves of the video, it’s not a one-click job. This would have been done carefully, suggesting someone specifically wanted you to notice or not notice the “NROL” text. Or could the cropping be an intrinsic effect of some stereo display software?

The vertical cropping is also curious, cutting the text in half as it does. You can read the GPS coordinates, if and only if you really try. That doesn’t necessarily indicate realness or fakeness, but it suggests deliberateness. Someone wanted you to spend time analyzing this.

Stereo

The RA satellite video is in 3D stereo.

I do not understand the supposed magic that allows satellites hundreds of miles away to capture in stereo, especially in a freely targetable direction, but I also do not understand enough to dispute it.

The Vimeo version of the video is not stereo. I hypothesized that, assuming these were 3D renders, the Vimeo video could be a separate render done with stereo mode switched off, which would render the scene using a centered camera; unless there are 3 satellite cameras, having a center view would reveal it as fake. I overlaid the Vimeo video on top of the RA video and rolled the window opacity up and down to see which half matched. Result: The Vimeo video is the left half of the RA video, and not a separate center render. Demo here. (So this particular test fails to reveal it as fake. If it’s fake, either the creator anticipated this test, or their particular 3D editing workflow didn’t allow for this mistake to creep in anyway.)
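The opacity-overlay test can also be done numerically: take matching frames and check whether the Vimeo frame is closer to the left or the right stereo half by mean absolute pixel difference. A sketch on synthetic frames (real use would compare PNGs from the two ffmpeg splits):

```python
import numpy as np

def closer_half(candidate, left, right):
    """Which stereo half does the candidate frame match better?"""
    dl = np.abs(candidate.astype(int) - left.astype(int)).mean()
    dr = np.abs(candidate.astype(int) - right.astype(int)).mean()
    return "left" if dl < dr else "right"

# Synthetic stand-ins: the "Vimeo" frame is the left half plus a little
# re-encoding noise.
rng = np.random.default_rng(0)
left = rng.integers(0, 256, (720, 1280), dtype=np.uint8)
right = rng.integers(0, 256, (720, 1280), dtype=np.uint8)
vimeo = np.clip(left.astype(int) + rng.integers(-2, 3, left.shape), 0, 255).astype(np.uint8)
```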

Anyway, since we have a 3D image, let’s view it in 3D. This was very difficult because the differences in distance in this scene are so extremely subtle. I spent the day experimenting with different ways of viewing the 3D. First I shifted the left image 5 pixels right, and the right image 5 pixels left; this moved the center pivot point pretty much exactly where the plane is, and makes it easier to tell the difference between things closer than the plane and things further than the plane. Then I combined the left and right halves in a few different ways:

Wobble vision.

Red-cyan anaglyph: red on the right edge = close; cyan on the right edge = far.

Embossed: I simply subtracted the brightness of the pixels in one half from the other. White on the right edge = close; black on the right edge = far.

(Download these files.)

The embossed video is the ugliest but turned out to be the most useful in showing the very subtle differences in distance. In particular, the embossed video shows definitively that the orbs are moving around the plane in 3D. This still doesn’t mean the video is genuine, but it limits the ways it could have been faked. It is not a simple 2D edit of another video.
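Both composites are simple per-pixel operations; a numpy sketch of how I built them (red-from-left is the usual red-cyan glasses convention, and the +128 offset re-centres the embossed difference on mid grey):

```python
import numpy as np

def anaglyph(left_rgb, right_rgb):
    """Red channel from the left eye, green and blue from the right."""
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]
    return out

def embossed(left_gray, right_gray):
    """Signed brightness difference, centred on mid grey.
    White = left brighter; black = right brighter."""
    diff = left_gray.astype(int) - right_gray.astype(int)
    return np.clip(diff + 128, 0, 255).astype(np.uint8)
```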

Background noise

While stepping between frames, I suddenly noticed that the flickering patterns in the random background noise in the sky are much more correlated between the left and right halves of each frame than from one frame to the next. To me this was surprising. This means it cannot be, for example, digital camera CCD noise, because each camera would have a completely different random pattern.

Is this pattern adequately explained by atmospheric perturbations? If you point two identical cameras at the sky, at the same time, will they show matching “random” patterns? How close do the cameras have to be?

Or does this prove that an artificial pseudo-random noise filter, e.g. Perlin noise, was used, and the creator didn’t think to change the seed number for the second half? I am tired; I do not know.
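The correlation claim is at least measurable: crop the same sky patch from the left and right halves of one frame, and from the following frame, and compute Pearson correlations. A sketch with synthetic noise standing in for the real patches (shared “scene” noise plus small independent per-camera noise, which is roughly what the video looks like):

```python
import numpy as np

def correlation(a, b):
    """Pearson correlation between two equal-sized patches."""
    return float(np.corrcoef(a.ravel().astype(float),
                             b.ravel().astype(float))[0, 1])

rng = np.random.default_rng(42)
shared = rng.normal(size=(64, 64))                # noise common to both halves this frame
left = shared + 0.1 * rng.normal(size=(64, 64))   # left half: shared + its own sensor noise
right = shared + 0.1 * rng.normal(size=(64, 64))  # right half: shared + its own sensor noise
nxt = rng.normal(size=(64, 64))                   # an independent next frame

# In the video, left/right correlate strongly while successive frames do not,
# which is what rules out independent per-camera sensor noise.
```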

To do

There are other things I pondered about, but ran out of energy for:

Find earlier source: Because the RA video is stereo and the Vimeo video is less cropped, neither is a superset of the other; neither can be the direct origin of the other. The Vimeo video description says it was “published on a Ufology site” but does not say where.

Thermal analysis: Do the colors in the infrared video make sense for the aircraft involved? (And how is the teleportation flash simultaneously cold yet bright?)

3D: Can satellites really do this?

submitted by /u/midir