My observations on the orb/plane videos (frame rate, aspect ratio, cropping, stereo, background noise), plus 3D versions


We have 3 videos:

2014-05-19 (RegicideAnon, “satellite”, stereo):

https://web.archive.org/web/20140525100932/http://www.youtube.com/watch?v=5Ok1A1fSzxY

direct MP4: here

2014-06-12 (RegicideAnon, “drone”, infrared):

https://web.archive.org/web/20140827060121/https://www.youtube.com/watch?v=ShapuD290K0

direct MP4: here

2014-08-25 (Area-Alienware, “satellite”, less cropped):

https://vimeo.com/104295906

I went looking for oddities which could reveal artificiality.

Frame rate

I don’t really have any video editing software, so I split the RA “satellite” video (1) into individual frame images for easier analysis. This command gives 666 MB of output:

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -frames:v 1643 '%04d.png'

(I specified the frame count limit because the second minute of the video is blank anyway.)

The video is encoded at the film-standard 24 fps, but the plane, the orbs, and the background noise only update exactly once every fourth frame. In other words, the original satellite(s) apparently captured video at exactly 6 fps. That seems like an unusual choice regardless of whether it is a real satellite recording or a 3D render.

The cursor, the GPS coordinates, and the screen panning all move at the full 24 fps. This difference is remarkable. It implies either that a real screen recorder captured playback of a real source video, in which case the frame-rate difference between the two is natural, or that someone went to the specific and deliberate extra effort of rendering the fake scene’s motion at 6 fps and then faking the screen panning at 24 fps. It’s odd.

On the other hand, I notice that the plane video updates only on frames whose index is a multiple of four (remainder 0). For example, the very first source frame is displayed for exactly 4 frames, not 1, 2, or 3. If this were a natural screen capture, would the timing of the screen recording and the source video be aligned like that? Unless someone knows a natural technical reason why they should align, that alignment has only a 1-in-4 chance of happening by accident. That’s more consistent with a rendered fake.
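A quick way to check both the duplication period and its phase is to hash every decoded frame and look at where the hashes change. This assumes the repeated frames decode bit-identically, which the lossy re-encode doesn’t strictly guarantee (if it doesn’t hold, the fuzzier mpdecimate check below still shows the pattern):

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -frames:v 1643 -an -f framemd5 satellite.framemd5

Runs of four consecutive identical MD5 lines, starting at frame 0, would confirm both the 6 fps cadence and the remainder-0 alignment.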

The Vimeo video (3) was encoded at 30 fps (actually NTSC 29.97003). However, it does not contain any new/interesting/different frames; it is simply the same 24 fps footage with frames duplicated to re-encode it at 30 fps.
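The same check can be done without hashing by letting ffmpeg drop near-duplicate frames and reading the final frame= counter it prints (the Vimeo file name here is just a placeholder for whatever the download is called):

ffmpeg -i 'vimeo-satellite.mp4' -vf mpdecimate -an -f null -

If the 30 fps re-encode really contains no new frames, the surviving frame count should collapse to roughly 6 per second of footage.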

The infrared “drone” video (2) is encoded at 24 fps. For the first 1:02, each frame is unique, so this video shows much smoother motion of the orbs. (After 1:02, the video is slow motion replays.) I split the first 1:02 to frames:

ffmpeg -i 'UAV-Captures Airliner and UFOs.mp4' -frames:v 1490 -vf crop=960:720 '%04d.png'

I didn’t really learn much from the drone video. For some reason, I couldn’t synchronize the two videos. I tried measuring the time between two identifiable moments, the arrival of orb #3 and the teleport flash, but the interval in the drone video came out about 5% shorter (27.29 s vs 28.67 s), as if it runs about 5% faster. I’m not sure whether that’s my error or not.
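For anyone who wants to re-measure those intervals, the exact timestamp of every frame can be dumped with ffprobe and the two event frames’ times subtracted (older ffprobe builds call the field pkt_pts_time instead of pts_time):

ffprobe -v error -select_streams v:0 -show_frames -show_entries frame=pts_time -of csv=p=0 'UAV-Captures Airliner and UFOs.mp4'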

Aspect ratio

The satellite videos are 16:9. The RA version on YouTube contains two stereo images side-by-side, which should be 16:9 each, but are presented squashed to half width. So for the full effect, play it back at 32:9.
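If your player can’t be forced to a 32:9 aspect, ffmpeg can bake the stretch in by doubling the width and resetting the sample aspect ratio (output name is arbitrary):

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -vf 'scale=2*iw:ih,setsar=1' satellite-32x9.mp4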

The drone video is 4:3, except that the edits or encoding have pillarboxed it to 16:9 (black bars at the left and right).
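To confirm the crop=960:720 value used above rather than eyeballing it, ffmpeg’s cropdetect filter reports the active picture area, assuming the side bars are true black (round=2 stops it rounding to a multiple of 16; -t 10 just limits it to the first ten seconds):

ffmpeg -i 'UAV-Captures Airliner and UFOs.mp4' -vf cropdetect=round=2 -t 10 -f null -

It logs suggested crop=w:h:x:y values as it runs.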

Cropping

The RA video is slightly “cropped”. Or, more precisely, there are black bars over the left and right edges of the video when compared with the Vimeo version. (There is no difference at the top or bottom edges.)

This blacking out of the edges removes some of the clouds and, notably, the “NROL” text.

This blacking out is done identically in each stereo half.

The width of the blacked out edges is a round number: if each stereo half is displayed at its correct aspect at 1280×720, the black edge bars are exactly 50 pixels wide.
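That 50-pixel figure can be double-checked by running cropdetect on one unsquashed stereo half (round=2 again, so the answer isn’t rounded to a multiple of 16); if the bars are true black it should suggest an active width of about 1180 with an x offset of 50:

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -vf 'crop=iw/2:ih:0:0,scale=1280:720,cropdetect=round=2' -t 10 -f null -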

In my opinion this “cropping” cannot be an accident. Note that if the original satellite software displayed a stereo image side by side, the screen-recording software would not preserve any information about the layout of the two stereo halves. So manually blacking out the same portion of both halves of the video is not a one-click job. It would have been done carefully, which suggests someone specifically wanted you to notice, or not notice, the “NROL” text. Or could the cropping be an intrinsic effect of some stereo display software?

The vertical cropping is also curious, cutting the text in half as it does. You can read the GPS coordinates, if and only if you really try. That doesn’t necessarily indicate realness or fakeness, but it suggests deliberateness. Someone wanted you to spend time analyzing this.

Stereo

The RA satellite video is in 3D stereo.

I do not understand the supposed magic that allows satellites hundreds of miles away to capture in stereo, especially in a freely targetable direction, but I also do not understand enough to dispute it.

The Vimeo version of the video is not stereo. I hypothesized that, assuming these were 3D renders, the Vimeo video could be a separate render done with stereo mode switched off, which would render the scene using a centered camera; unless there are 3 satellite cameras, having a center view would reveal it as fake. I overlaid the Vimeo video on top of the RA video and rolled the window opacity up and down to see which half matched. Result: The Vimeo video is the left half of the RA video, and not a separate center render. Demo here. (So this particular test fails to reveal it as fake. If it’s fake, either the creator anticipated this test, or their particular 3D editing workflow didn’t allow for this mistake to creep in anyway.)
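The same test can be reproduced without the window-opacity trick by difference-blending the Vimeo video against each half of the RA video in turn; whichever comparison comes out essentially black (apart from the blacked-out edge bars and compression noise) is the matching eye. File names are placeholders, and the two clips have to be trimmed to start at the same moment for the result to mean anything:

ffmpeg -i 'vimeo-satellite.mp4' -i 'Satellite Video – Airliner and UFOs.mp4' -filter_complex \
  "[0:v]fps=24,scale=1280:720[vi];[1:v]crop=iw/2:ih:0:0,scale=1280:720[ra];[vi][ra]blend=all_mode=difference" \
  -an vimeo-vs-left.mp4

Change the crop x offset from 0 to iw/2 to test against the right half instead.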

Anyway, since we have a 3D image, let’s view it in 3D. This was very difficult because the differences in distance in this scene are extremely subtle. I spent the day experimenting with different ways of viewing the 3D. First I shifted the left image 5 pixels right and the right image 5 pixels left; this moved the center pivot point almost exactly onto the plane, making it easier to tell the difference between things closer than the plane and things further than the plane. Then I combined the left and right halves in a few different ways:

Wobble vision: https://youtu.be/r0BRmA3Nwt0

Red-cyan anaglyph: https://youtu.be/LWvAoKeXCvw (Red on right edge = close; cyan on right edge = far.)

Embossed: https://youtu.be/1wqxPrLEP_c (I simply subtracted the brightness of the pixels in one half from the other. White on right edge = close; black on right edge = far.)

(Download these files: https://drive.google.com/drive/folders/1cQoYJZ6iIixYBfwkl-dcTkkjJXw0nHzH)
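If anyone wants to reproduce these without dedicated 3D software, ffmpeg alone gets close. Note that these commands skip the 5-pixel convergence shift described above, and the grainextract blend mode is a signed difference centred on mid-grey, so the emboss may not look exactly like my version:

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -vf 'stereo3d=sbs2l:arcd,scale=1280:720,setsar=1' -an anaglyph.mp4

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -filter_complex \
  "[0:v]split[a][b];[a]crop=iw/2:ih:0:0,scale=1280:720[l];[b]crop=iw/2:ih:iw/2:0,scale=1280:720[r];[l][r]blend=all_mode=grainextract" \
  -an embossed.mp4

The first command converts the squashed side-by-side stereo into a Dubois red-cyan anaglyph; the second subtracts one eye from the other to make an embossed-style difference video.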

The embossed video is the ugliest but turned out to be the most useful in showing the very subtle differences in distance. In particular, the embossed video shows definitively that the orbs are moving around the plane in 3D. This still doesn’t mean the video is genuine, but it limits the ways it could have been faked. It is not a simple 2D edit of another video.

Noise

While stepping between frames, I suddenly noticed that the flickering patterns in the random background noise in the sky are much more correlated between the left and right halves of each frame than from one frame to the next. To me this was surprising. This means it cannot be, for example, digital camera CCD noise, because each camera would have a completely different random pattern.

Is this pattern adequately explained by atmospheric perturbations? If you point two identical cameras at the sky, at the same time, will they show matching “random” patterns? How close do the cameras have to be?

Or does this prove that an artificial pseudo-random noise filter (e.g. Perlin noise) was used, and the creator didn’t think to change the seed for the second half? I am tired; I do not know.
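One crude way to put a number on this with ffmpeg alone is to compare SSIM between the two halves of the same frame against SSIM between one source frame and the next (4 encoded frames later). A patch of empty sky should be used so the plane’s motion doesn’t dominate the temporal comparison; the 200×200 patch at offset 100,100 below is just an illustrative guess, not a region I measured, and -t 68 limits the check to the first minute or so, before the blank footage:

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -filter_complex \
  "[0:v]split[a][b];[a]crop=iw/2:ih:0:0,crop=200:200:100:100[l];[b]crop=iw/2:ih:iw/2:0,crop=200:200:100:100[r];[l][r]ssim" \
  -t 68 -f null -

ffmpeg -i 'Satellite Video – Airliner and UFOs.mp4' -filter_complex \
  "[0:v]split[a][b];[a]crop=iw/2:ih:0:0,crop=200:200:100:100[c];[b]crop=iw/2:ih:0:0,crop=200:200:100:100,trim=start_frame=4,setpts=PTS-STARTPTS[d];[c][d]ssim" \
  -t 68 -f null -

If the first (left-vs-right) SSIM comes out clearly higher than the second (4-frames-apart) SSIM over featureless sky, that supports the idea that the “noise” is shared between the two eyes rather than independent per camera.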

To do

There are other things I pondered but ran out of energy for:

Find earlier source: Because the RA video is stereo and the Vimeo video is less cropped, neither is a superset of the other; neither can be the direct origin of the other. The Vimeo video description says it was “published on a Ufology site” but does not say where.

Thermal analysis: Do the colors in the infrared video make sense for the aircraft involved? (And how is the teleportation flash simultaneously cold yet bright?)

3D: Can satellites really do this?

submitted by /u/midir