Two mundane phenomena to investigate: the first part sounds like an issue with optics and/or software; I assume attempting to range something involves moving optical elements. Your description does read a bit like attempting autofocus with a damaged lens system. As for diamonds, this does sound like a kaleidoscope, which again suggests broken optics.
Can you see the edges between the diamond and the background? Typically, the motion occurs when you can see the edges, and disappears when you are far enough away to make the edges too thin to be seen.
Is your camera at equal distance to the object in all photographs?
NeRF has trouble with variation in the distance between the object and camera (which Mip-NeRF [0] solves).
The constant zooming is straight from a specific level in the game. The next level is similar and constantly zooms out. Those are the only two I've experienced so far with that mechanic.
There's a camera wobble effect he's applying that actually moves the camera slightly from side to side - might account for what you're seeing? On the shadertoy code (https://www.shadertoy.com/view/Mdf3zM), remove the #define WOBBLE and you should see a clean zoom.
If you keep zooming in eventually the perspective flips. As you keep zooming in more the perspective gets warped and it creates a bizarre fish-eye effect.
I had about as much fun playing around with perspective warping as I did finding things to look at in ASCII.
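One guess at the mechanism behind the flip (this is my own toy pinhole model, not the actual code): if zoom is implemented as a shrinking field of view that is allowed to cross zero, the projection scale changes sign, mirroring the image, and extreme FOV values warp the projection toward a fish-eye look.

```python
import math

def projection_scale(fov_deg):
    # Pinhole projection: on-screen size is proportional to 1 / tan(fov / 2).
    # Zooming in shrinks the FOV and grows the scale; if the FOV crosses
    # zero, tan changes sign and the image flips.
    return 1.0 / math.tan(math.radians(fov_deg) / 2.0)
```

For example, `projection_scale(90.0)` is 1.0 while `projection_scale(-90.0)` is -1.0: same magnification, but mirrored.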
That's not unusual for computer vision (or any kind of sensor, really); that kind of raw output is normally filtered and smoothed downstream, and merged with previous frames or other sensors.
What would be worrying is the model misclassifying an object, not detecting it at all, or having the bounding box consistently off.
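For illustration, a minimal sketch of the kind of per-frame smoothing meant here, using an exponential moving average over a bounding box (the function name and alpha value are my own, not from any particular pipeline):

```python
def smooth_box(prev, new, alpha=0.3):
    # Exponential moving average over (x, y, w, h);
    # lower alpha means heavier smoothing of frame-to-frame jitter.
    if prev is None:
        return new
    return tuple(alpha * n + (1 - alpha) * p for p, n in zip(prev, new))

# Three noisy detections of roughly the same box:
smoothed = None
for raw in [(10, 10, 50, 50), (14, 9, 52, 48), (11, 11, 49, 51)]:
    smoothed = smooth_box(smoothed, raw)
```

A real tracker would do more (e.g. a Kalman filter with a motion model), but even this damps most of the jitter.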
No, I didn't notice the movement you describe. But the color wheel's content appears to grow and shrink the slightest bit when I move my head towards it and away from it, respectively. Probably a halo effect.
Ha, good catch. I think it's due to an incorrect raw radar point there. This is why we need this tool: to quickly find these issues. There is actually an AR version of this where we have a human-size 3D point cloud, which is much easier to debug [1]
I went down this rabbit hole myself, it's great fun. On our camera we noticed the RGB sensor data lined up well moving right-to-left, but was misaligned when scanning left-to-right, creating a subtle rainbow effect.
I haven't seen that. I have seen a rotating camera of a castle spire with the Z buffer culling backwards. It looked strange as the castle seemed to be rotating the wrong way. I recall the skybox was rendered properly adding to the illusion.
Edit: I realize what I saw is a textured version of what you are describing.
Apparently one common fix for this, according to some Reddit post, is to use framecapping. It might be that i am too biased against delta timing in games (i.e. using the time delta between frames for animations) because i've seen way too many games break using it - it's the reason i run any game older than ~7 years with framecapping enabled. But given the description on the site, i guess what is really happening is this: since the game uses mostly baked lighting, it samples the environment lighting to apply to the entities, and then - since the sampled positions, be they probes or lightmap lumels, are few and in fixed locations - it interpolates them as the entities move, using the time delta between frames to make that interpolation look smooth. Then as the game runs on faster hardware, it ends up with smaller time deltas, which in turn breaks things, because time deltas are evil :-P.
That is obviously a guess (that this is what happens, not that time deltas are evil - that is fact :-P), but i've done something similar in the past for getting light on dynamic entities from a static environment, so some things did click.
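To illustrate the guess above with a toy example (my own code, not the game's): smoothing with a dt-scaled lerp fraction converges at a rate that depends on the frame rate, while an exponential form does not - so the same animation lands in different places on faster hardware.

```python
import math

def broken_step(value, target, speed, dt):
    # dt-scaled lerp fraction: the effective smoothing rate changes
    # with the frame rate - the classic delta-timing trap.
    return value + (target - value) * min(speed * dt, 1.0)

def fixed_step(value, target, speed, dt):
    # Frame-rate-independent form: exponential decay toward the target.
    return target + (value - target) * math.exp(-speed * dt)

def run(step, fps, seconds=0.1, speed=10.0):
    # Simulate `seconds` of wall time at a given frame rate.
    value, target, dt = 0.0, 1.0, 1.0 / fps
    for _ in range(round(seconds * fps)):
        value = step(value, target, speed, dt)
    return value
```

Running both at 30 fps and 240 fps shows the broken version ending up in visibly different places, while the fixed version matches to within floating-point error.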