
Spherical Mapping/Unmapping

Started by napierm 7 years ago · 6 replies · latest reply 7 years ago · 192 views

Hello,

I'm working on a problem where I need to take the image of a ball and map it onto a 3D sphere, then rotate that sphere and map it back to a 2D image.  The idea is to cross-correlate the rotated image with another shot of the ball and figure out the rotation from one picture to the next.

I have this "sort of working" but wind up with distortion from the mapping process.  I get something that is close, but the features are just different enough that the cross-correlation is iffy.

What I'm currently doing is using a spherical projection of the image disk and the haversine formula to do the rotations.  I've tried a few weighting functions to alter the mapping of the disk to the sphere to emphasize the edge details; that changes the shape a little, but not enough for the images to match.

Also, my math is home-grown and doesn't easily map to available matrix math libraries.
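As a point of reference, here is a minimal numpy sketch of one way the disk-to-sphere-and-back pipeline can be written with plain matrix operations. It assumes an orthographic view of the ball (no perspective); the function names and example numbers are made up for illustration and are not the code actually in use:

import numpy as np

def disk_to_sphere(px, py, radius):
    # Map pixel offsets (px, py), measured from the ball's center, to a
    # point on the unit sphere, assuming an orthographic view (no perspective).
    x = px / radius
    y = py / radius
    z2 = 1.0 - x * x - y * y
    if z2 < 0.0:
        return None                        # pixel lies outside the ball
    return np.array([x, y, np.sqrt(z2)])   # +z points toward the camera

def rotation_matrix(axis, angle):
    # 3x3 rotation by `angle` radians about `axis` (Rodrigues' formula).
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    kx, ky, kz = axis
    K = np.array([[0.0, -kz,  ky],
                  [ kz, 0.0, -kx],
                  [-ky,  kx, 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def sphere_to_disk(p, radius):
    # Project a unit-sphere point back to pixel offsets; only the
    # camera-facing hemisphere (z >= 0) is visible.
    if p[2] < 0.0:
        return None
    return p[0] * radius, p[1] * radius

# Example: rotate the ball 10 degrees about the vertical (y) axis.
R = rotation_matrix([0.0, 1.0, 0.0], np.radians(10.0))
p = disk_to_sphere(30.0, -12.0, 100.0)
if p is not None:
    print(sphere_to_disk(R @ p, 100.0))

With the rotation expressed as a 3x3 matrix, the remapping can also be vectorized over all pixels with a single matrix multiply.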

Any good suggestions?

Thanks in advance,

Mark Napier


Reply by Tim Wescott, July 10, 2017

Any arbitrary rotation in 3D, small rotations, rotations in 2D only, what?

This may be a "can't get there from here" sort of problem, at least not without an exhaustive search.  I think you'd need to be able to make a regular polyhedron with thousands of vertices to do it, and those don't exist.

Reply by napierm, July 10, 2017

No, I'm using the same spherical projection mapping that is used to project a bitmap or bump map onto a sphere (or other irregular object) in image processing.  Basically it's just a grid array of the angles phi and theta in 3D polar coordinates, known as a "direct polar" projection.

http://paulbourke.net/geometry/transformationproje...

Then, knowing that I want to rotate some other point on the "globe" to become the new north pole, I use the haversine equations.
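For illustration only, here is a small sketch of one possible convention for that rotate-to-new-pole step: the new colatitude is the haversine great-circle distance to the chosen pole, and the new longitude is the initial bearing from that pole. The function name and the choice of where the new longitude zero lands are assumptions, not necessarily the convention in use here:

import numpy as np

def rotate_to_new_pole(lat, lon, pole_lat, pole_lon):
    # Re-express (lat, lon) in a frame whose north pole sits at
    # (pole_lat, pole_lon).  All angles in radians.
    dlon = lon - pole_lon
    # New colatitude: haversine great-circle distance to the new pole.
    a = np.sin((lat - pole_lat) / 2.0) ** 2 + \
        np.cos(lat) * np.cos(pole_lat) * np.sin(dlon / 2.0) ** 2
    colat = 2.0 * np.arcsin(np.sqrt(a))
    # New longitude: initial bearing from the new pole toward the point
    # (this fixes the otherwise arbitrary zero of the new longitude).
    bearing = np.arctan2(np.sin(dlon) * np.cos(lat),
                         np.cos(pole_lat) * np.sin(lat) -
                         np.sin(pole_lat) * np.cos(lat) * np.cos(dlon))
    return np.pi / 2.0 - colat, bearing   # (new_lat, new_lon)

# Sanity check: the chosen pole itself should land at latitude +90 degrees.
print(np.degrees(rotate_to_new_pole(np.radians(40.0), np.radians(-75.0),
                                    np.radians(40.0), np.radians(-75.0))))

Converting to unit vectors and applying a single 3x3 rotation matrix gives the same result and sidesteps the bearing-convention choice.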

The problem is that a distortion is introduced in my image-to-projection step, so my rotated image doesn't match.  Or, to be honest, it may be in the rotation step.  I need to expand my testing to understand exactly which step is in error.

My question is whether or not there are standard ways of doing this.




Reply by Cedron, July 10, 2017

Your distortion probably comes from failing to account for the perspective view introduced by your camera.

Your ultimate goal is to build a mapping from the pixel location on the screen to the longitude/latitude coordinates on the sphere.

This can be done in several ways.  The particulars of your arrangement might make efficient approaches possible.
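For example, under a simple pinhole-camera assumption (sphere centered on the optical axis, known focal length and distance), the mapping is just a ray-sphere intersection. Everything below, including the function name, the parameter list, and the latitude/longitude conventions, is a hypothetical sketch rather than a prescription:

import numpy as np

def pixel_to_latlon(u, v, f, D, R):
    # Hypothetical pixel -> (lat, lon) mapping for a pinhole camera.
    # (u, v): pixel offsets from the principal point, f: focal length in
    # pixels, D: camera-to-sphere-center distance along +z, R: sphere radius.
    d = np.array([u, v, f], dtype=float)
    d = d / np.linalg.norm(d)            # ray direction through the pixel
    c = np.array([0.0, 0.0, D])          # sphere center on the optical axis
    b = d @ c
    disc = b * b - (c @ c - R * R)       # ray-sphere intersection discriminant
    if disc < 0.0:
        return None                      # this pixel is not on the ball
    t = b - np.sqrt(disc)                # nearest (visible) intersection
    p = (t * d - c) / R                  # surface point, sphere-centered coords
    lat = np.arcsin(np.clip(-p[1], -1.0, 1.0))   # assumes image +v points down
    lon = np.arctan2(p[0], -p[2])                # lon 0 faces the camera
    return lat, lon

# Example: the ball's center pixel should map to (lat, lon) = (0, 0).
print(pixel_to_latlon(0.0, 0.0, f=1000.0, D=2000.0, R=150.0))

If the camera is far from the ball relative to its radius, this degenerates toward the simpler orthographic case.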

Ced

Reply by jimelectr, July 10, 2017

Hmmm... Maybe quaternions could help?

 https://en.wikipedia.org/wiki/Quaternions_and_spat...

Not that I'm an expert or anything.  About all I know is that by "encoding" a 3D rotation into a 4-element array called a quaternion, errors don't accumulate the way they do when you multiply rotation matrices together.
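In case it helps, here is a small self-contained sketch of rotating a 3D point with a quaternion (q * v * conj(q)), using a scalar-first (w, x, y, z) convention chosen just for this example:

import numpy as np

def quat_from_axis_angle(axis, angle):
    # Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`.
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))

def quat_multiply(q1, q2):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_rotate(q, v):
    # Rotate 3-vector v by unit quaternion q:  q * v * conj(q).
    qv = np.concatenate(([0.0], np.asarray(v, dtype=float)))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_multiply(quat_multiply(q, qv), q_conj)[1:]

# Rotate the north pole 30 degrees about the x axis.
q = quat_from_axis_angle([1.0, 0.0, 0.0], np.radians(30.0))
print(quat_rotate(q, [0.0, 0.0, 1.0]))   # -> roughly [0, -0.5, 0.866]

scipy.spatial.transform.Rotation offers the same operation ready-made, though it expects the scalar-last (x, y, z, w) order.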

HTH.

Jim

Reply by napierm, July 10, 2017

Hey Jim,

That looks very nice.

Thank you for the link,

Mark


Reply by Cedron, July 10, 2017

What is the nature of the image?  For instance, is it a generated image or a picture taken by a camera?

I have home-grown math for both cases; it's too complicated to expound on here.  If this is of a commercial nature, I'm currently looking for projects and could do this.

You can contact me through my blog page at the right.

Ced

dsprelated.com/blogs-1/nf/Cedron_Dawg.php

Just click on the little red envelope in the author blurb.