Complex Lens Shader

I’m in need of a lens shader. I want to preface this by saying I know the purpose of a lens shader is to alter the primary ray position and direction from the camera (which is what I need); however, the documentation is practically non-existent and the examples are simplistic at best, so I’m not sure what a lens shader is actually capable of.

At the moment I expect this will be a mental ray shader (other renderers may come into play in the future) used within Maya.

Before getting into all kinds of details, I can explain in plain English what I’m looking for, and hopefully someone can tell me whether or not it’s even possible.

The physical setup:

  1. As you can see from the image, I need to cast a ray from the projector position to where it intersects the screen object; this ray also passes through a lens, not just a projection matrix.

  2. I’ll need to derive a normalized direction vector from the viewer position to that intersection point.

  3. Set the lens shader’s ray origin to the viewer position and its direction to the vector from step 2 (see the sketch after this list).

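To make that concrete, here’s roughly how I picture steps 2 and 3 sitting inside a mental ray lens shader, written in C against shader.h. This is only a sketch: the shader name dome_lens and the parameter viewer_pos are my own, the .mi declaration is omitted, and the step‑1 hit point is just a placeholder, because obtaining it at the lens stage is exactly the part I don’t know how to do.

```c
#include <shader.h>                     /* mental ray shader interface */

/* Hypothetical parameter block -- viewer_pos would be fed in from Maya
 * as a world-space point. The .mi declaration is omitted here.        */
struct dome_lens_params {
    miVector viewer_pos;
};

DLLEXPORT int dome_lens_version(void) { return 1; }

DLLEXPORT miBoolean dome_lens(
    miColor                 *result,
    miState                 *state,
    struct dome_lens_params *params)
{
    miVector viewer_world, viewer_int, hit_int, dir;

    viewer_world = *mi_eval_vector(&params->viewer_pos);

    /* Shaders work in mental ray's internal space, so convert the
     * world-space viewer position first.                              */
    mi_point_from_world(state, &viewer_int, &viewer_world);

    /* PLACEHOLDER: hit_int should be the point where the projector ray
     * (step 1) hits the screen object. Finding that at the lens stage
     * is the open question in this post.                               */
    hit_int = state->point;

    /* Step 2: normalized direction from the viewer to the hit point.   */
    mi_vector_sub(&dir, &hit_int, &viewer_int);
    mi_vector_normalize(&dir);

    /* Step 3: re-fire the eye ray from the viewer position along the
     * new direction.                                                    */
    return mi_trace_eye(result, state, &viewer_int, &dir);
}
```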
The problem I see is that I don’t know whether you can perform ray intersections with geometry during the lens shader stage; the only blurb I found was from a PDF on NVIDIA’s website:
http://docs.autodesk.com/MENTALRAY/2013/JPN/mental-ray-help/files/manual/node103.html
“for example, lens shaders are called before an intersection is done and hence have no information such as the intersection point or the normal there.”
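For what it’s worth, if probe rays do turn out to be legal at the lens stage (that quote makes me doubt it), I imagine step 1 would look something like the helper below. This is untested and partly guesswork on my part: I’m assuming mi_trace_probe can be called from a lens shader and that it writes the hit data into the state it’s handed.

```c
/* Untested sketch: fire a probe ray from the projector toward the screen
 * geometry and return the hit point in world space. mi_trace_probe calls
 * no shaders; as far as I can tell it writes the intersection data into
 * the state it is given, so I work on a copy to avoid clobbering the
 * real lens-shader state. proj_org and proj_dir are assumed to already
 * be in internal space.                                                  */
static miBoolean probe_screen(
    miState        *state,
    const miVector *proj_org,      /* projector position                */
    const miVector *proj_dir,      /* per-pixel projector ray direction */
    miVector       *hit_world)     /* out: intersection in world space  */
{
    miState  probe = *state;       /* scratch copy of the state         */
    miVector org   = *proj_org;
    miVector dir   = *proj_dir;

    if (!mi_trace_probe(&probe, &dir, &org))
        return miFALSE;            /* projector ray missed the screen   */

    /* probe.point should now hold the hit point in internal space.     */
    mi_point_to_world(&probe, hit_world, &probe.point);
    return miTRUE;
}
```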

Not to mention, I don’t know if I’ll ever be able to define this custom projector lens in terms of an equation in the first place.
If I can’t, is it possible within a lens shader to sample an object at a UV value and return the world position from it?

All the math for this is easy (especially if I can look up the UV location on the object), but I have no idea how to implement it within the rendering process. (Would a refraction shader do the trick?)

I’ve created a temporary workaround by laying out two UV sets on my model: one from the projector pixel map, and another using a 180° fisheye lens map. From those I baked an offset map (the UVs of one set baked into the other) that can be used with Nuke’s STMap node to warp a fisheye render taken at dome center into the projector’s frame (there’s a rough sketch of that lookup after the list below). Doing this, however, has two drawbacks:

  1. Doing the warp post-render degrades image quality and introduces another processing step before the render can be viewed.
  2. Dome center is not really my viewer’s perspective, so some image distortion can occur (this could be fixed by altering the UVs).
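For anyone unfamiliar with the trick, the STMap warp boils down to a per-pixel lookup like the sketch below (single channel, nearest-neighbour for brevity; the real node does filtered sampling, which is where part of the quality loss in drawback 1 comes from):

```c
/* Minimal illustration of what an STMap warp does: for every output
 * pixel, the offset map supplies (u, v) in [0,1], and we sample the
 * source image there. Nearest-neighbour here for brevity.            */
void stmap_warp(const float *src, int sw, int sh,       /* source image */
                const float *map_u, const float *map_v, /* offset maps  */
                float *dst, int dw, int dh)             /* output image */
{
    for (int y = 0; y < dh; ++y) {
        for (int x = 0; x < dw; ++x) {
            float u = map_u[y * dw + x];
            float v = map_v[y * dw + x];
            int sx = (int)(u * (sw - 1) + 0.5f);
            int sy = (int)(v * (sh - 1) + 0.5f);
            if (sx < 0)   sx = 0;
            if (sx >= sw) sx = sw - 1;
            if (sy < 0)   sy = 0;
            if (sy >= sh) sy = sh - 1;
            dst[y * dw + x] = src[sy * sw + sx];
        }
    }
}
```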

Anyone have any thoughts or suggestions (perhaps mental ray isn’t the tool for this job)?
Thanks in advance.