Parameters for 180-degree FOV for fisheye camera

Hello, I am trying to obtain photorealistic images from the Unreal simulation; specifically, I need omnidirectional images. I am trying to use the 3D Fisheye Camera block, which has the following properties:
Distortion center, image size, mapping coefficients, stretch matrix.
I looked into Scaramuzza's model, but I cannot seem to understand how these properties relate to the FOV.
I want to obtain a 180-degree FOV. What would the desired coefficients, stretch matrix, etc., be?

Accepted Answer

Hassaan on 15 Jan 2024
@Maciej Jankowski I know this is a very long answer, but hopefully it is a step in the right direction. These are some of the steps I followed for one of my clients. Thanks.
The 3D Fisheye Camera block used with the Unreal Engine simulation environment simulates a fisheye lens camera, which captures a very wide angle of view. To achieve a photorealistic omnidirectional image with a 180-degree field of view (FOV), you need to adjust the properties of the fisheye camera to match the characteristics of a real fisheye lens that produces a hemispherical image.
Here's a breakdown of the properties you mentioned (a short configuration sketch follows the list):
  1. Distortion Center: The point about which the fisheye distortion is symmetric. For most fisheye lenses, this is the center of the image sensor (width/2, height/2).
  2. Image Size: The resolution of the camera output. The size you choose depends on your requirements for image detail and the capabilities of your rendering setup.
  3. Mapping Coefficients: These are the parameters that define the fisheye distortion. They are part of the equation that maps the 3D world points to the 2D image sensor.
  4. Stretch Matrix: This is used to adjust the aspect ratio of the resulting fisheye image. For a circular fisheye image, this would be an identity matrix because you don't want to stretch the image in any direction. If your fisheye image is elliptical, you might need to adjust this matrix to compensate and make it circular.
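If you have Computer Vision Toolbox, the same four quantities also appear on its fisheyeIntrinsics object, which uses Scaramuzza's parameterization. A minimal sketch of how they fit together (the numeric values are placeholders, not a calibrated lens):
% Minimal sketch using Computer Vision Toolbox's fisheyeIntrinsics object,
% which takes the same four quantities as the block. Values are examples only.
imageSize        = [1080 1920];                      % [rows cols] in pixels
distortionCenter = [imageSize(2)/2 imageSize(1)/2];  % [cx cy] = image center, in pixels
mappingCoeffs    = [540 -1.6e-3 0 0];                % [a0 a2 a3 a4], placeholder values
stretchMatrix    = eye(2);                           % identity: circular image, no stretch/skew
intrinsics = fisheyeIntrinsics(mappingCoeffs, imageSize, distortionCenter, stretchMatrix)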
Scaramuzza's camera model provides a way to represent fisheye distortion through a polynomial equation. The coefficients in this model define how radial distortion operates on the image.
To achieve a 180-degree FOV, the mapping coefficients will need to be determined experimentally or derived from the lens equations that correspond to a fisheye lens with such a FOV. A true 180-degree fisheye lens will usually have specific lens distortion characteristics that you would need to replicate in the simulation.
A general approach to configuring the 3D Fisheye camera block for a 180-degree FOV:
  • Distortion Center: Set this to the center of your image size.
  • Image Size: Choose based on desired resolution (e.g., 1920x1080).
  • Mapping Coefficients: You will need to calibrate these based on the specific fisheye distortion you want to simulate. This might involve some trial and error or the use of calibration tools that can analyze a real fisheye lens and provide the coefficients (see the sketch after this list).
  • Stretch Matrix: Set this to the identity matrix if you want a circular fisheye image without stretching.
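As a quick sanity check during that trial and error, you can compute the FOV implied by a candidate set of mapping coefficients. The sketch below assumes the Scaramuzza back-projection used by fisheyeIntrinsics, where a pixel at radius rho from the distortion center maps to the ray [u, v, f(rho)] with f(rho) = a0 + a2*rho^2 + a3*rho^3 + a4*rho^4 (there is no linear term); the numbers are placeholders.
% Sketch: estimate the FOV implied by candidate mapping coefficients,
% assuming Scaramuzza's back-projection f(rho) = a0 + a2*rho^2 + a3*rho^3 + a4*rho^4.
coeffs = [540 -1.6e-3 0 0];            % candidate [a0 a2 a3 a4] (placeholder values)
rhoMax = 540;                          % image-circle radius in pixels (assumption)
f      = @(rho) coeffs(1) + coeffs(2)*rho.^2 + coeffs(3)*rho.^3 + coeffs(4)*rho.^4;
fovDeg = 2*atan2d(rhoMax, f(rhoMax))   % off-axis angle of the edge ray, doubled
If fovDeg comes out below 180, make a2 more negative (or adjust the higher-order terms) and check again.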
For a fisheye lens simulating a 180-degree FOV, the mapping coefficients will depend on the specific distortion model you're using. A common model is the polynomial model where the radial distortion can be represented as:
r(u) = k1*u + k2*u^3 + k3*u^5 + ...
Here, r(u) is the distorted radius, u is the undistorted radius, and k1, k2, k3, ... are the distortion coefficients.
You will need to adjust your mapping coefficients to ensure that lines that are supposed to be straight (excluding those that go through the distortion center) are curved to the correct extent, thereby ensuring a realistic simulation of a fisheye lens with a 180-degree FOV.
Specifically, for a 180-degree FOV, the coefficients should be set such that the edge of the fisheye circle corresponds to the edge of the hemisphere in your 3D scene. This would mean that if you drew a straight line from the center of the image to the edge, the length of this line (in image space) would map to a 90-degree angle from the optical axis in the scene space.
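To make that concrete under the same Scaramuzza assumption as above (ray [u, v, f(rho)] with f(rho) = a0 + a2*rho^2 + a3*rho^3 + a4*rho^4): a 180-degree FOV requires f(rhoMax) = 0 at the image-circle radius rhoMax, because the edge ray must be perpendicular to the optical axis. If you keep only a0 and a2 nonzero, that gives a2 = -a0/rhoMax^2. A hedged sketch:
% Sketch: choose a2 so the edge of the image circle maps to 90 degrees off-axis
% (f(rhoMax) = 0), i.e. a 180-degree FOV. Values are examples only.
a0     = 540;                % roughly the focal length in pixels (assumption)
rhoMax = 540;                % image-circle radius in pixels, e.g. half of a 1080-pixel image height
a2     = -a0/rhoMax^2;       % with a3 = a4 = 0, this solves a0 + a2*rhoMax^2 = 0
a3     = 0;  a4 = 0;
f      = @(rho) a0 + a2*rho.^2 + a3*rho.^3 + a4*rho.^4;
edgeAngleDeg = atan2d(rhoMax, f(rhoMax))   % should display 90 (half of the 180-degree FOV)
A real 180-degree lens will generally also need nonzero a3 and a4 to match its particular projection (equidistant, equisolid-angle, etc.), which is where the calibration discussed next comes in.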
The coefficients are typically found through a calibration process using a real-world fisheye lens or are provided by the lens manufacturer. If Unreal Engine provides a calibration tool or preset fisheye lens configurations, those could be a good starting point.
Once you have set up your camera, you can verify that you are getting a 180-degree FOV by checking that the rendered image captures half of the entire 360-degree environment around the camera's position. If the camera setup is correct, the image should show this hemispherical view without any gaps or missing areas.
------------------------------------------------------------------------------------------------------------------------------------------------
If you find this solution helpful and it resolves your issue, it would be greatly appreciated if you could accept the answer. Leaving an upvote or a comment is also a wonderful way to provide feedback.
Professional Interests
  • Technical Services and Consulting
  • Embedded Systems | Firmware Development | Simulations
  • Electrical and Electronics Engineering
Feel free to contact me.

More Answers (0)
