Camera calibration aims to identify the geometric characteristics of the image formation process. A camera is characterised by intrinsic parameters such as axis skew, focal length, and principal point, while its position and orientation are described by extrinsic parameters, namely rotation and translation. Linear or nonlinear algorithms estimate the intrinsic and extrinsic parameters from known points in the real world and their projections onto the image plane.
Camera calibration can be defined as the technique used to estimate the characteristics of a camera: obtaining the parameters (coefficients) needed to establish an accurate relationship between a 3D point in the real world and its corresponding 2D projection in the image captured by that camera.
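As a rough sketch of that estimation step, the snippet below uses OpenCV's cv2.calibrateCamera on images of a planar chessboard target; the 9x6 pattern size and the calib_images/ directory are placeholder assumptions, not values from the sources cited below.

```python
import glob

import cv2
import numpy as np

# 3D coordinates of the chessboard corners in the pattern's own frame
# (all on the z = 0 plane, in "square" units).
pattern_size = (9, 6)  # inner corners per row/column (assumed)
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)

objpoints = []  # known 3D points in the real world
imgpoints = []  # their 2D projections in the image plane

for fname in glob.glob("calib_images/*.jpg"):  # placeholder path
    img = cv2.imread(fname)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern_size, None)
    if found:
        objpoints.append(objp)
        imgpoints.append(corners)

# Estimates the intrinsic matrix, the distortion coefficients and, for each
# view, the extrinsic rotation/translation vectors.
ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, gray.shape[::-1], None, None
)
print("camera matrix:\n", mtx)
print("distortion coefficients:", dist.ravel())
```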
Intrinsic Parameters
https://mphy0026.readthedocs.io/en/latest/calibration/camera_calibration.html
Intrinsic parameters include (see the sketch after this list):
- Scale factor (often equal to 1)
- Focal length (distance between the centre of projection and the image plane)
- Principal point (often assumed to be at the centre of the image, i.e. where the optical axis intersects the image plane)
- Skew (when the two image axes are not exactly perpendicular)
- Geometric distortion (due to the lens)
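To make the role of these parameters concrete, here is a minimal NumPy sketch of how focal length, principal point and skew are commonly assembled into an intrinsic matrix; the numeric values are purely illustrative, and lens distortion is handled separately via distortion coefficients rather than inside this matrix.

```python
import numpy as np

# Illustrative values only; a real camera's parameters come from calibration.
fx, fy = 800.0, 805.0   # focal lengths expressed in pixels
cx, cy = 320.0, 240.0   # principal point
skew = 0.0              # skew between the two image axes, usually ~0

# Intrinsic matrix combining focal length, principal point and skew.
K = np.array([
    [fx,  skew, cx],
    [0.0, fy,   cy],
    [0.0, 0.0,  1.0],
])

# Project a 3D point given in the camera frame onto the image plane.
point_cam = np.array([0.1, -0.05, 2.0])
u, v, w = K @ point_cam
print("pixel coordinates:", u / w, v / w)
```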
OpenCV - Camera Calibration
- https://docs.opencv.org/4.x/dc/dbb/tutorial_py_calibration.html
- https://www.mathworks.com/help/vision/ug/camera-calibration.html
Intrinsic parameters are specific to a camera. They include information like the focal length (fx, fy) and the optical centre (cx, cy). The focal length and optical centre can be used to build a camera matrix, which, together with the distortion coefficients, can be used to remove the distortion caused by the lens of that specific camera. The camera matrix is unique to a specific camera, so once calculated it can be reused on other images taken by the same camera. It is expressed as a 3x3 matrix:

$$
\text{camera matrix} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}
$$
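As one possible way to reuse these parameters, the sketch below loads a previously saved camera matrix and distortion coefficients and removes the lens distortion from a new image taken with the same camera; the calibration.npz and image file names are assumptions.

```python
import cv2
import numpy as np

# Assumes the camera matrix and distortion coefficients from an earlier
# calibration of this camera were saved to disk (placeholder file names).
data = np.load("calibration.npz")
mtx, dist = data["mtx"], data["dist"]

img = cv2.imread("photo_from_same_camera.jpg")
h, w = img.shape[:2]

# Refine the camera matrix for this image size, then undo the lens distortion.
new_mtx, roi = cv2.getOptimalNewCameraMatrix(mtx, dist, (w, h), 1, (w, h))
undistorted = cv2.undistort(img, mtx, dist, None, new_mtx)

x, y, rw, rh = roi  # valid-pixel region after undistortion
cv2.imwrite("undistorted.jpg", undistorted[y:y + rh, x:x + rw])
```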
Extrinsic parameters correspond to rotation and translation vectors, which transform the coordinates of a 3D point from the world coordinate system into the camera coordinate system.
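To illustrate, the sketch below recovers such rotation and translation vectors with OpenCV's cv2.solvePnP from a handful of known 3D-2D correspondences; all of the point coordinates and the camera matrix here are made-up placeholder values.

```python
import cv2
import numpy as np

# Hypothetical inputs: known 3D points in the world frame, their observed
# 2D projections, and intrinsics from a previous calibration.
object_points = np.array(
    [[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=np.float32
)
image_points = np.array(
    [[320, 240], [400, 242], [398, 320], [322, 318]], dtype=np.float32
)
mtx = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
dist = np.zeros(5)  # assume negligible lens distortion for this example

# Recover the extrinsic rotation and translation of the camera with respect
# to the world frame defined by object_points.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, mtx, dist)
R, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix from the rotation vector
print("R:\n", R)
print("t:", tvec.ravel())
```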