Derivation of the OpenGL Perspective Projection Matrix
2022-04-23 16:41:28 【itzyjr】
Computer monitors are two-dimensional surfaces, so a 3D scene rendered by OpenGL must be projected onto the screen as a 2D image. The projection matrix is used for this projection transformation. First, it transforms all vertex data from eye coordinates to clip coordinates. Then, these clip coordinates are converted to normalized device coordinates (NDC) by dividing by the w component of the clip coordinates.
Clip coordinates: the eye coordinates are multiplied by the projection matrix and become clip coordinates. The projection matrix defines the viewing frustum, i.e., how vertex data is projected onto the screen (perspective or orthographic). They are called clip coordinates because the transformed vertex components (x, y, z) are clipped by comparing them with ±w_clip.
Frustum culling (clipping) is performed in clip coordinates, just before the division by w_clip. The clip coordinates x_clip, y_clip, and z_clip are tested by comparison with w_clip. Because dividing by w_clip yields normalized NDC coordinates, we must satisfy -1 <= x_clip/w_clip <= 1, so x_clip ∈ [-w_clip, w_clip]; likewise y_clip ∈ [-w_clip, w_clip] and z_clip ∈ [-w_clip, w_clip]. If any clip coordinate is less than -w_clip or greater than +w_clip, the vertex is discarded (clipped).
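As a minimal sketch of this test (`inside_clip_volume` is a hypothetical helper, not part of the GL API), a clip-space vertex survives clipping only if every component lies in [-w_clip, +w_clip]:

```c
#include <assert.h>
#include <stdbool.h>

/* Returns true if a clip-space vertex (x, y, z, w) lies inside the
   clip volume, i.e. every component is within [-w, +w]. */
bool inside_clip_volume(float x, float y, float z, float w)
{
    return -w <= x && x <= w &&
           -w <= y && y <= w &&
           -w <= z && z <= w;
}
```

In the real pipeline, primitives straddling the boundary are clipped against the planes rather than discarded whole; this sketch only shows the per-vertex inside test described above.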
We must therefore remember that clipping (frustum culling) and the NDC transformation are both built into the projection matrix. The following shows how to construct the projection matrix from six parameters: the left, right, bottom, top, near, and far boundary values.
OpenGL then reconstructs the edges of the clipped polygon (see the two red lines in the figure below).
The gray area in the figure below contains the points that are kept rather than discarded; they satisfy x_c, y_c, z_c ∈ (-w_c, w_c).
The following figure shows the perspective frustum and normalized device coordinates (NDC):
In perspective projection, a 3D point inside the truncated-pyramid frustum (eye coordinates) is mapped to a cube (NDC): x maps from [l, r] to [-1, 1], y from [b, t] to [-1, 1], and z from [-n, -f] to [-1, 1].
void glFrustum
(
    // Coordinates of the left and right vertical clipping planes.
    GLdouble left, GLdouble right,
    // Coordinates of the bottom and top horizontal clipping planes.
    GLdouble bottom, GLdouble top,
    // Distances to the near and far depth clipping planes; both must be positive.
    GLdouble nearVal, GLdouble farVal
);
Note that eye coordinates are defined in a right-handed coordinate system, while NDC uses a left-handed coordinate system. That is, the camera at the origin looks along the -Z axis in eye space, but along the +Z axis in NDC. Because glFrustum() accepts only positive values of near and far, we need to negate them while constructing the projection matrix.
In OpenGL, a 3D point in eye space is projected onto the near plane (the projection plane). The following figure shows how a point (x_e, y_e, z_e) in eye space is projected to (x_p, y_p, z_p) on the near plane.
From the top view of the frustum, the x coordinate in eye space, x_e, is mapped to x_p, which is computed using the ratio of similar triangles:
From the side view of the frustum, y_p is computed in a similar way:
Note that both x_p and y_p depend on z_e: they are inversely proportional to -z_e. In other words, they are both divided by -z_e. This is the first clue for constructing the projection matrix. After the eye coordinates are transformed by multiplying by the projection matrix, the clip coordinates are still homogeneous; they finally become normalized device coordinates (NDC) after division by the w component of the clip coordinates. (For more details, see OpenGL_Transformation.)
Therefore, we can set the w component of the clip coordinates to -z_e, and the 4th row of the projection matrix becomes (0, 0, -1, 0).
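Written out in standard notation (a standard reconstruction of the similar-triangle relations; the camera looks down -Z, so -z_e is the positive distance to the point):

```latex
\frac{x_p}{x_e} = \frac{-n}{z_e}
  \;\Rightarrow\;
x_p = \frac{n \, x_e}{-z_e}, \qquad
y_p = \frac{n \, y_e}{-z_e}, \qquad
w_c = -z_e
```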
Next, we use linear relationships to map x_p and y_p to x_n and y_n of NDC: [l, r] ⇒ [-1, 1] and [b, t] ⇒ [-1, 1].
Then we substitute x_p and y_p into the equations above.
Note that for the perspective division (x_c/w_c, y_c/w_c) we arrange both terms of each equation to be divided by -z_e. Since we set w_c to -z_e, the terms inside the parentheses become x_c and y_c of the clip coordinates.
From these equations, we can find rows 1 and 2 of the projection matrix.
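The mapping and the substitution can be written out as follows (a standard reconstruction; y_n is analogous, with b and t in place of l and r):

```latex
x_n = \frac{2 x_p}{r-l} - \frac{r+l}{r-l},
\qquad
x_p = \frac{n\,x_e}{-z_e}
\;\Rightarrow\;
x_n = \underbrace{\left( \frac{2n}{r-l}\,x_e + \frac{r+l}{r-l}\,z_e \right)}_{x_c} \Big/ (-z_e)
```

This gives row 1 as (2n/(r-l), 0, (r+l)/(r-l), 0) and row 2 as (0, 2n/(t-b), (t+b)/(t-b), 0).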
Now we only need to solve for row 3 of the projection matrix. Finding z_n is a little different from the others, because z_e in eye space is always projected to -n on the near plane, yet we need a unique z value for clipping and depth testing. Besides, we should be able to undo the projection (the inverse transformation). Since we know z does not depend on the x or y values, we borrow the w component to find the relationship between z_n and z_e. Therefore, we can specify row 3 of the projection matrix as (0, 0, A, B).
In eye space, w_e equals 1, so the equation becomes:
z_n = z_c / w_c = (A·z_e + B) / (-z_e)
To find the coefficients A and B, we use the known (z_e, z_n) pairs (-n, -1) and (-f, 1), substitute them into the equation above, and solve for A and B.
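Substituting the two boundary pairs into z_n = (A·z_e + B)/(-z_e) and solving the resulting system (a standard reconstruction of the step the original carried in a figure):

```latex
\begin{cases}
\dfrac{-A n + B}{n} = -1\\[2ex]
\dfrac{-A f + B}{f} = \;\;\,1
\end{cases}
\;\Rightarrow\;
A = -\frac{f+n}{f-n}, \qquad
B = -\frac{2fn}{f-n}
```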
With these, the relationship between z_e and z_n becomes (equation (3)):
z_n = ( -((f+n)/(f-n))·z_e - (2fn/(f-n)) ) / (-z_e)
Finally, we have found all the entries of the projection matrix. The complete projection matrix is:

| 2n/(r-l)   0          (r+l)/(r-l)    0           |
| 0          2n/(t-b)   (t+b)/(t-b)    0           |
| 0          0          -(f+n)/(f-n)   -2fn/(f-n)  |
| 0          0          -1             0           |
This projection matrix works for a general frustum. If the viewing volume is symmetric, i.e., r = -l and t = -b, it simplifies to:

| n/r   0     0              0           |
| 0     n/t   0              0           |
| 0     0     -(f+n)/(f-n)   -2fn/(f-n)  |
| 0     0     -1             0           |
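The complete matrix can be sketched in code; `frustum` is a hypothetical helper (not a GL call) that fills a column-major 4×4 array, the layout OpenGL expects:

```c
#include <assert.h>

/* Builds the general glFrustum-style perspective projection matrix
   derived above. m is a 16-element array in column-major order,
   so element (row i, col j), 0-based, is m[j*4 + i]. */
void frustum(double l, double r, double b, double t,
             double n, double f, double m[16])
{
    for (int i = 0; i < 16; ++i) m[i] = 0.0;
    m[0]  = 2.0 * n / (r - l);        /* row 1, col 1 */
    m[5]  = 2.0 * n / (t - b);        /* row 2, col 2 */
    m[8]  = (r + l) / (r - l);        /* row 1, col 3 */
    m[9]  = (t + b) / (t - b);        /* row 2, col 3 */
    m[10] = -(f + n) / (f - n);       /* row 3, col 3 */
    m[11] = -1.0;                     /* row 4, col 3 */
    m[14] = -2.0 * f * n / (f - n);   /* row 3, col 4 */
}
```

With l = -1, r = 1, b = -1, t = 1, n = 1, the first two diagonal entries come out as 2n/(r-l) = 1 and 2n/(t-b) = 1, matching the matrix above.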
Before moving on, look again at the relationship between z_e and z_n, equation (3). You can see it is a rational function, so the relationship between z_e and z_n is nonlinear (see the figure below). This means the precision is very high near the near plane but very low near the far plane. If the range [n, f] grows larger, it causes depth-precision problems (z-fighting): near the far plane, small changes in z_e barely affect the value of z_n. The distance between n and f should therefore be kept as short as possible to minimize depth-buffer precision problems.
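A small sketch of equation (3) makes this nonlinearity concrete; `ndc_depth` is a hypothetical helper. With n = 1 and f = 1000, even the midpoint of the eye-space depth range already lands very close to z_n = 1:

```c
#include <assert.h>
#include <math.h>

/* Maps an eye-space depth ze (negative, in front of the camera) to
   NDC depth via equation (3): z_n = (A*ze + B) / (-ze). */
double ndc_depth(double ze, double n, double f)
{
    double A = -(f + n) / (f - n);
    double B = -2.0 * f * n / (f - n);
    return (A * ze + B) / (-ze);   /* perspective divide by w_c = -ze */
}
```

ndc_depth(-1, 1, 1000) gives -1 and ndc_depth(-1000, 1, 1000) gives 1 as expected, while the halfway depth ze = -500.5 maps to roughly 0.998: nearly the whole far half of the frustum is squeezed into a sliver of the depth range.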
Specifying the perspective projection with a field of view (FOV):
h = 2 × near × tan(θ/2)
w = h × aspect;
Relating this to the perspective projection matrix above:
r = w/2, t = h/2
Therefore:
n/r = (2×n)/w
    = (2×n)/(h × aspect)
    = (2×n)/(2 × n × tan(θ/2) × aspect)
    = cot(θ/2)/aspect
Similarly, n/t = cot(θ/2).
Then the perspective projection matrix is:

| cot(θ/2)/aspect   0          0              0           |
| 0                 cot(θ/2)   0              0           |
| 0                 0          -(f+n)/(f-n)   -2fn/(f-n)  |
| 0                 0          -1             0           |
Copyright notice: this article was created by [itzyjr]; please include a link to the original when reposting. Thanks.