Product Quantization (PQ)
2022-08-09 10:47:00 【qq_26391203】
How product quantization is used in image retrieval: after the quantizers have been learned, the asymmetric distance between a given query sample and the samples in the database can be computed by table lookup.
A brief description of product quantization: the typical representative of vector quantization methods is Product Quantization (PQ). PQ decomposes the feature space into a Cartesian product of several low-dimensional subspaces and quantizes each subspace separately. In the training phase, each subspace is clustered to obtain k centroids (i.e., the quantizers of that subspace); the Cartesian product of all the subspace centroids forms a dense partition of the whole space, which keeps the quantization error relatively small. After training, for a given query sample, the asymmetric distance between the query and each sample in the database can be computed by table lookup, enabling approximate nearest neighbor search.

K-means clustering: clustering belongs to unsupervised learning. The methods covered earlier, such as regression, Naive Bayes, and SVM, all have a class label y, i.e., the class of each sample is given in the training data. Clustered samples carry no label y, only the features x. For example, suppose the stars in the universe are represented as a point set in three-dimensional space. The purpose of clustering is to find the latent class y of each sample x and to group samples sharing the same class y together. For the stars above, the clustering result is a set of star clusters: points within a cluster are close to each other, while the clusters themselves are relatively far apart.
- Product quantization process idea: https://www.cnblogs.com/mafuqiang/p/7161592.html
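The pipeline described above (per-subspace k-means, encoding, and asymmetric distance computation by table lookup) can be sketched in a few dozen lines of NumPy. This is a minimal illustration, not a production implementation: the `ProductQuantizer` class, the `kmeans` helper, and the parameter choices (m = 4 subspaces, k = 16 centroids) are assumptions made for the example, not taken from the article.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal Lloyd's k-means: random init from data points, then
    alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # copy
    for _ in range(iters):
        # assign each point to its nearest centroid (squared L2)
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids

class ProductQuantizer:
    def __init__(self, m=4, k=16):
        self.m, self.k = m, k  # m subspaces, k centroids per subspace

    def fit(self, X):
        # the feature dimension must be divisible by m
        self.d = X.shape[1] // self.m
        # one independent codebook (set of k centroids) per subspace
        self.codebooks = [kmeans(X[:, i * self.d:(i + 1) * self.d], self.k)
                          for i in range(self.m)]
        return self

    def encode(self, X):
        # each vector is compressed to m centroid indices
        codes = np.empty((len(X), self.m), dtype=np.int64)
        for i, cb in enumerate(self.codebooks):
            sub = X[:, i * self.d:(i + 1) * self.d]
            codes[:, i] = np.argmin(((sub[:, None] - cb[None]) ** 2).sum(-1), axis=1)
        return codes

    def adc(self, q, codes):
        """Asymmetric distance: exact query vs. quantized database vectors.
        Precompute an (m, k) table of query-to-centroid distances, then
        each database distance is just m table lookups summed."""
        tables = np.stack([((q[i * self.d:(i + 1) * self.d][None] - cb) ** 2).sum(-1)
                           for i, cb in enumerate(self.codebooks)])
        return tables[np.arange(self.m), codes].sum(axis=1)

# Usage on synthetic data
rng = np.random.default_rng(1)
X = rng.random((200, 32)).astype(np.float32)   # 200 database vectors, 32-d
pq = ProductQuantizer(m=4, k=16).fit(X)
codes = pq.encode(X)                            # (200, 4) compact codes
q = rng.random(32).astype(np.float32)           # query vector
dists = pq.adc(q, codes)                        # (200,) approximate distances
```

Note the asymmetry: the query is kept exact while the database vectors are replaced by their nearest centroids, so `adc` returns the exact squared distance from the query to each vector's quantized reconstruction, at the cost of only m lookups per database vector.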