How does Redis query a key from massive data?
2022-04-21 14:02:00 【Roc. Chang】
Using KEYS
Syntax: KEYS pattern — returns every key that matches the given pattern.
keys *           # return all keys in the current Redis database
keys h?llo       # ? matches exactly one character: hello, hallo, ..., but not hllo
keys h*llo       # * matches any number of characters (including none): hllo, hello, heello, heallo, ...
keys h[ae]llo    # [] matches one character from the set: hello and hallo, but not hllo
keys h[^e]llo    # [^] matches any single character except the ones listed: hallo, ..., hdllo, but not hello
keys h[a-b]llo   # - defines a range, i.e. "a or b": hallo and hbllo; the range can be widened, e.g. keys h[a-d]llo
# The patterns above can also be combined, for example:
keys h[a-b]*llo  # matches ha<any characters>llo and hb<any characters>llo
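The same patterns can be used from a client library. Below is a minimal sketch using the Python redis-py client; the connection parameters and example patterns are illustrative assumptions, not part of the original post:

import redis

# Assumes a local Redis instance; host/port/db are illustrative.
r = redis.Redis(host="localhost", port=6379, db=0, decode_responses=True)

# KEYS returns every matching key in a single reply -- handy for debugging,
# but see the next section before using it in production.
print(r.keys("h?llo"))     # e.g. ['hello', 'hallo']
print(r.keys("h[ae]llo"))  # e.g. ['hello', 'hallo']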
Why KEYS is disabled in production
The time complexity of KEYS is O(N), where N is the number of keys in the database, under the assumption that the key names in the database and the given pattern have limited length. Although the complexity is O(N), a single call is fast: an entry-level machine can scan a database of one million keys in about 40 ms.
Under high concurrency, however, KEYS becomes a problem. Redis executes commands on a single thread, so the server is effectively locked for the duration of each command, and KEYS has no pagination: every call traverses the entire keyspace. If one call takes around 20 ms and traffic reaches millions of requests, every KEYS call briefly blocks Redis, a large number of requests back up, other services become unavailable, CPU usage climbs, and in the worst case the server goes down (a rough timing sketch follows the list below).
Problems with KEYS:
- No pagination: a single call traverses the whole database and returns every matching key at once, and the result set may be very large, which is extremely resource-intensive.
- Although a single query is fast, the query time keeps growing as the amount of data grows.
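To see the difference in behaviour, you can time one blocking KEYS call against a cursor-based SCAN iteration on a throwaway test set of keys. This is a rough sketch with redis-py; the key prefix, key count and COUNT hint are assumptions made for illustration:

import time
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

# Populate a throwaway test set of keys (the size is an illustrative assumption).
pipe = r.pipeline()
for i in range(100_000):
    pipe.set(f"bench:key:{i}", i)
pipe.execute()

# One KEYS call: the server is busy for the whole duration of this single reply.
t0 = time.perf_counter()
r.keys("bench:key:*")
print("KEYS: %.3f s in one blocking call" % (time.perf_counter() - t0))

# SCAN spreads the same work over many short calls, so other clients can be
# served in between; the total wall-clock time may be longer.
t0 = time.perf_counter()
total = sum(1 for _ in r.scan_iter(match="bench:key:*", count=1000))
print("SCAN: visited %d keys across many small calls in %.3f s" % (total, time.perf_counter() - t0))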
Using SCAN
Syntax: SCAN cursor [MATCH pattern] [COUNT count] [TYPE type]
SCAN is a cursor-based iterator: each call returns a batch of keys together with a new cursor, and the client must pass that cursor as the cursor argument of the next call.
The iteration starts when the cursor is set to 0 and ends when the server returns a cursor of 0.
Parameters:
- cursor: the cursor position (an index into the underlying hash buckets), an integer. Iteration starts at 0 and ends at 0. A single call may return 0 results while the cursor is still non-zero; as long as the cursor is not 0, the traversal is not finished.
- MATCH pattern: a glob-style pattern used to filter the returned keys (optional).
- COUNT count: a hint for how many elements to examine per call (a reference value; the number actually traversed internally is not guaranteed), defaulting to 10. It is not a cap on the number of results: with COUNT 10000 each call examines roughly 10,000 entries, of which perhaps only 10 match, or 20,000 entries may match in total (optional).
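The cursor handling described above maps directly onto client code. Here is a minimal sketch of driving the cursor by hand with redis-py; the pattern and COUNT value are illustrative assumptions:

import redis

r = redis.Redis(host="localhost", port=6379, db=0, decode_responses=True)

cursor = 0
matched = []
while True:
    # Each call returns the next cursor plus a batch of keys.
    cursor, keys = r.scan(cursor=cursor, match="h*llo", count=100)
    matched.extend(keys)
    if cursor == 0:   # a returned cursor of 0 means the iteration is complete
        break

print(matched)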
Notes:
- The cursor is an index into the underlying hash table, so results come back in no particular order; a traversal starts at 0 and ends at 0, i.e. it returns to the starting point.
- You do not need to use the same COUNT value for every call; you may change it from one call to the next, as long as the cursor passed to each call is the one returned by the previous call.
- SCAN may return duplicate keys; the client needs to deduplicate the results.
- Keys added while the iteration is in progress may not be returned.
127.0.0.1:6379> scan 0 match 1* count 15
1) "17"
2) 1) "key:12"
2) "key:18"
3) "key:14"
4) "key:14"
5) "key:16"
6) "key:17"
7) "key:15"
8) "key:10"
9) "key:13"
10) "key:17"
11) "key:1"   # about 15 elements were examined, but only 11 of them matched
127.0.0.1:6379> scan 17   # continue from the cursor returned above; COUNT can be changed or omitted (the default is 10)
1) "0"        # the returned cursor is 0, so the iteration is over
2) 1) "key:15"
2) "key:118"
3) "key:10"
4) "key:112"
5) "key:119"
6) "key:13"
7) "key:16"
8) "key:19"
9) "key:111"
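As the first batch above shows (key:14 and key:17 each appear twice), deduplication is left to the caller. A hedged sketch with redis-py, where scan_iter hides the cursor bookkeeping and a set removes the duplicates; the pattern and COUNT hint are illustrative assumptions:

import redis

r = redis.Redis(host="localhost", port=6379, db=0, decode_responses=True)

# scan_iter drives the SCAN cursor internally; collecting into a set
# discards any keys that the iteration legitimately returns twice.
unique_keys = set(r.scan_iter(match="key:1*", count=15))
print(len(unique_keys), sorted(unique_keys))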
Copyright notice
This article was written by [Roc. Chang]. If you repost it, please include a link to the original. Thank you.
https://yzsam.com/2022/04/202204211351518910.html