CN112214135B - Multi-touch identification method, device and equipment for infrared touch screen - Google Patents
- Publication number
- CN112214135B CN112214135B CN202010911951.XA CN202010911951A CN112214135B CN 112214135 B CN112214135 B CN 112214135B CN 202010911951 A CN202010911951 A CN 202010911951A CN 112214135 B CN112214135 B CN 112214135B
- Authority
- China (CN)
- Prior art keywords
- infrared
- pixel
- touch
- peak
- infrared image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04104—Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Analysis (AREA)
Abstract
The application belongs to the field of computer technology and provides a multi-touch recognition method for an infrared touch screen, which comprises the following steps: generating an infrared image from the collected infrared information; traversing the pixel values of all points in the infrared image to obtain the peak areas of the image; and determining target touch areas from the peak areas according to a preset screening rule. In the prior art, suspected touch areas are determined simply by judging whether infrared signals are blocked, so a large number of suspected touch areas may be detected. Compared with the prior art, the number of suspected touch areas obtained through this embodiment is greatly reduced, which reduces the workload of subsequently screening target touch areas from the suspected touch areas, increases recognition speed, avoids screening errors or omissions caused by excessive workload, and allows target touch areas to be screened more accurately, improving both the precision and the speed of multi-touch recognition.
Description
Technical Field
The application belongs to the technical field of computers, and particularly relates to a multi-touch identification method, device and equipment for an infrared touch screen.
Background
An infrared touch screen is composed of infrared emitting and receiving sensing elements mounted on the outer frame of the touch screen; pairs of infrared emitting tubes and infrared receiving tubes together form a positioning plane. In use, to detect a user's touch point on the infrared touch screen, the device sequentially scans, in both the horizontal and vertical directions, whether the infrared signal between each corresponding pair of emitting and receiving tubes has changed due to blocking, and calculates the position coordinates of the touch area from the blocked positions.
However, the above detection method is generally limited to single-point touch. When a user performs multi-point touch, a large number of suspected touch areas may be detected, and the real touch areas screened from them may contain large errors, so the real touch areas cannot be accurately located.
Disclosure of Invention
The embodiments of the application provide a multi-touch recognition method, device, and equipment for an infrared touch screen, which can solve the problem in the prior art that real touch areas cannot be accurately located.
In a first aspect, an embodiment of the present application provides a method for recognizing multiple touches on an infrared touch screen, including:
generating an infrared image according to the collected infrared information;
traversing pixel values of all points in the infrared image to obtain a peak area of the infrared image;
and determining a target touch area according to the peak area and a preset screening rule.
Further, the traversing pixel values of each point in the infrared image to obtain a peak area of the infrared image includes:
traversing pixel values of pixel points of each row in the infrared image in the horizontal direction to obtain a first pixel value curve of the pixel values of the pixel points of each row, and obtaining a first peak area according to the first pixel value curve;
traversing pixel values of pixel points of each column in the infrared image in the vertical direction to obtain a second pixel value curve of the pixel values of the pixel points of each column, and obtaining a second peak area according to the second pixel value curve;
determining a peak region of the infrared image based on the first peak region and the second peak region.
Further, the traversing pixel values of pixel points in each row in the infrared image in the horizontal direction to obtain a first pixel value curve of the pixel values of the pixel points in each row, and obtaining a first peak area according to the first pixel value curve includes:
traversing pixel values of pixel points of each line in the infrared image in the horizontal direction to obtain a first pixel value curve of the pixel values of the pixel points of each line and a peak point in the first pixel value curve;
and determining a first peak area corresponding to each first pixel value curve according to the peak point and the preset peak height of each first pixel value curve.
Further, the traversing pixel values of pixel points in each column in the infrared image in the vertical direction to obtain a second pixel value curve of the pixel values of the pixel points in each column, and obtaining a second peak area according to the second pixel value curve includes:
traversing pixel values of pixel points in each column in the infrared image in the vertical direction to obtain a second pixel value curve of the pixel values of the pixel points in each column and a peak point in the second pixel value curve;
and determining a second peak area corresponding to each second pixel value curve according to the peak point and the preset peak height of each second pixel value curve.
Further, the generating an infrared image according to the collected infrared information includes:
acquiring signal values acquired by the infrared receiving tubes;
and generating an infrared image according to the signal value.
Further, the generating an infrared image according to the signal value includes:
generating an initial image according to the signal value;
and setting the pixel value of the pixel point of which the pixel value is smaller than the preset pixel threshold value in the initial image to be 0 to obtain the infrared image.
Further, the determining a target touch area according to the peak area and a preset screening rule includes:
acquiring a connected region of a pixel point corresponding to the peak region, and determining a suspected touch region according to the connected region;
acquiring characteristic information of the suspected touch area, and determining authenticity evaluation degree information of the suspected touch area based on the characteristic information;
screening out a target touch area from the suspected touch area according to the authenticity evaluation degree information;
and determining the position information of the target touch area.
In a second aspect, an embodiment of the present application provides an infrared touch screen multi-touch recognition apparatus, including:
the first generating unit is used for generating an infrared image according to the collected infrared information;
the first processing unit is used for traversing the pixel values of all points in the infrared image to obtain a peak area of the infrared image;
and the first determining unit is used for determining a target touch area according to the peak area and a preset screening rule.
Further, the first processing unit includes:
the second processing unit is used for traversing the pixel values of the pixel points in each row in the horizontal direction in the infrared image to obtain a first pixel value curve of the pixel values of the pixel points in each row, and obtaining a first peak area according to the first pixel value curve;
the third processing unit is used for traversing the pixel values of the pixel points in each column in the infrared image in the vertical direction to obtain a second pixel value curve of the pixel values of the pixel points in each column, and obtaining a second peak area according to the second pixel value curve;
a second determining unit configured to determine a peak region of the infrared image based on the first peak region and the second peak region.
Further, the second processing unit is specifically configured to:
traversing pixel values of pixel points of each line in the infrared image in the horizontal direction to obtain a first pixel value curve of the pixel values of the pixel points of each line and a peak point in the first pixel value curve;
and determining a first peak area corresponding to each first pixel value curve according to the peak point and the preset peak height of each first pixel value curve.
Further, the third processing unit is specifically configured to:
traversing pixel values of pixel points in each column in the infrared image in the vertical direction to obtain a second pixel value curve of the pixel values of the pixel points in each column and a peak point in the second pixel value curve;
and determining a second peak area corresponding to each second pixel value curve according to the peak point and the preset peak height of each second pixel value curve.
Further, the first generating unit includes:
the first acquisition unit is used for acquiring signal values acquired by the infrared receiving tubes;
and the second generating unit is used for generating an infrared image according to the signal value.
Further, the second generating unit is specifically configured to:
generating an initial image according to the signal value;
and setting the pixel value of the pixel point of which the pixel value is smaller than the preset pixel threshold value in the initial image to be 0 to obtain the infrared image.
Further, the first determining unit is specifically configured to:
acquiring a connected region of a pixel point corresponding to the peak region, and determining a suspected touch region according to the connected region;
acquiring characteristic information of the suspected touch area, and determining authenticity evaluation degree information of the suspected touch area based on the characteristic information;
screening out a target touch area from the suspected touch area according to the authenticity evaluation degree information;
and determining the position information of the target touch area.
In a third aspect, an embodiment of the present application provides an infrared touch screen multi-touch recognition device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the infrared touch screen multi-touch recognition method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and when executed by a processor, the computer program implements the infrared touch screen multi-touch recognition method according to the first aspect.
In the embodiments of the application, an infrared image is generated from the collected infrared information; the pixel values of all points in the infrared image are traversed to obtain the peak areas of the image; and target touch areas are determined from the peak areas according to a preset screening rule. The peak areas of the infrared image correspond to suspected touch areas, and further screening is performed on them. Because far fewer suspected touch areas are obtained than in the prior art, the workload of subsequently screening target touch areas from the suspected touch areas is reduced, recognition speed is increased, screening errors or omissions caused by excessive workload are avoided, and target touch areas can be screened more accurately, improving both the precision and the speed of multi-touch recognition.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart of a multi-touch recognition method for an infrared touch screen according to a first embodiment of the present application;
FIG. 2 is a schematic diagram of an infrared path disposed on an infrared touch screen according to a first embodiment of the present disclosure;
fig. 3 is a schematic diagram of a first pixel value curve of a pixel point in an infrared image in a multi-touch recognition method for an infrared touch screen according to a first embodiment of the present application;
FIG. 4 is a diagram of an infrared touch screen multi-touch recognition terminal according to a second embodiment of the present application;
fig. 5 is a schematic diagram of an infrared touch screen multi-touch recognition device according to a third embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Referring to fig. 1, fig. 1 is a schematic flow chart of a multi-touch recognition method for an infrared touch screen according to a first embodiment of the present application. In this embodiment, an execution main body of the infrared touch screen multi-touch recognition method is a device having an infrared touch screen multi-touch recognition function. The multi-touch recognition method of the infrared touch screen as shown in fig. 1 may include:
s101: and generating an infrared image according to the collected infrared information.
A plurality of infrared transceiver module groups are arranged in the infrared touch screen of the terminal device. Each group comprises an infrared transmitting tube and an infrared receiving tube, arranged separately from each other.
In each transceiver module, the transmitting tube emits infrared rays, and the receiving tube receives them and converts the received optical signal into an electrical signal, obtaining a signal value for that ray. Together, the multiple transceiver module groups form an infrared detection network.
When an infrared transmitting tube emits a ray, the ray can be received by the corresponding infrared receiving tube. If the ray is not blocked, the signal value received by the receiving tube should be a preset value.
If the ray is blocked, its intensity drops sharply, so the signal value received by the receiving tube is far smaller than the preset value.
Therefore, in practical application, the terminal device can judge whether the ray corresponding to an infrared receiving tube is blocked according to the signal value that tube detects.
If the signal value received by a receiving tube is smaller than a preset threshold, the corresponding ray is blocked by an object.
If the signal value received by a receiving tube is greater than or equal to the preset threshold, the corresponding ray is not blocked.
The blocking object may be a finger, a stylus, or another object.
The preset threshold may be determined from the preset value and an allowable error threshold. For example, the preset threshold may be the preset value minus the error threshold, or any other value greater than that difference and smaller than the preset value.
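As an illustrative sketch only (the patent contains no code, and the function and parameter names here are hypothetical), the blocking test described above could be written as:

```python
def blocked_rays(signal_values, nominal, error_margin):
    """Flag each infrared receiving tube whose signal value fell below
    the preset threshold (here taken as the preset nominal value minus
    the allowed error), indicating that its ray is blocked."""
    threshold = nominal - error_margin
    return [value < threshold for value in signal_values]

# A tube reading near the nominal value is unblocked; a sharp drop
# marks a blocked ray.
flags = blocked_rays([1000, 995, 120, 998], nominal=1000, error_margin=50)
# flags -> [False, False, True, False]
```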
When performing touch recognition, the terminal device acquires the collected infrared information, namely the signal values of the rays collected by the respective infrared receiving tubes.
The terminal device can then determine which rays are blocked from those signal values.
After determining the blocked rays, the terminal device determines the start position and end position of each blocked ray in a preset rectangular coordinate system.
The start position is the position coordinate of the infrared transmitting tube corresponding to the blocked ray.
The end position is the position coordinate of the infrared receiving tube corresponding to the blocked ray.
From the start position and end position of each blocked ray and the width of each ray, the terminal device determines the pixel points the blocked ray passes through.
The terminal device can then generate an infrared image based on the pixel points passed by the blocked rays.
In the infrared image, each time a blocked ray passes through a pixel point, K is added to that pixel's value. If a pixel point is passed by N blocked rays, its value is accumulated N times, giving a pixel value of N × K. K is a natural number greater than 0, and N is an integer greater than or equal to 0.
The terminal equipment can determine the pixel value of each pixel point in the infrared image according to the number of the shielded infrared rays passing through each pixel point in the infrared image.
For example, as shown in fig. 2, fig. 2 is a schematic diagram of an infrared path disposed on an infrared touch screen. The rectangular area in fig. 2 is an area blocked by the blocking object, and the rectangular area blocks a plurality of infrared rays. At this time, the terminal device may determine the blocked infrared rays according to the signal values collected by the infrared receiving tubes, and determine the pixel values of the pixel points through which the blocked infrared rays pass, thereby obtaining the infrared image in this embodiment.
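The accumulation described above can be sketched as follows. This is an illustrative simplification that rasterizes each blocked ray with a simple integer line walk and ignores the ray width the patent also takes into account; all names are hypothetical:

```python
def generate_infrared_image(width, height, blocked_rays, k=1):
    """Add K to every pixel crossed by each blocked ray, so a pixel
    crossed by N blocked rays ends up with the value N * K.
    Each ray is given as ((x0, y0), (x1, y1)) grid coordinates."""
    image = [[0] * width for _ in range(height)]
    for (x0, y0), (x1, y1) in blocked_rays:
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for i in range(steps + 1):
            x = round(x0 + (x1 - x0) * i / steps)
            y = round(y0 + (y1 - y0) * i / steps)
            image[y][x] += k
    return image

# Two crossing rays: the pixel where they intersect accumulates 2 * K.
img = generate_infrared_image(5, 5, [((0, 2), (4, 2)), ((2, 0), (2, 4))])
# img[2][2] -> 2, while pixels crossed by only one ray hold 1
```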
Note that the infrared image may be a grayscale image. At this time, the pixel value of each pixel point in the infrared image is the gray value of the pixel point. In this embodiment, the grayscale image may be an 8-bit grayscale image, that is, the image may have 256 levels of grayscale values at most, and the grayscale value range may be 0 to 255.
In the embodiment of the application, the terminal device may generate an infrared image according to the collected infrared information. In the infrared image, the higher the pixel value of the pixel point is, the higher the possibility that the pixel point belongs to the shielded area is. Therefore, compared with the prior art, the terminal device of the embodiment can more intuitively and accurately represent the possibility that each pixel belongs to the shielded area through the pixel value of each pixel in the infrared image.
Further, since the touch point is generally passed by a plurality of infrared rays, the terminal device may take the currently generated infrared image as an initial image.
The terminal device then filters the initial image: pixel points whose values are smaller than a preset pixel threshold are treated as noise points and their values are set to 0, yielding the filtered infrared image.
The preset pixel threshold value can be set according to an actual scene. For example, the preset pixel threshold may be set to a value of 5, 10, 20, etc.
In the infrared image, the lower the pixel value of a pixel point is, the lower the possibility that the pixel point belongs to a shielded area is. Therefore, the terminal device filters the initial image through the preset pixel threshold value, noise points in the initial image can be filtered, the subsequent calculation amount for processing the infrared image is reduced, and the accuracy for identifying the target touch area is improved.
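The noise-filtering step can be sketched as follows (illustrative only; names are hypothetical):

```python
def filter_noise(image, pixel_threshold):
    """Set to 0 every pixel whose value is smaller than the preset
    pixel threshold, treating it as a noise point."""
    return [[v if v >= pixel_threshold else 0 for v in row]
            for row in image]

filtered = filter_noise([[1, 7, 3], [12, 4, 9]], pixel_threshold=5)
# filtered -> [[0, 7, 0], [12, 0, 9]]
```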
S102: and traversing the pixel value of each pixel point in the infrared image to obtain the peak area of the infrared image.
After the terminal device acquires the infrared image, the pixel value of each row of pixel points in the infrared image in the horizontal direction and the pixel value of each column of pixel points in the vertical direction can be acquired.
Then, the terminal device may traverse pixel values of pixel points in each row in the infrared image in the horizontal direction, and perform curve fitting according to the pixel values of the pixel points in each row and the abscissa of each pixel point to obtain a first pixel curve corresponding to the pixel points in each row.
After the terminal device acquires the first pixel curve, a peak point and a valley point in the first pixel curve may be searched.
In the first pixel curve, the pixel value of the peak point is greater than the pixel values of the pixel points on the two adjacent sides, and the pixel value of the valley point is less than the pixel values of the pixel points on the two adjacent sides.
After determining the peak point of the first pixel curve, the terminal device may determine a wave region corresponding to the peak point. Wherein, the area from the peak point to the valley point at two adjacent sides is the wave area of the peak point.
For example, as shown in fig. 3, a is a peak point in the first pixel curve, B and C are valley points adjacent to the point a in the first pixel curve, the pixel values of the respective pixel points in the region from the point B to the point a monotonically increase, and the pixel values of the respective pixel points in the region from the point a to the point C monotonically decrease.
At this time, the terminal device may determine a region from point B to point C as a wave region of point a.
After determining the wave region corresponding to the peak point, the terminal device may intercept a first peak region in the wave region corresponding to the peak point according to a preset peak height. The difference value between the pixel value of the peak point and the pixel value of each pixel point in the first peak area is smaller than or equal to the preset peak height.
For example, as shown in fig. 3, a region from point B to point C is determined as a wave region of the peak point a. The terminal device can intercept the region from the point B to the point C according to the preset peak height to obtain a point D and a point E.
The region from the point D to the point E is a peak region of the point A, and the difference value between the pixel value of the point A and each pixel point of the region from the point D to the point E is smaller than or equal to the preset peak height.
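A minimal sketch of the peak-region extraction for one row (or column) curve, following the A/B/C/D/E construction of Fig. 3 (illustrative only; names are hypothetical):

```python
def peak_regions(curve, peak_height):
    """For each local maximum (peak point A) in a pixel-value curve,
    walk outward to the adjacent valley points B and C, then keep only
    the indices whose value is within `peak_height` of the peak,
    corresponding to the region from point D to point E."""
    regions = []
    for i in range(1, len(curve) - 1):
        if curve[i - 1] < curve[i] > curve[i + 1]:  # peak point A
            lo = i
            while lo > 0 and curve[lo - 1] < curve[lo]:  # descend to valley B
                lo -= 1
            hi = i
            while hi < len(curve) - 1 and curve[hi + 1] < curve[hi]:  # valley C
                hi += 1
            regions.append([j for j in range(lo, hi + 1)
                            if curve[i] - curve[j] <= peak_height])
    return regions

# One peak at index 3 (value 8); with peak height 3, only indices whose
# value is at least 5 remain in its peak region.
print(peak_regions([0, 2, 6, 8, 5, 1, 0], 3))
# -> [[2, 3, 4]]
```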
Similarly, the terminal device may traverse the pixel values of the pixels in each column in the vertical direction in the infrared image, and perform curve fitting according to the pixel values of the pixels in each column and the ordinate of each pixel to obtain a second pixel curve corresponding to each column.
Then, the terminal device determines a peak point and a valley point in the second pixel curve according to the second pixel curve.
In the second pixel curve, the pixel value of the peak point is greater than the pixel values of the pixel points on the two adjacent sides, and the pixel value of the valley point is less than the pixel values of the pixel points on the two adjacent sides.
After determining the peak point of the second pixel curve, the terminal device may determine a wave region corresponding to the peak point. Wherein, the area from the peak point to the valley point at two adjacent sides is the wave area of the peak point.
After determining the wave region corresponding to the peak point, the terminal device may intercept a second peak region in the wave region corresponding to the peak point according to a preset peak height.
The difference value between the pixel value of the peak point and the pixel value of each pixel point in the second peak area is smaller than or equal to the preset peak height.
In this way, the terminal device may determine the first peak area corresponding to each first pixel curve according to the peak point and the preset peak height of each first pixel curve, and determine the second peak area corresponding to each second pixel curve according to the peak point and the preset peak height of each second pixel curve.
Then, the terminal device may determine a peak region of the infrared image based on the first peak region and the second peak region.
Specifically, if a certain pixel point belongs to both a first peak region of a certain first pixel curve and a second peak region of a certain second pixel curve, the terminal device may determine the pixel point as a peak point of the infrared image, and determine a position where the peak point of the infrared image is located as the peak region of the infrared image.
For example, if a certain pixel belongs to the first peak region of the X-th row and also belongs to the second peak region of the Y-th column, the pixel is a peak point of the infrared image, and its position belongs to the peak region of the infrared image. X and Y are both positive integers greater than 0.
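The two-direction combination can be sketched as a set intersection (illustrative only; the data layout and names are hypothetical):

```python
def image_peak_region(row_regions, col_regions):
    """A pixel (x, y) is a peak point of the infrared image when x lies
    in a first (row-wise) peak region of row y AND y lies in a second
    (column-wise) peak region of column x.
    row_regions[y] / col_regions[x] hold the peak index sets found for
    row y and column x respectively."""
    peaks = set()
    for y, xs in row_regions.items():
        for x in xs:
            if y in col_regions.get(x, set()):
                peaks.add((x, y))
    return peaks

# Row 2 has a first peak region covering columns 1-3; only column 2
# also has a second peak region covering row 2, so (2, 2) survives.
print(image_peak_region({2: {1, 2, 3}}, {2: {2}}))
# -> {(2, 2)}
```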
It can be understood that, in the infrared image, the higher the pixel value of a pixel point is, the higher the possibility that the pixel point belongs to a blocked area is. Therefore, the blocked region should be a peak region where the peak point in the infrared image is located.
Therefore, in this embodiment, the terminal device traverses the pixel values of the respective pixel points in the infrared image, so as to identify the peak region of the infrared image. At this time, the terminal device may limit the screening range of the target touch area to the peak area of the recognized infrared image, reduce the workload of screening the target touch area, and improve the accuracy of recognizing the target touch area.
Moreover, in identifying the peak region of the infrared image, the terminal device identifies the first peak regions along the row direction and the second peak regions along the column direction, and then combines the two to obtain the peak region of the image. Determining the peak region of the infrared image through this two-pass identification further improves the accuracy with which it is identified.
S103: and determining a target touch area according to the peak area and a preset screening rule.
After the terminal device acquires the peak region of the infrared image, it can determine the target touch area from the peak region according to a preset screening rule.
The preset screening rule can be set according to actual requirements.
In some possible implementations, the terminal device may obtain the number of pixel points contained in each peak region and determine any peak region whose pixel count exceeds a first number threshold as a target touch area.
When a user touches the infrared touch screen, the touched area should have a certain size, and in the infrared image the area of each peak region is related to the number of pixel points it contains. The terminal device may therefore set a first number threshold when recognizing the target touch area.
If the number of pixel points in a peak region is less than or equal to the first number threshold, the region is too small to be a target touch area.
If the number of pixel points in a peak region is greater than the first number threshold, the terminal device may determine that peak region as a target touch area.
In this way, the terminal device can quickly and accurately identify the target touch areas from the peak regions according to the number of pixel points each contains, improving both the speed and the accuracy of recognition.
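A minimal sketch of this first screening rule follows; the names and the threshold value are illustrative assumptions, not specified by the patent.

```python
def screen_by_pixel_count(peak_regions, first_number_threshold):
    """Keep only peak regions whose pixel count exceeds the threshold.

    peak_regions: list of regions, each a list of (row, col) pixel
    coordinates. Regions at or below the threshold are discarded as
    too small to be real touches."""
    return [region for region in peak_regions
            if len(region) > first_number_threshold]
```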
In other possible implementations, the terminal device may determine the connected regions of the pixel points corresponding to the peak region, and determine suspected touch areas from those connected regions.
The pixel points in each peak region are peak points of the infrared image, so the terminal device may determine the connected regions of the corresponding pixel points from the coordinates of the peak points.
A connected region is an image region formed by adjacent peak points; that is, any two pixel points in a connected region can be joined by a path that lies entirely within the region.
The terminal device may obtain the connected regions of the pixel points corresponding to the peak region through any connected-component algorithm on the binarized image, such as image segmentation, seed filling, the watershed method, or region growing; this embodiment does not limit the algorithm adopted.
After the connected regions are determined, the terminal device may screen them to obtain the suspected touch areas. For example, it may obtain the number of pixel points in each connected region and determine any connected region whose pixel count exceeds a second number threshold as a suspected touch area.
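Of the algorithms named above, seed filling is the simplest to sketch. The following illustrative Python groups peak points into 4-connected regions by breadth-first flood fill; it is a sketch under assumed names, not the patented implementation.

```python
from collections import deque

def connected_regions(peak_points):
    """Group peak points into 4-connected regions via seed filling (BFS).

    peak_points: set of (row, col) coordinates of peak pixels.
    Each region is grown from a seed by repeatedly absorbing
    unvisited 4-neighbours."""
    remaining = set(peak_points)
    regions = []
    while remaining:
        seed = remaining.pop()
        queue = deque([seed])
        region = [seed]
        while queue:
            r, c = queue.popleft()
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (nr, nc) in remaining:
                    remaining.remove((nr, nc))
                    region.append((nr, nc))
                    queue.append((nr, nc))
        regions.append(region)
    return regions
```

The resulting regions can then be screened by the second number threshold described above.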
After the suspected touch area is determined, the terminal device may acquire feature information of the suspected touch area, and determine authenticity evaluation degree information of the suspected touch area based on the feature information.
The feature information is used to characterize the suspected touch area. It may include one or more of: the area of the suspected touch area, the number of infrared rays passing through it, and the highest pixel value within it.
After the terminal device obtains the feature information, the terminal device may score the feature information according to a preset scoring standard to obtain authenticity evaluation degree information corresponding to each suspected touch area.
When the feature information includes only one kind of information, the authenticity evaluation degree information is the score value of that information. When it includes several kinds, the authenticity evaluation degree information is the sum of the score values corresponding to each kind.
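The scoring step might be sketched as follows. The feature names and scoring rules shown are purely illustrative assumptions, since the patent does not fix a particular scoring standard.

```python
def authenticity_score(feature_info, scoring_rules):
    """Sum per-feature score values into one authenticity evaluation value.

    feature_info: dict mapping feature name -> measured value
                  (e.g. "area", "max_pixel"; names are assumptions).
    scoring_rules: dict mapping feature name -> scoring function.
    Features without a rule (or rules without a feature) are ignored."""
    return sum(rule(feature_info[name])
               for name, rule in scoring_rules.items()
               if name in feature_info)
```

With a single rule this degenerates to the single-feature case the text mentions; with several rules it is the sum of the individual score values.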
After determining the authenticity evaluation degree information of the suspected touch area, the terminal device may screen out the target touch area from the suspected touch area based on the authenticity evaluation degree information, and obtain the position information of the target touch area.
Specifically, the higher the authenticity evaluation degree information, the more likely the corresponding suspected touch area is a real touch area.
Therefore, the terminal device may determine every suspected touch area whose authenticity evaluation degree information exceeds a preset information threshold as a target touch area. Alternatively, the terminal device may sort the suspected touch areas by their authenticity evaluation degree information from high to low and select the K highest-scoring areas as the target touch areas, where K is a preset positive integer.
For example, when using the infrared touch screen, a user generally performs single-point touch with one finger, two-point touch with two fingers, or three-point touch with three fingers, so K may be set to 3. When multiple suspected touch areas exist, the three with the highest authenticity evaluation degree information may then be selected as the target touch areas.
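Both selection strategies described above, thresholding on the evaluation value or taking the top K, fit in a few lines. This is an illustrative sketch with assumed names:

```python
def select_target_regions(suspected, scores, k=3, threshold=None):
    """Select target touch areas from suspected touch areas.

    suspected: list of candidate regions; scores: parallel list of
    authenticity evaluation values. If threshold is given, keep every
    region scoring above it; otherwise keep the k highest-scoring
    regions (k=3 matches the three-finger example in the text)."""
    if threshold is not None:
        return [reg for reg, s in zip(suspected, scores) if s > threshold]
    ranked = sorted(zip(suspected, scores), key=lambda p: p[1], reverse=True)
    return [reg for reg, _ in ranked[:k]]
```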
In the above process, the terminal device sets a preset scoring standard. Through it, the terminal device can calculate the authenticity evaluation degree information of each suspected touch area from that area's feature information, quantifying the credibility of each area. This allows the terminal device to quickly screen out the target touch areas according to the authenticity evaluation degree information, improving both the speed and the accuracy of recognition.
To sum up, in the embodiment of the present application, the terminal device generates an infrared image according to the collected infrared information, traverses the pixel values of the points in the infrared image to obtain its peak region, and determines the target touch area according to the peak region and a preset screening rule.

In the prior art, a suspected touch area is declared whenever an infrared signal is blocked. This produces a large number of suspected touch areas, so the workload of screening the final target touch areas from them is huge and the recognition speed during multi-point touch is low; moreover, because the data volume is so large, incorrect or missed screening may occur, lowering the recognition accuracy of multi-point touch.

In the above scheme, whether the infrared signal is blocked is still determined, but a blocked ray does not directly yield a suspected touch area. Instead, the infrared image is generated from the blocked infrared rays, and the peak region of that image is then obtained. The infrared image in this embodiment roughly corresponds to the suspected touch areas of the prior art, while the peak region corresponds to the suspected touch areas of this embodiment; in other words, the suspected touch areas here are obtained by a further round of screening on top of what the prior art would produce. The number of suspected touch areas obtained by the above scheme is therefore smaller than in the prior art.
Since the number of suspected touch areas obtained through this embodiment is greatly reduced, the workload of subsequently screening the target touch areas from them is also reduced. This increases the recognition speed, avoids the incorrect or missed screening caused by an excessive workload, and allows the target touch areas to be screened more accurately, improving both the precision and the speed of multi-point touch recognition.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Referring to fig. 4, fig. 4 is a schematic diagram of an infrared touch screen multi-touch recognition terminal according to a second embodiment of the present application. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 4, the infrared touch screen multi-touch recognition terminal includes:
a first generating unit 410, configured to generate an infrared image according to the collected infrared information;
the first processing unit 420 is configured to traverse pixel values of each point in the infrared image to obtain a peak area of the infrared image;
a first determining unit 430, configured to determine a target touch area according to the peak area and a preset filtering rule.
Further, the first processing unit 420 includes:
the second processing unit is used for traversing the pixel values of the pixel points in each row in the horizontal direction in the infrared image to obtain a first pixel value curve of the pixel values of the pixel points in each row, and obtaining a first peak area according to the first pixel value curve;
the third processing unit is used for traversing the pixel values of the pixel points in each column in the vertical direction in the infrared image to obtain a second pixel value curve of the pixel values of the pixel points in each column, and obtaining a second peak area according to the second pixel value curve;
a second determining unit configured to determine a peak region of the infrared image based on the first peak region and the second peak region.
Further, the second processing unit is specifically configured to:
traversing pixel values of pixel points of each line in the infrared image in the horizontal direction to obtain a first pixel value curve of the pixel values of the pixel points of each line and a peak point in the first pixel value curve;
and determining a first peak area corresponding to each first pixel value curve according to the peak point and the preset peak height of each first pixel value curve.
Further, the third processing unit is specifically configured to:
traversing pixel values of pixel points in each column in the infrared image in the vertical direction to obtain a second pixel value curve of the pixel values of the pixel points in each column and a peak point in the second pixel value curve;
and determining a second peak area corresponding to each second pixel value curve according to the peak point and the preset peak height of each second pixel value curve.
Further, the first generating unit 410 includes:
the first acquisition unit is used for acquiring signal values acquired by the infrared receiving tubes;
and the second generating unit is used for generating an infrared image according to the signal value.
Further, the second generating unit is specifically configured to:
generating an initial image according to the signal value;
and setting the pixel value of the pixel point of which the pixel value is smaller than the preset pixel threshold value in the initial image to be 0 to obtain the infrared image.
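As an illustrative sketch of this thresholding step (function and parameter names are assumptions):

```python
import numpy as np

def to_infrared_image(initial_image, pixel_threshold):
    """Zero out pixels below the preset pixel threshold, suppressing
    low-value noise in the initial image to obtain the infrared image."""
    out = initial_image.copy()
    out[out < pixel_threshold] = 0
    return out
```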
Further, the first determining unit 430 is specifically configured to:
acquiring a connected region of a pixel point corresponding to the peak region, and determining a suspected touch region according to the connected region;
acquiring characteristic information of the suspected touch area, and determining authenticity evaluation degree information of the suspected touch area based on the characteristic information;
screening out a target touch area from the suspected touch area according to the authenticity evaluation degree information;
and determining the position information of the target touch area.
Fig. 5 is a schematic diagram of an infrared touch screen multi-touch recognition device according to a third embodiment of the present application. As shown in fig. 5, the infrared touch screen multi-touch recognition device 5 of the embodiment includes: a processor 50, a memory 51 and a computer program 52, such as an infrared touch screen multi-touch recognition program, stored in said memory 51 and executable on said processor 50. The processor 50, when executing the computer program 52, implements the steps in the above-mentioned embodiments of the infrared touch screen multi-touch recognition method, such as the steps 101 to 103 shown in fig. 1. Alternatively, the processor 50, when executing the computer program 52, implements the functions of the modules/units in the above-mentioned device embodiments, such as the functions of the modules 410 to 430 shown in fig. 4.
Illustratively, the computer program 52 may be partitioned into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 52 in the infrared touch screen multi-touch recognition device 5. For example, the computer program 52 may be divided into a first generating unit, a first processing unit, and a first determining unit, and the specific functions of each unit are as follows:
the first generating unit is used for generating an infrared image according to the collected infrared information;
the first processing unit is used for traversing the pixel values of all points in the infrared image to obtain a peak area of the infrared image;
and the first determining unit is used for determining a target touch area according to the peak area and a preset screening rule.
The infrared touch screen multi-touch recognition device may include, but is not limited to, a processor 50 and a memory 51. Those skilled in the art will appreciate that fig. 5 is merely an example of the infrared touch screen multi-touch recognition device 5 and does not constitute a limitation on it; the device may include more or fewer components than shown, combine some components, or use different components. For example, it may also include input-output devices, network access devices, a bus, and so on.
The Processor 50 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may be an internal storage unit of the infrared touch screen multi-touch recognition device 5, such as a hard disk or a memory of the infrared touch screen multi-touch recognition device 5. The memory 51 may also be an external storage device of the infrared touch screen multi-touch recognition device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are equipped on the infrared touch screen multi-touch recognition device 5. Further, the infrared touch screen multi-touch recognition device 5 may also include both an internal storage unit and an external storage device of the infrared touch screen multi-touch recognition device 5. The memory 51 is used for storing the computer program and other programs and data required by the infrared touch screen multi-touch recognition device. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a network device, where the network device includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the apparatus/terminal device, a recording medium, computer memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (10)
1. A multi-touch recognition method for an infrared touch screen is characterized by comprising the following steps:
generating an infrared image according to the collected infrared information; the infrared information comprises signal values of infrared rays collected by the infrared receiving tubes; the infrared image is an image formed by pixel points through which all the shielded infrared rays pass;
traversing pixel values of all points in the infrared image to obtain a peak area of the infrared image; the pixel value of each point of the infrared image is the gray value of the point;
and determining a target touch area according to the peak area and a preset screening rule.
2. The method for multi-touch recognition of an infrared touch screen as claimed in claim 1, wherein said traversing pixel values of respective points in the infrared image to obtain a peak area of the infrared image comprises:
traversing pixel values of pixel points of each row in the infrared image in the horizontal direction to obtain a first pixel value curve of the pixel values of the pixel points of each row, and obtaining a first peak area according to the first pixel value curve;
traversing pixel values of pixel points of each column in the infrared image in the vertical direction to obtain a second pixel value curve of the pixel values of the pixel points of each column, and obtaining a second peak area according to the second pixel value curve;
determining a peak region of the infrared image based on the first peak region and the second peak region.
3. The method for multi-touch recognition of an infrared touch screen as claimed in claim 2, wherein said traversing pixel values of pixel points in each row in a horizontal direction in the infrared image to obtain a first pixel value curve of pixel values of pixel points in each row, and obtaining a first peak area according to the first pixel value curve comprises:
traversing pixel values of pixel points of each line in the infrared image in the horizontal direction to obtain a first pixel value curve of the pixel values of the pixel points of each line and a peak point in the first pixel value curve;
and determining a first peak area corresponding to each first pixel value curve according to the peak point and the preset peak height of each first pixel value curve.
4. The method for multi-touch recognition of an infrared touch screen as claimed in claim 2, wherein the traversing pixel values of pixel points in each column in a vertical direction in the infrared image to obtain a second pixel value curve of pixel values of pixel points in each column, and obtaining a second peak area according to the second pixel value curve comprises:
traversing pixel values of pixel points in each column in the infrared image in the vertical direction to obtain a second pixel value curve of the pixel values of the pixel points in each column and a peak point in the second pixel value curve;
and determining a second peak area corresponding to each second pixel value curve according to the peak point and the preset peak height of each second pixel value curve.
5. The infrared touch screen multi-touch recognition method of claim 1, wherein the generating an infrared image from the collected infrared information comprises:
acquiring signal values acquired by the infrared receiving tubes;
and generating an infrared image according to the signal value.
6. The infrared touch screen multi-touch recognition method of claim 5, wherein the generating an infrared image from the signal values comprises:
generating an initial image according to the signal value;
and setting the pixel value of the pixel point of which the pixel value is smaller than the preset pixel threshold value in the initial image to be 0 to obtain the infrared image.
7. The method for multi-touch recognition on an infrared touch screen according to claim 1, wherein the determining the target touch area according to the peak area and a preset screening rule comprises:
acquiring a connected region of a pixel point corresponding to the peak region, and determining a suspected touch region according to the connected region;
acquiring characteristic information of the suspected touch area, and determining authenticity evaluation degree information of the suspected touch area based on the characteristic information;
screening out a target touch area from the suspected touch area according to the authenticity evaluation degree information;
and determining the position information of the target touch area.
8. An infrared touch screen multi-touch recognition device, comprising:
the first generating unit is used for generating an infrared image according to the collected infrared information; the infrared information comprises signal values of infrared rays collected by the infrared receiving tubes; the infrared image is an image formed by pixel points through which all the shielded infrared rays pass;
the first processing unit is used for traversing the pixel values of all points in the infrared image to obtain a peak area of the infrared image; the pixel value of each point of the infrared image is the gray value of the point;
and the first determining unit is used for determining a target touch area according to the peak area and a preset screening rule.
9. An infrared touch screen multi-touch recognition device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010911951.XA CN112214135B (en) | 2020-09-02 | 2020-09-02 | Multi-touch identification method, device and equipment for infrared touch screen |
PCT/CN2020/116194 WO2022047839A1 (en) | 2020-09-02 | 2020-09-18 | Multi-point touch recognition method, apparatus and device for infrared touch screen |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010911951.XA CN112214135B (en) | 2020-09-02 | 2020-09-02 | Multi-touch identification method, device and equipment for infrared touch screen |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112214135A CN112214135A (en) | 2021-01-12 |
CN112214135B true CN112214135B (en) | 2022-04-26 |
Family
ID=74048982
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010911951.XA Active CN112214135B (en) | 2020-09-02 | 2020-09-02 | Multi-touch identification method, device and equipment for infrared touch screen |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112214135B (en) |
WO (1) | WO2022047839A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114168055A (en) * | 2021-10-29 | 2022-03-11 | 深圳市联新移动医疗科技有限公司 | Sliding unlocking control method and system and medical cart |
CN116107450A (en) * | 2022-12-02 | 2023-05-12 | 深圳市鸿合创新信息技术有限责任公司 | Touch point identification method and device of infrared touch screen and infrared touch screen |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1932726A (en) * | 2006-10-13 | 2007-03-21 | 广东威创日新电子有限公司 | Digital image sensor locator based on CMOS and locating method |
CN110094846A (en) * | 2019-05-24 | 2019-08-06 | 珠海格力电器股份有限公司 | Air conditioner air sweeping control method and device, storage medium and air conditioning equipment |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11716444B2 (en) * | 2006-02-15 | 2023-08-01 | Kurtis John Ritchey | Human-like emulation enterprise system and method |
CN101882033B (en) * | 2010-07-16 | 2012-07-11 | 广东威创视讯科技股份有限公司 | Method and device for speeding up acquisition of coordinate of touch point |
CN101976334B (en) * | 2010-10-26 | 2014-02-19 | 广东威创视讯科技股份有限公司 | Anti-interference touch identification method and device |
CN103677441B (en) * | 2012-09-18 | 2017-02-08 | 北京汇冠新技术股份有限公司 | Infrared multipoint identification method, infrared multipoint identification device and infrared touch screen |
CN105373267B (en) * | 2015-12-14 | 2018-09-18 | 广州华欣电子科技有限公司 | A kind of infrared touch screen multi-point recognition method, device and infrared touch panel |
CN109814757B (en) * | 2019-01-29 | 2022-05-27 | 京东方科技集团股份有限公司 | Touch detection method and device, touch equipment, computer equipment and readable medium |
- 2020-09-02: CN application CN202010911951.XA filed; patent CN112214135B granted (active)
- 2020-09-18: PCT application PCT/CN2020/116194 filed (WO2022047839A1)
Also Published As
Publication number | Publication date |
---|---|
WO2022047839A1 (en) | 2022-03-10 |
CN112214135A (en) | 2021-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20070036396A1 (en) | | Pupil detection device and iris authentication apparatus |
US8611662B2 (en) | | Text detection using multi-layer connected components with histograms |
CN112214135B (en) | | Multi-touch identification method, device and equipment for infrared touch screen |
CN107437294B (en) | | Character segmentation method, device, equipment and storage medium |
CN110619333B (en) | | Text line segmentation method, text line segmentation device and electronic equipment |
CN109978925A (en) | | Robot pose recognition method and robot thereof |
CN110458998B (en) | | Bill detection method, bill detection device and terminal |
US11593974B2 (en) | | Map building method, computer-readable storage medium and robot |
US20070071287A1 (en) | | Pupil detection device and iris authentication apparatus |
CN107331031A (en) | | Banknote denomination recognition method, device, equipment and storage medium |
WO2024216965A1 (en) | | Image processing method, point cloud generation method, and image processing apparatus |
CN114638294A (en) | | Data enhancement method and device, terminal equipment and storage medium |
CN103399674B (en) | | Multi-point touch detection method and device |
CN108764225B (en) | | Method and device for identifying transversely spliced paper money and electronic equipment |
CN109886189B (en) | | Fingerprint template acquisition method and related device |
CN112629828B (en) | | Optical information detection method, device and equipment |
CN112530079B (en) | | Method, device, terminal equipment and storage medium for detecting bill factors |
CN112733650A (en) | | Target face detection method and device, terminal equipment and storage medium |
CN107945186A (en) | | Image segmentation method, apparatus, computer-readable storage medium and terminal device |
CN109543554B (en) | | Bill detection method, device, terminal and computer-readable storage medium |
CN108510636B (en) | | Image segmentation method, image segmentation device and terminal equipment |
CN116107450A (en) | | Touch point identification method and device of infrared touch screen and infrared touch screen |
CN112416128B (en) | | Gesture recognition method and terminal equipment |
CN114120518A (en) | | Method and device for detecting continuous paper money, electronic equipment and storage medium |
CN112527179A (en) | | Scribble image identification method and device and terminal equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||