[0064] Embodiment Three
[0065] On the basis of the foregoing embodiments, the third embodiment of the present invention further provides an air conditioning control method. Figure 3 shows a schematic flowchart of the air conditioning control method proposed in Embodiment 3 of the present invention. As shown in Figure 3, the air conditioning control method may include steps 310 to 330.
[0066] In step 310, the user's body posture and the user's body motion amplitude information are acquired.
[0067] Here, the user's body posture and the user's body motion amplitude information can be obtained through radar detection or image acquisition.
[0068] In step 320, the user's physical activity state is determined according to the user's body posture and the user's body motion amplitude information.
[0069] Here, the user's activity state is identified by combining body posture recognition with the motion amplitude information. For example, when the user is judged to be in a standing posture, possible activities include chatting, walking, working, and exercising; the motion amplitude information is then used to narrow these down and obtain an accurate activity state.
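As a purely illustrative sketch of this combination step (not the embodiment's own algorithm), the Python snippet below assumes a hypothetical normalized motion amplitude value in [0, 1] together with hypothetical activity labels and thresholds:

```python
# Illustrative only: refine a recognized body posture into an activity state
# using a normalized motion amplitude. Thresholds and labels are assumptions
# for demonstration, not values taken from the embodiment.

def classify_activity(posture: str, motion_amplitude: float) -> str:
    """Combine body posture with motion amplitude to estimate the activity state."""
    if posture == "lying":
        return "sleeping_or_resting"
    if posture == "sitting":
        return "working" if motion_amplitude < 0.3 else "active_sitting"
    if posture == "standing":
        # A standing user may be chatting, walking, working, or exercising;
        # the motion amplitude is used to narrow this down.
        if motion_amplitude < 0.2:
            return "chatting_or_working"
        if motion_amplitude < 0.6:
            return "walking"
        return "exercising"
    return "unknown"

print(classify_activity("standing", 0.7))  # -> exercising
```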
[0070] In step 330, the operation mode of the air conditioner is controlled according to the physical activity state.
[0071] Here, the operation of the air conditioner is controlled according to the user's activity state, including controlling the blowing direction, wind speed, temperature, and humidity of the air conditioner.
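The following minimal Python sketch illustrates one possible mapping from activity state to air conditioner settings; the profile values and field names are assumptions for demonstration only and are not prescribed by the embodiment:

```python
# Illustrative only: map an activity state to air conditioner settings
# (blowing direction, wind speed, temperature, humidity). All profile values
# are assumed example values.

AC_PROFILES = {
    "sleeping_or_resting": {"direction": "indirect", "wind": "low",    "temp_c": 26, "humidity": 55},
    "chatting_or_working": {"direction": "indirect", "wind": "low",    "temp_c": 25, "humidity": 50},
    "working":             {"direction": "indirect", "wind": "low",    "temp_c": 25, "humidity": 50},
    "walking":             {"direction": "swing",    "wind": "medium", "temp_c": 24, "humidity": 50},
    "exercising":          {"direction": "direct",   "wind": "high",   "temp_c": 23, "humidity": 45},
}

def control_air_conditioner(activity_state: str) -> dict:
    """Return the target settings for the detected activity state."""
    return AC_PROFILES.get(activity_state, AC_PROFILES["working"])

print(control_air_conditioner("exercising"))
```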
[0072] In this embodiment, by determining the user's physical activity state according to the user's body posture and the user's body motion amplitude information, the physical activity state can be accurately identified, and the operation mode of the air conditioner can then be controlled according to that state. The user's activity state is thus finely identified, so that the air conditioner can be adjusted precisely to the user's current activity and the user experience is improved.
[0073] In an optional implementation manner, in step 320, determining the user's physical activity state according to the user's body posture and the user's body motion amplitude information includes:
[0074] The user's body posture and the user's body motion amplitude information are matched with the parameter values corresponding to the preset activity state, and the user's activity state is obtained according to the matching result.
[0075] Here, matching the user's body posture and the user's body motion amplitude information against the parameter values corresponding to the preset activity states means inputting the posture and motion amplitude information into a database for matching, the database storing, for different combinations of body posture and motion amplitude information, the parameter values of the corresponding activity states. It is worth noting that, because millimeter wave radars differ in accuracy, these parameters will differ as well; the parameters corresponding to each activity state are therefore set according to the actual situation. As a result, the user's activity state can be accurately recognized from the user's body posture and motion amplitude information.
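A possible sketch of such parameter matching is shown below; the table entries, amplitude ranges, and labels are placeholder assumptions that would in practice be calibrated to the radar's accuracy, as noted above:

```python
# Illustrative only: match measured posture and motion amplitude against a
# table of preset parameter ranges. Ranges and labels are placeholders.

PRESET_ACTIVITY_TABLE = [
    # (posture, amplitude_min, amplitude_max, activity_state)
    ("standing", 0.0, 0.2, "chatting_or_working"),
    ("standing", 0.2, 0.6, "walking"),
    ("standing", 0.6, 1.0, "exercising"),
    ("sitting",  0.0, 1.0, "working"),
    ("lying",    0.0, 1.0, "sleeping_or_resting"),
]

def match_activity(posture: str, amplitude: float) -> str:
    """Look up the activity state whose preset parameter range matches."""
    for p, low, high, state in PRESET_ACTIVITY_TABLE:
        if p == posture and low <= amplitude <= high:
            return state
    return "unknown"
```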
[0076] Figure 4 shows a schematic flowchart of the specific sub-steps of step 310 in Figure 3. In an optional implementation, as shown in Figure 4, obtaining the user's body posture and the user's body motion amplitude information in step 310 may include steps 311 to 313.
[0077] In step 311, the user's body is detected by the millimeter wave radar to obtain the user's point cloud data map.
[0078] Here, the user is detected by the millimeter wave radar, which may be a millimeter wave radar installed on the air conditioner or, depending on actual needs, a millimeter wave radar installed elsewhere in the room, thereby obtaining a point cloud data map of the user; the point cloud data map may contain only the user's body. In addition, scanning may be performed at a preset time interval, so that changes in the user's body motion trajectory can be used to accurately identify the user's activity state.
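As a hedged sketch of scanning at a preset time interval, the snippet below assumes a hypothetical read_radar_frame callable standing in for the actual radar driver interface:

```python
import time

def collect_frames(read_radar_frame, interval_s: float = 0.5, num_frames: int = 10):
    """Collect point cloud frames at a preset time interval so that changes in
    the body motion trajectory can be observed over time."""
    frames = []
    for _ in range(num_frames):
        frames.append(read_radar_frame())  # one point cloud data map of the user's body
        time.sleep(interval_s)
    return frames
```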
[0079] In step 312, the body posture of the user is obtained according to the point cloud data map.
[0080] Here, the millimeter wave radar can obtain detailed real-time 3D point cloud data, including the target's three-dimensional coordinates, distance, azimuth, reflected signal intensity, point code, time, and so on. Therefore, the body posture of the user can be recognized from the point cloud data map.
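One possible in-memory representation of such point cloud data is sketched below; the RadarPoint field names are illustrative assumptions, not part of the embodiment:

```python
from dataclasses import dataclass

@dataclass
class RadarPoint:
    x: float          # metres
    y: float          # metres
    z: float          # metres, along the direction of the user's body height
    distance: float   # metres from the radar
    azimuth: float    # degrees
    intensity: float  # reflected signal intensity
    point_id: int     # point code / identifier
    timestamp: float  # seconds

# A point cloud data map is then simply a collection of such points.
PointCloud = list[RadarPoint]
```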
[0081] It is worth noting that the user's body posture includes one of lying posture, squatting posture, sitting posture, and standing posture.
[0082] In step 313, the user's body motion amplitude information is obtained according to the point cloud data map.
[0083] Here, since the point cloud data map includes the target's three-dimensional coordinates, distance, azimuth, reflected signal intensity, point code, time, and other data, the point cloud data map can be analyzed to obtain the user's body motion amplitude information.
[0084] Thus, by using millimeter wave radar to detect the user, it is possible to accurately acquire the user's body posture and the user's body motion amplitude information.
[0085] Figure 5 shows a schematic flowchart of the specific sub-steps of step 312 in Figure 4. In an optional implementation, as shown in Figure 5, obtaining the user's body posture according to the point cloud data map in step 312 may include steps 3121 to 3122.
[0086] In step 3121, the difference between the maximum Z-axis coordinate value and the minimum Z-axis coordinate value in the point cloud data map is calculated, and the difference is determined as the height of the user, where the Z-axis is the direction of the height of the user's body in the point cloud data map.
[0087] Here, since the point cloud data map contains the three-dimensional coordinates of the user's body, the difference between the maximum Z-axis coordinate value and the minimum Z-axis coordinate value in the point cloud data map is calculated and determined as the height of the user, where the Z-axis is the direction of the height of the user's body in the point cloud data map. It is worth noting that the point cloud data map here is a point cloud data map of the user's body.
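A minimal sketch of this height calculation, assuming the illustrative RadarPoint structure above (points exposing a z coordinate):

```python
def estimate_height(points) -> float:
    """Return max(z) - min(z) over the body point cloud (same unit as z)."""
    zs = [p.z for p in points]
    return max(zs) - min(zs)
```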
[0088] In step 3122, the height is matched against the parameter values corresponding to preset body postures, and the body posture of the user is obtained according to the matching result.
[0089] Here, the recognized height of the user is matched against the parameter values corresponding to the preset body postures, and the user's body posture is obtained according to the matching result. For example, among the preset parameter values corresponding to the body postures, the highest point recognized for the lying posture does not exceed 30 cm, that is, the Z coordinate is less than 30 cm; the highest point recognized for the standing posture needs to remain stable near the user's height; the highest point recognized for the squatting posture is about 40 cm to 60 cm; and a height between 60 cm and the user's standing height corresponds to the sitting posture. In addition, if multiple point cloud data maps are obtained, the difference between the maximum and minimum Z-axis coordinate values in each point cloud data map is still calculated according to the above method, but the average of the differences is taken as the height.
[0090] It is worth noting that height differs with age. Therefore, the specific parameters for the sitting, squatting, lying, and standing postures are set according to actual conditions.
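The following sketch of step 3122 uses the example thresholds given above (lying below 30 cm, squatting about 40 cm to 60 cm, sitting between 60 cm and the standing height, standing near the user's height) and averages the per-frame height when multiple point cloud data maps are available; the standing tolerance and the metres-to-centimetres conversion are assumptions, and the thresholds are passed in so they can be tuned per age group and radar:

```python
def classify_posture(height_cm: float, user_height_cm: float,
                     lying_max_cm: float = 30.0,
                     squat_range_cm: tuple = (40.0, 60.0),
                     standing_tolerance_cm: float = 15.0) -> str:
    """Match a measured height against example posture thresholds.
    The standing tolerance is an assumed value, not from the embodiment."""
    if height_cm < lying_max_cm:
        return "lying"
    if squat_range_cm[0] <= height_cm <= squat_range_cm[1]:
        return "squatting"
    if abs(height_cm - user_height_cm) <= standing_tolerance_cm:
        return "standing"
    if squat_range_cm[1] < height_cm < user_height_cm:
        return "sitting"
    return "unknown"

def posture_from_frames(frames, user_height_cm: float) -> str:
    """With multiple point cloud data maps, average the per-frame
    max(z) - min(z) differences (z assumed to be in metres)."""
    heights_cm = [(max(p.z for p in f) - min(p.z for p in f)) * 100.0 for f in frames]
    return classify_posture(sum(heights_cm) / len(heights_cm), user_height_cm)
```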
[0091] In an optional implementation manner, in step 313, obtaining the user's body motion amplitude information according to the point cloud data map may include:
[0092] Calculate the density information of the moving points in the point cloud data map, and determine the density information as the user's body motion amplitude information, where the density information of the moving points is used to reflect the magnitude of the user's motion amplitude.
[0093] Here, when the human body moves, the micro-motions it produces appear in the echo of the millimeter wave radar, and the characteristics of the user's motion state can be obtained from these micro-motions. In addition, the point cloud data map includes the target's three-dimensional coordinates, distance, azimuth, reflected signal intensity, point code, time, and other data. The more intense the human body's motion, the denser the moving points in the point cloud data map become. Therefore, the magnitude of the human motion amplitude can be judged according to the density of the detected moving points.
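A minimal sketch of estimating motion amplitude from moving-point density is given below; matching points by index between consecutive frames and the movement threshold are simplifying assumptions for illustration, and the point objects are assumed to follow the illustrative RadarPoint structure sketched earlier:

```python
def motion_amplitude(prev_frame, curr_frame, move_threshold_m: float = 0.05) -> float:
    """Fraction of points that moved more than move_threshold_m between two
    consecutive frames; a denser set of moving points means a larger amplitude."""
    n = min(len(prev_frame), len(curr_frame))
    if n == 0:
        return 0.0
    moving = 0
    for a, b in zip(prev_frame[:n], curr_frame[:n]):
        displacement = ((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2) ** 0.5
        if displacement > move_threshold_m:
            moving += 1
    return moving / n
```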