Robot system
A robot system and related robot technology, applied in the field of robot systems, which can solve problems such as reduced work productivity of the robot and excessive restriction of the robot's driving.
Pending Publication Date: 2020-11-24
SEIKO EPSON CORP
7 Cites 0 Cited by
AI-Extracted Technical Summary
Problems solved by technology
[0005] However, in the information processing device described in Patent Document 1, a person is recognized only as an unidentified person, so the driving of the robot is restricted regardless of who approaches.
Therefore, when the robot is...
Method used
[0047] In addition, the control unit 51 has a function of changing the distance based on the result of determining the person by the determination unit 54. Thereby, the distance to which a person performing work in the vicinity of the robot 1 is allowed to approach the robot 1 can be varied according to the result of determining that person. As a result, work can be p...
Abstract
Provided is a robot system that performs control to decelerate or stop the operation of a robot when it is determined that a person is within a predetermined distance from the robot, and that does not readily reduce the productivity of work. The robot system includes a robot configured to operate in cooperation with a person, a specifying section configured to specify a person present in a region within a predetermined distance from the robot, and a control section configured to decelerate or stop the operation of the robot when the presence of the person in the region is specified by the specifying section. The control section changes the distance based on a result of specifying the person by the specifying section.
Application Domain
Programme control · Programme-controlled manipulator +1
Technology Topic
Robotic systems · Roboty +2
Examples
- Experimental program(4)
Example
[0025] 1. First Embodiment
[0026] First, the robot system according to the first embodiment will be described.
[0027] FIG. 1 is a diagram illustrating the robot system according to the first embodiment. FIG. 2 is a schematic diagram of the robot system shown in FIG. 1. FIG. 3 is a block diagram showing a hardware configuration example of the robot system shown in FIGS. 1 and 2. FIG. 4 is a functional block diagram describing FIG. 3 in more detail.
[0028] In FIG. 1, the X-axis, Y-axis, and Z-axis are illustrated as three axes orthogonal to one another. Further, in the present specification, "connected" includes both the case of being directly connected and the case of being indirectly connected through an arbitrary member.
[0029] 1.1 Robot System
[0030] The robot system 100 shown in FIG. 1 performs, for example, work such as holding, transporting, or assembling a work object. The robot system 100 includes: the robot 1; a control device 5 that controls the driving of the robot 1; a force sensor 120 that detects a force acting on the robot 1; and an imaging unit 3 that captures images around the robot 1. These units can communicate with one another, and the communication may be performed by wire or wirelessly, or through a network such as the Internet.
[0031] 1.1.1 Robot
[0032] As shown in FIGS. 1 and 2, the robot 1 is a so-called six-axis vertical articulated robot having a base 110 and a robot arm 10 connected to the base 110.
[0033] The base 110 is the portion by which the robot 1 is installed at an arbitrary installation position. In the present embodiment, the base 110 is installed on, for example, the floor 70 formed by the X-Y plane shown in FIG. 2. The installation position of the base 110 is not limited to the floor 70, and may be, for example, a wall, a ceiling, a movable carriage, or the like. Further, the force sensor 120, which can detect a force acting on the robot 1, is disposed between the base 110 of the robot 1 and the floor 70.
[0034] As shown in FIGS. 1 and 2, the robot arm 10 has an arm 11, an arm 12, an arm 13, an arm 14, an arm 15, and an arm 16. The arms 11 to 16 are coupled in this order from the base 110 side, that is, from the proximal side toward the distal side opposite to the base 110. Each of the arms 11 to 16 is rotatable with respect to the adjacent base 110 or arm. For example, the arm 16 is disc-shaped as shown in FIG. 1 and is rotatable with respect to the arm 15.
[0035] As shown in FIG. 1, a gripping hand 17 that grips a work object is connected to the distal end of the robot arm 10. The gripping hand 17 is replaceable, and a suction hand, a magnetic hand, a screwing tool, an engagement tool, or the like may be used instead of the gripping hand 17.
[0036] As shown in FIG. 4, the robot 1 has drive units 130, each of which includes a motor (not shown) and a speed reducer (not shown) that rotate one arm relative to another arm or the base 110. As the motor, for example, a servo motor such as an AC servo motor or a DC servo motor is used. As the speed reducer, for example, a planetary gear type speed reducer, a wave gear device, or the like is used. Further, the robot 1 has position sensors 140 that detect the rotation angle of the rotary shaft of the motor or the speed reducer. As the position sensor 140, for example, a rotary encoder can be used. The drive units 130 and the position sensors 140 are provided, for example, for each of the base 110 and the arms 11 to 16; in the present embodiment, the robot 1 has six drive units 130 and six position sensors 140. Further, each drive unit 130 is electrically connected to the control device 5 via, for example, a motor driver (not shown) built into the robot 1. Likewise, each position sensor 140 is electrically connected to the control device 5, although this is not shown in FIG. 4.
[0037] In addition, any other component, device, or the like may be provided in the robot 1.
[0038] 1.1.2 Control Device
[0039] As shown in FIG. 4, the control device 5 has a function of controlling the driving of the robot 1 and is communicably connected to the robot 1. The robot 1 and the control device 5 may be connected by wire or wirelessly. Further, the control device 5 is connected to a display device 401 (display unit) including, for example, a display, and to an input device 402 such as a keyboard or a touch panel.
[0040] As shown in FIG. 4, the control device 5 includes: a control unit 51; a storage unit 52; an external input-output unit 53; a determination unit 54 that determines a person around the robot 1; and a registration unit 55 that registers personal information. The structural elements of the control device 5 are communicably connected to one another through various buses.
[0041] The control unit 51 executes various programs and the like stored in the storage unit 52. Thereby, drive control of the robot 1, various determination processes, and other operations are realized. When the determination unit 54 determines that a person is within a predetermined distance range from the robot 1, the control unit 51 of the present embodiment stops or decelerates the operation of the robot 1. Decelerating the operation of the robot 1 means reducing the driving speed of the robot arm 10 and the gripping hand 17. Accordingly, the robot 1 can be prevented from colliding with a person or an object, or the impact can be absorbed if a collision does occur. Further, the control unit 51 also has a function of stopping or decelerating the operation of the robot 1 based on the output from the force sensor 120. Thus, even if the robot 1 collides with a person or an object, the impact can be alleviated.
[0042] The storage unit 52 stores various programs that can be executed by the control unit 51. Further, the storage unit 52 can store various kinds of data received by the external input-output unit 53.
[0043] The external input-output unit 53 includes an external interface for communicating with the robot 1 and connects the display device 401 and the input device 402.
[0044] The determination unit 54 determines a person within a range of a predetermined distance from the robot 1. Here, determining a person means identifying who the person is and measuring the person's position. The method by which the determination unit 54 detects a person is not particularly limited; in the present embodiment, the detection uses the imaging result of the imaging unit 3. Specifically, the contour of a subject is extracted from the captured image and the result is processed, for example by template matching, to detect whether the subject is a person. Besides the method using the imaging result of the imaging unit 3, examples include a method using a distance sensor, a method of measuring a temperature distribution, and a detection method using voice or characteristic motion.
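The patent does not fix a particular detection algorithm beyond contour extraction and matching against a human-shaped template. The following Python/OpenCV sketch shows one plausible form such a check could take; the person-free reference image, area threshold, and aspect-ratio criterion are assumptions for illustration, not part of the original description.

```python
import cv2
import numpy as np

def detect_person_contours(frame_bgr: np.ndarray, background_bgr: np.ndarray,
                           min_area: float = 5000.0) -> list:
    """Return contours whose silhouette roughly matches a standing person."""
    # Difference against a person-free reference image of the work cell.
    diff = cv2.absdiff(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(background_bgr, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    candidates = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue
        x, y, w, h = cv2.boundingRect(contour)
        if h > 1.5 * w:                      # hypothetical "human-shaped" criterion
            candidates.append(contour)
    return candidates
```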
[0045] Further, the determination unit 54 outputs to the control unit 51 information on whether a person is within the range of the predetermined distance from the robot 1 and, when a person is present, information on that person's position.
[0046] As described above, the control unit 51 stops or decelerates the operation of the robot 1, or otherwise limits the driving conditions, based on the information on whether the determination unit 54 has determined that a person is within the region of the predetermined distance from the robot 1. That is, when the determination unit 54 determines that a person is within the range of the predetermined distance from the robot 1, the control unit 51 limits the driving of the robot 1 based on that determination result. Thus, the control unit 51 can suppress collisions based on the presence of a person within the predetermined distance range from the robot 1.
[0047] In addition, the control unit 51 has a function of changing the distance based on the result of determining the person by the determination unit 54. Thereby, the distance to which a person performing work in the vicinity of the robot 1 is allowed to approach the robot 1 can be varied according to the result of determining that person. As a result, more work can be performed in the vicinity of the robot 1 according to the person's work proficiency, qualifications, and the like, and the efficiency of the work can be improved. Consequently, while the driving of the robot 1 is controlled under the predetermined limitation when a person is determined to be within the distance range, safety can be maintained and the productivity of the work of the robot 1 can be increased.
[0048] The registration unit 55 registers personal information. The personal information includes, for example, identification information for identifying an individual and attribute information indicating attributes of the individual. Examples of the identification information include a name, an identification number, a face image, a height, a voice, a motion, and a dominant hand. Examples of the attribute information include work proficiency, qualifications, work history, and work motion trajectory.
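As a minimal sketch of how such a personal-information record might be structured in software, the following dataclass groups identification and attribute information together with the per-person control parameters discussed later (distance L1, permitted speed, allowable load). The field names and default values are illustrative assumptions; the patent does not define a concrete schema.

```python
from dataclasses import dataclass

@dataclass
class PersonRecord:
    # Identification information
    name: str
    id_number: str
    face_image_path: str
    height_cm: float
    # Attribute information
    proficiency: str                    # e.g. "skilled", "trained", "novice"
    qualified: bool = False
    # Hypothetical per-person control parameters used by the control unit 51
    limit_distance_l1_m: float = 3.0    # distance L1 defining region 1000
    speed_ratio_in_region: float = 0.2  # permitted speed ratio inside region 1000
    allowable_load_n: float = 50.0      # force threshold before stop/deceleration

# The registration unit 55 can then be modelled as a simple keyed store.
registry: dict[str, PersonRecord] = {}

def register(person: PersonRecord) -> None:
    registry[person.id_number] = person
```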
[0049] The hardware configuration that realizes the functions of each unit of the control device 5 is not particularly limited; for example, as illustrated in FIG. 3, it includes a computer 62 and a communication controller 61 that communicably connects the computer 62 to the robot 1.
[0050] The processor shown in FIG. 3 includes, for example, a CPU (Central Processing Unit), an FPGA (Field-Programmable Gate Array), an ASIC (Application Specific Integrated Circuit), or the like.
[0051] The memory shown in FIG. 3 includes, for example, a volatile memory such as a RAM (Random Access Memory) and a nonvolatile memory such as a ROM (Read Only Memory). Further, the memory is not limited to a non-removable type and may have a detachable external storage device.
[0052] The external interface shown in FIG. 3 includes various communication connectors. Examples include a USB (Universal Serial Bus) connector, an RS-232C connector, a wired LAN (Local Area Network), and a wireless LAN.
[0053] Further, at least two of the control unit 51, the determination unit 54, and the registration unit 55 may be realized by a single hardware element.
[0054] Further, other structures may be added to the above configuration of the control device 5. Further, the various programs and data stored in the storage unit 52 may be stored in the storage unit 52 in advance, may be stored in a recording medium such as a CD-ROM and supplied from the recording medium, or may be supplied through a network or the like.
[0055] 1.1.3 Force Sensor
[0056] The force sensor 120 shown in FIG. 1 is disposed below the base 110 of the robot 1. Accordingly, a force acting on the gripping hand 17 or the robot arm 10 can be detected.
[0057] As the force sensor 120, for example, a six-axis force sensor, a three-axis force sensor, or the like is preferably used. With such a force sensor, the force can be detected with high accuracy. Further, since both the magnitude and the direction of the force can be detected, it is possible to grasp, for example, in which direction a force acts on the gripping hand 17 or the robot arm 10. The force sensor 120 converts the detected force into an electric signal and outputs the electric signal to the control device 5.
[0058] Further, the control unit 51 of the control device 5 stops or decelerates the operation of the robot 1 based on the output from the force sensor 120. Accordingly, even when the gripping hand 17, the robot arm 10, or the like collides with an obstacle, the impact can be limited, for example minimized.
[0059] 1.1.4 Imaging Unit
[0060] The imaging unit 3 shown in FIGS. 1 and 2 is installed on the floor 70 and can capture images around the robot 1.
[0061] Although not shown, the imaging unit 3 includes, for example, an imaging element constituted by a CCD (Charge Coupled Device) image sensor having a plurality of pixels, and an optical system including a lens. The imaging unit 3 forms an image of the light from a subject or the like on the light receiving surface of the imaging element through the lens, converts the light into an electric signal, and outputs the electric signal to the control device 5. The imaging unit 3 is not limited to this structure as long as it has an imaging function, and may have another structure.
[0062] In addition, the position of the imaging unit 3 is not limited to the illustrated position. For example, the imaging unit 3 may be provided on the ceiling or a wall of the room in which the robot 1 is installed. Further, depending on the position of the imaging unit 3, a person's face may not be easily captured; in that case, an identification tag such as a two-dimensional bar code may be affixed to the surface of, for example, a helmet worn by the person.
[0063] Further, the imaging unit 3 may be provided as needed and may be replaced by other equipment capable of detecting a person. As equipment capable of detecting a person, for example, when the person carries a device that transmits light, infrared rays, radio waves, ultrasonic waves, or the like, a receiver that obtains the direction of the transmission and the distance to the carried device can be used.
[0064] 1.1.5 Display Device and Input Device
[0065] The display device 401 shown in FIG. 4 includes a display and has a function of displaying various screens and the like. Thus, the operator can confirm the driving state of the robot 1 and the like via the display device 401.
[0066] The input device 402 includes, for example, a keyboard. Thus, the operator can give instructions for various processes to the control device 5 by operating the input device 402. Although not shown, the input device 402 may include, for example, a teaching pendant.
[0067] Further, instead of the display device 401 and the input device 402, a display input device having the functions of both the display device 401 and the input device 402 may be used. As the display input device, for example, a touch panel display can be used. Further, the robot system 100 may have one display device 401 and one input device 402 each, or may have a plurality of them.
[0068] 1.2 Method of Controlling the Robot
[0069] Next, a method of controlling a robot according to a first embodiment will be described.
[0070] FIG. 5 is a flowchart illustrating the method by which the control device 5 controls the robot 1. FIGS. 6 and 7 are diagrams used to illustrate the method of controlling the robot 1 shown in FIG. 5.
[0071] First, in step S01 shown in FIG. 5, the imaging unit 3 captures an image of an imaging range 30 set around the robot 1, as shown in FIGS. 6 and 7. The imaging range 30 is determined by, for example, the optical system of the imaging unit 3. The acquired image is output to the determination unit 54.
[0072] Then, in step S02 shown in FIG. 5, the determination unit 54 detects whether a person 9 is within the imaging range 30 shown in FIGS. 6 and 7. At this stage, even when a person 9 is detected within the imaging range 30, the person does not need to be collated with the personal information database of the registration unit 55; it is sufficient to detect only an unidentified person 9. For example, the contour of a subject is extracted from the image captured by the imaging unit 3, and if the shape of the contour is human-shaped, a person 9 can be detected. When a person 9 is detected within the imaging range 30, the process proceeds to the next step S03. On the other hand, when no person 9 is detected within the imaging range 30, the process returns to step S01.
[0073] Then, in step S03 shown in FIG. 5, the determination unit 54 determines, based on the acquired image, whether face authentication can be performed. Face authentication refers to collating the features of the face in the acquired image with the face images registered in the database of the registration unit 55. In this step, whether face authentication can be performed is determined based on, for example, the quality of the image, the size of the captured person, and the angle of the face. If face authentication can be performed, the process moves to the next step S04. On the other hand, if face authentication cannot be performed, the operation of the robot 1 is not started and the process ends. Cases in which face authentication cannot be performed include, for example, the case in which only a portion other than the face, such as the hands or feet, has been captured. This face authentication ensures that only persons registered in the database work in close collaboration with the robot 1. As a result, a robot system 100 with particularly high security can be realized.
[0074] Then, in step S04 shown in FIG. 5, the determination unit 54 performs face authentication based on the acquired image. In the face authentication, it is preferable to narrow the result down to one individual among the face images registered in the database of the registration unit 55, but the result may also be limited to a plurality of candidates.
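The patent does not specify how the face features are compared with the registered face images. As an illustrative stand-in only, the sketch below uses the open-source `face_recognition` package to compare face embeddings and return the registered names within a distance tolerance, matching the idea that step S04 may yield one person or several candidates. The tolerance value and data layout are assumptions.

```python
import face_recognition
import numpy as np

def authenticate_face(captured_image: np.ndarray,
                      registered_encodings: dict[str, np.ndarray],
                      tolerance: float = 0.6) -> list[str]:
    """Return the registered names whose stored face encoding matches the captured face.

    captured_image is an RGB image as a NumPy array; registered_encodings maps a
    name to a 128-d encoding prepared from the registered face image.
    """
    encodings = face_recognition.face_encodings(captured_image)
    if not encodings:
        return []                        # no usable face; step S03 would already have aborted
    probe = encodings[0]
    names = list(registered_encodings.keys())
    distances = face_recognition.face_distance(
        np.array([registered_encodings[n] for n in names]), probe)
    # All candidates within the tolerance; step S05 narrows these down further.
    return [n for n, d in zip(names, distances) if d <= tolerance]
```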
[0075] Then, in step S05 shown in FIG. 5, the individual is determined from the result of the face authentication and the other elements registered in the database. If the result of the face authentication is narrowed down to one person in the captured image, that person is the person to be determined in this step. When there are a plurality of candidates, the result of the face authentication may be combined with other elements as the final elements used to identify the person, for example a specific gesture performed by the person, a specific pattern worn by the person in the image, or the person's height.
[0076] Further, when the person subjected to face authentication is not registered in the database, that person can be processed as an unregistered person.
[0077] In addition, the face authentication is not essential, and the individual may be determined only from the other elements.
[0078] Then, in step S06 shown in FIG. 5, the control unit 51 first determines the operation mode of the robot 1 based on the result of determining the person by the determination unit 54. The operation mode refers to the operation content permitted for the determined person. That is, although the operation of the robot 1 is limited according to the distance from the robot 1 to the person 9 as described later, the operation mode determined in this step is the operation content applied before that distance-based limitation. The operation mode can therefore be determined based on the personal information stored in advance in the database of the registration unit 55.
[0079] Specific examples of the items determined as the operation mode include the driving speed of the robot 1 permitted when a person intrudes into a predetermined region 1000 described later, and the threshold of the force parameter up to which the robot 1 can continue driving when the robot 1 is contacted. For example, when the determined person is a person responsible for the work of the robot 1, the permitted driving speed of the robot 1 can be increased, or the threshold of the force up to which driving continues can be increased. On the other hand, when the person is not registered, the driving speed and the threshold can be set to the minimum.
[0080] Then, in step S07 shown in FIG. 5, the control unit 51 calculates the distance L9 from the robot 1 to the person 9, as shown in FIGS. 6 and 7. The distance L9 is determined, for example, by extracting from the image the positional relationship between the foot of the robot 1 and the feet of the person 9 on the ground.
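One common way to turn the image positions of the robot's foot and the person's feet into a metric separation on the floor is a fixed image-to-floor homography, assuming the camera and the floor plane are calibrated in advance. The calibration points and coordinates below are hypothetical placeholders; the patent only states that the positional relationship of the feet on the image is used.

```python
import cv2
import numpy as np

# Four floor points: pixel coordinates in the camera image and the matching
# coordinates (in metres) on the X-Y floor plane. These calibration values are
# hypothetical placeholders.
image_pts = np.array([[100, 600], [1180, 600], [980, 250], [300, 250]], dtype=np.float32)
floor_pts = np.array([[0.0, 0.0], [4.0, 0.0], [4.0, 3.0], [0.0, 3.0]], dtype=np.float32)
H, _ = cv2.findHomography(image_pts, floor_pts)

def pixel_to_floor(pt_xy) -> np.ndarray:
    """Map an image point assumed to lie on the floor to floor coordinates."""
    src = np.array([[pt_xy]], dtype=np.float32)          # shape (1, 1, 2)
    return cv2.perspectiveTransform(src, H)[0, 0]

def distance_l9(robot_foot_px, person_foot_px) -> float:
    """Euclidean floor distance between the robot's foot and the person's feet."""
    return float(np.linalg.norm(pixel_to_floor(robot_foot_px) - pixel_to_floor(person_foot_px)))
```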
[0081] Then, in step S08 shown in FIG. 5, the control unit 51 determines, based on the result of determining the person 9, whether to change the distance L1 at which the driving of the robot 1 is limited from a standard distance L0. Specifically, as shown in FIGS. 6 and 7, when the region in which the driving of the robot 1 is limited upon intrusion of the person 9 is defined as a region 1000, the distance from the outer edge of the region 1000 to the robot 1 is the distance L1. As described above, when a person intrudes into the region 1000, the control unit 51 of the present embodiment stops or decelerates the operation of the robot 1. The distance L1 therefore determines the timing at which the driving of the robot 1 is limited as the person 9 approaches the robot 1.
[0082] Further, in the present embodiment, the control unit 51 determines, based on the result of determining the person 9, whether to change the distance L1 from the standard distance L0; when the distance needs to be changed, the process moves to the next step S09. An example of step S09 is performed as follows.
[0083] Specifically, when the determined person 9 is, for example, a person with no work experience, the distance L1 is not changed. FIG. 6 shows an example in which the distance L9 from the robot 1 to the person 9 is smaller than the distance L1. Accordingly, in FIG. 6, since the person 9 is inside the region 1000, the driving of the robot 1 is limited. As a result, even if a person with low proficiency, such as a person with no work experience, approaches the robot 1 too closely and intrudes into the region 1000, a collision with the robot 1 can be prevented or its impact can be mitigated.
[0084] On the other hand, when the determined person 9 is, for example, a person responsible for the work, the distance L1 can be made shorter than the standard distance L0. FIG. 7 illustrates a situation in which the distance L1 has been made smaller than the distance L1 shown in FIG. 6. In FIG. 7, as a result of this change, the changed distance L1 is smaller than the distance L9 from the robot 1 to the person 9. Since the region 1000 thereby becomes narrower, the person 9 is outside the region 1000 as shown in FIG. 7, and the driving of the robot 1 is not limited even in a state in which the person 9 is close to the robot 1. Thus, if the person 9 is responsible for the work, the robot 1 can perform the work efficiently even in a state in which the person 9 is close.
[0085] Further, the distance L1 may be registered in advance, for example, in the database of the registration unit 55. Table 1 below shows an example of a data table of the distance L1 and other data registered in the database of the registration unit 55.
[0086] 【Table 1】
[0087]
[0088] In addition, the driving speed in Table 1 is a ratio relative to the maximum speed.
[0089] In Table 1, the person with name A is a skilled operator who has fully received the prescribed education. The person with name B has lower work proficiency and less experience than the person with name A, but has also received the prescribed education. The person with name C has lower work proficiency than the person with name B and has not received the prescribed education, that is, a so-called inexperienced person.
[0090] Among the names A to C, the person with name A has the smallest distance L1, and the reduction in driving speed upon intrusion into the region 1000 is also the smallest. The person with name C has the largest distance L1, and the reduction in driving speed upon intrusion into the region 1000 is also the largest. The person with name B has a distance L1 intermediate between those of the persons with names A and C, and the reduction in driving speed upon intrusion into the region 1000 is likewise intermediate.
[0091] Further, in step S08, when the set distance L1 does not need to be changed, it may be left unchanged. In that case, step S09 is omitted and the process moves to step S10. The initial value of the distance L1 in step S08 can be set to the standard distance L0. The standard distance L0 is the distance to be set when the result of determining the person 9 is an unregistered person, and is set to substantially the maximum distance that can be set.
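As a minimal sketch of the decision in steps S08 and S09, the function below looks up a per-person distance L1 from the registration unit's table and falls back to the standard distance L0 for an unregistered person. Since the contents of Table 1 are not reproduced here, the numeric values only mirror the ordering described in the text (name A smallest, name C largest) and are otherwise hypothetical.

```python
STANDARD_DISTANCE_L0 = 5.0   # metres; "substantially the maximum settable distance"

TABLE_1 = {                   # hypothetical entries in the registration unit 55
    "A": {"distance_l1": 1.0, "speed_ratio_in_region": 0.8},
    "B": {"distance_l1": 2.0, "speed_ratio_in_region": 0.5},
    "C": {"distance_l1": 3.0, "speed_ratio_in_region": 0.2},
}

def select_distance_l1(person_name) -> float:
    """Steps S08-S09: change L1 from the standard distance L0 only for a registered person."""
    if person_name is None or person_name not in TABLE_1:
        return STANDARD_DISTANCE_L0       # unregistered person: keep the standard distance
    return TABLE_1[person_name]["distance_l1"]
```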
[0092] After step S09, the process moves to step S10. In step S10, it is determined whether the person 9 is inside the region 1000 defined by the changed distance L1. That is, it is determined whether the changed distance L1 is equal to or greater than the distance L9 from the robot 1 to the person 9. When the person 9 is inside the region 1000, that is, in the state of FIG. 6, the process proceeds to step S11. On the other hand, when no person 9 is inside the region 1000, that is, in the state of FIG. 7, the process proceeds to step S12.
[0093] Then, in step S11 shown in FIG. 5, the operation content is determined by further restricting the determined operation mode. That is, in step S11, the person 9 is at a position closer to the robot 1 than the changed distance L1, so the driving of the robot 1 needs to be further restricted.
[0094] On the other hand, in step S12 shown in FIG. 5, the determined operation mode is adopted as the operation content of the robot 1 as it is. That is, in step S12, the person 9 is at a position farther from the robot 1 than the changed distance L1, so it is not necessary to additionally restrict the driving of the robot 1.
[0095] Then, in step S13 shown in FIG. 5, the control unit 51 operates the robot 1 with the determined operation content.
[0096] Then, in step S14 shown in FIG. 5, it is determined whether to end the operation of the robot 1. When the operation of the robot 1 is to be ended, the flow is terminated. On the other hand, when the operation of the robot 1 is to be continued, the flow returns to step S01.
[0097] As described above, the robot system 100 according to the first embodiment includes: the robot 1, which operates in cooperation with a person; the determination unit 54, which determines a person 9 present in the region 1000 within the predetermined distance L1 from the robot 1; and the control unit 51, which stops or decelerates the operation of the robot 1 when the determination unit 54 determines that a person 9 is in the region 1000. The control unit 51 is configured to change the distance L1 based on the result of determining the person 9 by the determination unit 54.
[0098] In such a robot system 100, when a person 9 approaches the robot 1, the driving of the robot 1 is not limited uniformly; instead, the system is configured to change the distance L1 based on the result of determining the person 9. Thus, for example, by reducing the distance L1 for a person with high work proficiency, the driving of the robot 1 is not limited even in a state in which that person approaches the robot 1, and the robot 1 can perform the work efficiently. That is, according to the present embodiment, it is possible to realize a robot system 100 that performs control to decelerate or stop the operation of the robot 1 when it is determined that a person 9 is within the predetermined region 1000 around the robot 1, and that does not readily reduce the productivity of the work.
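Pulling the steps of FIG. 5 together, the following condensed loop is a sketch of one cycle from image capture (S01) through the decision on the operation content (S11/S12), reusing the illustrative helpers sketched above (detect_person_contours, authenticate_face, distance_l9, select_distance_l1, TABLE_1). All of these helpers and the speed values are assumptions for illustration rather than the literal control program of the patent.

```python
import cv2

def control_cycle(frame_bgr, background_bgr, registered_encodings, robot_foot_px):
    """One pass through steps S01-S13, returning the permitted speed ratio."""
    persons = detect_person_contours(frame_bgr, background_bgr)      # S01-S02
    if not persons:
        return {"speed_ratio": 1.0}                                  # nobody nearby

    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    candidates = authenticate_face(rgb, registered_encodings)        # S03-S05
    person_name = candidates[0] if candidates else None

    # S06: operation mode (permitted speed inside region 1000) for this person.
    mode = TABLE_1.get(person_name, {"speed_ratio_in_region": 0.0})

    x, y, w, h = cv2.boundingRect(persons[0])
    l9 = distance_l9(robot_foot_px, (x + w / 2.0, y + h))            # S07: feet at bottom of box
    l1 = select_distance_l1(person_name)                             # S08-S09

    if l9 <= l1:                                                     # S10-S11: inside region 1000
        return {"speed_ratio": mode["speed_ratio_in_region"]}
    return {"speed_ratio": 1.0}                                      # S12: no additional restriction
```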
Example
[0099] 2. Second Embodiment
[0100] Next, the robot system according to a second embodiment will be described.
[0101] FIG. 8 is a schematic view of the robot system according to the second embodiment.
[0102] Hereinafter, the second embodiment will be described; the following description focuses on the differences from the first embodiment, and explanations of the same items are omitted. In FIG. 8, the same components as in the first embodiment are denoted by the same reference numerals.
[0103] The second embodiment is the same as the first embodiment except for the control by the control unit 51.
[0104] In the first embodiment, the region 1000 is set around the robot 1, and the driving of the robot 1 is limited when the person 9 intrudes into the region 1000. In contrast, in the present embodiment, the restriction of the driving of the robot 1 is divided into two stages. That is, another region 2000 is set inside the region 1000, and when the person 9 intrudes into the region 2000, the driving mode of the robot 1 is restricted more strictly.
[0105] FIG. 8 shows an example in which the region 2000, in which the driving of the robot 1 is limited more strictly, is set inside the region 1000 shown in FIG. 6. When the distance from the outer edge of the region 2000 to the robot 1 is defined as a distance L2, the distance L2 is smaller than the distance L1. When the person 9 intrudes into the region 2000, the permitted driving speed of the robot 1, the threshold of the force parameter up to which driving can be continued when the robot 1 is contacted, and the like are set more strictly than when the person 9 intrudes into the region 1000.
[0106] Further, the distances L1 and L2 can be registered in advance, for example, in the database of the registration unit 55. Table 2 below shows an example of a data table of the distances L1 and L2 and other data registered in the database of the registration unit 55.
[0107] 【Table 2】
[0108]
[0109] Further, Table 2 is described in the same manner as Table 1.
[0110] Among the names A to C, the person with name A has the smallest distance L2, and the reduction in driving speed upon intrusion into the region 2000 is also the smallest. The person with name C has the largest distance L2, and the robot 1 is stopped when that person intrudes into the region 2000. The person with name B has a distance L2 intermediate between those of the persons with names A and C, and the reduction in driving speed upon intrusion into the region 2000 is likewise intermediate.
[0111] The region 2000 is set, for example, for work in which the person 9 directly contacts the robot arm 10 of the robot 1 and teaches an operation by operating the robot, that is, so-called direct teaching. In direct teaching, the person 9 holds the robot arm 10 and guides it in the direction of the target work. Therefore, the person 9 is in the vicinity of the robot 1, and applying a force to the robot 1 is permitted. That is, the distance L2 is the shortest distance at which work can be performed while the person is in contact with the robot 1, for example the shortest distance between the base 110 and the feet of the person 9. The distances L1 and L2 are, however, not limited to the above definitions.
[0112] From this viewpoint, the control unit 51 of the present embodiment sets the region 2000 and, even when the person 9 intrudes into the region 2000, continues to control the robot 1 in a low-speed driving mode. Further, even when the force sensor 120 detects a force acting on the robot 1, the control unit 51 allows the driving of the robot 1 to continue until the force reaches a predetermined magnitude. By performing such control, direct teaching can be carried out efficiently, so the productivity of work by the robot system 100 can be further improved.
[0113] In other words, the robot system 100 of the present embodiment includes the force sensor 120 that detects a force acting on the robot 1, and the control unit 51 decelerates or stops the operation of the robot 1 as the driving restriction when the force detected by the force sensor 120 is equal to or greater than a predetermined value, that is, the allowable load shown in Table 2 above.
[0114] Thus, safety can be ensured even when work such as direct teaching is performed.
[0115] Further, the control unit 51 has a function of changing the predetermined value, that is, the threshold serving as the allowable load in Table 2 that restricts the driving of the robot 1, based on the result of determining the person 9 by the determination unit 54. With such a function, for example, when a person with high work proficiency, such as the person with name A in Table 2, performs the work, the conditions under which the driving of the robot 1 is restricted can be relaxed. This makes it possible to ensure the safety of the robot 1 and to further improve the efficiency of the work.
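The two-stage restriction of this embodiment can be summarized as a small decision function: the permitted speed depends on whether the person is inside region 2000 (distance L2), inside region 1000 (distance L1), or outside both, and driving stops once the force measured by the force sensor 120 reaches the person's allowable load. The sketch below assumes hypothetical values that only respect the ordering described for Table 2 (name A most permissive, name C stopped inside region 2000).

```python
TABLE_2 = {   # hypothetical values; only the ordering follows the text
    "A": {"l1": 1.0, "l2": 0.3, "speed_in_1000": 0.8, "speed_in_2000": 0.4, "load_n": 80.0},
    "B": {"l1": 2.0, "l2": 0.6, "speed_in_1000": 0.5, "speed_in_2000": 0.2, "load_n": 50.0},
    "C": {"l1": 3.0, "l2": 1.0, "speed_in_1000": 0.2, "speed_in_2000": 0.0, "load_n": 20.0},
}

UNREGISTERED = {"l1": 5.0, "l2": 1.5, "speed_in_1000": 0.0, "speed_in_2000": 0.0, "load_n": 10.0}

def restrict_drive(person_name: str, distance_l9_m: float, measured_force_n: float) -> float:
    """Return the permitted speed ratio (0.0 means stop) for the current situation."""
    row = TABLE_2.get(person_name, UNREGISTERED)
    if measured_force_n >= row["load_n"]:   # force sensor 120 exceeds the allowable load
        return 0.0
    if distance_l9_m <= row["l2"]:          # inside region 2000: strictest restriction
        return row["speed_in_2000"]
    if distance_l9_m <= row["l1"]:          # inside region 1000
        return row["speed_in_1000"]
    return 1.0                              # outside both regions: no restriction
```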
[0116] With the second embodiment described above, the same effects as those of the first embodiment can also be obtained.
Example
[0117] 3. Third Embodiment
[0118] Next, the robot system according to the third embodiment will be described.
[0119] FIGS. 9 to 13 are schematic views of the robot system according to the third embodiment.
[0120] Hereinafter, the third embodiment will be described; the following description focuses on the differences from the first embodiment, and explanations of the same items are omitted. In FIGS. 9 to 13, the same components as in the first embodiment are denoted by the same reference numerals.
[0121] The third embodiment is the same as the first embodiment except for the control by the control unit 51.
[0122] FIG. 9 shows an example in which two persons, 9A and 9B, are inside the imaging range 30. On the other hand, FIG. 10 shows an example in which three persons, 9A, 9B, and 9C, are inside the imaging range 30. The persons 9A, 9B, and 9C shown in FIGS. 9 and 10 are determined by the determination unit 54. For convenience of explanation, the persons 9A, 9B, and 9C have personal information registered in the database of the registration unit 55 with the attributes corresponding to the names A, B, and C in Table 2. That is, the person 9A is the person with name A in Table 2, the person 9B is the person with name B in Table 2, and the person 9C is the person with name C in Table 2.
[0123] When many persons are around the robot 1 in this way, a person can more easily come close to the robot 1. Therefore, when limiting the driving of the robot 1, the control unit 51 may be configured to change the distance L1 based on the number of persons determined by the determination unit 54.
[0124] Specifically, comparing the case in which there are two persons within the imaging range 30 as shown in FIG. 9 with the case in which there are three persons within the imaging range 30 as shown in FIG. 10, the probability that a condition arises in which the robot 1 contacts a person is considered to be higher in FIG. 10. Therefore, the distance L1 is set longer in FIG. 10 than in FIG. 9. Thus, safety can be ensured even when there is more than one person around the robot 1.
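A minimal sketch of this idea, assuming a simple linear rule: the distance L1 is lengthened by a fixed margin for every additional person determined within the imaging range 30. The margin and cap are hypothetical; the patent only states that L1 is changed based on the number of persons.

```python
def distance_l1_for_crowd(base_l1: float, person_count: int,
                          per_person_margin: float = 0.5,
                          max_l1: float = 5.0) -> float:
    """Lengthen L1 by a margin for every person beyond the first."""
    extra = per_person_margin * max(0, person_count - 1)
    return min(base_l1 + extra, max_l1)

# e.g. two persons as in FIG. 9 vs. three persons as in FIG. 10:
# distance_l1_for_crowd(2.0, 2) -> 2.5,  distance_l1_for_crowd(2.0, 3) -> 3.0
```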
[0125] In addition, FIG. 11 shows a state in which a person 9C has been added to the situation of the region 1000 shown in FIG. 9.
[0126] In such a situation, the robot system 100 includes the registration unit 55 that registers the personal information, and, regarding a person registered in the registration unit 55 as a registered subject, the control unit 51 may be configured to change the distance L1 based on the registered subjects determined by the determination unit 54 with respect to the region 1000.
[0127] Specifically, whereas no one is in the region 1000 in FIG. 9, in FIG. 11 the person 9C has been added while the person 9A remains at the same location. In such a situation, although the added person is outside the region 1000 shown in FIG. 9, the increase in the number of registered subjects makes it more probable that the robot 1 will come into contact with a person. Therefore, the distance L1 is set longer in FIG. 11 than in FIG. 9. Thus, safety can also be ensured even when there is more than one person around the robot 1.
[0128] In addition, Table 3 below shows, as an example, the distances L1 and L2 and other data assumed for the case in which a person whose personal data is registered in the database of the registration unit 55 is determined within the imaging range 30.
[0129] 【table 3】
[0130]
[0131] Further, Table 3 is described in the same manner as Table 1. As shown in Table 3, when a plurality of persons are registered in advance, it is preferable to also consider the case in which a plurality of persons are determined simultaneously. In that case, the content of the driving restriction may differ between the first person and the second person.
[0132] Further, Table 4 below shows, as an example, the distances L1 and L2 and other data assumed for the cases in which the person determined within the imaging range 30 and registered in the database of the registration unit 55 is a person responsible for the work and in which the person is not responsible for the work.
[0133] 【Table 4】
[0134]
[0135] Further, Table 4 is described in the same manner as Table 1. As shown in Table 4, the content of the driving restriction may differ depending on whether the person is responsible for the work.
[0136] In addition, FIG. 12 and FIG. 13 are examples in which persons whose work proficiencies differ from each other are inside the imaging range 30.
[0137] In FIG. 12, the person 9A is inside the imaging range 30. On the other hand, in FIG. 13, the person 9C is inside the imaging range 30. The robot system 100 shown in FIGS. 12 and 13 includes the registration unit 55 in which the personal information of the person 9A (first person), whose work proficiency is a first proficiency, and the personal information of the person 9C (second person), whose work proficiency is a second proficiency lower than the first proficiency, are registered.
[0138] At this time, when the control unit 51 determines, based on the information registered in the registration unit 55, that the person determined by the determination unit 54 is the person 9A, the distance from the outer edge of the region 1000 to the robot 1 is set to a first distance LL1. On the other hand, when the control unit 51 determines, based on the information registered in the registration unit 55, that the person determined by the determination unit 54 is the person 9C, the distance from the outer edge of the region 1000 to the robot 1 is set to a second distance LL2 longer than the first distance LL1.
[0139] By performing such control, the driving of the robot 1 is not easily limited even in a state in which the person 9A, whose work proficiency is relatively high, approaches the robot 1. Accordingly, the robot 1 can be made to perform the work efficiently even in a state in which the person 9A is close.
[0140] In addition, as shown in FIG. 11, when it is determined that both the person 9A (first person), whose work proficiency is the first proficiency, and the person 9C (second person), whose work proficiency is the second proficiency, are present, the control unit 51 preferably sets the distance from the outer edge of the region 1000 to the robot 1 to the second distance LL2. That is, when the persons 9A and 9C, whose proficiencies differ from each other, are both inside the imaging range 30, it is preferable to set the distance from the outer edge of the region 1000 to the robot 1 in accordance with the person 9C, whose proficiency is lower.
[0141] This makes it possible to ensure the safety of both the persons 9A and 9C in the robot system 100.
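A minimal sketch of this rule: the region 1000 is sized for the least proficient person currently present, which is equivalent to taking the largest of the per-person distances. The LL1 and LL2 values below are hypothetical.

```python
# Hypothetical per-person distances: LL1 for the high-proficiency person 9A,
# LL2 for the low-proficiency person 9C.
PER_PERSON_DISTANCE = {"9A": 1.0, "9C": 3.0}

def region_distance(present_persons: list[str], default: float = 5.0) -> float:
    """Use the largest registered distance among the persons currently present."""
    distances = [PER_PERSON_DISTANCE.get(p, default) for p in present_persons]
    return max(distances, default=default)

# region_distance(["9A"])        -> 1.0 (first distance LL1)
# region_distance(["9A", "9C"])  -> 3.0 (second distance LL2, set for the lower-proficiency person)
```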