WO2009113339A1 - Motion teaching system and motion teaching method - Google Patents
Motion teaching system and motion teaching method
- Publication number
- WO2009113339A1 (PCT/JP2009/051741)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gripping
- pattern
- operator
- workpiece
- robot
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39484—Locate, reach and grasp, visual guided grasping
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39543—Recognize object and plan hand shapes in grasping movements
Definitions
- the present invention relates to a motion teaching system and a motion teaching method for teaching motion to a robot that grips a workpiece.
- A technique has been researched and developed in which a robot is given a shape model of a workpiece in advance, autonomously recognizes a workpiece existing in the work space, and automatically grips the workpiece (see, for example, Patent Document 4).
- Patent Documents 1 to 3 disclose techniques in which an operator teaches a robot using a motion teaching device having a graphic display function.
- the invention disclosed in Patent Document 1 was made by the inventors of the present application.
- It relates to a motion teaching device that teaches the motion of a robot in three-dimensional space using a two-dimensional display.
- the motion teaching apparatus can be applied when a robot grips a workpiece whose shape and three-dimensional position in the work space are unknown. Teaching to the robot using the motion teaching device is performed as follows.
- a photographed image of the robot work space including the workpiece is displayed on a two-dimensional display.
- The operator looks at the two-dimensional display on which the photographed image is shown and draws geometric elements using a parametric modeling method such as that used in CAD (Computer Aided Design), with an input device such as a mouse or a touch panel on the display surface.
- Specifically, in order to model the shape of the workpiece, the operator adjusts the position, orientation, and shape parameters of the model on the screen so that the model matches the workpiece image, and also sets a coordinate system at the center of gravity of the workpiece model, to which translation and rotation are added as necessary so that the workpiece can be easily gripped.
- This makes it possible for the robot to grip the workpiece.
- A distance measuring device such as a laser range finder or a stereo camera is used to obtain the three-dimensional position of the workpiece.
- Patent Document 2 discloses a motion teaching device including a storage device that stores a shape model of an end effector provided at the tip of a robot arm, and a display that displays the end effector shape.
- The motion teaching apparatus (1) accepts the operator's designation of a point on the end effector on the display screen on which the end effector shape is displayed, (2) calculates the three-dimensional position of the designated point, (3) accepts the operator's input of a posture change centered on the designated point, and (4) adjusts the posture of the end effector based on the three-dimensional position of the designated point and the accepted posture change.
- In other words, the motion teaching device of Patent Document 2 uses graphic display to support the operator in inputting a change in posture of an end effector whose position and shape are known.
- the motion teaching apparatus of Patent Document 2 does not teach a robot to operate a workpiece whose shape and three-dimensional position are unknown.
- Patent Document 3 discloses a motion teaching device for teaching a robot a work point of a workpiece whose shape and three-dimensional position are known using a graphic display. For example, when a vehicle body is used as a workpiece, the motion teaching device is used for teaching a welding work point on the body.
- the motion teaching device of Patent Document 3 has a storage device in addition to a graphic display.
- the storage device stores shape data of a workpiece approximated by a polyhedron based on a world coordinate system which is a common coordinate system for the robot and the workpiece.
- the motion teaching device accesses the storage device, displays the work shape two-dimensionally on the graphic display, and accepts designation of the work surface and work point of the work by the operator on the display screen.
- Further, the motion teaching device displays the robot posture superimposed on the graphic display, accepts corrections of the robot posture by the operator, confirms the robot posture, and converts the confirmed posture into the robot coordinate system.
- In Patent Document 1, in order to teach a robot an operation on a workpiece whose shape and three-dimensional position are unknown, the operator teaches the robot by performing three-dimensional CAD-like drawing work while looking at the two-dimensional display.
- Such teaching requires skill on the part of the operator, and it is difficult to say that it is easy for general users who do not have CAD drawing skills.
- The present invention has been made on the basis of the above-described findings, and an object of the present invention is to provide a motion teaching system and a motion teaching method capable of teaching a robot a gripping operation of a workpiece whose shape and three-dimensional position are unknown through an intuitive and simple input operation by the operator.
- A first aspect of the present invention is a motion teaching system for teaching a gripping operation to a robot having a robot arm including a hand for gripping a workpiece.
- The system includes an imaging device, a distance measuring device that measures a distance to an object, a display device, an input device that accepts an input operation by an operator, and a database in which at least one gripping pattern applicable by the hand is described for each of a plurality of elementary shape models.
- The motion teaching system further includes captured image display means, recognition area designation means, elementary shape model designation means, fitting means, and gripping pattern selection means.
- The captured image display means causes the display device to display a captured image of the work space including the workpiece acquired by the imaging device.
- The recognition area designation means accepts an operation for designating a part of the workpiece to be gripped by the hand as an operation of two-dimensionally designating, using the input device, a recognition area including the part on the image of the workpiece displayed on the display device.
- The elementary shape model designation means accepts, via the input device, an operation for designating an elementary shape model to be applied to the part from among the plurality of elementary shape models.
- The fitting means fits the elementary shape model designated by the operator to the three-dimensional position data of the space corresponding to the recognition area acquired using the distance measuring device.
- The gripping pattern selection means selects at least one gripping pattern applicable to gripping the workpiece by searching the database based on the elementary shape model designated by the operator and the fitting result obtained by the fitting means.
- a second aspect of the present invention is an operation teaching method for teaching a gripping operation to a robot having a robot arm including a hand for gripping a workpiece.
- According to the first and second aspects of the present invention, the operator can intuitively specify the part of the workpiece to be gripped by an operation of two-dimensionally designating the recognition area in the captured image displayed on the display device and an operation of selecting the elementary shape model to be applied to the recognition area.
- Furthermore, the first and second aspects of the present invention use a database in which elementary shape models are associated in advance with the gripping patterns applicable to them, which makes it easy to present appropriate gripping pattern candidates to the operator.
- the above-described operation teaching system may further include a gripping pattern display unit and a gripping pattern determination unit.
- the grip pattern display means displays the at least one grip pattern selected by the grip pattern selection means on the display device.
- The gripping pattern determination means accepts an operation by the operator for selecting a final gripping pattern to be performed by the robot based on the display content generated by the gripping pattern display means, and also accepts an operation by the operator for adjusting a degree of freedom left in the final gripping pattern.
- the grip pattern display means may display a model image of the hand representing the at least one grip pattern superimposed on the captured image.
- The gripping pattern determination means may accept the operator's adjustment of the degree of freedom as a drag operation of the model image within the display screen of the display device, and may determine the adjustment value of the degree of freedom based on the position and orientation of the model image within the display screen after the drag operation.
- With such a configuration, the operator can determine an appropriate gripping pattern by a simple drag operation and teach it to the robot.
- The gripping pattern selection means may further determine, using the three-dimensional position data around the recognition area acquired using the distance measuring device, interference between the trajectory of the robot arm according to the gripping pattern and an obstacle around the workpiece, and may narrow down the gripping patterns based on the determination result. As a result, a more appropriate gripping method that avoids interference with obstacles around the workpiece can be obtained.
- According to the present invention, it is possible to provide a motion teaching system and a motion teaching method capable of teaching a robot a gripping operation of a workpiece whose shape and three-dimensional position are unknown through an intuitive and simple input operation by the operator.
- FIG. 1 is a block diagram showing the motion teaching system according to the embodiment of the invention.
- FIG. 3(a) is a diagram showing an example of a captured image including the workpiece, and FIGS. 3(b) to 3(d) are diagrams showing examples of designated recognition areas.
- FIG. 1 is a block diagram showing the overall configuration of the motion teaching system according to the present embodiment.
- the motion teaching system according to the present embodiment includes a robot 1 that performs a gripping operation of a workpiece 90 and a teaching terminal 2 for teaching the robot 1 of the gripping operation.
- configurations of the robot 1 and the teaching terminal 2 will be described.
- a hand 11 for gripping the workpiece 90 is provided at the tip of the robot arm 10.
- The imaging device 12 photographs the work space in which the robot 1 and the workpiece 90 are placed and obtains a captured image. As will be described later, the captured image acquired by the imaging device 12 is output to the display screen of the teaching terminal 2 and presented to the operator. Therefore, the imaging device 12 may be, for example, a camera including an imaging element, such as a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide Semiconductor) image sensor, having sensitivity in the visible light region.
- the distance measuring device 13 measures the distance to the object in the work space.
- the measurement principle of the distance measuring device 13 is not particularly limited, and various known techniques can be applied.
- a laser range finder, a light projection stereo distance measuring device, or a stereo camera may be used as the distance measuring device 13.
- The imaging device 12 and the distance measuring device 13 may be configured as a single device.
- the communication unit 14 transmits the captured image acquired by the imaging device 12 and other information related to operation teaching to the communication unit 20 on the teaching terminal 2 side.
- a known wireless communication method or wired communication method may be applied to the communication method of the communication unit 14.
- the communication unit 14 and the communication unit 20 do not need to be directly connected, and may be connected via a LAN (Local Area Network), the Internet, a mobile phone communication network, or the like.
- the database 15 is a database in which a plurality of elementary shape models and a plurality of grip patterns are recorded in association with each other.
- the grip pattern is data describing an operation pattern including the position and posture of the finger of the hand 11.
- The elementary shape model is a model of a three-dimensional geometric element such as a solid cylinder, a quadrangular prism, or a hollow cylinder.
- The elementary shape model is presented to the operator who uses the teaching terminal 2 as a candidate for the shape of the portion of the workpiece 90 to be gripped.
- By searching the database 15 with a given gripping pattern designated, at least one elementary shape model to which that gripping pattern can be applied can be identified. Specific examples of the data structure of the database 15, the operation patterns, and the elementary shape models will be described later.
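Although the patent does not disclose a concrete data structure for the database 15, a minimal Python sketch of how such a database might associate elementary shape models with the gripping patterns applicable by a given hand is shown below; all identifiers and numeric thresholds are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class GripPattern:
    name: str                          # e.g. "A: side scissor grip"
    condition: Callable[[dict], bool]  # applicability test over the fitted shape parameters

# Hypothetical contents of database 15, keyed by (hand type, elementary shape model type).
# The numeric thresholds are placeholders, not values from the patent.
DATABASE: Dict[Tuple[str, str], List[GripPattern]] = {
    ("planar 3-joint 2-finger hand", "cylinder"): [
        GripPattern("A: side scissor grip",     lambda p: p["D"] <= 0.08),
        GripPattern("B: end-face scissor grip", lambda p: p["L"] <= 0.05),
        GripPattern("C: grasp grip",            lambda p: 0.03 <= p["D"] <= 0.10),
        GripPattern("D: edge scissor grip",     lambda p: p["T"] <= 0.01),
    ],
}

def applicable_patterns(hand_type: str, model_type: str, shape_params: dict) -> List[str]:
    """Return the names of gripping patterns whose conditions hold for the fitted parameters."""
    entries = DATABASE.get((hand_type, model_type), [])
    return [g.name for g in entries if g.condition(shape_params)]
```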
- The control unit 16 executes control and data processing related to teaching of the gripping operation to the robot 1 in cooperation with the teaching terminal 2, in addition to controlling the operation of the robot arm 10 and the hand 11. Specifically, the control unit 16 controls photographing by the imaging device 12 and measurement by the distance measuring device 13, performs data transmission and reception using the communication unit 14, accesses the database 15, and executes fitting processing between the three-dimensional point cloud data measured by the distance measuring device 13 and the elementary shape model.
- The control unit 16 may be configured using one or a plurality of CPUs (Central Processing Units), a control program executed by the CPU, a RAM (Random Access Memory) used as a temporary storage area for operation data, and a nonvolatile memory, such as an EEPROM (Electrically Erasable and Programmable Read-Only Memory), that stores the control program and control data.
- the control program may be a set of a plurality of program modules.
- In the motion teaching system, the relative relationships among a plurality of coordinate systems, specifically, a camera coordinate system related to the imaging space of the imaging device 12, an image coordinate system of the image captured by the imaging device 12, a measurement space coordinate system related to the measurement space of the distance measuring device 13, a tool coordinate system set for the hand 11, and a world coordinate system fixed in the work space, need to be known.
- For this purpose, calibration may be performed in advance.
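As an illustration of how the calibrated relationships between these coordinate systems could be used (the patent itself prescribes no implementation), homogeneous transforms can be composed to map measured points between frames; the transforms and numeric values below are placeholder assumptions.

```python
import numpy as np

def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration results (placeholder rotations and translations):
T_world_from_camera  = make_transform(np.eye(3), np.array([0.5, 0.0, 1.2]))
T_world_from_measure = make_transform(np.eye(3), np.array([0.5, 0.1, 1.2]))

def measure_to_world(p_measure: np.ndarray) -> np.ndarray:
    """Map a 3-D point expressed in the distance-measuring device frame into the world frame."""
    p_h = np.append(p_measure, 1.0)              # homogeneous coordinates
    return (T_world_from_measure @ p_h)[:3]
```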
- the communication unit 20 performs data transmission with the communication unit 14 on the robot 1 side.
- The display device 21 is a device capable of graphic display. Under the control of the control unit 23 described later, the display device 21 displays the captured image, the elementary shape model, a hand model indicating the gripping pattern, and the like.
- For example, an LCD (Liquid Crystal Display) or a CRT (Cathode Ray Tube) may be used as the display device 21.
- the input device 22 is a device that receives an operation input to the teaching terminal 2 by the operator.
- The input device 22 may be a pointing device capable of designating an input position or coordinates on the screen of the display device 21 in response to an operation by a part of the operator's body, such as a hand or foot, or by the operator's voice.
- a mouse, a touch panel, a voice input device, a laser pointer, or the like may be used for the input device 22.
- The control unit 23 executes control and data processing related to teaching of the gripping operation to the robot 1 in cooperation with the control unit 16 on the robot 1 side. Specifically, the control unit 23 executes display control of the display device 21, acquisition and analysis of the operation contents of the input device 22, data transmission and reception using the communication unit 20, and the like. Like the control unit 16, the control unit 23 may be configured using one or a plurality of CPUs, a control program executed by the CPU, a RAM, a nonvolatile memory, and the like.
- the teaching terminal 2 may be configured as an integrated small terminal that can be easily carried by an operator, equipped with a battery that supplies operating power to the above-described components.
- the form of the teaching terminal 2 is not limited to such an integrated type.
- the teaching terminal 2 may be configured using a communication terminal, an LCD, a mouse, a PC (Personal Computer), or the like, each of which is an independent device.
- FIG. 2 is a flowchart showing an operation teaching procedure by the operation teaching system of the present embodiment.
- steps S101 to S105 show processing performed on the robot 1 side
- steps S201 to S206 show processing performed on the teaching terminal 2 side.
- In step S101, the robot 1 acquires a captured image of the work space including the workpiece 90 using the imaging device 12.
- the acquired captured image is transferred to the teaching terminal 2.
- the position and orientation of the imaging device 12 and the angle of view at the time of acquiring a captured image may be remotely controlled from the teaching terminal 2.
- In this way, an appropriate captured image of the work space including the workpiece 90 can be obtained, making it easier to teach the robot 1 an appropriate gripping pattern.
- In step S201, the teaching terminal 2 displays the captured image acquired in step S101 on the display device 21.
- FIG. 3A shows an example of a captured image display screen including the image 30 of the work 90.
- the workpiece 90 is a cup with a handle as shown in FIG.
- In step S202, the teaching terminal 2 receives designation of a range to be the "recognition region" in the captured image displayed on the display device 21 from the operator via the input device 22.
- the “recognition area” is an area set by the operator in the captured image in order to designate a part in the work 90 to be gripped by the hand 11.
- The recognition area is also used as the area to be recognized by the distance measuring device 13, that is, the distance measurement target area, in step S102 described later.
- FIGS. 3(b) to 3(d) show examples of display screens of the display device 21 including a recognition area designated on the display screen.
- the recognition area 31 displayed in FIG. 3B designates the body portion of the cup.
- the recognition area 32 displayed in FIG. 3C designates the edge portion of the cup opening.
- the recognition area 33 displayed in FIG. 3D designates a handle portion.
- the teaching terminal 2 may accept the designation of the recognition area by the operator by operating a pointer displayed on the captured image displayed on the display device 21.
- For example, in order to accept the designation of the recognition area 31 shown in FIG. 3(b), an operation of drawing a closed curve indicating the outer periphery of the recognition area 31 with the pointer may be accepted.
- Such an operation is extremely intuitive when the input device 22 is, for example, a touch panel. By arranging the touch panel directly over the display screen of the display device 21, the operator can draw the recognition area by directly tracing the displayed workpiece image 30 with a finger or a touch pen.
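As a sketch of one way such a drawn closed curve could be converted into a recognition area (the patent does not prescribe an algorithm), the stroke can be treated as a polygon and the enclosed pixels selected with a standard ray-casting test; the function names are assumptions.

```python
import numpy as np

def point_in_polygon(x: float, y: float, poly: np.ndarray) -> bool:
    """Even-odd (ray casting) test; poly is an (N, 2) array of stroke points in pixels."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def recognition_mask(stroke: np.ndarray, height: int, width: int) -> np.ndarray:
    """Boolean image mask of the recognition area enclosed by the operator's stroke.

    Naive per-pixel test for clarity; a rasterized polygon fill would be used in practice.
    """
    mask = np.zeros((height, width), dtype=bool)
    for v in range(height):
        for u in range(width):
            mask[v, u] = point_in_polygon(u, v, stroke)
    return mask
```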
- In step S203, the teaching terminal 2 receives the designation of the elementary shape model to be applied to the recognition area via the input device 22. That is, the operator uses the input device 22 to specify what three-dimensional geometric shape the part designated by the recognition area has.
- In order to prompt the operator to select an elementary shape model, the teaching terminal 2 may output, for example, an icon list 40 as shown in FIG. 4 to the display device 21.
- the icons 41 to 44 shown in FIG. 4 each show an example of the elementary shape model. Icon 41 is a quadrangular prism model, icon 42 is a flat plate model, icon 43 is a cylindrical model, and icon 44 is a truncated cone model.
- Each elementary shape model has a shape parameter for defining the shape and size, and a placement parameter for defining the position and orientation.
- The types of elementary shape models are not limited to those shown in FIG. 4; various shapes such as a superquadric ellipsoid, an L-shaped prism, and a C-shaped cylinder may be prepared according to the types of workpieces to be handled.
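For illustration only (the patent does not define a data format), an elementary shape model with its shape parameters and placement parameters could be represented as follows; the field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict
import numpy as np

@dataclass
class ElementaryShapeModel:
    kind: str                         # e.g. "cylinder", "quadrangular_prism", "flat_plate"
    shape_params: Dict[str, float]    # shape parameters defining shape and size, e.g. {"D": 0.06, "L": 0.12}
    # placement parameters defining position and orientation in the work space:
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))
    orientation: np.ndarray = field(default_factory=lambda: np.eye(3))
```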
- In step S102, the robot 1 receives data indicating the recognition area designated by the operator from the teaching terminal 2. The robot 1 then uses the distance measuring device 13 to acquire point cloud data (hereinafter referred to as three-dimensional position data) indicating the three-dimensional positions in the work space corresponding to the recognition area.
- FIG. 5 is a conceptual diagram showing the acquisition of the three-dimensional position data 50 in the work space corresponding to the recognition area 31 of FIG. 3(b).
- the three-dimensional position data obtained in step S102 represents the depth viewed from the robot 1 in the work space including the work 90.
- the robot 1 may perform distance measurement from a plurality of viewpoints by controlling the position and orientation of the distance measurement device 13.
- By integrating the three-dimensional position data obtained by measurement from a plurality of viewpoints, many point groups indicating the depth of the workpiece 90 can be obtained.
- For example, polar coordinates whose origin is the three-dimensional position in the work space corresponding to the center point of the recognition area may be set, and the viewpoint of the distance measuring device 13 may be moved in the two (horizontal and vertical) declination directions.
- Since three-dimensional position data outside the recognition area set by the operator is also acquired when the viewpoint is changed, it is preferable to keep only the three-dimensional position data within the cone-shaped space connecting the viewpoint of the distance measuring device 13 at its initial position with the edge of the recognition area, and to delete the other data.
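A minimal sketch, under assumed parameters, of the pruning step described above: only the points inside the cone connecting the initial viewpoint of the distance measuring device with the recognition-area boundary are retained. The angular-threshold approximation of the cone is an assumption, not the patent's method.

```python
import numpy as np

def filter_to_cone(points: np.ndarray, apex: np.ndarray,
                   axis: np.ndarray, half_angle_rad: float) -> np.ndarray:
    """Keep only points whose direction from the apex lies within the cone.

    points: (N, 3) three-dimensional position data,
    apex: initial viewpoint of the distance measuring device,
    axis: vector from the viewpoint toward the recognition-area centre,
    half_angle_rad: half angle subtended by the recognition-area edge.
    """
    d = points - apex
    d_unit = d / np.linalg.norm(d, axis=1, keepdims=True)
    cos_angle = d_unit @ (axis / np.linalg.norm(axis))
    return points[cos_angle >= np.cos(half_angle_rad)]
```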
- In step S103, the robot 1 receives from the teaching terminal 2 data indicating the elementary shape model designated by the operator. The robot 1 then fits the designated elementary shape model to the three-dimensional position data obtained in step S102. More specifically, the robot 1 adjusts the shape parameters and the placement parameters of the designated elementary shape model so as to best match the three-dimensional position data measured within the recognition region.
- For this fitting, a known technique in the field of three-dimensional image processing may be used.
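As one example of such a known technique (assumed here, not specified by the patent), the placement and shape parameters of a cylinder model can be estimated from the measured point cloud by non-linear least squares:

```python
import numpy as np
from scipy.optimize import least_squares

def cylinder_residuals(params, points):
    """Signed distance of each point from the surface of a cylinder.

    params = (cx, cy, cz, theta, phi, r): a point on the axis (placement),
    the axis direction in spherical angles (placement), and the radius (shape).
    """
    cx, cy, cz, theta, phi, r = params
    axis = np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])
    rel = points - np.array([cx, cy, cz])
    # distance from the axis = norm of the component orthogonal to the axis
    radial = rel - np.outer(rel @ axis, axis)
    return np.linalg.norm(radial, axis=1) - r

def fit_cylinder(points: np.ndarray, initial_guess):
    """Fit placement and shape parameters of a cylinder to the 3-D position data."""
    result = least_squares(cylinder_residuals, initial_guess, args=(points,))
    return result.x

# usage with hypothetical data:
# params = fit_cylinder(points_3d, initial_guess=[0, 0, 0, 0.0, 0.0, 0.04])
```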
- FIG. 6 shows the fitting result when the operator selects the cylinder model; the elementary shape model (specifically, the cylinder model) 60 is displayed superimposed on the image 30 of the workpiece 90.
- In step S104, the robot 1 refers to the database 15 and selects a gripping pattern that can be applied to the elementary shape model whose shape parameters have been determined by the fitting.
- FIG. 7 shows a cylindrical model 70 which is one of the elementary shape models.
- The database 15 may record, in association with the type of the hand 11, information indicating within what range of shape parameters of each elementary shape model each gripping pattern is applicable.
- For example, the database records four gripping patterns, together with their application conditions, as being applicable when the hand type of the hand 11 is a "planar three-joint two-finger hand" and the elementary shape model is the cylindrical model.
- The four gripping patterns are "gripping pattern A: side scissor grip", "gripping pattern B: end-face scissor grip", "gripping pattern C: grasp grip", and "gripping pattern D: edge scissor grip".
- The application conditions of each gripping pattern are described using the shape parameters D, L, and T of the cylindrical model 70 shown in FIG. 7.
- FIGS. 8 and 9 are perspective views showing grip patterns A to D by a cylindrical model 70 and a hand model 83 corresponding to the hand 11.
- The double-headed arrows in the drawings representing gripping patterns A to D indicate the remaining degrees of freedom that can be adjusted for each gripping pattern.
- For example, two degrees of freedom can be adjusted, namely a rotational degree of freedom RF about the central axis of the cylindrical model 70 and a translational degree of freedom TF along the central axis.
- these degrees of freedom left in the grip pattern are finally determined by the operator in step S206 described later.
- In step S204, the teaching terminal 2 displays the elementary shape model 60 indicating the fitting result and the hand model 83 indicating the gripping pattern selected in step S104, superimposed on the captured image including the image 30 of the workpiece 90.
- Thereby, the operator can easily grasp the degree of matching between the workpiece 90 and the elementary shape model, as well as the gripping pattern.
- FIGS. 10A and 10B show examples of display screens of the display device 21 in step S204.
- FIG. 10A is a display example of the above-described “grip pattern C: grasp grip”.
- FIG. 10B is a display example of “grip pattern A: side scissor grip” described above.
- When a plurality of gripping pattern candidates have been selected, the display device 21 may display the candidates side by side, or may switch between them and display one candidate at a time in accordance with the operator's operation.
- In step S205, the teaching terminal 2 accepts the final determination of the gripping pattern by the operator.
- the determined grip pattern is referred to as a final grip pattern.
- In step S206, the teaching terminal 2 prompts the operator to adjust the degree of freedom left in the final gripping pattern.
- the teaching terminal 2 may display a pointer on the screen of the display device 21 and accept a pointer operation by the operator.
- the operator adjusts the degree of freedom by operating the pointer. More specifically, the teaching terminal 2 may accept a drag operation on the hand model 83 displayed on the screen.
- The teaching terminal 2 may determine the adjustment value of the degree of freedom left in the final gripping pattern according to the relative position and orientation between the hand model 83 after the drag operation and the elementary shape model 60 or the workpiece image 30. Thereby, the operator can determine an appropriate gripping pattern by a simple drag operation.
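As an illustrative sketch, not the patent's implementation, the on-screen drag could be converted into an adjustment value of the translational degree of freedom TF by projecting the 2-D drag vector onto the on-screen projection of the model's central axis; the scale factor is an assumed calibration quantity.

```python
import numpy as np

def translational_adjustment(drag_start: np.ndarray, drag_end: np.ndarray,
                             axis_2d: np.ndarray, pixels_per_metre: float) -> float:
    """Adjustment value (metres) of the translational DOF along the model axis.

    drag_start, drag_end: pointer positions in screen pixels,
    axis_2d: projection of the cylinder model's central axis onto the screen,
    pixels_per_metre: assumed scale obtained from calibration and the fitted model distance.
    """
    drag = drag_end.astype(float) - drag_start.astype(float)
    axis_unit = axis_2d / np.linalg.norm(axis_2d)
    return float(drag @ axis_unit) / pixels_per_metre

# e.g. translational_adjustment(np.array([120, 200]), np.array([160, 190]),
#                               axis_2d=np.array([1.0, -0.2]), pixels_per_metre=2000.0)
```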
- FIGS. 11A to 11C are diagrams showing specific examples of the degree-of-freedom adjustment by the operator in step S206.
- FIG. 11A shows the movement of the pointer 84 on the display screen.
- FIG. 11B shows the display change of the pointer 84 due to the drag of the hand model 83.
- FIG. 11C shows a state where the translational freedom degree TF is adjusted by a drag operation.
- In step S105 of FIG. 2, the robot 1 performs the gripping operation on the workpiece 90 based on the finally determined gripping pattern.
- The motion teaching system according to the first embodiment of the invention described above has the following effects. With the motion teaching system of the present embodiment, the operator can specify the shape of the part to be gripped and have the gripping method determined simply by drawing a recognition area on the display screen, on which the image captured by the imaging device 12 is displayed, and selecting the elementary shape model to be applied to the recognition area. In other words, the operator can teach a gripping operation on the workpiece 90 by a simple and intuitive operation, without performing detailed input operations that require skill, such as the parametric modeling used in CAD.
- In addition, the motion teaching system makes it easy to present appropriate gripping pattern candidates to the operator by associating the elementary shape models with the gripping patterns in advance and recording them in the database 15. That is, the operator does not have to specify the position and posture of the hand 11 and the arrangement of each finger in detail.
- Furthermore, the operator designates only the part to be gripped by the robot 1, and only that part is modeled by applying an elementary shape model. For this reason, it is easy to model the object, and various gripping operations can be realized without complicated recognition technology.
- the display device 21 and the input device 22 do not require a complicated interface. That is, the teaching terminal 2 is suitable for operation on a small display screen. For this reason, it becomes easy to use a small portable terminal such as a smartphone or a small game machine having a communication function as the teaching terminal 2.
- When executing a gripping pattern selected based on the fitting result (step S104 in FIG. 2), the robot 1 may determine whether the trajectory of the robot arm 10 interferes with (collides with) an obstacle around the workpiece 90. This determination may be made by comparing the three-dimensional position data around the recognition area acquired by the distance measuring device 13 with the calculated trajectory of the robot arm 10. The gripping patterns may then be narrowed down based on the result of this interference determination. Specifically, a gripping pattern that may interfere with an obstacle may be identified and excluded from the gripping pattern candidates transmitted to the teaching terminal 2 for presentation to the operator. Alternatively, the interference determination result may be transmitted to the teaching terminal 2 in order to notify the operator of the possibility of interference with an obstacle.
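A minimal sketch of the interference determination described above, treating the three-dimensional position data around the recognition area as an obstacle point cloud and the arm trajectory as sampled positions with a clearance radius; the clearance value and function names are assumptions.

```python
import numpy as np

def trajectory_interferes(trajectory_points: np.ndarray,
                          obstacle_points: np.ndarray,
                          clearance: float = 0.03) -> bool:
    """Return True if any sampled point of the arm trajectory comes closer than
    `clearance` (metres) to any point measured around the recognition area."""
    for p in trajectory_points:                      # (M, 3) sampled arm/hand positions
        dists = np.linalg.norm(obstacle_points - p, axis=1)
        if np.min(dists) < clearance:
            return True
    return False

# Gripping patterns whose trajectories interfere can then be excluded from the
# candidates sent to the teaching terminal 2, as described above.
```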
- In the first embodiment of the invention, as shown in FIG. 2, the gripping patterns selected by fitting are presented to the operator, and the operator determines the final gripping pattern and adjusts its remaining degree of freedom. Such a procedure is effective for bringing the operation of the robot 1 closer to an optimal gripping operation. However, the robot 1 may instead autonomously execute a gripping operation according to one of the selected gripping patterns, without performing steps S204 to S206 of FIG. 2.
- The division of functions between the robot 1 and the teaching terminal 2 in the first embodiment of the invention is merely an example.
- For example, the fitting between the elementary shape model and the three-dimensional position data may be executed on the teaching terminal 2 side.
- the database 15 may be provided on the teaching terminal 2 side.
- a third device may exist in addition to the robot 1 and the teaching terminal 2.
- the imaging device 12 and the distance measuring device 13 do not need to be directly mounted on the robot main body including the robot arm 10 and the hand 11, and may be arranged in the work space as independent devices separate from the robot main body.
- the present invention can be used in a motion teaching system and a motion teaching method for teaching motion to a robot that grips a workpiece.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Abstract
Description
(a) displaying, on a display device, a captured image of the work space including the workpiece;
(b) accepting an operation for designating the part of the workpiece to be gripped by the hand as an operation of two-dimensionally designating a recognition area including the part on the image of the workpiece displayed on the display device;
(c) accepting an operation for designating the elementary shape model to be applied to the part from among a plurality of elementary shape models;
(d) fitting the elementary shape model designated in step (c) to three-dimensional position data of the space corresponding to the recognition area acquired using a distance measuring device; and
(e) selecting at least one gripping pattern applicable to gripping the workpiece by searching a database, in which at least one gripping pattern applicable by the hand is described for each of the plurality of elementary shape models, on the basis of the elementary shape model designated in step (c) and the fitting result of step (d).
10 Robot arm
11 Hand
12 Imaging device
13 Distance measuring device
14 Communication unit
15 Database
2 Teaching terminal
20 Communication unit
21 Display device
22 Input device
23 Control unit
30 Workpiece image
31 to 33 Recognition areas
40 Icon list
41 to 44 Icons
50 Three-dimensional position data
90 Workpiece
60 Elementary shape model (cylinder model)
70 Cylindrical model
83 Hand model
84 Pointer
RF Rotational degree of freedom
TF Translational degree of freedom
Claims (8)
- 1. A motion teaching system for teaching a gripping operation to a robot having a robot arm including a hand for gripping a workpiece, comprising: an imaging device; a distance measuring device that measures a distance to an object; a display device; an input device that accepts an input operation by an operator; a database in which at least one gripping pattern applicable by the hand is described for each of a plurality of elementary shape models; captured image display means for causing the display device to display a captured image of a work space including the workpiece acquired by the imaging device; recognition area designation means for accepting an operation for designating a part of the workpiece to be gripped by the hand as an operation of two-dimensionally designating, using the input device, a recognition area including the part on the image of the workpiece displayed on the display device; elementary shape model designation means for accepting, via the input device, an operation for designating an elementary shape model to be applied to the part from among the plurality of elementary shape models; fitting means for fitting the elementary shape model designated by the operator to three-dimensional position data of a space corresponding to the recognition area acquired using the distance measuring device; and gripping pattern selection means for selecting at least one gripping pattern applicable to gripping the workpiece by searching the database based on the elementary shape model designated by the operator and the fitting result obtained by the fitting means.
- 2. The motion teaching system according to claim 1, further comprising: gripping pattern display means for causing the display device to display the at least one gripping pattern selected by the gripping pattern selection means; and gripping pattern determination means for accepting an operation by the operator for selecting a final gripping pattern to be performed by the robot based on the display content generated by the gripping pattern display means, and for accepting an operation by the operator for adjusting a degree of freedom left in the final gripping pattern.
- 3. The motion teaching system according to claim 2, wherein the gripping pattern display means displays a model image of the hand representing the at least one gripping pattern superimposed on the captured image, and the gripping pattern determination means accepts the operator's adjustment of the degree of freedom as a drag operation of the model image within the display screen of the display device and determines an adjustment value of the degree of freedom based on the position and orientation of the model image within the display screen after the drag operation.
- 4. The motion teaching system according to any one of claims 1 to 3, wherein the gripping pattern selection means further determines, using three-dimensional position data around the recognition area acquired using the distance measuring device, interference between a trajectory of the robot arm according to the at least one gripping pattern and an obstacle around the workpiece, and narrows down the gripping patterns based on the determination result.
- 5. A motion teaching method for teaching a gripping operation to a robot having a robot arm including a hand for gripping a workpiece, comprising: (a) displaying a captured image of a work space including the workpiece on a display device; (b) accepting an operation for designating a part of the workpiece to be gripped by the hand as an operation of two-dimensionally designating a recognition area including the part on the image of the workpiece displayed on the display device; (c) accepting an operation for designating an elementary shape model to be applied to the part from among a plurality of elementary shape models; (d) fitting the elementary shape model designated in step (c) to three-dimensional position data of a space corresponding to the recognition area acquired using a distance measuring device; and (e) selecting at least one gripping pattern applicable to gripping the workpiece by searching a database, in which at least one gripping pattern applicable by the hand is described for each of the plurality of elementary shape models, based on the elementary shape model designated in step (c) and the fitting result of step (d).
- 6. The motion teaching method according to claim 5, further comprising: (f) displaying the at least one gripping pattern selected in step (e) on the display device; (g) accepting an operation by the operator for selecting a final gripping pattern to be performed by the robot based on the display content generated in step (f); and (h) accepting an operation by the operator for adjusting a degree of freedom left in the final gripping pattern.
- 7. The motion teaching method according to claim 6, wherein in step (f), a model image of the hand representing the at least one gripping pattern is displayed superimposed on the captured image, and in step (h), the operator's adjustment of the degree of freedom is accepted as a drag operation of the model image within the display screen of the display device, and an adjustment value of the degree of freedom is determined based on the position and orientation of the model image within the display screen after the drag operation.
- 8. The motion teaching method according to any one of claims 5 to 7, wherein in step (e), interference between a trajectory of the robot arm according to the at least one gripping pattern and an obstacle around the workpiece is further determined using three-dimensional position data around the recognition area acquired using the distance measuring device, and the gripping patterns are narrowed down based on the determination result.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09720016.6A EP2263837B1 (en) | 2008-03-10 | 2009-02-03 | Operation teaching system and operation teaching method |
US12/921,291 US8355816B2 (en) | 2008-03-10 | 2009-02-03 | Action teaching system and action teaching method |
KR1020107010947A KR101193125B1 (ko) | 2008-03-10 | 2009-02-03 | 동작 교시시스템 및 동작 교시방법 |
CN2009801084228A CN101970184B (zh) | 2008-03-10 | 2009-02-03 | 动作教导系统以及动作教导方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-058957 | 2008-03-10 | ||
JP2008058957A JP4835616B2 (ja) | 2008-03-10 | 2008-03-10 | 動作教示システム及び動作教示方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009113339A1 true WO2009113339A1 (ja) | 2009-09-17 |
Family
ID=41065011
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/051741 WO2009113339A1 (ja) | 2008-03-10 | 2009-02-03 | 動作教示システム及び動作教示方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US8355816B2 (ja) |
EP (1) | EP2263837B1 (ja) |
JP (1) | JP4835616B2 (ja) |
KR (1) | KR101193125B1 (ja) |
CN (1) | CN101970184B (ja) |
WO (1) | WO2009113339A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE112017007903T5 (de) | 2017-10-03 | 2020-05-14 | Mitsubishi Electric Corporation | Haltepositions- und orientierungslehreinrichtung, haltepositions- und orientierungslehrverfahren und robotersystem |
Families Citing this family (84)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100179689A1 (en) * | 2009-01-09 | 2010-07-15 | National Taiwan University Of Science And Technology | Method of teaching robotic system |
DE102010007458A1 (de) * | 2010-02-10 | 2011-08-11 | KUKA Laboratories GmbH, 86165 | Verfahren für eine kollisionsfreie Bahnplanung eines Industrieroboters |
JP2011251395A (ja) * | 2010-06-04 | 2011-12-15 | Takamaru Kogyo Kk | ロボット教示システム |
JP5560948B2 (ja) * | 2010-06-23 | 2014-07-30 | 株式会社安川電機 | ロボット装置 |
JP5685027B2 (ja) | 2010-09-07 | 2015-03-18 | キヤノン株式会社 | 情報処理装置、物体把持システム、ロボットシステム、情報処理方法、物体把持方法およびプログラム |
KR101778030B1 (ko) * | 2010-09-27 | 2017-09-26 | 삼성전자주식회사 | 로봇 및 그 제어방법 |
JP5659787B2 (ja) * | 2010-12-28 | 2015-01-28 | トヨタ自動車株式会社 | 操作環境モデル構築システム、および操作環境モデル構築方法 |
BE1019786A3 (nl) * | 2011-01-31 | 2012-12-04 | Robojob Bv Met Beperkte Aansprakelijkheid | Werkwijze voor het manipuleren van een reeks achtereenvolgens aangeboden identieke werkstukken door middel van een robot, evenals een inrichting en een grijper daarbij toegepast. |
JP2012206219A (ja) * | 2011-03-30 | 2012-10-25 | Seiko Epson Corp | ロボット制御装置及びロボットシステム |
JP5892360B2 (ja) | 2011-08-02 | 2016-03-23 | ソニー株式会社 | ロボット指示装置、ロボット指示方法、プログラム、及び通信システム |
JP5892361B2 (ja) * | 2011-08-02 | 2016-03-23 | ソニー株式会社 | 制御装置、制御方法、プログラム、及びロボット制御システム |
JP5852364B2 (ja) | 2011-08-26 | 2016-02-03 | キヤノン株式会社 | 情報処理装置、情報処理装置の制御方法、およびプログラム |
KR101896473B1 (ko) | 2012-01-04 | 2018-10-24 | 삼성전자주식회사 | 로봇 핸드의 제어 방법 |
JP2013184257A (ja) * | 2012-03-08 | 2013-09-19 | Sony Corp | ロボット装置及びロボット装置の制御方法、並びにコンピューター・プログラム |
JP5642738B2 (ja) * | 2012-07-26 | 2014-12-17 | ファナック株式会社 | バラ積みされた物品をロボットで取出す装置及び方法 |
US9095978B2 (en) * | 2012-12-07 | 2015-08-04 | GM Global Technology Operations LLC | Planning a grasp approach, position, and pre-grasp pose for a robotic grasper based on object, grasper, and environmental constraint data |
JP5582427B2 (ja) * | 2012-12-18 | 2014-09-03 | 株式会社安川電機 | 教示データ作成装置、ロボットシステム、及び教示データ作成方法 |
US10578573B2 (en) * | 2013-03-12 | 2020-03-03 | Msa Technology, Llc | Diagnostics for catalytic structures and combustible gas sensors including catalytic structures |
US9227323B1 (en) * | 2013-03-15 | 2016-01-05 | Google Inc. | Methods and systems for recognizing machine-readable information on three-dimensional objects |
JP5673717B2 (ja) * | 2013-03-19 | 2015-02-18 | 株式会社安川電機 | ロボットシステム及び被加工物の製造方法 |
JP5965859B2 (ja) * | 2013-03-28 | 2016-08-10 | 株式会社神戸製鋼所 | 溶接線情報設定装置、プログラム、自動教示システム、および溶接線情報設定方法 |
JP5983506B2 (ja) * | 2013-04-04 | 2016-08-31 | トヨタ自動車株式会社 | 把持対象物の把持パターン検出方法およびロボット |
GB201309156D0 (en) * | 2013-05-21 | 2013-07-03 | Univ Birmingham | Grasp modelling |
JP6361153B2 (ja) * | 2014-02-05 | 2018-07-25 | 株式会社デンソーウェーブ | ロボットの教示装置 |
JP6380828B2 (ja) * | 2014-03-07 | 2018-08-29 | セイコーエプソン株式会社 | ロボット、ロボットシステム、制御装置、及び制御方法 |
JP6370064B2 (ja) * | 2014-03-11 | 2018-08-08 | 株式会社スター精機 | 樹脂成形品取出し機におけるゲート切断位置データ設定方法及び樹脂成形品取出し機 |
JP6361213B2 (ja) * | 2014-03-26 | 2018-07-25 | セイコーエプソン株式会社 | ロボット制御装置、ロボット、ロボットシステム、教示方法、及びプログラム |
CN106232304B (zh) * | 2014-05-01 | 2018-09-07 | 本田技研工业株式会社 | 用于关节型机器人的教学数据准备装置和教学数据准备方法 |
DE102014223167A1 (de) * | 2014-11-13 | 2016-05-19 | Kuka Roboter Gmbh | Bestimmen von objektbezogenen Greifräumen mittels eines Roboters |
US9492923B2 (en) | 2014-12-16 | 2016-11-15 | Amazon Technologies, Inc. | Generating robotic grasping instructions for inventory items |
JP6486678B2 (ja) * | 2014-12-25 | 2019-03-20 | 株式会社キーエンス | 画像処理装置、画像処理システム、画像処理方法及びコンピュータプログラム |
JP6486679B2 (ja) * | 2014-12-25 | 2019-03-20 | 株式会社キーエンス | 画像処理装置、画像処理システム、画像処理方法及びコンピュータプログラム |
US10089575B1 (en) | 2015-05-27 | 2018-10-02 | X Development Llc | Determining grasping parameters for grasping of an object by a robot grasping end effector |
JP6493013B2 (ja) * | 2015-06-25 | 2019-04-03 | 富士通株式会社 | 指モデル検証プログラム、指モデル検証方法、および情報処理装置 |
US10688665B2 (en) * | 2016-01-06 | 2020-06-23 | Hitachi Ltd. | Robot system, and control method |
DE112016006116T5 (de) * | 2016-01-29 | 2018-09-13 | Mitsubishi Electric Corporation | Roboterlehrvorrichtung und Verfahren zum Erzeugen eines Robotersteuerprogramms |
US9914214B1 (en) * | 2016-02-22 | 2018-03-13 | X Development Llc | Preshaping for underactuated fingers |
CN107309871A (zh) * | 2016-04-27 | 2017-11-03 | 广明光电股份有限公司 | 机器手臂的编程方法 |
JP2017196705A (ja) * | 2016-04-28 | 2017-11-02 | セイコーエプソン株式会社 | ロボット、及びロボットシステム |
US9687983B1 (en) * | 2016-05-11 | 2017-06-27 | X Development Llc | Generating a grasp pose for grasping of an object by a grasping end effector of a robot |
US11338436B2 (en) | 2016-07-18 | 2022-05-24 | RightHand Robotics, Inc. | Assessing robotic grasping |
JP2018034243A (ja) * | 2016-08-31 | 2018-03-08 | セイコーエプソン株式会社 | ロボット、ロボット制御装置、及びロボットシステム |
JP2018034242A (ja) * | 2016-08-31 | 2018-03-08 | セイコーエプソン株式会社 | ロボット制御装置、ロボット、及びロボットシステム |
JP6892286B2 (ja) * | 2017-03-03 | 2021-06-23 | 株式会社キーエンス | 画像処理装置、画像処理方法、及びコンピュータプログラム |
JP7145146B2 (ja) * | 2017-03-06 | 2022-09-30 | 株式会社Fuji | 画像処理データ作成方法 |
US11003177B2 (en) * | 2017-03-24 | 2021-05-11 | Mitsubishi Electric Corporation | Apparatus and method for generating robot program |
WO2018193130A1 (de) * | 2017-04-21 | 2018-10-25 | Roboception Gmbh | Verfahren zur erstellung einer datenbank mit greiferposen, verfahren zum steuern eines roboters, computerlesbares speichermedium und handhabungssystem |
JP6880982B2 (ja) * | 2017-04-21 | 2021-06-02 | セイコーエプソン株式会社 | 制御装置およびロボットシステム |
JP6499716B2 (ja) * | 2017-05-26 | 2019-04-10 | ファナック株式会社 | 形状認識装置、形状認識方法及びプログラム |
JP6457587B2 (ja) * | 2017-06-07 | 2019-01-23 | ファナック株式会社 | ワークの動画に基づいて教示点を設定するロボットの教示装置 |
WO2019059364A1 (ja) * | 2017-09-22 | 2019-03-28 | 三菱電機株式会社 | 遠隔制御マニピュレータシステムおよび制御装置 |
US12011835B2 (en) * | 2017-09-30 | 2024-06-18 | Siemens Aktiengesellschaft | Engineering autonomous systems with reusable skills |
JP6969283B2 (ja) * | 2017-10-25 | 2021-11-24 | オムロン株式会社 | 制御システム |
JP6880457B2 (ja) * | 2017-11-14 | 2021-06-02 | オムロン株式会社 | 把持方法、把持システム及びプログラム |
TWI734867B (zh) * | 2017-11-20 | 2021-08-01 | 達明機器人股份有限公司 | 機器手臂作業軌跡的教導系統及方法 |
JP6737764B2 (ja) | 2017-11-24 | 2020-08-12 | ファナック株式会社 | ロボットに対して教示操作を行う教示装置 |
JP6881268B2 (ja) * | 2017-12-05 | 2021-06-02 | トヨタ自動車株式会社 | 把持装置、把持判定方法および把持判定プログラム |
JP6937260B2 (ja) * | 2018-03-19 | 2021-09-22 | 株式会社東芝 | 把持制御装置、把持システム、およびプログラム |
US10766149B2 (en) | 2018-03-23 | 2020-09-08 | Amazon Technologies, Inc. | Optimization-based spring lattice deformation model for soft materials |
CN108515290B (zh) * | 2018-03-30 | 2019-12-03 | 宁波高新区神台德机械设备有限公司 | 工业机器人动作检测装置 |
US10967507B2 (en) * | 2018-05-02 | 2021-04-06 | X Development Llc | Positioning a robot sensor for object classification |
US11926057B2 (en) | 2018-06-14 | 2024-03-12 | Yamaha Hatsudoki Kabushiki Kaisha | Robot system |
DE102019122790B4 (de) * | 2018-08-24 | 2021-03-25 | Nvidia Corp. | Robotersteuerungssystem |
US11833681B2 (en) | 2018-08-24 | 2023-12-05 | Nvidia Corporation | Robotic control system |
JP6802225B2 (ja) | 2018-08-31 | 2020-12-16 | ファナック株式会社 | 情報処理装置および情報処理方法 |
JP6863945B2 (ja) * | 2018-10-24 | 2021-04-21 | ファナック株式会社 | ロボットの制御方法 |
JP7204513B2 (ja) * | 2019-02-13 | 2023-01-16 | 株式会社東芝 | 制御装置及びプログラム |
CN112584987B (zh) * | 2019-03-15 | 2024-07-09 | 欧姆龙株式会社 | 把持位置姿势登记装置、把持位置姿势登记方法以及计算机可读存储介质 |
US11312581B2 (en) * | 2019-04-16 | 2022-04-26 | Abb Schweiz Ag | Object grasp system and method |
US11648674B2 (en) * | 2019-07-23 | 2023-05-16 | Teradyne, Inc. | System and method for robotic bin picking using advanced scanning techniques |
KR20190104483A (ko) * | 2019-08-21 | 2019-09-10 | 엘지전자 주식회사 | 로봇 시스템 및 그 제어 방법 |
JP7443014B2 (ja) * | 2019-10-08 | 2024-03-05 | 大豊精機株式会社 | ロボットアーム試験装置 |
JP7392409B2 (ja) * | 2019-11-15 | 2023-12-06 | オムロン株式会社 | 設定装置、設定方法およびプログラム |
KR20210063975A (ko) * | 2019-11-25 | 2021-06-02 | 엘지전자 주식회사 | 로봇 및 그 제어 방법 |
JP7276108B2 (ja) * | 2019-12-13 | 2023-05-18 | トヨタ自動車株式会社 | 遠隔操作システム及び遠隔操作方法 |
JP7342676B2 (ja) * | 2019-12-13 | 2023-09-12 | トヨタ自動車株式会社 | 遠隔操作システム及び遠隔操作方法 |
DE112021000384T5 (de) * | 2020-02-19 | 2022-10-20 | Fanuc Corporation | Lerndatensatz-Erzeugungsvorrichtung undLerndatensatz-Erzeugungsverfahren |
DE102020113277B4 (de) | 2020-05-15 | 2024-07-11 | Gerhard Schubert Gesellschaft mit beschränkter Haftung | Verfahren zum Erzeugen eines Trainingsdatensatzes zum Trainieren eines Industrieroboters, Verfahren zur Steuerung des Betriebs eines Industrieroboters und Industrieroboter |
DE102021201921B4 (de) | 2021-03-01 | 2024-11-07 | Robert Bosch Gesellschaft mit beschränkter Haftung | Vorrichtung und verfahren zum steuern eines roboters zum aufnehmen eines objekts |
WO2023022237A1 (ja) * | 2021-08-19 | 2023-02-23 | 京セラ株式会社 | ロボットの保持態様決定装置、保持態様決定方法、及びロボット制御システム |
DE102021212860B4 (de) | 2021-11-16 | 2024-05-08 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Aufnehmen eines Objekts mittels eines Roboters |
JP7571760B2 (ja) | 2022-04-04 | 2024-10-23 | トヨタ自動車株式会社 | 3次元モデルの生成システム、生成方法、及びプログラム |
DE102022206274A1 (de) | 2022-06-22 | 2023-12-28 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zum Steuern eines Roboters zum Manipulieren, insbesondere Aufnehmen, eines Objekts |
WO2024090154A1 (ja) * | 2022-10-26 | 2024-05-02 | 住友重機械工業株式会社 | ロボット教示装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01283603A (ja) | 1988-05-11 | 1989-11-15 | Fanuc Ltd | ロボットオフラインティーチング方式 |
JP2003094367A (ja) * | 2001-09-21 | 2003-04-03 | Ricoh Co Ltd | 手先視覚付ロボットハンド |
JP2003256025A (ja) | 2001-12-25 | 2003-09-10 | National Institute Of Advanced Industrial & Technology | ロボット動作教示方法及び装置 |
JP2004333422A (ja) | 2003-05-12 | 2004-11-25 | Fanuc Ltd | 画像処理装置 |
JP2005111618A (ja) | 2003-10-08 | 2005-04-28 | Fanuc Ltd | ロボットの手動送り装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4396903A (en) * | 1981-05-29 | 1983-08-02 | Westinghouse Electric Corp. | Electro-optical system for correlating and integrating image data from frame-to-frame |
JP3300682B2 (ja) * | 1999-04-08 | 2002-07-08 | ファナック株式会社 | 画像処理機能を持つロボット装置 |
JP2002283259A (ja) * | 2001-03-27 | 2002-10-03 | Sony Corp | ロボット装置のための動作教示装置及び動作教示方法、並びに記憶媒体 |
CA2354301A1 (en) * | 2001-07-27 | 2003-01-27 | Djamel Yahia Meddah | Geometric hashing for model-based recognition of an object |
JP2004188533A (ja) * | 2002-12-10 | 2004-07-08 | Toyota Motor Corp | 対象物の取扱い推定方法および取扱い推定装置 |
JP4167940B2 (ja) * | 2003-05-29 | 2008-10-22 | ファナック株式会社 | ロボットシステム |
JP4745769B2 (ja) * | 2005-09-13 | 2011-08-10 | キヤノン株式会社 | カメラ雲台装置 |
JP4715539B2 (ja) | 2006-02-15 | 2011-07-06 | トヨタ自動車株式会社 | 画像処理装置、その方法、および画像処理用プログラム |
US7925068B2 (en) * | 2007-02-01 | 2011-04-12 | General Electric Company | Method and apparatus for forming a guide image for an ultrasound image scanner |
US7903883B2 (en) * | 2007-03-30 | 2011-03-08 | Microsoft Corporation | Local bi-gram model for object recognition |
US7844106B2 (en) * | 2007-04-23 | 2010-11-30 | Mitsubishi Electric Research Laboratories, Inc | Method and system for determining poses of objects from range images using adaptive sampling of pose spaces |
-
2008
- 2008-03-10 JP JP2008058957A patent/JP4835616B2/ja not_active Expired - Fee Related
-
2009
- 2009-02-03 US US12/921,291 patent/US8355816B2/en active Active
- 2009-02-03 WO PCT/JP2009/051741 patent/WO2009113339A1/ja active Application Filing
- 2009-02-03 EP EP09720016.6A patent/EP2263837B1/en not_active Not-in-force
- 2009-02-03 CN CN2009801084228A patent/CN101970184B/zh not_active Expired - Fee Related
- 2009-02-03 KR KR1020107010947A patent/KR101193125B1/ko active IP Right Grant
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01283603A (ja) | 1988-05-11 | 1989-11-15 | Fanuc Ltd | ロボットオフラインティーチング方式 |
JP2003094367A (ja) * | 2001-09-21 | 2003-04-03 | Ricoh Co Ltd | 手先視覚付ロボットハンド |
JP2003256025A (ja) | 2001-12-25 | 2003-09-10 | National Institute Of Advanced Industrial & Technology | ロボット動作教示方法及び装置 |
JP2004333422A (ja) | 2003-05-12 | 2004-11-25 | Fanuc Ltd | 画像処理装置 |
JP2005111618A (ja) | 2003-10-08 | 2005-04-28 | Fanuc Ltd | ロボットの手動送り装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2263837A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE112017007903T5 (de) | 2017-10-03 | 2020-05-14 | Mitsubishi Electric Corporation | Haltepositions- und orientierungslehreinrichtung, haltepositions- und orientierungslehrverfahren und robotersystem |
DE112017007903B4 (de) | 2017-10-03 | 2022-04-14 | Mitsubishi Electric Corporation | Haltepositions- und Orientierungslehreinrichtung, Haltepositions- und Orientierungslehrverfahren und Robotersystem |
Also Published As
Publication number | Publication date |
---|---|
US20110010009A1 (en) | 2011-01-13 |
CN101970184A (zh) | 2011-02-09 |
EP2263837A1 (en) | 2010-12-22 |
CN101970184B (zh) | 2012-07-11 |
EP2263837B1 (en) | 2013-06-19 |
EP2263837A4 (en) | 2012-08-29 |
KR20100084663A (ko) | 2010-07-27 |
JP2009214212A (ja) | 2009-09-24 |
US8355816B2 (en) | 2013-01-15 |
JP4835616B2 (ja) | 2011-12-14 |
KR101193125B1 (ko) | 2012-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4835616B2 (ja) | 動作教示システム及び動作教示方法 | |
CN106457569B (zh) | 机器人系统及操作机器人系统的方法 | |
JP6361213B2 (ja) | ロボット制御装置、ロボット、ロボットシステム、教示方法、及びプログラム | |
EP2963513B1 (en) | Teaching apparatus and robot system | |
JP4844453B2 (ja) | ロボットの教示装置及び教示方法 | |
US8406923B2 (en) | Apparatus for determining pickup pose of robot arm with camera | |
Lambrecht et al. | Spatial programming for industrial robots based on gestures and augmented reality | |
US20190202058A1 (en) | Method of programming an industrial robot | |
JP2020055075A (ja) | 拡張現実と複合現実を用いたロボット制御装置及び表示装置 | |
US11833697B2 (en) | Method of programming an industrial robot | |
US20160346921A1 (en) | Portable apparatus for controlling robot and method thereof | |
JP6897396B2 (ja) | 制御装置、ロボットシステムおよび制御方法 | |
WO2020190166A1 (ru) | Способ и система захвата объекта с помощью роботизированного устройства | |
US10724963B2 (en) | Device and method for calculating area to be out of inspection target of inspection system | |
JP7190552B1 (ja) | ロボット教示システム | |
JP4649554B1 (ja) | ロボット制御装置 | |
JP2018122376A (ja) | 画像処理装置、ロボット制御装置、及びロボット | |
JP6488571B2 (ja) | 教示装置、及びロボットシステム | |
CN117400255A (zh) | 一种机器人的控制方法、装置、设备及存储介质 | |
Du et al. | A novel natural mobile human-machine interaction method with augmented reality | |
CN116802020A (zh) | 用于监督式自主抓取的用户界面 | |
JP6343930B2 (ja) | ロボットシステム、ロボット制御装置、及びロボット制御方法 | |
JP2022163836A (ja) | ロボット画像の表示方法、コンピュータープログラム、及び、ロボット画像の表示システム | |
JP7509534B2 (ja) | 画像処理装置、ロボットシステム、及び画像処理方法 | |
JP2019111588A (ja) | ロボットシステム、情報処理装置、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980108422.8 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09720016 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20107010947 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12921291 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009720016 Country of ref document: EP |