MyCobot en
Introduction
1 myCobot - From 0 to 1 1.1
2 Product 1.2
3 How to Read 1.3
pymycobot 1.6.1
1、arm_swing 1.6.1.1
2、arm_route_plan 1.6.1.2
3、arm_safety_control 1.6.1.3
4、testing_arm 1.6.1.4
5、arm_dance 1.6.1.5
6、use_arm 1.6.1.6
7、calibration_manipulator_arm 1.6.1.7
8、control_sunction_pump 1.6.1.8
myblockly 1.6.2
Myblockly Introduction 1.6.2.1
1、relax_fixed_arm 1.6.2.2
2、testing_jaw 1.6.2.3
3、set_move_time 1.6.2.4
4、mechanism_control 1.6.2.5
5、advanced_control_arm 1.6.2.6
Ros 1.6.3
Preparation
1 Background Knowledge 2.1
1.5 Servos and Motors 2.1.5
2 Hardware Learning 2.2
3 myStudio 2.3
3.1 myStudio 2.3.1
Development
Software Platform and API 3.1
1 arduino 3.2
3 python 3.4
3.1 api 3.4.1
4 ROS&Moveit 3.5
4.1 Install environment 3.5.1
4.2 Install mycobot_ros 3.5.2
8.4 Face Detection 3.9.4
8.5 QR code Detection 3.9.5
4 Others
1 Maintenance 4.1
2 FAQ 4.2
3 Resources 4.3
myCobot: From 0 to 1
Hardware
Embedded Microcontroller Based on ESP32
Motor and Steering Gear
M5Stack Basic/ Atom
Software
Arduino Development Environment
C++
Python
ROS , MoveIt
Communication Data
Virtual Machines & Linux (visual system)
Algorithm
Series Manipulator
Coordinate and Coordinate Transformation
DH Parameters
Kinematics
Manipulator Algorithm (e.g. dynamics)
Machine Vision (Vision Set)
Color Recognition
Image Recognition
Hand-Eye Calibration
See and Grab
Extended Applications
End-effector: gripper, suction pump, etc.
Robot Suit & Industry 4.0 Applications
Parts of the Gitbook
Use the directory on the left to jump between sections
Background Knowledge -- learn about tools, industrial robots,
algorithms, software, hardware,etc.
Hardware Learning -- learn about embedded hardware, structural
components, electronic components, etc.
Purpose of Use -- identify the purpose you want to use it for, and
complete the study related to your task
Development and Use
Development Environment -- learn to use Arduino, ROS, uiFlow,
roboFlow, Python and other development environments to develop
myCobot
Accessories -- learn to use myCobot with different accessories, such as
bases, grippers, suction pumps and so on
Machine Vision -- learn to control myCobot under the guidance of
machine vision
Robot Modification -- learn how to modify myCobot into a 4- or 5-axis
manipulator
myCobot Suit
Intelligent Warehouse: learn how to use myCobot to carry different
objects
Artificial Intelligence: learn how to control myCobot to grasp objects
intelligently under the guidance of machine vision
Industry 4.0: learn how to grasp and place objects intelligently by
simulating production line
Information Source
Official website: www.elephantrobotics.com
Tutorial video:
https://www.youtube.com/channel/UC68l2RaRF2Mp8fzpCTzNBfA
Shop website: https://shop.elephantrobotics.com
Contact Us
If you have any other questions, you can contact us as follows.
myCobot Introduction
1 Design Background
Upholding the mission of “Enjoy Robots World”, Elephant Robotics designed
and developed myCobot, the world’s smallest and lightest collaborative robot,
retaining most functions of industrial robots. With compact and elegant industrial
design, excellent and powerful performance, and huge software and hardware
development space, myCobot has unlimited possibilities in applications.
2 Introduction
myCobot is the world's smallest and lightest six-axis collaborative robot,
jointly produced by Elephant Robotics and M5STACK. More than a productivity
tool full of imagination, it supports secondary development according to
users' demands to achieve personalized customization.
With a weight of 850g, a payload of 250g and an arm length of 350mm, myCobot
is compact but powerful. It can be matched with a variety of end effectors
to adapt to different kinds of application scenarios, and it also supports
secondary development on multiple software platforms to meet the needs of
various scenarios such as scientific research and education, smart home, and
commercial applications.
3 Framework
4 Parameters
Table 1: myCobot 280 Product Parameters
Parameter Data
Model myCobot-280
Payload 250g
Repeatability ±0.5mm
Weight 850g
5 Accessories
myCobot
End Effectors
Parallel Gripper
Adaptive Gripper
Angled Gripper
Suction Pump
Base
G Base
Flap Base
Others
Rocker
Battery Case
6 Features
Unique Industrial Design & Extremely Compact
myCobot has an integrated modular design and weighs only 850g, which makes
it very easy to carry. Its overall body structure is compact with few spare
parts, and it can be quickly disassembled and replaced to realize plug and play.
High configuration & Equipped with 2 Screens
myCobot contains 6 high-performance servo motors with fast response, small
inertia and smooth rotation, and carries 2 screens supporting fastLED library
to show the expanded application scene more easily and clearly.
Lego Connector & Thousands of M5STACK Ecological Applications
The base and end of myCobot are equipped with Lego Connectors, which are
suitable for the development of various miniature embedded equipment. Its
base is controlled by M5STACK Basic, and thousands of application cases
can be used directly.
Blockly Programming & Supporting Industrial ROS
Using the UIFlow visual programming software, programming myCobot is simple
and easy for everyone. You can also use Arduino, ROS, or other functional
modules of open source systems, and even RoboFlow, the software of
industrial robots from Elephant Robotics.
Track Recording & Learn by Hand
Moving beyond the traditional point-saving mode, myCobot supports drag
teaching to record a saved track, and can save up to 60 minutes of different
tracks, making it easy and fun for new players to learn.
7 Patents
myCobot is protected by patents
Reading Instruction
Reading Objectives
This Gitbook is designed to help you achieve the following goals
Major Goals
Extended Goals
Skills, Background Requirements and Estimated Learning Time

Level: Freshman
Qualifications: major in Information, Electronics, Mechanical, or Automation
Background knowledge required: knowledge of programming languages; basic knowledge of electronics
Estimated learning time: 100h
Suggested development platform: uiFlow

Level: Superior
Qualifications: knowledge of Arduino or other similar hardware products; knowledge of steering gear and programming; knowledge of IO interfaces, etc.
Background knowledge required: know how to debug an API interface; knowledge of communication
Estimated learning time: 50h
Suggested development platform: Arduino

Level: Professional
Qualifications: experience with at least one kind of industrial robot or consumer robot; ability to develop hardware and software
Background knowledge required: understand Cartesian coordinates; understand joint control; understand the basic use of robots
Estimated learning time: 30h
Suggested development platform: all
Learning Schedule
No. 1: Unboxing (1h)
Theory: accessories of myCobot
Practice: 1. track recording & learn by hand; 2. drive myCobot to do track recording

No. 2: Background Knowledge Learning (5h)
Theory: 1. application background of industrial robots; 2. coordinates and space: cartesian 3d coordinates and rotation, xyz; 3. joint and coordinate control of industrial robots
Practice: 1. joint control and reproduction; 2. speed control; 3. coordinate point position control and circulation

No. 4: Software: Firmware and Updates (2h)
Theory: 1. identify different software platforms and their purposes; 2. principles of firmware loading and adaptation
Practice: 1. choose the software platform that works for you; 2. download and update the firmware

No. 5: Software: Development Environment (2h)
Theory: 1. building of arduino; 2. library file download and update of arduino; 3. knowledge of serial communication
Practice: 1. be familiar with arduino; 2. load the arduino library; 3. program and run the first line of code

No. 6: Learning and Development of Robot Library (5h)
Theory: 1. basic communication and operation types of robots; 2. common operations of robots; 3. control of direction mode and coordinate mode
Practice: 1. communicate with myCobot; 2. control myCobot to move; 3. operate IO port, gripper and other signals
No. 7: uiFlow (10h)
Theory: 1. understand the basic architecture and relationships of the visual programming interface: sensors, actuators and processes; 2. variables, loops, and judgments; 3. control method of the manipulator arm
Practice: 1. display different fonts on Basic; 2. make myCobot move to different positions according to the three buttons of Basic; 3. control myCobot to cycle through multiple points

No. 8: roboFlow (5h)
Theory: 1. learning of common industrial operating systems for robots; 2. common modules of roboFlow: point position, fast movement, IO control and input; 3. advanced modules of roboFlow: looping, judging, and pallet procedures
Practice: 1. control the movement of myCobot; 2. basic control of IO input and output; 3. cycle control and judgment

No. 9: Intelligent Warehouse (10h)
Theory: 1. learning of robot point motion; 2. rules for the placement of pallets; 3. principle of end-effector control
Practice: 1. operate the robot to move at different points; 2. operate the robot to grab and place; 3. control and recognition of the gripper
No. 10: Algorithm about Image Recognition (20h)
Theory: 1. introduction to the visual sensor of stickV; 2. introduction and programming environment of maixpy; 3. methods and strategies of common color recognition; 4. methods and strategies of common shape recognition; 5. methods and strategies of common area identification; 6. json data transfer
Practice: 1. build the virtualbox environment; 2. read different colors; 3. recognition of different shapes; 4. data transfer and reading in uiFlow

No. 11: Vision and Robotics Control (10h)
Theory: 1. correlation of world and camera coordinates; 2. standardization of qr code images; 3. movement correction
Practice: 1. operate myCobot to the camera coordinate system; 2. the motion of myCobot in the camera coordinate system; 3. re-calibration and setting

No. 12: Artificial Intelligence (20h)
Theory: 1. learn and make the flow chart; 2. learn and make the electrical connection diagram; 3. description and operation of shape classification
Practice: 1. sensor connection; 2. connection and drive of the gripper; 3. robot actuator & joint debugging in uiFlow
In addition, you can also buy our Industry 4.0 suit to learn how to build and use
the simulation industrial applications.
Service Support
If the above schedule cannot meet your needs, you can contact the customer
service team for further communication. We provide software and hardware
customization services.
Facebook: https://www.facebook.com/MyCobot-116558893805177
Mail: support@elephantrobotics.com
4 Use cases
myCobot can be used for both personal learning applications and educational
applications.
Personal Applications
Leisure & Entertainment
dance with rhythm, play an instrument, take pictures and video, write and
paint, play games and so on
Intelligent Furniture
remote control of myCobot to grab and carry objects with the gripper and
suction pump, such as mixing coffee, grabbing bread, etc.
Teaching Aids
simulating the teaching of industrial robots, K12 teaching through an
entertainment experience, a cost-effective research laboratory assistant
Commercial Exhibition
cooperate with other products to do commercial scene demonstrations
Educational Applications
1 Maker Education /K12 Education
2 Artificial Intelligence Course
4 Quick Start
Step 1: What’s in the Box
After the package arrives, please confirm that the robot packaging is
intact and undamaged. If there is any damage, please contact the logistics
company and the local supplier in time.
Please install the robot system in an environment that meets the conditions
described in the table below, in order to maintain the performance of the
machine and use it safely.
Environment Target
Temperature -10℃~45℃
Relative Humidity 20%~70%
Indoor/Outdoor Indoor
Rated Voltage: 7-9V
Rated Current: 3-5A
The type of plug: DC 5.5mm×2.1
Note: You cannot power the arm simply by plugging a Type-C cable into the Basic.
After entering the recording mode, select the recording storage location
Start Recording:
Click record and select the storage location, then manually drag the robotic
arm to complete your target action and remember to save it; the action will
be recorded and stored.
Note: The default recording time is 100s. If the recording time is so long
that it would exceed the memory, you can customize it by modifying the code,
or record it on the computer.
Play:
Open source code: The code above is open source, named MainControl. You can
download and customize the code via GitHub; refer to the link below:
https://github.com/elephantrobotics/myCobot/tree/main/Arduino/MycobotBasic/examples
Raspberry Pi myCobot
You can create a python file at any position, for example: light_led.py , then write
the following code in the file:
from pymycobot.mycobot import MyCobot
from pymycobot import PI_PORT, PI_BAUD

mc = MyCobot(PI_PORT, PI_BAUD)
mc.set_color(255, 0, 0)
After saving, run it. You'll see the light board on top of myCobot lit in red.
python2 update: pip install --upgrade pymycobot
python3 update: pip3 install --upgrade pymycobot
More examples
In addition to the above content, we provide more examples to help users use the
python API.
https://www.elephantrobotics.com/wp-content/uploads/2021/04/PythonAPI
tutorials.zip
The use of ROS
We've pre-installed ROS Kinetic in the Raspberry Pi version of myCobot and
provide the package myCobotROS , so you can use it easily.
cd ~/ros_catkin_ws
source devel/setup.bash
https://github.com/elephantrobotics/myCobotROS#3-visualization-in-rviz
Python API of myCobot
This is the Python API for serial communication with, and control of, mycobot.
Installation
Notes:
Make sure that you flash the Atom firmware to the top Atom and the Transponder
firmware to the Basic.
Firmware Atom and Transponder download link:
https://github.com/elephantrobotics/myCobot/tree/main/Software
You can also use myStudio to update. Download link of myStudio:
https://github.com/elephantrobotics/myStudio/releases
Pip
pip install pymycobot --upgrade
Source code
Usage:
Please go to here.
1. The robot arm swings left and right
Introduction of API
The APIs used to make the robot arm swing left and right are:
1、 MyCobot(port)
Parameter Description: port : The data type is String; it is the serial port
that controls the robot arm. On Windows you can view it under Ports in
Device Manager.
2、 get_angles()
Function: Get the angles of the six joints of the robot arm
Return Value: The return type is a list with six elements, corresponding to
joints 1 to 6.
3、 send_angles(degrees,speed)
Parameter Description:
degrees : The parameter type is list. It must contain the angle data for the
six joints. The angle value range for each joint is -180 to 180.
speed : The data type is int, value range 0~100. It represents the speed at
which the robot arm runs to the specified position; the higher the value,
the greater the speed.
4、 send_angle(id, degree, speed)
Parameter Description:
id : Represents a joint of the robotic arm; there are six joints in total,
each with a specific representation. For example, joint 1 can be written as
Angle.J1.value .
degree : Represents the angle of the joint, value range -180 to 180.
5、 release_all_servos()
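Before handing values to send_angles(), it can help to sanity-check them against the ranges documented above. The following is a minimal sketch; the helper name and error handling are our own and not part of pymycobot:

```python
def validate_send_angles(degrees, speed):
    # Check the argument ranges documented for send_angles():
    # six joint angles in -180..180 and a speed in 0..100.
    if len(degrees) != 6:
        raise ValueError("expected 6 joint angles, got %d" % len(degrees))
    for i, deg in enumerate(degrees, start=1):
        if not -180 <= deg <= 180:
            raise ValueError("joint %d angle %s out of range -180..180" % (i, deg))
    if not 0 <= speed <= 100:
        raise ValueError("speed %s out of range 0..100" % speed)
    return True
```

You could call validate_send_angles(angles, 50) right before mc.send_angles(angles, 50) to fail loudly instead of sending an out-of-range command.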
Content of code
from pymycobot.mycobot import MyCobot
from pymycobot.genre import Angle
from pymycobot import PI_PORT, PI_BAUD  # When using the Raspberry Pi version of myCobot
import time

if __name__ == '__main__':
    # Initialize a MyCobot object
    mc = MyCobot(PI_PORT, PI_BAUD)
    # Pass the angle parameters as a list, letting the robot arm move to the starting position
    mc.send_angles([0, 0, 0, 0, 0, 0], 50)
    print(mc.is_paused())
    # Set the wait time to ensure that the robot arm has reached the specified position
    # while not mc.is_paused():
    time.sleep(2.5)
    num = 5  # number of swings (example value; the original value was lost in extraction)
    while num > 0:
        # (The send_angles() calls that performed the left and right swing were
        # truncated in the original document; insert your own target angles
        # before each of the waits below.)
        time.sleep(2)
        time.sleep(1.5)
        time.sleep(1.5)
        num -= 1
    # Let the robot arm shrink. You can pose the robot arm manually and then use the
    # get_angles() function to capture the pose you want it to return to.
    mc.send_angles([88.68, -138.51, 155.65, -128.05, -9.93, -15.29], 50)
    # Set the wait time to ensure that the robot arm has reached the specified position
    time.sleep(2.5)
Effects as shown
2. The head of the robot arm intelligently plans the route
Knowledge preparation for API
send_coords([x,y,z,rx,ry,rz], speed, model) is used to move the head of the
robot arm to a specified point in a specified manner. It is primarily used
to intelligently plan the movement of the head of the robot arm from one
position to another. x, y, z represent the position of the head of the robot
arm in space (in a rectangular coordinate system), and rx, ry, rz represent
the posture of the head of the robot arm at that point (as Euler angles).
The implementation of the algorithm and the representation of Euler angles
require a certain amount of academic background, so we will not explain them
in depth here; understanding the rectangular coordinate system is enough to
use this function well.
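For readers who want to see what rx, ry, rz encode, the sketch below converts three Euler angles (in degrees) into a 3x3 rotation matrix. Note the assumption: the Z-Y-X (intrinsic) rotation order is chosen for illustration; the actual convention used by the myCobot firmware should be confirmed before relying on this for real poses.

```python
import math

def euler_to_matrix(rx, ry, rz):
    # Build a 3x3 rotation matrix from Euler angles given in degrees.
    # ASSUMPTION: R = Rz(rz) @ Ry(ry) @ Rx(rx) (Z-Y-X intrinsic order);
    # verify against your firmware's convention.
    rx, ry, rz = (math.radians(v) for v in (rx, ry, rz))
    cx, sx = math.cos(rx), math.sin(rx)
    cy, sy = math.cos(ry), math.sin(ry)
    cz, sz = math.cos(rz), math.sin(rz)
    return [
        [cz * cy, cz * sy * sx - sz * cx, cz * sy * cx + sz * sx],
        [sz * cy, sz * sy * sx + cz * cx, sz * sy * cx - cz * sx],
        [-sy,     cy * sx,                cy * cx],
    ]
```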
Introduction of API
1、 send_coords([x,y,z,rx,ry,rz],speed,model)
Function:Intelligently plan the route to move the head of the robot arm from the
original point to the specified point.
Parameter Description
speed :Represents the speed of the robot arm. The range of values is 0 to 100,
the higher the value, the faster the speed.
model : Values are limited to 0 and 1. 0 indicates that the path of the head
of the robot arm is nonlinear, i.e. the route is planned freely, as long as
the head of the robot arm reaches the specified point with the specified
posture. 1 indicates that the movement of the head of the robot arm is
linear, i.e. the planner moves the head of the robot arm in a straight line
to the specified point.
2、 get_coords()
Function: Get the spatial coordinates of the head of the robot arm at this time and
the current posture.
Return value: The return type is a list of six float elements: the first
three values x, y, z represent the coordinates of the head of the robot arm,
and the last three values rx, ry, rz represent its posture.
3、 send_coord(id,coord,speed)
Parameter Description:
coord : The data type is float. Represents the new value for the coordinate
being modified.
speed : Represents the speed of the robot arm. Value range 0~100; the higher
the value, the faster it is.
Content of code
from pymycobot.mycobot import MyCobot
from pymycobot.genre import Coord
from pymycobot import PI_PORT, PI_BAUD  # When using the Raspberry Pi version of myCobot
import time

mc = MyCobot(PI_PORT, PI_BAUD)
# Intelligently plan the route, letting the head arrive linearly at the
# position [59.9, -65.8, 350.7] with posture [-50.99, 83.14, -52.42]
mc.send_coords([59.9, -65.8, 350.7, -50.99, 83.14, -52.42], 80, 1)
# Set the wait time
time.sleep(1.5)
# Change only the x coordinate of the head: set it to -40
mc.send_coord(Coord.X.value, -40, 70)
Effects as shown
3. Safety control of the robot arm
Introduction of API
1、 is_power_on()
2、 power_on()
3、 power_off()
Function: Powers off the robot arm; all functions will fail.
4、 pause()
5、 resume()
6、 stop()
7、 is_in_position(data,flag)
Function: Determines whether the robot arm has reached the specified position
Parameter Description:
flag : 1 means data is the coordinate data of the head of the robot arm;
0 means data is the collection of joint angles of the robot arm.
8、 is_paused()
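As a small illustration of combining the power-state APIs above, the sketch below powers the arm on only when it is not already powered. The helper name and return-value convention are our own; is_power_on() returning 1/0/-1 follows the pattern of the other status calls in this section:

```python
def ensure_powered(mc):
    # is_power_on() is assumed to return 1 when the arm is powered and
    # 0 (or -1 on error) otherwise. Call power_on() only when needed.
    # Returns True if power_on() had to be called.
    if mc.is_power_on() == 1:
        return False
    mc.power_on()
    return True
```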
Content of code
from pymycobot.mycobot import MyCobot
from pymycobot import PI_PORT, PI_BAUD  # When using the Raspberry Pi version of myCobot
import time
Effects as shown
4. Test the robot arm
Introduction of API
1、 is_all_servo_enable()
2、 jog_angle(joint_id,direction,speed)
Parameter Description:
3、 jog_stop()
4、 release_servo(servo_id)
Parameter Description:
5、 focus_servo(servo_id)
Parameter Description
Content of code
from pymycobot.mycobot import MyCobot
from pymycobot import PI_PORT, PI_BAUD  # When using the Raspberry Pi version of myCobot
import time
Effects as shown
5. Dance of the robot arm
Introduction of API
1、 set_color(R, G, B)
Parameter Description:
Content of code
from pymycobot.mycobot import MyCobot
from pymycobot import PI_PORT, PI_BAUD  # When using the Raspberry Pi version of myCobot
import time

if __name__ == '__main__':
    # Initialize a MyCobot object
    mc = MyCobot(PI_PORT, PI_BAUD)
    # Set the start time
    start = time.time()
    # Let the robot arm move to the specified position
    mc.send_angles([-1.49, 115, -153.45, 30, -33.42, 137.9], 80)
    # Check whether it has moved to the specified position
    while not mc.is_in_position([-1.49, 115, -153.45, 30, -33.42, 137.9], 0):
        # Resume the movement of the robot arm
        mc.resume()
        # Let the robot arm move for 0.5s
        time.sleep(0.5)
        # Pause the movement of the robot arm
        mc.pause()
        # Check if the movement timed out
        if time.time() - start > 3:
            break
    # Set start time
    start = time.time()
    # Let the movement last 30 seconds
    while time.time() - start < 30:
        # Let the robot arm reach this position quickly
        mc.send_angles([-1.49, 115, -153.45, 30, -33.42, 137.9], 80)
        # Set the color of the light to [0, 0, 50]
        mc.set_color(0, 0, 50)
        time.sleep(0.7)
        # Let the robot arm reach this position quickly
        mc.send_angles([-1.49, 55, -153.45, 80, 33.42, 137.9], 80)
        # Set the color of the light to [0, 50, 0]
        mc.set_color(0, 50, 0)
        time.sleep(0.7)
    # mc.release_all_servos()
Effects as shown
6. Use of the gripper
Introduction of API
1、 is_gripper_moving()
Return parameter:
2、 set_encoder(joint_id, encoder)
Parameter Description
encoder : Value range 0~4096; 2048 corresponds to 0° over the angle range
-180° to 180°.
3、 set_encoders(encoders, sp)
Parameter Description:
encoders : A list of six int elements; the six encoder values represent, in
order, the positions of joints 1 to 6.
4、 get_encoder(joint_id)
Parameter Description
joint_id : Value range 1-7; 1-6 represent the joints and 7 represents the gripper.
Return value:
5、 set_gripper_value(value, speed)
Function:Let the gripper turn to the specified position at the specified speed.
Parameter Description:
value : Indicates the position the claws are to reach, value range 0~4096.
6、 get_gripper_value()
Function:Get the encoder data information of gripper
Return value:
7、 set_gripper_state(flag, speed)
Function:Let the gripper reach the specified state at the specified speed.
Parameter Description
flag : 1 means the claws are closed and 0 means the claws are open.
speed : Indicates how quickly the specified state is reached, value range
0~100.
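The encoder mapping described above (0~4096 covering -180°~180°, with 2048 at 0°) can be captured in two small conversion helpers. These are illustrative utilities of our own, not part of pymycobot:

```python
def degrees_to_encoder(degrees):
    # Map a joint angle in degrees (-180..180) to a servo encoder value,
    # where 2048 corresponds to 0 degrees and the full 0..4096 range
    # spans 360 degrees.
    return int(round(2048 + degrees * 4096 / 360.0))

def encoder_to_degrees(encoder):
    # Inverse mapping: encoder value back to degrees.
    return (encoder - 2048) * 360.0 / 4096
```

For example, degrees_to_encoder(0) gives 2048, so a value computed this way can be passed straight to set_encoder() or set_encoders().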
Content of code
from pymycobot import PI_PORT, PI_BAUD  # When using the Raspberry Pi version of myCobot
from pymycobot.mycobot import MyCobot
import time
def gripper_test(mc):
    print("Start check IO part of api\n")
    # Check if the gripper is moving
    flag = mc.is_gripper_moving()
    print("Is gripper moving: {}".format(flag))
    time.sleep(1)

if __name__ == "__main__":
    # Initialize a MyCobot object
    mc = MyCobot(PI_PORT, PI_BAUD)
    # Let it move to the zero position
    mc.set_encoders([2048, 2048, 2048, 2048, 2048, 2048], 20)
    time.sleep(3)
    gripper_test(mc)
Effects as shown
7. Calibration of the robot arm
Introduction of API
1、 set_servo_calibration(servo_no)
Parameter Description
2、 is_controller_connected()
Return value: 1 means writable, 0 means not writable, -1 indicates an error.
3、 set_gripper_ini()
Content of code
from pymycobot.mycobot import MyCobot
from pymycobot import PI_PORT, PI_BAUD  # When using the Raspberry Pi version of myCobot

mc = MyCobot(PI_PORT, PI_BAUD)
# Detect whether the robot arm can burn in the program
if mc.is_controller_connected() != 1:
    print("Connect the robot arm correctly for program writing")
    exit(0)
# Fine-tune the robot arm to ensure that all snaps in the adjusted position are aligned
# Based on the alignment of the robot arm port; only one case is given here
mc.send_angles([0, 0, 18, 0, 0, 0], 20)
# Calibrate the position at this time, and the angle position after calibration is rep
# The for loop is equivalent to the set_gripper_ini() method
for i in range(1, 7):
    mc.set_servo_calibration(i)
8. Control the suction pump
Connection
First, we need to power the suction pump as shown in the picture.
Next, we need to connect the suction pump and myCobot using the control
cables. Connect 2 and 5 in the pin port, as shown in the picture.
Introduction of API
set_basic_output(pin_no, pin_signal)
Parameter Description:
pin_no : An int type parameter; the number of the label at the bottom of the
device, taking only the numeric portion.
Content of code
pump_off()
time.sleep(3)
pump_on()
time.sleep(3)
pump_off()
time.sleep(3)
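The code above calls pump_on() and pump_off() without defining them. A common way to implement them is as thin wrappers around set_basic_output() driving the two wired pins. Note the assumptions: the pin numbers follow the wiring shown above (2 and 5), and the on/off signal levels are guesses that must be checked against your own setup:

```python
def make_pump_controls(mc, pins=(2, 5), on_signal=0, off_signal=1):
    # Build pump_on()/pump_off() helpers around set_basic_output().
    # ASSUMPTION: pins 2 and 5 per the wiring above, and active-low
    # signals; verify both against your hardware before use.
    def pump_on():
        for pin in pins:
            mc.set_basic_output(pin, on_signal)

    def pump_off():
        for pin in pins:
            mc.set_basic_output(pin, off_signal)

    return pump_on, pump_off
```

With a connected arm you would write: pump_on, pump_off = make_pump_controls(mc) and then call them as in the snippet above.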
Use Instruction of Myblockly
Interface introduction
Figure 1-1 Interface
As Figure 1-1 indicates, ① represents the puzzle toolbar, which contains
logical control puzzles, variable-setting puzzles, mathematical function
puzzles, text-type puzzles, and robot-arm control puzzles. ② represents the
puzzle board: pull a method module from the puzzle toolbar onto the board
and the module will appear on the drawing board. ③ represents the code
display area; the method modules stitched together on the drawing board
automatically generate python code in the code display area.
Figure 1-3 Save Interface
In particular, it is important to add the suffix .xml when defining a saved
file name; otherwise the saved file cannot be loaded. Of course, if you
accidentally forget to add the .xml suffix, it doesn't matter: just rename
the file before loading it and add the .xml suffix.
Figure 1-4 Load Interface
You can click ② Load in Figure 1-2 to load a file; the effect after clicking
is shown in Figure 1-4. Only files with the .xml suffix can be loaded.
Introduction of Myblockly
Logic
1、As shown in Figure 1-1, all methods contained in the Logic module are
included.
Figure 1-3 Method details (2)
As shown in Figure 1-3, ① represents if (condition) do (program 1) else
(program 2): if the condition is met, program 1 is executed; otherwise
program 2 is executed.
Loops
1. Figure 2-1 shows all the methods contained in the Loops module.
Figure 2-1 Loops module
2. Method details
Figure 2-2 Loops module method details
As shown in Figure 2-2, ① repeats the program inside do 10 times. ② repeats
the program inside do num times (the do block is obscured in the figure).
Note: to use a loop variable inside the loop, the variable names must be set
consistently.
Text
1. Figure 3-1 shows all the methods contained in the Text module.
Figure 3-1 Text Module
2. Method details
Figure 3-2 Text module method details (1)
As shown in Figure 3-2, ① represents text content, which can be customized.
② calculates the length of the specified text content. ③ outputs the text
content.
Figure 3-3 Text module method details (2)
As shown in Figure 3-3, ① finds the position of the first or last occurrence
of a specified string within the selected string; use the drop-down box to
choose first or last.
Math
1. Figure 4-1 shows all the methods contained in the Math module.
Figure 4-2 Math module method details
As shown in Figure 4-2, (1) represents a numeric constant, which can be
customized. (2) represents operations on two variables, such as addition and
subtraction.
List
1. Figure 5-1 shows all the methods contained in the List module.
Figure 5-1 List Module
2. Method details
Figure 5-2 List module method details
As shown in Figure 5-2, (1) creates an empty list. (2) creates a list by
repeating a specified value a given number of times.
Variables
1. Figure 6-1 shows the Variables module.
Figure 6-1 Variables module
2. As shown in Figure 6-2, click where the arrow points to start creating a
variable.
Figure 6-2 Custom variable name
As shown in Figure 6-2, enter a custom variable name in the input box and
click Look up to create it.
Figure 6-3 Variable creation effect
Figure 6-3 shows the created variable.
Functions
As shown in Figure 7-1, the Functions module contains two types of
functions: the first, shown as ①, has no return value; the second, shown as
⑥, has a return value.
Time
Mycobot
Figure 9-1 Mycobot Module
Release and fixation of the robot arm
Case Introduction
In this case, a loop calls the focus servo method to fix the six joints, a
time method holds them fixed for 10 seconds, and finally a loop calls the
release method to release the six joints.
2. Content of demo
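The case above can also be sketched with the pymycobot calls introduced earlier (focus_servo() and release_servo()). This is an illustrative equivalent of the Blockly demo, not the demo itself:

```python
import time

def fix_then_release(mc, hold_seconds=10.0):
    # Fix (lock) each of the six joints with focus_servo(), hold the pose
    # for a while, then release every joint again with release_servo().
    for joint in range(1, 7):
        mc.focus_servo(joint)
    time.sleep(hold_seconds)
    for joint in range(1, 7):
        mc.release_servo(joint)
```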
Gripper detection of the robot arm
Case Introduction
Use the Set_Gripper_State function of myCobot to open and close the gripper
10 times, adjusting the angle after each close.
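The open/close cycle described above can be sketched in pymycobot as follows (flag 0 opens the claws and 1 closes them, per the gripper API section; the helper name and pause parameter are our own):

```python
import time

def gripper_cycle(mc, cycles=10, speed=70, pause=1.0):
    # Open and close the gripper `cycles` times using set_gripper_state().
    for _ in range(cycles):
        mc.set_gripper_state(0, speed)  # open
        time.sleep(pause)
        mc.set_gripper_state(1, speed)  # close
        time.sleep(pause)
```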
demo
Set the movement time of the robot
arm
Case content
The main content of this case is to call the jog_angle function in a loop to
keep the six joints moving continuously, and to stop the motion with the
jog_stop function. Finally, the robot arm is moved to a safer position and
the joints are released and powered off.
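The jog loop described above can be sketched as follows, using the jog_angle(joint_id, direction, speed) and jog_stop() calls from the testing chapter. The timing values are illustrative defaults, not taken from the original demo:

```python
import time

def jog_all_joints(mc, seconds_per_joint=1.0, direction=1, speed=30):
    # Jog each of the six joints in turn for a fixed time, stopping the
    # jog with jog_stop() before moving on to the next joint.
    for joint in range(1, 7):
        mc.jog_angle(joint, direction, speed)
        time.sleep(seconds_per_joint)
        mc.jog_stop()
```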
Content of program
Control mechanism of the robot arm
Case content
This case mainly calls some of the commonly used control functions of the
robot arm, such as powering off the robot arm, powering it on, pausing
motion, resuming motion and other control functions, as well as controlling
the lights on the head of the robot arm.
Demo
Advanced operation of the robot arm
Case Introduction
This case mainly shows how the robot arm can intelligently judge whether it
has already arrived at the specified position; based on this function, the
robot arm simply repeats two arrival instructions.
First, use the if do module to judge whether the robot arm is powered; if
not, power the robot arm. Output the current joint angle information.
Move the robot arm to zero. Define the angles variable. Use the repeated
method in the creation of list-type data to assign the zero position
information to angles . Define limit_time to determine whether the
movement times out.
Pass angles into the is in position method to determine whether the robot
arm has reached the specified position. Use the repeat module to check every
0.5s whether the robot arm has reached the specified position, with a
timeout timer: if after more than 7s the robot arm has still not reached the
specified position, treat it as arrived and execute the next instructions.
The principle of the rest is similar, so no more explanation is given.
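The polling-with-timeout logic described above (check every 0.5s, give up after 7s) can be sketched in Python as a reusable helper around is_in_position(). The function name and return convention are our own:

```python
import time

def wait_until_in_position(mc, angles, flag=0, timeout=7.0, poll=0.5):
    # Poll is_in_position() every `poll` seconds until the arm reports
    # arrival, giving up after `timeout` seconds. Returns True on arrival
    # and False on timeout, mirroring the 0.5s / 7s logic described above.
    deadline = time.time() + timeout
    while time.time() < deadline:
        if mc.is_in_position(angles, flag) == 1:
            return True
        time.sleep(poll)
    return False
```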
demo
When you use the Raspberry Pi version of mycobot, you should already have a
Raspberry Pi system equipped with ROS Kinetic.
1 Background
Robot
Software
Electronics
Mechanics
Motor
1.1 Robot
This chapter is excerpted from Introduction to Robotics: Mechanics and
Control by J. Craig. If you want to read more about it, please buy it online.
1 Background
The history of industrial automation is characterized by the rapid renewal
of technological means. The renewal of such automation technology is closely
related to the world economy, whether as an inducement or as a result of its
development. The industrial robot of the 1960s was undoubtedly a unique
piece of equipment, and its combination with computer-aided design (CAD)
systems and computer-aided manufacturing (CAM) systems represents the latest
development trend in modern manufacturing automation. These technologies are
leading the transition to a new field of industrial automation.
The manipulator is one of the most important types of industrial robots.
Whether a manipulator can be called an industrial robot is controversial.
The equipment shown here is generally considered to belong to the category
of industrial robots, while CNC (NC) grinders are usually outside this
category.
2 Basic Concepts
Mechanical Arm
In order to describe the position and posture of a space object, we usually place
the object firmly in a space coordinate system, that is, the reference frame, and
then we study the position and posture of the space object in this reference
coordinate system.
Direct Kinematics
Kinematics is the study of the motion of objects without regard to the
forces causing that motion. In kinematics, we study position, velocity,
acceleration, and the higher-order derivatives of the position variables
with respect to time or other variables. Thus, the research object of
manipulator kinematics is all the geometric and temporal characteristics of
motion. Almost all manipulators are composed of rigid links, with adjacent
links connected by joints that allow relative motion. These joints are
usually fitted with position sensors to measure the relative position of
adjacent links. If the joint is revolute, this displacement is called the
joint angle. Some manipulators have sliding (or prismatic) joints, so the
displacement between two adjacent links is a linear motion, which is
sometimes called the joint offset.
End-effector
Inverse Kinematics
Given the position and posture of the end-effector of the manipulator,
calculate all joint angles that can reach the given position and posture.
3 Space description
Position
Once the coordinate system is established, we can locate any point in the
world coordinate system with a 3x1 position vector. Since many coordinates
are often defined in the world coordinate system, a piece of information
must be attached to the position vector indicating which coordinate system
it is defined in. In this book, the position vector has a leading
superscript to indicate the coordinate system to which it refers.
Posture
We find that it is often necessary not only to represent points in space, but also to
describe the posture of objects in space. For example, if the vector "P" in Figure
2-2 directly determines a point between the fingers of the manipulator hand, the
position of the hand can only be fully determined if the posture of the hand is
known. Assuming that the manipulator has a sufficient number of joints, the
manipulator can be in any position and the position of the points between the
fingers remains constant. To describe the posture of an object, we will fix a
coordinate system on the object and give the representation of this coordinate
system with respect to the reference system. In Figure 2-2, the coordinate system
{B} is known to be fixed to the object in some way. The description in {B} relative
to {A} is sufficient to indicate the attitude of object (A).
Coordinate System
4 DH Parameters
Definition
For a rotational joint n, set theta_n = 0.0 so that the direction of the X_n
axis is the same as that of the X_{n-1} axis, and choose the origin of
coordinate system {n} so that d_n = 0.0. For a prismatic joint n, set the
direction of the X_n axis so that theta_n = 0.0; when d_n = 0.0, choose the
origin of coordinate system {n} at the intersection of axis X_{n-1} and
joint axis n.
In the link coordinate system, if the link coordinate system is fixedly
attached to the link as described above, the link parameters can be defined
as follows:
myCobot DH parameter
Joint   alpha   a        d        theta     offset
1       0       0        131.56   theta_1   0
3       0       -110.4   0        theta_3   0
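Each table row defines a homogeneous transform between adjacent link frames. The sketch below assumes the modified (Craig) DH convention, T = Rx(alpha) * Dx(a) * Rz(theta) * Dz(d), which matches the textbook notation used above; whether the myCobot firmware uses this exact convention is an assumption.

```python
import math

def dh_transform(alpha, a, d, theta):
    """Homogeneous transform from link frame {n-1} to link frame {n},
    assuming the modified (Craig) DH convention:
    T = Rx(alpha) * Dx(a) * Rz(theta) * Dz(d)."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    ct, st = math.cos(theta), math.sin(theta)
    return [
        [ct,      -st,      0.0,  a],
        [st * ca,  ct * ca, -sa,  -sa * d],
        [st * sa,  ct * sa,  ca,   ca * d],
        [0.0,      0.0,      0.0,  1.0],
    ]

# Joint 1 row of the table: alpha = 0, a = 0, d = 131.56, theta = theta_1.
# At theta_1 = 0 the result is a pure 131.56 mm translation along Z.
t1 = dh_transform(0.0, 0.0, 131.56, 0.0)
```

Multiplying the per-joint transforms in order gives the pose of the end frame relative to the base, which is the forward kinematics the DH table encodes.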
1.2 Software
For myCobot users, software operation requires basic C/C++ knowledge to drive
the microcontrollers Basic and Atom.
Github: download the latest myCobot code and updated user instructions
Arduino: the IDE (integrated development environment) used for core
development of myCobot; programs are written in C/C++
You can also directly use Python, ROS, UIFlow, and other development tools,
which are described in the chapter of Development and Use .
If you want to learn programming languages, you can learn from books, open
classes, online videos, etc. Your development platform can be Windows,
macOS, or Linux.
Github
GitHub is a hosting platform for open source and private software projects, it
houses our software (Python, ROS, Arduino), APP, industrial visual programming
software - RoboFlow, firmware, user manual and development guide manual.
Github:https://github.com/elephantrobotics/myCobot
Arduino
Arduino is an open-source electronic prototyping platform that is convenient,
flexible, and easy to use; it contains hardware (various types of Arduino boards)
and software (the Arduino IDE). The hardware part (the development board)
consists of a microcontroller (MCU), flash memory (FLASH), a set of general
input/output interfaces (GPIO), etc.; you can think of it as a microcomputer
motherboard. The software part is mainly composed of the Arduino IDE on the
PC, the related Board Support Package (BSP), and abundant third-party function
libraries. Users can easily download the BSP and function libraries for their
development board through the Arduino IDE and write their own programs.
Arduino IDE
If you need to download the Arduino IDE, you can visit the Arduino official
website to download and install the version corresponding to your computer's
operating system.
Select Tools -> Development Board: -> Development Board Manager
In the new dialog box that pops up, enter and search ESP32 , then click Install
After the installation, select Tools -> Development Board: to check whether it
was successful, as shown in the figure below:
2. Add Libraries
Open Arduino IDE, select Project -> Load Library -> Management Library...
Some project libraries need to be downloaded from GitHub, such as
MycobotBasic
Click open; when "The library has been added. Please check the 'Import Library'
menu." is displayed in the lower right, the environment configuration of
Arduino is complete.
Electronics
Introduction
A single-chip microcomputer (microcontroller/SCM) is a kind of integrated circuit
chip. Using VLSI technology, it integrates a central processing unit (CPU) with
data processing capability, random access memory (RAM), read-only memory
(ROM), various I/O ports, an interrupt system, timers/counters, and other
functions (possibly also a display drive circuit, pulse-width modulation circuit,
analog multiplexer, A/D converter, and other circuits) onto a single silicon chip,
forming a small and complete microcomputer system that is widely used in the
field of industrial control. Since the 1980s, microcontrollers have developed from
4-bit and 8-bit devices to today's 300 MHz high-speed microcontrollers.
Basic Structure
Arithmetic Unit
Perform various arithmetic operations.
Perform various logical operations and perform logical tests, such as a zero-
value test or a comparison of two values.
All operations performed by the arithmetic unit are directed by control signals
issued by the controller; an arithmetic operation produces a result, and a
logical operation produces a decision.
Controller
Extract an instruction from memory and indicate the location of the next
instruction in memory.
Decode and test the instructions, then generate the corresponding operation
control signals to facilitate the execution of the specified actions.
Direct and control the direction of data flow between CPU, memory, and
input/output devices.
1.4 Mechanics Background
This section is under development.
Motor & Steering Gear
Motor
According to the type of working power supply, it can be divided into:
According to the application, it can be divided into:
Servo Motor
A steering gear is actually a servo motor; it takes its name from the rudder-shaft
control in model aircraft and similar equipment, so these lightweight servo
motors are called steering gears.
Servo Motor
A servo motor is a motor that controls the operation of mechanical components
in a servo system; it is a kind of indirect speed-change device acting as an
auxiliary motor.
Hardware of myCobot
The hardware of myCobot is composed of electronic parts and mechanical
parts. Electronic parts include the PCBA, controllers, steering gears, charger, etc.
Structural parts include the solid plastic casing, LEGO mounting ports, flanges,
fasteners, bearings, etc.
It is mainly used for the application side of myCobot, which can carry out
Arduino programming, UIFlow programming, communication or various
software defined by users themselves. Basic related programs are open
source for everyone. Check out our GitHub for more information.
Note: Basic and Atom can communicate with each other through a
communication protocol that is open to all users simultaneously.
myCobot Electronic Components
Base (gray shell)
Charging Board PCBA: charging and voltage protection function
PCBA Substrate: control signal conversion function
M5 Basic: main controller
Servo Motor No. 1
Body (white shell)
Servo Motor No. 2-6
Pinboard of Servo Motor
End
M5 Atom: secondary controller
Peripheral: charger
Structural Parts
Base
Through-Hole
Lego Interface
Body
White plastic shell
Fixed parts, fasteners, bearings, etc.
End
Output Flange (silver)
The coordinate system represented by DH parameter of myCobot is as follows.
Maintenance
If there are some problems with your myCobot, you can contact our customer
service for maintenance.
BASIC
Description
M5Stack BASIC Kit, like its namesake, is a starter kit among the M5Stack
development kit series. It’s a modular, stackable, scalable, and portable device
which is powered with an ESP-32 core, which makes it open source, low cost, full-
function, and easy for developers to handle new product development on all
stages including circuit design, PCB design, software, mold design and
production. This Basic kit provides a friendly price and full-featured resources
which makes it a good starter kit for you to explore IoT.
If you want to explore the fastest way of IoT prototyping, the M5Stack
development board is a perfect solution. Unlike others, the M5Stack
development board is highly efficient, enclosed in an industrial-grade case, and
based on the ESP32. It integrates Wi-Fi & Bluetooth modules and contains a
dual-core processor and 16MB of SPI Flash. Together with 30+ stackable
M5Stack modules, 40+ extendable units, and program languages at different
levels, you can create and verify your IoT product in a very short time.
Pin description of myCobot
Features
ESP32-based
Built-in speaker, buttons, color LCD, power/reset button
TF card slot (16GB maximum)
Magnetic suction at back
Extendable Pins & Holes
M-Bus Socket & Pins
Program Platform: UIFlow, MicroPython, Arduino
Applications
Internet of things terminal controller
Stem education product
DIY creation
Smart home equipment
Parameters
Resources Parameter
Operating Temperature: 32°F to 104°F (0°C to 40°C)
Button
IP5306 charging/discharging voltage parameters
ESP32 ADC/DAC
Related Links
Datasheet: ESP32, IP5306
API: Arduino API
PCBA: pcba.pdf
M5 Stack Atom
Description
ATOM Matrix, which has a size of only 24 * 24mm, is the most compact
development board in the M5Stack development kit series. It provides more GPIO
pins and is very suitable for handy and miniature embedded device development.
The main control adopts the ESP32-PICO-D4 chip, which comes integrated with
Wi-Fi and Bluetooth technologies and has 4MB of integrated SPI flash memory.
The Atom board provides an infrared LED along with the 5 * 5 RGB LED matrix
on the panel, a built-in IMU sensor (MPU6886), and an HY2.0 interface. A
general-purpose programmable button is provided below the RGB LED matrix to
enable users to add input support to their various projects. The on-board USB
interface (Type-C) enables rapid program uploading and execution. One M2
screw hole is provided on the back for mounting the board.
Note: When using FastLED lib, the recommended brightness of RGB LED is 20.
Please do not set it to a high brightness value to avoid damage to the LED and
acrylic screen. (In ATOM lib, we have mapped its appropriate brightness range to
0~100)
Product Features
ESP32 PICO-based
Programmable button
5 * 5 RGB LED matrix panel (WS2812C)
Built-in infrared LED
Built-in MPU6886 Inertial Sensor
Extendable Pins & Holes
Program Platform: Arduino, UIFlow
Applications
Internet of things terminal controller
IoT node
Wearable peripherals
Specification
Resources Parameter
Flash: 4MB
MEMS: MPU6886
IR: infrared transmission
Operating Temperature: 32°F to 104°F (0°C to 40°C)
Net weight: 3g
3.4 Motors and Servos
Specification
Specification Parameter
Dimension: 45.2 x 24.7 x 35 mm
Locked-rotor torque: 19.5 kg·cm @ 7.4V
Feedback: load / position / speed / voltage / current / temperature
Electronic protection: overheat / overcurrent / overvoltage / overload protection
Structure Feature
The shell adopts engineering plastic shell with higher strength, which
optimizes the center point alignment distance and makes the overall structure
more compact.
The steering gear adopts a 1:345 copper gear combination, which produces
greater torque.
Under the same torque, it is 5 mm shorter than a standard steering gear.
The body adopts a double-axis structural design with metal main and auxiliary
output wheels and double-wire routing, suitable for quadruped robots, snake
robots, desktop robots, humanoid robots, and mechanical arm applications.
High Precision
Working Modes
Mode 0: Position mode, the default. 360-degree absolute angle control can be
achieved in this mode, and acceleration motion is supported.
Mode 2: Speed open loop. In the programming interface, set the operation
mode to 2 to switch to speed open-loop mode, then enter the corresponding
time under the time bar to run.
One-Key Calibration
Learn Structure and fixation of
myCobot
Common fixation: There are two common ways to fix myCobot:
Make sure that there is a corresponding threaded hole on the fixed base before
installing. Before you officially install, please confirm:
After confirming the above, move the robot to the mounting surface of the base,
adjust the position of the robot, and align the fixing hole of the robot base with the
hole on the mounting surface of the base.
Note: When adjusting the position of the robot on the mounting base, please
avoid pushing the robot directly on the mounting surface of the base to avoid
scratches. When manually moving the robot, please try to avoid applying external
force to the weak part of the robot body to avoid unnecessary damage to the
robot.
The end of the robot arm is compatible with both LEGO connector holes and
threaded screw holes.
myStudio
The design of myStudio
Version 1.2
Supported platforms: Windows, macOS, Linux
Downloading and Loading myStudio
Download myStudio
myStudio: https://github.com/elephantrobotics/myStudio
Note: Do not install it in a directory whose path contains spaces.
Burn Basic
First, connect the BASIC development board via USB. The connection
window of myStudio will display the connected development board; select
it and click "Connect"
Then the Basic-related firmware is listed under Basic and Tools. Select the
firmware you want to burn, and click to burn it
Burn Atom
Burning Atom is the same as burning Basic, with the USB cable connected
at the Atom end
Select ATOM under Board, and the Atom firmware will appear
There is only one Atom firmware; click to burn
Usage of myStudio
https://www.bilibili.com/video/BV1Qr4y1N7B5/
Q&A
Q: Why does the Tools panel get stuck in the sidebar the first time you click on it?
Software Platform & API
Before developing, make sure that the Basic and Atom in your myCobot are using
the latest firmware and suitable for your environment.
Arduino
Suitable for maker development, you can use all kinds of Arduino program
library.
uiFlow
python
ROS&MoveIt
roboFlow
In addition, we are also developing C# and other software interface API for
development.
Arduino
1.2 Requirement
ATOM: Burn the latest version of AtomMain (at least version 2.7)
Basic: None
Open Arduino IDE -> Tools -> Port to check whether a device is present. If the
device is not detected, replace the USB cable, or check whether the driver
has been installed successfully.
2. Start Development
2.1 Burn an Official Demo
Open Arduino IDE, select File -> Examples -> mycobotbasic; then you can
see all examples for myCobot
Burn a demo: SetRGB.ino.
Open SetRGB from Examples
Click to upload
Wait until the lower right shows upload success, which means that the
application has been burned
Then you will see the Atom screen cycle through red, green, and blue.
Arduino API
1. Overall Status
powerOn();
powerOff();
isPoweredOn();
setFreeMove();
Function: Read all joint angles. When used, an Angles variable should be
defined to receive the data that is read. Angles is a type defined by the
library: a container holding 6 angle values, used in the same way as an
array.
Return Value: an array of type Angles
Function: Synchronize joint angles, sending all joint angles at the same time.
The specified Angles is a container with a capacity of 6 values and can be
viewed as an array; use a for loop to assign values, or assign them
separately. Angles[0] ... Angles[5] = specified angles, each ranging from
0 - 90 (the value range should be the same as for writeAngle), unit °
Movement speed = speed, range from 0 - 100
Return Value: none
getCoords();
Function: Read x, y, z, rx, ry, rz of the current end of the robot arm, to test
whether a specified point has been reached. When used, a Coords variable
(e.g. tempcoords) should be defined to receive the read coordinates. Coords
is a type defined by the library: a container holding 6 values, used in the
same way as an array.
Return Value: an array of type Coords; a variable of the Coords type needs
to be defined
Coordinate value of the moving path = value, range from -300 – 300 (for
axis=Axis::X, axis=Axis::Y, and axis=Axis::Z the position coordinates are X, Y,
and Z respectively, in mm; the position coordinate value range is not uniform.
For axis=Axis::RX, axis=Axis::RY, and axis=Axis::RZ the values are RX, RY, and
RZ respectively, ranging from -180° to 180°. If the value is beyond the range, the
message "inverse kinematics no solution" is returned.)
Specified speed = speed, range from 0~100, unit %
checkRunning();
Function: Set a single joint to rotate to a specified potential value
Parameter Specification:
Joint Number = joint, range from 1-6
Potential value of servo motor = encoder, range from 0-4096 ( the range
should be positively related to the range of each joint )
Return Value: none
getEncoder(int joint);
Function: Set the six joints to run synchronously to the specified positions
Parameter Specification: A variable of type Angles needs to be defined, e.g.
angleEncoders; it is used in the same way as an array. Assign values to the
array angleEncoders, ranging from 0 to 4096 ( the range should be
positively related to the range of each joint ); the length of the array is 6.
Specified speed = speed, range from 0~100, unit %
Return Value: none
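To get a feel for the encoder scale, here is a hypothetical helper mapping potential values to joint angles. It assumes 4096 ticks span one full 360° turn with 2048 as the zero position (the calibration value used by setServoCalibration); the real per-joint mapping may differ, so treat this as a sketch only.

```python
ENCODER_ZERO = 2048   # assumed zero position (calibration value)
TICKS_PER_REV = 4096  # assumed ticks per full 360-degree turn

def encoder_to_angle(encoder):
    """Convert a raw servo potential value (0-4096) to a joint angle
    in degrees, under the assumptions stated above."""
    return (encoder - ENCODER_ZERO) * 360.0 / TICKS_PER_REV

def angle_to_encoder(angle):
    """Inverse mapping, clamped to the legal encoder range."""
    encoder = ENCODER_ZERO + angle * TICKS_PER_REV / 360.0
    return max(0, min(TICKS_PER_REV, int(round(encoder))))
```

Under these assumptions, an encoder value of 2048 corresponds to 0° and each tick is about 0.088°.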
3. JOG Mode
jogAngle(int joint, int direction, int speed);
jogStop();
pause();
resume();
Function: Program continues to run
Return Value: none
stop();
setSpeed(int percentage);
getJointMin(int joint);
getJointMax(int joint);
Function: Set the joint maximum limit angle
Parameter Specification: joint/servo motor number = joint, range from 1-6
Return Value: none
isAllServoEnabled();
setServoCalibration(int joint);
Function: Calibrate the current position of the joint as the zero angle; the
corresponding potential value is 2048
Parameter Specification: joint number = joint, range from 1 – 6
jointBrake();
6. Atom IO Control
setLEDRGB(byte r, byte g, byte b);
setGripper(int data);
X, Y, Z range from -300 to 300.00 ( the value range is not strictly defined; if
the value is beyond the range, the message "inverse kinematics no solution"
will be given ), unit mm
RX, RY, RZ range from -180 to 180
Return Value: none
setWorldReference(Coords coords);
getToolReference();
getWorldReference();
setReferenceFrame(RFType rftype);
Function: Set the reference frame type, e.g. using the base coordinate
system as the reference
Return Value: none
getReferenceFrame();
setEndType(EndType end_type);
getEndType();
setGripperValue();
Function: Set the electric potential of the gripper. Get the current electric
potential of the gripper before use
Parameter Specification: Input Value ( 0-4095 )
Return Value: none
setGripperIni();
getGripperValue();
setGripperState();
setDigitalOutput();
getDigitalInput();
Function: Set the PWM signal output on the specified ATOM IO pin with a
given duty ratio
Parameter description: pin_no: IO serial number; freq: clock frequency;
pin_write: duty ratio, 0 ~ 256 (128 indicates 50%)
Return Value: none
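The ratio-to-duty-cycle relationship can be expressed directly. This small helper only illustrates the 0~256 scale described above; it is not a function from the myCobot library.

```python
def duty_percent(ratio):
    """Convert a 0-256 pin_write ratio to a duty-cycle percentage;
    128 maps to 50%, as noted in the parameter description."""
    if not 0 <= ratio <= 256:
        raise ValueError("ratio must be in 0-256")
    return ratio / 256.0 * 100.0
```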
Test Program
Functional Specification
Burn the connect-test firmware, which can detect the connection of the device,
and the Basic screen will show the linked device
Requirement
Use MyStudio Basic -> connect-test
Flow Chart
Burn AtomMain for Atom
Burn connect-test for Basic
Press Button C:
If there is any problem:
5.2 UIFlow
Visit the web URL of uiFlow: https://flow.m5stack.com/
Make the Atom screen green and adjust the angles of the six joints as shown in
the figure
5.3 Python
Programming in Python Environment
Pre-preparation:
Make sure the Atom at the top is burned with the Atom firmware and the
BASIC at the bottom is burned with Transponder.
Pip
Notes:
Only Atom 2.6 and later versions are currently supported; if you are using an
earlier version, please install pymycobot 1.0.7.
Download Path:https://github.com/elephantrobotics/pymycobot
Then, put the pymycobot package in your project so that you can import and use it.
pymycobot API
Class:
MyCobot
Overall status
MDI mode and operation
JOG mode and operation
Running status and Settings
Servo control
Atom IO
Angle
Coord
MyCobot
from pymycobot.mycobot import MyCobot
Overall status
power_on()
Description: Robot arm power up.
power_off()
Description: Robot arm power down.
is_power_on()
Description: Check whether the robot arm is powered on.
Returns
1 : power on
0 : power off
-1 : error
set_free_mode()
Description: Set robot arm into free moving mode.
MDI mode and operation
get_angles()
Description: Get the degree of all joints.
send_angle()
Description: Send one degree of joint to robot arm.
Parameters
mycobot = MyCobot('/dev/ttyUSB0')
mycobot.send_angle(Angle.J2.value, 10, 50)
send_angles()
Description: Send the degrees of all joints to robot arm.
Parameters
speed :( int )
Example
mycobot = MyCobot('/dev/ttyUSB0')
mycobot.send_angles([0,0,0,0,0,0], 80)
get_radians()
Description: Get the radians of all joints.
send_radians()
Description: Send the radians of all joints to the robot arm.
Parameters:
degrees : a list of radian values ( List[float] )
speed :( int ) 0 ~ 100
Example
mycobot = MyCobot('/dev/ttyUSB0')
mycobot.send_angles_by_radian([1,1,1,1,1,1], 70)
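Since the angle-based methods work in degrees while the radian-based methods work in radians, a small conversion helper is handy when switching between them. This is plain Python, independent of the pymycobot library.

```python
import math

def degrees_to_radians(degrees):
    """Convert a 6-joint angle list from degrees to radians,
    e.g. for feeding a radian-based send call."""
    return [math.radians(d) for d in degrees]

def radians_to_degrees(radians):
    """Inverse conversion, e.g. for displaying radian readings
    in degrees."""
    return [math.degrees(r) for r in radians]
```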
get_coords()
Description: Get the coords from the robot arm; the coordinate system is
based on the base.
send_coord()
Description: Send one coord to robot arm.
Parameters
mycobot = MyCobot('/dev/ttyUSB0')
mycobot.send_coord(Coord.X.value, -40, 70)
send_coords()
Description: Send all coords to robot arm.
Parameters
mycobot = MyCobot('/dev/ttyUSB0')
mycobot.send_coords([160, 160, 160, 0, 0, 0], 70, 0)
sync_send_angles()
Description: Send the angle in synchronous state and return when the target
point is reached
Parameters
sync_send_coords()
Description: Send the coord in synchronous state and return when the target
point is reached
Parameters
pause()
Description: Pause movement.
resume()
Description: Resume movement.
stop()
Description: Stop moving.
is_paused()
Description: Judge whether the manipulator pauses or not.
Returns :
1 - paused
0 - not paused
-1 - error
is_in_position()
Description: Judge whether the arm is in the given position.
Parameters
1 - true
0 - false
-1 - error
Parameters
jog_coord()
Description: Jog control coord.
Parameters
jog_stop()
Description: Stop jog moving.
set_speed()
Description: Set speed.
get_joint_min_angle()
Description: Gets the minimum movement angle of the specified joint
get_joint_max_angle()
Description: Gets the maximum movement angle of the specified joint
Servo control
is_servo_enable()
Description: Determine whether the specified steering gear is connected
Returns
0 : disable
1 : enable
-1 : error
is_all_servo_enable()
Description: Determine whether all steering gears are connected
Returns
0 : disable
1 : enable
-1 : error
release_servo()
Description: Power off designated servo
focus_servo()
Description: Power on designated servo
Atom IO
set_color()
Description: Set the color of the light on the top of the robot arm.
Parameters
r : 0 ~ 255
g : 0 ~ 255
b : 0 ~ 255
set_pin_mode()
Parameters
pin_no (int):
pin_mode (int): 0 - input, 1 - output, 2 - input_pullup
set_digital_output()
Parameters
pin_no (int):
pin_signal (int): 0 / 1
get_digital_input()
Parameters: pin_no (int)
get_gripper_value()
Description: Get gripper value
set_gripper_state()
Description: Set gripper switch state
Parameters
set_gripper_value()
Description: Set gripper value
Parameters
set_gripper_ini()
Description: Initialize the gripper: set the current position as zero; the
corresponding value is 2048 .
is_gripper_moving()
Description: Judge whether the gripper is moving or not
Returns
0 : not moving
1 : is moving
-1 : error data
Angle
from pymycobot.genre import Angle
Description
Instance class of joint. It's recommended to use this class to select joint.
Coord
from pymycobot.genre import Coord
Description
Instance class of coord. It's recommended to use this class to select coord.
ROS Introduction
ROS is an abbreviation of Robot Operating System. ROS is a highly flexible
software architecture for writing robot software programs.
ROS icon :
ROS's primary design goal is to increase code reuse in the field of robotics
development. ROS is a framework of distributed processes (or "nodes") that are
encapsulated in packages and feature packs that are easy to share and publish.
ROS also supports a federated system similar to a code repository, which
enables engineering collaboration and release. This design allows an
implementation to make completely independent decisions (without ROS
restrictions), from the file system to the user interface, while all projects can
still be integrated with the basic tools of ROS.
MoveIt Introduction
MoveIt! is the most advanced software for robotic arm motion operations and
is used on more than 100 robots. It combines the latest results in motion
planning, manipulation, 3D perception, kinematics, control, and navigation;
provides an easy-to-use platform for developing advanced robotics applications;
and provides an integrated software platform for the design and evaluation of
new robot products in industrial, commercial, and research and development fields.
Moveit icon :
Basic development environments require the installation of robotic operating
systems ROS, MoveIt, and git version managers, which are described below.
ROS Installation
1. Choose a version
ROS versions have a one-to-one relationship with Ubuntu versions: different
versions of Ubuntu correspond to different versions of ROS, as listed
at http://wiki.ros.org/Distributions
If the versions do not match, the download will fail. The system we selected
here is Ubuntu 16.04, corresponding to ROS version Kinetic Kame.
2. Begin Installation
2.1 Add Source
There are no ROS software sources in Ubuntu's own software source list, so
you need to add the ROS software source to the software list repository first;
then you can download ROS. Open a console terminal (Ctrl+Alt+T) and enter
the following instructions:
The results are as follows (the user password is required here; just enter the
user password set when Ubuntu was installed):
2.3 Installation
After adding the new software source, update the list of software sources:
Then install ROS:
rosdep update
Bash
Zsh
3. Verify the Installation
Starting the ROS system requires the ROS Master (the node manager), which
we can start by entering the roscore instruction at the terminal. To verify that
ROS was installed successfully, execute the following command at the
terminal:
roscore
For more detailed installation instructions, you can refer to the official
installation instructions website: http://wiki.ros.org/ROS/Installation
MoveIt Installation
MoveIt is the ROS component for a series of manipulation operations,
including motion planning, collision detection, kinematics, 3D perception,
operation control, and other functions.
2. Install MoveIt
Enter the following command in the terminal window to perform the installation of
MoveIt:
sudo apt-get install ros-kinetic-moveit
git installation
3. Install git
Enter the following command in the terminal window to perform the installation of
git:
To read the git version, enter the following command in the terminal window:
git --version
If the installation succeeded, the git version number is displayed in the
terminal, as follows.
Introduction
mycobot_ros is from Elephant Robotics and fits its desktop six-axis robotic arm
myCobot.
Installation
NOTE :
This package relies on ROS and MoveIt. Ensure a successful installation of
ROS and MoveIt before use.
The interaction of this package with the real robot arm depends on the
Python API - pymycobot
The installation relies on Git; make sure Git is installed on your
computer.
The following text will refer to the ROS workspace path on your computer as
<ros-workspace> ; make sure to replace <ros-workspace> with the real path
on your machine when you execute the following commands.
cd <ros-workspace>/src
git clone https://github.com/elephantrobotics/mycobot_ros.git
cd ..
catkin_make
source <ros-workspace>/devel/setup.bash
Slider control
Open a command line and run:
It will open rviz and a slider assembly, and you will see the following:
You can then control the movement of the model in the rviz by dragging the slider.
If you want the real mycobot to move with you, you need to open another
command line and run:
It publishes the angles to mycobot in real time. The script supports setting the
port number and baud rate, with defaults of "/dev/ttyUSB0" and 115200.
Model follow
In addition to the controls above, we can also allow the model to follow the real
robot arm movement. Open a command line to run:
It will publish the angles of the real robotic arm. The script supports setting the
port number and baud rate, which default to "/dev/ttyUSB0" and 115200.
It opens rviz to display the model-following effect.
Keyboard control
Keyboard control was added to the mycobot_ros package and is synchronized in
real time in rviz. This function relies on the Python API, so make sure the real
robot arm is connected.
Since communication with the real robotic arm is needed, the launch file
supports setting the port number and baud rate, which default to "/dev/ttyUSB0"
and 115200.
SUMMARY
========
PARAMETERS
* /mycobot_services/baud: 115200
* /mycobot_services/port: /dev/ttyUSB0
* /robot_description: <?xml version="1....
* /rosdistro: kinetic
* /rosversion: 1.12.17
NODES
/
mycobot_services (mycobot_ros/mycobot_services.py)
real_listener (mycobot_ros/listen_real.py)
robot_state_publisher (robot_state_publisher/state_publisher)
rviz (rviz/rviz)
MyCobot Status
--------------------------------
Joint Limit:
joint 1: -170 ~ +170
joint 2: -170 ~ +170
joint 3: -170 ~ +170
joint 4: -170 ~ +170
joint 5: -170 ~ +170
joint 6: -180 ~ +180
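The joint limits above can be enforced client-side before commands are sent. The following clamp helper is a defensive sketch built from the printed limits; it is not part of the mycobot_ros package.

```python
# Joint limits in degrees, copied from the status printout above.
JOINT_LIMITS = {1: (-170, 170), 2: (-170, 170), 3: (-170, 170),
                4: (-170, 170), 5: (-170, 170), 6: (-180, 180)}

def clamp_angle(joint, angle):
    """Clamp a requested angle to the legal range of the given joint
    (1-6) before sending it to the arm."""
    low, high = JOINT_LIMITS[joint]
    return max(low, min(high, angle))
```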
Mycobot Teleop Keyboard Controller
---------------------------
Moving options (control coordinates [x,y,z,rx,ry,rz]):
w(x+)
z(z-) x(z+)
Gripper control:
g - open
h - close
Other:
1 - Go to init pose
2 - Go to home pose
3 - Resave home pose
q - Quit
MoveIt
mycobot_ros is now integrated with MoveIt.
If you need to synchronize the plan with the real robot arm, you need to open
another command line and run:
The script supports setting the port number and baud rate, which default
to "/dev/ttyUSB0" and 115200.
Please look forward to more...
RoboFlow
Download RoboFlow
Since the RoboFlow operating system runs in the teach pendant, the user can
use the teach pendant as a carrier to perform manual robot operation,
programming, and other operations. The operating system can also be used
to communicate with other robots or devices. All in all, with the advantages of
a friendly interface and rich functions, the RoboFlow operating system makes
it easier for users to start using the Elephant Robotics robots. It makes
everyone a commander of robots.
Figure 2-1 Login interface
There are two types of login users for the OS3 operating system, one is the
administrator and the other is the operator. The administrator has the highest
authority to perform all operations, programming and setup. The operator can only
load and run existing programs and check the statistical data information.
Administrators can add and modify multiple accounts in the settings, including
operator accounts.
By clicking on the "Shutdown" button, the OS3 operating system can be turned
off, and then the power supply can be turned off, thus the robot system can be
shut down.
2.2 Main Menu
When the login is successful, it will go to the main menu page. The main menu of
the OS3 operating system is shown in Figure 2-2.
On the left side of the main menu, there are four different options available:
Run Program
Load an existing program directly and control the program to run. In this
window, the user is not allowed to edit the program, but can only control
the program running (such as control program running, pausing,
stopping). At the same time, you can view the log and other related
information during the running process of the program.
Edit Program
Users can choose to load an existing program in this window for
modification, or create a new blank program for editing. This window is the
most frequently used function window for users. Besides programming, it can
also perform other operations, such as manual manipulation of the robot with
the "fast moving" function, forced control of IO signals, creating new
variables, etc.
Statistics
In this window, users can not only view the existing running data of the
system, but also view related information saved before.
Settings
In this window, the user can make basic settings for the robot, such as
robot power-on, robot power-off, account management, default program
settings, etc.
In addition to these four main options, in the right window of the main menu, the
user can see and open the most recently run program files. It is convenient for
users to quickly find the most recently run program and control the program to
run.
Click the "Shutdown" button to close the OS3 operating system; click the "Logout"
button to log out.
2.3 Run Program
Users can enter the program window by loading the program they need to run. In
this window, users can:
Get the basic information of the current (ready) program, including its
name, running status, and user type.
Read the relevant information of the currently running program through the
display window, such as IO, variables, logs, etc.
Most importantly, the running program window is the channel through which
the user loads and runs a program that has already been debugged.
2.4 Edit Program
As shown in Figure 2-4, if the user selects "Write Program" in the main menu, two
options will appear in the right window. The first is to create a program (optional
blank or template) and the second is to load the program.
Figure 2-5 Program Editing Interface
When first entering the program edit page, the user sees the initial page as shown
in figure 2-5. This page provides common tools, an initialization group, and file
management functions. The initialization group makes it easy for the user to set
program content that runs once at the beginning of the program, for example
setting the initial point and IO state before the robot starts formal work. File
management provides users with a way to manage files.
Users can manage program files here, and can copy them to the U disk, or from
the U disk to the system memory. If the user wants to go back to the initial page
during the programming process, click "Back".
Function bar: As shown in Figure 2-6, the function bar has seven sub-options,
divided into two categories: the program editing toolbar and the function editing
column.
Figure 2-6 Function bar
Program editing toolbar: Includes the file option bar, edit option bar, and tool
option bar. File: As shown in Figure 2-7, you can edit the program file. There are
several operation options: Save, Save As, New, Load, Rename, and Exit.
Edit: As shown in Figure 2-8, you can edit the specific command content in the
program file. There are Cut, Copy, Paste, Delete, Disable, Delete All, Redo, and
Undo options.
Tool option bar: As shown in Figure 2-9, it is a shortcut toolbar. When editing
a robot program, the user often needs other tools to operate the robot, and the
tool option bar provides the tools commonly used in program editing:
Quickmove, Install, Input and Output, Variables, Logs, and Basic Settings. For
example, when editing a motion command, the user needs to manually operate
the robot to a working position and teach the point; the "Quickmove" tool in
the toolbar can be selected to manually move the robot to that position.
Functional Editing Window: The OS3 operating system provides a rich set of
features that allow users to perform complex functions with simple operations,
reducing the time workers spend learning programming and helping them
accomplish their goals efficiently. The function editing bar includes basic
functions, logic functions, advanced functions, and extended functions.
Basic functions: As shown in Figure 2-10, the basic functions include Waypoint,
Gripper, Wait, Set, and Group, the basic functions most commonly used by
users.
Waypoint: “Create new waypoints → Manually operate the robot to move the
robot to the target point → Save current point → Running program”. With this
series of operations, the user completes the goal of controlling the movement
of the robot to the target point. If you create multiple waypoints, the motion of
the robot will form a trajectory when you run the program.
Gripper: The user can use this function to set the end effector. For example,
it holds the workpiece or releases the workpiece.
Wait: Users can use this function to delay, or wait for signals, conditions, and
so on.
Set: Users can use this function to set the input and output signals and
custom conditions.
Group: Users can use this function to edit the programs in the group.
Logic functions: As shown in Figure 2-11, the logic functions include Loop,
If/Else, Subprogram, Thread, Halt, and Switch, which control the program's
running flow.
Loop: The user can use this function to set a block to run cyclically multiple
times.
If/Else: The user can use this function to make conditional judgments, such
as the determination of an input signal.
Subprogram: The user can use this function to call a subroutine.
Thread: Users can use this function to achieve robot multi-thread control.
Halt: The user can use this function to control the program to pause, stop,
restart, and pop up the window to display the corresponding prompt
information.
Switch: The user can use this function to make a condition selection and
determine the content to be executed according to the value of the selected
object.
Advanced function: As shown in Figure 2-12, advanced functions include
Pallet, Assign to Var, Script, Popup, and Sender, all of which perform more
complex operations.
Pallet: Users can use this function to make the robot perform regular
point-to-point movements, for example handling workpieces in pallets,
palletizing, etc. It can also make the robot move through a fixed but
irregular set of points in sequence.
Assign to Var: Users can use this function to implement the assignment of a
variable.
Script: With the scripting feature, users can use the other common functions
to achieve simple tasks while using the elephant robot, and can also use
script programming to complete more complex tasks.
Popup: Users can use this function to customize the pop-up window to
display related information. This helps the operator to analyze the status of
the current robot running program.
Sender: Users can use this function to achieve TCP/IP communication
between the elephant robot and other devices.
Extended function: To adapt to different application scenarios, the OS3
operating system provides some extension functions, and even customizes
functions according to important application scenarios proposed by users.
Program Display Window On the left side of the program editing page, there is a
program display window as shown in Figure 2-14. The upper part is the name of
the currently open program file, and the lower part is the program tree, which
records the specific instructions and related information.
Figure 2-14 Program Display Window
On the right side of the program editing page, there is a function editing window
as shown in Figure 2-15, which shows the specific contents of the function
instructions.
The user can make specific settings for the function instructions in this window.
Quick control and current command renaming, deletion, and disabling are also
provided here.
At the bottom of the program editing page, there is a program running control bar
as shown in Figure 2-16. When debugging a program, users can use it to run,
pause, stop and limit the running speed of the program.
2.5 Statistics
When users use elephant robots, they can not only program and control the robot
to complete the corresponding tasks, but also get some valuable statistical data in
the statistical report window for analysis and statistics.
As shown in Figure 2-17, the general class counts the total running time, the
number of active programs, and the specific information of active programs.
Figure 2-17 Conventional statistics
As shown in Figure 2-18, the program class counts the total running time and
times of different programs.
As shown in Figure 2-19, the log lists the general information, warning information
and error information recorded by the system during the user's use of OS3
operating system. This information helps users to determine what changes and
feedback the system has made during the operation of the OS3 operating system.
In particular, error information can help users quickly locate the possible causes of
errors, so as to solve problems according to error information and resume normal
use.
Figure 2-19 Log statistics
As shown in Figure 2-20, security statistics can help users to count security-
related information, such as collision information, number of stops, etc.
2.6 Settings
In the configuration center, users can configure the robot: power the robot on or
off, set the load, time, network, and so on.
Initialization: When robot movement is required, the user needs to enter the
configuration center to initialize the robot (or shut it down). In the initialization
page, you can also set the load and installation; these two are important
configuration items to check before other operations, as configuration errors
may cause unexpected situations.
Figure 2-21 Initialization
Default program
This function allows the user to set a default running program. As long as the
system starts, the robot directly enters the running program window, and can start
running the program and perform corresponding actions to complete the specified
task.
If the user does not want the default program to start running when the system
starts, they can choose not to run it.
Version update: Figure 2-23 shows the version update settings page.
This page allows users to update the OS3 operating system in two ways: local
file update and network update.
Account management: Figure 2-24 shows the account management page.
Users can add new users, delete expired users, or change passwords on this
page, and can see all the account information here.
Language and unit: The language and unit settings page is shown in Figure
2-25. At present, the OS3 operating system supports Chinese and English and
metric units. Other languages and units are being added, so stay tuned!
Figure 2-26 Time
The user can set the system time on this page. If the "24-hour system" is not
checked, the time display format defaults to the 12-hour system.
Touch screen calibration
Figure 2-27 shows the touch screen calibration instructions. The user clicks
"Start to Calibrate" to enter the calibration interface, where four circles will
appear in sequence, as shown in the figure. The user needs to click the center
of each circle with a touch pen; each click makes the next circle appear, until
all four circles have been clicked. A pop-up window will then indicate that the
calibration is complete, and you can exit the calibration screen after confirming
the pop-up.
If the calibration times out or the steps are wrong, a pop-up prompts the
calibration failure. At this point, you can confirm to exit the calibration interface
and return to the page in Figure 2-27 to recalibrate.
Figure 2-28 About us
This page shows basic information about the OS3 operating system, for
example the robot model (Elephant series), version information, and so on.
For more information, please visit the official website
https://www.elephantrobotics.cn.
3.1 Quickmove
As shown in Figure 3-1, Quickmove is mainly composed of 11 parts, which are
described below.
Figure 3-1 Quickmove
pointed to the positive direction of the z-axis. These three axes form a spatial
Cartesian coordinate system, and point O is called the coordinate origin.
There are three planes in the three-dimensional Cartesian coordinate system:
the XY-plane, YZ-plane, and XZ-plane. These three planes divide the three-
dimensional space into eight parts, called octants. The three coordinates of
each point in the first octant are positive values.
As shown in Figure 3-3, the robot can be controlled to move in the direction of the
Cartesian coordinate system by clicking the key corresponding to the direction of
the Cartesian coordinate system.
3D View This window marks the direction of movement of the six joints of the
robot.
There are two main motion modes for manual manipulation robots.
Continuous motion mode: The user presses a motion control button and the
robot moves until the user releases the button, at which point the robot
stops. For example, to move in the +X direction, press and hold the +X
motion control button; how long the button is held determines how far the
robot moves in the +X direction.
Stepping motion mode: To move the robot step by step, click "step motion"
to open the step setting window shown in Figure 3-4. The user then chooses
the step size in this window and clicks the key of the target control
direction; each click makes the robot take one step. For example, choose a
1 mm step and click the +X direction movement control button; every time
you click the button, the robot will move 1 mm in the +X direction.
Speed: As shown in Figure 3-5, the control speed of manual manipulation can
be set here, from 0 to 100%.
Move to origin: By selecting the icon shown in Figure 3-6, the robot can be
controlled to return to its original position and posture.
Figure 3-6 Move to origin
Freemove: Select the icon shown in Figure 3-7 to switch to the drag mode.
Figure 3-7 Freemove
Return: Click the icon shown in Figure 3-8 to return to the programming
interface.
Joint control: A serial robot is an open kinematic chain, formed by a series of
links connected in series by rotating or moving joints. The elephant cooperative
robot is a 6-axis serial robot: motor drivers drive the movement of its 6 joints,
which drives the relative motion of the links and allows the end effector to
reach the desired posture. The joint control window shown in Figure 3-9
provides the keys the operator uses to manually manipulate the robot and
control joint movement from the teach pendant. The control buttons for each
joint cover 2 directions, and the angle data of each axis can be seen.
Status display button: The button has two states, "OK" (displayed green) and
"Reset" (displayed red). "OK" indicates that the robot is working properly;
"Reset" indicates that the robot is abnormal, in which case the anomaly needs
to be cleared and the button clicked to reset.
3.2 Installation
As shown in Figure 3-11, there are three submenus inside the installation tool. It is
used to implement the loading/saving installation configuration, security
configuration, and network configuration of the elephant robot.
Security configuration: As shown in Figure 3-12, set the torque limit and brake
control of the elephant robot.
Network Configuration: As shown in Figure 3-13, configure the IP address and
port number of the Ethernet communication here.
3.3 Input and Output
Figure 3-15 Input and output interface description
It should be noted that the input common terminal needs to be connected to a
24V power supply. Whether an input is active high or active low is determined
by the common terminal configuration (the hardware connection determines
24V or 0V). As shown in Figure 3-16, when the common terminal is connected
to 24V, once an external device inputs 0V the input signal is in the "High"
state, otherwise it is in the "Low" state; and vice versa.
As shown in Figure 3-17, the output is 24V when there is no output. Once the
output is turned on (that is, the output is High), the output is 0V.
Figure 3-17 Output signal application diagram
3.4 Variable
As shown in Figure 3-18, in the variable editing window, you can add, edit, and
delete variables.
As shown in Figure 3-19, there are 5 types of editable variable types. They are
string variables, pose variables, floating point variables, integer variables, and
Boolean variables. On this page, you can edit the variable name and initial value.
3.5 Log
As shown in Figure 3-20, you can view information about the robot running status,
error information, and alarm information in the running log window. Click the
"Information", "Warning" and "Error" buttons to sort the corresponding logs.
Users can save logs to a local folder. Log files are a record of how the system is
performing, helping users to have a clearer understanding of the system and also
helping to troubleshoot errors.
3.6 Basic Settings
As shown in Figure 3-21, the basic settings page provides a common settings
channel, allowing the user to quickly set up some functions, such as parameters
related to free movement, without leaving the programming window while
writing a program.
4 Function instruction
4.1 Basic function
4.1.1 Waypoint
There are four types of waypoints: absolute points, relative points, shared
points, and variables. These four types are parallel options; under one waypoint
command, only one can be chosen.
Absolute point: The absolute point is a description of the actual pose of the
robot. That is, once the robot records an absolute point, the next time the
instruction is executed the robot will reproduce the originally taught pose of
that absolute point, regardless of the robot's current position (other settings
unchanged).
The specific configuration page for absolute points is shown in Figure 4-1.
Figure 4-1 Absolute point
As shown in Figure 4-2, there are two formats for representing absolute points:
Cartesian coordinate values and joint angles. The Cartesian coordinate values
record the position and attitude of the robot TCP relative to the base
coordinate system (in mm). The joint angles are a direct record of the actual
angle of each axis (in degrees).
Waypoint control
Save current point: This button is used to save the current pose data of the
robot.
Move to this point: If you need to verify the teaching point or move to it for
some operation, press and hold this button until the robot moves to the
current teaching point.
Clear Saved Points: If the current teaching point is no longer needed, this
button is used to clear it.
Advanced Features
Shared configuration: This feature is being debugged, so stay tuned!
Advanced configuration: As shown in Figure 4-3, in the advanced configuration
page the user can set the movement mode, proximity mode, command speed,
and torque limit.
Figure 4-3 Advanced configuration
Figure 4-4 Relative point
Direct input (relative movement): As shown in Figure 4-5, you can directly
enter the coordinate values / joint angles.
Whether coordinate values or joint angles are input, one or more of the six
values are selected according to the offset requirement; not every value needs
to be input.
For example, as shown in Figure 4-6, in an actual pick-and-place process it is
necessary to set a transition point above the target placement position. To do
this, set a waypoint command as an absolute point, control the robot (which at
this time should be holding the workpiece) to move to the placement point,
and click "Save current point"; this generates command line 2 shown in Figure
4-6. Then click Basic function - Waypoint, select the relative point, and set a
+50 mm offset in the z direction; after running the previous instruction, the
robot will move to the transition point. In a real pick-and-place process, other
instructions may be added between the two, such as a Set instruction to open
the gripper.
Figure 4-6 Application examples of direct input coordinate values
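The arithmetic behind the transition point is a simple element-wise offset. The following is a toy sketch, not OS3 code; the pose values and the helper name are made up for illustration:

```python
# Toy sketch of the transition point: a relative point adds a +50 mm z
# offset to the absolute placement pose saved in the previous waypoint.

def apply_offset(pose, offset):
    """pose/offset: [x, y, z, rx, ry, rz]; return the element-wise sum."""
    return [p + o for p, o in zip(pose, offset)]

place_point = [250.0, 80.0, 120.0, 180.0, 0.0, 0.0]  # taught absolute point
transition = apply_offset(place_point, [0.0, 0.0, 50.0, 0.0, 0.0, 0.0])
print(transition[2])  # 170.0 -> 50 mm above the placement height
```

Only the z component changes, which is exactly why only one of the six offset values needs to be entered.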
In addition to offset based on the position of the last motion instruction, relative
point instruction can also be offset based on a Waypoint or Variable point.
The "Move to this point" button verifies the offset motion, and "Clear Saved
Points" clears the currently entered content.
Shared point: The shared point can use the location of other waypoints. Figure
4-7 shows the specific configuration page of the shared point.
Shared point: Select the point you want to share in the box; you can press and
hold the "Move to this point" button to move the robot to that point, or click
"Clear Saved Points" to clear the current shared point.
Variable: The waypoint can be assigned by a variable. The user can use the
communication method to obtain the waypoint location from other devices.
Figure 4-8 shows the specific configuration page of the variable point.
Variable assignment: The user can select the associated pose variable, and
"Move to this point" can check whether the pose is the target pose.
Figure 4-8 Variable
4.1.2 Gripper
Figure 4-9 shows the specific configuration page of the gripper.
Figure 4-9 Gripper
The user defines and controls the gripper through a simple function.
1 Select a gripper.
2 Set existing grippers.
3 Select a gripper to edit or delete it.
4 Define new grippers: As shown in Figure 4-10, you can name the gripper and
control multiple signals at the same time: set the number of output signals to
be controlled, select which signal to configure under "Settings", set its state
(this determines the corresponding "open" or "close" behavior during
execution), and set the corresponding output signal. After the setup is
complete, you can also select a wait condition.
Figure 4-10 Define new grippers
Fully open: Executes the option defined as the "open" state in the gripper
definition.
Fully closed: Executes the option defined as the "close" state in the gripper
definition.
Debug control
Open the gripper: Manually performs the "open"-state option in the gripper
definition.
Close the gripper: Manually performs the "close"-state option in the gripper
definition.
4.1.3 Wait
As shown in Figure 4-11, the wait instruction has four modes.
Figure 4-11 Wait
4.1.4 Set
As shown in Figure 4-12, the set instruction has four modes of selection.
Set IO: Set the state of an output signal. In addition to selecting the output
signal and whether it is on or off, you can also set how long the signal is
held.
Set conditions: Customize the content of the settings.
Set TCP (i.e. tool center point).
Set the load.
Figure 4-12 Set
4.1.5 Group
Figure 4-13 Group
When users use group instruction, such as grabbing and placing combinations,
they can modify parameters and teach Waypoints directly on the basis of template
programs, or they can add or delete instructions freely according to their needs.
The user can simplify the process of finding instructions by using the group
instruction. And it is more convenient and quicker to complete the programming of
the corresponding project.
4.2 Logic function
4.2.1 Loop
Figure 4-14 Loop
4.2.2 If/Else
Judging the set conditions allows the program to read data and decide what to
do next.
If/Else can be used to judge an I/O signal and can also be used to judge other
conditions.
If/Else consists of three parts: If, Else If, and Else. The relationship between
these three parts is as follows:
If is mandatory; the other two parts are optional.
If If, Else If, and Else all exist, the program reads If first, then each Else If in
turn, then Else. The relationship between the three is shown in Figure 4-15.
There can be more than one Else If, but only one If; if you choose to add
Else, there can only be one Else.
You can delete an Else If or the Else, but deleting If deletes all Else If and
Else parts as well.
Figure 4-16 shows the setting page of the conditional judgment command.
Figure 4-16 If/Else
As shown in the figure above, if the condition after "If" is met, the robot will
move to waypoint 1; if the condition after "Else if" is met, it will move to
waypoint 2; if neither condition is met, the "Else" block will be executed, that
is, the robot will move to waypoint 3.
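The If / Else If / Else relationship just described maps directly onto ordinary branching. The sketch below is plain Python, not OS3 code; the function and signal arguments are illustrative names:

```python
# Mirror of Figure 4-16: If -> waypoint 1, Else If -> waypoint 2,
# Else -> waypoint 3. The booleans stand in for I/O signal readings.

def choose_waypoint(input0: bool, input1: bool) -> str:
    if input0:        # "If" part (mandatory, exactly one)
        return "waypoint 1"
    elif input1:      # "Else If" part (optional, may repeat)
        return "waypoint 2"
    else:             # "Else" part (optional, at most one)
        return "waypoint 3"

print(choose_waypoint(True, False))   # waypoint 1
print(choose_waypoint(False, True))   # waypoint 2
print(choose_waypoint(False, False))  # waypoint 3
```

Note that when both conditions hold, only the If branch runs, matching the top-down reading order described above.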
4.2.3 Subprogram
Figure 4-17 Subprogram
As shown in Figure 4-18, you can view and edit subroutines in the main program.
If you edit the subroutine, please note that it will not take effect until it is saved.
4.2.4 Thread
The thread runs alongside the main program. It is used to check signals such
as emergency buttons or safety light curtains. As shown in Figure 4-19, you
can set the running interval of threads. Note that motion instructions are not
allowed in threads.
4.2.5 Halt
The halt command is used to control the robot to pause, stop, and resume.
Figure 4-20 shows the specific configuration page for the halt command.
When setting the pause and stop status, you can also select “Show Popup”
to customize the contents displayed by the popup.
Set the restart state: when the program runs to this instruction, it starts
running again from the first instruction.
Figure 4-20 Halt
4.2.6 Switch
As shown in Figure 4-21, the conditional selection instruction is used to judge
the value of a variable.
Figure 4-21 Switch
If only a few values are judged explicitly and all other cases are handled
uniformly, select "default" and add the corresponding instructions to the
switch.
4.3 Advanced function
4.3.1 Pallet
The pallet instruction allows the user to teach only a few points, from which
the robot system calculates the positions of the other points. Running this
instruction controls the robot to move through these points. As shown in
Figure 4-22, you can select a line, plane, cube, or discrete points.
As shown in Figure 4-23, after you select "Line", select the number of points,
and the line will be split evenly based on that number. These points are the
split points. The user determines the line by teaching its two end points.
Figure 4-23 Line
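The even splitting amounts to linear interpolation between the two taught end points. This is a sketch under assumed list-of-coordinates poses, not the OS3 implementation:

```python
# Split the segment between two taught poses into n_points evenly spaced
# points (both end points included), as the "Line" pallet does.

def split_line(start, end, n_points):
    if n_points < 2:
        raise ValueError("need at least the two taught end points")
    step = [(e - s) / (n_points - 1) for s, e in zip(start, end)]
    return [[s + i * d for s, d in zip(start, step)] for i in range(n_points)]

# Two taught points (x, y, z in mm) split into 5 points: x advances 50 mm
# per step while y and z stay fixed.
points = split_line([0.0, 0.0, 100.0], [200.0, 0.0, 100.0], 5)
print([p[0] for p in points])  # [0.0, 50.0, 100.0, 150.0, 200.0]
```

The plane and cube variants extend the same idea to two and three interpolation axes.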
As shown in Figure 4-24, after selecting “Plane”, select the number of points of
the two axes, and the plane is divided equally. These points are the dividing
points. This plane is determined by teaching four points.
Figure 4-24 Plane
As shown in Figure 4-25, after selecting "Cube", select the number of points of the
three axes, and the cube is divided equally. These points are the dividing points.
Determine this cube by teaching eight points.
Figure 4-25 Cube
As shown in Figure 4-26, when "Discrete Point" is selected, select the number
of points and teach each point individually; that is, a discrete point set is a
collection of multiple taught points.
Figure 4-26 Discrete point
4.3.2 Assign to Var
As shown in Figure 4-27, this command assigns values to integer variables and
string variables. You can also use "set variables" to directly set the value of a
variable in the instruction.
4.3.3 Script
Figure 4-28 Script
4.3.4 Popup
The popup command allows the user to customize a pop-up window: when
this command is executed, a pop-up window appears with user-defined
content. As shown in Figure 4-29, there are three types of pop-up windows:
information, warnings, and errors. The user selects one and customizes the
pop-up content.
There are also three kinds of pop-up window control: continue the program
(logging), i.e. do not show the window, only write the pop-up content to the
log while the program continues to run; pause the program when the window
pops up; and stop the program when the window pops up.
Figure 4-29 Popup
4.3.5 Sender
If TCP/IP communication is to be performed, the robot system must set the IP and
port number as a client or server to communicate with other devices.
The sender allows the user to set up a TCP/IP connection. Figure 4-30 shows the
specific configuration page of the Sender instruction.
If the robot system acts as a client, the IP address filled in is that of the
external server device, and the port number is the one assigned to the robot
system by the server. When the server is listening, the robot can communicate
with it by clicking the "connection" button.
If the robot system acts as a server, the IP address filled in is the local IP
address, and the port number is the one assigned to the client device. Click
the "monitor" button; the client device can then connect to the robot system.
In the client list, you can view the IP addresses and port numbers of all
clients.
Figure 4-30 Sender
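The two roles can be sketched with ordinary Python sockets. This is a generic TCP/IP sketch, not the Sender's internals; the addresses, port, and helper names are placeholders, and the OS3 Sender configures the equivalent connection from its own UI:

```python
# Client role: connect to an external server and send data.
# Server role: listen ("monitor"), accept one client, and receive data.
import socket

def run_as_client(server_ip: str, port: int, message: bytes) -> None:
    with socket.create_connection((server_ip, port), timeout=5) as sock:
        sock.sendall(message)

def run_as_server(local_ip: str, port: int) -> bytes:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((local_ip, port))
        srv.listen(1)              # start monitoring for clients
        conn, addr = srv.accept()  # addr would appear in the client list
        with conn:
            return conn.recv(1024)
```

In the server role the external device initiates the connection; in the client role the robot does, which is why each role needs the other side's IP and an agreed port.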
5.1.1 Precondition
Plug the power cord into the power strip that provides AC 220V. Turn on the
power switch, then press the start button on the teach pendant.
5.1.2 Flow chart
Figure 5-1 shows the program editing flowchart.
Figure 5-2 Login interface
Select the login user name "Admin" or another administrator user name (only
administrator accounts are allowed to edit and debug programs), then click the
password box; the window shown in Figure 5-3 will pop up.
Figure 5-4 Main menu
In the main menu interface, select "Settings" to enter the interface shown in
Figure 5-5 (at this point the robot has not been powered on).
Make sure the emergency stop knob is not pressed, then click the "Start
Robot" button shown in Figure 5-5. The interface will change and the
"Powering On" icon shown in Figure 5-6 will appear. If power-on succeeds, the
"I'm OK!" status shown in Figure 5-7 will appear. If it fails, check whether you
have missed any steps.
After completing the previous step, return to the main menu by pressing the
"<Main Menu" button in the configuration center.
Figure 5-6 Powering up
Figure 5-7 Power on
5.2.3 New blank program
As shown in Figure 5-8, click "Program Robot" and then select "Empty
Program".
Figure 5-8 Select "Empty program"
After performing the previous step, enter the program editing interface as shown
in Figure 5-9.
As shown in Figure 5-10, add two waypoints (absolute points) and teach the
two points: use Quickmove to manually operate the robot, move it to a certain
pose, return, and click "Save Current Point". The teaching steps for both points
are the same. To verify a saved point, press and hold the "Move to this point"
button to manually move the robot to the teaching point.
After editing is complete, make sure the program file is saved. Click "Save" in
the file option bar as shown in Figure 5-10, and the window shown in Figure
5-11 will pop up. Click "File Name" and the input keyboard shown in Figure
5-12 will appear. After entering the file name, click "OK", then go back to the
save interface and click "OK"; the program file is saved. After a successful
save, the program name in the upper left corner of the program editing
interface changes, as shown in Figure 5-13.
Figure 5-11
Figure 5-12 Enter the program name
5.2.5 Program Debugging
As shown in Figure 5-14, in addition to the "Next" and "Run" functions
provided in the program run control bar, click "Advanced" to enter the
interface with more settings.
The "Next" function executes the program step by step: each click runs only
one step, and to continue running you click "Next" again. The "Run" function
automatically runs the program once.
In "Advanced", you can set the number of cycles to run, or run in an infinite
loop. You can also control whether the program runs in automatic or manual
mode. In automatic mode, you can use "Next", "Run" and cyclic operation. In
the interface shown in Figure 5-14, select "Manual Run Mode" and then select
"Run" or "Infinite Loop" in the loop run settings; you can then enter the
running interface in manual operation mode as shown in Figure 5-15.
Figure 5-14 Program debugging
If you use manual mode to debug the program, you need to keep pressing the
"Press Down" button to continue running. If you release the button, the
program pauses; press again to continue.
When debugging is complete, make sure you have saved the debugged
program. After returning to the main menu, select "Run Program". The pop-up
window shown in Figure 5-16 will appear; select the program you have just
debugged and click "OK".
Figure 5-16 Selection procedure
After selecting the program, it will enter the running program interface as shown in
Figure 5-17. In this interface, you can run the program to view the program
running information.
If you are sure that the program will continue to be used in the near future,
you can also set it under Settings - Default Program. Then, whenever the system
starts, it automatically jumps to the "Run Program" interface; after power-on,
click "Run" to run the program.
6 Communications and Messages
Note: To communicate directly via the communication protocol, you need to burn
the Transponder firmware into the Basic and the latest version of AtomMain into
the Atom.
6.1 Communication Settings
Make sure your communication settings are as follows.
Each communication command must contain the following five parts; parts 3 and 4
may be empty.
Part              Data length  Description
Frame header      2 bytes      Fixed 0xFE 0xFE
Data length       1 byte       Different commands correspond to different lengths of data
Command           1 byte       Command sequence number: 0x00 ~ 0x8F
Command content   N bytes      Content of the command; may be empty
Frame end         1 byte       Stop byte, fixed 0xFA
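Putting the five parts together, a frame can be assembled with a small helper. The function below is an illustrative sketch, not part of the official pymycobot API; the length rule (command + content + end byte) is inferred from the serial examples in this chapter.

```python
def build_frame(command: int, content: bytes = b"") -> bytes:
    """Assemble a myCobot serial frame:
    header (0xFE 0xFE) + data length + command + content + end (0xFA)."""
    # The data length byte counts the command byte, the content bytes
    # and the end byte, matching frames such as FE FE 02 20 FA.
    length = 2 + len(content)
    return bytes([0xFE, 0xFE, length, command]) + bytes(content) + bytes([0xFA])

print(build_frame(0x20).hex(" ").upper())                    # FE FE 02 20 FA
print(build_frame(0x30, b"\x01\x01\x32").hex(" ").upper())   # FE FE 05 30 01 01 32 FA
```

Both outputs match serial examples that appear in this chapter.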
Serial port sending example: FE FE 02 20 FA
Calculation for each returned angle: temp = angle1_low + angle1_high * 256
Calculation: the angle value is multiplied by 100, converted to int, and the
high hexadecimal byte is taken (angle_high); the low hexadecimal byte is taken
the same way (angle_low). The angle of each joint (e.g. Joint 1) is encoded
identically.
No Return Value
Calculation: the x/y/z coordinate value is multiplied by 10 and the high
hexadecimal byte is taken.
Calculation: the x/y/z coordinate value is multiplied by 10 and the low
hexadecimal byte is taken.
Calculation: the rx/ry/rz coordinate value is multiplied by 10 and the high
hexadecimal byte is taken.
Calculation: the rx/ry/rz coordinate value is multiplied by 10 and the low
hexadecimal byte is taken.
(Same as the ry/rz-coordinate.)
No Return Value
Set the target point at the end of the robot to (-14, -27, 275, -89.5, 0.7, -90.7).
Calculation: the x coordinate value is multiplied by 10 and the high
hexadecimal byte is taken (x_high); the low byte is taken the same way (x_low).
(Same as the y/z-coordinate.)
Calculation: the rx coordinate value is multiplied by 10 and the high
hexadecimal byte (rx_high) and the low hexadecimal byte (rx_low, data type
byte) are taken. (Same as the ry/rz-coordinate.)
No Return Value
11) Motion Detection
12) Jog-Direction Motion
No Return Value
13) Jog-Coordinate Motion
Set the end to move at a speed of 50% counterclockwise along the X-axis.
No Return Value
14) Jog Stop
No Return Value
Return Value: No
Potential = 2249
Calculation method: the steering gear 1 potential value is converted to int,
then the high hexadecimal byte is taken; the low hexadecimal byte is taken the
same way. (Same for the other steering gears.)
Return Value: No
1. Read speed
2. Set speed
Return Value: No
Angle = 90
temp = angle1_low + angle1_high * 256
Calculation method: temp = low angle byte + high angle byte * 256. If temp is
greater than 33000, subtract 65536 and then divide by 10; otherwise divide by
10 directly.
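The same rule in code form. `decode_value` is an illustrative name; the `scale` argument is 10 here to match this table (use 100 where a value was encoded with a factor of 100).

```python
def decode_value(low: int, high: int, scale: int = 10) -> float:
    """temp = low + high * 256; values above 33000 are negative in two's
    complement, so subtract 65536 before dividing by the scale factor."""
    temp = low + high * 256
    if temp > 33000:
        temp -= 65536
    return temp / scale

print(decode_value(0x84, 0x03))   # 0x0384 = 900, giving 90.0
print(decode_value(0x30, 0xF8))   # 63536 > 33000, so -2000 / 10 = -200.0
```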
1. View connection
Example of serial port return: FE FE 03 50 00 FA
Data_id: data type byte

Address  Function                Value range  Initial value                   Function analysis
20       LED alarm               0-254        0                               1/0 = open or close the LED alarm
21       Speed loop P            0-254        joints 1-3: 8, joints 4-6: 5    controls the proportional coefficient of the motor speed loop
22       Position loop I         0-254        joints 1-3: 20, joints 4-6: 13  controls the integral coefficient of the motor position loop
23       Position loop D         0-254        0                               controls the differential coefficient of the motor position loop
24       Minimum starting force  0-1000       0                               minimum output torque; 1000 = 100%
Return Value: No
Pin_mode: data type byte
Return Value: No
1. Set steering gear status
Return Value: No
Data[4] R
Data[5] G
Data[6] B
Return Value: No
Return Value: No
Return Value: Yes
Return Value: No
Set the tool coordinate system to (-14, -27, 275, -89.5, 0.7, -90.7).
Calculation: the rx coordinate value is multiplied by 100 and the high
hexadecimal byte is taken.
Calculation: the rx coordinate value is multiplied by 100 and the low
hexadecimal byte is taken.
Return Value: No
Calculation: the rx coordinate value is multiplied by 100 and the high
hexadecimal byte is taken.
Calculation: the rx coordinate value is multiplied by 100 and the low
hexadecimal byte is taken.
Example of serial port sending: FE FE 05 30 01 01 32 FA. Range of joint serial
numbers: 1~6. di: data type byte, value range 0 and 1. sp: data type byte,
range 0-100%.
No Return Value
Calculation: the rx coordinate value is multiplied by 100, converted to int,
and the high hexadecimal byte is taken.
Calculation: the rx coordinate value is multiplied by 100, converted to int,
and the low hexadecimal byte is taken.
1. Get the world coordinate system
Return Value: No
Example of serial port sending: FE FE 04 61 22 01 FA
Return Value: No
Data[5] Speed Sp
Return Value: No
Data[6] Speed Sp
Return Value: No
Set the current position of the claw to zero
Data[4] 0X00/0X01
Appendix:
6. Change the base coordinate system
7. The base coordinate system can be set with the setReferenceFrame function:
RFType::BASE takes the robot base as the base coordinate system, while the
world coordinate system can also be taken as the base coordinate system. The
getReferenceFrame function reads the current base coordinate system type.
8. The base coordinate system information can be set and read with the
setWorldReference and getWorldReference functions. When setting, the world
coordinate system is used as the relative coordinate system, and the pose of
the robot base relative to the world coordinate system is input.
9. When the base coordinate system is the base, the GetCoords and WriteCoords
methods take the base as the reference coordinate system.
10. When the base coordinate system is the world coordinate system, both
GetCoords and WriteCoords use the world coordinate system as the reference
coordinate system.
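To see what switching the reference frame means in practice, here is a simplified, standalone sketch in plain Python (not the pymycobot API): it converts a point from the robot base frame into the world frame, with the base pose reduced to a position plus a single rotation about the world Z axis, a stand-in for the full pose that setWorldReference accepts.

```python
import math

def base_to_world(point_xyz, base_pose):
    """Convert a point expressed in the robot BASE frame to the WORLD frame.
    base_pose = (bx, by, bz, yaw_deg): position of the base in the world frame
    plus one rotation about the world Z axis (a simplification for clarity)."""
    bx, by, bz, yaw = base_pose
    c, s = math.cos(math.radians(yaw)), math.sin(math.radians(yaw))
    x, y, z = point_xyz
    # Rotate the point by the base yaw, then translate by the base position.
    return (bx + c * x - s * y, by + s * x + c * y, bz + z)

# A point 100 mm ahead of a base that sits at (500, 0, 0), rotated 90 degrees:
print(base_to_world((100, 0, 0), (500, 0, 0, 90)))
```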
Added: setting and reading of the terminal (end-effector) coordinate system,
the world coordinate system, the current reference coordinate system, the
terminal type, and the moving mode, as well as the sending and receiving of
manipulator information.
A new roboticMessages namespace in the ParameterList.h file is used to add the
manipulator communication information.
The Euclidean distance between the initial point and the target point is
computed, and an interpolation point is inserted every 10 mm along that
distance. If an interpolation point has no inverse solution, the neighbouring
attitude space is searched (position unchanged, each of the three orientation
directions varied by plus or minus PI/30) for inverse solutions, mainly to
avoid singular values and special positions that cannot be solved.
The point transmission interval between MOVEL and JOG is changed to a dynamic
time: the moving time is calculated from the maximum joint moving distance
between two points, and the moving time minus a specific offset is taken as
the time interval.
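The interpolation scheme can be sketched as follows (plain Python, illustrative only; the attitude-search fallback for points without an inverse solution is omitted):

```python
import math

def interpolation_points(p0, p1, step=10.0):
    """Insert a waypoint every `step` mm along the straight line from p0 to
    p1, based on the Euclidean distance between the two points."""
    dist = math.dist(p0, p1)
    n = int(dist // step)
    return [tuple(a + (b - a) * (i * step / dist) for a, b in zip(p0, p1))
            for i in range(1, n + 1)]

pts = interpolation_points((0, 0, 0), (30, 40, 0))  # 50 mm apart
print(len(pts))   # 5
print(pts[-1])    # (30.0, 40.0, 0.0), the target itself at the 50 mm mark
```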
7 Accessories
myCobot accessories include:
End Effectors
Parallel Gripper
Adaptive Gripper
Opening Angle Gripper
Suction Pump
Bases
G Base
Flat Base
Accessories
JoyStick
Battery Box
7.1 End Effectors
End effectors of myCobot currently in mass production:
7.1.1 Gripper
Gripper
1. Fix the gripper to the end of the robotic arm with a LEGO connector and
connect it to the M5Stack Basic end-effector extension interface (described in
the unboxing video).
2. The gripper can be used in all development environments, such as ROS,
Arduino, UIFlow and RoboFlow.
Applicable objects
small box
small ball
long strip
Tips: you can paste rubber on the fingertips of the gripper for better friction.
API control: you can control the gripper by downloading the latest myCobot API.
1. Control the opening and closing degree of the claw: please read the claw
position first, then set the range. The value range is 0-4096; the actual
range is near 2000.
2. Set the initial point of the claw: the initial point corresponds to a
closing degree of 2048.
setGripperState // only used for the adaptive gripper, 0 or 1 for open and close
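A hedged sketch of mapping an opening percentage onto the raw range described above. The `span` default of 1000 is an assumption for illustration only; always read the real position first, as the text advises.

```python
def gripper_value(percent_open, closed=2048, span=1000):
    """Map an opening percentage to a raw gripper position. The raw range is
    0-4096 with ~2048 as the closed/initial point per the notes above; `span`
    is an illustrative guess, so read the real value from the gripper before
    relying on it."""
    value = closed + int(span * percent_open / 100)
    return max(0, min(4096, value))      # clamp into the documented raw range

print(gripper_value(0))    # 2048, the closed/initial point
print(gripper_value(100))  # 3048 under the assumed span
```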
7.1.2 Suction pump
Applicable objects
Paper/plastics
Smooth planar object
Cards, etc
Product Description
Installation schematic
Wiring diagram
Notes
Please ensure that the product is connected according to the instructions.
Make sure the product is powered by the attached adapter.
Please check the connection direction of the positive and negative electrodes.
// Suction pump control example (Arduino)
// Please confirm: solenoid valve connected to pin G2, pump connected to pin G5.
// Once connected: high level = off, low level = on.
void setup() {
  // put your setup code here, to run once:
  Serial.begin(9600);   // open serial port, baud rate 9600
  pinMode(2, OUTPUT);   // set pin G2 to output state
  pinMode(5, OUTPUT);   // set pin G5 to output state
  delay(100);
  digitalWrite(2, 1);   // set pin 2 to high level, close the solenoid valve
  digitalWrite(5, 1);   // set pin 5 to high level, shut down the pump
}
void loop() {
  // control the solenoid valve and the pump as required
  digitalWrite(5, 0);   // set pin 5 to low level, turn on the pump
  delay(200);           // delay 200 ms
  digitalWrite(2, 0);   // set pin 2 to low level, open the solenoid valve
  delay(2000);          // delay 2000 ms, release the sucked object
  digitalWrite(2, 1);   // set pin 2 to high level, close the solenoid valve
  delay(200);           // delay 200 ms
  digitalWrite(5, 1);   // set pin 5 to high level, shut down the pump
  delay(200);           // delay 200 ms
}
7.1.3 Pen holder
Under development...
7.2 Base
Bases currently supported by myCobot:
7.2.1 G Base
1. Tighten the horn screws at both ends and insert a rubber sleeve into the hole.
2. Fix the base on the edge of the table with a G clip.
3. Connect the base and the bottom of the arm with the attached LEGO connector.
4. Make sure it is stable before use.
7.2.2 Suction Base
1. Install the suckers at the four corners of the base and tighten them.
2. Use the attached LEGO piece to connect the suction base and the bottom of
the manipulator.
3. Fix the four suckers to a smooth and flat surface before use.
Tips
A small amount of non-conductive liquid can be added under the suckers to fill
the gap between the suckers and the desktop for the best adsorption effect.
7.3 Accessories
Under development...
8 Machine Vision Development
both in the process of neural and memory exploration, and it can also
explain the recognition of images that are irregular but in some ways
similar to the prototype. However, this model does not explain how
people can identify and process similar stimuli, and it is difficult to
implement in a computer program. Therefore, a more complex model has been
proposed: the "pandemonium" recognition model.
In industrial applications, pictures are usually taken by industrial
cameras and then processed by software according to the grayscale
differences in the picture to extract useful information. A representative
image recognition software vendor is Cognex.
4. Development of image recognition
With the rapid development of image segmentation based on histograms and the
wavelet transform, computing technology, and VLSI technology, research on
image processing has made great progress. Image segmentation methods combine
specific theories, methods and tools, such as segmentation based on
mathematical morphology, on the wavelet transform, and on genetic algorithms.
Development Platform
MaixPy IDE
Development Environment
Windows
Linux
Developer Components
M5StickV
Description
Product Features
Dual-Core 64-bit RISC-V RV64IMAFDC (RV64GC) CPU / 400Mhz(Normal)
Dual Independent Double Precision FPU
Neural Network Processor(KPU) / 0.8Tops
Field-Programmable IO Array (FPIOA)
Dual hardware 512-point 16bit Complex FFT
SPI, I2C, UART, I2S, RTC, PWM, Timer Support
AES, SHA256 Accelerator
Direct Memory Access Controller (DMAC)
Micropython Support
Firmware encryption support
Case Material: PC + ABS
Applications
Face recognition/detection
Object detection/classification
Obtaining size and coordinates of the target in real-time
Obtaining the type of detected target in real-time
Shape recognition
Video/Display
Game simulator
Resources                          Parameter
SRAM                               8 MiB
Flash                              16 MB
KPU neural network parameter size  5.5 MiB - 5.9 MiB
FOV                                55 deg
PMU                                AXP192
Battery                            200 mAh
External storage                   TF-card (microSD)
MEMS                               MPU6886
Package size                       144 x 44 x 43 mm
Case material                      Plastic (PC)
TF-card (microSD) test
M5StickV does not currently recognize all types of TF-card (microSD). We have
tested some common cards; the results are as follows.

Brand     Storage  Type  Class    Format  Test Result
Kingston  64G      XC    Class10  FAT     ok
Functional Description
Kendryte K210
Machine Vision
Better low power vision processing speed and accuracy
KPU high performance Convolutional Neural Network (CNN) hardware
accelerator
Advanced TSMC 28nm process, temperature range -40°C to 125°C
Firmware encryption support
Unique programmable IO array maximises design flexibility
Low voltage, reduced power consumption compared to other systems with
the same processing power
3.3V/1.8V dual voltage IO support eliminates need for level shifters
CPU
The chip contains a high-performance, low power RISC-V ISA-based dual core
64-bit CPU with the following features:
OV7740
support for output formats: RAW RGB and YUV
support for image sizes: VGA, QVGA, CIF and any size smaller
support for black sun cancellation
support for internal and external frame synchronization
standard SCCB serial interface
digital video port (DVP) parallel output interface
embedded one-time programmable (OTP) memory
on-chip phase lock loop (PLL)
embedded 1.5 V regulator for core
array size: 656 x 488
power supply: core 1.5VDC ± 5%; analog 3.3V ± 5%; I/O 1.7 ~ 3.47V
temperature range: operating -30°C to 70°C; stable image 0°C to 50°C
output format: 8-/10-bit raw RGB data, 8-bit YUV
lens size: 1/5"
input clock frequency: 6 ~ 27 MHz
max image transfer rate: VGA (640 x 480) 60 fps; QVGA (320 x 240) 120 fps
sensitivity: 6800 mV/(Lux-sec)
maximum exposure interval: 502 x tROW
pixel size: 4.2 μm x 4.2 μm
image area: 2755.2 μm x 2049.6 μm
package/die dimensions: CSP3 4185 μm x 4345 μm; COB 4200 μm x 4360 μm
MAX98357
Single-Supply Operation (2.5V to 5.5V).
3.2W Output Power into 4Ω at 5V
2.4mA Quiescent Current
92% Efficiency (RL = 8Ω, POUT = 1W)
22.8µVRMS Output Noise (AV = 15dB)
Low 0.013% THD+N at 1kHz
No MCLK Required
Sample Rates of 8kHz to 96kHz
Supports Left, Right, or (Left/2 + Right/2) Output
Sophisticated Edge Rate Control Enables Filterless Class D Outputs
77dB PSRR at 1kHz
Low RF Susceptibility Rejects TDMA Noise from GSM Radios
Extensive Click-and-Pop Reduction Circuitry
AXP192
Operation Voltage: 2.9V - 6.3V(AMR:-0.3V~15V)
Configurable Intelligent Power Select system
Current and voltage limit of adaptive USB or AC adapter input
The resistance of internal ideal diode lower than 100mΩ
MPU6886
Gyroscope features
Digital-output X-, Y-, and Z-axis angular rate sensors (gyroscopes) with a
user-programmable full-scale range of ±250 dps, ±500 dps, ±1000 dps, and
±2000 dps and integrated 16-bit ADCs
Digitally-programmable low-pass filter
Low-power gyroscope operation
Factory calibrated sensitivity scale factor
Self-test
Accelerometer features
Digital-output X-, Y-, and Z-axis accelerometer with a programmable full scale
range of ±2g, ±4g, ±8g and ±16g and integrated 16-bit ADCs
User-programmable interrupts
Wake-on-motion interrupt for low power operation of applications processor
Self-test
Note: There are two versions of M5StickV currently released by M5Stack.
When programming, users need to configure them differently according to the
corresponding pin mapping. The specific differences are as follows.
In the I2C single-mode (blue PCB) version of the M5StickV circuit design, the
MPU6886 only supports I2C communication, and its pin mapping is SCL-28, SDA-29.
In the SPI/I2C dual-mode (black PCB) version of the M5StickV circuit design,
the MPU6886 supports either SPI or I2C communication, and its pin mapping is
SCL-26, SDA-27. In use, you can switch the CS pin level to switch modes (high
level 1 is I2C mode, low level 0 is SPI mode).
The specific pin mapping is shown below:
Links
datasheet
MPU6886
SH200Q
Web page
sipeed
GITHUB
API
schematic
K210-CAM
Procedure
MaixPy reference example
8.1 Set up the MaixPy environment
The first thing to know is that MaixPy uses MicroPython scripting syntax, so it
does not require compilation like C; it can be used without an IDE, using the
serial terminal tool installed previously.
Using the IDE, you can edit scripts in real time on the computer and upload
them to the development board, which makes it convenient to execute scripts
directly on the board, view camera images in real time on the computer, save
files to the board, and so on.
However, the IDE compresses and transports some resources, so performance is
reduced, and if MaixPy crashes it is harder to find the problem than with a
serial terminal.
For more information, please click the official website to view the official
instructions.
Face Recognition
Object Recognition
Color Identification
Emotion Recognition
License Plate Recognition
Sorting System
For more information, please click the official website to view the official
instructions.
Burn firmware
After downloading, use the firmware burner to burn the firmware to M5StickV:
2.2 Download and install software
Install software
Download the file; on Windows, double-click the exe file to run the installer.
Test connection
Open MaixPy IDE and select the model of the development board from the top
toolbar. Select M5StickV to connect.
Below the connect button is the run button, which executes the py file in the
current edit area.
Click the Run button again (red) to stop running the current code.
Plug the device into the computer and listen for the ding-dong sound, like
the sound of a USB drive loading when a U disk is inserted. If there is no
sound, there is a problem with the serial chip on the hardware.
Replace the cable and try again; replace the USB port of the computer and
try again. If it still will not load, try another computer to confirm.
If you cannot burn firmware, check for hardware problems in the following order.
Use the serial port tool to see whether the MaixPy firmware exists on the
hardware. Connect the serial port at 115200 baud and press the reset key
(RST) to receive data from the chip. If anything is received, the serial
port chip is working normally; if nothing comes out, the hardware is
abnormal.
Based on the above, burn the firmware again. Before burning, press the
BOOT key of the hardware, press reset, and then release the BOOT key;
this means the burning is proceeding normally. If not, the Flash may be
damaged; you can try burning to SRAM. If the burning still fails, the
serial port chip is abnormal.
If you get to this step and still cannot fix the problem, then the hardware
does have a defect.
This is often called a one-key download circuit: by toggling the serial port's
DTR and RTS signals, it controls the BOOT and RST pins and enters burning mode
automatically. As described above, the hardware circuit replaces a human in
pressing RST and BOOT, which depends strongly on the hardware implementation.
Only on this basis can data transmission on TX and RX be carried out, so the
functional pins of the UART serial port are needed.
If the download process fails, the baud rate can be reduced appropriately,
because the serial port chip may be unstable. The type selected in the tool
only affects the trigger of the first burning mode; after that, the firmware
is burned at the configured baud rate, which usually does not exceed the
communication speed with flash, commonly 50~60 KB/s.
If no burning mode can be entered no matter what you try, either the firmware
version does not match, or there is a (physical) problem with the serial
chip's DTR/RTS pins.
8.2 Color Recognition
Example code
import sensor
import image
import lcd
import time
from machine import UART
from fpioa_manager import fm

clock = time.clock()
lcd.init()
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.run(1)
lcd.rotation(2)

# blue, red, green, yellow
colour = ['blue', 'red', 'green', 'yellow']
# LAB thresholds for each colour -- example values only, tune them for your
# own lighting (e.g. with the threshold editor in MaixPy IDE).
colour_threshold = [(20, 60, -30, 10, -60, -10),   # blue
                    (30, 70, 30, 80, 10, 60),      # red
                    (30, 70, -70, -20, 20, 60),    # green
                    (60, 95, -15, 15, 40, 80)]     # yellow
blobs = [None] * 4

# serial port used to output the results -- the pin numbers are an
# assumption, adjust them to your wiring
fm.register(35, fm.fpioa.UART1_TX, force=True)
fm.register(34, fm.fpioa.UART1_RX, force=True)
uart_Port = UART(UART.UART1, 115200)

def blobs_output(blobs):
    for b in blobs:
        tmp = img.draw_rectangle(b[0:4])
        tmp = img.draw_cross(b[5], b[6])
        img.draw_string(b[5], b[6], colour[i], color=(255, 0, 0), scale=2)
        c = img.get_pixel(b[5], b[6])
        _colour = {}
        _colour['colour'] = colour[i]
        data_ = []
        data_.append(_colour)
        data_.append(blobs[0])
        uart_Port.write(str(blobs))

def show_fps():
    fps = clock.fps()
    img.draw_string(200, 1, ("%2.2ffps" % (fps)), color=(255, 0, 0), scale=2)

while True:
    clock.tick()
    img = sensor.snapshot()
    show_fps()
    for i in range(4):
        # pixels_threshold restored from the truncated original line
        blobs[i] = img.find_blobs([colour_threshold[i]], area_threshold=100,
                                  pixels_threshold=100)
        if blobs[i]:
            blobs_output(blobs[i])
    lcd.display(img)
Functional Assignment
Identify four colors
The serial port outputs data
The display screen displays the recognized color blocks
Adjust the Lab threshold values, mainly in the binary image view; the white
pixels are the tracked pixels.
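What "tracked pixel" means can be shown with a tiny predicate (plain Python, illustrative; the threshold tuple layout matches MaixPy/OpenMV find_blobs):

```python
def in_threshold(lab, threshold):
    """Return True if an (L, a, b) pixel falls inside a MaixPy/OpenMV-style
    colour threshold tuple (l_min, l_max, a_min, a_max, b_min, b_max); these
    are the pixels find_blobs tracks, shown white in the binary view."""
    l, a, b = lab
    l_min, l_max, a_min, a_max, b_min, b_max = threshold
    return l_min <= l <= l_max and a_min <= a <= a_max and b_min <= b <= b_max

red_threshold = (30, 70, 30, 80, 10, 60)   # example values, tune for your lighting
print(in_threshold((50, 55, 30), red_threshold))   # True
print(in_threshold((50, -40, 30), red_threshold))  # False: a is negative (greenish)
```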
Related knowledge
MaixPy Machine Vision API
Full Knowledge of Lab color space
Name Before we begin, let's clarify the name of the Lab color space: the full
name of Lab is CIELAB, sometimes also written CIE L*a*b*. CIE stands for the
International Commission on Illumination, an international authority on
lighting, colour and related topics.
Channel Lab is composed of a brightness channel and two color channels. In
Lab color space, each color is represented by three numbers, L, a, b, and the
meaning of each component is as follows: L stands for brightness a is the
component from green to red b is the component from blue to yellow
Perceptual uniform Lab is designed based on people's perception of colors.
More specifically, it is perceptual uniform. Perceptual uniform means that if
the number (L, a, b) changes equally, then it brings about a similar degree of
visual change. Lab is more consistent with human vision than RGB and
CMYK color space, and is easier to adjust. If you want to adjust the
brightness (regardless of the Helmholtz -- Kohlrausch effect, refer to the note
below), adjust the L channel, and if you want to adjust only the color balance,
adjust a and b respectively. Note: Helmholtz–Kohlrausch effect is an illusion
in the human eyes -- that colors appear brighter when they are saturated.
Device-independent Lab has a nice feature -- Device-independent. That is,
given a white point in the color space (the figure below represents a white
spot in a color space), the color space can clearly determine how each color
is created and displayed, regardless of the display medium used.
For example, when you want to convert an RGB image on the screen to a CMYK
image for printing, you can first convert it from RGB to Lab and then convert
the Lab image to CMYK mode. Because gamut of Lab is larger than RGB
and CMYK (The gamut of Lab is so large that a large part of it is beyond the
range of human vision that it cannot be called "color"). It is important to note
that Lab defines the color relative to the white point. We will not know the
other colors until we define the color of the white point (e.g. it is defined as
CIE Standard Illuminant D50).
Range of value In theory, L, a, and b are all real numbers, but in practice
they are confined to a range of integers: The larger the L is, the higher the
brightness is. When L is 0, it represents black, and when L is 100, it
represents white. a and b are both gray when they are 0. When a goes from
a negative number to a positive number, the corresponding color will change
from green to red. When b goes from a negative number to a positive
number, the corresponding color will change from blue to yellow. In practical
application, we often use a range of the color channels between -100~+100 or
-128~127.
Visualize Lab has three components in total, so it can be presented in three
dimensional space. In two-dimensional space, the Chromaticity Diagram is
often used to visualize it, that is, to fix the brightness L, to see the change of
a and b. Note that these visualizations are not accurate, it’s just helpful to
understand.
CIELUV There is a color space similar to CIELAB, called CIE 1976 (L*, u*, v*),
also known as CIELUV. The L of this color space is the same as in CIELAB,
but the color components are different.
Conversion between Lab and RGB/CMYK Since RGB and CMYK are both device
related, they cannot be directly converted to Lab. So before the conversion,
you must define an absolute color space, such as sRGB or Adobe RGB. The
conversion from RGB to sRGB is device dependent, but the subsequent
conversions are device independent.
8.3 Shape Recognition
Example Code
# identify the circle
import sensor, image, time, lcd

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time = 300)
sensor.run(1)
lcd.init()
lcd.rotation(2)
clock = time.clock()
while(True):
    clock.tick()
    img = sensor.snapshot()
    # r_margin restored from the truncated original line -- adjust to taste
    for c in img.find_circles(threshold = 1800, x_margin = 20, y_margin = 20,
                              r_margin = 10, r_min = 5, r_max = 45, r_step = 20):
        img.draw_circle(c.x(), c.y(), c.r(), color = (255, 0, 0))  # draw the detected circle in red
        print("r %f" % c.r())
    print("FPS %f" % clock.fps())
    lcd.display(img)
Functional Assignment
Identify the specified shape
Return the corresponding data
Related knowledge
MaixPy Machine Vision API
8.4 Face Recognition
View the machine code in the serial output
After the firmware is burned, connect the computer, open the serial assistant, then
open the serial port.
Download model
With the machine code we can start to download the corresponding model files.
The blue part is the model files, and the yellow part is the bin file of
MaixPy. This is the compact version, and the size is relatively small, just
over 600 KB. The json file contains the configuration, which says where these
files should be written in Flash and whether they need to be checked.
{
"version": "0.1.0",
"files": [
{
"address": 0,
"bin": "maixpy_face_ide.bin",
"sha256Prefix": true
},
{
"address": 5242880,
"bin": "FD_a6e91e13a0de48bafec324646d070358.smodel",
"sha256Prefix": false
},
{
"address": 6291456,
"bin": "KP_chwise_a6e91e13a0de48bafec324646d070358.smodel",
"sha256Prefix": false
},
{
"address": 7340032,
"bin": "FE_mbv1_0.5_a6e91e13a0de48bafec324646d070358.smodel",
"sha256Prefix": false
}
]
}
Run example code
code address
import sensor,image,lcd # import 相关库
import KPU as kpu
import time
from Maix import FPIOA,GPIO
task_fd = kpu.load(0x200000) # 从flash 0x200000 加载人脸检测模型
task_ld = kpu.load(0x300000) # 从flash 0x300000 加载人脸五点关键点检测模型
task_fe = kpu.load(0x400000) # 从flash 0x400000 加载人脸196维特征值模型
clock = time.clock() # 初始化系统时钟,计算帧率
key_pin=16 # 设置按键引脚 FPIO16
fpioa = FPIOA()
fpioa.set_function(key_pin,FPIOA.GPIO36)
key_gpio=GPIO(GPIO.GPIO36,GPIO.IN)
last_key_state=1
key_pressed=0 # 初始化按键引脚 分配GPIO7 到 FPIO16
def check_key(): # 按键检测函数,用于在循环中检测按键是否按下,下降沿有效
global last_key_state
global key_pressed
val=key_gpio.value()
if last_key_state == 1 and val == 0:
key_pressed=1
else:
key_pressed=0
last_key_state = val
lcd.init() # 初始化lcd
sensor.reset() #初始化sensor 摄像头
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.set_hmirror(1) #设置摄像头镜像
sensor.set_vflip(1) #设置摄像头翻转
sensor.run(1) #使能摄像头
anchor = (1.889, 2.5245, 2.9465, 3.94056, 3.99987, 5.3658, 5.155437, 6.92275, 6.718375
dst_point = [(44,59),(84,59),(64,82),(47,105),(81,105)] #standard face key point posit
a = kpu.init_yolo2(task_fd, 0.5, 0.3, 5, anchor) #初始化人脸检测模型
img_lcd=image.Image() # 设置显示buf
img_face=image.Image(size=(128,128)) #设置 128 * 128 人脸图片buf
a=img_face.pix_to_ai() # 将图片转为kpu接受的格式
record_ftr=[] #空列表 用于存储当前196维特征
record_ftrs=[] #空列表 用于存储按键记录下人脸特征, 可以将特征以txt等文件形式保存到sd卡后,读
names = ['Mr.1', 'Mr.2', 'Mr.3', 'Mr.4', 'Mr.5', 'Mr.6', 'Mr.7', 'Mr.8', 'Mr.9', 'Mr.10']
while(1):                                   # main loop
    check_key()                             # check the button state
    img = sensor.snapshot()                 # grab one frame from the camera
    clock.tick()                            # record the time, used to compute the frame rate
    code = kpu.run_yolo2(task_fd, img)      # run the face-detection model, get face positions
    if code:                                # if a face is detected
        for i in code:                      # iterate over the bounding boxes
            # Cut face and resize to 128x128
            a = img.draw_rectangle(i.rect())                    # draw the face box on screen
            face_cut = img.cut(i.x(), i.y(), i.w(), i.h())      # crop the face region into face_cut
            face_cut_128 = face_cut.resize(128, 128)            # resize the cropped face to 128x128 pixels
            a = face_cut_128.pix_to_ai()    # convert the cropped image to the format the KPU accepts
            #a = img.draw_image(face_cut_128, (0,0))
            # Landmark for face 5 points
            fmap = kpu.forward(task_ld, face_cut_128)   # run the 5-point facial-landmark model
            plist = fmap[:]                  # get the landmark predictions
            le = (i.x() + int(plist[0]*i.w() - 10), i.y() + int(plist[1]*i.h()))  # left-eye position
            re = (i.x() + int(plist[2]*i.w()), i.y() + int(plist[3]*i.h()))       # right-eye position
            nose = (i.x() + int(plist[4]*i.w()), i.y() + int(plist[5]*i.h()))     # nose position
            lm = (i.x() + int(plist[6]*i.w()), i.y() + int(plist[7]*i.h()))       # left mouth corner
            rm = (i.x() + int(plist[8]*i.w()), i.y() + int(plist[9]*i.h()))       # right mouth corner
            a = img.draw_circle(le[0], le[1], 4)
            a = img.draw_circle(re[0], re[1], 4)
            a = img.draw_circle(nose[0], nose[1], 4)
            a = img.draw_circle(lm[0], lm[1], 4)
            a = img.draw_circle(rm[0], rm[1], 4)  # draw a small circle at each landmark
            # align face to standard position
            src_point = [le, re, nose, lm, rm]    # the 5 landmark positions in the image
            T = image.get_affine_transform(src_point, dst_point)  # affine transform from the 5 detected points to the standard positions
            a = image.warp_affine_ai(img, img_face, T)  # warp the detected face in the original image to a frontal view
            a = img_face.ai_to_pix()         # convert the frontal face image back to KPU format
            #a = img.draw_image(img_face, (128,0))
            del(face_cut_128)                # free the cropped face image
            # calculate face feature vector
            fmap = kpu.forward(task_fe, img_face)  # compute the 196-dimensional feature vector of the frontal face
            feature = kpu.face_encode(fmap[:])     # get the result
            reg_flag = False
            scores = []                      # store the feature-comparison scores
            for j in range(len(record_ftrs)):          # iterate over the stored features
                score = kpu.face_compare(record_ftrs[j], feature)  # compare the current face feature with a stored one
                scores.append(score)         # append to the score list
            max_score = 0
            index = 0
            for k in range(len(scores)):     # scan all comparison scores for the maximum and its index
                if max_score < scores[k]:
                    max_score = scores[k]
                    index = k
            if max_score > 85:               # a maximum score above 85 is treated as the same person
                a = img.draw_string(i.x(), i.y(), ("%s :%2.1f" % (names[index], max_score)), color=(0, 255, 0))
            else:
                a = img.draw_string(i.x(), i.y(), ("X :%2.1f" % (max_score)), color=(255, 0, 0))
            if key_pressed == 1:             # if the button was pressed
                key_pressed = 0              # reset the button state
                record_ftr = feature
                record_ftrs.append(record_ftr)  # add the current feature to the known-feature list
            break
    fps = clock.fps()                        # compute the frame rate
    print("%2.1f fps" % fps)                 # print the frame rate
    a = lcd.display(img)                     # refresh the screen
    #kpu.memtest()
#a = kpu.deinit(task_fe)
#a = kpu.deinit(task_ld)
#a = kpu.deinit(task_fd)
Press BUTTON A to record the face. After the face is recorded, the name will be
assigned in order and displayed when the face is recognized.
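The matching step in the example scores the live feature against every recorded one and accepts the best match only above a threshold of 85. The same logic can be sketched in plain Python; `cosine_score` here is a simplified stand-in for `kpu.face_compare` (an assumption, not the KPU's actual metric), scaled to 0-100:

```python
def cosine_score(a, b):
    """Similarity of two feature vectors, scaled to 0-100 like kpu.face_compare."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return 100.0 * dot / (na * nb)

def best_match(feature, record_ftrs, names, threshold=85):
    """Return the matched name, or None if no stored face scores above the threshold."""
    if not record_ftrs:
        return None
    scores = [cosine_score(f, feature) for f in record_ftrs]  # score every stored face
    max_score = max(scores)                                   # keep only the best one
    index = scores.index(max_score)
    return names[index] if max_score > threshold else None
```

With this structure, pressing BUTTON A simply appends the current `feature` to `record_ftrs`, so the next frame can match it.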
Related knowledge
8.5 QR Code Detection
Example code
# AprilTags 3D positioning example
#
# This example shows how the OpenMV Cam detects AprilTags (OpenMV Cam M7/H7; the OpenMV2 M4 cannot run this).
import sensor, image, time, math, lcd

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)  # memory may overflow if the resolution is too high
sensor.skip_frames(time = 2000)
#sensor.set_auto_gain(False)      # must be turned off to prevent the image from washing out...
#sensor.set_auto_whitebal(False)  # must be turned off to prevent the image from washing out...
clock = time.clock()
lcd.init()
lcd.rotation(2)

# Note! Unlike find_qrcodes, the find_apriltags method does not need lens correction on the image.
# What is the difference between tag families? For example, the TAG16H5 family is actually a 4x4 square tag,
# so it can be seen from a longer distance than a 6x6 TAG36H11 tag. However, the lower H value (H5 vs. H11)
# means the false-positive rate of the 4x4 tag is far higher than that of the 6x6 tag. So unless you have a
# reason to use another tag family, use the default TAG36H11 family.

# f_x is the camera's x focal length. It should equal the lens focal length in mm divided by the x sensor size
# in mm, multiplied by the image width in pixels.
# The values below apply to an OV7725 camera with a 2.8 mm lens.
# f_y is the camera's y focal length. It should equal the lens focal length in mm divided by the y sensor size
# in mm, multiplied by the image height in pixels.
# The values below apply to an OV7725 camera with a 2.8 mm lens.
# c_x is the image x center position in pixels.
# c_y is the image y center position in pixels.
f_x = (2.8 / 3.984) * 160
f_y = (2.8 / 2.952) * 120
c_x = 160 * 0.5
c_y = 120 * 0.5

def degrees(radians):
    return (180 * radians) / math.pi

while(True):
    clock.tick()
    img = sensor.snapshot()
    for tag in img.find_apriltags(fx=f_x, fy=f_y, cx=c_x, cy=c_y):  # defaults to TAG36H11
        tmp = img.draw_rectangle(tag.rect(), color = (255, 0, 0))
        tmp = img.draw_cross(tag.cx(), tag.cy(), color = (0, 255, 0))
        print_args = (tag.x_translation(), tag.y_translation(), tag.z_translation(), \
            degrees(tag.x_rotation()), degrees(tag.y_rotation()), degrees(tag.z_rotation()))
        # Translation units are unknown; rotation units are degrees.
        print("Tx: %f, Ty %f, Tz %f, Rx %f, Ry %f, Rz %f" % print_args)
    print(clock.fps())
    lcd.display(img)
Function analysis
Recognizes the specified tag (this example uses AprilTags)
Outputs the pose data over the serial port
Shows the detected tag on the display
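On the host side, the serial lines printed by the example can be parsed back into numbers. A minimal sketch, reusing the exact `Tx:`/`Ty` format string from the example's `print` call; reading from an actual serial port (e.g. with `pyserial`) is omitted:

```python
import re

# Matches one line printed by the AprilTag example:
# "Tx: %f, Ty %f, Tz %f, Rx %f, Ry %f, Rz %f"
LINE_RE = re.compile(
    r"Tx: (?P<tx>-?\d+\.?\d*), Ty (?P<ty>-?\d+\.?\d*), Tz (?P<tz>-?\d+\.?\d*), "
    r"Rx (?P<rx>-?\d+\.?\d*), Ry (?P<ry>-?\d+\.?\d*), Rz (?P<rz>-?\d+\.?\d*)"
)

def parse_tag_line(line):
    """Parse one 'Tx: ...' line into a dict of floats, or None if it doesn't match."""
    m = LINE_RE.match(line.strip())
    if not m:
        return None
    return {k: float(v) for k, v in m.groupdict().items()}
```

Rotation values are already in degrees at this point, because the example converts them with `degrees()` before printing.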
Extended knowledge
MaixPy machine vision API
AprilTag official website
1 Maintenance
After-sales Service
Return service is limited to unopened goods within 7 days after the logistics receipt date. The freight and other risks incurred in a return shall be borne by the customer.
Customers should provide the purchase invoice and warranty card as warranty certification when requesting warranty service.
Elephant Robotics is responsible for hardware faults that occur under normal use of the product during the warranty period.
The warranty period starts from the date of purchase or the logistics receipt date.
Faulty parts replaced in the products become the property of Elephant Robotics, and an appropriate cost will be charged if necessary.
If you need to apply for warranty service, please contact our customer service first to confirm the details. The following are the warranty terms for the main components:
Note: If there is a conflict with the Product Brochure, the User Manual shall
prevail.
Servos

Warranty Period | Warranty Services
≤1 year        | Elephant Robotics offers free new components once; customers shall bear the freight.
≥1 year        | Customers need to buy replacements themselves.

Electrical Parts (M5 Hardware)

Warranty Period | Warranty Services
≥6 months      | Customers need to buy replacements themselves.
During the warranty period of the delivered product, the company repairs free of charge only the malfunctions that occur during normal use of the robot; outside normal use, the customer will be charged for repairs even during the warranty period. Please strictly follow the instructions in this manual and the related manuals when operating the robot.
Limit
After receiving myCobot, pay attention to the limit of each joint: the rotation angle of each axis must not exceed its maximum physical limit.
Turn the joints gently and through small angles; once a limit is reached, do not keep turning.
Be careful not to touch the robot arm itself while a joint is running, so as not to knock it over.
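When scripting motions, the joint limits above can be enforced defensively by clamping commanded angles before sending them to the arm. A minimal sketch; the ±165° limits are placeholders, not official myCobot specifications, so substitute the values from your model's datasheet:

```python
# Hypothetical per-joint limits in degrees; replace with the values
# from your robot's datasheet before use.
LIMITS = [(-165, 165)] * 6

def clamp_angles(angles, limits=LIMITS):
    """Clamp a six-element list of joint angles into the given limits."""
    return [max(lo, min(hi, a)) for a, (lo, hi) in zip(angles, limits)]
```

A clamped list can then be passed to whatever motion call your API provides, ensuring no command ever asks a joint to exceed its physical range.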
Shell Disassembly
YouTube video: https://youtu.be/wHzFsExkYrE

Steps
Prepare the items as shown, loosen the screws of the corresponding shell, apply force carefully, and gently remove the shell from the left and right sides.
After removing the shell, check that there is no damage inside; then you can install the new shell.
Fasten the new shell; the shell and the manipulator click together audibly. After fastening, insert and tighten the screws to complete the replacement.
Calibration
YouTube video: https://youtu.be/vGznxW4OF10
Steps
2 FAQ
Hardware
Q: Why does the arm show a small jitter in the vertical state, but not while running?
A: ROS is open source and is updated on our GitHub. There is no charge for firmware upgrades.
Software
Q: Why can't my compiler find the corresponding device?
A: You need to set up the development environment and install the corresponding project library before developing with the device.
Q: Why doesn't the device work properly after I burned firmware to the ATOM?
A: The ATOM must run our factory firmware; do not flash unofficial firmware onto it. If other firmware was flashed accidentally, use "myCobot firmware burner": select the ATOM, select the serial port, and select the ATOMMAIN firmware to flash the ATOM.
UIFlow
Q: Does UIFlow not support the latest firmware?
A: This requires an M5Stack update; we have asked M5 to help us update it, but the time frame is unknown.
RoboFlow
ROS
Q: Does the ROS system operate with a computer attached to the robot arm?
A: Yes, connecting with a computer works.
Q: The ROS folder downloaded from GitHub only displays the controls, not the myCobot 3D model.
A: You need to open RViz manually: rosrun rviz rviz
Q: "Range out" or error[101] occurs at runtime.
A: Check that the serial port is correct and that the Basic and ATOM firmware are correct.
myStudio
Q: A few questions about the APP: 1) Bluetooth mode has two controls on the right; one is back-to-zero — what is the other? 2) The sixth axis has no mechanical zero; how does the robot return to zero with the Euler angle Rz at 0? 3) If I press a joint or end-axis control button and then release it, will the arm keep moving?
A: 1. The other is free-move mode.
2. During calibration, the current position is set as zero by default.
3. After release it should stop; the APP uses jog control. However, due to the instability of the Bluetooth connection, a stop instruction may be sent but not received by the arm.
Firmware Burner
Q: The motors don't work.
A: Use the latest version 1.3 firmware burner and burn atommain to the ATOM. You can only update the ATOM firmware when the computer is connected to the Type-C interface at the end of the robotic arm.
Q: After burning, how do I return to the factory settings?
A: Download the firmware burner or the Arduino program and burn the main firmware.
Others
A: MainControl is the factory default firmware; Transponder is the forwarding program — after burning it, the arm can be controlled directly by sending packet-protocol commands.
Q: In the default software, drag-to-teach actions can only be saved to RAM, not to flash.
A: The old version will be released with the Chinese-version MainControl after the bug fix.
Others
Q: Can I get real-time data while executing instructions?
A: Not for the time being; the bus cannot be disturbed while working.
Q: The end interface supports only a limited set of servo models through the control board; what is its relationship with the M5Stack? Is it independent?
A: The end device needs to adapt to our controller; it supports PWM control of servos.
A: myCobot is open-source with ROS support, and now supports most platforms on the market, including UIFlow, Arduino, MicroPython, and FreeRTOS.
Q: What is the shell material?
A: Plastic (photosensitive resin, SLA).
Q: What brand are the motors?
A: Self-made motors, with external processing.
Q: Are they not harmonic drives?
A: No. They are servos like model servos, but we customized and modified them a lot to make them more suitable for a manipulator. Six-axis linkage with cubic interpolation; the motion is not as uniform as an industrial robot's, which is reflected in the price.
Q: What are the motor parameters?
A: Large servo: rated load 10 kg·cm, maximum speed 60 rpm, voltage 7.4 V, 12-bit magnetic encoder.
Small servo: rated load 2 kg·cm, maximum speed 80 rpm, voltage 7.4 V, 12-bit magnetic encoder.
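Datasheets often quote torque in N·m rather than kg·cm; the conversion is 1 kg·cm = 0.0980665 N·m, so the ratings above work out as follows:

```python
KGCM_TO_NM = 9.80665 / 100  # 1 kg*cm = 0.0980665 N*m

def kgcm_to_nm(kgcm):
    """Convert a torque rating from kg*cm to N*m."""
    return kgcm * KGCM_TO_NM

print(kgcm_to_nm(10))  # large servo: 10 kg*cm, about 0.98 N*m
print(kgcm_to_nm(2))   # small servo: 2 kg*cm, about 0.20 N*m
```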
Q: Which ROS version?
A: rosdistro: kinetic; rosversion: 1.12.17
Q: What is the angle accuracy?
A: 360°/4096 = 0.0879°
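This figure follows directly from the 12-bit magnetic encoder: 4096 positions per full turn.

```python
STEPS = 2 ** 12              # 12-bit magnetic encoder -> 4096 positions per turn
resolution = 360 / STEPS     # degrees per encoder step
print(round(resolution, 4))  # 0.0879
```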
A: 6~9 V, 3~5 A
Q: The Basic was flashed with UIFlow and later flashed back to MainControl with your burning software — why can the track-recording software no longer be used?
A: The ATOM firmware version is not compatible; only matching firmware versions can work together. Updating the ATOM firmware fixes it.
Q: Do you need to start again after a drag-teach instruction?
A: When re-recording, the joints' motor lock cannot be cut off; clicking "release" loosens the arm joints so the next drag teaching can begin. Power-cycling also works.
Q: What information is sent to the joint motors through ROS?
A: The encoder information can be fed back.
3 Resources
Website
Official Website: https://www.elephantrobotics.com/en/
myCobot GitHub (software): https://github.com/elephantrobotics/myCobot
M5 UIFlow: https://docs.m5stack.com/#/zh_CN/quick_start/m5core/m5stack_core_get_started_MicroPython

Videos (YouTube)

Maintenance
Connecting line: https://youtu.be/1wq0kTJVqw4
Disassembly: https://youtu.be/wHzFsExkYrE
Limit: https://youtu.be/PUeU-mynljw
Calibration: https://youtu.be/vGznxW4OF10

Tutorials
Unboxing: https://youtu.be/Lwi8UoihzNc
Free move: https://youtu.be/WzrbOrdQop0
MainControl: https://youtu.be/VKd8b989M8g
ROS: https://youtu.be/-Jo_IJ8RaXc
Arduino: https://youtu.be/pkQIApDRJpo
myStudio: https://youtu.be/Kr9i62ZPf4w

Others
2 display screens (traffic signals): https://youtu.be/9ej0tEwhXuE
Promotional video: https://youtu.be/uSw5rsymjVY
User cases (1): https://youtu.be/0Al1MN50RS0
User cases (2): https://youtu.be/eoR2-MId_-I