MSc CS Sem 4 Part 2 Journal - Robotics
Aim: Making the Raspberry Pi headless, and reaching it from the network using WiFi
and SSH.
Components Required:-
SD Card - 16 GB, Desktop
Procedure :-
Step 1: Go to www.raspberrypi.com and click on the Software tab.
Download Raspberry Pi Imager for Windows, and connect the SD card to the
laptop.
Step 4: Click on the Settings icon, set the hostname, and click to enable SSH.
Code with sam - youtube. Developed by :- Sangeeta Menon, VIMEET M.Sc. SEM IV 2022-23
Step 5: Connect the Raspberry Pi WiFi and the laptop WiFi to the same mobile hotspot.
Step 6: Open CMD and type the following commands:
I) ping raspberrypi or ping 192.168.207.244
II) ssh admin@raspberrypi or ssh admin@192.168.207.244
and type the password of the admin user.
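The Pi's address on the hotspot is a private (RFC 1918) LAN address, which is why only devices on the same network can reach it; Python's standard ipaddress module can confirm this (the address below is just an illustrative example of that form):

```python
import ipaddress

# An address of the 192.168.x.x form, as used when SSH-ing into the Pi.
pi_addr = ipaddress.ip_address("192.168.207.244")

# 192.168.0.0/16 is a private range, so the Pi is only reachable from
# hosts on the same local network / hotspot, not from the open internet.
print(pi_addr.is_private)  # True
```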
Source Code:-
import RPi.GPIO as GPIO
import time

GPIO.setmode(GPIO.BOARD)   # use physical pin numbering
GPIO.setwarnings(False)

button = 12                # push-button input pin
DC_motor_a = 7             # motor driver input A
DC_motor_b = 11            # motor driver input B

GPIO.setup(DC_motor_a, GPIO.OUT)
GPIO.setup(DC_motor_b, GPIO.OUT)
GPIO.setup(button, GPIO.IN, pull_up_down=GPIO.PUD_UP)

while True:
    if GPIO.input(button) == GPIO.LOW:   # pulled up, so LOW = pressed
        GPIO.output(DC_motor_a, GPIO.HIGH)
        GPIO.output(DC_motor_b, GPIO.LOW)
        time.sleep(0.1)
    else:                                # released: reverse polarity
        GPIO.output(DC_motor_a, GPIO.LOW)
        GPIO.output(DC_motor_b, GPIO.HIGH)
        time.sleep(0.1)
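Since RPi.GPIO is only importable on the Pi itself, the branch logic of the loop above can be checked on any machine as a plain function; motor_outputs is a hypothetical helper, not part of the journal's script:

```python
def motor_outputs(button_low):
    """Return (motor_a, motor_b) levels for a polled button state.

    Mirrors the loop above: with the internal pull-up, the button reads
    LOW when pressed, which drives the motor one way; released (HIGH)
    reverses the polarity. Hypothetical helper for off-board testing.
    """
    if button_low:        # pressed
        return (1, 0)     # motor_a HIGH, motor_b LOW
    return (0, 1)         # released: reversed

print(motor_outputs(True))   # (1, 0)
print(motor_outputs(False))  # (0, 1)
```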
Practical No :-4
Aim :- Write a script to follow a predetermined path.
Components:-
Raspberry Pi 3 board
L293D
Simple DC motor
Button
Process of Creation of Project:-
Step I:- Go to the New Project tab and select New Project.
Source Code:-
import RPi.GPIO as GPIO
import time

GPIO.setmode(GPIO.BOARD)   # use physical pin numbering
GPIO.setwarnings(False)

button = 12                # push-button input pin
DC_motor_a = 7             # motor driver input A
DC_motor_b = 11            # motor driver input B

GPIO.setup(DC_motor_a, GPIO.OUT)
GPIO.setup(DC_motor_b, GPIO.OUT)
GPIO.setup(button, GPIO.IN, pull_up_down=GPIO.PUD_UP)

while True:
    if GPIO.input(button) == GPIO.LOW:   # pulled up, so LOW = pressed
        GPIO.output(DC_motor_a, GPIO.HIGH)
        GPIO.output(DC_motor_b, GPIO.LOW)
        time.sleep(0.1)
    else:                                # released: reverse polarity
        GPIO.output(DC_motor_a, GPIO.LOW)
        GPIO.output(DC_motor_b, GPIO.HIGH)
        time.sleep(0.1)
Output:-
Motors moving in the clockwise direction.
Components:
a. PIR Sensor
b. Resistor
c. Piezo
d. Arduino Uno R3
e. LED RGB
Step 2: Type the following code
int pirsensor = 0;

void setup() {
  pinMode(2, INPUT);    // PIR sensor input
  pinMode(12, OUTPUT);  // piezo buzzer
  pinMode(13, OUTPUT);  // LED
}

void loop() {
  pirsensor = digitalRead(2);
  if (pirsensor == HIGH) {
    digitalWrite(13, HIGH);   // motion detected: LED on
    tone(12, 500, 500);       // 500 Hz tone for 500 ms
  } else {
    digitalWrite(13, LOW);    // no motion: LED off
  }
}
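The PIR branch above can also be exercised off-board; alarm_events is a hypothetical Python helper that counts distinct motion events (LOW-to-HIGH transitions) in a sequence of simulated digitalRead values. Note that the Arduino sketch itself re-triggers the tone on every loop pass while motion persists:

```python
def alarm_events(samples):
    """Count LOW->HIGH transitions in a sequence of PIR readings (0/1).

    Each transition corresponds to the start of one detected motion
    event in the sketch above. Hypothetical helper, not part of the
    journal's code.
    """
    events, prev = 0, 0
    for s in samples:
        if s and not prev:
            events += 1
        prev = s
    return events

print(alarm_events([0, 1, 1, 0, 1, 0]))  # 2 motion events
```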
Output:
Practical No :-6
Aim:- Add the sensors to the robot object and develop the line
follower behaviour code.
Components required:-
Arduino
Button
Zumo robot
Proteus 8.13 simulator
Process of Creation of Project:-
Step I:- Go to the New Project tab and select New Project.
University Department of Computer Science
Subject:-Robotics And Process Automation
Subject Code:-
// Peripheral Constructors
CPU &cpu = Cpu;
TimerOne &timer1 = Timer1;
DRIVE Z1_DRIVE = DRIVE (8, 10, 7, 9);
LINESENSOR Z1_LINESENSOR = LINESENSOR (4, A3, 11, A0, A2, 5, 2);
COMPASS Z1_COMPASS = COMPASS ();
GYRO Z1_GYRO = GYRO ();
void peripheral_setup () {
  Z1_DRIVE.begin ();
  Z1_LINESENSOR.begin ();
  Z1_COMPASS.begin ();
  Z1_GYRO.begin ();
}

void peripheral_loop() {
}
//---CONFIG_END---
// Flowchart Variables
long var_linePosition;
long var_error;
long var_lastError;
long var_speedDifference;
long var_leftSpeed;
long var_rightSpeed;
long var_maxSpeed;
float var_magX;
float var_magY;
float var_magZ;
// Flowchart Routines
void chart_SETUP() {
  var_lastError=0, var_maxSpeed=255;
}

void chart_LOOP() {
  var_linePosition=Z1_LINESENSOR.readLinePos();
  var_error=var_linePosition-2500;
  var_speedDifference=var_error/4+6*(var_error-var_lastError);
  var_lastError=var_error;
  var_leftSpeed=var_maxSpeed+var_speedDifference;
  var_rightSpeed=var_maxSpeed-var_speedDifference;
  chart_boundSpeeds();
  // the bounded speeds are then applied to Z1_DRIVE
}
void chart_boundSpeeds() {
  // clamp each wheel speed to the range 0..var_maxSpeed
  if(var_leftSpeed<0) {
    var_leftSpeed=0;
  }
  if(var_rightSpeed<0) {
    var_rightSpeed=0;
  }
  if(var_leftSpeed>var_maxSpeed) {
    var_leftSpeed=var_maxSpeed;
  }
  if(var_rightSpeed>var_maxSpeed) {
    var_rightSpeed=var_maxSpeed;
  }
}
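The proportional-derivative arithmetic in chart_LOOP, together with the clamping in chart_boundSpeeds, can be checked off-robot; the sketch below is a minimal Python rendering under the same constants (line centre 2500, maximum speed 255), and line_follow_speeds is a hypothetical name:

```python
def line_follow_speeds(line_pos, last_error, max_speed=255):
    """One PD step mirroring chart_LOOP / chart_boundSpeeds.

    line_pos is the line-sensor position reading (centre is 2500).
    Returns (left, right, error) with both wheel speeds clamped to
    the range 0..max_speed.
    """
    error = line_pos - 2500
    diff = error // 4 + 6 * (error - last_error)   # P term + D term
    left = max(0, min(max_speed, max_speed + diff))
    right = max(0, min(max_speed, max_speed - diff))
    return left, right, error

# On-centre line: both wheels at full speed.
print(line_follow_speeds(2500, 0))  # (255, 255, 0)
# Line far off to one side: one wheel is clamped to 0 to steer back
# (which physical wheel this is depends on the motor wiring).
print(line_follow_speeds(4000, 0))  # (255, 0, 1500)
```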
OUTPUT:
Components required:
Raspberry Pi, RGB LED strip
Circuit connection:-
# Main function
def main():
    # Setup
    variables_setup()
    peripheral_setup()
    chart_SETUP()
    # Infinite loop
    while True:
        peripheral_loop()
        chart_LOOP()

# Command line execution
if __name__ == '__main__':
    main()
Flowchart of project:
Figure of the led showing red color
Conclusion:-
Hence we have programmed the RGB LED strip for the observation of the various colors used to identify the paths.
Practical No :-9
Aim :- Create an obstacle avoidance behaviour for the robot and test it.
Components
Arduino Uno
Zumo robot
3) Save the graphic as a PNG file into the same directory as your Proteus project.
4) Edit the turtle component on your schematic and specify the graphic you have just saved as the obstacle
map.
Source Code:
//---CONFIG_BEGIN---
#pragma GCC push_options
#pragma GCC optimize ("Os")
// Peripheral Constructors
CPU &cpu = Cpu;
TimerOne &timer1 = Timer1;
TurtleDrive T1_DRIVE = TurtleDrive (2, 4, 3, 6, 7, 5);
TurtleSonarHead T1_SH = TurtleSonarHead (8, 9, 10);
TurtleLineHunter T1_LH = TurtleLineHunter (11, 12, A0);

void peripheral_setup () {
  T1_DRIVE.begin ();
  T1_SH.begin ();
  T1_LH.begin ();
}

void peripheral_loop() {
}
//---CONFIG_END---
// Flowchart Variables
long var_speed;
long var_dir;
long var_count;
long var_range;
long var_fast;
long var_slow;
long var_tstart;
long var_tstop;
// Flowchart Routines
void chart_SETUP() {
  var_speed=180, var_range=10, var_dir=0;
  T1_SH.setAngle(0);
  T1_SH.setRange(25);
  T1_DRIVE.forwards(var_speed);
}
void chart_LOOP() {
  if(!(T1_LH(0,0,0))) {
    if(T1_LH(1,1,1)) {
      chart_Correct();
    } else if(T1_LH(0,1,1)) {
      T1_DRIVE.drive(1,1,5*var_speed/4);
      T1_DRIVE.drive(2,1,var_speed/2);
      var_dir=10;
      chart_Avoid();
    } else if(T1_LH(0,0,1)) {
      T1_DRIVE.drive(1,1,5*var_speed/4);
      T1_DRIVE.drive(2,0,var_speed/5);
      var_dir=30;
    } else if(T1_LH(1,1,0)) {
      T1_DRIVE.drive(2,1,5*var_speed/4);
      T1_DRIVE.drive(1,1,var_speed/2);
      var_dir=-10;
      chart_Avoid();
    } else if(T1_LH(1,0,0)) {
      T1_DRIVE.drive(2,1,5*var_speed/4);
      T1_DRIVE.drive(1,0,var_speed/5);
      var_dir=-30;
    } else if(T1_LH(-1,1,-1)) {
      T1_DRIVE.forwards(var_speed);
      var_dir=0;
      chart_Avoid();
    }
  }
}
void chart_Correct() {
  var_count=0;
l3:;
  if(var_dir>0) {
    T1_DRIVE.drive(2,1,var_speed);
    T1_DRIVE.drive(1,0,var_speed/3);
  } else if(var_dir<0) {
    T1_DRIVE.drive(1,1,var_speed);
    T1_DRIVE.drive(2,0,var_speed/3);
  }
  delay(1);
  var_count=var_count+1;
  if(var_count<1000) {
    if (T1_LH(1,1,1)) goto l3;
  } else {
    T1_DRIVE.stop();
    var_dir=0;
  }
}
void chart_Avoid() {
  if(T1_SH(var_range,0)) {
    T1_DRIVE.backwards(2*var_speed/3);
    delay(250);
    T1_DRIVE.turn(80);
    do {
      delay(5);
    } while (T1_SH(1.5*var_range,0));   // wait until the obstacle clears
    T1_DRIVE.stop();
    delay(500);
    T1_DRIVE.turn(80);
    var_tstart=cpu.millis();
    while (!(T1_SH(1.5*var_range,0))) { // busy-wait until it is seen again
    }
    var_tstop=cpu.millis();
    var_count=(var_tstop-var_tstart)/10; // elapsed time in 10 ms units
    T1_DRIVE.stop();
    delay(500);
    T1_DRIVE.turn(-80);
    while (var_count>0) {
      var_count=var_count-1;
      delay(5);
    }
    T1_DRIVE.backwards(2*var_speed/3);
    delay(300);
    T1_DRIVE.forwards(var_speed/2);
  }
}
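chart_Avoid's re-alignment trick converts the time the sonar takes to see the obstacle again into a proportional number of 5 ms counter-turn steps. The unit conversion alone can be sketched in Python (counter_turn_steps is a hypothetical name; the constants 10 and 5 are taken from the code above):

```python
def counter_turn_steps(t_start_ms, t_stop_ms):
    """Mirror chart_Avoid's timing arithmetic.

    The elapsed milliseconds between var_tstart and var_tstop are
    divided into 10 ms units (var_count); the robot then burns one
    5 ms delay per unit in the while(var_count>0) loop before
    resuming line following.
    """
    count = (t_stop_ms - t_start_ms) // 10
    total_delay_ms = count * 5
    return count, total_delay_ms

print(counter_turn_steps(1000, 1500))  # (50, 250)
```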
Coding:
import cv2

# reading the image
img = cv2.imread("face.jpg")
# the Haar cascade expects a grayscale image
gray_img = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
eyes = eye_cascade.detectMultiScale(gray_img)
Aim :- Using the robot to display its camera as a web app on a phone or desktop,
and then use the camera to drive smart color and face tracking behaviours.
Components needed:-
Proteus Simulator 8.9 and above
TFT LCD
Button
Raspberry Pi camera
Procedure:-
First, pick the components from the library and place them in the schematic chart.
Step 1:- Select New Project from the tab, and select a flowchart project.
Step 2:- Create a new folder and save the project by renaming it as camera.pdsprj
in Proteus.
Step 3:- Select a picture and place it in the resource file by right-clicking on
the Raspberry Pi in the flowchart view.
Step 4:- Once you click on Add Resource it will take you to the project folder; place a bitmap image
in it. The example shown here is butterflies.png.
Step 5:- Save it and then run the simulation; once it runs you will see the output as
shown below.
Step 6:- Once you click the button in the schematic view, you will see your image on the selected TFT LCD. For
displaying your image, please keep the laptop camera on; in case of a desktop, use a web camera.
The camera will capture the image and display the real-time image on the TFT display as shown above.
Conclusion :- Hence we have studied the camera function and displayed it using the web app.