EIS Lect 20210322 Notes V2
Consists of
• Embedded Systems and
• Intelligent Systems.
• Program Execution
• Timer/Counter control
• PWM control
• ext./int. Interrupts
• ADC/DAC control
• Interface controls (RS-232, USB, I²C, CAN, HDMI, Digital I/O)
µC: ATmega MCUs, 8051 MCUs, Arduino, myRIO, ……..
(Figure: design and test process: FDS → MDS → Module Test → Integration Test)
FDS: Functional Design Specifications -> Develop a functional model (e.g. a block diagram),
group functions into modules (subroutines), and define interfaces between modules,
such as return parameters or shared variables. Functions which are used by
several modules should be put into a library for common use (a minimal interface sketch follows below).
MDS: Module Design Specifications -> Develop a detailed model of each module,
i.e. a detailed block diagram or flow chart.
Module Test: Separate (!) test of the modules with parameters which represent test scenarios.
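A minimal sketch of such a module interface as a C header file (the file name distance_sensor.h and the functions sensor_init / sensor_read_mm are hypothetical, for illustration only):

/* distance_sensor.h - hypothetical module interface shared by several modules */
#ifndef DISTANCE_SENSOR_H
#define DISTANCE_SENSOR_H

#include <stdint.h>

/* return parameter: 0 on success, negative value on error */
int8_t sensor_init(void);

/* return parameter: measured distance in millimetres */
uint16_t sensor_read_mm(void);

#endif /* DISTANCE_SENSOR_H */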
(Figure: example setup with an LCD and an SRF08 ultrasonic sensor on the I2C bus)
ATmega 2560 Specifications
int main(void) {
    ……
}

ISR(xxx_vect) {
    …..
}
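A minimal, compilable example of this structure (avr-gcc; the choice of Timer 1 and the LED pin PB7 are assumptions for illustration, not part of the lecture):

#include <avr/io.h>
#include <avr/interrupt.h>

ISR(TIMER1_OVF_vect)            /* interrupt service routine: Timer 1 overflow */
{
    PORTB ^= (1 << PB7);        /* toggle an LED on port B */
}

int main(void)
{
    DDRB   |= (1 << PB7);       /* LED pin as output */
    TCCR1B  = (1 << CS12);      /* start Timer 1 with prescaler 256 */
    TIMSK1  = (1 << TOIE1);     /* enable the Timer 1 overflow interrupt */
    sei();                      /* enable global interrupts */

    while (1) {
        /* main loop: background tasks */
    }
}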
myRIO (National Instruments)
with RTOS
Programming Language:
LabVIEW
v = s / t        (t: time of flight)
s = v t
d = s / 2 = v t / 2
d = (340 m/s * t) / 2
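A small C sketch of this calculation (assuming the echo time has been measured in microseconds; the variable names are illustrative):

/* distance in metres from the ultrasonic time of flight t_us (in microseconds) */
float distance_m(float t_us)
{
    float t = t_us * 1e-6f;         /* time of flight in seconds */
    return (340.0f * t) / 2.0f;     /* d = (340 m/s * t) / 2: sound travels to the object and back */
}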
int error()
{
    // print error message on the LCD
    lcd_clrscr();
    lcd_puts("Transmission failed");
}
V0 ~ 1 / d    (figure: the sensor output voltage is connected as VIN to the ADC)
Period
PC-Program can
be placed here
RT-Program
FPGA-Program
Set Period
Set Period
Resolution 8 Megapixels
z is the distance, i.e. z = d
Image Formation
• xc = f x/z, yc = f y/z    (x, y: object; xc, yc: imaged object)
• tan Θ = x / z = u z / (z s) = u / s
• Θ = arctan (u / s)
Distance by View Angle
• The camera is mounted at the height hcamera above the floor, with its
main axis parallel to the floor.
• The bottom of the object is recognized at pixel index v relative to
the main camera axis.
• tan Θ = hcamera / z and tan Θ = v / s
• z = h s / v
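A short sketch of this calculation in C (the scale factor s and the variable names are assumptions for illustration):

/* distance by view angle: z = h * s / v */
float distance_view_angle(float h_camera, float s, float v)
{
    return (h_camera * s) / v;      /* v: pixel index of the object bottom below the main axis */
}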
Distance by Stereo Vision
Stereo Vision uses images taken by two
cameras mounted at a distance a next to
each other. The same point P(x,y,z) is imaged
to different pixels in camera 1 (u1, v) and
camera 2 (u2, v).
u1 = s x/z and u2 = s (x+a)/z
-> u2 – u1 = s a/z
z = s a/(u2 - u1)
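A corresponding C sketch (variable names are illustrative; the disparity must be non-zero):

/* distance by stereo vision: z = s * a / (u2 - u1) */
float distance_stereo(float s, float a, float u1, float u2)
{
    float disparity = u2 - u1;      /* pixel disparity between the two cameras */
    return (s * a) / disparity;
}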
Distance by Multiple Measurements
• Measurement of the same object at
different distances z and z + Δz (Δz is
known, e.g. the step length of a robot)
• z = Δz /(u2/u1 – 1)
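A C sketch of this estimate (assuming u2 is the image size measured at distance z and u1 the image size measured at z + Δz; names are illustrative):

/* distance from two measurements: z = dz / (u2/u1 - 1) */
float distance_two_measurements(float dz, float u1, float u2)
{
    return dz / (u2 / u1 - 1.0f);
}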
Distance by Structured Illumination:
Kinect Camera
• Methods:
- Feature-based Image Processing
- Pattern Matching
- Deep Learning
Computer Vision (CV)
from time import sleep
from picamera import PiCamera
import cv2

camera = PiCamera()
camera.start_preview()
sleep(5)
camera.capture('/home/pi/Desktop/image.jpg')
camera.stop_preview()

Img = cv2.imread('/home/pi/Desktop/image.jpg', 1)    # read as colour (BGR) image
Grey = cv2.cvtColor(Img, cv2.COLOR_BGR2GRAY)         # convert to greyscale
Feature-based CV / Preprocessing
Rectangular / Binomial Filter
import numpy as np

Img = cv2.imread('/home/pi/Desktop/image.jpg', 0)    # read as greyscale image
kernel = np.ones((5,5), np.float32)/25               # 5x5 rectangular (box) filter kernel
Img_filter = cv2.filter2D(Img, -1, kernel)           # smooth the image
ret, Thresh1 = cv2.threshold(Img_filter, 90, 255,
                             cv2.THRESH_BINARY_INV)  # inverted binary threshold
Centroid: sx = m10 / m00, sy = m01 / m00

contours, hierarchy = cv2.findContours(Thresh1, 1, 2)
cnt = contours[0]                        # first detected contour
M = cv2.moments(cnt)                     # moments of the contour
cx = int(M['m10']/M['m00'])              # centroid x
cy = int(M['m01']/M['m00'])              # centroid y
Discrimination function of class i (Bayes theorem):
Di(x) = P(x|i) P(i) / P(x)
Causal direction: P(x|i), modelled as a multivariate Gaussian distribution.
Bayes Classifier = Negative Squared Mahalanobis Distance
Naive Bayes Classifier: the features are statistically independent, i.e. K is a diagonal matrix.
(Figure: likelihood of false classifications for a given wi)
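A minimal C sketch of the naive case (diagonal covariance matrix K, i.e. only the feature variances are used; the feature dimension N and all names are assumptions for illustration):

#define N 3   /* number of features (assumption) */

/* negative squared Mahalanobis distance of feature vector x to one class,
   assuming statistically independent features (diagonal K with variances var[]) */
float discriminant(const float x[N], const float mean[N], const float var[N])
{
    float d2 = 0.0f;
    for (int k = 0; k < N; k++) {
        float diff = x[k] - mean[k];
        d2 += diff * diff / var[k];
    }
    return -d2;   /* the class with the largest discriminant value is selected */
}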
(Two timing diagrams over t/ms (4, 8, 12, 16, 20): execution of Task 0 and Task 1, their periods (Period Task 0, Period Task 1), and the request times of Task 0.)
(Figure: the transmitted power P(z) spreads over an area modelled as the surface of a sphere with center at z = 0; s = z – Δs, and s ≈ z if Δs << z.)
Sonic Power / Intensity Losses
Definitions
Pt: Power transmitted to object, Pt = P(s)
It: Intensity transmitted to object (Power/Area)
(the wave front propagates as a spherical surface)
Pr: Power received at receiver plane
Ir: Intensity received at receiver plane
Pd: Power measured by receiver transducer
R0: Reflectivity of object
A0: Area of object
Ad: Area of transducer surface
Sonic Power / Intensity Losses
Power detected (received) by transducer
It = Pt / (4 π s²)              (spherical wave front assumed)
Pr = It R0 A0                   (power in the receiver plane = power reflected by the object)
Ir = Pr / (4 π s²)              (spherical wave front assumed)
Pd = Ir Ad                      (power detected by the transducer)
Pd = Pt R0 A0 Ad / (16 π² s⁴)
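A short numeric sketch of the last equation in C (the parameter values would come from the sensor data sheet; none are given here):

/* power detected by the transducer: Pd = Pt * R0 * A0 * Ad / (16 * pi^2 * s^4) */
double power_detected(double Pt, double R0, double A0, double Ad, double s)
{
    const double pi = 3.14159265358979;
    return (Pt * R0 * A0 * Ad) / (16.0 * pi * pi * s * s * s * s);
}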
TWCR = (1<<TWINT)|(1<<TWSTO)|(1<<TWEN); // I2C (TWI): clear the interrupt flag and send a stop condition