
Modelling and Simulating Complex Projectile Motion

A. Joshi and S. Alam

1 Intercontinental Projectile Modelling

1.1 Tools to begin with

Before starting a project such as this, it is imperative to take inventory of the tools we have at our disposal, and then begin building a solution to the problem statement.

1.1.1 SFML

As with our previous applications, we decided to use SFML as our primary graphics library. We kept using SFML in order to finish this project faster, and to prevent ourselves from getting bogged down in the details of learning a new library.

Since we are not using OpenGL directly, we render purely on the CPU side, without the help of the GPU for rasterisation. This is a trade-off we were willing to make, as we were not looking to make a game, but rather a simulation.

1.1.2 ImGUI

As with SFML, we already had the code infrastructure to use ImGUI, and we decided to use it for the same reasons as SFML.

1.1.3 GLM

The only new addition to our toolset was GLM, which we chose for its vector and matrix operations. Had we written our own matrix and vector classes, we would not have been able to optimise our code as well as GLM does, which would have led to inefficiencies, especially since we are not using the GPU for these calculations.

1.2 Algorithm

1.2.1 Motivation

Our intention is to create a projectile launcher that works on a model of the Earth. We can decompose this problem into the following steps:

1. Be able to render a sphere
2. Be able to map a texture onto the sphere
3. Be able to draw a point on the sphere (as the launch point)
4. Be able to compute and draw the trajectory of the projectile
5. Be able to account for the rotation of the Earth
6. Be able to animate the projectile
1.2.2 Rendering a Sphere

To render a sphere, we must note that we have no way to render one directly (since we do not have access to OpenGL, where we could use gluSphere()). SFML does, however, provide a method for rendering polygons, so we thought of rendering the sphere in terms of polygons.

To do so, we must first introduce the idea of triangle subdivision. The idea is to take a triangle and divide it into smaller triangles (as the name would suggest). There are multiple ways of doing this (see [3] for more), and the general reasoning behind it is that we obtain a better, more refined representation of any polygon without having to store extra information¹. Figure 1 shows how, by subdividing a tetrahedron, we can approximate a smooth surface. Usually a few iterations of this process suffice to give a good approximation of the limit surface.

¹ Though we obviously take a memory and computation penalty for this, we can achieve a smooth limit surface.

The second idea we must introduce is normalisation with respect to a set distance. Normally, normalisation preserves the direction of a vector but scales it so that its magnitude is 1. Our normalisation is a bit different, because we do not end up with magnitude 1, but rather with a magnitude equal to a set distance.

Here is a two-dimensional example of normalisation with respect to a distance: figure 3 shows two points, A and B, and the line drawn between them. The distance between A and B is 6 units; suppose one were tasked to find the point on the line AB that is 12 units away from A (shown in figure 4 as point C). More generally, we can say that this point C will always be collinear with A and B, but is not necessarily on the line segment AB.

Staying with the two-dimensional story: if we were to draw a set of points P that all lie on a straight line not passing through the origin, and we were to normalise them with respect to the origin at a certain distance d, we would construct an arc of a circle with radius d, since all this exercise is, is drawing a set of points on a circle of radius d. It is then trivial to prove that the same holds in three dimensions².

² This is an exercise left to the reader.

The reason to go through such an exercise is to realise that we can start with an octahedron and subdivide it, yielding points that lie on the flat faces of the octahedron. We can then normalise these points to get points on a sphere, and we can obviously also control the radius of that sphere. To keep things simple, we use an octahedron, because it is comprised of 8 equilateral triangles, which are trivial to subdivide.

Now that we have points we can render, we need to convert them from 3D to 2D so that we can draw them on the screen. This is where GLM does most of the heavy lifting, in that we do not have to construct the equations for this manually, but can leave it up to GLM.

The Rendering Pipeline

Throughout this procedure we will be using 4D vectors (x, y, z, w) and 4×4 matrices. The reason for this is that we can use the fourth dimension to store information about the vector. I would recommend [2] as a guide to further understanding the intricate process described here.
Figure 1: Image shows the tetrahedron being subdivided 0, 1, 2, 6 times

Figure 2: shows the subdivision process on an equilateral triangle

Figure 3: shows two points, A and B, and the line drawn between them

Figure 4: shows the point C, which is 12 units away from A
The w component encodes what a vector represents:

• If w = 0, then the vector is a direction vector
• If w = 1, then the vector is a point

To begin, all of the points that describe a sphere are relative to the origin (obviously); however, this origin is not necessarily the origin of the world, but rather the origin of the model. To make the points relative to the world, we can apply a model matrix transformation. The model matrix consists of:

• A translation matrix – which describes the position of the object in the world relative to the origin of the world.
• A rotation matrix – which describes the orientation of the object in the world relative to the basis vectors of the world.
• A scaling matrix – which describes the size of the object in the world relative to the basis vectors of the world.

After applying the model matrix, our coordinates are in world space (points are defined relative to the origin of the world). Quote from Futurama:

'The engines don't move the ship at all. The ship stays where it is and the engines move the universe around it'

For example, if you want to view a mountain from a different angle, you can either move the camera or move the mountain. Whilst not practical in real life, the latter is easier and simpler in CG than the former. If your camera starts at the origin of world space and you want to move it 3 units to the right, this is equivalent to moving the entire world 3 units to the left instead. Mathematically, this amounts to describing everything in terms of the basis vectors defined relative to the camera, rather than in world space. This is the idea behind the view matrix.

Now that we are in camera space, we can start to project our points onto the screen. This is done by the projection matrix. We obviously have to use the x and y coordinates of the points to determine where to place them on the screen, but we must also use the z coordinate to determine which point should be drawn in front of which. The projection matrix converts the frustum of the camera to a cube, and then scales the cube to the screen. Figure 5 shows the steps described here. Once our coordinates have been projected onto the screen, we can render them using SFML by creating a vertex array and filling it with the projected points.

Figure 5: shows the steps taken to get screen coordinates

<<Rendering a Sphere>>=
<<Get subdivided octahedron>>
<<Map the octahedron onto a sphere>>
<<Get UV coordinate for a point xyz>>

sf::Texture texture = sf::Texture();
for (int i = 0; i < triangles.size(); i++) {
    glm::vec3 v1 = triangles[i].v1;
    glm::vec3 v2 = triangles[i].v2;
    glm::vec3 v3 = triangles[i].v3;

    glm::vec4 p1 = MVP * glm::vec4(v1, 1.0f);
    glm::vec4 p2 = MVP * glm::vec4(v2, 1.0f);
    glm::vec4 p3 = MVP * glm::vec4(v3, 1.0f);

    sf::VertexArray triangle(sf::Triangles, 3);
    triangle[0].position = sf::Vector2f(p1.x, p1.y);
    triangle[1].position = sf::Vector2f(p2.x, p2.y);
    triangle[2].position = sf::Vector2f(p3.x, p3.y);

    <<Set UV coordinates>>
    window.draw(triangle, &texture);
}

1.2.3 Mapping a texture onto the sphere

After the arduous task of getting the triangles we want onto the screen, we can move on to mapping a texture onto the sphere. To do so, we must introduce the idea of uv coordinates. These coordinates specify a location in a 2D source image (or some other 2D parameterised space). We need to find a mapping from a point on a 3D surface (in this case a sphere) onto uv coordinates.

uv coordinates are defined in the range [0, 1], where u is the horizontal coordinate and v is the vertical coordinate. This range allows them to be used with any texture, regardless of size, since they are relative to the size of the texture.

For spheres, surface coordinates are defined in terms of two angles θ and ϕ, where θ measures the angle made between the y axis and the point, and ϕ is the angle about the y axis³. To begin with, then⁴:

y = − cos(θ)
x = − cos(ϕ) sin(θ)
z = sin(ϕ) sin(θ)

³ Annoyingly, many textbook definitions of ϕ and θ are not only swapped, but the axes of measurement are also changed: we consider the "poles" of our sphere to lie on the y axis, whereas many textbooks put the "poles" on the z axis, which changes the equations in a subtle, yet frustrating-to-debug, manner.
⁴ Assuming a unit sphere.

From this we can infer that:

θ = arccos(−y)
ϕ = atan2(z, −x)

where atan2 is the four-quadrant inverse tangent function. It returns values in the range [−π, π]; however, these values go from 0 to π, then flip to −π and proceed back to 0. While mathematically correct, this cannot be used to map uv coordinates, since we want a smooth transition from 0 to 1. Fortunately,

atan2(a, b) = atan2(−a, −b) + π

This formulation gives values in the desired smooth range of [0, 2π], therefore

ϕ = atan2(z, −x) + π

Since we have our θ and ϕ values, we can now convert them to uv coordinates. This is done by:

u = ϕ / 2π
v = θ / π

Now that we have our uv coordinates, SFML interpolates between the coordinates defined at the vertices of each triangle, so we need not worry about interpolating the uv coordinates ourselves.
<<Get UV coordinate for a point xyz>>=
glm::vec2 getUV(glm::vec3 xyz) {
    float theta = acos(-xyz.y);
    float phi = atan2(xyz.z, -xyz.x) + M_PI;
    return glm::vec2(phi / (2 * M_PI), theta / M_PI);
}

<<Set UV coordinates>>=
glm::vec2 uv1 = getUV(v1);
glm::vec2 uv2 = getUV(v2);
glm::vec2 uv3 = getUV(v3);

triangle[0].texCoords = sf::Vector2f(uv1.x, 1 - uv1.y);
triangle[1].texCoords = sf::Vector2f(uv2.x, 1 - uv2.y);
triangle[2].texCoords = sf::Vector2f(uv3.x, 1 - uv3.y);

Interestingly, we have had to flip our v coordinate. This is because SFML reads texture coordinates from the top-left corner, rather than the bottom-left corner. This is a common convention in computer graphics, and is something to be aware of.

1.2.4 Drawing a point on the sphere

We decided that the user would be allowed to select a launch point (as this point acts as the starting point for our projectile). The easiest way for the user to do this is to select a latitude and longitude, as these are the most intuitive. The process from here is simply the inverse of the process described above. Also note that in our model, latitude/longitude (0, 0) is the point (0, 0, −1).

We can derive the equations by realising that, since our sphere revolves around the y axis, only the latitude component will affect the final y coordinate. Since our sphere is also centred at the origin, we can conclude that our y coordinate will be the sine of the latitude.

By the same logic, we can infer that the x and z coordinates will carry the cosine of the latitude, since the x and z coordinates are the projection of the point onto the xz plane. The longitude affects the x and z coordinates, since the longitude is the angle about the y axis: the x coordinate takes the cosine of the longitude, and the z coordinate the sine of the longitude. Therefore the equations are:

y = sin(latitude)
x = cos(latitude) cos(longitude)
z = cos(latitude) sin(longitude)

1.2.5 Computing and drawing the trajectory of the projectile

We gave the user the option to select the following configuration items for the projectile:

• Latitude
• Longitude
• Launch velocity
• Launch angle (cardinal)
• Elevation angle

The latitude and longitude are easy to understand, and the launch velocity is the speed at which the projectile is launched. The launch angle is the direction in which the projectile is launched, with reference to the Westerly direction. The elevation angle is the angle at which the projectile is launched, with reference to the horizon.

To visualise the last three parameters properly, suppose a local coordinate system in which the normal to the sphere is (by definition) orthogonal to the two other basis vectors of the coordinate system. We can define the z′ axis to be the normal to the sphere, and the x′ axis to be the basis vector 'facing' the Westerly direction. The y′ axis is then the basis vector facing the Northerly direction⁵.

⁵ This is to say that x′ and y′ are a proportional representation of the x and y axes of world space (since our sphere's poles are through the y axis).

We can create a local coordinate system by applying the cross product twice to the normal of the sphere. The implementation we used is a derivation of the one defined in [4].

<<Get local coordinate system>>=
void CoordinateSystem(const glm::vec3 &v1, glm::vec3 *v2, glm::vec3 *v3)
{
    *v2 = glm::vec3(-v1.z, 0, v1.x) /
          std::sqrt(v1.x * v1.x + v1.z * v1.z);
    *v3 = glm::cross(v1, *v2);
}

From this, we can describe all possible directions in which the projectile can be thrown as a hemisphere whose radius is the launch velocity. Further, we can define the launch angle to be the 'longitude' and the elevation angle to be the 'latitude' of this hemisphere. We can then use the formulation given in [5] to find the components of the velocity vector with reference to the local coordinate system⁶:

vx = v cos(elevation) cos(launch)
vy = v cos(elevation) sin(launch)
vz = v sin(elevation)

⁶ Again, note that vz uses the sine where [5] has the cosine. This is because we want zero elevation to be the horizon, and not the zenith.

Now that we know the velocity vector of the projectile, we can use Verlet integration to find the position of the projectile at any given time. We can infer the direction in which gravity acts, since it points into the centre of the sphere. Since our sphere is centred at (0, 0, 0), we know that the direction is simply the negative of the position vector of the projectile.

Next we can calculate the acceleration due to gravity using the formula:

a = GM / r²

One inaccuracy of our model is that our mass of the Earth (or Mars, or the Moon) M is scaled, and not accurate to the real mass of the planet. Our scaled mass was calculated as:

M = g r² / G

where g is the acceleration due to gravity on the surface of the planet, r is the radius of the planet (in our scaled version), and G is the gravitational constant.

We can then calculate the acceleration due to gravity as:

float distanceFromCenter =
    glm::distance(glm::vec3(0, 0, 0), xyzPosition);
g = launchControlSettings.bigG * planetMass /
    (distanceFromCenter * distanceFromCenter);

acceleration = -g * difference;

This integrates into the rest of the code as follows:

float g = launchControlSettings.bigG * planetMass /
    (launchControlSettings.radius * launchControlSettings.radius);

glm::vec3 difference = glm::normalize(xyzPosition);
glm::vec3 acceleration = -g * difference;

int numPoints = 0;
int maxPoints = 1000;
float dt = 0.001f;
while (glm::distance(glm::vec3(0, 0, 0), xyzPosition) >=
           launchControlSettings.radius &&
       numPoints < maxPoints) {
    // update our position
    xyzPosition += xyzVelocity * dt + 0.5f * acceleration * dt * dt;

    // adjust our position based on rotation
    xyzPosition = adjustforRotation(
        xyzPosition,
        launchControlSettings.angularVelocity,
        dt, numPoints);

    // update our velocity
    xyzVelocity += acceleration * dt;

    // update our acceleration: gravity acts along the normalised
    // difference between the position and the centre of the earth
    difference = glm::normalize(xyzPosition);
    float distanceFromCenter =
        glm::distance(glm::vec3(0, 0, 0), xyzPosition);
    g = launchControlSettings.bigG * planetMass /
        (distanceFromCenter * distanceFromCenter);

    acceleration = -g * difference;

    points.push_back(xyzPosition);
    numPoints++;
}

Note how we keep track of the number of points, as we do not want to calculate the trajectory of the projectile indefinitely, causing memory and computation issues later on (plus, SFML's draw calls for more than 1000 points are not the most efficient).

1.2.6 Accounting for the rotation of the Earth

We account for the rotation of the Earth by shifting each point on the trajectory by the amount that the Earth has rotated in the time taken for the projectile to reach that point. This is done by:

// This function calculates the current spherical coordinates of
// the projectile, and takes away a component of phi with respect
// to the angular velocity
glm::vec3 adjustforRotation(glm::vec3 currentPos, float angularVel,
                            float dt, int pointIndex) {
    float length = glm::length(currentPos);
    currentPos = glm::normalize(currentPos);
    float theta = std::acos(-currentPos.y) -
                  glm::pi<float>() / 2.f;
    float phi = std::atan2(-currentPos.z, currentPos.x);
    phi -= angularVel * dt * pointIndex;
    return getCartesian(glm::degrees(theta),
                        glm::degrees(phi), 1) * length;
}

It is worth noting that we could instead have simply moved the landing position of the projectile by the amount the Earth had rotated (as described in [1]), as an alternative method of accounting for the rotation of the Earth.

1.2.7 Animating the projectile

The most trivial process of the entire algorithm is the animation of the projectile. This is done by dividing a fixed time limit (e.g. 5 seconds) by the number of points in the trajectory, and then waiting for that much time to pass before moving on to the next point.

float timePerPoint = 5.f / projectilePath.size();
if (animatationClock.getElapsedTime().asSeconds() >= timePerPoint)
{
    currentAnimatedPoint++;
    if (currentAnimatedPoint >= projectilePath.size()) {
        launchControlSettings.isAnimated = false;
        currentAnimatedPoint = 0;
    }
    animatationClock.restart();
}
for (int i = 0; i < currentAnimatedPoint; i++)
{
    glm::vec3 transformedPoint =
        glm::vec3(mvp * glm::vec4(projectilePath[i], 1.0f));

    transformedPoint.x = (transformedPoint.x + 1.0f) * 0.5f * RENDER_WIDTH;
    transformedPoint.y = (transformedPoint.y + 1.0f) * 0.5f * RENDER_HEIGHT;

    sf::CircleShape circle(2);
    if (transformedPoint.z > 4.8) {
        circle.setFillColor(sf::Color::Red);
    } else {
        circle.setFillColor(sf::Color::Magenta);
    }
    circle.setPosition(transformedPoint.x, transformedPoint.y);
    window.draw(circle);
}

1.3 Results

Our model is a good approximation of the real world, and can be used to simulate the trajectory of a projectile on the Earth, Mars, and the Moon. The model is not perfect, and there are some inaccuracies, such as the scaled masses of the planets, and the fact that our model accounts for neither the Earth's true shape nor its atmosphere. One big problem with our model is that there appears to be artefacting of the texture on the sphere. We believe this to be an issue with SFML's texture interpolation, and we are not sure how to fix it. We have tried increasing the resolution of the sphere, but this has not fixed the issue.
References

[1] Department of Applied Mathematics and Theoretical Physics. "Deflection of a Projectile due to the Earth's Rotation". url: https://www.damtp.cam.ac.uk/user/reh10/lectures/ia-dyn-handout14.pdf.

[2] Essence of Linear Algebra. http://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab. (Visited on 08/09/2024).

[3] Jos Stam. "Evaluation of Loop Subdivision Surfaces". 2010. url: https://api.semanticscholar.org/CorpusID:8420692.

[4] Vectors. https://pbr-book.org/3ed-2018/Geometry_and_Transformations/Vectors#Coordina. (Visited on 08/12/2024).

[5] Eric W. Weisstein. Spherical Coordinates. https://mathworld.wolfram.com/. (Visited on 08/12/2024).
