Vector Representation: An Essential in Machine Learning


Why are matrices with dimensions N x 1 called vectors?

If you are a machine learning engineer, one thing that you will work with day in and day out is a vector. Vectors are ubiquitous in machine learning.

If you have taken any college-level physics or engineering course, you probably think of a vector as something that has both magnitude and direction: the length of the vector is its magnitude, and its orientation is its direction.

For example, speed is a scalar quantity, but velocity is a vector because it has both magnitude and direction. A quantity is treated as a vector when both its magnitude and its direction are needed to get a complete picture of its effect. To give another example, the distance you travel is a scalar, but your displacement is a vector.
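
To make the distinction concrete, here is a minimal sketch using NumPy (the library choice and the numbers are purely illustrative, not something the article prescribes) that separates a velocity vector into the scalar speed, which is its magnitude, and a unit vector giving its direction:

```python
import numpy as np

# A velocity vector in 2D: moving 3 m/s east and 4 m/s north (illustrative values).
velocity = np.array([3.0, 4.0])

# Speed is the magnitude of the velocity vector, a scalar.
speed = np.linalg.norm(velocity)   # 5.0

# Direction can be expressed as a unit vector pointing the same way.
direction = velocity / speed       # [0.6, 0.8]

print(speed, direction)
```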

Another example of a vector is an area. How? At first glance, an area is just the measure of a two-dimensional surface, and in many cases we do not care about the direction of that surface. But sometimes we do. Suppose you want to buy a plot of land to build a house. In most cases the plot is flat and the direction of its area points straight up (the direction of an area is perpendicular to the surface). But if the plot is on a mountainside, the direction is not straight up but slanted, and you would probably level the land before you start building the house.
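
The idea of an area having a direction can be made concrete with the cross product: for a flat surface spanned by two edge vectors, the cross product has the area as its magnitude and the surface normal as its direction. A small NumPy sketch, with made-up edge lengths:

```python
import numpy as np

# Two edge vectors spanning a flat rectangular plot of land (illustrative values, in metres).
edge_a = np.array([30.0, 0.0, 0.0])
edge_b = np.array([0.0, 20.0, 0.0])

# The cross product gives the area vector: its magnitude is the area
# and its direction is perpendicular (normal) to the surface.
area_vector = np.cross(edge_a, edge_b)   # [0. 0. 600.]
area = np.linalg.norm(area_vector)       # 600.0 square metres

print(area_vector, area)
```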

Vector Components and Basis Vectors

Consider a Cartesian coordinate system in 3D space. The thing to remember about coordinates is that they come with coordinate basis vectors.

The coordinate basis vectors have unit length in whatever units the coordinate system is measured in, and each one points along one of the coordinate axes. In a three-dimensional system, î, ĵ and k̂ are conventionally the basis vectors along the x-axis, y-axis and z-axis respectively.
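
In NumPy the standard basis can be written out explicitly. This small sketch simply checks the two properties just described: each basis vector has unit length, and they are mutually perpendicular.

```python
import numpy as np

# Standard basis vectors of a 3D Cartesian coordinate system.
i_hat = np.array([1.0, 0.0, 0.0])
j_hat = np.array([0.0, 1.0, 0.0])
k_hat = np.array([0.0, 0.0, 1.0])

# Each basis vector has unit length ...
print(np.linalg.norm(i_hat), np.linalg.norm(j_hat), np.linalg.norm(k_hat))  # 1.0 1.0 1.0

# ... and they are mutually perpendicular (all pairwise dot products are zero).
print(np.dot(i_hat, j_hat), np.dot(j_hat, k_hat), np.dot(i_hat, k_hat))     # 0.0 0.0 0.0
```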

Once you have the coordinates and the unit vectors in place, you are in a position to find the components of your vector. One way to visualize a component is to ask: how far do I have to go in the x direction, how far in the y direction, and how far in the z direction? In other words, how many î, ĵ and k̂ unit vectors are required?

Taking the example shown in the diagram above, 2 î unit vectors, 3 ĵ unit vectors and 5 k̂ unit vectors are a valid representation of the vector shown. And if you know the basis vectors, then of course you do not even need to write them out. You can simply take the values 2, 3 and 5, stack them up in order (a set is not a good data structure here because we want to preserve the sequence) and put a nice pair of parentheses around them:

2î + 3ĵ + 5k̂ = (2, 3, 5)ᵀ
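
The same point can be checked directly in NumPy: writing the vector out in full against the basis and simply stacking the raw components give the same array.

```python
import numpy as np

i_hat = np.array([1.0, 0.0, 0.0])
j_hat = np.array([0.0, 1.0, 0.0])
k_hat = np.array([0.0, 0.0, 1.0])

# The vector written out in full against the basis: 2i + 3j + 5k.
v_full = 2 * i_hat + 3 * j_hat + 5 * k_hat

# The shorthand: just stack the components; the basis is understood implicitly.
v_short = np.array([2.0, 3.0, 5.0])

print(np.array_equal(v_full, v_short))   # True
```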

So the left-hand side of the equation above, which we know to be a vector in the traditional sense, has resolved into a 3 x 1 matrix, because the space we are considering is three-dimensional. Similarly, a vector in n dimensions can be written as an n x 1 matrix. That is why a matrix whose second dimension is 1 is called a vector. If you understand this, you will go a long way towards understanding the matrix and high-dimensional-space manipulations commonly done in machine learning and AI.
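
Here is a minimal sketch of that idea in NumPy: the component array reshaped into an n x 1 column matrix, which is the form that slots into the matrix multiplications used throughout machine learning (the matrix A below is arbitrary, chosen only for illustration).

```python
import numpy as np

v = np.array([2.0, 3.0, 5.0])

# Viewed as an n x 1 matrix (a column vector): shape (3, 1).
v_col = v.reshape(-1, 1)
print(v_col.shape)        # (3, 1)

# This is the form that plugs directly into matrix algebra, e.g. a
# linear map A acting on v (A is an arbitrary 2 x 3 matrix here).
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])
print(A @ v_col)          # a 2 x 1 result
```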
