    The Geometry Behind the Dot Product: Unit Vectors, Projections, and Intuition

By Editor Times Featured · April 6, 2026 · 11 min read



This article is the first of three parts. Each part stands on its own, so you don't have to read the others to understand it.

The dot product is one of the most important operations in machine learning – but it's hard to understand without the right geometric foundations. In this first part, we build those foundations:

· Unit vectors

· Scalar projection

· Vector projection

Whether you're a student learning linear algebra for the first time, or want to refresh these concepts, I recommend you read this article.

In fact, we will introduce and explain the dot product in this article, and in the next article, we will explore it in greater depth.

The vector projection section is included as an optional bonus: helpful, but not necessary for understanding the dot product.

The next part explores the dot product in greater depth: its geometric meaning, its relationship to cosine similarity, and why the difference matters.

The final part connects these ideas to two major applications: recommendation systems and NLP.


A vector $\vec{\mathbf{v}}$ is called a unit vector if its magnitude is 1:

$$|\vec{\mathbf{v}}| = 1$$

To remove the magnitude of a non-zero vector while keeping its direction, we can normalize it. Normalization scales the vector by the factor:

$$\frac{1}{|\vec{\mathbf{v}}|}$$

The normalized vector $\hat{\mathbf{v}}$ is the unit vector in the direction of $\vec{\mathbf{v}}$:

$$\hat{\mathbf{v}} = \frac{\vec{\mathbf{v}}}{|\vec{\mathbf{v}}|}$$

Notation 1. From now on, whenever we normalize a vector $\vec{\mathbf{v}}$, or write $\hat{\mathbf{v}}$, we assume that $\vec{\mathbf{v}} \neq 0$. This notation, along with those that follow, also applies to the following articles.

This operation naturally separates a vector into its magnitude and its direction:

$$\vec{\mathbf{v}} = \underbrace{|\vec{\mathbf{v}}|}_{\text{magnitude}} \cdot \underbrace{\hat{\mathbf{v}}}_{\text{direction}}$$

Figure 1 illustrates this idea: $\vec{\mathbf{v}}$ and $\hat{\mathbf{v}}$ point in the same direction, but have different magnitudes.

Figure 1 – Separating "How Much" from "Which Way". Any vector can be written as the product of its magnitude and its unit vector, which preserves direction but has length 1. Image by Author (created using Claude).
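Normalization and the magnitude-direction decomposition are easy to verify numerically. Below is a minimal NumPy sketch; the vector `v` is an arbitrary example, not taken from the figures:

```python
import numpy as np

# An arbitrary non-zero example vector.
v = np.array([3.0, 4.0])

magnitude = np.linalg.norm(v)   # |v| = 5.0 for this example
v_hat = v / magnitude           # the unit vector in the direction of v

print(np.linalg.norm(v_hat))             # 1.0 (unit length)
print(np.allclose(v, magnitude * v_hat))  # True: v = |v| * v_hat
```

The last line checks the decomposition from the boxed formula above: the original vector is recovered as magnitude times direction.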

    Similarity of unit vectors

In two dimensions, all unit vectors lie on the unit circle (radius 1, centered at the origin). A unit vector that forms an angle θ with the x-axis has coordinates (cos θ, sin θ).

This means the angle between two unit vectors encodes a natural similarity score; as we will show shortly, this score is exactly cos θ: equal to 1 when they point the same way, 0 when perpendicular, and −1 when opposite.

Notation 2. Throughout this article, θ denotes the smallest angle between the two vectors, so $0° \leq \theta \leq 180°$.

In practice, we don't know θ directly – we know the vectors' coordinates.

We can show why the dot product of two unit vectors $\hat{a}$ and $\hat{b}$ equals cos θ using a geometric argument in three steps:

1. Rotate the coordinate system until $\hat{b}$ lies along the x-axis. Rotation doesn't change angles or magnitudes.

2. Read off the new coordinates. After rotation, $\hat{b}$ has coordinates (1, 0). Since $\hat{a}$ is a unit vector at angle θ from the x-axis, the unit circle definition gives its coordinates as (cos θ, sin θ).

3. Multiply corresponding components and sum:

$$\hat{a} \cdot \hat{b} = a_x b_x + a_y b_y = \cos\theta \cdot 1 + \sin\theta \cdot 0 = \cos\theta$$

This sum of component-wise products is called the dot product:

$$\vec{a} \cdot \vec{b} = a_1 b_1 + a_2 b_2 + \cdots + a_n b_n$$

See the illustration of these three steps in Figure 2 below:

Figure 2 – By rotating our perspective to align with the x-axis, the coordinate math simplifies beautifully to reveal why the two unit vectors' dot product equals cos(θ). Image by Author (created using Claude).

Everything above was shown in 2D, but the same result holds in any number of dimensions. Any two vectors, no matter how many dimensions they live in, always lie in a single flat plane. We can rotate that plane to align with the xy-plane, and from there, the 2D proof applies exactly.
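The claim that the component-wise dot product of two unit vectors equals cos θ can be checked numerically. A short NumPy sketch, with two arbitrary example angles (20° and 80°, so θ = 60° between them):

```python
import numpy as np

# Two unit vectors on the unit circle, defined by arbitrary example angles.
alpha, beta = np.deg2rad(20.0), np.deg2rad(80.0)
a_hat = np.array([np.cos(alpha), np.sin(alpha)])
b_hat = np.array([np.cos(beta), np.sin(beta)])

theta = beta - alpha  # the angle between them: 60 degrees

# Component-wise dot product matches cos(theta) = cos(60 deg) = 0.5.
print(np.dot(a_hat, b_hat))                              # ~0.5
print(np.isclose(np.dot(a_hat, b_hat), np.cos(theta)))   # True
```

No rotation is needed in code: the identity holds in the original coordinates, exactly because rotation preserves the dot product.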

Notation 3. In the diagrams that follow, we often draw one of the vectors (usually $\vec{b}$) along the horizontal axis. When $\vec{b}$ is not already aligned with the x-axis, we can always rotate our coordinate system as we did above (the "rotation trick"). Since rotation preserves all lengths, angles, and dot products, every formula derived in this orientation holds for any direction of $\vec{b}$.


A vector can contribute in many directions at once, but often we care about only one direction.

Scalar projection answers the question: How much of $\vec{a}$ lies along the direction of $\vec{b}$?

This value is negative if the projection points in the opposite direction of $\vec{b}$.

The Shadow Analogy

The most intuitive way to think about scalar projection is as the length of a shadow. Imagine you hold a stick (vector $\vec{a}$) at an angle above the ground (the direction of $\vec{b}$), and a light source shines straight down from above.

The shadow that the stick casts on the ground is the scalar projection.

The animated figure below illustrates this idea:

Figure 3 – Scalar projection as a shadow.
The scalar projection measures how much of vector a lies in the direction of b.
It equals the length of the shadow that a casts onto b (Woo, 2023). The GIF was created by Claude.

    Calculation

Imagine a light source shining straight down onto the line PS (the direction of $\vec{b}$). The "shadow" that $\vec{a}$ (the arrow from P to Q) casts onto that line is exactly the segment PR. You can see this in Figure 4.

Figure 4: Measuring Directional Alignment. The scalar projection (segment PR) visually answers the core question: "How much of vector a lies in the exact direction of vector b?" Image by Author (created using Claude).

Deriving the formula

Now look at the triangle $PQR$: the perpendicular drop from $Q$ creates a right triangle, and its sides are:

• $PQ = |\vec{a}|$ (the hypotenuse).
• $PR$ (the adjacent side – the shadow).
• $QR$ (the opposite side – the perpendicular component).

From this triangle:

1. The angle between $\vec{a}$ and $\vec{b}$ is θ.
2. $\cos(\theta) = \frac{PR}{|\vec{a}|}$ (the most basic definition of cosine).
3. Multiply both sides by $|\vec{a}|$:

$$PR = |\vec{a}| \cos(\theta)$$

The segment $PR$ is the shadow length – the scalar projection of $\vec{a}$ on $\vec{b}$.

When θ > 90°, the scalar projection becomes negative. Think of the shadow as flipping to the opposite side.

How is the unit vector related?

The shadow's length (PR) doesn't depend on how long $\vec{b}$ is. It depends on $|\vec{a}|$ and on θ.

When you compute $\vec{a} \cdot \hat{b}$, you're asking: how much of $\vec{a}$ lies along $\vec{b}$'s direction? That is the shadow length.

The unit vector acts like a direction filter: multiplying $\vec{a}$ by it extracts the component of $\vec{a}$ along that direction.

Let's see it using the rotation trick. We place $\hat{b}$ along the x-axis:

$$\vec{a} = (|\vec{a}|\cos\theta,\ |\vec{a}|\sin\theta)$$

and:

$$\hat{b} = (1, 0)$$

Then:

$$\vec{a} \cdot \hat{b} = |\vec{a}|\cos\theta \cdot 1 + |\vec{a}|\sin\theta \cdot 0 = |\vec{a}|\cos\theta$$

The scalar projection of $\vec{a}$ in the direction of $\vec{b}$ is:

$$|\vec{a}|\cos\theta = \vec{a} \cdot \hat{b} = \frac{\vec{a} \cdot \vec{b}}{|\vec{b}|}$$
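The two forms of the scalar projection are easy to compare in code. A minimal NumPy sketch with arbitrary example vectors (here $\vec{b}$ points along the x-axis, so the result can be read off directly):

```python
import numpy as np

# Arbitrary example vectors; b lies along the x-axis for easy checking.
a = np.array([2.0, 3.0])
b = np.array([4.0, 0.0])

b_hat = b / np.linalg.norm(b)                 # direction filter
scalar_proj = np.dot(a, b_hat)                # a . b_hat
same_thing = np.dot(a, b) / np.linalg.norm(b) # (a . b) / |b|

print(scalar_proj)                     # 2.0: the x-component of a
print(np.isclose(scalar_proj, same_thing))  # True

# When theta > 90 deg, the shadow flips and the value goes negative:
print(np.dot(np.array([-2.0, 3.0]), b_hat))  # -2.0
```

Note that the answer doesn't change if `b` is rescaled, since `b_hat` stays the same; the shadow length depends only on `a` and θ, as stated above.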


We apply the same rotation trick one more time, now with two general vectors $\vec{a}$ and $\vec{b}$.

After rotation:

$$\vec{a} = (|\vec{a}|\cos\theta,\ |\vec{a}|\sin\theta),$$

$$\vec{b} = (|\vec{b}|,\ 0)$$

so:

$$\vec{a} \cdot \vec{b} = |\vec{a}|\cos\theta \cdot |\vec{b}| + |\vec{a}|\sin\theta \cdot 0 = |\vec{a}||\vec{b}|\cos\theta$$

The dot product of $\vec{a}$ and $\vec{b}$ is:

$$\vec{a} \cdot \vec{b} = a_1 b_1 + \cdots + a_n b_n = \sum_{i=1}^{n} a_i b_i = |\vec{a}||\vec{b}|\cos\theta$$
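In practice, this identity is usually used in the other direction: given only coordinates, we can recover cos θ (and θ itself) by rearranging it. A small sketch with arbitrary example vectors:

```python
import numpy as np

# Arbitrary example vectors; the angle between them is 45 degrees.
a = np.array([1.0, 0.0, 0.0])
b = np.array([1.0, 1.0, 0.0])

# Rearranging a . b = |a||b|cos(theta) to solve for cos(theta).
cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Clip guards against tiny floating-point overshoot outside [-1, 1].
theta_deg = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

print(round(theta_deg))  # 45
```

This works in any number of dimensions, which is exactly the point of the plane-rotation argument above.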


Vector projection extracts the portion of vector $\vec{a}$ that points along the direction of vector $\vec{b}$.

The Trail Analogy

Imagine two trails starting from the same point (the origin):

• Trail A leads to a whale-watching spot.
• Trail B leads along the coast in a different direction.

Here's the question projection answers:

You're only allowed to walk along Trail B. How far should you walk so that you end up as close as possible to the endpoint of Trail A?

You walk along B, and at some point, you stop. From where you stopped, you look toward the end of Trail A, and the line connecting you to it forms a perfect 90° angle with Trail B. That's the key geometric fact – the closest point is always where you'd make a right-angle turn.

The spot where you stop on Trail B is the projection of A onto B. It represents "the part of A that goes in B's direction."

The remaining gap – from your stopping point to the actual end of Trail A – is everything about A that has nothing to do with B's direction. This example is illustrated in Figure 5 below: the vector that starts at the origin, points along Trail B, and ends at the closest point is the vector projection of $\vec{a}$ onto $\vec{b}$.

Figure 5 – Vector projection as the closest point to a direction.
Walking along trail B, the closest point to the endpoint of A occurs where the connecting segment forms a right angle with B. This point is the projection of A onto B. Image by Author (created using Claude).

Scalar projection answers: "How far did you walk?"

That's just a distance, a single number.

Vector projection answers: "Where exactly are you?"

More precisely: "What is the exact movement along Trail B that gets you to that closest point?"

Now "1.5 kilometers" isn't enough; you have to say "1.5 kilometers east along the coast." That's a distance plus a direction: an arrow, not just a number. The arrow starts at the origin, points along Trail B, and ends at the closest point.

The distance you walked is the scalar projection value. The magnitude of the vector projection equals the absolute value of the scalar projection.

The unit vector answers: "Which direction does Trail B go?"

That's exactly what $\hat{b}$ represents. It's Trail B stripped of any length information – just the pure direction of the coast.

$$\text{vector projection} = \underbrace{(\text{how far you walk})}_{\text{scalar projection}} \times \underbrace{(\text{B direction})}_{\hat{b}}$$

I know the whale analogy is very specific; it was inspired by this nice explanation (Michael.P, 2014).

Figure 6 below shows the same shadow diagram as in Figure 4, with PR drawn as an arrow, because the vector projection is a vector (with both length and direction), not just a number.

Figure 6 – Vector projection as a directional shadow.
Unlike scalar projection (a length), the vector projection is an arrow along vector b. Image by Author (created using Claude).

Since the projection must lie along $\vec{b}$, we need two things for $\vec{PR}$:

1. Its magnitude is the scalar projection: $|\vec{a}|\cos\theta$
2. Its direction is $\hat{b}$ (the direction of $\vec{b}$)

Any vector equals its magnitude times its direction (as we saw in the Unit Vector section), so:

$$\vec{PR} = \underbrace{|\vec{a}| \cos \theta}_{\text{scalar projection}} \cdot \underbrace{\hat{b}}_{\text{direction of } \vec{b}}$$

This is already the vector projection formula. We can rewrite it by substituting $\hat{b} = \frac{\vec{b}}{|\vec{b}|}$ and recognizing that $|\vec{a}||\vec{b}|\cos\theta = \vec{a} \cdot \vec{b}$.

The vector projection of $\vec{a}$ in the direction of $\vec{b}$ is:

$$\text{proj}_{\vec{b}}(\vec{a}) = (|\vec{a}|\cos\theta)\,\hat{b} = \left(\frac{\vec{a} \cdot \vec{b}}{|\vec{b}|^2}\right)\vec{b} = (\vec{a} \cdot \hat{b})\,\hat{b}$$
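The projection formula, together with the right-angle fact from the trail analogy, can be verified in a few lines of NumPy (example vectors are arbitrary):

```python
import numpy as np

# Arbitrary example vectors.
a = np.array([2.0, 3.0])
b = np.array([4.0, 0.0])

# Vector projection: (a . b / |b|^2) * b
proj = (np.dot(a, b) / np.dot(b, b)) * b

# The "remaining gap": everything about a unrelated to b's direction.
residual = a - proj

print(proj)                                  # [2. 0.]
print(np.isclose(np.dot(residual, b), 0.0))  # True: the gap is perpendicular to b

# Its magnitude equals the absolute value of the scalar projection.
print(np.isclose(np.linalg.norm(proj), abs(np.dot(a, b / np.linalg.norm(b)))))  # True
```

The perpendicularity check is the 90° turn from the analogy: the closest point on B's line is exactly where the connecting segment meets B at a right angle.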


• A unit vector isolates a vector's direction by stripping away its magnitude.

$$\hat{\mathbf{v}} = \frac{\vec{\mathbf{v}}}{|\vec{\mathbf{v}}|}$$

• The dot product multiplies corresponding components and sums them. It is also equal to the product of the two vectors' magnitudes multiplied by the cosine of the angle between them.

$$\vec{a} \cdot \vec{b} = a_1 b_1 + \cdots + a_n b_n = \sum_{i=1}^{n} a_i b_i = |\vec{a}||\vec{b}|\cos\theta$$

• Scalar projection uses the dot product to measure how far one vector reaches along another's direction – a single number, like the length of a shadow.

$$|\vec{a}|\cos\theta = \vec{a} \cdot \hat{b} = \frac{\vec{a} \cdot \vec{b}}{|\vec{b}|}$$

• Vector projection goes one step further, returning an actual arrow along that direction: the scalar projection times the unit vector.

$$(|\vec{a}|\cos\theta)\,\hat{b} = (\vec{a} \cdot \hat{b})\,\hat{b}$$

In the next part, we will use the tools learned in this article to truly understand the dot product.


