{"id":1578,"date":"2025-06-12T21:00:56","date_gmt":"2025-06-12T21:00:56","guid":{"rendered":"https:\/\/learnbydoing.dev\/?p=1578"},"modified":"2026-01-10T22:50:42","modified_gmt":"2026-01-10T22:50:42","slug":"pure-pursuit-controller","status":"publish","type":"post","link":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/","title":{"rendered":"Controlador Pure Pursuit"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"1578\" class=\"elementor elementor-1578\" data-elementor-post-type=\"post\">\n\t\t\t\t<div class=\"elementor-element elementor-element-e28ab35 e-flex e-con-boxed e-con e-parent\" data-id=\"e28ab35\" data-element_type=\"container\" data-e-type=\"container\" id=\"content\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-62ab6e3 e-con-full e-flex e-con e-child\" data-id=\"62ab6e3\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-62ab203 elementor-align-center elementor-widget elementor-widget-post-info\" data-id=\"62ab203\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"post-info.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<ul class=\"elementor-inline-items elementor-icon-list-items elementor-post-info\">\n\t\t\t\t\t\t\t\t<li class=\"elementor-icon-list-item elementor-repeater-item-2c98363 elementor-inline-item\" itemprop=\"about\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-icon-list-text elementor-post-info__item elementor-post-info__item--type-terms\">\n\t\t\t\t\t\t\t\t\t\t<span class=\"elementor-post-info__terms-list\">\n\t\t\t\t<span class=\"elementor-post-info__terms-list-item\">Nav2<\/span>, <span 
class=\"elementor-post-info__terms-list-item\">Tutorials<\/span>\t\t\t\t<\/span>\n\t\t\t\t\t<\/span>\n\t\t\t\t\t\t\t\t<\/li>\n\t\t\t\t<\/ul>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-0650e10 e-con-full e-flex e-con e-child\" data-id=\"0650e10\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-ac19582 elementor-view-default elementor-widget elementor-widget-icon\" data-id=\"ac19582\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"icon.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-icon-wrapper\">\n\t\t\t<div class=\"elementor-icon\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"75\" height=\"75\" viewBox=\"0 0 75 75\" fill=\"none\"><path d=\"M74.9999 75H13.1889V73.0002H71.5859L0.460938 1.87521L1.87515 0.460999L73.0001 71.586V13.1889H74.9999V75Z\" fill=\"white\"><\/path><\/svg>\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-47aa245d e-flex e-con-boxed e-con e-parent\" data-id=\"47aa245d\" data-element_type=\"container\" data-e-type=\"container\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-a3b40ae elementor-widget elementor-widget-image\" data-id=\"a3b40ae\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img fetchpriority=\"high\" decoding=\"async\" width=\"1920\" height=\"1080\" src=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp\" class=\"attachment-full size-full wp-image-1364\" alt=\"\" 
srcset=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp 1920w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller-300x169.webp 300w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller-1024x576.webp 1024w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller-768x432.webp 768w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller-1536x864.webp 1536w\" sizes=\"(max-width: 1920px) 100vw, 1920px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-4bc31ca3 elementor-widget elementor-widget-text-editor\" data-id=\"4bc31ca3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h3 data-start=\"27\" data-end=\"61\">Introduction to Motion Planning<\/h3><p data-start=\"63\" data-end=\"961\">Motion planning is a core component of autonomous robots that deals with <strong data-start=\"136\" data-end=\"168\">finding and following a path<\/strong> for the robot to reach a goal safely. High-level (global) motion planners typically compute a static path through the environment, but real robots operate in <strong data-start=\"327\" data-end=\"351\">dynamic environments<\/strong> where obstacles or conditions can change rapidly. This is why robots also need <strong data-start=\"431\" data-end=\"463\">real-time, reactive planning<\/strong> at the local level \u2013 a way to adjust their motion on the fly as the situation evolves. Local motion planners (or controllers) are responsible for <strong data-start=\"610\" data-end=\"639\">tracking the planned path<\/strong> and making minor adjustments to keep the robot on course (or avoid sudden obstacles) in between global re-planning events. 
The key is responsiveness: an autonomous robot must continually sense its position and surroundings and update its movements in <strong data-start=\"891\" data-end=\"904\">real time<\/strong> to stay on a safe, efficient trajectory toward its goal.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c5c30b4 elementor-widget elementor-widget-image\" data-id=\"c5c30b4\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"1920\" height=\"1080\" src=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning.webp\" class=\"attachment-full size-full wp-image-1652\" alt=\"\" srcset=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning.webp 1920w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-300x169.webp 300w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-1024x576.webp 1024w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-768x432.webp 768w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-1536x864.webp 1536w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-18x10.webp 18w\" sizes=\"(max-width: 1920px) 100vw, 1920px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-1b7c9c1 elementor-widget elementor-widget-text-editor\" data-id=\"1b7c9c1\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<ol data-start=\"764\" data-end=\"1106\"><li data-start=\"764\" data-end=\"919\"><p data-start=\"767\" data-end=\"919\"><strong data-start=\"767\" data-end=\"797\">Outer loop (Path Planning)<\/strong>: Computes a global 
path from start to goal using a (usually static) map. Executed infrequently due to computational cost.<\/p><\/li><li data-start=\"920\" data-end=\"1106\"><p data-start=\"923\" data-end=\"1106\"><strong data-start=\"923\" data-end=\"968\">Inner loop (Motion Planning \/ Controller)<\/strong>: Reactively tracks the path, using fresh sensor data to avoid dynamic obstacles and adapt to environment changes. Runs at high frequency.<\/p><\/li><\/ol><p data-start=\"1108\" data-end=\"1374\">A controller must convert the planned path into velocity commands that drive the robot to follow the path while reacting to unforeseen obstacles and execution errors (e.g., wheel slip). One widely used reactive path-tracking method is the <strong data-start=\"1347\" data-end=\"1363\">Pure Pursuit<\/strong> algorithm.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-63715e8 elementor-widget elementor-widget-text-editor\" data-id=\"63715e8\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h3 data-start=\"963\" data-end=\"1015\">Pure Pursuit: A Reactive Path-Following Algorithm<\/h3><p data-start=\"1017\" data-end=\"1759\">One common reactive motion planning (or control) technique is the <strong data-start=\"1083\" data-end=\"1109\">Pure Pursuit algorithm<\/strong>. Pure Pursuit is a <strong data-start=\"1129\" data-end=\"1156\">geometric path tracking<\/strong> method originally developed for autonomous vehicles in the early 1990s. At its heart is a simple but effective strategy often illustrated with the metaphor of a <strong data-start=\"1318\" data-end=\"1342\">\u201ccarrot on a stick.\u201d<\/strong> Imagine the robot is chasing a carrot dangling in front of it \u2013 the carrot represents a point on the path a short distance ahead of the robot\u2019s current position. 
Instead of explicitly planning a complex turn, Pure Pursuit always aims the robot toward this <strong data-start=\"1599\" data-end=\"1618\">lookahead point<\/strong> (the \u201ccarrot\u201d), continuously updating the target as the robot moves. This allows the robot to smoothly follow the path in a reactive manner.<\/p><p data-start=\"1017\" data-end=\"1759\">\u00a0<\/p><p data-start=\"1761\" data-end=\"1853\"><strong data-start=\"1761\" data-end=\"1788\">How Pure Pursuit Works:<\/strong> At a high level, the algorithm can be summarized in a few steps:<\/p><ol data-start=\"1855\" data-end=\"2912\"><li data-start=\"1855\" data-end=\"1980\"><p data-start=\"1858\" data-end=\"1980\"><strong data-start=\"1858\" data-end=\"1883\">Get the current pose:<\/strong> Determine the robot\u2019s current position and orientation (e.g. from odometry or state estimation).<\/p><\/li><li data-start=\"1981\" data-end=\"2164\"><p data-start=\"1984\" data-end=\"2164\"><strong data-start=\"1984\" data-end=\"2013\">Select a lookahead point:<\/strong> On the planned path ahead of the robot, pick a target point at some fixed distance forward along the route. This is the \u201ccarrot\u201d the robot will chase.<\/p><\/li><li data-start=\"2165\" data-end=\"2468\"><p data-start=\"2168\" data-end=\"2468\"><strong data-start=\"2168\" data-end=\"2217\">Compute the curvature\/steering to that point:<\/strong> Calculate the curvature of the circular arc that would connect the robot\u2019s current pose to the lookahead point. 
This essentially gives the turning radius or steering angle needed for the robot to reach that point.<\/p><\/li><li data-start=\"2469\" data-end=\"2699\"><p data-start=\"2472\" data-end=\"2699\"><strong data-start=\"2472\" data-end=\"2499\">Command the velocities:<\/strong> Convert that curvature into actual velocity commands \u2013 typically a forward linear velocity and an angular velocity (or steering angle) \u2013 that will drive the robot along the arc toward the lookahead.<\/p><\/li><li data-start=\"2700\" data-end=\"2912\"><p data-start=\"2703\" data-end=\"2912\"><strong data-start=\"2703\" data-end=\"2727\">Repeat continuously:<\/strong> As the robot moves and the \u201ccarrot\u201d point is reached or updated, select a new lookahead point further along the path and repeat, so the robot incrementally moves along the entire path.<\/p><\/li><\/ol>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5c5a612 elementor-widget elementor-widget-image\" data-id=\"5c5a612\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"1920\" height=\"1080\" src=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-1.gif\" class=\"attachment-full size-full wp-image-1653\" alt=\"\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-027720d elementor-widget elementor-widget-text-editor\" data-id=\"027720d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h3>\u2797 The Geometry and Math Behind Pure Pursuit<\/h3><p>Under the hood, Pure Pursuit uses simple geometry to compute the curvature of the path toward the lookahead (\u201ccarrot\u201d) point. 
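Before walking through that geometry in detail, the five-step loop above can be distilled into a short, ROS-free Python sketch. All names and values here are illustrative, not part of the actual node shown later in this post:

```python
import math

def pure_pursuit_step(pose, path, lookahead=0.5, v=0.3):
    """One iteration of the Pure Pursuit loop.

    pose: (x, y, yaw) of the robot in the world frame.
    path: list of (x, y) waypoints, ordered from start to goal.
    Returns (linear, angular) velocity commands.
    """
    px, py, yaw = pose
    # Step 2: pick the first waypoint at least `lookahead` away (the "carrot").
    carrot = path[-1]
    for wx, wy in path:
        if math.hypot(wx - px, wy - py) >= lookahead:
            carrot = (wx, wy)
            break
    # Express the carrot in the robot frame (translate, then rotate by -yaw).
    dx, dy = carrot[0] - px, carrot[1] - py
    x_r = math.cos(yaw) * dx + math.sin(yaw) * dy
    y_r = -math.sin(yaw) * dx + math.cos(yaw) * dy
    # Step 3: curvature of the arc through the carrot point.
    L_sq = x_r ** 2 + y_r ** 2
    kappa = 2.0 * y_r / L_sq if L_sq > 1e-9 else 0.0
    # Step 4: convert curvature to velocity commands (omega = kappa * v).
    return v, kappa * v
```

Steps 1 and 5 (reading the current pose and repeating) would live in the surrounding control loop, as in the full ROS 2 node later in this post.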
If we express the lookahead point\u2019s coordinates relative to the robot\u2019s current pose (i.e., in the robot\u2019s reference frame with the robot at the origin facing the x-axis), let those coordinates be <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mo stretchy=\"false\">(<\/mo><mi>x<\/mi><mo separator=\"true\">,<\/mo><mi>y<\/mi><mo stretchy=\"false\">)<\/mo><\/mrow><annotation encoding=\"application\/x-tex\">(x, y)<\/annotation><\/semantics><\/math> \u2013 where <strong>x<\/strong> is the forward distance to the point and <strong>y<\/strong> is the lateral offset (positive <strong>y<\/strong> means the point is off to the left of the robot\u2019s heading, for example). We can define <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>L<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">L<\/annotation><\/semantics><\/math> as the straight-line distance to the lookahead point:<\/p><p><math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"> <mi>L<\/mi> <mo>=<\/mo> <msqrt> <mrow> <msup><mi>x<\/mi><mn>2<\/mn><\/msup> <mo>+<\/mo> <msup><mi>y<\/mi><mn>2<\/mn><\/msup> <\/mrow> <\/msqrt> <\/math><\/p><p>Using the geometry of a circle that connects the robot to that target point, the Pure Pursuit algorithm computes the required curvature <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>\u03ba<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">\\kappa<\/annotation><\/semantics><\/math> (reciprocal of the turning radius) of the arc. 
The formula for curvature is:<\/p><p><math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>\u03ba<\/mi><mo>=<\/mo><mfrac><mrow><mn>2<\/mn><mi>y<\/mi><\/mrow><msup><mi>L<\/mi><mn>2<\/mn><\/msup><\/mfrac><\/mrow><annotation encoding=\"application\/x-tex\">\\kappa = \\frac{2y}{L^2}<\/annotation><\/semantics><\/math><\/p><p>Intuitively, this says that if the target point is directly ahead of the robot (<math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>y<\/mi><mo>=<\/mo><mn>0<\/mn><\/mrow><annotation encoding=\"application\/x-tex\">y = 0<\/annotation><\/semantics><\/math>), the curvature is zero (no turn needed).<\/p><p>If the point has an offset <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>y<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">y<\/annotation><\/semantics><\/math>, the robot must turn with curvature proportional to that offset and inversely proportional to the square of the distance. A larger lateral offset or a closer point yields a tighter turn (higher curvature).<\/p><p>Once the curvature <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>\u03ba<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">\\kappa<\/annotation><\/semantics><\/math> is known, the controller needs to convert it into actual motion commands. 
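As a quick sanity check, the curvature formula can be evaluated directly and compared against the equivalent form 2·sin(α)∕L, where α is the angle from the robot's heading to the lookahead point (a small illustrative sketch, not part of the node below):

```python
import math

def curvature(x, y):
    """Curvature of the arc through a lookahead point at (x, y)
    in the robot frame: kappa = 2y / L^2, with L^2 = x^2 + y^2."""
    L_sq = x * x + y * y
    return 0.0 if L_sq < 1e-9 else 2.0 * y / L_sq

# A point straight ahead (y = 0) needs no turn.
print(curvature(1.0, 0.0))   # 0.0

# Same lateral offset, closer point: tighter turn (higher curvature).
print(curvature(1.0, 0.2) < curvature(0.5, 0.2))   # True

# 2y / L^2 equals 2*sin(alpha) / L, since sin(alpha) = y / L.
x, y = 0.8, 0.3
L = math.hypot(x, y)
alpha = math.atan2(y, x)
print(math.isclose(curvature(x, y), 2.0 * math.sin(alpha) / L))   # True
```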
For a car-like robot (with a steering wheel and wheelbase <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>W<\/mi><mi>B<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">WB<\/annotation><\/semantics><\/math>), the required steering angle <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>\u03b4<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">\\delta<\/annotation><\/semantics><\/math> is related to curvature by:<\/p><p><math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"> <mi>\u03b4<\/mi> <mo>=<\/mo> <mrow> <mi>arctan<\/mi> <mo>\u2061<\/mo> <mfenced> <mrow><mi>WB<\/mi><mo>\u22c5<\/mo><mi>\u03ba<\/mi><\/mrow> <\/mfenced> <\/mrow> <\/math><\/p><p>which comes from simple geometry (for small angles, \u03b4<math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mo>\u2248<\/mo><mi>WB<\/mi><mo>\u22c5<\/mo><mi>\u03ba <\/mi><\/math>\u00a0since curvature is roughly the steering angle divided by wheelbase).<\/p><p>In fact, the common Pure Pursuit formula for steering is given by:<\/p><p><math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"> <mi>\u03b4<\/mi> <mo>=<\/mo> <mrow> <mi>arctan<\/mi> <mo>\u2061<\/mo> <mfenced> <mfrac> <mrow><mn>2<\/mn><mi>WB<\/mi><mo>\u22c5<\/mo><mi>sin<\/mi><mo>\u2061<\/mo><mi>\u03b1<\/mi><\/mrow> <mi>L<\/mi> <\/mfrac> <\/mfenced> <\/mrow> <\/math><\/p><p>where <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>\u03b1<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">\\alpha<\/annotation><\/semantics><\/math> is the angle from the robot\u2019s heading to the lookahead point. This is equivalent to the above curvature relationship.<\/p><p>For a differential-drive robot (which accepts linear and angular velocity commands), you can multiply the curvature by the forward velocity to get the needed angular velocity (since angular velocity = linear velocity \u00d7 curvature for a given turning radius). 
For example, if the robot drives forward at velocity <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>v<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">v<\/annotation><\/semantics><\/math>, one can set the angular velocity <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>\u03c9<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">\\omega<\/annotation><\/semantics><\/math>:<\/p><p><math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"> <mi>\u03c9<\/mi> <mo>=<\/mo> <mi>\u03ba<\/mi> <mo>\u22c5<\/mo> <mi>v<\/mi> <\/math><\/p><p>(assuming <math xmlns=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><semantics><mrow><mi>v<\/mi><\/mrow><annotation encoding=\"application\/x-tex\">v<\/annotation><\/semantics><\/math> is not zero). The Pure Pursuit controller often keeps a roughly constant forward speed and adjusts the turn rate (or steering angle) based on curvature to smoothly reach the lookahead point.<\/p><hr \/>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c8775b8 elementor-widget elementor-widget-image\" data-id=\"c8775b8\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"1080\" src=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-3-1.webp\" class=\"attachment-full size-full wp-image-1654\" alt=\"\" srcset=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-3-1.webp 1920w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-3-1-300x169.webp 300w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-3-1-1024x576.webp 1024w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-3-1-768x432.webp 768w, 
https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-3-1-1536x864.webp 1536w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-3-1-18x10.webp 18w\" sizes=\"(max-width: 1920px) 100vw, 1920px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-87c55c3 elementor-widget elementor-widget-text-editor\" data-id=\"87c55c3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p data-start=\"2689\" data-end=\"2734\">A critical parameter here is the <strong data-start=\"2947\" data-end=\"2969\">lookahead distance<\/strong> \u2013 how far ahead of the robot to choose the target point. A larger lookahead distance tends to make the path following <strong data-start=\"3088\" data-end=\"3100\">smoother<\/strong> and less sensitive to small path deviations, but if it\u2019s too large the robot may cut corners or react slowly to sharp turns. A smaller lookahead distance makes the robot track the path <strong data-start=\"3286\" data-end=\"3317\">more tightly and accurately<\/strong>, but can cause jerky or sharp movements if the path curvature is high. 
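The effect of this parameter is easy to see in a small sketch of carrot-point selection (the helper below is illustrative and not the node's actual method):

```python
import math

def select_carrot(path, robot_xy, lookahead):
    """Return the first waypoint at least `lookahead` away from the robot;
    near the end of the path, fall back to the final pose."""
    rx, ry = robot_xy
    for wx, wy in path:
        if math.hypot(wx - rx, wy - ry) >= lookahead:
            return (wx, wy)
    return path[-1]

path = [(0.2, 0.0), (0.4, 0.1), (0.6, 0.3), (0.8, 0.6)]
# A small lookahead hugs the path closely...
print(select_carrot(path, (0.0, 0.0), 0.3))   # (0.4, 0.1)
# ...while a large one aims further along it, smoothing (and cutting) curves.
print(select_carrot(path, (0.0, 0.0), 0.8))   # (0.8, 0.6)
```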
In practice, this distance is tuned to balance smoothness and responsiveness for the specific robot and environment.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5f0b1e5 elementor-widget elementor-widget-image\" data-id=\"5f0b1e5\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"1080\" src=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-5.webp\" class=\"attachment-full size-full wp-image-1656\" alt=\"\" srcset=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-5.webp 1920w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-5-300x169.webp 300w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-5-1024x576.webp 1024w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-5-768x432.webp 768w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-5-1536x864.webp 1536w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/5-Motion-Planning-5-18x10.webp 18w\" sizes=\"(max-width: 1920px) 100vw, 1920px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-637e795 elementor-widget elementor-widget-text-editor\" data-id=\"637e795\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h3><b>\ud83d\udcd6 Pure Pursuit in ROS 2: A Python Implementation<\/b><\/h3><p>Below is a full Python implementation of a Pure Pursuit controller as a ROS\u00a02 node using rclpy. 
This node subscribes to a planned path and the robot\u2019s odometry, then computes and publishes velocity commands to follow the path using the Pure Pursuit logic described above. The code is written to be clear and straightforward for educational purposes:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f5a6432 elementor-widget elementor-widget-code-highlight\" data-id=\"f5a6432\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-python line-numbers\">\n\t\t\t\t<code readonly=\"true\" class=\"language-python\">\n\t\t\t\t\t<xmp>#!\/usr\/bin\/env python3\r\n\r\nimport rclpy\r\nfrom rclpy.node import Node\r\nfrom geometry_msgs.msg import Twist, PoseStamped, Pose\r\nfrom nav_msgs.msg import Path\r\nfrom tf2_ros import Buffer, TransformListener\r\nimport math\r\nfrom tf_transformations import quaternion_matrix, quaternion_from_matrix, translation_from_matrix, inverse_matrix, concatenate_matrices\r\n\r\n\r\nclass PurePursuit(Node):\r\n    def __init__(self):\r\n        super().__init__(\"pure_pursuit_motion_planner_node\")\r\n\r\n        # TF buffer and listener\r\n        self.tf_buffer = Buffer()\r\n        self.tf_listener = TransformListener(self.tf_buffer, self)\r\n\r\n        # Parameters\r\n        self.declare_parameter(\"look_ahead_distance\", 0.5)\r\n        self.declare_parameter(\"max_linear_velocity\", 0.3)\r\n        self.declare_parameter(\"max_angular_velocity\", 1.0)\r\n\r\n        self.look_ahead_distance = self.get_parameter(\"look_ahead_distance\").value\r\n        self.max_linear_velocity = self.get_parameter(\"max_linear_velocity\").value\r\n        self.max_angular_velocity = self.get_parameter(\"max_angular_velocity\").value\r\n\r\n        # Subscribers and 
publishers\r\n        self.path_sub = self.create_subscription(Path, \"\/a_star\/path\", self.path_callback, 10)\r\n        self.cmd_pub = self.create_publisher(Twist, \"\/cmd_vel\", 10)\r\n        self.carrot_pose_pub = self.create_publisher(PoseStamped, \"\/pure_pursuit\/carrot\", 10)\r\n\r\n        # Control loop\r\n        self.timer = self.create_timer(0.1, self.control_loop)\r\n        self.global_plan = None\r\n\r\n    def path_callback(self, path: Path):\r\n        self.global_plan = path\r\n\r\n    def control_loop(self):\r\n        if not self.global_plan or not self.global_plan.poses:\r\n            return\r\n\r\n        # Get the robot's current pose in the odom frame\r\n        try:\r\n            robot_pose_transform = self.tf_buffer.lookup_transform(\r\n                \"odom\", \"base_footprint\", rclpy.time.Time())\r\n        except Exception as ex:\r\n            self.get_logger().warn(f\"Could not transform: {ex}\")\r\n            return\r\n\r\n        # Transform plan to robot's frame\r\n        if not self.transform_plan(robot_pose_transform.header.frame_id):\r\n            self.get_logger().error(\"Unable to transform Plan in robot's frame\")\r\n            return\r\n\r\n        robot_pose = PoseStamped()\r\n        robot_pose.header.frame_id = robot_pose_transform.header.frame_id\r\n        robot_pose.pose.position.x = robot_pose_transform.transform.translation.x\r\n        robot_pose.pose.position.y = robot_pose_transform.transform.translation.y\r\n        robot_pose.pose.orientation = robot_pose_transform.transform.rotation\r\n\r\n        carrot_pose: PoseStamped = self.get_carrot_pose(robot_pose)\r\n        dx = carrot_pose.pose.position.x - robot_pose.pose.position.x\r\n        dy = carrot_pose.pose.position.y - robot_pose.pose.position.y\r\n        distance = math.sqrt(dx ** 2 + dy ** 2)\r\n\r\n        if distance <= 0.1:\r\n            self.get_logger().info(\"Goal Reached!\")\r\n            self.global_plan.poses.clear()\r\n            
return\r\n\r\n        self.carrot_pose_pub.publish(carrot_pose)\r\n\r\n        # Calculate the curvature to the look-ahead point\r\n        robot_tf = quaternion_matrix([\r\n            robot_pose.pose.orientation.x,\r\n            robot_pose.pose.orientation.y,\r\n            robot_pose.pose.orientation.z,\r\n            robot_pose.pose.orientation.w,\r\n        ])\r\n        robot_tf[0][3] = robot_pose.pose.position.x\r\n        robot_tf[1][3] = robot_pose.pose.position.y\r\n\r\n        carrot_pose_tf = quaternion_matrix([\r\n            carrot_pose.pose.orientation.x,\r\n            carrot_pose.pose.orientation.y,\r\n            carrot_pose.pose.orientation.z,\r\n            carrot_pose.pose.orientation.w,\r\n        ])\r\n        carrot_pose_tf[0][3] = carrot_pose.pose.position.x\r\n        carrot_pose_tf[1][3] = carrot_pose.pose.position.y\r\n\r\n        carrot_pose_robot_tf = concatenate_matrices(inverse_matrix(robot_tf), carrot_pose_tf)\r\n        carrot_pose_robot = PoseStamped()\r\n        carrot_pose_robot.pose.position.x = carrot_pose_robot_tf[0][3]\r\n        carrot_pose_robot.pose.position.y = carrot_pose_robot_tf[1][3]\r\n        carrot_pose_robot.pose.position.z = carrot_pose_robot_tf[2][3]\r\n        quaternion = quaternion_from_matrix(carrot_pose_robot_tf)\r\n        carrot_pose_robot.pose.orientation.x = quaternion[0]\r\n        carrot_pose_robot.pose.orientation.y = quaternion[1]\r\n        carrot_pose_robot.pose.orientation.z = quaternion[2]\r\n        carrot_pose_robot.pose.orientation.w = quaternion[3]\r\n\r\n        curvature = self.get_curvature(carrot_pose_robot.pose)\r\n\r\n        # Create and publish the velocity command: omega = curvature * v,\r\n        # clamped so that max_angular_velocity is actually enforced\r\n        cmd_vel = Twist()\r\n        cmd_vel.linear.x = self.max_linear_velocity\r\n        angular_velocity = curvature * self.max_linear_velocity\r\n        cmd_vel.angular.z = max(-self.max_angular_velocity,\r\n                                min(self.max_angular_velocity, angular_velocity))\r\n        self.cmd_pub.publish(cmd_vel)\r\n\r\n    def get_carrot_pose(self, robot_pose: PoseStamped) -> PoseStamped:\r\n        carrot_pose = 
self.global_plan.poses[-1]\r\n        for pose in reversed(self.global_plan.poses):\r\n            dx = pose.pose.position.x - robot_pose.pose.position.x\r\n            dy = pose.pose.position.y - robot_pose.pose.position.y\r\n            distance = math.sqrt(dx * dx + dy * dy)\r\n            if distance > self.look_ahead_distance:\r\n                carrot_pose = pose\r\n            else:\r\n                break\r\n        return carrot_pose\r\n    \r\n    def get_curvature(self, carrot_pose: Pose):\r\n        carrot_dist = carrot_pose.position.x ** 2 + carrot_pose.position.y ** 2\r\n        if carrot_dist > 0.001:\r\n            return 2.0 * carrot_pose.position.y \/ carrot_dist\r\n        else:\r\n            return 0.0\r\n\r\n    def transform_plan(self, frame):\r\n        if self.global_plan.header.frame_id == frame:\r\n            return True\r\n\r\n        try:\r\n            transform = self.tf_buffer.lookup_transform(\r\n                frame, self.global_plan.header.frame_id, rclpy.time.Time())\r\n        except Exception as ex:\r\n            self.get_logger().error(\r\n                f\"Couldn't transform plan from frame {self.global_plan.header.frame_id} to {frame}: {ex}\")\r\n            return False\r\n        \r\n        transform_matrix = quaternion_matrix([\r\n                transform.transform.rotation.x,\r\n                transform.transform.rotation.y,\r\n                transform.transform.rotation.z,\r\n                transform.transform.rotation.w,\r\n            ])\r\n        transform_matrix[0][3] = transform.transform.translation.x\r\n        transform_matrix[1][3] = transform.transform.translation.y\r\n\r\n        for pose in self.global_plan.poses:\r\n            pose_matrix = quaternion_matrix([\r\n                pose.pose.orientation.x,\r\n                pose.pose.orientation.y,\r\n                pose.pose.orientation.z,\r\n                pose.pose.orientation.w,\r\n            ])\r\n            pose_matrix[0][3] = 
pose.pose.position.x\r\n            pose_matrix[1][3] = pose.pose.position.y\r\n\r\n            # Pose in the target frame: apply the frame transform first\r\n            transformed_pose = concatenate_matrices(transform_matrix, pose_matrix)\r\n            [pose.pose.orientation.x, pose.pose.orientation.y, pose.pose.orientation.z, pose.pose.orientation.w] = quaternion_from_matrix(transformed_pose)\r\n            [pose.pose.position.x, pose.pose.position.y, pose.pose.position.z] = translation_from_matrix(transformed_pose)\r\n\r\n        self.global_plan.header.frame_id = frame\r\n        return True\r\n\r\n\r\ndef main(args=None):\r\n    rclpy.init(args=args)\r\n    pure_pursuit = PurePursuit()\r\n    rclpy.spin(pure_pursuit)\r\n    pure_pursuit.destroy_node()\r\n    rclpy.shutdown()\r\n\r\nif __name__ == \"__main__\":\r\n    main()<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a9168f4 elementor-widget elementor-widget-text-editor\" data-id=\"a9168f4\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h3><b>\ud83d\udd0e <\/b><b>Breaking down the Code<\/b><\/h3><p>At its heart, the node plays a continuous game of \u201cchase the carrot\u201d: on every control cycle it looks up the robot&#8217;s pose, picks a lookahead point on the received path, computes the curvature of the arc toward it, and publishes the matching velocity command.<\/p><p>\u00a0<\/p><h4 data-start=\"12551\" data-end=\"12602\">1. 
Subscribing to the Path and Transforming Frames<\/h4><p data-start=\"12604\" data-end=\"12765\">In the constructor of the node (<code data-start=\"12636\" data-end=\"12668\">PurePursuitController.__init__<\/code>), we set up subscribers for the path and odometry topics, and a publisher for velocity commands:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-75a2bc3 elementor-widget elementor-widget-code-highlight\" data-id=\"75a2bc3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-python line-numbers\">\n\t\t\t\t<code readonly=\"true\" class=\"language-python\">\n\t\t\t\t\t<xmp># In the node's constructor:\r\nself.path_sub = self.create_subscription(Path, '\/path', self.path_callback, 10)\r\nself.odom_sub = self.create_subscription(Odometry, '\/odom', self.odom_callback, 10)\r\nself.cmd_pub = self.create_publisher(Twist, '\/cmd_vel', 10)<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-7246002 elementor-widget elementor-widget-text-editor\" data-id=\"7246002\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>When a new path is received, the <code data-start=\"13068\" data-end=\"13083\">path_callback<\/code> simply stores it in <code data-start=\"13104\" data-end=\"13123\">self.current_path<\/code>, and the <code data-start=\"13133\" data-end=\"13148\">odom_callback<\/code> stores the latest robot pose in <code data-start=\"13181\" data-end=\"13200\">self.current_pose<\/code>. 
The core logic runs in a timed loop (<code data-start=\"13239\" data-end=\"13253\">control_loop<\/code>) at 10 Hz. In that loop, the first step is to get the robot\u2019s current position and orientation (yaw) and then <strong data-start=\"13364\" data-end=\"13424\">transform the path points to the robot\u2019s reference frame<\/strong>. We compute the transform by subtracting the robot\u2019s position and rotating by the negative of the robot\u2019s yaw angle:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5c9f1eb elementor-widget elementor-widget-code-highlight\" data-id=\"5c9f1eb\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-python line-numbers\">\n\t\t\t\t<code readonly=\"true\" class=\"language-python\">\n\t\t\t\t\t<xmp>px = self.current_pose.position.x\r\npy = self.current_pose.position.y\r\n# Convert robot orientation (quaternion) to yaw angle\r\nq = self.current_pose.orientation\r\n_, _, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])\r\n\r\nfor pose_stamped in self.current_path.poses:\r\n    gx = pose_stamped.pose.position.x\r\n    gy = pose_stamped.pose.position.y\r\n    # Translate to robot-centric coordinates\r\n    dx = gx - px\r\n    dy = gy - py\r\n    # Rotate into robot frame (apply -yaw rotation)\r\n    x_r = math.cos(yaw) * dx + math.sin(yaw) * dy\r\n    y_r = -math.sin(yaw) * dx + math.cos(yaw) * dy\r\n    ...\r\n<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c63aaef elementor-widget elementor-widget-text-editor\" data-id=\"c63aaef\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div 
class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>This loop expresses every pose of the received path in the robot&#8217;s local frame. Rotating by the negative of the yaw angle explains the sign pattern in the code: since $\cos(-yaw) = \cos(yaw)$ and $\sin(-yaw) = -\sin(yaw)$, the sine term shows up positive in the expression for $x_r$ and negative in the one for $y_r$.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-af2faf0 elementor-widget elementor-widget-text-editor\" data-id=\"af2faf0\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p data-start=\"14132\" data-end=\"14450\">Here, $(gx, gy)$ are the global coordinates of a path point, and $(x_r, y_r)$ are the coordinates of that point in the robot\u2019s local frame. 
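The rotation just described can be exercised in isolation. This standalone sketch (hypothetical helper name, plain 2D math instead of the node's TF utilities) mirrors the rotation used above:

```python
import math

def to_robot_frame(gx, gy, px, py, yaw):
    """Express the global point (gx, gy) in the frame of a robot at (px, py, yaw)."""
    dx = gx - px
    dy = gy - py
    # Rotate by -yaw so the robot sits at the origin facing the +x axis
    x_r = math.cos(yaw) * dx + math.sin(yaw) * dy
    y_r = -math.sin(yaw) * dx + math.cos(yaw) * dy
    return x_r, y_r

# A point one meter ahead of a robot that faces +y lands on the robot's +x axis
x_r, y_r = to_robot_frame(0.0, 1.0, 0.0, 0.0, math.pi / 2)
```

A point straight ahead always maps to positive $x_r$, which is exactly the property the controller relies on to discard points behind the robot.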
After this transformation, the robot is effectively at the origin facing the positive x-axis, so it becomes easy to evaluate which path point is \u201cahead\u201d of the robot and how far.<\/p><p data-start=\"14452\" data-end=\"14622\"><strong data-start=\"14452\" data-end=\"14461\">Note:<\/strong> We skip any path points that end up <strong data-start=\"14498\" data-end=\"14518\">behind the robot<\/strong> in its local frame (i.e., with $x_r &lt; 0$), since the robot should only consider targets in front of it.<\/p><p data-start=\"14452\" data-end=\"14622\">\u00a0<\/p><h4 data-start=\"14624\" data-end=\"14666\">2. Selecting the Lookahead \u201cCarrot\u201d Point<\/h4><p data-start=\"14668\" data-end=\"14875\">Within the same loop, the code scans through the transformed path points to find the first one that is at least the lookahead distance away from the robot. This point will serve as the \u201ccarrot\u201d for guidance:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-121c766 elementor-widget elementor-widget-code-highlight\" data-id=\"121c766\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-python line-numbers\">\n\t\t\t\t<code readonly=\"true\" class=\"language-python\">\n\t\t\t\t\t<xmp>    ...\r\n    distance = math.sqrt(x_r*x_r + y_r*y_r)\r\n    if distance >= self.lookahead_distance:\r\n        # Found a suitable lookahead point\r\n        lookahead_point = (x_r, y_r, distance)\r\n        break<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-764ffcf elementor-widget elementor-widget-text-editor\" data-id=\"764ffcf\" data-element_type=\"widget\" data-e-type=\"widget\" 
data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p data-start=\"15092\" data-end=\"15542\">We compute the Euclidean distance of each point from the robot and compare it to the predefined <code data-start=\"15233\" data-end=\"15253\">lookahead_distance<\/code>. The loop picks the <strong data-start=\"15274\" data-end=\"15326\">first point that lies at or beyond this distance<\/strong> in front of the robot. By choosing the first such point, we essentially pick the closest point ahead of the robot that is at least the lookahead distance away \u2013 this tends to be a good target to aim for on the path.<\/p><p data-start=\"15544\" data-end=\"15830\">If the loop finishes without finding any point beyond the lookahead distance (which can happen if the robot is near the end of the path), the code falls back to using the <strong data-start=\"15715\" data-end=\"15741\">last point of the path<\/strong> as the target. This ensures the robot will drive to the end of the path (and then stop).<\/p><p data-start=\"15544\" data-end=\"15830\">\u00a0<\/p><h4 data-start=\"15832\" data-end=\"15879\">3. Calculating Curvature and Velocity Commands<\/h4><p data-start=\"15881\" data-end=\"16205\">Once the lookahead point is determined (in robot-frame coordinates), the controller calculates the curvature needed to reach that point and then generates the velocity command accordingly. 
In code:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-7d4e6c9 elementor-widget elementor-widget-code-highlight\" data-id=\"7d4e6c9\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-python line-numbers\">\n\t\t\t\t<code readonly=\"true\" class=\"language-python\">\n\t\t\t\t\t<xmp>x_r, y_r, L = lookahead_point\r\ncurvature = 2 * y_r \/ (L * L)<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-17f2b9e elementor-widget elementor-widget-text-editor\" data-id=\"17f2b9e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>This curvature is then used to compute the actual velocities. We keep the linear velocity <code data-start=\"16372\" data-end=\"16384\">linear_vel<\/code> constant (as set by a parameter, e.g. 
0.5\u00a0m\/s), and compute the angular velocity <code data-start=\"16466\" data-end=\"16479\">angular_vel<\/code> by multiplying the curvature by the linear velocity:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8dc4ee5 elementor-widget elementor-widget-code-highlight\" data-id=\"8dc4ee5\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-python line-numbers\">\n\t\t\t\t<code readonly=\"true\" class=\"language-python\">\n\t\t\t\t\t<xmp>linear_vel = self.linear_speed\r\nangular_vel = curvature * linear_vel<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-60b3ac5 elementor-widget elementor-widget-text-editor\" data-id=\"60b3ac5\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Finally, we create a <code data-start=\"16692\" data-end=\"16699\">Twist<\/code> message with this linear and angular velocity and publish it:<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-1772d2f elementor-widget elementor-widget-code-highlight\" data-id=\"1772d2f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-python line-numbers\">\n\t\t\t\t<code readonly=\"true\" class=\"language-python\">\n\t\t\t\t\t<xmp>cmd_msg = Twist()\r\ncmd_msg.linear.x = linear_vel\r\ncmd_msg.angular.z = 
angular_vel\r\nself.cmd_pub.publish(cmd_msg)<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f38044b elementor-widget elementor-widget-text-editor\" data-id=\"f38044b\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p data-start=\"16888\" data-end=\"17258\">This Twist message will cause the robot to drive forward and turn with the calculated angular velocity, steering it along the circular arc toward the lookahead point. As the robot moves, the whole process will repeat: new odometry comes in, the lookahead point shifts forward along the path, and new velocity commands are computed, creating a smooth pursuit of the path.<\/p><p data-start=\"16888\" data-end=\"17258\">\u00a0<\/p><p data-start=\"17260\" data-end=\"17695\"><strong data-start=\"17260\" data-end=\"17285\">Stopping at the Goal:<\/strong> In the code, after publishing the command, we also included an optional check for goal arrival \u2013 if the robot\u2019s distance to the final goal is below a small threshold (e.g., 0.1\u00a0m), the controller publishes a zero velocity to stop the robot. This prevents continuous small movements once the goal is reached. 
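The curvature, velocity, and goal-check steps above can be condensed into one small pure function. The sketch below uses the example values from the text (0.5 m/s cruise speed, 0.1 m goal tolerance) and hypothetical names, not the node's actual API:

```python
import math

def pure_pursuit_cmd(lookahead, goal, linear_speed=0.5, goal_tol=0.1):
    """One control cycle: return (linear, angular) velocity.

    `lookahead` and `goal` are (x, y) points already expressed in the
    robot frame, so the robot itself sits at (0, 0).
    """
    if math.hypot(*goal) < goal_tol:
        return 0.0, 0.0                  # close enough to the goal: stop
    x_r, y_r = lookahead
    L = math.hypot(x_r, y_r)             # distance to the carrot
    curvature = 2.0 * y_r / (L * L)      # arc through robot and carrot
    return linear_speed, curvature * linear_speed

# Carrot 1 m ahead and 0.5 m to the left: drive forward while turning left
lin, ang = pure_pursuit_cmd((1.0, 0.5), (3.0, 2.0))
```

A carrot to the left ($y_r > 0$) yields a positive angular velocity, and a carrot straight ahead ($y_r = 0$) yields zero curvature, i.e. driving straight.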
In a full implementation, you might also signal completion or shut down the controller at this point.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f596ece elementor-widget elementor-widget-text-editor\" data-id=\"f596ece\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<h3>\ud83e\uddf0 Hands-on: The fun part<\/h3><p>Time to put all the boring theory into practice and make the robot chase some carrots!<\/p><p>Clone the <a class=\"d-block overflow-x-hidden color-fg-default\" href=\"https:\/\/github.com\/AntoBrandi\/Self-Driving-and-ROS-2-Learn-by-Doing-Plan-Navigation\" data-pjax=\"#repo-content-pjax-container\" data-turbo-frame=\"repo-content-turbo-frame\">Self-Driving-and-ROS-2-Learn-by-Doing-Plan-Navigation<\/a>\u00a0repository on your PC<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-507331c elementor-widget elementor-widget-code-highlight\" data-id=\"507331c\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-bash \">\n\t\t\t\t<code readonly=\"true\" class=\"language-bash\">\n\t\t\t\t\t<xmp>git clone https:\/\/github.com\/AntoBrandi\/Self-Driving-and-ROS-2-Learn-by-Doing-Plan-Navigation.git<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-4287516 elementor-widget elementor-widget-text-editor\" data-id=\"4287516\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Go to the <em>bumperbot workspace<\/em> folder 
in the <em>Section5_Motion_Planning<\/em>\u00a0folder<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-2bcc1cb elementor-widget elementor-widget-code-highlight\" data-id=\"2bcc1cb\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-bash \">\n\t\t\t\t<code readonly=\"true\" class=\"language-bash\">\n\t\t\t\t\t<xmp>cd Self-Driving-and-ROS-2-Learn-by-Doing-Plan-Navigation\/Section5_Motion_Planning\/bumperbot_ws<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-270e161 elementor-widget elementor-widget-text-editor\" data-id=\"270e161\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Install all the required dependencies using <a href=\"https:\/\/docs.ros.org\/en\/humble\/Tutorials\/Intermediate\/Rosdep.html\">rosdep<\/a><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e2b1c63 elementor-widget elementor-widget-code-highlight\" data-id=\"e2b1c63\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-bash \">\n\t\t\t\t<code readonly=\"true\" class=\"language-bash\">\n\t\t\t\t\t<xmp>rosdep install --from-paths src -y --ignore-src<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a3f1a2b elementor-widget 
elementor-widget-text-editor\" data-id=\"a3f1a2b\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Build the workspace to compile it<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e41791a elementor-widget elementor-widget-code-highlight\" data-id=\"e41791a\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-bash \">\n\t\t\t\t<code readonly=\"true\" class=\"language-bash\">\n\t\t\t\t\t<xmp>colcon build<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b764423 elementor-widget elementor-widget-text-editor\" data-id=\"b764423\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>In a new window of the terminal, source the <strong>bumperbot_ws<\/strong><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-3c53cbb elementor-widget elementor-widget-code-highlight\" data-id=\"3c53cbb\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-bash \">\n\t\t\t\t<code readonly=\"true\" class=\"language-bash\">\n\t\t\t\t\t<xmp>. 
install\/setup.bash<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-93c6185 elementor-widget elementor-widget-text-editor\" data-id=\"93c6185\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Launch the simulation and start navigating!<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a76962b elementor-widget elementor-widget-code-highlight\" data-id=\"a76962b\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-bash \">\n\t\t\t\t<code readonly=\"true\" class=\"language-bash\">\n\t\t\t\t\t<xmp>ros2 launch bumperbot_bringup simulated_robot.launch.py use_slam:=True world_name:=small_house<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-251098a elementor-widget elementor-widget-text-editor\" data-id=\"251098a\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Once the simulation is up and running, you can open a new window of the terminal, source the <strong>bumperbot_ws<\/strong> also in the new window, and run the <strong>a_star_planner<\/strong> node.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5a2dfe9 elementor-widget elementor-widget-code-highlight\" data-id=\"5a2dfe9\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div 
class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-bash \">\n\t\t\t\t<code readonly=\"true\" class=\"language-bash\">\n\t\t\t\t\t<xmp>. install\/setup.bash\nros2 run bumperbot_planning a_star_planner<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-473817c elementor-widget elementor-widget-text-editor\" data-id=\"473817c\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Finally, to make the robot follow the path calculated by the <strong>a_star_planner<\/strong> node, we can start our implementation of the Pure Pursuit motion planner.<\/p><p>So in another window of the terminal, let&#8217;s source the <strong>bumperbot_ws<\/strong> and run the <strong>pure_pursuit<\/strong> node.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-0915f4f elementor-widget elementor-widget-code-highlight\" data-id=\"0915f4f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"code-highlight.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"prismjs-tomorrow copy-to-clipboard \">\n\t\t\t<pre data-line=\"\" class=\"highlight-height language-bash \">\n\t\t\t\t<code readonly=\"true\" class=\"language-bash\">\n\t\t\t\t\t<xmp>. 
install\/setup.bash\nros2 run bumperbot_motion pure_pursuit.py<\/xmp>\n\t\t\t\t<\/code>\n\t\t\t<\/pre>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f1c947b elementor-widget elementor-widget-text-editor\" data-id=\"f1c947b\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>In RViz2, add the visualization of the <strong>\/a_star\/path<\/strong> and <strong>\/a_star\/visited_map<\/strong> topics, along with the <strong>\/pure_pursuit\/carrot<\/strong><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d6e6e8c elementor-widget elementor-widget-image\" data-id=\"d6e6e8c\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"800\" height=\"402\" src=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/Screenshot-2025-06-15-154656-1024x515.png\" class=\"attachment-large size-large wp-image-1680\" alt=\"\" srcset=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/Screenshot-2025-06-15-154656-1024x515.png 1024w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/Screenshot-2025-06-15-154656-300x151.png 300w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/Screenshot-2025-06-15-154656-768x386.png 768w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/Screenshot-2025-06-15-154656-1536x773.png 1536w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/Screenshot-2025-06-15-154656-18x9.png 18w, https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/Screenshot-2025-06-15-154656.png 1902w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element 
elementor-element-2a88474 elementor-widget elementor-widget-text-editor\" data-id=\"2a88474\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Then, you can finally use the &#8220;2D Goal Pose&#8221; RViz tool to select a pose in the map and trigger the planner.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d1c59bc elementor-widget elementor-widget-html\" data-id=\"d1c59bc\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"html.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<div class=\"yt-wrapper\">\r\n\t<div class=\"yt-frame-container\">\r\n\t\t<iframe src=\"https:\/\/www.youtube.com\/embed\/Ha1rBIolqlg?playlist=Ha1rBIolqlg&autoplay=1&mute=1&loop=1&color=white&controls=0\" title=\"YouTube video player\" frameborder=\"0\" allow=\"accelerometer; autoplay; loop; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\r\n\t<\/div>\r\n<\/div>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-63a9e19 e-con-full e-flex e-con e-parent\" data-id=\"63a9e19\" data-element_type=\"container\" data-e-type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-f4cc46c elementor-bg-transform elementor-bg-transform-move-left elementor-cta--layout-image-left elementor-cta--mobile-layout-image-above elementor-cta--skin-classic elementor-animated-content elementor-widget elementor-widget-call-to-action\" data-id=\"f4cc46c\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"call-to-action.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-cta\">\n\t\t\t\t\t<div 
class=\"elementor-cta__bg-wrapper\">\n\t\t\t\t<div class=\"elementor-cta__bg elementor-bg\" style=\"background-image: url(https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/plan_navigation.webp);\" role=\"img\" aria-label=\"plan_navigation\"><\/div>\n\t\t\t\t<div class=\"elementor-cta__bg-overlay\"><\/div>\n\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-cta__content\">\n\t\t\t\t\n\t\t\t\t\t\t\t\t\t<h2 class=\"elementor-cta__title elementor-cta__content-item elementor-content-item\">\n\t\t\t\t\t\tWant to learn more?\t\t\t\t\t<\/h2>\n\t\t\t\t\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-cta__description elementor-cta__content-item elementor-content-item\">\n\t\t\t\t\t\tYou can find a detailed explanation of the Pure Pursuit algorithm, along with several other Motion Planning algorithms, in the \"Self Driving and ROS 2 - Learn by Doing! Plan &amp; Navigation\" course\t\t\t\t\t<\/div>\n\t\t\t\t\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-cta__button-wrapper elementor-cta__content-item elementor-content-item \">\n\t\t\t\t\t<a class=\"elementor-cta__button elementor-button elementor-size-\" href=\"\" target=\"_blank\">\n\t\t\t\t\t\tEnroll Now\t\t\t\t\t<\/a>\n\t\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<\/div>\n\t\t\t\t\t\t\t<div class=\"elementor-ribbon elementor-ribbon-right\">\n\t\t\t\t<div class=\"elementor-ribbon-inner\">\n\t\t\t\t\tDISCOUNT\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b26ac12 elementor-widget elementor-widget-spacer\" data-id=\"b26ac12\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"spacer.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-spacer\">\n\t\t\t<div class=\"elementor-spacer-inner\"><\/div>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Introduction to Motion Planning Motion 
planning is a core component of autonomous robots that deals with finding and following a path for the robot to reach a goal safely. High-level (global) motion planners typically compute a static path through the environment, but real robots operate in dynamic environments where obstacles or conditions can change rapidly. [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":1364,"comment_status":"closed","ping_status":"open","sticky":false,"template":"elementor_header_footer","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[49,43],"tags":[101,96,110,122,98,100,75,71,107,72,102],"class_list":["post-1578","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-nav2","category-tutorials","tag-autonomous-navigation","tag-nav2","tag-navigation","tag-pure-pursuit","tag-python","tag-robot","tag-robotics","tag-ros","tag-ros-2","tag-ros2","tag-self-driving"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Pure Pursuit Controller - Learn by Doing!<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/learnbydoing.dev\/es\/pure-pursuit-controller\/\" \/>\n<meta property=\"og:locale\" content=\"es_ES\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Pure Pursuit Controller - Learn by Doing!\" \/>\n<meta property=\"og:description\" content=\"Introduction to Motion Planning Motion planning is a core component of autonomous robots that deals with finding and following a path for the robot to reach a goal safely. High-level (global) motion planners typically compute a static path through the environment, but real robots operate in dynamic environments where obstacles or conditions can change rapidly. 
[&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/learnbydoing.dev\/es\/pure-pursuit-controller\/\" \/>\n<meta property=\"og:site_name\" content=\"Learn by Doing!\" \/>\n<meta property=\"article:published_time\" content=\"2025-06-12T21:00:56+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-01-10T22:50:42+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp\" \/>\n\t<meta property=\"og:image:width\" content=\"1920\" \/>\n\t<meta property=\"og:image:height\" content=\"1080\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/webp\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Escrito por\" \/>\n\t<meta name=\"twitter:data1\" content=\"\" \/>\n\t<meta name=\"twitter:label2\" content=\"Tiempo de lectura\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutos\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/\"},\"author\":{\"name\":\"\",\"@id\":\"\"},\"headline\":\"Pure Pursuit Controller\",\"datePublished\":\"2025-06-12T21:00:56+00:00\",\"dateModified\":\"2026-01-10T22:50:42+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/\"},\"wordCount\":2004,\"publisher\":{\"@id\":\"https:\/\/learnbydoing.dev\/es\/#organization\"},\"image\":{\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp\",\"keywords\":[\"autonomous navigation\",\"nav2\",\"navigation\",\"pure pursuit\",\"python\",\"robot\",\"Robotics\",\"ROS\",\"ROS 
2\",\"ROS2\",\"self-driving\"],\"articleSection\":[\"Nav2\",\"Tutorials\"],\"inLanguage\":\"es\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/\",\"url\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/\",\"name\":\"Pure Pursuit Controller - Learn by Doing!\",\"isPartOf\":{\"@id\":\"https:\/\/learnbydoing.dev\/es\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp\",\"datePublished\":\"2025-06-12T21:00:56+00:00\",\"dateModified\":\"2026-01-10T22:50:42+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#breadcrumb\"},\"inLanguage\":\"es\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"es\",\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#primaryimage\",\"url\":\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp\",\"contentUrl\":\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp\",\"width\":1920,\"height\":1080},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/learnbydoing.dev\/es\/learn-by-doing-es\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Pure Pursuit Controller\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/learnbydoing.dev\/es\/#website\",\"url\":\"https:\/\/learnbydoing.dev\/es\/\",\"name\":\"Learn by Doing!\",\"description\":\"Learn Robotics the fun 
way\",\"publisher\":{\"@id\":\"https:\/\/learnbydoing.dev\/es\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/learnbydoing.dev\/es\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"es\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/learnbydoing.dev\/es\/#organization\",\"name\":\"Learn by Doing!\",\"url\":\"https:\/\/learnbydoing.dev\/es\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"es\",\"@id\":\"https:\/\/learnbydoing.dev\/es\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/cropped-cropped-cropped-Progetto-senza-titolo-6-1.png\",\"contentUrl\":\"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/cropped-cropped-cropped-Progetto-senza-titolo-6-1.png\",\"width\":512,\"height\":512,\"caption\":\"Learn by Doing!\"},\"image\":{\"@id\":\"https:\/\/learnbydoing.dev\/es\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.linkedin.com\/in\/antonio-brandi-512166bb\/\"]},{\"@type\":\"Person\",\"@id\":\"\",\"url\":\"https:\/\/learnbydoing.dev\/es\/author\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Pure Pursuit Controller - Learn by Doing!","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/learnbydoing.dev\/es\/pure-pursuit-controller\/","og_locale":"es_ES","og_type":"article","og_title":"Pure Pursuit Controller - Learn by Doing!","og_description":"Introduction to Motion Planning Motion planning is a core component of autonomous robots that deals with finding and following a path for the robot to reach a goal safely. 
High-level (global) motion planners typically compute a static path through the environment, but real robots operate in dynamic environments where obstacles or conditions can change rapidly. [&hellip;]","og_url":"https:\/\/learnbydoing.dev\/es\/pure-pursuit-controller\/","og_site_name":"Learn by Doing!","article_published_time":"2025-06-12T21:00:56+00:00","article_modified_time":"2026-01-10T22:50:42+00:00","og_image":[{"width":1920,"height":1080,"url":"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp","type":"image\/webp"}],"twitter_card":"summary_large_image","twitter_misc":{"Escrito por":"","Tiempo de lectura":"11 minutos"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#article","isPartOf":{"@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/"},"author":{"name":"","@id":""},"headline":"Pure Pursuit Controller","datePublished":"2025-06-12T21:00:56+00:00","dateModified":"2026-01-10T22:50:42+00:00","mainEntityOfPage":{"@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/"},"wordCount":2004,"publisher":{"@id":"https:\/\/learnbydoing.dev\/es\/#organization"},"image":{"@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#primaryimage"},"thumbnailUrl":"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp","keywords":["autonomous navigation","nav2","navigation","pure pursuit","python","robot","Robotics","ROS","ROS 2","ROS2","self-driving"],"articleSection":["Nav2","Tutorials"],"inLanguage":"es"},{"@type":"WebPage","@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/","url":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/","name":"Pure Pursuit Controller - Learn by 
Doing!","isPartOf":{"@id":"https:\/\/learnbydoing.dev\/es\/#website"},"primaryImageOfPage":{"@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#primaryimage"},"image":{"@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#primaryimage"},"thumbnailUrl":"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp","datePublished":"2025-06-12T21:00:56+00:00","dateModified":"2026-01-10T22:50:42+00:00","breadcrumb":{"@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#breadcrumb"},"inLanguage":"es","potentialAction":[{"@type":"ReadAction","target":["https:\/\/learnbydoing.dev\/pure-pursuit-controller\/"]}]},{"@type":"ImageObject","inLanguage":"es","@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#primaryimage","url":"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp","contentUrl":"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/pure_pursuit_controller.webp","width":1920,"height":1080},{"@type":"BreadcrumbList","@id":"https:\/\/learnbydoing.dev\/pure-pursuit-controller\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/learnbydoing.dev\/es\/learn-by-doing-es\/"},{"@type":"ListItem","position":2,"name":"Pure Pursuit Controller"}]},{"@type":"WebSite","@id":"https:\/\/learnbydoing.dev\/es\/#website","url":"https:\/\/learnbydoing.dev\/es\/","name":"Learn by Doing!","description":"Learn Robotics the fun way","publisher":{"@id":"https:\/\/learnbydoing.dev\/es\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/learnbydoing.dev\/es\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"es"},{"@type":"Organization","@id":"https:\/\/learnbydoing.dev\/es\/#organization","name":"Learn by 
Doing!","url":"https:\/\/learnbydoing.dev\/es\/","logo":{"@type":"ImageObject","inLanguage":"es","@id":"https:\/\/learnbydoing.dev\/es\/#\/schema\/logo\/image\/","url":"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/cropped-cropped-cropped-Progetto-senza-titolo-6-1.png","contentUrl":"https:\/\/learnbydoing.dev\/wp-content\/uploads\/2025\/06\/cropped-cropped-cropped-Progetto-senza-titolo-6-1.png","width":512,"height":512,"caption":"Learn by Doing!"},"image":{"@id":"https:\/\/learnbydoing.dev\/es\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.linkedin.com\/in\/antonio-brandi-512166bb\/"]},{"@type":"Person","@id":"","url":"https:\/\/learnbydoing.dev\/es\/author\/"}]}},"_links":{"self":[{"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/posts\/1578","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/comments?post=1578"}],"version-history":[{"count":103,"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/posts\/1578\/revisions"}],"predecessor-version":[{"id":5640,"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/posts\/1578\/revisions\/5640"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/media\/1364"}],"wp:attachment":[{"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/media?parent=1578"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/categories?post=1578"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/learnbydoing.dev\/es\/wp-json\/wp\/v2\/tags?post=1578"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}