User contributions for Mr. MacKenty
30 January 2024
- 12:54, 30 January 2024 diff hist +364 Archived classes No edit summary
23 January 2024
- 00:03, 23 January 2024 diff hist +30 Socket No edit summary current
- 00:01, 23 January 2024 diff hist +2,612 Socket No edit summary
22 January 2024
- 23:59, 22 January 2024 diff hist −13 Socket No edit summary
- 23:58, 22 January 2024 diff hist −1,291 Socket No edit summary
15 November 2023
- 17:34, 15 November 2023 diff hist +2,140 Advantages and disadvantages of simulation No edit summary current
- 16:54, 15 November 2023 diff hist −32 Changes in data collection that could improve the model or simulation No edit summary current
- 16:54, 15 November 2023 diff hist −65 Changes in data collection that could improve the model or simulation No edit summary
- 16:51, 15 November 2023 diff hist −250 Examples of simulations that involve changes in rules, formulae and algorithms No edit summary current
- 16:40, 15 November 2023 diff hist −205 Software and hardware required for a simulation No edit summary current
- 16:29, 15 November 2023 diff hist −311 Reliability of a simulation No edit summary current
- 16:28, 15 November 2023 diff hist +2,625 Reliability of a simulation No edit summary
- 16:22, 15 November 2023 diff hist +2,399 Examples of simulations that involve changes in rules, formulae and algorithms No edit summary
- 16:18, 15 November 2023 diff hist −301 Software and hardware required for a simulation →Additional Considerations
- 16:18, 15 November 2023 diff hist +2,273 Software and hardware required for a simulation No edit summary
- 16:16, 15 November 2023 diff hist −371 Changes in rules, formulae and algorithms No edit summary current
- 16:14, 15 November 2023 diff hist +2,949 Changes in rules, formulae and algorithms No edit summary
17 October 2023
- 15:42, 17 October 2023 diff hist +15 Systems that can be modelled No edit summary current
10 October 2023
- 09:11, 10 October 2023 diff hist +29 Programming →Programming Paradigms
- 09:10, 10 October 2023 diff hist +111 Programming →Object Oriented Programming
2 September 2023
- 17:05, 2 September 2023 diff hist −3 Operators No edit summary current
31 May 2023
- 15:46, 31 May 2023 diff hist −36 Artificial Intelligence No edit summary current
- 15:46, 31 May 2023 diff hist +12 Artificial Intelligence No edit summary
- 15:31, 31 May 2023 diff hist −12 Artificial Intelligence No edit summary
- 15:31, 31 May 2023 diff hist +6 Artificial Intelligence No edit summary
- 15:28, 31 May 2023 diff hist +33 Artificial Intelligence No edit summary
- 15:25, 31 May 2023 diff hist 0 N File:Parts of AI.png No edit summary current
- 14:50, 31 May 2023 diff hist −5 2024 case study →Introduction current
30 May 2023
- 13:32, 30 May 2023 diff hist +1,728 N Tracking Created page with "In the context of robotics and particularly in systems like Simultaneous Localization and Mapping (SLAM) or Visual SLAM (vSLAM), "tracking" typically refers to the process of continuously estimating the robot's motion and position over time based on its sensor data. Here's how tracking might work in a vSLAM system: # The robot captures a sequence of images with its camera as it moves through the environment. # For each new image, the robot identifies features (distinct..." current
- 13:31, 30 May 2023 diff hist +16 Robotics No edit summary current
- 13:12, 30 May 2023 diff hist +1,894 N Loop closure Created page with "Loop closure is an important concept in the field of robotics, particularly in relation to the Simultaneous Localization and Mapping (SLAM) problem. As a robot moves through an environment, it builds a map of the environment and uses that map to estimate its location within it. However, as the robot moves, small errors in its motion estimates can accumulate over time, leading to drift in the estimated trajectory and the map. The idea of loop closure is to correct thi..." current
- 13:11, 30 May 2023 diff hist +1,693 N Local mapping Created page with "Local mapping is a concept in robotics, particularly in relation to Simultaneous Localization and Mapping (SLAM) and Visual SLAM (vSLAM), where the robot builds a smaller, more immediate map of its surroundings, often referred to as a local map. The idea is to focus computational resources on understanding the robot's immediate surroundings in detail, rather than attempting to map the entire environment at once. This local map is continuously updated as the robot moves..." current
- 13:09, 30 May 2023 diff hist +1,883 N Initialization Created page with "In the context of robotics and especially in algorithms like Simultaneous Localization and Mapping (SLAM) or Visual SLAM (vSLAM), "Initialization" refers to the process of setting up the initial conditions or starting point for the algorithm. At the start of SLAM or vSLAM, the robot doesn't know anything about its environment or its position within that environment. However, to begin the process of mapping and localization, it needs some kind of initial guess or estimat..." current
- 13:06, 30 May 2023 diff hist +482 N Visual simultaneous localization and mapping (vSLAM) modules Created page with "Visual Simultaneous Localization and Mapping, or vSLAM, is a variant of the general SLAM problem where the primary sensor data comes from a camera or multiple cameras. This technique uses visual information to create a map of the environment while also keeping track of the robot's location within the map. The "modules" in a vSLAM system might refer to the individual components or stages of the vSLAM process. The exact modules can vary depending on the specific vSLAM alg..." current
- 13:05, 30 May 2023 diff hist +2,157 N Sensor fusion model Created page with "Sensor fusion is a method used in robotics and automation that involves merging data from different sensors to improve the understanding of the environment. This process can reduce uncertainty, improve accuracy, and make the system more robust to failures of individual sensors. A sensor fusion model, then, is a mathematical and computational model that describes how to combine the data from different sensors. Here's an example to illustrate the concept: Imagine you ha..." current
- 13:03, 30 May 2023 diff hist +1,723 N Simultaneous localization and mapping (SLAM) Created page with "Simultaneous Localization and Mapping, or SLAM, is a computational problem in the field of robotics. As the name implies, it's about doing two things at the same time: # '''Localization''': Determining where a robot is located in an environment. # '''Mapping''': Building a map of that environment. What makes SLAM challenging is that it's a chicken-and-egg problem: to know where you are (localization), you need a map, but to create a map, you need to know where you are...." current
- 12:59, 30 May 2023 diff hist +1,972 N Robot drift Created page with ""Robot drift" is a term often used in the context of robotics and refers to the accumulated error in a robot's estimated position and orientation over time. This error, or "drift", can occur when a robot is using sensors like wheel encoders or Inertial Measurement Units (IMUs) to estimate its motion. Both these methods involve integrating sensor measurements over time to calculate position, but small errors in these measurements can accumulate, leading to larger and lar..." current
- 12:58, 30 May 2023 diff hist +1,900 N Rigid pose estimation (RPE) Created page with "Rigid Pose Estimation (RPE) is a concept in computer vision and robotics that involves determining the position and orientation (the "pose") of an object that does not deform or change shape — in other words, a "rigid" object. The term 'rigid' indicates that the distance between any two points on the object remains constant over time, regardless of the object's movement or orientation. In the context of robotics, pose estimation often refers to estimating the pose of..." current
- 12:56, 30 May 2023 diff hist +1,386 N Relocalization Created page with "Relocalization is a critical concept in robotics, specifically in the context of autonomous navigation and Simultaneous Localization and Mapping (SLAM). It refers to the ability of a robot to determine its current location in a map that it previously built or in a known environment, particularly after it has lost track of its position due to an error, disturbance, or after it has been manually moved (also known as the "kidnapped robot" problem). There are many reasons w..." current
- 12:54, 30 May 2023 diff hist +1,923 N Odometry sensor Created page with "An odometry sensor is a device used to estimate the change in position over time of a vehicle, like a car or a robot, based on data from its own sensors. The term "odometry" comes from the combination of "hodos", meaning path or way in Greek, and "metron", meaning measure. Therefore, it's all about measuring the path a vehicle takes. There are several types of odometry sensors, and they work in different ways: # '''Wheel Encoders''': In many robots, especially wheeled..." current
- 12:53, 30 May 2023 diff hist +1,331 N Object occlusion Created page with "Object occlusion in the context of computer vision refers to the event where a part or all of an object in the scene is hidden from view by some other object in the scene. In simple words, when an object is in front of another object, blocking it from view, we say that the second object is occluded. For example, imagine you are looking at a photograph of a crowd of people. Some of those people may be standing in front of others, preventing you from seeing the people beh..." current
- 12:51, 30 May 2023 diff hist +1,606 N Light detection and ranging (LIDAR) Created page with "Light Detection and Ranging, more commonly known as LiDAR, is a method of remote sensing that uses light in the form of a pulsed laser to measure distances to an object. These light pulses, combined with other data recorded by the airborne system, generate precise, three-dimensional information about the shape of the Earth and its surface characteristics. Here's a simplified explanation of how it works: # A LiDAR system sends out a pulse of light, usually in the form o..." current
- 12:50, 30 May 2023 diff hist +1,581 N Key points/pairs Created page with "Key points, also known as feature points or interest points, are distinct and unique points in an image that are easy to find and accurately describe. These points are usually selected because they represent corners, edges, or other interesting aspects of the image, and they are used in many computer vision tasks for things like object recognition, image alignment, and 3D reconstruction. When multiple images are used (for example, in a video or a sequence of frames take..." current
- 12:48, 30 May 2023 diff hist +1,803 N Keyframe selection Created page with "Keyframe selection is a term commonly used in the field of computer vision, especially in video processing and robotics. A keyframe is a frame in a sequence of frames (like a video or a series of images) that contains important or critical data. In video compression, for example, keyframes (also known as I-frames) are the frames from which subsequent frames are based. These keyframes serve as reference points and the frames in between are often compressed by storing onl..." current
- 12:46, 30 May 2023 diff hist +1,710 N Inertial measurement unit (IMU) Created page with "An Inertial Measurement Unit, or IMU, is a device that measures and reports on a vehicle's velocity, orientation, and gravitational forces, using a combination of accelerometers, gyroscopes, and sometimes magnetometers. IMUs are typically used to aid in navigation and tracking systems, particularly when GPS data is unavailable or unreliable. Let's break down the components of an IMU: # '''Accelerometers:''' These measure linear acceleration. However, they can't disting..." current
- 12:45, 30 May 2023 diff hist +1,832 N Human pose estimation (HPE) Created page with "Human pose estimation (HPE) is a computer vision task that involves determining the position and orientation of the human body, along with the positions of various body parts such as the head, arms, legs, and so on, usually in real-time. Here's a simplified way to think about it: Imagine you're looking at a photo of a person. You can probably tell what position they're in — maybe they're standing up straight, sitting down, or running. Now imagine trying to teach a com..." current
- 12:44, 30 May 2023 diff hist +1,579 N GPS-denied environment Created page with "A GPS-denied environment is a location or situation where the Global Positioning System (GPS) signals are not available at all. This can occur for a number of reasons: # Indoor Locations: Buildings often block GPS signals, making them unavailable inside. # Underground or Underwater: Similarly, GPS signals can't penetrate underground or underwater. # Jamming or Spoofing: GPS signals can be intentionally disrupted or blocked using devices known as GPS jammers. Additionall..." current
- 12:42, 30 May 2023 diff hist +12 GPS-degraded environment No edit summary current
- 12:41, 30 May 2023 diff hist −5 GPS-degraded environment No edit summary
- 12:41, 30 May 2023 diff hist −6 GPS-degraded environment No edit summary