All public logs

Combined display of all available logs of Computer Science Wiki. You can narrow down the view by selecting a log type, the username (case-sensitive), or the affected page (also case-sensitive).

Logs
  • 07:17, 25 July 2024 Mr. MacKenty talk contribs deleted page Multi-layer perceptron (MLP) (content was: "<center> <blockquote style="padding: 5px; background-color: #FFF8DC; border: solid thin gray;"> File:Exclamation.png This is student work which has not yet been approved as correct by the instructor </blockquote> </center> right|frame|Case study notes<ref>http://www.flaticon.com/</ref> == Introduction == Multi-layer perceptrons are simply a type of neural network consisting of at least 3 nodes. The input nodes, the hidden nodes, and the output no...")
  • 07:16, 25 July 2024 Mr. MacKenty talk contribs created page Weights (Created page with "''This article was created with the support of an LLM'' Weights are a fundamental component of neural networks, representing the parameters that the model learns during training. In the context of chatbots, weights determine how the input data is transformed as it passes through the network layers, ultimately influencing the chatbot's ability to understand and generate human language. === Importance of Weights === Weights are crucial for: * Learning from data during t...")
  • 07:14, 25 July 2024 Mr. MacKenty talk contribs created page Vanishing gradient (Created page with "''This article was created with the support of an LLM'' The vanishing gradient problem is a significant issue in training deep neural networks, particularly recurrent neural networks (RNNs). It occurs when gradients used to update the weights during backpropagation become exceedingly small, effectively preventing the network from learning. This problem can severely impact the performance of chatbots by hindering the training of deep models. === Importance of Addressing...")
  • 07:13, 25 July 2024 Mr. MacKenty talk contribs created page Transformer neural network (transformer NN) (Created page with "''This article was created with the support of an LLM'' Transformer Neural Networks (Transformer NNs) are a type of neural network architecture designed for handling sequential data. They are particularly effective for natural language processing (NLP) tasks and have revolutionized the development of chatbots by providing a powerful mechanism for understanding and generating human language. === Importance of Transformer NNs === Transformer NNs are crucial for: * Captu...")
  • 07:11, 25 July 2024 Mr. MacKenty talk contribs created page Tensor processing unit (TPU) (Created page with "''This article was created with the support of an LLM'' A Tensor Processing Unit (TPU) is a specialized hardware accelerator designed by Google specifically for machine learning tasks, particularly neural network computations. TPUs are optimized for large-scale training and inference of deep learning models, making them highly effective for powering advanced chatbots. === Importance of TPUs === TPUs are crucial for: * Accelerating the training and inference of large n...")
  • 07:04, 25 July 2024 Mr. MacKenty talk contribs created page Synthetic data (Created page with "''This article was created with support from an LLM'' Synthetic data refers to artificially generated data that mimics real-world data. In the context of chatbots, synthetic data is used to train and test models, particularly when real data is scarce, sensitive, or expensive to obtain. Synthetic data can enhance the chatbot's performance by providing diverse and extensive training examples. === Importance of Synthetic Data === Synthetic data is crucial for: * Expandin...")
  • 07:03, 25 July 2024 Mr. MacKenty talk contribs created page Self-attention mechanism (Created page with "''This article was created with support from an LLM'' The self-attention mechanism is a component in neural network architectures that allows the model to weigh the importance of different words in a sentence relative to each other. This mechanism is pivotal in enhancing the model's ability to capture long-range dependencies and contextual relationships within the text. In chatbots, the self-attention mechanism is often employed within transformer models to improve natu...")
  • 07:02, 25 July 2024 Mr. MacKenty talk contribs created page Recurrent neural network (RNN) (Created page with "''This article has been created with support from an LLM'' Recurrent Neural Networks (RNNs) are a class of artificial neural networks designed to recognize patterns in sequences of data, such as text, genomes, handwriting, or time series data. In chatbots, RNNs are employed to process and generate sequences of words, allowing the system to maintain context across multiple turns of conversation. === Importance of RNNs === RNNs are crucial for: * Handling sequential dat...")
  • 07:00, 25 July 2024 Mr. MacKenty talk contribs created page Pre-processing (Created page with "''This article was written with support from an LLM'' Pre-processing is the initial phase in natural language processing (NLP) that involves preparing and cleaning the input text to make it suitable for analysis and interpretation. In chatbots, pre-processing is essential for ensuring accurate understanding and efficient processing of user inputs. === Importance of Pre-processing === Pre-processing is crucial for: * Cleaning and normalizing user inputs. * Reducing noi...")
  • 06:59, 25 July 2024 Mr. MacKenty talk contribs created page Natural language understanding (NLU) (Created page with "''This article was written with the support of an LLM'' Natural Language Understanding (NLU) is a subfield of natural language processing (NLP) that focuses on the ability of machines to understand and interpret human language. In the context of chatbots, NLU enables the system to comprehend user inputs, recognize intents, and extract relevant entities. === Importance of NLU === NLU is crucial for: * Accurately interpreting user queries and commands. * Understanding t...")
  • 06:58, 25 July 2024 Mr. MacKenty talk contribs created page Syntactical analysis (parsing) (Created page with "''This article was written with the support of an LLM'' Syntactical analysis, also known as parsing, is the process of analyzing the grammatical structure of a sentence to understand the relationships between words. In chatbots, syntactical analysis helps in interpreting user inputs by breaking down sentences into their constituent parts. === Importance of Syntactical Analysis === Syntactical analysis is crucial for: * Understanding the grammatical structure of senten...")
  • 06:57, 25 July 2024 Mr. MacKenty talk contribs created page Semantic analysis (Created page with "''This article was written with the support of an LLM'' Semantic analysis is the process of understanding the meaning and interpretation of words, phrases, and sentences in context. In chatbots, semantic analysis helps in grasping the intended meaning behind user inputs, enabling more accurate and relevant responses. === Importance of Semantic Analysis === Semantic analysis is vital for: * Interpreting the meanings of words and sentences in context. * Identifying rela...")
  • 06:54, 25 July 2024 Mr. MacKenty talk contribs created page Pragmatic analysis (Created page with "''This article has been written by an LLM'' Pragmatic analysis involves understanding the intended meaning behind a user's input by considering context, user intent, and other external factors. It goes beyond the literal meaning of words to interpret the intended message, which is crucial for creating effective and human-like chatbots. === Importance of Pragmatic Analysis === Pragmatic analysis is essential for ensuring that chatbots: * Comprehend the user's actual in...")
  • 06:51, 25 July 2024 Mr. MacKenty talk contribs created page Lexical analysis (Created page with "''This article was written with the support of an LLM'' Lexical analysis, also known as tokenization, is the process of converting a sequence of characters into a sequence of tokens. In the context of chatbots, lexical analysis is a fundamental step in natural language processing (NLP) that helps in understanding and interpreting user inputs. === Importance of Lexical Analysis === Lexical analysis is crucial for breaking down user inputs into manageable and meaningful...")
  • 06:50, 25 July 2024 Mr. MacKenty talk contribs created page Discourse integration (Created page with "Discourse integration refers to the capability of chatbots to maintain coherence and context across multiple interactions within a conversation. It ensures that the chatbot can understand, interpret, and generate responses that are contextually relevant, enhancing the overall conversational experience. === Importance of Discourse Integration === Effective discourse integration is crucial for creating natural and engaging interactions. It enables chatbots to: * Maintain...")
  • 06:48, 25 July 2024 Mr. MacKenty talk contribs created page Memory cell state (Created page with "A memory cell state is a critical concept in the context of chatbots, especially those utilizing neural networks such as Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks. These networks are commonly employed to manage and improve the understanding of context and sequential information in chatbot conversations. === Understanding Memory Cell State === In neural network architectures, the memory cell state represents the internal state of the cell at...")
  • 19:18, 9 July 2024 Mr. MacKenty talk contribs created page Loss function (Created page with "''This answer was supported by a LLM'' '''Loss Function''' A loss function, also known as a cost function or objective function, is a mathematical function used in machine learning to measure the difference between the predicted output of a model and the actual target values. The primary goal of training a model is to minimize the loss function, thereby improving the model’s accuracy. Here’s a detailed explanation of loss functions within the context of a chatbot s...")
  • 19:15, 9 July 2024 Mr. MacKenty talk contribs created page Long short-term memory (LSTM) (Created page with "''This answer was supported by a LLM'' '''Long Short-Term Memory (LSTM)''' Long Short-Term Memory (LSTM) is a type of recurrent neural network (RNN) architecture designed to handle long-term dependencies and mitigate issues such as vanishing and exploding gradients, which are common in traditional RNNs. LSTMs are particularly effective in tasks that involve sequential data, such as language modeling and time series prediction. Here’s a detailed explanation of LSTMs w...")
  • 19:13, 9 July 2024 Mr. MacKenty talk contribs created page Latency (Created page with "''This answer was supported by a LLM'' '''Latency''' Latency refers to the time delay between a user's action and the system's response. In the context of a chatbot system, latency is a critical factor that affects user experience and the perceived responsiveness of the chatbot. Here’s a detailed explanation of latency within the context of a chatbot system: == Definition == * '''Latency''': * The time delay between the input (user query) and the output (chatbot r...")
  • 18:02, 9 July 2024 Mr. MacKenty talk contribs created page Large language model (LLM) (Created page with "''This answer was supported by a LLM'' '''Large Language Model (LLM)''' A Large Language Model (LLM) is a type of artificial intelligence model that has been trained on vast amounts of text data to understand, generate, and manipulate human language. LLMs leverage advanced machine learning techniques, particularly deep learning, to perform a wide range of natural language processing (NLP) tasks. Here’s a detailed explanation of LLMs within the context of a chatbot sy...")
  • 18:00, 9 July 2024 Mr. MacKenty talk contribs created page Hyperparameter tuning (Created page with "''This answer was supported by a LLM'' '''Hyperparameter Tuning''' Hyperparameter tuning is the process of optimizing the hyperparameters of a machine learning model to improve its performance. Unlike model parameters, which are learned during training, hyperparameters are set before the training process begins and directly influence the behavior of the learning algorithm. Here’s a detailed explanation of hyperparameter tuning within the context of a cha...")
  • 08:59, 3 July 2024 Mr. MacKenty talk contribs created page Graphical processing unit (GPU) (Created page with "''This answer was supported by a LLM'' '''Graphical Processing Unit (GPU)''' A Graphical Processing Unit (GPU) is a specialized electronic circuit designed to accelerate the processing of images and videos. It has become essential for deep learning and other computationally intensive tasks due to its ability to perform many calculations simultaneously. Here’s a detailed explanation of GPUs within the context of a chatbot system: == Definition == * '''Graphical Proce...")
  • 08:55, 3 July 2024 Mr. MacKenty talk contribs created page Dataset (Created page with "''This answer was supported by a LLM'' '''Dataset''' A dataset is a structured collection of data used for various purposes such as training, validating, and testing machine learning models, including chatbots. Datasets can consist of various types of data, such as text, images, audio, or numerical values, depending on the application. Here’s a detailed explanation of a dataset within the context of a chatbot system: == Definition == * '''Dataset''': * A collectio...")
  • 08:54, 3 July 2024 Mr. MacKenty talk contribs created page Selection (Created page with " ''This answer was supported by a LLM'' '''Sampling Bias''' Sampling bias occurs when the sample selected for analysis is not representative of the entire population, leading to skewed or inaccurate results. This type of bias can significantly impact the performance and fairness of AI systems, including chatbots. Here’s a detailed explanation of sampling bias within the context of a chatbot system: == Definition == * '''Sampling Bias''': * A type of bias that aris...")
  • 08:53, 3 July 2024 Mr. MacKenty talk contribs created page Sampling (Created page with " ''This answer was supported by a LLM'' '''Sampling''' Sampling is the process of selecting a subset of data from a larger dataset to train machine learning models, including chatbots. The goal of sampling is to create a representative subset that accurately reflects the characteristics of the entire dataset. Here’s a detailed explanation of sampling within the context of a chatbot system: == Definition == * '''Sampling''': * The technique of choosing a smaller, m...")
  • 08:52, 3 July 2024 Mr. MacKenty talk contribs created page Linguistic (Created page with " ''This answer was supported by a LLM'' '''Linguistic Bias''' Linguistic bias refers to prejudice or favoritism toward certain languages, dialects, or ways of speaking within AI systems. This type of bias can impact the performance and fairness of chatbots and other language-based technologies. Here’s a detailed explanation of linguistic bias within the context of a chatbot system: == Definition == * '''Linguistic Bias''': * Bias that occurs when AI systems favor...")
  • 08:51, 3 July 2024 Mr. MacKenty talk contribs created page Labelling (Created page with " ''This answer was supported by a LLM'' '''Labelling Bias''' Labelling bias occurs when the labels assigned to training data reflect subjective judgments, stereotypes, or prejudices, rather than objective truth. This type of bias can significantly impact the performance and fairness of AI systems, including chatbots. Here’s a detailed explanation of labelling bias within the context of a chatbot system: == Definition == * '''Labelling Bias''': * Bias introduced du...")
  • 08:49, 3 July 2024 Mr. MacKenty talk contribs created page Historical (Created page with " ''This answer was supported by a LLM'' Historical bias refers to distortions or inaccuracies in data that arise from historical inequalities, prejudices, or systemic discrimination embedded within the data. This type of bias can significantly affect the training and performance of AI systems, including chatbots. Here’s a detailed explanation of historical bias within the context of a chatbot system: == Definition == * '''Historical Bias''': * Bias in data that re...")
  • 08:48, 3 July 2024 Mr. MacKenty talk contribs created page Confirmation (Created page with " ''This answer was supported by a LLM'' Confirmation, often referred to as confirmation bias, is the tendency to search for, interpret, and remember information in a way that confirms one’s preexisting beliefs or hypotheses. This cognitive bias can significantly affect decision-making processes and the development of AI systems. Here’s a detailed explanation of confirmation within the context of a chatbot system: == Definition == * '''Confirmation Bias''': * Th...")
  • 08:44, 3 July 2024 Mr. MacKenty talk contribs created page Biases (Created page with "This article was created with the help and support of an LLM Bias refers to a systematic error or deviation from true values or fairness that affects data, models, and decision-making processes. In the context of machine learning and artificial intelligence, bias can lead to skewed results and unfair treatment of certain groups. Here’s a detailed explanation of bias within the context of a chatbot system: == Types of Bias == * '''Data Bias''': * Occurs when the tr...")
  • 08:40, 3 July 2024 Mr. MacKenty talk contribs created page Bag-of-words (Created page with "This wiki article was made with the help and support of an LLM '''Bag of Words (BoW)''' Bag of Words (BoW) is a fundamental technique used in natural language processing (NLP) and information retrieval to represent text data in a structured format. It simplifies the text data by treating it as a collection of individual words, disregarding grammar and word order, but keeping track of word frequency. Here’s a detailed explanation of BoW within the context...")
  • 08:38, 3 July 2024 Mr. MacKenty talk contribs created page Backpropagation through time (BPTT) (Created page with "Backpropagation Through Time (BPTT) is an extension of the backpropagation algorithm for training recurrent neural networks (RNNs). RNNs are designed to handle sequential data by maintaining a state that can capture information from previous inputs. However, training RNNs is challenging due to their complex structure and the need to account for dependencies over time. Here’s a detailed explanation of BPTT within the context of a chatbot system: 1. Sequential Nature o...")
  • 08:33, 3 July 2024 Mr. MacKenty talk contribs created page Chat bots (Created page with "right|frame|Case study notes<ref>http://www.flaticon.com/</ref> * Backpropagation through time (BPTT) * Bag-of-words * Biases * Confirmation * Historical * Labelling * Linguistic * Sampling * Selection * Dataset * Deep learning * Graphical processing unit (GPU) * Hyperparameter tuning * Large language model (LLM) * Latency * Long short-term memory (LSTM) * Loss function * Memory...")
  • 08:28, 3 July 2024 Mr. MacKenty talk contribs created page File:D 4 comsc css 2405 1a e.pdf
  • 08:28, 3 July 2024 Mr. MacKenty talk contribs uploaded File:D 4 comsc css 2405 1a e.pdf
  • 08:28, 3 July 2024 Mr. MacKenty talk contribs created page 2025 case study (Created page with "right|frame|Case study<ref>http://www.flaticon.com/</ref> == Introduction == Higher-level students must write 3 papers. The case study is the third paper. Every year the case study discusses a different topic. Students must become '''very familiar''' with the case study. The IB recommends spending about a year studying this case study. This page will help you organize and understand the 2025 case study. == The case study == Media:D 4 c...")
  • 15:07, 29 January 2024 Mr. MacKenty talk contribs deleted page User:Smurpani (content was: "Can't seem to login with the account I made previously... Thanks, Sid", and the only contributor was "Mr. MacKenty" (talk))
  • 15:07, 29 January 2024 Mr. MacKenty talk contribs deleted page User:18barlow d (content was: "I'm pretty cool. I enjoy reading and eating lots of food. Mexican food is some of my favorite. I also love cookies.", and the only contributor was "Mr. MacKenty" (talk))
  • 15:25, 31 May 2023 Mr. MacKenty talk contribs created page File:Parts of AI.png
  • 15:25, 31 May 2023 Mr. MacKenty talk contribs uploaded File:Parts of AI.png
  • 13:32, 30 May 2023 Mr. MacKenty talk contribs created page Tracking (Created page with "In the context of robotics and particularly in systems like Simultaneous Localization and Mapping (SLAM) or Visual SLAM (vSLAM), "tracking" typically refers to the process of continuously estimating the robot's motion and position over time based on its sensor data. Here's how tracking might work in a vSLAM system: # The robot captures a sequence of images with its camera as it moves through the environment. # For each new image, the robot identifies features (distinct...")
  • 13:12, 30 May 2023 Mr. MacKenty talk contribs created page Loop closure (Created page with "Loop closure is an important concept in the field of robotics, particularly in relation to the Simultaneous Localization and Mapping (SLAM) problem. As a robot moves through an environment, it builds a map of the environment and uses that map to estimate its location within it. However, as the robot moves, small errors in its motion estimates can accumulate over time, leading to drift in the estimated trajectory and the map. The idea of loop closure is to correct thi...")
  • 13:11, 30 May 2023 Mr. MacKenty talk contribs created page Local mapping (Created page with "Local mapping is a concept in robotics, particularly in relation to Simultaneous Localization and Mapping (SLAM) and Visual SLAM (vSLAM), where the robot builds a smaller, more immediate map of its surroundings, often referred to as a local map. The idea is to focus computational resources on understanding the robot's immediate surroundings in detail, rather than attempting to map the entire environment at once. This local map is continuously updated as the robot moves...")
  • 13:09, 30 May 2023 Mr. MacKenty talk contribs created page Initialization (Created page with "In the context of robotics and especially in algorithms like Simultaneous Localization and Mapping (SLAM) or Visual SLAM (vSLAM), "Initialization" refers to the process of setting up the initial conditions or starting point for the algorithm. At the start of SLAM or vSLAM, the robot doesn't know anything about its environment or its position within that environment. However, to begin the process of mapping and localization, it needs some kind of initial guess or estimat...")
  • 13:06, 30 May 2023 Mr. MacKenty talk contribs created page Visual simultaneous localization and mapping (vSLAM) modules (Created page with "Visual Simultaneous Localization and Mapping, or vSLAM, is a variant of the general SLAM problem where the primary sensor data comes from a camera or multiple cameras. This technique uses visual information to create a map of the environment while also keeping track of the robot's location within the map. The "modules" in a vSLAM system might refer to the individual components or stages of the vSLAM process. The exact modules can vary depending on the specific vSLAM alg...")
  • 13:05, 30 May 2023 Mr. MacKenty talk contribs created page Sensor fusion model (Created page with "Sensor fusion is a method used in robotics and automation that involves merging data from different sensors to improve the understanding of the environment. This process can reduce uncertainty, improve accuracy, and make the system more robust to failures of individual sensors. A sensor fusion model, then, is a mathematical and computational model that describes how to combine the data from different sensors. Here's an example to illustrate the concept: Imagine you ha...")
  • 13:03, 30 May 2023 Mr. MacKenty talk contribs created page Simultaneous localization and mapping (SLAM) (Created page with "Simultaneous Localization and Mapping, or SLAM, is a computational problem in the field of robotics. As the name implies, it's about doing two things at the same time: # '''Localization''': Determining where a robot is located in an environment. # '''Mapping''': Building a map of that environment. What makes SLAM challenging is that it's a chicken-and-egg problem: to know where you are (localization), you need a map, but to create a map, you need to know where you are....")
  • 12:59, 30 May 2023 Mr. MacKenty talk contribs created page Robot drift (Created page with ""Robot drift" is a term often used in the context of robotics and refers to the accumulated error in a robot's estimated position and orientation over time. This error, or "drift", can occur when a robot is using sensors like wheel encoders or Inertial Measurement Units (IMUs) to estimate its motion. Both these methods involve integrating sensor measurements over time to calculate position, but small errors in these measurements can accumulate, leading to larger and lar...")
  • 12:58, 30 May 2023 Mr. MacKenty talk contribs created page Rigid pose estimation (RPE) (Created page with "Rigid Pose Estimation (RPE) is a concept in computer vision and robotics that involves determining the position and orientation (the "pose") of an object that does not deform or change shape — in other words, a "rigid" object. The term 'rigid' indicates that the distance between any two points on the object remains constant over time, regardless of the object's movement or orientation. In the context of robotics, pose estimation often refers to estimating the pose of...")
  • 12:56, 30 May 2023 Mr. MacKenty talk contribs created page Relocalization (Created page with "Relocalization is a critical concept in robotics, specifically in the context of autonomous navigation and Simultaneous Localization and Mapping (SLAM). It refers to the ability of a robot to determine its current location in a map that it previously built or in a known environment, particularly after it has lost track of its position due to an error, disturbance, or after it has been manually moved (also known as the "kidnapped robot" problem). There are many reasons w...")