All public logs

Combined display of all available logs of Computer Science Wiki. You can narrow down the view by selecting a log type, the username (case-sensitive), or the affected page (also case-sensitive).

Logs
  • 07:17, 25 July 2024 Mr. MacKenty talk contribs deleted page Multi-layer perceptron (MLP) (content was: "<center> <blockquote style="padding: 5px; background-color: #FFF8DC; border: solid thin gray;"> File:Exclamation.png This is student work which has not yet been approved as correct by the instructor </blockquote> </center> right|frame|Case study notes<ref>http://www.flaticon.com/</ref> == Introduction == Multi-layer perceptrons are simply a type of neural network consisting of at least 3 nodes. The input nodes, the hidden nodes, and the output no...")
  • 07:16, 25 July 2024 Mr. MacKenty talk contribs created page Weights (Created page with "''This article was created with the support of an LLM'' Weights are a fundamental component of neural networks, representing the parameters that the model learns during training. In the context of chatbots, weights determine how the input data is transformed as it passes through the network layers, ultimately influencing the chatbot's ability to understand and generate human language. === Importance of Weights === Weights are crucial for: * Learning from data during t...")
  • 07:14, 25 July 2024 Mr. MacKenty talk contribs created page Vanishing gradient (Created page with "''This article was created with the support of an LLM'' The vanishing gradient problem is a significant issue in training deep neural networks, particularly recurrent neural networks (RNNs). It occurs when gradients used to update the weights during backpropagation become exceedingly small, effectively preventing the network from learning. This problem can severely impact the performance of chatbots by hindering the training of deep models. === Importance of Addressing...")
  • 07:13, 25 July 2024 Mr. MacKenty talk contribs created page Transformer neural network (transformer NN) (Created page with "''This article was created with the support of an LLM'' Transformer Neural Networks (Transformer NNs) are a type of neural network architecture designed for handling sequential data. They are particularly effective for natural language processing (NLP) tasks and have revolutionized the development of chatbots by providing a powerful mechanism for understanding and generating human language. === Importance of Transformer NNs === Transformer NNs are crucial for: * Captu...")
  • 07:11, 25 July 2024 Mr. MacKenty talk contribs created page Tensor processing unit (TPU) (Created page with "''This article was created with the support of an LLM'' A Tensor Processing Unit (TPU) is a specialized hardware accelerator designed by Google specifically for machine learning tasks, particularly neural network computations. TPUs are optimized for large-scale training and inference of deep learning models, making them highly effective for powering advanced chatbots. === Importance of TPUs === TPUs are crucial for: * Accelerating the training and inference of large n...")
  • 07:04, 25 July 2024 Mr. MacKenty talk contribs created page Synthetic data (Created page with "''This article was created with support from an LLM'' Synthetic data refers to artificially generated data that mimics real-world data. In the context of chatbots, synthetic data is used to train and test models, particularly when real data is scarce, sensitive, or expensive to obtain. Synthetic data can enhance the chatbot's performance by providing diverse and extensive training examples. === Importance of Synthetic Data === Synthetic data is crucial for: * Expandin...")