Chat bots
[[file:Studying.png|right|frame|Case study notes<ref>http://www.flaticon.com/</ref>]]
'''Learn this first, in this order:'''
* [[Neural networks]] | |||
* [[Recurrent neural network (RNN)]] | |||
* [[Large language model (LLM)]] | |||
* [[Hyperparameter tuning]] | |||
* [[Backpropagation through time (BPTT)]]
* [[Deep learning]]
* [[Transformer neural network (transformer NN)]]
'''Then this:'''
* [[Biases]]
** [[Confirmation]]
** [[Sampling]]
** [[Selection]]
* [[Bag-of-words]]
* [[Dataset]]
* [[Graphical processing unit (GPU)]]
* [[Latency]]
* [[Long short-term memory (LSTM)]]
* [[Loss function]]
* [[Memory cell state]]
* [[Natural language processing]]
** [[Discourse integration]]
** [[Lexical analysis]]
** [[Pragmatic analysis]]
** [[Semantic analysis]]
** [[Syntactical analysis (parsing)]]
* [[Natural language understanding (NLU)]]
* [[Pre-processing]]
* [[Self-attention mechanism]]
* [[Synthetic data]]
* [[Tensor processing unit (TPU)]]
* [[Vanishing gradient]]
* [[Weights]]