Chat bots
Latest revision as of 09:04, 3 July 2024
- Backpropagation through time (BPTT)
- Bag-of-words
- Biases
  - Confirmation
  - Historical
  - Labelling
  - Linguistic
  - Sampling
  - Selection
- Dataset
- Deep learning
- Graphical processing unit (GPU)
- Hyperparameter tuning
- Large language model (LLM)
- Latency
- Long short-term memory (LSTM)
- Loss function
- Memory cell state
- Natural language processing
  - Discourse integration
  - Lexical analysis
  - Pragmatic analysis
  - Semantic analysis
  - Syntactical analysis (parsing)
- Natural language understanding (NLU)
- Pre-processing
- Recurrent neural network (RNN)
- Self-attention mechanism
- Synthetic data
- Tensor processing unit (TPU)
- Transformer neural network (transformer NN)
- Vanishing gradient
- Weights
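Several of these terms are easiest to grasp with a concrete example. As an illustrative sketch (not part of the case study material itself), a minimal bag-of-words representation can be built in plain Python: each text becomes a vector of word counts over a shared vocabulary, and word order is discarded entirely, which is the defining trait of the model.

```python
from collections import Counter

def bag_of_words(texts):
    """Build a shared vocabulary and count-vectors for a list of texts.

    Each text is lowercased and split on whitespace; every text then
    becomes a vector of word counts over the combined vocabulary.
    """
    tokenized = [text.lower().split() for text in texts]
    # Sorted vocabulary so vector positions are deterministic
    vocab = sorted({word for tokens in tokenized for word in tokens})
    vectors = [
        [Counter(tokens)[word] for word in vocab]
        for tokens in tokenized
    ]
    return vocab, vectors

vocab, vectors = bag_of_words(["the bot greets the user",
                               "the user greets back"])
# vocab:   ['back', 'bot', 'greets', 'the', 'user']
# vectors: [[0, 1, 1, 2, 1], [1, 0, 1, 1, 1]]
```

Real systems use tokenizers and sparse matrices rather than whitespace splitting and Python lists, but the core idea — counting words while throwing away their order — is exactly this.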