Natural Language Processing
Natural language processing (NLP) is a field of computer science, artificial intelligence, and computational linguistics concerned with the interactions between computers and human (natural) languages, and in particular with programming computers to fruitfully process large natural language corpora. [2]
The big ideas in AI
Terms
| Term | Definition |
| --- | --- |
| Text Normalization | Normalizing text means converting it to a more convenient, standard form. [3] |
| Tokenization | A part of text normalization. Given a character sequence and a defined document unit, tokenization is the task of chopping it up into pieces, called tokens, perhaps at the same time throwing away certain characters, such as punctuation. [4] (Both steps are illustrated in the sketch below.) |
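As a rough illustration of the two terms above, the following Python sketch performs a simple form of text normalization (lowercasing and collapsing whitespace) and tokenization (splitting on non-alphanumeric characters while discarding punctuation). The function names, the regular expression, and the example sentence are illustrative assumptions, not methods taken from the cited sources.

```python
import re

def normalize(text: str) -> str:
    """Convert text to a more convenient, standard form:
    lowercase it and collapse runs of whitespace."""
    text = text.lower()
    return re.sub(r"\s+", " ", text).strip()

def tokenize(text: str) -> list[str]:
    """Chop a character sequence into tokens, throwing away
    punctuation (a deliberately simple, assumed scheme)."""
    return re.findall(r"[a-z0-9]+(?:'[a-z]+)?", text)

if __name__ == "__main__":
    raw = "Mr. O'Neill   said:  \"NLP is FUN!\""
    norm = normalize(raw)
    print(norm)            # mr. o'neill said: "nlp is fun!"
    print(tokenize(norm))  # ['mr', "o'neill", 'said', 'nlp', 'is', 'fun']
```

Real tokenizers must handle harder cases (abbreviations, hyphenation, multiword names), but even this regex-based sketch shows how normalization and tokenization compose: text is first put into a standard form, then chopped into tokens.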
Standards
References
[[Category:Artificial Intelligence]]