tokenization
In text mining or full-text search, the process of splitting a string into meaningful units (tokens), such as words, morphemes, or stems, so that related tokens can be grouped. For example, although "San Francisco" is two words, it can be treated as a single token.
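As a minimal sketch of the idea, the following Python function splits text at word boundaries while keeping a (hypothetical, illustrative) list of multi-word phrases such as "San Francisco" as single tokens:

```python
import re

# Hypothetical list of multi-word expressions treated as single tokens.
PHRASES = ["San Francisco", "New York"]

def tokenize(text):
    """Split text at word boundaries, keeping known phrases as one token."""
    # Try the known phrases first, then fall back to individual words.
    phrase_pattern = "|".join(re.escape(p) for p in PHRASES)
    pattern = re.compile(rf"(?:{phrase_pattern})|\w+")
    return [m.group(0) for m in pattern.finditer(text)]

print(tokenize("I moved to San Francisco last year."))
# ['I', 'moved', 'to', 'San Francisco', 'last', 'year']
```

Real tokenizers in search engines go further, for example stemming tokens or splitting on morpheme boundaries, but the grouping principle is the same.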
- Part of speech: noun
- Industry/Domain: Software
- Category: Operating systems
- Company: Microsoft
Creator
- Maxiao