How much of me is expressed through the data?

"However, in some cases, the stemming process produces words that are not correct spellings of the root word," e.g. "happi" and "sunni". That's because it chooses the most common stem for related words.

1. In terms of probabilities? (I meant to say observed cases.)

#Copyright: This screenshot is part of Coursera's Natural Language Processing Specialization by DeepLearning.AI.
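A minimal sketch using NLTK's PorterStemmer (an assumption on my part; the course may use a different stemmer) reproduces the "happi"/"sunni" behavior, where related words collapse to one stem that is not a dictionary word:

```python
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
for word in ["happy", "happiness", "sunny"]:
    # The Porter algorithm rewrites a trailing "y" as "i" and strips "-ness",
    # so "happy" and "happiness" share the stem "happi", and "sunny" -> "sunni".
    print(f"{word} -> {stemmer.stem(word)}")
```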

Express the level of emotion, whether positive or negative, through the intensity of a color feature, e.g. positive: light green -> green -> dark green; negative: light red -> red -> dark red.

#Copyright: This screenshot is part of Coursera's Natural Language Processing Specialization by DeepLearning.AI.
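As a minimal sketch of this idea (my own illustration, not the course's code), assume hypothetical sentiment scores in [-1, 1]: the sign picks green vs. red, and the magnitude drives how dark the shade is. Matplotlib's built-in Greens/Reds colormaps already run from light to dark, so they give the progression directly:

```python
import matplotlib.pyplot as plt

# Hypothetical sentences with made-up sentiment scores in [-1, 1].
samples = {"great movie": 0.9, "pretty good": 0.4, "meh": -0.2, "terrible": -0.9}

def emotion_color(score):
    """Map a sentiment score to a color: greens for positive, reds for negative.
    The stronger the emotion (larger |score|), the darker the shade."""
    strength = abs(score)
    cmap = plt.cm.Greens if score >= 0 else plt.cm.Reds
    # Offset so weak scores still show a visible light green/red rather than white.
    return cmap(0.3 + 0.7 * strength)

fig, ax = plt.subplots()
ax.bar(range(len(samples)),
       [abs(s) for s in samples.values()],
       color=[emotion_color(s) for s in samples.values()],
       tick_label=list(samples.keys()))
ax.set_ylabel("emotion strength |score|")
plt.show()
```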