Computing the Posterior Probability Using Bayes' Theorem
yjyuwisely 2023. 9. 11. 22:24

To determine P(J∣F,I), the posterior probability that the speaker is Jill Stein given that the words 'freedom' and 'immigration' were spoken, we apply Bayes' Theorem (treating the two words as conditionally independent given the speaker, the Naive Bayes assumption):

P(J∣F,I) = P(J) × P(F∣J) × P(I∣J) / P(F,I)
Where:
- P(J) is the prior probability (the overall likelihood of Jill Stein giving a speech). In our case, P(J)=0.5.
- P(F∣J) and P(I∣J) are the likelihoods. These represent the probabilities of Jill Stein saying the words 'freedom' and 'immigration' respectively. Both are given as 0.1.
- P(F,I) is the evidence or marginal likelihood. This is the overall probability of the words 'freedom' and 'immigration' being said, regardless of the speaker.
It can be computed as:
P(F,I) = P(J) × P(F∣J) × P(I∣J) + P(G) × P(F∣G) × P(I∣G)

where P(G), P(F∣G), and P(I∣G) are the corresponding prior and likelihoods for the other speaker, G.
Plugging in the numbers for Jill Stein:

P(J∣F,I) = 0.5 × 0.1 × 0.1 / P(F,I) = 0.005 / P(F,I)
After calculating P(F,I) from the formula above, insert its value into the equation to get P(J∣F,I).
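As a concrete illustration, here is a minimal Python sketch of the computation above. This post does not give the prior and likelihoods for the other speaker, so the values marked as placeholders below are assumptions chosen only so the example runs end to end.

```python
# Prior and likelihoods for Jill Stein (J), as given above
p_j = 0.5          # P(J)
p_f_given_j = 0.1  # P(F|J)
p_i_given_j = 0.1  # P(I|J)

# Assumed prior and likelihoods for the other speaker (G) -- placeholders,
# not values from this post
p_g = 0.5          # P(G)
p_f_given_g = 0.7  # P(F|G)
p_i_given_g = 0.2  # P(I|G)

# Evidence: P(F,I) = P(J)P(F|J)P(I|J) + P(G)P(F|G)P(I|G)
p_fi = p_j * p_f_given_j * p_i_given_j + p_g * p_f_given_g * p_i_given_g

# Posterior: P(J|F,I) = P(J)P(F|J)P(I|J) / P(F,I)
p_j_given_fi = (p_j * p_f_given_j * p_i_given_j) / p_fi

print(f"P(F,I)   = {p_fi:.4f}")
print(f"P(J|F,I) = {p_j_given_fi:.4f}")
```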
This approach, rooted in Bayes' Theorem, provides the foundation for the Naive Bayes classifier in the realm of Natural Language Processing.
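In practice, this is exactly the computation a Naive Bayes classifier performs over word counts. A minimal sketch with scikit-learn, assuming it is installed; the training texts, labels, and query below are made-up placeholders, not data from this post:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Placeholder "speeches" for the two speakers
texts = [
    "freedom immigration environment",   # speaker J
    "jobs economy immigration taxes",    # speaker G
]
labels = ["J", "G"]

# Turn the texts into word-count vectors
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)

# MultinomialNB applies Bayes' rule with the naive independence
# assumption over word counts (plus smoothing)
clf = MultinomialNB()
clf.fit(X, labels)

# Posterior probabilities for a query containing 'freedom' and 'immigration'
query = vectorizer.transform(["freedom immigration"])
print(dict(zip(clf.classes_, clf.predict_proba(query)[0])))
```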