Sentiment_Analysis_using_BERT

Analyzing and understanding users' or customers' emotions using the pre-trained model BERT (Bidirectional Encoder Representations from Transformers). Each entry in the dataset (reviews, feedback, etc.) is rated on a scale of 1-5, where 1 stands for the worst and 5 for the best. The more insight we have into our users/customers, the better we can personalize and customize our products for the end users! A minimal scoring sketch is shown below.
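As a concrete illustration, here is a minimal scoring sketch, assuming the publicly available nlptown/bert-base-multilingual-uncased-sentiment checkpoint from the Hugging Face hub (a BERT model fine-tuned for 1-5 review ratings; not necessarily the exact checkpoint used in this repository):

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL = "nlptown/bert-base-multilingual-uncased-sentiment"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)

def score_review(text):
    # Tokenize the review and run it through the classifier.
    tokens = tokenizer.encode(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(tokens).logits
    # The model outputs 5 classes (indices 0-4); add 1 to map onto the 1-5 scale.
    return int(torch.argmax(logits)) + 1

print(score_review("Great product, but shipping took forever."))  # prints a rating between 1 and 5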

What is BERT?

BERT stands for Bidirectional Encoder Representations from Transformers. It is a state-of-the-art language model for NLP: a transformer-based machine learning technique for natural language processing pre-training developed by Google. Yes, it is a pre-trained model, developed in 2018 by Jacob Devlin and his colleagues at Google. For more details, check out the reference below. Ref. BERT Model

What is a transformer?

The Transformer is the first transduction model relying entirely on self-attention to compute representations of its input and output without using sequence-aligned RNNs or convolution.
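To make "self-attention" concrete, here is a minimal PyTorch sketch of scaled dot-product self-attention (illustrative only; the projection matrices here are random, whereas BERT learns them during pre-training):

import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    # x: (seq_len, d_model); every token attends to every other token.
    q, k, v = x @ w_q, x @ w_k, x @ w_v        # project tokens into queries/keys/values
    scores = q @ k.T / (k.shape[-1] ** 0.5)    # pairwise similarity, scaled by sqrt(d_k)
    weights = F.softmax(scores, dim=-1)        # each row of weights sums to 1
    return weights @ v                         # weighted mix of the value vectors

d_model = 8
x = torch.randn(5, d_model)                    # 5 tokens with 8-dim embeddings
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # torch.Size([5, 8])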

Packages:

pip install transformers
pip install torch
pip install requests
pip install numpy
pip install pandas
pip install beautifulsoup4   # for web scraping

Note: re (regular expressions) ships with Python's standard library, so it does not need to be installed.

Datasets:

We can collect the data either by web scraping, or we can use freely available datasets online. A hypothetical scraping sketch follows.

License

MIT
