Analyzing and understanding user/customer emotions with the pre-trained BERT (Bidirectional Encoder Representations from Transformers) model. Each item in the dataset (reviews, feedback, etc.) is rated on a scale of 1-5, where 1 stands for the worst and 5 for the best. The more insight we have into our users/customers, the better we can personalize and customize our products for them.
BERT (Bidirectional Encoder Representations from Transformers) is a state-of-the-art language model for NLP. It is a transformer-based machine learning technique for NLP pre-training, developed at Google and published in 2018 by Jacob Devlin and his colleagues. For more details, check out the reference below. Ref. BERT Model
The Transformer is the first transduction model relying entirely on self-attention to compute representations of its input and output without using sequence-aligned RNNs or convolution.
pip install transformers
pip install torch
pip install requests
pip install numpy
pip install pandas
pip install beautifulsoup4 #for web scraping; imported as bs4 (the re module ships with Python and needs no install)
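With the packages installed, rating a review on the 1-5 scale can be sketched as below. This assumes the publicly available `nlptown/bert-base-multilingual-uncased-sentiment` checkpoint, a BERT model fine-tuned to predict star ratings from 1 to 5; any similarly fine-tuned sentiment model would work the same way. The demo is gated behind an environment variable so the model weights are only downloaded on demand.

```python
import os
import torch

def rating_from_logits(logits):
    """Map the model's 5 output logits to a 1-5 star rating
    (index of the highest logit, shifted to start at 1)."""
    return int(torch.argmax(torch.as_tensor(logits))) + 1

# Set RUN_DEMO=1 to download the (assumed) checkpoint and score a review.
if os.getenv("RUN_DEMO"):
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    model_name = "nlptown/bert-base-multilingual-uncased-sentiment"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name)

    review = "The food was great but the service was painfully slow."
    tokens = tokenizer.encode(review, return_tensors="pt")
    with torch.no_grad():
        logits = model(tokens).logits
    print(rating_from_logits(logits[0]))  # rating on the 1-5 scale
```

The helper keeps the logits-to-rating mapping separate from the model call, so the same function works whether the logits come from a single review or a batch.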
We can collect the data either by web scraping or by using freely available datasets online.
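For the web-scraping route, a minimal sketch with BeautifulSoup is shown below. The `"comment"` CSS-class pattern and the sample HTML are placeholders: adjust the tag and regex to match the markup of the site you actually scrape (for a live page you would fetch the HTML first, e.g. with `requests.get(url).text`).

```python
import re
from bs4 import BeautifulSoup  # installed as beautifulsoup4

def extract_reviews(html):
    """Return the text of every <p> tag whose class mentions 'comment'.
    The class pattern is a placeholder; match it to your target site."""
    soup = BeautifulSoup(html, "html.parser")
    tags = soup.find_all("p", {"class": re.compile(".*comment.*")})
    return [t.text.strip() for t in tags]

# Placeholder HTML standing in for a fetched page.
sample = """
<html><body>
  <p class="comment">Great product, five stars!</p>
  <p class="user-comment">Arrived late.</p>
  <p class="intro">Not a review.</p>
</body></html>
"""
print(extract_reviews(sample))  # ['Great product, five stars!', 'Arrived late.']
```

The extracted strings can then be fed straight into the rating model, or saved to a pandas DataFrame for labeling the whole dataset.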