A tool for condensing large volumes of text into concise summaries.
BART is a transformer model that combines a bidirectional encoder (similar to BERT) with an autoregressive decoder (similar to GPT). It is pre-trained by corrupting text with a noise function and learning to reconstruct the original. BART excels at text generation tasks (e.g., summarization, translation) and also performs well on comprehension tasks (e.g., text classification, question answering). The version used here has been fine-tuned on the CNN/Daily Mail dataset, optimizing it for generating summaries from news-style text.
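As a rough sketch of how such a summarizer can be driven, the snippet below calls the Hugging Face Inference API for the `facebook/bart-large-cnn` model. The model name and request shape follow the public Hugging Face API; the project's actual `index.js` may be structured differently, and `buildRequest`/`summarize` are illustrative names, not functions from this repository.

```javascript
// Sketch: summarize text via the Hugging Face Inference API using
// the fine-tuned BART model (facebook/bart-large-cnn). Requires Node 18+
// for the built-in fetch. Function names here are illustrative.
const MODEL = "facebook/bart-large-cnn";
const API_URL = `https://api-inference.huggingface.co/models/${MODEL}`;

// Build the HTTP request for the Inference API.
function buildRequest(text, apiKey) {
  return {
    url: API_URL,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        inputs: text,
        parameters: { max_length: 130, min_length: 30 },
      }),
    },
  };
}

// Send the request and return the generated summary string.
async function summarize(text, apiKey) {
  const { url, options } = buildRequest(text, apiKey);
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`Hugging Face API error: ${res.status}`);
  const data = await res.json();
  return data[0].summary_text;
}

// Usage (needs a real API key and network access):
// summarize("Long article text...", process.env.API_KEY).then(console.log);
```

The heavy lifting happens server-side at Hugging Face, so the client only needs to send the raw text and an `Authorization` header with the API key.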
To run this project, you will need to add the following environment variable to your .env file:
- `API_KEY` (your Hugging Face API key)
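A minimal .env file would look like this (the value shown is a placeholder, not a real key):

```env
API_KEY=your_hugging_face_api_key
```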
Clone the project
git clone https://github.com/lohithgsk/automated-text-summarizer.git
Go to the project directory
cd automated-text-summarizer
Install dependencies
npm install
Start the server
node index.js
If you have any feedback, please reach out to us.