Want to create a fully automated news website? This tutorial will teach you how to:
✅ Scrape & rank trending news using AI
✅ Fetch full articles (headings, images, links)
✅ Host your API for free on Vercel
✅ Load API keys securely from .env
Let's build your AI-powered news API from scratch!
First, install the required Python libraries:
pip install flask requests newspaper3k beautifulsoup4 python-dotenv nltk textblob
We'll use:
Flask → Create the API
Requests → Fetch news from NewsAPI
Newspaper3k & BeautifulSoup → Scrape full articles
TextBlob & NLTK → Perform sentiment analysis
python-dotenv → Load API keys securely
1️⃣ Go to https://newsapi.org/
2️⃣ Sign up & get your API key
Now, let's store it securely in a .env file.
Create a .env file in your project folder:
NEWS_API_KEY=your_news_api_key
Now, modify your Flask app to load this key securely:
import os
from dotenv import load_dotenv
# Load API Key from .env
load_dotenv()
NEWS_API_KEY = os.getenv("NEWS_API_KEY")
✅ Now, your API key isn't exposed in the code!
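If the key is missing (say, the .env file was never created or has a typo), the NewsAPI calls below will quietly return errors, so it can help to fail fast. An optional check you could add right after loading the key (a small sketch, not part of the original app):

if not NEWS_API_KEY:
    raise RuntimeError("NEWS_API_KEY is not set - add it to your .env file")

With the key loading handled, here's the complete app.py: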
from flask import Flask, jsonify, request
import requests
import json
from bs4 import BeautifulSoup
from datetime import datetime
import nltk
from textblob import TextBlob
from dotenv import load_dotenv
import os
nltk.download('punkt')
app = Flask(__name__)
# Load API Key
load_dotenv()
NEWS_API_KEY = os.getenv("NEWS_API_KEY")
# Function to fetch news
def fetch_news():
    url = f"https://newsapi.org/v2/top-headlines?country=us&category=general&apiKey={NEWS_API_KEY}"
    response = requests.get(url)
    data = response.json()
    return data.get("articles", [])

# Function to analyze sentiment
def analyze_sentiment(text):
    analysis = TextBlob(text)
    return analysis.sentiment.polarity  # Returns score from -1 to 1

# Function to scrape full article content
def get_full_article(url):
    try:
        response = requests.get(url, headers={"User-Agent": "Mozilla/5.0"})
        soup = BeautifulSoup(response.content, "html.parser")

        # Extract Title
        title = soup.find("h1").get_text() if soup.find("h1") else ""

        # Extract Headings
        headings = [h.get_text() for h in soup.find_all(["h2", "h3"])]

        # Extract Paragraphs
        paragraphs = [p.get_text() for p in soup.find_all("p")]

        # Extract Images
        images = [img["src"] for img in soup.find_all("img") if "src" in img.attrs]

        # Extract Links
        links = [a["href"] for a in soup.find_all("a", href=True)]

        # Construct structured article
        full_article = {
            "title": title,
            "headings": headings,
            "content": "\n".join(paragraphs),
            "images": images,
            "links": links
        }
        return full_article
    except Exception as e:
        return {"error": f"Failed to scrape article: {str(e)}"}

# API to Get Top Ranked News
@app.route('/api/news', methods=['GET'])
def get_news():
    articles = fetch_news()
    ranked_news = sorted(articles, key=lambda x: analyze_sentiment(x["title"]), reverse=True)
    return jsonify(ranked_news[:5])  # Return top 5 news articles

# API to Fetch Full Article
@app.route('/api/full-article', methods=['GET'])
def get_full_news():
    url = request.args.get("url")
    if not url:
        return jsonify({"error": "URL parameter is required"}), 400
    full_article = get_full_article(url)
    return jsonify(full_article)

# Run Flask App
if __name__ == '__main__':
    app.run(debug=True)
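A note on the ranking: /api/news sorts headlines by TextBlob's polarity score, which ranges from -1 (negative) to 1 (positive), so upbeat titles float to the top. You can get a feel for the scores in a Python shell; the exact numbers depend on TextBlob's built-in model, so treat them as illustrative:

from textblob import TextBlob

print(TextBlob("Great earnings lift stocks to new highs").sentiment.polarity)      # positive score
print(TextBlob("Terrible flooding devastates coastal towns").sentiment.polarity)   # negative score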
Run the Flask app:
python app.py
Now, visit:
Top news: http://127.0.0.1:5000/api/news
Full article: http://127.0.0.1:5000/api/full-article?url=ARTICLE_URL
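You can also test the endpoints from Python. A quick sketch using requests, assuming the server from the previous step is running and NewsAPI returned at least one article (the url and title fields come from NewsAPI's response format):

import requests

# Top 5 ranked headlines
top = requests.get("http://127.0.0.1:5000/api/news").json()
print([article["title"] for article in top])

# Scrape the full version of the first headline's article
article_url = top[0]["url"]
full = requests.get(
    "http://127.0.0.1:5000/api/full-article",
    params={"url": article_url},
).json()
print(full["title"])
print(len(full["images"]), "images,", len(full["links"]), "links")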
Now, let's deploy our API on Vercel so anyone can access it online!
First, install the Vercel CLI:
npm install -g vercel
Create a vercel.json file in your project:
{
"version": 2,
"builds": [{ "src": "app.py", "use": "@vercel/python" }],
"routes": [{ "src": "/(.*)", "dest": "app.py" }]
}
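One thing to check before deploying: Vercel's Python builder installs dependencies from a requirements.txt in the project root, so without one the deployed app will be missing Flask and friends. A minimal version mirroring the pip install command above (pin versions if you want reproducible builds):

flask
requests
newspaper3k
beautifulsoup4
python-dotenv
nltk
textblob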
1️⃣ Log in to Vercel:
vercel login
2️⃣ Deploy your Flask API:
vercel
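One caveat: your .env file stays on your machine, so the deployed app won't see NEWS_API_KEY unless you also set it on Vercel. You can add it as an environment variable in the project settings on the Vercel dashboard, or via the CLI (the prompts may vary slightly by CLI version):

vercel env add NEWS_API_KEY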
✅ Your API is now live on a URL like:
https://your-news-api.vercel.app/api/news
Now you have a fully automated AI-powered news API!
✅ Fetches trending news
✅ Extracts full articles (headings, images, links)
✅ Loads API key securely from .env
✅ Deployed for free on Vercel!
Add email newsletters → Send your top stories to subscribers
Integrate with Next.js → Build a full news website
Use AI summarization → Auto-generate article highlights (a sketch follows below)
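For the summarization idea, newspaper3k (which we installed at the start but haven't used yet) can produce extractive summaries on its own. A minimal sketch; summarize_article is a hypothetical helper, not part of the app above, and it relies on the punkt tokenizer we already downloaded:

from newspaper import Article

def summarize_article(url):
    article = Article(url)
    article.download()   # fetch the raw HTML
    article.parse()      # extract title, text, images
    article.nlp()        # build keywords and an extractive summary (uses nltk punkt)
    return {"title": article.title, "summary": article.summary, "keywords": article.keywords}

You could expose this as another Flask route (for example /api/summary) the same way /api/full-article is wired up.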
Want help integrating this into a Next.js frontend? Let me know!
Programmer | Love Traveling | Enjoy Cooking | Building cool tech and exploring the world!