AI-Powered Threat Intelligence Platform for Ransomware Monitoring
Powered by Machine Learning & Real-time Dark Web Analysis
Features • Installation • Quick Start • Monitoring • API • Frontend
Dragons Eye is the umbrella name for threat intelligence tools developed by Dragons Community. This Ransomware Tracker is a powerful, open-source platform designed to monitor, track, and analyze ransomware leak sites across the dark web.
Built for security researchers, threat analysts, and SOC teams, Dragons Eye provides:
- Automated scraping every 30 minutes
- Protection page bypass with retry logic
- Modern web dashboard for visualization
- AI-powered enrichment for victim data
- Real-time statistics and analytics
**Disclaimer:** This tool is for research and educational purposes only. Developed and maintained by Dragons Community members.
You can use Dragons Eye as a standalone CLI tool without the frontend. Perfect for:
- Security researchers
- Automated threat intel pipelines
- Data collection scripts
- Integration with other tools
```bash
# 1. Clone and setup
git clone https://github.com/dragons-community/DragonsEye-RansomwareTracker.git
cd DragonsEye-RansomwareTracker

# 2. Set up the Python environment
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt

# 3. Install Playwright browsers
playwright install firefox

# 4. Configure environment
cp env.example .env
nano .env  # Add your settings

# 5. Start Tor (required for .onion sites)
# macOS: brew services start tor
# Linux: sudo systemctl start tor
```

```bash
# Scrape all groups
python3 bin/scrape.py --all

# Scrape a specific group
python3 bin/scrape.py --group lockbit3

# Parse scraped data
python3 bin/parse.py --all

# Check system status
python3 bin/status.py

# Full update (scrape + parse)
python3 bin/scrape.py --all && python3 bin/parse.py --all

# Export data
cat db/victims.json | jq '.[] | select(.group_name=="lockbit3")'
```

```bash
# Start the API server for REST access
python3 api/main.py

# API endpoints:
# GET http://localhost:8000/api/v1/victims
# GET http://localhost:8000/api/v1/groups
# GET http://localhost:8000/api/v1/statistics
```

| File | Description |
|---|---|
| `db/victims.json` | All victim records |
| `db/groups.json` | Group configurations |
| `db/decryptors.json` | Available decryptors |
| `db/ransom_notes.json` | Ransom notes collection |
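The `jq` export above can also be done from Python, which is handier inside automated pipelines. A minimal sketch — the `group_name` field matches the `jq` filter above, but the sample records here are illustrative, not the full victim schema:

```python
import json

def victims_for_group(victims, group_name):
    """Return every victim record claimed by the given ransomware group."""
    return [v for v in victims if v.get('group_name') == group_name]

# Against the real database this would be:
#   with open('db/victims.json') as f:
#       victims = json.load(f)
victims = [
    {'victim': 'Acme Corp', 'group_name': 'lockbit3'},
    {'victim': 'Globex', 'group_name': 'qilin'},
]
print([v['victim'] for v in victims_for_group(victims, 'lockbit3')])  # ['Acme Corp']
```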
| Feature | Description |
|---|---|
| Automated Scraping | Scrape 300+ ransomware leak sites including .onion domains via Tor |
| Auto-Update Scheduler | Automatic scrape + parse every 30 minutes |
| Protection Bypass | Intelligent retry with DDoS/Captcha detection |
| Modern Dashboard | Next.js frontend with interactive world map |
| REST API | FastAPI backend with comprehensive endpoints |
| Screenshot Capture | Automatic screenshots with watermarking |
| AI Enrichment | OpenAI/LM Studio integration for victim profiling |
| HTTP Fingerprinting | Server identification and security header analysis |
| Emoji Logging | Clear, visual log output for easy monitoring |
```
DragonsEye-RansomwareTracker/
│
├── api/                       # FastAPI Backend
│   └── main.py                # API server with auto-scheduler
│
├── bin/                       # Core Python Scripts
│   ├── _parsers/              # Individual group parsers (109)
│   ├── scrape.py              # Main scraping engine
│   ├── parse.py               # Data parsing orchestrator
│   ├── status.py              # System status monitor
│   ├── manage.py              # CLI management tool
│   ├── shared_utils.py        # Shared utilities
│   ├── libcapture.py          # Screenshot utilities
│   ├── enrich_existing.py     # AI enrichment script
│   └── fetch_ransom_notes.py  # Ransom notes fetcher
│
├── db/                        # JSON Databases
│   ├── victims.json           # Victim records (24,000+)
│   ├── groups.json            # Group configurations (306)
│   ├── decryptors.json        # Decryptor tools
│   ├── ransom_notes.json      # Ransom notes collection
│   └── negotiations_data.json # Negotiation chats
│
├── images/                    # Static Assets
│   ├── groups/                # Group screenshots & logos
│   └── victims/               # Victim page screenshots
│
├── logs/                      # Log Files
│   └── update_latest.log      # Latest update log
│
├── tmp/                       # Temporary/Cache Files
│   ├── *.html                 # Scraped HTML files
│   └── scheduler_status.json  # Scheduler status
│
├── env.example                # Environment template (copy to .env)
└── requirements.txt           # Python dependencies
```
- Python 3.9+
- Node.js 18+ (for frontend)
- Tor service running locally
- Playwright browsers installed
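Because scraping .onion sites fails without a running Tor service, it is worth checking that the SOCKS port is reachable before a run. A stdlib-only sketch; `9050` is Tor's default SOCKS port (matching the `TOR_PROXY_SERVER` value in the env template), and the helper itself is illustrative, not part of the repo:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Typical pre-flight check before scraping through Tor:
if not port_open('127.0.0.1', 9050):
    print('Tor SOCKS port not reachable - start the tor service first')
```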
```bash
# 1. Clone the repository
git clone https://github.com/dragons-community/DragonsEye-RansomwareTracker.git
cd DragonsEye-RansomwareTracker

# 2. Create Python virtual environment
python3 -m venv venv
source venv/bin/activate

# 3. Install Python dependencies
pip install -r requirements.txt

# 4. Install Playwright browsers
playwright install firefox chromium

# 5. Install frontend dependencies
cd frontend
npm install
cd ..

# 6. Configure environment
cp env.example .env
nano .env  # Edit with your settings
```

```bash
# Dragons Core Configuration
DRAGONS_HOME=/path/to/DragonsEye-RansomwareTracker
DB_DIR=/db
IMAGES_DIR=/images
TMP_DIR=/tmp

# Tor Configuration
TOR_PROXY_SERVER=socks5://127.0.0.1:9050

# AI Enrichment (Optional)
OPENAI_API_KEY=sk-your-openai-key
# Or for local LM Studio:
OPENAI_BASE_URL=http://localhost:1234/v1
```

```bash
# Terminal 1: Start API (includes auto-scheduler)
python3 api/main.py

# Terminal 2: Start Frontend
cd frontend && npm run dev
```

That's it!
- API runs on: http://localhost:8000
- Frontend runs on: http://localhost:3000
- Auto-update runs every 30 minutes
```bash
# Quick status
python3 bin/status.py

# Verbose status (with recent victims)
python3 bin/status.py -v

# Live monitoring (refreshes every 10s)
python3 bin/status.py --watch
```

Example output:
```
======================================================================
DRAGONS EYE - SYSTEM STATUS
======================================================================
Check time: 2026-01-09 02:56:44

API STATUS:
  API Running
  Data Freshness: fresh
  Victims Age: 1h 1m
  Scheduler: idle

HTML FILES:
  Total: 176 files
  Real Data: 152 files
  Protection Page: 24 files
  Success Rate: 86.4%

DATABASE:
  Total Victims: 24,765
  Added Today: 10
  Total Groups: 306 (71 active)
======================================================================
```
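The success rate in the status output is simply real-data files over total scraped files, so the numbers above can be sanity-checked by hand:

```python
# Counts taken from the example status output above
real_data, protection_pages, total = 152, 24, 176
success_rate = round(real_data / total * 100, 1)
print(f'Success Rate: {success_rate}%')  # Success Rate: 86.4%
```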
```bash
# Follow update logs
tail -f logs/update_latest.log

# Follow API logs
tail -f logs/api.log
```

```bash
cd bin

# Scrape all groups (verbose)
python3 scrape.py -V

# Scrape specific group
python3 scrape.py -G qilin -V

# Force scrape (bypass enabled flag)
python3 scrape.py -B -V
```

Example scrape output:
```
[02:56:44] [qilin] Scraping http://ijzn3si...
[02:57:30] [qilin] OK (78KB) - Qilin blog
[02:57:35] [clop] DDoS Protection - bypass failed
[02:57:40] [anubis] Timeout - http://om6q4a...
[02:57:45] [lockbit5] Attempt 2 failed, retrying...

SCRAPE RESULT SUMMARY
============================================================
Success:   152
Protected:  24
Timeout:     8
Error:       3
Skipped:    45
============================================================
```
```bash
cd bin

# Parse all groups
python3 parse.py

# Parse specific group
python3 parse.py -G lockbit3

# Force parse (remove lock)
python3 parse.py -F
```

```bash
cd bin

# Enrich existing victims (activity/sector only)
python3 enrich_activity_only.py --limit 100

# Full enrichment
python3 enrich_existing.py --limit 50
```

Base URL: `http://localhost:8000/api/v1`
| Endpoint | Description |
|---|---|
| `GET /victims` | List victims (paginated) |
| `GET /victims/{id}` | Get victim by ID |
| `GET /groups` | List all groups |
| `GET /groups/{name}` | Get group details |
| `GET /stats/summary` | Overall statistics |
| `GET /stats/countries` | Country breakdown |
| `GET /stats/sectors` | Sector breakdown |
| `GET /stats/trend` | Attack trend (30 days) |
| `GET /status` | System status |
| `POST /update/trigger` | Trigger manual update |
| `GET /decryptors` | List decryptors |
| `GET /ransom-notes` | List ransom notes |
| `GET /negotiations` | List negotiation chats |
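Since `/victims` is paginated, a client usually walks pages until the server returns a short page. A sketch of that loop — the `fetch_page(limit, offset)` signature and the `offset` parameter are assumptions (only `limit` and `sort` appear in the curl examples), and the fetcher is injected so the paging logic runs without a live API:

```python
def fetch_all(fetch_page, limit=100):
    """Collect every record from a paginated list endpoint.

    fetch_page(limit, offset) stands in for something like
    GET /api/v1/victims?limit=...&offset=... (offset is an assumption).
    """
    records, offset = [], 0
    while True:
        page = fetch_page(limit, offset)
        records.extend(page)
        if len(page) < limit:  # a short page means we've reached the end
            return records
        offset += limit

# Fake fetcher over five in-memory records, standing in for a real HTTP call
data = [{'id': i} for i in range(5)]
print(len(fetch_all(lambda limit, offset: data[offset:offset + limit], limit=2)))  # 5
```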
```bash
# Get latest 10 victims
curl "http://localhost:8000/api/v1/victims?limit=10&sort=desc"

# Get statistics
curl "http://localhost:8000/api/v1/stats/summary"

# Trigger manual update
curl -X POST "http://localhost:8000/api/v1/update/trigger"

# Check status
curl "http://localhost:8000/api/v1/status"
```

| Page | Route | Description |
|---|---|---|
| Dashboard | `/` | Overview with stats, map, latest victims |
| Victims | `/victims` | Searchable victim list |
| Victim Detail | `/victims/[id]` | Individual victim info |
| Groups | `/groups` | Ransomware group list |
| Group Detail | `/groups/[id]` | Group profile & victims |
| Countries | `/country` | Country analysis |
| Industries | `/industry` | Sector analysis |
| Statistics | `/statistics` | Charts & trends |
| Negotiations | `/negotiation` | Chat logs |
| Decryptors | `/decryptors` | Available tools |
| Ransom Notes | `/ransom-notes` | Note collection |
| About | `/about` | About Dragons Community |
The system automatically updates every 30 minutes:
- Scrape Phase: Fetch HTML from all enabled group sites
- Parse Phase: Extract victim data from HTML
- Cache Clear: Refresh API cache for new data
- Status Update: Update scheduler status file
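The four phases above can be sketched as one function; the phase callables here are placeholders for the real scrape/parse/cache/status code:

```python
def run_update_cycle(scrape, parse, clear_cache, write_status):
    """One auto-update cycle: the four phases, in order."""
    scrape()        # 1. Scrape Phase: fetch HTML from all enabled group sites
    parse()         # 2. Parse Phase: extract victim data from the HTML
    clear_cache()   # 3. Cache Clear: refresh the API cache for the new data
    write_status()  # 4. Status Update: update the scheduler status file

# Stub phases that just record their call order
calls = []
run_update_cycle(
    scrape=lambda: calls.append('scrape'),
    parse=lambda: calls.append('parse'),
    clear_cache=lambda: calls.append('cache'),
    write_status=lambda: calls.append('status'),
)
print(calls)  # ['scrape', 'parse', 'cache', 'status']
```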
In `api/main.py`:

```python
UPDATE_INTERVAL_MINUTES = 30  # Update every 30 minutes
RUN_ON_STARTUP = True         # Run update immediately on startup
```

Dragons Eye includes intelligent protection page detection:
| Protection Type | Detection | Handling |
|---|---|---|
| DDoS Protection | ✅ | Retry with longer wait |
| Captcha | ✅ | Retry 3x, then skip |
| Cloudflare | ✅ | JS render + wait |
| JS Challenge | ✅ | Extended wait time |
Retry Logic:
- Attempt 1: 60s wait
- Attempt 2: 90s wait
- Attempt 3: 120s wait
- Then mark as blocked
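The schedule above maps naturally onto a small retry helper. A sketch, not the tracker's actual code: the wait lengths come from the list, `None` stands in for a failed attempt, and `'blocked'` for however the real scraper records the final failure:

```python
RETRY_WAITS = [60, 90, 120]  # seconds to wait for attempts 1, 2, 3

def scrape_with_retry(attempt_fetch, waits=RETRY_WAITS, sleep=lambda s: None):
    """Try a fetch up to len(waits) times, backing off longer each round."""
    for wait in waits:
        sleep(wait)                # back off: 60s, then 90s, then 120s
        result = attempt_fetch()
        if result is not None:
            return result
    return 'blocked'               # all attempts exhausted -> mark as blocked

# Fake fetch that fails twice, then succeeds on the third attempt
attempts = iter([None, None, '<html>leak site</html>'])
print(scrape_with_retry(lambda: next(attempts)))  # <html>leak site</html>
```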
| Command | Description |
|---|---|
| `python3 api/main.py` | Start API + scheduler |
| `npm run dev` (in `frontend/`) | Start frontend |
| `python3 bin/status.py` | Check status |
| `python3 bin/status.py -v` | Verbose status |
| `python3 bin/status.py --watch` | Live monitoring |
| `python3 bin/scrape.py -V` | Manual scrape |
| `python3 bin/scrape.py -G <name>` | Scrape single group |
| `python3 bin/parse.py` | Manual parse |
| `python3 bin/parse.py -F` | Force parse |
| `tail -f logs/update_latest.log` | Watch update log |
Contributions are welcome! Areas of interest:
- New Parsers: Add support for new ransomware groups
- Protection Bypass: Improve captcha/DDoS handling
- Frontend: UI/UX improvements
- Documentation: Help improve docs
Create `bin/_parsers/newgroup.py`:

```python
from bs4 import BeautifulSoup

from shared_utils import stdlog, errlog, appender


def parse(html_content, group_name, location):
    soup = BeautifulSoup(html_content, 'html.parser')
    for victim in soup.find_all('div', class_='victim'):
        heading = victim.find('h2')
        if heading is None:
            continue  # skip malformed entries with no victim name
        appender(
            victim=heading.text.strip(),
            group_name=group_name,
            description='',
            website='',
            post_url=location['slug']
        )
```

This project is released under the Unlicense; see the LICENSE file.
Dragons Eye is provided for research and educational purposes only.
- Do NOT use for unauthorized access
- Do NOT engage with ransomware operators
- Do NOT pay ransoms
- DO report findings to appropriate authorities
Developed by Dragons Community. The maintainers assume no liability for misuse.
- GitHub: Dragons-Community
- X (Twitter): @DragonsCyberHQ
- Support: support@dragons.community
Dragons Eye - Ransomware Tracker

Made with 🔥 by Dragons Community for the cybersecurity community