Initial commit

Autumn.home 2024-06-05 21:39:34 +08:00
commit a589d475b4
1722 changed files with 216206 additions and 0 deletions

4
.gitignore vendored Normal file

@@ -0,0 +1,4 @@
.idea
__pycache__/
__pycache__
config.yaml

113
README.md Normal file

@@ -0,0 +1,113 @@
<div align="center">
<img src="docs/images/favicon.ico"/>
</div>
Chinese | [English](./README_EN.md)
## Website
- Official website: [https://www.scope-sentry.top](https://www.scope-sentry.top)
- Github: [https://github.com/Autumn-27/ScopeSentry](https://github.com/Autumn-27/ScopeSentry)
- Scanner source code: [https://github.com/Autumn-27/ScopeSentry-Scan](https://github.com/Autumn-27/ScopeSentry-Scan)
## Introduction
Scope Sentry is a tool providing asset mapping, subdomain enumeration, information-leak detection, vulnerability scanning, directory scanning, subdomain takeover detection, crawling, and page monitoring. By deploying multiple nodes, you can freely choose which nodes run each scan task. When a new vulnerability emerges, you can quickly check whether monitored assets contain the affected components.
## Current Features
- Subdomain enumeration
- Subdomain takeover detection
- Port scanning
- Asset identification
- Directory scanning
- Vulnerability scanning
- Sensitive information leak detection
- URL extraction
- Crawler
- Page monitoring
- Custom web fingerprints
- POC import
- Asset grouping
- Multi-node scanning
- Webhook
## To Do
- Plugin system
- Weak password brute-forcing
- Data cleaning
- Data sharing?
- ~
## Installation
For installation instructions, see the [official website](https://www.scope-sentry.top)
## Communication
Discord:
[https://discord.gg/agsYdAyN](https://discord.gg/agsYdAyN)
QQ:
<img src="docs/images/qq.png" alt="QQ" width="200"/>
WeChat:
<img src="docs/images/wx.png" alt="WX" width="200"/>
## Screenshots
### Login
![alt text](docs/images/login.png)
### Homepage Dashboard
![alt text](docs/images/index-cn.png)
## Asset Data
### Assets
![alt text](docs/images/asset-cn.png)
### Subdomains
![alt text](docs/images/subdomain-cn.png)
### Subdomain Takeover
![alt text](docs/images/subt-cn.png)
### URL
![alt text](docs/images/url-cn.png)
### Crawler
![alt text](docs/images/craw-cn.png)
### Sensitive Information
![alt text](docs/images/sns-cn.png)
### Directory Scanning
![alt text](docs/images/dir-cn.png)
### Vulnerabilities
![alt text](docs/images/vul-cn.png)
### Page Monitoring
![alt text](docs/images/page-cn.png)
## Projects
![](docs/images/project-cn.png)
## Tasks
![](docs/images/task-cn.png)
## Task Progress
![](docs/images/task-pg-cn.png)
## Nodes
![](docs/images/node-cn.png)

101
README_EN.md Normal file

@@ -0,0 +1,101 @@
<div align="center">
<img src="docs/images/favicon.ico"/>
</div>
## Website
- Official Website: [https://www.scope-sentry.top](https://www.scope-sentry.top)
- Github: [https://github.com/Autumn-27/ScopeSentry](https://github.com/Autumn-27/ScopeSentry)
- Scanner source code: [https://github.com/Autumn-27/ScopeSentry-Scan](https://github.com/Autumn-27/ScopeSentry-Scan)
## Introduction
Scope Sentry is a tool providing asset mapping, subdomain enumeration, information-leak detection, vulnerability scanning, directory scanning, subdomain takeover detection, crawling, and page monitoring. By deploying multiple nodes, you can freely choose which nodes run each scan task. When a new vulnerability emerges, you can quickly check whether monitored assets contain the affected components.
## Current Features
- Subdomain Enumeration
- Subdomain Takeover Detection
- Port Scanning
- Asset Identification
- Directory Scanning
- Vulnerability Scanning
- Sensitive Information Leakage Detection
- URL Extraction
- Crawler
- Page Monitoring
- Custom Web Fingerprints
- POC Import
- Asset Grouping
- Multi-Node Scanning
- Webhook
## To Do
- Plugin System
- Weak Password Cracking
- Data Cleaning
- Data Sharing?
- ~
## Installation
For installation instructions, see the [official website](https://www.scope-sentry.top)
## Communication
Discord:
[https://discord.gg/agsYdAyN](https://discord.gg/agsYdAyN)
## Screenshots
### Login
![alt text](docs/images/login.png)
### Homepage Dashboard
![alt text](docs/images/index-en.png)
## Asset Data
### Assets
![alt text](docs/images/asset-en.png)
### Subdomains
![alt text](docs/images/subdomain-en.png)
### Subdomain Takeover
![alt text](docs/images/subt-en.png)
### URL
![alt text](docs/images/url-en.png)
### Crawler
![alt text](docs/images/craw-en.png)
### Sensitive Information
![alt text](docs/images/sns-en.png)
### Directory Scanning
![alt text](docs/images/dir-en.png)
### Vulnerabilities
![alt text](docs/images/vul-en.png)
### Page Monitoring
![alt text](docs/images/page-en.png)
## Projects
![](docs/images/project-cn.png)
## Tasks
![](docs/images/task-en.png)
## Task Progress
![](docs/images/task-pg-en.png)
## Nodes
![](docs/images/node-cn.png)

70
api/SubdoaminTaker.py Normal file

@@ -0,0 +1,70 @@
# -------------------------------------
# @file : SubdoaminTaker.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/4/27 15:41
# -------------------------------------------
from fastapi import APIRouter, Depends
from motor.motor_asyncio import AsyncIOMotorCursor
from pymongo import DESCENDING
from api.users import verify_token
from core.config import POC_LIST
from core.db import get_mongo_db
from core.util import search_to_mongodb
from loguru import logger
router = APIRouter()
@router.post("/subdomaintaker/data")
async def get_subdomaintaker_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
# Map search keys to fields in the SubdoaminTakerResult collection
keyword = {
'domain': 'input',
'value': 'value',
'type': 'cname',
'response': 'response',
'project': 'project',
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
# Get the total count of documents matching the search criteria
total_count = await db.SubdoaminTakerResult.count_documents(query)
if total_count == 0:
return {
"code": 200,
"data": {
'list': [],
'total': 0
}
}
# Perform pagination query
cursor: AsyncIOMotorCursor = db.SubdoaminTakerResult.find(query).skip((page_index - 1) * page_size).limit(page_size)
result = await cursor.to_list(length=None)
# Process the result as needed
response_data = []
for doc in result:
data = {
"host": doc["input"],
"value": doc["value"],
"type": doc["cname"],
"response": doc["response"],
}
response_data.append(data)
return {
"code": 200,
"data": {
'list': response_data,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}

3
api/__init__.py Normal file

@@ -0,0 +1,3 @@
# -*- coding:utf-8 -*-  
# @name: __init__.py
# @version:

8 binary files not shown

560
api/asset_info.py Normal file

@@ -0,0 +1,560 @@
# -------------------------------------
# @file  : asset_info.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/4/14 17:14
# -------------------------------------------
import json
from bson import ObjectId
from fastapi import APIRouter, Depends
from api.users import verify_token
from motor.motor_asyncio import AsyncIOMotorCursor
from core.db import get_mongo_db
from core.redis_handler import get_redis_pool
from core.util import *
from pymongo import ASCENDING, DESCENDING
from loguru import logger
router = APIRouter()
@router.get("/asset/statistics/data")
async def asset_statistics_data(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
asset_count = await db['asset'].count_documents({})
subdomain_count = await db['subdomain'].count_documents({})
sensitive_count = await db['SensitiveResult'].count_documents({})
url_count = await db['UrlScan'].count_documents({})
vulnerability_count = await db['vulnerability'].count_documents({})
return {
"code": 200,
"data": {
"assetCount": aset_count,
"subdomainCount": subdomain_count,
"sensitiveCount": sensitive_count,
"urlCount": url_count,
"vulnerabilityCount": vulnerability_count
}
}
@router.post("/asset/data")
async def asset_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
if len(APP) == 0:
collection = db["FingerprintRules"]
cursor = collection.find({}, {"_id": 1, "name": 1})
async for document in cursor:
document['id'] = str(document['_id'])
del document['_id']
APP[document['id']] = document['name']
if len(SensitiveRuleList) == 0:
collection = db["SensitiveRule"]
cursor = collection.find({}, {"_id": 1, "name": 1, "color": 1})
async for document in cursor:
document['id'] = str(document['_id'])
del document['_id']
SensitiveRuleList[document['id']] = {
"name": document['name'],
"color": document['color']
}
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
keyword = {
'app': '',
'body': 'responsebody',
'header': 'rawheaders',
'project': 'project',
'title': 'title',
'statuscode': 'statuscode',
'icon': 'faviconmmh3',
'ip': ['host', 'ip'],
'domain': ['host', 'url', 'domain'],
'port': 'port',
'protocol': ['protocol', 'type'],
'banner': 'raw',
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
total_count = await db['asset'].count_documents(query)
cursor: AsyncIOMotorCursor = ((db['asset'].find(query, {"_id": 0,
"id": {"$toString": "$_id"},
"host": 1,
"url": 1,
"ip": 1,
"port": 1,
"protocol": 1,
"type": 1,
"title": 1,
"statuscode": 1,
"rawheaders": 1,
"webfinger": 1,
"technologies": 1,
"raw": 1,
"timestamp": 1,
"iconcontent": 1
})
.skip((page_index - 1) * page_size)
.limit(page_size))
.sort([("timestamp", DESCENDING)]))
result = await cursor.to_list(length=None)
result_list = []
for r in result:
tmp = {}
tmp['port'] = r['port']
tmp['time'] = r['timestamp']
tmp['id'] = r['id']
tmp['type'] = r['type']
if r['type'] == 'other':
tmp['domain'] = r['host']
tmp['ip'] = r['ip']
tmp['service'] = r['protocol']
tmp['title'] = ""
tmp['status'] = None
tmp['banner'] = ""
try:
if r['raw'] is not None:
raw_data = json.loads(r['raw'].decode('utf-8'))
for k in raw_data:
tmp['banner'] += k + ":" + str(raw_data[k]).strip("\n") + "\n"
except Exception:
tmp['banner'] = ""
tmp['products'] = []
else:
tmp['domain'] = r['url'].replace(f'{r["type"]}://', '')
tmp['ip'] = r['host']
tmp['service'] = r['type']
tmp['title'] = r['title']
tmp['status'] = r['statuscode']
tmp['url'] = r['url']
tmp['banner'] = r['rawheaders']
tmp['products'] = []
tmp['icon'] = r['iconcontent']
technologies = r['technologies']
if technologies is not None:
tmp['products'] = tmp['products'] + technologies
if r['webfinger'] is not None:
for w in r['webfinger']:
tmp['products'].append(APP[w])
result_list.append(tmp)
return {
"code": 200,
"data": {
'list': result_list,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/asset/detail")
async def asset_detail(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Get the ID from the request data
asset_id = request_data.get("id")
# Check if ID is provided
if not asset_id:
return {"message": "ID is missing in the request data", "code": 400}
# Query the database for content based on ID
query = {"_id": ObjectId(asset_id)}
doc = await db.asset.find_one(query)
if not doc:
return {"message": "Content not found for the provided ID", "code": 404}
products = []
tlsdata = ""
hashes = ""
banner = ""
if doc['type'] == 'other':
domain = doc.get('host', "")
IP = doc.get("ip", "")
URL = ""
service = doc.get("protocol", "")
try:
if doc['raw'] is not None:
raw_data = json.loads(doc['raw'].decode('utf-8'))
for k in raw_data:
banner += k + ":" + str(raw_data[k]).strip("\n") + "\n"
except Exception:
banner = ""
else:
domain = doc.get('url', "").replace("http://", "").replace("https://", "").split(":")[0]
IP = doc.get("host", "")
URL = doc.get("url", "")
service = doc.get("type", "")
products = doc.get('technologies')
if products is None:
products = []
if doc['webfinger'] is not None:
for w in doc['webfinger']:
products.append(APP[w])
if doc['tlsdata'] is not None:
for h in doc['tlsdata']:
tlsdata += h + ": " + str(doc['tlsdata'][h]) + '\n'
if doc['hashes'] is not None:
for h in doc['hashes']:
hashes += h + ": " + str(doc['hashes'][h]) + '\n'
banner = doc.get('rawheaders', "")
project_name = ""
if doc.get("project", "") != "":
query = {"_id": ObjectId(doc.get("project", ""))}
project_data = await db.project.find_one(query)
project_name = project_data.get("name", "")
data = {
"host": domain,
"IP": IP,
"URL": URL,
"port": doc.get("port", ""),
"service": service,
"title": doc.get("title", ""),
"status": doc.get("statuscode", ""),
"FaviconHash": doc.get("faviconmmh3", ""),
"jarm": doc.get("jarm", ""),
"time": doc.get("timestamp", ""),
"products": products,
"TLSData": tlsdata,
"hash": hashes,
"banner": banner,
"ResponseBody": doc.get("responsebody", ""),
"project": project_name
}
return {"code": 200, "data": data}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/asset/statistics")
async def asset_data_statistics(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
search_query = request_data.get("search", "")
keyword = {
'app': '',
'body': 'responsebody',
'header': 'rawheaders',
'project': 'project',
'title': 'title',
'statuscode': 'statuscode',
'icon': 'faviconmmh3',
'ip': ['host', 'ip'],
'domain': ['host', 'url', 'domain'],
'port': 'port',
'protocol': ['protocol', 'type'],
'banner': 'raw',
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
cursor: AsyncIOMotorCursor = ((db['asset'].find(query, {
"port": 1,
"protocol": 1,
"type": 1,
"webfinger": 1,
"technologies": 1,
"faviconmmh3": 1,
"iconcontent": 1
})))
result = await cursor.to_list(length=None)
result_list = {"Port": [], "Service": [], "Product": [], "Icon": []}
port_list = {}
service_list = {}
icon_list = {}
icon_tmp = {}
tec_list = {}
for r in result:
if r['port'] not in port_list:
port_list[r['port']] = 1
else:
port_list[r['port']] += 1
if r['type'] == "http" or r['type'] == "https":
service = r['type']
icon = r['iconcontent']
icon_hash = r['faviconmmh3']
if icon_hash != "":
icon_tmp[icon_hash] = icon
if icon_hash not in icon_list:
icon_list[icon_hash] = 1
else:
icon_list[icon_hash] += 1
if r['technologies'] is not None:
for t in r['technologies']:
if t != "":
if t not in tec_list:
tec_list[t] = 1
else:
tec_list[t] += 1
if r['webfinger'] is not None:
for wf in r['webfinger']:
if wf is not None:
if APP[wf] not in tec_list:
tec_list[APP[wf]] = 1
else:
tec_list[APP[wf]] += 1
else:
service = r['protocol']
if service != "":
if service not in service_list:
service_list[service] = 1
else:
service_list[service] += 1
service_list = dict(sorted(service_list.items(), key=lambda item: -item[1]))
for service in service_list:
result_list['Service'].append({"value": service, "number": service_list[service]})
port_list = dict(sorted(port_list.items(), key=lambda item: -item[1]))
for port in port_list:
result_list['Port'].append({"value": port, "number": port_list[port]})
tec_list = dict(sorted(tec_list.items(), key=lambda item: -item[1]))
for tec in tec_list:
result_list['Product'].append({"value": tec, "number": tec_list[tec]})
icon_list = dict(sorted(icon_list.items(), key=lambda item: -item[1]))
for ic in icon_list:
result_list['Icon'].append({"value": icon_tmp[ic], "number": icon_list[ic], "icon_hash": ic})
return {
"code": 200,
"data": result_list
}
@router.post("/subdomain/data")
async def subdomain_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
keyword = {
'domain': 'host',
'ip': 'ip',
'type': 'type',
'project': 'project',
'value': 'value'
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
total_count = await db['subdomain'].count_documents(query)
cursor: AsyncIOMotorCursor = ((db['subdomain'].find(query, {"_id": 0,
"id": {"$toString": "$_id"},
"host": 1,
"type": 1,
"value": 1,
"ip": 1,
"time": 1,
})
.skip((page_index - 1) * page_size)
.limit(page_size))
.sort([("time", DESCENDING)]))
result = await cursor.to_list(length=None)
result_list = []
for r in result:
if r['value'] is None:
r['value'] = []
if r['ip'] is None:
r['ip'] = []
result_list.append(r)
return {
"code": 200,
"data": {
'list': result_list,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/url/data")
async def url_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
keyword = {
'url': 'output',
'project': 'project',
'input': 'input',
'source': 'source',
"type": "outputtype"
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
total_count = await db['UrlScan'].count_documents(query)
cursor: AsyncIOMotorCursor = ((db['UrlScan'].find(query, {"_id": 0,
"id": {"$toString": "$_id"},
"input": 1,
"source": 1,
"type": "$outputtype",
"url": "$output",
"time": 1,
})
.skip((page_index - 1) * page_size)
.limit(page_size))
.sort([("time", DESCENDING)]))
result = await cursor.to_list(length=None)
return {
"code": 200,
"data": {
'list': result,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/crawler/data")
async def crawler_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
keyword = {
'url': 'url',
'method': 'method',
'body': 'body',
'project': 'project'
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
total_count = await db['crawler'].count_documents(query)
cursor: AsyncIOMotorCursor = ((db['crawler'].find(query, {"_id": 0,
"id": {"$toString": "$_id"},
"method": 1,
"body": 1,
"url": 1
})
.sort([('_id', -1)])
.skip((page_index - 1) * page_size)
.limit(page_size))
)
result = await cursor.to_list(length=None)
return {
"code": 200,
"data": {
'list': result,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/asset/statistics2")
async def asset_data_statistics2(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
search_query = request_data.get("search", "")
keyword = {
'app': '',
'body': 'responsebody',
'header': 'rawheaders',
'project': 'project',
'title': 'title',
'statuscode': 'statuscode',
'icon': 'faviconmmh3',
'ip': ['host', 'ip'],
'domain': ['host', 'url', 'domain'],
'port': 'port',
'protocol': ['protocol', 'type'],
'banner': 'raw',
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
pipeline = [
{
"$match": query # 添加搜索条件
},
{
"$facet": {
"by_type": [
{"$group": {"_id": "$type", "num_tutorial": {"$sum": 1}}},
{"$match": {"_id": {"$ne": None}}}
],
"by_port": [
{"$group": {"_id": "$port", "num_tutorial": {"$sum": 1}}},
{"$match": {"_id": {"$ne": None}}}
],
"by_protocol": [
{"$group": {"_id": "$protocol", "num_tutorial": {"$sum": 1}}},
{"$match": {"_id": {"$ne": None}}}
],
"by_icon": [
{"$group": {"_id": "$faviconmmh3", "num_tutorial": {"$sum": 1},
"iconcontent": {"$first": "$iconcontent"}}},
{"$match": {"_id": {"$ne": ""}}}
],
"by_webfinger": [
{"$unwind": "$webfinger"},
{"$group": {"_id": "$webfinger", "num_tutorial": {"$sum": 1}}},
{"$match": {"_id": {"$ne": None}}}
],
"by_technologies": [
{"$unwind": "$technologies"},
{"$group": {"_id": "$technologies", "num_tutorial": {"$sum": 1}}},
{"$match": {"_id": {"$ne": None}}}
]
}
}
]
result = await db['asset'].aggregate(pipeline).to_list(None)
result_list = {"Port": [], "Service": [], "Product": [], "Icon": []}
port_list = {}
service_list = {}
icon_list = {}
icon_tmp = {}
tec_list = {}
for r in result:
for port in r['by_port']:
port_list[port["_id"]] = port["num_tutorial"]
for icon in r['by_icon']:
icon_tmp[icon['_id']] = icon['iconcontent']
icon_list[icon['_id']] = icon['num_tutorial']
for service_type in r['by_type']:
service_list[service_type['_id']] = service_type['num_tutorial']
for technology in r['by_technologies']:
tec_list[technology['_id']] = technology['num_tutorial']
for webfinger in r['by_webfinger']:
try:
if APP[webfinger['_id']] not in tec_list:
tec_list[APP[webfinger['_id']]] = webfinger['num_tutorial']
else:
tec_list[APP[webfinger['_id']]] += webfinger['num_tutorial']
except Exception:
pass
service_list = dict(sorted(service_list.items(), key=lambda item: -item[1]))
for service in service_list:
result_list['Service'].append({"value": service, "number": service_list[service]})
port_list = dict(sorted(port_list.items(), key=lambda item: -item[1]))
for port in port_list:
result_list['Port'].append({"value": port, "number": port_list[port]})
tec_list = dict(sorted(tec_list.items(), key=lambda item: -item[1]))
for tec in tec_list:
result_list['Product'].append({"value": tec, "number": tec_list[tec]})
icon_list = dict(sorted(icon_list.items(), key=lambda item: -item[1]))
for ic in icon_list:
result_list['Icon'].append({"value": icon_tmp[ic], "number": icon_list[ic], "icon_hash": ic})
return {
"code": 200,
"data": result_list
}
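The /asset/statistics2 handler replaces the per-document Python counting of /asset/statistics with a single $facet aggregation. Below is a minimal sketch of that pattern, assuming a local MongoDB and a database named ScopeSentry (both assumptions; the real connection lives in core.db.get_mongo_db).

```python
# Minimal $facet sketch mirroring /asset/statistics2; URI and db name are assumed.
import asyncio
from motor.motor_asyncio import AsyncIOMotorClient

async def port_and_type_stats():
    db = AsyncIOMotorClient("mongodb://localhost:27017")["ScopeSentry"]
    pipeline = [{"$facet": {
        # Each facet groups the same input documents independently.
        "by_port": [{"$group": {"_id": "$port", "num_tutorial": {"$sum": 1}}},
                    {"$match": {"_id": {"$ne": None}}}],
        "by_type": [{"$group": {"_id": "$type", "num_tutorial": {"$sum": 1}}},
                    {"$match": {"_id": {"$ne": None}}}],
    }}]
    result = await db["asset"].aggregate(pipeline).to_list(None)
    # A $facet pipeline returns a single document with one array per facet.
    print(result[0]["by_port"])
    print(result[0]["by_type"])

asyncio.run(port_and_type_stats())
```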

120
api/configuration.py Normal file

@@ -0,0 +1,120 @@
# -*- coding:utf-8 -*-  
# @name: configuration
# @auth: rainy-autumn@outlook.com
# @version:
from bson import ObjectId
from fastapi import APIRouter, Depends
from api.users import verify_token
from core.db import get_mongo_db
from core.redis_handler import refresh_config
from core.config import set_timezone
from loguru import logger
router = APIRouter()
@router.get("/subfinder/data")
async def get_subfinder_data(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Find the document whose name is "SubfinderApiConfig"
result = await db.config.find_one({"name": "SubfinderApiConfig"})
return {
"code": 200,
"data": {
"content": result.get("value", '')
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}
@router.post("/subfinder/save")
async def save_subfinder_data(data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Update the document whose name is "SubfinderApiConfig"
result = await db.config.update_one({"name": "SubfinderApiConfig"}, {"$set": {"value": data.get('content','')}}, upsert=True)
if result:
await refresh_config('all', 'subfinder')
return {"code": 200, "message": "Successfully updated SubfinderApiConfig value"}
else:
return {"code": 404, "message": "SubfinderApiConfig not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.get("/rad/data")
async def get_rad_data(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Find the document whose name is "RadConfig"
result = await db.config.find_one({"name": "RadConfig"})
return {
"code": 200,
"data": {
"content": result.get("value", '')
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}
@router.post("/rad/save")
async def save_rad_data(data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Update the document whose name is "RadConfig"
result = await db.config.update_one({"name": "RadConfig"}, {"$set": {"value": data.get('content','')}}, upsert=True)
if result:
await refresh_config('all', 'rad')
return {"code": 200, "message": "Successfully updated RadConfig value"}
else:
return {"code": 404, "message": "SubfinderApiConfig not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.get("/system/data")
async def get_system_data(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Query all documents whose type is "system"
cursor = db.config.find({"type": "system"})
system_data = {}
async for document in cursor:
# Extract the name and value fields into system_data
system_data[document["name"]] = document["value"]
return {
"code": 200,
"data": system_data
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/system/save")
async def save_system_data(data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
for key, value in data.items():
if key == 'timezone':
set_timezone(value)
# Use the key to find and update the matching document
await db.config.update_one(
{"type": "system", "name": key},
{"$set": {"value": value}},
upsert=True
)
await refresh_config('all', 'system')
return {"message": "Data saved successfully", "code": 200}
except Exception as e:
logger.error(str(e))
return {"message": "error", "code": 500}

188
api/dictionary.py Normal file

@@ -0,0 +1,188 @@
# -*- coding:utf-8 -*-  
# @name: dictionary
# @auth: rainy-autumn@outlook.com
# @version:
from bson import ObjectId
from fastapi import APIRouter, Depends
from api.users import verify_token
from motor.motor_asyncio import AsyncIOMotorCursor
from core.db import get_mongo_db
from core.redis_handler import refresh_config
from loguru import logger
router = APIRouter()
@router.get("/subdomain/data")
async def get_subdomain_data(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Find document with name equal to "DomainDic"
result = await db.config.find_one({"name": "DomainDic"})
return {
"code": 200,
"data": {
"dict": result.get("value", '')
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}
@router.post("/subdomain/save")
async def save_subdomain_data(data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Update the document with name equal to "DomainDic"
result = await db.config.update_one({"name": "DomainDic"}, {"$set": {"value": data.get('dict','')}}, upsert=True)
if result.modified_count > 0 or result.upserted_id:
await refresh_config('all', 'subdomain')
return {"code": 200, "message": "Successfully updated DomainDic value"}
else:
return {"code": 404, "message": "DomainDic not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.get("/dir/data")
async def get_dir_data(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Find the document whose name is "DirDic"
result = await db.config.find_one({"name": "DirDic"})
return {
"code": 200,
"data": {
"dict": result.get("value", '')
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}
@router.post("/dir/save")
async def save_dir_data(data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Update the document whose name is "DirDic"
result = await db.config.update_one({"name": "DirDic"}, {"$set": {"value": data.get('dict','')}}, upsert=True)
if result.modified_count > 0 or result.upserted_id:
await refresh_config('all', 'dir')
return {"code": 200, "message": "Successfully updated DirDic value"}
else:
return {"code": 404, "message": "DirDic not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/port/data")
async def get_port_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
search = request_data.get("search", None) # 获取search参数
# Construct the search query
search_query = {}
if search:
search_regex = {"$regex": search, "$options": "i"} # Case-insensitive regex
search_query = {"$or": [{"name": search_regex}, {"value": search_regex}]}
total_count = await db.PortDict.count_documents(search_query)
# Perform pagination query
cursor: AsyncIOMotorCursor = db.PortDict.find(search_query).skip((page_index - 1) * page_size).limit(page_size)
result = await cursor.to_list(length=None)
# Process the result as needed
response_data = [{"id": str(doc["_id"]),"name": doc["name"], "value": doc["value"]} for doc in result]
return {
"code": 200,
"data": {
'list': response_data,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}
@router.post("/port/upgrade")
async def upgrade_port_dict(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract values from request data
port_id = request_data.get("id")
name = request_data.get("name")
value = request_data.get("value")
# Update query based on port_id
update_query = {"_id": ObjectId(port_id)}
# Values to be updated
update_values = {"$set": {"name": name, "value": value}}
# Perform the update
result = await db.PortDict.update_one(update_query, update_values)
await refresh_config('all', 'port')
if result:
return {"code": 200, "message": "Port dict updated successfully"}
else:
return {"code": 404, "message": "Port dict not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/port/add")
async def add_port_dict(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract values from request data
name = request_data.get("name")
value = request_data.get("value",'')
if value == '':
return {"code": 400, "message": "value is null"}
# Create a new port dict document
new_port_dict = {
"name": name,
"value": value
}
# Insert the new document into the PortDict collection
result = await db.PortDict.insert_one(new_port_dict)
await refresh_config('all', 'port')
# Check if the insertion was successful
if result.inserted_id:
return {"code": 200, "message": "Port dict added successfully"}
else:
return {"code": 400, "message": "Failed to add port dict"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/port/delete")
async def delete_port_dict(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract the list of IDs from the request_data dictionary
port_dict_ids = request_data.get("ids", [])
# Convert the provided IDs to ObjectId
obj_ids = [ObjectId(port_dict_id) for port_dict_id in port_dict_ids]
# Delete the PortDict documents based on the provided IDs
result = await db.PortDict.delete_many({"_id": {"$in": obj_ids}})
await refresh_config('all', 'port')
# Check if the deletion was successful
if result.deleted_count > 0:
return {"code": 200, "message": "Port dict deleted successfully"}
else:
return {"code": 404, "message": "Port dict not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}

50
api/dirscan.py Normal file

@@ -0,0 +1,50 @@
# -------------------------------------
# @file : dirscan.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/5/9 20:22
# -------------------------------------------
from bson import ObjectId
from fastapi import APIRouter, Depends
from motor.motor_asyncio import AsyncIOMotorCursor
from api.users import verify_token
from core.db import get_mongo_db
from core.util import search_to_mongodb
from loguru import logger
router = APIRouter()
@router.post("/dirscan/result/data")
async def dirscan_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
keyword = {
'project': 'project',
'statuscode': 'status',
'url': 'url',
'redirect': 'msg'
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
total_count = await db['DirScanResult'].count_documents(query)
cursor: AsyncIOMotorCursor = ((db['DirScanResult'].find(query, {"_id": 0})
.sort([('_id', -1)])
.skip((page_index - 1) * page_size)
.limit(page_size)))
result = await cursor.to_list(length=None)
return {
"code": 200,
"data": {
'list': result,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}

168
api/fingerprint.py Normal file

@@ -0,0 +1,168 @@
# -*- coding:utf-8 -*-  
# @name: fingerprint
# @auth: rainy-autumn@outlook.com
# @version:
from bson import ObjectId
from fastapi import APIRouter, Depends
from motor.motor_asyncio import AsyncIOMotorCursor
from api.users import verify_token
from core.db import get_mongo_db
from core.redis_handler import refresh_config
from core.util import string_to_postfix
from core.config import APP
from loguru import logger
router = APIRouter()
@router.post("/fingerprint/data")
async def fingerprint_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
query = {"name": {"$regex": search_query, "$options": "i"}}
# Get the total count of documents matching the search criteria
total_count = await db.FingerprintRules.count_documents(query)
# Perform pagination query and sort by time
cursor: AsyncIOMotorCursor = db.FingerprintRules.find(query, {"_id": 0, "id": {"$toString": "$_id"}, "name": 1, "rule": 1, "category": 1, "parent_category": 1, "amount": 1, "state": 1}).skip((page_index - 1) * page_size).limit(page_size)
result = await cursor.to_list(length=None)
return {
"code": 200,
"data": {
'list': result,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/fingerprint/update")
async def update_fingerprint_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Get the ID from the request data
fingerprint_id = request_data.get("id")
# Check if ID is provided
if not fingerprint_id:
return {"message": "ID is missing in the request data", "code": 400}
# Check if data to update is provided
if not request_data:
return {"message": "Data to update is missing in the request", "code": 400}
# Extract individual fields from the request data
name = request_data.get("name")
rule = request_data.get("rule")
category = request_data.get("category")
parent_category = request_data.get("parent_category")
state = request_data.get("state")
if rule == '':
return {"code": 500, "message": "rule is null"}
exp = string_to_postfix(rule)
if exp == "":
return {"code": 500, "message": "rule to express error"}
# Prepare the update document
update_document = {
"$set": {
"name": name,
"rule": rule,
"express": exp,
"category": category,
"parent_category": parent_category,
"state": state
}
}
# Remove the ID from the request data to prevent it from being updated
del request_data["id"]
# Update data in the database
result = await db.FingerprintRules.update_one({"_id": ObjectId(fingerprint_id)}, update_document)
# Check if the update was successful
if result:
if fingerprint_id in APP:
APP[fingerprint_id] = name
await refresh_config('all', 'finger')
return {"message": "Data updated successfully", "code": 200}
else:
return {"message": "Failed to update data", "code": 404}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/fingerprint/add")
async def add_fingerprint_rule(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract values from request data
name = request_data.get("name")
rule = request_data.get("rule")
category = request_data.get("category")
parent_category = request_data.get("parent_category")
state = request_data.get("state")
if rule == '':
return {"code": 500, "message": "rule is null"}
exp = string_to_postfix(rule)
if exp == "":
return {"code": 500, "message": "rule to express error"}
# Create a new fingerprint rule document
new_rule = {
"name": name,
"rule": rule,
"category": category,
"express": exp,
"parent_category": parent_category,
'amount': 0,
"state": state
}
# Insert the new document into the FingerprintRules collection
result = await db.FingerprintRules.insert_one(new_rule)
# Check if the insertion was successful
if result.inserted_id:
if str(result.inserted_id) not in APP:
APP[str(result.inserted_id)] = name
await refresh_config('all', 'finger')
return {"code": 200, "message": "SensitiveRule added successfully"}
else:
return {"code": 400, "message": "Failed to add SensitiveRule"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/fingerprint/delete")
async def delete_fingerprint_rules(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract the list of IDs from the request_data dictionary
fingerprint_ids = request_data.get("ids", [])
# Convert the provided IDs to ObjectId
obj_ids = [ObjectId(fingerprint_id) for fingerprint_id in fingerprint_ids]
# Delete the FingerprintRules documents based on the provided IDs
result = await db.FingerprintRules.delete_many({"_id": {"$in": obj_ids}})
# Check if the deletion was successful
if result.deleted_count > 0:
for fid in fingerprint_ids:
if fid in APP:
del APP[fid]
await refresh_config('all', 'finger')
return {"code": 200, "message": "FingerprintRules deleted successfully"}
else:
return {"code": 404, "message": "FingerprintRules not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}

149
api/node.py Normal file

@@ -0,0 +1,149 @@
# -*- coding:utf-8 -*-  
# @name: node
# @auth: rainy-autumn@outlook.com
# @version:
from datetime import datetime
from fastapi import WebSocket
from fastapi import APIRouter, Depends
from starlette.websockets import WebSocketDisconnect
from core.config import *
from api.users import verify_token
from core.redis_handler import get_redis_pool
from core.util import get_now_time
from core.redis_handler import refresh_config
import asyncio, json
from loguru import logger
router = APIRouter()
async def update_redis_data(redis, key):
await redis.hmset(key, {'state': '3'})
@router.get("/node/data")
async def node_data(_: dict = Depends(verify_token), redis_con=Depends(get_redis_pool)):
async with redis_con as redis:
# Get all keys starting with "node:"
keys = await redis.keys("node:*")
# Build the result list
result = []
for key in keys:
name = key.split(":")[1]
# Get all fields and values from the hash
hash_data = await redis.hgetall(key)
# Add a "name" key holding the node's name
hash_data['name'] = name
if hash_data.get('state') == '1':
update_time_str = hash_data.get('updateTime')
if update_time_str:
update_time = datetime.strptime(update_time_str, '%Y-%m-%d %H:%M:%S')
time_difference = (datetime.strptime(get_now_time(), "%Y-%m-%d %H:%M:%S") - update_time).seconds
if time_difference > NODE_TIMEOUT:
await asyncio.create_task(update_redis_data(redis, key))
hash_data['state'] = '3'
# Append the hash data to the result list
result.append(hash_data)
return {
"code": 200,
"data": {
'list': result
}
}
@router.get("/node/data/online")
async def node_data_online(_: dict = Depends(verify_token), redis_con=Depends(get_redis_pool)):
result = await get_redis_online_data(redis_con)
return {
"code": 200,
"data": {
'list': result
}
}
@router.post("/node/config/update")
async def node_config_update(config_data: dict, _: dict = Depends(verify_token), redis_con=Depends(get_redis_pool)):
try:
name = config_data.get("name")
max_task_num = config_data.get("maxTaskNum")
state = config_data.get("state")
if name is None or max_task_num is None or state is None:
return {"code": 400, "message": "Invalid request, missing required parameters"}
async with redis_con as redis:
key = f"node:{name}"
redis_state = await redis.hget(key, "state")
if state:
if redis_state == "2":
await redis.hset(key, "state", "1")
else:
if redis_state == "1":
await redis.hset(key, "state", "2")
del config_data["name"]
del config_data["state"]
for c in config_data:
await redis.hset(key, c, config_data[c])
await refresh_config(name, 'nodeConfig')
return {"code": 200, "message": "Node configuration updated successfully"}
except Exception as e:
return {"code": 500, "message": f"Internal server error: {str(e)}"}
@router.post("/node/delete")
async def delete_node_rules(request_data: dict, _: dict = Depends(verify_token), redis_con=Depends(get_redis_pool)):
try:
node_names = request_data.get("names", [])
for name in node_names:
logger.info("delete node:" + name)
await redis_con.delete("node:" + name)
return {"message": "Node deleted successfully", "code": 200}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/node/log/data")
async def get_node_logs(request_data: dict, _: dict = Depends(verify_token), redis_con=Depends(get_redis_pool)):
try:
node_name = request_data.get("name")
if not node_name:
return {"message": "Node name is required", "code": 400}
# Build the log key
log_key = f"log:{node_name}"
# Fetch the log list from Redis
logs = await redis_con.lrange(log_key, 0, -1)
log_data = ""
for log in logs:
log_data += log
return {"code": 200, "logs": log_data}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "Error retrieving logs", "code": 500}
async def get_redis_online_data(redis_con):
async with redis_con as redis:
# Get all keys starting with "node:"
keys = await redis.keys("node:*")
# Build the result list
result = []
for key in keys:
name = key.split(":")[1]
hash_data = await redis.hgetall(key)
if hash_data.get('state') == '1':
update_time_str = hash_data.get('updateTime')
if update_time_str:
update_time = datetime.strptime(update_time_str, '%Y-%m-%d %H:%M:%S')
time_difference = (datetime.strptime(get_now_time(), "%Y-%m-%d %H:%M:%S") - update_time).seconds
if time_difference > NODE_TIMEOUT:
await asyncio.create_task(update_redis_data(redis, key))
hash_data['state'] = '3'
else:
result.append(name)
return result
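Reading the handlers above, each scanner node appears to register itself as a Redis hash under node:<name> carrying at least state and updateTime; state 1 reads as online, 2 as paused, 3 as timed out (all inferred from this file). A sketch of that layout with the synchronous redis client, purely for illustration:

```python
# Sketch of the node:<name> hash the handlers above consume; the field set,
# state meanings, and connection details are inferred/assumed.
import redis
from datetime import datetime

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
r.hset("node:scanner-01", mapping={
    "state": "1",                                               # inferred: 1=online, 2=paused, 3=timed out
    "updateTime": datetime.now().strftime("%Y-%m-%d %H:%M:%S"),  # heartbeat checked against NODE_TIMEOUT
    "maxTaskNum": "5",                                          # updated via /node/config/update
})
print(r.hgetall("node:scanner-01"))
```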

147
api/notification.py Normal file

@@ -0,0 +1,147 @@
# -------------------------------------
# @file : notification.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/5/12 11:37
# -------------------------------------------
from bson import ObjectId
from fastapi import APIRouter, Depends
from pymongo import DESCENDING
from api.users import verify_token
from motor.motor_asyncio import AsyncIOMotorCursor
from core.db import get_mongo_db
from core.redis_handler import refresh_config
from loguru import logger
router = APIRouter()
@router.get("/notification/data")
async def get_notification_data(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
cursor: AsyncIOMotorCursor = db.notification.find({}, {"id": {"$toString": "$_id"}, "_id": 0, "name": 1, "method": 1, "url": 1, "contentType": 1,"data": 1, "state": 1})
result = await cursor.to_list(length=None)
return {
"code": 200,
"data": {
'list': result
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/notification/add")
async def add_notification_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
if not request_data:
return {"message": "Data to add is missing in the request", "code": 400}
result = await db.notification.insert_one(request_data)
if result.inserted_id:
await refresh_config('all', 'notification')
return {"message": "Data added successfully", "code": 200}
else:
return {"message": "Failed to add data", "code": 400}
except Exception as e:
logger.error(str(e))
return {"message": "error", "code": 500}
@router.post("/notification/update")
async def update_notification_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Get the ID from the request data
not_id = request_data.get("id")
# Check if ID is provided
if not not_id:
return {"message": "ID is missing in the request data", "code": 400}
# Check if data to update is provided
if not request_data:
return {"message": "Data to update is missing in the request", "code": 400}
del request_data["id"]
update_document = {
"$set": request_data
}
result = await db.notification.update_one({"_id": ObjectId(not_id)}, update_document)
# Check if the update was successful
if result:
await refresh_config('all', 'notification')
return {"message": "Data updated successfully", "code": 200}
else:
return {"message": "Failed to update data", "code": 404}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/notification/delete")
async def delete_notification(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
not_ids = request_data.get("ids", [])
# Convert the provided IDs to ObjectId
obj_ids = [ObjectId(not_id) for not_id in not_ids]
# Delete the notification documents based on the provided IDs
result = await db.notification.delete_many({"_id": {"$in": obj_ids}})
if result.deleted_count > 0:
await refresh_config('all', 'notification')
return {"code": 200, "message": "Notification deleted successfully"}
else:
return {"code": 404, "message": "Notification not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.get("/notification/config/data")
async def get_notification_config_data(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
result = await db.config.find_one({"name": "notification"})
del result['_id']
del result['type']
del result['name']
return {
"code": 200,
"data": result
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/notification/config/update")
async def update_notification_config_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
update_document = {
"$set": request_data
}
result = await db.config.update_one({"name": "notification"}, update_document)
# Check if the update was successful
if result:
await refresh_config('all', 'notification')
return {"message": "Data updated successfully", "code": 200}
else:
return {"message": "Failed to update data", "code": 404}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}

79
api/page_monitoring.py Normal file

@@ -0,0 +1,79 @@
# -------------------------------------
# @file : page_monitoring.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/4/22 19:46
# -------------------------------------------
from bson import ObjectId
from fastapi import APIRouter, Depends
from motor.motor_asyncio import AsyncIOMotorCursor
from api.users import verify_token
from core.db import get_mongo_db
from pymongo import ASCENDING, DESCENDING
from loguru import logger
from core.redis_handler import refresh_config
from core.util import *
router = APIRouter()
async def get_page_monitoring_data(db, all):
if all:
query = {}
else:
query = {"state": 1}
cursor: AsyncIOMotorCursor = db.PageMonitoring.find(query, {"url": 1, "_id": 0})
result = await cursor.to_list(length=None)
urls = [item['url'] for item in result]
return urls
@router.post("/page/monitoring/result")
async def page_monitoring_result(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
keyword = {
'url': 'url',
'project': 'project',
'hash': 'hash',
'diff': 'diff',
'response': 'response'
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
# Get the total count of documents matching the search criteria
query["diff"] = {"$ne": []}
total_count = await db.PageMonitoring.count_documents(query)
# Perform pagination query and sort by time
cursor: AsyncIOMotorCursor = db.PageMonitoring.find(query, {"_id": 0,
"id": {"$toString": "$_id"},
"url": 1,
"content": 1,
"hash": 1,
"diff": 1}).sort(
[("time", DESCENDING)]).skip((page_index - 1) * page_size).limit(page_size)
result = await cursor.to_list(length=None)
result_list = []
for r in result:
if len(r['content']) < 2:
continue
result_list.append({
"url": r['url'],
"response1": r['content'][-2],
"response2": r['content'][-1],
"hash1": r['hash'][-2],
"hash2": r['hash'][-1],
"diff": r['diff'][-1],
"history_diff": r['diff'][::1]
})
return {
"code": 200,
"data": {
'list': result_list,
'total': total_count
}
}
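A sketch of pulling page-monitoring diffs; each returned row pairs the two latest snapshots of a URL with their hashes and diff. Base URL and auth are assumed.

```python
# Hypothetical query against /page/monitoring/result; base URL/auth assumed.
import requests

resp = requests.post(
    "http://localhost:8000/page/monitoring/result",
    json={"search": "", "pageIndex": 1, "pageSize": 10},
    headers={"Authorization": "Bearer <token>"},
)
for change in resp.json()["data"]["list"]:
    print(change["url"], change["hash1"], "->", change["hash2"])
```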

199
api/poc.py Normal file

@@ -0,0 +1,199 @@
# -*- coding:utf-8 -*-  
# @name: poc_manage
# @auth: rainy-autumn@outlook.com
# @version:
from bson import ObjectId
from fastapi import APIRouter, Depends
from motor.motor_asyncio import AsyncIOMotorCursor
from api.users import verify_token
from core.db import get_mongo_db
from pymongo import ASCENDING, DESCENDING
from loguru import logger
from core.redis_handler import refresh_config
from core.util import *
from core.config import POC_LIST
router = APIRouter()
@router.post("/poc/data")
async def poc_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
query = {"name": {"$regex": search_query, "$options": "i"}}
# Get the total count of documents matching the search criteria
total_count = await db.PocList.count_documents(query)
# Perform pagination query and sort by time
cursor: AsyncIOMotorCursor = db.PocList.find(query, {"_id": 0, "id": {"$toString": "$_id"}, "name": 1, "level": 1, "time": 1}).sort([("level", DESCENDING), ("time", DESCENDING)]).skip((page_index - 1) * page_size).limit(page_size)
result = await cursor.to_list(length=None)
return {
"code": 200,
"data": {
'list': result,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.get("/poc/data/all")
async def poc_data_all(db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
cursor: AsyncIOMotorCursor = db.PocList.find({}, {"id": {"$toString": "$_id"}, "name": 1, "time": 1, "_id": 0}).sort([("time", DESCENDING)])
result = await cursor.to_list(length=None)
return {
"code": 200,
"data": {
'list': result
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/poc/content")
async def poc_content(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Get the ID from the request data
poc_id = request_data.get("id")
# Check if ID is provided
if not poc_id:
return {"message": "ID is missing in the request data", "code": 400}
# Query the database for content based on ID
query = {"_id": ObjectId(poc_id)}
doc = await db.PocList.find_one(query)
if not doc:
return {"message": "Content not found for the provided ID", "code": 404}
# Extract the content
content = doc.get("content", "")
return {"code": 200, "data": {"content": content}}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/poc/update")
async def update_poc_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Get the ID from the request data
poc_id = request_data.get("id")
# Check if ID is provided
if not poc_id:
return {"message": "ID is missing in the request data", "code": 400}
# Check if data to update is provided
if not request_data:
return {"message": "Data to update is missing in the request", "code": 400}
# Extract individual fields from the request data
name = request_data.get("name")
content = request_data.get("content")
hash_value = calculate_md5_from_content(content)
level = request_data.get("level")
# Prepare the update document
update_document = {
"$set": {
"name": name,
"content": content,
"hash": hash_value,
"level": level
}
}
# Remove the ID from the request data to prevent it from being updated
del request_data["id"]
# Update data in the database
result = await db.PocList.update_one({"_id": ObjectId(poc_id)}, update_document)
# Check if the update was successful
if result:
POC_LIST[poc_id] = level
await refresh_config('all', 'poc')
return {"message": "Data updated successfully", "code": 200}
else:
return {"message": "Failed to update data", "code": 404}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/poc/add")
async def add_poc_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Check if data to add is provided
if not request_data:
return {"message": "Data to add is missing in the request", "code": 400}
# Extract individual fields from the request data
name = request_data.get("name")
content = request_data.get("content")
hash_value = calculate_md5_from_content(content)
level = request_data.get("level")
formatted_time = get_now_time()
# Insert data into the database
result = await db.PocList.insert_one({
"name": name,
"content": content,
"hash": hash_value,
"level": level,
"time": formatted_time
})
# Check if the insertion was successful
if result.inserted_id:
POC_LIST[str(result.inserted_id)] = level
await refresh_config('all', 'poc')
return {"message": "Data added successfully", "code": 200}
else:
return {"message": "Failed to add data", "code": 400}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/poc/delete")
async def delete_poc_rules(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract the list of IDs from the request_data dictionary
poc_ids = request_data.get("ids", [])
# Convert the provided IDs to ObjectId
obj_ids = [ObjectId(poc_id) for poc_id in poc_ids]
# Delete the POC documents based on the provided IDs
result = await db.PocList.delete_many({"_id": {"$in": obj_ids}})
# Check if the deletion was successful
if result.deleted_count > 0:
for pid in poc_ids:
if pid in POC_LIST:
del POC_LIST[pid]
return {"code": 200, "message": "Poc deleted successfully"}
else:
return {"code": 404, "message": "Poc not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}

747
api/project.py Normal file

@@ -0,0 +1,747 @@
import time
from bson import ObjectId
from fastapi import APIRouter, Depends, BackgroundTasks
from api.task import create_scan_task
from api.users import verify_token
from motor.motor_asyncio import AsyncIOMotorCursor
from core.config import Project_List
from core.db import get_mongo_db
from core.redis_handler import refresh_config, get_redis_pool
from loguru import logger
from core.util import *
from core.apscheduler_handler import scheduler
router = APIRouter()
@router.post("/project/data")
async def get_projects_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token),
background_tasks: BackgroundTasks = BackgroundTasks()):
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
if search_query == "":
query = {}
else:
query = {
"$or": [
{"name": {"$regex": search_query, "$options": "i"}},
{"target": {"$regex": search_query, "$options": "i"}}
]
}
tag_num = {}
tag_result = await db.project.aggregate([{
"$group": {
"_id": "$tag",
"count": {"$sum": 1}
}
}]).to_list(None)
all_num = 0
for tag in tag_result:
tag_num[tag["_id"]] = tag["count"]
all_num += tag["count"]
result_list = {}
tag_num["All"] = all_num
for tag in tag_num:
if tag != "All":
tag_query = {
"$and": [
query,
{"tag": tag}
]
}
else:
tag_query = query
cursor = db.project.find(tag_query, {"_id": 0,
"id": {"$toString": "$_id"},
"name": 1,
"logo": 1,
"AssetCount": 1,
"tag": 1
}).sort("AssetCount", -1).skip((page_index - 1) * page_size).limit(
page_size)
results = await cursor.to_list(length=None)
result_list[tag] = []
for result in results:
result["AssetCount"] = result.get("AssetCount", 0)
result_list[tag].append(result)
background_tasks.add_task(update_project_count, db=db, id=result["id"])
return {
"code": 200,
'data': {
"result": result_list,
"tag": tag_num
}
}
async def update_project_count(db, id):
query = {"project": {"$eq": id}}
total_count = await db['asset'].count_documents(query)
update_document = {
"$set": {
"AssetCount": total_count
}
}
await db.project.update_one({"_id": ObjectId(id)}, update_document)
@router.post("/project/content")
async def get_project_content(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
project_id = request_data.get("id")
if not project_id:
return {"message": "ID is missing in the request data", "code": 400}
query = {"_id": ObjectId(project_id)}
doc = await db.project.find_one(query)
if not doc:
return {"message": "Content not found for the provided ID", "code": 404}
project_target_data = await db.ProjectTargetData.find_one({"id": project_id})
result = {
"name": doc.get("name", ""),
"tag": doc.get("tag", ""),
"target": project_target_data.get("target", ""),
"node": doc.get("node", []),
"logo": doc.get("logo", ""),
"subdomainScan": doc.get("subdomainScan", False),
"subdomainConfig": doc.get("subdomainConfig", []),
"urlScan": doc.get("urlScan", False),
"sensitiveInfoScan": doc.get("sensitiveInfoScan", False),
"pageMonitoring": doc.get("pageMonitoring", ""),
"crawlerScan": doc.get("crawlerScan", False),
"vulScan": doc.get("vulScan", False),
"vulList": doc.get("vulList", []),
"portScan": doc.get("portScan"),
"ports": doc.get("ports"),
"waybackurl": doc.get("waybackurl"),
"dirScan": doc.get("dirScan"),
"scheduledTasks": doc.get("scheduledTasks"),
"hour": doc.get("hour"),
"allNode": doc.get("allNode", False),
"duplicates": doc.get("duplicates")
}
return {"code": 200, "data": result}
@router.post("/project/add")
async def add_project_rule(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token),
background_tasks: BackgroundTasks = BackgroundTasks()):
try:
# Extract values from request data
name = request_data.get("name")
target = request_data.get("target")
scheduledTasks = request_data.get("scheduledTasks", False)
hour = request_data.get("hour", 1)
t_list = []
tmsg = ''
root_domains = []
        for t in target.split('\n'):
            t = t.strip()
            if t and t not in t_list:
                t_list.append(t)
                targetTmp = t.replace('http://', "").replace('https://', "")
                if targetTmp != "":
                    root_domain = get_root_domain(targetTmp)
                    if root_domain not in root_domains:
                        root_domains.append(root_domain)
                    tmsg += targetTmp + '\n'
        tmsg = tmsg.strip().strip("\n")
request_data["root_domains"] = root_domains
del request_data['target']
if "All Poc" in request_data['vulList']:
request_data['vulList'] = ["All Poc"]
cursor = db.project.find({"name": {"$eq": name}}, {"_id": 1})
results = await cursor.to_list(length=None)
if len(results) != 0:
return {"code": 400, "message": "name already exists"}
        # Insert the new document into the project collection
result = await db.project.insert_one(request_data)
# Check if the insertion was successful
if result.inserted_id:
await db.ProjectTargetData.insert_one({"id": str(result.inserted_id), "target": tmsg})
if scheduledTasks:
scheduler.add_job(scheduler_project, 'interval', hours=hour, args=[str(result.inserted_id)],
id=str(result.inserted_id), jobstore='mongo')
next_time = scheduler.get_job(str(result.inserted_id)).next_run_time
formatted_time = next_time.strftime("%Y-%m-%d %H:%M:%S")
                await db.ScheduledTasks.insert_one(
                    {"id": str(result.inserted_id), "name": name, 'hour': hour, 'type': 'Project', 'state': True,
                     'lastTime': get_now_time(), 'nextTime': formatted_time, 'runner_id': str(result.inserted_id)})
await scheduler_project(str(result.inserted_id))
background_tasks.add_task(update_project, tmsg, str(result.inserted_id))
await refresh_config('all', 'project')
Project_List[name] = str(result.inserted_id)
return {"code": 200, "message": "Project added successfully"}
else:
return {"code": 400, "message": "Failed to add Project"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/project/delete")
async def delete_project_rules(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token),
background_tasks: BackgroundTasks = BackgroundTasks()):
try:
        # Extract the project ID from the request_data dictionary
        pro_id = request_data.get("id", '')
        # Convert the provided project ID to ObjectId
        obj_id = ObjectId(pro_id)
        # Delete the project document based on the provided ID
result = await db.project.delete_many({"_id": {"$eq": obj_id}})
await db.ProjectTargetData.delete_many({"id": {"$eq": pro_id}})
# Check if the deletion was successful
if result.deleted_count > 0:
job = scheduler.get_job(pro_id)
if job:
scheduler.remove_job(pro_id)
background_tasks.add_task(delete_asset_project_handler, pro_id)
            for project_name in list(Project_List):
                if pro_id == Project_List[project_name]:
                    del Project_List[project_name]
                    break
return {"code": 200, "message": "Project deleted successfully"}
else:
return {"code": 404, "message": "Project not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/project/update")
async def update_project_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token),
background_tasks: BackgroundTasks = BackgroundTasks()):
try:
# Get the ID from the request data
pro_id = request_data.get("id")
hour = request_data.get("hour")
# Check if ID is provided
if not pro_id:
return {"message": "ID is missing in the request data", "code": 400}
query = {"id": pro_id}
doc = await db.ScheduledTasks.find_one(query)
newScheduledTasks = request_data.get("scheduledTasks")
if doc:
oldScheduledTasks = doc["state"]
old_hour = doc["hour"]
if oldScheduledTasks != newScheduledTasks:
if newScheduledTasks:
scheduler.add_job(scheduler_project, 'interval', hours=hour, args=[pro_id],
id=str(pro_id), jobstore='mongo')
await db.ScheduledTasks.update_one({"id": pro_id}, {"$set": {'state': True}})
else:
scheduler.remove_job(pro_id)
await db.ScheduledTasks.update_one({"id": pro_id}, {"$set": {'state': False}})
else:
if newScheduledTasks:
                    if hour != old_hour:
                        job = scheduler.get_job(pro_id)
                        if job is not None:
                            scheduler.remove_job(pro_id)
scheduler.add_job(scheduler_project, 'interval', hours=hour, args=[pro_id],
id=str(pro_id), jobstore='mongo')
else:
if newScheduledTasks:
scheduler.add_job(scheduler_project, 'interval', hours=hour, args=[str(pro_id)],
id=str(pro_id), jobstore='mongo')
next_time = scheduler.get_job(str(pro_id)).next_run_time
formatted_time = next_time.strftime("%Y-%m-%d %H:%M:%S")
                await db.ScheduledTasks.insert_one(
                    {"id": str(pro_id), "name": request_data['name'], 'hour': hour, 'type': 'Project', 'state': True,
                     'lastTime': get_now_time(), 'nextTime': formatted_time})
        result = await db.project.find_one({"_id": ObjectId(pro_id)})
        project_target_data = await db.ProjectTargetData.find_one({"id": pro_id})
        if result is None or project_target_data is None:
            return {"message": "Project not found for the provided ID", "code": 404}
        old_targets = project_target_data['target']
        old_name = result['name']
new_name = request_data['name']
new_targets = request_data['target']
if old_targets != new_targets.strip().strip('\n'):
new_root_domain = []
update_document = {
"$set": {
"target": new_targets.strip().strip('\n')
}
}
for n_t in new_targets.strip().strip('\n').split('\n'):
t_root_domain = get_root_domain(n_t)
if t_root_domain not in new_root_domain:
new_root_domain.append(t_root_domain)
request_data["root_domains"] = new_root_domain
await db.ProjectTargetData.update_one({"id": pro_id}, update_document)
background_tasks.add_task(change_update_project, new_targets.strip().strip('\n'), pro_id)
if old_name != new_name:
            if old_name in Project_List:
                del Project_List[old_name]
            Project_List[new_name] = pro_id
await db.ScheduledTasks.update_one({"id": pro_id}, {"$set": {'name': new_name}})
request_data.pop("id")
del request_data['target']
update_document = {
"$set": request_data
}
result = await db.project.update_one({"_id": ObjectId(pro_id)}, update_document)
# Check if the update was successful
if result:
return {"message": "Task updated successfully", "code": 200}
else:
return {"message": "Failed to update data", "code": 404}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
async def change_update_project(domain, project_id):
async for db in get_mongo_db():
await add_asset_project(db, domain, project_id, True)
await add_subdomain_project(db, domain, project_id, True)
await add_dir_project(db, domain, project_id, True)
await add_vul_project(db, domain, project_id, True)
await add_SubTaker_project(db, domain, project_id, True)
await add_PageMonitoring_project(db, domain, project_id, True)
await add_sensitive_project(db, domain, project_id, True)
await add_url_project(db, domain, project_id, True)
await add_crawler_project(db, domain, project_id, True)
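# The add_*_project helpers below all follow the same pattern: select records
# that are unassigned (or, when update=True, already owned by this project),
# read the record's host/url field, and tag records whose root domain matches
# one of the project's targets; records that no longer match are reset to an
# empty project.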
async def add_asset_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['asset'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"url": 1,
"host": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"asset project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
url = ""
if "url" in r:
url = r['url']
else:
url = r['host']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['asset'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['asset'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_asset_project error:{e}")
async def add_subdomain_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['subdomain'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"host": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"subdomain project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
url = r['host']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['subdomain'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['subdomain'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_subdomain_project error:{e}")
async def add_url_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['UrlScan'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"input": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"url project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
url = r['input']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['UrlScan'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['UrlScan'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_url_project error:{e}")
async def add_crawler_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['crawler'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"url": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"crawler project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
url = r['url']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['crawler'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['crawler'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_crawler_project error:{e}")
async def add_sensitive_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['SensitiveResult'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"url": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"sensitive project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
url = r['url']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['SensitiveResult'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['SensitiveResult'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_sensitive_project error:{e}")
async def add_dir_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['DirScanResult'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"url": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"dir project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
url = r['url']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['DirScanResult'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['DirScanResult'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_dir_project error:{e}")
async def add_vul_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['vulnerability'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"url": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"vul project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
url = r['url']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['vulnerability'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['vulnerability'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_vul_project error:{e}")
async def add_PageMonitoring_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['PageMonitoring'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"url": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"PageMonitoring project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
url = r['url']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['PageMonitoring'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['PageMonitoring'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_PageMonitoring_project error:{e}")
async def add_SubTaker_project(db, domain, project_id, update=False):
    try:
        if update:
            query = {"$or": [{"project": ""}, {"project": project_id}]}
        else:
            query = {"project": {"$eq": ""}}
cursor: AsyncIOMotorCursor = ((db['SubdoaminTakerResult'].find(query, {
"_id": 0, "id": {"$toString": "$_id"},
"Input": 1
})))
result = await cursor.to_list(length=None)
logger.debug(f"SubTaker project null number is {len(result)}")
if len(result) != 0:
domain_root_list = []
for d in domain.split("\n"):
u = get_root_domain(d)
if u not in domain_root_list:
domain_root_list.append(u)
for r in result:
                url = r['Input']
if url != "":
                    target_url = get_root_domain(url)
                    if target_url in domain_root_list:
update_document = {
"$set": {
"project": project_id,
}
}
await db['SubdoaminTakerResult'].update_one({"_id": ObjectId(r['id'])}, update_document)
else:
update_document = {
"$set": {
"project": "",
}
}
await db['SubdoaminTakerResult'].update_one({"_id": ObjectId(r['id'])}, update_document)
except Exception as e:
logger.error(f"add_SubTaker_project error:{e}")
async def update_project(domain, project_id):
async for db in get_mongo_db():
await add_asset_project(db, domain, project_id)
await add_subdomain_project(db, domain, project_id)
await add_dir_project(db, domain, project_id)
await add_vul_project(db, domain, project_id)
await add_SubTaker_project(db, domain, project_id)
await add_PageMonitoring_project(db, domain, project_id)
await add_sensitive_project(db, domain, project_id)
await add_url_project(db, domain, project_id)
await add_crawler_project(db, domain, project_id)
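# Detach every record of one collection from a deleted project by clearing
# its project field.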
async def delete_asset_project(db, collection, project_id):
try:
query = {"project": project_id}
cursor = db[collection].find(query)
async for document in cursor:
await db[collection].update_one({"_id": document["_id"]}, {"$set": {"project": ""}})
except Exception as e:
logger.error(f"delete_asset_project error:{e}")
async def delete_asset_project_handler(project_id):
async for db in get_mongo_db():
asset_collection_list = ['asset', 'subdomain', 'DirScanResult', 'vulnerability', 'SubdoaminTakerResult',
'PageMonitoring', 'SensitiveResult', 'UrlScan', 'crawler']
for c in asset_collection_list:
await delete_asset_project(db, c, project_id)
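# Scheduled project scan entry point: clears the previous run's progress keys
# in Redis, records a fresh runner_id, and dispatches a new scan task built
# from the project's deduplicated target list.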
async def scheduler_project(id):
logger.info(f"Scheduler project {id}")
async for db in get_mongo_db():
async for redis in get_redis_pool():
next_time = scheduler.get_job(id).next_run_time
formatted_time = next_time.strftime("%Y-%m-%d %H:%M:%S")
            doc = await db.ScheduledTasks.find_one({"id": id}) or {}
run_id_last = doc.get("runner_id", "")
if run_id_last != "":
progresskeys = await redis.keys(f"TaskInfo:progress:{run_id_last}:*")
for pgk in progresskeys:
await redis.delete(pgk)
task_id = generate_random_string(15)
update_document = {
"$set": {
"lastTime": get_now_time(),
"nextTime": formatted_time,
"runner_id": task_id
}
}
await db.ScheduledTasks.update_one({"id": id}, update_document)
query = {"_id": ObjectId(id)}
doc = await db.project.find_one(query)
targetList = []
            target_data = await db.ProjectTargetData.find_one({"id": id}) or {}
            for t in target_data.get('target', '').split("\n"):
t.replace("http://", "").replace("https://", "")
t = t.strip("\n").strip("\r").strip()
if t != "" and t not in targetList:
targetList.append(t)
await create_scan_task(doc, task_id, targetList, redis)

306
api/scheduled_tasks.py Normal file

@ -0,0 +1,306 @@
# -------------------------------------
# @file : scheduled_tasks.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/4/28 20:58
# -------------------------------------------
from apscheduler.events import JobSubmissionEvent, EVENT_JOB_MAX_INSTANCES, EVENT_JOB_SUBMITTED
from apscheduler.executors.base import MaxInstancesReachedError
from bson import ObjectId
from fastapi import APIRouter, Depends
from pytz import utc
from api.users import verify_token
from motor.motor_asyncio import AsyncIOMotorCursor
from core.apscheduler_handler import scheduler
from core.db import get_mongo_db
from core.redis_handler import get_redis_pool
from core.util import *
from api.node import get_redis_online_data
from api.page_monitoring import get_page_monitoring_data
router = APIRouter()
@router.post("/scheduled/task/data")
async def get_scheduled_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
# Fuzzy search based on the name field
query = {"name": {"$regex": search_query, "$options": "i"}}
# Get the total count of documents matching the search criteria
total_count = await db.ScheduledTasks.count_documents(query)
# Perform pagination query
cursor: AsyncIOMotorCursor = db.ScheduledTasks.find(query).skip((page_index - 1) * page_size).limit(page_size)
result = await cursor.to_list(length=None)
if len(result) == 0:
return {
"code": 200,
"data": {
'list': [],
'total': 0
}
}
result_list = []
for doc in result:
tmp = {
"id": doc["id"],
"name": doc["name"],
"type": doc["type"],
"lastTime": doc.get("lastTime", ""),
"nextTime": doc.get("nextTime", ""),
"state": doc.get("state"),
"cycle": doc.get("hour"),
"node": doc.get("node", []),
"allNode": doc.get("allNode", True),
"runner_id": doc.get("runner_id", "")
}
result_list.append(tmp)
return {
"code": 200,
"data": {
'list': result_list,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
# @router.post("/scheduled/task/run")
# async def scheduled_run(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token),
# jobstore_alias=None):
# try:
# id = request_data.get("id", "")
# job = scheduler.get_job(id)
# if job:
# executor = scheduler._lookup_executor(job.executor)
# run_times = [datetime.now(utc)]
# try:
# executor.submit_job(job, run_times)
# except MaxInstancesReachedError:
# scheduler._logger.warning(
# 'Execution of job "%s" skipped: maximum number of running '
# 'instances reached (%d)', job, job.max_instances)
# event = JobSubmissionEvent(EVENT_JOB_MAX_INSTANCES, job.id,
# jobstore_alias, run_times)
# scheduler._dispatch_event(event)
# except BaseException:
# scheduler._logger.exception('Error submitting job "%s" to executor "%s"',
# job, job.executor)
# else:
# event = JobSubmissionEvent(EVENT_JOB_SUBMITTED, job.id, jobstore_alias,
# run_times)
# scheduler._dispatch_event(event)
# return {"message": "task run success", "code": 200}
# else:
# return {"message": "Not Found Task", "code": 500}
# except:
# return {"message": "error", "code": 500}
@router.post("/scheduled/task/delete")
async def delete_task(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token),
redis_con=Depends(get_redis_pool)):
try:
# Extract the list of IDs from the request_data dictionary
task_ids = request_data.get("ids", [])
        # Collect the task ids to remove (the built-in page_monitoring job is skipped)
obj_ids = []
for task_id in task_ids:
if task_id != "page_monitoring":
obj_ids.append(task_id)
job = scheduler.get_job(task_id)
if job:
function_name = job.func.__name__ if hasattr(job.func, '__name__') else job.func
update_document = {
"$set": {
"scheduledTasks": False
}
}
if function_name == "scheduler_scan_task":
await db.task.update_one({"_id": ObjectId(task_id)}, update_document)
else:
await db.project.update_one({"_id": ObjectId(task_id)}, update_document)
scheduler.remove_job(task_id)
result = await db.ScheduledTasks.delete_many({"id": {"$in": obj_ids}})
# Check if the deletion was successful
if result.deleted_count > 0:
return {"code": 200, "message": "Scheduled Task deleted successfully"}
else:
return {"code": 404, "message": "Scheduled Task not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
async def get_page_monitoring_time():
async for db in get_mongo_db():
        result = await db.ScheduledTasks.find_one({"id": "page_monitoring"})
        hour = result['hour'] if result else 24
        return hour
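# Periodic page-monitoring job: refreshes last/next run times, resolves the
# node list (all online nodes or a fixed set), pushes the monitored URLs into
# Redis, and notifies each node through its NodeTask queue.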
async def create_page_monitoring_task():
logger.info("create_page_monitoring_task")
async for db in get_mongo_db():
async for redis in get_redis_pool():
name_list = []
result = await db.ScheduledTasks.find_one({"id": "page_monitoring"})
next_time = scheduler.get_job("page_monitoring").next_run_time
formatted_time = next_time.strftime("%Y-%m-%d %H:%M:%S")
update_document = {
"$set": {
"lastTime": get_now_time(),
"nextTime": formatted_time
}
}
await db.ScheduledTasks.update_one({"_id": result['_id']}, update_document)
if result['allNode']:
tmp = await get_redis_online_data(redis)
name_list += tmp
else:
name_list += result['node']
targetList = await get_page_monitoring_data(db, False)
if len(targetList) == 0:
return
await redis.lpush(f"TaskInfo:page_monitoring", *targetList)
add_redis_task_data = {
"type": 'page_monitoring',
"TaskId": "page_monitoring"
}
for name in name_list:
await redis.rpush(f"NodeTask:{name}", json.dumps(add_redis_task_data))
@router.post("/scheduled/task/pagemonit/data")
async def get_scheduled_task_pagemonit_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
# Fuzzy search based on the name field
query = {"url": {"$regex": search_query, "$options": "i"}}
# Get the total count of documents matching the search criteria
total_count = await db.PageMonitoring.count_documents(query)
# Perform pagination query
cursor: AsyncIOMotorCursor = db.PageMonitoring.find(query).skip((page_index - 1) * page_size).limit(page_size)
result = await cursor.to_list(length=None)
result_list = []
for doc in result:
tmp = {
"id": str(doc["_id"]),
"url": doc["url"]
}
result_list.append(tmp)
return {
"code": 200,
"data": {
'list': result_list,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/scheduled/task/pagemonit/update")
async def update_scheduled_task_pagemonit_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
if not request_data:
return {"message": "Data to update is missing in the request", "code": 400}
state = request_data.get('state')
formatted_time = ""
job = scheduler.get_job('page_monitoring')
if state:
if job is None:
scheduler.add_job(create_page_monitoring_task, 'interval', hours=request_data.get('hour', 24), id='page_monitoring', jobstore='mongo')
next_time = scheduler.get_job('page_monitoring').next_run_time
formatted_time = next_time.strftime("%Y-%m-%d %H:%M:%S")
else:
if job:
scheduler.remove_job('page_monitoring')
update_document = {
"$set": {
"hour": request_data.get('hour', 24),
"node": request_data.get('node', []),
"allNode": request_data.get('allNode', True),
"nextTime": formatted_time,
"state": request_data.get('state'),
}
}
result = await db.ScheduledTasks.update_one({"id": 'page_monitoring'}, update_document)
if result:
return {"message": "Data updated successfully", "code": 200}
else:
return {"message": "Failed to update data", "code": 404}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/scheduled/task/pagemonit/delete")
async def delete_scheduled_task_pagemonit_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract the list of IDs from the request_data dictionary
url_ids = request_data.get("ids", [])
        # Convert the provided url_ids to ObjectId
        obj_ids = [ObjectId(url_id) for url_id in url_ids]
        # Delete the PageMonitoring documents based on the provided IDs
result = await db.PageMonitoring.delete_many({"_id": {"$in": obj_ids}})
# Check if the deletion was successful
if result.deleted_count > 0:
return {"code": 200, "message": "URL deleted successfully"}
else:
return {"code": 404, "message": "URL not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/scheduled/task/pagemonit/add")
async def add_scheduled_task_pagemonit_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
if not request_data:
return {"message": "Data to add is missing in the request", "code": 400}
url = request_data.get("url")
result = await db.PageMonitoring.insert_one({
"url": url,
"content": [],
"hash": [],
"diff": [],
"state": 1,
"project": '',
"time": ''
})
if result.inserted_id:
return {"message": "Data added successfully", "code": 200}
else:
return {"message": "Failed to add data", "code": 400}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}

240
api/sensitive.py Normal file

@ -0,0 +1,240 @@
# -*- coding:utf-8 -*-
# @name: sensitive
# @auth: rainy-autumn@outlook.com
# @version:
from datetime import datetime
from bson import ObjectId
from fastapi import APIRouter, Depends
from pymongo import DESCENDING
from api.users import verify_token
from motor.motor_asyncio import AsyncIOMotorCursor
from core.config import SensitiveRuleList
from core.db import get_mongo_db
from core.redis_handler import refresh_config
from loguru import logger
from core.util import search_to_mongodb
router = APIRouter()
@router.post("/sensitive/data")
async def get_sensitive_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
# MongoDB collection for SensitiveRule
# Fuzzy search based on the name field
query = {"name": {"$regex": search_query, "$options": "i"}}
# Get the total count of documents matching the search criteria
total_count = await db.SensitiveRule.count_documents(query)
# Perform pagination query
cursor: AsyncIOMotorCursor = db.SensitiveRule.find(query).skip((page_index - 1) * page_size).limit(page_size).sort([("timestamp", DESCENDING)])
result = await cursor.to_list(length=None)
if len(result) == 0:
return {
"code": 200,
"data": {
'list': [],
'total': 0
}
}
# Process the result as needed
response_data = [{"id": str(doc["_id"]),"name": doc["name"], "regular": doc["regular"], "state": doc["state"], "color": doc["color"]} for doc in result]
return {
"code": 200,
"data": {
'list': response_data,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}
@router.post("/sensitive/update")
async def upgrade_sensitive_rule(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract values from request data
rule_id = request_data.get("id")
name = request_data.get("name")
regular = request_data.get("regular")
color = request_data.get("color")
state = request_data.get("state")
# Update query based on rule_id
update_query = {"_id": ObjectId(rule_id)}
# Values to be updated
update_values = {"$set": {"name": name, "regular": regular, "color": color, "state": state}}
# Perform the update
result = await db.SensitiveRule.update_one(update_query, update_values)
if result:
SensitiveRuleList[str(rule_id)] = {
"name": name,
"color": color
}
await refresh_config('all', 'sensitive')
return {"code": 200, "message": "SensitiveRule updated successfully"}
else:
return {"code": 404, "message": "SensitiveRule not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/sensitive/add")
async def add_sensitive_rule(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract values from request data
name = request_data.get("name")
regular = request_data.get("regular",'')
color = request_data.get("color")
state = request_data.get("state")
        if regular == '':
            return {"code": 500, "message": "regular is empty"}
# Create a new SensitiveRule document
new_rule = {
"name": name,
"regular": regular,
"color": color,
"state": state
}
# Insert the new document into the SensitiveRule collection
result = await db.SensitiveRule.insert_one(new_rule)
# Check if the insertion was successful
if result.inserted_id:
SensitiveRuleList[str(result.inserted_id)] = {
"name": name,
"color": color
}
await refresh_config('all', 'sensitive')
return {"code": 200, "message": "SensitiveRule added successfully"}
else:
return {"code": 400, "message": "Failed to add SensitiveRule"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/sensitive/delete")
async def delete_sensitive_rules(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Extract the list of IDs from the request_data dictionary
rule_ids = request_data.get("ids", [])
# Convert the provided rule_ids to ObjectId
obj_ids = [ObjectId(rule_id) for rule_id in rule_ids]
# Delete the SensitiveRule documents based on the provided IDs
result = await db.SensitiveRule.delete_many({"_id": {"$in": obj_ids}})
# Check if the deletion was successful
if result.deleted_count > 0:
            for rule_id in rule_ids:
                if rule_id in SensitiveRuleList:
                    del SensitiveRuleList[rule_id]
            await refresh_config('all', 'sensitive')
return {"code": 200, "message": "SensitiveRules deleted successfully"}
else:
return {"code": 404, "message": "SensitiveRules not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/sensitive/result/data")
async def get_sensitive_result_rules(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
keyword = {
'url': 'url',
'sname': 'sid',
"body": "body",
"info": "match",
'project': 'project',
'md5': 'md5'
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
total_count = await db['SensitiveResult'].count_documents(query)
cursor: AsyncIOMotorCursor = ((db['SensitiveResult'].find(query, {"_id": 0,
"id": {"$toString": "$_id"},
"url": 1,
"sid": 1,
"match": 1,
"time": 1,
"color": 1
})
.skip((page_index - 1) * page_size)
.limit(page_size))
.sort([("time", DESCENDING)]))
result = await cursor.to_list(length=None)
result_list = []
for r in result:
tmp = {
'id': r['id'],
'url': r['url'],
'name': r['sid'],
'color': r['color'],
'match': r['match'],
'time': r['time']
}
result_list.append(tmp)
return {
"code": 200,
"data": {
'list': result_list,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}
@router.post("/sensitive/result/body")
async def get_sensitive_result_body_rules(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Get the ID from the request data
sensitive_result_id = request_data.get("id")
# Check if ID is provided
if not sensitive_result_id:
return {"message": "ID is missing in the request data", "code": 400}
# Query the database for content based on ID
query = {"_id": ObjectId(sensitive_result_id)}
doc = await db.SensitiveResult.find_one(query)
if not doc:
return {"message": "Content not found for the provided ID", "code": 404}
# Extract the content
content = doc.get("body", "")
return {"code": 200, "data": {"body": content}}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}

76
api/system.py Normal file

@ -0,0 +1,76 @@
# -------------------------------------
# @file : system.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/5/14 21:59
# -------------------------------------------
from fastapi import APIRouter, Depends
import git
from api.users import verify_token
from core.db import get_mongo_db
from core.config import *
import requests
from core.redis_handler import get_redis_pool, refresh_config
router = APIRouter()
@router.get("/system/version")
async def get_system_version(redis_con=Depends(get_redis_pool), _: dict = Depends(verify_token)):
try:
r = requests.get(f"{UPDATEURL}/get/version?name=server").json()
server_lversion = r["value"]
server_msg = r['msg']
r = requests.get(f"{UPDATEURL}/get/version?name=scan").json()
scan_lversion = r["value"]
scan_msg = r['msg']
    except Exception:
server_lversion = ""
server_msg = ""
scan_lversion = ""
scan_msg = ""
result_list = [{"name": "ScopeSentry-Server", "cversion": VERSION, "lversion": server_lversion, "msg": server_msg}]
async with redis_con as redis:
keys = await redis.keys("node:*")
for key in keys:
name = key.split(":")[1]
hash_data = await redis.hgetall(key)
result_list.append({"name": name, "cversion": hash_data["version"], "lversion": scan_lversion, "msg": scan_msg})
return {
"code": 200,
"data": {
'list': result_list
}
}
@router.get("/system/update")
async def system_update():
update_server()
await refresh_config("all", 'UpdateSystem')
def update_server():
repo_path = os.getcwd()
    if not os.path.isdir(os.path.join(repo_path, '.git')):
        repo = git.Repo.init(repo_path)
    else:
        repo = git.Repo(repo_path)
if not repo.remotes:
        # Add the remote repository address
repo.create_remote('origin', REMOTE_REPO_URL)
else:
        # Get the current remote URL
remote_url = repo.remotes.origin.url
        # Check whether the remote URL matches the expected address
if remote_url != REMOTE_REPO_URL:
repo.remotes.origin.set_url(REMOTE_REPO_URL)
result = repo.remotes.origin.pull()
for info in result:
print(info)

542
api/task.py Normal file

@ -0,0 +1,542 @@
# -*- coding:utf-8 -*-
# @name: task
# @auth: rainy-autumn@outlook.com
# @version:
import asyncio
import json
from loguru import logger
from bson import ObjectId
from fastapi import APIRouter, Depends, BackgroundTasks
from pymongo import DESCENDING
from api.users import verify_token
from motor.motor_asyncio import AsyncIOMotorCursor
from core.apscheduler_handler import scheduler
from core.db import get_mongo_db
from core.redis_handler import get_redis_pool
from core.util import *
from api.node import get_redis_online_data
from api.page_monitoring import get_page_monitoring_data
router = APIRouter()
@router.post("/task/data")
async def get_task_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token), background_tasks: BackgroundTasks = BackgroundTasks()):
try:
background_tasks.add_task(task_progress)
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
# Fuzzy search based on the name field
query = {"name": {"$regex": search_query, "$options": "i"}}
# Get the total count of documents matching the search criteria
total_count = await db.task.count_documents(query)
# Perform pagination query
cursor: AsyncIOMotorCursor = db.task.find(query).skip((page_index - 1) * page_size).limit(page_size).sort([("creatTime", DESCENDING)])
result = await cursor.to_list(length=None)
if len(result) == 0:
return {
"code": 200,
"data": {
'list': [],
'total': 0
}
}
# Process the result as needed
response_data = [{"id": str(doc["_id"]), "name": doc["name"], "taskNum": doc["taskNum"], "progress": doc["progress"], "creatTime": doc["creatTime"], "endTime": doc["endTime"]} for doc in result]
return {
"code": 200,
"data": {
'list': response_data,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}
@router.post("/task/add")
async def add_task(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token), redis_con=Depends(get_redis_pool)):
try:
name = request_data.get("name")
target = request_data.get("target", "")
node = request_data.get("node")
if name == "" or target == "" or node == []:
return {"message": "Null", "code": 500}
scheduledTasks = request_data.get("scheduledTasks", False)
hour = request_data.get("hour", 1)
targetList = []
targetTmp = ""
for t in target.split("\n"):
t.replace("http://","").replace("https://","")
t = t.strip("\n").strip("\r")
if t != "" and t not in targetList:
targetList.append(t)
targetTmp += t + "\n"
taskNum = len(targetList)
request_data['taskNum'] = taskNum
request_data['target'] = targetTmp.strip("\n")
request_data['progress'] = 0
request_data["creatTime"] = get_now_time()
request_data["endTime"] = ""
if "All Poc" in request_data['vulList']:
request_data['vulList'] = ["All Poc"]
result = await db.task.insert_one(request_data)
# Check if the insertion was successful
if result.inserted_id:
if scheduledTasks:
scheduler.add_job(scheduler_scan_task, 'interval', hours=hour, args=[str(result.inserted_id)],
id=str(result.inserted_id), jobstore='mongo')
next_time = scheduler.get_job(str(result.inserted_id)).next_run_time
formatted_time = next_time.strftime("%Y-%m-%d %H:%M:%S")
                await db.ScheduledTasks.insert_one(
                    {"id": str(result.inserted_id), "name": name, 'hour': hour, 'type': 'Scan', 'state': True, 'lastTime': get_now_time(), 'nextTime': formatted_time, 'runner_id': str(result.inserted_id)})
f = await create_scan_task(request_data, result.inserted_id, targetList, redis_con)
if f:
return {"code": 200, "message": "Task added successfully"}
else:
return {"code": 400, "message": "Failed to add Task"}
else:
return {"code": 400, "message": "Failed to add Task"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/task/content")
async def task_content(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Get the ID from the request data
task_id = request_data.get("id")
# Check if ID is provided
if not task_id:
return {"message": "ID is missing in the request data", "code": 400}
# Query the database for content based on ID
query = {"_id": ObjectId(task_id)}
doc = await db.task.find_one(query)
if not doc:
return {"message": "Content not found for the provided ID", "code": 404}
result = {
"name": doc.get("name", ""),
"target": doc.get("target", ""),
"node": doc.get("node", []),
"subdomainScan": doc.get("subdomainScan", False),
"subdomainConfig": doc.get("subdomainConfig", []),
"urlScan": doc.get("urlScan", False),
"sensitiveInfoScan": doc.get("sensitiveInfoScan", False),
"pageMonitoring": doc.get("pageMonitoring", ""),
"crawlerScan": doc.get("crawlerScan", False),
"vulScan": doc.get("vulScan", False),
"vulList": doc.get("vulList", []),
"portScan": doc.get("portScan"),
"ports": doc.get("ports"),
"waybackurl": doc.get("waybackurl"),
"dirScan": doc.get("dirScan"),
"scheduledTasks": doc.get("scheduledTasks"),
"hour": doc.get("hour"),
"duplicates": doc.get("duplicates")
}
return {"code": 200, "data": result}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/task/delete")
async def delete_task(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token), redis_con=Depends(get_redis_pool)):
try:
# Extract the list of IDs from the request_data dictionary
task_ids = request_data.get("ids", [])
        # Convert the provided task_ids to ObjectId and collect their Redis keys
        obj_ids = []
        redis_key = []
        for task_id in task_ids:
            obj_ids.append(ObjectId(task_id))
            redis_key.append("TaskInfo:" + task_id)
        if redis_key:
            await redis_con.delete(*redis_key)
        # Delete the task documents based on the provided IDs
result = await db.task.delete_many({"_id": {"$in": obj_ids}})
# Check if the deletion was successful
if result.deleted_count > 0:
return {"code": 200, "message": "Task deleted successfully"}
else:
return {"code": 404, "message": "Task not found"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
@router.post("/task/retest")
async def retest_task(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token), redis_con=Depends(get_redis_pool)):
try:
# Get the ID from the request data
task_id = request_data.get("id")
# Check if ID is provided
if not task_id:
return {"message": "ID is missing in the request data", "code": 400}
# Query the database for content based on ID
query = {"_id": ObjectId(task_id)}
doc = await db.task.find_one(query)
if not doc:
return {"message": "Content not found for the provided ID", "code": 404}
target = doc['target']
targetList = target.split("\n")
keys_to_delete = [
f"TaskInfo:tmp:{task_id}",
f"TaskInfo:{task_id}",
f"TaskInfo:time:{task_id}",
f"duplicates:url:{task_id}",
f"duplicates:domain:{task_id}",
f"duplicates:sensresp:{task_id}",
f"duplicates:craw:{task_id}"
]
progresskeys = await redis_con.keys(f"TaskInfo:progress:{task_id}:*")
keys_to_delete.extend(progresskeys)
if keys_to_delete:
await redis_con.delete(*keys_to_delete)
f = await create_scan_task(doc, task_id, targetList, redis_con)
if f:
update_document = {
"$set": {
"progress": 0,
"creatTime": get_now_time(),
"endTime": ""
}
}
await db.task.update_one({"_id": ObjectId(task_id)}, update_document)
return {"code": 200, "message": "Task added successfully"}
else:
return {"code": 400, "message": "Failed to add Task"}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
async def create_scan_task(request_data, id, targetList, redis_con):
try:
request_data["id"] = str(id)
        if request_data.get('allNode'):
            request_data["node"] = await get_redis_online_data(redis_con)
keys_to_delete = [
f"TaskInfo:tmp:{id}",
f"TaskInfo:{id}",
f"TaskInfo:time:{id}",
f"duplicates:url:{id}",
f"duplicates:domain:{id}",
f"duplicates:sensresp:{id}",
f"duplicates:craw:{id}"
]
progresskeys = await redis_con.keys(f"TaskInfo:progress:{id}:*")
keys_to_delete.extend(progresskeys)
if keys_to_delete:
await redis_con.delete(*keys_to_delete)
add_redis_task_data = transform_db_redis(request_data)
async with redis_con as redis:
await redis.lpush(f"TaskInfo:{id}", *targetList)
for name in request_data["node"]:
await redis.rpush(f"NodeTask:{name}", json.dumps(add_redis_task_data))
return True
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return False
@router.post("/task/update")
async def update_task_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
# Get the ID from the request data
task_id = request_data.get("id")
hour = request_data.get("hour")
# Check if ID is provided
if not task_id:
return {"message": "ID is missing in the request data", "code": 400}
query = {"id": task_id}
doc = await db.ScheduledTasks.find_one(query)
oldScheduledTasks = doc["state"]
old_hour = doc["hour"]
newScheduledTasks = request_data.get("scheduledTasks")
if oldScheduledTasks != newScheduledTasks:
if newScheduledTasks:
scheduler.add_job(scheduler_scan_task, 'interval', hours=hour, args=[task_id],
id=str(task_id), jobstore='mongo')
await db.ScheduledTasks.update_one({"id": task_id}, {"$set": {'state': True}})
else:
scheduler.remove_job(task_id)
await db.ScheduledTasks.update_one({"id": task_id}, {"$set": {'state': False}})
if newScheduledTasks:
if hour != old_hour:
                job = scheduler.get_job(task_id)
                if job is not None:
                    scheduler.remove_job(task_id)
scheduler.add_job(scheduler_scan_task, 'interval', hours=hour, args=[task_id],
id=str(task_id), jobstore='mongo')
request_data.pop("id")
update_document = {
"$set": request_data
}
result = await db.task.update_one({"_id": ObjectId(task_id)}, update_document)
# Check if the update was successful
if result:
return {"message": "Task updated successfully", "code": 200}
else:
return {"message": "Failed to update data", "code": 404}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error", "code": 500}
async def task_progress():
async for db in get_mongo_db():
async for redis in get_redis_pool():
query = {"progress": {"$ne": 100}}
cursor: AsyncIOMotorCursor = db.task.find(query)
result = await cursor.to_list(length=None)
if len(result) == 0:
return True
for r in result:
id = str(r["_id"])
key = f"TaskInfo:tmp:{id}"
exists = await redis.exists(key)
if exists:
count = await redis.llen(key)
progress_tmp = round(count / r['taskNum'], 2)
progress_tmp = round(progress_tmp * 100, 1)
if progress_tmp > 100:
progress_tmp = 100
if progress_tmp == 100:
time_key = f"TaskInfo:time:{id}"
time_value = await redis.get(time_key)
await db.task.update_one({"_id": r["_id"]}, {"$set": {"endTime": time_value}})
await db.task.update_one({"_id": r["_id"]}, {"$set": {"progress": progress_tmp}})
else:
await db.task.update_one({"_id": r["_id"]}, {"$set": {"progress": 0}})
return
# @router.post("/task/progress/info")
# async def progress_info(request_data: dict, _: dict = Depends(verify_token), redis_con=Depends(get_redis_pool), db=Depends(get_mongo_db)):
# task_id = request_data.get("id")
# type = request_data.get("type")
# runner = request_data.get("runner")
# # Check if ID is provided
# if not task_id:
# return {"message": "ID is missing in the request data", "code": 400}
# query = {"_id": ObjectId(task_id)}
# if type == "scan":
# doc = await db.task.find_one(query)
# else:
# doc = await db.project.find_one(query)
# target_data = await db.ProjectTargetData.find_one({"id": task_id})
# doc["target"] = target_data["target"]
# if not doc:
# return {"message": "Content not found for the provided ID", "code": 404}
# target = doc['target']
# result_list = []
# for t in target.split("\n"):
# progress_result = {
# "subdomain": ["", ""],
# "subdomainTakeover": ["", ""],
# "portScan": ["", ""],
# "assetMapping": ["", ""],
# "urlScan": ["", ""],
# "sensitive": ["", ""],
# "crawler": ["", ""],
# "dirScan": ["", ""],
# "vulnerability": ["", ""],
# "all": ["", ""],
# "target": ""
# }
# if not doc['subdomainScan']:
# progress_result['subdomain'] = ['', '', '']
# if not doc['portScan']:
# progress_result['portScan'] = ['', '', '']
# if not doc['urlScan']:
# progress_result['urlScan'] = ['', '', '']
# if not doc['sensitiveInfoScan']:
# progress_result['sensitive'] = ['', '', '']
# if not doc['crawlerScan']:
# progress_result['crawler'] = ['', '', '']
# if not doc['dirScan']:
# progress_result['dirScan'] = ['', '', '']
# if not doc['vulScan']:
# progress_result['vulnerability'] = ['', '', '']
# if runner != "":
# key = "TaskInfo:progress:" + runner + ":" + t
# else:
# key = "TaskInfo:progress:" + task_id + ":" + t
# data = await redis_con.hgetall(key)
# progress_result["target"] = t
# if not data:
# result_list.append(progress_result)
# else:
# progress_result['subdomain'][0] = data.get("subdomain_start","")
# progress_result['subdomain'][1] = data.get("subdomain_end", "")
# progress_result['subdomainTakeover'][0] = data.get("subdomainTakeover_start", "")
# progress_result['subdomainTakeover'][1] = data.get("subdomainTakeover_end", "")
# progress_result['portScan'][0] = data.get("portScan_start", "")
# progress_result['portScan'][1] = data.get("portScan_end", "")
# progress_result['assetMapping'][0] = data.get("assetMapping_start", "")
# progress_result['assetMapping'][1] = data.get("assetMapping_end", "")
# progress_result['urlScan'][0] = data.get("urlScan_start", "")
# progress_result['urlScan'][1] = data.get("urlScan_end", "")
# progress_result['sensitive'][0] = data.get("sensitive_start", "")
# progress_result['sensitive'][1] = data.get("sensitive_end", "")
# progress_result['crawler'][0] = data.get("crawler_start", "")
# progress_result['crawler'][1] = data.get("crawler_end", "")
# progress_result['dirScan'][0] = data.get("dirScan_start", "")
# progress_result['dirScan'][1] = data.get("dirScan_end", "")
# progress_result['vulnerability'][0] = data.get("vulnerability_start", "")
# progress_result['vulnerability'][1] = data.get("vulnerability_end", "")
# progress_result['all'][0] = data.get("scan_start", "")
# progress_result['all'][1] = data.get("scan_end", "")
# result_list.append(progress_result)
# return {
# "code": 200,
# "data": {
# 'list': result_list,
# "total": len(result_list)
# }
# }
@router.post("/task/progress/info")
async def progress_info(request_data: dict, _: dict = Depends(verify_token), redis_con=Depends(get_redis_pool),
db=Depends(get_mongo_db)):
task_id = request_data.get("id")
type = request_data.get("type")
runner = request_data.get("runner")
if not task_id:
return {"message": "ID is missing in the request data", "code": 400}
query = {"_id": ObjectId(task_id)}
if type == "scan":
doc = await db.task.find_one(query)
else:
doc, target_data = await asyncio.gather(
db.project.find_one(query),
db.ProjectTargetData.find_one({"id": task_id})
)
        if doc and target_data:
            doc["target"] = target_data["target"]
if not doc:
return {"message": "Content not found for the provided ID", "code": 404}
    target = doc.get('target', '')
result_list = []
tasks = []
for t in target.split("\n"):
key = f"TaskInfo:progress:{runner or task_id}:{t}"
tasks.append(redis_con.hgetall(key))
redis_results = await asyncio.gather(*tasks)
for t, data in zip(target.split("\n"), redis_results):
progress_result = {
"subdomain": ["", ""],
"subdomainTakeover": ["", ""],
"portScan": ["", ""],
"assetMapping": ["", ""],
"urlScan": ["", ""],
"sensitive": ["", ""],
"crawler": ["", ""],
"dirScan": ["", ""],
"vulnerability": ["", ""],
"all": ["", ""],
"target": t
}
if not data:
result_list.append(progress_result)
continue
progress_result['subdomain'][0] = data.get("subdomain_start", "")
progress_result['subdomain'][1] = data.get("subdomain_end", "")
progress_result['subdomainTakeover'][0] = data.get("subdomainTakeover_start", "")
progress_result['subdomainTakeover'][1] = data.get("subdomainTakeover_end", "")
progress_result['portScan'][0] = data.get("portScan_start", "")
progress_result['portScan'][1] = data.get("portScan_end", "")
progress_result['assetMapping'][0] = data.get("assetMapping_start", "")
progress_result['assetMapping'][1] = data.get("assetMapping_end", "")
progress_result['urlScan'][0] = data.get("urlScan_start", "")
progress_result['urlScan'][1] = data.get("urlScan_end", "")
progress_result['sensitive'][0] = data.get("sensitive_start", "")
progress_result['sensitive'][1] = data.get("sensitive_end", "")
progress_result['crawler'][0] = data.get("crawler_start", "")
progress_result['crawler'][1] = data.get("crawler_end", "")
progress_result['dirScan'][0] = data.get("dirScan_start", "")
progress_result['dirScan'][1] = data.get("dirScan_end", "")
progress_result['vulnerability'][0] = data.get("vulnerability_start", "")
progress_result['vulnerability'][1] = data.get("vulnerability_end", "")
progress_result['all'][0] = data.get("scan_start", "")
progress_result['all'][1] = data.get("scan_end", "")
result_list.append(progress_result)
return {
"code": 200,
"data": {
'list': result_list,
"total": len(result_list)
}
}
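# Scheduled scan entry point: mirrors scheduler_project but reads targets from
# the task document itself and reuses create_scan_task for dispatch.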
async def scheduler_scan_task(id):
logger.info(f"Scheduler scan {id}")
async for db in get_mongo_db():
async for redis in get_redis_pool():
next_time = scheduler.get_job(id).next_run_time
formatted_time = next_time.strftime("%Y-%m-%d %H:%M:%S")
            doc = await db.ScheduledTasks.find_one({"id": id}) or {}
run_id_last = doc.get("runner_id", "")
if run_id_last != "" and id != run_id_last:
progresskeys = await redis.keys(f"TaskInfo:progress:{run_id_last}:*")
for pgk in progresskeys:
await redis.delete(pgk)
task_id = generate_random_string(15)
update_document = {
"$set": {
"lastTime": get_now_time(),
"nextTime": formatted_time,
"runner_id": task_id
}
}
await db.ScheduledTasks.update_one({"id": id}, update_document)
query = {"_id": ObjectId(id)}
doc = await db.task.find_one(query)
targetList = []
for t in doc['target'].split("\n"):
t.replace("http://", "").replace("https://", "")
t = t.strip("\n").strip("\r").strip()
if t != "" and t not in targetList:
targetList.append(t)
await create_scan_task(doc, task_id, targetList, redis)

97
api/users.py Normal file

@ -0,0 +1,97 @@
# -*- coding:utf-8 -*-
# @name: users
# @version:
from fastapi import APIRouter, Depends, HTTPException, Body
from pydantic import BaseModel
import jwt
from fastapi.security import OAuth2PasswordBearer
from datetime import datetime, timedelta
import hashlib
from core.config import SECRET_KEY
from core.db import get_mongo_db
router = APIRouter()
ALGORITHM = "HS256"
class LoginRequest(BaseModel):
username: str
password: str
class ChangePassword(BaseModel):
newPassword: str
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")
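# Issue a signed JWT that carries the given payload plus an "exp" expiry claim.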
def create_access_token(data: dict, expires_delta: timedelta):
to_encode = data.copy()
expire = datetime.utcnow() + expires_delta
to_encode.update({"exp": expire})
encoded_jwt = jwt.encode(to_encode, SECRET_KEY, algorithm=ALGORITHM)
return encoded_jwt
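# FastAPI dependency: decode the Bearer token and return its subject,
# raising a 401-style error when validation fails.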
async def verify_token(token: str = Depends(oauth2_scheme)):
credentials_exception = HTTPException(
status_code=200,
detail={"code": 401, "message": "Could not validate credentials"},
headers={"WWW-Authenticate": "Bearer"},
)
try:
payload = jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
username: str = payload.get("sub")
if username is None:
raise credentials_exception
return {"sub": username}
    except Exception:
        raise credentials_exception
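# Unsalted SHA-256 digest; login compares the stored and computed hashes directly.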
def hash_password(password: str) -> str:
hashed_password = hashlib.sha256(password.encode()).hexdigest()
return hashed_password
async def verify_user(username: str, password: str, db):
user = await db.user.find_one({"username": username})
if user and user["password"] == hash_password(password):
return user
return None
@router.post("/user/login")
async def login(login_request: LoginRequest = Body(...), db=Depends(get_mongo_db)):
username = login_request.username
password = login_request.password
user = await verify_user(username, password, db)
if user is None:
return {
'code': 401,
'message' : 'Incorrect username or password'
}
token_data = {"sub": username}
expires_delta = timedelta(days=30) # Set the expiration time as needed
token = create_access_token(token_data, expires_delta)
return {
'code': 200,
'data': {
'access_token': token
}
}
@router.post("/user/changePassword")
async def change_password(change_password: ChangePassword = Body(...), _: dict = Depends(verify_token), db=Depends(get_mongo_db)):
try:
newPassword = hash_password(change_password.newPassword)
        # note: login reads the "user" collection and the seeded "ScopeSentry"
        # account, so update that document rather than "users"/"admin"
        await db.user.update_one({"username": 'ScopeSentry'},
                                 {"$set": {"password": newPassword}})
return {
'code': 200,
'message': ''
}
    except Exception:
return {
'code': 500,
'message': 'Password change failed'
}

78
api/vulnerability.py Normal file
View File

@ -0,0 +1,78 @@
# -------------------------------------
# @file : vulnerability.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/4/27 13:25
# -------------------------------------------
from fastapi import APIRouter, Depends
from motor.motor_asyncio import AsyncIOMotorCursor
from pymongo import DESCENDING
from api.users import verify_token
from core.config import POC_LIST
from core.db import get_mongo_db
from core.util import search_to_mongodb
from loguru import logger
router = APIRouter()
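# Paginated vulnerability listing: parse the search expression into a MongoDB
# filter, count the matches, then return one page sorted newest-first.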
@router.post("/vul/data")
async def get_vul_data(request_data: dict, db=Depends(get_mongo_db), _: dict = Depends(verify_token)):
try:
search_query = request_data.get("search", "")
page_index = request_data.get("pageIndex", 1)
page_size = request_data.get("pageSize", 10)
        # Map search-expression keys onto vulnerability fields for fuzzy matching
keyword = {
'url': 'url',
'vulname': 'vulname',
'project': 'project',
'matched': 'matched',
'request': 'request',
'response': 'response',
'level': 'level'
}
query = await search_to_mongodb(search_query, keyword)
if query == "" or query is None:
return {"message": "Search condition parsing error", "code": 500}
query = query[0]
# Get the total count of documents matching the search criteria
total_count = await db.vulnerability.count_documents(query)
if total_count == 0:
return {
"code": 200,
"data": {
'list': [],
'total': 0
}
}
# Perform pagination query
cursor: AsyncIOMotorCursor = db.vulnerability.find(query).skip((page_index - 1) * page_size).limit(page_size).sort([("timestamp", DESCENDING)])
result = await cursor.to_list(length=None)
# Process the result as needed
response_data = []
for doc in result:
data = {
"id": str(doc["_id"]),
"url": doc["url"],
"vulnerability": doc["vulname"],
"matched": doc["matched"],
"time": doc["time"],
"request": doc["request"],
"response": doc["response"],
}
if doc["vulnid"] in POC_LIST:
data["level"] = POC_LIST[doc["vulnid"]]
response_data.append(data)
return {
"code": 200,
"data": {
'list': response_data,
'total': total_count
}
}
except Exception as e:
logger.error(str(e))
# Handle exceptions as needed
return {"message": "error","code":500}

3
core/__init__.py Normal file
View File

@ -0,0 +1,3 @@
# -*- coding:utf-8 -*-
# @name: __init__.py
# @version:

22
core/apscheduler_handler.py Normal file
View File

@ -0,0 +1,22 @@
# -------------------------------------
# @file : apscheduler_handler.py
# @author : Autumn
# @contact : rainy-autumn@outlook.com
# @time : 2024/4/21 19:36
# -------------------------------------------
from apscheduler.schedulers.asyncio import AsyncIOScheduler
from apscheduler.jobstores.mongodb import MongoDBJobStore
from core.config import *
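# Persist scheduled jobs in MongoDB so they survive application restarts.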
mongo_config = {
'host': MONGODB_IP,
'port': int(MONGODB_PORT),
'username': DATABASE_USER,
'password': DATABASE_PASSWORD,
'database': DATABASE_NAME,
'collection': 'apscheduler'
}
jobstores = {
'mongo': MongoDBJobStore(**mongo_config)
}
scheduler = AsyncIOScheduler(jobstores=jobstores)

98
core/config.py Normal file
View File

@ -0,0 +1,98 @@
# -*- coding:utf-8 -*-
# @name: config
# @auth: rainy-autumn@outlook.com
# @version:
import os
import random
import string
import yaml
VERSION = "1.0"
UPDATEURL = "http://update.scope-sentry.top"
REMOTE_REPO_URL = "https://github.com/Autumn-27/ScopeSentry.git"
SECRET_KEY = "ScopeSentry-15847412364125411"
MONGODB_IP = ""
MONGODB_PORT = 0
DATABASE_NAME = ""
DATABASE_USER = ''
DATABASE_PASSWORD = ''
REDIS_IP = ""
REDIS_PORT = ""
REDIS_PASSWORD = ""
TIMEZONE = 'Asia/Shanghai'
LOG_INFO = {}
GET_LOG_NAME = []
NODE_TIMEOUT = 50
TOTAL_LOGS = 1000
APP = {}
SensitiveRuleList = {}
POC_LIST = {}
Project_List = {}
def set_timezone(t):
global TIMEZONE
TIMEZONE = t
def get_timezone():
global TIMEZONE
return TIMEZONE
def generate_random_string(length):
    # Generate a random string of upper/lowercase letters and digits
characters = string.ascii_letters + string.digits
random_string = ''.join(random.choice(characters) for _ in range(length))
return random_string
def set_config():
global MONGODB_IP, MONGODB_PORT, DATABASE_NAME, DATABASE_USER, DATABASE_PASSWORD, REDIS_IP, REDIS_PORT, REDIS_PASSWORD, SECRET_KEY, TOTAL_LOGS, TIMEZONE
SECRET_KEY = generate_random_string(16)
config_file_path = "config.yaml"
if os.path.exists(config_file_path):
with open(config_file_path, 'r') as file:
data = yaml.safe_load(file)
MONGODB_IP = data['mongodb']['ip']
MONGODB_PORT = data['mongodb']['port']
DATABASE_NAME = data['mongodb']['database_name']
DATABASE_USER = data['mongodb']['username']
DATABASE_PASSWORD = data['mongodb']['password']
REDIS_IP = data['redis']['ip']
REDIS_PORT = data['redis']['port']
REDIS_PASSWORD = data['redis']['password']
TOTAL_LOGS = data['logs']['total_logs']
TIMEZONE = data['system']['timezone']
else:
TIMEZONE = os.environ.get("TIMEZONE", default='Asia/Shanghai')
MONGODB_IP = os.environ.get("MONGODB_IP", default='127.0.0.1')
MONGODB_PORT = int(os.environ.get("MONGODB_PORT", default=27017))
DATABASE_NAME = os.environ.get("DATABASE_NAME", default='ScopeSentry')
DATABASE_USER = os.environ.get("DATABASE_USER", default='root')
DATABASE_PASSWORD = os.environ.get("DATABASE_PASSWORD", default='QckSdkg5CKvtxfec')
REDIS_IP = os.environ.get("REDIS_IP", default='127.0.0.1')
REDIS_PORT = os.environ.get("REDIS_PORT", default="6379")
REDIS_PASSWORD = os.environ.get("REDIS_PASSWORD", default='ScopeSentry')
TOTAL_LOGS = 1000
config_data = {
'system': {
'timezone': TIMEZONE
},
'mongodb': {
'ip': MONGODB_IP,
'port': int(MONGODB_PORT),
'database_name': DATABASE_NAME,
'username': DATABASE_USER,
'password': DATABASE_PASSWORD,
},
'redis': {
'ip': REDIS_IP,
'port': REDIS_PORT,
'password': REDIS_PASSWORD,
},
'logs': {
'total_logs': TOTAL_LOGS
}
}
with open(config_file_path, 'w') as file:
yaml.dump(config_data, file)

185
core/db.py Normal file
View File

@ -0,0 +1,185 @@
# -*- coding:utf-8 -*-
# @name: db
# @auth: rainy-autumn@outlook.com
# @version:
from motor.motor_asyncio import AsyncIOMotorClient, AsyncIOMotorCursor
from core.default import *
from core.config import *
from core.util import string_to_postfix
from loguru import logger
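# FastAPI dependency: open a Motor client per request and close it afterwards.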
async def get_mongo_db():
client = AsyncIOMotorClient(f"mongodb://{DATABASE_USER}:{DATABASE_PASSWORD}@{MONGODB_IP}:{str(MONGODB_PORT)}",
serverSelectionTimeoutMS=10000, unicode_decode_error_handler='ignore')
db = client[DATABASE_NAME]
try:
yield db
finally:
client.close()
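# First-run initialization: if the database does not exist yet, seed the default
# user, system configuration, dictionaries, POCs, projects and fingerprint rules;
# otherwise load the cached configuration into memory.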
async def create_database():
client = None
try:
        # Create a new MongoDB client
client = AsyncIOMotorClient(f"mongodb://{DATABASE_USER}:{DATABASE_PASSWORD}@{MONGODB_IP}:{str(MONGODB_PORT)}",
serverSelectionTimeoutMS=2000)
        # Fetch the list of existing database names
database_names = await client.list_database_names()
        # If the database does not exist yet, initialize it
if DATABASE_NAME not in database_names:
            # Create a collection in the database, e.g. "user"
collection = client[DATABASE_NAME]["user"]
            # Default user account
await collection.insert_one({"username": "ScopeSentry",
'password': 'b0ce71fcbed8a6ca579d52800145119cc7d999dc8651b62dfc1ced9a984e6e64'})
collection = client[DATABASE_NAME]["config"]
            # Default system configuration
await collection.insert_one(
{"name": "timezone", 'value': 'Asia/Shanghai', 'type': 'system'})
await collection.insert_one(
{"name": "MaxTaskNum", 'value': '7', 'type': 'system'})
await collection.insert_one(
{"name": "DirscanThread", 'value': '15', 'type': 'system'})
await collection.insert_one(
{"name": "PortscanThread", 'value': '15', 'type': 'system'})
await collection.insert_one(
{"name": "CrawlerThread", 'value': '2', 'type': 'system'})
await collection.insert_one(
{"name": "UrlMaxNum", 'value': '500', 'type': 'system'})
await collection.insert_one(
{"name": "UrlThread", 'value': '5', 'type': 'system'})
            # Set the timezone to Asia/Shanghai
# SHA_TZ = timezone(TIMEZONE)
# timezone('Asia/Shanghai')
# utc_now = datetime.utcnow().replace(tzinfo=timezone.utc)
# time_now = utc_now.astimezone(SHA_TZ)
# formatted_time = time_now.strftime("%Y-%m-%d %H:%M:%S")
            # subfinder configuration
collection = client[DATABASE_NAME]["config"]
            # Insert the default entry
await collection.insert_one(
{"name": "SubfinderApiConfig", 'value': subfinderApiConfig, 'type': 'subfinder'})
await collection.insert_one(
{"name": "RadConfig", 'value': radConfig, 'type': 'rad'})
dirDict = get_dirDict()
await collection.insert_one(
{"name": "DirDic", 'value': dirDict, 'type': 'dirDict'})
await collection.insert_one(
{"name": "notification", 'dirScanNotification': True,
'portScanNotification': True, 'sensitiveNotification': True,
'subdomainTakeoverNotification': True,
'pageMonNotification': True,
'subdomainNotification': True,
'vulNotification': True,
'type': 'notification'})
domainDict = get_domainDict()
await collection.insert_one(
{"name": "DomainDic", 'value': domainDict, 'type': 'domainDict'})
sensitive_data = get_sensitive()
collection = client[DATABASE_NAME]["SensitiveRule"]
            if sensitive_data:  # was "if sensitiveList:", which is always truthy regardless of the loaded seed data
await collection.insert_many(sensitive_data)
collection = client[DATABASE_NAME]["ScheduledTasks"]
await collection.insert_one(
{"id": "page_monitoring", "name": "Page Monitoring", 'hour': 24, 'node': [], 'allNode': True, 'type': 'Page Monitoring', 'state': True})
collection = client[DATABASE_NAME]
await collection.create_collection("notification")
collection = client[DATABASE_NAME]["PortDict"]
await collection.insert_many(portDic)
collection = client[DATABASE_NAME]["PocList"]
pocData = get_poc()
await collection.insert_many(pocData)
collection = client[DATABASE_NAME]["project"]
project_data, target_data = get_project_data()
await collection.insert_many(project_data)
collection = client[DATABASE_NAME]["ProjectTargetData"]
await collection.insert_many(target_data)
collection = client[DATABASE_NAME]["FingerprintRules"]
fingerprint_rules = get_fingerprint_data()
for rule in fingerprint_rules:
express = string_to_postfix(rule['rule'])
if express == "":
continue
default_rule = {
'name': rule['product'],
'rule': rule['rule'],
'express': express,
'category': rule['category'],
'parent_category': rule['parent_category'],
'company': rule['company'],
'amount': 0,
'state': True
}
await collection.insert_one(default_rule)
else:
collection = client[DATABASE_NAME]["config"]
result = await collection.find_one({"name": "timezone"})
set_timezone(result.get('value', 'Asia/Shanghai'))
collection = client[DATABASE_NAME]["ScheduledTasks"]
result = await collection.find_one({"id": "page_monitoring"})
if not result:
await collection.insert_one(
{"id": "page_monitoring", "name": "Page Monitoring", 'hour': 24, 'type': 'Page Monitoring', 'state': True})
await get_fingerprint(client[DATABASE_NAME])
await get_sens_rule(client[DATABASE_NAME])
await get_pocList(client[DATABASE_NAME])
await get_project(client[DATABASE_NAME])
    except Exception as e:
        # Log the initialization failure and abort with a non-zero exit code
        logger.error(f"Error creating database: {e}")
        exit(1)
    finally:
        # Close the MongoDB client once initialization is done
        if client:
            client.close()
async def get_fingerprint(client):
collection = client["FingerprintRules"]
cursor = collection.find({}, {"_id": 1, "name": 1})
async for document in cursor:
document['id'] = str(document['_id'])
del document['_id']
APP[document['id']] = document['name']
async def get_sens_rule(client):
collection = client["SensitiveRule"]
cursor = collection.find({}, {"_id": 1, "name": 1, "color": 1})
async for document in cursor:
document['id'] = str(document['_id'])
del document['_id']
SensitiveRuleList[document['id']] = {
"name": document['name'],
"color": document['color']
}
async def get_pocList(client):
collection = client["PocList"]
cursor = collection.find({}, {"_id": 1, "level": 1})
async for document in cursor:
document['id'] = str(document['_id'])
POC_LIST[document['id']] = document['level']
async def get_project(client):
collection = client["project"]
cursor = collection.find({}, {"_id": 1, "name": 1})
async for document in cursor:
document['id'] = str(document['_id'])
Project_List[document['name'].lower()] = document['id']

493
core/default.py Normal file
View File

@ -0,0 +1,493 @@
# -*- coding:utf-8 -*-
# @name: default
# @auth: rainy-autumn@outlook.com
# @version:
import json
import os
from bson import ObjectId
from core.util import *
from loguru import logger
current_directory = os.getcwd()
dict_directory = "dicts"
combined_directory = os.path.join(current_directory, dict_directory)
def get_domainDict():
domainDict = ""
try:
        # Try to open the file and read its contents
        with open(os.path.join(combined_directory, "domainDict"), "r") as file:
            domainDict = file.read()
    except FileNotFoundError:
        logger.error("File not found")
return domainDict
def get_dirDict():
    dirDict = ""
    try:
        # Try to open the file and read its contents
        with open(os.path.join(combined_directory, "dirDict"), "r") as file:
            dirDict = file.read()
    except FileNotFoundError:
        logger.error("File not found")
    return dirDict
def get_poc():
pocPath = os.path.join(combined_directory, "ScopeSentry.PocList.json")
data = read_json_file(pocPath)
for d in data:
d.pop('_id', None)
return data
def get_project_data():
project_path = os.path.join(combined_directory, "ScopeSentry.project.json")
data = read_json_file(project_path)
target_data = []
project_data = []
for d in data:
project_id = d['_id']['$oid']
tmp = []
for t in d['target'].split('\n'):
root_domain = get_root_domain(t)
if root_domain not in tmp:
tmp.append(root_domain)
d["root_domains"] = tmp
d['_id'] = ObjectId(project_id)
target_data.append({"id": project_id, "target": d['target']})
del d["target"]
project_data.append(d)
return project_data, target_data
def get_sensitive():
sensitive_path = os.path.join(combined_directory, "ScopeSentry.SensitiveRule.json")
data = read_json_file(sensitive_path)
for d in data:
d.pop('_id', None)
return data
subfinderApiConfig = '''# subfinder can be used right after the installation, however many sources required API keys to work. Learn more here: https://docs.projectdiscovery.io/tools/subfinder/install#post-install-configuration.
bevigil: []
binaryedge: []
bufferover: []
builtwith: []
c99: []
censys: []
certspotter: []
chaos: []
chinaz: []
dnsdb: []
dnsrepo: []
facebook: []
fofa: []
fullhunt: []
github: []
hunter: []
intelx: []
leakix: []
netlas: []
passivetotal: []
quake: []
redhuntlabs: []
robtex: []
securitytrails: []
shodan: []
threatbook: []
virustotal: []
whoisxmlapi: []
zoomeyeapi: []
'''
sensitiveList = [{'name': 'JSON Web Token',
'regular': '(eyJ[A-Za-z0-9_-]{10,}\\.[A-Za-z0-9._-]{10,}|eyJ[A-Za-z0-9_\\/+-]{10,}\\.[A-Za-z0-9._\\/+-]{10,})',
'color': 'green', 'state': True}, {'name': 'Swagger UI',
'regular': '((swagger-ui.html)|(\\"swagger\\":)|(Swagger UI)|(swaggerUi)|(swaggerVersion))',
'color': 'red', 'state': True},
{'name': 'Ueditor', 'regular': '(ueditor\\.(config|all)\\.js)', 'color': 'green', 'state': True},
{'name': 'Java Deserialization', 'regular': '(javax\\.faces\\.ViewState)', 'color': 'yellow',
'state': True},
{'name': 'URL As A Value', 'regular': '(=(https?)(://|%3a%2f%2f))', 'color': 'cyan', 'state': True},
{'name': 'Upload Form', 'regular': '(type=\\"file\\")', 'color': 'yellow', 'state': True},
{'name': 'Email',
'regular': '(([a-z0-9][_|\\.])*[a-z0-9]+@([a-z0-9][-|_|\\.])*[a-z0-9]+\\.((?!js|css|jpg|jpeg|png|ico)[a-z]{2,}))',
'color': 'yellow', 'state': True}, {'name': 'Chinese IDCard',
'regular': "'[^0-9]((\\d{8}(0\\d|10|11|12)([0-2]\\d|30|31)\\d{3}$)|(\\d{6}(18|19|20)\\d{2}(0[1-9]|10|11|12)([0-2]\\d|30|31)\\d{3}(\\d|X|x)))[^0-9]'",
'color': 'orange', 'state': True},
{'name': 'Chinese Mobile Number',
'regular': "'[^\\w]((?:(?:\\+|00)86)?1(?:(?:3[\\d])|(?:4[5-79])|(?:5[0-35-9])|(?:6[5-7])|(?:7[0-8])|(?:8[\\d])|(?:9[189]))\\d{8})[^\\w]'",
'color': 'orange', 'state': True}, {'name': 'Internal IP Address',
'regular': "'[^0-9]((127\\.0\\.0\\.1)|(10\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3})|(172\\.((1[6-9])|(2\\d)|(3[01]))\\.\\d{1,3}\\.\\d{1,3})|(192\\.168\\.\\d{1,3}\\.\\d{1,3}))'",
'color': 'cyan', 'state': True}, {'name': 'MAC Address',
'regular': '(^([a-fA-F0-9]{2}(:[a-fA-F0-9]{2}){5})|[^a-zA-Z0-9]([a-fA-F0-9]{2}(:[a-fA-F0-9]{2}){5}))',
'color': 'green',
'state': True},
{'name': 'Chinese Bank Card ID', 'regular': "'[^0-9]([1-9]\\d{12,18})[^0-9]'", 'color': 'orange',
'state': True},
{'name': 'Cloud Key', 'regular': '((accesskeyid)|(accesskeysecret)|(LTAI[a-z0-9]{12,20}))',
'color': 'yellow', 'state': True}, {'name': 'Windows File/Dir Path',
'regular': "'[^\\w](([a-zA-Z]:\\\\(?:\\w+\\\\?)*)|([a-zA-Z]:\\\\(?:\\w+\\\\)*\\w+\\.\\w+))'",
'color': 'green', 'state': True}, {'name': 'Password Field',
'regular': '((|\'|")([p](ass|wd|asswd|assword))(|\'|")(:|=)( |)(\'|")(.*?)(\'|")(|,))',
'color': 'yellow',
'state': True},
{'name': 'Username Field',
'regular': '((|\'|")(([u](ser|name|ame|sername))|(account))(|\'|")(:|=)( |)(\'|")(.*?)(\'|")(|,))',
'color': 'green', 'state': True},
{'name': 'WeCom Key', 'regular': '([c|C]or[p|P]id|[c|C]orp[s|S]ecret)', 'color': 'green',
'state': True},
{'name': 'JDBC Connection', 'regular': '(jdbc:[a-z:]+://[a-z0-9\\.\\-_:;=/@?,&]+)', 'color': 'yellow',
'state': True}, {'name': 'Authorization Header',
'regular': '((basic [a-z0-9=:_\\+\\/-]{5,100})|(bearer [a-z0-9_.=:_\\+\\/-]{5,100}))',
'color': 'yellow', 'state': True},
{'name': 'Github Access Token', 'regular': '([a-z0-9_-]*:[a-z0-9_\\-]+@github\\.com*)',
'color': 'green', 'state': True}, {'name': 'Sensitive Field',
'regular': '((|\'|")([\\w]{0,10})((key)|(secret)|(token)|(config)|(auth)|(access)|(admin))(|\'|")(:|=)( |)(\'|")(.*?)(\'|")(|,))',
'color': 'yellow', 'state': True}, {'name': 'Linkfinder',
'regular': '(?:"|\')(((?:[a-zA-Z]{1,10}://|//)[^"\'/]{1,}\\.[a-zA-Z]{2,}[^"\']{0,})|((?:/|\\.\\./|\\./)[^"\'><,;|*()(%%$^/\\\\\\[\\]][^"\'><,;|()]{1,})|([a-zA-Z0-9_\\-/]{1,}/[a-zA-Z0-9_\\-/]{1,}\\.(?:[a-zA-Z]{1,4}|action)(?:[\\?|#][^"|\']{0,}|))|([a-zA-Z0-9_\\-/]{1,}/[a-zA-Z0-9_\\-/]{3,}(?:[\\?|#][^"|\']{0,}|))|([a-zA-Z0-9_\\-]{1,}\\.(?:\\w)(?:[\\?|#][^"|\']{0,}|)))(?:"|\')',
'color': 'gray',
'state': True},
{'name': 'Source Map', 'regular': '(\\.js\\.map)', 'color': 'null', 'state': True},
{'name': 'HTML Notes', 'regular': '(<!--[\\s\\S]*?-->)', 'color': 'green', 'state': True},
{'name': 'Create Script', 'regular': '(createElement\\(\\"script\\"\\))', 'color': 'green',
'state': True}, {'name': 'URL Schemes',
'regular': '(?![http]|[https])(([-A-Za-z0-9]{1,20})://[-A-Za-z0-9+&@#/%?=~_|!:,.;]+[-A-Za-z0-9+&@#/%=~_|])',
'color': 'yellow', 'state': True},
{'name': 'Potential cryptographic private key', 'regular': '(\\.pem[\'"])', 'color': 'green',
'state': True},
{'name': 'google_api', 'regular': '(AIza[0-9A-Za-z-_]{35})', 'color': 'red', 'state': True},
{'name': 'firebase', 'regular': '(AAAA[A-Za-z0-9_-]{7}:[A-Za-z0-9_-]{140})', 'color': 'red',
'state': True},
{'name': 'authorization_api', 'regular': '(api[key|_key|\\s+]+[a-zA-Z0-9_\\-]{5,100})', 'color': 'red',
'state': True}, {'name': 'Log file', 'regular': '(\\.log[\'"])', 'color': 'green', 'state': True},
{'name': 'Potential cryptographic key bundle', 'regular': '(\\.pkcs12[\'"])', 'color': 'yellow',
'state': True},
{'name': 'Potential cryptographic key bundle', 'regular': '(\\.p12[\'"])', 'color': 'yellow',
'state': True},
{'name': 'Potential cryptographic key bundle', 'regular': '(\\.pfx[\'"])', 'color': 'yellow',
'state': True},
{'name': 'Pidgin OTR private key', 'regular': '(otr\\.private_key)', 'color': 'yellow', 'state': True},
{'name': 'File',
'regular': '(\\.((asc)|(ovpn)|(cscfg)|(rdp)|(mdf)|(sdf)|(sqlite)|(sqlite3)|(bek)|(tpm)|(fve)|(jks)|(psafe3)|(agilekeychain)|(keychain)|(pcap)|(gnucash)|(kwallet)|(tblk)|(dayone)|(exports)|(functions)|(extra)|(proftpdpasswd))[\'"])',
'color': 'yellow', 'state': True},
{'name': 'Ruby On Rails secret token configuration file', 'regular': '(secret_token\\.rb)',
'color': 'yellow', 'state': True},
{'name': 'Carrierwave configuration file', 'regular': '(carrierwave\\.rb)', 'color': 'yellow',
'state': True},
{'name': 'Potential Ruby On Rails database configuration file', 'regular': '(database\\.yml)',
'color': 'yellow', 'state': True},
{'name': 'OmniAuth configuration file', 'regular': '(omniauth\\.rb)', 'color': 'yellow',
'state': True},
{'name': 'Django configuration file', 'regular': '(settings\\.py)', 'color': 'yellow', 'state': True},
{'name': 'Jenkins publish over SSH plugin file',
'regular': '(jenkins.plugins.publish_over_ssh\\.BapSshPublisherPlugin.xml)', 'color': 'yellow',
'state': True},
{'name': 'Potential Jenkins credentials file', 'regular': '(credentials\\.xml)', 'color': 'yellow',
'state': True},
{'name': 'Potential MediaWiki configuration file', 'regular': 'LocalSettings\\.php', 'color': 'yellow',
'state': True},
{'name': 'Sequel Pro MySQL database manager bookmark file', 'regular': '(Favorites\\.plist)',
'color': 'yellow', 'state': True},
{'name': 'Little Snitch firewall configuration file', 'regular': '(configuration\\.user\\.xpl)',
'color': 'yellow', 'state': True},
{'name': 'Potential jrnl journal file', 'regular': '(journal\\.txt)', 'color': 'yellow',
'state': True},
{'name': 'Chef Knife configuration file', 'regular': '(knife\\.rb)', 'color': 'yellow', 'state': True},
{'name': 'Robomongo MongoDB manager configuration file', 'regular': '(robomongo\\.json)',
'color': 'yellow', 'state': True},
{'name': 'FileZilla FTP configuration file', 'regular': '(filezilla\\.xml)', 'color': 'yellow',
'state': True},
{'name': 'FileZilla FTP recent servers file', 'regular': '(recentservers\\.xml)', 'color': 'yellow',
'state': True},
{'name': 'Ventrilo server configuration file', 'regular': '(ventrilo_srv\\.ini)', 'color': 'yellow',
'state': True},
{'name': 'Terraform variable config file', 'regular': '(terraform\\.tfvars)', 'color': 'yellow',
'state': True}, {'name': 'Private SSH key', 'regular': '(.*_rsa)', 'color': 'yellow', 'state': True},
{'name': 'Private SSH key', 'regular': '(.*_dsa)', 'color': 'yellow', 'state': True},
{'name': 'Private SSH key', 'regular': '(.*_ed25519)', 'color': 'yellow', 'state': True},
{'name': 'Private SSH key', 'regular': '(.*_ecdsa)', 'color': 'yellow', 'state': True},
{'name': 'SSH configuration file', 'regular': '(\\.ssh_config)', 'color': 'yellow', 'state': True},
{'name': 'Shell command history file', 'regular': '(\\.?(bash_|zsh_|sh_|z)?history)',
'color': 'yellow', 'state': True},
{'name': 'MySQL client command history file', 'regular': '(.?mysql_history)', 'color': 'yellow',
'state': True},
{'name': 'PostgreSQL client command history file', 'regular': '(\\.?psql_history)', 'color': 'yellow',
'state': True},
{'name': 'PostgreSQL password file', 'regular': '(\\.?pgpass)', 'color': 'yellow', 'state': True},
{'name': 'Ruby IRB console history file', 'regular': '(\\.?irb_history)', 'color': 'yellow',
'state': True},
{'name': 'Pidgin chat client account configuration file', 'regular': '(\\.?purple/accounts\\\\.xml)',
'color': 'yellow', 'state': True}, {'name': 'DBeaver SQL database manager configuration file',
'regular': '(\\.?dbeaver-data-sources.xml)', 'color': 'yellow',
'state': True},
{'name': 'Mutt e-mail client configuration file', 'regular': '(\\.?muttrc)', 'color': 'yellow',
'state': True},
{'name': 'S3cmd configuration file', 'regular': '(\\.?s3cfg)', 'color': 'yellow', 'state': True},
{'name': 'AWS CLI credentials file', 'regular': '(\\.?aws/credentials)', 'color': 'yellow',
'state': True},
{'name': 'SFTP connection configuration file', 'regular': '(sftp-config(\\.json)?)', 'color': 'yellow',
'state': True},
{'name': 'T command-line Twitter client configuration file', 'regular': '(\\.?trc)', 'color': 'yellow',
'state': True},
{'name': 'Shell configuration file', 'regular': '(\\.?(bash|zsh|csh)rc)', 'color': 'yellow',
'state': True}, {'name': 'Shell profile configuration file', 'regular': '(\\.?(bash_|zsh_)?profile)',
'color': 'yellow', 'state': True},
{'name': 'Shell command alias configuration file', 'regular': '(\\.?(bash_|zsh_)?aliases)',
'color': 'yellow', 'state': True},
{'name': 'PHP configuration file', 'regular': '(config(\\.inc)?\\.php)', 'color': 'yellow',
'state': True},
{'name': 'GNOME Keyring database file', 'regular': '(key(store|ring))', 'color': 'yellow',
'state': True},
{'name': 'KeePass password manager database file', 'regular': '(kdbx?)', 'color': 'yellow',
'state': True},
{'name': 'SQL dump file', 'regular': '(sql(dump)?)', 'color': 'yellow', 'state': True},
{'name': 'Apache htpasswd file', 'regular': '(\\.?htpasswd)', 'color': 'yellow', 'state': True},
{'name': 'Configuration file for auto-login process', 'regular': '((\\.|_)?netrc)', 'color': 'yellow',
'state': True},
{'name': 'Rubygems credentials file', 'regular': '(\\.?gem/credentials)', 'color': 'yellow',
'state': True},
{'name': 'Tugboat DigitalOcean management tool configuration', 'regular': '(\\.?tugboat)',
'color': 'yellow', 'state': True},
{'name': 'DigitalOcean doctl command-line client configuration file', 'regular': '(doctl/config.yaml)',
'color': 'yellow', 'state': True},
{'name': 'git-credential-store helper credentials file', 'regular': '(\\.?git-credentials)',
'color': 'yellow', 'state': True},
{'name': 'GitHub Hub command-line client configuration file', 'regular': '(config/hub)',
'color': 'yellow', 'state': True},
{'name': 'Git configuration file', 'regular': '(\\.?gitconfig)', 'color': 'yellow', 'state': True},
{'name': 'Chef private key', 'regular': '(\\.?chef/(.*)\\\\.pem)', 'color': 'yellow', 'state': True},
{'name': 'Potential Linux shadow file', 'regular': '(etc/shadow)', 'color': 'yellow', 'state': True},
{'name': 'Potential Linux passwd file', 'regular': '(etc/passwd)', 'color': 'yellow', 'state': True},
{'name': 'Docker configuration file', 'regular': '(\\.?dockercfg)', 'color': 'yellow', 'state': True},
{'name': 'NPM configuration file', 'regular': '(\\.?npmrc)', 'color': 'yellow', 'state': True},
{'name': 'Environment configuration file', 'regular': '(\\.?env)', 'color': 'yellow', 'state': True},
{'name': 'AWS Access Key ID Value',
'regular': '((A3T[A-Z0-9]|AKIA|AGPA|AROA|AIPA|ANPA|ANVA|ASIA)[A-Z0-9]{16})', 'color': 'red',
'state': True}, {'name': 'ak sk',
'regular': '(((access_key|access_token|admin_pass|admin_user|algolia_admin_key|algolia_api_key|alias_pass|alicloud_access_key|amazon_secret_access_key|amazonaws|ansible_vault_password|aos_key|api_key|api_key_secret|api_key_sid|api_secret|api.googlemaps AIza|apidocs|apikey|apiSecret|app_debug|app_id|app_key|app_log_level|app_secret|appkey|appkeysecret|application_key|appsecret|appspot|auth_token|authorizationToken|authsecret|aws_access|aws_access_key_id|aws_bucket|aws_key|aws_secret|aws_secret_key|aws_token|AWSSecretKey|b2_app_key|bashrc password|bintray_apikey|bintray_gpg_password|bintray_key|bintraykey|bluemix_api_key|bluemix_pass|browserstack_access_key|bucket_password|bucketeer_aws_access_key_id|bucketeer_aws_secret_access_key|built_branch_deploy_key|bx_password|cache_driver|cache_s3_secret_key|cattle_access_key|cattle_secret_key|certificate_password|ci_deploy_password|client_secret|client_zpk_secret_key|clojars_password|cloud_api_key|cloud_watch_aws_access_key|cloudant_password|cloudflare_api_key|cloudflare_auth_key|cloudinary_api_secret|cloudinary_name|codecov_token|config|conn.login|connectionstring|consumer_key|consumer_secret|credentials|cypress_record_key|database_password|database_schema_test|datadog_api_key|datadog_app_key|db_password|db_server|db_username|dbpasswd|dbpassword|dbuser|deploy_password|digitalocean_ssh_key_body|digitalocean_ssh_key_ids|docker_hub_password|docker_key|docker_pass|docker_passwd|docker_password|dockerhub_password|dockerhubpassword|dot-files|dotfiles|droplet_travis_password|dynamoaccesskeyid|dynamosecretaccesskey|elastica_host|elastica_port|elasticsearch_password|encryption_key|encryption_password|env.heroku_api_key|env.sonatype_password|eureka.awssecretkey)[a-z0-9_ .\\-,]{0,25})(=|>|:=|\\|\\|:|<=|=>|:).{0,5}[\'\\"]([0-9a-zA-Z\\-_=]{8,64}))\\b`',
'color': 'red', 'state': True}, {'name': 'AWS Access Key ID',
'regular': '(("|\'|`)?((?i)aws)?_?((?i)access)_?((?i)key)?_?((?i)id)?("|\'|`)?\\s{0,50}(:|=>|=)\\s{0,50}("|\'|`)?(A3T[A-Z0-9]|AKIA|AGPA|AIDA|AROA|AIPA|ANPA|ANVA|ASIA)[A-Z0-9]{16}("|\'|`)?)',
'color': 'red', 'state': True},
{'name': 'AWS Account ID',
'regular': '(("|\'|`)?((?i)aws)?_?((?i)account)_?((?i)id)?("|\'|`)?\\s{0,50}(:|=>|=)\\s{0,50}("|\'|`)?[0-9]{4}-?[0-9]{4}-?[0-9]{4}("|\'|`)?)',
'color': 'red', 'state': True},
{'name': 'Artifactory API Token', 'regular': '((?:\\s|=|:|"|^)AKC[a-zA-Z0-9]{10,})', 'color': 'red',
'state': True},
{'name': 'Artifactory Password', 'regular': '((?:\\s|=|:|"|^)AP[\\dABCDEF][a-zA-Z0-9]{8,})',
'color': 'red', 'state': True},
{'name': 'Authorization Basic', 'regular': '(basic [a-zA-Z0-9_\\\\-:\\\\.=]+)', 'color': 'red',
'state': True},
{'name': 'Authorization Authorization Bearer', 'regular': '(bearer [a-zA-Z0-9_\\\\-\\\\.=]+)',
'color': 'red', 'state': True}, {'name': 'AWS Client ID',
'regular': '((A3T[A-Z0-9]|AKIA|AGPA|AIDA|AROA|AIPA|ANPA|ANVA|ASIA)[A-Z0-9]{16})',
'color': 'red', 'state': True}, {'name': 'AWS MWS Key',
'regular': '(amzn\\.mws\\.[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})',
'color': 'red', 'state': True},
{'name': 'AWS MWS Key',
'regular': '(amzn\\.mws\\.[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})',
'color': 'red', 'state': True},
{'name': 'AWS Secret Key', 'regular': '((?i)aws(.{0,20})?(?-i)[\'\\"][0-9a-zA-Z\\/+]{40}[\'"])',
'color': 'red', 'state': True}, {'name': 'Base32',
'regular': '((?:[A-Z2-7]{8})*(?:[A-Z2-7]{2}={6}|[A-Z2-7]{4}={4}|[A-Z2-7]{5}={3}|[A-Z2-7]{7}=)?)',
'color': 'null', 'state': True},
{'name': 'Base64', 'regular': '((eyJ|YTo|Tzo|PD[89]|aHR0cHM6L|aHR0cDo|rO0)[a-zA-Z0-9+/]+={0,2})',
'color': 'null', 'state': True}, {'name': 'Basic Auth Credentials',
'regular': '((?<=:\\/\\/)[a-zA-Z0-9]+:[a-zA-Z0-9]+@[a-zA-Z0-9]+\\.[a-zA-Z]+)',
'color': 'red', 'state': True},
{'name': 'Cloudinary Basic Auth', 'regular': '(cloudinary:\\/\\/[0-9]{15}:[0-9A-Za-z]+@[a-z]+)',
'color': 'red', 'state': True},
{'name': 'Facebook Access Token', 'regular': '(EAACEdEose0cBA[0-9A-Za-z]+)', 'color': 'red',
'state': True},
{'name': 'Facebook Client ID', 'regular': '((?i)(facebook|fb)(.{0,20})?[\'\\"][0-9]{13,17})',
'color': 'red', 'state': True}, {'name': 'Facebook Oauth',
'regular': '([f|F][a|A][c|C][e|E][b|B][o|O][o|O][k|K].*[\'|\\"][0-9a-f]{32}[\'|\\"])',
'color': 'red', 'state': True},
{'name': 'Facebook Secret Key', 'regular': '((?i)(facebook|fb)(.{0,20})?(?-i)[\'\\"][0-9a-f]{32})',
'color': 'red', 'state': True},
{'name': 'Github', 'regular': '((?i)github(.{0,20})?(?-i)[\'\\"][0-9a-zA-Z]{35,40})', 'color': 'red',
'state': True},
{'name': 'Google API Key', 'regular': '(AIza[0-9A-Za-z\\\\-_]{35})', 'color': 'red', 'state': True},
{'name': 'Google Cloud Platform API Key',
'regular': '((?i)(google|gcp|youtube|drive|yt)(.{0,20})?[\'\\"][AIza[0-9a-z\\\\-_]{35}][\'\\"])',
'color': 'red', 'state': True},
{'name': 'Google Oauth', 'regular': '([0-9]+-[0-9A-Za-z_]{32}\\.apps\\.googleusercontent\\.com)',
'color': 'red', 'state': True}, {'name': 'Heroku API Key',
'regular': '([h|H][e|E][r|R][o|O][k|K][u|U].{0,30}[0-9A-F]{8}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{4}-[0-9A-F]{12})',
'color': 'red', 'state': True},
{'name': 'LinkedIn Secret Key', 'regular': '((?i)linkedin(.{0,20})?[\'\\"][0-9a-z]{16}[\'\\"])',
'color': 'red', 'state': True},
{'name': 'Mailchamp API Key', 'regular': '([0-9a-f]{32}-us[0-9]{1,2})', 'color': 'red', 'state': True},
{'name': 'Mailgun API Key', 'regular': '(key-[0-9a-zA-Z]{32})', 'color': 'red', 'state': True},
{'name': 'Picatic API Key', 'regular': '(sk_live_[0-9a-z]{32})', 'color': 'red', 'state': True},
{'name': 'Slack Token', 'regular': '(xox[baprs]-([0-9a-zA-Z]{10,48})?)', 'color': 'red',
'state': True}, {'name': 'Slack Webhook',
'regular': '(https://hooks.slack.com/services/T[a-zA-Z0-9_]{8}/B[a-zA-Z0-9_]{8}/[a-zA-Z0-9_]{24})',
'color': 'red', 'state': True},
{'name': 'Stripe API Key', 'regular': '((?:r|s)k_live_[0-9a-zA-Z]{24})', 'color': 'red',
'state': True},
{'name': 'Square Access Token', 'regular': '(sqOatp-[0-9A-Za-z\\\\-_]{22})', 'color': 'red',
'state': True},
{'name': 'Square Oauth Secret', 'regular': '(sq0csp-[ 0-9A-Za-z\\\\-_]{43})', 'color': 'red',
'state': True},
{'name': 'Twilio API Key', 'regular': '(SK[0-9a-fA-F]{32})', 'color': 'red', 'state': True},
{'name': 'Twitter Oauth',
'regular': '([t|T][w|W][i|I][t|T][t|T][e|E][r|R].{0,30}[\'\\"\\\\s][0-9a-zA-Z]{35,44}[\'\\"\\\\s])',
'color': 'red', 'state': True},
{'name': 'Twitter Secret Key', 'regular': '(?i)twitter(.{0,20})?[\'\\"][0-9a-z]{35,44}',
'color': 'red', 'state': True},
{'name': 'google_captcha', 'regular': '(6L[0-9A-Za-z-_]{38}|^6[0-9a-zA-Z_-]{39})', 'color': 'red',
'state': True},
{'name': 'google_oauth', 'regular': '(ya29\\.[0-9A-Za-z\\-_]+)', 'color': 'red', 'state': True},
{'name': 'amazon_aws_access_key_id', 'regular': '(A[SK]IA[0-9A-Z]{16})', 'color': 'red',
'state': True},
{'name': 'amazon_aws_url', 'regular': 's3\\.amazonaws.com[/]+|[a-zA-Z0-9_-]*\\.s3\\.amazonaws.com',
'color': 'red', 'state': True},
{'name': 'authorization_api', 'regular': '(api[key|\\s*]+[a-zA-Z0-9_\\-]+)', 'color': 'red',
'state': True},
{'name': 'twilio_account_sid', 'regular': '(AC[a-zA-Z0-9_\\-]{32})', 'color': 'red', 'state': True},
{'name': 'twilio_app_sid', 'regular': '(AP[a-zA-Z0-9_\\-]{32})', 'color': 'red', 'state': True},
{'name': 'paypal_braintree_access_token',
'regular': '(access_token\\$production\\$[0-9a-z]{16}\\$[0-9a-f]{32})', 'color': 'red',
'state': True}, {'name': 'square_oauth_secret',
'regular': '(sq0csp-[ 0-9A-Za-z\\-_]{43}|sq0[a-z]{3}-[0-9A-Za-z\\-_]{22,43})',
'color': 'red', 'state': True},
{'name': 'square_access_token', 'regular': '(sqOatp-[0-9A-Za-z\\-_]{22}|EAAA[a-zA-Z0-9]{60})',
'color': 'red', 'state': True},
{'name': 'rsa_private_key', 'regular': '(-----BEGIN RSA PRIVATE KEY-----)', 'color': 'red',
'state': True},
{'name': 'ssh_dsa_private_key', 'regular': '(-----BEGIN DSA PRIVATE KEY-----)', 'color': 'red',
'state': True},
{'name': 'ssh_dc_private_key', 'regular': '(-----BEGIN EC PRIVATE KEY-----)', 'color': 'red',
'state': True},
{'name': 'pgp_private_block', 'regular': '(-----BEGIN PGP PRIVATE KEY BLOCK-----)', 'color': 'red',
'state': True},
{'name': 'json_web_token', 'regular': '(ey[A-Za-z0-9-_=]+\\.[A-Za-z0-9-_=]+\\.?[A-Za-z0-9-_.+/=]*)',
'color': 'red', 'state': True},
{'name': 'Google Cloud', 'regular': '(GOOG[\\w\\W]{10,30})', 'color': 'red', 'state': True},
{'name': 'Microsoft Azure', 'regular': '(AZ[A-Za-z0-9]{34,40})', 'color': 'red', 'state': True},
                  {'name': 'Tencent Cloud', 'regular': '(AKID[A-Za-z0-9]{13,20})', 'color': 'red', 'state': True},
                  {'name': 'Amazon Cloud', 'regular': '(AKIA[A-Za-z0-9]{16})', 'color': 'red', 'state': True},
                  {'name': 'IBM Cloud', 'regular': '(IBM[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
                  {'name': 'Oracle Cloud', 'regular': '(OCID[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
                  {'name': 'Alibaba Cloud', 'regular': '(LTAI[A-Za-z0-9]{12,20})', 'color': 'red', 'state': True},
                  {'name': 'Huawei Cloud', 'regular': '(AK[\\w\\W]{10,62})', 'color': 'red', 'state': True},
                  {'name': 'Baidu Cloud', 'regular': '(AK[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
                  {'name': 'JD Cloud', 'regular': '(AK[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
                  {'name': 'UCloud', 'regular': '(UC[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
                  {'name': 'QingCloud', 'regular': '(QY[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
                  {'name': 'Kingsoft Cloud', 'regular': '(KS3[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
                  {'name': 'China Unicom Cloud', 'regular': '(LTC[A-Za-z0-9]{10,60})', 'color': 'red', 'state': True},
                  {'name': 'China Mobile Cloud', 'regular': '(YD[A-Za-z0-9]{10,60})', 'color': 'red', 'state': True},
                  {'name': 'China Telecom Cloud', 'regular': '(CTC[A-Za-z0-9]{10,60})', 'color': 'red', 'state': True},
                  {'name': 'YiYunTong', 'regular': '(YYT[A-Za-z0-9]{10,60})', 'color': 'red', 'state': True},
                  {'name': 'Yonyou Cloud', 'regular': '(YY[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
                  {'name': 'GBase Cloud', 'regular': '(CI[A-Za-z0-9]{10,40})', 'color': 'red', 'state': True},
{'name': 'G-Core Labs', 'regular': '(gcore[A-Za-z0-9]{10,30})', 'color': 'red', 'state': True},
{'name': 'MailChimp API Key', 'regular': '([0-9a-f]{32}-us[0-9]{12})', 'color': 'red', 'state': True},
{'name': 'Outlook team', 'regular': '((https://outlook\\.office.com/webhook/[0-9a-f-]{36}@))',
'color': 'red', 'state': True},
{'name': 'Sauce Token', 'regular': '(?i)sauce.{0,50}("|\'|`)?[0-9a-f-]{36}("|\'|`)?', 'color': 'red',
'state': True},
{'name': 'SonarQube Docs API Key', 'regular': '((?i)sonar.{0,50}("|\'|`)?[0-9a-f]{40}("|\'|`)?)',
'color': 'red', 'state': True},
{'name': 'HockeyApp', 'regular': '(?i)hockey.{0,50}("|\'|`)?[0-9a-f]{32}("|\'|`)?', 'color': 'red',
'state': True}, {'name': 'Username and password in URI',
'regular': '(([\\w+]{1,24})(://)([^$<]{1})([^\\s";]{1,}):([^$<]{1})([^\\s";/]{1,})@[-a-zA-Z0-9@:%._\\\\+~#=]{1,256}\\.[a-zA-Z0-9()]{1,24}([^\\s]+))',
'color': 'red', 'state': True},
{'name': 'NuGet API Key', 'regular': '(oy2[a-z0-9]{43})', 'color': 'red', 'state': True},
{'name': 'StackHawk API Key', 'regular': '(hawk\\.[0-9A-Za-z\\-_]{20}\\.[0-9A-Za-z\\-_]{20})',
'color': 'red', 'state': True},
{'name': 'Heroku config file', 'regular': '(heroku\\.json)', 'color': 'yellow', 'state': True},
{'name': 'jwt_token',
'regular': 'eyJ[A-Za-z0-9_\\/+-]{10,}={0,2}\\.[A-Za-z0-9_\\/+\\-]{15,}={0,2}\\\\.[A-Za-z0-9_\\/+\\-]{10,}={0,2}',
'color': 'yellow', 'state': True}, {'name': 'INFO-KEY',
'regular': '(access_key|access_token|admin_pass|admin_user|algolia_admin_key|algolia_api_key|alias_pass|alicloud_access_key|amazon_secret_access_key|amazonaws|ansible_vault_password|aos_key|api_key|api_key_secret|api_key_sid|api_secret|api.googlemaps AIza|apidocs|apikey|apiSecret|app_debug|app_id|app_key|app_log_level|app_secret|appkey|appkeysecret|application_key|appsecret|appspot|auth_token|authorizationToken|authsecret|aws_access|aws_access_key_id|aws_bucket|aws_key|aws_secret|aws_secret_key|aws_token|AWSSecretKey|b2_app_key|bashrc password|bintray_apikey|bintray_gpg_password|bintray_key|bintraykey|bluemix_api_key|bluemix_pass|browserstack_access_key|bucket_password|bucketeer_aws_access_key_id|bucketeer_aws_secret_access_key|built_branch_deploy_key|bx_password|cache_driver|cache_s3_secret_key|cattle_access_key|cattle_secret_key|certificate_password|ci_deploy_password|client_secret|client_zpk_secret_key|clojars_password|cloud_api_key|cloud_watch_aws_access_key|cloudant_password|cloudflare_api_key|cloudflare_auth_key|cloudinary_api_secret|cloudinary_name|codecov_token|config|conn.login|connectionstring|consumer_key|consumer_secret|credentials|cypress_record_key|database_password|database_schema_test|datadog_api_key|datadog_app_key|db_password|db_server|db_username|dbpasswd|dbpassword|dbuser|deploy_password|digitalocean_ssh_key_body|digitalocean_ssh_key_ids|docker_hub_password|docker_key|docker_pass|docker_passwd|docker_password|dockerhub_password|dockerhubpassword|dot-files|dotfiles|droplet_travis_password|dynamoaccesskeyid|dynamosecretaccesskey|elastica_host|elastica_port|elasticsearch_password|encryption_key|encryption_password|env.heroku_api_key|env.sonatype_password|eureka.awssecretkey)',
'color': 'yellow', 'state': True}]
portDic = [
    {'name': '100 common ports',
'value': '21,22,23,25,53,67,68,80,110,111,139,143,161,389,443,445,465,512,513,514,873,993,995,1080,1000,1352,1433,1521,1723,2049,2181,2375,3306,3389,4848,5000,5001,5432,5900,5632,5900,5989,6379,6666,7001,7002,8000,8001,8009,8010,8069,8080,8083,8086,8081,8088,8089,8443,8888,9900,9200,9300,9999,10621,11211,27017,27018,66,81,457,1100,1241,1434,1944,2301,3128,4000,4001,4002,4100,5800,5801,5802,6346,6347,30821,1090,1098,1099,4444,11099,47001,47002,10999,7000-7004,8000-8003,9000-9003,9503,7070,7071,45000,45001,8686,9012,50500,11111,4786,5555,5556,8880,8983,8383,4990,8500,6066'},
{'name': 'nmap top 1000',
     'value': '1,3-4,6-7,9,13,17,19-26,30,32-33,37,42-43,49,53,70,79-85,88-90,99-100,106,109-111,113,119,125,135,139,143-144,146,161,163,179,199,211-212,222,254-256,259,264,280,301,306,311,340,366,389,406-407,416-417,425,427,443-445,458,464-465,481,497,500,512-515,524,541,543-545,548,554-555,563,587,593,616-617,625,631,636,646,648,666-668,683,687,691,700,705,711,714,720,722,726,749,765,777,783,787,800-801,808,843,873,880,888,898,900-903,911-912,981,987,990,992-993,995,999-1002,1007,1009-1011,1021-1100,1102,1104-1108,1110-1114,1117,1119,1121-1124,1126,1130-1132,1137-1138,1141,1145,1147-1149,1151-1152,1154,1163-1166,1169,1174-1175,1183,1185-1187,1192,1198-1199,1201,1213,1216-1218,1233-1234,1236,1244,1247-1248,1259,1271-1272,1277,1287,1296,1300-1301,1309-1311,1322,1328,1334,1352,1417,1433-1434,1443,1455,1461,1494,1500-1501,1503,1521,1524,1533,1556,1580,1583,1594,1600,1641,1658,1666,1687-1688,1700,1717-1721,1723,1755,1761,1782-1783,1801,1805,1812,1839-1840,1862-1864,1875,1900,1914,1935,1947,1971-1972,1974,1984,1998-2010,2013,2020-2022,2030,2033-2035,2038,2040-2043,2045-2049,2065,2068,2099-2100,2103,2105-2107,2111,2119,2121,2126,2135,2144,2160-2161,2170,2179,2190-2191,2196,2200,2222,2251,2260,2288,2301,2323,2366,2381-2383,2393-2394,2399,2401,2492,2500,2522,2525,2557,2601-2602,2604-2605,2607-2608,2638,2701-2702,2710,2717-2718,2725,2800,2809,2811,2869,2875,2909-2910,2920,2967-2968,2998,3000-3001,3003,3005-3007,3011,3013,3017,3030-3031,3052,3071,3077,3128,3168,3211,3221,3260-3261,3268-3269,3283,3300-3301,3306,3322-3325,3333,3351,3367,3369-3372,3389-3390,3404,3476,3493,3517,3527,3546,3551,3580,3659,3689-3690,3703,3737,3766,3784,3800-3801,3809,3814,3826-3828,3851,3869,3871,3878,3880,3889,3905,3914,3918,3920,3945,3971,3986,3995,3998,4000-4006,4045,4111,4125-4126,4129,4224,4242,4279,4321,4343,4443-4446,4449,4550,4567,4662,4848,4899-4900,4998,5000-5004,5009,5030,5033,5050-5051,5054,5060-5061,5080,5087,5100-5102,5120,5190,5200,5214,5221-5222,5225-5226,5269,5280,5298,5357,5405,5414,5431-5432,5440,5500,5510,5544,5550,5555,5560,5566,5631,5633,5666,5678-5679,5718,5730,5800-5802,5810-5811,5815,5822,5825,5850,5859,5862,5877,5900-5904,5906-5907,5910-5911,5915,5922,5925,5950,5952,5959-5963,5987-5989,5998-6007,6009,6025,6059,6100-6101,6106,6112,6123,6129,6156,6346,6389,6502,6510,6543,6547,6565-6567,6580,6646,6666-6669,6689,6692,6699,6779,6788-6789,6792,6839,6881,6901,6969,7000-7002,7004,7007,7019,7025,7070,7100,7103,7106,7200-7201,7402,7435,7443,7496,7512,7625,7627,7676,7741,7777-7778,7800,7911,7920-7921,7937-7938,7999-8002,8007-8011,8021-8022,8031,8042,8045,8080-8090,8093,8099-8100,8180-8181,8192-8194,8200,8222,8254,8290-8292,8300,8333,8383,8400,8402,8443,8500,8600,8649,8651-8652,8654,8701,8800,8873,8888,8899,8994,9000-9003,9009-9011,9040,9050,9071,9080-9081,9090-9091,9099-9103,9110-9111,9200,9207,9220,9290,9415,9418,9485,9500,9502-9503,9535,9575,9593-9595,9618,9666,9876-9878,9898,9900,9917,9929,9943-9944,9968,9998-10004,10009-10010,10012,10024-10025,10082,10180,10215,10243,10566,10616-10617,10621,10626,10628-10629,10778,11110-11111,11967,12000,12174,12265,12345,13456,13722,13782-13783,14000,14238,14441-14442,15000,15002-15004,15660,15742,16000-16001,16012,16016,16018,16080,16113,16992-16993,17877,17988,18040,18101,18988,19101,19283,19315,19350,19780,19801,19842,20000,20005,20031,20221-20222,20828,21571,22939,23502,24444,24800,25734-25735,26214,27000,27352-27353,27355-27356,27715,28201,30000,30718,30951,31038,31337,32768-32785,33354,33899,34571-34573,35500,38292,40193,40911,41511,42510,44176,44442-44443,44501,45100,48080,49152-49161,49163,49165,49167,49175-49176,49400,49999-50003,50006,50300,50389,50500,50636,50800,51103,51493,52673,52822,52848,52869,54045,54328,55055-55056,55555,55600,56737-56738,57294,57797,58080,60020,60443,61532,61900,62078,63331,64623,64680,65000,65129,65389,280,4567,7001,8008,9080'}
]
radConfig = '''exec_path: "" # Path used to launch Chrome
disable_headless: false # Disable headless mode
subdomain: false # Automatically crawl subdomains
leakless: true # Experimental: guards against memory leaks, but may cause hangs
force_sandbox: false # Force-enable the sandbox. When false, the sandbox is on by default but disabled inside containers; when true it is always on, which may break running in Docker.
enable_image: false # Enable image loading
parent_path_detect: false # Enable parent-directory probing
proxy: "" # Proxy configuration
user_agent: "" # Request User-Agent
domain_headers: # Request header configuration: []{domain, map[headerKey]headerValue}
  - domain: '*' # Domains these headers apply to (glob syntax)
    headers: {} # Header map[key]value
max_depth: 10 # Maximum page depth
navigate_timeout_second: 10 # Navigation timeout in seconds
load_timeout_second: 10 # Load timeout in seconds
retry: 0 # Number of retries after a failed page visit
page_analyze_timeout_second: 100 # Page-analysis timeout in seconds
max_interactive: 500 # Maximum interactions per page
max_interactive_depth: 10 # Maximum interaction depth per page
max_page_concurrent: 5 # Maximum concurrent pages (no more than 10)
max_page_visit: 1000 # Total number of pages allowed to be visited
max_page_visit_per_site: 1000 # Maximum pages visited per site
element_filter_strength: 3 # Strength of same-site similar-element filtering, 1-7 (increasing); 0 disables cross-page element filtering
new_task_filter_config: # Decides whether a link should be added to the crawl queue
  hostname_allowed: [] # Allowed hostnames; formats like t.com, *.t.com, 1.1.1.1, 1.1.1.1/24, 1.1-4.1.1-8
  hostname_disallowed: [] # Disallowed hostnames; formats like t.com, *.t.com, 1.1.1.1, 1.1.1.1/24, 1.1-4.1.1-8
  port_allowed: [] # Allowed ports; formats like 80, 80-85
  port_disallowed: [] # Disallowed ports; formats like 80, 80-85
  path_allowed: [] # Allowed paths; formats like test, *test*
  path_disallowed: [] # Disallowed paths; formats like test, *test*
  query_key_allowed: [] # Allowed query keys; formats like test, *test*
  query_key_disallowed: [] # Disallowed query keys; formats like test, *test*
  fragment_allowed: [] # Allowed fragments; formats like test, *test*
  fragment_disallowed: [] # Disallowed fragments; formats like test, *test*
  post_key_allowed: [] # Allowed POST-body keys; formats like test, *test*
  post_key_disallowed: [] # Disallowed POST-body keys; formats like test, *test*
request_send_filter_config: # Decides whether a request should be sent
  hostname_allowed: [] # Allowed hostnames; formats like t.com, *.t.com, 1.1.1.1, 1.1.1.1/24, 1.1-4.1.1-8
  hostname_disallowed: [] # Disallowed hostnames; formats like t.com, *.t.com, 1.1.1.1, 1.1.1.1/24, 1.1-4.1.1-8
  port_allowed: [] # Allowed ports; formats like 80, 80-85
  port_disallowed: [] # Disallowed ports; formats like 80, 80-85
  path_allowed: [] # Allowed paths; formats like test, *test*
  path_disallowed: [] # Disallowed paths; formats like test, *test*
  query_key_allowed: [] # Allowed query keys; formats like test, *test*
  query_key_disallowed: [] # Disallowed query keys; formats like test, *test*
  fragment_allowed: [] # Allowed fragments; formats like test, *test*
  fragment_disallowed: [] # Disallowed fragments; formats like test, *test*
  post_key_allowed: [] # Allowed POST-body keys; formats like test, *test*
  post_key_disallowed: [] # Disallowed POST-body keys; formats like test, *test*
request_output_filter_config: # Decides whether a request should be included in the output
  hostname_allowed: [] # Allowed hostnames; formats like t.com, *.t.com, 1.1.1.1, 1.1.1.1/24, 1.1-4.1.1-8
  hostname_disallowed: [] # Disallowed hostnames; formats like t.com, *.t.com, 1.1.1.1, 1.1.1.1/24, 1.1-4.1.1-8
  port_allowed: [] # Allowed ports; formats like 80, 80-85
  port_disallowed: [] # Disallowed ports; formats like 80, 80-85
  path_allowed: [] # Allowed paths; formats like test, *test*
  path_disallowed: [] # Disallowed paths; formats like test, *test*
  query_key_allowed: [] # Allowed query keys; formats like test, *test*
  query_key_disallowed: [] # Disallowed query keys; formats like test, *test*
  fragment_allowed: [] # Allowed fragments; formats like test, *test*
  fragment_disallowed: [] # Disallowed fragments; formats like test, *test*
  post_key_allowed: [] # Allowed POST-body keys; formats like test, *test*
  post_key_disallowed: [] # Disallowed POST-body keys; formats like test, *test*
entrance_retry: 0 # Entry-point retry count
max_similar_request: 0 # Maximum similar fetch/XHR requests; values <= 0 mean unlimited
'''
def get_fingerprint_data():
    fingerprint = "[]"  # fall back to an empty JSON list so a missing file cannot crash json.loads
    try:
        # Try to open the file and read its contents
        with open(os.path.join(combined_directory, "fingerprint"), "r", encoding="utf-8") as file:
            fingerprint = file.read()
    except FileNotFoundError:
        logger.error("File not found")
    return json.loads(fingerprint)

104
core/redis_handler.py Normal file
View File

@ -0,0 +1,104 @@
# -*- coding:utf-8 -*-
# @name: redis_handler
# @auth: rainy-autumn@outlook.com
# @version:
import asyncio
import json
from loguru import logger
import redis.asyncio as redis
from core.db import *
from core.util import *
import socket
from motor.motor_asyncio import AsyncIOMotorCursor
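# Async generator dependency: yield a Redis connection configured with TCP
# keepalive and dispose of the connection pool once the caller is done.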
async def get_redis_pool():
keep_alive_config = {
'socket_keepalive': True,
'socket_keepalive_options': {
socket.TCP_KEEPIDLE: 60,
socket.TCP_KEEPCNT: 10,
socket.TCP_KEEPINTVL: 10,
}
}
redis_con = await redis.from_url(f"redis://:{REDIS_PASSWORD}@{REDIS_IP}:{REDIS_PORT}", encoding="utf-8", decode_responses=True, **keep_alive_config)
try:
yield redis_con
finally:
await redis_con.close()
await redis_con.connection_pool.disconnect()
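# Push a config-refresh message to a single node's queue, or to every live node
# when name == "all" (nodes with state "3" are skipped as offline).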
async def refresh_config(name, t, content=None):
data = {
"name": name,
"type": t,
}
if content is not None:
data['content'] = content
async for redis_client in get_redis_pool():
name_all = []
if name == "all":
keys = await redis_client.keys("node:*")
for key in keys:
tmp_name = key.split(":")[1]
hash_data = await redis_client.hgetall(key)
if hash_data.get('state') != '3':
name_all.append(tmp_name)
else:
name_all.append(name)
for n in name_all:
await redis_client.rpush(f"refresh_config:{n}", json.dumps(data))
async def subscribe_log_channel():
channel_name = 'logs'
logger.info(f"Subscribed to channel {channel_name}")
while True:
try:
async for redis_client in get_redis_pool():
async with redis_client.pubsub() as pubsub:
await pubsub.psubscribe(channel_name)
while True:
message = await pubsub.get_message(ignore_subscribe_messages=True, timeout=3)
if message is not None:
data = json.loads(message["data"])
logger.info("Received message:" + json.dumps(data))
log_name = data["name"]
if log_name in GET_LOG_NAME:
if log_name not in LOG_INFO:
LOG_INFO[log_name] = []
LOG_INFO[log_name].append(data['log'])
if "Register Success" in data['log']:
await check_node_task(log_name, redis_client)
await redis_client.rpush(f'log:{log_name}', data['log'])
total_logs = await redis_client.llen(f'log:{log_name}')
                            if total_logs > TOTAL_LOGS:
                                # trim to the most recent TOTAL_LOGS entries instead of dropping the whole list
                                await redis_client.ltrim(f'log:{log_name}', -TOTAL_LOGS, -1)
except Exception as e:
logger.error(f"An error occurred: {e}. Reconnecting...")
            await asyncio.sleep(1)  # wait briefly before attempting to reconnect
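# When a node reports "Register Success", re-queue its unfinished tasks.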
async def check_node_task(node_name, redis_conn):
async for mongo_client in get_mongo_db():
query = {
"progress": {"$ne": 100},
"$or": [
{"node": node_name},
{"allNode": True}
]
}
cursor: AsyncIOMotorCursor = mongo_client.task.find(query)
result = await cursor.to_list(length=None)
if len(result) == 0:
return
# Process the result as needed
response_data = []
for doc in result:
doc["id"] = str(doc["_id"])
response_data.append(doc)
for r in response_data:
add_redis_task_data = transform_db_redis(r)
await redis_conn.rpush(f"NodeTask:{node_name}", json.dumps(add_redis_task_data))
return

352
core/util.py Normal file
View File

@ -0,0 +1,352 @@
# -*- coding:utf-8 -*-
# @name: util
# @auth: rainy-autumn@outlook.com
# @version:
import hashlib, random
import re
import string
import sys
from loguru import logger
from core.config import TIMEZONE, APP, SensitiveRuleList, Project_List
from datetime import timezone
from datetime import datetime, timedelta
import json
from urllib.parse import urlparse
def calculate_md5_from_content(content):
md5 = hashlib.md5()
    md5.update(content.encode("utf-8"))  # encode as UTF-8, then feed it into the MD5 digest
return md5.hexdigest()
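# Stand-in evaluator: substitutes a random truth value for a single rule term;
# used only to syntax-check fingerprint expressions via eval().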
def evaluate_expression(express):
random_bool = random.choice([True, False])
return str(random_bool)
def generate_random_string(length):
    # Generate a random string of upper/lowercase letters and digits
characters = string.ascii_letters + string.digits
random_string = ''.join(random.choice(characters) for _ in range(length))
return random_string
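# Recursively walk a rule expression, preserving ()/&&/|| structure while
# replacing each term with a random boolean, to build an eval()-checkable string.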
def parse_expression(express, eval_expression):
parts = []
part = ""
operator_flag = False
parentheses_depth = 0
for i in range(len(express)):
if express[i] == '(':
if i != 0:
if express[i - 1] != '\\':
parentheses_depth += 1
elif express[i] == ')':
if i != 0:
if express[i - 1] != '\\':
parentheses_depth -= 1
if express[i] == '|' and express[i + 1] == '|' and parentheses_depth == 0:
operator_flag = True
if part[0] == '(':
eval_expression += "("
eval_expression = parse_expression(part.strip("(").strip(")"), eval_expression)
eval_expression += ") or "
else:
eval_expression += evaluate_expression(part) + " or "
part = ""
elif express[i] == '&' and express[i + 1] == '&' and parentheses_depth == 0:
operator_flag = True
if part[0] == '(':
eval_expression += "("
eval_expression = parse_expression(part.strip("(").strip(")"), eval_expression)
eval_expression += ") and "
else:
eval_expression += evaluate_expression(part) + " and "
part = ""
else:
ch = ""
if operator_flag:
ch = express[i + 1]
operator_flag = False
else:
ch = express[i]
part += ch.strip()
if part[0] == '(':
eval_expression += "("
eval_expression = parse_expression(part.strip("(").strip(")"), eval_expression)
eval_expression += ")"
else:
eval_expression += evaluate_expression(part)
return eval_expression
def get_now_time():
    # The offset was previously hardcoded to UTC+8; resolving the configured
    # zone name through zoneinfo keeps the output consistent with TIMEZONE.
    from zoneinfo import ZoneInfo
    utc_now = datetime.utcnow().replace(tzinfo=timezone.utc)
    time_now = utc_now.astimezone(ZoneInfo(TIMEZONE))
    formatted_time = time_now.strftime("%Y-%m-%d %H:%M:%S")
    return formatted_time
def read_json_file(file_path):
with open(file_path, encoding='utf-8') as f:
data = json.load(f)
return data
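# Flatten a MongoDB task document into the dict pushed onto each node's
# Redis task queue (NodeTask:<node>).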
def transform_db_redis(request_data):
Subfinder = False
Ksubdomain = False
if "Subfinder" in request_data["subdomainConfig"]:
Subfinder = True
if "Ksubdomain" in request_data["subdomainConfig"]:
Ksubdomain = True
add_redis_task_data = {
"TaskId": request_data["id"],
"SubdomainScan": request_data["subdomainScan"],
"Subfinder": Subfinder,
"Ksubdomain": Ksubdomain,
"UrlScan": request_data["urlScan"],
"Duplicates": request_data["duplicates"],
"SensitiveInfoScan": request_data["sensitiveInfoScan"],
"PageMonitoring": request_data["pageMonitoring"],
"CrawlerScan": request_data["crawlerScan"],
"VulScan": request_data["vulScan"],
"VulList": request_data["vulList"],
"PortScan": request_data["portScan"],
"Ports": request_data["ports"],
"Waybackurl": request_data["waybackurl"],
"DirScan": request_data["dirScan"],
"type": 'scan'
}
return add_redis_task_data
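# Convert an infix fingerprint rule (terms joined by && and ||, grouped with
# parentheses; \( \) \|\| \&\& are escaped literals) into a postfix token list.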


def string_to_postfix(expression):
    # Convert an infix search expression (&&, ||, parentheses, quoted values)
    # into a postfix token list.
    try:
        operands_stack = []
        expression_stack = []
        start_char = 0
        skip_flag = False
        exp_flag = False  # True while scanning inside a quoted value
        for index, char in enumerate(expression):
            if skip_flag:
                skip_flag = False
                continue
            if char == '|' and expression[index + 1] == '|':
                skip_flag = True
                operands_stack.append("||")
                key = expression[start_char:index]
                if key != "":
                    expression_stack.append(key)
                start_char = index + 2
            elif char == '&' and expression[index + 1] == '&':
                skip_flag = True
                operands_stack.append("&&")
                key = expression[start_char:index]
                if key != "":
                    expression_stack.append(key)
                start_char = index + 2
            elif char == '(' and expression[index - 1] != '\\' and not exp_flag:
                start_char = index + 1
                operands_stack.append('(')
            elif char == ')' and expression[index - 1] != '\\' and not exp_flag:
                key = expression[start_char:index]
                if key != "":
                    expression_stack.append(key)
                start_char = index + 1
                # Pop operators back to the matching '('.
                popped_value = operands_stack.pop()
                while popped_value != '(':
                    if popped_value != "":
                        expression_stack.append(popped_value)
                    popped_value = operands_stack.pop()
            elif char == " ":
                continue
            elif char == "\"" and expression[index - 1] != "\\":
                if not exp_flag:
                    exp_flag = True
                else:
                    if index == len(expression) - 1:
                        exp_flag = False
                        continue
                    # Close the quoted value only when it is followed by an
                    # operator or a closing parenthesis.
                    tmp = expression[index:].replace(" ", "")
                    if (tmp.startswith("\"||") or tmp.startswith("\"&&")
                            or (tmp.startswith("\"))") and len(tmp) == 3)
                            or tmp.startswith("\")||") or tmp.startswith("\")&&")
                            or (tmp.startswith("\")") and len(tmp) == 2)
                            or re.findall(r"^\"[)]*(\|\||&&)", tmp)):
                        exp_flag = False
        if start_char != len(expression):
            key = expression[start_char:]
            if key != "":
                expression_stack.append(key)
        while len(operands_stack) != 0:
            expression_stack.append(operands_stack.pop())
        cleaned = []
        for key in expression_stack:
            if key != "" and key != " ":
                # Un-escape \( \) \|\| \&\& back to their literal characters.
                cleaned.append(key.strip()
                               .replace('\\(', '(').replace('\\)', ')')
                               .replace('\\|\\|', '||').replace('\\&\\&', '&&'))
        return cleaned
    except Exception:
        logger.error(f"Postfix expression conversion failed: {expression}")
        return ""


async def search_to_mongodb(expression_raw, keyword):
    # Evaluate a postfix search expression into a MongoDB filter document.
    # `keyword` maps a search key to its underlying field name (or to a list
    # of candidate field names).
    try:
        if expression_raw == "":
            return [{}]
        if len(APP) == 0:
            logger.error("WebFinger cache is empty, please investigate")
        expression = string_to_postfix(expression_raw)
        stack = []
        for expr in expression:
            if expr == "&&":
                right = stack.pop()
                left = stack.pop()
                stack.append({"$and": [left, right]})
            elif expr == "||":
                right = stack.pop()
                left = stack.pop()
                stack.append({"$or": [left, right]})
            elif "!=" in expr:
                key, value = expr.split("!=", 1)
                key = key.strip()
                if key in keyword:
                    value = value.strip("\"")
                    if key == 'statuscode':
                        value = int(value)
                    if key == 'project':
                        if value.lower() in Project_List:
                            value = Project_List[value.lower()]
                    if key == 'app':
                        # Match fingerprint ids whose name contains the value,
                        # then exclude assets hitting any of them.
                        finger_id = []
                        for ap_key in APP:
                            if value.lower() in APP[ap_key].lower():
                                finger_id.append(ap_key)
                        tmp_nor = {"$nor": []}
                        for f_i in finger_id:
                            tmp_nor['$nor'].append({"webfinger": {"$in": [f_i]}})
                        tmp_nor['$nor'].append({"technologies": {"$regex": value, "$options": "i"}})
                        stack.append(tmp_nor)
                        continue  # 'app' is fully handled; skip the generic branches
                    if isinstance(keyword[key], list):
                        tmp_nor = {"$nor": []}
                        for v in keyword[key]:
                            tmp_nor['$nor'].append({v: {"$regex": value, "$options": "i"}})
                        stack.append(tmp_nor)
                    else:
                        tmp_nor = {"$nor": []}
                        if isinstance(value, int):
                            tmp_nor['$nor'].append({keyword[key]: {"$eq": value}})
                        else:
                            tmp_nor['$nor'].append({keyword[key]: {"$regex": value, "$options": "i"}})
                        stack.append(tmp_nor)
            elif "==" in expr:
                key, value = expr.split("==", 1)
                key = key.strip()
                if key in keyword:
                    value = value.strip("\"")
                    if key == 'statuscode':
                        value = int(value)
                    if key == 'project':
                        if value.lower() in Project_List:
                            value = Project_List[value.lower()]
                    if key == 'app':
                        # Exact (case-insensitive) fingerprint name match.
                        finger_id = []
                        for ap_key in APP:
                            if value.lower() == APP[ap_key].lower():
                                finger_id.append(ap_key)
                        tmp_or = {"$or": []}
                        for f_i in finger_id:
                            tmp_or['$or'].append({"webfinger": {"$in": [f_i]}})
                        tmp_or['$or'].append({"technologies": {"$eq": value}})
                        stack.append(tmp_or)
                        continue  # 'app' is fully handled; skip the generic branches
                    if isinstance(keyword[key], list):
                        tmp_or = {"$or": []}
                        for v in keyword[key]:
                            tmp_or['$or'].append({v: {"$eq": value}})
                        stack.append(tmp_or)
                    else:
                        stack.append({keyword[key]: {"$eq": value}})
            elif "=" in expr:
                key, value = expr.split("=", 1)
                key = key.strip()
                if key in keyword:
                    value = value.strip("\"")
                    if key == 'project':
                        if value.lower() in Project_List:
                            value = Project_List[value.lower()]
                    if key == 'app':
                        # Substring (case-insensitive) fingerprint name match.
                        finger_id = []
                        for ap_key in APP:
                            if value.lower() in APP[ap_key].lower():
                                finger_id.append(ap_key)
                        tmp_or = {"$or": []}
                        for f_i in finger_id:
                            tmp_or['$or'].append({"webfinger": {"$in": [f_i]}})
                        tmp_or['$or'].append({"technologies": {"$regex": value, "$options": "i"}})
                        stack.append(tmp_or)
                        continue  # 'app' is fully handled; skip the generic branch
                    if isinstance(keyword[key], list):
                        tmp_or = {"$or": []}
                        for v in keyword[key]:
                            tmp_or['$or'].append({v: {"$regex": value, "$options": "i"}})
                        stack.append(tmp_or)
                    else:
                        stack.append({keyword[key]: {"$regex": value, "$options": "i"}})
        return stack
    except Exception as e:
        logger.error(e)
        return ""


def get_root_domain(url):
    # If the URL has no scheme, prepend a default http:// so urlparse
    # fills in netloc.
    if not url.startswith(('http://', 'https://')):
        url = 'http://' + url
    parsed_url = urlparse(url)
    # If the host is an IP address, return it unchanged.
    try:
        from ipaddress import ip_address
        ip_address(parsed_url.netloc)
        return parsed_url.netloc
    except ValueError:
        pass
    domain_parts = parsed_url.netloc.split('.')
    # Compound (two-label) public suffixes that need three labels for the
    # registrable domain.
    compound_domains = [
        'com.cn', 'net.cn', 'org.cn', 'gov.cn', 'edu.cn', 'ac.cn', 'mil.cn',
        'co.uk', 'org.uk', 'net.uk', 'gov.uk', 'ac.uk', 'sch.uk',
        'co.jp', 'ne.jp', 'or.jp', 'go.jp', 'ac.jp', 'ad.jp',
        'com.de', 'org.de', 'net.de', 'gov.de',
        'com.ca', 'net.ca', 'org.ca', 'gov.ca',
        'com.au', 'net.au', 'org.au', 'gov.au', 'edu.au',
        'com.fr', 'net.fr', 'org.fr', 'gov.fr',
        'com.br', 'com.mx', 'com.ar', 'com.ru',
        'co.in', 'co.za',
        'co.kr', 'com.tw'
    ]
    # Keep three labels for compound suffixes, otherwise two.
    for compound_domain in compound_domains:
        if domain_parts[-2:] == compound_domain.split('.'):
            return '.'.join(domain_parts[-3:])
    return '.'.join(domain_parts[-2:])
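
# Expected behaviour, for reference (not part of the original file):
#
#   get_root_domain("https://a.b.example.com/path")  # -> "example.com"
#   get_root_domain("x.y.gov.cn")                    # -> "y.gov.cn"  (compound suffix)
#   get_root_domain("1.2.3.4")                       # -> "1.2.3.4"   (IPs returned as-is)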

81950  dicts/ScopeSentry.PocList.json  Normal file (diff suppressed: file too large)
14220  dicts/ScopeSentry.project.json  Normal file (diff suppressed: file too large)
9991   dicts/dirDict                   Normal file (diff suppressed: file too large)
20024  dicts/domainDict                Normal file (diff suppressed: file too large)
77926  dicts/fingerprint               Normal file (diff suppressed: file too large)

19  docker-compose.yml  Normal file

@@ -0,0 +1,19 @@
version: '3'
services:
  scope-sentry:
    image: autumn27/scopesentry:latest
    container_name: scope-sentry
    restart: always
    ports:
      - "8082:8082"
    environment:
      TIMEZONE: Asia/Shanghai
      MONGODB_IP: 127.0.0.1
      MONGODB_PORT: 27017
      DATABASE_NAME: ScopeSentry
      DATABASE_USER: root
      DATABASE_PASSWORD: QckSdkg5CKvtxfec
      REDIS_IP: 127.0.0.1
      REDIS_PORT: 6379
      REDIS_PASSWORD: ScopeSentry

25  dockerfile  Normal file

@@ -0,0 +1,25 @@
FROM debian:buster-slim AS git_installer
RUN sed -i 's/deb.debian.org/mirrors.aliyun.com/g' /etc/apt/sources.list && \
    sed -i 's/security.debian.org/mirrors.aliyun.com\/debian-security/g' /etc/apt/sources.list
RUN apt-get update && \
    apt-get install -y git curl && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*

FROM python:3.7-slim
ENV TZ=Asia/Shanghai
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
COPY --from=git_installer /usr/bin/git /usr/bin/git
COPY --from=git_installer /usr/bin/curl /usr/bin/curl
WORKDIR /opt/ScopeSentry/
COPY ./ScopeSentry /opt/ScopeSentry/
RUN pip install -r requirements.txt -i https://pypi.tuna.tsinghua.edu.cn/simple --no-cache-dir
CMD ["python", "main.py"]

BIN  docs/images/ — binary image files added (content not shown):
     asset-cn.png (234 KiB), asset-en.png (230 KiB), craw-cn.png (1.4 MiB),
     craw-en.png (1.4 MiB), dir-cn.png (917 KiB), dir-en.png (933 KiB),
     favicon.ico (84 KiB), index-cn-2.png (152 KiB), index-cn.png (569 KiB),
     index-en.png (162 KiB), login.png (393 KiB), node-cn.png (461 KiB),
     page-cn.png (358 KiB), page-en.png (400 KiB), project-cn.png (1.8 MiB),
     qq.png (260 KiB), sns-cn.png (844 KiB), sns-en.png (891 KiB),
     subt-cn.png (1.1 MiB), subt-en.png (1.1 MiB), task-cn.png (200 KiB),
     task-en.png (220 KiB), task-pg-cn.png (1.0 MiB), task-pg-en.png (866 KiB),
     url-cn.png (1.7 MiB), url-en.png (1.7 MiB), vul-cn.png (356 KiB),
     vul-en.png (431 KiB), wx.png (106 KiB), plus three files whose names were
     not shown in the diff (179 KiB, 962 KiB, 1014 KiB).

165  main.py  Normal file

@@ -0,0 +1,165 @@
import time
from loguru import logger
import uvicorn
from starlette.middleware.base import BaseHTTPMiddleware
from core.config import *

set_config()
from starlette.requests import Request
import asyncio
from urllib.parse import urlparse
from fastapi import FastAPI
from fastapi.responses import FileResponse
from fastapi.responses import JSONResponse
from api import dirscan
from core import db
import json
from fastapi import WebSocket
from starlette.exceptions import HTTPException as StarletteHTTPException
from starlette.websockets import WebSocketDisconnect
from core.redis_handler import subscribe_log_channel

app = FastAPI()

from core.apscheduler_handler import scheduler


@app.on_event("startup")
async def startup_db_client():
    await db.create_database()
    scheduler.start()
    # Ensure the page-monitoring job exists; create it on first startup.
    jobs = scheduler.get_jobs()
    find_page_m = False
    for j in jobs:
        if j.id == 'page_monitoring':
            find_page_m = True
    if not find_page_m:
        from api.scheduled_tasks import get_page_monitoring_time, create_page_monitoring_task
        pat = await get_page_monitoring_time()
        scheduler.add_job(create_page_monitoring_task, 'interval', hours=pat, id='page_monitoring', jobstore='mongo')
    asyncio.create_task(subscribe_log_channel())


@app.exception_handler(StarletteHTTPException)
async def http_exception_handler(request, exc):
    # Normalise plain-string error details into the {code, message} shape
    # the frontend expects.
    if isinstance(exc.detail, str):
        exc.detail = {'code': 500, 'message': exc.detail}
    return JSONResponse(exc.detail, status_code=exc.status_code)


os.chdir(os.path.dirname(os.path.abspath(__file__)))
from api import users, sensitive, dictionary, poc, configuration, fingerprint, node, project, task, asset_info, \
    page_monitoring, vulnerability, SubdoaminTaker, scheduled_tasks, notification, system
app.include_router(users.router, prefix='/api')
app.include_router(sensitive.router, prefix='/api')
app.include_router(dictionary.router, prefix='/api/dictionary')
app.include_router(poc.router, prefix='/api')
app.include_router(configuration.router, prefix='/api/configuration')
app.include_router(fingerprint.router, prefix='/api')
app.include_router(node.router, prefix='/api')
app.include_router(project.router, prefix='/api')
app.include_router(task.router, prefix='/api')
app.include_router(asset_info.router, prefix='/api')
app.include_router(page_monitoring.router, prefix='/api')
app.include_router(vulnerability.router, prefix='/api')
app.include_router(SubdoaminTaker.router, prefix='/api')
app.include_router(scheduled_tasks.router, prefix='/api')
app.include_router(dirscan.router, prefix='/api')
app.include_router(notification.router, prefix='/api')
app.include_router(system.router, prefix='/api')
@app.middleware("http")
async def process_http_requests(request, call_next):
url = str(request.url)
parsed_url = urlparse(url)
# 从路径中获取文件名
file_name = os.path.basename(parsed_url.path).replace('..', '')
# 获取文件后缀名
file_extension = os.path.splitext(file_name)[1]
if '.html' == file_extension or '.css' == file_extension or '.svg' == file_extension or '.png' == file_extension or '.ico' == file_extension:
file_name = file_name.replace('..', '')
file_path = os.path.join("static", "assets", file_name)
return FileResponse(f"{file_path}")
elif '.js' == file_extension:
headers = {
"Content-Type": "application/javascript; charset=UTF-8"
}
file_name = file_name.replace('..', '')
file_path = os.path.join("static", "assets", file_name)
return FileResponse(f"{file_path}", headers=headers)
else:
response = await call_next(request)
return response
@app.get("/")
async def read_root():
return FileResponse("static/index.html")


class MongoDBQueryTimeMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request: Request, call_next):
        start_time = time.time()
        response = await call_next(request)
        end_time = time.time()
        # Measure how long the request took.
        query_time = end_time - start_time
        # Log timing for API routes only.
        route = request.url.path
        if route.startswith("/api"):
            logger.info(f"MongoDB query time: {query_time}s, route: {route}")
        return response


# Toggle to log per-request timing for /api routes.
SQLTIME = False
if SQLTIME:
    app.add_middleware(MongoDBQueryTimeMiddleware)
@app.websocket("/")
async def websocket_endpoint(websocket: WebSocket):
await websocket.accept()
node_name = ""
try:
while True:
data = await websocket.receive_text()
# 解析收到的消息,假设消息格式为 JSON {"node_name": "example_node"}
try:
message = json.loads(data)
node_name = message.get("node_name")
if node_name:
GET_LOG_NAME.append(node_name)
if node_name in LOG_INFO:
while LOG_INFO[node_name]:
log = LOG_INFO[node_name].pop(0)
await websocket.send_text(log)
else:
await websocket.send_text("Invalid message format: missing node_name")
except json.JSONDecodeError:
await websocket.send_text("Invalid JSON format")
except WebSocketDisconnect:
GET_LOG_NAME.remove(node_name)
pass


def banner():
    # Raw string so the backslashes in the ASCII art survive verbatim.
    banner = r'''  _____                        _____            _
 / ____|                      / ____|          | |
| (___   ___ ___  _ __   ___ | (___   ___ _ __ | |_ _ __ _   _
 \___ \ / __/ _ \| '_ \ / _ \ \___ \ / _ \ '_ \| __| '__| | | |
 ____) | (_| (_) | |_) |  __/ ____) |  __/ | | | |_| |  | |_| |
|_____/ \___\___/| .__/ \___||_____/ \___|_| |_|\__|_|   \__, |
                 | |                                      __/ |
                 |_|                                     |___/ '''
    print(banner)


if __name__ == "__main__":
    banner()
    uvicorn.run("main:app", host="0.0.0.0", port=8082, reload=True)

43  requirements.txt  Normal file

@@ -0,0 +1,43 @@
annotated-types==0.5.0
anyio==3.7.1
APScheduler==3.10.4
async-timeout==4.0.3
backports.zoneinfo==0.2.1
certifi==2024.2.2
charset-normalizer==3.3.2
click==8.1.7
colorama==0.4.6
dnspython==2.3.0
exceptiongroup==1.2.0
fastapi==0.103.2
gitdb==4.0.11
GitPython==3.1.43
h11==0.14.0
httptools==0.6.0
idna==3.6
importlib-metadata==6.7.0
loguru==0.7.2
motor==3.3.2
passlib==1.7.4
pydantic==2.5.3
pydantic_core==2.14.6
PyJWT==2.8.0
pymongo==4.6.1
python-dotenv==0.21.1
pytz==2023.3.post1
PyYAML==6.0.1
redis==5.0.3
requests==2.31.0
six==1.16.0
smmap==5.0.1
sniffio==1.3.0
starlette==0.27.0
typing_extensions==4.7.1
tzdata==2024.1
tzlocal==5.1
urllib3==2.0.7
uvicorn==0.22.0
watchfiles==0.20.0
websockets==11.0.3
win32-setctime==1.1.0
zipp==3.15.0

static/assets/ — compiled frontend build output (content elided): eleven
one-line minified 403-*.js chunks and three one-line 404-*.js chunks
(identical Vue error-page modules differing only in their content-hashed
import paths), two diffs suppressed as too long, and two 13 KiB images.
Some files were not shown because too many files have changed in this diff.