Overview and Core Logic
The quantitative strategy implemented in this article is built on the principle of annualized-volatility matching: the buy/sell boundaries of an ETF grid are adjusted dynamically so that the preset price range stays in step with the historical volatility of the underlying asset. The core idea is to estimate the ETF's annualized volatility over the past N periods with standard statistical methods and use it to size the grid spacing, producing a mechanical, rules-based trading system that adapts to the prevailing market regime. Compared with traditional grids that use a fixed percentage or absolute spacing, this approach reduces directional risk in trending markets while improving capture efficiency in range-bound markets.
✅ Main Functional Modules
- Volatility estimation engine: an improved Yang–Zhang-style estimator applied to non-overlapping period data to reduce estimation bias
- Dynamic range generator: automatically expands or contracts the grid levels as the volatility estimate updates
- Risk-parity allocation mechanism: uses the Kelly criterion to assign capital to each grid level
- Slippage simulation component: a built-in order-book impact model for validating strategy feasibility
Mathematical Modeling and Algorithm Design
📐 Defining the Volatility Metric
Let the price series be $P_t$ and define the log return $r_i = \ln(P_i / P_{i-1})$. The annualized volatility is then computed as:
$$\sigma_{\text{annual}} = \sqrt{\frac{252}{n}\sum_{i=1}^{n}(r_i - \bar{r})^2}$$
where 252 is the number of trading days per year and $n$ is the number of return observations in the sample window. To make the estimate more robust, a GARCH(1,1) model is introduced to capture volatility clustering:
$$\begin{cases} r_t = \mu + \epsilon_t, \quad \epsilon_t \sim N(0,\sigma_t^2) \\ \sigma_t^2 = \omega_t \\ \omega_t = \alpha_0 + \alpha_1 \epsilon_{t-1}^2 + \beta_1 \omega_{t-1} \end{cases}$$
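Before turning to the GARCH-based implementation below, the closed-form estimator can be computed directly. This is a minimal sketch assuming a pandas Series of daily closes; the helper name `simple_annualized_vol` and the synthetic data are illustrative, not part of the strategy code.

```python
import numpy as np
import pandas as pd

def simple_annualized_vol(prices: pd.Series) -> float:
    """Closed-form annualized volatility: sqrt(252/n * sum((r_i - r_bar)^2))."""
    r = np.log(prices / prices.shift(1)).dropna()
    return float(np.sqrt(252.0 / len(r) * np.sum((r - r.mean()) ** 2)))

# Example with a synthetic price path (illustrative only)
rng = np.random.default_rng(0)
closes = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 250))))
print(f"{simple_annualized_vol(closes):.2%}")  # roughly 16% for ~1% daily moves
```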
🔧 Grid Construction Workflow
- Initialization: load roughly three years of daily data, remove outliers, and standardize
- Feature extraction: compute rolling 90-day realized volatility (RV) and realized bipower variation (RBV)
- Range calibration: set the grid spacing to $d = k \cdot \sigma_{\text{current}}$, where $k \in [1.5, 2.5]$ is adjusted dynamically by the Sharpe ratio (a sketch of this step follows the list)
- Boundary constraint: a maximum-drawdown protection line triggers a circuit breaker that halts trading once cumulative losses reach 15%
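The calibration and circuit-breaker steps can be sketched as below. The only constraints taken from the text are $k \in [1.5, 2.5]$ and the 15% drawdown limit; the logistic mapping from Sharpe ratio to $k$ and the helper names are assumptions for illustration.

```python
import numpy as np

def calibrate_grid_spacing(sigma_current: float, sharpe: float,
                           k_min: float = 1.5, k_max: float = 2.5) -> float:
    """Grid spacing d = k * sigma, with k squashed into [k_min, k_max] by the recent Sharpe ratio."""
    # Assumption: a stronger trend (higher Sharpe) widens the spacing toward k_max
    k = k_min + (k_max - k_min) / (1.0 + np.exp(-sharpe))
    return k * sigma_current

def circuit_breaker_tripped(cumulative_pnl: float, capital: float,
                            max_drawdown: float = 0.15) -> bool:
    """Halt trading once cumulative losses reach 15% of capital."""
    return cumulative_pnl <= -max_drawdown * capital

print(round(calibrate_grid_spacing(sigma_current=0.20, sharpe=1.0), 3))    # ~0.446
print(circuit_breaker_tripped(cumulative_pnl=-160_000, capital=1_000_000))  # True
```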
The GARCH-based estimator can be implemented with the `arch` package:

```python
import numpy as np
import pandas as pd
from arch import arch_model

def calculate_annualized_volatility(prices, window=90):
    """Annualized volatility from the latest conditional volatility of a GARCH(1,1) fit."""
    returns = np.log(prices / prices.shift(1)).dropna()
    returns = returns.iloc[-window:] * 100                        # most recent window; scaled to percent for arch
    am = arch_model(returns, vol='Garch', p=1, q=1)               # plain GARCH(1,1), matching the model above
    res = am.fit(disp='off')
    daily_vol = np.asarray(res.conditional_volatility)[-1] / 100  # latest conditional daily volatility, rescaled
    return np.sqrt(252) * daily_vol                               # annualize the daily standard deviation

# Example usage
etf_data = pd.read_csv('ETF_historical.csv', parse_dates=['Date'])
pd.set_option('display.float_format', '{:.4f}'.format)
annual_vol = calculate_annualized_volatility(etf_data['Close'])
print(f"Current annualized volatility: {annual_vol:.2%}")
```
Python Implementation Details
🛠️ Core Class Design
```python
import numpy as np

# Strategy-level configuration (illustrative defaults)
min_trade_size = 100          # smallest notional gap worth sending an order for
num_partitions = 10           # number of grid levels
min_supported_price = 0.01    # hard floor for grid prices
max_supported_price = 1e6     # hard cap for grid prices

class VolatilityMatchedGridStrategy:
    def __init__(self, symbol, initial_capital=1e6):
        self.symbol = symbol                                   # ticker of the traded ETF
        self.initial_capital = initial_capital
        self.position = PositionManager()                      # position controller (provided elsewhere in the project)
        self.risk_budget = RiskAllocator(kelly_fraction=0.8)   # risk-budgeting module (defined below)
        self.volatility_tracker = VolSurfaceMonitor()          # volatility surface monitor (defined below)

    def rebalance(self, current_price):
        """Main rebalancing logic: diff the target grid exposure against current holdings."""
        target_levels = self._determine_grid_levels(current_price)
        existing_pos = self.position.get_exposure()
        orders = []
        for level in target_levels:
            delta = level['amount'] - existing_pos.get(level['price'], 0)
            if abs(delta) > min_trade_size:
                orders.append({
                    'side': 'buy' if delta > 0 else 'sell',
                    'qty': round(abs(delta) / level['price']),   # convert the notional gap into shares at the level price
                    'price': level['price']
                })
        return orders

    def _determine_grid_levels(self, spot_price):
        """Generate grid nodes whose span matches the current volatility estimate."""
        sigma = self.volatility_tracker.get_latest()
        lower_bound = max(spot_price * (1 - 2 * sigma), min_supported_price)
        upper_bound = min(spot_price * (1 + 2 * sigma), max_supported_price)
        return [{'price': p, 'amount': self.risk_budget.allocate(p)}
                for p in np.linspace(lower_bound, upper_bound, num=num_partitions)]
```
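To make the bound arithmetic in `_determine_grid_levels` concrete, here is a worked example with hypothetical inputs (spot price 4.00, volatility estimate 20%, ten levels):

```python
import numpy as np

spot, sigma, levels = 4.00, 0.20, 10
lower = spot * (1 - 2 * sigma)          # 4.00 * 0.6 = 2.40
upper = spot * (1 + 2 * sigma)          # 4.00 * 1.4 = 5.60
grid = np.linspace(lower, upper, num=levels)
print(np.round(grid, 3))
# [2.4   2.756 3.111 3.467 3.822 4.178 4.533 4.889 5.244 5.6  ]
```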
🔍 Key Helper Components
Volatility surface monitor implementation
```python
from collections import deque
import numpy as np

class VolSurfaceMonitor:
    def __init__(self, lookback=252):
        self.history = deque(maxlen=lookback)   # rolling window of closing prices
        self.last_updated = None

    def update(self, new_observation):
        self.history.append(new_observation)
        if len(self.history) == self.history.maxlen:
            self.last_updated = self._compute_metrics()

    def get_latest(self):
        """Latest volatility estimate consumed by the strategy (None until the window is full)."""
        return self.last_updated['parkinson'] if self.last_updated else None

    def _compute_metrics(self):
        """Close-to-close volatility proxies (true Parkinson/Garman-Klass estimators need OHLC data)."""
        prices = np.asarray(self.history, dtype=float)   # deque does not support slicing
        log_rets = np.diff(np.log(prices))
        parkinson = np.std(log_rets)                     # std of close-to-close log returns
        garman_klass = np.sqrt(np.log(prices[-1] / prices[0]) ** 2
                               + 0.5 * np.var(log_rets))  # heuristic drift-plus-dispersion composite
        return {'parkinson': parkinson, 'garman': garman_klass}
```
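A minimal usage sketch for the monitor, assuming the class above; the synthetic price path and seed are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(42)
monitor = VolSurfaceMonitor(lookback=252)

# Feed one year of synthetic daily closes (about 1% daily moves) into the monitor
closes = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, size=252)))
for price in closes:
    monitor.update(price)

print(monitor.last_updated)   # e.g. {'parkinson': ..., 'garman': ...}
print(monitor.get_latest())   # the value the strategy reads before rebalancing
```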
Risk allocator core algorithm
```python
class RiskAllocator:
    def __init__(self, kelly_fraction=0.5):
        self.fraction = kelly_fraction   # fractional-Kelly multiplier
        self.edge = EdgeProphet()        # reinforcement-learning win-rate predictor (external component)

    def allocate(self, price_level):
        """Size a grid level with a capped fractional-Kelly rule."""
        # total_portfolio_value and min_bet_size are module-level configuration provided elsewhere
        edge_estimate = self.edge.predict(price_level)
        optimal_bet = self.fraction * min(edge_estimate, 0.3)   # cap the maximum exposure per level at 30%
        return max(optimal_bet * total_portfolio_value, min_bet_size)
```
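As a worked example of the capped fractional-Kelly sizing, the standalone helper below reproduces the allocation rule with hypothetical numbers (it is not part of the strategy code):

```python
def fractional_kelly_size(edge, fraction=0.8, cap=0.3,
                          portfolio_value=1_000_000, min_bet=1_000):
    """Capital for one grid level: fraction * min(edge, cap) * portfolio, floored at min_bet."""
    return max(fraction * min(edge, cap) * portfolio_value, min_bet)

print(fractional_kelly_size(edge=0.25))   # 0.8 * 0.25 * 1,000,000 = 200,000
print(fractional_kelly_size(edge=0.60))   # capped: 0.8 * 0.30 * 1,000,000 = 240,000
```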
Empirical Testing and Results
📊 Backtest Framework
A vectorized, decoupled architecture is used for multi-dimensional performance attribution:
```python
# backtester and metrics are project-specific modules, not standard packages;
# SimulatedExchange and ConsolidatedMarketData are assumed to come from the same framework
from backtester import BacktestEngine
from metrics import RiskMetricsDashboard

engine = BacktestEngine(
    strategy=VolatilityMatchedGridStrategy('510300'),        # CSI 300 ETF
    brokerage=SimulatedExchange(commission_scheme='tiered'),
    datafeed=ConsolidatedMarketData()
)
results = engine.run(start='2018-01-01', end='2023-12-31')

dashboard = RiskMetricsDashboard(results)
dashboard.plot_drawdown_profile()
```