j*****e posts: 333 | 1 Heh, the two of them really are a husband-and-wife shop, but everyone knows these papers were mainly the husband's work; the wife, in Mykland's own words, "is good at having a big picture", haha. Mykland is certainly very smart, but his well-known papers are mostly about how to handle microstructure noise, i.e. how to measure volatility, mainly RV; he doesn't seem to have done much on forecasting? Mykland was Zhang Lan's PhD advisor. What I want to find is whose work on the prediction side is best. The few papers I've read all seem to say the VIX gives the best forecasts. As for the OU process you mentioned, Viktor seems to have noted that it can describe some patterns in real data fairly well, but there hasn't been any prediction work on it yet? |
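For readers unfamiliar with the term: the RV mentioned above is just the sum of squared intraday log returns. Below is a minimal sketch of that plain estimator, under my own assumptions (function name and simulated data are illustrative); the noise-robust versions from the Zhang/Mykland papers, e.g. the two-scale estimator, are deliberately not shown.

```python
import numpy as np

def realized_variance(prices):
    """Plain realized variance: sum of squared intraday log returns."""
    log_returns = np.diff(np.log(prices))
    return float(np.sum(log_returns ** 2))

# Toy usage with simulated 1-minute prices for one trading day (390 bars).
rng = np.random.default_rng(0)
prices = 100.0 * np.exp(np.cumsum(0.0005 * rng.standard_normal(390)))
print(realized_variance(prices))
```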
|
p******i posts: 1358 | 2 Mykland's thinking really does jump around too fast
ordinary humans can hardly keep up... |
|
w**********y posts: 1691 | 3 You seem to be in Chicago.. UChicago's Mykland and UIC's Zhang Lan both do related research..
(btw, it's a husband-and-wife shop.. I'm being way too gossipy) |
|
w**********y posts: 1691 | 4 As far as I know, neither AdaBoost nor SVM (including kernel SVM) can do feature selection.
You can only tune a few parameters, such as the lambda in a Gaussian kernel or the number of iterations in AdaBoost.
Of course, you can do dimension reduction first.
In addition, SVM and AdaBoost are classification methods: they only give 0/1 predictions (or the probability of 1), not continuous predictions.
My research is in estimation of realized volatility/integrated volatility, based on high frequency data with microstr... |
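A minimal scikit-learn sketch of the workflow described in this post, under my own assumptions (synthetic data, arbitrary parameter grids): dimension reduction first via PCA, then tuning the RBF kernel width for the SVM and the number of iterations for AdaBoost, with both models ultimately returning P(y = 1) rather than a continuous prediction.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Synthetic classification data standing in for whatever features you have.
X, y = make_classification(n_samples=500, n_features=50, n_informative=10,
                           random_state=0)

# Dimension reduction first, then an RBF-kernel SVM; tune the kernel width
# (called gamma in scikit-learn, playing the role of the "lambda" above).
svm_pipe = Pipeline([
    ("pca", PCA(n_components=10)),
    ("svc", SVC(kernel="rbf", probability=True)),
])
svm_search = GridSearchCV(svm_pipe, {"svc__gamma": [1e-3, 1e-2, 1e-1]}, cv=5)
svm_search.fit(X, y)

# For AdaBoost the main knob is the number of boosting iterations.
ada_search = GridSearchCV(AdaBoostClassifier(random_state=0),
                          {"n_estimators": [50, 100, 200]}, cv=5)
ada_search.fit(X, y)

# Both output class labels / P(y = 1), i.e. classification scores,
# not continuous predictions.
print(svm_search.predict_proba(X[:5])[:, 1])
print(ada_search.predict_proba(X[:5])[:, 1])
```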
|
p********a posts: 5352 | 5 ☆─────────────────────────────────────☆
jinggong718 (Jill) posted on (Wed Aug 20 12:54:19 2008):
I'm a master's student at a third-rate school in North America, graduating in December, and I want to go on to a PhD. Which schools have good PhD programs on the finance side?
☆─────────────────────────────────────☆
doublefish (doublefish) posted on (Wed Aug 20 14:52:26 2008):
Princeton, Stanford, Berkeley
☆─────────────────────────────────────☆
akoug (xuantai) posted on (Thu Aug 21 10:57:24 2008):
Princeton's statistics? Of course, if you can get into that school at all, statistics itself is already not an issue.
☆─────────────────────────────────────☆
netghost (Up to Isomorphism) posted on (Fri Au... |
|