

All topics - topic: eigenvalue
a*******a
Posts: 33
1
From topic: Computation board - Finding the eigenvalues of a special matrix
A^(-1) =
[   0      0     0  ...   0    0  -1   1
    0      0     0  ...   0   -2   2   0
    0      0     0  ...  -3    3   0   0
   ...
    0   -(n-2)  n-2 ...   0    0   0   0
 -(n-1)   n-1    0  ...   0    0   0   0
    n      0     0  ...   0    0   0   0 ]
(row i, for i < n, has -i in column n-i and i in column n-i+1; the last row has n in column 1)
The eigenvalues for this matrix may be easier to compute
[1 2 3 ... n]
and
[C(n-1,0), -C(n-1,1), ..., (-1)^(n-1) C(n-1,n-1)]
are two eigenvectors.
C(n,k) is the binomial coefficient.
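A minimal Octave/MATLAB sketch, assuming the entry pattern spelled out above (n and the variable names are just illustrative): build A^(-1) and check that both claimed vectors really are eigenvectors.

n = 8;
B = zeros(n);                              % B stands for A^(-1)
for i = 1:n-1
    B(i, n-i)   = -i;                      % entry just left of the anti-diagonal
    B(i, n-i+1) =  i;                      % anti-diagonal entry
end
B(n, 1) = n;                               % last row
v1 = (1:n)';                                                    % [1 2 ... n]'
v2 = ((-1).^(0:n-1) .* arrayfun(@(k) nchoosek(n-1,k), 0:n-1))'; % alternating binomials
disp(((B*v1) ./ v1)');     % a constant vector: v1 is an eigenvector, ratio = eigenvalue
disp(((B*v2) ./ v2)');     % likewise for v2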
w**w
Posts: 100
2
From topic: Computation board - numerical recipe c++
IT++ is a C++ library of mathematical, signal processing, speech processing,
and communications classes and functions.
http://itpp.sourceforge.net/
It handles matrix calculations very simply. For example, M.eig() will give you the
eigenvalues of M.
You can check the website for details.
I used TNT before; not bad. :)

w**d
Posts: 2334
3
From topic: Computation board - Multivariate Normal Distribution
The matrix is symmetric. Its eigenvalues are the variances of the Gaussian RVs
after the transformation.
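A minimal numerical sketch of this statement (the covariance values are illustrative): decorrelate with the eigenvectors and compare the sample variances with the eigenvalues.

C = [4 1.5; 1.5 1];                      % an illustrative symmetric covariance
[V, D] = eig(C);                         % columns of V: eigenvectors of C
X = chol(C, 'lower') * randn(2, 1e5);    % zero-mean samples with covariance ~ C
Y = V' * X;                              % transform into the eigenbasis
disp(var(Y, 0, 2)');                     % sample variances of the transformed RVs...
disp(diag(D)');                          % ...match the eigenvalues of C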
d**s
Posts: 65
4
From topic: Computation board - Computing the square root of a matrix
Besides computing an eigenvalue decomposition, are there other good methods?
Also, can the Jacobi method be applied directly to a complex Hermitian matrix?
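For what it's worth, a minimal sketch comparing the eigendecomposition route with MATLAB's sqrtm (which uses a Schur-based method rather than Jacobi); the test matrix is made up.

M = randn(5) + 1i*randn(5);
A = M*M';                                 % random complex Hermitian PD matrix
[V, D] = eig(A);                          % Hermitian eig: D is real
S1 = V * diag(sqrt(diag(D))) * V';        % square root via eigendecomposition
S2 = sqrtm(A);                            % Schur-based alternative
disp(norm(S1*S1 - A, 'fro'));             % both residuals are at rounding level
disp(norm(S2*S2 - A, 'fro'));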
c******m
Posts: 599
5
From topic: Computation board - upper bound of eigenvalues
Are there recent results on this?
w**d
Posts: 2334
6
From topic: Computation board - upper bound of eigenvalues
what do you mean?
r**u
Posts: 42
7
Please check out the standard linear algebra package LAPACK++.
LAPACK++ (Linear Algebra PACKage in C++) is a software library for numerical
linear algebra that solves systems of linear equations and eigenvalue problems
on high-performance computer architectures.
Old version:
http://math.nist.gov/lapack++/
Latest version:
http://lapackpp.sourceforge.net/

Thanks!
c**z
Posts: 1014
8
From topic: Computation board - Help: 3*3 matrix eigenvalue problem
matlab can do that easily,
use the command eig.
If you don't have matlab,
maybe you can try scilab, which I think is free.
l*****i
Posts: 3929
9
From topic: Computation board - Help: 3*3 matrix eigenvalue problem
It's only 3*3; you can even compute it with pencil and paper if you don't have matlab.
l******e
Posts: 55
10
From topic: Computation board - Help: 3*3 matrix eigenvalue problem
Many thanks to longhei and cqxz!!
Actually I need to call this routine repeatedly to handle a large number of different
3x3 matrices, so I still have to write my own program. But thanks to you both anyway!
Also, I've already got it working, heh.
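In case it helps anyone later, a minimal sketch of the closed-form route for a single 3x3 matrix (the example matrix is made up): the eigenvalues are just the roots of the characteristic cubic, so no general-purpose eigen routine is needed.

A  = [2 1 0; 1 3 1; 0 1 4];               % any 3x3 matrix (example values)
c2 = trace(A);                            % sum of the eigenvalues
c1 = 0.5*(trace(A)^2 - trace(A*A));       % sum of pairwise products (2x2 principal minors)
c0 = det(A);                              % product of the eigenvalues
lam = roots([1 -c2 c1 -c0]);              % roots of l^3 - c2*l^2 + c1*l - c0
disp(sort(lam)');                         % agrees with eig:
disp(sort(eig(A))');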
m********e
Posts: 5088
11
From topic: Computation board - How should this equation be solved?
Femlab will automatically drop the nonlinear part -- I have already tried it. For
eigenvalue problems, Femlab seems to handle only linear problems.
Of course, I'm not very familiar with it either. How do you think it should be solved?
m********e
Posts: 5088
12
A 1,000,000 x 1,000,000 matrix counts as large; I would recommend ARPACK (in Fortran 77),
but a matrix that size probably needs a 64-bit machine.
On a typical 32-bit machine (dual processor, 4 GB RAM) you can solve up to about
300,000 x 300,000.
For somewhat smaller cases, around 100,000-200,000, Matlab*P will do.
Plain Matlab with sparse + eigs handles at most about 10000 x 10000; beyond that, forget it.
So Fortran is still the better choice.
If I guess correctly, you need to solve simultaneous eigenvalue equations.
Good luck.
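A minimal sketch of the sparse route mentioned above, using MATLAB's eigs (the ARPACK wrapper); the matrix and sizes are only for illustration, and 'sm' is the older syntax for "smallest magnitude".

n = 1e5;
e = ones(n, 1);
A = spdiags([-e 2*e -e], -1:1, n, n);     % large sparse test matrix
[V, D] = eigs(A, 6, 'sm');                % only 6 eigenpairs, not the full spectrum
disp(diag(D)');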

s**i
Posts: 381
13
From topic: Computation board - lapack++ and blitz++
In terms of eigenvalue decomposition, which one is best?
I guess still LAPACK?

This is the old version. There has been a new fork since 2000 by other guys:
http://lapackpp.sourceforge.net/html/index.html
h*****c
Posts: 569
14
From topic: Computation board - Matlab's eigs function
I'm using it to solve a generalized eigenvalue problem, Ax = cBx:
[V D] = eigs(A,B)
It always errors out, saying the B matrix I supply must have the same size as A and must
be a symmetric positive semi-definite matrix (or blah blah...).
I've checked: my A and B are the same size, and B is indeed a symmetric positive
semi-definite matrix, but matlab just won't compute it for me...
Has anyone run into this problem?
h*****c
Posts: 569
15
From topic: Computation board - Matlab's eigs function
Right, what you describe is the basic use of eigs.
The problem I'm talking about is the generalized eigenvalue problem, which needs two
matrices to be specified.
Since the matrices I'm debugging with are fairly small, I tried eig(A,B); at least that
gives me something. I don't quite understand why it fails when I use eigs...
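In case it's useful, a sketch of the generalized call (matrices below are made up). As far as I know, eigs by default Cholesky-factorizes B, so B has to be symmetric positive definite, strictly; a merely semidefinite B usually needs the shift-invert form with an explicit sigma.

n = 200;
A = sprandsym(n, 0.05) + n*speye(n);      % illustrative symmetric A
B = sprandsym(n, 0.05);
B = B*B' + 1e-8*speye(n);                 % force B to be safely positive definite
[V, D] = eigs(A, B, 4, 'sm');             % 4 generalized eigenvalues of Ax = cBx
disp(diag(D)');
% If B is genuinely singular/semidefinite, try shift-invert instead:
% [V, D] = eigs(A, B, 4, 1.0);            % factorizes A - 1.0*B rather than B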
f********r
Posts: 50
16
how many eigenvalues do you need?
if only a few are needed, consider using Jacobi-Davidson,
which is the fastest as far as I know.
http://www.math.uu.nl/people/sleijpen/
you can find the source code in matlab,
and you can request the fortran code by writing an email.
l*****s
Posts: 8
17
From topic: Computation board - I also have a diagonalization question
I want to diagonalize a ~2000x2000 matrix and need all of the eigenvalues and
eigenvectors. The matrix is very sparse: each row has fewer than 10 nonzero entries.
Using lapack feels rather slow -- is there a better algorithm?
Thanks!
j*****n
Posts: 1545
18
From topic: Computation board - High-dimensional PCA
I have vectors that are 1200-dimensional, but only 10 data points. How do I do PCA?
The dimension seems far too high.
I saw someone do it this way: instead of x*x' they used x'*x, so the correlation matrix
is only 10x10; they ran PCA on that, then multiplied the resulting 10x1 eigenvectors by
x to get 1200x1 vectors.
Does this make sense? There are at most 10 eigenvalues, which seems strange.
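That trick is standard for the "more dimensions than samples" case. A minimal sketch (sizes as in the question, data made up):

d = 1200;  m = 10;
X = randn(d, m);                          % columns are the m data points
X = X - repmat(mean(X, 2), 1, m);         % center the data
[U, S] = eig(X' * X);                     % small m x m problem instead of d x d
[lam, idx] = sort(diag(S), 'descend');
keep = idx(lam > 1e-10);                  % drop near-zero directions (centering costs one)
W = X * U(:, keep);                       % lift back to d dimensions
W = W * diag(1 ./ sqrt(sum(W.^2, 1)));    % normalized principal directions of X*X'
disp(lam');                               % at most m nonzero eigenvalues, as observed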
r****y
Posts: 1437
19
From topic: Computation board - High-dimensional PCA
hehe, you still have not got it right.

c***r
Posts: 63
20
From topic: Computation board - High-dimensional PCA
sample size too small
at most 10 vectors suffice to span the sample space

k******n
Posts: 35
21
You did not mention why you need the inverse. Without this, others cannot
help you much.
(1) First, I have to ask: do you really need the inverse?
The inverse matrix would be dense. I do not think you have enough memory to
manipulate it. Even if you can compute this inverse, you have nothing to do
with it, since you have no idea how accurate it is. Most of the time, what you need
is the product of the inverse and a vector/matrix, say inv(A)*B. In some
cases, people want eigenvalues or singular values
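A minimal sketch of the inv(A)*B point above (illustrative sparse matrix): solve against a factorization instead of ever forming inv(A).

n = 2000;
A = sprandsym(n, 0.01) + n*speye(n);      % illustrative sparse SPD matrix
b = randn(n, 1);
x = A \ b;                                % sparse direct solve; never forms the dense inverse
R = chol(A);                              % A = R'*R, reusable for many right-hand sides
X = R \ (R' \ randn(n, 5));
disp(norm(A*x - b));                      % small residual confirms the solve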
j**********t
Posts: 12
22
Suppose A is a real symmetric nxn matrix. What is the complexity of finding
V and D such that A = V*D*V'? Is it n^3 or n^2? Thanks.
h**********k
Posts: 168
23
n^3 in matlab. can be worse in some other implementations.
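A quick way to see the n^3 behaviour is to time eig on doubling sizes; the cost grows roughly 8x per doubling (the sizes below are arbitrary).

for n = [500 1000 2000]
    A = randn(n);  A = (A + A')/2;        % random symmetric test matrix
    t = tic;  [V, D] = eig(A);
    fprintf('n = %4d: %.2f s\n', n, toc(t));
end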
d*****l
Posts: 8441
24
If the matrix is special, it might be possible to reduce it, I guess.
x*z
Posts: 381
25
From topic: Computation board - Question about Matlab's numerical precision
I'm using matlab's lmisolver to solve a linear matrix inequality and ran into the
following problem:
I need to solve A(X1,X2) < 0, where X1 and X2 are unknown matrices.
With lmisolver I have already obtained a numerical solution X1_0 and X2_0. To verify
the result, I computed the eigenvalues of the matrix A(X1_0,X2_0) and found that some
are around -3E-6.
My advisor's concern is that these eigenvalues are so small that it might only be
Matlab's rounding error that makes them all satisfy the < 0 requirement, so he wants
me to find other numerical solutions that push the eigenvalues of A further away from
the imaginary axis.
Is my advisor's concern unfounded? Can Matlab really not even reach an accuracy of 1E-6?
Thanks.
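A small synthetic experiment along these lines (all numbers made up): for a symmetric matrix, eig's eigenvalues are reliable down to roughly norm(A)*eps, so whether -3e-6 is trustworthy depends on the scale of A, not on a fixed 1e-6 threshold.

n = 50;
Q = orth(randn(n));
A = Q * diag(-logspace(-6, 2, n)) * Q';   % negative definite by construction
A = (A + A') / 2;                         % symmetrize away rounding asymmetry
lam = eig(A);
tol = n * eps * norm(A);                  % rough rounding-error scale for eig
fprintf('largest eigenvalue %.3g, rounding scale %.3g\n', max(lam), tol);
% Here max(lam) ~ -1e-6 while tol ~ 1e-12, so the sign of -1e-6 is reliable.
% For the LMI itself, solving A(X1,X2) < -delta*I with an explicit margin delta is the
% usual way to push eigenvalues further from the imaginary axis.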
h**********c
Posts: 4120
26
correction:
stiff problem:
x' = \lambda x, when Re(\lambda) -> -\infty.
x can be extended to an n-vector, and \lambda then becomes a matrix D, so one
of the eigenvalues of D has a real part approaching negative infinity.
This is a totally different thing from a singular Jacobian of the ODEs.
Stiff problems can be attacked by stability analysis. Normally a multi-step
implicit mesh scheme should help.
Just for discussion.
t***s
Posts: 4666
27
From topic: Computation board - A question about matlab's eigs
show us your matrix.
my guess is that your matrix has many negative eigenvalues and is almost
singular.
x*****u
Posts: 3419
28
From topic: Computation board - Using Blas/Lapack from C++
This is one I had that works:
/*
to compile :
$ gcc array_lapack.c -llapack -lblas -lm
*/
#include <stdio.h>   /* header names were lost in the original post; stdio.h/stdlib.h assumed */
#include <stdlib.h>
#define size 3 /* dimension of matrix */
struct complex {double re; double im;}; /* a complex number */
main()
{
struct complex A[3][3], b[3], WORK[6], RWORK[6];
struct complex w[3], vl[1][3], vr[1][3];
double AT[2*size*size]; /* for transformed matrix */
int i, j, ok;
char jobvl, jobvr;
int n, lda, ldvl,... [post truncated]
w*******U
Posts: 256
30
From topic: Computation board - determinant again
i reduced the PDEs to an eigenvalue problem, i.e. A(lambda)*x=0. if there
exists a non-trivial solution then det(A)=0. so now i need to numerically
calculate zeros of the determinant P(lambda)=0. does anyone know a good
library or algorithm to do this job? A is a complex matrix and its size is at least 500x500.
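A possible sketch (with a toy 2x2 A(lambda) standing in for the real 500x500 assembly): at that size det itself over/underflows badly, so it is safer to track the smallest singular value of A(lambda), which vanishes exactly where det does. For genuinely complex lambda the same idea works with a search over the real and imaginary parts.

Afun = @(lam) [lam, 1; 2, lam];           % placeholder; substitute the assembled matrix
smin = @(lam) min(svd(Afun(lam)));        % zero exactly where det(A(lam)) = 0
lams = linspace(-3, 3, 601);
vals = arrayfun(smin, lams);              % plot(lams, vals) to spot candidate roots
lam0 = fminsearch(smin, -1.5);            % polish one candidate (here -> -sqrt(2))
fprintf('root near %.6f, smin = %.2e\n', lam0, smin(lam0));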
s*****e
Posts: 10
31
Your matrices are too small. For small-matrix computations the CPU can cache very well
and SIMD can also be used, so the GPU has no advantage.
The advantage of GPU computing mainly lies in the balance between computation and memory
access -- in other words, hiding memory access latency. If the same code caches well on
the CPU, the GPU no longer has much of an edge.
For your computation I would expect roughly a 3x speedup; I don't know why you only got
1x. Try optimizing further.

I******c
Posts: 163
32
I think you could also put the matrix into shared memory.
I******c
Posts: 163
33
If each number needs 8 bytes of storage, a 64*64 matrix needs 32 KB of shared memory.
Current GPUs all have 48 KB of shared memory per SM.
g**********t
Posts: 475
34
I think the main issue is that the Jacobi method is too inefficient. It is easy to
parallelize, but for matrices of my size other algorithms, such as the QR algorithm, are
probably more suitable. For a single 64 x 64 matrix, my GPU program (Jacobi method) is
actually 6-7x slower than the CPU (using GSL, which seems to use a QR-decomposition type
algorithm); only when computing many matrices at once is the GPU about twice as fast
(a 2x speedup). I think the Jacobi method just isn't a good fit for my matrix size.
Does anyone know whether QR-decomposition type algorithms can be implemented efficiently
on a GPU?

g**********t
Posts: 475
35
The matrices are cached in shared memory, and bank conflicts have been taken into
account. I'm using single-precision floats and caching two matrices (one for the
original matrix, one for the eigenvectors), which takes about 32 KB of shared memory.
That may itself hurt efficiency, because with so much shared memory used, each SM can
only launch one block.
s*****e
Posts: 10
36
If a single computation on the GPU runs at 1/6 the CPU speed, then the throughput is
roughly
    GPU SM # / CPU core # * ratio * 8 = 16 / 4 * 1/6 * 8 ~= 5.
So if you hide memory access well, it should be 4-5x. I have done a similar computation,
and in practice it was about 3-4x. Your program still needs serious optimization,
especially that 8 blocks per SM.
From your reply below, it looks like you have not optimized occupancy well.
If you are using CUDA 4.0, the compiler does some of the optimization for you, so your
efficiency is roughly 5/8 * 2 ~= 2x, which makes sense given your observation. :-)
Good luck

b***k
Posts: 2673
37
From topic: Computation board - A linear algebra question
[The following is forwarded from the Mathematics board]
Sender: blook (布鲁克), Board: Mathematics
Title: A linear algebra question
Posted: BBS The unknown SPACE (Sun Mar 16 23:42:21 2014, US Eastern)
Given an arbitrary matrix A with dimension m x n,
what is the difference or relationship between A'*A and A*A'?
Here A' is the transpose of A.
Obviously both of them are symmetric, and they share the same nonzero eigenvalues,
but their eigenvectors are different, which means the eigenspaces of the two matrices
are distinct, right?
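A small numerical check of the relationship (random rectangular A): the two products share their nonzero eigenvalues, and the SVD ties their eigenvectors together.

A = randn(4, 6);
[U, S, V] = svd(A);                       % A = U*S*V'
disp(sort(eig(A'*A))');                   % 6 values: the sigma_i^2 plus two zeros
disp(sort(eig(A*A'))');                   % 4 values: the same sigma_i^2
% Columns of V are eigenvectors of A'*A, columns of U are eigenvectors of A*A',
% linked by A*v_i = sigma_i*u_i, so the eigenspaces live in different spaces but
% carry the same nonzero spectrum.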
x*z
Posts: 381
38
From topic: EE board - Question about Matlab's numerical precision
[The following is forwarded from the Computation board]
Sender: xxz (星星), Board: Computation
Title: Question about Matlab's numerical precision
Posted: BBS The unknown SPACE (Thu Feb 12 00:52:49 2009), forwarded
I'm using matlab's lmisolver to solve a linear matrix inequality and ran into the
following problem:
I need to solve A(X1,X2) < 0, where X1 and X2 are unknown matrices.
With lmisolver I have already obtained a numerical solution X1_0 and X2_0. To verify
the result, I computed the eigenvalues of the matrix A(X1_0,X2_0) and found that some
are around -3E-6.
My advisor's concern is that these eigenvalues are so small that it might only be
Matlab's rounding error that makes them all satisfy the < 0 requirement, so he wants
me to find other numerical solutions that push the eigenvalues of A further away from
the imaginary axis.
Is my advisor's concern unfounded? Can Matlab really not even reach an accuracy of 1E-6?
Thanks.
w***z
Posts: 771
39
From topic: EE board - A question about linear algebra
a rotation matrix A:
    |  cos(t), -sin(t) |
    |  sin(t),  cos(t) |     (t is the rotation angle)
has two complex eigenvalues r = cos(t) +- sin(t)*i
and two complex eigenvectors e = [1, i], [1, -i].
if t = 90 degrees,
we have r = i, -i;
and e = [1,i], [1,-i]
My question is how we understand that e keeps the same direction under a 90-degree
rotation matrix, or any rotation matrix? what's the intuition here? Looks
like we need four dimensions to imagine it. any idea how to imagine... [post truncated]
n*****n
Posts: 5277
40
From topic: EE board - A question about linear algebra
The geometric meaning of an eigenvector in two-dimensional space is not as
straightforward as in three-dimensional space. I guess the changing part
has been absorbed into the eigenvalue.
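A tiny numerical check of what "same direction" means in the question above (t = 90 degrees): the rotation only multiplies each complex eigenvector by a unit complex scalar, i.e. it preserves a complex line in C^2, not a real direction in the plane.

A  = [0 -1; 1 0];                         % rotation by 90 degrees
e1 = [1; -1i];  e2 = [1; 1i];
disp([A*e1, 1i*e1]);                      % equal columns: A*e1 =  i*e1
disp([A*e2, -1i*e2]);                     % equal columns: A*e2 = -i*e2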
r****j
Posts: 278
43
From topic: EE board - Requesting a paper from SpringerLink (baozi offered)
http://www.springerlink.com/content/k5615715657m9374/
Quadratic programming with one negative eigenvalue is NP-hard
Baozi offered -- please send the paper to r***********[email protected]. Many thanks!
q*****g
Posts: 1568
44
From topic: Mathematics board - Re: the natural logarithm e
e is inseparable from the exponential function exp(x). This function has many special
analytic properties; for example, the derivative of exp(x) equals the function itself.
Now suppose you have studied linear algebra. In other words, if you treat differentiation
as a linear operation (it really is a linear operator, heh), then it also has eigenvalues,
and you will find that this operation has a great many of them... To be precise, they
include all real (complex) numbers, and the eigen"vector" corresponding to the eigenvalue
1 is exp(x).
Another example: the familiar trigonometric functions can all be written as combinations
of exp(rx), except that the coefficient r can no longer be real -- it has to be complex.
This remarkable connection comes entirely from analysis rather than geometry. As just
said, exp(x) is the eigenvector of the differentiation operator for the eigenvalue 1;
what about exp(rx)? The answer is simple: it is the eigenvector for r. Take sin(x) as an
analogy: apply differentiation once and you get cos(x) -- nothing obvious yet. Don't
worry, apply differentiation once more and you get -sin(x)! This shows that although
sin(x) itself is not an eigenvector of the differentiation operator, it is closely
related to the eigenvector corresponding to sqrt(-1), because that eigenvector has the
same property: each derivative picks up one more factor of sqrt( [post truncated]
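A quick numerical illustration of the eigenvector statement above (values arbitrary): differentiating exp(r*x) just multiplies it by r, including for complex r, which is where sin and cos enter.

x  = linspace(0, 1, 2001);
r  = 0.7 + 2i;                            % any complex "eigenvalue" of d/dx
f  = exp(r*x);
df = gradient(f, x);                      % numerical derivative
disp(max(abs(df - r*f)));                 % small (finite-difference error only)
% With r = i, the real and imaginary parts of exp(r*x) are cos(x) and sin(x),
% and two derivatives send sin(x) to -sin(x): the eigenvalue is r^2 = -1.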
x*****d
Posts: 427
45
From topic: Mathematics board - How to calculate det(A)??? [digest]
Sender: cheeryyu (浪人清歌), Board: Mathematics
Title: [forwarded] How to calculate det(A)???
Posted: The unknown SPACE (Thu May 29 20:53:48 2003), WWW repost
[The following is forwarded from the EE board; original text below]
Sender: cheeryyu (浪人清歌), Board: EE
Title: How to calculate det(A)???
Posted: The unknown SPACE (Thu May 29 20:26:04 2003) WWW-POST
A is a Hermitian matrix with complex entries, all the diagonal values equal to
1, and the norms of all the off-diagonal values are less than 1.
Now I guess det(A) <= 1? That is, the product of all the eigenvalues is less
H****h
Posts: 1037
46
From topic: Mathematics board - Re: what will happen to the eigenvalues?
The following theorem shows that as long as A >= B, each eigenvalue of A is no smaller
than the corresponding eigenvalue of B.
Suppose A is a Hermitian operator on a finite-dimensional space with eigenvalues, in
decreasing order, a_1, a_2, ..., a_n. Then
a_k = \max_{k-dim subspace V} \min_{x \in V} (x, Ax)/(x, x).
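A quick numerical check of the stated consequence (random matrices): when A - B is positive semidefinite, every sorted eigenvalue of A is at least the matching eigenvalue of B.

n = 6;
B = randn(n);  B = (B + B')/2;            % random symmetric B
M = randn(n, 3);
A = B + M*M';                             % A - B = M*M' >= 0
disp(sort(eig(A))' - sort(eig(B))');      % all differences are nonnegative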

s********y
Posts: 660
47
A =  2 -1  0  0 ...  0  0
    -1  2 -1  0 ...  0  0
     0 -1  2 -1 ...  0  0
     ...
     0  0  0  0 ...  2 -1
     0  0  0  0 ... -1  2
A is just a tridiagonal matrix, with the stencil [-1 2 -1] along the diagonals.
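This particular matrix has a classical closed-form spectrum, lambda_k = 2 - 2*cos(k*pi/(n+1)) for k = 1..n; a minimal check (n arbitrary):

n = 10;
A = diag(2*ones(n,1)) + diag(-ones(n-1,1), 1) + diag(-ones(n-1,1), -1);
k = (1:n)';
lam_exact = 2 - 2*cos(k*pi/(n+1));                % = 4*sin(k*pi/(2*(n+1))).^2
disp(max(abs(sort(eig(A)) - sort(lam_exact))));   % agreement at rounding level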
s*******y
Posts: 558
48
X is an n-dimensional random vector with expected value u_X and
covariance C_X.
Let Y = AX, where A is an nxn orthogonal matrix (det(A) is either 1 or
-1). We know that the covariance of Y, denoted C_Y, is given by
C_Y = A C_X A'.
I was wondering whether the eigenvalues and eigenvectors of C_X are the
same as those of C_Y.
What if Y = AX + b, where b is an n-dimensional constant vector?
Thanks a lot!!!
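A small numerical check (random C_X and A): the eigenvalues are unchanged, the eigenvectors are rotated by A, and adding a constant b changes neither, since covariance ignores the mean.

n = 4;
M = randn(n);  C_X = M*M';                % a random covariance matrix
[A, ~] = qr(randn(n));                    % a random orthogonal matrix
C_Y = A * C_X * A';
disp(sort(eig(C_X))');                    % identical spectra...
disp(sort(eig(C_Y))');
[V, D] = eig(C_X);
disp(norm(C_Y*(A*V) - (A*V)*D));          % ~0: the eigenvectors of C_Y are A*V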
F******n
Posts: 160
49
Hello all,
I am looking for the references on the distance measure of two matrices,
both of which are positive definite and symmetric. For the unknown space (e.g.,
non-Euclidean), I heard that the general mathematical treatment is
somehow related to Riemannian manifolds, and the problem could be further
formulated and boiled down to the generalized eigenvalue problem. I am
looking for the detailed reference papers/books to explain this approach.
Thanks very much in advance!
feyn
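For what it's worth, a minimal sketch of the affine-invariant Riemannian distance usually meant in this setting (the matrices below are made up): d(A,B) = ||log(A^(-1/2) B A^(-1/2))||_F, which indeed reduces to the generalized eigenvalues of (B, A). The geometry-of-SPD-matrices literature (e.g. Bhatia, Positive Definite Matrices) covers this formulation in detail.

M1 = randn(5);  A = M1*M1' + eye(5);      % two illustrative SPD matrices
M2 = randn(5);  B = M2*M2' + eye(5);
lam = eig(B, A);                          % generalized eigenvalues: B*x = lam*A*x
d1  = sqrt(sum(log(lam).^2));             % distance from the generalized spectrum
d2  = norm(logm(sqrtm(inv(A))*B*sqrtm(inv(A))), 'fro');   % direct definition
disp([d1, d2]);                           % the two agree up to rounding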
h****f
Posts: 24
50
Here is a method someone else wrote, using matlab; see whether it works for you:
V = orth(2*rand(n)-1);  % Make random eigenvectors
D = diag(rand(n,1));    % Generate random positive eigenvalues
A = V*D*V';             % A will then be positive definite
Another one:
M = randn(n, n);
A = M * M';