Hello World
-
Hello World
Hello World ·My first post, Hello World.
Optimization
-
Frank-Wolfe Algorithm
Optimization ·Introduction
The Frank-Wolfe algorithm is an iterative first-order optimization algorithm for constrained convex optimization. Also known as the conditional gradient method, the reduced gradient algorithm, and the convex combination algorithm, the method was originally proposed by Marguerite Frank and Philip Wolfe in 1956. In each iteration, the Frank-Wolfe algorithm considers a linear approximation of the objective function, and moves towards...
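As a rough illustration of the iteration sketched above, here is a minimal Python/NumPy version of a Frank-Wolfe loop. The step-size rule, the example objective, and the linear minimization oracle lmo are my own illustrative choices, not taken from the post.

import numpy as np

def frank_wolfe(grad, lmo, x0, num_iters=100):
    # grad(x): gradient of the convex objective at x.
    # lmo(g): linear minimization oracle, returns argmin of <g, s> over the feasible set.
    x = x0.copy()
    for k in range(num_iters):
        g = grad(x)                        # build the linear approximation at x
        s = lmo(g)                         # minimize the linear approximation over the feasible set
        gamma = 2.0 / (k + 2.0)            # classical diminishing step size
        x = (1.0 - gamma) * x + gamma * s  # move towards s; a convex combination stays feasible
    return x

# Illustrative use: minimize ||x - b||^2 over the probability simplex.
b = np.array([0.2, 0.9, -0.1])
grad = lambda x: 2.0 * (x - b)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]  # best simplex vertex for a linear objective
x_star = frank_wolfe(grad, lmo, x0=np.ones(3) / 3)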
Matrix Computation
-
Symmetric Storage Gaxpy
Matrix Computation ·Introduction
A matrix \(\color{black}{A\in\mathcal{R}^{n\times n}}\) is symmetric if \(\color{black}{A^{T}=A}\) and skew-symmetric if \(\color{black}{A^{T}=-A}\). Likewise, a matrix \(\color{black}{A\in\mathcal{C}^{n\times n}}\) is Hermitian if \(\color{black}{A^{H}=A}\) and skew-Hermitian if \(\color{black}{A^{H}=-A}\).
For such matrices, storage requirements can be halved by simply storing the lower triangle of elements.
For general \(\color{black}{n}\), we set
\[\color{black}{A.vec((n-j/2)(j-1)+i)=a_{ij},\quad 1\le j\le i \le n}\]
for j = 1:n for...
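Below is a minimal Python/NumPy sketch of how the packed lower-triangle storage and the corresponding gaxpy update y <- y + A*x might look. The helper names pack_lower and sym_gaxpy are hypothetical, and the index arithmetic simply translates the 1-based formula above to 0-based indexing.

import numpy as np

def pack_lower(A):
    # Store the lower triangle of a symmetric matrix column by column in a 1-D array (A.vec).
    n = A.shape[0]
    return np.concatenate([A[j:, j] for j in range(n)])

def sym_gaxpy(A_vec, x, y):
    # y <- y + A x using only the packed lower triangle of the symmetric matrix A.
    n = len(x)
    def idx(i, j):  # 0-based position of a_ij for i >= j, matching the 1-based formula above
        return (2 * n - j - 1) * j // 2 + i
    for j in range(n):
        for i in range(j):        # entries above the diagonal come from symmetry: a_ij = a_ji
            y[i] += A_vec[idx(j, i)] * x[j]
        for i in range(j, n):     # entries on and below the diagonal are stored directly
            y[i] += A_vec[idx(i, j)] * x[j]
    return y

# Illustrative check against the dense product.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
x, y = np.array([1.0, -1.0]), np.zeros(2)
assert np.allclose(sym_gaxpy(pack_lower(A), x, y), A @ x)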
-
Band Storage Gaxpy
Matrix Computation ·Introduction
Suppose \(\color{black}{A\in\mathcal{R}^{n\times n}}\) has lower bandwidth \(\color{black}{p}\) and upper bandwidth \(\color{black}{q}\) and assume that \(\color{black}{p}\) and \(\color{black}{q}\) are much smaller than \(\color{black}{n}\). Such a matrix can be stored in a \(\color{black}{(p+q+1)\text{-by-}n}\) array \(\color{black}{A.band}\) with the convention that
\[\color{black}{a_{ij}=A.band(i-j+q+1,j)}\]
for all \(\color{black}{(i,j)}\) that fall inside the band.
for j = 1:n alpha_1 = max(1, j-q), alpha_2...
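The following Python/NumPy sketch shows one possible translation of the band-storage convention and the gaxpy update y <- y + A*x to 0-based indexing. The helpers to_band and band_gaxpy are illustrative, not the post's code.

import numpy as np

def to_band(A, p, q):
    # Pack a matrix with lower bandwidth p and upper bandwidth q into a (p+q+1)-by-n
    # array following the 1-based convention a_ij = A.band(i-j+q+1, j).
    n = A.shape[0]
    Ab = np.zeros((p + q + 1, n))
    for j in range(n):
        for i in range(max(0, j - q), min(n, j + p + 1)):
            Ab[i - j + q, j] = A[i, j]
    return Ab

def band_gaxpy(Ab, p, q, x, y):
    # y <- y + A x touching only the entries stored inside the band.
    n = len(x)
    for j in range(n):
        for i in range(max(0, j - q), min(n, j + p + 1)):
            y[i] += Ab[i - j + q, j] * x[j]
    return y

# Illustrative check: a tridiagonal matrix has p = q = 1.
A = np.diag([4.0, 4.0, 4.0]) + np.diag([1.0, 1.0], 1) + np.diag([-1.0, -1.0], -1)
x, y = np.ones(3), np.zeros(3)
assert np.allclose(band_gaxpy(to_band(A, 1, 1), 1, 1, x, y), A @ x)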
-
Fast Matrix-Vector Products
Matrix Computation ·The Fast Fourier Transform
The discrete Fourier transform (DFT) of a vector \(\color{black}{x \in \mathcal{C}^n}\) is a matrix-vector product \[\color{black}{y = F_nx}\] where the DFT matrix \(\color{black}{F_n = (f_{kj}) \in \mathcal{C}^{n\times n}}\) is defined by \[\color{black}{f_{kj} = \omega^{(k-1)(j-1)}_n}\] with \[\color{black}{\omega_n = \exp(-2\pi i/n) = \cos(2\pi/n) - i\sin(2\pi/n)}\] so that \(\color{black}{\omega_n^n = 1}\).
The DFT is ubiquitous throughout computational science and engineering...
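As a quick check of the definition above, this Python/NumPy sketch forms F_n explicitly and compares the O(n^2) matrix-vector product with NumPy's FFT, which evaluates the same product in O(n log n). The helper dft_matrix is illustrative, not code from the post.

import numpy as np

def dft_matrix(n):
    # F_n with f_kj = omega_n^{(k-1)(j-1)} and omega_n = exp(-2*pi*i/n), written 0-based.
    omega = np.exp(-2j * np.pi / n)
    k, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return omega ** (k * j)

n = 8
x = np.random.randn(n) + 1j * np.random.randn(n)
y_slow = dft_matrix(n) @ x   # O(n^2) matrix-vector product y = F_n x
y_fast = np.fft.fft(x)       # the FFT computes the same product in O(n log n)
assert np.allclose(y_slow, y_fast)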
-
Matrix Multiplication
Matrix Computation ·The Notion of “Level” and the BLAS
The dot product and scalar alpha x plus y (saxpy) operations are examples of level-1 operations. Level-1 operations involve an amount of data and an amount of arithmetic that are linear in the dimension of the operation. An \(\color{black}{m}\)-by-\(\color{black}{n}\) outer product update or a generalized saxpy (gaxpy) operation involves a quadratic amount of data...
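For concreteness, here is a small Python/NumPy sketch of representative operations at each level; the function names are illustrative placeholders, not the post's code.

import numpy as np

def saxpy(alpha, x, y):
    # Level-1: y <- alpha*x + y, linear data and linear arithmetic in n.
    return alpha * x + y

def gaxpy(A, x, y):
    # Level-2: y <- A*x + y, quadratic data and quadratic arithmetic in n.
    return A @ x + y

def outer_update(A, x, y):
    # Level-2: A <- A + x*y^T, an m-by-n outer product update.
    return A + np.outer(x, y)

def matmul_update(C, A, B):
    # Level-3: C <- A*B + C, quadratic data but cubic arithmetic.
    return A @ B + C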
Project
-
Vision Perception System of Intelligent Connected Vehicle
Project ·From September 1st, 2018 to January 31st, 2019, I participated in our college's Intelligent Connected Vehicle (ICV) project. In this project, I was in charge of the ICV's vision perception system. I completed the full perception system setup and the calibration of multiple sensors. Our recording platform is an electric vehicle that has been modified with actuators for pedals (acceleration...