Feature map of the Gaussian kernel



In SVM, the Gaussian kernel is defined as:

$$K(x,y)=\exp\left(-\frac{\|x-y\|^2}{2\sigma^2}\right)=\phi(x)^T\phi(y),$$

where $x,y\in\mathbb{R}^n$. I do not know the explicit equation of $\phi$. I want to know it.

I would also like to know whether

$$\sum_i c_i\,\phi(x_i)=\phi\!\left(\sum_i c_i x_i\right),$$

where $c_i\in\mathbb{R}$. Right now I think it is not equal, because the kernel is used to handle the situation where a linear classifier does not work. I know $\phi$ projects $x$ into an infinite-dimensional space. So if it still remained linear, no matter how many dimensions it has, the SVM still could not produce a good classification.

Why does this kernel imply a transformation? Or are you referring to the associated feature space?
Placidia

Yes, what is the feature space such that $\phi^T(x)\,\phi(x')=\exp\!\left(-\frac{1}{2\sigma^2}\|x-x'\|^2\right)$?

Answers:



You can obtain the explicit equation of $\phi$ for the Gaussian kernel via the Taylor series expansion of $e^x$. For notational simplicity, assume $x\in\mathbb{R}^1$:

$$\phi(x)=e^{-x^2/2\sigma^2}\left[1,\;\sqrt{\tfrac{1}{1!\,\sigma^2}}\,x,\;\sqrt{\tfrac{1}{2!\,\sigma^4}}\,x^2,\;\sqrt{\tfrac{1}{3!\,\sigma^6}}\,x^3,\;\ldots\right]^T$$

This is also discussed in more detail in these slides by Chih-Jen Lin of NTU (slide 11 specifically). Note that in the slides $\gamma=\frac{1}{2\sigma^2}$ is used as kernel parameter.

The equation in the OP only holds for the linear kernel.
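As a quick numerical sanity check of this expansion (a sketch of my own; the truncation level `n_terms` is an arbitrary choice), the truncated inner product $\phi(x)^T\phi(y)$ should match $K(x,y)=\exp(-(x-y)^2/2\sigma^2)$ in one dimension:

```python
import numpy as np
from math import factorial, exp

def phi(x, sigma=1.0, n_terms=30):
    """Truncated Taylor feature map for the 1-D Gaussian kernel
    K(x, y) = exp(-(x - y)^2 / (2 sigma^2))."""
    return np.array([
        exp(-x**2 / (2 * sigma**2))
        * np.sqrt(1.0 / (factorial(n) * sigma**(2 * n))) * x**n
        for n in range(n_terms)
    ])

def gaussian_kernel(x, y, sigma=1.0):
    return exp(-(x - y)**2 / (2 * sigma**2))

x, y, sigma = 0.7, -0.3, 1.0
approx = phi(x, sigma) @ phi(y, sigma)
exact = gaussian_kernel(x, y, sigma)
print(approx, exact)  # the two values agree to many decimal places
```

The series converges fast because the $1/\sqrt{n!}$ factors shrink super-exponentially, so a modest truncation already reproduces the kernel.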


Hi, but the equation above only suits one dimension.
Vivian

So, here, the reproducing kernel Hilbert space is a subspace of $\ell^2$, correct?
The_Anomaly

Is there also an explicit representation of the Laplacian kernel?
Felix Crazzolara


For any valid psd kernel $k:\mathcal{X}\times\mathcal{X}\to\mathbb{R}$, there exists a feature map $\varphi:\mathcal{X}\to\mathcal{H}$ such that $k(x,y)=\langle\varphi(x),\varphi(y)\rangle_{\mathcal{H}}$. The space $\mathcal{H}$ and embedding $\varphi$ in fact need not be unique, but there is an important unique pair $(\mathcal{H},\varphi)$ known as the reproducing kernel Hilbert space (RKHS).

The RKHS is discussed by: Steinwart, Hush and Scovel, An Explicit Description of the Reproducing Kernel Hilbert Spaces of Gaussian RBF Kernels, IEEE Transactions on Information Theory 2006 (doi, free citeseer pdf).

It's somewhat complicated, but it boils down to this: define $e_n:\mathbb{C}\to\mathbb{C}$ as

$$e_n(z):=\sqrt{\frac{(2\sigma^2)^n}{n!}}\,z^n\,e^{-\sigma^2 z^2}.$$

Let $n:\mathbb{N}_0\to\mathbb{N}_0^d$ be a sequence ranging over all $d$-tuples of nonnegative integers; if $d=3$, perhaps $n(0)=(0,0,0)$, $n(1)=(0,0,1)$, $n(2)=(0,1,1)$, and so on. Denote the $j$th component of the $i$th tuple by $n_{ij}$.

Then the $i$th component of $\varphi(x)$ is $\prod_{j=1}^d e_{n_{ij}}(x_j)$. So $\varphi$ maps vectors in $\mathbb{R}^d$ to infinite-dimensional complex vectors.

The catch to this is that we further have to define norms for these infinite-dimensional complex vectors in a special way; see the paper for details.
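As a one-dimensional sanity check (my own sketch, not taken from the paper), the inner product $\sum_n e_n(x)e_n(y)$ should recover the kernel in the paper's parameterization, $k(x,y)=\exp(-\sigma^2\|x-y\|^2)$:

```python
import numpy as np
from math import factorial

def e_n(n, z, sigma=1.0):
    """n-th component function from the RKHS construction, d = 1."""
    return np.sqrt((2 * sigma**2)**n / factorial(n)) * z**n * np.exp(-sigma**2 * z**2)

# sum_n e_n(x) e_n(y) = exp(-sigma^2 (x+y)^2... collapses, via the
# exponential series, to exp(-sigma^2 (x - y)^2).
x, y, sigma = 0.4, 1.1, 0.8
inner = sum(e_n(n, x, sigma) * e_n(n, y, sigma) for n in range(40))
exact = np.exp(-sigma**2 * (x - y)**2)
print(inner, exact)
```

Note the convention: the paper uses $\exp(-\sigma^2\|x-y\|^2)$, not the $\exp(-\|x-y\|^2/2\sigma^2)$ of the question, so the $\sigma$ here plays a different role.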


Steinwart et al. also give a more straightforward (to my thinking) embedding into $L^2(\mathbb{R}^d)$, the Hilbert space of square-integrable functions from $\mathbb{R}^d\to\mathbb{R}$:

$$[\Phi_\sigma(x)](t)=\frac{(2\sigma)^{d/2}}{\pi^{d/4}}\,e^{-2\sigma^2\|x-t\|_2^2}.$$
Note that $\Phi_\sigma(x)$ is itself a function from $\mathbb{R}^d$ to $\mathbb{R}$. It's basically the density of a $d$-dimensional Gaussian with mean $x$ and covariance $\frac{1}{4\sigma^2}I$; only the normalizing constant is different. Thus when we take
$$\langle\Phi(x),\Phi(y)\rangle_{L^2}=\int[\Phi(x)](t)\,[\Phi(y)](t)\,dt,$$
we're taking the product of Gaussian density functions, which is itself a certain constant times a Gaussian density function. When you do that integral over $t$, the constant that falls out ends up being exactly $k(x,y)$.
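That integral can be checked numerically in one dimension (a sketch of mine; the grid and test points are arbitrary, and the kernel again follows the paper's $\exp(-\sigma^2\|x-y\|^2)$ convention):

```python
import numpy as np

def Phi(x, t, sigma):
    """[Phi_sigma(x)](t) in d = 1: an unnormalized Gaussian bump centered at x."""
    return (2 * sigma)**0.5 / np.pi**0.25 * np.exp(-2 * sigma**2 * (x - t)**2)

sigma, x, y = 0.9, 0.2, 1.5
# Riemann-sum approximation of the L2 inner product on a wide, fine grid;
# the integrand is effectively zero at the endpoints.
t = np.linspace(-20.0, 20.0, 400001)
inner = np.sum(Phi(x, t, sigma) * Phi(y, t, sigma)) * (t[1] - t[0])
exact = np.exp(-sigma**2 * (x - y)**2)
print(inner, exact)
```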

These are not the only embeddings that work.

Another is based on the Fourier transform, which the celebrated paper of Rahimi and Recht (Random Features for Large-Scale Kernel Machines, NIPS 2007) approximates to great effect.
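A minimal sketch of the Rahimi–Recht construction, assuming the question's kernel $\exp(-\|x-y\|^2/2\sigma^2)$ (variable names and the feature count `D` are my own choices): frequencies are sampled from the kernel's spectral density, and the randomized map's inner product approximates the kernel.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 3, 20000, 1.0  # input dim, number of random features

# The spectral density of k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
# is N(0, I / sigma^2); sample frequencies W and phases b from it.
W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def z(x):
    """Random Fourier feature map: z(x) @ z(y) approximates k(x, y)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x = rng.normal(size=d)
y = rng.normal(size=d)
approx = z(x) @ z(y)
exact = np.exp(-np.sum((x - y)**2) / (2 * sigma**2))
print(approx, exact)  # close for large D
```

The approximation error shrinks as $O(1/\sqrt{D})$, which is what makes this useful for large-scale kernel machines.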

You can also do it using Taylor series: effectively the infinite version of Cotter, Keshet, and Srebro, Explicit Approximations of the Gaussian Kernel, arXiv:1109.4603.


Douglas Zare gave a 1d version of the "more straightforward" embedding in an interesting thread here.
Dougal

Here you find a more 'intuitive' explanation that $\Phi$ can map onto a space of dimension equal to the size of the training sample, even for an infinite training sample: stats.stackexchange.com/questions/80398/…


It seems to me that your second equation will only be true if $\phi$ is a linear mapping (and hence $K$ is a linear kernel). As the Gaussian kernel is non-linear, the equality will not hold (except perhaps in the limit as $\sigma$ goes to zero).
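This failure can be checked without ever writing $\phi$ down, because the squared distance between the two sides of the OP's second equation expands entirely in kernel evaluations: $\|\sum_i c_i\phi(x_i)-\phi(s)\|^2=\sum_{i,j}c_ic_jK(x_i,x_j)-2\sum_i c_iK(x_i,s)+K(s,s)$ with $s=\sum_i c_ix_i$. A sketch (points and coefficients are arbitrary choices of mine):

```python
import numpy as np

def K(x, y, sigma=1.0):
    """Gaussian kernel K(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - y)**2) / (2 * sigma**2))

xs = [np.array([0.0, 1.0]), np.array([1.0, -1.0])]
cs = [0.5, 0.5]
s = sum(c * x for c, x in zip(cs, xs))  # the convex combination in input space

# || sum_i c_i phi(x_i) - phi(s) ||^2 via the kernel trick only.
dist2 = (sum(ci * cj * K(xi, xj)
             for ci, xi in zip(cs, xs)
             for cj, xj in zip(cs, xs))
         - 2 * sum(c * K(x, s) for c, x in zip(cs, xs))
         + K(s, s))
print(dist2)  # strictly positive, so the two sides differ
```

A linear kernel would make this distance exactly zero; for the Gaussian kernel it is not.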


Thank you for your answer. When $\sigma\to 0$, the dimension that the Gaussian kernel projects into would increase. And by your inspiration, now I think it is not equal, because using a kernel just handles the situation where linear classification does not work.
Vivian
Licensed under cc by-sa 3.0 with attribution required.