The likelihood function of a sample is the joint density of the random variables involved, but viewed as a function of the unknown parameters, given a specific sample of realizations from these random variables.
In your case, the assumption appears to be that the lifetime of these electronic components follows (i.e. has as its marginal distribution) an exponential distribution with the same rate parameter $\theta$, and so the marginal PDF is:
$$f_{X_i}(x_i \mid \theta) = \theta e^{-\theta x_i}, \qquad i = 1,2,3$$
Also, it appears that the life of each component is fully independent of the lives of the others. In such a case the joint density function is the product of the three marginal densities,
$$f_{X_1,X_2,X_3}(x_1,x_2,x_3 \mid \theta) = \theta e^{-\theta x_1}\cdot \theta e^{-\theta x_2}\cdot \theta e^{-\theta x_3} = \theta^3\cdot\exp\left\{-\theta\sum_{i=1}^{3} x_i\right\}$$
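As a quick numeric sanity check of this factorization (a sketch with an arbitrary value of $\theta$, not part of the original derivation), the product of the three marginal densities agrees with the compact $\theta^3 \exp\{-\theta\sum x_i\}$ form:

```python
import math

def f(x, theta):
    """Marginal exponential density: theta * exp(-theta * x)."""
    return theta * math.exp(-theta * x)

theta = 0.5                 # arbitrary illustrative rate
xs = [3.0, 1.5, 2.1]        # the three lifetimes

# Product of the three marginal densities
product = f(xs[0], theta) * f(xs[1], theta) * f(xs[2], theta)

# Compact joint-density form: theta^3 * exp(-theta * sum(xs))
joint = theta**3 * math.exp(-theta * sum(xs))

assert math.isclose(product, joint)
```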
To turn this into the likelihood function of the sample, we view it as a function of $\theta$ given a specific sample of $x_i$'s.
$$L(\theta \mid \{x_1,x_2,x_3\}) = \theta^3\cdot\exp\left\{-\theta\sum_{i=1}^{3} x_i\right\}$$
where only the left-hand side has changed, to indicate what is considered the variable of the function.
In your case the available sample is the three observed lifetimes $\{x_1=3,\ x_2=1.5,\ x_3=2.1\}$, and so $\sum_{i=1}^{3} x_i = 6.6$. The likelihood is then
$$L(\theta \mid \{x_1=3,\ x_2=1.5,\ x_3=2.1\}) = \theta^3\cdot\exp\{-6.6\,\theta\}$$
In other words, the specific sample available has already been inserted into the likelihood you were given. This is not usually done: we usually "stop" at the theoretical representation of the likelihood for general $x_i$'s, derive the conditions for its maximization with respect to $\theta$, and only then plug the specific numerical $x$-values into the maximization conditions, in order to obtain a specific estimate for $\theta$.
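To make that maximization step concrete: setting the derivative of the log-likelihood $\ell(\theta) = 3\ln\theta - 6.6\,\theta$ to zero gives $\hat\theta = 3/6.6 \approx 0.4545$. A minimal sketch checking this numerically (only the sample values come from the question):

```python
import math

# Observed lifetimes from the question
x = [3.0, 1.5, 2.1]
n = len(x)
s = sum(x)          # 6.6

def log_likelihood(theta):
    """log L(theta) = n*log(theta) - theta * sum(x), for theta > 0."""
    return n * math.log(theta) - theta * s

# Setting d/dtheta [n*log(theta) - theta*s] = 0 gives theta_hat = n / s
theta_hat = n / s   # 3 / 6.6, approximately 0.4545

# Sanity check: the log-likelihood at theta_hat beats nearby values
assert log_likelihood(theta_hat) > log_likelihood(theta_hat * 0.9)
assert log_likelihood(theta_hat) > log_likelihood(theta_hat * 1.1)
```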
Admittedly though, looking at the likelihood like this may make clearer the fact that what matters here for inference (under the specific distributional assumption) is the sum of the realizations, not their individual values: the above likelihood is not "sample-specific" but rather "sum-of-realizations-specific". If we are given any other $n=3$ sample whose elements again sum to 6.6, we will obtain the same estimate for $\theta$. This is essentially what it means to say that $\sum x_i$ is a "sufficient" statistic: it contains all the information that the sample can provide for inference, under the specific distributional assumption.
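This "sum-of-realizations-specific" point can be illustrated directly: below, `sample_b` is a made-up alternative sample (not from the question) whose elements also sum to 6.6, and the two likelihood functions coincide at every value of $\theta$:

```python
import math

def likelihood(theta, xs):
    """Exponential-sample likelihood: theta^n * exp(-theta * sum(xs))."""
    n = len(xs)
    return theta**n * math.exp(-theta * sum(xs))

sample_a = [3.0, 1.5, 2.1]   # the sample from the question; sum = 6.6
sample_b = [1.0, 1.6, 4.0]   # a hypothetical different sample; sum = 6.6

# The likelihoods agree for every theta, so any procedure based on the
# likelihood (e.g. the MLE) gives the same answer for both samples.
for theta in [0.1, 0.45, 1.0, 2.0]:
    assert math.isclose(likelihood(theta, sample_a),
                        likelihood(theta, sample_b))
```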