Machine learning theory
Example of the union bound:
X is a continuous random variable, so P(X = 3) = 0 and, more generally, P(X = n) = 0 for any single point n.
Hence we do not have to worry about whether the boundaries of the intervals below are included.
P(A1) = P[0 <= X <= 2] = 1/(2^2)
P(A2) = P[1 <= X <= 3] = 1/(2^3)
P(A3) = P[2 <= X <= 4] = 1/(2^4)
By the union bound:
P(union Ai) <= 1/(2^2) + 1/(2^3) + 1/(2^4) + ... = 1/2
(a geometric series with first term 1/4 and ratio 1/2).
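The bound is also easy to check numerically. The notes do not fix a distribution for X, so the sketch below makes its own choice, X ~ Exponential(1), and uses overlapping interval events Ai = [i, i + 2]; these are illustrative assumptions, not the probabilities above. It compares the empirical probability of the union against the sum of the individual probabilities.

import numpy as np

# Illustrative assumption (not from the notes): X ~ Exponential(1).
# Check the union bound P(union A_i) <= sum_i P(A_i) empirically
# for overlapping interval events A_i = [i, i + 2].
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

events = [(i, i + 2) for i in range(10)]               # A_i = [i, i + 2]
indicators = np.array([(x >= a) & (x <= b) for a, b in events])

p_each = indicators.mean(axis=1)                       # empirical P(A_i)
p_union = indicators.any(axis=0).mean()                # empirical P(union A_i)

print(f"P(union A_i) ~= {p_union:.4f}")
print(f"sum_i P(A_i) ~= {p_each.sum():.4f}")           # union bound: first <= second

Because the events overlap, the empirical union probability comes out strictly smaller than the sum, which is exactly what the bound allows.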
Here's a case where increased complexity does not reduce precision (if the link doesn't work, copy the URL into a browser and open it in Google Colab).
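The linked notebook is not reproduced here, so the sketch below is only a guess at the kind of experiment it might contain: fit polynomial models of increasing degree to a reasonably large sample and compare their test errors, to see that extra complexity need not cost accuracy when data is plentiful. The data-generating function, sample size, and degrees are all assumptions made for illustration.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Hypothetical setup (not from the notes): noisy cubic target, plenty of data.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(2000, 1))
y = X[:, 0] ** 3 - 2 * X[:, 0] + rng.normal(0, 1, size=2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Increasing the polynomial degree adds model complexity; with this much data
# the test error of the richer models stays close to that of the simpler ones.
for degree in (1, 3, 5, 10, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    err = mean_squared_error(y_test, model.predict(X_test))
    print(f"degree {degree:2d}: test MSE = {err:.3f}")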
Explanation of the curse of high dimensions:
Here we revisit the curse of high dimensionality (code sample).
We go over a simple example that demonstrates the curse of high dimensions (if the link doesn't work, copy the URL into a browser and open it in Google Colab).
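The linked notebook is not reproduced here, but one standard demonstration of the curse of high dimensions, used below as an assumed stand-in, is that distances between random points concentrate as the dimension grows, so the nearest and farthest neighbors of a query point become nearly indistinguishable.

import numpy as np

# Hedged sketch (not the linked notebook): pairwise distances between random
# points in [0,1]^dim concentrate as dim grows, so the relative gap between
# the nearest and farthest neighbor of a query point shrinks toward zero.
rng = np.random.default_rng(0)

for dim in (2, 10, 100, 1000):
    points = rng.uniform(size=(500, dim))          # 500 random points in [0,1]^dim
    query = rng.uniform(size=dim)                  # one random query point
    dists = np.linalg.norm(points - query, axis=1)
    contrast = (dists.max() - dists.min()) / dists.min()
    print(f"dim {dim:4d}: relative spread of distances = {contrast:.3f}")

As the dimension increases, the printed relative spread drops sharply, which is the concentration effect that makes nearest-neighbor style reasoning unreliable in high dimensions.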