The weak law of large numbers states that for any $\epsilon > 0$, the probability that the average of the results deviates from the expected value by more than $\epsilon$ goes to zero as the number of trials goes to infinity. This can be proven using Chebyshev's inequality. Here are the steps of the proof:
1. Let $X_1, X_2, \ldots, X_n$ be a sequence of independent and identically distributed random variables with expected value $\mu$ and finite variance $\sigma^2$.
2. The average of the first $n$ random variables is defined as $\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i$. By linearity of expectation, $E[\bar{X}_n] = \mu$, and by independence, $\mathrm{Var}(\bar{X}_n) = \frac{\sigma^2}{n}$.
3. Applying Chebyshev's inequality to $\bar{X}_n$, we have $P(|\bar{X}_n-\mu| \geq \epsilon) \leq \frac{\mathrm{Var}(\bar{X}_n)}{\epsilon^2} = \frac{\sigma^2}{n\epsilon^2}$.
4. As the number of trials goes to infinity, i.e. $n \rightarrow \infty$, the right-hand side of the inequality goes to zero.
5. Therefore, for any $\epsilon > 0$, the probability that the average of the results deviates from the expected value by more than $\epsilon$ goes to zero as the number of trials goes to infinity.
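The steps above can be checked empirically. The sketch below simulates rolls of a fair six-sided die (an illustrative choice, not part of the proof, with $\mu = 3.5$ and $\sigma^2 = 35/12$) and estimates $P(|\bar{X}_n - \mu| \geq \epsilon)$ by Monte Carlo, comparing it to the Chebyshev bound $\frac{\sigma^2}{n\epsilon^2}$ as $n$ grows:

```python
import random

# Fair six-sided die: mu = 3.5, sigma^2 = 35/12 (illustrative choice).
random.seed(0)
mu = 3.5
var = 35 / 12
eps = 0.25
trials = 2000  # repeated experiments per sample size

def deviation_prob(n):
    """Estimate P(|mean of n rolls - mu| >= eps) by Monte Carlo."""
    hits = 0
    for _ in range(trials):
        mean = sum(random.randint(1, 6) for _ in range(n)) / n
        if abs(mean - mu) >= eps:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    p = deviation_prob(n)
    bound = var / (n * eps**2)
    print(f"n={n:5d}  empirical={p:.3f}  Chebyshev bound={bound:.3f}")
```

The empirical deviation probability falls toward zero as $n$ increases, and it stays below the (often loose) Chebyshev bound, consistent with step 4.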
In summary, Chebyshev's inequality states that for any random variable $X$ with expected value $\mu$ and finite variance $\sigma^2$, and any $\epsilon > 0$, the probability that the variable deviates from its expected value by at least $\epsilon$ satisfies $P(|X - \mu| \geq \epsilon) \leq \frac{\sigma^2}{\epsilon^2}$. Applied to the sample mean $\bar{X}_n$, whose variance is $\frac{\sigma^2}{n}$, the bound becomes $\frac{\sigma^2}{n\epsilon^2}$, which goes to zero as $n \rightarrow \infty$; this is exactly the weak law of large numbers.
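Chebyshev's inequality itself can also be verified exactly for a small discrete example. The sketch below (again using a fair die as an illustrative choice) computes the exact tail probability $P(|X - \mu| \geq \epsilon)$ over the six outcomes and compares it to $\frac{\sigma^2}{\epsilon^2}$, using exact rational arithmetic:

```python
from fractions import Fraction

# Exact check of Chebyshev's bound for one roll of a fair die (mu = 7/2).
# The die is an illustrative choice; any finite-variance variable works.
outcomes = range(1, 7)
mu = Fraction(7, 2)
var = sum((Fraction(x) - mu) ** 2 for x in outcomes) / 6  # 35/12

def exact_tail(eps):
    """P(|X - mu| >= eps), computed exactly over the six outcomes."""
    return Fraction(sum(1 for x in outcomes if abs(Fraction(x) - mu) >= eps), 6)

for eps in (Fraction(1), Fraction(2), Fraction(5, 2)):
    print(f"eps={eps}: P = {exact_tail(eps)} <= bound = {var / eps**2}")
```

For small $\epsilon$ the bound $\frac{\sigma^2}{\epsilon^2}$ can exceed 1 and is trivially true; it only becomes informative once $\epsilon$ is comparable to $\sigma$ or larger.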