Can anyone explain the essential difference between a simple average (say, of a huge data set) and expected value? If I know some average value, is there a way to view it as an expected value, or to convert it into one?

Edit: For example, say I play some simple game, and I play it billions of times. Suppose that, over all those billions of plays, the average of the money I make and lose is $8. Could I say that $8 is my expected value for that game?
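To make the question concrete, here is a minimal simulation sketch (with a made-up game: roll a fair six-sided die and win the face value in dollars, so the theoretical expected value is 3.5). The running average over many plays gets closer and closer to that number, which is the law of large numbers at work:

```python
import random

def play_once(rng):
    # Hypothetical game: roll a fair six-sided die, win that many dollars.
    return rng.randint(1, 6)

def sample_mean(n_plays, seed=0):
    """Average winnings over n_plays independent plays."""
    rng = random.Random(seed)
    total = sum(play_once(rng) for _ in range(n_plays))
    return total / n_plays

# Theoretical expected value: (1 + 2 + ... + 6) / 6 = 3.5
expected = sum(range(1, 7)) / 6

# The sample average drifts toward the expected value as n grows.
for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: sample mean = {sample_mean(n):.4f} (expected {expected})")
```

The point of the sketch: the expected value (3.5) is a fixed property of the game's probability distribution, while each sample mean is a random quantity that merely converges to it as the number of plays grows.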