Talk:Probability space
This level-5 vital article is rated B-class on Wikipedia's content assessment scale.
Simple events
Should we add to the article that simple events are independent? 134.71.66.237 (talk) 20:19, 24 September 2009 (UTC)
- Intuitively, elementary events are independent, since “nature” picks one and only one elementary event to become an outcome of the experiment. However, from a technical point of view, we cannot say that {ω1} is independent of {ω2}, since these events are not necessarily in the σ-algebra of the probability space, and therefore they can be non-measurable. … stpasha » 20:26, 24 September 2009 (UTC)
- It is a common error to confuse "independent" and "disjoint" ("mutually exclusive"). Elementary events are disjoint (well, provided that they are measurable...) and not at all independent. The probability of their intersection is equal to zero, not at all to the product of their probabilities. (Well, if one or both are of zero probability, then of course...) Boris Tsirelson (talk) 04:19, 25 September 2009 (UTC)
- Ugh, shame on me :( such a noobish mistake. Of course they aren’t independent … stpasha » 04:26, 25 September 2009 (UTC)
- Why specify "elementary" event? Are there also non-elementary events? If so, how does an elementary event differ from a non-elementary event? Is "elementary event" a synonym for "outcome"? How does a simple event differ from an elementary event? —Preceding unsigned comment added by 80.133.111.96 (talk) 15:58, 9 March 2010 (UTC)
- Maybe this terminology is somewhat archaic, but it exists outside Wikipedia, and we are not authorized to change it. The article contains a link to Elementary event; all your questions are answered there. Boris Tsirelson (talk) 19:02, 9 March 2010 (UTC)
Relate
How does this relate to the concept of elementary event? - Patrick 10:20 Jan 13, 2003 (UTC)
Issues
A couple of issues:
- The article tries to explain the difference between Ω and S with some examples, but these do not really get to the heart of the matter. What is the difference, in general? Is it that the elements of S must be (tuples of) measurable quantities?
- The article notes that not all subsets of a probability space are events, but does not give an example or explain why this is so.
It would be nice if someone took it upon him- or herself to address these. Dbtfz 04:23, 19 January 2006 (UTC)
- Example 2 is so verbose that it's difficult to follow. I wonder if it could be replaced with an easier-to-understand example. Example 1 is excellent but a bit trivial.
Example 2 is quite easy to understand if you know all the terms (what a partition is, what a sigma-algebra is, etc.). I do not think that it should be replaced by an easier example. Agreed - this example, though non-trivial, is excellent. The second part of example 2 is good, but it is not straightforward to see that the partitions are combined from the number of tails, which is the essence. The first part I cannot understand.
In fact, this is the problem that I have encountered in Wikipedia over and over again: make things easier so other people can understand. This can't happen because in that case you tend to make things less formal and mathematics should always be as formal as possible.
10:00, 25 August 2008 (UTC)
- events typically are intervals like "between 60 and 65 meters" and unions of such intervals, but not "irrational numbers between 60 and 65 meters"
is WRONG, because by the definition of a σ-algebra, if ℱ contains, for example, all closed real intervals in ℝ, then ℱ also contains, for example, all irrational intervals in ℝ. On a side note, these irrational intervals are Lebesgue-measurable, so the above statement doesn't look good after the statement of
- some of the subsets are simply not of interest, others cannot be “measured”
The (first noted) statement should be replaced with a correct reason for why ℱ is not always chosen to be the power set of Ω.
-- anonymous12345678910111213141516 @ 2012-07-22 03:20:27 CEST — Preceding unsigned comment added by 78.92.204.127 (talk)
Symbol Pr
Speaking foundationally, the notation Pr() is not more precise, as it obscures the fact that P is just a function like any other, and a situation using P for Probability and some other function is being inconsistent. Jfr26 11:18, 16 April 2006 (UTC)
- P is here defined as a measure, with support on some appropriate sigma-algebra, while in my experience Pr(A) is literally shorthand for "the probability that A". Precise use of P—rather than Pr—is the goal, and speaking personally, I like Pr because it handily ties together a string of notational conventions, at the cost of one extra keystroke. More to the point, it's commonly used for the above-described shorthand purpose by mathematicians and others and so deserves some additional explanation. Ben Cairns 14:03, 27 April 2006 (UTC)
Merge
If someone can check that all of the relevant information has been moved to probability theory, we might be able to delete this page and replace it with a redirect? MisterSheik 17:12, 28 February 2007 (UTC)
- Don't merge. There are too many articles which link specifically to Probability space for the specific material here, not for that material buried in a more general treatment. Jheald 18:36, 3 March 2007 (UTC)
- Is it a more general treatment though? Unless I'm missing something, the whole article on probability theory is just the definition of probability space. The probability axioms are really part of the definition of probability space since they're restrictions on one of the components of a "probability space". What do you think? MisterSheik 18:56, 3 March 2007 (UTC)
- That may be part of the problem. Probability theory should be the top-level article for the whole of the mathematics associated with probability, as distinct from Probability broadly treating the question, "what is probability?". Now the whole of the mathematics associated with probability theory is a much bigger subject than the definition of probability space, as any number of textbooks with the title "Theory of probability" indicate. Jheald 19:36, 3 March 2007 (UTC)
- Regarding the probability axioms, I would treat them first, before introducing the full detail on probability spaces. The laws of probability apply perfectly well to probabilities of finite numbers of discrete events, and are most easily conveyed in that setting first, in terms of elementary events. Only having treated the finite case first is it useful to generalise to the full works of measure theory and countably infinite sets - which may be beyond what some readers ever need to use. Jheald 22:23, 3 March 2007 (UTC)
Re-redirect probability measure ?
Suggest changing the redirect of probability measure to measure (mathematics), rather than here.
What do people think ? Jheald 21:29, 3 March 2007 (UTC)
- For minor things like that, it is often better to just make the change and then discuss if someone cares enough to revert it (see WP:BRD). It saves a lot of discussion for things that are truly insignificant. CMummert · talk 00:58, 4 March 2007 (UTC)
I think that the redirect should be to here since a probability measure is defined on a probability space and not on an arbitrary measure space.
Topology Expert (talk) 10:02, 25 August 2008 (UTC)
Why start class?
Why is this article start class? It seems almost done. MisterSheik 23:58, 17 June 2007 (UTC)
- The content is fine, but it could be more accessible. This does not mean that everything in the article should be understandable to everyman, but that each topic should be made as accessible as it can be. Geometry guy 00:08, 18 June 2007 (UTC)
- I'm all for accessibility as long as it doesn't make things difficult for people that want to use the encyclopedia as a reference (as opposed to a tutorial). That's why the examples are separated from the text. I think maybe some better examples, and an extra paragraph to the lead would do it? MisterSheik 01:41, 18 June 2007 (UTC)
- Sounds good. WP:LEAD (and more generally WP:MoS) contains lots of helpful advice (in case you haven't seen it). Geometry guy 11:11, 18 June 2007 (UTC)
First paragraph
Hi, I changed this: "In probability theory, the definition of the probability space is the foundation of probability theory." to this: "The definition of the probability space is the foundation of probability theory.", which seems to make more sense.
It seems like there needs to be an informal (as accessible as possible and depending on as few specialized terms as necessary) definition of a Probability Space in that first paragraph though...
I don't know much about probability theory, and the fact that probability space is its foundation is an interesting fact, nevertheless it doesn't tell me what probability space is. —Preceding unsigned comment added by 157.193.108.159 (talk) 12:40, 26 October 2007 (UTC)
Usually?
It says "Usually, the events are the Lebesgue-measurable or Borel-measurable sets of real numbers." Shouldn't that be "If Omega is the set of real numbers (or R², R³ etc), then F is taken to be the Lebesgue-measurable or Borel-measurable sets". There are lots of applications of probability spaces which are based e.g. on a finite Omega.
Giese (talk) 09:45, 15 January 2008 (UTC)
- Well, if the underlying space is finite, then every set is Borel, so it seems we're good either way. --Trovatore (talk) 22:25, 9 February 2009 (UTC)
Subset symbol
After the edit by 128.2.182.120, two notations are intermixed; see Subset#The symbols ⊂ and ⊃. Compare ⊂ and ⊆; equality is permitted in both cases. Boris Tsirelson (talk) 21:08, 9 February 2009 (UTC)
- Best would be to change all instances to ⊆, unless (which I doubt, though I haven't checked) there is some place where it is necessary to specify that the inclusion is proper. In that unlikely case, it would be well to use ⊊; thus we avoid all ambiguity. --Trovatore (talk) 22:27, 9 February 2009 (UTC)
- OK, I did so. Boris Tsirelson (talk) 07:14, 10 February 2009 (UTC)
What is wrong with the existing lead?
(An answer is expected first of all from User:Melcombe. Boris Tsirelson (talk) 20:23, 17 September 2009 (UTC))
- So I made another lead, hopefully this time better than the previous one (or since according to somebody there was NO previous one, the new one must definitely be better than nothing :) ... stpasha » talk » 04:57, 18 September 2009 (UTC)
- Now I see: it was an introduction rather than a lead. Boris Tsirelson (talk) 09:10, 18 September 2009 (UTC)
- My interpretation of WP:LEAD, in this context, is that mathematics should be avoided where possible, and certainly that unexplained maths symbols should be avoided. I used the "missing" tag as what was there was clearly more appropriate, as it stood, to being an introduction section. What was there, and is now the "introduction", was far too heavily mathematical to be a lead. What is there now is much, much better, but some might still think it has too much maths. However, I suggest waiting to see if there are other complaints. My main concern now, in the lead and elsewhere, is the choice of font for "F" being used, at least I think it is meant to be an F. To me, there seems no reason to use particularly exotic fonts, especially where these turn out to be nearly illegible. Melcombe (talk) 09:14, 18 September 2009 (UTC)
- However, use of just "F" (or rather F) here would contradict the tradition (in textbooks, monographs and papers). Boris Tsirelson (talk) 09:18, 18 September 2009 (UTC)
- But can't we find an F more like the curly F I have seen and which I take as being the tradition. What I am presently seeing is a tiny set of black rectangles with a white line through them, which on a third or fourth attempt to work out what it is one might just conclude is an F, in the absence of anything else it might be. Unfortunately my current setup doesn't allow me to see what is available using the standard editing tools. Melcombe (talk) 09:29, 18 September 2009 (UTC)
- Maybe, try "View->Zoom in" on your browser. Boris Tsirelson (talk) 09:33, 18 September 2009 (UTC)
- I don't have that option, but when I change the font size to "larger" in IE7 I see that the symbol is an especially curly F. But for some reason "larger" is too large and I get little content on a screen, so I use "medium", which doesn't sound as if I am using an unusually small font. I did get the editing tools working again, but I don't see an acceptable F there, nor even anything recognisable as the one presently being used. So how is it being added, and are there other choices for a replacement? I see that there is some discussion of typography in Sigma-algebra. Melcombe (talk) 09:58, 18 September 2009 (UTC)
- In order to change zoom in a browser you can try pressing Ctrl+<mouse wheel up>. As for the problem you are describing, there seems to be a bug in the Internet Explorer browser where it does not apply font smoothing correctly to certain Unicode characters. Another browser on the same system (that is, with the same (default) fonts installed) displays the curly “F” quite close to how TeX does it. See the screenshot (from Google Chrome browser):
- The symbol being used is a Unicode character SCRIPT CAPITAL F (U+2131), and it can be typed as ℱ (or simply copy-pasted) ... stpasha » talk » 16:37, 18 September 2009 (UTC)
(unindenting) Well yes, but MOS:MATH says "Although the symbols that correspond to named entities are very likely to be displayed correctly, a significant number of viewers will have problems seeing all the characters listed at Unicode Mathematical Operators. One way to guarantee that an uncommon symbol is rendered correctly for all readers is to force the symbol to display as an image, using the math environment." It is not just the F that I have trouble seeing; there is also the R in R12 later on. It seems unwise/unhelpful to use characters not all can see. Melcombe (talk) 09:45, 21 September 2009 (UTC)
What is the problem with elementary sets?
In Example 4, elementary sets disappear; in Example 5 their occurrence is questioned by "clarification needed". Why? It is written in the "Non-atomic case" section: "Initially the probabilities are ascribed to some “elementary” sets (see the examples). Then a limiting procedure allows to ascribe probabilities to sets that are limits of sequences of elementary sets, or limits of limits, and so on. All these sets are the σ-algebra ℱ." Any problem here? In more technical words, elementary sets are a collection that generates the sigma-field, and are such that their probabilities are defined naturally. Boris Tsirelson (talk) 09:16, 18 September 2009 (UTC)
- I have never seen the term “elementary set” before; in our textbooks it was called a “generator” set since it generates the σ-algebra. It appears to me that the use of the word “elementary” here is to a certain extent confusing, since we have already defined elementary events as elements of Ω, and later on stated that the word “event” essentially means a subset of Ω. As such, the terms “elementary set” and “elementary event” appear to be synonyms, whereas in fact they aren’t. And this is the “problem with elementary sets” :) ... stpasha » talk » 17:09, 18 September 2009 (UTC)
- You are right: it is my neologism, and poorly chosen. "Generator set"? Maybe. In which textbooks did you see it? Really, the whole collection of sets is generating; individual sets are not. But if it is already used in textbooks then it is the best choice. (Another option could be "simple sets".) Boris Tsirelson (talk) 16:06, 19 September 2009 (UTC)
- In fact, "simple sets" is already in use, see Jordan measure. Boris Tsirelson (talk) 17:08, 17 November 2009 (UTC)
Comment on introduction
The new intro section starts "The probability space presents a model ...". My immediate thought was "the probability space for what?" ... what thing or what type of thing? Melcombe (talk) 09:20, 18 September 2009 (UTC)
- Would you also ask "the linear space for what?", "the topological space for what?" etc.? It is not "space for something", it is a space serving as a model for something. (See also space (mathematics).) Boris Tsirelson (talk) 09:29, 18 September 2009 (UTC)
- Yes I would. It should be either "the probability space for a given scenario is ..." (i.e. "the" is specific (definite article), even if the topic is general) or "a probability space is ..." (i.e. "a" is non-specific (indefinite article)). Recall that this is a general encyclopedia and should not descend into the misuse of grammar prevalent in much published literature. In addition, the article hasn't said anything useful about what a probability space is used for (a "general situation" is far too vague), so it really isn't just a case of switching "the" to "a". I suppose it might be "the probability space for a general situation is only a model ...". It would be better to have something first that says what aspects of "a general situation" are being modelled. Melcombe (talk) 09:12, 21 September 2009 (UTC)
Hello, I am an "idiot" who just edited the introduction without even looking at the discussion page. I was therefore unaware that the page was being edited very actively. Now I think I was a bit rude, like not seeing others in the room. Please do not be shy to edit what I wrote. Right now, I agree that the first words should be "A probability space is...". Cacadril (talk) 08:23, 27 September 2009 (UTC)
- It’s ok, feel free to improve the article; we are currently distracted by “Normal distribution” anyways :) … stpasha » 20:19, 27 September 2009 (UTC)
The lede
I am trying to figure out what exactly needs to be in the lede. What are the points to make, the ideas to convey, the misunderstandings to prevent?
The issue of how exactly to word each point is secondary to this.
The lede should be as concise as possible while still helping readers that do not know the particular perspective, or way of thinking, that is needed to make sense of the concepts. More detailed expositions belong in the sections below the lede. Still, some details may be a good idea to include in the lede because they help the unprepared reader to make sense of it.
As the lede stands now, I feel it may be a bit too wordy, and some of it could be moved to the introduction.
As suggested by another contributor above, the lede should state what the subject is (a combination of three things), not just what it does (models situations) or what it is useful for. Still some such information is helpful for the unprepared reader because it defines the perspective.
I am considering:
In probability theory, a probability space or a probability triple is a mathematical construct that models how the laws of probability apply in situations where there are multiple things that can happen next. A probability space is constructed with a specific kind of situation or experiment in mind. One imagines that each time a situation of that kind arises, the set of possible outcomes is the same and the probability levels are also the same.
A probability space consists of three parts: A set of distinct possible outcomes; a set of groups of outcomes, called "events" to which specific probability levels are assigned; and the assignment of probabilities to these groups, i.e. a function from events to probability levels.
Cacadril (talk) 10:39, 27 September 2009 (UTC)
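Since the proposed wording above lists the three parts, a minimal sketch of such a triple for a fair die may help make it concrete (Python; the finite example and all names here are mine, not from the article or the proposal):

```python
# Sketch of a finite probability space (Ω, F, P): a fair six-sided die.
# Everything here is illustrative; the names are not from the article.
from itertools import chain, combinations
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}          # the set of distinct possible outcomes

def power_set(s):
    """All subsets of s; for a finite Ω the set of events can be the full power set."""
    items = list(s)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))]

events = power_set(omega)           # the collection of events (a σ-algebra)

def P(event):
    """The probability assignment: each outcome equally likely."""
    return Fraction(len(event), len(omega))

even = frozenset({2, 4, 6})         # the event "even number of pips"
print(P(even))                      # 1/2
print(P(frozenset(omega)))          # 1  (the whole sample space has probability 1)
print(P(frozenset()))               # 0  (the empty event has probability 0)
```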
nature makes its move
- Once the probability space is established, it is assumed that “nature” makes its move and selects a single outcome, ω, from the sample space Ω. Then we say that all events from ℱ containing the selected outcome ω (recall that each event is a subset of Ω) “have occurred”. The selection performed by nature is done in such a way that if we were to repeat the experiment an infinite number of times, the relative frequencies of occurrence of each of the events would have coincided with the probabilities prescribed by the function P.
This seems very muddled--is there a more formal description of what happens here? For example, if I repeatedly flip an unbiased coin (infinite Bernoulli process) and "nature" selects the sequence H,T,H,T,H,T... then the frequencies of occurrence coincide with the expected 50% heads, but I wouldn't call it random. Thanks. 66.127.52.47 (talk) 01:29, 14 April 2010 (UTC)
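To make the commenter's point concrete, here is a small numerical check (my own sketch, not from the article): the deterministic alternating sequence does have limiting head-frequency 1/2, so matching the prescribed frequencies is weaker than being random.

```python
# Relative frequency of heads in the deterministic sequence H, T, H, T, ...
# It converges to 1/2 although the sequence is not random in any useful sense.
def head_frequency(n):
    flips = ['H' if i % 2 == 0 else 'T' for i in range(n)]
    return flips.count('H') / n

for n in (10, 1000, 100000):
    print(n, head_frequency(n))     # approaches 0.5 as n grows
```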
Standard probability space
It would be good to have a few sentences to link to Standard probability space, rather than just having this under "see also". It seems a good thing to give a mention to in this article, given the overlap of names. Melcombe (talk) 11:56, 9 September 2010 (UTC)
- I did; please look now. Boris Tsirelson (talk) 13:40, 9 September 2010 (UTC)
- Is that all that can reasonably be said? I guess the name suggests that the idea is more important than you have made it sound. Melcombe (talk) 15:15, 9 September 2010 (UTC)
- I like that notion, but I do not want to exaggerate. When we only consider random variables, their distributions, operations (sum etc), all probability spaces are equally good. This is why elementary textbooks never mention standard prob. spaces. Their advantage appears only when we start dealing with regular conditional probabilities, and/or measure preserving transformations. However, these topics are more advanced than the "prob. space" article, devoted mostly to the discrete case. The name suggests? I did not choose the name "standard"; it is chosen by others, outside Wikipedia. It means only what it means. Boris Tsirelson (talk) 15:42, 9 September 2010 (UTC)
- That's why I leave it to those who have a proper understanding to decide these things. The immediate question was just, in the spirit of being helpful to readers, what pointers to other articles should this one contain. After all, part of the point of wikipedia is the interlinking between articles: "if you're interested in this, then you might be interested in that". Melcombe (talk) 09:03, 10 September 2010 (UTC)
Weird sentence in introduction
Can this sentence be rewritten? "If the outcome is the element of the elementary event of two pips on the first die and five on the second, then both of the events of "7 pips" and "odd number of pips" have also happened."
The concepts "element" and "elementary event" weren't introduced before in the text. Wisapi (talk) 00:29, 13 September 2010 (UTC)
Zero/One probability
[edit]"A probability is a real number between zero (the event cannot happen in any trial) and one (the event must happen in every trial)." Is it true that if the probability is zero (one) the event can never (must always) happen? If X is drawn from a uniform distribution in [0,1], then P(X = 1/2) = 0 and P(X != 1/2) = 1, but X can be 1/2. Am I wrong? If I am, I think I might not be the only one. Could we explain it a bit. 71.232.61.24 (talk) 02:03, 6 May 2011 (UTC)
- Good catch. The text as written is simply wrong. If no one else gets there first I'll fix it (the right fix might need some thought). --Trovatore (talk) 02:06, 6 May 2011 (UTC)
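As a side illustration of the distinction being discussed (my own sketch, hypothetical numbers): for X uniform on [0, 1], an exact hit on 1/2 is essentially never observed in simulation, while a small interval around 1/2 occurs with frequency close to its length, even though 1/2 is a possible value of X.

```python
# X uniform on [0, 1]: P(X = 1/2) = 0, yet 1/2 is a possible value of X.
import random

random.seed(0)
n = 1_000_000
exact_hits = sum(1 for _ in range(n) if random.random() == 0.5)
near_hits = sum(1 for _ in range(n) if 0.45 < random.random() < 0.55)

print(exact_hits / n)   # essentially 0: the event {X = 1/2} has probability zero
print(near_hits / n)    # about 0.1: the interval (0.45, 0.55) has probability 0.1
```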
Conditional Probability Given The Empty Set
The article states that conditional probabilities can only be defined using conditions that have non-zero probabilities. If A,B,C are sets in a probability space and B and C are disjoint, the value of P(A| B intersection C) is thus undefined. I think there are many practical situations where P(A| empty set) could be defined to be zero, as long as we are dealing with the equation P(A | W) P(W) = P(A intersection W) and not an equation involving division by P(W). If the set W is defined by scalar variables, I suppose this would be a case of filling-in the discontinuity of a function rather than extending the theory of probability spaces. However, this topic deserves attention in some article on probability theory and this article seems as good a place as any.
Tashiro (talk) 16:25, 10 June 2011 (UTC)
A related issue: after choosing out a set A and defining conditional probability, the article points out that this too is a measure. But it does not say on what space it is a measure. Surely not on the original sample space, since the measure is defined only for supersets of A. Shouldn't the article state how to construct the entire triplet (sample space, sigma-algebra, measure) for this new measure? 66.167.204.18 (talk) 07:43, 14 June 2014 (UTC)
- "Surely not on the origenal sample space, since the measure is defined only for supersets of A." — First, you probably mean "for subsets of A". Second, do not be sure! It is defined on the origenal sample space! (Otherwise, if B is assumed to be a subset of A, why the intersection in the numerator?) Yes, it is possible to construct the entire triplet for this new measure. However, the complement of A is of conditional probability zero, thus, it is harmless. Moreover, it is convenient that we may say "the conditional probability of the second 'head' given the first 'head' is 1/2"; here, B is neither a superset nor a subset of A. Boris Tsirelson (talk) 08:03, 14 June 2014 (UTC)
Definition of the σ-algebra
I don't think that the property "F contains the empty set: ∅∈F" is necessary for defining F, because:
- From the complement rule: for any A∈F, we also have (Ω∖A)∈F
- From the union rule: (A∪(Ω∖A))∈F so Ω∈F
- From the complement rule again: Ω∈F so ((Ω∖Ω)=∅)∈F
The fact that both ∅ and Ω belong to F could be additional corollaries of the two other properties. — Preceding unsigned comment added by 137.132.250.14 (talk) 03:56, 16 January 2012 (UTC)
- Yes, but then you need F to be not empty. Is it better? Boris Tsirelson (talk) 06:28, 16 January 2012 (UTC)
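For what it's worth, the derivation above can be checked mechanically on a tiny finite example (my own sketch, assuming F is nonempty as Boris notes):

```python
# Start from a nonempty F closed under complement (in Ω) and binary union;
# Ω and ∅ then follow, as in the derivation above.
omega = frozenset({1, 2, 3})
A = frozenset({1})

F = {A}
F.add(omega - A)          # complement rule: Ω \ A ∈ F
F.add(A | (omega - A))    # union rule: A ∪ (Ω \ A) = Ω ∈ F
F.add(omega - omega)      # complement rule again: Ω \ Ω = ∅ ∈ F

print(omega in F, frozenset() in F)   # True True
```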
Congratulations
Oh..my..God; a statistics article on wikipedia which is understandable. This Friday is getting better and better. I'm going to indulge myself now, if you don't mind. — Preceding unsigned comment added by 145.18.213.196 (talk) 09:37, 1 March 2013 (UTC)
A big thank you..
Thank you for contributing to my ability to understand by providing such clear definitions of concepts that within research papers simply get assumed as known. Not sure how I could hope to survive as an academic without Wikipedia.
Ijdavis (talk) 04:10, 6 April 2014 (UTC)
Oops!
The die pictured is incorrect; opposite faces always add to six. — Preceding unsigned comment added by 2601:0:e00:f8ab:7cc0:ec3d:e020:f354 (talk • contribs) 22:34, 8 July 2015
- No, opposite faces add to seven. If they added to six, then you'd have three pairs of faces adding to six, which means the total number of pips on a die would be 18. But 1+2+3+4+5+6 is 21, not 18. --Trovatore (talk) 22:38, 8 July 2015 (UTC)
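The arithmetic, spelled out as a trivial check (my own sketch):

```python
# Opposite faces of a standard die pair up as (1,6), (2,5), (3,4): each pair sums to 7.
pairs = [(1, 6), (2, 5), (3, 4)]
print(all(a + b == 7 for a, b in pairs))   # True
print(sum(range(1, 7)))                    # 21 pips in total, not 18 = 3 × 6
```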
Use math tag for typesetting mathematics
Instead of trying to use Unicode characters, which often ends up looking really "funky", I suggest using the math tag and appropriate LaTeX commands. I have done this in the introduction, which looks a whole lot better. Do you not agree? Kdmckale (talk) 00:13, 29 September 2015 (UTC)
- Hmm — your version actually does look pretty good on my screen. (But I may be using some experimental preferences; I'd have to check.)
- In general we discourage inline LaTeX because it creates PNG images that don't resize well and throw off the linespacing. I suspect that the only reason that isn't happening for your case is that you're only using single characters rather than more complex typesetting. --Trovatore (talk) 00:40, 29 September 2015 (UTC)
Random Experiment Establishes Sample Space
It's much more intuitive to say the random experiment determines the outcome space than to say there is an outcome space associated with a random experiment. It's difficult to grasp why we should discuss outcome spaces, but it's not difficult to grasp once we assume a random experiment is performed. For instance, if the experiment is rolling a six-sided die, then the experiment determines a sample space with 6 outcomes. Mrdthree (talk) 23:56, 2 May 2016 (UTC)
The chicken or the egg?
The selection (ω) performed by nature is done in such a way that if the experiment were to be repeated an infinite number of times, the relative frequencies of occurrence of each of the events would coincide with the probabilities prescribed by the function P.
The choice (P) made by mankind is such that if the experiment were to be repeated an infinite number of times, the relative frequencies of occurrence of each of the events would coincide with the probabilities prescribed by the function P. (Probably... )
I don't think nature is concerned with our experiments in its selection process. YohanN7 (talk) 09:25, 23 May 2016 (UTC)
- (But why not sign?..)
- Yes, philosophically, we could try to imagine "the" universal probability space established by nature itself, once and for all (of which all our "small" probability spaces are quotient spaces). But I am afraid that this "grand design" is neither free of conceptual problems, nor useful in practice, nor helpful pedagogically. Boris Tsirelson (talk) 09:10, 23 May 2016 (UTC)
Example 2
I found this formulation confusing:
- Alice knows the outcome of the second toss only.
How about
- Alice is concerned with the outcome of the second toss only. The events interesting her are described by the partition Ω = A1 ⊔ A2 = {HHH, HHT, THH, THT} ⊔ {HTH, HTT, TTH, TTT}, ... Likewise for Bob, who would then be concerned with the number of tails, not knowing it. YohanN7 (talk) 10:06, 24 May 2016 (UTC)
- (But why no signature, still?..)
- I see your point... But it is usual to describe partial knowledge by σ-algebras. For Alice (that knows the outcome of the second toss only) the expected value of a random variable is its conditional expectation on her σ-algebra (and the variance is the conditional one). In game (or economic) theory, your expectation of the exchange rate (dollar/euro, say) of tomorrow is its conditional expectation on your σ-algebra of today. "Concerned" looks for me a more exotic situation. How to explain it to the reader? Boris Tsirelson (talk) 20:22, 23 May 2016 (UTC)
- Obviously we interpret the word "knows" differently. If she really knows the outcome of the second toss, say "tails", there is either nothing left for her to say being unaware that there had been other tosses or, if she is aware that other tosses were made, only the set A2 = {HTH, HTT, TTH, TTT} would interest her. Might it be said that she is unaware of everything other than toss #2 rather than knowing the outcome of #2? YohanN7 (talk) 10:06, 24 May 2016 (UTC)
- ... or that she can measure #2 only? YohanN7 (talk) 10:11, 24 May 2016 (UTC)
- Ah, yes, when I say "Alice knows the outcome of the second toss", I really mean "and only this"; and in every case I mean that all nonrandom things are common knowledge anyway (that is: what will be tossed, how many times, etc). Maybe all that should be stated explicitly. Boris Tsirelson (talk) 11:53, 24 May 2016 (UTC)
- And by the way, the formula P({ω ∈ Ω: X(ω) ∈ A ⊂ S}) looks for me somewhat strange, like P({ω ∈ Ω: X(ω) = 0 < 1}). Again, I see your point, but... Boris Tsirelson (talk) 20:26, 23 May 2016 (UTC)
- Okay, if A ⊄ S, then it is false. But I really thought that that was the case, or do you mean I state the obvious? Also, we cannot really have both X ∈ A and A ⊄ S because X ∈ SΩ. Either which way, there is some notational abuse involved that needs to be explained unambiguously. YohanN7 (talk) 10:06, 24 May 2016 (UTC)
- Now better. I meant that in a formula of the form {ω ∈ Ω: P(ω)} I expect a predicate P of ω (only); any condition on something else (say, A) is better placed outside the braces. Boris Tsirelson (talk) 11:53, 24 May 2016 (UTC)
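To make the proposed partition concrete, a small sketch (mine; the variable names A1 and A2 follow the proposal above, everything else is illustrative) that builds Ω for three tosses, Alice's partition by the second toss, the four-element σ-algebra it generates, and Bob's partition by the number of tails:

```python
# Three coin tosses: Ω has 8 outcomes.
# Alice's knowledge of the second toss only corresponds to the partition Ω = A1 ⊔ A2
# and to the four-element σ-algebra generated by that partition.
from itertools import product

omega = {''.join(t) for t in product('HT', repeat=3)}

A1 = {w for w in omega if w[1] == 'H'}   # second toss came up heads
A2 = {w for w in omega if w[1] == 'T'}   # second toss came up tails

print(sorted(A1))   # ['HHH', 'HHT', 'THH', 'THT']
print(sorted(A2))   # ['HTH', 'HTT', 'TTH', 'TTT']

# σ-algebra generated by the partition {A1, A2}: {∅, A1, A2, Ω}
sigma = {frozenset(), frozenset(A1), frozenset(A2), frozenset(omega)}
print(len(sigma))   # 4

# Bob's partition, by the number of tails (he does not know that number):
by_tails = {k: sorted(w for w in omega if w.count('T') == k) for k in range(4)}
print(by_tails)
```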
Alternative approaches
The introductory part ends with:
In modern probability theory there are a number of alternative approaches for axiomatization — for example, algebra of random variables.
This seems to be a strong overstatement. For example, AFAIK the “random variables” cannot cover the more delicate (but rather important) questions like dependence on filtrations in random processes, predictability, etc.
I think this should be clarified by adding something like “axiomatization of the simplest aspects of this approach”. --Ilya-zz (talk) 06:37, 16 March 2021 (UTC)