# OT: Sampling in Stats

Started by ● September 23, 2008

When the Stats people sample data for analysis they have a load of rules for population size etc. However, they never have to filter the data first to avoid aliasing. Is this because the data is already in "digital" format?

Hardy

Reply by ● September 24, 2008

On Tue, 23 Sep 2008 19:23:06 -0700 (PDT), HardySpicer <gyansorova@gmail.com> wrote:

> When the Stats people sample data for analysis they have a load of
> rules for population size etc. However, they never have to filter the
> data first to avoid aliasing. Is this because the data is already in
> "digital" format?
>
> Hardy

I think most stats aren't time-domain signals or a similar kind of thing where there is a fixed order to the data points. For those that are, when converting from one timeframe to another (effectively doing sample rate conversion), no doubt the appropriate filtering and such should be done.

It might be helpful to list a few concrete examples that we could discuss and compare between statistics and DSP. With the subjects I often see here ("white noise," "pink noise," correlation and whatnot), there appears to be some overlap between what is done with statistics and what is done with DSP.
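The sample-rate-conversion point can be sketched numerically. A minimal NumPy example (the 1000 Hz rate, the 400 Hz tone, and the decimation factor are made-up illustrative values, not from the thread): subsampling without an anti-alias filter folds a tone the new rate cannot represent down to a false frequency.

```python
import numpy as np

fs = 1000.0                          # original sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 400.0 * t)    # a 400 Hz tone

# Naive "sample rate conversion": keep every 4th point with no anti-alias
# filter. The new rate is 250 Hz, so the Nyquist limit is 125 Hz and the
# 400 Hz tone cannot be represented; it folds down to 100 Hz instead.
q = 4
y = x[::q]
fs_new = fs / q

# Locate the dominant frequency of the decimated signal.
spectrum = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), d=1.0 / fs_new)
peak = freqs[np.argmax(spectrum)]
print(peak)   # 100.0, not 400.0: the tone has aliased
```

A proper rate converter would low-pass the signal below 125 Hz before discarding points, which is exactly the filtering step the original question asks about.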

Reply by ● September 24, 2008

On Sep 23, 10:23 pm, HardySpicer <gyansor...@gmail.com> wrote:

> When the Stats people sample data for analysis they have a load of
> rules for population size etc. However, they never have to filter the
> data first to avoid aliasing. Is this because the data is already in
> "digital" format?
>
> Hardy

Hey,

Like Ben Bradley said: when stats people refer to sample and population size, they are typically referring to sampling as in drawing (collecting) a particular *instance* of a random variable. They are not sampling in the analog-to-digital sense but in the statistical sense, i.e., an instance or a "run" of an experiment. For example, a sample could be a random card value from a deck or a number from a dice roll.
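The statistical sense of "sampling" can be sketched in a few lines of Python (the sample size and seed are arbitrary choices for the sketch): each draw is an independent instance of the random variable, with no time axis and no ordering, so there is nothing to anti-alias.

```python
import random

random.seed(0)   # fixed seed so the sketch is reproducible

# The population: the faces of a fair six-sided die.
population_mean = sum(range(1, 7)) / 6        # exactly 3.5

# A statistical "sample": n independent draws (instances) of the
# random variable "one roll of the die".
n = 10_000
sample = [random.randint(1, 6) for _ in range(n)]
sample_mean = sum(sample) / n
print(sample_mean)   # close to 3.5
```

The stats rules about sample size govern how close `sample_mean` is likely to be to `population_mean`; they play the role that the sampling theorem plays in DSP, but for estimation error rather than for aliasing.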

Reply by ● September 24, 2008

On Sep 24, 4:13 pm, Ikaro <ikarosi...@hotmail.com> wrote:

> Like Ben Bradley said: when stats people refer to sample and
> population size, they are typically referring to sampling as in
> drawing (collecting) a particular *instance* of a random variable.
> They are not sampling in the analog-to-digital sense but in the
> statistical sense, i.e., an instance or a "run" of an experiment.
> For example, a sample could be a random card value from a deck or a
> number from a dice roll.

Well, a die is discrete already, having only one of six possibilities. What about the weather forecast? Rainfall measurement (I suppose that's discrete too). OK, temperature: it gets measured at regular intervals, which makes the process sampled. How often should one take readings?

Reply by ● September 24, 2008

"HardySpicer" <gyansorova@gmail.com> wrote in message news:b768474a-fc5b-40a9-8313-2feac8f28fd6@n33g2000pri.googlegroups.com...

> On Sep 24, 4:13 pm, Ikaro <ikarosi...@hotmail.com> wrote:
>> Like Ben Bradley said: when stats people refer to sample and
>> population size, they are typically referring to sampling as in
>> drawing (collecting) a particular *instance* of a random variable.
>> They are not sampling in the analog-to-digital sense but in the
>> statistical sense, i.e., an instance or a "run" of an experiment.
>> For example, a sample could be a random card value from a deck or a
>> number from a dice roll.
>
> Well, a die is discrete already, having only one of six possibilities.
> What about the weather forecast? Rainfall measurement (I suppose
> that's discrete too). OK, temperature: it gets measured at regular
> intervals, which makes the process sampled. How often should one take
> readings?

Think in terms of "statistical independence." All first-order statistical analysis assumes this property of its samples. In the real world, many measurements depend not only on the current conditions but also on the history of those conditions. To get statistical independence, you must sample *slowly* enough that the thing being measured has lost its memory of the past. A "fair" die doesn't remember what its last throw was. Temperature is an example of a measurement that depends on its past; the connection is known as "thermal mass."
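As an illustrative sketch of that memory argument (the AR(1) process, its coefficient, and the thinning factor are made-up stand-ins for a temperature reading smoothed by thermal mass, not anything from the thread): consecutive readings of a process with memory are highly correlated, but readings thinned to a slow enough rate are nearly independent.

```python
import numpy as np

rng = np.random.default_rng(42)

# A toy process with memory: AR(1), x[i] = phi * x[i-1] + noise.
# With phi = 0.95 each reading strongly resembles the previous one,
# a rough stand-in for temperature smoothed by thermal mass.
phi, n = 0.95, 200_000
noise = rng.standard_normal(n)
x = np.empty(n)
x[0] = noise[0]
for i in range(1, n):
    x[i] = phi * x[i - 1] + noise[i]

def lag1_corr(v):
    """Correlation between consecutive readings of v."""
    return np.corrcoef(v[:-1], v[1:])[0, 1]

# Back-to-back readings are far from independent...
print(lag1_corr(x))          # about 0.95
# ...but keeping only every 100th reading ("sampling slowly enough")
# leaves samples that are nearly independent: 0.95**100 is about 0.006.
print(lag1_corr(x[::100]))   # near zero
```

So the stats answer to "how often should one take readings?" runs opposite to the DSP one: for independent samples you wait longer than the process's memory, whereas for faithful reconstruction you sample faster than twice its bandwidth.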