I know I've seen this discussed.
But I do not know what it's called, so I cannot Google it.
When we clock in, the payroll computer gets a time of "hour" + 0|15|30|45
[ie quantized in 15 minute increments]
For a class of employees [including yours truly] there is no other
restriction of clockin/clockout times. For the purpose of this question
it is *explicitly assumed* that all clockin/clockout times have a random
distribution.
There is a simple set of changes that would reduce the actual average
workday of a class of employees by 6 to 10 minutes per day {7 days/week
& 52 wks/yr} [up to at least an hour per week for a SPECIFIC job
classification]
I understand management well enough to expect them to dismiss this, as
the average savings per day is less than the quantizing error of the timeclock.
How do I demonstrate otherwise?
What term should I be searching for?
Is anything aimed at a non-tech reader?
Thank you.
[PS comp.dsp has taught me at least one thing ;]
I state more explicit thank you's
you teach old dogs *WOOF WOOF* ;/
Time cards and sampling theorems
Started by ●June 28, 2006
Reply by ●June 29, 2006
Richard Owlett wrote:
> When we clockin the payroll computer gets a time of "hour" + 0|15|30|45
> [ie quantized in 15 minute increments]
> [...]
> I understand management well enough to expect them to dismiss this as
> average savings per day is less than quantizing error of timeclock.
>
> How do I demonstrate otherwise?

If the times are random, then the quantizing error is + and - and will
average (integrate) toward zero over the year.

On the other hand, the real workday reduction will be - every day, and
will integrate to a larger and larger reduction over the year. I.e., a
+/- 6 minute per day quantizing error will vary around zero but will not
grow; over a year it will be some number, say 10 minutes for example.
The 6 minute workday reduction, by contrast, will accumulate to 1500
minutes over 250 days.

The 1500 minute reduction will clearly be measurable despite the
quantizing.

Demonstrate this to management using a spreadsheet where you add a
random error and the real reduction, and graph the results.

Mark
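Mark's spreadsheet suggestion can be sketched in a few lines. This is my own illustration (not from the thread), assuming 250 working days, a systematic 6-minute daily reduction, and a per-day rounding error uniform over ±7.5 minutes:

```python
import random

random.seed(1)

DAYS = 250        # working days in a year
REDUCTION = 6.0   # assumed minutes saved per day by the proposed change

cum_quant_error = 0.0
cum_reduction = 0.0
for _ in range(DAYS):
    # a swipe landing uniformly within a 15-minute bin rounds with an
    # error somewhere in [-7.5, +7.5) minutes
    cum_quant_error += random.uniform(-7.5, 7.5)
    cum_reduction += REDUCTION

print(f"cumulative quantization error: {cum_quant_error:+.1f} min")
print(f"cumulative real reduction:     {cum_reduction:+.1f} min")
```

With any seed, the random term wanders near zero (it grows only like the square root of the day count) while the systematic term climbs linearly to 1500 minutes, which is the whole argument.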
Reply by ●June 29, 2006
Mark wrote:
> if the times are random then the quantizing error is + and - and will
> average (integrate) towards zero over the year.

This is understood by all involved, and was discussed when we went from
a system that _theoretically_ recorded "exact" start/stop times.
Management gave a good explanation of it when our timekeeping
procedures were changed.

> on the other hand, the real work day reduction will be - every day and
> will integrate to a larger and larger reduction over the year.

The integration over time is the key point.

> The 1500 minute reduction will clearly be measurable despite the
> quantizing...

You know this. I know this.

> demonstrate this to management using a spreadsheet where you add a
> random error and the real reduction and graph the results...

How? ;) I'm not a "creative" type, but I "edit" well.

[ ITEM: >40 yrs ago, while failing Freshman English due to an inability
to create compositions, I was assisting an English Honors major with his
mechanics.
ITEM: I'm not much of an original programmer, but I do well modifying
existing code for new situations. ]
Reply by ●July 1, 2006
Richard Owlett wrote:
> When we clockin the payroll computer gets a time of "hour" + 0|15|30|45
> [ie quantized in 15 minute increments]

Rounded or truncated? Do the employees know which, and can they see the
actual value of the clock?

> For the purpose of this question it is *explicitly assumed* that all
> clockin/clockout times have a random distribution.

The usual solution for quantized systems is dithering, where a random
value (noise) slightly larger than the quantization value is added
before quantization.

Otherwise, arrive every day at 7 minutes after the hour and leave 7
minutes before the hour, or other appropriate quarter hour (if rounded).

-- glen
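glen's dithering remark can be demonstrated numerically. The sketch below is my own (assuming a 15-minute step and rounding-to-nearest): a fixed off-grid time always rounds to the same wrong bin, while adding one step of uniform noise before rounding makes the long-run average come out right.

```python
import random

random.seed(2)
STEP = 15.0  # quantization step, in minutes

def quantize(t):
    # round to the nearest multiple of STEP
    return STEP * round(t / STEP)

def quantize_dithered(t):
    # add uniform noise spanning one full step before rounding
    return quantize(t + random.uniform(-STEP / 2, STEP / 2))

true_time = 37.0  # minutes, deliberately off the 15-minute grid
n = 100_000
plain = quantize(true_time)  # always 30.0, a persistent -7 min bias
dithered = sum(quantize_dithered(true_time) for _ in range(n)) / n
print(plain, dithered)
```

Individual dithered readings are noisier, but their average converges to the true 37 minutes; that trade (bias for noise) is exactly what dither buys in any quantized system.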
Reply by ●July 1, 2006
glen herrmannsfeldt wrote:
> Rounded or truncated? Do the employees know which, and can they see
> the actual value of the clock?

Richard obviously knows the principle. Expect others to know, too.

> > For the purpose of this question it is *explicitly assumed* that all
> > clockin/clockout times have a random distribution.

This is the flaw. If you know about the clock scheme, others do, too,
and one might assume they try to abuse it.

> Otherwise, arrive every day at 7 minutes after the hour and leave 7
> minutes before the hour, or other appropriate quarter hour (if rounded).

This is the kind of thing you need to demonstrate: a "smart" worker who
exploits the system. How to (dis)prove that these things happen? Record
the times when people stamp their cards, to within-second accuracy. Do
that for a couple of months, and maybe some patterns emerge.

Rune
Reply by ●July 1, 2006
OK, I'll bite. Others seem to know what you are asking; I can't figure
it out.

Richard Owlett wrote:
> For the purpose of this question it is *explicitly assumed* that all
> clockin/clockout times have a random distribution.

"For the purpose of this question" implies there is a question. What is
the question?

> There is a simple set of changes that would reduce the actual average
> workday of a class of employees by 6 to 10 minutes per day

Is this supposed to be the question? To me, it looks like a statement.
Besides that, it doesn't make much sense. When you say "reduce the
actual average workday", do you mean reduce the amount of time actually
worked, or reduce the time credited for working? And what is the
"simple set of changes" you refer to?

It is clearly more accurate to record the time in minutes (seconds even
better) and round to the nearest 15 min at the end of the pay period. Of
course, that increased accuracy isn't going to work in the employee's
favor, since the current method of accounting just signals the employee
to adjust his or her target arrival and departure times. Assuming the
employee can predictably and accurately control his/her
arrival/departure, he/she could be getting paid for close to 15 min per
day more than the actual time the time clock records.

> What term should I be searching for?

Without knowing exactly what you are driving at, that's difficult to
say. But this is not really a quantization problem but a sampling
problem. The term you are looking for might be 'aliasing'. There are two
quantized states for an employee: he/she is either present or not. What
you are discussing is whether you sample every 15 minutes or every
minute to determine which state the employee is in. The inaccuracy of
sampling at 15 min intervals is due to aliasing, not quantizing. The
employee can exploit this aliasing by choosing a frequency for
arrival/departure that works in his/her favor.

-jim
Reply by ●July 1, 2006
jim wrote:
> Assuming the employee can predictably and accurately control his/her
> arrival/departure, he/she could be getting paid for close to 15 min per
> day more than the actual time the time clock records.

Sorry, that was stated backwards. It should say "could be getting paid
for close to 15 min per day more than the actual time present on the
job". That is, the time clock shows more time than the employee is
present.

-jim
Reply by ●July 2, 2006
glen herrmannsfeldt wrote:
> Rounded or truncated?

Rounded.

> Do the employees know which, and can they see the actual value of the
> clock?

Yes, and yes with qualification. [The actual time of the card swipe is
displayed, and the actual time is printed on the pay stub. All
calculations are done using the "quantized" time.]

> Otherwise, arrive every day at 7 minutes after the hour and leave 7
> minutes before the hour, or other appropriate quarter hour (if rounded).

Ahh, you are answering a question other than the one I was *attempting*
to ask ;) You did not quote the two paragraphs that were the core of why
I'm asking ;{ To wit, quoting myself:

> > There is a simple set of changes that would reduce the actual
> > average workday of a class of employees by 6 to 10 minutes per day
> > {7 days/week & 52 wks/yr} [up to at least an hour per week for a
> > SPECIFIC job classification]
> >
> > I understand management well enough to expect them to dismiss this
> > as average savings per day is less than quantizing error of
> > timeclock.

I want them to spend money.
*MY* goal is improvement of my work environment.
*THEIR* goal is to reduce *TOTAL* expenses.

How can I show management that a *COST* which reduces a *per shift*
labor cost by less than the quantization amount will actually save them
money?

There are second-order cost savings possible, but I've enough common
sense not to even raise them as issues. Let it come as a pleasant
surprise if they take my advice.

This is what I believe is an 'engineering mindset' being applied to a
'personnel' issue: most bang for the buck.
Reply by ●July 2, 2006
Richard Owlett wrote:
> I want them to spend money.
> *MY* goal is improvement of my work environment.
> *THEIR* goal is to reduce *TOTAL* expenses.
>
> How can I show management that a *COST* which reduces a *per shift*
> labor cost by less than the quantization amount will actually save
> them money?

Let it go. Game the system if you can and choose to. If management
doesn't think your ideas are worthwhile, they'll put you down as a pest.
If you succeed in convincing management that you have better insight
than they do, that will also convince them that you're dangerous.

Jerry
--
Heretics must be suppressed not because they're probably wrong, but
because they might be right.