mamcx
24-01-2013, 00:04:10
I ran across this article again; it lays out some mistaken assumptions programmers make when handling time in their programs:
http://infiniteundo.com/post/25326999628/falsehoods-programmers-believe-about-time
There are always 24 hours in a day.
Months have either 30 or 31 days.
Years have 365 days.
February is always 28 days long.
Any 24-hour period will always begin and end in the same day (or week, or month).
A week always begins and ends in the same month.
A week (or a month) always begins and ends in the same year.
The machine that a program runs on will always be in the GMT time zone.
Ok, that’s not true. But at least the time zone in which a program has to run will never change.
Well, surely there will never be a change to the time zone in which a program has to run in production.
The system clock will always be set to the correct local time.
The system clock will always be set to a time that is not wildly different from the correct local time.
If the system clock is incorrect, it will at least always be off by a consistent number of seconds.
The server clock and the client clock will always be set to the same time.
The server clock and the client clock will always be set to around the same time.
Ok, but the time on the server clock and time on the client clock would never be different by a matter of decades.
If the server clock and the client clock are not in synch, they will at least always be out of synch by a consistent number of seconds.
The server clock and the client clock will use the same time zone.
The system clock will never be set to a time that is in the distant past or the far future.
Time has no beginning and no end.
One minute on the system clock has exactly the same duration as one minute on any other clock.
Ok, but the duration of one minute on the system clock will be pretty close to the duration of one minute on most other clocks.
Fine, but the duration of one minute on the system clock would never be more than an hour.
You can’t be serious.
The smallest unit of time is one second.
Ok, one millisecond.
It will never be necessary to set the system time to any value other than the correct local time.
Ok, testing might require setting the system time to a value other than the correct local time but it will never be necessary to do so in production.
Time stamps will always be specified in a commonly-understood format like 1339972628 or 133997262837.
Time stamps will always be specified in the same format.
Time stamps will always have the same level of precision.
A time stamp of sufficient precision can safely be considered unique.
A timestamp represents the time that an event actually occurred.
Human-readable dates can be specified in universally understood formats such as 05/07/11.