'Funniest Y2K joke/Y2K and RAM'
|I don't think RAM had anything to do with it. Going to a 4-character year
|would have used up an additional 2.5% of the available columns on the
|punched card, and taken away characters from some other field.
I think the major factor in the Y2K problem was not so much with
RAM, but with I/O in general. If everyone who is keying dates into
a computer has to specify the full four-digit year, that can be a
lot of extra keystrokes. And using 4-digit years in date columns
on reports, especially if each row has several dates on it, can
result in the reports being somewhat cramped.
Besides, if memory is a concern why store dates in decimal format
anyway (the only internal form where Y2K is a problem)? Storing a
date in YYMMDD format takes 3 bytes; by contrast, two bytes will
suffice for storing a date for up to 179 years from the starting
reference point (MS-DOS uses two bytes to handle dates with years
1980 to 2107, though its date calculation routines will mishandle
"February 29, 2100" [there shouldn't be one]).
By the way, what format do the other PicMeisters out here prefer to
use for dates? The one app I wrote which used dates uses four bytes
for weekday, month, day, and year; all are kept in conventional form
(weekday=0 to 6; year goes 0 to 99) but the software correctly handles
the wraparound at 1-1-00, including maintaining the correct weekday.
|John Payson wrote:
CCYYMMDD HHMM SS.hh (i.e. 19990108 1356 18.02 for resolution down to
the hundredths of seconds) is my favorite here. It's simple, not too
redundant, and pretty clear. (Can remove the CC centuries digits if you
HAVE to, so long as you intelligently create them properly in the code
and handle them consistently; I've written apps that display "990108"
for today, but know "000108" is next year, not 99 years ago <G> Safe
bet that ((YY >= 80) && (YY <= 99)) ? Year = 1900 + YY : Year = 2000 +
YY, to be C-ish about it <G>)
(For the C impaired, Boolean ? DoIfTrue : DoIfFalse means "evaluate
Boolean, if it's true execute DoIfTrue, otherwise execute DoIfFalse." C
is a good language but can intentionally be used to write obnoxious code!
There's a yearly "Obfuscation contest" where people try to out-confuse
each other - sorta a masochists' thing I guess <G>)
CCYYMMDD is easy enough to understand, *quite* simple to sort
chronologically, and I got SO tired of re-writing "sort April after
March" routines for every language / every platform I've ever been on,
which is a lot of different platforms <G> (Again, same as most here,
probably! Pr1me, CDC, IBM mainframes, DEC, Data General, CP/M, ... /
PL/1, JOVIAL, C, C++, Pascals, scads o' assemblers, ...) Re-inventing
the wheel gets old.
Maybe I'm just a lazy programmer, though <G>
>| I don't think RAM had anything to do with it. Going to a
>| 4-character year would have used up an additional 2.5% of
>| the available columns on the punched card, and taken away
>| characters from some other field.
> I think the major factor in the Y2K problem was not so much with
> RAM, but with I/O in general.
Everyone seems to be fixating on internal memory (aka RAM).
RAM _was_ expensive and rare.
But so was storage on disk! When some of the formats with
2 digit year fields were defined, a washing machine sized
disk drive stored 20,000 characters. Why do you think the
removable disk pack was so desirable? It allowed them to
leverage the investment in a multi-thousand dollar drive.
It was common practice to only use on-line disk storage as
work areas while a program was running. The input and output
files were kept on punched cards. When those files migrated
to magnetic tape and then to disk, the format wasn't changed.
I remember (a _long_ time ago) seeing graduate students bring
their data files to the computer center. Those files consisted
of thousands of 80-column punch cards in multiple boxes.
A lot of Y2K issues simply reflect 4 decades of human nature
and corporate inertia.
> If everyone who is keying dates into a computer has to specify
> the full four-digit year, that can be a lot of extra keystrokes.
Actually, for card columns with fixed elements, you could
program the keypunch to auto-duplicate certain columns, like
the ones containing the 19 of each date.
Guess I'm showing my age...
> Besides, if memory is a concern why store dates in decimal form
> anyway (the only internal form where Y2K is a problem)?
Well, if I recall correctly, IBM systems had the card reader
interface hardware store the card columns directly in main
memory. There were no I/O routines to translate things --
the CPUs were too slow for that to be practical. And those
same CPUs worked very well with BCD representation. They did
the math directly on BCD fields. The fastest per-character
translation is no software translation at all.
There were 2 decades of business computing prior to C being
invented. And even now, data processing probably sees C as
the first letter of COBOL. RPG was hugely popular too.
|Lee Jones wrote:
> <snipped a bit out>
> I remember (a _long_ time ago) seeing graduate students bring
> their data files to the computer center. Those files consisted
> of thousands of 80-column punch cards in multiple boxes.
Dropping your box of cards performed a quick randomization of your
data & program, too. Arrgh!
> <snipped some more>
> Actually, for card columns with fixed elements, you could
> program the keypunch to auto-duplicate certain columns, like
> the ones containing the 19 of each date.
But you had to go get the card drum for that. (I think I still *have*
a couple of those drums here someplace!)
> <snipped some more>
> There were 2 decades of business computing prior to C being
> invented. And even now, data processing probably sees C as
> the first letter of COBOL. RPG was hugely popular too.
> Lee Jones
Definitely - The state of Washington's SDSS system (DSHS's machines)
in Olympia, WA, which pay attendants for people with disabilities &
organizations which help them & so forth, is still run on an old Cobol
batch-style program. I think they've upgraded to where they actually
keep the data & application on terminals now, though. They could be on
cards still for all I know! They really seem unable/unwilling to
upgrade to interactive systems... I'm wondering how Y2K will affect
them! We'll find out soon enough...
Peter L. Peres
> How to keep date
Well, unless the device has to display time somehow in ASCII or digits,
I strongly prefer the Unix date format (unsigned 32 bits, seconds since
etc etc). When there is a timezone problem (there is, here), I add a
signed integer in 32 bits that expresses the offset to GMT, and an ISO
3-char string that expresses the ISO name of the offset to GMT/UTC. I
confess that I have only gotten to do this twice so far, but it is a
Good Thing.
It only takes 11 bytes to implement and doing date maths with this is
bliss compared to YYMMDD - YYMMDD etc. The date can be transmitted in
mangled format over serial, using hex ASCII if there is an
8-bit-uncleanliness problem (we are 8-bit clean because our national
character set maps up into >0x80 bytes normally). Hosts have no problem
formatting or parsing the date, especially since the time functions are
well represented in any C standard library.
Note that I use an UNSIGNED second counter, which is good till 2106 or
so. The C library functions usually aren't. I also use only the low bits
of the offset required to express a +/-12 hour offset in seconds. Since
43200 fits in 16 bits, this is 16 bits + sign = 17 bits out of 32. This
leaves some unused bits and bytes
(can be flags etc). The 32-bit alignment is kept for easy addition
purposes. A mask removes the flags when doing this.
One can implement the binary->ASCII parser even on a PIC if need be, but
for simple things like clocks HHMMSS is more like it, in packed or
unpacked BCD, ready to add ASCII offsets for display (LCD) or not (LEDs).
Hope this helps,