Why is Y2K a problem?

Two-digit storage. During the 1950s, 1960s, 1970s, and even into the 1980s, computer chips and software applications identified the year by its last two digits alone, so 1969, 1978, 1985, and 1999 were stored as 69, 78, 85, and 99. Date calculations confined to a single century come out right (e.g., 1999 - 1985 = 14). But what if the earlier date is actually 1885, or the later one 2085? Then 1999 - 1885 = 114, yet 99 - 85 still equals 14. Much of the information a computer processes remains accurate under the two-digit representation; the concern is that the systems where it doesn't may fail or produce incorrect results, delaying shipments of critical manufacturing components or delivering the wrong amounts on the wrong dates.

Leap-year calculations. The year 2000 is a special-case leap year of a kind that occurs only once every 400 years. Computers determine leap years with a simple set of rules, but not all programs implemented the special 400-year exception: century years are ordinarily not leap years unless they are divisible by 400, so 2000 is one. Software missing that rule treats 2000 as a common year and mishandles February 29, 2000.
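Both failure modes are easy to demonstrate. The minimal Python sketch below is illustrative only (the helper names two_digit_diff, is_leap, and is_leap_buggy are invented for this example, not taken from any real Y2K-era system): it contrasts two-digit year arithmetic with the true four-digit difference, and the complete Gregorian leap-year rule with a truncated version that omits the 400-year exception.

```python
def two_digit_diff(y1, y2):
    # Subtract years the way a two-digit system effectively did:
    # only the last two digits of each year are stored and compared.
    return (y1 % 100) - (y2 % 100)

# Within one century the shortcut happens to work:
print(two_digit_diff(1999, 1985))  # 14, same as 1999 - 1985

# Across a century boundary it silently fails:
print(two_digit_diff(1999, 1885))  # 14, but the true difference is 114
print(two_digit_diff(2000, 1999))  # -99, so a one-year-old record "ages" -99 years


def is_leap(year):
    # Full Gregorian rule: every 4th year is a leap year, except
    # century years, except century years divisible by 400.
    # 2000 is divisible by 400, so it IS a leap year.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)


def is_leap_buggy(year):
    # Truncated rule missing the 400-year exception: it wrongly
    # classifies 2000 as a common year with no February 29.
    return year % 4 == 0 and year % 100 != 0


print(is_leap(2000), is_leap_buggy(2000))  # True False
```

Run as-is, the script prints 14, 14, -99, and then True False: the two-digit shortcut and the truncated leap-year rule both pass same-century tests while failing exactly on the boundary cases that made Y2K a problem.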
