Why is i typically chosen as a loop counter, in constructs such as for (i = 0; i < n; i++)?
There are several possible answers to this, and most of them seem to support the idea of 'computer centrism' from the guys who work with computers (I mean 'all of us'). For example:

• FORTRAN (on which many people learned to program) used the initial letter of a variable name to determine its data type. A variable was assumed to be an integer if its initial letter was in the range I-N (some have commented that these are the first two letters of INteger). So if you wanted a quick int variable as a loop counter, you would start with i, proceed to j, and so on.

• This is all very well, say the math oriented, but we use 'i' in equations as an index variable, like this:

         ___
         \
         /__  x
        i=0    i

  and FORTRAN obviously stole the idea from us.

As to which one is actually true? Well, no one is quite sure. Programmers probably picked up the practice from FORTRAN, which in turn probably took it from the mathematicians. All I know is that every programming book around uses 'i' as the first loop counter. Of c