Why do computers use binary code instead of the decimal system?
Binary is a base-2 number system that represents all values using only 1s and 0s, which a computer's circuitry reads as simple on/off (high/low voltage) signals. If the decimal system were used instead, the hardware would need to distinguish ten different voltage levels. With component tolerances (resistors etc.) and electrical noise, the margins between those levels would be far narrower, so signals would be misread more often and data would be more easily corrupted.
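To make the noise-margin point concrete, here is a minimal Python sketch. The 0–5 V signal range and the noise_margin helper are illustrative assumptions, not figures from any real hardware:

```python
# Illustrative sketch: how much noise a signal can absorb before it is
# misread, given how many distinct voltage levels must fit in a fixed range.
# The 0-5 V range is an assumption for illustration, not a hardware spec.

def noise_margin(num_levels: int, v_max: float = 5.0) -> float:
    """Half the spacing between adjacent voltage levels: the largest
    noise a signal can pick up and still be read as the correct level."""
    spacing = v_max / (num_levels - 1)
    return spacing / 2

print(f"binary  (2 levels):  +/-{noise_margin(2):.3f} V tolerated noise")
print(f"decimal (10 levels): +/-{noise_margin(10):.3f} V tolerated noise")
```

Under these assumptions, a binary signal tolerates about ±2.5 V of noise before flipping, while a ten-level decimal signal tolerates only about ±0.28 V, roughly nine times less, which is why two levels are so much more robust.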