Why are some fonts (or graphical rulers) sized incorrectly?
Programs that try to optimize font sizes for better screen output (such as FontConfig-based applications) and programs that try to show real-world geometry (such as the inch/centimeter scales in the GIMP) need to know the pixel density of the display in order to be accurate. This value, expressed in DPI (dots per inch), is the ratio of the display's pixel resolution to the physical size of the monitor, and is generally between 70 and 120 DPI. If the DPI value used by the X server does not match the actual characteristics of the monitor, applications that rely on it are likely to display fonts and other graphics at the wrong size. You can use the xdpyinfo program from the xbase-clients package to check the monitor size and resolution that the X server assumes. X client applications obtain the DPI value by querying the X server, so it is the X server that needs to know (or be told) the correct value to use.
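For example, a 1920x1080 screen on a monitor whose visible area is 508x285 mm works out to roughly 1920 / (508 / 25.4) ≈ 96 DPI. To see what the X server currently assumes, you can run xdpyinfo and look at the dimensions and resolution lines for your screen; the output will look something like the following (the figures shown here are only illustrative):

    $ xdpyinfo | grep -B1 resolution
      dimensions:    1920x1080 pixels (508x285 millimeters)
      resolution:    96x96 dots per inch

If the reported physical dimensions or the resolution do not match your actual monitor, the X server is working from incorrect information, and fonts sized in points will come out the wrong size on screen.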