Why does (eg) a 10 pt font in Java applications appear to have a different size from the same font at 10pt in a native application?
Conversion from a size in points into device pixels depends on the device resolution reported by the platform APIs. Java 2D defaults to assuming 72 dpi. Platform defaults vary: Mac OS also uses 72 dpi; Linux desktops based on GTK (GNOME) or Qt (KDE) typically default to 96 dpi and let the end user customise the value; Windows defaults to 96 dpi (VGA resolution), also offers 120 dpi ("large fonts"), and lets users specify a custom resolution. Two things follow:

• The DPI reported by the platform APIs likely has no correspondence to the true DPI of the display device.
• It is unlikely that Java 2D's default matches the platform default.

A typical result is that, at Windows' default of 96 dpi, a 10 pt font in a Java application is 72/96 (three quarters) of the size of its native counterpart. Note that Swing's Windows and GTK L&Fs do scale fonts based on the system DPI to match the desktop.
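As a sketch of how an application could compensate, the snippet below queries the platform-reported resolution via `Toolkit.getScreenResolution()` and rescales a point size from Java 2D's 72 dpi assumption to the system value. The class and helper names are illustrative, not part of any standard API; the 96 dpi fallback for headless environments is an assumption based on the common Windows/Linux default.

```java
import java.awt.Font;
import java.awt.HeadlessException;
import java.awt.Toolkit;

public class FontDpiScale {

    // Hypothetical helper: convert a point size from Java 2D's
    // 72 dpi assumption to the desktop's reported resolution.
    // E.g. 10 pt at 96 dpi becomes 10 * 96/72 = 13.33 "Java points".
    static float scaleToSystemDpi(float pointSize, int systemDpi) {
        return pointSize * systemDpi / 72f;
    }

    public static void main(String[] args) {
        int dpi;
        try {
            // The platform-reported resolution (e.g. 96 on default Windows).
            dpi = Toolkit.getDefaultToolkit().getScreenResolution();
        } catch (HeadlessException e) {
            dpi = 96; // assumed fallback when no display is available
        }
        Font base = new Font("Dialog", Font.PLAIN, 10);
        // Derive a font whose on-screen size matches 10 pt in native apps.
        Font scaled = base.deriveFont(scaleToSystemDpi(10f, dpi));
        System.out.println("System DPI: " + dpi
                + ", scaled size: " + scaled.getSize2D());
    }
}
```

Running this on a default Windows desktop would typically report 96 dpi and a scaled size of 13.33, matching the 72/96 ratio described above.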