Star magnitudes do count backward, the result of an ancient fluke that seemed like a good idea at the time. The story begins in the second century B.C., when the Greek astronomer Hipparchus ranked his stars in a simple way: he called the brightest ones "of the first magnitude," simply meaning "the biggest." Since then, the history of the magnitude scale has been, like so much else in astronomy, a history of increasing scientific precision built on an ungainly historical foundation too deeply rooted for anyone to bulldoze and start fresh.
Looking at the stars on a dark, clear night, one of the most obvious features is that they differ in brightness (Figure 1). Some are bright, others lie at the limit of naked-eye visibility, and there is everything in between. Scanning the sky with a pair of binoculars or a telescope brings many fainter stars into view; the larger the aperture of the telescope or binoculars, the fainter the stars you can see. A system for describing a star's brightness is useful for many reasons in astronomy, including the scientific study of variable stars and reporting the brightness of a new object in the sky such as a nova, supernova, or comet.
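To give a rough sense of the aperture effect, here is a minimal sketch in Python, under two assumptions not stated in this article: that light grasp scales with the square of the aperture diameter, and that the dark-adapted eye has a pupil of about 7 mm (a conventional figure). The function name is illustrative, not from any library.

```python
import math

def magnitude_gain(aperture_mm: float, pupil_mm: float = 7.0) -> float:
    """Approximate gain in limiting magnitude over the naked eye.

    Light grasp scales with collecting area (aperture squared), so a
    flux ratio of (D/d)**2 corresponds to 5 * log10(D/d) magnitudes.
    The 7 mm pupil diameter is a conventional assumption.
    """
    return 5 * math.log10(aperture_mm / pupil_mm)

# 70 mm binoculars gain about 5 magnitudes over the naked eye;
# a 200 mm telescope gains roughly 7.3.
print(round(magnitude_gain(70), 1))   # 5.0
print(round(magnitude_gain(200), 1))  # 7.3
```

This is why modest optics reveal so many more stars: each factor of ten in aperture buys about five magnitudes of reach.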
Most ways of counting and measuring things work logically. When the thing that you're measuring increases, the number gets bigger. When you gain weight, after all, the scale doesn't tell you a smaller number of pounds or kilograms. But things are not so sensible in astronomy — at least not when it comes to the brightnesses of stars.
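To make the backward convention concrete, here is a minimal sketch of the modern quantitative scale (the Pogson relation, in which a difference of 5 magnitudes is defined as a brightness ratio of exactly 100). The scale itself is standard astronomy, but the code and function names below are illustrative assumptions, not drawn from this article.

```python
import math

def brightness_ratio(m1: float, m2: float) -> float:
    """Flux ratio of object 1 to object 2, given their magnitudes.

    Five magnitudes equal a factor of exactly 100, so each magnitude
    step is 100 ** (1/5), about 2.512.
    """
    return 100 ** ((m2 - m1) / 5)

def magnitude_difference(ratio: float) -> float:
    """Magnitude difference m2 - m1 for a flux ratio f1 / f2 = ratio."""
    return 2.5 * math.log10(ratio)

# A 1st-magnitude star is 100 times brighter than a 6th-magnitude star:
print(brightness_ratio(1.0, 6.0))   # 100.0
# A brighter object gets a *smaller* number: the scale counts backward.
print(magnitude_difference(100.0))  # 5.0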