Stellar magnitude of the Sun and Moon. Magnitude

Imagine that somewhere out at sea, in the darkness of the night, a light is quietly flickering. Unless an experienced sailor explains what it is, you often cannot tell whether it is a flashlight on the bow of a passing boat or a powerful searchlight from a distant lighthouse.

We are in the same position on a dark night, looking at the twinkling stars. Their apparent brilliance depends both on their true luminous intensity, called luminosity, and on their distance from us. Only knowledge of the distance to a star allows one to calculate its luminosity relative to the Sun. For example, the luminosity of a star that in reality shines ten times more weakly than the Sun is expressed as 0.1.

The true intensity of a star's light can also be expressed differently: by calculating what magnitude the star would appear to have if it were at a standard distance of 32.6 light-years from us, that is, the distance that light, traveling at 300,000 km/s, covers in that time.

Adopting such a standard distance has proven convenient for various calculations. The brightness of a star, like that of any light source, varies inversely with the square of the distance from it. This law allows us to calculate the absolute magnitudes, or luminosities, of stars once the distances to them are known.

When the distances to the stars became known, we could calculate their luminosities, that is, line the stars up, so to speak, and compare them with one another under equal conditions. It must be admitted that the results were astonishing, since it had previously been assumed that all stars were "similar to our Sun." The luminosities turned out to be amazingly varied, and no evenly ranked file of Pioneers can be compared with that line-up.

We will give only extreme examples of luminosity in the world of stars.

For a long time the faintest star known was one 50 thousand times fainter than the Sun, with an absolute magnitude of +16.6. Later, however, even fainter stars were discovered, whose luminosity is millions of times less than the Sun's!

Dimensions in space are deceptive: from Earth, Deneb shines brighter than Antares, while the Pistol Star is not visible at all. Yet to an observer on our planet both Deneb and Antares appear to be mere insignificant points compared to the Sun. How wrong that impression is can be judged from a simple fact: the Pistol Star emits as much light per second as the Sun does in a year!

At the other end of the line of stars stands S Doradus, visible only from the countries of the Earth's Southern Hemisphere, and even there not without a telescope. In reality it is 400 thousand times brighter than the Sun, and its absolute magnitude is −8.9.

The absolute magnitude of our Sun is about +5. Not so impressive! From a distance of 32.6 light-years we would have difficulty seeing it without binoculars.

If the brightness of the Sun is represented by an ordinary candle, then in comparison S Doradus is a powerful searchlight, and the faintest star is dimmer than the most pitiful firefly.

So the stars are distant suns, but their light intensity can be entirely different from that of our own star. Figuratively speaking, exchanging our Sun for another would have to be done with caution: in the light of one we would go blind; in the light of another we would wander as if in twilight.

Magnitudes

Since the eye is our first measuring instrument, we must know the simple rules that govern our estimates of the brightness of light sources. Our assessment of differences in brightness is relative rather than absolute. Comparing two faint stars, we see that they differ noticeably from each other, whereas for two bright stars the same difference in brightness goes unnoticed, since it is insignificant compared with the total amount of light emitted. In other words, our eyes judge the relative, not the absolute, difference in brightness.

Hipparchus was the first to divide the stars visible to the naked eye into six classes according to their brightness. Later this rule was somewhat refined without changing the system itself. The magnitude classes were distributed so that a 1st magnitude star (the average of the twenty brightest) gives a hundred times more light than a 6th magnitude star, which is at the limit of visibility for most people.

A difference of one magnitude corresponds to a brightness ratio of 2.512. A difference of two magnitudes corresponds to 6.31 (2.512 squared), a difference of three magnitudes to 15.85 (2.512 to the third power), a difference of four magnitudes to 39.82 (2.512 to the fourth power), and a difference of five magnitudes to 100 (2.512 to the fifth power).

A 6th magnitude star gives us a hundred times less light than a 1st magnitude star, and an 11th magnitude star ten thousand times less. A star of 21st magnitude is 100,000,000 times fainter than a 1st magnitude star.
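
These ratios all follow from one rule: each magnitude step multiplies the brightness by the fifth root of 100. A minimal sketch in Python (the function name is illustrative, not from the text):

```python
# Brightness ratio corresponding to a difference of n stellar magnitudes.
# One magnitude step corresponds to a factor of 100 ** (1/5), about 2.512.

def brightness_ratio(delta_m):
    """Return how many times brighter one source is than another
    that is delta_m magnitudes fainter."""
    return 100 ** (delta_m / 5)

for dm in (1, 2, 3, 4, 5, 10, 20):
    print(f"difference of {dm:2d} mag -> ratio {brightness_ratio(dm):,.2f}")
```

Running it reproduces the figures of the text: 2.51, 6.31, 15.85, 39.81, 100, and so on up to 100,000,000 for twenty magnitudes.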

As should already be clear, absolute and apparent magnitude are entirely different things and cannot be directly compared. To a "relative" observer on our planet, Deneb in the constellation Cygnus looks like a mere point. In fact, the entire orbit of the Earth would barely suffice to contain the circumference of this star.

To classify stars correctly (and they are all different from one another), one must carefully ensure that over the whole interval between neighboring stellar magnitudes a brightness ratio of 2.512 is maintained. Such work is impossible with the naked eye; special instruments are needed, such as Pickering's photometer, which uses the North Star or even an "average" artificial star as a standard.

Also, for convenience of measurement it is necessary to attenuate the light of very bright stars; this can be done either with a polarizing device or with a photometric wedge.

Purely visual methods, even with large telescopes, cannot extend the magnitude scale to faint stars. Moreover, visual measurements can only be made directly at the telescope. Therefore purely visual classification has now been abandoned in favor of photographic methods.

How can one compare the amount of light received by a photographic plate from two stars of different brilliance? To make them appear the same, the light of the brighter star must be attenuated by a known amount. The easiest way to do this is to place a diaphragm in front of the telescope objective. The amount of light entering the telescope varies with the area of the opening, so the attenuation of any star's light can be measured accurately.

Let us choose some star as a standard and photograph it with the telescope's full aperture. Then we determine what aperture must be used, at the same exposure, to obtain an identical image of a brighter star. The ratio of the areas of the reduced and full openings gives the ratio of the brightnesses of the two objects.
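
The logic of this aperture comparison can be sketched in a few lines; the diameters below are made-up illustration values, not measurements from the text:

```python
import math

# Photographic photometry by diaphragming the objective:
# the light gathered is proportional to the area of the opening,
# so the ratio of the stopped-down and full aperture areas gives the
# brightness ratio of the two stars, hence their magnitude difference.

def magnitude_difference(full_diameter, stopped_diameter):
    """Magnitude difference implied when the bright star photographed
    through the stopped-down aperture gives the same image as the
    standard star photographed at full aperture."""
    area_ratio = (stopped_diameter / full_diameter) ** 2
    return -2.5 * math.log10(area_ratio)  # positive: the star is brighter

# Hypothetical example: a 1-m telescope stopped down to 10 cm matches
# the standard star's image -> the bright star is 5 magnitudes brighter.
print(magnitude_difference(1.0, 0.1))  # -> 5.0
```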

This measurement method gives an error of only 0.1 magnitude for any star in the range from 1st to 18th magnitude. The magnitudes obtained in this way are called photovisual.

(from Wikipedia)

Stellar magnitude is a numerical characteristic of an object in the sky, most often a star, showing how much light comes from it to the point where the observer is located.

Apparent (visual) magnitude

The modern concept of apparent magnitude is made to correspond to the magnitudes assigned to stars by the ancient Greek astronomer Hipparchus in the 2nd century BC. Hipparchus divided all stars into six magnitudes: he called the brightest stars of the first magnitude and the dimmest stars of the sixth, distributing the intermediate values evenly among the rest.

The apparent magnitude of a star depends not only on how much light the object emits but also on how far it is from the observer. Apparent magnitude serves as a measure of a star's brightness: the greater the brightness, the smaller the magnitude, and vice versa.

In 1856, N. Pogson proposed a formalization of the magnitude scale. The apparent magnitude is determined by the formula:

m = −2.5 lg I + C,

where I is the luminous flux from the object and C is a constant.

Since this scale is relative, its zero point (0m) is defined as the brightness of a star whose luminous flux is 10³ quanta/(cm²·s·Å) in green light (UBV scale), or 10⁶ quanta/(cm²·s·Å) over the entire visible range of light. A star of 0m outside the Earth's atmosphere creates an illuminance of 2.54·10⁻⁶ lux.

The magnitude scale is logarithmic because changes in brightness by the same factor are perceived as equal (the Weber-Fechner law). Moreover, since Hipparchus decided that the brighter the star, the smaller its magnitude, the formula contains a minus sign.

The following two properties help to use apparent magnitudes in practice:

  1. An increase in luminous flux by 100 times corresponds to a decrease in apparent stellar magnitude by exactly 5 units.
  2. A decrease in stellar magnitude by one unit means an increase in the luminous flux by a factor of 10^(1/2.5) ≈ 2.512.

Nowadays apparent magnitude is used not only for stars but also for other objects, such as the Moon, the Sun, and the planets. Because these can be brighter than the brightest star, they can have negative apparent magnitudes.

The apparent magnitude depends on the spectral sensitivity of the radiation receiver (eye, photoelectric detector, photographic plate, etc.)

  • Visual magnitude (V or m_v) is determined by the sensitivity spectrum of the human eye (visible light), which peaks at a wavelength of 555 nm; it can also be measured photographically with an orange filter.
  • Photographic, or "blue", magnitude (B or m_p) is determined by photometric measurement of the star's image on a photographic plate sensitive to blue and ultraviolet rays, or with an antimony-cesium photomultiplier behind a blue filter.
  • Ultraviolet magnitude (U) has its maximum in the ultraviolet, at a wavelength of about 350 nm.

Differences between the magnitudes of one object in different ranges, U−B and B−V, are integral indicators of the object's color: the larger they are, the redder the object.

  • Bolometric magnitude corresponds to the total radiation power of the star, i.e., the power summed over the entire spectrum. It is measured with a special device, a bolometer.

Absolute

Absolute magnitude (M) is defined as the apparent magnitude an object would have if it were located at a distance of 10 parsecs from the observer. The absolute bolometric magnitude of the Sun is +4.7. If the apparent magnitude and the distance to the object are known, the absolute magnitude can be calculated using the formula:

M = m − 5 lg(d/d₀),

where d₀ = 10 pc ≈ 32.616 light years.

Accordingly, if the apparent and absolute magnitudes are known, the distance can be calculated using the formula

d = d₀·10^((m−M)/5).

The absolute magnitude is related to luminosity by the relation

M = M☉ − 2.5 lg(L/L☉),

where L☉ and M☉ are the luminosity and absolute magnitude of the Sun.
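
The luminosity relation can be sketched numerically; a minimal Python sketch, using the Sun's bolometric value of +4.7 quoted above (the function names are mine, for illustration):

```python
import math

M_SUN = 4.7  # absolute bolometric magnitude of the Sun, as quoted above

def absolute_magnitude(luminosity_in_suns):
    """M = M_sun - 2.5 lg(L / L_sun)."""
    return M_SUN - 2.5 * math.log10(luminosity_in_suns)

def luminosity_in_suns(M):
    """Inverse relation: L / L_sun = 10 ** ((M_sun - M) / 2.5)."""
    return 10 ** ((M_SUN - M) / 2.5)

# A star 400,000 times as luminous as the Sun (roughly S Doradus):
print(round(absolute_magnitude(400_000), 1))    # -> -9.3
# A star 50,000 times fainter than the Sun:
print(round(absolute_magnitude(1 / 50_000), 1)) # -> 16.4
```

The results land close to the extreme absolute magnitudes (−8.9 and +16.6) mentioned earlier in the text; the small discrepancies come from the rounded luminosity figures.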

Magnitudes of some objects

Object                                                          m
Sun                                                            −26.7
Full Moon                                                      −12.7
Iridium flare (maximum)                                         −9.5
Supernova of 1054 (maximum)                                     −6.0
Venus (maximum)                                                 −4.4
Earth (seen from the Sun)                                       −3.84
Mars (maximum)                                                  −3.0
Jupiter (maximum)                                               −2.8
International Space Station (maximum)                           −2
Mercury (maximum)                                               −1.9
Andromeda Galaxy                                                +3.4
Proxima Centauri                                               +11.1
The brightest quasar                                           +12.6
The faintest stars visible to the naked eye                +6 to +7
Faintest object captured by an 8-meter ground-based telescope  +27
Faintest object captured by the Hubble Space Telescope         +30
Object        Constellation        m
Sirius        Canis Major         −1.47
Canopus       Carina              −0.72
α Centauri    Centaurus           −0.27
Arcturus      Boötes              −0.04
Vega          Lyra                +0.03
Capella       Auriga              +0.08
Rigel         Orion               +0.12
Procyon       Canis Minor         +0.38
Achernar      Eridanus            +0.46
Betelgeuse    Orion               +0.50
Altair        Aquila              +0.75
Aldebaran     Taurus              +0.85
Antares       Scorpius            +1.09
Pollux        Gemini              +1.15
Fomalhaut     Piscis Austrinus    +1.16
Deneb         Cygnus              +1.25
Regulus       Leo                 +1.35

The Sun from different distances


Magnitude

A dimensionless physical quantity characterizing the illumination created by a celestial object near the observer. Subjectively it is perceived as the brightness, or brilliance, of the source. The brightness of one source is indicated by comparing it with the brightness of another taken as a standard; specially selected fixed stars usually serve as such standards. Magnitude was first introduced as an indicator of the apparent brightness of stars in visible light, but was later extended to other emission ranges as well. The magnitude scale is logarithmic, as is the decibel scale. On the magnitude scale, a difference of 5 units corresponds to a 100-fold difference between the light fluxes of the measured and reference sources; thus a difference of 1 magnitude corresponds to a light-flux ratio of 100^(1/5) ≈ 2.512. Magnitude is denoted by the Latin letter "m" (from the Latin magnitudo, "magnitude") written as an italic superscript to the right of the number. The direction of the magnitude scale is reversed: the higher the value, the weaker the object's brilliance. For example, a star of 2nd magnitude (2m) is 2.512 times brighter than a 3rd magnitude star (3m) and 2.512 × 2.512 = 6.310 times brighter than a 4th magnitude star (4m).

Apparent magnitude (m; often called simply "magnitude") indicates the radiation flux near the observer, i.e., the observed brightness of a celestial source, which depends not only on the actual radiating power of the object but also on the distance to it. The scale of apparent magnitudes goes back to the star catalog of Hipparchus (2nd century BC), in which all stars visible to the eye were first divided into 6 brightness classes. The stars of the Big Dipper have magnitudes of about 2m; Vega has about 0m. Especially bright luminaries have negative magnitudes: Sirius about −1.5m (i.e., the light flux from it is 4 times greater than from Vega), and the brightness of Venus at some moments almost reaches −5m (i.e., its light flux is almost 100 times greater than Vega's). We emphasize that apparent magnitude can be measured both with the naked eye and with a telescope, both in the visual range of the spectrum and in others (photographic, UV, IR). Here "apparent" (English: apparent) means "observed" and is not specifically related to the human eye.

Absolute magnitude (M) indicates what apparent magnitude a luminary would have if the distance to it were 10 parsecs and interstellar absorption were absent. Thus absolute magnitude, unlike apparent magnitude, allows one to compare the true luminosities of celestial objects (in a given spectral range).

As for spectral ranges, there are many systems of stellar magnitudes differing in the choice of measurement range. When observing with the eye (naked or through a telescope), the visual magnitude (m_v) is measured. From the image of a star on an ordinary photographic plate, obtained without additional filters, the photographic magnitude (m_p) is measured. Since photographic emulsion is sensitive to blue rays and insensitive to red, blue stars come out brighter on the plate than they appear to the eye. However, using orthochromatic emulsion and a yellow filter, one obtains the so-called photovisual magnitude scale (m_pv), which practically coincides with the visual one. By comparing the brightness of a source measured in different spectral ranges, one can determine its color, estimate its surface temperature (if it is a star) or albedo (if it is a planet), determine the degree of interstellar absorption of light, and find other important characteristics. Therefore standard photometric systems have been developed, determined mainly by the choice of light filters. The most popular is the three-color system: ultraviolet (Ultraviolet), blue (Blue), and yellow (Visual). The yellow range is very close to the photovisual one (V ≈ m_pv), and the blue to the photographic (B ≈ m_p).

The unequal brightness of the various objects in the sky is probably the first thing a person notices when observing; hence the need arose long ago to introduce a convenient quantity that would make it possible to classify luminaries by brightness.

History

For the first time such a quantity was used for naked-eye observations by the ancient Greek astronomer Hipparchus, author of the first European star catalog. He classified all the stars in his catalog by brightness, designating the brightest as stars of 1st magnitude and the dimmest as stars of 6th magnitude. This system took root, and in the middle of the 19th century it was brought to its modern form by the English astronomer Norman Pogson.

Thus, we obtained a dimensionless physical quantity, logarithmically related to the illumination created by the luminaries (the actual stellar magnitude):

m1 − m2 = −2.5·lg(L1/L2)

where m1 and m2 are the magnitudes of the luminaries, and L1 and L2 are the illuminances in lux (lx, the SI unit of illuminance) created by these objects. If you substitute m1 − m2 = 5 into the left side of this equation, a simple calculation shows that the illuminances are then related as 1:100, so a difference in brightness of 5 magnitudes corresponds to a 100-fold difference in illuminance.

Continuing, we take the 5th root of 100 and obtain the change in illuminance corresponding to a difference of one magnitude: a factor of 2.512.

This is the entire mathematical apparatus needed to navigate this brightness scale.

Magnitude scale

With the introduction of this system it was also necessary to set the zero point of the magnitude scale. Initially, the brightness of the star Vega (alpha Lyrae) was taken as zero magnitude (0m). At present the zero point is defined slightly more precisely, so that Vega's magnitude comes out at +0.03m. The eye will not notice such a difference, however, so for visual observations Vega may still be taken as corresponding to zero magnitude.

Another important thing to remember about this scale is that the lower the magnitude, the brighter the object. For example, Vega, with its magnitude of +0.03m, is almost 100 times brighter than a star of magnitude +5m. Jupiter at its maximum brightness of −2.94m is brighter than Vega by:

−2.94 − 0.03 = −2.5·lg(L1/L2)
L1/L2 = 15.42 times

You can solve this problem in another way - simply by raising 2.512 to a power equal to the difference in the magnitudes of the objects:

2.512^(2.94 + 0.03) = 15.42
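
Both routes to the answer can be checked in a few lines of Python (the function name is mine, for illustration):

```python
import math

# Two equivalent ways to get the brightness ratio of Jupiter and Vega
# from their magnitudes (-2.94 and +0.03).

def flux_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 is than one of
    magnitude m2 (from Pogson's relation m1 - m2 = -2.5 lg(L1/L2))."""
    return 10 ** (-(m1 - m2) / 2.5)

jupiter, vega = -2.94, 0.03

# Method 1: solve Pogson's formula for L1/L2.
print(round(flux_ratio(jupiter, vega), 2))   # -> 15.42

# Method 2: raise 2.512 to the magnitude difference.
print(round(2.512 ** (vega - jupiter), 2))   # -> 15.42
```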

Magnitude classification

Now, having dealt with the mathematical machinery, let us consider the classification of stellar magnitudes used in astronomy.

The first classification is based on the spectral sensitivity of the radiation receiver. On this basis a stellar magnitude can be: visual (brightness measured only in the part of the spectrum visible to the eye); bolometric (brightness measured over the entire spectrum, including not only visible light but ultraviolet, infrared, and the other ranges combined); or photographic (brightness measured with allowance for the spectral sensitivity of photographic emulsions or photocells).

This also includes stellar magnitudes in a specific part of the spectrum (for example, in the range of blue light, yellow, red or ultraviolet radiation).

Accordingly, visual magnitude is intended for assessing the brightness of luminaries during visual observations; bolometric magnitude for estimating the total flux of all radiation from the star; and photographic and narrow-band magnitudes for assessing the color indices of luminaries in a given photometric system.

Apparent and absolute magnitudes

The second classification of stellar magnitudes is based on the number of physical parameters they depend on. On this basis a magnitude can be apparent or absolute. Apparent magnitude is the brightness of an object that the eye (or other radiation receiver) perceives directly from its current position in space.

This brightness depends on two parameters at once: the radiating power of the luminary and the distance to it. Absolute magnitude depends only on the radiating power and not on the distance, since the latter is taken to be fixed for a given class of objects.

The absolute magnitude of a star is defined as its apparent magnitude if the distance to it were 10 parsecs (32.616 light years). The absolute magnitude of a Solar System object is defined as its apparent magnitude if it were located at a distance of 1 AU from the Sun, showed its full phase to the observer, and the observer were also 1 AU (149.6 million km) from the object (i.e., at the center of the Sun).

The absolute magnitude of meteors is defined as their apparent magnitude if they were at a distance of 100 km from the observer, at the zenith.
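
The stellar case above reduces to the distance-modulus formula, M = m − 5 lg(d/10 pc); a minimal sketch (the example star is hypothetical):

```python
import math

# Absolute magnitude of a star: its apparent magnitude rescaled
# to the standard distance of 10 parsecs.

def absolute_stellar_magnitude(m, distance_pc):
    """M = m - 5 lg(d / 10 pc)."""
    return m - 5 * math.log10(distance_pc / 10)

# Hypothetical example: a star of apparent magnitude 2.0 at 100 pc.
print(round(absolute_stellar_magnitude(2.0, 100), 1))  # -> -3.0
# At exactly 10 pc the apparent and absolute magnitudes coincide.
print(absolute_stellar_magnitude(2.0, 10))             # -> 2.0
```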

Application of magnitudes

These classifications can be used together. For example, the absolute visual magnitude of the Sun is M(v) = +4.83, while its absolute bolometric magnitude is M(bol) = +4.75, since the Sun shines not only in the visible range of the spectrum. The difference between the two depends on the temperature of the star's photosphere (its visible surface) and on its luminosity class (main sequence, giant, supergiant, etc.).

The differences between the visual and bolometric absolute magnitudes of a star can be large. For example, hot stars (spectral classes B and O) shine mainly in the ultraviolet, which is invisible to the eye, so their bolometric brilliance is much stronger than their visual one. The same applies to cool stars (spectral classes K and M), which shine predominantly in the infrared.

The absolute visual magnitude of the most powerful stars (hypergiants and Wolf-Rayet stars) is of the order of −8 to −9, and their absolute bolometric magnitude can reach −11 to −12 (which corresponds to the apparent magnitude of the full Moon).

Their radiation power (luminosity) is millions of times greater than the Sun's. The apparent visual magnitude of the Sun from the Earth's orbit is −26.74m; in the region of Neptune's orbit it would be −19.36m. The apparent visual magnitude of the brightest star, Sirius, is −1.5m, while its absolute visual magnitude is +1.44; that is, in the visible spectrum Sirius is almost 23 times brighter than the Sun.

The planet Venus in the sky is always brighter than any star (its apparent brightness ranges from −3.8m to −4.9m); Jupiter is somewhat fainter (from −1.6m to −2.94m); at opposition Mars reaches an apparent magnitude of about −2m or brighter. In general, most of the time the planets are the brightest objects in the sky after the Sun and Moon, since there are no stars of high luminosity in the Sun's vicinity.

Let us continue our algebraic excursion to the heavenly bodies. In the scale used to assess the brightness of stars, a place can be found, besides the fixed stars, for other luminaries as well: the planets, the Sun, and the Moon. We will speak about the brightness of the planets separately; here we indicate the magnitudes of the Sun and Moon. The stellar magnitude of the Sun is expressed by the number minus 26.8, and that of the full1) Moon by minus 12.6. Why both numbers are negative should, the reader will realize, be clear after all that was said earlier. But he may be puzzled by the seemingly small difference between the magnitudes of the Sun and the Moon: the first is "only about twice the second."

Let us not forget, however, that a magnitude is, in essence, a kind of logarithm (to the base 2.5). And just as one cannot compare numbers by dividing their logarithms by one another, it makes no sense, when comparing stellar magnitudes, to divide one magnitude by the other. The following calculation shows the correct comparison.

If the magnitude of the Sun is "minus 26.8", then the Sun is brighter than a star of the first magnitude by 2.5^27.8 times. The Moon is brighter than a first-magnitude star by 2.5^13.6 times.

This means that the brightness of the Sun exceeds the brightness of the full Moon by

2.5^27.8 / 2.5^13.6 = 2.5^14.2 times.

Having calculated this value (using logarithm tables), we get 447,000. This, then, is the correct ratio of the brightnesses of the Sun and the Moon: in clear weather the daytime luminary illuminates the Earth 447,000 times more powerfully than the full Moon on a cloudless night.
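
The same result is obtained without logarithm tables; a short sketch, using the author's rounded base of 2.5 (the exact base 2.512 is shown for comparison and gives a somewhat larger figure):

```python
# Ratio of the Sun's brightness to the full Moon's, following the text:
# Sun m = -26.8, Moon m = -12.6, so the Sun is 14.2 magnitudes brighter.
sun_m, moon_m = -26.8, -12.6
delta = moon_m - sun_m          # 14.2 magnitudes

ratio_perelman = 2.5 ** delta   # the book's rounded base: about 447,000
ratio_exact = 2.512 ** delta    # the modern base (fifth root of 100)

print(f"{ratio_perelman:,.0f}")
print(f"{ratio_exact:,.0f}")
```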

Considering that the amount of heat emitted by the Moon is proportional to the amount of light it scatters (and this is probably close to the truth), we must admit that the Moon sends us 447,000 times less heat than the Sun. It is known that every square centimeter at the boundary of the Earth's atmosphere receives from the Sun about 2 small calories of heat per minute. This means that the Moon sends 1 cm² of the Earth no more than 1/225,000 of a small calorie each minute (that is, it could heat 1 g of water by 1/225,000 of a degree in 1 minute). This shows how unfounded are all attempts to attribute any influence on the Earth's weather to moonlight2).

1) In the first and last quarters, the magnitude of the Moon is minus 9.

2) The question of whether the Moon can influence the weather through its gravity will be discussed at the end of the book (see “The Moon and Weather”).

The widespread belief that clouds often dissipate under the rays of the full Moon is a gross misconception, explained by the fact that the disappearance of clouds at night (due to other causes) becomes noticeable only by moonlight.

Let us now leave the Moon and calculate how many times the Sun is brighter than the most brilliant star in the entire sky, Sirius. Reasoning as before, we obtain the ratio of their brilliance:

2.5^27.8 / 2.5^2.6 = 2.5^25.2,

i.e., the Sun is 10 billion times brighter than Sirius.

The following calculation is also very interesting: how many times is the illumination provided by the full Moon brighter than the total illumination of the entire starry sky, that is, all the stars visible to the naked eye on one celestial hemisphere? We have already calculated that stars from the first to the sixth magnitude, inclusive, shine together as much as a hundred stars of the first magnitude. The problem, therefore, comes down to calculating how many times the Moon is brighter than a hundred stars of the first magnitude.

This ratio is

2.5^13.6 / 100 ≈ 2700.

So on a clear moonless night we receive from the starry sky only 1/2700 of the light that the full Moon sends us, and 2700 × 447,000, i.e., 1,200 million times less than the Sun gives on a cloudless day.

Let us add also that the stellar magnitude of the standard international "candle" at a distance of 1 m is minus 14.2. This means that at the specified distance a candle illuminates brighter than the full Moon by 2.5^(14.2−12.6), i.e., about four times.

It may also be interesting to note that the searchlight of an aircraft beacon with a power of 2 billion candles would be visible from the distance of the Moon as a star of magnitude 4½, i.e., it could be made out by the naked eye.

The true brilliance of the stars and the Sun

All the brightness estimates we have made so far referred only to apparent brightness. The figures given express the brilliance of the luminaries at the distances at which each actually lies. But we know well that the stars are not equally distant from us; the visible brightness of a star therefore tells us both about its true brightness and about its distance from us, or rather about neither, until we disentangle the two factors. Meanwhile it is important to know what the comparative brightness, or "luminosity", of various stars would be if they were at the same distance from us.

By posing the question this way, astronomers introduce the concept of the "absolute" magnitude of stars. The absolute magnitude of a star is the magnitude the star would have if it were located at a distance of 10 "parsecs" from us. The parsec is a special measure of length used for stellar distances; we shall speak of its origin separately later. Here we will only say that one parsec is about 30,800,000,000,000 km. It is not difficult to calculate the absolute magnitude of a star if its distance is known and one takes into account that brightness must decrease in proportion to the square of the distance1).

We will acquaint the reader with the results of only two such calculations: for Sirius and for our Sun. The absolute magnitude of Sirius is +1.3, that of the Sun +4.8. This means that from a distance of 30,800,000,000,000 km Sirius would shine for us as a star of magnitude 1.3, and our Sun as one of magnitude 4.8, i.e., weaker than Sirius by

2.5^4.8 / 2.5^1.3 = 2.5^3.5 ≈ 25 times,

although the apparent brilliance of the Sun is 10,000,000,000 times greater than that of Sirius.

We are convinced that the Sun is far from the brightest star in the sky. Still, we should not consider our Sun a complete pygmy among the stars around it: its luminosity is above average. According to stellar statistics, the average luminosity of the stars surrounding the Sun out to a distance of 10 parsecs corresponds to the ninth absolute magnitude. Since the absolute magnitude of the Sun is 4.8, it is brighter than the average "neighboring" star by

2.5^(9 − 4.8) = 2.5^4.2 ≈ 50 times.

Though 25 times absolutely dimmer than Sirius, the Sun is still 50 times brighter than the average of the stars surrounding it.

The brightest star known

The highest luminosity is possessed by an eighth-magnitude star in the constellation Dorado, inaccessible to the naked eye, designated

1) The calculation can be performed using the following formula, the origin of which will become clear to the reader when a little later he becomes more familiar with “parsec” and “parallax”:

Here M is the absolute magnitude of the star, m its apparent magnitude, and π the parallax of the star in seconds of arc. The successive transformations are as follows:

2.5^M = 2.5^m · 100 π²,
M lg 2.5 = m lg 2.5 + 2 + 2 lg π,
0.4 M = 0.4 m + 2 + 2 lg π,
M = m + 5 + 5 lg π.

For Sirius, for example, m = −1.6 and π = 0″.38. Therefore its absolute magnitude is

M = −1.6 + 5 + 5 lg 0.38 = 1.3.
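
The footnote's final formula can be verified numerically; a sketch in Python using the Sirius figures given there (the function name is mine, for illustration):

```python
import math

def absolute_from_parallax(m, parallax_arcsec):
    """M = m + 5 + 5 lg(pi), with the parallax pi in arcseconds."""
    return m + 5 + 5 * math.log10(parallax_arcsec)

# Sirius, with the values used in the footnote:
print(round(absolute_from_parallax(-1.6, 0.38), 1))   # -> 1.3
```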

with the Latin letter S. The constellation Dorado lies in the southern hemisphere of the sky and is not visible from the temperate zone of our hemisphere. The star in question belongs to a neighboring star system, the Large Magellanic Cloud, whose distance from us is estimated to be about 12,000 times the distance to Sirius. At so great a distance a star must have an utterly exceptional luminosity to appear even of the eighth magnitude. Sirius, thrown just as deep into space, would shine as a 17th-magnitude star, that is, it would be barely visible through the most powerful telescope.

What is the luminosity of this remarkable star? The calculation gives: minus the eighth magnitude. This means that our star is, in absolute terms, approximately 400,000 times brighter than the Sun! With such exceptional brightness, this star, if placed at the distance of Sirius, would appear nine magnitudes brighter than Sirius, i.e., would have roughly the brightness of the Moon in its quarter phase! A star that from the distance of Sirius could flood the Earth with such bright light has an undeniable right to be considered the brightest star known to us.

The magnitude of the planets in the earthly and alien skies

Let us now return to the mental journey to other planets (which we made in the section "Alien Skies") and evaluate more accurately the brilliance of the luminaries shining there. First of all, let us give the stellar magnitudes of the planets at their maximum brightness in the Earth's sky. Here is the table.

In the sky of the Earth:

Venus.............................

Saturn..............................

Mars..................................

Uranus..................................

Jupiter...........................

Neptune.............................

Mercury......................

Looking through it, we see that Venus is brighter than Jupiter by almost two magnitudes, i.e., 2.5² = 6.25 times, and brighter than Sirius by 2.7 magnitudes, i.e., 2.5^2.7 ≈ 13 times
(the magnitude of Sirius is −1.6). From the same table it is clear that even the dim planet Saturn is brighter than all the fixed stars except Sirius and Canopus. Here we find the explanation of the fact that some planets (Venus, Jupiter) are at times visible to the naked eye in daytime, whereas stars are completely inaccessible to the naked eye in daylight.