Fractions vs. Decimals: Pros and Cons

Last time, we looked at what fractions are, and saw that fractions and decimals are two different ways to handle numbers less than 1 (or between integers). Here, we’ll look at several questions about why we need both forms, and whether one is better than the other. The comparison and contrast turns out to be very instructive.

Why do we need both?

First, this question from 1999:

Fractions or Decimals?

Hi,

My name is Alysia. My question is, why do we need both decimals and fractions to represent amounts less than one?

Doctor Rick answered:

Hi, Alysia. This is an interesting question.

Both decimals and fractions go back a long way. The Babylonians had something like our decimals, except their numbers were based on 60 instead of 10. The Egyptians used fractions, but their fractions all had 1 in the numerator (that is, they would if they had written them the way we do). Fractions as we know them were used by the Greeks. But not always - sometimes the Greeks preferred Babylonian-style "sexagesimals," and sometimes they preferred Egyptian-style "unit fractions."

In effect, Babylonian fractions were like our decimal numbers, except that (a) they had 59 “digits” instead of 10; (b) they had no 0 to fill empty places; and (c) they had no decimal point, so that “12” might mean 120, or 12, or 1.2, or .12, and you’d have to decide from context.

Egyptian fractions would be something like “\(\overline{3}\)” for \(\frac{1}{3}\) and “\(\overline{8}\)” for \(\frac{1}{8}\); for \(\frac{3}{4}\), they had to write something like “\(\overline{2} + \overline{4}\)” (\(\frac{1}{2} + \frac{1}{4}\)).
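If you'd like to play with unit fractions yourself, here is a minimal Python sketch of the classic greedy method (usually credited to Fibonacci; we don't actually know how the Egyptian scribes found their decompositions). The function name egyptian is just my own label for illustration.

    from fractions import Fraction
    from math import ceil

    def egyptian(frac):
        """Greedily write a fraction between 0 and 1 as a sum of distinct unit fractions."""
        parts = []
        while frac > 0:
            unit = Fraction(1, ceil(1 / frac))   # largest unit fraction not exceeding frac
            parts.append(unit)
            frac -= unit
        return parts

    print(egyptian(Fraction(3, 4)))   # [Fraction(1, 2), Fraction(1, 4)], i.e. 1/2 + 1/4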

For more on Babylonian sexagesimal numbers (including “fractions”), see the MacTutor History of Mathematics site here. Similarly, see their explanation of Egyptian numerals (including fractions) here.

If I had to pick either decimals or fractions and never use the other again, which would I choose? That would be a hard choice.

Decimals are better than fractions when I need to do a lot of calculations. You add and multiply decimals just the same way you do whole numbers, except you have to keep track of where the decimal point goes.

We have chosen decimals over fractions in designing computers and computer languages, and calculators, too. Computers and calculators don't understand fractions.

Modern scientific calculators often have special buttons to enter or display fractions, and they can store numbers in fractional form; but this fails for some operations (such as roots), and they revert to their native language, decimals. The same is true, ultimately, when we work on paper.

But decimals pose a problem. When you write 1/8 as a decimal, it's 0.125 - it has more digits than 1/8. Multiplying 24 times 0.125 by hand takes more work than multiplying 24 by 1/8. This will happen with a lot of fractions.

Now that we have calculators and computers, the number of digits doesn't matter so much. We can live with 0.125 instead of 1/8. But it gets a lot worse. When you try to write 1/3 as a decimal, you get 0.333333333333333333333333333333333333333333333333333333333333333333... I'll stop there, but you get the idea. You can't write the decimal EXACTLY, because it would go on forever.
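You can watch that happen by doing the long division one digit at a time. Here is a rough Python sketch; the helper name decimal_digits is just mine, for illustration.

    def decimal_digits(numerator, denominator, places):
        """First `places` decimal digits of numerator/denominator, by long division."""
        digits = []
        remainder = numerator % denominator
        for _ in range(places):
            remainder *= 10
            digits.append(str(remainder // denominator))
            remainder %= denominator
        return "0." + "".join(digits)

    print(decimal_digits(1, 8, 10))   # 0.1250000000 -- terminates (only trailing zeros follow)
    print(decimal_digits(1, 3, 10))   # 0.3333333333 -- the 3s never stop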

This problem shows up occasionally on calculators. You do some calculation and you know the answer should be 1, but it comes out as 0.999999999. That's very close to 1; in fact, if the 9's went on forever, it would be exactly the same as 1 - but it doesn't look right at all.

So although decimals are more convenient in some ways, they are much bulkier and clumsier when you want precision.

For a long time people preferred to work with fractions rather than decimals. That's part of the reason for all the strange unit conversions we have: 1 quart is 1/4 gallon, 1 inch is 1/12 foot. Decimals must have started gaining the upper hand by the time the metric system was developed, in the 1790s. With the advent of computers, decimals really took over. Fractions are not nearly as important now as they once were.

But there is still a place for fractions. In math, we often want to keep exact results. Since 1/3 is an exact number but 0.3333 is only approximately the same number, we have to write the fraction 1/3 in order to keep it exact. I hope your teacher insists on this - it's hard to tell whether you really understand what you are doing if you just punch numbers into a calculator and write down the decimal result. If you're wrong, it's hard to tell what mistake you made. Exactness is good.

As I mentioned last time, fractions give you the flexibility to use the appropriate “unit” (denominator) for your purposes; but that makes for awkward mixtures of different denominators. Decimals are more uniform (a major benefit of the metric system), but are too rigid to handle exact values.

If I had to choose, I'd have to go along with decimals as long as I need to use a calculator. But I would really miss fractions when I need to do math by hand (I don't always have a calculator) and when I want to do math exactly. I'm glad I don't have to make that choice. You'll be glad, too, if you learn to work with both decimals and fractions well.

So the answer is: Having more choices is a good thing, not a problem!

Which is more precise?

Picking up on that idea of exact results, we have this question from 1998:

Fraction or Decimal?

Dr. Math,

Please settle a bet... 

In general, which is more precise, a fraction or a decimal (for instance, 1/3 vs. 0.33)?

Thanks,
Jessica

The big question is going to be, “What do you mean by ‘precise’?”

Doctor Jerry was the first to reply:

Hi Jessica,

I'm guessing that you are thinking of the decimal representation of fractions. For example 1/4 = 0.25. In this case, both 1/4 and 0.25 have equal precision. 

I wrote 1/4 = 0.25 because these two things represent exactly the same number. The fraction 1/3, however, is different in that if you divide 1 by 3 you will get 0.3333333....  The threes never stop. If you divide 1 by 4, you get 0.25 and that's it. So, I can say that 1/3 = 0.3333.... (the dots mean that the 3 is repeated indefinitely)  but 1/3 is not equal to 0.33. In this case, 1/3 is more precise. In fact, 1/3 - 0.33 = 1/3 - 33/100 = 1/300, which is the error you would commit if you were to use 0.33 in place of 1/3.

So, for some numbers, we can be equally precise in both forms (though perhaps fractions can be more concise!); but for other numbers, it is impossible for decimals to be exactly what we want them to be.

Then I added some more detailed observations, giving two contrary answers:

Hi, Jessica -

This is a fascinating question, because it leads into some ideas worth thinking about.

Fractions are more precise …

My first answer is that fractions are unquestionably more precise, in at least two ways. First, any rational number can be exactly represented by a fraction (that's what a rational number is, in the first place), while most rational numbers can't be exactly represented by a decimal. 

Your example of 1/3 makes this very clear. It may take a huge numerator and denominator to represent some numbers, but even a simple little number like 1/3 can't be represented exactly by any number of decimal places (unless you use a notation to indicate a repeating decimal, in which case it is just as exact as a fraction). In fact, *most* rational numbers do not produce terminating decimals; only those whose denominators contain only factors of 2 and 5 can be represented exactly by a finite decimal. So fractions mean exactly what they say, while decimals are usually just approximations.

If “precise” means “always exactly what we mean to say”, then decimals definitely are not.
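That claim about denominators is easy to check: put the fraction in lowest terms, divide out every factor of 2 and 5 from the denominator, and see whether anything is left over. A minimal sketch in Python (the function name terminates is my own):

    from fractions import Fraction

    def terminates(frac):
        """True if the fraction has a terminating decimal expansion."""
        d = Fraction(frac).denominator       # lowest terms, so only the essential factors remain
        for p in (2, 5):
            while d % p == 0:
                d //= p
        return d == 1                        # any other factor left means a repeating decimal

    print(terminates(Fraction(1, 8)))    # True  -> 0.125
    print(terminates(Fraction(1, 3)))    # False -> 0.333...
    print(terminates(Fraction(7, 350)))  # True  -> 7/350 = 1/50 = 0.02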

Secondly, when you work with fractions, you don't lose any of that precision, as long as you are only adding, subtracting, multiplying, dividing, and taking (integer) powers. You have probably had the experience of doing a series of calculations on a calculator and finding that the answer was .99999998 when you expected 1.0; that's because a calculator can only store a limited number of decimal places, and calculations can increase the error caused by rounding until it becomes noticeable. With decimals, that is unavoidable, because you can never store all the digits; with fractions, it will only happen when the numerator or denominator gets too big to handle.

Fractions rarely have to be rounded, so they are more precise. (As we’ll see, though, that is mostly true only of school problems.)
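You can reproduce that calculator experience, and see how exact fractions avoid it, with a few lines of Python. Python's floats round in binary rather than decimal, but the effect is the same kind of accumulated rounding.

    from fractions import Fraction

    # Ten tenths, accumulated as ordinary (binary) floating-point numbers:
    total_float = sum([0.1] * 10)
    print(total_float)                     # 0.9999999999999999 -- rounding error creeps in

    # The same sum done with exact fractions:
    total_exact = sum([Fraction(1, 10)] * 10)
    print(total_exact)                     # 1 -- no precision is lost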

On the other hand, sometimes a number can be very precise, but not really accurate. How can that be? I can think of two cases where fractions are inaccurate. First, there is the mathematical problem of "real" numbers: not all numbers are rational. If you take the square root of 1/2, the result can't be represented by any fraction, so you would have to approximate it by some fraction, such as 29/41. Then your answer looks precise, but the precision is misleading, because it doesn't accurately represent the truth! In fact, since most real numbers are irrational, most numbers can't be represented accurately by a fraction!

In a sense, the problem with fractions is that they are more precise than they ought to be in real problems. A precise statement of what is really an approximation is a lie.

… no, decimals are more precise!

Now I started my second answer, continuing from that last thought:

Second, there is the scientific problem of "real" numbers: nothing we can measure in the real world is exact, so the precision of a fraction doesn't accurately represent our knowledge. If I measure something as 1/2 inch, it may really be 1001/2000 inch. Again, the precision of my fraction is misleading. I don't really know that it is exactly 1/2 inch; the fraction is just an approximation.

A benefit of decimals is that they provide an easy way to indicate how precise your measurement is. If I read the length off a ruler, I can say it's 0.5 inch; if I use a laser to measure it, I might say 0.50000 inch, because I know that my measurement was no more than 0.000005 away from the correct value. That way, the precision of my number reflects the accuracy of my measurement, and I am not implying more precision than I really have. To put it another way, decimals give me a way to control my level of precision, and in that way can be said to be more precise than fractions!

There is no equivalent of significant digits in the world of fractions.
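Python's decimal module is built around exactly this idea, if you want to experiment: a Decimal keeps the digits it was given, so 0.5 and 0.50000 are equal in value but announce different precision.

    from decimal import Decimal

    ruler = Decimal("0.5")        # measured with a ruler
    laser = Decimal("0.50000")    # measured with a laser

    print(ruler == laser)         # True  -- same value...
    print(ruler, laser)           # 0.5 0.50000 -- ...but different stated precision
    print(laser.as_tuple())       # the trailing zeros really are stored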

To sum this up: Fractions are technically more precise, but either one is only as accurate as you make it; both can be used either as an approximation or as an exact value. Working with rational numbers, a decimal will usually be only an approximation; but with real numbers (in either sense), you usually can do no better than an approximation anyway.

You should have known you wouldn't get a simple yes or no answer that would settle your bet. You'll have to decide which of you is right, based on how you are defining precision!

Most bets we are asked about end the same way.

Why use decimals?

Finally, we have this 2011 question from a teacher:

Decimals Versus Fractions: Pluses and Minuses

I have just started teaching middle school math. My students' questions routinely take the form of, "Why do they write it that way?" My answer is usually to equate "math shorthand" with the texting lingo of kids -- just a quicker way to write something.

When asked what they thought the hardest part of math is, many answered fractions and decimals. So when we talked about how a decimal represents the same thing as a fraction, they asked, "Why use decimals?" I really don't know that answer, unless it too is simply another shorter, quicker way to express quantities.

My thought is that .023 is easier, quicker, and neater than writing out 23 over 1,000. However, I don't want to pass that along if there is more to it than that. Is that why, or is there another reason?

I guess I'm searching for a deeper meaning of decimals and their use.

Doctor Ian answered this time. First, on the idea of “math shorthand”, applied to decimals as in Terri’s example:

That's a nice way to tie it to something they know.

But there's also more to it than just writing things more quickly. And sometimes the fraction is actually quicker to write.

For example, I'd rather write this ...

  1/7 

... than this ...
    ______
  0.142857

... wouldn't you?

As we’ve seen, sometimes fractions are more compact, and sometimes decimals are. Each has its place.

Addition

Next, on the advantages of decimals:

One way to get them to answer this for themselves is to ask them to do some basic operations using the two different notations. What they will find, I think, is that for the most part, decimals make it easier to add things:

   1/4  +  1/5 = 5/20 + 4/20
               = 9/20

   0.25 + 0.20 = 0.45 

But in some sense, that's just because decimals already have common denominators (or are nearly there if you just tack some zeros on the end):

     0.12   + 0.3456 
   = 0.1200 + 0.3456 
   = 0.4656

So if you have fractions with the same denominator, there's really no advantage. In fact, sometimes the fractions are easier, e.g.,

        1/7 + 2/7      = 3/7
  
   0.142857 + 0.285714 = ... I don't even want to add those!

This is what I like to do when a student asks, “Why do we have to do it this way?” or “Why can’t we do it that way?”: tell them to try it for themselves and find out. That’s good for teachers, too: we can be so accustomed to one way of doing things that we forget there are other ways.

Did you notice how each example above is contrived to make a particular point? Whether the denominators are common or not, and whether the decimals terminate or not, can make a huge difference. Some fractions are easy to add, and some decimals are easy to add; others are horrible.
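If adding sevenths by hand doesn’t appeal, you can run the “try it for yourself” experiment exactly in Python; here the decimals are the six-place approximations from the example above, just for contrast.

    from fractions import Fraction
    from decimal import Decimal

    # With a common denominator, the fraction sum is immediate and exact:
    print(Fraction(1, 7) + Fraction(2, 7))               # 3/7

    # The six-digit decimal approximations add up too, but only approximately:
    print(Decimal("0.142857") + Decimal("0.285714"))     # 0.428571, which is not quite 3/7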

Comparison

Using decimals can also make comparisons easier:

   3/5 < 4/7 ?         Uh, maybe.

   0.6 < 0.57 ?        Clearly not!

Did you notice that he didn’t use an exact decimal for 4/7? A little wisdom helps make the comparison easier than it could have been. On the other hand, given the fractions, there are easier ways to compare them than converting to decimals. (We’ll be getting to that later in this series.)
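One of those easier ways is cross-multiplication: since both denominators are positive, 3/5 and 4/7 compare the same way as 3 × 7 = 21 and 4 × 5 = 20. Here is a minimal sketch; the function name compare is just for illustration.

    def compare(a, b, c, d):
        """Compare a/b with c/d (b and d positive) without any division."""
        left, right = a * d, c * b           # cross-multiply onto the common denominator b*d
        if left < right:
            return f"{a}/{b} < {c}/{d}"
        if left > right:
            return f"{a}/{b} > {c}/{d}"
        return f"{a}/{b} = {c}/{d}"

    print(compare(3, 5, 4, 7))   # 3/5 > 4/7, because 21 > 20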

Multiplication

On the other hand, with fractions, multiplication is easy, while with decimals it's more work:

    3/4 * 2/3   = 6/12
                = 1/2

   0.75 * 0.667 = 0.50025

And fractions have the nice property that you can often cancel numerators and denominators, so you don't even have to multiply:

       /   /   /
   2   3   4   5   2   1
   - * - * - * - = - = -      
   3   4   5   6   6   3
   /   /   /

Compare that to the decimal alternative, 

  0.667 * 0.75 * 0.8 * 0.833 = 0.3333666

The fact that decimals get longer when you multiply makes this even worse.

Note, too, that in all these cases, the result from the fraction is exact, while the result with the decimal has to be rounded off somewhere... which may actually have adverse consequences in a later calculation.
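If you want to see both behaviors side by side, Python’s Fraction type reduces at every step, so the cancellation above happens automatically, while the rounded decimals drift; a quick sketch:

    from fractions import Fraction
    from functools import reduce
    import operator

    factors = [Fraction(2, 3), Fraction(3, 4), Fraction(4, 5), Fraction(5, 6)]
    print(reduce(operator.mul, factors))     # 1/3 -- exact, already in lowest terms

    approx = [0.667, 0.75, 0.8, 0.833]
    print(reduce(operator.mul, approx))      # roughly 0.33337 -- rounded, and drifting from 1/3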

The invention of calculators skewed things in favor of decimals, since they're usually easier to enter and to show on a simple display. But the tide is turning back as systems like Mathematica, which can do exact calculations, become more common.

Rounding is useful in part because it makes the work easier (as long as we don’t round so much that we introduce errors); and it is not as bad as it might otherwise be, because most calculations don’t need full precision in the first place.

So in some sense, the reason we have both fractions and decimals is the same reason we have different kinds of hammers, and different kinds of saws, and different kinds of screwdrivers.... Any notation is a kind of tool, and not every tool is the best for every job.

And that's a good lesson for your students to learn, because it's really quite rare in mathematics for there to be one best way to solve a problem. Part of the art of mathematics is in deciding which approach is likely to be best, before jumping in with the first one that occurs to you, or the one you habitually use.

Our students need a full toolkit, and the wisdom to select the right tool for the job.

Terri responded:

Yes, that helped tremendously! (That's why I love this site!)

I really like comparing the various representations to saws, hammers, and drills.... different tools for different jobs. That will help them so much.

My goal is to help these kiddos truly understand numbers, not just memorize steps and procedures. I think your analogy and examples will help with that a lot.

