Welcome to calculus. I'm Professor Ghrist. We're about to begin Lecture 53, on absolute and conditional convergence. >> Infinite series either converge or diverge, right? Well, the real situation is a bit more complicated than that. In this lesson we'll focus attention on series that have both positive and negative terms, and we'll refine our notion of convergence into absolute and conditional. >> When it comes to determining convergence or divergence, we have at our disposal a collection of tests. Sometimes those tests work well. Sometimes, well, it's not so obvious how to proceed. Consider the following three series, all of which look very complicated. Following the rule of doing the ratio test first is maybe not such a good idea in this particular case. But I claim that these series are easy to figure out: the first two converge, the last diverges. Why is it so easy? It is easy because of this term in front, the negative one to the n, a term which, ironically, renders most of our prior tests useless. However, these are examples of alternating series: series whose terms alternate between positive and negative. Such series are very easy to work with. One often writes them in the form sum over n of negative one to the n times a sub n, where the a sub n terms are positive. Let's do one last test, the alternating series test. It begins as follows. If we have a decreasing positive sequence a sub n, then we're going to look at the limit of these terms, just as in the nth term test. The alternating series, sum over n of negative one to the n times a sub n, converges if and only if the limit is zero, and it diverges if and only if the limit is non-zero. This is in contradistinction to the nth term test, which was not an if-and-only-if. Now, the bad part of this test is that it's not very broadly applicable: you must have an alternating series with decreasing terms. However, this test is as easy as can be, and because of the ubiquity of alternating series it's really quite useful.
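The if-and-only-if dichotomy in the test is easy to see numerically. Here is a minimal Python sketch (the helper name `partial_sums` is mine, not from the lecture): with a sub n equal to one over n the partial sums settle down, while with a sub n equal to one plus one over n, decreasing but with limit one, they keep oscillating by roughly the limiting jump size.

```python
import math

def partial_sums(a, N):
    """Partial sums S_T = sum_{n=1}^{T} (-1)^n * a(n), for T = 1..N."""
    s, out = 0.0, []
    for n in range(1, N + 1):
        s += (-1) ** n * a(n)
        out.append(s)
    return out

# a_n = 1/n: positive, decreasing, limit 0 -> the alternating series
# converges (these partial sums settle down toward -log 2).
converging = partial_sums(lambda n: 1.0 / n, 100000)

# a_n = 1 + 1/n: positive, decreasing, but limit 1 -> consecutive partial
# sums keep differing by about 1, so the series does not converge.
oscillating = partial_sums(lambda n: 1.0 + 1.0 / n, 100000)
```

Note that decreasingness alone buys nothing; it is whether the jump size decays to zero that decides everything.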
How can it be that we get such a strong result for this test? Well, let us consider the convergence of an alternating series. Of course, we will look at the sequence of partial sums, that is, the sum as n goes from zero to some T of negative one to the n times a sub n. What does the convergence of this sequence look like? Well, if it is an alternating series, then that means we start at a zero, we jump backwards by the amount a one, then we jump forwards by the amount a two, backwards by the amount a three, etc. If, as per our hypotheses, the sequence of a sub n's is decreasing, that means our jumps get smaller and smaller. This means in the limit there are only two possibilities. If the limit of the a sub n's is zero, then our jump size decreases to zero and our series converges. If the limit of the a sub n's is non-zero, then we oscillate back and forth by that limiting jump amount; as per the nth term test, the series does not converge. Let's see how this test works in the context of the three series with which we began our lecture. All three of these are alternating series, and hence the alternating series test applies. The first is the sum of negative one to the n times pi to the n times log of n squared plus one, over n factorial times hyperbolic cosine of n to the five-thirds. Now, looking at this, it's pretty obvious that the n factorial term dominates all others. Hence the limit of the a sub n's is zero and this series converges; no other work necessary. Likewise, the second series, the sum of negative one to the n times one over log of log cubed of log to the fifth of n, also has terms that are going to zero, obviously; hence it converges without further work. The last example involves the cube root of n to the fifth minus three n cubed plus two, over the square root of n cubed plus nine n squared plus one. Well, that's not half so complicated as it looks.
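As a sanity check on those claims, one can just evaluate the terms (my transcription of the series; the helper names `a_first` and `a_third` are mine). The first series' terms collapse to zero almost immediately, because n factorial and the hyperbolic cosine crush everything else; the third series' terms behave like n to the one-sixth and grow without bound.

```python
import math

def a_first(n):
    # pi^n * log(n^2 + 1) / (n! * cosh(n^(5/3)))
    return (math.pi ** n * math.log(n ** 2 + 1)
            / (math.factorial(n) * math.cosh(n ** (5 / 3))))

def a_third(n):
    # cube root of (n^5 - 3n^3 + 2), over sqrt(n^3 + 9n^2 + 1);
    # to leading order this is n^(5/3) / n^(3/2) = n^(1/6) -> infinity
    return (n ** 5 - 3 * n ** 3 + 2) ** (1 / 3) / math.sqrt(n ** 3 + 9 * n ** 2 + 1)
```

So the first passes the alternating series test trivially, while the third fails the nth term test outright.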
We could have easily computed the leading-order terms of both the numerator and the denominator and discerned through the nth term test that this diverges. This is true whether or not it's an alternating series. In all these cases, it's pretty simple. The contemplation of alternating series leads us to new definitions involving general series. We say that a series is absolutely convergent if not only does the sum of the a sub n's converge, but the sum of the absolute values of the a sub n's also converges. We say that a series is conditionally convergent if it is a convergent series, that is, the sum of the a sub n's converges, but the sum of the absolute values of the a sub n's diverges. Of course, a series that has only positive terms can't be conditionally convergent. A conditionally convergent series is one that has enough negative terms, or maybe alternating terms, so as to force a weaker sort of convergence. Both of these are in distinction to a divergent series, which does not converge at all. With these definitions there are exactly three possibilities for a series. If you're given the sequence, then the series, the sum of the a sub n's, might diverge. If it converges, one checks the convergence of the sum of the absolute values. If that does not converge, then you have a conditionally convergent series; otherwise, you have an absolutely convergent series. These three are mutually exclusive: every series is one of these three types and only one. Let's see some examples. If we take the sum from one to infinity of negative one to the n plus one times one over n squared, this is one minus a fourth, plus a ninth, minus a sixteenth, etc. What can we say about this? Well, it definitely converges based on the alternating series test. When we take the absolute values, what do we get? We get a p-series with p equals two. That also converges. Therefore, this is an example of an absolutely convergent series.
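Both halves of that absolute-convergence check can be watched numerically; a quick sketch (the closed forms pi squared over twelve and pi squared over six are the well-known values of these sums, not something proved in this lecture):

```python
import math

N = 200_000
# signed series: 1 - 1/4 + 1/9 - 1/16 + ...  (known value: pi^2 / 12)
signed = sum((-1) ** (n + 1) / n ** 2 for n in range(1, N + 1))
# absolute values: the p-series with p = 2    (known value: pi^2 / 6)
absolute = sum(1.0 / n ** 2 for n in range(1, N + 1))
```

Both partial sums stabilize, which is exactly what absolute convergence looks like in practice.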
On the other hand, if we take what we might call the alternating harmonic series, negative one to the n plus one over n, then by the alternating series test this does converge. However, when we take the absolute values of the terms, we get the harmonic series, which diverges. Therefore, this is the canonical example of a conditionally convergent series. If we take the sum of negative one to the n, well, we've already seen that that is a divergent series, and taking away the negative signs is not going to change that, not at all. In general, if we look at an alternating p-series, where we take one over n to the p and have the signs alternate, then instead of having convergence for p bigger than one and divergence for p less than or equal to one, in this alternating scenario we have absolute convergence for p strictly bigger than one and conditional convergence for p positive and less than or equal to one. For the remainder of this course, we're going to go back and reconsider all of the convergences that we have seen or implied and revisit them in light of our new definitions. Recall, from the very beginning of this class, how we showed that log of one plus x is the alternating sum of x to the n over n. Now, you remember and I remember that this only works when x is strictly less than one in absolute value, just like the geometric series. However, I claim that we can bend the rules just a little bit. What happens when we substitute x equals one into this formula? We seem to get that log of two is one minus a half, plus a third, minus a fourth, plus a fifth, etc. Is this true? In the past, I've asked you to trust me. But now you don't need me, because you have enough tools at your disposal. The series on the right does converge by the alternating series test. However, since it is the alternating harmonic series, its convergence is conditional. It limits to the value on the left, log of two. Now, recall that this series is a little bit tricky.
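That contrast between the alternating harmonic series and the harmonic series is easy to witness directly; a small sketch (helper names mine): the signed partial sums home in on log of two, while the unsigned ones creep off to infinity like log of N.

```python
import math

def alt_harmonic(N):
    """Partial sum of 1 - 1/2 + 1/3 - 1/4 + ... out to N terms."""
    return sum((-1) ** (n + 1) / n for n in range(1, N + 1))

def harmonic(N):
    """Partial sum of the harmonic series 1 + 1/2 + 1/3 + ..."""
    return sum(1.0 / n for n in range(1, N + 1))
```

The alternating series remainder bound even tells you how close you are: the error after N terms is at most the next term, one over N plus one.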
We have argued in the past that by rearranging the terms and combining in the appropriate manner, we seem to get something that is a different value, namely, three halves log two, simply by rearranging the terms in the original series. That was presented as a mystery; here it is presented as a warning. The real reason for that strange behavior comes from the conditional convergence of this series. Whenever you have a conditionally convergent series, you should be cautious. It is a dangerous situation: you are so close to having a divergent series, you've got to be careful. On the other hand, when you have an absolutely convergent series, you're in great shape. It is a theorem that you can rearrange the terms in an absolutely convergent series at will, and the sum remains constant. Absolutely convergent series are very robust, and will be very helpful when you're working with Taylor series. On the other hand, this is not the case for conditionally convergent series. It is a result whose proof will not fit in this margin that, given a conditionally convergent series, you can rearrange the terms to sum up to any number you wish. Conditionally convergent series are a bit dangerous; beware of them, but trust in absolute convergence. >> The distinction between absolute and conditional convergence may seem a little academic. After all, what does it matter to applications in the sciences? Oh, it makes a big difference, as we'll see in our next two lessons, which will take us back to the very beginnings of this course.
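The proof idea behind that margin-defying rearrangement result can at least be sketched in code, under the usual greedy strategy: since the positive terms of the alternating harmonic series alone sum to infinity, and the negative terms alone to minus infinity, you can take positive terms until you pass any target, then negative terms until you drop below it, and repeat; the terms shrink to zero, so the partial sums squeeze onto the target. (The function name `rearrange` and the step count are mine, purely for illustration.)

```python
import math

def rearrange(target, steps=100000):
    """Greedily rearrange 1 - 1/2 + 1/3 - 1/4 + ... to steer its partial
    sums toward `target`: take unused positive terms 1/1, 1/3, 1/5, ...
    while at or below the target, unused negative terms -1/2, -1/4, ...
    while above it.  (Riemann's rearrangement idea.)"""
    pos, neg, s = 1, 2, 0.0
    for _ in range(steps):
        if s <= target:
            s += 1.0 / pos   # next unused positive term
            pos += 2
        else:
            s -= 1.0 / neg   # next unused negative term
            neg += 2
    return s
```

Steering toward three halves log two reproduces the mysterious rearranged value from earlier in the course, and any other target works just as well; only conditional convergence makes this possible.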