Sunday, April 28, 2013

On Targeting Civilians

Targeting civilians is always a war crime or a crime against humanity. The leeway allowed for incidental civilian deaths is where the powerful hide their crimes.

The Boston Marathon attack, in addition to being an ordinary crime, was a war crime. It is fair to conclude now that the attackers were motivated by religion and anti-imperialist sentiment. One also seemed to have a personal grudge against America: as a non-citizen he was excluded from national-level boxing competition, and he was further aggrieved when his application for citizenship was denied. Had he been made a citizen, he would doubtless have gone back to boxing and done injury only to other volunteers for that brutal and uncivilized sport.

We cannot say the same (that the crime will be prosecuted) of policies made and carried out by powerful politicians, including our own.

President Barack Obama seems to think that if he minimizes his war crimes, crimes against humanity, and attacks on human rights, he is okay. He is just driving a little over the speed limit because his passengers - capitalists and the corporate security state - want him to get to a preconceived goal a little bit faster than the rules allow for.

So if some women and children get blasted to pieces by a drone or a helicopter gunship or trigger-happy U.S. gunmen on patrol somewhere, Barack figures he has his ass covered. I can hear him at the trial: "I never gave orders to purposefully kill civilians."

I imagine his predecessor Franklin D. Roosevelt would have said the same thing, had anyone been powerful enough to put him on trial. Before World War II, starting with the Japanese invasion of China, President Roosevelt inveighed against any bombing of urban areas. It always involved civilian casualties, and it was a war crime, he pointed out. At least it was a war crime when Japan or Germany or Italy did it. When our ally, the British Empire, did it, Roosevelt grew mute. Then he authorized the firebombing of Japanese cities, on a far larger scale than anything done up to that point. In hypocrisy, at least, the U.S. is Number One.

Or Barack could point to Harry Truman. His team of Democratic Party experts recommended dropping the first two atomic bombs on Japanese cities, after having rejected proposals for demonstration bombings that would have caused minimal casualties, or for strikes confined to military targets or personnel. You can almost hear Harry's defense lawyer: "But he did not order the Air Force to bomb civilians. There were some military personnel at Hiroshima and Nagasaki. We regret the civilian casualties."

It would take an encyclopedia to cover all of the war crimes and crimes against humanity since the inception of the United States of America.

So don't be surprised if foreigners take American indignation over civilian casualties in America with a grain of salt.

Let's get rid of the Democratic and Republican Parties. Let's try George W. Bush, Barack Obama, and other top U.S. decision makers for the War Crimes and Crimes Against Humanity they have committed.

Let's put an end to U.S. imperialism. Let's remove all U.S. troops from foreign territories.

Only then can there really be Liberty and Justice for All.

See also: war crimes

Friday, April 26, 2013

The Accounting System #5: Physical Money As Portable Accounting

Previous: #4: The Big Reversal

During the Age of Coin the normal relation of reality to symbols of reality was inverted for many people. This inversion permeates human culture even today, long after the Age of Coin (roughly 1400 to 1900) has faded into history.

Leaving aside human relations, possessions and their use are generally believed to hold value. Food, clothing, housing, tools, and land have obvious value. Coin, however, came to be the common denominator allowing people to think about the value of mixtures of types of possessions. A farmer could come up with a total numeric value of his land, cows, sheep, and money held in bank accounts and cash. An urban investor knew how much his stocks were worth, and the companies those shares represented could in turn be accounted for in terms of projected future profits and current assets that might include factories, inventories, and money in bank accounts, less what was owed on loans.
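
That common-denominator arithmetic is simple enough to sketch in code. In the minimal Python illustration below, every name and price is invented purely for illustration, not drawn from any real farm or portfolio:

```python
# A sketch of coin as common denominator: every kind of possession is
# converted into one currency figure, and debts are subtracted.
# All names and prices here are hypothetical illustrations.

def net_worth(assets, liabilities):
    """Sum asset values, subtract debts; return one number in currency units."""
    return sum(assets.values()) - sum(liabilities.values())

farmer_assets = {
    "land": 12_000.00,       # appraised value of the acreage
    "cows": 40 * 25.00,      # head count times a set value per cow
    "sheep": 120 * 4.00,     # head count times a set value per sheep
    "bank_account": 350.00,
    "cash": 75.00,
}
farmer_liabilities = {"mortgage": 2_500.00}

print(net_worth(farmer_assets, farmer_liabilities))  # 11405.0, one "worth" figure
```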

It was a convenient fiction that any citizen could turn all assets into a pile of coin or paper currency, stand naked beside the pile, and know his or her worth. Even more conveniently, one could come close to this by selling everything, putting the resulting money into a bank account, and looking at a single number on a bank statement.

In reality not everyone could convert all their assets to currency, or even into bank accounts, at once. There was not enough currency to meet demand if too many people wanted cash at any given time. The history of the United States of America, from colonial times until the creation of the Federal Reserve, was punctuated by proofs of this. Metallic coin, preferably silver or gold coin, was always in short supply, because at first it had to be imported. After about 1850, when large amounts of gold and silver were found and mined in the states, coin remained in short supply because the rest of the economy expanded at such a fast rate. The Free Silver (coinage) and Greenback political campaigns were reactions against the deadening effect on the economy of the mindset that only gold could be real money, or represent real value.

Going back to our naked ape sitting beside a pile of coin (having perhaps kept aside a knife or six-shooter to defend it with): except in cases of insanity, hunger or discomfort would pretty soon lead to spending down the pile. If no one was willing to trade food for gold, the pile was useless, and without value. We know from history that in times of famine scraps of food came to be "worth their weight in gold."

Another problem would occur if everyone tried to liquidate their assets at once: in that hypothetical situation, there would be no buyers. The value of truly valuable things drops to nothing if buyers cannot be found. This can be true even when there is plenty of money (coin, cash, account balances, or credit) available to buy the assets. Thus we have had a series of asset bubbles over the centuries, some of the most famous being the Tulip bubble, the South Sea bubble, the Florida land bubble, the Internet stock bubble, the housing bubble of 2005-2006, and (as of April 15, 2013) the Gold bubble. In a bubble money is traded for something believed to have real value, until all but the most foolish citizens can see that the money value exceeds the real value. Then, when buyers become scarce enough, bubbles burst. Tulips are just tulips again, not investments. (My favorite bubbles: the tropical fish bubble and the farmed mink bubble.)

Next: #6: Counting, Coins, and Crashes

[The Accounting System, Your Fate is in the Cloud, is a work in progress by William P. Meyers, ©2013]

Sunday, April 21, 2013

The Accounting System #4: The Big Reversal

Previous: #3: Bank Cycles of Credit and Gold

Decade by decade, The Accounting System grew in size and importance. Because some of the accounting by corporations listed on the American stock markets was found to be fraudulent in the aftermath (pun intended) of the 1929 crash, the Securities and Exchange Commission (SEC) undertook to establish public accounting standards. Over time these came to be issued by the Financial Accounting Standards Board. The resulting Generally Accepted Accounting Principles (GAAP) are used by all accountants in the United States. Similar systems are used in other nations.

When a statement about money diverges from GAAP, accountants refer to those figures as non-GAAP. Non-GAAP figures issued by publicly listed corporations during the Internet Stock Bubble led to the current era, in which any non-GAAP figures released by traded companies are expected to be issued with an explanation of their relationship to GAAP figures.
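
In practice such an explanation takes the form of a reconciliation: start from the GAAP figure and list the adjustments that produce the non-GAAP one. A minimal Python sketch, with hypothetical line items and amounts:

```python
# Sketch of a non-GAAP reconciliation: the non-GAAP profit figure is tied
# back to the GAAP figure by listing the adjustments between them.
# Line items and amounts are hypothetical illustrations.

gaap_net_income = 10_000_000

adjustments = {
    "stock-based compensation": 3_000_000,
    "amortization of acquired intangibles": 1_500_000,
    "restructuring charges": 500_000,
}

non_gaap_net_income = gaap_net_income + sum(adjustments.values())

print(f"GAAP net income:     ${gaap_net_income:,}")
for item, amount in adjustments.items():
    print(f"  add back {item}: ${amount:,}")
print(f"Non-GAAP net income: ${non_gaap_net_income:,}")  # $15,000,000
```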

The possession by individuals of gold coins and bullion (but not jewelry) was illegal in the U.S. from 1933 to 1974. Because there was inflation during this period, some economists claim the gold standard is necessary for a sound economy. Reasoning from ideology rather than from fact, they refuse to look at the record showing that using gold as money had been a historic disaster. Overall, 1933 to 1974 was a great run for the American economy, showing that before 1929 the gold standard had been holding the nation back rather than helping.

Credit cards as an idea go back at least as far as the novel Looking Backward by Edward Bellamy, first published in 1888. Credit cards in their modern form first became common in the 1960s, when perhaps 100 million were mailed, often unsolicited, to bank customers. While there are many other forms of credit, including business loans, loans on property, and student loans, the credit card and its relative the debit card put The Accounting System in the pockets of most Americans (and their international equivalents).

This led to a paradoxical role reversal. We do our accounting in units of coin (the dollar in the U.S., originally silver coins issued by the Dutch and then Spanish governments). For portability these units are mainly represented by paper currency, mostly in the form of $20 bills dispensed by ATMs. While banks count paper currency in their accounting systems, and businesses still account for such bills until they can be turned into real money (bank deposits), individuals seldom systematically account for the cash they carry or keep. Each note has a serial number, which the federal government records, largely as a defense against counterfeiting.

In effect cash is now at the periphery of The Accounting System. Almost all money in the United States, and even globally, now consists of entries in the accounting system. These entries are not even hard copies in paper ledgers. They are held in computer databases, as electronic representations of 1s and 0s. If something destroyed the banking side of the accounting system, they would simply disappear.

Thus units of value were once counted in coin, but now value is a direct relation between numbers of virtual accounting dollars and whatever real goods and services they can buy at any given time.

The Internet itself led to another major expansion of The Accounting System, and a consolidation of its power over people. The Internet tied computers together, including the computers used by banks, governments, corporations, and individuals for accounting. This led to pressure on individuals and businesses to use electronic fund transfers rather than physical checks for all deposits and withdrawals. Physical checks are now photographed rather than returned to their writers after cancellation. We are rapidly moving toward a situation in which it is difficult to use cash or even paper checks to pay for goods and services. Physical property is increasingly identified as well; even ordinary jewelry diamonds have small serial numbers burned into them.

All sorts of extensions of The Accounting System have come into existence. Medical records are being virtualized. Most forms of messaging via Internet go into permanent records. Almost every action made by any individual on the Internet is recorded, and those "personal profiles" are increasingly being exchanged by corporations and governments.

The Accounting System knows what your balance is, your debt level and credit worthiness, where you shop, what you like, what you own, and whether you can get permission to drive or fly on a commercial airline or cross an international border.

The Accounting System is not just counting and recording. It is an increasingly active system that determines the fates of individuals. It has traps for the unwary and free rides for the privileged. Parts of it are fair, but much of it is biased.

Next, the virtualization of money will be examined at a much higher level of detail.

Next: #5: Physical Money as Portable Accounting

[The Accounting System, Your Fate is in the Cloud, is a work in progress by William P. Meyers, ©2013]

Friday, April 19, 2013

The Accounting System #3: Bank Cycles of Credit and Gold

The banking system, which was at the heart of the accounting system, was also found wanting for a modern economy. This is because the banking system had three separate functions which did not sit well together. First, it became the core of the social accounting system: a man might claim to be rich, but if his bank checks bounced, people knew the reality was otherwise. Second, it allowed depositors to keep their money safe and perhaps earn a bit of interest. Third, to make a profit, bankers had to loan out the money of their depositors at an interest rate sufficient to keep those depositors happy, pay for the costs of operation, and still come out ahead. This third function was the credit system.
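
The third function rests on simple spread arithmetic, sketched below in Python. All figures are hypothetical, chosen only to show the shape of the calculation:

```python
# Sketch of the banker's basic spread: interest earned on loans must cover
# interest owed to depositors plus operating costs, with something left over.
# All figures are hypothetical illustrations.

deposits = 1_000_000
loan_rate = 0.07       # interest charged on loans
deposit_rate = 0.03    # interest paid to depositors
operating_costs = 25_000
loaned_fraction = 0.9  # portion of deposits loaned out; the rest is reserve

interest_earned = deposits * loaned_fraction * loan_rate  # 63,000
interest_owed = deposits * deposit_rate                   # 30,000
profit = interest_earned - interest_owed - operating_costs

print(f"Bank profit: ${profit:,.0f}")  # Bank profit: $8,000
```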

The credit system of the early 1800s in the United States is worth a close look because of the ways it resembled and differed from our present system. Individuals were more likely to be sources of credit, which typically took the form of "IOUs." Andrew Jackson, before he became President, is a well-documented example. Like most Americans of the era, he could seldom put his hands on much cash or coin, but he owned land, race horses, and slaves of considerable value. If he purchased goods he would write a personal note, an IOU, which he pledged to redeem at a later date, say when he received cash for his cotton crop.

A creditor holding the Jackson note might want to spend it before then, and would sign it over to another man, perhaps to buy a horse. If a man was considered to have good credit, as Jackson was, his notes might circulate for some time and be considered more sound than bank notes. If a note came due, and you took it to the Hermitage to demand payment, and Jackson still had no cash, he might try to pay you off with a slave, dog, horse, or perhaps some wine or whiskey. In effect Andrew would discount his own note, giving you, perhaps, a $250 race horse for a $300 note. As long as creditors were happy with the exchange, his credit remained good. Similarly, if you were another local slaver who wanted to buy some possession of Jackson's, and he was willing to sell, he would probably accept your IOU (which, along with the shortage of gold, was one reason most people had very little cash to play with).
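
The discount implied by that horse-for-note settlement is easy to work out. A minimal sketch, using only the hypothetical figures from the example above:

```python
# Discounting one's own IOU: settling a $300 note with a $250 race horse.
# Figures are the hypothetical ones from the example above.

note_face_value = 300.00   # what the note holder is owed
settlement_value = 250.00  # what the horse is actually worth

discount = note_face_value - settlement_value
discount_rate = discount / note_face_value

print(f"Discount: ${discount:.2f}, or {discount_rate:.1%} of face value")
# Discount: $50.00, or 16.7% of face value
```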

The primitive banks of that era were in a strangely similar position to private individuals like Jackson. A bank might actually own nothing but a license from the state (usually obtained by bribing legislators), but usually began with a little gold or silver coin. Banks would take deposits – hopefully coin, but also other banks' notes – and then would start making loans, which is to say, creating credit. They tried to avoid loaning out their coins, instead issuing their own bank notes. If too many people came in demanding that the bank notes be redeemed in coin, the coin would run out, depositors would demand their deposits, and the bank would fail. The more clever bankers liked to loan to people a goodly distance from home, so that their notes would circulate afar and be unlikely to be redeemed.

This system led to credit cycles of boom and bust. When people were confident in the banknotes and IOUs, a speculator could buy land and be confident of selling it in a year or two for more money. Sound familiar? During a boom farmers got good prices for their crops, and manufacturers had little trouble selling their wares to the farmers. When credit contracted, as it invariably did, no one wanted to take an IOU or banknote. With little gold to go around, commerce collapsed and people returned to bartering until time healed the wounds and another upward cycle began.

Eventually Americans got tired of this ridiculous system, taxed state banknotes out of existence, and tried various banking reserve systems, culminating in the Federal Reserve System. Only the federal government could issue paper money, and that was backed by gold or silver. But that system did not work for long either, as the Great Depression proved.

Next: #4: The Big Reversal

[The Accounting System, Your Fate is in the Cloud, is a work in progress by William P. Meyers, ©2013]

Thursday, April 18, 2013

The Accounting System #2: The Coin Age

Once upon a time there was no Accounting System except Nature itself.

We do not know exactly when in history humans began counting in earnest. We know that some animals have some ability to count, for instance to notice when a child has gone missing. We also know that most animals can see the difference between "more" and "less."

By the time of the early urban civilizations we know about (Egypt, Palestine and Mesopotamia, and China for example) counting and keeping records of counts was a well-established set of skills. Rulers wanted to know how many cows were in their herds, how many soldiers were available for battle, and whether subordinates had contributed their fair share of grain to the royal stores. Merchants needed to track their inventories, as did anyone who farmed on a large scale. This keeping of records of counts of things is the earliest manifestation of The Accounting System.

The ledger, or written record of counts, thus preceded what we now call cash and coin. Coin is generally treated as having inherent value, and for that reason tended over the millennia to standardize on three metals: gold, silver, and copper. Coin stands as a way station between bartering (directly trading one kind of good or service for another) and systemic accounting through modern record keeping.

People counted coins, and thus accounting and bookkeeping seemed to be the art of coin counting. This was the case with the improved accounting systems of early Renaissance Italy. It led to misconceptions, both popular and among professional accountants, which persist to this day.

Counting things other than coin did not go away, but accountants and ordinary people came to start measuring all things by their value in coin. To keep his books straight, a farmer might count his cows, multiply by a set value per cow, and account for the total as an asset in units of currency.

During the Coin Age (roughly 1400 to 1900) other aspects of the accounting system evolved and expanded. People, from peasants to kings, still needed to count their things. The most important thing to count was land, but that was more complicated than counting cows or shillings. It required a title system (the term likely evolved from titles such as duke, lord, sir, and mister), surveys, and a legal system as well. People who lived on land but held no legal title to it lost it over the centuries. This was particularly obvious in the Americas, where the natives were dispossessed of almost all of their tribal land. Today the land title system extends to every part of the world except Antarctica.

The human identity system also expanded during the Coin Age. We know that ancient kings took censuses of their subjects. Various forms of identity papers evolved, particularly in Europe. Passports and visas were required for travel. Birth certificates evolved from baptismal records into a pervasive system that came to account for most births. Place of birth was attached to nationality, and the various national identity systems, including Social Security numbers, driver's licenses, and death certificates in the United States, evolved into a system that accounts for each individual human, living and dead.

The rapid rise of industrialism, including the rise of industrial methods of agriculture, and parallel expansion of the global human population, put strains on the economic systems of the Coin Age.

In a throwback to the Platonic (or medieval Scholastic) habit of intellectual architectures that do not reflect reality, gold was declared by many people to be the only "real" money. This proved impractical to the point of economic disaster. At times the supply of gold did not grow as fast as the economy, leading to recessions and depressions. At other times the supply of gold from new discoveries grew rapidly, leading to inflation (it took increasing amounts of gold to buy other goods like cows, houses, and services).
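
One standard way to formalize this relationship, though it is not spelled out in this essay, is the classic quantity-of-money identity MV = PQ: the money stock times its velocity of circulation equals the price level times real output. A minimal Python sketch with invented numbers shows both failure modes:

```python
# The quantity-of-money identity MV = PQ, applied to a gold currency.
# M: stock of gold money, V: how often each coin changes hands per year,
# Q: real output of goods and services, P: the resulting price level.
# All numbers are hypothetical, chosen only to show the two failure modes.

def price_level(money, velocity, output):
    return money * velocity / output

# Gold stock frozen while the economy grows: prices must fall (deflation).
print(price_level(money=100, velocity=4, output=400))  # 1.0
print(price_level(money=100, velocity=4, output=500))  # 0.8

# New gold discoveries outpace output growth: prices rise (inflation).
print(price_level(money=150, velocity=4, output=500))  # 1.2
```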

Next: #3: Bank Cycles of Credit and Gold

[The Accounting System, Your Fate is in the Cloud, is a work in progress by William P. Meyers, ©2013]

Wednesday, April 17, 2013

The Accounting System. Introduction: Why You Should Care

Forget the stars. The Accounting System determines your fate. The Accounting System determines whether you are born with or without money and with or without opportunity. It determines your level of childhood mortality, your schooling, your pay if you work, and the size of your estate when you die.

Fortune favors those who are favored by the accounting system. Not surprisingly, those who built the system are those most favored by it.

Within The Accounting System individual people have choices they can make, but as each decade passes those choices narrow for most individuals. Because The Accounting System is so complex and pervasive, the naïve may not even realize it exists. People mistake one or more parts for the whole. They may think it is neutral, a mere method certain people use to keep score, when in fact it increasingly determines your score. Your score, in the simplest terms, is the money you have to spend, the things that The Accounting System lists as owned by you, and your reputation.

Unless you are lucky enough to be blessed by the System from birth, or to be selected as one of its favorites at some point in your life, it would be a good idea to learn all you can about The Accounting System. Your success or failure in life is highly dependent on how well you understand the System.

Learning is not just collecting facts. It requires assembling those facts into a mental picture that is an accurate reflection of reality. Because The Accounting System is large and complex, there are some steps that will help you learn about it. By stepping back from the system you should be able to see it in perspective. Then you should be able to see where you are in the system, and what paths are available to move to where you want to be in the system.

We will begin with the story of the development of accounting over history, a temporal perspective. That will bring up many subjects that will be developed in more detail in later chapters of this book.

Next: #2: The Coin Age

[The Accounting System, Your Fate is in the Cloud, is a work in progress by William P. Meyers, ©2013]

Sunday, April 7, 2013

American War Crimes Organizations and Nuremberg Article 9

Are you a member of the Democratic Party or the Republican Party? Then you had better hope that the military of the United States of America keeps the world at bay. That protection is not likely to last forever, given that the U.S. government will become bankrupt if interest rates on its debts ever rise above an average of about 5%.

There is a precedent from the Nuremberg trials that would allow Democrats and Republicans to be tried simply for being in those parties. For Crimes Against Peace, War Crimes, and Crimes Against Humanity, Article 9 of the Tribunal's charter clearly states:

Article 9.
At the trial of any individual member of any group or organization the Tribunal may declare (in connection with any act of which the individual may be convicted) that the group or organization of which the individual was a member was a criminal organization.

Thus it would take only one conviction of a Democratic Party or Republican Party official, living or dead, to make all party members criminals. As criminals they would lose their right to vote, which would allow political parties untainted by war crimes to gain the ascendancy, much as the Republican Party gained dominance after the Civil War by preventing Democratic Party leaders (who had mostly also been Confederate leaders) from voting and running for office.

The Crimes Against Peace, War Crimes, and Crimes Against Humanity of the United States Government, and hence of her leaders and citizens, loom large in history, and not just in the distant past. The subject is too big to compass in a short essay. But to illustrate, we need look no further than the events that provided the context for the Nuremberg trials of German political leaders, the losers of World War II.

[Photo: Defendants at the Nuremberg Trial, Franz von Papen standing]

First, let's touch on the arguments against the legality of the Nuremberg trials, their convictions, and their sentences. The U.S. Constitution (Article 1, Section 9) prohibits ex post facto laws, but the entire Nuremberg apparatus was created without the traditional legislation of statutes specifying clear punishments for clear crimes. The trial was conducted without a jury and with no appeals process, and the judges belonged to the injured parties. Despite war crimes being widespread during the period concerned, the charter of the tribunal specifically limits itself to prosecuting members of the Axis Powers (Germany, Italy, and Japan). And although the legal propriety of the Tribunal itself should have been the central question, Article 3 forecloses it: "Neither the Tribunal, its members nor their alternates can be challenged by the prosecution, or by the Defendants or their Counsel."

Nevertheless, the way of the Law, and particularly of Common Law, is that Nuremberg established a precedent. If other nations can hold a kangaroo court and prosecute (and execute) people for war crimes, then the United States of America can legally be subjected to such a court, should our defensive arsenal ever fail to prevent it. Or should we have the wisdom to constitute one ourselves.

After the first round of leading Nazi (National Socialist German Workers' Party) officials and government leaders were convicted and hanged until dead, the Nazi Party was declared a criminal organization by the tribunal. Some 8.5 million German citizens had been members, and many were soon tried in the denazification campaign, which was also ex post facto in nature.

The Nuremberg Trials were careful not to look too closely at how World War II actually started, repeating the carelessness of Allied historians about the timeline of events leading up to World War I. The judges refused to allow the defense to bring up that it was France that first declared war on Germany, or that the German invasion of Norway at the beginning of the war only managed to get there a few days ahead of a British invasion. The defense did have some small success, however, in pointing to certain British and American war tactics, notably unrestricted submarine warfare, to push back against some of the charges.

But, as is so often the case in the law, some mighty big fish got away because they were doing the prosecuting. The level of German bombing of British Empire targets in England was doubtless criminal, but nothing new, and nothing the British themselves had not done in 400 years of empire building. The unprecedented war crimes of World War II were Allied affairs: the complete destruction of cities by bombing, in particular firebombing, and eventually the atomic war waged against Japan.

Americans are so used to excusing their attacks on Japanese cities that they can seldom think clearly about war crimes. They excuse the attacks as military necessities: "Millions of U.S. soldiers would have died if we had been forced to invade Japan." But the U.S. was not being forced to invade Japan. The diplomatic history of the war is one of nearly continuous peace initiatives by the Japanese before, during, and right up to the end of the war, all spurned by President Franklin Roosevelt and his Secretary of State, Cordell Hull. [Before the U.S. officially entered the war, Roosevelt frequently decried the bombing of cities as a crime.]

If killing civilians can be excused, rather than treated as a war crime, because it saves the lives of combatants, then the entire Nuremberg prosecution was flawed. The defense lawyers could have argued that the Jews, foreign civilians, and German political prisoners who died in or were murdered in the concentration camps were impediments to military success, and that would have been true. The Blitz against London could be interpreted as legally justified because it would have saved German lives in the invasion of England, had it taken place.

The bombings of Hiroshima and Nagasaki were war crimes plain and simple. President Roosevelt had died, so President Harry Truman made the ultimate decision. His advisors were all members of the Democratic Party. Only General Douglas MacArthur, who was a Republican and no bleeding heart, argued against the use of the atomic bomb on cities filled with civilians. He believed, correctly, that it stained the honor of American soldiers.

I see no reason for there to be a statute of limitations on war crimes organizations. The Democratic Party and the Republican Party should be declared criminal organizations based on their past actions. Living leaders, mostly Presidents, who have committed war crimes, crimes against peace, or crimes against humanity should be prosecuted and punished appropriately.

The only real question is: what constitutes membership in these parties? Many more Germans voted for the Nazi Party in elections than actually belonged to it. Nazi Party membership was clear: one paid one's dues and received a membership card. In the U.S. the parties do not keep membership rolls. It is hard to say whether registering to vote in a party's primary actually constitutes becoming a member. Certainly donating money to a party should be held as the rough equivalent of paying membership dues to the Nazi Party.

I guess we'll have to sort that out when we set up our Peace and Justice Tribunal. I'd like to hold it in Philadelphia, to remind us that crimes against humanity are in the U.S. DNA, and that restitution has never been made for the wars against Native American tribes, or for slavery, indentured servitude, and the more subtle crimes of the ruling class of this nation.

Wednesday, April 3, 2013

Raise The Medicare Contribution Rate

If Medicare and Medicaid need more funding in the future, the logical and fair thing to do would be to raise the contribution rates. These raises could be small and gradual, say a quarter of a percentage point (0.25%) per year, until contributions come into line with projected future expenditures.

The Medicare payroll deduction is not a tax. It is a compulsory insurance premium. It is currently 2.9%. For wage slaves, the accounting of the 2.9% is that 1.45% is paid by the employer and 1.45% by the employee. (Self-employed people pay both halves.) In reality employers pay the full amount, of which half appears as a deduction when employees get the accounting for their paycheck. For the history of this arrangement see Medicare History.

I think it would be a good thing to account for any rate increase on the worker side. Most businesses are profitable, so most workers create more value than they are paid for, with the difference going to management's usually bloated salaries and the profits of the owners. Capitalist apologists (who complain endlessly on behalf of their suffering rich clients) say that increased taxes on business discourage hiring. Placing the increase on the worker side of the paycheck also raises awareness that this is a necessary expense, paying for health care in your old age.

How would paychecks change? The most commonly paid wage in America is the federal minimum wage, currently $7.25 per hour. Assuming a 40-hour work week, raw wages are then $290.00 per week. At the current 2.9% rate, employers turn over $8.41 per worker to the federal tax authority. Of that, $4.205 appears as a payroll deduction to the worker. Each 0.25% increase would deduct a further $0.725 per week.
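
To make the arithmetic concrete, here is a minimal Python sketch of the proposed schedule. The wage, rates, and 0.25% step come from this essay; the five-year horizon is an arbitrary illustration:

```python
# Sketch of the proposal: the Medicare deduction on the worker side, rising
# by 0.25 percentage points per year. Wage and rates are the essay's figures;
# the five-year horizon is a hypothetical illustration.

WEEKLY_WAGE = 7.25 * 40   # $290.00 at the federal minimum wage, 40 hours
WORKER_SHARE = 0.0145     # half of the current 2.9% Medicare rate
STEP = 0.0025             # the proposed 0.25% per-year increase

for year in range(5):
    rate = WORKER_SHARE + STEP * year
    deduction = WEEKLY_WAGE * rate
    print(f"Year {year}: {rate:.2%} of wages -> ${deduction:.3f}/week")
# Year 0 matches the essay's $4.205; each later year adds the essay's $0.725.
```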

That does not sound like much (unless $0.73 is the difference between making your rent and not), but over time it should add up to enough to keep the nation's Medicare budget balanced. It takes a lot of years of saving to prepare for almost-free medical care beyond the age of 65.

A different way to look at the problem is to admit that the medical profession has not come up with inexpensive ways to keep large numbers of people healthy once they are old. A person who never made more than $15,000 a year in their lifetime may cost the Medicare system over $100,000 a year for as long as they can be kept alive.

It almost seems as if the health system for seniors is mainly designed to extract the last penny from their pockets and from the Medicare system.

Everyone thinks medical costs need to be capped, but no one (except maybe me) wants to be accused of letting a senior die for lack of medical treatment, no matter how expensive.

The only events that could fix the system are catastrophes. A disease with high mortality that affects seniors more than working-age people would perform actuarial wonders.

We could take a lesson from the Roman Republic, upon which so much of our original system was based. Seniors could buck up and pledge to refuse treatments that do not prolong a healthy, mobile, and fully conscious life. We could make opium smoking available to seniors who can't stand pain, and let nature take its (inexpensive) course.

Unless we are willing to take a far harder line on capping expenses, we need to raise the Medicare contribution rate.