The History of Medicare
Medicine and Medicare in the U.S.
The U.S. has had a Medicare program since the mid-1960s and has accepted, at least in theory, a responsibility to take care of the elderly, veterans, and the poor.
Improved standards of living have increased our life expectancy tremendously. In the 1700s, it was not unusual for people in this country to live only into their 30s. By 1900, average life expectancy was almost 50. Today, it is not unusual for people to live into their 80s and 90s.[1]
There have always been a few people who never encountered serious illness or accident and lived to old age, but most of the rest of us grow old thanks to nurturing environments and medical care. And our medical care is often thanks to our medical insurance.
Before the 20th century, hospitals were where people came to die. There was little effective medicine, and people spent only a small portion of their income on what there was. If they were fortunate, they might find a knowledgeable herbalist; if they were unlucky, they might buy a potion containing toxic ingredients like arsenic along with fillers.
The American Public Health Association started in 1873.[2] As Louis Pasteur, Joseph Lister, and others established the role of germs and cleanliness in the late 1800s, hospitals became cleaner, and by the early 1900s a patient’s chance of survival had grown.
Universities improved doctors’ training and introduced stricter licensing; scientists developed more effective medicines. All of this cost money, and the medical system, in turn, needed a way to pay for it.
For a deeper look, see Melissa Thomasson of Miami University’s article, Health Insurance in the United States.[3] She is one of the most knowledgeable researchers on the subject, and her articles provide an expert view of U.S. medical care and insurance.
Once hospitals were established, they needed to market themselves and bring in income. They began a movement to convince women to have babies in hospitals rather than at home with midwives. It worked.
Hospitals grew in popularity. Even so, their beds were often empty.[4]
The American Medical Association introduced increasingly tough standards for doctor training and licensure as the 1900s progressed. Other medical organizations proliferated, each introducing stricter standards for medical practice.
By the late 1920s, we could measure blood pressure and take X-rays, and we had the drugs insulin and Salvarsan. Over the next few decades came sulfa drugs, mass-produced penicillin, and a vaccine for polio.
Americans have discussed health insurance since the early 1900s. Teddy Roosevelt’s platform included health insurance in 1912. At the time, however, there was no individual or employer-based model for the concept.
The first medical insurance in the U.S. began at Baylor University Hospital in Dallas, Texas, in 1929. To get people to use the hospital, Baylor started a program for a group of public school teachers: each paid $6.00 a year into a group fund that entitled them to a fixed number of days of hospital care. This employer-centered model of health insurance became Blue Cross.
As time went on, single-hospital plans broadened into Blue Cross plans, in which subscribing hospitals held contracts with the plan as a non-profit mutual organization. Because it was not an insurance company, it faced less strict regulation.
Meanwhile, Blue Shield insurance sprang up in the 1930s as a way for doctors to keep control over their revenue without being co-opted into Blue Cross or the ever-dreaded compulsory government insurance plan. Blue Shield also gained non-profit designation and developed an employer-based insurance model.
World War II popularized the concept of employer-based health insurance. With wartime wage controls preventing employers from competing on pay, a labor shortage encouraged them to offer benefits to lure workers. Medical insurance and pensions became standard union benefits.
The I.R.S. also played a part in making employer-based health insurance increasingly attractive. In 1943, employer-paid health insurance was ruled tax free, and more tax advantages came in 1954.
President Truman asked Congress to create a national health insurance program in 1945. He did not succeed, but the topic has never vanished. Instead, by focusing first on the military and the elderly, and later on the disadvantaged, the government created a model for government-sponsored health insurance.
President Eisenhower continued the effort with the Dependents’ Medical Care Act of 1956, which provided services to those in the military and began paving the way for services for the elderly.
President Lyndon Johnson expanded the program in 1965, signing H.R. 6675 on July 30 to create Medicare. As time went on, the program grew even further, both in the number of people covered and in what it covered.
For a history of the Medicare program, see A Brief History of Medicare in America.[5]
President George W. Bush added Part D in 2003, and it went into effect in 2006. By then, many medical conditions that had been fatal in the past were being managed with medication. This was good news for personal longevity, but it meant that those without adequate health insurance found it hard or impossible to pay for medication. Prescriptions consumed an increasingly large percentage of older people’s income.
Under President Obama, the Affordable Care Act covered preventive services like colonoscopies and mammograms, as well as a yearly “wellness” visit.
Both Medicare and ACA health insurance plans now cover these services. For an overview of the relationship between the ACA and Medicare, please see ObamaCare Medicare: Obamacare and Medicare.
Only time will tell whether our next step is toward expanding coverage to reach “universal coverage.”