Standard of Care Kills
Segment # 367
The concept of “standard of care” has become a means by which hospitals, state medical boards, and government agencies control and restrict the care a doctor may provide to a patient. The past five years have shown how medical protocols that later proved wrong were imposed on doctors and patients. Doctors like Paul Marik and Peter McCullough risked everything to stay true to their oath to do no harm. Now that RFK Jr. has taken over HHS, more will be revealed about how corrupt and incompetent forces directed patient care. For the moment, let's see how all this started.
History of the Standard of Care
The history of the standard of care in medicine has evolved significantly over time, reflecting changes in medical knowledge, societal expectations, and legal frameworks.
Early Beginnings
In ancient times, medical liability was often based on theocratic conceptions and on the patient's social position. The Babylonian Code of Hammurabi (c. 1750 BC) contained some of the first legislative provisions for medical practice, including severe penalties for physicians who inadvertently injured patients.
19th Century Developments
Through most of the first half of the 19th century, medicine lacked a strong scientific foundation. The mid-19th century saw significant changes:
The American Medical Association (AMA) was founded in 1847, working to reform medical education and standardize medical practice.
Scientific breakthroughs, such as ether anesthesia (1846) and antisepsis (1867), began to legitimize medical authority.
Medical malpractice lawsuits, initially rare, became more common during the era of "marketplace professionalism".
The concept of medical negligence evolved as a genuine tort doctrine, based on the breach of a standard of care.
20th Century Advancements
The 20th century brought about significant developments in the standard of care:
The Liaison Committee on Medical Education (LCME) was founded in 1942, leading to the standardization of medical training and certification.
After World War II, increased funding for biomedical research and expanded third-party payments enhanced the power and prestige of medical specialties.
The Nuremberg Code (1947) established the principle of patient autonomy, requiring voluntary informed consent for medical experimentation.
In the 1980s, medical professional associations, specialty societies, and voluntary health organizations became more involved in developing standards of care.
Recent Developments
In recent years, the standard of care has continued to evolve:
The development of clinical practice guidelines has become pervasive, influencing the standard of care.
State legislatures have passed statutes defining the standard of care in their jurisdictions.
The concept of "duty to inform" has been established, requiring physicians to disclose material risks to patients.
Today, the standard of care is generally understood as the level at which an ordinary, prudent professional with the same training and experience, in good standing in the same or a similar community, would practice under the same or similar circumstances.
The concept of medical liability has ancient roots, dating back to some of the earliest civilizations:
Mesopotamia
By around 2400 BC, Babylon had become a center of medical development. As noted above, the Code of Hammurabi prescribed severe penalties for physicians who inadvertently injured patients: for example, if a doctor caused a patient to lose sight in an eye, the doctor's hands would be amputated.
Ancient Egypt
Egyptian texts also addressed medical liability. The sacred books of the ancient Egyptians stipulated that for the first three days of treatment the doctor bore no responsibility; after three days, the doctor assumed full responsibility for the patient's outcome. Notably, Egyptian law punished doctors with death if they did not follow the established rules of their science, regardless of the treatment outcome.
Ancient Greece and Rome
In Greece, medical liability could be severe. One account describes the execution by crucifixion of a physician named Glaucos during the time of Alexander the Great for prescribing the wrong medication.
In Rome, medical liability evolved over time. Early Roman law, as later recorded in the Pandects, stated that treatment failure alone was not grounds for liability. However, later laws specified that physicians should be punished for negligence and inexperience. By the year 573 from the founding of Rome (about 181 BC), conditions for establishing a doctor's liability for patient compensation had been formulated.
These early approaches to medical liability laid the groundwork for the development of more sophisticated legal frameworks in later centuries.