Between 1997 and 2016, spending on direct-to-consumer healthcare advertising skyrocketed, from $2.1 billion to $9.6 billion. Where advertising once represented 12% of total medical marketing expenditures, it came to represent 32%. There was a time when reputable doctors wouldn’t dream of such flagrant self-promotion. As medicine became more professionalized throughout the nineteenth century, doctors began looking for ways to differentiate trustworthy practitioners from charlatans and quacks. Education requirements and licensing were the main methods of gatekeeping. Another, as one doctor notes in the Journal of Medical Ethics, was to prohibit advertising.
The American Medical Association’s original code of ethics, published in 1847, called any form of self-advertising “highly reprehensible” and “derogatory to the dignity of the profession.” A good doctor builds a reputation among his community. He shouldn’t need to stoop so low as to publish pamphlets, flyers, testimonials, or any other type of ad. In the United Kingdom, the Medical Act of 1858 established a general medical council with the authority to oversee standards and manage the newly created Medical Register, a list of all qualified practitioners. While the Medical Act didn’t explicitly ban advertisements at the time, practitioners believed self-promotion was improper.
Throughout the nineteenth century, “quackery was flagrant and brazen,” explains the medical historian James Harvey Young in the journal Pharmacy in History. “No disease, however dire, if one believed advertising, could resist the potency of the promoter’s product.” In this sense, the prohibition served a noble cause: saving unsuspecting patients from the many ill-trained or money-grubbing practitioners who hawked expensive, unproven cure-alls. From traveling salesmen to local apothecaries, the allure of a lucrative market in treatments, tinctures, and trinkets proved irresistible. But this tightening of the grip ended up putting a squeeze on many decent, qualified practitioners—especially women.
“Many perfectly reputable Georgian physicians [circa 1714-1837] engaged without a blush in commercial activities that would have resulted in their name being struck off the Register in the strait-laced world of late-Victorian professional medicine,” the medical historian Roy Porter wrote in his exhaustively researched book Health for Sale: Quackery in England, 1660-1850. “Making money out of nostrums and engaging in self-publicising, activities the Victorians deemed obnoxious to professional ethics, were regarded as marks of a good business nose—rational self-interest—by the Georgians.”
For the world’s first women doctors, advertising was a double-edged sword. Most hospitals and clinics refused to employ these women, so they were forced into private practice. If they chose to advertise their services, they risked being deemed unscrupulous or wholly unprofessional—accusations they may have faced anyway, simply because of their gender. But if they didn’t advertise at all, they risked not building up enough of a practice to pay their bills and maintain their business.
Case in Point: Elizabeth Blackwell
The first woman in America to earn a medical degree, Elizabeth Blackwell was repeatedly turned down when she applied to work at various medical establishments around New York City after graduating in 1849. She decided to open her own practice, and the first step toward a successful business was a good address. But the city’s landlords refused to rent to her.
At the time, “female physician” was how abortionists referred to themselves, though none had medical degrees. As medical professionals increasingly desired dominion over women’s reproductive health, these were the lay practitioners the AMA was hoping to squash.
Blackwell finally found a consulting room to rent with “an air of pleasant gloom about it.” The landlady worried about the nature of her practice, but instead of turning Blackwell away, she charged her an outrageous rent and often turned her patients away at the door. Blackwell herself refused to advertise, for fear of appearing unethical. Seeing her plight, Horace Greeley, the editor of the New York Tribune, ran a short article praising her achievements and puffing up her new practice. This helped boost her business, but Blackwell was still rationing food and coal.
She then devised an ingenious way of marketing her practice, one that would preserve her reputation as professional and respectable: She would give a lecture series. Attending lectures was a popular Victorian pastime. It was also, perhaps surprisingly, relatively socially acceptable for women to give lectures.
In her “Lectures on the Laws of Life,” given in a nearby church basement (and later edited and published), Blackwell extolled her philosophy of health: well-being could only be achieved when the needs of the body, mind, and soul were all being met. Alongside her common-sense suggestions for fresh air, good food, and intellectual stimulation came the radical ideas that girls should learn about their bodies and engage in boisterous, active play.
Blackwell’s lectures showed the public she was knowledgeable, professional, and personable—all good qualities in a Victorian doctor, but especially important for the only female MD in the country. The lectures offered a perfect opportunity to prove to people that she was not the monster they imagined her to be. These talks garnered a small but well-to-do audience of mostly Quaker families. Her promotion worked: many of them became patients and spread the word about her practice.
For other nineteenth-century women doctors, even “roundabout” methods of self-promotion were viewed with distaste. In the Lancet in 1871, a concerned citizen wrote in to lament an article in the London Echo newspaper wherein patient “Helen Field” described suffering terrible agony that doctor after doctor only made worse. At long last, it was a woman doctor who provided relief. Field offered to share the contact details of this incredible lady doctor with any Echo reader who wanted them. To the Lancet writer, this was a grotesque form of advertisement that any and all female doctors should eschew entirely if they wished to be considered true professionals:
Lady doctors, for some years to come, will incur considerable risk of being dragged into undesirable notoriety by the questionable laudation of persons of the Helen Field type … Ephemeral reputation would be dearly purchased at the cost of submission to degrading advertisements.
Clearly, not every doctor followed the rules. Another 1871 Lancet article, entitled “Unprofessional Advertising,” noted that physicians were still freely engaging in self-marketing, especially in the more provincial areas of Great Britain: “From all parts of the country we receive newspapers and cuttings from newspapers, containing the advertisements, more or less unprofessional, of local practitioners.” The medical periodical thanks its readers for sending in these ads, but admits its lack of power over the practice. “The only way to bring about the desired consummation,” the Lancet concludes, “is for the medical practitioners of the neighbourhood to combine and expose their peccant brothers in the manner they deem most effective.”
That there was such a thing as “unprofessional” advertising suggests the existence of “professional” advertising. According to Claire Brock of the University of Leicester, “cunningly constructed self-representation thus formed an integral part of nineteenth-century ‘doctoring,’ despite official distaste to the contrary.” In her article in the International Journal of Cultural Studies, Brock cites the example of Arthur Conan Doyle’s The Stark Munro Letters, which is actually a novelization of the author’s experience of establishing a medical practice in Plymouth, England.
“As a young practitioner, Stark Munro needs patients in order to form a successful practice and, of course, as a good doctor, he cannot advertise his services,” Brock explains. She continues:
His luck, when it comes, is in the form of what he labels “accidental cases,” of which he quickly takes advantage: “Two small accidents occurred near my door (it was a busy crossing), and though I got little enough from either of them, I ran down to the newspaper office on each occasion, and had the gratification of seeing in the evening edition that ‘the driver, though much shaken, is pronounced by Dr. Stark Munro, of Oakley Villa, to have suffered no serious injury.’ …Perhaps the fathers of the profession would shake their heads over such a proceeding in a little provincial journal; but I was never able to see that any of them were very averse from seeing their own names appended to the bulletin of some sick statesman in The Times.”
Not every medical practitioner advertising his wares or services was in it to get rich. Some sincerely believed they had discovered a treatment that could cure disease. Plenty of men who promoted their bizarre curatives had medical degrees or some scientific training at a university. Take, for example, Franz Mesmer’s animal magnetism, Elisha Perkins’ metallic tractor, James Graham’s celestial bed, and Robert James’ febrifuge powder.
The nineteenth century was one of great transformation when it came to pharmaceuticals. The early part of the century saw a combination of folk remedies and good old-fashioned trial and error. Many common medications were poisonous. (Flu? Constipation? Melancholy? Mercury tablets to the rescue!) And there was a general preference for “heroic” medicine: bloodletting, gastrointestinal cleansing, etc.
“First, the apothecaries spearheaded the drive to sell drugs. Chemists and druggists, whose numbers increased fourfold between 1865 and 1905—from something over 10,000 to more than 40,000—added to the momentum by selling drugs over the counter to meet demand generated by self-medication,” notes Roy Church, professor of history at the University of East Anglia, in the journal Medical History.
Throughout the 1800s, science aided the production of medications that actually worked. In their labs, chemists began to coax secrets from various botanicals that had been used for centuries as remedies. They discovered how to extract cocaine from coca leaves, salicylic acid (a precursor to aspirin) from willow bark, quinine from cinchona bark, digitalis from foxglove, and opium, morphine, codeine, and heroin from poppies. Medication would soon become the purview of pharmaceutical companies, and marketing became widespread. In 1900, patent medicines were the top category in national advertising spending.
“In the late nineteenth century professionalism and consumerism collided in a vociferous debate over the commodification of health,” the historian Lori Loeb remarked in Albion: A Quarterly Journal Concerned With British Studies. “[D]octors condemned ‘quackery,’ especially patent medicines—the Victorian appellation for over-the-counter drugs. They dismissed myriad pills, tonics and appliances as addictive, dangerous, or useless. This professional critique, doctors claimed, was an altruistic defence of patients.”
From one side of its mouth, the British Medical Journal harshly criticized patent medicines; from the other side, it ran ads for them. The American Medical Association, Loeb asserts, couldn’t help but point out the hypocrisy:
If the British Medical Association could not resist the advertising revenue derived from patent medicines, it was equally true that many doctors could not resist recommending patent medicines to patients. Far from epitomizing professional altruism, the patent medicine question demonstrates the reluctance of doctors to abandon individual self-interest in the wake of consumerist challenges that would ultimately transform twentieth-century medical practice.
During the Victorian era, physicians gradually transformed from a hit-and-miss group, one trained mainly through casual apprenticeships, into a respected class of educated, licensed, largely middle-class professionals. The establishment of medical associations, such as the British Medical Association in 1832 and the American Medical Association in 1847, aided this transition.
Even so, medicine wasn’t the notoriously lucrative career we know it to be today. Most physicians and surgeons at nineteenth-century hospitals were not paid. But those who could afford to give their time at such institutions did reap further rewards. “Although attending physicians were unpaid, their post was a source of considerable social status and monetary profit,” the historian John Harley Warner explains in the Journal of American History. Hospital posts were seen as prestigious; most attending physicians were allowed to teach on the wards so they could earn money from fee-paying students.
The “Degrading Stigma of Trade”?
When Elizabeth Blackwell established her own hospital in 1857—staffed by women, for women—resident physician Marie Zakrzewska reported regularly working 18-hour days. In addition to acting as a doctor, she was meal planner, shopper, and housekeeper. She spent her evenings hunched over a sewing machine while giving medical lectures to the student interns. On top of all that, she had to spend her afternoons going on home visits to her private patients. And this was the only time she was actually earning money.
Some practitioners were reduced to selling drugs for sixpence, which bore a “degrading stigma of trade,” as Church puts it. Still others had to serve societies or clubs to make ends meet, while “the élite among physicians fulfilled their role as consultants in the growing number of hospitals; those in London, for whom private patients continued to be a lucrative source, enjoying the highest status and incomes.”
It wasn’t until 1977 that the AMA amended its rules to allow some forms of advertising. Finally, in October 1979, the Federal Trade Commission ruled that the AMA could not bar doctors from advertising their services. In the U.K., it wasn’t until 1990 that the General Medical Council relaxed its advertising policy. For men and women doctors alike, concocting creative marketing opportunities is no longer part of the job.
This article was originally published on JSTOR Daily.
Contributor: Olivia Campbell / JSTOR Daily