We know through retrospective testing and epidemiological review that the earliest confirmed American AIDS death occurred in 1969: a 15-year-old, sexually active young man in St. Louis, Mo., who had never traveled abroad.
And by the time AIDS was first recognized in 1981 (the virus itself would not be isolated and identified until 1983), it had already found its target communities: people who engage in anal sex, injection-drug users, breastfed infants, and individuals who received medically sanctioned human tissue (e.g., donated blood or transplanted organs).
During the early '80s in the States, a strong cultural shift had just occurred, from "free love" and "live and let live" to the conservative Christian movement led by Jerry Falwell.
The Moral Majority had become a major political force, helping to elect President Ronald Reagan. The policies that shaped America for the next eight years would also mold the nation's response to the arrival of AIDS, both in the lives of countless individuals and in the American lexicon.
Thanks to advances in the diagnosis and treatment of the virus, and to the political activism of the direct and indirect victims of HIV/AIDS, the virus is no longer a death sentence for the estimated 1.2 to 1.5 million Americans living with it, according to the Centers for Disease Control and Prevention (CDC).
But lest we forget, these figures, which seem so large, measure the prevalence of HIV/AIDS in 2011. Thirty years ago, the number was a small fraction of that.
Imagine an America that had only around 10,000 to 20,000 persons with HIV/AIDS, and a world that may have had about 70,000 to 100,000 total cases.
Couple that with the knowledge that a muted public health response would eventually lead to 2 million deaths annually. Would it have been morally acceptable to demonize particular populations?
We have all heard countless justifications for why some people "got it," without any level of sympathy. These sentiments are based on racism, sexism, homophobia, classism, and/or prejudice against addicts.
Typically, hearts bleed only for individuals considered blameless in their HIV status, such as newborns or hemophiliacs.
After 30 years of AIDS, we know what works and, more importantly, what does not work. We know that first and foremost, education is the greatest deterrent to infection (if one is HIV-negative) or infecting another person (if one is HIV-positive).
Furthermore, we have seen how readily available medication (antiretroviral drugs) blunts the impact of HIV/AIDS on individuals and communities. And we have seen effective public health initiatives, such as needle and syringe exchanges, save countless lives both domestically and internationally.
Moreover, after 30 years of AIDS, we know that our leaders choose when, how, and for whom interventions are made available. To the ultimate detriment of 20 million people each year, those interventions are often withheld, sometimes for lack of funding and sometimes because of normative cultural values that punish those most in need: the world's outcasts.
It is not naive to suggest that, had we known then what we know now, every possible barrier to the spread of HIV/AIDS would have been erected.
Yet even in 2011, Congress has taken actions that will erode headway in the fight against HIV/AIDS. Reinstating the federal ban on syringe-exchange funding, coupled with continued funding for abstinence-only education, trades proven-effective health policy for demonstrably ineffective measures.
As HIV public health advocates, our hope is that the correct actions are taken so that in a few decades, we do not look back and wonder why the tools we have today went unused. There is no viable excuse for knowing what we know and still failing to do the right thing.