I believe we as Americans can all agree that cancer is bad. I can't think of anything else we would all agree on.
There are many people who have benefited from cancer. It has given new hope to people who had taken life for granted. It has forged stronger bonds between friends and relatives. Some patients have met their soul mates in the hospital.
Cancer can be good.
Apologies for the dark humor. I do know people who believe that slaves' lives improved compared to their lives in Africa.
