My Hubby posted about dental insurance today. We currently don't have any, which drives me nuts because I have genetically bad teeth, enjoy candy too much, need 2 crowns, and have braces. Unfortunately the dental insurance offered through his work is $70 a month, which is more than a regular cleaning costs, so we go without it. Luckily my braces were already paid for by our old insurance company, so I don't have to worry about those costs. But once I get my braces off, we'd better have good dental insurance, because I will be getting a thorough cleaning, probably a few fillings (I always need fillings), and those 2 crowns for past root canals that were never crowned. Crowns alone cost at least $400 each. Ugh.
Anyway, the whole point of this post, which I keep getting away from, is that I'm not too happy with the US healthcare system in the first place. Lorie posted yesterday that she needs to see a doctor urgently, but because they don't have insurance she can't go yet. It's just unfair that there are people in real need of care who can't get it because our healthcare system is all about money and not actually about helping people. I know that most other advanced countries have some sort of universal healthcare, and I've heard Canadian friends say how much they prefer their system to what the US offers.
What do you think about the healthcare here in the US? And if you don't live in the US, what kind of healthcare do you have and do you think it's better or worse?