Jan 09, 2009 11:09 PM GMT
Having spent about half of my life in or around the UK and half in the US (and having trained as an economist), I've been watching the debate shape up in the States over what to do about health care. In the past few days I've heard noises from lots of politicians about the features they imagine in a reformed health care system. To me, this seems like talking about the trees before you think about the forest.
When the UK opted for its National Health Service, it did so from the ideological premise that health care isn't like other goods (e.g. cars) that you should have only if you can afford them. The US takes this 'socialized' view of many other services (public education, fire services, police, etc), but not of medicine - at least not yet. Unless you are old or very poor, you must either pay for the medical treatment you get or pay for insurance to cover the expense for you.
So what do you think? Is medical treatment something everyone deserves without paying for it (or for medical insurance) directly out of pocket?