Considering the situation in the USA, I believe that when a government lies to its people about health care, Christians should reevaluate their participation in both the government and its plans.
"Should a Christian be involved in health insurance?" is a question few even bother to consider. Is government directed health insurance what God would want us to do in answering the call to care for each other? Can we look to the government as fulfilling our responsibilities when we know the government lies with impugnity?