ChazLB:

America's health care system has nothing to do with health and everything to do with MONEY, only money. The "health" part is just treating symptoms with drugs they can make money off of.

Fran:

Yes, and I've worked in the health care community, and things have really changed. There are no more local family doctors, and hospitals and doctors' practices are now corporately owned. I don't blame them, and I'm not defending America in this regard, just people, ordinary people.

ChazLB:

My old country doctor in LA got too old to practice. He was great, and he let me pay in cash for little things. Yeah, the problem is that mom-and-pop everything is evaporating. It makes me sad.

Fran:

I agree, a lot of things have changed, and not for the better. Those family doctors are gone now, and they don't even own their own practices. There were better times.
