Here's an interesting article in the New York Times about eating better and why it's not synonymous with eating more organic food. The central point seems to be that if you want to eat healthier and also eat food that doesn't harm the environment, you're much better off shifting toward plant-based foods than eating everything organic. And it hasn't been proven that organic foods are more nutritious. I still like buying organic when I can, not necessarily because I think it's healthier, but because the quality is sometimes better.
I also try to eat local when I can. That's hard in the winter months, but I look forward to the farmers' markets opening up again soon. I think eating local whenever possible is one of the biggest things Americans can do to promote both health and the environment.