Has anyone ever visited America, or lived there, who can tell me what makes the country a good place to be? I ask only because my wife is thinking of becoming a traveling nurse. We've read about a few places via Wikitravel, but getting information from someone who actually lives there would be much better. The key things we're looking for are food, safety, and things to do.