We are not out of the recession, but signs are improving. House sales are up, contractors I have talked to are starting to get more business, and the overall sentiment in America seems to be improving. What are your thoughts? I am curious what everyone perceives has truly changed. I don't think much has changed, other than the media finding they were not selling the 'recession' as well as they could sell 'swine flu,' so they have switched gears.
As you can tell, I am not a fan of the media, but I believe their lack of reporting on the economy is helping things, so I will say thanks to those journalists now chasing influenza across the globe!
Have a real estate question?
email it to email@example.com