Also, depending on how current your receiver is and whether it can do HDMI switching (let alone HDMI 1.3a), when the Blu-ray player does the decoding, you have to send the audio out as PCM over the multichannel analogue cables (which are fine, don't get me wrong). The problem that I perceive (and face) is that Blu-ray players do not adjust the sound field to compensate for room irregularities. That is, when a Blu-ray player is outputting the 5.1 signal (whether it be DTS-HD MA, TrueHD, DD+ or the standard DD or DTS), there is no chance to monkey with the parameters to adjust for speaker volume or distance, since (at least on my Yammy) those features are disabled when I use the multi-channel in section of the receiver.

At the end of the day, I think it will come down to a versatility issue. Having the receiver control the soundstage simply makes more sense than requiring each component to be tuned separately. I don't think the time has yet come for a person with a good receiver to upgrade simply for the satisfaction of seeing the little lights go on, telling you that you're receiving a TrueHD or DTS-HD MA signal.

In re: upscaling - the issue is really with legacy discs (read: standard DVDs). Assuming you're going to get rid of your standard DVD player when you purchase a Blu-ray unit, you'll want the new player to do a good job upconverting that 480i/480p signal to match your TV's native resolution. There's great debate on this topic. At the end of the day, the choice of where to do the upconverting (because it will happen somewhere in the chain, whether you like it or not) should depend on which device has the better scaling chips. So with a cheaper HDTV, one might defer to the player to do the upconverting; with a more expensive, higher-end HDTV, you might want to let the TV do all the scaling. Others might want the player to de-interlace and the TV to upscale. Again, it all depends on the chips in each device.