The ATI Radeon HD 4850 is a GPU -- a chip, not a whole graphics adapter. The chip is quite capable of HDTV resolutions, but graphics adapter manufacturers don't have to implement all of the available features. To drive an LCD HDTV monitor, you would want an adapter card with an HDMI output port that supports the native resolution of the display. The adapter card should be able to do 1920x1080p at 60 Hz. If the display can do 120 Hz, you should check whether the adapter can handle that too. Based on the specs for the 4850, the 400 MHz RAMDACs should have plenty of bandwidth for a 120 Hz HD display. If you ever want to play Blu-ray from the PC, you should also make sure everything in the chain supports HDCP.
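As a rough sanity check on that 400 MHz figure: the required pixel clock is approximately the total pixels per frame (including blanking intervals) times the refresh rate. The sketch below uses the standard 2200x1125 total timing for 1920x1080 -- an assumption about typical timings, not a spec pulled from any particular adapter:

```python
# Rough pixel-clock estimate: total pixels per frame (active + blanking)
# multiplied by refresh rate. 2200x1125 is the common full-frame timing
# for 1920x1080 video; actual adapter timings may differ slightly.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

clk_60 = pixel_clock_mhz(2200, 1125, 60)    # 1080p at 60 Hz
clk_120 = pixel_clock_mhz(2200, 1125, 120)  # 1080p at 120 Hz
print(clk_60, clk_120)  # 148.5 and 297.0 -- both comfortably under 400 MHz
```

So even 1080p at 120 Hz needs only about 297 MHz of pixel clock, which is why the 4850's 400 MHz figure looks like plenty of headroom.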
Though I have never hooked up a PC via HDMI, I would expect it to look stunning: it's a digital transmission at a very high resolution. As for cons, the only thing I can think of is that an application -- especially a game -- that tries to run full screen at an unsupported resolution might cause the display to lose sync and go blank. I would not expect this behavior from a new game, and even older ones often have a command-line switch to choose a different resolution or run in a window.
EDIT: In summary, check the specs on the specific graphics adapter you're considering, because we know the chip itself can handle all of this.