I thought that with later versions of RDP it works somewhat like remote X, where a lot of the drawing context can be passed through and the rendering is done on the client's GPU instead of being rasterized and sent over the network? I've found RDP very snappy in later versions, with very crisp rendering even over relatively low-bandwidth connections (orders of magnitude better than VNC).
There's also RemoteApp, which allows you to run just one program remotely, but I haven't got much experience with it.
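For anyone who wants to poke at it: as far as I remember, RemoteApp is mostly driven by a few extra settings in an ordinary .rdp file, something along these lines (the host name here is made up, and the property names are from memory, so double-check them against what your server actually publishes):

    full address:s:terminal-server.example.com
    remoteapplicationmode:i:1
    remoteapplicationprogram:s:||notepad
    remoteapplicationname:s:Notepad

The "||notepad" alias has to match an application published on the server; the program's window then appears on the local desktop by itself instead of inside a full remote desktop.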
RDP got a lot faster by tapping in at different levels of the rendering engine, but those are mostly optimizations and don't change how the whole thing is meant to be viewed: it is a 'remote display protocol' that transports what is already visible on some remote screen or memory buffer to another computer for viewing. Shortcuts and optimizations don't really change that, whereas 'X' is at heart a client-server solution.
What does an optimization have to do with anything? So what if X does not have a native GPGPU driver; that's really more a function of hardware manufacturers' support than anything else. And if you are not too picky about all your stuff being 'open', then NVidia's X driver and CUDA on Linux co-exist just fine, with accelerated graphics and GPGPU support.
In a nutshell: RDP is a good way to access remote systems that are running a window manager of sorts; X is a good way to have remote clients connect to a local X server (the display).
RDP is more closely related to VNC and something like 'screen' than it is to X, which is more like a networked resource that you can access through remote clients.
X is an application sharing protocol, RDP is a screen sharing protocol; the two are radically different from each other, which gives each advantages and disadvantages the other lacks, plus a bunch of overlap. Most notably, with X, security was bolted on as an afterthought. RDP, which originally just read the screen contents and sent those over (compressed), does not allow for much interaction with what gets sent over; it is rather low level, whereas X sends over display primitives. Then again, X's initial - rather naive - implementation made a ton of assumptions, because the original implementors had nice fat workstations and fast networks to work with, and so the real-world utility of these features was rather limited.
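To make the client-server point concrete, here is a minimal Xlib sketch (just the textbook pattern, nothing authoritative): the program below is the 'client', the display it opens is whatever server DISPLAY names - the local screen or one reached over the network or an SSH-forwarded socket - and the drawing call at the end is exactly the kind of primitive that travels over the wire instead of pixels.

    /* Minimal X client sketch: this program is the client, the server is
       whatever DISPLAY points at (e.g. :0 locally, or a forwarded display). */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);   /* NULL = use $DISPLAY */
        if (!dpy) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        int screen = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                         10, 10, 300, 100, 1,
                                         BlackPixel(dpy, screen),
                                         WhitePixel(dpy, screen));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);            /* events come back from the server */
            if (ev.type == Expose) {
                /* A drawing primitive: a small protocol request, not a bitmap;
                   the server rasterizes it wherever the display really is. */
                XDrawString(dpy, win, DefaultGC(dpy, screen),
                            20, 50, "hello from an X client", 22);
            }
            if (ev.type == KeyPress)
                break;                       /* any key closes the window */
        }

        XCloseDisplay(dpy);
        return 0;
    }

Compile with cc demo.c -lX11, point DISPLAY at different servers, and the same binary draws locally or remotely without changing a line - that's the client-server model in practice.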
Anyway, you could write books about this and still not cover all the details, let alone the various implementations of both. But the gist of it is that, due to their history and intended application, the two are very different beasts, and X offers a versatility that not a whole lot of people need - but when you do need it, you need it badly.