For 'real' HDTV you're talking 19.4Mbps and upwards (apparently).
The above program is about 13.5Mbps.
HOWEVER there's a fair chance that that is in fact what it was broadcast at over cable: 2 x 13.5Mbps channels fit snugly into a single 64QAM 6MHz cable channel (roughly 27Mbps of payload).
The main concern for the US networks is making as efficient use of their bandwidth as possible while supplying HD-like quality. It'll probably be the same here, and I'd imagine that 19Mbps channels won't be that common: 2 of them will eat a 256QAM 6MHz channel (roughly 38.8Mbps) totally.
They do fit nicely into the 38Mbps or so that a 64QAM 8MHz channel supplies, BUT ntl will almost certainly be running 51Mbps 256QAM channels on the TV side (they already do on the VOD I believe), so 17Mbps streams to get 3 channels into that 51Mbps are more likely, if not 12.5Mbps to squeeze 4 in there.
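To make the packing arithmetic concrete, here's a quick sketch. The payload figures are approximate real-world values (exact rates depend on modulation and FEC overhead), and the per-stream bitrates are just the ones mentioned above:

```python
# Rough sketch of the channel-packing arithmetic above. Payload figures
# are approximate; exact rates depend on FEC overhead etc.

MUX_PAYLOAD_MBPS = {
    "64QAM 6MHz (US cable)": 27.0,
    "256QAM 6MHz (US cable)": 38.8,
    "64QAM 8MHz (DVB-C)": 38.0,
    "256QAM 8MHz (DVB-C)": 51.0,
}

def streams_per_mux(stream_mbps: float, mux_mbps: float) -> int:
    """How many fixed-rate streams of stream_mbps fit in one mux."""
    return int(mux_mbps // stream_mbps)

for mux, payload in MUX_PAYLOAD_MBPS.items():
    for rate in (19.4, 17.0, 13.5, 12.5):
        n = streams_per_mux(rate, payload)
        print(f"{mux}: {n} x {rate}Mbps, {payload - n * rate:.1f}Mbps spare")
```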
A big difference between HD and SD as far as *most* cable operators go is that with SD they can use UBR (unspecified bitrate), so they can do things like putting 10 channels into a 38Mbps mux and using statistical multiplexing, with each channel receiving variable bandwidth depending on how demanding its content is at any given moment.
While Eastenders is a demanding program to watch, its bitrate needs are relatively minor compared to live footie, for example. Doing it this way you assume that not every channel will need its full bandwidth all the time: just as you don't supply 100% of the bandwidth that every cable modem could use, you don't supply 100% of the bandwidth all the channels could need, on the grounds that they won't all peak at once. The issue, though, arises when the multiplex's combined demand exceeds its allocated bandwidth: when the news is on, for example, the sport report's stream will need more bandwidth and might tip the whole multiplex over its capacity. If ntl do use this it may explain why you see very occasional picture issues.
(BTW a multiplex is a series of TV channels combined into a single data stream; just as your cable modem receives a constant data stream and picks out the bits it needs, so does your set top box.)
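Here's a minimal sketch of that over-subscription gamble. All the demand figures are invented for illustration (real encoder behaviour, and how correlated the peaks are, is what actually decides this):

```python
import random

# Toy statistical-multiplexing model: 10 SD channels share a 38Mbps mux.
# Demand numbers are assumptions for illustration only.

MUX_CAPACITY_MBPS = 38.0
CHANNELS = 10

def sample_demand() -> float:
    """One channel's instantaneous bitrate need (Mbps): usually near
    average, occasionally spiking for demanding content like live sport."""
    if random.random() < 0.1:  # 10% of the time the content is demanding
        return random.uniform(4.0, 7.0)
    return random.uniform(2.0, 4.0)

TRIALS = 100_000
overloads = 0
for _ in range(TRIALS):
    total = sum(sample_demand() for _ in range(CHANNELS))
    if total > MUX_CAPACITY_MBPS:
        overloads += 1  # mux over capacity: someone's picture suffers

print(f"Mux over capacity in {100 * overloads / TRIALS:.2f}% of samples")
```

With these made-up numbers the mux only tips over capacity a few percent of the time, which is exactly the trade the operator is making: lots of extra channels in exchange for the odd glitch when too many streams peak at once.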
HDTV is far less tolerant of this, so it virtually requires CBR (constant bit rate). That 13.5 / 19.4Mbps has to be nailed up whether the content needs it or not, meaning a lot less efficient use of spectrum. No doubt this is something being tackled by Scientific Atlanta, Motorola et al in the drive for more efficient use of cable spectrum.
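Rough numbers on why nailed-up CBR hurts, using the figures above (the SD average is an assumption for illustration):

```python
# Back-of-envelope: spectrum cost of CBR HD vs stat-muxed SD.
# The SD average bitrate is an illustrative assumption, not an operator figure.

MUX_MBPS = 51.0          # 256QAM 8MHz DVB-C payload, as above

# CBR HD: the full peak rate is reserved permanently per channel.
HD_CBR_MBPS = 19.4
hd_channels = int(MUX_MBPS // HD_CBR_MBPS)          # 2 channels
hd_stranded = MUX_MBPS - hd_channels * HD_CBR_MBPS  # ~12.2Mbps unusable

# Stat-muxed SD: provision for roughly the average need, not the peak.
SD_AVG_MBPS = 3.8
sd_channels = int(MUX_MBPS // SD_AVG_MBPS)          # ~13 channels

print(f"CBR HD: {hd_channels} channels, {hd_stranded:.1f}Mbps stranded")
print(f"Stat-mux SD: ~{sd_channels} channels in the same mux")
```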
Anyway apologies I've just dragged it all hideously off topic with this