At 10:27 p.m. 16/12/2008, Simon Blake wrote:
On Mon, Dec 15, 2008 at 06:56:19PM +1300, Nathan Ward said:
What applications are available for multicast? Can Flash do multicast? I think QuickTime and Windows Media can, yeah?
I don't know about QuickTime (I suspect it can) or Flash (I suspect it can't), but to multicast off a Windows Media server you move from Windows Server 2008 Standard to Windows Server 2008 Enterprise or Datacenter, with the attendant increase in cost. So for many smaller deployments you'd be balancing transit cost against server cost, and the transit cost may be lower.
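The transit-versus-server trade-off above comes down to simple arithmetic: unicast transit grows with the audience, multicast doesn't. A rough sketch, with entirely assumed figures for stream rate, audience size and transit pricing (none of these numbers come from the thread):

```python
# Back-of-envelope: unicast transit vs a single multicast stream.
# Stream rate, viewer count and $/Mbps are illustrative assumptions.

def unicast_mbps(viewers: int, stream_mbps: float) -> float:
    """Unicast sends one copy of the stream per concurrent viewer."""
    return viewers * stream_mbps

def multicast_mbps(stream_mbps: float) -> float:
    """Multicast sends one copy regardless of audience size."""
    return stream_mbps

viewers = 500           # concurrent viewers (assumed)
stream = 2.0            # Mbps per stream (assumed)
transit_per_mbps = 40   # $/Mbps/month (assumed)

saving = (unicast_mbps(viewers, stream) - multicast_mbps(stream)) * transit_per_mbps
print(f"unicast: {unicast_mbps(viewers, stream):.0f} Mbps")
print(f"monthly transit avoided by multicast: ${saving:,.0f}")
```

Whether that saving beats the Enterprise/Datacenter licence uplift is exactly the balance being described.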
Server 2003 does it as well; AKL Uni has quite a bit coming off its 2003 server. One of the big attractions of multicast is not having to have a server at all. On Internet2 there's a regular stream (sorry) of multicast trials using boxes like the Visionary Solutions, QVidium or VBrick units, which multicast directly from an appliance. These aren't too expensive, and Amino makes a range of very cute, very affordable set-top boxes. They also work at HD rates, around 10 Mbps for 1080i. The general idea has been to build something like IP cable TV, which is handy in hotels or if you're a cable operator.

Sadly, even on Internet2 there is variable reachability: each trial is followed by a burst of emails reporting reception around the USA. As a user of streaming deliveries, that suggests to me it's still too hard to make a living off.

But it does raise a point I was wondering about doing a talk on at NZNOG-09: big-ish networks. Back in '94 WCC did quite a bit of modelling around FTTH. We worked on what sort of network design would be required to deliver 100+ Mbps to 100,000 homes. Coming from an electricity background, it was interesting. I suspect that with the new Govt move to FTTH we should start a group discussion on some of the issues now, rather than waiting until things break. It might also be a good time to start large-ish use of IPv6.

So, a question: would there be interest in a talk on big-ish networks, even if all it does is trigger a wild group discussion that dissolves into beer drinking (to stay on topic)?
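To give a feel for the scale in that FTTH modelling: 100 Mbps to 100,000 homes, with the 10 Mbps 1080i figure above. The 30% concurrency figure below is an assumption for illustration, not from the modelling:

```python
# Rough aggregates for the FTTH scenario: 100,000 homes at 100 Mbps,
# with a 10 Mbps HD (1080i) channel. Concurrency is an assumption.

homes = 100_000
access_mbps = 100
hd_channel_mbps = 10

# Theoretical sum of all access links, in Gbps.
aggregate_gbps = homes * access_mbps / 1000
print(f"access aggregate: {aggregate_gbps:,.0f} Gbps")

# Suppose 30% of homes watch the same live HD channel at once.
concurrent = homes * 30 // 100
unicast_gbps = concurrent * hd_channel_mbps / 1000
print(f"unicast for {concurrent:,} viewers: {unicast_gbps:.0f} Gbps")
print(f"multicast for the same audience: {hd_channel_mbps / 1000} Gbps")
```

The gap between hundreds of Gbps of unicast and one 10 Mbps multicast stream is why live TV keeps dragging multicast back into these discussions.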
I think there's also a perception that multicast is a solution to a diminishing problem. Most of the major broadcasters seem to be acknowledging that the end of appointment TV (and radio) is nigh [1]. If live content is ~40% of your volume and dropping, and you still need some kind of non-multicast infrastructure to deal with the on demand (individually time shifted) content you're dishing up, then why bother with the effort of setting up multiple platforms?
For nearly two decades it's been known that on-demand viewing is the killer app. Live streaming is fine, but it's really only needed for synchronous events like election night, tennis, rowing, etc. On-demand is very easy on servers, can be squidified, and so becomes easy on networks.

Part of the work mentioned above was a distributed network of 1000 squid caches just for Wellington (it has to be distributed for bandwidth reasons). Interestingly, there are roughly 1000 electricity substations, most of which have enough space for a rack or half rack. They also have good power :-) - sorry. The substations are also in geographic locations that follow the load requirements (i.e. they are where the people are). So the issues aren't just network topology, but also geographic.

So: how does the group feel about managing large cache networks, measured in thousands of servers, geographically dispersed? What network topologies are viable? Is this of interest for NZNOG-09? I would also explain how the power companies in Wgtn and London do it (or did).

Richard
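For anyone picturing what one of those substation-sited caches might look like, a minimal squid.conf sketch follows. The hostnames, cache sizes and hierarchy are invented for illustration; only the directives themselves are standard squid configuration:

```
# Hypothetical squid.conf fragment for one substation cache.
# Names and sizes are made up; directives are standard squid.
http_port 3128
cache_mem 2048 MB
cache_dir ufs /var/spool/squid 200000 16 256

# Fetch misses via a regional parent rather than straight over transit:
cache_peer regional-parent.example.net parent 3128 3130

# Exchange hits with a neighbouring substation cache via ICP:
cache_peer substation-12.example.net sibling 3128 3130
```

With ~1000 of these, the interesting questions are the ones posed above: the parent/sibling topology between them, and who drives the vans.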