This is an interesting article about the underlying application architecture of MySpace.
There are quite a few parallels with Trade Me. I recognise several of the problems described in the first few pages (covering growth up to 3 million customers):
- Managing session data across multiple web servers;
- Using caching to ease the load on the database;
- Partitioning the database when it’s too big/busy to live on a single server, with the corresponding issues around data replication;
- Implementing a storage area network (SAN);
- Bumping into I/O constraints in the database;
- The impossibility of realistic load testing.
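The caching point above is essentially the cache-aside pattern: check the cache first, fall back to the database on a miss, then populate the cache so later requests skip the database. Here's a minimal sketch in Python; the in-process dict and the `query_database` helper are stand-ins I've invented for illustration (a real setup would use a shared cache like memcached so every web server benefits):

```python
# Cache-aside sketch: the cache absorbs repeat reads so the
# database only sees each key once (until the entry is evicted).
cache = {}

def query_database(user_id):
    # Hypothetical stand-in for a real database call.
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    user = cache.get(user_id)
    if user is None:           # cache miss: hit the database once
        user = query_database(user_id)
        cache[user_id] = user  # subsequent requests are served from cache
    return user
```

The trade-off, which bites as you scale, is invalidation: once the data lives in two places, a write has to update or evict the cached copy, or users see stale pages.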
Like us, they have also recently migrated to .NET (in their case from ColdFusion). I previously wrote about our migration experience, if you’re interested.
Clearly, they’ve also had to deal with lots of problems we haven’t run into yet. I was talking to Scott Guthrie from Microsoft at TechEd in Auckland last year and was bragging (just a little!) about how we’d just clocked up 1 billion page impressions the previous month. He’d recently spent some time with the MySpace guys as part of their migration to .NET and told me that at that stage they were serving out 1 billion impressions per day! Ouch! :-)
So, there are probably some pointers here to the sorts of changes we'll need to consider as Trade Me continues to grow – for example, moving to more of a distributed approach to the database design.