For web stuff in particular, using a comparatively low-bandwidth and/or high-latency connection can also be useful, since not everyone has 10+ Mbit connections with a 20ms ping to a Bay Area datacenter. On Linux, you can use 'tc' to simulate that.
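A minimal sketch with netem plus a tbf rate cap (eth0 and the exact numbers are placeholders — substitute your own interface and whatever profile you want to simulate):

    # add ~100ms of latency, then cap the rate at 384 Kbit via a child tbf qdisc (needs root)
    tc qdisc add dev eth0 root handle 1: netem delay 100ms
    tc qdisc add dev eth0 parent 1:1 handle 10: tbf rate 384kbit buffer 1600 limit 3000
    # restore normal behavior when you're done
    tc qdisc del dev eth0 root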
I had a 384 Kbit, ~100ms-latency crappy SDSL connection until recently (I was renting in the Santa Cruz mountains in a location with poor connection options), and it was pretty amazing how two sites that looked very similar from a fast connection would load very differently on the slow one, often for reasons not inherently connected to the site's needs (it's one thing if a page loads slowly because it's streaming video, quite another if it's because of a gigantic background-wallpaper image or unnecessarily serialized round trips).
If you're on FreeBSD you can use dummynet to simulate packet delay and loss.
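Something like this (assuming ipfw is loaded; the pipe number and values are arbitrary):

    # 384 Kbit/s pipe with 100ms propagation delay and a 1% packet loss rate
    ipfw pipe 1 config bw 384Kbit/s delay 100 plr 0.01
    # push all IP traffic through the pipe
    ipfw add pipe 1 ip from any to any
    # tear the pipe down when done
    ipfw pipe 1 delete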
It's quite handy. My partner has slow, unreliable, high-latency Internet at his house, and there is an entire class of performance problems that are extremely obvious when I work from his place but barely measurable when I work from a 100 Mbit line that's only a few milliseconds away from a major datacenter.
I second Sloppy. I've found it very useful for debugging Flash errors that only happen occasionally. By slowing the loading, they sometimes become reproducible.
It's also useful to see how a webpage loads on a slow connection - for example: is your copy legible before your background image has loaded?
My favorite (hah!) site that demonstrates this is Twitter.[1]
While tethering, it often takes upwards of two minutes to load a page. That seems just a little obscene when the entire useful content of the page is 10 lines of text.
(Hmm, maybe if I disable JS it'll fall back to a more sane approach?[2])
[1] Ok, now I feel simultaneously stupid/vindicated, since I posted this before reading the article... :)
I get that Twitter is the bee's knees and all, but it has some pretty serious problems for such a simple service. For example, the last time I tried visiting it, it only showed the top banner and kept redirecting back to itself. Google, perhaps at the opposite end of the complexity spectrum, loads before I can start typing my query, and when I do, results come straight away -- even on my shitty TalkTalk connection. But perhaps that's not a fair comparison, since this is Google after all.
On Solaris/Illumos you have Crossbow. With it you can simulate a whole range of different network connections and starve your programs of bandwidth in the most imaginative ways, to test things like saturating the links to different parts of the application.
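A rough flowadm sketch (the link name, port, and cap are placeholders; check flowadm(1M) for the details):

    # define a flow for HTTP traffic on the chosen link
    flowadm add-flow -l e1000g0 -a transport=tcp,local_port=80 httpflow
    # cap that flow at 384 Kbit/s
    flowadm set-flowprop -p maxbw=384K httpflow
    # remove the flow when done
    flowadm remove-flow httpflow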