
For web stuff in particular, using a comparatively low-bandwidth and/or high-latency connection can also be useful, since not everyone has 10+ Mbit connections with a 20ms ping to a Bay Area datacenter. On Linux, you can use 'tc' to simulate that.
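Something like this, for example (a sketch; the interface name and numbers are illustrative, and netem's rate option needs a reasonably recent kernel):

    # add ~100ms of delay and cap bandwidth at 384 Kbit on eth0 (as root)
    tc qdisc add dev eth0 root netem delay 100ms rate 384kbit
    # remove the shaping when you're done
    tc qdisc del dev eth0 root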

I had a 384 Kbit, ~100ms-latency crappy SDSL connection until recently (I was renting in the Santa Cruz mountains, in a location with poor connection options), and it was pretty amazing how two sites that looked very similar on a fast connection would load very differently on the slow one, often for reasons not really inherent to the site's needs: loading slowly because it's streaming video is one thing; loading slowly because of a gigantic background-wallpaper image or unnecessarily serialized round trips is another.



If you're on FreeBSD you can use dummynet to simulate packet delay and loss.
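Something along these lines (a sketch; the numbers are illustrative, and you may need to load the module first):

    # shape all traffic to 384 Kbit/s with 100ms delay and 1% packet loss
    kldload dummynet
    ipfw add pipe 1 ip from any to any
    ipfw pipe 1 config bw 384Kbit/s delay 100ms plr 0.01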

It's quite handy. My partner has slow, unreliable, high-latency Internet at his house, and there's an entire class of performance problems that are extremely obvious when I work from his place but barely measurable when I work from a 100 Mbit line that's only a few milliseconds away from a major datacenter.


And if you are on OS X (and have Xcode installed), look into Developer->Applications->Utilities->Network Link Conditioner.


If you are in the UK you can simply choose Virgin as your broadband provider to achieve the same effect.


Or TalkTalk; I don't even get cable out here in the country...


Or if you’re on any OS with Java installed, try Sloppy: http://www.dallaway.com/sloppy/


I second Sloppy. I've found it very useful for debugging Flash errors that only happen occasionally; by slowing the loading, they sometimes become reproducible. It's also useful to see how a webpage loads on a slow connection - for example: is your copy legible before your background image has loaded?


Thanks! I didn't even realize that was there.


My favorite (hah!) site that demonstrates this is Twitter.[1]

While tethering, it often takes upwards of two minutes to load a page. That seems just a little obscene when the entire useful content of the page is 10 lines of text.

(Hmm, maybe if I disable JS it'll fall back to a more sane approach?[2])

[1] Ok, now I feel simultaneously stupid/vindicated, since I posted this before reading the article... :)

[2] Just tried this, it falls back to http://twitter.com/?_twitter_noscript=1, which, hilariously, still seems to rely on javascript. Good job!


I get that Twitter is the bee's knees and all, but it's got some pretty serious problems for such a simple service. For example, the last time I tried visiting it, it only showed the top banner and kept redirecting back to itself. And Google, perhaps at the opposite end of the complexity spectrum, loads before I can start typing my query. And when I do, results come straight away -- even on my shitty TalkTalk connection. But perhaps that's not a fair comparison, since this is Google after all.


On Solaris/Illumos you have Crossbow. With it you can simulate a whole lot of different kinds of network connections and starve your programs of bandwidth in the most imaginative ways, to test things like saturating the links to different parts of the application.
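If I remember right, the bandwidth cap is just a link property, e.g. (a sketch; the link name is illustrative, so check dladm(1M)):

    # create a virtual NIC over e1000g0 capped at 384 Kbit/s
    dladm create-vnic -l e1000g0 -p maxbw=384K vnic0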

BTW, didn't know about tc. Thanks for the tip.


On Windows you can use a testing/debugging HTTP proxy to simulate slow connections.


Oops... I forgot to include the NAME of said tool - Fiddler!
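There's a built-in Rules > Performance > Simulate Modem Speeds option, and for finer control you can set the documented "trickle" session flags from FiddlerScript (a sketch; the delay values are illustrative):

    // in OnBeforeRequest: milliseconds of delay per KB transferred
    oSession["request-trickle-delay"] = "300";  // uploads
    oSession["response-trickle-delay"] = "150"; // downloads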


"... For web stuff in particular, using a comparatively low-bandwidth and/or high-latency connection can also be useful, ..."

If you teamed low bandwidth with virtualised browsers of all flavours, this would make a pretty good testing service.


Fiddler can do the same for you on Windows.



