[p2p-hackers] paper "rarest first and choke algorithms are enough"
Arnaud.Legout at sophia.inria.fr
Thu May 4 10:32:00 UTC 2006
> My thoughts on the two issues Bryan brings:
> On fairness
> Both are saying something true. The point is whether the experiments
> carried out capture any situation that seeders are scarce.
Yes. You can see in the paper that there are torrents with zero or only one seed.
> But generally I don't think free-riders can be eliminated even with a
> byte-for-byte fairness heuristic. For example, free-riders contribute
> nothing, but if their download bandwidths are also small (compared
> with other leechers'), the fair share offered by seeders can well
> saturate that capacity. Also, peers joining without any information
> about the free-riders, as well as optimistic unchoking, inevitably feed
> the free-riders.  tries to replace optimistic unchoking by estimating
> the bandwidth between peers; how well does this work? I don't know
> whether any better approach can solve this problem.
Ashwin Bharambe et al. did a very nice study on BitTorrent. We agree
on several points, even though we used a different methodology
(simulations versus experiments). However, the results on fairness
(which justify the proposed policies) were obtained with the old choke
algorithm in seed state. With the new choke algorithm in seed state, I
do not believe these problems hold.
> The good thing is, BitTorrent introduces seeders to provide extra
> resources. And if we want to prove/disprove that the current algorithm
> is inefficient, esp. when uploading capacity is scarce, two things
> should be demonstrated:
> 1. whether peers can pick out the optimal remote downloaders in their
> neighborhood quickly.
> 2. If they can do 1, how much performance loss they suffer under the
> current optimistic unchoking algorithm; if they cannot, how far from
> optimal we expect the downloader selection to be.
You are right, but this is an extremely complex problem to model.
> On Network Coding
> I am not really familiar with network coding; the only example in P2P
> I have read is "Network Coding for Large Scale Content Distribution".
> But network coding often requires a global view or good heuristics on
> what to generate. In , generating highly needed (innovative, in that
> article's terms) blocks requires communication among the neighbors.
> Does anybody know of a real-world, large-scale P2P system that
> provides realistic data?
Network coding does not require a global view of the network: you
simply generate each new block of data as a random linear combination
of all the blocks you currently have. You choose the coefficients
randomly and send them along with the block; any receiver that collects
enough independent blocks can solve the resulting linear system to
recover the original data. I am simplifying a lot, of course, but this
is the idea.
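To make that concrete, here is a minimal sketch of random linear
coding over the prime field GF(257). The field choice, function names,
and tiny block sizes are mine, purely for illustration (real systems
typically work over GF(2^8) on large blocks), but the mechanism is the
one described above: coefficients are drawn at random and travel with
each coded block, and decoding is Gaussian elimination.

```python
import random

P = 257  # a small prime; all arithmetic is in the field GF(P)

def encode(blocks):
    """Produce one coded block: a random linear combination of all
    the blocks this peer currently holds. The random coefficients are
    returned alongside the payload, since the receiver needs them."""
    coeffs = [random.randrange(P) for _ in blocks]
    combo = [0] * len(blocks[0])
    for c, block in zip(coeffs, blocks):
        for i, x in enumerate(block):
            combo[i] = (combo[i] + c * x) % P
    return coeffs, combo

def decode(coded, n):
    """Recover the n original blocks from >= n coded blocks by
    Gaussian elimination over GF(P). With random coefficients, n
    coded blocks are linearly independent with high probability."""
    # Augmented matrix rows: [coefficients | payload]
    rows = [list(cs) + list(data) for cs, data in coded]
    for col in range(n):
        # Find a pivot row with a nonzero entry in this column.
        pivot = next(r for r in range(col, len(rows)) if rows[r][col] != 0)
        rows[col], rows[pivot] = rows[pivot], rows[col]
        inv = pow(rows[col][col], P - 2, P)  # modular inverse (Fermat)
        rows[col] = [(x * inv) % P for x in rows[col]]
        for r in range(len(rows)):
            if r != col and rows[r][col] != 0:
                f = rows[r][col]
                rows[r] = [(x - f * y) % P
                           for x, y in zip(rows[r], rows[col])]
    return [rows[i][n:] for i in range(n)]

# Usage: 2 original blocks, 3 coded blocks received from peers.
original = [[1, 2, 3], [4, 5, 6]]
coded = [encode(original) for _ in range(3)]
recovered = decode(coded, 2)  # recovers `original` with high probability
```

Note that no peer needs to know which blocks its neighbors are
missing: because the coefficients are random, each coded block is
innovative to a receiver with high probability, which is exactly why
no global view is needed.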
Thanks a lot for the comments,