
feat: shard aware peer store pruning #2167

Merged 1 commit into master on Dec 7, 2023

Conversation

@SionoiS (Contributor) commented on Oct 27, 2023

Description

With the addition of sharded peer management, peer store pruning must anticipate sharding changes (keeping some peers for each shard just in case).

I went with removing peers from shards whose peer count is higher than the average. Doing this in a loop should eventually balance the number of peers per shard.

I also remove all unconnected peers that advertise no shards.

Let me know if another strategy would make more sense.
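
A minimal, self-contained sketch of that strategy (hypothetical: prunePeerStore, its parameters, and the helper types are illustrative and are not the actual nwaku PeerManager API):

# Hypothetical sketch of the pruning strategy described above; not the nwaku API.
import std/[tables, sets, sequtils]

type
  ShardId = uint16
  PeerId = string

proc prunePeerStore(
    shardPeers: Table[ShardId, seq[PeerId]],  # peers advertised per shard
    connected: HashSet[PeerId],               # currently connected peers
    noShardPeers: seq[PeerId],                # peers advertising no shard
    capacity: int                             # target peer store size
  ): HashSet[PeerId] =
  ## Returns the set of peers selected for removal.
  var peersToPrune = initHashSet[PeerId]()

  # Unconnected peers that advertise no shard are always pruned.
  for peer in noShardPeers:
    if peer notin connected:
      peersToPrune.incl(peer)

  # On shards holding more peers than the average, prune the excess,
  # stopping once enough peers are marked to get back under capacity.
  let totalPeers = toSeq(shardPeers.values).mapIt(it.len).foldl(a + b, 0)
  if shardPeers.len == 0 or totalPeers <= capacity:
    return peersToPrune

  let avg = totalPeers div shardPeers.len
  let pruningCount = totalPeers - capacity

  for peers in shardPeers.values:
    let excess = peers.len - avg
    if excess <= 0:
      continue
    for peer in peers[0 ..< excess]:
      if peersToPrune.len >= pruningCount:
        break
      peersToPrune.incl(peer)

  return peersToPrune

Repeating this on each pruning pass is what should, over time, even out the peer counts across shards.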

Changes

  • sharded peer store pruning

How to test

Simulations will be done. All unit tests already pass.

Tracking #1940

@SionoiS changed the title from "prune shards with many peers" to "feat: peer manager sharded peer store pruning" on Oct 27, 2023
github-actions bot commented on Oct 27, 2023

You can find the image built from this PR at

quay.io/wakuorg/nwaku-pr:2167

Built from 20a959d

@SionoiS changed the title from "feat: peer manager sharded peer store pruning" to "feat: shard aware peer store pruning" on Oct 30, 2023
@SionoiS force-pushed the feat--sharded-peer-store-pruning branch from 8033fe2 to b339c19 on October 30, 2023
@SionoiS marked this pull request as ready for review on October 30, 2023
@jm-clius (Contributor) left a comment


Roughly looks fine to me. I cannot immediately think of another simple mechanism to prune with shard load balancing. However, simulating this (and other connection/peer behaviour) becomes more and more important to really understand the impact of changes. For example, as I mention in the comment below, I think this approach makes us more likely to prune peers that support multiple shards, which should in fact be a reason for us to prefer keeping them.

Comment on lines +707 to +789
for peer in peers[0..count]:
if peersToPrune.len >= pruningCount:
break

peersToPrune.incl(peer)
@jm-clius (Contributor) commented on Oct 30, 2023

One thing to remember is that these peers more than likely also support other shards that we may be interested in, so removing them here also removes them from the other supported shards' slots. I'm not sure it's a good idea to make this more complicated in this PR, but a possible issue I foresee is that peers that support more shards (good!) are more likely to be pruned overall. Perhaps a future improvement is to sort the peers here by number of supported shards and prune first the peers that support fewer/no other shards?

@SionoiS (Contributor, Author) replied:


On the flip side, peers that support fewer shards might be more decentralized (not running in a data center on beefy hardware).

A peer that supports all shards == a central server (an exaggeration, but still). We have to be careful about what we incentivize.
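
A minimal sketch of the future improvement floated above (hypothetical: the proc and type names are illustrative, not the nwaku API). The idea is to sort a shard's pruning candidates by how many shards they support overall, so peers supporting the fewest shards are pruned first and multi-shard peers tend to be kept:

# Hypothetical sketch of the "sort by supported shard count before pruning" idea;
# not part of this PR and not the nwaku API.
import std/[tables, sets, algorithm]

type
  ShardId = uint16
  PeerId = string

proc shardCount(peer: PeerId, shardsByPeer: Table[PeerId, HashSet[ShardId]]): int =
  ## Number of shards this peer advertises (0 if unknown).
  if peer in shardsByPeer:
    result = shardsByPeer[peer].len

proc pruneCandidates(
    peersOnShard: seq[PeerId],
    shardsByPeer: Table[PeerId, HashSet[ShardId]],
    excess: int
  ): seq[PeerId] =
  ## Pick `excess` peers to prune from this shard, preferring peers
  ## that support the fewest shards overall.
  let candidates = peersOnShard.sortedByIt(shardCount(it, shardsByPeer))
  let take = max(0, min(excess, candidates.len))
  result = candidates[0 ..< take]

Whether keeping multi-shard peers is actually desirable is exactly the incentive question raised in the reply above.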

@alrevuelta (Contributor) left a comment

lgtm!

A review comment on waku/node/peer_manager/peer_manager.nim was marked as outdated and resolved.
@SionoiS force-pushed the feat--sharded-peer-store-pruning branch from b339c19 to cddf2a0 on October 31, 2023
@SionoiS self-assigned this on Oct 31, 2023
@SionoiS added the "E:1.4: Sharded peer management and discovery" label (see https://github.com/waku-org/pm/issues/67 for details) on Oct 31, 2023
@SionoiS (Contributor, Author) commented on Nov 22, 2023

Blocked by #2151

@SionoiS force-pushed the feat--sharded-peer-store-pruning branch from 0235987 to 2f15554 on December 7, 2023
@SionoiS merged commit 281c13a into master on Dec 7, 2023
9 of 10 checks passed
@SionoiS deleted the feat--sharded-peer-store-pruning branch on December 7, 2023