
[BUG] Race condition: OCI caching #580

Open
connorsmith256 opened this issue Mar 20, 2023 · 1 comment
Labels: bug (Something isn't working), help wanted (Extra attention is needed), pinned (Should not be removed as stale over time)

Comments

@connorsmith256
Contributor

Describe the bug

The host has a race condition in oci.rs: on a cache miss, the host downloads the provider archive bytes and then caches them to disk. If a subsequent request to start the same provider arrives while the host is still writing those bytes, the host counts it as a cache hit and returns the cache path. The host then fails to start the provider, since the bytes haven't been fully written to disk yet.
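For illustration, here is a minimal sketch of the check-then-write pattern described above; the names (cached_provider_path, pull_from_registry) are hypothetical stand-ins, not the actual oci.rs API:

```rust
use std::path::{Path, PathBuf};

// Hypothetical stand-in for the registry pull; not the real oci.rs helper.
async fn pull_from_registry(_oci_ref: &str) -> anyhow::Result<Vec<u8>> {
    unimplemented!()
}

// Sketch of the racy check-then-write pattern described above.
async fn cached_provider_path(oci_ref: &str, cache_path: &Path) -> anyhow::Result<PathBuf> {
    if cache_path.exists() {
        // A concurrent start request can take this branch while another task
        // is still writing the file below, and then read a truncated archive,
        // which surfaces as the "unexpected end of file" error.
        return Ok(cache_path.to_path_buf());
    }
    let bytes = pull_from_registry(oci_ref).await?;
    tokio::fs::write(cache_path, &bytes).await?; // the check above and this write are not atomic
    Ok(cache_path.to_path_buf())
}
```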

To Reproduce

Note: I haven't confirmed with 100% certainty that this is what's happening, but it seems like the most likely cause. To reproduce, issue two provider start commands with the same OCI reference and different link names concurrently. When the race condition is triggered, you'll see this error message from the host:

"msg":"Failed to start provider wasmcloud.azurecr.io/kvredis:0.19.0 (default: \"unexpected end of file\""

Expected behavior

The host should either block subsequent provider starts for the same OCI reference until the current request has been satisfied, or the internal NIF code should block on filesystem access to the cache path until the write has completed. A sketch of one possible approach follows.
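As one hedged example (not a proposal for the exact oci.rs change), writing the download to a temporary file and atomically renaming it into place would guarantee that the cache path only ever appears as a complete archive; pull_from_registry is the same hypothetical stub as in the sketch above:

```rust
use std::path::{Path, PathBuf};

// Same hypothetical registry-pull stub as in the sketch above.
async fn pull_from_registry(_oci_ref: &str) -> anyhow::Result<Vec<u8>> {
    unimplemented!()
}

// Possible fix: write to a temporary file, then rename it into place.
async fn cached_provider_path(oci_ref: &str, cache_path: &Path) -> anyhow::Result<PathBuf> {
    if cache_path.exists() {
        return Ok(cache_path.to_path_buf());
    }
    let bytes = pull_from_registry(oci_ref).await?;
    let tmp_path = cache_path.with_extension("download");
    tokio::fs::write(&tmp_path, &bytes).await?;
    // rename is atomic on the same filesystem: concurrent callers either miss
    // the cache (and download again) or see a fully written file, never a
    // partially written one.
    tokio::fs::rename(&tmp_path, cache_path).await?;
    Ok(cache_path.to_path_buf())
}
```

Pairing this with a per-reference async lock (e.g. a tokio::sync::Mutex keyed by OCI reference) would additionally avoid duplicate downloads and keep two tasks from writing the same temporary file at once.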

Environment

  • OS: Linux
  • Arch: aarch64
  • wasmCloud Version: 0.61.0
@connorsmith256 connorsmith256 added bug Something isn't working help wanted Extra attention is needed labels Mar 20, 2023
@stale

stale bot commented May 19, 2023

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. If this has been closed too eagerly, please feel free to tag a maintainer so we can keep working on the issue. Thank you for contributing to wasmCloud!

@stale stale bot added the stale label May 19, 2023
@connorsmith256 connorsmith256 added pinned Should not be removed as stale over time and removed stale labels May 19, 2023