[Fun task]: Add React tracing #1307
Comments
Hi @yurishkuro, is it up for grabs? Can I start on it?
@psk001 yes
Thanks @yurishkuro. I followed the mentioned article and was able to get traces using jaeger all-in-one. For the exporter, a URL is required to send traces to the collector:

    const collectorExporter = new OTLPTraceExporter({
      // url: "",
      headers: {},
    });

Without a given value it defaults to localhost. Is there a collector URL that can be added here?
All-in-one listens on port 4318 for OTLP over HTTP.
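For example, a minimal sketch (assuming the `@opentelemetry/exporter-trace-otlp-http` exporter used in the article) would point the exporter at all-in-one's OTLP/HTTP receiver:

```ts
// Minimal sketch, assuming the @opentelemetry/exporter-trace-otlp-http package
// from the article: point the exporter at all-in-one's OTLP/HTTP receiver.
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

const collectorExporter = new OTLPTraceExporter({
  // all-in-one's OTLP/HTTP receiver; /v1/traces is the standard OTLP traces path
  url: 'http://localhost:4318/v1/traces',
  headers: {},
});
```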
Is it up for grabs? :)
Yes, but also see a previous unsuccessful PR #1627
Yep, I read it through. Personally, I think the author got almost all the way through the implementation, with a few formatting/cleanup issues and one open issue around adding a way to disable the tracing.
@MAX-786 I think the main unresolved issue for me was how the traces from the UI should be exported. The UI communicates with jaeger-query via an internal API. jaeger-query itself generates traces internally and exports them somewhere (in the case of all-in-one it exports them back to itself over a local port). So my assumption was that the zero-configuration option would be for the UI to export traces to an internal endpoint in jaeger-query, where it can handle them the same way as the traces it generates itself. A non-zero-configuration option would be to make it the responsibility of the operator to provide an export URL for traces from the UI. It's certainly doable but makes the configuration much more complicated imo, as one has to potentially deal with additional CORS settings to allow the UI to talk to something else, etc.
To be fair, we could start with the non-zero configuration and at least merge the React PR to capture the traces, to divide the problem. Then we can see if we can add a new endpoint to jaeger-query to accept traces from the UI (we already have an implementation of an OTLP/HTTP endpoint, so it's easy to reuse).
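To make the "capture the traces" half concrete, here is a rough sketch of browser-side setup, assuming the OpenTelemetry JS web SDK packages from the linked Red Hat article (1.x-era APIs; names may differ in newer SDK versions). The endpoint argument is whatever the operator configures:

```ts
// Rough sketch of browser-side trace capture for the UI, assuming the
// OpenTelemetry JS web SDK packages from the linked article.
import { WebTracerProvider } from '@opentelemetry/sdk-trace-web';
import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';
import { Resource } from '@opentelemetry/resources';
import { registerInstrumentations } from '@opentelemetry/instrumentation';
import { FetchInstrumentation } from '@opentelemetry/instrumentation-fetch';

export function initTracing(otlpEndpoint: string): void {
  // Name the UI as its own service so its traces are distinguishable
  // from the ones jaeger-query generates internally.
  const provider = new WebTracerProvider({
    resource: new Resource({ 'service.name': 'jaeger-ui' }),
  });

  // Batch and export spans to the configured OTLP/HTTP endpoint.
  provider.addSpanProcessor(
    new BatchSpanProcessor(new OTLPTraceExporter({ url: otlpEndpoint })),
  );
  provider.register();

  // Trace the UI's fetch calls to the query-service API.
  registerInstrumentations({
    instrumentations: [new FetchInstrumentation()],
  });
}
```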
Requirement
Understand and visualize the behavior of Jaeger UI via traces
Problem
Currently we only start traces at the query service.
Proposal
Perhaps follow the suggestions in this blog post:
https://developers.redhat.com/articles/2023/03/22/how-enable-opentelemetry-traces-react-applications
or this one:
https://www.cncf.io/blog/2024/08/05/how-to-add-otel-instrumentation-to-a-react-native-app/
Open questions
How can we send trace data from the browser to the Jaeger backend? Should the query-service implement a dedicated endpoint where the frontend can report traces?
Jaeger UI is historically served by the `jaeger-query` binary, which is mostly a read-only service, but it does have internal tracing that can be configured via OTEL env vars to send the data to another service (by default it sends to localhost, which happens to work when running in all-in-one mode). When the UI part involves its own tracing, it needs to export the data. The question is where.

Option 1: to an external URL, e.g. a `jaeger-collector` running somewhere. This was the approach in the previous PR, where the endpoint was configurable and passed to the UI, except that it wasn't passed using the right mechanism (#1627 (comment)). This will work as long as the collector is set up with a correct CORS policy.

Option 2: the UI can export data back into the `query-service` via a dedicated endpoint, e.g. we could mount an OTLP receiver in the `query-service` as well. This avoids any cross-site issues but requires a lot more implementation to support the endpoint and then somehow export the data elsewhere.

I would recommend going with option 1. It just means we need a configuration option in the UI config where the user can specify the OTLP endpoint to export the data to.
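A hypothetical sketch of what the UI-side option could look like; the `tracing.otlpEndpoint` key is an assumption for illustration, not an existing Jaeger UI config field, and leaving it unset keeps tracing disabled, which also addresses the earlier question about being able to turn it off:

```ts
// Hypothetical sketch of option 1: a UI config field naming the OTLP/HTTP
// endpoint. The key name `tracing.otlpEndpoint` is made up for illustration.
import { WebTracerProvider } from '@opentelemetry/sdk-trace-web';
import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

interface UiConfig {
  tracing?: {
    otlpEndpoint?: string; // e.g. "https://collector.example.com:4318/v1/traces"
  };
}

export function maybeInitTracing(config: UiConfig): void {
  const endpoint = config.tracing?.otlpEndpoint;
  if (!endpoint) {
    return; // tracing stays off unless the operator opts in
  }
  const provider = new WebTracerProvider();
  provider.addSpanProcessor(
    new BatchSpanProcessor(new OTLPTraceExporter({ url: endpoint })),
  );
  provider.register();
}
```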