Kafka PubSub consumer: invalid header field value error #3507

Closed
nzkeith opened this issue Aug 27, 2024 · 5 comments
Labels: kind/bug Something isn't working

Comments

nzkeith commented Aug 27, 2024

Expected Behavior

If a Kafka message header value isn't a valid HTTP header value, it is URL-encoded before the message is sent to the Dapr app over HTTP. Similarly for the header name.
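For illustration, a minimal sketch of the expected transformation, assuming Go's url.QueryEscape is the escaping function (the same helper used in the Azure Service Bus snippet further down this thread):

package main

import (
	"fmt"
	"net/url"
)

func main() {
	// A value containing a raw 0x02 byte is not a valid HTTP header field
	// value, but its URL-encoded form is.
	raw := "\x02"
	fmt.Println(url.QueryEscape(raw)) // prints "%02"
}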

Actual Behavior

If a Kafka message header value isn't a valid HTTP header value, the attempt to send the message to the Dapr app fails with the error net/http: invalid header field value. Similarly for the header name. Example log message:

level=warning msg="encountered a retriable error while publishing a subscribed message to topic my.topic, err: Post \"http://127.0.0.1:8080/dapr/events/my.topic\": net/http: invalid header field value for \"my-header\"" app_id=my-app instance=my-app-6dfd8f7887-96n8z scope=dapr.runtime.processor.pubsub.subscription type=log ver=1.14.1

Steps to Reproduce the Problem

  1. Publish a Kafka message containing a header with name my-header and value \x02, i.e. a single byte encoding the decimal value 2 (see the producer sketch after this list).
  2. Consume that message using the Apache Kafka PubSub component.
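A minimal sketch of step 1, assuming a Go producer built on the sarama library (broker address, topic, and payload are placeholders):

package main

import (
	"log"

	"github.com/IBM/sarama"
)

func main() {
	config := sarama.NewConfig()
	config.Producer.Return.Successes = true // required for SyncProducer

	producer, err := sarama.NewSyncProducer([]string{"localhost:9092"}, config)
	if err != nil {
		log.Fatal(err)
	}
	defer producer.Close()

	msg := &sarama.ProducerMessage{
		Topic: "my.topic",
		Value: sarama.StringEncoder("hello"),
		Headers: []sarama.RecordHeader{
			// Header value is a single 0x02 byte, which is not a valid
			// HTTP header field value.
			{Key: []byte("my-header"), Value: []byte{0x02}},
		},
	}
	if _, _, err := producer.SendMessage(msg); err != nil {
		log.Fatal(err)
	}
}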

Release Note

RELEASE NOTE: FIX Kafka PubSub consumer URL-encodes invalid HTTP header names/values

nzkeith added the kind/bug label Aug 27, 2024
nzkeith (Author) commented Aug 27, 2024

I see that the Azure Service Bus inbound binding URL-encodes HTTP metadata on delivery (code):

for key, val := range msg.ApplicationProperties {
	if stringVal, ok := val.(string); ok {
		// Escape the key and value to ensure they are valid URL query parameters.
		// This is necessary for them to be sent as HTTP Metadata.
		metadata[url.QueryEscape(key)] = url.QueryEscape(stringVal)
	}
}

I think we need something similar for Kafka PubSub.
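A rough sketch of what that could look like where the Kafka consumer builds metadata from record headers; the function and field names below are illustrative, not the actual components-contrib code:

import (
	"net/url"

	"github.com/IBM/sarama"
)

// escapeKafkaHeaders is a hypothetical helper: it URL-encodes record header
// keys and values so they are always valid HTTP header names/values when the
// runtime forwards the message to the subscribing app.
func escapeKafkaHeaders(headers []*sarama.RecordHeader) map[string]string {
	metadata := make(map[string]string, len(headers))
	for _, h := range headers {
		metadata[url.QueryEscape(string(h.Key))] = url.QueryEscape(string(h.Value))
	}
	return metadata
}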

yaron2 (Member) commented Aug 27, 2024

Good find @nzkeith, would you be interested in submitting a PR for this?

yaron2 added this to the v1.15 milestone Aug 27, 2024
passuied (Contributor) commented

> Good find @nzkeith, would you be interested in submitting a PR for this?

Yes @yaron2, we'll be contributing here... This is blocking one of our use cases (Flink is emitting binary headers, which blocks our ability to consume them from Dapr...).
Hopefully we can release this in an upcoming 1.14.x patch.

yaron2 (Member) commented Aug 28, 2024

Duplicate of #3507

yaron2 marked this as a duplicate of #3507 Aug 28, 2024
yaron2 closed this as completed Aug 28, 2024
nzkeith (Author) commented Aug 28, 2024

Duplicate of #3503
