Trailer headers #1652

Open
fafhrd91 opened this issue Feb 17, 2017 · 8 comments

@fafhrd91 (Member)

At the moment aiohttp drops Trailer headers. Should we provide an access API?

@AraHaan (Contributor) commented Feb 17, 2017

I do not see a reason not to do that.

@fafhrd91 (Member, Author)

The reason is how to design the API.

@kxepal (Member) commented Feb 17, 2017

That would be a great feature, though it also doesn't have very wide support.

As for the API, I'm thinking of a response.trailers property which acts just like headers, but is only accessible once the response has been completely read (an empty dict or an exception before that?). Since there could be multiple trailers, the user had better explicitly specify what they want to get from there. We could parse headers[TRAILER] into a list of header names to make things a little friendlier.
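
For illustration, a minimal sketch of how that could look from user code; the response.trailers property and the X-Checksum trailer name are hypothetical, and none of this exists in aiohttp today:

import aiohttp

async def fetch_with_trailers(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            # The Trailer header announces which trailer fields to expect.
            announced = response.headers.getall('Trailer', [])
            body = await response.read()
            # Hypothetical: trailers would only be populated once the body
            # has been fully read.
            checksum = response.trailers.get('X-Checksum')
            return body, announced, checksum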

The most interesting part is how to send the trailers. I can't come up with anything better than passing a dict of header: lambda pairs which would be processed at the end of sending the request payload. But the user would have to set up their own generator/coroutine with all these closures, since we encapsulate all the streaming within the ClientRequest.write_bytes method, which is not part of the frontend API. Alternatively, we could have a ClientChunkedRequest which would be friendlier for explicit data streaming.
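
Roughly what that header: lambda idea could look like from the caller's side; the trailers= keyword argument is purely hypothetical:

import hashlib

digest = hashlib.sha256()

def body_gen():
    for chunk in (b'part1', b'part2'):
        digest.update(chunk)
        yield chunk

# Each value is a zero-argument callable, evaluated only after the payload
# has been sent, so it can capture state accumulated while streaming.
trailers = {'X-Checksum': lambda: digest.hexdigest()}

# Hypothetical call: await session.post(url, data=body_gen(), trailers=trailers)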

Just some thoughts (:

@fafhrd91 (Member, Author)

I personally do not care about the send part; if a user wants to send a trailer header, they can implement the chunking themselves, it is easy. I am more concerned with how to pass the trailer data out of the parser.
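
For reference, the chunked framing with a trailer section that such a client would have to produce is just this (a standalone sketch of the RFC 7230 §4.1 wire format, independent of any aiohttp API; the helper name is made up):

def chunked_with_trailer(chunks, trailers):
    out = b''
    for chunk in chunks:
        # each chunk: hex size, CRLF, data, CRLF
        out += b'%x\r\n' % len(chunk) + chunk + b'\r\n'
    out += b'0\r\n'  # zero-length last chunk
    for name, value in trailers.items():
        out += '{}: {}\r\n'.format(name, value).encode('ascii')
    out += b'\r\n'   # blank line ends the trailer section
    return out

# chunked_with_trailer([b'foo'], {'X-Checksum': 'abc'})
# == b'3\r\nfoo\r\n0\r\nX-Checksum: abc\r\n\r\n'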

@asvetlov asvetlov added this to the 3.0 milestone Oct 19, 2017
@asvetlov asvetlov modified the milestones: 3.0, 3.1 Feb 9, 2018
@gsauthof

FWIW, I care about both parts, client and server.

Since StreamResponse already has enable_chunked_encoding(), it would be natural to also support trailer headers for chunked responses.

Regarding the server-side API, perhaps adding an optional dictionary parameter to write_eof() wouldn't be too surprising.
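
A handler could then look roughly like this; the trailers= keyword on write_eof() is the proposal being discussed, not an existing aiohttp API:

import hashlib
from aiohttp import web

async def handler(request):
    response = web.StreamResponse()
    response.enable_chunked_encoding()
    response.headers['Trailer'] = 'X-Checksum'  # announce the trailer up front
    await response.prepare(request)

    digest = hashlib.sha256()
    for chunk in (b'hello ', b'world'):
        digest.update(chunk)
        await response.write(chunk)

    # Proposed API: send the trailer section while finishing the body.
    await response.write_eof(trailers={'X-Checksum': digest.hexdigest()})
    return response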

@asvetlov (Member)

write_trailers() looks even better :)

@gsauthof commented Mar 20, 2018

@asvetlov with a write_trailers() method you need to extend the state machine: after a call to write_trailers(), only a call to write_eof() should be allowed. Also, write_trailers() and write_eof() would effectively always show up in pairs. Arguably, these are arguments in favor of an optional parameter to write_eof().

Btw, currently I'm using this as a workaround for sending headers in the chunked trailer part:

import multidict

value = some_accumulated_value  # e.g. a checksum computed while streaming
# Reach into private attributes to stop aiohttp from chunk-framing our bytes.
response._payload_writer.chunked = False
response._chunked = False
# Terminate the body with the zero-length last chunk ourselves...
await response.write(b'0\r\n')
# ...then append the trailer headers and the final blank line by hand.
hs = multidict.CIMultiDict([('X-foo', value)])
trailer = ''.join('{}: {}\r\n'.format(k, v) for k, v in hs.items())
trailer = trailer.encode('utf-8') + b'\r\n'
await response.write(trailer)
await response.write_eof()
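
Note that this pokes at underscore-prefixed (i.e. private) attributes of the response and its payload writer and hand-writes the final chunk and trailer block, so it may break between aiohttp releases; it also only covers the sending side, not reading trailers from a response.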

@vegerot commented Jul 19, 2024

I'd like this because my server is sending a Server-Timing trailer that I need to read. Is there any workaround for this?
