From: jonathan rochkind
Date: 2011-07-26T10:45:45+09:00
Subject: [ruby-core:38508] [Ruby 1.9 - Feature #5064] HTTP user-agent class

Issue #5064 has been updated by jonathan rochkind.

I think this is a good API which will avoid the need for a third-party library for API convenience. (You didn't go into detail about 'timeout' configuration, but a SINGLE timeout param should exist, possibly along with separate read/open timeouts if someone really wants different values for each.)

However, I think there may be, at least in the past, other performance-related reasons people needed to reach for third-party libraries, especially when the app involves threading -- but NOT related to the hypothetical Agent itself and its concurrency model. Not sure if I have this right, or if these are still issues in 1.9.3:

1. The huge performance problem with the way timeouts are implemented.
2. The global-interpreter-blocking nature of DNS lookups, instead of a non-blocking select-based DNS lookup.

Can anyone confirm whether I have this right, and whether these are still issues in 1.9.3? If either of them is, then a new standard-library Agent ought to solve them too, to really provide a solid library that obviates the need for third-party libraries for standard use cases.

----------------------------------------
Feature #5064: HTTP user-agent class
http://redmine.ruby-lang.org/issues/5064

Author: Eric Hodel
Status: Open
Priority: Normal
Assignee:
Category: lib
Target version: 1.9.4

Currently there are some problems with Net::HTTP:

* Too many ways to use it (user confusion)
* No automatic support for HTTPS (must conditionally set use_ssl)
* No automatic support for HTTPS peer verification (must be manually set)
* Single-connection oriented
* No support for redirect-following
* No support for HTTP/1.1 persistent connection retry (RFC 2616 8.1.4)
* No automatic support for HTTP proxies
* No automatic support for authentication (must be set per-request)

Additionally, the style of the Net::HTTP API makes it difficult to take advantage of persistent connections. The user has to store the created connection and manually handle restarting it if it has timed out or been closed by the server. RFC 2616 8.1.1 has a large section explaining the benefits of persistent connections, and while Net::HTTP implements them, they could be made much easier for users to take advantage of.

I've implemented support for many of these additional features on top of Net::HTTP in various projects, and I'd like Ruby to have the features required to make a useful HTTP user-agent built-in.

The agent should have the following responsibilities:

* Make or reuse connections based on [host, port, SSL enabled]
* Automatically enable SSL for https URIs
* Automatically enable SSL peer verification for SSL connections
* Limit number of persistent connections per host
* Follow redirects
* Retry when a persistent connection fails
* Automatically configure proxies
* Automatically use authentication
* Callbacks for various operations (e.g. connect)

The agent may add the following responsibilities:

* Default headers for all requests
* HTTP cookies
* Tracking history
* Logging

I don't think any of these features are critical, as they are implementable by users via callbacks.

The agent would have the following configurable items:

* Number of connections per host
* Depth of redirects followed
* Persistent connection retries (none, HTTP/1.1 (default), always)
* Proxy host, port, user, password

I think the class should be called Net::HTTP::Agent.
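To make the lists above concrete, here is a minimal sketch of how the configurable items and redirect-following might surface on such a class. It is illustrative only; every name in it (AgentSketch, max_redirects, retry_policy and so on) is an assumption, not the proposed API:

  # Illustrative sketch only -- not the proposed implementation.
  require 'net/http'
  require 'uri'

  class AgentSketch
    attr_accessor :max_connections_per_host, # connections kept open per host
                  :max_redirects,            # depth of redirects followed
                  :retry_policy,             # :none, :http_1_1 (default), :always
                  :proxy_host, :proxy_port, :proxy_user, :proxy_password

    def initialize
      @max_connections_per_host = 2
      @max_redirects            = 10
      @retry_policy             = :http_1_1
    end

    # Follow redirects up to max_redirects deep.
    # (Single requests via Net::HTTP.get_response, no connection reuse,
    #  to keep the sketch short.)
    def get(uri, redirects = 0)
      raise 'too many redirects' if redirects > max_redirects

      uri = URI(uri.to_s)
      res = Net::HTTP.get_response(uri)

      case res
      when Net::HTTPRedirection
        get uri.merge(res['Location']), redirects + 1
      else
        res
      end
    end
  end

  # agent = AgentSketch.new
  # agent.max_redirects = 5
  # agent.get 'http://example.com/' # => Net::HTTPResponse (after redirects)

With something along those lines, the redirect depth, retry behaviour and proxy settings become plain per-agent accessors that callers can tune.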
Basic use would look something like this:

  uris = [
    URI('http://example/1'),
    URI('http://example/2'),
    URI('https://secure.example'),
  ]

  agent = Net::HTTP::Agent.new

  uris.map do |uri|
    agent.get uri # Returns Net::HTTPResponse
  end

For special requests a Net::HTTPRequest could be constructed:

  req = Net::HTTP::Get.new uri.request_uri

  # do something special with req

  agent.request req

The agent should support GET, POST, etc. directly through API methods. I think the API should look something like this:

  def get uri_or_string, query = nil, headers = nil
    # Same for other requests with no body
    #
    # query may be a Hash or String
    # How query param vs query string in URI is used is undecided
  end

  def post uri_or_string, data, headers = nil
    # Same for other requests with a body
    #
    # data may be a String, IO or Hash
    # How data format is chosen is undecided
  end

SSL options, proxy options, timeouts and similar options should exist on Net::HTTP::Agent and be set on new connections as they are made.

I've implemented most of these features in mechanize as Mechanize::HTTP::Agent. The Agent class in mechanize is bigger than necessary and would need to be cut down for inclusion in Ruby as Net::HTTP::Agent:

  https://github.com/tenderlove/mechanize/blob/master/lib/mechanize/http/agent.rb

Mechanize depends on net-http-persistent to provide HTTP/1.1 retry support and connection management:

  https://github.com/drbrain/net-http-persistent/blob/master/lib/net/http/persistent.rb

Portions of net-http-persistent should become patches to Net::HTTP, for example #idempotent?, #can_retry?, #reset and portions of #request. Other parts (connection management) should be moved to Net::HTTP::Agent.

net-http-persistent provides a separate connection list per thread (a rough sketch of that approach is appended below). I would like Net::HTTP::Agent to be multi-thread friendly, but implementing this in another way would be fine.

As an addendum, open-uri and mechanize should be written to take advantage of Net::HTTP::Agent in order to guide a useful implementation.

--
http://redmine.ruby-lang.org
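As a rough illustration of the per-thread approach mentioned above, keeping a separate connection table per thread might look like the following. Nothing here is the proposed Net::HTTP::Agent API; ThreadFriendlyAgent and its methods are made-up names, and stale-connection retry is not shown:

  # Illustration only, in the spirit of net-http-persistent's per-thread
  # connection list.
  require 'net/http'
  require 'uri'
  require 'thread'

  class ThreadFriendlyAgent
    def initialize
      @lock        = Mutex.new
      @connections = {} # { Thread => { [host, port, ssl?] => Net::HTTP } }
    end

    # Each thread gets its own connection table, so no Net::HTTP object is
    # ever shared between threads; only the table lookup is synchronized.
    def connections
      @lock.synchronize { @connections[Thread.current] ||= {} }
    end

    def get(uri)
      uri = URI(uri.to_s)
      key = [uri.host, uri.port, uri.scheme == 'https']

      http = connections[key] ||= begin
        c = Net::HTTP.new(uri.host, uri.port)
        c.use_ssl = (uri.scheme == 'https')
        c.start
      end

      http.request Net::HTTP::Get.new(uri.request_uri)
    end
  end

  # agent   = ThreadFriendlyAgent.new
  # threads = 4.times.map { Thread.new { agent.get 'http://example.com/' } }
  # threads.each(&:join)

Because each thread only ever touches its own Net::HTTP objects, the requests themselves need no locking; the cost is potentially more open connections, which is where a limit on connections per host would come in.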