Determining what requests happen after UI interaction with Rails, Capybara
I'm using Capybara for web crawling, and I have the following challenge: after interacting with DOM elements (e.g. clicking a button), I want to know (or make a good guess about) whether a new page is loading and whether AJAX requests are taking place. Because I'm crawling sites I don't control, I don't have access to server-side state and don't know what to expect (i.e. it's not a matter of waiting for a page load, it's a matter of knowing whether one is happening at all).
The best-case scenario would be being able to query a list of recent/ongoing/completed HTTP requests and get data about them.
Alternatively, it would be nice if I could at least find out whether the page is reloading/has reloaded since the last interaction.
At the very least I can check whether the URL of the page I'm on matches the URL I started on, but that misses AJAX requests and same-URL page refreshes, and it doesn't wait for a page load to happen. I'm looking for something better than this.
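The check described above amounts to something like the following sketch (`navigated_away?` is just an illustrative helper name, not an existing API):

```ruby
require "uri"

# Compare the URL recorded before an interaction with the current one.
# Catches full navigations to a different address, but misses AJAX
# requests and reloads of the same URL, and does not wait for any
# pending load to finish.
def navigated_away?(url_before, url_after)
  URI(url_before) != URI(url_after)
end
```

In a Capybara session, `url_after` would be `page.current_url` read after the interaction.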
I'm looking for something that works with Selenium. For the non-AJAX case, something that works with webkit would do too. Any suggestions?
Selenium doesn't provide an API to monitor HTTP traffic or to see whether the page is loading. If you need to log HTTP requests, you should use a proxy such as BrowserMob Proxy.
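With the browsermob-proxy gem, capturing traffic looks roughly like the sketch below. This is only an illustration: the `capture_har` helper name is mine, the proxy binary path is a placeholder you must supply, and exact gem/driver APIs may differ between versions.

```ruby
# Record a HAR (HTTP Archive) of every request the browser makes while
# the given block runs. Requires the browsermob-proxy and
# selenium-webdriver gems plus a local BrowserMob Proxy install.
def capture_har(proxy_binary, page_name = "crawl")
  require "browsermob/proxy"     # gem install browsermob-proxy
  require "selenium-webdriver"

  server = BrowserMob::Proxy::Server.new(proxy_binary)
  server.start
  proxy = server.create_proxy

  # Point a Firefox profile at the recording proxy
  profile = Selenium::WebDriver::Firefox::Profile.new
  profile.proxy = proxy.selenium_proxy
  driver = Selenium::WebDriver.for(:firefox, profile: profile)

  proxy.new_har(page_name)       # start recording
  yield driver                   # interact with the page here
  proxy.har                      # one HAR entry per HTTP request
ensure
  driver&.quit
  proxy&.close
end
```

Afterwards you can inspect the entries, e.g. `har.entries.each { |e| puts e.request.url }`, to see whether your interaction triggered any requests.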
I think Selenium tries to block while a page is loading, but it doesn't do so in all circumstances (it may be best to check whether Selenium blocks in yours).
If Selenium does block in your circumstances, you can measure the time it took to click a link. If it took more than, e.g., 0.1 seconds, that suggests a page was being loaded after the click.
require 'benchmark'

time = Benchmark.realtime { click_link 'some link' }
if time > 0.1
  # looks like a page was being loaded after the click
end
I don't know whether Poltergeist blocks or not.
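One more heuristic worth mentioning, independent of whether the driver blocks (a common trick, not something from any driver's API; the helper names here are hypothetical): plant a JavaScript flag on the page before interacting, then check whether it survived. A full navigation or reload wipes the flag, while AJAX activity leaves it intact, so this addresses the "has the page reloaded since my last interaction" part of the question, though not AJAX detection.

```ruby
# Hypothetical marker heuristic. `session` is any Capybara session
# (e.g. Capybara.current_session).
MARKER_JS = "window.__crawl_marker"

# Set a flag on the current page before interacting with it.
def plant_marker(session)
  session.execute_script("#{MARKER_JS} = true")
end

# After the interaction: a reload or navigation creates a fresh
# window object, so the flag evaluates as falsy once wiped.
def page_reloaded?(session)
  !session.evaluate_script("!!#{MARKER_JS}")
end
```

Usage: call `plant_marker(page)` before clicking, then `page_reloaded?(page)` afterwards to tell a same-page AJAX update apart from a reload.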