This paper addresses speeding, that is, “too fast” responses, in web surveys. Drawing on the response process model, we argue that very short response times indicate low data quality stemming from a lack of respondent attention. To identify speeding, prior research has employed case-wise procedures. Using data from nine online surveys, we demonstrate that the response behavior of individual respondents varies considerably over the course of a survey. We therefore use both case-wise and page-wise procedures, which capture different, although related, aspects of speeding behavior. Moreover, page-specific speeding measures capture aspects of data quality that traditional quality measures do not cover. Employing both page-specific and case-wise speeding measures, we examine whether removing speeders makes a difference to substantive findings. The evidence indicates that removing “too fast” responses does not alter marginal distributions, irrespective of which speeder-correction technique is employed. Moreover, explanatory models yield, by and large, negligible coefficient differences (on average about one standard error); only in exceptional cases do differences exceed two standard errors. Our findings suggest that, if it makes a difference at all, speeding primarily adds random noise to the data and attenuates correlations. The paper concludes by discussing implications and limitations.
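To make the distinction between page-wise and case-wise speeder identification concrete, the following Python sketch flags individual page visits as “too fast” relative to a page-specific threshold and then aggregates those flags into a case-wise speeder indicator. The data, column names, and threshold values (30% of the page median, a majority of pages) are illustrative assumptions, not the criteria used in the paper.

```python
import numpy as np
import pandas as pd

# Toy data: one row per respondent-page with the time (in seconds)
# spent on that survey page. All names and parameters are illustrative.
rng = np.random.default_rng(42)
times = pd.DataFrame({
    "respondent_id": np.repeat(np.arange(1, 101), 10),
    "page": np.tile(np.arange(1, 11), 100),
    "seconds": rng.lognormal(mean=3.0, sigma=0.5, size=1000),
})

# Page-wise flag: a response is "too fast" if it falls below a fraction
# of the median completion time for that page (threshold is an assumption).
page_median = times.groupby("page")["seconds"].transform("median")
times["page_speeding"] = times["seconds"] < 0.3 * page_median

# Case-wise flag: a respondent is a "speeder" if a large share of their
# pages were answered too fast (the 50% cutoff is likewise illustrative).
share_fast = times.groupby("respondent_id")["page_speeding"].mean()
case_speeder = share_fast > 0.5

print(times["page_speeding"].mean())   # share of speeded page visits
print(case_speeder.mean())             # share of case-wise speeders
```

A sketch like this illustrates why the two measures tap related but distinct phenomena: a respondent can speed through a few pages without ever crossing the case-wise cutoff, so page-wise flags can identify low-quality responses that a case-wise classification would miss.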