Forcing a maximum pagination limit

You know what's a really fun and easy way to bog down someone's server? Make a request against a controller for hundreds of thousands of records. Here's how to keep that from happening in your applications.
Your application has hundreds, thousands, however many records sitting in a model, all paginated away in the controller. Awesome. Now go back to your application and add limit:9999999 to the URL. What happens? At best your server begrudgingly and slowly returns the gigantic result set; at worst it shits the bed entirely. Do this repeatedly and you have yourself a fairly easy denial-of-service attack.
Despite all the chaos such a thing can cause, it's actually really easy to fix. Like, a-handful-of-lines easy.

<?php
/**
 * AppController
 *
 * Base application controller.
 *
 * @author Joe Beeson
 */
class AppController extends Controller {

    /**
     * Pagination results limit.
     *
     * @var int
     * @access protected
     * @see AppController::beforeFilter
     */
    protected $_maximumPaginationResults = 25;

    /**
     * Triggered prior to the action.
     *
     * @return void
     * @access public
     */
    public function beforeFilter() {
        parent::beforeFilter();
        if (isset($this->passedArgs['limit'])) {
            $this->passedArgs['limit'] = min(
                $this->_maximumPaginationResults,
                (int) $this->passedArgs['limit']
            );
        }
    }

}

By always taking the lower of the two values (one from the request, the other hardwired in the controller) we make sure that our pagination never goes off and commits database seppuku and, if you ever feel so inclined, you can just override $_maximumPaginationResults in specific controllers to get different limits.
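The whole trick boils down to a single min() call, so here's a standalone sketch of the clamping behavior outside of CakePHP (clampLimit() is a hypothetical helper invented for illustration, not part of the framework):

```php
<?php
// Standalone sketch of the clamp from beforeFilter() above.
// clampLimit() is a hypothetical helper, not a CakePHP function.
function clampLimit($requested, $maximum = 25) {
    // Cast the raw request value to an int, then never
    // return more than the hardwired ceiling.
    return min($maximum, (int) $requested);
}

// A polite request passes through untouched...
echo clampLimit('10'), "\n";      // 10
// ...while an abusive one gets capped at the ceiling.
echo clampLimit('9999999'), "\n"; // 25
```

A controller that genuinely needs bigger pages just redeclares the property, say `protected $_maximumPaginationResults = 100;`, and the same beforeFilter() clamp picks it up.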
