I've checked out several* questions and the WP docs, and I have some idea of the answer, but I'm looking for a specific number on the following question:
Should I assume it's 4GB, since that's the MySQL limit in this case?
Specific details: I have a custom DB query that pre-loads roughly 350,000 bytes of data containing ~100,000 lat/lng pairs (and some other data), queried using a custom WP_Query based on the Haversine formula (part of a custom 'Dealer Locator' type system).
The data is simply a PHP array (not JSON; maybe it should be?).
I am using map clustering to output top-level items on the Google Map (moot perhaps, as the Google Maps API is not my concern here).
This query is taxing, of course, and I do not want it running on every page load, so naturally I want to store/cache it. Currently I am stuffing the result into update_option() via an admin AJAX action that is triggered manually from wp-admin, so the cache can be flushed as needed.
My current approach works fine speed-wise; I get the results I want at 350 KB with no problem. I am mostly concerned with the limit of this approach.
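For context, the flow is roughly this (simplified sketch; the hook name, option name, and query function are placeholders for my real code):

    // Rough sketch of the current setup (placeholder names throughout).
    add_action( 'wp_ajax_refresh_dealer_cache', function () {
        // The heavy Haversine-based query (custom code, omitted here).
        $results = my_dealer_locator_query();

        // Stuff the whole result set into a single option.
        update_option( 'dealer_locator_cache', $results );

        wp_send_json_success( 'Dealer cache refreshed' );
    } );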
References I checked out (also did some googling):
Excessive memory used by update_option() (this one is interesting: he managed to get a 17M+ character string in there?)
Is there a limit to the length/size of serialized data that can be stored as user meta?
store simple data in get_option()
http://codex.wordpress.org/Function_Reference/get_option
http://codex.wordpress.org/Function_Reference/add_option
http://codex.wordpress.org/Function_Reference/update_option
The option_value column in the database is LONGTEXT, meaning it can hold up to 4GB of text.
But keep in mind that WordPress autoloads all options with autoload set to 'yes' (the default) on every page load, so storing a large value in an option you only use on a single page is inefficient. Either set the autoload argument of update_option() and add_option() to false, or store the data in a custom table and only query it when you need it.
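For example (using a made-up option name, and assuming $dealer_data is the array your query produces), something along these lines keeps the cached payload out of the autoloaded options:

    // add_option(): the third argument is deprecated, the fourth is autoload.
    add_option( 'dealer_locator_cache', $dealer_data, '', 'no' );

    // update_option(): since WordPress 4.2 the third argument controls autoload.
    // (It only takes effect when the option is created or its value changes.)
    update_option( 'dealer_locator_cache', $dealer_data, false );

The serialized array still counts against the LONGTEXT limit either way, but with autoload off it is no longer pulled into memory on every request.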