I'm working on a WordPress shortcode that aggregates data from Gravity Forms entries. I have:
My initial thought was to fetch the entries once, then sort through them for each training and total them up along the way for my global averages.
The issue:
Using GFAPI::get_entries() with 'page_size' => 0 (or any large range) quickly causes:
Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480 bytes)
When I try chunking with paging, the script times out instead (shared hosting, so I can't raise the limits).
Here’s a simplified version of what I’m doing:
// Now let's get all of the entries.
$search_criteria = [
    'status'     => 'active',
    'start_date' => $start, // which is 2020
    'end_date'   => $end,   // which is today
];

$enrolled_entries = [];
$chunk_size       = 500;
$offset           = 0;

do {
    $paging = [
        'offset'    => $offset,
        'page_size' => $chunk_size,
    ];

    $entries_chunk = GFAPI::get_entries( $enroll_form_ids, $search_criteria, [], $paging );

    if ( is_wp_error( $entries_chunk ) ) {
        break;
    }

    $enrolled_entries = array_merge( $enrolled_entries, $entries_chunk );

    $retrieved_count = count( $entries_chunk );
    $offset         += $chunk_size;
} while ( $retrieved_count === $chunk_size );
I only need to calculate averages, counts, and basic stats; I don't need every field or the full entry object.
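For context, this is the kind of running aggregation I have in mind so that nothing accumulates across chunks (reusing $enroll_form_ids, $search_criteria, and $chunk_size from above). It's only a sketch: field ID '3' stands in for the numeric field I actually average.

// Keep running totals per chunk instead of merging every entry into one array.
$sum    = 0;
$count  = 0;
$offset = 0;

do {
    $paging = [
        'offset'    => $offset,
        'page_size' => $chunk_size,
    ];

    $entries_chunk = GFAPI::get_entries( $enroll_form_ids, $search_criteria, [], $paging );

    if ( is_wp_error( $entries_chunk ) ) {
        break;
    }

    foreach ( $entries_chunk as $entry ) {
        $value = rgar( $entry, '3' ); // placeholder field ID
        if ( is_numeric( $value ) ) {
            $sum += (float) $value;
            $count++;
        }
    }

    $retrieved_count = count( $entries_chunk );
    $offset         += $chunk_size;
    unset( $entries_chunk ); // free the chunk before fetching the next one
} while ( $retrieved_count === $chunk_size );

$average = $count ? $sum / $count : 0;

That keeps per-chunk memory bounded, but a single request still has to walk every chunk, which seems to be where the timeout comes from.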
What I'm looking for:
Is there a better way to efficiently aggregate values across many Gravity Forms entries?
Is there a server-safe way to handle this in chunks, background jobs, or something similar within WordPress?
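One direction I've wondered about (but haven't actually built) is splitting the work across WP-Cron events and keeping the running totals in an option between runs. This is only a rough sketch; the hook name, option key, function name, form IDs, and field ID are all placeholders:

// Each cron event processes one chunk, stores the running totals, and
// schedules the next chunk, so no single request does all the work.
add_action( 'my_gf_aggregate_batch', 'my_gf_aggregate_batch', 10, 1 );

function my_gf_aggregate_batch( $offset ) {
    $form_ids        = [ 1, 2 ];                 // placeholder form IDs
    $search_criteria = [ 'status' => 'active' ]; // same criteria as above
    $chunk_size      = 500;

    $totals = get_option( 'my_gf_aggregate_totals', [ 'sum' => 0, 'count' => 0 ] );

    $paging        = [ 'offset' => $offset, 'page_size' => $chunk_size ];
    $entries_chunk = GFAPI::get_entries( $form_ids, $search_criteria, [], $paging );

    if ( is_wp_error( $entries_chunk ) || empty( $entries_chunk ) ) {
        return; // nothing left to process (or bail on error)
    }

    foreach ( $entries_chunk as $entry ) {
        $value = rgar( $entry, '3' ); // placeholder field ID
        if ( is_numeric( $value ) ) {
            $totals['sum'] += (float) $value;
            $totals['count']++;
        }
    }

    update_option( 'my_gf_aggregate_totals', $totals, false );

    // Queue the next batch a minute out.
    wp_schedule_single_event( time() + 60, 'my_gf_aggregate_batch', [ $offset + $chunk_size ] );
}

I don't know whether that's a sane pattern on shared hosting, which is part of what I'm asking.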
My full script can be seen here:
Thanks for any help or direction!