Extreme Insolation: Climatic Variation Shapes the Evolution of Thermal Tolerance at Multiple Scales
Abstract
The climatic variability hypothesis (CVH) is a cornerstone of thermal ecology, predicting the evolution of wider organismal thermal tolerance ranges in more thermally variable environments. Thermal tolerance ranges depend on both upper and lower tolerance limits (critical thermal maxima [CTmax] and critical thermal minima [CTmin]), which may show different responses to environmental gradients. To delineate the relative effects of mean and extreme temperatures on thermal tolerances, we conducted a within-latitude comparative test of CVH predictions for army ants (Dorylinae) at multiple scales: across elevations, in seasonal versus aseasonal forests, and in subterranean versus surface microhabitats. Consistent with the CVH, thermally buffered subterranean species had narrower thermal tolerance ranges. Both CTmax and CTmin decreased with elevation for subterranean species. In contrast, aboveground species (those exposed to insolation) showed a decrease in CTmin but no change in CTmax across elevations. Furthermore, greater seasonal temperature variation in dry forests correlated with increased CTmax but not CTmin. These patterns suggest that CTmax and CTmin respond to different abiotic selective forces: habitat-specific exposure to extreme insolation corresponds to differences in CTmax but not to variation in CTmin. We predict that increasingly frequent heat spikes associated with climate change will have habitat-specific physiological consequences for ectothermic animals. Models predicting climate change impacts should account for species' microhabitat use and within-latitude differences in temperature seasonality.