17 results for Quantitative estimates
in CentAUR: Central Archive University of Reading - UK
Abstract:
Quantitative estimates of temperature and precipitation change during the late Pleistocene and Holocene have been difficult to obtain for much of the lowland Neotropics. Using two published lacustrine pollen records and a climate-vegetation model based on the modern abundance distributions of 154 Neotropical plant families, we demonstrate how family-level counts of fossil pollen can be used to quantitatively reconstruct tropical paleoclimate and provide needed information on historic patterns of climatic change. With this family-level analysis, we show that one area of the lowland tropics, northeastern Bolivia, experienced cooling (1–3 °C) and drying (400 mm/yr), relative to present, during the late Pleistocene (50,000–12,000 calendar years before present [cal. yr B.P.]). Immediately prior to the Last Glacial Maximum (LGM, ca. 21,000 cal. yr B.P.), we observe a distinct transition from cooler temperatures and variable precipitation to a period of warmer temperatures and relative dryness that extends to the middle Holocene (5000–3000 cal. yr B.P.). This prolonged reduction in precipitation occurs against the backdrop of increasing atmospheric CO2 concentrations, indicating that the presence of mixed savanna and dry-forest communities in northeastern Bolivia during the LGM was not solely the result of low CO2 levels, as suggested previously, but also lower precipitation. The results of our analysis demonstrate the potential for using the distribution and abundance structure of modern Neotropical plant families to infer paleoclimate from the fossil pollen record.
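The general idea of a family-level transfer function can be illustrated with a minimal sketch: each plant family is assigned a climatic optimum from its modern abundance distribution, and a fossil sample's climate estimate is the pollen-count-weighted mean of those optima. This is a simplified weighted-averaging illustration, not the authors' climate-vegetation model; the family names, optima and counts below are hypothetical placeholders.

```python
# Minimal sketch of a family-level weighted-averaging climate reconstruction.
# NOT the published model; optima and counts are hypothetical placeholders.

# Modern temperature optima (deg C), in practice derived from where each
# family is most abundant today.
family_optimum = {"Poaceae": 24.0, "Moraceae": 26.5, "Asteraceae": 18.0}

# Fossil pollen counts for one sediment sample, aggregated to family level.
fossil_counts = {"Poaceae": 120, "Moraceae": 45, "Asteraceae": 60}

def weighted_average_estimate(counts, optima):
    """Pollen-abundance-weighted mean of the family climatic optima."""
    total = sum(counts.values())
    return sum(counts[f] * optima[f] for f in counts) / total

estimate = weighted_average_estimate(fossil_counts, family_optimum)
print(f"Reconstructed temperature: {estimate:.1f} deg C")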
Abstract:
This paper provides an overview of interpolation of Banach and Hilbert spaces, with a focus on establishing when equivalence of norms is in fact equality of norms in the key results of the theory. (In brief, our conclusion for the Hilbert space case is that, with the right normalisations, all the key results hold with equality of norms.) In the final section we apply the Hilbert space results to the Sobolev spaces $H^s(\Omega)$ and $\widetilde{H}^s(\Omega)$, for $s\in\mathbb{R}$ and an open set $\Omega\subset\mathbb{R}^n$. We exhibit examples in one and two dimensions of sets $\Omega$ for which these scales of Sobolev spaces are not interpolation scales. In the cases when they are interpolation scales (in particular, if $\Omega$ is Lipschitz) we exhibit examples that show that, in general, the interpolation norm does not coincide with the intrinsic Sobolev norm and, in fact, the ratio of these two norms can be arbitrarily large.
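For orientation, the norms being compared in results of this kind are typically those of the real interpolation ($K$-)method; in standard textbook notation (this is background, not a statement of the paper's precise normalisations),

$$K(t,u) = \inf\{\|u_0\|_{X_0} + t\|u_1\|_{X_1} : u = u_0 + u_1,\ u_j \in X_j\}, \qquad \|u\|_{(X_0,X_1)_{\theta,2}} = \left(\int_0^\infty \bigl(t^{-\theta}K(t,u)\bigr)^2\,\frac{dt}{t}\right)^{1/2}, \quad 0<\theta<1.$$

The question addressed in the final section is then whether this interpolation norm, applied to a pair $H^{s_0}(\Omega)$, $H^{s_1}(\Omega)$, coincides (exactly, or only up to constants) with the intrinsic norm of $H^s(\Omega)$ at $s=(1-\theta)s_0+\theta s_1$.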
Abstract:
Tidal flats are important examples of extensive areas of natural environment that remain relatively unaffected by man. Monitoring of tidal flats is required for a variety of purposes. Remote sensing has become an established technique for the measurement of topography over tidal flats. A further requirement is to measure topographic changes in order to measure sediment budgets. To date there have been few attempts to make quantitative estimates of morphological change over tidal flat areas. This paper illustrates the use of remote sensing to measure quantitative and qualitative changes in the tidal flats of Morecambe Bay during the relatively long period 1991–2007. An understanding of the patterns of sediment transport within the Bay is of considerable interest for coastal management and defence purposes. Tidal asymmetry is considered to be the dominant cause of morphological change in the Bay, with the higher currents associated with the flood tide being the main agency moulding the channel system. Quantitative changes were measured by comparing a Digital Elevation Model (DEM) of the intertidal zone formed using the waterline technique applied to satellite Synthetic Aperture Radar (SAR) images from 1991–1994, to a second DEM constructed from airborne laser altimetry data acquired in 2005. Qualitative changes were studied using additional SAR images acquired since 2003. A significant movement of sediment from below Mean Sea Level (MSL) to above MSL was detected by comparing the two Digital Elevation Models, though the proportion of this change that could be ascribed to seasonal effects was not clear. Between 1991 and 2004 there was a migration of the Ulverston channel of the River Leven north-east by about 5 km, followed by the development of a straighter channel to the west, leaving the previous channel decoupled from the river. This is thought to be due to independent tidal and fluvial forcing mechanisms acting on the channel. The results demonstrate the effectiveness of remote sensing for measurement of long-term morphological change in tidal flat areas. An alternative use of waterlines as partial bathymetry for assimilation into a morphodynamic model of the coastal zone is also discussed.
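Conceptually, the quantitative change measurement reduces to differencing two co-registered DEMs and integrating the elevation change above and below MSL. The sketch below shows only that final differencing step, with hypothetical arrays and an assumed grid resolution; the study's waterline and lidar processing chains are far more involved.

```python
import numpy as np

# Sketch of quantifying morphological change by differencing two co-registered
# DEMs of the intertidal zone. Arrays and cell size are hypothetical.

cell_area = 25.0 * 25.0                                   # assumed 25 m grid, m^2 per cell
dem_waterline = np.random.uniform(-4.0, 3.0, (200, 200))  # 1991-1994 DEM, elevation rel. to MSL (m)
dem_lidar = dem_waterline + np.random.normal(0.1, 0.3, dem_waterline.shape)  # 2005 DEM (m)

dz = dem_lidar - dem_waterline                # elevation change per cell (m)
net_volume_change = dz.sum() * cell_area      # net sediment budget (m^3)

# Split the change by elevation band to detect transfer of sediment across MSL.
change_above_msl = dz[dem_lidar >= 0].sum() * cell_area
change_below_msl = dz[dem_lidar < 0].sum() * cell_area

print(f"Net volume change: {net_volume_change:.0f} m^3")
print(f"Above MSL: {change_above_msl:.0f} m^3, below MSL: {change_below_msl:.0f} m^3")
```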
Abstract:
Records of Atlantic basin tropical cyclones (TCs) since the late nineteenth century indicate a very large upward trend in storm frequency. This increase in documented TCs has been previously interpreted as resulting from anthropogenic climate change. However, improvements in observing and recording practices provide an alternative interpretation for these changes: recent studies suggest that the number of potentially missed TCs is sufficient to explain a large part of the recorded increase in TC counts. This study explores the influence of another factor—TC duration—on observed changes in TC frequency, using a widely used Atlantic hurricane database (HURDAT). It is found that the occurrence of short-lived storms (duration of 2 days or less) in the database has increased dramatically, from less than one per year in the late nineteenth–early twentieth century to about five per year since about 2000, while medium- to long-lived storms have increased little, if at all. Thus, the previously documented increase in total TC frequency since the late nineteenth century in the database is primarily due to an increase in very short-lived TCs. The authors also undertake a sampling study based upon the distribution of ship observations, which provides quantitative estimates of the frequency of missed TCs, focusing just on the moderate to long-lived systems with durations exceeding 2 days in the raw HURDAT. Upon adding the estimated numbers of missed TCs, the time series of moderate to long-lived Atlantic TCs show substantial multidecadal variability, but neither time series exhibits a significant trend since the late nineteenth century, with a nominal decrease in the adjusted time series. Thus, to understand the source of the century-scale increase in Atlantic TC counts in HURDAT, one must explain the relatively monotonic increase in very short-duration storms since the late nineteenth century. While it is possible that the recorded increase in short-duration TCs represents a real climate signal, the authors consider that it is more plausible that the increase arises primarily from improvements in the quantity and quality of observations, along with enhanced interpretation techniques. These have allowed National Hurricane Center forecasters to better monitor and detect initial TC formation, and thus incorporate increasing numbers of very short-lived systems into the TC database.
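The duration-based stratification described above can be reproduced in outline from any track archive: compute each storm's lifetime from its first to last fix and bin annual counts by the 2-day threshold. The sketch below uses a hypothetical track table, not the actual HURDAT file format, and omits the ship-based sampling adjustment.

```python
from collections import defaultdict
from datetime import datetime

# Sketch of stratifying annual tropical cyclone counts by storm duration.
# The records below are hypothetical; real HURDAT parsing is not shown.

# (storm_id, year, first_fix, last_fix)
tracks = [
    ("AL011901", 1901, "1901-07-01 00:00", "1901-07-02 18:00"),
    ("AL052004", 2004, "2004-08-25 06:00", "2004-09-03 12:00"),
    ("AL172005", 2005, "2005-10-01 00:00", "2005-10-02 12:00"),
]

FMT = "%Y-%m-%d %H:%M"
short_lived = defaultdict(int)   # duration of 2 days or less
longer_lived = defaultdict(int)  # duration greater than 2 days

for storm_id, year, start, end in tracks:
    duration_days = (datetime.strptime(end, FMT)
                     - datetime.strptime(start, FMT)).total_seconds() / 86400.0
    if duration_days <= 2.0:
        short_lived[year] += 1
    else:
        longer_lived[year] += 1

print("short-lived per year:", dict(short_lived))
print("medium/long-lived per year:", dict(longer_lived))
```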
Abstract:
1. It has been postulated that climate warming may pose the greatest threat to species in the tropics, where ectotherms have evolved more thermal specialist physiologies. Although species could rapidly respond to environmental change through adaptation, little is known about the potential for thermal adaptation, especially in tropical species. 2. In the light of the limited empirical evidence available and predictions from mutation-selection theory, we might expect tropical ectotherms to have limited genetic variance to enable adaptation. However, as a consequence of thermodynamic constraints, we might expect this disadvantage to be at least partially offset by a fitness advantage, that is, the ‘hotter-is-better’ hypothesis. 3. Using an established quantitative genetics model and metabolic scaling relationships, we integrate the consequences of the opposing forces of thermal specialization and thermodynamic constraints on adaptive potential by evaluating extinction risk under climate warming. We conclude that the potential advantage of a higher maximal development rate can in theory more than offset the potential disadvantage of lower genetic variance associated with a thermal specialist strategy. 4. Quantitative estimates of extinction risk are fundamentally very sensitive to estimates of generation time and genetic variance. However, our qualitative conclusion that the relative risk of extinction is likely to be lower for tropical species than for temperate species is robust to assumptions regarding the effects of effective population size, mutation rate and birth rate per capita. 5. With a view to improving ecological forecasts, we use this modelling framework to review the sensitivity of our predictions to the model’s underpinning theoretical assumptions and the empirical basis of macroecological patterns that suggest thermal specialization and fitness increase towards the tropics. We conclude by suggesting priority areas for further empirical research.
Abstract:
We present a well-dated, high-resolution, ~ 45 kyr lake sediment record reflecting regional temperature and precipitation change in the continental interior of the Southern Hemisphere (SH) tropics of South America. The study site is Laguna La Gaiba (LLG), a large lake (95 km2) hydrologically-linked to the Pantanal, an immense, seasonally-flooded basin and the world's largest tropical wetland (135,000 km2). Lake-level changes at LLG are therefore reflective of regional precipitation. We infer past fluctuations in precipitation at this site through changes in: i) pollen-inferred extent of flood-tolerant forest; ii) relative abundance of terra firme humid tropical forest versus seasonally-dry tropical forest pollen types; and iii) proportions of deep- versus shallow-water diatoms. A probabilistic model, based on plant family and genus climatic optima, was used to generate quantitative estimates of past temperature from the fossil pollen data. Our temperature reconstruction demonstrates rising temperature (by 4 °C) at 19.5 kyr BP, synchronous with the onset of deglacial warming in the central Andes, strengthening the evidence that climatic warming in the SH tropics preceded deglacial warming in the Northern Hemisphere (NH) by at least 5 kyr. We provide unequivocal evidence that the climate at LLG was markedly drier during the last glacial period (45.0–12.2 kyr BP) than during the Holocene, contrasting with SH tropical Andean and Atlantic records that demonstrate a strengthening of the South American summer monsoon during the global Last Glacial Maximum (~ 21 kyr BP), in tune with the ~ 20 kyr precession orbital cycle. Holocene climate conditions occurred as early as 12.8–12.2 kyr BP, when increased precipitation in the Pantanal catchment caused heightened flooding and rising lake levels in LLG. In contrast to this strong geographic variation in LGM precipitation across the continent, expansion of tropical dry forest between 10 and 3 kyr BP at LLG strengthens the body of evidence for widespread early–mid Holocene drought across tropical South America.
Abstract:
In this paper, we obtain quantitative estimates for the asymptotic density of subsets of the integer lattice Z2 that contain only trivial solutions to an additive equation involving binary forms. In the process we develop an analogue of Vinogradov’s mean value theorem applicable to binary forms.
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intraarray variability is small (only around 2% of the mean log signal), while interarray variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identifies an underlying structure which reflects some of the key biological variables which define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and collected by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and interarray variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further more extensive studies of the systems biology of eukaryotic cells.
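The study's pipeline is implemented as an R function; purely to illustrate the kind of variability figures quoted above (intra- and interarray SD on a log2 scale), here is a language-neutral sketch using simulated replicate signals. The simulated arrays, noise levels and duplicate-spot construction are placeholders, not the published data or code.

```python
import numpy as np

# Illustrative sketch of estimating intra- and interarray variability on a
# log2 scale from replicate measurements. Simulated data only.

rng = np.random.default_rng(0)
true_log2 = rng.uniform(6, 14, size=5000)          # per-gene "true" log2 signal

# Three replicate hybridisations of the same reference sample.
arrays = np.stack([true_log2 + rng.normal(0, 0.5, true_log2.size) for _ in range(3)])

# Interarray variability: SD across replicate arrays, averaged over genes.
interarray_sd = arrays.std(axis=0, ddof=1).mean()

# Duplicate spots within one array give an intraarray estimate:
# SD of paired differences divided by sqrt(2).
dup_a = arrays[0] + rng.normal(0, 0.1, true_log2.size)
dup_b = arrays[0] + rng.normal(0, 0.1, true_log2.size)
intraarray_sd = (dup_a - dup_b).std(ddof=1) / np.sqrt(2)

print(f"interarray SD ~ {interarray_sd:.2f} log2 units")
print(f"intraarray SD ~ {intraarray_sd:.2f} log2 units")
```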
Abstract:
Considerable progress has been made in understanding the present and future regional and global sea level in the 2 years since the publication of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change. Here, we evaluate how the new results affect the AR5’s assessment of (i) historical sea level rise, including attribution of that rise and implications for the sea level budget, (ii) projections of the components and of total global mean sea level (GMSL), and (iii) projections of regional variability and emergence of the anthropogenic signal. In each of these cases, new work largely provides additional evidence in support of the AR5 assessment, providing greater confidence in those findings. Recent analyses confirm the twentieth century sea level rise, with some analyses showing a slightly smaller rate before 1990 and some a slightly larger value than reported in the AR5. There is now more evidence of an acceleration in the rate of rise. Ongoing ocean heat uptake and associated thermal expansion have continued since 2000, and are consistent with ocean thermal expansion reported in the AR5. A significant amount of heat is being stored deeper in the water column, with a larger rate of heat uptake since 2000 compared to the previous decades and with the largest storage in the Southern Ocean. The first formal detection studies for ocean thermal expansion and glacier mass loss since the AR5 have confirmed the AR5 finding of a significant anthropogenic contribution to sea level rise over the last 50 years. New projections of glacier loss from two regions suggest smaller contributions to GMSL rise from these regions than in studies assessed by the AR5; additional regional studies are required to further assess whether there are broader implications of these results. Mass loss from the Greenland Ice Sheet, primarily as a result of increased surface melting, and from the Antarctic Ice Sheet, primarily as a result of increased ice discharge, has accelerated. The largest estimates of acceleration in mass loss from the two ice sheets for 2003–2013 equal or exceed the acceleration of GMSL rise calculated from the satellite altimeter sea level record over the longer period of 1993–2014. However, when increased mass gain in land water storage and parts of East Antarctica, and decreased mass loss from glaciers in Alaska and some other regions are taken into account, the net acceleration in the ocean mass gain is consistent with the satellite altimeter record. New studies suggest that a marine ice sheet instability (MISI) may have been initiated in parts of the West Antarctic Ice Sheet (WAIS), but that it will affect only a limited number of ice streams in the twenty-first century. New projections of mass loss from the Greenland and Antarctic Ice Sheets by 2100, including a contribution from parts of WAIS undergoing unstable retreat, suggest a contribution that falls largely within the likely range (i.e., two thirds probability) of the AR5. These new results increase confidence in the AR5 likely range, indicating that there is a greater probability that sea level rise by 2100 will lie in this range with a corresponding decrease in the likelihood of an additional contribution of several tens of centimeters above the likely range. In view of the comparatively limited state of knowledge and understanding of rapid ice sheet dynamics, we continue to think that it is not yet possible to make reliable quantitative estimates of future GMSL rise outside the likely range. 
Projections of twenty-first century GMSL rise published since the AR5 depend on results from expert elicitation, but we have low confidence in conclusions based on these approaches. New work on regional projections and emergence of the anthropogenic signal suggests that the two commonly predicted features of future regional sea level change (the increasing tilt across the Antarctic Circumpolar Current and the dipole in the North Atlantic) are related to regional changes in wind stress and surface heat flux. Moreover, it is expected that sea level change in response to anthropogenic forcing, particularly in regions of relatively low unforced variability such as the low-latitude Atlantic, will be detectable over most of the ocean by 2040. The east-west contrast of sea level trends in the Pacific observed since the early 1990s cannot be satisfactorily accounted for by climate models, nor yet definitively attributed either to unforced variability or forced climate change.
Abstract:
Seeds of 39 seed lots of a total of twelve different crops were stored hermetically in a wide range of air-dry environments (2–25% moisture content at 0–50 °C), viability was assessed periodically, and the seed viability equation constants were estimated. Within a species, estimates of the constants which quantify absolute longevity (K_E) and the relative effects on longevity of moisture content (C_W) and temperature (C_H and C_Q) did not differ (P > 0.05 to P > 0.25) among lots. Comparison among the 12 crops provided variant estimates of K_E and C_W (P < 0.01), but common values of C_H and C_Q (0.0322 and 0.000454, respectively, P > 0.25). Maize (Zea mays) provided the greatest estimate of K_E (9.993, s.e. = 0.456), followed by sorghum (Sorghum bicolor) (9.381, s.e. = 0.428), pearl millet (Pennisetum typhoides) (9.336, s.e. = 0.408), sugar beet (Beta vulgaris) (8.988, s.e. = 0.387), African rice (Oryza glaberrima) (8.786, s.e. = 0.484), wheat (Triticum aestivum) (8.498, s.e. = 0.431), foxtail millet (Setaria italica) (8.478, s.e. = 0.396), sugarcane (Saccharum sp.) (8.454, s.e. = 0.545), finger millet (Eleusine coracana) (8.288, s.e. = 0.392), kodo millet (Paspalum scrobiculatum) (8.138, s.e. = 0.418), rice (Oryza sativa) (8.096, s.e. = 0.416) and potato (Solanum tuberosum) (8.037, s.e. = 0.397). Similarly, estimates of C_W were ranked maize (5.993, s.e. = 0.392), pearl millet (5.540, s.e. = 0.348), sorghum (5.379, s.e. = 0.365), potato (5.152, s.e. = 0.347), sugar beet (4.969, s.e. = 0.328), sugarcane (4.964, s.e. = 0.518), foxtail millet (4.829, s.e. = 0.339), wheat (4.836, s.e. = 0.366), African rice (4.727, s.e. = 0.416), kodo millet (4.435, s.e. = 0.360), finger millet (4.345, s.e. = 0.336) and rice (4.246, s.e. = 0.355). The application of these constants to long-term seed storage is discussed.
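For orientation, the constants estimated here are those of the Ellis–Roberts seed viability equation, which in its standard form reads

$$v = K_i - \frac{p}{\sigma}, \qquad \log_{10}\sigma = K_E - C_W\log_{10}m - C_H t - C_Q t^2,$$

where $v$ is the probit percentage viability after $p$ days of storage, $K_i$ is the lot-specific initial viability, $\sigma$ is the standard deviation of the seed survival curve (days), $m$ is moisture content (%, fresh weight) and $t$ is temperature (°C). Read this way, $K_E$ quantifies absolute longevity, $C_W$ the sensitivity of longevity to moisture content, and $C_H$ and $C_Q$ its sensitivity to temperature, which is the sense in which the species comparisons above should be interpreted.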
Abstract:
This study presents a systematic and quantitative analysis of the effect of inhomogeneous surface albedo on shortwave cloud absorption estimates. We used 3D radiative transfer modeling over a checkerboard surface albedo to calculate cloud absorption. We have found that accounting for surface heterogeneity enhances cloud absorption. However, the enhancement is not sufficient to explain the reported difference between measured and modeled cloud absorption.
Abstract:
We consider the classical coupled, combined-field integral equation formulations for time-harmonic acoustic scattering by a sound soft bounded obstacle. In recent work, we have proved lower and upper bounds on the $L^2$ condition numbers for these formulations, and also on the norms of the classical acoustic single- and double-layer potential operators. These bounds to some extent make explicit the dependence of condition numbers on the wave number $k$, the geometry of the scatterer, and the coupling parameter. For example, with the usual choice of coupling parameter they show that, while the condition number grows like $k^{1/3}$ as $k\to\infty$, when the scatterer is a circle or sphere, it can grow as fast as $k^{7/5}$ for a class of `trapping' obstacles. In this paper we prove further bounds, sharpening and extending our previous results. In particular we show that there exist trapping obstacles for which the condition numbers grow as fast as $\exp(\gamma k)$, for some $\gamma>0$, as $k\to\infty$ through some sequence. This result depends on exponential localisation bounds on Laplace eigenfunctions in an ellipse that we prove in the appendix. We also clarify the correct choice of coupling parameter in 2D for low $k$. In the second part of the paper we focus on the boundary element discretisation of these operators. We discuss the extent to which the bounds on the continuous operators are also satisfied by their discrete counterparts and, via numerical experiments, we provide supporting evidence for some of the theoretical results, both quantitative and asymptotic, indicating further which of the upper and lower bounds may be sharper.
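For context, the classical combined-field (combined potential) formulation for the sound-soft problem is usually written, in standard notation, as

$$A'_{k,\eta} := \tfrac{1}{2}I + D'_k - i\eta S_k,$$

where $S_k$ and $D'_k$ denote the acoustic single-layer and adjoint double-layer boundary integral operators at wavenumber $k$ and $\eta\neq 0$ is the coupling parameter; the "usual choice" referred to above is $\eta$ proportional to $k$ for large $k$. (This is background notation for readers, not a statement of the paper's precise conventions.)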
Abstract:
We have developed a new Bayesian approach to retrieve oceanic rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI), with an emphasis on typhoon cases in the West Pacific. Retrieved rain rates are validated with measurements of rain gauges located on Japanese islands. To demonstrate improvement, retrievals are also compared with those from the TRMM/Precipitation Radar (PR), the Goddard Profiling Algorithm (GPROF), and a multi-channel linear regression statistical method (MLRS). We have found that qualitatively, all methods retrieved similar horizontal distributions in terms of locations of eyes and rain bands of typhoons. Quantitatively, our new Bayesian retrievals have the best linearity and the smallest root mean square (RMS) error against rain gauge data for 16 typhoon overpasses in 2004. The correlation coefficient and RMS of our retrievals are 0.95 and ~2 mm hr⁻¹, respectively. In particular, at heavy rain rates, our Bayesian retrievals outperform those retrieved from GPROF and MLRS. Overall, the new Bayesian approach accurately retrieves surface rain rate for typhoon cases. Accurate rain rate estimates from this method can be assimilated in models to improve forecasts and prevent potential damage in Taiwan during typhoon seasons.
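The validation statistics quoted (correlation of 0.95 and RMS error of about 2 mm hr⁻¹) are standard point-match metrics computed over collocated retrieval–gauge pairs. A minimal sketch follows, with hypothetical values rather than the study's matchup database, and with the Bayesian retrieval itself not shown.

```python
import numpy as np

# Sketch of the gauge-validation statistics quoted above (correlation and
# RMS error). The retrieval/gauge pairs are hypothetical placeholders.

retrieved = np.array([2.1, 5.5, 12.0, 30.2, 8.7])   # retrieved rain rates (mm/hr)
gauge     = np.array([2.5, 5.0, 11.0, 28.0, 9.5])   # collocated gauge rates (mm/hr)

rms_error = np.sqrt(np.mean((retrieved - gauge) ** 2))
correlation = np.corrcoef(retrieved, gauge)[0, 1]

print(f"RMS error = {rms_error:.2f} mm/hr, correlation = {correlation:.2f}")
```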
Abstract:
The Bollène-2002 Experiment was aimed at developing the use of a radar volume-scanning strategy for conducting radar rainfall estimations in the mountainous regions of France. A developmental radar processing system, called Traitements Régionalisés et Adaptatifs de Données Radar pour l’Hydrologie (Regionalized and Adaptive Radar Data Processing for Hydrological Applications), has been built and several algorithms were specifically produced as part of this project. These algorithms include 1) a clutter identification technique based on the pulse-to-pulse variability of reflectivity Z for noncoherent radar, 2) a coupled procedure for determining a rain partition between convective and widespread rainfall R and the associated normalized vertical profiles of reflectivity, and 3) a method for calculating reflectivity at ground level from reflectivities measured aloft. Several radar processing strategies, including nonadaptive, time-adaptive, and space–time-adaptive variants, have been implemented to assess the performance of these new algorithms. Reference rainfall data were derived from a careful analysis of rain gauge datasets furnished by the Cévennes–Vivarais Mediterranean Hydrometeorological Observatory. The assessment criteria for five intense and long-lasting Mediterranean rain events have proven that good quantitative precipitation estimates can be obtained from radar data alone within 100-km range by using well-sited, well-maintained radar systems and sophisticated, physically based data-processing systems. The basic requirements entail performing accurate electronic calibration and stability verification, determining the radar detection domain, achieving efficient clutter elimination, and capturing the vertical structure(s) of reflectivity for the target event. Radar performance was shown to depend on type of rainfall, with better results obtained with deep convective rain systems (Nash coefficients of roughly 0.90 for point radar–rain gauge comparisons at the event time step), as opposed to shallow convective and frontal rain systems (Nash coefficients in the 0.6–0.8 range). In comparison with time-adaptive strategies, the space–time-adaptive strategy yields a very significant reduction in the radar–rain gauge bias while the level of scatter remains basically unchanged. Because the Z–R relationships have not been optimized in this study, results are attributed to an improved processing of spatial variations in the vertical profile of reflectivity. The two main recommendations for future work consist of adapting the rain separation method for radar network operations and documenting Z–R relationships conditional on rainfall type.
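The Nash coefficients used above to grade the radar–rain gauge comparisons are Nash–Sutcliffe efficiencies. For reference, the criterion and a minimal computation are sketched below; the radar and gauge accumulations are hypothetical placeholders, not the Bollène-2002 data.

```python
import numpy as np

# Nash-Sutcliffe efficiency, the criterion behind the "Nash coefficients"
# quoted for the radar-rain gauge comparisons. Values are hypothetical.

def nash_sutcliffe(simulated, observed):
    """NSE = 1 - sum((sim - obs)^2) / sum((obs - mean(obs))^2)."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return 1.0 - np.sum((simulated - observed) ** 2) / np.sum((observed - observed.mean()) ** 2)

radar_rain = [12.0, 35.5, 8.2, 60.1, 22.3]   # event-accumulated radar estimates (mm)
gauge_rain = [14.0, 33.0, 9.5, 65.0, 20.0]   # collocated gauge accumulations (mm)

print(f"Nash coefficient: {nash_sutcliffe(radar_rain, gauge_rain):.2f}")
```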