937 results for ESTIMATE OF BIOPHYSICAL DATA
Abstract:
Empirical approaches and, more recently, physical approaches have grounded the establishment of logical connections between radiometric variables derived from remotely sensed data and biophysical variables of the vegetation cover. This study aimed to evaluate correlations between dendrometric and canopy density data from Eucalyptus spp. stands, collected in the Capao Bonito forest unit, and radiometric data from imagery acquired by the TM/Landsat-5 sensor on two orbital passes over the study site (on dates close to the field data collection). Results indicate that the strongest correlations were found between crown dimensions and canopy height and the near-infrared spectral band data (ρs4), irrespective of the satellite pass date. Estimates of the spatial distribution of dendrometric data and canopy density (D) based on spectral characterization were consistent with the spatial distribution of tree ages during the study period. Statistical tests were applied to evaluate differences in the performance of the empirical models depending on the image acquisition date, and indicated a significant difference between models based on the two acquisition dates.
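The abstract does not detail the correlation procedure; purely as a hedged sketch, the kind of rank correlation between canopy variables and near-infrared reflectance it describes could be computed along these lines (synthetic data and variable names are assumptions, not the study's dataset):

```python
# Illustrative sketch (synthetic data, hypothetical variable names): Spearman rank
# correlations between canopy variables and per-stand NIR reflectance values.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_stands = 60
canopy_height = rng.uniform(5.0, 30.0, n_stands)                       # m (synthetic)
crown_diameter = 0.2 * canopy_height + rng.normal(0.0, 0.5, n_stands)  # m (synthetic)
nir_reflectance = 0.15 + 0.01 * canopy_height + rng.normal(0.0, 0.02, n_stands)

for name, var in [("crown diameter", crown_diameter), ("canopy height", canopy_height)]:
    rho, p = spearmanr(var, nir_reflectance)
    print(f"{name} vs NIR reflectance: rho = {rho:.2f} (p = {p:.3g})")
```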
Abstract:
Marine species generally have large population sizes, continuous distributions and high dispersal capacity. Despite this, they are often subdivided into separate populations, which are the basic units of fisheries management. For example, populations of some fisheries species on either side of the deep water of the Timor Trench are genetically distinct, implying minimal movement and interbreeding. When connectivity is higher than in the Timor Trench example, but not so high that the populations become one, connectivity between populations is 'crinkled'. Crinkled connectivity occurs when migration is above the threshold required to link populations genetically, but below the threshold for demographic links. In future, genetic estimates of connectivity over crinkled links could be uniquely combined with other data, such as estimates of population size and tagging and tracking data, to quantify demographic connectedness between these types of populations. Elasmobranch species may be ideal targets for this research because connectivity between their populations is more likely to be crinkled than for finfish species. Fisheries stock-assessment models could be strengthened with estimates of connectivity to improve the strategic and sustainable harvesting of biological resources.
Abstract:
The scope of this study was to estimate calibrated values for dietary data obtained with the Food Frequency Questionnaire for Adolescents (FFQA) and to illustrate the effect of this approach on food consumption data. The adolescents were assessed on two occasions, with an average interval of twelve months: 393 adolescents participated in 2004, and 289 of them were reassessed in 2005. Dietary data obtained with the FFQA were calibrated using regression coefficients estimated from the average of two 24-hour recalls (24HR) applied in a subsample. The calibrated values were similar to the 24HR reference measurement in the subsample. In both 2004 and 2005, a significant difference was observed between the average consumption levels reported by the FFQA before and after calibration for all nutrients. With the use of calibrated data, the proportion of schoolchildren with fiber intake below the recommended level increased. Calibrated data can therefore be used to obtain adjusted associations, owing to the reclassification of subjects within the predetermined categories.
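A minimal sketch of the regression-calibration idea described above, using synthetic data and hypothetical variable names (the study's actual coefficients and nutrient list are not reproduced here):

```python
# Regression calibration sketch: the mean of two 24-hour recalls (24HR) in a
# subsample is regressed on the FFQ value, and the fitted coefficients are then
# applied to the FFQ measurements. Synthetic data, hypothetical names.
import numpy as np

rng = np.random.default_rng(1)
n = 289
true_intake = rng.normal(20.0, 5.0, n)              # e.g. fibre, g/day (synthetic)
ffq = 1.4 * true_intake + rng.normal(0.0, 6.0, n)   # FFQ over-reports in this toy setup
r24_mean = true_intake + rng.normal(0.0, 3.0, n)    # mean of two 24HR recalls

slope, intercept = np.polyfit(ffq, r24_mean, 1)     # fit r24_mean ~ a + b * ffq
ffq_calibrated = intercept + slope * ffq
print(f"calibrated intake = {intercept:.2f} + {slope:.2f} * FFQ")
print(f"mean FFQ {ffq.mean():.1f} g/day -> mean calibrated {ffq_calibrated.mean():.1f} g/day")
```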
Abstract:
The assessment of glacier thickness is one of the most widespread applications of radioglaciology and is the basis for estimating glacier volume. The accuracy of the ice-thickness measurements, the distribution of profiles over the glacier and the accuracy of the glacier boundary delineation are the most important factors determining the error in the evaluation of the glacier volume. The aim of this study is to obtain an accurate estimate of the error incurred in estimating glacier volume from GPR-retrieved ice-thickness data.
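As a hedged illustration only (placeholder numbers, not the study's error budget), independent error components of a volume estimate V = mean thickness x area might be combined in quadrature like this:

```python
# Hedged sketch: combining independent error sources in a glacier volume estimate
# V = mean_thickness * area. All numbers are placeholders.
import math

mean_thickness = 45.0      # m, GPR-derived mean ice thickness (placeholder)
area = 2.1e6               # m^2, area from boundary delineation (placeholder)
volume = mean_thickness * area

sigma_thickness = 3.0      # m, thickness measurement + interpolation error (placeholder)
sigma_area = 8.0e4         # m^2, boundary delineation error (placeholder)

# Relative errors combined in quadrature, assuming the sources are independent
rel_err = math.sqrt((sigma_thickness / mean_thickness) ** 2 + (sigma_area / area) ** 2)
print(f"V = {volume:.3e} m^3 +/- {100.0 * rel_err:.1f}%")
```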
Abstract:
Background and Objective: To maximise the benefit from statin therapy, patients must maintain regular therapy indefinitely. Non-compliance is thought to be common in those taking medication at regular intervals over long periods of time, especially where they may perceive no immediate benefit (News editorial, 2002). This study extends previous work in which commonly held prescribing data are used as a surrogate marker of compliance, and was designed to examine compliance in patients stabilised on statins in a large General Practice. Design: Following ethical approval, details were obtained of all patients who had received a single statin for 12 consecutive months, with no changes in drug, frequency or dose, between December 1999 and March 2003. Setting: An Eastern Birmingham Primary Care Trust GP surgery. Main Outcome Measures: A compliance ratio was calculated by dividing the number of days of treatment by the number of doses prescribed; for a once-daily regimen the ratio for full compliance is 1. Results: 324 patients were identified. The average compliance ratio was 1.06 ± 0.01 (range 0.46–2.13) for the first six months of the study and 1.05 ± 0.01 (range 0.58–2.08) for the full twelve months. Conclusions: These data indicate that, as a group, long-term stabilised statin users appear compliant. However, the range of values obtained shows that there are identifiable subsets of patients who are not taking their therapy as prescribed. Although the apparent use of more doses than prescribed in some patients may result from medication hoarding, this cannot be the case in the patients who apparently take less. It has been demonstrated here that the compliance ratio can be used as an early indicator of problems, allowing targeted compliance advice to be given where it will have the most benefit. References: News Editorial. Pharmacy records could be used to enhance statin compliance in elderly. Pharm. J. 2002; 269: 121.
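A small sketch of the compliance ratio as defined above, with hypothetical numbers:

```python
# Compliance ratio as described above: days of treatment divided by the number of
# doses prescribed, so that a once-daily regimen taken exactly as prescribed gives 1.
def compliance_ratio(days_of_treatment: int, doses_prescribed: int) -> float:
    return days_of_treatment / doses_prescribed

# Hypothetical patient: 182 days in the first six months, 168 once-daily doses prescribed
print(round(compliance_ratio(182, 168), 2))   # values away from 1 flag patients to follow up
```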
Abstract:
The need for continuously recording rain gauges makes it difficult to determine the rainfall erosivity factor (R-factor) of the (R)USLE model in areas without good temporal data coverage. In mainland Spain, the Nature Conservation Institute (ICONA) determined the R-factor at only a few selected pluviograph stations, so simple estimates of the R-factor are of great interest. The objectives of this study were: (1) to identify a readily available estimate of the R-factor for mainland Spain; (2) to discuss the applicability of a single (global) estimate based on the analysis of regional results; (3) to evaluate the effect of record length on the precision and accuracy of the estimate; and (4) to validate an available regression model developed by ICONA. Four estimators based on monthly precipitation were computed at 74 rainfall stations throughout mainland Spain. The regression analysis conducted at the global level clearly showed that the modified Fournier index (MFI) ranked first among all the indexes assessed. The applicability of this preliminary global model across mainland Spain was evaluated by analysing regression results obtained at the regional level. Three contiguous regions of eastern Spain (Catalonia, Valencian Community and Murcia) were found to have a possibly different rainfall erosivity pattern, so a new regression analysis was conducted by dividing mainland Spain into two areas: eastern Spain and the plateau-lowland area. A comparative analysis concluded that the bi-areal regression model based on MFI for a 10-year record length provided a simple, precise and accurate estimate of the R-factor in mainland Spain. Finally, validation of the regression model proposed by ICONA showed that the R-ICONA index overpredicted the R-factor by approximately 19%.
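Under the common definition of the modified Fournier index, MFI = Σ p_i² / P, with p_i the monthly and P the annual precipitation. A minimal sketch of how such an index could feed a linear R-factor model follows; the data and coefficients are placeholders, not the values fitted for mainland Spain.

```python
# Modified Fournier index from monthly precipitation, plus a note on the linear
# R-factor model. Placeholder data only.
import numpy as np

def mfi(monthly_precip_mm):
    """MFI = sum_i p_i**2 / P, with P the annual total (common definition)."""
    p = np.asarray(monthly_precip_mm, dtype=float)
    return (p ** 2).sum() / p.sum()

station = [45, 38, 40, 52, 48, 20, 8, 10, 35, 70, 65, 50]   # mm per month, placeholder
print(f"MFI = {mfi(station):.1f} mm")

# Given MFI values and observed R at reference pluviographs, a model R = a + b * MFI
# can be fitted per area with, e.g., np.polyfit(mfi_values, r_values, 1).
```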
Abstract:
This paper proposes an experimental study of quality metrics that can be applied to visual and infrared images acquired by cameras onboard an unmanned ground vehicle (UGV). The relevance of existing metrics in this context is discussed and a novel metric is introduced. The selected metrics are evaluated on data collected by a UGV in clear and challenging environmental conditions, represented in this paper by the presence of airborne dust or smoke. An example application is given with monocular SLAM estimating the pose of the UGV while smoke is present in the environment. It is shown that the proposed quality metric can be used to anticipate situations where the quality of the pose estimate will be significantly degraded by the input image data. This enables decisions such as switching advantageously between data sources (e.g. using infrared images instead of visual images).
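The abstract does not specify the novel metric; purely as an illustration of the kind of quality score involved, a mean-gradient (contrast) measure that typically drops under dust or smoke could look like this:

```python
# Purely illustrative quality score (NOT the paper's novel metric): mean gradient
# magnitude, which tends to drop when airborne dust or smoke washes out contrast.
import numpy as np

def mean_gradient_magnitude(gray: np.ndarray) -> float:
    gy, gx = np.gradient(gray.astype(float))
    return float(np.hypot(gx, gy).mean())

rng = np.random.default_rng(2)
clear = rng.uniform(0.0, 255.0, (240, 320))        # stand-in for a clear-conditions frame
smoky = 0.3 * clear + 0.7 * clear.mean()           # crude low-contrast degradation
print(mean_gradient_magnitude(clear), mean_gradient_magnitude(smoky))
```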
Abstract:
In estuaries and natural water channels, the estimation of velocity and dispersion coefficients is critical to understanding scalar transport and mixing. Such estimates are rarely available experimentally at sub-tidal time scales in shallow water channels, where high-frequency sampling is required to capture their spatio-temporal variation. This study estimates Lagrangian integral scales and autocorrelation curves, which are key parameters for obtaining velocity fluctuations and dispersion coefficients, and their spatio-temporal variability from deployments of Lagrangian drifters sampled at 10 Hz over a 4-hour period. The power spectral densities of the velocities between 0.0001 and 0.8 Hz were well fitted by the -5/3 slope predicted by Kolmogorov's similarity hypothesis within the inertial subrange, and were similar to the Eulerian power spectra previously observed within the estuary. The results showed that large velocity fluctuations determine the magnitude of the integral time scale, TL. Overlapping short segments improved the stability of the estimate of TL by taking advantage of the redundant data included in the autocorrelation function. The integral time scales were about 20 s and varied by up to a factor of 8. These results are essential inputs for the spatial binning of velocities, Lagrangian stochastic modelling and single-particle analysis of the tidal estuary.
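A minimal sketch of the integral-time-scale estimate described above, with a synthetic 10 Hz velocity record standing in for real drifter data: TL is taken here as the integral of the velocity autocorrelation up to its first zero crossing.

```python
# Sketch: Lagrangian integral time scale from the velocity autocorrelation,
# integrated up to the first zero crossing. A synthetic AR(1) velocity record
# with ~20 s memory stands in for 10 Hz drifter data.
import numpy as np

fs, tau = 10.0, 20.0                     # sampling rate (Hz), imposed memory (s)
dt = 1.0 / fs
n = int(2 * 3600 * fs)                   # two hours of samples
rng = np.random.default_rng(3)
noise = rng.normal(0.0, 0.01, n)
u = np.zeros(n)
phi = np.exp(-dt / tau)
for i in range(1, n):
    u[i] = phi * u[i - 1] + noise[i]

u -= u.mean()
f = np.fft.rfft(u, 2 * n)                # FFT-based (zero-padded) autocorrelation
acf = np.fft.irfft(f * np.conj(f))[:n]
acf /= acf[0]
first_zero = int(np.argmax(acf <= 0.0)) or n
T_L = acf[:first_zero].sum() * dt
print(f"T_L ~ {T_L:.1f} s")
```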
Abstract:
An application that translates raw thermal melt curve data into more easily assimilated knowledge is described. This program, called ‘Meltdown’, performs a number of data remediation steps before classifying melt curves and estimating melting temperatures. The final output is a report that summarizes the results of a differential scanning fluorimetry (DSF) experiment. Meltdown uses a Bayesian classification scheme, enabling reproducible identification of various trends commonly found in DSF datasets. The goal of Meltdown is not to replace human analysis of the raw data, but to provide a sensible interpretation of the data that makes this useful experimental technique accessible to naïve users, as well as providing a starting point for detailed analyses by more experienced users.
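Meltdown's Bayesian classification is not reproduced here; a simple, generic way to put a number on a melting temperature, shown only for orientation, is to take the temperature at which dF/dT of the melt curve peaks:

```python
# Generic Tm estimate (NOT Meltdown's Bayesian approach): the temperature at which
# the first derivative dF/dT of a synthetic melt curve is largest.
import numpy as np

temps = np.arange(25.0, 95.0, 0.5)                          # deg C
tm_true = 62.0
fluor = 1.0 / (1.0 + np.exp(-(temps - tm_true) / 2.0))      # idealized sigmoidal melt curve
fluor = fluor + np.random.default_rng(4).normal(0.0, 0.01, temps.size)

dfdt = np.gradient(fluor, temps)
print(f"estimated Tm ~ {temps[np.argmax(dfdt)]:.1f} C")
```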
Abstract:
A constrained high-order statistical algorithm is proposed to blindly deconvolve measured spectral data and simultaneously estimate the response function of the instrument. In this algorithm, no prior knowledge is necessary except a proper length for the unit-impulse response, which can easily be set to the width of the narrowest spectral line by inspecting the measured data. The feasibility of the method has been demonstrated experimentally on measured Raman and absorption spectral data.
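The abstract does not give the constrained high-order statistical updates; the sketch below only illustrates the underlying forward model d = f convolved with h plus noise, and a naive regularized Fourier inversion when the response h is known, i.e. the non-blind baseline that a blind method must manage without.

```python
# Non-blind baseline only (NOT the proposed blind algorithm): forward model
# d = f (*) h + noise, inverted in the Fourier domain with Tikhonov-style damping.
import numpy as np

rng = np.random.default_rng(5)
n = 512
x = np.arange(n)
f_true = np.exp(-0.5 * ((x - 200) / 3.0) ** 2) + 0.6 * np.exp(-0.5 * ((x - 320) / 5.0) ** 2)
h = np.exp(-0.5 * ((x - n // 2) / 8.0) ** 2)
h /= h.sum()                                        # instrument response (known here)

H = np.fft.fft(np.fft.ifftshift(h))                 # center the kernel at sample 0
d = np.fft.ifft(np.fft.fft(f_true) * H).real + rng.normal(0.0, 0.002, n)

lam = 1e-3                                          # damping against noise blow-up
f_hat = np.fft.ifft(np.fft.fft(d) * np.conj(H) / (np.abs(H) ** 2 + lam)).real
print(f"correlation(true, recovered) = {np.corrcoef(f_true, f_hat)[0, 1]:.3f}")
```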
Abstract:
Many of the most interesting questions ecologists ask lead to analyses of spatial data. Yet, perhaps confused by the large number of statistical models and fitting methods available, many ecologists seem to believe this is best left to specialists. Here, we describe the issues that need consideration when analysing spatial data and illustrate these using simulation studies. Our comparative analysis uses methods including generalized least squares, spatial filters, wavelet revised models, conditional autoregressive models and generalized additive mixed models to estimate regression coefficients from synthetic but realistic data sets, including some that violate standard regression assumptions. We assess the performance of each method using two measures and statistical error rates for model selection. Methods that performed well included the generalized least squares family of models and a Bayesian implementation of the conditional autoregressive model. Ordinary least squares also performed adequately in the absence of model selection, but had poorly controlled Type I error rates and so did not show the improvement in performance under model selection seen with the methods above. Removing large-scale spatial trends in the response led to poor performance. These are empirical results, and extrapolation of the findings to other situations should therefore be done cautiously. Nevertheless, our simulation-based approach provides much stronger evidence for comparative analysis than assessments based on single or small numbers of data sets, and should be considered a necessary foundation for statements of this type in future.
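In the same spirit as the simulations described above, a minimal sketch comparing OLS with GLS under a known exponential spatial covariance (synthetic data; the study's full set of methods of course goes further):

```python
# Simulation sketch: regression coefficients recovered by OLS and by GLS when the
# errors carry exponential spatial correlation with a known covariance.
import numpy as np

rng = np.random.default_rng(6)
n = 200
coords = rng.uniform(0.0, 10.0, (n, 2))
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
cov = np.exp(-dist / 2.0)                            # exponential correlation, range 2
err = rng.multivariate_normal(np.zeros(n), cov)

x = rng.normal(size=n)
y = 1.0 + 0.5 * x + err                              # true intercept 1.0, slope 0.5
X = np.column_stack([np.ones(n), x])

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
cov_inv = np.linalg.inv(cov)
beta_gls = np.linalg.solve(X.T @ cov_inv @ X, X.T @ cov_inv @ y)
print("OLS:", beta_ols.round(3), " GLS:", beta_gls.round(3))
```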
Abstract:
Annual loss of nests to industrial (nonwoodlot) forest harvesting in Canada was estimated using two avian point-count data sources: (1) the Boreal Avian Monitoring Project (BAM) dataset for the provinces operating in this biome and (2) available data summarized for the major (nonboreal) forest regions of British Columbia. Accounting for uncertainty in the proportion of harvest occurring during the breeding season and in avian nesting densities, our estimate ranges from 616 thousand to 2.09 million nests. Estimates of the impact on the number of individuals recruited into the adult breeding population were made by applying survivorship estimates at various stages of the life cycle. Future improvements to this estimate are expected as better and more extensive avian breeding pair density estimates become available and as provincial forestry statistics become more refined, spatially and temporally. The effect of incidental take due to forestry is not uniform and is disproportionately centered in the southern boreal. Species whose ranges occur primarily in these regions are most at risk from industrial forestry in general and from incidental take in particular. Refinements to the nest loss estimate for industrial forestry in Canada will be achieved primarily through the provision of more accurate estimates of the area of forest harvested annually during the breeding season, stratified by forest type and Bird Conservation Region (BCR). A better understanding of survivorship among life-history stages for forest birds would also allow better modeling of the effect of nest loss on adult recruitment. Finally, models are needed to project the legacy effects of forest harvesting on avian populations, taking into account forest succession and the accompanying cumulative effects of landscape change.
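A hedged Monte Carlo sketch of the kind of uncertainty propagation described above; all inputs are placeholders, not the BAM-derived values behind the 0.6 to 2.1 million figure.

```python
# Monte Carlo sketch: propagate uncertainty in the breeding-season harvest
# proportion and in nest density to a nest-loss range. Placeholder inputs only.
import numpy as np

rng = np.random.default_rng(7)
draws = 100_000
area_harvested_ha = 700_000.0                              # ha harvested per year (placeholder)
p_breeding = rng.uniform(0.2, 0.5, draws)                  # share harvested in breeding season
nests_per_ha = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=draws)

nest_loss = area_harvested_ha * p_breeding * nests_per_ha
lo, hi = np.percentile(nest_loss, [2.5, 97.5])
print(f"median {np.median(nest_loss):.3g} nests, 95% interval {lo:.3g}-{hi:.3g}")
```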
Abstract:
Burst timing synchronisation is maintained in a digital data decoder during multiple burst reception in a TDMA system. The data within a multiple burst are streamed into memory storage and data corresponding to a first burst in the series of bursts are selected on the basis of a current timing estimate derived from a synchronisation burst. Selections of data corresponding to other bursts in the series of bursts are modified in accordance with updated timing estimates derived from previously processed bursts.
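A toy sketch of the selection logic described in this abstract; the slot and burst lengths and the update rule are illustrative placeholders, not values or methods taken from the patent.

```python
# Toy sketch: bursts streamed into a buffer are cut out using a timing estimate
# that starts from the synchronisation burst and is updated from each processed
# burst. Lengths and the update rule are placeholders.
BURST_LEN = 148          # samples per burst (placeholder)
SLOT_LEN = 156           # samples per slot including guard period (placeholder)

def select_bursts(buffer, initial_offset, n_bursts, update_fn):
    """Return per-burst sample slices, refining the timing offset after each burst."""
    offset = initial_offset
    bursts = []
    for k in range(n_bursts):
        start = k * SLOT_LEN + offset
        bursts.append(buffer[start:start + BURST_LEN])
        offset = update_fn(bursts[-1], offset)   # timing derived from the processed burst
    return bursts

# Example with a no-op update that keeps the synchronisation-burst estimate
samples = list(range(4 * SLOT_LEN + 20))
out = select_bursts(samples, initial_offset=3, n_bursts=4,
                    update_fn=lambda burst, off: off)
print([b[0] for b in out])
```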