136 results for categorical and mix datasets
Abstract:
Current variability of precipitation (P) and its response to surface temperature (T) are analysed using coupled (CMIP5) and atmosphere-only (AMIP5) climate model simulations and compared with observational estimates. There is striking agreement between Global Precipitation Climatology Project (GPCP) observed and AMIP5 simulated P anomalies over land, both globally and in the tropics, suggesting that prescribed sea surface temperatures and realistic radiative forcings are sufficient for simulating the interannual variability in continental P. Differences between the observed and simulated P variability over the ocean originate primarily from the wet tropical regions, in particular the western Pacific, but are reduced slightly after 1995. All datasets show positive responses of P to T globally, of around 2%/K for the simulations and 3-4%/K for GPCP observations, but model responses over the tropical oceans are around three times smaller than GPCP's over the period 1988-2005. The observed anticorrelation between land and ocean P, linked with the El Niño Southern Oscillation, is captured by the simulations. All datasets over the tropical ocean show a tendency for wet regions to become wetter and dry regions drier with warming. Over the wet region (defined by the 75th precipitation percentile), the precipitation response is ~13-15%/K for GPCP and ~5%/K for the models, while trends in P are 2.4%/decade for GPCP, 0.6%/decade for CMIP5 and 0.9%/decade for AMIP5, suggesting that the models underestimate the precipitation response or that a deficiency exists in the satellite datasets.
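The %/K responses quoted above are conventionally obtained by regressing precipitation anomalies on temperature anomalies and normalising the slope by climatological precipitation. A minimal sketch of that calculation (not the authors' code; the variable names and synthetic data are invented for illustration):

```python
import numpy as np

def precip_response_percent_per_k(p_anom, t_anom, p_clim):
    """Regress P anomalies (mm/day) on T anomalies (K) and express
    the slope as a percentage of climatological P per kelvin."""
    slope, _intercept = np.polyfit(t_anom, p_anom, 1)  # dP/dT in mm/day per K
    return 100.0 * slope / p_clim                      # %/K

# Synthetic example: 30 years of anomalies with a built-in 2 %/K response.
rng = np.random.default_rng(0)
t_anom = rng.normal(0.0, 0.3, 30)          # interannual T anomalies (K)
p_clim = 2.7                               # climatological P (mm/day)
p_anom = 0.02 * p_clim * t_anom + rng.normal(0.0, 0.01, 30)
print(precip_response_percent_per_k(p_anom, t_anom, p_clim))  # ~2 %/K
```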
Abstract:
Reliable techniques for screening large numbers of plants for root traits are still being developed, but include aeroponic, hydroponic and agar plate systems. Coupled with digital cameras and image analysis software, these systems permit the rapid measurement of root numbers, length and diameter in moderate (typically <1000) numbers of plants. Usually such systems are employed with relatively small seedlings, and information is recorded in 2D. Recent developments in X-ray microtomography have facilitated 3D non-invasive measurement of small root systems grown in solid media, allowing angular distributions to be obtained in addition to numbers and length. However, because of the time taken to scan samples, only a small number can be screened (typically <10 per day, not including analysis time for the large spatial datasets generated) and, depending on sample size, limited resolution may mean that fine roots remain unresolved. Although agar plates allow differences between lines and genotypes to be discerned in young seedlings, the rank order may not be the same when the same materials are grown in solid media. For example, root length of dwarfing wheat (Triticum aestivum L.) lines grown on agar plates was increased by approximately 40% relative to wild-type and semi-dwarfing lines, but in a sandy loam soil under well-watered conditions it was decreased by 24-33%. Such differences in ranking suggest that significant soil environment-genotype interactions are occurring. Developments in instruments and software mean that a combination of high-throughput simple screens and more in-depth examination of root-soil interactions is becoming viable.
Abstract:
Reliability analysis of probabilistic forecasts, in particular through the rank histogram or Talagrand diagram, is revisited. Two shortcomings are pointed out: firstly, a uniform rank histogram is but a necessary condition for reliability; secondly, if the forecast is assumed to be reliable, an indication is needed of how far a histogram is expected to deviate from uniformity merely due to randomness. Concerning the first shortcoming, it is suggested that forecasts be grouped or stratified along suitable criteria, and that reliability be analyzed individually for each forecast stratum. A reliable forecast should have uniform histograms for all individual forecast strata, not only for all forecasts as a whole. As to the second shortcoming, instead of the observed frequencies, the probability of the observed frequency is plotted, providing an indication of the likelihood of the result under the hypothesis that the forecast is reliable. Furthermore, a goodness-of-fit statistic is discussed which is essentially the reliability term of the Ignorance score. The discussed tools are applied to medium-range forecasts of 2-m temperature anomalies at several locations and lead times. The forecasts are stratified along the expected ranked probability score. Those forecasts which feature a high expected score turn out to be particularly unreliable.
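For readers unfamiliar with the rank histogram discussed above: for each forecast, the verifying observation is ranked within the sorted ensemble, and the ranks are histogrammed; under reliability every rank is equally likely, so the histogram should be flat up to sampling noise. A minimal sketch (ties and stratification ignored; not the authors' implementation):

```python
import numpy as np

def rank_histogram(ensembles, observations):
    """Rank histogram (Talagrand diagram) counts.

    ensembles:    array of shape (n_forecasts, n_members)
    observations: array of shape (n_forecasts,)
    Returns counts over the n_members + 1 possible ranks.
    """
    # Rank = number of ensemble members below the observation (0..n_members).
    ranks = (ensembles < observations[:, None]).sum(axis=1)
    n_members = ensembles.shape[1]
    return np.bincount(ranks, minlength=n_members + 1)

# A reliable toy forecast: the observation is drawn from the same
# distribution as the members, so the histogram is approximately flat.
rng = np.random.default_rng(1)
ens = rng.normal(size=(10000, 9))
obs = rng.normal(size=10000)
print(rank_histogram(ens, obs))  # ten counts, each near 1000
```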
Abstract:
Cross-bred cow adoption is an important and potent policy variable precipitating subsistence household entry into emerging milk markets. This paper focuses on the problem of designing policies that encourage and sustain milk-market expansion among a sample of subsistence households in the Ethiopian highlands. In this context it is desirable to measure households' 'proximity' to market in terms of the level of deficiency of essential inputs. This problem is compounded by four factors. The first is the existence of cross-bred cow numbers (count data) as an important, endogenous decision by the household; the second is the lack of a multivariate generalization of the Poisson regression model; the third is the censored nature of the milk sales data (sales from non-participating households are, essentially, censored at zero); and the fourth is an important simultaneity that exists between the decision to adopt a cross-bred cow, the decision about how much milk to produce, the decision about how much milk to consume and the decision to market that milk which is produced but not consumed internally by the household. Routine application of Gibbs sampling and data augmentation overcomes these problems in a relatively straightforward manner. We model the count data from two sites close to Addis Ababa in a latent, categorical-variable setting with known bin boundaries. The single-equation model is then extended to a multivariate system that accommodates the covariance between the cross-bred-cow adoption, milk-output and milk-sales equations. The latent-variable procedure proves tractable in extension to the multivariate setting and provides important information for policy formation in emerging-market settings.
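The data-augmentation idea invoked above can be shown in its simplest form: inside each Gibbs iteration, censored observations are replaced by draws from the latent distribution truncated to the censored region, after which the parameters are updated conditionally on the completed data. The sketch below is a deliberately simplified univariate censored-normal (Tobit-style) example, not the paper's multivariate Poisson system; the prior and all names are illustrative assumptions:

```python
import numpy as np
from scipy import stats

def gibbs_tobit(y, n_iter=2000, sigma=1.0, seed=0):
    """Gibbs sampler with data augmentation for a censored-at-zero
    normal model: y* ~ N(mu, sigma^2), y = max(y*, 0), flat prior on mu."""
    rng = np.random.default_rng(seed)
    censored = y <= 0.0
    n = len(y)
    y_star = y.copy()
    mu = y.mean()
    draws = np.empty(n_iter)
    for it in range(n_iter):
        # Augmentation step: impute latent values for censored cases
        # from N(mu, sigma^2) truncated to (-inf, 0].
        b = (0.0 - mu) / sigma
        y_star[censored] = stats.truncnorm.rvs(
            -np.inf, b, loc=mu, scale=sigma,
            size=censored.sum(), random_state=rng)
        # Conditional draw of mu given the augmented data (flat prior).
        mu = rng.normal(y_star.mean(), sigma / np.sqrt(n))
        draws[it] = mu
    return draws

# Synthetic example: true mu = 0.5, roughly 30% of "sales" censored at zero.
rng = np.random.default_rng(2)
latent = rng.normal(0.5, 1.0, 500)
y = np.maximum(latent, 0.0)
print(gibbs_tobit(y)[1000:].mean())  # posterior mean after burn-in, ~0.5
```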
Abstract:
Platelets in the circulation are triggered by vascular damage to activate, aggregate and form a thrombus that prevents excessive blood loss. Platelet activation is stringently regulated by intracellular signalling cascades, which, when activated inappropriately, lead to myocardial infarction and stroke. Strategies to address platelet dysfunction have included proteomics approaches, which have led to the discovery of a number of novel regulatory proteins of potential therapeutic value. Global analysis of platelet proteomes may enhance the outcome of these studies by arranging this information in a contextual manner that recapitulates established signalling complexes and predicts novel regulatory processes. Platelet signalling networks have already begun to be exploited through interrogation of protein datasets using in silico methodologies that locate functionally feasible protein clusters for subsequent biochemical validation. Characterization of these biological systems through analysis of the spatial and temporal organization of component proteins is developing alongside advances in the proteomics field. This focused review highlights advances in platelet proteomics data mining approaches that complement the emerging systems biology field. We also highlight nucleated cell types as key examples that can inform platelet research. Therapeutic translation of these modern approaches to understanding platelet regulatory mechanisms will enable the development of novel anti-thrombotic strategies.
Abstract:
Despite the importance of a thorough understanding of the effect of synthetic fertiliser on insect population dynamics, the existing literature is conflicting and the subject of intense debate. Here, a categorical random-effects meta-analysis and a vote-count meta-analysis are employed to examine the effects of nitrogen (N), phosphorus (P), potassium (K) and NPK fertiliser on insect population dynamics. In agreement with the general consensus, insects were found to respond positively, overall, to fertilisers. Sucking insects showed a much stronger response to fertilisers than chewing insects. The environment in which a study is conducted can have a marked effect on insect responses to fertiliser, with natural environments showing the potential to buffer the effects of nitrogen fertilisers in particular. As well as highlighting the potential shortfall in the amount of research investigating the effects of potassium and phosphorus in particular, this study provides an invaluable signpost in the ongoing research investigating fertiliser effects on ecosystems.
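For orientation, the random-effects pooling underlying such a meta-analysis can be sketched with the standard DerSimonian-Laird estimator (the paper's categorical stratification and vote counting are not reproduced; the effect sizes and variances below are invented):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate using the DerSimonian-Laird
    between-study variance (tau^2)."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                               # fixed-effect weights
    mu_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - mu_fe) ** 2)    # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)             # between-study variance
    w_re = 1.0 / (v + tau2)                   # random-effects weights
    mu_re = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, se, tau2

# Example: three hypothetical studies (effect size, sampling variance).
print(dersimonian_laird([0.4, 0.1, 0.6], [0.04, 0.02, 0.09]))
```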
Abstract:
Much of mainstream economic analysis assumes that markets adjust smoothly, through prices, to changes in economic conditions. However, this is not necessarily the case for local housing markets, whose spatial structures may exhibit persistence, so that conditions may not be those most suited to the requirements of modern-day living. Persistence can arise from the existence of transaction costs. The paper tests the proposition that housing markets in Inner London exhibit a degree of path dependence, through the construction of a three-equation model, and examines the impact of variables constructed for the 19th and early 20th centuries on modern house prices. These include 19th-century social structures, slum clearance programmes and the 1908 underground network. Each is found to be significant. The tests require the construction of novel historical datasets, which are also described in the paper.
Abstract:
This paper presents a video surveillance framework that robustly and efficiently detects abandoned objects in surveillance scenes. The framework is based on a novel threat assessment algorithm which combines the concept of ownership with automatic understanding of social relations in order to infer abandonment of objects. Implementation is achieved through the development of a logic-based inference engine built on Prolog. Threat-detection performance is evaluated by testing against a range of datasets describing realistic situations, and demonstrates a reduction in the number of false alarms generated. The proposed system represents the approach employed in the EU SUBITO project (Surveillance of Unattended Baggage and the Identification and Tracking of the Owner).
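The SUBITO rule base is not disclosed in the abstract, but the flavour of logic-based abandonment inference, combining ownership with social relations, can be illustrated with a small, purely hypothetical rule set (all predicates and thresholds are invented; the actual engine is written in Prolog rather than Python):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackedObject:
    object_id: int
    owner_id: Optional[int]   # identity of the person who deposited it
    stationary_secs: float    # how long the object has been static
    owner_distance_m: float   # current distance to the inferred owner
    owner_in_group: bool      # owner interacting with a nearby social group

def is_abandoned(obj: TrackedObject,
                 max_distance_m: float = 15.0,
                 max_stationary_secs: float = 60.0) -> bool:
    """Hypothetical abandonment rule: an unattended object raises an
    alarm only if its owner is absent or far away AND not merely
    standing with a nearby social group (ownership + social relations)."""
    unattended = obj.stationary_secs > max_stationary_secs
    owner_gone = obj.owner_id is None or obj.owner_distance_m > max_distance_m
    socially_linked = obj.owner_in_group and obj.owner_distance_m <= max_distance_m
    return unattended and owner_gone and not socially_linked

# A bag static for two minutes whose owner has left the scene.
print(is_abandoned(TrackedObject(1, None, 120.0, float("inf"), False)))  # True
```

Encoding the social-relation predicate explicitly is what suppresses false alarms for, e.g., a bag standing next to its owner's group of companions.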
Abstract:
This research presents a novel multi-functional system for medical Imaging-enabled Assistive Diagnosis (IAD). Although the IAD demonstrator has focused on abdominal images and supports the clinical diagnosis of kidneys using CT/MRI imaging, it can be adapted to work on image delineation, annotation and 3D real-size volumetric modelling of other organ structures such as the brain, spine, etc. The IAD provides advanced real-time 3D visualisation and measurements with fully automated functionalities, developed in two stages. In the first stage, via the clinically driven user interface, specialist clinicians use CT/MRI imaging datasets to accurately delineate and annotate the kidneys and their possible abnormalities, thus creating “3D Golden Standard Models”. Based on these models, in the second stage, clinical support staff, i.e. medical technicians, interactively define model-based rules and parameters for the integrated “Automatic Recognition Framework” to achieve results which are closest to those of the clinicians. These specific rules and parameters are stored in “Templates” and can later be used by any clinician to automatically identify organ structures, i.e. kidneys, and their possible abnormalities. The system also supports the transmission of these “Templates” to another expert for a second opinion. A 3D model of the body, the organs and their possible pathology with real metrics is also integrated. The automatic functionality was tested on eleven MRI datasets (comprising 286 images) and the 3D models were validated by comparing them with the metrics from the corresponding “3D Golden Standard Models”. The system provides metrics for the evaluation of the results, in terms of Accuracy, Precision, Sensitivity, Specificity and Dice Similarity Coefficient (DSC), so as to enable benchmarking of its performance. The first IAD prototype has produced promising results: its performance accuracy based on the most widely deployed evaluation metric, DSC, yields 97% for the recognition of kidneys and 96% for their abnormalities, whilst across all the above evaluation metrics its performance ranges between 96% and 100%. Further development of the IAD system is in progress to extend and evaluate its clinical diagnostic support capability through the development and integration of additional algorithms to offer fully computer-aided identification of other organs and their abnormalities based on CT/MRI/ultrasound imaging.
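The evaluation metrics listed above reduce to confusion-matrix arithmetic on binary segmentation masks. A minimal sketch of their computation (assuming boolean masks; not the IAD implementation):

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Accuracy, precision, sensitivity, specificity and Dice (DSC)
    for boolean segmentation masks of identical shape."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp = np.sum(pred & truth)      # true positives
    tn = np.sum(~pred & ~truth)    # true negatives
    fp = np.sum(pred & ~truth)     # false positives
    fn = np.sum(~pred & truth)     # false negatives
    return {
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
        "precision":   tp / (tp + fp),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "dsc":         2 * tp / (2 * tp + fp + fn),
    }

# Example with two small toy masks.
pred  = np.array([[1, 1, 0], [0, 1, 0]], bool)
truth = np.array([[1, 0, 0], [0, 1, 1]], bool)
print(segmentation_metrics(pred, truth))
```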
Abstract:
Objective: To undertake a process evaluation of pharmacists' recommendations arising in the context of a complex IT-enabled pharmacist-delivered randomised controlled trial (PINCER trial) to reduce the risk of hazardous medicines management in general practices. Methods: PINCER pharmacists manually recorded patients' demographics, details of interventions recommended, actions undertaken by practice staff and time taken to manage individual cases of hazardous medicines management. Data were coded and double entered into SPSS v15, and then summarised using percentages for categorical data (with 95% CI) and, as appropriate, means (SD) or medians (IQR) for continuous data. Key findings: Pharmacists spent a median of 20 minutes (IQR 10, 30) reviewing medical records, recommending interventions and completing actions in each case of hazardous medicines management. Pharmacists judged 72% (95% CI 70, 74) (1463/2026) of cases of hazardous medicines management to be clinically relevant. Pharmacists recommended 2105 interventions in 74% (95% CI 73, 76) (1516/2038) of cases, and 1685 actions were taken in 61% (95% CI 59, 63) (1246/2038) of cases; 66% (95% CI 64, 68) (1383/2105) of interventions recommended by pharmacists were completed, and 5% (95% CI 4, 6) (104/2105) of recommendations were accepted by general practitioners (GPs) but not completed by the end of the pharmacists' placement; the remaining recommendations were rejected or considered not relevant by GPs. Conclusions: The outcome measures were used to target pharmacist activity in general practice towards patients at risk from hazardous medicines management. Recommendations from trained PINCER pharmacists were found to be broadly acceptable to GPs and led to ameliorative action in the majority of cases. It seems likely that the approach used by the PINCER pharmacists could be employed by other practice pharmacists following appropriate training.
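The proportion confidence intervals quoted above are standard binomial intervals; the exact method used in the trial is not stated, but a normal-approximation (Wald) sketch reproduces the published figures:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# E.g. 1463 of 2026 cases judged clinically relevant (the 72% figure).
p, lo, hi = wald_ci(1463, 2026)
print(f"{100*p:.0f}% (95% CI {100*lo:.0f}, {100*hi:.0f})")  # 72% (95% CI 70, 74)
```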
Abstract:
The occurrence of mid-latitude windstorms is associated with strong socio-economic impacts. For detailed and reliable regional impact studies, large datasets of high-resolution wind fields are required. In this study, a statistical downscaling approach in combination with dynamical downscaling is introduced to derive storm-related gust speeds on a high-resolution grid over Europe. Multiple linear regression models are trained using reanalysis data and wind gusts from regional climate model simulations for a sample of the 100 top-ranking windstorm events. The method is computationally inexpensive and reproduces individual windstorm footprints adequately. Compared to observations, the results for Germany are at least as good as those of pure dynamical downscaling. This new tool can easily be applied to large ensembles of general circulation model simulations and can thus contribute to a better understanding of the regional impact of windstorms based on decadal and climate change projections.
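The statistical downscaling step described above amounts to fitting, per grid point, a multiple linear regression from large-scale predictors to RCM-simulated gust speeds, which can then be evaluated cheaply for new events. A minimal single-grid-point sketch (predictor choice, array shapes and values are hypothetical):

```python
import numpy as np

def fit_gust_regression(predictors, gusts):
    """Least-squares multiple linear regression for one grid point.

    predictors: (n_events, n_predictors) large-scale fields at this point
    gusts:      (n_events,) RCM gust speeds used as the training target
    Returns the coefficient vector, intercept first.
    """
    X = np.column_stack([np.ones(len(gusts)), predictors])
    coef, *_ = np.linalg.lstsq(X, gusts, rcond=None)
    return coef

def predict_gust(coef, predictors):
    X = np.column_stack([np.ones(len(predictors)), predictors])
    return X @ coef

# Toy training set: 100 storm events, three illustrative predictors
# (e.g. mean wind, pressure gradient, a stability index).
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
y = 20.0 + X @ np.array([5.0, 2.0, -1.0]) + rng.normal(0.0, 1.0, 100)
coef = fit_gust_regression(X, y)
print(predict_gust(coef, X[:5]))   # downscaled gusts for five events
```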
Abstract:
The link between the Pacific/North American pattern (PNA) and the North Atlantic Oscillation (NAO) is investigated in reanalysis data (NCEP, ERA40) and multi-century CGCM runs for present-day climate using three versions of the ECHAM model. PNA and NAO patterns and indices are determined via rotated principal component analysis of monthly mean 500 hPa geopotential height fields using the varimax criterion. On average, the multi-century CGCM simulations show a significant anti-correlation between PNA and NAO. Further, multi-decadal periods with significantly enhanced (high anti-correlation, active phase) or weakened (low correlation, inactive phase) coupling are found in all CGCMs. In the simulated active phases, the storm track activity near Newfoundland has a stronger link with PNA variability than during the inactive phases. On average, the reanalysis datasets show no significant anti-correlation between the PNA and NAO indices, but during the sub-period 1973–1994 a significant anti-correlation is detected, suggesting that the present climate could correspond to an inactive period as detected in the CGCMs. An analysis of possible physical mechanisms suggests that the link between the patterns is established by the baroclinic waves forming the North Atlantic storm track. The geopotential height anomalies associated with negative PNA phases induce an increased advection of warm and moist air from the Gulf of Mexico and of cold air from Canada. Both types of advection contribute to increased baroclinicity over eastern North America and to an increased low-level latent heat content of the warm air masses. Thus, growth conditions for eddies at the entrance of the North Atlantic storm track are enhanced. Considering the average temporal development during winter in the CGCMs, results show an enhanced Newfoundland storm track maximum in early winter for negative PNA, followed by a downstream enhancement of the Atlantic storm track in the subsequent months. In active (inactive) phases, this seasonal development is enhanced (suppressed). As the storm track over the central and eastern Atlantic is closely related to NAO variability, this development can be explained by the shift of the NAO index to more positive values.
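Rotated principal component analysis as used above is ordinary PCA/EOF analysis followed by a varimax rotation of the leading loadings. A minimal sketch of the standard varimax iteration (Kaiser's criterion solved via SVD; not the study's code, and the toy field is random):

```python
import numpy as np

def varimax(loadings, tol=1e-6, max_iter=100):
    """Varimax rotation of a (n_variables, n_components) loading matrix."""
    L = loadings.copy()
    n, k = L.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        B = L @ R
        # Gradient of the varimax criterion (Kaiser's formulation).
        G = L.T @ (B ** 3 - B @ np.diag(np.sum(B ** 2, axis=0)) / n)
        U, s, Vt = np.linalg.svd(G)
        R = U @ Vt                 # nearest orthogonal rotation
        var_new = np.sum(s)
        if var_new - var_old < tol:
            break
        var_old = var_new
    return L @ R

# Example: rotate the three leading PCA loadings of a toy field.
rng = np.random.default_rng(4)
X = rng.normal(size=(200, 10))     # e.g. months x grid points
X -= X.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
loadings = Vt[:3].T * s[:3]        # leading EOFs scaled by singular values
print(varimax(loadings).shape)     # (10, 3) rotated loadings
```

The rotation concentrates variance on fewer variables per component, which is what makes the resulting PNA and NAO patterns regionally interpretable.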
Abstract:
Based on the availability of hemispheric gridded datasets from observations, analyses and global climate models, objective cyclone identification methods have been developed and applied to these datasets. Due to the large number of investigation methods combined with the variety of different datasets, a multitude of results exists, not only for the recent climate period but also for the next century, assuming anthropogenically changed conditions. Different thresholds, different physical quantities and the consideration of different atmospheric vertical levels add to a picture that is difficult to combine into a common view of cyclones, their variability and trends, in the real world and in GCM studies. Thus, this paper gives a comprehensive review of the current knowledge on climatologies of mid-latitude cyclones for the Northern and Southern Hemisphere, for the present climate and for its possible changes under anthropogenic climate conditions.
Abstract:
Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications from terrestrial vegetation monitoring to climate change modeling. This has led to a substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e. data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties – the sill and the mean length scale metric – provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship, with a stronger fit for the spatially aggregated NDVI products (R² = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R² = 0.5064). This relationship serves as a reference for evaluating the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions. The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
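Variogram analysis as applied above starts from the empirical semivariance as a function of lag distance; the sill and length scale are then read off a model fitted to this curve. A minimal one-dimensional sketch of the empirical variogram (the study's geostatistical toolchain and 2-D imagery are not reproduced):

```python
import numpy as np

def empirical_variogram(values, coords, lags):
    """Empirical semivariance gamma(h) = 0.5 * mean[(z_i - z_j)^2]
    over point pairs whose separation falls in each lag bin."""
    d = np.abs(coords[:, None] - coords[None, :])        # pairwise distances
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2  # pairwise semivariances
    upper = np.triu(np.ones_like(d, bool), 1)            # each pair counted once
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d >= lo) & (d < hi) & upper
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Toy transect of NDVI-like values with short-range spatial correlation:
# the semivariance rises with lag and levels off at the sill.
rng = np.random.default_rng(5)
x = np.arange(200.0)
z = np.convolve(rng.normal(size=220), np.ones(21) / 21, "valid")
print(empirical_variogram(z, x, np.arange(0, 50, 5)))
```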
Abstract:
The goal was to quantitatively estimate and compare, using image-based metrics, the fidelity of images acquired with a digital imaging system (ADAR 5500) and of images generated through scanning of color infrared aerial photographs (SCIRAP). Images were collected nearly simultaneously in two repeat flights to generate multi-temporal datasets. The spatial fidelity of ADAR was lower than that of the SCIRAP images. Radiometric noise was higher for SCIRAP than for ADAR images, even though noise from misregistration effects was lower. These results suggest that, with careful control of film scanning, the overall fidelity of SCIRAP imagery can be comparable to that of digital multispectral camera data. Therefore, SCIRAP images can likely be used in conjunction with digital metric camera imagery in long-term land-cover change analyses.