945 results for Point density analysis


Relevance: 30.00%

Abstract:

Objective: To describe the training undertaken by pharmacists employed in a pharmacist-led information technology-based intervention study to reduce medication errors in primary care (PINCER Trial), evaluate pharmacists’ assessment of the training, and the time implications of undertaking the training. Methods: Six pharmacists received training, which included training on root cause analysis and educational outreach, to enable them to deliver the PINCER Trial intervention. This was evaluated using self-report questionnaires at the end of each training session. The time taken to complete each session was recorded. Data from the evaluation forms were entered onto a Microsoft Excel spreadsheet, independently checked and the summary of results further verified. Frequencies were calculated for responses to the three-point Likert scale questions. Free-text comments from the evaluation forms and pharmacists’ diaries were analysed thematically. Key findings: All six pharmacists received 22 hours of training over five sessions. In four out of the five sessions, the pharmacists who completed an evaluation form (27 out of 30 were completed) stated they were satisfied or very satisfied with the various elements of the training package. Analysis of free-text comments and the pharmacists’ diaries showed that the principles of root cause analysis and educational outreach were viewed as useful tools to help pharmacists conduct pharmaceutical interventions in both the study and other pharmacy roles that they undertook. The opportunity to undertake role play was a valuable part of the training received. Conclusions: Findings presented in this paper suggest that providing the PINCER pharmacists with training in root cause analysis and educational outreach contributed to the successful delivery of PINCER interventions and could potentially be utilised by other pharmacists based in general practice to deliver pharmaceutical interventions to improve patient safety.
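The frequency tabulation for the three-point Likert responses can be sketched in a few lines; the category labels and counts below are hypothetical illustrations, not the trial's actual data.

```python
from collections import Counter

def likert_frequencies(responses):
    """Return (count, percentage) for each Likert category."""
    counts = Counter(responses)
    total = len(responses)
    return {cat: (n, round(100.0 * n / total, 1)) for cat, n in counts.items()}

# Hypothetical answers from 27 completed evaluation forms
answers = ["very satisfied"] * 15 + ["satisfied"] * 10 + ["not satisfied"] * 2
freqs = likert_frequencies(answers)
```

The same tabulation extends directly to per-question or per-session breakdowns by filtering the response list first.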

Relevance: 30.00%

Abstract:

This study focuses on the analysis of winter (October-November-December-January-February-March; ONDJFM) storm events and their changes due to increased anthropogenic greenhouse gas concentrations over Europe. In order to assess uncertainties that are due to model formulation, 4 regional climate models (RCMs) with 5 high-resolution experiments, and 4 global general circulation models (GCMs) are considered. Firstly, cyclone systems as synoptic scale processes in winter are investigated, as they are a principal cause of the occurrence of extreme, damage-causing wind speeds. This is achieved by use of an objective cyclone identification and tracking algorithm applied to GCMs. Secondly, changes in extreme near-surface wind speeds are analysed. Based on percentile thresholds, the studied extreme wind speed indices allow a consistent analysis over Europe that takes systematic deviations of the models into account. Relative changes in both intensity and frequency of extreme winds and their related uncertainties are assessed and related to changing patterns of extreme cyclones. A common feature of all investigated GCMs is a reduced track density over central Europe under climate change conditions, if all systems are considered. If only extreme (i.e. the strongest 5%) cyclones are taken into account, an increasing cyclone activity for western parts of central Europe is apparent; however, the climate change signal reveals a reduced spatial coherency when compared to all systems, which yields partially contradictory results. With respect to extreme wind speeds, significant positive changes in intensity and frequency are obtained over at least 3 and 20% of the European domain under study (35–72°N and 15°W–43°E), respectively.
Location and extension of the affected areas (up to 60 and 50% of the domain for intensity and frequency, respectively), as well as levels of changes (up to +15 and +200% for intensity and frequency, respectively) are shown to be highly dependent on the driving GCM, whereas differences between RCMs when driven by the same GCM are relatively small.
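The percentile-threshold approach can be sketched as follows; the choice of the 98th percentile, the array layout and the idealised check are illustrative assumptions, not the study's exact index definitions.

```python
import numpy as np

def extreme_wind_changes(ctrl, scen, q=98):
    """Relative (%) changes in intensity and frequency of extreme winds.

    Thresholds are the q-th percentile of the control run, computed per
    grid cell, so each model's systematic bias largely cancels out.
    Arrays have shape (time, cell)."""
    thresh = np.percentile(ctrl, q, axis=0)        # per-cell threshold
    freq_c = (ctrl > thresh).mean(axis=0)          # exceedance frequency
    freq_s = (scen > thresh).mean(axis=0)
    int_c = np.nanmean(np.where(ctrl > thresh, ctrl, np.nan), axis=0)
    int_s = np.nanmean(np.where(scen > thresh, scen, np.nan), axis=0)
    d_int = 100.0 * (int_s / int_c - 1.0)
    d_freq = 100.0 * (freq_s / freq_c - 1.0)
    return d_int, d_freq

# Idealised check: a uniform ramp of daily wind speeds, scenario +10 m/s
ctrl = np.arange(1.0, 101.0).reshape(100, 1)
scen = ctrl + 10.0
d_int, d_freq = extreme_wind_changes(ctrl, scen)
```

Computing the threshold per cell from each model's own control climate is what makes the index comparable across models with different biases.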

Relevance: 30.00%

Abstract:

Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications from terrestrial vegetation monitoring to climate change modeling. This has led to a substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e. data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties – the sill and the mean length scale metric – provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship with a stronger fit for the spatially aggregated NDVI products (R2 = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R2 = 0.5064). This relationship serves as a reference for evaluating the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions.
The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution, at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
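The two variogram properties mentioned above can be illustrated on a 1-D transect; this is a simplified sketch of the idea (the study fits model variograms to 2-D imagery), and the 95% sill criterion is an assumed length-scale proxy, not the paper's metric.

```python
import numpy as np

def empirical_variogram(z, max_lag):
    """Empirical semivariogram gamma(h) = 0.5 * mean((z[i+h] - z[i])^2)
    of a 1-D transect of values, plus two summary properties: the sill
    (approximated by the sample variance) and a length-scale proxy
    (the first lag at which gamma reaches 95% of the sill)."""
    z = np.asarray(z, float)
    gammas = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                       for h in range(1, max_lag + 1)])
    sill = z.var()
    above = np.nonzero(gammas >= 0.95 * sill)[0]
    length_scale = int(above[0]) + 1 if above.size else max_lag
    return gammas, sill, length_scale

# Idealised check: an alternating transect decorrelates after one step
gammas, sill, length_scale = empirical_variogram(np.tile([0.0, 1.0], 500), 4)
```

Coarsening the pixel size smooths the data, which lowers the sill and lengthens the length scale; comparing those two quantities across products is the core of the analysis described above.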

Relevance: 30.00%

Abstract:

We develop and analyze a class of efficient Galerkin approximation methods for uncertainty quantification of nonlinear operator equations. The algorithms are based on sparse Galerkin discretizations of tensorized linearizations at nominal parameters. Specifically, we consider abstract, nonlinear, parametric operator equations $J(\alpha,u)=0$ for random input $\alpha(\omega)$ with almost sure realizations in a neighborhood of a nominal input parameter $\alpha_0$. Under some structural assumptions on the parameter dependence, we prove existence and uniqueness of a random solution, $u(\omega)=S(\alpha(\omega))$. We derive a multilinear, tensorized operator equation for the deterministic computation of $k$-th order statistical moments of the random solution's fluctuations $u(\omega)-S(\alpha_0)$. We introduce and analyse sparse tensor Galerkin discretization schemes for the efficient, deterministic computation of the $k$-th statistical moment equation. We prove a shift theorem for the $k$-point correlation equation in anisotropic smoothness scales and deduce that sparse tensor Galerkin discretizations of this equation converge in accuracy vs. complexity which equals, up to logarithmic terms, that of the Galerkin discretization of a single instance of the mean field problem. We illustrate the abstract theory for nonstationary diffusion problems in random domains.
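The tensorized moment equation can be sketched as follows; this is a hedged reconstruction of the general first-order framework, with notation assumed rather than taken from the paper itself.

```latex
% Linearise J(\alpha, u) = 0 at (\alpha_0, u_0), with u_0 = S(\alpha_0):
%   A\, u' = f', where A = D_u J(\alpha_0, u_0),
%   f' = -D_\alpha J(\alpha_0, u_0)\,(\alpha(\omega) - \alpha_0),
%   u'(\omega) \approx u(\omega) - S(\alpha_0).
\underbrace{(A \otimes \cdots \otimes A)}_{k\ \text{factors}}\, \mathcal{M}^k
  \;=\; \mathbb{E}\!\left[(f')^{\otimes k}\right],
\qquad
\mathcal{M}^k \;:=\; \mathbb{E}\!\left[(u')^{\otimes k}\right].
```

The right-hand side is the $k$-point correlation of the linearized input fluctuation; the sparse tensor Galerkin schemes then discretize this deterministic tensor equation rather than sampling the random solution.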

Relevance: 30.00%

Abstract:

High-density oligonucleotide (oligo) arrays are a powerful tool for transcript profiling. Arrays based on GeneChip® technology are amongst the most widely used, although GeneChip® arrays are currently available for only a small number of plant and animal species. Thus, we have developed a method to improve the sensitivity of high-density oligonucleotide arrays when applied to heterologous species and tested the method by analysing the transcriptome of Brassica oleracea L., a species for which no GeneChip® array is available, using a GeneChip® array designed for Arabidopsis thaliana (L.) Heynh. Genomic DNA from B. oleracea was labelled and hybridised to the ATH1-121501 GeneChip® array. Arabidopsis thaliana probe-pairs that hybridised to the B. oleracea genomic DNA on the basis of the perfect-match (PM) probe signal were then selected for subsequent B. oleracea transcriptome analysis using a .cel file parser script to generate probe mask files. The transcriptional response of B. oleracea to a mineral nutrient (phosphorus; P) stress was quantified using probe mask files generated for a wide range of gDNA hybridisation intensity thresholds. An example probe mask file generated with a gDNA hybridisation intensity threshold of 400 removed > 68 % of the available PM probes from the analysis but retained >96 % of available A. thaliana probe-sets. Ninety-nine of these genes were then identified as significantly regulated under P stress in B. oleracea, including the homologues of P stress responsive genes in A. thaliana. Increasing the gDNA hybridisation intensity thresholds up to 500 for probe-selection increased the sensitivity of the GeneChip® array to detect regulation of gene expression in B. oleracea under P stress by up to 13-fold. 
Our open-source software to create probe mask files is freely available at http://affymetrix.arabidopsis.info/xspecies/ and may be used to facilitate transcriptomic analyses of a wide range of plant and animal species in the absence of custom arrays.
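The probe-selection step can be sketched as a simple threshold filter; the dict-based layout, probe-set names and intensities below are illustrative assumptions (the actual software parses Affymetrix .cel files to the same effect).

```python
def make_probe_mask(pm_signals, threshold=400):
    """Retain probes whose perfect-match (PM) signal from the genomic
    DNA hybridisation exceeds `threshold`, and report which probe-sets
    keep at least one probe.

    pm_signals: {(probe_set_id, probe_index): PM intensity}
    """
    kept = {key for key, signal in pm_signals.items() if signal > threshold}
    retained_sets = {probe_set for probe_set, _ in kept}
    return kept, retained_sets

# Hypothetical PM intensities for two A. thaliana probe-sets
signals = {("AT1G01010", 1): 950, ("AT1G01010", 2): 120, ("AT1G99999", 1): 50}
kept, retained = make_probe_mask(signals, threshold=400)
```

Raising the threshold discards more probes per probe-set while still retaining most probe-sets, which is the trade-off quantified in the abstract.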

Relevance: 30.00%

Abstract:

Question: What are the correlations between the degree of drought stress and temperature, and the adoption of specific adaptive strategies by plants in the Mediterranean region? Location: 602 sites across the Mediterranean region. Method: We considered 12 plant morphological and phenological traits, and measured their abundance at the sites as trait scores obtained from pollen percentages. We conducted stepwise regression analyses of trait scores as a function of plant-available moisture (α) and winter temperature (MTCO). Results: Patterns in the abundance of the plant traits we considered are clearly determined by α, MTCO or a combination of both. In addition, trends in leaf size, texture, thickness, pubescence and aromatic leaves, and other plant-level traits such as thorniness and aphylly, vary according to the life form (tree, shrub, forb), the leaf type (broad, needle) and phenology (evergreen, summer-green). Conclusions: Although this study is based on pollen data, we have identified ecologically plausible trends in the abundance of traits along climatic gradients. Plant traits other than the usual life form, leaf type and leaf phenology carry strong climatic signals. Generally, combinations of plant traits are more climatically diagnostic than individual traits. The qualitative and quantitative relationships between plant traits and climate parameters established here will help to provide an improved basis for modelling the impact of climate change on vegetation and form a starting point for a global analysis of pollen-climate relationships.
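A stepwise regression of trait scores on the two climate predictors can be sketched with a greedy forward-selection loop; the stopping rule, tolerance and the synthetic data are illustrative assumptions, not the study's procedure.

```python
import numpy as np

def forward_stepwise(X, y, names, tol=0.01):
    """Greedy forward selection: repeatedly add the predictor that most
    reduces the residual sum of squares, stopping when the relative
    improvement falls below `tol`. Returns predictor names in the order
    they were selected."""
    selected, remaining = [], list(range(X.shape[1]))
    rss = float(np.sum((y - y.mean()) ** 2))
    while remaining:
        best = None
        for j in remaining:
            A = np.column_stack([np.ones(len(y)), X[:, selected + [j]]])
            resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
            r = float(resid @ resid)
            if best is None or r < best[1]:
                best = (j, r)
        j, r = best
        if (rss - r) / rss < tol:
            break
        selected.append(j)
        remaining.remove(j)
        rss = r
    return [names[j] for j in selected]

# Synthetic check: a trait score driven by alpha only (invented data)
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 2))                  # columns: alpha, MTCO
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=300)
order = forward_stepwise(X, y, ["alpha", "MTCO"])
```

The order of selection indicates which climate variable carries the dominant signal for a given trait, which is how predictors such as α and MTCO are compared in the abstract.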

Relevance: 30.00%

Abstract:

Polymers with the ability to heal themselves could provide access to materials with extended lifetimes in a wide range of applications such as surface coatings, automotive components and aerospace composites. Here we describe the synthesis and characterisation of two novel, stimuli-responsive, supramolecular polymer blends based on π-electron-rich pyrenyl residues and π-electron-deficient, chain-folding aromatic diimides that interact through complementary π–π stacking interactions. Different degrees of supramolecular “cross-linking” were achieved by use of divalent or trivalent poly(ethylene glycol)-based polymers featuring pyrenyl end-groups, blended with a known diimide–ether copolymer. The mechanical properties of the resulting polymer blends revealed that higher degrees of supramolecular “cross-link density” yield materials with enhanced mechanical properties, such as increased tensile modulus, modulus of toughness, elasticity and yield point. After a number of break/heal cycles, these materials were found to retain the characteristics of the pristine polymer blend, and this new approach thus offers a simple route to mechanically robust yet healable materials.

Relevance: 30.00%

Abstract:

The IEEE 754 standard for floating-point arithmetic is widely used in computing. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. The IEEE infinities are said to have the behaviour of limits. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. We elucidate the transreal tangent and extend real limits to transreal limits. Arguing from this firm foundation, we maintain that there are three category errors in the IEEE 754 standard. Firstly, the claim that IEEE infinities are limits of real arithmetic confuses limiting processes with arithmetic. Secondly, a defence of IEEE negative zero confuses the limit of a function with the value of a function. Thirdly, the definition of IEEE NaNs confuses undefined with unordered. Furthermore, we prove that the tangent function, with the infinities given by geometrical construction, has a period of an entire rotation, not half a rotation as is commonly understood. This illustrates a category error, confusing the limit with the value of a function, in an important area of applied mathematics: trigonometry. We briefly consider the wider implications of this category error. Another paper proposes transreal arithmetic as a basis for floating-point arithmetic; here we take the profound step of proposing transreal arithmetic as a replacement for real arithmetic to remove the possibility of certain category errors in mathematics. Thus we propose both theoretical and practical advantages of transmathematics. In particular, we argue that implementing transreal analysis in trans-floating-point arithmetic would extend the coverage, accuracy and reliability of almost all computer programs that exploit real analysis: essentially all programs in science and engineering and many in finance, medicine and other socially beneficial applications.
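The key contrast with IEEE arithmetic can be made concrete with a minimal sketch of totalised transreal division; the string tags standing in for the transreal constants are an implementation convenience, not part of the theory.

```python
# Total transreal division: 1/0, -1/0 and 0/0 all receive well-defined
# values (+infinity, -infinity, nullity) instead of an exception or an
# IEEE NaN. A full implementation would totalise every operation.
INF, NINF, NULLITY = "infinity", "-infinity", "nullity"

def tr_div(a, b):
    """Transreal division of two reals: total, never raises."""
    if b != 0:
        return a / b
    if a > 0:
        return INF        # a/0 = +infinity for positive a
    if a < 0:
        return NINF       # a/0 = -infinity for negative a
    return NULLITY        # 0/0 = nullity: a single, unordered number
```

In IEEE 754, by contrast, 0.0/0.0 yields a NaN (of which there are many bit patterns) and integer division by zero raises: nullity replaces all of these with one unordered number.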

Relevance: 30.00%

Abstract:

It is widely thought that changes in both the surface buoyancy fluxes and wind stress drive variability in the Atlantic meridional overturning circulation (AMOC), but that they drive variability on different time scales. For example, wind forcing dominates short-term variability through its effects on Ekman currents and coastal upwelling, whereas buoyancy forcing is important for longer time scales (multiannual and decadal). However, the role of the wind forcing on multiannual to decadal time scales is less clear. Here the authors present an analysis of simulations with the Nucleus for European Modelling of the Ocean (NEMO) ocean model with the aim of explaining the important drivers of the zonal density gradient at 26°N, which is directly related to the AMOC. In the experiments, only one of either the wind stress or the buoyancy forcing is allowed to vary in time, whereas the other remains at its seasonally varying climatology. On subannual time scales, variations in the density gradient, and in the AMOC minus Ekman, are driven largely by local wind-forced coastal upwelling at both the western and eastern boundaries. On decadal time scales, buoyancy forcing related to the North Atlantic Oscillation dominates variability in the AMOC. Interestingly, however, it is found that wind forcing also plays a role at longer time scales, primarily impacting the interannual variability through the excitation of Rossby waves in the central Atlantic, which propagate westward to interact with the western boundary, but also by modulating the decadal time-scale response to buoyancy forcing.
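The statement that the zonal density gradient at 26°N is directly related to the AMOC rests on the standard thermal-wind relation, which can be sketched as follows (a textbook identity, not a result specific to this study):

```latex
% Thermal wind balance (Boussinesq, Coriolis parameter f, reference
% density \rho_0): the vertical shear of the meridional geostrophic
% velocity is set by the east-west density gradient.
\frac{\partial v}{\partial z} \;=\; -\,\frac{g}{\rho_0 f}\,\frac{\partial \rho}{\partial x}
```

Integrating the east-west density difference between the basin boundaries in the vertical therefore yields the geostrophic part of the overturning transport, which is why the analysis focuses on what drives the boundary density anomalies.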

Relevance: 30.00%

Abstract:

A statistical–dynamical downscaling (SDD) approach for the regionalization of wind energy output (Eout) over Europe with special focus on Germany is proposed. SDD uses an extended circulation weather type (CWT) analysis on global daily mean sea level pressure fields with the central point being located over Germany. Seventy-seven weather classes based on the associated CWT and the intensity of the geostrophic flow are identified. Representatives of these classes are dynamically downscaled with the regional climate model COSMO-CLM. By using weather class frequencies of different data sets, the simulated representatives are recombined to probability density functions (PDFs) of near-surface wind speed and finally to Eout of a sample wind turbine for present and future climate. This is performed for reanalysis, decadal hindcasts and long-term future projections. For evaluation purposes, results of SDD are compared to wind observations and to simulated Eout of purely dynamical downscaling (DD) methods. For the present climate, SDD is able to simulate realistic PDFs of 10-m wind speed for most stations in Germany. The resulting spatial Eout patterns are similar to DD-simulated Eout. In terms of decadal hindcasts, results of SDD are similar to DD-simulated Eout over Germany, Poland, the Czech Republic, and Benelux, for which high correlations between annual Eout time series of SDD and DD are detected for selected hindcasts. Lower correlations are found for other European countries. It is demonstrated that SDD can be used to downscale the full ensemble of the Earth System Model of the Max Planck Institute (MPI-ESM) decadal prediction system. Long-term climate change projections in the Special Report on Emissions Scenarios of ECHAM5/MPI-OM as obtained by SDD agree well with the results of other studies using DD methods, with increasing Eout over northern Europe and a negative trend over southern Europe.
Despite some biases, it is concluded that SDD is an adequate tool to assess regional wind energy changes in large model ensembles.
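The recombination step, weighting per-weather-class wind-speed PDFs by class frequencies and integrating against a turbine power curve, can be sketched as follows; the shapes, bin values and cubic power curve are illustrative assumptions (the study uses 77 CWT-based classes downscaled with COSMO-CLM).

```python
import numpy as np

def recombine_eout(class_freqs, class_pdfs, wind_bins, power_curve):
    """Combine per-class wind-speed PDFs into one mixture PDF weighted
    by the weather-class frequencies, then integrate it against a
    turbine power curve to get the expected energy output.

    class_freqs: (n_classes,) relative frequencies, summing to 1
    class_pdfs:  (n_classes, n_bins) PDFs over the wind-speed bins
    wind_bins:   (n_bins,) bin-centre wind speeds
    """
    combined = np.asarray(class_freqs) @ np.asarray(class_pdfs)  # mixture PDF
    power = np.array([power_curve(v) for v in wind_bins])
    return float(combined @ power)

# Toy example: a calm class and a windy class, with a cubic power curve
eout = recombine_eout([0.5, 0.5],
                      [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]],
                      [0.0, 5.0, 10.0],
                      lambda v: v ** 3)
```

Because only the class frequencies change between data sets, present-day and future Eout can be estimated by swapping in a different frequency vector without re-running the regional model.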

Relevance: 30.00%

Abstract:

The large pine weevil, Hylobius abietis, is a serious pest of reforestation in northern Europe. However, weevils developing in stumps of felled trees can be killed by entomopathogenic nematodes applied to soil around the stumps, and this method of control has been used at an operational level in the UK and Ireland. We investigated the factors affecting the efficacy of entomopathogenic nematodes in controlling the large pine weevil across 10 years of field experiments, by means of a meta-analysis of published studies and previously unpublished data. We investigated two species with different foraging strategies: the ‘ambusher’ Steinernema carpocapsae, the species most often used at an operational level, and the ‘cruiser’ Heterorhabditis downesi. Efficacy was measured both by percentage reduction in numbers of adults emerging relative to untreated controls and by percentage parasitism of developing weevils in the stump. Both measures were significantly higher with H. downesi than with S. carpocapsae. General linear models were constructed for each nematode species separately, using substrate type (peat versus mineral soil) and tree species (pine versus spruce) as fixed factors, weevil abundance (from the mean of untreated stumps) as a covariate and percentage reduction or percentage parasitism as the response variable. For both nematode species, the most significant and parsimonious models showed that substrate type was usually, though not always, the most significant variable, whether replicates were at the site or stump level, and that peaty soils significantly promote the efficacy of both species. Efficacy, in terms of percentage parasitism, was not density dependent.
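The model structure, two dummy-coded two-level factors plus a covariate, can be sketched with ordinary least squares; this is a minimal normal-errors stand-in for the general linear models of the study, and the data below are invented for illustration.

```python
import numpy as np

def fit_glm(substrate, tree, abundance, response):
    """OLS fit of response ~ substrate + tree + abundance, with the
    two-level factors dummy-coded (peat vs mineral, pine vs spruce)
    and weevil abundance as a covariate."""
    x_sub = np.array([1.0 if s == "peat" else 0.0 for s in substrate])
    x_tree = np.array([1.0 if t == "pine" else 0.0 for t in tree])
    X = np.column_stack([np.ones(len(response)), x_sub, x_tree,
                         np.asarray(abundance, float)])
    coef, *_ = np.linalg.lstsq(X, np.asarray(response, float), rcond=None)
    return dict(zip(["intercept", "peat", "pine", "abundance"], coef))

# Invented data in which peat adds 20 points of parasitism
substrate = ["peat", "peat", "mineral", "mineral", "peat", "mineral"]
tree = ["pine", "spruce", "pine", "spruce", "spruce", "pine"]
abundance = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
response = [10 + 20 * (s == "peat") + 0.5 * a
            for s, a in zip(substrate, abundance)]
coefs = fit_glm(substrate, tree, abundance, response)
```

A near-zero abundance coefficient with a large substrate coefficient would correspond to the abstract's finding that efficacy depends on soil type but is not density dependent.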

Relevance: 30.00%

Abstract:

We give an a posteriori analysis of a semidiscrete discontinuous Galerkin scheme approximating solutions to a model of multiphase elastodynamics, which involves an energy density depending not only on the strain but also the strain gradient. A key component in the analysis is the reduced relative entropy stability framework developed in Giesselmann (2014, SIAM J. Math. Anal., 46, 3518–3539). This framework allows energy-type arguments to be applied to continuous functions. Since we advocate the use of discontinuous Galerkin methods we make use of two families of reconstructions, one set of discrete reconstructions and a set of elliptic reconstructions to apply the reduced relative entropy framework in this setting.
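The relative entropy (modulated energy) idea behind the stability framework can be sketched as follows; this is the standard form for an elastic energy density $W$, stated here as background, not the paper's exact reduced functional (which also accounts for the strain-gradient term):

```latex
% Relative entropy of a state u with respect to a comparison state v:
% kinetic gap plus the first-order Taylor remainder of the energy.
\eta(u \,|\, v) \;=\; \tfrac12\,\lvert u_t - v_t \rvert^2
  \;+\; W(\nabla u) - W(\nabla v)
  \;-\; DW(\nabla v) : (\nabla u - \nabla v)
```

Integrated over the domain, this quantity behaves like a squared distance between $u$ and $v$ wherever $W$ has the required convexity, which is what allows energy-type arguments to bound the error between the numerical reconstruction and the exact solution.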

Relevance: 30.00%

Abstract:

We give an a priori analysis of a semi-discrete discontinuous Galerkin scheme approximating solutions to a model of multiphase elastodynamics which involves an energy density depending not only on the strain but also the strain gradient. A key component in the analysis is the reduced relative entropy stability framework developed in Giesselmann (SIAM J Math Anal 46(5):3518–3539, 2014). The estimate we derive is optimal in the L∞(0,T;dG) norm for the strain and the L2(0,T;dG) norm for the velocity, where dG is an appropriate mesh dependent H1-like space.

Relevance: 30.00%

Abstract:

The components of many signaling pathways have been identified and there is now a need to conduct quantitative data-rich temporal experiments for systems biology and modeling approaches to better understand pathway dynamics and regulation. Here we present a modified Western blotting method that allows the rapid and reproducible quantification and analysis of hundreds of data points per day on proteins and their phosphorylation state at individual sites. The approach is of particular use where samples show a high degree of sample-to-sample variability such as primary cells from multiple donors. We present a case study on the analysis of >800 phosphorylation data points from three phosphorylation sites in three signaling proteins over multiple time points from platelets isolated from ten donors, demonstrating the technique's potential to determine kinetic and regulatory information from limited cell numbers and to investigate signaling variation within a population. We envisage the approach being of use in the analysis of many cellular processes such as signaling pathway dynamics to identify regulatory feedback loops and the investigation of potential drug/inhibitor responses, using primary cells and tissues, to generate information about how a cell's physiological state changes over time.
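One common way to handle the donor-to-donor variability described above is to normalise each donor's time course to its own maximum before comparing kinetics; the layout below (rows = donors, columns = time points) is an illustrative sketch, not the paper's exact quantification pipeline.

```python
import numpy as np

def normalise_per_donor(signals):
    """Normalise each donor's phospho-signal time course to its own
    maximum, so kinetics can be compared across donors despite large
    differences in absolute intensity between primary-cell samples."""
    signals = np.asarray(signals, float)
    return signals / signals.max(axis=1, keepdims=True)

# Two hypothetical donors with very different absolute intensities
norm = normalise_per_donor([[1.0, 2.0, 4.0], [10.0, 5.0, 20.0]])
```

After normalisation, the shapes of the time courses, and hence rate-like parameters, can be compared or averaged across the donor population.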

Relevance: 30.00%

Abstract:

Within-field variation in sugar beet yield and quality was investigated in three commercial sugar beet fields in the east of England to identify the main associated variables and to examine the possibility of predicting yield early in the season, with a view to spatially variable management of sugar beet crops. Irregular grid sampling with some purposively located nested samples was applied, and it revealed the spatial variability in each sugar beet field efficiently. In geostatistical analyses, most variograms were isotropic with moderate to strong spatial dependency, indicating significant spatial variation in sugar beet yield and associated growth and environmental variables in all directions within each field. The kriged maps showed spatial patterns of yield variability within each field and visual association with the maps of other variables. This was confirmed by redundancy analyses and Pearson correlation coefficients. The main variables associated with yield variability were soil type, organic matter, soil moisture, weed density and canopy temperature. Kriged maps of final yield variability were strongly related to those of crop canopy cover, LAI and intercepted solar radiation early in the growing season, and to the yield maps of previous crops. Therefore, yield maps of previous crops together with early assessment of sugar beet growth may make an early prediction of within-field variability in sugar beet yield possible. The Broom’s Barn sugar beet model failed to account for the spatial variability in sugar yield, but the simulation was greatly improved when corrected for early canopy development and when the simulated yield was adjusted for weeds and plant population. Further research to optimise inputs to maximise sugar yield should target the irrigation and fertilising of areas within fields with low canopy cover early in the season.