982 results for histogram calibration


Relevance: 10.00%

Abstract:

The capabilities of a high-resolution (HR), accurate-mass spectrometer (Exactive-MS) operating in full-scan MS mode were investigated for the quantitative LC/MS analysis of drugs in patients' plasma samples. A mass resolution of 50,000 (FWHM) at m/z 200 and a mass extraction window of 5 ppm around the theoretical m/z of each analyte were used to construct chromatograms for quantitation. The quantitative performance of the Exactive-MS was compared with that of a triple quadrupole mass spectrometer (TQ-MS), a TSQ Quantum Discovery or Quantum Ultra, operating in the conventional selected reaction monitoring (SRM) mode. The study covered 17 therapeutic drugs: 8 antifungal agents (anidulafungin, caspofungin, fluconazole, itraconazole, hydroxyitraconazole, posaconazole, voriconazole and voriconazole-N-oxide), 4 immunosuppressants (ciclosporine, everolimus, sirolimus and tacrolimus) and 5 protein kinase inhibitors (dasatinib, imatinib, nilotinib, sorafenib and sunitinib). The quantitative results obtained with HR-MS acquisition showed detection specificity, assay precision, accuracy, linearity and sensitivity comparable to SRM acquisition. Importantly, HR-MS offers several benefits over TQ-MS technology: no SRM optimization, time savings when transferring an analysis from one MS to another, more complete information on what is in the samples and easier troubleshooting. Our work demonstrates that U/HPLC coupled to Exactive HR-MS delivers results comparable to TQ-MS in routine quantitative drug analyses. Given the advantages of HR-MS, these results suggest that, in the near future, there should be a shift in how routine quantitative analyses of small molecules, particularly therapeutic drugs, are performed.
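As an illustration of the extraction step described above, here is a minimal sketch that builds an extracted-ion chromatogram from full-scan data using a ±5 ppm window around a theoretical m/z; the `scans` structure and the `xic` function are hypothetical, not part of any vendor API.

```python
# Minimal sketch of the extraction step: build an extracted-ion chromatogram
# from full-scan HR-MS data using a +/- 5 ppm window around the theoretical
# m/z. The `scans` structure is hypothetical, not a vendor API:
# scans = [(retention_time, [(mz, intensity), ...]), ...].

def xic(scans, target_mz, tol_ppm=5.0):
    half_window = target_mz * tol_ppm * 1e-6   # e.g. 0.0025 Da at m/z 500
    lo, hi = target_mz - half_window, target_mz + half_window
    chromatogram = []
    for rt, peaks in scans:
        # Sum all centroid intensities falling inside the mass window.
        chromatogram.append((rt, sum(i for mz, i in peaks if lo <= mz <= hi)))
    return chromatogram
```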

Relevance: 10.00%

Abstract:

When decommissioning a nuclear facility, it is important to be able to estimate the activity levels of potentially radioactive samples and compare them with the clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation to experimental data obtained with a simple point source permits the computation of absolute calibration factors for more complex geometries, with an accuracy of slightly more than 20%. The uncertainty of the calibration factor can be improved to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is assumed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided sample density is taken into account as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that activity can be largely underestimated in the event of a centrally located hotspot and overestimated for a peripherally located hotspot if the sample is assumed to be homogeneously contaminated. This demonstrates the usefulness of complementing experimental methods with Monte Carlo simulations when estimating calibration factors that cannot be measured directly because of a lack of available material or specific geometries.
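A toy sketch of the "relative" use of the simulation described above: a calibration factor measured in one accessible configuration is transferred to one that cannot be measured, via the ratio of Monte Carlo efficiencies, so that uncertainties common to both simulations largely cancel. All names and numbers below are invented, not the paper's values.

```python
# Toy illustration (not the paper's code) of transferring a measured
# calibration factor to a geometry/nuclide that cannot be measured directly,
# using the ratio of Monte Carlo efficiencies. All numbers are made up.

def transfer_calibration(cf_measured, mc_eff_measured, mc_eff_target):
    # Uncertainties common to both MC runs (geometry, transport physics)
    # largely cancel in the efficiency ratio.
    return cf_measured * (mc_eff_target / mc_eff_measured)

cf_cs137 = 4.2e-3     # counts/s per Bq, measured with a Cs-137 phantom (hypothetical)
eff_cs137 = 1.10e-3   # MC efficiency, measured configuration (hypothetical)
eff_co60 = 1.45e-3    # MC efficiency, target nuclide/geometry (hypothetical)
print(transfer_calibration(cf_cs137, eff_cs137, eff_co60))
```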

Relevance: 10.00%

Abstract:

OBJECTIVE: To compare the predictive accuracy of the original and recalibrated Framingham risk functions against current coronary heart disease (CHD) morbidity and mortality data from the Swiss population. METHODS: Data from the CoLaus study, a cross-sectional, population-based study conducted between 2003 and 2006 on 5,773 participants aged 35-74 without CHD, were used to recalibrate the Framingham risk function. The numbers of events predicted by each risk function were compared with those derived from local MONICA incidence rates and from official Swiss mortality data. RESULTS: With the original risk function, 57.3%, 21.2%, 16.4% and 5.1% of men and 94.9%, 3.8%, 1.2% and 0.1% of women were at very low (<6%), low (6-10%), intermediate (10-20%) and high (>20%) risk, respectively. With the recalibrated risk function, the corresponding values were 84.7%, 10.3%, 4.3% and 0.6% in men and 99.5%, 0.4%, 0.0% and 0.1% in women. The number of CHD events over 10 years predicted by the original Framingham risk function was 2- to 3-fold higher than predicted by mortality plus case fatality or by MONICA incidence rates (men: 191 vs. 92 and 51 events, respectively). The recalibrated risk function provided more reasonable, albeit slightly overestimated, figures (92 events, 5th-95th percentile: 26-223 events); sensitivity analyses showed that the estimation factor ranged between 0.4 and 2.2 in men and between 0.7 and 3.3 in women. CONCLUSION: The recalibrated Framingham risk function provides a reasonable alternative for assessing CHD risk in men, but not in women.
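For readers unfamiliar with the recalibration step, the sketch below shows the standard Framingham-type recalibration in which the original regression coefficients are kept while the local cohort's mean risk-factor levels and baseline survival are substituted. All coefficients and values are placeholders, not CoLaus estimates.

```python
import math

# Sketch of the standard recalibration step for a Framingham-type function:
# keep the original beta coefficients, but substitute the local cohort's mean
# risk-factor levels and local baseline survival. Values are placeholders.

def chd_risk_10y(betas, x, x_mean_local, s0_local):
    """10-year CHD risk, recalibrated to a local population."""
    lp = sum(b * xi for b, xi in zip(betas, x))
    lp_mean = sum(b * xm for b, xm in zip(betas, x_mean_local))
    return 1.0 - s0_local ** math.exp(lp - lp_mean)

betas = [0.048, 1.9, 0.65]      # hypothetical Cox coefficients
x = [55, 0, 1]                  # one subject: age, smoker flag, etc.
x_mean = [52, 0.2, 0.4]         # local population means (hypothetical)
print(chd_risk_10y(betas, x, x_mean, s0_local=0.96))
```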

Relevance: 10.00%

Abstract:

Abstract: The expansion of a recovering population, whether re-introduced or spontaneously returning, is shaped by (i) biological (intrinsic) factors such as the land tenure system or dispersal behaviour, (ii) the distribution and availability of resources (e.g. prey), (iii) habitat and landscape features, and (iv) human attitudes and activities. In order to develop efficient conservation and recovery strategies, we need to understand all these factors, to predict the potential distribution and to explore ways to reach it. An increased number of lynx in the north-western Swiss Alps in the nineties led to a new controversy about the return of this cat. When the large carnivores were given legal protection in many European countries, most organizations and individuals promoting their protection did not foresee the consequences. Management plans describing how to handle conflicts with large predators are needed to find a balance between "overabundance" and extinction. Wildlife and conservation biologists need to evaluate the various threats confronting populations so that adequate management decisions can be taken. I developed a GIS probability model for the lynx, based on habitat information and radio-telemetry data from the Swiss Jura Mountains, in order to predict the potential distribution of the lynx in this mountain range, which is presently only partly occupied. Three of the 18 variables tested, describing land use, vegetation and topography for each square kilometre, qualified to predict the probability of lynx presence. The resulting map was evaluated with data from dispersing subadult lynx: young lynx that were unable to establish home ranges in what was identified as good lynx habitat did not survive their first year of independence, whereas the only one that died in good lynx habitat was illegally killed. Radio-telemetry fixes are often used as input data to calibrate habitat models; radio-telemetry is the only way to gather accurate and unbiased data on the habitat use of elusive larger terrestrial mammals. However, it is time-consuming and expensive, and can therefore only be applied in limited areas. Habitat models extrapolated over large areas can in turn be problematic, as habitat characteristics and availability may change from one area to the other. I analysed the predictive power of Ecological Niche Factor Analysis (ENFA) in Switzerland with the lynx as focal species. According to my results, the optimal sampling strategy to predict species distribution in an Alpine area lacking data would be to pool presence cells from contrasting regions (Jura Mountains, Alps), whereas in regions with a low ecological variance (Jura Mountains), only local presence cells should be used to calibrate the model. Dispersal influences the dynamics and persistence of populations and the distribution and abundance of species, and gives communities and ecosystems their characteristic texture in space and time. Between 1988 and 2001, the spatio-temporal behaviour of subadult Eurasian lynx in two re-introduced populations in Switzerland was studied, based on 39 juvenile lynx, of which 24 were radio-tagged, to understand the factors influencing dispersal. Subadults become independent from their mothers at the age of 8-11 months. No sex bias was detected in either the dispersal rate or the distance moved. Compared to bears and wolves, lynx are conservative dispersers and settled within or close to known lynx occurrences.
Dispersal distances reached in the high-density population, shorter than those reported in other Eurasian lynx studies, are limited by habitat restrictions hindering connections with neighbouring metapopulations. I postulated that high lynx density would lead to an expansion of the population, and validated my predictions with data from the north-western Swiss Alps, where a strong increase in lynx abundance took place around 1995. The general hypothesis that high population density fosters the expansion of the population was not confirmed. This has consequences for the re-introduction and recovery of carnivores in a fragmented landscape: establishing a strong source population in one place might not be an optimal strategy. Rather, population nuclei should be founded in several neighbouring patches; exchange between established neighbouring subpopulations will take place later on, as adult lynx show a higher propensity to cross barriers than subadults. To estimate the potential population size of the lynx in the Jura Mountains and to assess possible corridors between this population and adjacent areas, I adapted a habitat probability model for lynx distribution in the Jura Mountains with new environmental data and extrapolated it over the entire mountain range. The model predicts a breeding population of 74-101 individuals, or 51-79 individuals when continuous habitat patches < 50 km2 are disregarded. The Jura Mountains could one day be part of a metapopulation, as potential corridors exist to the adjoining areas (Alps, Vosges Mountains and Black Forest). Monitoring of population size, spatial expansion and genetic structure in the Jura Mountains must be continued, as the status of the population is still critical. ENFA was also used to predict the potential distribution of lynx in the Alps. The resulting model divided the Alps into 37 suitable habitat patches ranging from 50 to 18,711 km2 and covering a total area of about 93,600 km2. Applying the range of lynx densities found in field studies in Switzerland, the Alps could host a population of 961 to 1,827 residents. A cost-distance analysis revealed that all patches were within reach of dispersing lynx, as the connection costs were in the range of the dispersal costs of radio-tagged subadult lynx moving through unfavourable habitat. Thus, the whole of the Alps could one day be considered a metapopulation. But experience suggests that only a few dispersers will cross unsuitable areas and barriers; this low migration rate will seldom allow the spontaneous foundation of new populations in unsettled areas. As an alternative to natural dispersal, the artificial transfer of individuals across barriers should be considered. Wildlife biologists can play a crucial role in developing adaptive management experiments that help managers learn by trial. The case of the lynx in Switzerland is a good example of fruitful cooperation between wildlife biologists, managers, decision makers and politicians in an adaptive management process. This cooperation resulted in a Lynx Management Plan, implemented in 2000 and updated in 2004, which gives the cantons directives on how to handle lynx-related problems. The plan has been put into practice, for example with the translocation of lynx into unsettled areas.
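To make the modelling approach concrete, here is a toy version of a cell-based habitat probability model of the kind described: a logistic score per 1 km² cell computed from a few predictors. The predictor names and coefficients are invented for illustration, not the thesis model.

```python
import math

# Illustrative only: cell-based habitat models of this kind typically score
# each 1 km^2 cell with a logistic function of a few retained predictors.
# The predictor names and coefficients here are invented, not the thesis model.

def presence_probability(forest_frac, slope_deg, human_density,
                         b0=-2.0, b_forest=3.5, b_slope=0.05, b_human=-0.01):
    z = b0 + b_forest * forest_frac + b_slope * slope_deg + b_human * human_density
    return 1.0 / (1.0 + math.exp(-z))

# A forested, moderately steep, sparsely populated cell scores fairly high:
print(presence_probability(forest_frac=0.7, slope_deg=15, human_density=20))
```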

Relevance: 10.00%

Abstract:

Back-focal-plane interferometry is used to measure displacements of optically trapped samples with very high spatial and temporal resolution. The technique is, however, closely related to a method that measures the rate of change in light momentum. It has long been known that displacements of the interference pattern at the back focal plane may be used to track the optical force directly, provided that a considerable fraction of the light is effectively monitored. Nonetheless, the practical application of this idea has been limited to counter-propagating, low-aperture beams, where accurate momentum measurements are possible. Here, we experimentally show that the connection can be extended to single-beam optical traps. In particular, we show that, in a gradient trap, the calibration product κ·β (where κ is the trap stiffness and 1/β is the position sensitivity) corresponds to the factor that converts detector signals into momentum changes; this factor is uniquely determined by three construction features of the detection instrument and therefore does not depend on the specific conditions of the experiment. We then find that force measurements obtained from back-focal-plane displacements are in practice not restricted to a linear relationship with position, and hence can be extended outside that regime. Finally, and more importantly, we show that these properties are still recognizable even when the system is not fully optimized for light collection. These results should enable a more general use of back-focal-plane interferometry whenever the ultimate goal is the measurement of the forces exerted by an optical trap.
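A small numerical sketch of the conversion discussed above, assuming κ in pN/nm and a position calibration β in nm/V, so that the product κ·β (in pN/V) maps the detector voltage directly onto force; all values are illustrative.

```python
# Numerical sketch of the signal-to-force conversion. With trap stiffness
# kappa (pN/nm) and position calibration beta (nm/V), the product kappa*beta
# (pN/V) maps the detector voltage directly onto optical force, i.e. the rate
# of light-momentum change. All values are illustrative.

kappa = 0.05            # trap stiffness, pN/nm (hypothetical)
beta = 400.0            # position calibration, nm/V (hypothetical)
signal_v = 0.012        # back-focal-plane detector reading, V

position_nm = beta * signal_v            # conventional position tracking
force_pn = (kappa * beta) * signal_v     # direct force readout, F = (kappa*beta)*S
print(position_nm, force_pn)             # 4.8 nm, 0.24 pN
```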

Relevance: 10.00%

Abstract:

Many questions in evolutionary biology require an estimate of divergence times but, for groups with a sparse fossil record, such estimates rely heavily on molecular dating methods. The accuracy of these methods depends both on an adequate underlying model and on the appropriate implementation of fossil evidence as calibration points. We explore the effect of both factors in Poaceae (grasses), a diverse plant lineage with a very limited fossil record, focusing particularly on dating the early divergences in the group. We show that molecular dating based on a data set of plastid markers is strongly dependent on the model assumptions. In particular, an acceleration of evolutionary rates at the base of Poaceae followed by a deceleration in the descendants strongly biases methods that assume an autocorrelation of rates. This problem can be circumvented by using markers with lower rate variation, and we show that phylogenetic markers extracted from complete nuclear genomes can be a useful complement to the more commonly used plastid markers. However, estimates of divergence times remain strongly affected by different implementations of fossil calibration points. Analyses calibrated with macrofossils only lead to estimates for the age of core Poaceae of ∼51-55 Ma, but the inclusion of microfossil evidence pushes this age to 74-82 Ma and leads to lower estimated evolutionary rates in grasses. These results emphasize the importance of considering markers from multiple genomes and alternative fossil placements when addressing evolutionary issues that depend on the ages estimated for important groups.
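The direction of the rate effect follows from simple arithmetic: the same molecular distance spread over an older calibration age implies a slower rate, as the toy computation below shows (the distance value is invented).

```python
# Toy computation showing why older calibrations imply slower rates: the same
# molecular distance spread over more time. The distance value is invented.
distance = 0.08                       # substitutions/site along a path (hypothetical)
for age_ma in (53.0, 78.0):           # macrofossil-only vs. microfossil-informed age
    print(age_ma, distance / age_ma)  # inferred rate (subst/site/Myr) drops with age
```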

Relevance: 10.00%

Abstract:

Turtle Mountain in Alberta, Canada, has become an important field laboratory for testing different techniques related to the characterization and monitoring of large slope mass movements, as the stability of large portions of the eastern face of the mountain is still questionable. In order to better quantify the potentially unstable volumes, the most probable failure mechanisms and the potential consequences, structural analysis and runout modeling were performed. The structural features of the eastern face were investigated using a high-resolution digital elevation model (HRDEM). Based on displacement datasets and structural observations, potential failure mechanisms affecting different portions of the mountain were assessed. The volumes of the different potentially unstable blocks were calculated using the Sloping Local Base Level (SLBL) method. Based on these volume estimates, two- and three-dimensional dynamic runout analyses were performed. Calibration of the analysis draws on experience from the adjacent Frank Slide and other similar rock avalanches. The results will be used to improve the contingency plans within the hazard area.
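As a rough illustration of the volume step, below is a minimal 1D version of the Sloping Local Base Level principle, not the code used in the study: iteratively lower each interior cell toward the mean of its neighbours (minus a curvature tolerance) to build a failure surface beneath the DEM; the unstable volume is the integrated difference.

```python
import numpy as np

# Minimal 1D sketch of the SLBL principle: iteratively lower each interior
# cell toward the mean of its neighbours (minus a curvature tolerance) to
# obtain a failure surface beneath the DEM.

def slbl_1d(dem, tolerance=0.0, max_iter=10000):
    base = dem.astype(float)              # working copy of the profile
    for _ in range(max_iter):
        candidate = 0.5 * (base[:-2] + base[2:]) - tolerance
        lower = candidate < base[1:-1]
        if not lower.any():               # converged: no cell can be lowered
            break
        base[1:-1] = np.where(lower, candidate, base[1:-1])
    return base

dem = np.array([100.0, 120.0, 140.0, 135.0, 150.0, 130.0, 110.0])  # toy profile, m
base = slbl_1d(dem, tolerance=1.0)
area = (dem - base).sum() * 1.0   # cross-section for 1 m cells; x cell area in 2D
```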

Relevance: 10.00%

Abstract:

The Mechanistic-Empirical Pavement Design Guide (MEPDG) was developed under National Cooperative Highway Research Program (NCHRP) Project 1-37A as a novel mechanistic-empirical procedure for the analysis and design of pavements. The MEPDG was subsequently supported by AASHTO’s DARWin-ME and most recently marketed as AASHTOWare Pavement ME Design software as of February 2013. Although the core design process and computational engine have remained the same over the years, some enhancements to the pavement performance prediction models have been implemented along with other documented changes as the MEPDG transitioned to AASHTOWare Pavement ME Design software. Preliminary studies were carried out to determine possible differences between AASHTOWare Pavement ME Design, MEPDG (version 1.1), and DARWin-ME (version 1.1) performance predictions for new jointed plain concrete pavement (JPCP), new hot mix asphalt (HMA), and HMA over JPCP systems. Differences were indeed observed between the pavement performance predictions produced by these different software versions. Further investigation was needed to verify these differences and to evaluate whether identified local calibration factors from the latest MEPDG (version 1.1) were acceptable for use with the latest version (version 2.1.24) of AASHTOWare Pavement ME Design at the time this research was conducted. Therefore, the primary objective of this research was to examine AASHTOWare Pavement ME Design performance predictions using previously identified MEPDG calibration factors (through InTrans Project 11-401) and, if needed, refine the local calibration coefficients of AASHTOWare Pavement ME Design pavement performance predictions for Iowa pavement systems using linear and nonlinear optimization procedures. A total of 130 representative sections across Iowa consisting of JPCP, new HMA, and HMA over JPCP sections were used. The local calibration results of AASHTOWare Pavement ME Design are presented and compared with national and locally calibrated MEPDG models.
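The local calibration amounts to an optimization problem: choose coefficients so that predicted distress matches the measured distress of local sections. The sketch below uses a stand-in power-law transfer function and synthetic data, not the actual AASHTOWare models or Iowa measurements.

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch of the local-calibration step: pick calibration coefficients so that
# predicted distress matches measured distress on local sections. The power-law
# transfer function is a stand-in for the AASHTOWare models; data are synthetic.

def predicted_rutting(coeffs, traffic):
    k1, k2 = coeffs
    return k1 * traffic ** k2

traffic = np.array([0.5, 1.0, 2.0, 4.0, 8.0])    # million ESALs (synthetic)
measured = np.array([0.9, 1.4, 2.1, 3.0, 4.6])   # measured rutting, mm (synthetic)

fit = least_squares(lambda c: predicted_rutting(c, traffic) - measured, x0=[1.0, 0.5])
print(fit.x)   # locally calibrated coefficients k1, k2
```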

Relevance: 10.00%

Abstract:

OBJECTIVES: For certain major operations, inpatient mortality risk is lower in high-volume hospitals than in low-volume hospitals. Extending the analysis to a broader range of interventions and outcomes is necessary before adopting policies based on minimum volume thresholds. METHODS: Using the United States 2004 Nationwide Inpatient Sample, we assessed the effect of intervention-specific and overall hospital volume on surgical complications, potentially avoidable reoperations, and deaths across 1.4 million interventions in 353 hospitals. Outcome variations across hospitals were analyzed through a 3-level hierarchical logistic regression model (patients, surgical interventions, and hospitals), which took into account interventions on multiple organs, 144 intervention categories, and structural hospital characteristics. Discriminative performance and calibration were good. RESULTS: Hospitals with more experience in a given intervention had similar reoperation rates but lower mortality and complication rates: odds ratios per volume decile of 0.93 and 0.97, respectively. However, the benefit was limited to heart surgery and a small number of other operations. Risks were higher in hospitals that performed more interventions overall: the odds ratio per 1,000 interventions was approximately 1.02 for each outcome. Even after adjustment for specific volume, mortality varied substantially across both high- and low-volume hospitals. CONCLUSION: Although the link between specific volume and certain inpatient outcomes suggests that specialization might help improve surgical safety, the variable magnitude of this link and the heterogeneity of the hospital effect do not support the systematic use of volume-based referrals. It may be more efficient to monitor risk-adjusted postoperative outcomes and to investigate facilities with worse-than-expected outcomes.
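To see what an odds ratio of 0.93 per volume decile implies in absolute terms, here is a short worked example; the 2% baseline mortality is assumed purely for illustration.

```python
# Illustrative arithmetic only. An odds ratio of 0.93 per volume decile
# compounds multiplicatively: moving from the lowest to the highest
# specific-volume decile is 9 steps.
or_per_decile = 0.93
or_low_to_high = or_per_decile ** 9        # ~0.52, i.e. roughly half the odds

# Converting odds to probability for an assumed 2% baseline mortality:
baseline_p = 0.02
baseline_odds = baseline_p / (1 - baseline_p)
new_odds = baseline_odds * or_low_to_high
new_p = new_odds / (1 + new_odds)
print(round(or_low_to_high, 2), round(new_p, 4))   # 0.52 and ~0.0105
```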

Relevance: 10.00%

Abstract:

Astrocytes fulfill a central role in regulating K+ and glutamate, both released by neurons into the extracellular space during activity. Glial glutamate uptake is a secondary active process that involves the influx of three Na+ ions and one proton and the efflux of one K+ ion. Thus, the intracellular K+ concentration ([K+]i) is potentially influenced both by extracellular K+ concentration ([K+]o) fluctuations and by glutamate transport in astrocytes. We evaluated the impact of these K+ ion movements on [K+]i in primary mouse astrocytes by microspectrofluorimetry. We established a new noninvasive and reliable approach to monitor and quantify [K+]i using the recently developed K+-sensitive fluorescent indicator Asante Potassium Green-1 (APG-1). An in situ calibration procedure enabled us to estimate the resting [K+]i at 133±1 mM. We first investigated the dependency of [K+]i levels on [K+]o. We found that [K+]i followed [K+]o changes nearly proportionally in the range 3-10 mM, which is consistent with previously reported microelectrode measurements of intracellular K+ concentration changes in astrocytes. We then found that glutamate superfusion caused a reversible drop in [K+]i that depended on the glutamate concentration, with an apparent EC50 of 11.1±1.4 µM, corresponding to the affinity of astrocyte glutamate transporters. The amplitude of the [K+]i drop was 2.3±0.1 mM for 200 µM glutamate applications. Overall, this study shows that the fluorescent K+ indicator APG-1 is a powerful new tool for addressing important questions regarding fine [K+]i regulation with excellent spatial resolution.
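The concentration-response analysis implied here is a standard Hill fit; the sketch below recovers an EC50 from invented data points chosen to mimic the reported values (EC50 near 11 µM, maximal drop near 2.3 mM), not the actual recordings.

```python
import numpy as np
from scipy.optimize import curve_fit

# Standard Hill fit for a concentration-response relationship; data points are
# invented to mimic the reported values, not the actual recordings.

def hill(conc, top, ec50, n):
    return top * conc**n / (ec50**n + conc**n)

glu_um = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 200.0])   # glutamate, uM
drop_mm = np.array([0.2, 0.55, 1.1, 1.8, 2.2, 2.3])       # [K+]i drop, mM

(top, ec50, n), _ = curve_fit(hill, glu_um, drop_mm, p0=[2.3, 10.0, 1.0])
print(f"EC50 ~ {ec50:.1f} uM, maximal drop ~ {top:.1f} mM")
```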

Relevance: 10.00%

Abstract:

Background: Myotragus balearicus was an endemic bovid from the Balearic Islands (Western Mediterranean) that became extinct around 6,000-4,000 years ago. The Myotragus evolutionary lineage became isolated on the islands most probably at the end of the Messinian crisis, when the desiccation of the Mediterranean ended, at a geological date established at 5.35 Mya. Thus, sequences from Myotragus could be very valuable for calibrating the mammalian mitochondrial DNA clock and, in particular, the tree of the Caprinae subfamily, to which Myotragus belongs. Results: We have retrieved the complete mitochondrial cytochrome b gene (1,143 base pairs), plus fragments of the mitochondrial 12S gene and the nuclear 28S rDNA multi-copy gene, from a well preserved Myotragus subfossil bone. The best resolved phylogenetic trees, obtained with the cytochrome b gene, placed Myotragus in a position basal to the Ovis group. Using the calibration provided by the isolation of the Balearic Islands, we calculated that the initial radiation of caprines can be dated at 6.2 ± 0.4 Mya. In addition, the alpine and southern chamois, considered until recently the same species, split around 1.6 ± 0.3 Mya, indicating that the two chamois species have been separated much longer than previously thought. Conclusion: Since there are almost no extant endemic mammals on Mediterranean islands, the sequence of the extinct Balearic endemic Myotragus has been crucial in allowing us to use the Messinian crisis calibration point for dating the caprine phylogenetic tree.
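The calibration logic reduces to simple arithmetic: the known isolation age fixes the substitution rate, which then dates other nodes. The distances below are invented; only the 5.35 Mya calibration comes from the abstract.

```python
# Back-of-the-envelope version of the calibration logic: a geological event of
# known age fixes one node, from which a substitution rate (and then other node
# ages) can be derived. Distances are invented; only 5.35 Mya is from the study.

calibration_ma = 5.35        # end of the Messinian crisis: Myotragus isolated
distance_cal = 0.12          # pairwise distance across the calibrated node (hypothetical)

# Both lineages accumulate change independently, hence the factor of 2.
rate = distance_cal / (2 * calibration_ma)        # subst/site/Myr

# Dating another split from its (hypothetical) pairwise distance:
distance_other = 0.139
age_other_ma = distance_other / (2 * rate)
print(f"rate = {rate:.4f} subst/site/Myr, node age = {age_other_ma:.1f} Ma")
```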

Relevance: 10.00%

Abstract:

This article reports on a lossless data hiding scheme for digital images in which the data hiding capacity is determined either by the minimum acceptable subjective quality or by the demanded capacity. In the proposed method, data are hidden within the image prediction errors; the best-known prediction algorithms, such as the median edge detector (MED), gradient adjusted prediction (GAP) and the Jiang predictor, are tested for this purpose. First, the histogram of the prediction errors of the image is computed; then, based on the required capacity or desired image quality, the prediction-error values whose frequencies exceed this capacity are shifted. The empty space created by the shift is used for embedding the data. Experimental results show the distinct superiority of the image prediction-error histogram over the conventional image histogram itself, owing to the much narrower spectrum of the former. We have also devised an adaptive method for hiding data in which subjective quality is traded for data hiding capacity: the positive and negative error values are chosen such that the sum of their frequencies in the histogram is just above the given capacity or above a certain quality threshold.
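A condensed sketch of the embedding idea, combining a MED predictor with histogram shifting; it simplifies the scheme in the article (one shifting direction only, and the overflow handling and end-of-message marker of a real codec are omitted).

```python
import numpy as np

# Condensed sketch: MED prediction plus histogram shifting, one direction only.
# A decoder processing pixels in the same raster order recomputes the same
# predictions from the (modified) causal neighbours, making the scheme reversible.

def med(a, b, c):
    # Median edge detector: a = left, b = top, c = top-left neighbour.
    if c >= max(a, b):
        return min(a, b)
    if c <= min(a, b):
        return max(a, b)
    return a + b - c

def embed(img, bits, e_peak=0):
    """Shift prediction errors > e_peak up by 1 to free a bin, then embed one
    bit in each pixel whose error equals e_peak (first row/column untouched)."""
    out = img.astype(int)
    k = 0
    for i in range(1, out.shape[0]):
        for j in range(1, out.shape[1]):
            p = med(out[i, j - 1], out[i - 1, j], out[i - 1, j - 1])
            e = out[i, j] - p
            if e > e_peak:
                out[i, j] += 1                  # histogram shift
            elif e == e_peak and k < len(bits):
                out[i, j] += bits[k]            # 0: stay at peak, 1: freed bin
                k += 1
    return out, k                               # stego image, bits embedded

stego, n = embed(np.random.randint(0, 200, (16, 16)), [1, 0, 1, 1])
```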

Relevance: 10.00%

Abstract:

In this paper we introduce a highly efficient reversible data hiding system. It is based on dividing the image into tiles and shifting the histogram of each image tile between its minimum and maximum frequency bins. Data are then inserted at the grey level with the largest frequency to maximize the data hiding capacity. The scheme exploits the special properties of medical images, where the histograms of non-overlapping image tiles mostly peak around a few grey values while the rest of the spectrum is mainly empty. The zeros (or minima) and peaks (maxima) of the histograms of the image tiles are then relocated to embed the data; the grey values of some pixels are therefore modified. High capacity, high fidelity, reversibility and multiple data insertions are the key requirements of data hiding in medical images, and we show how histograms of image tiles of medical images can be exploited to achieve them. Compared with data hiding applied to the whole image, our scheme can yield a 30%-200% capacity improvement, still with better image quality, depending on the medical image content. Additional advantages of the proposed method include hiding data in regions of non-interest and better exploitation of spatial masking.
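To illustrate why tiling helps, the sketch below estimates per-tile embedding capacity as the tile-histogram peak whenever an empty bin exists to shift into; the tile size and this exact counting rule are assumptions for illustration, not the authors' precise procedure.

```python
import numpy as np

# Rough capacity estimate for the tile-wise scheme on an 8-bit image: each
# tile can carry one bit per peak-valued pixel, provided its histogram has an
# empty bin to shift into. Tile size and this exact rule are assumptions.

def tile_capacity(img, tile=64):
    total = 0
    for r in range(0, img.shape[0], tile):
        for c in range(0, img.shape[1], tile):
            hist = np.bincount(img[r:r + tile, c:c + tile].ravel(), minlength=256)
            if (hist == 0).any():     # an empty bin is needed for the shift
                total += hist.max()   # one bit per pixel at the peak grey value
    return total

img = np.random.randint(0, 60, (256, 256), dtype=np.uint8)  # narrow histogram
print(tile_capacity(img))
```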