149 results for CALIBRATION


Relevance: 10.00%

Abstract:

When decommissioning a nuclear facility it is important to be able to estimate activity levels of potentially radioactive samples and compare them with clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation to experimental data obtained using a simple point source permits the computation of absolute calibration factors for more complex geometries with an accuracy of slightly more than 20%. The uncertainty of the calibration factor can be improved to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is supposed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, like a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided sample density is taken into account as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that activity could be largely underestimated in the event of a centrally-located hotspot and overestimated for a peripherally-located hotspot if the sample is assumed to be homogeneously contaminated. This demonstrates the usefulness of being able to complement experimental methods with Monte Carlo simulations in order to estimate calibration factors that cannot be directly measured because of a lack of available material or specific geometries.
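The "relative" use of the simulation described above amounts to transferring a calibration factor measured with an available reference nuclide to another nuclide through the ratio of simulated efficiencies. The short Python sketch below illustrates this idea only; the nuclides and numbers are invented placeholders, not values from the paper.

# Hypothetical sketch of the relative use of a Monte Carlo simulation: a calibration
# factor measured with a reference nuclide is transferred to another nuclide via the
# ratio of simulated detection efficiencies in the same counting geometry.
# All numbers are illustrative placeholders, not values from the paper.

def relative_calibration_factor(cf_ref_measured: float,
                                eff_target_simulated: float,
                                eff_ref_simulated: float) -> float:
    """Calibration factor for the target nuclide, in counts/s per Bq."""
    return cf_ref_measured * (eff_target_simulated / eff_ref_simulated)

# Example: a Co-60 point-source measurement used to calibrate for Cs-137.
cf_co60 = 4.2e-3          # measured calibration factor, cps per Bq (placeholder)
eff_cs137_sim = 2.9e-3    # simulated efficiency for Cs-137, same geometry (placeholder)
eff_co60_sim = 3.6e-3     # simulated efficiency for the Co-60 reference (placeholder)

cf_cs137 = relative_calibration_factor(cf_co60, eff_cs137_sim, eff_co60_sim)
print(f"Estimated Cs-137 calibration factor: {cf_cs137:.2e} cps/Bq")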

Relevance: 10.00%

Abstract:

OBJECTIVE: To compare the predictive accuracy of the original and recalibrated Framingham risk functions against current coronary heart disease (CHD) morbidity and mortality data from the Swiss population. METHODS: Data from the CoLaus study, a cross-sectional, population-based study conducted between 2003 and 2006 on 5,773 participants aged 35-74 without CHD, were used to recalibrate the Framingham risk function. The number of events predicted by each risk function was compared with those derived from local MONICA incidence rates and official mortality data from Switzerland. RESULTS: With the original risk function, 57.3%, 21.2%, 16.4% and 5.1% of men and 94.9%, 3.8%, 1.2% and 0.1% of women were at very low (<6%), low (6-10%), intermediate (10-20%) and high (>20%) risk, respectively. With the recalibrated risk function, the corresponding values were 84.7%, 10.3%, 4.3% and 0.6% in men and 99.5%, 0.4%, 0.0% and 0.1% in women, respectively. The number of CHD events over 10 years predicted by the original Framingham risk function was 2-3-fold higher than predicted by mortality + case fatality or by MONICA incidence rates (men: 191 vs. 92 and 51 events, respectively). The recalibrated risk function provided more reasonable estimates, albeit slightly overestimated (92 events, 5th-95th percentile: 26-223 events); sensitivity analyses showed that the magnitude of the overestimation was between 0.4 and 2.2 in men and between 0.7 and 3.3 in women. CONCLUSION: The recalibrated Framingham risk function provides a reasonable alternative to assess CHD risk in men, but not in women.
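The recalibration referred to above is commonly done by keeping the original Framingham regression coefficients while substituting the local cohort's mean risk-factor levels and baseline event-free survival. The sketch below shows this generic form; the coefficients and values are placeholders, not those of the CoLaus analysis.

import math

# Illustrative sketch of a Framingham-type recalibration: keep the published
# regression coefficients but plug in the local cohort's mean risk-factor levels
# and baseline (event-free) survival. All numbers are placeholders.

def framingham_risk(betas, x, x_mean_local, s0_local):
    """10-year CHD risk from a Cox-type risk function:
    risk = 1 - S0 ** exp( sum_i beta_i * (x_i - xbar_i) )."""
    lp = sum(b * (xi - xm) for b, xi, xm in zip(betas, x, x_mean_local))
    return 1.0 - s0_local ** math.exp(lp)

betas = [0.052, 0.61, 0.40]               # e.g. age, smoking, log(cholesterol) (placeholders)
x_individual = [55, 1, math.log(6.2)]     # one subject's risk factors (placeholders)
x_mean_local = [52, 0.27, math.log(5.6)]  # local cohort means (placeholders)
s0_local = 0.96                           # local 10-year event-free survival (placeholder)

risk = framingham_risk(betas, x_individual, x_mean_local, s0_local)
print(f"Recalibrated 10-year risk: {risk:.1%}")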

Relevance: 10.00%

Abstract:

Abstract: The expansion of a recovering population - whether re-introduced or spontaneously returning - is shaped by (i) biological (intrinsic) factors such as the land tenure system or dispersal, (ii) the distribution and availability of resources (e.g. prey), (iii) habitat and landscape features, and (iv) human attitudes and activities. In order to develop efficient conservation and recovery strategies, we need to understand all these factors, to predict the potential distribution and to explore ways to reach it. An increased number of lynx in the north-western Swiss Alps in the 1990s led to a new controversy about the return of this cat. When the large carnivores were given legal protection in many European countries, most organizations and individuals promoting their protection did not foresee the consequences. Management plans describing how to handle conflicts with large predators are needed to find a balance between "overabundance" and extinction. Wildlife and conservation biologists need to evaluate the various threats confronting populations so that adequate management decisions can be taken. I developed a GIS probability model for the lynx, based on habitat information and radio-telemetry data from the Swiss Jura Mountains, in order to predict the potential distribution of the lynx in this mountain range, which is presently only partly occupied by lynx. Three of the 18 variables tested for each square kilometre, describing land use, vegetation, and topography, qualified to predict the probability of lynx presence. The resulting map was evaluated with data from dispersing subadult lynx. Young lynx that were not able to establish home ranges in what was identified as good lynx habitat did not survive their first year of independence, whereas the only one that died in good lynx habitat was illegally killed. Radio-telemetry fixes are often used as input data to calibrate habitat models. Radio-telemetry is the only way to gather accurate and unbiased data on the habitat use of elusive larger terrestrial mammals. However, it is time-consuming and expensive, and can therefore only be applied in limited areas. Habitat models extrapolated over large areas can in turn be problematic, as habitat characteristics and availability may change from one area to another. I analysed the predictive power of Ecological Niche Factor Analysis (ENFA) in Switzerland with the lynx as focal species. According to my results, the optimal sampling strategy to predict species distribution in an Alpine area lacking available data would be to pool presence cells from contrasted regions (Jura Mountains, Alps), whereas in regions with a low ecological variance (Jura Mountains), only local presence cells should be used for the calibration of the model. Dispersal influences the dynamics and persistence of populations and the distribution and abundance of species, and gives communities and ecosystems their characteristic texture in space and time. Between 1988 and 2001, the spatio-temporal behaviour of subadult Eurasian lynx in two re-introduced populations in Switzerland was studied, based on 39 juvenile lynx, of which 24 were radio-tagged, to understand the factors influencing dispersal. Subadults become independent from their mothers at the age of 8-11 months. No sex bias was detected in either the dispersal rate or the distance moved. Compared to bears and wolves, lynx are conservative dispersers and settled within or close to known lynx occurrences.
Dispersal distances reached in the high lynx density population - shorter than those reported in other Eurasian lynx studies - are limited by habitat restrictions hindering connections with neighbouring metapopulations. I postulated that high lynx density would lead to an expansion of the population and validated my predictions with data from the north-western Swiss Alps, where a strong increase in lynx abundance took place around 1995. The general hypothesis that high population density will foster the expansion of the population was not confirmed. This has consequences for the re-introduction and recovery of carnivores in a fragmented landscape. Establishing a strong source population in one place might not be an optimal strategy. Rather, population nuclei should be founded in several neighbouring patches. Exchange between established neighbouring subpopulations will take place later on, as adult lynx show a higher propensity to cross barriers than subadults. To estimate the potential population size of the lynx in the Jura Mountains and to assess possible corridors between this population and adjacent areas, I adapted a habitat probability model for lynx distribution in the Jura Mountains with new environmental data and extrapolated it over the entire mountain range. The model predicts a breeding population ranging from 74 to 101 individuals, and from 51 to 79 individuals when continuous habitat patches < 50 km2 are disregarded. The Jura Mountains could one day be part of a metapopulation, as potential corridors exist to the adjoining areas (Alps, Vosges Mountains, and Black Forest). Monitoring of population size and spatial expansion, as well as genetic surveillance, must be continued in the Jura Mountains, as the status of the population is still critical. ENFA was used to predict the potential distribution of lynx in the Alps. The resulting model divided the Alps into 37 suitable habitat patches ranging from 50 to 18,711 km2, covering a total area of about 93,600 km2. Using the range of lynx densities found in field studies in Switzerland, the Alps could host a population of 961 to 1,827 residents. The results of the cost-distance analysis revealed that all patches were within the reach of dispersing lynx, as the connection costs were within the range of dispersal costs of radio-tagged subadult lynx moving through unfavourable habitat. Thus, the whole of the Alps could one day be considered a metapopulation. But experience suggests that only a few dispersers will cross unsuitable areas and barriers. This low migration rate will rarely allow the spontaneous foundation of new populations in unsettled areas. As an alternative to natural dispersal, the artificial transfer of individuals across the barriers should be considered. Wildlife biologists can play a crucial role in developing adaptive management experiments that help managers learn by trial. The case of the lynx in Switzerland is a good example of fruitful cooperation between wildlife biologists, managers, decision makers and politicians in an adaptive management process. This cooperation resulted in a Lynx Management Plan, implemented in 2000 and updated in 2004, which gives the cantons directives on how to handle lynx-related problems. This plan has been put into practice, for example with regard to the translocation of lynx into unsettled areas.
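For readers unfamiliar with habitat-probability modelling, the sketch below fits a simple presence/absence model from per-square-kilometre predictors. It uses ordinary logistic regression on synthetic data rather than the ENFA approach of the thesis, so it only illustrates the general workflow (predictors in, probability of presence per cell out).

import numpy as np
from sklearn.linear_model import LogisticRegression

# Minimal habitat-probability sketch: presence/absence cells with a few per-km2
# predictors (forest cover, slope, distance to roads). This is plain logistic
# regression on synthetic data, not the ENFA model of the thesis.

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(0, 1, n),     # forest cover fraction
    rng.uniform(0, 45, n),    # mean slope (degrees)
    rng.uniform(0, 5, n),     # distance to nearest road (km)
])
# Synthetic "truth": presence more likely in forested, less accessible cells.
logit = -2.0 + 3.0 * X[:, 0] + 0.02 * X[:, 1] + 0.5 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(X, y)
prob = model.predict_proba(X)[:, 1]   # probability of presence per cell
print("Mean predicted presence probability:", round(prob.mean(), 3))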

Relevance: 10.00%

Abstract:

Many questions in evolutionary biology require an estimate of divergence times but, for groups with a sparse fossil record, such estimates rely heavily on molecular dating methods. The accuracy of these methods depends on both an adequate underlying model and the appropriate implementation of fossil evidence as calibration points. We explore the effect of these in Poaceae (grasses), a diverse plant lineage with a very limited fossil record, focusing particularly on dating the early divergences in the group. We show that molecular dating based on a data set of plastid markers is strongly dependent on the model assumptions. In particular, an acceleration of evolutionary rates at the base of Poaceae followed by a deceleration in the descendants strongly biases methods that assume an autocorrelation of rates. This problem can be circumvented by using markers that have lower rate variation, and we show that phylogenetic markers extracted from complete nuclear genomes can be a useful complement to the more commonly used plastid markers. However, estimates of divergence times remain strongly affected by different implementations of fossil calibration points. Analyses calibrated with only macrofossils lead to estimates for the age of core Poaceae ∼51-55 Ma, but the inclusion of microfossil evidence pushes this age to 74-82 Ma and leads to lower estimated evolutionary rates in grasses. These results emphasize the importance of considering markers from multiple genomes and alternative fossil placements when addressing evolutionary issues that depend on ages estimated for important groups.
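To see why the placement of a fossil calibration point matters so much, the toy calculation below applies a strict molecular clock: one calibrated node fixes the rate, and every other node age scales with it. The depths and ages are invented; the study itself used relaxed-clock models.

# Back-of-the-envelope illustration of how a fossil calibration constrains node ages
# under a strict molecular clock (a simplification of the relaxed-clock analyses
# discussed above). All branch depths and ages below are invented numbers.

calib_age_ma = 66.0   # hypothetical fossil-calibrated node age (Ma)
calib_depth = 0.033   # substitutions/site from that node to the tips

rate = calib_depth / calib_age_ma              # substitutions/site/Myr
print(f"Implied clock rate: {rate:.2e} subst/site/Myr")

target_depth = 0.041  # depth of the node of interest (placeholder)
print(f"Estimated age of target node: {target_depth / rate:.1f} Ma")

# Moving the calibration (e.g. accepting older microfossil evidence) rescales the
# rate and therefore every inferred age, which is the sensitivity the study reports.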

Relevance: 10.00%

Abstract:

Turtle Mountain in Alberta, Canada, has become an important field laboratory for testing different techniques related to the characterization and monitoring of large slope mass movements, as the stability of large portions of the eastern face of the mountain is still questionable. In order to better quantify the potentially unstable volumes, the most probable failure mechanisms and the potential consequences, structural analysis and runout modeling were performed. The structural features of the eastern face were investigated using a high-resolution digital elevation model (HRDEM). According to displacement datasets and structural observations, potential failure mechanisms affecting different portions of the mountain have been assessed. The volumes of the different potentially unstable blocks have been calculated using the Sloping Local Base Level (SLBL) method. Based on the volume estimation, two- and three-dimensional dynamic runout analyses have been performed. Calibration of these analyses is based on experience from the adjacent Frank Slide and other similar rock avalanches. The results will be used to improve the contingency plans within the hazard area.
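As a rough illustration of the SLBL idea mentioned above, the sketch below iteratively lowers a 1D topographic profile toward a smooth base surface and integrates the difference to obtain a potentially unstable cross-sectional area. It is a simplified, synthetic example, not the 3D implementation or the Turtle Mountain data.

import numpy as np

# Simplified 1D illustration of the Sloping Local Base Level (SLBL) idea: lower each
# interior node toward the mean of its two neighbours (plus an optional tolerance)
# whenever that mean lies below the current surface, then integrate the gap between
# topography and the resulting base surface. Profile and parameters are synthetic.

def slbl_1d(z, tol=0.0, max_iter=20000):
    s = z.astype(float).copy()
    for _ in range(max_iter):
        mean_neigh = 0.5 * (s[:-2] + s[2:]) + tol
        new_interior = np.minimum(s[1:-1], mean_neigh)
        if np.allclose(new_interior, s[1:-1], atol=1e-6):
            break
        s[1:-1] = new_interior
    return s

x = np.linspace(0, 500, 101)                                  # metres
z = 100 + 0.2 * x + 15 * np.exp(-((x - 250) / 80) ** 2)       # bumpy slope profile
base = slbl_1d(z, tol=-0.01)                                  # slightly concave base surface
area = np.trapz(z - base, dx=x[1] - x[0])                     # m2 per metre of slope width
print(f"Potentially unstable cross-section: {area:.0f} m2/m")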

Relevance: 10.00%

Abstract:

OBJECTIVES: For certain major operations, inpatient mortality risk is lower in high-volume hospitals than in low-volume hospitals. Extending the analysis to a broader range of interventions and outcomes is necessary before adopting policies based on minimum volume thresholds. METHODS: Using the United States 2004 Nationwide Inpatient Sample, we assessed the effect of intervention-specific and overall hospital volume on surgical complications, potentially avoidable reoperations, and deaths across 1.4 million interventions in 353 hospitals. Outcome variations across hospitals were analyzed through a 3-level hierarchical logistic regression model (patients, surgical interventions, and hospitals), which took into account interventions on multiple organs, 144 intervention categories, and structural hospital characteristics. Discriminative performance and calibration were good. RESULTS: Hospitals with more experience in a given intervention had similar reoperation rates but lower mortality and complication rates: odds ratios per volume decile 0.93 and 0.97, respectively. However, the benefit was limited to heart surgery and a small number of other operations. Risks were higher for hospitals that performed more interventions overall: the odds ratio per 1000 interventions was approximately 1.02 for each outcome. Even after adjustment for specific volume, mortality varied substantially across both high- and low-volume hospitals. CONCLUSION: Although the link between specific volume and certain inpatient outcomes suggests that specialization might help improve surgical safety, the variable magnitude of this link and the heterogeneity of the hospital effect do not support the systematic use of volume-based referrals. It may be more efficient to monitor risk-adjusted postoperative outcomes and to investigate facilities with worse than expected outcomes.

Relevance: 10.00%

Abstract:

Astrocytes fulfill a central role in regulating K+ and glutamate, both released by neurons into the extracellular space during activity. Glial glutamate uptake is a secondary active process that involves the influx of three Na+ ions and one proton and the efflux of one K+ ion. Thus, intracellular K+ concentration ([K+]i) is potentially influenced both by extracellular K+ concentration ([K+]o) fluctuations and glutamate transport in astrocytes. We evaluated the impact of these K+ ion movements on [K+]i in primary mouse astrocytes by microspectrofluorimetry. We established a new noninvasive and reliable approach to monitor and quantify [K+]i using the recently developed K+ sensitive fluorescent indicator Asante Potassium Green-1 (APG-1). An in situ calibration procedure enabled us to estimate the resting [K+]i at 133±1 mM. We first investigated the dependency of [K+]i levels on [K+]o. We found that [K+]i followed [K+]o changes nearly proportionally in the range 3-10 mM, which is consistent with previously reported microelectrode measurements of intracellular K+ concentration changes in astrocytes. We then found that glutamate superfusion caused a reversible drop of [K+]i that depended on the glutamate concentration with an apparent EC50 of 11.1±1.4 µM, corresponding to the affinity of astrocyte glutamate transporters. The amplitude of the [K+]i drop was found to be 2.3±0.1 mM for 200 µM glutamate applications. Overall, this study shows that the fluorescent K+ indicator APG-1 is a powerful new tool for addressing important questions regarding fine [K+]i regulation with excellent spatial resolution.
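The EC50 quoted above comes from fitting a saturating concentration-response curve to the glutamate-evoked [K+]i drop. The sketch below shows such a fit with scipy on synthetic data points; the values are illustrative, not the study's measurements.

import numpy as np
from scipy.optimize import curve_fit

# Sketch of a concentration-response fit: a simple saturating (Michaelis-Menten-like)
# curve relating the amplitude of the [K+]i drop to the applied glutamate
# concentration. The data points below are synthetic.

def response(c, emax, ec50):
    return emax * c / (ec50 + c)

glu_um = np.array([1, 3, 10, 30, 100, 200], dtype=float)   # glutamate (µM)
drop_mm = np.array([0.25, 0.55, 1.1, 1.7, 2.1, 2.3])       # [K+]i drop (mM)

popt, pcov = curve_fit(response, glu_um, drop_mm, p0=[2.5, 10.0])
emax, ec50 = popt
perr = np.sqrt(np.diag(pcov))
print(f"Emax = {emax:.2f} mM, EC50 = {ec50:.1f} ± {perr[1]:.1f} µM")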

Relevance: 10.00%

Abstract:

Variations in the stable carbon-isotope ratio of marine and continental sediments can reflect changes in the sinks and fluxes of the palaeo-carbon cycle. Here we report carbon-isotope compositions of Middle Jurassic marine carbonates from the Betic Cordillera (southern Spain), which represents an ideal region to link the stable carbon-isotope curves directly to ammonite zones and subzones, and thereby for the first time achieve an accurate chronostratigraphic calibration. The five sections studied represent basin and high-swell deposits of the Southern Iberian palaeomargin. We find a similar δ13C of carbonates between different oceanic areas, suggesting a homogeneous oceanic carbon-isotope reservoir through the Middle Jurassic. The Aalenian-Bajocian transition is a critical period in ammonite evolution, during which the Early Jurassic faunas were replaced by new ammonite families that became dominant throughout the Middle and Late Jurassic. For this reason, we compared the δ13C values of carbonates with ammonite diversity and extinction rates at different taxonomic levels in order to explore the possible relationship between the carbon cycle and ammonite evolution. The carbon-isotope values of carbonates are not strictly linearly correlated with the extinction rate and ammonite diversity, but the main faunal turnovers follow minimum δ13C values, where extinct taxa are replaced by new ones. Likewise, radiation episodes are associated with increasing δ13C values and with transgressive sea-level rise. All these data support the idea that perturbations in the global carbon cycle reflect rapid palaeoenvironmental changes. We made detailed analyses of these faunal turnovers, using them as a proxy to identify major palaeoenvironmental crises in these ecosystems forced by modifications of the carbon cycle.

Relevance: 10.00%

Abstract:

The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis on calibration, sequence design, standards utilisation and data treatment without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework of the whole process using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach, when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of quality and sufficiently robust to proceed to retrospective analyses or interlaboratory comparisons.

Relevance: 10.00%

Abstract:

BACKGROUND: The quantification of total (free+sulfated) metanephrines in urine is recommended to diagnose pheochromocytoma. Urinary metanephrines include metanephrine itself, normetanephrine and methoxytyramine, mainly in the form of sulfate conjugates (60-80%). Their determination requires the hydrolysis of the sulfate ester moiety to allow electrochemical oxidation of the phenolic group. Commercially available urine calibrators and controls contain essentially free, unhydrolysable metanephrines, which are not representative of native urine samples. The lack of appropriate calibrators may lead to uncertainty regarding the completion of the hydrolysis of sulfated metanephrines, resulting in incorrect quantification. METHODS: We used chemically synthesized sulfated metanephrines to establish whether the procedure most frequently recommended for commercial kits (pH 1.0 for 30 min over a boiling water bath) ensures their complete hydrolysis. RESULTS: We found that sulfated metanephrines differ in their optimum pH for complete hydrolysis. The highest yields and minimal variance were obtained for incubation at pH 0.7-0.9 for 20 min. CONCLUSION: Urinary pH should be carefully controlled to ensure efficient and reproducible hydrolysis of sulfated metanephrines. Synthetic sulfated metanephrines represent the optimal material for calibrators and proficiency testing to improve inter-laboratory accuracy.

Relevance: 10.00%

Abstract:

Bacterial bioreporters have substantial potential for contaminant assessment but their real-world application is currently impaired by a lack of sensitivity. Here, we exploit the bioconcentration of chemicals in the urine of animals to facilitate pollutant detection. The shore crab Carcinus maenas was exposed to the organic contaminant 2-hydroxybiphenyl, and urine was screened using an Escherichia coli-based luciferase gene (luxAB) reporter assay specific to this compound. Bioassay measurements differentiated between the original contaminant and its metabolites, quantifying bioconcentration factors of up to one hundred-fold in crab urine. Our results reveal the substantial potential of using bacterial bioreporter assays in real-time monitoring of biological matrices to determine exposure histories, with wide-ranging potential for the in situ measurement of xenobiotics in risk assessments and epidemiology.
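One way to turn a bioreporter signal into the bioconcentration factors mentioned above is to build a calibration curve from standards and interpolate the urine and exposure-water readings. The sketch below assumes a simple linear calibration and invented luminescence values; the real assay and its calibration differ in detail.

import numpy as np

# Hedged sketch: a linear calibration curve from 2-hydroxybiphenyl standards converts
# luminescence readings to concentrations; the ratio of urine to exposure-water
# concentration gives a bioconcentration factor. All readings are invented.

std_conc_um = np.array([0.0, 0.5, 1.0, 2.0, 5.0])   # standard concentrations (µM)
std_lum = np.array([120, 900, 1750, 3400, 8300])    # luminescence readings (RLU)

slope, intercept = np.polyfit(std_conc_um, std_lum, 1)   # linear calibration

def rlu_to_conc(rlu):
    return (rlu - intercept) / slope

urine_conc = rlu_to_conc(6500.0)   # crab urine sample (RLU, placeholder)
water_conc = rlu_to_conc(190.0)    # exposure water (RLU, placeholder)
print(f"Bioconcentration factor ≈ {urine_conc / water_conc:.0f}-fold")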

Relevance: 10.00%

Abstract:

Abstract: "I have often seen experts hold contrary opinions. I have never seen any of them be wrong." Auguste Detoeuf, Propos d'O.L. Barenton, confiseur, Editions du Tambourinaire, 1948. By deliberately choosing a typically empirical accounting problem, this work set out to demonstrate that purely accounting insights (i.e. insights produced within Accounting's own scheme of representation) can be obtained while refraining from the systematic borrowing of ready-made theories from Economics, except where this proves genuinely necessary and legitimate, as with the use of the CAPM in the preceding chapter. Once again, this thesis is not an indictment of the economic approach as such, but a plea for tempering that approach in Accounting. In line with the epistemological position taken in the first chapter, we sought to highlight the contribution and place of Accounting within Economics by positioning Accounting as a discipline that supplies the measures used to represent economic activity. It seems clear to us that while economic activity, as the direct accounting semiosphere, dictates accounting observations, the measurement of those observations must, as far as possible, free itself from any dependence on the economic discipline and its associated theories and methods, by adopting an orthogonal, rational and systematic mode of operation within a framework of axioms of its own. This position leads to the definition of a new epistemological framework relative to the positive approach to Accounting, which can be described as the philosophical expression of the colonisation of accounting research by a methodical mode of reasoning proper to economic research. To be at least partially validated, this new framework, which we see as derived from constructivism, should demonstrate its capacity to deal satisfactorily with a classic problem of empirical-positive accounting. The specific problem chosen is the treatment and validation of the going-concern principle. The going-concern principle postulates (statement of a hypothesis) and establishes (verification of the hypothesis) that the enterprise prepares its financial statements on the assumption that it will continue its activities in the normal course of business. The going-concern assumption is abandoned (and must then be set aside in favour of a liquidation or disposal basis) in the event of a total or partial, voluntary or involuntary cessation of activity, or when facts liable to compromise the continuity of operations are observed. Such facts concern the financial, economic and social situation of the enterprise and comprise all the objective events, whether they have already occurred or may yet occur, that are likely to affect the continuation of activity in the foreseeable future. Like all accounting principles, the going-concern principle proceeds from a purely theoretical consideration. Its verification nevertheless requires a concrete analysis with real and measurable consequences, which is why it is a much-appreciated research topic in positive accounting, to the point that it can (wrongly) be confused with studies of corporate bankruptcy and failure. In practice, some of these studies, based on multivariate discriminant analyses, have become genuine working tools for the auditor because of their ease of use and interpretation. Through the problem addressed in this thesis, we sought to meet a number of objectives that can be grouped into two sets: those related to the methodological approach and those concerning measurement and calibration. In a final step, these two groups of objectives allowed the construction of a model intended as a logical consequence of the choices and hypotheses adopted.

Relevance: 10.00%

Abstract:

BACKGROUND AND PURPOSE: The ASTRAL score was recently introduced as a prognostic tool for acute ischemic stroke. It predicts 3-month outcome reliably in both the derivation and the validation European cohorts. We aimed to validate the ASTRAL score in a Chinese stroke population and moreover to explore its prognostic value to predict 12-month outcome. METHODS: We applied the ASTRAL score to acute ischemic stroke patients admitted to 132 study sites of the China National Stroke Registry. Unfavorable outcome was assessed as a modified Rankin Scale score >2 at 3 and 12 months. Areas under the curve were calculated to quantify the prognostic value. Calibration was assessed by comparing predicted and observed probability of unfavorable outcome using Pearson correlation coefficient. RESULTS: Among 3755 patients, 1473 (39.7%) had 3-month unfavorable outcome. Areas under the curve for 3 and 12 months were 0.82 and 0.81, respectively. There was high correlation between observed and expected probability of unfavorable 3- and 12-month outcome (Pearson correlation coefficient: 0.964 and 0.963, respectively). CONCLUSIONS: ASTRAL score is a reliable tool to predict unfavorable outcome at 3 and 12 months after acute ischemic stroke in the Chinese population. It is a useful tool that can be readily applied in clinical practice to risk-stratify acute stroke patients.
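The two validation measures used above, discrimination (AUC) and calibration (predicted versus observed outcome rates), can be computed as in the generic sketch below. The predictions and outcomes are simulated, not registry data.

import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import roc_auc_score

# Generic sketch of external validation: discrimination via the area under the ROC
# curve, and calibration via agreement between predicted and observed event rates,
# summarised here per risk decile with a Pearson correlation. Data are simulated.

rng = np.random.default_rng(1)
p_pred = rng.beta(2, 3, size=3755)        # predicted probability of unfavorable outcome
y_obs = rng.binomial(1, p_pred)           # simulated observed outcomes

print(f"AUC: {roc_auc_score(y_obs, p_pred):.2f}")

deciles = np.quantile(p_pred, np.linspace(0, 1, 11))
bins = np.clip(np.digitize(p_pred, deciles[1:-1]), 0, 9)
pred_mean = np.array([p_pred[bins == b].mean() for b in range(10)])
obs_rate = np.array([y_obs[bins == b].mean() for b in range(10)])
r, _ = pearsonr(pred_mean, obs_rate)
print(f"Calibration (Pearson r across deciles): {r:.3f}")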

Relevance: 10.00%

Abstract:

BACKGROUND: Workers with persistent disabilities after orthopaedic trauma may need occupational rehabilitation. Despite various risk profiles for non-return-to-work (non-RTW), there is no available predictive model. Moreover, injured workers may have various origins (immigrant workers), which may either affect their return to work or their eligibility for research purposes. The aim of this study was to develop and validate a predictive model that estimates the likelihood of non-RTW after occupational rehabilitation using predictors which do not rely on the worker's background. METHODS: Prospective cohort study (3177 participants, native (51%) and immigrant workers (49%)) with two samples: a) Development sample with patients from 2004 to 2007 with Full and Reduced Models, b) External validation of the Reduced Model with patients from 2008 to March 2010. We collected patients' data and biopsychosocial complexity with an observer rated interview (INTERMED). Non-RTW was assessed two years after discharge from the rehabilitation. Discrimination was assessed by the area under the receiver operating curve (AUC) and calibration was evaluated with a calibration plot. The model was reduced with random forests. RESULTS: At 2 years, the non-RTW status was known for 2462 patients (77.5% of the total sample). The prevalence of non-RTW was 50%. The full model (36 items) and the reduced model (19 items) had acceptable discrimination performance (AUC 0.75, 95% CI 0.72 to 0.78 and 0.74, 95% CI 0.71 to 0.76, respectively) and good calibration. For the validation model, the discrimination performance was acceptable (AUC 0.73; 95% CI 0.70 to 0.77) and calibration was also adequate. CONCLUSIONS: Non-RTW may be predicted with a simple model constructed with variables independent of the patient's education and language fluency. This model is useful for all kinds of trauma in order to adjust for case mix and it is applicable to vulnerable populations like immigrant workers.
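Reducing a prediction model with random forests, as done above, typically means ranking candidate predictors by forest importance, keeping the top ones, and checking that discrimination is preserved. The sketch below illustrates this on synthetic data; the real model used INTERMED items and other clinical variables.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Illustrative model reduction: rank 36 candidate predictors by random-forest
# importance, keep the top 19, refit a simpler model, and compare AUCs.
# The data are synthetic, not the occupational rehabilitation cohort.

X, y = make_classification(n_samples=2462, n_features=36, n_informative=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
top = np.argsort(rf.feature_importances_)[::-1][:19]   # keep 19 of 36 predictors

full = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
reduced = LogisticRegression(max_iter=1000).fit(X_tr[:, top], y_tr)

print(f"Full model AUC:    {roc_auc_score(y_te, full.predict_proba(X_te)[:, 1]):.2f}")
print(f"Reduced model AUC: {roc_auc_score(y_te, reduced.predict_proba(X_te[:, top])[:, 1]):.2f}")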

Relevance: 10.00%

Abstract:

PAH (N-(4-aminobenzoyl)glycine) clearance measurements have been used for 50 years in clinical research for the determination of renal plasma flow. The quantitation of PAH in plasma or urine is generally performed by a colorimetric method after a diazotization reaction, but the measurements must be corrected for the unspecific residual response observed in blank plasma. We have developed an HPLC method to specifically determine PAH and its metabolite NAc-PAH using gradient-elution ion-pair reversed-phase chromatography with UV detection at 273 and 265 nm, respectively. The separations were performed at room temperature on a ChromCart (125 mm x 4 mm I.D.) Nucleosil 100-5 µm C18AB cartridge column, using a gradient elution of MeOH-buffer pH 3.9 from 1:99 to 15:85 over 15 min. The pH 3.9 buffered aqueous solution consisted of a mixture of 375 ml sodium citrate-citric acid solution (21.01 g citric acid and 8.0 g NaOH per liter), to which 2.7 ml of 85% H3PO4 and 1.0 g of sodium heptanesulfonate were added, made up to 1000 ml with ultrapure water. N-acetyltransferase activity does not seem to affect PAH clearances notably, although NAc-PAH represents 10.2+/-2.7% of the PAH excreted unchanged in 12 healthy subjects. The performances of the HPLC and colorimetric methods were compared using urine and plasma samples collected from healthy volunteers. Good correlations (r=0.94 and 0.97 for plasma and urine, respectively) were found between the results obtained with both techniques. However, the colorimetric method gives higher concentrations of PAH in urine and lower concentrations in plasma than those determined by HPLC. Hence, both renal (ClR) and systemic (ClS) clearances are systematically higher (by 35.1% and 17.8%, respectively) with the colorimetric method. The fraction of PAH excreted by the kidney, ClR/ClS, calculated from HPLC data (n=143) is, as expected, always <1 (mean=0.73+/-0.11), whereas the colorimetric method gives a mean extraction ratio of 0.87+/-0.13, implying some unphysiological values (>1). In conclusion, HPLC not only enables the simultaneous quantitation of PAH and NAc-PAH, but may also provide more accurate and precise PAH clearance measurements.
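For orientation, the clearance quantities discussed above follow from simple arithmetic: renal clearance from urine data and systemic clearance from a constant-rate infusion at steady state; their ratio is the renal extraction fraction, which should stay below 1. The numbers below are invented for illustration.

# Worked illustration of the clearance quantities, using invented numbers:
# renal clearance ClR = U * V / P from urine data, systemic clearance
# ClS = infusion rate / steady-state plasma concentration, and their ratio.

u_conc = 950.0       # urinary PAH concentration (mg/L), placeholder
urine_flow = 1.2     # urine flow (mL/min), placeholder
p_conc = 2.1         # steady-state plasma PAH concentration (mg/L), placeholder

cl_renal = u_conc * urine_flow / p_conc        # mL/min
print(f"Renal clearance ClR ~ {cl_renal:.0f} mL/min")

infusion_rate = 1.5  # PAH infusion rate at steady state (mg/min), placeholder
cl_systemic = infusion_rate / p_conc * 1000    # mL/min (p_conc in mg/L)
print(f"Systemic clearance ClS ~ {cl_systemic:.0f} mL/min")

print(f"Renal extraction fraction ClR/ClS ~ {cl_renal / cl_systemic:.2f}")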