49 results for Optically transparent


Abstract:

The Plateforme Interdisciplinaire of the University of Lausanne stands at the crossroads of the soft and hard sciences. We question the place and role of languages and scientific cultures in the construction and transmission of knowledge. The examples, drawn from law, health, mathematics, neuroscience and university education, go beyond a conception of languages as transparent vehicles for ideas and discoveries. They make it possible to consider the diversity of languages and scientific cultures as a means of achieving a "thick standardization" of science, one that integrates and values the double need for conceptual depth and accessibility in scientific discourse.

Abstract:

Species distribution models (SDMs) are increasingly proposed to support conservation decision making. However, evidence of SDMs supporting solutions for on-ground conservation problems is still scarce in the scientific literature. Here, we show that successful examples exist but are still largely hidden in the grey literature, and thus less accessible for analysis and learning. Furthermore, the decision framework within which SDMs are used is rarely made explicit. Using case studies from biological invasions, identification of critical habitats, reserve selection and translocation of endangered species, we propose that SDMs may be tailored to suit a range of decision-making contexts when used within a structured and transparent decision-making process. To construct appropriate SDMs that more effectively guide conservation actions, modellers need to better understand the decision process, and decision makers need to provide feedback to modellers regarding the actual use of SDMs to support conservation decisions. This could be facilitated by individuals or institutions playing the role of 'translators' between modellers and decision makers. We encourage species distribution modellers to get involved in real decision-making processes that will benefit from their technical input; this strategy has the potential to better bridge theory and practice, and to improve both scientific knowledge and conservation outcomes.

Abstract:

When facing age-related cerebral decline, older adults are unequally affected by cognitive impairment, for reasons that remain unclear. To explore the underlying mechanisms and find possible solutions for maintaining life-space mobility, a standardized behavioral test that relates to behaviors in natural environments is needed. The aim of the project described in this paper was therefore to provide a free, reliable, transparent, computer-based instrument capable of detecting age-related changes in visual processing and cortical functions for research into human behavior in computational transportation science. After establishing content validity and exploring the psychometric properties of the developed tasks, we derived the scoring method for measuring cerebral decline and tested the instrument's validity against on-road driving performance in 106 older drivers aged ≥70 years attending a driving refresher course organized by the Swiss Automobile Association (Study 1). We then validated the derived method on a new sample of 182 drivers (Study 2), measured the instrument's reliability by having 17 healthy young volunteers repeat all tests included in the instrument five times (Study 3), and explored the instrument's underlying psychophysical functions in 47 older drivers (Study 4). Finally, we tested the instrument's responsiveness to alcohol and its effects on driving-simulator performance in a randomized, double-blind, placebo-controlled, crossover, dose-response validation trial including 20 healthy young volunteers (Study 5). The developed instrument revealed good psychometric properties related to processing speed. It was reliable (ICC = 0.853), showed a reasonable association with driving performance (R² = 0.053), and responded to blood alcohol concentrations of 0.5 g/L (p = 0.008). Our results suggest that MedDrive is capable of detecting age-related changes that affect processing speed. These changes nevertheless do not necessarily affect driving behavior.
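As an aside on the reliability figure (ICC = 0.853 from 17 volunteers repeating the tests five times), a test-retest ICC of this kind can be computed from a long-format table of repeated scores. The sketch below uses the pingouin package; the column names and values are placeholders, not study data.

```python
# Sketch: test-retest reliability (ICC) from repeated measurements,
# assuming a long-format table with one row per subject x session.
import pandas as pd
import pingouin as pg

scores = pd.DataFrame({
    "subject": ["s1"] * 3 + ["s2"] * 3 + ["s3"] * 3,
    "session": [1, 2, 3] * 3,
    "score":   [412, 405, 399, 498, 489, 495, 365, 371, 360],  # e.g. reaction times (ms)
})

icc = pg.intraclass_corr(data=scores, targets="subject",
                         raters="session", ratings="score")
# ICC2 ("single random raters") is a common choice for test-retest designs.
print(icc.set_index("Type").loc["ICC2", ["ICC", "CI95%"]])
```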

Abstract:

General Introduction. This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite the wide-ranging preferential access granted to developing countries by industrial ones under North-South trade agreements, whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA or EBA, it has been claimed that the benefits from improved market access keep falling short of their full potential. RoOs are widely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special-interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent.

Part I. In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Pr. Olivier Cadot, Celine Carrère and Pr. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, in the case of PANEURO, the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at that level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs.

The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also help move regionalism toward more openness and hence make it more compatible with the multilateral trading system. The chapter focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument, a Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore changing all instruments into an MFC would bring improved transparency, much like the "tariffication" of NTBs.
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to obtain a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to obtain an overall equivalent of the current system, and explore the possibility of setting this unique instrument at a uniform rate across lines. Uniformity would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin; this argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier: only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of the good's value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles and apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests.
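A minimal sketch of this simulate-and-invert logic, assuming for illustration a simple linear specification in which utilization depends on the preference margin and an MFC-type restrictiveness variable (the chapter's actual estimating equation, which starts from the existing array of RoOs, is more involved; all column names are placeholders):

```python
# Sketch of the three-step MFC simulation under an assumed linear model:
# util_rate = b0 + b1 * pref_margin + b2 * mfc + error.
# Column names (util_rate, pref_margin, mfc, trade) are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def simulate_uniform_mfc(df: pd.DataFrame) -> float:
    # Step 1: estimate the relationship between utilization rates,
    # tariff preferences and the restrictiveness instrument.
    fit = smf.ols("util_rate ~ pref_margin + mfc", data=df).fit()
    b0, b1, b2 = fit.params["Intercept"], fit.params["pref_margin"], fit.params["mfc"]

    # Step 2: invert the fitted relationship line by line to obtain the
    # simulated MFC reproducing each line's observed utilization rate.
    df = df.assign(mfc_sim=(df["util_rate"] - b0 - b1 * df["pref_margin"]) / b2)

    # Step 3: trade-weighted average across tariff lines gives the overall
    # (uniform) equivalent of the current array of RoOs.
    return float(np.average(df["mfc_sim"], weights=df["trade"]))

# Illustrative usage on synthetic tariff-line data:
rng = np.random.default_rng(1)
lines = pd.DataFrame({"pref_margin": rng.uniform(0.0, 0.2, 500),
                      "mfc": rng.uniform(0.1, 0.6, 500),
                      "trade": rng.uniform(1.0, 100.0, 500)})
lines["util_rate"] = 0.2 + 1.5 * lines["pref_margin"] + 0.8 * lines["mfc"] + rng.normal(0, 0.05, 500)
print(simulate_uniform_mfc(lines))
```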
The third chapter of the thesis considers whether the EU's Europe Agreements, with their current sets of RoOs, could be a model for future EU-centered PTAs. First, I studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs used before 1997 and the "single list" RoOs used since 1997. Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly allocate sales between the EU and the rest of the world by comparing producer prices on each market, I estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession.

Part II. The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument that has the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset-review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Pr. Olivier Cadot and Pr. Jaime de Melo, uses a new database on AD measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis.

First, using Poisson and Negative Binomial regressions, the count of AD-measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the post-agreement survival function) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
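For the survival-analysis step, a difference-in-differences Cox regression of the kind described can be sketched with the lifelines package on synthetic data. The column names, the single interaction dummy and the simulated durations are illustrative assumptions; the chapter additionally controls for imposing country, investigated country and sector.

```python
# Sketch of a difference-in-differences Cox model for AD-measure duration:
# the coefficient on post_agreement * wto_target captures the extra increase
# in the revocation hazard (shorter lifetimes) for measures covered by the ADA.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 400
post = rng.integers(0, 2, n)       # measure imposed after the agreement?
wto = rng.integers(0, 2, n)        # target is a WTO member?
# Covered measures get a higher hazard, i.e. shorter expected lifetimes.
duration = rng.exponential(scale=60, size=n) * np.exp(-0.4 * post * wto)
revoked = (duration <= 120).astype(int)     # administrative censoring at 120 months
duration = np.minimum(duration, 120)

measures = pd.DataFrame({"duration_months": duration, "revoked": revoked,
                         "post_agreement": post, "wto_target": wto})
measures["did"] = measures["post_agreement"] * measures["wto_target"]

cph = CoxPHFitter()
cph.fit(measures, duration_col="duration_months", event_col="revoked")
cph.print_summary()   # a positive coefficient on "did" means a higher post-agreement hazard for covered measures
```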

Abstract:

The radioactive concentrations of (166m)Ho, (134)Cs and (133)Ba solutions have been standardised using a 4πβ-4πγ coincidence counting system that we have recently set up. Detection in the beta channel is performed using various geometries of a UPS-89 plastic scintillator optically coupled to a selected low-noise 1 in. diameter photomultiplier tube. The light-tight thin capsule that encloses this beta detector is housed within the well of a 5 in. × 5 in. NaI(Tl) monocrystal detector. The beta detection efficiency can be varied either by optical filtering or by electronic discrimination when the electrons lose all their energy in the plastic scintillator. This 4πβ-4πγ coincidence system improves on our 4πβ(PC)-γ system in that its sample preparation is less labour-intensive, it yields larger beta- and gamma-counting efficiencies, thus enabling the standardisation of low-activity sources with good statistics in a reasonable time, and it makes standardising short-lived radionuclides easier. The resulting radioactive concentrations of (166m)Ho, (134)Cs and (133)Ba agree with those measured with other primary measurement methods, thus validating our 4πβ-4πγ coincidence counting system.
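For context, the idealized relation behind β-γ coincidence counting is the following (a sketch that neglects background, dead time and decay-scheme corrections, all of which the actual standardisation must handle). With a source of disintegration rate N₀ and channel efficiencies ε_β and ε_γ:

```latex
N_\beta = N_0\,\varepsilon_\beta, \qquad
N_\gamma = N_0\,\varepsilon_\gamma, \qquad
N_c = N_0\,\varepsilon_\beta\,\varepsilon_\gamma
\quad\Longrightarrow\quad
N_0 = \frac{N_\beta\,N_\gamma}{N_c}.
```

Varying the beta efficiency (estimated as N_c/N_γ) by optical filtering or electronic discrimination, as described above, is what permits the usual extrapolation towards ε_β → 1 for nuclides with more complex decay schemes.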

Abstract:

Morphology is the aspect of language concerned with the internal structure of words. In the past decades, a large body of masked-priming (behavioral and neuroimaging) data has suggested that the visual word recognition system automatically decomposes any morphologically complex word into a stem and its constituent morphemes. Yet the reliance of morphology on other reading processes (e.g., orthography and semantics), as well as its underlying neuronal mechanisms, remains to be determined. In the current magnetoencephalography study, we addressed morphology from the perspective of the unification framework: by applying the Hold/Release paradigm, morphological unification was simulated via the assembly of internal morphemic units into a whole word. Trials representing real words were divided into words with a transparent (true) or a non-transparent (pseudo) morphological relationship. Morphological unification of truly suffixed words was faster and more accurate, and it additionally enhanced induced oscillations in the narrow gamma band (60-85 Hz, 260-440 ms) in the left posterior occipitotemporal junction. This neural signature could not be explained by mere automatic lexical processing (i.e., stem perception); more likely it reflects a semantic access step during the morphological unification process. By demonstrating the validity of unification at the morphological level, this study contributes to the vast empirical evidence on unification across other language processes. Furthermore, we point out that morphological unification relies on the retrieval of lexical-semantic associations via induced gamma-band oscillations in a cerebral hub region for visual word form processing.

Abstract:

Introduction: Congenital intravitreal cysts are rare and generally do not cause ocular complications. Objectives and Methods: We report the case of a non-pigmented intravitreal cyst that we followed for six years in order to analyse its growth behaviour. Observation: A 62-year-old patient, followed for Morbihan syndrome, consulted because of a shadow perceived in the visual field of the right eye. Corrected visual acuity was 1.0. Slit-lamp examination showed a transparent cyst floating in the vitreous. Ultrasound confirmed the presence of a cyst with a diameter of 4.37 mm, surrounded by a second, larger cyst of 7.15 mm in diameter. A clinical follow-up after six years showed that the cyst diameters had increased to 6.20 mm for the small cyst and 7.95 mm for the large one, and that their volumes had increased from 43.70 mm³ to 124.97 mm³ for the small cyst and from 191.39 mm³ to 263.09 mm³ for the large cyst. Discussion: Since the cysts did not affect visual acuity, no treatment was proposed. Conclusion: A non-pigmented intravitreal cyst can increase considerably in volume over six years of observation.
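For the record, the reported volumes are consistent with treating each cyst as a sphere of the measured diameter:

```latex
V = \frac{\pi d^{3}}{6}
\quad\Rightarrow\quad
V(4.37\ \mathrm{mm}) \approx 43.7\ \mathrm{mm^{3}}, \qquad
V(7.95\ \mathrm{mm}) \approx 263.1\ \mathrm{mm^{3}}.
```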

Abstract:

Urgonian-type carbonates are a characteristic feature of many late Early Cretaceous shallow-marine, tropical and subtropical environments. The presence of typical photozoan carbonate-producing communities including corals and rudists indicates the prevalence of warm, transparent and presumably oligotrophic conditions in a period otherwise characterized by the high density of globally occurring anoxic episodes. Of particular interest, therefore, is the exploration of relationships between Urgonian platform growth and palaeoceanographic change. In the French and Swiss Jura Mountains, the onset and evolution of the Urgonian platform have been controversially dated, and a correlation with other, better dated, successions is correspondingly difficult. It is for this reason that the stratigraphy and sedimentology of a series of recently exposed sections (Eclepens, Vaumarcus and Neuchatel) and, in addition, the section of the Gorges de l'Areuse were analysed. Calcareous nannofossil biostratigraphy, the evolution of phosphorus contents of bulk rock, a sequence-stratigraphic interpretation and a correlation of drowning unconformities with better dated sections in the Helvetic Alps were used to constrain the age of the Urgonian platform. The sum of the data and field observations suggests the following evolution: during the Hauterivian, important outward and upward growth of a bioclastic and oolitic carbonate platform is documented in two sequences, separated by a phase of platform drowning during the late Early Hauterivian. Following these two phases of platform growth, a second drowning phase occurred during the latest Hauterivian and Early Barremian, which was accompanied by significant platform erosion and sediment reworking. The Late Barremian witnessed the renewed installation of a carbonate platform, which initiated with a phase of oolite production, and which progressively evolved into a typical Urgonian carbonate platform colonized by corals and rudists. This phase terminated at the latest in the middle Early Aptian, due to a further drowning event. The evolution of this particular platform segment is compatible with that of more distal and well-dated segments of the same northern Tethyan platform preserved in the Helvetic zone of the Alps and in the northern subalpine chains (Chartreuse and Vercors).

Abstract:

Unlike the adult heart, the embryonic myocardium works at low PO2 and depends preferentially on glucose. Therefore, the activity of the embryonic heart during anoxia and reoxygenation should be particularly affected by changes in glucose availability. Hearts excised from 4-d-old chick embryos were submitted in vitro to strictly controlled anoxia-reoxygenation transitions at glucose concentrations varying from 0 to 20 mmol/L. Spontaneous and regular heart contractions were detected optically as movements of the ventricle wall, and instantaneous heart rate, amplitude of contraction, and velocities of contraction and relaxation were determined. Anoxia induced transient tachycardia and rapidly depressed contractile activity, whereas reoxygenation provoked a temporary and complete cardioplegia (oxygen paradox). In the presence of glucose, atrial rhythm became irregular during anoxia and chaotic-periodic during reoxygenation. The incidence of these arrhythmias depended on the duration of anoxia, and no ventricular ectopic beats were observed. Removal of glucose or blockade of glycolysis suppressed the arrhythmias. These results show similarities but also differences with respect to the adult heart. Indeed, glucose 1) delayed anoxic contractile failure, shortened the reoxygenation-induced cardiac arrest, and improved the recovery of contractile activity; 2) attenuated stunning at 20 mmol/L but worsened it at 8 mmol/L; and 3) paradoxically, was arrhythmogenic during anoxia and reoxygenation, especially when present at the physiologic concentration of 8 mmol/L. This last phenomenon seems to be characteristic of the young embryonic heart, and our findings underscore that fluctuations in glycolytic activity may play a role in the reactivity of the embryonic myocardium to anoxia-reoxygenation transitions.

Abstract:

The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way, the latter requirement arising from the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and for the interpretation of their evidential value.
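As a purely illustrative stand-in for the comparison algorithms of Part II (this is not the published method), a library search could be built around a correlation score between normalised densitometric profiles; the function names and data structures below are assumptions.

```python
# Illustrative profile comparison for searching an ink reference library.
# NOT the algorithm of the cited Part II paper: it simply standardises two
# intensity profiles and scores them with a Pearson correlation.
import numpy as np

def similarity(profile_a: np.ndarray, profile_b: np.ndarray) -> float:
    """Pearson correlation between two intensity profiles of equal length."""
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.mean(a * b))

def rank_library(questioned: np.ndarray, library: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    """Rank reference inks by decreasing similarity to the questioned sample."""
    scores = {name: similarity(questioned, ref) for name, ref in library.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```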

Abstract:

INTRODUCTION. Multimodal strategies targeted at the prevention of catheter-related infection combine education on general measures of hygiene with specific guidelines for catheter insertion and dressing (1). OBJECTIVES. In this context, we tested the introduction of chlorhexidine (CHX)-impregnated sponges (2). METHODS. In our 32-bed mixed ICU, prospective surveillance of primary bacteremia and of microbiologically documented catheter-related bloodstream infections (CRBSI) is performed according to standardized definitions. New guidelines for central venous catheter (CVC) dressing combined a CHX-impregnated sponge (BioPatch) with a transparent occlusive dressing (Tegaderm) and scheduled renewal every 7 days. To contain costs, BioPatch was used only for internal jugular and femoral sites. Other elements of the prevention programme were not modified (overall compliance with hand hygiene 65-68%; non-coated catheters except for burned patients [173 out of 9,542 patients]; maximal sterile barriers for insertion; alcoholic solution of CHX for skin disinfection). RESULTS. Median monthly CVC-days increased from 710 to 749, 855 and 965 in 2006, 2007, 2008 and 2009, respectively (p < 0.01). Following the introduction of the new guidelines (4Q2007), the average monthly rate of infections decreased from 3.7 (95% CI: 2.6-4.8) episodes/1000 CVC-days over the 24 preceding months to 2.2 (95% CI: 1.5-2.8) over the 24 following months (p = 0.031). Dressings needed to be changed every 3-4 days. The decrease in catheter-related infections observed in all consecutively admitted patients is comparable to that recently shown in a placebo-randomized trial (2). Further generalization to all CVC and arterial catheter access sites may be justified. CONCLUSIONS. Our data strongly suggest that, combined with occlusive dressings, CHX-impregnated sponges for the dressing of all CVC catheters inserted at internal jugular and/or femoral sites significantly reduce the rate of primary bacteremia and CRBSI. REFERENCES. (1) Eggimann P, Harbarth S, Constantin MN, Touveneau S, Chevrolet JC, Pittet D. Impact of a prevention strategy targeted at vascular-access care on incidence of infections acquired in intensive care. Lancet 2000; 355:1864-1868. (2) Timsit JF, Schwebel C, Bouadma L, Geffroy A, Garrouste-Org, Pease S et al. Chlorhexidine-impregnated sponges and less frequent dressing changes for prevention of catheter-related infections in critically ill adults: a randomized controlled trial. JAMA 2009; 301(12):1231-1241.
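The headline figures are rates per 1000 CVC-days; a minimal sketch of that arithmetic and of a rate-ratio confidence interval is given below. The counts are placeholders chosen only to be of the same order as the reported rates, not the study data.

```python
# Sketch: incidence rate per 1000 CVC-days and incidence rate ratio (IRR)
# with a normal-approximation CI on the log scale. Counts are placeholders.
import math

def rate_per_1000(events: int, cvc_days: int) -> float:
    return 1000.0 * events / cvc_days

def rate_ratio_ci(e1: int, d1: int, e2: int, d2: int, z: float = 1.96):
    irr = (e2 / d2) / (e1 / d1)
    se = math.sqrt(1 / e1 + 1 / e2)            # SE of log(IRR)
    return irr, (irr * math.exp(-z * se), irr * math.exp(z * se))

# Hypothetical example: 63 infections over 17,000 CVC-days before and
# 46 infections over 21,000 CVC-days after the new dressing guidelines.
print(rate_per_1000(63, 17_000), rate_per_1000(46, 21_000))
print(rate_ratio_ci(63, 17_000, 46, 21_000))
```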

Abstract:

Recognition and identification processes for deceased persons. Determining the identity of deceased persons is a routine task performed essentially by police departments and forensic experts. This thesis highlights the processes necessary for the proper and transparent determination of the civil identities of deceased persons. The identity of a person is defined as the establishment of a link between that person ("the source") and information pertaining to the same individual ("identifiers"). Various forms of identity can emerge, depending on the nature of the identifiers; two distinct types are retained, namely civil identity and biological identity. The thesis examines four processes: identification by witnesses (the recognition process) and comparisons of fingerprints, dental data and DNA profiles (the identification processes). For the recognition process, the functioning of memory is examined, which helps to clarify the circumstances that may give rise to errors; to make the process more rigorous, a body-presentation procedure is proposed to investigators. Before examining the other processes, three general concepts specific to forensic science are considered with regard to the identification of a deceased person, namely matter divisibility (Inman and Rudin), transfer (Locard) and uniqueness (Kirk). These concepts can be applied to the task at hand, although some require a slightly broader scope of application. A cross-comparison of common forensic fields and the identification of deceased persons reveals certain differences, including 1 - reverse positioning of the source (i.e. the source is not sought from traces; rather, the identifiers are obtained from the source); 2 - the need for civil identity determination in addition to the individualisation stage; and 3 - a more restricted population (closed set), rather than an open one. For fingerprints, dental and DNA data, intravariability and intervariability are examined, as well as changes in these post-mortem (PM) identifiers; ante-mortem (AM) identifiers are located and AM-PM comparisons made. For DNA, it is shown that direct identifiers (taken from a person whose civil identity has been alleged) tend to lead to a determination of civil identity, whereas indirect identifiers (obtained from a close relative) point towards a determination of biological identity. For each process, a Bayesian model is presented which includes the sources of uncertainty deemed relevant, and the results of the different processes are combined into an overall outcome and methodology. The modelling of dental data presents a specific difficulty with respect to intravariability, which is not in itself quantifiable; the concept of "validity", which draws on parameters with an acknowledged impact on dental intravariability, is therefore suggested as a possible solution. In cases where identifying deceased persons proves extremely difficult owing to the limited discrimination of certain procedures, a Bayesian approach is of great value in providing a transparent and synthetic assessment.
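A minimal sketch of the general Bayesian combination logic referred to above (not the thesis's specific models): in a closed set of N candidates, prior odds are updated by the likelihood ratios contributed by each process, assuming conditional independence of the processes; the numbers are illustrative.

```python
# Sketch of combining evidence across identification processes.
# Assumes conditional independence given the identity hypothesis;
# the LR values are illustrative, not taken from the thesis.
def posterior_probability(n_candidates: int, likelihood_ratios: list[float]) -> float:
    prior_odds = 1.0 / (n_candidates - 1)      # closed set, uniform prior
    posterior_odds = prior_odds
    for lr in likelihood_ratios:
        posterior_odds *= lr
    return posterior_odds / (1.0 + posterior_odds)

# e.g. 200 possible victims, with LRs from a dental comparison and DNA kinship
print(posterior_probability(200, [150.0, 5000.0]))   # ≈ 0.9997
```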

Abstract:

We launched a cryptoendolithic habitat, made of a gneissic impactite inoculated with Chroococcidiopsis sp., into Earth orbit. After orbiting the Earth for 16 days, the rock entered the Earth's atmosphere and was recovered in Kazakhstan. The heat of entry ablated the rock and heated it to temperatures well above the upper temperature limit for life down to below the depth at which light levels become insufficient for photosynthetic organisms (approximately 5 mm), thus killing all of its photosynthetic inhabitants. This experiment shows that atmospheric transit acts as a strong biogeographical dispersal filter on the interplanetary transfer of photosynthesis. Following atmospheric entry we found that a transparent, glassy fusion crust had formed on the outside of the rock. Re-inoculated Chroococcidiopsis grew preferentially under the fusion crust, in the relatively unaltered gneiss beneath, and organisms under the fusion crust grew approximately twice as fast as those on the control rock. Thus, the biologically destructive effects of atmospheric transit can generate entirely novel and improved endolithic habitats for organisms on the destination planetary body that survive the dispersal filter. The experiment advances our understanding of how island biogeography works on the interplanetary scale.

Abstract:

Corneal integrity and transparency are indispensable for good vision. Cornea homeostasis is entirely dependent upon corneal stem cells, which are required for complex wound-healing processes that restore corneal integrity following epithelial damage. Here, we found that leucine-rich repeats and immunoglobulin-like domains 1 (LRIG1) is highly expressed in the human holoclone-type corneal epithelial stem cell population and sporadically expressed in the basal cells of ocular-surface epithelium. In murine models, LRIG1 regulated corneal epithelial cell fate during wound repair. Deletion of Lrig1 resulted in impaired stem cell recruitment following injury and promoted a cell-fate switch from transparent epithelium to keratinized skin-like epidermis, which led to corneal blindness. In addition, we determined that LRIG1 is a negative regulator of the STAT3-dependent inflammatory pathway. Inhibition of STAT3 in corneas of Lrig1-/- mice rescued pathological phenotypes and prevented corneal opacity. Additionally, transgenic mice that expressed a constitutively active form of STAT3 in the corneal epithelium had abnormal features, including corneal plaques and neovascularization similar to that found in Lrig1-/- mice. Bone marrow chimera experiments indicated that LRIG1 also coordinates the function of bone marrow-derived inflammatory cells. Together, our data indicate that LRIG1 orchestrates corneal-tissue transparency and cell fate during repair, and identify LRIG1 as a key regulator of tissue homeostasis.

Abstract:

The athlete biological passport (ABP) was recently implemented in anti-doping work and is based on the individual and longitudinal monitoring of haematological or urinary markers. These markers may be influenced by illicit procedures performed by some athletes with the intent to improve exercise performance, which makes the ABP a valuable tool in the fight against doping. The passport is defined as an individual and longitudinal record of markers that belong to the biological cascade influenced by the administration of forbidden hormones or, more generally, that are affected by biological manipulations capable of improving an athlete's performance. So far, the haematological and steroid-profile modules of the ABP have been implemented in major sport organisations, and a further module is under development. The individual and longitudinal monitoring of blood and urine markers is of interest because intra-individual variability is lower than the corresponding inter-individual variability. Among the key prerequisites for the implementation of the ABP is its ability to withstand legal and scientific challenges. The ABP should therefore be implemented in the most transparent way and with the necessary independence between the planning, interpretation and result management of the passport. To ensure this, the Athlete Passport Management Unit (APMU) was developed, and WADA issued a series of technical documents associated with the passport, so that profiles are produced and managed in a way that can withstand scientific or legal criticism. This goal can be reached only by strictly following key steps in the chain of production of the results and in the management of their interpretation; the technical documents accompany guidelines that set out the requirements for passport operation. The ABP has very recently been completed by the steroid-profile module. As for the haematological module, individual and longitudinal monitoring is applied and the interpretation cascade is managed by a specific APMU in a similar way. Thus, after exclusion of any possible pathology, a specific deviation from the individual norms is considered a potential misuse of hormones or other modulators to enhance performance.
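A toy illustration of why longitudinal monitoring pays off when intra-individual variability is smaller than inter-individual variability (a simplified normal-normal update, not WADA's adaptive model; all numbers are placeholders): as an athlete's own results accumulate, the individual reference range narrows from population-wide limits towards limits governed by the within-athlete variability.

```python
# Toy Bayesian update of an individual reference range for one marker.
# Population prior: mean mu0, between-athlete sd tau; within-athlete sd sigma.
# Simplified normal-normal sketch, not the ABP's actual adaptive model.
import math

def individual_limits(samples, mu0=45.0, tau=3.5, sigma=1.5, z=2.576):
    """Approximate 99% predictive limits for the athlete's next value."""
    n = len(samples)
    post_var = 1.0 / (1.0 / tau**2 + n / sigma**2)
    post_mean = post_var * (mu0 / tau**2 + sum(samples) / sigma**2)
    pred_sd = math.sqrt(post_var + sigma**2)
    return post_mean - z * pred_sd, post_mean + z * pred_sd

print(individual_limits([]))                  # ≈ wide, population-based limits
print(individual_limits([43.8, 44.5, 44.1]))  # narrower, centred on the athlete
```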