859 results for Law, John: After method : mess in social science research
Abstract:
Glucose-induced thermogenesis (GIT) after a 100-g oral glucose load was measured by continuous indirect calorimetry in 32 nondiabetic and diabetic obese subjects and compared with 17 young and 13 middle-aged control subjects. The obese subjects were divided into three groups: A (n = 12), normal glucose tolerance; B (n = 13), impaired glucose tolerance; and C (n = 7), diabetic. They were studied before and after a body weight loss ranging from 9.6 to 33.5 kg following a 4- to 6-month hypocaloric diet. GIT, measured over 3 h and expressed as a percentage of the energy content of the load, was significantly reduced in obese groups A and C (6.2 +/- 0.6 and 3.8 +/- 0.7%, respectively) compared with their age-matched control groups: 8.6 +/- 0.7% (young) and 5.8 +/- 0.3% (middle-aged). Obese group B had a GIT of 6.1 +/- 0.6%, which was lower than that of the young control group but not different from that of the middle-aged control group. After weight loss, GIT in the obese subjects was further reduced in groups A and B relative to before weight loss: 3.4 +/- 0.6% (p < 0.001) and 3.7 +/- 0.5% (p < 0.01), respectively, whereas in group C weight loss induced no further diminution in GIT (3.8 +/- 0.6%). These results support the concept of a thermogenic defect after glucose ingestion in obese individuals that is not a consequence of their excess body weight but may be one of the factors favoring the relapse of obesity after weight loss.
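As an illustration of how GIT can be expressed as a percentage of the energy content of the load, the short sketch below works through the arithmetic with assumed numbers; the glucose energy density and the calorimetry readings are placeholders, not values from the study.

```python
# Illustrative arithmetic only: the energy-expenditure values below are assumed,
# not data from the study.
GLUCOSE_KCAL_PER_G = 3.75                        # approximate energy density of glucose
load_g = 100                                     # oral glucose load (g)
load_energy_kcal = load_g * GLUCOSE_KCAL_PER_G   # ~375 kcal

# Hypothetical indirect-calorimetry readings over the 3-h postprandial period
resting_ee_kcal_per_h = 70                       # assumed resting energy expenditure
postprandial_ee_kcal = 233                       # assumed total expenditure over 3 h

extra_kcal = postprandial_ee_kcal - 3 * resting_ee_kcal_per_h   # thermogenic response
git_percent = 100 * extra_kcal / load_energy_kcal

print(f"GIT = {git_percent:.1f}% of the load")   # ~6.1% with these assumed numbers
```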
Abstract:
Coordination games are important for explaining efficient and desirable social behavior. Here we study these games by extensive numerical simulation on networked social structures using an evolutionary approach. We show that local network effects may promote selection of efficient equilibria in both pure and general coordination games and may explain social polarization. These results are put into perspective with respect to known theoretical results. The main insight we obtain is that clustering, and especially community structure, in social networks plays a positive role in promoting socially efficient outcomes.
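By way of illustration, the sketch below runs the kind of evolutionary simulation the abstract describes: agents on a clustered network play a pure coordination game with their neighbours and copy the best-performing neighbour. The network model, payoff rule and update rule are assumptions chosen for the example, not the authors' exact setup.

```python
# Minimal sketch (assumed: small-world network, imitate-the-best update rule);
# not the authors' exact model.
import random
import networkx as nx

random.seed(0)
G = nx.watts_strogatz_graph(100, 4, 0.1)          # illustrative clustered network
strategy = {v: random.choice("AB") for v in G}    # pure coordination: matching pays off

def payoff(v):
    # One unit for each neighbour playing the same strategy (payoff matrix assumed).
    return sum(strategy[v] == strategy[u] for u in G.neighbors(v))

for _ in range(50):                               # synchronous imitation dynamics
    scores = {v: payoff(v) for v in G}
    new = {}
    for v in G:
        best = max(list(G.neighbors(v)) + [v], key=lambda u: scores[u])
        new[v] = strategy[best]                   # copy the best-performing neighbour
    strategy = new

share_A = sum(s == "A" for s in strategy.values()) / G.number_of_nodes()
print(f"fraction playing A after the run: {share_A:.2f}")
```

With clustered structures such as this one, local pockets of agents typically converge on the same convention, which is the qualitative effect the abstract attributes to community structure.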
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and that they be compared in an objective and automated way. The latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and for the interpretation of their evidential value.
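For illustration only, the sketch below scores the similarity of two ink profiles with a Pearson correlation and converts the score into a likelihood ratio against assumed within-source and between-source score distributions. The profiles, the score metric and the distribution parameters are placeholders, not the algorithms developed in the cited papers.

```python
# Illustrative only: the score metric and the score distributions are assumed,
# not the comparison algorithms described in the series.
import numpy as np
from scipy.stats import norm, pearsonr

questioned = np.array([0.12, 0.55, 0.20, 0.08, 0.05])   # hypothetical normalised HPTLC profile
reference  = np.array([0.10, 0.57, 0.19, 0.09, 0.05])   # hypothetical library entry

score, _ = pearsonr(questioned, reference)

# Score-based likelihood ratio under assumed Gaussian score distributions:
# numerator   = density of the score when both inks share a common source,
# denominator = density of the score when they come from different sources.
lr = norm.pdf(score, loc=0.99, scale=0.01) / norm.pdf(score, loc=0.80, scale=0.10)
print(f"similarity = {score:.3f}, likelihood ratio ~ {lr:.1f}")
```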
Abstract:
Polyclonal intravenous immunoglobulin (IVIg) treatment reduces crossmatch positivity and increases rates of transplantation in highly sensitised (HS) patients. We quantified the panel reactive antibody (PRA) by microlymphocytotoxicity (MLCC), and we analysed the repertoire of specific anti-HLA class I and class II IgG antibodies by Luminex before and after IVIg infusion alone in HS patients awaiting kidney transplantation. Five patients received three monthly infusions of 1 g/kg of IVIg. Serum samples collected pre- and post-IVIg treatment were submitted for PRA analysis by MLCC. Anti-class I and anti-class II antibody specificities were then tested by Luminex. We focused on the anti-HLA class I and class II antibodies directed against HLA expressed by a previous graft. We also analysed the anti-HLA antibody repertoire in three patients who had not received IVIg infusion. The PRA level determined by MLCC decreased significantly in one of the five patients, dropping from 40% to 17%. The Luminex assay showed fluctuations of the anti-HLA antibody levels over time, but no significant long-term modifications of the anti-HLA antibody repertoire were observed, even in the patient with a strong and prolonged reduction of the PRA determined by MLCC. Our results show that IVIg at 1 g/kg is not sufficient to reduce PRA and does not modify the repertoire of specific anti-HLA antibodies determined by Luminex.
Abstract:
This thesis studies means of formalisation that can assist the forensic scientist in managing the factors influencing the evaluation of scientific evidence, while respecting established and acceptable inference procedures. According to a view advocated by a majority of the forensic and legal literature - adopted here without reservation as a starting point - the conceptualisation of an evaluative procedure is said to be 'coherent' when it rests on a systematic implementation of probability theory. Often, however, the application of probabilistic reasoning is not automatic and may run into problems of complexity, due, for example, to limited knowledge of the domain in question or to the large number of factors that may come into play. To handle such complications, this work investigates a formalisation of probability theory by means of a graphical environment known as Bayesian networks. The main hypothesis examined is that Bayesian networks, together with certain accessory concepts (such as qualitative and sensitivity analyses), constitute a key resource available to the forensic scientist for approaching inference problems coherently, both conceptually and practically. From this working hypothesis, individual problems were extracted, articulated and addressed in a series of distinct but interconnected studies, whose results - published in peer-reviewed journals - are presented as appendices. Overall, this work provides three categories of results. A first group of results shows, on the basis of numerous examples from diverse forensic domains, the fit, in terms of compatibility and complementarity, between Bayesian network models and existing probabilistic evaluation procedures. Building on these indications, the two other categories of results show, respectively, that Bayesian networks also make it possible to address domains previously left largely unexplored from a probabilistic point of view, and that the availability of so-called 'hard' numerical data is not an indispensable condition for implementing the approaches proposed in this work. The thesis discusses these results with respect to the current literature and concludes by proposing Bayesian networks as a means of exploring new research avenues, such as the study of various forms of evidence combination and the analysis of decision making. For this last aspect, probability assessment, as advocated in this work, constitutes both a fundamental preliminary step and an operational tool.
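To illustrate the kind of probabilistic machinery the thesis builds on, the sketch below evaluates a toy two-node Bayesian network (source proposition -> analytical finding) by direct enumeration and reports the resulting likelihood ratio. The structure and all conditional probabilities are invented for illustration and do not come from the thesis.

```python
# Toy two-node Bayesian network for evidence evaluation.
# H: "the trace comes from the suspect" (True/False); E: "a match is reported".
# All probabilities below are invented for illustration.
p_H = 0.5                       # prior on the source proposition (illustrative)
p_E_given_H = {True: 0.95,      # P(match | same source)
               False: 0.02}     # P(match | different source)

def posterior(evidence=True):
    # Bayes' theorem by enumeration over the single parent node H.
    joint_true  = p_H       * (p_E_given_H[True]  if evidence else 1 - p_E_given_H[True])
    joint_false = (1 - p_H) * (p_E_given_H[False] if evidence else 1 - p_E_given_H[False])
    return joint_true / (joint_true + joint_false)

likelihood_ratio = p_E_given_H[True] / p_E_given_H[False]
print(f"LR = {likelihood_ratio:.1f}, P(H | match) = {posterior(True):.3f}")
```

In realistic casework models the network contains many more nodes (transfer, persistence, background levels, and so on), but the evaluation principle, summing joint probabilities consistent with the observed evidence, is the same.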
Abstract:
The aim of this work is to study the influence of several analytical parameters on the variability of Raman spectra of paint samples. In the present study, microtome thin sectioning and direct analysis (no preparation) are considered as sample preparation methods. In order to evaluate their influence on the measurements, a fractional factorial experimental design with seven factors (including the sampling process) is applied, for a total of 32 experiments representing 160 measurements. Once the influence of sample preparation has been highlighted, a depth profile of a paint sample is carried out by changing the focusing plane in order to measure the colored layer under a clearcoat. This is undertaken in order to avoid sample preparation such as microtome sectioning. Finally, chemometric treatments such as principal component analysis are applied to the resulting spectra. The findings of this study indicate the importance of sample preparation, or more specifically surface roughness, on the variability of the measurements on the same sample. Moreover, the depth profile experiment highlights the influence of the refractive index of the upper layer (clearcoat) when measuring through a transparent layer.
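As a minimal illustration of applying principal component analysis to spectra, the sketch below projects a small set of simulated spectra onto their first two principal components; the synthetic data and the scikit-learn call are assumptions for illustration, not the study's actual processing chain.

```python
# Minimal PCA sketch on simulated spectra (synthetic data, not the study's measurements).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavenumbers = np.linspace(200, 1800, 400)            # illustrative Raman shift axis (cm-1)

def fake_spectrum(peak, roughness):
    # A single Gaussian band plus noise whose level stands in for surface roughness.
    return np.exp(-((wavenumbers - peak) / 30) ** 2) + roughness * rng.normal(size=wavenumbers.size)

# Two preparation conditions of the same nominal sample, five replicates each.
spectra = np.vstack([fake_spectrum(1000, 0.02) for _ in range(5)] +
                    [fake_spectrum(1000, 0.10) for _ in range(5)])

scores = PCA(n_components=2).fit_transform(spectra)  # project onto first two components
print(scores.round(2))                               # replicate scatter reflects measurement variability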
Abstract:
Jasmonates, potent lipid mediators of defense gene expression in plants, are rapidly synthesized in response to wounding. These lipid mediators also stimulate their own production via a positive feedback circuit, which depends on both JA synthesis and JA signaling. To date, the molecular components regulating the activation of jasmonate biogenesis and its feedback loop have been poorly characterized. We employed a genetic screen capable of detecting misregulated activity of 13-lipoxygenase, which operates at the entry point of the jasmonate biosynthesis pathway. Leaf extracts from the Arabidopsis fou2 (fatty acid oxygenation upregulated 2) mutant displayed an increased capacity to catalyze the synthesis of lipoxygenase (LOX) metabolites. Quantitative oxylipin analysis identified jasmonate levels in healthy fou2 leaves that were less than twofold higher than in the wild type; wounded fou2 leaves, however, showed strongly increased jasmonate biogenesis compared with wounded wild-type leaves. Furthermore, fou2 plants displayed enhanced resistance to the fungus Botrytis cinerea. The higher-than-wild-type LOX activity and the enhanced resistance of the fou2 mutant depend fully on a functional jasmonate response pathway. The fou2 mutant carries a missense mutation in the putative voltage sensor of the Two Pore Channel 1 gene (TPC1), which encodes a Ca(2+)-permeant non-selective cation channel. Patch-clamp analysis of fou2 vacuolar membranes showed faster time-dependent conductivity and activation of the mutated channel at lower membrane potentials than in the wild type. The results indicate that cation fluxes exert strong control over the positive feedback loop whereby JA stimulates its own synthesis.
Abstract:
The objective of this work was to evaluate the effect of biochar made from Eucalyptus on soil fertility and on the yield and development of upland rice. The experiment was performed over two years in a randomized block design with four replicates, in a sandy loam Dystric Plinthosol. Four doses of NPK 05-25-15, distributed annually in strips (0, 100, 200 and 300 kg ha-1), and four doses of biochar (0, 8, 16 and 32 Mg ha-1), applied once in the first year - alone or with NPK - were evaluated. In the first year, biochar positively affected soil fertility [total organic carbon (TOC), Ca, P, Al, H+Al, and pH] at 0-10 cm soil depth, and it was the only factor with a significant effect on yield. In the second year, the effect of biochar diminished or was overcome by that of the fertilizer. TOC moved down the soil profile to the 0-20 cm depth, influencing K availability in this layer. In the second year, there was a significant interaction between biochar and the fertilizer on plant growth and biomass dry matter accumulation.
Abstract:
Web 2.0 services such as social bookmarking allow users to manage and share the links they find interesting, adding their own tags for describing them. This is especially interesting in the field of open educational resources, as delicious is a simple way to bridge the institutional point of view (i.e. learning object repositories) with the individual one (i.e. personal collections), thus promoting the discovering and sharing of such resources by other users. In this paper we propose a methodology for analyzing such tags in order to discover hidden semantics (i.e. taxonomies and vocabularies) that can be used to improve descriptions of learning objects and make learning object repositories more visible and discoverable. We propose the use of a simple statistical analysis tool such as principal component analysis to discover which tags create clusters that can be semantically interpreted. We will compare the obtained results with a collection of resources related to open educational resources, in order to better understand the real needs of people searching for open educational resources.
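To sketch the idea of running principal component analysis over tagging data, the fragment below builds a small resource-by-tag matrix and inspects the component loadings to see which tags group together. The tag vocabulary, the matrix and the use of scikit-learn are illustrative assumptions, not the paper's dataset or toolchain.

```python
# Toy resource-by-tag matrix (rows: bookmarked resources, columns: tags);
# the data and tag vocabulary are invented for illustration.
import numpy as np
from sklearn.decomposition import PCA

tags = ["oer", "opencourseware", "physics", "mechanics", "license", "creativecommons"]
X = np.array([
    [1, 1, 0, 0, 0, 0],
    [1, 1, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [1, 0, 0, 0, 1, 1],
])

pca = PCA(n_components=2).fit(X)
for comp in pca.components_:                      # loadings show which tags co-vary
    cluster = [t for t, w in zip(tags, comp) if abs(w) > 0.4]
    print(cluster)                                # tags loading together suggest a shared topic
```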
Abstract:
The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis for calibration, sequence design, standards utilisation and data treatment, without a clear consensus. Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of sufficient quality and robustness to proceed to retrospective analyses or interlaboratory comparisons.
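For context, isotope ratios measured by IRMS are conventionally reported in delta notation relative to an international standard, delta = (R_sample / R_standard - 1) x 1000, in per mil. The short sketch below applies that standard formula to assumed ratio values; the measured ratio is hypothetical.

```python
# Delta notation: delta = (R_sample / R_standard - 1) * 1000, in per mil.
# The measured ratio below is assumed for illustration.
def delta_per_mil(r_sample, r_standard):
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB_13C = 0.0111802                  # commonly cited 13C/12C ratio of the VPDB standard
r_sample = 0.0108772                    # hypothetical measured 13C/12C ratio

print(f"delta13C = {delta_per_mil(r_sample, R_VPDB_13C):.1f} per mil")   # about -27.1
```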
Abstract:
Answering patients' evolving, more complex needs has been recognized as a main incentive for the development of interprofessional care. Thus, it is not surprising that patient-centered practice (PCP) has been adopted as a major outcome for interprofessional education. Nevertheless, little research has focused on how PCP is perceived across the professions. This study aimed to address this issue by adopting a phenomenological approach and interviewing three groups of professionals: social workers (n = 10), nurses (n = 10) and physicians (n = 8). All the participants worked in the same department (the General Internal Medicine department of a university-affiliated hospital). Although the participants agreed on a core meaning of PCP as identifying, understanding and answering patients' needs, they used many dimensions to define PCP. Overall, the participants valued PCP as a philosophy of care, but there was a sense of a hierarchy of patient-centeredness across the professions, in which both social work and nursing regarded themselves as more patient-centered than the other professions. For their part, physicians seemed inclined to accept their lower position in this hierarchy. Gieryn's concept of boundary work is employed to help illuminate the nature of PCP within an interprofessional context.