887 results for Analysis and statistical methods
Abstract:
Pharmacogenetics, the study of how individual genetic profiles influence the response to drugs, is an important topic. Results from pharmacogenetic studies in various clinical settings may lead to personalized medicine. Herein, we present the most important concepts of this discipline, as well as currently used study methods.
Abstract:
The aim of this research was to evaluate how fingerprint analysts would incorporate information from newly developed tools into their decision-making processes. Specifically, we assessed effects using the following: (1) a quality tool to aid in the assessment of the clarity of the friction ridge details, (2) a statistical tool to provide likelihood ratios representing the strength of the corresponding features between compared fingerprints, and (3) consensus information from a group of trained fingerprint experts. The measured variables for the effect on examiner performance were the accuracy and reproducibility of the conclusions against the ground truth (including the impact on error rates) and the analyst accuracy and variation for feature selection and comparison. The results showed that participants using the consensus information from other fingerprint experts demonstrated more consistency and accuracy in minutiae selection. They also demonstrated higher accuracy, sensitivity, and specificity in the decisions reported. The quality tool also affected minutiae selection (which, in turn, had limited influence on the reported decisions); the statistical tool did not appear to influence the reported decisions.
Abstract:
Whereas the reduction of transfusion-related viral transmission has been a priority during the last decade, bacterial infection transmitted by transfusion remains associated with high morbidity and mortality and constitutes the most frequent infectious risk of transfusion. This problem especially concerns platelet concentrates because of their favorable bacterial growth conditions. This review gives an overview of platelet transfusion-related bacterial contamination, as well as of the different strategies to reduce this problem using either bacterial detection or inactivation methods.
Abstract:
OBJECTIVES: Elevated plasma levels of the elastase alpha 1-proteinase inhibitor complex (E-alpha 1 PI) have been proposed as a marker of bacterial infection and neutrophil activation. Liberation of elastase from neutrophils after collection of blood may cause falsely elevated results. Collection methods have not been validated for critically ill neonates and children. We evaluated the influence of preanalytical methods on E-alpha 1 PI results, including the recommended collection into EDTA tubes. DESIGN AND METHODS: First, we compared varying acceleration speeds and centrifugation times. Centrifugation at 1550 g for 3 min resulted in reliable preparation of leukocyte-free plasma. Second, we evaluated all collection tubes under consideration for absorption of E-alpha 1 PI. Finally, 12 sets of samples from healthy adults and 42 sets obtained from critically ill neonates and children were distributed into the various sampling tubes. Samples were centrifuged within 15 min of collection and analyzed with a new turbidimetric assay adapted to routine laboratory analyzers. RESULTS: One of the two tubes containing a plasma-cell separation gel absorbed 22.1% of the E-alpha 1 PI content. In the remaining tubes without absorption of E-alpha 1 PI, no differences were observed for samples from healthy adults. However, in samples from critically ill neonates or children, significantly higher results were obtained for plain Li-heparin tubes (mean = 183 micrograms/L), EDTA tubes (mean = 93 micrograms/L), and citrate tubes (mean = 88.5 micrograms/L) than for the Li-heparin tube with cell-plasma separation gel and no absorption of E-alpha 1 PI (mean = 62.4 micrograms/L, p < 0.01). CONCLUSION: In contrast to healthy adults, E-alpha 1 PI results in plasma samples from critically ill neonates and children depend on the type of collection tube.
Abstract:
Thermal analysis, powder diffraction, and Raman scattering as a function of temperature were carried out on K2BeF4. Moreover, the crystal structure was determined at 293 K from powder diffraction. The compound shows a transition from the Pna21 to the Pnam space group at 921 K, with a transition enthalpy of 5 kJ/mol. The transition is assumed to be first order because the compound shows metastability. Structurally and spectroscopically the transition is similar to that observed in (NH4)2SO4, which suggests that the low-temperature phase is ferroelectric. To confirm this, the spontaneous polarization has been computed using an ionic model.
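For orientation, an ionic (point-charge) estimate of the spontaneous polarization typically takes the form below; the notation is illustrative and not taken from the paper itself:

\[
  \mathbf{P}_s \;=\; \frac{1}{V}\sum_{i} Z_i e\,\Delta\mathbf{u}_i ,
\]

where V is the unit-cell volume, Z_i e is the nominal charge assigned to ion i, and \Delta\mathbf{u}_i is its displacement from the corresponding position in the centrosymmetric (paraelectric) reference structure.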
Abstract:
Leakage detection is an important issue in many chemical sensing applications. Leakage detection by thresholds suffers from important drawbacks when sensors have serious drifts or are affected by cross-sensitivities. Here we present an adaptive method based on Dynamic Principal Component Analysis that models the relationships between the sensors in the array. In normal conditions a certain variance distribution characterizes the sensor signals. However, in the presence of a new source of variance the PCA decomposition changes drastically. To prevent the influence of sensor drift, the model is adaptive and is calculated recursively with minimum computational effort. The behavior of this technique is studied with synthetic signals and with real signals arising from oil vapor leakages in an air compressor. Results clearly demonstrate the efficiency of the proposed method.
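A minimal sketch of the idea described in this abstract (time-lagged data matrix, recursively updated PCA model, residual-based alarm) is given below. It is an illustration only, not the authors' algorithm: the lag construction, the exponential forgetting factor, and the Q-statistic threshold are assumptions chosen for the example, and the per-sample eigendecomposition is used for clarity rather than for minimal computational cost.

import numpy as np

def lag_matrix(X, lags=2):
    """Stack time-lagged copies of the sensor signals (the 'dynamic' part of DPCA)."""
    n = X.shape[0]
    return np.hstack([X[l:n - lags + l, :] for l in range(lags + 1)])

def dpca_leak_monitor(X, n_components=2, forget=0.99, q_limit=None):
    """Recursively updated PCA residual (Q / squared prediction error) monitor.

    X        : (samples x lagged sensors) data matrix from lag_matrix()
    forget   : exponential forgetting factor; lets the model absorb slow drift
    q_limit  : residual threshold; a leak is flagged when Q exceeds it
    """
    mean = X[0].copy()
    cov = np.eye(X.shape[1]) * 1e-3
    q_stats = []
    for x in X:
        # Recursive mean/covariance update with forgetting (adapts to drift).
        mean = forget * mean + (1.0 - forget) * x
        d = x - mean
        cov = forget * cov + (1.0 - forget) * np.outer(d, d)
        # Principal subspace of the current covariance estimate.
        _, vecs = np.linalg.eigh(cov)
        P = vecs[:, -n_components:]            # retained loadings
        resid = d - P @ (P.T @ d)              # part not explained by the model
        q_stats.append(float(resid @ resid))   # Q statistic (SPE)
    q = np.asarray(q_stats)
    if q_limit is None:
        # Simple data-driven limit from an assumed fault-free training period.
        train = q[: len(q) // 2]
        q_limit = train.mean() + 3.0 * train.std()
    return q, q > q_limit

In such a scheme a leak shows up as a sustained run of above-threshold Q values once a new variance source breaks the learned correlations between sensors, while slow drift is tracked by the forgetting factor and therefore does not trigger the alarm.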
Abstract:
Ground-penetrating radar (GPR) and microgravimetric surveys have been conducted in the southern Jura mountains of western Switzerland in order to map subsurface karstic features. The study site, La Grande Rolaz cave, is an extensive system in which many portions have been mapped. By using small station spacing and careful processing for the geophysical data, and by modeling these data with topographic information from within the cave, accurate interpretations have been achieved. The constraints on the interpreted geologic models are better when combining the geophysical methods than when using only one of the methods, despite the general limitations of two-dimensional (2D) profiling. For example, microgravimetry can complement GPR methods for accurately delineating a shallow cave section approximately 10 × 10 m in size. Conversely, GPR methods can be complementary in determining cavity depths and in verifying the presence of off-line features and numerous areas of small cavities and fractures, which may be difficult to resolve in microgravimetric data.
Abstract:
This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning amounts to an application that requires extensive quantitative information along with means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering prior to focusing on particular numerical probability assignments. Analyses are proposed that intend to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations indicates that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute. This case serves to exemplify the possible usage of data from various sources in casework and helps to discuss the difficulty of reconciling the depth of theoretical likelihood ratio developments with the limited degree to which these developments can actually be applied in practice.
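For readers unfamiliar with the quantity at issue, the source-level likelihood ratio and the point about relative versus absolute assignments can be stated compactly; the notation below is ours, not the article's:

\[
  \mathrm{LR} \;=\; \frac{\Pr(E \mid H_p, I)}{\Pr(E \mid H_d, I)},
\]

where E denotes the observed correspondence, H_p and H_d the competing source-level propositions, and I the background information. If the numerator and denominator are both rescaled by a common factor c > 0, the ratio is unchanged; it is the relative magnitude of the two probability assignments, not their absolute values, that carries the evidential weight.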
Abstract:
The biological properties of wild-type A75/17 and cell culture-adapted Onderstepoort canine distemper virus differ markedly. To learn more about the molecular basis for these differences, we have isolated and sequenced the protein-coding regions of the attachment and fusion proteins of wild-type canine distemper virus strain A75/17. In the attachment protein, a total of 57 amino acid differences were observed between the Onderstepoort strain and strain A75/17, and these were distributed evenly over the entire protein. Interestingly, the attachment protein of strain A75/17 contained an extension of three amino acids at the C terminus. Expression studies showed that the attachment protein of strain A75/17 had a higher apparent molecular mass than the attachment protein of the Onderstepoort strain, in both the presence and absence of tunicamycin. In the fusion protein, 60 amino acid differences were observed between the two strains, of which 44 were clustered in the much smaller F2 portion of the molecule. Significantly, the AUG that has been proposed as a translation initiation codon in the Onderstepoort strain is an AUA codon in strain A75/17. Detailed mutation analyses showed that both the first and second AUGs of strain A75/17 are the major translation initiation sites of the fusion protein. Similar analyses demonstrated that, also in the Onderstepoort strain, the first two AUGs are the translation initiation codons which contribute most to the generation of precursor molecules yielding the mature form of the fusion protein.
Abstract:
Interaction analysis is not the prerogative of any single discipline in the social sciences. It has its own history within each disciplinary field and is related to specific research objects. From the standpoint of psychology, this article first draws upon a distinction between factorial and dialogical conceptions of interaction. It then briefly presents the basis of a dialogical approach in psychology and focuses upon four basic assumptions. Each of them is examined on a theoretical and on a methodological level with a leading question: to what extent is it possible to develop analytical tools that are fully coherent with dialogical assumptions? The conclusion stresses the difficulty of developing methodological tools that are fully consistent with dialogical assumptions and argues that there is an unavoidable tension between accounting for the complexity of an interaction and using methodological tools which necessarily "monologise" this complexity.
Abstract:
The spatial variability of soil and plant properties exerts great influence on the yield of agricultural crops. This study analyzed the spatial variability of the fertility of a Humic Rhodic Hapludox under Arabica coffee, using principal component analysis, cluster analysis, and geostatistics in combination. The experiment was carried out in an area under Coffea arabica L., variety Catucai 20/15 - 479. The soil was sampled at a depth of 0.20 m at 50 points of a sampling grid. The following chemical properties were determined: P, K+, Ca2+, Mg2+, Na+, S, Al3+, pH, H + Al, SB, t, T, V, m, OM, Na saturation index (SSI), remaining phosphorus (P-rem), and micronutrients (Zn, Fe, Mn, Cu and B). The data were analyzed with descriptive statistics, followed by principal component and cluster analyses. Geostatistics was used to check and quantify the degree of spatial dependence of the properties represented by the principal components. The principal component analysis allowed a dimensional reduction of the problem, providing interpretable components with little information loss. Despite the information loss inherent to principal component analysis, the combination of this technique with geostatistical analysis was efficient for quantifying and determining the structure of spatial dependence of soil fertility. In general, the availability of soil mineral nutrients was low and the levels of acidity and exchangeable Al were high.
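As a rough illustration of the combined workflow described in this abstract (PCA on standardized soil properties, followed by an empirical semivariogram of the retained component scores), a minimal Python sketch is given below. The array names, the 50-point grid, and the classical Matheron estimator are assumptions for the example and are not taken from the study.

import numpy as np
from sklearn.decomposition import PCA

def pc_scores(props, n_components=3):
    """Standardize the soil properties and reduce them to a few interpretable components."""
    z = (props - props.mean(axis=0)) / props.std(axis=0)
    pca = PCA(n_components=n_components)
    return pca.fit_transform(z), pca.explained_variance_ratio_

def empirical_variogram(coords, values, n_bins=10):
    """Classical (Matheron) semivariogram of one principal-component score."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    g = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices_from(d, k=1)
    bins = np.linspace(0.0, d[iu].max(), n_bins + 1)
    lags, gamma = [], []
    for lo, hi in zip(bins[:-1], bins[1:]):
        m = (d[iu] >= lo) & (d[iu] < hi)
        if m.any():
            lags.append(d[iu][m].mean())     # mean separation distance in the bin
            gamma.append(g[iu][m].mean())    # mean semivariance in the bin
    return np.array(lags), np.array(gamma)

# Usage (hypothetical data): props is a (50, k) array of measured properties,
# coords a (50, 2) array of grid coordinates.
# scores, evr = pc_scores(props)
# lags, gamma = empirical_variogram(coords, scores[:, 0])

The nugget, sill, and range of a model fitted to such an empirical variogram would then quantify the degree and structure of spatial dependence of each component, as the abstract describes.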
Abstract:
As an expansion of SF2088, the Department of Administrative Services-Information Technology Enterprise (DAS-ITE) was asked to further analyze the potential costs and savings if the current practice of charging credit card and overhead fees (“value-added fees”) were eliminated. Value-added fees reflect the costs an agency incurs while providing online services, and those costs will always exist. DAS-ITE researched these costs and identified ways of making the associated fees less burdensome to the citizens of Iowa. The three alternatives provide different ways in which agencies can recover those costs: they could be covered by an annual appropriation, or the online service “price” could be adjusted to include the fees within the cost of the online transaction. An additional alternative is presented: leaving the current value-added fee practices in place. It must also be recognized that traditional forms of conducting business with the State of Iowa, face-to-face and paper-based transactions, are inherently more costly. These delivery channels are effectively subsidized by the agency as a “cost of doing business,” and the associated expense of the transactions is not passed on to the customer.
Abstract:
The quadrennial need study was developed to assist in identifying county highway financial needs (construction, rehabilitation, maintenance, and administration) and in the distribution of the road use tax fund (RUTF) among the counties in the state. During the period since the need study was first conducted using HWYNEEDS software, between 1982 and 1998, there have been large fluctuations in the level of funds distributed to individual counties. A recent study performed by Jim Cable (HR-363, 1993) found that one of the major factors driving these fluctuations is the quality and accuracy of the pavement condition data collected. In 1998, Center for Transportation Research and Education researchers (Maze and Smadi) completed a project to study the feasibility of using automated pavement condition data, collected for the Iowa Pavement Management Program (IPMP) for the paved county roads, in the HWYNEEDS software (TR-418). The automated condition data are objective and also more current, since they are collected on a two-year cycle compared to the 10-year cycle currently used by HWYNEEDS. The study showed that the use of the automated condition data in HWYNEEDS would be feasible and beneficial in reducing fluctuations when applied to a pilot study area. TR-418 also recommended a full analysis and investigation of the HWYNEEDS methodology and parameters (for more information, please review the TR-418 project report). The study reported in this document builds on that previous work and covers the analysis and investigation of the HWYNEEDS computer program methodology and parameters. The underlying hypothesis for this study is that, along with the IPMP automated condition data, some changes need to be made to HWYNEEDS parameters to accommodate the use of the new data, which will stabilize the process of allocating resources and reduce fluctuations from one quadrennial need study to another. Another objective of this research is to investigate gravel road needs and study the feasibility of developing a more objective approach to determining needs on the counties' gravel road networks. This study identifies new procedures by which the HWYNEEDS computer program is used to conduct the quadrennial needs study on paved roads. Also, a new procedure is developed to determine gravel road needs outside of the HWYNEEDS program. Recommendations are identified for the new procedures and for changes to the current quadrennial need study. Future research areas are also identified.