985 results for Conjugate gradient methods
Abstract:
A study was carried out to evaluate the presence of serological markers for the immunodiagnosis of the vertical transmission of toxoplasmosis. We tested the sensitivity, specificity and predictive values (positive and negative) of different serological methods for the early diagnosis of congenital toxoplasmosis. In a prospective longitudinal study, 50 infants with suspected congenital toxoplasmosis were followed up in the ambulatory care centre for Congenital Infections at the University Hospital in Goiânia, Goiás, Brazil, from 1 January 2004 to 30 September 2005. Microparticle Enzyme Immunoassay (MEIA), Enzyme-Linked Fluorescent Assay (ELFA) and the Immune-Fluorescent Antibody Technique (IFAT) were used to detect specific IgM anti-Toxoplasma gondii antibodies, and a capture ELISA was used to detect specific IgA antibodies. The results showed that 28/50 infants were infected. During the neonatal period, IgM was detected in 39.3% (11/28) of the infected infants and IgA was detected in 21.4% (6/28). The sensitivity, specificity and predictive values (positive and negative) of each assay were, respectively: MEIA and ELFA: 60.9%, 100%, 100%, 55.0%; IFAT: 59.6%, 91.7%, 93.3%, 53.7%; IgA capture ELISA: 57.1%, 100%, 100%, 51.2%. The presence of specific IgM and IgA antibodies during the neonatal period was infrequent, although it was correlated with the most severe cases of congenital transmission. The results indicate that the absence of congenital disease markers (IgM and IgA) in newborns, even when confirmed by several techniques, does not constitute an exclusion criterion for toxoplasmosis.
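For reference, the four diagnostic metrics reported above follow directly from a 2×2 confusion table. A minimal Python sketch; the counts below are hypothetical (the study's raw table is not given), chosen only so the resulting values match the MEIA/ELFA figures of 60.9%, 100%, 100% and 55.0%:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from confusion counts."""
    sensitivity = tp / (tp + fn)  # infected correctly flagged
    specificity = tn / (tn + fp)  # uninfected correctly cleared
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: 14 true positives, 9 missed infections,
# 11 true negatives, no false positives.
sens, spec, ppv, npv = diagnostic_metrics(tp=14, fp=0, fn=9, tn=11)
# sens ~ 0.609, spec = 1.0, ppv = 1.0, npv = 0.55
```

With no false positives, specificity and PPV are both 100% while the missed infections pull sensitivity and NPV down, which is exactly the pattern reported.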
Abstract:
The aim of this study was to compare two nucleic acid extraction methods for the recovery of enteric viruses from activated sludge. Test samples were inoculated with human adenovirus (AdV), hepatitis A virus (HAV), poliovirus (PV) and rotavirus (RV) and were then processed by an adsorption-elution-precipitation method. Two extraction methods were used: an organic solvent-based method and a silica method. The organic solvent-based method recovered 20% of the AdV, 90% of the RV and 100% of both the PV and HAV from seeded samples. The silica method recovered 1.8% of the AdV and 90% of the RV. These results indicate that the organic solvent-based method is more suitable for detecting viruses in sewage sludge.
Abstract:
Tree nuts, peanuts and seeds are nutrient-dense foods whose intake has been shown to be associated with reduced risk of some chronic diseases. They are regularly consumed in European diets either as whole, in spreads or from hidden sources (e.g. commercial products). However, little is known about their intake profiles or differences in consumption between European countries or geographic regions. The objective of this study was to analyse the population mean intake and average portion sizes in subjects reporting intake of nuts and seeds consumed as whole, derived from hidden sources or from spreads. Data were obtained from standardised 24-hour dietary recalls collected from 36 994 subjects in 10 different countries that are part of the European Prospective Investigation into Cancer and Nutrition (EPIC). Overall, for nuts and seeds consumed as whole, the percentage of subjects reporting intake on the day of the recall was: tree nuts = 4.4%, peanuts = 2.3% and seeds = 1.3%. The data show a clear northern (Sweden: mean intake = 0.15 g/d, average portion size = 15.1 g/d) to southern (Spain: mean intake = 2.99 g/d, average portion size = 34.7 g/d) European gradient of whole tree nut intake. The three most popular tree nuts were walnuts, almonds and hazelnuts, respectively. In general, tree nuts were more widely consumed than peanuts or seeds. In subjects reporting intake, men consumed a significantly higher average portion size of tree nuts (28.5 v. 23.1 g/d, P<0.01) and peanuts (46.1 v. 35.1 g/d, P<0.01) per day than women. These data may be useful in devising research initiatives and health policy strategies based on the intake of this food group.
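The gap between a low population mean intake and a sizeable consumer portion size (e.g. Sweden) simply reflects the small fraction of subjects reporting intake on the recall day. A back-of-the-envelope sketch, assuming roughly one eating occasion per reporting day:

```python
# Population mean intake ~= consumer portion size x fraction consuming.
portion_size = 15.1   # g/d among Swedish subjects reporting intake
mean_intake = 0.15    # g/d averaged across all Swedish subjects
fraction_consuming = mean_intake / portion_size  # roughly 1% on the recall day
```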
Abstract:
The variation with latitude of incidence and mortality for cutaneous malignant melanoma (CMM) in the non-Maori population of New Zealand was assessed. For those aged 20 to 74 years, the effects of age, time period, birth cohort, gender and region (latitude), and some interactions between them, were evaluated by log-linear regression methods. Increasing age-standardized incidence and mortality rates with increasing proximity to the equator were found for men and women. These latitude gradients were greater for males than females. The relative risk of melanoma in the most southern part of New Zealand (latitude 44 degrees S) compared with the most northern region (latitude 36 degrees S) was 0.63 (95 percent confidence interval [CI] = 0.60-0.67) for incidence and 0.76 (CI = 0.68-0.86) for mortality, both genders combined. The mean percentage change in CMM rates per degree of latitude for males was greater than that reported in other published studies. Differences between men and women in melanoma risk with latitude suggest that regional sun-behavior patterns or other risk factors may contribute to the latitude gradient observed.
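Confidence intervals for relative risks such as the 0.63 (0.60-0.67) reported above are conventionally computed on the log scale. A sketch that backs out the implied log-scale standard error from the reported incidence CI (an approximation, since the published figures are rounded):

```python
import math

def rr_ci(rr, se_log, z=1.96):
    """95% confidence interval for a relative risk, computed on the log scale."""
    return (math.exp(math.log(rr) - z * se_log),
            math.exp(math.log(rr) + z * se_log))

# Back out the log-scale SE implied by the reported incidence CI (0.60-0.67):
# the interval width on the log scale is 2 * 1.96 * SE.
se = (math.log(0.67) - math.log(0.60)) / (2 * 1.96)
lo, hi = rr_ci(0.63, se)  # close to the published 0.60-0.67
```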
Abstract:
In most psychological tests and questionnaires, a test score is obtained by taking the sum of the item scores. In virtually all cases where the test or questionnaire contains multidimensional forced-choice items, this traditional scoring method is also applied. We argue that the summation of scores obtained with multidimensional forced-choice items produces uninterpretable test scores. Therefore, we propose three alternative scoring methods: a weak and a strict rank preserving scoring method, which both allow an ordinal interpretation of test scores; and a ratio preserving scoring method, which allows a proportional interpretation of test scores. Each proposed scoring method yields an index for each respondent indicating the degree to which the response pattern is inconsistent. Analysis of real data showed that with respect to rank preservation, the weak and strict rank preserving methods resulted in lower inconsistency indices than the traditional scoring method; with respect to ratio preservation, the ratio preserving scoring method resulted in lower inconsistency indices than the traditional scoring method.
Abstract:
Functional Data Analysis (FDA) deals with samples where a whole function is observed for each individual. A particular case of FDA is when the observed functions are density functions, which are also an example of infinite-dimensional compositional data. In this work we compare several methods of dimensionality reduction for this particular type of data: functional principal component analysis (PCA), with or without a previous data transformation, and multidimensional scaling (MDS) for different inter-density distances, one of them taking into account the compositional nature of density functions. The different methods are applied to both artificial and real data (household income distributions).
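One common way to respect the compositional nature of discretised densities before PCA is the centred log-ratio (clr) transform, which removes the unit-sum constraint. A minimal sketch on simulated data (the household-income densities themselves are not reproduced here, and this specific pipeline is illustrative rather than necessarily the authors'):

```python
import numpy as np

# Each row is a density evaluated on a common grid of 20 bins; Dirichlet
# draws stand in for real discretised densities (rows sum to 1).
rng = np.random.default_rng(0)
densities = rng.dirichlet(np.arange(1, 21), size=50)   # 50 samples, 20 bins

# Centred log-ratio transform: subtract each row's mean log value,
# so every transformed row sums to zero.
clr = np.log(densities) - np.log(densities).mean(axis=1, keepdims=True)

# Ordinary PCA on the clr-transformed data via SVD of the centred matrix.
centred = clr - clr.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:2].T                            # first two PC scores
```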
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
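The power-transformation link mentioned above can be illustrated with the Box-Cox family, which interpolates smoothly between the raw data (alpha = 1) and the log transform (alpha -> 0) that underlies logratio analysis; sweeping alpha produces the "frames" of such a movie. A minimal sketch:

```python
import numpy as np

def power_transform(x, alpha):
    """Box-Cox style power transform; tends to log(x) as alpha -> 0."""
    return np.log(x) if alpha == 0 else (x**alpha - 1.0) / alpha

# Sweeping alpha gives the "frames" of a movie from raw data to log scale.
x = np.array([0.5, 1.0, 2.0, 4.0])
frames = [power_transform(x, a) for a in (1.0, 0.5, 0.1, 0.0)]
```

Recomputing an analysis (e.g. PCA) on each frame and animating the resulting maps is exactly the "frame by frame" idea described in the abstract.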
Abstract:
Influenza surveillance networks must detect early the viruses that will cause the forthcoming annual epidemics and must isolate the strains for further characterization. We obtained the highest sensitivity (95.4%) with a diagnostic tool that combined a shell-vial assay with reverse transcription-PCR on cell culture supernatants at 48 h and, indeed, recovered the strain.
Abstract:
This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot in a cable tracking task. The learning system is characterized by the use of a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robotics. In order to speed up the process, the learning phase was carried out in a simulated environment and, in a second step, the policy was transferred to and tested successfully on a real robot. Future work will continue the learning process on-line on the real robot while it performs the mentioned task. We demonstrate the system's feasibility with real experiments on the underwater robot ICTINEU AUV.
Abstract:
Autonomous underwater vehicles (AUVs) represent a challenging control problem with complex, noisy dynamics. Nowadays, not only the continuous scientific advances in underwater robotics but also the increasing number and complexity of subsea missions call for the automation of submarine processes. This paper proposes a high-level control system for solving the action selection problem of an autonomous robot. The system is characterized by the use of reinforcement learning direct policy search methods (RLDPS) for learning the internal state/action mapping of some behaviors. We demonstrate its feasibility with simulated experiments using the model of our underwater robot URIS in a target following task.
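Direct policy search methods of the kind referred to here adjust policy parameters along the gradient of expected reward rather than estimating a value function. A toy REINFORCE-style sketch on a two-action problem (purely illustrative; this is not the authors' RLDPS implementation):

```python
import math
import random

random.seed(0)
theta = 0.0  # single policy parameter
alpha = 0.1  # learning rate

def pi(theta):
    """Probability of choosing action 1 under a sigmoid policy."""
    return 1.0 / (1.0 + math.exp(-theta))

for _ in range(2000):
    p = pi(theta)
    a = 1 if random.random() < p else 0
    reward = 1.0 if a == 1 else 0.0           # action 1 is the rewarding one
    grad_log_pi = (1 - p) if a == 1 else -p   # d/dtheta of log pi(a)
    theta += alpha * reward * grad_log_pi     # REINFORCE update

# The policy converges towards always choosing the rewarding action.
```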
Abstract:
Interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes, using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and the uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or by using a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate nonnormal trait distributions when necessary. The new methods adequately control the FPR and also have equal or better power compared to all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires computation time of less than one computer-day for a typical genome-wide scan, with 2.5 million single-nucleotide polymorphisms and 5000 individuals.
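A simple baseline for association testing under genotype uncertainty (not the paper's exact maximum likelihood method) is to regress the trait on the expected allele dosage computed from the imputation posteriors. A sketch on simulated data:

```python
import numpy as np

# Per-individual genotype posteriors P(AA), P(AB), P(BB) from imputation;
# Dirichlet draws stand in for real imputation output here.
rng = np.random.default_rng(1)
n = 500
probs = rng.dirichlet([1, 1, 1], size=n)

# Expected allele dosage: 0*P(AA) + 1*P(AB) + 2*P(BB).
dosage = probs @ np.array([0.0, 1.0, 2.0])

# Simulated quantitative trait with a true additive effect of 0.5 per allele.
trait = 0.5 * dosage + rng.normal(size=n)

# Ordinary least squares of trait on an intercept and the dosage.
X = np.column_stack([np.ones(n), dosage])
beta, *_ = np.linalg.lstsq(X, trait, rcond=None)
```

The fitted slope `beta[1]` estimates the per-allele effect; the paper's point is that naive variants of this idea can miscalibrate the FPR, motivating the exact-likelihood treatment.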
Abstract:
In the accounting literature, interaction or moderating effects are usually assessed by means of OLS regression, and summated rating scales are constructed to reduce measurement error bias. Structural equation models and two-stage least squares regression could be used to eliminate this bias completely, but large samples are needed. Partial Least Squares is appropriate for small samples but does not correct measurement error bias. In this article, disattenuated regression is discussed as a small-sample alternative and is illustrated on the data of Bisbe and Otley (in press), who examine the interaction effect of innovation and style of use of budgets on performance. Sizeable differences emerge between OLS and disattenuated regression.
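Disattenuated regression rests on the classical correction for attenuation: dividing an observed correlation by the square root of the product of the two measures' reliabilities. A minimal sketch with hypothetical values:

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Correct an observed correlation for measurement error,
    given reliability estimates of the two measures."""
    return r_xy / math.sqrt(rel_x * rel_y)

# Hypothetical values: observed r = .30, reliabilities .70 and .80.
r_true = disattenuate(0.30, 0.70, 0.80)  # ~ 0.40
```

Because measurement error only shrinks observed correlations, the corrected estimate is always at least as large as the observed one, which is why OLS on error-laden scales understates interaction effects.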
Abstract:
Aim: This study compares the direct, macroecological approach (MEM) for modelling species richness (SR) with the more recent approach of stacking predictions from individual species distributions (S-SDM). We implemented both approaches on the same dataset and discuss their respective theoretical assumptions, strengths and drawbacks. We also tested how both approaches performed in reproducing observed patterns of SR along an elevational gradient. Location: Two study areas in the Alps of Switzerland. Methods: We implemented MEM by relating the species counts to environmental predictors with statistical models, assuming a Poisson distribution. S-SDM was implemented by modelling each species distribution individually and then stacking the obtained prediction maps in three different ways - summing binary predictions, summing random draws of binomial trials and summing predicted probabilities - to obtain a final species count. Results: The direct MEM approach yields nearly unbiased predictions centred around the observed mean values, but with a lower correlation between predictions and observations than that achieved by the S-SDM approaches. This method also cannot provide any information on species identity and, thus, community composition. It does, however, accurately reproduce the hump-shaped pattern of SR observed along the elevational gradient. The S-SDM approach summing binary maps can predict individual species and thus communities, but tends to overpredict SR. The two other S-SDM approaches - summing binomial trials based on predicted probabilities and summing predicted probabilities - do not overpredict richness, but they predict many competing end points of assembly or lose the individual species predictions, respectively. Furthermore, all S-SDM approaches fail to appropriately reproduce the observed hump-shaped patterns of SR along the elevational gradient. Main conclusions: The macroecological approach and S-SDM have complementary strengths. We suggest that the two could be used in combination to obtain better SR predictions, following the suggestion of constraining S-SDM by MEM predictions.
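The three stacking rules described above can be sketched for a single site, given hypothetical per-species occurrence probabilities from the individual SDMs:

```python
import numpy as np

# Hypothetical occurrence probabilities for 5 species at one site.
rng = np.random.default_rng(42)
p = np.array([0.9, 0.6, 0.4, 0.2, 0.1])

sr_binary = int((p >= 0.5).sum())        # 1) sum of thresholded binary maps
sr_prob = float(p.sum())                 # 2) sum of predicted probabilities
sr_draw = int(rng.binomial(1, p).sum())  # 3) one random draw of binomial trials
```

Summing probabilities equals the expected richness of the binomial draws, which is why rules 2 and 3 avoid the overprediction that thresholding can introduce, at the cost of either losing species identities or producing many competing realised assemblages.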
Abstract:
BACKGROUND: The Contegra® is a conduit made from the bovine jugular vein and interposed between the right ventricle and the pulmonary artery. It is used in the reconstruction of the right ventricular outflow tract for cardiac malformations. OBJECTIVE: To describe both normal and pathological appearances of the Contegra® in radiological imaging, to describe imaging of complications and to define the role of CT and MRI in postoperative follow-up. MATERIALS AND METHODS: Forty-three examinations of 24 patients (17 boys and 7 girls; mean age: 10.8 years) with Contegra® conduits were reviewed. Anatomical description and measurements of the conduits were performed. Pathological items examined included stenosis, dilatation, plicature or twist, thrombus or vegetations, calcifications and valvular regurgitation. Findings were correlated with the echographic gradient through the conduit when available. RESULTS: CT and MR work-up showed Contegra® stenosis (n = 12), dilatation (n = 9) and plicature or twist (n = 7). CT displayed thrombus or vegetations in the Contegra® in three clinically infected patients. Calcifications of the conduit were present at CT in 12 patients and valvular regurgitation in three patients. Comparison with CT and/or MR results showed a good correlation between the echographic gradient and the presence of stenosis in the Contegra®. CONCLUSION: CT and MR bring additional information about permeability and postoperative anatomy, especially when echocardiography is inconclusive. Both techniques depict the normal appearance of the conduit, and allow comparison and precise evaluation of changes in the postoperative follow-up.