964 results for "Simplified and advanced calculation methods"


Relevance: 100.00%

Abstract:

The role of land cover change as a significant component of global change has become increasingly recognized in recent decades. Large databases measuring land cover change, and the data which can potentially be used to explain the observed changes, are also becoming more commonly available. When developing statistical models to investigate observed changes, it is important to be aware that the chosen sampling strategy and modelling techniques can influence results. We present a comparison of three sampling strategies and two forms of grouped logistic regression models (multinomial and ordinal) in the investigation of patterns of successional change after agricultural land abandonment in Switzerland. Results indicated that both ordinal and nominal transitional change occur in the landscape and that different sampling regimes and modelling techniques, used as investigative tools, yield different results. Synthesis and applications. Our multimodel inference successfully identified a set of consistently selected indicators of land cover change, which can be used to predict further change, including annual average temperature, the number of already overgrown neighbouring areas of land and distance to historically destructive avalanche sites. This allows for more reliable decision making and planning with respect to landscape management. Although both model approaches gave similar results, ordinal regression yielded more parsimonious models that identified the important predictors of land cover change more efficiently. Thus, this approach is preferable where the land cover change pattern can be interpreted as an ordinal process; otherwise, multinomial logistic regression is a viable alternative.
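
A minimal sketch of the two grouped logistic regression forms compared above, fitted with statsmodels to a synthetic three-stage succession outcome; the data, variable names (temp, overgrown_neighbours, dist_avalanche) and thresholds are placeholders, not the study's dataset, and the AIC comparison only illustrates the parsimony point made in the abstract.

```python
# Sketch: multinomial vs. ordinal (proportional-odds) logistic regression on an
# invented three-stage succession outcome (0 = open, 1 = shrub, 2 = forest).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "temp": rng.normal(6.0, 2.0, n),                 # annual average temperature (placeholder)
    "overgrown_neighbours": rng.integers(0, 8, n),   # overgrown neighbouring parcels (placeholder)
    "dist_avalanche": rng.exponential(500.0, n),     # distance to avalanche site, m (placeholder)
})
# Synthetic ordered response driven by a latent score, purely for illustration.
latent = (0.4 * X["temp"] + 0.5 * X["overgrown_neighbours"]
          - 0.002 * X["dist_avalanche"] + rng.logistic(size=n))
stage = pd.cut(latent, bins=[-np.inf, 1.5, 4.0, np.inf], labels=False)

# Multinomial (nominal) model: one coefficient set per non-reference category.
mnl = sm.MNLogit(stage, sm.add_constant(X)).fit(disp=False)

# Ordinal model: a single coefficient set plus ordered thresholds, hence fewer
# parameters -- the parsimony advantage noted in the abstract.
stage_ord = pd.Series(pd.Categorical(stage, categories=[0, 1, 2], ordered=True))
ordinal = OrderedModel(stage_ord, X, distr="logit").fit(method="bfgs", disp=False)

print("multinomial AIC:", round(mnl.aic, 1), "| ordinal AIC:", round(ordinal.aic, 1))
```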

Relevance: 100.00%

Abstract:

OBJECTIVE: The study tests the hypothesis that intramodal visual binding is disturbed in schizophrenia and should be detectable in all illness stages as a stable trait marker. METHOD: Three groups of patients (rehospitalized chronic schizophrenic patients, first-admitted schizophrenic patients, and schizotypal patients believed to be suffering from a pre-schizophrenic prodrome) and a group of normal control subjects were tested on three tasks targeting visual 'binding' abilities (the Müller-Lyer illusion and two figure detection tasks), in addition to control parameters such as reaction time, visual selective attention and Raven's test, and two conventional cortical tasks: spatial working memory (SWM) and a global-local test. RESULTS: Chronic patients showed decreased performance on the binding tests. Unexpectedly, the prodromal group exhibited enhanced Gestalt extraction on these tests compared both with schizophrenic patients and with healthy subjects. Furthermore, chronic schizophrenia was associated with poor performance on the cortical tests of SWM and global-local processing and on Raven's test. This association appears to be mediated by, or linked to, the chronicity of the illness. CONCLUSION: The study confirms a variety of neurocognitive deficits in schizophrenia which, however, in this sample seem to be linked to chronicity of illness. Nevertheless, certain aspects of visual processing concerned with Gestalt extraction deserve attention as potential vulnerability or prodrome indicators. The initial hypothesis of the study is rejected.

Relevance: 100.00%

Abstract:

When using a polynomial approximating function, the most contentious aspect of the Heat Balance Integral Method is the choice of the power of the highest-order term. In this paper we employ a method recently developed for thermal problems, in which the exponent is determined during the solution process, to analyse Stefan problems. This is achieved by minimising an error function. The solution requires no knowledge of an exact solution and generally produces significantly better results than all previous HBI models. The method is illustrated by first applying it to standard thermal problems. A Stefan problem with an analytical solution is then discussed and the results compared with the approximate solution. An ablation problem is also analysed and the results compared against a numerical solution. In both examples the agreement is excellent. A Stefan problem in which the boundary temperature increases exponentially is then analysed; this highlights the difficulties that can be encountered with a time-dependent boundary condition. Finally, melting with a time-dependent flux is briefly analysed, this time without analytical or numerical results to assess the accuracy.
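
As a minimal sketch of the underlying idea, stated here for the standard fixed-boundary thermal problem rather than the Stefan and ablation problems of the paper (an assumption made for brevity): approximate the temperature by a polynomial profile with an unspecified exponent n, impose the heat balance integral, and choose n by minimising a least-squares measure of how badly the profile satisfies the heat equation.

```latex
% Heat Balance Integral sketch for u_t = u_{xx} on 0 < x < \delta(t),
% with u(0,t) = 1 and u(\delta,t) = u_x(\delta,t) = 0.
u \approx \Bigl(1 - \frac{x}{\delta(t)}\Bigr)^{\!n},
\qquad
\frac{\mathrm{d}}{\mathrm{d}t}\int_{0}^{\delta} u\,\mathrm{d}x
  = -\left.\frac{\partial u}{\partial x}\right|_{x=0}
  \;\Longrightarrow\; \delta^{2} = 2\,n(n+1)\,t .
% Rather than fixing n = 2 a priori, the exponent is chosen to minimise an error
% functional measuring the residual of the heat equation over the penetration depth:
E_{n}(t) = \int_{0}^{\delta}\bigl(u_{t} - u_{xx}\bigr)^{2}\,\mathrm{d}x .
```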

Relevance: 100.00%

Abstract:

To further validate the doubly labeled water method for measurement of CO2 production and energy expenditure in humans, we compared it with near-continuous respiratory gas exchange in nine healthy young adult males. Subjects were housed in a respiratory chamber for 4 days. Each received ²H₂¹⁸O at either a low (n = 6) or a moderate (n = 3) isotope dose. Low and moderate doses produced initial ²H enrichments of 5 and 10 × 10⁻³ atom percent excess, respectively, and initial ¹⁸O enrichments of 2 and 2.5 × 10⁻² atom percent excess, respectively. Total body water was calculated from isotope dilution in saliva collected 4 and 5 h after the dose. CO2 production was calculated by the two-point method using the isotopic enrichments of urine collected just before each subject entered and left the chamber. Isotope enrichments relative to predose samples were measured by isotope ratio mass spectrometry. At the low isotope dose, doubly labeled water overestimated average daily energy expenditure by 8 ± 9% (SD) (range −7 to 22%). At the moderate dose the difference was reduced to +4 ± 5% (range 0-9%). The isotope elimination curves for ²H and ¹⁸O from serial urine samples collected from one of the subjects showed the expected diurnal variations but were otherwise quite smooth. The overestimate may be due to approximations in the corrections for isotope fractionation and isotope dilution. An alternative approach to these corrections is presented that reduces the overestimate to 1%.
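
A minimal numerical sketch of the two-point calculation described above. The enrichments, time points and body-water value are invented for illustration, and the final line uses the simplest fractionation-free relation rCO2 ≈ (N/2)(kO − kH); the published corrections for isotope fractionation and dilution that the abstract's final sentences refer to modify the constants in this relation.

```python
# Two-point doubly labelled water sketch (illustrative numbers, not study data).
# k = ln(E_start / E_end) / (t_end - t_start) for each isotope, where E is the
# enrichment above the pre-dose baseline; CO2 production follows from the
# difference between the 18O and 2H elimination rates.
import math

N = 2400.0          # total body water, mol (hypothetical; from isotope dilution)
t0, t1 = 0.0, 4.0   # days in the respiration chamber

E_O18 = {"start": 2.0e-2, "end": 1.2e-2}   # 18O enrichment, atom percent excess
E_H2  = {"start": 5.0e-3, "end": 3.2e-3}   # 2H enrichment, atom percent excess

k_O = math.log(E_O18["start"] / E_O18["end"]) / (t1 - t0)   # water + CO2 turnover
k_H = math.log(E_H2["start"] / E_H2["end"]) / (t1 - t0)     # water turnover only

# Simplest (uncorrected) relation: each CO2 molecule removes two oxygen atoms,
# so the excess 18O elimination over 2H reflects CO2 production.
rCO2 = (N / 2.0) * (k_O - k_H)   # mol CO2 per day, before fractionation corrections
print(f"kO = {k_O:.4f}/d, kH = {k_H:.4f}/d, rCO2 ≈ {rCO2:.0f} mol/d")
```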

Relevance: 100.00%

Abstract:

ACuteTox is a project within the 6th European Framework Programme, one of whose goals was to develop, optimise and prevalidate a non-animal testing strategy for predicting human acute oral toxicity. In its last six months, a challenging exercise was conducted to assess the predictive capacity of the developed testing strategies and to identify the most promising ones. Thirty-two chemicals were tested blind in the battery of in vitro and in silico methods selected during the first phase of the project. This paper describes the classification approaches studied: single-step procedures and two-step tiered testing strategies. In summary, four in vitro testing strategies were proposed as best performing in terms of predictive capacity with respect to the European acute oral toxicity classification. In addition, a heuristic testing strategy is suggested that combines the prediction results from the neutral red uptake assay performed in 3T3 cells with information on neurotoxicity alerts identified by the primary rat brain aggregates test method. Octanol-water partition coefficients and in silico predictions of intestinal absorption and blood-brain barrier passage are also considered. This approach reduces the number of chemicals wrongly predicted as not classified (LD50 > 2000 mg/kg b.w.).
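
A schematic sketch of a two-step tiered strategy of the kind described above. The cut-offs shown are the former EU acute oral toxicity bands (very toxic ≤ 25, toxic ≤ 200, harmful ≤ 2000, not classified > 2000 mg/kg b.w.), and the cytotoxicity-to-LD50 regression and the reclassification rule are hypothetical placeholders, not the project's validated decision rules.

```python
# Schematic two-step tiered strategy (illustrative thresholds and regression,
# not the validated ACuteTox decision rules).
from dataclasses import dataclass
import math

EU_BANDS = [(25.0, "very toxic"), (200.0, "toxic"), (2000.0, "harmful")]

@dataclass
class Chemical:
    name: str
    nru_ic50_ugml: float     # 3T3 neutral red uptake IC50
    neurotox_alert: bool     # flag from the rat brain aggregates assay
    crosses_bbb: bool        # in silico blood-brain barrier prediction

def predicted_ld50(ic50: float) -> float:
    """Hypothetical log-log regression from cytotoxicity to oral LD50 (mg/kg)."""
    return 10 ** (0.43 * math.log10(ic50) + 2.6)   # coefficients are placeholders

def classify(chem: Chemical) -> str:
    ld50 = predicted_ld50(chem.nru_ic50_ugml)
    # Step 2: a neurotoxicity alert on a brain-penetrant chemical moves it towards
    # higher toxicity, reducing false "not classified" calls (illustrative rule).
    if chem.neurotox_alert and chem.crosses_bbb:
        ld50 /= 10.0
    for cutoff, label in EU_BANDS:
        if ld50 <= cutoff:
            return label
    return "not classified (LD50 > 2000 mg/kg b.w.)"

print(classify(Chemical("test compound", nru_ic50_ugml=80.0,
                        neurotox_alert=True, crosses_bbb=True)))
```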

Relevance: 100.00%

Abstract:

The high sensitivity and the possibility of automation of the enzyme-linked immunosorbent assay (ELISA) have indicated this technique as one of the most useful serological tests for epidemiological studies. In the present study, an ELISA for the detection of IgG antibodies against adult worm antigens (IgG-ELISA) was investigated for epidemiological purposes in a rural area of the municipality of Itariri (São Paulo, Brazil). Blood on filter paper (1,180 samples) from about 650 schoolchildren was submitted to ELISA, and the data were compared with the results of the parasitological Kato-Katz method and with the IgM-IFT (immunofluorescence test for IgM antibodies to gut-associated antigens). The prevalence rates of 8.5%, 43.0% and 56.2% obtained by the Kato-Katz, IgG-ELISA and IgM-IFT methods, respectively, suggest the poor sensitivity of the parasitological method for the detection of Schistosoma mansoni eggs in individuals with low worm burden, a situation commonly observed in areas of low endemicity. These results can partially explain the poor agreement between the IgG-ELISA and the Kato-Katz method, indicated by a Kappa index of 0.170. In contrast, a Kappa index of 0.675 showed substantial agreement between the two serological tests. The remaining discrepancies between the two serological techniques require further investigation.
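
The agreement statistic quoted above is Cohen's kappa; a minimal sketch of its computation from a 2×2 cross-classification of two binary tests follows. The counts are invented for illustration and are not the Itariri data; they merely show how strongly imbalanced marginal positivity rates pull kappa down.

```python
# Cohen's kappa for agreement between two binary tests (illustrative counts).
def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """a = both positive, b = test1+/test2-, c = test1-/test2+, d = both negative."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p1_pos, p2_pos = (a + b) / n, (a + c) / n          # marginal positive rates
    p_expected = p1_pos * p2_pos + (1 - p1_pos) * (1 - p2_pos)
    return (p_observed - p_expected) / (1 - p_expected)

# Example: strong marginal imbalance (as between Kato-Katz and IgG-ELISA) keeps
# kappa low even when one test's positives largely nest inside the other's.
print(round(cohens_kappa(a=50, b=10, c=230, d=360), 3))
```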

Relevance: 100.00%

Abstract:

Mycobacterium tuberculosis strains resistant to streptomycin (SM), isoniazid (INH) and/or rifampin (RIF), as determined by the conventional Löwenstein-Jensen proportion method (LJPM), were compared with the E test, a minimum inhibitory concentration susceptibility method. Discrepant isolates were further evaluated by BACTEC and by DNA sequence analysis of the genes most often associated with resistance to these drugs (rpsL, katG, inhA and rpoB). Preliminary discordant E test results were seen in 75% of isolates resistant to SM and in 11% of those resistant to INH. When isolates were re-tested, discordance for these two drugs improved (63% for SM and none for INH) but worsened for RIF (30%). Despite good agreement between phenotypic results and sequencing analyses, wild-type profiles were detected in resistant strains, mainly for SM and INH. It should be noted that isolates classified as susceptible by molecular methods may harbour other mechanisms of resistance. Although the reproducibility of the LJPM susceptibility method has been established, the variable E test results for some M. tuberculosis isolates raise questions about its reproducibility, particularly because E test performance may vary among laboratories despite adherence to recommended protocols. Further studies with larger samples are needed, looking for possible mutations outside the sequenced hot-spot regions of these genes in discrepant strains.

Relevance: 100.00%

Abstract:

Mutations in the rpoB locus confer conformational changes that lead to defective binding of rifampin (RIF) to RpoB and consequently to resistance in Mycobacterium tuberculosis. Polymerase chain reaction-single-strand conformation polymorphism (PCR-SSCP) was established as a rapid screening test for the detection of mutations in the rpoB gene, and direct sequencing was applied to unambiguously characterize the mutations. A total of 37 Iranian isolates of M. tuberculosis, 16 sensitive and 21 resistant to RIF, were used in this study. A 193-bp region of the rpoB gene was amplified, and PCR-SSCP patterns were determined by electrophoresis in 10% acrylamide gel and silver staining. In addition, 21 samples of the 193-bp rpoB amplicon with different PCR-SSCP patterns from RIF-resistant (RIFr) isolates and 10 from RIF-sensitive (RIFs) isolates were sequenced. Seven distinguishable PCR-SSCP patterns were recognized among the 21 Iranian RIFr strains, while 15 of the 16 RIFs isolates showed PCR-SSCP banding patterns similar to that of the sensitive standard strain H37Rv; one of the sensitive isolates showed a different pattern. Six different mutations were found in the amplified region of the rpoB gene: codons 516 (GAC/GTC), 523 (GGG/GGT), 526 (CAC/TAC), 531 (TCG/TTG), 511 (CTG/TTG) and 512 (AGC/TCG). This study demonstrated the high specificity (93.8%) and sensitivity (95.2%) of the PCR-SSCP method for the detection of mutations in the rpoB gene; 85.7% of RIFr strains showed a single mutation and 14.3% had no mutations. Three strains showed mutations that represent polymorphisms. Our data support the common notion that rifampin-resistance genotypes generally involve mutations in codons 531 and 526, those most frequently found in M. tuberculosis populations regardless of geographic origin.
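
A minimal sketch of how the quoted figures arise from 2×2 counts, reading specificity as the fraction of RIF-susceptible isolates whose SSCP pattern matched H37Rv (15 of 16, as stated above) and sensitivity as the fraction of RIF-resistant isolates with an altered pattern; the 20-of-21 count used below is an assumption consistent with the reported 95.2%, not a figure stated in the abstract.

```python
# Sensitivity and specificity of PCR-SSCP versus phenotypic RIF resistance.
# 15/16 susceptible isolates matching H37Rv is stated in the abstract; the
# 20/21 figure for resistant isolates is an assumption consistent with 95.2%.
def sensitivity(true_pos: int, false_neg: int) -> float:
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    return true_neg / (true_neg + false_pos)

print(f"sensitivity = {sensitivity(20, 1):.1%}")   # 95.2%
print(f"specificity = {specificity(15, 1):.1%}")   # 93.8%
```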

Relevance: 100.00%

Abstract:

The goal of this study was to demonstrate the usefulness of an enzyme-linked immunosorbent assay (ELISA) for the serodiagnosis of pulmonary tuberculosis (PTB) and extrapulmonary TB (EPTB). The assay used 20-amino-acid-long, non-overlapping synthetic peptides spanning the complete Mycobacterium tuberculosis ESAT-6 and Ag85A sequences. The validation cohort consisted of 1,102 individuals grouped into five diagnostic groups: 455 patients with PTB, 60 patients with EPTB, 40 individuals with non-EPTB, 33 individuals with leprosy and 514 healthy controls. For the PTB group, two ESAT-6 peptides (12033 and 12034) had the highest sensitivities, 96.9% and 96.2%, respectively, and an Ag85A peptide (29878) was the most specific (97.4%). For the EPTB group, two Ag85A peptides (11005 and 11006) had a sensitivity of 98.3% and the same Ag85A peptide (29878) was again the most specific (96.4%). When combinations of peptides were used, such as 12033 with 12034 or 11005 with 11006, sensitivities of 99.5% and 100% were observed in the PTB and EPTB groups, respectively. In conclusion, for a cohort consisting entirely of individuals from Venezuela, a multi-antigen immunoassay using highly sensitive ESAT-6 and Ag85A peptides, alone and in combination, could be used to diagnose PTB and EPTB infection more rapidly.
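
A minimal sketch of the usual parallel-testing rule for a peptide pair (positive if either peptide exceeds its ELISA cut-off), which is how combining two sensitive peptides can raise sensitivity; whether this is exactly the combination rule used in the study is not stated in the abstract, and the OD readings and cut-offs below are invented.

```python
# "Either peptide positive" combination rule for a peptide pair (e.g. 12033 + 12034).
# OD cut-offs and sample readings are invented, purely to illustrate the rule.
CUTOFFS = {"12033": 0.30, "12034": 0.28}

def combined_positive(sample_od: dict[str, float], peptides: tuple[str, str]) -> bool:
    return any(sample_od[p] >= CUTOFFS[p] for p in peptides)

samples = [
    {"12033": 0.45, "12034": 0.20},   # positive on 12033 only -> combined positive
    {"12033": 0.10, "12034": 0.35},   # positive on 12034 only -> combined positive
    {"12033": 0.12, "12034": 0.15},   # negative on both       -> combined negative
]
for s in samples:
    print(combined_positive(s, ("12033", "12034")))
```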

Relevance: 100.00%

Abstract:

A procedure based on quantum molecular similarity measures (QMSM) has been used to compare electron densities obtained from conventional ab initio and density functional methodologies at their respective optimized geometries. The method has been applied to a series of small molecules with experimentally known properties and molecular bonds of diverse degrees of ionicity and covalency. The results show that in most cases the electron densities obtained from density functional methodologies are of similar quality to post-Hartree-Fock generalized densities. For molecules where the Hartree-Fock methodology yields erroneous results, the density functional methodology is shown to yield densities that are usually more accurate than those provided by second-order Møller-Plesset perturbation theory.
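
For reference, a minimal sketch of the overlap-type similarity measure and its normalised (Carbó) index on which such density comparisons are commonly based; whether this exact operator choice is the one employed is an assumption, since the abstract does not specify the QMSM operator.

```latex
% Overlap-like QMSM between electron densities \rho_A and \rho_B, and the
% normalised Carbo similarity index (equal to 1 only for proportional densities).
Z_{AB} = \int \rho_A(\mathbf{r})\,\rho_B(\mathbf{r})\,\mathrm{d}\mathbf{r},
\qquad
C_{AB} = \frac{Z_{AB}}{\sqrt{Z_{AA}\,Z_{BB}}}, \qquad 0 < C_{AB} \le 1 .
```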

Relevance: 100.00%

Abstract:

In the present paper we discuss and compare two different energy decomposition schemes: Mayer's Hartree-Fock energy decomposition into diatomic and monoatomic contributions [Chem. Phys. Lett. 382, 265 (2003)], and the Ziegler-Rauk dissociation energy decomposition [Inorg. Chem. 18, 1558 (1979)]. The Ziegler-Rauk scheme is based on a separation of a molecule into fragments, while Mayer's scheme can be used in cases where a fragmentation of the system into clearly separable parts is not possible. In the Mayer scheme, the density of a free atom is deformed to give the one-atom Mulliken density, which subsequently interacts to give rise to the diatomic interaction energy. We give a detailed analysis of the diatomic energy contributions in the Mayer scheme and take a close look at the one-atom Mulliken densities. The Mulliken density ρA has a single large maximum around the nuclear position of atom A, but exhibits slightly negative values in the vicinity of neighboring atoms. The main connecting point between the two analysis schemes is the electrostatic energy. Both decomposition schemes use the same electrostatic energy expression but differ in how the fragment densities are defined: in the Mayer scheme, the electrostatic component originates from the interaction of the Mulliken densities, while in the Ziegler-Rauk scheme the undisturbed fragment densities interact. The values of the electrostatic energy resulting from the two schemes differ significantly but typically have the same order of magnitude. Both methods are useful and complementary, since Mayer's decomposition focuses on the energy of the finally formed molecule, whereas the Ziegler-Rauk scheme describes the bond formation starting from undeformed fragment densities.
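
Both schemes evaluate the classical electrostatic interaction between two charge distributions; a sketch of that expression follows (in atomic units, written for single-nucleus fragments A and B with nuclear charges Z_A, Z_B at positions R_A, R_B; polyatomic fragments sum over their nuclei). The schemes differ only in which densities ρ_A, ρ_B are inserted: Mulliken one-atom densities in Mayer's scheme, undisturbed fragment densities in the Ziegler-Rauk scheme.

```latex
E_{\mathrm{elst}}^{AB}
 = \frac{Z_A Z_B}{\lvert \mathbf{R}_A-\mathbf{R}_B \rvert}
 - \int \frac{Z_A\,\rho_B(\mathbf{r})}{\lvert \mathbf{r}-\mathbf{R}_A \rvert}\,\mathrm{d}\mathbf{r}
 - \int \frac{Z_B\,\rho_A(\mathbf{r})}{\lvert \mathbf{r}-\mathbf{R}_B \rvert}\,\mathrm{d}\mathbf{r}
 + \iint \frac{\rho_A(\mathbf{r}_1)\,\rho_B(\mathbf{r}_2)}
              {\lvert \mathbf{r}_1-\mathbf{r}_2 \rvert}\,
         \mathrm{d}\mathbf{r}_1\,\mathrm{d}\mathbf{r}_2 .
```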

Relevance: 100.00%

Abstract:

Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement on the critical values. By imposing such monotonicity on the estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type 1 error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
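
A minimal sketch of the stepdown logic discussed above, using Holm's (1979) critical values, the simplest instance requiring no resampling; the p-values are invented. The monotone (increasing) critical values are what make the stepdown argument work: once one ordered hypothesis fails to be rejected, all later ones are retained. Resampling-based stepdown methods replace the α/(k − i) constants with estimated critical values on which the same monotonicity is imposed.

```python
# Holm's stepdown procedure controlling the familywise error rate at level alpha.
def holm_stepdown(pvalues: list[float], alpha: float = 0.05) -> list[bool]:
    k = len(pvalues)
    order = sorted(range(k), key=lambda i: pvalues[i])   # indices, smallest p first
    reject = [False] * k
    for step, idx in enumerate(order):
        if pvalues[idx] <= alpha / (k - step):           # critical values increase
            reject[idx] = True
        else:
            break                                        # stepdown stops here
    return reject

pvals = [0.001, 0.012, 0.020, 0.040, 0.300]
print(holm_stepdown(pvals))   # [True, True, False, False, False] at alpha = 0.05
```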

Relevance: 100.00%

Abstract:

Traditional culture-dependent methods to quantify and identify airborne microorganisms are limited by factors such as short-duration sampling times and the inability to count nonculturable or non-viable bacteria. Consequently, the quantitative assessment of bioaerosols is often underestimated. Use of the real-time quantitative polymerase chain reaction (Q-PCR) to quantify bacteria in environmental samples presents an alternative method which should overcome this problem. The aim of this study was to evaluate the performance of a real-time Q-PCR assay as a simple and reliable way to quantify the airborne bacterial load within poultry houses and sewage treatment plants, in comparison with epifluorescence microscopy and culture-dependent methods. The estimates of bacterial load that we obtained from real-time PCR and epifluorescence methods are comparable; however, our analyses of sewage treatment plants indicate that these methods give values 270- to 290-fold greater than those obtained by the 'impaction on nutrient agar' method. The culture-dependent method of air impaction on nutrient agar was also inadequate in poultry houses, as was the impinger-culture method, which gave a bacterial load estimate 32-fold lower than that obtained by Q-PCR. Real-time quantitative PCR thus proves to be a reliable, discerning and simple method that could be used to estimate the airborne bacterial load in a broad variety of other environments expected to carry high numbers of airborne bacteria.
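
A minimal sketch of the absolute quantification step behind a Q-PCR bioaerosol estimate: a standard curve of threshold cycle (Ct) against log10 of known copy numbers is inverted to convert a sample Ct into copies, then divided by the volume of air sampled. All numbers are invented and the assay details (target, volumes) are assumptions, not those of the study.

```python
# Absolute Q-PCR quantification via a log-linear standard curve (invented data).
import numpy as np

# Standard curve: serial dilutions with known copy numbers and measured Ct values.
std_copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
std_ct     = np.array([31.2, 27.9, 24.5, 21.1, 17.8])

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, deg=1)
# A slope near -3.3 corresponds to roughly 100% amplification efficiency.

def copies_from_ct(ct: float) -> float:
    return 10 ** ((ct - intercept) / slope)

sample_ct = 23.0
air_volume_m3 = 0.5                     # volume of air drawn through the filter
load = copies_from_ct(sample_ct) / air_volume_m3
print(f"estimated load ≈ {load:.2e} target-gene copies per m³ of air")
```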

Relevance: 100.00%

Abstract:

"Most quantitative empirical analyses are motivated by the desire to estimate the causal effect of an independent variable on a dependent variable. Although the randomized experiment is the most powerful design for this task, in most social science research done outside of psychology, experimental designs are infeasible. (Winship & Morgan, 1999, p. 659)." This quote from earlier work by Winship and Morgan, which was instrumental in setting the groundwork for their book, captures the essence of our review of Morgan and Winship's book: It is about causality in nonexperimental settings.