893 results for Statistical method


Relevance: 60.00%

Abstract:

The issue addressed here is that rules for the use of electricity in rural areas are limited to the provision of inputs, and adopting guidelines based on administratively managed subregions can generate poor results. The focus of this study was to present parameters for indicators of electric energy and agricultural production that allow the formation of groups of cities in São Paulo State, Brazil, with similar electric energy consumption and rural agricultural production. The methodology consisted of developing indicators that characterize electric energy consumption and agricultural production, and of forming groups from these indicators using Ward's statistical clustering method. The main conclusion was the formation of six homogeneous groups with similar characteristics regarding agricultural production and electricity consumption. Applying these groups of cities with similar characteristics would produce more satisfactory results than the current division into administrative Rural Development Offices (RDO).
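
As a rough illustration of the grouping step, the sketch below applies Ward's hierarchical clustering to a table of hypothetical municipal indicators; the indicator names and data are invented, and the cut at six clusters mirrors the number of groups reported above.

```python
# Hypothetical sketch: Ward clustering of municipalities by energy/production indicators.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(0)
# Rows = municipalities; columns = invented indicators, e.g.
# [rural electricity consumption per farm, crop output per hectare, irrigated share].
indicators = rng.random((120, 3))

# Standardize so no indicator dominates the Euclidean distances.
Z = zscore(indicators, axis=0)

# Ward's method merges, at each step, the pair of clusters whose fusion
# least increases the total within-cluster variance.
tree = linkage(Z, method="ward")

# Cut the dendrogram into six groups, as in the study.
groups = fcluster(tree, t=6, criterion="maxclust")
print(np.bincount(groups)[1:])  # number of municipalities per group
```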

Relevance: 60.00%

Abstract:

Asian rust is currently the main disease of the soybean crop. It is difficult to control because infection starts at the bottom of the plants, where penetration of spray droplets is hardest. Fine droplets have been used with the intention of improving penetration and increasing the efficiency of agrochemicals, but they are easily lost by drift. New products have been developed to increase the deposition of droplets on targets. The aim of this work was to evaluate the capacity of TA-35 to improve the deposition of fungicide spray solutions, with or without mineral oil, in aerial and ground applications. A 3x2 factorial design was used: three spray solutions composed of Priori Xtra (concentrated suspension of azoxystrobin 200 g L-1 + cyproconazole 80 g L-1) mixed with the adjuvants Nimbus (emulsifiable concentrate containing aliphatic hydrocarbons 428 g L-1) and TA-35 (soluble concentrate containing sodium lauryl ether sulfate, surfactants, sequestering agents and emulsifiers), in aerial and ground applications. Ground applications used 50 L ha-1 and TXA 8002 VS spray nozzles; aerial applications used 15 L ha-1 and a Turboaero atomizer, both applying fine droplets. The Brilliant Blue tracer (FD&C No. 1) was used to determine the deposits, with glass slides as targets to collect the spray droplets. After extracting the tracer from the targets with distilled water, the samples were analyzed by spectrophotometry, making it possible to quantify the tracer deposited on each glass slide. A study to evaluate possible losses of the tracer by degradation or retention was also carried out. The treatments were compared by the statistical method of confidence intervals for differences between means at a 95% confidence level (CI95%). Degradation or retention of the tracer occurred between the application of the droplets and the processing of the samples. The mean depositions in the presence of TA-35 were the highest for both sprayers; however, there were no significant differences among the treatments. Assessing the viability of TA-35 use may require other parameters or complementary studies.
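
A minimal sketch of the comparison criterion named above, assuming it amounts to a two-sample 95% confidence interval for the difference between treatment means (the exact formulation used in the study is not given): if the interval excludes zero, the treatments are declared different.

```python
# Hypothetical sketch: 95% CI for the difference between two treatment means.
import numpy as np
from scipy import stats

def ci_diff_means(a, b, conf=0.95):
    """Welch-type confidence interval for mean(a) - mean(b)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    # Welch-Satterthwaite degrees of freedom
    df = se**4 / ((a.var(ddof=1) / len(a))**2 / (len(a) - 1)
                  + (b.var(ddof=1) / len(b))**2 / (len(b) - 1))
    t = stats.t.ppf(0.5 + conf / 2, df)
    return diff - t * se, diff + t * se

# Invented deposition values (tracer mass per slide), for illustration only.
with_ta35 = [1.8, 2.1, 1.9, 2.3, 2.0]
without_ta35 = [1.6, 1.9, 1.7, 2.0, 1.8]
lo, hi = ci_diff_means(with_ta35, without_ta35)
print(f"95% CI for the difference: [{lo:.3f}, {hi:.3f}]")
# If the interval contains zero, the difference is not significant at 5%.
```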

Relevance: 60.00%

Abstract:

This paper investigates postgraduate students' perceptions of the Internet as an enabler of scholarship. The specific objectives of the study are to ascertain the perceptions held by postgraduate students of Internet usage as an enabler of scholarship, and to determine what interventions are necessary to facilitate postgraduate students' adoption of the Internet as a tool for learning and research. The subjects of the study are the postgraduate students of the University of Ibadan, Nigeria. A random sample of 100 students was selected, with representation from each faculty of the university. A questionnaire with a 12-item scale was designed and administered. Data analysis was done using the chi-square statistical method. The results show that the majority of the postgraduate students have positive perceptions of the Internet as an enabler of information sourcing for learning and research. However, some of these students have low self-efficacy in Internet use for information sourcing.
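
A minimal sketch of the kind of chi-square analysis described, assuming responses are tallied into categories and tested against a uniform expectation (the study's actual contingency structure is not reported):

```python
# Hypothetical sketch: chi-square goodness-of-fit on questionnaire tallies.
from scipy.stats import chisquare

# Invented counts of agree/neutral/disagree responses to one scale item.
observed = [62, 23, 15]

# Under the null hypothesis, responses are spread evenly over the categories.
stat, p = chisquare(observed)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
# A small p-value indicates perceptions deviate from a uniform split.
```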

Relevance: 60.00%

Abstract:

The non-classical human leukocyte antigen (HLA) class I genes present a very low rate of variation. So far, only 10 HLA-E alleles encoding three proteins have been described, but only two are frequently found in worldwide populations. Because of its historical background, the Brazilian population is very suitable for population genetic studies. Therefore, 104 bone marrow donors from Brazil were evaluated for HLA-E exons 1-4. Seven variation sites were found, including two known single nucleotide polymorphisms (SNPs) at positions +424 and +756 and five new SNPs at positions +170 (intron 1), +1294 (intron 3), +1625, +1645 and +1857 (exon 4). Haplotyping analysis revealed eight haplotypes: three known as E*01:01:01, E*01:03:01 and E*01:03:02:01, and five new HLA-E alleles carrying the new variation sites. The HLA-E*01:01:01 allele was the predominant haplotype (62.50%), followed by E*01:03:02:01 (24.52%). Selective neutrality tests disclosed an interesting pattern of selective pressures in which balancing selection is probably shaping allele frequency distributions at an SNP in exon 3 (codon 107), while sequence diversity at exon 4 and in the non-coding regions faces significant purifying pressure. Even in an admixed population such as the Brazilian one, the HLA-E locus is very conserved, presenting few polymorphic SNPs in the coding region.
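
The abstract does not name the specific neutrality tests used; as one standard member of that family, the sketch below computes Tajima's D from the number of segregating sites and the average pairwise difference, using invented example values.

```python
# Sketch of Tajima's D, a common selective-neutrality test (the specific
# tests used in the study are not named in the abstract).
import math

def tajimas_d(n, S, pi):
    """n: sample size (sequences), S: segregating sites, pi: mean pairwise difference."""
    a1 = sum(1.0 / i for i in range(1, n))
    a2 = sum(1.0 / i**2 for i in range(1, n))
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n**2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1 = c1 / a1
    e2 = c2 / (a1**2 + a2)
    return (pi - S / a1) / math.sqrt(e1 * S + e2 * S * (S - 1))

# Invented example: 208 chromosomes (104 donors), 7 segregating sites.
print(tajimas_d(n=208, S=7, pi=1.9))
# D > 0 suggests balancing selection; D < 0 an excess of rare variants.
```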

Relevance: 60.00%

Abstract:

We report the measurement of direct photons at midrapidity in Au + Au collisions at √s_NN = 200 GeV. The direct photon signal was extracted for the transverse momentum range 4 GeV/c < p_T < 22 GeV/c, using a statistical method to subtract decay photons from the inclusive photon sample. The direct photon nuclear modification factor R_AA was calculated as a function of p_T for different Au + Au collision centralities using the measured p + p direct photon spectrum and compared to theoretical predictions. R_AA was found to be consistent with unity for all centralities over the entire measured p_T range. Theoretical models that account for modifications of initial direct photon production due to modified parton distribution functions in Au and the different isospin composition of the nuclei predict a modest deviation of R_AA from unity; they are consistent with the data. Models with compensating effects of the quark-gluon plasma on high-energy photons, such as suppression of jet-fragmentation photons and induced-photon bremsstrahlung from partons traversing the medium, are also consistent with this measurement.
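
For reference, the nuclear modification factor compares the per-collision direct photon yield in Au + Au to the p + p yield scaled by the mean number of binary nucleon-nucleon collisions ⟨N_coll⟩ for a given centrality class:

```latex
R_{AA}(p_T) = \frac{dN^{\gamma}_{AA}/dp_T}{\langle N_{\mathrm{coll}} \rangle \, dN^{\gamma}_{pp}/dp_T}
```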

Relevance: 60.00%

Abstract:

In this work, the reduction reaction of the herbicide paraquat was used to obtain analytical signals with the electrochemical techniques of differential pulse voltammetry, square wave voltammetry and multiple square wave voltammetry. Analytes were prepared with laboratory purified water and with natural water samples (from the Mogi-Guaçu River, SP). The techniques were applied to 1.0 mol L-1 Na2SO4 solutions at pH 5.5 containing paraquat concentrations in the range of 1 to 10 μmol L-1, using a gold ultramicroelectrode. Five replicate experiments were conducted, and in each the mean peak currents obtained at -0.70 V vs. Ag/AgCl yielded excellent linear relationships with pesticide concentration. The slopes of the calibration plots (method sensitivity) were 4.06 × 10^-3, 1.07 × 10^-2 and 2.95 × 10^-2 A mol^-1 L for purified water by differential pulse voltammetry, square wave voltammetry and multiple square wave voltammetry, respectively. For river water samples, the slopes were 2.60 × 10^-3, 1.06 × 10^-2 and 3.35 × 10^-2 A mol^-1 L, respectively, showing a small interference from natural matrix components in paraquat determination. The detection limits for paraquat were calculated by two distinct methodologies: the one proposed by IUPAC and a statistical method. The values obtained with multiple square wave voltammetry were 0.002 and 0.12 μmol L-1, respectively, for purified water electrolytes. However, when the detection limit from the IUPAC recommendation is inserted into the calibration curve equation, the predicted analytical signal (reduction current) is smaller than the one experimentally observed for the blank solution under the same conditions. This is inconsistent with the definition of the detection limit, and thus the IUPAC methodology requires further discussion. The same conclusion can be drawn from the analysis of the detection limits obtained with the other techniques studied.
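
A minimal sketch contrasting the two detection-limit recipes mentioned, under common textbook assumptions (an IUPAC-style LOD = 3·s_blank/slope versus a limit based on the calibration regression; the study's exact statistical procedure is not specified). All data are invented.

```python
# Hypothetical sketch: two ways to estimate a detection limit from a calibration.
import numpy as np

# Invented calibration data: concentration (umol/L) vs peak current (A).
conc = np.array([1, 2, 4, 6, 8, 10], dtype=float)
current = 2.95e-2 * 1e-6 * conc + np.array([1, -2, 1, 3, -1, 0]) * 1e-9

slope, intercept = np.polyfit(conc, current, 1)

# (1) IUPAC-style: 3 x standard deviation of blank replicates / slope.
blank = np.array([2.1, 1.8, 2.3, 1.9, 2.0]) * 1e-9  # invented blank currents
lod_iupac = 3 * blank.std(ddof=1) / slope

# (2) Statistical: use the residual standard error of the regression instead.
residuals = current - (slope * conc + intercept)
s_reg = residuals.std(ddof=2)  # n - 2 degrees of freedom for a line fit
lod_stat = 3 * s_reg / slope

print(f"IUPAC-style LOD:      {lod_iupac:.3g} umol/L")
print(f"Regression-based LOD: {lod_stat:.3g} umol/L")
```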

Relevance: 60.00%

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes that are strongly correlated with the groups of individuals. Even though methods to analyze the data are now well developed and close to reaching a standard organization (through the effort of international projects like the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to encounter a clinician's question for which no compelling statistical method exists. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at open problems posed by clinicians for specific experimental designs.

Chapter 1 starts from a necessary biological introduction and reviews microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analysis in the rest of the dissertation. Chapter 2 provides a critical review of standard analysis methods, stressing their main problems.

Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. Experimental design is a crucial starting point for obtaining reasonable results: in a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists of a reiterated application of SAM, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists as differentially expressed. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two data sets simulated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe and SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score > 300 and separates the data into two clusters by hierarchical clustering. We also report extra-assay validation in terms of differentially expressed genes. Although the standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data.

Chapter 4 describes a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4]. When looking at microarray data in a prognostic and diagnostic clinical framework, not only differences can play a crucial role; in some cases similarities can give useful and sometimes even more important information. Given three classes, the goal could be to establish, with a certain level of confidence, whether the third is similar to the first or to the second. In this work we show that the Relevance Vector Machine (RVM) could be a solution to the limitations of standard supervised classification. RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM). Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, property of any practical pattern recognition system. We focused on a three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3 and then evaluate the third class, G2, as a test set to obtain the probability that each G2 sample belongs to class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. This result had been conjectured in the literature, but no measure of significance had been given before.
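
The MultiSAM resampling loop lends itself to a compact sketch. The version below substitutes a plain two-sample t-test for the actual SAM statistic (which adds a fudge factor to the denominator), so it only illustrates the reiterated-subsampling and probe-scoring logic, not SAM itself; all names, data and thresholds are illustrative.

```python
# Illustrative sketch of the MultiSAM scoring loop; a t-test stands in
# for the real SAM statistic, so this shows the structure, not the method itself.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_probes, n_lpc, n_mpc = 500, 8, 60
lpc = rng.normal(0, 1, (n_probes, n_lpc))   # less populated class
mpc = rng.normal(0, 1, (n_probes, n_mpc))   # more populated class
mpc[:25] += 1.5                             # 25 truly changed probes

scores = np.zeros(n_probes, dtype=int)
for _ in range(1000):
    # Draw a random subsample of the MPC of the same size as the LPC.
    sub = mpc[:, rng.choice(n_mpc, size=n_lpc, replace=False)]
    _, p = stats.ttest_ind(lpc, sub, axis=1)
    scores += (p < 0.01)                    # probe recurs in this run's list

selected = np.flatnonzero(scores > 300)     # score threshold as in the text
print(len(selected), "probes selected")
```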

Relevance: 60.00%

Abstract:

Widespread occurrence of pharmaceutical residues has been reported in aquatic ecosystems. However, their toxic effects on aquatic biota remain unclear. Generally, acute toxicity has been assessed in laboratory experiments, while chronic toxicity studies have rarely been performed. The assessment of mixture effects also appears important, since pharmaceuticals never occur in waters alone. The aim of the present work is to evaluate the acute and chronic toxic response of the crustacean Daphnia magna exposed to single pharmaceuticals and to mixtures. We tested fluoxetine, an SSRI widely prescribed as an antidepressant, and propranolol, a non-selective β-adrenergic receptor-blocking agent used to treat hypertension. Acute immobilization and chronic reproduction tests were performed according to OECD guidelines 202 and 211, respectively. Single chemicals were first tested separately. Toxicity of binary mixtures was then assessed using a fixed-ratio experimental design with concentrations based on Toxic Units. The conceptual model of Concentration Addition (CA) was adopted in this study, as we assumed that the mixture effect mirrors the sum of the single substances for compounds having a similar mode of action. The MixTox statistical method was applied to analyze the experimental results. Results showed a significant deviation from the CA model, indicating antagonism between the chemicals in both the acute and the chronic mixture tests. The study was integrated by assessing the effects of fluoxetine on a battery of biomarkers, in order to evaluate the biological vulnerability of the organism to the low pharmaceutical concentrations occurring in the aquatic environment. We assessed acetylcholinesterase and glutathione S-transferase enzymatic activities and malondialdehyde production. No treatment induced significant alteration of the biomarkers with respect to the control. The biological assays and the MixTox model application proved to be useful tools for pharmaceutical risk assessment. Although promising, the application of biomarkers in Daphnia magna needs further elucidation.
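
A brief sketch of the Concentration Addition reference model used above: under CA, concentrations are expressed in Toxic Units (concentration divided by that compound's own EC50), and the mixture is predicted to reach 50% effect when the toxic units sum to one. The numbers below are invented.

```python
# Hypothetical sketch: Concentration Addition prediction for a binary mixture.

def ca_mixture_ec50(fractions, ec50s):
    """EC50 of a mixture under Concentration Addition.
    fractions: proportion of each compound in the mixture (sums to 1).
    ec50s: single-substance EC50s, in the same order and units."""
    return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

# Invented single-substance EC50s (mg/L) for a 50:50 mixture.
ec50_fluoxetine, ec50_propranolol = 0.8, 1.6
predicted = ca_mixture_ec50([0.5, 0.5], [ec50_fluoxetine, ec50_propranolol])
print(f"CA-predicted mixture EC50: {predicted:.2f} mg/L")
# An observed mixture EC50 well above this prediction indicates antagonism,
# the kind of deviation reported in the study.
```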

Relevance: 60.00%

Abstract:

This thesis presents the design, construction, commissioning and characterization of a novel Penning trap within the experiment to determine the g-factor of the proton. The distinctive feature of this trap is that the field lines of an external homogeneous magnetic field are distorted by a ferromagnetic ring electrode at the center of the trap. The inhomogeneous part of the resulting magnetic field, the so-called magnetic bottle, is quantified by the coefficient B2 = 297(10) mT/mm^2 of the second-order term of the spatial dependence of the field. Such an unusually strong field inhomogeneity is a prerequisite for detecting the spin orientation of the proton by means of the continuous Stern-Gerlach effect. This effect is based on the coupling, arising in the inhomogeneous magnetic field, of the spin degree of freedom of the trapped proton to one of its eigenfrequencies. A spin transition can thus be detected via a jump in frequency. The frequency change to be detected is proportional to B2 and to the ratio between the proton's magnetic moment and its mass, which is extremely small in the case of the proton. The technical challenges posed by the required high inhomogeneity of the magnetic field demand a thorough knowledge and control of the properties of the Penning trap and of the experimental conditions. The Penning trap developed in this thesis enabled the first non-destructive detection of spin quantum jumps of a single trapped proton, a breakthrough for the experiment to determine the g-factor directly with the targeted relative precision of 10^-9. Using a statistical method, the Larmor and cyclotron frequencies of the proton in the inhomogeneous magnetic field of the trap were determined, from which the g-factor was obtained with a relative precision of 8.9 × 10^-6. The measurement procedures presented here and the experimental setup can be transferred to an equivalent experiment for determining the g-factor of the antiproton with the same precision, which would constitute the first step toward a new stringent test of CPT symmetry in the baryonic sector.
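
For orientation, the g-factor follows from the ratio of the two measured frequencies; with ν_L the Larmor (spin-precession) frequency and ν_c the free cyclotron frequency of the proton in the same magnetic field:

```latex
\frac{g_p}{2} = \frac{\nu_L}{\nu_c}
```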

Relevance: 60.00%

Abstract:

INTRODUCTION: Apical surgery has seen continuous development with regard to equipment and surgical technique. However, there is still a shortage of evidence-based information regarding healing determinants. The objective of this meta-analysis was to review clinical articles on apical surgery with root-end filling in order to assess potential prognostic factors. METHODS: An electronic search of the PubMed and Cochrane databases was performed in 2008. Only studies with clearly defined healing criteria were included, and data for at least two categories per prognostic factor had to be reported. Prognostic factors were divided into patient-related, tooth-related, or treatment-related factors. The reported percentages of healed teeth ("the healed rate") were pooled per category. The statistical method of Mantel-Haenszel was applied to estimate the odds ratios and their 95% confidence intervals. RESULTS: With regard to tooth-related factors, the following categories were significantly associated with higher healed rates: cases without preoperative pain or signs, cases with good density of root canal filling, and cases with absence or size ≤ 5 mm of periapical lesion. With regard to treatment-related factors, cases treated with the use of an endoscope tended to have higher healed rates than cases without the use of an endoscope. CONCLUSIONS: Although the clinician may be able to control treatment-related factors (by choosing a certain technique), patient- and tooth-related factors are usually beyond the surgeon's power. Nevertheless, patient- and tooth-related factors should be considered as important prognostic determinants when planning or weighing apical surgery against treatment alternatives.
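
A small sketch of the Mantel-Haenszel pooling step referred to above: each study contributes a 2x2 table (healed/not healed by factor category), and the stratified tables are combined into one common odds ratio. All counts below are invented.

```python
# Hypothetical sketch: Mantel-Haenszel pooled odds ratio across studies.

# Each stratum (study) is a 2x2 table (a, b, c, d) =
# (healed with factor, not healed with factor,
#  healed without factor, not healed without factor).
strata = [
    (45, 5, 38, 12),   # invented study 1
    (60, 10, 52, 18),  # invented study 2
    (30, 6, 25, 9),    # invented study 3
]

# OR_MH = sum(a*d/n) / sum(b*c/n), with n the stratum total.
num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
print(f"Mantel-Haenszel OR: {num / den:.2f}")
# OR > 1 here would mean the factor category is associated with higher
# odds of healing, pooled over studies.
```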

Relevance: 60.00%

Abstract:

Equivalence testing is growing in use in scientific research outside of its traditional role in the drug approval process. Largely due to its ease of use and its recommendation in United States Food and Drug Administration guidance, the most common statistical method for testing (bio)equivalence is the two one-sided tests procedure (TOST). Like classical point-null hypothesis testing, TOST is subject to multiplicity concerns as more comparisons are made. In this manuscript, a condition that bounds the family-wise error rate (FWER) using TOST is given. This condition then leads to a simple solution for controlling the FWER. Specifically, we demonstrate that if all pairwise comparisons of k independent groups are being evaluated for equivalence, then simply scaling the nominal Type I error rate down by (k - 1) is sufficient to maintain the family-wise error rate at the desired value or less. The resulting rule is much less conservative than the equally simple Bonferroni correction. An example of equivalence testing in a non-drug-development setting is given.
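
A minimal sketch of TOST with the scaled error rate described above, assuming mean-difference equivalence bounds of ±delta for k groups compared pairwise (data and bounds invented): each one-sided test is run at level alpha/(k - 1).

```python
# Hypothetical sketch: all pairwise TOST with the alpha/(k-1) FWER scaling.
from itertools import combinations
import numpy as np
from scipy import stats

def tost(a, b, delta, alpha):
    """Two one-sided t-tests for |mean(a) - mean(b)| < delta (pooled variance).
    Equivalence is declared if both one-sided p-values fall below alpha."""
    n1, n2 = len(a), len(b)
    sp2 = ((n1 - 1) * np.var(a, ddof=1) + (n2 - 1) * np.var(b, ddof=1)) / (n1 + n2 - 2)
    se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
    df = n1 + n2 - 2
    d = np.mean(a) - np.mean(b)
    p_lower = stats.t.sf((d + delta) / se, df)   # H0: difference <= -delta
    p_upper = stats.t.cdf((d - delta) / se, df)  # H0: difference >= +delta
    return max(p_lower, p_upper) < alpha

rng = np.random.default_rng(1)
groups = [rng.normal(10, 1, 30) for _ in range(4)]  # k = 4 invented groups
k = len(groups)
alpha = 0.05 / (k - 1)  # the scaling shown to bound the FWER at 0.05
for i, j in combinations(range(k), 2):
    eq = tost(groups[i], groups[j], delta=1.0, alpha=alpha)
    print(f"groups {i} vs {j}: {'equivalent' if eq else 'not shown equivalent'}")
```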

Relevance: 60.00%

Abstract:

Aims: Periodic leg movements in sleep (PLMS) are a frequent finding in polysomnography. Most patients with restless legs syndrome (RLS) display PLMS. However, since PLMS are also often recorded in healthy elderly subjects, the clinical significance of PLMS is still discussed controversially. Leg movements seen concurrently with arousals in obstructive sleep apnoea (OSA) may also appear periodically. Quantitative assessment of the periodicity of LM/PLM as measured by inter-movement intervals (IMI) is difficult. This is mainly due to influencing factors like sleep architecture and sleep stage, medication, inter- and intra-patient variability, and the arbitrary amplitude and sequence criteria, which tend to broaden the IMI distributions or even make them multi-modal. Methods: Here a statistical method is presented that enables such effects to be eliminated from the raw data before analysing the statistics of the IMI. Rather than studying the absolute size of the IMI (measured in seconds), we focus on the shape of their distribution (suitably normalized IMI). To this end we employ methods developed in Random Matrix Theory (RMT). Patients: The periodicity of leg movements (LM) in four patient groups (10 to 15 patients each) showing LM without PLMS (group 1), OSA without PLMS (group 2), PLMS and OSA (group 3), and PLMS without OSA (group 4) is compared. Results: The IMI of patients without PLMS (groups 1 and 2) and with PLMS (groups 3 and 4) are statistically different. In patients without PLMS the distribution of normalized IMI closely resembles that of random events. In contrast, the IMI of PLMS patients show features of periodic systems (e.g. a pendulum) when studied in this normalized manner. Conclusions: For quantifying PLMS periodicity, proper normalization of the IMI is crucial. Without this procedure, important features are hidden when grouping LM/PLM over whole nights or across patients. The clinical significance of PLMS might be elucidated by properly separating random LM from LM that show features of periodic systems.
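
The normalization idea can be sketched simply: rescale each patient's inter-movement intervals by their own mean, pool the rescaled values, and compare the pooled distribution to the exponential shape expected for random (Poisson-like) events. The RMT machinery itself is beyond a short sketch; everything below is illustrative.

```python
# Illustrative sketch: normalized inter-movement intervals vs a random baseline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Invented IMI series for a few "patients" (seconds between leg movements).
patients = [rng.exponential(scale=s, size=200) for s in (20.0, 35.0, 50.0)]

# Normalize each series by its own mean so patients become comparable;
# the shape of the distribution, not the absolute scale, is what matters.
pooled = np.concatenate([imi / imi.mean() for imi in patients])

# For purely random events the normalized IMI should follow Exponential(1).
stat, p = stats.kstest(pooled, "expon")
print(f"KS vs exponential: D = {stat:.3f}, p = {p:.3f}")
# PLMS patients would instead show a peaked (periodic-like) IMI distribution,
# which this test would reject as exponential.
```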

Relevance: 60.00%

Abstract:

Can one observe an increasing level of individual lack of orientation because of rapid social change in modern societies? This question is examined using data from a representative longitudinal survey in Germany conducted in 2002–04. The study examines the role of education, age, sex, region (east/west), and political orientation for the explanation of anomia and its development. First we present the different sources of anomie in modern societies, based on the theoretical foundations of Durkheim and Merton, and introduce the different definitions of anomia, including our own cognitive version. Then we deduce several hypotheses from the theory, which we test by means of longitudinal data for the period 2002–04 in Germany using the latent growth curve model as our statistical method. The empirical findings show that all the sociodemographic variables, including political orientation, are strong predictors of the initial level of anomia. Regarding the development of anomia over time (2002–04), only the region (west) has a significant impact. In particular, the results of a multi-group analysis show that western German people with a right-wing political orientation become more anomic over this period. The article concludes with some theoretical implications.
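
A latent growth curve model can be approximated by a random-intercept, random-slope mixed model; the sketch below uses statsmodels in that spirit, with an invented long-format panel (all columns and data are hypothetical, and a full SEM treatment as in the study would require dedicated software).

```python
# Hypothetical sketch: growth-curve-style model via a linear mixed model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n, waves = 200, 3  # invented panel: 200 respondents, surveys 2002-2004

df = pd.DataFrame({
    "pid": np.repeat(np.arange(n), waves),
    "time": np.tile(np.arange(waves), n),        # 0, 1, 2 = survey wave
    "west": np.repeat(rng.integers(0, 2, n), waves),
})
# Invented anomia score: the slope differs by region, as in the reported finding.
df["anomia"] = (2.0 + 0.1 * df["time"] + 0.2 * df["west"] * df["time"]
                + rng.normal(0, 0.5, len(df)))

# Random intercept and slope per person; time x region tests slope differences.
model = smf.mixedlm("anomia ~ time * west", df, groups=df["pid"],
                    re_formula="~time").fit()
print(model.summary())
```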

Relevance: 60.00%

Abstract:

SUMMARY Split-mouth designs first appeared in dental clinical trials in the late sixties. The main advantage of this study design is its efficiency in terms of sample size, as the patients act as their own controls. Cited disadvantages relate to carry-across effects (contamination, or spilling of the effects of one intervention onto another), period effects if the interventions are delivered at different time periods, difficulty in finding similar comparison sites within patients, and the requirement for more complex data analysis. Although some additional thought is required when utilizing a split-mouth design, the efficiency of this design is attractive, particularly in orthodontic clinical studies where carry-across effects, period effects and dissimilarity between intervention sites do not pose a problem. Selection of an appropriate research design, intervention protocol and statistical method accounting for both the reduced variability and potential clustering effects within patients should be considered for the trial results to be valid.

Relevance: 60.00%

Abstract:

The Atlantic subpolar gyre (SPG) is one of the main drivers of decadal climate variability in the North Atlantic. Here we analyze its dynamics in pre-industrial control simulations of 19 different comprehensive coupled climate models. The analysis is based on a recently proposed description of the SPG dynamics that found the circulation to be potentially bistable due to a positive feedback mechanism including salt transport and enhanced deep convection in the SPG center. We employ a statistical method to identify multiple equilibria in time series that are subject to strong noise and analyze composite fields to assess whether the bistability results from the hypothesized feedback mechanism. Because noise dominates the time series in most models, multiple circulation modes can unambiguously be detected in only six models. Four of these six models confirm that the intensification is caused by the positive feedback mechanism.
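
The abstract does not spell out the detection method; one common way to identify two circulation modes in a noisy time series is to compare one- and two-component Gaussian mixture fits by BIC, as sketched below with synthetic data.

```python
# Illustrative sketch: detecting two regimes in a noisy index via mixture fits.
# (The study's actual statistical method is not specified in the abstract.)
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(11)
# Synthetic SPG-strength-like index: two regimes plus strong noise.
weak = rng.normal(-1.0, 0.8, 600)
strong = rng.normal(1.0, 0.8, 400)
index = np.concatenate([weak, strong]).reshape(-1, 1)

bics = {}
for k in (1, 2):
    gm = GaussianMixture(n_components=k, random_state=0).fit(index)
    bics[k] = gm.bic(index)
print(bics)
# A clearly lower BIC for k=2 supports the presence of two circulation modes;
# with stronger noise the two components blur and detection becomes ambiguous.
```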