976 results for Nonparametric statistical analysis


Relevance:

90.00%

Publisher:

Abstract:

Includes bibliographical references.


Two different slug test field methods conducted in wells completed in a Puget Lowland aquifer are examined for systematic error resulting from the water column displacement technique. Slug tests using the standard slug-rod method and the pneumatic method were repeated on the same wells; hydraulic conductivity estimates were calculated according to Bouwer & Rice and to Hvorslev, and the paired estimates were then compared using a non-parametric statistical test. Practical aspects of performing the tests in real-life settings are also weighed in the method comparison. Statistical analysis indicates that the slug-rod method yields hydraulic conductivity values up to 90% larger than the pneumatic method, with at least 95% certainty that the error is method-related. This confirms, in a real-world setting, the existence of a slug-rod bias previously demonstrated by others in synthetic aquifers. In addition to more accurate values, the pneumatic method requires less field labor and less decontamination, and provides the ability to control the magnitude of the initial displacement, making it the superior slug test procedure.
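The study's data are not reproduced in the abstract, but the shape of the analysis can be sketched: Hvorslev's formula converts a slug test's recovery time into a hydraulic conductivity estimate, and a nonparametric paired test checks whether one method systematically exceeds the other. The numbers below are purely illustrative, and a sign test stands in for whichever nonparametric test the authors actually used.

```python
import math

def hvorslev_k(r, L_e, R, t37):
    """Hvorslev estimate of hydraulic conductivity for a well screen:
    K = r^2 * ln(L_e / R) / (2 * L_e * t37), where r is the casing
    radius, L_e the screen length, R the screen radius, and t37 the
    time for the head to recover to 37% of the initial displacement."""
    return r**2 * math.log(L_e / R) / (2.0 * L_e * t37)

def sign_test_p(diffs):
    """Exact two-sided sign test on paired differences: a simple
    nonparametric check of whether slug-rod K systematically exceeds
    pneumatic K on the same wells."""
    pos = sum(d > 0 for d in diffs)
    n = sum(d != 0 for d in diffs)
    k = max(pos, n - pos)
    tail = sum(math.comb(n, i) for i in range(k, n + 1)) / 2**n
    return min(1.0, 2.0 * tail)

# Illustrative paired K estimates (m/s) for six hypothetical wells
k_slugrod   = [3.1e-5, 4.0e-5, 2.2e-5, 5.6e-5, 1.9e-5, 3.3e-5]
k_pneumatic = [1.8e-5, 2.6e-5, 1.5e-5, 3.0e-5, 1.2e-5, 2.0e-5]
p = sign_test_p([a - b for a, b in zip(k_slugrod, k_pneumatic)])
```

With six same-sign differences the exact two-sided p-value is 2/64 ≈ 0.031, below the 0.05 threshold, mirroring the "at least 95% certainty" phrasing of the abstract.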


beta2-Laminin is important for the formation of neuromuscular junctions in vertebrates. Previously, we inactivated the gene that encodes beta2-laminin in mice and observed predominantly prejunctional structural defects. In this study, we used both intra- and extracellular recording methods to investigate evoked neurotransmission in beta2-laminin-deficient mice from postnatal day 8 (P8) through day 18 (P18). Our results confirmed that there was a decrease in the frequency of spontaneous release, but no change in the postjunctional response to such release. Analysis of evoked neurotransmission showed an increase in the frequency of stimuli that failed to elicit an evoked postjunctional response in the mutants compared to littermate controls, resulting in a 50% reduction in mean quantal content at mutant terminals. Compared to littermate controls, beta2-laminin-deficient terminals also showed greater synaptic depression when subjected to high-frequency stimulation. Furthermore, the paired-pulse ratio of the first two stimuli was significantly lower in beta2-laminin mutant terminals. Statistical analysis of the binomial parameters of release showed that the decrease in quantal content was due to a decrease in the number of release sites, without any significant change in the average probability of release. This suggestion was supported by the observation of fewer synaptic vesicle protein 2 (SV2)-positive varicosities in beta2-laminin-deficient terminals and by ultrastructural observations showing smaller terminal profiles and increased Schwann cell invasion in beta2-laminin mutants; the differences between beta2-laminin mutants and wild-type mice were the same at both P8 and P18. From these results we conclude that beta2-laminin plays a role in the early structural development of the neuromuscular junction. We also suggest that transmitter release activity may act as a deterrent to Schwann cell invasion in the absence of beta2-laminin.
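The binomial description of release referred to above reduces to m = n·p (quantal content = number of release sites × average release probability), and in the Poisson limit m can also be estimated from the failure rate as m = ln(N/N0). A minimal sketch with hypothetical numbers, not the paper's data:

```python
import math

def quantal_content_from_failures(n_stimuli, n_failures):
    """Estimate mean quantal content m from the failure rate,
    assuming Poisson statistics of release: m = ln(N / N_failures)."""
    return math.log(n_stimuli / n_failures)

def binomial_m(n_sites, p_release):
    """Binomial model of release: m = n * p, where n is the number of
    release sites and p the average probability of release."""
    return n_sites * p_release

# Illustrative numbers: halving the number of release sites halves m
# even when the average release probability p is unchanged, the
# pattern the abstract reports for beta2-laminin mutants.
m_control = binomial_m(n_sites=10, p_release=0.4)
m_mutant  = binomial_m(n_sites=5,  p_release=0.4)
```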


An understanding of inheritance requires comprehension of genetic processes at all levels, from molecules to populations. Frequently genetics courses are separated into molecular and organismal genetics and students may fail to see the relationships between them. This is particularly true with human genetics, because of the difficulties in designing experimental approaches which are consistent with ethical restrictions, student abilities and background knowledge, and available time and materials. During 2005 we used analysis of single nucleotide polymorphisms (SNPs) in two genetic regions to enhance student learning and provide a practical experience in human genetics. Students scanned databases to discover SNPs in a gene of interest, used software to design PCR primers and a restriction enzyme based assay for the alleles, and carried out an analysis of the SNP on anonymous individual and family DNAs. The project occupied eight to ten hours per week for one semester, with some time spent in the laboratory and some spent in database searching, reading and writing the report. In completing their projects, students acquired a knowledge of Mendel’s first law (through looking at inheritance patterns), Mendel’s second law and the exceptions (the concepts of linkage and linkage disequilibrium), DNA structure (primer design and restriction enzyme analysis) and function (SNPs in coding and non-coding regions), population genetics and the statistical analysis of allele frequencies, genomics, bioinformatics and the ethical issues associated with the use of human samples. They also developed skills in presentation of results by publication and conference participation. Deficiencies in their understanding (for example of inheritance patterns, gene structure, statistical approaches and report writing) were detected and guidance given during the project. SNP analysis was found to be a powerful approach to enhance and integrate student understanding of genetic concepts.
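The population-genetics component of such a project — estimating allele frequencies from genotype counts and testing them against Hardy-Weinberg expectations — can be sketched as follows. The genotype counts are hypothetical, chosen only to illustrate the calculation students would perform on their SNP data:

```python
def allele_freqs(n_AA, n_Aa, n_aa):
    """Allele frequencies p and q from genotype counts."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)
    return p, 1 - p

def hwe_chi2(n_AA, n_Aa, n_aa):
    """Chi-square statistic of the observed genotype counts against
    Hardy-Weinberg expected counts (p^2, 2pq, q^2) * n."""
    n = n_AA + n_Aa + n_aa
    p, q = allele_freqs(n_AA, n_Aa, n_aa)
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_AA, n_Aa, n_aa)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical genotype counts for one SNP assay
p, q = allele_freqs(40, 40, 20)   # p = 0.6, q = 0.4
chi2 = hwe_chi2(40, 40, 20)       # compare to 3.84 (alpha = 0.05, 1 df)
```

Here chi2 ≈ 2.78 < 3.84, so these counts would be consistent with Hardy-Weinberg equilibrium.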


We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, the most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of the approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, using nonlinearity management considerations, we show that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails, and accurately computed marginal probability density functions for soliton parameters by numerically modelling the Fokker-Planck equation with the MMC simulation technique. Moreover, applying the MMC method, we studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We demonstrate that in such systems an analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and the BER penalty. Finally, we present a statistical analysis of the RZ-DPSK optical signal at a direct detection receiver with Mach-Zehnder interferometer demodulation.
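For context, the textbook Gaussian baseline that MMC tail estimation improves upon relates the Q-factor to the bit error rate as BER = ½·erfc(Q/√2). The sketch below shows only this standard approximation, not the thesis's MMC method, which exists precisely because Gaussian assumptions break down in strongly nonlinear links:

```python
import math

def ber_from_q(q):
    """Gaussian approximation of bit error rate from the Q-factor:
    BER = 0.5 * erfc(Q / sqrt(2)). Accurate only when signal
    statistics are Gaussian; nonlinear links generally need tail
    estimation methods such as multicanonical Monte Carlo."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))

ber_q6 = ber_from_q(6.0)   # Q = 6 is the classic "BER ~ 1e-9" operating point
```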


The topic of this thesis is the development of knowledge-based statistical software. The shortcomings of conventional statistical packages are discussed to illustrate the need for software that exhibits a greater degree of statistical expertise, thereby reducing the misuse of statistical methods by those not well versed in the art of statistical analysis. Some of the issues involved in the development of knowledge-based software are presented, and a review is given of some of the systems developed so far. The majority of these have moved away from conventional architectures by adopting what can be termed an expert-systems approach. The thesis then proposes an approach based upon the concept of semantic modelling. By representing some of the semantic meaning of data, it is conceived that a system could examine a request to apply a statistical technique and check whether the use of the chosen technique is semantically sound, i.e. whether the results obtained will be meaningful. Current systems, in contrast, can only perform what may be considered syntactic checks. The prototype system implemented to explore the feasibility of such an approach is presented; it has been designed as an enhanced variant of a conventional statistical package. This involved developing a semantic data model to represent some of the statistically relevant knowledge about data, and identifying sets of requirements that should be met for the application of the statistical techniques to be valid. The areas of statistics covered in the prototype are measures of association and tests of location.
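A toy illustration of the kind of semantic check the thesis proposes: if each variable carries its measurement level, a semantically unsound request (say, a test of location on nominal data) can be rejected before any computation runs. The technique names and level requirements below are hypothetical stand-ins, not the prototype's actual semantic data model:

```python
# Measurement levels in increasing order of structure
LEVELS = ("nominal", "ordinal", "interval", "ratio")

# Hypothetical minimum measurement level each technique requires
REQUIREMENTS = {
    "t_test": "interval",          # a mean of category codes is meaningless
    "median_test": "ordinal",      # medians need only an ordering
    "chi2_association": "nominal", # counts of categories suffice
}

def semantically_valid(technique, level):
    """True if data at `level` meets the technique's minimum level,
    i.e. the result of applying the technique would be meaningful."""
    return LEVELS.index(level) >= LEVELS.index(REQUIREMENTS[technique])

ok  = semantically_valid("t_test", "ratio")    # meaningful request
bad = semantically_valid("t_test", "nominal")  # rejected: syntactically fine, semantically unsound
```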


The central argument of this thesis is that the nature and purpose of corporate reporting has changed over time to become a more outward-looking and forward-looking document, designed to promote the company and its performance to a wide range of shareholders rather than merely to report to its owners upon past performance. It is argued that the discourse of environmental accounting and reporting is one driver of this change, but that this discourse has been set up as being in conflict with the discourse of traditional accounting and performance measurement. The effect of this opposition is that the two have been interpreted as different and incompatible dimensions of performance, with good performance along one dimension achievable only through a sacrifice of performance along the other. Thus a perceived dialectic in performance is believed to exist. One of the principal purposes of this thesis is to explore this perceived dialectic and, through analysis, to show that it does not exist and that there is no incompatibility. This exploration and analysis is based upon an investigation of the inherent inconsistencies in such corporate reports, and makes use of both a statistical analysis and a semiotic analysis of corporate reports and the reported performance of companies along these dimensions. The development of a semiology of corporate reporting is thus one of the significant outcomes of this thesis. A further outcome is a consideration of the implications of the analysis for corporate performance and its measurement. The thesis concludes with a consideration of the way in which the advent of electronic reporting may affect the ability of organisations to maintain the dialectic, and the implications for corporate reporting.


Orthodox contingency theory links effective organisational performance to compatible relationships between the environment and organisation strategy and structure, and assumes that organisations have the capacity to adapt as the environment changes. Recent contributions to the literature on organisation theory claim that the key to effective performance is effective adaptation, which in turn requires the simultaneous reconciliation of efficiency and innovation, afforded by a unique environment-organisation configuration. The literature on organisation theory recognises the continuing confusion caused by the fragmented and often conflicting results from cross-sectional studies. Although the case is made for longitudinal studies which comprehensively describe the evolving relationship between the environment and the organisation, there is little to suggest how such studies should be executed in practice. Typically the choice is between the approaches of the historicised case study and the statistical analysis of large populations, which examine the relationship between environment and organisation strategy and/or structure but ignore the product-process relationship. This study combines the historicised case study with the multi-variable and ordinal-scale approach of statistical analysis to construct an analytical framework which tracks and exposes the environment-organisation-performance relationship over time. The framework examines changes in the environment, strategy and structure and uniquely includes an assessment of the organisation's product-process relationship and its contribution to organisational efficiency and innovation. The analytical framework is applied to examine the evolving environment-organisation relationship of two organisations in the same industry over the same twenty-five-year period, providing a sector perspective of organisational adaptation.
The findings demonstrate the significance of the environment-organisation configuration to the scope and frequency of adaptation and suggest that the level of sector homogeneity may be linked to the level of product-process standardisation.


Citation information: Armstrong RA, Davies LN, Dunne MCM & Gilmartin B. Statistical guidelines for clinical studies of human vision. Ophthalmic Physiol Opt 2011, 31, 123-136. doi: 10.1111/j.1475-1313.2010.00815.x ABSTRACT: Statistical analysis of data can be complex, and different statisticians may disagree as to the correct approach, leading to conflict between authors, editors, and reviewers. The objective of this article is to provide some statistical advice for contributors to optometric and ophthalmic journals, to provide advice specifically relevant to clinical studies of human vision, and to recommend statistical analyses that could be used in a variety of circumstances. In submitting an article in which quantitative data are reported, authors should describe clearly the statistical procedures that they have used and justify each stage of the analysis. This is especially important if more complex or 'non-standard' analyses have been carried out. The article begins with some general comments relating to data analysis concerning sample size and 'power', hypothesis testing, parametric and non-parametric variables, 'bootstrap methods', one- and two-tail testing, and the Bonferroni correction. More specific advice is then given with reference to particular statistical procedures that can be used on a variety of types of data. Where relevant, examples of correct statistical practice are given with reference to recently published articles in the optometric and ophthalmic literature.
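As an example of one of the guidelines mentioned above, the Bonferroni correction for m comparisons tests each p-value against α/m. A minimal sketch with illustrative p-values (not drawn from any of the cited studies):

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni correction: with m comparisons, each p-value is
    tested against the stricter threshold alpha / m. Returns a list
    of (p, significant_after_correction) pairs."""
    threshold = alpha / len(p_values)
    return [(p, p < threshold) for p in p_values]

# Four illustrative comparisons: corrected threshold is 0.05/4 = 0.0125
results = bonferroni([0.004, 0.02, 0.03, 0.20], alpha=0.05)
```

Only the first p-value (0.004) falls below the corrected threshold of 0.0125, even though three of the four would pass an uncorrected 0.05 test.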


In this study, a new entropy measure known as kernel entropy (KerEnt), which quantifies the irregularity in a series, was applied to nocturnal oxygen saturation (SaO2) recordings. A total of 96 subjects suspected of suffering from sleep apnea-hypopnea syndrome (SAHS) took part in the study: 32 SAHS-negative and 64 SAHS-positive subjects. Their SaO2 signals were separately processed by means of KerEnt. Our results show that a higher degree of irregularity is associated with SAHS-positive subjects. Statistical analysis revealed significant differences between the KerEnt values of the SAHS-negative and SAHS-positive groups. The diagnostic utility of this parameter was studied by means of receiver operating characteristic (ROC) analysis. A classification accuracy of 81.25% (81.25% sensitivity and 81.25% specificity) was achieved. Repeated apneas during sleep increase irregularity in SaO2 data. This effect can be measured by KerEnt in order to detect SAHS. This non-linear measure can provide useful information for the development of alternative diagnostic techniques in order to reduce the demand for conventional polysomnography (PSG). © 2011 IEEE.
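The reported figures are mutually consistent: with 64 SAHS-positive and 32 SAHS-negative subjects, 81.25% sensitivity and 81.25% specificity imply 81.25% overall accuracy. A short sketch of the arithmetic; the confusion-matrix counts are inferred from the percentages, not taken from the paper:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and accuracy from confusion-matrix
    counts (true/false positives and negatives)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Counts consistent with the abstract: 64 positives (52 detected),
# 32 negatives (26 correctly ruled out), 96 subjects in total.
sens, spec, acc = diagnostic_metrics(tp=52, fn=12, tn=26, fp=6)
```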


2000 Mathematics Subject Classification: 62H30, 62J20, 62P12, 68T99


The adverse health effects of long-term exposure to lead are well established, with major uptake into the human body occurring mainly through oral ingestion by young children. Lead-based paint was frequently used in homes built before 1978, particularly in inner-city areas. Minority populations experience the effects of lead poisoning disproportionately.

Lead-based paint abatement is costly. In the United States, residents of about 400,000 homes, occupied by 900,000 young children, lack the means to correct lead-based paint hazards. The magnitude of this problem demands research on affordable methods of hazard control. One method is encapsulation, defined as any covering or coating that acts as a permanent barrier between the lead-based paint surface and the environment.

Two encapsulants were tested for reliability and effective life span through an accelerated lifetime experiment that applied stresses exceeding those encountered under normal use conditions. The resulting time-to-failure data were used to extrapolate the failure time under conditions of normal use. Statistical analysis and models of the test data allow forecasting of long-term reliability relative to the 20-year encapsulation requirement. Typical housing material specimens simulating walls and doors coated with lead-based paint were overstressed before encapsulation; a second, unaged set was also tested. Specimens were monitored after the stress test with a surface chemical testing pad to identify the presence of lead breaking through the encapsulant.

Graphical analysis proposed by Shapiro and Meeker and the general log-linear model developed by Cox were used to obtain results. Findings for the 80% reliability time to failure varied, with close to 21 years of life under normal use conditions for encapsulant A. The application of product A on the aged gypsum and aged wood substrates yielded slightly lower times. Encapsulant B had an 80% reliable life of 19.78 years.

This study reveals that encapsulation technologies can offer safe and effective control of lead-based paint hazards and may be less expensive than other options. The U.S. Department of Health and Human Services and the CDC are committed to eliminating childhood lead poisoning by 2010. This ambitious target is feasible, provided there is an efficient application of innovative technology, a goal to which this study aims to contribute.
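The notion of an "80% reliable life" can be illustrated with a Weibull life model, R(t) = exp(-(t/η)^β). The study itself used Shapiro-Meeker graphical analysis and Cox's log-linear model, so the model and parameters below are purely hypothetical, chosen only so the result lands near the 20-year encapsulation requirement:

```python
import math

def weibull_reliable_life(eta, beta, reliability):
    """Time t at which R(t) = reliability for a Weibull life model:
    R(t) = exp(-(t/eta)^beta)  =>  t = eta * (-ln R)^(1/beta)."""
    return eta * (-math.log(reliability)) ** (1.0 / beta)

# Hypothetical scale (years) and shape parameters
t80 = weibull_reliable_life(eta=45.0, beta=2.0, reliability=0.80)
```

With these illustrative parameters the 80%-reliable life comes out at roughly 21 years, comparable to the figure reported for encapsulant A.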


Background: The HIV virus is known for its ability to exploit numerous genetic and evolutionary mechanisms to ensure its proliferation, among them high replication, mutation and recombination rates. Sliding MinPD, a recently introduced computational method [1], was used to investigate the patterns of evolution of serially sampled HIV-1 sequence data from eight patients, with a special focus on the emergence of X4 strains. Unlike other phylogenetic methods, Sliding MinPD combines distance-based inference with a nonparametric bootstrap procedure and automated recombination detection to reconstruct the evolutionary history of longitudinal sequence data. We present serial evolutionary networks as a longitudinal representation of the mutational pathways of a viral population in a within-host environment. The longitudinal representation of the evolutionary networks was complemented with charts of clinical markers to facilitate correlation analysis between pertinent clinical information and the evolutionary relationships. Results: Analysis based on the predicted networks suggests the following: significantly stronger recombination signals (p = 0.003) for the inferred ancestors of the X4 strains; recombination events between different lineages and between putative reservoir virus and virus from a later population; and an early star-like topology observed for four of the patients who died of AIDS. A significantly higher number of recombinants were predicted at sampling points that corresponded to peaks in the viral load levels (p = 0.0042). Conclusion: Our results indicate that serial evolutionary networks of HIV sequences enable systematic statistical analysis of the implicit relations embedded in the topology of the structure, and can greatly facilitate identification of patterns of evolution that can lead to specific hypotheses and new insights. The conclusions of applying our method to empirical HIV data support the conventional wisdom behind the new generation of HIV treatments: to keep the virus in check, viral loads need to be suppressed to almost undetectable levels.
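The nonparametric bootstrap that underlies Sliding MinPD's support values can be illustrated generically: resample two groups with replacement and estimate how often their difference disappears. The scores below are invented for illustration only and are not the study's data; this is the resampling idea, not the Sliding MinPD algorithm itself:

```python
import random

def bootstrap_p_greater(sample_a, sample_b, n_boot=2000, seed=1):
    """Nonparametric bootstrap estimate of P(mean(A) <= mean(B)):
    resample each group with replacement n_boot times and count how
    often the resampled mean of A fails to exceed that of B."""
    rng = random.Random(seed)

    def resample_mean(xs):
        return sum(rng.choice(xs) for _ in xs) / len(xs)

    fails = sum(resample_mean(sample_a) <= resample_mean(sample_b)
                for _ in range(n_boot))
    return fails / n_boot

# Illustrative recombination-signal scores: inferred X4 ancestors vs others
x4_scores    = [0.9, 0.8, 0.85, 0.95, 0.7, 0.88]
other_scores = [0.2, 0.3, 0.25, 0.4, 0.1, 0.35]
p_like = bootstrap_p_greater(x4_scores, other_scores)
```

Because the two illustrative groups do not overlap at all, no resample reverses the ordering, mirroring the small p-value (p = 0.003) reported for the X4 ancestors' recombination signals.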


Studies reveal that in recent decades sleep duration has decreased. Social commitments, such as work and school, are often not aligned with the "biological time" of individuals. Added to this, there is a reduced zeitgeber strength caused by less exposure to daylight and greater exposure to light in the evening. This causes a chronic sleep debt that is offset on free days; in effect, a weekly cycle of sleep restriction and extension called "social jet lag" occurs. Sleep deprivation has been associated with obesity, cancer, and cardiovascular risk. It has been suggested that the autonomic nervous system is a pathway connecting sleep problems to cardiovascular diseases. However, beyond the evidence from studies using models of acute and controlled sleep deprivation, studies are needed to investigate the effects of chronic sleep deprivation as it occurs in social jet lag. The aim of this study was to investigate the influence of social jet lag on circadian rest-activity markers and heart function in medical students. It is a cross-sectional, observational study conducted in the Laboratory of Neurobiology and Biological Rhythmicity (LNRB) at the Department of Physiology, UFRN. Medical students enrolled in the first semester of their course at UFRN took part in the survey. The instruments for data collection were the Munich Chronotype Questionnaire, the Horne and Östberg Morningness-Eveningness Questionnaire, the Pittsburgh Sleep Quality Index, the Epworth Sleepiness Scale, an actimeter, and a heart rate monitor. The variables analysed were descriptive sleep variables, nonparametric circadian indexes (IV60, IS60, L5 and M10), and cardiac indexes in the time domain, frequency domain (LF, HF, LF/HF) and nonlinear domain (SD1, SD2, SD1/SD2). Descriptive, comparative and correlative statistical analysis was performed with SPSS software version 20. Forty-one students participated in the study, 48.8% (20) female and 51.2% (21) male, aged 19.63 ± 2.07 years.
Social jet lag averaged 02:39 h ± 00:55 h, with 82.9% (34) showing social jet lag ≥ 1 h, and there was a negative correlation with the Munich chronotype score, indicating greater sleep deprivation in subjects prone to eveningness. Poor sleep quality was detected in 90.2% (37) (X2 = 26.56, p < 0.001) and excessive daytime sleepiness in 56.1% (23) (X2 = 0.61, p = 0.435). Significant differences were observed in the values of LFnu, HFnu and LF/HF between the groups with social jet lag < 2 h and ≥ 2 h, and social jet lag correlated with LFnu (rs = 0.354, p = 0.023), HFnu (rs = -0.354, p = 0.023) and LF/HF (rs = 0.355, p = 0.023). There was also a negative association between IV60 and the time-domain and nonlinear indexes. It is suggested that chronic sleep deprivation may be associated with increased sympathetic activation, promoting greater cardiovascular risk.
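The Munich Chronotype Questionnaire's core quantity, social jet lag, is the absolute difference between mid-sleep on free days and mid-sleep on work days. A minimal sketch of the arithmetic; the student's schedule below is hypothetical:

```python
def midsleep(onset_h, duration_h):
    """Mid-sleep point on a 0-24 h clock from sleep onset time (hours)
    and sleep duration (hours)."""
    return (onset_h + duration_h / 2.0) % 24

def social_jetlag(ms_free, ms_work):
    """Social jet lag: absolute difference between mid-sleep on free
    days and on work days, wrapped around midnight."""
    d = abs(ms_free - ms_work)
    return min(d, 24 - d)

# Hypothetical student: sleeps 23:30-06:30 on class days, 02:00-10:00 on free days
ms_work = midsleep(23.5, 7.0)   # mid-sleep at 03:00
ms_free = midsleep(2.0, 8.0)    # mid-sleep at 06:00
sjl = social_jetlag(ms_free, ms_work)
```

This hypothetical student carries 3 h of social jet lag, above the ≥ 2 h cut-off the study used to split its groups.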


This study is a variationist sociolinguistic analysis of two speech styles, performance and interview, of a dinner theatre troupe in Ferryland on the Southern Shore of Newfoundland. Five actors and ten of their characters are analyzed to test if their vowels change across styles. The study adopts a variationist framework with a Community of Practice model, drawing on Bell’s audience and referee design to argue that the performers’ stage conventions and identity construction are influenced by a third person referee: the Idealized Authentic Newfoundlander (IAN). Under this view the goal of the performer is to both communicate with and entertain the audience, which requires different tactics when speaking. These tactics manifest phonetically and are discussed in a quantitative, statistical analysis of the acoustic measurements of the vowel tokens [variables FACE, KIT, LOT/PALM and GOAT lexical sets with Newfoundland Irish English (NIE) variants] and a qualitative discussion.