53 results for Process control -- Statistical methods


Relevance: 100.00%
Publisher:
Abstract:

Background: The aim of this study was to evaluate stimulant medication response following a single dose of methylphenidate (MPH) in children and young people with hyperkinetic disorder, using infrared motion analysis combined with a continuous performance task (the QbTest system) as objective measures. The hypothesis was that a moderate test dose of stimulant medication could distinguish a robust treatment response, a partial response and non-response in terms of activity, attention and impulse-control measures. Methods: The study included 44 children and young people aged 7-18 years with a diagnosis of hyperkinetic disorder (F90 and F90.1). A single-dose protocol incorporated the time-course effects of both immediate-release MPH and extended-release MPH (Concerta XL, Equasym XL) to determine comparable peak-efficacy periods after intake. Results: A robust treatment response, with objective measures reverting to the population mean, was found in 37 participants (84%). Three participants (7%) demonstrated a partial response to MPH, and four participants (9%) were classed as non-responders because of deteriorating activity measures together with no improvement in attention and impulse-control measures. Conclusion: Objective measures provide an opportunity, early in prescribing, to measure treatment response and monitor adverse reactions to stimulant medication. Most treatment responders showed an effective response to MPH on a moderate test dose, facilitating a swift and more optimal titration process.

Relevance: 100.00%
Publisher:
Abstract:

Whilst the vast majority of the research on property market forecasting has concentrated on statistical methods of forecasting future rents, this report investigates the process of property market forecast production with particular reference to the level and effect of judgemental intervention in this process. Expectations of future investment performance at the levels of individual asset, sector, region, country and asset class are crucial to stock selection and tactical and strategic asset allocation decisions. Given their centrality to investment performance, we focus on the process by which forecasts of rents and yields are generated and expectations formed. A review of the wider literature on forecasting suggests that there are strong grounds to expect that forecast outcomes are not the result of purely mechanical calculations.

Relevance: 100.00%
Publisher:
Abstract:

This paper reviews the current state of development of both near-infrared (NIR) and mid-infrared (MIR) spectroscopic techniques for process monitoring, quality control, and authenticity determination in cheese processing. Infrared spectroscopy has been identified as an ideal process analytical technology tool, and recent publications have demonstrated the potential of both NIR and MIR spectroscopy, coupled with chemometric techniques, for monitoring coagulation, syneresis, and ripening, as well as for determining authenticity, composition, and sensory and rheological parameters. Recent research is reviewed and compared on the basis of experimental design and the spectroscopic and chemometric methods employed, to assess the potential of infrared spectroscopy as a technology for improving process control and quality in cheese manufacture. Emerging research areas for these technologies, such as cheese authenticity and food chain traceability, are also discussed.
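For readers unfamiliar with the chemometric side, the short sketch below shows one common way such calibrations are built: partial least squares (PLS) regression linking spectra to a reference compositional value. It is a generic, hypothetical example with simulated data (the sample count, wavelength grid and moisture values are invented), not a reproduction of any study covered in the review.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 700                          # e.g. 60 cheese samples, 700 spectral points
spectra = rng.normal(size=(n_samples, n_wavelengths))       # stand-in for NIR/MIR absorbance spectra
moisture = rng.normal(loc=40.0, scale=3.0, size=n_samples)  # stand-in reference moisture values (%)

# PLS compresses the highly collinear spectra into a few latent variables before
# regressing; the number of components is normally chosen by cross-validation.
pls = PLSRegression(n_components=8)
predicted = cross_val_predict(pls, spectra, moisture, cv=5).ravel()

rmsecv = np.sqrt(np.mean((predicted - moisture) ** 2))      # root-mean-square error of cross-validation
print(f"RMSECV: {rmsecv:.2f} % moisture")

In practice, spectral pre-processing (such as derivatives or scatter correction) usually precedes the calibration step.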

Relevance: 100.00%
Publisher:
Abstract:

In recent years, there has been a drive to save development costs and shorten the time-to-market of new therapies. Research into novel trial designs to facilitate this goal has led, amongst other approaches, to the development of methodology for seamless phase II/III designs. Such designs allow treatment or dose selection at an interim analysis and comparative evaluation of efficacy against control within the same study. These methods have gained much attention because of their potential advantages over conventional drug development programmes with separate trials for individual phases. In this article, we review the various approaches to seamless phase II/III designs based upon the group-sequential approach, the combination test approach and the adaptive Dunnett method. The objective of this article is to describe the approaches in a unified framework and to highlight their similarities and differences, so that a trialist considering such a trial can choose an appropriate methodology.
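As a concrete illustration of the combination test approach mentioned above, the sketch below applies the widely used inverse normal combination function to two stage-wise p-values. It is a generic, hypothetical example (the weights and p-values are invented), not the specific formulation reviewed in the article.

from math import sqrt
from scipy.stats import norm

def inverse_normal_combination(p1, p2, w1=1.0, w2=1.0):
    """Combine two stage-wise one-sided p-values into a single z-statistic."""
    z1, z2 = norm.isf(p1), norm.isf(p2)                   # stage-wise z-scores
    return (w1 * z1 + w2 * z2) / sqrt(w1 ** 2 + w2 ** 2)

z = inverse_normal_combination(p1=0.04, p2=0.01)          # hypothetical stage I and stage II p-values
p_combined = norm.sf(z)
print(f"combined z = {z:.3f}, one-sided p = {p_combined:.4f}")

In an actual seamless phase II/III design, this combination would be embedded in a closed testing or Dunnett-type adjustment so that the overall type I error is preserved after treatment selection.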

Relevance: 100.00%
Publisher:
Abstract:

Elephant poaching and the ivory trade remain high on the agenda at meetings of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). Well-informed debates require robust estimates of trends, the spatial distribution of poaching, and the drivers of poaching. We present an analysis of trends and drivers of an indicator of poaching across all elephant species. The site-based monitoring system known as Monitoring the Illegal Killing of Elephants (MIKE), set up by the 10th Conference of the Parties of CITES in 1997, produces carcass encounter data reported mainly by anti-poaching patrols. The data analyzed were site-by-year totals of 6,337 carcasses from 66 sites in Africa and Asia from 2002 to 2009. Analysis of these observational data poses a serious challenge to traditional statistical methods because of the opportunistic and non-random nature of patrols and the heterogeneity across sites. Adopting a Bayesian hierarchical modeling approach, we used the proportion of carcasses that were illegally killed (PIKE) as a poaching index to estimate the trend and the effects of site- and country-level factors associated with poaching. Important drivers of illegal killing that emerged at the country level were poor governance and low levels of human development, and at the site level, forest cover and the area of the site in regions where human population density is low. After a drop from 2002, PIKE remained fairly constant from 2003 until 2006, after which it increased until 2008. The results for 2009 indicate a decline. Sites with PIKE ranging from the lowest to the highest were identified. The results of the analysis provide a sound information base for scientific evidence-based decision making in the CITES process.
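As a small illustration of the index itself, the sketch below computes PIKE as the proportion of encountered carcasses that were illegally killed, aggregated by site and year. The records are invented, and this is only the index calculation, not the study's Bayesian hierarchical analysis.

from collections import defaultdict

carcass_records = [
    # (site, year, illegally_killed) -- hypothetical patrol reports
    ("site_A", 2008, True), ("site_A", 2008, False), ("site_A", 2008, True),
    ("site_B", 2008, False), ("site_B", 2008, False),
]

totals = defaultdict(lambda: [0, 0])              # (site, year) -> [illegal carcasses, total carcasses]
for site, year, illegal in carcass_records:
    totals[(site, year)][0] += int(illegal)
    totals[(site, year)][1] += 1

for (site, year), (illegal, total) in sorted(totals.items()):
    print(f"{site} {year}: PIKE = {illegal / total:.2f} ({illegal}/{total} carcasses)")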

Relevance: 100.00%
Publisher:
Abstract:

Aircraft systems are highly nonlinear and time varying. High-performance aircraft at high angles of incidence experience undesired coupling of the lateral and longitudinal variables, resulting in departure from normal controlled flight. The construction of a robust closed-loop control that extends the stable and decoupled flight envelope as far as possible is pursued. For the study of these systems, nonlinear analysis methods are needed. Previously, bifurcation techniques have been used mainly to analyze open-loop nonlinear aircraft models and to investigate control effects on dynamic behavior. Linear feedback control designs constructed by eigenstructure assignment methods at a fixed flight condition are investigated for a simple nonlinear aircraft model. Bifurcation analysis, in conjunction with linear control design methods, is shown to aid control law design for the nonlinear system.

Relevance: 100.00%
Publisher:
Abstract:

The radiation of the mammals provides a 165-million-year test case for evolutionary theories of how species occupy and then fill ecological niches. It is widely assumed that species often diverge rapidly early in their evolution, and that this is followed by a longer, drawn-out period of slower evolutionary fine-tuning as natural selection fits organisms into an increasingly occupied niche space [1, 2]. But recent studies have hinted that the process may not be so simple [3–5]. Here we apply statistical methods that automatically detect temporal shifts in the rate of evolution through time to a comprehensive mammalian phylogeny [6] and data set [7] of body sizes of 3,185 extant species. Unexpectedly, the majority of mammal species, including two of the most speciose orders (Rodentia and Chiroptera), have no history of substantial and sustained increases in the rates of evolution. Instead, a subset of the mammals has experienced an explosive increase (between 10- and 52-fold) in the rate of evolution along the single branch leading to the common ancestor of their monophyletic group (for example Chiroptera), followed by a quick return to lower or background levels. The remaining species are a taxonomically diverse assemblage showing a significant, sustained increase or decrease in their rates of evolution. These results necessarily decouple morphological diversification from speciation and suggest that the processes that give rise to the morphological diversity of a class of animals are far more free to vary than previously considered. Niches do not seem to fill up, and diversity seems to arise whenever, wherever and at whatever rate it is advantageous.

Relevance: 100.00%
Publisher:
Abstract:

Background: Association mapping, initially developed in human disease genetics, is now being applied to plant species. The model species Arabidopsis provided some of the first examples of association mapping in plants, identifying previously cloned flowering time genes despite high population sub-structure. More recently, association genetics has been applied to barley, where breeding activity has resulted in a high degree of population sub-structure. A major genotypic division within barley is that between winter- and spring-sown varieties, which differ in their requirement for vernalization to promote subsequent flowering. To date, all attempts to validate association genetics in barley by identifying the major flowering time loci that control vernalization requirement (VRN-H1 and VRN-H2) have failed. Here, we validate the use of association genetics in barley by identifying VRN-H1 and VRN-H2, despite their prominent role in determining population sub-structure. Results: By taking barley as a typical inbreeding crop, and seasonal growth habit as a major partitioning phenotype, we develop an association mapping approach which successfully identifies VRN-H1 and VRN-H2, the underlying loci largely responsible for this agronomic division. We find that a combination of Structured Association followed by Genomic Control, used to correct for population structure and inflation of the test statistic, resolved significant associations only with VRN-H1 and the VRN-H2 candidate genes, as well as with two genes closely linked to VRN-H1 (HvCSFs1 and HvPHYC). Conclusion: We show that, after employing appropriate statistical methods to correct for population sub-structure, the genome-wide partitioning effect of allelic status at VRN-H1 and VRN-H2 does not result in the high levels of spurious association expected to occur in highly structured samples. Furthermore, we demonstrate that both VRN-H1 and the candidate VRN-H2 genes can be identified using association mapping. Discrimination between intragenic VRN-H1 markers was achieved, indicating that candidate causative polymorphisms may be discerned and prioritised within a larger set of positive associations. This proof-of-concept study demonstrates the feasibility of association mapping in barley, even within highly structured populations. A major advantage of this method is that it does not require large numbers of genome-wide markers; it is therefore suitable for fine mapping and candidate gene evaluation, especially in species for which large numbers of genetic markers are either unavailable or too costly.
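To make the Genomic Control step concrete, the sketch below shows the standard form of that correction: the inflation factor lambda is estimated from the median of the observed 1-d.f. chi-square association statistics and used to deflate them. The statistics are simulated, and this generic illustration is not the authors' full Structured Association plus Genomic Control pipeline.

import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
observed_chi2 = rng.chisquare(df=1, size=5000) * 1.3      # simulated, artificially inflated test statistics

# Genomic Control: lambda is the ratio of the observed median statistic to the
# median of a 1-d.f. chi-square distribution (about 0.455).
lambda_gc = np.median(observed_chi2) / chi2.ppf(0.5, df=1)
corrected_chi2 = observed_chi2 / max(lambda_gc, 1.0)      # never inflate the statistics upwards
corrected_p = chi2.sf(corrected_chi2, df=1)

print(f"estimated inflation factor lambda = {lambda_gc:.2f}")
print(f"smallest corrected p-value = {corrected_p.min():.2e}")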

Relevance: 100.00%
Publisher:
Abstract:

In the ten years since the first edition of this book appeared there have been significant developments in food process engineering, notably in biotechnology and membrane application. Advances have been made in the use of sensors for process control, and the growth of information technology and on-line computer applications continues apace. In addition, plant investment decisions are increasingly determined by quality assurance considerations and have to incorporate a greater emphasis on health and safety issues. The content of this edition has been rearranged to include descriptions of recent developments and to reflect the influence of new technology on the control and operations of automated plant. Original examples have been retained where relevant and these, together with many new illustrations, provide a comprehensive guide to good practice.

Relevance: 100.00%
Publisher:
Abstract:

Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) or Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools, and their adoption continues to grow. Statistical methods, machine learning and data mining algorithms have successfully been applied to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires pre-processing, post-processing and visualisation techniques arranged in complex data workflows. A main problem for the integrated preprocessing and mining of MRI data is currently the lack of comprehensive platforms that avoid the manual invocation of preprocessing and mining tools, which leads to an error-prone and inefficient process. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the preprocessing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionalities available in the KNIME workbench.

Relevance: 100.00%
Publisher:
Abstract:

Forecasting wind power is an important part of a successful integration of wind power into the power grid. Forecasts with lead times longer than 6 h are generally made by using statistical methods to post-process forecasts from numerical weather prediction systems. Two major problems complicate this approach: the non-linear relationship between wind speed and power production, and the limited range of power production between zero and the nominal power of the turbine. In practice, these problems are often tackled by using non-linear, non-parametric regression models. However, such an approach ignores valuable and readily available information: the power curve of the turbine's manufacturer. Much of the non-linearity can be accounted for directly by transforming the observed power production into wind speed via the inverse power curve, so that simpler linear regression models can be used. Furthermore, the fact that the transformed power production has a limited range can be accommodated by employing censored regression models. In this study, we evaluate quantile forecasts from a range of methods: (i) using parametric and non-parametric models, (ii) with and without the proposed inverse power curve transformation and (iii) with and without censoring. The results show that, with our inverse (power-to-wind) transformation, simpler linear regression models with censoring perform as well as or better than non-linear models with or without the frequently used wind-to-power transformation.
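The core idea, a power-to-wind transformation followed by censored regression, can be sketched as follows. The power curve, forecasts and observations are invented, and the simple Tobit-style Gaussian likelihood below merely stands in for the censored regression models evaluated in the study.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Manufacturer's power curve (wind speed in m/s -> normalised power), monotone between cut-in and rated speed.
curve_speed = np.array([3.0, 5.0, 7.0, 9.0, 11.0, 13.0])
curve_power = np.array([0.0, 0.1, 0.35, 0.7, 0.95, 1.0])

def power_to_wind(power):
    """Invert the power curve on its monotone section (the power-to-wind transform)."""
    return np.interp(power, curve_power, curve_speed)

# Hypothetical NWP wind-speed forecasts and observed normalised power production.
nwp_speed = np.array([4.0, 6.5, 8.0, 10.0, 12.5, 13.5, 3.5, 9.5])
obs_power = np.array([0.05, 0.30, 0.55, 0.85, 1.00, 1.00, 0.00, 0.80])
obs_speed = power_to_wind(obs_power)                   # transformed response
lower, upper = curve_speed[0], curve_speed[-1]         # censoring bounds implied by the curve

def tobit_nll(params):
    """Negative log-likelihood of a linear model censored at both bounds."""
    a, b, log_sigma = params
    sigma = np.exp(log_sigma)
    mu = a + b * nwp_speed
    ll = np.where(obs_speed <= lower, norm.logcdf((lower - mu) / sigma),
         np.where(obs_speed >= upper, norm.logsf((upper - mu) / sigma),
                  norm.logpdf(obs_speed, mu, sigma)))
    return -ll.sum()

fit = minimize(tobit_nll, x0=np.array([0.0, 1.0, 0.0]), method="Nelder-Mead")
a_hat, b_hat, sigma_hat = fit.x[0], fit.x[1], np.exp(fit.x[2])
print(f"intercept {a_hat:.2f}, slope {b_hat:.2f}, sigma {sigma_hat:.2f}")

Since the study evaluates quantile forecasts rather than point forecasts, a quantile or censored quantile regression would replace the Gaussian likelihood in a closer analogue.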

Relevance: 100.00%
Publisher:
Abstract:

The real-time quality control (RTQC) methods applied to Argo profiling float data by the United Kingdom (UK) Met Office, the United States (US) Fleet Numerical Meteorology and Oceanography Centre, the Australian Bureau of Meteorology and the Coriolis Centre are compared and contrasted. Data are taken from the period 2007 to 2011 inclusive, and RTQC performance is assessed with respect to Argo delayed-mode quality control (DMQC). An intercomparison of RTQC techniques is performed using a common data set of profiles from 2010 and 2011. The RTQC systems are found to have similar power in identifying faulty Argo profiles but to vary widely in the number of good profiles incorrectly rejected. The efficacy of individual QC tests is inferred from the results of the intercomparison. Techniques to increase QC performance are discussed.

Relevance: 100.00%
Publisher:
Abstract:

The soil microflora is very heterogeneous in its spatial distribution. The origins of this heterogeneity and its significance for soil function are not well understood. A problem for better understanding spatial variation is the assumption of statistical stationarity that is made in most of the statistical methods used to assess it. These assumptions are made explicit in geostatistical methods, which have been increasingly used by soil biologists in recent years. Geostatistical methods are powerful, particularly for local prediction, but they require the assumption that the variability of a property of interest is spatially uniform, which is not always plausible given what is known about the complexity of the soil microflora and the soil environment. We have used the wavelet transform, a relatively recent development in mathematical analysis, to investigate the spatial variation of the abundance of Azotobacter in the soil of a typical agricultural landscape. The wavelet transform entails no assumptions of stationarity and is well suited to the analysis of variables that show intermittent or transient features at different spatial scales. In this study, we computed cross-variograms of Azotobacter abundance with the pH, water content and loss on ignition of the soil. These revealed scale-dependent covariation in all cases. The wavelet transform also showed that the correlation of Azotobacter abundance with all three soil properties depended on spatial scale: the correlation generally increased with spatial scale and was significantly different from zero only at some scales. The wavelet analysis also allowed us to show how the correlation changed across the landscape. For example, at one scale Azotobacter abundance was strongly correlated with pH in part of the transect, and not with soil water content, but this was reversed elsewhere on the transect. The results show how scale-dependent variation of potentially limiting environmental factors can induce a complex spatial pattern of abundance in a soil organism. The geostatistical methods that we used here make assumptions that are not consistent with the spatial changes in the covariation of these properties that our wavelet analysis has revealed. This suggests that the wavelet transform is a powerful tool for future investigation of the spatial structure and function of soil biota.
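A generic sketch of scale-dependent correlation via a discrete wavelet transform is given below: two transect series are simulated and their detail coefficients are correlated level by level. It illustrates the idea only and is not the authors' specific wavelet analysis; the transect data, wavelet and decomposition depth are assumptions.

import numpy as np
import pywt

rng = np.random.default_rng(2)
n = 256                                        # points along a hypothetical transect
soil_ph = np.cumsum(rng.normal(size=n))        # stand-in for pH along the transect
azotobacter = 0.6 * soil_ph + rng.normal(scale=2.0, size=n)  # stand-in for log Azotobacter abundance

coeffs_ph = pywt.wavedec(soil_ph, "haar", level=5)
coeffs_azo = pywt.wavedec(azotobacter, "haar", level=5)

# Detail coefficients are ordered from the coarsest to the finest scale.
for level, (d_ph, d_azo) in enumerate(zip(coeffs_ph[1:], coeffs_azo[1:]), start=1):
    r = np.corrcoef(d_ph, d_azo)[0, 1]
    print(f"detail level {level} (coarse -> fine): correlation = {r:.2f}")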

Relevance: 100.00%
Publisher:
Abstract:

Solar electromagnetic radiation powers Earth’s climate system and, consequently, it is often naively assumed that changes in this solar output must be responsible for changes in Earth’s climate. However, the Sun is close to a blackbody radiator and so emits according to its surface temperature, and the huge thermal time constant of the outer part of the Sun limits the variability in surface temperature and hence output. As a result, on all timescales of interest, changes in total power output are limited to small changes in effective surface temperature (associated with magnetic fields) and potential, although as yet undetected, solar radius variations. Larger variations are seen in the UV part of the spectrum, which is emitted from the lower solar atmosphere (the chromosphere) and which influences Earth’s stratosphere. There is interest in “top-down” mechanisms whereby solar UV irradiance modulates stratospheric temperatures and winds which, in turn, may influence the underlying troposphere where Earth’s climate and weather reside. This contrasts with “bottom-up” effects in which the small variations in total solar irradiance (dominated by the visible and near-IR) cause surface temperature changes that drive atmospheric circulations. In addition to these electromagnetic outputs, the Sun modulates energetic particle fluxes incident on the Earth. Solar Energetic Particles (SEPs) are emitted by solar flares and from the shock fronts ahead of supersonic (and super-Alfvenic) ejections of material from the solar atmosphere. These SEPs enhance the destruction of polar stratospheric ozone, which could be an additional form of top-down climate forcing. Even more energetic are Galactic Cosmic Rays (GCRs). These particles are not generated by the Sun; rather, they originate at the shock fronts emanating from violent galactic events such as supernova explosions; however, the expansion of the solar magnetic field into interplanetary space means that the Sun modulates the number of GCRs reaching Earth. These play a key role in enabling Earth’s global electric (thunderstorm) circuit, and it has been proposed that they also modulate the formation of clouds. Both electromagnetic and corpuscular solar effects are known to vary over the solar magnetic cycle, which is typically between 10 and 14 years in length (with an average close to 11 years). The solar magnetic field polarity at any one phase of one of these activity cycles is opposite to that at the same phase of the next cycle, and this influences some phenomena, for example GCRs, which therefore show a 22-year (“Hale”) cycle on average. Other phenomena, such as irradiance modulation, do not depend on the polarity of the magnetic field and so show only the basic 11-year activity cycle. However, any effects on climate are much more significant for solar drifts over centennial timescales. This chapter discusses and evaluates potential effects on Earth’s climate system of variations in these solar inputs. Because of the great variety of proposed mechanisms, the wide range of timescales studied (from days to millennia) and the many debates (often triggered by the application of inadequate statistical methods), the literature on this subject is vast, complex, divergent and rapidly changing; consequently, the number of references cited in this review is very large (yet still only a small fraction of the total).

Relevance: 100.00%
Publisher:
Abstract:

The paper considers meta-analysis of diagnostic studies that use a continuous score to classify study participants into healthy or diseased groups. Classification is often done on the basis of a threshold or cut-off value, which might vary between studies. Consequently, conventional meta-analysis methodology focusing solely on separate analyses of sensitivity and specificity might be confounded by a potentially unknown variation of the cut-off value. To cope with this phenomenon, it is suggested to use instead an overall estimate of the misclassification error, previously suggested and used as Youden’s index; furthermore, it is argued that this index is less prone to between-study variation of cut-off values. A simple Mantel–Haenszel estimator is suggested as a summary measure of the overall misclassification error, which adjusts for a potential study effect. The measure of misclassification error based on Youden’s index is advantageous in that it easily allows an extension to a likelihood approach, which can then cope with unobserved heterogeneity via a nonparametric mixture model. All methods are illustrated with an example of a diagnostic meta-analysis on duplex Doppler ultrasound, with angiography as the reference standard, for stroke prevention.
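To make the index concrete, the sketch below computes Youden's index (sensitivity + specificity - 1) for a few invented 2x2 diagnostic tables, together with a naive size-weighted pooled value. The pooling shown is only illustrative and is not the Mantel-Haenszel estimator developed in the paper.

studies = [
    # (true positives, false negatives, false positives, true negatives) -- hypothetical studies
    (45, 5, 8, 42),
    (30, 10, 5, 55),
    (60, 15, 12, 63),
]

youden, weights = [], []
for tp, fn, fp, tn in studies:
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    youden.append(sensitivity + specificity - 1)
    weights.append(tp + fn + fp + tn)          # weight each study by its size

pooled = sum(j * w for j, w in zip(youden, weights)) / sum(weights)
print("per-study Youden's index:", [f"{j:.2f}" for j in youden])
print(f"weighted pooled index: {pooled:.2f}")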