891 results for Process control - Statistical methods
Abstract:
Cross-sectional and longitudinal data consistently indicate that mathematical difficulties are more prevalent in older than in younger children (e.g. Department of Education, 2011). Children’s trajectories can take a variety of shapes, such as linear, flat, curvilinear and uneven, and shape has been found to vary within children and across tasks (Jordan et al., 2009). There has been an increase in the use of statistical methods specifically designed to study development, and this has greatly improved our understanding of children’s mathematical development. However, the effects of many cognitive and social variables (e.g. working memory and verbal ability) on mathematical development remain unclear. Greater consistency between studies is likely to be achieved by adopting a componential approach to studying mathematics, rather than treating mathematics as a unitary concept.
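A minimal sketch of how individual growth trajectories of this kind can be modelled with a linear mixed model (random intercepts and slopes per child). The data, variable names and statsmodels-based approach are illustrative assumptions, not the method of the studies cited above.

```python
# Sketch: fitting per-child growth trajectories with a linear mixed model.
# All data are simulated; the cited studies may use latent growth-curve models.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_children, n_waves = 50, 4
child = np.repeat(np.arange(n_children), n_waves)
wave = np.tile(np.arange(n_waves), n_children)

# Simulated scores: child-specific intercepts and slopes plus noise
intercepts = rng.normal(20, 4, n_children)
slopes = rng.normal(2, 1, n_children)
score = intercepts[child] + slopes[child] * wave + rng.normal(0, 1, len(child))
df = pd.DataFrame({"child": child, "wave": wave, "score": score})

# Fixed effect of time (wave); random intercept and slope for each child,
# allowing trajectory shape to vary across children
model = smf.mixedlm("score ~ wave", df, groups=df["child"], re_formula="~wave")
print(model.fit().summary())
```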
Abstract:
The Kawakawa/Oruanui tephra (KOT) is a key chronostratigraphic marker in terrestrial and marine deposits of the New Zealand (NZ) sector of the southwest Pacific. Erupted early during the Last Glacial Maximum (LGM), the KOT is distributed widely enough to enable inter-regional alignment of proxy records and to facilitate comparison between NZ climatic variations and those from well-dated records elsewhere. We present 22 new radiocarbon ages for the KOT from sites and materials considered optimal for dating, and apply Bayesian statistical methods via OxCal 4.1.7, incorporating stratigraphic information, to develop a new age probability model for the KOT. The revised calibrated age (±2 standard deviations) for the eruption of the KOT is 25,360 ± 160 cal yr BP. The age revision provides a basis for refining marine reservoir ages for the LGM in the southwest Pacific.
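As a simplified illustration of combining multiple radiocarbon determinations of a single event (the full OxCal model additionally uses stratigraphic constraints and Bayesian calibration), an inverse-variance weighted mean with a chi-squared consistency check can be sketched as follows; the ages below are hypothetical, not the paper's data.

```python
# Sketch: inverse-variance pooling of radiocarbon ages for a single event
# (Ward & Wilson 1978), a simpler stand-in for the full Bayesian OxCal model.
import numpy as np

ages = np.array([21050, 20980, 21120, 21060])   # 14C yr BP (hypothetical)
errors = np.array([60, 80, 70, 90])             # 1-sigma errors (hypothetical)

weights = 1.0 / errors**2
pooled_age = np.sum(weights * ages) / np.sum(weights)
pooled_error = np.sqrt(1.0 / np.sum(weights))

# Chi-squared statistic tests whether the determinations are consistent
# with a single true age (compare against chi2 with n-1 degrees of freedom)
chi2 = np.sum(weights * (ages - pooled_age) ** 2)
print(f"pooled age: {pooled_age:.0f} +/- {pooled_error:.0f} 14C yr BP")
print(f"chi-squared (df={len(ages) - 1}): {chi2:.2f}")
# Calibration to cal yr BP would then use a calibration curve, e.g. via OxCal.
```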
Abstract:
Aims/hypothesis: Diabetic nephropathy is a major diabetic complication, and diabetes is the leading cause of end-stage renal disease (ESRD). Family studies suggest a hereditary component for diabetic nephropathy. However, only a few genes have been associated with diabetic nephropathy or ESRD in diabetic patients. Our aim was to detect novel genetic variants associated with diabetic nephropathy and ESRD.
Methods: We exploited a novel algorithm, ‘Bag of Naive Bayes’, whose marker selection strategy is complementary to that of conventional genome-wide association models based on univariate association tests. The analysis was performed on a genome-wide association study of 3,464 patients with type 1 diabetes from the Finnish Diabetic Nephropathy (FinnDiane) Study and subsequently replicated with 4,263 patients with type 1 diabetes from the Steno Diabetes Centre, the All Ireland-Warren 3-Genetics of Kidneys in Diabetes UK collection (UK–Republic of Ireland) and the Genetics of Kidneys in Diabetes US Study (GoKinD US).
Results: Five genetic loci (WNT4/ZBTB40-rs12137135, RGMA/MCTP2-rs17709344, MAPRE1P2-rs1670754, SEMA6D/SLC24A5-rs12917114 and SIK1-rs2838302) were associated with ESRD in the FinnDiane study. An association between ESRD and rs17709344, tagging the previously identified rs12437854 and located between the RGMA and MCTP2 genes, was replicated in independent case–control cohorts. rs12917114 near SEMA6D was associated with ESRD in the replication cohorts under the genotypic model (p < 0.05), and rs12137135 upstream of WNT4 was associated with ESRD in Steno.
Conclusions/interpretation: This study supports the previously identified findings on the RGMA/MCTP2 region and suggests novel susceptibility loci for ESRD. It highlights the importance of applying complementary statistical methods to detect novel genetic variants in diabetic nephropathy and, more generally, in complex diseases.
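For readers unfamiliar with bagged naive Bayes classifiers, a generic scikit-learn sketch of the idea follows; it is not the authors' 'Bag of Naive Bayes' algorithm, and the simulated genotype data are purely illustrative.

```python
# Sketch: a bagged naive Bayes classifier over SNP genotypes coded as
# minor-allele counts (0/1/2). Generic stand-in, not the paper's algorithm.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_snps = 500, 1000
X = rng.integers(0, 3, size=(n_samples, n_snps))   # simulated genotypes
y = rng.integers(0, 2, size=n_samples)             # simulated case/control

# Each base learner sees a random subset of samples and markers, so the
# ensemble's effective marker selection differs from single-SNP univariate
# tests - the complementarity the abstract refers to.
clf = BaggingClassifier(
    estimator=GaussianNB(),
    n_estimators=100,
    max_features=0.1,    # random 10% of SNPs per learner
    max_samples=0.8,
    random_state=0,
)
# With random labels the AUC should hover near 0.5 (chance level)
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```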
Abstract:
In recent years, the issue of life expectancy has become of utmost importance to pension providers, insurance companies and government bodies in the developed world. Significant and consistent improvements in mortality rates and, hence, life expectancy have led to unprecedented increases in the cost of providing for older ages. This has resulted in an explosion of stochastic mortality models forecasting trends in mortality data in order to anticipate future life expectancy and, hence, quantify the costs of providing for future aging populations. Many stochastic models of mortality rates identify linear trends in mortality rates by time, age and cohort, and forecast these trends into the future using standard statistical methods. These approaches fail to capture the effects of any structural change in the trend and thus potentially produce incorrect forecasts of future mortality rates. In this paper, we look at a range of leading stochastic models of mortality and test for structural breaks in the trend time series.
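A sketch of one standard structural-break diagnostic, the Chow test applied to a linear time trend, is given below; the break point, series and implementation details are illustrative assumptions rather than the specific tests used in the paper.

```python
# Sketch: Chow test for a structural break in a linear time trend, of the
# kind fitted to period indices in stochastic mortality models.
import numpy as np
from scipy import stats

def chow_test(y, t, break_idx):
    """F-test comparing one trend line against separate pre/post-break lines."""
    def rss(yy, tt):
        X = np.column_stack([np.ones_like(tt), tt])
        beta, *_ = np.linalg.lstsq(X, yy, rcond=None)
        resid = yy - X @ beta
        return resid @ resid

    k = 2  # parameters per regime (intercept, slope)
    rss_pooled = rss(y, t)
    rss_split = rss(y[:break_idx], t[:break_idx]) + rss(y[break_idx:], t[break_idx:])
    df2 = len(y) - 2 * k
    f_stat = ((rss_pooled - rss_split) / k) / (rss_split / df2)
    return f_stat, stats.f.sf(f_stat, k, df2)

# Hypothetical log mortality index with a slope change at index 40
rng = np.random.default_rng(2)
t = np.arange(80, dtype=float)
y = -2.0 - 0.010 * t - 0.008 * np.maximum(t - 40, 0) + rng.normal(0, 0.01, 80)
print(chow_test(y, t, break_idx=40))   # small p-value flags a break
```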
Abstract:
Reducing wafer metrology continues to be a major target in semiconductor manufacturing efficiency initiatives because it is a high-cost, non-value-added operation that impacts cycle time and throughput. However, metrology cannot be eliminated completely given the important role it plays in process monitoring and advanced process control. To achieve the required manufacturing precision, measurements are typically taken at multiple sites across a wafer. The selection of these sites is usually based on a priori knowledge of wafer failure patterns and spatial variability, with additional sites added over time in response to process issues. As a result, significant redundancy often exists in the wafer measurement plans of mature processes. This paper proposes a novel methodology based on Forward Selection Component Analysis (FSCA) for analyzing historical metrology data in order to determine the minimum set of wafer sites needed for process monitoring. The paper also introduces a virtual metrology (VM) based approach for reconstructing the complete wafer profile from the optimal sites identified by FSCA. The proposed methodology is tested and validated on a wafer manufacturing metrology dataset.
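A minimal sketch of the greedy variance-explained selection underlying FSCA, assuming a simple least-squares formulation; the paper's exact algorithm and data may differ in detail.

```python
# Sketch of Forward Selection Component Analysis (FSCA): greedily pick the
# measurement sites (columns of X) whose least-squares reconstruction of the
# full wafer profile explains the most variance. X is wafers x sites.
import numpy as np

def fsca(X, n_sites):
    Xc = X - X.mean(axis=0)            # work with mean-centred data
    total_var = np.sum(Xc**2)
    selected = []
    for _ in range(n_sites):
        best_j, best_explained = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            S = Xc[:, selected + [j]]
            # Least-squares reconstruction of all sites from the candidates
            coef, *_ = np.linalg.lstsq(S, Xc, rcond=None)
            explained = total_var - np.sum((Xc - S @ coef) ** 2)
            if explained > best_explained:
                best_j, best_explained = j, explained
        selected.append(best_j)
        print(f"site {best_j}: {100 * best_explained / total_var:.1f}% variance explained")
    return selected

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 30))  # correlated sites (toy data)
sites = fsca(X, n_sites=3)
# Virtual metrology: unmeasured sites are then reconstructed from the
# selected ones with the same least-squares mapping.
```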
Abstract:
In recent years, the issue of life expectancy has become of utmost importance to pension providers, insurance companies, and government bodies in the developed world. Significant and consistent improvements in mortality rates and hence life expectancy have led to unprecedented increases in the cost of providing for older ages. This has resulted in an explosion of stochastic mortality models forecasting trends in mortality data to anticipate future life expectancy and hence quantify the costs of providing for future aging populations. Many stochastic models of mortality rates identify linear trends in mortality rates by time, age, and cohort and forecast these trends into the future by using standard statistical methods. These approaches rely on the assumption that structural breaks in the trend do not exist or do not have a significant impact on the mortality forecasts. Recent literature has started to question this assumption. In this paper, we carry out a comprehensive investigation of the presence of structural breaks in a selection of leading mortality models. We find that structural breaks are present in the majority of cases. In particular, we find that allowing for structural breaks, where present, significantly improves the forecast results.
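To make the forecasting point concrete, the sketch below fits a linear trend with and without a post-break slope change to a hypothetical mortality index and extrapolates both; real stochastic mortality models embed such trends in richer age-period-cohort structures, so this is only a stylized illustration.

```python
# Sketch: forecasting a mortality index with and without an allowance for a
# structural break, by fitting a broken linear trend. Series is hypothetical.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(60, dtype=float)
kappa = -0.010 * t - 0.012 * np.maximum(t - 30, 0) + rng.normal(0, 0.01, 60)

def forecast(t, y, horizon, break_at=None):
    cols = [np.ones_like(t), t]
    if break_at is not None:
        cols.append(np.maximum(t - break_at, 0))   # post-break slope change
    beta, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    tf = np.arange(t[-1] + 1, t[-1] + 1 + horizon)
    cols_f = [np.ones_like(tf), tf]
    if break_at is not None:
        cols_f.append(np.maximum(tf - break_at, 0))
    return np.column_stack(cols_f) @ beta

# Ignoring the break biases the extrapolated trend; allowing for it tracks
# the steeper post-break decline
print("no-break forecast:  ", forecast(t, kappa, 5))
print("with-break forecast:", forecast(t, kappa, 5, break_at=30))
```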
Abstract:
The River Bush must reach a standard of good ecological potential (GEP) by 2015 under the requirements of the Water Framework Directive. The role of sediments within a water body is extremely important to all aspects of a river's regime. The aim of this research is to investigate the effects of Altnahinch Dam on sediment distribution in the River Bush (a heavily modified water body), with comparison made against the Glendun River (an unmodified water body). Samples collected from the rivers were analysed by physical (pebble count, sieve analysis) and statistical (ANOVA, GRADISTAT) methods. An increase in fine sediments upstream of the dam provides evidence that the dam is affecting sediment distribution; downstream effects are not shown to be significant. The findings also imply similar impacts at other drinking-water storage impoundments. This research recommends that a sediment management plan be put in place for Altnahinch Dam and that further studies be carried out concentrating on fine sediment distribution upstream of the dam.
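A minimal sketch of the kind of one-way ANOVA used to compare grain-size statistics between reaches; all sample values below are hypothetical, not the study's measurements.

```python
# Sketch: one-way ANOVA comparing median grain size (D50) between reaches.
import numpy as np
from scipy import stats

# Hypothetical D50 values (mm) from pebble counts at three reaches
upstream_dam = [18.2, 15.9, 14.1, 16.4, 13.8]
downstream_dam = [31.5, 29.8, 34.2, 30.1, 33.0]
glendun_control = [28.9, 32.4, 30.7, 29.5, 31.8]

f_stat, p_value = stats.f_oneway(upstream_dam, downstream_dam, glendun_control)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A significant F with finer sediment upstream of the dam would be
# consistent with the trapping effect described above.
```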
Abstract:
BACKGROUND: This series of guidance documents on cough, which will be published over time, is a hybrid of two processes: (1) evidence-based guidelines and (2) trustworthy consensus statements based on a robust and transparent process.
METHODS: The CHEST Guidelines Oversight Committee selected a nonconflicted Panel Chair and jointly assembled an international panel of experts in each clinical area with few, if any, conflicts of interest. PICO (population, intervention, comparator, outcome)-based key questions and parameters of eligibility were developed for each clinical topic to inform the comprehensive literature search. Existing guidelines, systematic reviews, and primary studies were assessed for relevance and quality. Data elements were extracted into evidence tables and synthesized to provide summary statistics. These, in turn, are presented to support the evidence-based graded recommendations. A highly structured consensus-based Delphi approach was used to provide expert advice on all guidance statements. Transparency of process was documented.
RESULTS: Evidence-based guideline recommendations and consensus-based suggestions were carefully crafted to provide direction to health-care providers and investigators who treat and/or study patients with cough. Manuscripts and tables summarize the evidence in each clinical area supporting the recommendations and suggestions.
CONCLUSIONS: The resulting guidance statements are based on a rigorous methodology and transparency of process. Unless otherwise stated, the recommendations and suggestions meet the guidelines for trustworthiness developed by the Institute of Medicine and can be applied with confidence by physicians, nurses, other health-care providers, investigators, and patients.
Abstract:
High density polyethylene (HDPE)/multi-walled carbon nanotube (MWCNT) nanocomposites were prepared by melt mixing using twin-screw extrusion. The extruded pellets were compression moulded at 200°C for 5 min and then cooled at two different rates (20°C/min and 300°C/min) to produce sheets for characterization. Scanning electron microscopy (SEM) shows that the MWCNTs are uniformly dispersed in the HDPE. At 4 wt% MWCNT loading, the composite modulus increased by over 110% compared with the unfilled HDPE, regardless of the cooling rate. The yield strength of both unfilled and filled HDPE decreased by about 10% after rapid cooling, due to lower crystallinity and imperfect crystallites. The electrical percolation threshold of the composites, irrespective of the cooling rate, lies between MWCNT concentrations of 1 and 2 wt%. Interestingly, the electrical resistivity of the rapidly cooled composite with 2 wt% MWCNTs is lower than that of the slowly cooled composite with the same MWCNT loading. This may be because the lower crystallinity and smaller crystallites facilitate the formation of conductive pathways. This result may have significant implications for both process control and the tailoring of electrical conductivity in the manufacture of conductive HDPE/MWCNT nanocomposites.
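As an illustration of how a percolation threshold in the reported 1-2 wt% range could be estimated, the sketch below fits the classical scaling law sigma = sigma0 * (p - pc)^t to conductivity measurements above the threshold; the data points are hypothetical and this fit is not reported in the paper.

```python
# Sketch: fitting the classical percolation scaling law to estimate the
# threshold pc from conductivity vs filler loading (hypothetical data).
import numpy as np
from scipy.optimize import curve_fit

def percolation(p, sigma0, pc, t):
    return sigma0 * np.power(p - pc, t)

# Hypothetical conductivity (S/m) vs MWCNT loading (wt%) above threshold
loading = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
sigma = np.array([1e-4, 3e-3, 2e-2, 8e-2, 2e-1])

# Bounds keep pc below the smallest measured loading so (p - pc) stays positive
popt, _ = curve_fit(percolation, loading, sigma, p0=[1.0, 1.5, 2.0],
                    bounds=([0, 0, 0.5], [np.inf, 1.99, 5]))
sigma0, pc, t = popt
print(f"estimated threshold pc = {pc:.2f} wt%, exponent t = {t:.2f}")
```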
Abstract:
BACKGROUND: Core outcome sets (COS) can increase the efficiency and value of research and, as a result, a growing number of studies aim to develop them. However, the credibility of a COS depends both on the use of sound methodology in its development and on clear and transparent reporting of the processes adopted. To date there is no reporting guideline for COS studies. The aim of this programme of research is to develop a reporting guideline for studies developing COS and to highlight some of the important methodological considerations in the process.
METHODS/DESIGN: The study will include a reporting guideline item generation stage which will then be used in a Delphi study. The Delphi study is anticipated to include two rounds. The first round will ask stakeholders to score the items listed and to add any new items they think are relevant. In the second round of the process, participants will be shown the distribution of scores for all stakeholder groups separately and asked to re-score. A final consensus meeting will be held with an expert panel and stakeholder representatives to review the guideline item list. Following the consensus meeting, a reporting guideline will be drafted and review and testing will be undertaken until the guideline is finalised. The final outcome will be the COS-STAR (Core Outcome Set-STAndards for Reporting) guideline for studies developing COS and a supporting explanatory document.
DISCUSSION: To assess the credibility and usefulness of a COS, readers of a COS development report need complete, clear and transparent information on its methodology and proposed core set of outcomes. The COS-STAR guideline will potentially benefit all stakeholders in COS development: COS developers, COS users (e.g. trialists and systematic reviewers), journal editors, policy-makers and patient groups.
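A minimal sketch of how round-one Delphi scores might be tabulated per stakeholder group, assuming a common "at least 70% scoring 7-9 on a 9-point scale" consensus rule; both the rule and the data are illustrative assumptions, not the COS-STAR procedure.

```python
# Sketch: per-group tabulation of Delphi round-one scores with a simple
# consensus rule. Items, groups and scores are hypothetical.
import pandas as pd

scores = pd.DataFrame({
    "item": ["scope", "scope", "methods", "methods", "methods", "results"],
    "group": ["developer", "user", "developer", "user", "user", "developer"],
    "score": [8, 9, 6, 7, 5, 9],
})

def consensus(s):
    # Assumed rule: consensus if >= 70% of scores fall in the 7-9 band
    return s.between(7, 9).mean() >= 0.70

summary = scores.groupby(["item", "group"])["score"].agg(["median", consensus])
print(summary)
# Score distributions per group would be fed back to participants for
# re-scoring in round two, as described above.
```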
Abstract:
Background: High-risk medications are commonly prescribed to older US patients. Less is known about high-risk medication prescribing in other Western countries, including the UK. We measured trends and correlates of high-risk medication prescribing in a subset of the older UK population (community and institutionalized) to inform harm-minimization efforts.
Methods: Three cross-sectional samples were taken from primary care electronic clinical records (UK Clinical Practice Research Datalink, CPRD) in fiscal years 2003/04, 2007/08 and 2011/12. This yielded a sample of 13,900 people aged 65 years or over from 504 UK general practices. High-risk medications were defined by the 2012 Beers Criteria adapted for the UK. Using descriptive statistical methods and regression modelling, we determined the prevalence and correlates of ‘any’ (drugs prescribed at least once per year) and ‘long-term’ (drugs prescribed in all quarters of the year) high-risk medication prescribing.
Results: While polypharmacy rates have risen sharply, high-risk medication prevalence has remained stable across the decade. A third of older (65+) people are exposed to high-risk medications, but only half of the total prevalence was long-term (any = 38.4% [95% CI: 36.3, 40.5]; long-term = 17.4% [15.9, 19.9] in 2011/12). Long-term, but not any, high-risk medication exposure was associated with older ages (85 years or over). Women and people with a higher polypharmacy burden were at greater risk of exposure; lower socio-economic status was not associated. Ten drugs/drug classes accounted for most high-risk medication prescribing in 2011/12.
Conclusions: High-risk medication prescribing has not increased over time against a background of increasing polypharmacy in the UK. Half of patients receiving high-risk medications do so for less than a year. Reducing or optimising the use of a limited number of drugs could dramatically reduce high-risk medication exposure in older people. Further research is needed to investigate why the oldest old and women are at greater risk. Interventions to reduce high-risk medications may need to target short-term and long-term use separately.
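A sketch of how 'any' versus 'long-term' (all four quarters) exposure flags could be derived from quarterly prescription records; the toy records and identifiers below are illustrative assumptions and do not reflect the CPRD schema.

```python
# Sketch: deriving 'any' vs 'long-term' high-risk prescribing flags from
# quarterly prescription records, mirroring the definitions above.
import pandas as pd

# Hypothetical high-risk prescription events (one row per patient-quarter)
rx = pd.DataFrame({
    "patient": [1, 1, 1, 1, 2, 3, 3],
    "quarter": [1, 2, 3, 4, 2, 1, 3],   # fiscal-year quarter of prescription
})

cohort_ids = [1, 2, 3, 4, 5]            # every patient in the sample
quarters = (rx.groupby("patient")["quarter"].nunique()
              .reindex(cohort_ids, fill_value=0))
any_rx = quarters > 0                   # prescribed at least once in the year
long_term = quarters == 4               # prescribed in all four quarters

print(f"any prescribing:       {100 * any_rx.mean():.1f}%")
print(f"long-term prescribing: {100 * long_term.mean():.1f}%")
```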
Abstract:
Single component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant sum constraint (closure) and to the inherently multivariate relative information conveyed by compositional data. A well-known example is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g. the quartz dilution effect), or the converse effect, apparent enrichment in many elements due to the removal of potassium during weathering. The validity of classical single component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles); this is similar to the classical approach of normalising to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis and clustering of variables) to extract potentially interesting geochemical summaries. The caution from this work is that, if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant sum constraint. In summary, the authors recommend a chain of enquiry that involves searching for the appropriate statistical method that can answer the required geological or geochemical question whilst maintaining the integrity of the compositional nature of the data: the required log-ratio transformations should be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
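A minimal sketch of the two kinds of transformation recommended above: a knowledge-driven log-ratio against a diluting element, and the centred log-ratio (clr) transform commonly used before multivariate analysis; the compositions below are hypothetical and the element choices illustrative.

```python
# Sketch: a knowledge-driven log-ratio and a centred log-ratio (clr)
# transform for a small composition.
import numpy as np

# Hypothetical compositions (wt%) for three samples: SiO2, Al2O3, Fe2O3, CaO
X = np.array([
    [62.0, 15.0, 6.0, 17.0],
    [75.0, 11.0, 3.0, 11.0],
    [55.0, 17.0, 9.0, 19.0],
])
X = X / X.sum(axis=1, keepdims=True)    # re-close each row to a constant sum

# Knowledge-driven log-ratio: Fe2O3 relative to the SiO2 diluent, which
# filters the quartz dilution effect mentioned above
fe_vs_si = np.log(X[:, 2] / X[:, 0])

# Centred log-ratio: log of each part over the geometric mean of its row
clr = np.log(X) - np.log(X).mean(axis=1, keepdims=True)

print(fe_vs_si)
print(clr)    # clr rows sum to zero; ready for PCA, clustering, etc.
```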