127 results for panel regression


Relevance: 20.00%

Abstract:

Plasma etch is a key process in modern semiconductor manufacturing facilities as it offers process simplification and yet tighter dimensional tolerances compared to wet chemical etch technology. The main challenge of operating plasma etchers is to maintain a consistent etch rate spatially and temporally for a given wafer and for successive wafers processed in the same etch tool. Etch rate measurements require expensive metrology steps and therefore, in general, only limited sampling is performed. Furthermore, the results of measurements are not accessible in real-time, limiting the options for run-to-run control. This paper investigates a Virtual Metrology (VM) enabled Dynamic Sampling (DS) methodology as an alternative paradigm for balancing the need to reduce costly metrology with the need to measure more frequently and in a timely fashion to enable wafer-to-wafer control. Using a Gaussian Process Regression (GPR) VM model for etch rate estimation of a plasma etch process, the proposed dynamic sampling methodology is demonstrated and evaluated for a number of different predictive dynamic sampling rules. © 2013 IEEE.
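As a rough illustration of how an uncertainty-driven sampling rule of this kind might work, the sketch below (synthetic data and scikit-learn's GPR; not the paper's actual rules or data) flags wafers for physical metrology only when the VM prediction's standard deviation is high:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic stand-in for tool sensor data and measured etch rate
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -0.5, 0.2]) + 0.1 * rng.normal(size=100)

# VM model trained on the wafers that already have physical metrology
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X[:60], y[:60])

# One possible dynamic sampling rule: send a wafer to metrology only
# when the VM prediction is too uncertain (std dev above a threshold)
pred, std = gpr.predict(X[60:], return_std=True)
threshold = np.quantile(std, 0.8)
to_measure = np.where(std > threshold)[0]
```

The threshold here is an arbitrary quantile for illustration; in practice it would be tuned against the metrology budget and control requirements.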

Relevance: 20.00%

Abstract:

Increasingly, semiconductor manufacturers are exploring opportunities for virtual metrology (VM) enabled process monitoring and control as a means of reducing non-value-added metrology and achieving ever more demanding wafer fabrication tolerances. However, developing robust, reliable and interpretable VM models can be very challenging due to the highly correlated input space often associated with the underpinning data sets. A particularly pertinent example is etch rate prediction of plasma etch processes from multichannel optical emission spectroscopy data. This paper proposes a novel input-clustering based forward stepwise regression methodology for VM model building in such highly correlated input spaces. Max Separation Clustering (MSC) is employed as a pre-processing step to identify a reduced set of well-conditioned, representative variables that can then be used as inputs to state-of-the-art model building techniques such as Forward Selection Regression (FSR), Ridge Regression, LASSO and Forward Selection Ridge Regression (FSRR). The methodology is validated on a benchmark semiconductor plasma etch dataset and the results obtained are compared with those achieved when the state-of-the-art approaches are applied directly to the data without the MSC pre-processing step. Significant performance improvements are observed when MSC is combined with FSR (13%) and FSRR (8.5%), but not with Ridge Regression (-1%) or LASSO (-32%). The optimal VM results are obtained using the MSC-FSR and MSC-FSRR generated models. © 2012 IEEE.
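The forward stepwise step itself is easy to sketch. The toy below (synthetic data; a plain greedy RSS-based FSR, not the paper's MSC pipeline) adds, at each step, the input variable that most reduces the residual sum of squares:

```python
import numpy as np

def forward_selection(X, y, max_vars=3):
    """Greedy forward stepwise regression: at each step, add the variable
    that most reduces the residual sum of squares (RSS)."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(max_vars):
        best_rss, best_j = np.inf, None
        for j in remaining:
            A = X[:, selected + [j]]
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ coef) ** 2)
            if rss < best_rss:
                best_rss, best_j = rss, j
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Columns 1 and 5 carry the signal, so they should be picked first
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = 2 * X[:, 1] - 3 * X[:, 5] + 0.1 * rng.normal(size=200)
chosen = forward_selection(X, y)
```

With highly correlated inputs this greedy search becomes unstable, which is exactly the failure mode the MSC pre-processing step is designed to mitigate.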

Relevance: 20.00%

Abstract:

In a Bayesian learning setting, the posterior distribution of a predictive model arises from a trade-off between its prior distribution and the conditional likelihood of observed data. Such distribution functions usually rely on additional hyperparameters which need to be tuned in order to achieve optimum predictive performance; this operation can be efficiently performed in an Empirical Bayes fashion by maximizing the posterior marginal likelihood of the observed data. Since the score function of this optimization problem is in general characterized by the presence of local optima, it is necessary to resort to global optimization strategies, which require a large number of function evaluations. Given that the evaluation is usually computationally intensive and scales badly with the dataset size, the maximum number of observations that can be treated simultaneously is quite limited. In this paper, we consider the case of hyperparameter tuning in Gaussian process regression. A straightforward implementation of the posterior log-likelihood for this model requires O(N^3) operations for every iteration of the optimization procedure, where N is the number of examples in the input dataset. We derive a novel set of identities that allow, after an initial overhead of O(N^3), the evaluation of the score function, as well as the Jacobian and Hessian matrices, in O(N) operations. We prove how the proposed identities, that follow from the eigendecomposition of the kernel matrix, yield a reduction of several orders of magnitude in the computation time for the hyperparameter optimization problem. Notably, the proposed solution provides computational advantages even with respect to state-of-the-art approximations that rely on sparse kernel matrices.
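To give a flavour of why the eigendecomposition helps, the sketch below handles the simplest special case, where only the noise variance is tuned and the rest of the kernel is held fixed: after a one-off O(N^3) eigendecomposition of the fixed kernel matrix, every evaluation of the negative log marginal likelihood costs O(N). This is an illustrative assumption, not the paper's (more general) identities:

```python
import numpy as np

def gp_nll_fast(noise_var, eigvals, proj_sq):
    """O(N) negative log marginal likelihood of a zero-mean GP with
    covariance K0 + noise_var * I, given the eigendecomposition of the
    fixed kernel matrix K0 (eigvals) and proj_sq = (Q^T y)**2."""
    s = eigvals + noise_var
    n = len(eigvals)
    return 0.5 * (np.sum(proj_sq / s) + np.sum(np.log(s)) + n * np.log(2 * np.pi))

# One-off O(N^3) overhead: eigendecompose the fixed kernel matrix
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 1))
K0 = np.exp(-0.5 * (X - X.T) ** 2)   # RBF kernel, lengthscale held fixed
y = rng.normal(size=50)
eigvals, Q = np.linalg.eigh(K0)
proj_sq = (Q.T @ y) ** 2

# Every subsequent evaluation during hyperparameter search is O(N)
nll = gp_nll_fast(0.1, eigvals, proj_sq)
```

The trick is that (K0 + σ²I)⁻¹ and log|K0 + σ²I| both become elementwise operations on the eigenvalues, so the σ² search loop never touches an N×N matrix again.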

Relevance: 20.00%

Abstract:

In collaboration with Airbus-UK, the dimensional growth of small panels during the riveting of stiffeners is investigated. The stiffeners are fastened to the panels with rivets, and during this operation the panels have been observed to expand in the longitudinal and transverse directions. This growth is variable, and the challenge is to control the riveting process to minimize the variability. In this investigation, the assembly of the small panels and longitudinal stiffeners has been simulated using low- and high-fidelity nonlinear finite element models. The models have been validated against a limited set of experimental measurements; it was found that more accurate predictions of the riveting process are achieved using high-fidelity explicit finite element models. Furthermore, through a series of numerical simulations and probabilistic analyses, the manufacturing process control parameters that influence panel growth have been identified. Alternative fastening approaches were examined and it was found that dimensional growth can be controlled by changing the design of the dies used for forming the rivets.

Relevance: 20.00%

Abstract:

Considering the development of aerospace composite components, designing for reduced manufacturing layup cost and structural complexity is increasingly important. While the advantage of composite materials is the ability to tailor designs to various structural loads for minimum mass, the challenge is obtaining a design that is manufacturable and minimizes local ply incompatibility. The focus of the presented research is understanding how the relationships between mass, manufacturability and design complexity, under realistic loads and design requirements, can be affected by enforcing ply continuity in the design process. Presented is a series of sizing case studies on an upper wing cover, designed using conventional analyses and the tabular laminate design process. Introducing skin ply continuity constraints can generate skin designs with minimal ply discontinuities, fewer ply drops and larger ply areas than designs not constrained for continuity. However, the reduced design freedom associated with the addition of these constraints results in a weight penalty over the total wing cover. Perhaps more interestingly, when considering manual hand layup, the reduced design complexity is not translated into a reduced recurring manufacturing cost. In contrast, heavier wing cover designs appear to take more time to lay up regardless of the laminate design complexity. © 2012 AIAA.

Relevance: 20.00%

Abstract:

Virtual metrology (VM) aims to predict metrology values using sensor data from production equipment and physical metrology values of preceding samples. VM is a promising technology for the semiconductor manufacturing industry as it can reduce the frequency of in-line metrology operations and provide supportive information for other operations such as fault detection, predictive maintenance and run-to-run control. The prediction models for VM can be drawn from a large variety of linear and nonlinear regression methods, and the selection of a proper regression method for a specific VM problem is not straightforward, especially when the candidate predictor set is of high dimension, correlated and noisy. Using process data from a benchmark semiconductor manufacturing process, this paper evaluates the performance of four typical regression methods for VM: multiple linear regression (MLR), least absolute shrinkage and selection operator (LASSO), neural networks (NN) and Gaussian process regression (GPR). It is observed that GPR performs the best among the four methods and that, remarkably, the performance of linear regression approaches that of GPR as the subset of selected input variables is increased. The observed competitiveness of high-dimensional linear regression models, which does not hold true in general, is explained in the context of extreme learning machines and functional link neural networks.
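A comparison of this kind can be sketched with scikit-learn on synthetic data. Everything below (the data, hyperparameters and model settings) is an assumption for illustration, not the paper's setup:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for high-dimensional, noisy VM predictor data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200)

models = {
    "MLR": LinearRegression(),
    "LASSO": Lasso(alpha=0.01),
    "NN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "GPR": GaussianProcessRegressor(alpha=0.1, normalize_y=True),
}
# Mean 5-fold cross-validated R^2 for each candidate VM model
scores = {name: cross_val_score(m, X, y, cv=5, scoring="r2").mean()
          for name, m in models.items()}
```

Cross-validated scores of this form are one straightforward way to make the four model families comparable on a single benchmark dataset.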

Relevance: 20.00%

Abstract:

Pseudomonas aeruginosa is a major opportunistic pathogen in cystic fibrosis (CF) patients and causes a wide range of infections among other susceptible populations. Its inherent resistance to many antimicrobials also makes it difficult to treat infections with this pathogen. Recent evidence has highlighted the diversity of this species, yet despite this, the majority of studies on virulence and pathogenesis focus on a small number of strains. There is a pressing need for a P. aeruginosa reference panel to harmonize and coordinate the collective efforts of the P. aeruginosa research community. We have collated a panel of 43 P. aeruginosa strains that reflects the organism's diversity. In addition to the commonly studied clones, this panel includes transmissible strains, sequential CF isolates, strains with specific virulence characteristics, and strains that represent serotype, genotype or geographic diversity. This focussed panel of P. aeruginosa isolates will help accelerate and consolidate the discovery of virulence determinants, improve our understanding of the pathogenesis of infections caused by this pathogen, and provide the community with a valuable resource for the testing of novel therapeutic agents.

Relevance: 20.00%

Abstract:

Background: Tobacco smoke is a major risk to the health of its users and arsenic is among the components of smoke present at concentrations of toxicological concern. There are significant variations in human toxicity between inorganic and organic arsenic species and the aim of this study was to determine whether there are predictable relationships among major arsenic species in tobacco that could be useful for risk assessment.

Methods: Fourteen tobacco samples, spanning a wide range of concentrations and different geographical regions and including certified reference materials and cigarette products, were studied. Inorganic and major organic arsenic species were extracted from powdered tobacco samples by nitric acid using microwave digestion. Concentrations of arsenic species in these extracts were determined using HPLC-ICPMS.

Results: The concentrations of total inorganic arsenic species range from 144 to 3914 µg kg⁻¹, while the organic species dimethylarsinic acid (DMA) ranges from 21 to 176 µg As kg⁻¹, and monomethylarsonic acid (MA) ranges from 30 to 116 µg kg⁻¹. The percentage of species eluted compared to the total arsenic extracted ranges from 11.1 to 36.8%, suggesting that some As species (possibly macro-molecules, strongly complexed or in organic forms) do not elute from the column. This low percentage of column-speciated arsenic indicates that more complex forms of arsenic exist in the tobacco. All the analysed species correlate positively with total arsenic concentration over the whole compositional range, and regression analysis indicates a consistent ratio of about 4:1 in favour of inorganic arsenic compared with MA + DMA.
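The quoted 4:1 ratio corresponds to a zero-intercept regression of inorganic arsenic on MA + DMA. With hypothetical numbers (not the study's measurements), such a fit looks like:

```python
import numpy as np

# Hypothetical concentrations in ug/kg -- NOT the study's measurements
inorganic = np.array([144.0, 600.0, 1200.0, 2500.0, 3914.0])
ma_dma = inorganic / 4 + np.random.default_rng(1).normal(0, 20, size=5)

# Zero-intercept least squares: inorganic ~= ratio * (MA + DMA)
ratio, *_ = np.linalg.lstsq(ma_dma[:, None], inorganic, rcond=None)
```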

Conclusions: The dominance of inorganic arsenic species among those components analysed is a marked feature of the diverse range of tobaccos selected for study. Such consistency is important in the context of a WHO expert panel recommendation to regulate tobacco crops and products using total arsenic concentration. If implemented, more research would be required to develop models that accurately predict the smoker's exposure to reduced inorganic arsenic species on the basis of leaf or product concentration and product design features.

Relevance: 20.00%

Abstract:

BACKGROUND: Since the publication of the 2006 American College of Chest Physicians (CHEST) cough guidelines, a variety of tools has been developed or further refined for assessing cough. The purpose of the present committee was to evaluate instruments used by investigators performing clinical research on chronic cough. The specific aims were to (1) assess the performance of tools designed to measure cough frequency, severity, and impact in adults, adolescents, and children with chronic cough and (2) make recommendations or suggestions related to these findings.

METHODS: By following the CHEST methodologic guidelines, the CHEST Expert Cough Panel based its recommendations and suggestions on a recently published comparative effectiveness review commissioned by the US Agency for Healthcare Research and Quality, a corresponding summary published in CHEST, and an updated systematic review through November 2013. Recommendations or suggestions based on these data were discussed, graded, and voted on during a meeting of the Expert Cough Panel.

RESULTS: We recommend for adults, adolescents (≥ 14 years of age), and children complaining of chronic cough that validated and reliable health-related quality-of-life (QoL) questionnaires be used as the measurement of choice to assess the impact of cough, such as the Leicester Cough Questionnaire and the Cough-Specific Quality-of-Life Questionnaire in adult and adolescent patients and the Parent Cough-Specific Quality of Life Questionnaire in children. We recommend acoustic cough counting to assess cough frequency but not cough severity. Limited data exist regarding the performance of visual analog scales, numeric rating scales, and tussigenic challenges.

CONCLUSIONS: Validated and reliable cough-specific health-related QoL questionnaires are recommended as the measurement of choice to assess the impact of cough on patients. How they compare is yet to be determined. When used, the reporting of cough severity by visual analog or numeric rating scales should be standardized. Previously validated QoL questionnaires or other cough assessments should not be modified unless the new version has been shown to be reliable and valid. Finally, in research settings, tussigenic challenges play a role in understanding mechanisms of cough.

Relevance: 20.00%

Abstract:

BACKGROUND: This series of guidance documents on cough, which will be published over time, is a hybrid of two processes: (1) evidence-based guidelines and (2) trustworthy consensus statements based on a robust and transparent process.

METHODS: The CHEST Guidelines Oversight Committee selected a nonconflicted Panel Chair and jointly assembled an international panel of experts in each clinical area with few, if any, conflicts of interest. PICO (population, intervention, comparator, outcome)-based key questions and parameters of eligibility were developed for each clinical topic to inform the comprehensive literature search. Existing guidelines, systematic reviews, and primary studies were assessed for relevance and quality. Data elements were extracted into evidence tables and synthesized to provide summary statistics. These, in turn, are presented to support the evidence-based graded recommendations. A highly structured consensus-based Delphi approach was used to provide expert advice on all guidance statements. Transparency of process was documented.

RESULTS: Evidence-based guideline recommendations and consensus-based suggestions were carefully crafted to provide direction to health-care providers and investigators who treat and/or study patients with cough. Manuscripts and tables summarize the evidence in each clinical area supporting the recommendations and suggestions.

CONCLUSIONS: The resulting guidance statements are based on a rigorous methodology and transparency of process. Unless otherwise stated, the recommendations and suggestions meet the guidelines for trustworthiness developed by the Institute of Medicine and can be applied with confidence by physicians, nurses, other health-care providers, investigators, and patients.

Relevance: 20.00%

Abstract:

The liver fluke Fasciola hepatica is an economically important pathogen of sheep and cattle and has been described by the WHO as a re-emerging zoonosis. Control is heavily reliant on the use of drugs, particularly triclabendazole, and as a result resistance has now emerged. The population structure of F. hepatica is not well known, yet it can impact on host-parasite interactions and parasite control with drugs, particularly regarding the spread of triclabendazole resistance. We have identified 2448 potential microsatellites from 83 Mb of F. hepatica genome sequence using msatfinder. Thirty-five loci were developed and optimised for microsatellite PCR, resulting in a panel of 15 polymorphic loci with three to 15 alleles per locus. This panel was validated on genomic DNA from 46 adult F. hepatica: 38 liver flukes sourced from an abattoir in Northwest UK and eight liver flukes from an established isolate (Shrewsbury; Ridgeway Research). Evidence for null alleles was found at four loci (Fh_1, Fh_8, Fh_13 and Fh_14), which showed markedly higher levels of homozygosity than the remaining 11 loci. Of the 38 liver flukes isolated from cattle livers (n=10) at the abattoir, 37 genotypes were identified. Using a multiplex approach, all 15 loci could be amplified from several life cycle stages that typically yield low amounts of DNA, including metacercariae, the infective life cycle stage present on pasture, highlighting the utility of this multiplex microsatellite panel. This study reports the largest panel of microsatellite markers available to date for population studies of F. hepatica and the first multiplex panel of microsatellite markers that can be used across several life cycle stages.
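A scan for perfect tandem repeats of the kind msatfinder automates can be sketched with a simple regular expression (a toy illustration only, not msatfinder's actual algorithm):

```python
import re

def find_microsatellites(seq, unit_lengths=(2, 3, 4), min_repeats=5):
    """Toy scan for perfect tandem repeats: returns (start, unit, n_repeats)
    for every run of a short unit repeated at least min_repeats times."""
    hits = []
    for k in unit_lengths:
        pattern = r"([ACGT]{%d})\1{%d,}" % (k, min_repeats - 1)
        for m in re.finditer(pattern, seq):
            hits.append((m.start(), m.group(1), len(m.group(0)) // k))
    return hits

hits = find_microsatellites("GGAT" + "CA" * 7 + "GGT")  # one (CA)7 repeat
```

Real microsatellite discovery tools additionally handle imperfect repeats, compound repeats and flanking-sequence extraction for primer design, none of which this sketch attempts.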