919 results for Process control -- Statistical methods


Relevance:

100.00%

Publisher:

Abstract:

Reducing wafer metrology continues to be a major target in semiconductor manufacturing efficiency initiatives because it is a high-cost, non-value-added operation that impacts cycle time and throughput. However, metrology cannot be eliminated completely given the important role it plays in process monitoring and advanced process control. To achieve the required manufacturing precision, measurements are typically taken at multiple sites across a wafer. The selection of these sites is usually based on a priori knowledge of wafer failure patterns and spatial variability, with additional sites added over time in response to process issues. As a result, significant redundancy often exists in the wafer measurement plans of mature processes. This paper proposes a novel methodology based on Forward Selection Component Analysis (FSCA) for analyzing historical metrology data in order to determine the minimum set of wafer sites needed for process monitoring. The paper also introduces a virtual metrology (VM) based approach for reconstructing the complete wafer profile from the optimal sites identified by FSCA. The proposed methodology is tested and validated on a wafer manufacturing metrology dataset. © 2012 IEEE.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, the issue of life expectancy has become of utmost importance to pension providers, insurance companies, and government bodies in the developed world. Significant and consistent improvements in mortality rates, and hence life expectancy, have led to unprecedented increases in the cost of providing for older ages. This has resulted in an explosion of stochastic mortality models forecasting trends in mortality data to anticipate future life expectancy and hence quantify the costs of providing for future aging populations. Many stochastic models of mortality rates identify linear trends in mortality rates by time, age, and cohort, and forecast these trends into the future using standard statistical methods. These approaches rely on the assumption that structural breaks in the trend do not exist or do not have a significant impact on the mortality forecasts. Recent literature has started to question this assumption. In this paper, we carry out a comprehensive investigation of the presence of structural breaks in a selection of leading mortality models. We find that structural breaks are present in the majority of cases. In particular, we find that allowing for structural breaks, where present, significantly improves the forecast results.
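The structural-break question can be illustrated with a simple Chow-style search (a generic sketch, not one of the surveyed mortality models): fit one trend line, then compare its residual sum of squares against two lines split at each candidate break point. The series below is synthetic.

```python
# Single-break detection in a linear trend: a break is supported when splitting
# the series into two separately fitted lines reduces the residual sum of squares.

def ols_rss(xs, ys):
    """Residual sum of squares of an ordinary least-squares line fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def best_break(xs, ys, min_seg=3):
    """Return (break index, single-line RSS, best two-line RSS)."""
    full = ols_rss(xs, ys)
    best_k, best_rss = None, full
    for k in range(min_seg, len(xs) - min_seg + 1):
        rss = ols_rss(xs[:k], ys[:k]) + ols_rss(xs[k:], ys[k:])
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k, full, best_rss

years = list(range(10))
rates = [0, 1, 2, 3, 4, 5, 8, 11, 14, 17]   # slope changes from 1 to 3 at x = 5
k, full_rss, split_rss = best_break(years, rates)
print(k)   # → 5: splitting there makes both segments perfectly linear
```

A formal test would compare the RSS reduction against an F distribution; the point here is only that ignoring the break forces a single line through two distinct regimes, biasing any forecast.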

Relevance:

100.00%

Publisher:

Abstract:

The River Bush must reach a standard of good ecological potential (GEP) by 2015 under the requirements of the Water Framework Directive. The role of sediments within a water body is extremely important to all aspects of a river's regime. The aim of this research is to investigate the effects of Altnahinch Dam on sediment distribution in the River Bush (a heavily modified water body), with comparison made against the Glendun River (an unmodified water body). Samples collected from the rivers were analysed by physical (pebble count, sieve analysis) and statistical (ANOVA, GRADISTAT) methods. An increase in fine sediments upstream of the dam provides evidence that the dam is impacting sediment distribution. Downstream effects are not shown to be significant. The results of this study also imply similar impacts at other drinking water storage impoundments. This research recommends that a sediment management plan be put in place for Altnahinch Dam and that further studies be carried out concentrating on fine sediment distribution upstream of the dam.
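A one-way ANOVA of the kind used to compare grain-size samples between river reaches can be sketched as follows; the F statistic is computed from scratch and the sample values are synthetic, not the study's measurements.

```python
# One-way ANOVA F statistic: ratio of between-group to within-group mean squares.
# Groups might represent grain-size samples from different reaches of a river.

def anova_f(groups):
    all_vals = [v for g in groups for v in g]
    n, k = len(all_vals), len(groups)
    grand = sum(all_vals) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

upstream, downstream, control = [1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [9.0, 10.0, 11.0]
print(anova_f([upstream, downstream, control]))   # → 57.0
```

A large F indicates that the group means differ by more than the within-group scatter would suggest; the study's conclusion of significant upstream differences corresponds to such a result for the upstream samples.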

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: This series of guidance documents on cough, which will be published over time, is a hybrid of two processes: (1) evidence-based guidelines and (2) trustworthy consensus statements based on a robust and transparent process.

METHODS: The CHEST Guidelines Oversight Committee selected a nonconflicted Panel Chair and jointly assembled an international panel of experts in each clinical area with few, if any, conflicts of interest. PICO (population, intervention, comparator, outcome)-based key questions and parameters of eligibility were developed for each clinical topic to inform the comprehensive literature search. Existing guidelines, systematic reviews, and primary studies were assessed for relevance and quality. Data elements were extracted into evidence tables and synthesized to provide summary statistics. These, in turn, are presented to support the evidence-based graded recommendations. A highly structured consensus-based Delphi approach was used to provide expert advice on all guidance statements. Transparency of process was documented.
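The Delphi step can be illustrated with a hypothetical consensus rule; the actual CHEST scoring bands and acceptance thresholds are not given in this abstract, so the band and 75% cut-off below are assumptions for illustration only.

```python
# Hypothetical Delphi consensus rule: a guidance statement is accepted when a
# sufficient share of panellists score it within the top agreement band.
# The 5-7 band on a 1-7 scale and the 75% threshold are illustrative assumptions.

def consensus(scores, band=(5, 7), threshold=0.75):
    """True when at least `threshold` of panellists score within `band` (inclusive)."""
    in_band = sum(1 for s in scores if band[0] <= s <= band[1])
    return in_band / len(scores) >= threshold

print(consensus([7, 6, 5, 7, 2]))   # → True (4 of 5 panellists in the 5-7 band)
print(consensus([7, 1, 1, 1]))      # → False
```

In a multi-round Delphi such as the one described, statements failing the rule would be re-scored after panellists see the distribution of responses.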

RESULTS: Evidence-based guideline recommendations and consensus-based suggestions were carefully crafted to provide direction to health-care providers and investigators who treat and/or study patients with cough. Manuscripts and tables summarize the evidence in each clinical area supporting the recommendations and suggestions.

CONCLUSIONS: The resulting guidance statements are based on a rigorous methodology and transparency of process. Unless otherwise stated, the recommendations and suggestions meet the guidelines for trustworthiness developed by the Institute of Medicine and can be applied with confidence by physicians, nurses, other health-care providers, investigators, and patients.

Relevance:

100.00%

Publisher:

Abstract:

High density polyethylene (HDPE)/multi-walled carbon nanotube (MWCNT) nanocomposites were prepared by melt mixing using twin-screw extrusion. The extruded pellets were compression moulded at 200°C for 5 min, followed by cooling at two different rates (20°C/min and 300°C/min) to produce sheets for characterization. Scanning electron microscopy (SEM) shows that the MWCNTs are uniformly dispersed in the HDPE. At 4 wt% MWCNT loading, the composite modulus increased by over 110% compared with the unfilled HDPE, regardless of the cooling rate. The yield strength of both unfilled and filled HDPE decreased by about 10% after rapid cooling, due to lower crystallinity and imperfect crystallites. The electrical percolation threshold of the composites, irrespective of the cooling rate, lies between MWCNT concentrations of 1 and 2 wt%. Interestingly, the electrical resistivity of the rapidly cooled composite with 2 wt% MWCNTs is lower than that of the slowly cooled composite with the same MWCNT loading. This may be due to the lower crystallinity and smaller crystallites facilitating the formation of conductive pathways. This result may have significant implications for both process control and the tailoring of electrical conductivity in the manufacture of conductive HDPE/MWCNT nanocomposites.
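A crude way to locate a percolation threshold like the 1-2 wt% interval reported above is to find the loading step with the steepest drop in log resistivity; the values below are synthetic, not the paper's measurements.

```python
# Locate the percolation interval as the loading step with the largest drop in
# log10 resistivity. Loadings and resistivities below are illustrative values.
import math

def percolation_interval(loadings, resistivities):
    """Return the (low, high) loading interval with the steepest resistivity drop."""
    drops = [math.log10(resistivities[i]) - math.log10(resistivities[i + 1])
             for i in range(len(loadings) - 1)]
    i = drops.index(max(drops))
    return loadings[i], loadings[i + 1]

wt_pct = [0, 1, 2, 4]
ohm_cm = [1e16, 1e15, 1e6, 1e4]   # synthetic: insulating until ~1 wt%, then conductive
print(percolation_interval(wt_pct, ohm_cm))   # → (1, 2)
```

Real threshold estimates usually fit the classical power law near the transition; this interval search only brackets where that fit would be performed.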

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Core outcome sets can increase the efficiency and value of research and, as a result, there are an increasing number of studies looking to develop core outcome sets (COS). However, the credibility of a COS depends on both the use of sound methodology in its development and clear and transparent reporting of the processes adopted. To date there is no reporting guideline for reporting COS studies. The aim of this programme of research is to develop a reporting guideline for studies developing COS and to highlight some of the important methodological considerations in the process.

METHODS/DESIGN: The study will include a reporting guideline item generation stage which will then be used in a Delphi study. The Delphi study is anticipated to include two rounds. The first round will ask stakeholders to score the items listed and to add any new items they think are relevant. In the second round of the process, participants will be shown the distribution of scores for all stakeholder groups separately and asked to re-score. A final consensus meeting will be held with an expert panel and stakeholder representatives to review the guideline item list. Following the consensus meeting, a reporting guideline will be drafted and review and testing will be undertaken until the guideline is finalised. The final outcome will be the COS-STAR (Core Outcome Set-STAndards for Reporting) guideline for studies developing COS and a supporting explanatory document.

DISCUSSION: To assess the credibility and usefulness of a COS, readers of a COS development report need complete, clear and transparent information on its methodology and proposed core set of outcomes. The COS-STAR guideline will potentially benefit all stakeholders in COS development: COS developers, COS users, e.g. trialists and systematic reviewers, journal editors, policy-makers and patient groups.

Relevance:

100.00%

Publisher:

Abstract:

Background: High risk medications are commonly prescribed to older US patients. Currently, less is known about high risk medication prescribing in other Western countries, including the UK. We measured trends and correlates of high risk medication prescribing in a subset of the older UK population (community/institutionalized) to inform harm minimization efforts. Methods: Three cross-sectional samples were taken from primary care electronic clinical records (UK Clinical Practice Research Datalink, CPRD) in fiscal years 2003/04, 2007/08 and 2011/12. This yielded a sample of 13,900 people aged 65 years or over from 504 UK general practices. High risk medications were defined by the 2012 Beers Criteria adapted for the UK. Using descriptive statistical methods and regression modelling, the prevalence and correlates of 'any' (drugs prescribed at least once per year) and 'long-term' (drugs prescribed in all quarters of the year) high risk medication prescribing were determined. Results: While polypharmacy rates have risen sharply, high risk medication prevalence has remained stable across a decade. A third of older (65+) people are exposed to high risk medications, but only half of the total prevalence was long-term (any = 38.4 % [95 % CI: 36.3, 40.5]; long-term = 17.4 % [15.9, 19.9] in 2011/12). Long-term, but not any, high risk medication exposure was associated with older ages (85 years or over). Women and people with a higher polypharmacy burden were at greater risk of exposure; lower socio-economic status was not associated. Ten drugs/drug classes accounted for most high risk medication prescribing in 2011/12. Conclusions: High risk medication prescribing has not increased over time against a background of increasing polypharmacy in the UK. Half of patients receiving high risk medications do so for less than a year. Reducing or optimising the use of a limited number of drugs could dramatically reduce high risk medications in older people. Further research is needed to investigate why the oldest old and women are at greater risk. Interventions to reduce high risk medications may need to target shorter and long-term use separately.
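Prevalence estimates with confidence intervals like those quoted above can be sketched with a naive Wald interval; the study itself likely used survey-weighted methods, and the counts below are made up for illustration.

```python
# Point prevalence with a simple Wald 95% confidence interval.
# The case count and denominator are hypothetical, not the CPRD figures.
import math

def prevalence_ci(cases, n, z=1.96):
    """Return (prevalence, lower bound, upper bound) as proportions."""
    p = cases / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

p, lo, hi = prevalence_ci(384, 1000)
print(round(p * 100, 1), round(lo * 100, 1), round(hi * 100, 1))   # → 38.4 35.4 41.4
```

With the study's much larger denominator (13,900), the same point estimate would yield a tighter interval, consistent with the narrow CIs reported.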

Relevance:

100.00%

Publisher:

Abstract:

Single component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant sum problem (closure) and the inherently multivariate, relative information conveyed by compositional data. Well known, for instance, is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g., the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to removal of potassium during weathering. The validity of classical single component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles); this is similar to the classical approach of normalising to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis, clustering of variables, etc.) to extract potentially interesting geochemical summaries. The caution from this work is that if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant sum constraint. In summary, the authors recommend a chain of enquiry that involves searching for the appropriate statistical method to answer the required geological or geochemical question whilst maintaining the integrity of the compositional nature of the data. The required log-ratio transformations should be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
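One standard log-ratio device for compositional data is the centred log-ratio (clr) transform, sketched below; this is a generic illustration of the compositional approach described, not a method prescribed by the paper.

```python
# Centred log-ratio (clr) transform: each part is expressed relative to the
# geometric mean of the composition, removing the constant-sum constraint.
import math

def clr(parts):
    """clr transform of a composition; parts must be strictly positive."""
    logs = [math.log(x) for x in parts]
    g = sum(logs) / len(logs)          # log of the geometric mean
    return [l - g for l in logs]

print(clr([1.0, 1.0, 1.0]))   # → [0.0, 0.0, 0.0]
# clr coefficients always sum to zero, reflecting the relative nature of the data:
print(abs(sum(clr([55.0, 20.0, 25.0]))) < 1e-12)   # → True
```

Mapping a clr coefficient instead of a raw concentration guards against the quartz-dilution artefact described above, because dilution by SiO2 shifts all raw concentrations but leaves their ratios to the geometric mean far less affected.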

Relevance:

100.00%

Publisher:

Abstract:

Conventional practice in regional geochemistry includes, as a final step of any geochemical campaign, the generation of a series of maps showing the spatial distribution of each of the components considered. Such maps, though necessary, do not comply with the compositional, relative nature of the data, which unfortunately makes any conclusion based on them sensitive to spurious correlation problems. This is one of the reasons why these maps are never interpreted in isolation. This contribution aims at gathering a series of statistical methods to produce individual maps of multiplicative combinations of components (log-contrasts), much in the flavour of equilibrium constants, designed on purpose to capture certain aspects of the data.

We distinguish between supervised and unsupervised methods, where the former require an external, non-compositional variable (besides the compositional geochemical information) available in an analogous training set. This external variable can be a quantity (soil density, collocated magnetics, collocated ratio of Th/U spectral gamma counts, proportion of clay particle fraction, etc.) or a category (rock type, land use type, etc.). In the supervised methods, a regression-like model between the external variable and the geochemical composition is derived on the training set, and this model is then mapped over the whole region. This case is illustrated with the Tellus dataset, covering Northern Ireland at a density of one soil sample per 2 square km, where we map the presence of blanket peat and the underlying geology. The unsupervised methods considered include principal components and principal balances (Pawlowsky-Glahn et al., CoDaWork2013), i.e. log-contrasts of the data devised to capture very large variability or else to be quasi-constant. Using the Tellus dataset again, it is found that geological features are highlighted by the quasi-constant ratios Hf/Nb and their ratio against SiO2; Rb/K2O and Zr/Na2O and the balance between these two groups of two variables; the balance of Al2O3 and TiO2 vs. MgO; and the balance of Cr, Ni and Co vs. V and Fe2O3. The largest variability appears to be related to the presence/absence of peat.
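A balance, i.e. an isometric log-ratio between two groups of parts such as (Rb, K2O) versus (Zr, Na2O), can be computed as follows; the formula is the standard ilr balance and the input values are illustrative, not Tellus measurements.

```python
# Balance between two groups of compositional parts: a scaled log-ratio of the
# geometric means of the two groups. Input concentrations are illustrative.
import math

def balance(num_parts, den_parts):
    """Isometric log-ratio balance between two groups of strictly positive parts."""
    r, s = len(num_parts), len(den_parts)
    log_gn = sum(math.log(x) for x in num_parts) / r   # log geometric mean, group 1
    log_gd = sum(math.log(x) for x in den_parts) / s   # log geometric mean, group 2
    return math.sqrt(r * s / (r + s)) * (log_gn - log_gd)

# e.g. a (Rb, K2O)-style pair against a (Zr, Na2O)-style pair, made-up values:
print(round(balance([120.0, 2.1], [60.0, 1.4]), 3))   # → 0.549 (= ln sqrt(3) here)
```

A balance that stays quasi-constant across a map, like the Hf/Nb ratio mentioned above, signals a fixed geochemical relationship and can therefore delineate geological units.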

Relevance:

100.00%

Publisher:

Abstract:

Distributed embedded systems (DES) have been used over recent years in many application domains, from robotics and industrial process control to avionics and automotive applications, and this trend is expected to continue in the coming years. Dependability is an important property in these domains, since services must be executed in a timely and predictable manner; otherwise, economic damage may result or human lives may be put at risk. At design time it is impossible to anticipate every fault scenario, owing to the non-determinism of the surrounding environment, so fault-tolerance mechanisms must be included. Additionally, some of these applications require high bandwidth, which can also be used to evolve the systems by adding new functionality. Flexibility is an important system property because it allows adaptation to the surrounding conditions and requirements, while also simplifying maintenance and repair. In embedded systems, flexibility is further important because it enables better use of the often scarce available resources. An obvious way to increase both the bandwidth and the fault tolerance of distributed embedded systems is to replicate the system buses. Some existing solutions, both commercial and academic, propose bus replication either to increase bandwidth or to increase fault tolerance; however, almost invariably the purpose is only one of the two, and solutions that provide both higher bandwidth and increased fault tolerance are rare. One of these rare examples is FlexRay, with the limitation that only two buses may be used.

This thesis presents and discusses a proposal for using bus replication in a flexible way, with the dual goal of increasing bandwidth and fault tolerance. The flexibility of the proposed protocols also allows dynamic management of the network topology, with the number of buses limited only by the hardware/software. The proposals of this thesis were validated using the CAN (Controller Area Network) fieldbus, chosen because of its wide market deployment. More specifically, the proposed solutions were implemented and validated using a paradigm that combines flexibility with event-triggered and time-triggered communication: FTT (Flexible Time-Triggered). A generalisation to native CAN is also presented and discussed. Including bus-replication mechanisms requires changes to the previous master-node replication and substitution protocols, as well as the definition of new protocols for this purpose. This work exploits the centralised architecture and the replication of the master node to support bus replication efficiently and flexibly. If a fault occurs on a bus (or buses) that could cause a system failure, the protocols and components proposed in this thesis make the system react by switching to a degraded operating mode: messages that were being transmitted on the faulty buses are rerouted to the other buses. Master-node replication is based on a leader-followers strategy, in which the leader controls the whole system while the followers act as standby nodes. If an error occurs in the leader node, one of the follower nodes takes over control of the system transparently, maintaining the same functionality. The proposals were also generalised to native CAN, for which two additional components were proposed. In this way it is possible to have the same bus-level fault-tolerance capabilities together with dynamic management of the network topology.

All the proposals in this thesis were implemented and evaluated. An initial single-bus implementation was evaluated in a real application, a robotic soccer team in which the FTT-CAN protocol was used for motion control and odometry. The evaluation of the multi-bus system was carried out on a laboratory test platform. For this purpose, a fault-injection system was developed to impose faults on the buses and on the master nodes, together with a delay-measurement system to measure the response time after the occurrence of a fault.
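The leader-followers master replication can be caricatured in a few lines (a hypothetical sketch, far simpler than the FTT-CAN protocols described): the first live replica acts as leader, and a follower takes over transparently when it fails.

```python
# Hypothetical sketch of leader-followers master replication: the leader controls
# the system; on leader failure, the next live follower becomes the active master.
# This omits all real concerns (failure detection, bus rerouting, timing).

class MasterNode:
    def __init__(self, name):
        self.name = name
        self.alive = True

def active_master(replicas):
    """The active master is the first replica that has not failed."""
    for node in replicas:
        if node.alive:
            return node
    raise RuntimeError("all master replicas have failed")

replicas = [MasterNode("leader"), MasterNode("follower-1"), MasterNode("follower-2")]
print(active_master(replicas).name)   # → leader
replicas[0].alive = False             # inject a fault into the leader
print(active_master(replicas).name)   # → follower-1 (transparent takeover)
```

In the thesis, detecting that the leader has failed and keeping followers' state consistent is the hard part; the sketch only shows the succession order.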

Relevance:

100.00%

Publisher:

Abstract:

In the competitive, globalised economic context in which corporations operate, there is a need for constant evolution to keep pace with the changes the environment imposes, aiming at sustainability and longevity. The economic and financial evolution of corporations can foster a nation's development, even as increased market competition forces them to invest in new relationships with their environment, seeking to improve performance levels as measured by new economic and financial instruments. Corporate investment grade thus becomes relevant, since it can generate confidence in new investments and is seen as a synonym of a strong economy. The objective of this thesis was to develop an economic and financial indicator to gauge the credibility (rating) grade that corporations present in their corporate structure, using a set of economic and financial ratios related to liquidity, profitability, indebtedness and return, drawn from the financial statements of the corporations studied. The study is applied in type and descriptive in objective, with a bibliographic design; in terms of the research problem it is quantitative, covering a population of 70 Brazilian corporations recognised by the international rating agencies Standard & Poor's, Moody's and Fitch Ratings, which held investment-grade status in 2008. As for the statistical methods and procedures, descriptive analysis was first used to summarise the data, followed by correlation analysis using Pearson's linear correlation coefficient and then regression analysis.

Factor analysis was then used to build the model, Cronbach's alpha to test its reliability, and discriminant analysis to classify the quartiles. The conclusions of the study are based on the results of this statistical treatment: the correlations were initially predominantly weak, but this does not invalidate the Pearson correlation, since all coefficients were significant at (p < 0.05). In the regression analysis, all models presented satisfactory results, with a strong correlation evident. The reliability of the corporate investment-grade model derived from the factor analysis was confirmed by a Cronbach's alpha of 0.768, indicating satisfactory internal consistency. Over the 2008-2010 longitudinal base, the investment-grade indicator showed an accuracy ranging from 95.72% to 98.33%. It is therefore concluded that the indicator created in this study can be used as a basis for defining the investment grade of business corporations.
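Cronbach's alpha, used above to test the model's reliability (reported as 0.768), can be computed from scratch as follows; the item scores below are synthetic, not the thesis data.

```python
# Cronbach's alpha: internal-consistency reliability of a set of items,
# alpha = k/(k-1) * (1 - sum of item variances / variance of item totals).

def cronbach_alpha(items):
    """items: one list of scores per item (here, per financial ratio), aligned by firm."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(len(items[0]))]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Three perfectly consistent items give the maximum alpha of 1:
print(round(cronbach_alpha([[1, 2, 3], [1, 2, 3], [1, 2, 3]]), 6))   # → 1.0
```

Values around 0.7 or above are conventionally read as satisfactory internal consistency, which is the interpretation the thesis applies to its 0.768.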

Relevance:

100.00%

Publisher:

Abstract:

Urban soil quality may be severely affected by hydrophobic organic contaminants (HOCs), impairing environmental quality and human health. A comprehensive study was conducted in two contrasting Portuguese urban areas (Lisbon and Viseu) in order to assess the levels and potential risks of these contaminants, to identify sources and to study their behaviour in soils. The concentrations of HOCs were related to the size of the city, with much higher contamination levels observed in the Lisbon urban area. Source apportionment was performed by studying the HOC profiles, their relationship with potentially toxic elements and the general characteristics of the soil using multivariate statistical methods. Lisbon seems to be affected by nearby sources (traffic, industry and incineration processes), whereas in Viseu atmospheric transport may be playing an important role. In a first tier of risk assessment (RA) it was possible to identify polycyclic aromatic hydrocarbons (PAHs) in Lisbon soils as a potential hazard. The levels of PAHs in street dusts were further studied, which clarified that traffic, tire and pavement debris can be an important source of PAHs to urban soils. Street dusts were also identified as a potential concern for human and environmental health, especially if they reach nearby aquatic bodies. Geostatistical tools were also used, and their usefulness in RA analysis and urban planning was discussed.

In order to obtain a more realistic assessment of the risks of HOCs to the environment and human health, it is important to evaluate their available fraction, which is also the most accessible to organisms. Therefore, a review of the processes involved in the availability of PAHs was performed and the outputs produced by the different chemical methods were evaluated. The suitability of chemical methods to predict the bioavailability of PAHs in dissimilar naturally contaminated soils has not been demonstrated, and prediction is especially difficult for high molecular weight compounds. No clear relationship between chemical and biological availability was found in this work. Yet, in spite of the very high total concentrations found in some Lisbon soils, both the water-soluble fraction and the body residues resulting from bioaccumulation assays were generally very low, which may be due to aging phenomena. The percentage of the soluble fraction of PAHs in soils was found to differ among compounds and to be mostly regulated by soil properties. Regarding bioaccumulation assays, although no significant relationship was found between soil properties and bioavailability, it was verified that biota-to-soil bioaccumulation factors were sample dependent rather than compound dependent. In conclusion, once the compounds of potential concern are targeted, performing a chemical screening as a first tier can be a simple and effective approach to start a RA. However, reliable data are still required to improve the existing models for risk characterization.