970 resultados para Statistical performance indexes
Resumo:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Resumo:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Resumo:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Resumo:
Given the high competitiveness of the market, the application of quantitative methods can assist in analyzing the efficiency of production facilities in the export and import process areas of the chemical industry sector. In this sense, this work aims to apply the GPDEA-BCC optimization model in order to develop an analysis of the production units of this chemical industry. To that end, variables relevant to the process were chosen and a final comparison was drawn between the results obtained with the optimization tool and the performance indexes provided by the company. These results indicated that some production units should be monitored more carefully, because some of them showed low efficiency when analyzed under multiple criteria.
Resumo:
The Management Model of the Integrated Library System of the Universidade de São Paulo (SIBi/USP) incorporates management concepts and tools common to modern organizations in order to ensure better systemic performance indexes and to provide users with services of quality and efficiency. Part of the Model is the identification and detailed description of the System's work processes, as well as the establishment of some performance indicators. Building on the initial work, a complementary survey was carried out to identify processes covering the System broadly as a whole. The data were laid out in spreadsheets for better visualization, particularly with regard to macro-processes, processes, sub-processes, and activities. The processes were separated into essential, managerial, and support processes, and the activities pertaining to each of them were listed, along with the technical instructions and workflows. Some indicators were established, taking as reference the IFLA indicators already studied by another Working Group. From that study, four indicators were tested and validated by SIBi/USP through a pilot application in some of the System's libraries, and others were defined in the current study. This made it possible to map the processes and activities carried out by the set of libraries, with each library able to adapt its mapping according to its specialty and specificity. The definition of a basic core of indicators aims to enable the fulfillment of the mission and objectives in line with SIBi/USP policy.
Resumo:
The aim of this study was to compare the techniques of indirect immunofluorescence assay (IFA) and flow cytometry with the clinical and laboratory evaluation of patients before and after clinical cure, and to evaluate the applicability of flow cytometry in the post-therapeutic monitoring of patients with American tegumentary leishmaniasis (ATL). Sera from 14 patients before treatment (BT), 13 patients 1 year after treatment (AT), and 10 patients 2 and 5 years AT were evaluated. The results from flow cytometry were expressed as levels of IgG reactivity, based on the percentage of positive fluorescent parasites (PPFP). The 1:256 sample dilution allowed us to differentiate individuals BT and AT. Comparative analysis of IFA and flow cytometry by ROC (receiver operating characteristic) curve showed, respectively, AUC (area under the curve) = 0.8 (95% CI = 0.64–0.89) and AUC = 0.90 (95% CI = 0.75–0.95), demonstrating that flow cytometry had equivalent accuracy. Our data demonstrated that 20% was the best cut-off point identified by the ROC curve for the flow cytometry assay. This test showed a sensitivity of 86% and a specificity of 77%, while the IFA had a sensitivity of 78% and a specificity of 85%. Post-treatment screening at 1, 2 and 5 years AT, through comparative analysis of the techniques' performance indexes, showed equal performance of flow cytometry compared with IFA. However, flow cytometry proved to be a better diagnostic alternative when applied to the study of ATL as a cure criterion. The information obtained in this work opens perspectives for monitoring cure after treatment of ATL.
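For illustration only (the PPFP values and labels below are hypothetical, not the study's data), the sensitivity and specificity figures quoted above follow from simple counts of cases on either side of the chosen cut-off:

```python
def sensitivity_specificity(ppfp_values, diseased, cutoff):
    """Sensitivity and specificity of a PPFP cut-off.

    ppfp_values: percentage of positive fluorescent parasites per serum sample.
    diseased: True for sera taken before treatment (positive reference), False otherwise.
    A sample is called positive when its PPFP is at or above the cut-off.
    """
    tp = sum(1 for v, d in zip(ppfp_values, diseased) if d and v >= cutoff)
    fn = sum(1 for v, d in zip(ppfp_values, diseased) if d and v < cutoff)
    tn = sum(1 for v, d in zip(ppfp_values, diseased) if not d and v < cutoff)
    fp = sum(1 for v, d in zip(ppfp_values, diseased) if not d and v >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical sera: three before-treatment, three after-treatment, 20% cut-off.
sens, spec = sensitivity_specificity(
    [30, 25, 18, 10, 22, 5],
    [True, True, True, False, False, False],
    cutoff=20,
)
```

Sweeping the cut-off over all observed PPFP values and plotting sensitivity against (1 − specificity) is what produces the ROC curve referred to in the abstract.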
Resumo:
The development of the digital electronics market is founded on the continuous reduction of transistor size, which reduces area, power, and cost while increasing the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage becomes increasingly uncertain as transistor sizes shrink, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage limits the scaling of the threshold and supply voltages, increasing power density and creating local thermal issues such as hot spots, thermal runaway, and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions reduce transistor robustness, which, combined with high temperature and frequent thermal cycles, speeds up wear-out processes. These effects can no longer be addressed at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with the yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) the implementation of new analysis algorithms able to predict the system's thermal behavior and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system;
iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. These new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on tunable parameters such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB), and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network-on-Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates the need to integrate thermal analysis into the first design stages of embedded NoC design. Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. As a result, we confirmed the capability of self-timed logic to increase manufacturability and reliability.
Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we discovered the superior robustness of low-swing links to systematic process variation, together with a good response to compensation techniques such as ASV and ABB. Hence low-swing signaling is a good alternative to standard CMOS communication in terms of power, speed, reliability, and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
Resumo:
This paper studies relational goods as immaterial assets that create real effects in society. The work starts by answering this question: what kind of effects do relational goods produce? After a careful examination of the literature, we suppose that relational goods are social relations of second order. In our hypothesis they arise from the emergence of two distinct social relations: interpersonal and reflexive relations. We describe empirical evidence of these emergent assets in social life and test the effects they produce with a model. The work focuses on four targets. First of all, we describe the emergence of relational goods through a mathematical model. Then we identify social realities where relational goods show evident effects and outline our scientific hypothesis. The following step consists in the formulation of empirical tests. Finally, we explain the results. Our aim is to distill the constitutive structure of relational goods into a testable model consistent with the empirical evidence shown in the research. In the study we use multivariate analysis techniques to see relational goods in a new way, combining qualitative and quantitative strategies. Relational goods are analyzed both as dependent and as independent variables in order to consider causative factors acting in a black-box model. Moreover, we analyze the effects of relational goods inside social spheres, especially in the third sector and the capitalistic economy. Finally, we arrive at effective indexes of relational goods in order to compare them with some performance indexes.
Resumo:
Integrating sociological and psychological perspectives, this research considers the value of organizational ethnic diversity as a function of community diversity. Employee and patient surveys, census data, and performance indexes relevant to 142 hospitals in the United Kingdom suggest that intraorganizational ethnic diversity is associated with reduced civility toward patients. However, the degree to which organizational demography was representative of community demography was positively related to civility experienced by patients and ultimately enhanced organizational performance. These findings underscore the understudied effects of community context and imply that intergroup biases manifested in incivility toward out-group members hinder organizational performance.
Resumo:
Thesis (Ph.D.)--University of Washington, 2016-08
Resumo:
Water is an essential and scarce resource; as such, measures must be found that allow its sustainable use and guarantee the protection of the environment. This growing concern has driven national and international legislation aimed at ensuring sustainable development, giving rise to the Water Framework Directive and the Water Law, complemented by various other legislation. As a constituent element of the urban water cycle, water supply systems have undergone evolutions that have not always been adequate. It is in this context that, in Portugal, various tools for improving the management of water resources have emerged. Water utilities aim at the efficient management of water and have two important instruments at their disposal: the National Programme for the Efficient Use of Water and the ERSAR guide on the control of water losses in public transmission and distribution systems. This management involves not only addressing the problem of water losses, both real and apparent, but also analyzing the behavior that causes waste. APA, as a water utility, seeks to maximize the efficiency of its supply system, and to that end the tools proposed by ERSAR were applied. It was concluded that the system has total water losses of 34%, essentially due to the aged meter stock and (theoretical) losses in the service connections. Commercial losses represent about 69%, which shows that non-billed water volumes (metered or not) are very high. In addition, calculating the water balance and the performance indexes makes it possible to classify the performance of the supply system and compare it against its management objectives. Given the volume of water lost in the service connections, night-time measurements were carried out, showing an unjustified volume of water flow at the Porto de Pesca Costeira.
On this basis, an action plan was drawn up to increase the efficiency of the system, that is, to reduce total losses from 34% to 15%.
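The loss percentages above come from a standard water-balance calculation of the kind recommended in the ERSAR guide (IWA-style balance). A minimal sketch with hypothetical annual volumes, not the thesis's actual figures:

```python
def water_balance(system_input, authorized_billed, authorized_unbilled):
    """Simplified IWA-style water balance (all volumes in m3/year).

    Water losses = system input minus all authorized consumption.
    Non-revenue water = system input minus billed authorized consumption.
    Returns both as percentages of system input.
    """
    water_losses = system_input - authorized_billed - authorized_unbilled
    non_revenue = system_input - authorized_billed
    return (water_losses * 100 / system_input,
            non_revenue * 100 / system_input)

# Hypothetical utility: 100,000 m3 input, 60,000 m3 billed, 6,000 m3 unbilled
# but authorized (e.g. firefighting, mains flushing).
losses_pct, nrw_pct = water_balance(100_000, 60_000, 6_000)
```

With these illustrative volumes the total losses come out at 34% of system input, matching the order of magnitude reported in the abstract.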
Resumo:
The aim of the present study was to provide a numerical measure, through the process capability indexes (PCIs) C(p) and C(pk), of whether or not the manufacturing process can be considered capable of producing metamizol (500 mg) tablets. The indexes were also used as statistical tools to prove the consistency of the tabletting process, making sure that the tablet weight and the content uniformity of metamizol comply with the preset requirements. In addition, ANOVA, the t-test and the test for equal variances were applied in this study, allowing additional knowledge of the tabletting phase. The proposed statistical approach is therefore intended to assure more safety, precision and accuracy in the process validation analysis.
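The C(p) and C(pk) indexes mentioned above have standard textbook definitions based on the specification limits and the process spread: C(p) = (USL − LSL)/6σ and C(pk) = min(USL − μ, μ − LSL)/3σ. A minimal sketch with hypothetical tablet-weight data and limits (not the study's values):

```python
import statistics

def cp_cpk(samples, lsl, usl):
    """Process capability indexes from a sample of measurements.

    Cp compares the tolerance width to the process spread and ignores
    centering; Cpk additionally penalizes an off-center process mean,
    so Cpk <= Cp always, with equality when the process is centered.
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical tablet weights (mg) against hypothetical +/-3% weight limits.
cp, cpk = cp_cpk([498.0, 500.0, 502.0, 500.0, 500.0], lsl=485.0, usl=515.0)
```

A common rule of thumb is that C(pk) ≥ 1.33 indicates a capable process, though the acceptance criterion belongs to the validation protocol, not to the formula.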
Resumo:
The demand for cost and time reductions in companies' processes, in order to increase efficiency, leads companies to seek innovative management paradigms to support their needs for growth and continuous improvement. The Lean paradigm is highly relevant to companies' need for waste reduction, particularly in manufacturing companies. On the other hand, companies' demand for waste reduction has gained a new dimension, not only at the material level but also at the environmental level, with the introduction of the Green paradigm. As such, manufacturing companies have been adopting practices that reduce the impact of their activities on the environment. Although many manufacturing companies nowadays already implement waste reduction practices related to the Lean and Green paradigms, many of them are unable to tell whether their efforts are enough for the application of these practices to be successful, or even whether their actual performance in implementing Lean or Green practices reflects their self-assessment. Thus, besides studying the development of the Lean and Green paradigms in recent years, the present dissertation has the important objective of constructing two indexes (the Lean Index and the Green Index) that enable the measurement of the performance of Portuguese manufacturing companies regarding the implementation of Lean and Green practices. The data used to create the Lean and Green indexes were obtained from the implementation of the European Manufacturing Survey 2012 in Portugal. The survey questions related to the implementation of Lean and Green practices are used as variables in the development of the model for the two indexes. For the construction of the expressions representing the Lean Index and the Green Index, Factor Analysis was applied to assign the variable weights and aggregate them.
Resumo:
Boundaries for delta, representing a "quantitatively significant" or "substantively impressive" distinction, have not been established, analogous to the boundary of alpha, usually set at 0.05, for the stochastic or probabilistic component of "statistical significance". To determine what boundaries are being used for these "quantitative" decisions, we reviewed pertinent articles in three general medical journals. For each contrast of two means, contrast of two rates, or correlation coefficient, we noted the investigators' decisions about stochastic significance, stated in P values or confidence intervals, and about quantitative significance, indicated by interpretive comments. The boundaries between impressive and unimpressive distinctions were best formed by a ratio of at least 1.2 of the larger to the smaller mean in 546 comparisons of two means; by a standardized increment of at least 0.28 and an odds ratio of at least 2.2 in 392 comparisons of two rates; and by an r value of at least 0.32 in 154 correlation coefficients. Additional boundaries were also identified for "substantially" and "highly" significant quantitative distinctions. Although the proposed boundaries should be kept flexible, indexes and boundaries for decisions about "quantitative significance" are particularly useful when a value of delta must be chosen to calculate sample size before the research is done, and when the "statistical significance" of completed research is appraised for its quantitative as well as its stochastic components.
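The two-rate boundaries above can be checked with a few lines of arithmetic. This sketch assumes an unweighted pooled rate inside the standardized increment, which is one common form of that index, not necessarily the authors' exact definition; the rates used are hypothetical:

```python
import math

def standardized_increment(p1, p2):
    """Difference between two rates divided by the SD of the pooled rate.

    Assumes an unweighted pooled rate p = (p1 + p2) / 2; other pooling
    schemes (e.g. weighting by group size) are also in use.
    """
    p = (p1 + p2) / 2
    return (p1 - p2) / math.sqrt(p * (1 - p))

def odds_ratio(p1, p2):
    """Odds ratio for two rates, expressed as proportions in (0, 1)."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Hypothetical contrast of two rates: 40% vs 20%.
si = standardized_increment(0.4, 0.2)   # compare against the 0.28 boundary
orr = odds_ratio(0.4, 0.2)              # compare against the 2.2 boundary
```

For these illustrative rates, both indexes exceed the respective boundaries (0.28 and 2.2), so the contrast would count as "quantitatively significant" under the criteria reported above.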