939 results for process parameter monitoring


Relevance:

30.00%

Publisher:

Abstract:

For a multilayered specimen, the back-scattered signal in frequency-domain optical-coherence tomography (FDOCT) is expressible as a sum of cosines, each corresponding to a change of refractive index in the specimen. Each of the cosines represents a peak in the reconstructed tomogram. We consider a truncated cosine series representation of the signal, with the constraint that the coefficients in the basis expansion be sparse. An ℓ2 (sum of squared errors) data-error term is considered with an ℓ1 (sum of absolute values) constraint on the coefficients. The optimization problem is solved using Weiszfeld's iteratively reweighted least squares (IRLS) algorithm. On real FDOCT data, improved results are obtained over the standard reconstruction technique, with lower levels of background measurement noise and fewer artifacts owing to the strong ℓ1 penalty. Previous sparse tomogram reconstruction techniques in the literature proposed collecting sparse samples, which necessitates a change in the data-capture process conventionally used in FDOCT. The IRLS-based method proposed in this paper does not suffer from this drawback.
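The reconstruction described above amounts to an ℓ2-data / ℓ1-penalty least-squares problem solved by IRLS. Below is a minimal sketch of that idea, assuming a generic reweighting scheme (not necessarily the Weiszfeld variant used by the authors) and a toy cosine dictionary whose size, depth grid, noise level, and regularization weight are all illustrative.

```python
import numpy as np

def irls_l1(A, y, lam=1.0, n_iter=50, eps=1e-6):
    """Minimize ||A x - y||_2^2 + lam * ||x||_1 by iteratively reweighted
    least squares: each iteration replaces the l1 term with a quadratic
    surrogate, so the update is an ordinary linear solve."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        w = 1.0 / (np.abs(x) + eps)          # reweighting: |x_i| ~ x_i^2 * w_i
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)
    return x

# Toy FDOCT-like example: spectral interferogram as a sparse sum of cosines.
rng = np.random.default_rng(0)
k = np.linspace(0, 1, 512)                   # normalized wavenumber axis
depths = np.arange(1, 129)                   # candidate depths (truncated series)
A = np.cos(2 * np.pi * np.outer(k, depths))
x_true = np.zeros(len(depths)); x_true[[10, 40]] = [1.0, 0.6]
y = A @ x_true + 0.05 * rng.standard_normal(len(k))

x_hat = irls_l1(A, y, lam=5.0)
print(np.argsort(np.abs(x_hat))[-2:])        # indices of the two strongest recovered coefficients
```

Because the reweighting turns the non-smooth ℓ1 term into a quadratic surrogate, every iteration reduces to a standard linear solve, which is what makes the approach attractive for conventionally sampled FDOCT data.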

Relevance:

30.00%

Publisher:

Abstract:

The transition of thermocapillary convection from a steady, axisymmetric mode to an oscillatory mode in a liquid bridge with a fixed aspect ratio and varied volume ratio was studied experimentally. To ensure that surface tension plays an important role in the ground-based experiment, the geometry of the liquid bridge was designed so that the associated dynamic Bond number satisfies Bd ≈ 1. The velocity fields were measured using the Particle Image Velocimetry (PIV) technique to distinguish effectively between the different flow modes during the transition period. Our experiments showed that, as the temperature difference increased, slender and fat bridges evolved quite differently: in the former, the thermocapillary convection transformed from a steady, axisymmetric pattern directly into an oscillatory one; in the latter, a transitional flow state, characterized by an axially asymmetric steady convection, appeared before the oscillatory mode was reached. The experimental observations agree with the results of numerical simulations, and the volume of the liquid bridge is clearly a sensitive geometric parameter. In addition, at the initial stage of the oscillation, a rotating oscillatory convection with azimuthal wave number m = 1 was observed in the slender bridge, while a pulsating oscillatory pattern with azimuthal wave number m = 2 emerged in the fat bridge; with a further increase of the temperature difference, the pulsating m = 2 convection evolved into a rotating oscillatory pattern with the same wave number.

Relevance:

30.00%

Publisher:

Abstract:

Statistical Process Control (SPC) techniques are well established across a wide range of industries. In particular, the plotting of key steady-state variables with their statistical limits against time (Shewhart charting) is a common approach for monitoring the normality of production. This paper is concerned with extending Shewhart charting techniques to the quality monitoring of variables driven by uncertain dynamic processes, which has particular application in the process industries, where it is desirable to monitor process variables on-line as well as the final product. The robust approach to dynamic SPC is based on previous work on guaranteed cost filtering for linear systems and is intended both to provide a basis for wider application of SPC monitoring and to motivate unstructured fault detection.
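As a point of reference for the dynamic extension discussed in the paper, here is a minimal sketch of the classical Shewhart individuals chart (centre line plus k-sigma limits estimated from an in-control sample). The data, the simulated shift, and the 3-sigma choice are illustrative; the guaranteed cost filtering machinery of the paper is not reproduced.

```python
import numpy as np

def shewhart_limits(x, k=3.0):
    """Classical Shewhart individuals chart: centre line and k-sigma limits
    estimated from an in-control reference sample x."""
    centre = np.mean(x)
    sigma = np.std(x, ddof=1)
    return centre - k * sigma, centre, centre + k * sigma

# Hypothetical example: flag out-of-control points in new observations.
rng = np.random.default_rng(1)
reference = rng.normal(10.0, 0.5, size=200)     # in-control data
lcl, cl, ucl = shewhart_limits(reference)
new_data = rng.normal(10.0, 0.5, size=50)
new_data[-5:] += 2.0                            # simulated process shift
alarms = np.where((new_data < lcl) | (new_data > ucl))[0]
print(f"CL={cl:.2f}, limits=({lcl:.2f}, {ucl:.2f}), alarms at {alarms}")
```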

Relevance:

30.00%

Publisher:

Abstract:

The pure diffusion process has often been used to study the crystal growth of a binary alloy in the microgravity environment. In the present paper, a geometric parameter, the ratio of the maximum deviation of the curved solidification and melting interfaces from a plane to the radius of the crystal rod, was adopted as a small parameter, and an analytical solution was obtained based on perturbation theory. The radial segregation of a diffusion-dominated process was obtained for arbitrary Peclet number in a region of finite extension with both a curved solidification interface and a curved melting interface. Two types of boundary conditions at the melting interface were analyzed. Special cases, such as infinite extension in the longitudinal direction and particular ranges of the Peclet number, were recovered from the general solution and discussed in detail.

Relevance:

30.00%

Publisher:

Abstract:

The PICES Science Board and the Science and Technology Agency of Japan held a Workshop on Monitoring Subarctic North Pacific Variability, October 22-23, 1994, in Nemuro, Hokkaido, Japan, in conjunction with the PICES Third Annual Meeting. The Workshop was not intended to discuss process studies or to review the science of the subarctic Pacific, but rather to focus on the long-term monitoring programs required for assessment of the physical and ecological responses to long-term forcing, both natural and man-made. (PDF contains 90 pages)

Relevance:

30.00%

Publisher:

Abstract:

The Channel Islands—sometimes called the Galapagos of North America—are known for their great beauty, rich biodiversity, cultural heritage, and recreational opportunities. In 1980, in recognition of the islands’ importance, the United States Congress established a national park encompassing 5 of California’s Channel Islands (Santa Barbara, Anacapa, Santa Cruz, Santa Rosa, and San Miguel Islands) and waters within 1 nautical mile of the islands. In the same year, Congress declared a national marine sanctuary around each of these islands, including waters up to 6 nautical miles offshore. Approximately 60,000 people visit the Channel Islands each year for aquatic recreation such as fishing, sailing, kayaking, wildlife watching, surfing, and diving. Another 30,000 people visit the islands for hiking, camping, and sightseeing. Dozens of commercial fishing boats based in Santa Barbara, Ventura, Oxnard, and other ports go to the Channel Islands to catch squid, spiny lobster, sea urchin, rockfish, crab, sheephead, flatfish, and sea cucumber, among other species. In the past few decades, advances in fishing technology and the rising number of fishermen, in conjunction with changing ocean conditions and diseases, have contributed to declines in some marine fishes and invertebrates at the Channel Islands. In 1998, citizens from Santa Barbara and Ventura proposed establishment of no-take marine reserves at the Channel Islands, beginning a 4-year process of public meetings, discussions, and scientific analyses. In 2003, the California Fish and Game Commission designated a network of marine protected areas (MPAs) in state waters around the northern Channel Islands. In 2006 and 2007, the National Oceanic and Atmospheric Administration (NOAA) extended the MPAs into the national marine sanctuary’s deeper, federal waters. To determine if the MPAs are protecting marine species and habitats, scientists are monitoring ecological changes. They are studying changes in habitats; abundance and size of species of interest; the ocean food web and ecosystem; and movement of fish and invertebrates from MPAs to surrounding waters. Additionally, scientists are monitoring human activities such as commercial and recreational fisheries, and compliance with MPA regulations. This booklet describes some results from the first 5 years of monitoring the Channel Islands MPAs. Although 5 years is not long enough to determine if the MPAs will accomplish all of their goals, this booklet offers a glimpse of the changes that are beginning to take place and illustrates the types of information that will eventually be used to assess the MPAs’ effectiveness. (PDF contains 24 pages.)

Relevance:

30.00%

Publisher:

Abstract:

In response to infection or tissue dysfunction, immune cells develop into highly heterogeneous repertoires with diverse functions. Capturing the full spectrum of these functions requires analysis of large numbers of effector molecules from single cells. However, currently only 3-5 functional proteins can be measured from single cells. We developed a single cell functional proteomics approach that integrates a microchip platform with multiplex cell purification. This approach can quantitate 20 proteins from >5,000 phenotypically pure single cells simultaneously. With a million-fold miniaturization, the system can detect down to ~100 molecules and requires only ~10^4 cells. Single cell functional proteomic analysis finds broad application in basic, translational, and clinical studies. In the three studies conducted, it yielded critical insights into clinical cancer immunotherapy, the mechanism of inflammatory bowel disease (IBD), and hematopoietic stem cell (HSC) biology.

To study phenotypically defined cell populations, single cell barcode microchips were coupled with upstream multiplex cell purification based on up to 11 parameters. Statistical algorithms were developed to process and model the high-dimensional readouts. The analysis accommodates rare cells and is versatile across cell types and protein panels. (1) We conducted an immune-monitoring study of a phase 2 cancer cellular immunotherapy clinical trial that used T-cell receptor (TCR) transgenic T cells as the major therapeutic to treat metastatic melanoma. We evaluated the functional proteome of 4 antigen-specific, phenotypically defined T cell populations from the peripheral blood of 3 patients across 8 time points. (2) Natural killer (NK) cells can play a protective role in chronic inflammation, and their surface receptor, the killer immunoglobulin-like receptor (KIR), has been identified as a risk factor for IBD. We compared the functional behavior of NK cells with differential KIR expression. These NK cells were retrieved from the blood of 12 patients with different genetic backgrounds. (3) HSCs are the progenitors of immune cells and are thought to have no immediate functional capacity against pathogens. However, recent studies identified expression of Toll-like receptors (TLRs) on HSCs. We studied the functional capacity of HSCs upon TLR activation. Comparing HSCs from wild-type mice with those from genetic knockout mouse models elucidated the responsible signaling pathway.

In all three cases, we observed profound functional heterogeneity within phenotypically defined cells. Polyfunctional cells, which carry out multiple functions, also produce those proteins in large amounts and dominate the immune response. In the cancer immunotherapy, the strong cytotoxic and antitumor functions of the transgenic TCR T cells contributed to a ~30% tumor reduction immediately after the therapy. However, this infused immune response disappeared within 2-3 weeks. Later on, some patients developed a second antitumor response, consisting of the emergence of endogenous antitumor cytotoxic T cells and their production of multiple antitumor functions. These patients showed more effective long-term tumor control. In the IBD mechanism study, we noticed that NK cells expressing the KIR2DL3 receptor secreted a larger array of effector proteins than other NK cells, including TNF-α, CCLs, and CXCLs. The functions of these cells regulated disease-contributing cells and protected host tissues, and their presence correlated with IBD susceptibility. In the HSC study, the HSCs exhibited functional capacity by producing TNF-α, IL-6, and GM-CSF, and TLR stimulation activated NF-κB signaling in HSCs. The single-cell functional proteome contains rich information that is independent of the genome and transcriptome. In all three cases, functional proteomic evaluation uncovered critical biological insights that could not have been resolved otherwise. The integrated single cell functional proteomic analysis constructed a detailed kinetic picture of the immune response that took place during the clinical cancer immunotherapy. It revealed concrete functional evidence connecting genetics to IBD susceptibility. Further, it provided predictors that correlated with clinical responses and pathogenic outcomes.

Relevance:

30.00%

Publisher:

Abstract:

Seismic reflection methods have been extensively used to probe the Earth's crust and suggest the nature of its formative processes. The analysis of multi-offset seismic reflection data extends the technique from a reconnaissance method to a powerful scientific tool that can be applied to test specific hypotheses. The treatment of reflections at multiple offsets becomes tractable if the assumptions of high-frequency rays are valid for the problem being considered. Their validity can be tested by applying the methods of analysis to full wave synthetics.

Three studies illustrate the application of these principles to investigations of the nature of the crust in southern California. A survey shot by the COCORP consortium in 1977 across the San Andreas fault near Parkfield revealed events in the record sections whose arrival time decreased with offset. The reflectors generating these events are imaged using a multi-offset three-dimensional Kirchhoff migration. Migrations of full-wave acoustic synthetics with the same limitations in geometric coverage as the field survey demonstrate the utility of this back-projection process for imaging. The migrated depth sections show the locations of the major physical boundaries of the San Andreas fault zone. The zone is bounded on the southwest by a near-vertical fault juxtaposing a Tertiary sedimentary section against uplifted crystalline rocks of the fault zone block. On the northeast, the fault zone is bounded by a fault that dips into the San Andreas, intersecting it at 3 km depth, and that includes slices of serpentinized ultramafics. These interpretations can be made despite complications introduced by lateral heterogeneities.
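For orientation, the back projection at the heart of Kirchhoff migration can be sketched as a diffraction stack: each image point accumulates trace amplitude at the two-way source-image-receiver traveltime. The version below is a schematic 2-D, constant-velocity sketch, not the 3-D multi-offset implementation of the study; the geometry, sampling, and velocity are assumptions.

```python
import numpy as np

def kirchhoff_migrate(traces, src_x, rec_x, dt, v, xs, zs):
    """Schematic constant-velocity Kirchhoff (diffraction-stack) migration.
    traces: array (n_traces, n_samples); src_x, rec_x: shot/receiver x positions.
    For each image point (x, z), amplitudes are summed at the two-way
    traveltime from source to image point to receiver."""
    image = np.zeros((len(zs), len(xs)))
    n_samples = traces.shape[1]
    for itr in range(traces.shape[0]):
        for ix, x in enumerate(xs):
            for iz, z in enumerate(zs):
                d_src = np.hypot(x - src_x[itr], z)
                d_rec = np.hypot(x - rec_x[itr], z)
                t = (d_src + d_rec) / v
                it = int(round(t / dt))
                if it < n_samples:
                    image[iz, ix] += traces[itr, it]
    return image
```

In practice the field implementation adds traveltime tables for laterally varying velocity, obliquity and spreading weights, and anti-alias filtering, but the summation over source-receiver traveltimes is the same back-projection operation.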

In 1985 the Calcrust consortium designed a survey in the eastern Mojave Desert to image structures in both the shallow and the deep crust. Preliminary field experiments showed that the major geophysical acquisition problem to be solved was the poor penetration of seismic energy through a low-velocity surface layer; its effects could be mitigated through special acquisition and processing techniques. Data obtained from industry showed that good-quality records could be acquired in areas with a deeper, older sedimentary cover, prompting a redefinition of the geologic objectives. Long-offset stationary arrays were designed to provide reversed, wider-angle coverage of the deep crust over parts of the survey. The preliminary field tests, together with constant monitoring of data quality and parameter adjustment, allowed 108 km of excellent crustal data to be obtained.

This dataset, along with two others from the central and western Mojave, was used to constrain rock properties and the physical condition of the crust. The multi-offset analysis proceeded in two steps. First, an increase in reflection peak frequency with offset is indicative of a thinly layered reflector; the thickness and velocity contrast of the layering can be calculated from the spectral dispersion, discriminating between structures resulting from broad-scale and local effects. Second, the amplitude behavior at different offsets of P-P scattering from weak elastic heterogeneities indicates whether the signs of the changes in density, rigidity, and Lamé's parameter at the reflector agree or are opposed. The effects of reflection generation and propagation in a heterogeneous, anisotropic crust were contained by the design of the experiment and the simplicity of the observed amplitude and frequency trends. Multi-offset spectra and amplitude-trend stacks of the three Mojave Desert datasets suggest that the most reflective structures in the middle crust are strong Poisson's ratio (σ) contrasts, indicating porous zones or the juxtaposition of units of mutually distant origin. Heterogeneities in σ increase towards the top of a basal crustal zone at ~22 km depth. The transitions to the basal zone and to the mantle include increases in σ. The Moho itself includes ~400 m of layering with a velocity higher than that of the uppermost mantle. The Moho maintains the same configuration across the Mojave despite 5 km of crustal thinning near the Colorado River, indicating either that Miocene extension there thinned only the basal zone, or that the basal zone developed regionally after the extensional event.

Relevance:

30.00%

Publisher:

Abstract:

A Bayesian probabilistic methodology for on-line structural health monitoring, which addresses the issue of parameter uncertainty inherent in the problem, is presented. The method uses modal parameters for a limited number of modes, identified from measurements taken at a restricted number of degrees of freedom of a structure, as the measured structural data. The application presented uses a linear structural model whose stiffness matrix is parameterized to develop a class of possible models. Within the Bayesian framework, a joint probability density function (PDF) for the model stiffness parameters given the measured modal data is determined. Using this PDF, the marginal PDF of the stiffness parameter for each substructure given the data can be calculated.

Monitoring the health of a structure using these marginal PDFs involves two steps. First, the marginal PDF for each model parameter given modal data from the undamaged structure is found. The structure is then periodically monitored and updated marginal PDFs are determined. A measure of the difference between the calibrated and current marginal PDFs is used as a means to characterize the health of the structure. A procedure for interpreting the measure for use by an expert system in on-line monitoring is also introduced.

The probabilistic framework is developed in order to address the model-parameter uncertainty inherent in the health monitoring problem. To illustrate this issue, consider a very simplified deterministic structural health monitoring method. In such an approach, the model parameters that minimize an error measure between the measured and model modal values would be used as the "best" model of the structure. Changes between the model parameters identified using modal data from the undamaged structure and those identified from subsequent modal data would be used to find the existence, location, and degree of damage. Due to measurement noise, limited modal information, and model error, the "best" model parameters might vary from one modal dataset to the next without any damage being present in the structure. Thus, difficulties would arise in separating normal variations in the identified model parameters, which stem from limitations of the identification method, from variations due to true change in the structure. The Bayesian framework described in this work provides a means to handle this parametric uncertainty.
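Below is a minimal sketch of the simplified deterministic identification used above for illustration: stiffness scaling parameters of a hypothetical 3-storey shear-building model are chosen to minimize the misfit between "measured" and model natural frequencies. The model, the parameterization, and the optimizer are assumptions for illustration, not those of the thesis.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.optimize import minimize

# Hypothetical 3-storey shear-building model; theta scales the nominal
# storey stiffnesses and is the parameter vector to be identified.
M = np.eye(3)
k_nom = np.array([2.0e3, 2.0e3, 2.0e3])

def stiffness(theta):
    k1, k2, k3 = theta * k_nom
    return np.array([[k1 + k2, -k2, 0.0],
                     [-k2, k2 + k3, -k3],
                     [0.0, -k3, k3]])

def frequencies(theta):
    # Natural frequencies (rad/s) from the eigenproblem K phi = w^2 M phi.
    w2 = eigh(stiffness(theta), M, eigvals_only=True)
    return np.sqrt(w2)

# Synthetic "measured" frequencies from a structure with a softened 2nd storey.
f_meas = frequencies(np.array([1.0, 0.8, 1.0]))

def misfit(theta):
    return np.sum((frequencies(theta) - f_meas) ** 2)

best = minimize(misfit, x0=np.ones(3), method="Nelder-Mead")
print("identified stiffness scalings:", np.round(best.x, 3))
```

With noisy data and incomplete modal information, repeated fits of this kind scatter even without damage, which is exactly the ambiguity the Bayesian marginal PDFs are introduced to quantify.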

The probabilistic health monitoring method is applied to simulated data and laboratory data. The results of these tests are presented.

Relevance:

30.00%

Publisher:

Abstract:

Papaseit et al. (Proc. Natl. Acad. Sci. U.S.A. 97, 8364, 2000) showed the decisive role of gravity in the formation of patterns by assemblies of microtubules (MTs) in vitro. By means of a functional scaling, the free energy for MT systems in a gravitational field was constructed. The influence of the gravitational field on the MT self-organization process, which can lead to the isotropic-to-nematic phase transition, is the focus of this paper. A coupling of the concentration gradient with the orientational order characteristic of nematic pattern formation is the new feature that emerges in the presence of gravity. The concentration range corresponding to the phase-coexistence region increases with increasing g or MT concentration. Gravity facilitates the isotropic-to-nematic phase transition, leading to a significantly broader transition region. The phase transition represents the interplay between growth in the isotropic phase and precipitation into the nematic phase. We also present and discuss numerical results for the change of local MT concentration with the height of the vessel, the order parameter, and the phase-transition properties.

Relevance:

30.00%

Publisher:

Abstract:

Grinding is an advanced machining process for manufacturing valuable, complex, and accurate parts for high-added-value sectors such as aerospace and wind power generation. Due to the extremely severe conditions inside grinding machines, critical process variables such as part surface finish or grinding wheel wear cannot be easily and cheaply measured on-line. In this paper a virtual sensor for on-line monitoring of those variables is presented. The sensor is based on the modelling ability of Artificial Neural Networks (ANNs) for stochastic and non-linear processes such as grinding; the selected architecture is the Layer-Recurrent neural network. The sensor makes use of the relation between the variables to be measured and the power consumption in the wheel spindle, which can be easily measured. A sensor calibration methodology is presented, and the levels of error that can be expected are discussed. The new sensor is validated by comparing its results with actual measurements carried out on an industrial grinding machine. Results show excellent estimation performance for both wheel wear and surface roughness. In the case of wheel wear, the absolute error is in the range of microns (average value 32 µm). In the case of surface finish, the absolute error in Ra is well below 1 µm (average value 0.32 µm). The present approach can be easily generalized to other grinding operations.
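A minimal sketch of this kind of recurrent virtual sensor is shown below, assuming PyTorch, a simple Elman-style recurrent layer standing in for the Layer-Recurrent architecture, and a synthetic mapping from the spindle-power signal to surface roughness; the network size, inputs, and training data are illustrative, not the calibrated model of the paper.

```python
import torch
from torch import nn

class VirtualSensor(nn.Module):
    """Elman-style recurrent regressor: spindle-power sequence -> Ra estimate."""
    def __init__(self, n_features=1, n_hidden=16):
        super().__init__()
        self.rnn = nn.RNN(n_features, n_hidden, batch_first=True)
        self.head = nn.Linear(n_hidden, 1)

    def forward(self, x):                 # x: (batch, time, features)
        out, _ = self.rnn(x)
        return self.head(out[:, -1, :])   # predict Ra from the last hidden state

# Synthetic training data: roughness loosely tied to mean spindle power.
torch.manual_seed(0)
power = torch.rand(256, 50, 1)                      # 256 grinding cycles
ra = 0.2 + 0.5 * power.mean(dim=1) + 0.02 * torch.randn(256, 1)

model = VirtualSensor()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()
for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(power), ra)
    loss.backward()
    opt.step()
print(f"final training MSE: {loss.item():.4f}")
```

In a real calibration, the inputs would be measured spindle-power records paired with off-line roughness and wear measurements, and a held-out set would be used to estimate the error levels reported above.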

Relevance:

30.00%

Publisher:

Abstract:

The measurement of high-speed laser beam parameters during processing is a topic that has seen growing attention over the last few years, as quality assurance places greater demands on the monitoring of the manufacturing process. The targets for any monitoring system are to be non-intrusive, low-cost, simple to operate, high-speed, and capable of operating in process. A new ISO-compliant system is presented, based on the integration of an imaging plate and camera located behind a proprietary mirror sampling device. The general layout of the device is presented, along with the thermal and optical performance of the sampling optic. The diagnostic performance of the system is compared with that of industry-standard devices, demonstrating the high-quality, high-speed data generated using this system.

Relevance:

30.00%

Publisher:

Abstract:

A heat exchanger network can be defined as a group of interconnected heat exchangers whose purpose is to reduce the energy requirement of a system, and such networks are widely used in the process industries. However, a network is subject to fouling, which causes a decrease in the thermal effectiveness of the exchangers. This phenomenon is caused by the accumulation of undesirable material on the heat transfer surfaces. To compensate for the reduction in thermal effectiveness caused by fouling, an increase in utility consumption becomes necessary. This raises operating costs as well as maintenance costs. The costs associated with fouling are estimated to reach billions of dollars annually. In view of this problem, many research studies have investigated methods to prevent fouling and/or to manage the operation of a network. These studies range from the optimization of individual heat exchangers, through the simulation and monitoring of networks, to the optimization of the cleaning schedule of the heat exchangers in a network. The present work proposes a model for the simulation of heat exchanger networks with applications to fouling management. As a result, an integrated set of computer codes was developed, covering the steady-state simulation of networks, the pseudo-steady-state simulation of network behaviour as fouling evolves, parameter estimation for diagnosing the fouling problem, and the operational optimization of this type of system. In the steady-state simulator, the network model was formulated in matrix form, and the mass and energy balances are solved as systems of linear equations. From the optimization standpoint, the proposed procedure redistributes the flow rates to make better thermal use of the exchangers in the network, for example by seeking the network flow rates that maximize the furnace inlet temperature in atmospheric crude oil distillation units. The algorithms were applied to examples from the literature and to a problem from a real refinery. The results were promising, suggesting that the approach proposed in this work may be an attractive option for operations involving heat exchanger networks.
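A minimal sketch of the matrix formulation mentioned above is given below: for a short exchanger train with assumed effectiveness values, the steady-state energy balances are linear in the unknown outlet temperatures and are solved in a single call. The stream data, effectiveness values, and two-exchanger topology are illustrative assumptions, not the refinery case of the thesis.

```python
import numpy as np

# Two exchangers in series on a cold stream (e.g., a crude preheat train segment):
# E1 matches hot stream H1, E2 matches hot stream H2. Effectiveness values are
# taken as known here (in practice they follow from UA and fouling resistances).
Tc_in, Th1_in, Th2_in = 30.0, 150.0, 200.0          # inlet temperatures, degC
Cc, Ch1, Ch2 = 4.0, 3.0, 5.0                        # heat-capacity flow rates, kW/K
eps1, eps2 = 0.6, 0.5

a1 = eps1 * min(Cc, Ch1) / Ch1    # hot-side temperature-drop fraction, E1
b1 = eps1 * min(Cc, Ch1) / Cc     # cold-side temperature-rise fraction, E1
a2 = eps2 * min(Cc, Ch2) / Ch2
b2 = eps2 * min(Cc, Ch2) / Cc

# Unknowns x = [Th1_out, Tc1, Th2_out, Tc2]; energy balances written as A x = b.
A = np.array([
    [1.0, 0.0,       0.0, 0.0],
    [0.0, 1.0,       0.0, 0.0],
    [0.0, -a2,       1.0, 0.0],
    [0.0, -(1 - b2), 0.0, 1.0],
])
b = np.array([
    (1 - a1) * Th1_in + a1 * Tc_in,
    (1 - b1) * Tc_in + b1 * Th1_in,
    (1 - a2) * Th2_in,
    b2 * Th2_in,
])
Th1_out, Tc1, Th2_out, Tc2 = np.linalg.solve(A, b)
print(f"cold stream: {Tc_in} -> {Tc1:.1f} -> {Tc2:.1f} degC")
```

Updating the effectiveness values as fouling resistances grow and re-solving the same linear system is the essence of a pseudo-steady-state simulation of the network's thermal degradation.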

Relevance:

30.00%

Publisher:

Abstract:

Production processes need to be evaluated continuously so that they operate as effectively and efficiently as possible. A set of tools used for this purpose is known as statistical process control (SPC). Using SPC tools, monitoring can be carried out periodically. The most important SPC tool is the control chart. This thesis focuses on monitoring a response variable through the parameters, or coefficients, of a simple linear regression model. Adaptive χ² control charts are proposed for monitoring the coefficients of the simple linear regression model. More specifically, seven adaptive χ² control charts for monitoring linear profiles are developed: a chart with variable sample size; with variable sampling interval; with variable control and warning limits; with variable sample size and sampling interval; with variable sample size and limits; with variable sampling interval and limits; and, finally, with all design parameters variable. Performance measures of the proposed charts were obtained through Markov chain properties, for both the zero-state and the steady-state situations, showing a reduction in the average time to signal for small to moderate shifts in the regression coefficients of the production process. The proposed charts were applied to an example from a semiconductor manufacturing process. In addition, a sensitivity analysis was carried out as a function of shifts of different magnitudes in the process parameters, namely the intercept and the slope, comparing the performance of the charts developed with one another and also with the χ² chart with fixed parameters. The charts proposed in this thesis are suitable for many types of application. This work also considered quality characteristics that are represented by a nonlinear regression model. For the nonlinear regression model considered, the proposal is to use a method that divides the nonlinear profile into linear segments; more specifically, an algorithm proposed in the literature for this purpose was used. In this way, it was possible to validate the proposed technique, showing that it is robust in the sense that it accommodates different types of nonlinear profiles. A nonlinear profile is thus approximated by piecewise linear profiles, which allows each linear segment to be monitored with control charts such as those developed in this thesis. Furthermore, the methodology for decomposing a nonlinear profile into linear segments is presented in complete detail, opening the way for its wide use.
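Below is a minimal sketch of the fixed-parameter χ² statistic that the adaptive charts extend: each sampled profile is fitted by least squares and the fitted coefficients are compared with the in-control line through the covariance of the estimators. The in-control parameters, design points, and noise level are assumptions for illustration.

```python
import numpy as np
from scipy.stats import chi2

# In-control simple linear profile y = B0 + B1*x + e, with e ~ N(0, sigma^2).
B0, B1, sigma = 3.0, 2.0, 1.0
x = np.array([2.0, 4.0, 6.0, 8.0])            # fixed design points per sample

# Covariance of the least-squares estimators (b0, b1) for this design.
n = len(x)
Sxx = np.sum((x - x.mean()) ** 2)
cov = sigma ** 2 * np.array([[1 / n + x.mean() ** 2 / Sxx, -x.mean() / Sxx],
                             [-x.mean() / Sxx,              1 / Sxx]])
cov_inv = np.linalg.inv(cov)
ucl = chi2.ppf(0.995, df=2)                   # fixed-parameter control limit

def chi2_statistic(y):
    """Fit the sampled profile and measure its distance from the in-control line."""
    b1, b0 = np.polyfit(x, y, 1)
    d = np.array([b0 - B0, b1 - B1])
    return d @ cov_inv @ d

rng = np.random.default_rng(2)
y_ok = B0 + B1 * x + sigma * rng.standard_normal(n)
y_shift = (B0 + 1.5) + B1 * x + sigma * rng.standard_normal(n)   # intercept shift
for label, y in [("in-control", y_ok), ("shifted", y_shift)]:
    print(label, round(chi2_statistic(y), 2), "UCL =", round(ucl, 2))
```

The adaptive versions developed in the thesis keep this statistic but let the sample size, sampling interval, and limits switch between relaxed and tightened values according to where the previous point fell.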

Relevance:

30.00%

Publisher:

Abstract:

Since 2004, CONAMA (Conselho Nacional de Meio Ambiente, the Brazilian National Environmental Council), through its Resolution No. 344, has required that physical, chemical, and biological analyses of environmental matrices be carried out by environmental laboratories whose technical competence has been formally recognized through accreditation granted by Inmetro. Accordingly, some Brazilian states have been adopting the same criterion for registering laboratories in their databases. As a result, the number of accreditations has grown: in 2002 there were 12 accredited laboratories, and in 2012, 198 accreditations were granted to environmental laboratories. Adopting ABNT NBR ISO/IEC 17025 as the working standard, besides complying with current legislation, has the following advantages: customer satisfaction, credibility and continuous improvement of the laboratory, improved professional training, and access to a broader market. Seeking to adapt to this reality, and despite all the difficulties inherent in implementing the requirements of ABNT NBR ISO/IEC 17025 in university and research laboratories, the Sanitary Engineering Laboratory (LES/DESMA) prioritized bringing the determination of chemical oxygen demand (COD) into conformity with the technical requirements of ABNT NBR ISO/IEC 17025:2005, because COD is a global indicator of organic matter in wastewater and surface water, is widely used in the monitoring of effluent treatment plants, and can be determined by two distinct analytical techniques: spectrophotometry and colorimetry. Given this scenario, the objective of this work was to evaluate the performance of methods 5220 B and 5220 D described in the Standard Methods, using analytical method validation parameters. Both methods proved fit for their intended use, and the limit of quantification obtained was compatible with that achieved by accredited laboratories. Measurement uncertainties were calculated in order to quantify the quality of the results.
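As an illustration of the kind of validation-parameter calculation involved, the sketch below fits a hypothetical COD calibration curve and derives detection and quantification limits using one common (ICH-style) convention based on the residual standard deviation of the calibration; the calibration data and the 3.3s/10s factors are assumptions, not the procedure reported in the work.

```python
import numpy as np

# Hypothetical COD calibration: standard concentrations (mg O2/L) vs. absorbance.
conc = np.array([0, 100, 200, 400, 600, 800], dtype=float)
absorbance = np.array([0.002, 0.041, 0.079, 0.158, 0.240, 0.318])

# Linearity: least-squares calibration line and coefficient of determination.
slope, intercept = np.polyfit(conc, absorbance, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((absorbance - pred) ** 2) / np.sum((absorbance - absorbance.mean()) ** 2)

# One common convention (ICH-style) for detection/quantification limits,
# based on the residual standard deviation of the calibration curve.
s_res = np.sqrt(np.sum((absorbance - pred) ** 2) / (len(conc) - 2))
lod = 3.3 * s_res / slope
loq = 10 * s_res / slope
print(f"r2 = {r2:.4f}, LOD = {lod:.1f} mg/L, LOQ = {loq:.1f} mg/L")
```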