947 results for Spinning--Quality control


Relevância:

80.00%

Publicador:

Resumo:

This thesis studies how to estimate the distribution of regionalized variables whose sample space and scale admit a Euclidean space structure. We apply the principle of working in coordinates: choose an orthonormal basis, do statistics on the coordinates of the data, and apply the outputs to the basis in order to recover a result in the original space. Applied to regionalized variables, this yields a single, consistent approach that generalizes the well-known properties of kriging to several sample spaces: real, positive or compositional data (vectors of positive components with constant sum) are treated as particular cases. Linear geostatistics is thus generalized, and solutions are offered to well-known problems of non-linear geostatistics, by adapting the measure and the criteria of representativeness (i.e., means) to the data at hand. The estimator for positive data coincides with a weighted geometric mean, equivalent to estimating the median, without any of the problems of classical lognormal kriging. The compositional case offers equivalent solutions and, in addition, allows the estimation of multinomial probability vectors. With a preliminary Bayesian step, kriging of compositions also becomes a consistent alternative to indicator kriging, the technique used to estimate probability functions of arbitrary variables, which often yields negative estimates; the proposed alternative avoids this. The usefulness of these techniques is assessed by studying ammonia contamination at an automatic water quality control station in the Tordera basin, and it is concluded that only with the proposed techniques can one detect the instants at which ammonium turns into ammonia at concentrations above the legal limit.
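As a hedged illustration of the working-in-coordinates idea for positive data (a sketch, not code from the thesis), the example below performs ordinary kriging on the logarithms of the observations and maps the estimate back with the exponential, which is exactly the weighted geometric mean mentioned above; the variogram model, coordinates and values are hypothetical.

import numpy as np

def exp_variogram(h, sill=1.0, range_m=300.0):
    # Hypothetical exponential variogram model.
    return sill * (1.0 - np.exp(-h / range_m))

def ordinary_kriging_weights(coords, target, variogram):
    # Solve the ordinary kriging system for a single target location.
    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[:n, n] = 1.0          # unbiasedness constraint
    A[n, :n] = 1.0
    b = np.append(variogram(np.linalg.norm(coords - target, axis=1)), 1.0)
    return np.linalg.solve(A, b)[:n]

# Positive-valued observations (e.g. concentrations) at 2D sample locations.
coords = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 150.0], [80.0, 90.0]])
values = np.array([2.1, 3.4, 1.7, 2.8])

w = ordinary_kriging_weights(coords, np.array([50.0, 50.0]), exp_variogram)

# Working in coordinates: krige log(values), then map back with exp,
# i.e. a weighted geometric mean of the observations.
estimate = np.exp(w @ np.log(values))
print(estimate)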

Relevância:

80.00%

Publicador:

Resumo:

In recent decades, the increase in the levels of ultraviolet solar radiation (UVR) reaching the Earth (mainly due to the decrease in stratospheric ozone), together with the detected rise in diseases related to UVR exposure, has led to a large body of research on solar radiation in this band and on its effects on humans. The ultraviolet index (UVI), adopted internationally, was defined with the purpose of informing the general public about the risks of exposing bare skin to UVR and of conveying preventive messages. The UVI was initially defined as the daily maximum value; however, its use has since broadened, and it now makes sense to refer to an instantaneous value or to the daily evolution of the measured, modelled or forecast UVI. The actual UVI value is affected by Sun-Earth geometry, clouds, ozone, aerosols, altitude and surface albedo. High-quality UVI measurements are essential as a reference and for studying long-term trends; accurate modelling techniques are also needed in order to understand the factors that affect UVR, to forecast the UVI, and as quality control of the measurements. The most accurate UVI measurements are expected from spectroradiometers. However, since these devices are costly, UVI data more commonly come from erythemal radiometers (indeed, most UVI networks are equipped with this type of sensor). The best modelling results are obtained with multiple-scattering radiative transfer models when the input information is well known. Nevertheless, inputs such as the aerosol optical properties are usually unknown, which may introduce large uncertainties into the modelling. Simpler models are often used for applications such as UVI forecasting or UVI mapping, since they are faster and need fewer input parameters. Within this framework, the general objective of this study is to analyse the level of agreement that can be reached between measured and modelled UVI under cloudless conditions. Accordingly, this study presents model-measurement comparisons for different modelling techniques, different input options and UVI measurements from both erythemal radiometers and spectroradiometers. As a general conclusion, model-measurement comparison proves very useful for detecting limitations and estimating uncertainties in both the modelling and the measurements. Regarding the modelling, the main limitation found is the lack of knowledge of the aerosol information used as model input. Significant differences were also found between ozone measured from satellites and from the ground, which can lead to important differences in the modelled UVI. PTUV, a new and simple parameterisation for the fast calculation of UVI under cloudless conditions, has been developed on the basis of radiative transfer calculations. The parameterisation performs well both against the base model and in comparison with several UVI measurements. PTUV has proved useful for specific applications such as studying the annual evolution of the UVI at a particular site (Girona) and composing high-resolution maps of typical UVI values for a given territory (Catalonia).
Regarding the measurements, knowledge of the spectral response of erythemal radiometers proves very important in order to avoid large uncertainties in the measured UVI. When well characterised, these instruments compare well with high-quality spectroradiometers for UVI measurement. The most important issues concerning the measurements are calibration and long-term stability. A temperature effect was also observed in PTFE, a material used in the diffusers of some instruments, which could potentially have important implications in the experimental field. Finally, concerning the model-measurement comparisons, the best agreement was found when UVI measurements from high-quality spectroradiometers are used together with radiative transfer models that take the best available data on ozone and aerosol optical parameters, and their changes in time, into account. In that case the agreement can be as close as 0.1% in UVI, and is typically within 3%. The agreement deteriorates markedly if the aerosol information is ignored, and it depends strongly on the aerosol single-scattering albedo. Other model inputs, such as surface albedo and the ozone and temperature profiles, introduce smaller uncertainties into the modelling results.
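As a hedged aside (the standard definition, not part of the thesis), the UV index is obtained by weighting the spectral irradiance with the CIE erythemal action spectrum and scaling by 40 m2/W; the sketch below computes it for a made-up cloud-free spectrum.

import numpy as np

def erythemal_weight(wavelength_nm):
    # CIE (McKinlay-Diffey) erythemal action spectrum.
    w = np.asarray(wavelength_nm, dtype=float)
    s = np.ones_like(w)
    s = np.where((w > 298) & (w <= 328), 10.0 ** (0.094 * (298.0 - w)), s)
    s = np.where((w > 328) & (w <= 400), 10.0 ** (0.015 * (140.0 - w)), s)
    s = np.where(w > 400, 0.0, s)
    return s

def uv_index(wavelength_nm, spectral_irradiance_w_m2_nm):
    # UVI = 40 m2/W times the integral of the erythemally weighted irradiance.
    w = np.asarray(wavelength_nm, dtype=float)
    e = np.asarray(spectral_irradiance_w_m2_nm) * erythemal_weight(w)
    integral = np.sum(0.5 * (e[1:] + e[:-1]) * np.diff(w))   # trapezoidal rule
    return 40.0 * integral

# Hypothetical cloud-free spectrum between 290 and 400 nm (invented numbers).
wl = np.arange(290.0, 400.5, 0.5)
irr = 1.2e-3 * np.exp(-((wl - 340.0) / 60.0) ** 2)
print(round(uv_index(wl, irr), 2))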

Relevância:

80.00%

Publicador:

Resumo:

The human visual ability to perceive depth looks like a puzzle. We perceive three-dimensional spatial information quickly and efficiently by using the binocular stereopsis of our eyes and, what is more important, the knowledge of the most common objects that we acquire through everyday life. Nowadays, modelling the behaviour of our brain is a fiction, which is why the huge problem of 3D perception and, further, interpretation is split into a sequence of easier problems. A great deal of research in robot vision is devoted to obtaining 3D information about the surrounding scene. Most of this research is based on modelling human stereopsis by using two cameras as if they were two eyes. This method is known as stereo vision; it has been widely studied in the past, is being studied at present, and will surely receive much attention in the future, which allows us to affirm that it is one of the most interesting topics in computer vision. The stereo vision principle is based on obtaining the three-dimensional position of an object point from the positions of its projections on both camera image planes. However, before inferring 3D information, the mathematical models of both cameras have to be known. This step is known as camera calibration and is described in detail in the thesis. Perhaps the most important problem in stereo vision is the determination of the pair of homologous points in the two images, known as the correspondence problem; it is also one of the most difficult problems to solve and is currently being investigated by many researchers. Epipolar geometry allows us to reduce the correspondence problem, and an approach to it is described in the thesis. Nevertheless, it does not solve the problem completely, since many other issues have to be taken into account; for example, points may have no correspondence because of surface occlusion or simply because they project outside the field of view of one camera. The interest of the thesis is focused on structured light, which is considered one of the techniques most frequently used to reduce the problems related to stereo vision. Structured light is based on the relationship between a projected light pattern and its image on a sensor: the deformation between the pattern projected onto the scene and the one captured by the camera makes it possible to obtain three-dimensional information about the illuminated scene. This technique has been widely used in applications such as 3D object reconstruction, robot navigation and quality control. Although the projection of regular patterns solves the problem of points without a match, it does not solve the problem of multiple matching, which forces the use of computationally expensive algorithms to search for the correct matches. In recent years, another structured light technique has grown in importance. It is based on codifying the light projected onto the scene so that it can be used to obtain a unique match: each token of light imaged by the camera carries a label that has to be read (the pattern decoded) in order to solve the correspondence problem. The advantages and disadvantages of stereo vision versus structured light, together with a survey of coded structured light, are presented and discussed. The work carried out in the frame of this thesis has led to a new coded structured light pattern which solves the correspondence problem uniquely and robustly.
Uniquely, because each token of light is coded by a different word, which removes the problem of multiple matching. Robustly, because the pattern is coded using the position of each token of light with respect to both coordinate axes. Algorithms and experimental results are included in the thesis. The reader will find examples of 3D measurement of static objects and of the more complicated measurement of moving objects; the technique can be used in both cases, since the pattern is coded in a single projection shot, so it can be applied in several robot vision applications. Our interest is focused on the mathematical study of the camera and pattern projector models, on how these models can be obtained by calibration, and on how they can be used to obtain three-dimensional information from two corresponding points. Furthermore, we have studied structured light and coded structured light, and we have presented a new coded structured light pattern. However, in this thesis we start from the assumption that the correspondence points can be well segmented from the captured image. Computer vision constitutes a huge problem, and a lot of work is being done at all levels of human vision modelling, starting from (a) image acquisition; (b) image enhancement, filtering and processing; and (c) image segmentation, which involves thresholding, thinning, contour detection, texture and colour analysis, and so on. The interest of this thesis starts at the next step, usually known as depth perception or 3D measurement.
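As a hedged illustration of the triangulation step that stereo vision and structured light share (a sketch, not code from the thesis), the example below recovers a 3D point from two corresponding image points and two known 3x4 projection matrices using the standard linear (DLT) method; the matrices and points are hypothetical.

import numpy as np

def triangulate(P1, P2, x1, x2):
    # Linear (DLT) triangulation of one point from two calibrated views.
    # P1, P2: 3x4 projection matrices; x1, x2: corresponding (u, v) points.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Hypothetical calibrated rig: identical intrinsics, second camera shifted 0.2 m in x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])

point = np.array([0.1, 0.05, 2.0])                    # ground-truth 3D point
x1 = P1 @ np.append(point, 1.0); x1 = x1[:2] / x1[2]  # its projections
x2 = P2 @ np.append(point, 1.0); x2 = x2[:2] / x2[2]
print(triangulate(P1, P2, x1, x2))                    # ~ [0.1, 0.05, 2.0]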

Relevância:

80.00%

Publicador:

Resumo:

In recent years we have seen the concept of quality arrive in our country with great force. In the business world, possibly because of the influence of large companies' involvement in this movement, or perhaps simply because of growing competition from foreign firms, the small and medium-sized entrepreneurs of our country have been forced to open their doors to what could be called the "Culture of Quality". The public administration has clearly also played a part. However, this pro-quality message has often been "disguised" as the well-known ISO 9000 standard. Little by little, people have formed the distorted idea that these initials mean quality and that anything outside them is synonymous with poor quality. So, is involvement in complying with these standards really worthwhile? Is that really why most of the small and medium-sized enterprises around us have become involved in quality assurance according to ISO 9000? And furthermore, once they have obtained the quality certificate, are they really satisfied with the improvements achieved, or is the only significant improvement the company's external image? The search for a scientifically grounded answer motivates this work, which fills the gap that currently exists between theoretical studies and reality. This thesis aims to establish the impact that involvement in the ISO 9000 quality assurance standard has had on Catalan companies. The research method is based on the analysis of data collected in an empirical study of small and medium-sized enterprises in the comarques of Catalonia. Chapter one defines the precise objective of this research. Chapter two gives an overview of quality through history and of the people who made it as important as it is now. The ISO 9000 quality assurance standard, the framework of this thesis, deserves a chapter of its own, chapter three. Starting from the hypotheses to be tested and from the literature on quality management, chapter four defines how the work is carried out, thereby specifying the empirical study performed. Chapter five presents the results obtained from the empirical study, together with a descriptive analysis of them. The hypotheses are resolved in chapter six by applying the statistical technique known as cluster analysis; this technique makes it possible to see for which groups of companies the hypotheses hold and for which they do not, allowing a more detailed study of the situation. Finally, chapter seven contains the conclusions of this work and discusses possible future lines of research in this field.
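As a hedged illustration of the kind of cluster analysis used to contrast the hypotheses (not the thesis' own analysis), the sketch below groups hypothetical survey responses from certified firms with k-means and then compares a satisfaction score across the resulting clusters; the variables and data are invented.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical survey: one row per certified firm, columns are
# [firm size (employees), years certified, perceived internal improvement 1-5].
X = np.column_stack([
    rng.integers(5, 250, size=60),
    rng.integers(1, 10, size=60),
    rng.integers(1, 6, size=60),
]).astype(float)

# Standardise so no variable dominates the Euclidean distance.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)

# Compare the improvement score per cluster, as one would when checking
# whether a hypothesis holds for some groups of firms but not others.
for k in range(3):
    print(k, round(X[labels == k, 2].mean(), 2))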

Relevância:

80.00%

Publicador:

Resumo:

Nowadays, medical devices are increasingly one of the foundations on which quality of life in healthcare rests, and it is therefore important to ensure that they are fit for their purpose and safe. Having no pharmacological, metabolic or immunological action (the mechanism of action attributed to medicines), they assist people directly or indirectly in the treatment or prevention of diseases and health conditions, acting by physical or mechanical means. They can be important in the diagnosis, prevention, monitoring, treatment or alleviation of a disease or injury; in the investigation, replacement or modification of a physiological process; and in the control of conception. Given such a relevant role in patients' health, direct supervision by health professionals, notably pharmacists, is essential. Today, because more and more patients want to diagnose and manage their own medical conditions, the pharmacist's role is increasingly important, since pharmacists are directly involved in the supervision and dispensing of these devices. In addition, pharmacists are involved in the acquisition, selection and supply of the many medical devices they consider most suitable for each situation, whether for use on the patient's own initiative or by other health professionals.

Relevância:

80.00%

Publicador:

Resumo:

Homeopathy is governed by principles that are the opposite of those of conventional medicine. It centres on the patient and rests on principles such as cure by similars ("similia similibus curentur"), the use of small doses of active substance, and the assessment of the patient as a whole. The introduction of this doctrine in Portugal brought a new kind of approach to treating a patient. Created by Samuel Hahnemann in 1796 and introduced in Portugal by Manuel da Silva Passos in the mid-1830s, homeopathy stands as one of the main complementary and alternative medicines (CAM). Relying on rigorous formulation of the medicinal product, on the quality of the raw materials, and on dilution and quality control processes described in official pharmacopoeias, and being available in several pharmaceutical forms, the homeopathic medicinal product holds a reasonable market share, which requires appropriate legislation. The transposition of Directive 2001/83/EC into Decree-Law [DL] No. 176/2006 of 30 August allowed Portugal to join a safer, more predictable market with freer circulation of homeopathic products. Tools such as the centralised and decentralised administrative procedures involving the Member States of the European Commission [EC] and the introduction of homeopathic medicinal products via simplified registration [RS] allowed Portugal to draw closer to countries such as Germany and France, the most significant markets at European level.

Relevância:

80.00%

Publicador:

Resumo:

There has been a paucity of information on trends in daily climate and climate extremes, especially from developing countries. We report the results of the analysis of daily temperature (maximum and minimum) and precipitation data from 14 south and west African countries over the period 1961-2000. Data were subject to quality control and processing into indices of climate extremes for release to the global community. Temperature extremes show patterns consistent with warming over most of the regions analyzed, with a large proportion of stations showing statistically significant trends for all temperature indices. Over 1961 to 2000, the regionally averaged occurrence of extreme cold (fifth percentile) days and nights has decreased by -3.7 and -6.0 days/decade, respectively. Over the same period, the occurrence of extreme hot (95th percentile) days and nights has increased by 8.2 and 8.6 days/decade, respectively. The average duration of warm (cold) spells has increased (decreased) by 2.4 (0.5) days/decade. Overall, it appears that the hot tails of the distributions of daily maximum temperature have changed more than the cold tails; for minimum temperatures, hot tails show greater changes in the northwest of the region, while cold tails have changed more in the southeast and east. The diurnal temperature range (DTR) does not exhibit a consistent trend across the region, with many neighboring stations showing opposite trends. However, the DTR shows consistent increases in a zone across Namibia, Botswana, Zambia, and Mozambique, coinciding with more rapid increases in maximum temperature than in minimum temperature extremes. Most precipitation indices do not exhibit consistent or statistically significant trends across the region. Regionally averaged total precipitation has decreased, but the trend is not statistically significant. At the same time, there has been a statistically significant increase in regionally averaged daily rainfall intensity and dry spell duration. While the majority of stations also show increasing trends for these two indices, only a few of these are statistically significant. There are increasing trends in regionally averaged rainfall on extreme precipitation days and in maximum annual 5-day and 1-day rainfall, but only the trends for the latter are statistically significant.
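As a hedged illustration of how percentile-based extreme indices of this kind are typically computed (not the study's own processing chain), the sketch below counts hot days above the 95th percentile of a base period for each year of a hypothetical daily maximum temperature series and fits a linear trend per decade.

import numpy as np

rng = np.random.default_rng(1)

years = np.arange(1961, 2001)
days_per_year = 365

# Hypothetical daily maximum temperatures with a small warming trend.
tmax = 25.0 + 5.0 * rng.standard_normal((years.size, days_per_year))
tmax += 0.02 * (years - years[0])[:, None]

# Threshold from a base period (here 1961-1990), as in percentile indices.
base = tmax[years <= 1990].ravel()
p95 = np.percentile(base, 95)

# Annual count of "hot days" exceeding the 95th percentile.
hot_days = (tmax > p95).sum(axis=1)

# Linear trend, expressed per decade.
slope_per_year = np.polyfit(years, hot_days, 1)[0]
print(round(10 * slope_per_year, 2), "days/decade")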

Relevância:

80.00%

Publicador:

Resumo:

Remote sensing from space-borne platforms is often seen as an appealing method of monitoring components of the hydrological cycle, including river discharge, because of its spatial coverage. However, data from these platforms are often less than ideal, because the geophysical properties of interest are rarely measured directly and the measurements that are taken can be subject to significant errors. This study assimilated water levels derived from a TerraSAR-X synthetic aperture radar image and digital aerial photography with simulations from a two-dimensional hydraulic model to estimate discharge, inundation extent, depths and velocities at the confluence of the rivers Severn and Avon, UK. An ensemble Kalman filter (EnKF) was used to assimilate water level spot heights derived by intersecting shorelines from the imagery with a digital elevation model. Discharge was estimated from the ensemble of simulations using state augmentation and then compared with gauge data. Assimilating the real data reduced the error between analyzed mean water levels and levels from three gauging stations to less than 0.3 m, which is less than typically found in post-event water mark data from the field at these scales. Measurement bias was evident, but the method still provided a means of improving estimates of discharge for high flows where gauge data are unavailable or of poor quality. Posterior estimates of discharge had standard deviations between 63.3 m3 s-1 and 52.7 m3 s-1, which were below 15% of the gauged flows along the reach. Therefore, assuming a roughness uncertainty of 0.03-0.05 and no model structural errors, discharge could be estimated by the EnKF with an accuracy similar to that arguably expected from gauging stations during flood events. Quality control prior to assimilation, in which measurements were rejected for lying in areas of high topographic slope or close to tall vegetation and trees, was found to be essential. The study demonstrates the potential, but also the significant limitations, of currently available imagery for reducing discharge uncertainty in un-gauged or poorly gauged basins when combined with model simulations in a data assimilation framework.
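As a hedged, greatly simplified sketch of assimilating water levels with an ensemble Kalman filter and state augmentation (not the study's actual system), the code below updates an augmented state of water levels plus discharge from synthetic shoreline-derived level observations; all numbers and the toy level-discharge relation are invented.

import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_err_std, rng):
    # Stochastic EnKF analysis step for an (n_members, n_state) ensemble.
    n = ensemble.shape[0]
    hx = np.array([obs_operator(m) for m in ensemble])        # predicted observations
    Xp = ensemble - ensemble.mean(axis=0)
    Yp = hx - hx.mean(axis=0)
    P_xy = Xp.T @ Yp / (n - 1)
    P_yy = Yp.T @ Yp / (n - 1) + (obs_err_std ** 2) * np.eye(obs.size)
    K = P_xy @ np.linalg.inv(P_yy)                            # Kalman gain
    perturbed_obs = obs + obs_err_std * rng.standard_normal((n, obs.size))
    return ensemble + (perturbed_obs - hx) @ K.T

rng = np.random.default_rng(2)

# Augmented state per member: water level at three wetted cells (m) plus discharge (m3/s).
discharge = 400.0 + 80.0 * rng.standard_normal(50)
levels = 8.0 + 0.005 * discharge[:, None] + 0.1 * rng.standard_normal((50, 3))
ensemble = np.column_stack([levels, discharge])

# The SAR-derived levels observe only the level part of the state;
# discharge is corrected through its ensemble covariance with the levels.
H = lambda x: x[:3]
obs = np.array([10.4, 10.3, 10.5])        # hypothetical shoreline-derived water levels

analysis = enkf_update(ensemble, obs, H, obs_err_std=0.2, rng=rng)
print(round(analysis[:, 3].mean(), 1), round(analysis[:, 3].std(), 1))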

Relevância:

80.00%

Publicador:

Resumo:

Standardisation of microsatellite allele profiles between laboratories is of fundamental importance to the transferability of genetic fingerprint data and the identification of clonal individuals held at multiple sites. Here we describe two methods of standardisation applied to the microsatellite fingerprinting of 429 Theobroma cacao L. trees representing 345 accessions held in the world's largest Cocoa Intermediate Quarantine facility: the use of a partial allelic ladder through the production of 46 cloned and sequenced allelic standards (AJ748464 to AJ748509), and the use of standard genotypes selected to display a diverse allelic range. Until now, a lack of accurate and transferable identification information has impeded efforts to genetically improve the cocoa crop. To address this need, a global initiative to fingerprint all international cocoa germplasm collections using a common set of 15 microsatellite markers is in progress. Data reported here have been deposited with the International Cocoa Germplasm Database and form the basis of a searchable resource for clonal identification. To our knowledge, this is the first quarantine facility to be completely genotyped using microsatellite markers for the purposes of quality control and clonal identification. Implications of the results for the retrospective tracking of labelling errors are briefly explored.
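As a hedged illustration of why allelic standards help to harmonise microsatellite profiles between laboratories (not the study's own pipeline), the sketch below bins raw fragment-size estimates against a hypothetical partial allelic ladder so that two runs of the same clone receive identical integer allele calls.

import numpy as np

# Hypothetical partial allelic ladder for one microsatellite locus:
# sequenced standards with known true allele sizes (base pairs).
ladder = np.array([188, 190, 192, 196, 200, 204])

def call_allele(raw_size_bp, ladder, tol=1.0):
    # Snap a raw fragment-size estimate to the nearest ladder allele.
    idx = np.argmin(np.abs(ladder - raw_size_bp))
    if abs(ladder[idx] - raw_size_bp) > tol:
        return None                      # off-ladder allele: flag for review
    return int(ladder[idx])

# The same tree genotyped in two laboratories, with small sizing offsets.
lab_a = [189.6, 196.3]
lab_b = [190.4, 195.8]

print([call_allele(s, ladder) for s in lab_a])   # [190, 196]
print([call_allele(s, ladder) for s in lab_b])   # [190, 196]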

Relevância:

80.00%

Publicador:

Resumo:

Quality control of fruit requires reliable methods that can assess physical and chemical characteristics with reasonable accuracy and, ideally, non-destructively. More specifically, decreased firmness indicates the presence of damage or defects in the fruit, or that the fruit has exceeded its "best before" date and become unsuitable for consumption. In high-value exotic fruits such as mangoes, where firmness cannot easily be judged from a simple observation of texture, colour changes and unevenness of the fruit surface, the use of non-destructive techniques is highly advisable. In particular, laser vibrometry, a non-contact technique based on the Doppler effect and sensitive to displacement differences below one nanometre, appears ideal for possible on-line control of food. Previous results indicated that a phase shift can be repeatably associated with the presence of damage on the fruit, while decreased firmness results in significant differences in the displacement of the fruit under the same excitation signal. In this work, frequency ranges for quality control via the application of a sound chirp are suggested, based on the measurement of the signal coherence. The variation of the average vibration spectrum over a grid of points, or of the point-by-point signal velocity, allows a go/no-go recognition of "firm" and "over-ripe" fruit, with notable success in the particular case of mangoes. Future exploitation of this work will include the application of the method to on-line control during conveyor-belt distribution of fruit.
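As a hedged sketch of the coherence-based screening described above (not the authors' code), the example below excites a toy system with a sound chirp, estimates the magnitude-squared coherence between excitation and measured velocity with scipy, and keeps only the frequency band where coherence is high enough to trust the spectrum; signals and thresholds are synthetic.

import numpy as np
from scipy.signal import chirp, coherence, lfilter

fs = 20000                                    # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)

# Excitation: sound chirp sweeping 100 Hz to 4 kHz.
excitation = chirp(t, f0=100, f1=4000, t1=t[-1], method="linear")

# Toy "fruit" response: a simple resonance plus measurement noise,
# standing in for the vibrometer velocity signal.
b, a = [0.05], [1.0, -1.9, 0.905]
noise = 0.01 * np.random.default_rng(3).standard_normal(t.size)
velocity = lfilter(b, a, excitation) + noise

f, coh = coherence(excitation, velocity, fs=fs, nperseg=4096)

# Keep only frequencies where the measurement is reliable (high coherence);
# the retained band is where firm and over-ripe spectra would be compared.
band = f[coh > 0.9]
print(band.min(), band.max())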

Relevância:

80.00%

Publicador:

Resumo:

As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS), respectively. These services were developed independently and were readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability that results from adherence to international standards. The key feature of the portal is the ability to display co-plotted time series of the in situ and model data and to quantify the misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds, without being concerned with the low-level details of differing file formats or the physical location of the data. Scientific and operational benefits of this work include model validation, quality control of observations, data assimilation and decision support in near real time. In these areas it is essential to be able to bring different data streams together from often disparate locations.
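As a hedged illustration of the kind of model-observation misfit calculation the portal performs (not ECOOP's actual code), the sketch below interpolates a model time series onto in situ observation times and reports bias and root-mean-square error; the data and time stamps are invented.

import numpy as np

# Hypothetical model time series (hourly) and in situ record (irregular times).
model_t = np.arange(0.0, 48.0, 1.0)                 # hours
model_sst = 15.0 + 0.8 * np.sin(2 * np.pi * model_t / 24.0)

obs_t = np.array([1.5, 7.2, 13.0, 20.4, 26.1, 33.3, 41.8])
obs_sst = np.array([15.6, 15.9, 14.6, 14.9, 15.8, 14.7, 15.3])

# Put the model on the observation times, then quantify the misfit.
model_at_obs = np.interp(obs_t, model_t, model_sst)
bias = np.mean(model_at_obs - obs_sst)
rmse = np.sqrt(np.mean((model_at_obs - obs_sst) ** 2))

print(round(bias, 2), round(rmse, 2))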

Relevância:

80.00%

Publicador:

Resumo:

Dielectric properties of 16 process cheeses were determined over the frequency range 0.3-3 GHz. The effect of temperature on the dielectric properties of the process cheeses was investigated at intervals of 10 °C between 5 and 85 °C. Results showed that the dielectric constant (ε′) decreased gradually as frequency increased, for all cheeses. The dielectric loss factor (ε″) decreased from above 125 to below 12 as frequency increased. ε′ was highest at 5 °C and generally decreased up to a temperature between 55 and 75 °C. ε″ generally increased with increasing temperature for high and medium moisture/fat ratio cheeses; for the low moisture/fat ratio cheese, ε″ decreased with temperature between 5 and 55 °C and then increased. Partial least squares regression models indicated that ε′ and ε″ could be used in a quality control screening application to measure the moisture content and inorganic salt content of process cheese, respectively.
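As a hedged sketch of the kind of partial least squares calibration mentioned above (not the authors' model), the example below fits a PLS regression of moisture content on dielectric spectra with scikit-learn; the spectra and reference values are simulated.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Simulated data: 60 cheese samples, dielectric constant at 30 frequencies.
n_samples, n_freq = 60, 30
moisture = rng.uniform(40.0, 60.0, n_samples)             # % moisture (reference values)
freq_profile = np.linspace(1.0, 0.6, n_freq)               # spectra fall with frequency
spectra = moisture[:, None] * freq_profile + rng.normal(0.0, 0.5, (n_samples, n_freq))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, moisture, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)

pred = pls.predict(X_test).ravel()
rmsep = np.sqrt(np.mean((pred - y_test) ** 2))
print(round(rmsep, 2))                                      # root mean square error of prediction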

Relevância:

80.00%

Publicador:

Resumo:

In this paper we explore classification techniques for ill-posed problems. Two classes are linearly separable in some Hilbert space X if they can be separated by a hyperplane. We investigate stable separability, i.e. the case where there is a positive distance between two separating hyperplanes. When the data in the space Y are generated by a compact operator A applied to the system states x ∈ X, we show that in general we do not obtain stable separability in Y even if the problem in X is stably separable. In particular, we show this for the case where a nonlinear classification is generated from a non-convergent family of linear classes in X. We apply our results to the problem of quality control of fuel cells, where we classify fuel cells according to their efficiency. We can potentially classify a fuel cell using either some external measured magnetic field or some internal current. However, we cannot measure the current directly, since we cannot access the fuel cell in operation. The first possibility is to apply discrimination techniques directly to the measured magnetic fields. The second approach first reconstructs the currents and then carries out the classification on the current distributions. We show that both approaches need regularization and that the regularized classifications are not equivalent in general. Finally, we investigate a widely used linear classification algorithm, Fisher's linear discriminant, with respect to its ill-posedness when applied to data generated via a compact integral operator. We show that the method cannot stay stable when the number of measurement points becomes large.
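As a hedged restatement in symbols of the stable separability notion above (my phrasing of the standard definition, not quoted from the paper): two classes C_1, C_2 ⊂ X are stably separable if there exist w ∈ X, b ∈ R and a margin δ > 0 such that

\langle w, x \rangle + b \;\ge\; \delta \quad \forall x \in C_1, \qquad \langle w, x \rangle + b \;\le\; -\delta \quad \forall x \in C_2 .

The two hyperplanes \langle w, x \rangle + b = \pm\delta are then a distance 2\delta / \lVert w \rVert apart; the point of the paper is that composing with a compact operator A can destroy this uniform margin in Y = A(X) even when it holds in X.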

Relevância:

80.00%

Publicador:

Resumo:

Data assimilation algorithms are a crucial part of operational systems in numerical weather prediction, hydrology and climate science, but they are also important for dynamical reconstruction in medical applications and for quality control of manufacturing processes. Usually, a variety of diverse measurement data are employed to determine the state of the atmosphere or of a wider system including land and oceans. Modern data assimilation systems use more and more remote sensing data, in particular radiances measured by satellites, radar data and integrated water vapor measurements via GPS/GNSS signals. The inversion of some of these measurements is ill-posed in the classical sense, i.e. the inverse of the operator H which maps the state onto the data is unbounded. In this case, the use of such data can lead to significant instabilities of data assimilation algorithms. The goal of this work is to provide a rigorous mathematical analysis of the instability of well-known data assimilation methods. Here, we restrict our attention to particular linear systems in which the instability can be analyzed explicitly. We investigate three-dimensional and four-dimensional variational assimilation. A theory for the instability is developed using the classical theory of ill-posed problems in a Banach space framework. Further, we demonstrate by numerical examples that instabilities can and do occur, including an example from dynamic magnetic tomography.
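As a hedged reminder of the scheme under discussion (the standard textbook form, not quoted from the paper), three-dimensional variational assimilation chooses the analysis by minimising

J(x) = \tfrac{1}{2} (x - x_b)^{\mathsf{T}} B^{-1} (x - x_b) + \tfrac{1}{2} (y - Hx)^{\mathsf{T}} R^{-1} (y - Hx),

where x_b is the background state, B and R are the background and observation error covariances, y is the observation vector and H the observation operator; the instabilities analysed in the paper arise when the inversion implicit in fitting y = Hx is ill-posed.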

Relevância:

80.00%

Publicador:

Resumo:

The formulation and performance of the Met Office visibility analysis and prediction system are described. The visibility diagnostic within the limited-area Unified Model is a function of humidity and a prognostic aerosol content. The aerosol model includes advection, industrial and general urban sources, plus boundary-layer mixing and removal by rain. The assimilation is a three-dimensional variational scheme in which the visibility observation operator is a very nonlinear function of humidity, aerosol and temperature. A quality control scheme for visibility data is included. Visibility observations can give rise to humidity increments of significant magnitude compared with the direct impact of humidity observations. We present the results of sensitivity studies which show the contribution of different components of the system to improved skill in visibility forecasts. Visibility assimilation is most important within the first 6-12 hours of the forecast and for visibilities below 1 km, while modelling of aerosol sources and advection is important for slightly higher visibilities (1-5 km) and is still significant at longer forecast times.
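As a hedged, schematic sketch of a nonlinear visibility observation operator of the kind described (not the Met Office formulation), the code below maps humidity, aerosol load and temperature to visibility through a simple hygroscopic growth factor and a Koschmieder-type relation; every coefficient is invented for illustration only.

import numpy as np

def visibility_operator(rh, aerosol_mass, temperature_k,
                        liminal_contrast=0.02, k_dry=2.5e-3):
    # Toy visibility observation operator (invented coefficients).
    # rh: relative humidity (0-1); aerosol_mass: dry aerosol (micrograms/m3);
    # temperature_k: air temperature (K), used only as a weak modifier here.
    rh = np.clip(rh, 0.0, 0.99)
    # Hygroscopic growth: extinction rises sharply as rh approaches 1.
    growth = (1.0 - rh) ** -0.67
    # Extinction coefficient (1/m), loosely proportional to aerosol load.
    beta = k_dry * aerosol_mass * growth * (288.0 / temperature_k) * 1e-3
    # Koschmieder relation: range at which contrast falls to the liminal value.
    return np.log(1.0 / liminal_contrast) / beta

# The operator is strongly nonlinear in humidity, which is why visibility
# observations can produce sizeable humidity increments in the analysis.
for rh in (0.5, 0.9, 0.97):
    print(rh, round(float(visibility_operator(rh, 30.0, 283.0)), 0), "m")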