Abstract:
A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in on-board software of future space missions. Our PhD project is placed in the context of that effort. At the outset of our work we gathered the industrial needs relevant to ESA and the main European space stakeholders, and consolidated them into a set of high-level technical requirements. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for fulfilling the high-level requirements. The software reference architecture we set out to build rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms to the semantics, assumptions and constraints of the computational model; (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility and fitness of constituents (ii), (iii) and (iv) were already proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model is centered on: (i) rigorous separation of concerns, achieved through support for design views and by careful allocation of concerns to dedicated software entities; (ii) support for the specification and model-based analysis of extra-functional properties; (iii) the inclusion of space-specific concerns.
Abstract:
The numerical treatment of the convection-diffusion equation with its boundary conditions entails the solution of large algebraic linear systems whose coefficient matrix is nonsymmetric. Krylov subspace iterative solvers are widely used for these linear systems, whose solution is particularly challenging in the convection-dominated case. This thesis analyzes several preconditioning strategies designed to accelerate the convergence of these iterative methods. Well-known preconditioners such as ILU and flexible inner-outer iterations are compared experimentally. For the case in which the coefficients of the convection term are separable, we propose a new preconditioning strategy based on a matrix-equation approximation of the convection-diffusion differential operator. The action of this new preconditioner suitably exploits recent efficient solvers for linear matrix equations. Numerous numerical experiments are reported to study how the performance of the different solvers depends on the choice of the convection term and on the discretization parameters.
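The matrix-equation idea behind such a preconditioner can be illustrated with a small self-contained sketch (our own construction, not the thesis's actual solver; the grid size, upwind differencing and constant winds are assumptions): with separable convection, the discretized 2-D operator takes the Kronecker form A = kron(T1, I) + kron(I, T2), so applying the preconditioner reduces to a single Sylvester-equation solve.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def make_1d_operator(n, h, wind):
    """1-D diffusion + upwind convection on an interior grid (assumed setup)."""
    T = np.zeros((n, n))
    for i in range(n):
        T[i, i] = 2.0 / h**2 + wind / h
        if i > 0:
            T[i, i - 1] = -1.0 / h**2 - wind / h
        if i < n - 1:
            T[i, i + 1] = -1.0 / h**2
    return T

def sylvester_precondition(T1, T2, f):
    """Apply the matrix-equation preconditioner: solve T1 X + X T2^T = F,
    where F is the right-hand side reshaped (row-major) as an n x n matrix."""
    n = T1.shape[0]
    F = f.reshape(n, n)
    X = solve_sylvester(T1, T2.T, F)
    return X.ravel()

n, h = 16, 1.0 / 17
T1 = make_1d_operator(n, h, wind=1.0)   # x-direction: diffusion + wind 1.0
T2 = make_1d_operator(n, h, wind=0.5)   # y-direction: diffusion + wind 0.5
# Separable 2-D operator in Kronecker form (row-major vectorization).
A = np.kron(T1, np.eye(n)) + np.kron(np.eye(n), T2)
f = np.ones(n * n)
z = sylvester_precondition(T1, T2, f)
```

In this constant-coefficient example the preconditioner is exact (A z = f); for variable separable coefficients it is only an approximation of the operator, which is where the Krylov iteration comes in.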
Abstract:
Several methods have been investigated, with some success, for treating scrap brass to recover copper and zinc, either as pure metals or as salts of the metals. One of the more promising of these methods is electrolysis in sulfate solution for the recovery of pure copper and zinc.
Abstract:
Ventricular assist devices (VADs) are blood pumps that offer an option to support the circulation of patients with severe heart failure. Since a failing heart has a remaining pump function, its interaction with the VAD influences the hemodynamics. Ideally, the heart's action is taken into account for actuating the device such that the device is synchronized to the natural cardiac cycle. To realize this in practice, a reliable real-time algorithm for the automatic synchronization of the VAD to the heart rate is required. This paper defines the tasks such an algorithm needs to fulfill: the automatic detection of irregular heart beats and the feedback control of the phase shift between the systolic phases of the heart and the assist device. We demonstrate a possible solution to these problems and analyze its performance in two steps. First, the algorithm is tested using the MIT-BIH arrhythmia database. Second, the algorithm is implemented in a controller for a pulsatile and a continuous-flow VAD. These devices are connected to a hybrid mock circulation where three test scenarios are evaluated. The proposed algorithm ensures a reliable synchronization of the VAD to the heart cycle, while being insensitive to irregularities in the heart rate.
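The two tasks named above can be sketched in a few lines (an illustrative toy, not the authors' algorithm; the tolerance, gain and plant model are assumptions): irregular beats are flagged against a running median RR interval, and a proportional feedback law steers the VAD trigger delay so the measured phase shift converges to a target.

```python
import statistics

def irregular(rr_intervals, new_rr, tol=0.2):
    """Flag a beat whose RR interval deviates more than tol from the
    median of recent RR intervals (assumed rejection rule)."""
    med = statistics.median(rr_intervals)
    return abs(new_rr - med) > tol * med

def phase_error(measured, target):
    """Phase error in cycles, wrapped to [-0.5, 0.5) on the unit circle."""
    return (measured - target + 0.5) % 1.0 - 0.5

def update_delay(delay, measured_phase, target_phase, period, kp=0.3):
    """Proportional feedback on the VAD trigger delay (seconds)."""
    return delay - kp * phase_error(measured_phase, target_phase) * period

rr_hist = [0.80, 0.82, 0.79, 0.81]       # regular beats at ~75 bpm
assert not irregular(rr_hist, 0.80)       # normal beat accepted
assert irregular(rr_hist, 1.20)           # skipped/ectopic beat rejected

d = 0.10                                  # initial trigger delay (s)
for _ in range(20):                       # toy plant: phase = delay / period
    measured = d / 0.8
    d = update_delay(d, measured, target_phase=0.25, period=0.8)
# d converges toward 0.25 * 0.8 = 0.2 s, the delay giving the target phase.
```

The wrap in `phase_error` matters: it makes the controller take the short way around the cycle instead of chasing a full-period correction.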
Abstract:
A natural smoky quartz crystal from Shandong province, China, was characterised by laser ablation ICP-MS, electron probe microanalysis (EPMA) and solution ICP-MS to determine the concentration of twenty-four trace and ultra-trace elements. Our main focus was on Ti quantification because of the increased use of this element for titanium-in-quartz (TitaniQ) thermobarometry. Pieces of a uniform growth zone of 9 mm thickness within the quartz crystal were analysed in four different LA-ICP-MS laboratories, three EPMA laboratories and one solution-ICP-MS laboratory. The results reveal reproducible concentrations of Ti (57 ± 4 µg g-1), Al (154 ± 15 µg g-1), Li (30 ± 2 µg g-1), Fe (2.2 ± 0.3 µg g-1), Mn (0.34 ± 0.04 µg g-1), Ge (1.7 ± 0.2 µg g-1) and Ga (0.020 ± 0.002 µg g-1) and detectable, but less reproducible, concentrations of Be, B, Na, Cu, Zr, Sn and Pb. Concentrations of K, Ca, Sr, Mo, Ag, Sb, Ba and Au were below the limits of detection of all three techniques. The uncertainties on the average concentration determinations by multiple techniques and laboratories for Ti, Al, Li, Fe, Mn, Ga and Ge are low; hence, this quartz can serve as a reference material or a secondary reference material for microanalytical applications involving the quantification of trace elements in quartz.
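As a toy illustration of how a consensus value and its uncertainty can be formed from several laboratories' means (the numbers below are invented and are not the study's lab-by-lab data):

```python
import statistics

def reference_value(lab_means):
    """Unweighted inter-laboratory mean with its standard deviation,
    a simple way to summarise a multi-laboratory characterisation."""
    m = statistics.mean(lab_means)
    s = statistics.stdev(lab_means)
    return m, s

ti_by_lab = [55.0, 58.0, 60.0, 56.0, 57.0]  # invented Ti means, µg/g
m, s = reference_value(ti_by_lab)
# m is the candidate reference value, s the inter-laboratory spread.
```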
Abstract:
In this chapter a low-cost surgical navigation solution for periacetabular osteotomy (PAO) surgery is described. Two commercial inertial measurement units (IMUs; Xsens Technologies, The Netherlands) are attached to a patient's pelvis and to the acetabular fragment, respectively. Registration of the patient with a pre-operatively acquired computer model is done by recording the orientation of the patient's anterior pelvic plane (APP) using one IMU. A custom-designed device is used to record the orientation of the APP in the reference coordinate system of the IMU. After registration, the two sensors are mounted to the patient's pelvis and acetabular fragment, respectively. Once the initial position is recorded, the orientation is measured and displayed on a computer screen. A patient-specific computer model generated from a pre-operatively acquired computed tomography (CT) scan is used to visualize the updated orientation of the acetabular fragment. Experiments with plastic bones (7 hip joints) performed in an operating room, comparing a previously developed optical navigation system with our inertial-based navigation system, showed no statistically significant difference in the measurement of acetabular component reorientation (anteversion and inclination). In six out of seven hip joints the mean absolute difference was below five degrees for both anteversion and inclination.
Abstract:
PURPOSE To evaluate a low-cost, inertial sensor-based surgical navigation solution for periacetabular osteotomy (PAO) surgery without the line-of-sight impediment. METHODS Two commercial inertial measurement units (IMUs; Xsens Technologies, The Netherlands) are attached to a patient's pelvis and to the acetabular fragment, respectively. Registration of the patient with a pre-operatively acquired computer model is done by recording the orientation of the patient's anterior pelvic plane (APP) using one IMU. A custom-designed device is used to record the orientation of the APP in the reference coordinate system of the IMU. After registration, the two sensors are mounted to the patient's pelvis and acetabular fragment, respectively. Once the initial position is recorded, the orientation is measured and displayed on a computer screen. A patient-specific computer model generated from a pre-operatively acquired computed tomography scan is used to visualize the updated orientation of the acetabular fragment. RESULTS Experiments with plastic bones (eight hip joints) performed in an operating room, comparing a previously developed optical navigation system with our inertial-based navigation system, showed no statistically significant difference in the measurement of acetabular component reorientation. In all eight hip joints the mean absolute difference was below four degrees. CONCLUSION Using two commercially available inertial measurement units, we show that it is possible to accurately measure the orientation (inclination and anteversion) of the acetabular fragment during PAO surgery and therefore to successfully eliminate the line-of-sight impediment that optical navigation systems have.
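One way such a sensor pair could be combined is sketched below (a simplified illustration under our own assumptions, not the published implementation): the fragment sensor's orientation is expressed relative to the pelvis sensor so that whole-patient motion cancels, and the fragment's reorientation since registration is read out as Euler angles.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def relative(pelvis_q, fragment_q):
    """Fragment orientation expressed in the pelvis frame
    (cancels any motion of the patient as a whole)."""
    return R.from_quat(pelvis_q).inv() * R.from_quat(fragment_q)

def reorientation_deg(rel_initial, rel_current):
    """Rotation of the fragment since registration, as Euler angles in
    degrees; which axes map to anteversion/inclination depends on the
    chosen anatomical frames (an assumption here)."""
    change = rel_initial.inv() * rel_current
    return change.as_euler("xyz", degrees=True)

# Toy check: pelvis fixed, fragment tilted 10 degrees about x after registration.
p0 = R.identity().as_quat()
f0 = R.identity().as_quat()
f1 = R.from_euler("x", 10, degrees=True).as_quat()
angles = reorientation_deg(relative(p0, f0), relative(p0, f1))
```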
Abstract:
The daytime abundance and localized distribution of fishes in relation to temperature were studied in a small tidal cove by beach seining on seven dates in the Back River estuary, Maine, during the summers of 1971 and 1972. Temperatures on the seven dates ranged from 15.1 to 26.2 °C, and salinities ranged from 17.3 to 24.7‰. Eighteen species of fishes were captured, with mummichogs, smooth flounders, Atlantic silversides and Atlantic herring together comprising over 98% of the catch. Mummichogs and Atlantic silversides were captured primarily near the inner end of the cove, while other abundant species were caught mainly at the outer end. Several species seem well adapted to the naturally warm cove temperatures; others now seem virtually excluded by them. Winter flounder, Atlantic herring, and Atlantic tomcod might be excluded from the cove during daytime in summer if artificial warming of the cove were permitted.
Abstract:
To estimate the kinematics of the SIRGAS reference frame, the Deutsches Geodätisches Forschungsinstitut (DGFI), as the IGS Regional Network Associate Analysis Centre for SIRGAS (IGS RNAAC SIR), yearly computes a cumulative (multi-year) solution containing all available weekly solutions delivered by the SIRGAS analysis centres. These cumulative solutions include the models, standards, and strategies widely applied at the time they were computed, and cover different time spans depending on the availability of the weekly solutions. This data set corresponds to the multi-year solution SIR11P01. It is based on the combination of the weekly normal equations covering the time span from 2000-01-02 (GPS week 1043) to 2011-04-16 (GPS week 1631), when the IGS08 reference frame was introduced. It refers to ITRF2008, epoch 2005.0, and contains 230 stations with 269 occupations. Its precision was estimated to be ±1.0 mm (horizontal) and ±2.4 mm (vertical) for the station positions, and ±0.7 mm/a (horizontal) and ±1.1 mm/a (vertical) for the constant velocities. The computation strategy and results are described in detail in Sánchez and Seitz (2011). The IGS RNAAC SIR computation of the SIRGAS reference frame is possible thanks to the active participation of many Latin American and Caribbean colleagues, who not only make the measurements of the stations available but also operate SIRGAS analysis centres processing the observational data on a routine basis (more details at http://www.sirgas.org). The achievements of SIRGAS are a consequence of a successful international geodetic cooperation that not only follows and meets concrete objectives, but has also become a permanent and self-sustaining geodetic community guaranteeing the quality, reliability, and long-term stability of the SIRGAS reference frame. The SIRGAS activities are strongly supported by the International Association of Geodesy (IAG) and the Pan American Institute of Geography and History (PAIGH). The IGS RNAAC SIR highly appreciates all this support.
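As a minimal illustration of how a frame defined by a reference epoch and constant velocities is used (invented coordinates, not SIR11P01 data), a station position is propagated linearly to an observation epoch:

```python
def propagate(x0_mm, v_mm_per_yr, epoch, ref_epoch=2005.0):
    """Linear coordinate propagation x(t) = x0 + v * (t - t0),
    component by component, for a constant-velocity station model."""
    return [x + v * (epoch - ref_epoch) for x, v in zip(x0_mm, v_mm_per_yr)]

pos_2005 = [1000.0, 2000.0, 3000.0]   # invented coordinates at epoch 2005.0 (mm)
vel = [10.2, -3.1, 0.7]               # invented constant velocities (mm/a)
pos_2010 = propagate(pos_2005, vel, 2010.0)
```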
Abstract:
Pockmarks are geological features found on the bottom of lakes and oceans all over the globe. Some are active, seeping oil or methane, while others are inactive. Active pockmarks are well studied, since they harbor specialized microbial communities that proliferate on the seeping compounds. Such communities are not found in inactive pockmarks. Interestingly, inactive pockmarks are known to host different macrofaunal communities than the surrounding sediments. It remains undetermined what the microbial composition of inactive pockmarks is, and whether it shows a pattern similar to that of the macrofauna. The Norwegian Oslo Fjord contains many inactive pockmarks, which are well suited to studying the influence of these geological features on the microbial community in the sediment. Here we present a detailed analysis of the microbial communities found in three inactive pockmarks and two control samples at two core depth intervals. The communities were analyzed using high-throughput amplicon sequencing of the 16S rRNA V3 region. Microbial communities of surface pockmark sediments were indistinguishable from communities found in the surrounding seabed. In contrast, pockmark communities at 40 cm sediment depth had a significantly different community structure from normal sediments at the same depth. Statistical analysis of chemical variables indicated significant differences in the concentrations of total carbon and non-particulate organic carbon between 40 cm pockmark and reference sediments. We discuss these results in comparison with the taxonomic classification of the OTUs identified in our samples. Our results indicate that microbial surface sediment communities are affected by the water column, while the 40 cm communities are affected by local conditions within the sediment.
Abstract:
Pore water and turnover rates were determined for surface sediment cores obtained in 2009 and 2010. The pore water was extracted with Rhizons (Rhizon CSS: length 5 cm, pore diameter 0.15 µm; Rhizosphere Research Products, Wageningen, Netherlands) at 1 cm resolution and immediately fixed in 5% zinc acetate (ZnAc) solution for the sulfate and sulfide analyses. The samples were diluted and filtered, and the concentrations were measured with non-suppressed anion exchange chromatography (Waters IC-Pak anion exchange column, Waters 430 conductivity detector). The total sulfide concentrations (H2S + HS- + S2-) were determined using the diamine complexation method (doi:10.4319/lo.1969.14.3.0454). Samples for dissolved inorganic carbon (DIC) and alkalinity measurements were preserved by adding 2 µl of saturated mercury chloride (HgCl2) solution and stored headspace-free in gas-tight glass vials. DIC and alkalinity were measured using the flow injection method (detector: VWR Scientific model 1054) (doi:10.4319/lo.1992.37.5.1113). Dissolved sulfide was eliminated prior to the DIC measurement by adding 0.5 M molybdate solution (doi:10.4319/lo.1995.40.5.1011). Nutrient subsamples (10-15 ml) were stored at -20 °C prior to concentration measurements with a Skalar continuous-flow analyzer (doi:10.1002/9783527613984).
Abstract:
Turnover rates were determined for surface sediment cores obtained in 2009 and 2010. Sulfate reduction (SR) was measured ex situ by the whole-core injection method (doi:10.1080/01490457809377722). We incubated the samples at in situ temperature (1.0 °C) for 12 hours with carrier-free 35S-sulfate (dissolved in water, 50 kBq). Sediment was fixed in 20 ml of 20% ZnAc solution (for AOM and SR analyses). Turnover rates were measured as previously described (doi:10.4319/lom.2004.2.171).
Abstract:
A high-definition video quality metric built from full-reference ratios. Visual Quality Assessment (VQA) is one of the major challenges still to be solved in the multimedia environment. Video quality has a very high impact on the end user's (consumer's) perception of services built on the delivery of multimedia content, and it is therefore a key factor in the assessment of the new paradigm known as Quality of Experience (QoE). Video quality measurement models can be grouped into several branches according to the technical basis of the measurement system. The most important are those that employ psychovisual models aimed at reproducing the characteristics of the Human Visual System (HVS), and those that instead take an engineering approach in which the quality computation is based on the extraction and comparison of intrinsic image parameters. Despite the advances made in this field in recent years, research on video quality metrics, whether in the presence of the reference (so-called full-reference models), in the presence of part of it (reduced-reference models), or even in its absence (no-reference models), still has ample room for improvement and goals to reach. Among them, the measurement of high-definition signals, especially the very high quality signals used in the early stages of the value chain, is of particular interest because of its influence on the final quality of the service, and no reliable measurement models currently exist.
This doctoral thesis presents a full-reference quality measurement model that we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), based on the weighting of four quality ratios computed from intrinsic image features: the Fidelity Ratio, computed by means of the morphological (Beucher) gradient; the Visual Similarity Ratio, computed from the visually significant points of the image through local contrast filtering; the Sharpness Ratio, derived from the extraction of Haralick's contrast texture statistic; and the Complexity Ratio, obtained from the homogeneity definition of the set of Haralick texture statistics. PARMENIA is novel in its use of mathematical morphology and Haralick statistics as the basis of a quality metric, since these techniques have traditionally been tied to remote sensing and object segmentation. The formulation of the metric as a weighted set of ratios is equally novel, as it draws both on structural similarity models and on more classical ones based on the perceptibility of the error produced by the signal degradation associated with compression. PARMENIA shows very high correlation with the MOS scores obtained from the subjective user tests carried out for its validation. The working corpus was selected from internationally validated sets of sequences, so that the reported results are of the highest possible quality and rigor.
The methodology followed consisted of generating a set of test sequences of different qualities by encoding with different quantization steps, obtaining subjective ratings of those sequences through subjective quality tests (based on International Telecommunication Union Recommendation BT.500), and validating the model by computing the correlation of PARMENIA with these subjective values, quantified through the Pearson correlation coefficient. Once the ratios had been validated, their influence on the final measure optimized, and their high correlation with perception confirmed, a second evaluation was carried out on sequences from HDTV test dataset 1 of the Video Quality Experts Group (VQEG), where the results obtained showed the metric's clear advantages. Abstract: Visual Quality Assessment has so far been one of the most intriguing challenges in the media environment. The progressive evolution towards higher resolutions together with increasing quality requirements (e.g. high definition and better image quality) calls for redefined quality measurement models. Given the growing interest in multimedia service delivery, perceptual quality measurement has become a very active area of research. First, this work introduces a classification of objective video quality metrics based on their underlying methodologies and approaches for measuring video quality, to sum up the state of the art. Then, this doctoral thesis describes an enhanced solution for full-reference objective quality measurement based on mathematical morphology, texture features and visual similarity information, providing a normalized metric that we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), with highly correlated MOS scores.
The PARMENIA metric is based on the pooling of different quality ratios obtained from three different approaches: Beucher's gradient, local contrast filtering, and the contrast and homogeneity Haralick texture features. The metric's performance is excellent and improves on the current state of the art by providing a wide dynamic range that makes it easier to discriminate between coded sequences of very similar quality, especially at very high bit rates whose quality is currently transparent to existing quality metrics. PARMENIA introduces a degree of novelty over other working metrics: on the one hand, it exploits structural information variation to build the metric's kernel, but complements the measure with texture information and a ratio of visually meaningful points that is closer to typical error-sensitivity-based approaches. We would like to point out that PARMENIA is the only metric built upon full-reference ratios that uses mathematical morphology and texture features (typically used in segmentation) for quality assessment. On the other hand, it produces results with a wide dynamic range that allows measuring the quality of high-definition sequences from bit rates of hundreds of megabits per second (Mbps) down to typical distribution rates (5-6 Mbps) and even streaming rates (1-2 Mbps). Thus, a direct correlation between PARMENIA and MOS scores is easily constructed. PARMENIA may further enhance the number of available choices in objective quality measurement, especially for very high quality HD materials. All these results come from a validation carried out on internationally validated datasets on which subjective tests based on the ITU-R BT.500 methodology were performed. The Pearson correlation coefficient was calculated to verify the accuracy of PARMENIA and its reliability.
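The pooling-plus-validation arithmetic described above can be sketched as follows (the weights, per-sequence ratios and MOS values are invented placeholders, not PARMENIA's actual parameters or results):

```python
import math

def pooled_score(ratios, weights):
    """Weighted combination of the four quality ratios into one score."""
    return sum(w * r for w, r in zip(weights, ratios))

def pearson(xs, ys):
    """Pearson correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented per-sequence ratios (fidelity, visual similarity, sharpness,
# complexity) for three coded sequences, and invented weights and MOS values.
sequences = [(0.9, 0.8, 0.7, 0.6), (0.7, 0.6, 0.5, 0.5), (0.5, 0.4, 0.4, 0.3)]
weights = (0.4, 0.3, 0.2, 0.1)
scores = [pooled_score(r, weights) for r in sequences]
mos = [4.5, 3.5, 2.0]
r = pearson(scores, mos)   # the validation statistic discussed above
```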
Abstract:
Reverberation chambers are well known for providing a random-like electric field distribution. Detecting the directivity or gain of a device in such a field requires an adequate procedure and smart post-processing. In this paper, a new method is proposed for estimating the directivity of radiating devices in a reverberation chamber (RC). The method is based on the Rician K-factor, whose estimation in an RC benefits from recent improvements. Directivity estimation relies on the accurate determination of the K-factor with respect to a reference antenna. Good agreement is reported with measurements carried out in a near-field anechoic chamber (AC) using a near-field to far-field transformation.
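A common moment-based K-factor estimator (shown here as an assumed illustration; the paper relies on an improved estimator) treats the measured complex channel as Rician and takes the ratio of unstirred to stirred power, K = |E[S21]|^2 / Var(S21); the device's directivity is then assessed by comparing its K-factor with that of a reference antenna.

```python
import cmath
import random

def k_factor(samples):
    """Moment estimate of the Rician K-factor from complex channel samples:
    power of the mean (unstirred part) over the variance (stirred part)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum(abs(s - mean) ** 2 for s in samples) / n
    return abs(mean) ** 2 / var

# Synthetic Rician channel: fixed unstirred component of power 4 plus a
# circular complex Gaussian stirred component of unit total power, so K = 4.
random.seed(1)
los = cmath.rect(2.0, 0.3)                      # |los|^2 = 4
samples = [los + complex(random.gauss(0, 0.5 ** 0.5),
                         random.gauss(0, 0.5 ** 0.5))
           for _ in range(20000)]
k_hat = k_factor(samples)                        # should be close to 4
```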