992 results for data refinement


Relevance: 30.00%

Abstract:

In this paper we develop a multithreaded VLSI processor linear-array architecture for rendering complex environments with the radiosity approach. The processing elements are identical and multithreaded, and work in Single Program Multiple Data (SPMD) mode. A new algorithm for the radiosity computations, based on the progressive refinement approach [2], is proposed. Simulation results indicate that the architecture is latency tolerant and scalable. It is shown that a linear array of 128 uni-threaded processing elements sustains a throughput close to 0.4 million patches/sec.
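The progressive refinement approach the abstract builds on can be sketched in a few lines: repeatedly pick the patch with the most unshot energy and distribute ("shoot") it to the other patches. The patch data, form factors `F`, and convergence threshold below are illustrative assumptions, not the paper's hardware pipeline.

```python
# Minimal sketch of progressive-refinement radiosity. Inputs are hypothetical:
# per-patch emission, reflectance, and a precomputed form-factor matrix F.

def progressive_refinement(emission, reflectance, F, n_iters=100):
    """Iteratively shoot the largest unshot radiosity until it is negligible."""
    n = len(emission)
    radiosity = list(emission)   # total radiosity accumulated per patch
    unshot = list(emission)      # radiosity not yet distributed to the scene
    for _ in range(n_iters):
        i = max(range(n), key=lambda k: unshot[k])  # patch with most unshot energy
        if unshot[i] < 1e-9:
            break
        shoot = unshot[i]
        unshot[i] = 0.0          # patch i shoots all of its unshot energy
        for j in range(n):
            if j == i:
                continue
            dB = reflectance[j] * F[i][j] * shoot   # energy received by patch j
            radiosity[j] += dB
            unshot[j] += dB
    return radiosity
```

Because each iteration shoots the single most energetic patch, intermediate solutions are usable images, which is what makes the method attractive for a latency-tolerant parallel array.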

Relevance: 30.00%

Abstract:

Electron Diffraction Structure Analysis (EDSA) with data from standard selected-area electron diffraction (SAED) is still the method of choice for structure determination of nano-sized single crystals. The recently determined heavy-atom structure α-Ti2Se (Albe & Weirich, 2003) is used as an example to illustrate the developed procedure for structure determination from two-dimensional SAED data via direct methods and kinematical least-squares refinement. Although the investigated crystallite had a relatively large effective thickness of about 230 Å, as determined from dynamical calculations, the structural model obtained from the SAED data was found to be in good agreement with the result of an earlier single-crystal X-ray study (Weirich, Pöttgen & Simon, 1996). Arguments supporting the validity of the quasi-kinematical approach used are given in the text. The influence of dynamical and secondary scattering on the quality of the data and the structure solution is discussed. Moreover, the usefulness of first-principles calculations for verifying the results of EDSA is demonstrated with two examples, one of which concerns a structure that was unattainable by conventional X-ray diffraction.

Relevance: 30.00%

Abstract:

The standard Gibbs energy of formation of ReO2 in the temperature range from 900 to 1200 K has been determined with high precision using a novel apparatus incorporating a buffer electrode between the reference and working electrodes. The role of the buffer electrode was to absorb the electrochemical flux of oxygen through the solid electrolyte from the electrode with the higher oxygen chemical potential to the electrode with the lower oxygen potential; it prevented polarization of the measuring electrode and ensured accurate data. The Re + ReO2 working electrode was placed in a closed stabilized-zirconia crucible to prevent continuous vaporization of Re2O7 at high temperatures. The standard Gibbs energy of formation of ReO2 can be represented as a linear function of temperature. Accurate values of the low- and high-temperature heat capacity of ReO2 are available in the literature. These thermal data are coupled with the standard Gibbs energy of formation obtained in this study to evaluate the standard enthalpy of formation of ReO2 at 298.15 K by the 'third law' method: ΔfH°(ReO2, 298.15 K)/kJ mol−1 = −445.1 (±0.2). The uncertainty estimate includes both random (2σ) and systematic errors.
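The 'third law' evaluation mentioned above combines a measured ΔfG°(T) with tabulated free-energy functions fef(T) = −(G°(T) − H°(298 K))/T for product and reactants, so that ΔfH°(298 K) = ΔfG°(T) + T·Δfef(T). A minimal sketch of the arithmetic, with hypothetical fef values rather than the literature heat-capacity data used in the study:

```python
def third_law_enthalpy(dG_T, T, fef_product, fef_reactants):
    """Third-law standard enthalpy of formation at 298.15 K (J/mol).

    dG_T           : measured standard Gibbs energy of formation at T (J/mol)
    fef_product    : free-energy function -(G°(T) - H°(298 K))/T of the product, J/(mol K)
    fef_reactants  : list of the same quantity for the reactants (stoichiometric sum)
    """
    d_fef = fef_product - sum(fef_reactants)     # Δfef of the formation reaction
    return dG_T + T * d_fef                      # ΔfH°(298 K) = ΔfG°(T) + T·Δfef
```

Averaging this quantity over every measured temperature, as third-law analyses do, exposes temperature-dependent systematic errors as a drift in the per-point enthalpies.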

Relevance: 30.00%

Abstract:

A network of ship-mounted real-time Automatic Weather Stations integrated with the Indian geosynchronous satellites [Indian National Satellites (INSATs)] 3A and 3C, named the Indian National Centre for Ocean Information Services Real-Time Automatic Weather Stations (I-RAWS), has been established. The purpose of I-RAWS is to measure surface meteorological and ocean parameters and transmit the data in real time in order to validate and refine the forcing parameters (obtained from different meteorological agencies) of the Indian Ocean Forecasting System (INDOFOS). Preliminary validation and intercomparison of analyzed products from the National Centre for Medium Range Weather Forecasting and the European Centre for Medium-Range Weather Forecasts (ECMWF) were carried out using the data collected from I-RAWS. The I-RAWS was mounted on board the oceanographic research vessel Sagar Nidhi during a cruise across three oceanic regimes: the tropical Indian Ocean, the extratropical Indian Ocean, and the Southern Ocean. The results of this validation and intercomparison, and their implications with special reference to the use of atmospheric model data for forcing an ocean model, are discussed in detail. The performance of the analysis products from both atmospheric models is found to be similarly good; however, ECMWF air temperature over the extratropical Indian Ocean and wind speed over the Southern Ocean are marginally better.

Relevance: 30.00%

Abstract:

Effective air-flow distribution through perforated tiles is required to efficiently cool servers in a raised-floor data center. We present detailed computational fluid dynamics (CFD) modeling of air flow through a perforated tile and its entrance to the adjacent server rack. The realistic geometrical details of the perforated tile, as well as of the rack, are included in the model. Typically, models for air flow through perforated tiles specify a step pressure loss across the tile surface based on the tile porosity (the porous jump model). An improvement on this adds a momentum source above the tile to simulate the acceleration of the air flow through the pores (the body force model). Neither model includes geometrical details of the tile such as pore locations and shapes. More detail increases the grid size as well as the computational time; however, grid refinement can be controlled to balance accuracy against computational time. We compared the CFD results obtained with full geometrical resolution against the porous jump and body force model solutions, as well as against the flow field measured in particle image velocimetry (PIV) experiments. Including the tile's geometrical details gives better results than omitting them and specifying physical models across and above the tile surface. A modification to the body force model is also suggested, which achieved improved results.
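The porous jump model referenced above reduces the tile to a step pressure loss Δp = K · ½ρv². A minimal sketch of that bookkeeping, using a generic sharp-edged perforated-plate correlation for K (an illustrative assumption; the paper does not specify a particular correlation):

```python
def porous_jump_dp(velocity, porosity, rho=1.2):
    """Step pressure drop (Pa) across a perforated tile modelled as a porous jump.

    velocity : approach air velocity normal to the tile (m/s)
    porosity : open-area fraction of the tile (0 < porosity <= 1)
    rho      : air density (kg/m^3)
    """
    # Illustrative loss coefficient for a sharp-edged perforated plate:
    # K grows rapidly as the open area shrinks (assumption, not from the paper).
    K = 1.0 / porosity**2 - 1.0
    return K * 0.5 * rho * velocity**2
```

For a typical 25% open tile at 2 m/s this gives a few tens of pascals, which is the order of magnitude such jump conditions impose in data-center CFD models.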

Relevance: 30.00%

Abstract:

Pyridoxal kinase (PdxK; EC 2.7.1.35) belongs to the phosphotransferase family of enzymes and catalyzes the conversion of the three active forms of vitamin B6 (pyridoxine, pyridoxal and pyridoxamine) to their phosphorylated forms, and thereby plays a key role in pyridoxal 5′-phosphate salvage. In the present study, pyridoxal kinase from Salmonella typhimurium was cloned and overexpressed in Escherichia coli, purified using Ni-NTA affinity chromatography and crystallized. X-ray diffraction data were collected to 2.6 Å resolution at 100 K. The crystal belonged to the primitive orthorhombic space group P2(1)2(1)2(1), with unit-cell parameters a = 65.11, b = 72.89, c = 107.52 Å. The data quality obtained by routine processing was poor owing to the presence, in all oscillation images, of strong diffraction rings caused by a polycrystalline material of an unknown small molecule. Excluding the reflections close to the powder/polycrystalline rings provided data of sufficient quality for structure determination. A preliminary structure solution has been obtained by molecular replacement with the Phaser program in the CCP4 suite, using E. coli pyridoxal kinase (PDB entry 2ddm) as the phasing model. Further refinement and analysis of the structure are likely to provide valuable insights into catalysis by pyridoxal kinases.

Relevance: 30.00%

Abstract:

Samarium-doped barium zirconate titanate ceramics with the general formula [Ba1−x Sm2x/3](Zr0.05Ti0.95)O3 [x = 0, 0.01, 0.02, and 0.03] were prepared by the high-energy ball-milling method. X-ray diffraction patterns and micro-Raman spectroscopy confirmed that these ceramics are single phase with a tetragonal structure. Rietveld refinement data were employed to model the [BaO12], [SmO12], [ZrO6], and [TiO6] clusters in the lattice. Scanning electron microscopy shows a reduction in average grain size as Sm3+ ions are incorporated into the lattice. Temperature-dependent dielectric studies indicate a ferroelectric phase transition whose transition temperature decreases with increasing Sm3+ content. The nature of the transition was investigated with the Curie-Weiss law, and the diffusivity is observed to increase with Sm3+ content. The ferroelectric hysteresis loops show that the remnant polarization and coercive field increase with increasing Sm3+ content. The optical properties of the ceramics were studied using ultraviolet-visible diffuse reflectance spectroscopy.
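The diffuseness of such a transition is commonly quantified with the modified Curie-Weiss law, 1/ε − 1/ε_m = (T − T_m)^γ / C, where γ runs from 1 (a normal ferroelectric) to 2 (a fully diffuse transition). A sketch of extracting γ by a log-log least-squares fit, on synthetic permittivity data rather than the paper's measurements:

```python
import math

def diffuseness_exponent(T, eps, T_m, eps_m):
    """Fit gamma in 1/eps - 1/eps_m = (T - T_m)**gamma / C for T > T_m.

    T, eps : temperatures (K) above T_m and the measured permittivities there
    T_m    : temperature of the permittivity maximum
    eps_m  : permittivity at the maximum
    """
    xs = [math.log(t - T_m) for t in T]
    ys = [math.log(1.0 / e - 1.0 / eps_m) for e in eps]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope of the log-log line is the diffuseness exponent gamma.
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope
```

On noiseless synthetic data generated from the law itself, the fit recovers γ exactly; real permittivity curves would be fitted only over a window above T_m.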

Relevance: 30.00%

Abstract:

I. THE CRYSTAL STRUCTURE OF A NEW DIMER OF TRIPHENYLFLUOROCYCLOBUTADIENE

The crystal structure of the thermal isomer of the “head-to-head” dimer of triphenylfluorocyclobutadiene was determined by the direct method. The Σ2 relationships involving the low-angle reflections with the largest E’s were found and solved for the signs by the symbolic method of Zachariasen. The structure was seen in the electron density map and the E-map, and was refined anisotropically by the method of least squares. The residual R was 0.065.

The structure is a gem-difluorohexaphenyldihydropentalene. All of the phenyl groups are planar, as is the cyclopentadiene ring of the dihydropentalene skeleton. Overcrowding at the position of the fluorines causes some deviations from the normal bond angles in the cyclopentene ring.

The list of observed and calculated structure factors on pages 32-34 will not be legible on the microfilm. Photographic copies may be obtained from the California Institute of Technology.

II. A LOW TEMPERATURE REFINEMENT OF THE CYANURIC TRIAZIDE STRUCTURE

The structure of cyanuric triazide was refined anisotropically by the method of least squares. Three-dimensional intensity data, which had been collected photographically with MoKα radiation at −110 °C, were used in the refinement. The residual R was reduced to 0.081.

The structure is completely planar, and there is no significant bond alternation in the cyanuric ring. The packing of the molecules causes the azide groups to deviate from linearity by 8 degrees.

Relevance: 30.00%

Abstract:

CAD software can be structured as a set of modular 'software tools' only if there is some agreement on the data structures to be passed between tools. Beyond this basic requirement, it is desirable to give the agreed structures the status of 'data types' in the language used for interactive design. The ultimate refinement is to have a data management capability which 'understands' how to manipulate such data types. In this paper the requirements of CACSD are formulated from the point of view of Database Management Systems. Progress towards meeting these requirements in both the DBMS and the CACSD communities is reviewed. The conclusion reached is that there has been considerable movement towards the realisation of software tools for CACSD, but that this owes more to modern ideas about programming languages than to DBMS developments. The DBMS field has identified some useful concepts, but further significant progress is expected to come from the exploitation of concepts such as object-oriented programming, logic programming, or functional programming.
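The 'agreed data types' idea can be illustrated with a small typed record for one common CACSD interchange object, a state-space system, plus a consistency check of the kind a data-management layer that 'understands' the type would perform. The names and checks are illustrative, not drawn from any particular CACSD package.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class StateSpace:
    """Continuous-time LTI system: x' = A x + B u, y = C x + D u."""
    A: List[List[float]]
    B: List[List[float]]
    C: List[List[float]]
    D: List[List[float]]

    @property
    def order(self) -> int:
        return len(self.A)

def check_dimensions(sys: StateSpace) -> bool:
    """Verify that A (n×n), B (n×m), C (p×n), D (p×m) are mutually consistent."""
    n = len(sys.A)
    m = len(sys.B[0]) if sys.B else 0
    p = len(sys.C)
    return (all(len(row) == n for row in sys.A)
            and len(sys.B) == n and all(len(row) == m for row in sys.B)
            and all(len(row) == n for row in sys.C)
            and len(sys.D) == p and all(len(row) == m for row in sys.D))
```

Once every tool consumes and produces `StateSpace` rather than ad hoc matrix files, validation and manipulation can live in one place instead of being re-implemented per tool.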

Relevance: 30.00%

Abstract:

Active control has been shown to be a feasible technology for suppressing thermoacoustic instability in continuous combustion systems, and the design of the control strategy depends substantially on the reliability of the flame model. In this paper, refinement of the G-equation flame model for the dynamics of lean premixed combustion is investigated. Specifically, a model of the dynamics between the flame speed S_u and the equivalence ratio phi is proposed, based on numerical calculations and physical explanations. Finally, the developed model is tested on one set of experimental data.
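One simple way to give the flame speed its own dynamics, as opposed to a static S_u(phi) correlation, is a first-order lag toward a steady-state value. Both the lag form and the correlation below are illustrative assumptions, not the refined model developed in the paper:

```python
def s_u_steady(phi):
    """Hypothetical steady laminar flame speed (m/s) vs equivalence ratio.

    A simple quadratic peaking near phi = 1.1, purely for illustration.
    """
    return max(0.0, 0.4 - 0.8 * (phi - 1.1) ** 2)

def step_flame_speed(s_u, phi, dt, tau=0.01):
    """One explicit Euler step of the lag model tau * dS_u/dt = S_u_ss(phi) - S_u."""
    return s_u + (dt / tau) * (s_u_steady(phi) - s_u)
```

In a control-oriented simulation this state would be updated once per time step and fed into the G-equation kinematics, so that fluctuations in phi reach the flame front with a finite response time.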

Relevance: 30.00%

Abstract:

Temporal synchronization of multiple video recordings of the same dynamic event is a critical task in many computer vision applications, e.g. novel view synthesis and 3D reconstruction. Typically this information is implied by the time-stamp information embedded in the video streams. User-generated videos shot with consumer-grade equipment do not contain this information; hence, there is a need to synchronize the signals temporally using the visual information itself. Previous work in this area has assumed either good-quality data with relatively simple dynamic content or the availability of precise camera geometry. Our first contribution is a synchronization technique that establishes correspondence between feature trajectories across views in a novel way, and specifically targets the kind of complex content found in consumer-generated sports recordings, without assuming precise knowledge of fundamental matrices or homographies. We evaluate performance on a number of real video recordings and show that our method is able to synchronize to within 1 s, which is significantly better than previous approaches. Our second contribution is a robust and unsupervised view-invariant activity-recognition descriptor that exploits recurrence plot theory on spatial tiles. On its own, the descriptor is shown to characterize activities from different views under occlusion better than state-of-the-art approaches. We combine this descriptor with our proposed synchronization method and show that it can further refine the synchronization index. © 2013 ACM.
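At its core, any trajectory-based synchronizer searches over candidate time offsets and scores each with a correspondence measure. A minimal sketch using normalized cross-correlation of a per-frame scalar signal (e.g. mean feature speed per frame); this illustrates only the generic offset search, not the paper's trajectory-correspondence method:

```python
def best_offset(sig_a, sig_b, max_lag):
    """Return the frame lag of sig_b relative to sig_a that maximises correlation."""
    def corr_at(lag):
        # Pair up overlapping samples of the two signals at this lag.
        pairs = [(sig_a[i], sig_b[i + lag])
                 for i in range(len(sig_a)) if 0 <= i + lag < len(sig_b)]
        if len(pairs) < 2:
            return float("-inf")
        ma = sum(a for a, _ in pairs) / len(pairs)
        mb = sum(b for _, b in pairs) / len(pairs)
        num = sum((a - ma) * (b - mb) for a, b in pairs)
        den = (sum((a - ma) ** 2 for a, _ in pairs)
               * sum((b - mb) ** 2 for _, b in pairs)) ** 0.5
        return num / den if den else float("-inf")
    return max(range(-max_lag, max_lag + 1), key=corr_at)
```

Real footage needs a view-invariant signal and robustness to outlier trajectories, which is where the paper's contribution lies; the offset search itself stays this simple.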

Relevance: 30.00%

Abstract:

Background: English National Quality Requirements mandate out-of-hours primary care services to routinely audit patient experience, but do not state how it should be done.

Objectives: We explored how providers collect patient feedback data and use it to inform service provision. We also explored staff views on the utility of the out-of-hours questions from the English General Practice Patient Survey (GPPS).

Methods: A qualitative study was conducted with 31 staff (comprising service managers, general practitioners and administrators) from 11 out-of-hours primary care providers in England, UK. Staff responsible for patient experience audits within their service were sampled, and data were collected via face-to-face semi-structured interviews.

Results: Although most providers regularly audited their patients’ experiences using patient surveys, many participants expressed a strong preference for additional qualitative feedback. Staff provided examples of small changes to service delivery resulting from patient feedback, but service-wide changes were not instigated. Perceptions that patients lacked sufficient understanding of the urgent-care system in which out-of-hours primary care services operate were common and were a barrier to using feedback to enable change. Participants recognised the value of using patient experience feedback to benchmark services, but perceived weaknesses in the out-of-hours items of the GPPS led them to question the validity of using these data for benchmarking in their current form.

Conclusions: The lack of clarity around how out-of-hours providers should audit patient experience hinders the utility of the National Quality Requirements. Although surveys were common, patient feedback data had only a limited role in service change. Data derived from the GPPS may be used to benchmark service providers, but refinement of the out-of-hours items is needed.

Relevance: 30.00%

Abstract:

The algorithm developed uses an octree pyramid in which noise is reduced at the expense of spatial resolution. At a certain level, an unsupervised clustering without spatial-connectivity constraints is applied. After the classification, isolated voxels and insignificant regions are removed by assigning them to their neighbours. The spatial resolution is then increased by down-projecting the regions, level by level. At each level, the uncertainty of the boundary voxels is minimised by dynamically selecting and classifying them using adaptive 3D filtering. The algorithm is tested on different data sets, including NMR data.
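One level of the octree pyramid described above can be sketched as a 2×2×2 block average: the coarser level halves the spatial resolution in each dimension while averaging down the noise. Plain nested lists keep the sketch self-contained:

```python
def reduce_level(vol):
    """Build the next-coarser octree-pyramid level of a cubic volume.

    vol: 3-D nested list of floats with even side length n; returns an
    (n/2)^3 volume where each voxel is the mean of a 2x2x2 block.
    """
    n = len(vol)
    half = n // 2
    out = [[[0.0] * half for _ in range(half)] for _ in range(half)]
    for z in range(0, n, 2):
        for y in range(0, n, 2):
            for x in range(0, n, 2):
                s = sum(vol[z + dz][y + dy][x + dx]
                        for dz in (0, 1) for dy in (0, 1) for dx in (0, 1))
                out[z // 2][y // 2][x // 2] = s / 8.0
    return out
```

Applying this repeatedly yields the pyramid; the down-projection step of the algorithm walks back down these levels, reclassifying only the uncertain boundary voxels.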

Relevance: 30.00%

Abstract:

In this thesis, we develop bootstrap methods for high-frequency financial data. The first two essays focus on bootstrap methods applied to the "pre-averaging" approach that are robust to the presence of microstructure noise. Pre-averaging reduces the influence of microstructure effects before the realized volatility is computed. Building on this approach to estimating integrated volatility in the presence of microstructure noise, we develop several bootstrap methods that preserve the dependence structure and the heterogeneity in the mean of the original data. The third essay develops a bootstrap method under the assumption of local Gaussianity of high-frequency financial data.

The first chapter is entitled "Bootstrap inference for pre-averaged realized volatility based on non-overlapping returns". In this chapter, we propose bootstrap methods that are robust to the presence of microstructure noise. In particular, we focus on realized volatility computed from the pre-averaged returns proposed by Podolskij and Vetter (2009), where the pre-averaged returns are constructed on non-overlapping blocks of consecutive high-frequency returns. The non-overlapping blocks make the pre-averaged returns asymptotically independent, but possibly heteroskedastic, which motivates the application of the wild bootstrap in this context. We prove the theoretical validity of the bootstrap for constructing percentile and percentile-t intervals. Monte Carlo simulations show that the bootstrap can improve the finite-sample properties of the integrated volatility estimator relative to the asymptotic theory, provided the external variable is chosen appropriately. We illustrate these methods using real financial data.

The second chapter is entitled "Bootstrapping pre-averaged realized volatility under market microstructure noise". In this chapter, we develop a block bootstrap method based on the pre-averaging approach of Jacod et al. (2009), where the pre-averaged returns are constructed on overlapping blocks of consecutive high-frequency returns. The overlap of the blocks induces strong dependence in the structure of the pre-averaged returns: they are m-dependent, with m growing more slowly than the sample size n. This motivates the application of a specific block bootstrap. We show that the block bootstrap suggested by Bühlmann and Künsch (1995) is valid only when the volatility is constant, owing to the heterogeneity in the mean of the squared pre-averaged returns when the volatility is stochastic. We therefore propose a new bootstrap procedure that combines the wild bootstrap and the block bootstrap, so that the serial dependence of the pre-averaged returns is preserved within blocks and the homogeneity condition required for bootstrap validity is satisfied. Under conditions on the block size, we show that this method is consistent. Monte Carlo simulations show that the bootstrap improves the finite-sample properties of the integrated volatility estimator relative to the asymptotic theory. We illustrate this method using real financial data.

The third chapter is entitled "Bootstrapping realized covolatility measures under local Gaussianity assumption". In this chapter, we show how, and to what extent, the distributions of estimators of covolatility measures can be approximated under the assumption of local Gaussianity of the returns. In particular, we propose a new bootstrap method under these assumptions, focusing on realized volatility and the realized beta. We show that the new bootstrap method replicates the cumulants up to second order when applied to the realized beta, while it provides a third-order improvement when applied to realized volatility. These results improve on the existing results in this literature, notably those of Gonçalves and Meddahi (2009) and of Dovonon, Gonçalves and Meddahi (2013). Monte Carlo simulations show that the bootstrap improves the finite-sample properties of the integrated volatility estimator relative to the asymptotic theory and to existing bootstrap results. We illustrate this method using real financial data.
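The wild bootstrap of the first chapter rests on multiplying each pre-averaged return by an independent external draw η with E[η²] = 1, which preserves the mean of the squared returns while resampling their variability. A minimal sketch using standard Gaussian weights and the plain sum of squared pre-averaged returns (the thesis's estimator involves bias corrections and an optimally chosen external variable not shown here):

```python
import random

def wild_bootstrap_rv(pre_avg_returns, n_boot=999, seed=0):
    """Wild-bootstrap distribution of a sum-of-squares volatility statistic.

    Each pre-averaged return is rescaled by an independent eta ~ N(0, 1),
    so E[eta^2] = 1 keeps the statistic centred while resampling it.
    """
    rng = random.Random(seed)
    stats = []
    for _ in range(n_boot):
        rv_star = sum((r * rng.gauss(0.0, 1.0)) ** 2 for r in pre_avg_returns)
        stats.append(rv_star)
    return stats

def percentile_interval(stats, alpha=0.05):
    """Equal-tailed percentile confidence interval from the bootstrap draws."""
    s = sorted(stats)
    lo = s[int(len(s) * alpha / 2)]
    hi = s[int(len(s) * (1.0 - alpha / 2)) - 1]
    return lo, hi
```

Note that Rademacher weights (η = ±1), though they also satisfy E[η²] = 1, would leave every squared return unchanged here; a weight with var(η²) > 0 is what makes the choice of external variable matter, as the first chapter emphasises.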