919 results for Pechini method and chromium


Relevance: 100.00%

Abstract:

Purpose – Computer tomography (CT) for 3D reconstruction entails a huge number of coplanar fan-beam projections for each of a large number of 2D slice images, and excessive radiation intensities and dosages. For some applications its rate of throughput is also inadequate. A technique for overcoming these limitations is outlined. Design/methodology/approach – A novel method to reconstruct 3D surface models of objects is presented, using typically ten 2D projective images. These images are generated by relative motion between this set of objects and a set of ten fan-beam X-ray sources and sensors, with their viewing axes suitably distributed in 2D angular space. Findings – The method entails a radiation dosage several orders of magnitude lower than CT, and requires far less computational power. Experimental results are given to illustrate the capability of the technique. Practical implications – The substantially lower cost of the method and, more particularly, its dramatically lower irradiation make it relevant to many applications precluded by current techniques. Originality/value – The method can be used in many applications such as aircraft hold-luggage screening and 3D industrial modelling and measurement, and it should also have important applications in medical diagnosis and surgery.
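
The abstract describes surface reconstruction from a handful of projections rather than full CT. Purely as a generic illustration of recovering an image from very few projections (not the authors' surface-model method), a minimal algebraic reconstruction (Kaczmarz/ART) sketch with a placeholder ray matrix might look like this:

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=20, relax=0.5):
    """Algebraic reconstruction (Kaczmarz/ART): cyclically project the current
    estimate onto the hyperplane of each ray equation A[i] @ x = b[i], where
    A is the ray/pixel intersection matrix for the available projections."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            ai = A[i]
            norm2 = ai @ ai
            if norm2 > 0.0:
                x += relax * (b[i] - ai @ x) / norm2 * ai
    return x

# Toy test: recover a 100-pixel "image" from 40 random ray sums (underdetermined).
rng = np.random.default_rng(0)
A = rng.random((40, 100))
x_true = rng.random(100)
x_rec = kaczmarz(A, A @ x_true)
print(round(float(np.linalg.norm(x_rec - x_true)), 3))
```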

Relevance: 100.00%

Abstract:

Context: During development, managers, analysts and designers often need to know whether enough requirements analysis work has been done and whether or not it is safe to proceed to the design stage. Objective: This paper describes a new, simple and practical method for assessing our confidence in a set of requirements. Method: We identified four confidence factors and used a goal-oriented framework with a simple ordinal scale to develop a method for assessing confidence. We illustrate the method and show how it has been applied to a real systems development project. Results: We show how assessing confidence in the requirements could have revealed problems in this project earlier and so saved both time and money. Conclusion: Our meta-level assessment of requirements provides a practical and pragmatic method that can prove useful to managers, analysts and designers who need to know when sufficient requirements analysis has been performed.
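
The four confidence factors and the ordinal scale are not listed in the abstract; a toy sketch of how ordinal ratings of hypothetical factors could be aggregated (taking the weakest rating as the overall confidence) is shown below. The factor names are illustrative assumptions, not the paper's factors.

```python
# Hypothetical factor names and ordinal scale; the paper's actual four
# confidence factors and goal-oriented framework are not reproduced here.
SCALE = {"low": 0, "medium": 1, "high": 2}

def requirements_confidence(ratings):
    """Overall confidence for one requirement, taken conservatively as the
    weakest of the individual ordinal factor ratings."""
    return min(ratings.values(), key=lambda level: SCALE[level])

example = {
    "stakeholder agreement": "high",
    "requirement stability": "medium",
    "testability": "high",
    "traceability": "medium",
}
print(requirements_confidence(example))  # prints the weakest rating, "medium"
```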

Relevance: 100.00%

Abstract:

Integrated infrared cross-sections and wavenumber positions for the vibrational modes of a range of hydrofluoroethers (HFEs) and hydrofluoropolyethers (HFPEs) have been calculated. Spectra were determined using a density functional method with an empirically derived correction for the wavenumbers of band positions. Radiative efficiencies (REs) were determined using the Pinnock et al. method and were used with atmospheric lifetimes from the literature to determine global warming potentials (GWPs). For the HFEs and the majority of the molecules in the HG series HFPEs, theoretically determined absorption cross-sections and REs lie within ca. 10% of those determined using measured spectra. For the larger molecules in the HG series and the HG′ series of HFPEs, agreement is less good, with theoretical values for the integrated cross-sections being up to 35% higher than the experimental values; REs are up to 45% higher. Our method gives better results than previous theoretical approaches, because of the level of theory chosen and, for REs, because an empirical wavenumber correction derived for perfluorocarbons is effective in predicting the positions of C–F stretching frequencies at around 1250 cm−1 for the molecules considered here.
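
For context, a GWP computed from a radiative efficiency and an atmospheric lifetime follows the standard absolute-GWP ratio, assuming single-exponential decay of a pulse emission. The sketch below uses an approximate IPCC AR5 value for the CO2 denominator and purely illustrative inputs; it is not the calculation or the data of the paper.

```python
import math

M_AIR = 28.97            # g mol-1, mean molar mass of dry air
M_ATM = 5.135e18         # kg, mass of the atmosphere
AGWP_CO2_100 = 9.17e-14  # W m-2 yr kg-1, approximate IPCC AR5 value for CO2

def gwp(re_per_ppb, molar_mass, lifetime, horizon=100.0):
    """GWP from a radiative efficiency (W m-2 ppb-1), a molar mass (g mol-1)
    and an atmospheric lifetime (yr), assuming single-exponential decay of a
    1 kg pulse emission."""
    re_per_kg = re_per_ppb * 1.0e9 * M_AIR / (molar_mass * M_ATM)   # W m-2 kg-1
    agwp = re_per_kg * lifetime * (1.0 - math.exp(-horizon / lifetime))
    return agwp / AGWP_CO2_100

# Purely illustrative inputs, not values for any specific HFE or HFPE:
print(round(gwp(re_per_ppb=0.30, molar_mass=150.0, lifetime=5.0)))
```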

Relevance: 100.00%

Abstract:

The Newton‐Raphson method is proposed for the solution of the nonlinear equation arising from a theoretical model of an acid/base titration. It is shown that it is necessary to modify the form of the equation in order that the iteration is guaranteed to converge. A particular example is considered to illustrate the analysis and method, and a BASIC program is included that can be used to predict the pH of any weak acid/weak base titration.
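
The BASIC program itself is not reproduced in the abstract. A minimal sketch of the same idea, applying Newton-Raphson to the charge-balance equation for a weak acid/weak base mixture with a simple positivity safeguard, is given below; the specific reformulation used in the paper to guarantee convergence may differ.

```python
import math

KW = 1.0e-14  # ionic product of water at 25 degrees C

def charge_balance(h, Ca, Ka, Cb, Kb):
    """f(h) = [H+] + [BH+] - [OH-] - [A-] for a weak acid HA (conc. Ca, const. Ka)
    mixed with a weak base B (Cb, Kb); f is strictly increasing in h = [H+]."""
    Kbh = KW / Kb  # acid dissociation constant of the conjugate acid BH+
    f = h + Cb * h / (h + Kbh) - KW / h - Ca * Ka / (h + Ka)
    dfdh = 1.0 + Cb * Kbh / (h + Kbh) ** 2 + KW / h ** 2 + Ca * Ka / (h + Ka) ** 2
    return f, dfdh

def titration_ph(Ca, Ka, Cb, Kb, h0=1.0e-7, tol=1.0e-10, max_iter=200):
    """Newton-Raphson iteration on [H+], halving the step if it would go negative."""
    h = h0
    for _ in range(max_iter):
        f, dfdh = charge_balance(h, Ca, Ka, Cb, Kb)
        h_new = h - f / dfdh
        if h_new <= 0.0:
            h_new = h / 2.0
        if abs(h_new - h) < tol * h:
            return -math.log10(h_new)
        h = h_new
    raise RuntimeError("Newton-Raphson iteration did not converge")

# Illustrative mixture: acetic acid with half an equivalent of ammonia,
# analytical concentrations assumed already corrected for dilution.
print(round(titration_ph(Ca=0.10, Ka=1.8e-5, Cb=0.05, Kb=1.8e-5), 2))
```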

Relevance: 100.00%

Abstract:

The molecular structure of trans-[PtCl(C≡CPh)(PEt2Ph)2] has been determined by X-ray diffraction methods. The crystals are monoclinic, space group P21, with a= 12.359(3), b= 13.015(3), c= 9.031(2)Å, β= 101.65(2)°, and Z= 2. The structure has been solved by the heavy-atom method and refined by full-matrix least squares to R 0.046 for 1 877 diffractometric intensity data. The crystals contain discrete molecules in which the platinum coordination is square planar. The phenylethynyl group is non-linear, with a Pt–C≡C angle of 163(2)°. Selected bond lengths are Pt–Cl 2.407(5) and Pt–C 1.98(2)Å. The structural trans influences of C≡CPh, CH=CH2, and CH2SiMe3 ligands in platinum(II) complexes are compared; there is only a small dependence on hybridization at the ligating carbon atom.
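
As a small worked check of the reported cell parameters, the monoclinic unit-cell volume follows from V = abc sin β (the volume itself is not quoted in the abstract):

```python
import math

# Monoclinic cell volume: V = a * b * c * sin(beta)
a, b, c = 12.359, 13.015, 9.031      # cell edges in angstroms
beta = math.radians(101.65)
volume = a * b * c * math.sin(beta)  # roughly 1.42e3 cubic angstroms
print(round(volume, 1))
```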

Relevance: 100.00%

Abstract:

The molecular structure of trans-[PtCl(CH=CH2)(PEt2Ph)2] has been determined by X-ray diffraction methods. The crystals are orthorhombic, space group Pbcn, with a= 10.686(2), b= 13.832(4), c= 16.129(4)Å, and Z= 4. The structure has been solved by the heavy-atom method and refined by full-matrix least squares to R 0.044 for 1 420 diffractometric intensity data. The crystals contain discrete molecules in which the platinum co-ordination is square planar. The Pt–Cl bond vector coincides with a crystallographic diad axis about which the atoms of the vinyl group are disordered. Selected bond lengths (Å) are Pt–Cl 2.398(4), Pt–P 2.295(3), and Pt–C 2.03(2). The Pt–C=C angle is 127(2)°. From a survey of the available structural data it is concluded that there is little, if any, back donation from platinum to carbon in platinum–alkenyl linkages.
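
For reference, the residual quoted as R 0.044 is conventionally defined as R = Σ||Fo| − |Fc|| / Σ|Fo|; a minimal sketch with placeholder structure-factor amplitudes (the actual refinement minimises a related weighted sum of squares) is:

```python
import numpy as np

def residual_index(f_obs, f_calc):
    """Conventional crystallographic residual R = sum(||Fo| - |Fc||) / sum(|Fo|)."""
    f_obs, f_calc = np.abs(np.asarray(f_obs)), np.abs(np.asarray(f_calc))
    return float(np.sum(np.abs(f_obs - f_calc)) / np.sum(f_obs))

# Placeholder amplitudes, just to exercise the function:
print(round(residual_index([10.0, 20.0, 5.0], [10.5, 19.0, 5.2]), 3))
```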

Relevance: 100.00%

Abstract:

In this paper, the statistical properties of tropical ice clouds (ice water content, visible extinction, effective radius, and total number concentration) derived from 3 yr of ground-based radar–lidar retrievals from the U.S. Department of Energy Atmospheric Radiation Measurement Climate Research Facility in Darwin, Australia, are compared with the same properties derived using the official CloudSat microphysical retrieval methods and from a simpler statistical method using radar reflectivity and air temperature. It is shown that the two official CloudSat microphysical products (2B-CWC-RO and 2B-CWC-RVOD) are statistically virtually identical. The comparison with the ground-based radar–lidar retrievals shows that all satellite methods produce ice water contents and extinctions in a much narrower range than the ground-based method and overestimate the mean vertical profiles of microphysical parameters below 10-km height by over a factor of 2. Better agreements are obtained above 10-km height. Ways to improve these estimates are suggested in this study. Effective radii retrievals from the standard CloudSat algorithms are characterized by a large positive bias of 8–12 μm. A sensitivity test shows that in response to such a bias the cloud longwave forcing is increased from 44.6 to 46.9 W m−2 (implying an error of about 5%), whereas the negative cloud shortwave forcing is increased from −81.6 to −82.8 W m−2. Further analysis reveals that these modest effects (although not insignificant) can be much larger for optically thick clouds. The statistical method using CloudSat reflectivities and air temperature was found to produce inaccurate mean vertical profiles and probability distribution functions of effective radius. This study also shows that the retrieval of the total number concentration needs to be improved in the official CloudSat microphysical methods prior to a quantitative use for the characterization of tropical ice clouds. Finally, the statistical relationship used to produce ice water content from extinction and air temperature obtained by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite is evaluated for tropical ice clouds. It is suggested that the CALIPSO ice water content retrieval is robust for tropical ice clouds, but that the temperature dependence of the statistical relationship used should be slightly refined to better reproduce the radar–lidar retrievals.
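
The "simpler statistical method using radar reflectivity and air temperature" mentioned above is of the kind sketched below: a regression of log10(IWC) on reflectivity and temperature. The functional form, the synthetic training data and the fitted coefficients here are illustrative assumptions, not those used in the study.

```python
import numpy as np

# Hypothetical training set standing in for collocated radar-lidar retrievals:
# reflectivity (dBZ), temperature (deg C) and "true" ice water content (g m-3).
rng = np.random.default_rng(0)
zdb = rng.uniform(-30.0, 15.0, 500)
temp = rng.uniform(-80.0, -20.0, 500)
iwc_true = 10.0 ** (0.06 * zdb - 0.012 * temp - 1.7) * rng.lognormal(0.0, 0.2, 500)

# Fit log10(IWC) = a*Z*T + b*Z + c*T + d by ordinary least squares.
X = np.column_stack([zdb * temp, zdb, temp, np.ones_like(zdb)])
coeffs, *_ = np.linalg.lstsq(X, np.log10(iwc_true), rcond=None)

def iwc_from_z_t(zdb, temp):
    """Statistical IWC estimate (g m-3) from reflectivity and temperature."""
    a, b, c, d = coeffs
    return 10.0 ** (a * zdb * temp + b * zdb + c * temp + d)

print(round(float(iwc_from_z_t(0.0, -40.0)), 4))
```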

Relevance: 100.00%

Abstract:

ERA-Interim is the latest global atmospheric reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The ERA-Interim project was conducted in part to prepare for a new atmospheric reanalysis to replace ERA-40, which will extend back to the early part of the twentieth century. This article describes the forecast model, data assimilation method, and input datasets used to produce ERA-Interim, and discusses the performance of the system. Special emphasis is placed on various difficulties encountered in the production of ERA-40, including the representation of the hydrological cycle, the quality of the stratospheric circulation, and the consistency in time of the reanalysed fields. We provide evidence for substantial improvements in each of these aspects. We also identify areas where further work is needed and describe opportunities and objectives for future reanalysis projects at ECMWF.

Relevance: 100.00%

Abstract:

Modelling spatial covariance is an essential part of all geostatistical methods. Traditionally, parametric semivariogram models are fit from available data. More recently, it has been suggested to use nonparametric correlograms obtained from spatially complete data fields. Here, both estimation techniques are compared. Nonparametric correlograms are shown to have a substantial negative bias. Nonetheless, when combined with the sample variance of the spatial field under consideration, they yield an estimate of the semivariogram that is unbiased for small lag distances. This justifies the use of this estimation technique in geostatistical applications. Various formulations of geostatistical combination (Kriging) methods are used here for the construction of hourly precipitation grids for Switzerland based on data from a sparse real-time network of raingauges and from a spatially complete radar composite. Two variants of Ordinary Kriging (OK) are used to interpolate the sparse gauge observations. In both OK variants, the radar data are only used to determine the semivariogram model. One variant relies on a traditional parametric semivariogram estimate, whereas the other variant uses the nonparametric correlogram. The variants are tested for three cases and the impact of the semivariogram model on the Kriging prediction is illustrated. For the three test cases, the method using nonparametric correlograms performs equally well or better than the traditional method, and at the same time offers great practical advantages. Furthermore, two variants of Kriging with external drift (KED) are tested, both of which use the radar data to estimate nonparametric correlograms, and as the external drift variable. The first KED variant has been used previously for geostatistical radar-raingauge merging in Catalonia (Spain). The second variant is newly proposed here and is an extension of the first. Both variants are evaluated for the three test cases as well as an extended evaluation period. It is found that both methods yield merged fields of better quality than the original radar field or fields obtained by OK of gauge data. The newly suggested KED formulation is shown to be beneficial, in particular in mountainous regions where the quality of the Swiss radar composite is comparatively low. An analysis of the Kriging variances shows that none of the methods tested here provides a satisfactory uncertainty estimate. A suitable variable transformation is expected to improve this.
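
A minimal sketch of the combination described above, γ(h) = s²(1 − ρ(h)) with ρ(h) the empirical correlogram of a spatially complete field and s² its sample variance, is shown below. Lags are taken along one axis of the field for simplicity, whereas the study works with correlograms of full radar composites; the example field is synthetic.

```python
import numpy as np

def nonparametric_semivariogram(field, max_lag):
    """gamma(h) = s^2 * (1 - rho(h)) from a spatially complete 2-D field,
    with rho(h) estimated along the second array axis."""
    s2 = float(np.var(field))
    gamma = np.empty(max_lag + 1)
    for h in range(max_lag + 1):
        left = field[:, : field.shape[1] - h].ravel()
        right = field[:, h:].ravel()
        rho = 1.0 if h == 0 else float(np.corrcoef(left, right)[0, 1])
        gamma[h] = s2 * (1.0 - rho)
    return gamma

# Example on a synthetic field standing in for an hourly radar composite:
rng = np.random.default_rng(1)
print(np.round(nonparametric_semivariogram(rng.random((60, 90)), max_lag=4), 4))
```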

Relevance: 100.00%

Abstract:

This paper describes a new method for the assessment of palaeohydrology through the Holocene. A palaeoclimate model was linked with a hydrological model, using a weather generator to correct bias in the rainfall estimates, to simulate the changes in the flood frequency and the groundwater response through the late Pleistocene and Holocene for the Wadi Faynan in southern Jordan, a site considered internationally important due to its rich archaeological heritage spanning the Pleistocene and Holocene. This is the first study to describe the hydrological functioning of the Wadi Faynan, a meso-scale (241 km²) semi-arid catchment, setting this description within the framework of contemporary archaeological investigations. Historic meteorological records were collated and supplemented with new hydrological and water quality data. The modelled outcomes indicate that environmental changes, such as deforestation, had a major impact on the local water cycle and this amplified the effect of the prevailing climate on the flow regime. The results also show that increased rainfall alone does not necessarily imply better conditions for farming and highlight the importance of groundwater. The discussion focuses on the utility of the method and the importance of the local hydrology to the sustained settlement of the Wadi Faynan through pre-history and history.
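
The abstract does not give details of the weather generator used to correct bias in the modelled rainfall. As an illustration of the general idea of bias-correcting climate-model rainfall before it drives a hydrological model, a much simpler monthly scaling approach (function names and synthetic data are hypothetical) might look like this:

```python
import numpy as np

def monthly_scaling_factors(obs, model, months):
    """Multiplicative correction factors: ratio of observed to modelled mean
    rainfall for each calendar month (a simple stand-in for the weather
    generator used to correct the palaeoclimate model output)."""
    return {m: obs[months == m].mean() / max(model[months == m].mean(), 1e-9)
            for m in range(1, 13)}

def correct_rainfall(model, months, factors):
    return np.array([r * factors[m] for r, m in zip(model, months)])

# Toy example with synthetic monthly series (mm):
rng = np.random.default_rng(2)
months = np.tile(np.arange(1, 13), 30)           # 30 years of monthly data
obs = rng.gamma(2.0, 20.0, months.size)          # "observed" rainfall
model = 0.7 * rng.gamma(2.0, 20.0, months.size)  # biased "modelled" rainfall
factors = monthly_scaling_factors(obs, model, months)
print(round(float(correct_rainfall(model, months, factors).mean()), 1))
```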

Relevance: 100.00%

Abstract:

We present a new method to determine mesospheric electron densities from partially reflected medium frequency radar pulses. The technique uses an optimal estimation inverse method and retrieves both an electron density profile and a gradient electron density profile. As well as accounting for the absorption of the two magnetoionic modes formed by ionospheric birefringence of each radar pulse, the forward model of the retrieval parameterises possible Fresnel scatter of each mode by fine electronic structure, phase changes of each mode due to Faraday rotation and the dependence of the amplitudes of the backscattered modes upon pulse width. Validation results indicate that known profiles can be retrieved and that χ² tests upon retrieval parameters satisfy validity criteria. Application to measurements shows that retrieved electron density profiles are consistent with accepted ideas about seasonal variability of electron densities and their dependence upon nitric oxide production and transport.
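
The retrieval is an optimal-estimation (Rodgers-type) inverse method. A generic Gauss-Newton update step of that kind is sketched below; the forward-model evaluation Fx and Jacobian K are abstract placeholders here, whereas the paper's forward model additionally parameterises magnetoionic absorption, Fresnel scatter, Faraday rotation and pulse-width effects.

```python
import numpy as np

def oem_step(x, xa, Sa, Se, y, Fx, K):
    """One Gauss-Newton step of an optimal-estimation retrieval:
    x_new = xa + (Sa^-1 + K^T Se^-1 K)^-1 K^T Se^-1 [y - F(x) + K (x - xa)].
    Returns the updated state and the posterior covariance."""
    Sa_inv, Se_inv = np.linalg.inv(Sa), np.linalg.inv(Se)
    S_hat = np.linalg.inv(Sa_inv + K.T @ Se_inv @ K)
    x_new = xa + S_hat @ K.T @ Se_inv @ (y - Fx + K @ (x - xa))
    return x_new, S_hat

# Toy linear problem, F(x) = K @ x, two-element state, three measurements:
K = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, 0.7]])
xa, Sa, Se = np.zeros(2), np.eye(2), 0.01 * np.eye(3)
y = K @ np.array([1.0, -0.5])
x1, S_hat = oem_step(xa, xa, Sa, Se, y, K @ xa, K)
print(np.round(x1, 3))
```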

Relevance: 100.00%

Abstract:

A traditional plate count method and real-time PCR systems based on SYBR Green I and TaqMan technologies, using a specific primer pair and probe for amplification of the iap gene, were used for quantitative assay of Listeria monocytogenes in seven decimal serial dilution series of nutrient broth and milk samples containing 1.58 to 1.58×10⁷ cfu/ml, and the real-time PCR methods were compared with the plate count method with respect to accuracy and sensitivity. In this study, the plate count method was performed using surface-plating of 0.1 ml of each sample on Palcam Agar. The lowest detectable level for this method was 1.58×10 cfu/ml for both nutrient broth and milk samples. Using purified DNA as a template for generation of standard curves, as few as four copies of the iap gene could be detected per reaction with both real-time PCR assays, indicating that they were highly sensitive. When these real-time PCR assays were applied to quantification of L. monocytogenes in decimal serial dilution series of nutrient broth and milk samples, 3.16×10 to 3.16×10⁵ copies per reaction (equivalent to 1.58×10³ to 1.58×10⁷ cfu/ml of L. monocytogenes) were detectable. Expressed as logarithmic cycles, the quantitative results of the detectable steps for the plate count method and both molecular assays were similar to the inoculation levels.
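
The quantification rests on a standard curve relating threshold cycle (Ct) to log10 copy number. A minimal sketch with illustrative Ct values (not measurements from this study) is:

```python
import numpy as np

# Illustrative Ct values for a decimal dilution series of purified DNA:
copies = np.array([4e0, 4e1, 4e2, 4e3, 4e4, 4e5, 4e6])
ct = np.array([36.1, 32.8, 29.4, 26.1, 22.7, 19.3, 16.0])

slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 corresponds to 100 % efficiency

def copies_from_ct(ct_unknown):
    """Read an unknown sample off the standard curve Ct = slope*log10(N) + intercept."""
    return 10.0 ** ((ct_unknown - intercept) / slope)

print(round(float(efficiency), 2), round(float(copies_from_ct(24.0))))
```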

Relevance: 100.00%

Abstract:

The task of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. The method is independent of the boundary condition and is applicable to limited-aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data are collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong, and Lubich. In contrast to pure frequency-domain methods, here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper describes the Time-Domain Probe Method and relates it to previous frequency-domain approaches to sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis for the method in the rough surface scattering case and provide numerical simulations and examples.
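
The Time-Domain Probe Method itself reconstructs extended shapes and is not reproduced here. Purely as an illustration of the causality principle it builds on (the first arrival of the scattered pulse bounds the distance to the scatterer), a least-squares localisation of a single hypothetical point scatterer from monostatic first-arrival times could look like this:

```python
import numpy as np

C = 299_792_458.0  # propagation speed (m/s), free space assumed

def locate_point_scatterer(positions, first_arrivals):
    """Least-squares localisation of a point scatterer from monostatic first
    arrival times (range r_i = c*t_i/2), by linearising the range equations
    ||x - p_i|| = r_i against the first sensor position."""
    p = np.asarray(positions, dtype=float)
    r = C * np.asarray(first_arrivals, dtype=float) / 2.0
    A = 2.0 * (p[1:] - p[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Toy check in 2-D with an assumed scatterer at (3, 4) m:
target = np.array([3.0, 4.0])
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
times = 2.0 * np.linalg.norm(sensors - target, axis=1) / C
print(np.round(locate_point_scatterer(sensors, times), 3))
```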

Relevance: 100.00%

Abstract:

Background and aims: In addition to the well-known linguistic processing impairments in aphasia, oro-motor skills and articulatory implementation of speech segments are reported to be compromised to some degree in most types of aphasia. This study aimed to identify differences in the characteristics and coordination of lip movements in the production of a bilabial closure gesture between speech-like and nonspeech tasks in individuals with aphasia and healthy control subjects. Method and procedure: Upper and lower lip movement data were collected for a speech-like and a nonspeech task using an AG 100 EMMA system from five individuals with aphasia and five age- and gender-matched control subjects. Each task was produced at two rate conditions (normal and fast), and in a familiar and a less-familiar manner. Single-articulator kinematic parameters (peak velocity, amplitude, duration, and cyclic spatio-temporal index) and multi-articulator coordination indices (average relative phase and variability of relative phase) were measured to characterize lip movements. Outcome and results: The results showed that when the two lips had similar task goals (bilabial closure) in the speech-like versus the nonspeech task, kinematic and coordination characteristics were not found to be different. However, when changes in rate were imposed on the bilabial gesture, only the speech-like task showed functional adaptations, indicated by a greater decrease in amplitude and duration at fast rates. In terms of group differences, individuals with aphasia showed smaller amplitudes and longer movement durations for the upper lip, higher spatio-temporal variability for both lips, and higher variability in lip coordination than the control speakers. Rate was an important factor in distinguishing the two groups, and individuals with aphasia were limited in implementing the rate changes. Conclusion and implications: The findings support the notion of subtle but robust differences in motor control characteristics between individuals with aphasia and the control participants, even in the context of producing bilabial closing gestures for a relatively simple speech-like task. The findings also highlight the functional differences between speech-like and nonspeech tasks, despite a common movement coordination goal for bilabial closure.
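
The cyclic spatio-temporal index is not defined in the abstract. A sketch of the commonly used (non-cyclic) spatio-temporal index, which sums the across-repetition standard deviations of amplitude- and time-normalised trajectories, is given below; the cyclic variant used in the study presumably applies the same idea to extracted movement cycles and may differ in detail.

```python
import numpy as np

def spatiotemporal_index(trials, n_points=50):
    """Spatio-temporal index: each repetition is amplitude-normalised (z-scored)
    and linearly time-normalised to n_points, then the standard deviations
    across repetitions at each normalised time point are summed."""
    resampled = []
    for y in trials:
        y = np.asarray(y, dtype=float)
        z = (y - y.mean()) / y.std()                 # amplitude normalisation
        t_old = np.linspace(0.0, 1.0, y.size)
        t_new = np.linspace(0.0, 1.0, n_points)
        resampled.append(np.interp(t_new, t_old, z))  # time normalisation
    return float(np.sum(np.std(np.vstack(resampled), axis=0)))

# Toy example: ten noisy repetitions of a lip-closing trajectory
rng = np.random.default_rng(3)
trials = []
for _ in range(10):
    n = int(rng.integers(80, 120))
    trials.append(np.sin(np.linspace(0.0, np.pi, n)) + rng.normal(0.0, 0.05, n))
print(round(spatiotemporal_index(trials), 2))
```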

Relevance: 100.00%

Abstract:

This study puts forward a method to model and simulate the complex system of a hospital on the basis of multi-agent technology. Hospital agents with intelligent and coordinative characteristics were designed, the message object was defined, and the model's operating mechanism for autonomous activities and its coordination mechanism were also designed. In addition, an Ontology library and a Norm library were introduced using semiotic methods and theory to extend the approach to system modelling. Swarm was used to develop the multi-agent based simulation system, which is helpful for guiding hospitals in improving their organization and management, optimizing working procedures, improving the quality of medical care, and reducing medical costs.
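
Swarm code is not shown in the abstract and is not reproduced here. A language-neutral toy version of the agent and message-passing idea, with hypothetical Department agents exchanging patient messages once per simulation tick, might look like this:

```python
from collections import deque

class Department:
    """A hospital department modelled as an agent that processes one queued
    message (patient) per simulation tick."""
    def __init__(self, name):
        self.name = name
        self.queue = deque()

    def receive(self, patient):          # message-passing interface
        self.queue.append(patient)

    def step(self):                      # autonomous activity per tick
        return self.queue.popleft() if self.queue else None

def simulate(n_patients=20, ticks=60):
    registration, clinic = Department("registration"), Department("clinic")
    discharged = []
    for t in range(ticks):
        if t < n_patients:                       # one new arrival per tick
            registration.receive(f"patient-{t}")
        seen = registration.step()
        if seen is not None:
            clinic.receive(seen)                 # coordination between agents
        done = clinic.step()
        if done is not None:
            discharged.append((t, done))
    return discharged

print(len(simulate()))
```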