994 results for Image Simulation


Relevance: 20.00%

Publisher:

Abstract:

The observation of mirror-image clefts in conjoined twins may suggest an influence of environmental factors (e.g., poor blood supply) on the appearance of clefts. The present paper reports on a pair of male thoracopagus twins born to a 20-year-old woman. The twins were stillborn. Both twins exhibited complete unilateral cleft lip and palate in a mirror-image configuration, affecting the left side in twin A and the right side in twin B. The twins also shared some organs. The case is discussed in light of similar reports in the literature, with reference to possible related etiologic factors. Reporting such occurrences throughout the world is important to shed light on the aspects underlying the formation of clefts.

Relevance: 20.00%

Publisher:

Abstract:

Introduction: Recently developed portable dental X-ray units increase the mobility of forensic odontologists and allow more efficient X-ray work in a disaster field, especially when used in combination with digital sensors. This type of machine might also have potential applications in remote areas, military and humanitarian missions, dental care of patients with limited mobility, and imaging in operating rooms. Objective: To evaluate the radiographic image quality obtained with three portable X-ray devices combined with four image receptors, and to evaluate their medical physics parameters. Materials and methods: Images of five samples, consisting of four teeth and one formalin-fixed mandible, were acquired with one conventional wall-mounted X-ray unit, MinRay® 60/70 kVp, used as the clinical standard, and three portable dental X-ray devices: AnyRay® 60 kVp, Nomad® 60 kVp and Rextar® 70 kVp, in combination with a phosphor image plate (PSP), a CCD or a CMOS sensor. Three observers evaluated the images for standard image quality as well as forensic diagnostic quality on a 4-point rating scale. Furthermore, all machines underwent tests for occupational as well as patient dosimetry. Results: Statistical analysis showed good image quality for all systems, with the combination of Nomad® and PSP yielding the best score. A significant difference in image quality among the combinations of the four X-ray devices and four sensors was established (p < 0.05). For patient safety, the exposure rate was determined; exit dose rates for MinRay® at 60 kVp, MinRay® at 70 kVp, AnyRay®, Nomad® and Rextar® were 3.4 mGy/s, 4.5 mGy/s, 13.5 mGy/s, 3.8 mGy/s and 2.6 mGy/s, respectively. The kVp of the AnyRay® system was the most stable, with a ripple of 3.7%. Short-term variations in the tube output of all devices were less than 10%. AnyRay® presented a higher estimated effective dose than the other machines. Occupational dosimetry showed that the dose at the operator's hand was lowest with protective shielding (Nomad®: 0.1 µGy) and was also low when a remote control was used (distance > 1 m: Rextar® < 0.2 µGy, MinRay® < 0.1 µGy). Conclusions: The present study demonstrated the feasibility of the three portable X-ray systems for specific indications, based on acceptable image quality and sufficient accuracy of the machines, provided that the standard guidelines for radiation hygiene are followed. © 2010 Elsevier Ireland Ltd. All rights reserved.
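
As a rough illustration of the dosimetry arithmetic behind these figures, the Python sketch below converts a quoted exit dose rate into a per-exposure dose and applies inverse-square scaling to an occupational dose at distance. The 0.2 s exposure time, the 0.2 µGy reference operator dose and the distances are placeholder assumptions, not values reported in the study; only the exit dose rates are reused from the abstract.

    def exposure_dose_mGy(dose_rate_mGy_s, exposure_time_s):
        """Dose delivered during a single exposure = dose rate x exposure time."""
        return dose_rate_mGy_s * exposure_time_s

    def operator_dose_at_distance(dose_uGy_at_ref, ref_distance_m, new_distance_m):
        """Inverse-square scaling of scattered/leakage dose with distance."""
        return dose_uGy_at_ref * (ref_distance_m / new_distance_m) ** 2

    # Exit dose rates (mGy/s) quoted in the abstract; 0.2 s exposure time assumed.
    for device, rate in [("MinRay 60 kVp", 3.4), ("AnyRay", 13.5), ("Rextar", 2.6)]:
        print(f"{device}: {exposure_dose_mGy(rate, 0.2):.2f} mGy per exposure")

    # Hypothetical operator dose of 0.2 uGy at 1 m, re-estimated at 2 m.
    print(f"~{operator_dose_at_distance(0.2, 1.0, 2.0):.2f} uGy at 2 m")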

Relevance: 20.00%

Publisher:

Abstract:

A significant loss in electron probe current can occur before the electron beam enters the specimen chamber of an environmental scanning electron microscope (ESEM). This loss results from electron scattering in a gaseous jet formed inside and downstream of (above) the pressure-limiting aperture (PLA), which separates the high-pressure and high-vacuum regions of the microscope. The electron beam loss above the PLA has been calculated for three different ESEMs, each with a different PLA geometry: an ElectroScan E3, a Philips XL30 ESEM, and a prototype instrument. The mass thickness of gas above the PLA in each case has been determined by Monte Carlo simulation of the gas density variation in the gas jet. It has been found that the PLA configurations used in the commercial instruments produce considerable loss in the electron probe current, which dramatically degrades their performance at high chamber pressure and low accelerating voltage. These detrimental effects are minimized in the prototype instrument, which has an optimized thin-foil PLA design.
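
To make the link between mass thickness and probe-current loss concrete, the Python sketch below estimates the fraction of unscattered beam electrons from the integrated gas column using single-scattering (Poisson) statistics. The gas species, the cross-section and the two mass-thickness values are illustrative assumptions for a water-vapour jet, not numbers taken from the paper or from its Monte Carlo results.

    import math

    AVOGADRO = 6.022e23  # molecules per mol

    def unscattered_fraction(mass_thickness_g_cm2, molar_mass_g_mol, sigma_cm2):
        """Fraction of probe electrons crossing the gas column without a
        scattering event, assuming Poisson (single-scattering) statistics.

        mass_thickness_g_cm2 : integrated gas density along the beam path
        molar_mass_g_mol     : molar mass of the gas
        sigma_cm2            : total electron scattering cross-section, which
                               depends on the gas species and beam energy
        """
        molecules_per_cm2 = mass_thickness_g_cm2 * AVOGADRO / molar_mass_g_mol
        return math.exp(-sigma_cm2 * molecules_per_cm2)

    # Illustrative numbers only (not from the study): water vapour and two
    # hypothetical mass thicknesses above the PLA.
    for label, mass_thickness in [("thin-foil PLA", 5e-7), ("conical PLA", 5e-6)]:
        frac = unscattered_fraction(mass_thickness, molar_mass_g_mol=18.0,
                                    sigma_cm2=1e-17)
        print(f"{label}: {frac:.1%} of the probe current retained")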

Relevance: 20.00%

Publisher:

Abstract:

This study evaluated the stress levels in the core layer and the veneer layer of zirconia crowns (an alternative core design versus a standard core design) under mechanical and thermal simulation, and subjected the two designs to laboratory mouth-motion fatigue. The dimensions of a mandibular first molar were imported into computer-aided design (CAD) software and a tooth preparation was modeled. A crown was designed using the space between the original tooth and the prepared tooth. The alternative core presented an additional lingual shoulder that reduced the veneer bulk at the cusps. Finite element analyses evaluated the residual maximum principal stress fields in the core and veneer of both designs under loading and when cooled from 900 °C to 25 °C. Crowns were fabricated and mouth-motion fatigued, generating master Weibull curves and reliability data. Thermal modeling showed low residual stress fields throughout the bulk of the cusps for both groups. Mechanical simulation showed a shift of stress toward the core in the alternative design compared with the standard design. Significantly higher reliability was found for the alternative core. For the alternative configuration, the thermal and mechanical computer simulations thus showed stresses that were, respectively, comparable to and higher than those of the standard configuration. Such a mechanical scenario probably led to the higher reliability of the alternative design under fatigue.
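
For readers unfamiliar with how reliability is extracted from fatigue data of this kind, the sketch below fits a two-parameter Weibull distribution to failure loads for each design and compares the reliability at a chosen mission load, R = exp(-(load/eta)^beta). The failure loads and the 500 N mission load are invented placeholders; the study's own master Weibull curves and reliability values are not reproduced here.

    import numpy as np
    from scipy.stats import weibull_min

    # Hypothetical failure loads (N) from mouth-motion fatigue; placeholders,
    # not the loads measured in the study.
    standard_core = np.array([620, 700, 745, 780, 810, 860, 900, 950])
    alternative_core = np.array([780, 840, 905, 950, 990, 1040, 1100, 1160])

    def weibull_reliability(failure_loads, mission_load):
        """Fit a two-parameter Weibull distribution (location fixed at zero)
        and return the modulus, the characteristic load and the reliability
        at the given mission load, R = exp(-(load/eta)**beta)."""
        beta, _, eta = weibull_min.fit(failure_loads, floc=0)
        reliability = np.exp(-(mission_load / eta) ** beta)
        return beta, eta, reliability

    for name, loads in [("standard", standard_core),
                        ("alternative", alternative_core)]:
        beta, eta, r = weibull_reliability(loads, mission_load=500.0)
        print(f"{name} core: beta={beta:.1f}, eta={eta:.0f} N, R(500 N)={r:.3f}")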

Relevance: 20.00%

Publisher:

Abstract:

Objectives: To evaluate the influence of JPEG quality factors 100, 80 and 60 on the reproducibility of the identification of cephalometric points on lateral cephalogram images, compared with the Digital Imaging and Communications in Medicine (DICOM) format. Methods: The sample comprised 30 digital lateral cephalograms obtained from 30 individuals (15 males and 15 females) on a phosphor plate system in DICOM format. The images were converted to JPEG with quality factors 100, 80 and 60 using software, yielding a further 90 images. The 120 images (DICOM and JPEG 100, 80 and 60) were blinded, and 12 cephalometric points were identified on each image by three calibrated orthodontists using the x-y coordinate system in cephalometric software. Results: The identification of cephalometric points was highly reproducible, except for the point Orbitale (Or) on the x-axis. The different file formats did not present a statistically significant difference. Conclusions: JPEG images of lateral cephalograms with quality factors 100, 80 and 60 did not alter the reproducibility of the identification of cephalometric points compared with the DICOM format. Good reproducibility was achieved for the 12 points, except for point Or on the x-axis. Dentomaxillofacial Radiology (2009) 38, 393-400. doi: 10.1259/dmfr/40996636
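
A minimal sketch of the conversion step, assuming pydicom and Pillow: one DICOM cephalogram is rescaled to 8 bits and saved as JPEG at the three quality factors studied. The file names and the simple min-max windowing are assumptions for illustration; the study used its own conversion software, which may have applied different windowing.

    import numpy as np
    import pydicom
    from PIL import Image

    QUALITY_FACTORS = (100, 80, 60)

    def dicom_to_jpegs(dicom_path, output_stem):
        """Convert one DICOM lateral cephalogram to 8-bit JPEG files at
        several quality factors, using a plain min-max rescale to 0-255."""
        ds = pydicom.dcmread(dicom_path)
        pixels = ds.pixel_array.astype(np.float64)
        pixels -= pixels.min()
        pixels *= 255.0 / max(pixels.max(), 1.0)
        image = Image.fromarray(pixels.astype(np.uint8))
        for q in QUALITY_FACTORS:
            image.save(f"{output_stem}_q{q}.jpg", format="JPEG", quality=q)

    # Placeholder file name, not one of the study's 30 cephalograms.
    dicom_to_jpegs("cephalogram_001.dcm", "cephalogram_001")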

Relevance: 20.00%

Publisher:

Abstract:

The collection of spatial information to quantify changes in the state and condition of the environment is a fundamental component of the conservation or sustainable utilization of tropical and subtropical forests. Age is an important structural attribute of old-growth forests influencing biological diversity in Australian eucalypt forests. Aerial photograph interpretation has traditionally been used for mapping the age and structure of forest stands. However, this method is subjective and cannot accurately capture the fine- to landscape-scale variation necessary for ecological studies. Identification and mapping of fine- to landscape-scale vegetative structural attributes will allow the compilation of information associated with Montreal Process indicators 1b and 1d, which seek to determine linkages between age structure and the diversity and abundance of forest fauna populations. This project integrated measurements of structural attributes derived from a canopy-height elevation model with results from a geometric-optical/spectral mixture analysis model to map forest age structure at a landscape scale. The availability of multiple-scale data allows the transfer of high-resolution attributes to landscape-scale monitoring. Multispectral image data were obtained from a DMSV (Digital Multi-Spectral Video) sensor over St Mary's State Forest in Southeast Queensland, Australia. Local scene variance levels for different forest types calculated from the DMSV data were used to optimize the tree density and canopy size output of a geometric-optical model applied to a Landsat Thematic Mapper (TM) data set. Airborne laser scanner data obtained over the project area were used to calibrate a digital filter to extract tree heights from a digital elevation model that was derived from scanned colour stereopairs. The modelled estimates of tree height, crown size and tree density were used to produce a decision-tree classification of forest successional stage at a landscape scale. The results obtained (72% accuracy) were limited in terms of validation, but demonstrate the potential of the multi-scale methodology to provide spatial information for forestry policy objectives (i.e., monitoring forest age structure).
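
A hedged sketch of the final classification step, assuming scikit-learn: a decision tree assigns a successional-stage class from per-stand estimates of tree height, crown size and stem density. The synthetic training data, class definitions and tree depth are placeholders, not the St Mary's State Forest data or the rule set behind the reported 72% accuracy.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)

    # Placeholder stands: modelled tree height (m), mean crown diameter (m) and
    # stem density (stems/ha); labels 0 = regrowth, 1 = mature, 2 = old-growth.
    n = 300
    height = rng.uniform(5, 45, n)
    crown = 0.2 * height + rng.normal(0, 1.5, n)
    density = 1500 - 25 * height + rng.normal(0, 100, n)
    X = np.column_stack([height, crown, density])
    y = np.digitize(height, bins=[15, 30])  # crude stand-age surrogate

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
    print(f"hold-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")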

Relevance: 20.00%

Publisher:

Abstract:

The step size determines the accuracy of a discrete element simulation. Because the position and velocity updates use a pre-calculated table, step-size control cannot rely on the error estimates of standard integration formulas. A step-size control scheme for use with the table-driven velocity and position calculation instead uses the difference between the result of one big step and that of two small steps. This variable-time-step method automatically chooses a suitable time step for each particle at each step according to the local conditions. Simulations using a fixed time step are compared with simulations using a variable time step. The difference in computation time for the same accuracy with a variable step size (compared with a fixed step) depends on the particular problem; for a simple test case the times are roughly similar. However, the variable step size delivers the required accuracy on the first run, whereas a fixed step size may require several runs to check the simulation accuracy, or a conservative step size that results in longer run times. © 2001 Elsevier Science Ltd. All rights reserved.
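
A minimal Python sketch of the step-doubling control described above: the state is advanced with one big step and with two half steps, the step is shrunk until the two answers agree within a tolerance, and a slightly larger step is suggested for the next update. The semi-implicit Euler update stands in for the pre-calculated table, and the force law, tolerance and growth/shrink factors are illustrative assumptions.

    def advance(x, v, dt, accel):
        """One explicit update of velocity and position (a plain semi-implicit
        Euler step stands in here for the table-driven update)."""
        v_new = v + accel(x, v) * dt
        return x + v_new * dt, v_new

    def adaptive_step(x, v, dt, accel, tol=1e-4, shrink=0.5, grow=1.2):
        """Step-doubling control: compare one big step with two half steps,
        shrink dt until they agree within tol, then suggest a slightly larger
        dt for the next step.  Tolerance and factors are illustrative."""
        while True:
            x_big, _ = advance(x, v, dt, accel)
            x_half, v_half = advance(x, v, 0.5 * dt, accel)
            x_two, v_two = advance(x_half, v_half, 0.5 * dt, accel)
            if abs(x_big - x_two) <= tol:
                return x_two, v_two, dt, grow * dt   # state, step used, next step
            dt *= shrink

    # Example: a single particle on a damped linear spring.
    accel = lambda x, v: -100.0 * x - 0.5 * v
    x, v, dt, t = 0.01, 0.0, 1e-3, 0.0
    while t < 0.1:
        x, v, used, dt = adaptive_step(x, v, dt, accel)
        t += used
    print(f"final position {x:.5f} with suggested next step {dt:.1e}")

In a discrete element code this control would be applied per particle, so quiescent particles take large steps while particles undergoing rapid contacts are resolved with small ones.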

Relevance: 20.00%

Publisher:

Abstract:

Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code the models themselves using the simulation packages available to them. Quality assurance of such models is difficult. Although benchmarking problems have been developed and are available, comparison of simulation data with that of commercial models leads only to the detection, not the isolation, of errors, and identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: first, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Second, an observer is designed to generate residuals such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates the coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM 1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to the simultaneous verification of any two independent implementations, and is hence useful in commercial model development.
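
A hedged sketch of the isolation step only, assuming NumPy: given a residual vector produced by the observer and a feature matrix for each error class, the residual is projected onto each class's subspace and the error is attributed to the class with the smallest relative misfit. The feature matrices, residual and class names are random placeholders; the observer design and the ASM1-specific error classes are not reproduced here.

    import numpy as np

    def isolate_error_class(residual, feature_matrices):
        """Return the error class whose feature subspace explains the largest
        share of the residual (smallest relative least-squares misfit)."""
        scores = {}
        for name, F in feature_matrices.items():
            coeffs, *_ = np.linalg.lstsq(F, residual, rcond=None)
            misfit = np.linalg.norm(residual - F @ coeffs) / np.linalg.norm(residual)
            scores[name] = misfit
        return min(scores, key=scores.get), scores

    # Placeholder feature matrices for three hypothetical error classes
    # (e.g. a wrong stoichiometric coefficient, a wrong rate expression, a sign error).
    rng = np.random.default_rng(1)
    classes = {f"class_{k}": rng.normal(size=(12, 3)) for k in range(3)}
    true_error = classes["class_1"] @ np.array([0.8, -0.3, 0.5])
    residual = true_error + rng.normal(scale=0.01, size=12)  # observer output + noise

    best, scores = isolate_error_class(residual, classes)
    print("isolated:", best, {k: round(v, 3) for k, v in scores.items()})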

Relevance: 20.00%

Publisher:

Abstract:

Computational simulations of the title reaction are presented, covering a temperature range from 300 to 2000 K. At lower temperatures we find that initial formation of the cyclopropene complex by addition of methylene to acetylene is irreversible, as is the stabilisation process via collisional energy transfer. Product branching between propargyl and the stable isomers is predicted at 300 K as a function of pressure for the first time. At intermediate temperatures (1200 K), complex temporal evolution involving multiple steady states begins to emerge. At high temperatures (2000 K) the timescale for subsequent unimolecular decay of thermalized intermediates begins to impinge on the timescale for reaction of methylene, such that the rate of formation of propargyl product does not admit a simple analysis in terms of a single time-independent rate constant until the methylene supply becomes depleted. Likewise, at the elevated temperatures the thermalized intermediates cannot be regarded as irreversible product channels. Our solution algorithm involves spectral propagation of a symmetrised version of the discretized master equation matrix, and is implemented in a high precision environment which makes hitherto unachievable low-temperature modelling a reality.
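
A minimal sketch of the spectral-propagation idea, assuming NumPy and a toy system: the rate matrix of dp/dt = M p is symmetrised using the square roots of the equilibrium populations, diagonalised once, and the populations are then evaluated analytically at any time. The three-state matrix and its rates are invented so as to satisfy detailed balance; they are not the methylene + acetylene surface, and the high-precision arithmetic mentioned in the abstract is replaced here by ordinary double precision.

    import numpy as np

    def propagate(M, p_eq, p0, times):
        """Spectral propagation of dp/dt = M p.  Detailed balance lets us
        symmetrise S = D^{-1/2} M D^{1/2} with D = diag(p_eq), diagonalise the
        symmetric S once, and evaluate p(t) analytically at every time."""
        d = np.sqrt(p_eq)
        S = M * np.outer(1.0 / d, d)          # D^{-1/2} M D^{1/2}
        lam, V = np.linalg.eigh(S)
        c = V.T @ (p0 / d)                    # expansion coefficients of p(0)
        return np.array([d * (V @ (np.exp(lam * t) * c)) for t in times])

    # Toy three-state system obeying detailed balance (not the title reaction).
    p_eq = np.array([0.7, 0.2, 0.1])
    k12, k23 = 1.0, 0.3                       # forward rate constants
    k21 = k12 * p_eq[0] / p_eq[1]             # reverse rates from detailed balance
    k32 = k23 * p_eq[1] / p_eq[2]
    M = np.array([[-k12,          k21,   0.0],
                  [ k12, -(k21 + k23),   k32],
                  [ 0.0,          k23,  -k32]])

    p0 = np.array([1.0, 0.0, 0.0])
    for t, p in zip([0.1, 1.0, 10.0], propagate(M, p_eq, p0, [0.1, 1.0, 10.0])):
        print(f"t={t:5.1f}  populations={np.round(p, 3)}")

Because the symmetrised matrix is diagonalised only once, populations can be evaluated at arbitrary times without time stepping, which is what makes the spectral approach attractive for stiff master equations.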