878 results for Multi-resolution Method


Relevance:

30.00%

Publisher:

Abstract:

The three-dimensional documentation of footwear and tyre impressions in snow offers an opportunity to capture additional fine detail for identification beyond what photographs provide. Until now, various casting methods have been used for this purpose, and casting footwear impressions in snow has always been a difficult task. This work demonstrates that the non-destructive method of 3D optical surface scanning is suitable for the three-dimensional documentation of impressions in snow. The new method delivers more detailed results of higher accuracy than conventional casting techniques. The results obtained with this easy-to-use, mobile 3D optical surface scanner were very satisfactory under different meteorological and snow conditions. The method is also suitable for impressions in soil, sand, or other materials. In addition to side-by-side comparison, the automatic comparison of the 3D models and the computation of deviations and data accuracy simplify the examination and deliver objective, reliable results. The results can be visualized efficiently, and data exchange between investigating authorities at the national or international level can be achieved easily with electronic data carriers.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: This paper examines four different levels of possible variation in symptom reporting: occasion, day, person and family. DESIGN: In order to rule out effects of retrospection, concurrent symptom reporting was assessed prospectively using a computer-assisted self-report method. METHODS: A decomposition of variance in symptom reporting was conducted using diary data from families with adolescent children. We used palmtop computers to assess concurrent somatic complaints from parents and children six times a day for seven consecutive days. Two separate studies included 314 and 254 participants from 96 and 77 families, respectively. A generalized multilevel linear models approach was used to analyze the data. Symptom reports were modelled using a logistic response function, and random effects were allowed at the family, person and day levels, with extra-binomial variation allowed for at the occasion level. RESULTS: Substantial variability was observed at the person, day and occasion levels but not at the family level. CONCLUSIONS: To explain symptom reporting in generally healthy individuals, situational as well as personal characteristics should be taken into account. Family characteristics, however, do not appear to help clarify symptom reporting across family members.
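The variance-decomposition design above can be illustrated with a small simulation of the random-effects structure. This is a sketch with made-up variance components, not the study's estimates; the family-level variance is set to zero to mirror the reported finding:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical variance components (illustrative, not the study's estimates):
# random intercepts on the logit scale at the family, person, and day level.
sd_family, sd_person, sd_day = 0.0, 1.0, 0.6
n_families, persons_per_family, n_days, n_occasions = 96, 3, 7, 6

base = -2.0  # baseline log-odds of reporting a symptom on any occasion

reports = []
for f in range(n_families):
    u_f = rng.normal(0, sd_family)              # family random intercept
    for p in range(persons_per_family):
        u_p = rng.normal(0, sd_person)          # person random intercept
        for d in range(n_days):
            u_d = rng.normal(0, sd_day)         # day random intercept
            logit = base + u_f + u_p + u_d
            prob = 1 / (1 + np.exp(-logit))
            # six occasions per day, each a Bernoulli draw
            reports.append(rng.binomial(1, prob, size=n_occasions))

reports = np.array(reports)
print("overall symptom-reporting rate:", reports.mean().round(3))
```

Fitting the variance components back from such data would require a multilevel logistic model, as the study did; the simulation only shows where the variability enters.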

Relevance:

30.00%

Publisher:

Abstract:

The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the impact situation. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of external body findings and of injury-inflicting instruments. The correlation of the injuries of the body to the injury-inflicting object and to the accident mechanism is of great importance. The applied methods include documentation of the external and internal body, of the involved vehicles and inflicting tools, and analysis of the acquired data. The body surface and the accident vehicles with their damage were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included processing of the obtained data into 3D models, determination of the driving direction of the vehicle, correlation of injuries to the vehicle damage, geometric determination of the impact situation, and evaluation of further findings of the accident. In the following article, the benefits of 3D documentation and of computer-assisted, drawn-to-scale 3D comparison of the relevant injuries with the damage to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are shown for two examined cases.

Relevance:

30.00%

Publisher:

Abstract:

This article presents a feasibility study with the objective of investigating the potential of multi-detector computed tomography (MDCT) to estimate the bone age and sex of deceased persons. To obtain virtual skeletons, the bodies of 22 deceased persons with known age at death were scanned by MDCT using a special protocol consisting of high-resolution imaging of the skull, the shoulder girdle (including the upper half of the humeri), the symphysis pubis and the upper halves of the femora. Bone and soft-tissue reconstructions were performed in two and three dimensions. The resulting data were investigated by three anthropologists with different levels of professional experience. Sex was determined by investigating three-dimensional models of the skull and pelvis. As a basic orientation for the age estimation, the complex method according to Nemeskéri and co-workers was applied. The final estimate was made using additional parameters, such as the state of the dentition and degeneration of the spine, which were chosen individually by the three observers according to their experience. The results of the study show that the estimation of sex and age is possible by the use of MDCT. Virtual skeletons present an ideal collection for anthropological studies, because they are obtained in a non-invasive way and can be investigated ad infinitum.

Relevance:

30.00%

Publisher:

Abstract:

Dental identification is the most valuable method to identify human remains in single cases with major postmortem alterations, as well as in mass casualties, because of its practicality and high reliability. Computed tomography (CT) has been investigated as a supportive tool for forensic identification and has proven valuable; it can also scan the dentition of a deceased person within minutes. In the present study, we investigated currently used restorative materials using ultra-high-resolution dual-source CT and the extended CT scale for the purpose of a color-encoded, to-scale, and artifact-free visualization in 3D volume rendering. In 122 human molars, 220 cavities with 2-, 3-, 4- and 5-mm diameters were prepared. These cavities were restored with presently used filling materials (different composites, temporary filling materials, ceramic, and liner) in six teeth for each material and cavity size (exception: amalgam, n = 1). The teeth were CT scanned and the images reconstructed using an extended CT scale. Filling materials were analyzed in terms of the resulting Hounsfield units (HU) and the representation of filling size within the images. The restorative materials showed distinctly differing radiopacities, allowing for CT-data-based discrimination; in particular, ceramic and composite fillings could be differentiated. The HU values were used to generate an updated volume-rendering preset for postmortem extended-CT-scale data of the dentition, to easily visualize the position of restorations, their shape (to scale), and the material used, color-encoded in 3D. The results provide the scientific background for the application of 3D volume rendering to visualize the human dentition for forensic identification purposes.

Relevance:

30.00%

Publisher:

Abstract:

In 1998-2001 Finland suffered the most severe insect outbreak ever recorded there, covering over 500,000 hectares. The outbreak was caused by the common pine sawfly (Diprion pini L.) and has continued in the study area, Palokangas, ever since. To find a good method for monitoring this type of outbreak, this study examined the efficacy of multi-temporal ERS-2 and ENVISAT SAR imagery for estimating Scots pine (Pinus sylvestris L.) defoliation. Three methods were tested: unsupervised k-means clustering, supervised linear discriminant analysis (LDA) and logistic regression. In addition, I assessed whether harvested areas could be differentiated from the defoliated forest using the same methods. Two different speckle filters were used to determine the effect of filtering on the SAR imagery and the subsequent results. Logistic regression performed best, producing a classification accuracy of 81.6% (kappa 0.62) with two classes (no defoliation, >20% defoliation). With two classes, LDA accuracy was at best 77.7% (kappa 0.54) and k-means 72.8% (kappa 0.46). In general, the largest speckle filter, a 5 x 5 image window, performed best. When additional classes were added, accuracy usually degraded step by step. The results were good, but because of the study's limitations they should be confirmed with independent data before firm conclusions about their reliability can be drawn. The limitations include the small amount of field data and the resulting problems with accuracy assessment (no separate testing data), as well as the lack of meteorological data from the imaging dates.
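The accuracy and kappa figures reported above can be computed directly from a confusion matrix; the sketch below shows the calculation with hypothetical two-class counts (no defoliation vs. >20% defoliation) invented for illustration, not taken from the study:

```python
import numpy as np

def cohens_kappa(confusion: np.ndarray) -> float:
    """Cohen's kappa from a square confusion matrix (rows: truth, cols: predicted)."""
    n = confusion.sum()
    observed = np.trace(confusion) / n                      # observed agreement
    expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n**2
    return (observed - expected) / (1 - expected)           # chance-corrected

# Hypothetical confusion matrix, for illustration only.
cm = np.array([[45, 8],
               [11, 39]])
acc = np.trace(cm) / cm.sum()
print(f"accuracy = {acc:.3f}, kappa = {cohens_kappa(cm):.2f}")
```

Kappa corrects raw accuracy for chance agreement, which is why a classifier can reach 81.6% accuracy yet only kappa 0.62.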

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: The objective of this study was to evaluate the feasibility and reproducibility of high-resolution magnetic resonance imaging (MRI) and quantitative T2 mapping of the talocrural cartilage within a clinically applicable scan time, using a new dedicated ankle coil and high-field MRI. MATERIALS AND METHODS: Ten healthy volunteers (mean age 32.4 years) underwent MRI of the ankle. As morphological sequences, proton density fat-suppressed turbo spin echo (PD-FS-TSE), as a reference, was compared with 3D true fast imaging with steady-state precession (TrueFISP). Furthermore, biochemical quantitative T2 imaging was performed using a multi-echo spin-echo T2 approach. Data analysis was performed three times each by three different observers on sagittal slices planned on the isotropic 3D-TrueFISP; as a morphological parameter, cartilage thickness was assessed, and for T2 relaxation times, region-of-interest (ROI) evaluation was done. Reproducibility was determined as a coefficient of variation (CV) for each volunteer, averaged as a root mean square (RMSA) given as a percentage; statistical evaluation was done using analysis of variance. RESULTS: Cartilage thickness of the talocrural joint showed significantly higher values for the 3D-TrueFISP (ranging from 1.07 to 1.14 mm) than for the PD-FS-TSE (ranging from 0.74 to 0.99 mm); however, both morphological sequences showed comparably good results, with RMSA of 7.1 to 8.5%. Regarding quantitative T2 mapping, measurements showed T2 relaxation times of about 54 ms with excellent reproducibility (RMSA ranging from 3.2 to 4.7%). CONCLUSION: In our study the assessment of cartilage thickness and T2 relaxation times could be performed with high reproducibility within a clinically realizable scan time, opening new possibilities for further investigations in patient groups.
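The reproducibility measure used above, per-volunteer coefficients of variation averaged as a root mean square (RMSA), can be sketched as follows; the repeated T2 readings are invented numbers chosen only to illustrate the arithmetic:

```python
import numpy as np

def rms_cv_percent(measurements: np.ndarray) -> float:
    """Reproducibility as the root mean square of per-subject coefficients
    of variation, in percent. Rows: subjects, columns: repeated readings."""
    cv = measurements.std(axis=1, ddof=1) / measurements.mean(axis=1)
    return float(np.sqrt(np.mean(cv**2)) * 100)

# Hypothetical repeated T2 readings (ms) for three volunteers, three reads
# each; values are made up for illustration, not the study's data.
t2 = np.array([[54.1, 53.2, 55.0],
               [52.8, 54.6, 53.5],
               [55.2, 54.0, 56.1]])
print(f"RMSA = {rms_cv_percent(t2):.1f}%")
```

RMS-averaging the CVs (rather than taking a plain mean) weights larger per-subject variability more heavily, which is the conservative choice for a reproducibility claim.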

Relevance:

30.00%

Publisher:

Abstract:

This dissertation investigates high-performance cooperative localization in wireless environments based on multi-node time-of-arrival (TOA) and direction-of-arrival (DOA) estimation in line-of-sight (LOS) and non-LOS (NLOS) scenarios. Two categories of nodes are assumed: base nodes (BNs) and target nodes (TNs). BNs are equipped with antenna arrays and are capable of estimating TOA (range) and DOA (angle). TNs are equipped with omnidirectional antennas and communicate with BNs so that the BNs can localize them; thus, the proposed localization is maintained through BN-TN cooperation. First, a LOS localization method is proposed, based on semi-distributed multi-node TOA-DOA fusion. The proposed technique is applicable to mobile ad-hoc networks (MANETs). We assume LOS is available between BNs and TNs. One BN is selected as the reference BN, and the other nodes are localized in the coordinates of the reference BN. Each BN can independently localize the TNs located in its coverage area. In addition, a TN might be localized by multiple BNs. High-performance localization is attainable via multi-node TOA-DOA fusion. The complexity of the semi-distributed multi-node TOA-DOA fusion is low because the total computational load is distributed across all BNs. To evaluate the localization accuracy of the proposed method, we compare it with global positioning system (GPS)-aided TOA (DOA) fusion, which is also applicable to MANETs. The comparison criterion is the localization circular error probability (CEP). The results confirm that the proposed method is suitable for moderate-scale MANETs, while GPS-aided TOA fusion is suitable for large-scale MANETs. Usually, the TOA and DOA of TNs are periodically estimated by BNs. Thus, a Kalman filter (KF) is integrated with multi-node TOA-DOA fusion to further improve its performance.
The integration of KF and multi-node TOA-DOA fusion is compared with the extended KF (EKF) applied to multiple TOA-DOA estimations made by multiple BNs. The comparison shows that the integration is stable (no divergence takes place) and that its accuracy is only slightly lower than that of the EKF when the EKF converges. However, the EKF may diverge while the integration of KF and multi-node TOA-DOA fusion does not; thus, the reliability of the proposed method is higher. In addition, the computational complexity of the integration of KF and multi-node TOA-DOA fusion is much lower than that of the EKF. In wireless environments, LOS might be obstructed, which degrades localization reliability. The antenna arrays installed at each BN are therefore used to allow each BN to identify NLOS scenarios independently. Here, a single BN measures the phase difference across two antenna elements using a synchronized bi-receiver system and maps it onto the wireless channel's K-factor. The larger K is, the more likely the channel is LOS. The K-factor is then used to identify NLOS scenarios. The performance of this system is characterized in terms of the probability of LOS and NLOS identification, and the latency of the method is small. Finally, a multi-node NLOS identification and localization method is proposed to improve localization reliability. In this case, multiple BNs engage in NLOS identification, shared-reflector determination and localization, and NLOS TN localization. In NLOS scenarios with three or more shared reflectors, the reflectors are localized via DOA fusion, and a TN is then localized via TOA fusion based on the localization of the shared reflectors.
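As a rough illustration of smoothing periodic TOA-DOA estimates with a linear KF, the sketch below converts each (range, angle) pair into Cartesian coordinates and filters a static target seen by a single BN at the origin. The geometry, noise levels, and the crude mapping of polar noise into Cartesian covariance are all assumptions for this sketch, not the dissertation's design:

```python
import numpy as np

rng = np.random.default_rng(1)

true_pos = np.array([30.0, 40.0])        # hypothetical static TN position (m)
sigma_r = 1.0                            # assumed range (TOA) noise, m
sigma_theta = np.deg2rad(2.0)            # assumed angle (DOA) noise, rad

x = np.zeros(2)                          # state: estimated [px, py]
P = np.eye(2) * 1e3                      # large initial uncertainty
for _ in range(50):                      # 50 periodic TOA-DOA estimates
    r = np.hypot(*true_pos) + rng.normal(0, sigma_r)
    th = np.arctan2(true_pos[1], true_pos[0]) + rng.normal(0, sigma_theta)
    z = np.array([r * np.cos(th), r * np.sin(th)])   # measurement, Cartesian
    # crude isotropic mapping of polar noise to Cartesian covariance
    R = np.eye(2) * (sigma_r**2 + (r * sigma_theta) ** 2)
    # static target: prediction leaves x, P unchanged; linear KF update:
    K = P @ np.linalg.inv(P + R)
    x = x + K @ (z - x)
    P = (np.eye(2) - K) @ P

print("estimate:", x.round(2), "error (m):",
      np.linalg.norm(x - true_pos).round(2))
```

Because the measurement is linearized before filtering, the filter itself stays linear and cannot diverge, which is the qualitative point of preferring KF-plus-fusion over an EKF on the raw polar measurements.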

Relevance:

30.00%

Publisher:

Abstract:

Embedded siloxane polymer waveguides have shown promising results for use in optical backplanes. They exhibit high temperature stability, low optical absorption, and require common processing techniques. A challenging aspect of this technology is out-of-plane coupling of the waveguides. A multi-software approach to modeling an optical vertical interconnect (via) is proposed. This approach utilizes the beam propagation method to generate varied modal field distribution structures which are then propagated through a via model using the angular spectrum propagation technique. Simulation results show average losses between 2.5 and 4.5 dB for different initial input conditions. Certain configurations show losses of less than 3 dB and it is shown that in an input/output pair of vias, average losses per via may be lower than the targeted 3 dB.
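The angular spectrum propagation technique mentioned above can be sketched for the simple free-space case; the wavelength, sampling, and beam waist below are illustrative only, and a via model would insert the structure's apertures and index steps between propagation steps:

```python
import numpy as np

def angular_spectrum_propagate(field, dx, wavelength, z):
    """Propagate a sampled complex field a distance z (same units as dx)
    using the angular spectrum method; evanescent components are cut off."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0))
    H = np.exp(1j * kz * z) * (arg > 0)          # free-space transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Hypothetical numbers: 850 nm light, 1 um sampling, 25 um Gaussian waist.
n, dx, wl = 256, 1e-6, 0.85e-6
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
field = np.exp(-(X**2 + Y**2) / (25e-6) ** 2)
out = angular_spectrum_propagate(field, dx, wl, 100e-6)

p_in = np.sum(np.abs(field) ** 2)
p_out = np.sum(np.abs(out) ** 2)
print("loss (dB):", round(-10 * np.log10(p_out / p_in), 3))
```

In pure free space the transfer function is unitary, so the measured loss is essentially zero; the 2.5-4.5 dB figures quoted above arise from the via geometry, which this sketch omits.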

Relevance:

30.00%

Publisher:

Abstract:

This study develops an automated analysis tool by combining total internal reflection fluorescence microscopy (TIRFM), an evanescent-wave microscopic imaging technique used to capture time-sequential images, with corresponding Matlab image-processing code to identify the movements of single individual particles. The developed code enables the examination of two-dimensional hindered tangential Brownian motion of nanoparticles with sub-pixel (nanoscale) resolution. The measured mean square displacements of the nanoparticles are compared with theoretical predictions to estimate particle diameters and fluid viscosity using a nonlinear regression technique. These estimates are checked against the diameters and viscosities given by the manufacturers to validate the analysis tool. The nanoparticles used in these experiments are yellow-green polystyrene fluorescent nanospheres (200 nm, 500 nm and 1000 nm in nominal diameter; 505 nm excitation and 515 nm emission wavelengths). The solutions used are de-ionized (DI) water, 10% d-glucose and 10% glycerol. Mean square displacements obtained near the surface show significant deviation from theoretical predictions, which is attributed to DLVO forces in that region, but they conform to theoretical predictions beyond ~125 nm. The proposed automated analysis tool can be employed in bio-application fields that require examination of single-protein (DNA and/or vesicle) tracking, drug delivery, and cytotoxicity, unlike traditional measurement techniques that require fixing the cells. Furthermore, this tool can also be applied in microfluidics to non-invasive thermometry, particle tracking velocimetry (PTV), and non-invasive viscometry.
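The core estimation chain above, mean square displacement to diffusion coefficient to particle diameter via the Stokes-Einstein relation, can be sketched for unhindered 2D Brownian motion; the simulation parameters are illustrative, and near-wall hindrance and DLVO effects are not modeled:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stokes-Einstein: D = kT / (3 * pi * eta * d)   (bulk, far from the wall)
kT = 1.380649e-23 * 298          # thermal energy at 25 C, J
eta = 0.89e-3                    # viscosity of water, Pa*s
d_true = 500e-9                  # nominal particle diameter, m
D = kT / (3 * np.pi * eta * d_true)

# simulate a 2D trajectory: per-axis steps ~ N(0, sqrt(2 D dt))
dt, n_steps = 0.01, 200_000
steps = rng.normal(0, np.sqrt(2 * D * dt), size=(n_steps, 2))
traj = np.cumsum(steps, axis=0)

# MSD at lag dt: <|r(t+dt) - r(t)|^2> = 4 D dt in two dimensions
msd1 = np.mean(np.sum(np.diff(traj, axis=0) ** 2, axis=1))
D_est = msd1 / (4 * dt)
d_est = kT / (3 * np.pi * eta * D_est)   # invert Stokes-Einstein
print(f"estimated diameter: {d_est * 1e9:.0f} nm")
```

The study fits the MSD over many lags with nonlinear regression rather than using a single lag; the single-lag version above shows the same relationship in its simplest form.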

Relevance:

30.00%

Publisher:

Abstract:

Range estimation is at the core of many positioning systems, such as radar and Wireless Local Positioning Systems (WLPS). Range is estimated by estimating Time-of-Arrival (TOA), which represents the signal propagation delay between a transmitter and a receiver. Thus, error in TOA estimation degrades range estimation performance. In wireless environments, noise, multipath, and limited bandwidth reduce TOA estimation performance. TOA estimation algorithms designed for wireless environments aim to improve performance by mitigating the effect of closely spaced paths in practical (positive) signal-to-noise ratio (SNR) regions. Limited bandwidth prevents the discrimination of closely spaced paths, which reduces TOA estimation performance. TOA estimation methods are evaluated as a function of SNR, bandwidth, and the number of reflections in multipath wireless environments, as well as their complexity. In this research, a TOA estimation technique based on Blind signal Separation (BSS) is proposed. This frequency-domain method estimates TOA in wireless multipath environments for a given signal bandwidth. The structure of the proposed technique is presented, and its complexity and performance are theoretically evaluated. It is shown that the proposed method is not sensitive to SNR, the number of reflections, or bandwidth. In general, TOA estimation performance improves as bandwidth increases. However, spectrum is the most valuable resource in wireless systems, and a large portion of spectrum to support high-performance TOA estimation is usually not available. In addition, the radio frequency (RF) components of wideband systems suffer from high cost and complexity. Thus, a novel multiband positioning structure is proposed. The proposed technique uses the available (non-contiguous) bands to support high-performance TOA estimation.
This system incorporates the capabilities of cognitive radio (CR) systems to sense the available spectrum (also called white spaces) and to use those white spaces for high-performance localization. First, contiguous bands that are divided into several non-equal, narrow sub-bands possessing the same SNR are concatenated to attain an accuracy corresponding to the equivalent full band. Two radio architectures are proposed and investigated: the signal is transmitted over the available spectrum either simultaneously (parallel concatenation) or sequentially (serial concatenation). Low-complexity radio designs that handle the concatenation process sequentially and in parallel are introduced. Different TOA estimation algorithms applicable to multiband scenarios are studied, and their performance is theoretically evaluated and compared to simulations. Next, the results are extended to non-contiguous, non-equal sub-bands with the same SNR, which are more realistic assumptions for practical systems. The performance and complexity of the proposed technique are investigated as well. The results of this study show that positioning accuracy can be adapted by selecting the bandwidth, center frequency, and SNR level of each sub-band.
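A minimal single-path TOA estimator, matched filtering the received signal against the known waveform, can be sketched as follows. The waveform, sample rate, and SNR are assumptions for this sketch; the dissertation's BSS-based method targets the much harder multipath case:

```python
import numpy as np

rng = np.random.default_rng(3)

fs = 100e6                          # assumed sample rate, Hz
t = np.arange(256) / fs
# known transmitted pulse: windowed 10 MHz tone (illustrative waveform)
pulse = np.sin(2 * np.pi * 10e6 * t) * np.hanning(t.size)

true_delay = 40                     # propagation delay, in samples
rx = np.zeros(1024)
rx[true_delay:true_delay + pulse.size] = pulse
rx += rng.normal(0, 0.05, rx.size)  # additive noise (high SNR assumed)

# matched filter: cross-correlate and take the peak lag as the TOA
corr = np.correlate(rx, pulse, mode="valid")
toa_samples = int(np.argmax(corr))
print("estimated TOA:", toa_samples / fs * 1e9, "ns")
```

With limited bandwidth, a second path arriving within the pulse's correlation width would merge into this peak, which is exactly the closely-spaced-paths problem the text describes.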

Relevance:

30.00%

Publisher:

Abstract:

For half a century the integrated circuits (ICs) that make up the heart of electronic devices have been steadily improving by shrinking at an exponential rate. However, as the current crop of ICs gets smaller and the insulating layers involved become thinner, electrons leak through due to quantum mechanical tunneling. This is one of several issues that will bring an end to this incredible streak of exponential improvement of this type of transistor device, after which future improvements will have to come from employing fundamentally different transistor architectures rather than fine-tuning and miniaturizing the metal-oxide-semiconductor field-effect transistors (MOSFETs) in use today. Several new transistor designs, some designed and built here at Michigan Tech, involve electrons tunneling their way through arrays of nanoparticles. We use a multi-scale approach to model these devices and study their behavior. For investigating the tunneling characteristics of the individual junctions, we use a first-principles approach to model conduction between sub-nanometer gold particles. To estimate the change in energy due to the movement of individual electrons, we use the finite element method to calculate electrostatic capacitances. The kinetic Monte Carlo method allows us to use our knowledge of these details to simulate the dynamics of an entire device, sometimes consisting of hundreds of individual particles, and watch as a device 'turns on' and starts conducting an electric current. Scanning tunneling microscopy (STM) and the closely related scanning tunneling spectroscopy (STS) are a family of powerful experimental techniques that allow for the probing and imaging of surfaces and molecules at atomic resolution. However, interpretation of the results often requires comparison with theoretical and computational models. We have developed a new method for calculating STM topographs and STS spectra.
This method combines an established method for approximating the geometric variation of the electronic density of states, with a modern method for calculating spin-dependent tunneling currents, offering a unique balance between accuracy and accessibility.
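The kinetic Monte Carlo step at the heart of the device simulation described above, choosing the next tunneling event with probability proportional to its rate and drawing an exponential waiting time, can be sketched as follows. The junction rates here are fixed, made-up numbers; a real simulation would recompute them from the electrostatic energy changes after every electron hop:

```python
import numpy as np

rng = np.random.default_rng(4)

def kmc_step(rates, t):
    """One kinetic Monte Carlo step: pick an event with probability
    proportional to its rate and advance time by an exponential wait."""
    total = rates.sum()
    event = rng.choice(len(rates), p=rates / total)
    t += rng.exponential(1 / total)
    return event, t

# Hypothetical toy device: three tunneling junctions with fixed rates (1/s).
rates = np.array([5.0, 1.0, 0.5])
t, counts = 0.0, np.zeros(3, dtype=int)
for _ in range(10_000):
    event, t = kmc_step(rates, t)
    counts[event] += 1
print("event fractions:", (counts / counts.sum()).round(3))
```

Over many steps the event fractions converge to the normalized rates, which is what makes the method a faithful stochastic simulation of the rate equations.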

Relevance:

30.00%

Publisher:

Abstract:

Approximately 90% of fine aerosol in the Midwestern United States has a regional component with a sizable fraction attributed to secondary production of organic aerosol (SOA). The Ozark Forest is an important source of biogenic SOA precursors like isoprene (> 150 mg m-2 d-1), monoterpenes (10-40 mg m-2 d-1), and sesquiterpenes (10-40 mg m-2d-1). Anthropogenic sources include secondary sulfate and nitrate and biomass burning (51-60%), vehicle emissions (17-26%), and industrial emissions (16-18%). Vehicle emissions are an important source of volatile and vapor-phase, semivolatile aliphatic and aromatic hydrocarbons that are important anthropogenic sources of SOA precursors. The short lifetime of SOA precursors and the complex mixture of functionalized oxidation products make rapid sampling, quantitative processing methods, and comprehensive organic molecular analysis essential elements of a comprehensive strategy to advance understanding of SOA formation pathways. Uncertainties in forecasting SOA production on regional scales are large and related to uncertainties in biogenic emission inventories and measurement of SOA yields under ambient conditions. This work presents a bottom-up approach to develop a conifer emission inventory based on foliar and cortical oleoresin composition, development of a model to estimate terpene and terpenoid signatures of foliar and bole emissions from conifers, development of processing and analytic techniques for comprehensive organic molecular characterization of SOA precursors and oxidation products, implementation of the high-volume sampling technique to measure OA and vapor-phase organic matter, and results from a 5 day field experiment conducted to evaluate temporal and diurnal trends in SOA precursors and oxidation products. 
A total of 98, 115, and 87 terpene and terpenoid species were identified and quantified in commercially available essential oils of Pinus sylvestris, Picea mariana, and Thuja occidentalis, respectively, by comprehensive, two-dimensional gas chromatography with time-of-flight mass spectrometric detection (GC × GC-ToF-MS). Analysis of the literature showed that cortical oleoresin composition was similar to foliar composition of the oldest branches. Our proposed conceptual model for estimation of signatures of terpene and terpenoid emissions from foliar and cortical oleoresin showed that emission potentials of the foliar and bole release pathways are dissimilar and should be considered for conifer species that develop resin blisters or are infested with herbivores or pathogens. Average derivatization efficiencies for Methods 1 and 2 were 87.9 and 114%, respectively. Despite the lower average derivatization efficiency of Method 1, distinct advantages included a greater certainty of derivatization yield for the entire suite of multi- and poly-functional species and fewer processing steps for sequential derivatization. Detection limits for Method 1 using GC × GC- ToF-MS were 0.09-1.89 ng μL-1. A theoretical retention index diagram was developed for a hypothetical GC × 2GC analysis of the complex mixture of SOA precursors and derivatized oxidation products. In general, species eluted (relative to the alkyl diester reference compounds) from the primary column (DB-210) in bands according to n and from the secondary columns (BPX90, SolGel-WAX) according to functionality, essentially making the GC × 2GC retention diagram a Carbon number-functionality grid. The species clustered into 35 groups by functionality and species within each group exhibited good separation by n. Average recoveries of n-alkanes and polyaromatic hydrocarbons (PAHs) by Soxhlet extraction of XAD-2 resin with dichloromethane were 80.1 ± 16.1 and 76.1 ± 17.5%, respectively. 
Vehicle emissions were the common source of HSVOCs [i.e., resolved alkanes, the unresolved complex mixture (UCM), alkylbenzenes, and 2- and 3-ring PAHs]. An absence of monoterpenes at 0600-1000 together with high concentrations of monoterpenoids during the same period was indicative of substantial losses of monoterpenes overnight and in the early morning hours. Post-collection, comprehensive organic molecular characterization of SOA precursors and products by GC × GC-ToF-MS in ambient air collected with ~2 hr resolution is a promising method for determining biogenic and anthropogenic SOA yields that can be used to evaluate SOA formation models.

Relevance:

30.00%

Publisher:

Abstract:

To improve the identification of potential heparin impurities such as oversulfated chondroitin sulfate (OSCS), the standard 2D (1)H-(1)H NMR NOESY experiment was applied. By taking advantage of spin diffusion and adjusting the experimental parameters accordingly, additional contaminant-specific signals of the corresponding sugar ring protons can easily be detected; these are usually hidden by the more intense heparin signals. Compared to the current 1D (1)H procedure proposed for screening commercial unfractionated heparin samples, which focuses on the contaminants' acetyl signals, more informative and unique fingerprints may be obtained. Measured (1)H fingerprints of a few potential impurities are given, and their identification in two contaminated commercial heparin samples is demonstrated. The proposed 2D NOESY method is not intended to replace the current 1D method for detecting and quantifying heparin impurities, but it may be regarded as a valuable supplement for a more reliable identification of these contaminants.

Relevance:

30.00%

Publisher:

Abstract:

Disturbances in power systems may lead to electromagnetic transient oscillations due to a mismatch between mechanical input power and electrical output power. Out-of-step conditions in power systems are common after disturbances in which the oscillations do not damp out and the system becomes unstable. Existing out-of-step detection methods are system-specific, as extensive off-line studies are required to set the relays. Most existing algorithms also require network reduction techniques to be applied in multi-machine power systems. To overcome these issues, this research applies Phasor Measurement Unit (PMU) data and Zubov's approximation stability boundary method, a modification of Lyapunov's direct method, to develop a novel out-of-step detection algorithm. The proposed algorithm is tested on a Single Machine Infinite Bus system and on the IEEE 3-machine 9-bus and IEEE 10-machine 39-bus systems. Simulation results show that the proposed algorithm is capable of detecting out-of-step conditions in multi-machine power systems without using network reduction techniques, and a comparative study with an existing blinder method demonstrates that the decision times are faster. The simulation case studies also demonstrate that the proposed algorithm does not depend on power system parameters, and hence it avoids the need for the extensive off-line system studies required by other algorithms.
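A minimal out-of-step scenario can be sketched with the classical single-machine-infinite-bus swing equation, flagging loss of synchronism when the rotor angle passes 180 degrees. All parameters and the simple angle-threshold criterion are illustrative only; the proposed algorithm instead uses PMU data and Zubov's stability boundary method:

```python
import numpy as np

# The machine starts at the pre-disturbance equilibrium; the disturbance is
# modeled as a permanent reduction of the maximum power transfer p_max.
def out_of_step(p_mech, p_max_pre, p_max_post, H=5.0, f0=60.0,
                dt=1e-3, t_end=5.0):
    delta = np.arcsin(p_mech / p_max_pre)   # pre-disturbance rotor angle
    omega = 0.0                             # rotor speed deviation, rad/s
    M = 2 * H / (2 * np.pi * f0)            # inertia coefficient, p.u.
    for _ in range(int(t_end / dt)):
        p_elec = p_max_post * np.sin(delta)
        omega += dt * (p_mech - p_elec) / M  # swing equation
        delta += dt * omega                  # semi-implicit Euler step
        if abs(delta) > np.pi:
            return True                      # rotor angle runs away
    return False

print("mild weakening   ->", out_of_step(0.8, 2.0, 1.5))   # recovers
print("severe weakening ->", out_of_step(0.8, 2.0, 0.9))   # loses synchronism
```

The two cases mirror the equal-area criterion: with a mild weakening the decelerating area is sufficient and the angle oscillates, while a severe weakening drives the angle past the unstable equilibrium and the machine pulls out of step.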