873 results for COHERENCE TOMOGRAPHY FINDINGS


Relevance: 20.00%

Abstract:

Tsunami waves of the Sumatra-Andaman earthquake on 26 December 2004 claimed approximately 230,000 lives and started the biggest identification operation in Interpol's history. The aim of this study was to describe the identification methods used and the results obtained. The viewpoint is mainly that of forensic odontology, but other means of identification and the results of the medico-legal examinations performed in Finland are also included. Of the 5,395 victims in Thailand, approximately 2,400 were foreigners from 36 nations, including 177 Finnish nationals. Additionally, a Finnish woman perished in Sri Lanka and a severely injured man died in hospital after evacuation. The final numbers of missing persons and dead bodies registered at the Information Management Centre in Phuket, Thailand, were 3,574 ante-mortem (AM) and 3,681 post-mortem (PM) files. The number of identifications by December 2006 was 3,271, or 89% of the victims registered. Of the Finnish victims, 172 have been identified in Thailand and 163 repatriated to Finland. One adult and four children are still missing. For AM data, a list of Finnish missing persons including 178 names was published on 30 December 2004. By February 2005 all useful dental AM data were available. Five persons on the list living in Finland lacked records. Based on the AM database, dental identification could be established for 12 (20%) of the children under 18 years of age (n=60). The estimated number for adults (n=112) was 96 (86%). The final identification rate, based on PM examinations in Finland, was 14 (25%) for children (n=56) and 98 (90%) for adults (n=109). The number of Finnish victims identified by dental methods, 112 (68%), was high compared to all victims examined in Thailand (43%). DNA was applied for 26 Finnish children and 6 adults, and fingerprints for 24 and 7, respectively. In 12 cases two methods were applied. Every victim (n=165) underwent a medico-legal investigation in Finland, including an autopsy with sampling of specimens for DNA and for toxicological and histological investigation. Digital radiographs and computed tomography of the whole body were taken to verify autopsy findings and to bring out changes caused by trauma, autolysis, and sampling for DNA in Thailand. Data for identification purposes were also noted. Submersion was the cause of death for 101 of 109 adults (92.7%), and trauma for 8 (7.3%). Injuries were contributing factors in 33 of the submersion cases and in 3 of the trauma-based deaths. Submersion was the cause of death for 51 (92.7%) children and trauma for 4 (7.3%). Injuries were contributing factors in 3 of the submersion cases and in one trauma-based death. The success of the dental identification of Finnish victims is mainly based on careful registration of dental records and on an education programme in forensic odontology running since 1999.

Relevance: 20.00%

Abstract:

Hypertexts are digital texts characterized by interactive hyperlinking and a fragmented textual organization. Increasingly prominent since the early 1990s, hypertexts have become a common text type both on the Internet and in a variety of other digital contexts. Although studied widely in disciplines like hypertext theory and media studies, formal linguistic approaches to hypertext continue to be relatively rare. This study examines coherence negotiation in hypertext with particular reference to hypertext fiction. Coherence, or the quality of making sense, is a fundamental property of textness. Proceeding from the premise that coherence is a subjectively evaluated property rather than an objective quality arising directly from textual cues, the study focuses on the processes through which readers interact with hyperlinks and negotiate continuity between hypertextual fragments. The study begins with a typological discussion of textuality and an overview of the historical and technological precedents of modern hypertexts. Then, making use of text linguistic, discourse analytical, pragmatic, and narratological approaches to textual coherence, the study takes established models developed for analyzing and describing conventional texts and examines their applicability to hypertext. Primary data derived from a collection of hyperfictions is used throughout to illustrate the mechanisms in practice. Hypertextual coherence negotiation is shown to require the ability to cognitively operate between local and global coherence by means of processing lexical cohesion, discourse topical continuities, inferences and implications, and shifting cognitive frames. The main conclusion of the study is that the style of reading required by hypertextuality fosters a new paradigm of coherence. Defined as fuzzy coherence, this new approach to textual sensemaking is predicated on an acceptance of the coherence challenges readers experience when the act of reading comes to involve repeated encounters with referentially imprecise hyperlinks and discourse topical shifts. A practical application of fuzzy coherence is shown to be in effect in the way coherence is actively manipulated in hypertext narratives.

Relevance: 20.00%

Abstract:

The European Union has agreed on implementing the Policy Coherence for Development (PCD) principle in all policy sectors that are likely to have a direct impact on developing countries. This is in order to take account of and support the EU development cooperation objectives and the achievement of the internationally agreed Millennium Development Goals. The common EU migration policy and the newly introduced EU Blue Card directive present an example of the implementation of the principle in practice: the directive is not only designed to respond to the emerging EU labour demand by attracting highly skilled third-country professionals, but is also intended to contribute to the development objectives of the migrant-sending developing countries, primarily through the tool of circular migration and the consequent skills transfers. My objective in this study is to assess this twofold role of the EU Blue Card and to explore the idea that migration could be harnessed for the benefit of development, in conformity with the notion that the two form a positive nexus. Seeing that the EU Blue Card fails to differentiate the most vulnerable countries and sectors from those that are in a better position to take advantage of global migration flows, the developmental consequences of the directive must be accounted for even in the most severe settings. Accordingly, my intention is to question whether circular migration, as claimed, could address the problem of brain drain in the Malawian health sector, which has witnessed an excessive outflow of its professionals to the UK during the past decade. In order to assess the applicability, likelihood, and relevance of circular migration and the consequent skills transfers for development in the Malawian context, a field study comprising 23 interviews with local health professionals was carried out in autumn 2010. The selected approach not only allows me to introduce a developing country perspective to the ongoing discussion at the EU level, but also enables me to assess the development dimension of the EU Blue Card and the intended PCD principle through a local lens. Thus these interviews and local viewpoints are at the very heart of this study. Based on my findings from the field, the propensity of the EU Blue Card to result in circular migration and to address the persisting South-North migratory flows, as well as the relevance of skills transfers, can be called into question. This is because, owing to the bias in its twofold role, the directive overlooks the importance of sending-country circumstances, which are known to determine any developmental outcomes of migration, and assumes that circular migration alone could bring about immediate benefits. Without initial emphasis on local conditions, however, positive outcomes for vulnerable countries such as Malawi are ever more distant. Indeed, it seems as if the EU's internal interests in migration policy preclude the fulfilment of the PCD principle and reduce the attempt to harness migration for development to mere rhetoric.

Relevance: 20.00%

Abstract:

A new feature-based technique is introduced to solve the nonlinear forward problem (FP) of electrical capacitance tomography, with the target application of monitoring the metal fill profile in the lost foam casting process. The new technique is based on combining a linear solution to the FP with a correction factor (CF). The CF is estimated using an artificial neural network (ANN) trained on key features extracted from the metal distribution. The CF adjusts the linear solution of the FP to account for the nonlinear effects caused by the shielding effects of the metal. This approach shows promising results and avoids the curse of dimensionality by using features, rather than the actual metal distribution, to train the ANN. The ANN is trained using nine features extracted from the metal distributions as input. The expected sensor readings are generated using ANSYS software. The performance of the ANN for the training and testing data was satisfactory, with an average root-mean-square error of 2.2%.
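
The combination of a linear forward solution with an ANN-estimated correction factor can be sketched roughly as follows. This is only an illustrative outline, not the authors' implementation: the feature extractor, the use of scikit-learn's MLPRegressor, and the assumption that the CF target is the ratio of simulated nonlinear to linear sensor readings are all choices made here for concreteness.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def extract_features(metal_distribution):
    """Placeholder feature extractor: reduce a metal fill profile to a small
    feature vector (the paper uses nine features of the metal distribution)."""
    m = np.asarray(metal_distribution, dtype=float).ravel()
    return np.array([m.mean(), m.std(), m.max(), m.min(), m.sum(),
                     np.median(m), np.count_nonzero(m) / m.size,
                     np.argmax(m) / m.size, np.trapz(m) / m.size])

def train_cf_model(metal_profiles, cf_targets):
    """Train an ANN mapping features -> correction factors. The CF target is
    assumed here to be, e.g., the ratio of simulated nonlinear to linear
    sensor readings (one value per electrode pair)."""
    X = np.array([extract_features(m) for m in metal_profiles])
    model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
    model.fit(X, cf_targets)
    return model

def forward_capacitance(model, metal_profile, linear_solution):
    """Nonlinear FP estimate = linear FP solution scaled by the predicted CF."""
    cf = np.ravel(model.predict(extract_features(metal_profile)[None, :]))
    return np.asarray(linear_solution) * cf
```

Because the ANN sees only a handful of features instead of the full metal distribution, the training set stays small, which is the dimensionality argument made above.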

Relevance: 20.00%

Abstract:

An adaptive regularization algorithm that combines elementwise photon absorption and the data misfit is proposed to stabilize the non-linear, ill-posed inverse problem. The diffuse photon distribution is low near the target compared to the normal region. A Hessian is proposed based on light-tissue interaction and is estimated using the adjoint method by distributing the sources inside the discretized domain. As the iteration progresses, the photon absorption near the inhomogeneity becomes high and carries more weight in the regularization matrix. This adaptive regularization method, based on the domain's interior photon absorption and the data misfit, improves the quality of the reconstructed diffuse optical tomographic images.
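
A minimal sketch of one regularized update step with an absorption-weighted penalty, assuming a precomputed Jacobian; the simple diagonal weighting below stands in for the paper's adjoint-estimated Hessian and is not taken from the abstract.

```python
import numpy as np

def adaptive_gn_update(J, residual, absorption, mu, lam=1e-2):
    """One regularized Gauss-Newton update for the optical parameter vector mu.

    J          : (n_meas, n_nodes) Jacobian of the forward model
    residual   : (n_meas,) data misfit (measured - modelled)
    absorption : (n_nodes,) elementwise photon absorption for the current
                 estimate; elements absorbing more photons receive more weight
                 in the regularization, mimicking the adaptive idea above.
    """
    w = absorption / (absorption.max() + 1e-12)   # normalized weights in [0, 1]
    R = lam * np.diag(1.0 + w)                    # adaptive regularization matrix
    H = J.T @ J + R                               # regularized approximate Hessian
    delta = np.linalg.solve(H, J.T @ residual)
    return mu + delta
```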

Relevance: 20.00%

Abstract:

Resistivity imaging of a reconfigurable phantom with circular inhomogeneities is studied with a simple instrumentation and data acquisition system for Electrical Impedance Tomography. The reconfigurable phantom is developed with stainless steel electrodes, and a sinusoidal current of constant amplitude is injected into the phantom boundary using the opposite current injection protocol. Nylon and polypropylene cylinders with different cross-sectional areas are placed inside the phantom and the boundary potential data are collected. The instrumentation and data acquisition system, with a DIP-switch-based multiplexer board, is used to inject a constant current of the desired amplitude and frequency. Voltage data for the first eight current patterns (128 voltage data) are found to be sufficient to reconstruct the inhomogeneities, and hence the acquisition time is reduced. Resistivity images are reconstructed from the boundary data for different inhomogeneity positions using EIDORS-2D. The results show that the shape and resistivity of the inhomogeneity, as well as the background resistivity, are successfully reconstructed from the potential data for single- or double-inhomogeneity phantoms. The resistivity images obtained from the single- and double-inhomogeneity phantoms clearly indicate the inhomogeneity as the high-resistivity material. The contrast-to-noise ratio (CNR) and contrast recovery (CR) of the reconstructed images are found to be high for inhomogeneities placed near any of the electrodes, at positions chosen arbitrarily throughout the study. (C) 2010 Elsevier Ltd. All rights reserved.
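
The opposite current injection protocol and the 128-measurement count can be illustrated with a short sketch; the 16-electrode ring is inferred here from the figure of 128 voltage data for eight patterns, not stated explicitly in the abstract.

```python
# Opposite (polar) current injection on an electrode ring: electrode k sources
# the current and electrode k + N/2 sinks it. With N = 16 electrodes there are
# 8 distinct opposite pairs, and reading the potential on all 16 electrodes per
# pattern gives the 128 boundary voltage data mentioned above.
N = 16

def opposite_patterns(n_electrodes):
    half = n_electrodes // 2
    return [(k, k + half) for k in range(half)]

patterns = opposite_patterns(N)
print(patterns)              # [(0, 8), (1, 9), ..., (7, 15)]
print(len(patterns) * N)     # 128 voltage readings in total
```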

Relevance: 20.00%

Abstract:

In this paper we present a cache coherence protocol for multistage interconnection network (MIN)-based multiprocessors with two distinct private caches: private-blocks caches (PCache) containing blocks private to a process and shared-blocks caches (SCache) containing data accessible by all processes. The architecture is extended by a coherence control bus connecting all shared-block cache controllers. Timing problems due to variable transit delays through the MIN are dealt with by introducing Transient states in the proposed cache coherence protocol. The impact of the coherence protocol on system performance is evaluated through a performance study of three phases. Assuming homogeneity of all nodes, a single-node queuing model (phase 3) is developed to analyze system performance. This model is solved for processor and coherence bus utilizations using the mean value analysis (MVA) technique with shared-blocks steady state probabilities (phase 1) and communication delays (phase 2) as input parameters. The performance of our system is compared to that of a system with an equivalent-sized unified cache and with a multiprocessor implementing a directory-based coherence protocol. System performance measures are verified through simulation.
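
A minimal sketch of the exact single-class Mean Value Analysis recursion that a phase-3 single-node model of this kind typically relies on; the two stations and their service demands below are illustrative placeholders, not the paper's parameters.

```python
def mva(demands, n_customers):
    """Exact single-class Mean Value Analysis for a closed network of FCFS
    single-server stations.

    demands     : service demands D_k (visit ratio x mean service time)
    n_customers : number of circulating customers (e.g. processors)
    Returns (throughput X, per-station utilizations X * D_k).
    """
    K = len(demands)
    Q = [0.0] * K                                           # mean queue lengths
    X = 0.0
    for n in range(1, n_customers + 1):
        R = [demands[k] * (1.0 + Q[k]) for k in range(K)]   # residence times
        X = n / sum(R)                                      # system throughput
        Q = [X * r for r in R]                              # Little's law
    return X, [X * d for d in demands]

# Illustrative two-station use: station 0 = processor, station 1 = coherence bus.
X, util = mva(demands=[1.0, 0.3], n_customers=8)
print(X, util)
```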

Relevance: 20.00%

Abstract:

Multisensor recordings are becoming commonplace. When studying functional connectivity between different brain areas using such recordings, one defines regions of interest, and each region of interest is often characterized by a set (block) of time series. Presently, for two such regions, the interdependence is typically computed by estimating the ordinary coherence for each pair of individual time series and then summing or averaging the results over all such pairs of channels (one from block 1 and the other from block 2). The aim of this paper is to generalize the concept of coherence so that it can be computed for two blocks of non-overlapping time series. This quantity, called block coherence, is first shown mathematically to have properties similar to those of ordinary coherence, and then applied to analyze local field potential recordings from a monkey performing a visuomotor task. It is found that an increase in block coherence between the channels from the V4 region and the channels from the prefrontal region in the beta band leads to a decrease in response time.
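
The conventional pairwise procedure that block coherence generalizes can be sketched as follows, using SciPy's ordinary magnitude-squared coherence; the sampling rate, segment length, and channel counts are illustrative only.

```python
import numpy as np
from scipy.signal import coherence

def pairwise_mean_coherence(block1, block2, fs=1000.0, nperseg=256):
    """block1: (n1, T) and block2: (n2, T) arrays of time series.
    Returns the frequency axis and the ordinary coherence averaged over all
    n1 * n2 cross-block channel pairs."""
    cohs = []
    for x in block1:
        for y in block2:
            f, cxy = coherence(x, y, fs=fs, nperseg=nperseg)
            cohs.append(cxy)
    return f, np.mean(cohs, axis=0)

rng = np.random.default_rng(0)
b1 = rng.standard_normal((4, 4096))   # e.g. channels over the V4 region
b2 = rng.standard_normal((3, 4096))   # e.g. channels over the prefrontal region
f, c_mean = pairwise_mean_coherence(b1, b2)
```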

Relevance: 20.00%

Abstract:

Let $\mathbb{D}$ denote the open unit disk in $\mathbb{C}$ centered at $0$. Let $H^{\infty}_{\mathbb{R}}$ denote the set of all bounded holomorphic functions defined on $\mathbb{D}$ that also satisfy $f(z) = \overline{f(\bar{z})}$ for all $z \in \mathbb{D}$. It is shown that $H^{\infty}_{\mathbb{R}}$ is a coherent ring.
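
For readability, the setting can be restated in LaTeX; the definition of ring coherence recorded in the comment is the standard one and is assumed here rather than quoted from the abstract.

```latex
% The algebra in question, restated:
\[
  H^{\infty}_{\mathbb{R}}
    \;=\; \bigl\{\, f \in H^{\infty}(\mathbb{D}) \;:\;
          f(z) = \overline{f(\bar{z})} \ \text{for all } z \in \mathbb{D} \,\bigr\}.
\]
% Coherence is meant in the standard ring-theoretic sense: a commutative ring
% is coherent if every finitely generated ideal is finitely presented.
```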

Relevance: 20.00%

Abstract:

We describe here two non-interferometric methods for the estimation of the phase of transmitted wavefronts through refracting objects. The phase of the wavefronts obtained is used to reconstruct either the refractive index distribution of the objects or their contours. Refraction corrected reconstructions are obtained by the application of an iterative loop incorporating digital ray tracing for forward propagation and a modified filtered back projection (FBP) for reconstruction. The FBP is modified to take into account non-straight path propagation of light through the object. When the iteration stagnates, the difference between the projection data and an estimate of it obtained by ray tracing through the final reconstruction is reconstructed using a diffraction tomography algorithm. The reconstruction so obtained, viewed as a correction term, is added to the estimate of the object from the loop to obtain an improved final refractive index reconstruction.
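
The iterative loop can be outlined as below. The three operators are deliberately left as user-supplied callables, since the paper's digital ray tracer, bent-ray modified FBP, and diffraction tomography step are not specified in the abstract; the stagnation test and additive update are likewise simplifying assumptions.

```python
import numpy as np

def refraction_corrected_reconstruction(measured, forward_op, fbp_op, dt_op,
                                        n_iter=20, tol=1e-4):
    """Iterative refraction-corrected reconstruction, written against
    user-supplied operators.

    measured   : measured phase projection data (array)
    forward_op : callable, ray-traces projections through a refractive index map
    fbp_op     : callable, modified (bent-ray) filtered back projection
    dt_op      : callable, diffraction tomography reconstruction of a residual
    """
    estimate = fbp_op(measured)                 # initial reconstruction
    prev_misfit = np.inf
    for _ in range(n_iter):
        simulated = forward_op(estimate)        # forward propagation by ray tracing
        residual = measured - simulated
        misfit = float(np.linalg.norm(residual))
        if abs(prev_misfit - misfit) < tol:     # iteration has stagnated
            return estimate + dt_op(residual)   # add diffraction-tomography correction
        estimate = estimate + fbp_op(residual)  # refine with modified FBP of residual
        prev_misfit = misfit
    return estimate
```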

Relevance: 20.00%

Abstract:

Two methods based on wavelet/wavelet packet expansion to denoise and compress optical tomography data containing scattered noise are presented. In the first, the wavelet expansion coefficients of the noisy data are shrunk using a soft threshold. In the second, the data are expanded into a wavelet packet tree upon which a best-basis search is performed; the resulting coefficients are truncated on the basis of energy content. The first method efficiently denoises experimental data when the scattering particle density in the medium surrounding the object is up to 12.0 × 10^6 per cm^3, and achieves a compression ratio of approximately 8:1. The wavelet packet based method results in compression of up to 11:1 and also exhibits reasonable noise reduction capability. Tomographic reconstructions obtained from the denoised data are presented. (C) 1999 Published by Elsevier Science B.V. All rights reserved.
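
The first (soft-thresholding) method might look roughly like this with PyWavelets; the wavelet family, decomposition level, and universal threshold are illustrative choices rather than the ones used in the paper.

```python
import numpy as np
import pywt

def wavelet_soft_denoise(signal, wavelet="db4", level=4):
    """Shrink the wavelet coefficients of a noisy 1-D projection with a soft
    threshold and reconstruct the denoised signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest-scale detail coefficients,
    # followed by the universal threshold.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(shrunk, wavelet)
```

Compression follows from the same step: most shrunk coefficients are zero and need not be stored.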

Relevance: 20.00%

Abstract:

New metallurgical and ethnographic observations of the traditional manufacture of specular high-tin bronze mirrors in Kerala state, southern India, are discussed; this is an exceptional example of a surviving craft practice of metal mirror-making. The manufacturing process has been reconstructed from analytical investigations made by Srinivasan following a visit late in 1991 to a mirror-making workshop, and from her technical studies of equipment acquired by Glover in March 1992 from another group of mirror makers from Pathanamthita at an exhibition held at the Crafts Museum, Delhi. Finished and unfinished mirrors from the two workshops were of a binary copper-tin alloy of 33% tin, which is close to the composition of the pure delta phase, so that these mirrors are referred to here as 'delta' bronzes. For the first time, metallurgical and field observations were made by Srinivasan in 1991 of the manufacture of high-tin 'beta' bronze vessels from Palghat district, Kerala, i.e. of wrought and quenched 23% tin bronze. This has provided the first metallurgical record for a surviving craft of high-tin bronze bowl making that can be directly related to archaeological finds of high-tin bronze vessels from the Indian subcontinent and Southeast Asia. New analytical investigations are presented of high-tin beta bronzes from the Indian subcontinent, which are some of the earliest reported worldwide. These, coupled with the archaeometallurgical evidence, suggest that these high-tin bronze techniques are part of a long, continuing, and probably indigenous tradition of the use of high-tin bronzes in the Indian subcontinent, with finds reported even from Indus Valley sites. While the source of tin has been problematic, new evidence from bronze-smelting slags, together with literary evidence, suggests that there may have been sources of tin in South India.

Relevance: 20.00%

Abstract:

We explore a pseudodynamic form of the quadratic parameter update equation for diffuse optical tomographic reconstruction from noisy data. A few explicit and implicit strategies for obtaining the parameter updates via a semianalytical integration of the pseudodynamic equations are proposed. Despite the ill-posedness of the inverse problem associated with diffuse optical tomography, adoption of the quadratic update scheme combined with the pseudotime integration appears to yield not only faster convergence but also a muted sensitivity to the regularization parameters, which include the pseudotime step size for integration. These observations are validated through reconstructions with both numerically generated and experimentally acquired data. (C) 2011 Optical Society of America
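
A much-simplified explicit pseudotime sketch, with a regularized Gauss-Newton direction standing in for the paper's quadratic update; the residual and Jacobian callables, step size, and damping are assumptions for illustration only.

```python
import numpy as np

def pseudotime_reconstruction(p0, residual_fn, jacobian_fn,
                              dt=0.1, lam=1e-2, n_steps=100):
    """Explicit pseudotime integration of a regularized update equation,
    in the spirit of the pseudodynamic scheme described above.

    p0          : initial parameter vector (e.g. nodal absorption values)
    residual_fn : callable, p -> data misfit r(p)
    jacobian_fn : callable, p -> Jacobian J(p)
    dt          : pseudotime step size (one of the regularization knobs)
    """
    p = np.asarray(p0, dtype=float).copy()
    for _ in range(n_steps):
        r = residual_fn(p)
        J = jacobian_fn(p)
        H = J.T @ J + lam * np.eye(p.size)      # regularized Gauss-Newton Hessian
        dp_dt = -np.linalg.solve(H, J.T @ r)    # pseudodynamic "velocity"
        p = p + dt * dp_dt                      # explicit pseudotime step
    return p
```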

Relevance: 20.00%

Abstract:

We recast the reconstruction problem of diffuse optical tomography (DOT) in a pseudo-dynamical framework and develop a method to recover the optical parameters using particle filters, i.e., stochastic filters based on Monte Carlo simulations. In particular, we have implemented two such filters, viz. the bootstrap (BS) filter and the Gaussian-sum (GS) filter, and employed them to recover the optical absorption coefficient distribution from both numerically simulated and experimentally generated photon fluence data. Using either indicator functions or compactly supported continuous kernels to represent the unknown property distribution within the inhomogeneous inclusions, we have drastically reduced the number of parameters to be recovered and thus brought the overall computation time to within reasonable limits. Even though the GS filter outperformed the BS filter in terms of accuracy of reconstruction, both gave fairly accurate recovery of the height, radius, and location of the inclusions. Since the present filtering algorithms do not use derivatives, we could demonstrate accurate contrast recovery even in the middle of the object, where the usual deterministic algorithms perform poorly owing to the poor sensitivity of the measurements to the parameters there. Consistent with the fact that the DOT recovery, being ill posed, admits multiple solutions, both filters gave solutions that were verified to be admissible by the closeness of the data computed through them to the data used in the filtering step (either numerically simulated or experimentally generated). (C) 2011 Optical Society of America
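
A generic bootstrap particle filter over a low-dimensional parameter vector (e.g. inclusion height, radius, and location) might be organized as below; the random-walk proposal, Gaussian likelihood, and noise levels are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def bootstrap_filter(forward_model, measurements, prior_sampler,
                     n_particles=500, process_noise=0.01, meas_noise=0.05,
                     rng=None):
    """Generic bootstrap particle filter over a parameter vector.

    forward_model : callable, parameters -> predicted fluence data vector
    measurements  : iterable of measured data vectors (the filtering steps)
    prior_sampler : callable, n -> (n, n_params) array of prior samples
    """
    rng = rng or np.random.default_rng(0)
    particles = prior_sampler(n_particles)
    for y in measurements:
        # Predict: random-walk proposal on the parameters (pseudo-dynamics).
        particles = particles + process_noise * rng.standard_normal(particles.shape)
        # Weight: Gaussian likelihood of the measured data given each particle.
        pred = np.array([forward_model(p) for p in particles])
        err = np.linalg.norm(pred - y, axis=1)
        w = np.exp(-0.5 * (err / meas_noise) ** 2) + 1e-300
        w /= w.sum()
        # Resample (multinomial): the defining step of the bootstrap filter.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return particles.mean(axis=0), particles
```

Note that the filter only evaluates the forward model; no derivatives of it are required, which is the property highlighted above.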

Relevance: 20.00%

Abstract:

We address a certain inverse problem in ultrasound-modulated optical tomography: the recovery of the amplitude of vibration of scatterers [p(r)] in the ultrasound focal volume in a diffusive object from boundary measurement of the modulation depth (M) of the amplitude autocorrelation of light [phi(r, tau)] traversing through it. Since M is dependent on the stiffness of the material, this is the precursor to elasticity imaging. The propagation of phi(r, tau) is described by a diffusion equation from which we have derived a nonlinear perturbation equation connecting p(r) and refractive index modulation [Delta n(r)] in the region of interest to M measured on the boundary. The nonlinear perturbation equation and its approximate linear counterpart are solved for the recovery of p(r). The numerical results reveal regions of different stiffness, proving that the present method recovers p(r) with reasonable quantitative accuracy and spatial resolution. (C) 2011 Optical Society of America
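
Solving the approximate linear counterpart of the perturbation equation amounts to a regularized linear inversion; the sketch below assumes a precomputed sensitivity matrix relating the boundary modulation depth to p(r), which the abstract does not specify.

```python
import numpy as np

def recover_vibration_amplitude(S, delta_M, lam=1e-3):
    """Tikhonov-regularized solution of the linearized perturbation equation.

    S       : (n_meas, n_voxels) linearized sensitivity of the boundary
              modulation depth M to the vibration amplitude p(r) in the
              ultrasound focal volume
    delta_M : (n_meas,) measured change in modulation depth
    Returns a regularized estimate of p(r) on the voxel grid.
    """
    A = S.T @ S + lam * np.eye(S.shape[1])
    return np.linalg.solve(A, S.T @ delta_M)
```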