983 results for Dark objects method


Relevance: 20.00%

Publisher:

Abstract:

Differentiation of various types of soft tissue is of high importance in medical imaging, because changes in soft tissue structure are often associated with pathologies such as cancer. However, the densities of different soft tissues may be very similar, making it difficult to distinguish them in absorption images. This is especially true when patient dose considerations limit the available signal-to-noise ratio. Refraction is more sensitive than absorption to changes in density, while small-angle x-ray scattering contains information about the macromolecular structure of the tissues. Both can be used as potential sources of contrast when soft tissues are imaged, but little is known about the visibility of these signals in realistic imaging situations. In this work, the visibility of small-angle scattering and refraction in the context of medical imaging is studied using computational methods. The work focuses on analyzer-based imaging, where the information about the sample is recorded in the rocking curve of the analyzer crystal. Computational phantoms based on simple geometrical shapes with differing material properties are used. The objects have realistic dimensions and attenuation properties that could be encountered in real imaging situations, and their scattering properties mimic various features of measured small-angle scattering curves. Ray-tracing methods are used to calculate the refraction and attenuation of the beam, and a scattering halo is accumulated, including the effect of multiple scattering. The changes in the shape of the rocking curve are analyzed with different methods, including diffraction enhanced imaging (DEI), extended DEI (E-DEI) and multiple image radiography (MIR). A wide-angle DEI, called W-DEI, is introduced and its performance is compared with that of the established methods. The results indicate that the differences in scattered intensities from healthy and malignant breast tissues are distinguishable to some extent with a reasonable dose. In particular, the fraction of total scattering differs enough between tissues to serve as a useful source of contrast. The peaks related to the macromolecular structure appear at rather large angles and have intensities that are only a small fraction of the total scattered intensity, so such peaks seem to have only limited usefulness in medical imaging. It is also found that W-DEI performs rather well when most of the intensity remains in the direct beam, indicating that dark-field imaging methods may produce the best results when scattering is weak. Altogether, it is found that the analysis of scattered intensity is a viable option even in medical imaging, where the patient dose is the limiting factor.
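As an illustration of the kind of rocking-curve analysis referred to above, the standard two-image DEI decomposition (Chapman et al.) solves, pixel by pixel, the linear relation I = I_R [R(θ) + R'(θ) Δθ] for the apparent absorption I_R and the refraction angle Δθ. A minimal sketch, assuming two pre-registered images and known rocking-curve values (the array names are placeholders, not the thesis code):

```python
import numpy as np

def dei_decompose(I_low, I_high, R_low, R_high, dR_low, dR_high):
    """Two-image DEI decomposition (standard Chapman-type analysis).

    I_low, I_high   : images recorded on the low- and high-angle slopes
                      of the analyzer rocking curve (NumPy arrays).
    R_low, R_high   : rocking-curve reflectivities at those two angles.
    dR_low, dR_high : rocking-curve slopes dR/dtheta at the same angles.

    Each pixel obeys I = I_R * (R(theta) + dR/dtheta(theta) * dtheta),
    so the two measurements give the apparent absorption I_R and the
    refraction angle dtheta by solving a 2x2 linear system per pixel.
    """
    denom = I_low * dR_high - I_high * dR_low
    apparent_absorption = denom / (R_low * dR_high - R_high * dR_low)
    refraction_angle = (I_high * R_low - I_low * R_high) / denom
    return apparent_absorption, refraction_angle
```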

Relevance: 20.00%

Publisher:

Abstract:

The aim of this study was to develop and trial a method to monitor the evolution of clinical reasoning in a PBL curriculum that is suitable for use in a large medical school. Termed Clinical Reasoning Problems (CRPs), it is based on the notion that clinical reasoning is dependent on the identification and correct interpretation of certain critical clinical features. Each problem consists of a clinical scenario comprising presentation, history and physical examination. Based on this information, subjects are asked to nominate the two most likely diagnoses and to list the clinical features that they considered in formulating their diagnoses, indicating whether these features supported or opposed the nominated diagnoses. Students at different levels of medical training completed a set of 10 CRPs as well as the Diagnostic Thinking Inventory, a self-reporting questionnaire designed to assess reasoning style. Responses were scored against those of a reference group of general practitioners. Results indicate that the CRPs are an easily administered, reliable and valid assessment of clinical reasoning, able to successfully monitor its development throughout medical training. Consequently, they can be employed to assess clinical reasoning skill in individual students and to evaluate the success of undergraduate medical schools in providing effective tuition in clinical reasoning.
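The abstract does not give the scoring rule; the snippet below is only a hypothetical illustration of how one CRP response could be scored against the general-practitioner reference group, with the data layout and normalization being assumptions made for the example:

```python
def score_crp(student_features, reference_features):
    """Hypothetical scoring of one Clinical Reasoning Problem response.

    student_features and reference_features map a clinical feature name
    to +1 (supports the nominated diagnosis) or -1 (opposes it).  A
    feature earns credit only if the reference group of general
    practitioners listed it with the same polarity; unlisted or
    reversed features score nothing.
    """
    matched = sum(
        1 for feature, polarity in student_features.items()
        if reference_features.get(feature) == polarity
    )
    return matched / max(len(reference_features), 1)  # normalize to [0, 1]

# Example: two of the three reference features matched -> score ~0.67
student = {"fever": +1, "weight loss": +1, "normal ECG": -1}
reference = {"fever": +1, "weight loss": +1, "night sweats": +1}
print(score_crp(student, reference))
```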

Relevance: 20.00%

Publisher:

Abstract:

We establish a unified model to explain the quasi-periodic oscillations (QPOs) observed from black hole and neutron star systems. The model treats the accreting systems as damped harmonic oscillators with higher-order nonlinearity. It explains multiple QPO properties in parallel, independent of the nature of the compact object, and describes QPOs successfully for several compact sources. Based on it, we predict the spin frequency of the neutron star in Sco X-1 and the specific angular momenta of the black holes GRO J1655-40 and GRS 1915+105.
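The class of models invoked here can be illustrated with a driven, damped oscillator carrying a cubic (Duffing-type) nonlinearity; the specific nonlinearity and coefficients used by the authors are not given in the abstract, so everything below is a generic toy example:

```python
import numpy as np
from scipy.integrate import solve_ivp

def nonlinear_damped_oscillator(t, y, omega0=1.0, damping=0.05,
                                beta=0.2, f0=0.1, omega_d=0.9):
    """x'' + 2*damping*x' + omega0^2*x + beta*x^3 = f0*cos(omega_d*t)

    A driven, damped oscillator with a cubic nonlinearity, used only to
    illustrate the class of models the abstract invokes; the
    coefficients are arbitrary placeholders.
    """
    x, v = y
    return [v, -2.0 * damping * v - omega0**2 * x - beta * x**3
               + f0 * np.cos(omega_d * t)]

t_eval = np.linspace(0.0, 500.0, 20000)
sol = solve_ivp(nonlinear_damped_oscillator, (0.0, 500.0), [0.1, 0.0],
                t_eval=t_eval)
x = sol.y[0]
# Peaks in this power spectrum play the role of QPO frequencies in the
# toy model.
power = np.abs(np.fft.rfft(x - x.mean()))**2
freqs = np.fft.rfftfreq(x.size, d=t_eval[1] - t_eval[0])
```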

Relevance: 20.00%

Publisher:

Abstract:

α-Manganese dioxide is synthesized in a microemulsion medium by a redox reaction between KMnO4 and MnSO4 in the presence of sodium dodecyl sulphate as a surface-active agent. The morphology of the MnO2 resembles nanopetals, which are spread parallel to the field. The material is further characterized by powder X-ray diffraction, energy-dispersive X-ray analysis, and Brunauer–Emmett–Teller surface-area measurements. The supercapacitance of the α-MnO2 nanopetals is studied by cyclic voltammetry and galvanostatic charge–discharge cycling, and high values of specific capacitance are obtained.
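For reference, the specific capacitance quoted from galvanostatic charge–discharge measurements is conventionally obtained from the discharge branch as

\[
C_s = \frac{I\,\Delta t}{m\,\Delta V},
\]

where I is the discharge current, Δt the discharge time, m the mass of active material and ΔV the potential window; the abstract does not state the exact expression used, so this is given only as the standard definition.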

Relevance: 20.00%

Publisher:

Abstract:

An efficient and statistically robust solution for the identification of asteroids among numerous sets of astrometry is presented. In particular, numerical methods have been developed for the short-term identification of asteroids at discovery, and for the long-term identification of scarcely observed asteroids over apparitions, a task which has been lacking a robust method until now. The methods are based on the solid foundation of statistical orbital inversion properly taking into account the observational uncertainties, which allows for the detection of practically all correct identifications. Through the use of dimensionality-reduction techniques and efficient data structures, the exact methods have log-linear, that is, O(n log n), computational complexity, where n is the number of included observation sets. The methods developed are thus suitable for future large-scale surveys which anticipate a substantial increase in the astrometric data rate. Due to the discontinuous nature of asteroid astrometry, separate sets of astrometry must be linked to a common asteroid from the very first discovery detections onwards. The reason for the discontinuity in the observed positions is the rotation of the observer with the Earth as well as the motion of the asteroid and the observer about the Sun. Therefore, the aim of identification is to find a set of orbital elements that reproduce the observed positions with residuals similar to the inevitable observational uncertainty. Unless the astrometric observation sets are linked, the corresponding asteroid is eventually lost as the uncertainty of the predicted positions grows too large to allow successful follow-up. Whereas the presented identification theory and the numerical comparison algorithm are generally applicable, that is, also in fields other than astronomy (e.g., in the identification of space debris), the numerical methods developed for asteroid identification can immediately be applied to all objects on heliocentric orbits with negligible effects due to non-gravitational forces in the time frame of the analysis. The methods developed have been successfully applied to various identification problems. Simulations have shown that the methods developed are able to find virtually all correct linkages despite challenges such as numerous scarce observation sets, astrometric uncertainty, numerous objects confined to a limited region on the celestial sphere, long linking intervals, and substantial parallaxes. Tens of previously unknown main-belt asteroids have been identified with the short-term method in a preliminary study to locate asteroids among numerous unidentified sets of single-night astrometry of moving objects, and scarce astrometry obtained nearly simultaneously with Earth-based and space-based telescopes has been successfully linked despite a substantial parallax. Using the long-term method, thousands of realistic 3-linkages typically spanning several apparitions have so far been found among designated observation sets each spanning less than 48 hours.
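The thesis's algorithms are not reproduced in the abstract; the sketch below only illustrates how reducing each observation set to a low-dimensional key and querying a spatial tree keeps the pairwise comparison cost near O(n log n) rather than the O(n²) of comparing every pair directly. The two-dimensional keys and the linking radius are placeholders for whatever reduced representation the actual method uses:

```python
import numpy as np
from scipy.spatial import cKDTree

def find_candidate_linkages(keys, radius):
    """Return candidate pairs of observation sets whose reduced keys lie
    within `radius` of each other.

    keys   : (n, d) array, one low-dimensional point per observation set
             (the dimensionality reduction itself is method-specific).
    radius : linking tolerance in the reduced space.

    Building the tree costs O(n log n) and each lookup is logarithmic on
    average, so the comparison stays log-linear instead of quadratic.
    """
    tree = cKDTree(keys)
    return tree.query_pairs(r=radius)

# Toy usage with random keys standing in for reduced orbital information.
rng = np.random.default_rng(0)
pairs = find_candidate_linkages(rng.random((1000, 2)), radius=0.01)
```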

Relevance: 20.00%

Publisher:

Abstract:

Scene understanding has been investigated mainly from the point of view of visual information. Recently, depth has provided an extra wealth of information, allowing more geometric knowledge to be fused into scene understanding. Yet to form a holistic view, especially in robotic applications, one can create even more data by interacting with the world. In fact, humans, while growing up, seem to investigate the world around them heavily through haptic exploration. We show an application of haptic exploration on a humanoid robot in cooperation with a learning method for object segmentation. The actions performed consecutively improve the segmentation of objects in the scene.

Relevance: 20.00%

Publisher:

Abstract:

We evaluated trained listener-based acoustic sampling as a reliable and non-invasive method for rapid assessment of ensiferan species diversity in tropical evergreen forests. This was done by evaluating the reliability of identification of species and numbers of calling individuals using psychoacoustic experiments in the laboratory and by comparing psychoacoustic sampling in the field with ambient noise recordings made at the same time. The reliability of correct species identification by the trained listener was 100% for 16 out of 20 species tested in the laboratory. The reliability of identifying the numbers of individuals correctly was 100% for 13 out of 20 species. The human listener performed slightly better than the instrument in detecting low frequency and broadband calls in the field, whereas the recorder detected high frequency calls with greater probability. To address the problem of pseudoreplication during spot sampling in the field, we monitored the movement of calling individuals using focal animal sampling. The average distance moved by calling individuals for 17 out of 20 species was less than 1.5 m in half an hour. We suggest that trained listener-based sampling is preferable for crickets and low frequency katydids, whereas broadband recorders are preferable for katydid species with high frequency calls for accurate estimation of ensiferan species richness and relative abundance in an area.

Relevance: 20.00%

Publisher:

Abstract:

Objective: To describe the prevalence and demographic, clinical and functional correlates of childhood trauma in patients attending early psychosis clinics. Method: Participants were recruited from outpatients attending four early psychosis services. Exposure to childhood trauma was assessed using the Childhood Trauma Questionnaire (CTQ). Psychopathology was measured using the Positive and Negative Syndrome Scale and the Depression, Anxiety and Stress Scale. Social and vocational functioning and substance use were also assessed. Results: Over three-quarters of the 100 patients reported exposure to any childhood trauma. Emotional, physical and sexual abuse were reported by 54%, 23% and 28% of patients, respectively, while 49% and 42% of patients reported emotional and physical neglect, respectively. Female participants were significantly more likely to be exposed to emotional and sexual abuse. Exposure to childhood trauma was correlated with positive psychotic symptoms and higher levels of depressive, anxiety and stress symptoms; however, it had no impact on social or vocational functioning or recent substance use. Conclusion: Exposure to childhood trauma was common in patients with early psychosis, and associated with increased symptomatology. Existing recommendations that standard clinical assessment of patients with early psychosis should include inquiry into exposure to childhood trauma are supported.

Relevance: 20.00%

Publisher:

Abstract:

This paper reports a numerical method for modelling elastic wave propagation in plates. The method is based on the partition of unity approach, in which the approximate spectral properties of the infinite-dimensional system are embedded within the space of a conventional finite element method through a consistent technique of waveform enrichment. The technique is general, in that it can be applied to the Lagrangian family of finite elements with specific waveform enrichment schemes, depending on the dominant modes of wave propagation in the physical system. A four-noded element for the Reissner-Mindlin plate is derived in this paper, which is free of shear locking. This locking-free property is achieved by removing the transverse displacement degrees of freedom from the element nodal variables and by recovering them through a line integral and a weak constraint in the frequency domain. As a result, a frequency-dependent stiffness matrix and mass matrix are obtained, which accurately capture the higher-frequency response even with coarse meshes. The steps involved in the numerical implementation of such an element are discussed in detail. Numerical studies on the performance of the proposed element are reported for a number of cases, which show very good accuracy and low computational cost. Copyright (C) 2006 John Wiley & Sons, Ltd.
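In generic one-dimensional form, the partition-of-unity waveform enrichment referred to above augments the conventional shape functions N_i(x) with the dominant wave modes,

\[
u_h(x) \;=\; \sum_i N_i(x)\Big( a_i \;+\; \sum_j b_{ij}\, e^{\mathrm{i} k_j x} \Big),
\]

where a_i are the usual nodal unknowns, the exponentials are the enrichment waveforms with wavenumbers k_j chosen from the dominant propagation modes, and b_ij are the additional enrichment amplitudes; the specific enrichment used for the Reissner–Mindlin element in the paper is not reproduced here.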

Relevance: 20.00%

Publisher:

Abstract:

New stars form in dense interstellar clouds of gas and dust called molecular clouds. The actual sites where the process of star formation takes place are the dense clumps and cores deeply embedded in molecular clouds. The details of the star formation process are complex and not completely understood. Thus, determining the physical and chemical properties of molecular cloud cores is necessary for a better understanding of how stars are formed. Some of the main features of the origin of low-mass stars, like the Sun, are already relatively well-known, though many details of the process are still under debate. The mechanism through which high-mass stars form, on the other hand, is poorly understood. Although it is likely that the formation of high-mass stars shares many properties similar to those of low-mass stars, the very first steps of the evolutionary sequence are unclear. Observational studies of star formation are carried out particularly at infrared, submillimetre, millimetre, and radio wavelengths. Much of our knowledge about the early stages of star formation in our Milky Way galaxy is obtained through molecular spectral line and dust continuum observations. The continuum emission of cold dust is one of the best tracers of the column density of molecular hydrogen, the main constituent of molecular clouds. Consequently, dust continuum observations provide a powerful tool to map large portions across molecular clouds, and to identify the dense star-forming sites within them. Molecular line observations, on the other hand, provide information on the gas kinematics and temperature. Together, these two observational tools provide an efficient way to study the dense interstellar gas and the associated dust that form new stars. The properties of highly obscured young stars can be further examined through radio continuum observations at centimetre wavelengths. For example, radio continuum emission carries useful information on conditions in the protostar+disk interaction region where protostellar jets are launched. In this PhD thesis, we study the physical and chemical properties of dense clumps and cores in both low- and high-mass star-forming regions. The sources are mainly studied in a statistical sense, but also in more detail. In this way, we are able to examine the general characteristics of the early stages of star formation, cloud properties on large scales (such as fragmentation), and some of the initial conditions of the collapse process that leads to the formation of a star. The studies presented in this thesis are mainly based on molecular line and dust continuum observations. These are combined with archival observations at infrared wavelengths in order to study the protostellar content of the cloud cores. In addition, centimetre radio continuum emission from young stellar objects (YSOs; i.e., protostars and pre-main sequence stars) is studied in this thesis to determine their evolutionary stages. 
The main results of this thesis are as follows: i) filamentary and sheet-like molecular cloud structures, such as infrared dark clouds (IRDCs), are likely to be caused by supersonic turbulence but their fragmentation at the scale of cores could be due to gravo-thermal instability; ii) the core evolution in the Orion B9 star-forming region appears to be dynamic and the role played by slow ambipolar diffusion in the formation and collapse of the cores may not be significant; iii) the study of the R CrA star-forming region suggests that the centimetre radio emission properties of a YSO are likely to change with its evolutionary stage; iv) the IRDC G304.74+01.32 contains candidate high-mass starless cores which may represent the very first steps of high-mass star and star cluster formation; v) SiO outflow signatures are seen in several high-mass star-forming regions which suggest that high-mass stars form in a similar way as their low-mass counterparts, i.e., via disk accretion. The results presented in this thesis provide constraints on the initial conditions and early stages of both low- and high-mass star formation. In particular, this thesis presents several observational results on the early stages of clustered star formation, which is the dominant mode of star formation in our Galaxy.
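For context, the dust-continuum column-density estimate mentioned above is conventionally based on the optically thin relation

\[
N(\mathrm{H_2}) \;=\; \frac{S_\nu^{\mathrm{beam}}}{\Omega_{\mathrm{beam}}\,\mu_{\mathrm{H_2}}\, m_{\mathrm{H}}\,\kappa_\nu\, B_\nu(T_{\mathrm{dust}})},
\]

where S_ν^beam is the flux density per beam, Ω_beam the beam solid angle, μ_H2 ≈ 2.8 the mean molecular weight per H2 molecule, m_H the mass of the hydrogen atom, κ_ν the dust opacity per unit mass and B_ν(T_dust) the Planck function at the dust temperature; the thesis's exact assumptions (opacity law, dust-to-gas ratio) are not stated in the abstract.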

Relevance: 20.00%

Publisher:

Abstract:

In this paper, an approach for automatic road extraction in an urban region using structural, spectral and geometric characteristics of roads is presented. Roads are extracted in two stages: pre-processing and road extraction. Initially, the image is pre-processed to improve tolerance by reducing clutter (which mostly represents buildings, parking lots, vegetation regions and other open spaces). The road segments are then extracted using Texture Progressive Analysis (TPA) and the Normalized cut algorithm. The TPA technique uses binary segmentation based on three levels of texture statistical evaluation to extract road segments, whereas the Normalized cut method for road extraction is a graph-based method that generates an optimal partition of road segments. The performance (quality measures) of road extraction using TPA and the Normalized cut method is compared. The experimental results show that the Normalized cut method is more efficient in extracting road segments in urban regions from high-resolution satellite images.
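A minimal sketch of the Shi–Malik normalized cut bipartition underlying the graph-based extraction step; how the affinity matrix is built from the texture and spectral features of the image is method-specific and not reproduced here:

```python
import numpy as np
from scipy.linalg import eigh

def normalized_cut_bipartition(W):
    """One level of the normalized-cut graph partition (Shi & Malik).

    W : (n, n) symmetric affinity matrix between image regions/pixels.

    Solves the generalized eigenproblem (D - W) y = lambda * D y and
    thresholds the second-smallest eigenvector to split the graph into
    two segments with a small normalized-cut value.
    """
    d = W.sum(axis=1)
    D = np.diag(d)
    eigvals, eigvecs = eigh(D - W, D)        # generalized symmetric problem
    fiedler = eigvecs[:, 1]                  # second-smallest eigenvector
    return fiedler > np.median(fiedler)      # boolean partition of the nodes

# Toy usage: two weakly connected clusters of four nodes each.
W = np.kron(np.eye(2), np.ones((4, 4))) + 0.01
np.fill_diagonal(W, 0.0)
print(normalized_cut_bipartition(W))
```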

Relevance: 20.00%

Publisher:

Abstract:

Acceleration of the universe has been established but not explained. During the past few years, precise cosmological experiments have confirmed the standard big bang scenario of a flat universe undergoing an inflationary expansion in its earliest stages, where the perturbations are generated that eventually form into galaxies and other structure in matter, most of which is non-baryonic dark matter. Curiously, the universe has presently entered into another period of acceleration. Such a result is inferred from observations of extra-galactic supernovae and is independently supported by the cosmic microwave background radiation and large-scale structure data. It seems there is a positive cosmological constant speeding up the universal expansion of space. The vacuum energy density the constant describes should then be about a dozen times the present energy density in visible matter, but particle physics scales are enormously larger than that. This is the cosmological constant problem, perhaps the greatest mystery of contemporary cosmology. In this thesis we explore alternative agents of the acceleration, generically called dark energy. If some symmetry turns off vacuum energy, its value is not a problem, but one still needs some dark energy. Such could be a scalar field dynamically evolving in its potential, or some other exotic constituent exhibiting negative pressure. Another option is to assume that gravity at cosmological scales is not well described by general relativity. In a modified theory of gravity one might find the expansion rate increasing in a universe filled by just dark matter and baryons. Such possibilities are taken here under investigation. The main goal is to uncover observational consequences of different models of dark energy, the emphasis being on their implications for the formation of the large-scale structure of the universe. Possible properties of dark energy are investigated using phenomenological parameterizations, but several specific models are also considered in detail. Difficulties in unifying dark matter and dark energy into a single concept are pointed out. Considerable attention is devoted to modifications of gravity resulting in second-order field equations. It is shown that in a general class of such models the viable ones effectively represent the cosmological constant, while in another class one might find interesting modifications of the standard cosmological scenario that are still allowed by observations. The thesis consists of seven research papers preceded by an introductory discussion.
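A common example of the phenomenological parameterizations mentioned above (whether this particular form is the one used in the thesis is not stated) is the Chevallier–Polarski–Linder equation of state,

\[
w(a) = w_0 + w_a\,(1-a),
\]

which enters the expansion rate as

\[
H^2(a) = H_0^2\left[\Omega_m\,a^{-3} + \Omega_{\mathrm{DE}}\;a^{-3(1+w_0+w_a)}\,e^{-3 w_a (1-a)}\right],
\]

so that w_0 = -1, w_a = 0 recovers the cosmological constant.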

Relevance: 20.00%

Publisher:

Abstract:

Properties of nanoparticles are size dependent, and a model to predict particle size is of importance. Gold nanoparticles are commonly synthesized by reducing tetrachloroauric acid with trisodium citrate, a method pioneered by Turkevich et al. (Discuss. Faraday Soc. 1951, 11, 55). Data from several investigators that used this method show that when the ratio of initial concentrations of citrate to gold is varied from 0.4 to about 2, the final mean size of the particles formed varies by a factor of 7, while subsequent increases in the ratio hardly have any effect on the size. In this paper, a model is developed to explain this widely varying dependence. The steps that lead to the formation of particles are as follows: reduction of Au3+ in solution, disproportionation of Au+ to gold atoms and their nucleation, growth by disproportionation on the particle surface, and coagulation. Oxidation of citrate results in the formation of dicarboxy acetone, which aids nucleation but also decomposes into side products. A detailed kinetic model is developed on the basis of these steps and is combined with a population balance to predict the particle-size distribution. The model shows that, unlike the usual balance between nucleation and growth that determines particle size, it is the balance between the rate of nucleation and the degradation of dicarboxy acetone that determines the particle size in the citrate process. It is this feature that explains the unusual dependence of the mean particle size on the ratio of citrate to gold salt concentration. It is also found that coagulation plays an important role in determining the particle size at high concentrations of citrate.
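The competition described above, between nucleation promoted by dicarboxy acetone (DCA) and the degradation of DCA, can be caricatured with a toy two-variable rate model; the rate forms and constants below are purely illustrative assumptions, not the paper's kinetic model or population balance:

```python
import numpy as np
from scipy.integrate import solve_ivp

def toy_citrate_kinetics(t, y, k_nuc=5.0, k_deg=1.0):
    """y = [dca, n_particles]: toy competition between nucleation (which
    consumes the nucleation promoter DCA) and DCA degradation.
    Rate forms and constants are illustrative assumptions only."""
    dca, n = y
    nucleation = k_nuc * dca
    degradation = k_deg * dca
    return [-(nucleation + degradation), nucleation]

sol = solve_ivp(toy_citrate_kinetics, (0.0, 20.0), [1.0, 0.0],
                t_eval=np.linspace(0.0, 20.0, 200))
n_final = sol.y[1, -1]
# With the total reduced gold fixed, the mean diameter scales as
# n_final**(-1/3): faster DCA degradation (larger k_deg) -> fewer nuclei
# -> larger particles, mirroring the balance described in the abstract.
mean_diameter_rel = n_final ** (-1.0 / 3.0)
```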

Relevance: 20.00%

Publisher:

Abstract:

Knowledge of the physical properties of asteroids is crucial in many branches of solar-system research. Knowledge of the spin states and shapes is needed, e.g., for accurate orbit determination and for studying the history and evolution of the asteroids. In my thesis, I present new methods for using photometric lightcurves of asteroids in the determination of their spin states and shapes. The convex inversion method makes use of a general polyhedron shape model and provides us at best with an unambiguous spin solution and a convex shape solution that reproduces the main features of the original shape. Deriving information about non-convex shape features is, in principle, also possible, but usually requires a priori information about the object. Alternatively, a distribution of non-convex solutions, describing the scale of the non-convexities, can also be obtained. Due to an insufficient number of absolute observations and inaccurately defined asteroid phase curves, the c/b ratio, i.e., the flatness of the shape model, is often somewhat ill-defined. However, especially in the case of elongated objects, the flatness seems to be quite well constrained, even when only relative lightcurves are available. The results prove that it is, contrary to earlier belief, possible to derive shape information from lightcurve data if a sufficiently wide range of observing geometries is covered by the observations. Along with the more accurate shape models, the rotational states, i.e., spin vectors and rotation periods, are also determined with improved accuracy. The shape solutions obtained so far reveal a population of irregular objects whose most descriptive shape characteristics can, however, be expressed with only a few parameters. Preliminary statistical analyses of the shapes suggest that there are correlations between shape and other physical properties, such as the size, rotation period and taxonomic type of the asteroids. More shape data on, especially, the smallest and largest asteroids, as well as the fast and slow rotators, are needed in order to study the statistics more thoroughly.
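The convex inversion described above repeatedly evaluates a forward model of the disk-integrated brightness of a spinning polyhedron and compares it with the observed lightcurves. The sketch below computes that brightness for one observing geometry under a Lommel–Seeliger scattering law; the scattering law and optimization actually used in the thesis may differ:

```python
import numpy as np

def lightcurve_point(normals, areas, sun_dir, obs_dir):
    """Disk-integrated brightness of a polyhedral asteroid model for one
    observing geometry, using a Lommel-Seeliger scattering law.

    normals : (m, 3) outward unit normals of the facets
    areas   : (m,) facet areas
    sun_dir, obs_dir : unit vectors toward the Sun and the observer,
                       expressed in the asteroid-fixed frame.

    Only facets both illuminated (mu0 > 0) and visible (mu > 0)
    contribute; rotating sun_dir/obs_dir into the body frame as the
    asteroid spins and evaluating this sum over time yields a model
    lightcurve that the inversion compares with the observations.
    """
    mu0 = normals @ sun_dir          # cosine of the incidence angle
    mu = normals @ obs_dir           # cosine of the emission angle
    lit_and_seen = (mu0 > 0) & (mu > 0)
    s = (mu[lit_and_seen] * mu0[lit_and_seen]
         / (mu[lit_and_seen] + mu0[lit_and_seen]))
    return float(np.sum(s * areas[lit_and_seen]))
```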

Relevance: 20.00%

Publisher:

Abstract:

Instability in conventional haptic rendering destroys the perception of rigid objects in virtual environments. Inherent limitations in the conventional haptic loop restrict the maximum stiffness that can be rendered. In this paper we present a method to render virtual walls that are much stiffer than those achieved by conventional techniques. By removing the conventional digital haptic loop and replacing it with a part-continuous and part-discrete time hybrid haptic loop, we were able to render stiffer walls. The control loop is implemented as a combinational logic circuit on a field-programmable gate array. We compared the performance of the conventional haptic loop and our hybrid haptic loop on the same haptic device, and present a mathematical analysis showing the stability limit of our device. Our hybrid method removes the computation-intensive haptic loop from the CPU; this can free a significant amount of resources for other purposes such as graphical rendering and physics modeling. It is our hope that, in the future, similar designs will lead to a haptics processing unit (HPU).
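For context, the conventional digital haptic loop that the paper replaces renders a virtual wall roughly as sketched below. The well-known sampled-data passivity condition b > KT/2 + B (Colgate and Brown) ties the achievable stiffness K to the update period T, which is the limitation the hybrid hardware loop is designed to relax; all parameter values here are placeholders:

```python
def virtual_wall_force(position, velocity, wall_pos=0.0,
                       stiffness=2000.0, damping=5.0):
    """Conventional sampled virtual wall: when the probe penetrates the
    wall (position below wall_pos), apply a spring-damper reaction
    force; otherwise apply nothing.  Runs once per haptic servo tick;
    all values are illustrative."""
    penetration = wall_pos - position
    if penetration <= 0.0:
        return 0.0
    return stiffness * penetration - damping * velocity

# Passivity of the sampled wall requires roughly
#   b_device > stiffness * T / 2 + damping   (Colgate & Brown),
# so shortening the update period T raises the stiffness that can be
# rendered stably -- the motivation for moving the loop into hardware.
T = 0.001  # 1 kHz conventional servo period [s]
b_required = 2000.0 * T / 2 + 5.0
```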