994 results for Resolution strategies


Relevance:

30.00%

Publisher:

Abstract:

Electromagnetic tomography has been applied to problems in nondestructive evaluation, ground-penetrating radar, synthetic aperture radar, target identification, electrical well logging, medical imaging, etc. The problem of electromagnetic tomography involves the estimation of the cross-sectional distribution of dielectric permittivity, conductivity, etc., based on measurements of the scattered fields. The inverse scattering problem of electromagnetic imaging is highly nonlinear and ill-posed, and is liable to get trapped in local minima. The iterative solution techniques employed for computing the inverse scattering problem of electromagnetic imaging are highly computation-intensive. The solution of the electromagnetic imaging problem is thus beset with convergence and computational issues. This thesis attempts to develop methods that improve convergence and reduce the total computation for tomographic imaging of two-dimensional dielectric cylinders illuminated by TM-polarized waves, where the scattering problem is defined using scalar equations. A multiresolution frequency-hopping approach was proposed, as opposed to the conventional frequency-hopping approach employed to image large inhomogeneous scatterers. The strategy was tested on both synthetic and experimental data and gave results that were better localized; it also accelerated the iterative procedure employed for the imaging. A Degree of Symmetry formulation was introduced to locate the scatterer in the investigation domain when the scatterer cross section is circular. The investigation domain could thus be reduced, which reduced the degrees of freedom of the inverse scattering process; the entire measured scattered data was then available for the optimization of a smaller number of pixels. This resulted in better and more robust reconstructions of the scatterer's cross-sectional profile. The Degree of Symmetry formulation could also be applied to the practical problem of limited-angle tomography, as in the case of a buried pipeline, where the ill-posedness is much larger. The formulation was also tested using data generated from an experimental setup that was designed for this purpose. The experimental results confirmed the practical applicability of the formulation.
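The frequency-hopping idea lends itself to a compact illustration. Below is a minimal Python sketch, not from the thesis, of a multiresolution hopping loop: the reconstruction at each lower frequency seeds the inversion at the next, finer-resolved one. The function `solve_inverse` stands in for whatever iterative scattering solver is used and, like the grid-doubling assumption, is purely a placeholder.

```python
import numpy as np

def multires_frequency_hopping(frequencies, grid_sizes, solve_inverse):
    """Hypothetical sketch of multiresolution frequency hopping.

    frequencies   -- illumination frequencies in ascending order (Hz)
    grid_sizes    -- matching (rows, cols) reconstruction grids, coarse to fine;
                     each assumed an integer multiple of the previous one
    solve_inverse -- placeholder iterative solver: (freq, initial_guess) -> image
    """
    estimate = np.ones(grid_sizes[0])  # homogeneous background as first guess
    for freq, (rows, cols) in zip(frequencies, grid_sizes):
        ry, rx = rows // estimate.shape[0], cols // estimate.shape[1]
        if ry > 1 or rx > 1:
            # upsample the coarse result to seed the finer grid
            estimate = np.repeat(np.repeat(estimate, ry, axis=0), rx, axis=1)
        # low frequencies smooth out local minima; their result initializes
        # the better-resolved, more nonlinear high-frequency problem
        estimate = solve_inverse(freq, estimate)
    return estimate
```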

Relevance:

30.00%

Publisher:

Abstract:

Robots must act purposefully and successfully in an uncertain world. Sensory information is inaccurate or noisy, actions may have a range of effects, and the robot's environment is only partially and imprecisely modeled. This thesis introduces active randomization by a robot, both in selecting actions to execute and in focusing on sensory information to interpret, as a basic tool for overcoming uncertainty. An example of randomization is given by the strategy of shaking a bin containing a part in order to orient the part in a desired stable state with some high probability. Another example consists of first using reliable sensory information to bring two parts close together, then relying on short random motions to actually mate the two parts, once the part motions lie below the available sensing resolution. Further examples include tapping parts that are tightly wedged, twirling gears before trying to mesh them, and vibrating parts to facilitate a mating operation.
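As a toy illustration of the parts-mating example, the loop below keeps applying small random motions until the insertion succeeds, substituting randomness for feedback that lies below the sensing resolution. The primitives `try_insert` and `perturb` are hypothetical robot operations, not from the thesis.

```python
import random

def randomized_mate(try_insert, perturb, max_attempts=200):
    """Toy sketch: random motions replace precise sensing during part mating."""
    for attempt in range(max_attempts):
        if try_insert():                    # success detected, e.g., by force sensing
            return attempt                  # number of random motions that were needed
        perturb(random.uniform(-0.5, 0.5),  # millimetre-scale random offsets,
                random.uniform(-0.5, 0.5))  # below the available sensing resolution
    raise RuntimeError("mating failed; re-grasp and retry")
```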

Relevance:

30.00%

Publisher:

Abstract:

Three naming strategies are discussed that allow the processes of a distributed application to continue to be addressed by their original logical names throughout all the migrations they may be forced to undertake for performance-improvement reasons. A simple centralised solution is discussed first, which exhibits a software bottleneck as the number of processes increases; two other solutions are then considered, entailing different communication schemes and different communication overheads for the naming protocol. All these strategies rely on the facility that each process is allowed to survive after migration, even at its original site, solely to provide a forwarding service for communications that use its obsolete address.
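The forwarding facility can be pictured as a chain of pointers that a message follows until it reaches the site currently hosting the process. A minimal Python sketch follows; the per-site table layout is our assumption, not the paper's protocol.

```python
def resolve(name, site_tables, start_site):
    """Follow forwarding stubs left behind by a migrated process until the
    site actually hosting it is found; each hop costs one extra message."""
    site, hops = start_site, 0
    while True:
        kind, value = site_tables[site][name]
        if kind == "local":            # the process runs here
            return site, hops          # the sender can now cache the fresh address
        site, hops = value, hops + 1   # "forward": chase the pointer

# Example: p1 migrated from site A to site B, leaving a forwarding stub at A.
tables = {"A": {"p1": ("forward", "B")}, "B": {"p1": ("local", None)}}
print(resolve("p1", tables, "A"))      # -> ('B', 1)
```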

Relevance:

30.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, each with a computing capability of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data-analysis infrastructure. Such facilities will make it possible to determine what resolution, both horizontal and vertical, is necessary in atmospheric and ocean models for more confident predictions at the regional and local level. Current limits on computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data-analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.

Relevance:

30.00%

Publisher:

Abstract:

For first-order Horn clauses without equality, resolution is complete with an arbitrary selection of a single literal in each clause [dN 96]. Here we extend this result to the case of clauses with equality for superposition-based inference systems. Our result is a generalization of the result given in [BG 01]. We answer their question about the completeness of a superposition-based system for general clauses with an arbitrary selection strategy, provided there exists a refutation without applications of the factoring inference rule.
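To make the selection idea concrete, here is a standard toy refutation (our illustration, not taken from the paper): in each Horn clause exactly one literal is selected, and resolution acts only on selected literals.

```latex
% Clauses: C1 = \neg P(x) \vee Q(x),  C2 = P(a),  C3 = \neg Q(a).
% Arbitrarily select \neg P(x) in C1; unit clauses select their single literal.
% \Box denotes the empty clause (amssymb).
\begin{align*}
\frac{\neg P(x) \vee Q(x) \qquad P(a)}{Q(a)}
  &\quad \text{resolution, } \{x \mapsto a\}\\[4pt]
\frac{Q(a) \qquad \neg Q(a)}{\Box}
  &\quad \text{resolution}
\end{align*}
% Factoring never fires here: a Horn clause has at most one positive literal,
% matching the paper's proviso of refutations without factoring applications.
```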

Relevance:

30.00%

Publisher:

Abstract:

We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions, while IP connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise in the hydrograph – an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. We also expand our analysis by discussing the role of this approach for the efficient real-time measurement of stormwater systems.
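The control loop can be sketched in a few lines. The sketch below is ours, with an invented forecast URL and JSON layout; the node described in the paper runs an embedded flow model on a microcontroller rather than this simplified rate-of-change check.

```python
import json
import time
import urllib.request

BASE_INTERVAL = 3600   # seconds between samples in quiescent conditions
FAST_INTERVAL = 60     # seconds when a rapid rise is expected or observed

def rain_expected(forecast_url):
    """Poll a public forecast server for next-hour precipitation probability.
    The URL and JSON layout are placeholders, not from the paper."""
    with urllib.request.urlopen(forecast_url) as resp:
        return json.load(resp)["hourly"][0]["precip_prob"] > 0.5

def run_node(read_level, forecast_url):
    last = read_level()
    while True:
        level = read_level()
        rising = abs(level - last) > 0.05   # crude local state estimate
        last = level
        # sample fast around forecast or observed events, slowly otherwise
        interval = FAST_INTERVAL if rising or rain_expected(forecast_url) else BASE_INTERVAL
        time.sleep(interval)                # idle time is where energy is saved
```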

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an automatic methodology for road network extraction from medium- and high-resolution aerial images. It is based on two steps. In the first step, road seeds (i.e., road segments) are extracted using a set of four road objects and a set of connection rules among road objects. Each road object is a local representation of an approximately straight road fragment, and its construction is based on a combination of polygons describing all relevant image edges, according to rules embodying road knowledge. Each road seed is composed of a sequence of connected road objects, and each such sequence can be geometrically structured as a chain of contiguous quadrilaterals. In the second step, two strategies for road completion are applied in order to generate the complete road network. The first strategy is based on two basic perceptual grouping rules, the proximity and collinearity rules, which allow the sequential reconstruction of gaps between every pair of disconnected road segments. This strategy does not allow the reconstruction of road crossings, but it allows the extraction of road centerlines from the contiguous quadrilaterals representing connected road segments. The second strategy for road completion aims at reconstructing road crossings. First, the road centerlines are used to find reference points for road crossings, i.e., their approximate positions. These points are then used to extract polygons representing the contours of road crossings. This paper presents the proposed methodology and experimental results. © Pleiades Publishing, Inc. 2006.
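The proximity and collinearity grouping rules reduce to a simple geometric test on pairs of segments. The following sketch is our paraphrase of the idea; the segment representation and thresholds are illustrative assumptions, not values from the paper.

```python
import math

def should_link(seg_a, seg_b, max_gap=30.0, max_angle_deg=10.0):
    """Proximity + collinearity: link two road segments when their facing
    endpoints are close and their directions nearly agree.
    Segments are ((x1, y1), (x2, y2)) tuples in pixel coordinates."""
    (ax1, ay1), (ax2, ay2) = seg_a
    (bx1, by1), (bx2, by2) = seg_b
    # proximity rule: gap between the end of seg_a and the start of seg_b
    if math.hypot(bx1 - ax2, by1 - ay2) > max_gap:
        return False
    # collinearity rule: compare segment orientations
    ang_a = math.atan2(ay2 - ay1, ax2 - ax1)
    ang_b = math.atan2(by2 - by1, bx2 - bx1)
    diff = abs(ang_a - ang_b) % (2 * math.pi)
    diff = min(diff, 2 * math.pi - diff)    # wrap to [0, pi]
    return math.degrees(diff) <= max_angle_deg
```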

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Subjects with a ring scotoma can use two retinal loci, one foveal and one peripheral, for reading. Our aim was to investigate the relative use of both retinal loci as a function of the size of the spared foveal area and the spatial resolution at both retinal loci. FINDINGS: Two patients with Stargardt's disease and ring scotomas read, through a scanning laser ophthalmoscope, a series of letters and words at various character sizes. The number of fixations made using each retinal locus was quantified. The relative use of each retinal locus depended on the character size of the stimulus. Both patients used the eccentric retinal locus exclusively to read words of large character sizes. At small character sizes, the central retinal locus was predominantly used. When reading letters or words, once foveal fixation was used, patients did not shift back to the eccentric retinal locus. When spatial resolution allowed deciphering at both the eccentric and the central areas, patients consistently fixated with the eccentric retinal locus. CONCLUSIONS: Spatial resolution at the eccentric locus appears to be a determining factor in selecting the retinal area for reading. Reading strategies in patients with Stargardt's disease and a ring scotoma demonstrate a pattern of coordination of both eccentric and central retinal loci, reflecting a high degree of adaptation.

Relevance:

30.00%

Publisher:

Abstract:

Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable the biological function and are disassembled as the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the (better) design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structures of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail, where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic levels of detail. In this dissertation, several modeling methods are introduced that either integrate cryo-EM datasets with structural data from X-ray crystallography, or directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail required for a direct atomic interpretation, i.e., one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example, structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate structural data from different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; see manuscript III. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate an atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies that enable the integration of multiple component atomic structures with the cryo-EM map of their assembly.
Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions. The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting with the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces the evolutionary tabu search strategies applied to enable multi-body registration. This technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space. As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling proper docking in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions.
In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data, being visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70% and 100% as estimated in experimental test cases, i.e., 70-100% of the alpha helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data, and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
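The hybrid search at the heart of the second manuscript can be sketched generically. The skeleton below is our own paraphrase, not Sculptor's API: a genetic algorithm proposes joint placements of all components, and a tabu list of recently visited (discretized) placements keeps the search from cycling. All callables are user-supplied assumptions; `score` would, for example, cross-correlate the placed component structures with the cryo-EM map.

```python
import random
from collections import deque

def evolutionary_tabu_search(random_solution, crossover, mutate, score,
                             discretize, pop_size=50, generations=200,
                             tabu_size=100):
    """Sketch of an evolutionary tabu search for multi-body registration.
    A solution is a joint placement (one pose per component) in the map."""
    population = [random_solution() for _ in range(pop_size)]
    tabu = deque(maxlen=tabu_size)        # recently visited search regions
    best = max(population, key=score)
    for _ in range(generations):
        parents = sorted(population, key=score, reverse=True)[:pop_size // 2]
        children, attempts = [], 0
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = mutate(crossover(a, b))
            key = discretize(child)       # e.g., rounded pose parameters
            attempts += 1
            if key not in tabu or attempts > 10 * pop_size:
                tabu.append(key)          # ban this region for a while
                children.append(child)
        population = children
        best = max([best, *population], key=score)
    return best
```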

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose an innovative approach to the problem of traffic sign detection using a computer vision algorithm, taking into account real-time operation constraints and establishing intelligent strategies to simplify the algorithm's complexity as much as possible and to speed up the process. First, a set of candidates is generated by a color segmentation stage, followed by a region analysis strategy in which the spatial characteristics of previously detected objects are taken into account. Finally, temporal coherence is introduced by means of a tracking scheme, performed using a Kalman filter for each potential candidate. With regard to time constraints, efficiency is achieved in two ways. On the one hand, a multi-resolution strategy is adopted for segmentation: global operations are applied only to low-resolution images, and the resolution is increased to the maximum only when a potential road sign is being tracked. On the other hand, we take advantage of the expected spacing between traffic signs: the tracking of objects of interest allows the generation of inhibition areas, regions where no new traffic signs are expected to appear because a traffic sign already exists in the neighborhood. The proposed solution has been tested with real sequences in both urban areas and highways, and proved to achieve higher computational efficiency, especially as a result of the multi-resolution approach.
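A per-candidate constant-velocity Kalman tracker with inhibition areas might look like the following sketch. The matrices, noise levels, and radius are illustrative assumptions, not values from the paper.

```python
import numpy as np

class SignTrack:
    """One Kalman filter per tracked sign candidate; state is x, y, vx, vy."""
    F = np.array([[1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [0., 0., 1., 0.],
                  [0., 0., 0., 1.]])   # constant-velocity motion model
    H = np.array([[1., 0., 0., 0.],
                  [0., 1., 0., 0.]])   # only position is measured

    def __init__(self, x, y):
        self.s = np.array([x, y, 0., 0.])
        self.P = np.eye(4) * 10.0      # initial state uncertainty

    def predict(self):
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + np.eye(4) * 0.1  # process noise
        return self.s[:2]              # predicted sign position in the frame

    def update(self, z):
        S = self.H @ self.P @ self.H.T + np.eye(2) * 2.0       # measurement noise
        K = self.P @ self.H.T @ np.linalg.inv(S)               # Kalman gain
        self.s = self.s + K @ (np.asarray(z, float) - self.H @ self.s)
        self.P = (np.eye(4) - K @ self.H) @ self.P

def inhibited(candidate, tracks, radius=40.0):
    """Inhibition areas: discard a new candidate lying near a tracked sign."""
    cx, cy = candidate
    return any(np.hypot(cx - t.s[0], cy - t.s[1]) < radius for t in tracks)
```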

Relevance:

30.00%

Publisher:

Abstract:

A gene expression atlas is an essential resource for quantifying and understanding the multiscale processes of embryogenesis in time and space. The automated reconstruction of a prototypic 4D atlas for vertebrate early embryos, using multicolor fluorescence in situ hybridization with nuclear counterstain, requires dedicated computational strategies. To this end, we designed an original methodological framework implemented in a software tool called Match-IT. With only minimal human supervision, our system is able to gather gene expression patterns observed in different analyzed embryos with phenotypic variability and map them onto a series of common 3D templates over time, creating a 4D atlas. This framework was used to construct an atlas composed of 6 gene expression templates from a cohort of zebrafish early embryos spanning 6 developmental stages from 4 to 6.3 hpf (hours post fertilization). The atlas includes 53 specimens, 181,415 detected cell nuclei, and the segmentation of 98 gene expression patterns observed in 3D for 9 different genes. In addition, an interactive visualization software, Atlas-IT, was developed to inspect, supervise and analyze the atlas. Match-IT and Atlas-IT, including user manuals, representative datasets and video tutorials, are publicly and freely available online. We also propose computational methods and tools for the quantitative assessment of the gene expression templates at the cellular scale, with the identification, visualization and analysis of coexpression patterns, synexpression groups and their dynamics through developmental stages.

Relevance:

30.00%

Publisher:

Abstract:

Conflict resolution is a key issue to manage when dealing with diverse stakeholders. By analysing in depth the most relevant and implicit aspects of the construct "conflict", this study examines how the five main strategies for resolving common disagreements are adopted under different sources of conflict. Hypotheses are tested using data collected from both the academic and the business world. The perceptions of project managers and team members allow the authors not only to find significant differences by role played or type of organization, but also to narrow the design of future approaches to investigating the relation between conflict and project performance. More specifically, the research indicates that project managers adopt confronting and compromising styles in most cases as first options, highlighting the influence of the degree-of-responsibility factor in how issues are handled within a project team.

Relevance:

30.00%

Publisher:

Abstract:

The development of a highly reliable physical map with landmark sites spaced an average of 100 kbp apart has been a central goal of the Human Genome Project. We have approached the physical mapping of human chromosome 11 with this goal as a primary target. We have focused on strategies that utilize yeast artificial chromosome (YAC) technology, thus permitting long-range coverage of hundreds of kilobases of genomic DNA, while seeking to minimize the ambiguities inherent in the use of this technology, particularly the occurrence of chimeric genomic DNA clones. This was achieved through the development of a chromosome 11-specific YAC library from a human somatic cell hybrid line that has retained chromosome 11 as its sole human component. To maximize the efficiency of YAC contig assembly and extension, we have employed an Alu-PCR-based hybridization screening system. This system eliminates many of the more costly and time-consuming steps associated with sequence-tagged site content mapping, such as sequencing, primer production, and hierarchical screening, resulting in greater efficiency with increased throughput and reduced cost. Using these approaches, we have achieved YAC coverage for >90% of human chromosome 11, with an average intermarker distance of <100 kbp. Cytogenetic localization has been determined for each contig by fluorescent in situ hybridization and/or sequence-tagged site content. The YAC contigs that we have generated should provide a robust framework for moving forward to sequence-ready templates for the sequencing efforts of the Human Genome Project, as well as for more focused positional cloning on chromosome 11.

Relevance:

30.00%

Publisher:

Abstract:

It is often claimed that policy makers and scholars inhabit different worlds and have little to offer each other. We challenge this perception and claim that there is a strong symbiotic relationship between the two. This relationship is particularly strong in the field of conflict, where policy makers may be in desperate need of guidelines, advice, and analysis on how to transform complex conflict situations into more peaceful ones. We suggest that policy makers may think in terms of macro- and micro-level theories and ideas if they wish to embrace better strategies of conflict resolution.