595 results for merging


Relevance: 20.00%

Abstract:

The most general quantum mechanical wave equation for a massive scalar particle in a metric generated by a spherically symmetric mass distribution is considered within the framework of higher derivative gravity (HDG). The exact effective Hamiltonian is constructed and the significance of the various terms is discussed using the linearized version of the above-mentioned theory. Not only does this analysis shed new light on the long-standing problem of quantum gravity concerning the exact nature of the coupling between a massive scalar field and the background geometry, but it also greatly improves our understanding of the role of HDG's coupling parameters in semiclassical calculations.
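
The abstract does not reproduce the wave equation itself; purely as a point of reference (the paper's exact HDG form may differ), the standard curved-spacetime Klein-Gordon equation for a massive scalar field with a generic non-minimal curvature coupling ξ reads, in the (+, −, −, −) signature,

\[
\Box\phi + \frac{m^{2}c^{2}}{\hbar^{2}}\,\phi + \xi R\,\phi = 0,
\qquad
\Box\phi \equiv \frac{1}{\sqrt{-g}}\,\partial_{\mu}\!\left(\sqrt{-g}\,g^{\mu\nu}\,\partial_{\nu}\phi\right),
\]

where R is the Ricci scalar of the background metric and ξ = 0 corresponds to minimal coupling; in the linearized treatment mentioned above the metric is expanded as g_{μν} = η_{μν} + h_{μν}, with h_{μν} fixed by the field equations of the spherically symmetric source.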

Relevance: 20.00%

Abstract:

Panic disorder patients are vulnerable to recurrent panic attacks. Two neurochemical hypotheses have been proposed to explain this susceptibility. The first assumes that panic patients have deficient serotonergic inhibition of neurons localized in the dorsal periaqueductal gray matter of the midbrain that organize defensive reactions to cope with proximal threats and of sympathomotor control areas of the rostral ventrolateral medulla that generate most of the neurovegetative symptoms of the panic attack. The second suggests that endogenous opioids buffer normal subjects from the behavioral and physiological manifestations of the panic attack, and their deficit brings about heightened suffocation sensitivity and separation anxiety in panic patients, making them more vulnerable to panic attacks. Experimental results obtained in rats performing one-way escape in the elevated T-maze, an animal model of panic, indicate that the inhibitory action of serotonin on defense is connected with activation of endogenous opioids in the periaqueductal gray. This allows reconciliation of the serotonergic and opioidergic hypotheses of panic pathophysiology, the periaqueductal gray being the fulcrum of serotonin-opioid interaction.

Relevance: 20.00%

Abstract:

Many observed time series of the global radiosonde or PILOT networks exist as fragments distributed over different archives. Identifying and merging these fragments can enhance their value for studies on the three-dimensional spatial structure of climate change. The Comprehensive Historical Upper-Air Network (CHUAN version 1.7), which was substantially extended in 2013, and the Integrated Global Radiosonde Archive (IGRA) are the most important collections of upper-air measurements taken before 1958. CHUAN (tracked) balloon data start in 1900, with larger numbers from the late 1920s onward, whereas IGRA data start in 1937. However, a substantial fraction of those measurements were not taken at synoptic times (preferably 00:00 or 12:00 GMT) and were recorded on altitude levels rather than standard pressure levels. To make them comparable with more recent data, the records have been brought to synoptic times and standard pressure levels using state-of-the-art interpolation techniques, employing geopotential information from the National Oceanic and Atmospheric Administration (NOAA) 20th Century Reanalysis (NOAA 20CR). From 1958 onward the European Re-Analysis archives (ERA-40 and ERA-Interim) available at the European Centre for Medium-Range Weather Forecasts (ECMWF) are the main data sources. These are easier to use, but pilot data still have to be interpolated to standard pressure levels. Fragments of the same records distributed over different archives have been merged, if necessary, taking care that the data remain traceable back to their original sources. Where possible, station IDs assigned by the World Meteorological Organization (WMO) have been allocated to the station records. For some records which have never been identified by a WMO ID, a local ID above 100 000 has been assigned. The merged data set contains 37 wind records longer than 70 years and 139 temperature records longer than 60 years. It can be seen as a useful basis for further data processing steps, most notably homogenization and gridding, after which it should be a valuable resource for climatological studies. Homogeneity adjustments for wind using the NOAA-20CR as a reference are described in Ramella Pralungo and Haimberger (2014). Reliable homogeneity adjustments for temperature beyond 1958 using a surface-data-only reanalysis such as NOAA-20CR as a reference have yet to be created. All the archives and metadata files are available in ASCII and netCDF format in the PANGAEA archive.
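
The interpolation to standard pressure levels is described only at a high level in the abstract; the sketch below illustrates the simplest such step, bringing a wind profile reported on arbitrary pressure levels onto standard levels by linear interpolation in log-pressure. Function and variable names are illustrative, not taken from the CHUAN/IGRA processing code, and the altitude-to-pressure conversion via NOAA 20CR geopotential would require an additional step not shown here.

```python
import numpy as np

# A subset of the WMO standard pressure levels (hPa), listed for brevity
STANDARD_LEVELS_HPA = np.array([850.0, 700.0, 500.0, 400.0, 300.0,
                                250.0, 200.0, 150.0, 100.0])

def to_standard_levels(p_obs_hpa, u_obs, v_obs):
    """Interpolate wind components reported on arbitrary pressure levels
    onto standard pressure levels, linearly in ln(p).

    p_obs_hpa    : 1-D array of observed pressure levels (hPa)
    u_obs, v_obs : 1-D arrays of wind components (m/s) on those levels
    Returns (u_std, v_std); NaN outside the observed pressure range
    (no extrapolation).
    """
    order = np.argsort(p_obs_hpa)          # np.interp needs increasing abscissae
    logp_obs = np.log(p_obs_hpa[order])
    logp_std = np.log(STANDARD_LEVELS_HPA)
    u_std = np.interp(logp_std, logp_obs, u_obs[order], left=np.nan, right=np.nan)
    v_std = np.interp(logp_std, logp_obs, v_obs[order], left=np.nan, right=np.nan)
    return u_std, v_std
```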

Relevance: 20.00%

Abstract:

The North Atlantic jet stream during winter 2010 was unusually zonal, so the typically separated Atlantic and African jets were merged into one zonal jet. Moreover, the latitude–height structure and temporal variability of the North Atlantic jet during this winter were more characteristic of the North Pacific. This work examines the possibility of a flow regime change from an eddy-driven to a mixed eddy–thermally driven jet. A monthly jet zonality index is defined, which shows that a persistent merged jet state has occurred in the past, both at the end of the 1960s and during a few sporadic months. The anomalously zonal jet is found to be associated with anomalous tropical Pacific diabatic heating and eddy anomalies similar to those found during a negative North Atlantic Oscillation (NAO) state. A Lagrangian back-trajectory diagnosis of eight winters suggests the tropical Pacific is a source of momentum for the Atlantic and African jets and that this source was stronger during the winter of 2010. The results suggest that the combination of weak eddy variance and fluxes in the North Atlantic, along with strong tropical heating, acts to push the jet toward a merged eddy–thermally driven state. The authors also find significant sea surface temperature (SST) anomalies in the North Atlantic, which reinforce the anomalous zonal winds, particularly in the eastern Atlantic.
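
The abstract does not give the definition of the monthly jet zonality index; purely as an illustration of the kind of diagnostic involved (not the authors' formula), one could measure how little the jet-axis latitude varies with longitude in the monthly-mean upper-level zonal wind:

```python
import numpy as np

def jet_zonality_index(u250, lats):
    """Illustrative jet zonality diagnostic (not the paper's definition).

    u250 : 2-D array (lat, lon) of monthly-mean zonal wind at 250 hPa
           over the Atlantic sector
    lats : 1-D array of latitudes corresponding to the first axis
    Returns a scalar that is large when the jet axis sits at nearly the
    same latitude at every longitude (a merged, zonal jet) and small
    when the jet axis tilts or meanders.
    """
    jet_lat = lats[np.argmax(u250, axis=0)]   # jet-axis latitude per longitude
    return 1.0 / (1.0 + np.std(jet_lat))      # high = zonal, low = meandering
```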

Relevance: 20.00%

Abstract:

To reach the goals established by the Institute of Medicine (IOM) and the Centers for Disease Control's (CDC) STOP TB USA, measures must be taken to curtail a future peak in Tuberculosis (TB) incidence and speed the currently stagnant rate of TB elimination. Both efforts will require, at minimum, the consideration and understanding of the third dimension of TB transmission: the location-based spread of an airborne pathogen among persons known and unknown to each other. This consideration will require an elucidation of the areas within the U.S. that have endemic TB. The Houston Tuberculosis Initiative (HTI) was a population-based active surveillance of confirmed Houston/Harris County TB cases from 1995–2004. Strengths of this dataset include the molecular characterization of laboratory-confirmed cases, the collection of geographic locations (including home addresses) frequented by cases, and the HTI time period that parallels a decline in TB incidence in the United States (U.S.). The HTI dataset was used in this secondary data analysis to implement a GIS analysis of TB cases, the locations frequented by cases, and their association with risk factors for TB transmission.

This study reports, for the first time, the incidence of TB among the homeless in Houston, Texas. The homeless are an at-risk population for TB disease, yet they are also a population whose TB incidence has been unknown and unreported due to their non-enumeration. The first section of this dissertation identifies local areas in Houston with endemic TB disease. Many Houston TB cases who reported living in these endemic areas also share the TB risk factor of current or recent homelessness. Merging the 2004–2005 Houston enumeration of the homeless with historical HTI surveillance data of TB cases in Houston enabled this first-time report of TB risk among the homeless in Houston. The homeless were more likely to be US-born, belong to a genotypic cluster, and belong to a cluster of a larger size. The calculated average incidence among homeless persons was 411/100,000, compared to 9.5/100,000 among housed persons. These alarming rates are not driven by a co-infection but by social determinants. Unsheltered persons were hospitalized more days and required more follow-up time from staff than those who reported a steady housing situation. The homeless are a specific example of the increased targeting of prevention dollars that could occur if TB rates were reported for specific areas with known health disparities rather than as a generalized rate normalized over a diverse population.

It has been estimated that 27% of Houstonians use public transportation. The city layout allows bus routes to run like veins connecting even the most diverse of populations within the metropolitan area. Secondary data analysis of frequent bus use (defined as riding a route weekly) among TB cases was assessed for its relationship with known TB risk factors. The spatial distribution of genotypic clusters associated with bus use was assessed, along with the reported routes and epidemiologic links among cases belonging to the identified clusters.

TB cases who reported frequent bus use were more likely to have demographic and social risk factors associated with poverty, immune suppression, and health disparities. An equal proportion of bus riders and non-bus riders were cultured for Mycobacterium tuberculosis, yet 75% of bus riders were genotypically clustered, indicating recent transmission, compared to 56% of non-bus riders (OR=2.4, 95% CI (2.0, 2.8), p<0.001). Bus riders had a mean cluster size of 50.14 vs. 28.9 (p<0.001). Second-order spatial analysis of clustered fingerprint 2 (n=122), a Beijing family cluster, revealed geographic clustering among cases based on their report of bus use. Univariate and multivariate analysis of routes reported by cases belonging to these clusters found that 10 of the 14 clusters were associated with bus use. Individual Metro routes, including one route servicing the local hospitals, were found to be risk factors for belonging to a cluster shown to be endemic in Houston. The routes themselves geographically connect the census tracts previously identified as having endemic TB. 78% (15/23) of Houston Metro routes investigated had one or more print groups reporting frequent use for every HTI study year. We present data on three specific but clonally related print groups and show that bus use is clustered in time by route and is the only known link between cases in one of the three prints: print 22. (Abstract shortened by UMI.)
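
As a rough consistency check (using the quoted proportions rather than the underlying case counts, which the abstract does not give), the reported odds ratio can be recovered from the 75% and 56% clustering figures:

\[
\mathrm{OR} \approx \frac{0.75/0.25}{0.56/0.44} = \frac{3.00}{1.27} \approx 2.4,
\]

consistent with the reported OR = 2.4.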

Relevance: 20.00%

Abstract:

This paper presents the Miracle team's approach to the 2005 Ad-Hoc Information Retrieval tasks. The goal of this year's experiments was twofold: to continue testing the effect of combination approaches on information retrieval tasks, and to improve our basic processing and indexing tools, adapting them to new languages with unusual encoding schemes. The starting point was a set of basic components: stemming, transforming, filtering, proper noun extraction, paragraph extraction, and pseudo-relevance feedback. Some of these basic components were used in different combinations and orders of application for document indexing and for query processing. Second-order combinations were also tested, by averaging or selectively combining the documents retrieved by different approaches for a particular query. In the multilingual track, we concentrated our work on the process of merging the results of monolingual runs to obtain the overall multilingual result, relying on available translations. In both cross-lingual tracks, we used the available translation resources, and in some cases a combination approach.
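
The abstract does not detail the merging scheme beyond stating that monolingual results are combined into a multilingual ranking; the sketch below shows one common, minimal way to do this (min-max score normalisation per run followed by re-ranking), with illustrative function names only:

```python
def merge_monolingual_runs(runs, top_k=1000):
    """Merge several monolingual runs into one multilingual ranking.

    runs : list of dicts {doc_id: score}, one per target collection.
    Scores are min-max normalised within each run and the union of
    retrieved documents is re-ranked by the best normalised score.
    """
    merged = {}
    for run in runs:
        if not run:
            continue
        lo, hi = min(run.values()), max(run.values())
        span = (hi - lo) or 1.0                # avoid division by zero
        for doc_id, score in run.items():
            norm = (score - lo) / span
            merged[doc_id] = max(merged.get(doc_id, 0.0), norm)
    return sorted(merged.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
```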

Relevance: 20.00%

Abstract:

Once the advantages of object-based classification over pixel-based classification are accepted, the need arises for simple and affordable methods to define and characterize the objects to be classified. This paper presents a new methodology for the identification and characterization of objects at different scales, through the integration of the spectral information provided by the multispectral image and the textural information from the corresponding panchromatic image. In this way, a set of objects is defined that yields a simplified representation of the information contained in the two source images. These objects can be characterized by different attributes that allow discrimination between different spectral and textural patterns. This methodology facilitates information processing from both a conceptual and a computational point of view. Thus, the attribute vectors so defined can be used directly as training patterns for certain classifiers, such as artificial neural networks. Growing Cell Structures have been used to classify the merged information.
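
As an illustration of the kind of object characterisation described (the texture measure below is an assumption, not the paper's exact attribute set), each labelled object can be summarised by the mean of every multispectral band plus a simple panchromatic texture statistic:

```python
import numpy as np

def object_attribute_vectors(ms_image, pan_image, labels):
    """Build one spectral-textural attribute vector per object.

    ms_image  : (H, W, B) multispectral image
    pan_image : (H, W) co-registered panchromatic image
    labels    : (H, W) integer object labels from the segmentation
    Returns {object_id: vector} where each vector holds the mean of
    every spectral band plus the panchromatic standard deviation inside
    the object (used here as a crude texture measure).
    """
    vectors = {}
    for obj_id in np.unique(labels):
        mask = labels == obj_id
        spectral = ms_image[mask].mean(axis=0)       # mean per band
        texture = np.array([pan_image[mask].std()])  # within-object variability
        vectors[obj_id] = np.concatenate([spectral, texture])
    return vectors
```

Vectors like these could then be fed to a classifier such as the Growing Cell Structures network mentioned above.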

Relevance: 20.00%

Abstract:

Non-invasive quantitative assessment of right ventricular anatomical and functional parameters is a challenging task. We present a semi-automatic approach for right ventricle (RV) segmentation from 4D MR images in two variants, which differ in the amount of user interaction. The method consists of three main phases: first, foreground and background markers are generated from the user input; next, an over-segmented region image is obtained by applying a watershed transform; finally, these regions are merged using 4D graph-cuts with an intensity-based boundary term. In the first variant the user outlines the inside of the RV wall in a few end-diastole slices; in the second, two marker pixels serve as the starting point for applying a statistical atlas. Results were obtained by blind evaluation on 16 4D MR test volumes. They prove our method to be robust against marker location and place it favourably among existing approaches.
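
Only the watershed over-segmentation phase is easy to sketch from the abstract; a minimal example using scikit-image is shown below (the user-marker generation and the 4D graph-cut merging with an intensity-based boundary term are not shown):

```python
from skimage.filters import sobel
from skimage.segmentation import watershed

def oversegment_slice(image_2d):
    """Watershed over-segmentation of a single MR slice.

    With no explicit markers, the watershed is flooded from the local
    minima of the gradient image, which yields many small regions; in
    the paper these regions are then merged with 4D graph-cuts driven
    by the user-derived foreground/background markers (not shown here).
    """
    gradient = sobel(image_2d)     # edge strength drives the flooding
    return watershed(gradient)     # labelled over-segmented region image
```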

Relevance: 20.00%

Abstract:

Immersion and interaction have been identified as key factors influencing the quality of experience in stereoscopic video systems. An experimental prototype designed to explore the influence of these factors in 3D video applications is described here. The focus is on the real-time insertion algorithm of new 3D models into the original video streams. Using this algorithm, our prototype is aimed at exploring a new interaction paradigm, similar to the augmented reality approach, with 3D video applications.

Relevance: 20.00%

Abstract:

One of the major problems related to cancer treatment is recurrence. Without knowing in advance how likely the cancer is to relapse, clinical practice usually recommends adjuvant treatments that have strong side effects. One way to optimize treatment is to predict the recurrence probability by analyzing a set of biomarkers. The NeoMark European project has identified a set of preliminary biomarkers for oral cancer by collecting a large series of data from genomic, imaging, and clinical evidence. This heterogeneous set of data needs a proper representation in order to be stored, computed, and communicated efficiently. Ontologies are often considered the proper means to integrate biomedical data, given their high level of formality and the need for interoperable, universally accepted models. This paper presents the NeoMark system and how an ontology has been designed to integrate all of its heterogeneous data. The system has been validated in a pilot in which data will populate the ontology and will be made public for further research.
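
As an illustration of representing such heterogeneous evidence in an ontology (the namespace, class, and property names below are invented for the example and are not the actual NeoMark ontology vocabulary):

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

# Invented namespace and terms; the real NeoMark ontology's vocabulary
# is not given in the abstract.
NEO = Namespace("http://example.org/neomark#")

g = Graph()
g.add((NEO.Patient, RDF.type, RDFS.Class))
for cls in (NEO.GenomicFinding, NEO.ImagingFinding, NEO.ClinicalFinding):
    g.add((cls, RDFS.subClassOf, NEO.Biomarker))

# A single patient record linking the three kinds of evidence
g.add((NEO.patient42, RDF.type, NEO.Patient))
g.add((NEO.patient42, NEO.hasBiomarker, NEO.geneExpressionProfile1))
g.add((NEO.geneExpressionProfile1, RDF.type, NEO.GenomicFinding))
g.add((NEO.geneExpressionProfile1, NEO.value, Literal(0.82)))

print(g.serialize(format="turtle"))
```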

Relevance: 20.00%

Abstract:

There is an increasing need for easy and affordable technologies to automatically generate virtual 3D models from their real counterparts. In particular, 3D human reconstruction has driven the creation of many clever techniques, most of them based on the visual hull (VH) concept. Such techniques do not require expensive hardware; however, they tend to yield 3D humanoids with realistic bodies but mediocre faces, since the VH cannot handle concavities. On the other hand, structured light projectors make it possible to capture very accurate depth data, and thus to reconstruct realistic faces, but they are too expensive to deploy in numbers. We have developed a technique to merge a VH-based 3D mesh of a reconstructed humanoid with the depth data of its face, captured by a single structured light projector. By combining the advantages of both systems in a simple setting, we are able to reconstruct realistic 3D human models with believable faces.
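
A small sketch of the structured-light side of such a pipeline, back-projecting the captured face depth map into a 3D point cloud with a pinhole camera model (intrinsics and function names are illustrative; the alignment and stitching with the visual-hull mesh, which is the paper's actual contribution, is not shown):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a structured-light depth map into a 3D point cloud.

    depth          : (H, W) array of depth values (NaN or 0 where invalid)
    fx, fy, cx, cy : pinhole intrinsics of the calibrated sensor
    Returns an (N, 3) array of 3D points for the valid pixels; these
    would then be aligned with and stitched into the visual-hull mesh.
    """
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    valid = np.isfinite(depth) & (depth > 0)
    z = depth[valid]
    x = (us[valid] - cx) * z / fx
    y = (vs[valid] - cy) * z / fy
    return np.column_stack([x, y, z])
```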