995 results for Woods-Saxon Strutinsky method


Relevance:

20.00%

Publisher:

Abstract:

We present a method for segmenting white matter tracts from high angular resolution diffusion MR images by representing the data in a 5-dimensional space of position and orientation. Whereas crossing fiber tracts cannot be separated in 3D position space, they clearly disentangle in 5D position-orientation space. The segmentation is done using a 5D level set method applied to hyper-surfaces evolving in 5D position-orientation space. In this paper we present a methodology for constructing the position-orientation space. We then show how to implement the standard level set method in such a non-Euclidean, high-dimensional space. Level set theory is defined for N dimensions in general, but there are several practical implementation details to consider, such as the computation of mean curvature. Finally, we show results from a synthetic model and a few preliminary results on real data of a human brain acquired by high angular resolution diffusion MRI.
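
As a rough illustration of evolving an implicit hyper-surface on a discretized 5D (position plus orientation) grid, the sketch below uses plain NumPy. It is not the authors' implementation: it ignores the non-Euclidean metric of the orientation dimensions and uses a simple Laplacian smoothing term as a crude stand-in for a proper mean-curvature term.

```python
# Minimal sketch of a level-set style evolution on a 5D grid
# (3 position + 2 orientation axes). Not the paper's method: the
# orientation metric is ignored and a plain Laplacian replaces the
# mean-curvature term.
import numpy as np

def evolve_level_set(phi, speed, n_iter=50, dt=0.1):
    """Evolve the implicit hyper-surface phi (5D array) under a
    simple speed * |grad(phi)| update plus Laplacian smoothing."""
    for _ in range(n_iter):
        grads = np.gradient(phi)                      # one array per axis
        grad_norm = np.sqrt(sum(g ** 2 for g in grads) + 1e-12)
        laplacian = sum(np.gradient(g, axis=i) for i, g in enumerate(grads))
        phi = phi + dt * (speed * grad_norm + 0.1 * laplacian)
    return phi

# toy example: small 5D volume, expand a seed region
phi = -np.ones((16, 16, 16, 8, 8))
phi[6:10, 6:10, 6:10, 3:5, 3:5] = 1.0     # seed inside the target tract
speed = np.ones_like(phi)                  # a data-driven term would go here
segmented = evolve_level_set(phi, speed) > 0
```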

Relevance:

20.00%

Publisher:

Abstract:

The immunoreactivity of seven peptides synthesized from Schistosoma mansoni proteins was evaluated by dot-blot and ELISA assays using two different sensitization methodologies. The best results were obtained in wells of Costar 3590 microplates coated with peptides P1, P2, P3, P6, and P7 using the conventional methodology. The signals increased considerably (p < 0.0003) in wells sensitized with P1 to P6 using the alternative methodology. In contrast, wells coated with peptide P7 presented a lower signal when compared with the conventional methodology (p = 0.0019). These results establish the basis for the application of synthetic peptides in the laboratory diagnosis of schistosomiasis mansoni.

Relevance:

20.00%

Publisher:

Abstract:

The tendency for more closely related species to share similar traits and ecological strategies can be explained by their longer shared evolutionary histories and represents phylogenetic conservatism. How strongly species traits co-vary with phylogeny can significantly impact how we analyze cross-species data and can influence our interpretation of assembly rules in the rapidly expanding field of community phylogenetics. Phylogenetic conservatism is typically quantified by analyzing the distribution of species values on the phylogenetic tree that connects them. Many phylogenetic approaches, however, assume a completely sampled phylogeny: while we have good estimates of deeper phylogenetic relationships for many species-rich groups, such as birds and flowering plants, we often lack information on more recent interspecific relationships (i.e., within a genus). A common solution has been to represent these relationships as polytomies on trees using taxonomy as a guide. Here we show that such trees can dramatically inflate estimates of phylogenetic conservatism quantified using S. P. Blomberg et al.'s K statistic. Using simulations, we show that even randomly generated traits can appear to be phylogenetically conserved on poorly resolved trees. We provide a simple rarefaction-based solution that can reliably retrieve unbiased estimates of K, and we illustrate our method using data on first flowering times from Thoreau's woods (Concord, Massachusetts, USA).
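
A minimal sketch of the rarefaction idea follows: repeatedly keep one randomly chosen species per genus (so the taxonomy-based polytomies disappear), recompute K on each rarefied tree, and average. The callables `prune_tree` and `k_statistic` are placeholders for whatever phylogenetics library is at hand (e.g., picante's Kcalc in R); they are assumptions for illustration, not the authors' code.

```python
# Hedged sketch of rarefaction-based estimation of Blomberg's K when
# below-genus relationships are unresolved. `prune_tree` and `k_statistic`
# are hypothetical helpers supplied by the caller.
import random
from collections import defaultdict

def rarefied_k(tree, traits, genus_of, prune_tree, k_statistic, n_rep=1000):
    """Average K over trees rarefied to one randomly chosen species per genus."""
    by_genus = defaultdict(list)
    for sp in traits:
        by_genus[genus_of[sp]].append(sp)
    estimates = []
    for _ in range(n_rep):
        keep = [random.choice(spp) for spp in by_genus.values()]
        sub_tree = prune_tree(tree, keep)                    # hypothetical helper
        sub_traits = {sp: traits[sp] for sp in keep}
        estimates.append(k_statistic(sub_tree, sub_traits))  # hypothetical helper
    return sum(estimates) / len(estimates)
```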

Relevance:

20.00%

Publisher:

Abstract:

Schistosomes are endoparasites causing a serious human disease called schistosomiasis. Quantifying parasite genetic diversity is essential to understanding schistosomiasis epidemiology and disease transmission patterns. In this paper, we propose a novel, rapid, low-cost and efficient DNA extraction method for the egg, larval and adult stages of Schistosoma mansoni. One euro is enough to perform 60,000 DNA extraction reactions, and the protocol is fast, requiring only 15 min of incubation and 5 handling steps.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we look at how web-based social software can be used to carry out qualitative data analysis of online peer-to-peer learning experiences. Specifically, we propose to use Cohere, a web-based social sense-making tool, to observe, track, annotate and visualize discussion group activities in online courses. We define a specific methodology for data observation and structuring, and present results of the analysis of peer interactions conducted in a discussion forum in a real case study of a P2PU course. Finally, we discuss how network visualization and analysis can be used to gain a better understanding of the peer-to-peer learning experience. To do so, we provide preliminary insights into the social, dialogical and conceptual connections generated within one online discussion group.
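
The kind of network view described above can be sketched with a generic graph library: the snippet below builds a who-replied-to-whom graph from a toy list of forum replies and reports simple connectivity measures. The input format and the use of networkx are illustrative assumptions; Cohere itself is not involved.

```python
# Toy sketch: a directed reply graph for one discussion group, with a few
# basic connectivity measures. Data are invented for illustration.
import networkx as nx

replies = [("alice", "bob"), ("carol", "alice"), ("bob", "carol"),
           ("dave", "alice"), ("alice", "dave")]   # (replier, original poster)

G = nx.DiGraph()
G.add_edges_from(replies)

print("participants:", G.number_of_nodes())
print("dialogical links:", G.number_of_edges())
print("most connected:", max(G.degree, key=lambda kv: kv[1]))
```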

Relevance:

20.00%

Publisher:

Abstract:

A solution of (18)F was standardised with a 4πβ-4πγ coincidence counting system in which the beta detector is a one-inch-diameter cylindrical UPS89 plastic scintillator positioned at the bottom of a well-type 5'' x 5'' NaI(Tl) gamma-ray detector. Almost full detection efficiency, which was varied downwards electronically, was achieved in the beta channel. Aliquots of this (18)F solution were also measured using 4πγ NaI(Tl) integral counting with Monte Carlo calculated efficiencies, as well as the CIEMAT-NIST method. Secondary measurements of the same solution were also performed with an IG11 ionisation chamber whose equivalent activity is traceable to the Système International de Référence through the contribution IRA-METAS made to it in 2001; IRA's degree of equivalence was found to be close to the key comparison reference value (KCRV). The (18)F activity predicted by this coincidence system agrees closely with the ionisation chamber measurement and is compatible within one standard deviation with the other primary measurements. This work demonstrates that our new coincidence system can standardise short-lived radionuclides used in nuclear medicine.
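
For background, the principle behind 4πβ-4πγ coincidence counting is that the product of the beta- and gamma-channel rates divided by the coincidence rate tends to the source activity as the beta efficiency approaches one; extrapolating that quantity to 100% efficiency yields the activity. The sketch below illustrates this standard efficiency-extrapolation step with invented count rates, not data from this work.

```python
# Efficiency extrapolation for coincidence counting:
# N_beta * N_gamma / N_coinc -> N_0 as the beta efficiency -> 1.
# All count rates below are invented for illustration.
import numpy as np

n_beta  = np.array([9500., 9200., 8800., 8300.])   # beta-channel rates (1/s)
n_gamma = np.array([6000., 6000., 6000., 6000.])   # gamma-channel rates (1/s)
n_coinc = np.array([5700., 5500., 5250., 4940.])   # coincidence rates (1/s)

eff_beta = n_coinc / n_gamma                       # beta efficiency estimate
y = n_beta * n_gamma / n_coinc                     # tends to N_0 as eff -> 1
x = (1.0 - eff_beta) / eff_beta

slope, n0 = np.polyfit(x, y, 1)                    # linear extrapolation to x = 0
print(f"extrapolated activity N0 ~ {n0:.0f} Bq")
```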

Relevance:

20.00%

Publisher:

Abstract:

We studied the response to F+0 renography and the relative and absolute individual kidney function in neonates and infants < 6 mo old before and after surgery for unilateral ureteropelvic junction obstruction (UJO). METHODS: The results obtained at diagnosis and after pyeloplasty for 9 children (8 boys, 1 girl; age range, 0.8-5.9 mo; mean age +/- SD, 2.4 +/- 1.5 mo) with proven unilateral UJO (i.e., affected kidney [AK]) and an unremarkable contralateral kidney (i.e., normal kidney [NK]) were evaluated and compared with a control group of 10 children (6 boys, 4 girls; age range, 0.8-2.8 mo; mean age, 1.5 +/- 0.7 mo) selected because of symmetric renal function, absence of vesicoureteral reflux or infection, and an initially dilated but not obstructed renal pelvis as proven by follow-up. Renography was performed for 20 min after injection of (123)I-hippuran (OIH) (0.5-1.0 MBq/kg) immediately followed by furosemide (1 mg/kg). The relative and absolute renal functions and the response to furosemide were measured on background-subtracted and depth-corrected renograms. The response to furosemide was quantified by an elimination index (EI), defined as the ratio of the 3- to 20-min activities: an EI ≥ 3 was considered definitively normal and an EI ≤ 1 definitively abnormal. If the EI was equivocal (1 < EI < 3), the response to gravity-assisted drainage was used to differentiate AKs from NKs. Absolute separate renal function was measured by an accumulation index (AI), defined as the percentage of (123)I-OIH (%ID) extracted by the kidney 30-90 s after maximal cardiac activity. RESULTS: All AKs had definitively abnormal EIs at diagnosis (mean, 0.56 +/- 0.12), significantly lower than the EIs of the NKs (mean, 3.24 +/- 1.88) and of the 20 control kidneys (mean, 3.81 +/- 1.97; P < 0.001). The EIs of the AKs significantly improved (mean, 2.81 +/- 0.64; P < 0.05) after pyeloplasty. At diagnosis, the AIs of the AKs were significantly lower (mean, 6.31 +/- 2.33 %ID) than the AIs of the NKs (mean, 9.43 +/- 1.12 %ID) and of the control kidneys (mean, 9.05 +/- 1.17 %ID; P < 0.05). The AIs of the AKs increased at follow-up (mean, 7.81 +/- 2.23 %ID) but remained lower than those of the NKs (mean, 10.75 +/- 1.35 %ID; P < 0.05). CONCLUSION: In neonates and infants younger than 6 mo, (123)I-OIH renography with early furosemide injection (F+0) allowed us to reliably diagnose AKs, to determine whether parenchymal function was normal or impaired, and to assess whether it improved after surgery.
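
The elimination index defined above is simply the ratio of renogram activity at 3 min to activity at 20 min. The sketch below computes it from a synthetic, background-subtracted time-activity curve and applies the stated decision thresholds; the curve is invented for illustration only.

```python
# Minimal sketch of the elimination index (EI): activity at 3 min divided
# by activity at 20 min, with the decision thresholds quoted above.
import numpy as np

t = np.arange(0, 20.5, 0.5)                        # minutes post-injection
activity = 100 * np.exp(-t / 6.0) + 5              # toy draining-kidney curve

def elimination_index(t, activity):
    a3  = np.interp(3.0, t, activity)
    a20 = np.interp(20.0, t, activity)
    return a3 / a20

ei = elimination_index(t, activity)
if ei >= 3:
    verdict = "definitively normal"
elif ei <= 1:
    verdict = "definitively abnormal"
else:
    verdict = "equivocal (check gravity-assisted drainage)"
print(f"EI = {ei:.2f}: {verdict}")
```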

Relevance:

20.00%

Publisher:

Abstract:

The integration of geophysical data into the subsurface characterization problem has been shown in many cases to significantly improve hydrological knowledge by providing information at spatial scales and locations that is unattainable using conventional hydrological measurement techniques. The investigation of exactly how much benefit can be brought by geophysical data in terms of its effect on hydrological predictions, however, has received considerably less attention in the literature. Here, we examine the potential hydrological benefits brought by a recently introduced simulated annealing (SA) conditional stochastic simulation method designed for the assimilation of diverse hydrogeophysical data sets. We consider the specific case of integrating crosshole ground-penetrating radar (GPR) and borehole porosity log data to characterize the porosity distribution in saturated heterogeneous aquifers. In many cases, porosity is linked to hydraulic conductivity and thus to flow and transport behavior. To perform our evaluation, we first generate a number of synthetic porosity fields exhibiting varying degrees of spatial continuity and structural complexity. Next, we simulate the collection of crosshole GPR data between several boreholes in these fields, and the collection of porosity log data at the borehole locations. The inverted GPR data, together with the porosity logs, are then used to reconstruct the porosity field using the SA-based method, along with a number of other more elementary approaches. Assuming that the grid-cell-scale relationship between porosity and hydraulic conductivity is unique and known, the porosity realizations are then used in groundwater flow and contaminant transport simulations to assess the benefits and limitations of the different approaches.
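
As a generic illustration of simulated-annealing conditional simulation (not the study's specific hydrogeophysical objective function), the sketch below perturbs a porosity field by swapping values at non-conditioned cells and accepts swaps with the Metropolis rule, so that a toy spatial statistic approaches its target while "borehole" cells stay fixed.

```python
# Generic SA conditional simulation sketch: honor conditioning cells while
# driving a toy spatial statistic toward a target. Objective and numbers
# are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
field = rng.normal(0.35, 0.05, size=(40, 40))        # initial porosity guess
conditioned = np.zeros_like(field, dtype=bool)
conditioned[:, [5, 20, 35]] = True                    # "borehole" columns held fixed

def objective(f):
    # toy target: lag-1 horizontal correlation close to 0.8
    return abs(np.corrcoef(f[:, :-1].ravel(), f[:, 1:].ravel())[0, 1] - 0.8)

temp = 1.0
for it in range(20000):
    i1, j1, i2, j2 = rng.integers(0, 40, size=4)
    if conditioned[i1, j1] or conditioned[i2, j2]:
        continue
    old = objective(field)
    field[i1, j1], field[i2, j2] = field[i2, j2], field[i1, j1]
    new = objective(field)
    if new > old and rng.random() > np.exp((old - new) / temp):
        field[i1, j1], field[i2, j2] = field[i2, j2], field[i1, j1]  # undo swap
    temp *= 0.9997                                    # cooling schedule
```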

Relevance:

20.00%

Publisher:

Abstract:

Objective: The purpose of this study was to find loci for major depression via linkage analysis of a large sibling pair sample. Method: The authors conducted a genome-wide linkage analysis of 839 families consisting of 971 affected sibling pairs with severe recurrent major depression, comprising waves I and II of the Depression Network Study cohort. In addition to examining affected status, linkage analyses in the full data set were performed using diagnoses restricted by impairment severity, and association mapping of hits was attempted in a large case-control data set. Results: The authors identified genome-wide significant linkage to chromosome 3p25-26 when the diagnoses were restricted by severity, with a maximum LOD score of 4.0 centered at the linkage marker D3S1515. The linkage signal remained genome-wide significant after correction for the multiple phenotypes tested, although subsequent association mapping of the region in a genome-wide association study of a U.K. depression sample did not yield significant results. Conclusions: The authors report a genome-wide significant locus for depression that implicates genes that are highly plausible candidates for involvement in the etiology of recurrent depression. Although association mapping in the region was negative, the linkage finding was replicated by another group, who found genome-wide significant linkage for depression in the same region. This suggests that 3p25-26 is a new locus for severe recurrent depression, and it is the first report of a genome-wide significant locus for depression that also has an independent genome-wide significant replication.
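
For context, the LOD score quoted above is the standard log-odds measure of linkage evidence; the definition below is textbook background rather than something taken from the paper:

$$ \mathrm{LOD}(\theta) = \log_{10}\frac{L(\theta)}{L(\theta = 1/2)}, \qquad \mathrm{LOD}_{\max} = 4.0 \;\Longrightarrow\; \frac{L(\hat{\theta})}{L(1/2)} = 10^{4}, $$

i.e. the data are ten thousand times more likely under linkage at the estimated recombination fraction than under free recombination.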

Relevance:

20.00%

Publisher:

Abstract:

The aim of our study was to provide an innovative headspace gas chromatography-mass spectrometry (HS-GC-MS) method applicable to the routine determination of blood CO concentration in forensic toxicology laboratories. The main drawback of the GC-MS methods discussed in the literature for CO measurement is the absence of a specific CO internal standard, which is necessary for performing quantification. Even though a stable isotope of CO is commercially available in the gaseous state, it is essential to develop a safer method that limits the manipulation of gaseous CO and precisely controls the amount of CO injected for spiking and calibration. To avoid handling a stable isotope-labeled gas, we have chosen to generate in situ, in the vial, an internal labeled standard gas ((13)CO) formed by the reaction of labeled formic acid (H(13)COOH) with sulfuric acid. As sulfuric acid can also be employed to liberate CO from whole blood, the procedure liberates CO simultaneously with the generation of (13)CO. This method allows for the precise measurement of blood CO concentrations from a small amount of blood (10 μL). Finally, the method was applied to measure the CO concentration of blood samples from intoxicated humans obtained at autopsy.
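
Quantification with an internal standard of this kind typically reduces to a calibration of the analyte-to-internal-standard peak-area ratio. The sketch below shows that generic step with invented numbers; it is not the validated calibration from this study.

```python
# Generic internal-standard quantification: calibrate the ratio
# area(CO) / area(13CO) against known CO concentrations, then read off an
# unknown. All values are invented for illustration.
import numpy as np

conc_cal  = np.array([0.05, 0.10, 0.20, 0.40])      # known CO levels (mL/mL blood)
ratio_cal = np.array([0.24, 0.49, 0.97, 1.95])      # measured area ratios

slope, intercept = np.polyfit(ratio_cal, conc_cal, 1)

sample_ratio = 1.32                                  # ratio measured on a case sample
co_conc = slope * sample_ratio + intercept
print(f"estimated blood CO ~ {co_conc:.2f} mL/mL")
```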

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a method for extracting the most relevant contours of an image. The method integrates local contour information from the chromatic components H, S and I, taking into account the coherence of the local contour orientation values obtained from each of these components. The process is based on parametrizing, pixel by pixel, the local contours (magnitude and orientation values) of the H, S and I images. This is carried out individually for each chromatic component; if the dispersion of the obtained orientation values is high, that chromatic component loses relevance. A final processing step integrates the contours extracted from the three chromatic components, generating the so-called integrated contours image.
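
A minimal sketch of this scheme follows: gradients are computed per chromatic component, components whose orientations are highly dispersed are down-weighted, and the weighted magnitudes are summed. The global, window-free dispersion measure used here is a simplification for illustration, not the paper's exact parametrization.

```python
# Sketch: per-channel contour magnitude/orientation, dispersion-based
# weighting, and integration into a single contour image.
import numpy as np

def contours(channel):
    gy, gx = np.gradient(channel.astype(float))
    return np.hypot(gx, gy), np.arctan2(gy, gx)

def orientation_dispersion(theta):
    # circular dispersion: 1 - |mean resultant vector| of doubled angles
    return 1.0 - np.abs(np.mean(np.exp(2j * theta)))

def integrated_contours(h, s, i):
    total = np.zeros_like(h, dtype=float)
    for channel in (h, s, i):
        mag, theta = contours(channel)
        weight = 1.0 - orientation_dispersion(theta)   # coherent -> high weight
        total += weight * mag
    return total

# toy usage with random stand-ins for the H, S and I planes
h, s, i = (np.random.rand(64, 64) for _ in range(3))
edges = integrated_contours(h, s, i)
```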

Relevance:

20.00%

Publisher:

Abstract:

Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them increase the network's resilience, provide more efficient services, or improve some other aspect of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimations because it does not take into account some aspects relevant to networking, such as the heterogeneity in link capacity or the differences between node pairs in their contribution to the total traffic. A new algorithm for discovering link centrality in transport networks is proposed in this paper. It requires only static or semi-static network and topology attributes, yet produces estimations of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application in which the simple shortest-path routing algorithm is improved in such a way that it outperforms other, more advanced algorithms in terms of blocking ratio.
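
For reference, the plain graph-theoretic edge betweenness that the paper takes as its starting point can be computed directly with networkx, and a crude capacity-aware variant can be obtained by weighting shortest paths by inverse capacity. The snippet below shows both on a toy topology; it is not the improved algorithm proposed in the paper.

```python
# Classic edge betweenness vs. a simple capacity-aware variant on a toy graph.
import networkx as nx

G = nx.Graph()
G.add_edge("a", "b", capacity=10)
G.add_edge("b", "c", capacity=40)
G.add_edge("a", "c", capacity=10)
G.add_edge("c", "d", capacity=40)

# classic edge betweenness: every node pair contributes equally
classic = nx.edge_betweenness_centrality(G)

# capacity-aware variant: shortest paths prefer high-capacity links
for u, v, data in G.edges(data=True):
    data["cost"] = 1.0 / data["capacity"]
weighted = nx.edge_betweenness_centrality(G, weight="cost")

print(classic)
print(weighted)
```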

Relevance:

20.00%

Publisher:

Abstract:

In networks with small buffers, such as networks based on optical packet switching (OPS), the convolution approach (CA) is regarded as one of the most accurate methods for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM; this paper focuses on heterogeneous traffic in OPS-based networks. For heterogeneous traffic and bufferless networks, the enhanced convolution approach (ECA) is a good solution. However, both methods (CA and ECA) incur a high computational cost when the number of connections is large. Two new mechanisms (UMCA and ISCA), based on the Monte Carlo method, are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with only a small stochastic error in the probability estimation.
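
The Monte Carlo idea can be sketched generically: instead of convolving the per-connection rate distributions, sample many realizations of the aggregate rate and count how often it exceeds the link capacity. The on/off traffic model and all numbers below are illustrative assumptions, not the UMCA or ISCA algorithms themselves.

```python
# Monte Carlo estimate of the overflow probability for a bufferless link
# carrying heterogeneous on/off connections. All parameters are invented.
import numpy as np

rng = np.random.default_rng(1)

# heterogeneous connections: (peak rate in Mb/s, probability of being "on")
connections = [(10.0, 0.3)] * 40 + [(2.5, 0.6)] * 120
capacity = 350.0                                    # link capacity, Mb/s
n_samples = 200_000

peaks = np.array([p for p, _ in connections])
p_on  = np.array([q for _, q in connections])

on = rng.random((n_samples, len(connections))) < p_on
aggregate = on @ peaks                              # sampled aggregate rates
overflow_prob = np.mean(aggregate > capacity)
print(f"estimated overflow probability ~ {overflow_prob:.4f}")
```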