977 results for mapping method
Abstract:
Aim and Location: Although the alpine mouse Apodemus alpicola has held species status since 1989, no distribution map has ever been constructed for this endemic alpine rodent in Switzerland. Based on redetermined museum material and using Ecological-Niche Factor Analysis (ENFA), habitat-suitability maps were computed for A. alpicola, and also for the co-occurring A. flavicollis and A. sylvaticus. Methods: In the particular case of habitat-suitability models, classical approaches (GLMs, GAMs, discriminant analysis, etc.) generally require both presence and absence data. The presence records provided by museums can clearly give useful information about species distribution and ecology and have already been used for knowledge-based mapping. In this paper, we apply ENFA, which requires only presence data, to build a habitat-suitability map of three species of Apodemus on the basis of museum skull collections. Results: Interspecific niche comparisons showed that A. alpicola is highly specialized in its habitat selection, meaning that its habitat differs unequivocally from the average conditions in Switzerland, while both A. flavicollis and A. sylvaticus can be considered 'generalists' in the study area. Main conclusions: Although an adequate sampling design is the best way to collect ecological data for predictive modelling, it is a time- and money-consuming process, and there are cases where time is simply not available, as for instance in endangered-species conservation. On the other hand, museums, herbaria and other similar institutions hold huge presence-only data sets. By applying ENFA to such data it is possible to rapidly construct a habitat-suitability model. The ENFA method not only provides two key measures of a species' niche (i.e. marginality and specialization), but also has ecological meaning, and allows the scientist to compare directly the niches of different species.
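The two niche measures named in this abstract can be illustrated for a single environmental variable. The sketch below is a simplified, per-variable version of ENFA (the real method factors all variables jointly); the 1.96 scaling follows the convention that one marginality unit spans roughly 95% of the background distribution.

```python
# Sketch of ENFA-style marginality and specialization for ONE environmental
# variable, from presence-only records against the global background.
# Illustrative only; real ENFA extracts factors from all variables jointly.
import statistics

def marginality(presence, background):
    # |species mean - global mean| scaled by 1.96 global SDs:
    # values near 1 mean the species lives far from average conditions.
    mg = statistics.mean(background)
    ms = statistics.mean(presence)
    sg = statistics.pstdev(background)
    return abs(ms - mg) / (1.96 * sg)

def specialization(presence, background):
    # ratio of global to species spread: > 1 indicates a narrower niche.
    return statistics.pstdev(background) / statistics.pstdev(presence)
```

A specialist such as A. alpicola would score high on both measures, while a generalist's presence records would resemble the background, giving values near 0 and 1 respectively.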
Abstract:
A solution of (18)F was standardised with a 4πβ-4πγ coincidence counting system in which the beta detector is a one-inch-diameter cylindrical UPS89 plastic scintillator positioned at the bottom of a well-type 5″ × 5″ NaI(Tl) gamma-ray detector. Almost full detection efficiency, which was varied downwards electronically, was achieved in the beta channel. Aliquots of this (18)F solution were also measured using 4πγ NaI(Tl) integral counting with Monte Carlo calculated efficiencies, as well as the CIEMAT-NIST method. Secondary measurements of the same solution were also performed with an IG11 ionisation chamber whose equivalent activity is traceable to the Système International de Référence through the contribution IRA-METAS made to it in 2001; IRA's degree of equivalence was found to be close to the key comparison reference value (KCRV). The (18)F activity predicted by this coincidence system agrees closely with the ionisation chamber measurement and is compatible within one standard deviation with the other primary measurements. This work demonstrates that our new coincidence system can standardise short-lived radionuclides used in nuclear medicine.
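The principle behind beta-gamma coincidence standardisation can be stated in its idealized textbook form: with beta-channel rate Nβ, gamma rate Nγ and coincidence rate Nc, the detection efficiencies cancel and the activity follows A ≈ NβNγ/Nc. This is a simplification; a real standardisation such as the one above applies dead-time, background and efficiency-extrapolation corrections.

```python
# Idealized coincidence-counting relation: efficiencies cancel because
# Nb = eb*A, Ng = eg*A, Nc = eb*eg*A, so Nb*Ng/Nc recovers A.
# Textbook form only; real measurements need dead-time and background
# corrections and an extrapolation in beta efficiency.
def coincidence_activity(n_beta, n_gamma, n_coinc):
    return n_beta * n_gamma / n_coinc
```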
Abstract:
INTRODUCTION: Persistent atrial fibrillation (AF) ablation may lead to partial disconnection of the coronary sinus (CS). As a result, disparate activation sequences of the local CS versus contiguous left atrium (LA) may be observed during atrial tachycardia (AT). We aimed to evaluate the prevalence of this phenomenon and its impact on activation mapping. METHODS: ATs occurring after persistent AF ablation were investigated in 74 consecutive patients. Partial CS disconnection during AT was suspected when double potentials with disparate activation sequences were observed on the CS catheter. Endocardial mapping facing the CS bipoles was performed to differentiate LA far-field from local CS potentials. RESULTS: A total of 149 ATs were observed. Disparate LA-CS activations were apparent in 20 ATs after magnifying the recording scale (13%). The most common pattern (90%) was distal-to-proximal endocardial LA activation against proximal-to-distal CS activation, the latter involving the whole CS or its distal part. Perimitral macroreentry was more common when disparate LA-CS activations were observed (67% vs 29%; P = 0.002). Partial CS disconnection also resulted in "pseudo" mitral isthmus (MI) block during LA appendage pacing in 20% of patients, as local CS activation was proximal to distal despite distal-to-proximal activation of the contiguous LA. CONCLUSION: Careful analysis of CS recordings during AT following persistent AF ablation often reveals disparate patterns of activation. Recognizing when endocardial LA activation occurs in the opposite direction to the more obvious local CS signals is critical to avoid misleading interpretations during mapping of AT and evaluation of MI block.
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
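A concrete instance of the power-transformation parametrization mentioned above is the Box-Cox family, which interpolates smoothly between analyses of the raw data (alpha = 1) and of the log-transformed data (alpha approaching 0). A minimal sketch of the "movie" idea, with the frame values purely illustrative:

```python
import math

def box_cox(x, alpha):
    # Power-transform family linking raw and log analyses:
    # alpha = 1 is an affine rescaling of x; alpha -> 0 tends to log(x).
    if alpha == 0:
        return math.log(x)
    return (x ** alpha - 1) / alpha

# "frames" of the movie: re-transform (and in a full analysis, recompute
# the map) at each small step of the linking parameter
frames = [[box_cox(x, a / 10) for x in (1.0, 2.0, 4.0)]
          for a in range(10, -1, -1)]
```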
Abstract:
We studied the response to F+0 renography and the relative and absolute individual kidney function in neonates and infants younger than 6 mo before and after surgery for unilateral ureteropelvic junction obstruction (UJO). METHODS: The results obtained at diagnosis and after pyeloplasty for 9 children (8 boys, 1 girl; age range, 0.8-5.9 mo; mean age +/- SD, 2.4 +/- 1.5 mo) with proven unilateral UJO (i.e., affected kidney [AK]) and an unremarkable contralateral kidney (i.e., normal kidney [NK]) were evaluated and compared with a control group of 10 children (6 boys, 4 girls; age range, 0.8-2.8 mo; mean age, 1.5 +/- 0.7 mo) selected because of symmetric renal function, absence of vesicoureteral reflux or infection, and an initially dilated but not obstructed renal pelvis as proven by follow-up. Renography was performed for 20 min after injection of (123)I-hippuran (OIH) (0.5-1.0 MBq/kg) immediately followed by furosemide (1 mg/kg). The relative and absolute renal functions and the response to furosemide were measured on background-subtracted and depth-corrected renograms. The response to furosemide was quantified by an elimination index (EI), defined as the ratio of the 3- to 20-min activities: an EI ≥ 3 was considered definitively normal and an EI ≤ 1 definitively abnormal. If the EI was equivocal (1 < EI < 3), the response to gravity-assisted drainage was used to differentiate AKs from NKs. Absolute separate renal function was measured by an accumulation index (AI), defined as the percentage of (123)I-OIH (%ID) extracted by the kidney 30-90 s after maximal cardiac activity. RESULTS: All AKs had definitively abnormal EIs at diagnosis (mean, 0.56 +/- 0.12), significantly lower than the EIs of the NKs (mean, 3.24 +/- 1.88) and of the 20 control kidneys (mean, 3.81 +/- 1.97; P < 0.001). The EIs of the AKs improved significantly (mean, 2.81 +/- 0.64; P < 0.05) after pyeloplasty.
At diagnosis, the AIs of the AKs were significantly lower (mean, 6.31 +/- 2.33 %ID) than the AIs of the NKs (mean, 9.43 +/- 1.12 %ID) and of the control kidneys (mean, 9.05 +/- 1.17 %ID; P < 0.05). The AIs of the AKs increased at follow-up (mean, 7.81 +/- 2.23 %ID) but remained lower than those of the NKs (mean, 10.75 +/- 1.35 %ID; P < 0.05). CONCLUSION: In neonates and infants younger than 6 mo, (123)I-OIH renography with early furosemide injection (F+0) allowed us to reliably diagnose AKs and to determine if parenchymal function was normal or impaired and if it improved after surgery.
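The elimination index defined in this abstract, together with its classification thresholds (EI ≥ 3 normal, EI ≤ 1 abnormal, otherwise equivocal), can be sketched directly:

```python
# Elimination index (EI) as defined in the abstract: the ratio of
# renogram activity at 3 min to activity at 20 min after furosemide.
def elimination_index(activity_3min, activity_20min):
    return activity_3min / activity_20min

def classify(ei):
    # thresholds taken from the abstract
    if ei >= 3:
        return "normal"
    if ei <= 1:
        return "abnormal"
    return "equivocal"
```

For an equivocal EI the study falls back on gravity-assisted drainage, which is outside this sketch.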
Abstract:
The integration of geophysical data into the subsurface characterization problem has been shown in many cases to significantly improve hydrological knowledge by providing information at spatial scales and locations that are unattainable using conventional hydrological measurement techniques. The investigation of exactly how much benefit can be brought by geophysical data in terms of its effect on hydrological predictions, however, has received considerably less attention in the literature. Here, we examine the potential hydrological benefits brought by a recently introduced simulated annealing (SA) conditional stochastic simulation method designed for the assimilation of diverse hydrogeophysical data sets. We consider the specific case of integrating crosshole ground-penetrating radar (GPR) and borehole porosity log data to characterize the porosity distribution in saturated heterogeneous aquifers. In many cases, porosity is linked to hydraulic conductivity and thus to flow and transport behavior. To perform our evaluation, we first generate a number of synthetic porosity fields exhibiting varying degrees of spatial continuity and structural complexity. Next, we simulate the collection of crosshole GPR data between several boreholes in these fields, and the collection of porosity log data at the borehole locations. The inverted GPR data, together with the porosity logs, are then used to reconstruct the porosity field using the SA-based method, along with a number of other more elementary approaches. Assuming that the grid-cell-scale relationship between porosity and hydraulic conductivity is unique and known, the porosity realizations are then used in groundwater flow and contaminant transport simulations to assess the benefits and limitations of the different approaches.
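The core loop of a simulated-annealing conditional simulation can be sketched generically: perturb one cell of the field, recompute the misfit against the conditioning data, and accept or reject with the Metropolis rule under a decreasing temperature. The field, target and parameters below are toy stand-ins, not the paper's actual objective function.

```python
import math, random

# Bare-bones SA loop in the spirit of the conditional simulation above:
# perturb a (1D, toy) porosity field cell by cell and accept moves with
# the Metropolis criterion so the field gradually honours the "data".
def anneal(field, target, steps=5000, t0=1.0, cooling=0.999, seed=1):
    rng = random.Random(seed)
    misfit = sum((f - t) ** 2 for f, t in zip(field, target))
    temp = t0
    for _ in range(steps):
        i = rng.randrange(len(field))
        old = field[i]
        field[i] = old + rng.uniform(-0.05, 0.05)    # local perturbation
        new_misfit = sum((f - t) ** 2 for f, t in zip(field, target))
        delta = new_misfit - misfit
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            misfit = new_misfit                       # accept the move
        else:
            field[i] = old                            # reject, restore
        temp *= cooling                               # cool the schedule
    return field, misfit
```

In the real method the misfit would combine GPR-derived constraints, porosity logs and a geostatistical model rather than a simple cell-wise target.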
Abstract:
Objective: The purpose of this study was to find loci for major depression via linkage analysis of a large sibling pair sample. Method: The authors conducted a genome-wide linkage analysis of 839 families consisting of 971 affected sibling pairs with severe recurrent major depression, comprising waves I and II of the Depression Network Study cohort. In addition to examining affected status, linkage analyses in the full data set were performed using diagnoses restricted by impairment severity, and association mapping of hits in a large case-control data set was attempted. Results: The authors identified genome-wide significant linkage to chromosome 3p25-26 when the diagnoses were restricted by severity, with a maximum LOD score of 4.0 centered at the linkage marker D3S1515. The linkage signal identified was genome-wide significant after correction for the multiple phenotypes tested, although subsequent association mapping of the region in a genome-wide association study of a U.K. depression sample did not provide significant results. Conclusions: The authors report a genome-wide significant locus for depression that implicates genes that are highly plausible for involvement in the etiology of recurrent depression. Despite the fact that association mapping in the region was negative, the linkage finding was replicated by another group, who found genome-wide significant linkage for depression in the same region. This suggests that 3p25-26 is a new locus for severe recurrent depression. This represents the first report of a genome-wide significant locus for depression that also has an independent genome-wide significant replication.
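The LOD statistic reported here is a log10 likelihood ratio against the no-linkage null. As a textbook illustration (the classic two-point form for r recombinants among n informative meioses, not the affected-sib-pair statistic actually used in the study):

```python
import math

# Two-point LOD score: log10 likelihood ratio between a model with
# recombination fraction theta and the null theta = 0.5, given
# r recombinants in n informative meioses. A LOD of 3 or more is the
# conventional evidence threshold for linkage.
def lod_score(r, n, theta):
    l_theta = theta ** r * (1 - theta) ** (n - r)
    l_null = 0.5 ** n
    return math.log10(l_theta / l_null)
```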
Abstract:
The aim of our study was to provide an innovative headspace gas chromatography-mass spectrometry (HS-GC-MS) method applicable to the routine determination of blood CO concentration in forensic toxicology laboratories. The main drawback of the GC/MS methods discussed in the literature for CO measurement is the absence of a specific CO internal standard necessary for performing quantification. Even though a stable isotope of CO is commercially available in the gaseous state, it is essential to develop a safer method to limit the manipulation of gaseous CO and to precisely control the injected amount of CO for spiking and calibration. To avoid the manipulation of a stable isotope-labeled gas, we chose to generate in situ, in a vial, an internal labeled standard gas ((13)CO) formed by the reaction of labeled formic acid (H(13)COOH) with sulfuric acid. As sulfuric acid can also be employed to liberate CO from whole blood, the procedure allows the liberation of CO simultaneously with the generation of (13)CO. This method allows precise measurement of blood CO concentrations from a small amount of blood (10 μL). Finally, the method was applied to measure the CO concentration of intoxicated human blood samples from autopsies.
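Quantification against an in-situ internal standard of this kind reduces to reading the CO/(13)CO peak-area ratio off a calibration. A minimal sketch, assuming a single-point calibration with a calibrant spiked with the same amount of (13)CO (the function and parameter names are illustrative, not the paper's):

```python
# Isotope-dilution style quantification sketch: the unknown's CO/(13)CO
# peak-area ratio is compared with that of a calibrant of known CO
# concentration prepared with the same internal-standard spike.
# One-point calibration is an assumption for illustration.
def co_concentration(area_co, area_13co, ratio_cal, conc_cal):
    # ratio_cal: CO/(13)CO area ratio of the calibrant
    # conc_cal:  known CO concentration of the calibrant
    return (area_co / area_13co) / ratio_cal * conc_cal
```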
Abstract:
This paper describes a method for extracting the most relevant contours of an image. The method integrates the information of the local contours from the chromatic components H, S and I, taking into account the coherence of the local contour orientation values obtained from each of these components. The process is based on parametrizing, pixel by pixel, the local contours (magnitude and orientation values) from the H, S and I images. This process is carried out individually for each chromatic component. If the dispersion of the obtained orientation values is high, that chromatic component loses relevance. A final processing step integrates the extracted contours of the three chromatic components, generating the so-called integrated contours image.
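The orientation-dispersion criterion can be sketched with the standard circular statistic: the mean resultant length of the doubled angles (doubled because contour orientations are 180-degree periodic). This is one common way to score orientation coherence, offered as an assumption about what the paper's dispersion measure looks like:

```python
import math

# Coherence of local contour orientations for one chromatic component
# (H, S or I): mean resultant length of the doubled angles.
# Near 1 = coherent orientations (component kept relevant);
# near 0 = high dispersion (component loses relevance).
def orientation_coherence(orientations_rad):
    n = len(orientations_rad)
    c = sum(math.cos(2 * t) for t in orientations_rad) / n
    s = sum(math.sin(2 * t) for t in orientations_rad) / n
    return math.hypot(c, s)
```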
Abstract:
This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot in a cable tracking task. The learning system is characterized by the use of a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robotics. In order to speed up the process, the learning phase was carried out in a simulated environment and, in a second step, the policy was transferred to and tested successfully on a real robot. Future work will continue the learning process on-line on the real robot while it performs the task. We demonstrate the feasibility of the approach with real experiments on the underwater robot ICTINEU AUV.
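Direct policy search in its simplest form is a policy-gradient (REINFORCE-style) update on the parameters of a stochastic policy. The toy two-action problem below is a stand-in for the robot's state/action mapping, not the ICTINEU AUV setup:

```python
import math, random

# Minimal direct policy search (REINFORCE) on a two-action problem:
# a softmax policy over action preferences theta, updated along
# reward * grad(log pi). Task and rewards are toy assumptions.
def train(steps=2000, lr=0.1, seed=0):
    rng = random.Random(seed)
    theta = [0.0, 0.0]                      # one preference per action
    for _ in range(steps):
        z = [math.exp(t) for t in theta]
        total = sum(z)
        probs = [v / total for v in z]      # softmax policy
        a = 0 if rng.random() < probs[0] else 1
        reward = 1.0 if a == 1 else 0.0     # action 1 is the "correct" one
        for i in range(2):
            # grad of log pi(a): indicator minus probability
            grad = (1.0 if i == a else 0.0) - probs[i]
            theta[i] += lr * reward * grad
    z = [math.exp(t) for t in theta]
    return [v / sum(z) for v in z]
```

The long convergence times mentioned in the abstract come from exactly this kind of sampled-gradient update, which is why the authors pre-train in simulation before transferring the policy to hardware.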
Abstract:
This paper presents a complete solution for creating accurate 3D textured models from monocular video sequences. The methods are developed within the framework of sequential structure from motion, in which a 3D model of the environment is maintained and updated as new visual information becomes available. The camera position is recovered by directly associating the 3D scene model with local image observations. Compared to standard structure-from-motion techniques, this approach decreases error accumulation while increasing robustness to scene occlusions and feature association failures. The obtained 3D information is used to generate high-quality, composite visual maps of the scene (mosaics). The visual maps are then used to create texture-mapped, realistic views of the scene.
Abstract:
Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them to increase the network's resilience, provide more efficient services, or improve some other aspect of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimations because it does not take into account aspects relevant to networking, such as heterogeneity in link capacity or the difference between node pairs in their contribution to the total traffic. A new algorithm for discovering link centrality in transport networks is proposed in this paper. It requires only static or semi-static network and topology attributes, yet produces estimations of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application in which the simple shortest-path routing algorithm is improved in such a way that it outperforms other more advanced algorithms in terms of blocking ratio.
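One of the networking-aware corrections the abstract argues for, weighting each source-destination pair by its offered traffic instead of counting all pairs equally, can be sketched on a toy graph. This is an illustration of the idea, not the paper's algorithm (which also accounts for link capacity):

```python
from collections import deque
from itertools import permutations

# Toy traffic-aware link "betweenness": credit each link on a shortest
# path (in hops) with the traffic demand of that source-destination pair,
# instead of the uniform +1 of plain betweenness centrality.
def link_scores(adj, traffic):
    scores = {}
    for s, d in permutations(adj, 2):
        demand = traffic.get((s, d), 0)
        if not demand:
            continue
        # BFS for one shortest s-d path
        prev = {s: None}
        q = deque([s])
        while q:
            u = q.popleft()
            if u == d:
                break
            for v in adj[u]:
                if v not in prev:
                    prev[v] = u
                    q.append(v)
        # walk the path back, crediting each traversed link
        node = d
        while prev.get(node) is not None:
            e = frozenset((node, prev[node]))
            scores[e] = scores.get(e, 0) + demand
            node = prev[node]
    return scores
```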
Abstract:
In networks with small buffers, such as networks based on optical packet switching (OPS), the convolution approach (CA) is presented as one of the most accurate methods used for connection admission control. Admission control and resource management have been addressed in other works oriented to bursty traffic and ATM. This paper focuses on heterogeneous traffic in OPS-based networks, for which, under the bufferless assumption, the enhanced convolution approach (ECA) is a good solution. However, both methods (CA and ECA) present a high computational cost for large numbers of connections. Two new mechanisms (UMCA and ISCA), based on the Monte Carlo method, are proposed to overcome this drawback. Simulation results show that our proposals achieve a lower computational cost than the enhanced convolution approach, with a small stochastic error in the probability estimation.
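The Monte Carlo trade-off the abstract describes (lower cost, small stochastic error) can be illustrated on a simple bufferless admission-control quantity: the probability that the aggregate rate of on/off connections exceeds link capacity. Exact convolution-style enumeration is exponential in the number of connections; sampling is not. The model below is a generic on/off sketch, not the UMCA/ISCA schemes themselves:

```python
import random
from itertools import product

# Monte Carlo estimate of the overflow probability on a bufferless link:
# each admitted connection is independently "on" with probability p_on
# and then contributes its peak rate; estimate P(total rate > capacity).
def mc_overflow(conns, capacity, trials=20000, seed=7):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        load = sum(rate for rate, p_on in conns if rng.random() < p_on)
        hits += load > capacity
    return hits / trials

def exact_overflow(conns, capacity):
    # brute-force enumeration (the convolution-style answer),
    # feasible only for a handful of connections
    prob = 0.0
    for states in product((0, 1), repeat=len(conns)):
        p, load = 1.0, 0.0
        for on, (rate, p_on) in zip(states, conns):
            p *= p_on if on else 1 - p_on
            load += rate * on
        if load > capacity:
            prob += p
    return prob
```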
Abstract:
Schistosomiasis mansoni is not just a physical disease; it is related to social and behavioural factors as well. Snails of the genus Biomphalaria are an intermediate host for Schistosoma mansoni and infect humans through water. The objective of this study is to classify the risk of schistosomiasis in the state of Minas Gerais (MG). We focus on socioeconomic and demographic features, basic sanitation features, the presence of accumulated water bodies, dense vegetation in the summer and winter seasons, and related terrain characteristics. We draw on the decision-tree approach to infection-risk modelling and mapping, and the robustness of the model was verified. The main variables selected by the procedure included the terrain's water-accumulation capacity, temperature extremes and the Human Development Index. The model was then used to generate two maps: one with the risk classification for the entire state of MG and another with the classification errors. The resulting map was 62.9% accurate.
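The variable-selection behaviour described above falls out of how a decision tree is grown: at each node the split that most reduces impurity is chosen, so the most discriminative variables (here water-accumulation capacity, temperature extremes, HDI) surface first. A minimal sketch of one such split, with Gini impurity and toy stand-in features:

```python
# One impurity-minimising split, the building block of a decision-tree
# classifier like the one used for the risk map. Features and labels
# below are toy stand-ins for variables such as water-accumulation
# capacity or HDI, with binary risk labels.
def gini(labels):
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(rows, labels):
    # rows: list of feature vectors; returns the (feature, threshold)
    # pair minimising the weighted impurity of the two children
    best = (None, None, float("inf"))
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left = [l for r, l in zip(rows, labels) if r[f] <= t]
            right = [l for r, l in zip(rows, labels) if r[f] > t]
            score = (len(left) * gini(left)
                     + len(right) * gini(right)) / len(rows)
            if score < best[2]:
                best = (f, t, score)
    return best[0], best[1]
```

Growing the full tree repeats this split recursively on each child until the nodes are pure or a stopping rule fires.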