979 results for clustering techniques
Abstract:
OBJECTIVE: This review describes and evaluates the results of laparoscopic aortic surgery. METHODS: We describe the different laparoscopic techniques used to treat aortic disease, including (1) total laparoscopic aortic surgery (TLS), (2) laparoscopy-assisted procedures including hand-assisted laparoscopic surgery (HALS), and (3) robot-assisted laparoscopic surgery, with their current indications. Results of these techniques are analyzed in a systematic review of the clinical series published between 1998 and 2008, each containing >10 patients with complete information concerning operative time, clamping time, conversion rate, length of hospital stay, morbidity, and mortality. RESULTS: We selected and reviewed 29 studies that included 1073 patients. Heterogeneity of the studies and selection of the patients made comparison with current open or endovascular surgery difficult. Median operative time varied widely in TLS, from 240 to 391 minutes. HALS had the shortest operating time. Median clamping time varied from 60 to 146 minutes in TLS and was shorter in HALS. Median hospital stay varied from 4 to 10 days regardless of the laparoscopic technique. The postoperative mortality rate was 2.1% (95% confidence interval, 1.4-3.0), with no significant difference between patients treated for occlusive disease and those treated for aneurysmal disease. Conversion to open surgery was necessary in 8.1% of patients; the conversion rate was slightly higher with TLS than with laparoscopy-assisted techniques (P = .07). CONCLUSIONS: Analysis of these series shows that laparoscopic aortic surgery can be performed safely provided that patient selection is adjusted to the surgeon's experience and conversion is liberally performed. The future of this technique in comparison with endovascular surgery is still unknown, and it is now time for multicenter randomized trials to demonstrate the potential benefit of this type of surgery.
Abstract:
The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool for determining the parameters that define this structure, which are then used to interpolate values at unsampled points by kriging. However, parameter estimation can be strongly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques for Gaussian spatial linear models in geostatistics to evaluate the sensitivity of maximum likelihood and restricted maximum likelihood estimators to small perturbations in these data. For this purpose, studies with simulated and experimental data were conducted. The results with simulated data showed that the diagnostic techniques were effective in identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may strongly influence thematic maps, changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information in thematic maps.
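A minimal sketch of the sensitivity check described above, using scikit-learn's GaussianProcessRegressor as a stand-in for a Gaussian spatial linear model whose covariance parameters are fitted by maximum likelihood. The coordinates, values, and perturbation size are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(50, 2))                 # sampled locations (synthetic)
y = np.sin(X[:, 0] / 15) + 0.1 * rng.standard_normal(50)

def fitted_kernel(values):
    # ML estimation of the covariance parameters (range/sill/nugget analogues)
    kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.01)
    gp = GaussianProcessRegressor(kernel=kernel).fit(X, values)
    return gp.kernel_

print("original: ", fitted_kernel(y))

y_pert = y.copy()
y_pert[0] += 5 * y.std()                              # one atypical observation
print("perturbed:", fitted_kernel(y_pert))            # compare fitted parameters
```

A large shift in the fitted covariance parameters after perturbing a single observation is the kind of sensitivity the diagnostic techniques are meant to flag.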
Abstract:
The current state of regional and urban science has been much discussed, and a number of studies have speculated on possible future trends in the development of the discipline. However, there has been little empirical analysis of current publication patterns in regional and urban journals. This paper studies the kinds of topics, techniques, and data used in articles published in nine top international journals during the 1990s, with the aim of identifying current trends in this research field.
Abstract:
In groundwater applications, Monte Carlo methods are employed to model the uncertainty in geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and a large number of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered when estimating the uncertainty. We propose to use the information from the approximate responses themselves for uncertainty quantification. The subset of exact solutions provided by DKM is then employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised; both employ the difference between approximate and exact medoid solutions, but they differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors, regardless of the cluster to which a given realization belongs. These error models are evaluated for an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and effectively correct the bias of the estimate computed solely from the approximate multiscale finite-volume (MsFV) results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques for selecting a subset of realizations.
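A minimal sketch of the medoid selection and the Local Error Model described above, under strong simplifications: `approx_model` and `exact_model` are hypothetical stand-ins for the cheap and reference flow solvers, responses are scalar, and plain k-means replaces DKM's kernel/distance machinery.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
realizations = rng.standard_normal((500, 20))      # geological parameter sets (synthetic)

def approx_model(r):                               # cheap, biased response (assumption)
    return r[:5].sum() + 0.3

def exact_model(r):                                # expensive reference response (assumption)
    return r[:5].sum()

approx = np.array([approx_model(r) for r in realizations])

# Cluster realizations in the (here one-dimensional) approximate-response space.
k = 10
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(approx.reshape(-1, 1))

corrected = approx.copy()
for c in range(k):
    members = np.where(labels == c)[0]
    centroid = approx[members].mean()
    medoid = members[np.argmin(np.abs(approx[members] - centroid))]
    error = exact_model(realizations[medoid]) - approx[medoid]
    corrected[members] += error                    # Local Error Model: one shift per cluster

# Exact responses for ALL realizations, only to verify the toy example;
# in practice the exact model is run solely on the k medoids.
exact = np.array([exact_model(r) for r in realizations])
print("mean abs error before:", np.abs(approx - exact).mean())
print("mean abs error after: ", np.abs(corrected - exact).mean())
```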
Abstract:
The proportion of the population living in or around cities is higher than ever and still growing. Urban sprawl and car dependence have supplanted the pedestrian-friendly compact city. Environmental problems such as air pollution, land waste, and noise, along with health problems, are the result of this still-continuing process. Urban planners have to find solutions to these complex problems while at the same time ensuring the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. In order to better understand the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help to communicate the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation, and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is demonstrated using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed. A new approach for defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scaling laws are used to characterise urban clusters. In a last section, the population evolution is modelled using a model close to the well-established gravity model. The work covers quite a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.
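A minimal sketch of the self-organising-map analysis mentioned above, using the MiniSom package on synthetic stand-in data; the grid size, training length, and input indicators are illustrative assumptions, not the thesis's actual setup.

```python
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(2)
units = rng.standard_normal((300, 12))      # 300 spatial units x 12 socio-economic indicators

som = MiniSom(8, 8, 12, sigma=1.5, learning_rate=0.5, random_seed=0)
som.random_weights_init(units)
som.train_random(units, 5000)

# Each unit is mapped to its best-matching node; nearby nodes group similar
# high-dimensional profiles, giving a 2-D view of the data.
cells = np.array([som.winner(x) for x in units])
print(np.unique(cells, axis=0).shape[0], "occupied map cells")
```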
Abstract:
A recurring task in the analysis of mass genome annotation data from high-throughput technologies is the identification of peaks or clusters in a noisy signal profile. Examples of such applications are the definition of promoters on the basis of transcription start site profiles, the mapping of transcription factor binding sites based on ChIP-chip data, and the identification of quantitative trait loci (QTL) from whole-genome SNP profiles. Input to such an analysis is a set of genome coordinates associated with counts or intensities. The output consists of a discrete number of peaks with respective volumes, extensions, and center positions. We have developed for this purpose a flexible one-dimensional clustering tool, called MADAP, which we make available as a web server and as a standalone program. A set of parameters enables the user to customize the procedure to a specific problem. The web server, which returns results in textual and graphical form, is useful for small- to medium-scale applications, as well as for evaluation and parameter tuning in view of large-scale applications, which require a local installation. The program, written in C++, can be freely downloaded from ftp://ftp.epd.unil.ch/pub/software/unix/madap. The MADAP web server can be accessed at http://www.isrec.isb-sib.ch/madap/.
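A minimal sketch of one-dimensional clustering of genome coordinates with counts, in the spirit of the task MADAP addresses; MADAP's actual algorithm and parameters differ, and the `gap` threshold and toy data here are illustrative assumptions.

```python
import numpy as np

positions = np.array([100, 102, 105, 300, 301, 304, 306, 900])  # genome coordinates
counts    = np.array([  5,   3,   7,  10,   2,   6,   1,   4])  # associated counts

gap = 50                                     # split clusters where neighbours are > gap apart
breaks = np.where(np.diff(positions) > gap)[0] + 1
for cluster in np.split(np.arange(len(positions)), breaks):
    pos, cnt = positions[cluster], counts[cluster]
    volume = cnt.sum()                       # total signal in the peak
    center = np.average(pos, weights=cnt)    # count-weighted center position
    span = (pos.min(), pos.max())            # extension of the peak
    print(f"peak at {center:.1f}, volume {volume}, span {span}")
```

This yields exactly the outputs the abstract names: a discrete number of peaks with volumes, extensions, and center positions.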
Abstract:
As our nation’s highway system continues to age, asphalt maintenance and rehabilitation techniques have become increasingly important. The deterioration of pavement over time is inevitable. Preventive maintenance is a strategy to extend the serviceable life of a pavement by applying cost-effective treatments that slow its deterioration. Thin maintenance surfaces (TMSs) are preventive maintenance techniques that can effectively prolong the life of a pavement when applied at an opportune time. Common TMSs include bituminous fog seal, bituminous seal coat, slurry seal, cold in-place recycling (CIR), and micro-surfacing. This research project investigated ways to improve Iowa Statewide Urban Design and Specifications (SUDAS) and Iowa Department of Transportation (DOT) documents regarding asphalt roadway maintenance and rehabilitation. Researchers led an effort to review the documents supporting proper selection, design, and construction of asphalt maintenance and rehabilitation techniques and to help ensure that they reflect the latest research findings on these processes: seal coating, slurry sealing, micro-surfacing, and fog sealing. Full results of this investigation are included in this report and its appendices. The report also presents a summary of recommendations based on the study results.
Abstract:
The laparoscopic approach has emerged as a valid option for surgical management of kidney cancer, as well as a few benign pathologies. The immediate benefits of laparoscopy are well established and include less estimated blood loss, decreased pain, shorter perioperative convalescence, and improved cosmesis. Long-term oncologic outcomes of patients treated laparoscopically for kidney tumors are similar to those of open surgery.
Abstract:
Since different pedologists will draw different soil maps of the same area, it is important to compare the differences between mapping by specialists and mapping techniques such as the currently much-discussed Digital Soil Mapping. Four detailed soil maps (scale 1:10,000) of a 182-ha sugarcane farm in the county of Rafard, São Paulo State, Brazil, were compared. The area has a large variation of soil formation factors. The maps were drawn independently by four soil scientists and compared with a fifth map obtained by a digital soil mapping technique. All pedologists were given the same set of information, and as many field expeditions and soil pits as required by each surveyor were provided to define the mapping units (MUs). For the Digital Soil Map (DSM), spectral data were extracted from Landsat 5 Thematic Mapper (TM) imagery, along with six terrain attributes derived from the topographic map of the area. These data were summarized by principal component analysis, and the map unit delineations were generated by fuzzy k-means clustering. Field observations were made to identify the soils in the MUs and classify them according to the Brazilian Soil Classification System (BSCS). To compare the conventional and digital (DSM) soil maps, they were crossed pairwise to generate confusion matrices, which were then mapped. The categorical analysis at each classification level of the BSCS showed that the agreement between the maps decreased towards the lower levels of classification, and that the surveyor strongly influenced both the mapping and the definition of MUs in the soil map. On average, the correspondence between the DSM and the conventional maps was similar to that among the conventional maps themselves. Therefore, the method used to obtain the DSM yielded results similar to those of the conventional technique, while providing additional information about the landscape of each soil, useful for future surveys of similar areas.
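A minimal sketch of the PCA + fuzzy k-means step described above, on synthetic stand-in pixels; the number of classes, the fuzziness exponent m, and the input covariates are illustrative assumptions, and fuzzy c-means is implemented inline rather than with the authors' software.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
pixels = rng.standard_normal((2000, 12))     # 6 TM bands + 6 terrain attributes (stand-in)

scores = PCA(n_components=3).fit_transform(pixels)   # summarize the covariates

def fuzzy_cmeans(X, c, m=2.0, iters=100):
    u = rng.dirichlet(np.ones(c), size=len(X))       # random initial fuzzy memberships
    for _ in range(iters):
        w = u ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]             # weighted class centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))                           # membership update
        u /= u.sum(axis=1, keepdims=True)
    return u, centers

u, centers = fuzzy_cmeans(scores, c=5)
hard = u.argmax(axis=1)                      # hardened classes for the final map units
print(np.bincount(hard))
```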
Abstract:
A practical activity designed to introduce wavefront coding as a method to extend the depth of field of optical systems is presented. The activity is suitable for advanced undergraduate students, since it combines different topics in optical engineering such as optical system design, aberration theory, Fourier optics, and digital image processing. This paper provides the theoretical background and technical information needed to perform the experiment. The proposed activity requires students to develop a wide range of skills, since they are expected to work with optical components, including spatial light modulators, and to write scripts to perform some of the calculations.
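A minimal sketch of the wavefront-coding idea behind such an activity: a cubic phase mask (a common choice, assumed here; the paper may use a different mask) makes the point spread function nearly invariant with defocus, so a single digital deconvolution restores focus over an extended range. The grid size and mask strength are illustrative.

```python
import numpy as np

n = 256
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2 <= 1).astype(float)      # circular aperture

def psf(defocus_w20, alpha):
    # Pupil phase: defocus term W20*rho^2 plus cubic mask alpha*(x^3 + y^3), in waves.
    phase = 2j * np.pi * (defocus_w20 * (X**2 + Y**2) + alpha * (X**3 + Y**3))
    field = np.fft.fftshift(np.fft.fft2(pupil * np.exp(phase)))
    return np.abs(field) ** 2

for w20 in (0.0, 1.0, 2.0):                   # defocus in waves
    plain = psf(w20, alpha=0.0)
    coded = psf(w20, alpha=10.0)              # cubic mask strength (assumption)
    print(f"W20={w20}: plain peak {plain.max():.3g}, coded peak {coded.max():.3g}")
```

The plain PSF peak collapses as defocus grows, while the coded PSF stays comparatively stable, which is what makes the subsequent digital restoration defocus-insensitive.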
Abstract:
Surface topography and light scattering were measured on 15 samples ranging from smooth surfaces to ground surfaces. The measurement techniques included an atomic force microscope, mechanical and optical profilers, a confocal laser scanning microscope, angle-resolved scattering, and total scattering. The samples included polished and ground fused silica, silicon carbide, sapphire, electroplated gold, and diamond-turned brass. The measurement instruments and techniques had different surface spatial wavelength band limits, so the measured roughnesses were not directly comparable. Two-dimensional power spectral density (PSD) functions were calculated from the digitized measurement data, and rms roughnesses were obtained by integrating the areas under the PSD curves between fixed upper and lower band limits. In this way, roughnesses measured with different instruments and techniques could be directly compared. Small differences between measurement techniques remained in the calculated roughnesses, but these could be explained mostly by surface topographical features, such as isolated particles, that affected the instruments in different ways.
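A minimal sketch of the band-limited roughness comparison described above: the rms roughness is the square root of the PSD integrated between fixed spatial-frequency limits. The abstract computes two-dimensional PSDs; this one-dimensional profile version shows the same idea, and the profile, sampling step, and band limits are illustrative assumptions.

```python
import numpy as np
from scipy.signal import periodogram

dx = 0.1e-6                                  # sampling step: 0.1 um (assumption)
rng = np.random.default_rng(4)
profile = np.cumsum(rng.standard_normal(4096)) * 1e-10   # synthetic surface heights (m)

f, psd = periodogram(profile, fs=1.0 / dx)   # 1-D PSD; units m^2 per (cycles/m)

f_lo, f_hi = 1e4, 1e6                        # fixed band limits in cycles/m (assumption)
band = (f >= f_lo) & (f <= f_hi)
rms = np.sqrt(np.trapz(psd[band], f[band]))  # band-limited rms roughness
print(f"rms roughness in [{f_lo:.0e}, {f_hi:.0e}] cycles/m: {rms:.3e} m")
```

Applying the same fixed band limits to every instrument's PSD is what makes roughness values from instruments with different spatial-wavelength ranges directly comparable.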