924 results for global optimization algorithms
Abstract:
In this paper, a computer-aided diagnostic (CAD) system for the classification of hepatic lesions from computed tomography (CT) images is presented. Regions of interest (ROIs) taken from nonenhanced CT images of normal liver, hepatic cysts, hemangiomas, and hepatocellular carcinomas have been used as input to the system. The proposed system consists of two modules: the feature extraction and the classification modules. The feature extraction module calculates the average gray level and 48 texture characteristics, which are derived from the spatial gray-level co-occurrence matrices obtained from the ROIs. The classifier module consists of three sequentially placed feed-forward neural networks (NNs). The first NN classifies into normal or pathological liver regions. The pathological liver regions are characterized by the second NN as cyst or "other disease." The third NN classifies "other disease" into hemangioma or hepatocellular carcinoma. Three feature selection techniques have been applied to each individual NN: sequential forward selection, sequential floating forward selection, and a genetic algorithm for feature selection. The comparative study of these dimensionality reduction methods shows that the genetic algorithm results in lower-dimensional feature vectors and improved classification performance.
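The GA-based feature selection compared above against SFS and SFFS can be sketched as a search over bit masks, one bit per feature. The sketch below is a minimal, generic genetic algorithm over 49-bit masks (the average gray level plus 48 texture features); the fitness function is a hypothetical stand-in that rewards a fixed "informative" subset and penalizes vector length, since a real system would instead score each mask by classifier accuracy. All parameter values are illustrative.

```python
import random

random.seed(0)

N_FEATURES = 49          # average gray level + 48 co-occurrence texture features
POP, GENS, MUT = 30, 40, 0.02

# Hypothetical stand-in fitness: rewards a small "informative" subset and
# penalizes mask size, mimicking the accuracy-vs-dimensionality trade-off
# a real classifier-based fitness would express.
INFORMATIVE = {0, 3, 7, 12, 20}

def fitness(mask):
    hits = sum(1 for i in INFORMATIVE if mask[i])
    return hits - 0.05 * sum(mask)

def crossover(a, b):
    cut = random.randrange(1, N_FEATURES)     # single-point crossover
    return a[:cut] + b[cut:]

def mutate(mask):
    return [bit ^ (random.random() < MUT) for bit in mask]   # rare bit flips

def ga_select():
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP // 2]              # truncation selection (elitist)
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = ga_select()
```

Because parents survive unmodified, the best mask found never degrades across generations, which keeps the search stable even with a high mutation rate.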
Abstract:
Background: Statistical shape models are widely used in biomedical research. They are routinely implemented for automatic image segmentation or object identification in medical images. In these fields, however, the acquisition of the large training datasets required to develop these models is usually a time-consuming process. Even after this effort, the collections of datasets are often lost or mishandled, resulting in replication of work. Objective: To solve these problems, the Virtual Skeleton Database (VSD) is proposed as a centralized storage system where the data necessary to build statistical shape models can be stored and shared. Methods: The VSD provides an online repository system tailored to the needs of the medical research community. The processing of the most common image file types, a statistical shape model framework, and an ontology-based search provide the generic tools to store, exchange, and retrieve digital medical datasets. The hosted data are accessible to the community, and collaborative research catalyzes their productivity. Results: To illustrate the need for an online repository for medical research, three exemplary projects of the VSD are presented: (1) an international collaboration to achieve improvement in cochlear surgery and implant optimization, (2) a population-based analysis of femoral fracture risk between genders, and (3) an online application developed for the evaluation and comparison of the segmentation of brain tumors. Conclusions: The VSD is a novel system for scientific collaboration for the medical image community with a data-centric concept and semantically driven search option for anatomical structures. The repository has proven to be a useful tool for collaborative model building, as a resource for biomechanical population studies, and for enhancing segmentation algorithms.
Abstract:
Dynamic systems, especially in real-life applications, are often determined by inter-/intra-variability, uncertainties and time-varying components. Physiological systems are probably the most representative example in which population variability, vital signal measurement noise and uncertain dynamics render their explicit representation and optimization a rather difficult task. Systems characterized by such challenges often require the use of adaptive algorithmic solutions able to perform an iterative structural and/or parametrical update process towards optimized behavior. Adaptive optimization presents the advantages of (i) individualization through learning of basic system characteristics, (ii) ability to follow time-varying dynamics and (iii) low computational cost. In this chapter, the use of online adaptive algorithms is investigated in two basic research areas related to diabetes management: (i) real-time glucose regulation and (ii) real-time prediction of hypo-/hyperglycemia. The applicability of these methods is illustrated through the design and development of an adaptive glucose control algorithm based on reinforcement learning and optimal control and an adaptive, personalized early-warning system for the recognition and alarm generation against hypo- and hyperglycemic events.
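The glucose-regulation side described above can be illustrated with the core update rule of reinforcement learning. The sketch below is a generic tabular Q-learning loop on a toy state chain standing in for glucose "zones"; the states, actions, rewards, and all parameter values are invented for illustration and are not the chapter's actual adaptive controller.

```python
import random

random.seed(1)

N_STATES = 4                      # toy "zones"; state 3 stands in for target range
ACTIONS = (0, 1)                  # 0 / 1: illustrative dose adjustments
ALPHA, GAMMA, EPS = 0.2, 0.9, 0.3
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def env_step(state, action):
    """Toy dynamics: action 1 moves toward the target zone, action 0 away."""
    nxt = min(state + 1, N_STATES - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward

for _ in range(500):              # episodes
    s = 0
    for _ in range(20):           # bounded episode length
        # epsilon-greedy exploration, then the standard Q-learning update
        a = random.choice(ACTIONS) if random.random() < EPS \
            else max(ACTIONS, key=lambda x: Q[s][x])
        nxt, r = env_step(s, a)
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[nxt]) - Q[s][a])
        if nxt == N_STATES - 1:
            break
        s = nxt
```

After training, the greedy policy prefers the action that moves toward the target zone in every non-terminal state, which is the "individualization through learning" property the chapter highlights.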
Abstract:
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the climate policy process, and project future climate change. Present-day analysis requires the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. Here we describe datasets and a methodology developed by the global carbon cycle science community to quantify all major components of the global carbon budget, including their uncertainties. We discuss changes compared to previous estimates, consistency within and among components, and methodology and data limitations. CO2 emissions from fossil fuel combustion and cement production (EFF) are based on energy statistics, while emissions from Land-Use Change (ELUC), including deforestation, are based on combined evidence from land cover change data, fire activity in regions undergoing deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. Finally, the global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms. For the last decade available (2002–2011), EFF was 8.3 ± 0.4 PgC yr−1, ELUC 1.0 ± 0.5 PgC yr−1, GATM 4.3 ± 0.1 PgC yr−1, SOCEAN 2.5 ± 0.5 PgC yr−1, and SLAND 2.6 ± 0.8 PgC yr−1. For year 2011 alone, EFF was 9.5 ± 0.5 PgC yr−1, 3.0 percent above 2010, reflecting a continued trend in these emissions; ELUC was 0.9 ± 0.5 PgC yr−1, approximately constant throughout the decade; GATM was 3.6 ± 0.2 PgC yr−1, SOCEAN was 2.7 ± 0.5 PgC yr−1, and SLAND was 4.1 ± 0.9 PgC yr−1.
GATM was low in 2011 compared to the 2002–2011 average because of high uptake by the land, probably in response to natural climate variability associated with La Niña conditions in the Pacific Ocean. The global atmospheric CO2 concentration reached 391.31 ± 0.13 ppm at the end of year 2011. We estimate that EFF will have increased by 2.6% (1.9–3.5%) in 2012 based on projections of gross world product and recent changes in the carbon intensity of the economy. All uncertainties are reported as ±1 sigma (68% confidence, assuming Gaussian error distributions, that the real value lies within the given interval), reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. This paper is intended to provide a baseline to keep track of annual carbon budgets in the future.
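The residual land sink described above follows from the budget identity EFF + ELUC = GATM + SOCEAN + SLAND. A quick check against the abstract's 2011 numbers, assuming independent Gaussian errors combined in quadrature (that independence assumption belongs to this sketch, not to the abstract):

```python
import math

# Budget identity: EFF + ELUC = GATM + SOCEAN + SLAND, so the land sink
# is the residual SLAND = EFF + ELUC - GATM - SOCEAN.
# 2011 values (PgC/yr) and 1-sigma uncertainties taken from the abstract.
EFF, ELUC, GATM, SOCEAN = 9.5, 0.9, 3.6, 2.7
U_EFF, U_ELUC, U_GATM, U_SOCEAN = 0.5, 0.5, 0.2, 0.5

SLAND = EFF + ELUC - GATM - SOCEAN
# Independent Gaussian errors add in quadrature.
U_SLAND = math.sqrt(U_EFF**2 + U_ELUC**2 + U_GATM**2 + U_SOCEAN**2)
```

Rounded to one decimal this reproduces the abstract's SLAND of 4.1 ± 0.9 PgC yr−1.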
Abstract:
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the climate policy process, and project future climate change. Present-day analysis requires the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. Here we describe datasets and a methodology developed by the global carbon cycle science community to quantify all major components of the global carbon budget, including their uncertainties. We discuss changes compared to previous estimates, consistency within and among components, and methodology and data limitations. Based on energy statistics, we estimate that the global emissions of CO2 from fossil fuel combustion and cement production were 9.5 ± 0.5 PgC yr−1 in 2011, 3.0 percent above 2010 levels. We project these emissions will increase by 2.6% (1.9–3.5%) in 2012 based on projections of Gross World Product and recent changes in the carbon intensity of the economy. Global net CO2 emissions from Land-Use Change, including deforestation, are more difficult to update annually because of data availability, but combined evidence from land cover change data, fire activity in regions undergoing deforestation and models suggests those net emissions were 0.9 ± 0.5 PgC yr−1 in 2011. The global atmospheric CO2 concentration is measured directly and reached 391.38 ± 0.13 ppm at the end of year 2011, increasing 1.70 ± 0.09 ppm yr−1 or 3.6 ± 0.2 PgC yr−1 in 2011. Estimates from four ocean models suggest that the ocean CO2 sink was 2.6 ± 0.5 PgC yr−1 in 2011, implying a global residual terrestrial CO2 sink of 4.1 ± 0.9 PgC yr−1. 
All uncertainties are reported as ±1 sigma (68% confidence, assuming Gaussian error distributions, that the real value lies within the given interval), reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. This paper is intended to provide a baseline to keep track of annual carbon budgets in the future.
Abstract:
Accurate assessment of anthropogenic carbon dioxide (CO2) emissions and their redistribution among the atmosphere, ocean, and terrestrial biosphere is important to better understand the global carbon cycle, support the development of climate policies, and project future climate change. Here we describe data sets and a methodology to quantify all major components of the global carbon budget, including their uncertainties, based on the combination of a range of data, algorithms, statistics and model estimates and their interpretation by a broad scientific community. We discuss changes compared to previous estimates, consistency within and among components, alongside methodology and data limitations. CO2 emissions from fossil-fuel combustion and cement production (EFF) are based on energy statistics, while emissions from land-use change (ELUC), mainly deforestation, are based on combined evidence from land-cover change data, fire activity associated with deforestation, and models. The global atmospheric CO2 concentration is measured directly and its rate of growth (GATM) is computed from the annual changes in concentration. The mean ocean CO2 sink (SOCEAN) is based on observations from the 1990s, while the annual anomalies and trends are estimated with ocean models. The variability in SOCEAN is evaluated for the first time in this budget with data products based on surveys of ocean CO2 measurements. The global residual terrestrial CO2 sink (SLAND) is estimated by the difference of the other terms of the global carbon budget and compared to results of independent dynamic global vegetation models forced by observed climate, CO2 and land cover change (some including nitrogen–carbon interactions). All uncertainties are reported as ± 1 σ, reflecting the current capacity to characterise the annual estimates of each component of the global carbon budget. 
For the last decade available (2003–2012), EFF was 8.6 ± 0.4 GtC yr−1, ELUC 0.9 ± 0.5 GtC yr−1, GATM 4.3 ± 0.1 GtC yr−1, SOCEAN 2.5 ± 0.5 GtC yr−1, and SLAND 2.8 ± 0.8 GtC yr−1. For year 2012 alone, EFF grew to 9.7 ± 0.5 GtC yr−1, 2.2% above 2011, reflecting a continued growing trend in these emissions, GATM was 5.1 ± 0.2 GtC yr−1, SOCEAN was 2.9 ± 0.5 GtC yr−1, and assuming an ELUC of 1.0 ± 0.5 GtC yr−1 (based on the 2001–2010 average), SLAND was 2.7 ± 0.9 GtC yr−1. GATM was high in 2012 compared to the 2003–2012 average, almost entirely reflecting the high EFF. The global atmospheric CO2 concentration reached 392.52 ± 0.10 ppm averaged over 2012. We estimate that EFF will increase by 2.1% (1.1–3.1%) to 9.9 ± 0.5 GtC in 2013, 61% above emissions in 1990, based on projections of world gross domestic product and recent changes in the carbon intensity of the economy. With this projection, cumulative emissions of CO2 will reach about 535 ± 55 GtC for 1870–2013, about 70% from EFF (390 ± 20 GtC) and 30% from ELUC (145 ± 50 GtC). This paper also documents any changes in the methods and data sets used in this new carbon budget from previous budgets (Le Quéré et al., 2013). All observations presented here can be downloaded from the Carbon Dioxide Information Analysis Center.
Abstract:
The influence of respiratory motion on patient anatomy poses a challenge to accurate radiation therapy, especially in lung cancer treatment. Modern radiation therapy planning uses models of tumor respiratory motion to account for target motion in targeting. The tumor motion model can be verified on a per-treatment-session basis with four-dimensional cone-beam computed tomography (4D-CBCT), which acquires an image set of the dynamic target throughout the respiratory cycle during the therapy session. 4D-CBCT is undersampled if the scan time is too short; however, a short scan time is desirable in clinical practice to reduce patient setup time. This dissertation presents the design and optimization of 4D-CBCT to reduce the impact of undersampling artifacts with short scan times. This work measures the impact of undersampling artifacts on the accuracy of target motion measurement under different sampling conditions and for various object sizes and motions. The results provide a minimum scan time such that the target tracking error is less than a specified tolerance. This work also presents new image reconstruction algorithms for reducing undersampling artifacts in undersampled datasets by taking advantage of the assumption that the relevant motion of interest is contained within a volume-of-interest (VOI). It is shown that the VOI-based reconstruction provides more accurate image intensity than standard reconstruction. The VOI-based reconstruction produced 43% lower least-squares error inside the VOI and 84% lower error throughout the image in a study designed to simulate target motion. The VOI-based reconstruction approach can reduce acquisition time and improve image quality in 4D-CBCT.
Abstract:
Nowadays computing platforms consist of a very large number of components that need to be supplied with different voltage levels and power requirements. Even a very small platform, like a handheld computer, may contain more than twenty different loads and voltage regulators. The power delivery designers of these systems are required to provide, in a very short time, the right power architecture that optimizes performance and meets electrical specifications plus cost and size targets. The appropriate selection of the architecture and converters directly defines the performance of a given solution. Therefore, the designer needs to be able to evaluate a significant number of options in order to know with good certainty whether the selected solutions meet the size, energy efficiency and cost targets. The design difficulties of selecting the right solution arise due to the wide range of power conversion products provided by different manufacturers. These products range from discrete components (to build converters) to complete power conversion modules that employ different manufacturing technologies. Consequently, in most cases it is not possible to analyze all the alternatives (combinations of power architectures and converters) that can be built. The designer has to select a limited number of converters in order to simplify the analysis. In this thesis, in order to overcome the mentioned difficulties, a new design methodology for power supply systems is proposed. This methodology integrates evolutionary computation techniques to make it possible to analyze a large number of possibilities. This exhaustive analysis helps the designer to quickly define a set of feasible solutions and select the best trade-off in performance according to each application. The proposed approach consists of two key steps: one for the automatic generation of architectures and another for the optimized selection of components. This thesis details the implementation of these two steps. The usefulness of the methodology is corroborated by contrasting the results on real problems and on experiments designed to test the limits of the algorithms.
Abstract:
The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, nonfailure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
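The flavor of abstract interpretation can be shown with the classic sign domain, where concrete integers are abstracted to their sign and arithmetic is re-defined over those abstractions. This is a minimal illustration of the general technique only; CiaoPP's actual domains (shape, sharing, size bounds, determinacy, cost) are far richer.

```python
# Abstract domain of signs: NEG (< 0), ZERO, POS (> 0), TOP (unknown).
NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def alpha(n):
    """Abstraction function: map a concrete integer to its sign."""
    return NEG if n < 0 else POS if n > 0 else ZERO

def aadd(a, b):
    """Abstract addition: zero is the identity; mixed signs are unknown."""
    if ZERO in (a, b):
        return b if a == ZERO else a
    return a if a == b and a != TOP else TOP   # e.g. neg + pos could be anything

def amul(a, b):
    """Abstract multiplication: the usual rule of signs."""
    if ZERO in (a, b):
        return ZERO                            # anything times zero is zero
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG
```

An analyzer iterates such abstract operations over the program to a fixpoint, so it can prove, e.g., that a squared value is never negative without running the program.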
Abstract:
This paper describes the basic tools to work with wireless sensors. TinyOS has a component-based architecture which enables rapid innovation and implementation while minimizing code size, as required by the severe memory constraints inherent in sensor networks. TinyOS's component library includes network protocols, distributed services, sensor drivers, and data acquisition tools, all of which can be used as-is or be further refined for a custom application. TinyOS was originally developed as a research project at the University of California, Berkeley, but has since grown to have an international community of developers and users. Some algorithms concerning packet routing are shown. In-car entertainment systems can be based on wireless sensors in order to obtain information from the Internet, but routing protocols must be implemented in order to avoid bottleneck problems. Ant Colony algorithms are really useful in such cases; they can therefore be embedded into the sensors to perform such routing tasks.
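An ant-colony route search of the kind mentioned can be sketched on a toy link-cost graph. The graph, parameter values, and deposit rule below are illustrative, not taken from the paper: ants build loop-free paths biased by pheromone and inverse link cost, pheromone evaporates each round, and good paths get reinforced.

```python
import random

random.seed(2)

# Hypothetical symmetric link-cost graph (node: {neighbor: cost}).
GRAPH = {"A": {"B": 1, "C": 4, "D": 5},
         "B": {"A": 1, "D": 2},
         "C": {"A": 4, "D": 1},
         "D": {"B": 2, "C": 1, "A": 5}}
ALPHA, BETA, RHO, Q = 1.0, 2.0, 0.5, 1.0      # pheromone/heuristic weights, evaporation, deposit
pher = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}

def walk(src, dst):
    """One ant builds a loop-free path, biased by pheromone and 1/cost."""
    path, node = [src], src
    while node != dst:
        options = [v for v in GRAPH[node] if v not in path]
        if not options:
            return None, float("inf")          # dead end: discard this ant
        weights = [pher[(node, v)] ** ALPHA * (1 / GRAPH[node][v]) ** BETA
                   for v in options]
        node = random.choices(options, weights)[0]
        path.append(node)
    return path, sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

best_path, best_cost = None, float("inf")
for _ in range(50):                            # iterations
    tours = [walk("A", "D") for _ in range(10)]  # 10 ants per iteration
    for k in pher:                             # evaporation
        pher[k] *= 1 - RHO
    for path, cost in tours:
        if path is None:
            continue
        for a, b in zip(path, path[1:]):       # deposit on the links used
            pher[(a, b)] += Q / cost
        if cost < best_cost:
            best_path, best_cost = path, cost
```

On this graph the colony converges on the cheapest route A → B → D (cost 3), and the same loop runs in constant memory per node, which is why such heuristics suit resource-constrained sensors.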
Abstract:
This article focuses on the evaluation of a biometric technique based on the performance of an identifying gesture while holding a telephone with an embedded accelerometer in the hand. The acceleration signals obtained when users perform gestures are analyzed following a mathematical method based on global sequence alignment. In this article, eight different scores are proposed and evaluated in order to quantify the differences between gestures, obtaining an optimal EER result of 3.42% when analyzing a random set of 40 users from a database made up of 80 users with real forgery attempts. Moreover, a temporal study of the technique is presented, leading to the need to update the template to adapt to the way users modify how they perform their identifying gesture over time. Six updating schemes have been assessed within a database of 22 users repeating their identifying gesture in 20 sessions over 4 months, concluding that the more often the template is updated, the better and more stable the performance of the technique.
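Global sequence alignment of the kind used above to compare gesture signals can be sketched with the standard Needleman-Wunsch dynamic program. The version below operates on generic symbol sequences (e.g., quantized acceleration samples), and the match/mismatch/gap values are illustrative defaults, not the article's eight proposed scores.

```python
def nw_score(s, t, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment score via dynamic programming."""
    rows, cols = len(s) + 1, len(t) + 1
    D = [[0] * cols for _ in range(rows)]
    for i in range(rows):          # aligning a prefix against nothing costs gaps
        D[i][0] = i * gap
    for j in range(cols):
        D[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            sub = match if s[i - 1] == t[j - 1] else mismatch
            D[i][j] = max(D[i - 1][j - 1] + sub,   # align the two symbols
                          D[i - 1][j] + gap,       # gap in t
                          D[i][j - 1] + gap)       # gap in s
    return D[-1][-1]
```

A higher score means the two sequences align more closely, so a genuine repetition of a gesture scores above a forgery attempt; thresholding such scores is what yields the EER figures reported.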
Abstract:
Evolutionary search algorithms have become an essential asset in the algorithmic toolbox for solving high-dimensional optimization problems across a broad range of bioinformatics applications. Genetic algorithms, the most well-known and representative evolutionary search technique, account for the majority of such applications. Estimation of distribution algorithms (EDAs) offer a novel evolutionary paradigm that constitutes a natural and attractive alternative to genetic algorithms. They make use of a probabilistic model, learnt from the promising solutions, to guide the search process. In this paper, we set out a basic taxonomy of EDA techniques, underlining the nature and complexity of the probabilistic model of each EDA variant. We review a set of innovative works that make use of EDA techniques to solve challenging bioinformatics problems, emphasizing the EDA paradigm's potential for further research in this domain.
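The EDA idea, estimating a probabilistic model from the promising solutions and sampling the next population from it, can be sketched with the simplest variant: a univariate marginal model (UMDA-style), here maximizing a toy OneMax fitness. All parameter values are illustrative.

```python
import random

random.seed(3)

N, POP, KEEP, GENS = 20, 100, 50, 40

def onemax(x):
    return sum(x)                               # toy fitness: count of ones

p = [0.5] * N                                   # univariate model: P(bit_i = 1)
best = [0] * N
for _ in range(GENS):
    # Sample a population from the current probabilistic model.
    pop = [[int(random.random() < p[i]) for i in range(N)] for _ in range(POP)]
    pop.sort(key=onemax, reverse=True)
    best = max(best, pop[0], key=onemax)
    sel = pop[:KEEP]                            # the "promising solutions"
    # Re-estimate the marginals from the selected set, clamped so every
    # bit value stays reachable (avoids premature fixation).
    p = [min(0.95, max(0.05, sum(x[i] for x in sel) / KEEP)) for i in range(N)]
```

Unlike a genetic algorithm, there is no crossover or mutation operator: variation comes entirely from sampling the learnt model, and richer EDA variants simply replace the independent marginals with models that capture dependencies between variables.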
Abstract:
Global analysis of logic programs can be performed effectively by the use of one of several existing efficient algorithms. However, the traditional global analysis scheme, in which all the program code is known in advance and no previous analysis information is available, is unsatisfactory in many situations. Incremental analysis of logic programs has been shown to be feasible and much more efficient in certain contexts than traditional (non-incremental) global analysis. However, incremental analysis poses additional requirements on the fixpoint algorithm used. In this work we identify these requirements, present an important class of strategies meeting the requirements, present sufficient a priori conditions for such strategies, and propose, implement, and evaluate experimentally a novel algorithm for incremental analysis based on these ideas. The experimental results show that the proposed algorithm performs very efficiently in the incremental case while being comparable to (and, in some cases, considerably better than) other state-of-the-art analysis algorithms even for the non-incremental case. We argue that our discussions, results, and experiments also shed light on some of the many tradeoffs involved in the design of algorithms for logic program analysis.
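The incremental scheme can be illustrated with a generic worklist fixpoint: when the program changes, only the part the change can affect is reprocessed rather than the whole program. The reachability analysis below is a deliberately simple stand-in for the paper's much richer abstract-interpretation fixpoint; the graph and names are invented for illustration.

```python
from collections import defaultdict, deque

def propagate(succs, reachable, worklist):
    """Generic worklist fixpoint: push successors until nothing changes."""
    while worklist:
        n = worklist.popleft()
        for m in succs[n]:
            if m not in reachable:
                reachable.add(m)
                worklist.append(m)
    return reachable

# A tiny call/dependency graph (hypothetical).
succs = defaultdict(set)
for u, v in [("entry", "p"), ("p", "q"), ("r", "s")]:
    succs[u].add(v)

# Full analysis from scratch.
reach = propagate(succs, {"entry"}, deque(["entry"]))

# Incremental update: adding the edge q -> r seeds the worklist with only
# the newly affected node, instead of re-analyzing the whole graph.
succs["q"].add("r")
if "q" in reach and "r" not in reach:
    reach.add("r")
    propagate(succs, reach, deque(["r"]))
```

The key property, which the paper's requirements on fixpoint strategies generalize, is that the incremental result must coincide with what a full re-analysis from scratch would compute.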
Abstract:
This paper presents and illustrates with an example a practical approach to the dataflow analysis of programs written in constraint logic programming (CLP) languages using abstract interpretation. It is first argued that, from the framework point of view, it suffices to propose relatively simple extensions of traditional analysis methods which have already been proved useful and practical and for which efficient fixpoint algorithms have been developed. This is shown by proposing a simple but quite general extension of Bruynooghe's traditional framework to the analysis of CLP programs. In this extension constraints are viewed not as "suspended goals" but rather as new information in the store, following the traditional view of CLP. Using this approach, and as an example of its use, a complete, constraint system independent, abstract analysis is presented for approximating definiteness information. The analysis is in fact of quite general applicability. It has been implemented and used in the analysis of CLP(R) and Prolog-III applications. Results from the implementation of this analysis are also presented.