910 results for Multi-Criteria Optimization
Abstract:
The problem of optimal design of multi-gravity-assist space trajectories with a free number of deep space maneuvers (MGADSM) poses multi-modal cost functions. In the general form of the problem, the number of design variables is solution dependent. To handle global optimization problems where the number of design variables varies from one solution to another, two novel genetic-based techniques are introduced: the hidden genes genetic algorithm (HGGA) and the dynamic-size multiple population genetic algorithm (DSMPGA). In HGGA, a fixed length for the design variables is assigned to all solutions. The independent variables of each solution are divided into effective and ineffective (hidden) genes. Hidden genes are excluded from cost function evaluations, while full-length solutions undergo standard genetic operations. In DSMPGA, sub-populations of fixed-size design spaces are randomly initialized. Standard genetic operations are carried out for a stage of generations. A new population is then created by reproduction from all members based on their relative fitness, so the resulting sub-populations differ in size from their initial sizes. The process repeats, increasing the size of the sub-populations holding the fitter solutions. Both techniques are applied to several MGADSM problems. They are capable of determining the number of swing-bys, the planets to swing by, the launch and arrival dates, and the number of deep space maneuvers as well as their locations, magnitudes, and directions in an optimal sense. The results show that solutions obtained using the developed tools match known solutions for complex case studies. HGGA is also used to obtain the asteroid sequence and the mission structure in the global trajectory optimization competition (GTOC) problem. As an application of GA optimization to Earth orbits, the problem of visiting a set of ground sites within a constrained time frame is solved. The J2 perturbation and zonal coverage are considered in designing repeated Sun-synchronous orbits. Finally, a new set of orbits, the repeated shadow track orbits (RSTO), is introduced; their parameters are optimized such that the shadow of a spacecraft on the Earth revisits the same locations periodically every desired number of days.
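The hidden-genes mechanism can be made concrete with a short sketch. The following Python illustration is ours, not the authors' implementation: every chromosome carries a fixed-length gene vector plus activation flags; only flagged (effective) genes enter the cost function, while crossover operates on the full-length representation so hidden genes are still inherited.

```python
import random

GENE_LENGTH = 10  # assumed fixed upper bound on the number of design variables

def random_solution():
    """A full-length chromosome: gene values plus activation flags."""
    genes = [random.uniform(0.0, 1.0) for _ in range(GENE_LENGTH)]
    flags = [random.random() < 0.5 for _ in range(GENE_LENGTH)]
    return genes, flags

def cost(genes, flags):
    """Hidden genes (flag == False) are excluded from the evaluation."""
    effective = [g for g, f in zip(genes, flags) if f]
    return sum((g - 0.5) ** 2 for g in effective)  # placeholder cost function

def crossover(parent_a, parent_b):
    """One-point crossover on the full-length representation: hidden genes
    are inherited even though they did not affect the parents' fitness."""
    (ga, fa), (gb, fb) = parent_a, parent_b
    cut = random.randrange(1, GENE_LENGTH)
    return ga[:cut] + gb[cut:], fa[:cut] + fb[cut:]
```

Mutating the flags is what lets the search change how many design variables a solution effectively has, which is how a hidden-genes scheme copes with the solution-dependent problem size.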
Abstract:
In this paper we present several contributions to collision detection optimization centered on hardware performance. We focus on the broad phase, which is the first step of the collision detection process, and propose three new ways of parallelizing the well-known Sweep and Prune algorithm. We first developed a multi-core model that takes into account the number of available cores. The multi-core architecture enables us to distribute geometric computations using multi-threading. Critical write sections and thread idling have been minimized by introducing new data structures for each thread. Programming with directives, as in OpenMP, appears to be a good compromise for code portability. We then propose a new GPU-based algorithm, also built on Sweep and Prune, that has been adapted to multi-GPU architectures. Our technique is based on a spatial subdivision method used to distribute computations among GPUs. Results show that a significant speed-up can be obtained by scaling from 1 to 4 GPUs in a large-scale environment.
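For reference, the serial broad phase being parallelized can be sketched in a few lines. The sketch below is illustrative (box tuples and the single-axis restriction are our simplifications); the paper's contribution lies in distributing this sort-and-sweep work across cores and GPUs.

```python
def sweep_and_prune(boxes):
    """boxes: iterable of (box_id, min_x, max_x) bounding intervals."""
    events = sorted(boxes, key=lambda b: b[1])       # sort by interval start
    active, pairs = [], []
    for box_id, lo, hi in events:
        # Prune: drop active intervals that ended before this one starts.
        active = [a for a in active if a[2] >= lo]
        # Sweep: every surviving active interval overlaps the new one on x.
        pairs.extend((a[0], box_id) for a in active)
        active.append((box_id, lo, hi))
    return pairs  # candidate pairs handed to the narrow phase

candidates = sweep_and_prune([("a", 0, 2), ("b", 1, 3), ("c", 4, 5)])
# -> [("a", "b")]; "c" starts only after both others have ended
```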
Abstract:
Previous studies have shown that collective property rights offer higher flexibility than individual property and improve sustainable community-based forest management. Our case study, carried out in the Beni department of Bolivia, does not contradict this assertion, but shows that collective rights have been granted in areas where ecological contexts and market facilities were less favourable to intensive land use. Previous experiences suggest investigating political processes in order to understand the criteria according to which access rights were distributed. Based on remote sensing and on a multi-level land governance framework, our research confirms that land placed under collective rights, compared to individual property, is less affected by deforestation among Andean settlements. However, analysis of the historical process of land distribution in the area shows that the distribution of property rights is the result of a political process based on economic, spatial, and environmental strategies defined by multiple stakeholders. Collective titles were established in the more remote areas and distributed to communities with lower productive potential. Land rights are thus a secondary factor in forest cover change, which results from diverse political compromises based on population distribution, accessibility, environmental perceptions, and expected production or extraction incomes.
Abstract:
Offset printing is a common method to produce large amounts of printed matter. We consider a real-world offset printing process that is used to imprint customer-specific designs on napkin pouches. The printing technology used yields a number of specific constraints. The planning problem consists of allocating designs to printing-plate slots such that the given customer demand for each design is fulfilled, all technological and organizational constraints are met, and the total overproduction and setup costs are minimized. We formulate this planning problem as a mixed-binary linear program, and we develop a multi-pass matching-based savings heuristic. We report computational results for a set of problem instances devised from real-world data.
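The flavor of the planning problem can be shown with a toy single-plate heuristic (a sketch under simplified assumptions; the paper's actual model is a mixed-binary linear program and its heuristic is multi-pass and matching-based): allocate plate slots to designs, derive the implied number of print runs, and measure overproduction.

```python
import math

def allocate_slots(demand, num_slots):
    """Greedily assign plate slots to designs, then derive the run count."""
    slots = {d: 1 for d in demand}                  # every design needs a slot
    for _ in range(num_slots - len(demand)):
        # Give the next slot to the design with the highest per-slot demand.
        neediest = max(demand, key=lambda d: demand[d] / slots[d])
        slots[neediest] += 1
    runs = max(math.ceil(demand[d] / slots[d]) for d in demand)
    overproduction = sum(slots[d] * runs - demand[d] for d in demand)
    return slots, runs, overproduction

# Three designs sharing one six-slot plate (hypothetical demand figures):
slots, runs, over = allocate_slots({"A": 900, "B": 450, "C": 300}, 6)
# -> slots {"A": 3, "B": 2, "C": 1}, 300 runs, 150 pouches overproduced
```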
Abstract:
BACKGROUND: Retinal optical coherence tomography (OCT) permits quantification of retinal layer atrophy relevant to the assessment of neurodegeneration in multiple sclerosis (MS). Measurement artefacts may limit the use of OCT in MS research. OBJECTIVE: An expert task force convened with the aim of providing guidance on the use of validated quality control (QC) criteria for OCT in MS research and clinical trials. METHODS: A prospective multi-centre (n = 13) study. Peripapillary ring scan QC rating of an OCT training set (n = 50) was followed by a test set (n = 50). Inter-rater agreement was calculated using kappa statistics. Results were discussed at a round table after the assessment had taken place. RESULTS: The inter-rater QC agreement was substantial (kappa = 0.7). Disagreement was highest for judging signal strength (kappa = 0.40). Future steps to resolve these issues were discussed. CONCLUSION: Substantial agreement for QC assessment was achieved with the aid of the OSCAR-IB criteria. The task force has developed a website for free online training and QC certification. The criteria may prove useful for future research and trials in MS using OCT as a secondary outcome measure in a multi-centre setting.
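For readers unfamiliar with the agreement statistic, a minimal two-rater Cohen's kappa can be computed as below (a sketch with hypothetical ratings; the study's multi-rater setting would typically use a generalization such as Fleiss' kappa).

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[label] * freq_b[label]
                   for label in set(rater_a) | set(rater_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Two raters judging 10 ring scans as QC pass (1) or fail (0):
kappa = cohens_kappa([1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
                     [1, 1, 0, 1, 1, 1, 0, 0, 1, 1])
```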
Abstract:
Introduction: In professional soccer, talent selection relies on the subjective judgment of scouts and coaches. To date, little is known about coaches' “eye for talent” (Christensen, 2009, p. 379) and the nature of the subjective criteria they use to identify those players with the greatest potential to achieve peak performance in adulthood (Williams & Reilly, 2000). Drawing on a constructivist approach (Kelly, 1991), this study explores coaches' subjective talent criteria. It is assumed that coaches are able to verbalise and specify their talent criteria, and that these are related to their instinct-based talent selection decisions. Methods: Participants and generation of data: Five national youth soccer coaches (mean age = 55.6 years; SD = 5.03) were investigated at three appointments: (1) talent selection decision based on instinct, (2) semi-structured inductive interview to elicit each coach's talent criteria in detail, (3) communicative validation and evaluation of the players by each coach using the repertory grid technique (Fromm, 2004). Data analysis: Interviews were transcribed and summarized with regard to each specified talent criterion. Each talent criterion was categorized using a bottom-up approach (meaning categorization; Kvale, 1996). The repertory grid data were analysed using descriptive statistics and correlation analysis. Results and Discussion: For each coach, six to nine talent criteria were elicited and specified. The subjective talent criteria include aspects of personality, cognitive-perceptual skills, motor abilities, development, technique, social environment, and physical constitution, which shows that the coaches use a multi-dimensional concept of talent. However, more than half of all criteria describe personality characteristics, in particular achievement motivation, volition, and self-confidence. In contrast to Morris (2000), this result shows that coaches have a differentiated view of the personality characteristics required to achieve peak performance. As an indication of criterion validity, moderate to high correlations (.57 ≤ r ≤ .81) are found between the evaluations of the players according to the coaches' talent criteria and their talent selection decisions. The study shows that coaches are able to specify their subjective talent criteria and that those criteria are strongly related to their instinctive selection decisions. References: Christensen, M. K. (2009). "An eye for talent": Talent identification and the "practical sense" of top-level soccer coaches. Sociology of Sport Journal, 26, 365–382. Fromm, M. (2004). Introduction to the Repertory Grid Interview. Münster: Waxmann. Kelly, G. A. (1991). The Psychology of Personal Constructs: Volume One: Theory and Personality. London: Routledge. Kvale, S. (1996). InterViews: An Introduction to Qualitative Research Interviewing. Thousand Oaks: Sage. Morris, T. (2000). Psychological characteristics and talent identification in soccer. Journal of Sports Sciences, 18, 715–726. Williams, A. M., & Reilly, T. (2000). Talent identification and development in soccer. Journal of Sports Sciences, 18, 657–667.
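The reported criterion-validity check boils down to correlating two rating series per coach. A minimal Pearson correlation sketch, with hypothetical player scores, is:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation between two equally long rating series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical grid ratings vs. instinct-based selection scores, 6 players:
r = pearson_r([4.5, 3.0, 5.0, 2.5, 4.0, 3.5], [5, 3, 5, 2, 4, 4])
```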
Abstract:
Recent studies of Schwinger pair production have demonstrated that the asymptotic particle spectrum is extremely sensitive to the applied field profile. We extend the idea of the dynamically assisted Schwinger effect from single pulse profiles to more realistic field configurations to be generated in an all-optical experiment searching for pair creation. We use the quantum kinetic approach to study the particle production and employ a multi-start method, combined with optimal control theory, to determine a set of parameters for which the particle yield in the forward direction in momentum space is maximized. We argue that this strategy can be used to enhance the signal of pair production on a given detector in an experimental setup.
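The multi-start strategy itself is generic and can be sketched independently of the quantum kinetic solver. In the illustration below, `negative_yield` is a hypothetical stand-in for minus the forward-direction particle yield as a function of a few field parameters; each random parameter set seeds a local optimization, and the best local optimum is kept.

```python
import numpy as np
from scipy.optimize import minimize

def negative_yield(params):
    """Hypothetical stand-in for minus the forward-direction pair yield."""
    return -np.exp(-np.sum((params - 0.3) ** 2))

rng = np.random.default_rng(0)
starts = rng.uniform(0.0, 1.0, size=(20, 4))   # 20 random field-parameter sets
results = [minimize(negative_yield, x0, method="Nelder-Mead") for x0 in starts]
best = min(results, key=lambda res: res.fun)   # keep the best local optimum
print(best.x, -best.fun)                       # parameters maximizing the yield
```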
Abstract:
Advancements in cloud computing have enabled the proliferation of distributed applications, which require management and control of multiple services. However, without an efficient mechanism for scaling services in response to changing workload conditions, such as the number of connected users, application performance might suffer, leading to violations of Service Level Agreements (SLAs) and possibly inefficient use of hardware resources. Combining dynamic application requirements with the increased use of virtualised computing resources creates a challenging resource management context for application and cloud-infrastructure owners. In such complex environments, business entities use SLAs as a means of specifying quantitative and qualitative requirements of services. There are several challenges in running distributed enterprise applications in cloud environments, ranging from the instantiation of service VMs in the correct order using an adequate quantity of computing resources, to adapting the number of running services in response to varying external loads, such as the number of users. The application owner is interested in finding the optimum amount of computing and network resources to use to ensure that the performance requirements of all her/his applications are met, and in appropriately scaling the distributed services so that application performance guarantees are maintained even under dynamic workload conditions. Similarly, the infrastructure providers are interested in optimally provisioning the virtual resources onto the available physical infrastructure so that their operational costs are minimized, while maximizing the performance of tenants' applications. Motivated by the complexities associated with the management and scaling of distributed applications, while satisfying multiple objectives (related to both consumers and providers of cloud resources), this thesis proposes a cloud resource management platform able to dynamically provision and coordinate the various lifecycle actions on both virtual and physical cloud resources using semantically enriched SLAs. The system focuses on dynamic sizing (scaling) of virtual infrastructures composed of virtual machine (VM)-bound application services. We describe several algorithms for adapting the number of VMs allocated to the distributed application in response to changing workload conditions, based on SLA-defined performance guarantees. We also present a framework for the dynamic composition of scaling rules for distributed services, which uses benchmark-generated application monitoring traces, and we show how these scaling rules can be combined and included in semantic SLAs for controlling the allocation of services. We further provide a detailed description of the multi-objective infrastructure resource allocation problem and various approaches to solving it. We present a resource management system based on a genetic algorithm, which performs allocation of virtual resources while considering the optimization of multiple criteria. We show that our approach significantly outperforms reactive VM-scaling algorithms as well as heuristic-based VM-allocation approaches.
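A minimal example of the kind of scaling rule such a framework composes might look as follows; the thresholds, names, and guard-band logic are illustrative assumptions, not the thesis' rule syntax.

```python
def decide_vm_count(avg_response_ms, vm_count, sla_ms=200.0,
                    scale_out_at=0.9, scale_in_at=0.5,
                    min_vms=1, max_vms=10):
    """Return the new VM count for one control-loop iteration."""
    load = avg_response_ms / sla_ms          # fraction of the SLA budget used
    if load > scale_out_at and vm_count < max_vms:
        return vm_count + 1                  # scale out before the SLA breaks
    if load < scale_in_at and vm_count > min_vms:
        return vm_count - 1                  # scale in to cut operating cost
    return vm_count                          # inside the guard band: hold
```

The guard band between the two thresholds prevents oscillation: a rule that scales out above 90% and back in below 89% would thrash as the load hovers near the boundary.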
Abstract:
SOMS is a general surrogate-based multistart algorithm, used in combination with any local optimizer to find global optima for computationally expensive functions with multiple local minima. SOMS differs from previous multistart methods in that a surrogate approximation is used by the multistart algorithm to help reduce the number of function evaluations necessary to identify the most promising points from which to start each nonlinear programming local search. SOMS's numerical results are compared with four well-known methods, namely Multi-Level Single Linkage (MLSL), MATLAB's MultiStart, MATLAB's GlobalSearch, and GLOBAL. In addition, we propose a class of wavy test functions that mimic the wavy nature of objective functions arising in many black-box simulations. Extensive comparisons of algorithms on the wavy test functions and on earlier standard global-optimization test functions are done for a total of 19 different test problems. The numerical results indicate that SOMS performs favorably in comparison to alternative methods and does especially well on wavy functions when the number of function evaluations allowed is limited.
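The core idea, surrogate-filtered start points, can be sketched as follows. This is an illustration of the strategy, not the SOMS code: the RBF surrogate, the stand-in objective, and the budget are our assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def expensive(x):                              # stand-in objective function
    return np.sin(5 * x[0]) + (x[0] - 0.6) ** 2 + (x[1] - 0.4) ** 2

rng = np.random.default_rng(1)
sampled = rng.uniform(0, 1, size=(30, 2))      # points already evaluated
values = np.array([expensive(x) for x in sampled])
surrogate = RBFInterpolator(sampled, values)   # cheap approximation

candidates = rng.uniform(0, 1, size=(500, 2))  # ranking these costs nothing
starts = candidates[np.argsort(surrogate(candidates))][:5]
best = min((minimize(expensive, x0, method="Nelder-Mead") for x0 in starts),
           key=lambda res: res.fun)            # local searches from top 5 only
```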
Abstract:
We present a novel surrogate model-based global optimization framework allowing a large number of function evaluations. The method, called SpLEGO, is based on a multi-scale expected improvement (EI) framework relying on both sparse and local Gaussian process (GP) models. First, a bi-objective approach relying on a global sparse GP model is used to determine potential next sampling regions. Local GP models are then constructed within each selected region. The method subsequently employs the standard expected improvement criterion to deal with the exploration-exploitation trade-off within the selected local models, leading to a decision on where to perform the next function evaluation(s). The potential of our approach is demonstrated using the so-called Sparse Pseudo-input GP as the global model. The algorithm is tested on four benchmark problems, whose number of starting points ranges from 10² to 10⁴. Our results show that SpLEGO is effective and capable of solving problems with a large number of starting points, and it even provides significant advantages when compared with state-of-the-art EI algorithms.
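The standard EI criterion mentioned above has a closed form given a GP's predictive mean and standard deviation at a point; a small sketch (minimization convention) is:

```python
import math

def expected_improvement(mean, std, f_best):
    """Closed-form EI at a point with GP prediction (mean, std), minimizing."""
    if std <= 0.0:
        return 0.0
    z = (f_best - mean) / std
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_best - mean) * cdf + std * pdf  # exploitation + exploration terms
```

The first term rewards points predicted to beat the incumbent; the second rewards points where the model is uncertain, which is the exploration-exploitation trade-off the abstract refers to.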
Abstract:
The purpose of this study is to examine the stages of program realization of the interventions that the Bronx Health REACH program initiated at various levels to improve nutrition as a means for reducing racial and ethnic disparities in diabetes. This study was based on secondary analyses of qualitative data collected through the Bronx Health REACH Nutrition Project, a project conducted under the auspices of the Institute on Urban Family Health, with support from the Centers for Disease Control and Prevention (CDC). Local human subjects' review and approval through the Institute on Urban Family Health was required and obtained in order to conduct the Bronx Health REACH Nutrition Project. The study drew from two theoretical models: Glanz and colleagues' nutrition environments model and Shediac-Rizkallah and Bone's sustainability model. The specific study objectives were two-fold: (1) to categorize each nutrition activity to a specific dimension (i.e. consumer, organizational, or community nutrition environment); and (2) to evaluate the stage at which the program has been realized (i.e. development, implementation, or sustainability). A case study approach was applied and a constant comparative method was used to analyze the data. Triangulation of the data was also conducted. Qualitative data from this study revealed the following principal findings: (1) communities of color are disproportionately experiencing numerous individual and environmental factors contributing to the disparities in diabetes; (2) multi-level strategies that target the individual, organizational, and community nutrition environments can appropriately address these contributing factors; (3) the nutrition strategies greatly varied in their ability to appropriately meet criteria for the three program stages; and (4) those nutrition strategies most likely to succeed (a) conveyed consistent and culturally relevant messages, (b) had continued involvement from program staff and partners, (c) were able to adapt over time or setting, (d) had a program champion and a training component, (e) were integrated into partnering organizations, and (f) were perceived to be successful by program staff and partners in their efforts to create individual, organizational, and community/policy change. As a result of the criteria-based assessment and qualitative findings, an ecological framework elaborating on Glanz and colleagues' model was developed. The qualitative findings and the resulting ecological framework developed from this study will help public health professionals and community leaders to develop and implement sustainable multi-level nutrition strategies for addressing racial and ethnic disparities in diabetes.
Abstract:
Detailed analyses of the Lake Van pollen, Ca/K ratio, and stable oxygen isotope records allow the identification of millennial-scale vegetation and environmental changes in eastern Anatolia throughout the last glacial (~75-15 ka BP). The climate within the last glacial was cold and dry, with low arboreal pollen (AP) levels. The driest and coldest period corresponds to Marine Isotope Stage (MIS) 2 (~28-14.5 ka BP), dominated by the highest values of xerophytic steppe vegetation. Our high-resolution multi-proxy record shows rapid expansions and contractions of tree populations that reflect variability in temperature and moisture availability. These rapid vegetation and environmental changes can be linked to the stadial-interstadial pattern of the Dansgaard-Oeschger (DO) events as recorded in the Greenland ice cores. Periods of reduced moisture availability were characterized by enhanced xerophytic species and high terrigenous input from the Lake Van catchment area. Furthermore, comparison with the marine realm reveals that the complex atmosphere-ocean interaction can be explained by the strength and position of the westerlies, which are responsible for the supply of humidity to eastern Anatolia. Influenced by the diverse topography of the Lake Van catchment, the larger DO interstadials (e.g. DO 19, 17-16, 14, 12 and 8) show the highest expansion of temperate species within the last glacial. However, Heinrich events (HE), characterized by the highest concentrations of ice-rafted debris (IRD) in marine sediments, are identified in eastern Anatolia by AP values no lower, and steppe components no more abundant, than during DO stadials. In addition, this work is a first attempt to establish a continuous microscopic charcoal record over the last glacial in the Near East; it documents an initial immediate response to millennial-scale climate and environmental variability and enables us to shed light on the history of fire activity during the last glacial.
Abstract:
A high-resolution multi-proxy record from Lake Van, eastern Anatolia, derived from a lacustrine sequence cored at the 357 m deep Ahlat Ridge (AR), allows a comprehensive view of the paleoclimate and environmental history of the continental Near East during the last interglacial. We combined paleovegetation (pollen), stable oxygen isotope (δ18Obulk), and XRF data from the same sedimentary sequence, showing distinct variations during the period from 135 to 110 ka ago leading into and out of full interglacial conditions. The last interglacial plateau, as defined by the presence of thermophilous steppe-forest communities, lasted ca. 13.5 ka, from ~129.1-115.6 ka BP. The detailed palynological sequence at Lake Van documents a vegetation succession with several climatic phases: (I) the Pistacia zone (ca. 131.2-129.1 ka BP) indicates summer dryness and mild winter conditions during the initial warming, (II) the Quercus-Ulmus zone (ca. 129.1-127.2 ka BP) occurred during warm and humid climate conditions with enhanced evaporation, (III) the Carpinus zone (ca. 127.2-124.1 ka BP) suggests increasingly cooler and wetter conditions, and (IV) the expansion of Pinus at ~124.1 ka BP marks the onset of a colder/drier environment that extended into the interval of global ice growth. Pollen data suggest migration of thermophilous trees from refugial areas at the beginning of the last interglacial. Analogous to the current interglacial, the migration documents a time lag of 2.1 ka between the onset of climatic amelioration and the establishment of an oak steppe-forest. Hence, the major difference between the last interglacial and the current interglacial (Holocene) is the abundance of Pinus as well as the decrease of deciduous broad-leaved trees, indicating higher continentality during the last interglacial. Finally, our results demonstrate intra-interglacial variability in the low mid-latitudes and suggest a close connection with the high-frequency climate variability recorded in Greenland ice cores.
Abstract:
The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, nonfailure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
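As a toy illustration of abstract interpretation itself (not of CiaoPP, which analyses (constraint) logic programs over much richer domains such as shapes, sharing, size bounds, and cost), a sign-domain analysis approximates arithmetic without running the program:

```python
# Abstract values: "neg", "zero", "pos", and "top" (sign unknown).
def abs_add(a, b):
    if a == b and a != "top":
        return a                    # e.g. neg + neg = neg
    if a == "zero":
        return b
    if b == "zero":
        return a
    return "top"                    # mixed/unknown signs: lose precision safely

def abs_mul(a, b):
    if "zero" in (a, b):
        return "zero"
    if "top" in (a, b):
        return "top"
    return "pos" if a == b else "neg"

# The analysis proves x * x is positive whenever the sign of x is known:
assert abs_mul("neg", "neg") == abs_mul("pos", "pos") == "pos"
```

The key property is safety: whenever the abstract result is imprecise ("top"), it still over-approximates every concrete outcome, which is what makes analyses of this kind provably correct.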
Abstract:
The evolution of the water content of a sandy soil during the sprinkler irrigation campaign, in the summer of 2010, of a field of sugar beet located at Valladolid (Spain) is assessed by a capacitive FDR (Frequency Domain Reflectometry) EnviroScan probe. This field is one of the experimental sites of the Spanish research centre for sugar beet development (AIMCRA). The objective of the work focuses on monitoring the soil water content evolution over consecutive irrigations during the second two weeks of July (from the 12th to the 28th). These measurements were used to simulate water movement by means of Hydrus-2D. The water probe logged water content readings (m³/m³) at 10, 20, 40 and 60 cm depth every 30 minutes. The probe was placed between two rows in one of the typical 12 x 15 m sprinkler irrigation layouts. Furthermore, a texture analysis of the soil profile was also conducted. The irrigation frequency on this farm was set by the farmer's own criteria: aiming to minimize electricity pumping costs, he used to irrigate at night and during the weekend, i.e. at a lower irrigation frequency than expected. However, the high evapotranspiration rates and the weekly sugar beet water consumption (up to 50 mm/week) clearly determined the need to irrigate more frequently. Moreover, the farmer used to irrigate for five or six hours, whilst results from the EnviroScan probe showed the soil profile reaching the saturation point after the first three hours. It must be noted that AIMCRA provides its members with an SMS service regarding the weekly sugar beet water requirement; from the use of different meteorological stations and evapotranspiration pans, farmers have an idea of the weekly irrigation needs. Nevertheless, it is the farmer's decision how to irrigate. Thus, in order to minimize water stress and pumping costs, a suitable irrigation time and irrigation frequency were modeled with Hydrus-2D. Results for the period mentioned above showed values of water content ranging from 0.35 and 0.30 m³/m³ for the first 10 and 20 cm of profile depth (two hours after irrigation) down to minima of 0.14 and 0.13 m³/m³ (two hours before irrigation). For the 40 and 60 cm profile depths, water content varies steadily across the dates: the greater the root activity, the greater the water content variation. According to the results from the EnviroScan probe and the modeling in Hydrus-2D, shorter irrigation intervals and irrigation times are suggested.
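As an illustration of how such probe logs can translate into an irrigation decision, the sketch below (with assumed layer thicknesses and an assumed refill threshold, not AIMCRA figures) integrates the volumetric readings over the profile and compares the stored water against a refill point:

```python
LAYER_MM = {10: 150, 20: 150, 40: 200, 60: 200}  # assumed layer thicknesses (mm)

def profile_storage_mm(readings):
    """readings: {sensor depth (cm): volumetric water content (m3/m3)}."""
    return sum(theta * LAYER_MM[depth] for depth, theta in readings.items())

def needs_irrigation(readings, refill_point_mm=120.0):  # assumed threshold
    return profile_storage_mm(readings) < refill_point_mm

# Readings two hours after irrigation (values in the range reported above):
storage = profile_storage_mm({10: 0.35, 20: 0.30, 40: 0.22, 60: 0.20})
# 0.35*150 + 0.30*150 + 0.22*200 + 0.20*200 = 181.5 mm of stored water
```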