18 results for Complexity of Distribution

at Universidad Politécnica de Madrid


Relevance:

100.00%

Publisher:

Abstract:

Evolutionary search algorithms have become an essential asset in the algorithmic toolbox for solving high-dimensional optimization problems across a broad range of bioinformatics applications. Genetic algorithms, the most well-known and representative evolutionary search technique, have been used in the majority of such applications. Estimation of distribution algorithms (EDAs) offer a novel evolutionary paradigm that constitutes a natural and attractive alternative to genetic algorithms. They make use of a probabilistic model, learnt from the promising solutions, to guide the search process. In this paper, we set out a basic taxonomy of EDA techniques, underlining the nature and complexity of the probabilistic model of each EDA variant. We review a set of innovative works that make use of EDA techniques to solve challenging bioinformatics problems, emphasizing the EDA paradigm's potential for further research in this domain.
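
As a rough illustration of the paradigm described above, the following Python sketch implements a minimal univariate EDA (UMDA-style) for binary problems; the fitness function, population size, selection ratio and probability clipping are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def umda(fitness, n_vars, pop_size=100, n_select=50, generations=100, seed=0):
        # Minimal univariate EDA: learn per-variable marginals from the most
        # promising solutions and sample the next population from that model.
        rng = np.random.default_rng(seed)
        probs = np.full(n_vars, 0.5)                     # initial model: uniform marginals
        best, best_fit = None, float("-inf")
        for _ in range(generations):
            pop = (rng.random((pop_size, n_vars)) < probs).astype(int)
            fits = np.array([fitness(ind) for ind in pop])
            if fits.max() > best_fit:
                best_fit, best = fits.max(), pop[fits.argmax()].copy()
            elite = pop[np.argsort(fits)[-n_select:]]    # promising solutions
            probs = elite.mean(axis=0).clip(0.05, 0.95)  # re-estimate the probabilistic model
        return best, best_fit

    # Usage: maximise the number of ones (OneMax) over 30 binary variables.
    solution, value = umda(lambda x: x.sum(), n_vars=30)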

Relevance:

100.00%

Publisher:

Abstract:

Pinus uncinata forms forests in the centre and southwest of the Alps and in the subalpine Pyrenees (at around 1700–2600 m) (Costa Tenorio et al., 1997). The species reaches the southwestern limit of its distribution at the top of Mount Castillo de Vinuesa (Soria, Spain). The small population on this mountain occupies just 66 ha, but is very important from a geobotanical viewpoint since it is one of only two populations (the other being in the Sierra de Gúdar range in Teruel, Spain) isolated from the main area where the species is found in the Iberian Peninsula (the Pyrenees).

Relevance:

100.00%

Publisher:

Abstract:

The advent of new signal processing methods, such as non-linear analysis techniques, represents a new perspective which adds further value to the analysis of brain signals. In particular, Lempel–Ziv complexity (LZC) has proven useful in exploring the complexity of brain electromagnetic activity. However, an important problem is the lack of knowledge about the physiological determinants of these measures. Although a correlation between complexity and connectivity has been proposed, this hypothesis had never been tested in vivo. Thus, the correlation between the microstructure of the anatomical connectivity and the functional complexity of the brain needs to be examined. In this study we analyzed the correlation between LZC and fractional anisotropy (FA), a scalar quantity derived from diffusion tensors that is particularly useful as an estimate of the functional integrity of myelinated axonal fibers, in a group of sixteen healthy adults (all female, mean age 65.56 ± 6.06 years, range 58–82). Our results showed a positive correlation between FA and LZC scores in regions including clusters in the splenium of the corpus callosum, cingulum, parahippocampal regions and the sagittal stratum. This study supports the notion of a positive correlation between the functional complexity of the brain and the microstructure of its anatomical connectivity. Our investigation proved that a combination of neuroanatomical and neurophysiological techniques may shed light on the underlying physiological determinants of brain oscillations.
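
For readers unfamiliar with the measure, a minimal Python sketch of a Lempel–Ziv-style complexity estimate on a binarised signal is given below; the median-threshold binarisation, the greedy dictionary parse and the normalisation are common simplifications assumed here, not necessarily the exact procedure used in the study.

    import numpy as np

    def lz_complexity(signal):
        # Simplified Lempel-Ziv complexity: binarise the signal around its median,
        # count the new "words" found in a greedy left-to-right parse, and
        # normalise by n / log2(n) so recordings of different length are comparable.
        median = np.median(signal)
        s = ''.join('1' if x > median else '0' for x in signal)
        n = len(s)
        words, word = set(), ''
        for ch in s:
            word += ch
            if word not in words:        # a previously unseen subsequence
                words.add(word)
                word = ''
        return len(words) * np.log2(n) / n

    # Usage: random noise scores higher than a regular oscillation.
    t = np.linspace(0, 10, 1000)
    print(lz_complexity(np.random.randn(1000)))    # relatively high complexity
    print(lz_complexity(np.sin(2 * np.pi * t)))    # relatively low complexity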

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a new multi-objective estimation of distribution algorithm (EDA) based on joint modeling of objectives and variables. This EDA uses the multi-dimensional Bayesian network as its probabilistic model. In this way it can capture the dependencies between objectives and between variables and objectives, as well as the dependencies between variables learnt in other Bayesian network-based EDAs. This model leads to a problem decomposition that helps the proposed algorithm to find better trade-off solutions to the multi-objective problem. In addition to Pareto set approximation, the algorithm is also able to estimate the structure of the multi-objective problem. To apply it to many-objective problems, the algorithm includes four different ranking methods proposed in the literature for this purpose. The algorithm is applied to the set of walking fish group (WFG) problems, and its optimization performance is compared with an evolutionary algorithm and another multi-objective EDA. The experimental results show that the proposed algorithm performs significantly better on many of the problems and for different objective space dimensions, and achieves results comparable with the other algorithms on some of them.
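
The paper's actual model is a multi-dimensional Bayesian network combined with several many-objective ranking methods; the sketch below replaces both with a plain multivariate Gaussian and a naive aspiration point, purely to illustrate the idea of jointly modeling variables and objectives and then sampling new solutions guided by desirable objective values. All names and parameters are illustrative assumptions.

    import numpy as np

    def mo_eda_step(X, F, rng, n_sample=100):
        # One generation of a simplified multi-objective EDA: fit a joint Gaussian
        # over decision variables (X) and objective values (F), then sample new
        # solutions from the conditional distribution of x given good objectives.
        n_x = X.shape[1]
        Z = np.hstack([X, F])                          # joint data: variables + objectives
        mu = Z.mean(axis=0)
        S = np.cov(Z, rowvar=False) + 1e-6 * np.eye(Z.shape[1])
        mu_x, mu_f = mu[:n_x], mu[n_x:]
        S_xx, S_xf = S[:n_x, :n_x], S[:n_x, n_x:]
        S_fx, S_ff = S[n_x:, :n_x], S[n_x:, n_x:]
        f_target = F.min(axis=0)                       # best observed value per objective
        K = S_xf @ np.linalg.inv(S_ff)
        cond_mu = mu_x + K @ (f_target - mu_f)         # mean of x given f = f_target
        cond_S = S_xx - K @ S_fx
        return rng.multivariate_normal(cond_mu, cond_S, size=n_sample)

    # Usage (illustrative): X holds the selected solutions, F their objective values.
    rng = np.random.default_rng(0)
    X_new = mo_eda_step(rng.random((50, 10)), rng.random((50, 2)), rng)

In the full algorithm, the population passed to the modeling step would first be selected with one of the Pareto or many-objective ranking methods compared in the paper.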

Relevance:

100.00%

Publisher:

Abstract:

The elemental composition, patterns of distribution and possible sources of street dust are not common to all urban environments, but vary according to the peculiarities of each city. The common features and dissimilarities in the origin and nature of street dust were investigated through a series of studies in two widely different cities, Madrid (Spain) and Oslo (Norway), between 1990 and 1994. The most comprehensive sampling campaign was carried out in the Norwegian capital during the summer of 1994. An area of 14 km², covering most of downtown Oslo and some residential districts to the north of the city, was divided into 1 km² mapping units, and 16 sampling increments of approximately 150 g were collected from streets and roads in each of them. The fraction below 100 μm was acid-digested and analysed by ICP-MS. Statistical analyses of the results suggest that chemical elements in street dust can be classified into three groups: “urban” elements (Ba, Cd, Co, Cu, Mg, Pb, Sb, Ti, Zn), “natural” elements (Al, Ga, La, Mn, Na, Sr, Th, Y) and elements of a mixed origin or which have undergone geochemical changes from their original sources (Ca, Cs, Fe, Mo, Ni, Rb, Sr, U). Soil resuspension and/or mobilisation appears to be the most important source of “natural” elements, while “urban” elements originate primarily from traffic and from the weathering and corrosion of building materials. The data for Pb seem to prove that the gradual shift from leaded to unleaded petrol as fuel for automobiles has resulted in an almost proportional reduction in the concentration of Pb in dust particles under 100 μm. This fact and the spatial distribution of Pb in the city strongly suggest that lead sources other than traffic (i.e. lead accumulated in urban soil over the years) may contribute as much lead, if not more, to urban street dust.

Relevance:

100.00%

Publisher:

Abstract:

Over the last decade, Grid computing paved the way for a new level of large scale distributed systems. This infrastructure made it possible to securely and reliably take advantage of widely separated computational resources that are part of several different organizations. Resources can be incorporated into the Grid, building a theoretical virtual supercomputer. In time, cloud computing emerged as a new type of large scale distributed system, inheriting and expanding the expertise and knowledge that had been obtained so far. Some of the main characteristics of Grids naturally evolved into clouds, others were modified and adapted, and others were simply discarded or postponed. Regardless of these technical specifics, both Grids and clouds together can be considered as one of the most important advances in large scale distributed computing of the past ten years; however, this step in distributed computing has come along with a completely new level of complexity. Grid and cloud management mechanisms play a key role, and correct analysis and understanding of the system behavior are needed. Large scale distributed systems must be able to self-manage, incorporating autonomic features capable of controlling and optimizing all resources and services. Traditional distributed computing management mechanisms analyze each resource separately and adjust specific parameters of each one of them. When trying to adapt the same procedures to Grid and cloud computing, the vast complexity of these systems can make this task extremely complicated. But the complexity of large scale distributed systems could simply be a matter of perspective. It could be possible to understand the Grid or cloud behavior as a single entity, instead of a set of resources. This abstraction could provide a different understanding of the system, describing large scale behavior and global events that probably would not be detected analyzing each resource separately. In this work we define a theoretical framework that combines both ideas, multiple resources and single entity, to develop large scale distributed systems management techniques aimed at system performance optimization, increased dependability and Quality of Service (QoS). The resulting synergy could be the key to addressing the most important difficulties of Grid and cloud management.

Relevance:

100.00%

Publisher:

Abstract:

Several authors have analysed the changes in the probability density function of solar radiation at different time resolutions. Others have studied the significance of these changes when calculations of produced energy are attempted. We have applied different transformations to four Spanish databases in order to clarify the interrelationship between radiation models and produced-energy estimates. Our contribution is straightforward: the complexity of the solar radiation model needed for yearly energy calculations is very low. Twelve monthly mean values of solar radiation are enough to estimate energy with errors below 3%. Time resolutions finer than hourly samples do not significantly improve the energy estimates.
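
The claim that twelve monthly means suffice can be illustrated with a back-of-the-envelope yearly yield calculation; the irradiation values, nominal power and performance ratio below are made-up placeholders, not data from the paper.

    # Illustrative yearly PV energy estimate from 12 monthly mean values of
    # daily global irradiation (kWh/m2/day); all figures are hypothetical.
    monthly_mean_irradiation = [2.1, 3.0, 4.3, 5.4, 6.4, 7.1,
                                7.3, 6.5, 5.1, 3.5, 2.4, 1.9]
    days_in_month = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

    peak_power_kw = 5.0        # assumed nominal PV power
    performance_ratio = 0.8    # assumed overall system losses

    annual_energy_kwh = sum(h * d for h, d in zip(monthly_mean_irradiation,
                                                  days_in_month)) \
                        * peak_power_kw * performance_ratio
    print(f"Estimated annual yield: {annual_energy_kwh:.0f} kWh")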

Relevance:

100.00%

Publisher:

Abstract:

The Semantics Difficulty Model (SDM) is a model that measures the difficulty of introducing semantic technology into a company. SDM manages three descriptions of stages, which we will refer to as "snapshots": a company semantic snapshot, a data snapshot and a semantic application snapshot. Understanding a priori the complexity of introducing semantics into a company is important because it allows the organization to take early decisions, thus saving time and money, mitigating risks and improving innovation, time to market and productivity. SDM works by measuring the distance between each initial snapshot and its reference model (the company semantic snapshot reference model, the data snapshot reference model, and the semantic application snapshot reference model) using Euclidean distances. The difficulty level will be "not at all difficult" when the distance is small, and becomes "extremely difficult" when the distance is large. SDM has been tested experimentally with 2000 simulated companies with different arrangements and several initial stages. The output is expressed by five linguistic values: "not at all difficult", "slightly difficult", "averagely difficult", "very difficult" and "extremely difficult". As the preliminary results of our SDM simulation model indicate, transforming a search application into one that integrates data from different sources with semantics is "slightly difficult", in contrast with data and opinion extraction applications, for which it is "very difficult".
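
A minimal sketch of the distance-to-difficulty mapping described above is given below; only the use of Euclidean distance and the five linguistic values come from the abstract, while the feature vectors, thresholds and function names are invented for illustration.

    import math

    DIFFICULTY_LEVELS = ["not at all difficult", "slightly difficult",
                         "averagely difficult", "very difficult",
                         "extremely difficult"]

    def euclidean(a, b):
        # Euclidean distance between a snapshot and its reference model.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def sdm_difficulty(snapshot, reference, thresholds=(0.5, 1.0, 2.0, 4.0)):
        # Map the distance between a company snapshot and its reference model
        # to one of the five linguistic difficulty values (thresholds assumed).
        d = euclidean(snapshot, reference)
        for level, t in zip(DIFFICULTY_LEVELS, thresholds):
            if d <= t:
                return level
        return DIFFICULTY_LEVELS[-1]

    # Usage: a snapshot close to its reference model is rated easy to introduce.
    print(sdm_difficulty([0.9, 0.8, 0.7], [1.0, 1.0, 1.0]))   # "not at all difficult"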

Relevance:

100.00%

Publisher:

Abstract:

One of the most promising areas in which probabilistic graphical models have shown incipient activity is the field of heuristic optimization and, in particular, Estimation of Distribution Algorithms. Due to their inherent parallelism, different research lines have been explored to improve Estimation of Distribution Algorithms from the point of view of execution time and/or accuracy. Among these proposals, we focus on the so-called distributed or island-based models. This approach defines several islands (algorithm instances) running independently and exchanging information with a given frequency. The information sent by the islands can be either a set of individuals or a probabilistic model. This paper presents a comparative study of a distributed univariate Estimation of Distribution Algorithm and a multivariate version, paying special attention to the comparison of two alternative methods for exchanging information, over a wide set of parameters and problems: the standard benchmark developed for the IEEE Workshop on Evolutionary Algorithms and other Metaheuristics for Continuous Optimization Problems of the ISDA 2009 Conference. Several analyses have been conducted from different points of view to assess both the influence of the parameters and the relationships between them, including a characterization of the configurations according to their behavior on the proposed benchmark.
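
As a sketch of the two information-exchange policies compared in the paper, the following Python fragment shows an island either sending its best individuals to a neighbour or sending (and blending) its univariate probability vector; the ring topology, migration size and blending weight are illustrative assumptions, not the paper's settings.

    import numpy as np

    def migrate(islands, mode="individuals", n_migrants=5, blend=0.5):
        # One migration step over a ring of islands. Each island is a dict with a
        # binary 'population' (a NumPy array sorted best-first by fitness) and a
        # univariate probability vector 'model' (a NumPy array).
        for i, island in enumerate(islands):
            neighbour = islands[(i + 1) % len(islands)]        # ring topology (assumed)
            if mode == "individuals":
                # Send copies of the best individuals, replacing the neighbour's worst.
                neighbour["population"][-n_migrants:] = island["population"][:n_migrants].copy()
            elif mode == "model":
                # Send the probabilistic model and blend it into the neighbour's model.
                neighbour["model"] = blend * neighbour["model"] + (1 - blend) * island["model"]
        return islands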

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE The decision-making process plays a key role in organizations. Every decision-making process produces a final choice that may or may not prompt action. Decision makers recurrently face the dichotomous question of whether to follow a traditional sequential decision-making process, where the output of one decision is used as the input of the next stage, or a joint decision-making approach, where several decisions are taken simultaneously. The implications of the decision-making process will impact different players in the organization. The choice of decision-making approach is difficult to make, even with the current literature and practitioners' knowledge. The pursuit of better ways of making decisions has been a common goal for academics and practitioners. Management scientists use different techniques and approaches to improve different types of decisions, the purpose being to use the available resources (data and techniques) as well as possible to achieve the objectives of the organization. Developing and applying models and concepts may help to solve the managerial problems faced every day in different companies. As a result of this research, different decision models are presented to contribute to the body of knowledge of management science. The first models focus on the manufacturing industry and the second part on the health care industry. Although these models are case specific, they serve to exemplify that different approaches to the problems can provide interesting results. Unfortunately, there is no universal recipe that can be applied to all problems. Furthermore, the same model may deliver good results with certain data and bad results with other data. A framework for analysing the data before selecting the model to be used is presented and tested on the models developed to exemplify these ideas. METHODOLOGY As the first step of the research, a systematic literature review on joint decision making is presented, together with the opinions and suggestions of different scholars. For the next stage of the thesis, the decision-making process of more than 50 companies from different sectors was analysed in the production planning area at the job-shop level. The data were obtained using surveys and face-to-face interviews. The following part of the research into the decision-making process was carried out in two application fields that are highly relevant for our society: manufacturing and health care. The first step was to study the interactions and develop a mathematical model for the replenishment of the car assembly line, in which the vehicle routing problem and the inventory problem were combined. The next step was to add the scheduling of car production (car sequencing) decision and use metaheuristics such as ant colony optimization and genetic algorithms to measure whether the behaviour holds for problems of different sizes. A similar approach is presented for the production of semiconductors and aviation parts, where a hoist has to move from one station to another to process the work and a job schedule has to be produced. However, for this problem simulation was used for experimentation. In parallel, the scheduling of operating rooms was studied: surgeries were allocated to surgeons and the scheduling of operating rooms was analysed. The first part of this research was done in a teaching hospital, and for the second part the interaction of uncertainty was added.
Once the previous problems had been analysed, a general framework to characterize the instances was built. In the final chapter a general conclusion is presented. FINDINGS AND PRACTICAL IMPLICATIONS The first contribution is an update of the decision-making literature review, together with an analysis of the possible savings resulting from a change in the decision process. The results of the survey then reveal a lack of consistency between what managers believe and the actual degree of integration of their decisions. In the next stage of the thesis, a contribution to the body of knowledge of operations research is made with the joint solution of the replenishment, sequencing and inventory problem on the assembly line, together with parallel work on operating room scheduling, where different solution approaches are presented. In addition to the contribution of the solution methods, which use different techniques, the main contribution is the framework proposed to pre-evaluate the problem before deciding on the techniques to solve it. However, there is no straightforward answer as to whether joint or sequential solutions are better. Following the proposed framework, the evaluation of factors such as the flexibility of the answer, the number of actors and the tightness of the data gives important hints as to the most suitable direction to take to tackle the problem. RESEARCH LIMITATIONS AND AVENUES FOR FUTURE RESEARCH In the first part of the work it was very difficult to calculate the possible savings of different projects, since in many papers these quantities are not reported or the impact is based on non-quantifiable benefits. The other issue is the confidentiality of many projects, whose data cannot be presented. For the car assembly line problem, more computational power would allow us to solve bigger instances. For the operating room scheduling problem there was a lack of historical data to perform a parallel analysis in the teaching hospital. In order to keep testing the decision framework, it is necessary to apply it to more case studies so as to generalize the results and make them more evident and less ambiguous. The health care field offers great opportunities since, despite the recent awareness of the need to improve the decision-making process, much room for improvement remains. Another big difference from the automotive industry is that the latest improvements are not spread among all the actors. Therefore, in the future this research will focus more on collaboration between academia and the health care sector.

Relevance:

100.00%

Publisher:

Abstract:

With the increasing complexity of today's consumer requirements, decision makers in the food industry should be able to respond to consumer needs much faster than ever before. Preliminary studies showed that, in order to improve performance and select suitable distribution models, decision makers in food industries should classify the different types of consumers and, based on this classification, prepare different distribution flows. By studying the HORECA distribution channel, this paper suggests that logistics decision makers should investigate the relationship between consumers' characteristics and urban freight distribution strategy in order to respond to their exact needs and, in turn, reduce logistics costs.

Relevance:

100.00%

Publisher:

Abstract:

Among the various factors that contribute towards producing a successful maize crop, the vertical distribution achieved by seeders, evaluated through seed depth placement, is a key determinant, especially under the no-tillage technique. At the same time, given the complexity of natural and agricultural ecosystems in conservation agriculture systems, site-specific management has become an important approach for analysing and managing the variability of soil properties and crop establishment, especially when using geo-referenced information and readily available technology. Thus, the main objective of this Thesis was to evaluate, in maize crops, the spatial variability of seed depth placement and crop establishment under no-tillage conditions using no-till seeders with conventional depth-control systems, compared with different tillage systems and using precision agriculture technologies. Trials were carried out in the Mediterranean region of Alentejo, on private farms, during the sowing seasons of 2010, 2011, 2012 and 2015, in 6 different experimental fields. The experimental work comprised field tests with in loco soil and crop evaluations, fuel consumption evaluations of the operations, and aerial sensing. The results obtained indicate that not only did the tillage system affect seed depth placement, but under no-till conditions seeding depth was also affected by soil moisture content, soil resistance to penetration and the forward speed of the sowing operation. In addition, uneven crop emergence and establishment depended on seed depth placement and could be affected by physical conditions such as soil compaction layers. Comparing the different tillage systems, a significant reduction in fuel consumption was obtained for the no-till technique, although statistically significant differences were observed between different seed depth calibration settings. This Thesis highlights the important role that precision agriculture technologies can play in monitoring and evaluating crops under no-till conditions, as well as the need for better seed depth control procedures by seeder operators or, alternatively, the adoption of seeders with active depth-control mechanisms.

Relevance:

100.00%

Publisher:

Abstract:

The spatial distribution of organic matter, chemicals, nutrients and pollutants has been demonstrated to have a multifractal nature (Kravchenco et al. [1]). This fact supports the possible existence of an emergent heterogeneous structure built up during the evolution of the system. The aim of this note is to provide a consistent explanation of these results via an extremely simple model.
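
The note's own model is not reproduced here, but a standard textbook example of how an extremely simple rule generates a multifractal field is the binomial multiplicative cascade, sketched below purely as an illustration; the number of levels and the weight p are arbitrary choices.

    import numpy as np

    def binomial_cascade(levels=10, p=0.7, seed=0):
        # Generate a 1-D multifractal measure by repeatedly splitting each cell
        # in two and giving a randomly chosen half the fraction p of its mass.
        rng = np.random.default_rng(seed)
        mass = np.array([1.0])
        for _ in range(levels):
            left = np.where(rng.random(mass.size) < 0.5, p, 1 - p)
            mass = np.column_stack([mass * left, mass * (1 - left)]).ravel()
        return mass          # 2**levels cells with a highly heterogeneous mass distribution

    # Usage: the resulting field is strongly intermittent, qualitatively like the
    # observed spatial distribution of nutrients or pollutants.
    m = binomial_cascade()
    print(m.min(), m.max(), m.sum())   # total mass is conserved (approximately 1)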

Relevance:

100.00%

Publisher:

Abstract:

Magnetoencephalography (MEG) allows the real-time recording of neural activity and oscillatory activity in distributed neural networks. We applied a non-linear complexity analysis to resting-state neural activity as measured using whole-head MEG. Recordings were obtained from 20 unmedicated patients with major depressive disorder and 19 matched healthy controls. Subsequently, after 6 months of pharmacological treatment with the antidepressant mirtazapine 30 mg/day, patients received a second MEG scan. A measure of the complexity of neural signals, the Lempel–Ziv Complexity (LZC), was derived from the MEG time series. We found that depressed patients showed higher pre-treatment complexity values compared with controls, and that complexity values decreased after 6 months of effective pharmacological treatment, although this effect was statistically significant only in younger patients. The main treatment effect was to recover the tendency observed in controls of a positive correlation between age and complexity values. Importantly, the reduction of complexity with treatment correlated with the degree of clinical symptom remission. We suggest that LZC, a formal measure of neural activity complexity, is sensitive to the dynamic physiological changes observed in depression and may potentially offer an objective marker of depression and its remission after treatment.