756 results for "Grid-based clustering approach"


Relevance:

100.00%

Publisher:

Abstract:

Optimal currency area theory suggests that business cycle comovement is a sufficient condition for monetary union, particularly if there are low levels of labour mobility between potential members of the monetary union. Previous studies of comovement in business cycle variables (mainly authored by Artis and Zhang in the late 1990s) found a core of EU member states that could be grouped together as having similar business cycle comovements, but these studies always used Germany as the benchmark country. In this study, the analysis of Artis and Zhang is extended and updated, correlating against both German and euro-area macroeconomic aggregates and using more recent techniques in cluster analysis, namely model-based clustering.
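The model-based clustering step could be sketched as follows: fit a small Gaussian mixture by EM to each country's business-cycle correlation with the reference aggregate, and read the "core" group off the high-mean component. The country list and correlation values below are purely illustrative, not taken from the study.

```python
import numpy as np

# Hypothetical correlations of each country's cycle with the euro-area
# aggregate (illustrative values only, not from the study).
countries = ["DE", "FR", "NL", "AT", "BE", "UK", "IE", "GR"]
corr = np.array([0.92, 0.88, 0.85, 0.90, 0.87, 0.45, 0.50, 0.40])

def em_gmm_1d(x, k=2, iters=100):
    """Model-based clustering: fit a univariate Gaussian mixture by EM."""
    mu = np.linspace(x.min(), x.max(), k)        # deterministic initialisation
    var = np.full(k, x.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-9
        pi = n / n.sum()
    return r.argmax(axis=1), mu

labels, mu = em_gmm_1d(corr)
# countries assigned to the high-correlation component form the "core" group
core = {c for c, l in zip(countries, labels) if mu[l] == mu.max()}
```

Unlike k-means, the mixture model yields per-component means and variances, which is what makes the grouping "model-based" in the sense used by the study.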

Relevance:

100.00%

Publisher:

Abstract:

This article considers questions of handling unbalanced data. PNN and MLP are used as classification models. The problem of estimating model performance in the case of an unbalanced training set is addressed, and several methods (a clustering approach and a boosting approach) are considered as useful ways of dealing with imbalance in the input data.
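One common clustering approach to imbalance, sketched below under illustrative assumptions (the article's exact method is not specified here), is to undersample the majority class by replacing it with k-means centroids, so that both classes contribute equally to training:

```python
import numpy as np

def kmeans(x, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: returns k centroids of x."""
    rng = np.random.default_rng(seed)
    c = x[rng.choice(len(x), k, replace=False)]      # initial centroids
    for _ in range(iters):
        lab = ((x[:, None, :] - c) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (lab == j).any():
                c[j] = x[lab == j].mean(0)           # move centroid to cluster mean
    return c

def cluster_undersample(x_maj, x_min, seed=0):
    """Shrink the majority class to len(x_min) cluster centroids,
    yielding a balanced training set."""
    cents = kmeans(x_maj, len(x_min), seed=seed)
    x = np.vstack([cents, x_min])
    y = np.array([0] * len(cents) + [1] * len(x_min))
    return x, y

rng = np.random.default_rng(1)
x_maj = rng.normal(0, 1, (200, 2))   # majority class (hypothetical data)
x_min = rng.normal(3, 1, (20, 2))    # minority class
xb, yb = cluster_undersample(x_maj, x_min)
```

Using centroids rather than random subsampling keeps the majority class's spatial structure while removing the 10:1 imbalance.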

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

100.00%

Publisher:

Abstract:

Genetic diversity and population structure were investigated across the core range of Tasmanian devils (Sarcophilus laniarius; Dasyuridae), a wide-ranging marsupial carnivore restricted to the island of Tasmania. Heterozygosity (0.386-0.467) and allelic diversity (2.7-3.3) were low in all subpopulations and allelic size ranges were small and almost continuous, consistent with a founder effect. Island effects and repeated periods of low population density may also have contributed to the low variation. Within continuous habitat, gene flow appears extensive up to 50 km (high assignment rates to source or close neighbour populations; nonsignificant values of pairwise F-ST), in agreement with movement data. At larger scales (150-250 km), gene flow is reduced (significant pairwise F-ST) but there is no evidence for isolation by distance. The most substantial genetic structuring was observed for comparisons spanning unsuitable habitat, implying limited dispersal of devils between the well-connected, eastern populations and a smaller northwestern population. The genetic distinctiveness of the northwestern population was reflected in all analyses: unique alleles; multivariate analyses of gene frequency (multidimensional scaling, minimum spanning tree, nearest neighbour); high self-assignment (95%); two distinct populations for Tasmania were detected in isolation by distance and in Bayesian model-based clustering analyses. Marsupial carnivores appear to have stronger population subdivisions than their placental counterparts.
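The pairwise F-ST statistic underlying these comparisons can be illustrated with a minimal sketch for a single biallelic locus; the allele frequencies below are hypothetical, not from the devil data:

```python
def fst(p1, p2):
    """Wright's F_ST for two subpopulations from allele frequencies at one
    biallelic locus (equal sample sizes assumed for simplicity)."""
    p_bar = (p1 + p2) / 2
    h_t = 2 * p_bar * (1 - p_bar)                        # expected total heterozygosity
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2    # mean within-population
    return (h_t - h_s) / h_t

# nearby populations with extensive gene flow: similar frequencies, low F_ST
near = fst(0.50, 0.55)
# populations separated by unsuitable habitat: divergent frequencies, high F_ST
far = fst(0.20, 0.80)
```

Low pairwise F-ST (as over the 50 km scale reported above) indicates that subpopulations share most of their allelic variation; large, significant values signal restricted dispersal.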

Relevance:

100.00%

Publisher:

Abstract:

A simulation-based modelling approach is used to examine the effects of stratified seed dispersal (representing the distribution of the majority of dispersal around the maternal parent and also rare long-distance dispersal) on the genetic structure of maternally inherited genomes and the colonization rate of expanding plant populations. The model is parameterized to approximate postglacial oak colonization in the UK, but is relevant to plant populations that exhibit stratified seed dispersal. The modelling approach considers the colonization of individual plants over a large area (three 500 km x 10 km rolled transects are used to approximate a 500 km x 300 km area). Our approach shows how the interaction of plant population dynamics with stratified dispersal can result in a spatially patchy haplotype structure. We show that while both colonization speeds and the resulting genetic structure are influenced by the characteristics of the dispersal kernel, they are robust to changes in the periodicity of long-distance events, provided the average number of long-distance dispersal events remains constant. We also consider the effects of additional physical and environmental mechanisms on plant colonization. Results show significant changes in genetic structure when the initial colonization of different haplotypes is staggered over time and when a barrier to colonization is introduced. Environmental influences on survivorship and fecundity affect both the genetic structure and the speed of colonization. The importance of these mechanisms in relation to the postglacial spread and genetic structure of oak in the UK is discussed.
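A stratified dispersal kernel of the kind the model assumes can be sketched as a mixture of a short-distance kernel around the maternal parent and rare long-distance events; the scales and the long-distance fraction below are illustrative only, not the paper's parameterisation:

```python
import numpy as np

def stratified_dispersal(n, p_long=0.01, short_scale=0.05, long_scale=10.0, seed=0):
    """Sample n seed dispersal distances (km) from a stratified kernel:
    most seeds land near the parent (half-normal, short_scale), while a
    small fraction p_long travel far (exponential, long_scale)."""
    rng = np.random.default_rng(seed)
    is_long = rng.random(n) < p_long
    return np.where(is_long,
                    rng.exponential(long_scale, n),
                    np.abs(rng.normal(0.0, short_scale, n)))

d = stratified_dispersal(100_000)
```

The rare long-distance tail is what drives the patchy haplotype structure in the model: isolated founder colonies established far ahead of the main front later expand and meet.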

Relevance:

100.00%

Publisher:

Abstract:

Three important goals in describing software design patterns are: generality, precision, and understandability. To address these goals, this paper presents an integrated approach to specifying patterns using Object-Z and UML. To achieve the generality goal, we adopt a role-based metamodeling approach to define patterns. With this approach, each pattern is defined as a pattern role model. To achieve precision, we formalize role concepts using Object-Z (a role metamodel) and use these concepts to define patterns (pattern role models). To achieve understandability, we represent the role metamodel and pattern role models visually using UML. Our pattern role models provide a precise basis for pattern-based model transformations or refactoring approaches.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a formal but practical approach for defining and using design patterns. Initially we formalize the concepts commonly used in defining design patterns using Object-Z. We also formalize consistency constraints that must be satisfied when a pattern is deployed in a design model. Then we implement the pattern modeling language and its consistency constraints using an existing modeling framework, EMF, and incorporate the implementation as plug-ins to the Eclipse modeling environment. While the language is defined formally in terms of Object-Z definitions, the language is implemented in a practical environment. Using the plug-ins, users can develop precise pattern descriptions without knowing the underlying formalism, and can use the tool to check the validity of the pattern descriptions and pattern usage in design models. In this work, formalism brings precision to the pattern language definition and its implementation brings practicability to our pattern-based modeling approach.
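The kind of consistency constraint checked when a pattern is deployed in a design model can be illustrated with a toy role-binding check. The actual tool operates on EMF models and Object-Z definitions; the Observer roles and method names below are purely illustrative:

```python
# Toy pattern role model: the Observer pattern requires a Subject role
# providing attach/notify and an Observer role providing update.
# Role and method names are illustrative, not from the paper's language.
PATTERN_ROLES = {
    "Subject": {"attach", "notify"},
    "Observer": {"update"},
}

def check_deployment(binding, model):
    """binding: role name -> class name; model: class name -> set of methods.
    Returns the list of violated consistency constraints (empty if valid)."""
    errors = []
    for role, required in PATTERN_ROLES.items():
        cls = binding.get(role)
        if cls is None:
            errors.append(f"unbound role {role}")
            continue
        missing = required - model.get(cls, set())
        errors.extend(f"{cls} lacks {m} (role {role})" for m in sorted(missing))
    return errors

model = {"WeatherData": {"attach", "notify"}, "Display": set()}
errs = check_deployment({"Subject": "WeatherData", "Observer": "Display"}, model)
```

In the paper's setting the same check is expressed once, formally, in Object-Z, and then enforced mechanically by the Eclipse plug-ins, so users never see the formalism.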

Relevance:

100.00%

Publisher:

Abstract:

Very large spatially-referenced datasets, for example those derived from satellite-based sensors which sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over small time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real-time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful when analysing data in less time-critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms are responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly in the case where maximum likelihood based inference is required. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically respectively. Most modern commodity hardware has at least two processor cores, if not more. Other mechanisms for allowing parallel computation, such as Grid-based systems, are also becoming increasingly available. However, currently there seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics.
By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms we show that computational time can be significantly reduced. We demonstrate this with both sparsely sampled data and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic data sets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake data set.
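The basic idea behind likelihood approximations of this family can be sketched by splitting the data into blocks whose log-likelihood terms are computed independently, and could therefore be mapped across cores. The diagonal-covariance case below is a deliberately simplified illustration (blocking is then exact; with spatial correlation it becomes an approximation), not the paper's actual covariance model:

```python
import numpy as np

def block_loglik(y_blocks, variance):
    """Approximate a Gaussian log-likelihood by treating blocks as
    independent (in the spirit of block approximations such as Tresp
    [2000]). Each block term is independent of the others, so the sum
    below could be evaluated in parallel, one block per core."""
    def one_block(y):
        n = len(y)
        return -0.5 * (n * np.log(2 * np.pi * variance) + (y ** 2).sum() / variance)
    return sum(one_block(b) for b in y_blocks)

rng = np.random.default_rng(0)
y = rng.normal(0, 1, 1000)
full = block_loglik([y], 1.0)               # all data in one block
approx = block_loglik(np.split(y, 4), 1.0)  # four independent blocks
```

The cubic cost of the exact likelihood comes from factorising one large covariance matrix; blocking replaces this with several small factorisations, which is where the parallel speed-up originates.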

Relevance:

100.00%

Publisher:

Abstract:

Multi-national manufacturing companies are often faced with very difficult decisions regarding where and how to cost effectively manufacture products in a global setting. Clearly, they must utilize efficient and responsive manufacturing strategies to reach low cost solutions, but they must also consider the impact of manufacturing and transportation solutions upon their ability to support sales. One important sales consideration is determining how much work in process, in-transit stock, and finished goods to have on hand to support sales at a desired service level. This paper addresses this important consideration through a comprehensive scenario-based simulation approach, including sensitivity analysis on key study parameters. Results indicate that the inventory needs vary considerably for different manufacturing and delivery methods in ways that may not be obvious when using common evaluative tools.
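A minimal scenario-based simulation of the service-level question can be sketched as follows; the base-stock policy, Poisson demand, and all parameter values are illustrative assumptions, not the paper's model:

```python
import numpy as np

def service_level(base_stock, lead_time_days, demand_mean, n_days=100_000, seed=0):
    """Monte-Carlo estimate of cycle service level for a base-stock policy:
    demand is met whenever demand over the replenishment lead time does not
    exceed the base-stock level. Parameters are illustrative only."""
    rng = np.random.default_rng(seed)
    lt_demand = rng.poisson(demand_mean * lead_time_days, n_days)
    return (lt_demand <= base_stock).mean()

# same stock level, different transport choices: a longer lead time
# (e.g. sea freight rather than air) sharply degrades service
fast = service_level(base_stock=60, lead_time_days=5, demand_mean=10)
slow = service_level(base_stock=60, lead_time_days=10, demand_mean=10)
```

Running such scenarios over a grid of manufacturing and delivery options, with sensitivity analysis on the key parameters, is the kind of comparison the paper's simulation approach makes possible.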

Relevance:

100.00%

Publisher:

Abstract:

Respiration is a complex activity. If the relationship between all neurological and skeletomuscular interactions was perfectly understood, an accurate dynamic model of the respiratory system could be developed and the interaction between different inputs and outputs could be investigated in a straightforward fashion. Unfortunately, this is not the case and does not appear to be viable at this time. In addition, the provision of appropriate sensor signals for such a model would be a considerably invasive task. Useful quantitative information on respiratory performance can be gained from non-invasive monitoring of chest and abdomen motion. Currently available devices are not well suited to spirometric measurement for ambulatory monitoring. A sensor matrix measurement technique is investigated to identify suitable sensing elements on which to base an upper-body surface measurement device that monitors respiration. This thesis is divided into two main areas of investigation: model-based and geometry-based surface plethysmography. In the first instance, chapter 2 deals with an array of tactile sensors used as a progression of existing, previously investigated volumetric measurement schemes based on models of respiration. Chapter 3 details a non-model-based geometrical approach to surface (and hence volumetric) profile measurement. Later sections of the thesis concentrate upon the development of a functioning prototype sensor array. To broaden the application area, the study has been conducted as it would be for a generically configured sensor array. In experimental form, the system's performance on volume estimation compares favourably with existing systems, and in addition it provides continuous transient measurement of respiratory motion within an acceptable accuracy using approximately 20 sensing elements.
The potential size and complexity of the system make it possible to deploy it as a fully mobile ambulatory monitoring device that may be used outside the laboratory. It provides a means by which to isolate coupled physiological functions, allowing individual contributions to be analysed separately and thus facilitating greater understanding of respiratory physiology and of diagnostic capabilities. The outcome of the study is the basis for a three-dimensional surface contour sensing system that is suitable for respiratory function monitoring and that has the prospect, with future development, of being incorporated into a garment-based clinical tool.

Relevance:

100.00%

Publisher:

Abstract:

This comparative study considers the main causative factors for change in recent years in the teaching of modern languages in England and France, and seeks to contribute, in a general sense, to the understanding of change in comparable institutions. In England by 1975 the teaching of modern languages in the comprehensive schools was seen to be inappropriate to the needs of children across the whole ability range. A combination of the external factor of the Council of Europe initiative in devising a needs-based learning approach for adult learners, and the internal factor of teacher-based initiatives in developing a graded-objectives learning approach for the less able, has reversed this situation to some extent. The study examines and evaluates this reversal and, in addition, assesses teachers' attitudes towards, and understanding of, the changes involved. In France the imposition of 'la réforme Haby' in 1977 and the creation of 'le collège unique' were the main external factors for change. The subsequent failure of the reform and the socialist government's support of decentralisation policies returning the initiative for renewal to schools are examined and evaluated, as are the internal factors for change in language teaching: 'groupes de niveau' and the creation of 'équipes pédagogiques'. In both countries changes in the function of examinations at 15/16-plus are examined. The final chapter compares the changes in the two education systems.

Relevance:

100.00%

Publisher:

Abstract:

Modern business trends such as agile manufacturing and virtual corporations require high levels of flexibility and responsiveness to consumer demand, and require the ability to quickly and efficiently select trading partners. Automated computational techniques for supply chain formation have the potential to provide significant advantages in terms of speed and efficiency over the traditional manual approach to partner selection. Automated supply chain formation is the process of determining the participants within a supply chain and the terms of the exchanges made between these participants. In this thesis we present an automated technique for supply chain formation based upon the min-sum loopy belief propagation algorithm (LBP). LBP is a decentralised and distributed message-passing algorithm which allows participants to share their beliefs about the optimal structure of the supply chain based upon their costs, capabilities and requirements. We propose a novel framework for the application of LBP to the existing state-of-the-art case of the decentralised supply chain formation problem, and extend this framework to allow for application to further novel and established problem cases. Specifically, the contributions made by this thesis are:

• A novel framework to allow for the application of LBP to the decentralised supply chain formation scenario investigated using the current state-of-the-art approach. Our experimental analysis indicates that LBP is able to match or outperform this approach for the vast majority of problem instances tested.

• A new solution goal for supply chain formation in which economically motivated producers aim to maximise their profits by intelligently altering their profit margins. We propose a rational pricing strategy that allows producers to earn significantly greater profits than a comparable LBP-based profit-making approach.

• An LBP-based framework which allows the algorithm to be used to solve supply chain formation problems in which goods are exchanged in multiple units, a first for a fully decentralised technique. As well as multiple-unit exchanges, we also model in this scenario realistic constraints such as factory capacities and input-to-output ratios. LBP continues to be able to match or outperform an extended version of the existing state-of-the-art approach in this scenario.

• Introduction of a dynamic supply chain formation scenario in which participants are able to alter their properties, or to enter or leave the process, at any time. Our results suggest that LBP is able to deal easily with individual occurrences of these alterations and that performance degrades gracefully when they occur in larger numbers.
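The min-sum message-passing idea at the heart of LBP can be illustrated on a deliberately tiny two-tier chain (one producer choice, one assembler choice); the participants and all cost figures below are invented for illustration, and on a tree like this min-sum is exact rather than "loopy":

```python
import numpy as np

# Candidate costs (illustrative): three producers, two assemblers,
# plus a pairwise transport cost for each producer/assembler pairing.
producer_cost = np.array([4.0, 2.0, 5.0])
pair_cost = np.array([[1.0, 3.0],
                      [2.0, 2.0],
                      [0.5, 4.0]])
assembler_cost = np.array([3.0, 1.0])

# Min-sum message from the producer variable to the assembler variable:
# for each assembler choice, the best achievable producer-side cost.
msg = (producer_cost[:, None] + pair_cost).min(axis=0)

# The assembler combines the incoming message with its own costs and
# picks the minimising configuration; the producer choice is read back.
total = msg + assembler_cost
best_assembler = int(total.argmin())
best_producer = int((producer_cost + pair_cost[:, best_assembler]).argmin())
```

Each participant only ever shares cost summaries (messages) rather than its full private cost table, which is what makes the approach decentralised.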

Relevance:

100.00%

Publisher:

Abstract:

In this paper we propose a two-phase control method for DSRC vehicle networks at road intersections, where multiple road safety applications may coexist. We consider two safety applications: an emergency safety application with high priority and routine safety applications with low priority. The control method is designed to provide high availability and low latency for emergency safety applications while leaving as much bandwidth as possible for routine applications. It is expected to be capable of adapting to changing network conditions. In the first phase of the method we use a simulation-based offline approach to find the best configurations of message rate and MAC layer parameters for given numbers of vehicles. In the second phase we use the configurations identified by simulation at the roadside access point (AP) for system operation. A utilisation function is proposed to balance the QoS performance provided to the multiple safety applications. It is demonstrated that the proposed method can greatly improve system performance compared to a fixed control method.
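The second, online phase amounts to a lookup of the offline-simulated configurations at the roadside AP; a minimal sketch (the table entries and parameter names are hypothetical, not taken from the paper) might look like:

```python
# Offline phase output (hypothetical): best message rate and contention
# window found by simulation for each simulated vehicle count.
OFFLINE_CONFIGS = {
    20: {"msg_rate_hz": 10, "cw_min": 15},
    50: {"msg_rate_hz": 5, "cw_min": 31},
    100: {"msg_rate_hz": 2, "cw_min": 63},
}

def pick_config(n_vehicles):
    """Online phase: the AP selects the configuration simulated for the
    nearest vehicle count to the currently observed one."""
    nearest = min(OFFLINE_CONFIGS, key=lambda n: abs(n - n_vehicles))
    return OFFLINE_CONFIGS[nearest]

cfg = pick_config(60)
```

Shifting the expensive search offline is what lets the AP adapt parameters in real time as the observed vehicle count changes.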

Relevance:

100.00%

Publisher:

Abstract:

Mood stabilising drugs such as lithium (LiCl) and valproic acid (VPA) are the first-line agents for treating conditions such as bipolar disorder and epilepsy. However, these drugs have potential developmental effects that are not fully understood. This study explores the use of a simple human neurosphere-based in vitro model to characterise the pharmacological and toxicological effects of LiCl and VPA using gene expression changes linked to phenotypic alterations in cells. Treatment with VPA and LiCl resulted in the differential expression of 331 and 164 genes respectively. In the subset of VPA-targeted genes, 114 were downregulated whilst 217 were upregulated. In the subset of LiCl-targeted genes, 73 were downregulated and 91 were upregulated. Gene ontology (GO) term enrichment analysis was used to highlight the most relevant GO terms associated with a given gene list following toxin exposure. In addition, in order to phenotypically anchor the gene expression data, changes in the heterogeneity of cell subtype populations and in cell cycle phase were monitored using flow cytometry. Whilst LiCl exposure did not significantly alter the proportion of cells expressing markers for stem cells/undifferentiated cells (Oct4, SSEA4), neurons (Neurofilament M), astrocytes (GFAP) or cell cycle phase, the drug caused a 1.4-fold increase in total cell number. In contrast, exposure to VPA resulted in significant upregulation of Oct4, SSEA4, Neurofilament M and GFAP, with significant decreases in both G2/M-phase cells and cell number. This neurosphere model might provide the basis of a human-based cellular approach for the regulatory exploration of the developmental impact of potentially toxic chemicals.
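GO-term enrichment of a differentially expressed gene list is typically scored with a one-sided hypergeometric test. The sketch below uses invented counts (only the 331-gene VPA list size comes from the abstract) purely to illustrate the calculation:

```python
from math import comb

def hypergeom_enrichment_p(n_genome, n_term, n_de, k_overlap):
    """One-sided hypergeometric p-value for GO-term enrichment: the
    probability of observing at least k_overlap term-annotated genes
    among n_de differentially expressed genes, drawn from a genome of
    n_genome genes of which n_term carry the annotation."""
    return sum(comb(n_term, k) * comb(n_genome - n_term, n_de - k)
               for k in range(k_overlap, min(n_term, n_de) + 1)) / comb(n_genome, n_de)

# Hypothetical numbers: a 200-gene GO term, 15 of which appear in the
# 331-gene VPA list, against a 20,000-gene background.
p = hypergeom_enrichment_p(n_genome=20_000, n_term=200, n_de=331, k_overlap=15)
```

Here the expected overlap by chance is only about 3.3 genes, so an observed overlap of 15 yields a very small p-value, the kind of signal GO enrichment tools report.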

Relevance:

100.00%

Publisher:

Abstract:

In the light of the financial crisis and the radically changed conditions in the market place, international leadership development is facing new demands. The Danish-based International Leadership Institute Mannaz has researched the new conditions in collaboration with the Institute of Executive Development in the United States. The research, conducted in 2008 and 2009, combines, in an innovative way, quantitative and qualitative inputs, from both current and future perspectives, from some 111 senior Corporate Executives, Heads of Human Resources and of Learning and Organisational Development in large international corporations headquartered in Europe and the United States, together with the thoughts of some 50 experienced practitioners involved in executive coaching as well as in designing, developing and facilitating leadership development programmes. We also include a section summarising the key findings from recently published research from other leadership development surveys. Conclusions reveal that the crisis has propelled a long-awaited decline of the traditional classroom-based educational approach to leadership development. Instead, effective leadership development is suggested to build on experiential learning approaches rooted in real life and real time, allowing for more immediate impact and providing considerably higher relevance and motivation. Coaching, leaders teaching leaders, stretch assignments, action learning, peer networking, customer insights and selective use of technology are seen as important contributors to the leadership development process going forward.