920 results for Ecosystem-level models
Abstract:
Multiple-antenna systems offer significant performance enhancement and will be applied to the next generation of broadband wireless communications. This thesis presents investigations of multiple-antenna systems – multiple-input multiple-output (MIMO) and cooperative communication (CC) – and their performance in more realistic propagation environments than those reported previously. For MIMO systems, the investigations are conducted via theoretical modelling and simulations in a double-scattering environment. The results show that in flat fading channels the system performance varies with the scatterer density, while in frequency-selective fading channels it is affected by the length of the coding block as well as by the scatterer density. In realistic propagation environments, fading correlation also has an impact on CC systems, where the antennas can be further apart than in MIMO systems. A general stochastic model is applied to study the effects of fading correlation on the performance of CC systems. This model reflects the asymmetric nature of the wireless channels in a CC system. The results demonstrate the varied effects of fading correlation under different protocols and channel conditions. The performance of CC systems is further studied at the packet level, using both simulations and an experimental testbed. The results verify various performance trade-offs of the cooperative relaying network (CRN) investigated in different propagation environments, and suggest that a proper selection of the relaying algorithms and other techniques can meet the quality-of-service requirements of different applications.
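A minimal sketch, assuming illustrative antenna counts, scatterer numbers and SNR (none of these values come from the thesis), of how ergodic capacity can be estimated under the standard double-scattering channel model H = H_rs H_st / sqrt(Ns) that the abstract refers to:

```python
# Illustrative sketch (not the thesis code): Monte Carlo estimate of ergodic
# MIMO capacity under a double-scattering channel with Ns effective scatterers.
import numpy as np

def double_scattering_capacity(nt=4, nr=4, ns=8, snr_db=10.0, trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10.0)
    caps = np.empty(trials)
    for i in range(trials):
        # Receiver-to-scatterer and scatterer-to-transmitter Rayleigh matrices.
        h_rs = (rng.standard_normal((nr, ns)) + 1j * rng.standard_normal((nr, ns))) / np.sqrt(2)
        h_st = (rng.standard_normal((ns, nt)) + 1j * rng.standard_normal((ns, nt))) / np.sqrt(2)
        h = h_rs @ h_st / np.sqrt(ns)                      # double-scattering channel
        gram = np.eye(nr) + (snr / nt) * (h @ h.conj().T)
        caps[i] = np.log2(np.linalg.det(gram).real)        # bits/s/Hz for this realization
    return caps.mean()

for ns in (2, 4, 16, 64):
    print(ns, round(double_scattering_capacity(ns=ns), 2))
```

Sweeping the number of scatterers makes the rank-deficiency effect visible: with few effective scatterers the capacity stays well below the rich-scattering (i.i.d. Rayleigh) case, which is the kind of scatterer-density dependence the thesis studies.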
Abstract:
This paper presents a comparative study of three closely related Bayesian models for unsupervised document-level sentiment classification, namely the latent sentiment model (LSM), the joint sentiment-topic (JST) model, and the Reverse-JST model. Extensive experiments have been conducted on two corpora, the movie review dataset and the multi-domain sentiment dataset. It has been found that, while all three models achieve better or comparable performance on these two corpora compared with existing unsupervised sentiment classification approaches, both JST and Reverse-JST are also able to extract sentiment-oriented topics. In addition, Reverse-JST consistently performs worse than JST, suggesting that the JST model is more appropriate for joint sentiment-topic detection.
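For reference, a hedged sketch of the JST generative process as it is commonly presented in the literature (the Dirichlet hyperparameters and the Dir/Mult notation below are standard, not quoted from this abstract); Reverse-JST swaps the order in which the sentiment label and topic are drawn:

```latex
% JST generative process (sketch): S sentiment labels, T topics,
% alpha, beta, gamma are Dirichlet hyperparameters.
\begin{aligned}
&\text{for each sentiment } l \text{ and topic } z: && \varphi_{l,z} \sim \mathrm{Dir}(\beta)\\
&\text{for each document } d: && \pi_d \sim \mathrm{Dir}(\gamma), \quad \theta_{d,l} \sim \mathrm{Dir}(\alpha)\ \ \forall l\\
&\text{for each word position } n \text{ in } d: && l_n \sim \mathrm{Mult}(\pi_d),\quad
 z_n \sim \mathrm{Mult}(\theta_{d,l_n}),\quad w_n \sim \mathrm{Mult}(\varphi_{l_n,z_n})
\end{aligned}
```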
Abstract:
The scenario planning literature is focused on corporate-level interventions. There is a general consensus on the method, but little debate about the stages involved in building and using the scenarios. This article presents a case study of a scenario planning intervention conducted at a business unit of the British division of one of the largest beauty and cosmetic products multinationals. The method adopted in this case study differs in some fundamental ways from the existing models used at corporate level. The research is based on the principles of autoethnography, since its purpose is to present self-critical reflections, enhanced by reflective and reflexive conversations, on a scenario planning method used at business unit level. The critical reflections concern a series of critical incidents that distinguish this method from the existing intuitive-logics scenario planning models used in corporate-level planning. Ultimately, this article contributes to the scenario planning method literature by providing insights into its practice at business unit level. © 2012 Elsevier Ltd.
Abstract:
Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been tied to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes in response to variations in the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to the implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts the focus away from implementation details to the whole view of the system and its runtime changes, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
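A toy sketch, with invented context and component names rather than the paper's notation, of the idea that a dynamic-variability model maps observed environment conditions to configurations from which a reconfiguration can be derived:

```python
# Illustrative sketch only: a tiny variability model mapping environment
# context to an architectural configuration, plus a derived reconfiguration.
VARIANTS = {
    "low_battery": {"video_streaming": False, "compression": "high"},
    "normal":      {"video_streaming": True,  "compression": "low"},
}

def select_configuration(battery_level):
    # Environment monitoring reduced to a single observed variable.
    context = "low_battery" if battery_level < 0.2 else "normal"
    return VARIANTS[context]

def reconfiguration_plan(current, target):
    # Only the settings that actually change need to be reconfigured at runtime.
    return {k: v for k, v in target.items() if current.get(k) != v}

current = select_configuration(battery_level=0.9)
target = select_configuration(battery_level=0.1)
print(reconfiguration_plan(current, target))
```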
Abstract:
Recently, we have developed the hierarchical Generative Topographic Mapping (HGTM), an interactive method for visualization of large high-dimensional real-valued data sets. In this paper, we propose a more general visualization system by extending HGTM in three ways, which allows the user to visualize a wider range of data sets and better support the model development process. 1) We integrate HGTM with noise models from the exponential family of distributions. The basic building block is the Latent Trait Model (LTM). This enables us to visualize data of inherently discrete nature, e.g., collections of documents, in a hierarchical manner. 2) We give the user a choice of initializing the child plots of the current plot in either interactive, or automatic mode. In the interactive mode, the user selects "regions of interest," whereas in the automatic mode, an unsupervised minimum message length (MML)-inspired construction of a mixture of LTMs is employed. The unsupervised construction is particularly useful when high-level plots are covered with dense clusters of highly overlapping data projections, making it difficult to use the interactive mode. Such a situation often arises when visualizing large data sets. 3) We derive general formulas for magnification factors in latent trait models. Magnification factors are a useful tool to improve our understanding of the visualization plots, since they can highlight the boundaries between data clusters. We illustrate our approach on a toy example and evaluate it on three more complex real data sets. © 2005 IEEE.
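For orientation, the general GTM-style form of a magnification factor for a smooth latent-to-data mapping is shown below; the paper itself derives the exact expressions for latent trait models, so this is only the generic shape of the formula:

```latex
% Magnification factor of a smooth mapping y(x) from a 2-D latent space
% to data space (generic form; the LTM-specific derivation is in the paper).
\frac{\mathrm{d}A'}{\mathrm{d}A} = \sqrt{\det\!\left(J^{\top} J\right)},
\qquad J_{ij} = \frac{\partial y_i}{\partial x_j},
\qquad y(\mathbf{x}) = W\,\boldsymbol{\phi}(\mathbf{x})
\;\Rightarrow\; J = W\,\frac{\partial \boldsymbol{\phi}}{\partial \mathbf{x}} .
```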
Abstract:
Models for the conditional joint distribution of the U.S. Dollar/Japanese Yen and Euro/Japanese Yen exchange rates, from November 2001 until June 2007, are evaluated and compared. The conditional dependency is allowed to vary across time, as a function of either historical returns or a combination of past return data and option-implied dependence estimates. Using prices of currency options that are available in the public domain, risk-neutral dependency expectations are extracted through a copula representation of the bivariate risk-neutral density. For this purpose, we employ either the one-parameter "Normal" or a two-parameter "Gumbel Mixture" specification. The latter provides forward-looking information regarding the overall degree of covariation, as well as the level and direction of asymmetric dependence. Specifications that include option-based measures in their information set are found to outperform, in-sample and out-of-sample, models that rely solely on historical returns.
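A minimal sketch, assuming placeholder return series rather than the actual USD/JPY and EUR/JPY data, of how the dependence parameter of the one-parameter "Normal" (Gaussian) copula can be estimated from historical returns via normal scores:

```python
# Illustrative sketch (not the paper's estimation procedure): fit the single
# correlation parameter of a Gaussian copula from two return series.
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_correlation(x, y):
    # Probability-integral transform via empirical CDFs (ranks).
    u = rankdata(x) / (len(x) + 1.0)
    v = rankdata(y) / (len(y) + 1.0)
    # Map the uniforms to normal scores; their correlation is the copula parameter.
    z1, z2 = norm.ppf(u), norm.ppf(v)
    return np.corrcoef(z1, z2)[0, 1]

rng = np.random.default_rng(0)
usdjpy = rng.standard_normal(1500) * 0.006               # placeholder returns
eurjpy = 0.6 * usdjpy + rng.standard_normal(1500) * 0.005
print(round(gaussian_copula_correlation(usdjpy, eurjpy), 3))
```

The option-implied counterpart in the paper extracts this dependence from risk-neutral densities rather than from historical returns; the rank-based fit above only illustrates the historical side of the comparison.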
Abstract:
In recent years, learning word vector representations has attracted much interest in Natural Language Processing. Word representations or embeddings learned using unsupervised methods help address the problem of traditional bag-of-words approaches, which fail to capture contextual semantics. In this paper we go beyond vector representations at the word level and propose a novel framework that learns higher-level feature representations of n-grams, phrases and sentences using a deep neural network built from stacked Convolutional Restricted Boltzmann Machines (CRBMs). These representations have been shown to map syntactically and semantically related n-grams to nearby locations in the hidden feature space. We have also experimented with incorporating these higher-level features into supervised classifier training for two sentiment analysis tasks: subjectivity classification and sentiment classification. Our results demonstrate the success of the proposed framework, with a 4% improvement in accuracy for subjectivity classification and improved results for sentiment classification over models trained without our higher-level features.
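A rough sketch, with assumed filter shapes and random inputs rather than the paper's trained parameters, of the forward pass of one convolutional RBM layer producing n-gram-level hidden features from a matrix of word vectors:

```python
# Illustrative sketch: hidden-unit activation probabilities of a convolutional
# RBM layer, computed per n-gram window over a sentence of word embeddings.
import numpy as np

def crbm_hidden_features(word_vectors, filters, bias):
    # word_vectors: (sentence_len, dim); filters: (n_filters, window, dim)
    n_filters, window, dim = filters.shape
    n_windows = word_vectors.shape[0] - window + 1
    h = np.empty((n_windows, n_filters))
    for t in range(n_windows):
        ngram = word_vectors[t:t + window]   # one n-gram of the sentence
        pre = np.tensordot(filters, ngram, axes=([1, 2], [0, 1])) + bias
        h[t] = 1.0 / (1.0 + np.exp(-pre))    # p(h=1 | visible n-gram)
    return h

rng = np.random.default_rng(0)
sentence = rng.standard_normal((12, 50))     # 12 words, 50-dim embeddings (assumed)
features = crbm_hidden_features(sentence, rng.standard_normal((8, 3, 50)) * 0.1, np.zeros(8))
print(features.shape)                        # (10, 8): one 8-dim feature per trigram
```

Pooling such per-n-gram features over a sentence gives the kind of higher-level representation that can be concatenated with conventional features when training the supervised sentiment classifiers mentioned above.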
Abstract:
ACM Computing Classification System (1998): K.3.1, K.3.2.
Abstract:
2000 Mathematics Subject Classification: 60K15, 60K20, 60G20, 60J75, 60J80, 60J85, 60-08, 90B15.
Abstract:
Systems-of-systems (SoS) are systems that result from the interaction among independent constituent systems which collaborate to offer new functionalities and accomplish global missions. Each constituent system accomplishes its own individual missions and is able to contribute to the achievement of the global missions of the SoS, both being viewed as sets of associated goals. From the perspective of self-aware systems, SoS need to exhibit goal-awareness, i.e., they need to be aware of their own goals and of how their constituent systems contribute to their accomplishment. In this paper, we revisit goal-oriented concepts with the aim of identifying and modeling goals at both the SoS level and the constituent-system level. Moreover, we take advantage of such goal-oriented models to express the relationships among goals at these levels and to define how each constituent system can contribute to the accomplishment of the global goals of an SoS. In addition, we shed light on important issues related to goal modeling in self-aware SoS to be addressed in future research.
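A minimal sketch, using invented goal and system names rather than the paper's metamodel, of how global SoS goals can be linked to the constituent-system goals that contribute to them, together with a simple coverage check:

```python
# Illustrative sketch: SoS-level goals and constituent-system goals linked by
# contribution relationships, with a check that every global goal is covered.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    owner: str                      # "SoS" or a constituent-system name
    contributes_to: list = field(default_factory=list)

flood_alert = Goal("issue flood alerts", "SoS")
sense_level = Goal("sense river level", "SensorNet", contributes_to=[flood_alert])
notify = Goal("notify citizens", "AlertApp", contributes_to=[flood_alert])

def uncovered_goals(global_goals, constituent_goals):
    covered = [g for cg in constituent_goals for g in cg.contributes_to]
    return [g.name for g in global_goals if g not in covered]

print(uncovered_goals([flood_alert], [sense_level, notify]))   # [] -> all global goals covered
```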
Abstract:
Software architecture plays an essential role in the high-level description of a system design, where the structure and communication are emphasized. Despite its importance in the software engineering process, the lack of formal description and automated verification hinders the development of good software architecture models. In this paper, we present an approach to support the rigorous design and verification of software architecture models using semantic web technology. We view software architecture models as ontology representations, where their structures and communication constraints are captured by the Web Ontology Language (OWL) and the Semantic Web Rule Language (SWRL). Specific configurations of the design are represented as concrete instances of the ontology, to which their structures and dynamic behaviors must conform. Furthermore, ontology reasoning tools can be applied to perform various automated verification tasks on the design to ensure correctness, such as consistency checking, style recognition, and behavioral inference.
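A small sketch of the underlying idea using rdflib and SPARQL rather than the paper's OWL/SWRL encoding: an architectural configuration is represented as instances, and a query flags attachments that violate a structural constraint. The namespace and element names are illustrative assumptions:

```python
# Illustrative sketch only: architecture elements as RDF instances and a
# consistency check that every connector attaches to a declared Component.
from rdflib import Graph, Namespace, RDF

ARCH = Namespace("http://example.org/arch#")
g = Graph()
g.bind("arch", ARCH)

g.add((ARCH.filter1, RDF.type, ARCH.Component))
g.add((ARCH.filter2, RDF.type, ARCH.Component))
g.add((ARCH.pipe1, RDF.type, ARCH.Connector))
g.add((ARCH.pipe1, ARCH.attachedTo, ARCH.filter1))
g.add((ARCH.pipe1, ARCH.attachedTo, ARCH.filter3))   # filter3 was never declared

query = """
SELECT ?conn ?target WHERE {
    ?conn a arch:Connector ; arch:attachedTo ?target .
    FILTER NOT EXISTS { ?target a arch:Component . }
}
"""
for conn, target in g.query(query, initNs={"arch": ARCH}):
    print(f"Dangling attachment: {conn} -> {target}")
```

An OWL reasoner over the full ontology would detect such violations as inconsistencies; the explicit query here only mimics that style of check in plain Python.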
Abstract:
This study examines the effect of blood absorption on the endogenous fluorescence signal intensity of biological tissues. Experimental studies were conducted to identify these effects. To register the fluorescence intensity, the fluorescence spectroscopy method was employed. The intensity of the blood flow was measured by laser Doppler flowmetry. We propose one possible implementation of the Monte Carlo method for the theoretical analysis of the effect of blood on the fluorescence signals. The simulation is based on a four-layer skin optical model constructed from the known optical parameters of the skin at different levels of blood supply. With the help of the simulation, we demonstrate how the level of blood supply can affect the appearance of the fluorescence spectra. In addition, to describe the properties of biological tissue that may affect the fluorescence spectra, we turned to the method of diffuse reflectance spectroscopy (DRS). Using the spectral data provided by DRS, the tissue attenuation effect can be extracted and used to correct the fluorescence spectra.
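A heavily simplified sketch, with assumed layer thicknesses and optical coefficients rather than the study's four-layer parameters, of a weighted Monte Carlo photon walk showing how increasing dermal absorption (i.e. a higher blood supply) reduces the light returning to the surface:

```python
# Illustrative sketch only: weighted photon random walk in a layered medium.
import numpy as np

LAYERS = [  # (thickness_cm, mu_a_per_cm, mu_s_per_cm), illustrative values
    (0.01, 1.0, 100.0),   # epidermis
    (0.20, 2.0, 80.0),    # blood-perfused dermis (mu_a grows with blood volume)
    (0.30, 1.0, 60.0),    # lower dermis
    (1.00, 0.5, 50.0),    # subcutaneous tissue
]

def escaping_fraction(mu_a_dermis, n_photons=20000, seed=0):
    rng = np.random.default_rng(seed)
    layers = list(LAYERS)
    layers[1] = (layers[1][0], mu_a_dermis, layers[1][2])
    bounds = np.cumsum([t for t, _, _ in layers])
    escaped = 0.0
    for _ in range(n_photons):
        z, mu_z, w = 0.0, 1.0, 1.0            # depth, direction cosine, weight
        while w > 1e-3:
            idx = int(np.searchsorted(bounds, z))
            if idx >= len(layers):
                break                          # transmitted through the slab
            _, mu_a, mu_s = layers[idx]
            mu_t = mu_a + mu_s
            z += mu_z * rng.exponential(1.0 / mu_t)
            if z <= 0.0:
                escaped += w                   # re-emitted at the surface
                break
            w *= mu_s / mu_t                   # absorb a fraction of the weight
            mu_z = rng.uniform(-1.0, 1.0)      # isotropic scattering (simplified)
    return escaped / n_photons

for mu_a in (1.0, 5.0, 10.0):                  # rising blood supply in the dermis
    print(mu_a, round(escaping_fraction(mu_a), 3))
```

The decreasing escape fraction with rising dermal absorption illustrates the mechanism by which blood supply distorts measured fluorescence spectra; the study's model additionally tracks wavelength dependence and realistic skin parameters.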
Abstract:
The objective of this paper is to examine the relationship between supply chain position and the level of servitization in European manufacturing companies. The analysis shows that the globalization and internationalization of production have a dramatic impact on both phenomena. In line with these globalization trends, different business models have become dominant in the less developed Eastern European and the more developed Western European countries, characterized by different supply chain positions and servitization levels. Other business models can, of course, also prove viable in the two regions. By including facility location motivations and business performance indicators in the analysis, the article sheds light on why these business models emerged and how sustainable they are likely to be.
Abstract:
Climate change has serious effects on the establishment and operation of natural ecosystems. A small increase in temperature could cause a rise in the abundance of some species or the potential disappearance of others. In our research, the dispersion of species and the biomass production of a theoretical ecosystem were examined under the effect of temperature change. The responses of ecosystems to climate change can be described by means of global climate modelling and dynamic vegetation models. Examining the operation of ecosystems is normally possible only on supercomputers in large computing centres, because of the number and complexity of the calculations. By considering only temperature and reproduction when modelling a theoretical ecosystem, the number of calculations can be reduced to the level of a PC, and several important theoretical questions can be answered.
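A toy sketch, with assumed temperature optima and growth parameters rather than the authors' model, of how temperature-dependent reproduction can shift biomass between species under a warming trend:

```python
# Illustrative sketch only: two species whose reproduction depends on how close
# the ambient temperature is to their optimum; a warming trend shifts biomass
# from the cool-adapted to the warm-adapted species.
import numpy as np

OPTIMA = {"cool_adapted": 12.0, "warm_adapted": 18.0}   # degrees Celsius (assumed)

def reproduction_rate(temperature, optimum, tolerance=4.0):
    # Gaussian response: maximal at the optimum, falling off with distance.
    return np.exp(-((temperature - optimum) / tolerance) ** 2)

def simulate(years=50, warming_per_year=0.1, start_temp=13.0):
    biomass = {name: 1.0 for name in OPTIMA}
    for year in range(years):
        temp = start_temp + warming_per_year * year
        for name, optimum in OPTIMA.items():
            growth = reproduction_rate(temp, optimum)
            biomass[name] *= 0.8 + 0.4 * growth          # net decay vs. growth
    return biomass

print(simulate())   # the warm-adapted species ends with the higher biomass
```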
Abstract:
In recent years there has been a growing debate within the European Union about how electricity that is affordable for industrial and household consumers contributes to the general competitiveness of European economies. While the European institutions argue for the further liberalization of the energy retail sector, others believe that centralization and price control are the way to achieve lower energy prices. This paper reviews the regulatory models of the European countries and examines the connection between the regulatory regime and consumer price trends. The analysis helps to answer whether centrally regulated pricing or the liberalized market model is more successful in supporting competitiveness goals. Although current regulatory practice is heterogeneous across the EU member states, there is a clear trend towards decreasing the role of regulated tariffs in end-user prices. Our study did not find a general causal relationship between the regulatory regime and the level of consumer electricity prices in a given country. However, a quantitative analysis of industrial and household energy prices by segment detected significant differences between the regulated and free-market countries. The former tend to decrease prices in the low-consumption household segments through cross-financing techniques, including increased network tariffs and/or taxes for the high-consumption segments and for industrial consumers. One of the major challenges for regulatory authorities is to find a way of sharing these burdens proportionally while minimizing the market-distorting effects of cross-subsidization between the different stakeholder groups.