Abstract:
In recent works, large-area hydrogenated amorphous silicon p-i-n structures with low-conductivity doped layers were proposed as single-element image sensors. The working principle of this type of sensor is based on the modulation, by the local illumination conditions, of the photocurrent generated by a light beam scanning the active area of the device. In order to evaluate the sensor capabilities, it is necessary to perform a response time characterization. This work focuses on the transient response of such a sensor and on the influence of the carbon content of the doped layers. To evaluate the response time, a set of devices with different percentages of carbon incorporation in the doped layers is analyzed by measuring the scanner-induced photocurrent under different bias conditions.
Abstract:
This paper addresses the impact of the CO2 opportunity cost on the wholesale electricity price in the context of the Iberian electricity market (MIBEL), namely on the Portuguese system, for the period corresponding to Phase II of the European Union Emission Trading Scheme (EU ETS). In the econometric analysis, a vector error correction model (VECM) is specified to estimate both long-run equilibrium relations and short-run interactions between the electricity price and the fuel (natural gas and coal) and carbon prices. The model is estimated using daily spot market prices, and the four commodity prices are jointly modelled as endogenous variables. Moreover, a set of exogenous variables is incorporated to account for the electricity demand conditions (temperature) and the electricity generation mix (quantity of electricity traded according to the technology used). The outcomes for the Portuguese electricity system suggest that the dynamic pass-through of carbon prices into electricity prices is strongly significant, and the estimated long-run elasticity (equilibrium relation) is in line with studies conducted for other markets.
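For illustration, a minimal sketch of the kind of VECM estimation the abstract describes, using statsmodels; the series names, the synthetic data and the single exogenous regressor are placeholders, not the paper's MIBEL dataset:

```python
# Minimal sketch of a VECM of the kind described above, using statsmodels.
# Series names and synthetic data are illustrative, not the paper's dataset.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(0)
n = 500
common = np.cumsum(rng.normal(size=n))          # shared stochastic trend
prices = pd.DataFrame({
    "electricity": common + rng.normal(scale=0.5, size=n),
    "gas":         0.8 * common + rng.normal(scale=0.5, size=n),
    "coal":        0.5 * common + rng.normal(scale=0.5, size=n),
    "carbon":      0.3 * common + rng.normal(scale=0.5, size=n),
})
temperature = rng.normal(size=(n, 1))           # stand-in for demand conditions

# One cointegrating relation, constant inside it; all four prices endogenous.
model = VECM(prices, exog=temperature, k_ar_diff=2,
             coint_rank=1, deterministic="ci")
res = model.fit()
print(res.beta)   # long-run (equilibrium) relation, incl. carbon elasticity
print(res.alpha)  # short-run adjustment speeds
```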
Abstract:
Due to the growing complexity and dynamism of many embedded application domains (including consumer electronics, robotics, automotive and telecommunications), it is increasingly difficult to react to load variations and adapt the system's performance in a controlled fashion within a useful and bounded time. This is particularly noticeable when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand and tasks may exhibit unrestricted QoS interdependencies. This paper proposes a novel anytime adaptive QoS control policy in which the online search for the best set of QoS levels is combined with each user's personal preferences on their services' adaptation behaviour. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and effectively optimise the rate at which the quality of the current solution improves as the algorithms are given more time to run, with minimal overhead compared with their traditional versions.
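A minimal sketch of the anytime pattern the abstract relies on, assuming an invented utility/load model: produce a feasible QoS assignment quickly, then keep improving it while the time budget lasts, so a valid answer is available whenever the deadline fires:

```python
# Minimal sketch of an anytime QoS-level search: build a cheap feasible
# assignment first, then improve it while time remains. The utility/load
# numbers and service structure are illustrative, not the paper's model.
import time

# services -> list of (load, user_utility) per QoS level, lowest level first
SERVICES = {
    "video":   [(1, 1.0), (2, 2.5), (4, 3.2)],
    "audio":   [(1, 0.8), (2, 1.5)],
    "control": [(1, 2.0), (3, 2.6)],
}
CAPACITY = 6

def load(sol):    return sum(SERVICES[s][lvl][0] for s, lvl in sol.items())
def utility(sol): return sum(SERVICES[s][lvl][1] for s, lvl in sol.items())

def anytime_qos(budget_s):
    best = {s: 0 for s in SERVICES}          # fast initial solution: all lowest
    deadline = time.monotonic() + budget_s
    improved = True
    while improved and time.monotonic() < deadline:
        improved = False
        # Greedy hill-climbing: try raising one service's level at a time.
        for s in SERVICES:
            cand = dict(best)
            if cand[s] + 1 < len(SERVICES[s]):
                cand[s] += 1
                if load(cand) <= CAPACITY and utility(cand) > utility(best):
                    best, improved = cand, True
    return best  # valid whenever the budget expires: the anytime property

print(anytime_qos(0.01))
```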
Abstract:
Final Master's project submitted to obtain the degree of Master in Electronics and Telecommunications Engineering
Abstract:
Sandwich structures with soft cores are widely used in applications where high bending stiffness is required without compromising the global weight of the structure, as well as in situations where good thermal and damping properties are important parameters to observe. As equivalent single-layer approaches are not the most adequate to realistically describe the kinematics, the stress distributions and the dynamic behaviour of this type of sandwich, where shear deformations and the extensibility of the core can be very significant, layerwise models may provide better solutions. Additionally, and in connection with this multilayer approach, selecting different shear deformation theories according to the nature of the material that constitutes the core and the outer skins can predict the sandwich behaviour more accurately. In the present work the authors consider the use of different shear deformation theories to formulate different layerwise models, implemented through kriging-based finite elements. The viscoelastic material behaviour associated with the sandwich core is modelled using the complex approach, and the dynamic problem is solved in the frequency domain. The outer elastic layers considered in this work may also be made from different nanocomposites. The performance of the models developed is illustrated through a set of test cases.
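A minimal sketch of the frequency-domain, complex-stiffness solution strategy mentioned above; the matrices and the loss-factor law are toy values, not the paper's kriging-based layerwise elements:

```python
# Minimal sketch of a complex-modulus, frequency-domain solve: the viscoelastic
# core contributes a complex stiffness K*(w) = K(1 + i*eta(w)) and the
# steady-state response is obtained per frequency. All values are toy examples.
import numpy as np

M = np.diag([1.0, 1.0])                    # toy mass matrix
K_elastic = np.array([[4.0, -2.0],
                      [-2.0, 4.0]])        # skins (elastic part)
K_core = np.array([[1.0, -1.0],
                   [-1.0, 1.0]])           # core (viscoelastic part)
f = np.array([1.0, 0.0])                   # harmonic load amplitude

def eta(w):
    return 0.3 + 0.05 * np.log1p(w)        # assumed frequency-dependent loss

for w in (1.0, 2.0, 5.0):
    K_star = K_elastic + K_core * (1 + 1j * eta(w))   # complex stiffness
    u = np.linalg.solve(K_star - (w ** 2) * M, f)     # dynamic stiffness solve
    print(w, np.abs(u))                               # response amplitudes
```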
Abstract:
Work presented within the scope of the Master's in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering
Abstract:
The scheduling function plays an important role in production systems. Scheduling systems aim to generate a schedule that makes it possible to efficiently manage a set of tasks that need to be executed in the same time period by the same resources. However, dynamic adaptation and optimization are critical needs in scheduling systems, since production organizations have a dynamic nature. In these organizations, disturbances in working conditions and requirements occur regularly and unexpectedly. Some examples of such disturbances are the arrival of a new task, the cancellation of a task, a change in a due date, among others. These dynamic events must be taken into account, since they can affect the schedule in use and render it inefficient. Production environments therefore need an immediate response to these events, using a real-time rescheduling method to minimize the effect of such dynamic events on the production system. Scheduling systems must thus be able to adapt, automatically and intelligently, the schedule the organization is following to unexpected events in real time. This dissertation addresses the problem of incorporating new tasks into an existing schedule. To this end, an optimization approach is proposed, a Constructive Selection Hyper-heuristic for Dynamic Scheduling, to deal with dynamic events that may occur in a production environment, in order to keep the schedule as robust as possible. This approach is inspired by evolutionary computation and hyper-heuristics. From the computational study carried out, it was possible to conclude that the use of the constructive selection hyper-heuristic can be advantageous in solving dynamic adaptation optimization problems.
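A minimal sketch of the constructive-selection idea described above, under an invented cost model: each low-level constructive heuristic proposes an insertion of the new task, and the hyper-heuristic keeps the proposal that degrades the current plan the least:

```python
# Minimal sketch of a selection-constructive hyper-heuristic for inserting a
# newly arrived task into an existing schedule. The low-level heuristics and
# the cost model are illustrative, not the dissertation's actual method.
def cost(schedule):
    # Toy objective: total weighted completion time.
    t, total = 0, 0
    for duration, weight in schedule:
        t += duration
        total += weight * t
    return total

def insert_earliest(schedule, task):
    return [task] + schedule                 # put the new task first

def insert_latest(schedule, task):
    return schedule + [task]                 # append at the end

def insert_best_gap(schedule, task):
    # Try every position, keep the cheapest resulting plan.
    return min((schedule[:i] + [task] + schedule[i:]
                for i in range(len(schedule) + 1)), key=cost)

HEURISTICS = [insert_earliest, insert_latest, insert_best_gap]

def hyper_heuristic_insert(schedule, task):
    # Constructive selection: evaluate each low-level heuristic and keep the
    # insertion that disturbs the current plan's cost the least.
    return min((h(schedule, task) for h in HEURISTICS), key=cost)

schedule = [(3, 1.0), (2, 2.0), (5, 0.5)]    # (duration, weight) jobs
print(hyper_heuristic_insert(schedule, (1, 3.0)))
```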
Abstract:
Most of today’s systems, especially when related to the Web or to multi-agent systems, are not standalone or independent, but are part of a greater ecosystem, where they need to interact with other entities, react to complex changes in the environment, and act both on their own knowledge base and on the external environment itself. Moreover, these systems are clearly not static, but are constantly evolving due to the execution of self-updates or external actions. Whenever actions and updates are possible, the need to ensure properties regarding the outcome of performing such actions emerges. Originally proposed in the context of databases, transactions solve this problem by guaranteeing atomicity, consistency, isolation and durability of a special set of actions. However, current transaction solutions fail to guarantee such properties in dynamic environments, since they cannot combine transaction execution with reactive features, or with the execution of actions over domains that the system does not completely control (thus making rolling back a non-viable proposition). In this thesis, we investigate which transaction properties can be ensured in these dynamic environments, and how. To achieve this goal, we provide logic-based solutions, based on Transaction Logic, to precisely model and execute transactions in such environments, where knowledge bases can be defined by arbitrary logic theories.
Abstract:
This study examines the impact of globalization on cross-country inequality and poverty using a panel data set for 65 developing countries over the period 1970-2008. With separate modelling for poverty and inequality, explicit control for financial intermediation, and comparative analysis for developing countries, the study attempts to provide a deeper understanding of cross-country variations in income inequality and poverty. The major findings of the study are fivefold. First, a non-monotonic relationship between income distribution and the level of economic development holds in all samples of countries. Second, neither openness to trade nor FDI has a favourable effect on income distribution in developing countries. Third, high financial liberalization exerts a negative and significant influence on income distribution in developing countries. Fourth, inflation seems to distort income distribution in all sets of countries. Finally, the government emerges as a major player in impacting income distribution in developing countries.
Abstract:
The international Functional Annotation Of the Mammalian Genomes 4 (FANTOM4) research collaboration set out to better understand the transcriptional network that regulates macrophage differentiation and to uncover novel components of the transcriptome, employing a series of high-throughput experiments. The primary and unique technique is cap analysis of gene expression (CAGE): sequencing mRNA 5'-ends with a second-generation sequencer to quantify promoter activities even in the absence of gene annotation. Additional genome-wide experiments complement the setup, including short RNA sequencing, microarray gene expression profiling of large-scale perturbation experiments, and ChIP-chip for epigenetic marks and transcription factors. All the experiments are performed in a differentiation time course of the THP-1 human leukemic cell line. Furthermore, we performed a large-scale mammalian two-hybrid (M2H) assay between transcription factors and monitored their expression profiles across human and mouse tissues with qRT-PCR to address combinatorial effects of regulation by transcription factors. These interdependent data have been analyzed individually and in combination with each other and are published in related but distinct papers. We provide all data, together with systematic annotation, in an integrated view as a resource for the scientific community (http://fantom.gsc.riken.jp/4/). Additionally, we assembled a rich set of derived analysis results, including published predicted and validated regulatory interactions. Here we introduce the resource and its update after the initial release.
Abstract:
The level of ab initio theory necessary to compute reliable values for the static and dynamic (hyper)polarizabilities of three medium-size π-conjugated organic nonlinear optical (NLO) molecules is investigated. The calculations were made feasible by employing field-induced coordinates in combination with a finite field procedure. It is found that, to obtain reasonable values for the various individual contributions to the (hyper)polarizability, it is necessary to include electron correlation. Based on the results, the convergence of the usual perturbation treatment for vibrational anharmonicity was examined.
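For concreteness, a finite field procedure of the kind referred to above extracts (hyper)polarizabilities as numerical derivatives of the field-dependent energy; the textbook central-difference relations for diagonal components (a generic sketch, not necessarily the exact stencils used in the paper) are:

```latex
% Generic finite field relations: energy expansion in a static field F,
% and central-difference estimates for diagonal tensor components.
E(F) = E(0) - \mu_i F_i - \tfrac{1}{2}\alpha_{ij} F_i F_j
     - \tfrac{1}{6}\beta_{ijk} F_i F_j F_k
     - \tfrac{1}{24}\gamma_{ijkl} F_i F_j F_k F_l - \cdots

\alpha_{ii} \approx -\frac{E(F_i) - 2E(0) + E(-F_i)}{F_i^{2}},
\qquad
\beta_{iii} \approx -\frac{E(2F_i) - 2E(F_i) + 2E(-F_i) - E(-2F_i)}{2F_i^{3}}
```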
Abstract:
In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of non-overlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. Considering additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm here relies on the simple fact that such a highest scoring gene can be stored and updated. This requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Indeed, the definition of valid gene structures is externally defined in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem. In particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
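A minimal sketch of the linear-time chaining idea, ignoring frame compatibility and the Gene Model for brevity; the exons are hypothetical (start, end, score) tuples:

```python
# Minimal sketch of linear-time exon chaining: scan exons by acceptor (start)
# position while folding finished exons, in donor (end) order, into a running
# "best chain fully upstream" score. After the two sorts, each exon is
# touched a constant number of times.
def best_gene(exons):
    by_start = sorted(exons, key=lambda e: e[0])
    by_end = sorted(exons, key=lambda e: e[1])
    best = {}            # exon -> best chain score ending in that exon
    best_closed = 0.0    # best score among chains already fully upstream
    j = 0
    overall = 0.0
    for start, end, score in by_start:
        # Fold in every exon whose end lies strictly before this start.
        while j < len(by_end) and by_end[j][1] < start:
            best_closed = max(best_closed, best[by_end[j]])
            j += 1
        best[(start, end, score)] = best_closed + score
        overall = max(overall, best_closed + score)
    return overall

exons = [(10, 50, 3.0), (60, 90, 2.0), (40, 80, 4.0), (100, 120, 1.5)]
print(best_gene(exons))   # chain 10-50 -> 60-90 -> 100-120 scores 6.5
```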
Abstract:
The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved here by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of a posteriori densities of the unknown parameters and data. Unlike in the above-cited paper, wherein one could evaluate the exact multiuser set posterior density, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers that are based on Sequential Monte Carlo (SMC) methods ("particle filtering"). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
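A minimal sketch of the sequential Monte Carlo machinery the paper resorts to: a generic bootstrap ("SIR") particle filter for a toy scalar model, not the paper's multiuser random-set receiver:

```python
# Minimal bootstrap particle filter: propagate particles through the state
# transition, weight them by the observation likelihood, then resample.
# The AR(1) model and noise levels are toy values, not the paper's.
import numpy as np

rng = np.random.default_rng(1)
N, T = 1000, 50
x_true, ys = 0.0, []
for _ in range(T):                                  # simulate x_t, y_t
    x_true = 0.9 * x_true + rng.normal(scale=0.5)
    ys.append(x_true + rng.normal(scale=0.3))

particles = rng.normal(size=N)                      # initial particle cloud
for y in ys:
    # Propagate through the state transition (prior as proposal).
    particles = 0.9 * particles + rng.normal(scale=0.5, size=N)
    # Weight by the observation likelihood.
    w = np.exp(-0.5 * ((y - particles) / 0.3) ** 2)
    w /= w.sum()
    # Resample to avoid weight degeneracy.
    particles = particles[rng.choice(N, size=N, p=w)]

print("posterior mean:", particles.mean(), "truth:", x_true)
```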
Abstract:
Perceptual maps have been used for decades by market researchers to shed light on the similarity between brands in terms of a set of attributes, to position consumers relative to brands in terms of their preferences, or to study how demographic and psychometric variables relate to consumer choice. Invariably these maps are two-dimensional and static. As we enter the era of electronic publishing, the possibilities for dynamic graphics are opening up. We demonstrate the usefulness of introducing motion into perceptual maps through four examples. The first example shows how a perceptual map can be viewed in three dimensions, and the second one moves between two analyses of the data that were collected according to different protocols. In a third example we move from the best view of the data at the individual level to one which focuses on between-group differences in aggregated data. A final example considers the case when several demographic variables or market segments are available for each respondent, showing an animation with increasingly detailed demographic comparisons. These examples of dynamic maps use several data sets from marketing and social science research.
Abstract:
Plants maintain stem cells in their meristems as a source of new undifferentiated cells throughout their life. Meristems are small groups of cells that provide the microenvironment that allows stem cells to prosper. Homeostasis of a stem cell domain within a growing meristem is achieved by signalling between stem cells and surrounding cells. We have simulated here the origin and maintenance of a defined stem cell domain at the tip of Arabidopsis shoot meristems, based on the assumption that meristems are self-organizing systems. The model comprises two coupled feedback-regulated genetic systems that control stem cell behaviour. Using a minimal set of spatial parameters, the mathematical model makes it possible to predict the generation, shape and size of the stem cell domain, and of the underlying organizing centre. We use the model to explore the parameter space that allows stem cell maintenance, and to simulate the consequences of mutations, gene misexpression and cell ablations.
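A minimal sketch of the kind of coupled feedback system such a model is built from, with invented equations and parameters: an organizer signal A promotes a stem-cell signal S, which in turn represses A, settling into a homeostatic set point:

```python
# Minimal sketch of two coupled feedback loops: S represses production of A,
# while A activates production of S. Equations and parameters are
# illustrative toy values, not the paper's model.
def step(a, s, dt=0.01):
    da = 1.0 / (1.0 + s ** 2) - 0.5 * a      # S represses production of A
    ds = a ** 2 / (1.0 + a ** 2) - 0.3 * s   # A activates production of S
    return a + dt * da, s + dt * ds

a, s = 0.1, 0.0
for _ in range(20000):                       # forward-Euler integration
    a, s = step(a, s)
print("steady state:", a, s)                 # homeostatic set point
```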