852 results for Multi-Equation Income Model
Abstract:
This paper presents a new architecture for MASCEM, a multi-agent electricity market simulator. The architecture is implemented in Prolog, which is integrated with the Java program through the LPA Win-Prolog Intelligence Server (IS), a DLL interface between Win-Prolog and other applications. The paper focuses mainly on MASCEM's ability to provide the means to model and simulate Virtual Power Producers (VPPs). VPPs are represented as coalitions of agents with specific characteristics and goals. VPPs can reinforce the importance of such generation technologies, making them valuable in electricity markets.
Abstract:
The aim of the present study was to test a hypothetical model examining whether dispositional optimism exerts a moderating or a mediating effect between personality traits and quality of life in Portuguese patients with chronic diseases. A sample of 540 patients was recruited from central hospitals in various districts of Portugal. All patients completed self-report questionnaires assessing socio-demographic and clinical variables, personality, dispositional optimism, and quality of life. Structural equation modeling (SEM) was used to analyze the moderating and mediating effects. Results suggest that dispositional optimism plays a mediating rather than a moderating role between personality traits and quality of life, indicating that “the expectation that good things will happen” contributes to better general well-being and better mental functioning.
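A minimal, regression-based sketch of the mediation idea (the product-of-coefficients decomposition), using synthetic standardized variables; this is an illustration only, not the authors' SEM specification or data:

```python
# Regression-based mediation sketch (product of coefficients).  Synthetic,
# standardized stand-ins for a personality trait (X), dispositional optimism (M)
# and quality of life (Y); NOT the authors' SEM model or data.
import numpy as np

rng = np.random.default_rng(0)
n = 540                                   # sample size reported in the abstract
X = rng.normal(size=n)
M = 0.5 * X + rng.normal(scale=0.8, size=n)              # path a
Y = 0.4 * M + 0.1 * X + rng.normal(scale=0.8, size=n)    # paths b and c'

def ols_coefs(y, *regressors):
    """Least-squares coefficients of y on the regressors (intercept dropped)."""
    A = np.column_stack([np.ones_like(y), *regressors])
    coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coefs[1:]

a = ols_coefs(M, X)[0]            # X -> M
b, c_prime = ols_coefs(Y, M, X)   # M -> Y (controlling X) and direct X -> Y
c = ols_coefs(Y, X)[0]            # total effect of X on Y

print(f"total effect c   = {c:.3f}")
print(f"direct effect c' = {c_prime:.3f}")
print(f"indirect a*b     = {a * b:.3f} (mediated share ~ {a * b / c:.2f})")
```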
Abstract:
In a heterogeneous cellular network environment, users' behaviour and network deployment configuration parameters have an impact on the overall Quality of Service. This paper proposes a new and simple model that, on the one hand, explores the impact of users' behaviour on the network by taking mobility, multi-service usage, and traffic generation profiles as inputs, and, on the other, enables evaluation of the impact of the network setup configuration on Joint Radio Resource Management (JRRM), assessing basic JRRM performance indicators such as Vertical Handover (VHO) probabilities, average bit rates, and the number of active users, among others. VHO plays an important role in providing seamless transfer of users' sessions when mobile terminals cross the boundaries of different Radio Access Technologies (RATs). Results show that high-bit-rate RATs both suffer and exert more influence from/on other RATs, producing additional signalling traffic towards the JRRM entity. Results also show that the VHO probability can range from 5% up to 65%, depending on the RAT cluster radius and the users' mobility profile.
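As a rough illustration of how a VHO indicator of this kind can be estimated, the toy Monte Carlo below tracks a randomly moving user around a single circular high-bit-rate cell embedded in a macro-cell RAT and counts boundary crossings; the geometry, mobility rule, and parameter values are assumptions, not the paper's model:

```python
# Toy Monte Carlo estimate of a VHO probability: a user wanders around a
# circular high-bit-rate RAT cell embedded in a macro-cell RAT, and every
# coverage-boundary crossing counts as one vertical handover.  Geometry,
# mobility rule and parameter values are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
radius = 100.0          # small-cell RAT radius [m]
step = 5.0              # distance travelled per time slot [m] (mobility profile)
slots = 100_000

pos = np.zeros(2)
inside = True
vho = 0
for _ in range(slots):
    angle = rng.uniform(0.0, 2.0 * np.pi)
    pos = 0.999 * (pos + step * np.array([np.cos(angle), np.sin(angle)]))
    now_inside = np.hypot(pos[0], pos[1]) < radius
    vho += now_inside != inside
    inside = now_inside

print(f"estimated VHO probability per slot: {vho / slots:.3%}")
```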
Abstract:
OBJECTIVE: To analyze whether the relationship between income inequality and human health is mediated through social capital, and whether political regime determines differences in income inequality and social capital among countries. METHODS: Path analysis of cross-sectional ecological data from 110 countries. Life expectancy at birth was the outcome variable, and income inequality (measured by the Gini coefficient), social capital (measured by the Corruption Perceptions Index or generalized trust), and political regime (measured by the Index of Freedom) were the predictor variables. The Corruption Perceptions Index (an indirect indicator of social capital) was used in order to include more developing countries in the analysis. The correlation between the Gini coefficient and the predictor variables was calculated using Spearman's coefficients. The path analysis was designed to assess the effect of income inequality, social capital proxies, and political regime on life expectancy. RESULTS: The path coefficients suggest that income inequality has a greater direct effect on life expectancy at birth than an effect mediated through social capital. Political regime acts on life expectancy at birth through income inequality. CONCLUSIONS: Income inequality and social capital have direct effects on life expectancy at birth. The "class/welfare regime model" can be useful for understanding social and health inequalities between countries, whereas the "income inequality hypothesis", which is only a partial approach, is especially useful for analyzing differences within countries.
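A minimal sketch of this type of analysis, combining a Spearman correlation with path coefficients obtained from standardized regressions; all variables below are synthetic stand-ins, not the study's country-level data, and the assumed path structure is illustrative:

```python
# Path-analysis sketch: a Spearman correlation plus path coefficients from
# standardized regressions.  All variables are synthetic stand-ins for the
# study's country-level data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 110                                       # countries in the study
freedom = rng.normal(size=n)                  # Index of Freedom (political regime)
gini = -0.5 * freedom + rng.normal(scale=0.9, size=n)    # income inequality
cpi = -0.6 * gini + rng.normal(scale=0.8, size=n)        # social capital proxy
life_exp = -0.5 * gini + 0.3 * cpi + rng.normal(scale=0.8, size=n)

rho, p = spearmanr(gini, cpi)
print(f"Spearman rho(Gini, CPI) = {rho:.2f} (p = {p:.3f})")

def path_coefs(y, *xs):
    """Standardized OLS coefficients, read here as path coefficients."""
    z = lambda v: (v - v.mean()) / v.std()
    A = np.column_stack([np.ones(len(y))] + [z(x) for x in xs])
    beta, *_ = np.linalg.lstsq(A, z(y), rcond=None)
    return beta[1:]

# Assumed path structure: gini -> cpi; gini and cpi -> life expectancy.
b_cpi_on_gini = path_coefs(cpi, gini)[0]
b_le_gini, b_le_cpi = path_coefs(life_exp, gini, cpi)
print(f"direct effect of inequality on life expectancy: {b_le_gini:.2f}")
print(f"indirect effect via the social capital proxy:   {b_cpi_on_gini * b_le_cpi:.2f}")
```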
Abstract:
OBJECTIVE: To assess the prevalence of preterm birth among low birthweight (LBW) babies in low- and middle-income countries. METHODS: Major databases (PubMed, LILACS, Google Scholar) were searched for studies on the prevalence of term and preterm LBW babies with fieldwork carried out after 1990 in low- and middle-income countries. Regression methods were used to model this proportion as a function of LBW prevalence levels. RESULTS: According to 47 studies from 27 low- and middle-income countries, approximately half of all LBW babies are preterm, rather than one in three as assumed in studies prior to the 1990s. CONCLUSIONS: The estimate of a substantially higher number of preterm LBW babies has important policy implications in view of the special health care needs of these infants. As with earlier projections, our findings are limited by the relative lack of population-based studies.
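A small sketch of the regression step, assuming a simple linear model of the preterm share as a function of LBW prevalence; the study-level values here are synthetic placeholders, not the data of the 47 studies:

```python
# Sketch of the regression idea: model the proportion of LBW babies born
# preterm as a function of overall LBW prevalence.  The values below are
# synthetic placeholders, not the data of the 47 studies.
import numpy as np

rng = np.random.default_rng(2)
lbw_prevalence = rng.uniform(5.0, 25.0, size=47)            # % LBW, one per study
preterm_share = np.clip(0.35 + 0.01 * lbw_prevalence
                        + rng.normal(scale=0.05, size=47), 0.0, 1.0)

slope, intercept = np.polyfit(lbw_prevalence, preterm_share, deg=1)
print(f"fitted preterm share at 15% LBW prevalence: {intercept + slope * 15:.2f}")
```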
Abstract:
Due to usage conditions, hazardous environments, or intentional causes, physical and virtual systems are subject to faults in their components, which may affect their overall behaviour. In a 'black-box' agent modelled by a set of propositional logic rules, in which only a subset of components is externally visible, such faults may only be recognised by examining some output function of the agent. A (fault-free) model of the agent's system provides the expected output for a given input. If the real output differs from the predicted output, then the system is faulty. However, some faults may only become apparent in the system output when appropriate inputs are given. A number of problems regarding both testing and diagnosis thus arise, such as testing for a fault, testing the whole system, finding possible faults, and differentiating between them to locate the correct one. The corresponding optimisation problems of finding solutions that require minimum resources are also very relevant in industry, as is minimal diagnosis. In this dissertation we use a well-established set of benchmark circuits to address such diagnosis-related problems, and we propose and develop models with different logics, which we formalise and generalise as much as possible. We also prove that all techniques generalise to agents and to multiple faults. The developed multi-valued logics extend the usual Boolean logic (suitable for fault-free models) by encoding values with some dependency (usually on faults). Such logics thus allow modelling an arbitrary number of diagnostic theories. Each problem is subsequently solved with CLP solvers that we implement and discuss, together with a new efficient search technique that we present. We compare our results with other approaches, such as SAT (which requires substantial duplication of circuits), showing the effectiveness of constraints over multi-valued logics, as well as the adequacy of a general set constraint solver (with special inferences over set functions such as cardinality) on other problems. In addition, for an optimisation problem, we integrate local search with a constructive approach (branch-and-bound), using a variety of logics, to improve an existing efficient tool based on SAT and ILP.
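A brute-force sketch of consistency-based diagnosis on a tiny combinational circuit with single stuck-at faults; it illustrates the testing and diagnosis problems described above, but it is not the dissertation's CLP or multi-valued-logic encoding, and the circuit is an arbitrary example:

```python
# Consistency-based diagnosis sketch for a tiny combinational circuit with
# single stuck-at faults: enumerate fault hypotheses and keep those consistent
# with the observed outputs.  This brute-force version only illustrates the
# testing/diagnosis problems discussed above; it is not the dissertation's
# constraint or multi-valued-logic encoding.
GATES = {                      # gate name -> (boolean function, input names)
    "g1": (lambda a, b: a & b, ("x1", "x2")),
    "g2": (lambda a, b: a | b, ("g1", "x3")),
}
OUTPUTS = ("g2",)

def simulate(inputs, fault=None):
    """Evaluate the circuit; fault = (gate, stuck_value) forces that gate's output."""
    values = dict(inputs)
    for gate, (fn, ins) in GATES.items():          # dict order = topological order
        values[gate] = fn(*(values[i] for i in ins))
        if fault and fault[0] == gate:
            values[gate] = fault[1]
    return tuple(values[o] for o in OUTPUTS)

def diagnose(observations):
    """Return the fault hypotheses consistent with all (inputs, outputs) pairs."""
    hypotheses = [None] + [(g, v) for g in GATES for v in (0, 1)]
    return [h for h in hypotheses
            if all(simulate(i, h) == o for i, o in observations)]

test_vector = {"x1": 1, "x2": 1, "x3": 0}
observed = (0,)                                    # fault-free output would be (1,)
print(diagnose([(test_vector, observed)]))         # -> [('g1', 0), ('g2', 0)]
```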
Abstract:
Theory building is one of the most crucial challenges faced by basic, clinical, and population research, which form the scientific foundations of health practices in contemporary societies. The objective of this study is to propose a Unified Theory of Health-Disease as a conceptual tool for modeling health-disease-care in the light of complexity approaches. With this aim, the epistemological basis of theoretical work in the health field and the concepts of complexity theory as applied to health problems are discussed. Secondly, the concepts of model-object, multi-planes of occurrence, modes of health, and the disease-illness-sickness complex are introduced and integrated into a unified theoretical framework. Finally, in the light of recent epistemological developments, the concept of Health-Disease-Care Integrals is updated as a complex reference object suitable for modeling health-related processes and phenomena.
Abstract:
Modern multicore processors for the embedded market are often heterogeneous in nature. One feature often available is a set of multiple sleep states with varying transition costs for entering and leaving them. This research effort explores energy-efficient task mapping on such a heterogeneous multicore platform to reduce the overall energy consumption of the system. This is performed in the context of a partitioned scheduling approach and a very realistic power model, which improves on some of the simplifying assumptions often made in the state of the art. The developed heuristic consists of two phases: in the first phase, tasks are allocated so as to minimise their active energy consumption, while the second phase trades off higher active energy consumption for an increased ability to exploit savings through more efficient sleep states. Extensive simulations demonstrate the effectiveness of the approach.
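A simplified sketch of what the first phase of such a heuristic could look like: each task is allocated to the feasible core where its active energy is lowest, under an illustrative power model and a utilisation-based schedulability test (both assumptions, not the paper's platform or power model):

```python
# Simplified sketch of the first phase of a two-phase mapping heuristic:
# allocate each task to the feasible core where its active energy is lowest.
# Power numbers and the utilisation-based test are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Core:
    name: str
    speed: float            # relative speed factor
    active_power: float     # W while executing
    utilisation: float = 0.0
    tasks: list = field(default_factory=list)

@dataclass
class Task:
    name: str
    wcec: float             # worst-case execution requirement (scaled units)
    period: float           # implicit deadline

def phase_one(tasks, cores):
    """Greedy active-energy-minimising allocation (phase one only)."""
    for t in sorted(tasks, key=lambda t: t.wcec / t.period, reverse=True):
        candidates = []
        for c in cores:
            exec_time = t.wcec / c.speed
            if c.utilisation + exec_time / t.period <= 1.0:      # EDF-style test
                energy = c.active_power * exec_time              # active energy per job
                candidates.append((energy, c, exec_time))
        if not candidates:
            raise RuntimeError(f"no feasible core for {t.name}")
        _, best, exec_time = min(candidates, key=lambda x: x[0])
        best.tasks.append(t.name)
        best.utilisation += exec_time / t.period
    return cores

cores = [Core("big", speed=2.0, active_power=2.0),
         Core("LITTLE", speed=1.0, active_power=0.6)]
tasks = [Task("t1", wcec=4, period=10), Task("t2", wcec=2, period=5),
         Task("t3", wcec=1, period=20)]
for c in phase_one(tasks, cores):
    print(c.name, c.tasks, round(c.utilisation, 2))
```

The second phase described in the abstract (trading active energy for deeper sleep states) would re-map tasks after this step; it is not sketched here.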
Abstract:
We prove existence, uniqueness, and stability of solutions of the prescribed curvature problem $\left(u'/\sqrt{1+u'^2}\right)' = au - b/\sqrt{1+u'^2}$ in $[0,1]$, $u'(0) = u(1) = 0$, for any given $a > 0$ and $b > 0$. We also develop a linear monotone iterative scheme for approximating the solution. This equation has been proposed as a model of the corneal shape in the recent paper (Okrasinski and Plociniczak in Nonlinear Anal., Real World Appl. 13:1498-1505, 2012), where a simplified version obtained by partial linearization has been investigated.
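A finite-difference sketch of an iteration of this flavour, freezing the nonlinearity at the previous iterate and solving a linear tridiagonal system at each step; this is an illustration under simplifying assumptions (including the boundary treatment), not necessarily the paper's monotone scheme:

```python
# Finite-difference fixed-point sketch for the prescribed curvature problem
#   (u'/sqrt(1+u'^2))' = a*u - b/sqrt(1+u'^2),  u'(0) = u(1) = 0.
# The nonlinearity is frozen at the previous iterate and a linear system is
# solved at each step.  Illustration only; not the paper's exact scheme.
import numpy as np

a, b = 1.0, 1.0
N = 200
h = 1.0 / N
u = np.zeros(N + 1)

for _ in range(50):
    du = np.gradient(u, h)
    w = 1.0 / np.sqrt(1.0 + du**2)                 # frozen nonlinearity
    w_half = 0.5 * (w[:-1] + w[1:])                # w at cell midpoints
    A = np.zeros((N, N))                           # unknowns u_0..u_{N-1}; u_N = 0
    rhs = -b * w[:N]
    for i in range(N):
        wl = w_half[i - 1] if i > 0 else 0.0       # u'(0)=0: zero flux at x=0
        wr = w_half[i]
        A[i, i] = -(wl + wr) / h**2 - a
        if i > 0:
            A[i, i - 1] = wl / h**2
        if i < N - 1:
            A[i, i + 1] = wr / h**2
    u_new = np.append(np.linalg.solve(A, rhs), 0.0)
    diff = np.max(np.abs(u_new - u))
    u = u_new
    if diff < 1e-10:
        break

print("u(0) =", round(u[0], 6))
```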
Abstract:
This paper studies a discrete dynamical system of particles that evolve by interacting with one another. The computational model is an abstraction of the natural world, and real systems can range from the huge cosmological scale down to the scale of the biological cell, or even molecules. Different conditions for the system's evolution are tested. The emerging patterns are analysed by means of fractal dimension and entropy measures. It is observed that the population of particles evolves towards geometrical objects with a fractal nature. Moreover, the time signature of the entropy can be interpreted in the light of complex dynamical systems.
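A short sketch of the two measures mentioned, applied to a synthetic set of 2-D particle positions: a box-counting estimate of the fractal dimension and the Shannon entropy of the box-occupancy distribution (the paper's interaction model is not reproduced):

```python
# Box-counting fractal dimension and occupancy entropy of a 2-D particle set.
# The particle positions here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
points = rng.random((5000, 2))            # stand-in for particle positions in [0,1)^2

def box_counts(points, n_boxes):
    """Occupied-box count and occupancy histogram on an n x n grid."""
    idx = np.minimum((points * n_boxes).astype(int), n_boxes - 1)
    _, counts = np.unique(idx[:, 0] * n_boxes + idx[:, 1], return_counts=True)
    return len(counts), counts

sizes = np.array([4, 8, 16, 32, 64])
occupied = np.array([box_counts(points, n)[0] for n in sizes])
dim, _ = np.polyfit(np.log(sizes), np.log(occupied), 1)   # slope ~ fractal dimension

_, counts = box_counts(points, 32)
p = counts / counts.sum()
entropy = -np.sum(p * np.log(p))                          # Shannon entropy (nats)

print(f"box-counting dimension ~ {dim:.2f}, occupancy entropy ~ {entropy:.2f}")
```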
Abstract:
Over the last three decades, computer architects have been able to increase the performance of single processors by, e.g., increasing clock speed, introducing cache memories, and using instruction-level parallelism. However, because of power consumption and heat dissipation constraints, this trend is going to cease. In recent times, hardware engineers have instead moved to new chip architectures with multiple processor cores on a single chip. With multi-core processors, applications can complete more total work than with one core alone. Parallel programming models are proposed as promising solutions for using multi-core processors more effectively. This paper discusses some of the existing models and frameworks for parallel programming and outlines a draft parallel programming model for Ada.
Abstract:
This paper presents a novel approach to WLAN propagation models for use in indoor localization. The major goal of this work is to eliminate the need for in-situ data collection to generate the fingerprinting map; instead, the map is generated using analytical propagation models such as COST Multi-Wall, COST 231 average wall, and Motley-Keenan. The location estimation algorithms kNN (K-Nearest Neighbour) and WkNN (Weighted K-Nearest Neighbour) were used to determine the accuracy of the proposed technique. This work relies on analytical and measurement tools to determine which path-loss propagation models are better suited for location estimation applications based on the Received Signal Strength Indicator (RSSI). The study presents different proposals for choosing the most appropriate values for the models' parameters, such as obstacle attenuation values and coefficients. Some adjustments to these models, particularly to Motley-Keenan, taking the thickness of walls into account, are proposed. The best solution found is based on the adjusted Motley-Keenan and COST models, which allow the propagation loss to be estimated for several environments. Results obtained from two testing scenarios showed the reliability of the adjustments, providing smaller errors between measured and predicted values.
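A compact sketch of the two building blocks described: a Motley-Keenan-style path-loss prediction (log-distance term plus per-wall attenuation) used to build a fingerprint map, and a kNN match in RSSI space; attenuation values, geometry, and transmit power are illustrative assumptions, not the paper's calibrated parameters:

```python
# Motley-Keenan-style path loss (log-distance free-space term + per-wall
# attenuation) used to generate a fingerprint map, followed by a kNN match
# in RSSI space.  All parameter values and geometry are illustrative.
import numpy as np

def motley_keenan(d, f_mhz, walls, wall_loss_db=5.0, n=2.0):
    """Path loss [dB]: free-space loss at 1 m + log-distance term + wall losses."""
    fspl_1m = 20 * np.log10(f_mhz) - 27.55
    return fspl_1m + 10 * n * np.log10(np.maximum(d, 1e-3)) + walls * wall_loss_db

# Fingerprint map: predicted RSSI (tx power - path loss) from 3 APs at 4 spots.
tx_power = 20.0                                             # dBm, assumed
distances = np.array([[5, 12, 20], [8, 6, 15], [15, 9, 4], [20, 14, 7]], float)
walls = np.array([[1, 2, 3], [1, 1, 2], [2, 1, 0], [3, 2, 1]])
fingerprints = tx_power - motley_keenan(distances, 2400.0, walls)
positions = np.array([[1, 1], [4, 1], [7, 3], [9, 6]], float)  # map coords [m]

def knn_locate(rssi, k=2):
    """Average the k reference positions closest to the measured RSSI vector."""
    d = np.linalg.norm(fingerprints - rssi, axis=1)
    nearest = np.argsort(d)[:k]
    return positions[nearest].mean(axis=0)

measured = np.array([-62.0, -55.0, -70.0])                  # example on-line sample
print("estimated position:", knn_locate(measured))
```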
Abstract:
A construction project is a group of discernible tasks or activities that are conducted in a coordinated effort to accomplish one or more objectives. Construction projects require varying levels of cost, time, and other resources. To plan and schedule a construction project, activities must be defined in sufficient detail; the level of detail determines the number of activities contained within the project plan and schedule. Finding feasible schedules that use scarce resources efficiently is therefore a challenging task within project management. In this context, the well-known Resource-Constrained Project Scheduling Problem (RCPSP) has been studied over the last decades. In the RCPSP, the activities of a project have to be scheduled such that the makespan of the project is minimized, while observing the technological precedence constraints as well as the limited availability of the renewable resources required to accomplish the activities. Once started, an activity may not be interrupted. This problem has been extended to a more realistic model, the Multi-Mode Resource-Constrained Project Scheduling Problem (MRCPSP), where each activity can be performed in one out of several modes. Each mode of an activity represents an alternative way of combining different levels of resource requirements with a related duration. Each renewable resource, such as manpower and machines, has a limited availability for the entire project. This paper presents a hybrid genetic algorithm for the MRCPSP, in which multiple execution modes are available for each of the activities of the project. The objective function is the minimization of the construction project completion time. To solve the problem, a two-level genetic algorithm is applied, which makes use of two separate levels and extends the parameterized schedule generation scheme. The quality of the schedules is evaluated and detailed comparative computational results for the MRCPSP are presented, revealing that this approach is a competitive algorithm.
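A minimal serial schedule-generation decoder for the MRCPSP, of the kind a GA chromosome (activity list plus mode assignment) would be decoded with; this shows only the decoding step under an arbitrary toy instance, not the paper's two-level hybrid genetic algorithm:

```python
# Minimal serial schedule-generation decoder for the MRCPSP: given an activity
# sequence and a mode choice per activity, schedule each activity at the
# earliest start respecting precedence and renewable-resource capacities.
# Toy instance; only the decoding step a GA chromosome would use.
CAPACITY = {"crew": 4, "crane": 1}

# activity -> list of modes; each mode = (duration, {resource: demand})
MODES = {
    "A": [(3, {"crew": 2}), (2, {"crew": 3})],
    "B": [(4, {"crew": 2, "crane": 1})],
    "C": [(2, {"crew": 1}), (1, {"crew": 2, "crane": 1})],
}
PRECEDENCE = {"A": [], "B": ["A"], "C": ["A"]}

def decode(sequence, mode_choice):
    """Return finish times and makespan for a precedence-feasible sequence."""
    usage = {}                                    # (time slot, resource) -> usage
    finish = {}
    for act in sequence:
        dur, demand = MODES[act][mode_choice[act]]
        t = max((finish[p] for p in PRECEDENCE[act]), default=0)
        while True:                               # earliest feasible start
            slots = range(t, t + dur)
            if all(usage.get((s, r), 0) + q <= CAPACITY[r]
                   for s in slots for r, q in demand.items()):
                break
            t += 1
        for s in slots:
            for r, q in demand.items():
                usage[(s, r)] = usage.get((s, r), 0) + q
        finish[act] = t + dur
    return finish, max(finish.values())

finish, makespan = decode(["A", "B", "C"], {"A": 1, "B": 0, "C": 1})
print(finish, "makespan =", makespan)
```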
Abstract:
To study a flavour model with a non-minimal Higgs sector one must first define the symmetries of the fields, then identify which types of vacua exist and how they may break the symmetries, and finally determine whether the remnant symmetries are compatible with the experimental data. Here we address all these issues in the context of flavour models with any number of Higgs doublets. We stress the importance of analysing the Higgs vacuum expectation values that are pseudo-invariant under the generators of all subgroups. It is shown that the only way of obtaining a physical CKM mixing matrix and, simultaneously, non-degenerate and non-zero quark masses is to require the vacuum expectation values of the Higgs fields to completely break the full flavour group, except possibly for some symmetry belonging to baryon number. The application of this technique to some illustrative examples, such as the flavour groups Delta(27), A_4 and S_3, is also presented.
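For reference, the pseudo-invariance condition alluded to above, in the notation commonly used for multi-Higgs-doublet flavour models (the notation is an assumption here, not quoted from the paper):

```latex
% A vacuum configuration v = (v_1, ..., v_n) of the n Higgs doublets is called
% pseudo-invariant under a flavour-group element S if it is left unchanged up
% to an overall phase:
\[
  S\, v \;=\; e^{i\theta}\, v \qquad \text{for some phase } \theta .
\]
% Such a phase can be absorbed by a hypercharge rotation, so S survives as a
% residual symmetry of that vacuum; the abstract argues that realistic quark
% masses and CKM mixing leave no such residual symmetry beyond baryon number.
```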