825 results for multi-mediational path model
Abstract:
As a result of globalization and free trade agreements, international trade has grown enormously over the last few decades, inevitably putting more pressure on the environment. This has drawn the attention of both environmentalists and economists in response to the ever-growing concerns about climate change and the urgent need for international action to mitigate it. In this work we analyze the implications of international trade in terms of CO2 emissions between Spain and its most important partners using a multi-regional input-output (MRIO) model. A fully integrated 13-region MRIO model is constructed to examine the pollution responsibility of Spain from both production and consumption perspectives. The empirical results show that Spain is a net importer of CO2 emissions, equivalent to 29% of its production-based emissions. Even though the leading partners in terms of import value are countries such as Germany, France, Italy and Great Britain, the CO2 embodied in trade with China takes the largest share. This is mainly due to the importation of energy-intensive products from China, coupled with China's poor energy mix, which is dominated by coal-fired power plants. The largest portion (67%) of the total imported CO2 emissions is due to intermediate demand by production sectors. Products such as motor vehicles, chemicals, a variety of machinery and equipment, textile and leather products, and construction materials are the key imports that drive the emissions generated by their production in the respective exporting countries. Having peaked in 2005, the construction sector is the activity most responsible for both domestic and imported emissions.
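As a rough illustration of the consumption-based accounting such an MRIO model performs, the sketch below computes the emissions embodied in one region's final demand through the Leontief inverse. The two-region, two-sector coefficients, intensities, and demand vector are invented for illustration and are not the study's 13-region table.

```python
import numpy as np

# Illustrative 2-region x 2-sector MRIO block (the study uses 13 regions).
# A[i, j]: input from row sector i needed per unit of output of column sector j.
A = np.array([
    [0.10, 0.05, 0.02, 0.01],
    [0.04, 0.12, 0.01, 0.03],
    [0.03, 0.02, 0.15, 0.06],
    [0.01, 0.04, 0.05, 0.10],
])
f = np.array([0.20, 0.90, 0.35, 1.40])    # direct CO2 per unit of output
y = np.array([1.0, 0.5, 0.3, 0.2])        # final demand of the importing region

L = np.linalg.inv(np.eye(4) - A)          # Leontief inverse (I - A)^-1
x = L @ y                                 # gross output induced worldwide
embodied = f * x                          # CO2 by producing sector and region
print(embodied.reshape(2, 2).sum(axis=1)) # embodied emissions by region of origin
```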
Abstract:
In this paper we present the theoretical and methodological foundations for the development of a multi-agent Selective Dissemination of Information (SDI) service model that applies Semantic Web technologies for specialized digital libraries. These technologies make it possible to achieve more efficient information management, improving agent-user communication processes and facilitating accurate access to relevant resources. Other tools used are fuzzy linguistic modelling techniques (which ease the interaction between users and the system) and natural language processing (NLP) techniques for semiautomatic thesaurus generation. Also, RSS feeds are used as "current awareness bulletins" to generate personalized bibliographic alerts.
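To make the dissemination step concrete, here is a toy sketch of matching an incoming RSS item against a user profile with thesaurus expansion; the thesaurus, profile, item terms, and threshold are all invented, and the paper's fuzzy linguistic and multi-agent machinery is not reproduced.

```python
# Toy SDI matching step: an alert fires when an incoming RSS item,
# expanded through a small thesaurus, overlaps a user's interest profile.
thesaurus = {"semantic web": {"ontologies", "rdf", "linked data"},
             "digital libraries": {"repositories", "metadata"}}

profile = {"semantic web", "metadata"}    # hypothetical user interests

def expand(terms):
    """Add thesaurus-related terms to a term set."""
    out = set(terms)
    for t in terms:
        out |= thesaurus.get(t, set())
    return out

item_terms = {"rdf", "digital libraries"}             # hypothetical RSS item
score = len(expand(item_terms) & expand(profile)) / len(expand(profile))
print(score > 0.2)   # True -> send a personalized bibliographic alert
```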
Abstract:
The recent availability of the chicken genome sequence poses the question of whether there are human protein-coding genes conserved in chicken that are currently not included in the human gene catalog. Here, we show, using comparative gene finding followed by experimental verification of exon pairs by RT–PCR, that the addition to the multi-exonic subset of this catalog could be as little as 0.2%, suggesting that we may be closing in on the human gene set. Our protocol, however, has two shortcomings: (i) the bioinformatic screening of the predicted genes, applied to filter out false positives, cannot handle intronless genes; and (ii) the experimental verification could fail to identify expression at a specific developmental time. This highlights the importance of developing methods that could provide a reliable estimate of the number of these two types of genes.
Abstract:
The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimal data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found suitable for assessing other natural hazards such as rockfalls, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM, avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of data availability. Amongst the possible datasets, the DEM is the only one that is strictly needed for both source area delineation and propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results, although valuable results have still been obtained from lower-quality 25 m DEMs.
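The sketch below shows a Holmgren-style multiple-flow-direction weighting in which the central cell is raised by a height dh, which is one way to read the abstract's "improved version" that damps sensitivity to small DEM variations; the exponent, dh value, and elevation window are illustrative assumptions, not Flow-R's actual parameters.

```python
import numpy as np

def holmgren_weights(z, x=4.0, dh=2.0, cellsize=10.0):
    """Flow-proportion weights from a 3x3 elevation window (center = source).

    Sketch of Holmgren's multiple-flow-direction algorithm with the central
    cell raised by dh to smooth out small DEM noise (our assumption about
    the modification; exponent x controls flow divergence).
    """
    center = z[1, 1] + dh                        # raised central elevation
    dist = cellsize * np.array([[2**0.5, 1, 2**0.5],
                                [1,      1, 1],
                                [2**0.5, 1, 2**0.5]])
    tan_beta = (center - z) / dist               # slope toward each neighbour
    tan_beta[1, 1] = 0.0                         # no flow to the cell itself
    w = np.where(tan_beta > 0, tan_beta, 0.0) ** x
    total = w.sum()
    return w / total if total > 0 else w         # proportions summing to 1

window = np.array([[102.0, 101.5, 103.0],
                   [101.0, 100.0, 102.5],
                   [ 99.0,  98.5, 100.5]])
print(holmgren_weights(window).round(3))
```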
Abstract:
Evaluating the possible benefits of the introduction of genetically modified (GM) crops must address the issue of consumer resistance as well as the complex regulation that has ensued. In the European Union (EU) this regulation envisions the "co-existence" of GM food with conventional and quality-enhanced products, mandates the labelling and traceability of GM products, and tolerates only a stringent threshold for the adventitious presence of GM content in other products. All these elements are brought together within a partial equilibrium model of the EU agricultural food sector. The model comprises conventional, GM and organic food. Demand is modelled in a novel fashion, whereby organic and conventional products are treated as horizontally differentiated but GM products are vertically differentiated (weakly inferior) relative to conventional ones. Supply accounts explicitly for the land constraint at the sector level and for the additional resources needed to produce organic food. Model calibration and simulation provide insights into the qualitative and quantitative effects of the large-scale introduction of GM products in the EU market. We find that the introduction of GM food reduces overall EU welfare, mostly because of the associated need for costly segregation of non-GM products, but that producers of quality-enhanced products actually benefit.
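A minimal sketch of the vertical-differentiation demand side described above, assuming consumers with taste theta uniform on [0,1] who value conventional food at theta and the weakly inferior GM food at s*theta with s < 1; the prices and quality parameter are invented, and the paper's organic segment and land constraint are omitted.

```python
# Toy vertical-differentiation demand: GM is weakly inferior (quality s < 1).
def demand_shares(p_conv, p_gm, s):
    """Market shares for conventional and GM food under taste heterogeneity."""
    theta_star = (p_conv - p_gm) / (1.0 - s)   # consumer indifferent between goods
    theta_min = p_gm / s                       # lowest taste still buying GM
    conv = max(0.0, 1.0 - theta_star)          # high-taste consumers buy conventional
    gm = max(0.0, min(theta_star, 1.0) - theta_min)  # middle range buys GM
    return conv, gm

# Illustrative prices: 20% buy conventional, 40% buy GM, 40% buy nothing.
print(demand_shares(p_conv=0.6, p_gm=0.2, s=0.5))
```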
Abstract:
This article presents a formal model of policy decision-making in an institutional framework of separation of powers in which the main actors are pivotal political parties with voting discipline. The basic model, previously developed from pivotal politics theory for the analysis of lawmaking in the United States, is modified here to account for policy outcomes and institutional performance in other presidential regimes, especially in Latin America. Legislators' party indiscipline in voting and multi-partism appear as favorable conditions for reducing the size of the equilibrium set containing collectively inefficient outcomes, while a two-party system with strong party discipline is most prone to produce 'gridlock', that is, stability of socially inefficient policies. The article provides a framework for analysis that can induce significant revisions of empirical data, especially regarding the effects of situations of (newly defined) unified and divided government, different decision rules, the number of parties and their discipline. These implications should be testable and may inspire future analytical and empirical work.
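For readers unfamiliar with pivotal politics theory, the sketch below reduces it to its standard one-dimensional core, in which any status quo lying between the two pivots cannot be moved (the gridlock interval); the pivot positions are illustrative, and the article's multi-party, discipline-varying extension is not reproduced here.

```python
# One-dimensional pivotal-politics core: the median proposes, but both
# pivots can block. Status quos inside [left_pivot, right_pivot] are stuck.
def policy_outcome(q, left_pivot, median, right_pivot):
    if left_pivot <= q <= right_pivot:
        return q                            # gridlock: status quo prevails
    if q < left_pivot:
        # The left pivot accepts any point no farther from it than q is,
        # so the median gets at most the reflection 2*left_pivot - q.
        return min(median, 2 * left_pivot - q)
    return max(median, 2 * right_pivot - q)  # symmetric case on the right

for q in (-3.0, -0.5, 0.2, 2.5):            # illustrative status quos
    print(q, policy_outcome(q, left_pivot=-1.0, median=0.0, right_pivot=1.0))
```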
Abstract:
Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing the optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviate from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis and do not require computing sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables is presented.
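For context, the standard expressions behind this trade-off are, in common covariance-structure notation (which may differ from the paper's):

$$
\Gamma_{ij,kl} = \sigma_{ijkl} - \sigma_{ij}\sigma_{kl},
\qquad
\Gamma_{\mathrm{NA}} = 2\,D_p^{+}\,(\Sigma \otimes \Sigma)\,D_p^{+\prime},
$$

where $\sigma_{ijkl} = E[z_i z_j z_k z_l]$ are fourth-order moments, $\Sigma$ is the population covariance matrix, and $D_p^{+}$ is the Moore-Penrose inverse of the duplication matrix. The general, distribution-free $\Gamma$ requires fourth-order moments, whereas $\Gamma_{\mathrm{NA}}$ needs only second-order ones; the practical gain claimed in the abstract is that $\Omega$, like $\Gamma_{\mathrm{NA}}$ but unlike the general $\Gamma$, can be built from cross-product (second-order) moments alone.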
Abstract:
This paper describes a methodology to estimate the coefficients, test specification hypotheses and conduct policy exercises in multi-country VAR models with cross-unit interdependencies, unit-specific dynamics and time variations in the coefficients. The framework of analysis is Bayesian: a prior flexibly reduces the dimensionality of the model and puts structure on the time variations; MCMC methods are used to obtain posterior distributions; and marginal likelihoods are used to check the fit of various specifications. Impulse responses and conditional forecasts are obtained from the output of the MCMC routine. The transmission of certain shocks across countries is analyzed.
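As a stripped-down illustration of one MCMC ingredient in such a framework, the sketch below draws from the conjugate posterior of a single regression equation under a Gaussian shrinkage prior; the full model's cross-unit interdependencies, unit-specific dynamics, and time-varying coefficients are not reproduced, and the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_draw(X, y, tau2=1.0, sigma2=1.0):
    """One draw from the posterior of b in y = X b + e, e ~ N(0, sigma2),
    under the shrinkage prior b ~ N(0, tau2 * I) (conjugate Gaussian core)."""
    k = X.shape[1]
    V = np.linalg.inv(X.T @ X / sigma2 + np.eye(k) / tau2)  # posterior covariance
    m = V @ (X.T @ y / sigma2)                              # posterior mean
    return rng.multivariate_normal(m, V)

# Simulated data standing in for one unit's VAR equation.
T, k = 120, 3
X = rng.normal(size=(T, k))
y = X @ np.array([0.5, -0.2, 0.1]) + rng.normal(scale=1.0, size=T)
draws = np.array([posterior_draw(X, y) for _ in range(1000)])
print(draws.mean(axis=0).round(2))   # posterior means near the true coefficients
```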
Abstract:
The old, understudied electoral system composed of multi-member districts, an open ballot and plurality rule is presented as the most remote scene of the origin of both political parties and new electoral systems. A survey of the uses of this set of electoral rules in different parts of the world, during both remote and recent periods, shows how widespread it has been. A model of voting under this electoral system demonstrates that, while it can produce varied and pluralistic representation, it also provides incentives to form factional or partisan candidacies. Famous negative reactions to the emergence of factions and political parties during the 18th and 19th centuries are reinterpreted in this context. Many electoral rules and procedures invented since the second half of the 19th century, including the Australian ballot, single-member districts, limited and cumulative ballots, and proportional representation rules, derived from the search for ways to reduce the tendency of the original multi-member district system to favor a single-party sweep. The general relations between political parties and electoral systems are restated to account for the foundational stage discussed here.
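The mechanism driving factional candidacies can be seen in a few lines: under plurality rule in a multi-member district, a disciplined faction with a bare majority of voters takes every seat, while unorganized voters who scatter their votes win none. The electorate size, slate, and candidate names below are invented.

```python
from collections import Counter

# Three seats; each voter casts three votes (open ballot, plurality rule).
seats = 3
slate = ["A1", "A2", "A3"]                       # coordinated faction slate
others = ["B1", "B2", "B3", "B4", "B5", "B6"]    # unorganized candidates

votes = Counter()
for _ in range(51):                              # 51 faction voters vote the slate
    votes.update(slate)
for i in range(49):                              # 49 voters scatter across the B's
    votes.update(others[(i + j) % 6] for j in range(seats))

winners = [c for c, _ in votes.most_common(seats)]
print(winners)   # ['A1', 'A2', 'A3']: the bare-majority faction sweeps all seats
```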
Abstract:
In this paper we develop two models for an inventory system in which the distributor manages the inventory at the retailers' locations. These types of systems correspond to the Vendor Managed Inventory (VMI) systems described in the literature. Such systems are very common in many different types of industries, such as retailing and manufacturing, although they take on different characteristics. The objective of our model is to minimize the total inventory cost for the distributor in a multi-period, multi-retailer setting. The inventory system includes holding and stock-out costs, and we study the case where an additional fixed setup cost is charged per delivery. We construct a numerical experiment to analyze the behavior of the model and observe the impact of its characteristics on the solutions.
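To make the objective concrete, the sketch below evaluates the distributor's cost for a fixed delivery plan, combining holding and stock-out costs with a fixed setup cost per delivery. The demands, cost rates, delivery plan, and the lost-sales treatment of stock-outs are our assumptions, not the paper's.

```python
import numpy as np

def vmi_cost(deliveries, demand, h=1.0, p=4.0, K=25.0):
    """Total cost of a delivery plan; arrays have shape (periods, retailers).

    h: holding cost per unit per period, p: stock-out penalty per unit,
    K: fixed setup cost charged per delivery made.
    """
    inv = np.zeros(demand.shape[1])
    total = 0.0
    for t in range(demand.shape[0]):
        total += K * np.count_nonzero(deliveries[t])   # setup cost per delivery
        inv = inv + deliveries[t] - demand[t]
        total += h * np.maximum(inv, 0.0).sum()        # holding cost
        total += p * np.maximum(-inv, 0.0).sum()       # stock-out penalty
        inv = np.maximum(inv, 0.0)                     # unmet demand lost (assumed)
    return total

demand = np.array([[3, 5], [4, 2], [6, 4]], dtype=float)   # 3 periods, 2 retailers
plan = np.array([[5, 5], [5, 0], [5, 5]], dtype=float)
print(vmi_cost(plan, demand))
```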
Abstract:
Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability. In this paper, we criticize this model from another angle: estimating the discrete finite-period model poses problems of indeterminacy and non-robustness. Arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds an additional layer of indeterminacy. To counter this, we first propose an alternative finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of the population size and the model parameters.
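To see why the estimation problem resembles a binomial with unknown population size, the sketch below writes the likelihood of observed purchase counts when the no-purchase count N - n is hidden, with purchase probabilities given by a multinomial logit. The utilities and counts are invented, and this is an illustration of the problem, not the paper's heuristic.

```python
import numpy as np
from math import lgamma

def loglik(N, v, counts):
    """Multinomial log-likelihood with unknown market size N.

    v: MNL weights exp(utility) per offered product; the no-purchase
    (outside) option has weight 1. counts: observed purchases per product.
    """
    n = counts.sum()
    if N < n:
        return -np.inf
    probs = v / (1.0 + v.sum())               # MNL purchase probabilities
    p0 = 1.0 / (1.0 + v.sum())                # no-purchase probability
    ll = lgamma(N + 1) - lgamma(N - n + 1) - sum(lgamma(c + 1) for c in counts)
    ll += (counts * np.log(probs)).sum() + (N - n) * np.log(p0)
    return ll

counts = np.array([30, 18, 9])                # hypothetical observed sales
v = np.array([0.6, 0.35, 0.18])               # hypothetical MNL weights
for N in (80, 120, 160, 200):
    print(N, round(loglik(N, v, counts), 2))  # flat near the optimum: hard problem
```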
Abstract:
In a previous paper a novel Generalized Multiobjective Multitree model (GMM-model) was proposed. This model considers, for the first time, multitree-multicast load balancing with splitting in a multiobjective context, whose mathematical solution is a whole Pareto optimal set that can include more solutions than it had been possible to find in the publications surveyed. To solve the GMM-model, in this paper a multi-objective evolutionary algorithm (MOEA) inspired by the Strength Pareto Evolutionary Algorithm (SPEA) is proposed. Experimental results considering up to 11 different objectives are presented for the well-known NSF network, with two simultaneous data flows.
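For reference, here is a minimal sketch of the SPEA-style strength fitness such an MOEA builds on (minimization assumed; the objective vectors are invented). Each archive member's strength is the share of population members it dominates, and a population member's fitness accumulates the strengths of the archive members that dominate it.

```python
import numpy as np

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse everywhere, better somewhere."""
    return np.all(a <= b) and np.any(a < b)

def spea_fitness(archive, population):
    """SPEA-style strength and fitness values (lower fitness is better)."""
    n = len(population)
    strength = np.array([sum(dominates(a, p) for p in population) / (n + 1)
                         for a in archive])
    fitness = np.array([1.0 + sum(s for a, s in zip(archive, strength)
                                  if dominates(a, p))
                        for p in population])
    return strength, fitness

archive = [np.array([1.0, 4.0]), np.array([3.0, 2.0])]        # nondominated set
population = [np.array([2.0, 5.0]), np.array([4.0, 3.0]),
              np.array([0.5, 6.0])]
print(spea_fitness(archive, population))
```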
Abstract:
Computed Tomography (CT) is the standard imaging modality for tumor volume delineation in radiotherapy treatment planning of retinoblastoma, despite some inherent limitations. CT is very useful in providing information on physical density for dose calculation and morphological volumetric information, but it has low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is currently used only for diagnosis and follow-up rather than being integrated into treatment planning. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in the 3D US, the segmentation of the organs at risk (OAR) in the CT, and the registration of both modalities. In this paper, we present some preliminary results in this direction. We present a 3D active-contour-based segmentation of the eyeball and the lens in CT images; the presented approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. Then, we present two approaches for the fusion of 3D CT and US images: (i) landmark-based transformation, and (ii) object-based transformation that makes use of eyeball contour information in the CT and US images.
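As an illustration of the landmark-based transformation, the sketch below fits a least-squares rigid transform (Kabsch/Procrustes) mapping US landmarks onto CT landmarks; the landmark coordinates are synthetic, and the paper's actual landmark selection and registration pipeline are not reproduced.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~= src @ R.T + t."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)   # center both point sets
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)             # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(0) - src.mean(0) @ R.T
    return R, t

# Synthetic US landmarks, and CT landmarks generated by a known rigid motion.
us = np.array([[10.0, 0.0, 0.0], [0.0, 12.0, 0.0],
               [0.0, 0.0, 9.0], [5.0, 5.0, 5.0]])
theta = np.pi / 8
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
ct = us @ Rz.T + np.array([2.0, -1.0, 3.0])

R, t = rigid_transform(us, ct)
print(np.allclose(us @ R.T + t, ct))   # True: the rigid motion is recovered
```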
Abstract:
An accurate assessment of the rising ambient temperature by plant cells is crucial for the timely activation of various molecular defences before the appearance of heat damage. Recent findings have allowed a better understanding of the early cellular events that take place at the beginning of a mild temperature rise, leading to the timely expression of heat-shock proteins (HSPs), which will, in turn, confer thermotolerance to the plant. Here, we discuss the key components of the heat signalling pathway and suggest a model in which a primary sensory role is carried out by the plasma membrane and various secondary messengers, such as Ca²⁺ ions, nitric oxide (NO) and hydrogen peroxide (H₂O₂). We also describe the role of downstream components, such as calmodulins, mitogen-activated protein kinases and Hsp90, in the activation of heat-shock transcription factors (HSFs). The data gathered for land plants suggest that, following temperature elevation, the heat signal is probably transduced by several pathways that will, however, coalesce into the final activation of HSFs, the expression of HSPs and the onset of cellular thermotolerance.