907 results for Mixed model under selection


Relevance: 40.00%

Abstract:

The physical processes controlling the mixed layer salinity (MLS) seasonal budget in the tropical Atlantic Ocean are investigated using a regional configuration of an ocean general circulation model. The analysis reveals that the MLS seasonal cycle is generally weak compared with the individual physical processes entering the budget, because these processes largely compensate one another. In evaporative regions around the sea surface salinity (SSS) maxima, the ocean acts to freshen the mixed layer against the action of evaporation. Poleward of the southern SSS maximum, this freshening is supplied by geostrophic advection, vertical salinity diffusion and, during winter, a dominant contribution from convective entrainment. On the equatorward flanks of the SSS maxima, Ekman transport mainly supplies freshwater from the ITCZ region, while vertical salinity diffusion reinforces the effect of evaporation. All these terms are phase-locked through the effect of the wind. Under the seasonal march of the ITCZ and in coastal areas affected by river runoff (7°S to 15°N), the upper-ocean freshening by precipitation and/or runoff is attenuated by vertical salinity diffusion. In the eastern equatorial regions, the seasonal cycle of wind-forced surface currents advects freshwater, which is mixed with saline subsurface water by strong vertical turbulent diffusion. In all these regions, vertical diffusion makes an important contribution to the MLS budget, generally providing an upward flux of salinity driven by the vertical salinity gradient and wind-induced mixing. Furthermore, at the equator, where vertical shear associated with the surface horizontal currents develops, the diffusion also depends on the stability of the sheared flow.
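
For reference, a mixed layer salinity budget of the kind analysed here is commonly written in the following generic form (an illustrative textbook decomposition, not necessarily the exact one used in the study):

    \frac{\partial \bar{S}}{\partial t}
      = -\,\mathbf{u}_h \cdot \nabla \bar{S}
        + \frac{(E - P - R)\,\bar{S}}{h}
        - \frac{1}{h}\Bigl(\kappa_z \frac{\partial S}{\partial z}\Bigr)_{z=-h}
        - \frac{w_e\,\Delta S}{h}

where \bar{S} is the salinity averaged over a mixed layer of depth h, \mathbf{u}_h the horizontal velocity, E, P and R evaporation, precipitation and runoff, \kappa_z the vertical diffusivity, w_e the entrainment velocity and \Delta S the salinity jump at the mixed layer base; the terms correspond to horizontal advection, surface freshwater forcing, vertical diffusion and entrainment.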

Relevance: 40.00%

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly because both parties suffer a (small) time cost after any rejection. The difficulty arises because the good can be of low or high quality and the quality of the good is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, in equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which substantially reduces efficiency. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externality that can loosely be described as an incentive to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
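
To illustrate the kind of free-riding externality that partition functions capture, consider a hypothetical three-player example (the numbers are purely illustrative and not taken from the thesis):

    \pi_0 = \{\{1\},\{2\},\{3\}\}: \quad v(\{1\},\pi_0) = v(\{2\},\pi_0) = v(\{3\},\pi_0) = 0
    \pi_1 = \{\{1,2\},\{3\}\}:     \quad v(\{1,2\},\pi_1) = 2, \quad v(\{3\},\pi_1) = 1.5
    \pi_2 = \{\{1,2,3\}\}:         \quad v(\{1,2,3\},\pi_2) = 4

The grand coalition is efficient (4 > 2 + 1.5), but player 3 obtains 1.5 by staying out while players 1 and 2 cooperate, which exceeds an equal share 4/3 of the grand-coalition worth; the positive externality thus creates exactly the free-riding incentive discussed above.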

Relevance: 40.00%

Abstract:

Despite extensive research on development in education and psychology, the methodology is not often tested with real data. A major barrier to testing growth models is that the study design involves repeated observations and the growth is nonlinear in nature. Repeated measurements on a nonlinear model require sophisticated statistical methods. In this study, we present a mixed-effects model based on a negative exponential curve to describe the development of children's reading skills. This model can describe the nature of the growth of children's reading skills and account for intra-individual and inter-individual variation. We also apply simple techniques including cross-validation, regression, and graphical methods to determine the most appropriate curve for the data, to find efficient initial values of the parameters, and to select potential covariates. We illustrate with the example that motivated this research: a longitudinal study of academic skills from grade 1 to grade 12 in Connecticut public schools.
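
A minimal sketch of the negative exponential growth curve, with a simple two-stage approximation to the mixed-effects fit, is given below in Python with simulated data (the parameter values, sample sizes and the two-stage shortcut are illustrative assumptions, not the estimation procedure of the study):

    # Negative exponential growth curve for reading skill: a hedged sketch.
    # score(t) = asymptote - (asymptote - start) * exp(-rate * t)
    # A full nonlinear mixed-effects fit estimates child-level random effects
    # jointly; here a simple two-stage approximation is shown instead.
    import numpy as np
    from scipy.optimize import curve_fit

    def neg_exp(t, asymptote, start, rate):
        """Negative exponential growth from `start` toward `asymptote`."""
        return asymptote - (asymptote - start) * np.exp(-rate * t)

    rng = np.random.default_rng(0)
    grades = np.arange(1, 13, dtype=float)            # grade 1 .. 12
    children = []
    for _ in range(30):                               # simulated children (illustrative)
        true = (rng.normal(600, 30), rng.normal(350, 25), rng.lognormal(-1.2, 0.2))
        children.append(neg_exp(grades, *true) + rng.normal(0, 10, grades.size))

    # Stage 1: fit the curve separately for each child.
    fits = np.array([curve_fit(neg_exp, grades, y, p0=(600, 350, 0.3))[0] for y in children])

    # Stage 2: summarize fixed effects (means) and inter-individual variation (SDs).
    print("mean (asymptote, start, rate):", fits.mean(axis=0).round(2))
    print("SD   (asymptote, start, rate):", fits.std(axis=0, ddof=1).round(2))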

Relevance: 40.00%

Abstract:

A small, but growing, body of literature searches for evidence of non-Keynesian effects of fiscal contractions. That is, some evidence exists that large fiscal contractions stimulate short-run economic activity. Our paper continues this research effort by systematically examining the effects, if any, of unusual fiscal events (either non-Keynesian results within a Keynesian model or Keynesian results within a neoclassical model) on short-run economic activity. We examine this issue within three separate models: a St. Louis equation, a Hall-type consumption equation, and a growth accounting equation. Our empirical findings are mixed and do not provide strong systematic support for the view that unusually large fiscal contractions/expansions reverse the effects of normal fiscal events. Moreover, we find only limited evidence that trigger points are empirically important.
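
For reference, the St. Louis equation is usually specified as a regression of nominal output growth on current and lagged money growth and a fiscal variable; the exact variables and lag lengths used in the paper may differ from this generic form:

    \Delta Y_t = \alpha + \sum_{i=0}^{m} \beta_i \,\Delta M_{t-i}
               + \sum_{j=0}^{n} \gamma_j \,\Delta F_{t-j} + \varepsilon_t

Here \Delta Y is nominal output growth, \Delta M money growth and \Delta F growth of the fiscal measure (e.g. high-employment federal spending); non-Keynesian effects of unusual fiscal events would appear as fiscal coefficients whose signs differ from the Keynesian prediction.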

Relevance: 40.00%

Abstract:

Standardization is a common method for adjusting for confounding factors when comparing two or more exposure categories to assess excess risk. Arbitrary choice of the standard population introduces selection bias due to the healthy worker effect. Small samples in specific groups also pose problems in estimating relative risk and assessing statistical significance. As an alternative, statistical models have been proposed to overcome such limitations and obtain adjusted rates. In this dissertation, a multiplicative model is considered to address the issues related to the standardized indices, namely the Standardized Mortality Ratio (SMR) and the Comparative Mortality Factor (CMF). The model provides an alternative to the conventional standardization technique. Maximum likelihood estimates of the model parameters are used to construct an index similar to the SMR for estimating the relative risk of the exposure groups under comparison. A parametric bootstrap resampling method is used to evaluate the goodness of fit of the model, the behavior of the estimated parameters, and the variability of the relative risk in generated samples. The model provides an alternative to both the direct and the indirect standardization methods.
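
The conventional SMR and a parametric (Poisson) bootstrap of its variability can be sketched as follows in Python (illustrative counts and rates; this shows the standard index the dissertation builds on, not its multiplicative model):

    # Standardized Mortality Ratio (SMR) with a parametric (Poisson) bootstrap:
    # a minimal sketch of the conventional index, not the multiplicative model.
    # All counts and rates below are illustrative.
    import numpy as np

    observed = np.array([12, 30, 45, 28])                      # deaths by age stratum
    person_years = np.array([4000, 9000, 7000, 2500])
    reference_rates = np.array([0.002, 0.003, 0.006, 0.012])   # standard population rates

    strata_expected = person_years * reference_rates
    expected = strata_expected.sum()
    smr = observed.sum() / expected
    print(f"SMR = {smr:.3f} (expected deaths = {expected:.1f})")

    # Parametric bootstrap: resample stratum deaths from Poisson(SMR * expected)
    # and recompute the index to gauge its variability.
    rng = np.random.default_rng(1)
    boot = [rng.poisson(smr * strata_expected).sum() / expected for _ in range(5000)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"95% bootstrap interval: ({lo:.3f}, {hi:.3f})")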

Relevance: 40.00%

Abstract:

The dataset contains raw data (quantification cycle) for a study which determined the most suitable hepatic reference genes for normalisation of qPCR data originating from juvenile Atlantic salmon (14 days) exposed to 14 and 22 degrees C. These results will be useful for anyone wanting to study the effects of climate change/elevated temperature on the reproductive physiology of fish (and perhaps other vertebrates).

Relevance: 40.00%

Abstract:

The dataset contains raw data (quantification cycle) for a study which determined the most suitable hepatic reference genes for normalisation of qPCR data originating from adult (entire reproductive season) Atlantic salmon (14 days) exposed to 14 and 22 degrees C. These results will be useful for anyone wanting to study the effects of climate change/elevated temperature on the reproductive physiology of fish (and perhaps other vertebrates). In addition, a target gene (vitellogenin) has been normalised using an inappropriate and an 'ideal' reference gene to demonstrate the consequences of using an unstable reference gene for normalisation. For the adult experiment, maiden and repeat adult females were held at the Salmon Enterprises of Tasmania (SALTAS) Wayatinah Hatchery (Tasmania, Australia) at ambient temperature and photoperiod in either 200 (maidens) or 50 (repeats) m3 circular tanks at stocking densities of 12-18 and 24-36 kg m-3 for maidens and repeats, respectively, until transferred to the experimental tanks.
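
A minimal sketch of how the choice of reference gene propagates into normalised expression, using the common 2^-dCq calculation (the Cq values, gene labels and efficiency assumption are illustrative, not taken from the dataset):

    # Relative expression from quantification cycle (Cq) values via 2^-dCq,
    # assuming ~100% amplification efficiency. Values are illustrative only;
    # the point is how the reference-gene choice changes the normalised
    # target (e.g. vitellogenin) expression.
    import numpy as np

    cq_target = np.array([21.0, 20.5, 24.0, 23.5])        # target gene Cq per sample
    cq_stable_ref = np.array([18.0, 18.1, 17.9, 18.0])    # stable reference gene
    cq_unstable_ref = np.array([18.0, 19.5, 17.0, 20.0])  # temperature-affected reference

    def relative_expression(cq_target, cq_reference):
        """2^-(Cq_target - Cq_reference): expression relative to the reference gene."""
        return 2.0 ** -(cq_target - cq_reference)

    print("normalised to stable ref:  ", relative_expression(cq_target, cq_stable_ref).round(4))
    print("normalised to unstable ref:", relative_expression(cq_target, cq_unstable_ref).round(4))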

Relevance: 40.00%

Abstract:

The Southern Hemisphere Westerly Winds (SWW) have been suggested to exert a critical influence on global climate through wind-driven upwelling of deep water in the Southern Ocean and the potentially resulting atmospheric CO2 variations. Investigating the temporal and spatial evolution of the SWW, along with its forcings and feedbacks, remains a significant challenge in climate research. In this study, the evolution of the SWW under orbital forcing from the early Holocene (9 kyr BP) to pre-industrial modern times is examined with transient experiments using the comprehensive coupled global climate model CCSM3. Analyses of the model results suggest that the annual and seasonal mean SWW underwent an overall strengthening and poleward-shifting trend over the course of the early-to-late Holocene under the influence of orbital forcing, except in the austral spring season, when the SWW exhibited an opposite, equatorward-shifting trend.

Relevance: 40.00%

Abstract:

Five frequently used models were chosen and evaluated for calculating the viscosity of mixed oil. In total, twenty mixed oil samples were prepared with different ratios of light to heavy crude oil from different wells in the same oil field. The viscosities of the mixtures were measured at a shear rate of 10 s-1 using a rotational viscometer at temperatures ranging from 30°C to 120°C. After comparing all of the experimental data with the corresponding model values, the best of the five models for this oil field was identified. Using the experimental data, a model with better accuracy than the existing models was developed to calculate the viscosity of mixed oils. Another model was derived to predict the viscosity of mixed oils at different temperatures and different mixing ratios of light to heavy oil.
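
As an illustration of the type of blending model evaluated (the abstract does not name the five models), the widely used Arrhenius, or logarithmic, mixing rule can be sketched as follows (viscosity values are illustrative and not from the study):

    # Arrhenius (logarithmic) viscosity mixing rule for a light/heavy oil blend:
    #   ln(mu_mix) = x_light * ln(mu_light) + x_heavy * ln(mu_heavy)
    # Shown only as an example of this class of model; it is not claimed to be
    # one of the five models compared in the study.
    import numpy as np

    def arrhenius_mixing(mu_light, mu_heavy, x_light):
        """Blend viscosity from component viscosities and light-oil fraction x_light."""
        return np.exp(x_light * np.log(mu_light) + (1.0 - x_light) * np.log(mu_heavy))

    mu_light, mu_heavy = 5.0, 1200.0          # mPa·s at a given temperature (illustrative)
    for x in (0.1, 0.3, 0.5):
        print(f"x_light={x:.1f}: mu_mix = {arrhenius_mixing(mu_light, mu_heavy, x):.1f} mPa·s")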

Relevance: 40.00%

Abstract:

A two-dimensional finite element model of current flow in the front surface of a PV cell is presented. To validate this model, we perform an experimental test. Particular attention is then paid to the effects of non-uniform illumination along the finger direction, which is typical of a linear concentrator system. Fill factor, open-circuit voltage and efficiency are shown to decrease with an increasing degree of non-uniform illumination. It is shown that these detrimental effects can be mitigated significantly by reoptimizing the number of front-surface metallization fingers to suit the degree of non-uniformity. The behavior of current flow in the front surface of a cell operating at open-circuit voltage under non-uniform illumination is discussed in detail.
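
A much-simplified lumped trade-off between finger shading loss and lateral emitter resistive loss illustrates why the optimal finger count shifts with illumination level; this is only a sketch under textbook assumptions (all parameter values illustrative), not the 2D finite element model of the paper:

    # Simplified front-grid trade-off: shading loss rises with more fingers,
    # lateral emitter resistive loss falls. A lumped illustration of why the
    # optimal finger spacing changes with illumination, not the paper's FEM.
    import numpy as np

    def relative_loss(spacing, width, rho_sheet, j_mp, v_mp):
        """Fractional power loss: finger shading + emitter resistive loss between fingers."""
        shading = width / spacing
        resistive = j_mp * rho_sheet * spacing**2 / (12.0 * v_mp)
        return shading + resistive

    width = 100e-6            # finger width [m] (illustrative)
    rho_sheet = 100.0         # emitter sheet resistance [ohm/sq]
    v_mp = 0.55               # voltage at maximum power [V]
    spacings = np.linspace(0.5e-3, 5e-3, 500)

    for suns in (1.0, 10.0):                      # uniform vs. concentrated illumination
        j_mp = 350.0 * suns                       # current density [A/m^2] scales with suns
        losses = relative_loss(spacings, width, rho_sheet, j_mp, v_mp)
        best = spacings[np.argmin(losses)]
        print(f"{suns:>4.0f} sun(s): optimal finger spacing = {best*1e3:.2f} mm")

Higher local current density (stronger illumination) shifts the optimum toward closer, more numerous fingers, which is the qualitative effect behind the reoptimization discussed in the abstract.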

Relevance: 40.00%

Abstract:

Species selection for forest restoration is often supported by expert knowledge of local distribution patterns of native tree species. This approach is not applicable to largely deforested regions unless enough data on pre-human tree species distribution are available. In such regions, ecological niche models may provide essential information to support species selection in the framework of forest restoration planning. In this study we used ecological niche models to predict habitat suitability for native tree species in the "Tierra de Campos" region, an almost totally deforested area of the Duero Basin (Spain). Previously available models provide habitat suitability predictions for dominant native tree species, but including non-dominant tree species in forest restoration planning may be desirable to promote biodiversity, especially in largely deforested areas where nearby seed sources are not expected. We used the Forest Map of Spain as the species occurrence data source to maximize the number of modeled tree species. Penalized logistic regression was used to train models using climate and lithological predictors. Using the model predictions, a set of tools was developed to support species selection in forest restoration planning. Model predictions were used to build ordered lists of suitable species for each cell of the study area. The suitable species lists were summarized by drawing maps showing the two most suitable species for each cell. Additionally, potential distribution maps of the suitable species for the study area were drawn. For a scenario with two dominant species, the models predicted a mixed forest (Quercus ilex and a coniferous tree species) for almost one half of the study area. According to the models, 22 non-dominant native tree species are suitable for the study area, with up to six suitable species per cell. The model predictions pointed to Crataegus monogyna, Juniperus communis, J. oxycedrus and J. phoenicea as the most suitable non-dominant native tree species in the study area. Our results encourage further use of ecological niche models for forest restoration planning in largely deforested regions.
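
A minimal sketch of this modelling step, penalized logistic regression per species followed by per-cell ranking of predicted suitability, is shown below with synthetic data (the predictors, species labels and penalty settings are illustrative assumptions, not those of the study):

    # Penalized logistic regression as a per-species habitat suitability model,
    # then ranking species within each grid cell. Synthetic data; a sketch of
    # the approach, not the study's actual predictors or species.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_cells, n_predictors = 2000, 5
    species = ["sp_A", "sp_B", "sp_C"]
    X = rng.normal(size=(n_cells, n_predictors))      # climate/lithology predictors

    suitability = {}
    for k, sp in enumerate(species):
        # Synthetic presence/absence driven by one predictor per species.
        y = (X[:, k] + rng.normal(scale=0.8, size=n_cells) > 0).astype(int)
        model = LogisticRegression(penalty="l2", C=0.5).fit(X, y)   # penalized fit
        suitability[sp] = model.predict_proba(X)[:, 1]              # per-cell suitability

    # For each cell, order species by predicted suitability (most suitable first).
    scores = np.column_stack([suitability[sp] for sp in species])
    ranked = np.argsort(-scores, axis=1)
    print("two most suitable species in cell 0:", [species[i] for i in ranked[0, :2]])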

Relevance: 40.00%

Abstract:

The authors, all from UPM and loosely grouped, have each been involved in different academic or practical cases on this subject at different times, being of different ages. Building on the precedent of the probabilistic safety models for concrete developed by E. Torroja and A. Páez in Madrid, Spain, around 1957 (a line of work now represented in the ICOSSAR conferences), author J.M. Antón, involved since autumn 1967 in European steel construction within CECM, produced a mathematical model for reductions in the superposition of independent loads and, using it, a load coefficient pattern for codes presented in Rome in February 1969, which was largely adopted for European construction; in JCSS Lisbon, February 1974, he suggested unifying the treatment for concrete, steel and aluminium. That model represents loads with distributions such as the Gumbel Type I: the 50-year load of one type is reduced to a 1-year basis so that it can be added to other independent loads, and the sum is then referred, using Gumbel theory, to a 50-year return period; parallel models exist. A complete reliability system was produced, including nonlinear effects such as buckling, phenomena considered to some extent in the current Construction Eurocodes derived from the Model Codes. The system was considered by the author within CEB in the presence of hydraulic effects from rivers, floods and the sea, with reference to actual practice. When drafting a road drainage standard for MOPU (Spain), the authors developed an optimization model giving a way to determine the return period, from 10 to 50 years, for the hydraulic flows to be considered in road drainage. Satisfactory examples were a stream in south-east Spain modeled with a Gumbel Type I distribution and a paper by Ven Te Chow on the Mississippi at Keokuk using a Gumbel Type II distribution; the model can be modernized with a wider variety of extreme value laws. In the MOPU drainage standard, the drafting commission also acted as an expert body to set a table of return periods for road drainage elements, in effect a complex multi-criteria decision system. These earlier ideas were used, for example, in widely adopted codes and presented in symposia or meetings, but not published in English-language journals; a condensed account of the authors' contributions is presented here. The authors are also involved in optimization for hydraulic and agricultural planning, and give modest hints of intended applications in agricultural and environmental planning, such as the selection of the criteria and utility functions involved in Bayesian, multi-criteria or mixed decision systems. Modest consideration is given to changes in climate, in production and commercial systems, and in other factors such as social and financial ones.
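
The key Gumbel Type I property behind moving between 1-year and 50-year reference periods is max-stability: if the annual maximum of a load is Gumbel distributed, the T-year maximum is again Gumbel with the same scale and a shifted location (a standard result, stated here for reference):

    F_1(x) = \exp\!\bigl(-e^{-(x-\mu)/\beta}\bigr), \qquad
    F_T(x) = \bigl[F_1(x)\bigr]^{T}
           = \exp\!\Bigl(-e^{-\bigl(x - (\mu + \beta \ln T)\bigr)/\beta}\Bigr)

For T = 50 the location simply shifts by \beta \ln 50 (about 3.9\beta) while the scale is unchanged, which is what allows a load referred to a 1-year basis to be combined with other independent loads and the sum referred back to a 50-year return period.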

Relevance: 40.00%

Abstract:

This paper studies feature subset selection in classification using a multiobjective estimation of distribution algorithm. We consider six functions, namely the area under the ROC curve, sensitivity, specificity, precision, F1 measure and Brier score, both for evaluating feature subsets and as the objectives of the problem. One characteristic of these objective functions is the presence of noise in their values, which must be handled appropriately during optimization. Our proposed algorithm consists of two major techniques specially designed for the feature subset selection problem. The first is a solution ranking method based on interval values to handle the noise in the objectives of this problem. The second is a model estimation method for learning a joint probabilistic model of objectives and variables, which is used to generate new solutions and advance through the search space. To simplify model estimation, l1-regularized regression is used to select a subset of problem variables before model learning. The proposed algorithm is compared with a well-known ranking method for interval-valued objectives and a standard multiobjective genetic algorithm. In particular, the effects of the two new techniques are experimentally investigated. The experimental results show that the proposed algorithm obtains comparable or better performance on the tested datasets.
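
The l1-regularized pre-selection step can be sketched as follows in Python with synthetic data (the regression target, penalty value and data are illustrative assumptions, not the actual objectives or encoding of the algorithm):

    # Sketch of l1-regularized pre-selection: keep only the variables whose
    # Lasso coefficients are non-zero before estimating the joint probabilistic
    # model. Synthetic data; parameters are illustrative.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n_solutions, n_variables = 200, 50
    X = rng.integers(0, 2, size=(n_solutions, n_variables)).astype(float)  # candidate feature subsets
    # Illustrative noisy objective value depending on only a few variables.
    objective = X[:, 0] - 0.7 * X[:, 3] + 0.4 * X[:, 10] + rng.normal(0, 0.3, n_solutions)

    lasso = Lasso(alpha=0.05).fit(X, objective)
    selected = np.flatnonzero(lasso.coef_)
    print("variables retained for model estimation:", selected)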