938 results for Quasi-analytical algorithms
Abstract:
This paper presents a classical Cournot oligopoly model with some peculiar features: it is non-quasi-competitive, as the price under N-poly is greater than the monopoly price; a Cournot equilibrium exists and is unique after each new entry; and the successive equilibria after new entries are stable under an adjustment mechanism in which each seller's actual output is adjusted proportionally to the difference between its actual output and its profit-maximizing output. Moreover, the model tends to perfect competition as N goes to infinity, reaching the monopoly price again.
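To make the adjustment mechanism concrete, here is a minimal sketch in Python, applied to a standard linear Cournot setup (inverse demand P = a - bQ, constant marginal cost c). The demand and cost parameters are illustrative assumptions, and note that this textbook setup is quasi-competitive (price falls toward marginal cost as N grows), unlike the peculiar model of the paper.

```python
# Proportional-adjustment dynamics: each firm closes a fraction k of the gap
# between its profit-maximizing output and its actual output each period.
# Parameters (A, B, C) are illustrative, not taken from the paper.
import numpy as np

A, B, C = 100.0, 1.0, 10.0  # assumed demand intercept, slope, marginal cost

def best_response(q: np.ndarray, i: int) -> float:
    """Profit-maximizing output of firm i, given rivals' current outputs."""
    q_others = q.sum() - q[i]
    return max((A - C - B * q_others) / (2.0 * B), 0.0)

def adjust(n_firms: int, k: float = 0.05, steps: int = 2000, tol: float = 1e-9):
    """Iterate the adjustment mechanism until outputs stop changing."""
    q = np.ones(n_firms)  # arbitrary initial outputs
    for _ in range(steps):
        target = np.array([best_response(q, i) for i in range(n_firms)])
        step = k * (target - q)  # proportional adjustment toward the target
        q += step
        if np.abs(step).max() < tol:
            break
    return q

for n in (1, 2, 5, 20):
    q = adjust(n)
    print(f"N={n:2d}  equilibrium price={A - B * q.sum():6.2f}")
```

With a small adjustment speed k the iteration converges to the unique Cournot equilibrium for each N, which is the stability property the abstract refers to.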
Abstract:
PRECON S.A. is a manufacturing company dedicated to producing prefabricated concrete parts for several industries, such as rail transportation and agriculture. Recently, PRECON signed a contract with RENFE, the Spanish National Rail Transportation Company, to manufacture pre-stressed concrete sleepers for sidings of the new railways of the high-speed train AVE. The scheduling problem associated with the manufacturing process of the sleepers is very complex, since it involves several constraints and objectives. The constraints are related to production capacity, the quantity of available moulds, demand satisfaction and other operational constraints. The two main objectives are maximizing the usage of the manufacturing resources and minimizing mould movements. We developed a deterministic crowding genetic algorithm for this multiobjective problem. The algorithm has proved to be a powerful and flexible tool for solving the large-scale instance of this complex real scheduling problem.
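The core of deterministic crowding is its replacement rule: each offspring competes only against the parent it most resembles, which preserves diversity across niches. A minimal sketch on a toy one-dimensional problem follows; the fitness function, encoding and operators are illustrative stand-ins, not the paper's scheduling formulation.

```python
# Deterministic crowding (Mahfoud-style) on a toy real-valued problem.
import math
import random

def fitness(x: float) -> float:
    """Toy multimodal objective; higher is better."""
    return math.sin(5 * math.pi * x) ** 2 * (1 - abs(x - 0.5))

def crossover(a: float, b: float):
    w = random.random()
    return w * a + (1 - w) * b, (1 - w) * a + w * b

def mutate(x: float, sigma: float = 0.02) -> float:
    return min(max(x + random.gauss(0, sigma), 0.0), 1.0)

def deterministic_crowding(pop_size: int = 40, generations: int = 200):
    pop = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        random.shuffle(pop)
        for i in range(0, pop_size - 1, 2):
            p1, p2 = pop[i], pop[i + 1]
            c1, c2 = crossover(p1, p2)
            c1, c2 = mutate(c1), mutate(c2)
            # Pair each child with the parent it most resembles ...
            if abs(c1 - p1) + abs(c2 - p2) <= abs(c1 - p2) + abs(c2 - p1):
                pairs = [(i, p1, c1), (i + 1, p2, c2)]
            else:
                pairs = [(i, p1, c2), (i + 1, p2, c1)]
            # ... and let the child replace that parent only if at least
            # as fit, which maintains multiple niches in the population.
            for idx, parent, child in pairs:
                if fitness(child) >= fitness(parent):
                    pop[idx] = child
    return sorted(pop, key=fitness, reverse=True)

print(deterministic_crowding()[:5])
```

This niche-preserving replacement is what makes crowding attractive for multiobjective problems, where several distinct good solutions must be kept alive in the population at once.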
Abstract:
The paper explores an efficiency hypothesis regarding the contractual process between large retailers, such as Wal-Mart and Carrefour, and their suppliers. The empirical evidence presented supports the idea that large retailers play a quasi-judicial role, acting as "courts of first instance" in their relationships with suppliers. In this role, large retailers adjust the terms of trade to ongoing changes and sanction performance failures, sometimes by delaying payments. Potential abuse of this position is limited by the need for re-contracting and for preserving their reputations. Suppliers renew their confidence in their retailers on a yearly basis by writing new contracts. This renewal contradicts the alternative hypothesis that suppliers are expropriated by large retailers as a consequence of specific investments.
Abstract:
Many authors have discussed a decline in internal labor markets and an apparent shift to a new employment contract, characterized by less commitment between employer and employee and more portable skills. These discussions occur without much evidence on what employment contract employees currently feel is fair. We performed quasi-experimental surveys to study when employees in the U.S. and Canada feel that layoffs are fair. Layoffs were perceived as more fair if they were due to lower product demand than if they resulted from employee suggestions. This result appears to be due solely to norms of reciprocity (companies should not punish employees for their efforts) rather than norms of sharing rents, as new technology was also considered a justification for layoffs. Consistent with theories of distributive and procedural equity, layoffs were perceived as more fair if the CEO voluntarily shared the pain. CEO bonuses due to layoffs lowered their reported fairness only slightly. Respondents in Silicon Valley were, on average, not more accepting of layoffs than were those in Canada, although the justifications considered valid differed slightly.
Abstract:
Microparticles are phospholipid vesicles shed, mostly into biological fluids such as blood or urine, by various types of cells, such as red blood cells (RBCs), platelets, lymphocytes, and endothelial cells. These microparticles contain a subset of the proteome of their parent cell, and their ready availability in biological fluids has raised strong interest in their study, as they might be markers of cell damage. However, their small size, as well as their particular physico-chemical properties, makes them hard to detect, size, count, and study by proteome analysis. In this review, we report the pre-analytical and methodological caveats that we have faced in our own research on red blood cell microparticles in the context of transfusion science, as well as examples from the literature on the proteomics of various kinds of microparticles.
Abstract:
Recently, several anonymization algorithms have appeared for privacy preservation on graphs. Some of them are based on randomization techniques and others on k-anonymity concepts; both can be used to obtain an anonymized graph with a given k-anonymity value. In this paper we compare algorithms based on the two techniques for obtaining an anonymized graph with a desired k-anonymity value, analyzing both the complexity of these methods in generating anonymized graphs and the quality of the resulting graphs.
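The k-anonymity value being compared can be made concrete. A minimal sketch, assuming the common degree-based definition (k is the size of the smallest group of vertices sharing the same degree) and using networkx; it only measures a graph and illustrates a simple random perturbation, and does not reproduce the algorithms compared in the paper.

```python
# Degree-based k-anonymity of a graph, plus a naive randomization step
# (delete m random edges, add m random non-edges) for illustration.
import random
from collections import Counter
import networkx as nx

def degree_k_anonymity(g: nx.Graph) -> int:
    """k such that every occurring degree is shared by at least k vertices."""
    return min(Counter(d for _, d in g.degree()).values())

def randomize(g: nx.Graph, m: int, seed: int = 0) -> nx.Graph:
    """Random perturbation: remove m random edges, then add m random non-edges."""
    rng = random.Random(seed)
    h = g.copy()
    h.remove_edges_from(rng.sample(list(h.edges()), m))
    h.add_edges_from(rng.sample(list(nx.non_edges(h)), m))
    return h

g = nx.karate_club_graph()
print("original k:", degree_k_anonymity(g))
print("randomized k:", degree_k_anonymity(randomize(g, m=10)))
```

The contrast the paper studies falls out of this distinction: randomization perturbs the graph without targeting a particular k, whereas k-anonymity-based methods modify the graph until the desired k value is reached.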
Abstract:
The educational sphere has an internal function on which social scientists largely agree. The contribution that educational systems make to society (i.e., their social function), however, does not enjoy the same degree of consensus. Against this theoretical background, the present article proposes an analytical schema for grasping the social function of education from a sociological perspective. Starting from the assumption that there is an intrinsic relationship between the internal and social functions of social systems, we suggest that particular stratification determinants modify the internal pedagogical function of education and thereby affect its social function, creating simultaneous conditions of equity and differentiation. Throughout the paper this social function is treated as a paradoxical mechanism, and we highlight how this paradoxical dynamic unfolds at different structural levels of the educational sphere. Additionally, we discuss possible consequences of this paradoxical social function for the inclusion possibilities that educational systems offer to individuals.
Abstract:
Since the first anti-doping tests in the 1960s, the analytical aspects of testing have remained challenging. The evolution of the analytical process in doping control is discussed in this paper, with particular emphasis on separation techniques such as gas chromatography and liquid chromatography. These approaches are improving in parallel with the requirements of increasing sensitivity and selectivity for detecting prohibited substances in biological samples from athletes. Moreover, fast analyses are mandatory to deal with the growing number of doping control samples and the short response times required during particular sport events. Recent developments in mass spectrometry and the expansion of accurate mass determination have improved anti-doping strategies by making it possible to use elemental composition and isotope patterns for structural identification. These techniques must be able to distinguish unequivocally between negative and suspicious samples, with no false-negative or false-positive results. Therefore, a high degree of reliability must be reached for the identification of the major metabolites corresponding to suspected analytes. In line with current trends in the pharmaceutical industry, the analysis of proteins and peptides remains an important issue in doping control, and sophisticated analytical tools are still needed to improve their distinction from endogenous analogs. Finally, indirect approaches are discussed in the context of anti-doping, where recent advances aim to examine the biological response to a doping agent in a holistic way.
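To make the accurate-mass idea concrete, here is a minimal sketch of checking whether a measured mass is consistent with a candidate elemental composition within a ppm tolerance. The monoisotopic masses are standard values; the 5 ppm tolerance and the example compound are illustrative assumptions, not the instrumentation workflow discussed in the paper.

```python
# Accurate-mass matching: compare a measured mass against the theoretical
# monoisotopic mass of a candidate elemental composition.
MONOISOTOPIC = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052,
                "O": 15.9949146221, "S": 31.97207069}

def mono_mass(composition: dict) -> float:
    """Theoretical monoisotopic mass of an elemental composition."""
    return sum(MONOISOTOPIC[el] * n for el, n in composition.items())

def matches(measured: float, composition: dict, ppm_tol: float = 5.0) -> bool:
    """True if the measured mass agrees with the theory within ppm_tol."""
    theoretical = mono_mass(composition)
    ppm_error = (measured - theoretical) / theoretical * 1e6
    return abs(ppm_error) <= ppm_tol

# Example: testosterone, C19H28O2 (neutral monoisotopic mass ~288.2089 Da).
testosterone = {"C": 19, "H": 28, "O": 2}
print(f"theoretical: {mono_mass(testosterone):.4f} Da")
print("match at 5 ppm:", matches(288.2095, testosterone))
```

Tight mass tolerances of this kind sharply reduce the number of elemental compositions compatible with a measurement, which is what makes accurate mass useful for structural identification.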
Abstract:
The 10 June 2000 event was the largest flash flood event in the northeast of Spain in the late 20th century, both as regards its meteorological features and its considerable social impact. This paper focuses on the analysis of the structures that produced the heavy rainfall, especially from the point of view of meteorological radar. Since this case is a good example of a Mediterranean flash flood event, a final objective of the paper is to describe the evolution of the rainfall structure clearly enough to be understood by an interdisciplinary forum; it could thus be useful not only for improving conceptual meteorological models, but also for application in downscaling models. The main precipitation structure was a Mesoscale Convective System (MCS) that crossed the region and developed as a consequence of the merging of two previous squall lines. The paper analyses the main meteorological features that led to the development and triggering of the heavy rainfall, with special emphasis on the features of this MCS, its life cycle and its dynamics. To this end, 2-D and 3-D algorithms were applied to the imagery recorded over the complete life cycle of the structures, which lasted approximately 18 h. Mesoscale and synoptic information was also considered. The results show that it was an NS-MCS, quasi-stationary during its mature stage as a consequence of the formation of a convective train, of the different displacement directions of the 2-D and 3-D structures, including the propagation of new cells, and of the slow movement of the convergence line associated with the Mediterranean mesoscale low.
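The 2-D algorithms applied to radar imagery typically start by identifying convective structures as contiguous regions of reflectivity above a threshold. A minimal sketch follows, using scipy.ndimage on a synthetic reflectivity field; the 43 dBZ threshold and the field itself are illustrative assumptions, not the specific 2-D/3-D algorithm suite used in the paper.

```python
# Identify 2-D convective structures in a reflectivity image by
# thresholding and connected-component labeling.
import numpy as np
from scipy import ndimage

def find_2d_structures(reflectivity_dbz: np.ndarray, threshold: float = 43.0):
    """Label contiguous regions exceeding a convective dBZ threshold."""
    mask = reflectivity_dbz >= threshold
    labels, n = ndimage.label(mask)
    idx = list(range(1, n + 1))
    sizes = ndimage.sum(mask, labels, index=idx)       # pixels per structure
    centroids = ndimage.center_of_mass(mask, labels, index=idx)
    return labels, list(zip(sizes, centroids))

# Synthetic field: two convective cores on a weaker stratiform background.
rng = np.random.default_rng(0)
field = rng.normal(20.0, 3.0, size=(100, 100))
field[30:40, 30:42] += 30.0   # first core
field[60:68, 70:80] += 28.0   # second core

labels, cells = find_2d_structures(field)
for size, (y, x) in cells:
    print(f"cell of {int(size)} pixels centred near ({y:.0f}, {x:.0f})")
```

Tracking the centroids of such structures across successive radar scans is what yields the displacement directions and cell-propagation behaviour that the paper uses to explain the quasi-stationary convective train.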