800 results for programming environments


Relevance:

20.00%

Abstract:

The evaluation of large projects raises well-known difficulties because, by definition, they modify the current price system; their public evaluation presents additional difficulties because they also modify the shadow prices that would exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based on the integration of projects into optimization models or, alternatively, on iterative procedures with information exchange between two organizational levels. The methodologies applied afterwards are based on variational inequalities, bilevel programming, and linear or nonlinear complementarity. Their foundations and different applications related to project evaluation are explored. In fact, these new tools are closely related to one another and can treat more complex cases involving, for example, the reaction of agents to policies or the existence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.
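
To illustrate the kind of computational tool the newer methodologies build on, the following is a minimal sketch, not taken from the paper, of a projection (fixed-point) iteration for a variational inequality VI(F, K) over a box K, written in Python with NumPy; the affine mapping F and the bounds are toy data invented for the example.

import numpy as np

# Toy variational inequality VI(F, K): find x* in K with F(x*).(y - x*) >= 0 for all y in K.
# Here K is a box [lo, hi]^n and F(x) = M x + q is an affine, monotone mapping (toy data).
M = np.array([[2.0, 0.5],
              [0.5, 1.0]])       # positive definite, so F is strongly monotone
q = np.array([-1.0, -2.0])
lo, hi = np.zeros(2), np.full(2, 5.0)

def F(x):
    return M @ x + q

def project(x):
    return np.clip(x, lo, hi)

# Fixed-point projection iteration: x <- proj_K(x - t F(x))
x = np.zeros(2)
t = 0.2                          # step size small enough for convergence
for _ in range(500):
    x_new = project(x - t * F(x))
    if np.linalg.norm(x_new - x) < 1e-10:
        x = x_new
        break
    x = x_new

print("approximate VI solution:", x)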

Relevance:

20.00%

Abstract:

The system described herein represents the first example of a recommender system in digital ecosystems where agents negotiate services on behalf of small companies. The small companies compete not only on price or quality, but on a wider service-by-service composition achieved by subcontracting with other companies. The final result of these offerings depends on negotiations at the scale of millions of small companies. This scale requires new platforms for supporting digital business ecosystems, as well as related services such as open-id, trust management, monitors and recommenders. This is done in the Open Negotiation Environment (ONE), an open-source platform that allows agents, on behalf of small companies, to negotiate and use the ecosystem services, and that enables the development of new agent technologies. The methods and tools of cyber engineering are necessary to build Open Negotiation Environments that are stable, a basic condition for predictable business and reliable business environments. Aiming to build stable digital business ecosystems by means of improved collective intelligence, we introduce a model of negotiation style dynamics from the point of view of computational ecology. This model inspires an ecosystem monitor as well as a novel negotiation style recommender. The ecosystem monitor provides hints to the negotiation style recommender to achieve greater stability of an open negotiation environment in a digital business ecosystem. The greater stability provides the small companies with higher predictability, and therefore better business results. The negotiation style recommender is implemented with a simulated annealing algorithm at a constant temperature, and its impact is shown by applying it to a real case of an open negotiation environment populated by Italian companies.
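
The recommender is described as a simulated annealing algorithm run at a constant temperature. The following is a minimal, generic sketch of that idea in Python (Metropolis acceptance at a fixed temperature), not the ONE implementation; the one-dimensional "instability" objective is a made-up stand-in for the signal an ecosystem monitor would provide.

import math
import random

def instability(style):
    # Toy stand-in for the ecosystem monitor's instability score of a negotiation style in [0, 1].
    return (style - 0.3) ** 2 + 0.1 * math.sin(10 * style)

def recommend_style(initial, temperature=0.05, steps=2000, step_size=0.1):
    # Constant-temperature simulated annealing (Metropolis sampling) over styles in [0, 1].
    current = initial
    best = current
    for _ in range(steps):
        candidate = min(1.0, max(0.0, current + random.uniform(-step_size, step_size)))
        delta = instability(candidate) - instability(current)
        # Accept improvements always; accept worse moves with probability exp(-delta / T).
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            current = candidate
        if instability(current) < instability(best):
            best = current
    return best

print("recommended style:", round(recommend_style(0.9), 3))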

Relevance:

20.00%

Abstract:

Our work is concerned with user modelling in open environments. Our proposal is in the line of contributions to the advances in user modelling in open environments enabled by Agent Technology, in what has been called the Smart User Model. Our research contains a holistic study of user modelling in several research areas related to users. We have developed a conceptualization of user modelling by means of examples from a broad range of research areas, with the aim of improving our understanding of user modelling and its role in the next generation of open and distributed service environments. This report is organized as follows: in chapter 1 we introduce our motivation and objectives. Then, in chapters 2, 3, 4 and 5, we provide the state of the art on user modelling. In chapter 2, we give the main definitions of the elements described in the report. In chapter 3, we present a historical perspective on user models. In chapter 4, we provide a review of user models from the perspective of different research areas, with special emphasis on the give-and-take relationship between Agent Technology and user modelling. In chapter 5, we describe the main challenges that, from our point of view, need to be tackled by researchers wanting to contribute to advances in user modelling. The study of the state of the art is followed by an exploratory work in chapter 6, where we define a SUM and a methodology to deal with it. We also present some case studies in order to illustrate the methodology. Finally, we present the thesis proposal to continue the work, together with its corresponding work schedule and timeline.

Relevance:

20.00%

Abstract:

Petroleum hydrocarbons are common contaminants in marine and freshwater aquatic habitats, often occurring as a result of oil spillage. Rapid and reliable on-site tools for measuring the bioavailable hydrocarbon fractions, i.e., those that are most likely to cause toxic effects or are available for biodegradation, would assist in assessing potential ecological damage and following the progress of cleanup operations. Here we examined the suitability of a set of different rapid bioassays (2-3 h) using bacteria expressing the LuxAB luciferase to measure the presence of short-chain linear alkanes, monoaromatic and polyaromatic compounds, biphenyls, and DNA-damaging agents in seawater after a laboratory-scale oil spill. Five independent spills of 20 mL of NSO-1 crude oil with 2 L of seawater (North Sea or Mediterranean Sea) were carried out in 5 L glass flasks for periods of up to 10 days. Bioassays readily detected ephemeral concentrations of short-chain alkanes and BTEX (i.e., benzene, toluene, ethylbenzene, and xylenes) in the seawater within minutes to hours after the spill, increasing to a maximum of up to 80 μM within 6-24 h, after which they decreased to low or undetectable levels. The strong decrease in short-chain alkanes and BTEX may have been due to their volatilization or biodegradation, which was supported by changes in the microbial community composition. Two- and three-ring PAHs appeared in the seawater phase after 24 h with a concentration up to 1 μM naphthalene equivalents and remained above 0.5 μM for the duration of the experiment. DNA-damage-sensitive bioreporters did not produce any signal with the oil-spilled aqueous-phase samples, whereas bioassays for (hydroxy)biphenyls showed occasional responses. Chemical analysis for alkanes and PAHs in contaminated seawater samples supported the bioassay data, but did not show the typical ephemeral peaks observed with the bioassays. We conclude that bacterium-based bioassays can be a suitable alternative for rapid on-site quantitative measurement of hydrocarbons in seawater.

Relevance:

20.00%

Abstract:

High-resolution tomographic imaging of the shallow subsurface is becoming increasingly important for a wide range of environmental, hydrological and engineering applications. Because of their superior resolution power, their sensitivity to pertinent petrophysical parameters, and their far-reaching complementarity, both seismic and georadar crosshole imaging are of particular importance. To date, corresponding methods have largely relied on asymptotic, ray-based approaches, which only account for a very small part of the observed wavefields, inherently suffer from limited resolution, and in complex environments may prove to be inadequate. These problems can potentially be alleviated through waveform inversion. We have developed an acoustic waveform inversion approach for crosshole seismic data whose kernel is based on a finite-difference time-domain (FDTD) solution of the 2-D acoustic wave equations. This algorithm is tested on and applied to synthetic data from seismic velocity models of increasing complexity and realism, and the results are compared to those obtained using state-of-the-art ray-based traveltime tomography. Regardless of the heterogeneity of the underlying models, the waveform inversion approach has the potential to reliably resolve both the geometry and the acoustic properties of features smaller than half a dominant wavelength. Our results do, however, also indicate that, within their inherent resolution limits, ray-based approaches provide an effective and efficient means to obtain satisfactory tomographic reconstructions of the seismic velocity structure in the presence of mild to moderate heterogeneity and in the absence of strong scattering. Conversely, the excess effort of waveform inversion provides the greatest benefits for the most heterogeneous, and arguably most realistic, environments, where multiple scattering effects tend to be prevalent and ray-based methods lose most of their effectiveness.
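
For readers unfamiliar with the forward engine mentioned here, the following is a minimal sketch, not the authors' code, of finite-difference time-domain modelling for the 2-D acoustic wave equation p_tt = c^2 (p_xx + p_zz) in Python/NumPy; the grid, two-layer velocity model, and Ricker source wavelet are toy choices, and absorbing boundaries as well as the inversion itself are omitted.

import numpy as np

# Minimal FDTD for the 2-D acoustic wave equation, second order in time and space.
nx, nz = 200, 200
dx = 2.0                       # grid spacing (m)
c = np.full((nz, nx), 1500.0)  # toy velocity model (m/s)
c[100:, :] = 2000.0            # a faster half-space below 200 m depth

dt = 0.4 * dx / c.max()        # time step satisfying the 2-D CFL stability condition
nt = 500
src_z, src_x = 50, 100         # source position (grid indices)
f0 = 25.0                      # dominant frequency of a Ricker wavelet (Hz)
t = np.arange(nt) * dt
arg = (np.pi * f0 * (t - 1.2 / f0)) ** 2
wavelet = (1.0 - 2.0 * arg) * np.exp(-arg)

p_old = np.zeros((nz, nx))
p = np.zeros((nz, nx))
for it in range(nt):
    # Second-order centred Laplacian on interior grid points.
    lap = np.zeros_like(p)
    lap[1:-1, 1:-1] = (p[2:, 1:-1] + p[:-2, 1:-1] + p[1:-1, 2:] + p[1:-1, :-2]
                       - 4.0 * p[1:-1, 1:-1]) / dx ** 2
    p_new = 2.0 * p - p_old + (c * dt) ** 2 * lap
    p_new[src_z, src_x] += wavelet[it] * dt ** 2   # inject the source term
    p_old, p = p, p_new

print("peak pressure at final step:", np.abs(p).max())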

Relevance:

20.00%

Abstract:

The Condemned is a two-dimensional fighting game developed in Flash CS4 and ActionScript 3. The game consists of four stages; in each one the player faces an enemy controlled by the computer through an artificial intelligence. The creation of this video game went through all the development phases: graphic design of characters and scenarios, programming, and error handling.

Relevance:

20.00%

Abstract:

The presence of the antimicrobial peptide (AMP) biosynthetic genes srfAA (surfactin), bacA (bacilysin), fenD (fengycin), bmyB (bacillomycin), spaS (subtilin), and ituC (iturin) was examined in 184 isolates of Bacillus spp. obtained from plant environments (aerial, rhizosphere, soil) in the Mediterranean land area of Spain. Most strains had between two and four AMP genes, whereas strains with five genes were seldom detected and none of the strains had all six genes. The most frequent AMP gene markers were srfAA, bacA, bmyB, and fenD, and the most frequent genotypes were srfAA-bacA-bmyB and srfAA-bacA-bmyB-fenD. The dominance of these particular genes in Bacillus strains associated with plants reinforces the competitive role of surfactin, bacillomycin, fengycin, and bacilysin in the fitness of strains in natural environments. The use of these AMP gene markers may assist in the selection of putative biological control agents of plant pathogens.

Relevance:

20.00%

Abstract:

Business process designers take into account the resources that the processes will need but, due to the variable cost of certain parameters (such as energy) or other circumstances, this scheduling must be done at business process enactment time. In this report we formalize the energy-aware resource cost, including time- and usage-dependent rates. We also present a constraint programming approach and an auction-based approach to solve the problem, including a comparison of the two approaches and of the proposed algorithms for solving them.
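
To make time- and usage-dependent rates concrete, here is a small illustrative sketch, not the report's formalization, of the cost of enacting an activity on a resource under a hypothetical time-of-use energy tariff; all tariff and activity figures are invented.

# Illustrative energy-aware cost of scheduling an activity on a resource.
# Hypothetical time-of-use tariff: price per kWh by hour interval.
TARIFF = [(0, 8, 0.10), (8, 20, 0.25), (20, 24, 0.15)]  # (start_h, end_h, eur_per_kwh)

def energy_price(hour):
    for start, end, price in TARIFF:
        if start <= hour < end:
            return price
    raise ValueError("hour outside tariff definition")

def activity_cost(start_hour, duration_h, power_kw, usage_rate_eur_per_h):
    # Usage-dependent rate plus time-dependent energy cost, accumulated hour by hour.
    cost = duration_h * usage_rate_eur_per_h
    for h in range(int(start_hour), int(start_hour + duration_h)):
        cost += power_kw * energy_price(h % 24)
    return cost

# The same 4-hour, 3 kW activity enacted at night vs. during peak hours.
print("start at 02:00 ->", activity_cost(2, 4, 3.0, 5.0), "EUR")
print("start at 10:00 ->", activity_cost(10, 4, 3.0, 5.0), "EUR")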

Relevance:

20.00%

Abstract:

Study carried out during a research stay at the Centro de Estudos Geograficos of the Universidade de Lisboa, Portugal, between 2011 and 2012. Within this group I developed my research focused on polar environments with permafrost, specifically centred on the north-western end of the Antarctic Peninsula (South Shetland Islands) and on the High Arctic (Svalbard). Both areas have recorded a very significant temperature increase over the last decades. My research has involved the analysis of sedimentary records (lacustrine, aeolian, slope) and the monitoring of present-day geomorphological processes in order to understand present and past environmental dynamics (i.e. climate). To this end I carried out three field campaigns in Antarctica and two in the Arctic. The subsequent laboratory and office work is yielding numerous publications that attest to the results achieved. In addition, other activities carried out during the BP-A should be highlighted: learning how to organize and manage an Antarctic campaign, university teaching, participation in committees, associations and doctoral thesis panels, organization of and participation in numerous conferences, fieldwork in new study areas, refereeing for international journals, etc. Moreover, the award of the HOLOANTAR research project, of which I am the Principal Investigator, has been the most important achievement of this stay. This project is giving me the capacity to manage and integrate the research of 16 researchers of different nationalities from a multidisciplinary perspective. Nevertheless, it must be noted that one of the goals of my BP-A project has not been achieved: the transfer of the background and knowledge acquired to the Catalan research system. Although I submitted my candidacy for a BP-B contract so that this background acquired abroad could revert to Catalonia, the selection process used in the call has prevented it and obliges me to continue my research abroad.

Relevance:

20.00%

Abstract:

In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. Considering additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Polynomial algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm presented here relies on the simple fact that this highest scoring gene can be stored and updated as the scan proceeds. This requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model: the definition of valid gene structures is externally defined in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem. In particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
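
A simplified sketch of the linear-time idea, not the actual implementation: exons are scanned by increasing acceptor position while a running record is kept of the best-scoring gene ending at any exon whose donor lies upstream of the current acceptor. Frame compatibility and the external Gene Model are deliberately left out of this toy Python version, and the exon data are invented.

from collections import namedtuple

# Toy exon: acceptor (start), donor (end), and a precomputed score.
Exon = namedtuple("Exon", "acceptor donor score")

def best_gene_score(exons):
    # Best additive score over chains of non-overlapping exons: sorting plus a single
    # simultaneous scan over acceptor and donor positions.
    by_acceptor = sorted(exons, key=lambda e: e.acceptor)
    by_donor = sorted(exons, key=lambda e: e.donor)
    best_ending_at = {}        # exon -> best score of a gene ending at that exon
    best_upstream = 0.0        # best gene score among exons whose donor has been passed
    j = 0
    overall = 0.0
    for exon in by_acceptor:
        # Fold in every exon whose donor lies strictly upstream of this acceptor.
        while j < len(by_donor) and by_donor[j].donor < exon.acceptor:
            best_upstream = max(best_upstream, best_ending_at[by_donor[j]])
            j += 1
        best_ending_at[exon] = best_upstream + exon.score
        overall = max(overall, best_ending_at[exon])
    return overall

exons = [Exon(10, 50, 3.0), Exon(60, 120, 2.5), Exon(40, 90, 4.0), Exon(130, 200, 1.0)]
print(best_gene_score(exons))   # best chain: 3.0 + 2.5 + 1.0 = 6.5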

Relevance:

20.00%

Abstract:

1. Harsh environmental conditions experienced during development can reduce the performance of the same individuals in adulthood. However, the 'predictive adaptive response' hypothesis postulates that if individuals adapt their phenotype during development to the environments where they are likely to live in the future, individuals exposed to harsh conditions in early life perform better when encountering the same harsh conditions in adulthood compared to those never exposed to these conditions before. 2. Using the common vole (Microtus arvalis) as study organism, we tested how exposure to flea parasitism during the juvenile stage affects the physiology (haematocrit, resistance to oxidative stress, resting metabolism, spleen mass, and testosterone), morphology (body mass, testis mass) and motor performance (open-field activity and swimming speed) of the same individuals when infested with fleas in adulthood. According to the 'predictive adaptive response' hypothesis, we predicted that voles parasitized at the adult stage would perform better if they had already been parasitized with fleas at the juvenile stage. 3. We found that voles exposed to fleas in adulthood had a higher metabolic rate if already exposed to fleas when juvenile, compared to voles free of fleas when juvenile and voles free of fleas in adulthood. Independently of juvenile parasitism, adult parasitism impaired adult haematocrit and motor performance. Independently of adult parasitism, juvenile parasitism slowed down crawling speed in adult female voles. 4. Our results suggest that juvenile parasitism has long-term effects that do not protect from the detrimental effects of adult parasitism. On the contrary, experiencing parasitism in early life incurs additional costs upon adult parasitism, measured in terms of higher energy expenditure, rather than inducing an adaptive shift in the developmental trajectory. 5. Hence, our study provides experimental evidence for long-term costs of parasitism. We found no support for a predictive adaptive response in this host-parasite system.

Relevance:

20.00%

Abstract:

Models incorporating more realistic models of customer behavior, such as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and is approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. However, when the segment consideration sets overlap, the CDLP is difficult to solve. Column generation has been proposed, but finding an entering column has been shown to be NP-hard. In this paper we propose a new approach, called SDCP, to solving CDLP based on segments and their consideration sets. SDCP is a relaxation of CDLP and hence forms a looser upper bound on the dynamic program, but coincides with CDLP for the case of non-overlapping segments. If the number of elements in a consideration set for a segment is not very large, SDCP can be applied to any discrete-choice model of consumer behavior. We tighten the SDCP bound by (i) simulations, called the randomized concave programming (RCP) method, and (ii) adding cuts to a recent compact formulation of the problem for a latent multinomial-choice model of demand (SBLP+). This latter approach turns out to be very effective, essentially obtaining the CDLP value, and shows excellent revenue performance in simulations, even for overlapping segments. By formulating the problem as a separation problem, we give insight into why CDLP is easy for the MNL with non-overlapping consideration sets and why generalizations of the MNL pose difficulties. We perform numerical simulations to determine the revenue performance of all the methods on reference data sets in the literature.
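
For context, the multinomial-logit (MNL) choice probabilities that underlie segments and their consideration sets can be computed as in the following small sketch (standard MNL, not code from the paper); the preference weights and product names are illustrative.

# Multinomial-logit choice probabilities for a segment with a consideration set:
# P(j | S) = v_j / (v_0 + sum of v_k over offered products the segment considers).
def mnl_choice_probs(offer_set, consideration_set, weights, no_purchase_weight=1.0):
    considered = [j for j in offer_set if j in consideration_set]
    denom = no_purchase_weight + sum(weights[j] for j in considered)
    return {j: weights[j] / denom for j in considered}

weights = {"A": 2.0, "B": 1.0, "C": 0.5}   # illustrative preference weights v_j
consideration_set = {"A", "B"}             # this segment ignores product C
print(mnl_choice_probs(["A", "B", "C"], consideration_set, weights))
# {'A': 0.5, 'B': 0.25}; the remaining 0.25 is the no-purchase probability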

Relevance:

20.00%

Abstract:

The choice network revenue management model incorporates customer purchase behavior as a function of the offered products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The optimization problem is a stochastic dynamic program and is intractable. A certainty-equivalence relaxation of the dynamic program, called the choice deterministic linear program (CDLP), is usually used to generate dynamic controls. Recently, a compact linear programming formulation of this linear program was given for the multi-segment multinomial-logit (MNL) model of customer choice with non-overlapping consideration sets. Our objective is to obtain a tighter bound than this formulation while retaining the appealing properties of a compact linear programming representation. To this end, it is natural to consider the affine relaxation of the dynamic program. We first show that the affine relaxation is NP-complete even for a single-segment MNL model. Nevertheless, by analyzing the affine relaxation we derive a new compact linear program that approximates the dynamic programming value function better than CDLP, provably between the CDLP value and the affine relaxation, and often coming close to the latter in our numerical experiments. When the segment consideration sets overlap, we show that some strong equalities called product cuts developed for the CDLP remain valid for our new formulation. Finally, we perform extensive numerical comparisons on the various bounds to evaluate their performance.
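
As an illustration of why the exact dynamic program is only tractable at toy scale, the following sketch solves a two-resource, two-product, single-segment MNL instance by brute-force recursion over offer sets; all product data, weights, and horizon are invented, and a real network would have far too many states for this approach.

from functools import lru_cache
from itertools import chain, combinations

# Tiny exact choice-based network revenue management dynamic program (illustrative only).
products = {"P1": {"revenue": 100.0, "uses": (1, 0)},
            "P2": {"revenue": 150.0, "uses": (1, 1)}}
weights = {"P1": 1.0, "P2": 0.8}     # MNL preference weights, v0 = 1 for no purchase
T = 30                               # number of decision periods

def offer_sets():
    items = list(products)
    return chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))

@lru_cache(maxsize=None)
def value(t, capacity):
    if t == T or sum(capacity) == 0:
        return 0.0
    best = 0.0
    for S in offer_sets():
        feasible = [j for j in S if all(c >= u for c, u in zip(capacity, products[j]["uses"]))]
        denom = 1.0 + sum(weights[j] for j in feasible)
        expected = (1.0 / denom) * value(t + 1, capacity)      # no-purchase branch
        for j in feasible:
            new_cap = tuple(c - u for c, u in zip(capacity, products[j]["uses"]))
            expected += (weights[j] / denom) * (products[j]["revenue"] + value(t + 1, new_cap))
        best = max(best, expected)
    return best

print("optimal expected revenue:", round(value(0, (5, 3)), 2))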

Relevance:

20.00%

Abstract:

Whirligig beetles (Gyrinidae) inhabit water surfaces and possess unique eyes that are split into overwater and underwater parts. In this study we analyze the micro- and nanostructure of the split eyes of beetles from two Gyrinidae genera, Gyrinus and Orectochilus. We find that the corneae of the overwater ommatidia are covered with maze-like nanostructures, while the corneal surface of the underwater eyes is smooth. We further show that the overwater nanostructures possess no anti-wetting properties, but do have anti-reflective properties with a spectral preference in the range of 450-600 nm. These findings illustrate the adaptation of the corneal nanocoating of the two halves of an insect's eye to two different environments. The novel natural anti-reflective nanocoating we describe may find future technological applications.

Relevance:

20.00%

Abstract:

We present a new unifying framework for investigating throughput-WIP (Work-in-Process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure: we show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then we can reformulate the problem of optimizing a linear objective of throughput-WIP performance as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. We analyze in this framework the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy with that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; and (e) a unified treatment of the time-discounted and time-average cases.
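
As a small illustration of a threshold policy in this setting, the following toy simulation, not the analysis of Chen and Yao's model, admits arriving jobs to a single server only while WIP is below a threshold and estimates the resulting throughput-WIP performance pair; the arrival and service rates are arbitrary example values.

import random

def simulate_threshold_policy(threshold, lam=0.9, mu=1.0, horizon=200000, seed=1):
    # Toy discrete-event simulation of an input-controlled single server: jobs arrive at
    # rate lam but are admitted only while WIP < threshold; service rate is mu.
    # Returns the estimated (throughput, average WIP) pair. Requires threshold >= 1.
    random.seed(seed)
    t, wip = 0.0, 0
    completed, wip_area = 0, 0.0
    while t < horizon:
        arrival_rate = lam if wip < threshold else 0.0    # the threshold (input) control
        service_rate = mu if wip > 0 else 0.0
        total_rate = arrival_rate + service_rate
        dt = random.expovariate(total_rate)
        wip_area += wip * dt
        t += dt
        if random.random() < arrival_rate / total_rate:
            wip += 1                                      # admission
        else:
            wip -= 1                                      # service completion
            completed += 1
    return completed / t, wip_area / t

# Sweep the threshold to trace out throughput-WIP performance pairs.
for n in (1, 2, 5, 10):
    thr, avg_wip = simulate_threshold_policy(n)
    print(f"threshold {n:2d}: throughput {thr:.3f}, average WIP {avg_wip:.2f}")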