35 results for Holding solution
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Study based on a research stay at the Royal Veterinary and Agricultural University of Denmark between March and June 2006. The effect of modified atmosphere packaging (MAP) and of red wine marination on the evolution of bacterial contamination in dark, firm and dry (DFD) meat was investigated. DFD meat is found in the carcasses of animals that were exposed to prolonged muscular activity or stress before slaughter. DFD meat causes significant economic losses due to bacterial contamination and to technological problems related to its high water-holding capacity. Moreover, it is critical for the industry to investigate the diversity of the bacterial contamination, to identify the bacterial species and to control them. This is difficult, however, because of the inability to detect some bacteria on known culture media, the interactions among them, and the complexity of the contamination sources, such as water, soil, faeces and the environment. Polymerase chain reaction – denaturing gradient gel electrophoresis (PCR-DGGE) can overcome these problems by reflecting the microbial diversity and the bacterial species present. The results indicated that the bacterial diversity of the meat increased with the days of packaging regardless of the packaging method, but decreased significantly with the red wine marination treatment. DGGE showed differences in the species found, indicating changes in the bacterial contamination and its characteristics in DFD meat under the different treatments. Although marination is a good alternative and a solution for marketing DFD meat, sequencing studies are needed to identify the different types of bacteria.
Abstract:
Scholars and local planners are increasingly interested in the contribution of tourism to economic and social development. In this regard, several European cities lead the world rankings in tourist arrivals, and their governments have promoted tourism activity. Mobility is an essential service for tourists visiting large cities, since it is a crucial factor for their comfort. In addition, it facilitates the spread of benefits across the city. The aim of this study is to determine whether city planners respond to this additional pressure on urban transport demand by expanding supply. We use an international database of European cities. Our results confirm that tourism intensity is a demand-enhancing factor for urban transport. By contrast, cities do not seem to address this pressure by increasing service supply. This suggests that tourism exerts a positive externality on public transport, since it provides additional funding for these services, but it also imposes external costs on resident users because of congestion, given the supply constraints.
Abstract:
The main result is a proof of the existence of a unique viscosity solution for a Hamilton–Jacobi equation in which the Hamiltonian is discontinuous with respect to the variable usually interpreted as the spatial one. The generalized solution obtained is continuous, but not necessarily differentiable.
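For orientation, a schematic form of the problem class described (the notation below is assumed here, not taken from the paper) is the evolutionary Hamilton–Jacobi equation
\[
  u_t(t,x) + H\bigl(x,\nabla_x u(t,x)\bigr) = 0, \qquad u(0,x) = u_0(x),
\]
where the map $x \mapsto H(x,p)$ need not be continuous; a viscosity solution is then required to satisfy the usual subsolution and supersolution inequalities at points where smooth test functions touch $u$ from above or below.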
Abstract:
The work in this paper deals with the development of momentum and thermal boundary layers when a power-law fluid flows over a flat plate. At the plate we impose either a constant temperature, a constant flux, or a Newton cooling condition. The problem is analysed using similarity solutions, integral momentum and energy equations, and an approximation technique which is a form of the Heat Balance Integral Method. The fluid properties are assumed to be independent of temperature, hence the momentum equation uncouples from the thermal problem. We first derive the similarity equations for the velocity and present exact solutions for the case where the power-law index n = 2. The similarity solutions are used to validate the new approximation method. This new technique is then applied to the thermal boundary layer, where a similarity solution can only be obtained for the case n = 1.
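For orientation, a schematic of the momentum boundary-layer balance for a power-law (Ostwald–de Waele) fluid over a flat plate, with consistency index $m$, power-law index $n$, density $\rho$ and free-stream speed $U_\infty$ (symbols assumed here, not taken from the paper):
\[
  \frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} = 0, \qquad
  u\,\frac{\partial u}{\partial x} + v\,\frac{\partial u}{\partial y}
  = \frac{m}{\rho}\,\frac{\partial}{\partial y}\!\left(\left|\frac{\partial u}{\partial y}\right|^{\,n-1}\frac{\partial u}{\partial y}\right),
\]
with $u = v = 0$ on the plate and $u \to U_\infty$ far from it. Because the fluid properties are taken to be temperature independent, this system can be solved first and its velocity field then fed into the thermal boundary-layer problem.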
Abstract:
In this paper we axiomatize the strong constrained egalitarian solution (Dutta and Ray, 1991) over the class of weak superadditive games using constrained egalitarianism, order-consistency, and converse order-consistency. JEL classification: C71, C78. Keywords: Cooperative TU-game, strong constrained egalitarian solution, axiomatization.
Abstract:
One of the tantalising remaining problems in compositional data analysis lies in how to deal with data sets in which there are components which are essential zeros. By an essential zero we mean a component which is truly zero, not something recorded as zero simply because the experimental design or the measuring instrument has not been sufficiently sensitive to detect a trace of the part. Such essential zeros occur in many compositional situations, such as household budget patterns, time budgets, palaeontological zonation studies, and ecological abundance studies. Devices such as nonzero replacement and amalgamation are almost invariably ad hoc and unsuccessful in such situations. From consideration of such examples it seems sensible to build up a model in two stages, the first determining where the zeros will occur and the second how the unit available is distributed among the non-zero parts. In this paper we suggest two such models, an independent binomial conditional logistic normal model and a hierarchical dependent binomial conditional logistic normal model. The compositional data in such modelling consist of an incidence matrix and a conditional compositional matrix. Interesting statistical problems arise, such as the question of estimability of parameters, the nature of the computational process for the estimation of both the incidence and compositional parameters caused by the complexity of the subcompositional structure, the formation of meaningful hypotheses, and the devising of suitable testing methodology within a lattice of such essential zero-compositional hypotheses. The methodology is illustrated by application to both simulated and real compositional data.
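A minimal sketch of the two-stage idea described above: a binomial incidence stage followed by a logistic-normal-type composition over the non-zero parts. The probabilities, covariance and function names below are invented for illustration; this is not the independent or hierarchical dependent model estimated in the paper.

import numpy as np

rng = np.random.default_rng(0)

def simulate_composition(p_incidence, mu, sigma):
    """Two-stage draw: which parts are present, then how the unit is shared.

    p_incidence : (D,) probabilities that each part is non-zero
    mu, sigma   : mean and covariance of a D-dimensional normal whose
                  normalized exponential over the *present* parts gives
                  the composition (an illustrative logistic-normal step)
    """
    D = len(p_incidence)
    # Stage 1: incidence vector (which parts are essential zeros).
    present = rng.random(D) < p_incidence
    if not present.any():          # guard: force at least one non-zero part
        present[rng.integers(D)] = True

    # Stage 2: logistic-normal composition on the non-zero parts only.
    y = rng.multivariate_normal(mu, sigma)
    w = np.exp(y[present])
    x = np.zeros(D)
    x[present] = w / w.sum()       # shares of the unit among present parts
    return present.astype(int), x

p = np.array([0.9, 0.7, 0.5, 0.3])   # hypothetical incidence probabilities
mu = np.zeros(4)
sigma = 0.25 * np.eye(4)
incidence, composition = simulate_composition(p, mu, sigma)
print(incidence, composition.round(3), composition.sum())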
Abstract:
Cobre Las Cruces is a renowned copper mining company located in Sevilla, with unexpected problems in wireless communications that directly affect production. Therefore, the main goals are to improve the WiFi infrastructure, to secure it, and to detect and prevent attacks and the installation of rogue (non-authorized) APs, all of this integrated with the current ICT infrastructure. This project has been divided into four phases, although only two of them are included in the TFC: the analysis of the current situation and the design of a WLAN solution. Once the analysis was finished, some weaknesses were detected. Issues such as lack of connectivity and control, ignorance about the installed WiFi devices, their location and state, and, by and large, the use of weak security mechanisms were some of the problems found. Additionally, because the working area became larger and new WiFi infrastructures were added, the first phase took more time than expected. As a result of the detailed analysis, a set of goals was defined and a centralized approach able to cope with them was designed. Its main features are a solution based on the 802.11i and 802.1X protocols, digital certificates, a probe system acting as an IDS/IPS, and lightweight APs in conjunction with a Wireless LAN Controller.
Abstract:
This final degree project (TFC) analyses the deployment of a new CRM in a business holding company and its connection and integration with the rest of the existing applications.
Abstract:
Recently, the surprising result that ab initio calculations on benzene and other planar arenes at correlated MP2, MP3, configuration interaction with singles and doubles (CISD), and coupled cluster with singles and doubles levels of theory using standard Pople basis sets yield nonplanar minima has been reported. The planar optimized structures turn out to be transition states presenting one or more large imaginary frequencies, whereas single-determinant-based methods lead to the expected planar minima and no imaginary frequencies. It has been suggested that such anomalous behavior may originate from two-electron basis set incompleteness error. In this work, we show that the reported pitfalls can be interpreted in terms of intramolecular basis set superposition error (BSSE) effects, mostly between the C–H moieties constituting the arenes. We have carried out counterpoise-corrected optimizations and frequency calculations at the Hartree–Fock, B3LYP, MP2, and CISD levels of theory with several basis sets for a number of arenes. In all cases, correcting for intramolecular BSSE fixes the anomalous behavior of the correlated methods, whereas no significant differences are observed in the single-determinant case. Consequently, all systems studied are planar at all levels of theory. The effect of different intramolecular fragment definitions and the particular case of charged species, namely the cyclopentadienyl and indenyl anions, are also discussed.
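For reference, the standard two-fragment Boys–Bernardi counterpoise construction, of which intramolecular counterpoise schemes are a generalization to fragments within one molecule (generic notation, assumed below): writing $E_X(Y)$ for the energy of fragment $X$ computed in basis $Y$ at the complex geometry,
\[
  \delta_{\mathrm{BSSE}} = \bigl[E_A(A) - E_A(AB)\bigr] + \bigl[E_B(B) - E_B(AB)\bigr],
  \qquad
  E^{\mathrm{CP}}_{AB} = E_{AB}(AB) + \delta_{\mathrm{BSSE}},
\]
so the artificial stabilization each fragment gains by borrowing the partner's basis functions is removed from the total energy.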
Abstract:
Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex on the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999). "Restless bandits, partial conservation laws, and indexability." Forthcoming in Advances in Applied Probability Vol. 33 No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL) (which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996)): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
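As background for the linear special case mentioned above, classical Smith's rule sequences jobs in nonincreasing order of the ratio of holding-cost rate to processing time. The deterministic single-machine sketch below (data and names invented here) illustrates that classical rule, not the paper's extended-class indices.

def smith_sequence(jobs):
    """Classical Smith's rule (WSPT) for linear holding costs.

    jobs: list of (name, cost_rate, processing_time).
    Sequencing by nonincreasing cost_rate / processing_time minimizes the
    total weighted completion time on a single machine.
    """
    return sorted(jobs, key=lambda j: j[1] / j[2], reverse=True)

def total_weighted_completion(sequence):
    """Total holding cost = sum of cost_rate * completion_time."""
    t, cost = 0.0, 0.0
    for _, c, p in sequence:
        t += p
        cost += c * t
    return cost

jobs = [("a", 3.0, 2.0), ("b", 1.0, 1.0), ("c", 4.0, 5.0)]   # hypothetical data
order = smith_sequence(jobs)
print([name for name, _, _ in order], total_weighted_completion(order))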
Abstract:
We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers select preemptively customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity, and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
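For the no-feedback special case mentioned above, the classical $c\mu$ rule is a static priority index (notation assumed here): class $k$, with holding cost rate $c_k$ and service rate $\mu_k$, receives index
\[
  \nu_k = c_k\,\mu_k,
\]
and a free server always (preemptively) takes a waiting customer from the class with the largest index. Klimov's indices generalize this to account for Bernoulli routing of customers between classes after service.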
Abstract:
This paper studies the rate of convergence of an appropriate discretization scheme of the solution of the McKean–Vlasov equation introduced by Bossy and Talay. More specifically, we consider approximations of the distribution and of the density of the solution of the stochastic differential equation associated to the McKean–Vlasov equation. The scheme adopted here is a mixed one: Euler/weakly interacting particle system. If $n$ is the number of weakly interacting particles and $h$ is the uniform step in the time discretization, we prove that the rate of convergence of the distribution functions of the approximating sequence in the $L^1(\Omega\times \Bbb R)$ norm and in the sup norm is of the order of $\frac 1{\sqrt n} + h$, while for the densities it is of the order $h + \frac 1{\sqrt{nh}}$. This result is obtained by carefully employing techniques of Malliavin calculus.
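A minimal sketch of the mixed Euler/weakly-interacting-particle idea, for a toy drift that depends on the law only through the empirical mean; the coefficients and names below are illustrative assumptions, not the equation treated in the paper.

import numpy as np

def euler_particle_system(n=1000, h=0.01, T=1.0, sigma=1.0, seed=0):
    """Euler scheme for n weakly interacting particles approximating a
    McKean-Vlasov SDE whose drift depends on the law via the empirical mean:
        dX_t = b(X_t, E[X_t]) dt + sigma dW_t, with b(x, m) = -(x - m) here.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)             # initial particles ~ N(0, 1)
    steps = int(T / h)
    for _ in range(steps):
        m = x.mean()                        # empirical measure enters via its mean
        drift = -(x - m)                    # toy interaction kernel (assumption)
        x = x + drift * h + sigma * np.sqrt(h) * rng.standard_normal(n)
    return x

particles = euler_particle_system()
# The empirical distribution of the particles approximates the law of X_T,
# with errors of the orders discussed in the abstract above.
print(particles.mean(), particles.std())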
Abstract:
Creative accounting is a growing issue of interest in Spain. In this article we argue that the concept of "true and fair view" can limit or promote the use of creative accounting depending on its interpretation. We review the range of meanings that "true and fair view" can take at an international level and compare the experience of the United Kingdom with that of Australia by analysing the use of "true and fair view" to limit creative accounting. Finally, we suggest lines of action to be considered by the Spanish accounting standard-setting institutions.
Abstract:
We address the problem of scheduling a multi-station multiclass queueing network (MQNET) with server changeover times to minimize steady-state mean job holding costs. We present new lower bounds on the best achievable cost that emerge as the values of mathematical programming problems (linear, semidefinite, and convex) over relaxed formulations of the system's achievable performance region. The constraints on achievable performance defining these formulations are obtained by formulating the system's equilibrium relations. Our contributions include: (1) a flow conservation interpretation and closed formulae for the constraints previously derived by the potential function method; (2) new work decomposition laws for MQNETs; (3) new constraints (linear, convex, and semidefinite) on the performance region of first and second moments of queue lengths for MQNETs; (4) a fast bound for an MQNET with N customer classes computed in N steps; (5) two heuristic scheduling policies: a priority-index policy, and a policy extracted from the solution of a linear programming relaxation.
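The reason such relaxed formulations yield lower bounds is elementary (generic notation, assumed here): if $\mathcal{X}$ denotes the achievable performance region, $c$ the vector of holding cost rates, and $\mathcal{P} \supseteq \mathcal{X}$ any of the relaxed (linear, convex, or semidefinite) feasible sets, then
\[
  \min_{x \in \mathcal{P}} c^{\top} x \;\le\; \min_{x \in \mathcal{X}} c^{\top} x ,
\]
so optimizing over the larger relaxed set can only lower the optimal value, and the left-hand side is a valid bound on the best achievable cost.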