43 results for Chain rule
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
In this work, we use the rule of mixtures to develop an equivalent material model in which the total strain energy density is split into an isotropic part related to the matrix component and an anisotropic energy contribution related to the fiber effects. For the isotropic energy part, we select the amended non-Gaussian strain energy density model, while the fiber energy effects are added by considering the equivalent anisotropic volumetric fraction contribution, as well as the isotropized representation form of the eight-chain energy model that accounts for the material’s anisotropic effects. Furthermore, our proposed material model uses a phenomenological non-monotonous softening function that predicts stress-softening effects and includes an energy term, derived from the pseudo-elasticity theory, that accounts for residual strain deformations. The model’s theoretical predictions are compared with experimental data collected from human vaginal tissues, mouse skin, poly(glycolide-co-caprolactone) (PGC25 3-0) and polypropylene suture materials, and human tracheal and brain tissues. In all cases examined here, our equivalent material model closely follows the stress-softening and residual strain effects exhibited by the experimental data.
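As a purely illustrative sketch (the notation here is assumed, not taken from the paper), the rule-of-mixtures split of the total strain energy density described above has the general form

\[
W \;=\; (1 - v_f)\, W_{\mathrm{iso}} \;+\; v_f\, W_{\mathrm{aniso}},
\]

where \(W_{\mathrm{iso}}\) would denote the matrix (amended non-Gaussian) contribution, \(W_{\mathrm{aniso}}\) the isotropized eight-chain fiber contribution, and \(v_f\) the equivalent anisotropic volume fraction.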
Abstract:
Survival analysis is used in several fields to analyze the time elapsed between two events. What distinguishes survival analysis from other areas of statistics is that the data are usually censored. Interval censoring arises when the final event of interest is not directly observable and it is only known that the failure time lies within a specific interval. An even more complex censoring scheme appears when both the initial and the final times are interval-censored. This situation is called double censoring. In this article we give a formal description of a parametric Bayesian method for the analysis of interval-censored and doubly censored data, together with clear guidelines for its practical use. The proposed methodology is illustrated with data from a cohort of haemophiliac patients who were infected with HIV in the early 1980s.
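A minimal sketch of the kind of likelihood behind such an analysis (standard for interval-censored data; the notation is assumed here, and the article's parametric Bayesian formulation is richer): if the failure time of subject \(i\) is only known to lie in the interval \((L_i, R_i]\) and \(S_\theta\) denotes the parametric survival function, its likelihood contribution is

\[
L_i(\theta) \;=\; S_\theta(L_i) - S_\theta(R_i),
\]

so the posterior is proportional to the product of these contributions times the prior on \(\theta\). With doubly censored data, the interval-censored initial time is additionally integrated (or sampled) over.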
Abstract:
We present a new domain of preferences under which the majority relation is always quasi-transitive and thus Condorcet winners always exist. We model situations where a set of individuals must choose one individual in the group. Agents are connected through some relationship that can be interpreted as expressing neighborhood, and which is formalized by a graph. Our restriction on preferences is as follows: each agent can freely rank his immediate neighbors, but then he is indifferent between each neighbor and all other agents that this neighbor "leads to". Hence, agents can be highly perceptive regarding their neighbors, while being insensitive to the differences between these and other agents further removed from them. We show quasi-transitivity of the majority relation when the graph expressing the neighborhood relation is a tree. We also discuss a further restriction that allows the result to be extended to more general graphs. Finally, we compare the proposed restriction with others in the literature, and conclude that it is independent of any previously discussed domain restriction.
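A small example of the restriction, constructed here only for illustration: let the neighborhood graph be the path 1 - 2 - 3 - 4. Agent 2 may rank his neighbors 1 and 3 freely, say preferring 1 to 3, but must be indifferent between 3 and 4, since 4 is reached only through 3; likewise, agent 1, whose only neighbor is 2, is indifferent among 2, 3 and 4.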
Abstract:
In this paper, results known about the artinian and noetherian conditions for the Leavitt path algebras of graphs with finitely many vertices are extended to all row-finite graphs. In our first main result, necessary and sufficient conditions on a row-finite graph E are given so that the corresponding (not necessarily unital) Leavitt path K-algebra L(E) is semisimple. These are precisely the algebras L(E) for which every corner is left (equivalently, right) artinian. They are also precisely the algebras L(E) for which every finitely generated left (equivalently, right) L(E)-module is artinian. In our second main result, we give necessary and sufficient conditions for every corner of L(E) to be left (equivalently, right) noetherian. They also turn out to be precisely those algebras L(E) for which every finitely generated left (equivalently, right) L(E)-module is noetherian. In both situations, isomorphisms between these algebras and appropriate direct sums of matrix rings over K or K[x, x^{-1}] are provided. Likewise, in both situations, equivalent graph-theoretic conditions on E are presented.
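For orientation only (the index sets and notation below are assumed, not quoted from the paper), the isomorphisms mentioned in the last lines have the general shape

\[
L(E) \;\cong\; \bigoplus_{i \in I} M_{n_i}(K) \;\oplus\; \bigoplus_{j \in J} M_{m_j}(K[x, x^{-1}]),
\]

with the index sets and the (possibly infinite) matrix sizes determined by the graph E; in the semisimple case only the summands over K occur.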
Abstract:
We use store-specific data for a major UK supermarket chain to estimate the impact of planning on store output. Using the quasi-natural experiment of the variation in policies between England and other UK countries, we isolate the impact of Town Centre First policies. We find that space contributes directly to store productivity, and that planning policies in England directly reduce output, both by reducing store sizes and by forcing stores onto less productive sites. We estimate that since the late 1980s planning policies have imposed a loss of output of at least 18.3% to 24.9%, more than a “lost decade’s” growth. JEL codes: D2, L51, L81, R32.
Abstract:
As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent –essential zeros– or because it is below the detection limit –rounded zeros. Because the second kind of zeros is usually understood as “a trace too small to measure”, it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts –and thus the metric properties– should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is “natural” in the sense that it recovers the “true” composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values on compositional data sets is introduced.
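A minimal sketch of the multiplicative rounded-zero replacement discussed above, following the general recipe attributed to Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003); the function name, the use of a single imputation value delta, and the closure value are assumptions of this illustration:

import numpy as np

def multiplicative_replacement(x, delta, total=1.0):
    """Impute rounded zeros in a composition x that sums to `total`.

    Zeros are replaced by the small value `delta`, and the non-zero parts
    are rescaled multiplicatively so that the composition still sums to
    `total` while the ratios between non-zero parts are preserved.
    """
    x = np.asarray(x, dtype=float)
    zeros = (x == 0)
    imputed_mass = delta * zeros.sum()
    return np.where(zeros, delta, x * (1.0 - imputed_mass / total))

# Example: a 4-part composition with one rounded zero.
comp = [0.60, 0.25, 0.15, 0.0]
print(multiplicative_replacement(comp, delta=0.005))
# -> [0.597, 0.24875, 0.14925, 0.005]; note 0.597/0.24875 equals 0.60/0.25

Because all non-zero parts are multiplied by the same factor, subcompositions without zeros keep their covariance structure, which is the property emphasized in the abstract.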
Abstract:
In this paper, I consider a general and informationally efficient approach to determine the optimal access rule and show that there exists a simple rule that achieves the Ramsey outcome as the unique equilibrium when networks compete in linear prices without network-based price discrimination. My approach is informationally efficient in the sense that the regulator is required to know only the marginal cost structure, i.e. the marginal cost of making and terminating a call. The approach is general in that access prices can depend not only on the marginal costs but also on the retail prices, which can be observed by consumers and therefore by the regulator as well. In particular, I consider the set of linear access pricing rules, which includes any fixed access price, the Efficient Component Pricing Rule (ECPR) and the Modified ECPR as special cases. I show that in this set there is a unique access rule that achieves the Ramsey outcome as the unique equilibrium as long as there exists at least a mild degree of substitutability among networks' services.
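For context, and only as a textbook-style reminder rather than the paper's own rule (symbols assumed here): the Efficient Component Pricing Rule mentioned above sets the access charge to the cost of termination plus the opportunity cost of the lost retail sale,

\[
a^{\mathrm{ECPR}} \;=\; c_T + (p - c),
\]

where \(c_T\) is the marginal termination cost, \(p\) the retail price and \(c\) the retail marginal cost; the linear rules studied in the paper let the access price depend linearly on such marginal costs and retail prices.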
Abstract:
Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex on the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999). "Restless bandits, partial conservation laws, and indexability." Forthcoming in Advances in Applied Probability Vol. 33 No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL) (which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996)): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
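As background for the extension described above, here is a sketch of the classical linear-cost benchmark only (not of the paper's extended-class indices; names and data are illustrative): Smith's rule serves job classes in decreasing order of the ratio of holding cost rate to expected processing time.

def smith_order(jobs):
    """Order jobs by Smith's rule for linear holding costs.

    `jobs` contains (name, holding_cost_rate, expected_processing_time)
    tuples; jobs with a larger ratio of cost rate to processing time are
    served first, which minimizes total expected holding cost on a single
    machine when holding costs are linear.
    """
    return sorted(jobs, key=lambda job: job[1] / job[2], reverse=True)

# Example: three job classes.
jobs = [("A", 3.0, 2.0), ("B", 5.0, 4.0), ("C", 1.0, 0.5)]
print([name for name, *_ in smith_order(jobs)])  # -> ['C', 'A', 'B']

The indices proposed in the paper generalize this ratio to nonlinear convex holding costs by attaching an index to each extended class, i.e. to each (class, number of jobs in that class) pair.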
Abstract:
This paper explores the integration process that firms follow to implement Supply Chain Management (SCM) and the main barriers and benefits related to this strategy. This study has been inspired by the SCM literature, especially the logistics integration model by Stevens [1]. Due to the exploratory nature of this paper and the need to obtain in-depth knowledge of SCM development in the Spanish grocery sector, we used the case study methodology. A multiple case study analysis based on interviews with leading manufacturers and retailers was conducted. The results of this analysis suggest that firms seem to follow the integration process proposed by Stevens, integrating internally first and then extending this integration to other supply chain members. The cases also show that Spanish manufacturers, in general, seem to have a higher level of SCM development than Spanish retailers. Regarding the benefits that SCM can bring, most of the companies identify the general objectives of cost and stock reductions and service improvements. However, with respect to the barriers found in its implementation, retailers and manufacturers do not coincide: manufacturers tend to see barriers in aspects related to the other party, such as distrust and a lack of a culture of sharing information, while retailers identify as the main barriers the need for know-how, the company culture, and history and habits.
Abstract:
In today's highly competitive and global marketplace, the pressure on organizations to find new ways to create and deliver value to customers grows ever stronger. In the last two decades, logistics and the supply chain have moved to center stage. There has been a growing recognition that it is through effective management of the logistics function and the supply chain that the goals of cost reduction and service enhancement can be achieved. The key to success in Supply Chain Management (SCM) requires heavy emphasis on integration of activities, cooperation, coordination and information sharing throughout the entire supply chain, from suppliers to customers. To respond to the challenge of integration, sophisticated decision support systems are needed, based on powerful mathematical models and solution techniques, together with advances in information and communication technologies. Industry and academia have become increasingly interested in SCM in order to respond to the problems and issues posed by changes in logistics and the supply chain. We present a brief discussion of the important issues in SCM. We then argue that metaheuristics can play an important role in solving the complex supply chain problems that arise from the importance of designing and managing the entire supply chain as a single entity. We focus especially, but not exclusively, on Iterated Local Search, Tabu Search and Scatter Search as methods with great potential for solving SCM-related problems. We briefly present some successful applications.
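As an illustration of the first of the metaheuristics mentioned, a generic Iterated Local Search skeleton follows; it is not tied to any particular SCM formulation, and all function arguments are assumptions of this sketch.

import random

def iterated_local_search(initial, local_search, perturb, cost, iterations=100):
    """Generic Iterated Local Search: perturb the incumbent local optimum,
    re-apply local search, and keep the result only if it improves the cost."""
    best = local_search(initial)
    for _ in range(iterations):
        candidate = local_search(perturb(best))
        if cost(candidate) < cost(best):
            best = candidate
    return best

# Toy usage: minimize a one-dimensional cost with a trivial neighborhood.
cost = lambda x: (x - 3.0) ** 2
local_search = lambda x: min((x - 0.1, x, x + 0.1), key=cost)
perturb = lambda x: x + random.uniform(-1.0, 1.0)
print(round(iterated_local_search(0.0, local_search, perturb, cost), 2))

In an SCM setting the solution would instead encode, for example, a vehicle routing or facility location plan, with problem-specific local search and perturbation moves.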
Abstract:
The paper deals with a bilateral accident situation in which victims have heterogeneous costs of care. With perfect information, efficient care by the injurer rises with the victim's cost. When the injurer cannot observe the victim's type at all, and this fact can be verified by Courts, the first best cannot be implemented with the use of a negligence rule based on the first-best levels of care. The second best leads the injurer to intermediate care, and the two types of victims to choose the best response to it. This second-best solution can be easily implemented by a negligence rule with the second-best level as due care. We explore imperfect observation of the victim's type, characterizing the optimal solution and examining the different legal alternatives when Courts cannot verify the injurers' statements. Counterintuitively, we show that there is no difference at all between the use by Courts of a rule of complete trust and a rule of complete distrust towards the injurers' statements. We then relate the findings of the model to existing rules and doctrines in Common Law and Civil Law legal systems.
Abstract:
This paper analyses the interaction of two topics: Supply Chain Management (SCM) and the Internet. Merging these two fields is a key area of concern for contemporary managers and researchers. They have realized that the Internet can enhance SCM by making real-time information available and enabling collaboration between trading partners. The aim of this paper is to define e-SCM, analyze how research in this area has evolved during the period 1995-2003, and identify some lines of further research. To do that, a literature review of prestigious academic journals in Operations Management and Logistics has been conducted.
Abstract:
We study the standard economic model of unilateral accidents, in its simplest form, assuming that injurers have limited assets. We identify a second-best optimal rule that selects as due care the minimum of first-best care and a level of care that takes into account the wealth of the injurer. We show that such a rule in fact maximizes the precautionary effort by a potential injurer. The idea is counterintuitive: being softer on an injurer, in terms of the required level of care, actually improves the incentives to take care when he is potentially insolvent. We extend the basic result to an entire population of potentially insolvent injurers, and find that the optimal general standards of care do depend on wealth and on the distribution of income. We also show the conditions for the result that higher income levels in a given society call for higher levels of care for accidents.
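In sketch form, with notation assumed here rather than taken from the paper, the second-best rule described above sets due care as

\[
x^{d}(w) \;=\; \min\{\, x^{*},\, \hat{x}(w) \,\},
\]

where \(x^{*}\) is the first-best level of care and \(\hat{x}(w)\) is a care level that depends on the injurer's wealth \(w\); the result is that this softer standard maximizes the precaution actually taken by a potentially insolvent injurer.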
Abstract:
We study how to promote compliance with rules in everyday situations. Having access to unique data on the universe of users of all public libraries in Barcelona, we test the effect of sending email messages with different contents. We find that users return their items earlier if asked to do so in a simple email. Emails reminding users of the penalties associated with late returns are more effective than emails with only a generic reminder. We find differential treatment effects by user types. The characteristics we analyze are previous compliance, gender, age, and nationality.