69 results for noncooperative foundations
Abstract:
Representatives of several Internet service providers (ISPs) have expressed their wish to see a substantial change in the pricing policies of the Internet. In particular, they would like to see content providers (CPs) pay for use of the network, given the large amount of resources they consume. This would be in clear violation of the ``network neutrality'' principle that has characterized the development of the wireline Internet. Our first goal in this article is to propose and study possible ways of implementing such payments and of regulating their amount. We introduce a model that includes the users' behavior, the utilities of the ISP and of the CPs, and the monetary flow among the content users, the ISP, and the CPs, in particular the CPs' revenues from advertisements. We consider various game models and study the resulting equilibria; they are all combinations of a noncooperative game (in which the ISPs and CPs determine how much they will charge the users) with a ``cooperative'' one on how the CP and the ISP share the payments. We include in our model a possible asymmetric weighting parameter (which varies between zero and one). We also study the equilibria that arise when one of the CPs colludes with the ISP. Finally, we study two dynamic game models as well as the convergence of prices to their equilibrium values.
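As an illustration of the kind of dynamic price adjustment this abstract alludes to, the sketch below runs best-response dynamics in a stylized two-player pricing game between one ISP and one CP. The linear demand model, the per-user ad-revenue parameter, and the starting prices are all assumptions made for illustration; they are not the paper's actual model.

```python
def best_response_dynamics(ad_rev=0.2, n_iter=200):
    """Best-response price dynamics for a toy ISP/CP pricing game.

    Assumed demand model: D(p) = max(0, 1 - p_isp - p_cp).
    ISP utility: p_isp * D; CP utility: (p_cp + ad_rev) * D,
    where ad_rev is the CP's assumed ad revenue per user.
    """
    p_isp, p_cp = 0.5, 0.5
    for _ in range(n_iter):
        # Each player's best response maximizes its own utility
        # given the other's current price (first-order conditions).
        p_isp = (1 - p_cp) / 2
        p_cp = (1 - p_isp - ad_rev) / 2
    return p_isp, p_cp
```

Under this toy demand, the iteration is a contraction and converges to the unique Nash equilibrium p_isp = (1 + a)/3, p_cp = (1 - 2a)/3, where a is the ad revenue per user.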
Abstract:
A method is presented for determining the ultimate bearing capacity of a circular footing reinforced with a horizontal circular sheet of reinforcement placed over granular and cohesive-frictional soils. It was assumed that the reinforcement sheet could carry axial tension but not bending moment. The analysis was performed on the basis of the lower-bound theorem of limit analysis in combination with finite elements and linear optimization. The present research is an extension of recent work on strip foundations reinforced with different layers of reinforcement. To incorporate the effect of the reinforcement, the efficiency factors eta(gamma) and eta(c), by which the bearing capacity factors N-gamma and N-c need to be multiplied, were established. Results were obtained for different values of the soil internal friction angle (phi). The optimal positions of the reinforcement, which lead to the maximum improvement in the bearing capacity, were also determined. The variation of the axial tensile force in the reinforcement sheet at different radial distances from the center was also studied. The results of the analysis were compared with those available in the literature. (C) 2014 American Society of Civil Engineers.
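The way the efficiency factors enter the computation can be sketched with the classical bearing-capacity superposition, where eta(c) and eta(gamma) simply scale the corresponding terms. The formula below (surcharge term omitted) and all numeric inputs are illustrative assumptions, not values from the paper.

```python
def reinforced_bearing_capacity(c, gamma, b, n_c, n_gamma,
                                eta_c=1.0, eta_gamma=1.0):
    """Ultimate bearing capacity (illustrative) of a circular footing
    of radius b on soil with cohesion c and unit weight gamma:

        q_u = eta_c * c * N_c + eta_gamma * 0.5 * gamma * (2b) * N_gamma

    eta_c and eta_gamma are the reinforcement efficiency factors that
    multiply the bearing capacity factors N_c and N_gamma; eta = 1
    recovers the unreinforced case. Surcharge term omitted for brevity.
    """
    return eta_c * c * n_c + eta_gamma * 0.5 * gamma * (2 * b) * n_gamma
```

With hypothetical factors eta_c = 1.5 and eta_gamma = 2.0 the function directly quantifies the improvement over the unreinforced footing.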
Abstract:
Mathematics is beautiful and precise and often necessary to understand complex biological phenomena. And yet biologists cannot always hope to fully understand the mathematical foundations of the theory they are using or testing. How then should biologists behave when mathematicians themselves are in dispute? Using the ongoing controversy over Hamilton's rule as an example, I argue that biologists should be free to treat mathematical theory with a healthy dose of agnosticism. In doing so, biologists should equip themselves with a disclaimer that publicly admits that they cannot entirely attest to the veracity of the mathematics underlying the theory they are using or testing. The disclaimer will only help if it is accompanied by three responsibilities: stay bipartisan in a dispute among mathematicians, stay vigilant and help expose dissent among mathematicians, and make the biology larger than the mathematics. I must emphasize that my goal here is not to take sides in the ongoing dispute over the mathematical validity of Hamilton's rule; indeed, my goal is to argue that we should refrain from taking sides.
Abstract:
Health monitoring is an integral part of laboratory animal quality standards. However, current or past prevalence data as well as regulatory requirements dictate the frequency, type, and extent of health monitoring. In an effort to understand the prevalence of rodent pathogens in India, a preliminary sero-epidemiological study was carried out. Serum samples obtained from 26 public and private animal facilities were analyzed by sandwich ELISA for the presence of antibodies against minute virus of mice (MVM), ectromelia virus (ECTV), lymphocytic choriomeningitis virus (LCMV), mouse hepatitis virus (MHV), Sendai virus (SeV), and Mycoplasma pulmonis in mice, and against SeV, rat parvovirus (RPV), Kilham's rat virus (KRV), and sialodacryoadenitis virus (SDAV) in rats. It was observed that MHV was the most prevalent agent in mice, followed by Mycoplasma pulmonis and MVM, while SDAV, followed by RPV, was prevalent in rats. On the other hand, none of the samples were positive for ECTV in mice, or for SeV or KRV in rats. Multiple infections were common in both mice and rats. The incidence of MHV and Mycoplasma pulmonis was higher in facilities maintained by public organizations than in vivaria of private organizations, although the difference was not statistically significant. On the other hand, the prevalence of rodent pathogens was significantly higher in the northern part of India than in the South. These studies form the groundwork for detailed sero-prevalence studies, which should further lay the foundations for country-specific guidelines for the health monitoring of laboratory animals.
Abstract:
The problem of scaling up data integration, such that new sources can be quickly utilized as they are discovered, remains elusive: global schemas for integrated data are difficult to develop and expand, and schema and record matching techniques are limited by the fact that data and metadata are often under-specified and must be disambiguated by data experts. One promising approach is to avoid using a global schema and instead to develop keyword search-based data integration, where the system lazily discovers associations enabling it to join together matches to keywords and return ranked results. The user is expected to understand the data domain and provide feedback about answers' quality. The system generalizes such feedback to learn how to correctly integrate data. A major open challenge is that under this model, the user only sees and offers feedback on a few ``top'' results: this result set must be carefully selected to include both answers of high relevance and answers that are highly informative when feedback is given on them. Existing systems merely focus on predicting relevance, by composing the scores of various schema and record matching algorithms. In this paper, we show how to predict the uncertainty associated with a query result's score, as well as how informative feedback is on a given result. We build upon these foundations to develop an active learning approach to keyword search-based data integration, and we validate the effectiveness of our solution over real data from several very different domains.
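The selection problem the abstract raises, choosing a small result set that mixes likely-relevant answers with answers whose feedback would be most informative, can be sketched as a simple scoring rule. The linear relevance/uncertainty trade-off and the parameter beta below are generic active-learning assumptions for illustration, not the ranking the paper actually proposes.

```python
def select_for_feedback(results, k=5, beta=0.5):
    """Pick k results to show the user for feedback.

    results: list of (answer, relevance, uncertainty) tuples, where
    relevance is the predicted score and uncertainty estimates how
    unsure the system is about that score (both assumed in [0, 1]).
    beta trades off exploitation (show relevant answers) against
    exploration (show answers whose feedback is informative).
    """
    ranked = sorted(results,
                    key=lambda r: (1 - beta) * r[1] + beta * r[2],
                    reverse=True)
    return [answer for answer, _, _ in ranked[:k]]
```

With beta = 0 this degenerates to pure relevance ranking, the behavior the abstract attributes to existing systems; beta > 0 lets uncertain answers surface into the top-k set.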
Abstract:
This paper presents a lower bound limit analysis approach for solving an axisymmetric stability problem by using the Drucker-Prager (D-P) yield cone in conjunction with finite elements and nonlinear optimization. In principal stress space, the tip of the yield cone has been smoothed by applying a hyperbolic approximation. The nonlinear optimization has been performed by employing an interior point method based on the logarithmic barrier function. A new proposal has also been given to match the D-P yield cone to the Mohr-Coulomb hexagonal yield pyramid. For the sake of illustration, the bearing capacity factors N-c, N-q, and N-gamma have been computed, as a function of phi, for both smooth and rough circular foundations. The results obtained from the analysis compare quite well with the solutions reported in the literature.
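The hyperbolic tip smoothing mentioned above is commonly written by replacing sqrt(J2) in the Drucker-Prager yield function with sqrt(a^2 + J2), which is differentiable at the apex (J2 = 0) and approaches the sharp cone as a -> 0. The sketch below assumes the yield form F = alpha*I1 + sqrt(J2) - k; the sign convention and the value of the smoothing parameter a are illustrative assumptions, not the paper's.

```python
import math

def dp_yield_hyperbolic(i1, j2, alpha, k, a=1e-3):
    """Smoothed Drucker-Prager yield function (illustrative form):

        F = alpha * I1 + sqrt(a^2 + J2) - k

    i1: first stress invariant I1; j2: second deviatoric invariant J2;
    alpha, k: D-P material parameters; a: small smoothing parameter.
    F < 0 means the stress state lies inside the yield surface;
    a = 0 recovers the original sharp cone.
    """
    return alpha * i1 + math.sqrt(a * a + j2) - k
```

Smoothing matters for the interior-point solver the abstract mentions, since the barrier method needs gradients of the constraint, and the sharp cone's gradient is undefined at its tip.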
Abstract:
I consider theories of gravity built not just from the metric and affine connection, but also from other (possibly higher-rank) symmetric tensor(s). The Lagrangian densities are scalars built from them, and the volume forms are related to Cayley's hyperdeterminants. The resulting diff-invariant actions give rise to geometric theories that go beyond the metric paradigm (even metric-less theories are possible) and contain Einstein gravity as a special case. Examples include theories with generalizations of Riemannian geometry. The 0-tensor case is related to dilaton gravity. These theories can give rise to new types of spontaneous Lorentz breaking and might be relevant for ``dark'' sector cosmology.
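For readers unfamiliar with the hyperdeterminants mentioned above, the simplest case is Cayley's hyperdeterminant of a 2x2x2 tensor, a degree-4 polynomial invariant that generalizes the determinant. The sketch below just evaluates that classical formula; it is background illustration, not a construction from the paper.

```python
from itertools import product

def hyperdet_222(a):
    """Cayley's hyperdeterminant of a 2x2x2 tensor a[i][j][k]
    (nested lists, indices in {0, 1}), using the classical
    degree-4 formula. It vanishes exactly on decomposable
    (rank-one) tensors, mirroring det = 0 for singular matrices.
    """
    A = {(i, j, k): a[i][j][k] for i, j, k in product((0, 1), repeat=3)}
    # Squared "diagonal" products.
    sq = (A[0, 0, 0]**2 * A[1, 1, 1]**2 + A[0, 0, 1]**2 * A[1, 1, 0]**2
          + A[0, 1, 0]**2 * A[1, 0, 1]**2 + A[1, 0, 0]**2 * A[0, 1, 1]**2)
    # Pairwise cross terms, each entering with coefficient -2.
    cross = (A[0, 0, 0]*A[0, 0, 1]*A[1, 1, 0]*A[1, 1, 1]
             + A[0, 0, 0]*A[0, 1, 0]*A[1, 0, 1]*A[1, 1, 1]
             + A[0, 0, 0]*A[0, 1, 1]*A[1, 0, 0]*A[1, 1, 1]
             + A[0, 0, 1]*A[0, 1, 0]*A[1, 0, 1]*A[1, 1, 0]
             + A[0, 0, 1]*A[0, 1, 1]*A[1, 0, 0]*A[1, 1, 0]
             + A[0, 1, 0]*A[0, 1, 1]*A[1, 0, 0]*A[1, 0, 1])
    # The two "odd" quadruples, each with coefficient +4.
    quad = (A[0, 0, 0]*A[0, 1, 1]*A[1, 0, 1]*A[1, 1, 0]
            + A[0, 0, 1]*A[0, 1, 0]*A[1, 0, 0]*A[1, 1, 1])
    return sq - 2 * cross + 4 * quad
```

For example, the tensor with 1 in positions (0,0,0) and (1,1,1) and 0 elsewhere has hyperdeterminant 1, while any rank-one tensor (e.g. all entries equal to 1) gives 0.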
Abstract:
The bearing capacity of a circular footing lying over fully cohesive strata, with an overlying sand layer, is computed using the axisymmetric lower bound limit analysis with finite elements and linear optimization. The effects of the thickness and the internal friction angle (phi) of the sand are examined for different combinations of c(u)/(gamma b) and q, where c(u) = the undrained shear strength of the cohesive strata, gamma = the unit weight of either layer, b = the footing radius, and q = the surcharge pressure. The results are given in the form of a ratio (eta) of the bearing capacity with an overlying sand layer to that for a footing lying directly over the clayey strata. An overlying medium dense to dense sand layer considerably improves the bearing capacity. The improvement continuously increases with decreases in c(u)/(gamma b) and increases in phi and q/(gamma b). A certain optimum thickness of the sand layer exists beyond which no further improvement occurs. This optimum thickness increases with an increase in phi and q and with a decrease in c(u)/(gamma b). Failure patterns are also drawn to examine the effect of including the sand layer. (C) 2015 The Japanese Geotechnical Society. Production and hosting by Elsevier B.V. All rights reserved.
Abstract:
We show here a 2^(Omega(sqrt(d) log N)) size lower bound for homogeneous depth-four arithmetic formulas. That is, we give an explicit family of polynomials of degree d on N variables (with N = d^3 in our case) with 0/1-coefficients such that for any representation of a polynomial f in this family of the form f = Sigma_i Pi_j Q_ij, where the Q_ij's are homogeneous polynomials (recall that a polynomial is said to be homogeneous if all its monomials have the same degree), it must hold that Sigma_{i,j} (number of monomials of Q_ij) >= 2^(Omega(sqrt(d) log N)). The above-mentioned family, which we refer to as the Nisan-Wigderson design-based family of polynomials, is in the complexity class VNP. Our work builds on the recent lower bound results [1], [2], [3], [4], [5] and yields an improved quantitative bound as compared to the quasi-polynomial lower bound of [6] and the N^(Omega(log log N)) lower bound in the independent work of [7].