923 results for noncooperative foundations


Relevance:

10.00%

Publisher:

Abstract:

A path in an edge-colored graph is said to be a rainbow path if no two edges on the path have the same color. An edge-colored graph is (strongly) rainbow connected if there exists a (geodesic) rainbow path between every pair of vertices. The (strong) rainbow connectivity of a graph G, denoted rc(G) (respectively src(G)), is the smallest number of colors required to edge-color the graph such that G is (strongly) rainbow connected. In this paper we study the rainbow connectivity problem and the strong rainbow connectivity problem from a computational point of view. Our main results can be summarised as follows: 1) For every fixed k >= 3, it is NP-complete to decide whether src(G) <= k, even when the graph G is bipartite. 2) For every fixed odd k >= 3, it is NP-complete to decide whether rc(G) <= k. This resolves one of the open problems posed by Chakraborty et al. (J. Comb. Opt., 2011), who prove hardness for the even case. 3) The following problem is fixed-parameter tractable: given a graph G, determine the maximum number of pairs of vertices that can be rainbow connected using two colors. 4) For a directed graph G, it is NP-complete to decide whether rc(G) <= 2.
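The rainbow-path definition above is easy to check directly; a minimal Python sketch (the graph and its coloring are invented for illustration, not taken from the paper):

```python
def is_rainbow_path(path, color):
    """True iff no two edges along the path share a color."""
    edge_colors = [color[frozenset(e)] for e in zip(path, path[1:])]
    return len(edge_colors) == len(set(edge_colors))

# A 4-cycle a-b-c-d-a edge-colored with 2 colors (hypothetical example).
color = {
    frozenset({"a", "b"}): 1,
    frozenset({"b", "c"}): 2,
    frozenset({"c", "d"}): 1,
    frozenset({"d", "a"}): 2,
}
print(is_rainbow_path(["a", "b", "c"], color))       # True: colors 1, 2
print(is_rainbow_path(["a", "b", "c", "d"], color))  # False: color 1 repeats
```

With this coloring every pair of vertices is joined by some rainbow path, illustrating how a cycle can be rainbow connected with only two colors.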


Recently it has been discovered, contrary to the expectations of physicists as well as biologists, that the energy transport during photosynthesis, from the chlorophyll pigment that captures the photon to the reaction centre where glucose is synthesised from carbon dioxide and water, is highly coherent even at ambient temperature and in the cellular environment. This process and the key molecular ingredients it depends on are described. By looking at the process from the computer science viewpoint, we can study what has been optimised and how. A spatial search algorithmic model based on robust features of wave dynamics is presented.


Large software systems are developed by composing multiple programs. If the programs manipulate and exchange complex data, such as network packets or files, it is essential to establish that they follow compatible data formats. Most of the complexity of data formats is associated with the headers. In this paper, we address compatibility of programs operating over headers of network packets, files, images, etc. As format specifications are rarely available, we infer the format associated with headers by a program as a set of guarded layouts. In terms of these formats, we define and check compatibility of (a) producer-consumer programs and (b) different versions of producer (or consumer) programs. A compatible producer-consumer pair is free of type mismatches and logical incompatibilities, such as the consumer rejecting valid outputs generated by the producer. A backward-compatible producer (resp. consumer) is guaranteed to be compatible with consumers (resp. producers) that were compatible with its older version. With our prototype tool, we identified 5 known bugs and 1 potential bug in (a) sender-receiver modules of Linux network drivers from 3 vendors and (b) different versions of a TIFF image library.
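As a rough illustration of the compatibility notion (this is not the paper's tool or its format-inference algorithm, and all field names are invented): if a format is modeled as the set of header layouts a program may emit or accept, a producer-consumer pair is compatible when the consumer accepts every layout the producer can generate.

```python
# A layout is a tuple of (field_name, byte_width) pairs; a format is the
# set of layouts a program can emit (producer) or accept (consumer).
producer_fmt = {
    (("magic", 2), ("version", 1), ("length", 4)),
    (("magic", 2), ("version", 1), ("length", 4), ("checksum", 2)),
}
consumer_fmt = {
    (("magic", 2), ("version", 1), ("length", 4)),
}

def compatible(producer, consumer):
    """Compatible iff the consumer accepts every layout the producer
    may generate (no valid output is rejected)."""
    return producer <= consumer

print(compatible(producer_fmt, consumer_fmt))  # False: checksum layout rejected
```

In these simplified terms, a backward-compatible new consumer is one whose layout set still contains everything the old consumer accepted.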


This paper presents the case history of the construction of a 3 m high embankment on a geocell foundation over soft settled red mud. Red mud is a waste product of the Bayer process in the aluminum industry. The geotechnical problems of the site, the design of the geocell foundation based on experimental investigation, and the construction sequence of the geocell foundations in the field are discussed in the paper. Based on the experimental studies, an analytical model was also developed to estimate the load-carrying capacity of the soft clay bed reinforced with geocell, and with a combination of geocell and geogrid. The results of the experimental and analytical studies revealed that using the combination of geocell and geogrid is always more beneficial than using the geocell alone. Hence, the combination of geocell and geogrid was recommended to stabilize the embankment base. The reported embankment is located in Lanjigharh (Orissa) in India. Construction of the embankment on the geocell foundation has already been completed. The constructed embankment has already sustained two monsoons without any cracks or seepage. (C) 2013 Elsevier Ltd. All rights reserved.


Sacred groves are patches of forest preserved for their spiritual and religious significance. The practice gained relevance with the spread of agriculture, which caused large-scale deforestation affecting biodiversity and watersheds. Sacred groves may have lost their prominence nowadays, but they are still relevant in Indian rural landscapes inhabited by traditional communities. The recent rise of interest in this tradition has encouraged scientific study which, despite the pan-Indian distribution of the groves, has focused on India's northeast, the Western Ghats and the east coast, either for their global/regional importance or their unique ecosystems. Most studies focused on flora, mainly angiosperms; faunal studies concentrated on vertebrates, while lower life forms were grossly neglected. Studies on ecosystem functioning are few, although observations are available. Most studies attributed watershed protection values to sacred groves but hardly highlighted hydrological processes or water yield in comparison with other land use types. Sacred grove studies need to diversify from this stereotyped path and move towards creating credible scientific foundations for conservation. Documentation should continue in unexplored areas, but more work is needed on basic ecological functions and ecosystem dynamics to strengthen planning for scientifically sound sacred grove management.


We investigate the parameterized complexity of the following edge coloring problem motivated by the problem of channel assignment in wireless networks. For an integer q >= 2 and a graph G, the goal is to find a coloring of the edges of G with the maximum number of colors such that every vertex of the graph sees at most q colors. This problem is NP-hard for q >= 2, and has been well-studied from the point of view of approximation. Our main focus is the case q = 2, which is already theoretically intricate and practically relevant. We give fixed-parameter tractable algorithms for both the standard and the dual parameterizations; for the latter, the result is based on a linear vertex kernel.
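To pin down the objective for q = 2, the problem can be stated as brute force (exponential time; this is only a definition check on a toy graph, not the paper's FPT algorithm):

```python
from itertools import product

def max_colors(edges, vertices, q):
    """Largest k such that the edges admit a coloring using exactly k
    colors while every vertex sees at most q distinct colors."""
    m = len(edges)
    for k in range(m, 0, -1):
        for coloring in product(range(k), repeat=m):
            if len(set(coloring)) < k:   # require all k colors to appear
                continue
            ok = True
            for v in vertices:
                seen = {c for e, c in zip(edges, coloring) if v in e}
                if len(seen) > q:
                    ok = False
                    break
            if ok:
                return k
    return 0

# Hypothetical triangle: each vertex touches only 2 edges, so with q = 2
# all three edges may receive distinct colors.
edges = [("a", "b"), ("b", "c"), ("c", "a")]
print(max_colors(edges, {"a", "b", "c"}, q=2))  # 3
```

On a star with three leaves the center sees every edge, so the same routine returns only 2 colors for q = 2.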


The correlation clustering problem is fundamental in both theory and practice: it involves identifying clusters of objects in a data set based on their similarity. A traditional modeling of this question as a graph-theoretic problem associates vertices with data points and indicates similarity by adjacency. Clusters then correspond to cliques in the graph. The resulting optimization problem, Cluster Editing (and several variants), is very well-studied algorithmically. In many situations, however, translating clusters to cliques can be somewhat restrictive. A more flexible notion would be a structure where the vertices are mutually ``not too far apart'', without necessarily being adjacent. One such generalization is realized by structures called s-clubs, which are graphs of diameter at most s. In this work, we study the question of finding a set of at most k edges whose removal leaves us with a graph whose components are s-clubs. Recently, it has been shown that unless the Exponential Time Hypothesis (ETH) fails, Cluster Editing (whose components are 1-clubs) does not admit a sub-exponential time algorithm [STACS, 2013]. That is, there is no algorithm solving the problem in time 2^{o(k)} n^{O(1)}. Surprisingly, however, they show that when the number of cliques in the output graph is restricted to d, the problem can be solved in time O(2^{O(sqrt(dk))} + m + n). We show that this sub-exponential time algorithm for a fixed number of cliques is the exception rather than the rule. Our first result shows that, assuming the ETH, there is no algorithm solving the s-Club Cluster Edge Deletion problem in time 2^{o(k)} n^{O(1)}. We show, further, that even the problem of deleting edges to obtain a graph with d s-clubs cannot be solved in time 2^{o(k)} n^{O(1)} for any fixed s, d >= 2. This is in radical contrast to the situation established for cliques, where sub-exponential algorithms are known.
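The s-club condition itself (diameter at most s) is straightforward to verify; a small sketch, with an invented example graph:

```python
from collections import deque

def diameter(adj):
    """Diameter of a graph given as an adjacency dict, via BFS from
    every vertex; infinite if the graph is disconnected."""
    best = 0
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        if len(dist) < len(adj):
            return float("inf")
        best = max(best, max(dist.values()))
    return best

def is_s_club(adj, s):
    return diameter(adj) <= s

# A path on 3 vertices is a 2-club but not a 1-club (i.e., not a clique).
path3 = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
print(is_s_club(path3, 2))  # True
print(is_s_club(path3, 1))  # False
```

Checking whether k edge deletions suffice is, of course, the hard part the abstract addresses.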


Representatives of several Internet service providers (ISPs) have expressed their wish to see a substantial change in the pricing policies of the Internet. In particular, they would like to see content providers (CPs) pay for use of the network, given the large amount of resources they use. This would be in clear violation of the ``network neutrality'' principle that has characterized the development of the wireline Internet. Our first goal in this article is to propose and study possible ways of implementing such payments and of regulating their amount. We introduce a model that includes the users' behavior, the utilities of the ISP and of the CPs, and the monetary flow among the content users, the ISP and the CPs, in particular the CPs' revenues from advertisements. We consider various game models and study the resulting equilibria; they are all combinations of a noncooperative game (in which the ISPs and CPs determine how much they will charge the users) with a ``cooperative'' one on how the CP and the ISP share the payments. We include in our model a possible asymmetric weighting parameter (that varies between zero and one). We also study the equilibria that arise when one of the CPs colludes with the ISP. We also study two dynamic game models, as well as the convergence of prices to the equilibrium values.
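The flavor of the noncooperative price-setting part can be sketched with a toy model. Everything below is invented for illustration (the linear demand, the profit functions, the transfer rule and all numbers); it is not the article's model, only a demonstration of alternating best responses converging to an equilibrium:

```python
def equilibrium_prices(alpha=0.5, ad=0.4, steps=200):
    """Toy ISP/CP pricing game with linear user demand
    D = max(0, 1 - p_isp - p_cp); the CP transfers a fraction alpha of
    its per-user advertising revenue `ad` to the ISP, so
        ISP profit = (p_isp + alpha*ad) * D
        CP  profit = (p_cp + (1-alpha)*ad) * D.
    Each side repeatedly plays the best response to the other's price."""
    p_isp, p_cp = 0.5, 0.5
    for _ in range(steps):
        p_isp = max(0.0, (1 - p_cp - alpha * ad) / 2)        # argmax of ISP profit
        p_cp = max(0.0, (1 - p_isp - (1 - alpha) * ad) / 2)  # argmax of CP profit
    return p_isp, p_cp

p_isp, p_cp = equilibrium_prices()
print(round(p_isp, 4), round(p_cp, 4))
```

With a symmetric split (alpha = 0.5), both prices converge to the same equilibrium value, 0.8/3; the advertising revenue acts like a subsidy that pushes user-facing prices down.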


A method is presented for determining the ultimate bearing capacity of a circular footing reinforced with a horizontal circular sheet of reinforcement placed over granular and cohesive-frictional soils. It was assumed that the reinforcement sheet could bear axial tension but not bending moment. The analysis was performed on the basis of the lower-bound theorem of limit analysis in combination with finite elements and linear optimization. The present research is an extension of recent work on strip foundations reinforced with different layers of reinforcement. To incorporate the effect of the reinforcement, the efficiency factors η_γ and η_c, which need to be multiplied by the bearing capacity factors N_γ and N_c, were established. Results were obtained for different values of the soil internal friction angle (φ). The optimal positions of the reinforcement, which would lead to a maximum improvement in the bearing capacity, were also determined. The variation of the axial tensile force in the reinforcement sheet at different radial distances from the center was also studied. The results of the analysis were compared with those available in the literature. (C) 2014 American Society of Civil Engineers.
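Schematically, the efficiency factors scale the corresponding terms of a generic bearing-capacity expression, as in the sketch below (the form of the expression and every number are hypothetical and only illustrate how the factors enter; the surcharge term is omitted and none of this reproduces the paper's results):

```python
def q_ult(c, gamma, b, n_c, n_gamma, eta_c=1.0, eta_gamma=1.0):
    """Illustrative bearing capacity of a circular footing of radius b:
    q_u = eta_c * c * N_c + eta_gamma * 0.5 * gamma * B * N_gamma,
    with B = 2*b; eta_c = eta_gamma = 1 recovers the unreinforced case."""
    return eta_c * c * n_c + eta_gamma * 0.5 * gamma * (2 * b) * n_gamma

# Hypothetical inputs: c = 10 kPa, gamma = 18 kN/m^3, radius b = 0.5 m,
# with invented bearing capacity and efficiency factors.
unreinforced = q_ult(10, 18, 0.5, n_c=30, n_gamma=20)
reinforced = q_ult(10, 18, 0.5, n_c=30, n_gamma=20, eta_c=1.4, eta_gamma=1.6)
print(round(unreinforced, 2), round(reinforced, 2))  # 480.0 708.0
```

The point of the sketch is only that η_c and η_γ act multiplicatively on the cohesion and self-weight terms, respectively.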


Mathematics is beautiful and precise, and often necessary to understand complex biological phenomena. And yet biologists cannot always hope to fully understand the mathematical foundations of the theory they are using or testing. How then should biologists behave when mathematicians themselves are in dispute? Using the ongoing controversy over Hamilton's rule as an example, I argue that biologists should be free to treat mathematical theory with a healthy dose of agnosticism. In doing so, biologists should equip themselves with a disclaimer that publicly admits that they cannot entirely attest to the veracity of the mathematics underlying the theory they are using or testing. The disclaimer will only help if it is accompanied by three responsibilities: stay bipartisan in a dispute among mathematicians, stay vigilant and help expose dissent among mathematicians, and make the biology larger than the mathematics. I must emphasize that my goal here is not to take sides in the ongoing dispute over the mathematical validity of Hamilton's rule; indeed, my goal is to argue that we should refrain from taking sides.


Health monitoring is an integral part of laboratory animal quality standards. However, current or past prevalence data, as well as regulatory requirements, dictate the frequency, type and extent of health monitoring. In an effort to understand the prevalence of rodent pathogens in India, a preliminary sero-epidemiological study was carried out. Serum samples obtained from 26 public and private animal facilities were analyzed by sandwich ELISA for the presence of antibodies against minute virus of mice (MVM), ectromelia virus (ECTV), lymphocytic choriomeningitis virus (LCMV), mouse hepatitis virus (MHV), Sendai virus (SeV) and Mycoplasma pulmonis in mice, and against SeV, rat parvovirus (RPV), Kilham's rat virus (KRV) and sialodacryoadenitis virus (SDAV) in rats. It was observed that MHV was the most prevalent agent in mice, followed by Mycoplasma pulmonis and MVM, while SDAV, followed by RPV, was prevalent in rats. On the other hand, none of the samples were positive for ECTV in mice, or for SeV or KRV in rats. Multiple infections were common in both mice and rats. The incidence of MHV and Mycoplasma pulmonis was higher in facilities maintained by public organizations than in vivaria of private organizations, although the difference was not statistically significant. On the other hand, the prevalence of rodent pathogens was significantly higher in the northern part of India than in the south. These studies form the groundwork for detailed sero-prevalence studies, which should further lay the foundations for country-specific guidelines for the health monitoring of laboratory animals.
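The kind of significance check behind a statement like "higher but not statistically significant" can be sketched with a two-proportion z-test (normal approximation); the counts below are hypothetical and are not the study's data:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test via the normal approximation;
    returns (z, p_value) for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 12 of 40 facilities positive in one group,
# 8 of 45 in the other.
z, p_value = two_proportion_z(12, 40, 8, 45)
print(p_value > 0.05)  # True: the observed difference is not significant at 5%
```

A higher observed proportion can thus still fail to reach significance when the sample sizes are modest, which is the situation the abstract describes for public versus private facilities.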


The problem of scaling up data integration, such that new sources can be quickly utilized as they are discovered, remains elusive: global schemas for integrated data are difficult to develop and expand, and schema and record matching techniques are limited by the fact that data and metadata are often under-specified and must be disambiguated by data experts. One promising approach is to avoid using a global schema and instead to develop keyword search-based data integration, where the system lazily discovers associations enabling it to join together matches to keywords, and returns ranked results. The user is expected to understand the data domain and provide feedback about the quality of the answers. The system generalizes such feedback to learn how to correctly integrate data. A major open challenge is that under this model, the user only sees and offers feedback on a few ``top'' results: this result set must be carefully selected to include answers of high relevance and answers that are highly informative when feedback is given on them. Existing systems merely focus on predicting relevance, by composing the scores of various schema and record matching algorithms. In this paper, we show how to predict the uncertainty associated with a query result's score, as well as how informative feedback on a given result is. We build upon these foundations to develop an active learning approach to keyword search-based data integration, and we validate the effectiveness of our solution over real data from several very different domains.
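A minimal sketch of the selection idea (the scoring rule and all scores are invented, not the paper's method): rank candidate answers by predicted relevance plus an uncertainty bonus, so the shown results include both good answers and answers on which feedback is informative.

```python
def select_top(candidates, k, beta=1.0):
    """candidates: list of (answer, relevance, uncertainty) triples.
    Rank by relevance + beta * uncertainty and keep the top k."""
    return sorted(candidates,
                  key=lambda c: c[1] + beta * c[2],
                  reverse=True)[:k]

# Hypothetical candidate joins with predicted relevance and uncertainty.
candidates = [
    ("join A-B", 0.9, 0.05),  # confident and relevant
    ("join A-C", 0.6, 0.40),  # uncertain: feedback here is informative
    ("join B-C", 0.5, 0.10),
]
print([a for a, _, _ in select_top(candidates, k=2)])
```

With beta = 0 this degenerates to pure relevance ranking, which is the behavior of the existing systems the abstract contrasts against.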


This paper presents a lower-bound limit analysis approach for solving an axisymmetric stability problem by using the Drucker-Prager (D-P) yield cone in conjunction with finite elements and nonlinear optimization. In principal stress space, the tip of the yield cone has been smoothed by applying a hyperbolic approximation. The nonlinear optimization has been performed by employing an interior point method based on the logarithmic barrier function. A new proposal has also been given to simulate the D-P yield cone with the Mohr-Coulomb hexagonal yield pyramid. For the sake of illustration, the bearing capacity factors N_c, N_q and N_γ have been computed, as functions of φ, for both smooth and rough circular foundations. The results obtained from the analysis compare quite well with solutions reported in the literature.
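The idea of hyperbolically smoothing the cone tip can be sketched on a standard Drucker-Prager yield function (the paper's exact smoothed form and parameters may differ; this is only the common pattern of rounding the apex):

```python
import math

def dp_yield(i1, j2, alpha, k):
    """Drucker-Prager yield function f = alpha*I1 + sqrt(J2) - k,
    which has a singular (non-differentiable) apex at J2 = 0."""
    return alpha * i1 + math.sqrt(j2) - k

def dp_yield_smoothed(i1, j2, alpha, k, a=1e-3):
    """Hyperbolic smoothing: replacing sqrt(J2) with sqrt(J2 + a^2)
    rounds the apex, so the surface is differentiable everywhere."""
    return alpha * i1 + math.sqrt(j2 + a * a) - k

# Away from the tip the two surfaces are numerically indistinguishable.
print(abs(dp_yield(10, 4, 0.2, 1) - dp_yield_smoothed(10, 4, 0.2, 1)) < 1e-6)  # True
```

Smoothing matters for the interior point solver, which needs well-defined gradients of the yield constraint at every stress point.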


I consider theories of gravity built not just from the metric and affine connection, but also from other (possibly higher-rank) symmetric tensors. The Lagrangian densities are scalars built from them, and the volume forms are related to Cayley's hyperdeterminants. The resulting diff-invariant actions give rise to geometric theories that go beyond the metric paradigm (even metric-less theories are possible) and contain Einstein gravity as a special case. Examples include theories with generalizations of Riemannian geometry. The 0-tensor case is related to dilaton gravity. These theories can give rise to new types of spontaneous Lorentz breaking and might be relevant for ``dark'' sector cosmology.
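For concreteness, the simplest hyperdeterminant beyond the matrix determinant is Cayley's 2x2x2 case, which can be written out in full (this is the generic formula for a 2x2x2 hypermatrix, not anything specific to the paper's construction):

```python
def cayley_hyperdet_222(a):
    """Cayley's hyperdeterminant of a 2x2x2 hypermatrix a[i][j][k]."""
    def t(i, j, k):
        return a[i][j][k]
    squares = (t(0,0,0)**2 * t(1,1,1)**2 + t(0,0,1)**2 * t(1,1,0)**2
               + t(0,1,0)**2 * t(1,0,1)**2 + t(0,1,1)**2 * t(1,0,0)**2)
    pairs = (t(0,0,0)*t(0,0,1)*t(1,1,0)*t(1,1,1)
             + t(0,0,0)*t(0,1,0)*t(1,0,1)*t(1,1,1)
             + t(0,0,0)*t(0,1,1)*t(1,0,0)*t(1,1,1)
             + t(0,0,1)*t(0,1,0)*t(1,0,1)*t(1,1,0)
             + t(0,0,1)*t(0,1,1)*t(1,0,0)*t(1,1,0)
             + t(0,1,0)*t(0,1,1)*t(1,0,0)*t(1,0,1))
    quads = (t(0,0,0)*t(0,1,1)*t(1,0,1)*t(1,1,0)
             + t(0,0,1)*t(0,1,0)*t(1,0,0)*t(1,1,1))
    return squares - 2 * pairs + 4 * quads

# A rank-1 (decomposable) hypermatrix, here with all entries 1, is
# degenerate: its hyperdeterminant vanishes.
ones = [[[1, 1], [1, 1]], [[1, 1], [1, 1]]]
print(cayley_hyperdet_222(ones))  # 0
```

Like the ordinary determinant, the hyperdeterminant vanishes exactly on degenerate hypermatrices, which is what makes it a candidate building block for volume forms.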


The bearing capacity of a circular footing lying over a fully cohesive stratum, with an overlying sand layer, is computed using axisymmetric lower-bound limit analysis with finite elements and linear optimization. The effects of the thickness and the internal friction angle of the sand are examined for different combinations of c_u/(γb) and q, where c_u = the undrained shear strength of the cohesive stratum, γ = the unit weight of either layer, b = the footing radius, and q = the surcharge pressure. The results are given in the form of a ratio (η) of the bearing capacity with an overlying sand layer to that of a footing lying directly over the clayey stratum. An overlying medium dense to dense sand layer considerably improves the bearing capacity. The improvement increases continuously with decreases in c_u/(γb) and with increases in φ and q/(γb). A certain optimum thickness of the sand layer exists beyond which no further improvement occurs. This optimum thickness increases with an increase in φ and q and with a decrease in c_u/(γb). Failure patterns are also drawn to examine the effect of including the sand layer. (C) 2015 The Japanese Geotechnical Society. Production and hosting by Elsevier B.V. All rights reserved.