69 results for noncooperative foundations


Relevance: 10.00%

Abstract:

This research presents a new approach to, and the development of, a design methodology based on the perspective of meanings. In this study the design process is explored as the development of a structure of meanings; the processes of searching for and evaluating meanings form the foundations of developing this structure. To facilitate the use and operation of meanings, the WordNet lexical database and an existing visualization of WordNet, Visuwords, are used for the meaning-search process. The basic tool used for the evaluation process is the WordNet::Similarity software, which measures the relatedness of meanings in the database and thereby the degree of interconnection between different meanings. These search and evaluation techniques are then incorporated into our methodology of the structure of meanings to support the design process. The measures of relatedness of meanings are developed into convergence criteria for use in the evaluation processes. The methodology for the structure of meanings developed here is then used to construct meanings in a verification of product design. The steps of the design methodology, including the search and evaluation processes involved in developing the structure of meanings, are elucidated. The choices made by the designer in terms of meanings are supported by subsequent searches and evaluations of meanings to be implemented in the designed product. In conclusion, the paper presents directions for further development and extension of the proposed design methodology.
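
As an illustration of the kind of relatedness measurement the methodology relies on, here is a minimal sketch using NLTK's WordNet interface (an analogue of the Perl WordNet::Similarity package named above, not the paper's own tooling); the words and senses chosen are arbitrary examples.

    from nltk.corpus import wordnet as wn  # assumes the WordNet corpus is installed: nltk.download('wordnet')

    # Pick one sense (synset) per word; in the methodology the designer would
    # select the senses that match the intended meanings.
    chair = wn.synset('chair.n.01')
    seat = wn.synset('seat.n.01')

    # Path similarity: in [0, 1], based on the shortest hypernym-hierarchy path.
    print(chair.path_similarity(seat))

    # Wu-Palmer similarity: based on the depth of the least common subsumer.
    print(chair.wup_similarity(seat))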

Relevance: 10.00%

Abstract:

The method of stress characteristics has been employed to compute the end-bearing capacity of driven piles. The dependency of the soil internal friction angle on the stress level has been incorporated to achieve more realistic predictions for the end-bearing capacity of piles. The validity of the assumption of the superposition principle while using the bearing capacity equation based on soil plasticity concepts, when applied to deep foundations, has been examined. Fourteen pile case histories were compiled with cone penetration tests (CPT) performed in the vicinity of different pile locations. The end-bearing capacity of the piles was computed using different methods, namely, static analysis, effective stress approach, direct CPT, and the proposed approach. The comparison between predictions made by different methods and measured records shows that the stress-level-based method of stress characteristics compares better with experimental data. Finally, the end-bearing capacity of driven piles in sand was expressed in terms of a general expression with the addition of a new factor that accounts for different factors contributing to the bearing capacity. The influence of the soil nonassociative flow rule has also been included to achieve more realistic results.
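
For orientation only (not the paper's proposed expression): a minimal sketch of the classical static-analysis estimate mentioned above, using the standard relation q_b = N_q * sigma'_v for the unit end bearing. The bearing capacity factor, tip stress, and pile diameter below are assumed example values, not data from the study.

    import math

    def end_bearing_load(N_q, sigma_v_eff_kPa, pile_diameter_m):
        """Classical static analysis: Q_b = q_b * A_b with q_b = N_q * sigma'_v."""
        q_b = N_q * sigma_v_eff_kPa                   # unit end bearing, kPa
        A_b = math.pi * pile_diameter_m ** 2 / 4.0    # pile tip area, m^2
        return q_b * A_b                              # end-bearing load, kN

    # Assumed example values: N_q = 40, sigma'_v = 150 kPa at the tip, D = 0.4 m
    print(end_bearing_load(40.0, 150.0, 0.4))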

Relevance: 10.00%

Abstract:

Various logical formalisms with the freeze quantifier have recently been considered for modelling computer systems, even though the freeze quantifier is a powerful mechanism that often leads to undecidability. In this paper, we study a linear-time temporal logic with past-time operators in which the freeze operator is only used to express that some value from an infinite set is repeated in the future or in the past. Such a restriction has been inspired by a recent work on spatio-temporal logics. We show decidability of finitary and infinitary satisfiability by reduction to the verification of temporal properties in Petri nets. This is a surprising result since the logic is closed under negation, contains future-time and past-time temporal operators, and can express the nonce property and its negation. These ingredients are known to lead to undecidability with a more liberal use of the freeze quantifier.
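
For illustration, in the standard freeze-quantifier notation (assumed here from the broader freeze-LTL literature, not quoted from the paper), the nonce property, i.e. that no data value is ever repeated, is typically written as G( ↓1 X G ¬↑1 ), where ↓1 stores the current data value in a register and ↑1 tests equality with the stored value; its negation states that some value is repeated in the future.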

Relevance: 10.00%

Abstract:

Abstract. Let G = (V,E) be a weighted undirected graph with non-negative edge weights. We consider the problem of efficiently computing approximate distances between all pairs of vertices in G. While many efficient algorithms are known for this problem in unweighted graphs, not many results are known for weighted graphs. Zwick [14] showed that for any fixed ε > 0, stretch (1 + ε) distances between all pairs of vertices in a weighted directed graph on n vertices can be computed in Õ(n^ω) time, where ω < 2.376 is the exponent of matrix multiplication. It is known that finding distances of stretch less than 2 between all pairs of vertices in G is at least as hard as Boolean matrix multiplication of two n×n matrices. It is also known that all-pairs stretch 3 distances can be computed in Õ(n^2) time and all-pairs stretch 7/3 distances can be computed in Õ(n^{7/3}) time. Here we consider efficient algorithms for the problem of computing all-pairs stretch (2+ε) distances in G, for any 0 < ε < 1. We show that all-pairs stretch (2+ε) distances, for any fixed ε > 0, can be computed in expected time O(n^{9/4} log n). This algorithm uses a fast rectangular matrix multiplication subroutine. We also present a combinatorial algorithm (that is, one that does not use fast matrix multiplication) with expected running time O(n^{9/4}) for computing all-pairs stretch 5/2 distances in G.
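
For reference (definition assumed from the standard usage in this literature, not quoted from the paper): an estimate d'(u,v) of the true distance d(u,v) has stretch t if d(u,v) <= d'(u,v) <= t * d(u,v) for every pair of vertices u, v.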

Relevance: 10.00%

Abstract:

Among the many different objectives of large scale structural genomics projects are expanding the protein fold space, enhancing understanding of a model or disease-related organism, and providing foundations for structure-based drug discovery. Systematic analysis of protein structures of Mycobacterium tuberculosis has been ongoing towards meeting some of these objectives. Indian participation in these efforts has been enthusiastic and substantial. The proteins of M. tuberculosis chosen for structural analysis by the Indian groups span almost all the functional categories. The structures determined by the Indian groups have led to significant improvement in the biochemical knowledge on these proteins and consequently have started providing useful insights into the biology of M. tuberculosis. Moreover, these structures form starting points for inhibitor design studies, early results of which are encouraging. The progress made by Indian structural biologists in determining structures of M. tuberculosis proteins is highlighted in this review. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance: 10.00%

Abstract:

The study of reaction mechanisms involves systematic investigations of the correlation between structure, reactivity, and time. The challenge is to be able to observe the chemical changes undergone by reactants as they change into products via one or several intermediates such as electronic excited states (singlet and triplet), radicals, radical ions, carbocations, carbanions, carbenes, nitrenes, nitrenium ions, etc. The vast array of intermediates and timescales means there is no single ``do-it-all'' technique. The simultaneous advances in contemporary time-resolved Raman spectroscopic techniques and computational methods have done much towards visualizing molecular fingerprint snapshots of the reactive intermediates in the microsecond to femtosecond time domain. Raman spectroscopy and its sensitive counterpart, resonance Raman spectroscopy, have been well proven as means for determining molecular structure, chemical bonding, reactivity, and dynamics of short-lived intermediates in the solution phase, and are advantageous in comparison to the commonly used time-resolved absorption and emission spectroscopies. Today time-resolved Raman spectroscopy is a mature technique; its development owes much to the advent of pulsed tunable lasers, highly efficient spectrometers, and high-speed, highly sensitive multichannel detectors able to collect a complete spectrum. This review article provides a brief chronological development of the experimental setup and demonstrates how experimentalists have conquered numerous challenges to obtain background-free (removing fluorescence), intense, and highly spectrally resolved Raman spectra in the nanosecond to microsecond (ns-μs) and picosecond (ps) time domains and, perhaps surprisingly, laid the foundations for new techniques such as spatially offset Raman spectroscopy.

Relevance: 10.00%

Abstract:

We consider a general class of timed automata parameterized by a set of “input-determined” operators, in a continuous time setting. We show that for any such set of operators, we have a monadic second order logic characterization of the class of timed languages accepted by the corresponding class of automata. Further, we consider natural timed temporal logics based on these operators, and show that they are expressively equivalent to the first-order fragment of the corresponding MSO logics. As a corollary of these general results we obtain an expressive completeness result for the continuous version of MTL.
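
As a small illustrative example (not taken from the paper), a typical continuous-time MTL formula is □(p → ◇_[0,5] q), read as: every p is followed by a q within 5 time units.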

Relevance: 10.00%

Abstract:

A $k$-box $B=(R_1,...,R_k)$, where each $R_i$ is a closed interval on the real line, is defined to be the Cartesian product $R_1\times R_2\times ...\times R_k$. If each $R_i$ is a unit length interval, we call $B$ a $k$-cube. The boxicity of a graph $G$, denoted as $\boxi(G)$, is the minimum integer $k$ such that $G$ is an intersection graph of $k$-boxes. Similarly, the cubicity of $G$, denoted as $\cubi(G)$, is the minimum integer $k$ such that $G$ is an intersection graph of $k$-cubes. It was shown in [L. Sunil Chandran, Mathew C. Francis, and Naveen Sivadasan: Representing graphs as the intersection of axis-parallel cubes. MCDES-2008, IISc Centenary Conference, available at CoRR, abs/cs/0607092, 2006.] that, for a graph $G$ with maximum degree $\Delta$, $\cubi(G)\leq \lceil 4(\Delta +1)\log n\rceil$. In this paper, we show that, for a $k$-degenerate graph $G$, $\cubi(G) \leq (k+2) \lceil 2e \log n \rceil$. Since $k$ is at most $\Delta$ and can be much lower, this clearly is a stronger result. This bound is tight. We also give an efficient deterministic algorithm that runs in $O(n^2k)$ time to output an $8k(\lceil 2.42 \log n\rceil + 1)$-dimensional cube representation for $G$. An important consequence of the above result is that if the crossing number of a graph $G$ is $t$, then $\boxi(G)$ is $O(t^{1/4}{\lceil\log t\rceil}^{3/4})$. This bound is tight up to a factor of $O((\log t)^{1/4})$. We also show that, if $G$ has $n$ vertices, then $\cubi(G)$ is $O(\log n + t^{1/4}\log t)$. Using our bound for the cubicity of $k$-degenerate graphs, we show that the cubicity of almost all graphs in the $\mathcal{G}(n,m)$ model is $O(d_{av}\log n)$, where $d_{av}$ denotes the average degree of the graph under consideration.
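
A minimal sketch of the underlying definition (illustrative only; the representation is assumed): two $k$-boxes intersect exactly when their intervals overlap in every coordinate, and a graph has boxicity at most $k$ when its vertices can be assigned $k$-boxes so that adjacency coincides with box intersection.

    def boxes_intersect(box_a, box_b):
        """box_a, box_b: lists of (lo, hi) closed intervals of equal length k."""
        return all(lo_a <= hi_b and lo_b <= hi_a
                   for (lo_a, hi_a), (lo_b, hi_b) in zip(box_a, box_b))

    # Example: two 2-boxes (axis-parallel rectangles) with an overlapping region.
    print(boxes_intersect([(0, 2), (0, 2)], [(1, 3), (1, 3)]))  # True
    print(boxes_intersect([(0, 1), (0, 1)], [(2, 3), (0, 1)]))  # False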

Relevance: 10.00%

Abstract:

A path in an edge colored graph is said to be a rainbow path if no two edges on the path have the same color. An edge colored graph is (strongly) rainbow connected if there exists a (geodesic) rainbow path between every pair of vertices. The rainbow connectivity of a graph G, denoted by rc(G) (respectively, its strong rainbow connectivity, denoted by src(G)), is the smallest number of colors required to edge color the graph such that G is (strongly) rainbow connected. In this paper we study the rainbow connectivity problem and the strong rainbow connectivity problem from a computational point of view. Our main results can be summarised as follows: 1) For every fixed k >= 3, it is NP-complete to decide whether src(G) <= k, even when the graph G is bipartite. 2) For every fixed odd k >= 3, it is NP-complete to decide whether rc(G) <= k. This resolves one of the open problems posed by Chakraborty et al. (J. Comb. Opt., 2011), where they prove the hardness for the even case. 3) The following problem is fixed-parameter tractable: given a graph G, determine the maximum number of pairs of vertices that can be rainbow connected using two colors. 4) For a directed graph G, it is NP-complete to decide whether rc(G) <= 2.
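
A minimal sketch of the basic definition used above (the representation is assumed: edges keyed as frozensets of their endpoints).

    def is_rainbow_path(path, coloring):
        """path: list of vertices; coloring: dict mapping frozenset({u, v}) -> color.
        A path is rainbow if no two of its edges share a color."""
        colors = [coloring[frozenset(e)] for e in zip(path, path[1:])]
        return len(colors) == len(set(colors))

    # Example: a 3-edge path colored 1, 2, 3 is rainbow.
    coloring = {frozenset({'a', 'b'}): 1, frozenset({'b', 'c'}): 2,
                frozenset({'c', 'd'}): 3}
    print(is_rainbow_path(['a', 'b', 'c', 'd'], coloring))  # True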

Relevance: 10.00%

Abstract:

Recently it has been discovered, contrary to the expectations of physicists as well as biologists, that the energy transport during photosynthesis, from the chlorophyll pigment that captures the photon to the reaction centre where glucose is synthesised from carbon dioxide and water, is highly coherent even at ambient temperature and in the cellular environment. This process and the key molecular ingredients that it depends on are described. By looking at the process from the computer science viewpoint, we can study what has been optimised and how. A spatial search algorithmic model based on robust features of wave dynamics is presented.

Relevance: 10.00%

Abstract:

Large software systems are developed by composing multiple programs. If the programs manipulate and exchange complex data, such as network packets or files, it is essential to establish that they follow compatible data formats. Most of the complexity of data formats is associated with the headers. In this paper, we address compatibility of programs operating over headers of network packets, files, images, etc. As format specifications are rarely available, we infer the format associated with headers by a program as a set of guarded layouts. In terms of these formats, we define and check compatibility of (a) producer-consumer programs and (b) different versions of producer (or consumer) programs. A compatible producer-consumer pair is free of type mismatches and logical incompatibilities such as the consumer rejecting valid outputs generated by the producer. A backward compatible producer (resp. consumer) is guaranteed to be compatible with consumers (resp. producers) that were compatible with its older version. With our prototype tool, we identified 5 known bugs and 1 potential bug in (a) sender-receiver modules of Linux network drivers of 3 vendors and (b) different versions of a TIFF image library.
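
A purely hypothetical illustration of the idea of a format as a set of guarded layouts (the representation, field names and guard conditions below are invented for exposition and are not the paper's notation).

    # A format is modelled as a list of (guard, layout) pairs: the guard is a
    # predicate over parsed field values, and the layout maps field names to
    # widths in bytes. A concrete header is accepted if some guard holds and
    # the header carries exactly that layout's fields.
    PACKET_FORMAT = [
        (lambda h: h.get('version') == 4, {'version': 1, 'length': 2, 'proto': 1}),
        (lambda h: h.get('version') == 6, {'version': 1, 'flow': 3, 'proto': 1}),
    ]

    def accepts(fmt, header):
        return any(guard(header) and set(header) == set(layout)
                   for guard, layout in fmt)

    # A producer emitting this header is compatible with a consumer whose
    # format accepts it.
    print(accepts(PACKET_FORMAT, {'version': 4, 'length': 20, 'proto': 6}))  # True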

Relevance: 10.00%

Abstract:

This paper presents the case history of the construction of a 3 m high embankment on a geocell foundation over soft, settled red mud. Red mud is a waste product of the Bayer process in the aluminum industry. The geotechnical problems of the site, the design of the geocell foundation based on experimental investigation, and the construction sequence of the geocell foundations in the field are discussed in the paper. Based on the experimental studies, an analytical model was also developed to estimate the load carrying capacity of the soft clay bed reinforced with geocell and with a combination of geocell and geogrid. The results of the experimental and analytical studies revealed that using the combination of geocell and geogrid is always more beneficial than using the geocell alone. Hence, the combination of geocell and geogrid was recommended to stabilize the embankment base. The reported embankment is located in Lanjigharh (Orissa) in India. Construction of the embankment on the geocell foundation has already been completed, and the constructed embankment has sustained two monsoon rains without any cracks or seepage. (C) 2013 Elsevier Ltd. All rights reserved.

Relevance: 10.00%

Abstract:

Sacred groves are patches of forest preserved for their spiritual and religious significance. The practice gained relevance with the spread of agriculture, which caused large-scale deforestation affecting biodiversity and watersheds. Sacred groves may be losing their prominence nowadays, but they are still relevant in Indian rural landscapes inhabited by traditional communities. The recent rise of interest in this tradition has encouraged scientific study which, despite the tradition's pan-Indian distribution, has focused on India's northeast, the Western Ghats and the east coast, either for their global/regional importance or for their unique ecosystems. Most studies focused on flora, mainly angiosperms, and the faunal studies concentrated on vertebrates, while lower life forms were grossly neglected. Studies on ecosystem functioning are few, although observations are available. Most studies attributed watershed protection values to sacred groves but hardly highlighted hydrological processes or water yield in comparison with other land use types. Grove studies need to diversify from this stereotyped path and move towards creating credible scientific foundations for conservation. Documentation should continue in unexplored areas, but more work is needed on basic ecological functions and ecosystem dynamics to strengthen planning for scientifically sound sacred grove management.

Relevance: 10.00%

Abstract:

We investigate the parameterized complexity of the following edge coloring problem motivated by the problem of channel assignment in wireless networks. For an integer q >= 2 and a graph G, the goal is to find a coloring of the edges of G with the maximum number of colors such that every vertex of the graph sees at most q colors. This problem is NP-hard for q >= 2, and has been well-studied from the point of view of approximation. Our main focus is the case when q = 2, which is already theoretically intricate and practically relevant. We show fixed-parameter tractable algorithms for both the standard and the dual parameter, and for the latter problem, the result is based on a linear vertex kernel.
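
A minimal sketch of the feasibility condition (the representation is assumed; this checks a given coloring rather than solving the optimization problem).

    from collections import defaultdict

    def sees_at_most_q(edges, coloring, q):
        """edges: list of (u, v) pairs; coloring: dict mapping each edge to a color.
        Returns True if every vertex sees at most q distinct colors."""
        seen = defaultdict(set)
        for (u, v) in edges:
            seen[u].add(coloring[(u, v)])
            seen[v].add(coloring[(u, v)])
        return all(len(cols) <= q for cols in seen.values())

    # A triangle with three distinct edge colors: each vertex sees 2 colors,
    # so the coloring is feasible for q = 2 but not for q = 1.
    edges = [('a', 'b'), ('b', 'c'), ('c', 'a')]
    coloring = {('a', 'b'): 1, ('b', 'c'): 2, ('c', 'a'): 3}
    print(sees_at_most_q(edges, coloring, 2))  # True
    print(sees_at_most_q(edges, coloring, 1))  # False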

Relevance: 10.00%

Abstract:

The correlation clustering problem is a fundamental problem in both theory and practice, and it involves identifying clusters of objects in a data set based on their similarity. A traditional modeling of this question as a graph-theoretic problem associates vertices with data points and indicates similarity by adjacency. Clusters then correspond to cliques in the graph. The resulting optimization problem, Cluster Editing, and several of its variants are very well studied algorithmically. In many situations, however, translating clusters to cliques can be somewhat restrictive. A more flexible notion would be that of a structure where the vertices are mutually ``not too far apart'', without necessarily being adjacent. One such generalization is realized by structures called s-clubs, which are graphs of diameter at most s. In this work, we study the question of finding a set of at most k edges whose removal leaves us with a graph whose components are s-clubs. Recently, it has been shown that, unless the Exponential Time Hypothesis (ETH) fails, Cluster Editing (whose components are 1-clubs) does not admit a sub-exponential time algorithm [STACS, 2013]. That is, there is no algorithm solving the problem in time 2^{o(k)} n^{O(1)}. However, surprisingly, they show that when the number of cliques in the output graph is restricted to d, the problem can be solved in time O(2^{O(sqrt(dk))} + m + n). We show that this sub-exponential time algorithm for a fixed number of cliques is the exception rather than the rule. Our first result shows that, assuming the ETH, there is no algorithm solving the s-Club Cluster Edge Deletion problem in time 2^{o(k)} n^{O(1)}. We show, further, that even the problem of deleting edges to obtain a graph with d s-clubs cannot be solved in time 2^{o(k)} n^{O(1)} for any fixed s, d >= 2. This is in radical contrast to the situation established for cliques, where sub-exponential algorithms are known.
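
A minimal sketch of the s-club definition these results are about (plain breadth-first search over an adjacency dictionary; the graph below is an assumed example).

    from collections import deque

    def is_s_club(adj, S, s):
        """True if the subgraph induced on vertex set S has diameter at most s."""
        S = set(S)
        for src in S:
            dist = {src: 0}
            queue = deque([src])
            while queue:
                u = queue.popleft()
                for v in adj[u]:
                    if v in S and v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            if any(w not in dist or dist[w] > s for w in S):
                return False
        return True

    # Example: a path on 4 vertices has diameter 3, so it is a 3-club but not a 2-club.
    adj = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
    print(is_s_club(adj, [1, 2, 3, 4], 3))  # True
    print(is_s_club(adj, [1, 2, 3, 4], 2))  # False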