966 results for Explanatory combinatorial lexicology
Abstract:
Richard Lewontin proposed that a scientific field's ability to create a narrative for public understanding garners it social relevance. This article applies Lewontin's conceptual framework of the functions of science (manipulatory and explanatory) to compare and explain the current differences in the perceived societal relevance of genetics/genomics and proteomics. We provide three examples to illustrate the social relevance and strong cultural narrative of genetics/genomics, for which no counterpart exists for proteomics. We argue that the major difference between genetics/genomics and proteomics is that genomics has a strong explanatory function, owing to the strong cultural narrative of heredity. Based on qualitative interviews and observations at proteomics conferences, we suggest that the nature of proteins, the lack of public understanding, and theoretical complexity exacerbate this difference for proteomics. Lewontin's framework suggests that social scientists may find that omics sciences affect social relations in ways that differ from those identified in past analyses of genetics.
Abstract:
This thesis describes current and past n-in-one methods and presents three early experimental studies, using mass spectrometry and the triple quadrupole instrument, on the application of n-in-one in drug discovery. The n-in-one strategy pools and mixes samples in drug discovery prior to measurement or analysis, which allows the most promising compounds to be rapidly identified and then analysed. Nowadays, the properties of drugs are characterised earlier and in parallel with pharmacological efficacy. The studies presented here use in vitro methods, such as Caco-2 cells and immobilized artificial membrane chromatography, for drug absorption and lipophilicity measurements. The high sensitivity and selectivity of liquid chromatography-mass spectrometry are especially important for new analytical methods using n-in-one. In the first study, the fragmentation patterns of ten nitrophenoxy benzoate compounds, a homologous series, were characterised and the presence of the compounds was determined in a combinatorial library. The influence of one or two nitro substituents, and of alkyl chain lengths from methyl to pentyl, on collision-induced fragmentation was studied, and interesting structure-fragmentation relationships were detected. Compounds with two nitro groups fragmented more than those with one, whereas less fragmentation was noted in molecules with a longer alkyl chain. The most abundant product ions were nitrophenoxy ions, which were also used in precursor ion screening of the combinatorial library. In the second study, the immobilized artificial membrane chromatographic method was transferred from ultraviolet detection to mass spectrometric analysis and a new method was developed. Mass spectra were scanned and the chromatographic retention of compounds was analysed using extracted ion chromatograms. When detectors and buffers were changed and n-in-one was incorporated into the method, the results showed good correlation. Finally, the results demonstrated that mass spectrometric detection with gradient elution can provide a rapid and convenient n-in-one method for ranking the lipophilic properties of several structurally diverse compounds simultaneously. In the final study, a new method was developed for Caco-2 samples. Compounds were separated by liquid chromatography and quantified by selected reaction monitoring using mass spectrometry. This method was used for Caco-2 samples in which the absorption of ten chemically and physiologically different compounds was screened using both single and n-in-one approaches. These three studies used mass spectrometry for compound identification, method transfer and quantitation in the area of mixture analysis. A different mass spectrometric scanning mode of the triple quadrupole instrument was used in each method. Early drug discovery with n-in-one is an area where mass spectrometric analysis, its possibilities and proper use, is especially important.
Abstract:
According to certain arguments, computation is observer-relative either in the sense that many physical systems implement many computations (Hilary Putnam), or in the sense that almost all physical systems implement all computations (John Searle). If sound, these arguments have a potentially devastating consequence for the computational theory of mind: if arbitrary physical systems can be seen to implement arbitrary computations, the notion of computation seems to lose all explanatory power as far as brains and minds are concerned. David Chalmers and B. Jack Copeland have attempted to counter these relativist arguments by placing certain constraints on the definition of implementation. In this thesis, I examine their proposals and find both wanting in some respects. In the course of this examination, I give a formal definition of the class of combinatorial-state automata, upon which Chalmers's account of implementation is based. I show that this definition implies two theorems (one an observation due to Curtis Brown) concerning the computational power of combinatorial-state automata, theorems which speak against founding the theory of implementation upon this formalism. Toward the end of the thesis, I sketch a definition of the implementation of Turing machines in dynamical systems, and offer this as an alternative to Chalmers's and Copeland's accounts of implementation. I demonstrate that the definition does not imply Searle's claim of the universal implementation of computations. However, the definition may support claims that are weaker than Searle's, yet still troubling to the computationalist. There remains a kernel of relativity in implementation in any case, since the interpretation of physical systems seems itself to be an observer-relative matter, to some degree at least. This observation helps clarify the role the notion of computation can play in cognitive science. Specifically, I argue that the notion should be conceived as an instrumental rather than a fundamental or foundational one.
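To make the notion concrete, here is a minimal sketch of a combinatorial-state automaton in roughly the sense the thesis attributes to Chalmers: the state is a vector of substates, and each component of the next state is a function of the whole current state vector and the input vector. The class and rule names below are illustrative, not drawn from the thesis.

```python
from dataclasses import dataclass
from typing import Callable, Sequence, Tuple

State = Tuple[int, ...]   # a vector of substates
Symbol = Tuple[int, ...]  # a vector of input elements

@dataclass
class CSA:
    """Combinatorial-state automaton (informal reading): each component of
    the next state is determined by the whole current state vector plus
    the input vector."""
    component_rules: Sequence[Callable[[State, Symbol], int]]

    def step(self, state: State, symbol: Symbol) -> State:
        return tuple(rule(state, symbol) for rule in self.component_rules)

# Toy example: component 0 XORs the input bit into itself; component 1 is
# a one-step delayed copy of component 0.
csa = CSA(component_rules=[
    lambda s, x: s[0] ^ x[0],
    lambda s, x: s[0],
])
state = (0, 0)
for bit in [(1,), (1,), (0,)]:
    state = csa.step(state, bit)
print(state)  # (0, 0)
```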
Abstract:
A simplicial complex is said to satisfy complementarity if exactly one of each complementary pair of nonempty vertex-sets constitutes a face of the complex.
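As a concrete reading of the condition, the following sketch checks complementarity by brute force for a small complex; all names are hypothetical, and the enumeration is exponential in the number of vertices.

```python
from itertools import combinations

def satisfies_complementarity(vertices, faces):
    """Check that for every complementary pair (A, V \\ A) of nonempty,
    proper vertex subsets, exactly one of the two is a face."""
    faces = {frozenset(f) for f in faces}
    V = frozenset(vertices)
    seen = set()
    for r in range(1, len(V)):  # nonempty proper subsets only
        for A in map(frozenset, combinations(V, r)):
            if A in seen:
                continue
            B = V - A
            seen.add(A); seen.add(B)
            if (A in faces) == (B in faces):  # both or neither: violation
                return False
    return True

# Example: on V = {1, 2}, the complex whose only nonempty face is {1}
# satisfies complementarity.
print(satisfies_complementarity({1, 2}, [{1}]))  # True
```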
Abstract:
The aim of logic synthesis is to produce circuits which realise the given Boolean function while meeting timing constraints and requiring the minimum silicon area. Logic synthesis involves two steps, namely logic decomposition and technology mapping. Existing methods treat the two as separate operations. The traditional approach is to minimize the number of literals without considering the target technology during the decomposition phase. The decomposed expressions are then mapped onto the target technology to optimize the area, and timing optimization is carried out subsequently. A new approach which treats logic decomposition and technology mapping as a single operation is presented. The logic decomposition is based on the parameters of the target technology, and area and timing optimization is carried out during the logic decomposition phase itself. Results on the MCNC benchmark circuits show that this method produces circuits which are 38% faster while requiring a 14% increase in area.
Abstract:
Combinatorial exchanges are double-sided marketplaces with multiple sellers and multiple buyers trading with the help of combinatorial bids. The allocation and other associated problems in such exchanges are known to be among the hardest to solve of all economic mechanisms. In this paper, we develop computationally efficient iterative auction mechanisms for clearing combinatorial exchanges. Our mechanisms satisfy the individual rationality (IR) and budget nonnegativity (BN) properties. We also show that our method is bounded and convergent. Our numerical experiments show that the algorithm produces good-quality solutions and is computationally efficient.
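For illustration, the allocation (winner-determination) problem of a single-unit combinatorial exchange can be written as an integer program. The sketch below uses the PuLP solver interface and made-up bids and asks, and simplifies the model (no XOR bid language, free disposal on the supply side); it is not the paper's formulation.

```python
# Illustrative winner-determination model for a single-unit combinatorial
# exchange, using the PuLP modelling library and made-up bids/asks.
import pulp

buy_bids  = {"b1": ({"A", "B"}, 10), "b2": ({"B"}, 4)}  # bundle, willingness to pay
sell_asks = {"s1": ({"A"}, 3), "s2": ({"B"}, 2)}        # bundle, asking price
items = {"A", "B"}

prob = pulp.LpProblem("exchange_surplus", pulp.LpMaximize)
x = {b: pulp.LpVariable(f"x_{b}", cat="Binary") for b in buy_bids}   # accept buy bid?
y = {s: pulp.LpVariable(f"y_{s}", cat="Binary") for s in sell_asks}  # accept sell ask?

# Objective: surplus = accepted buyer value minus accepted seller cost.
prob += (pulp.lpSum(v * x[b] for b, (_, v) in buy_bids.items())
         - pulp.lpSum(c * y[s] for s, (_, c) in sell_asks.items()))

# Material balance with free disposal: every item sold to buyers must be
# covered by accepted asks (oversupply is allowed and simply discarded).
for item in items:
    prob += (pulp.lpSum(x[b] for b, (B, _) in buy_bids.items() if item in B)
             <= pulp.lpSum(y[s] for s, (S, _) in sell_asks.items() if item in S))

prob.solve(pulp.PULP_CBC_CMD(msg=False))
accepted = [k for k, var in {**x, **y}.items() if var.value() == 1]
print(accepted, pulp.value(prob.objective))  # e.g. ['b1', 's1', 's2'] 5.0
```

Because the empty allocation is feasible, the maximized surplus is nonnegative, so the exchange never pays sellers more than it collects from buyers; this is the budget-nonnegativity property the abstract refers to.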
Abstract:
Combinatorial exchanges are double-sided marketplaces with multiple sellers and multiple buyers trading with the help of combinatorial bids. The allocation and other associated problems in such exchanges are known to be among the hardest to solve of all economic mechanisms. It has been shown that the problems of surplus maximization and volume maximization in combinatorial exchanges are inapproximable even with free disposal. In this paper, the surplus maximization problem is formulated as an integer linear programming problem, and we propose a Lagrangian-relaxation-based heuristic to find a near-optimal solution. We develop computationally efficient tâtonnement mechanisms for clearing combinatorial exchanges, in which the Lagrangian multipliers can be interpreted as the prices of the items set by the exchange in each iteration, as sketched below. Our mechanisms satisfy the individual rationality and budget nonnegativity properties. Computational experiments on representative data sets show that the proposed heuristic produces a feasible solution with negligible optimality gap.
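A generic version of such a price-based iteration looks as follows: relaxing the per-item balance constraints with multipliers decouples the bids, and a subgradient step moves each multiplier, read as an item price, in the direction of excess demand. This is a schematic sketch under simplifying assumptions (single units, diminishing step sizes), not the authors' heuristic.

```python
# Schematic tâtonnement via Lagrangian relaxation: relax the per-item
# balance constraints with multipliers (the "prices"); at fixed prices
# each bid/ask is accepted independently, and a diminishing subgradient
# step moves each price toward balancing demand and supply.

def tatonnement(buy_bids, sell_asks, items, rounds=200, step=1.0):
    prices = {i: 0.0 for i in items}
    for t in range(1, rounds + 1):
        # Pricing decouples the bids: accept iff utility is nonnegative.
        x = {b: v >= sum(prices[i] for i in B)
             for b, (B, v) in buy_bids.items()}
        y = {s: sum(prices[i] for i in S) >= c
             for s, (S, c) in sell_asks.items()}
        for i in items:
            demand = sum(1 for b, (B, _) in buy_bids.items() if x[b] and i in B)
            supply = sum(1 for s, (S, _) in sell_asks.items() if y[s] and i in S)
            # Subgradient step on the excess demand for item i.
            prices[i] = max(0.0, prices[i] + (step / t) * (demand - supply))
    return prices

prices = tatonnement({"b1": ({"A", "B"}, 10), "b2": ({"B"}, 4)},
                     {"s1": ({"A"}, 3), "s2": ({"B"}, 2)}, {"A", "B"})
print(prices)
```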
Abstract:
We prove a lower bound of Ω((1/ε)(m + log(d − a))), where a = ⌊log_m(1/(4ε))⌋, on the hitting-set size for combinatorial rectangles of volume at least ε in [m]^d, for ε ∈ [m^(−(d−2)), 2/9] and d > 2. © 2002 Elsevier Science B.V. All rights reserved.
Abstract:
Web services are now a key ingredient of software services offered by software enterprises. Many standardized web services are now available as commodity offerings from web service providers. An important problem for a web service requester is the web service composition problem, which involves selecting the right mix of web service offerings to execute an end-to-end business process. Web service offerings are now available in bundled form as composite web services, and, more recently, volume discounts are also on offer, based on the number of web service executions requested. In this paper, we develop efficient algorithms for the web service composition problem in the presence of composite web service offerings and volume discounts. We model this problem as a combinatorial auction with volume discounts. We first develop efficient polynomial-time algorithms for the case where the end-to-end service involves a linear workflow of web services, and then for the case where it involves a tree workflow.
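As a simplified illustration of the linear-workflow case, suppose each (composite) offer covers a contiguous block of tasks at a fixed price; the composition problem then reduces to a shortest-path dynamic program. The contiguity assumption and all names here are ours, and the paper's model, including its volume-discount handling, is richer.

```python
# Cheapest composition of a linear workflow of n tasks, where each offer
# (start, end, price) covers the contiguous tasks start..end (1-based).
import math

def cheapest_composition(n, offers):
    INF = math.inf
    dp = [0.0] + [INF] * n          # dp[i] = min cost to cover tasks 1..i
    choice = [None] * (n + 1)       # offer used to reach dp[i]
    for i in range(1, n + 1):
        for (s, e, p) in offers:
            if e == i and dp[s - 1] + p < dp[i]:
                dp[i] = dp[s - 1] + p
                choice[i] = (s, e, p)
    plan, i = [], n                 # trace back the chosen offers
    while i > 0 and choice[i]:
        plan.append(choice[i])
        i = choice[i][0] - 1
    return dp[n], plan[::-1]

# Example: 3 tasks; the bundle for tasks 1-2 plus a single-task offer wins.
print(cheapest_composition(3, [(1, 1, 5), (2, 2, 5), (3, 3, 4),
                               (1, 2, 7), (2, 3, 8)]))
# (11.0, [(1, 2, 7), (3, 3, 4)])
```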
Abstract:
An elementary combinatorial Tanner-graph construction for a family of near-regular low-density parity-check (LDPC) codes achieving high girth is presented. These codes are near-regular in the sense that the degree of a left/right vertex is allowed to differ by at most one from the average. The construction yields, in quadratic time, an asymptotic code family with provable lower bounds on the rate and the girth for a given choice of block length and average degree. The construction gives flexibility in the choice of the code's design parameters, such as rate, girth and average degree. Performance simulations of an iterative decoding algorithm on the AWGN channel, for codes designed using this method, demonstrate that they perform better than regular PEG codes and MacKay codes of similar length at all signal-to-noise ratios.
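For context, the PEG codes used as a baseline are built by a greedy girth-conscious procedure: each new edge at a variable node goes to a check node unreachable from it in the current graph, or failing that, as far away as possible. The sketch below is of that PEG style, not the paper's near-regular construction, and its helper names are illustrative.

```python
# PEG-style greedy Tanner-graph growth, avoiding short cycles.

def peg_tanner_graph(n_vars, n_checks, var_degree):
    adj_v = [set() for _ in range(n_vars)]    # variable node -> check nodes
    adj_c = [set() for _ in range(n_checks)]  # check node -> variable nodes
    for v in range(n_vars):
        for _ in range(var_degree):
            dist = check_distances(v, adj_v, adj_c, n_checks)
            # Prefer unreachable checks; among equals, maximize distance,
            # then pick the lowest-degree check to stay near-regular.
            c = min(range(n_checks),
                    key=lambda c: (dist[c] is not None,
                                   -(dist[c] if dist[c] is not None else 0),
                                   len(adj_c[c])))
            adj_v[v].add(c)
            adj_c[c].add(v)
    return adj_v, adj_c

def check_distances(v, adj_v, adj_c, n_checks):
    """BFS from variable v; dist[c] = check-hop depth of check c, or None."""
    dist = [None] * n_checks
    seen_v, frontier, d = {v}, set(adj_v[v]), 0
    while frontier:
        for c in frontier:
            dist[c] = d
        reached = set()
        for c in frontier:
            for u in adj_c[c]:
                if u not in seen_v:
                    seen_v.add(u)
                    reached |= adj_v[u]
        frontier = {c for c in reached if dist[c] is None}
        d += 1
    return dist

# Example: a small graph with 12 variable and 6 check nodes of degree 3.
adj_v, adj_c = peg_tanner_graph(12, 6, 3)
print([len(s) for s in adj_c])  # inspect the check-node degree profile
```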
Abstract:
We look at graphical descriptions of block codes known as trellises, which illustrate connections between algebra and graph theory and can be used to develop powerful decoding algorithms. Trellis sizes for linear block codes are known to grow exponentially with the code parameters. Of considerable interest to coding theorists, therefore, are more compact descriptions called tail-biting trellises, which in some cases can be much smaller than any conventional trellis for the same code. We derive some interesting properties of tail-biting trellises and present a new decoding algorithm.
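A baseline decoder for a tail-biting trellis, against which any new algorithm can be compared, simply runs the Viterbi algorithm once per possible starting state and keeps the cheapest path that closes on itself. The sketch below assumes a particular trellis representation of our own choosing; it is illustrative only and is not the thesis's new algorithm.

```python
# Brute-force tail-biting Viterbi: a tail-biting codeword corresponds to a
# closed trellis path (start state == end state), so run ordinary Viterbi
# once per candidate start state and keep the cheapest closed path.
import math

def decode_tailbiting(trellis, metrics):
    """trellis: list of sections; section t maps (state, symbol) -> next state.
    metrics: list of dicts; metrics[t][symbol] is the branch metric at time t."""
    start_states = {s for (s, _) in trellis[0]}
    best_cost, best_path = math.inf, None
    for s0 in start_states:
        cost, back = {s0: 0.0}, []
        for t, section in enumerate(trellis):
            nxt, bp = {}, {}
            for (s, sym), s2 in section.items():
                if s in cost:
                    c = cost[s] + metrics[t][sym]
                    if c < nxt.get(s2, math.inf):
                        nxt[s2], bp[s2] = c, (s, sym)
            cost = nxt
            back.append(bp)
        if cost.get(s0, math.inf) < best_cost:   # path must close on s0
            path, s = [], s0
            for bp in reversed(back):            # trace back the winning path
                s, sym = bp[s]
                path.append(sym)
            best_cost, best_path = cost[s0], path[::-1]
    return best_cost, best_path
```

Brute force multiplies the work by the number of states, which is exactly why compact tail-biting trellises, with far fewer states than conventional ones, make this approach attractive.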
Abstract:
We demonstrate the possibility of accelerated identification of potential compositions for high-temperature shape memory alloys (SMAs) through a combinatorial material synthesis and analysis approach, wherein we employ a combination of diffusion-couple and indentation techniques. The former was utilized to generate smooth, compositionally graded inter-diffusion zones (IDZs) in the Ni-Ti-Pd ternary alloy system, with IDZ thickness varying according to the annealing time at high temperature. The IDZs thus produced were then impressed with a spherical-tipped indenter so as to inscribe a predetermined indentation strain. Subsequent annealing of the indented samples at various elevated temperatures, T_a, ranging between 150 and 550 °C, allows for partial to full relaxation of the imposed strain due to the shape memory effect. If T_a is above the austenite finish temperature, A_f, the relaxation will be complete. By measuring the depth recovery, which serves as a proxy for the shape recovery characteristic of the SMA, a three-dimensional map in the recovery-temperature-composition space is constructed. A comparison of the published A_f data for different compositions with the T_a data shows good agreement when the depth recovery is between 70% and 80%, indicating that the methodology proposed in this paper can be utilized for the identification of promising compositions. Advantages and further possibilities of this methodology are discussed.
Abstract:
We introduce k-stellated spheres and consider the class W_k(d) of triangulated d-manifolds, all of whose vertex links are k-stellated, and its subclass W_k^*(d), consisting of the (k+1)-neighbourly members of W_k(d). We introduce the μ-vector of any simplicial complex and show that, in the case of 2-neighbourly simplicial complexes, the μ-vector dominates the vector of Betti numbers componentwise; the two vectors are equal precisely for tight simplicial complexes. We are able to estimate/compute certain alternating sums of the components of the μ-vector of any 2-neighbourly member of W_k(d) for d ≥ 2k. As a consequence of this theory, we prove a lower bound theorem for such triangulated manifolds, and we determine the integral homology type of members of W_k^*(d) for d ≥ 2k + 2. As another application, we prove that, when d ≠ 2k + 1, all members of W_k^*(d) are tight. We also characterize the tight members of W_k^*(2k + 1) in terms of their kth Betti numbers. These results more or less answer a recent question of Effenberger, and also provide a uniform and conceptual tightness proof for all except two of the known tight triangulated manifolds. We also prove a lower bound theorem for homology manifolds in which the members of W_1(d) provide the equality case. This generalizes a result (the d = 4 case) due to Walkup and Kühnel. As a consequence, it is shown that every tight member of W_1(d) is strongly minimal, thus providing substantial evidence in favour of a conjecture of Kühnel and Lutz asserting that tight homology manifolds should be strongly minimal. © 2013 Elsevier Ltd. All rights reserved.
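The componentwise dominance asserted above can be stated compactly. The following paraphrases the abstract's claim, assuming the standard reading in which μ_i and β_i denote the ith components of the μ-vector and the Betti vector:

```latex
% For any 2-neighbourly simplicial complex X (paraphrasing the abstract):
\mu_i(X) \;\ge\; \beta_i(X) \quad \text{for all } i \ge 0,
\qquad \text{with } \mu_i(X) = \beta_i(X) \text{ for all } i
\iff X \text{ is tight.}
```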
Abstract:
The structural landscape of acid-pyridine cocrystals is explored by adopting a combinatorial matrix method with 4-substituted benzoic acids and 4-substituted pyridines. The choice of system restricts the primary synthon to the robust acid-pyridine entity; this methodology accordingly provides hints toward the formation of secondary synthons. The pK_a rule is validated across the landscape by taking all components of the matrix together and exploring it as a whole. Along with the global features, the exploration of the landscape reveals some local features: apart from the identification of secondary synthons, it also sheds light on the propensity for hydration in cocrystals, synthon competition, and certain topological similarities. The method described here combines two approaches, namely database analysis and high-throughput crystallography, to extract more information with minimal extra experimental effort.