631 results for algorithmic skeletons
Abstract:
Grattan, J.P., Al-Saad, Z., Gilbertson, D.D., Karaki, L.O., Pyatt, F.B. (2005) 'Analyses of patterns of copper and lead mineralisation in human skeletons excavated from an ancient mining and smelting centre in the Jordanian desert', Mineralogical Magazine 69(5), 653-666.
Abstract:
Pyatt, F.B., Pyatt, A.J., Walker, C., Sheen, T., Grattan, J.P. (2003) 'The heavy metal content of skeletons from an ancient metalliferous polluted area in southern Jordan with particular reference to bioaccumulation and human health', Ecotoxicology & Environmental Safety 60, 295-300.
Abstract:
Hill, Joe M., Lloyd, Noel G., Pearson, Jane M. (2007) 'Algorithmic derivation of isochronicity conditions', Nonlinear Analysis 67, 52-69.
Abstract:
Estimation of the skeleton of a directed acyclic graph (DAG) is of great importance for understanding the underlying DAG, and causal effects can be assessed from the skeleton even when the DAG itself is not identifiable. We propose a novel method named PenPC to estimate the skeleton of a high-dimensional DAG in two steps. We first estimate the nonzero entries of a concentration matrix using penalized regression, and then fix the difference between the concentration matrix and the skeleton by evaluating a set of conditional-independence hypotheses. For high-dimensional problems where the number of vertices p grows polynomially or exponentially with the sample size n, we study the asymptotic properties of PenPC on two types of graphs: traditional random graphs, where all vertices have the same expected number of neighbors, and scale-free graphs, where a few vertices may have a large number of neighbors. As illustrated by extensive simulations and by applications to gene expression data from cancer patients, PenPC has higher sensitivity and specificity than the state-of-the-art PC-stable algorithm.
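For readers who want the shape of the two-step approach, here is a minimal Python sketch, not the authors' PenPC implementation: lasso neighborhood regressions stand in for the penalized estimation of the concentration matrix's nonzero entries, and Fisher-z partial-correlation tests are one concrete choice of conditional-independence test for the pruning step. The thresholds and the small conditioning-set bound are illustrative assumptions.

    import numpy as np
    from itertools import combinations
    from scipy import stats
    from sklearn.linear_model import LassoCV

    def fisher_z_pval(r, n, k):
        # p-value of the Fisher-z test of a partial correlation r,
        # with n samples and k conditioning variables
        r = np.clip(r, -0.9999, 0.9999)
        z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - k - 3)
        return 2 * (1 - stats.norm.cdf(abs(z)))

    def partial_corr(C, i, j, S):
        # partial correlation of i and j given S, read off the
        # inverse of the relevant correlation submatrix
        idx = [i, j] + list(S)
        P = np.linalg.pinv(C[np.ix_(idx, idx)])
        return -P[0, 1] / np.sqrt(P[0, 0] * P[1, 1])

    def two_step_skeleton(X, alpha=0.01, max_cond=2):
        n, p = X.shape
        C = np.corrcoef(X, rowvar=False)
        # Step 1: lasso regression of each variable on all others;
        # nonzero coefficients propose candidate edges
        nbrs = {}
        for j in range(p):
            others = [k for k in range(p) if k != j]
            coef = LassoCV(cv=5).fit(X[:, others], X[:, j]).coef_
            nbrs[j] = {k for k, c in zip(others, coef) if abs(c) > 1e-8}
        edges = {frozenset((i, j)) for i in range(p) for j in nbrs[i]}
        # Step 2: prune any candidate edge whose endpoints test as
        # conditionally independent given some small neighbor subset
        for e in list(edges):
            i, j = tuple(e)
            pool = (nbrs[i] | nbrs[j]) - {i, j}
            for size in range(max_cond + 1):
                if e not in edges:
                    break
                for S in combinations(pool, size):
                    if fisher_z_pval(partial_corr(C, i, j, S), n, size) > alpha:
                        edges.discard(e)  # i and j independent given S
                        break
        return edges

Bounding the conditioning-set size (max_cond) keeps the number of tests manageable when p is large; it is a simplification made here for brevity, not a stated feature of PenPC itself.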
Abstract:
'To Tremble the Zero: Art in the Age of Algorithmic Reproduction' is a philosophic, political and sensuous journey playing with (and against) Benjamin's 'Art in the Age of Mechanical Reproduction'. In an age inundated by the 'post-' (postmodernity, posthuman, post-art, post-sexual, post-feminist, post-society, post-nation, etc.), 'To Tremble the Zero' sets out to re/present what it means to do or make 'art', and what it means to be or have 'human/ity', when the ground is nothing other than the fractal, and algorithmically infinite, combinations of zero and one. The work will also address the unfortunate way in which modern forms of metaphysics continue to creep 'unsuspectingly' into our understanding of contemporary media/electronic arts, despite (or perhaps even because of) the attempts of Latour, Badiou, or Agamben, especially when they address the zero/one as if it were a contradictory 'binary' rather than a kind of 'slice' or (to use Deleuze and Guattari) an immanent plane of immanence. This work argues that by retrieving Benjamin, Einstein, Gödel, and Haraway, a rather different story of art can be told.
Abstract:
Traditionally, the Internet provides only a “best-effort” service, treating all packets going to the same destination equally. However, providing differentiated services to different users based on their quality requirements is an increasingly pressing need. For this, routers must be able to distinguish and isolate traffic belonging to different flows; the ability to determine which flow each packet belongs to is called packet classification. Technology vendors are reluctant to support algorithmic solutions for classification because of their non-deterministic performance. Although CAMs are favoured by vendors for their deterministic, high lookup rates, they suffer from high power dissipation and high silicon cost. This paper provides a new algorithmic-architectural solution for packet classification that combines CAMs with algorithms that cut the classification space into smaller subspaces at multiple levels. The solution exploits the geometric distribution of rules in the classification space, and it offers the deterministic performance of CAMs, support for dynamic updates, and added flexibility for system designers.
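The paper's architecture is not reproduced here, but the geometric idea behind multi-level cutting can be sketched in a few lines of Python: recursively split the rule space along its widest dimension until each region holds only a handful of rules, the part a real design would push into a small CAM. The two-dimensional integer ranges, the leaf size, and the cut heuristic are all illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class Rule:
        lo: tuple    # (src_lo, dst_lo)
        hi: tuple    # (src_hi, dst_hi), inclusive
        action: str

    def build(rules, lo, hi, leaf_size=1, depth=0, max_depth=8):
        # recursively cut the space until each region holds at most
        # leaf_size rules; rules straddling a cut go to both children
        if len(rules) <= leaf_size or depth >= max_depth:
            return ('leaf', rules)
        d = 0 if (hi[0] - lo[0]) >= (hi[1] - lo[1]) else 1  # widest dim
        mid = (lo[d] + hi[d]) // 2
        left_hi, right_lo = list(hi), list(lo)
        left_hi[d], right_lo[d] = mid, mid + 1
        left = [r for r in rules if r.lo[d] <= mid]
        right = [r for r in rules if r.hi[d] > mid]
        return ('node', d, mid,
                build(left, lo, tuple(left_hi), leaf_size, depth + 1, max_depth),
                build(right, tuple(right_lo), hi, leaf_size, depth + 1, max_depth))

    def classify(node, pkt):
        # walk the cuts, then scan the short leaf list linearly
        while node[0] == 'node':
            _, d, mid, lt, rt = node
            node = lt if pkt[d] <= mid else rt
        for r in node[1]:
            if all(r.lo[k] <= pkt[k] <= r.hi[k] for k in range(2)):
                return r.action
        return 'default'

    rules = [Rule((0, 0), (127, 255), 'permit'),
             Rule((128, 0), (255, 127), 'deny')]
    tree = build(rules, (0, 0), (255, 255))
    print(classify(tree, (10, 200)))  # -> 'permit'

Because a rule that straddles a cut is replicated into both children, cutting trades memory for short, bounded leaf searches; in a hybrid design like the one described above, those leaf lists are what a small CAM can absorb at deterministic speed.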
Abstract:
The classification of protein structures is an important and still outstanding problem. The purpose of this paper is threefold. First, we utilize a relation between the Tutte and HOMFLY polynomials to show that the Alexander-Conway polynomial can be computed algorithmically for a given planar graph. Second, as special cases of planar graphs, we use polymer graphs of protein structures. More precisely, we take three building blocks of three-dimensional protein structure (alpha-helix, antiparallel beta-sheet, and parallel beta-sheet) and calculate the Tutte polynomials of their corresponding polymer graphs analytically, providing recurrence equations for all three secondary-structure elements. Third, we present numerical results comparing our analytical calculations with the output of our algorithm, not only to test consistency but also to demonstrate that the assigned polynomials are unique labels of the secondary-structure elements. This paves the way for an automatic classification of protein structures.
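The analytical recurrences of the paper are specific to its three polymer-graph families, but the quantity being tabulated is easy to pin down: below is a generic deletion-contraction computation of the Tutte polynomial in Python (with sympy for the symbolic variables). It is exponential-time and only suitable for small graphs, which is precisely why closed-form recurrences for the secondary-structure elements are valuable.

    import sympy as sp

    x, y = sp.symbols('x y')

    def connected(edges, a, b):
        # BFS to check whether a and b are joined by the remaining edges
        adj = {}
        for p, q in edges:
            adj.setdefault(p, set()).add(q)
            adj.setdefault(q, set()).add(p)
        seen, stack = {a}, [a]
        while stack:
            u = stack.pop()
            if u == b:
                return True
            for w in adj.get(u, ()):
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return a == b

    def contract(edges, u, v):
        # merge v into u; remaining parallel (u, v) edges become loops
        return [(u if p == v else p, u if q == v else q) for p, q in edges]

    def tutte(edges):
        # T(G; x, y) by deletion-contraction on a multigraph edge list
        if not edges:
            return sp.Integer(1)
        u, v = edges[0]
        rest = edges[1:]
        if u == v:                     # loop: T = y * T(G - e)
            return y * tutte(rest)
        if not connected(rest, u, v):  # bridge: T = x * T(G / e)
            return x * tutte(contract(rest, u, v))
        return tutte(rest) + tutte(contract(rest, u, v))

    print(sp.expand(tutte([(0, 1), (1, 2), (2, 0)])))  # triangle: x**2 + x + y

Checking small cases like the triangle against known values is in the spirit of the consistency test the paper performs at larger scale between its analytical recurrences and its algorithm.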