53 results for "Hidden logic"


Relevance: 20.00%

Abstract:

Duck hepatitis B viruses (DHBV), unlike mammalian hepadnaviruses, are thought to lack X genes, which encode transcription-regulatory proteins believed to contribute to the development of hepatocellular carcinoma. A lack of association of chronic DHBV infection with hepatocellular carcinoma development supports this belief. Here, we demonstrate that DHBV genomes have a hidden open reading frame from which a transcription-regulatory protein, designated DHBx, is expressed both in vitro and in vivo. We show that DHBx enhances neither viral protein expression, intracellular DNA synthesis, nor virion production when assayed in the full-length genome context in LMH cells. However, similar to mammalian hepadnavirus X proteins, DHBx activates cellular and viral promoters via the Raf-mitogen-activated protein kinase signaling pathway and localizes primarily in the cytoplasm. The functional similarities, as well as the weak sequence homologies, of DHBx and the X proteins of mammalian hepadnaviruses strongly suggest a common ancestry of ortho- and avihepadnavirus X genes. In addition, our data disclose similar intracellular localization and transcription-regulatory functions of the corresponding proteins, raise new questions as to their presumed role in hepatocarcinogenesis, and offer unique opportunities for deciphering their still-enigmatic in vivo functions.

Relevance: 20.00%

Abstract:

Existing refinement calculi provide frameworks for the stepwise development of imperative programs from specifications. This paper presents a refinement calculus for deriving logic programs. The calculus contains a wide-spectrum logic programming language, including executable constructs such as sequential conjunction, disjunction, and existential quantification, as well as specification constructs such as general predicates, assumptions and universal quantification. A declarative semantics is defined for this wide-spectrum language based on executions. Executions are partial functions from states to states, where a state is represented as a set of bindings. The semantics is used to define the meaning of programs and specifications, including parameters and recursion. To complete the calculus, a notion of correctness-preserving refinement over programs in the wide-spectrum language is defined and refinement laws for developing programs are introduced. The refinement calculus is illustrated using example derivations and prototype tool support is discussed.
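
To make the execution-based semantics described above more concrete, here is a minimal Python sketch under a deliberately simplified reading of the abstract: a state is a set of bindings (modelled as a dict), and an execution is a partial function from states to states (modelled as a function that returns None where it is undefined). The names State, unify and seq_conj are illustrative assumptions, not the paper's notation.

    # Minimal sketch (assumed notation, not the paper's): a state is a set of
    # variable bindings, and an execution is a partial function on states,
    # modelled here as a function returning a new state or None when undefined.

    State = dict  # e.g. {"X": 1, "Y": 2} is a state with two bindings

    def unify(state, var, value):
        """Execution that binds `var` to `value`, failing (None) on conflict."""
        if var in state and state[var] != value:
            return None                      # partial: undefined on this state
        return {**state, var: value}

    def seq_conj(e1, e2):
        """Sequential conjunction: run e1, then run e2 on its result."""
        def run(state):
            s1 = e1(state)
            return None if s1 is None else e2(s1)
        return run

    # Example: (X = 1) , (Y = 2) applied to the empty state
    prog = seq_conj(lambda s: unify(s, "X", 1), lambda s: unify(s, "Y", 2))
    print(prog({}))  # {'X': 1, 'Y': 2}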

Relevance: 20.00%

Abstract:

We discuss techniques for producing, manipulating, and measuring qubits encoded optically as vacuum- and single-photon states. We show that a universal set of nondeterministic gates can be constructed using linear optics and photon counting. We investigate the efficacy of a test gate given realistic detector efficiencies.

Relevance: 20.00%

Abstract:

The central dogma of biology holds that genetic information normally flows from DNA to RNA to protein. As a consequence, it has been generally assumed that genes largely code for proteins, and that proteins fulfil not only most structural and catalytic but also most regulatory functions, in all cells, from microbes to mammals. However, the latter may not be the case in complex organisms. A number of startling observations about the extent of non-protein-coding RNA (ncRNA) transcription in the higher eukaryotes, and the range of genetic and epigenetic phenomena that are RNA-directed, suggest that the traditional view of the structure of genetic regulatory systems in animals and plants may be incorrect. ncRNA dominates the genomic output of the higher organisms and has been shown to control chromosome architecture, mRNA turnover and the developmental timing of protein expression, and may also regulate transcription and alternative splicing. This paper re-examines the available evidence and suggests a new framework for considering and understanding the genomic programming of biological complexity, autopoietic development and phenotypic variation. BioEssays 25:930-939, 2003. (C) 2003 Wiley Periodicals, Inc.

Relevance: 20.00%

Abstract:

New high-precision niobium (Nb) and tantalum (Ta) concentration data are presented for early Archaean metabasalts, metabasaltic komatiites and their erosion products (mafic metapelites) from SW Greenland and the Acasta gneiss complex, Canada. Individual datasets consistently show sub-chondritic Nb/Ta ratios averaging 15.1 ± 11.6. This finding is discussed with regard to two competing models for resolving the Nb deficit that characterises the accessible Earth. Firstly, we test whether Nb could have been sequestered into the core due to its slightly siderophile (or chalcophile) character under very reducing conditions, as recently proposed from experimental evidence. We demonstrate that troilite inclusions of the Canyon Diablo iron meteorite have Nb and V concentrations in excess of typical chondrites, but that the metal phases of the Grant, Toluca and Canyon Diablo iron meteorites do not have significant concentrations of these lithophile elements. We find that if the entire accessible-Earth Nb deficit were explained by Nb in the core, only ca. 17% of the mantle could be depleted, and that by 3.7 Ga continental crust would have already achieved ca. 50% of its present mass. Nb/Ta systematics of late Archaean metabasalts compiled from the literature would further require that by 2.5 Ga, 90% of the present mass of continental crust was already in existence. As an alternative to this explanation, we propose that the average Nb/Ta ratio (15.1 ± 11.6) of Earth's oldest mafic rocks is a valid approximation for the bulk silicate Earth. This would require that ca. 13% of the terrestrial Nb resided in the Ta-free core. Since the partitioning of Nb between silicate and metal melts depends largely on oxygen fugacity and pressure, this finding could mean that metal/silicate segregation did not occur at the base of a deep magma ocean, or that the early mantle was slightly less reducing than generally assumed. A bulk silicate Earth Nb/Ta ratio of 15.1 allows for depletion of up to 40% of the total mantle. This could indicate that, in addition to the upper mantle, a portion of the lower mantle is also depleted, or, if only the upper mantle were depleted, that an additional hidden high-Nb/Ta reservoir must exist. Comparison of Nb/Ta systematics between early and late Archaean metabasalts supports the latter idea and indicates that deeply subducted high-Nb/Ta eclogite slabs could reside in the mantle transition zone or the lower mantle. Accumulation of such slabs appears to have commenced between 2.5 and 2.0 Ga. Regardless of these complexities of terrestrial Nb/Ta systematics, it is shown that the depleted-mantle Nb/Th ratio is a very robust proxy for the amount of extracted continental crust, because the temporal evolution of this ratio is dominated by Th loss to the continents and not Nb retention in the mantle. We present a new parameterisation of the continental crust volume versus age curve that specifically explores the possibility of lithophile element loss to the core and storage of eclogite slabs in the transition zone. (C) 2003 Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Abstract:

This paper proposes a novel application of fuzzy logic to web data mining for two basic problems of a website: popularity and satisfaction. Popularity means that people will visit the website, while satisfaction refers to the usefulness of the site. We illustrate that the popularity of a website is a fuzzy logic problem; it is an important characteristic a website needs in order to survive in Internet commerce. The satisfaction of a website is also a fuzzy logic problem, representing the degree of success in the application of information technology to the business. We propose a fuzzy logic framework for representing these two problems, using web data mining techniques to fuzzify the attributes of a website.
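
As a rough illustration of what "fuzzifying the attributes of a website" might look like, the Python sketch below assigns fuzzy membership degrees for popularity and satisfaction from two hypothetical attributes (daily visits and return-visit rate). The attribute choices, breakpoints and the min-combination are assumptions made for illustration, not values or rules taken from the paper.

    # Illustrative sketch only: the attributes (daily visits, return-visit
    # rate) and the breakpoints are hypothetical, not values from the paper.

    def popularity_membership(daily_visits: float) -> float:
        """Piecewise-linear membership in the fuzzy set 'popular' (0.0 to 1.0)."""
        low, high = 100.0, 10_000.0          # assumed breakpoints
        if daily_visits <= low:
            return 0.0
        if daily_visits >= high:
            return 1.0
        return (daily_visits - low) / (high - low)

    def satisfaction_membership(return_visit_rate: float) -> float:
        """Membership in 'satisfying', using return-visit rate as a proxy."""
        return max(0.0, min(1.0, return_visit_rate))  # already in [0, 1]

    # A site is rated by combining the two fuzzy degrees (here: the minimum).
    site = {"daily_visits": 2500, "return_visit_rate": 0.6}
    pop = popularity_membership(site["daily_visits"])
    sat = satisfaction_membership(site["return_visit_rate"])
    print(f"popular={pop:.2f}, satisfying={sat:.2f}, overall={min(pop, sat):.2f}")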

Relevance: 20.00%

Abstract:

Assessments for assigning the conservation status of threatened species that are based purely on subjective judgements become problematic because assessments can be influenced by hidden assumptions, personal biases and perceptions of risk, making the assessment process difficult to repeat. This can result in inconsistent assessments and misclassifications, which can lead to a lack of confidence in species assessments. It is almost impossible to understand an expert's logic or visualise the underlying reasoning behind the many hidden assumptions used throughout the assessment process. In this paper, we formalise the decision-making process of experts by capturing their logical ordering of information, their assumptions and reasoning, and transferring them into a set of decision rules. We illustrate this through the process used to evaluate the conservation status of species under the NatureServe system (Master, 1991). NatureServe status assessments have been used for over two decades to set conservation priorities for threatened species throughout North America. We develop a conditional point-scoring method to reflect the current subjective process. In two test comparisons, 77% of species' assessments using the explicit NatureServe method matched the qualitative assessments done subjectively by NatureServe staff. Of those that differed, no rank varied by more than one rank level under the two methods. In general, the explicit NatureServe method tended to be more precautionary than the subjective assessments. The rank differences that emerged from the comparisons may be due, at least in part, to the flexibility of the qualitative system, which allows different factors to be weighted on a species-by-species basis according to expert judgement. The method outlined in this study is the first documented attempt to explicitly define a transparent process for weighting and combining factors under the NatureServe system. The process of eliciting expert knowledge identifies how information is combined and highlights any inconsistent logic that may not be obvious in subjective decisions. The method provides a repeatable, transparent, and explicit benchmark for feedback, further development, and improvement. (C) 2004 Elsevier SAS. All rights reserved.
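
To illustrate the general idea of a conditional point-scoring decision rule, the hypothetical Python sketch below weights one factor (population size) differently depending on another (number of known occurrences), then maps the total onto an ordinal rank. The factors, point values and rank thresholds are invented for illustration and are not NatureServe's actual criteria.

    # Hypothetical sketch of a conditional point-scoring rule; the factors,
    # weights and thresholds below are illustrative, not NatureServe's values.

    def score_species(num_occurrences: int, population: int, declining: bool) -> str:
        """Combine factors into points, then map the total onto a rank."""
        points = 0
        # Conditional scoring: the weight given to population size depends on
        # how many occurrences are known (an example of expert-style logic).
        if num_occurrences <= 5:
            points += 3
            points += 2 if population < 1000 else 0
        elif num_occurrences <= 20:
            points += 2
            points += 1 if population < 1000 else 0
        if declining:
            points += 2
        # Map the total onto an ordinal conservation rank.
        if points >= 6:
            return "critically imperiled"
        if points >= 4:
            return "imperiled"
        if points >= 2:
            return "vulnerable"
        return "apparently secure"

    print(score_species(num_occurrences=4, population=600, declining=True))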

Relevance: 20.00%

Abstract:

Race is fundamental in shaping the development of Australian law, just as it has played its part in other former colonies, such as the United States, where a body of critical race theory has been established on the basis of this premise. Drawing on this theory, I argue that the possessive logic of patriarchal white sovereignty works ideologically to naturalise the nation as a white possession by informing and circulating a coherent set of meanings about white possession as part of common-sense knowledge and socially produced conventions in the High Court's Yorta Yorta decision.

Relevance: 20.00%

Abstract:

Background. Australia, like other countries, is experiencing an epidemic of heart failure (HF). However, given the lack of national and population-based datasets collating detailed cardiovascular-specific morbidity and mortality outcomes, quantifying the specific burden imposed by HF has been difficult. Methods. Australian Bureau of Statistics (ABS) data for the year 2000 were used in combination with contemporary, well-validated population-based epidemiologic data to estimate the number of individuals with symptomatic and asymptomatic HF related to both preserved left ventricular systolic function (diastolic dysfunction) and left ventricular systolic dysfunction (LVSD), and rates of HF-related hospitalisation. Results. In 2000, we estimate that around 325,000 Australians (58% male) had symptomatic HF associated with both LVSD and diastolic dysfunction, and an additional 214,000 had asymptomatic LVSD. An estimated 140,000 (26%) live in rural and remote regions, distant from specialist health care services. There were an estimated 22,000 incident admissions for congestive heart failure and approximately 100,000 admissions associated with this syndrome overall. Conclusion. Australia is in the midst of a HF epidemic that continues to grow. Overall, it probably contributes to over 1.4 million days of hospitalisation at a cost of more than $1 billion. A national response to further quantify and address this enormous health problem is required.

Relevance: 20.00%

Abstract:

A disappointing feature of conventional methods for detecting association between DNA variation and a phenotype of interest is that they tell us little about the hidden pattern of linkage disequilibrium (LD) with the functional variant that is actually responsible for the association. This limitation applies to case-control studies and also to the transmission/disequilibrium test (TDT) and other family-based association methods. Here we present a fresh perspective on genetic association based on two novel concepts called 'LD squares' and 'equi-risk alleles'. These describe and characterize the different patterns of gametic LD which underlie genetic association. These concepts lead to a general principle - the Equi-Risk Allele Segregation Principle - which captures the way in which underlying LD patterns affect the transmission patterns of genetic variants associated with a phenotype. This provides a basis for distinguishing the hidden LD patterns and might help to locate the functional variants responsible for the association.

Relevance: 20.00%

Abstract:

This paper defines the 3D reconstruction problem as the process of reconstructing a 3D scene from numerous 2D visual images of that scene. It is well known that this problem is ill-posed, and numerous constraints and assumptions are used in 3D reconstruction algorithms in order to reduce the solution space. Unfortunately, most constraints only work in a certain range of situations, and often constraints are built into the most fundamental methods (e.g. area-based matching assumes that all the pixels in the window belong to the same object). This paper presents a novel formulation of the 3D reconstruction problem, using a voxel framework and first-order logic equations, which does not contain any additional constraints or assumptions. Solving this formulation for a set of input images gives all the possible solutions for that set, rather than picking a solution that is deemed most likely. Using this formulation, this paper studies the problem of uniqueness in 3D reconstruction and how the solution space changes for different configurations of input images. It is found that it is not possible to guarantee a unique solution, no matter how many images are taken of the scene, how they are oriented, or how much color variation is in the scene itself. Results of using the formulation to reconstruct a few small voxel spaces are also presented. They show that the number of solutions is extremely large even for very small voxel spaces (a 5 x 5 voxel space gives 10 to 10^7 solutions). This shows the need for constraints to reduce the solution space to a reasonable size. Finally, it is noted that, because of the discrete nature of the formulation, the solution space size can be easily calculated, making the formulation a useful tool to numerically evaluate the usefulness of any constraints that are added.
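
To give a feel for a logic-based formulation of this kind and for why its solution space grows so quickly, the toy Python sketch below enumerates every occupancy assignment of a tiny set of "voxels" that is consistent with a handful of projected observations. The camera rays and observations are invented for illustration, not the paper's setup; the point is simply that several distinct reconstructions can satisfy the same images.

    from itertools import product

    # Toy illustration (assumed setup, not the paper's exact formulation):
    # each voxel is occupied (True) or empty (False); each observed pixel is
    # a logical constraint over the voxels on its viewing ray: an object
    # pixel requires some voxel on the ray to be occupied, a background
    # pixel requires none to be.

    VOXELS = ["v1", "v2", "v3", "v4"]

    # (ray, sees_object): ray is the set of voxels projected onto that pixel.
    observations = [
        ({"v1", "v2"}, True),   # camera A, pixel 1: object seen
        ({"v3", "v4"}, True),   # camera A, pixel 2: object seen
        ({"v1", "v3"}, True),   # camera B, pixel 1: object seen
        ({"v2", "v4"}, True),   # camera B, pixel 2: object seen
    ]

    def consistent(assignment):
        occupied = {v for v, filled in zip(VOXELS, assignment) if filled}
        return all((len(ray & occupied) > 0) == sees for ray, sees in observations)

    solutions = [a for a in product([False, True], repeat=len(VOXELS)) if consistent(a)]
    print(f"{len(solutions)} consistent reconstruction(s) out of {2 ** len(VOXELS)}")
    for sol in solutions:
        print({v: filled for v, filled in zip(VOXELS, sol)})

Because the formulation is discrete, the total number of candidate assignments (2 to the power of the number of voxels) and the number that survive the observation constraints can be counted directly, which is the property the abstract exploits when evaluating constraints.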