931 results for Abbott, Andrew: Methods of discovery


Relevance: 100.00%

Abstract:

Expedition 302 of the Integrated Ocean Drilling Program (IODP), also known as the Arctic Coring Expedition (ACEX), successfully penetrated a sequence of Cenozoic sediments draping the crest of the Lomonosov Ridge in the central Arctic Ocean. The cumulative sedimentary record spans the last 57 m.y. and was recovered from three sites located within 15 km of each other. Merging the recovered cores onto a common depth scale that accurately reflects their stratigraphic placement below the seafloor is a fundamental step toward interpreting this unique sedimentary record. However, the lack of overlapping recovery in adjacent holes and intervals of high core disturbance complicated traditional methods of stratigraphic correlation. Here we present a revised composite depth scale for the ACEX sediments, generated in part by performing a regional stratigraphic correlation with sediments recovered from previous expeditions to the Lomonosov Ridge. The revised depth scale also reassesses the offsets for cores in the upper 55 meters below seafloor, where no overlapping recovery was acquired, and proposes modifications to these depths.

Relevance: 100.00%

Abstract:

Pt. II has title: River gardens; being an account of the best methods of cultivating fresh water plants in aquaria.

Relevance: 100.00%

Abstract:

Chapters 5-12 (p. 60-334) include a portion of Vancouver's journal, reprinted from v. 2 of his Voyage of discovery to the North Pacific Ocean, and round the world: 2d edition. London, 1801. The reprint "is designed to follow that explorer from the time he strikes the shore of the present state of Washington ... on into Puget Sound, and around Vancouver Island, and, finally, through the negotiations at Nootka".

Relevance: 100.00%

Abstract:

The cyclotides are a family of small, disulfide-rich proteins that have a cyclic peptide backbone and a cystine knot formed by three conserved disulfide bonds. The combination of these two structural motifs contributes to the exceptional chemical, thermal and enzymatic stability of the cyclotides, which retain bioactivity after boiling. They were initially discovered through native medicine or through screening studies associated with some of their various activities, which include uterotonic action, anti-HIV activity, neurotensin antagonism and cytotoxicity. They are present in plants of the Rubiaceae, Violaceae and Cucurbitaceae families, and their natural function in plants appears to be host defense: they have potent activity against certain insect pests and they also have antimicrobial activity. There are currently around 50 published sequences of cyclotides, and their rate of discovery has been increasing in recent years. Ultimately the family may comprise thousands of members. This article describes the background to the discovery of the cyclotides, their structural characterization, chemical synthesis, genetic origin, biological activities and potential applications in the pharmaceutical and agricultural industries. Their unique topological features make them interesting from a protein-folding perspective. Because of their highly stable peptide framework they might make useful templates in drug design programs, and their insecticidal activity opens the possibility of applications in crop protection.

Relevance: 100.00%

Abstract:

Q fever is a common zoonosis worldwide. Awareness of the disease and newer diagnostic modalities have resulted in increasing recognition of unusual manifestations. We report 3 cases of Q fever osteomyelitis in children and review the literature on 11 other reported cases. The cases demonstrate that Coxiella burnetii can cause granulomatous osteomyelitis that presents without systemic symptoms and frequently results in a chronic, relapsing, multifocal clinical course. Optimal selection and duration of antimicrobial therapy and methods of monitoring therapy are currently uncertain.

Relevance: 100.00%

Abstract:

An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local FDR (false discovery rate) is provided for each gene. An attractive feature of the mixture model approach is that it provides a framework for the estimation of the prior probability that a gene is not differentially expressed, and this probability can subsequently be used in forming a decision rule. The rule can also be formed to take the false negative rate into account. We apply this approach to a well-known publicly available data set on breast cancer, and discuss our findings with reference to other approaches.
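
Concretely (a standard formulation of the local FDR under a two-component mixture; the symbols here are generic, not taken from the paper), if z is a gene's test statistic, its density is modelled as

```latex
f(z) = \pi_0 f_0(z) + (1 - \pi_0) f_1(z),
\qquad
\widehat{\mathrm{FDR}}_{\mathrm{local}}(z)
  = P(\text{not differentially expressed} \mid z)
  = \frac{\pi_0 f_0(z)}{f(z)},
```

where \pi_0 is the prior probability that a gene is not differentially expressed and f_0, f_1 are the null and non-null densities; a gene is declared differentially expressed when this posterior probability falls below a chosen threshold.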

Relevance: 100.00%

Abstract:

Background: Body mass index (BMI) is used to diagnose obesity. However, its ability to predict the percentage fat mass (%FM) reliably is doubtful, so the validity of BMI as a diagnostic tool for obesity is questioned. Aim: This study focused on determining the ability of BMI-based cut-off values to diagnose obesity among Australian children of white Caucasian and Sri Lankan origin. Subjects and methods: Height and weight were measured and BMI (W/H², weight divided by height squared) calculated. Total body water was determined by the deuterium dilution technique, and fat-free mass, and hence fat mass, was derived using age- and gender-specific constants. A %FM of 30% for girls and 20% for boys was considered the criterion cut-off level for obesity. BMI-based obesity cut-offs described by the International Obesity Task Force (IOTF), CDC/NCHS centile charts and BMI-Z were validated against the criterion method. Results: There were 96 white Caucasian and 42 Sri Lankan children. Of the white Caucasians, 19 (36%) girls and 29 (66%) boys, and of the Sri Lankans 7 (46%) girls and 16 (63%) boys, were obese based on %FM. FM and BMI were closely associated in both Caucasians (r = 0.81, P < 0.001) and Sri Lankans (r = 0.92, P < 0.001). Percentage FM and BMI also had a lower but significant association. Obesity cut-off values recommended by the IOTF failed to detect a single case of obesity in either group, while the NCHS and BMI-Z cut-offs detected cases of obesity with low sensitivity. Conclusions: BMI is a poor indicator of percentage fat, and the commonly used cut-off values were not sensitive enough to detect cases of childhood obesity in this study. To improve the diagnosis of obesity, either BMI cut-off values should be revised to increase sensitivity, or the possibility of using other indirect methods of estimating %FM should be explored.
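
The validation step can be sketched as follows (hypothetical data, an illustrative rather than IOTF BMI cut-off, and a helper function of our own, not the study's code):

```python
import numpy as np

def bmi(weight_kg, height_m):
    """BMI = W / H^2, in kg/m^2."""
    return weight_kg / height_m**2

def screening_performance(percent_fm, bmi_vals, fm_cutoff, bmi_cutoff):
    """Sensitivity/specificity of a BMI cut-off against the %FM criterion."""
    truth = percent_fm >= fm_cutoff      # obese by the criterion method
    called = bmi_vals >= bmi_cutoff      # obese by the BMI screen
    sens = (called & truth).sum() / truth.sum()
    spec = (~called & ~truth).sum() / (~truth).sum()
    return sens, spec

print(bmi(34.0, 1.30))   # e.g. a 34 kg, 1.30 m child -> ~20.1 kg/m^2

# Hypothetical boys' sample: criterion %FM cut-off of 20% as in the study;
# the BMI cut-off of 21 kg/m^2 is a placeholder, not an IOTF value.
rng = np.random.default_rng(0)
fm = rng.uniform(10, 35, 50)
b = 14 + 0.45 * fm + rng.normal(0, 1.5, 50)   # BMI correlated with %FM
print(screening_performance(fm, b, fm_cutoff=20.0, bmi_cutoff=21.0))
```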

Relevance: 100.00%

Abstract:

In vitro measurements of skin absorption are an increasingly important aspect of regulatory studies, product support claims, and formulation screening. However, such measurements are significantly affected by skin variability. The purpose of this study was to determine inter- and intralaboratory variation in diffusion cell measurements caused by factors other than skin. This was attained through the use of an artificial (silicone rubber) rate-limiting membrane and the provision of materials, including a standard penetrant, methyl paraben (MP), and a minimally prescriptive protocol, to each of the 18 participating laboratories. Standardized calculations of MP flux were determined from the data submitted by each laboratory by applying a predefined mathematical model; this was deemed necessary to eliminate any interlaboratory variation caused by different methods of flux calculation. Average fluxes of MP calculated and reported by each laboratory (60 ± 27 µg cm⁻² h⁻¹, n = 25, range 27-101) were in agreement with the standardized calculations of MP flux (60 ± 21 µg cm⁻² h⁻¹, range 19-120). The coefficient of variation between laboratories was approximately 35%, manifest as a fourfold difference between the lowest and highest average flux values and a sixfold difference between the lowest and highest individual flux values. Intralaboratory variation was lower, averaging 10% for five individuals using the same equipment within a single laboratory. Further studies should be performed to clarify the exact components responsible for non-skin-related variability in diffusion cell measurements. It is clear that further developments of in vitro methodologies for measuring skin absorption are required.
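
A minimal sketch of a standardized flux calculation and the interlaboratory spread (the study's predefined model is not given in the abstract; regression of steady-state cumulative permeation against time is a common approach, and all numbers below are hypothetical):

```python
import numpy as np
from scipy import stats

def steady_state_flux(t_h, q_ug_per_cm2):
    """Flux (µg cm⁻² h⁻¹) as the slope of cumulative permeation vs time,
    fitted over the steady-state portion of a diffusion-cell run."""
    slope, intercept, r, p, se = stats.linregress(t_h, q_ug_per_cm2)
    return slope

# Hypothetical receptor-phase samples from one diffusion cell
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])            # h
q = np.array([55.0, 118.0, 176.0, 240.0, 301.0, 360.0])  # µg/cm^2
print(steady_state_flux(t, q))                            # ~61 µg cm⁻² h⁻¹

# Interlaboratory coefficient of variation from each lab's mean flux
lab_means = np.array([27.0, 45.0, 58.0, 63.0, 72.0, 101.0])  # hypothetical
cv = lab_means.std(ddof=1) / lab_means.mean()
print(f"CV = {cv:.0%}")
```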

Relevance: 100.00%

Abstract:

Motivation: An important problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. We provide a straightforward and easily implemented method for estimating the posterior probability that an individual gene is null. The problem can be expressed in a two-component mixture framework, using an empirical Bayes approach. Current methods of implementing this approach either are limited by the minimal assumptions they make or, under more specific assumptions, are computationally intensive. Results: By converting the value of the test statistic used to assess the significance of each gene to a z-score, we propose a simple two-component normal mixture that adequately models the distribution of this score. The usefulness of our approach is demonstrated on three real datasets.
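
A minimal EM sketch of the two-component idea, assuming the theoretical N(0, 1) null and a single normal alternative component (the function and its starting values are our own illustration, not the paper's algorithm):

```python
import numpy as np
from scipy import stats

def fit_null_mixture(z, n_iter=500, tol=1e-8):
    """EM for f(z) = pi0*N(0,1) + (1 - pi0)*N(mu1, s1^2) on gene z-scores.

    The theoretical null is held fixed at N(0, 1); pi0, mu1 and s1 are
    estimated. Returns the fit plus tau0, each gene's posterior
    probability of being null (the local FDR at its z-score).
    """
    z = np.asarray(z, dtype=float)
    pi0, mu1, s1 = 0.9, float(np.mean(z)), float(np.std(z)) + 1.0
    for _ in range(n_iter):
        f0 = stats.norm.pdf(z, 0.0, 1.0)
        f1 = stats.norm.pdf(z, mu1, s1)
        tau0 = pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)   # E-step
        pi0_new = tau0.mean()                             # M-step
        w = 1.0 - tau0
        mu1 = float(np.sum(w * z) / np.sum(w))
        s1 = float(np.sqrt(np.sum(w * (z - mu1) ** 2) / np.sum(w)))
        converged = abs(pi0_new - pi0) < tol
        pi0 = pi0_new
        if converged:
            break
    f0, f1 = stats.norm.pdf(z, 0, 1), stats.norm.pdf(z, mu1, s1)
    tau0 = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)
    return pi0, mu1, s1, tau0

# 900 null genes and 100 shifted ones; declare genes differentially
# expressed when their posterior probability of being null is small.
rng = np.random.default_rng(1)
z = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)])
pi0, mu1, s1, tau0 = fit_null_mixture(z)
called = tau0 < 0.2
```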

Relevance: 100.00%

Abstract:

This article applies methods of latent class analysis (LCA) to data on lifetime illicit drug use in order to determine whether qualitatively distinct classes of illicit drug users can be identified. Self-report data on lifetime illicit drug use (cannabis, stimulants, hallucinogens, sedatives, inhalants, cocaine, opioids and solvents) collected from a sample of 6265 Australian twins (average age 30 years) were analyzed using LCA. Rates of childhood sexual and physical abuse, lifetime alcohol and tobacco dependence, symptoms of illicit drug abuse/dependence and psychiatric comorbidity were compared across classes using multinomial logistic regression. LCA identified a 5-class model: Class 1 (68.5%) had low risks of the use of all drugs except cannabis; Class 2 (17.8%) had moderate risks of the use of all drugs; Class 3 (6.6%) had high rates of cocaine, other stimulant and hallucinogen use but lower risks for the use of sedatives or opioids. Conversely, Class 4 (3.0%) had relatively low risks of cocaine, other stimulant or hallucinogen use but high rates of sedative and opioid use. Finally, Class 5 (4.2%) had uniformly high probabilities for the use of all drugs. Rates of psychiatric comorbidity were highest in the polydrug class although the sedative/opioid class had elevated rates of depression/suicidal behaviors and exposure to childhood abuse. Aggregation of population-level data may obscure important subgroup differences in patterns of illicit drug use and psychiatric comorbidity. Further exploration of a 'self-medicating' subgroup is needed.
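
A minimal EM sketch of the underlying latent class model, treating each drug as a binary indicator with class-specific endorsement probabilities (this is the generic model, not the authors' software; names and data are hypothetical):

```python
import numpy as np

def lca_bernoulli(X, K, n_iter=300, seed=0):
    """EM for a latent class model with binary items.

    X: (n, d) 0/1 matrix of lifetime-use indicators.
    K: number of latent classes.
    Returns class prevalences, item-endorsement probabilities, posteriors.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                 # class prevalences
    theta = rng.uniform(0.25, 0.75, (K, d))  # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior P(class k | x_i) from Bernoulli log-likelihoods
        ll = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T + np.log(pi)
        ll -= ll.max(axis=1, keepdims=True)
        post = np.exp(ll)
        post /= post.sum(axis=1, keepdims=True)
        # M-step (small constant guards against an emptied class)
        nk = post.sum(axis=0) + 1e-9
        pi = nk / nk.sum()
        theta = np.clip((post.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
    return pi, theta, post

rng = np.random.default_rng(2)
X = (rng.random((500, 8)) < 0.3).astype(float)  # hypothetical 0/1 use data
pi, theta, post = lca_bernoulli(X, K=5)
classes = post.argmax(axis=1)                   # modal class assignment
```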

Relevance: 100.00%

Abstract:

Two contrasting multivariate statistical methods, viz. principal components analysis (PCA) and cluster analysis, were applied to the study of neuropathological variations between cases of Alzheimer's disease (AD). To compare the two methods, 78 cases of AD were analyzed, each characterised by measurements of 47 neuropathological variables. Both methods of analysis revealed significant variations between AD cases. These variations were related primarily to differences in the distribution and abundance of senile plaques (SP) and neurofibrillary tangles (NFT) in the brain. Cluster analysis classified the majority of AD cases into five groups which could represent subtypes of AD. However, PCA suggested that variation between cases was more continuous, with no distinct subtypes. Hence, PCA may be a more appropriate method than cluster analysis in the study of neuropathological variations between AD cases.
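
The contrast between the two analyses can be sketched as follows (sklearn/scipy stand-ins for whatever software the study used; the data here are random placeholders):

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# X: 78 cases x 47 standardized neuropathological variables (placeholder data)
rng = np.random.default_rng(1)
X = rng.normal(size=(78, 47))
X = (X - X.mean(axis=0)) / X.std(axis=0)

scores = PCA(n_components=2).fit_transform(X)   # continuous ordination of cases
groups = fcluster(linkage(X, method="ward"),    # forced 5-group partition
                  t=5, criterion="maxclust")

# If `scores` shows no separation between the 5 `groups`, the clusters are
# likely arbitrary cuts through a continuum rather than true subtypes.
```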

Relevance: 100.00%

Abstract:

Gas absorption, the removal of one or more constituents from a gas mixture, is widely used in chemical processes. In many gas absorption processes, the gas mixture is already at high pressure, and in recent years organic solvents have been developed for the process of physical absorption at high pressure followed by low-pressure regeneration of the solvent and recovery of the absorbed gases. Until now the discovery of new solvents has usually been by expensive and time-consuming trial-and-error laboratory tests. This work describes a new approach, whereby a solvent is selected from considerations of its molecular structure by applying recently published methods of predicting gas solubility from the molecular groups which make up the solvent molecule. The removal of the acid gases carbon dioxide and hydrogen sulfide from methane or hydrogen was used as a commercially important example. After a preliminary assessment to identify promising molecular groups, more than eighty new solvent molecules were designed and evaluated by predicting gas solubility. The other important physical properties were also predicted by appropriate theoretical procedures, and a commercially promising new solvent was chosen to have a high solubility for acid gases, a low solubility for methane and hydrogen, a low vapour pressure, and a low viscosity. The solvent chosen, of molecular structure CH3-CO-CH2-CH2-CO-CH3, was tested in the laboratory and shown to have physical properties, except for vapour pressure, close to those predicted. That is, gas solubilities were within 10% of, but lower than, the predicted values; viscosity was within 10% but higher than predicted; and the vapour pressure was significantly lower than predicted. A computer program was written to predict gas solubility in the new solvent at the high pressures (25 bar) used in practice. This is based on the group-contribution method of Skjold-Jørgensen (1984). Before using this with the new solvent, acetonylacetone, the method was shown to be sufficiently accurate by comparing predicted values of gas solubility with experimental solubilities from the literature for 14 systems up to 50 bar. A test of the commercial potential of the new solvent was made by means of two design studies which compared the size of plant and approximate relative costs of absorbing acid gases by means of the new solvent with those of other commonly used solvents: refrigerated methanol (Rectisol process) and the dimethyl ether of polyethylene glycol (Selexol process). Both studies showed a significant advantage, in terms of capital and operating costs, for plant designed around the new solvent process.
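
The group-contribution idea can be sketched in its simplest additive form (the group values below are invented placeholders, not Skjold-Jørgensen parameters; the 1984 method is a full group-contribution equation of state rather than a linear sum):

```python
from collections import Counter

# Hypothetical contributions of each molecular group to log(acid-gas
# solubility); illustrative numbers only.
GROUP_CONTRIB = {
    "CH3": -0.10,
    "CH2": -0.05,
    "C=O": +0.40,
}

def estimate_log_solubility(groups: Counter) -> float:
    """Estimate a solvent property as the sum of its group contributions."""
    return sum(n * GROUP_CONTRIB[g] for g, n in groups.items())

# Acetonylacetone, CH3-CO-CH2-CH2-CO-CH3: two CH3, two CH2, two C=O groups
print(estimate_log_solubility(Counter({"CH3": 2, "CH2": 2, "C=O": 2})))
```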

Relevance: 100.00%

Abstract:

The main aim of this thesis is to investigate the application of methods of differential geometry to the constraint analysis of relativistic high-spin field theories. As a starting point, the coordinate-dependent descriptions of the Lagrangian and Dirac-Bergmann constraint algorithms are reviewed for general second-order systems. These two algorithms are then respectively employed to analyse the constraint structure of the massive spin-1 Proca field from the Lagrangian and Hamiltonian viewpoints. As an example of a coupled field-theoretic system, the constraint analysis of the massive Rarita-Schwinger spin-3/2 field coupled to an external electromagnetic field is then reviewed in terms of the coordinate-dependent Dirac-Bergmann algorithm for first-order systems. The standard Velo-Zwanziger and Johnson-Sudarshan inconsistencies from which this coupled system seemingly suffers are then discussed in light of this full constraint analysis, and it is found that both pathologies reduce to a field-induced loss of degrees of freedom. A description of the geometrical version of the Dirac-Bergmann algorithm developed by Gotay, Nester and Hinds begins the geometrical examination of high-spin field theories. This geometric constraint algorithm is then applied to the free Proca field and to two Proca field couplings: the first is the minimal coupling to an external electromagnetic field, whilst the second is the coupling to an external symmetric tensor field. The onset of acausality in this latter coupled case is then considered in relation to the geometric constraint algorithm.
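
For orientation, the free Proca field's constraint structure referred to above is the standard one (with metric signature (+,-,-,-); sign conventions vary):

```latex
\mathcal{L} = -\tfrac{1}{4}F_{\mu\nu}F^{\mu\nu} + \tfrac{1}{2}m^{2}A_{\mu}A^{\mu},
\qquad
\pi^{\mu} = \frac{\partial\mathcal{L}}{\partial(\partial_{0}A_{\mu})} = F^{\mu 0}.
```

Antisymmetry of F makes the velocity of A_0 undeterminable, giving the primary constraint; demanding its preservation under the Hamiltonian evolution then yields the secondary constraint:

```latex
\phi_{1} = \pi^{0} \approx 0,
\qquad
\phi_{2} = \partial_{i}\pi^{i} + m^{2}A_{0} \approx 0,
\qquad
\{\phi_{1}(\mathbf{x}),\,\phi_{2}(\mathbf{y})\} = -m^{2}\,\delta^{3}(\mathbf{x}-\mathbf{y}).
```

Because this bracket is nonvanishing for m ≠ 0, the pair is second class and the algorithm terminates, leaving 8 − 2 = 6 phase-space dimensions per point: the three polarisations of a massive spin-1 field.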

Relevance: 100.00%

Abstract:

Conventional structured methods of software engineering are often based on the use of functional decomposition coupled with the Waterfall development process model. This approach is argued to be inadequate for coping with the evolutionary nature of large software systems. Alternative development paradigms, including the operational paradigm and the transformational paradigm, have been proposed to address the inadequacies of this conventional view of software development, and these are reviewed. JSD is presented as an example of an operational approach to software engineering, and is contrasted with other well-documented examples. The thesis shows how aspects of JSD can be characterised with reference to formal language theory and automata theory. In particular, it is noted that Jackson structure diagrams are equivalent to regular expressions and can be thought of as specifying corresponding finite automata. The thesis discusses the automatic transformation of structure diagrams into finite automata using an algorithm adapted from compiler theory, and then extends the technique to deal with areas of JSD which are not strictly formalisable in terms of regular languages. In particular, an elegant and novel method for dealing with so-called recognition (or parsing) difficulties is described. Various applications of the extended technique are described, including a new method of automatically implementing the dismemberment transformation; an efficient way of implementing inversion in languages lacking a goto statement; and a new in-the-large implementation strategy.
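
To make the equivalence concrete (a hypothetical structure diagram, encoded with Python's re module rather than the thesis's transformation algorithm): in JSD, sequence corresponds to concatenation, iteration to the Kleene star, and selection to alternation, so a structure diagram denotes a regular expression over its event alphabet.

```python
import re

# Hypothetical Jackson structure diagram for a record stream:
#   FILE = HEADER ; RECORD* ; TRAILER,   RECORD = CREDIT | DEBIT
# Over the event alphabet {h, c, d, t} this diagram denotes h(c|d)*t,
# which in turn specifies a finite automaton accepting exactly those streams.
file_re = re.compile(r"h(c|d)*t")

assert file_re.fullmatch("hccdct")    # valid event stream is accepted
assert not file_re.fullmatch("hctd")  # events after the trailer: rejected
```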

Relevance: 100.00%

Abstract:

This thesis was concerned with investigating methods of improving the IOP pulse's potential as a measure of clinical utility. There were three principal sections to the work.

1. Optimisation of measurement and analysis of the IOP pulse. A literature review, covering the years 1960-2002 and other relevant scientific publications, provided a knowledge base on the IOP pulse. Initial studies investigated suitable instrumentation and measurement techniques. Fourier transformation was identified as a promising method of analysing the IOP pulse, and this technique was developed.

2. Investigation of ocular and systemic variables that affect IOP pulse measurements. In order to recognise clinically important changes in IOP pulse measurements, studies were performed to identify influencing factors. Fourier analysis was tested against traditional parameters to assess its ability to detect differences in the IOP pulse. In addition, it had been speculated that the waveform components of the IOP pulse contained vascular characteristics analogous to those found in arterial pulse waves; validation studies to test this hypothesis were attempted.

3. The nature of the intraocular pressure pulse in health and disease and its relation to systemic cardiovascular variables. Fourier analysis and traditional parameters were applied to IOP pulse measurements taken on diseased and healthy eyes. Only the derived parameter, pulsatile ocular blood flow (POBF), detected differences in the diseased groups. The use of an ocular pressure-volume relationship may have improved the POBF measure's variance in comparison with measurement of the pulse's amplitude or Fourier components. Finally, the importance of the driving force of pulsatile blood flow, the arterial pressure pulse, is highlighted. A method of combining measurements of pulsatile blood flow and pulsatile blood pressure to create a measure of ocular vascular impedance is described, along with its advantages for future studies.
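
The Fourier step can be sketched as follows (the synthetic waveform, sampling rate and frequencies are hypothetical, and numpy stands in for whatever analysis software the thesis used):

```python
import numpy as np

fs = 200.0                        # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
# Hypothetical IOP trace: 15 mmHg mean, ~1.2 Hz cardiac pulse plus a harmonic
iop = 15 + 1.0 * np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 2.4 * t)

spec = np.fft.rfft(iop - iop.mean())           # drop the steady (mean) level
freqs = np.fft.rfftfreq(iop.size, 1 / fs)
amps = 2 * np.abs(spec) / iop.size             # single-sided amplitude spectrum

fundamental = freqs[np.argmax(amps)]           # cardiac frequency of the pulse
harmonic_amps = amps[np.argmax(amps)::np.argmax(amps)][:3]  # first harmonics
```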