922 results for "Application specific algorithm"
Abstract:
The removal of chemicals in solution by overland flow from agricultural land has the potential to be a significant source of chemical loss where chemicals are applied to the soil surface, as in zero-tillage and surface-mulched farming systems. Currently, we lack a detailed understanding of the transfer mechanism between the soil solution and overland flow, particularly under field conditions. A model of solute transfer from soil solution to overland flow was developed. The model is based on the hypothesis that a solute is initially distributed uniformly throughout the soil pore space in a thin layer at the soil surface. A fundamental assumption of the model is that, at the time runoff commences, any solute at the soil surface that could be transported into the soil with the infiltrating water will already have been convected away from the area of potential exchange. Solute remaining at the soil surface is therefore not subject to further infiltration and may be approximated as a layer of tracer on a plane impermeable surface. The model fitted experimental data very well in all but one trial. The model in its present form focuses on the exchange of solute between the soil solution and surface water after the commencement of runoff. Future model development requires the relationship between the mass transfer parameters of the model and the time to runoff to be defined. This would enable the model to be used for extrapolation beyond the specific experimental results of this study. The close agreement between experimental results and model simulations shows that the simple transfer equation proposed in this study has promise for estimating solute loss to surface runoff. Copyright (C) 2000 John Wiley & Sons, Ltd.
Abstract:
A major challenge associated with using large chemical libraries synthesized on microscopic solid support beads is the rapid discrimination of individual compounds in these libraries. This challenge can be overcome by encoding the beads with 1 μm silica colloidal particles (reporters) that contain specific and identifiable combinations of fluorescent dyes. The colored bar code generated on support beads during combinatorial library synthesis can be easily, rapidly, and inexpensively decoded through the use of fluorescence microscopy. All reporters are precoated with polyelectrolytes [poly(acrylic acid), PAA; poly(sodium 4-styrenesulfonate), PSSS; polyethylenimine, PEI; and/or poly(diallyldimethylammonium chloride), PDADMAC] with the aim of enhancing surface charge, promoting electrostatic attraction to the bead, and facilitating polymer bridging between the bead and reporter for permanent adhesion. As shown in this article, reporters coated with polyelectrolytes clearly outperform uncoated reporters with regard to the quantity of attached reporters per bead (54 ± 23 in a 2500 μm² area for PEI/PAA-coated and 11 ± 6 for uncoated reporters) and minimization of cross-contamination (1 red reporter in a 2500 μm² area of green-labeled bead for PEI/PAA-coated and 26 ± 15 red reporters on green-labeled beads for uncoated reporters after 10 days). Examination of various polyelectrolyte systems shows that the magnitude of the zeta-potential of polyelectrolyte-coated reporters (-64 mV for PDADMAC/PSSS- and -42 mV for PEI/PAA-coated reporters) has no correlation with the number of reporters that adhere to the solid support beads (21 ± 16 in a 2500 μm² area for PDADMAC/PSSS- and 54 ± 23 for PEI/PAA-coated reporters).
Polymer bridging contributes far more to the adhesion than electrostatic attraction, as demonstrated by modification of the polyelectrolyte multilayers using gamma irradiation of precoated reporters either in aqueous solution or in polyelectrolyte solution.
Abstract:
Neurons in the central amygdala express two distinct types of ionotropic GABA receptor. One is the classical GABA(A) receptor that is blocked by low concentrations of bicuculline and positively modulated by benzodiazepines. The other is a novel type of ionotropic GABA receptor that is less sensitive to bicuculline but blocked by the GABA(C) receptor antagonist (1,2,5,6-tetrahydropyridin-4-yl)methylphosphinic acid (TPMPA) and by benzodiazepines. In this study, we examine the distribution of these two receptor types. Recordings of GABAergic miniature inhibitory postsynaptic currents (mIPSCs) showed a wide variation in amplitude. Most events had amplitudes below 100 pA. Large-amplitude events also had faster rise times than small-amplitude events. Large-amplitude events were fully blocked by 10 μM bicuculline but unaffected by TPMPA. Small-amplitude events were partially blocked by both bicuculline and TPMPA. Focal application of hypertonic sucrose to the soma evoked large-amplitude mIPSCs, whereas focal dendritic application of sucrose evoked small-amplitude mIPSCs. Thus inhibitory synapses on the dendrites of neurons in the central amygdala express both types of GABA receptor, but somatic synapses express purely GABA(A) receptors. Minimal stimulation revealed that inhibitory inputs arising from the laterally located intercalated cells innervate dendritic synapses, whereas inhibitory inputs of medial origin innervate somatic inhibitory synapses. These results show that different types of ionotropic GABA receptors are targeted to spatially and functionally distinct synapses. Thus benzodiazepines will have different modulatory effects on different inhibitory pathways in the central amygdala.
Abstract:
A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting. Copyright (C) 2004 John Wiley & Sons, Ltd.
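The estimation scheme described above can be sketched in simplified form. The block below is a minimal illustration of EM for a two-component Weibull mixture, assuming uncensored survival times and omitting the paper's random hospital effects and variance components; all function names are our own, not the authors':

```python
import numpy as np
from scipy.optimize import brentq

def weibull_logpdf(t, shape, scale):
    z = t / scale
    return np.log(shape / scale) + (shape - 1.0) * np.log(z) - z ** shape

def weighted_weibull_mle(t, w, k_lo=0.05, k_hi=50.0):
    """Weighted Weibull MLE: solve the profile score equation for the
    shape k numerically, then recover the scale in closed form."""
    logt = np.log(t)
    wsum = w.sum()
    def score(k):
        tk = t ** k
        return (np.sum(w * tk * logt) / np.sum(w * tk)
                - 1.0 / k
                - np.sum(w * logt) / wsum)
    k = brentq(score, k_lo, k_hi)
    scale = (np.sum(w * t ** k) / wsum) ** (1.0 / k)
    return k, scale

def em_weibull_mixture(t, n_iter=200, tol=1e-8):
    """EM for a two-component Weibull mixture on uncensored data."""
    # crude initialisation: split the sample at the median
    med = np.median(t)
    params = [weighted_weibull_mle(t, (t <= med).astype(float) + 1e-3),
              weighted_weibull_mle(t, (t > med).astype(float) + 1e-3)]
    mix = 0.5
    loglik = [-np.inf]
    for _ in range(n_iter):
        # E-step: responsibilities via log-sum-exp for stability
        l1 = np.log(mix) + weibull_logpdf(t, *params[0])
        l2 = np.log(1.0 - mix) + weibull_logpdf(t, *params[1])
        m = np.maximum(l1, l2)
        lse = m + np.log(np.exp(l1 - m) + np.exp(l2 - m))
        loglik.append(lse.sum())
        if loglik[-1] - loglik[-2] < tol:
            break
        r1 = np.clip(np.exp(l1 - lse), 1e-10, 1.0 - 1e-10)
        # M-step: update mixing proportion and component parameters
        mix = r1.mean()
        params = [weighted_weibull_mle(t, r1),
                  weighted_weibull_mle(t, 1.0 - r1)]
    return mix, params, loglik[1:]
```

Each EM iteration cannot decrease the observed-data log-likelihood, which is a useful correctness check when experimenting with such a sketch.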
Abstract:
Feature selection is one of the most important and frequently used techniques in data preprocessing. It can improve the efficiency and effectiveness of data mining by reducing the dimensionality of the feature space and removing irrelevant and redundant information. Feature selection can be viewed as a global optimization problem of finding a minimum set of M relevant features that describes the dataset as well as the original N attributes. In this paper, we apply the adaptive partitioned random search strategy in our feature selection algorithm. Under this search strategy, a partition structure and an evaluation function are proposed for the feature selection problem. The algorithm guarantees the globally optimal solution in theory and avoids complete randomness in the search direction. The good properties of our algorithm are shown through theoretical analysis.
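The general idea of an adaptive partitioned random search over feature subsets can be illustrated with a toy sketch. This is not the paper's partition structure or evaluation function (which it defines formally, with a theoretical optimality guarantee); it simply shows one way to partition the subset space by deciding one feature at a time and probing each partition with random completions. All names are hypothetical:

```python
import random

def adaptive_partitioned_search(n_features, evaluate, samples=20, seed=0):
    """Toy adaptive partitioned random search over feature subsets.
    At step j the unresolved search space is split into two regions
    (feature j included / excluded); each region is probed with random
    completions and the more promising region is retained."""
    rng = random.Random(seed)
    fixed = {}  # feature index -> included (True/False), once decided
    for j in range(n_features):
        best = {}
        for include in (True, False):
            region_best = float("-inf")
            for _ in range(samples):
                subset = []
                for i in range(n_features):
                    if i in fixed:
                        chosen = fixed[i]       # already decided
                    elif i == j:
                        chosen = include        # defines the region
                    else:
                        chosen = rng.random() < 0.5  # random completion
                    if chosen:
                        subset.append(i)
                region_best = max(region_best, evaluate(subset))
            best[include] = region_best
        fixed[j] = best[True] >= best[False]
    return sorted(i for i, inc in fixed.items() if inc)
```

With a scoring function that rewards informative features and lightly penalises subset size, the search focuses sampling on the partition containing the better subsets rather than drawing subsets uniformly at random.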
Abstract:
Prediction of carbohydrate fractions using equations from the Cornell Net Carbohydrate and Protein System (CNCPS) is a valuable tool to assess the nutritional value of forages. In this paper these carbohydrate fractions were predicted using data from three sunflower (Helianthus annuus L.) cultivars, fresh or as silage. The CNCPS equations for fractions B(2) and C include measurement of ash- and protein-free neutral detergent fibre (NDF) as one of their components. However, NDF lacks pectin and other non-starch polysaccharides that are found in the cell wall (CW) matrix, so this work compared the use of a crude CW preparation instead of NDF in the CNCPS equations. There were no differences in the estimates of fractions B(2) and C when CW replaced NDF; however, there were differences in fractions A and B(1). Some of the CNCPS equations could be simplified when using CW instead of NDF. Notably, lignin could be expressed as a proportion of DM, rather than on the basis of ash- and protein-free NDF, when predicting CNCPS fraction C. The CNCPS fraction B(1) (starch + pectin) values were lower than pectin determined through wet chemistry. This finding, along with the results obtained by the substitution of CW for NDF in the CNCPS equations, suggests that pectin was not part of fraction B(1) but present in fraction A. We suggest that pectin and other non-starch polysaccharides that are dissolved by the neutral detergent solution be allocated to a specific fraction (B(2)) and that another fraction (B(3)) be adopted for the digestible cell wall carbohydrates.
The structure of middle management remuneration packages: An application to Australian mine managers
Abstract:
This paper investigates the composition of remuneration packages for middle managers and relates the structure of remuneration contracts to firm-specific attributes. A statutorily defined position in a single industry is studied as an example of middle management. This allows us to control for differences in task complexity across managers and industry-induced factors that could determine differences in remuneration contracts. Higher-risk firms are expected to pay their mine managers a greater proportion of variable salaries and market and/or accounting-based compensation than low-risk firms. Results indicate that high-risk firms pay a higher proportion of variable salaries and more compensation based on market and/or accounting performance.
Abstract:
The treatment of lateral epicondylalgia, a widely-used model of musculoskeletal pain in the evaluation of many physical therapy treatments, remains somewhat of an enigma. The protagonists of a new treatment technique for lateral epicondylalgia report that it produces substantial and rapid pain relief, despite a lack of experimental evidence. A randomized, double blind, placebo-controlled repeated-measures study evaluated the initial effect of this new treatment in 24 patients with unilateral, chronic lateral epicondylalgia. Pain-free grip strength was assessed as an outcome measure before, during and after the application of the treatment, placebo and control conditions. Pressure-pain thresholds were also measured before and after the application of treatment, placebo and control conditions. The results demonstrated a significant and substantial increase in pain-free grip strength of 58% (of the order of 60 N) during treatment but not during placebo and control. In contrast, the 10% change in pressure-pain threshold after treatment, although significantly greater than placebo and control, was substantially smaller than the change demonstrated for pain-free grip strength. This effect was only present in the affected limb. The selective and specific effect of this treatment technique provides a valuable insight into the physical modulation of musculoskeletal pain and requires further investigation. (C) 2001 Harcourt Publishers Ltd.
Abstract:
An emerging idea is that long-term alcohol abuse results in changes in gene expression in the brain, and that these changes are responsible, at least partly, for alcohol tolerance, dependence and neurotoxicity. The overall goal of our research is to identify genes which are differentially expressed in the brains of well-characterized human alcoholics as compared with non-alcoholics. This should identify as-yet-unknown alcohol-responsive genes, and may well confirm changes in the expression of genes which have been delineated in animal models of alcohol abuse. Cases were carefully selected and samples pooled on the basis of relevant criteria; differential expression was monitored by microarray hybridization. The inherent diversity of human alcoholics can be exploited to identify genes associated with specific pathological processes, as well as to assess the effects of concomitant disease, severity of brain damage, drinking behavior, and factors such as gender and smoking history. Initial results show selective changes in gene expression in alcoholics; of particular importance is a coordinated reduction in genes coding for myelin components. Copyright (C) 2001 National Science Council, ROC and S. Karger AG, Basel.
Abstract:
Computational simulations of the title reaction are presented, covering a temperature range from 300 to 2000 K. At lower temperatures we find that initial formation of the cyclopropene complex by addition of methylene to acetylene is irreversible, as is the stabilisation process via collisional energy transfer. Product branching between propargyl and the stable isomers is predicted at 300 K as a function of pressure for the first time. At intermediate temperatures (1200 K), complex temporal evolution involving multiple steady states begins to emerge. At high temperatures (2000 K) the timescale for subsequent unimolecular decay of thermalized intermediates begins to impinge on the timescale for reaction of methylene, such that the rate of formation of propargyl product does not admit a simple analysis in terms of a single time-independent rate constant until the methylene supply becomes depleted. Likewise, at the elevated temperatures the thermalized intermediates cannot be regarded as irreversible product channels. Our solution algorithm involves spectral propagation of a symmetrised version of the discretized master equation matrix, and is implemented in a high precision environment which makes hitherto unachievable low-temperature modelling a reality.
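The solution strategy in the final sentences can be illustrated on a toy problem. The sketch below applies spectral propagation of a symmetrised master-equation matrix to a simple nearest-neighbour (birth-death) system obeying detailed balance, in ordinary double precision rather than the high-precision environment the authors describe; the model and all names are assumptions for illustration only:

```python
import numpy as np

def symmetrized_propagator(M, f_eq):
    """For dp/dt = M p with detailed balance and equilibrium f_eq,
    build B = F^{-1/2} M F^{1/2} (symmetric), eigendecompose it once,
    and return a function that propagates any p0 to any time t."""
    s = np.sqrt(f_eq)
    B = M * (s[None, :] / s[:, None])   # symmetric under detailed balance
    lam, V = np.linalg.eigh(B)          # real eigenvalues, one is zero
    def propagate(p0, t):
        q = V.T @ (p0 / s)
        return s * (V @ (np.exp(lam * t) * q))
    return propagate

def nearest_neighbour_M(E, kT=1.0, k_up=1.0):
    """Toy birth-death rate matrix with a Boltzmann equilibrium over
    energy levels E; down rates are fixed by detailed balance."""
    f = np.exp(-np.asarray(E, dtype=float) / kT)
    f /= f.sum()
    n = len(f)
    M = np.zeros((n, n))
    for i in range(n - 1):
        M[i + 1, i] = k_up                    # activation i -> i+1
        M[i, i + 1] = k_up * f[i] / f[i + 1]  # deactivation i+1 -> i
    M -= np.diag(M.sum(axis=0))               # columns sum to zero
    return M, f
```

Because the eigendecomposition is done once, the population vector can be evaluated directly at any time, which is what makes the complex temporal evolution (multiple steady states, slow thermalised-intermediate decay) described above accessible without step-by-step integration.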
Abstract:
The magnitude of genotype-by-management (G x M) interactions for grain yield and grain protein concentration was examined in a multi-environment trial (MET) involving a diverse set of 272 advanced breeding lines from the Queensland wheat breeding program. The MET was structured as a series of management regimes imposed at 3 sites for 2 years. The management regimes were generated at each site-year as separate trials in which planting time, N fertiliser application rate, cropping history, and irrigation were manipulated. Irrigation was used to simulate different rainfall regimes. From the combined analysis of variance, the G x M interaction variance components were found to be the largest source of G x E interaction variation for both grain yield (0.117 ± 0.005 t² ha⁻²; 49% of total G x E 0.238 ± 0.028 t² ha⁻²) and grain protein concentration (0.445 ± 0.020%²; 82% of total G x E 0.546 ± 0.057%²), and in both cases this source of variation was larger than the genotypic variance component (grain yield 0.068 ± 0.014 t² ha⁻² and grain protein 0.203 ± 0.026%²). The genotypic correlation between the traits varied considerably with management regime, ranging from -0.98 to -0.31, with an estimate of 0.0 for one trial. Pattern analysis identified advanced breeding lines with improved grain yield and grain protein concentration relative to the cultivars Hartog, Sunco and Meteor. It is likely that a large component of the previously documented G x E interactions for grain yield of wheat in the northern grains region is a result of G x M interactions. The implications of the strong influence of G x M interactions for the conduct of wheat breeding METs in the northern region are discussed. (C) 2001 Elsevier Science B.V. All rights reserved.
Abstract:
A methodology and framework for discipline-specific curriculum development in a local context is described. These activities, as part of the Thailand-Australia Science and Engineering Assistance Project (TASEAP), were in response to a needs analysis for curriculum assistance to a number of publicly-funded Thai universities in the engineering priority area of Materials Processing and Manufacturing. The paper outlines a strategy for the delivery of a centralised curriculum development workshop for academic staff, follow-up visits and local curriculum activities with participating universities, and the presentation of technical short courses, as guidance for such activity in other settings and/or discipline areas. This paper is part of a process of documentation so that others can apply the developed methodology and framework for curriculum development. While the paper is a report on curriculum activities in a particular setting, it is written in a manner that allows application of the methodology to other settings. The reader is advised that each curriculum activity needs to adopt a methodology and strategy to fit the particular circumstances being considered. To assist in applying this approach elsewhere, a description of the various steps in the curriculum process, and typical responses to some of the more global issues, has been presented. Full details are available in the various TASEAP reports prepared by the authors. Specific detail has been omitted where it does not provide any information for generalized consumption.
Abstract:
We present global and regional rates of brain atrophy measured on serially acquired T1-weighted brain MR images for a group of Alzheimer's disease (AD) patients and age-matched normal control (NC) subjects, using the analysis procedure described in Part I. Three rates of brain atrophy were evaluated for 14 AD patients and 14 age-matched NC subjects: the rate of atrophy in the cerebrum, the rate of lateral ventricular enlargement, and the rate of atrophy in the region of the temporal lobes. All three rates showed significant differences between the two groups. However, the greatest separation of the two groups was obtained when the regional rates were combined. This application has demonstrated that rates of brain atrophy based on MR images, especially in specific regions of the brain, can provide sensitive measures for evaluating the progression of AD. These measures will be useful for the evaluation of therapeutic effects of novel therapies for AD. (C) 2002 Elsevier Science Inc. All rights reserved.
Abstract:
In microarray studies, clustering techniques are often applied to derive meaningful insights into the data. In the past, hierarchical methods have been the primary clustering tool employed to perform this task, and the hierarchical algorithms have been applied mainly heuristically to these cluster analysis problems. Further, a major limitation of these methods is their inability to determine the number of clusters. Thus there is a need for a model-based approach to these clustering problems. To this end, McLachlan et al. [7] developed a mixture model-based algorithm (EMMIX-GENE) for the clustering of tissue samples. To further investigate the EMMIX-GENE procedure as a model-based approach, we present a case study involving the application of EMMIX-GENE to the breast cancer data studied recently in van 't Veer et al. [10]. Our analysis considers the problem of clustering the tissue samples on the basis of the genes, which is a non-standard problem because the number of genes greatly exceeds the number of tissue samples. We demonstrate how EMMIX-GENE can be useful in reducing the initial set of genes down to a more computationally manageable size. The results from this analysis also emphasise the difficulty associated with the task of separating two tissue groups on the basis of a particular subset of genes. These results also shed light on why supervised methods have such a high misallocation error rate for the breast cancer data.
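EMMIX-GENE's gene-screening step ranks genes by a likelihood-ratio statistic comparing one versus two mixture components fitted to each gene's values across the tissues. A rough sketch of that idea is shown below, using normal components in place of the t components used by EMMIX-GENE (a simplifying assumption), with hypothetical function names:

```python
import numpy as np

def _norm_logpdf(x, mu, sd):
    return -0.5 * ((x - mu) / sd) ** 2 - np.log(sd * np.sqrt(2.0 * np.pi))

def two_component_loglik(x, max_iter=200, tol=1e-8):
    """Maximised log-likelihood of a two-component univariate normal
    mixture fitted by EM (a simplification of EMMIX-GENE's t mixtures)."""
    mu = np.percentile(x, [25.0, 75.0])        # crude initialisation
    sd_floor = 1e-3 * (x.std() + 1e-12)        # guard against spikes
    sd = np.full(2, x.std() + 1e-12)
    pi = np.array([0.5, 0.5])
    ll = -np.inf
    for _ in range(max_iter):
        # E-step (log-sum-exp for stability)
        logdens = np.log(pi) + _norm_logpdf(x[:, None], mu, sd)
        m = logdens.max(axis=1, keepdims=True)
        lse = m + np.log(np.exp(logdens - m).sum(axis=1, keepdims=True))
        new_ll = lse.sum()
        if new_ll - ll < tol:
            break
        ll = new_ll
        r = np.exp(logdens - lse)
        # M-step
        nk = r.sum(axis=0)
        pi = np.clip(nk / len(x), 1e-12, 1.0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sd = np.maximum(np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk),
                        sd_floor)
    return ll

def gene_lrt(x):
    """-2 log(lambda) for one normal component vs a two-component mixture;
    larger values flag genes whose expression separates the tissues."""
    x = np.asarray(x, dtype=float)
    ll1 = _norm_logpdf(x, x.mean(), x.std() + 1e-12).sum()
    return 2.0 * (two_component_loglik(x) - ll1)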
Abstract:
A proportion of melanoma-prone individuals in both familial and non-familial contexts has been shown to carry inactivating mutations in either CDKN2A or, rarely, CDK4. CDKN2A is a complex locus that encodes two unrelated proteins from alternatively spliced transcripts that are read in different frames. The alpha transcript (exons 1alpha, 2, and 3) produces the p16INK4A cyclin-dependent kinase inhibitor, while the beta transcript (exons 1beta and 2) is translated as p14ARF, a stabilizing factor of p53 levels through binding to MDM2. Mutations in exon 2 can impair both polypeptides, and insertions and deletions in exons 1alpha, 1beta, and 2 can theoretically generate p16INK4A-p14ARF fusion proteins. No online database currently takes into account all the consequences of these genotypes, a situation compounded by some problematic previous annotations of CDKN2A-related sequences and descriptions of their mutations. As an initiative of the international Melanoma Genetics Consortium, we have therefore established a database of germline variants observed in all loci implicated in familial melanoma susceptibility. Such a comprehensive, publicly accessible database is an essential foundation for research on melanoma susceptibility and its clinical application. Our database serves two types of data as defined by HUGO. The core dataset includes the nucleotide variants on the genomic and transcript levels, amino acid variants, and citation. The ancillary dataset includes keyword descriptions of events at the transcription and translation levels and epidemiological data. The application that handles users' queries was designed in the model-view-controller architecture and was implemented in Java. The object-relational database schema was deduced using functional dependency analysis. We hereby present our first functional prototype of eMelanoBase. The service is accessible via the URL www.wmi.usyd.edu.au:8080/melanoma.html.