948 results for COMBINING CLASSIFIERS
Abstract:
T cells recognize peptide epitopes bound to major histocompatibility complex molecules. Human T-cell epitopes have diagnostic and therapeutic applications in autoimmune diseases. However, their accurate definition within an autoantigen by T-cell bioassay, usually proliferation, involves many costly peptides and a large amount of blood. We have therefore developed a strategy to predict T-cell epitopes and applied it to tyrosine phosphatase IA-2, an autoantigen in IDDM, and HLA-DR4(*0401). First, the binding of synthetic overlapping peptides encompassing IA-2 to purified DR4 was measured directly. Secondly, a large set of HLA-DR4 binding data was analysed by alignment using a genetic algorithm and used to train an artificial neural network to predict the affinity of binding. This bioinformatic prediction method was then validated experimentally and used to predict DR4-binding peptides in IA-2. The binding set encompassed 85% of experimentally determined T-cell epitopes. Both the experimental and bioinformatic methods had high negative predictive values, 92% and 95%, indicating that this strategy of combining experimental results with computer modelling should lead to a significant reduction in the amount of blood and the number of peptides required to define T-cell epitopes in humans.
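The prediction step described above, fitting a network to measured binding data so it can score new peptides, can be sketched roughly as follows. This is a minimal illustrative stand-in, not the authors' architecture: the one-hot encoding, the single-layer regressor trained by gradient descent, and the toy peptides and affinities are all assumptions for demonstration.

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(peptide):
    """Encode a 9-mer peptide as a flat one-hot vector (9 positions x 20 residues)."""
    vec = np.zeros((9, len(AMINO_ACIDS)))
    for i, aa in enumerate(peptide):
        vec[i, AMINO_ACIDS.index(aa)] = 1.0
    return vec.ravel()

def train_affinity_model(peptides, affinities, epochs=500, lr=0.1):
    """Fit a single linear layer to (peptide, affinity) pairs by gradient descent."""
    X = np.stack([one_hot(p) for p in peptides])
    y = np.asarray(affinities, dtype=float)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        err = X @ w + b - y          # prediction error on the training set
        w -= lr * X.T @ err / len(y)  # mean-squared-error gradient step
        b -= lr * err.mean()
    return w, b

def predict_affinity(w, b, peptide):
    """Score an unseen peptide with the fitted weights."""
    return one_hot(peptide) @ w + b
```

In practice a hidden layer and a much larger, experimentally measured training set would be used; the point is only the shape of the pipeline (encode, fit, predict).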
Abstract:
The cost of spatial join processing can be very high because of the large sizes of spatial objects and the computation-intensive spatial operations. While parallel processing seems a natural solution to this problem, it is not clear how spatial data can be partitioned for this purpose. Various spatial data partitioning methods are examined in this paper. A framework combining the data-partitioning techniques used by most parallel join algorithms in relational databases and the filter-and-refine strategy for spatial operation processing is proposed for parallel spatial join processing. Object duplication caused by multi-assignment in spatial data partitioning can result in extra CPU cost as well as extra communication cost. We find that the key to overcoming this problem is to preserve spatial locality in task decomposition. We show in this paper that a near-optimal speedup can be achieved for parallel spatial join processing using our new algorithms.
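The partitioning scheme and the duplication problem the abstract describes can be illustrated with a small sketch: objects are assigned to every grid cell their bounding rectangle overlaps (multi-assignment), each cell is joined independently (the filter step, which parallelizes naturally), and duplicate result pairs from multi-assigned objects are removed at the end. The grid layout, function names, and data are illustrative assumptions, not the paper's algorithms.

```python
from itertools import product

def cells_for(mbr, cell):
    """All grid cells a bounding rectangle (xmin, ymin, xmax, ymax) overlaps.
    An object spanning several cells is assigned to each: multi-assignment."""
    xmin, ymin, xmax, ymax = mbr
    xs = range(int(xmin // cell), int(xmax // cell) + 1)
    ys = range(int(ymin // cell), int(ymax // cell) + 1)
    return set(product(xs, ys))

def partition(objects, cell):
    """Group (id, mbr) pairs by grid cell; one group per parallel task."""
    parts = {}
    for oid, mbr in objects.items():
        for c in cells_for(mbr, cell):
            parts.setdefault(c, []).append((oid, mbr))
    return parts

def intersects(a, b):
    """Rectangle-overlap test used as the filter predicate."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def spatial_join_filter(r, s, cell=10.0):
    """Filter step of a partitioned spatial join over two object sets."""
    pr, ps = partition(r, cell), partition(s, cell)
    result = set()  # a set drops pairs reported by more than one partition
    for c in pr.keys() & ps.keys():
        for rid, rm in pr[c]:
            for sid, sm in ps[c]:
                if intersects(rm, sm):
                    result.add((rid, sid))
    return result
```

The surviving candidate pairs would then go to the refine step, where exact geometries are compared; the extra CPU and communication cost the abstract mentions comes from objects replicated across many cells.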
Abstract:
The Hubble Deep Field South (HDF-S) Hubble Space Telescope (HST) observations are expected to begin in 1998 October. We present a composite spectrum of the QSO in the HDF-S field covering UV/optical/near-IR wavelengths, obtained by combining data from the Australian National University 2.3 m telescope with STIS on the HST.(1) This intermediate-resolution spectrum covers the range 1600-10000 Angstrom and allows us to derive some basic information on the intervening absorption systems which will be important in planning future higher resolution studies of this QSO. The QSO J2233 - 606 coordinates are alpha = 22(h)33(m)37(s).6, delta = -60 degrees 33'29'' (J2000), the magnitude is B = 17.5, and its redshift is z(em) = 2.238, derived by simultaneously fitting several emission lines. The spectral index is alpha = -0.7 +/- 0.1, measured between the Ly alpha and Mg II emission lines. Many absorption systems are present, including systems with metal lines redward of the Ly alpha emission line at z(abs) = 2.204, 1.942, 1.870, 1.787 and a few very strong Ly alpha features at z(abs) = 2.077, 1.928, without similarly strong metal lines. There is a conspicuous Lyman limit (LL) absorption system that is most likely associated with the z(abs) = 1.942 system with a neutral hydrogen column density of N-HI = (3.1 +/- 1.0) x 10(17) cm(-2). There is some evidence for the presence of a second LL absorber just to the blue of the conspicuous system at z = 1.870. We have employed a new technique, based on an analysis of the shape of the observed spectrum in the region of the LL absorption, to explore the properties of the gas. We tentatively conclude that this system might have suitable characteristics for measuring the deuterium-to-hydrogen (D/H) ratio.
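A spectral index like the alpha = -0.7 quoted above relates continuum flux at two wavelengths by a power law. Assuming the common f_nu ∝ nu^alpha convention (the abstract does not state which convention it uses), the index recoverable from two flux measurements is:

```python
import math

def spectral_index(wl1, f1, wl2, f2):
    """Power-law index alpha for f_nu proportional to nu**alpha,
    given fluxes f1, f2 at wavelengths wl1, wl2 (nu proportional to 1/wl)."""
    return math.log(f2 / f1) / math.log(wl1 / wl2)
```

For example, fluxes measured near the Ly alpha (1216 Angstrom rest) and Mg II (2798 Angstrom rest) lines would be the two anchor points in this case.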
Abstract:
A t-J model for correlated electrons with impurities is proposed. The impurities are introduced in such a way that integrability of the model in one dimension is not violated. The algebraic Bethe ansatz solution of the model is also given and it is shown that the Bethe states are highest weight states with respect to the supersymmetry algebra gl(2/1).
Abstract:
Computer models can be combined with laboratory experiments for the efficient determination of (i) peptides that bind MHC molecules and (ii) T-cell epitopes. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures. This requires the definition of standards and experimental protocols for model application. We describe the requirements for validation and assessment of computer models. The utility of combining accurate predictions with a limited number of laboratory experiments is illustrated by practical examples. These include the identification of T-cell epitopes from IDDM-, melanoma- and malaria-related antigens by combining computational and conventional laboratory assays. The success rate in determining antigenic peptides, each in the context of a specific HLA molecule, ranged from 27 to 71%, while the natural prevalence of MHC-binding peptides is 0.1-5%.
Abstract:
First of all, we would like to clarify that the passive to active transition was determined not by using Solgasmix [1], but by combining thermodynamic equilibrium and mass balance for the oxidation of SiC under pure CO2 and pure CO. The model used in our paper [2] was an extension of Wagner's model [3], in a similar way as Balat et al. [4] did for the oxidation of SiC in oxygen.
Abstract:
We examined the effect of recombinant human growth hormone (rhGH) and/or recombinant human insulin-like growth factor-I (rhIGF-I) on regional fat loss in postmenopausal women undergoing a weight loss regimen of diet plus exercise. Twenty-seven women aged 59-79 years, 20-40% above ideal body weight, completed a 12-week program consisting of resistance training 2 days/week and walking 3 days/week, while consuming a diet that was 500 kcal/day less than that required for weight maintenance. Participants were randomly assigned in a double-blind fashion to receive rhGH (0.025 mg/kg BW/day: n=7), rhIGF-I (0.015 mg/kg BW/day: n=7), rhGH + rhIGF-I (n = 6), or placebo (PL: n = 7). Regional and whole body fat mass were determined by dual X-ray absorptiometry. Body fat distribution was assessed by the ratios of trunk fat-to-limb fat (TrF/LimbF) and trunk fat-to-total fat (TrF/TotF). Limb and trunk fat decreased in all groups (p < 0.01). For both ratios of fat distribution, the rhGH treated group experienced an enhanced loss of truncal compared to peripheral fat (p <= 0.01), with no significant change for those administered rhIGF-I or PL. There was no association between change in fat distribution and indices of cardiovascular disease risk as determined by serum lipid/lipoprotein levels and maximal aerobic capacity. These results suggest that administration of rhGH facilitates a decrease in central compared to peripheral fat in older women undertaking a weight loss program that combines exercise and moderate caloric restriction, although no beneficial effects are conferred to lipid/lipoprotein profiles. Further, the effect of rhGH is not enhanced by combining rhGH with rhIGF-I administration. In addition, rhIGF-I does not augment the loss of trunk fat when administered alone.
Abstract:
This paper presents an overview of the MPEG-7 Description Definition Language (DDL). The DDL provides the syntactic rules for creating, combining, extending and refining MPEG-7 Descriptors (Ds) and Description Schemes (DSs). In the interests of interoperability, the W3C's XML Schema language, with the addition of certain MPEG-7-specific extensions, has been chosen as the DDL. This paper describes the background to this decision and, using examples, provides an overview of the core XML Schema features used within MPEG-7 and the extensions made in order to satisfy the MPEG-7 DDL requirements.
Abstract:
Examples from the Murray-Darling basin in Australia are used to illustrate different methods of disaggregation of reconnaissance-scale maps. One approach to disaggregation revolves around the de-convolution of the soil-landscape paradigm elaborated during a soil survey. The descriptions of soil map units and block diagrams in a soil survey report detail soil-landscape relationships or soil toposequences that can be used to disaggregate map units into component landscape elements. Toposequences can be visualised on a computer by combining soil maps with digital elevation data. Expert knowledge or statistics can be used to implement the disaggregation. Use of a restructuring element and k-means clustering are illustrated. Another approach to disaggregation uses training areas to develop rules to extrapolate detailed mapping into other, larger areas where detailed mapping is unavailable. A two-level decision tree example is presented. At one level, the decision tree method is used to capture mapping rules from the training area; at another level, it is used to define the domain over which those rules can be extrapolated. (C) 2001 Elsevier Science B.V. All rights reserved.
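The k-means step mentioned above, grouping terrain cells into landscape elements by their attributes, can be sketched as follows. The attribute choice (e.g. elevation and slope per grid cell) and the plain Lloyd-style implementation are illustrative assumptions, not the survey's actual procedure.

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: cluster rows of `points` (e.g. [elevation, slope]
    per terrain cell) into k landscape-element groups."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest current center
        d = np.linalg.norm(points[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers
```

Each resulting cluster label would then be interpreted (by expert knowledge or the survey's toposequence descriptions) as one landscape element of the original map unit.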
Abstract:
A multiparametric extension of the anisotropic U model is discussed which maintains integrability. The R-matrix solving the Yang-Baxter equation is obtained through a twisting construction applied to the underlying U_q(sl(2/1)) superalgebraic structure which introduces the additional free parameters that arise in the model. Three forms of Bethe ansatz solution for the transfer matrix eigenvalues are given which we show to be equivalent.
Abstract:
Background. A decline in muscle mass and muscle strength characterizes normal aging. As clinical and animal studies show a relationship between higher cytokine levels and low muscle mass, the aim of this study was to investigate whether markers of inflammation are associated with muscle mass and strength in well-functioning elderly persons. Methods. We used baseline data (1997-1998) of the Health, Aging, and Body Composition (Health ABC) Study on 3075 black and white men and women aged 70-79 years. Midthigh muscle cross-sectional area (computed tomography), appendicular muscle mass (dual-energy x-ray absorptiometry), isokinetic knee extensor strength (KinCom), and isometric grip strength were measured. Plasma levels of interleukin-6 (IL-6) and tumor necrosis factor-alpha (TNF-alpha) were assessed by enzyme-linked immunosorbent assay (ELISA). Results. Higher cytokine levels were generally associated with lower muscle mass and lower muscle strength. The most consistent relationship across the gender and race groups was observed for IL-6 and grip strength: per SD increase in IL-6, grip strength was 1.1 to 2.4 kg lower (p < .05) after adjustment for age, clinic site, health status, medications, physical activity, smoking, height, and body fat. An overall measure of elevated cytokine level was created by combining the levels of IL-6 and TNF-alpha. With the exception of white men, elderly persons having high levels of IL-6 (> 1.80 pg/ml) as well as high levels of TNF-alpha (> 3.20 pg/ml) had a smaller muscle area, less appendicular mass, a lower knee extensor strength, and a lower grip strength compared to those with low levels of both cytokines. Conclusions. Higher plasma concentrations of IL-6 and TNF-alpha are associated with lower muscle mass and lower muscle strength in well-functioning older men and women. Higher cytokine levels, as often observed in healthy older persons, may contribute to the loss of muscle mass and strength that accompanies aging.
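The combined cytokine measure described above is a simple conjunction of the two stated cutoffs; a sketch of that indicator (the function name is illustrative, the cutoffs are the ones quoted in the abstract):

```python
def elevated_cytokines(il6, tnfa, il6_cut=1.80, tnfa_cut=3.20):
    """Overall elevated-cytokine flag: IL-6 AND TNF-alpha both above
    their study cutoffs (pg/ml), per the thresholds in the abstract."""
    return il6 > il6_cut and tnfa > tnfa_cut
```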
Abstract:
In contrast to curative therapies, preventive therapies are administered to largely healthy individuals over long periods. The risk-benefit and cost-benefit ratios are more likely to be unfavourable, making treatment decisions difficult. Drug trials provide insufficient information for treatment decisions, as they are conducted on highly selected populations over short durations, estimate only relative benefits of treatment and offer little information on risks and costs. Epidemiological modelling is a method of combining evidence from observational epidemiology and clinical trials to assist in clinical and health policy decision-making. It can estimate absolute benefits, risks and costs of long-term preventive strategies, and thus allow their precise targeting to individuals for whom they are safest and most cost-effective. Epidemiological modelling also allows explicit information about risks and benefits of therapy to be presented to patients, facilitating informed decision-making.
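The core arithmetic behind the modelling the abstract describes, turning a trial's relative benefit into an absolute benefit for a given patient group, can be sketched as follows. The function and the example numbers are illustrative, not taken from any particular model.

```python
def absolute_benefit(baseline_risk, relative_risk):
    """Combine observational epidemiology (baseline_risk: a group's event
    probability without treatment) with trial evidence (relative_risk:
    risk ratio under treatment). Returns (absolute risk reduction,
    number needed to treat)."""
    arr = baseline_risk * (1.0 - relative_risk)
    nnt = 1.0 / arr if arr > 0 else float("inf")
    return arr, nnt
```

For instance, a treatment with relative risk 0.75 gives a 5-percentage-point absolute reduction in a group whose baseline risk is 20%, but only a 0.5-point reduction at 2% baseline risk, which is why targeting matters when weighing long-term risks and costs.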
Abstract:
The aim of this study is to create a two-tiered assessment combining restoration and conservation, both needed for biodiversity management. The first tier of this approach assesses the condition of a site using a standard bioassessment method, AUSRIVAS, to determine whether significant loss of biodiversity has occurred because of human activity. The second tier assesses the conservation value of sites that were determined to be unimpacted in the first step against a reference database. This ensures maximum complementarity without having to set a priori target areas. Using the reference database, we assign site-specific and comparable coefficients for both restoration (Observed/Expected taxa with > 50% probability of occurrence) and conservation values (O/E taxa with < 50% probability, rare taxa). In a trial on 75 sites on rivers around Sydney, NSW, Australia we were able to identify three regions: (1) an area that may need restoration; (2) an area that had a high conservation value; and (3) a region that was identified as having significant biodiversity loss but with high potential to respond to rehabilitation and become a biodiversity hotspot. These examples highlight the use of the new framework as a comprehensive system for biodiversity assessment.
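An O/E coefficient of the kind used above compares the taxa actually observed at a site with those the reference model expects there. A minimal sketch, assuming per-taxon occurrence probabilities from the reference database (the data structure and names are illustrative):

```python
def observed_expected(probabilities, observed, cutoff=0.5):
    """O/E score over taxa whose predicted probability of occurrence
    exceeds `cutoff` (0.5 for the restoration coefficient above).
    probabilities: {taxon: modelled probability at this site}
    observed: set of taxa actually collected at the site."""
    expected = sum(p for p in probabilities.values() if p > cutoff)
    obs = sum(1 for taxon, p in probabilities.items()
              if p > cutoff and taxon in observed)
    return obs / expected if expected else float("nan")
```

A score near 1 means the site retains the taxa expected of an unimpacted reference site; scores well below 1 indicate biodiversity loss.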
Abstract:
Canine parasitic zoonoses pose a continuing public health problem, especially in developing countries and communities that are socioeconomically disadvantaged. Our study combined the use of conventional and molecular epidemiological tools to determine the role of dogs in transmission of gastrointestinal (GI) parasites such as hookworms, Giardia and Ascaris in a parasite-endemic tea-growing community in northeast India. A highly sensitive and specific molecular tool was developed to detect and differentiate the zoonotic species of canine hookworm eggs directly from faeces. This allowed epidemiological screening of canine hookworm species in this community to be conducted with ease and accuracy. The zoonotic potential of canine Giardia was also investigated by characterising Giardia duodenalis recovered from humans and dogs living in the same locality and households at three different loci. Phylogenetic and epidemiological analysis provided compelling evidence to support the zoonotic transmission of canine Giardia. Molecular tools were also used to identify the species of Ascaris egg present in over 30% of dog faecal samples. The results demonstrated the role of dogs as a significant disseminator and environmental contaminator of Ascaris lumbricoides in communities where promiscuous defecation practices exist. Our study demonstrated the usefulness of combining conventional and molecular parasitological and epidemiological tools to help solve unresolved relationships with regard to parasitic zoonoses.
Abstract:
Quantifying mass and energy exchanges within tropical forests is essential for understanding their role in the global carbon budget and how they will respond to perturbations in climate. This study reviews ecosystem process models designed to predict the growth and productivity of temperate and tropical forest ecosystems. Temperate forest models were included because of the minimal number of tropical forest models. The review provides a multiscale assessment enabling potential users to select a model suited to the scale and type of information they require in tropical forests. Process models are reviewed in relation to their input and output parameters, minimum spatial and temporal units of operation, maximum spatial extent and time period of application for each organization level of modelling. Organizational levels included leaf-tree, plot-stand, regional and ecosystem levels, with model complexity decreasing as the time-step and spatial extent of model operation increases. All ecosystem models are simplified versions of reality and are typically aspatial. Remotely sensed data sets and derived products may be used to initialize, drive and validate ecosystem process models. At the simplest level, remotely sensed data are used to delimit location, extent and changes over time of vegetation communities. At a more advanced level, remotely sensed data products have been used to estimate key structural and biophysical properties associated with ecosystem processes in tropical and temperate forests. Combining ecological models and image data enables the development of carbon accounting systems that will contribute to understanding greenhouse gas budgets at biome and global scales.