31 results for "hands-on approach"
Abstract:
The binary diffusivities of water in low molecular weight sugars (fructose, sucrose) and in a high molecular weight carbohydrate, maltodextrin (DE 11), and the effective diffusivities of water in mixtures of these sugars (sucrose, glucose, fructose) and maltodextrin (DE 11) were determined using a simplified procedure based on the Regular Regime Approach. The effective diffusivity of these mixtures exhibited both concentration and molecular weight dependence. Surface stickiness was observed in all samples during desorption, with fructose exhibiting the highest stickiness and maltodextrin the lowest. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
A thermodynamic approach based on the Bender equation of state is suggested for the analysis of supercritical gas adsorption on activated carbons at high pressure. The approach accounts for the equality of the chemical potential in the adsorbed phase and in the corresponding bulk phase, and for the distribution of elements of the adsorption volume (EAV) over the potential energy of gas-solid interaction. This scheme is extended to subcritical fluid adsorption and takes into account the phase transition in the EAV. The method is adapted to gravimetric measurements of mass excess adsorption and has been applied to the adsorption of argon, nitrogen, methane, ethane, carbon dioxide, and helium on activated carbon Norit R1 in the temperature range from 25 to 70 °C. The distribution function of adsorption volume elements over potentials exhibits overlapping peaks and is consistently reproduced for different gases. It was found that the distribution function changes weakly with temperature, which was confirmed by comparison with the distribution function obtained by the same method from the nitrogen adsorption isotherm at 77 K. It was shown that parameters such as pore volume and skeleton density can be determined directly from adsorption measurements, while the conventional approach of helium expansion at room temperature can lead to erroneous results due to the adsorption of helium in small pores of activated carbon. The approach is a convenient tool for the analysis and correlation of excess adsorption isotherms over a wide range of pressure and temperature, and can be readily extended to the analysis of multicomponent adsorption systems. (C) 2002 Elsevier Science (USA).
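The "mass excess adsorption" measured gravimetrically in this abstract is conventionally the Gibbs excess: the absolute amount adsorbed minus the amount of bulk gas that would occupy the adsorption volume. The sketch below illustrates only that standard relation; the function name and all numbers are invented for illustration and are not data from the study.

```python
# Hypothetical illustration of the Gibbs excess-adsorption relation:
# n_excess = n_absolute - rho_bulk * V_adsorption.
# All numbers are made-up placeholders, not values from the paper.

def excess_adsorption(n_absolute, rho_bulk, v_ads):
    """Excess amount adsorbed (mol/kg) from the absolute amount (mol/kg),
    bulk gas density (mol/m^3), and adsorption volume (m^3/kg)."""
    return n_absolute - rho_bulk * v_ads

# Example: 5.0 mol/kg absolute uptake, bulk density 2000 mol/m^3,
# adsorption volume 0.5e-3 m^3/kg -> excess = 5.0 - 1.0 = 4.0 mol/kg
print(excess_adsorption(5.0, 2000.0, 0.5e-3))
```

At low pressure the bulk-density correction is negligible and excess and absolute adsorption nearly coincide; at the high pressures studied here (up to tens of MPa) the correction dominates, which is why an accurate bulk equation of state such as Bender's matters.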
Abstract:
Adsorption of nitrogen, argon, methane, and carbon dioxide on activated carbon Norit R1 over a wide range of pressure (up to 50 MPa) at temperatures from 298 to 343 K (supercritical conditions) is analyzed by means of density functional theory modified by incorporating the Bender equation of state, which describes the bulk phase properties with very high accuracy. This has allowed us to precisely describe the experimental data of carbon dioxide adsorption slightly above and below its critical temperature. The pore size distribution (PSD) obtained with supercritical gases at ambient temperatures compares reasonably well with the PSD obtained with subcritical nitrogen at 77 K. Our approach does not require the skeletal density of activated carbon from helium adsorption measurements to calculate excess adsorption. Instead, this density is treated as a fitting parameter, and in all cases its values are found to fall into a very narrow range close to 2000 kg/m³. It was shown that in the case of high-pressure adsorption of supercritical gases the PSD can be reliably obtained for pore widths between 0.6 and 3 nm. Wider pores can be reliably characterized only in terms of surface area, as their corresponding excess local isotherms are the same over a practical range of pressure.
Abstract:
We consider the problem of assessing the number of clusters in a limited number of tissue samples containing gene expressions for possibly several thousands of genes. It is proposed to use a normal mixture model-based approach to the clustering of the tissue samples. One advantage of this approach is that the question of the number of clusters in the data can be formulated in terms of a test on the smallest number of components in the mixture model compatible with the data. This test can be carried out on the basis of the likelihood ratio test statistic, using resampling to assess its null distribution. The effectiveness of this approach is demonstrated on simulated data and on some microarray datasets considered previously in the bioinformatics literature. (C) 2004 Elsevier Inc. All rights reserved.
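The resampling step described in this abstract can be made concrete with a minimal sketch. The idea: refit the k- and (k+1)-component models to data simulated under the fitted k-component null, collect the replicate likelihood-ratio statistics, and compare the observed statistic against them. The function and all numbers below are invented for illustration, not from the paper.

```python
# Minimal sketch (invented numbers) of assessing the null distribution of
# the likelihood-ratio test statistic by resampling: the observed statistic
# is compared to replicates generated under the k-component null model.

def bootstrap_p_value(observed_lrt, null_lrts):
    """Resampling p-value: the fraction of null-model LRT replicates at
    least as large as the observed statistic (+1 continuity correction)."""
    exceed = sum(1 for t in null_lrts if t >= observed_lrt)
    return (exceed + 1) / (len(null_lrts) + 1)

# Hypothetical values: observed -2 log LRT of 9.3 against nine bootstrap
# replicates; a small p-value favours the (k+1)-component model.
null_stats = [2.1, 4.7, 9.9, 3.3, 1.2, 6.8, 0.9, 5.4, 2.2]
print(bootstrap_p_value(9.3, null_stats))  # (1 + 1) / (9 + 1) = 0.2
```

Resampling is needed here because the usual chi-squared reference distribution for the LRT fails at the boundary of the parameter space when testing the number of mixture components.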
Abstract:
Commercial explosives behave non-ideally in rock blasting. A direct and convenient measure of non-ideality is the detonation velocity. In this study, an alternative model fitted to experimental unconfined detonation velocity data is proposed, and the effect of confinement on the detonation velocity is modelled. Unconfined data for several explosives showing various levels of non-ideality were successfully modelled. The effect of confinement on detonation velocity was modelled empirically based on field detonation velocity measurements. Confined detonation velocity is a function of the ideal detonation velocity, the unconfined detonation velocity at a given blasthole diameter, and the rock stiffness. For a given explosive and charge diameter, detonation velocity increases as confinement increases. The confinement model is implemented in a simple engineering-based non-ideal detonation model. A number of simulations are carried out and analysed to predict the explosive performance parameters for the adopted blasting conditions.
Abstract:
In vitro evolution imitates the natural evolution of genes and has been applied very successfully to the modification of coding sequences, but it has not yet been applied to promoter sequences. We propose an alternative method for functional promoter analysis by applying an in vitro evolution scheme consisting of rounds of error-prone PCR, followed by DNA shuffling and selection of mutant promoter activities. We modified the activity in embryogenic sugarcane cells of the promoter region of the Goldfinger isolate of banana streak virus and obtained mutant promoter sequences that showed an average mutation rate of 2.5% after one round of error-prone PCR and DNA shuffling. Selection and sequencing of promoter sequences with decreased or unaltered activity allowed us to rapidly map the position of one cis-acting element that influenced promoter activity in embryogenic sugarcane cells and to discover neutral mutations that did not affect promoter function. Applying this selective-shotgun promoter analysis method immediately after the promoter boundaries have been defined by 5' deletion analysis dramatically reduces the labor associated with traditional linker-scanning deletion analysis to reveal the position of functional promoter domains. Furthermore, this method allows the entire promoter to be investigated at once, rather than selected domains or nucleotides, increasing the prospect of identifying interacting promoter regions.
Abstract:
Many studies on birds focus on the collection of data through an experimental design suitable for investigation in a classical analysis of variance (ANOVA) framework. Although many findings are confirmed by one or more experts, expert information is rarely used in conjunction with the survey data to enhance the explanatory and predictive power of the model. We explore this neglected aspect of ecological modelling through a study on Australian woodland birds, focusing on the potential impact of different intensities of commercial cattle grazing on bird density in woodland habitat. We examine a number of Bayesian hierarchical random effects models, which cater for overdispersion and a high frequency of zeros in the data, using WinBUGS, and explore the variation between and within different grazing regimes and species. The impact and value of expert information is investigated through the inclusion of priors that reflect the experience of 20 experts in the field of bird responses to disturbance. Results indicate that expert information moderates the survey data, especially in situations where there are little or no data. When experts agreed, credible intervals for predictions were tightened considerably. When experts failed to agree, results were similar to those evaluated in the absence of expert information. Overall, we found that without expert opinion our knowledge was quite weak. The fact that the survey data are, in general, quite consistent with expert opinion shows that we do know something about birds and grazing, and that we could learn much faster if we used this approach more widely in ecology, where data are scarce. Copyright (c) 2005 John Wiley & Sons, Ltd.
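The models in this abstract are hierarchical and fitted in WinBUGS; the toy conjugate-normal update below is only a stand-in to illustrate the qualitative finding that expert priors moderate the data most when data are scarce. The function and all numbers are invented for illustration.

```python
# Toy conjugate-normal sketch (invented numbers): the posterior mean is a
# precision-weighted blend of the expert prior and the survey data, so the
# expert prior dominates when the sample size n is small.

def posterior_mean(prior_mean, prior_var, data_mean, data_var, n):
    """Posterior mean for a normal mean with known variances: weights are
    the prior precision and the data precision n / data_var."""
    prior_prec = 1.0 / prior_var
    data_prec = n / data_var
    return (prior_prec * prior_mean + data_prec * data_mean) / (prior_prec + data_prec)

# Expert prior: mean density 5 birds/ha, variance 4.
# Survey: sample mean 2 birds/ha, per-observation variance 9.
print(posterior_mean(5.0, 4.0, 2.0, 9.0, n=1))    # ~4.1: the prior dominates
print(posterior_mean(5.0, 4.0, 2.0, 9.0, n=100))  # ~2.1: the data dominate
```

The same precision-weighting logic underlies the abstract's observations: agreeing experts (a tight prior) shrink credible intervals, while disagreeing experts (a diffuse prior) leave the result close to the data-only analysis.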
Abstract:
The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when regularity restrictions are imposed. (c) 2004 Elsevier B.V. All rights reserved.
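The truncated-pdf sampling step can be illustrated with a toy random-walk Metropolis-Hastings sampler. The target below, a standard normal truncated to x ≥ 0, is a stand-in chosen for illustration; the paper's conditionals for the constrained translog parameters are more complex, and the names and settings here are invented.

```python
import math
import random

# Illustrative only: random-walk Metropolis-Hastings for a truncated density,
# the kind of step used inside a Gibbs sampler when a conditional is
# restricted by regularity constraints. Target: standard normal on x >= 0.

def truncated_normal_logpdf(x):
    """Unnormalized log-density of N(0, 1) truncated to x >= 0."""
    return -0.5 * x * x if x >= 0.0 else float("-inf")

def mh_sample(n_draws, start=0.5, step=1.0, seed=42):
    rng = random.Random(seed)
    x, draws = start, []
    for _ in range(n_draws):
        prop = x + rng.gauss(0.0, step)
        # Proposals outside the truncation region have -inf log-density
        # and are always rejected, which enforces the constraint.
        log_accept = truncated_normal_logpdf(prop) - truncated_normal_logpdf(x)
        if math.log(rng.random()) < log_accept:
            x = prop
        draws.append(x)
    return draws

draws = mh_sample(5000)
print(sum(draws) / len(draws))  # should be near E[X] = sqrt(2/pi) ~ 0.80
```

Rejecting out-of-region proposals is the simplest way to respect a truncation inside a Gibbs sweep; specialised truncated-normal samplers are more efficient but follow the same logic.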
Abstract:
The role of mutualisms in contributing to species invasions is rarely considered, inhibiting effective risk analysis and management options. Potential ecological consequences of the invasion of non-native pollinators include increased pollination and seed set of invasive plants, with subsequent impacts on population growth rates and rates of spread. We outline a quantitative approach for evaluating the impact of a proposed introduction of an invasive pollinator on existing weed population dynamics and demonstrate its use on a relatively data-rich case study: the impacts on Cytisus scoparius (Scotch broom) of the proposed introduction of Bombus terrestris. Three models have been used to assess population growth (matrix model), spread speed (integrodifference equation), and equilibrium occupancy (lattice model) for C. scoparius. We use available demographic data for an Australian population to parameterize two of these models. Increased seed set due to more efficient pollination resulted in a higher population growth rate in the density-independent matrix model, whereas simulations of enhanced pollination scenarios had a negligible effect on equilibrium weed occupancy in the lattice model. This is attributed to the strong microsite limitation of recruitment observed in invasive C. scoparius populations in Australia and incorporated in the lattice model. A lack of information regarding secondary ant dispersal of C. scoparius prevents us from parameterizing the integrodifference equation model for Australia, but studies of invasive populations in California suggest that spread speed will also increase with higher seed set. For microsite-limited C. scoparius populations, increased seed set has minimal effects on equilibrium site occupancy. However, for density-independent, rapidly invading populations, increased seed set is likely to lead to higher growth rates and spread speeds.
The impacts of introduced pollinators on native flora and fauna and the potential for promoting range expansion in pollinator-limited 'sleeper weeds' also remain substantial risks.
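In a density-independent matrix model like the one in the abstract above, the asymptotic population growth rate is the dominant eigenvalue of the stage-projection matrix. The sketch below finds it by power iteration; the 3-stage matrix and all vital rates are invented for illustration and are not C. scoparius data.

```python
# Hypothetical sketch of the matrix-model calculation: the asymptotic
# growth rate is the dominant eigenvalue of the projection matrix, found
# here by power iteration. The matrix below is invented, not broom data.

def dominant_eigenvalue(A, iters=200):
    """Dominant eigenvalue of a non-negative primitive matrix (list of
    rows) via power iteration with max-norm scaling."""
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        v = [x / lam for x in w]
    return lam

# Stages: seed, juvenile, adult; fecundity in the first row.
A = [[0.0,  0.0, 50.0],   # adults produce 50 seeds each
     [0.05, 0.3,  0.0],   # 5% of seeds germinate; 30% of juveniles stay
     [0.0,  0.2,  0.9]]   # 20% of juveniles mature; 90% adult survival
print(dominant_eigenvalue(A))  # ~1.29; > 1 implies a growing population
```

Raising the fecundity entry (increased seed set from more efficient pollination) raises this eigenvalue, which is the mechanism behind the higher growth rate reported for the density-independent model.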
Abstract:
Objective: Inpatient length of stay (LOS) is an important measure of hospital activity, health care resource consumption, and patient acuity. This work aims to develop an incremental expectation maximization (EM) based learning approach on a mixture-of-experts (ME) system for on-line prediction of LOS. The use of a batch-mode learning process in most existing artificial neural networks to predict LOS is unrealistic, as the data become available over time and their patterns change dynamically. In contrast, an on-line process is capable of providing an output whenever a new datum becomes available. This on-the-spot information is therefore more useful and practical for making decisions, especially when one deals with a tremendous amount of data. Methods and material: The proposed approach is illustrated using a real example of gastroenteritis LOS data. The data set was extracted from a retrospective cohort study of all infants born in 1995-1997 and their subsequent admissions for gastroenteritis. The total number of admissions in this data set was n = 692. Linked hospitalization records of the cohort were retrieved retrospectively to derive the outcome measure, patient demographics, and associated co-morbidity information. A comparative study of the incremental learning and batch-mode learning algorithms is considered. The performance of the learning algorithms is compared based on the mean absolute difference (MAD) between the predictions and the actual LOS, and the proportion of predictions with MAD < 1 day (Prop(MAD < 1)). The significance of the comparison is assessed through a regression analysis. Results: The incremental learning algorithm provides better on-line prediction of LOS once the system has gained sufficient training from more examples (MAD = 1.77 days and Prop(MAD < 1) = 54.3%), compared to the batch-mode learning.
The regression analysis indicates a significant decrease of MAD (p-value = 0.063) and a significant (p-value = 0.044) increase of Prop(MAD < 1).
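The two evaluation metrics named in the abstract above are easy to state precisely. The sketch below computes them on invented numbers; only the metric definitions come from the abstract.

```python
# Self-contained illustration (invented numbers) of the two metrics in the
# abstract: mean absolute difference (MAD) between predicted and actual
# LOS, and the proportion of predictions within one day, Prop(MAD < 1).

def mad(pred, actual):
    """Mean absolute difference between predictions and actual values."""
    return sum(abs(p - a) for p, a in zip(pred, actual)) / len(pred)

def prop_within_one_day(pred, actual):
    """Fraction of predictions whose absolute error is under 1 day."""
    return sum(1 for p, a in zip(pred, actual) if abs(p - a) < 1.0) / len(pred)

predicted = [2.5, 1.0, 4.0, 3.2]   # hypothetical LOS predictions (days)
observed  = [2.0, 3.0, 4.5, 3.0]   # hypothetical actual LOS (days)
print(mad(predicted, observed))              # (0.5 + 2.0 + 0.5 + 0.2) / 4 = 0.8
print(prop_within_one_day(predicted, observed))  # 3 of 4 within 1 day = 0.75
```

In an on-line setting these metrics would be recomputed as each new admission arrives, matching the incremental learning scheme the abstract advocates.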
Abstract:
The effects of dredging on the benthic communities in the Noosa River, a subtropical estuary in SE Queensland, Australia, were examined using a 'Beyond BACI' experimental design. Changes in the numbers and types of animals and in the characteristics of the sediments in response to dredging in the coarse sandy sediments near the mouth of the estuary were compared with those occurring naturally in two control regions. Samples were collected twice before and twice after the dredging operations, at multiple spatial scales ranging from metres to kilometres. Significant effects of the dredging were detected on the abundance of some polychaetes and bivalves and on two measures of diversity (number of polychaete families and total taxonomic richness). In addition, the dredging caused a significant increase in the diversity of sediment particle sizes in the dredged region compared with elsewhere. Community composition in the dredged region was more similar to that in the control regions after dredging than before. Changes in the characteristics of the sedimentary environment as a result of the dredging appeared to lead to the benthic communities of the dredged region becoming more similar to those elsewhere in the estuary, so dredging in this system may have led to the loss or reduction in area of a specific type of habitat, with implications for overall patterns of biodiversity and ecosystem function. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
Foreign exchange trading has emerged in recent times as a significant activity in many countries. As with most forms of trading, the activity is influenced by many random parameters, so the creation of a system that effectively emulates the trading process would be very helpful. In this paper we attempt to create such a system, using a machine learning approach to emulate trader behaviour on the foreign exchange market and to find the most profitable trading strategy.