156 results for binary data


Relevance: 20.00%

Abstract:

The mechanism underlying segregation in liquid fluidized beds is investigated in this paper. A binary fluidized bed system not at a stable equilibrium condition is modelled in the literature as forming a mixed part, corresponding to a stable mixture, at the bottom of the bed and a pure layer of the excess component always floating on the mixed part. On the basis of this model, (i) comprehensive criteria for binary particles of any type to mix or segregate, and (ii) a mixing/segregation regime map in terms of the size ratio and density ratio of the particles for a given fluidizing medium, are established in this work. Therefore, knowing the properties of given particles, a second type of particle can be chosen to avoid or to promote segregation according to the particular process requirements. The model is then extended to multicomponent fluidized beds and validated against experimental results for ternary fluidized beds. (C) 2002 Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Abstract:

Alcohol and tobacco consumption are closely correlated, and published results on their association with breast cancer have not always allowed adequately for confounding between these exposures. Over 80% of the relevant information worldwide on alcohol and tobacco consumption and breast cancer was collated, checked and analysed centrally. Analyses included 58,515 women with invasive breast cancer and 95,067 controls from 53 studies. Relative risks of breast cancer were estimated after stratifying by study, age, parity and, where appropriate, women's age when their first child was born and consumption of alcohol and tobacco. The average alcohol consumption reported by controls from developed countries was 6.0 g per day, i.e. about half a unit/drink of alcohol per day, and was greater in ever-smokers than never-smokers (8.4 g per day and 5.0 g per day, respectively). Compared with women who reported drinking no alcohol, the relative risk of breast cancer was 1.32 (1.19 - 1.45, P < 0.00001) for an intake of 35 - 44 g per day of alcohol, and 1.46 (1.33 - 1.61, P < 0.00001) for 45 g per day or more. The relative risk of breast cancer increased by 7.1% (95% CI 5.5-8.7%; P

Relevance: 20.00%

Abstract:

A new thermodynamic approach has been developed in this paper to analyze adsorption in slit-like pores. The equilibrium is described by two thermodynamic conditions: the Helmholtz free energy must be minimal, and the grand potential functional at that minimum must be negative. This approach has led to local isotherms that describe adsorption in the form of a single layer or two layers near the pore walls. In narrow pores, local isotherms have one step, which can be either very sharp but continuous or discontinuous and bench-like for a definite range of pore widths. The latter reflects the so-called 0 --> 1 monolayer transition. In relatively wide pores, local isotherms have two steps: the first corresponds to the appearance of two layers near the pore walls, while the second corresponds to the filling of the space between these layers. All features of the local isotherms are in agreement with results obtained from density functional theory and Monte Carlo simulations. The approach is used for determining pore size distributions of carbon materials. We illustrate this with benzene adsorption data on activated carbon at 20, 50, and 80 °C, argon adsorption on activated carbon Norit ROX at 87.3 K, and nitrogen adsorption on activated carbon Norit R1 at 77.3 K.

Relevance: 20.00%

Abstract:

A theoretical analysis of the adsorption of mixtures containing subcritical adsorbates onto activated carbon is presented as an extension of the theory for pure components developed earlier by Do and coworkers. In this theory, adsorption of mixtures in a pore follows a two-stage process, similar to that for pure-component systems. The first stage is the layering of molecules on the surface, with the behavior of the second and higher layers resembling that of vapor-liquid equilibrium. The second stage is the pore-filling process, which occurs when the remaining pore width is small enough and the pressure high enough to promote filling of the pore with a liquid mixture having the same composition as the outermost molecular layer just prior to pore filling. The Kelvin equation is applied to mixtures, with the vapor pressure term replaced by the equilibrium pressure at the composition of the outermost layer of the liquid film. Simulations are detailed to illustrate the effects of various parameters, and the theory is tested against a number of experimental data sets on mixtures. The predictions were very satisfactory.
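The mixture form of the Kelvin criterion described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' actual model: it assumes an ideal liquid mixture (Raoult's law) for the equilibrium pressure at the outermost-layer composition, and a simple cylindrical-meniscus form of the Kelvin equation; all function names and the physical values in the usage note are illustrative choices, not taken from the paper.

```python
import math

def raoult_equilibrium_pressure(x, p_sat):
    """Equilibrium vapor pressure of an ideal liquid mixture (Raoult's law):
    P_eq = sum_i x_i * P_sat,i, evaluated at the outermost-layer composition."""
    return sum(xi * pi for xi, pi in zip(x, p_sat))

def kelvin_filling_pressure(p_eq, gamma, v_molar, r_pore, T, R=8.314):
    """Kelvin equation with the pure-component vapor pressure replaced by the
    mixture equilibrium pressure: the pore fills once the gas-phase pressure
    reaches p_eq * exp(-2*gamma*V_m / (r_pore*R*T))."""
    return p_eq * math.exp(-2.0 * gamma * v_molar / (r_pore * R * T))
```

For example, an equimolar-ish mixture with x = [0.4, 0.6] and saturation pressures of 10 kPa and 20 kPa gives P_eq = 16 kPa, and with rough liquid-like values (surface tension 0.028 N/m, molar volume 8.9e-5 m³/mol, 2 nm pore radius, 293 K) the predicted filling pressure falls well below P_eq, as expected for capillary condensation.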

Relevance: 20.00%

Abstract:

In this paper we analyzed the adsorption of gases and vapors on graphitised thermal carbon black using a modified DFT-lattice theory, in which we assume that the behavior of the first layer of the adsorption film differs from that of the second and higher layers. The effects of various parameters on the topology of the adsorption isotherm were first investigated, and the model was then applied to the analysis of adsorption data for numerous substances on carbon black. We have found that the first layer of the adsorption film behaves differently from the second and higher layers, in that the adsorbate-adsorbate interaction energy in the first layer is less than that of the second and higher layers; the same is observed for the partition function. Furthermore, the adsorbate-adsorbate and adsorbate-adsorbent interaction energies obtained from the fitting are consistently lower than the corresponding values obtained from viscosity data and calculated from the Lorentz-Berthelot rule, respectively.

Relevance: 20.00%

Abstract:

Background. Based on the well-described excess of schizophrenia births in winter and spring, we hypothesised that individuals with schizophrenia (a) would be more likely to be born during periods of decreased perinatal sunshine, and (b) if born during periods of less sunshine, would have an earlier age at first registration. Methods. We undertook an ecological analysis of long-term trends in perinatal sunshine duration and schizophrenia birth rates based on two mental health registers (Queensland, Australia, n = 6630; The Netherlands, n = 24,474). For each of the 480 months between 1931 and 1970, the agreement between the slopes of the trends in the psychosis and long-term sunshine duration series was assessed. Age at first registration was assessed by quartiles of long-term trends in perinatal sunshine duration. Males and females were assessed separately. Results. Both the Dutch and Australian data showed a statistically significant association between falling long-term trends in sunshine duration around the time of birth and rising schizophrenia birth rates for males only. In both data sets there were significant associations between earlier age at first registration and reduced long-term trends in sunshine duration around the time of birth for both males and females. Conclusions. A measure of long-term trends in perinatal sunshine duration was associated with two epidemiological features of schizophrenia in two separate data sets. Exposures related to sunshine duration warrant further consideration in schizophrenia research. (C) 2002 Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Abstract:

We focus on mixtures of factor analyzers from the perspective of a method for model-based density estimation from high-dimensional data, and hence for the clustering of such data. This approach enables a normal mixture model to be fitted to a sample of n data points of dimension p, where p is large relative to n. The number of free parameters is controlled through the dimension of the latent factor space. Working in this reduced space allows a model for each component-covariance matrix whose complexity lies between that of the isotropic and full covariance structure models. We illustrate the use of mixtures of factor analyzers in a practical example concerning the clustering of cell lines on the basis of gene expressions from microarray experiments. (C) 2002 Elsevier Science B.V. All rights reserved.
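The parameter saving that motivates mixtures of factor analyzers is easy to make concrete. The sketch below simply counts the free covariance parameters per mixture component: a full p x p covariance has p(p+1)/2 parameters, whereas the factor-analytic form Lambda*Lambda' + Psi (a p x q loading matrix plus a diagonal Psi) has pq + p, less q(q-1)/2 for the rotational indeterminacy of the loadings. The p = 1000, q = 5 figures in the usage note are illustrative, not the paper's data.

```python
def params_full(p):
    """Free parameters in an unrestricted p x p covariance matrix."""
    return p * (p + 1) // 2

def params_mfa(p, q):
    """Free parameters in the factor-analytic covariance Lambda*Lambda' + Psi:
    p*q loadings plus p diagonal uniquenesses, minus q(q-1)/2 for the
    rotational indeterminacy of the loadings."""
    return p * q + p - q * (q - 1) // 2
```

With, say, p = 1000 genes and q = 5 factors, each component-covariance matrix needs 5990 parameters instead of 500500, which is what makes the normal mixture fittable when p is large relative to n.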

Relevance: 20.00%

Abstract:

This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing tradition of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, the last a relatively new idea that allows individual assessment of predictions. The integrated framework comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process in which non-parametric methods such as decision trees and generalized additive models are used to identify important variables and their relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. The paper is motivated by a medical problem in which interest centres on developing a risk stratification system for morbidity in 1,710 cardiac patients, given a suite of demographic, clinical and preoperative variables. Although the methods are applied specifically to this case study, they can be applied in any field, irrespective of the type of response.
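The two-stage process can be illustrated on synthetic data. The sketch below is a schematic stand-in for the framework, not the authors' implementation: stage one screens variables with a crude single-split (stump) score in place of a decision tree or generalized additive model, and stage two fits a simple least-squares linear probability model on the screened set. The simulated data, the three-variable cut-off and the 0.5 classification threshold are all arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.normal(size=(n, p))
# synthetic outcome driven by variables 0 and 3 plus noise
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n) > 0).astype(float)

def stump_score(x, y):
    """Crude importance score: outcome-rate difference across a median split
    (a stand-in for a decision tree / GAM importance measure)."""
    left, right = y[x <= np.median(x)], y[x > np.median(x)]
    return abs(left.mean() - right.mean())

# Stage 1: exploratory screen for a parsimonious variable set
scores = np.array([stump_score(X[:, j], y) for j in range(p)])
keep = np.argsort(scores)[::-1][:3]

# Stage 2: final predictive model on the screened variables
# (least-squares linear probability model, for simplicity)
A = np.column_stack([np.ones(n), X[:, keep]])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
accuracy = ((A @ beta > 0.5).astype(float) == y).mean()
```

With this simulated signal the screen should recover variable 0, and the screened linear model predicts the outcome far better than chance, which is the point of doing the cheap exploratory pass before committing to a final parametric, non-parametric or Bayesian fit.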

Relevance: 20.00%

Abstract:

The tests that are currently available for the measurement of overexpression of the human epidermal growth factor receptor-2 (HER2) in breast cancer have shown considerable problems in accuracy and interlaboratory reproducibility. Although these problems are partly alleviated by the use of validated, standardised 'kits', there may be considerable cost involved in their use. Prior to testing it may therefore be an advantage to be able to predict from basic pathology data whether a cancer is likely to overexpress HER2. In this study, we have correlated pathology features of cancers with the frequency of HER2 overexpression assessed by immunohistochemistry (IHC) using HercepTest (Dako). In addition, fluorescence in situ hybridisation (FISH) has been used to re-test the equivocal cancers, and interobserver variation in assessing HER2 overexpression has been examined by a slide circulation scheme. Of the 1536 cancers, 1144 (74.5%) did not overexpress HER2. Unequivocal overexpression (3+ by IHC) was seen in 186 cancers (12%) and an equivocal result (2+ by IHC) was seen in 206 cancers (13%). Of the 156 IHC 3+ cancers for which complete data were available, 149 (95.5%) were ductal NST and 152 (97%) were histological grade 2 or 3. Only 1 of 124 infiltrating lobular carcinomas (0.8%) showed HER2 overexpression. None of the 49 'special types' of carcinoma showed HER2 overexpression. Re-testing by FISH of a proportion of the IHC 2+ cancers showed that only 25 (23%) of those assessable exhibited HER2 gene amplification, but 46 of the 47 IHC 3+ cancers (98%) were confirmed as showing gene amplification. Circulation of slides for assessment of the HER2 score showed a moderate level of agreement between pathologists (kappa 0.4). As a result of this study we would advocate consideration of a triage approach to HER2 testing. Infiltrating lobular and special types of carcinoma may not need to be routinely tested at presentation, nor may grade 1 NST carcinomas, in which only 1.4% have been shown to overexpress HER2. Testing of these carcinomas may be performed when HER2 status is required to assist in therapeutic or other clinical/prognostic decision-making. The highest yield of HER2-overexpressing carcinomas is seen in the grade 3 NST subgroup, in which 24% are positive by IHC. (C) 2003 Elsevier Science Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

Measurement of the exchange of substances between blood and tissue has been a long-lasting challenge to physiologists, and considerable theoretical and experimental accomplishments were achieved before the development of positron emission tomography (PET). Today, when modeling data from modern PET scanners, little use is made of earlier microvascular research in the compartmental models that have become the standard by which the vast majority of dynamic PET data are analysed. However, modern PET scanners provide data with sufficient temporal resolution and good enough counting statistics to allow estimation of parameters in models with more physiological realism. We explore the standard compartmental model and find that incorporation of blood flow leads to paradoxes, such as kinetic rate constants being time-dependent, and tracers being cleared from a capillary faster than they can be supplied by blood flow. The inability of the standard model to incorporate blood flow consequently raises a need for models that include more physiology, and we develop microvascular models which remove the inconsistencies. The microvascular models can be regarded as a revision of the input function. Whereas the standard model uses the organ inlet concentration as the concentration throughout the vascular compartment, we consider models that make use of spatial averaging of the concentrations in the capillary volume, which is what the PET scanner actually registers. The microvascular models are developed for both single- and multi-capillary systems and include the effects of non-exchanging vessels. They are suitable for analysing dynamic PET data from any capillary bed using either intravascular or diffusible tracers, in terms of physiological parameters which include regional blood flow. (C) 2003 Elsevier Ltd. All rights reserved.
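The standard model the authors criticise can be stated compactly: a one-tissue compartment driven directly by the arterial inlet concentration, dC_T/dt = K1*C_a(t) - k2*C_T(t), with no account of intracapillary concentration gradients. The sketch below is only that standard model, integrated with naive Euler steps; the time grid, step size and rate constants in the usage note are illustrative values, not fitted parameters.

```python
def one_tissue_model(times, c_a, K1, k2):
    """Standard one-tissue compartment model,
        dC_T/dt = K1*C_a(t) - k2*C_T(t),
    integrated with simple Euler steps. Note the inlet concentration c_a
    drives the tissue compartment directly, which is exactly the
    simplification the microvascular models revise."""
    c_t = [0.0]
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        c_t.append(c_t[-1] + dt * (K1 * c_a[i - 1] - k2 * c_t[-1]))
    return c_t
```

With a constant unit input and k2 = 0 the tissue curve grows linearly at rate K1 (pure trapping); with k2 > 0 it saturates toward K1/k2, the familiar equilibrium volume of distribution.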

Relevance: 20.00%

Abstract:

We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are left completely unspecified. Estimation is based on maximum likelihood using the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
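The two-part specification can be written out directly. The snippet below is a toy, fully parametric stand-in for the model (the paper's semi-parametric method leaves the component-baseline hazards unspecified; Weibull baselines are substituted here purely for illustration): a logistic model gives the probability that failure is of type 1, and each conditional survival function carries a proportional-hazards covariate effect. All function names and parameter values are invented for the sketch.

```python
import math

def logistic_mixing(x, gamma0, gamma1):
    """pi(x): probability that failure is of type 1, via the logistic model."""
    return 1.0 / (1.0 + math.exp(-(gamma0 + gamma1 * x)))

def weibull_survival(t, shape, scale):
    """Baseline survival S_0(t) for a Weibull stand-in baseline hazard."""
    return math.exp(-((t / scale) ** shape))

def mixture_survival(t, x, gamma, beta1, beta2, base1, base2):
    """Overall survival S(t|x) = pi(x)*S1(t|x) + (1-pi(x))*S2(t|x),
    with proportional-hazards covariate effects S_j(t|x) = S_0j(t)**exp(beta_j*x)."""
    pi = logistic_mixing(x, *gamma)
    s1 = weibull_survival(t, *base1) ** math.exp(beta1 * x)
    s2 = weibull_survival(t, *base2) ** math.exp(beta2 * x)
    return pi * s1 + (1.0 - pi) * s2
```

The semi-parametric method replaces the Weibull pieces with unspecified baseline hazards estimated inside the ECM iterations, but the mixture structure being maximised is the one above.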

Relevance: 20.00%

Abstract:

The effect of the number of samples and the selection of data for analysis on the calculation of surface motor unit potential (SMUP) size in the statistical method of motor unit number estimation (MUNE) was determined in 10 normal subjects and 10 patients with amyotrophic lateral sclerosis (ALS). We recorded 500 sequential compound muscle action potentials (CMAPs) at three different stable stimulus intensities (10–50% of maximal CMAP). Estimated mean SMUP sizes were calculated using Poisson statistical assumptions from the variance of the 500 sequential CMAPs obtained at each stimulus intensity. The results with the full 500 data points were compared with smaller subsets, comprising 50–80% of the points, drawn from the same data set. The effect of restricting analysis to data between 5–20% of the CMAP and to standard deviation limits was also assessed. No differences in mean SMUP size were found with stimulus intensity or with use of different ranges of data. Consistency was improved with a greater sample number. Restricting data to within 5% of CMAP size gave both increased consistency and reduced mean SMUP size in many subjects, but excluded valid responses present at that stimulus intensity. These changes were more prominent in ALS patients, in whom the presence of isolated SMUP responses was a striking difference from normal subjects. Noise, spurious data, and large SMUPs limited the Poisson assumptions. When these factors are considered, consistent statistical MUNE can be calculated from a continuous sequence of data points. A 2 to 2.5 SD or 10% window is a reasonable method of limiting data for analysis. Muscle Nerve 27: 320–331, 2003.
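The Poisson calculation underlying the statistical method can be sketched in a few lines. Under its assumptions the number of units firing at a fixed stimulus intensity varies from trial to trial, so the estimated mean SMUP size is the variance-to-mean ratio of the recorded CMAP responses (after subtracting any baseline component), and the MUNE is the maximal CMAP divided by that estimate. The amplitudes in the usage note are invented round numbers, not recorded data.

```python
import statistics

def poisson_mean_smup(cmap_responses, baseline=0.0):
    """Estimated mean SMUP size under Poisson assumptions:
    variance / mean of the CMAP increments above baseline."""
    increments = [c - baseline for c in cmap_responses]
    return statistics.pvariance(increments) / statistics.mean(increments)

def statistical_mune(max_cmap, mean_smup):
    """Motor unit number estimate: maximal CMAP / estimated mean SMUP size."""
    return max_cmap / mean_smup
```

For example, responses of [2, 4, 6, 4, 2, 6, 4, 4, 4, 4] (arbitrary units, zero baseline) have mean 4 and population variance 1.6, giving a mean SMUP estimate of 0.4; with a maximal CMAP of 10 the MUNE is 25. The abstract's caveats apply directly here: noise, spurious data points and isolated large SMUPs inflate the variance term and so bias the estimate, which is why windowing the data (e.g. 2–2.5 SD or 10%) matters.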