27 results for Statistical method
Abstract:
The effects of the process variables (pH of the aqueous phase, rate of addition of the organic, polymeric, drug-containing phase to the aqueous phase, organic:aqueous phase volume ratio, and aqueous phase temperature) on the entrapment of propranolol hydrochloride in ethylcellulose (N4) microspheres prepared by the solvent evaporation method were examined using a factorial design. The observed range of drug entrapment was 1.43 +/- 0.02% w/w (pH 6, 25 degrees C, phase volume ratio 1:10, fast rate of addition) to 16.63 +/- 0.92% w/w (pH 9, 33 degrees C, phase volume ratio 1:10, slow rate of addition), corresponding to mean entrapment efficiencies of 2.86 and 33.26, respectively. Increased pH, increased temperature and a decreased rate of addition significantly enhanced entrapment efficiency; however, the organic:aqueous phase volume ratio did not significantly affect drug entrapment. Statistical interactions were observed between pH and rate of addition, pH and temperature, and temperature and rate of addition. The observed interactions involving pH are suggested to be due to the abilities of increased temperature and a slow rate of addition to sufficiently enhance the solubility of dichloromethane in the aqueous phase, which at pH 9, but not pH 6, allows partial polymer precipitation before the drug partitions into the aqueous phase. The interaction between temperature and rate of addition reflects the relative lack of effect of increased temperature on drug entrapment when the organic phase is added slowly. In comparison with the effect of pH on drug entrapment, the contributions of the other physical factors examined were limited.
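The main-effect estimates that a two-level factorial design like the one above yields can be sketched numerically. This is a minimal illustration, not the study's analysis: the two coded factors and the response values below are hypothetical stand-ins (one could read them as pH and rate of addition against % w/w entrapment).

```python
# Minimal sketch of main-effect estimation in a two-level full factorial
# design. All numbers are hypothetical, not the paper's data.
import itertools
import numpy as np

def main_effects(levels, response):
    """levels: (n_runs, k) array of -1/+1 factor codes; response: (n_runs,).
    Returns each factor's main effect: mean(y | +1) - mean(y | -1)."""
    levels = np.asarray(levels, dtype=float)
    response = np.asarray(response, dtype=float)
    return np.array([
        response[levels[:, j] > 0].mean() - response[levels[:, j] < 0].mean()
        for j in range(levels.shape[1])
    ])

# 2^2 demo: factor codes could stand for pH (6 vs 9) and rate (fast vs slow)
design = np.array(list(itertools.product([-1, 1], repeat=2)))  # 4 runs
y = np.array([1.4, 9.0, 3.0, 16.6])  # hypothetical %w/w entrapment values

effects = main_effects(design, y)
print(effects)
```

A full analysis would also estimate interaction effects (products of factor columns) and test them by ANOVA, as in the study.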
Abstract:
The effects of four process factors (pH, emulsifier (gelatin) concentration, mixing, and batch) on the % w/w entrapment of propranolol hydrochloride in ethylcellulose microcapsules prepared by the solvent evaporation process were examined using a factorial design. In this design the minimum % w/w entrapments of propranolol hydrochloride were observed whenever the external aqueous phase contained 1.5% w/v gelatin at pH 6.0 (0.71-0.91% w/w), whereas maximum entrapments occurred whenever the external aqueous phase was composed of 0.5% w/v gelatin at pH 9.0 (8.9-9.1% w/w). The theoretical maximum loading was 50% w/w. Statistical evaluation of the results by analysis of variance showed that emulsifier (gelatin) concentration and pH, but not mixing and batch, significantly affected entrapment. An interaction between pH and gelatin concentration was observed in the factorial design, which was attributed to the greater effect of gelatin concentration on % w/w entrapment at pH 9.0 than at pH 6.0. Maximum theoretical entrapment was achieved by increasing the pH of the external phase to 12.0. Marked increases in drug entrapment were observed whenever the pH of the external phase exceeded the pK(2) of propranolol hydrochloride. It was concluded that pH, and hence ionisation, was the greatest determinant of the entrapment of propranolol hydrochloride into microcapsules prepared by the solvent evaporation process.
Abstract:
Modern biology and medicine aim at identifying the molecular and cellular causes of biological functions and diseases. Gene regulatory networks (GRNs) inferred from gene expression data are considered an important aid for this research, providing a map of molecular interactions. Hence, GRNs have the potential to enable and enhance basic as well as applied research in the life sciences. In this paper, we introduce a new method called BC3NET for inferring causal gene regulatory networks from large-scale gene expression data. BC3NET is an ensemble method based on bagging the C3NET algorithm, which means it corresponds to a Bayesian approach with noninformative priors. In this study we demonstrate, for a variety of simulated and biological gene expression data sets from S. cerevisiae, that BC3NET is an important enhancement over other inference methods, capable of sensibly capturing biochemical interactions from transcription regulation and protein-protein interactions. An implementation of BC3NET is freely available as an R package from the CRAN repository. © 2012 de Matos Simoes, Emmert-Streib.
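The bagging strategy underlying an ensemble inference method like BC3NET can be sketched generically. This is not the C3NET or BC3NET algorithm itself: the base learner below is a deliberately simple placeholder (thresholded absolute Pearson correlation), and the toy expression matrix is invented for illustration.

```python
# Hedged sketch of bagging for network inference: infer a network from each
# bootstrap resample of the samples, then aggregate edge frequencies.
# The base learner is a placeholder, NOT the C3NET algorithm.
import numpy as np

rng = np.random.default_rng(2)

def infer_network(data, threshold=0.6):
    """Placeholder base learner: adjacency from thresholded |correlation|."""
    corr = np.corrcoef(data, rowvar=False)
    adj = (np.abs(corr) > threshold).astype(float)
    np.fill_diagonal(adj, 0.0)
    return adj

def bagged_network(data, n_boot=50, keep=0.5):
    """Bootstrap the samples, infer a network each time, keep frequent edges."""
    n = data.shape[0]
    freq = np.zeros((data.shape[1], data.shape[1]))
    for _ in range(n_boot):
        resample = data[rng.integers(0, n, size=n)]
        freq += infer_network(resample)
    freq /= n_boot
    return (freq >= keep).astype(int)  # majority-style aggregation

# toy expression matrix: genes 0 and 1 co-regulated, gene 2 independent
n_samples = 100
g0 = rng.normal(size=n_samples)
data = np.column_stack([g0,
                        g0 + 0.1 * rng.normal(size=n_samples),
                        rng.normal(size=n_samples)])
net = bagged_network(data)
print(net)
```

The aggregation over bootstrap networks is what gives the ensemble its stability; BC3NET replaces the placeholder learner here with C3NET and a statistical edge test.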
Abstract:
Gene expression data can provide a very rich source of information for elucidating biological function at the pathway level if the experimental design considers the needs of the statistical analysis methods. The purpose of this paper is to provide a comparative analysis of statistical methods for detecting differentially expressed pathways (DEP). In contrast to many other studies conducted so far, we use three novel simulation types, producing a more realistic correlation structure than previous simulation methods. This includes the generation of surrogate data from two large-scale microarray experiments on prostate cancer and ALL. As a result of our comprehensive analysis of 41,004 parameter configurations, we find that each method should only be applied if certain conditions of the data from a pathway are met. Further, we provide method-specific estimates of the optimal sample size for microarray experiments aiming to identify DEP, in order to avoid an underpowered design. Our study highlights the sensitivity of the studied methods to the parameters of the system. © 2012 Tripathi and Emmert-Streib.
Abstract:
In this research, a preliminary study was performed to determine the initial parameter window for obtaining a fully penetrated NiTi weldment. An L27 Taguchi experiment was then carried out to statistically study the effects of the welding parameters and their possible interactions on the weld bead aspect ratio (penetration over fusion-zone width), and to determine the optimized parameter settings for producing a fully penetrated weldment with a desirable aspect ratio. The statistical results of the Taguchi experiment showed the laser mode to be the most important factor, substantially affecting the aspect ratio; a strong interaction between the power and focus position was also found. The optimized weldment was mainly of columnar dendritic structure in the weld zone (WZ), while the heat-affected zone (HAZ) exhibited an equiaxed grain structure. The XRD and DSC results showed that the WZ retained the B2 austenite structure without any precipitates, but with a significant decrease in phase transformation temperatures. The micro-hardness and tensile tests indicated that the mechanical properties of NiTi were degraded to a certain extent after fibre laser welding.
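Taguchi analyses such as the L27 experiment above typically score each run with a signal-to-noise ratio. As a hedged sketch (the replicate values are hypothetical, not the study's measurements), the "larger-the-better" form S/N = -10·log10(mean(1/y²)) would suit a response like the aspect ratio, where larger values are preferred:

```python
# "Larger-the-better" Taguchi S/N ratio; replicate values are hypothetical.
import math

def sn_larger_the_better(ys):
    """S/N = -10 * log10( mean of 1/y^2 ) over replicate responses ys."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in ys) / len(ys))

# Hypothetical aspect-ratio replicates for one L27 run
print(sn_larger_the_better([0.8, 0.9, 1.0]))
```

Averaging these S/N values per factor level across the orthogonal array is what identifies the dominant factors (here, the laser mode) and the optimized settings.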
Abstract:
Background: Barrett's oesophagus (BO) is a well-recognized precursor of the majority of cases of oesophageal adenocarcinoma (OAC). Endoscopic surveillance of BO patients is frequently undertaken in an attempt to detect early OAC, high grade dysplasia (HGD) or low grade dysplasia (LGD). However, histological interpretation and grading of dysplasia is subjective and poorly reproducible. The alternative techniques of flow cytometry and cytology-preparation image cytometry require large amounts of tissue and specialist expertise, which are not widely available for frontline health care.
Methods: This study combined whole slide imaging with DNA image cytometry to provide a novel method for the detection and quantification of abnormal DNA content. 20 cases were evaluated, including 8 Barrett's specialised intestinal metaplasia (SIM), 6 LGD and 6 HGD. Feulgen-stained oesophageal sections (1 µm thickness) were digitally scanned in their entirety and evaluated to select regions of interest and abnormalities. Barrett's mucosa was then interactively chosen for automatic nuclei segmentation, in which irrelevant cell types are ignored. The combined DNA content histogram for all selected image regions was then obtained. In addition, histogram measurements, including the 5c exceeding ratio (xER-5C), the 2c deviation index (2cDI) and the DNA grade of malignancy (DNA-MG), were computed.
Results: The histogram measurements xER-5C, 2cDI and DNA-MG were shown to be effective in differentiating SIM from HGD, SIM from LGD, and LGD from HGD. All three measurements discriminated SIM from HGD cases with statistical significance (pxER-5C=0.0041, p2cDI=0.0151 and pDNA-MG=0.0057). Statistical significance was also achieved in differentiating SIM from LGD samples (pxER-5C=0.0019, p2cDI=0.0023 and pDNA-MG=0.0030). Furthermore, the differences between LGD and HGD cases were statistically significant (pxER-5C=0.0289, p2cDI=0.0486 and pDNA-MG=0.0384).
Conclusion: Whole slide image cytometry is a novel and effective method for the detection and quantification of abnormal DNA content in BO. Compared with manual histological review, the proposed method is more objective and reproducible. Compared with flow cytometry and cytology-preparation image cytometry, it is low cost, simple to use and requires only a single 1 µm tissue section. Whole slide image cytometry could therefore assist the routine clinical diagnosis of dysplasia in BO, which is relevant to assessing the risk of future progression to OAC.
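Of the histogram measurements used above, the 5c exceeding ratio has the simplest definition: the fraction of nuclei whose DNA content exceeds 5c. A hedged sketch (the nuclei values below are hypothetical, and real pipelines first calibrate DNA content against reference diploid nuclei):

```python
# 5c exceeding ratio (xER-5C) from per-nucleus DNA content in c units.
# Nuclei values are hypothetical, for illustration only.
import numpy as np

def exceeding_ratio(dna_content_c, threshold_c=5.0):
    """Fraction of nuclei whose integrated DNA content exceeds threshold_c."""
    dna = np.asarray(dna_content_c, dtype=float)
    return float((dna > threshold_c).sum()) / dna.size

# mostly diploid (~2c) nuclei with two aneuploid outliers above 5c
nuclei = np.array([2.0, 2.1, 1.9, 4.0, 5.5, 6.2, 2.0, 2.0])
print(exceeding_ratio(nuclei))
```

Because normal cells should not exceed 4c (the G2/M content), a non-negligible xER-5C is taken as evidence of abnormal DNA content.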
Abstract:
Anti-islanding protection is becoming increasingly important due to the rapid installation of distributed generation from renewable resources such as wind, tidal and wave, solar PV and bio-fuels, as well as from other resources such as diesel. Unintentional islanding presents a potential risk of damage to utility plant and equipment connected on the demand side, as well as to the public and to personnel in utility plants. This paper investigates automatic islanding detection, achieved by deploying a statistical process control approach to fault detection using real-time data acquired through a wide area measurement system based on Phasor Measurement Unit (PMU) technology. In particular, principal component analysis (PCA) is used to project the data into a principal component subspace and a residual space, and two statistics are used to detect the occurrence of a fault. A fault reconstruction method is then used to identify the fault and its development over time. The proposed scheme has been applied to a real system, and the results confirm that the proposed method can correctly identify the fault and the islanding site.
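The PCA-based monitoring idea can be sketched numerically. This is a generic illustration of the technique, not the paper's implementation: the data are synthetic stand-ins for PMU channels, and the two monitored statistics are taken to be Hotelling's T² (in the principal subspace) and the squared prediction error Q (in the residual space), the usual pair in PCA process monitoring.

```python
# Hedged sketch of PCA fault detection: T^2 in the principal subspace,
# squared prediction error Q in the residual space. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)

# normal-operation training data: channels 0 and 1 strongly correlated,
# channel 2 independent (stand-ins for PMU measurements)
n = 200
base = rng.normal(size=n)
X = np.column_stack([base,
                     base + 0.05 * rng.normal(size=n),
                     rng.normal(size=n)])

mean, std = X.mean(axis=0), X.std(axis=0)
Z = (X - mean) / std
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

k = 1                                # retained principal components
P, lam = eigvec[:, :k], eigval[:k]

def t2_and_q(x):
    z = (x - mean) / std
    t = P.T @ z                      # scores in the principal subspace
    t2 = float(np.sum(t**2 / lam))   # Hotelling's T^2
    resid = z - P @ t                # projection onto the residual space
    return t2, float(resid @ resid)  # (T^2, squared prediction error Q)

normal_t2, normal_q = t2_and_q(X[0])
fault = X[0].copy()
fault[2] += 8.0                      # simulated fault on one channel
fault_t2, fault_q = t2_and_q(fault)
print(normal_q, fault_q)
```

In practice both statistics are compared against control limits fitted on normal data, and fault reconstruction then attributes an alarm to the responsible measurement, as the paper describes.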
Abstract:
In recent years, there has been a move towards the development of indirect structural health monitoring (SHM) techniques for bridges; the low-cost vibration-based method presented in this paper is such an approach. It consists of the use of a moving vehicle fitted with accelerometers on its axles and incorporates wavelet analysis and statistical pattern recognition. The aim of the approach is both to detect and to locate damage in bridges while reducing the need for direct instrumentation of the bridge. In theoretical simulations, a simplified vehicle-bridge interaction model is used to investigate the effectiveness of the approach in detecting bridge damage from vehicle accelerations. For this purpose, the accelerations are processed using a continuous wavelet transform, since when an axle passes over a damaged section, any discontinuity in the signal affects the wavelet coefficients. Based on these coefficients, a damage indicator is formulated which can distinguish between different damage levels. However, it is found to be difficult to quantify damage of varying levels when the vehicle's transverse position varies between bridge crossings. In a real bridge field experiment, damage was applied artificially to a steel truss bridge to test the effectiveness of the indirect approach in practice; for this purpose a two-axle van was driven across the bridge at constant speed. Both bridge and vehicle acceleration measurements were recorded, and the dynamic properties of the test vehicle were identified initially via free vibration tests. It was found that the resulting damage indicators for the bridge and vehicle showed similar patterns; however, it was difficult to distinguish between different artificial damage scenarios.
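The core signal-processing step, a discontinuity producing locally large continuous-wavelet-transform coefficients, can be sketched with synthetic data. This is not the paper's vehicle-bridge model: the acceleration trace is an invented smooth oscillation with an artificial jump, and a Mexican-hat (Ricker) wavelet at a single scale is implemented directly in NumPy.

```python
# Hedged sketch: locate a discontinuity in a synthetic "axle acceleration"
# signal via single-scale CWT coefficients with a Ricker wavelet.
import numpy as np

def ricker(points, a):
    """Mexican-hat (Ricker) wavelet sampled on `points` points, width `a`."""
    t = np.arange(points) - (points - 1) / 2.0
    x = t / a
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

def cwt_single_scale(signal, a):
    wavelet = ricker(min(10 * int(a), len(signal)), a)
    return np.convolve(signal, wavelet, mode='same')

n = 512
t = np.linspace(0.0, 1.0, n)
accel = 0.5 * np.sin(2 * np.pi * 5 * t)  # smooth vehicle response
accel[300:] += 1.0                        # jump standing in for a damage event

coeffs = np.abs(cwt_single_scale(accel, a=4.0))
interior = coeffs[50:-50]                 # ignore convolution edge effects
damage_location = int(np.argmax(interior)) + 50
print(damage_location)
```

The damage indicator in the paper is built from such coefficients across crossings; the sketch shows only why the coefficients localize the discontinuity.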
Abstract:
We present a Bayesian-odds-ratio-based algorithm for detecting stellar flares in light-curve data. We assume flares are described by a model in which there is a rapid rise with a half-Gaussian profile, followed by an exponential decay. Our signal model also contains a polynomial background model required to fit underlying light-curve variations in the data, which could otherwise partially mimic a flare. We characterize the false alarm probability and efficiency of this method under the assumption that any unmodelled noise in the data is Gaussian, and compare it with a simpler thresholding method based on that used in Walkowicz et al. We find our method has a significant increase in detection efficiency for low signal-to-noise ratio (S/N) flares. For a conservative false alarm probability, our method can detect 95 per cent of flares with S/N less than 20, as compared to S/N of 25 for the simpler method. We also test how well the assumption of Gaussian noise holds by applying the method to a selection of 'quiet' Kepler stars. As an example, we have applied our method to a selection of stars in Kepler Quarter 1 data. The method finds 687 flaring stars with a total of 1873 flares after vetoes have been applied. For these flares we have made preliminary characterizations of their durations and S/N.
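The assumed flare shape, a half-Gaussian rise up to a peak followed by an exponential decay, is easy to write down. A hedged sketch (parameter names and values are illustrative, not the paper's):

```python
# Half-Gaussian rise + exponential decay flare profile; parameters illustrative.
import numpy as np

def flare_model(t, t_peak, amplitude, rise_sigma, decay_tau):
    """Flux: Gaussian rise for t <= t_peak, exponential decay for t > t_peak."""
    t = np.asarray(t, dtype=float)
    rise = amplitude * np.exp(-0.5 * ((t - t_peak) / rise_sigma) ** 2)
    decay = amplitude * np.exp(-(t - t_peak) / decay_tau)
    return np.where(t <= t_peak, rise, decay)

t = np.linspace(0.0, 10.0, 101)
flux = flare_model(t, t_peak=3.0, amplitude=1.0, rise_sigma=0.5, decay_tau=2.0)
```

In the detection algorithm this signal model, plus a polynomial background, is compared against noise-only hypotheses via a Bayesian odds ratio.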
Abstract:
Physicians expect a treatment to be more effective when its clinical outcomes are described as relative rather than as absolute risk reductions. We examined whether the effects of presentation method (relative vs. absolute risk reduction) remain when physicians are provided with the baseline risk information, a vital piece of statistical information omitted in previous studies. Using a between-subjects design, ninety-five physicians were presented with the risk reduction associated with a fictitious treatment for hypertension either as an absolute risk reduction or as a relative risk reduction, with or without baseline risk information. Physicians reported that the treatment would be more effective, and that they would be more willing to prescribe it, when its risk reduction was presented to them in relative rather than in absolute terms. The relative risk reduction was perceived as more effective than the absolute risk reduction even when the baseline risk information was explicitly reported. We recommend that information about absolute risk reduction be made available to physicians in the reporting of clinical outcomes. Moreover, health professionals should be cognizant of the potential biasing effects of risk information presented in relative risk terms.
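The framing effect at issue comes down to simple arithmetic: the same benefit looks small as an absolute reduction and large as a relative one. A hedged numerical illustration (the baseline figures below are hypothetical, not the study's stimuli):

```python
# Same hypothetical treatment benefit, framed two ways.
baseline_risk = 0.04   # event risk without treatment (hypothetical)
treated_risk = 0.03    # event risk with treatment (hypothetical)

arr = baseline_risk - treated_risk   # absolute risk reduction: 1 percentage point
rrr = arr / baseline_risk            # relative risk reduction: 25 per cent
nnt = 1.0 / arr                      # number needed to treat

print(arr, rrr, nnt)
```

"Reduces risk by 25%" and "reduces risk by 1 percentage point" describe the identical effect here, which is why reporting the absolute reduction (and baseline risk) matters.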
Abstract:
Recently there has been increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and architectural complexity). Once one has learned a model with such a method, the problem is then how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. Unfortunately, the standard tests used for this purpose are not able to jointly consider multiple performance measures. The aim of this paper is to resolve this issue by developing statistical procedures that can account for multiple competing measures at the same time. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among the measures to reduce the number of parameters of the models, since the number of studied cases in such comparisons is usually very small. Real data from a comparison among general-purpose classifiers are used to show a practical application of our tests.
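The multinomial-Dirichlet idea can be sketched generically. This is not the paper's test: joint comparison outcomes on two measures are binned into four categories with invented counts, a uniform Dirichlet prior gives a conjugate posterior, and posterior sampling estimates the probability that one method dominates on both measures.

```python
# Hedged sketch of a multinomial-Dirichlet comparison over joint outcomes.
# Counts are hypothetical; this is not the paper's procedure.
import numpy as np

rng = np.random.default_rng(1)

# categories: (A wins on both measures, A wins measure 1 only,
#              A wins measure 2 only, A loses on both)
counts = np.array([12, 3, 2, 3])
alpha_prior = np.ones(4)  # uniform (noninformative) Dirichlet prior

# conjugacy: posterior is Dirichlet(alpha_prior + counts); sample it
posterior = rng.dirichlet(alpha_prior + counts, size=10000)

# posterior probability that "A wins on both" is the majority outcome
p_dominates = float((posterior[:, 0] > 0.5).mean())
print(p_dominates)
```

The paper's actual tests are richer (a generalized likelihood-ratio counterpart, and conditional-independence structure to cut the parameter count), but the conjugate update shown here is the Bayesian core.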
Abstract:
The strong mixing of many-electron basis states in excited atoms and ions with open f shells results in very large numbers of complex, chaotic eigenstates that cannot be computed to any degree of accuracy. Describing the processes which involve such states requires the use of a statistical theory. Electron capture into these “compound resonances” leads to electron-ion recombination rates that are orders of magnitude greater than those of direct, radiative recombination and cannot be described by standard theories of dielectronic recombination. Previous statistical theories considered this as a two-electron capture process which populates a pair of single-particle orbitals, followed by “spreading” of the two-electron states into chaotically mixed eigenstates. This method is similar to a configuration-average approach because it neglects potentially important effects of spectator electrons and conservation of total angular momentum. In this work we develop a statistical theory which considers electron capture into “doorway” states with definite angular momentum obtained by the configuration interaction method. We apply this approach to electron recombination with W²⁰⁺, considering 2×10⁶ doorway states. Despite strong effects from the spectator electrons, we find that the results of the earlier theories largely hold. Finally, we extract the fluorescence yield (the probability of photoemission and hence recombination) by comparison with experiment.