944 results for mixed verification methods
Abstract:
The use of kilometre-scale ensembles in operational forecasting poses new challenges for forecast interpretation and evaluation, which must account for uncertainty on the convective scale. A new neighbourhood-based method is presented for evaluating and characterising local predictability variations in convective-scale ensembles. Spatial scales over which ensemble forecasts agree (agreement scales, S^A) are calculated at each grid point ij, providing a map of the spatial agreement between forecasts. By comparing the average agreement scale obtained from ensemble member pairs (S^A(mm)_ij) with that between members and radar observations (S^A(mo)_ij), this approach allows the location-dependent spatial spread-skill relationship of the ensemble to be assessed. The properties of the agreement scales are demonstrated using an idealised experiment. To demonstrate the methods in an operational context, S^A(mm)_ij and S^A(mo)_ij are calculated for six convective cases run with the Met Office UK Ensemble Prediction System. The S^A(mm)_ij highlight predictability differences between cases, which can be linked to physical processes. Maps of S^A(mm)_ij are found to summarise spatial predictability in a compact and physically meaningful manner that is useful for forecasting and for model interpretation. Comparison of S^A(mm)_ij and S^A(mo)_ij demonstrates the case-by-case and temporal variability of the spatial spread-skill relationship, which can again be linked to physical processes.
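The per-grid-point agreement-scale idea above can be sketched in a few lines. This is a minimal illustration, not the paper's exact formulation: the agreement criterion used here (normalised squared difference of neighbourhood means against a tolerance that relaxes linearly with scale) and all parameter values are assumptions chosen for the sketch.

```python
import numpy as np

def box_mean(field, i, j, s):
    """Mean of `field` over the (2s+1) x (2s+1) box centred on (i, j), clipped at the domain edges."""
    top, bottom = max(i - s, 0), min(i + s + 1, field.shape[0])
    left, right = max(j - s, 0), min(j + s + 1, field.shape[1])
    return field[top:bottom, left:right].mean()

def agreement_scale(f1, f2, i, j, s_max=20, alpha=0.5):
    """Smallest neighbourhood half-width s (in grid points) at which f1 and f2
    'agree' at grid point (i, j); returns s_max if no smaller scale qualifies.

    The criterion below is an illustrative assumption: the normalised squared
    difference of neighbourhood means must fall under a tolerance that relaxes
    linearly from alpha (at s = 0) to 1 (at s = s_max).
    """
    for s in range(s_max + 1):
        m1, m2 = box_mean(f1, i, j, s), box_mean(f2, i, j, s)
        denom = m1 ** 2 + m2 ** 2
        d = (m1 - m2) ** 2 / denom if denom > 0.0 else 0.0  # two empty boxes agree
        if d <= alpha + (1.0 - alpha) * s / s_max:
            return s
    return s_max
```

Mapping this over every grid point for each pair of ensemble members and averaging would give a field analogous to S^A(mm)_ij; substituting a radar-derived field for one member would give the S^A(mo)_ij analogue.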
Abstract:
The tiger nut, the tuber of the Cyperus esculentus L. plant, is an unusual storage system containing similar amounts of starch and lipid. The extraction of its oil by both mechanical pressing and aqueous enzymatic extraction (AEE) was investigated and the resulting products were examined. The effects of tuber particle size and moisture content on the oil yield from pressing were studied first: smaller particles were found to enhance oil yields, and a range of moisture contents was observed to favour higher yields. Subjecting samples to high pressures of up to 700 MPa before pressing at 38 MPa did not increase oil yields. Incubating ground samples with a mixture of α-amylase, Alcalase and Viscozyme (a mixture of cell-wall-degrading enzymes) as a pre-treatment increased the oil yield from pressing, recovering 90% of the oil. When aqueous enzymatic extraction was carried out on ground samples, the use of α-amylase, Alcalase and Celluclast individually improved extraction oil yields over extraction without enzymes by 34.5%, 23.4% and 14.7%, respectively. A mixture of the three enzymes further increased the oil yield, and the operational factors affecting the process (incubation time, total mixed-enzyme concentration, linear agitation speed and solid-liquid ratio) were studied individually. The largest oil yields were obtained with a solid-liquid ratio of 1:6, a mixed-enzyme concentration of 1% (w/w) and a 6 h incubation time, although the longer time allowed an emulsion to form. Keeping samples stationary during incubation surprisingly gave the highest oil yields, which was found to result from gravity separation occurring during agitation. Furthermore, high-pressure processing up to 300 MPa as a pre-treatment enhanced oil yields, but further pressure increments had a detrimental effect.
The quality of the oils recovered from both mechanical and aqueous enzymatic extraction, assessed by percentage free fatty acid (% FFA) and peroxide value (PV), reflected good oil stability, with a highest % FFA of 1.8 and a highest PV of 1.7. The fatty acid profiles of all oils also remained unchanged. Tocopherol levels in the oils were enhanced both by enzyme-aided pressing (EAP) and by high-pressure processing before AEE. Analysis of the residual meals revealed DP 3 and DP 4 oligosaccharides in EAP samples, although these require further assessment of their identity and quality.
Abstract:
Sensitivity and specificity are measures that allow us to evaluate the performance of a diagnostic test. In practice, it is common for a proportion of selected individuals not to have their true disease status verified, since verification may require an invasive procedure such as biopsy. This happens, for example, in the diagnosis of prostate cancer, and more generally whenever verification is risky, impracticable, unethical or too costly. In such cases it is common to use diagnostic tests based only on the information from verified individuals, a procedure that can lead to biased results, known as workup bias. In this paper, we introduce a Bayesian approach to estimate the sensitivity and specificity of two diagnostic tests considering both verified and unverified individuals, a result that generalizes the usual situation based on a single diagnostic test.
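To make the verified-only estimation concrete: the sketch below computes Bayesian posteriors for sensitivity and specificity from verified individuals alone, using conjugate Beta priors. The counts are entirely hypothetical, and the sketch deliberately shows only the naive verified-only analysis whose workup bias the paper's model, which additionally incorporates the unverified individuals and a second test, is designed to correct.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical verified counts for one diagnostic test (illustration only):
tp, fn = 45, 5     # verified diseased:     test positive / test negative
tn, fp = 80, 20    # verified disease-free: test negative / test positive

# With Beta(1, 1) priors, conjugacy gives Beta posteriors directly:
#   sensitivity | data ~ Beta(1 + tp, 1 + fn)
#   specificity | data ~ Beta(1 + tn, 1 + fp)
sens_draws = rng.beta(1 + tp, 1 + fn, size=10_000)
spec_draws = rng.beta(1 + tn, 1 + fp, size=10_000)

sens_mean = sens_draws.mean()   # close to (1 + tp) / (2 + tp + fn)
spec_mean = spec_draws.mean()   # close to (1 + tn) / (2 + tn + fp)
```

If verification is more likely for test-positive individuals, these verified-only posteriors systematically overstate sensitivity and understate specificity, which is precisely the bias that motivates modelling the unverified individuals.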
Abstract:
Linear mixed models were developed to handle clustered data and have been a topic of increasing interest in statistics for the past 50 years. Normality (or symmetry) of the random effects is a common assumption in linear mixed models, but it may sometimes be unrealistic, obscuring important features of among-subject variation. In this article, we use skew-normal/independent distributions as a tool for robust modeling of linear mixed models under a Bayesian paradigm. The skew-normal/independent distributions form an attractive class of asymmetric heavy-tailed distributions that includes the skew-normal, skew-t, skew-slash and skew-contaminated normal distributions as special cases, providing an appealing robust alternative to the routine use of symmetric distributions in this type of model. The methods developed are illustrated using a real data set from the Framingham cholesterol study.
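The construction of this distribution family can be illustrated by simulation. The sketch below draws from Azzalini's skew-normal via the well-known half-normal representation, then divides by the square root of a Gamma mixing variable to obtain a skew-t draw, one member of the skew-normal/independent class. Parameter values are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def rskew_normal(n, loc=0.0, scale=1.0, shape=3.0, rng=rng):
    """Draw n samples from Azzalini's skew-normal SN(loc, scale, shape)
    using the representation Z = delta*|U0| + sqrt(1 - delta^2)*U1."""
    delta = shape / np.sqrt(1.0 + shape ** 2)
    u0 = np.abs(rng.standard_normal(n))   # half-normal component drives the skew
    u1 = rng.standard_normal(n)
    z = delta * u0 + np.sqrt(1.0 - delta ** 2) * u1
    return loc + scale * z

x = rskew_normal(100_000, shape=5.0)

# A skew-normal/independent draw divides the skew-normal part by the square
# root of a positive mixing variable; W ~ Gamma(nu/2, rate=nu/2) yields the
# skew-t distribution (note numpy's gamma takes a scale, hence 2/nu).
nu = 4.0
w = rng.gamma(nu / 2.0, 2.0 / nu, size=100_000)
x_skew_t = rskew_normal(100_000, shape=5.0) / np.sqrt(w)
```

The mixing step is what produces the heavy tails: the skew-t draws have visibly larger spread than the plain skew-normal draws while retaining the same asymmetry.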
Abstract:
In this article, we consider local influence analysis for the skew-normal linear mixed model (SN-LMM). As the observed-data log-likelihood associated with the SN-LMM is intractable, Cook's well-known approach cannot be applied to obtain measures of local influence. Instead, we develop local influence measures following the approach of Zhu and Lee (2001), which is based on an EM-type algorithm and is invariant under reparametrizations. Four specific perturbation schemes are discussed. Results obtained for a simulated data set and a real data set are reported, illustrating the usefulness of the proposed methodology.
Abstract:
The present paper describes the immobilization of nanoparticles onto conducting substrates by both electrostatic layer-by-layer assembly and electrophoretic deposition (EPD). The two techniques were compared in high-performance electrochromic electrodes based on mixed nickel hydroxide nanoparticles. In addition to easy handling, EPD appears to be the more suitable method for immobilizing the nanoparticles, leading to higher electrochromic efficiencies, shorter response times and higher stability over coloration and bleaching cycles.
Abstract:
Two porous mixed-valent diruthenium(II,III) dicarboxylate compounds have been prepared and characterized by spectroscopic methods, X-ray diffraction and thermogravimetry. Crystalline solids of [Ru₂(tere)₂Cl]·3.5H₂O (tere = terephthalate) and [Ru₂(adip)₂Cl]·1.5H₂O (adip = adipate) consist of extended chains in which polymeric layers of multiply metal-metal bonded [Ru₂]⁵⁺ cores are bridged by dicarboxylate ligands in paddlewheel-type geometries. Units of [Ru₂(dicarboxylate)₂]ⁿ⁺ are linked by axial bridging chloride ions, generating three-dimensional networks. The polymers lose non-bonded water molecules at low temperatures but do not undergo thermal decomposition below 280-300 °C. Both compounds exhibit high BET surface areas ([Ru₂(tere)₂Cl]: 235 m² g⁻¹; [Ru₂(adip)₂Cl]: 281 m² g⁻¹) and occlude similar numbers of moles of N₂ per mole of metal. The terephthalate ligand generated an organized structure with supermicropores (total pore volume of 0.24 cm³ g⁻¹), while the adipate ligand led to a mesoporous structure (total pore volume of 0.47 cm³ g⁻¹) for the corresponding diruthenium(II,III) dicarboxylate polymers.
Abstract:
Mixed-ligand complexes of technetium(V) or rhenium(V) containing tridentate N-[(dialkylamino)(thiocarbonyl)]benzamidine (H₂L¹) and bidentate N,N-dialkyl-N′-benzoylthiourea (HL²) ligands were formed in high yields when (NBu₄)[MOCl₄] (M = Tc or Re) or [ReOCl₃(PPh₃)₂] was treated with mixtures of the proligands. Other routes to the products are reactions of [MOCl(L¹)] complexes with HL², or of compounds of composition [ReOCl₂(PPh₃)(L²)] with H₂L¹. The resulting air-stable [MO(L¹)(L²)] complexes show potential for the development of metal-based radiopharmaceuticals. [TcO(L¹)(L²)] complexes are readily reduced by PPh₃ with formation of [Tc(L¹)(L²)(PPh₃)]. The resulting Tc(III) complexes undergo two almost-reversible oxidation steps corresponding to one-electron transfer processes.
Abstract:
This thesis develops and evaluates statistical methods for different types of genetic analysis, including quantitative trait loci (QTL) analysis, genome-wide association studies (GWAS) and genomic evaluation. Its main contribution is to provide novel insights into modeling genetic variance, especially via random effects models. In variance-component QTL analysis, a full likelihood model accounting for uncertainty in the identity-by-descent (IBD) matrix was developed; it was found to correctly adjust the bias in genetic variance component estimation and to gain power in QTL mapping in terms of precision. Double hierarchical generalized linear models, and a non-iterative simplified version, were implemented and applied to fit data from an entire genome. These whole-genome models were shown to perform well in both QTL mapping and genomic prediction. A re-analysis of a publicly available GWAS data set identified significant loci in Arabidopsis that control phenotypic variance instead of the mean, validating the idea of variance-controlling genes. The work in this thesis is accompanied by R packages available online, including a general statistical tool for fitting random effects models (hglm), an efficient generalized ridge regression for high-dimensional data (bigRR), a double-layer mixed model for genomic data analysis (iQTL), a stochastic IBD matrix calculator (MCIBD), a computational interface for QTL mapping (qtl.outbred), and a GWAS analysis tool for mapping variance-controlling loci (vGWAS).
Abstract:
Generalized linear mixed models are flexible tools for modeling non-normal data and are useful for accommodating overdispersion in Poisson regression models with random effects. Their main difficulty lies in parameter estimation, because the marginal likelihood has no analytic maximum. Many methods have been proposed for this purpose, and many of them are implemented in software packages. The purpose of this study is to compare, via simulation studies, the performance of three statistical principles: marginal likelihood, extended likelihood and Bayesian analysis. Real data on contact wrestling are used for illustration.
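The absence of an analytic solution mentioned above can be seen directly in the simplest case, a random-intercept Poisson model, where each cluster's marginal likelihood is an integral over the random effect that must be approximated numerically. The sketch below simulates such data (all parameter values are hypothetical) and maximises the marginal likelihood using Gauss-Hermite quadrature; this is a generic illustration of the marginal-likelihood principle, not the specific implementations compared in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(1)

# Simulated random-intercept Poisson model (hypothetical values):
#   y_ij ~ Poisson(exp(beta0 + u_i)),  u_i ~ N(0, sigma^2)
n_clusters, n_per = 50, 10
beta0_true, sigma_true = 1.0, 0.7
u_true = rng.normal(0.0, sigma_true, n_clusters)
y = rng.poisson(np.exp(beta0_true + u_true)[:, None], size=(n_clusters, n_per))

nodes, weights = np.polynomial.hermite.hermgauss(30)  # Gauss-Hermite rule

def neg_marginal_loglik(theta):
    beta0, log_sigma = theta
    sigma = np.exp(log_sigma)
    # Substituting u = sqrt(2)*sigma*x turns the N(0, sigma^2) integral into
    # a weighted sum over the quadrature nodes (weights scaled by 1/sqrt(pi)).
    eta = beta0 + np.sqrt(2.0) * sigma * nodes            # (K,) per-node predictor
    ll = (y[:, :, None] * eta - np.exp(eta)
          - gammaln(y[:, :, None] + 1.0)).sum(axis=1)     # (n_clusters, K)
    m = ll.max(axis=1, keepdims=True)                     # log-sum-exp for stability
    cluster_ll = m[:, 0] + np.log(
        (np.exp(ll - m) * weights / np.sqrt(np.pi)).sum(axis=1))
    return -cluster_ll.sum()

res = minimize(neg_marginal_loglik, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
beta0_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

Every numerical choice here (number of quadrature nodes, the optimiser, the log-sigma parameterisation) is one implementation decision among several, which is exactly why different software packages can give different results for the same model.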
Abstract:
Recently, two international standards organizations, ISO and OGC, have carried out standardization work for GIS. Current standardization work for providing interoperability among GIS databases focuses on the design of open interfaces, but has not considered procedures and methods for designing river geospatial data, which has a model of its own. When such data are shared through open interfaces among heterogeneous GIS databases, differences between the models result in loss of information. In this study, a plan was proposed both to respond to these changes in the information environment and to provide a future Smart River-based river information service, by assessing the current state of the river geospatial data model and improving and redesigning the database. Primary and foreign keys, which can distinguish attribute information and entity linkages, were redefined to increase usability, and the attribute-information database design and entity-relationship diagram were newly redefined to redesign the linkages among tables from the perspective of a river standard database. In addition, this study aims to expand the current supplier-oriented operating system into a demand-oriented one by establishing efficient management of river-related information and a utilization system capable of adapting to changes in the river management paradigm.
Abstract:
Central venous catheters inserted in patients admitted to an intensive care unit were evaluated by microbiological methods (semi-quantitative culture) and scanning electron microscopy in order to detect microbial adhesion and correlate it with blood culture. During the study period, 59 patients with central venous catheters were evaluated. Patient age, sex, insertion site and catheter dwell time were recorded. The catheters were non-tunnelled, single-lumen polyurethane catheters. Blood for culture was collected at the time of catheter removal. Of 63 catheter tips, 30 (47.6%) were colonized, and infection was found in 5 (23.8%) catheters. Infection was more prevalent in the 26 patients (41.3%) with catheters inserted in the subclavian vein than in the 3 (3.2%) inserted in the jugular vein. Infection was observed more frequently in catheters with dwell times longer than seven days. The microorganisms isolated included 32 coagulase-negative staphylococci (29.7%), 61 Gram-negative bacteria (52.9%), 9 coagulase-positive staphylococci (8.3%) and 3 yeasts (2.7%). E. aerogenes, P. aeruginosa and A. baumannii were isolated as causative agents of infections in the intensive care unit. The antimicrobial with the greatest in vitro activity against Gram-negative bacteria was imipenem; against Gram-positive bacteria, vancomycin, cefepime, penicillin, rifampicin and tetracycline were most active. Scanning electron microscopy revealed biofilms on the surface of all catheters examined.
Abstract:
Among research on the preparation and testing of nanostructured materials, titanium dioxide and zinc oxide have been the most frequently studied oxides. In order to extend their properties, composites have been prepared using three different methods: the polyol method, the sol-gel process, and a combination of the two (a hybrid process). Recent research has shown better properties in composite materials than in the pure oxides. This work presents the preparation and structural characterization of ZnO-TiO2 composite nanostructures, to be tested for their performance in electrocatalysis and in further trials on photovoltaic cells.
Abstract:
PLCs (Programmable Logic Controllers) perform control operations: they receive information from the environment, process it and modify that same environment according to the results produced. They are commonly used in industry in several applications, from mass transport to the petroleum industry. As the complexity of these applications increases, and as many of them are safety-critical, the need arises to ensure that they are reliable. Testing and simulation are the de facto methods used in industry for this, but they can leave flaws undiscovered. Formal methods can provide more confidence in an application's safety, since they permit mathematical verification. We use the B Method, which has been successfully applied in the formal verification of industrial systems, is supported by several tools, and handles decomposition, refinement and verification of correctness with respect to the specification. The method we developed and present in this work automatically generates B models from PLC programs and verifies them against safety constraints manually derived from the system requirements. The scope of our method is the PLC programming languages of the IEC 61131-3 standard, although we are also able to verify programs not fully compliant with the standard. Our approach aims to ease the integration of formal methods in industry by reducing the effort needed to perform formal verification of PLCs.