910 results for Variable Aggregation
Abstract:
The last 30 years have seen Fuzzy Logic (FL) emerge as a method that either complements or challenges stochastic methods, the traditional approach to modelling uncertainty. The circumstances under which FL or stochastic methods should be used, however, remain a matter of disagreement: the areas of application of statistical and FL methods overlap, and opinions differ as to when each method should be used. Practically relevant case studies comparing the two methods are lacking. This work compares stochastic and FL methods for the assessment of spare capacity, using pharmaceutical high purity water (HPW) utility systems as an example. The goal of this study was to find the most appropriate method for modelling uncertainty in industrial-scale HPW systems. The results provide evidence that stochastic methods are superior to FL methods for simulating uncertainty in chemical plant utilities, including HPW systems, in the typical case where extreme events (for example, peaks in demand) or day-to-day variation, rather than average values, are of interest. Average production output or other statistical measures may, for instance, be of interest in the assessment of workshops. Furthermore, the results indicate that a stochastic model should be used only if a deterministic simulation shows it to be necessary. Consequently, this thesis concludes that either deterministic or stochastic methods should be used to simulate uncertainty in chemical plant utility systems, and by extension some process systems, because extreme events and the modelling of day-to-day variation are important in capacity extension projects. Other reasons for preferring stochastic HPW models to FL HPW models include:
1. The computer code for stochastic models is typically less complex than that for FL models, reducing code maintenance and validation effort.
2. In many respects FL models are similar to deterministic models, so the need for a FL model rather than a deterministic one is questionable for industrial-scale HPW systems as presented here (and other similar systems), since the deterministic model is the simpler of the two.
3. A FL model may be difficult to "sell" to an end-user, as its results represent "approximate reasoning", for which, however, no accepted definition exists.
4. Stochastic models may be applied, with relatively minor modifications, to other systems, whereas FL models may not. For instance, the stochastic HPW model could be used to model municipal drinking water systems, whereas the FL HPW model could not, because the two model philosophies are fundamentally different: the stochastic model treats schedule and volume uncertainties as random phenomena described by statistical distributions based on estimated or historical data, whereas the FL model simulates schedule uncertainties from estimated operator behaviour, e.g. operator tiredness and working schedules, and in a municipal drinking water distribution system the notion of "operator" breaks down.
5. Stochastic methods can account for uncertainties that are difficult to model with FL. The FL HPW model does not account for dispensed-volume uncertainty, as there appears to be no reasonable way to capture it with FL, whereas the stochastic model includes volume uncertainty.
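As a minimal sketch of the stochastic modelling approach described above (all distributions, parameters and names below are illustrative assumptions, not taken from the thesis), a Monte Carlo simulation of HPW demand might look like this:

```python
# Minimal Monte Carlo sketch of the stochastic HPW modelling approach:
# schedule and dispensed-volume uncertainty are drawn from statistical
# distributions, and the peak (extreme-event) demand is estimated.
# All parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
N_RUNS = 10_000        # Monte Carlo replications
N_EVENTS = 12          # dispensing events per day (assumed)
NOMINAL_L = 500.0      # nominal volume per event in litres (assumed)

peaks = np.empty(N_RUNS)
for i in range(N_RUNS):
    # Schedule uncertainty: start times jittered around a nominal plan.
    starts = np.clip(rng.normal(np.linspace(1, 23, N_EVENTS), 0.5), 0, 24)
    # Volume uncertainty: dispensed volumes vary around the nominal value.
    volumes = rng.normal(NOMINAL_L, 50.0, size=N_EVENTS)
    # Hourly demand profile for this replication.
    hourly, _ = np.histogram(starts, bins=24, range=(0, 24), weights=volumes)
    peaks[i] = hourly.max()

# Capacity decisions hinge on extreme events, not averages.
print(f"95th-percentile hourly peak demand: {np.percentile(peaks, 95):.0f} L")
```

The point of the sketch is that percentiles of the peak, not the mean output, are what a spare-capacity assessment reads off the simulation.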
Abstract:
Consensus HIV-1 genes can decrease the genetic distances between candidate immunogens and field virus strains. To ensure the functionality and optimal presentation of immunologic epitopes, we generated two group-M consensus env genes that contain variable regions derived either from a wild-type B/C recombinant virus isolate (CON6) or from minimal consensus elements (CON-S) in the V1, V2, V4, and V5 regions. C57BL/6 and BALB/c mice were primed twice with CON6, CON-S, and subtype control (92UG37_A and HXB2/Bal_B) DNA and boosted with recombinant vaccinia virus (rVV). Mean antibody titers against 92UG37_A, 89.6_B, 96ZM651_C, CON6, and CON-S Env proteins were determined. Both CON6 and CON-S induced higher mean antibody titers against several of the proteins than did the subtype controls. However, no significant differences were found in mean antibody titers between animals immunized with CON6 and those immunized with CON-S. Cellular immune responses were measured by using five complete Env overlapping peptide sets: subtype A (92UG37_A), subtype B (MN_B, 89.6_B and SF162_B), and subtype C (Chn19_C). The intensity of the induced cellular responses was measured by using pooled Env peptides; T-cell epitopes were identified by using matrix peptide pools and individual peptides. No significant differences in T-cell immune-response intensities were noted between CON6- and CON-S-immunized BALB/c and C57BL/6 mice. In BALB/c mice, ten and eight nonoverlapping T-cell epitopes were identified for CON6 and CON-S, respectively, whereas eight were identified for 92UG37_A and HXB2/BAL_B. In C57BL/6 mice, nine and six nonoverlapping T-cell epitopes were identified after immunization with CON6 and CON-S, respectively, whereas only four and three were identified for 92UG37_A and HXB2/BAL_B, respectively. Combining both mouse strains, 18 epitopes were identified in total. The group-M artificial consensus env genes CON6 and CON-S were equally immunogenic, in both breadth and intensity, in inducing humoral and cellular immune responses.
Abstract:
The intersection of the amyloid cascade hypothesis and the implication of metal ions in Alzheimer's disease progression has sparked an interest in using metal-binding compounds as potential therapeutic agents. In the present work, we describe a prochelator SWH that is enzymatically activated by beta-secretase to produce a high affinity copper chelator CP. Because beta-secretase is responsible for the amyloidogenic processing of the amyloid precursor protein, this prochelator strategy imparts disease specificity toward copper chelation not possible with general metal chelators. Furthermore, once activated, CP efficiently sequesters copper from amyloid-beta, prevents and disassembles copper-induced amyloid-beta aggregation, and diminishes copper-promoted reactive oxygen species formation.
Abstract:
We consider the problem of variable selection in regression modeling in high-dimensional spaces where there is known structure among the covariates. This is an unconventional variable selection problem for two reasons: (1) the dimension of the covariate space is comparable to, and often much larger than, the number of subjects in the study, and (2) the covariate space is highly structured, and in some cases it is desirable to incorporate this structural information into the model building process. We approach this problem through the Bayesian variable selection framework, where we assume that the covariates lie on an undirected graph and formulate an Ising prior on the model space for incorporating structural information. Certain computational and statistical problems arise that are unique to such high-dimensional, structured settings, the most interesting being the phenomenon of phase transitions. We propose theoretical and computational schemes to mitigate these problems. We illustrate our methods on two different graph structures: the linear chain and the regular graph of degree k. Finally, we use our methods to study a specific application in genomics: the modeling of transcription factor binding sites in DNA sequences. © 2010 American Statistical Association.
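For concreteness, a standard form of such an Ising prior on the inclusion indicators (notation assumed here, not quoted from the paper) is

$$ P(\gamma) \;\propto\; \exp\!\Big( a \sum_{i=1}^{p} \gamma_i \;+\; b \sum_{(i,j) \in E} \gamma_i \gamma_j \Big), \qquad \gamma_i \in \{0,1\}, $$

where $E$ is the edge set of the covariate graph, $a$ controls overall sparsity, and $b > 0$ rewards models in which neighbouring covariates are selected together; the phase-transition phenomenon mentioned above arises as $b$ grows large.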
Abstract:
This paper studies the multiplicity-correction effect of standard Bayesian variable-selection priors in linear regression. Our first goal is to clarify when, and how, multiplicity correction happens automatically in Bayesian analysis, and to distinguish this correction from the Bayesian Ockham's-razor effect. Our second goal is to contrast empirical-Bayes and fully Bayesian approaches to variable selection through examples, theoretical results and simulations. Considerable differences between the two approaches are found. In particular, we prove a theorem that characterizes a surprising asymptotic discrepancy between fully Bayes and empirical Bayes. This discrepancy arises from a different source than the failure to account for hyperparameter uncertainty in the empirical-Bayes estimate. Indeed, even at the extreme, when the empirical-Bayes estimate converges asymptotically to the true variable-inclusion probability, the potential for a serious difference remains. © Institute of Mathematical Statistics, 2010.
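A short worked version of the fully Bayes mechanism (the standard construction, not quoted from the paper): with $m$ candidate variables and a common prior inclusion probability $p \sim \mathrm{Uniform}(0,1)$, the prior probability of a model $\gamma$ containing $k_\gamma$ variables is

$$ \pi(\gamma) \;=\; \int_0^1 p^{k_\gamma} (1-p)^{\,m-k_\gamma}\, dp \;=\; \frac{1}{m+1} \binom{m}{k_\gamma}^{-1}, $$

so adding spurious candidate variables (increasing $m$) automatically shrinks the prior mass on any fixed model. This is the automatic multiplicity penalty, in contrast to plugging an empirical-Bayes point estimate $\hat{p}$ into the first expression.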
Abstract:
Antigenically variable RNA viruses are significant contributors to the burden of infectious disease worldwide. One reason for their ubiquity is their ability to escape herd immunity through rapid antigenic evolution and thereby to reinfect previously infected hosts. However, the ways in which these viruses evolve antigenically are highly diverse. Some have only limited diversity in the long run, with every emergence of a new antigenic variant coupled with the replacement of the older variant. Other viruses rapidly accumulate antigenic diversity over time. Still others exhibit dynamics that can be considered evolutionary intermediates between these two extremes. Here, we present a theoretical framework that aims to understand these differences in evolutionary patterns by considering a virus's epidemiological dynamics in a given host population. Our framework, based on a dimensionless number, probabilistically anticipates patterns of viral antigenic diversification and thereby quantifies a virus's evolutionary potential. It is therefore similar in spirit to the basic reproduction number, the well-known dimensionless number that quantifies a pathogen's reproductive potential. We further outline how our theoretical framework can be applied to empirical viral systems, using influenza A/H3N2 as a case study. We end with predictions of our framework and work that remains to be done to further integrate viral evolutionary dynamics with disease ecology.
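To make the analogy concrete (this is the textbook quantity the abstract invokes, not the paper's new number): for the simple SIR model with transmission rate $\beta$ and recovery rate $\gamma$, the basic reproduction number is

$$ R_0 = \frac{\beta}{\gamma}, $$

the expected number of secondary infections caused by a single infected host in a fully susceptible population. The framework above defines an analogous dimensionless number for antigenic evolutionary potential rather than reproductive potential.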
Abstract:
Detailed phenotypic characterization of B cell subpopulations is of utmost importance for the diagnosis and management of humoral immunodeficiencies, as they are used for classification of common variable immunodeficiencies. Since age-specific reference values remain scarce in the literature, we analysed by flow cytometry the proportions and absolute values of total, memory, switched memory and CD21(-/low) B cells in blood samples from 168 healthy children (1 day to 18 years) with special attention to the different subpopulations of CD21(low) B cells. The percentages of total memory B cells and their subsets significantly increased up to 5-10 years. In contrast, the percentages of immature CD21(-) B cells and of immature transitional CD21(low)CD38(hi) B cells decreased progressively with age, whereas the percentage of CD21(low) CD38(low) B cells remained stable during childhood. Our data stress the importance of age-specific reference values for the correct interpretation of B cell subsets in children as a diagnostic tool in immunodeficiencies.
Abstract:
In analysing the mathematical discourse of a school algebra textbook, we have found that the domain of a variable is a concept present from the first appearance of expressions that generalize operations, relations and properties of the real numbers, yet one that is made explicit only in the study of the algebra of algebraic expressions. This concept, together with those of the reference set of an expression and of the solution set, plays a leading role in different contexts of school algebra, allowing it to act as an indispensable didactic variable in giving meaning to many other algebraic concepts.
Abstract:
This paper focuses on the notion of variable as a basic element in the construction of concepts related to phenomena of variation and change. We start from the premise that the variable is not an idea constructed as an isolated object or process, but one that necessarily arises from the relation between at least two changing quantities, of which, in most cases, one is time. We set out to study the variable along several dimensions, the epistemological, the cognitive, the didactic and the sociocultural, in order to identify which processes favour the construction of this notion and, likewise, to characterize it.
Abstract:
The article presents two methods for solving inequalities. The methods are compared from several points of view, and some aspects of the work carried out since 1983 on the teaching of this topic in the Faculty of Sciences of the Universidad Central de Venezuela are discussed.
Abstract:
There has been a recent revival of interest in the register insertion (RI) protocol because of its high throughput and low delay characteristics. Several variants of the protocol have been investigated with a view to integrating voice and data applications on a single local area network (LAN). In this paper the performance of an RI ring with a variable-size buffer is studied by modelling and simulation. The chief advantage of the proposed scheme is that an efficient but simple bandwidth allocation scheme is easily incorporated. Approximate formulas are derived for queue lengths, queueing times, and total end-to-end transfer delays. The results are compared with previous analyses and with simulation estimates. The effectiveness of the proposed protocol in ensuring fairness of access under conditions of heavy and unequal loading is investigated.
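The paper's approximate formulas are not reproduced in the abstract; as standard background for such analyses, the basic identities are Little's law and, for the simplest single-server case, the M/M/1 time in system:

$$ L = \lambda W, \qquad W_{M/M/1} = \frac{1}{\mu - \lambda} \quad (\lambda < \mu), $$

where $\lambda$ is the arrival rate, $\mu$ the service rate, $L$ the mean number of packets in the system and $W$ the mean time a packet spends in it. Queue-length and delay approximations for the RI ring are of this general character, refined for the insertion-buffer mechanism.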
Abstract:
The use of variable frequency microwave technology in the curing of polymer materials used in microelectronics applications is discussed. A revolutionary open-ended microwave curing system is outlined and assessed using experimental and numerical approaches. Experimental and numerical results are presented, demonstrating the feasibility of the system.
Abstract:
Curing of encapsulant material in a simplified microelectronics package using an open-oven Variable Frequency Microwave (VFM) system is numerically simulated using a coupled solver approach. A numerical framework is presented that is capable of simulating the electromagnetic field distribution within the oven system, together with heat transfer, cure rate, degree of cure and thermally induced stresses within the encapsulant material. The discrete physical processes have been integrated into a fully coupled solution, enabling usefully accurate results to be generated. Numerical results showing the heating and curing of the encapsulant material have been obtained and are presented in this contribution. The requirement to capture inter-process coupling and the variation in dielectric and thermophysical material properties is discussed and illustrated with simulation results.
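The abstract does not state which cure-kinetics model is used; purely as an illustration of the kind of equation such a coupled solver integrates, a widely used autocatalytic (Kamal-type) model is

$$ \frac{d\alpha}{dt} = \big(k_1 + k_2\,\alpha^{m}\big)\,(1-\alpha)^{n}, \qquad k_i = A_i \exp\!\big(-E_i/(RT)\big), $$

where $\alpha$ is the degree of cure, $T$ the local temperature supplied by the heat-transfer solution, and $A_i$, $E_i$, $m$, $n$ are material fitting parameters. The inter-process coupling noted above arises because microwave heating sets $T$, which drives $d\alpha/dt$, while the exothermic cure and the cure-dependent material properties feed back into the electromagnetic and thermal solutions.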
Abstract:
Dual-section variable frequency microwave systems enable rapid, controllable heating of materials within an individual surface mount component in a chip-on-board assembly. The ability to process devices individually allows components with disparate processing requirements to be mounted on the same assembly. The temperature profile induced by the microwave system can be specifically tailored to the needs of the component, allowing the degree of cure to be optimised whilst minimising thermomechanical stresses. This paper presents a review of dual-section microwave technology and its application to the curing of thermosetting polymer materials in microelectronics applications. Curing processes using both conventional and microwave technologies are assessed and compared. Results indicate that dual-section microwave systems are able to cure individual surface mount packages in a significantly shorter time, at the expense of an increase in thermomechanical stresses and a greater variation in degree of cure.
Abstract:
The MHD wave instability in commercial cells for electrolytic aluminium production is often described using ‘shallow water’ models. The model of [1] is extended to a variable-height cathode bottom and anode top in order to account for realistic cell features. The variable depth of the two fluid layers affects the horizontal current density, the wave development and the stability threshold. Instructive examples for a 500 kA cell are presented.
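For context (a standard result in this literature, not quoted from the paper): Sele-type stability estimates for such cells group the destabilising and restoring effects into a single dimensionless parameter, often written

$$ \beta \;=\; \frac{I\,B_z}{g\,\Delta\rho\,h_1 h_2}, $$

where $I$ is the cell current, $B_z$ the vertical magnetic field, $\Delta\rho$ the density difference between liquid aluminium and electrolyte, and $h_1$, $h_2$ the depths of the two fluid layers; instability is expected once $\beta$ exceeds a critical value. This makes clear why the variable layer depths discussed above shift the stability threshold.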