818 results for fuzzy rule base models


Relevance: 30.00%

Publisher:

Abstract:

Paper presented at the 10º Congresso Nacional de Sismologia e Engenharia Sísmica, 20-22 April 2016, Ponta Delgada, Azores, Portugal.

Relevance: 30.00%

Publisher:

Abstract:

We performed a quantitative comparison of brittle thrust wedge experiments to evaluate the variability among analogue models and to appraise the reproducibility and limits of model interpretation. Fifteen analogue modeling laboratories participated in this benchmark initiative. Each laboratory received a shipment of the same type of quartz and corundum sand, and all laboratories adhered to a stringent model-building protocol and used the same type of foil to cover the base and sidewalls of the sandbox. Sieve structure, sifting height, filling rate, and details of off-scraping of excess sand followed prescribed procedures. Our analogue benchmark shows that even for simple plane-strain experiments with prescribed, stringent model construction techniques, quantitative model results show variability, most notably in surface slope, thrust spacing, and the number of forward thrusts and backthrusts. One source of the variability in model results is slight variations in how sand is deposited in the sandbox. Small changes in sifting height, sifting rate, and scraping result in slightly heterogeneous material bulk densities, which affect the mechanical properties of the sand and produce lateral and vertical differences in peak and boundary friction angles, as well as in cohesion values, once the model is constructed. Initial variations in basal friction are inferred to play the most important role in causing model variability. Our comparison shows that the human factor plays a decisive role: even when one modeler repeats the same experiment, quantitative model results still show variability. These observations highlight the limits of up-scaling quantitative analogue model results to nature or of comparing them with numerical models. The frictional behavior of sand is highly sensitive to small variations in material state or experimental set-up, and hence it will remain difficult to scale quantitative results such as the number of thrusts, thrust spacing, and pop-up width from model to nature.

Relevance: 30.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance: 30.00%

Publisher:

Abstract:

"Materials Laboratory, Contract No. AF33(616)-5771, Project No. 7021."

Relevance: 30.00%

Publisher:

Abstract:

"Project no. 6114, task no. 60806. Prepared under Contract no. AF 33(616)-6858 by W. A. Livingston Jr. of the Cornell Aeronautical Laboratory, Inc."

Relevance: 30.00%

Publisher:

Abstract:

An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local FDR (false discovery rate) is provided for each gene. An attractive feature of the mixture model approach is that it provides a framework for the estimation of the prior probability that a gene is not differentially expressed, and this probability can subsequently be used in forming a decision rule. The rule can also be formed to take the false negative rate into account. We apply this approach to a well-known publicly available data set on breast cancer, and discuss our findings with reference to other approaches.
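The mixture-model decision rule described above can be sketched numerically. The component parameters below (pi0 = 0.9, an N(2.5, 1) alternative density) and the 0.2 cut-off are hypothetical illustrations, not values from the paper; in practice pi0 and the alternative density would be estimated from the data, e.g. by EM.

```python
import math

def normal_pdf(z, mu=0.0, sigma=1.0):
    """Density of a normal distribution at z."""
    return math.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def local_fdr(z, pi0=0.9, mu1=2.5, sigma1=1.0):
    """Local FDR of a gene with test score z under a two-component
    mixture: pi0 * f0(z) / f(z), i.e. the posterior probability that
    the gene is NOT differentially expressed."""
    f0 = normal_pdf(z)                 # null component density
    f1 = normal_pdf(z, mu1, sigma1)    # alternative component density
    f = pi0 * f0 + (1 - pi0) * f1      # marginal mixture density
    return pi0 * f0 / f

# Decision rule: declare a gene differentially expressed when its
# local FDR falls below a chosen cut-off (0.2 here, arbitrary).
scores = [0.3, 1.8, 3.1, 4.0]
calls = [z for z in scores if local_fdr(z) < 0.2]   # keeps 3.1 and 4.0
```

The cut-off trades the false discovery rate against the false negative rate: raising it admits more genes at the cost of more false positives, which is the trade-off the abstract's decision rule is designed around.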

Relevance: 30.00%

Publisher:

Abstract:

An important and common problem in microarray experiments is the detection of genes that are differentially expressed in a given number of classes. As this problem concerns the selection of significant genes from a large pool of candidate genes, it needs to be carried out within the framework of multiple hypothesis testing. In this paper, we focus on the use of mixture models to handle the multiplicity issue. With this approach, a measure of the local false discovery rate is provided for each gene, and it can be implemented so that the implied global false discovery rate is bounded as with the Benjamini-Hochberg methodology based on tail areas. The latter procedure is too conservative, unless it is modified according to the prior probability that a gene is not differentially expressed. An attractive feature of the mixture model approach is that it provides a framework for the estimation of this probability and its subsequent use in forming a decision rule. The rule can also be formed to take the false negative rate into account.
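For reference, a minimal sketch of the Benjamini-Hochberg step-up procedure that the abstract contrasts with, including the pi0 modification it mentions. The p-values in the usage lines are made up for illustration.

```python
def benjamini_hochberg(pvals, q=0.05, pi0=1.0):
    """Step-up Benjamini-Hochberg rule controlling the FDR at level q.
    pi0=1.0 gives the classical (conservative) procedure; passing an
    estimate pi0 < 1 gives the adaptive variant the abstract alludes to."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank whose p-value clears the step-up threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * q / (m * pi0):
            k = rank
    rejected = set(order[:k])
    return [i in rejected for i in range(m)]

# Made-up p-values: the classical rule rejects two hypotheses, while
# the adaptive rule with pi0 = 0.5 also rejects the third.
pv = [0.01, 0.02, 0.04, 0.5]
classical = benjamini_hochberg(pv)           # [True, True, False, False]
adaptive = benjamini_hochberg(pv, pi0=0.5)   # [True, True, True, False]
```

This illustrates the abstract's point: the classical thresholds i*q/m are conservative when many genes are truly non-null, and dividing by an estimated pi0 relaxes them.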

Relevance: 30.00%

Publisher:

Abstract:

A comparison of a constant (continuous delivery of 4% FiO₂) and a variable (initial 5% FiO₂ with adjustments to induce low-amplitude EEG (LAEEG) and hypotension) hypoxic/ischemic insult was performed to determine which insult was more effective in producing a consistent degree of survivable neuropathological damage in a newborn piglet model of perinatal asphyxia. We also examined which physiological responses contributed to this outcome. Thirty-nine 1-day-old piglets were subjected to either a constant hypoxic/ischemic insult of 30- to 37-min duration or a variable hypoxic/ischemic insult of 30 min of low peak-amplitude EEG (LAEEG < 5 μV) including 10 min of low mean arterial blood pressure (MABP < 70% of baseline). Control animals (n = 6) received 21% FiO₂ for the duration of the experiment. At 72 h, the piglets were euthanased, and their brains were removed, fixed in 4% paraformaldehyde and assessed for hypoxic/ischemic injury by histological analysis. Based on neuropathology scores, piglets were grouped as undamaged or damaged; piglets that did not survive to 72 h were grouped separately as dead. The variable insult resulted in a greater number of piglets with neuropathological damage (undamaged = 12.5%, damaged = 68.75%, dead = 18.75%), while the constant insult resulted in a large proportion of undamaged piglets (undamaged = 50%, damaged = 22.2%, dead = 27.8%). A hypoxic insult varied to maintain peak-amplitude EEG < 5 μV thus results in a greater number of survivors with a consistent degree of neuropathological damage than a constant hypoxic insult. The physiological variables MABP, LAEEG, pH and arterial base excess were found to be significantly associated with neuropathological outcome. (c) 2006 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

Niche apportionment models have only been applied once before to parasite communities. Only the random assortment model (RA), which indicates that species abundances are independent of each other and that interspecific competition is unimportant, provided a good fit, to 3 out of 6 parasite communities investigated. The generality of this result needs to be validated, however. In this study we apply 5 niche apportionment models to the parasite communities of 14 fish species from the Great Barrier Reef. We determined which model fitted the data when using either numerical abundance or biomass as an estimate of parasite abundance, and whether the fit of niche apportionment models depends on how the parasite community is defined (e.g. ectoparasites, endoparasites, or all parasites considered together). The RA model provided a good fit for the whole community of parasites in 7 fish species when using biovolume (as a surrogate for biomass) as the measure of species abundance. The RA model also fitted the observed data when ecto- and endoparasites were considered separately, using abundance or biovolume, but less frequently. Variation in fish sizes among species was not associated with the probability of a model fitting the data. Total numerical abundance and biovolume of parasites were not related across host species, suggesting that they capture different aspects of abundance. Biovolume is not only the better measurement to use with niche-orientated models; it should also be the preferred descriptor for analysing parasite community structure in other contexts. Most of the biological assumptions behind the RA model, i.e. randomness in apportioning niche space, lack of interspecific competition, independence of abundance among different species, and species with variable niches in changeable environments, are in accordance with previous findings on parasite communities. Thus, parasite communities may generally be unsaturated with species, with empty niches, and interspecific interactions may generally be unimportant in determining parasite community structure.
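The fitting logic behind such tests can be sketched as a Monte Carlo comparison: simulate ranked abundances under the model's core assumption and check observed ranks against them. The sketch below encodes only the independence assumption the abstract attributes to the RA model, not Tokeshi's exact algorithm; the function names, uniform draws, and tolerance are all illustrative choices.

```python
import random

def simulate_ra_ranks(n_species, reps=1000, seed=42):
    """Mean ranked relative abundances when each species' abundance is
    drawn independently of the others -- the independence assumption
    behind the RA model (a simplified stand-in, not Tokeshi's exact
    apportionment scheme)."""
    rng = random.Random(seed)
    totals = [0.0] * n_species
    for _ in range(reps):
        abund = [rng.random() for _ in range(n_species)]
        s = sum(abund)
        # accumulate relative abundances by rank, largest first
        for i, a in enumerate(sorted(abund, reverse=True)):
            totals[i] += a / s
    return [t / reps for t in totals]

def fits(observed, expected, tol=0.05):
    """Crude goodness-of-fit: mean absolute deviation between observed
    and expected ranked relative abundances below a tolerance."""
    s = sum(observed)
    obs_rel = sorted((o / s for o in observed), reverse=True)
    mad = sum(abs(o - e) for o, e in zip(obs_rel, expected)) / len(expected)
    return mad < tol
```

Swapping numerical counts for biovolume in `observed` changes only the abundance measure, not the fitting machinery, which is why the two can give different verdicts as the abstract reports.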

Relevance: 30.00%

Publisher:

Abstract:

Through a prospective study of 70 youths staying at homeless-youth shelters, the authors tested the utility of I. Ajzen's (1991) theory of planned behavior (TPB), by comparing the constructs of self-efficacy with perceived behavioral control (PBC), in predicting people's rule-following behavior during shelter stays. They performed the 1st wave of data collection through a questionnaire assessing the standard TPB components of attitudes, subjective norms, PBC, and behavioral intentions in relation to following the set rules at youth shelters. Further, they distinguished between items assessing PBC (or perceived control) and those reflecting self-efficacy (or perceived difficulty). At the completion of each youth's stay at the shelter, shelter staff rated the rule adherence for that participant. Regression analyses revealed some support for the TPB in that subjective norm was a significant predictor of intentions. However, self-efficacy emerged as the strongest predictor of intentions and was the only significant predictor of rule-following behavior. Thus, the results of the present study indicate the possibility that self-efficacy is integral to predicting rule adherence within this context and reaffirm the importance of incorporating notions of people's perceived ease or difficulty in performing actions in models of attitude-behavior prediction.

Relevance: 30.00%

Publisher:

Abstract:

Traditional vegetation mapping methods use high-cost, labour-intensive aerial photography interpretation. This approach can be subjective and is limited by factors such as the extent of remnant vegetation and the differing scale and quality of aerial photography over time. An alternative approach is proposed which integrates a data model, a statistical model and an ecological model, using sophisticated Geographic Information Systems (GIS) techniques and rule-based systems, to support fine-scale vegetation community modelling. This approach is based on a more realistic representation of vegetation patterns, with transitional gradients from one vegetation community to another, whereas the application of standard statistical methods can impose arbitrary, and often unrealistic, sharp boundaries on the model. This GIS-integrated multivariate approach is applied to the problem of vegetation mapping in the complex vegetation communities of the Innisfail Lowlands in the Wet Tropics bioregion of Northeastern Australia. The paper presents the full cycle of this vegetation modelling approach, including site sampling, variable selection, model selection, model implementation, internal model assessment, model prediction assessment, integration of discrete vegetation community models to generate a composite pre-clearing vegetation map, model validation against an independent data set, and assessment of the scale of model predictions. An accurate pre-clearing vegetation map of the Innisfail Lowlands was generated (r² = 0.83) through GIS integration of 28 separate statistical models. This modelling approach has good potential for wider application, including provision of vital information for conservation planning and management; a scientific basis for rehabilitation of disturbed and cleared areas; and a viable method for producing adequate vegetation maps of poorly-studied areas for conservation and forestry planning. (c) 2006 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

Pac-Man is a well-known, real-time computer game that provides an interesting platform for research. We describe an initial approach to developing an artificial agent that replaces the human player in a simplified version of Pac-Man. The agent is specified as a simple finite state machine and rule set, with parameters that control the probability of movement by the agent given the constraints of the maze at some instant of time. In contrast to previous approaches, the agent represents a dynamic strategy for playing Pac-Man rather than a pre-programmed maze-solving method. The agent adaptively "learns" through the application of population-based incremental learning (PBIL) to adjust its parameters. Experimental results are presented that give insight into some of the complexities of the game, as well as highlighting the limitations and difficulties of this representation of the agent.
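The PBIL update at the core of the abstract's learning loop can be sketched in a few lines. Here OneMax (counting 1-bits) stands in for the game score a Pac-Man agent would actually earn, and the population size, learning rate, and generation count are illustrative choices, not the paper's settings.

```python
import random

def pbil(fitness, n_bits, pop_size=20, lr=0.1, generations=100, seed=0):
    """Population-based incremental learning: keep a probability vector
    p over bit positions, sample a population from it each generation,
    and nudge p toward that generation's best individual."""
    rng = random.Random(seed)
    p = [0.5] * n_bits
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        pop = [[1 if rng.random() < pi else 0 for pi in p]
               for _ in range(pop_size)]
        elite = max(pop, key=fitness)
        if fitness(elite) > best_fit:
            best, best_fit = elite, fitness(elite)
        # shift the probability vector toward the elite individual
        p = [(1 - lr) * pi + lr * bi for pi, bi in zip(p, elite)]
    return best, best_fit

# OneMax stands in for the agent's game score; in the paper the bits
# would encode the rule-set parameters of the Pac-Man agent.
champion, score = pbil(sum, n_bits=16)
```

Because PBIL evaluates only the sampled population and keeps no explicit recombination, it suits noisy, expensive fitness functions like a full game play-through.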