845 results for Poisson Representation
Abstract:
Graduate Program in Production Engineering - FEB
Abstract:
Graduate Program in Letters - IBILCE
Abstract:
With the “social turn” of language in the past decade within English studies, ethnographic and teacher research methods increasingly have acquired legitimacy as a means of studying student literacy. And with this legitimacy, graduate students specializing in literacy and composition studies increasingly are being encouraged to use ethnographic and teacher research methods to study student literacy within classrooms. Yet few of the narratives produced from these studies discuss the problems that frequently arise when participant observers enter the classroom. Recently, some researchers have begun to interrogate the extent to which ethnographic and teacher research methods are able to construct and disseminate knowledge in empowering ways (Anderson & Irvine, 1993; Bishop, 1993; Fine, 1994; Fleischer, 1994; McLaren, 1992). While ethnographic and teacher research methods have oftentimes been touted as being more democratic and nonhierarchical than quantitative methods (which oftentimes erase individuals' lived experiences with numbers and statistical formulas), researchers are just beginning to probe the ways that ethnographic and teacher research models can also be silencing, unreflective, and oppressive. Those who have begun to question the ethics of conducting, writing about, and disseminating knowledge in education have coined the term “critical” research, a rather vague and loose term that proposes a position of reflexivity and self-critique for all research methods, not just ethnography or teacher research. Drawing upon theories of feminist consciousness-raising, liberatory praxis, and community-action research, theories of critical research aim to involve researchers and participants in a highly participatory framework for constructing knowledge, an inquiry that seeks to question, disrupt, or intervene in the conditions under study for some socially transformative end. While critical research methods are always contingent upon the context being studied, in general they are undergirded by principles of non-hierarchical relations, participatory collaboration, problem-posing, dialogic inquiry, and multiple and multi-voiced interpretations. In distinguishing between critical and traditional ethnographic processes, for instance, Peter McLaren says that critical ethnography asks questions such as “[u]nder what conditions and to what ends do we, as educational researchers, enter into relations of cooperation, mutuality, and reciprocity with those who we research?” (p. 78) and “what social effects do you want your evaluations and understandings to have?” (p. 83). In the same vein, Michelle Fine suggests that critical researchers must move beyond notions of the etic/emic dichotomy of researcher positionality in order to “probe how we are in relation with the contexts we study and with our informants, understanding that we are all multiple in those relations” (p. 72). Researchers in composition and literacy studies who endorse critical research methods, then, aim to enact some sort of positive transformative change in keeping with the needs and interests of the participants with whom they work.
Abstract:
We examine Weddell Sea deep water mass distributions with respect to the results from three different model runs using the oceanic component of the National Center for Atmospheric Research Community Climate System Model (NCAR-CCSM). One run is inter-annually forced by corrected NCAR/NCEP fluxes, while the other two are forced with the annual cycle obtained from the same climatology. One of the latter runs includes an interactive sea-ice model. Optimum Multiparameter analysis is applied to separate the deep water masses in the Greenwich Meridian section (into the Weddell Sea only) to measure the degree of realism obtained in the simulations. First, we describe the distribution of the simulated deep water masses using observed water type indices. Since the observed indices do not provide an acceptable representation of the Weddell Sea deep water masses as expected, they are specifically adjusted for each simulation. Differences among the water masses' representations in the three simulations are quantified through their root-mean-square differences. Results point out the need for better representation (and inclusion) of ice-related processes in order to improve the oceanic characteristics and variability of dense Southern Ocean water masses in the outputs of the NCAR-CCSM model, and probably in other ocean and climate models.
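The abstract does not spell out the Optimum Multiparameter (OMP) method. As a rough, hedged illustration of the idea only, the Python sketch below decomposes a single hypothetical sample into fractions of predefined water types by bounded least squares with a weighted mass-conservation row; the water-type matrix, sample values, weights, and the use of scipy.optimize.lsq_linear are all illustrative assumptions, not the configuration used in the study.

```python
# Minimal OMP-style water-mass decomposition sketch (illustrative only).
# Columns of A are water-type indices for hypothetical source water masses;
# b is one observed sample. Values are invented, not Weddell Sea observations.
import numpy as np
from scipy.optimize import lsq_linear

A = np.array([
    [-0.7,   0.3,   0.9],    # potential temperature (deg C)
    [34.65, 34.68, 34.70],   # salinity
    [5.2,   4.6,   4.3],     # dissolved oxygen (ml/l)
])
b = np.array([0.0, 34.67, 4.8])  # one hypothetical observation

# Normalize each row so the parameters contribute comparably, then append a
# heavily weighted row enforcing that the mixing fractions sum to one.
scale = np.abs(A).max(axis=1, keepdims=True)
A_n, b_n = A / scale, b / scale.ravel()
w = 100.0  # weight for the mass-conservation constraint
A_c = np.vstack([A_n, w * np.ones(A.shape[1])])
b_c = np.append(b_n, w * 1.0)

res = lsq_linear(A_c, b_c, bounds=(0.0, 1.0))
print("water-mass fractions:", np.round(res.x, 3))
```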
Abstract:
The Conway-Maxwell-Poisson (COMP) distribution, an extension of the Poisson distribution, is a popular model for analyzing count data. For the first time, we introduce a new three-parameter distribution, called the exponential-Conway-Maxwell-Poisson (ECOMP) distribution, which contains as sub-models the exponential-geometric and exponential-Poisson distributions proposed by Adamidis and Loukas (Stat Probab Lett 39:35-42, 1998) and Kuş (Comput Stat Data Anal 51:4497-4509, 2007), respectively. The new density function can be expressed as a mixture of exponential density functions. Expansions for the moments, the moment generating function and some statistical measures are provided. The density function of the order statistics can also be expressed as a mixture of exponential densities, and we derive two formulae for the moments of order statistics. The elements of the observed information matrix are provided. Two applications illustrate the usefulness of the new distribution for analyzing positive data.
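The abstract does not reproduce the ECOMP density. The two sub-models it cites (exponential-geometric and exponential-Poisson) both arise by taking the minimum of N i.i.d. exponential lifetimes with N following the compounding count distribution truncated at zero, so a plausible sketch of the construction, with notation and parameterization that may differ from the paper's, is:

```latex
S(t) = \Pr(T > t) = \sum_{n \ge 1} \Pr(N = n)\, e^{-n\beta t},
\qquad
\Pr(N = n) = \frac{1}{Z(\lambda,\nu) - 1}\,\frac{\lambda^{n}}{(n!)^{\nu}},
\qquad
Z(\lambda,\nu) = \sum_{j \ge 0} \frac{\lambda^{j}}{(j!)^{\nu}},
```

where N is a zero-truncated COMP(λ, ν) count, β > 0 is the exponential rate, and T = min(X_1, ..., X_N). Differentiating term by term gives the density as a mixture of Exp(nβ) densities with weights Pr(N = n), consistent with the mixture representation stated in the abstract.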
Abstract:
The purpose of this paper is to develop a Bayesian analysis for right-censored survival data when immune or cured individuals may be present in the population from which the data are taken. In our approach, the number of competing causes of the event of interest follows the Conway-Maxwell-Poisson distribution, which generalizes the Poisson distribution. Markov chain Monte Carlo (MCMC) methods are used to develop a Bayesian procedure for the proposed model. Model selection is also discussed, and the methodology is illustrated with a real data set.
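In the standard competing-causes (promotion-time type) formulation, which the abstract appears to adopt but does not write out, each of the N latent causes carries a latent time with common survival S(t) and the observed time is the minimum; the population survival is then the probability generating function of N evaluated at S(t). Under that assumption, with N Conway-Maxwell-Poisson (notation here is illustrative):

```latex
S_{\mathrm{pop}}(t) = \mathbb{E}\bigl[S(t)^{N}\bigr]
  = \frac{1}{Z(\lambda,\nu)} \sum_{n \ge 0} \frac{\bigl(\lambda S(t)\bigr)^{n}}{(n!)^{\nu}}
  = \frac{Z\bigl(\lambda S(t), \nu\bigr)}{Z(\lambda,\nu)},
\qquad
\lim_{t \to \infty} S_{\mathrm{pop}}(t) = \frac{1}{Z(\lambda,\nu)},
```

so the cured (immune) fraction is Pr(N = 0) = 1/Z(λ, ν), and the usual Poisson cure-rate model is recovered at ν = 1.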
Abstract:
We use computer algebra to study polynomial identities for the trilinear operation [a, b, c] = abc - acb - bac + bca + cab - cba in the free associative algebra. It is known that [a, b, c] satisfies the alternating property in degree 3, no new identities in degree 5, a multilinear identity in degree 7 which alternates in 6 arguments, and no new identities in degree 9. We use the representation theory of the symmetric group to demonstrate the existence of new identities in degree 11. The only irreducible representations of dimension < 400 with new identities correspond to the partitions 2^5 1 and 2^4 1^3 and have dimensions 132 and 165. We construct an explicit new multilinear identity for the partition 2^5 1, and we demonstrate the existence of a new non-multilinear identity in which the underlying variables are permutations of a^2 b^2 c^2 d^2 e^2 f.
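As a small computational companion (not part of the cited work), the Python sketch below builds the trilinear operation with sympy's noncommutative symbols, standing in for the free associative algebra, and verifies the degree-3 alternating property mentioned above; the degree-11 computations require the representation-theoretic machinery described in the abstract and are not attempted here.

```python
# Verify that [a,b,c] = abc - acb - bac + bca + cab - cba is alternating in degree 3,
# using noncommutative symbols as a stand-in for the free associative algebra.
from sympy import symbols, expand

a, b, c = symbols('a b c', commutative=False)

def tri(x, y, z):
    """The trilinear operation from the abstract."""
    return x*y*z - x*z*y - y*x*z + y*z*x + z*x*y - z*y*x

# It vanishes when two arguments coincide ...
assert expand(tri(a, a, c)) == 0
assert expand(tri(a, b, a)) == 0
assert expand(tri(a, b, b)) == 0
# ... and changes sign under transpositions of arguments.
assert expand(tri(a, b, c) + tri(b, a, c)) == 0
assert expand(tri(a, b, c) + tri(a, c, b)) == 0
print("alternating property in degree 3 verified")
```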
Abstract:
ACID-BASE REACTIONS: CONCEPT, REPRESENTATION AND GENERALIZATION FROM THE ENERGY INVOLVED IN TRANSFORMATIONS. Undergraduate students in the first year of chemistry courses are unfamiliar with the representation of acid-base reactions by the ionic equation H+ + OH- -> H2O. A chemistry class about acid-base reactions was proposed, using theory and an experimental evaluation of the heat of neutralization to discuss the energy involved when water is formed from H+ and OH- ions. The experiment is suggested using different strong acid and strong base pairs. The presentation of the theme in a chemistry class for high school teachers increased the number of individuals who viewed the acid-base reaction from this perspective.
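For reference, the ionic representation discussed above, together with the roughly constant heat of neutralization measured for strong acid-strong base pairs (a typical textbook value near room temperature, not a figure taken from the paper), is:

```latex
\mathrm{H^{+}(aq) + OH^{-}(aq) \longrightarrow H_{2}O(l)},
\qquad
\Delta H_{\mathrm{neut}} \approx -57\ \mathrm{kJ}\ \text{per mole of water formed},
```

and because strong acids and strong bases are essentially fully dissociated, different pairs give nearly the same value, which is the generalization the proposed class aims to make visible.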
Abstract:
Background: In the analysis of the effects of cell treatments such as drug dosing, identifying changes in gene network structure between normal and treated cells is a key task. A possible way of identifying the changes is to compare the structures of networks estimated separately from data on normal and treated cells. However, this approach usually fails to estimate accurate gene networks because of the limited length of the time series data and measurement noise. Thus, approaches that identify changes in regulations efficiently by using time series data from both conditions are needed. Methods: We propose a new statistical approach that is based on the state space representation of the vector autoregressive model and estimates gene networks under two different conditions in order to identify changes in regulations between the conditions. In the mathematical model of our approach, hidden binary variables are newly introduced to indicate the presence of regulations under each condition. The use of the hidden binary variables enables efficient data usage: data from both conditions are used for commonly existing regulations, while only the corresponding data are applied for condition-specific regulations. Also, the similarity of the networks under the two conditions is automatically taken into account through the design of the potential function for the hidden binary variables. For the estimation of the hidden binary variables, we derive a new variational annealing method that searches for the configuration of the binary variables maximizing the marginal likelihood. Results: For the performance evaluation, we use time series data from two topologically similar synthetic networks and confirm that our proposed approach estimates commonly existing regulations, as well as changes in regulations, with higher coverage and precision than other existing approaches in almost all the experimental settings. For a real data application, the proposed approach is applied to time series data from normal human lung cells and human lung cells treated by stimulating EGF receptors and dosing the anticancer drug Gefitinib. In the treated lung cells, a cancer cell condition is simulated by the stimulation of EGF receptors, but the effect would be counteracted by the selective inhibition of EGF receptors by Gefitinib. However, the gene expression profiles actually differ between the conditions, and the genes related to the identified changes are considered possible off-targets of Gefitinib. Conclusions: From the synthetically generated time series data, our proposed approach can identify changes in regulations more accurately than existing methods. By applying the proposed approach to the time series data on normal and treated human lung cells, candidate off-target genes of Gefitinib are found. According to published clinical information, one of these genes may be related to a factor in interstitial pneumonia, which is known as a side effect of Gefitinib.
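The state-space model with hidden binary regulation indicators and variational annealing is specific to the paper and is not reproduced here. As a much simpler, hedged illustration of the underlying task (comparing regulatory structure between two conditions), the Python sketch below fits a plain VAR(1) model to synthetic time series from each condition by ridge least squares and flags coefficients that differ; all names, parameter values and thresholds are illustrative.

```python
# Simplified baseline for spotting changed regulations between two conditions:
# fit a VAR(1) model to each condition's time series and compare the coefficient
# matrices. This is NOT the state-space / variational annealing method of the
# abstract; it only illustrates the comparison task.
import numpy as np

rng = np.random.default_rng(0)

def fit_var1(X, lam=1e-2):
    """X: (T, p) time series; returns the (p, p) matrix A in x_t ~ A x_{t-1}."""
    past, future = X[:-1], X[1:]
    p = X.shape[1]
    # Ridge least squares: A^T = (past^T past + lam*I)^{-1} past^T future
    A_T = np.linalg.solve(past.T @ past + lam * np.eye(p), past.T @ future)
    return A_T.T

def simulate(A, T=200, noise=0.1):
    """Simulate x_t = A x_{t-1} + Gaussian noise."""
    X = np.zeros((T, A.shape[0]))
    for t in range(1, T):
        X[t] = A @ X[t - 1] + noise * rng.standard_normal(A.shape[0])
    return X

# Two synthetic 5-gene networks sharing most regulations but differing in two edges.
A_normal = np.diag([0.5] * 5)
A_normal[1, 0] = 0.8
A_normal[3, 2] = -0.6
A_treated = A_normal.copy()
A_treated[3, 2] = 0.0   # a regulation lost after treatment
A_treated[4, 1] = 0.7   # a regulation gained after treatment

A_hat_normal = fit_var1(simulate(A_normal))
A_hat_treated = fit_var1(simulate(A_treated))

changed = np.argwhere(np.abs(A_hat_normal - A_hat_treated) > 0.3)
print("flagged (target, regulator) pairs:", [tuple(map(int, e)) for e in changed])
```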
Abstract:
In this paper, a new family of survival distributions is presented. It is derived by assuming that the latent number of failure causes follows a Poisson distribution and that the time for these causes to be activated follows an exponential distribution. Three different activation schemes are also considered. Moreover, we propose the inclusion of covariates in the model formulation in order to study their effect on the expected number of causes and on the failure rate function. An inferential procedure based on the maximum likelihood method is discussed and evaluated via simulation. The developed methodology is illustrated on a real data set on ovarian cancer.
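For concreteness, under the first-activation scheme (the event occurs as soon as the first latent cause is activated), a Poisson(θ) number of causes with independent Exp(λ) activation times gives the familiar promotion-time form below; this is a standard derivation under those assumptions, with notation that need not match the paper's, and the last- and random-activation schemes lead to different expressions:

```latex
S_{\mathrm{pop}}(t) = \Pr(T > t)
  = \sum_{m \ge 0} \frac{e^{-\theta} \theta^{m}}{m!} \bigl(e^{-\lambda t}\bigr)^{m}
  = \exp\!\bigl\{-\theta \bigl(1 - e^{-\lambda t}\bigr)\bigr\},
\qquad
\lim_{t \to \infty} S_{\mathrm{pop}}(t) = e^{-\theta},
```

with covariates typically entering through θ (and possibly λ), which is one way the effects on the expected number of causes and on the failure rate described in the abstract can be studied.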
Abstract:
Different representations of a control surface freeplay nonlinearity in a three-degree-of-freedom aeroelastic system are assessed: the discontinuous, polynomial and hyperbolic tangent representations. The Duhamel formulation is used to model the aerodynamic loads. The validity of these representations is assessed through comparison with previous experimental observations. The results show that the instability and nonlinear response characteristics are accurately predicted when the discontinuous and hyperbolic tangent representations are used. On the other hand, the polynomial representation fails to predict the chaotic motions observed in the experiments.
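The exact expressions used in the study are not given in the abstract. The Python sketch below compares one common smooth hyperbolic-tangent approximation of a freeplay (dead-zone) restoring moment and a plain least-squares polynomial fit against the exact piecewise representation; the parameter values and the particular smoothing and fit are illustrative choices, not those of the cited work.

```python
# Three ways to represent a control-surface freeplay (dead-zone) restoring moment,
# compared on a grid of deflection angles.
import numpy as np

k, delta = 1.0, 0.05              # stiffness outside the gap, half-width of the gap (rad)
alpha = np.linspace(-0.2, 0.2, 401)

def freeplay_discontinuous(a):
    """Exact piecewise (discontinuous-slope) representation."""
    return np.where(np.abs(a) <= delta, 0.0, k * (a - delta * np.sign(a)))

def freeplay_tanh(a, eps=200.0):
    """Smooth hyperbolic-tangent approximation of the dead zone."""
    return 0.5 * k * ((a - delta) * (1 + np.tanh(eps * (a - delta)))
                      + (a + delta) * (1 - np.tanh(eps * (a + delta))))

# Polynomial representation: degree-7 least-squares fit to the exact curve.
coeffs = np.polyfit(alpha, freeplay_discontinuous(alpha), deg=7)
freeplay_poly = np.poly1d(coeffs)

m_exact = freeplay_discontinuous(alpha)
print("max |tanh - exact|:", np.max(np.abs(freeplay_tanh(alpha) - m_exact)))
print("max |poly - exact|:", np.max(np.abs(freeplay_poly(alpha) - m_exact)))
```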
Abstract:
Let I ⊂ ℝ^d be a set of centers chosen according to a Poisson point process in ℝ^d. Let ψ be an allocation of ℝ^d to I in the sense of the Gale-Shapley marriage problem, with the additional feature that every center ξ ∈ I has an appetite given by a nonnegative random variable α. Generalizing some previous results, we study large deviations for the distance of a typical point x ∈ ℝ^d to its center ψ(x) ∈ I, subject to some restrictions on the moments of α.
Abstract:
Objective: To identify and compare perceptions of pain and the ways of coping with it between men and women with central post-stroke pain. Methods: The participants were 25 men and 25 women, aged at least 30 years, with at least four years of schooling, presenting central post-stroke pain for at least three months. The instruments used were: Mini-Mental State Examination; structured interview for the Brief Psychiatric Scale; Survey of Sociodemographic and Clinical Data; Visual Analogue Scale (VAS); Ways of Coping with Problems Scale (WCPS); Revised Illness Perception Questionnaire (IPQ-R); and Beck Depression Inventory (BDI). Results: A significantly greater number of women used the coping strategy "Turn to spiritual and religious activities" in the WCPS, and women associated their emotional state with the cause of pain in the IPQ-R. "Distraction of attention" was the strategy most used by the subjects overall. Conclusion: Women made greater use of spiritual and religious activities as a coping strategy and more often perceived their emotional state as the cause of their pain.
Abstract:
In this paper, we propose a random intercept Poisson model in which the random effect is assumed to follow a generalized log-gamma (GLG) distribution. This random effect accommodates (or captures) the overdispersion in the counts and induces within-cluster correlation. We derive the first two moments for the marginal distribution as well as the intraclass correlation. Even though numerical integration methods are, in general, required for deriving the marginal models, we obtain the multivariate negative binomial model from a particular parameter setting of the hierarchical model. An iterative process is derived for obtaining the maximum likelihood estimates for the parameters in the multivariate negative binomial model. Residual analysis is proposed and two applications with real data are given for illustration.
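A hedged sketch of the hierarchy described above, with notation and parameterization that are illustrative rather than the paper's: counts within cluster i are conditionally Poisson given a cluster-level GLG random effect b_i, and for the particular setting in which u_i = exp(b_i) is gamma distributed with unit mean the cluster likelihood integrates in closed form:

```latex
Y_{ij} \mid b_i \sim \mathrm{Poisson}\bigl(\mu_{ij} e^{b_i}\bigr),
\qquad
\Pr(Y_{i1}=y_{1},\dots,Y_{in_i}=y_{n_i})
  = \int_{0}^{\infty} \prod_{j=1}^{n_i}
      \frac{(\mu_{ij} u)^{y_j} e^{-\mu_{ij} u}}{y_j!}\;
      \frac{\phi^{\phi} u^{\phi-1} e^{-\phi u}}{\Gamma(\phi)}\, du,
```

which yields the multivariate negative binomial model mentioned in the abstract, with within-cluster correlation induced by the shared u_i.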
Abstract:
We prove that any two Poisson dependent elements in a free Poisson algebra and a free Poisson field of characteristic zero are algebraically dependent, thus answering positively a question from Makar-Limanov and Umirbaev (2007) [8]. We apply this result to give a new proof of the tameness of automorphisms for free Poisson algebras of rank two (see Makar-Limanov and Umirbaev (2011) [9], Makar-Limanov et al. (2009) [10]).