234 results for Hidden variable theory
Abstract:
The relation between the low-energy constants appearing in the effective field theory description of the Lambda N -> NN transition potential and the parameters of the previously developed one-meson-exchange model is obtained. We extract the relative importance of the different exchange mechanisms included in the meson picture by comparing them to the corresponding operator structures appearing in the effective approach. The ability of this procedure to determine the weak baryon-baryon-meson couplings for a possible scalar exchange is also discussed.
Abstract:
Helping behavior is any intentional behavior that benefits another living being or group (Hogg & Vaughan, 2010). People tend to underestimate the probability that others will comply with their direct requests for help (Flynn & Lake, 2008). This implies that when people need help, they assess the probability of getting it (DePaulo, 1982, cited in Flynn & Lake, 2008), tend to estimate one lower than the real chance, and so may not even consider it worth asking. Existing explanations attribute this phenomenon to a mistaken cost computation by the help seeker, who emphasizes the instrumental cost of saying "yes" while ignoring that the potential helper must also take into account the social cost of saying "no". Especially in face-to-face interactions, the discomfort caused by refusing to help can be very high. In short, help seekers tend to fail to realize that refusing to comply with a help request might be more costly than accepting it. A similar effect has been observed in estimates of trustworthiness: Fetchenhauer and Dunning (2010) showed that people tend to underestimate it as well. This bias is reduced when, instead of asymmetric feedback (feedback only when one decides to trust the other person), symmetric feedback (always given) is provided. The same account could apply to help seeking, since people receive feedback only when they actually make a request. Fazio, Shook, and Eiser (2004) studied a mechanism that could reinforce these outcomes: learning asymmetries. By means of a computer game called BeanFest, they showed that people learn better about negatively valenced objects (beans, in this case) than about positively valenced ones. This learning asymmetry stemmed from "information gain being contingent on approach behavior" (p. 293), which can be identified with what Fetchenhauer and Dunning call asymmetric feedback, and hence also with help requests. Fazio et al. also found a generalization asymmetry favoring negative attitudes over positive ones, which they attributed to a negativity bias that "weights resemblance to a known negative more heavily than resemblance to a positive" (p. 300). Applied to help-seeking scenarios, this means that when facing an unknown situation, people tend to generalize and infer that a negative outcome is more likely than a positive one; together with the above, they will thus be more inclined to expect a "no" when requesting help. Denrell and Le Mens (2011) offer a different perspective on judgment biases in general. Departing from the classical inappropriate-information-processing accounts (described, among others, by Fiske & Taylor, 2007, and Tversky & Kahneman, 1974), they explain such biases in terms of adaptive sampling. Adaptive sampling is a sampling mechanism in which the selection of sample items is conditioned by the previously observed values of the variable of interest (Thompson, 2011). Sampling adaptively allows individuals to protect themselves from experiences that once yielded negative outcomes. However, it also prevents them from giving those experiences a second chance that could produce an updated outcome: a positive one, a more positive one, or simply one that regresses to the mean. As Denrell and Le Mens (2011) explain, this makes sense: if you go to a restaurant and do not like the food, you do not choose that restaurant again. This is what we think could be happening when asking for help: when we get a "no", we stop asking.
Here we provide a complementary explanation for the underestimation of the probability that others comply with our direct help requests, based on adaptive sampling. We first develop and explain a model that represents the theory; we then test it empirically by means of experiments and elaborate on the analysis of the results.
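The adaptive-sampling account lends itself to a toy simulation. The sketch below is our illustration, not the abstract's model: the threshold rule, the yes/no counters, and all parameter values are assumptions. It shows how asking only when the estimated compliance probability seems high enough, combined with feedback that arrives only after actual requests, drives the population-average estimate below the true compliance rate.

```python
import random

def simulate_agent(p_true=0.7, n_rounds=200, threshold=0.5, seed=None):
    """One help seeker who asks only when their estimate exceeds a threshold.

    Feedback is asymmetric: the estimate is updated only after an actual request.
    """
    rng = random.Random(seed)
    yes, no = 1, 1                         # uninformative starting counts
    for _ in range(n_rounds):
        estimate = yes / (yes + no)
        if estimate >= threshold:          # adaptive sampling: ask only if it seems worthwhile
            if rng.random() < p_true:
                yes += 1                   # a "yes" raises the estimate -> keep asking
            else:
                no += 1                    # a "no" lowers it, possibly below threshold -> stop sampling
    return yes / (yes + no)

estimates = [simulate_agent(seed=s) for s in range(2000)]
mean_estimate = sum(estimates) / len(estimates)
print(mean_estimate)   # on average below the true rate of 0.7: early refusals end sampling
```

Agents who happen to get refused early fall below the threshold, stop asking, and never update again, so their pessimistic estimate is frozen; this is the "restaurant" dynamic from the text in miniature.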
Abstract:
This paper develops an approach to rank testing that nests all existing rank tests and simplifies their asymptotics. The approach is based on the fact that implicit in every rank test there are estimators of the null spaces of the matrix in question. It yields many new insights about the behavior of rank-test statistics under the null as well as under local and global alternatives, in both the standard and the cointegration settings. It also suggests many new rank tests based on alternative estimates of the null spaces, as well as the new fixed-b theory. A brief Monte Carlo study illustrates the results.
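As a numerical illustration of the null-space idea (not the paper's estimator or limit theory; the matrix sizes, noise scale, and statistic below are our assumptions), the smallest singular vectors of an estimated matrix furnish estimates of its left and right null spaces, and the restriction of the matrix to those spaces collapses to estimation noise under the null hypothesis on the rank:

```python
import numpy as np

rng = np.random.default_rng(0)

# True 4x4 matrix of rank 2, observed with O(1/sqrt(n)) estimation noise.
A = rng.normal(size=(4, 2)) @ rng.normal(size=(2, 4))
n = 10_000
A_hat = A + rng.normal(size=(4, 4)) / np.sqrt(n)

# The SVD gives orthonormal bases for the estimated left/right null spaces:
U, s, Vt = np.linalg.svd(A_hat)
r = 2                       # hypothesized rank under the null
U2 = U[:, r:]               # estimated left null space of A
V2 = Vt[r:, :].T            # estimated right null space of A

# Under the null, U2' A_hat V2 is pure estimation noise, so the scaled
# statistic n * ||U2' A_hat V2||_F^2 stays bounded; it diverges if rank(A) > r.
stat = n * np.linalg.norm(U2.T @ A_hat @ V2, "fro") ** 2
print(stat)
```

Since `U2.T @ A_hat @ V2` is a diagonal block of the smallest singular values, this statistic is just `n` times the sum of their squares, which is the common thread the abstract points to across existing rank tests.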
Abstract:
We review the progress in the field of front propagation in recent years. We survey many physical, biophysical and cross-disciplinary applications, including reduced-variable models of combustion flames, Reid's paradox of rapid forest range expansions, the European colonization of North America during the 19th century, the Neolithic transition in Europe from 13 000 to 5000 years ago, the description of subsistence boundaries, the formation of cultural boundaries, the spread of genetic mutations, theory and experiments on virus infections, models of cancer tumors, etc. Recent theoretical advances are unified in a single framework, encompassing very diverse systems such as those with biased random walks, distributed delays, sequential reaction and dispersion, cohabitation models, age structure and systems with several interacting species. Directions for future progress are outlined.
Abstract:
By means of computer simulations and solutions of the equations of mode coupling theory (MCT), we investigate the role of intramolecular barriers in several dynamic aspects of nonentangled polymers. The investigated dynamic range extends from the caging regime characteristic of glass-formers to the relaxation of the chain Rouse modes. We review our recent work on this question, provide new results, and critically discuss the limitations of the theory. Solutions of the MCT for the structural relaxation reproduce qualitative trends of the simulations for weak and moderate barriers. However, a progressive discrepancy is revealed as the limit of stiff chains is approached. This disagreement does not seem related to dynamic heterogeneities, which indeed are not enhanced by increasing the barrier strength. Nor is it connected with the breakdown of the convolution approximation for three-point static correlations, which retains its validity for stiff chains. These findings suggest the need to improve the MCT equations for polymer melts. Concerning the relaxation of the chain degrees of freedom, MCT provides a microscopic basis for time scales from chain reorientation down to the caging regime. It rationalizes, from first principles, the observed deviations from the Rouse model on increasing the barrier strength, including anomalous scaling of relaxation times, long-time plateaux, and a nonmonotonic wavelength dependence of the mode correlators.
Abstract:
The material presented in these notes covers the sessions Modelling of electromechanical systems, Passive control theory I and Passive control theory II of the II EURON/GEOPLEX Summer School on Modelling and Control of Complex Dynamical Systems. We start with a general description of what an electromechanical system is from a network-modelling point of view. Next, a general formulation in terms of port-Hamiltonian dynamical systems (PHDS) is introduced, and some of the previous electromechanical systems are rewritten in this formalism. Power converters, which are variable structure systems (VSS), can also be given a PHDS form. We conclude the modelling part of these lectures with a rather complex example showing the interconnection of subsystems from several domains, namely an arrangement to temporarily store the surplus energy in a section of a metropolitan transportation system based on dc motor vehicles, using either arrays of supercapacitors or an electrically powered flywheel. The second part of the lectures addresses control of PHD systems. We first present the idea of control as the power connection of a plant and a controller, discuss the obstacles this approach encounters and how to circumvent them, and present the basic ideas of Interconnection and Damping Assignment (IDA) passivity-based control of PHD systems.
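A port-Hamiltonian model can be sketched in a few lines. The example below is a generic mass-spring-damper with invented parameters, not a system from the notes: it integrates the standard form x_dot = (J - R) grad H(x) + g u and checks the passivity property that, with zero input, the stored energy H can only dissipate.

```python
import numpy as np

# Port-Hamiltonian form  x_dot = (J - R) dH/dx + g u,  illustrated on a
# mass-spring-damper with state x = [position q, momentum p].
# All parameter values are illustrative, not taken from the lecture notes.
m, k, d = 1.0, 4.0, 0.3

J = np.array([[0.0, 1.0], [-1.0, 0.0]])   # interconnection: skew-symmetric
R = np.array([[0.0, 0.0], [0.0, d]])      # dissipation: positive semi-definite
g = np.array([0.0, 1.0])                  # force input enters the momentum balance

def H(x):
    q, p = x
    return 0.5 * k * q**2 + 0.5 * p**2 / m     # stored energy

def grad_H(x):
    q, p = x
    return np.array([k * q, p / m])

x = np.array([1.0, 0.0])
dt, u = 1e-3, 0.0
energies = [H(x)]
for _ in range(5000):
    x = x + dt * ((J - R) @ grad_H(x) + g * u)   # explicit Euler step
    energies.append(H(x))

# With u = 0 the continuous-time power balance dH/dt = -dH' R dH <= 0
# makes the energy decay, which the trajectory reproduces:
print(energies[0], energies[-1])
```

The skew-symmetry of J and positive semi-definiteness of R are exactly what makes the energy a storage function, which is the structure that IDA passivity-based control reshapes.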
Abstract:
We present a new phenomenological approach to nucleation based on the combination of the extended modified liquid drop model and dynamical nucleation theory. The model introduces a new cluster definition, which properly includes the effect of fluctuations and is consistent both thermodynamically and kinetically. It successfully predicts the free energy of formation of the critical nucleus using only macroscopic thermodynamic properties. It also accounts for the spinodal and provides excellent agreement with the results of recent simulations.
Abstract:
We present a model in which particles (or individuals of a biological population) disperse with a rest time between consecutive motions (or migrations) that may take several possible values from a discrete set. Particles (or individuals) may also react (or reproduce). We derive a new equation for the effective rest time T̃ of the random walk. Application to the Neolithic transition in Europe makes it possible to derive more realistic theoretical values for its wavefront speed than those following from the single-delay framework presented previously [J. Fort and V. Méndez, Phys. Rev. Lett. 82, 867 (1999)]. The new results are consistent with the archaeological observations of this important historical process.
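For orientation, the single-delay front speed of the time-delayed framework, v = 2*sqrt(a*D)/(1 + a*T/2), can be evaluated with an effective rest time. In the sketch below the probability-weighted mean rest time is used merely as a stand-in for T̃ (the abstract's new equation for T̃ is not reproduced here), and all numerical values are illustrative rather than the paper's:

```python
import math

def front_speed(a, D, T):
    """Single-delay front speed v = 2*sqrt(a*D)/(1 + a*T/2)."""
    return 2.0 * math.sqrt(a * D) / (1.0 + a * T / 2.0)

# Several possible rest times with probabilities; as a crude stand-in for the
# effective rest time T_tilde we take the probability-weighted mean (an
# assumption for illustration, NOT the equation derived in the paper).
rest_times = [20.0, 25.0, 30.0]      # years (hypothetical values)
probs = [0.2, 0.3, 0.5]
T_eff = sum(p * T for p, T in zip(probs, rest_times))

a = 0.03      # population growth rate, 1/yr (illustrative)
D = 15.0      # diffusion coefficient, km^2/yr (illustrative)
v_single = front_speed(a, D, rest_times[1])   # single-delay value, T = 25 yr
v_eff = front_speed(a, D, T_eff)              # with the effective rest time
print(v_single, v_eff)
```

With these numbers the speed comes out of order 1 km/yr, and a longer effective rest time slows the front, which is the qualitative effect the delayed framework captures.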
Abstract:
Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of the assessed return periods and occurrence probabilities. This favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates can normally hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where recent years have been extremely disastrous, making it possible to compare a hazard assessment based on data prior to those years with an analysis that includes them. With our approach, no significant change is detected once the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. The occurrence of events in time is assumed to be Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean Sea and experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived quantities such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
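The return-period computation underlying such a Poisson-GPD model can be sketched with plug-in parameter values. The paper estimates the parameters jointly via Bayes' theorem with the BGPE program; the threshold, rate, and GPD shape/scale below are invented for illustration only.

```python
import math

def gpd_survival(excess, shape, scale):
    """P(X - u > excess | X > u) under a Generalized Pareto tail."""
    if shape == 0.0:
        return math.exp(-excess / scale)
    return max(0.0, 1.0 + shape * excess / scale) ** (-1.0 / shape)

def return_period(z, threshold, rate, shape, scale):
    """Mean time (years) between events with wave height exceeding z.

    rate: Poisson rate of threshold exceedances (events per year).
    """
    p_exceed = gpd_survival(z - threshold, shape, scale)
    return math.inf if p_exceed == 0.0 else 1.0 / (rate * p_exceed)

# Illustrative plug-in values, not the posterior estimates from the paper:
u, lam, xi, sigma = 3.0, 2.0, -0.1, 0.8   # threshold (m), events/yr, GPD shape, scale
print(return_period(5.0, u, lam, xi, sigma))
```

In the Bayesian version, this same map from (rate, shape, scale) to a return period is simply pushed through the posterior, which is how the posterior distributions of return periods mentioned above arise.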
Abstract:
We report on a study of nonequilibrium ordering in the reaction-diffusion lattice gas. It is a kinetic model that relaxes towards steady states under the simultaneous competition of a thermally activated creation-annihilation (reaction) process at temperature T and a diffusion process driven by a heat bath at a different temperature T′. We investigate the phase diagram as one varies T and T′, the system dimension d, the relative a priori probabilities of the two processes, and their dynamical rates. We compare mean-field theory, new Monte Carlo data, and known exact results for some limiting cases. In particular, for d = 2 with Metropolis rates we find numerically no evidence of Landau critical behavior, but rather Onsager critical points and a variety of first-order phase transitions.
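A minimal version of such competing dynamics can be simulated directly. The sketch below is our illustration, not the paper's setup: the lattice size, temperatures, attempt probability, and the spin-language formulation are all assumptions. It mixes Metropolis spin flips at a reaction temperature with nearest-neighbour exchanges (diffusion) driven by a second, hotter bath.

```python
import math
import random

rng = random.Random(1)
L = 16                          # lattice side; illustrative size
T_reac, T_diff = 2.0, 10.0      # reaction bath vs a hotter diffusion bath
p_reac = 0.5                    # a priori probability of attempting a reaction step

spins = [[rng.choice([-1, 1]) for _ in range(L)] for _ in range(L)]

def neighbours(i, j):
    return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

def local_field(i, j):
    return sum(spins[a][b] for a, b in neighbours(i, j))

def accept(dE, T):
    return dE <= 0 or rng.random() < math.exp(-dE / T)

def bond_energy(sites):
    # energy of all bonds touching the given sites (J = 1, attractive)
    return sum(-spins[i][j] * spins[a][b] for i, j in sites for a, b in neighbours(i, j))

for _ in range(50_000):
    i, j = rng.randrange(L), rng.randrange(L)
    if rng.random() < p_reac:
        # creation-annihilation ("reaction"): single flip, bath at T_reac
        dE = 2 * spins[i][j] * local_field(i, j)
        if accept(dE, T_reac):
            spins[i][j] = -spins[i][j]
    else:
        # diffusion: exchange with a random neighbour, driven by the second bath
        a, b = rng.choice(neighbours(i, j))
        if spins[i][j] != spins[a][b]:
            before = bond_energy([(i, j), (a, b)])
            spins[i][j], spins[a][b] = spins[a][b], spins[i][j]
            if not accept(bond_energy([(i, j), (a, b)]) - before, T_diff):
                spins[i][j], spins[a][b] = spins[a][b], spins[i][j]  # reject: undo swap

m = abs(sum(sum(row) for row in spins)) / L**2   # order parameter
print(m)
```

Because the two elementary moves obey detailed balance with respect to different temperatures, the combined chain has no Gibbsian stationary state, which is what makes the steady-state phase diagram a genuinely nonequilibrium question.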
Abstract:
Closing talk of the Open Access Week 2011 at the UOC, by Josep Jover. Why do altruistic strategies beat selfish ones in the spheres of both free software and the #15m movement? The #15m movement, like software but unlike tangible goods, cannot be owned. It can be used (by joining it) by an indeterminate number of people without depriving anyone else of the chance to do the same. And that turns everything on its head: how universities manage information and what their mission is in this new society. In the immediate future, universities will be valued not for the information they harbour, which will always be richer and more extensive beyond their walls, but rather for their capacity to create critical masses, whether of knowledge, research, skill-building, or networks of peers. Universities must implement the new model or risk becoming obsolete.
Abstract:
Web 2.0 services such as social bookmarking allow users to manage and share the links they find interesting, adding their own tags to describe them. This is especially interesting in the field of open educational resources, as delicious is a simple way to bridge the institutional point of view (i.e. learning object repositories) with the individual one (i.e. personal collections), thus promoting the discovery and sharing of such resources by other users. In this paper we propose a methodology for analyzing such tags in order to discover hidden semantics (i.e. taxonomies and vocabularies) that can be used to improve descriptions of learning objects and make learning object repositories more visible and discoverable. We propose the use of a simple statistical analysis tool, principal component analysis, to discover which tags create clusters that can be semantically interpreted. We will compare the obtained results with a collection of resources related to open educational resources, in order to better understand the real needs of people searching for such resources.
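The proposed analysis can be sketched on a toy resource-by-tag matrix (the tags and co-occurrence pattern below are invented; real delicious data would replace them). Principal component analysis via an SVD of the centred matrix surfaces groups of tags that load together on a component and can then be interpreted semantically:

```python
import numpy as np

# Toy resource-by-tag incidence matrix (rows: bookmarked resources, columns:
# tags). Both the tag names and the pattern are invented for illustration.
tags = ["oer", "repository", "metadata", "python", "programming", "tutorial"]
X = np.array([
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [1, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 0],
    [0, 0, 1, 1, 1, 1],
], dtype=float)

Xc = X - X.mean(axis=0)              # centre each tag column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                          # first principal axis in tag space

# Tags with large loadings of the same sign on a component tend to co-occur:
# these are the clusters one would try to interpret semantically.
order = np.argsort(pc1)
print([tags[i] for i in order])
```

Here the first component separates the repository-oriented tags from the programming-oriented ones; on real data one would inspect several leading components the same way.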
Abstract:
These notes correspond to the talks given at the \emph{First Integrability Seminar} (\emph{Primer Seminario de Integrabilidad}), part of the so-called \emph{Aula de Sistemas Din\'amicos}. Six lectures were given during this event, all by members of the Dynamical Systems group at the UPC. The programme was as follows:\\\begin{center}AULA DE SISTEMAS DIN\'AMICOS\end{center}\begin{center}\texttt{http://www.ma1.upc.es/recerca/seminaris/aulasd-cat.html}\end{center}\begin{center}INTEGRABILITY SEMINAR\end{center}\begin{center}Tuesday 29 and Wednesday 30 March 2005\\Facultad de Matem\'aticas y Estad\'{\i}stica, UPC\\Room: Seminar 1\end{center}\bigskip\begin{center}PROGRAMME AND ABSTRACTS\end{center}{\bf Tuesday 29 March}\begin{itemize}\item 15:30. Juan J. Morales-Ruiz. \emph{The problem of integrability in Dynamical Systems}\medskip {\bf Abstract.} This talk aims to give an overall picture, without going into detail, of the various notions of integrability associated with such illustrious mathematicians as Liouville, Galois-Picard-Vessiot, Lie, Darboux, Kowalevskaya, Painlev\'e, Poincar\'e, Kolchin, Lax, etc. We will also mention the revolution brought about in the 1960s by the discovery, due to Gardner, Green, Kruskal and Miura, of a new method for solving certain partial differential equations in some cases.\medskip\item 16:00. David G\'omez-Ullate. \emph{Superintegrability, Lax pairs and $N$-body models in the plane}\medskip{\bf Abstract.} We will introduce some classical techniques for constructing integrable $N$-body models, such as Lax pairs or the dynamics of the zeros of a polynomial. We will review the notions of Liouville integrability and superintegrability, and discuss a new method due to F. Calogero for constructing $N$-body models in the plane with many periodic orbits. The talk will be accompanied by animations of the motion of the bodies, and some open problems will be posed.\medskip\item 17:00. Break\medskip\item 17:30. Yuri Fedorov. \emph{Kovalevskaya--Painlev\'e analysis and algebraically integrable systems}\medskip{\bf Abstract.} Many integrable systems possess a remarkable property: all their solutions are meromorphic functions of time regarded as a complex variable. Such behaviour, referred to as the Kovalevskaya--Painlev\'e (KP) property and frequently used as a test of integrability, is not accidental and has deep geometric roots. In this talk we will describe a class of such systems (known as algebraically integrable systems) and highlight the main geometric properties that make it possible to predict the structure of their complex solutions and, moreover, to find them explicitly. We will illustrate this with some systems from classical mechanics. We will also mention some useful generalizations of the notion of algebraic integrability and of the KP property.\end{itemize}\medskip{\bf Wednesday 30 March}\begin{itemize}\item 15:30. Rafael Ram\'{\i}rez-Ros. \emph{Poincar\'e's method}\medskip{\bf Abstract.} For an autonomous Hamiltonian system close to a completely integrable one, Poincar\'e proved that, in general, there is no additional first integral, uniform in the perturbation parameter, other than the Hamiltonian itself. We will sketch the main ideas of the proof and comment on some extensions and generalizations.\medskip\item 16:30. Chara Pantazi. \emph{Darboux's method}\medskip{\bf Abstract.} In 1878 Darboux presented his method for constructing first integrals of polynomial vector fields using their invariant algebraic curves. In this talk we will present some extensions of the classical Darboux method, together with some applications.\medskip\item 17:30. Break\medskip\item 18:00. Juan J. Morales-Ruiz. \emph{Recent methods for detecting non-integrability}\medskip{\bf Abstract.} In 1982 Ziglin used the structure of the Poincar\'e variational equation (along a particular integral curve) as a fundamental tool for detecting the non-integrability of a Hamiltonian system. This talk aims to give an idea of this approach to non-integrability, together with more recent techniques involving the Galois theory of linear differential equations, with emphasis on examples rather than on the general theory. We will illustrate these methods with results on the non-integrability of some $N$-body problems in Celestial Mechanics.\end{itemize}
Abstract:
The inverse scattering problem concerning the determination of the joint time-delay Doppler-scale reflectivity density characterizing continuous target environments is addressed by recourse to generalized frame theory. A reconstruction formula, involving the echoes of a frame of outgoing signals and its corresponding reciprocal frame, is developed. A "realistic" situation, in which only a finite number of signals is transmitted, is further considered. In that case, our reconstruction formula is shown to yield the orthogonal projection of the reflectivity density onto the subspace generated by the transmitted signals.
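The reconstruction formula has a transparent finite-dimensional analogue. In the sketch below (a generic numerical illustration, not the paper's continuous setting) the "echoes" are inner products with a redundant frame, and applying the reciprocal (canonical dual) frame recovers the signal exactly:

```python
import numpy as np

rng = np.random.default_rng(3)

n, m = 4, 7                      # dimension 4, frame of 7 vectors (overcomplete)
G = rng.normal(size=(m, n))      # rows g_k form a frame of R^n (spanning, a.s.)

S = G.T @ G                      # frame operator  S = sum_k g_k g_k^T
G_dual = G @ np.linalg.inv(S)    # rows are the canonical (reciprocal) dual frame

f = rng.normal(size=n)           # unknown "reflectivity" vector
coeffs = G @ f                   # the "echoes": inner products <f, g_k>
f_rec = G_dual.T @ coeffs        # reconstruction  f = sum_k <f, g_k> g~_k

print(np.allclose(f, f_rec))     # exact recovery from the dual-frame sum
```

Keeping only a subset of the rows of `G` mirrors the finite-transmission case: the same dual-frame sum then returns the orthogonal projection of `f` onto the span of the retained vectors rather than `f` itself.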