940 results for real von Neumann measurement


Relevance: 100.00%

Abstract:

The aim of this paper is to develop a new extruder control system for recycled materials that can automatically maintain a constant polymer melt viscosity of mixed recycled polymers during extrusion, regardless of variations in the Melt Flow Index (MFI) of the recycled mixed-grade high density polyethylene (HDPE) feedstock. A closed-loop controller is developed to automatically regulate screw speed and barrel temperature profile to achieve constant viscosity and enable consistent processing of variable-grade recycled HDPE materials. Experimental results of real-time viscosity measurement and control using a 38 mm single-screw extruder with recycled HDPEs of widely different MFIs are reported in this work.
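
For orientation, a minimal sketch of the kind of closed-loop viscosity regulation the abstract describes, assuming a simple PI loop on screw speed; all names, gains, and limits are hypothetical, and the paper's actual control law and sensors are not specified here.

```python
# Hypothetical sketch only: a PI loop trimming screw speed toward a melt
# viscosity setpoint, the kind of closed-loop regulation the abstract
# describes. Gains, limits and units are illustrative, not the paper's.

class ViscosityPIController:
    def __init__(self, setpoint_pa_s, kp=0.8, ki=0.05, dt=1.0):
        self.setpoint = setpoint_pa_s   # target melt viscosity (Pa*s)
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, measured_pa_s, rpm):
        """Return a corrected screw speed from one in-line viscosity reading."""
        error = measured_pa_s - self.setpoint
        self.integral += error * self.dt
        # Viscosity above target -> raise shear rate by speeding up the screw
        # (recycled HDPE is shear-thinning), and vice versa.
        correction = self.kp * error + self.ki * self.integral
        return min(200.0, max(10.0, rpm + correction))  # clamp to a safe range

ctrl = ViscosityPIController(setpoint_pa_s=950.0)
rpm = 60.0
for sample in (1020.0, 990.0, 960.0):   # illustrative viscosity samples
    rpm = ctrl.update(sample, rpm)
```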

Relevance: 100.00%

Abstract:

We define the Schur multipliers of a separable von Neumann algebra M with Cartan masa A, generalising the classical Schur multipliers of B(ℓ²). We characterise these as the normal A-bimodule maps on M. If M contains a direct summand isomorphic to the hyperfinite II₁ factor, then we show that the Schur multipliers arising from the extended Haagerup tensor product A ⊗_eh A are strictly contained in the algebra of all Schur multipliers.
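
For orientation, the classical notion being generalised, in standard notation (assumed here rather than quoted from the paper):

```latex
% A bounded function \varphi on N x N acts entrywise on operator matrices:
\[
  S_\varphi \colon B(\ell^2) \to B(\ell^2), \qquad
  S_\varphi\big([a_{ij}]\big) \;=\; \big[\varphi(i,j)\, a_{ij}\big],
\]
% and \varphi is a classical Schur multiplier when S_\varphi is bounded.
% These are exactly the normal bimodule maps over the diagonal masa
% \ell^\infty \subseteq B(\ell^2); the abstract replaces this pair by a
% Cartan masa A inside a separable von Neumann algebra M.
```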

Relevance: 100.00%

Abstract:

We consider a probabilistic approach to the problem of assigning k indivisible identical objects to a set of agents with single-peaked preferences. Using the ordinal extension of preferences, we characterize the class of uniform probabilistic rules by Pareto efficiency, strategy-proofness, and no-envy. We also show that in this characterization no-envy cannot be replaced by anonymity. When agents are strictly risk-averse von Neumann-Morgenstern utility maximizers, we reduce the problem of assigning k identical objects to a problem of allocating the amount k of an infinitely divisible commodity.
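
For orientation, a sketch of the uniform rule in the divisible benchmark the abstract reduces to, under its standard formulation: each agent reports a peak, and everyone is capped (excess demand) or floored (excess supply) at a common bound chosen so the shares sum to k. Names are illustrative; the paper's probabilistic rules are distributions over discrete assignments, which this sketch does not model.

```python
# Sketch of the uniform rule for dividing an amount k of a divisible good
# among agents with single-peaked preferences (the benchmark the abstract
# reduces to). Names are illustrative.

def uniform_rule(peaks, k, tol=1e-9):
    """Return the uniform allocation of k given one reported peak per agent."""
    cap = sum(peaks) > k  # True: excess demand (cap shares); False: floor them

    def share(p, lam):
        return min(p, lam) if cap else max(p, lam)

    lo, hi = 0.0, max(peaks) + k  # bracket the common bound lambda
    while hi - lo > tol:          # bisect: total share is increasing in lambda
        lam = (lo + hi) / 2.0
        if sum(share(p, lam) for p in peaks) > k:
            hi = lam
        else:
            lo = lam
    return [share(p, lo) for p in peaks]

# Example: peaks (4, 1, 3) and k = 6 units; demand exceeds supply, so the
# two larger peaks are capped at a common bound: ~[2.5, 1.0, 2.5].
print(uniform_rule([4.0, 1.0, 3.0], 6.0))
```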

Relevance: 100.00%

Abstract:

We reconsider the problem of aggregating individual preference orderings into a single social ordering when alternatives are lotteries and individual preferences are of the von Neumann-Morgenstern type. Relative egalitarianism ranks alternatives by applying the leximin ordering to the distributions of (0-1) normalized utilities they generate. We propose an axiomatic characterization of this aggregation rule and discuss related criteria.
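
A worked statement of the normalization step, in assumed notation (the paper's own symbols may differ):

```latex
% Assumed notation: u_i is agent i's vN-M utility, and the min/max run over
% the alternatives under consideration. Each utility is rescaled to [0,1],
\[
  \hat u_i(\ell) \;=\;
  \frac{u_i(\ell) - \min_m u_i(m)}{\max_m u_i(m) - \min_m u_i(m)},
\]
% and relative egalitarianism ranks \ell above \ell' when the increasingly
% ordered vector (\hat u_1(\ell), \dots, \hat u_n(\ell)) lexicographically
% dominates that of \ell' (the leximin ordering).
```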

Relevance: 100.00%

Abstract:

According to the veil of ignorance proposed by John Harsanyi (1953, 1955), the rational observer behind the veil seeks to maximize the sum of individual utilities. However, Harsanyi's model rests on the erroneous assumption that the observer's von Neumann-Morgenstern utility function permits interpersonal comparisons of well-being. This paper suggests a modification of Harsanyi's model that does permit interpersonal comparisons of well-being, using years of life at perfect utility, or happy life years, as the measure of well-being.
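
For orientation, the utilitarian objective the abstract attributes to Harsanyi's observer, in standard notation (an assumption; the paper's formalism may differ):

```latex
% Behind the veil each social position i is occupied with probability 1/n,
% so the rational observer maximizes
\[
  W(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} u_i(x),
\]
% i.e. average, equivalently total, utility. The modification proposed in
% the abstract keeps this additive form but replaces u_i by an
% interpersonally comparable welfare measure such as happy life years.
```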

Relevance: 100.00%

Abstract:

Birkhoff's ergodic theorem tells us about the convergence of sequences of functions. We study the convergence, in mean and almost everywhere, of such sequences, in the case where the averages are taken along a strictly increasing sequence of positive integers. We then define uniform sequences and study almost-everywhere convergence along them. We also examine whether there exist sequences for which convergence fails, and present a result due in part to Alexandra Bellow showing that such sequences exist. Finally, we prove an equivalence between the notion of a strongly mixing transformation and the convergence of a certain sequence of averages whose "weights" satisfy certain properties.
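
Two standard displays make the object of study concrete (textbook statements, assumed here rather than quoted from the thesis):

```latex
% Birkhoff's theorem, for an ergodic measure-preserving transformation T of
% a probability space (X, \mu) and f in L^1(\mu):
\[
  \frac{1}{N} \sum_{n=0}^{N-1} f(T^n x) \;\longrightarrow\; \int_X f \, d\mu
  \qquad \text{for } \mu\text{-a.e. } x,
\]
% and the thesis asks when the same convergence holds for averages taken
% along a strictly increasing sequence (k_n) of positive integers:
\[
  \frac{1}{N} \sum_{n=1}^{N} f(T^{k_n} x).
\]
```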

Relevance: 100.00%

Abstract:

The philosophical implications of the 1979 Prospect Theory, notably those concerning the introduction of a value function over outcomes and a weighting coefficient over probabilities, have never been explored to date. The aim of this work is to construct a philosophical theory of the will from the results of Prospect Theory. To understand how this theory could be developed, one must study Expected Utility Theory, of which it is the major critical outgrowth, that is, the axiomatizations of decision by Ramsey (1926), von Neumann and Morgenstern (1947), and finally Savage (1954), which constitute the foundations of classical decision theory. It was, among other things, the critique, by economics and cognitive psychology, of the independence principle and of the ordering and transitivity axioms that brought out the subjective representational elements from which Prospect Theory could be built. These critiques were carried out by Allais (1953), Edwards (1954), Ellsberg (1961), and finally Slovic and Lichtenstein (1968); the study of these articles shows how the passage from Expected Utility Theory to Prospect Theory took place. Following these analyses and that of Prospect Theory, the notion of a Decisional Reference System is introduced, the natural generalization of the concepts of value function and weighting coefficient from Prospect Theory. This system, whose operation is sometimes heuristic, serves to model decision making within the element of representation, and is organized around three phases: aiming, editing, and evaluation. From this structure, a new typology of decisions is proposed, together with a novel explanation of the phenomena of akrasia and procrastination based on the concepts of risk aversion and overvaluation of the present, both drawn from Prospect Theory.
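
The two evaluation rules the abstract contrasts, in textbook notation (stated for orientation; not the thesis's own formalism):

```latex
% Expected Utility Theory values a lottery with outcomes x_i carried by
% probabilities p_i as
\[
  EU \;=\; \sum_i p_i \, u(x_i),
\]
% while Prospect Theory (Kahneman & Tversky, 1979) evaluates gains and
% losses relative to a reference point through a value function v and
% distorts probabilities through a weighting function w:
\[
  V \;=\; \sum_i w(p_i) \, v(x_i).
\]
```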

Relevance: 100.00%

Abstract:

The thesis mainly focuses on material characterization in different environments: freely available samples taken in planar form, biological samples available in small quantities, and buried objects.

The free space method finds many applications in industry, medicine and communication. As it is a non-contact method, it can be employed for monitoring the electrical properties of materials moving on a conveyor belt in real time; measurement on such systems at high temperature is also possible. NID theory can be applied to the characterization of thin films: the dielectric properties of thin films deposited on any dielectric substrate can be determined. In the chemical industry, the stages of a chemical reaction can be monitored online, which is more efficient as it saves time and avoids the risk of sample collection.

Dielectric contrast is one of the main factors that decides the detectability of a system. It could be noted that two dielectric objects of the same dielectric constant 3.2 (εr of a plastic mine) placed in a medium of dielectric constant 2.56 (εr of sand) could even be detected employing time-domain analysis of the reflected signal. This type of detection is of strategic importance, as it offers a solution to the problem of clearing non-metallic mines, whose demining using conventional techniques has proved futile. The studies on the detection of voids and leakage in pipes also find many applications.

The determined electrical properties of tissues can be used for numerical modeling of cells, microwave imaging, SAR tests, etc. All these techniques need accurate determination of the dielectric constant. In the modern world, the use of cellular and other wireless communication systems is booming; at the same time, people are concerned about the hazardous effects of microwaves on living cells. The effect is usually studied on human phantom models, whose construction requires knowledge of the dielectric parameters of the various body tissues. It is in this context that the present study gains significance. The case study on biological samples shows that the properties of normal and infected body tissues are different. Even though the change in the dielectric properties of infected samples from those of normal ones may not be clear evidence of an ailment, it is an indication of some disorder.

In the medical field, the free space method may be adapted for imaging biological samples. The method can also be used in wireless technology: the electrical properties and attenuation of obstacles in the path of RF waves can be evaluated using free waves, and an intelligent system could be developed that controls the power output or frequency depending on the fed-back attenuation values.

The simulation employed in GPR can be extended to explore the effects of factors such as the proportion of water content in the soil and the level and roughness of the soil on the reflected signal; this may find applications in geological exploration. In the detection of mines, a state-of-the-art technique for scanning and imaging an active minefield can be developed using GPR, with the probing antenna attached to a robotic arm capable of three degrees of rotation and the whole detecting system housed in a military vehicle. In industry, a system based on the GPR principle can be developed for monitoring liquid or gas through a pipe, as a pipe with and without the sample gives different reflection responses. It may also be implemented for online monitoring of the different stages of extraction and purification of crude petroleum in a plant.

Since biological samples show fluctuation in their dielectric nature with time and other physiological conditions, more investigation in this direction should be done. Infected cells at various stages of advancement should be analysed against normal cells; the results of such comparative studies can be utilized for detecting the onset of such diseases, and by studying the properties of infected tissues at different stages, the threshold of detectability of infected cells can be determined.
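
A short calculation, using the standard normal-incidence Fresnel formula (an orientation note, not taken from the thesis), shows why this weak contrast remains detectable:

```latex
% Normal-incidence reflection coefficient between two non-magnetic
% dielectrics:
\[
  \Gamma \;=\; \frac{\sqrt{\varepsilon_{r1}} - \sqrt{\varepsilon_{r2}}}
                    {\sqrt{\varepsilon_{r1}} + \sqrt{\varepsilon_{r2}}},
\]
% so for sand (\varepsilon_{r1} = 2.56) over a plastic mine casing
% (\varepsilon_{r2} = 3.2):
% |\Gamma| = |1.600 - 1.789| / (1.600 + 1.789) \approx 0.056,
% a weak but nonzero echo that time-domain analysis of the reflected
% signal can still isolate.
```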

Relevance: 100.00%

Abstract:

In [4], Guillard and Viozat propose a finite volume method for the simulation of inviscid steady as well as unsteady flows at low Mach numbers, based on a preconditioning technique. The scheme satisfies the results of a single-scale asymptotic analysis in a discrete sense and has the advantage that it can be derived by a slight modification of the dissipation term within the numerical flux function. Unfortunately, numerical experiments show that the preconditioned approach combined with an explicit time integration scheme turns out to be unstable if the time step Δt does not satisfy the requirement of being O(M²) as the Mach number M tends to zero, whereas the corresponding standard method remains stable up to Δt = O(M), M → 0, which results from the well-known CFL condition. We present a comprehensive mathematical substantiation of this numerical phenomenon by means of a von Neumann stability analysis, which reveals that, in contrast to the standard approach, the dissipation matrix of the preconditioned numerical flux function possesses an eigenvalue growing like M⁻² as M tends to zero, thus causing the diminishment of the stability region of the explicit scheme. Thereby, we present statements for both the standard preconditioner used by Guillard and Viozat [4] and the more general one due to Turkel [21]. The theoretical results are afterwards confirmed by numerical experiments.
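
A sketch of the scaling argument in assumed notation (the abstract's statement, compressed; constants and norms are suppressed):

```latex
% Explicit schemes are stable only while the time step resolves the fastest
% mode of the dissipative part of the flux; schematically, with \rho(D) the
% spectral radius of the dissipation matrix,
\[
  \Delta t \;\lesssim\; \frac{\Delta x}{\rho(D)}.
\]
% Standard scheme: \rho(D) = O(M^{-1}) (acoustic speeds dominate), hence
% \Delta t = O(M). Preconditioned scheme: the abstract's eigenvalue of
% order M^{-2} gives \Delta t = O(M^2), the observed tighter restriction.
```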

Relevance: 100.00%

Abstract:

The management of abdominal trauma poses the challenge of performing an intestinal anastomosis or suture in hemodynamically compromised patients. The decision for damage control surgery in the presence of acidosis, hypothermia and coagulopathy is clear; however, the situation is not always so straightforward. Individuals with trauma develop molecular and inflammatory changes due to an inadequate balance between oxygen supply and demand, which affect tissue repair and carry a risk of fistula formation. A quick and practical way to detect this hypoperfusion is by measuring venous oxygen saturation (SVO2) and serum lactate. OBJECTIVES: To establish a correlation between intraoperative SVO2 values and the occurrence of intestinal fistulas in patients operated on for abdominal trauma. MATERIALS AND METHODS: A prospective cohort study analyzing different variables related to the occurrence of fistulas in patients with abdominal trauma requiring sutures of the gastrointestinal tract, with emphasis on SVO2 levels. RESULTS: Patients with anastomotic failure had a lower mean SVO2 (60.0% ± 2.94%) than those without fistulas (69.89% ± 7.21%) (p = 0.010). All patients in the exposed cohort (SVO2 < 65%) presented anastomotic dehiscence (RR = 39.8, 95% CI: 2.35-659.91, p < 0.001, Fisher's exact test). The positive predictive value of saturation (< 65%) was 57.14% (95% CI: 13.34%, 100%) and the negative predictive value was 100% (95% CI: 81.75%, 100%). Sensitivity was 100% (95% CI: 87.50%, 100%) and specificity 91.89% (95% CI: 81.75%, 100%). Bivariate analysis determined that the abdominal trauma index, hemoglobin level and red blood cell transfusion requirement are risk factors directly related to anastomotic failure in patients with abdominal trauma. CONCLUSIONS: There is a strong relationship between failure of intestinal repair and SVO2 < 65%. The prognosis of an intestinal anastomosis is directly related to the hemodynamic state and tissue perfusion at the time of surgery. The SVO2 level can support the surgeon in deciding whether or not to perform a hollow viscus repair at the time of surgery in a patient with abdominal trauma.
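
For reference, the standard definitions behind the reported diagnostic figures, with TP, FP, TN, FN the usual counts of true/false positives and negatives against the dehiscence outcome (an orientation note, not part of the study):

```latex
\[
  \mathrm{PPV} = \frac{TP}{TP+FP}, \quad
  \mathrm{NPV} = \frac{TN}{TN+FN}, \quad
  \mathrm{Sens} = \frac{TP}{TP+FN}, \quad
  \mathrm{Spec} = \frac{TN}{TN+FP}.
\]
% The "test" here is intraoperative SVO2 < 65%: sensitivity of 100% means
% every dehiscence occurred below the threshold, and NPV of 100% means no
% patient with SVO2 >= 65% fistulized in this cohort.
```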

Relevance: 100.00%

Abstract:

This document constitutes a case study developed in accordance with the guidelines set out in the National Development Plan 2010-2014, "Prosperidad para todos", in which the Government establishes that 1,000,000 housing solutions must be provided nationwide during this presidential term, of which 254,920 are the responsibility of the Fondo Nacional del Ahorro. The strategies the FNA has been developing are therefore analyzed, with the purpose of proposing alternatives that allow the entity's senior management to make decisions consistent with its housing promotion models, which have been aligned with the objectives defined by the National Government on the central axis of housing.

Relevance: 100.00%

Abstract:

The strategic equilibrium of an N-person cooperative game with transferable utility is a system composed of a cover collection of subsets of N and a set of extended imputations attainable through such an equilibrium cover. The system describes a state of coalitional bargaining stability in which every player has a bargaining alternative against any other player to support his corresponding equilibrium claim. Any coalition in the stable system may form and divide the characteristic value of the coalition as prescribed by the equilibrium payoffs. If syndicates are allowed to form, a formed coalition may become a syndicate, using the equilibrium payoffs as disagreement values in bargaining for a part of the complementary coalition's incremental value to the grand coalition when formed. The emergent well-known constant-sum derived game in partition function form is described in terms of parameters that result from incumbent binding agreements. The strategic equilibrium corresponding to the derived game gives an equal value claim to all players. This surprising result is alternatively explained in terms of the possible outcomes, based on strategic equilibria, of a sequence of bargaining stages: when the binding agreements are made in the right sequential order, von Neumann and Morgenstern (vN-M) non-discriminatory solutions emerge. In these solutions a branch preferred by a sufficient number of players is identified: the weaker players syndicate against the stronger player. This condition is referred to as the stronger player paradox. A strategic alternative available to the stronger player for overcoming the anticipated undesirable outcome is to voluntarily lower his bargaining equilibrium claim. In doing so, the original strategic equilibrium is modified and vN-M discriminatory solutions may occur, but a different stronger player may also emerge who will eventually have to lower his equilibrium claim. A sequence of such measures converges to the equal-opportunity-for-all vN-M solution anticipated by the strategic equilibrium of the partition function derived game.
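
Background notation that the abstract's "extended imputations" generalise (standard TU-game definitions, stated here as an assumption for orientation):

```latex
% A TU game is a pair (N, v), v : 2^N \to \mathbb{R} with v(\emptyset) = 0;
% an imputation is a payoff vector x that is efficient and individually
% rational:
\[
  \sum_{i \in N} x_i = v(N), \qquad x_i \ge v(\{i\}) \quad \forall i \in N.
\]
% On one natural reading, a formed coalition S bargaining "for a part of
% the complementary coalition's incremental value to the grand coalition"
% negotiates over v(N) - v(S) - v(N \setminus S), with the equilibrium
% payoffs as disagreement points.
```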

Relevance: 100.00%

Abstract:

The development of linear economic models was one of the most significant achievements of economic theory in postwar North America. Linear programming, developed by George B. Dantzig (1947), the input-output models of Wassily Leontief (1946), and the game theory of John von Neumann (1944) became three distinct branches of linear economic theory. Their applications in various fields of knowledge, such as economics and political science, and in management activities in industry and government, are increasingly significant. The main objective of this work is to present a practical model of the production processes typical of a factory or firm that transforms inputs into products. The model is developed in the context, and with the concepts, of the theory of linear economic models and of the operations research approach, also known as management science.
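
The two linear building blocks the abstract names, in textbook notation (the paper's own production model is not reproduced here):

```latex
% Leontief input-output: with technology matrix A and final demand d,
% gross output x solves
\[
  x = Ax + d \;\Longleftrightarrow\; x = (I - A)^{-1} d,
\]
% and a Dantzig-style production LP chooses activity levels x to maximize
% value subject to resource limits:
\[
  \max_{x \ge 0} \; c^{\top} x \quad \text{s.t.} \quad Ax \le b.
\]
```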

Relevance: 100.00%

Abstract:

From the beginning, the world of game-playing by machine has been fortunate in attracting contributions from the leading names of computer science. Charles Babbage, Konrad Zuse, Claude Shannon, Alan Turing, John von Neumann, John McCarthy, Allen Newell, Herb Simon and Ken Thompson all come to mind, and each reader will wish to add to this list. Recently, the Journal has saluted both Claude Shannon and Herb Simon. Ken's retirement from Lucent Technologies' Bell Labs to the start-up Entrisphere is also a good moment for reflection.

Relevance: 100.00%

Abstract:

There is a substantial literature suggesting that appraisals are smoothed and lag the true level of prices. This study combines a qualitative interview survey of the leading fund manager/owners in the UK and their appraisers with an empirical study of the number of appraisals that change each month within the IPD Monthly Index. The paper concentrates on how the appraisal process operates for commercial real estate performance measurement purposes. The interviews suggest that periodic appraisal services are consolidating in fewer firms and that, within these major firms, appraisers adopt different approaches to changing appraisals on a period-by-period basis, with some wanting hard transaction evidence while others act on 'softer' signals. The survey also indicates a seasonal effect, with greater effort and information being applied to annual and quarterly appraisals than to monthly ones. The analysis of the appraisals within the IPD Monthly Index confirms this effect, with around 5% more appraisals being moved at each quarter day than in the other months. More November appraisals change than expected, suggesting that the increased information flows for the December year-end appraisals feed through into earlier appraisals, especially as client/appraiser draft-appraisal meetings for the December appraisals, a regular occurrence in the UK, can take place in November. January shows significantly less activity than other months, a seasonal effect following the exertions of the December appraisals.