16 results for real life data

in Archivo Digital para la Docencia y la Investigación - Repositorio Institucional de la Universidad del País Vasco


Relevance:

90.00%

Publisher:

Abstract:

[ES] Ventricular fibrillation (VF) is the first rhythm recorded in 40% of sudden deaths from out-of-hospital cardiac arrest (OHCA). The only effective treatment for VF is defibrillation by means of an electric shock. Outside the hospital, the shock is delivered by an automated external defibrillator (AED), which first analyses the patient's electrocardiogram (ECG) and checks whether it shows a shockable rhythm. Survival in a case of OHCA depends fundamentally on two factors: early defibrillation and early cardiopulmonary resuscitation (CPR), which prolongs VF and therefore the window of opportunity for defibrillation. An accurate rhythm analysis requires interrupting CPR because the chest compressions introduce artefacts into the ECG. Unfortunately, interrupting CPR adversely affects the success of defibrillation. In 2003 the use of the AED was approved for patients between 1 and 8 years of age. AEDs, originally designed for adult patients, must discriminate paediatric arrhythmias accurately for their use in children to be safe. Several AEDs have been adapted for paediatric use, either by demonstrating the accuracy of the adult algorithms on paediatric arrhythmias or by means of algorithms specifically designed for paediatric arrhythmias. This thesis presents a new AED algorithm designed jointly for adult and paediatric patients. The algorithm has been tested exhaustively on databases compliant with the requirements of the American Heart Association (AHA) and on resuscitation records with and without CPR artefacts. The work began with a long experimental phase in which a total of 1090 paediatric rhythms were retrospectively collected and classified. In addition, an adult arrhythmia database was reviewed and 928 new adult rhythms were added. The final database contains 2782 records, of which 1270 were used to design the algorithm and 1512 to validate it. A new AED algorithm composed of four sub-algorithms was then designed. These sub-algorithms are based on a set of new arrhythmia-detection parameters computed in several signal domains, such as time, frequency, slope and the autocorrelation function. The algorithm meets the AHA requirements for the detection of shockable and non-shockable rhythms in both adult and paediatric patients. The work concluded with an analysis of the behaviour of the algorithm on real resuscitation episodes. For the rhythms without CPR artefacts the AHA requirements were met. The accuracy of the algorithm during chest compressions was then studied, before and after filtering the CPR artefact. A new method developed during the thesis was used to suppress the artefact. Shockable rhythms were detected accurately after filtering; non-shockable rhythms, however, were not.
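As an illustration of the kind of detection parameters mentioned above (features computed in the slope, frequency and autocorrelation domains of an ECG segment), the following Python sketch computes three generic features. The function name, sampling rate and feature definitions are our own assumptions; this is not the thesis's sub-algorithms.

    # Illustrative sketch only: generic ECG rhythm-analysis features in the
    # slope, frequency and autocorrelation domains. Not the thesis algorithm.
    # Assumes a segment of at least a few seconds, as in typical AED analysis windows.
    import numpy as np

    def ecg_features(ecg, fs=250.0):
        """Return (mean absolute slope, spectral centroid, periodicity) of a segment."""
        ecg = np.asarray(ecg, dtype=float)
        ecg = ecg - ecg.mean()
        # Slope domain: mean absolute first difference, scaled to units per second.
        slope = np.mean(np.abs(np.diff(ecg))) * fs
        # Frequency domain: centroid of the power spectrum.
        power = np.abs(np.fft.rfft(ecg)) ** 2
        freqs = np.fft.rfftfreq(ecg.size, d=1.0 / fs)
        centroid = float(np.sum(freqs * power) / np.sum(power))
        # Autocorrelation domain: strongest peak between 0.2 s and 2 s of lag.
        ac = np.correlate(ecg, ecg, mode="full")[ecg.size - 1:]
        ac = ac / ac[0]
        periodicity = float(ac[int(0.2 * fs):int(2.0 * fs)].max())
        return slope, centroid, periodicity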

Relevance:

90.00%

Publisher:

Abstract:

This paper proposes an extended version of the basic New Keynesian monetary (NKM) model that incorporates revision processes for output and inflation data in order to assess the importance of data revisions for the estimated monetary policy rule parameters and the transmission of policy shocks. Our empirical evidence, based on a structural econometric approach, suggests that although the initial announcements of output and inflation are not rational forecasts of the revised data, ignoring the presence of non-well-behaved revision processes may not be a serious drawback in the analysis of monetary policy in this framework. However, the transmission of inflation-push shocks is strongly affected by the presence of data revisions, especially when the nominal stickiness parameter is estimated taking data revision processes into account.
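To make the notion of a "well-behaved" revision process concrete (the notation here is ours, not the paper's): if $y_t^{r}$ denotes the initial (real-time) announcement and $y_t^{f}$ the revised (final) figure, the revision is pure news when it is unpredictable from the announcement, whereas the evidence cited above corresponds to

$$
rev_t \;=\; y_t^{f} - y_t^{r} \;=\; a + b\,y_t^{r} + \varepsilon_t, \qquad b \neq 0,
$$

so that the initial announcement is not a rational forecast of the final data.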

Relevance:

90.00%

Publisher:

Abstract:

Revisions of US macroeconomic data are not white noise. They are persistent, correlated with real-time data, and highly variable (around 80% of the volatility observed in US real-time data). Their business-cycle effects are examined in an estimated DSGE model extended with both real-time and final data. After implementing a Bayesian estimation approach, the roles of both habit formation and price indexation fall significantly in the extended model. The results show that revision shocks to both output and inflation are expansionary because they occur when the published real-time data are too low and the Fed reacts by cutting interest rates. Consumption revisions, by contrast, are countercyclical, as consumption habits mirror the observed reduction in real-time consumption. In turn, revisions of the three variables explain 9.3% of output fluctuations in the long-run variance decomposition of output.

Relevance:

90.00%

Publisher:

Abstract:

We provide empirical evidence to support the claim that social diversity promotes prosocial behavior. We elicit a real-life social network and its members' adherence to a social norm, namely inequity aversion. The data reveal a positive relationship between subjects' prosociality and several measures of centrality. This result is in line with the theoretical literature that relates the evolution of social norms to the structure of social interactions and argues that central individuals are crucial for the emergence of prosocial behavior.
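Several standard centrality measures could be computed on such an elicited network. The sketch below uses the networkx library on a toy friendship graph and is purely illustrative; it is not the paper's data or code.

    # Toy example: common centrality measures on a small undirected network.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e")])

    degree = nx.degree_centrality(G)            # share of possible ties each node has
    betweenness = nx.betweenness_centrality(G)  # share of shortest paths passing through a node
    eigenvector = nx.eigenvector_centrality(G)  # influence via well-connected neighbours

    for node in G:
        print(node, degree[node], betweenness[node], eigenvector[node])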

Relevance:

90.00%

Publisher:

Abstract:

We study the language choice behavior of bilingual speakers in modern societies such as the Basque Country, Ireland and Wales. These countries have two official languages: A, spoken by all, and B, spoken by a minority. We think of the bilinguals in those societies as a population repeatedly playing a Bayesian game in which they must strategically choose the language, A or B, to be used in the interaction. The choice has to be made under imperfect information about the linguistic type of the interlocutors. We take the Nash equilibrium of the language use game as a model for real-life language choice behavior. It is shown that the predictions made with this model fit very well the census data on the actual use of the Basque, Irish and Welsh languages. The question posed by Fishman (2001), which appears in the title, is then answered as follows: it is hard mainly because bilingual speakers have reached an equilibrium which is evolutionarily stable. This means that, to solve their frequent language coordination problem quickly and in a reflex manner, bilinguals have developed linguistic conventions based chiefly on the strategy 'Use the same language as your interlocutor', which weakens the actual use of B.
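A minimal sketch of the decision problem described above, under hypothetical payoffs: a bilingual opening a conversation in B succeeds only if the interlocutor is also bilingual, and otherwise the pair falls back to A after a costly miscoordination. The numbers and the function below are illustrative assumptions, not the calibration of the paper.

    # Expected payoff of opening in B versus in A when the probability that
    # the interlocutor is bilingual is p. Payoff values are hypothetical.
    def expected_payoffs(p, payoff_B=1.0, payoff_A=0.8, payoff_miss=0.2):
        use_B = p * payoff_B + (1 - p) * payoff_miss  # coordinate in B, or fail and switch
        use_A = payoff_A                              # A always works: everyone speaks it
        return use_B, use_A

    for p in (0.2, 0.5, 0.8):
        b, a = expected_payoffs(p)
        print(f"p={p}: open in B -> {b:.2f}, open in A -> {a:.2f}")

Only when the probability of meeting another bilingual is high enough does opening in B pay off, which is consistent with the convention-based equilibrium, and its weakening effect on the use of B, described above.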

Relevance:

90.00%

Publisher:

Abstract:

[ENG] If we look around us, we can observe that in each field of activity there is someone who is the best. We might think that they are exceptional individuals. This Final Project aims to increase knowledge of the effects of deliberate practice in the domains of music and sport. It first defines the concept of deliberate practice and then focuses on the diversity of situations in which it appears in real life. From a questionnaire designed for this study and distributed to music students, I expected to obtain results allowing me to conclude that a relation exists between hours of practice and expertise in performance. This reality has been linked to the corresponding situation in sport, for which information was provided by the coordinators of the different sports. Given the limited number of references available, this work has focused on a qualitative analysis of the data, interpreted from my point of view and personal experience, which has been confirmed by the results obtained. The statistics obtained allow me to conclude that, although the argument is not definitive, effort guided through deliberate practice is essential to achieve excellence.

Relevance:

90.00%

Publisher:

Abstract:

When it comes to data sets in real life, pieces of the whole set are often unavailable. This problem can originate from various causes and therefore follows different patterns. In the literature it is known as the missing data problem. It can be handled in various ways: by discarding incomplete observations, by estimating what the missing values originally were, or simply by ignoring the fact that some values are missing. The methods used to estimate missing data are called imputation methods. The work presented in this thesis has two main goals. The first is to determine whether any interactions exist between missing data, imputation methods and supervised classification algorithms when they are applied together. For this first problem we consider a scenario in which the databases used are discrete, where discrete means that no relation is assumed between observations. These datasets underwent processes involving different combinations of the three components mentioned. The outcome showed that the missing data pattern strongly influences the results produced by a classifier. Also, in some cases, the complex imputation techniques investigated in the thesis obtained better results than simple ones. The second goal of this work is to propose a new imputation strategy, this time constraining the setting of the previous problem to a special kind of dataset, multivariate time series. We designed new imputation techniques for this particular domain and combined them with some of the contrasted strategies tested in the previous chapter of this thesis. The time series were also subjected to processes involving missing data and imputation, in order to finally propose an overall better imputation method. In the final chapter of this work a real-world example is presented, describing a water quality prediction problem. The databases that characterize this problem have their own original missing values, which provides a real-world benchmark to test the algorithms developed in this thesis.
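As a baseline illustration of the treatments mentioned above (discarding incomplete observations versus imputing the missing values), the following sketch contrasts listwise deletion with simple column-mean imputation. It is only an illustration, not one of the imputation strategies proposed in the thesis.

    # Minimal illustration: listwise deletion vs. column-mean imputation.
    import numpy as np

    X = np.array([[1.0, 2.0],
                  [np.nan, 3.0],
                  [4.0, np.nan],
                  [5.0, 6.0]])

    complete_rows = X[~np.isnan(X).any(axis=1)]         # drop observations with gaps
    col_means = np.nanmean(X, axis=0)                   # per-column mean over observed values
    mean_imputed = np.where(np.isnan(X), col_means, X)  # fill gaps with the column mean

    print(complete_rows)
    print(mean_imputed)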

Relevance:

80.00%

Publisher:

Abstract:

In real-life strategic interactions, decision-makers are likely to entertain doubts about the degree of optimality of their play. To capture this feature of real choice-making, we present a model based on the doubts felt by an agent about how well he is playing a game. The doubts are coupled with (and mutually reinforced by) imperfect discrimination capacity, which we model here by means of similarity relations. We assume that each agent builds procedural preferences defined on the space of expected payoffs-strategy frequencies attached to his current strategy. These preferences, together with an adaptive learning process, lead to doubt-based selection dynamics. We introduce the concepts of Mixed Strategy Doubt Equilibria, Mixed Strategy Doubt-Full Equilibria and Mixed Strategy Doubtless Equilibria and show the theoretical and empirical relevance of these concepts.

Relevance:

80.00%

Publisher:

Abstract:

This paper uses a structural approach based on the indirect inference principle to estimate a standard version of the New Keynesian monetary (NKM) model augmented with term structure, using both revised and real-time data. The estimation results show that the term spread and policy inertia are both important determinants of the estimated U.S. monetary policy rule, whereas the persistence of shocks plays a small but significant role when revised and real-time data on output and inflation are both considered. More importantly, the relative importance of the term spread and of persistent shocks in the policy rule and in the shock transmission mechanism changes drastically once it is taken into account that real-time data are not well behaved.

Relevance:

80.00%

Publisher:

Abstract:

[ES] While reality is transforming at a rapid pace, the ideas of science and of scientific knowledge are in crisis. This crisis also manifests itself in business studies: many claim that Business Economics is not a science but a set of diverse, weakly connected bodies of knowledge. Others lose themselves in research far removed from reality, and a last group cannot rise above a merely practice-oriented approach. For this reason, we believe this is a good moment to recall the foundations that sustain a science: its formal and material object and its method. To this end, a synthetic exposition of the thought of Professor Soldevilla is presented.

Relevance:

80.00%

Publisher:

Abstract:

Roughly one half of the world's languages are in danger of extinction. The endangered languages, spoken by minorities, typically compete with powerful languages such as English or Spanish. Consequently, the speakers of minority languages have to consider that not everybody can speak their language, which turns the language choice into a strategic, coordination-like situation. We show experimentally that the displacement of minority languages may be partially explained by imperfect information about the linguistic type of the partner, leading to frequent failure to coordinate on the minority language even between two speakers who can and prefer to use it. The extent of miscoordination correlates with how minoritarian a language is and with the real-life linguistic condition of the subjects: the more endangered a language, the harder it is to coordinate on its use, and the people on whom the language's survival relies the most acquire behavioral strategies that lower its use. Our game-theoretical treatment of the issue provides a new perspective for linguistic policies.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work is to establish the guidelines of an actuarial valuation in the various cases in which it may be required in litigation, and to ensure that, after a complete reading, the reader gains a closer and more comprehensible view of what at first might seem beyond the understanding of someone without actuarial training; with basic actuarial knowledge, the aim is that the subject becomes straightforward and that the work provides an overall view of actuarial valuation. It consists of a theoretical part, in which loss of earnings (lucro cesante) is defined, the factors to be taken into account in its valuation are broken down and different methodologies are presented, and a practical part, in which several hypotheses are formulated, each defined and with its role in the valuation detailed, together with a set of practical cases covering the most common events in real life. In short, beyond explaining the valuation procedure, the objective of this work is that, after a complete reading, the reader understands the importance of the actuary's role as the person in charge of achieving justice and social equity, using actuarial knowledge so that the person economically harmed by the event is fairly compensated.
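A standard actuarial expression of the kind such a valuation relies on (stated here generically; it is not necessarily the exact formulation used in this work) discounts the estimated annual loss of earnings for both survival and interest:

$$
PV \;=\; \sum_{t=1}^{n} I_t \,{}_{t}p_{x}\,(1+i)^{-t},
$$

where $I_t$ is the lost income expected in year $t$, ${}_{t}p_{x}$ is the probability that a person aged $x$ survives $t$ more years, $i$ is the technical interest rate and $n$ is the compensation horizon.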

Relevance:

80.00%

Publisher:

Abstract:

[ES] This project studies the flow of carbon dioxide through nozzles of different diameters using simulation software based on the finite volume method (CFD). The objective is to be able to choose the nozzle that optimizes the amount of gas used as a function of the distance the jet has to reach. With a suitable computational model, this simulation can be carried out on a computer without resorting to physical tests, saving cost and time.
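The link between nozzle diameter and gas consumption can be made explicit with the standard ideal-gas choked-flow relation (a textbook formula, not taken from the project): for a sufficiently high supply pressure, the mass flow through a throat of area $A$ is

$$
\dot m \;=\; C_d\,A\,p_0\,\sqrt{\frac{\gamma}{R\,T_0}}\,
\left(\frac{2}{\gamma+1}\right)^{\frac{\gamma+1}{2(\gamma-1)}},
$$

where $p_0$ and $T_0$ are the stagnation pressure and temperature, $C_d$ is the discharge coefficient, and for CO2 $\gamma \approx 1.29$ and $R \approx 188.9\ \mathrm{J/(kg\,K)}$. Since $A$ grows with the square of the diameter, so does the gas consumed, which is the kind of trade-off the simulations explore against reach distance.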

Relevance:

80.00%

Publisher:

Abstract:

[ES] Industrial establishments to which Directive 2012/18/EU applies must establish the actions needed to control and plan for the risks inherent in major accidents that can be caused by dangerous substances. This directive obliges the affected establishments classified as upper tier to prepare a safety report containing risk analyses of the possible major accidents in order to calculate the damage that could occur. To this end a series of methodologies is followed, ranging from the analysis of the possible major-accident hypotheses to the calculation of their consequences for people, property and the environment. This technical report analyses the consequences of a real accident caused by a toxic cloud of nitrogen dioxide (NO2) and compares the results with the real accident data. Likewise, a series of assumptions is used to analyse the consequences of a risk scenario at the same facility that should be included in the safety report.
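Consequence calculations for a toxic cloud of this kind are commonly based on dispersion models such as the Gaussian plume (given here as a generic reference, not necessarily the model used in the report): the ground-level concentration downwind of a continuous release of strength $Q$ is

$$
C(x,y,0) \;=\; \frac{Q}{\pi\,u\,\sigma_y(x)\,\sigma_z(x)}\,
\exp\!\left(-\frac{y^2}{2\sigma_y^2(x)}\right)
\exp\!\left(-\frac{H^2}{2\sigma_z^2(x)}\right),
$$

where $u$ is the wind speed, $H$ the effective release height and $\sigma_y$, $\sigma_z$ the dispersion coefficients, which depend on downwind distance and atmospheric stability; the predicted NO2 concentrations are then compared with toxicity thresholds to estimate the damage to people.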

Relevance:

80.00%

Publisher:

Abstract:

Nowadays, in-lab train control simulation tools play a crucial role in reducing extensive and expensive on-site railway testing activities. In this paper we present our contribution in this arena by detailing the internals of our European Railway Traffic Management System (ERTMS) in-lab demonstrator. The demonstrator is built on a general-purpose simulation framework, Riverbed Modeler (previously OPNET Modeler). Our framework models both ERTMS subsystems: the Automatic Train Protection application layer, based on movement authority message exchange, and the telecommunication subsystem, based on GSM-R communication technology. We provide detailed information on our modelling strategy and validate the simulation framework with real trace data. Finally, given the current industry migration from the obsolescent GSM-R legacy towards IP-based heterogeneous technologies, our simulation framework is a singular tool for railway operators. As an example, we present the assessment of related performance indicators for a specific railway network using a candidate replacement technology, LTE, versus the current legacy technology. To the best of our knowledge, there is no similar initiative able to measure the impact of the telecommunication subsystem on railway network availability.