983 results for Standard information
Predictors of weight change in sedentary smokers receiving a standard smoking cessation intervention
Abstract:
Smoking cessation is generally associated with weight gain, which can undermine smokers' motivation to engage in a quit attempt and is a common cause of relapse. The magnitude and time course of the weight gain associated with a quit attempt among smokers treated according to current clinical guidelines are poorly described in the medical literature. The aim of this study was to quantify this weight gain, to determine its time course, and to identify the factors that influence it, in sedentary smokers receiving an individualized smoking cessation intervention combining individual counselling with nicotine replacement therapy delivered through several routes of administration. We analysed data collected during a randomized controlled trial that studied the impact of moderate physical activity on one-year smoking abstinence rates in sedentary smokers. We modelled the weight trajectory of all participants over time using longitudinal mixed models, distinguishing periods of abstinence from periods of relapse and accounting for reported use of nicotine replacement therapy. This approach allowed us to include every participant in the study, in contrast to a simpler model that would separate abstinent subjects from those relapsing at any point during follow-up. We also adjusted these models for participants' age, sex, level of nicotine dependence, and educational attainment. Across all participants, weight increased during the first three months of the intervention and then stabilized. Overall, the mean weight gain was 3.3 kg for women and 3.9 kg for men.
During periods of abstinence, the following characteristics were associated with weight gain: male sex and high nicotine dependence. Age above 43 years was associated with weight gain during periods of relapse as well. We observed a trend, not statistically significant, towards reduced weight gain with the use of nicotine replacement therapy. Our study provides new data on weight trajectories in sedentary smokers receiving a smoking cessation intervention. These smokers do gain weight, but the gain is moderate and limited to the first months. Among them, men, the most nicotine-dependent individuals, and the oldest should expect above-average weight gain.
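As a minimal sketch of the kind of trajectory described above (not the study's actual code, data, or mixed-model specification), the following Python fragment fits a piecewise-linear fixed-effect model to synthetic weight trajectories that rise over the first three months and then plateau. All names and numbers are hypothetical illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 30 subjects, 13 monthly weight measurements each,
# with gains concentrated in the first 3 months (abstract reports ~3.3-3.9 kg).
t = np.tile(np.arange(13.0), 30)                    # months since quit date
subject_gain = rng.normal(3.6, 1.0, 30).repeat(13)  # per-subject total gain (kg)
weight = 70 + subject_gain * np.minimum(t, 3.0) / 3.0 + rng.normal(0, 0.3, t.size)

# Piecewise-linear ("linear spline") fit: one slope up to month 3, flat after,
# a simplified stand-in for the longitudinal mixed models used in the study.
X = np.column_stack([np.ones_like(t), np.minimum(t, 3.0)])
beta, *_ = np.linalg.lstsq(X, weight, rcond=None)
print(f"estimated gain per month (first 3 months): {beta[1]:.2f} kg")
print(f"implied total gain: {3 * beta[1]:.2f} kg")
```

A full mixed-model analysis would additionally include per-subject random effects and time-varying abstinence and relapse indicators, as the abstract describes.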
Abstract:
Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful tools for dealing with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book includes self-contained introductions to probability and decision theory; develops the characteristics of Bayesian networks, object-oriented Bayesian networks, and their extension to decision models; features an implementation of the methodology with reference to commercial and academically available software; presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases; provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning; contains a method for constructing coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them; is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background; and includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
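To make the underlying idea concrete, here is a minimal sketch of the simplest Bayesian network used in forensic evaluation: two nodes, a hypothesis H ("the suspect is the source") with a child E ("a match is reported"). The probabilities are hypothetical placeholders, not values endorsed by the book.

```python
# Two-node network H -> E with hypothetical probabilities.
p_H = 0.01                 # prior for H (an assumption set by case context)
p_E_given_H = 0.99         # probability of a reported match if H is true (assumed)
p_E_given_notH = 1e-4      # random-match probability if H is false (assumed)

# Likelihood ratio: how strongly the finding shifts the odds on H.
lr = p_E_given_H / p_E_given_notH

# Posterior by Bayes' theorem (equivalently, posterior odds = LR * prior odds).
p_E = p_E_given_H * p_H + p_E_given_notH * (1 - p_H)
p_H_given_E = p_E_given_H * p_H / p_E
print(f"likelihood ratio: {lr:.0f}")          # -> 9900
print(f"posterior P(H | E): {p_H_given_E:.3f}")  # -> 0.990
```

Larger networks generalize exactly this computation: each node carries a conditional probability table, and evidence is propagated through the graph rather than through a single application of Bayes' theorem.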
Abstract:
Research project carried out by a secondary school student and awarded a CIRIT Prize for promoting scientific spirit among young people in 2009. This project takes as its challenge the need for connection to the electricity and water supply networks. The aim was to determine whether, both technically and economically, a mountain house in the Vallès Oriental could be made self-sufficient by exploiting the resources the environment offers, without giving up the standard of living to which we have become accustomed in recent years. After collecting data and assessing the minimum energy consumption we wanted to allow for, we designed an installation capable of harnessing the renewable energy of the surroundings and meeting our needs. The installation includes photovoltaic panels, solar collectors, rainwater collection tanks, pumps, etc. Finally, a budget for this installation is included, which shows that the required investment is too high. It is thus demonstrated that self-sufficiency for this house is technically feasible but economically unviable (or not at all cost-effective).
Abstract:
The problem of monitoring digital television broadcasts across Europe for the development of robust and reliable receivers is increasingly significant, which creates the need to automate the analysis and monitoring of these signals. This project presents the software development of an application intended to solve part of this problem. The application analyses, manages, and captures digital television signals. This document introduces the central subject, digital television and the information carried by television signals, specifically as defined by the "Digital Video Broadcasting" standard. The text then focuses on explaining and describing the functionality the application needs to cover, and on introducing and explaining each stage of a software development process. Finally, it summarizes the advantages this program brings to the automation of digital signal analysis through an optimization of resources.
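As a hypothetical sketch of one low-level step such an application performs (the abstract does not describe its implementation), the fragment below scans an MPEG-2 transport stream, the container format DVB broadcasts use, for 188-byte packets and extracts each packet's PID from the header.

```python
# MPEG-2 transport stream constants: fixed-size packets starting with 0x47.
SYNC_BYTE = 0x47
PACKET_LEN = 188

def parse_pids(ts: bytes) -> list:
    """Return the PID of every well-formed packet in a raw TS buffer."""
    pids = []
    for off in range(0, len(ts) - PACKET_LEN + 1, PACKET_LEN):
        pkt = ts[off:off + PACKET_LEN]
        if pkt[0] != SYNC_BYTE:
            continue  # lost sync; a real analyzer would resynchronize here
        # PID: 13 bits spread over header bytes 1 and 2.
        pids.append(((pkt[1] & 0x1F) << 8) | pkt[2])
    return pids

# Two fabricated packets: PID 0x0000 (the Program Association Table) and PID 0x0100.
pat = bytes([SYNC_BYTE, 0x40, 0x00]) + bytes(185)
pes = bytes([SYNC_BYTE, 0x41, 0x00]) + bytes(185)
print(parse_pids(pat + pes))   # -> [0, 256]
```

A signal-analysis tool of the kind described would build on exactly this layer, then parse the DVB service information tables carried in specific PIDs.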
Abstract:
Early visual processing stages have been demonstrated to be impaired in schizophrenia patients and their first-degree relatives. The amplitude and topography of the P1 component of the visual evoked potential (VEP) are both affected; the latter of which indicates alterations in active brain networks between populations. At least two issues remain unresolved. First, the specificity of this deficit (and suitability as an endophenotype) has yet to be established, with evidence for impaired P1 responses in other clinical populations. Second, it remains unknown whether schizophrenia patients exhibit intact functional modulation of the P1 VEP component; an aspect that may assist in distinguishing effects specific to schizophrenia. We applied electrical neuroimaging analyses to VEPs from chronic schizophrenia patients and healthy controls in response to variation in the parafoveal spatial extent of stimuli. Healthy controls demonstrated robust modulation of the VEP strength and topography as a function of the spatial extent of stimuli during the P1 component. By contrast, no such modulations were evident at early latencies in the responses from patients with schizophrenia. Source estimations localized these deficits to the left precuneus and medial inferior parietal cortex. These findings provide insights on potential underlying low-level impairments in schizophrenia.
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually made between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. Indeed, a dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far. In the case of imperfect information, by contrast, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. First, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Second, solution concepts are developed to solve models of games in the sense of identifying the choices that should be made by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character.
This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states on which players base their decisions are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated. Also, Aumann's sufficient conditions for backward induction are presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered.
In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
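Backward induction, the solution concept recurring throughout the chapters above, can be sketched directly on a small perfect-information game tree. The tree below is hypothetical, chosen only to exercise the algorithm; it is not an example from the thesis.

```python
# A node is either a payoff tuple (leaf) or a pair (player, {action: subtree}).

def backward_induction(node):
    """Return (payoff profile, path of optimal actions) for a game tree."""
    if all(isinstance(x, (int, float)) for x in node):
        return node, []                       # leaf: payoffs, empty path
    player, actions = node
    best_action, best_payoffs, best_path = None, None, None
    for action, subtree in actions.items():
        payoffs, path = backward_induction(subtree)
        # The player moving at this node keeps the action maximizing
        # their own component of the payoff profile.
        if best_payoffs is None or payoffs[player] > best_payoffs[player]:
            best_action, best_payoffs, best_path = action, payoffs, path
    return best_payoffs, [best_action] + best_path

# Player 0 moves first, player 1 second; leaves are payoff pairs (u0, u1).
game = (0, {
    "L": (1, {"l": (2, 1), "r": (0, 0)}),
    "R": (1, {"l": (1, 3), "r": (3, 1)}),
})
print(backward_induction(game))   # -> ((2, 1), ['L', 'l'])
```

The epistemic programme asks under which assumptions about the players' knowledge and beliefs rational players actually follow the path this procedure computes; Aumann's sufficient conditions, discussed in Chapter 1, are one such characterization.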
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
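The core step, estimating a conditional moment E[y | x] from a long simulation by kernel smoothing rather than simple averaging, can be illustrated with a standard Nadaraya-Watson estimator on a toy model (a hypothetical stand-in, not the paper's estimator or data-generating process).

```python
import numpy as np

rng = np.random.default_rng(1)

def kernel_moment(x_sim, y_sim, x0, h):
    """Gaussian-kernel (Nadaraya-Watson) estimate of E[y | x = x0]."""
    w = np.exp(-0.5 * ((x_sim - x0) / h) ** 2)   # kernel weights
    return np.sum(w * y_sim) / np.sum(w)

# A long simulation of a toy model at a trial parameter value:
# y = x**2 + noise, with x drawn uniformly.
x_sim = rng.uniform(-2, 2, 100_000)
y_sim = x_sim ** 2 + rng.normal(0, 0.1, x_sim.size)

est = kernel_moment(x_sim, y_sim, x0=1.0, h=0.05)
print(f"estimated E[y | x=1]: {est:.3f}")   # true value is 1.0
```

Notice that no draw needs to hit x = 1 exactly: the kernel borrows strength from nearby simulated points, which is what lets the approach dispense with simulating conditional on the conditioning information.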
Dynamic Stackelberg game with risk-averse players: optimal risk-sharing under asymmetric information
Abstract:
The objective of this paper is to clarify the interactive nature of the leader-follower relationship when both players are endogenously risk-averse. The analysis is placed in the context of a dynamic closed-loop Stackelberg game with private information. The case of a risk-neutral leader, very often discussed in the literature, is only a borderline possibility in the present study. Each player in the game is characterized by a risk-averse type which is unknown to his opponent. The goal of the leader is to implement an optimal incentive compatible risk-sharing contract. The proposed approach provides a qualitative analysis of adaptive risk behavior profiles for asymmetrically informed players in the context of dynamic strategic interactions modelled as incentive Stackelberg games.
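The leader-follower logic at the heart of any Stackelberg game can be sketched in a much simpler deterministic setting than the paper's risk-sharing model: the classic quantity-competition example, in which the leader optimizes while anticipating the follower's best response. All numbers below are hypothetical.

```python
# Inverse demand P = a - b*(q1 + q2); both firms have unit cost c.
a, b, c = 10.0, 1.0, 2.0

def follower_best_response(q1):
    # argmax over q2 of (a - b*(q1 + q2) - c) * q2, truncated at zero.
    return max(0.0, (a - c - b * q1) / (2 * b))

def leader_profit(q1):
    # The leader internalizes the follower's reaction before choosing q1.
    return (a - b * (q1 + follower_best_response(q1)) - c) * q1

# Grid search over the leader's quantity (closed form: q1* = (a - c) / (2b)).
grid = [i / 1000 for i in range(0, 8001)]
q1_star = max(grid, key=leader_profit)
q2_star = follower_best_response(q1_star)
print(q1_star, q2_star)   # -> 4.0 2.0
```

The paper's setting replaces these profit functions with utilities of unknown risk-averse types and a closed-loop information structure, but the backward-solving structure, follower's response first, leader's optimization second, is the same.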
Abstract:
Multiplier analysis based upon the information contained in Leontief's inverse is undoubtedly part of the core of the input-output methodology, and numerous applications and extensions have been developed that exploit its informational content. Nonetheless, there are some implicit theoretical assumptions whose implications have perhaps not been fully assessed. This is the case of the 'excess capacity' assumption. Under this assumption, resources are available as needed to adjust production to new equilibrium states. In real world applications, however, new resources are scarce and costly. Supply constraints kick in, and hence resource allocation needs to take them into account to really assess the effect of government policies. Using a closed general equilibrium model that incorporates supply constraints, we perform some simple numerical exercises and proceed to derive a 'constrained' multiplier matrix that can be compared with the standard 'unrestricted' multiplier matrix. Results show that the effectiveness of expenditure policies hinges critically on whether or not supply constraints are considered.
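The standard 'unrestricted' multiplier matrix referred to above is the Leontief inverse (I - A)^(-1), which embodies the excess-capacity assumption because output expands freely to meet any final demand. A minimal numerical sketch with a hypothetical two-sector coefficient matrix:

```python
import numpy as np

# Hypothetical technical coefficients: a_ij = input from sector i
# required per unit of output of sector j.
A = np.array([[0.2, 0.3],
              [0.4, 0.1]])

# Leontief inverse: x = L @ f maps final demand f to gross output x.
L = np.linalg.inv(np.eye(2) - A)

# Standard output multipliers: column sums of L give the total output
# generated per extra unit of final demand for each sector.
multipliers = L.sum(axis=0)
print(np.round(L, 3))            # -> [[1.5  0.5], [0.667 1.333]]
print(np.round(multipliers, 3))  # -> [2.167 1.833]
```

A 'constrained' multiplier matrix of the kind the paper derives would replace this unconditional inversion with an equilibrium computation in which some sectors cannot expand, so the effective multipliers fall below these column sums.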
Abstract:
We propose a definition of egalitarian equivalence that extends Pazner and Schmeidler's (1978) concept to environments with incomplete information. If every feasible allocation rule can be implemented by an incentive compatible mechanism (as, for instance, in the case of non-exclusive information), then interim egalitarian equivalence and interim incentive efficiency remain compatible, as they were under complete information. When incentive constraints are more restrictive, on the other hand, the two criteria may become incompatible.
Abstract:
OBJECTIVE: To reach a consensus on the clinical use of ambulatory blood pressure monitoring (ABPM). METHODS: A task force on the clinical use of ABPM wrote this overview in preparation for the Seventh International Consensus Conference (23-25 September 1999, Leuven, Belgium). This article was amended to account for opinions aired at the conference and to reflect the common ground reached in the discussions. POINTS OF CONSENSUS: The Riva Rocci/Korotkoff technique, although it is prone to error, is easy and cheap to perform and remains worldwide the standard procedure for measuring blood pressure. ABPM should be performed only with properly validated devices as an accessory to conventional measurement of blood pressure. Ambulatory recording of blood pressure requires considerable investment in equipment and training and its use for screening purposes cannot be recommended. ABPM is most useful for identifying patients with white-coat hypertension (WCH), also known as isolated clinic hypertension, which is arbitrarily defined as a clinic blood pressure of more than 140 mmHg systolic or 90 mmHg diastolic in a patient with daytime ambulatory blood pressure below 135 mmHg systolic and 85 mmHg diastolic. Some experts consider a daytime blood pressure below 130 mmHg systolic and 80 mmHg diastolic optimal. Whether WCH predisposes subjects to sustained hypertension remains debated. However, outcome is better correlated to the ambulatory blood pressure than it is to the conventional blood pressure. Antihypertensive drugs lower the clinic blood pressure in patients with WCH but not the ambulatory blood pressure, and also do not improve prognosis. Nevertheless, WCH should not be left unattended. If no previous cardiovascular complications are present, treatment could be limited to follow-up and hygienic measures, which should also account for risk factors other than hypertension. 
ABPM is superior to conventional measurement of blood pressure not only for selecting patients for antihypertensive drug treatment but also for assessing the effects both of non-pharmacological and of pharmacological therapy. The ambulatory blood pressure should be reduced by treatment to below the thresholds applied for diagnosing sustained hypertension. ABPM makes the diagnosis and treatment of nocturnal hypertension possible and is especially indicated for patients with borderline hypertension, the elderly, pregnant women, patients with treatment-resistant hypertension and patients with symptoms suggestive of hypotension. In centres with sufficient financial resources, ABPM could become part of the routine assessment of patients with clinic hypertension. For patients with WCH, it should be repeated at annual or 6-monthly intervals. Variation of blood pressure throughout the day can be monitored only by ABPM, but several advantages of the latter technique can also be obtained by self-measurement of blood pressure, a less expensive method that is probably better suited to primary practice and use in developing countries. CONCLUSIONS: ABPM or equivalent methods for tracing the white-coat effect should become part of the routine diagnostic and therapeutic procedures applied to treated and untreated patients with elevated clinic blood pressures. Results of long-term outcome trials should better establish the advantage of further integrating ABPM as an accessory to conventional sphygmomanometry into the routine care of hypertensive patients and should provide more definite information on the long-term cost-effectiveness. Because such trials are not likely to be funded by the pharmaceutical industry, governments and health insurance companies should take responsibility in this regard.
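The operational definition of white-coat hypertension quoted in the consensus points translates directly into code. The sketch below encodes only the thresholds stated above (clinic above 140/90 mmHg with daytime ambulatory below 135/85 mmHg); the function name and interface are illustrative, not part of the consensus document.

```python
def is_white_coat_hypertension(clinic_sys, clinic_dia, day_sys, day_dia):
    """Apply the consensus definition: elevated clinic BP (mmHg) together
    with normal daytime ambulatory BP (mmHg)."""
    clinic_elevated = clinic_sys > 140 or clinic_dia > 90
    ambulatory_normal = day_sys < 135 and day_dia < 85
    return clinic_elevated and ambulatory_normal

print(is_white_coat_hypertension(150, 95, 128, 80))   # -> True
print(is_white_coat_hypertension(150, 95, 142, 88))   # -> False (sustained)
```

The text also notes a stricter optimal daytime threshold (130/80 mmHg) favoured by some experts, which would simply tighten the second condition.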
Application of standard and refined heat balance integral methods to one-dimensional Stefan problems
Abstract:
The work in this paper concerns the study of conventional and refined heat balance integral methods for a number of phase change problems. These include standard test problems, with both one and two phase changes, which have exact solutions that enable us to test the accuracy of the approximate solutions. We also consider situations where no analytical solution is available and compare the approximations to numerical solutions. It is popular to use a quadratic profile as an approximation of the temperature, but we show that a cubic profile, seldom considered in the literature, is far more accurate in most circumstances. In addition, the refined integral method can give still greater improvement, and we develop a variation on this method which turns out to be optimal in some cases. We assess which integral method is best for various problems, showing that the answer depends largely on the specified boundary conditions.
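The exact solutions used as benchmarks in such studies include the classical Neumann similarity solution of the one-phase Stefan problem, in which the phase-change front moves as s(t) = 2*lam*sqrt(alpha*t) and lam solves a transcendental equation in the Stefan number St. A short sketch (a standard textbook result, not code from the paper) computes lam by bisection:

```python
import math

def neumann_lambda(stefan, lo=1e-9, hi=5.0, tol=1e-12):
    """Solve lam * exp(lam**2) * erf(lam) = St / sqrt(pi) by bisection.

    This is the transcendental equation of the Neumann solution for the
    one-phase Stefan problem with a fixed-temperature boundary.
    """
    f = lambda lam: (lam * math.exp(lam ** 2) * math.erf(lam)
                     - stefan / math.sqrt(math.pi))
    while hi - lo > tol:           # f is increasing: keep the sign-change bracket
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Sanity check: for small Stefan number, lam is close to sqrt(St / 2).
St = 0.1
lam = neumann_lambda(St)
print(f"lambda = {lam:.6f}  (small-St approximation sqrt(St/2) = {math.sqrt(St / 2):.6f})")
```

Heat balance integral methods with quadratic or cubic profiles yield their own approximations to lam, and comparing them against this exact value is precisely the kind of accuracy test the paper performs.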