884 results for Problem analysis
Abstract:
Some past studies analyzed Spanish monetary policy with the standard VAR. The problem is that this method obliges researchers to impose a certain extreme form of the short-run policy rule on their models, and hence it does not allow them to study possible structural changes in this rule. This paper overcomes these problems by using a structural VAR. I find that the rule has always been one of partial accommodation: prior to 1984 it was quite close to money targeting, and after 1984 it moved closer to interest rate targeting, with more emphasis on the exchange rate.
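For orientation, here is a minimal sketch of how short-run restrictions are imposed in a structural VAR using Python's statsmodels. The three series, the lag order, and the identification pattern are invented for illustration; they are not the paper's actual specification.

```python
# Illustrative sketch of a structural VAR with short-run restrictions
# (statsmodels). Series names and the identification pattern are made up;
# they do NOT reproduce the paper's model.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.svar_model import SVAR

rng = np.random.default_rng(0)
# Fake monthly data: money growth, short-term interest rate, exchange rate
data = pd.DataFrame(rng.normal(size=(200, 3)).cumsum(axis=0),
                    columns=["money", "rate", "fx"]).diff().dropna()

# Contemporaneous-impact matrix A: 'E' marks coefficients left free to be
# estimated, so the policy rule need not be forced into an extreme
# (pure money- or pure rate-targeting) form. This pattern is illustrative.
A = np.asarray([[1, 0, 0],
                ['E', 1, 'E'],   # policy rule: rate responds to money and fx
                ['E', 'E', 1]])

res = SVAR(data, svar_type='A', A=A).fit(maxlags=4)
print(res.A)        # estimated contemporaneous restrictions
res.irf(24).plot()  # structural impulse responses
```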
Abstract:
This paper introduces the use of Total Unduplicated Reach and Frequency (TURF) analysis to design a product line through a binary linear programming model. This improves the efficiency of the search for a solution compared with the algorithms used to date. We present the results obtained with our exact algorithm, which proves extremely efficient both in reaching optimal solutions and in computing time, even for very large instances of the problem at hand. Furthermore, the proposed technique allows the model to be extended to overcome the main drawbacks of TURF analysis in practice.
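As an illustration of the kind of formulation involved, here is a hedged sketch of the classic TURF reach-maximization problem written as a binary linear program and solved with PuLP/CBC. The preference matrix and line size k are random toy inputs; the paper's exact model and its extensions may differ.

```python
# Sketch: TURF reach maximization as a binary linear program (PuLP/CBC).
# Toy data only; not the paper's exact formulation.
import numpy as np
import pulp

rng = np.random.default_rng(1)
n_resp, n_prod, k = 50, 10, 3
a = rng.random((n_resp, n_prod)) < 0.25   # a[i, j]: respondent i would buy product j

prob = pulp.LpProblem("TURF", pulp.LpMaximize)
x = [pulp.LpVariable(f"x{j}", cat="Binary") for j in range(n_prod)]   # product in line?
y = [pulp.LpVariable(f"y{i}", cat="Binary") for i in range(n_resp)]   # respondent reached?

prob += pulp.lpSum(y)        # maximize unduplicated reach
prob += pulp.lpSum(x) == k   # product-line size
for i in range(n_resp):
    # respondent i counts as reached only if some chosen product appeals to them
    prob += y[i] <= pulp.lpSum(x[j] for j in range(n_prod) if a[i, j])

prob.solve(pulp.PULP_CBC_CMD(msg=False))
line = [j for j in range(n_prod) if x[j].value() == 1]
print("chosen products:", line, "reach:", int(pulp.value(prob.objective)))
```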
Abstract:
OBJECTIVES: To provide a global, up-to-date picture of the prevalence, treatment, and outcomes of Candida bloodstream infections in intensive care unit patients, and to compare Candida with bacterial bloodstream infections. DESIGN: A retrospective analysis of the Extended Prevalence of Infection in the ICU study (EPIC II). Demographic, physiological, infection-related and therapeutic data were collected. Patients were grouped as having Candida, Gram-positive, Gram-negative, or combined Candida/bacterial bloodstream infections. Outcome data were assessed at intensive care unit and hospital discharge. SETTING: EPIC II included 1265 intensive care units in 76 countries. PATIENTS: Patients present in participating intensive care units on the study day. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Of the 14,414 patients in EPIC II, 99 had Candida bloodstream infections, for a prevalence of 6.9 per 1000 patients. Sixty-one patients had candidemia alone and 38 had combined bloodstream infections. Candida albicans (n = 70) was the predominant species. Primary therapy included monotherapy with fluconazole (n = 39), caspofungin (n = 16), or a polyene-based product (n = 12). Combination therapy was infrequently used (n = 10). Compared with patients with Gram-positive (n = 420) and Gram-negative (n = 264) bloodstream infections, patients with candidemia were more likely to have solid tumors (p < .05) and appeared to have been in the intensive care unit longer (14 days [range, 5-25 days] vs. 8 days [range, 3-20 days] and 10 days [range, 2-23 days], respectively), although this difference was not statistically significant. Severity of illness and organ dysfunction scores were similar between groups. Patients with Candida bloodstream infections had the highest crude intensive care unit mortality rate (42.6% vs. 25.3% for Gram-positive and 29.1% for Gram-negative bloodstream infections) and the longest intensive care unit length of stay (median [interquartile range], 33 days [18-44] vs. 20 days [9-43] and 21 days [8-46]); however, these differences were not statistically significant. CONCLUSION: Candidemia remains a significant problem in intensive care unit patients. In the EPIC II population, Candida albicans was the most common organism and fluconazole the predominant antifungal agent used. Candida bloodstream infections are associated with high intensive care unit and hospital mortality rates and resource use.
Abstract:
Introduction: Emergency services (ES) are often faced with agitated, confused or aggressive patients. Such situations may require physical restraint. The prevalence of these measures is poorly documented, concerning 1 to 10% of patients admitted to the ES. The indications for restraint, the context and the related complications are poorly studied. The emergency service and the security service of our hospital have documented physical restraint for several years, using specific protocols integrated into the medical records. The study evaluated the magnitude of the problem, the patient characteristics, and the degree of adherence to the restraint protocol. Methods: Retrospective study of physical restraint used on adult patients in the ES in 2009. The study included analysis of medical and demographic characteristics, indications justifying restraint, and quality of restraint documentation. Patients were identified from computerized ES and security service records. The data were supplemented by examination of patients' medical records. Results: In 2009, according to the security service, 390 patients (1%) were physically restrained in the ES. The ES computerized system identified only 196 patients. Most patients were male (62%). The median age was 40 years (15-98 years; P90 = 80 years). 63% of the situations occurred between 18h00 and 6h00, and most frequently on Saturday (19%). Substance or alcohol abuse was present in 48.7% of cases, and an acute psychiatric crisis was mentioned in 16.7%. In most cases, restraint was motivated by extreme agitation or auto-/hetero-aggressive violence. Most patients (68%) were restrained with upper-limb and abdominal restraints. More than three anatomic restraints were necessary in 52% of the patients. Intervention of security guards was required in 77% of the cases. 61 restraint protocols (31%) were missing and 57% of the records were incomplete. In many cases, the protocols did not include the signature of the physician (22%) or of the nurse (43.8%). Medical records analysis did not allow reliable estimation of the number of restraint-induced complications. Conclusions: Physical restraint is most often motivated by major agitation and/or secondary to substance abuse. Caregivers regularly call security guards for help. Restraint documentation is often missing or incomplete, requiring major improvement in education and prescription.
Abstract:
This article builds on the recent policy diffusion literature and attempts to overcome one of its major problems, namely the lack of a coherent theoretical framework. The literature defines policy diffusion as a process in which policy choices are interdependent, and identifies several diffusion mechanisms that specify the link between the policy choices of the various actors. As these mechanisms are grounded in different theories, theoretical accounts of diffusion currently have little internal coherence. In this article we put forward an expected-utility model of policy change that is able to subsume all the diffusion mechanisms. We argue that the expected utility of a policy depends on both its effectiveness and the payoffs it yields, and we show that the various diffusion mechanisms operate by altering these two parameters. Each mechanism affects one of the two parameters, and does so in distinct ways. To account for aggregate patterns of diffusion, we embed our model in a simple threshold model of diffusion. Given the high complexity of the resulting process, strong analytical conclusions on aggregate patterns cannot be drawn without more extensive analysis, which is beyond the scope of this article. However, preliminary considerations indicate that a wide range of diffusion processes may exist and that convergence is only one possible outcome.
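To make the aggregate step concrete, the following toy simulation implements a Granovetter-style threshold model of the kind the article embeds its expected-utility model in: each actor adopts once the share of prior adopters exceeds a private threshold, which stands in here for the expected-utility comparison. Thresholds and seed adopters are invented; this is not the article's calibrated model.

```python
# Toy threshold-model cascade: adoption spreads until a fixed point is
# reached; depending on the threshold distribution, the process may stall
# well short of full convergence.
import numpy as np

rng = np.random.default_rng(2)
theta = rng.uniform(0, 1, size=500)   # private adoption thresholds
theta[:25] = 0.0                      # a few unconditional adopters seed the cascade

adopted = theta <= 0.0
trajectory = [adopted.mean()]
while True:
    new = theta <= trajectory[-1]     # adopt once enough others already have
    if new.sum() == adopted.sum():    # fixed point: no further adoptions
        break
    adopted = new
    trajectory.append(adopted.mean())
print([round(s, 3) for s in trajectory])  # one possible aggregate diffusion path
```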
Abstract:
The speed and width of front solutions to reaction-dispersal models are analyzed both analytically and numerically. We perform our analysis for Laplace and Gaussian distribution kernels, for both delayed and nondelayed models. The results are discussed in terms of the characteristic parameters of the models.
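As background for how such speeds are typically obtained, the standard linear-spreading (marginal stability) calculation for a nondelayed kernel-based model runs as follows. The specific model form and notation are assumptions of this sketch; the paper's delayed models modify the dispersion relation.

```latex
% Assumed nondelayed reaction-dispersal model, linearized about u = 0:
%   \partial_t u = r\,u + \int K(y)\,[u(x-y,t) - u(x,t)]\,dy .
% The exponential-front ansatz u \sim e^{\lambda(vt - x)} gives the
% dispersion relation and the selected (minimal) front speed:
\[
  \lambda v = r + \hat{K}(\lambda) - 1,
  \qquad
  v = \min_{\lambda > 0} \frac{r + \hat{K}(\lambda) - 1}{\lambda},
  \qquad
  \hat{K}(\lambda) = \int_{-\infty}^{\infty} K(y)\, e^{\lambda y}\, dy .
\]
% Moment generating functions of the two kernels considered:
\[
  \text{Laplace } K(y) = \tfrac{1}{2b} e^{-|y|/b}:\quad
  \hat{K}(\lambda) = \frac{1}{1 - b^{2}\lambda^{2}} \;\;(\lambda < 1/b),
  \qquad
  \text{Gaussian } K(y) = \tfrac{1}{\sigma\sqrt{2\pi}} e^{-y^{2}/2\sigma^{2}}:\quad
  \hat{K}(\lambda) = e^{\sigma^{2}\lambda^{2}/2}.
\]
```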
Abstract:
ABSTRACT: Type 1 diabetes is defined as a metabolic disorder of autoimmune origin that leads to the progressive and selective destruction of the insulin-secreting pancreatic ß-cell. The disease accounts for 10% of diabetes cases recorded in the world population and affects young people under 20 years of age. Medical treatment by insulin therapy corrects the hormone deficiency but does not prevent the many complications the disease causes, such as cardiac, neurological, renal and retinal damage, and amputations. Replacing the ß-cell by transplantation of islets of Langerhans is a promising alternative to the medical treatment of type 1 diabetes. However, islet grafting is still an experimental treatment and does not achieve effective long-term control of glycemia in transplanted patients, and the reasons for this failure remain poorly understood. The immediate obstacle is purifying a sufficient number of viable islets, together with the massive loss of these islets in the first hours after grafting. This almost systematic tendency of the graft to lose its function immediately after transplantation is known as "primary graft non-function" (PNF). Indeed, the islet isolation procedure destroys cellular and non-cellular components of the pancreatic tissue that play a decisive role in islet survival. Moreover, transplantation itself exposes the cells to various stresses, notably stress from inflammatory cytokines, which promotes cell death by apoptosis and subsequently provokes graft rejection. Together, these mechanisms lead to a loss of islet mass estimated at more than 60%. In this context, we set out to define the major stress pathways that govern this massive loss of islets by apoptosis during the isolation process and upon immediate exposure to cytokines. Taken together, our results indicate that several intracellular signaling pathways are recruited and are maximally activated very early, during the first phases of isolation. Culturing the islets for two days allows the activated pathways to return to basal levels. We therefore propose a protective strategy that should 1) be initiated as early as possible during pancreatic islet isolation, 2) probably block the activation of the different stress pathways identified in our study, and 3) include culturing the purified islets for two days after isolation and before transplantation.
LAY SUMMARY: Diabetes is a disease that leads to an abnormally high level of sugar (glucose) in the blood, due to the failure of the endocrine pancreas to produce insulin, a hormone that regulates glycemia (the blood glucose level). Two major types of diabetes are distinguished. Type 1 diabetes, also called juvenile or lean diabetes, often appears during childhood and results in an absolute insulin deficiency. Type 2 diabetes, or fat diabetes, is the most frequent form; it affects subjects over 40 years of age who suffer from obesity and is characterized by ß-cell dysfunction with an inability to regulate glycemia despite insulin production. In type 1 diabetes, the destruction of the ß-cell is programmed (apoptosis) and is mainly caused by inflammatory mediators called cytokines, produced locally by inflammatory cells of the immune system that invade the pancreatic ß-cells. Cytokines activate various signaling pathways, among them the Mitogen-Activated Protein Kinase (MAPK) pathways, comprising three MAPK families (ERK1/2, p38 and JNK), and the NF-κB pathway. Medical treatment with daily insulin injections controls glycemia but does not prevent the many secondary complications of the disease. Transplantation of islets of Langerhans is a possible alternative to medical treatment, considered advantageous compared with transplantation of the whole pancreas: embolization of the islets into the liver by intraportal injection is a simple intervention without major complications. Nevertheless, the islet preparation technique impairs endocrine function and causes a massive loss of pancreatic islets. Moreover, transplantation itself exposes the ß-cell to various stresses, notably stress from inflammatory cytokines, which provokes rejection of the cellular graft. With the aim of increasing the yield of purified islets, we set out to define the major stress pathways that govern this massive islet loss during the isolation process and upon immediate exposure to cytokines after transplantation. Taken together, these results indicate that the stress induced during islet isolation and by cytokines recruits different intracellular signaling pathways (JNK, p38 and NF-κB) whose effects combine to impair islet function and viability. A strategy should therefore be devised to block any synergistic action between these activated pathways, in order to improve ß-cell viability and function in the cellular graft.
SUMMARY: Type 1 diabetes mellitus (T1DM) is an autoimmune disease characterized by the progressive and selective destruction of the pancreatic ß-cells that secrete insulin, leading to absolute insulin deficiency. T1DM accounts for about 10% of all diabetes cases, affecting persons younger than 20 years of age. Medical treatment using daily exogenous insulin injections corrects the hormone deficiency but does not prevent the devastating complications caused by the disease, such as heart attack, neuropathy, kidney failure, blindness, and amputation. Pancreatic islet transplantation (PIT) is one strategy that holds promise to cure patients with T1DM, but purified pancreatic islet grafts have failed to maintain long-term glucose homeostasis in human recipients, and the reasons for this failure are still poorly understood. There is, however, a more immediate problem with islet grafting: poor islet recovery from donors and early islet loss during the first hours after grafting. This tendency of islet grafts to fail to function within a short period after transplantation is termed primary graft non-function (PNF). Indeed, the islet isolation procedure itself destroys cellular and non-cellular components of the pancreas that may play a role in supporting islet survival. Further, islet transplantation exposes cells to a variety of stressful stimuli, notably pro-inflammatory cytokines, that encourage ß-cell death by apoptosis and lead to early graft failure. Altogether, these mechanisms lead to an estimated loss of 60% of the total islet mass. Here, we have mapped the major intracellular stress signaling pathways that may mediate human islet loss by apoptosis during isolation and following cytokine attack. We found that several stress pathways are maximally activated from the earliest stages of the isolation procedure. Culturing islets for two days allows the activated pathways to return to basal levels. We propose that protective strategies should 1) be initiated as early as possible during islet isolation, 2) probably target the activated stress pathways that we uncovered in our studies, and 3) include culturing islets for two days post-isolation and prior to transplantation.
Abstract:
We present a novel approach for analyzing single-trial electroencephalography (EEG) data using topographic information. The method allows event-related potentials to be visualized using all recording electrodes, overcoming the problem of previous approaches that required electrode selection and waveform filtering. We apply this method to EEG data from an auditory object recognition experiment that we had previously analyzed at an ERP level. Temporally structured periods in which a given topography predominated were identified statistically, without any prior information about the temporal behavior. In addition to providing novel methods for EEG analysis, the data indicate that ERPs are reliably observable at the single-trial level when examined topographically.
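A minimal sketch of the topographic single-trial idea, for orientation: each trial/time topography is average-referenced, normalized by its global field power, and compared with template maps by spatial correlation. The array shapes, random data, and template maps below are invented; the published method's clustering and statistics are more elaborate.

```python
# Sketch: GFP-normalized spatial correlation of single-trial topographies
# with template maps. Toy data only.
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_elec, n_times = 40, 64, 200
eeg = rng.normal(size=(n_trials, n_elec, n_times))   # fake single-trial EEG
templates = rng.normal(size=(2, n_elec))             # fake template topographies

def normalize(maps, axis):
    # average-reference each map, then scale it to unit global field power (GFP)
    maps = maps - maps.mean(axis=axis, keepdims=True)
    return maps / maps.std(axis=axis, keepdims=True)

eeg_n = normalize(eeg, axis=1)        # normalize over electrodes at each time point
tmpl_n = normalize(templates, axis=1)

# spatial correlation of every (trial, time) topography with every template
corr = np.einsum('aet,ke->akt', eeg_n, tmpl_n) / n_elec
labels = corr.argmax(axis=1)          # best-fitting template per trial and time point
print("template 0 dominates", (labels == 0).mean().round(2), "of trial-time points")
```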
Abstract:
To study the different temporal components of cancer mortality (age, period and cohort), methods of graphic representation were applied to Swiss mortality data from 1950 to 1984. Maps using continuous slopes ("contour maps"), based on eight tones of grey according to the absolute distribution of rates, were used to represent the surfaces defined by the matrix of age-specific rates. Further, progressively more complex regression surface equations were defined on the basis of two independent variables (age/cohort) and a dependent one (each age-specific mortality rate). General patterns of trends in cancer mortality were thus identified, permitting the definition of important cohort effects (e.g., upwards for lung and other tobacco-related neoplasms, downwards for stomach) or period effects (e.g., downwards for intestinal or thyroid cancers), besides the major underlying age component. For most cancer sites, even the lower-order (1st to 3rd) models provided excellent fits, allowing immediate identification of the residuals (e.g., high or low mortality points) as well as estimates of first-order interactions between the three factors, although the parameters of the main effects remained undetermined. Thus, the method should essentially be used as a summary guide to illustrate and understand the general patterns of age, period and cohort effects in (cancer) mortality, since it cannot conceptually solve the inherent problem of the identifiability of the three components.
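For concreteness, here is a small sketch of the regression-surface idea on synthetic data: a second-order surface in age and cohort is fitted to log-rates by least squares, and the residuals flag high or low mortality points. Since cohort = period − age by construction, the sketch also makes the identifiability problem visible. All numbers are invented.

```python
# Sketch: fit log(rate) ~ second-order surface in (age, cohort) to an
# age x period table of synthetic mortality rates.
import numpy as np

age = np.arange(30, 80, 5)                  # midpoints of age groups
period = np.arange(1950, 1985, 5)
A, P = np.meshgrid(age, period, indexing='ij')
C = P - A                                    # birth cohort (hence not identifiable
                                             # separately from age and period)
rate = np.exp(0.08 * A - 0.01 * (C - 1900)) * 1e-5   # fake age-specific rates
y = np.log(rate).ravel()

# design matrix for the surface: 1, a, c, a^2, ac, c^2 (standardized inputs)
a = (A.ravel() - A.mean()) / A.std()
c = (C.ravel() - C.mean()) / C.std()
X = np.column_stack([np.ones_like(a), a, c, a**2, a * c, c**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta                         # high/low mortality points stand out here
print(np.round(beta, 3), "max |residual|:", np.abs(resid).max().round(3))
```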
Abstract:
Tractography algorithms provide us with the ability to non-invasively reconstruct fiber pathways in the white matter (WM) by exploiting the directional information described with diffusion magnetic resonance. These methods can be divided into two major classes, local and global. Local methods reconstruct each fiber tract iteratively by considering only directional information at the voxel level and in its neighborhood. Global methods, on the other hand, reconstruct all the fiber tracts of the whole brain simultaneously by solving a global energy minimization problem. The latter have shown improvements over previous techniques, but these algorithms still suffer from an important shortcoming that is crucial in the context of brain connectivity analyses. As no anatomical priors are usually considered during the reconstruction process, the recovered fiber tracts are not guaranteed to connect cortical regions and, as a matter of fact, most of them stop prematurely in the WM; this violates important properties of neural connections, which are known to originate in the gray matter (GM) and develop in the WM. Hence, this shortcoming poses serious limitations for the use of these techniques in the assessment of the structural connectivity between brain regions and, de facto, can potentially bias any subsequent analysis. Moreover, the estimated tracts are not quantitative: every fiber contributes with the same weight to the predicted diffusion signal. In this work, we propose a novel approach for global tractography that is specifically designed for connectivity analysis applications and that: (i) explicitly enforces anatomical priors on the tracts in the optimization; and (ii) considers the effective contribution of each tract, i.e., its volume, to the acquired diffusion magnetic resonance imaging (MRI) image. We evaluated our approach on both a realistic diffusion MRI phantom and in vivo data, and compared its performance to existing tractography algorithms.
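To illustrate the "effective contribution" ingredient (ii) in isolation, here is a hedged sketch in which the acquired diffusion signal is modeled as a nonnegative combination of per-fiber predicted signals, and each candidate tract's weight (volume) is recovered by nonnegative least squares. The matrices are random stand-ins; the actual method couples this with anatomical priors inside a global optimization.

```python
# Sketch: estimate per-fiber weights by nonnegative least squares,
# y ~ A @ w with w >= 0. Toy data only.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)
n_meas, n_fibers = 300, 40
A = np.abs(rng.normal(size=(n_meas, n_fibers)))   # per-fiber predicted signals
w_true = np.zeros(n_fibers)
w_true[rng.choice(n_fibers, 8, replace=False)] = rng.uniform(0.5, 2.0, 8)
y = A @ w_true + 0.01 * rng.normal(size=n_meas)   # noisy "acquired" signal

w_hat, _ = nnls(A, y)                             # per-fiber weights >= 0
print("recovered supports:", np.nonzero(w_hat > 0.1)[0])
```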
Abstract:
This research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). A simple k-nearest neighbor algorithm is considered as a benchmark model. The PNN is a neural-network reformulation of well-known nonparametric principles of probability density modeling, using a kernel density estimator together with Bayesian optimal or maximum a posteriori decision rules. PNNs are well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs have been successfully applied to a variety of environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper, both simulated and real-data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the algorithms applied.
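Since a PNN is essentially a kernel density estimate per class combined with the Bayes (maximum posterior) decision rule, it can be sketched in a few lines. The toy 2-D data, the Gaussian kernel, and the single bandwidth sigma below are illustrative choices, not the paper's case-study settings.

```python
# Minimal PNN classifier: Parzen density per class + Bayes decision rule.
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.3, priors=None):
    classes = np.unique(y_train)
    if priors is None:
        priors = {c: np.mean(y_train == c) for c in classes}
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        # squared distances between test points and this class's training points
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        dens = np.exp(-d2 / (2 * sigma**2)).mean(axis=1)   # Parzen estimate
        scores.append(priors[c] * dens)                    # posterior ∝ prior * density
    return classes[np.argmax(scores, axis=0)]

rng = np.random.default_rng(5)
X0 = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
X1 = rng.normal([1.5, 1.5], 0.5, size=(100, 2))
X = np.vstack([X0, X1]); y = np.array([0] * 100 + [1] * 100)
print(pnn_predict(X, y, np.array([[0.1, 0.2], [1.4, 1.6]])))   # -> [0 1]
```

One bandwidth sigma is the single smoothing parameter of the network, which is what makes the quantification of accuracy and the integration of prior class probabilities straightforward.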
Abstract:
The authors discuss the results of the international literature on referrals between ambulatory physicians. There are still few studies on this problem, and the methodologies used are often too different to allow valid comparisons. However, the results obtained raise more questions than they answer about the determinants of the referral process. This can be explained by the multidimensionality of the factors involved in the decision to refer a patient to another practitioner, and particularly by the complex interaction between the characteristics of each patient, the practitioner, and the health care system itself.
Abstract:
The proportion of the population living in or around cities is larger than ever. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems such as air pollution, land waste or noise, as well as health problems, are the result of this still continuing process. Urban planners have to find solutions to these complex problems and, at the same time, ensure the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. To gain a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist and are still under development, and they can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance-circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed. A new approach to defining urban areas at different scales is developed, and the link with percolation theory established. Fractal statistics, especially the lacunarity measure, and scale laws are used to characterise urban clusters. In a last section, population evolution is modelled using a model close to the well-established gravity model (see the sketch after this abstract). The work covers quite a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision-making processes of urban planners.
The share of people living in urban regions is higher than ever and continues to grow. Urban sprawl and car dependence have supplanted the compact, pedestrian-friendly city. Air pollution, land waste, noise, and health problems for the inhabitants are the consequences. Urban planners must find, together with the whole of society, solutions to these complex problems, while at the same time ensuring the economic performance of the city and its region. At present, a growing quantity of socio-economic and environmental data is being collected. To better understand the processes and phenomena of the complex system that is the city, these data must be processed and analysed. Numerous methods for modelling and simulating such a system exist and are continually being developed; they can be exploited by the urban geographer to improve knowledge of the urban metabolism. Modern and innovative visualisation techniques help communicate the results of such models and simulations. This thesis describes several methods for analysing, modelling, simulating and visualising urban phenomena. The analysis of very high-dimensional socio-economic data using artificial neural networks, notably self-organising maps, is demonstrated through two examples at different scales. The problem of spatio-temporal modelling and data representation is discussed and some possible solutions are sketched. The simulation of urban dynamics, and more specifically of the car traffic generated by commuters, is illustrated with a multi-agent simulation. A section on visualisation methods presents anamorphic maps (cartograms) for transforming geographic space into a functional space. Another type of map, the distance-circle map, is presented; this type of map is particularly useful for urban agglomerations. Some questions related to the importance of scale in urban analysis are also discussed. A new approach for defining urban clusters at different scales is developed, and the link with percolation theory is established. Fractal statistics, notably lacunarity, are used to characterise these urban clusters. Population evolution is modelled using a model close to the well-known gravity model. The work covers a wide range of methods useful in urban geography. However, these methods still need to be developed further and, at the same time, must find their way into the daily life of urban planners.
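As a pointer to the final section's model family, here is a toy gravity-model computation: interaction between zones proportional to the product of their populations and decaying with distance. All numbers, the constant k, and the exponent beta are invented, and the thesis's population-evolution model is only described as close to this classic form.

```python
# Toy gravity model: flows T_ij = k * P_i * P_j / d_ij**beta between zones.
import numpy as np

rng = np.random.default_rng(6)
pop = rng.integers(1_000, 100_000, size=5).astype(float)   # zone populations
xy = rng.uniform(0, 50, size=(5, 2))                       # zone coordinates (km)
d = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)     # pairwise distances
np.fill_diagonal(d, np.inf)                                # no self-flows

k, beta = 1e-6, 2.0
T = k * pop[:, None] * pop[None, :] / d**beta              # predicted interaction matrix
print(np.round(T, 1))
```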