Resumo:
Malaria has always been an important public health problem in Brazil. The early history of Brazilian malaria and its control was driven by colonisation by Europeans and the forced relocation of Africans as slaves. Internal migration brought malaria to many regions in Brazil where, given suitable Anopheles mosquito vectors, it thrived. Almost from the start, officials recognised the problem malaria presented to economic development, but early control efforts were hampered by a still-developing public health infrastructure and ignorance of the underlying biology and ecology of malaria. Multiple regional and national malaria control efforts have been attempted with varying success. At present, the Amazon Basin accounts for 99% of Brazil’s reported malaria cases, with regional increases in incidence often associated with large-scale public works or migration. Here, we provide an exhaustive summary of the primary literature in English, Spanish and Portuguese regarding Brazilian malaria control. Our goal was not to interpret the history of Brazilian malaria control from a particular political or theoretical perspective, but rather to provide a straightforward, chronological narrative of the events that have transpired in Brazil over the past 200 years and to identify common themes.
Resumo:
Laboratory safety data are routinely collected in clinical studies for safety monitoring and assessment. We have developed a truncated robust multivariate outlier detection method for identifying subjects with clinically relevant abnormal laboratory measurements. The proposed method can be applied to historical clinical data to establish a multivariate decision boundary that can then be used for future clinical trial laboratory safety data monitoring and assessment. Simulations demonstrate that the proposed method has the ability to detect relevant outliers while automatically excluding irrelevant outliers. Two examples from actual clinical studies are used to illustrate the use of this method for identifying clinically relevant outliers.
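The abstract does not give the details of the proposed truncated robust method, but the general idea of trimming extreme points before estimating location and scatter, then flagging observations by a chi-square cutoff on the robust Mahalanobis distance, can be sketched as follows. The function name, the trimming fraction, and the synthetic data are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np
from scipy.stats import chi2

def truncated_robust_outliers(X, trim_frac=0.1, alpha=0.001):
    """Illustrative sketch: trim the most extreme points, re-estimate
    mean/covariance on the rest, then flag points whose squared
    Mahalanobis distance exceeds a chi-square cutoff."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    # initial classical estimates
    mu = X.mean(axis=0)
    inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', X - mu, inv, X - mu)
    # truncate: drop the most extreme trim_frac of points, re-estimate
    keep = np.argsort(d2)[: int(np.ceil(n * (1 - trim_frac)))]
    mu_r = X[keep].mean(axis=0)
    inv_r = np.linalg.inv(np.cov(X[keep], rowvar=False))
    d2_r = np.einsum('ij,jk,ik->i', X - mu_r, inv_r, X - mu_r)
    return d2_r > chi2.ppf(1 - alpha, df=p)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))        # stand-in for lab safety data
X[0] = [15.0, 15.0, 15.0]            # one clinically extreme subject
flags = truncated_robust_outliers(X)
```

A real implementation would typically use a high-breakdown estimator such as the minimum covariance determinant rather than this single trimming pass.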
Resumo:
How can we best understand the emergence of the European Security and Defence Policy (ESDP)? This paper applies the theories of historical institutionalism and experiential learning to offer a dynamic conceptualisation of the moves towards an ESDP, one that highlights causal factors a more temporally restricted analysis would miss. It first shows how the institutional and functional expansion of European Political Cooperation (EPC) over the course of the 1970s and 80s gave rise to a context in which the development of a security and defence dimension came to be viewed as more logical and even necessary. It then analyses some of the external factors (in the form of actors, events and institutions) that further pushed in this direction and influenced the policy’s subsequent evolution. The paper is therefore intended as a first step towards understanding the ESDP’s development from this perspective.
Resumo:
The work presented in this paper belongs to the power quality area and deals with voltage sags in power transmission and distribution systems. Propagating throughout the power network, voltage sags can cause many problems for domestic and industrial loads, at substantial financial cost. To impose penalties on the responsible party and to improve monitoring and mitigation strategies, sags must be located in the power network. With this objective, this paper proposes a new method for associating a sag waveform with its origin in transmission and distribution networks. It solves this problem by developing hybrid methods that use multiway principal component analysis (MPCA) as a dimension reduction tool. MPCA re-expresses sag waveforms in a new subspace using just a few scores. We train several well-known classifiers with these scores and use them to classify future sags. The capabilities of the proposed method for dimension reduction and classification are examined using real data gathered from three substations in Catalonia, Spain. The classification rates obtained demonstrate the effectiveness of the developed hybrid methods as new tools for sag classification.
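The pipeline described — reduce each multichannel sag waveform to a few scores, then train a standard classifier on those scores — can be sketched on synthetic data. MPCA is commonly applied by unfolding each multiway observation into a single row before PCA, which is what this sketch does; the waveform model, class definitions, and classifier choice here are illustrative assumptions, not the paper's actual data or method.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in: two sag classes (shallow vs deep), each waveform
# recorded on 3 phases x 100 time samples.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 100)

def make_sags(depth, n):
    base = np.sin(2 * np.pi * 5 * t)
    sag = base.copy()
    sag[40:60] *= depth                       # voltage dip in mid-window
    return np.stack([np.tile(sag, (3, 1)) + 0.05 * rng.normal(size=(3, 100))
                     for _ in range(n)])

X = np.concatenate([make_sags(0.3, 30), make_sags(0.7, 30)])   # (60, 3, 100)
y = np.array([0] * 30 + [1] * 30)

# "Multiway" step: unfold each 3x100 waveform into one row, then PCA
# compresses each sag to just a few scores.
Xu = X.reshape(len(X), -1)
scores = PCA(n_components=3).fit_transform(Xu)

# Train a simple classifier on the scores, evaluate on held-out sags.
clf = KNeighborsClassifier(3).fit(scores[::2], y[::2])
acc = clf.score(scores[1::2], y[1::2])
```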
Resumo:
Objectives. The goal of this study was to evaluate a T2-mapping sequence by: (1) measuring the intra- and inter-observer variability in healthy volunteers across two separate scanning sessions with a T2 reference phantom; (2) measuring the mean T2 relaxation times by T2-mapping in infarcted myocardium in patients with subacute MI and comparing them with the gold standard, X-ray coronary angiography, and with results in healthy volunteers. Background. Myocardial edema is a consequence of tissue inflammation, as seen in myocardial infarction (MI). It can be visualized by cardiovascular magnetic resonance (CMR) imaging using the T2 relaxation time. T2-mapping is a quantitative methodology with the potential to address the limitations of conventional T2-weighted (T2W) imaging. Methods. The T2-mapping protocol used for all MRI scans consisted of a radial gradient echo acquisition with a lung-liver navigator for free-breathing acquisition and affine image registration. Mid-basal short axis slices were acquired. T2-map analyses: two observers semi-automatically segmented the left ventricle into 6 segments according to the AHA standards. Eight healthy volunteers (age: 27 ± 4 years; 62.5% male) were scanned in 2 separate sessions. Seventeen patients (age: 61.9 ± 13.9 years; 82.4% male) with subacute STEMI (70.6%) or NSTEMI underwent a T2-mapping scanning session. Results. In healthy volunteers, the mean inter- and intra-observer variability over the entire short axis slice (segments 1 to 6) was 0.1 ms (95% confidence interval (CI): -0.4 to 0.5, p = 0.62) and 0.2 ms (95% CI: -2.8 to 3.2, p = 0.94), respectively. T2 relaxation time measurements with and without the phantom correction yielded an average difference of 3.0 ± 1.1% and 3.1 ± 2.1% (p = 0.828), respectively. In patients, the inter-observer variability over the entire short axis slice (S1-S6) was 0.3 ms (95% CI: -1.8 to 2.4, p = 0.85).
Edema location as determined by T2-mapping and the coronary artery occlusion as determined on X-ray coronary angiography correlated in 78.6% of cases, but in only 60% of apical infarcts. All but one of the maximal T2 values in infarct patients were greater than the upper limit of the 95% confidence interval for normal myocardium. Conclusions. The T2-mapping methodology is accurate in detecting infarcted, i.e. edematous, tissue in patients with subacute infarcts. This study further demonstrated that this T2-mapping technique is reproducible and robust enough to be used on a segmental basis for edema detection without the need for a phantom to yield a T2 correction factor. This new quantitative T2-mapping technique is promising and is likely to allow serial follow-up studies in patients to improve our knowledge of infarct pathophysiology and infarct healing, and to assess novel treatment strategies for acute infarctions.
Resumo:
Synthesis report: Objective: The aim of this work was to study multidetector CT angiography (CTA) in the evaluation of peripheral arterial occlusive disease (PAOD) of the abdominal aorta and lower limbs, using an adaptive acquisition method to optimise arterial enhancement, in particular for the distal arterial bed and the arteries of the feet. Materials and methods: Thirty-four patients presenting with PAOD underwent both transcatheter angiography (TCA) and CTA within 15 days of each other. CTA was performed from the coeliac trunk to the arteries of the feet in a single acquisition at high spatial resolution (16x0.625 mm). The table speed and rotation time for each examination were chosen according to the contrast transit time obtained after a test bolus. A total of 130 ml of contrast at 4 ml/s was used. The CTA images were analysed by two observers, and the TCA data were interpreted independently by two other observers. The analysis included image quality and the detection of stenoses of 50% or greater, per patient and per arterial segment. The sensitivity and specificity of CTA were calculated using TCA as the reference examination. Interobserver variability was measured using a kappa statistic. Results: TCA was non-conclusive in 0.7% of segments, whereas CTA was conclusive in all segments. In the per-patient analysis, overall sensitivity and specificity for detecting a significant stenosis of 50% or greater were both 100%. The per-segment analysis showed sensitivities and specificities ranging from 91 to 100% and from 81 to 100%, respectively. Analysis of the distal arteries of the feet showed a sensitivity of 100% and a specificity of 90%.
Conclusion: Multidetector CT angiography using this adaptive acquisition method improves image quality and provides a reliable, non-invasive technique for evaluating PAOD, including the distal arteries of the feet.
Resumo:
In CoDaWork’05, we presented an application of discriminant function analysis (DFA) to 4 different compositional datasets and modelled the first canonical variable using a segmented regression model based solely on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration as discussed previously, another method is proposed for mapping the unknown tephras onto samples of the reference set, or onto missing samples between consecutive reference samples. The application of these methodologies is demonstrated with both simulated and real datasets. This newly proposed methodology provides an alternative, more acceptable approach for geologists, as their focus is on mapping the unknown tephra to relevant eruptive events rather than estimating the age of the unknown tephra. Key words: Tephrochronology; Segmented regression
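The segmented regression idea underlying the model can be illustrated with a minimal hinge-term least-squares fit. The breakpoint, data-generating function, and coefficients below are synthetic assumptions for illustration, not the tephra data or the authors' fitted model.

```python
import numpy as np

# Two-piece (segmented) linear regression with a known breakpoint c:
# fit y = b0 + b1*x + b2*max(x - c, 0) by ordinary least squares.
rng = np.random.default_rng(2)
x = np.linspace(0, 10, 100)
# True model: slope 2 before x = 5, slope 0.5 after, plus noise.
y = np.where(x < 5, 1 + 2 * x, 11 + 0.5 * (x - 5)) + 0.1 * rng.normal(size=x.size)

c = 5.0
design = np.column_stack([np.ones_like(x), x, np.maximum(x - c, 0)])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
# beta[1] is the slope before the break; beta[1] + beta[2] is the slope after.
```

When the breakpoint itself is unknown, it is typically estimated by scanning candidate values of c and keeping the fit with the smallest residual sum of squares.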
Resumo:
BACKGROUND: The use of the family history method is recommended in family studies as a type of proxy interview of non-participating relatives. However, using different sources of information can result in bias as direct interviews may provide a higher likelihood of assigning diagnoses than family history reports. The aims of the present study were to: 1) compare diagnoses for threshold and subthreshold mood syndromes from interviews to those relying on information from relatives; 2) test the appropriateness of lowering the diagnostic threshold and combining multiple reports from the family history method to obtain comparable prevalence estimates to the interviews; 3) identify factors that influence the likelihood of agreement and reporting of disorders by informants. METHODS: Within a family study, 1621 informant-index subject pairs were identified. DSM-5 diagnoses from direct interviews of index subjects were compared to those derived from family history information provided by their first-degree relatives. RESULTS: 1) Inter-informant agreement was acceptable for Mania, but low for all other mood syndromes. 2) Except for Mania and subthreshold depression, the family history method provided significantly lower prevalence estimates. The gap improved for all other syndromes after lowering the threshold of the family history method. 3) Individuals who had a history of depression themselves were more likely to report depression in their relatives. LIMITATIONS: Low proportion of affected individuals for manic syndromes and lack of independence of data. CONCLUSIONS: The higher likelihood of reporting disorders by affected informants entails the risk of overestimation of the size of familial aggregation of depression.
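Inter-informant agreement of the kind compared here (direct interview vs family history report) is conventionally quantified with Cohen's kappa, which corrects raw agreement for chance. The following is a generic sketch of that statistic on paired binary diagnoses; the function and the example pairs are illustrative, not the study's data or exact analysis.

```python
def cohens_kappa(pairs):
    """Cohen's kappa for paired binary ratings, e.g. (interview
    diagnosis, family-history diagnosis) coded 1 = present, 0 = absent.
    Illustrative sketch only."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n          # observed agreement
    pa1 = sum(a for a, _ in pairs) / n              # rate of 1s, rater A
    pb1 = sum(b for _, b in pairs) / n              # rate of 1s, rater B
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)          # chance agreement
    return (po - pe) / (1 - pe)

# Perfect agreement yields kappa = 1; chance-level agreement yields 0.
k_perfect = cohens_kappa([(1, 1), (0, 0)])
k_chance = cohens_kappa([(1, 1), (1, 0), (0, 1), (0, 0)])
```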
Resumo:
This book gives a general view of sequence analysis, the statistical study of successions of states or events. It includes innovative contributions on life course studies, transitions into and out of employment, contemporaneous and historical careers, and political trajectories. The approach presented in this book is now central to the life-course perspective and the study of social processes more generally. This volume promotes the dialogue between approaches to sequence analysis that developed separately, within traditions contrasted in space and disciplines. It includes the latest developments in sequential concepts, coding, atypical datasets and time patterns, optimal matching and alternative algorithms, survey optimization, and visualization. Field studies include original sequential material related to parenting in 19th-century Belgium, higher education and work in Finland and Italy, family formation before and after German reunification, French Jews persecuted in occupied France, long-term trends in electoral participation, and regime democratization. Overall the book reassesses the classical uses of sequences and it promotes new ways of collecting, formatting, representing and processing them. The introduction provides basic sequential concepts and tools, as well as a history of the method. Chapters are presented in a way that is both accessible to the beginner and informative to the expert.
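Optimal matching, one of the techniques the volume covers, computes a dissimilarity between two state sequences as the cheapest series of substitutions and insertions/deletions transforming one into the other. A minimal dynamic-programming sketch, with arbitrarily chosen unit costs (real applications often use data-driven substitution costs):

```python
def optimal_matching(a, b, sub_cost=2.0, indel_cost=1.0):
    """Optimal matching (edit) distance between two state sequences,
    the core operation of sequence analysis. Costs here are illustrative."""
    n, m = len(a), len(b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel_cost
    for j in range(1, m + 1):
        d[0][j] = j * indel_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if a[i - 1] == b[j - 1] else sub_cost
            d[i][j] = min(d[i - 1][j] + indel_cost,      # deletion
                          d[i][j - 1] + indel_cost,      # insertion
                          d[i - 1][j - 1] + cost)        # (mis)match
    return d[n][m]

# E.g. employment trajectories coded E = employed, U = unemployed:
dist = optimal_matching("EEU", "EUU")
```

The resulting pairwise distance matrix is then typically fed into clustering to extract typical trajectories.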
Resumo:
Aortic stiffness is an independent predictor of cardiovascular risk. Different methods for determining pulse wave velocity (PWV) are used, among which the most common are mechanical methods such as SphygmoCor or Complior, which require specific devices and are limited by the technical difficulty of obtaining measurements. Doppler guided by 2D ultrasound is a good alternative to these methods. We studied 40 patients (29 male, aged 21 to 82 years), comparing the Complior method with Doppler. Agreement between the two devices was high (R = 0.91; 95% CI: 0.84-0.95). The reproducibility analysis revealed neither intra- nor interobserver differences. Based on these results, we conclude that Doppler ultrasound is a reliable and reproducible alternative to other established methods for the measurement of aortic PWV.
Resumo:
Chromatin immunoprecipitation followed by deep sequencing (ChIP-seq) experiments are widely used to determine, within entire genomes, the occupancy sites of any protein of interest, including, for example, transcription factors, RNA polymerases, or histones with or without various modifications. In addition to allowing the determination of occupancy sites within one cell type and under one condition, this method allows, in principle, the establishment and comparison of occupancy maps in various cell types, tissues, and conditions. Such comparisons require, however, that samples be normalized. Widely used normalization methods that include a quantile normalization step perform well when factor occupancy varies at a subset of sites, but may miss uniform genome-wide increases or decreases in site occupancy. We describe a spike adjustment procedure (SAP) that, unlike commonly used normalization methods intervening at the analysis stage, entails an experimental step prior to immunoprecipitation. A constant, low amount from a single batch of chromatin of a foreign genome is added to the experimental chromatin. This "spike" chromatin then serves as an internal control to which the experimental signals can be adjusted. We show that the method improves similarity between replicates and reveals biological differences including global and largely uniform changes.
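The adjustment step described — using reads from the constant foreign-genome spike as an internal control — amounts to rescaling each sample so its spike-in signal is equalised across samples, which preserves genuine global shifts that quantile normalization would erase. The counts and sample names below are hypothetical, and the exact SAP computation in the paper may differ; this only illustrates the scaling principle.

```python
# Hypothetical per-sample read counts mapping to the spike (foreign)
# genome, and per-bin signal on the experimental genome.
spike_reads = {"ctrl": 50_000, "treated": 25_000}
signal = {"ctrl": [10.0, 40.0, 30.0], "treated": [10.0, 40.0, 30.0]}

# Scale each sample by ref/spike so the spike-in signal is equal across
# samples; identical raw profiles then reveal a genuine 2x global
# difference in occupancy rather than being forced to look the same.
ref = min(spike_reads.values())
adjusted = {s: [v * ref / spike_reads[s] for v in vals]
            for s, vals in signal.items()}
```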
Resumo:
Realistic rendering of animations is known to be an expensive processing task when physically-based global illumination methods are used to improve illumination detail. This paper presents an acceleration technique for computing animations in radiosity environments. The technique is based on an interpolation approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step is introduced into the computation of the whole animated sequence to select important frames. These are fully computed and used as a basis for interpolating the rest of the sequence. The approach is completely view-independent: once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique could be an interesting alternative to deterministic methods for computing non-interactive radiosity animations for moderately complex scenarios.
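The temporal-coherence idea — fully solve radiosity only at selected key frames, then interpolate per-patch radiosity for the frames between them — can be sketched with simple linear interpolation. The function below is an illustrative assumption of the interpolation step only; the paper's key-frame selection via Monte Carlo pre-processing is not reproduced here.

```python
def interpolate_frames(key_a, key_b, n_between):
    """Linearly interpolate per-patch radiosity values between two
    fully computed key frames. key_a/key_b are lists of per-patch
    radiosities; returns n_between intermediate frames."""
    frames = []
    for k in range(1, n_between + 1):
        t = k / (n_between + 1)
        frames.append([(1 - t) * a + t * b for a, b in zip(key_a, key_b)])
    return frames

# Two patches, one intermediate frame halfway between the key frames:
mid = interpolate_frames([0.0, 0.0], [2.0, 4.0], 1)
```

Because radiosity is view-independent, the interpolated per-patch solutions can then be rendered from any camera path.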
Resumo:
Iowa has 8 commercial service airports and 105 general aviation airports, of which three serve as reliever airports. ***NOTE*** This document is for historical viewing, the internal information is no longer current or accurate! ***NOTE*** Current information can be found at http://www.iowadot.gov/aviation/aircraftregistration/registration.aspx ***NOTE***