819 results for Observational Methodology
Abstract:
This study aims to observe the relationships between the care strategies (formal, informal, mixed) used by caregivers of dependent elderly people, their social situation, and their motivations behind the decision on how care is provided. The state of the art highlights the predominance of informal over formal care in Mediterranean welfare models, and the relevance of the interaction between personal and sociocultural factors and social policies in individual decision-making about dependency care. The Dependency Law, recently implemented in Spain, has universalized access to formal resources, creating a new paradigm of interaction between caregivers and resources. This is an observational, cross-sectional, descriptive, mixed quantitative/qualitative study based on individual interviews with the caregivers of dependent persons admitted to a geriatric intermediate care unit. Data were collected on the socio-family context, the care strategy, self-perception, and motivations. The results show that caregivers combine economic benefits and services (public and private) to adapt the strategy as closely as possible to the conditions of the dependent person and to their own. They share a widespread conviction that care should be provided at home, out of reciprocity and respect for the person being cared for. The transition to residential care is a very difficult decision for caregivers. The implementation of the Dependency Law has normalized the relationship between caregivers and formal resources, but bureaucratization and an insufficient supply of services do not favour substantial changes in the provision of help, which remains mostly informal. Improving the perceived continuity of care between home and residential settings, as well as the management and supply of public formal services, stand out as important challenges in our country.
Abstract:
During the period 1996-2000, forty-three heavy rainfall events were detected in the Internal Basins of Catalonia (northeastern Spain). Most of these events caused floods and serious damage. This high number creates the need for a methodology to classify them on the basis of their surface rainfall distribution, their internal organization and their physical features. The aim of this paper is to present a methodology for systematically analyzing the convective structures responsible for those heavy rainfall events on the basis of the information supplied by meteorological radar. The proposed methodology is as follows. Firstly, the rainfall intensity and the surface rainfall pattern are analyzed on the basis of the raingauge data. Secondly, the convective structures at the lowest level are identified and characterized using a 2-D algorithm, and the convective cells are identified using a 3-D procedure that looks for the reflectivity cores in every radar volume. Thirdly, the convective cells (3-D) are associated with the 2-D structures (convective rainfall areas). This methodology has been applied to the 43 heavy rainfall events using the meteorological radar located near Barcelona and the SAIH automatic raingauge network.
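The identification of convective structures by thresholding a reflectivity field can be sketched with connected-component labelling. This is a minimal illustration, not the study's actual algorithm: the 43 dBZ threshold and the synthetic field are assumptions for demonstration only.

```python
# Hypothetical sketch: find 2-D convective structures in a lowest-level
# radar reflectivity field by thresholding and labelling connected regions.
# The 43 dBZ threshold is illustrative, not the value used in the study.
import numpy as np
from scipy import ndimage

def label_structures(refl, threshold):
    """Label connected regions of reflectivity at or above a dBZ threshold."""
    mask = refl >= threshold
    labels, n = ndimage.label(mask)
    return labels, n

# Synthetic lowest-level reflectivity field (dBZ)
rng = np.random.default_rng(0)
field = rng.uniform(0.0, 40.0, size=(50, 50))
field[10:15, 10:15] = 50.0  # one embedded convective area

areas_2d, n_areas = label_structures(field, 43.0)
print(n_areas)  # number of 2-D convective structures found
```

The same labelling generalizes to three dimensions for the 3-D reflectivity cores by passing a volume instead of a single level to `ndimage.label`.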
Abstract:
This paper presents an observational study of the tornado outbreak that took place on 7 September 2005 in the Llobregat river delta (NE Spain), affecting a densely populated and urbanised area and Barcelona International Airport. The site survey confirmed at least five short-lived tornadoes. Four of them were weak (F0, F1) and the other was significant (F2 on the Fujita scale). They started mostly as waterspouts and later moved inland, causing extensive damage estimated at 9 million euros and three injuries, but fortunately no fatalities. Large-scale forcing was provided by upper-level diffluence and low-level warm air advection. Satellite and weather radar images revealed the development of the cells that spawned the waterspouts along a mesoscale convergence line in a highly sheared and relatively low-buoyancy environment. Further analysis indicated characteristics that could be attributed equally to non-supercell or to mini-supercell thunderstorms.
Abstract:
The current operational very short-term and short-term quantitative precipitation forecast (QPF) at the Meteorological Service of Catalonia (SMC) is made by three different methodologies: advection of the radar reflectivity field (ADV); identification, tracking and forecasting of convective structures (CST); and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts have different characteristics, lead times and spatial resolutions. The objective of this study is to combine these methods in order to obtain a single, optimized QPF at each lead time. This combination (blending) of the radar forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. Firstly, in order to take advantage of the rainfall location and intensity from radar observations, a phase correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hour (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected outputs from the model (MCO) are mixed using different weights, which vary dynamically according to indexes that quantify the quality of these predictions. This procedure has the ability to integrate the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of the NWP models to generate new precipitation areas. From the third hour (t+3 h), as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different sources of prediction is verified for different types of episodes (convective, moderately convective and stratiform) to obtain a robust methodology for implementing it in an operational and dynamic way.
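The dynamic weighting described above can be sketched as a skill-weighted average of the two forecasts. The weight formula here is an assumption for illustration, not the SMC's operational scheme, and the skill values are invented.

```python
# Illustrative sketch of blending a radar-advection forecast with an NWP
# forecast using dynamic weights derived from recent skill scores.
# The weighting formula is an assumption, not the SMC's actual method.
import numpy as np

def blend(radar_fcst, nwp_fcst, radar_skill, nwp_skill):
    """Weight each forecast by its recent skill (higher skill -> more weight)."""
    w_radar = radar_skill / (radar_skill + nwp_skill)
    return w_radar * radar_fcst + (1.0 - w_radar) * nwp_fcst

radar = np.array([2.0, 5.0, 0.0])  # mm/h, advection-based forecast (ADV)
nwp   = np.array([1.0, 3.0, 4.0])  # mm/h, corrected model forecast (MCO)

# At short lead times the radar-based forecast typically scores higher
out = blend(radar, nwp, radar_skill=0.8, nwp_skill=0.2)
print(out)  # -> [1.8 4.6 0.8]
```

As the lead time grows and radar skill decays, `radar_skill` would shrink and the blend converges toward the model forecast, matching the t+3 h behaviour described in the abstract.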
Abstract:
A statistical methodology for the objective comparison of LDI-MS mass spectra of blue gel pen inks was evaluated. Thirty-three blue gel pen inks previously studied by Raman spectroscopy were analyzed directly on the paper in both positive and negative mode. The obtained mass spectra were first compared using relative areas of selected peaks, using the Pearson correlation coefficient and the Euclidean distance. Intra-variability among results from one ink and inter-variability between results from different inks were compared in order to choose a differentiation threshold minimizing the rate of false negatives (i.e. avoiding false differentiation of the inks). This yielded a discriminating power (DP) of up to 77% for analyses made in the negative mode. The whole mass spectra were then compared using the same methodology, allowing for a better DP in the negative mode of 92% using the Pearson correlation on standardized data. The positive mode generally yielded a lower DP than the negative mode due to a higher intra-variability compared to the inter-variability in the mass spectra of the ink samples.
Abstract:
On 1 January 2012, Swiss Diagnosis Related Groups (SwissDRG), a new uniform payment system for in-patients, was introduced in Switzerland with the intention of replacing a "cost-based" with a "case-based" reimbursement system to increase efficiency. With the introduction of the new payment system, we aim to answer questions regarding length of stay (LOS) as well as patients' outcome and satisfaction. This is a prospective, two-centre observational cohort study with data from the University Hospital Basel and the Cantonal Hospital Aarau, Switzerland, from January to June 2011 and 2012, respectively. Consecutive in-patients with a main diagnosis of community-acquired pneumonia, exacerbation of COPD, acute heart failure or hip fracture were included. A questionnaire survey was sent out after discharge investigating changes before and after SwissDRG implementation. Our primary endpoint was LOS. Of 1,983 eligible patients, 841 returned the questionnaire and were included in the analysis (429 in 2011, 412 in 2012). The median age was 76.7 years (50.8% male). Patients in the two years were well balanced with regard to main diagnoses and co-morbidities. Mean LOS in the overall patient population was 10.0 days and comparable between the 2011 and 2012 cohorts (9.7 vs 10.3 days; p = 0.43). Overall satisfaction with care changed only slightly after the introduction of SwissDRG and remained high (89.0% vs 87.8%; p = 0.429). Investigating the influence of the 2012 SwissDRG implementation on LOS, patients' outcome and satisfaction, we found no significant changes. However, we observed some noteworthy trends, which should be monitored closely.
Abstract:
Background. Early identification of pathogens from blood cultures using matrix-assisted laser desorption ionization time-of-flight (MALDI-TOF) mass spectrometry may optimize the choice of empirical antibiotic therapy in the setting of bloodstream infections. We aimed to assess the impact of this new technology on the use of antibiotic treatment in patients with gram-negative bacteremia. Methods. We conducted a prospective observational study from January to December 2010 to evaluate the sequential and separate impacts of Gram stain reporting and MALDI-TOF bacterial identification performed on blood culture pellets in patients with gram-negative bacteremia. The primary outcome was the impact of MALDI-TOF on empirical antibiotic choice. Results. Among 202 episodes of gram-negative bacteremia, Gram stain reporting had an impact in 42 cases (20.8%). MALDI-TOF identification led to a modification of empirical therapy in 71 of all 202 cases (35.1%), and in 16 of 27 cases (59.3%) of monomicrobial bacteremia caused by AmpC-producing Enterobacteriaceae. The most frequently observed impact was an early appropriate broadening of the antibiotic spectrum, in 31 of 71 cases (43.7%). In total, 143 of 165 episodes (86.7%) of monomicrobial bacteremia were correctly identified at the genus level by MALDI-TOF. Conclusions. In a low-prevalence area for extended-spectrum beta-lactamases (ESBL) and multiresistant gram-negative bacteria, MALDI-TOF performed on blood culture pellets had an impact on the clinical management of 35.1% of all gram-negative bacteremia cases, demonstrating a greater impact than Gram stain reporting. Thus, MALDI-TOF could become a vital second step alongside Gram stain in guiding the empirical treatment of patients with bloodstream infection.
Abstract:
My research in live drawing and new technologies combines a live human figure in composition, overlaid with a digital projection of a second human figure. The aim is to explore, amplify and thoroughly analyse the search for distinctive identities and graphic languages of representation for live and projected models.
Abstract:
In this article we address the use and importance of the statistical tools employed mainly in medical studies in the fields of oncology and haematology, though applicable to many other medical, experimental and industrial settings. The aim of this paper is to present, clearly and precisely, the statistical methodology needed to analyse the data obtained in such studies rigorously and concisely with respect to the working hypotheses posed by the researchers. The chosen measure of response to treatment and the type of study design determine the statistical methods used during the analysis of the study data, as well as the sample size. Through the correct application of statistical analysis and adequate planning, it can be determined whether the relationship found between exposure to a treatment and an outcome is due to chance or, on the contrary, reflects a non-random association that could establish a causal relationship. We review the main types of design of the most commonly used medical studies, such as clinical trials and observational studies (cohort, case-control, prevalence and ecological studies). We also present a section on the calculation of study sample size and how to perform it, on which statistical test should be used, on measures of effect size such as the odds ratio (OR) and relative risk (RR), and on survival analysis. Examples are provided in most sections of the article, along with the most relevant references.
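The effect measures mentioned in the abstract above, odds ratio (OR) and relative risk (RR), can be computed from a standard 2x2 exposure-outcome table. The counts below are invented purely for illustration.

```python
# Worked sketch of OR and RR from a 2x2 table; counts are hypothetical.
# a: exposed with outcome, b: exposed without outcome,
# c: unexposed with outcome, d: unexposed without outcome
def odds_ratio(a, b, c, d):
    """Odds of the outcome in exposed divided by odds in unexposed."""
    return (a * d) / (b * c)

def relative_risk(a, b, c, d):
    """Risk of the outcome in exposed divided by risk in unexposed."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    return risk_exposed / risk_unexposed

a, b, c, d = 30, 70, 10, 90
print(odds_ratio(a, b, c, d))     # -> 3.857142857142857
print(relative_risk(a, b, c, d))  # -> 3.0
```

Note the two measures diverge as the outcome becomes common (here 30% vs 10%); the OR approximates the RR only for rare outcomes, which is one reason the choice of study design constrains the choice of effect measure.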
Abstract:
Two trends which presently exist in relation to the concept of Paleontology are analyzed, pointing out some of the aspects which exert a negative influence. Various reflections are made based on examples of some of the principal points of the paleontological method, such as the influence of punctual sampling, the meaning of size-frequency distributions, and subjectivity in the identification of fossils. Topics which have a marked repercussion on diverse aspects of Paleontology are discussed.
Abstract:
1. Identifying the boundary of a species' niche from observational and environmental data is a common problem in ecology and conservation biology, and a variety of techniques have been developed or applied to model niches and predict distributions. Here, we examine the performance of some pattern-recognition methods as ecological niche models (ENMs). In particular, one-class pattern recognition is a flexible and seldom used methodology for modelling ecological niches and distributions from presence-only data. The development of one-class methods that perform comparably to two-class methods (for presence/absence data) would remove modelling decisions about sampling pseudo-absences or background data points when absence points are unavailable. 2. We studied nine methods for one-class classification and seven methods for two-class classification (five common to both), all primarily used in pattern recognition and therefore not common in species distribution and ecological niche modelling, across a set of 106 mountain plant species for which presence-absence data were available. We assessed accuracy using standard metrics and compared trade-offs in omission and commission errors between classification groups, as well as effects of prevalence and spatial autocorrelation on accuracy. 3. One-class models fit to presence-only data were comparable to two-class models fit to presence-absence data when performance was evaluated with a measure weighting omission and commission errors equally. One-class models were superior for reducing omission errors (i.e. yielding higher sensitivity), and two-class models were superior for reducing commission errors (i.e. yielding higher specificity). For these methods, spatial autocorrelation was only influential when prevalence was low. 4. 
These results differ from previous efforts to evaluate alternative modelling approaches to build ENM and are particularly noteworthy because data are from exhaustively sampled populations minimizing false absence records. Accurate, transferable models of species' ecological niches and distributions are needed to advance ecological research and are crucial for effective environmental planning and conservation; the pattern-recognition approaches studied here show good potential for future modelling studies. This study also provides an introduction to promising methods for ecological modelling inherited from the pattern-recognition discipline.
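The one-class idea described above, learning a niche envelope from presence-only records, can be sketched with a simple Mahalanobis-distance envelope. This is a stand-in for the nine one-class classifiers the study actually compared; the environmental data and the distance threshold are invented for illustration.

```python
# Hedged sketch of one-class niche modelling from presence-only data:
# fit an envelope to environmental covariates at presence sites, then
# classify new sites as inside/outside the niche. The Mahalanobis
# envelope and the threshold of 3 are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
# Environmental covariates at 200 presence sites, e.g. temperature (deg C)
# and elevation (m); values are synthetic.
presences = rng.normal(loc=[15.0, 1200.0], scale=[2.0, 150.0], size=(200, 2))

mu = presences.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(presences, rowvar=False))

def in_niche(x, threshold=3.0):
    """Site is 'inside the niche' if within `threshold` Mahalanobis units."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d)) <= threshold

print(in_niche(np.array([15.0, 1200.0])))   # near the niche centre
print(in_niche(np.array([30.0, 3000.0])))   # far outside the sampled range
```

Raising the threshold trades commission errors for omission errors, the same sensitivity/specificity trade-off the abstract reports between one-class and two-class models.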
Abstract:
BACKGROUND: Rotational thromboelastometry (ROTEM) is a whole blood point-of-care test used to assess the patient's coagulation status. Three of the available ROTEM tests are EXTEM, INTEM and HEPTEM. In the latter, heparinase added to the INTEM reagent inactivates heparin to reveal any residual heparin effect. Performing ROTEM analysis during cardiopulmonary bypass (CPB) might allow the anaesthesiologist to anticipate the need for blood products. OBJECTIVE: The goal of this study was to validate ROTEM analysis in the presence of very high heparin concentrations during CPB. DESIGN: Prospective, observational trial. SETTING: Single university hospital. PARTICIPANTS: Twenty patients undergoing coronary artery bypass grafting. MAIN OUTCOME MEASURE: ROTEM analysis was performed before heparin administration (T0), 10 min after heparin (T1), at the end of CPB (T2) and 10 min after protamine (T3). The following tests were performed: EXTEM, INTEM, and HEPTEM. Heparin concentrations were measured at T1 and at the end of bypass (T2). RESULTS: At T1, EXTEM differed from baseline for coagulation time: +26.7 s (18.4 to 34.9, P < 0.0001), α: -3° (1.0 to 5.4, P = 0.006) and A10: -4.4 mm (2.3 to 6.5, P = 0.0004). INTEM at T0 differed from HEPTEM at T1 for coagulation time: +47 s (34.3 to 59.6, P < 0.0001), A10: -2.3 mm (0.5 to 4.0, P = 0.01) and α: -2° (1.0 to 3.0; P = 0.0007). At T2, all parameters in EXTEM and HEPTEM related to fibrin-platelet interaction deteriorated significantly compared to T1. At T3, EXTEM and INTEM were comparable to EXTEM and HEPTEM at T2. CONCLUSION: HEPTEM and EXTEM measurements are valid in the presence of very high heparin concentrations and can be performed before protamine administration in patients undergoing cardiac surgery with CPB. TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT01455454.
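The within-patient comparisons reported above (e.g. coagulation time at T1 vs baseline) are paired comparisons. A minimal sketch of such a test follows; the values are synthetic and the paired t-test is one plausible choice, not necessarily the analysis the study used.

```python
# Illustrative paired comparison of a ROTEM parameter (coagulation time, s)
# measured twice in the same 20 patients. Data are invented; the study's
# actual statistical method is not specified in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
ct_t0 = rng.normal(160.0, 10.0, size=20)            # baseline (T0)
ct_t1 = ct_t0 + rng.normal(26.7, 5.0, size=20)      # prolonged after heparin (T1)

t_stat, p_value = stats.ttest_rel(ct_t1, ct_t0)
mean_diff = float(np.mean(ct_t1 - ct_t0))
print(round(mean_diff, 1), p_value < 0.05)
```

With a systematic prolongation of this size relative to the within-pair variability, the paired test detects the difference, mirroring the significant +26.7 s shift the abstract reports.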
Abstract:
Synthetic root exudates were formulated based on the organic acid composition of root exudates derived from the rhizosphere of aseptically grown corn plants, the pH of the rhizosphere, and the background chemical matrices of the soil solutions. The synthetic root exudates, which mimic the chemical conditions of the rhizosphere environment where soil-borne metals are dissolved and absorbed by plants, were used to extract metals from sewage-sludge-treated soils in 16 successive extractions. The concentrations of Zn, Cd, Ni, Cr, and Cu in the sludge-treated soil were 71.74, 0.21, 15.90, 58.12, and 37.44 mg kg-1, respectively. The synthetic root exudates consisted of acetic, butyric, glutaric, lactic, maleic, propionic, pyruvic, succinic, tartaric, and valeric acids. The organic acid mixtures had concentrations of 0.05 and 0.1 mol L-1 -COOH. The trace elements removed by successive extractions may be considered representative of the availability of these metals to plants in these soils. The chemical speciation of the metals in the liquid phase was calculated; the results showed that metals in sludge-treated soils were dissolved and formed soluble complexes with the different organic acid-based root exudates. The most reactive organic acid ligands were lactate, maleate, tartrate, and acetate. The inorganic ligands chloride and sulfate played insignificant roles in metal dissolution. Except for Cd, free ions did not represent an important chemical species of the metals in the soil rhizosphere. As different metals formed soluble complexes with different ligands in the rhizosphere, no extractor based on a single reagent would be able to recover all of the potentially plant-available metals from soils; the root exudate-derived organic acid mixtures tested in this study may be better suited to recover potentially plant-available metals from soils than the conventional extractors.
Abstract:
The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources. 
The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas. To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. 
Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
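The matching step described above, pairing pretest and posttest respondents on propensity scores before testing outcome differences, can be sketched with greedy 1:1 nearest-neighbour matching. The scores below are invented; the study estimated them from age, gender, and education within each neighbourhood type.

```python
# Hedged sketch of 1:1 nearest-neighbour propensity score matching.
# Scores and sample sizes are hypothetical, for illustration only.
import numpy as np

def nearest_neighbour_match(scores_treated, scores_control):
    """Greedily pair each treated unit with the closest unused control unit."""
    available = list(range(len(scores_control)))
    pairs = []
    for i, s in enumerate(scores_treated):
        j = min(available, key=lambda k: abs(scores_control[k] - s))
        pairs.append((i, j))
        available.remove(j)  # matching without replacement
    return pairs

# Hypothetical propensity scores (probability of belonging to the posttest sample)
posttest = np.array([0.30, 0.55, 0.80])
pretest  = np.array([0.25, 0.32, 0.50, 0.79, 0.90])

print(nearest_neighbour_match(posttest, pretest))  # -> [(0, 1), (1, 2), (2, 3)]
```

After matching, the outcome comparison (e.g. fear of crime before vs after implementation) is run on the balanced pairs, and, as in the study, significant results would then be subjected to a sensitivity analysis for unobserved confounding.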