587 results for Causality
Abstract:
In this essay, we defend the idea that structuralist thinking forms part of a spontaneous criticism of the reductionisms that surround psychology. We start from the radical split between the scientific and the metaphysical viewpoints expressed in the scientific psychology projects of the late 19th century. Next, we highlight the importance of the structuralist perspective for reviewing the antinomic relations between the subjective and the objective that operated at the heart of psychology throughout the 20th century. We show that the rejection of unilinear causality in favor of network causality curbed the advancement of unilateral or reductionist theories in psychology. Moreover, we consider the idea of structure as a point of convergence between psychology and philosophy. Beyond its explanatory role, the notion of structure reveals an epistemological register capable of bringing psychology closer to a relativization of the ideal of scientific neutrality. The importance of structuralist thinking in psychology leads us to regard the history of psychological knowledge as a type of research that belongs to cultural history.
Abstract:
Spontaneous reporting of adverse drug events (ADE) is the main source of data for assessing the risk/benefit profile of drugs available on the pharmaceutical market. Its major limitation, however, is underreporting, which hinders and delays signal detection in pharmacovigilance (PhV). The objectives were to identify educational intervention (EI) techniques for promoting PhV among health professionals and to assess their impact. A systematic review was performed in the PUBMED, PAHO, LILACS and EMBASE databases from November 2011 to January 2012, updated in March 2013. The search strategy combined health descriptors with a manual search of the references cited by the selected papers. Of the 101 articles identified, 16 met the inclusion criteria. Most of these studies (10) were conducted in European hospitals, physicians were the professionals most often targeted by the EI (12), and the studies lasted from one month to two years. EIs using multifaceted techniques raised the absolute number and rate of reports of adverse drug reactions (ADR) and of technical defects in health technologies, and also improved report quality, with increased reporting of ADRs classified as serious, unexpected, related to new drugs, and with a high degree of causality. Multifaceted educational interventions aimed at multidisciplinary health teams at all healthcare levels, lasting long enough to reach all professionals working in the institution and covering medication errors and therapeutic ineffectiveness, still need to be validated, with the aim of standardizing good PhV practice and improving drug safety indicators.
Abstract:
In addition to understanding the distribution of the health-disease process in populations, epidemiology has sought to study the notions of causality associated with this process that humanity has developed over time, and to interpret the narrative of this field of knowledge. A thorough review of the literature was carried out to emphasize the importance of using popular knowledge as a qualitative strategy for health-related investigation and to demystify the use of social representations in the field of dentistry. In designing a new paradigm for understanding the oral health-disease process, one that favors the idea that this process is also the result of sociocultural production, knowledge of the circumstances and context in which it is embedded becomes critical for health assessment actions. Although scientific dentistry has advanced the understanding of oral diseases, its communication with popular knowledge leaves much to be desired, since most professionals remain trapped in a fragmented model of care. Reconstructing the logic by which representations of oral health have been produced and socialized over time can therefore be considered a relevant and productive purpose of social representations in the dental field.
Abstract:
Introduction: Post-marketing surveillance of drugs aims to detect problems related to safety, effectiveness and quality. Adverse drug events (ADE) are identified mainly through spontaneous reporting by health professionals, a method that enables risk communication in pharmacovigilance and contributes to market regulation. Objective: To estimate the prevalence of adverse drug reactions (ADR) and suspected therapeutic failures (TF) reported by health professionals, and to determine the active ingredient and type of drug involved, as well as the seriousness, causality, mechanism of occurrence and clinical manifestations of the identified events. Methods: A cross-sectional study was performed in 2008 in a public teaching hospital that belongs to the Sentinel Hospital Network. ADR seriousness was classified according to intensity (mild, moderate, serious or lethal); drugs associated with ADE were categorized by type (brand-name and non-brand-name drugs); causality was assessed with the Naranjo algorithm; and the mechanism of occurrence was analyzed according to the Rawlins and Thompson definitions (type A or B). Results: There were 103 ADE reports in the period, of which 39 concerned TF and 64 ADR. Nurses submitted the most ADE reports (53.4%). The majority of ADRs were classified as type A (82.8%), mild (81.3%), possible on causality assessment (57.8%), and related to brand-name drugs (20/35). Human immunoglobulin, docetaxel and paclitaxel were the drugs most frequently associated with ADR. TF arose mainly from non-brand-name drugs (26/29), chiefly involving midazolam and ganciclovir. Conclusion: The reported ADEs contribute to the proposal of trigger tools for intensive drug safety monitoring, as well as to supplier qualification and product quality improvement.
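The causality categories mentioned above (for example "possible") are derived from the total score of the Naranjo questionnaire. As a purely illustrative aid, not part of the study, the sketch below maps a summed Naranjo score to its causality category using the commonly cited cut-offs; the function name and the example score are hypothetical.

```python
# Minimal sketch (not from the study): mapping a total Naranjo score to a
# causality category. The cut-offs follow the commonly cited Naranjo et al.
# (1981) scale; verify against the original instrument before real use.

def naranjo_category(total_score: int) -> str:
    """Return the Naranjo causality category for a summed questionnaire score."""
    if total_score >= 9:
        return "definite"
    if 5 <= total_score <= 8:
        return "probable"
    if 1 <= total_score <= 4:
        return "possible"
    return "doubtful"  # scores <= 0


if __name__ == "__main__":
    # Example: a report scoring 4 is classified as "possible", the most
    # frequent category (57.8%) in the study summarized above.
    print(naranjo_category(4))  # -> possible
```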
Abstract:
The article discusses the radical behaviorist proposal to constitute Psychology as a science of behavior, highlighting three sets of issues: a) the notion of knowledge with which it operates, especially its rejection of the principles of logical positivism and its adoption of an instrumental and relational conception; b) an interpretation of Psychology as a field of knowledge that articulates philosophical, scientific and applied content; and c) a program of investigation of psychological phenomena guided by an externalist framing and by a selectionist conception of causality. The radical behaviorist elaboration is contrasted with modern conceptions of the human being, emphasizing its scope and its critical and innovative character within Psychology and in culture at large.
Abstract:
We propose a new CPT-even and Lorentz-violating nonminimal coupling between fermions and Abelian gauge fields involving the CPT-even tensor $(K_F)_{\mu\nu\alpha\beta}$ of the standard model extension. We thus investigate its effects on the cross section of electron-positron scattering by analyzing the process $e^{+} + e^{-} \to \mu^{+} + \mu^{-}$. Such a study was performed for the parity-odd and parity-even nonbirefringent components of the Lorentz-violating $(K_F)_{\mu\nu\alpha\beta}$ tensor. Finally, by using experimental data available in the literature, we have imposed upper bounds as tight as $10^{-12}\,(\mathrm{eV})^{-1}$ on the magnitude of the CPT-even and Lorentz-violating parameters while nonminimally coupled. DOI: 10.1103/PhysRevD.86.125033
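For context (this equation is not taken from the abstract above; it is the standard textbook baseline), the leading-order, Lorentz-invariant QED cross section for the process analyzed, valid in the limit $\sqrt{s} \gg m_\mu$, is

$$\sigma_{\mathrm{QED}}\left(e^{+}e^{-} \to \mu^{+}\mu^{-}\right) \simeq \frac{4\pi\alpha^{2}}{3s},$$

where $\alpha$ is the fine-structure constant and $\sqrt{s}$ the center-of-mass energy. Bounds such as the quoted $10^{-12}\,(\mathrm{eV})^{-1}$ are typically obtained by requiring that the Lorentz-violating corrections to this baseline remain within the experimental uncertainty of the measured cross section.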
Abstract:
Purpose. The primary objective of this study was to investigate the incidence of adverse drug reactions (ADRs) related to drug-drug interactions (DDIs) in elderly outpatients attending public primary healthcare units in a southeastern region of Brazil. The secondary objective was to investigate possible predictors of DDI-related ADRs. Methods. A prospective cohort study was conducted between November 1, 2010, and November 31, 2011, in the primary public healthcare system of the Ourinhos micro-region in Brazil. Patients aged at least 60 years with at least one potential DDI were eligible for inclusion in the study. Eligible patients were assessed by clinical pharmacists for DDI-related ADRs for 4 months. The causality of DDI-related ADRs was assessed independently by four clinicians using three decisional algorithms. The incidence of DDI-related ADRs during the study period was calculated, and logistic regression analysis was used to study predictors of DDI-related ADRs. Results. A total of 433 patients completed the study. The incidence of DDI-related ADRs was 6.5%. A multivariate analysis indicated that the adjusted odds ratios (ORs) rose from 0.91 (95% confidence interval [CI] = 0.75-1.12, p = 0.06) in patients aged 65-69 years to 4.40 (95% CI = 3.00-6.12, p < 0.01) in patients aged 80 years or older. Patients with two to three diagnosed diseases had a lower adjusted OR (OR = 0.93 [95% CI = 0.68-1.18, p = 0.08]) than patients with six or more diseases (OR = 1.12 [95% CI = 1.02-2.01, p < 0.01]). Elderly patients taking five or more drugs had a significantly higher risk of DDI-related ADRs (OR = 2.72 [95% CI = 1.92-3.12, p < 0.01]) than patients taking three to four drugs (OR = 0.93 [95% CI = 0.74-1.11, p = 0.06]). No significant difference was found with regard to sex (OR = 1.08 [95% CI 0.48-2.02, p = 0.44]). Conclusion. The incidence of DDI-related ADRs in elderly outpatients was significant, and most of the events had important clinical consequences. Because clinicians still have difficulty managing this problem, highlighting the factors that increase the risk of DDI-related ADRs is essential. Polypharmacy was found to be a significant predictor of DDI-related ADRs in our sample.
Abstract:
Although the prevalence of drug-drug interactions (DDIs) in elderly outpatients is high, many potential DDIs do not have any actual clinical effect, and data on the occurrence of DDI-related adverse drug reactions (ADRs) in elderly outpatients are scarce. This study aimed to determine the incidence and characteristics of DDI-related ADRs among elderly outpatients as well as the factors associated with these reactions. A prospective cohort study was conducted between 1 November 2010 and 31 November 2011 in the primary public health system of the Ourinhos micro-region, Brazil. Patients aged ≥60 years with at least one potential DDI were eligible for inclusion. Causality, severity, and preventability of the DDI-related ADRs were assessed independently by four clinicians using validated methods; data were analysed using descriptive analysis and multiple logistic regression. A total of 433 patients completed the study. The incidence of DDI-related ADRs was 6% (n = 30). Warfarin was the most commonly involved drug (37% of cases), followed by acetylsalicylic acid (17%), digoxin (17%), and spironolactone (17%). Gastrointestinal bleeding occurred in 37% of the DDI-related ADR cases, followed by hyperkalemia (17%) and myopathy (13%). The multiple logistic regression showed that age ≥80 years [odds ratio (OR) 4.4; 95% confidence interval (CI) 3.0-6.1, p < 0.01], a Charlson comorbidity index ≥4 (OR 1.3; 95% CI 1.1-1.8, p < 0.01), consumption of five or more drugs (OR 2.7; 95% CI 1.9-3.1, p < 0.01), and the use of warfarin (OR 1.7; 95% CI 1.1-1.9, p < 0.01) were associated with the occurrence of DDI-related ADRs. With regard to severity, approximately 37% of the DDI-related ADRs detected in our cohort necessitated hospital admission. All DDI-related ADRs could have been avoided (87% were ameliorable and 13% were preventable). The incidence of ADRs not related to DDIs was 10% (n = 44). The incidence of DDI-related ADRs in elderly outpatients is high; most events had important clinical consequences and were preventable or ameliorable.
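As an illustration of the kind of analysis reported above, and not the study's actual code or data, the Python sketch below fits a multiple logistic regression with statsmodels and exponentiates the coefficients to obtain adjusted odds ratios with 95% confidence intervals; the column names (adr, age_80_plus, charlson_ge4, polypharmacy, warfarin) and the simulated cohort are hypothetical.

```python
# Minimal sketch: adjusted odds ratios for DDI-related ADRs via multiple
# logistic regression. All data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 433  # same cohort size as reported above, but simulated values
df = pd.DataFrame({
    "adr": rng.integers(0, 2, n),           # DDI-related ADR occurred (1) or not (0)
    "age_80_plus": rng.integers(0, 2, n),   # aged >= 80 years
    "charlson_ge4": rng.integers(0, 2, n),  # Charlson comorbidity index >= 4
    "polypharmacy": rng.integers(0, 2, n),  # five or more drugs
    "warfarin": rng.integers(0, 2, n),      # warfarin use
})

model = smf.logit(
    "adr ~ age_80_plus + charlson_ge4 + polypharmacy + warfarin", data=df
).fit(disp=False)

# Exponentiated coefficients = adjusted odds ratios, with 95% CIs.
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI 2.5%": np.exp(model.conf_int()[0]),
    "CI 97.5%": np.exp(model.conf_int()[1]),
})
print(odds_ratios.round(2))
```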
Abstract:
Study I: Real Wage Determination in the Swedish Engineering Industry. This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created in order to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect of rises in labour productivity, in the tax wedge, in the alternative real consumer wage and in real UI benefits. The estimation results also support the presence of a long-run wage-raising effect of positive changes in the participation rates in ALMPs, relief jobs and labour market training. This could be interpreted as meaning that the possibility of participating in an ALMP increases the utility for workers of not being employed in the industry, which in turn could increase real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect of positive changes in the unemployment rate.
Study II: Intersectoral Wage Linkages in Sweden. The purpose of this study is to investigate whether the wage-setting in certain sectors of the Swedish economy affects the wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that the wage-setting in the sectors exposed to international competition affects the wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2. Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed in the different restricted/unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable adaptability of wages between manufacturing, construction, the wholesale and retail trade, the central government sector, and the municipalities and county councils sector. This is consistent with the assumptions of the Scandinavian model. Further, the empirical results indicate a low level of adaptability of wages between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence for the presence of intersectoral wage causality, but no evidence of a wage-leading role, in line with the assumptions of the Scandinavian model, for any of the sectors.
Study III: Wage and Price Determination in the Private Sector in Sweden. The purpose of this study is to analyse wage and price determination in the private sector in Sweden during the period 1980–2003. The theoretical background is a variant of the “imperfect competition model of inflation”, which assumes imperfect competition in the labour and product markets. According to the model, wages and prices are determined as the result of a “battle of mark-ups” between trade unions and firms. The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated vector error correction (VEC) model is two; thus, two cointegration relations are assumed: one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase in consumer prices of one per cent lifts private-sector nominal wages by 0.8 per cent, while an increase in private-sector nominal wages of one per cent increases consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement of 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The “offensive” devaluation of the Swedish krona by 16 per cent in 1982:4, and the start of a floating Swedish krona and the substantial depreciation of the krona at that time, affected the determination of import prices.
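As an illustrative sketch of the econometric toolkit described above (Johansen trace test, VEC model, Granger causality tests), and not the thesis's actual data or code, the Python/statsmodels fragment below runs all three steps on simulated quarterly series; the variable names (real_wage, productivity, tax_wedge, unemployment) are placeholders for the series discussed in Study I.

```python
# Minimal sketch: Johansen trace test, VECM estimation, and a Granger
# causality test on simulated I(1) series. All data are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
# 108 quarterly observations (1970:1-1996:4); random walks stand in for the
# log-level series used in the study.
data = pd.DataFrame(
    rng.normal(size=(108, 4)).cumsum(axis=0),
    columns=["real_wage", "productivity", "tax_wedge", "unemployment"],
)

# 1) Johansen trace test: compare trace statistics with 5% critical values
#    to choose the cointegration rank.
jres = coint_johansen(data, det_order=0, k_ar_diff=4)
print("trace statistics:", jres.lr1)
print("5% critical values:", jres.cvt[:, 1])

# 2) Vector error correction model with the chosen rank (here, rank 1).
vecm_res = VECM(data, k_ar_diff=4, coint_rank=1, deterministic="ci").fit()
print("long-run cointegration vector(s):\n", vecm_res.beta)

# 3) Granger causality: does productivity Granger-cause real wages?
grangercausalitytests(data[["real_wage", "productivity"]].diff().dropna(), maxlag=4)
```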
Abstract:
This thesis presents a creative and practical approach to dealing with the problem of selection bias. Selection bias may be the most vexing problem in program evaluation, or in any line of research that attempts to assert causality. Some of the greatest minds in economics and statistics have scrutinized the problem of selection bias, and the resulting approaches – Rubin's potential outcome approach (Rosenbaum and Rubin, 1983; Rubin, 1991, 2001, 2004) and Heckman's selection model (Heckman, 1979) – are widely accepted and used as the best fixes. These solutions to the bias that arises, in particular, from self-selection are imperfect, and many researchers, when feasible, reserve their strongest causal inference for data from experimental rather than observational studies. The innovative aspect of this thesis is to propose a data transformation that allows measuring and testing, in an automatic and multivariate way, the presence of selection bias. The approach involves the construction of a multi-dimensional conditional space of the X matrix in which the bias associated with the treatment assignment has been eliminated. Specifically, we propose the use of a partial dependence analysis of the X-space as a tool for investigating the dependence relationship between a set of observable pre-treatment categorical covariates X and a treatment indicator variable T, in order to obtain a measure of bias according to their dependence structure. The measure of selection bias is then expressed in terms of the inertia due to the dependence between X and T that has been eliminated. Given this measure of selection bias, we propose a multivariate test of imbalance to check whether the detected bias is significant, using the asymptotic distribution of the inertia due to T (Estadella et al., 2005) and preserving the multivariate nature of the data. Further, we propose the use of a clustering procedure as a tool to find groups of comparable units on which to estimate local causal effects, and the use of the multivariate test of imbalance as a stopping rule in choosing the best cluster solution. The method is non-parametric: it does not call for modeling the data on the basis of some underlying theory or assumption about the selection process, but instead uses the existing variability within the data and lets the data speak. The idea of proposing this multivariate approach to measuring selection bias and testing balance comes from the observation that, in applied research, all aspects of multivariate balance not represented in the univariate, variable-by-variable summaries are ignored. The first part contains an introduction to evaluation methods as part of public and private decision processes and a review of the literature on evaluation methods; attention is focused on Rubin's potential outcome approach, matching methods, and, briefly, Heckman's selection model. The second part focuses on some limitations of conventional methods, with particular attention to the problem of how to test balance correctly. The third part contains the original contribution proposed, a simulation study that checks the performance of the method for a given dependence setting, and an application to a real data set. Finally, we discuss, conclude and explain our future perspectives.
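As a simplified, purely illustrative sketch of the central idea, and not the thesis's exact multivariate procedure, the following Python fragment measures the dependence between categorical pre-treatment covariates X and a treatment indicator T as inertia (the chi-square statistic divided by the sample size) and tests whether it is significant; the variable names and the simulated self-selection mechanism are hypothetical.

```python
# Simplified sketch: selection bias as inertia due to the X-T dependence.
# All variable names and data are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

rng = np.random.default_rng(2)
n = 1000
X = pd.DataFrame({
    "education": rng.choice(["low", "mid", "high"], n),
    "region": rng.choice(["north", "south"], n),
})
# Self-selection: treatment probability depends on education, creating imbalance.
p_treat = X["education"].map({"low": 0.2, "mid": 0.5, "high": 0.8})
T = rng.random(n) < p_treat

# Cross-tabulate the joint covariate profile against treatment assignment.
profile = X["education"] + "/" + X["region"]
table = pd.crosstab(profile, T)

chi2, p_value, dof, _ = chi2_contingency(table)
inertia = chi2 / n  # inertia due to the dependence between X and T
print(f"inertia = {inertia:.4f}, chi2 = {chi2:.1f}, p = {p_value:.3g}")
# A significant p-value indicates that treatment assignment is not independent
# of the observed covariates, i.e. measurable selection bias on X.
```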
Abstract:
In the collective imagination, a robot is a human-like machine, like the androids of science fiction. However, the robots you will encounter most frequently are machines that do work that is too dangerous, boring or onerous for humans. Most of the robots in the world are of this type; they can be found in the automotive, medical, manufacturing and space industries. A robot is therefore a system that contains sensors, control systems, manipulators, power supplies and software, all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, which include the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object, namely the physical attributes that may influence a grasp. Humans solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine learning perspective, finding a grasp for an object using information about already known objects. But humans can select the best grasp from a vast repertoire, considering not only the physical attributes of the object to be grasped but also the effect they want to obtain. This is why, in our case, the study of robot manipulation focuses on grasping and on integrating symbolic tasks with data gained through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This data representation has several advantages: it takes into account the uncertainty of the real world, allowing sensor noise to be dealt with; it encodes a notion of causality; and it provides a unified network for learning. Since the network currently implemented is based on human expert knowledge, it is very interesting to implement an automated method to learn its structure, as more tasks and object features may be introduced in the future, and a complex network design based only on human expert knowledge can become unreliable. Since structure learning algorithms present some weaknesses, the goal of this thesis is to analyze the real data used in the network modeled by the human expert, implement a feasible structure learning approach, and compare the results with the network designed by the expert in order to possibly enhance it.
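As an illustrative sketch of score-based Bayesian network structure learning of the kind discussed above, and not the thesis's actual network, data or features, the Python fragment below uses pgmpy's hill-climbing search with a BIC score and compares the learned edges with a hypothetical expert-designed structure; all column names are made up.

```python
# Minimal sketch: learning a Bayesian network structure from discretized
# sensor/task data and comparing it with an expert-designed structure.
# All column names and data are hypothetical.
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

# Hypothetical discretized observations collected from the robot's sensors.
data = pd.DataFrame({
    "object_size":  ["small", "large", "small", "large", "small", "large"] * 50,
    "object_shape": ["box", "cylinder", "cylinder", "box", "box", "cylinder"] * 50,
    "task":         ["pour", "place", "pour", "place", "place", "pour"] * 50,
    "grasp_type":   ["pinch", "power", "pinch", "power", "pinch", "power"] * 50,
})

# Score-based structure learning (hill climbing with BIC).
search = HillClimbSearch(data)
learned = search.estimate(scoring_method=BicScore(data))
learned_edges = set(learned.edges())
print("learned edges:", sorted(learned_edges))

# Hypothetical expert-designed structure to compare against.
expert_edges = {
    ("object_size", "grasp_type"),
    ("object_shape", "grasp_type"),
    ("task", "grasp_type"),
}
print("edges missed by the learner:", expert_edges - learned_edges)
print("extra edges proposed by the learner:", learned_edges - expert_edges)
```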