973 results for Typographical Errors
Abstract:
Cognitive and affective deficits and biases have been a growing source of interest in the neuroscience of mental disorders. In this project, which began in 2004 and ended at the close of 2008, the following mental disorders were studied: Pathological Gambling (PG), Eating Disorders (ED), and Depressive Disorders. In this report we focus on summarising part of the results obtained in a study on PG and decision-making (article under review and pending acceptance) and another on executive functioning in PG and Bulimia Nervosa (BN) (article in press). Summarising the first study, PG patients (N=32) show a decision-making process biased by reward seeking, in the form of elevated risk-taking compared with Healthy Controls (HC). Deficits in cognitive flexibility, but not in inhibitory control, are also observed between PG and HC. The results rule out behavioural myopia in decision-making in PG, but point to a cognitive-affective bias, in which impulse control would play a relevant role, in the form of an illusion of control, in decision-making processes with immediate reward but delayed punishment, as measured by a decision-making task (IGT ABCD). In the second study, based on the shared vulnerabilities described between PG and BN, the executive functioning of women with PG and BN was compared. After administering the WCST and Stroop and adjusting the analysis for age and education, the PG group showed greater impairment, specifically a higher percentage of perseverative errors, a lower level of conceptual responses, and a higher number of trials administered, whereas the BN group showed a higher percentage of non-perseverative errors. Both PG and BN women showed executive dysfunction relative to HC, but with different specific correlates.
Abstract:
Empirical researchers interested in how governance shapes various aspects of economic development frequently use the Worldwide Governance Indicators (WGI). These variables come in the form of an estimate along with a standard error reflecting the uncertainty of this estimate. Existing empirical work simply uses the estimates as an explanatory variable and discards the information provided by the standard errors. In this paper, we argue that the appropriate practice should be to take into account the uncertainty around the WGI estimates through the use of multiple imputation. We investigate the importance of our proposed approach by revisiting in three applications the results of recently published studies. These applications cover the impact of governance on (i) capital flows; (ii) international trade; (iii) income levels around the world. We generally find that the estimated effects of governance are highly sensitive to the use of multiple imputation. We also show that model misspecification is a concern for the results of our reference studies. We conclude that the effects of governance are hard to establish once we take into account uncertainty around both the WGI estimates and the correct model specification.
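For readers unfamiliar with the pooling step, here is a minimal sketch of the multiple-imputation idea the abstract argues for, assuming a single governance regressor with WGI point estimates and standard errors, an outcome vector, and plain OLS; all names are hypothetical and the pooling follows Rubin's rules, not the authors' actual code.

```python
import numpy as np
import statsmodels.api as sm

def mi_governance_effect(y, wgi_est, wgi_se, n_imputations=100, seed=0):
    """Propagate WGI measurement uncertainty via multiple imputation.

    Each imputation draws a plausible governance score from
    N(estimate, standard error), refits the regression, and the
    results are pooled with Rubin's rules (hypothetical setup).
    """
    rng = np.random.default_rng(seed)
    coefs, variances = [], []
    for _ in range(n_imputations):
        gov_draw = rng.normal(wgi_est, wgi_se)        # one imputed dataset
        X = sm.add_constant(gov_draw)
        fit = sm.OLS(y, X).fit()
        coefs.append(fit.params[1])                   # governance coefficient
        variances.append(fit.bse[1] ** 2)
    coefs, variances = np.array(coefs), np.array(variances)
    q_bar = coefs.mean()                              # pooled point estimate
    within = variances.mean()                         # average sampling variance
    between = coefs.var(ddof=1)                       # variance across imputations
    total_var = within + (1 + 1 / n_imputations) * between
    return q_bar, np.sqrt(total_var)
```

The pooled standard error is larger than the naive one whenever the between-imputation variance is non-negligible, which is the mechanism behind the sensitivity the paper reports.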
Abstract:
Diffusion MRI is a well-established imaging modality providing a powerful way to probe the structure of the white matter non-invasively. Despite its potential, the intrinsically long scan times of these sequences have hampered their use in clinical practice. For this reason, a large variety of methods have recently been proposed to shorten the acquisition times. Among them, spherical deconvolution approaches have gained a lot of interest for their ability to reliably recover the intra-voxel fiber configuration with a relatively small number of data samples. To overcome the intrinsic instabilities of deconvolution, these methods use regularization schemes generally based on the assumption that the fiber orientation distribution (FOD) to be recovered in each voxel is sparse. The well-known Constrained Spherical Deconvolution (CSD) approach resorts to Tikhonov regularization, based on an ℓ2-norm prior, which promotes a weak version of sparsity. Also, in the last few years compressed sensing has been advocated to further accelerate the acquisitions, and ℓ1-norm minimization is generally employed as a means to promote sparsity in the recovered FODs. In this paper, we provide evidence that the use of an ℓ1-norm prior to regularize this class of problems is somewhat inconsistent with the fact that the fiber compartments all sum up to unity. To overcome this ℓ1 inconsistency while simultaneously exploiting sparsity more effectively than through an ℓ2 prior, we reformulate the reconstruction problem as a constrained formulation between a data term and a sparsity prior consisting of an explicit bound on the ℓ0 norm of the FOD, i.e. on the number of fibers. The method has been tested on both synthetic and real data. Experimental results show that the proposed ℓ0 formulation significantly reduces modeling errors compared to the state-of-the-art ℓ2 and ℓ1 regularization approaches.
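As a rough illustration of what an explicit ℓ0 bound means in practice, the sketch below solves a toy non-negative, sparsity-constrained least-squares problem by projected gradient descent with hard thresholding; the dictionary A, the signal s, and the function name are hypothetical stand-ins and this is not the paper's actual algorithm.

```python
import numpy as np

def l0_constrained_fod(A, s, k, n_iter=200, step=None):
    """Toy sketch: recover a non-negative FOD x with at most k non-zeros
    such that A @ x approximates s, via projected gradient descent with
    hard thresholding.

    A : (n_samples, n_directions) dictionary of rotated response functions
        (hypothetical setup).
    s : (n_samples,) diffusion signal measured in one voxel.
    k : explicit bound on the l0 norm, i.e. the number of fibers allowed.
    """
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2        # safe gradient step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + step * A.T @ (s - A @ x)              # gradient step on ||Ax - s||^2
        x = np.maximum(x, 0.0)                        # FOD must be non-negative
        if np.count_nonzero(x) > k:                   # enforce the l0 bound
            keep = np.argsort(x)[-k:]                 # indices of the k largest lobes
            mask = np.zeros_like(x, dtype=bool)
            mask[keep] = True
            x[~mask] = 0.0
    return x
```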
Abstract:
In this paper we make three contributions to the literature on optimal Competition Law enforcement procedures. The first (which is of general interest beyond competition policy) is to clarify the concept of “legal uncertainty”, relating it to ideas in the literature on Law and Economics, but formalising the concept through various information structures which specify the probability that each firm attaches – at the time it takes an action – to the possibility of its being deemed anti-competitive were it to be investigated by a Competition Authority. We show that the existence of Type I and Type II decision errors by competition authorities is neither necessary nor sufficient for the existence of legal uncertainty, and that information structures with legal uncertainty can generate higher welfare than information structures with legal certainty – a result echoing a similar finding obtained in a completely different context and under different assumptions in the earlier Law and Economics literature (Kaplow and Shavell, 1992). Our second contribution is to revisit and significantly generalise the analysis in our previous paper, Katsoulacos and Ulph (2009), involving a welfare comparison of Per Se and Effects-Based legal standards. In that analysis we considered just a single information structure under an Effects-Based standard and penalties were exogenously fixed. Here we allow for (a) different information structures under an Effects-Based standard and (b) endogenous penalties. We obtain two main results: (i) considering all information structures, a Per Se standard is never better than an Effects-Based standard; (ii) optimal penalties may be higher when there is legal uncertainty than when there is no legal uncertainty.
Abstract:
This paper develops a methodology to estimate entire population distributions from bin-aggregated sample data. We do this through the estimation of the parameters of mixtures of distributions that allow for maximal parametric flexibility. The statistical approach we develop enables comparisons of the full distributions of height data from potential army conscripts across France's 88 departments for most of the nineteenth century. These comparisons are made by testing for differences of means and for stochastic dominance. Corrections for possible measurement errors are also devised by taking advantage of the richness of the data sets. Our methodology is of interest to researchers working on historical as well as contemporary bin-aggregated or histogram-type data, something that is still widely done since much of the information that is publicly available is in that form, often due to restrictions arising from political sensitivity and/or confidentiality concerns.
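The sketch below illustrates, under simplifying assumptions, how a mixture can be fitted directly to bin-aggregated counts: a two-component normal mixture whose bin probabilities come from CDF differences is fitted by maximizing the multinomial likelihood. The mixture family, parameterisation, and example numbers are hypothetical and far simpler than the maximally flexible mixtures the paper uses.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def fit_binned_mixture(bin_edges, counts):
    """Fit a two-component normal mixture to bin-aggregated counts by
    maximizing the multinomial likelihood of the observed histogram."""
    counts = np.asarray(counts, dtype=float)

    def neg_log_lik(theta):
        w = 1.0 / (1.0 + np.exp(-theta[0]))           # mixing weight in (0, 1)
        mu1, mu2 = theta[1], theta[2]
        s1, s2 = np.exp(theta[3]), np.exp(theta[4])   # positive scales
        cdf = lambda x: w * norm.cdf(x, mu1, s1) + (1 - w) * norm.cdf(x, mu2, s2)
        p = np.diff(cdf(np.asarray(bin_edges)))       # probability mass per bin
        p = np.clip(p, 1e-12, None)
        return -np.sum(counts * np.log(p))

    start = np.array([0.0, np.mean(bin_edges), np.mean(bin_edges) + 5.0, 1.0, 1.0])
    res = minimize(neg_log_lik, start, method="Nelder-Mead")
    return res.x, -res.fun

# Illustrative (made-up) conscript heights in cm, reported only as bin counts.
edges = [150, 155, 160, 165, 170, 175, 180, 185]
counts = [12, 80, 240, 310, 220, 90, 20]
params, loglik = fit_binned_mixture(edges, counts)
```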
Abstract:
A study on lead pollution was carried out on a sample of ca. 300 city children. This paper presents the errors producing bias in the sample. It is emphasized that, in Switzerland, the difference between the Swiss and the migrant population (the latter being mainly Italian and Spanish) must be taken into account.
Abstract:
BACKGROUND: Little information is available on the validity of simple and indirect body-composition methods in non-Western populations. Equations for predicting body composition are population-specific, and body composition differs between blacks and whites. OBJECTIVE: We tested the hypothesis that the validity of equations for predicting total body water (TBW) from bioelectrical impedance analysis measurements is likely to depend on the racial background of the group from which the equations were derived. DESIGN: The hypothesis was tested by comparing, in 36 African women, TBW values measured by deuterium dilution with those predicted by 23 equations developed in white, African American, or African subjects. These cross-validations in our African sample were also compared, whenever possible, with results from other studies in black subjects. RESULTS: Errors in predicting TBW showed acceptable values (1.3-1.9 kg) in all cases, whereas a large range of bias (0.2-6.1 kg) was observed independently of the ethnic origin of the sample from which the equations were derived. Three equations (2 from whites and 1 from blacks) showed nonsignificant bias and could be used in Africans. In all other cases, we observed either an overestimation or underestimation of TBW with variable bias values, regardless of racial background, yielding no clear trend for validity as a function of ethnic origin. CONCLUSIONS: The findings of this cross-validation study emphasize the need for further fundamental research to explore the causes of the poor validity of TBW prediction equations across populations rather than the need to develop new prediction equations for use in Africa.
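The cross-validation arithmetic described above (bias and prediction error of predicted versus measured TBW, with a test of whether the bias is significant) can be sketched as follows; the BIA prediction equation shown is a generic height²/resistance form with made-up coefficients, not one of the 23 published equations evaluated in the study.

```python
import numpy as np
from scipy import stats

def cross_validate_tbw(measured_tbw, height_cm, resistance_ohm, weight_kg,
                       a=0.4, b=0.1, c=1.0):
    """Compare deuterium-dilution TBW with a generic BIA prediction of the
    form TBW = a*height^2/resistance + b*weight + c (coefficients hypothetical)."""
    measured_tbw = np.asarray(measured_tbw, dtype=float)
    height_cm = np.asarray(height_cm, dtype=float)
    resistance_ohm = np.asarray(resistance_ohm, dtype=float)
    weight_kg = np.asarray(weight_kg, dtype=float)

    predicted = a * height_cm ** 2 / resistance_ohm + b * weight_kg + c
    diff = predicted - measured_tbw
    bias = diff.mean()                               # systematic over/underestimation (kg)
    pred_error = diff.std(ddof=1)                    # spread of individual errors (kg)
    t_stat, p_value = stats.ttest_1samp(diff, 0.0)   # is the bias significantly non-zero?
    return {"bias_kg": bias, "error_kg": pred_error, "p_bias": p_value}
```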
Abstract:
Current American Academy of Neurology (AAN) guidelines for outcome prediction in comatose survivors of cardiac arrest (CA) have been validated before the therapeutic hypothermia (TH) era. We undertook this study to verify the prognostic value of clinical and electrophysiological variables in the TH setting. A total of 111 consecutive comatose survivors of CA treated with TH were prospectively studied over a 3-year period. Neurological examination, electroencephalography (EEG), and somatosensory evoked potentials (SSEP) were performed immediately after TH, at normothermia and off sedation. Neurological recovery was assessed at 3 to 6 months, using Cerebral Performance Categories (CPC). Three clinical variables, assessed within 72 hours after CA, showed higher false-positive mortality predictions as compared with the AAN guidelines: incomplete brainstem reflexes recovery (4% vs 0%), myoclonus (7% vs 0%), and absent motor response to pain (24% vs 0%). Furthermore, unreactive EEG background was incompatible with good long-term neurological recovery (CPC 1-2) and strongly associated with in-hospital mortality (adjusted odds ratio for death, 15.4; 95% confidence interval, 3.3-71.9). The presence of at least 2 independent predictors out of 4 (incomplete brainstem reflexes, myoclonus, unreactive EEG, and absent cortical SSEP) accurately predicted poor long-term neurological recovery (positive predictive value = 1.00); EEG reactivity significantly improved the prognostication. Our data show that TH may modify outcome prediction after CA, implying that some clinical features should be interpreted with more caution in this setting as compared with the AAN guidelines. EEG background reactivity is useful in determining the prognosis after CA treated with TH.
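As a worked illustration of the composite predictor, the sketch below flags patients with at least 2 of the 4 predictors named in the abstract and computes the positive predictive value; the boolean inputs are hypothetical and this is not the study's analysis code.

```python
import numpy as np

def poor_outcome_predicted(incomplete_brainstem, myoclonus, unreactive_eeg, absent_ssep):
    """Flag patients with at least 2 of the 4 predictors present."""
    predictors = np.column_stack([incomplete_brainstem, myoclonus,
                                  unreactive_eeg, absent_ssep]).astype(bool)
    return predictors.sum(axis=1) >= 2

def positive_predictive_value(predicted_poor, observed_poor):
    """PPV = true positives / all patients flagged as poor outcome."""
    predicted_poor = np.asarray(predicted_poor, dtype=bool)
    observed_poor = np.asarray(observed_poor, dtype=bool)
    flagged = predicted_poor.sum()
    return np.nan if flagged == 0 else (predicted_poor & observed_poor).sum() / flagged
```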
Abstract:
As chip implementation technologies evolve to deliver more performance, computer chips use smaller components, higher transistor densities, and lower supply voltages. All these factors make chips less robust and increase the probability of transient faults. A transient fault may occur once and never happen the same way again during a computer system's lifetime. The consequences of a transient fault vary: the operating system might abort execution if the change produced by the fault is detected through bad application behavior, but the biggest risk is that the fault produces an undetected data corruption that modifies the application's final result without warning (for example, a bit flip in some crucial data). With the objective of researching transient faults in a computer system's processor registers and memory, we have developed an extension of HP and AMD's joint full-system simulation environment, COTSon. This extension allows the injection of faults that change a single bit in the processor registers and memory of the simulated computer. The developed fault injection system makes it possible to evaluate the effects of single-bit-flip transient faults on an application, analyze an application's robustness against single-bit-flip transient faults, and validate fault detection mechanisms and strategies.
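COTSon's actual injection interface is not described in the abstract, so the sketch below only illustrates the core operation of a single-bit-flip injector over a simulated memory or register image; the function and variable names are illustrative, not part of the real tool.

```python
import random

def inject_single_bit_flip(state: bytearray, rng=random):
    """Flip one randomly chosen bit in a simulated memory or register image.

    `state` stands in for the simulated machine state; the real COTSon
    hooks into the simulator instead of a plain bytearray.
    """
    byte_index = rng.randrange(len(state))      # pick a byte uniformly at random
    bit_index = rng.randrange(8)                # pick a bit within that byte
    state[byte_index] ^= (1 << bit_index)       # XOR toggles exactly that bit
    return byte_index, bit_index

# Example: corrupt one bit in a 4 KiB memory snapshot and record where it landed.
memory = bytearray(4096)
location = inject_single_bit_flip(memory)
```

Logging the flipped location makes it possible to correlate each injected fault with the application outcome (crash, detected error, or silent data corruption).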
Abstract:
Brucellosis has become a rare entity in many industrialised countries because of animal vaccination programs. We report the first case in the literature of a Brucella abscess in the hip region observed in Switzerland, in a subject without any clear risk factor, leading us to conclude that abscess formation can be a rare manifestation of brucellosis. Because brucellosis can present in many different forms and locations without characteristic clinical features, a high index of suspicion is needed for the diagnosis, even if the patient is a healthy athlete with no obvious route of contamination, and all the more so if all the common causes of athletic hip pain have been ruled out.
Abstract:
Research project carried out by a secondary-school student and awarded a CIRIT Prize to encourage the scientific spirit of young people in 2009. It is a historical, literary, and critical study of the one-act monologue 'Records de l'Avi', written by the Igualada author Gabriel Castellà i Raich (1876-1959) in 1913. The work narrates a day of the Battle of El Bruc (1808). The study begins with research into the historical situation of the author's territory. It then covers the biography and literary contribution of Gabriel Castellà. The literary part addresses the internal aspects of the work 'Records de l'Avi' in order to make them known. A critical edition is also produced, in which the errors in the work are analysed by comparing two witnesses, one of them a manuscript, that still exist today. This comparison made it possible to introduce some changes, aimed at bringing the work as close as possible to the form the author would have wished.
Abstract:
Excessive exposure to solar ultraviolet (UV) is the main cause of skin cancer. Specific prevention should be further developed to target overexposed or highly vulnerable populations. A better characterisation of anatomical UV exposure patterns is, however, needed for specific prevention. The aim was to develop a regression model for predicting the UV exposure ratio (ER, the ratio between the anatomical dose and the corresponding ground-level dose) for each body site without requiring individual measurements. A 3D numeric model (SimUVEx) was used to compute ER for various body sites and postures. A multiple fractional polynomial regression analysis was performed to identify predictors of ER. The regression model used simulation data and its performance was tested on an independent data set. Two input variables were sufficient to explain ER: the cosine of the maximal daily solar zenith angle and the fraction of the sky visible from the body site. The regression model was in good agreement with the simulated ER (R² = 0.988). Relative errors of up to +20% and -10% were found in daily dose predictions, whereas an average relative error of only 2.4% (-0.03% to 5.4%) was found in yearly dose predictions. The regression model accurately predicts ER and UV doses on the basis of readily available data such as global UV erythemal irradiance measured at ground surface stations or inferred from satellite information. It makes it possible to develop exposure data on a wide temporal and geographical scale and opens broad perspectives for epidemiological studies and skin cancer prevention.
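Since the exact fractional-polynomial terms of the published model are not given in the abstract, the sketch below fits a plain OLS model on the two predictors named there (cosine of the maximal daily solar zenith angle and sky view fraction); the variable names and model form are assumptions, not the published equation.

```python
import numpy as np
import statsmodels.api as sm

def fit_er_model(cos_sza_max, sky_view_fraction, exposure_ratio):
    """Regress the anatomical exposure ratio (ER) on the two predictors
    identified in the abstract. Plain OLS stands in for the published
    fractional-polynomial model, whose exact terms are not reproduced here."""
    X = sm.add_constant(np.column_stack([cos_sza_max, sky_view_fraction]))
    fit = sm.OLS(exposure_ratio, X).fit()
    return fit                                  # inspect fit.rsquared, fit.params, ...
```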
Abstract:
The aim of this work is to evaluate the capabilities and limitations of chemometric methods and other mathematical treatments applied to spectroscopic data, and more specifically to paint samples. The uniqueness of the spectroscopic data comes from the fact that they are multivariate - a few thousand variables - and highly correlated. Statistical methods are used to study and discriminate samples. A collection of 34 red paint samples was measured by infrared and Raman spectroscopy. Data pretreatment and variable selection demonstrated that the use of Standard Normal Variate (SNV), together with removal of the noisy variables by selecting the wavelengths from 650 to 1830 cm−1 and from 2730 to 3600 cm−1, provided the optimal results for infrared analysis. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were then used as exploratory techniques to provide evidence of structure in the data, to cluster samples, or to detect outliers. With the FTIR spectra, the principal components (PCs) correspond to binder types and the presence/absence of calcium carbonate; 83% of the total variance is explained by the first four PCs. As for the Raman spectra, we observe six different clusters corresponding to the different pigment compositions when plotting the first two PCs, which account for 37% and 20% of the total variance, respectively. In conclusion, the use of chemometrics for the forensic analysis of paints provides a valuable tool for objective decision-making, a reduction of possible classification errors, and better efficiency, yielding robust results with time-saving data treatments.
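A minimal sketch of the SNV pretreatment, wavelength-window selection, and PCA step described above, using scikit-learn; the function names are hypothetical and this is not the authors' processing pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA

def snv(spectra):
    """Standard Normal Variate: centre and scale each spectrum (row) by
    its own mean and standard deviation."""
    spectra = np.asarray(spectra, dtype=float)
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

def select_windows(spectra, wavenumbers, windows=((650, 1830), (2730, 3600))):
    """Keep only the wavenumber ranges quoted in the abstract (cm^-1)."""
    wavenumbers = np.asarray(wavenumbers)
    mask = np.zeros(wavenumbers.shape, dtype=bool)
    for lo, hi in windows:
        mask |= (wavenumbers >= lo) & (wavenumbers <= hi)
    return np.asarray(spectra, dtype=float)[:, mask]

def explore_paints(spectra, wavenumbers, n_components=4):
    """SNV-correct the selected windows, then project the samples onto PCs."""
    X = snv(select_windows(spectra, wavenumbers))
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)               # PC scores for each paint sample
    return scores, pca.explained_variance_ratio_
```

Plotting the first two columns of `scores` and inspecting `explained_variance_ratio_` reproduces the kind of exploratory view the abstract describes (clusters by binder or pigment composition).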
Abstract:
Adapting object recognition to mobile robotics requires a new approach and new applications that optimise robot training in order to obtain satisfactory results. The training process is known to be long and tedious, and human intervention is absolutely necessary to supervise the robot's behaviour and its progress towards the objectives. For this reason, a tool has been developed that considerably reduces the human effort required for this supervision, automating the process needed to obtain an evaluation of results and minimising the time wasted due to human error or lack of infrastructure.