64 results for 2ND-ORDER
at Université de Lausanne, Switzerland
Abstract:
According to the most widely accepted Cattell-Horn-Carroll (CHC) model of intelligence measurement, each subtest score of the Wechsler Intelligence Scale for Adults (3rd ed.; WAIS-III) should reflect both 1st- and 2nd-order factors (i.e., 4 or 5 broad abilities and 1 general factor). To disentangle the contribution of each factor, we applied a Schmid-Leiman orthogonalization transformation (SLT) to the standardization data published in the French technical manual for the WAIS-III. Results showed that the general factor accounted for 63% of the common variance and that the specific contributions of the 1st-order factors were weak (4.7%-15.9%). We also addressed this issue by using confirmatory factor analysis. Results indicated that the bifactor model (with 1st-order group and general factors) fit the data better than the traditional higher-order structure did. Models based on the CHC framework were also tested. Results indicated that a higher-order CHC model showed a better fit than the classical 4-factor model did; however, the WAIS bifactor structure was the most adequate. We recommend that users not discount the Full Scale IQ when interpreting the index scores of the WAIS-III, because the general factor accounts for the bulk of the common variance in the French WAIS-III. The 4 index scores cannot be considered to reflect only broad abilities, because they include a strong contribution of the general factor.
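As an illustration of the Schmid-Leiman logic used above: given first-order loadings and the loadings of the group factors on g, the transformation re-expresses each subtest as a direct g loading plus a residualized group loading. A minimal numpy sketch with made-up loadings (not the French WAIS-III standardization data):

```python
import numpy as np

# Illustrative first-order loadings: 8 subtests on 4 group factors
# (hypothetical numbers, not the WAIS-III data).
L1 = np.array([
    [0.80, 0.00, 0.00, 0.00],
    [0.70, 0.00, 0.00, 0.00],
    [0.00, 0.75, 0.00, 0.00],
    [0.00, 0.65, 0.00, 0.00],
    [0.00, 0.00, 0.70, 0.00],
    [0.00, 0.00, 0.60, 0.00],
    [0.00, 0.00, 0.00, 0.70],
    [0.00, 0.00, 0.00, 0.60],
])
# Second-order loadings of the 4 group factors on g (hypothetical).
g2 = np.array([0.80, 0.75, 0.70, 0.65])

# Schmid-Leiman: subtest loadings on g are L1 @ g2; the residualized
# group loadings are L1 scaled by the second-order uniquenesses.
g_loadings = L1 @ g2
group_loadings = L1 * np.sqrt(1.0 - g2**2)

# Share of the common variance attributable to the general factor.
common = g_loadings**2 + (group_loadings**2).sum(axis=1)
print("share of common variance due to g:",
      (g_loadings @ g_loadings) / common.sum())
```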
Abstract:
The interhemispheric asymmetries that originate from connectivity-related structuring of the cortex are compromised in schizophrenia (SZ). Under the assumption that such abnormalities affect functional connectivity, we analyzed its correlate, EEG synchronization, in SZ patients and matched controls. We applied multivariate synchronization measures based on Laplacian EEG and tuned to various spatial scales. Compared to the controls, who had rightward asymmetry at a local level (EEG power), rightward anterior and leftward posterior asymmetries at an intraregional level (1st- and 2nd-order S-estimator), and rightward global asymmetry (hemispheric S-estimator), SZ patients showed generally attenuated asymmetry, the effect being strongest for intraregional synchronization in the alpha and beta bands. The abnormalities of asymmetry increased with the duration of the disease and correlated with the negative symptoms. We discuss tentative links between these findings and gross anatomical asymmetries, including the cerebral torque and gyrification pattern, in normal subjects and SZ patients.
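For readers unfamiliar with the S-estimator: it is commonly defined from the entropy of the normalized eigenvalue spectrum of the channel correlation matrix, so that values near 0 indicate independent channels and 1 indicates full synchronization; the spatial scale is set by the choice of electrode cluster. A minimal sketch of that definition (the study's Laplacian derivation and band-pass filtering are not reproduced here):

```python
import numpy as np

def s_estimator(x):
    """S-estimator of multivariate synchronization.

    x: array of shape (channels, samples), e.g. EEG from one electrode
    cluster. Returns a value in [0, 1]; 1 = fully synchronized.
    """
    m = x.shape[0]
    c = np.corrcoef(x)                 # channel correlation matrix
    lam = np.linalg.eigvalsh(c) / m    # eigenvalues sum to m, so normalize
    lam = lam[lam > 1e-12]             # drop numerical zeros before the log
    return 1.0 + np.sum(lam * np.log(lam)) / np.log(m)

# Toy check: a shared driver raises S relative to independent noise.
rng = np.random.default_rng(0)
noise = rng.standard_normal((8, 2000))
driver = rng.standard_normal(2000)
coupled = noise + 2.0 * driver         # common signal across all channels
print(s_estimator(noise), s_estimator(coupled))
```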
Abstract:
Recent findings suggest an association between exposure to cleaning products and respiratory dysfunctions, including asthma. However, little information is available about the quantitative airborne exposure of professional cleaners to the volatile organic compounds deriving from cleaning products. During the first phases of the study, a systematic review of cleaning products was performed. Safety data sheets were reviewed to assess the most frequently added volatile organic compounds. It was found that professional cleaning products are complex mixtures of different components (3.5 ± 2.8 compounds per product), and more than 130 chemical substances listed in the safety data sheets were identified in 105 products. The main groups of chemicals were fragrances, glycol ethers, surfactants, and solvents; and, to a lesser extent, phosphates, salts, detergents, pH stabilizers, acids, and bases. Up to 75% of products contained substances labeled irritant (Xi), 64% harmful (Xn), and 28% corrosive (C). Hazards for eyes (59%), skin (50%), and by ingestion (60%) were the most frequently reported. Monoethanolamine, a strong irritant known to be involved in sensitizing mechanisms as well as allergic reactions, is frequently added to cleaning products. Determination of monoethanolamine in air has traditionally been difficult, and the available air sampling and analysis methods were poorly suited to personal occupational air concentration assessments. A convenient method was therefore developed, with air sampling on impregnated glass fiber filters followed by one-step desorption, gas chromatography, and nitrogen-phosphorus selective detection. An exposure assessment was conducted in the cleaning sector to determine airborne concentrations of monoethanolamine, glycol ethers, and benzyl alcohol during different cleaning tasks performed by professional cleaning workers in different companies, and to determine background air concentrations of formaldehyde, a known indoor air contaminant. The occupational exposure study was carried out in 12 cleaning companies, and personal air samples were collected for monoethanolamine (n=68), glycol ethers (n=79), benzyl alcohol (n=15), and formaldehyde (n=45). All measured air concentrations were far below (<1/10 of) the Swiss eight-hour occupational exposure limits, with the exception of ethylene glycol mono-n-butyl ether; butoxypropanol and benzyl alcohol could not be compared because no occupational exposure limits were available for them. Although detected only once, ethylene glycol mono-n-butyl ether air concentrations (n=4) were high (49.5 mg/m3 to 58.7 mg/m3), hovering at the Swiss occupational exposure limit (49 mg/m3). Background air concentrations showed no presence of monoethanolamine, while glycol ethers were often present and formaldehyde was universally detected. Exposures were influenced by the amount of monoethanolamine in the cleaning product, cross ventilation, and spraying. During the last phases of the study, the collected data were used to test an existing exposure modeling tool. The exposure estimates of this so-called Bayesian tool converged toward the measured exposure range as more measured air concentrations were added, a relationship best described by an inverse 2nd-order equation. The results suggest that the Bayesian tool is not adapted to predicting low exposures; it should also be tested with other datasets describing higher exposures.
Low exposures to different chemical sensitizers and irritants should be further investigated to better understand the development of respiratory disorders in cleaning workers. Prevention measures should focus especially on the incorrect use of cleaning products, to avoid high air concentrations near the exposure limits.
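The "inverse 2nd-order equation" describing the tool's convergence plausibly has the form y = a + b/n + c/n², with n the number of measurements added; that reading is an assumption. A hypothetical scipy sketch of such a fit, with illustrative numbers rather than the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical convergence data: deviation of the modeled estimate from
# the measured exposure range as more measurements are added
# (illustrative numbers, not the study's data).
n_measurements = np.array([1, 2, 4, 8, 16, 32])
deviation = np.array([2.1, 1.2, 0.55, 0.30, 0.16, 0.09])

def inverse_second_order(n, a, b, c):
    # One plausible reading of "inverse 2nd-order equation": a + b/n + c/n^2.
    return a + b / n + c / n**2

params, _ = curve_fit(inverse_second_order, n_measurements, deviation)
print("fitted a, b, c:", params)
```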
Abstract:
Most studies of the higher-order dimensions needed to parsimoniously describe Personality Disorders (PDs) have identified between two and four factors, but there is still no consensus about their exact number. In this context, the cultural stability of these structures might be a criterion to consider. The aim of this study was to identify stable higher-order structures of PD traits in a French-speaking African and Swiss sample (N = 2,711). All subjects completed the IPDE screening questionnaire. Using Everett's criterion and conducting a series of principal component analyses, cross-culturally stable two- and four-factor structures were identified, associated with total congruence coefficients of .98 and .94, respectively, after Procrustes rotation. Moreover, these two structures were also highly replicable across the four African regions considered (North Africa, West Africa, Central Africa, and Mauritius), with mean total congruence coefficients of .97 and .87, respectively. The four-factor structure presented the advantage of being similar to Livesley's four components and of describing the ten PDs more accurately.
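The replicability analysis reported here rests on two standard ingredients: an orthogonal Procrustes rotation of one loading matrix toward a target, and Tucker's congruence coefficient between matched factors. A minimal numpy sketch with hypothetical loading matrices (not the IPDE data):

```python
import numpy as np

def procrustes_rotation(loadings, target):
    """Orthogonal Procrustes: rotate `loadings` toward `target`."""
    u, _, vt = np.linalg.svd(loadings.T @ target)
    return loadings @ (u @ vt)

def congruence(x, y):
    """Tucker's congruence coefficient between two loading vectors."""
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

# Hypothetical 10-trait x 4-factor loading matrices for two samples.
rng = np.random.default_rng(1)
target = rng.uniform(-1, 1, (10, 4))
sample = target + 0.1 * rng.standard_normal((10, 4))

rotated = procrustes_rotation(sample, target)
phi = [congruence(rotated[:, j], target[:, j]) for j in range(4)]
print("per-factor congruence after rotation:", np.round(phi, 3))
```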
Abstract:
OBJECTIVE: To assess whether formatting the medical order sheet has an effect on the accuracy and safety of antibiotic prescription. DESIGN: Prospective assessment of antibiotic prescription over time, before and after the intervention, in comparison with a control ward. SETTING: The medical and surgical intensive care units (ICUs) of a university hospital. PATIENTS: All patients hospitalized in the medical or surgical ICU between February 1 and April 30, 1997, and July 1 and August 31, 2000, for whom antibiotics were prescribed. INTERVENTION: Formatting of the medical order sheet in the surgical ICU in 1998. MEASUREMENTS AND MAIN RESULTS: Compliance with the American Society of Hospital Pharmacists' criteria for prescription safety was measured. The proportion of safe orders increased in both units, but the increase was 4.6 times greater in the surgical ICU (from 66% to 74% in the medical ICU vs. from 48% to 74% in the surgical ICU). Among unsafe orders, the proportion of ambiguous orders decreased by half in the medical ICU (from 17% to 9%) and nearly disappeared in the surgical ICU (from 30% to 1%). The only missing criterion remaining in the surgical ICU was the drug dose unit, which could not be preformatted. The aim of the antibiotic prescription (prophylactic or therapeutic) was indicated in only 51% of the order sheets. CONCLUSIONS: Formatting the order sheet markedly increased the safety of antibiotic prescription. These findings must be confirmed in other settings and with different drug classes. Formatting the medical order sheet decreases the potential for prescribing errors until full computerized prescription is available.
Abstract:
We extend PML theory to account for information on the conditional moments up to order four, but without assuming a parametric model, to avoid the risk of misspecifying the conditional distribution. The key statistical tool is the quartic exponential family, which allows us to generalize the PML2 and QGPML1 methods proposed in Gourieroux et al. (1984) to PML4 and QGPML2 methods, respectively. An asymptotic theory is developed. The key numerical tool is the Gauss-Freud integration scheme, which solves a computational problem previously raised in several fields. Simulation exercises demonstrate the feasibility and robustness of the methods. [Authors]
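For reference, the quartic exponential family mentioned above is usually written as the set of densities whose log is a fourth-degree polynomial, with a negative leading coefficient so that the density is integrable:

```latex
% Quartic exponential family: log-density is a fourth-degree polynomial,
% with \theta_4 < 0 so that the density integrates to one.
f(y \mid \theta) = \exp\!\big(\theta_1 y + \theta_2 y^2 + \theta_3 y^3
                              + \theta_4 y^4 - \psi(\theta)\big),
\qquad
\psi(\theta) = \log \int_{\mathbb{R}}
    \exp\!\Big(\sum_{k=1}^{4} \theta_k y^k\Big)\, dy .
```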
Abstract:
Since the 1990s, and especially since the early 2000s, passionate controversies (Göle 2014) have emerged around the new visibility of Islam in the public sphere across Europe. These controversies, which crystallized in the headscarf debate, seem all the more disturbing given that the women who wear it are often young, urban, and educated: that is to say, "modern" (Göle 1997, 2011). Indeed, these young women wearing the hijab seem to disrupt the narrative of Western modernity, including the decline in religious practice (Hervieu-Léger 2006) and the narration of the process of secularization in Europe. It is in the context of these controversies that Islam is imaginatively constructed as a "public problem" that has to be "solved" (Behloul 2012). Thus, this social construction of the Muslim other has nurtured an assessment of the failure of multiculturalism in some European countries and a process of convergence around a single model of civic integration in Europe (Behloul 2012; Joppke 2004, 2010).
Abstract:
OBJECTIVE: To assess the change in non-compliant items in prescription orders following the implementation of a computerized physician order entry (CPOE) system named PreDiMed. SETTING: The departments of internal medicine (39 and 38 beds) of two regional hospitals in Canton Vaud, Switzerland. METHOD: The prescription lines in 100 pre- and 100 post-implementation patients' files were classified according to three modes of administration (medicines for oral or other non-parenteral use; medicines administered parenterally or via nasogastric tube; pro re nata (PRN), i.e., as needed) and analyzed for a number of relevant variables constitutive of medical prescriptions. MAIN OUTCOME MEASURE: The monitored variables depended on the pharmaceutical category and mainly included the name of the medicine, pharmaceutical form, posology and route of administration, diluting solution, flow rate, and identification of the prescriber. RESULTS: In 2,099 prescription lines, the total number of non-compliant items was 2,265 before CPOE implementation, or 1.079 non-compliant items per line. Two-thirds of these were due to missing information, and the remaining third to incomplete information. In 2,074 prescription lines after CPOE implementation, the number of non-compliant items had decreased to 221, or 0.107 non-compliant items per line, a dramatic 10-fold decrease (χ² = 4615; P < 10⁻⁶). Limitations of the computerized system were the risk of erroneous items in some non-prefilled fields and ambiguity due to a field showing doses of commercial products. CONCLUSION: The deployment of PreDiMed in two departments of internal medicine led to a major improvement in the formal aspects of physicians' prescriptions. Some limitations of the first version of PreDiMed were unveiled and are being corrected.
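The per-line rates and the 10-fold figure follow directly from the reported counts; a quick arithmetic check:

```python
# Check the per-line non-compliance rates reported in the abstract.
pre_items, pre_lines = 2265, 2099    # before CPOE implementation
post_items, post_lines = 221, 2074   # after CPOE implementation

pre_rate = pre_items / pre_lines     # ~1.079 non-compliant items per line
post_rate = post_items / post_lines  # ~0.107 non-compliant items per line
print(round(pre_rate, 3), round(post_rate, 3), round(pre_rate / post_rate, 1))
# -> 1.079 0.107 10.1 (the "dramatic 10-fold decrease")
```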
Abstract:
Introduction: Coronary magnetic resonance angiography (MRA) is a medical imaging technique that involves collecting data from consecutive heartbeats, always at the same time in the cardiac cycle, in order to minimize heart motion artifacts. This technique relies on the assumption that the coronary arteries always follow the same trajectory from heartbeat to heartbeat. Until now, choosing the acquisition window in the cardiac cycle was based exclusively on the position of minimal coronary motion. The goal of this study was to test the hypothesis that there are time intervals during the cardiac cycle when coronary beat-to-beat repositioning is optimal. The repositioning uncertainty values in these time intervals were then compared with the intervals of low coronary motion in order to propose an optimal acquisition window for coronary MRA. Methods: Cine breath-hold x-ray angiograms with synchronous ECG were collected from 11 patients who underwent elective routine diagnostic coronarography. Twenty-three bifurcations of the left coronary artery were selected as markers to evaluate repositioning uncertainty and velocity during the cardiac cycle. Each bifurcation was tracked by two observers, with the help of a user-assisted algorithm implemented in Matlab (The Mathworks, Natick, MA, USA) that compared the trajectories of the markers coming from consecutive heartbeats and computed the coronary repositioning uncertainty in steps of 50ms up to 650ms after the R-wave. Repositioning uncertainty was defined as the diameter of the smallest circle encompassing the points to be compared at the same time after the R-wave. Student's t-tests with a false discovery rate (FDR, q=0.1) correction for multiple comparisons were applied to see whether coronary repositioning and velocity vary statistically during the cardiac cycle. Bland-Altman plots and linear regression were used to assess intra- and inter-observer agreement. Results: The analysis of left coronary artery beat-to-beat repositioning uncertainty shows a tendency toward better repositioning in mid systole (less than 0.84±0.58mm) and mid diastole (less than 0.89±0.6mm) than in the rest of the cardiac cycle (highest value at 50ms: 1.35±0.64mm). According to Student's t-tests with FDR correction for multiple comparisons (q=0.1), two intervals, in mid systole (150-200ms) and mid diastole (550-600ms), provide statistically better repositioning in comparison with early systole and early diastole. Coronary velocity analysis reveals that the left coronary artery moves more slowly in end systole (14.35±11.35mm/s at 225ms) and mid diastole (11.78±11.62mm/s at 625ms) than in the rest of the cardiac cycle (highest value at 25ms: 55.96±22.34mm/s). This was confirmed by Student's t-tests with FDR correction for multiple comparisons (q=0.1, FDR-corrected p-value=0.054): coronary velocity values at 225, 575, and 625ms do not differ much from one another but are statistically lower than all the others. Bland-Altman plots and linear regression show that intra-observer agreement (y=0.97x+0.02 with R²=0.93 at 150ms) is better than inter-observer agreement (y=0.8x+0.11 with R²=0.67 at 150ms). Discussion: The present study has demonstrated that there are two time intervals in the cardiac cycle, one in mid systole and one in mid diastole, where left coronary artery repositioning uncertainty reaches local minima. It has also been calculated that the velocity is lowest in end systole and mid diastole.
Since systole is less influenced by heart rate variability than diastole, it was finally proposed to test an acquisition window between 150 and 200ms after the R-wave.
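The repositioning measure defined above, the diameter of the smallest circle encompassing the marker positions compared at a given time after the R-wave, can be computed by brute force for small point sets. An illustrative Python routine (a sketch, not the authors' Matlab implementation):

```python
import numpy as np
from itertools import combinations

def min_enclosing_diameter(points, tol=1e-9):
    """Diameter of the smallest circle containing all 2-D points.

    Brute force over pairs and triples; fine for the handful of marker
    positions compared at one time offset after the R-wave.
    """
    pts = np.asarray(points, dtype=float)
    if len(pts) == 1:
        return 0.0

    def contains_all(center, r):
        return np.all(np.linalg.norm(pts - center, axis=1) <= r + tol)

    best = np.inf
    # Circles defined by two points (diameter endpoints).
    for a, b in combinations(pts, 2):
        c, r = (a + b) / 2, np.linalg.norm(a - b) / 2
        if 2 * r < best and contains_all(c, r):
            best = 2 * r
    # Circumcircles of non-degenerate triples.
    for a, b, c in combinations(pts, 3):
        d = 2 * ((a[0]-c[0])*(b[1]-c[1]) - (b[0]-c[0])*(a[1]-c[1]))
        if abs(d) < tol:
            continue  # collinear points: covered by a pair circle
        ux = ((a@a - c@c)*(b[1]-c[1]) - (b@b - c@c)*(a[1]-c[1])) / d
        uy = ((b@b - c@c)*(a[0]-c[0]) - (a@a - c@c)*(b[0]-c[0])) / d
        center = np.array([ux, uy])
        r = np.linalg.norm(a - center)
        if 2 * r < best and contains_all(center, r):
            best = 2 * r
    return best

# Example: marker positions (mm) from three consecutive heartbeats
# at one time offset (hypothetical coordinates).
print(min_enclosing_diameter([[0.0, 0.0], [1.0, 0.2], [0.4, 0.9]]))
```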
Abstract:
Introduction: Accurate registration of the relative timing between the occurrence of sensory events on a sub-second time scale is crucial for both sensory-motor and cognitive functions (Mauk and Buonomano, 2004; Habib, 2000). Support for this assumption comes notably from evidence that temporal processing impairments are implicated in a range of neurological and psychiatric conditions (e.g., Buhusi & Meck, 2005). For instance, deficits in fast auditory temporal integration have regularly been put forward as resulting in the phonologic discrimination impairments at the basis of the speech comprehension deficits characterizing, e.g., dyslexia (Habib, 2000). At least two aspects of the brain mechanisms of temporal order judgment (TOJ) remain unknown. First, it is unknown when during the course of stimulus processing a temporal 'stamp' is established to guide TOJ perception. Second, the extent of interplay between the cerebral hemispheres in engendering accurate TOJ performance is unresolved. Methods: We investigated the spatiotemporal brain dynamics of auditory temporal order judgment (aTOJ) using electrical neuroimaging analyses of auditory evoked potentials (AEPs) recorded while participants completed a near-threshold task requiring spatial discrimination of left-right and right-left sound sequences. Results: AEPs to sound pairs modulated topographically as a function of aTOJ accuracy over the 39-77ms post-stimulus period, indicating the engagement of distinct configurations of brain networks during early auditory processing stages. Source estimations revealed that accurate and inaccurate performance were linked to activity in bilateral posterior sylvian regions (PSR). However, activity within left, but not right, PSR predicted behavioral performance, suggesting that left PSR activity during early encoding of pairs of auditory spatial stimuli is critical for the perception of their order of occurrence. Correlation analyses of source estimations further revealed that activity between left and right PSR was significantly correlated in the inaccurate but not the accurate condition, indicating that aTOJ accuracy depends on the functional decoupling between homotopic PSR areas. Conclusions: These results support a model of temporal order processing wherein behaviorally relevant temporal information, i.e., a temporal 'stamp', is extracted within the early stages of cortical processing within left PSR but is critically modulated by inputs from right PSR. We discuss our results with regard to current models of temporal order processing, namely gating and latency mechanisms.
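Topographic modulation of AEPs, as reported in the Results, is conventionally quantified in electrical neuroimaging with global map dissimilarity (DISS) between average-referenced, strength-normalized scalp maps; a sketch of that standard measure (the authors' exact pipeline may differ):

```python
import numpy as np

def gfp(v):
    """Global field power: spatial standard deviation of an average-referenced map."""
    v = v - v.mean()
    return np.sqrt(np.mean(v**2))

def diss(u, v):
    """Global map dissimilarity between two scalp maps (0 = identical topography)."""
    u = u - u.mean()
    v = v - v.mean()
    return np.sqrt(np.mean((u / gfp(u) - v / gfp(v))**2))

# Toy maps over 64 electrodes: a map vs. a scaled copy (same topography,
# DISS ~ 0) and vs. an unrelated map (DISS > 0).
rng = np.random.default_rng(2)
map_a = rng.standard_normal(64)
print(diss(map_a, 3.0 * map_a), diss(map_a, rng.standard_normal(64)))
```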