882 results for multi-method study
Abstract:
Workaholism is defined as the combination of two underlying dimensions: working excessively and working compulsively. The present thesis pursues the following purposes: 1) to test whether the interaction between environmental and personal antecedents may enhance workaholism; 2) to develop a questionnaire to assess overwork climate in the workplace; 3) to contrast focal employees' and coworkers' perceptions of employees' workaholism and engagement. Concerning the first purpose, the interaction between overwork climate and person characteristics (achievement motivation, perfectionism, conscientiousness, self-efficacy) was explored in a sample of 333 Dutch employees. Moderated regression analyses showed that the interaction between overwork climate and person characteristics is related to workaholism. The second purpose was pursued with two interrelated studies. In Study 1 the Overwork Climate Scale (OWCS) was developed and tested using a principal component analysis (N = 395) and a confirmatory factor analysis (N = 396). Two overwork climate dimensions were distinguished: overwork endorsement and lacking overwork rewards. In Study 2 the total sample (N = 791) was used to explore the association of overwork climate with two types of working hard: work engagement and workaholism. Lacking overwork rewards was negatively associated with engagement, whereas overwork endorsement showed a positive association with workaholism. Concerning the third purpose, a sample of 73 dyads composed of focal employees and their coworkers was examined with a multitrait-multimethod matrix and a correlated trait-correlated method model, i.e. the CT-C(M–1) model. Our results showed considerable agreement between raters on focal employees' engagement and workaholism. In contrast, we observed a significant difference concerning the cognitive dimension of workaholism, working compulsively. Moreover, we provided further evidence for the discriminant validity between engagement and workaholism. Overall, workaholism appears to be a negative work-related state that is better explained by assuming a multi-causal and multi-rater approach.
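As an informal illustration of the moderated regression step described above, the following Python sketch fits an interaction model between a hypothetical overwork-climate score and one person characteristic. The variable names, the synthetic data, and the use of statsmodels are assumptions made for illustration; this is not the analysis code of the thesis.

```python
# Hypothetical sketch of a moderated regression testing whether an
# overwork-climate x person-characteristic interaction relates to workaholism.
# Variable names and the synthetic data are illustrative, not the thesis data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 333  # sample size reported in the abstract
climate = rng.normal(size=n)   # perceived overwork climate
trait = rng.normal(size=n)     # e.g., a perfectionism score
workaholism = 0.3 * climate + 0.2 * trait + 0.25 * climate * trait + rng.normal(size=n)

df = pd.DataFrame({"climate": climate, "trait": trait, "workaholism": workaholism})
# Mean-center the predictors so the interaction term is interpretable.
df["climate_c"] = df["climate"] - df["climate"].mean()
df["trait_c"] = df["trait"] - df["trait"].mean()

model = smf.ols("workaholism ~ climate_c * trait_c", data=df).fit()
print(model.summary().tables[1])  # the climate_c:trait_c row tests the moderation
```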
Abstract:
An extensive study of the morphology and the dynamics of the equatorial ionosphere over South America is presented here. A multi-parametric approach is used to describe the physical characteristics of the ionosphere in the regions where the combination of the thermospheric electric field and the horizontal geomagnetic field creates the so-called Equatorial Ionization Anomalies. Ground-based measurements from GNSS receivers are used to link the Total Electron Content (TEC), its spatial gradients, and the phenomenon known as scintillation, which can degrade a GNSS signal or even cause a 'loss of lock'. A new algorithm to highlight the features characterizing the TEC distribution is developed in the framework of this thesis, and the results obtained are validated and used to improve the performance of a GNSS positioning technique (long-baseline RTK). In addition, the correlation between scintillation and the dynamics of the ionospheric irregularities is investigated. Using software implemented for this work, the velocity of the ionospheric irregularities is evaluated from high-sampling-rate GNSS measurements. The results highlight the parallel behaviour of the amplitude scintillation index (S4) occurrence and the zonal velocity of the ionospheric irregularities, at least under severe scintillation conditions (post-sunset hours). This suggests that scintillation is driven by TEC gradients as well as by the dynamics of the ionospheric plasma. Finally, given the importance of such studies for technological applications (e.g. GNSS high-precision applications), a validation of the NeQuick model (i.e. the model used in the new GALILEO satellites for TEC modelling) is performed. The NeQuick performance improves dramatically when data from HF radar sounding (ionograms) are ingested. A custom-designed algorithm based on image recognition techniques is developed to properly select the ingested data, leading to a further improvement of the NeQuick performance.
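The amplitude scintillation index mentioned above has a standard definition, S4 = sqrt((<I^2> - <I>^2) / <I>^2), computed over short windows of detrended signal intensity. The sketch below shows that calculation on synthetic samples; the 50 Hz sampling rate and 60 s window are assumed values, not necessarily those used in the thesis.

```python
# Minimal sketch of the amplitude scintillation index S4 computed from
# GNSS signal-intensity samples; the 60 s window and 50 Hz rate are
# illustrative assumptions, not settings taken from the thesis.
import numpy as np

def s4_index(intensity: np.ndarray) -> float:
    """S4 = sqrt((<I^2> - <I>^2) / <I>^2) over one analysis window."""
    mean_i = intensity.mean()
    return np.sqrt((np.mean(intensity**2) - mean_i**2) / mean_i**2)

# Example: 60 s of 50 Hz samples with weak synthetic fluctuations.
rng = np.random.default_rng(1)
intensity = 1.0 + 0.1 * rng.standard_normal(60 * 50)
print(f"S4 = {s4_index(np.clip(intensity, 0, None)):.3f}")
```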
Abstract:
Since its discovery, the top quark has been one of the most investigated fields in particle physics. The aim of this thesis is the reconstruction of hadronic tops with high transverse momentum (boosted) using the Template Overlap Method (TOM). Because of the high energy, the decay products of boosted tops partially or totally overlap and are thus contained in a single large-radius jet (fat-jet). TOM compares the internal energy distribution of the candidate fat-jet to a sample of tops obtained from a MC simulation (templates). The algorithm is based on the definition of an overlap function, which quantifies the level of agreement between the fat-jet and the template, allowing an efficient discrimination of the signal from the background contributions. A working point was chosen to obtain a signal efficiency close to 90% and a corresponding background rejection of 70%. The TOM performance has been tested on MC samples in the muon channel and compared with the previous methods in the literature. All the methods will be merged into a multivariate analysis to provide a global top tagger, which will be included in the ttbar production differential cross-section measurement performed on the data acquired in 2012 at sqrt(s) = 8 TeV in the high-pT region of phase space, where new physics processes could appear. Because its performance improves with increasing pT, the Template Overlap Method will play a crucial role in the next data taking at sqrt(s) = 13 TeV, where almost all tops will be produced at high energy, making the standard reconstruction methods inefficient.
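The overlap function at the heart of TOM can be illustrated with a simplified sketch: for each template (a set of parton directions and energies), the fat-jet constituent pT found in small cones around the template partons is compared with the template energies through a Gaussian weight, and the best-matching template defines the overlap score. The cone size, resolution parameter, and toy inputs below are assumptions for illustration only, not the analysis configuration used in the thesis.

```python
# Simplified sketch of the overlap-function idea behind the Template Overlap
# Method: sum the jet-constituent pT in small cones around each template
# parton, score the match with a Gaussian, and keep the best template.
import numpy as np

def overlap(constituents, templates, cone=0.2, sigma_frac=1/3):
    """constituents: array of (pt, eta, phi); templates: list of arrays of (pt, eta, phi)."""
    best = 0.0
    for tmpl in templates:
        chi2 = 0.0
        for pt_t, eta_t, phi_t in tmpl:
            dphi = np.angle(np.exp(1j * (constituents[:, 2] - phi_t)))  # wrap to [-pi, pi]
            dr2 = (constituents[:, 1] - eta_t) ** 2 + dphi ** 2
            pt_in_cone = constituents[dr2 < cone**2, 0].sum()
            chi2 += (pt_in_cone - pt_t) ** 2 / (2 * (sigma_frac * pt_t) ** 2)
        best = max(best, np.exp(-chi2))
    return best

# Toy usage: a fat-jet with three hard constituents vs. one three-parton template.
jet = np.array([[200., 0.0, 0.0], [150., 0.3, 0.4], [100., -0.3, -0.4]])
template = [np.array([[200., 0.0, 0.0], [150., 0.3, 0.4], [100., -0.3, -0.4]])]
print(overlap(jet, template))  # ~1.0 for a perfect match
```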
Abstract:
Today we know that ordinary matter represents only a small fraction of the total mass content of the Universe. The hypothesis of the existence of Dark Matter, a new type of matter that interacts only gravitationally and, possibly, through the weak force, has been supported by numerous pieces of evidence on both galactic and cosmological scales. Efforts devoted to the search for so-called WIMPs (Weakly Interacting Massive Particles), the generic name given to Dark Matter particles, have multiplied over recent years. The XENON1T experiment, currently under construction at the Laboratori Nazionali del Gran Sasso (LNGS) and expected to start taking data by the end of 2015, will mark a significant step forward in the direct search for Dark Matter, which is based on the detection of elastic collisions on target nuclei. XENON1T represents the current phase of the XENON project, which has already carried out the XENON10 (2005) and XENON100 (2008, still in operation) experiments and also foresees a further development, called XENONnT. The XENON1T detector uses about 3 tonnes of liquid xenon (LXe) and is based on a dual-phase Time Projection Chamber (TPC). Detailed Monte Carlo simulations of the detector geometry, together with dedicated measurements of the radioactivity of the materials and estimates of the purity of the xenon used, have allowed an accurate prediction of the expected background. In this thesis we present the study of the expected sensitivity of XENON1T carried out with the statistical method known as the Profile Likelihood (PL) Ratio, which, within a frequentist approach, allows a proper treatment of the systematic uncertainties. First, the sensitivity was estimated with the simplified Likelihood Ratio method, which does not account for any systematics. This made it possible to evaluate the impact of the main systematic uncertainty for XENON1T, namely the scintillation-light yield of xenon for low-energy nuclear recoils. The final results obtained with the PL method indicate that XENON1T will be able to significantly improve the current WIMP exclusion limits; the maximum sensitivity reaches a cross section of σ = 1.2·10^-47 cm^2 for a WIMP mass of 50 GeV/c^2 and a nominal exposure of 2 tonne·years. The results obtained are in line with the ambitious goal of XENON1T to lower the current limits on the WIMP cross section, σ, by two orders of magnitude. With such performance, and considering 1 tonne of LXe as fiducial mass, XENON1T will be able to surpass the current limits (LUX experiment, 2013) after only 5 days of data taking.
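As a toy illustration of the profile-likelihood-ratio machinery mentioned above, the sketch below builds a single-bin counting likelihood with one nuisance parameter (the expected background, constrained by a Gaussian) and profiles it numerically. The model and all numbers are invented for illustration and are far simpler than the actual XENON1T likelihood.

```python
# Toy sketch of a profile-likelihood-ratio test for a counting experiment with
# one nuisance parameter (the expected background); the counts, background
# estimate, and uncertainty below are illustrative only.
import numpy as np
from scipy.optimize import minimize, minimize_scalar
from scipy.stats import poisson, norm

n_obs, b_nominal, sigma_b = 2, 1.8, 0.5  # observed counts, background estimate, its uncertainty

def nll(mu, b):
    """Negative log-likelihood: Poisson counts times a Gaussian background constraint."""
    return -(poisson.logpmf(n_obs, mu + b) + norm.logpdf(b_nominal, b, sigma_b))

def q_mu(mu):
    """Profile likelihood ratio test statistic q_mu = -2 ln lambda(mu)."""
    cond = minimize_scalar(lambda b: nll(mu, b), bounds=(1e-6, 20), method="bounded")
    glob = minimize(lambda p: nll(p[0], p[1]), x0=[1.0, b_nominal],
                    bounds=[(0, 50), (1e-6, 20)])
    return 2 * (cond.fun - glob.fun)

print(q_mu(5.0))  # larger values disfavour a signal expectation of 5 events
```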
Abstract:
This study assessed the safety and efficacy of a novel implantable device therapy in resistant hypertension patients.
Abstract:
Craniosynostosis consists of a premature fusion of the sutures in an infant skull that restricts skull and brain growth. During the last decades, there has been a rapid increase of fundamentally diverse surgical treatment methods. At present, the surgical outcome has been assessed using global variables such as cephalic index, head circumference, and intracranial volume. However, these variables have failed to describe the local deformations and morphological changes that may have a role in the neurologic disorders observed in the patients. This report describes a rigid image registration-based method for evaluating the outcome of craniosynostosis surgical treatment, locally quantifying head growth, and indirectly measuring intracranial volume changes. The developed semiautomatic analysis method was applied to computed tomography data sets of a 5-month-old boy with sagittal craniosynostosis who underwent expansion of the posterior skull with cranioplasty. Local changes between the pre- and postoperative images were quantified by mapping the minimum distance of individual points from the preoperative to the postoperative surface meshes, and indirect intracranial volume changes were estimated. The proposed methodology can provide the surgeon with a tool for the quantitative evaluation of surgical procedures and the detection of abnormalities of the infant skull and its development.
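A minimal sketch of the distance-mapping step, assuming the two surface meshes have already been rigidly registered: each preoperative vertex is matched to its nearest postoperative vertex with a k-d tree. Nearest-vertex search is used here only as an illustration; a true point-to-surface (point-to-triangle) distance would follow the report's description more closely.

```python
# Minimal sketch of mapping minimum distances from preoperative to
# postoperative surface points; the vertex clouds below are placeholders.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
pre_vertices = rng.normal(size=(5000, 3))                               # placeholder pre-op mesh points
post_vertices = pre_vertices + rng.normal(scale=0.05, size=(5000, 3))   # placeholder post-op points

tree = cKDTree(post_vertices)
distances, _ = tree.query(pre_vertices)   # minimum distance for every pre-op vertex
print(f"mean displacement: {distances.mean():.3f}, max: {distances.max():.3f}")
```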
Abstract:
The Bergman cyclization of large polycyclic enediyne systems that mimic the cores of the enediyne anticancer antibiotics was studied using the ONIOM hybrid method. Tests on small enediynes show that ONIOM can accurately match experimental data. The effect of the triggering reaction in the natural products is investigated, and we support the argument that it is strain effects that lower the cyclization barrier. The barrier for the triggered molecule is very low, leading to a reasonable half-life at biological temperatures. No evidence is found that would suggest a concerted cyclization/H-atom abstraction mechanism is necessary for DNA cleavage.
Abstract:
Over the past 7 years, the enediyne anticancer antibiotics have been widely studied due to their DNA-cleaving ability. The focus of interest in these antibiotics, represented by the kedarcidin chromophore, the neocarzinostatin chromophore, calicheamicin, esperamicin A, and dynemicin A, is the enediyne moiety contained within each of them. In its inactive form, the moiety is benign to its environment. Upon suitable activation, the system undergoes a Bergman cycloaromatization proceeding through a 1,4-dehydrobenzene diradical intermediate. It is this diradical intermediate that is thought to cleave double-stranded DNA through hydrogen atom abstraction. Semiempirical, semiempirical CI, Hartree–Fock ab initio, and MP2 electron correlation methods have been used to investigate the inactive hex-3-ene-1,5-diyne reactant, the 1,4-dehydrobenzene diradical, and a transition state structure of the Bergman reaction. Geometries calculated with different basis sets and by semiempirical methods have been used for single-point calculations using electron correlation methods. These results are compared with the best experimental and theoretical results reported in the literature. Implications of these results for computational studies of the enediyne anticancer antibiotics are discussed.
Abstract:
Background: Abstractor training is a key element in creating valid and reliable data collection procedures. The choice between in-person vs. remote or simultaneous vs. sequential abstractor training has considerable consequences for time and resource utilization. We conducted a web-based (webinar) abstractor training session to standardize training across six individual Cancer Research Network (CRN) sites for a study of breast cancer treatment effects in older women (BOWII). The goals of this manuscript are to describe the training session, its participants, and the participants' evaluation of webinar technology for abstraction training. Findings: A webinar was held for all six sites with the primary purpose of simultaneously training staff and ensuring consistent abstraction across sites. The training session involved sequential review of over 600 data elements outlined in the coding manual in conjunction with the display of data entry fields in the study's electronic data collection system. Post-training evaluation was conducted via Survey Monkey©. Inter-rater reliability measures for abstractors within each site were conducted three months after the commencement of data collection. Ten of the 16 people who participated in the training completed the online survey. Almost all (90%) of the 10 respondents had previous medical record abstraction experience, and nearly two-thirds reported over 10 years of experience. Half of the respondents had previously participated in a webinar, of whom three had done so for training purposes. All rated the knowledge and information delivered through the webinar as useful and reported that it adequately prepared them for data collection. Moreover, all participants would recommend this platform for multi-site abstraction training. Consistent with participant-reported training effectiveness, within-site inter-rater agreement ranged from 89% to 98%, with a weighted average of 95% agreement across sites. Conclusions: Conducting training via web-based technology was an acceptable and effective approach to standardizing medical record review across multiple sites for this group of experienced abstractors. Given the substantial time and cost savings achieved with the webinar, coupled with participants' positive evaluation of the training session, researchers should consider this instructional method as part of training efforts to ensure high quality data collection in multi-site studies.
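As a small illustration of the within-site agreement statistics quoted above, the sketch below computes simple percent agreement between two abstractors and a record-weighted average across sites. All counts are invented; only the formulae are implied by the abstract.

```python
# Illustrative percent-agreement calculation; per-site values and record
# counts below are hypothetical, not the BOWII results.
import numpy as np

def percent_agreement(rater_a, rater_b):
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    return 100.0 * np.mean(a == b)

rng = np.random.default_rng(4)
rater_a = rng.integers(0, 2, size=50)                              # 50 hypothetical binary data elements
rater_b = np.where(rng.random(50) < 0.95, rater_a, 1 - rater_a)    # ~95% agreement by construction
print(f"one-site agreement: {percent_agreement(rater_a, rater_b):.1f}%")

site_agreement = np.array([89.0, 92.0, 95.0, 96.0, 97.0, 98.0])    # hypothetical per-site agreement
site_records = np.array([40, 55, 60, 50, 45, 70])                  # hypothetical double-abstracted records
print(f"weighted average agreement: {np.average(site_agreement, weights=site_records):.1f}%")
```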
Abstract:
Nonserial observations have shown this bioresorbable scaffold to have no signs of area reduction at 6 months and recovery of vasomotion at 1 year. Serial observations at 6 months and 2 years are needed to confirm the absence of late restenosis or unfavorable imaging outcomes.
Abstract:
The purpose of this study was (1) to determine the frequency and type of medication errors (MEs), (2) to assess the number of MEs prevented by registered nurses, (3) to assess the consequences of MEs for patients, and (4) to compare the number of MEs reported by a newly developed medication error self-reporting tool to the number reported by the traditional incident reporting system. We conducted a cross-sectional study on MEs in the Cardiovascular Surgery Department of Bern University Hospital in Switzerland. Eligible registered nurses (n = 119) involved in the medication process were included. Data on MEs were collected using an investigator-developed medication error self-reporting tool (MESRT) that asked about the occurrence and characteristics of MEs. Registered nurses were instructed to complete a MESRT at the end of each shift even if there was no ME. All MESRTs were completed anonymously. During the one-month study period, a total of 987 MESRTs were returned. Of the 987 completed MESRTs, 288 (29%) indicated that there had been an ME. Registered nurses reported preventing 49 (5%) MEs. Overall, eight (2.8%) MEs had patient consequences. The high response rate suggests that this new method may be a very effective approach to detecting, reporting, and describing MEs in hospitals.
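The rates quoted above follow from simple proportions; the short sketch below reproduces them, assuming (as the percentages suggest) that the prevented-error rate is taken over all returned MESRTs and the consequence rate over the reports with an error.

```python
# Back-of-the-envelope check of the reported rates; counts are taken from the
# abstract, the choice of denominators is an assumption that matches the
# quoted percentages.
total_reports = 987       # completed MESRTs
reports_with_me = 288     # shifts on which a medication error occurred
prevented = 49            # errors reported as prevented by registered nurses
with_consequences = 8     # errors with patient consequences

print(f"ME rate per report: {reports_with_me / total_reports:.1%}")          # ~29%
print(f"prevented per report: {prevented / total_reports:.1%}")              # ~5%
print(f"consequences among MEs: {with_consequences / reports_with_me:.1%}")  # ~2.8%
```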
Abstract:
Blood oxygenation level-dependent (BOLD) MRI has been shown to allow non-invasive observation of renal oxygenation in humans. However, clinical applications of this type of functional MRI of the kidney are still limited, most likely because of difficulties in obtaining reproducible and reliable information. The aim of this study was to evaluate the reproducibility and robustness of a BOLD method applied to the kidneys and to identify systematic physiological changes potentially influencing the renal oxygenation of healthy volunteers. To measure the BOLD effect, a modified multi-echo data image combination (MEDIC) sequence was used to acquire 12 T2*-weighted images within a single breath-hold. Three identical measurements were performed on three axial and three coronal slices of the right and left kidneys in 18 volunteers. The mean R2* (1/T2*) values determined in the medulla and cortex showed no significant differences over the three repetitions and low intra-subject coefficients of variation (CV) (3% and 4% in medulla and cortex, respectively). The average R2* values were higher in the medulla (16.15 ± 0.11) than in the cortex (11.69 ± 0.18) (P < 0.001). Only a minor influence of slice orientation was observed. Mean R2* values were slightly higher (3%) in the left than in the right kidney (P < 0.001). Differences between volunteers were identified (P < 0.001). Part of these differences was attributable to age-dependent R2* values, since these values increased with age in both the medulla (P < 0.001, r = 0.67) and the cortex (P < 0.020, r = 0.42). Thus, BOLD measurements in the kidney are highly reproducible and robust. The results allow one to identify the known cortico-medullary oxygenation gradient, evidenced by the gradient of R2* values, and suggest that the medulla is more hypoxic in older than in younger individuals. BOLD-MRI is therefore a useful tool to study regional oxygenation of human kidneys sequentially and non-invasively.
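R2* (= 1/T2*) values such as those reported above are typically obtained by fitting a mono-exponential decay S(TE) = S0·exp(-R2*·TE) to the multi-echo signal; the sketch below shows a log-linear version of that fit for one voxel on synthetic data. The echo times and noise level are assumptions, not the study's acquisition parameters.

```python
# Sketch of a voxel-wise R2* estimate from a 12-echo acquisition via a
# log-linear fit of signal vs. echo time; all numbers are synthetic.
import numpy as np

echo_times = np.linspace(0.005, 0.060, 12)   # 12 echoes, in seconds (assumed spacing)
true_r2star = 16.0                           # 1/s, similar to the medullary value reported
signal = 1000.0 * np.exp(-true_r2star * echo_times)
signal += np.random.default_rng(3).normal(scale=5.0, size=signal.size)

# Linear fit of ln(S) vs TE: the slope is -R2*.
slope, _ = np.polyfit(echo_times, np.log(signal), 1)
print(f"estimated R2* = {-slope:.2f} 1/s")
```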
Abstract:
PURPOSE: To prospectively evaluate, for the depiction of simulated hypervascular liver lesions in a phantom, the effect of a low tube voltage, high tube current computed tomographic (CT) technique on image noise, contrast-to-noise ratio (CNR), lesion conspicuity, and radiation dose. MATERIALS AND METHODS: A custom liver phantom containing 16 cylindric cavities (four cavities each of 3, 5, 8, and 15 mm in diameter) filled with various iodinated solutions to simulate hypervascular liver lesions was scanned with a 64-section multi-detector row CT scanner at 140, 120, 100, and 80 kVp, with corresponding tube current-time product settings of 225, 275, 420, and 675 mAs, respectively. The CNRs for six simulated lesions filled with different iodinated solutions were calculated. A figure of merit (FOM) for each lesion was computed as the ratio of the squared CNR to the effective dose (ED). Three radiologists independently graded the conspicuity of the 16 simulated lesions. An anthropomorphic phantom was scanned to evaluate the ED. Statistical analysis included one-way analysis of variance. RESULTS: Image noise increased by 45% with the 80-kVp protocol compared with the 140-kVp protocol (P < .001). However, the lowest ED and the highest CNR were achieved with the 80-kVp protocol. The FOM results indicated that at a constant ED, a reduction of tube voltage from 140 to 120, 100, and 80 kVp increased the CNR by factors of at least 1.6, 2.4, and 3.6, respectively (P < .001). At a constant CNR, the corresponding reductions in ED were by factors of 2.5, 5.5, and 12.7, respectively (P < .001). The highest lesion conspicuity was achieved with the 80-kVp protocol. CONCLUSION: The CNR of simulated hypervascular liver lesions can be substantially increased, and the radiation dose reduced, by using an 80-kVp, high tube current CT technique.
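For reference, the CNR and figure-of-merit definitions used above can be written out in a few lines; the ROI statistics and effective doses in the sketch below are hypothetical, chosen only to mimic the direction of the reported trend, not the study's measurements.

```python
# Sketch of the contrast-to-noise ratio and dose-normalised figure of merit;
# all lesion/background values and doses below are invented for illustration.
import numpy as np

def cnr(lesion_mean, background_mean, background_sd):
    """CNR = (HU_lesion - HU_background) / image-noise SD."""
    return (lesion_mean - background_mean) / background_sd

def figure_of_merit(cnr_value, effective_dose_msv):
    """FOM = CNR^2 / effective dose, i.e. dose-normalised detectability."""
    return cnr_value**2 / effective_dose_msv

# Hypothetical values for a 140 kVp and an 80 kVp protocol.
for kvp, (lesion, bg, sd, ed) in {"140 kVp": (110, 90, 10.0, 20.0),
                                  "80 kVp": (140, 90, 14.5, 8.0)}.items():
    c = cnr(lesion, bg, sd)
    print(f"{kvp}: CNR = {c:.2f}, FOM = {figure_of_merit(c, ed):.2f}")
```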