8 results for multiple choice tests
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
In the field of educational and psychological measurement, the shift from paper-based to computerized tests has become a prominent trend in recent years. Computerized tests allow for more complex and personalized administration procedures, such as Computerized Adaptive Testing (CAT). CAT, built on Item Response Theory (IRT) models, dynamically assembles tests based on test-taker responses, driven by statistical algorithms. Although CAT structures are complex, they are flexible and convenient; however, concerns about test security must be addressed. Frequent item administration can lead to item exposure and cheating, necessitating preventive and diagnostic measures. This thesis develops a method called "CHeater identification using Interim Person fit Statistic" (CHIPS), designed to identify and limit cheaters in real time during test administration. CHIPS uses response times (RTs) to calculate an Interim Person fit Statistic (IPS), allowing for on-the-fly intervention using a more secret item bank. A slight modification, called Modified-CHIPS (M-CHIPS), is also proposed to handle situations with constant speed. A simulation study assesses CHIPS, highlighting its effectiveness in identifying and controlling cheaters; however, it reveals limitations when cheaters possess all the correct answers, a limitation that M-CHIPS overcomes. Furthermore, the method has been shown not to be influenced by the cheaters' ability distribution or by the level of correlation between test-takers' ability and speed. Finally, the method has demonstrated flexibility with respect to the choice of significance level and the transition from fixed-length tests to variable-length ones. The thesis discusses potential applications, including the suitability of the method for multiple-choice tests, assumptions about the RT distribution, and the level of item pre-knowledge.
Limitations are also discussed, pointing to future developments such as different RT distributions, unusual honest-respondent behaviors, and field testing in real-world scenarios. In summary, CHIPS and M-CHIPS offer real-time cheating detection in CAT, enhancing test security and ability estimation without penalizing honest test respondents.
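The core idea of an interim person-fit statistic on response times can be sketched as follows. This is a hypothetical illustration, not the actual CHIPS algorithm: it assumes a lognormal RT model in which each item has a known expected log-RT and standard deviation, and it flags a respondent whose answers arrive systematically faster than the model predicts.

```python
import math

def interim_rt_statistic(log_rts, mu, sds):
    """Standardized sum of log-RT residuals over the items seen so far.

    log_rts : observed log response times
    mu      : model-implied mean log-RTs (hypothetical lognormal RT model)
    sds     : model-implied standard deviations of the log-RTs
    Large negative values mean the answers arrived suspiciously fast.
    """
    z = sum((o - e) / s for o, e, s in zip(log_rts, mu, sds))
    return z / math.sqrt(len(log_rts))

# Toy respondent answering much faster than the model predicts
obs = [1.0, 0.9, 1.1]    # observed log-seconds (illustrative values)
mu = [2.0, 2.1, 1.9]     # expected log-seconds under the RT model
sd = [0.5, 0.5, 0.5]
stat = interim_rt_statistic(obs, mu, sd)
flagged = stat < -1.645  # one-sided check at the 5% significance level
```

In a CAT system a check of this kind would run after every administered item, so that a flagged test-taker can be switched to a more secret item bank mid-test.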
Abstract:
Background: New diagnostic applications of ultrasound (US), such as contrast-enhanced US, imply new potential hazards in its use. Aim: To assess the level of awareness and knowledge of safety issues in the clinical use of US among physicians who are members of the Italian National Society for Ultrasound (SIUMB). Materials and methods: A questionnaire comprising 11 multiple-choice questions was sent by e-mail to members of SIUMB who had preliminarily agreed to participate in this initiative. The answers were received anonymously and statistically analyzed. Results: The number of valid returned questionnaires was 97 (8 were considered not valid because fewer than 10 answers were filled in). The mean age of the responders was 44 years, and they had been performing ultrasound examinations for an average of 13 years. The principal workplace (70%) was a public hospital. Physicians seemed to know the general definitions of the principal safety parameters, but few of them knew the definitions of the specific indexes. There was general knowledge about the safe use of ultrasound in obstetrics, but poor knowledge of the biological effects of US: only about 37% answered correctly the questions about damage to the lung vasculature by high-Mechanical-Index US investigation and about the temperature increase under the probe according to the thermal indexes. Conclusion: The present findings indicate that greater efforts by national ultrasound societies are warranted in disseminating knowledge about the bio-effects of diagnostic ultrasound modalities among their members in order to prevent possible hazards.
Abstract:
This work is structured as follows. In Section 1 we discuss the clinical problem of heart failure. In particular, we present the phenomenon known as ventricular mechanical dyssynchrony: its impact on cardiac function, the therapy for its treatment, and the methods for its quantification. Specifically, we describe the conductance catheter and its use for the measurement of dyssynchrony. At the end of Section 1, we propose a new set of indexes to quantify dyssynchrony, which are studied and validated thereafter. In Section 2 we describe the studies carried out in this work: we report the experimental protocols, and we present and discuss the results obtained. Finally, we report the overall conclusions drawn from this work and outline future work and possible clinical applications of our results. Ancillary studies carried out during this work, mainly to investigate several aspects of cardiac resynchronization therapy (CRT), are mentioned in the Appendix. -------- Ventricular mechanical dyssynchrony plays a regulating role already in normal physiology but is especially important in pathological conditions such as hypertrophy, ischemia, infarction, or heart failure (Chapters 1, 2). Several prospective randomized controlled trials have supported the clinical efficacy and safety of CRT in patients with moderate or severe heart failure and ventricular dyssynchrony. CRT resynchronizes ventricular contraction by simultaneous pacing of both the left and right ventricle (biventricular pacing) (Chapter 1). The conductance catheter method has been used extensively to assess global systolic and diastolic ventricular function and, more recently, the ability of this instrument to pick up multiple segmental volume signals has been used to quantify mechanical ventricular dyssynchrony.
Specifically, novel indexes based on volume signals acquired with the conductance catheter were introduced to quantify dyssynchrony (Chapters 3, 4). The present work aimed to describe the characteristics of the conductance-volume signals, to investigate the performance of the indexes of ventricular dyssynchrony described in the literature, and to introduce and validate improved dyssynchrony indexes. Moreover, using the conductance catheter method and the new indexes, the clinical problem of ventricular pacing site optimization was addressed, and the measurement protocol to adopt for hemodynamic tests on cardiac pacing was investigated. In accordance with the aims of the work, in addition to the classical time-domain parameters, a new set of indexes was extracted, based on a coherent averaging procedure and on spectral and cross-spectral analysis (Chapter 4). Our analyses were carried out on patients with indications for electrophysiologic study or device implantation (Chapter 5). For the first time, besides patients with heart failure, indexes of mechanical dyssynchrony based on the conductance catheter were extracted and studied in a population of patients with preserved ventricular function, providing information on the normal range of such values. By performing a frequency-domain analysis and by applying an optimized coherent averaging procedure (Chapter 6.a), we were able to describe some characteristics of the conductance-volume signals (Chapter 6.b). We unmasked the presence of considerable beat-to-beat variations in dyssynchrony, which seemed more frequent in patients with ventricular dysfunction and appeared to play a role in discriminating patients. These non-recurrent mechanical ventricular non-uniformities are probably the expression of the substantial beat-to-beat hemodynamic variations, often associated with heart failure, due to cardiopulmonary interaction and conduction disturbances.
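The coherent averaging procedure mentioned above can be illustrated with a minimal sketch (not the thesis's optimized implementation): each detected beat is resampled to a common length and the resampled beats are averaged point by point, so that recurrent waveform components are reinforced while non-periodic, beat-to-beat variations tend to cancel out.

```python
def coherent_average(beats, n_points):
    """Point-by-point average of cardiac cycles after resampling each
    beat (a list of samples) to a common length via linear interpolation."""
    def resample(beat, n):
        m = len(beat)
        out = []
        for i in range(n):
            pos = i * (m - 1) / (n - 1)   # fractional index into the beat
            lo = int(pos)
            hi = min(lo + 1, m - 1)
            frac = pos - lo
            out.append(beat[lo] * (1 - frac) + beat[hi] * frac)
        return out

    resampled = [resample(b, n_points) for b in beats]
    return [sum(v) / len(v) for v in zip(*resampled)]

# Two toy "beats" of different amplitude
avg = coherent_average([[0, 1, 2, 3], [0, 2, 4, 6]], n_points=4)
```

The residual difference between each individual beat and this average is what a non-periodic-component index would then quantify.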
We investigated how the coherent averaging procedure may affect or refine the conductance-based indexes; in addition, we proposed and tested a new set of indexes that quantify the non-periodic components of the volume signals. Using the new set of indexes, we studied the acute effects of CRT and of right ventricular pacing in patients with heart failure and in patients with preserved ventricular function. In the overall population we observed a correlation between the hemodynamic changes induced by pacing and the indexes of dyssynchrony, which may have practical implications for hemodynamic-guided device implantation. The optimal ventricular pacing site for patients with conventional indications for pacing remains controversial, and the majority of these patients do not meet current clinical indications for CRT pacing. Thus, we carried out an analysis to compare the impact of several ventricular pacing sites on global and regional ventricular function and dyssynchrony (Chapter 6.c). We observed that right ventricular pacing worsens cardiac function in patients with and without ventricular dysfunction unless the pacing site is optimized. CRT preserves left ventricular function in patients with normal ejection fraction and improves function in patients with poor ejection fraction despite no clinical indication for CRT. Moreover, the analysis of the results obtained using the new indexes of regional dyssynchrony suggests that the pacing site may influence overall global ventricular function depending on its relative effects on regional function and synchrony. Another clinical problem investigated in this work is the optimal right ventricular lead location for CRT (Chapter 6.d).
As in the previous analysis, using novel parameters describing local synchrony and efficiency, we tested and confirmed the hypothesis that biventricular pacing with alternative right ventricular pacing sites produces an acute improvement of ventricular systolic function and improves mechanical synchrony compared with standard right ventricular pacing. Although no specific right ventricular location was shown to be superior during CRT, the right ventricular pacing site that produced the optimal acute hemodynamic response varied between patients. Acute hemodynamic effects of cardiac pacing are conventionally evaluated after stabilization periods, whose applied duration varies considerably across cardiac pacing studies. With an ad hoc protocol (Chapter 6.e) and indexes of mechanical dyssynchrony derived from the conductance catheter, we demonstrated that the use of stabilization periods during the evaluation of cardiac pacing may mask early changes in systolic and diastolic intra-ventricular dyssynchrony. In fact, at the onset of ventricular pacing, the main dyssynchrony and ventricular performance changes occur within a 10 s time span, initiated by the changes in ventricular mechanical dyssynchrony induced by aberrant conduction and followed by a partial or even complete recovery. It has already been demonstrated in normal animals that ventricular mechanical dyssynchrony may act as a physiologic modulator of cardiac performance, together with heart rate, contractile state, preload, and afterload. The present observation, which shows the compensatory mechanism of mechanical dyssynchrony, suggests that ventricular dyssynchrony may be regarded as an intrinsic cardiac property, with baseline dyssynchrony at an increased level in heart failure patients.
To provide an independent system for cardiac output estimation, in order to confirm the results obtained with the conductance volume method, we developed and validated a novel technique to apply the Modelflow method (a method that derives an aortic flow waveform from arterial pressure by simulation of a non-linear three-element aortic input impedance model, Wesseling et al. 1993) to the left ventricular pressure signal instead of the arterial pressure used in the classical approach (Chapter 7). The results confirmed that in patients without valve abnormalities undergoing conductance catheter evaluations, the continuous monitoring of cardiac output using the intra-ventricular pressure signal is reliable. Thus, cardiac output can be monitored quantitatively and continuously with a simple and low-cost method. During this work, additional studies were carried out to investigate several areas of uncertainty regarding CRT. The results of these studies are briefly presented in the Appendix: the long-term survival of patients treated with CRT in clinical practice, the effects of CRT in patients with mild symptoms of heart failure and in very old patients, limited thoracotomy as a second-choice alternative to transvenous implant for CRT delivery, the evolution and prognostic significance of the diastolic filling pattern in CRT, the selection of candidates for CRT with echocardiographic criteria, and the prediction of response to the therapy.
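The three-element input-impedance idea behind Modelflow can be sketched as follows. This is a toy illustration with made-up parameter values (the real Modelflow method uses a non-linear, subject-dependent characteristic impedance Zc, arterial compliance C, and peripheral resistance Rp): flow is obtained from the pressure drop across Zc, while a windkessel state integrates compliance charging and peripheral runoff.

```python
def modelflow_sketch(pressure, dt, zc=0.05, c=1.5, rp=1.0):
    """Derive a flow waveform from a pressure waveform using a
    three-element (Zc, C, Rp) input-impedance model, integrated with
    forward Euler. Parameter values here are illustrative only."""
    pw = pressure[0]                  # windkessel (peripheral) pressure state
    flow = []
    for p in pressure:
        q = max((p - pw) / zc, 0.0)   # valve behaviour: no backward flow
        flow.append(q)
        pw += dt * (q - pw / rp) / c  # compliance charging vs. runoff
    return flow

pressure = [80, 120, 110, 100, 90]    # toy pressure samples (mmHg)
flow = modelflow_sketch(pressure, dt=0.01)
```

Integrating the flow waveform over a beat and multiplying by heart rate then yields the continuous cardiac output estimate the abstract refers to.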
Abstract:
In this work we propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMRs) collected over many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias introduced by aggregated analyses. Starting from collected disease counts and expected disease counts calculated by means of reference population disease rates, in each area an SMR is derived as the MLE under the Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because of the low population underlying the area or because of the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classic and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple testing control, without however abandoning the preliminary-study perspective that an analysis of SMR indicators is meant to serve. We implement control of the FDR, a quantity widely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value provide weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous.
The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is to propose a hierarchical fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all the observations in the map under study, rather than just the single observation. This improves the power of the test in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities p̂_i of the null hypothesis (absence of risk) for each area. An estimate of the expected FDR conditional on the data (denoted FDR̂) can be calculated for any set of p̂_i's relative to areas declared at high risk (where the null hypothesis is rejected) by averaging the p̂_i's themselves. The FDR̂ can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that the FDR̂ does not exceed a prefixed value; we call these FDR̂-based decision (or selection) rules. The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model has the interesting feature of still being able to provide an estimate of the relative risk values, as in the Besag, York and Mollié model (1991).
A simulation study was set up to evaluate the model's performance in terms of FDR estimation accuracy, the sensitivity and specificity of the decision rule, and the goodness of the relative risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the area sizes, the number of areas where the null hypothesis is true, and the risk level in the latter areas. In summarizing the simulation results we always consider FDR estimation over the sets constituted by all areas whose p̂_i falls below a threshold t. We show graphs of FDR̂ and of the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (from the closeness between FDR̂ and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) versus FDR̂, we can check the sensitivity and specificity of the corresponding FDR̂-based decision rules. To investigate the over-smoothing level of the relative risk estimates, we compare box-plots of such estimates in high-risk areas (known by simulation), obtained both with our model and with the classic Besag, York and Mollié model. All the summary tools are worked out for all the simulated scenarios (54 scenarios in total). Results show that the FDR is well estimated (in the worst case we get an overestimation, hence a conservative FDR control) in the scenarios with small areas, low risk levels, and spatially correlated risks, which are our primary aims. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR̂-based decision rules is generally low, but the specificity is high; in such scenarios the use of an FDR̂ = 0.05 or FDR̂ = 0.10 based selection rule can be suggested.
In cases where the number of true alternative hypotheses (the number of truly high-risk areas) is small, FDR values of 0.15 are also well estimated, and FDR̂ = 0.15 based decision rules gain power while maintaining a high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels, the FDR is under-estimated except for very small values (much lower than 0.05), resulting in a loss of specificity of an FDR̂ = 0.05 based decision rule. In such scenarios, FDR̂ = 0.05 or, even worse, FDR̂ = 0.10 based decision rules cannot be suggested, because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
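The FDR̂-based selection rule described above reduces to a simple computation once posterior null probabilities are available. The sketch below is a schematic illustration (in the thesis the p̂_i's are MCMC estimates under the hierarchical model; here they are plain inputs): areas are sorted by p̂_i, and the largest set whose average p̂_i stays below the prefixed level is declared high-risk.

```python
def fdr_select(post_null_probs, alpha=0.05):
    """Return the indices of the areas declared high-risk and the
    estimated FDR of that selection, keeping the estimate below alpha.

    post_null_probs : per-area posterior probabilities of 'absence of
                      risk' (illustrative stand-ins for MCMC estimates).
    """
    order = sorted(range(len(post_null_probs)),
                   key=lambda i: post_null_probs[i])
    chosen, est_fdr, running = [], 0.0, 0.0
    for k, i in enumerate(order, start=1):
        running += post_null_probs[i]
        if running / k <= alpha:      # running mean is nondecreasing
            chosen, est_fdr = order[:k], running / k
    return chosen, est_fdr

# Four areas: two with strong evidence of risk, two without
chosen, est_fdr = fdr_select([0.01, 0.02, 0.2, 0.9], alpha=0.05)
```

Averaging the posterior null probabilities of the rejected areas is exactly the posterior expected FDR of that rejection set, which is why the rule controls the FDR in the Bayesian sense.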
Abstract:
High spectral resolution radiative transfer (RT) codes are essential tools in the study of radiative energy transfer in the Earth's atmosphere and support the development of parameterizations for the fast RT codes used in climate and weather prediction models. Cirrus clouds permanently cover 30% of the Earth's surface, representing an important contribution to the Earth-atmosphere radiation balance. The work focused on the development of the RT model LBLMS. The model, widely tested in the infra-red spectral range, has been extended to the short-wave spectrum and used in comparisons with airborne and satellite measurements to study the optical properties of cirrus clouds. A new database of single scattering properties has been developed for mid-latitude cirrus clouds, in which ice clouds are treated as a mixture of ice crystals with various habits. The optical properties of the mixture are tested against radiometric measurements in selected case studies. Finally, a parameterization of the mixture has been developed for application to weather prediction and global circulation models. The bulk optical properties of the ice crystals are parameterized as functions of the effective dimension of measured particle size distributions that are representative of mid-latitude cirrus clouds. Tests with the limited-area weather prediction model COSMO have shown the impact of the new parameterization with respect to cirrus cloud optical properties based on ice spheres.
Abstract:
Organotin compounds are environmental contaminants diffused worldwide, mainly as a consequence of their extensive past use as biocides in antifouling paints. In spite of legal restrictions due to their unwanted effects, organotins still persist in waters, being poorly degraded, easily resuspended from sediments, and bioaccumulated in exposed organisms. Their widespread toxicity and the possible threat to humans, likely to be exposed to organotins through contaminated seafood, make organotin interactions with biomolecules an intriguing biochemical topic, beyond being a matter of ecotoxicological concern. Among organotins, tributyltin (TBT) has long been known as the most dangerous and abundant chemical species in the Mediterranean Sea. Due to its amphiphilic nature, provided by three lipophilic arms and an electrophilic tin core, TBT can easily be incorporated into biomembranes and affect their functionality. Accordingly, it is known as a membrane-active toxicant and a mitochondrial poison. The molecular modes of action of TBT are still partially unclear and poorly explored in bivalve mollusks, even though the latter play a non-negligible role in the marine trophic chain and efficiently accumulate organotins. The bivalve mollusk Mytilus galloprovincialis, selected for all experiments, is widely cultivated in the Mediterranean and commonly used in ecotoxicological studies. Most of the work in this thesis was devoted to TBT effects on mussel mitochondria, but other possible targets of TBT were also considered. A great deal of literature points to TBT as an endocrine disrupter, and the masculinization of female marine gastropods, the so-called imposex, currently signals environmental organotin contamination. The hormonal status of TBT-exposed mussels and the possible interaction between hormones and contaminants in modulating microsomal hydroxylases, involved in steroid hormone and organotin detoxification, were the research topics of the period spent in Barcelona (Marco Polo fellowship).
The varied experimental approach, which consisted of two exposure experiments and in vitro tests, and the choice of selected tissues of M. galloprovincialis (the midgut gland for the mitochondrial and microsomal preparations used in subsequent laboratory assays, and the gonads for the endocrine evaluations), aimed to draw a clarifying picture of the molecular mechanisms involved in organotin toxicity. TBT was promptly incorporated into the midgut gland mitochondria of adult mussels exposed to 0.5 and 1.0 μg/L TBT, and partially degraded to DBT. TBT incorporation was accompanied by a decrease in the mitochondrial oligomycin-sensitive Mg-ATPase activity, while the coexistent oligomycin-insensitive fraction was unaffected. Mitochondrial fatty acids showed a clear rise in n-3 polyunsaturated fatty acids after 120 hr of TBT exposure, mainly attributable to an increase in the 22:6 level. TBT was also shown to inhibit the ATP hydrolytic activity of the mitochondrial F1FO complex in vitro and to promote an apparent loss of oligomycin sensitivity at concentrations above 1.0 μM. The complex dose-dependent profile of the inhibition curve led to the hypothesis of multiple TBT binding sites. At TBT concentrations below 1.0 μM, the non-competitive enzyme inhibition by TBT was ascribed to the non-covalent binding of TBT to the FO subunit. On the other hand, the observed drop in oligomycin sensitivity above 1.0 μM TBT could be related to the onset of covalent bonds involving thiol groups on the enzyme structure, apparently reached only at high TBT levels. The mitochondrial respiratory complexes were affected by TBT in vitro, apart from cytochrome c oxidase, which was apparently refractory to the contaminant. The most striking inhibitory effect was shown on complex I and ascribed to possible covalent bonds of TBT with –SH groups on the enzyme complexes.
This mechanism, supported by the progressive decrease of free cysteine residues in the presence of increasing TBT concentrations, suggests that the onset of covalent tin-sulphur bonds in distinct protein structures may constitute the molecular basis of the widespread TBT effects on mitochondrial complexes. Disturbances of energy production, in turn affecting energy-consuming mechanisms, could be involved in other cellular changes. Mussels exposed to a wide range of TBT concentrations (20, 200, and 2000 ng/L) did not show any change in testosterone and estrogen levels in mature gonads. Most hormones were in the non-biologically active esterified form in both control and TBT-treated mussels; probably the endocrine status of sexually mature mussels is refractory even to high TBT doses. In the mussel digestive gland, the high biological variability of the microsomal 7-benzyloxy-4-trifluoromethylcoumarin O-debenzyloxylase (BFCOD) activity, taken as a measure of CYP3A-like efficiency, probably concealed any enzyme response to TBT exposure. On the other hand, the TBT-driven enhancement of BFCOD activity in vitro was once again ascribed to covalent binding to thiol groups, which in this case would stimulate the enzyme activity. In mussels from Barcelona harbour, a highly contaminated site, the enzyme showed a decreased affinity for the 7-benzyloxy-4-trifluoromethylcoumarin (BFC) substrate compared with mussels sampled from the Ebro Delta, a non-polluted marine site. Contaminant exposure may thus alter the kinetic features of enzymes involved in detoxification mechanisms. Contaminants and steroid hormones were clearly shown to interact mutually in the modulation of detoxification mechanisms. The xenoestrogen 17α-ethinyl estradiol (EE2) displayed a non-competitive mixed inhibition of CYP3A-like activity through preferential binding to the free enzyme in both Barcelona harbour and Ebro Delta mussels.
The possible interaction with co-present contaminants in Barcelona harbour mussels apparently lessened the formation of the ternary enzyme-EE2-BFC complex. Taken together, the data confirm TBT as a membrane toxicant in mussels, as in other species, and stress TBT covalent binding to protein thiols as a widespread mechanism by which the contaminant modulates the activity of membrane-bound enzymes.
Abstract:
Introduction and Background: Multiple system atrophy (MSA) is a sporadic, adult-onset, progressive neurodegenerative disease characterized clinically by parkinsonism, cerebellar ataxia, and autonomic failure. We investigated cognitive functions longitudinally in a group of probable MSA patients, matching the data with sleep parameters. Patients and Methods: 10 patients (7m/3f) underwent a detailed interview, a general and neurological examination, laboratory exams, MRI scans, a cardiovascular reflexes study, a battery of neuropsychological tests, and video-polysomnographic recording (VPSG). Patients were re-evaluated (T1) a mean of 16±5 (range: 12-28) months after the initial evaluation (T0). At T1, the neuropsychological assessment and VPSG were repeated. Results: The mean patient age was 57.8±6.4 years (range: 47-64), with a mean age at disease onset of 53.2±7.1 years (range: 43-61) and a symptom duration at T0 of 60±48 months (range: 12-144). At T0, 7 patients showed no cognitive deficits, while 3 patients showed isolated cognitive deficits. At T1, 1 patient worsened, developing multiple cognitive deficits from a previously normal condition. At T0 and T1, sleep efficiency was reduced, REM latency was increased, and NREM sleep stages 1-2 were slightly increased. Comparisons between T1 and T0 showed a significant worsening in two tests of attention and no significant differences in VPSG parameters. No correlation was found between the neuropsychological results and the VPSG findings or RBD duration. Discussion and Conclusions: The majority of our patients did not show any cognitive deficits at T0 or T1, while isolated cognitive deficits were present in the remaining patients. Attention is the cognitive function that worsened significantly. Our data confirm previous findings concerning the prevalence, type, and evolution of cognitive deficits in MSA. Regarding the development of dementia, our data did not show a clear-cut diagnosis of dementia in any patient.
We confirm a mild alteration of sleep structure. RBD duration does not correlate with neuropsychological findings.
Abstract:
In rural and isolated areas without cellular coverage, Satellite Communication (SatCom) is the best candidate to complement terrestrial coverage. However, the main challenge for future generations of wireless networks will be to meet the growing demand for new services while dealing with the scarcity of frequency spectrum. As a result, it is critical to investigate more efficient methods of utilizing the limited bandwidth, and resource sharing is likely the only choice. The research community's focus has recently shifted towards the interference management and exploitation paradigm to meet the increasing data traffic demands. In the Downlink (DL) and Feedspace (FS), LEO satellites with an on-board antenna array can offer service to numerous User Terminals (UTs) on the ground (VSATs or handhelds) in Full Frequency Reuse (FFR) schemes by using cutting-edge digital beamforming techniques. In this setup, the adoption of an effective user scheduling approach is critical, given the unusually high density of user terminals on the ground compared to the available on-board satellite antennas. In this context, one possibility is to exploit clustering algorithms for scheduling in LEO MU-MIMO systems, in which several users within the same group are simultaneously served by the satellite via Space Division Multiplexing (SDM), and the different user groups are then served in different time slots via Time Division Multiplexing (TDM). This thesis addresses this problem by formulating user scheduling as an optimization problem and discusses several algorithms to solve it. In particular, focusing on the FS and the user service link (i.e., DL) of a single MB-LEO satellite operating below 6 GHz, the user scheduling problem in Frequency Division Duplex (FDD) mode is addressed. The proposed state-of-the-art scheduling approaches are based on graph theory.
The proposed solution offers high performance in terms of per-user capacity, sum-rate capacity, SINR, and spectral efficiency.
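A graph-based scheduler of the kind described above can be sketched as follows. This is a schematic illustration, not the thesis's actual algorithm: users are nodes, an edge connects two users whose channel vectors (real-valued here for simplicity; real satellite channels are complex) are too correlated to be served together via SDM, and a greedy graph coloring assigns TDM slots so that users sharing a slot are nearly orthogonal.

```python
def schedule_users(channels, corr_threshold=0.5):
    """Assign a TDM slot to every user so that users sharing a slot have
    weakly correlated channel vectors (and can be served jointly via SDM).
    Greedy coloring of the 'conflict' graph; threshold is illustrative."""
    def corr(a, b):
        dot = abs(sum(x * y for x, y in zip(a, b)))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb)

    n = len(channels)
    conflict = [[i != j and corr(channels[i], channels[j]) > corr_threshold
                 for j in range(n)] for i in range(n)]
    slot = [-1] * n
    for u in range(n):
        taken = {slot[v] for v in range(n) if conflict[u][v] and slot[v] >= 0}
        s = 0
        while s in taken:              # first free color = first free slot
            s += 1
        slot[u] = s
    return slot

# Users 0 and 2 are nearly collinear, so they end up in different slots
slots = schedule_users([[1, 0], [0, 1], [1, 0.1]])
```

Each color class is one user group: the satellite beamforms to all users of a class simultaneously (SDM) and cycles through the classes over time (TDM).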