903 results for Probabilistic latent semantic analysis (PLSA)


Relevance: 30.00%

Abstract:

Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed modelling the latent trait distribution with a skew-normal distribution under the centred parameterization (SNCP), as studied in [R.B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382]. This approach can represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, as opposed to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performs as well as that in [3] in terms of parameter recovery, mainly when the Jeffreys prior is used. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is presented jointly with the development of model-fit assessment tools, and the results are compared with those obtained by Azevedo et al. They indicate that the hierarchical approach makes MCMC algorithms easier to implement, facilitates convergence diagnostics, and can be very useful for fitting more complex skew IRT models.
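The hierarchical structure the paper builds on goes back to Henze (1986): a skew-normal variate can be written as X = δ|U₀| + √(1−δ²)U₁ with U₀, U₁ independent standard normals. A minimal sketch of this representation (the centred parameterization, priors, and the full MHWGS sampler are omitted; δ here is the shape-related parameter of the direct parameterization):

```python
import numpy as np

rng = np.random.default_rng(0)
delta = 0.8                      # skewness-related parameter, |delta| < 1
n = 200_000

# Henze (1986) stochastic representation: X = delta*|U0| + sqrt(1-delta^2)*U1
u0 = np.abs(rng.standard_normal(n))           # half-normal mixing variable
u1 = rng.standard_normal(n)
x = delta * u0 + np.sqrt(1.0 - delta**2) * u1

# closed-form moments of the (direct-parameterization) skew normal, for checking
mean_theory = delta * np.sqrt(2.0 / np.pi)
var_theory = 1.0 - 2.0 * delta**2 / np.pi
```

Conditioning on the half-normal variable u0 makes the latent traits conditionally normal, which is why a sampler built on this representation needs a Metropolis-Hastings step only for the asymmetry parameter.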

Relevance: 30.00%

Abstract:

Dimensionality reduction is employed in visual data analysis as a way of obtaining reduced spaces for high-dimensional data or of mapping data directly into 2D or 3D spaces. Although techniques have evolved to improve data segregation in reduced or visual spaces, they have limited capabilities for adjusting the results according to the user's knowledge. In this paper, we propose a novel approach to handling both dimensionality reduction and visualization of high-dimensional data that takes the user's input into account. It employs Partial Least Squares (PLS), a statistical tool that retrieves latent spaces focused on the discriminability of the data. The method uses a training set to build a highly precise model that can then be applied to a much larger data set very effectively. The reduced data set can be exhibited using various existing visualization techniques. The training data are important for coding the user's knowledge into the loop; however, this work also devises a strategy for calculating PLS reduced spaces when no training data are available. The approach produces increasingly precise visual mappings as the user feeds back his or her knowledge, and is capable of working with small and unbalanced training sets.

Relevance: 30.00%

Abstract:

This paper addresses the numerical solution of random crack propagation problems by coupling the boundary element method (BEM) with reliability algorithms. The crack propagation phenomenon is efficiently modelled using BEM, thanks to its mesh-reduction features. The BEM model is based on the dual BEM formulation, in which singular and hyper-singular integral equations are adopted to construct the system of algebraic equations. Two reliability algorithms are coupled with the BEM model. The first is the well-known response surface method, in which local, adaptive polynomial approximations of the mechanical response are constructed in the search for the design point; different experiment designs and adaptive schemes are considered. The alternative approach, direct coupling, in which the limit state function remains implicit and its gradients are calculated directly from the numerical mechanical response, is also considered. The performance of both coupling methods is compared on several crack propagation problems. The investigation shows that the direct coupling scheme converged for all problems studied, irrespective of problem nonlinearity, and that its computational cost was a fraction of that of the response surface solutions, regardless of the experiment design or adaptive scheme considered.
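The response surface method can be sketched in a few lines: a cheap polynomial surrogate is fitted to a handful of evaluations of the (normally expensive) mechanical model, and the failure probability is then estimated on the surrogate. The quadratic limit state below merely stands in for the BEM response and is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x1, x2):                  # stand-in for the expensive mechanical model
    return 7.0 - x1**2 - x2     # limit state: failure when g <= 0

# small experiment design around the mean point
pts = np.array([[a, b] for a in (-2, 0, 2) for b in (-2, 0, 2)], float)
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0]**2, pts[:, 1]**2, pts[:, 0] * pts[:, 1]])
coef, *_ = np.linalg.lstsq(A, g(pts[:, 0], pts[:, 1]), rcond=None)

def g_hat(x1, x2):              # quadratic response surface
    return (coef[0] + coef[1]*x1 + coef[2]*x2
            + coef[3]*x1**2 + coef[4]*x2**2 + coef[5]*x1*x2)

# Monte Carlo on the surrogate (standard-normal variables assumed)
x = rng.standard_normal((100_000, 2))
pf = np.mean(g_hat(x[:, 0], x[:, 1]) <= 0.0)
```

In the adaptive variant the design points are re-centred around the current design-point estimate and the fit is repeated until convergence; the direct-coupling alternative skips the surrogate and differentiates the numerical response itself.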

Relevance: 30.00%

Abstract:

Introduction: The wound healing process involves the activation of genes for extracellular matrix components, remodeling enzymes, cellular adhesion molecules, growth factors, cytokines, and chemokines. However, the molecular patterns underlying the healing process in the periapical environment remain unclear. Here we hypothesized that endodontic infection might result in an imbalance in the expression of wound healing genes involved in the pathogenesis of periapical lesions. Furthermore, we suggest that differential expression of wound healing markers in active and latent granulomas could account for the different clinical outcomes of such lesions. Methods: Study samples consisted of 93 periapical granulomas collected after endodontic surgeries and, as controls, 24 healthy periodontal ligament tissues collected from premolars extracted for orthodontic purposes. Of these, 10 periapical granulomas and 5 healthy periapical tissues were used for expression analysis of 84 wound healing genes with a pathway-specific real-time polymerase chain reaction (PCR) array. The remaining 83 granulomas and all 24 control specimens were used to validate the array data by real-time PCR. Observed variations in the expression of wound healing genes were analyzed according to the classification of periapical granulomas as active/progressive versus inactive/stable, as determined by the receptor activator of nuclear factor kappa B ligand/osteoprotegerin (RANKL/OPG) expression ratio. Results: We observed a marked increase, of 5-fold or greater, in SERPINE1, TIMP1, COL1A1, COL5A1, VTN, CTGF, FGF7, TGFB1, TNF, CXCL11, ITGA4, and ITGA5 expression in periapical granulomas compared with control samples. SERPINE1, TIMP1, COL1A1, TGFB1, and ITGA4 mRNA expression was significantly higher in inactive than in active periapical granulomas (P < .001), whereas TNF and CXCL11 mRNA expression was higher in active lesions (P < .001). Conclusions: The identification of novel gene targets that curb the progression status of periapical lesions might contribute to a more accurate diagnosis and lead to treatment modalities more conducive to endodontic success. (J Endod 2012;38:185-190)

Relevance: 30.00%

Abstract:

OBJECTIVE: The frequent occurrence of inconclusive serology in blood banks and the absence of a gold standard test for Chagas disease led us to examine the efficacy of the blood culture test and five commercial tests (ELISA, IIF, HAI, c-ELISA, rec-ELISA) used in screening blood donors for Chagas disease, as well as to investigate the prevalence of Trypanosoma cruzi infection among donors with inconclusive screening serology with respect to some epidemiological variables. METHODS: To obtain the estimates of interest we considered a Bayesian latent class model with the inclusion of covariates through the logit link. RESULTS: Better performance was observed for some categories of the epidemiological variables. In addition, all pairs of tests (excluding the blood culture test) proved to be good alternatives both for screening (sensitivity > 99.96% in parallel testing) and for confirmation (specificity > 99.93% in serial testing) of Chagas disease. The prevalence of 13.30% observed in the stratum of donors with inconclusive serology means that most of these donors probably have non-reactive serology. In addition, depending on the level of specific epidemiological variables, the absence of infection can be predicted in this group with a probability of 100% from the pairs of tests using parallel testing. CONCLUSION: The epidemiological variables can lead to improved test results and thus assist in the clarification of inconclusive screening serology. Moreover, all pairs formed from the five commercial tests are good alternatives for confirming results.
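The combination rules behind the reported sensitivity and specificity figures are the standard ones for independent tests: parallel testing (positive if either test is positive) boosts sensitivity, while serial testing (positive only if both are positive) boosts specificity. A sketch with hypothetical accuracies, not the paper's Bayesian estimates:

```python
def parallel(sens1, spec1, sens2, spec2):
    # positive if either test is positive: misses only if both miss
    return 1 - (1 - sens1) * (1 - sens2), spec1 * spec2

def serial(sens1, spec1, sens2, spec2):
    # positive only if both tests are positive: false-positive only if both err
    return sens1 * sens2, 1 - (1 - spec1) * (1 - spec2)

# hypothetical accuracies for two ELISA-type assays
sens_par, spec_par = parallel(0.98, 0.97, 0.99, 0.96)   # screening combination
sens_ser, spec_ser = serial(0.98, 0.97, 0.99, 0.96)     # confirmation combination
```

With these illustrative inputs, parallel sensitivity rises to 0.9998 and serial specificity to 0.9988, mirroring the screening-versus-confirmation roles described in the abstract.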

Relevance: 30.00%

Abstract:

We investigated the seasonal patterns of Amazonian forest photosynthetic activity, and the effects thereon of variations in climate and land use, by integrating data from a network of ground-based eddy flux towers in Brazil established as part of the 'Large-Scale Biosphere-Atmosphere Experiment in Amazonia' project. We found that the degree of water limitation, as indicated by the seasonality of the ratio of sensible to latent heat flux (the Bowen ratio), predicts seasonal patterns of photosynthesis. In equatorial Amazonian forests (5° N-5° S), water limitation is absent, and photosynthetic fluxes (gross ecosystem productivity, GEP) exhibit high or increasing levels of photosynthetic activity as the dry season progresses, likely a consequence of allocation to the growth of new leaves. In contrast, forests along the southern flank of the Amazon, pastures converted from forest, and mixed forest-grass savanna exhibit dry-season declines in GEP, consistent with increasing degrees of water limitation. Although previous work showed that tropical ecosystem evapotranspiration (ET) is driven by incoming radiation, the GEP observations reported here surprisingly show no, or negative, relationships with photosynthetically active radiation (PAR). Instead, GEP fluxes largely followed the phenology of canopy photosynthetic capacity (Pc), with deviations from this primary pattern driven by variations in PAR. Estimates of leaf flush at three
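The water-limitation diagnostic used above is simply the Bowen ratio, β = H/LE, tracked through the seasons: a β that rises through the dry season flags water limitation. The monthly flux values and the dry-season window below are invented for illustration, as is the comparison threshold:

```python
import numpy as np

# hypothetical monthly mean fluxes (W m^-2) for a southern-flank forest site
H  = np.array([30, 32, 35, 40, 55, 70, 80, 75, 60, 45, 35, 30], float)   # sensible heat
LE = np.array([120, 118, 110, 100, 90, 80, 70, 75, 90, 105, 115, 120], float)  # latent heat

bowen = H / LE                    # Bowen ratio, month by month
dry_season = slice(5, 8)          # Jun-Aug, a hypothetical dry-season window
# hypothetical rule: flag water limitation if dry-season beta well exceeds the annual minimum
water_limited = bowen[dry_season].mean() > 2 * bowen.min()
```

At a site where β stays flat year-round (as in the equatorial forests described above), the same rule would return False.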

Relevance: 30.00%

Abstract:

A regional envelope curve (REC) of flood flows summarises the current bound on our experience of extreme floods in a region. RECs are available for most regions of the world. Recent scientific papers introduced a probabilistic interpretation of these curves and formulated an empirical estimator of the recurrence interval T associated with a REC, which, in principle, enables us to use RECs for design purposes in ungauged basins. The main aim of this work is twofold. First, it extends the REC concept to extreme rainstorm events by introducing depth-duration envelope curves (DDECs), defined as the regional upper bound on all record rainfall depths observed to date for various rainfall durations. Second, it adapts the probabilistic interpretation proposed for RECs to DDECs and assesses the suitability of these curves for estimating the T-year rainfall event associated with a given duration and large T values. Probabilistic DDECs are complementary to regional frequency analysis of rainstorms, and their use in combination with a suitable rainfall-runoff model can provide useful indications of the magnitude of extreme floods for gauged and ungauged basins. The study focuses on two national datasets: the peak-over-threshold (POT) series of rainfall depths with durations of 30 min and 1, 3, 9 and 24 h obtained for 700 Austrian raingauges, and the annual maximum series (AMS) of rainfall depths with durations spanning from 5 min to 24 h collected at 220 raingauges located in northern-central Italy. Estimating the recurrence interval of a DDEC requires quantifying the equivalent number of independent data, which, in turn, is a function of the cross-correlation among sequences. While the quantification and modelling of intersite dependence is a straightforward task for AMS series, it may be cumbersome for POT series. This paper proposes a possible approach to address this problem.
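The envelope construction itself is straightforward: take, for each duration, the maximum record depth over all gauges in the region, then summarise the bound with a power law h = a·dⁿ fitted in log-log space (a common functional form for depth-duration relationships; the gauge data below are invented):

```python
import numpy as np

durations = np.array([0.5, 1, 3, 9, 24], float)     # hours
# hypothetical record rainfall depths (mm) at three gauges, one row per gauge
depths = np.array([[40, 60,  95, 140, 190],
                   [35, 55, 100, 150, 210],
                   [45, 58,  90, 135, 180]], float)

envelope = depths.max(axis=0)        # regional upper bound, per duration

# power-law summary h = a * d**n via least squares in log-log space
n_exp, log_a = np.polyfit(np.log(durations), np.log(envelope), 1)
a = np.exp(log_a)
```

The probabilistic step in the paper then attaches a recurrence interval T to this bound via the equivalent number of independent station-years, which the code above does not attempt.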

Relevance: 30.00%

Abstract:

The inherent stochastic character of most of the physical quantities involved in engineering models has led to an ever-increasing interest in probabilistic analysis. Many approaches to stochastic analysis have been proposed; however, it is widely acknowledged that the only universal method available to solve accurately any kind of stochastic mechanics problem is Monte Carlo simulation. One of the key parts in the implementation of this technique is the accurate and efficient generation of samples of the random processes and fields involved in the problem at hand. In the present thesis an original method for the simulation of homogeneous, multi-dimensional, multi-variate, non-Gaussian random fields is proposed. The algorithm has proved to be very accurate in matching both the target spectrum and the marginal probability distribution. Its computational efficiency and robustness are very good too, even when dealing with strongly non-Gaussian distributions. What is more, the resulting samples possess all the relevant, well-defined and desired properties of 'translation fields', including crossing rates and distributions of extremes. The second part of the thesis lies in the field of non-destructive parametric structural identification. Its objective is to evaluate the mechanical characteristics of the constituent bars of existing truss structures, using static loads and strain measurements. In cases of missing data and of damage affecting only a small portion of a bar, Genetic Algorithms have proved to be an effective tool for solving the problem.
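The "translation field" idea referenced above can be sketched in one dimension: generate a Gaussian process with a target spectrum via the spectral representation, then map it through Φ followed by the inverse CDF of the desired non-Gaussian marginal. Everything below (spectrum, marginal, discretisation) is a hypothetical stand-in for the thesis's multi-dimensional, multi-variate algorithm:

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(42)
n_freq, dw = 256, 0.1
w = (np.arange(n_freq) + 0.5) * dw            # frequency grid (rad/s)
S = 2.0 * np.exp(-w)                          # hypothetical one-sided target PSD

t = np.linspace(0.0, 10.0, 500)
phi = rng.uniform(0.0, 2.0 * np.pi, n_freq)   # random phases
amp = np.sqrt(2.0 * S * dw)

# spectral representation of a zero-mean stationary Gaussian process
gauss = (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)
sigma = np.sqrt(np.sum(S * dw))               # target standard deviation

# translation to a lognormal marginal: y = F^-1(Phi(g/sigma))
y = lognorm.ppf(norm.cdf(gauss / sigma), s=0.5)
```

By construction the translated sample inherits the Gaussian field's crossing structure while matching the prescribed marginal, which is the defining property of translation fields.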

Relevance: 30.00%

Abstract:

This doctoral thesis focuses on the study of individual behaviours as a result of organizational affiliation. The objective is to assess the Entrepreneurial Orientation of individuals, demonstrating the existence of a set of antecedents of that measure and returning a structural model of its micro-foundations. Relying on the developed measurement model, I address the issue of whether some entrepreneurs exhibit different behaviours as a result of their academic affiliation, comparing a sample of 'Academic Entrepreneurs' with a control sample of 'Private Entrepreneurs' affiliated with a matched sample of academic spin-offs and private start-ups. Building on the Theory of Planned Behaviour proposed by Ajzen (1991), I present a model of causal antecedents of Entrepreneurial Orientation based on constructs extensively used and validated, from both a theoretical and an empirical perspective, in sociological and psychological studies. I focus my investigation on five major domains: (a) situationally specific motivation, (b) personal traits and characteristics, (c) individual skills, (d) perception of the business environment, and (e) Entrepreneurial Orientation related dimensions. I rely on a sample of 200 entrepreneurs affiliated with a matched sample of 72 academic spin-offs and private start-ups. Firms are matched by industry, year of establishment and localization, and all are located in the Emilia-Romagna region in northern Italy. I gathered data through face-to-face interviews and used a structural equation modelling technique (LISREL 8.80; Jöreskog, K., & Sörbom, D., 2006) to perform the empirical analysis. The results show that Entrepreneurial Orientation is a multi-dimensional, micro-founded construct which is better represented by a second-order model. The t-tests on the latent means reveal that Academic Entrepreneurs differ in terms of risk taking, passion, procedural and organizational skills, and perception of government, context and university support. The structural models also reveal that the main differences between the two groups lie in the predictive power of technical skills, perceived context support and perceived university support in explaining the Entrepreneurial Orientation related dimensions.

Relevance: 30.00%

Abstract:

Forecasting the time, location, nature, and scale of volcanic eruptions is one of the most urgent aspects of modern applied volcanology. The reliability of probabilistic forecasting procedures is strongly related to the reliability of the input information provided, which calls for objective criteria for interpreting the historical and monitoring data. For this reason, both detailed analysis of past data and more basic research into the processes of volcanism are fundamental tasks of a continuous information-gain process; in this way the precursors of eruptions can be better interpreted in terms of their physical meaning, with correlated uncertainties. This should lead to better predictions of the nature of eruptive events. In this work we have studied different problems associated with long- and short-term eruption forecasting. First, we discuss different approaches for the analysis of the eruptive history of a volcano, most of them generally applied for long-term eruption forecasting purposes; furthermore, we present a model based on the characteristics of a Brownian passage-time process to describe recurrent eruptive activity, and apply it to long-term, time-dependent eruption forecasting (Chapter 1). Second, in an effort to define further monitoring parameters as input data for short-term eruption forecasting in probabilistic models (such as the Bayesian Event Tree for eruption forecasting, BET_EF), we analyze some characteristics of the typical seismic activity recorded at active volcanoes; in particular, we apply methodologies for analyzing long-period (LP) events (Chapter 2) and volcano-tectonic (VT) seismic swarms (Chapter 3). Our analyses are generally oriented toward tracking phenomena that can provide information about magmatic processes. Finally, we discuss possible ways to integrate the results presented in Chapter 1 (long-term forecasting) and Chapters 2 and 3 (short-term forecasting) into the BET_EF model (Chapter 4).
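The Brownian passage-time recurrence model of Chapter 1 is the inverse Gaussian distribution; with mean recurrence interval m and aperiodicity α it can be sketched with scipy, since invgauss(mu=α², scale=m/α²) has mean m and standard deviation αm. The numerical values here are hypothetical, not estimates for any real volcano:

```python
from scipy.stats import invgauss

m, alpha = 100.0, 0.6          # mean recurrence (years) and aperiodicity, hypothetical
mu = alpha**2
bpt = invgauss(mu, scale=m / mu)   # Brownian passage-time (inverse Gaussian) model

def hazard(t):
    """Instantaneous eruption rate at time t since the last eruption."""
    return bpt.pdf(t) / bpt.sf(t)

# conditional probability of an eruption in the next dt years, given quiescence of t years
def cond_prob(t, dt):
    return (bpt.cdf(t + dt) - bpt.cdf(t)) / bpt.sf(t)
```

Unlike an exponential (memoryless) model, the BPT hazard starts near zero just after an eruption and rises toward a finite asymptote, which is what makes it suitable for time-dependent long-term forecasting.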

Relevance: 30.00%

Abstract:

Studies of murine cytomegalovirus (mCMV) infection in the BALB/c mouse model have so far concentrated on the lungs, since they represent a major site of mCMV latency. Because latent CMV is also frequently transmitted by liver transplantation, this work investigated the liver as a further medically relevant organ of CMV latency and reactivation. To first identify the cellular sites of mCMV latency in the liver, sex-mismatched bone marrow transplantations (BMT) were performed with male, tdy-positive donors and female, tdy-negative recipients, followed by mCMV infection, in order to generate latently infected mice with sex-chromosomal chimerism. These chimeras allowed a distinction between tdy-positive cells of haematopoietic origin and tdy-negative stromal and parenchymal tissue cells. Separation of the chimeras' liver cells by centrifugal elutriation, followed by quantification of viral and cellular genomes by quantitative real-time PCR, gave a first indication that endothelial cells are a cellular site of mCMV latency. Subsequent immunomagnetic cell sorting localized latent viral DNA in the CD31-positive cell fraction. Co-expression of CD31 with the endothelial-cell-specific surface marker ME-9F1 identified the liver sinusoidal endothelial cells (LSEC) as the cells harbouring latent viral DNA. In the cytofluorometrically purified CD31+/ME-9F1+ LSEC, viral genomes were enriched while the male tdy marker genes declined, indicating that the cells containing viral DNA derive from the bone marrow recipient. Cytofluorometric analyses of isolated LSEC identified a donor-derived subpopulation of MHCII+/CD11b+ LSEC. 
Subsequent quantification of viral DNA from latently infected mice detected a decrease in viral genomes with an increasing amount of tdy-positive cells, demonstrating that MHCII+/CD11b+ LSEC are not a site of mCMV latency. Limiting dilution analyses of the isolated latently infected LSEC yielded a frequency of one latently infected cell per ~1.9x10^4 LSEC and a load of 7 to 19 viral genomes per latently infected cell. After 24 hours of LSEC cultivation, quantitative real-time RT-PCR on total LSEC RNA showed an increase in the expression of the immediate-early genes ie1 and ie3 as well as an induction of the early gene e1. An enhancement of transcriptional reactivation by incubating the LSEC with various HDAC inhibitors could not be achieved, however, since both the amount of RNA isolated from treated cultures and the number of viral transcripts were reduced compared with untreated cultures. Owing to the short lifespan of isolated LSEC in vitro, co-cultivation of latently infected LSEC with murine embryonic fibroblasts did not induce virus reactivation. In contrast, transfer of purified ME-9F1+/CD31+ LSEC from latently infected donors into immunosuppressed recipients led to the detection of viral recurrence in lung explant cultures of the recipients. LSEC were thus unambiguously identified as a cellular site of mCMV latency and reactivation in the liver.
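The reported frequency (one latently infected cell per ~1.9x10^4 LSEC) comes from a limiting dilution assay; under the standard single-hit Poisson model, the fraction of negative cultures at cell dose d is exp(-f·d), so f can be recovered as a regression slope. The sketch below fits simulated, noise-free data rather than the experimental readouts:

```python
import numpy as np

def estimate_frequency(doses, frac_negative):
    """Single-hit Poisson model: P(negative culture) = exp(-f * dose).
    Returns the least-squares slope of -log(frac_negative) vs. dose through the origin."""
    d = np.asarray(doses, float)
    y = -np.log(np.asarray(frac_negative, float))
    return float(np.sum(d * y) / np.sum(d * d))

f_true = 1.0 / 1.9e4                          # one latent cell per ~1.9e4 LSEC
doses = np.array([5e3, 1e4, 2e4, 4e4])        # cells plated per culture (hypothetical)
frac_neg = np.exp(-f_true * doses)            # idealized, noise-free readout
f_est = estimate_frequency(doses, frac_neg)
```

With real assay data the negative fractions are binomial proportions, so a weighted fit (or maximum likelihood) would replace the plain least-squares slope used here.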

Relevance: 30.00%

Abstract:

Many industries and academic institutions share the vision that an appropriate use of information originating from the environment may add value to services in multiple domains and may help humans deal with the growing information overload that often seems to jeopardize our lives. It is also clear that information sharing and mutual understanding between software agents may impact complex processes in which many actors (humans and machines) are involved, leading to relevant socio-economic benefits. Starting from these two inputs, architectural and technological solutions to enable 'environment-related cooperative digital services' are explored here. The proposed analysis starts from the consideration that our environment is a physical space in which diversity is a major value. On the other hand, diversity is detrimental to common technological solutions and is an obstacle to mutual understanding. An appropriate environment abstraction and a shared information model are needed to provide the required levels of interoperability in our heterogeneous habitat. This thesis reviews several approaches to supporting environment-related applications and intends to demonstrate that smart-space-based, ontology-driven, information-sharing platforms may become a flexible and powerful solution for supporting interoperable services in virtually any domain, and even in cross-domain scenarios. It also shows that semantic technologies can be fruitfully applied beyond the representation of application-domain knowledge. For example, semantic modelling of human-computer interaction may support interaction interoperability and the transformation of interaction primitives into actions, and the thesis shows how smart-space-based platforms driven by an interaction ontology may enable natural and flexible ways of accessing resources and services, e.g., with gestures. 
An ontology for computational flow execution has also been built to represent abstract computation, with the goal of exploring new ways of scheduling computation flows with smart-space-based semantic platforms.

Relevance: 30.00%

Abstract:

This doctoral dissertation presents a new method to assess the influence of clearance in the kinematic pairs on the configuration of planar and spatial mechanisms. The subject has been widely investigated in both past and present scientific literature, and is approached in different ways: a static/kinetostatic way, which looks for the clearance take-up caused by the external loads on the mechanism; a probabilistic way, which expresses clearance-due displacements using probability density functions; and a dynamic way, which evaluates dynamic effects such as the actual forces in the pairs caused by impacts, or the consequent vibrations. This dissertation approaches the problem of clearance from a purely kinematic perspective. With reference to a given mechanism configuration, the pose (position and orientation) error of the mechanism link of interest is expressed as a vector function of the degrees of freedom introduced in each pair by clearance: the presence of clearance in a kinematic pair, in fact, causes the actual pair to have more degrees of freedom than the theoretical clearance-free one. The clearance-due degrees of freedom are bounded by the pair geometry, and a proper modelling of clearance-affected pairs allows this bounding to be expressed through analytical functions. It is then possible to formulate the problem as a maximization problem, in which a continuous function (the pose error of the link of interest) subject to some constraints (the analytical functions bounding the clearance-due degrees of freedom) has to be maximized. Revolute, prismatic, cylindrical, and spherical clearance-affected pairs have been analytically modelled; with reference to mechanisms involving such pairs, the solution to the maximization problem has been obtained in closed form.
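The constrained maximization can be illustrated on a deliberately simplified toy case: two planar revolute pairs whose clearance-due displacements are confined to disks of radii r1 and r2, with the position error of the link of interest taken as the norm of the summed displacements, so the analytic maximum is r1 + r2. The dissertation derives closed-form solutions for the real pair geometries; the numerical optimization below only demonstrates the problem structure:

```python
import numpy as np
from scipy.optimize import minimize

r1, r2 = 0.05, 0.03    # clearance radii of the two revolute pairs (hypothetical, mm)

def neg_tip_error(d):
    # d = [dx1, dy1, dx2, dy2]: clearance-due displacements in the two pairs
    return -np.linalg.norm(d[:2] + d[2:])          # negate: minimize == maximize error

# the pair geometry bounds each displacement to a disk: |d_i| <= r_i
cons = [{'type': 'ineq', 'fun': lambda d: r1**2 - d[0]**2 - d[1]**2},
        {'type': 'ineq', 'fun': lambda d: r2**2 - d[2]**2 - d[3]**2}]

res = minimize(neg_tip_error, x0=[0.01, 0.01, 0.01, 0.01], constraints=cons)
max_err = -res.fun     # should approach the analytic bound r1 + r2 = 0.08
```

The optimizer aligns both displacement vectors, confirming that the worst-case position error accumulates the individual clearance radii; orientation error and spatial pairs require the richer models of the dissertation.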

Relevance: 30.00%

Abstract:

Dealing with latent constructs (loaded by reflective and congeneric measures) compared cross-culturally means studying how these unobserved variables vary, and/or covary with each other, after controlling for potentially disturbing cultural forces. This leads to the so-called 'measurement invariance' issue, which refers to the extent to which data collected with the same multi-item measurement instrument (i.e., a self-reported questionnaire of items underlying common latent constructs) are comparable across different cultural environments. As a matter of fact, it would be unthinkable to explore latent-variable heterogeneity (e.g., latent means; latent deviations from the means, i.e., latent variances; latent shared variation around the respective means, i.e., latent covariances; the magnitude of structural path coefficients for causal relations among latent variables) across different populations without controlling for cultural bias in the underlying measures. Furthermore, it would be unrealistic to attempt this correction without a framework able to take all these potential cultural biases across populations into account simultaneously, since the real world 'acts' simultaneously as well. As a consequence, a researcher may want to control for cultural forces by hypothesizing that they all act at the same time across the comparison groups, and then examine whether they inflate or suppress the new estimates by placing hierarchically nested constraints on the originally estimated parameters. Multi-sample structural equation modelling-based confirmatory factor analysis (MS-SEM-based CFA) still represents a dominant and flexible statistical framework for working out this potential cultural bias in a simultaneous way. 
With this dissertation I attempt to introduce new viewpoints on measurement invariance, handled under the covariance-based SEM framework, by means of a consumer behaviour modelling application on functional food choices.

Relevance: 30.00%

Abstract:

The aim of this thesis is to propose Bayesian estimation, via Markov chain Monte Carlo, of multidimensional item response theory models for graded responses with complex structures and correlated traits. In particular, this work focuses on the multi-unidimensional and the additive underlying latent structures, considering that the first is widely used and represents a classical approach in multidimensional item response analysis, while the second is able to reflect the complexity of real interactions between items and respondents. A simulation study is conducted to evaluate parameter recovery for the proposed models under different conditions (sample size, test and subtest length, number of response categories, and correlation structure). The results show that parameter recovery is particularly sensitive to the sample size, owing to the model complexity and the high number of parameters to be estimated. For a sufficiently large sample size, the parameters of the multi-unidimensional and additive graded response models are well reproduced. The results are also affected by the trade-off between the number of items constituting the test and the number of item categories. An application of the proposed models to response data collected to investigate Romagna and San Marino residents' perceptions of and attitudes towards the tourism industry is also presented.
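The graded response machinery underlying both latent structures can be sketched via Samejima's model: the cumulative probabilities of responding in category k or above are logistic in a(θ − b_k), and category probabilities are differences of adjacent cumulatives. A minimal single-item, unidimensional version (the thesis works with multidimensional θ, complex loading structures, and MCMC estimation; the item parameters here are illustrative):

```python
import numpy as np

def grm_probs(theta, a, b):
    """Samejima graded response model category probabilities for one item.
    theta: latent trait value; a: discrimination; b: increasing thresholds
    (K-1 values for K ordered response categories)."""
    b = np.asarray(b, float)
    pstar = 1.0 / (1.0 + np.exp(-a * (theta - b)))    # P(Y >= k), k = 1..K-1
    cum = np.concatenate(([1.0], pstar, [0.0]))       # P(Y >= 0) = 1, P(Y >= K) = 0
    return cum[:-1] - cum[1:]                         # P(Y = k) for k = 0..K-1

# hypothetical 4-category item, respondent at the trait mean
p = grm_probs(0.0, 1.2, [-1.0, 0.0, 1.0])
```

In the multi-unidimensional structure each item's θ is the trait of its own subtest, while in the additive structure several correlated traits contribute to the same linear predictor; the category step above is identical in both cases.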