909 results for Multidimensional Scaling


Relevance: 20.00%

Abstract:

The October 2015 Food and Beverage Entrepreneurship Roundtable brought together over 30 food and beverage industry leaders, entrepreneurs, faculty, and students at the School of Hotel Administration at Cornell University. Discussion topics covered entrepreneurship in the food and beverage industry, including venue development, intrapreneurship, operational efficiency, beverage product development, and technology. The roundtable began with the presentation of a five-point framework for food and beverage venue development. The first three points focused on the launch of a venue: how to define the guest experience; the creation of operational functionality by strategically planning out the design, flow, and efficiency of a defined space; and the development of capacity. The remaining two points of the framework focused on post-opening considerations, including operating systems and culture development. Participants discussed the importance of culture in the growth of a business. They suggested that intrapreneurship needs to be fostered in the culture of an organization and in the educational curriculum of those preparing to enter the industry. Participants also discussed the fine balance between setting expectations for an experience and subsequently being able to maintain that experience in a fast-changing environment. In particular, they considered what it means to say no to customers. A discussion on the beverage industry focused on how to distribute products in a crowded marketplace; one way to ensure that a product gets into the hands of consumers is face-to-face sales. Finally, in the technology session, the group discussed technology adoption, focusing on the point at which technology detracts from the guest experience, how to minimize operational risk from technology, and how to maximize consumers’ adoption rates.

Relevance: 20.00%

Abstract:

Introduction: The large amount of information in the medical field creates management problems, requiring systematic methods for storage and retrieval. When the information belongs to the context of the clinical record, these methods must integrate controlled biomedical terminologies as well as the desirable characteristics regarding structure, content and clinical results. The objective of this article is to test the applicability and retrieval capacity of a multidimensional system developed for the classification and management of health information. Methods: From the questions received over six years (Medicines Information Service, Pharmaceutical Services, Coimbra University Hospitals), 300 questions about clinical information were selected by a computerized random method. They were characterized, and applicability was evaluated by the amount classified and by the need to change the system, which is composed of several independent dimensions encompassing concepts that are sometimes hierarchical. Retrieval of the questions was tested by searching for information in one dimension or by crossing dimensions. Results: All questions were classified: 53% are clinical cases, mainly concerning genitourinary diseases; metabolic, nutritional and endocrine diseases; neoplasms; infections; and diseases of the nervous system. In 81%, the object is a drug, mostly anti-infective and anti-neoplastic agents. The therapeutics and safety areas were the most requested, focusing mainly on the subjects of drug use, adverse reactions, drug identification and pharmaceutical technology. Regarding applicability, it was necessary to add some concepts and modify some hierarchical groups, which neither changed the basic structure nor conflicted with the desirable characteristics. The limitations were related to the external classification systems that were integrated. A search in the subject dimension for the concept drug administration retrieved 19 questions. Crossing two dimensions, anti-infectives (external) and teratogenicity (subject), retrieved three questions. In both examples, information is retrieved from any level of the hierarchy, from the most general to the most specific, and even from external dimensions. Conclusions: The use of the system on this sample demonstrated its applicability to the classification and filing of clinical information, its retrieval capacity and its flexibility, accommodating changes without interfering with the desirable characteristics. This tool allows the retrieval of patient-oriented evidence that matters.
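As a rough illustration of how a multidimensional, partly hierarchical classification can support retrieval within one dimension or across dimensions, the sketch below indexes questions under concepts and their broader ancestors. It is a minimal, assumption-laden example (the dimension names, concepts, and hierarchy are invented for illustration), not the system described in the article.

    from collections import defaultdict

    # Hypothetical hierarchy: narrower concept -> broader parent concept (per dimension).
    HIERARCHY = {
        "teratogenicity": "safety",
        "adverse reactions": "safety",
        "drug administration": "use",
    }

    def ancestors(concept):
        """Yield a concept and every broader concept above it in the hierarchy."""
        while concept is not None:
            yield concept
            concept = HIERARCHY.get(concept)

    class MultidimensionalIndex:
        def __init__(self):
            self.index = defaultdict(set)   # (dimension, concept) -> set of question ids

        def classify(self, question_id, tags):
            """tags maps dimension -> concept, e.g. {"subject": "teratogenicity"}."""
            for dimension, concept in tags.items():
                for c in ancestors(concept):          # index under the concept and its ancestors
                    self.index[(dimension, c)].add(question_id)

        def search(self, dimension, concept):
            """Retrieve questions classified under a concept, at any level of the hierarchy."""
            return self.index[(dimension, concept)]

        def cross(self, query_a, query_b):
            """Cross two dimensions: questions matching both (dimension, concept) pairs."""
            return self.search(*query_a) & self.search(*query_b)

    idx = MultidimensionalIndex()
    idx.classify(1, {"subject": "teratogenicity", "external": "anti-infectives"})
    idx.classify(2, {"subject": "drug administration"})
    print(idx.search("subject", "safety"))                                            # {1}
    print(idx.cross(("external", "anti-infectives"), ("subject", "teratogenicity")))  # {1}

Because each question is also indexed under the broader ancestors of its concepts, a search for the general concept returns questions classified under its narrower concepts, mirroring the retrieval from any hierarchy level described above.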

Relevance: 20.00%

Abstract:

Doctoral thesis in Psychology (Educational Psychology), Universidade de Lisboa, Faculdade de Psicologia / Universidade de Coimbra, Faculdade de Psicologia e de Ciências da Educação, 2015

Relevance: 20.00%

Abstract:

A simple but effective technique to improve the performance of the Max-Log-MAP algorithm is to scale the extrinsic information exchanged between the two MAP decoders. A comprehensive analysis of the selection of the scaling factors according to channel conditions and decoding iterations is presented in this paper. Choosing a constant scaling factor for all SNRs and iterations is compared with selecting the best scaling factor for the changing channel conditions and decoding iterations. It is observed that a constant scaling factor for all channel conditions and decoding iterations provides a 0.2-0.4 dB gain over the standard Max-Log-MAP algorithm and offers the best compromise; a constant scaling factor should therefore be chosen.
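To make the idea concrete, the following sketch shows where a constant extrinsic scaling factor enters an iterative decoding loop. It is a toy illustration under stated assumptions: the constituent decoder is a stand-in (a real Max-Log-MAP decoder runs forward/backward recursions over the code trellis), interleaving is omitted, and the value 0.7 is a commonly used constant in the literature rather than the factor determined in this paper.

    import numpy as np

    SCALING_FACTOR = 0.7   # illustrative constant; values around 0.6-0.8 are common in the literature

    def toy_siso_decoder(channel_llr, a_priori_llr):
        """Stand-in for a constituent Max-Log-MAP decoder returning extrinsic LLRs.
        A real decoder would run the BCJR forward/backward recursions over the code trellis."""
        return 2.0 * np.tanh(channel_llr + a_priori_llr)    # illustrative only

    def turbo_iterations(channel_llr, n_iter=8, scale=SCALING_FACTOR):
        extrinsic_1 = np.zeros_like(channel_llr)
        extrinsic_2 = np.zeros_like(channel_llr)
        for _ in range(n_iter):
            # Each decoder receives the *scaled* extrinsic output of the other as a priori input;
            # scaling compensates for the over-optimistic LLRs produced by the max-log approximation.
            extrinsic_1 = toy_siso_decoder(channel_llr, scale * extrinsic_2)
            extrinsic_2 = toy_siso_decoder(channel_llr, scale * extrinsic_1)
        return channel_llr + scale * (extrinsic_1 + extrinsic_2)   # final a posteriori LLRs

    bits_llr = np.array([1.2, -0.4, 0.9, -2.1])
    print(np.sign(turbo_iterations(bits_llr)))   # hard decisions after iterative decoding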

Relevance: 20.00%

Abstract:

The iterative nature of turbo-decoding algorithms increases their complexity compared to conventional FEC decoding algorithms. The two iterative decoding algorithms, the Soft-Output Viterbi Algorithm (SOVA) and the Maximum A Posteriori (MAP) algorithm, require complex decoding operations over several iteration cycles. For real-time implementation of turbo codes, reducing decoder complexity while preserving bit-error-rate (BER) performance is therefore an important design consideration. In this chapter, a modification to the Max-Log-MAP algorithm is presented: scaling the extrinsic information exchanged between the constituent decoders. The remainder of this chapter is organized as follows. An overview of the turbo encoding and decoding processes, the MAP algorithm, and its simplified versions, the Log-MAP and Max-Log-MAP algorithms, is presented in Section 1. Extrinsic information scaling is introduced, simulation results are presented, and the performance of different methods for choosing the best scaling factor is discussed in Section 2. Section 3 discusses trends and applications of turbo coding from the perspective of wireless applications.
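For reference, the simplification that separates the Log-MAP and Max-Log-MAP algorithms is the treatment of the Jacobian logarithm, sketched below; Max-Log-MAP drops the correction term, which is the bias the extrinsic scaling of Section 2 compensates for. This snippet is an illustrative aside, not code from the chapter.

    import math

    def max_star(a, b):
        """Jacobian logarithm ln(e^a + e^b) used by the Log-MAP algorithm."""
        return max(a, b) + math.log1p(math.exp(-abs(a - b)))

    def max_star_approx(a, b):
        """Max-Log-MAP drops the correction term, trading accuracy for lower complexity."""
        return max(a, b)

    print(max_star(1.0, 0.5), max_star_approx(1.0, 0.5))   # ~1.474 vs 1.0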

Relevance: 20.00%

Abstract:

Thesis submitted in fulfilment of the requirements for the degree of Doctor in Digital Media

Relevance: 20.00%

Abstract:

Studies of functioning in the elderly population play an important role in current knowledge of world demography. Portugal ranks among the most aged countries and has a network of post-acute care, the National Network of Integrated Continuous Care (RNCCI), which assists an important share of this population. The conceptual aspects of functioning according to the WHO, operationalized through the International Classification of Functioning (ICF), have so far received insufficient application in our country, preventing contributions to their operationalization. Likewise, the Core Sets of the Classification have not undergone validation processes that include Portuguese samples, so the specific contextual factors of our population remain unknown. The objectives of the present study were to describe the evolution of functioning in the elderly assisted by the RNCCI in the Algarve region, in convalescence and medium-duration units, to validate the WHO Geriatric Core Set, and to propose an abridged version of its comprehensive form in this care context. The sample comprised 451 elderly people, 62.1% of whom were women. They showed favourable levels of functioning in the pre-morbid state, except for Domestic Activities. However, the oldest (≥ 85 years), individuals with no schooling, women, and the widowed/unmarried showed more unfavourable cases than their peers. Regarding the evolution of functioning, we observed significant improvements in all assessed domains, with differences related to age and schooling; despite the positive results, the oldest participants and those with no schooling showed lower levels of improvement. Nevertheless, the functioning achieved remained significantly below pre-morbid functioning. Regression models revealed that Mental Functions, Perceived Health Status, and the activity Using the Telephone were the variables that best explained the outcomes of achieved functioning. Validation of the Geriatric Core Set was possible for most categories, the Body Functions component being where the process showed the greatest weakness. Neuromusculoskeletal and Movement-Related Functions showed the highest frequencies of impairment at both assessment times, while in the Activities and Participation component this occurred for the activity Fine Hand Use. The chapters Support and Relationships and Attitudes were considered the most facilitating Environmental Factors, but also those with the greatest barrier impact. The proposed Brief Geriatric Core Set resulted from the independent categories that explained the models of achieved functioning and comprises a set of 27 categories, with an important emphasis on the Activities and Participation component, in which the Mobility and Self-Care domains stand out. The functioning of individuals and populations should be considered an unavoidable Public Health variable, whose assessment should reflect a biopsychosocial approach based on the International Classification of Functioning. The operationalization of the Classification through the Core Sets requires further research on the psychometric characteristics of its qualifiers and on its validation processes.

Relevance: 20.00%

Abstract:

The aim of this dissertation is to identify the structure of perception as it is drawn by Hierocles in the Ἠθικὴ Στοιχείωσις (Elements of Ethics). To reach this aim I will focus on the analysis that the Stoic philosopher carries out concerning perception. The focus falls first on the distinction between perception of oneself and perception of the external world, with all its subtleties. This analysis will then require considering the importance of non-indifference within that structure of perception. The aim is to understand how perception is always relational and interested. I will then concentrate on the notion of οἰκείωσις, trying to explore the complexity and multiplicity of the phenomenon at stake. The relation with the Stobaeus fragment (6.671), which is, alongside that work, the most important source for Hierocles' thought, will also be established. Through this relation the circular and concentric structure of a relational and interested perception will be introduced. The whole dissertation will be carried out by focusing mainly on Hierocles' own texts.

Relevance: 20.00%

Abstract:

Current computer systems have evolved from featuring only a single processing unit and limited RAM, in the order of kilobytes or a few megabytes, to including several multicore processors, offering in the order of several tens of concurrent execution contexts, and main memory in the order of several tens to hundreds of gigabytes. This makes it possible to keep all the data of many applications in main memory, leading to the development of in-memory databases. Compared to disk-backed databases, in-memory databases (IMDBs) are expected to provide better performance by incurring less I/O overhead. In this dissertation, we present a scalability study of two general-purpose IMDBs on multicore systems. The results show that current general-purpose IMDBs do not scale on multicores, due to contention among threads running concurrent transactions. In this work, we explore different directions to overcome the scalability issues of IMDBs on multicores, while enforcing strong isolation semantics. First, we present a solution that requires no modification to either the database system or the applications, called MacroDB. MacroDB replicates the database among several engines, using a master-slave replication scheme, where update transactions execute on the master, while read-only transactions execute on the slaves. This reduces contention, allowing MacroDB to offer scalable performance under read-only workloads, while update-intensive workloads suffer a performance loss compared to the standalone engine. Second, we delve into the database engine and identify the concurrency control mechanism used by the storage sub-component as a scalability bottleneck. We then propose a new locking scheme that allows the removal of such mechanisms from the storage sub-component. This modification offers a performance improvement under all workloads compared to the standalone engine, while scalability is limited to read-only workloads. Next, we address the scalability limitations for update-intensive workloads and propose reducing the locking granularity from the table level to the attribute level. This further improves performance for intensive and moderate update workloads, at a slight cost for read-only workloads; scalability is limited to read-intensive and read-only workloads. Finally, we investigate the impact applications have on the performance of database systems, by studying how the order of operations inside transactions influences database performance. We then propose a Read-before-Write (RbW) interaction pattern, under which transactions perform all read operations before executing write operations. The RbW pattern allowed TPC-C to achieve scalable performance on our modified engine for all workloads. Additionally, the RbW pattern allowed our modified engine to achieve scalable performance on multicores, almost up to the total number of cores, while enforcing strong isolation.
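The sketch below illustrates the master-slave routing idea behind the replication scheme described above: read-only transactions are spread across replicas, while update transactions run on the master and their effects are propagated. The engine interface used here (run, apply, last_write_set) is a hypothetical placeholder, not the MacroDB or database engine API.

    import itertools

    class ReplicatedDatabase:
        """Route transactions between a master engine and read-only replicas."""

        def __init__(self, master, replicas):
            self.master = master                          # executes every update transaction
            self.replicas = replicas                      # serve the read-only load
            self._round_robin = itertools.cycle(replicas)

        def execute(self, transaction, read_only):
            if read_only:
                # Read-only transactions never contend with updates on the master.
                return next(self._round_robin).run(transaction)
            result = self.master.run(transaction)
            for replica in self.replicas:                 # propagate the master's writes
                replica.apply(self.master.last_write_set())
            return result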

Relevance: 20.00%

Abstract:

The year is 2015 and the startup and tech business ecosphere has never seen more activity. In New York City alone, the tech startup industry is on track to amass $8 billion in total funding – the highest in 7 years (CB Insights, 2015). According to the Kauffman Index of Entrepreneurship (2015), this figure represents just 20% of the total funding in the United States. Thanks to platforms that link entrepreneurs with investors, there are simply more funding opportunities than ever, and funding can be initiated in a variety of ways (angel investors, venture capital firms, crowdfunding). And yet, in spite of all this, according to Forbes Magazine (2015), nine of ten startups will fail. Because of the unpredictable nature of the modern tech industry, it is difficult to pinpoint exactly why 90% of startups fail – but the general consensus amongst top tech executives is that “startups make products that no one wants” (Fortune, 2014). In 2011, author Eric Ries wrote a book called The Lean Startup in an attempt to solve this all-too-familiar problem. It was in this book that he developed the framework for the Hypothesis-Driven Entrepreneurship Process, an iterative process that aims at proving a market before actually launching a product. Ries discusses concepts such as the Minimum Viable Product, the smallest set of activities necessary to disprove a hypothesis (or business model characteristic). Ries encourages acting quickly and often: if you are to fail, then fail fast. In today’s fast-moving economy, an entrepreneur cannot afford to waste his own time, nor his customer’s time. The purpose of this thesis is to conduct an in-depth analysis of the Hypothesis-Driven Entrepreneurship Process, in order to test the market viability of a real-life startup idea, ShowMeAround. This analysis will follow the scientific Lean Startup approach, with the purpose of developing a functional business model and business plan. The objective is to conclude with an investment-ready startup idea, backed by rigorous entrepreneurial study.

Relevance: 20.00%

Abstract:

Biological scaling analyses employing the widely used bivariate allometric model are beset by at least four interacting problems: (1) choice of an appropriate best-fit line with due attention to the influence of outliers; (2) objective recognition of divergent subsets in the data (allometric grades); (3) potential restrictions on statistical independence resulting from phylogenetic inertia; and (4) the need for extreme caution in inferring causation from correlation. A new non-parametric line-fitting technique has been developed that eliminates requirements for normality of distribution, greatly reduces the influence of outliers and permits objective recognition of grade shifts in substantial datasets. This technique is applied in scaling analyses of mammalian gestation periods and of neonatal body mass in primates. These analyses feed into a re-examination, conducted with partial correlation analysis, of the maternal energy hypothesis relating to mammalian brain evolution, which suggests links between body size and brain size in neonates and adults, gestation period and basal metabolic rate. Much has been made of the potential problem of phylogenetic inertia as a confounding factor in scaling analyses. However, this problem may be less severe than suspected earlier because nested analyses of variance conducted on residual variation (rather than on raw values) reveal that there is considerable variance at low taxonomic levels. In fact, limited divergence in body size between closely related species is one of the prime examples of phylogenetic inertia. One common approach to eliminating perceived problems of phylogenetic inertia in allometric analyses has been the calculation of 'independent contrast values'. It is demonstrated that the reasoning behind this approach is flawed in several ways. Calculation of contrast values for closely related species of similar body size is, in fact, highly questionable, particularly when there are major deviations from the best-fit line for the scaling relationship under scrutiny.
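As a worked illustration of outlier-resistant line fitting on log-log axes, the sketch below uses the standard Theil-Sen estimator as a stand-in (not the new technique developed in this work) on purely synthetic data with an assumed scaling exponent of 0.25.

    import numpy as np
    from scipy.stats import theilslopes

    rng = np.random.default_rng(0)
    body_mass = rng.lognormal(mean=5.0, sigma=2.0, size=60)             # synthetic species body masses
    gestation = 12.0 * body_mass ** 0.25 * rng.lognormal(0, 0.15, 60)   # synthetic allometry, exponent 0.25
    gestation[:3] *= 5.0                                                # plant a few gross outliers

    x, y = np.log10(body_mass), np.log10(gestation)
    slope, intercept, lo, hi = theilslopes(y, x)                        # median of pairwise slopes
    print(f"non-parametric exponent ~ {slope:.2f} (95% CI {lo:.2f}-{hi:.2f})")

    ols_slope, ols_intercept = np.polyfit(x, y, 1)                      # least squares, outlier-sensitive
    print(f"OLS exponent ~ {ols_slope:.2f}")

Comparing the two printed exponents gives a feel for how much the planted outliers pull the least-squares fit relative to the non-parametric one.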

Relevance: 20.00%

Abstract:

The present study investigates the usefulness of a multi-method approach to the measurement of reading motivation and achievement. A sample of 127 elementary and middle-school children aged 10 to 14 responded to measures of motivation, attributions, and achievement both longitudinally and in a challenging reading context. Novel measures of motivation and attributions were constructed, validated, and utilized to examine the relationship between motivation, attributions, and achievement over a one-year period (Study I). The impact of classroom contexts and instructional practices was also explored through a study of the influence of topic interest and challenge on motivation, attributions, and persistence (Study II), as well as through interviews with children regarding motivation and reading in the classroom (Study III). Creation and validation of novel measures of motivation and attributions supported the use of a self-report measure of motivation in situation-specific contexts, and confirmed a three-factor structure of attributions for reading performance in both hypothetical and situation-specific contexts. A one-year follow-up study of children's motivation and reading achievement demonstrated declines in all components of motivation beginning at ages 10 through 12, and particularly strong decreases in motivation with the transition to middle school. Past perceived competence for reading predicted current achievement after controlling for past achievement, and showed the strongest relationships with reading-related skills in both elementary and middle school. Motivation and attributions were strongly related, and children with higher motivation displayed more adaptive attributions for reading success and failure. In the context of a developmentally inappropriate challenging reading task, children's motivation for reading, especially in terms of perceived competence, was threatened. However, interest in the story buffered some of the negative impacts of challenge, sustaining children's motivation, adaptive attributions, and reading persistence. Finally, children's responses during interviews outlined several emotions, perceptions, and aspects of reading tasks and contexts that influence reading motivation and achievement. Findings revealed that children with comparable motivation and achievement profiles respond in a similar way to particular reading situations, such as excessive challenge, but also that motivation is dynamic and individualistic and can change over time and across contexts. Overall, the present study outlines the importance of motivation and adaptive attributions for reading success, and the necessity of integrating various methodologies to study the dynamic construct of achievement motivation.

Relevance: 20.00%

Abstract:

We examine the measurement of multidimensional poverty and material deprivation following the counting approach. In contrast to earlier contributions, dimensions of well-being are not forced to be equally important but different weights can be assigned to different dimensions. We characterize a class of individual measures reflecting this feature. In addition, we axiomatize an aggregation procedure to obtain a class of indices for entire societies allowing for different degrees of inequality aversion in poverty. We apply the proposed measures to European Union member states where the concept of material deprivation was initiated.
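As a concrete illustration of a weighted counting measure with inequality aversion (the functional forms below are a standard expository example, assumed here rather than taken from the paper's axiomatization): for individual $i$ with deprivation indicators $d_{ij} \in \{0,1\}$ over $D$ dimensions and dimension weights $w_j$, a weighted deprivation score is

\[ c_i = \sum_{j=1}^{D} w_j\, d_{ij}, \qquad \sum_{j=1}^{D} w_j = 1, \]

and individual $i$ counts as materially deprived when $c_i \ge k$ for some cutoff $k$. A society-level index with inequality aversion parameter $\alpha \ge 1$ can then be written as

\[ P_\alpha = \frac{1}{n} \sum_{i=1}^{n} c_i^{\alpha}\, \mathbf{1}(c_i \ge k), \]

where a larger $\alpha$ gives more weight to individuals who are deprived in many, or more heavily weighted, dimensions.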

Relevance: 20.00%

Abstract:

Modern societies depend more and more on software systems, so there is increasing pressure on development teams to produce high-quality software. Many companies use quality models, suites of programs that analyze and evaluate the quality of other programs, but building quality models is difficult because several questions remain unanswered in the literature. We studied quality-modelling practices in a large company and identified three dimensions in which additional research is desirable: support for the subjectivity of quality, techniques for tracking quality as software evolves, and the composition of quality across levels of abstraction. Regarding subjectivity, we proposed the use of Bayesian models because they are able to handle ambiguous data. We applied our models to the problem of detecting design defects. In a study of two open-source systems, we found that our approach outperforms the rule-based techniques described in the state of the art. To support software evolution, we treated the scores produced by a quality model as signals that can be analyzed with data-mining techniques to identify patterns in the evolution of quality. We studied how design defects appear in and disappear from software systems. Software is typically designed as a hierarchy of components, but quality models do not take this organization into account. In the last part of the dissertation, we present a two-level quality model. These models have three parts: a component-level model, a model that evaluates the importance of each component, and a model that evaluates the quality of a composite by combining the quality of its components. The approach was tested on predicting change-prone classes from the quality of their methods. We found that our two-level models provide better identification of change-prone classes. Finally, we applied our two-level models to evaluating the navigability of websites from the quality of their pages. Our models were able to distinguish between very high-quality sites and randomly selected sites. Throughout the dissertation, we present not only theoretical problems and their solutions, but also experiments conducted to demonstrate the advantages and limitations of our solutions. Our results indicate that the state of the art can be improved along the three dimensions presented. In particular, our work on quality composition and importance modelling is the first to target this problem. We believe that our two-level models are an interesting starting point for further research.
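The two-level composition described above can be pictured with a small sketch: method-level quality scores are combined into a class-level score, weighted by an importance estimate. Everything here (the Method record, the importance values, and the weighted average used for composition) is an illustrative assumption rather than the dissertation's actual models.

    from dataclasses import dataclass

    @dataclass
    class Method:
        quality: float     # method-level score in [0, 1], e.g. from a lower-level quality model
        importance: float  # e.g. derived from size, fan-in, or change frequency

    def class_quality(methods):
        """Importance-weighted composition of method-level quality into a class-level score."""
        total = sum(m.importance for m in methods)
        if total == 0:
            return 0.0
        return sum(m.quality * m.importance for m in methods) / total

    widget_class = [Method(quality=0.9, importance=1.0),
                    Method(quality=0.3, importance=4.0)]   # one large, central, low-quality method
    print(f"class-level quality ~ {class_quality(widget_class):.2f}")   # 0.42, dominated by the important method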