956 results for ATTRIBUTE WEIGHTING
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's degree in Finance from the NOVA – School of Business and Economics and Maastricht University School of Business and Economics
Abstract:
Consumers’ indecision about the ethical value of their choices is among the chief concerns regarding the purchase of ethical products. This is especially true for Fair Trade certified products, where the ethical attribute information provided on the packaging often goes unacknowledged by consumers. While well-informed consumers are likely to react positively to ethical products and increase their ethical consumption, less knowledgeable buyers show different purchasing patterns. In such circumstances, decisions are often driven by socio-cultural beliefs about the low functional performance of ethical or sustainable attributes. For instance, products more congruent with sustainability (e.g., produce) are considered simpler but less tasty than less sustainable products, which in turn are considered more sophisticated and to provide consumers with more hedonic pleasure (e.g., chocolate mousse). The extent to which ethicality is linked with experiences that provide consumers with more pain than pleasure also manifests itself in pro-social behaviors, more specifically in conspicuous self-sacrificial consumption experiences such as running charity marathons with wide public exposure. Consumers’ willingness to engage in such costly initiatives is moderated by gender differences and, further, mediated by the chronic productivity orientation of some individuals, i.e., their disposition to use time productively. Using experimental design studies, I show that consumers (1) use a set of affective and cognitive associations with on-package elements to interpret ethical attributes, (2) implicitly associate ethicality with simplicity, and (3) that men and women show different preferences in their forms of contribution to pro-social causes.
Abstract:
Doctoral thesis presented to ISPA - Instituto Universitário
Abstract:
The report addresses the question of what the preferences of broadband consumers are on the Portuguese telecommunications market. A triple-play bundle is investigated. The discrete choice analysis adopted in the study is based on 110 responses, mainly from NOVA students. The data for the analysis were collected via a manually designed online survey. The results show that the price attribute is relatively the most important one, while the television attribute is overlooked in the decision-making process. The main effects examined in the research are robust. In addition, "extras" components are tested in terms of users' preferences.
Abstract:
Current computer systems have evolved from featuring only a single processing unit and limited RAM, in the order of kilobytes or a few megabytes, to including several multicore processors, offering in the order of several tens of concurrent execution contexts, with main memory in the order of several tens to hundreds of gigabytes. This allows all the data of many applications to be kept in main memory, leading to the development of in-memory databases. Compared to disk-backed databases, in-memory databases (IMDBs) are expected to provide better performance by incurring less I/O overhead. In this dissertation, we present a scalability study of two general-purpose IMDBs on multicore systems. The results show that current general-purpose IMDBs do not scale on multicores, due to contention among threads running concurrent transactions. In this work, we explore different directions to overcome the scalability issues of IMDBs on multicores, while enforcing strong isolation semantics. First, we present a solution that requires no modification to either the database system or the applications, called MacroDB. MacroDB replicates the database among several engines, using a master-slave replication scheme, where update transactions execute on the master while read-only transactions execute on the slaves. This reduces contention, allowing MacroDB to offer scalable performance under read-only workloads, while update-intensive workloads suffer a performance loss compared to the standalone engine. Second, we delve into the database engine and identify the concurrency control mechanism used by the storage sub-component as a scalability bottleneck. We then propose a new locking scheme that allows the removal of such mechanisms from the storage sub-component. This modification offers a performance improvement under all workloads compared to the standalone engine, although scalability remains limited to read-only workloads.
Next, we address the scalability limitations for update-intensive workloads and propose reducing the locking granularity from the table level to the attribute level. This further improves performance for intensive and moderate update workloads, at a slight cost for read-only workloads; scalability is limited to read-intensive and read-only workloads. Finally, we investigate the impact applications have on the performance of database systems by studying how the order of operations inside transactions influences database performance. We then propose a Read-before-Write (RbW) interaction pattern, under which transactions perform all read operations before executing write operations. The RbW pattern allowed TPC-C to achieve scalable performance on our modified engine for all workloads. Additionally, the RbW pattern allowed our modified engine to achieve scalable performance on multicores, almost up to the total number of cores, while enforcing strong isolation.
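The Read-before-Write pattern lends itself to a short illustration. The following is a hypothetical Python sketch (a toy in-memory store, not the dissertation's engine API): the transaction issues all of its reads first, computes on the values read, and only then applies its writes, so no write precedes any read.

```python
# Hypothetical in-memory store illustrating the Read-before-Write (RbW)
# pattern: all reads happen in phase 1, all writes in phase 2.
store = {"stock:item1": 10, "stock:item2": 4}

def rbw_decrement_stock(item_ids, quantity):
    # Phase 1: perform every read up front.
    levels = {i: store["stock:" + i] for i in item_ids}
    # Local computation on the values that were read.
    updates = {i: lvl - quantity for i, lvl in levels.items()}
    if any(v < 0 for v in updates.values()):
        return False  # abort: insufficient stock, no writes issued
    # Phase 2: apply every write only after all reads completed.
    for i, v in updates.items():
        store["stock:" + i] = v
    return True

rbw_decrement_stock(["item1", "item2"], 2)
# store now holds item1 -> 8, item2 -> 2
```

In a locking engine, grouping writes at the end shortens the window during which write locks are held, which is one intuition for why such a pattern can help scalability.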
Abstract:
Servant leadership theory has been the subject of great academic discussion, namely concerning the search for a consensus definition. As many frameworks have been designed to define the servant leader’s characteristics, we based our work on van Dierendonck’s (2011) review and synthesis of servant leadership to assess how it is perceived in a Portuguese organizational context. After performing several interviews in a private health care organization, we conclude that the perception of servant leadership is generally positive and that its characteristics seem to be in line with the academic literature. However, some issues arose, such as a seeming lack of relevance given to authenticity and humility, the latter being a unique attribute of servant leadership. We also found a discrepancy between hierarchical levels’ perceptions of servant leadership characteristics, and we question whether an over-emphasis on service can diminish the servant leader’s impact on organizational performance.
Abstract:
AIMS/HYPOTHESIS: Several susceptibility genes for type 2 diabetes have been discovered recently. Individually, these genes increase the disease risk only minimally. The goals of the present study were to determine, at the population level, the risk of diabetes in individuals who carry risk alleles within several susceptibility genes for the disease and the added value of this genetic information over the clinical predictors. METHODS: We constructed an additive genetic score using the most replicated single-nucleotide polymorphisms (SNPs) within 15 type 2 diabetes-susceptibility genes, weighting each SNP with its reported effect. We tested this score in the extensively phenotyped population-based cross-sectional CoLaus Study in Lausanne, Switzerland (n = 5,360), involving 356 diabetic individuals. RESULTS: The clinical predictors of prevalent diabetes were age, BMI, family history of diabetes, WHR, and triacylglycerol/HDL-cholesterol ratio. After adjustment for these variables, the risk of diabetes was 2.7 (95% CI 1.8-4.0, p = 0.000006) for individuals with a genetic score within the top quintile, compared with the bottom quintile. Adding the genetic score to the clinical covariates improved the area under the receiver operating characteristic curve slightly (from 0.86 to 0.87), yet significantly (p = 0.002). BMI was similar in these two extreme quintiles. CONCLUSIONS/INTERPRETATION: In this population, a simple weighted 15 SNP-based genetic score provides additional information over clinical predictors of prevalent diabetes. At this stage, however, the clinical benefit of this genetic information is limited.
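The weighted additive score used above can be sketched in a few lines: each SNP contributes its risk-allele count multiplied by its reported effect size. This is an illustrative reconstruction; the SNP identifiers and effect sizes below are hypothetical, not the study's 15 SNPs.

```python
# Weighted additive genetic risk score: sum over SNPs of
# (risk alleles carried) x (reported per-allele effect size).
def genetic_score(allele_counts, effect_weights):
    """allele_counts: dict SNP -> 0, 1, or 2 risk alleles carried.
    effect_weights: dict SNP -> reported effect (e.g., per-allele log odds ratio)."""
    return sum(allele_counts[snp] * effect_weights[snp] for snp in effect_weights)

# Hypothetical example with three SNPs (illustrative values only):
weights = {"rsA": 0.15, "rsB": 0.10, "rsC": 0.30}
carrier = {"rsA": 2, "rsB": 1, "rsC": 0}
print(genetic_score(carrier, weights))  # approximately 0.40
```

Individuals can then be ranked into quintiles of this score, as done in the study when comparing the top and bottom quintiles.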
Abstract:
[Contents] Introduction. Objectives. Methodology. Results. Characteristics of the sample. Substance use (Psychoactive substances, Performance-enhancing substances). Profile of sportive adolescents using substances. Mixed substance use. Other factors related to substance use. Inactivity. Conclusions. References. Annexes. Annex 1. Questionnaire. Annex 2. Sample weighting procedure. Annex 3. Sports type.
Abstract:
Gestures are the first forms of conventional communication that young children develop in order to intentionally convey a specific message. However, at first, infants rarely communicate successfully with their gestures, prompting caregivers to interpret them. Although the role of caregivers in early communication development has been examined, little is known about how caregivers attribute a specific communicative function to infants' gestures. In this study, we argue that caregivers rely on the knowledge about the referent that is shared with infants in order to interpret what communicative function infants wish to convey with their gestures. We videotaped interactions from six caregiver-infant dyads playing with toys when infants were 8, 10, 12, 14, and 16 months old. We coded infants' gesture production and we determined whether caregivers interpreted those gestures as conveying a clear communicative function or not; we also coded whether infants used objects according to their conventions of use as a measure of shared knowledge about the referent. Results revealed an association between infants' increasing knowledge of object use and maternal interpretations of infants' gestures as conveying a clear communicative function. Our findings emphasize the importance of shared knowledge in shaping infants' emergent communicative skills.
Abstract:
Samples of volcanic rocks from Alboran Island, the Alboran Sea floor and from the Gourougou volcanic centre in northern Morocco have been analyzed for major and trace elements and Sr-Nd isotopes to test current theories on the tectonic and geodynamic evolution of the Alboran Sea. The Alboran Island samples are low-K tholeiitic basaltic andesites whose depleted contents of HFS elements (~0.5× N-MORB), especially Nb (~0.2× N-MORB), show marked geochemical parallels with volcanics from immature intra-oceanic arcs and back-arc basins. Several of the submarine samples have similar compositions, one showing low-Ca boninite affinity. 143Nd/144Nd ratios fall in the same range as many island-arc and back-arc basin samples, whereas 87Sr/86Sr ratios (on leached samples) are somewhat more radiogenic. Our data point to active subduction taking place beneath the Alboran region in Miocene times, and imply the presence of an associated back-arc spreading centre. Our sea floor suite includes a few more evolved dacite and rhyolite samples with (87Sr/86Sr)0 up to 0.717 that probably represent varying degrees of crustal melting. The shoshonite and high-K basaltic andesite lavas from Gourougou have comparable normalized incompatible-element enrichment diagrams and Ce/Y ratios to shoshonitic volcanics from oceanic island arcs, though they have less pronounced Nb deficits. They are much less LIL- and LREE-enriched than continental arc analogues and post-collisional shoshonites from Tibet. The magmas probably originated by melting in subcontinental lithospheric mantle that had experienced negligible subduction input. Sr-Nd isotope compositions point to significant crustal contamination, which appears to account for the small Nb anomalies.
The unmistakable supra-subduction zone (SSZ) signature shown by our Alboran basalt and basaltic andesite samples refutes geodynamic models that attribute all Neogene volcanism in the Alboran domain to decompression melting of upwelling asthenosphere arising from convective thinning of over-thickened lithosphere. Our data support recent models in which subsidence is caused by westward rollback of an eastward-dipping subduction zone beneath the westernmost Mediterranean. Moreover, severance of the lithosphere at the edges of the rolling-back slab provides opportunities for locally melting lithospheric mantle, providing a possible explanation for the shoshonitic volcanism seen in northern Morocco and more sporadically in SE Spain. © 2004 Elsevier B.V. All rights reserved.
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
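The weighted grid scoring described above reduces to a weighted sum over criteria. A minimal sketch follows; the criterion weights and ratings are invented for illustration, not the survey's actual values.

```python
# Weighted scoring against a standardized evaluation grid:
# overall score = sum of (criterion rating x criterion weight).
# Weights below are illustrative only and sum to 1.0.
criteria_weights = {
    "pharmacokinetic_relevance": 0.35,
    "user_friendliness": 0.25,
    "computing_aspects": 0.20,
    "interfacing": 0.10,
    "storage": 0.10,
}

def weighted_score(ratings):
    """ratings: dict criterion -> rating on a common scale (e.g., 0-10)."""
    return sum(ratings[c] * w for c, w in criteria_weights.items())

example = {"pharmacokinetic_relevance": 8, "user_friendliness": 6,
           "computing_aspects": 7, "interfacing": 5, "storage": 9}
print(weighted_score(example))  # approximately 7.1
```

Ranking the 12 tools then amounts to sorting them by this overall score.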
Abstract:
The World Health Organization (WHO) criteria for the diagnosis of osteoporosis are mainly applicable to dual X-ray absorptiometry (DXA) measurements at the spine and hip. There is a growing demand for cheaper devices free of ionizing radiation, such as the promising quantitative ultrasound (QUS) technique. In common with many other countries, QUS measurements are increasingly used in Switzerland without adequate clinical guidelines. The T-score approach developed for DXA cannot be applied to QUS, although well-conducted prospective studies have shown that ultrasound can be a valuable predictor of fracture risk. As a consequence, an expert committee named the Swiss Quality Assurance Project (SQAP), whose main mission is the establishment of quality assurance procedures for DXA and QUS in Switzerland, was mandated by the Swiss Association Against Osteoporosis (ASCO) in 2000 to propose operational clinical recommendations for the use of QUS in the management of osteoporosis for two QUS devices sold in Switzerland. Device-specific weighted "T-scores", based on the risk of osteoporotic hip fracture as well as on the prediction of DXA-defined osteoporosis at the hip according to the WHO definition, were calculated for the Achilles (Lunar, General Electric, Madison, Wis.) and Sahara (Hologic, Waltham, Mass.) ultrasound devices. Several studies (totaling a few thousand subjects) were used to calculate age-adjusted odds ratios (OR) and areas under the receiver operating curve (AUC) for the prediction of osteoporotic fracture, taking into account a weighting score depending on the design of each study involved in the calculation. The ORs were 2.4 (1.9-3.2) and the AUC 0.72 (0.66-0.77) for the Achilles, and 2.3 (1.7-3.1) and 0.75 (0.68-0.82), respectively, for the Sahara device.
To translate risk estimates into thresholds for clinical application, 90% sensitivity was used to define low fracture and low osteoporosis risk, and 80% specificity was used to define subjects at high risk of fracture or of having osteoporosis at the hip. From the combination of the fracture model with the hip DXA osteoporosis model, we found T-score thresholds of -1.2 and -2.5 for the stiffness index (Achilles), determining the low- and high-risk subjects, respectively. Similarly, we found T-score thresholds of -1.0 and -2.2 for the QUI index (Sahara). A screening strategy combining QUS, DXA, and clinical factors for the identification of women needing treatment was then proposed. The application of this approach will help to minimize the inappropriate use of QUS from which the whole field currently suffers.
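For reference, the densitometric T-score behind these thresholds is the number of young-adult standard deviations by which a measurement departs from the young-adult reference mean. A minimal sketch, using the Achilles stiffness thresholds quoted above and hypothetical reference values:

```python
# Standard densitometric T-score: distance of a measurement from the
# young-adult reference mean, in young-adult standard deviations.
def t_score(value, ref_mean, ref_sd):
    return (value - ref_mean) / ref_sd

# Triage against the device-specific thresholds quoted above for the
# Achilles stiffness index: low risk above -1.2, high risk at or below -2.5.
def risk_category(t, low_thresh=-1.2, high_thresh=-2.5):
    if t > low_thresh:
        return "low risk"
    if t <= high_thresh:
        return "high risk"
    return "intermediate"

# Reference mean and SD here are hypothetical, for illustration only.
t = t_score(70.0, ref_mean=100.0, ref_sd=12.0)  # t = -2.5
print(risk_category(t))  # "high risk"
```

In the proposed screening strategy, only the intermediate band would typically be referred for confirmatory DXA.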
Abstract:
The present thesis focuses on the phenomenon of hostile intent attribution. Dodge (1980) observed that, in ambiguous situations, aggressive people tend to over-attribute hostile intents to others, which leads them to respond aggressively. According to the author, hostile intent attribution mediates the link between certain personal characteristics of individuals (e.g., aggressiveness) and their responses to situations. However, information about participants' group membership is consistently neglected in these studies. Bègue and Muller (2006) showed that certain beliefs can moderate the interaction between aggressiveness and hostile intent attribution on behavior, but no study has exhibited evidence of a similar effect for group-level information. The aim of this thesis is to show that hostile intent attribution needs to be considered at an intergroup level, by taking into account people's group membership.
Based on the Dodge model, we formulated the hypothesis that intergroup strategies have an impact on the interpretation of actors' intents, which in turn should lead to different but adapted reactions to the situation. To test this hypothesis, three lines of research were developed. In the first line, we introduced into Dodge's paradigm information about the participants' group membership (ingroup vs. outgroup). We showed that, when a response to a specific situation is elaborated, the situation's nature (ambiguous vs. hostile) had less impact than group membership information (Study 1). In addition, we highlighted different processes according to the position of individuals within their group (Study 2). In the second line, we showed that while differences in group status did not directly influence the Dodge model, they interacted with group membership and the nature of the situation to influence hostile intent attribution (Study 3) and behavioral intents (Study 4). In the last line of research, we introduced hostile intent attribution into the process of derogation of a target who explains a failure by discrimination (Kaiser and Miller, 2001, 2003). We showed that hostile intent attribution mediated the link between the attribution mobilized to explain the failure and the derogation of the target (Study 5), and that this type of attribution was specifically linked to aggressive behavioral intents (Study 6). We conclude that hostile intent attribution has an important social dimension which needs to be taken into account, because it is involved in the construction of a representation of social interactions.
Abstract:
Raised blood pressure (BP) is a major risk factor for cardiovascular disease. Previous studies have identified 47 distinct genetic variants robustly associated with BP, but collectively these explain only a few percent of the heritability for BP phenotypes. To find additional BP loci, we used a bespoke gene-centric array to genotype an independent discovery sample of 25,118 individuals that combined hypertensive case-control and general population samples. We followed up four SNPs associated with BP at our p < 8.56 × 10⁻⁷ study-specific significance threshold and six suggestively associated SNPs in a further 59,349 individuals. We identified and replicated a SNP at LSP1/TNNT3, a SNP at MTHFR-NPPB independent (r² = 0.33) of previous reports, and replicated SNPs at AGT and ATP2B1 reported previously. An analysis of combined discovery and follow-up data identified SNPs significantly associated with BP at p < 8.56 × 10⁻⁷ at four further loci (NPR3, HFE, NOS3, and SOX6). The high number of discoveries made with modest genotyping effort can be attributed to using a large-scale yet targeted genotyping array and to the development of a weighting scheme that maximized power when meta-analyzing results from samples ascertained with extreme phenotypes, in combination with results from nonascertained or population samples. Chromatin immunoprecipitation and transcript expression data highlight potential gene regulatory mechanisms at the MTHFR and NOS3 loci. These results provide candidates for further study to help dissect mechanisms affecting BP and highlight the utility of studying SNPs and samples that are independent of those studied previously even when the sample size is smaller than that in previous studies.
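The abstract does not give the exact form of the power-maximizing weights, but a standard fixed-effect (inverse-variance) meta-analysis, in which each sample's effect estimate is weighted by the reciprocal of its squared standard error, illustrates the general idea of combining ascertained and population samples:

```python
import math

# Fixed-effect (inverse-variance) meta-analysis: each sample's effect
# estimate is weighted by 1 / SE^2. This is a generic sketch; the study's
# actual weighting scheme for extreme-phenotype samples is not specified
# in the abstract, and the numbers below are invented for illustration.
def inverse_variance_meta(effects, std_errors):
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * b for w, b in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Two hypothetical samples: a case-control sample and a population sample.
beta, se = inverse_variance_meta([0.10, 0.06], [0.02, 0.03])
```

The more precisely estimated sample (smaller standard error) dominates the pooled estimate, which is the mechanism an ascertainment-aware weighting scheme generalizes.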
Abstract:
In coronary magnetic resonance angiography, a magnetization-preparation scheme for T2-weighting (T2 Prep) is widely used to enhance contrast between the coronary blood-pool and the myocardium. This prepulse is commonly applied without spatial selection to minimize flow sensitivity, but the nonselective implementation results in a reduced magnetization of the in-flowing blood and a related penalty in signal-to-noise ratio. It is hypothesized that a spatially selective T2 Prep would leave the magnetization of blood outside the T2 Prep volume unaffected and thereby lower the signal-to-noise ratio penalty. To test this hypothesis, a spatially selective T2 Prep was implemented in which the user could freely adjust the angulation and position of the T2 Prep slab to avoid covering the ventricular blood-pool and saturating the in-flowing spins. A time gap of 150 ms was further added between the T2 Prep and the other prepulses to allow for in-flow of a larger volume of unsaturated spins. Consistent with numerical simulation, the spatially selective T2 Prep increased in vivo human coronary artery signal-to-noise ratio (42.3 ± 2.9 vs. 31.4 ± 2.2, n = 22, P < 0.0001) and contrast-to-noise ratio (18.6 ± 1.5 vs. 13.9 ± 1.2, P = 0.009) as compared to those of the nonselective T2 Prep. Additionally, a segmental analysis demonstrated that the spatially selective T2 Prep was most beneficial in proximal and mid segments, where the in-flowing blood volume was largest compared to the distal segments. Magn Reson Med, 2013. © 2012 Wiley Periodicals, Inc.