956 results for: Quebec framework of reference
Abstract:
There are many situations in which individuals have a choice of whether or not to observe eventual outcomes. In these instances, individuals often prefer to remain ignorant. These contexts are outside the scope of analysis of the standard von Neumann-Morgenstern (vNM) expected utility model, which does not distinguish between lotteries for which the agent sees the final outcome and those for which he does not. I develop a simple model that admits preferences for making an observation or for remaining in doubt. I then use this model to analyze the connection between preferences of this nature and risk attitude. This framework accommodates a wide array of behavioral patterns that violate the vNM model, and that may not seem related, prima facie. For instance, it admits self-handicapping, in which an agent chooses to impair his own performance. It also accommodates a status quo bias without having recourse to framing effects, or to an explicit definition of reference points. In a political economy context, voters have strict incentives to shield themselves from information. In settings with other-regarding preferences, this model predicts observed behavior that seems inconsistent with either altruism or self-interested behavior.
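The wedge between observing and not observing can be illustrated with a two-stage recursive evaluation in the spirit of Kreps-Porteus preferences. This is a hypothetical sketch, not the author's construction: an outer function v is applied either outcome by outcome (observe) or to the unresolved expectation (remain in doubt), so the two valuations differ whenever v is nonlinear.

```python
# Toy illustration (hypothetical, not the paper's model): a two-stage
# recursive evaluation in which observing resolves the lottery before
# the outer function v is applied, while staying ignorant applies v
# to the unresolved expectation.
def u(x):          # utility over outcomes
    return x ** 0.5

def v(w):          # attitude toward the timing of resolution
    return w ** 2  # convex v -> preference for early resolution here

lottery = [(0.5, 100.0), (0.5, 36.0)]  # (probability, outcome)

# Observe: v is applied outcome by outcome, then averaged.
value_observe = sum(p * v(u(x)) for p, x in lottery)

# Remain in doubt: the expectation is formed first, then v is applied.
value_ignorant = v(sum(p * u(x) for p, x in lottery))

print(value_observe, value_ignorant)  # 68.0 vs 64.0 -> prefers to observe
```

With a concave v the ranking reverses, yielding a strict preference for remaining ignorant.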
Abstract:
With the advancement of high-throughput sequencing and the dramatic increase in available genetic data, statistical modeling has become an essential part of the field of molecular evolution. Statistical modeling has led to many interesting discoveries in the field, from the detection of highly conserved or diverse regions in a genome to the phylogenetic inference of species' evolutionary history. Among the different types of genome sequences, protein-coding regions are particularly interesting due to their impact on proteins. The building blocks of proteins, i.e. amino acids, are coded by triplets of nucleotides, known as codons. Accordingly, studying the evolution of codons leads to a fundamental understanding of how proteins function and evolve. Current codon models can be classified into three principal groups: mechanistic codon models, empirical codon models, and hybrid ones. Mechanistic models attract particular attention due to the clarity of their underlying biological assumptions and parameters. However, they suffer from simplifying assumptions that are required to overcome the burden of computational complexity. The main assumptions applied to current mechanistic codon models are (a) double and triple substitutions of nucleotides within codons are negligible, (b) there is no mutation variation among the nucleotides of a single codon, and (c) the HKY nucleotide model is sufficient to capture the essence of transition-transversion rates at the nucleotide level. In this thesis, I pursue two main objectives. The first is to develop a framework of mechanistic codon models, named the KCM-based model family framework, based on holding or relaxing the assumptions above. Accordingly, eight different models are proposed from the eight combinations of holding or relaxing the assumptions, from the simplest one that holds all of them to the most general one that relaxes all of them. The models derived from the proposed framework allow me to investigate the biological plausibility of the three simplifying assumptions on real data sets, as well as to find the model best aligned with the underlying characteristics of each data set. Our experiments show that holding all three assumptions is not realistic in any of the real data sets; using simple models that hold these assumptions can therefore be misleading and can result in inaccurate parameter estimates. The second objective is to develop a generalized mechanistic codon model that relaxes all three simplifying assumptions while remaining computationally efficient, using a matrix operation called the Kronecker product. Our experiments show that, over a set of randomly chosen data sets, the proposed generalized mechanistic codon model outperforms the other codon models with respect to the AICc metric in about half of the data sets. In addition, I show through several experiments that the proposed general model is biologically plausible.
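To make the Kronecker-product idea concrete, the sketch below builds a 64×64 codon rate matrix from 4×4 HKY nucleotide matrices for the case of independent single-position substitutions. It is an illustration of the general technique, not the thesis's exact construction; relaxing assumption (a) would add terms with off-diagonal factors at two or three positions, and relaxing (b) would use a different nucleotide matrix at each codon position.

```python
# Minimal sketch of the Kronecker-product idea (an illustration, not the
# thesis's exact construction): a 64x64 codon rate matrix for independent
# single-position substitutions, assembled from 4x4 nucleotide matrices.
import numpy as np

def hky(kappa, pi):
    """HKY-style 4x4 nucleotide rate matrix; nucleotide order A, C, G, T."""
    transitions = {(0, 2), (2, 0), (1, 3), (3, 1)}  # A<->G, C<->T
    q = np.zeros((4, 4))
    for i in range(4):
        for j in range(4):
            if i != j:
                rate = kappa if (i, j) in transitions else 1.0
                q[i, j] = rate * pi[j]
        q[i, i] = -q[i].sum()
    return q

pi = np.array([0.3, 0.2, 0.2, 0.3])
q_nuc = hky(kappa=2.0, pi=pi)
eye = np.eye(4)

# Kronecker sum: each term changes exactly one codon position.
q_codon = (np.kron(np.kron(q_nuc, eye), eye)
           + np.kron(np.kron(eye, q_nuc), eye)
           + np.kron(np.kron(eye, eye), q_nuc))

assert q_codon.shape == (64, 64)
assert np.allclose(q_codon.sum(axis=1), 0.0)  # rate-matrix rows sum to 0
```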
Abstract:
The electrical charges in soil particles are divided into structural or permanent charges and variable charges. Permanent charges develop on the soil particle surface by isomorphic substitution. Variable charges arise from the dissociation and association of protons (H+), i.e., protonation or deprotonation, and from the specific adsorption of cations and anions. The aim of this study was to quantify the permanent and variable charges of Reference Soils of the State of Pernambuco, Brazil. To do so, 24 subsurface profiles from different regions (nine in the Zona da Mata, eight in the Agreste, and seven in the Sertão) were sampled, representing approximately 80% of the total area of the state. Measurements were performed using a cesium chloride solution. The permanent charges and the charges associated with hydroxyl functional groups were determined through selective ion exchange of Cs+ by Li+ and of Cs+ by NH4+, respectively. All the soils analyzed exhibited variable cation exchange capacity, with proportions from 0.16 to 0.60 and an average of 0.40 relative to total cation exchange capacity.
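As a small worked example of the bookkeeping behind the reported proportions (all numbers made up):

```python
# Hypothetical worked example of the charge bookkeeping described above:
# the variable fraction is the variable charge relative to total CEC.
cec_total = 8.0         # total cation exchange capacity, cmolc/kg (made up)
charge_permanent = 4.8  # Cs+/Li+ exchange estimate, cmolc/kg (made up)
charge_variable = cec_total - charge_permanent
fraction_variable = charge_variable / cec_total
print(fraction_variable)  # 0.4, matching the reported average proportion
```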
Abstract:
Outgoing radiation is introduced in the framework of classical predictive electrodynamics, using the Lorentz-Dirac equation as a subsidiary condition. In a perturbative scheme in the charges, the first radiative self-terms of the accelerations, momentum, and angular momentum of a two-charge system without external field are calculated.
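For reference, the Lorentz-Dirac equation in its standard covariant form reads as follows; sign and unit conventions vary across the literature, so treat this as one common convention rather than the paper's exact notation.

```latex
% Standard covariant form of the Lorentz-Dirac equation (conventions vary):
% m is the charge's mass, u^\mu its four-velocity, \tau proper time,
% F^\mu_{\text{ext}} the external four-force.
m\,\frac{du^{\mu}}{d\tau}
  = F^{\mu}_{\text{ext}}
  + \frac{2e^{2}}{3c^{3}}
    \left(\frac{d^{2}u^{\mu}}{d\tau^{2}}
          + \frac{1}{c^{2}}\,\frac{du^{\nu}}{d\tau}\frac{du_{\nu}}{d\tau}\,u^{\mu}\right)
```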
Abstract:
In the context of recent attempts to redefine the 'skin notation' concept, a position paper summarizing an international workshop on the topic stated that the skin notation should be a hazard indicator related to the degree of toxicity and the potential for transdermal exposure of a chemical. Within the framework of developing a web-based tool integrating this concept, we constructed a database of 7101 agents for which a percutaneous permeation constant can be estimated (using molecular weight and the octanol-water partition constant), and for which at least one of the following toxicity indices could be retrieved: inhalation occupational exposure limit (n=644), oral lethal dose 50 (LD50, n=6708), cutaneous LD50 (n=1801), oral no-observed-adverse-effect level (NOAEL, n=1600), and cutaneous NOAEL (n=187). Data sources included the Registry of Toxic Effects of Chemical Substances (RTECS, MDL Information Systems, Inc.), PHYSPROP (Syracuse Research Corp.), and safety cards from the International Programme on Chemical Safety (IPCS). A hazard index, which corresponds to the product of exposure duration and skin surface exposed that would yield an internal dose equal to a toxic reference dose, was calculated. This presentation provides a descriptive summary of the database, correlations between toxicity indices, and an example of how the web tool will help industrial hygienists decide on the possibility of a dermal risk using the hazard index.
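A sketch of the hazard-index logic is given below. The abstract does not state which permeation estimate the tool uses; the widely cited Potts-Guy regression is assumed here for illustration, and the example numbers are hypothetical.

```python
# Hedged sketch of the hazard-index logic described above. The Potts-Guy
# relation for the permeation coefficient Kp is a common choice, but the
# abstract does not state which estimate the tool uses; treat it as an
# assumption, as are the example numbers.
def kp_potts_guy(mw, log_kow):
    """Skin permeation coefficient in cm/h (Potts & Guy 1992 regression)."""
    return 10 ** (-2.72 + 0.71 * log_kow - 0.0061 * mw)

def hazard_index(ref_dose_mg, conc_mg_per_cm3, mw, log_kow):
    """Duration x area product (h*cm2) at which the absorbed dose
    Kp * C * area * time reaches the toxic reference dose."""
    kp = kp_potts_guy(mw, log_kow)
    return ref_dose_mg / (kp * conc_mg_per_cm3)

# Example: hypothetical solvent, neat liquid contact.
print(hazard_index(ref_dose_mg=50.0, conc_mg_per_cm3=0.87,
                   mw=92.1, log_kow=2.7))
```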
Abstract:
Determination of fat-free mass (FFM) and fat mass (FM) is of considerable interest in the evaluation of nutritional status. In recent years, bioelectrical impedance analysis (BIA) has emerged as a simple, reproducible method for the evaluation of FFM and FM, but the lack of reference values reduces its utility for evaluating nutritional status. The aim of this study was to determine reference values for FFM, FM, and %FM by BIA in a white population of healthy subjects, to observe the changes in these values with age, and to develop percentile distributions for these parameters. Whole-body resistance of 1838 healthy white men and 1555 women, aged 15-64 y, was determined using four skin electrodes on the right hand and foot. FFM and FM were calculated according to formulas validated for the subject groups and analyzed by age decade. This is the first study to present BIA-determined age- and sex-specific percentiles for FFM, FM, and %FM for healthy subjects aged 15-64 y. Mean FM and %FM increased progressively with age in men, and after age 45 y in women. The results suggest that any weight gain noted with age is due to a gain in FM. In conclusion, the data presented as percentiles can serve as a reference to evaluate the normality of body composition of healthy and ill subject groups at a given age.
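The sketch below illustrates the resistance-index form that most validated BIA prediction equations share. The coefficients are placeholders, not the formulas validated in this study.

```python
# Illustrative BIA calculation (coefficients are placeholders; the study
# used formulas validated for its own subject groups, not these numbers).
def fat_free_mass_kg(height_cm, resistance_ohm, weight_kg, is_male):
    # Generic resistance-index form: FFM ~ a*height^2/R + b*weight + c*sex + d
    a, b, c, d = 0.50, 0.23, 4.2, -4.0  # placeholder coefficients
    return (a * height_cm ** 2 / resistance_ohm + b * weight_kg
            + c * (1 if is_male else 0) + d)

weight = 70.0
ffm = fat_free_mass_kg(height_cm=175.0, resistance_ohm=520.0,
                       weight_kg=weight, is_male=True)
fm = weight - ffm                 # fat mass is the remainder of body weight
pct_fm = 100.0 * fm / weight
print(ffm, fm, pct_fm)
```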
Abstract:
BACKGROUND: Urine catecholamines, vanillylmandelic acid, and homovanillic acid are recognized biomarkers for the diagnosis and follow-up of neuroblastoma. Plasma free (f) and total (t) normetanephrine (NMN), metanephrine (MN), and methoxytyramine (MT) could represent a convenient alternative to those urine markers. The primary objective of this study was to establish pediatric centile charts for plasma metanephrines. Secondarily, we explored their diagnostic performance in 10 patients with neuroblastoma. PROCEDURE: We recruited 191 children (69 females) free of neuroendocrine disease to establish reference intervals for plasma metanephrines, reported as centile curves for a given age and sex based on a parametric method using fractional polynomial models. Urine markers and plasma metanephrines were measured in 10 children with neuroblastoma at diagnosis. Plasma total metanephrines were measured by HPLC with coulometric detection, and plasma free metanephrines by tandem LC-MS. RESULTS: We observed a significant age dependence for tNMN, fNMN, and fMN, and a sex and age dependence for tMN, fNMN, and fMN. Free MT was below the lower limit of quantification in 94% of the children. All patients with neuroblastoma at diagnosis were above the 97.5th percentile for tMT, tNMN, fNMN, and fMT, whereas their fMN and tMN were mostly within the normal range. As expected, urine assays were inconsistently predictive of the disease. CONCLUSIONS: A continuous model incorporating all data for a given analyte represents an appealing alternative to arbitrary partitioning of reference intervals across age categories. Plasma metanephrines are promising biomarkers for neuroblastoma, and their performance needs to be confirmed in a prospective study on a large cohort of patients. Pediatr Blood Cancer 2015;62:587-593.
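The centile-curve approach can be sketched as follows: regress the analyte on fractional powers of age and place the 2.5th and 97.5th centiles using the residual spread. The data, powers, and Gaussian-residual assumption are illustrative; the paper's exact fractional-polynomial specification is not given in the abstract.

```python
# Sketch of the centile-curve idea (illustrative; not the paper's exact
# fractional-polynomial specification): regress the analyte on fractional
# powers of age, then place centiles from the residual spread, assuming
# roughly Gaussian residuals.
import numpy as np

rng = np.random.default_rng(0)
age = rng.uniform(0.5, 16.0, 200)                                # years (simulated)
conc = 0.9 - 0.25 * np.log(age) + rng.normal(0, 0.08, age.size)  # nmol/L

# Design matrix with fractional-polynomial terms age^(-0.5) and log(age).
X = np.column_stack([np.ones_like(age), age ** -0.5, np.log(age)])
beta, *_ = np.linalg.lstsq(X, conc, rcond=None)
resid_sd = np.std(conc - X @ beta)

grid = np.linspace(0.5, 16.0, 50)
G = np.column_stack([np.ones_like(grid), grid ** -0.5, np.log(grid)])
p2_5 = G @ beta - 1.96 * resid_sd   # 2.5th centile curve
p97_5 = G @ beta + 1.96 * resid_sd  # 97.5th centile curve
print(p2_5[:3], p97_5[:3])
```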
Abstract:
Reference collections of multiple Drosophila lines with accumulating collections of "omics" data have proven especially valuable for the study of population genetics and complex trait genetics. Here we present a description of a resource collection of 84 strains of Drosophila melanogaster whose genome sequences were obtained after 12 generations of full-sib inbreeding. The initial rationale for this resource was to foster the development of a systems biology platform for modeling metabolic regulation, using natural polymorphisms as perturbations. As reference lines, they are amenable to repeated phenotypic measurements, and a large collection of metabolic traits has already been assayed. Another key feature of these strains is their widespread geographic origin: Beijing, Ithaca, the Netherlands, Tasmania, and Zimbabwe. After obtaining 12.5× coverage of paired-end Illumina sequence reads, SNP and indel calls were made with the GATK platform. Thorough quality control was enabled by deep sequencing one line to >100×, and single-nucleotide polymorphisms and indels were validated using ddRAD-sequencing as an orthogonal platform. In addition, a series of preliminary population genetic tests were performed with these single-nucleotide polymorphism data to assess data quality. We found 83 segregating inversions among the lines; as expected, these were especially abundant in the African sample. We anticipate that this collection will make a useful addition to the set of reference D. melanogaster strains, thanks to its geographic structuring and unusually high level of genetic diversity.
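In the spirit of the preliminary population genetic tests mentioned above, the sketch below computes a folded site-frequency spectrum from a simulated biallelic genotype matrix of inbred lines; it is an illustrative QC step, not the study's actual pipeline.

```python
# Illustrative QC in the spirit of the 'preliminary population genetic
# tests' mentioned above (not the study's actual pipeline): a folded
# site-frequency spectrum from a biallelic genotype matrix of inbred lines.
import numpy as np

rng = np.random.default_rng(1)
# rows = sites, cols = 84 lines; 0/1 = homozygous ref/alt (inbred lines)
genotypes = rng.random((10_000, 84)) < rng.beta(0.2, 0.8, (10_000, 1))

alt_counts = genotypes.sum(axis=1)
minor = np.minimum(alt_counts, 84 - alt_counts)        # fold the spectrum
sfs = np.bincount(minor[minor > 0], minlength=43)[1:]  # drop monomorphic sites
print(sfs[:10])  # an excess at low frequencies is typical of real data
```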
Abstract:
We present computer simulations of a simple bead-spring model for polymer melts with intramolecular barriers. By systematically tuning the strength of the barriers, we investigate their role in the glass transition. Dynamic observables are analyzed within the framework of the mode coupling theory (MCT). Critical nonergodicity parameters, critical temperatures, and dynamic exponents are obtained from consistent fits of simulation data to MCT asymptotic laws. The so-obtained MCT λ-exponent increases from standard values for fully flexible chains to values close to the upper limit for stiff chains. In analogy with systems exhibiting higher-order MCT transitions, we suggest that the observed large λ-values arise from the interplay between two distinct mechanisms for dynamic arrest: general packing effects and polymer-specific intramolecular barriers. We compare simulation results with numerical solutions of the MCT equations for polymer systems, within the polymer reference interaction site model (PRISM) for static correlations. We verify that the approximations introduced by the PRISM are fulfilled by the simulations, with the same quality over the whole range of investigated barrier strengths. The numerical solutions reproduce the qualitative trends of the simulations for the dependence of the nonergodicity parameters and critical temperatures on the barrier strength. In particular, increasing the barrier strength at fixed density increases the localization length and the critical temperature. However, the qualitative agreement between theory and simulation breaks down in the limit of stiff chains. We discuss the possible origin of this feature.
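The λ-exponent mentioned above fixes the MCT asymptotic exponents a and b through the standard exponent relation λ = Γ(1−a)²/Γ(1−2a) = Γ(1+b)²/Γ(1+2b). The small solver below is a generic illustration of that textbook relation, not code from the paper.

```python
# The MCT exponent parameter lambda fixes the asymptotic exponents a and b
# through the standard relations lambda = Gamma(1-a)^2 / Gamma(1-2a)
#                                       = Gamma(1+b)^2 / Gamma(1+2b).
from scipy.special import gamma
from scipy.optimize import brentq

def exponent_a(lam):
    # a is confined to roughly (0, 0.395) as lambda runs from 1 down to 1/2
    return brentq(lambda a: gamma(1 - a) ** 2 / gamma(1 - 2 * a) - lam,
                  1e-6, 0.395)

def exponent_b(lam):
    return brentq(lambda b: gamma(1 + b) ** 2 / gamma(1 + 2 * b) - lam,
                  1e-6, 1.0)

print(exponent_a(0.735), exponent_b(0.735))  # typical hard-sphere-like value
```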
Abstract:
The formation of reference groups constitutes an important procedure in chemical provenance studies of archaeological pottery. Material from ancient kilns is thought to be especially suitable for reference groups, as it comprises a definite unit of past production. Pottery from the Late Minoan IA kiln excavated at Kommos, Crete, was analysed in order to produce a reference group in this important area of Minoan ceramic production. The samples were characterized by a combination of techniques providing information on the chemistry, mineralogy, and microstructure of the ceramic body. Initially, the study was unable to establish, in a straightforward manner, a chemical reference group. Different ceramic pastes and a range of selective alterations and contaminations, affected by variable firing temperatures and burial environment, were shown to be responsible for the compositional variability. Procedures are described to compensate for such alterations and the perturbations in the data that they produce.
Abstract:
BACKGROUND: Frequent emergency department (ED) users meet several of the criteria of vulnerability, but this needs to be further examined taking into consideration all of vulnerability's different dimensions. This study aimed to characterize frequent ED users and to define risk factors of frequent ED use within a universal health care coverage system, applying a conceptual framework of vulnerability. METHODS: A controlled, cross-sectional study comparing frequent ED users to a control group of non-frequent users was conducted at the Lausanne University Hospital, Switzerland. Frequent users were defined as patients with five or more visits to the ED in the previous 12 months. The two groups were compared using validated scales for each of the five dimensions of an innovative conceptual framework: socio-demographic characteristics; somatic, mental, and risk-behavior indicators; and use of health care services. Independent t-tests, Wilcoxon rank-sum tests, Pearson's Chi-squared test, and Fisher's exact test were used for the comparison. To examine the vulnerability-related risk factors for being a frequent ED user, univariate and multivariate logistic regression models were used. RESULTS: We compared 226 frequent users and 173 controls. Frequent users had more vulnerabilities in all five dimensions of the conceptual framework. They were younger and more often immigrants from low- or middle-income countries or unemployed; they had more somatic and psychiatric comorbidities, were more often tobacco users, and had more primary care physician (PCP) visits. The most significant risk factors for frequent ED use were a history of more than three hospital admissions in the previous 12 months (adj OR: 23.2, 95% CI = 9.1-59.2), the absence of a PCP (adj OR: 8.4, 95% CI = 2.1-32.7), living less than 5 km from an ED (adj OR: 4.4, 95% CI = 2.1-9.0), and a household income lower than USD 2,800/month (adj OR: 4.3, 95% CI = 2.0-9.2). CONCLUSIONS: Frequent ED users within a universal health coverage system form a highly vulnerable population when all five dimensions of a conceptual framework of vulnerability are taken into account. The predictive factors identified could be useful in the early detection of future frequent users, in order to address their specific needs and decrease vulnerability, a key priority for health care policy makers. Application of the conceptual framework in future research is warranted.
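Adjusted odds ratios of the kind reported above come from a multivariate logistic regression; the sketch below shows the standard computation on simulated data with hypothetical variable names, not the study's dataset.

```python
# Sketch of how adjusted odds ratios like those above are obtained
# (illustrative data and variable names, not the study's dataset):
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 399
df = pd.DataFrame({
    "hospitalizations_gt3": rng.integers(0, 2, n),
    "no_pcp": rng.integers(0, 2, n),
    "lives_within_5km": rng.integers(0, 2, n),
    "low_income": rng.integers(0, 2, n),
})
logits = -1.0 + 2.0 * df["hospitalizations_gt3"] + 1.5 * df["no_pcp"]
df["frequent_user"] = rng.random(n) < 1 / (1 + np.exp(-logits))

X = sm.add_constant(df.drop(columns="frequent_user").astype(float))
fit = sm.Logit(df["frequent_user"].astype(float), X).fit(disp=0)
odds_ratios = np.exp(fit.params)   # adjusted ORs
conf_int = np.exp(fit.conf_int())  # 95% CIs, as reported above
print(pd.concat([odds_ratios, conf_int], axis=1))
```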
Abstract:
In this thesis I argue that the psychological study of concepts and categorisation, and the philosophical study of reference are deeply intertwined. I propose that semantic intuitions are a variety of categorisation judgements, determined by concepts, and that because of this, concepts determine reference. I defend a dual theory of natural kind concepts, according to which natural kind concepts have distinct semantic cores and non-semantic identification procedures. Drawing on psychological essentialism, I suggest that the cores consist of externalistic placeholder essence beliefs. The identification procedures, in turn, consist of prototypes, sets of exemplars, or possibly also theory-structured beliefs. I argue that the dual theory is motivated both by experimental data and theoretical considerations. The thesis consists of three interrelated articles. Article I examines philosophical causal and description theories of natural kind term reference, and argues that they involve, or need to involve, certain psychological elements. I propose a unified theory of natural kind term reference, built on the psychology of concepts. Article II presents two semantic adaptations of psychological essentialism, one of which is a strict externalistic Kripkean-Putnamian theory, while the other is a hybrid account, according to which natural kind terms are ambiguous between internalistic and externalistic senses. We present two experiments, the results of which support the strict externalistic theory. Article III examines Fodor’s influential atomistic theory of concepts, according to which no psychological capacities associated with concepts constitute them, or are necessary for reference. I argue, contra Fodor, that the psychological mechanisms are necessary for reference.
Abstract:
Scientific studies dealing specifically with references do not seem to exist. However, the utilization of references is an important practice for many companies involved in industrial marketing. The purpose of this study is to increase understanding of the utilization of references in international industrial marketing, in order to contribute to the development of a theory of reference behavior. Specifically, the study explores the modes of reference usage in industry, the factors affecting a supplier's reference behavior, and the question of how references are actually utilized. Due to the explorative nature of the study, a research design was followed in which theory and empirical studies alternated. An Exploratory Framework was developed to guide a pilot case study that resulted in Framework 1. Results of the pilot study guided an expanded literature review that was used to develop first a Structural Framework and a Process Framework, which were combined in Framework 2. Then, the second empirical phase of the case study was conducted in the same (pilot) case company. In this phase, Decision Systems Analysis (DSA) was used as the analysis method. The DSA procedure consists of three interviewing waves: initial interviews, reinterviews, and validating interviews. Four reference decision processes were identified, described, and analyzed in the form of flowchart descriptions. The flowchart descriptions were used to explore new constructs and to develop new propositions to refine Framework 2 further. The quality of the study was ascertained by many actions in both empirical parts of the study. Construct validity was ascertained by using multiple sources of evidence and by asking the key informant to review the pilot case report. The DSA method itself includes procedures assuring validity. Because a single case study was chosen, external validity was not pursued. High reliability was pursued through detailed documentation and thorough reporting of evidence. It was concluded that the core of the concept of reference is a customer relationship, regardless of the concrete forms a reference might take in its utilization. Depending on various contingencies, references might have various tasks within four roles: increasing 1) the efficiency of sales and sales management, 2) the efficiency of the business, 3) the effectiveness of marketing activities, and 4) effectiveness in establishing, maintaining, and enhancing customer relationships. Thus, references have not only external but also internal tasks. A supplier's reference behavior might be affected by many hierarchical conditions. Additionally, the empirical study showed that the supplier can utilize its references as a continuous, all-pervasive decision-making process through various practices. The process includes both individual and unstructured decision-making subprocesses. The proposed concept of reference can be used to guide a reference policy, recommendable for companies for which the utilization of references is important. The significance of the study is threefold: proposing the concept of reference, developing a framework of a supplier's reference behavior and its short-term process of utilizing references, and conceptually structuring an unstructured phenomenon that is important in industrial marketing into four roles.
Abstract:
This study focuses on the phenomenon of customer reference marketing in a business-to-business (B to B) context. Although customer references are generally considered an important marketing and sales tool, the academic literature has paid surprisingly little attention to the phenomenon. The study suggests that customer references can be viewed as important marketing assets for industrial suppliers, and that the ability to build, manage, and leverage customer reference portfolios systematically constitutes a relevant marketing capability. The role of customer references is examined in the context of industrial suppliers' shift towards a solution and project orientation and in the light of the ongoing changes in the project business. Suppliers in several industry sectors are undergoing a change from traditional equipment manufacturing towards project- and solution-oriented business. It is argued in this thesis that the high complexity, the project-oriented nature, and the intangible service elements that characterise many contemporary B to B offerings further increase the role of customer references. The study proposes three mechanisms of customer reference marketing: status transfer, validation through testimonials, and the demonstration of experience and prior performance. The study was conducted in the context of Finnish B to B process technology and information technology companies. The empirical data comprise 38 interviews with managers of four case companies, 165 customer reference descriptions gathered from six case companies' Web sites, as well as company-internal material. The findings from the case studies show that customer references have various external and internal functions that contribute to the growth and performance of B to B firms. Externally, customer references bring status-transfer effects from reputable customers, concretise and demonstrate complex solutions, and provide indirect evidence of experience, previous performance, technological functionality, and delivered customer value. They can also be leveraged internally to facilitate organisational learning and training, advance offering development, and motivate personnel. Major reference projects create new business opportunities and can be used as a vehicle for strategic change. The findings shed light on the ongoing changes in orientation in the project business environment, increase understanding of the variety of ways in which customer references can be deployed as marketing assets, and provide a framework of the relevant tasks and activities related to building, managing, and leveraging a firm's customer reference portfolio. The findings contribute to industrial marketing research, to the literature on marketing assets and capabilities, and to the literature on projects and solutions. The proposed functions and mechanisms of customer reference marketing bring a more thorough and structured understanding of the essence and characteristics of the phenomenon and give a wide-ranging view of the role of customer references as marketing assets for B to B firms. The study suggests several managerial implications to help industrial suppliers systematise their customer reference marketing efforts.
Abstract:
EONIA is a market-based overnight interest rate whose role as the starting point of the yield curve makes it critical from the perspective of the implementation of the European Central Bank's common monetary policy in the euro area. The financial crisis that started in 2007 had a large impact on the determination mechanism of this interest rate, which is considered the central bank's operational target. This thesis examines the monetary policy implementation framework of the European Central Bank and the changes made to it. Furthermore, we discuss the development of the recent turmoil in the money market. The EONIA rate is modelled by means of a regression equation using variables related to liquidity conditions, refinancing need, auction results, and calendar effects. Conditional volatility is captured by an EGARCH model, and autocorrelation is taken into account by employing an autoregressive structure. The results highlight how the tensions in the initial stage of the market turmoil were successfully countered by the ECB's liquidity policy. The subsequent response of EONIA to liquidity conditions under the full-allotment liquidity provision procedure adopted after the demise of Lehman Brothers is also established. A clear distinction in the behavior of the interest rate between the sub-periods was evident. In the light of the results obtained, some of the challenges posed by the implementation of the exit strategy are addressed.
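The model described above can be sketched with the Python arch package: an autoregressive mean equation with exogenous regressors and EGARCH conditional volatility. The data and regressor names below are placeholders, not the thesis's dataset or exact specification.

```python
# Sketch of the econometric setup described above: an autoregressive mean
# equation with a liquidity-type regressor and EGARCH conditional volatility.
# Variable names and data are placeholders, not the thesis's dataset.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(3)
n = 500
liquidity = rng.normal(0.0, 1.0, n)          # stand-in for liquidity conditions
spread = 0.05 * liquidity + rng.normal(0, 0.02, n)
spread[1:] += 0.6 * spread[:-1]              # induce autocorrelation in the spread

model = arch_model(pd.Series(spread),
                   x=pd.DataFrame({"liquidity": liquidity}),
                   mean="ARX", lags=1,       # autoregressive mean with regressors
                   vol="EGARCH", p=1, o=1, q=1)
result = model.fit(disp="off")
print(result.summary())
```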