973 results for Forced-choice method


Relevance:

30.00%

Publisher:

Abstract:

In recent decades, the chemical synthesis of short oligonucleotides has become an important field of study owing to the discovery of new functions for nucleic acids such as antisense oligonucleotides (ASOs), aptamers, DNAzymes, microRNAs (miRNAs) and small interfering RNAs (siRNAs). Their applications in modern therapies and fundamental medicine for the treatment of various cancers, viral infections and genetic disorders have created the need for scalable methods that make their industrial manufacture cheaper and easier. While small-scale solid-phase oligonucleotide synthesis is the method of choice in the field, various challenges remain in producing short DNA and RNA oligomers in very large quantities. Solution-phase synthesis of oligonucleotides, on the other hand, offers more predictable scale-up and is amenable to standard industrial manufacturing techniques. In the present thesis, various protocols for the synthesis of short DNA and RNA oligomers were studied on peracetylated and methylated β-cyclodextrin supports, as well as on a pentaerythritol-derived support. With the peracetylated and methylated β-cyclodextrin soluble supports, the coupling cycle was simplified by replacing the typical 5′-O-(4,4′-dimethoxytrityl) protecting group with an acid-labile acetal-type 5′-O-(1-methoxy-1-methylethyl) group, which upon acid-catalyzed methanolysis released easily removable volatile products. For this purpose, 5′-O-(1-methoxy-1-methylethyl) 3′-(2-cyanoethyl-N,N-diisopropylphosphoramidite) monomeric building blocks were synthesized. Alternatively, for the precipitative pentaerythritol support, novel 2′-O-(2-cyanoethyl)-5′-O-(1-methoxy-1-methylethyl) protected phosphoramidite building blocks for RNA synthesis were prepared and their applicability was demonstrated by the synthesis of a pentamer. Similarly, a method for the preparation of short RNAs from commercially available 5′-O-(4,4′-dimethoxytrityl)-2′-O-(tert-butyldimethylsilyl)ribonucleoside 3′-(2-cyanoethyl-N,N-diisopropylphosphoramidite) building blocks was developed.

Relevance:

30.00%

Publisher:

Abstract:

Cauliflower heads, precooled by four different methods (vacuum, forced-air, and high- and low-flow hydro precooling), were stored under controlled atmosphere and room conditions. Controlled atmosphere (CA) conditions were: 1°C, 90 ± 5% relative humidity, and a 0:21 (%CO2:%O2) atmosphere composition (the 0:21 control). Room conditions (RC) were: 22 ± 1°C and 55-60% relative humidity. Various quality parameters of the cauliflower heads were assessed during storage under controlled atmosphere (days 0, 7, 14, 21, 28, and 35) and room conditions (days 0, 5, and 10). During storage, weight loss, deterioration rate, overall sensory quality score, hardness, and colour (L, a, b, C and α) were evaluated. In the present study, the strength and quality parameters of cauliflower under CA and RC were obtained. Vacuum precooling was found to be the most suitable method before cauliflower was placed in cold storage and sent to market. Furthermore, storing cauliflower without precooling resulted in a significant decrease in quality parameters.

Relevance:

30.00%

Publisher:

Abstract:

Since the implementation of the opening-up policy, China's economy as a whole has maintained rapid and stable development, making China the world's second-largest economy. China has also become the largest overseas market for many large global enterprises across various industries, including the tablet PC industry that has emerged in recent years. The purpose of this thesis is to analyze the internal and external factors that influence the entry mode choices of Finnish SMEs in the tablet industry entering the Chinese market. The goal is to identify suitable entry modes for Finnish tablet SMEs, and other relevant SMEs, entering the Chinese market. Qualitative analysis is the main research method in the empirical part of this study. Interviews were carried out with the case company and two other Finnish business organizations in China. The results of the study indicate that internal resources and the external business environment affect SMEs' entry mode choices more than other factors. Exporting and a sales subsidiary could be better choices for SMEs entering the Chinese market. Furthermore, firms should study the Chinese market thoroughly, in combination with their own background, before making decisions.

Relevance:

30.00%

Publisher:

Abstract:

Multiple-choice assessment is used within nearly all levels of education and is often heavily relied upon within both secondary and postsecondary institutions in determining a student’s present and future success. Understanding why it is effective or ineffective, how it is developed, and when it is or is not used by teachers can further inform teachers’ assessment practices and, subsequently, improve opportunities for student success. Twenty-eight teachers from three secondary schools in southern Ontario were interviewed about their perceptions and use of multiple-choice assessment and participated in a single-session introductory workshop on this topic. Perceptions and practices were revealed, discussed, and challenged through the use of a qualitative research method and examined alongside existing multiple-choice research. Discussion centered upon participants’ perspectives prior to and following their participation in the workshop. Implications related to future assessment practices and research in this field were presented. Findings indicated that many teachers utilized the multiple-choice form of assessment having had very little teacher education coursework or in-service professional development in the use of this format. The findings also revealed that teachers were receptive to training in this area but simply had not been exposed to or been given the opportunity to further develop their understanding. Participants generally agreed on its strengths (e.g., objectivity) and weaknesses (e.g., development difficulty). Participants were particularly interested in the potential for this assessment format to assess different levels of cognitive difficulty (i.e., levels beyond remembering in Bloom’s revised taxonomy), in addition to its potential to perhaps provide equitable means for assessing students of varying cultures, disabilities, and academic streams.

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT - Multispectral satellite images, particularly those with high spatial resolution (finer than 30 m on the ground), are an invaluable source of information for decision-making in many fields related to the management of natural resources, the preservation of the environment, and the planning and management of urban centres. Study scales can range from local (resolutions finer than 5 m) to regional (resolutions coarser than 5 m). These images characterize the variation of object reflectance across the spectrum, which is the key information for a large number of applications of these data. However, satellite sensor measurements are also affected by "parasitic" factors related to illumination and viewing conditions, the atmosphere, topography, and sensor properties. Two questions concerned us in this research. What is the best approach for retrieving ground reflectances from the digital numbers recorded by the sensors while accounting for these parasitic factors? And is this retrieval an absolute prerequisite for extracting reliable information from the images, given the problems specific to the various application domains of the imagery (land mapping, environmental monitoring, landscape change detection, resource inventories, etc.)? Research carried out over the last 30 years has produced a series of techniques for correcting the data for the effects of these parasitic factors, some of which allow ground reflectances to be retrieved. Several questions nevertheless remain open, and others require further work, on the one hand to improve the accuracy of the results and, on the other, to make these techniques more versatile by adapting them to a wider range of data acquisition conditions. A few of them can be mentioned: - How can atmospheric characteristics (in particular aerosol particles) suited to local and regional conditions be taken into account, rather than relying on default models that capture long-term spatio-temporal trends but fit poorly with instantaneous, spatially restricted observations? - How can the "contamination" of the signal from the object viewed by the sensor by signals from surrounding objects (the adjacency effect) be taken into account? This phenomenon becomes very important for images with resolutions finer than 5 m. - What are the effects of off-nadir sensor viewing angles, which are increasingly common since they offer better temporal resolution and the possibility of obtaining stereoscopic image pairs? - How can the efficiency of automatic processing and analysis techniques for multispectral images be increased over rugged and mountainous terrain, taking into account the multiple effects of topographic relief on the remotely sensed signal? Moreover, despite the many demonstrations by researchers that the information extracted from satellite images can be degraded by all of these parasitic factors, it must be acknowledged that radiometric corrections are still rarely applied on a routine basis, as is the case for geometric corrections. For the latter, commercial remote sensing software packages offer versatile, powerful algorithms that are within the reach of users.
Radiometric correction algorithms, when they are offered at all, remain inflexible black boxes that usually require expert users. The objectives we set ourselves in this research were the following: 1) to develop ground reflectance retrieval software that addresses the questions raised above, modular enough to be extended, improved, and adapted to various satellite image application problems; and 2) to apply this software in different contexts (urban, agricultural, forest) and analyse the results in order to evaluate the gain in accuracy of the information extracted from satellite images converted to ground reflectance images, and hence the need to proceed in this way regardless of the application. Through this research we therefore produced a ground reflectance retrieval tool (the new version of the REFLECT software). This software is based on the formulation (and routines) of the 6S code (Second Simulation of the Satellite Signal in the Solar Spectrum) and on the dark-target method for estimating aerosol optical depth (AOD), which is the most difficult factor to correct. Substantial improvements were made to the existing models. These improvements mainly concern aerosol properties (integration of a more recent model, improved dark-target search for AOD estimation), accounting for the adjacency effect using a specular reflection model, support for most of the high-resolution multispectral sensors currently in use (Landsat TM and ETM+, all SPOT 1 to 5 HR sensors, EO-1 ALI and ASTER) and very high-resolution sensors (QuickBird and Ikonos), and correction of topographic effects using a model that separates the direct and diffuse components of solar radiation and also adapts to forest canopy. Validation work showed that REFLECT retrieves ground reflectance with an accuracy of about ±0.01 reflectance units (for the visible, NIR and MIR spectral bands), even for surfaces with variable topography. Through simulations of apparent reflectances, this software also showed the extent to which the parasitic factors affecting the digital numbers of the images can alter the useful signal, namely the ground reflectance (errors of 10% to more than 50%). REFLECT was also used to assess the importance of using ground reflectances rather than raw digital numbers in various common remote sensing applications in the fields of classification, change detection, agriculture, and forestry. In most applications (change detection with multi-date images, use of vegetation indices, estimation of biophysical parameters, …), image correction is a crucial step for obtaining reliable results.
From a software point of view, REFLECT is presented as a series of easy-to-use menus corresponding to the different processing steps: entry of the scene inputs, computation of gaseous transmittances, estimation of the AOD by the dark-target method and, finally, application of the radiometric corrections to the image, notably through a fast option that processes a 5000 by 5000 pixel image in about 15 minutes. This research opens up several avenues for further improvements to the models and methods related to radiometric corrections, in particular the integration of the BRDF (bidirectional reflectance distribution function) into the formulation, the handling of translucent clouds by modelling non-selective scattering, and the automation of the equivalent-slopes method proposed for topographic corrections.
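
The REFLECT software itself is far more complete than can be shown here, but a minimal sketch may help make the first processing steps concrete: converting raw digital numbers to top-of-atmosphere (apparent) reflectance and using a dark-target assumption to approximate the atmospheric path contribution. All calibration values, band constants and image data below are hypothetical, and the dark-object subtraction shown is only a crude stand-in for the 6S-based correction described above.

    import numpy as np

    def toa_reflectance(dn, gain, offset, esun, d_au, sun_elev_deg):
        """Convert raw digital numbers to top-of-atmosphere (apparent) reflectance.

        dn           : array of raw digital numbers for one spectral band
        gain, offset : assumed sensor calibration (radiance = gain * DN + offset)
        esun         : mean exoatmospheric solar irradiance for the band [W m-2 um-1]
        d_au         : Earth-Sun distance in astronomical units
        sun_elev_deg : solar elevation angle in degrees
        """
        radiance = gain * dn + offset
        sun_zenith = np.deg2rad(90.0 - sun_elev_deg)
        return np.pi * radiance * d_au**2 / (esun * np.cos(sun_zenith))

    def dark_target_path_reflectance(toa_band, percentile=0.1):
        """Crude dark-target estimate: the darkest pixels are assumed to have
        near-zero surface reflectance, so their TOA reflectance approximates the
        atmospheric (path) contribution, from which an AOD could be inverted."""
        return np.percentile(toa_band, percentile)

    # Hypothetical Landsat-like band
    dn = np.random.randint(10, 255, size=(1000, 1000)).astype(float)
    rho_toa = toa_reflectance(dn, gain=0.8, offset=-2.0, esun=1536.0,
                              d_au=1.0, sun_elev_deg=45.0)
    rho_path = dark_target_path_reflectance(rho_toa)
    rho_surface_approx = rho_toa - rho_path   # first-order correction only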

Relevance:

30.00%

Publisher:

Abstract:

Recently, various approaches have been suggested for dose escalation studies based on observations of both undesirable events and evidence of therapeutic benefit. This article concerns a Bayesian approach to dose escalation that requires the user to make numerous design decisions relating to the number of doses to make available, the choice of the prior distribution, the imposition of safety constraints and stopping rules, and the criteria by which the design is to be optimized. Results are presented of a substantial simulation study conducted to investigate the influence of some of these factors on the safety and the accuracy of the procedure with a view toward providing general guidance for investigators conducting such studies. The Bayesian procedures evaluated use logistic regression to model the two responses, which are both assumed to be binary. The simulation study is based on features of a recently completed study of a compound with potential benefit to patients suffering from inflammatory diseases of the lung.
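
The actual design decisions and models are those of the cited study; the sketch below only illustrates, under simplified and invented assumptions (doses, data, priors, thresholds), how a logistic model for one binary response (toxicity) can be combined with a grid-approximated posterior and a safety constraint to choose the next dose. In the procedure described above a second logistic model for therapeutic benefit would be handled analogously.

    import numpy as np
    from scipy.stats import norm
    from scipy.special import expit   # logistic function

    # Candidate doses and hypothetical data observed so far: dose, toxicity (0/1)
    doses = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
    obs_dose = np.array([1.0, 1.0, 2.0, 2.0, 4.0])
    obs_tox = np.array([0, 0, 0, 1, 0])

    # Grid approximation to the posterior of a logistic model
    # P(toxicity | dose) = expit(a + b * log(dose)), with vague normal priors.
    a_grid = np.linspace(-8, 4, 121)
    b_grid = np.linspace(0.01, 4, 120)          # slope constrained positive
    A, B = np.meshgrid(a_grid, b_grid, indexing="ij")

    log_prior = norm.logpdf(A, 0, 3) + norm.logpdf(B, 1, 2)
    eta = A[..., None] + B[..., None] * np.log(obs_dose)
    log_lik = (obs_tox * np.log(expit(eta)) +
               (1 - obs_tox) * np.log(1 - expit(eta))).sum(axis=-1)
    post = np.exp(log_prior + log_lik)
    post /= post.sum()

    # Safety constraint: escalate only to doses whose posterior probability of
    # exceeding a 35% toxicity rate is below 25% (both thresholds are illustrative).
    p_tox = expit(A[..., None] + B[..., None] * np.log(doses))
    p_unsafe = (post[..., None] * (p_tox > 0.35)).sum(axis=(0, 1))
    admissible = doses[p_unsafe < 0.25]
    next_dose = admissible.max() if admissible.size else doses.min()
    print("admissible doses:", admissible, "-> next dose:", next_dose)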

Relevance:

30.00%

Publisher:

Abstract:

A common method for testing preference for objects is to determine which of a pair of objects is approached first in a paired-choice paradigm. In comparison, many studies of preference for environmental enrichment (EE) devices have used paradigms in which total time spent with each of a pair of objects is used to determine preference. While each of these paradigms gives a specific measure of the preference for one object in comparison to another, neither method allows comparisons between multiple objects simultaneously. Since it is possible that several EE objects would be placed in a cage together to improve animal welfare, it is important to determine measures for rats' preferences in conditions that mimic this potential home cage environment. While it would be predicted that each type of measure would produce similar rankings of objects, this has never been tested empirically. In this study, we compared two paradigms: EE objects were either presented in pairs (paired-choice comparison) or four objects were presented simultaneously (simultaneous presentation comparison). We used frequency of first interaction and time spent with each object to rank the objects in the paired-choice experiment, and time spent with each object to rank the objects in the simultaneous presentation experiment. We also considered the behaviours elicited by the objects to determine if these might be contributing to object preference. We demonstrated that object ranking based on time spent with objects from the paired-choice experiment predicted object ranking in the simultaneous presentation experiment. Additionally, we confirmed that behaviours elicited were an important determinant of time spent with an object. This provides convergent evidence that both paired choice and simultaneous comparisons provide valid measures of preference for EE objects in rats.
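
As a minimal illustration of the comparison described above, rankings derived from time spent with objects in the two paradigms can be compared with a rank correlation. The object names and all times below are invented, not the study's data.

    from scipy.stats import spearmanr

    objects = ["chew block", "tube", "ball", "platform"]   # hypothetical EE objects

    # Paired-choice paradigm: seconds spent with each object in every pairing (invented)
    paired_time = {("chew block", "tube"): (310, 150),
                   ("chew block", "ball"): (280, 120),
                   ("chew block", "platform"): (260, 220),
                   ("tube", "ball"): (200, 140),
                   ("tube", "platform"): (160, 230),
                   ("ball", "platform"): (110, 250)}

    # Total time accumulated by each object across all of its pairings
    paired_total = {o: 0.0 for o in objects}
    for (o1, o2), (t1, t2) in paired_time.items():
        paired_total[o1] += t1
        paired_total[o2] += t2

    # Simultaneous presentation: time with each object when all four are present (invented)
    simultaneous_total = {"chew block": 420, "tube": 180, "ball": 90, "platform": 300}

    # Rank correlation between the two preference orderings
    rho, p = spearmanr([paired_total[o] for o in objects],
                       [simultaneous_total[o] for o in objects])
    print(f"Spearman rho between paradigms: {rho:.2f} (p = {p:.2f})")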

Relevance:

30.00%

Publisher:

Abstract:

Two experiments investigated the influence of implicit memory on consumer choice for brands with varying levels of familiarity. Priming was measured using a consideration-choice task, developed by Coates, Butler and Berry (2004). Experiment 1 employed a coupon-rating task at encoding that required participants to meaningfully process individual brand names, to assess whether priming could affect participants' final (preferred) choices for familiar brands. Experiment 2 used this same method to assess the impact of implicit memory on consideration and choice for unknown and leader brands, presented in conjunction with familiar competitors. Significant priming was obtained in both experiments, and was shown to directly influence final choice in the case of familiar and highly familiar leader brands. Moreover, it was shown that a single prior exposure could lead participants to consider buying an unknown, and indeed fictitious, brand.

Relevance:

30.00%

Publisher:

Abstract:

Unless the benefits to society of measures to protect and improve the welfare of animals are made transparent by means of their valuation, they are likely to go unrecognised and cannot easily be weighed against the costs of such measures as required, for example, by policy-makers. A simple single-measure scoring system, based on the Welfare Quality® index, is used, together with a choice experiment economic valuation method, to estimate the value that people place on improvements to the welfare of different farm animal species measured on a continuous (0-100) scale. Results from using the method on a survey sample of some 300 people show that it is able to elicit apparently credible values. The survey found that 96% of respondents thought that we have a moral obligation to safeguard the welfare of animals and that over 72% were concerned about the way farm animals are treated. Estimated mean annual willingness to pay for meat from animals with improved welfare of just one point on the scale was £5.24 for beef cattle, £4.57 for pigs and £5.10 for meat chickens. Further development of the method is required to capture the total economic value of animal welfare benefits. Despite this, the method is considered a practical means of obtaining economic values that can be used in the cost-benefit appraisal of policy measures intended to improve the welfare of animals.
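
In choice experiment studies of this kind, the marginal willingness to pay for an attribute is commonly derived as the negative ratio of the attribute coefficient to the price coefficient from a (conditional) logit model. A small sketch with invented coefficients, not the estimates reported above:

    # Hypothetical estimated coefficients from a conditional logit model of meat choice.
    # beta_welfare: utility per one-point improvement on the 0-100 welfare scale
    # beta_price  : utility per GBP of price (expected to be negative)
    beta_welfare = 0.042
    beta_price = -0.008

    # Marginal willingness to pay (GBP per welfare point) = -beta_attribute / beta_price
    wtp_per_point = -beta_welfare / beta_price
    print(f"Implied WTP: {wtp_per_point:.2f} GBP per one-point welfare improvement")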

Relevance:

30.00%

Publisher:

Abstract:

Attribute non-attendance in choice experiments affects WTP estimates and therefore the validity of the method. A recent strand of literature uses attenuated estimates of marginal utilities of ignored attributes. Following this approach, we propose a generalisation of the mixed logit model whereby the distribution of marginal utility coefficients of a stated non-attender has a potentially lower mean and lower variance than those of a stated attender. Model comparison shows that our shrinkage approach fits the data better and produces more reliable WTP estimates. We further find that while reliability of stated attribute non-attendance increases in successive choice experiments, it does not increase when respondents report having ignored the same attribute twice.
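
One way to read the proposed generalisation: for respondents who state that they ignored an attribute, the random coefficient on that attribute is drawn from a distribution whose mean and variance are shrunk relative to those of stated attenders, rather than being fixed at zero. The sketch below only simulates this idea; the distributions and shrinkage factors are assumed for illustration, not estimated from the paper's data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Population distribution of the marginal utility of an attribute among stated attenders
    mu_attender, sigma_attender = 0.8, 0.4

    # Shrinkage factors (between 0 and 1) applied to stated non-attenders; in the model
    # described above these would be estimated, here they are simply assumed.
    shrink_mean, shrink_sd = 0.3, 0.5

    def draw_coefficients(n, stated_nonattender):
        """Draw individual-level coefficients for a mixed logit simulation."""
        if stated_nonattender:
            return rng.normal(mu_attender * shrink_mean,
                              sigma_attender * shrink_sd, size=n)
        return rng.normal(mu_attender, sigma_attender, size=n)

    attenders = draw_coefficients(10000, stated_nonattender=False)
    nonattenders = draw_coefficients(10000, stated_nonattender=True)
    print(f"attenders:     mean {attenders.mean():.2f}, sd {attenders.std():.2f}")
    print(f"non-attenders: mean {nonattenders.mean():.2f}, sd {nonattenders.std():.2f}")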

Relevance:

30.00%

Publisher:

Abstract:

A fingerprint method for detecting anthropogenic climate change is applied to new simulations with a coupled ocean-atmosphere general circulation model (CGCM) forced by increasing concentrations of greenhouse gases and aerosols covering the years 1880 to 2050. In addition to the anthropogenic climate change signal, the space-time structure of the natural climate variability for near-surface temperatures is estimated from instrumental data over the last 134 years and two 1000 year simulations with CGCMs. The estimates are compared with paleoclimate data over 570 years. The space-time information on both the signal and the noise is used to maximize the signal-to-noise ratio of a detection variable obtained by applying an optimal filter (fingerprint) to the observed data. The inclusion of aerosols slows the predicted future warming. The probability that the observed increase in near-surface temperatures in recent decades is of natural origin is estimated to be less than 5%. However, this number is dependent on the estimated natural variability level, which is still subject to some uncertainty.
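
The optimal-fingerprint idea can be written compactly: the observed field y is projected onto a fingerprint f = C^(-1) g, where g is the model-predicted signal pattern and C is the covariance of natural variability; this choice of f maximises the signal-to-noise ratio of the detection variable d = f·y. A small numerical sketch with synthetic patterns and noise (nothing below corresponds to the study's data):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 50                                    # number of spatial points (illustrative)

    # Signal pattern g from the model and covariance C of natural variability (synthetic)
    g = rng.normal(size=n)
    A = rng.normal(size=(n, n))
    C = A @ A.T / n + np.eye(n)               # a positive-definite "noise" covariance

    # Optimal fingerprint: rotate the signal pattern away from high-noise directions
    f = np.linalg.solve(C, g)

    # Detection variable applied to an "observed" field = signal + noise realisation
    noise = np.linalg.cholesky(C) @ rng.normal(size=n)
    y = 0.5 * g + noise
    d = f @ y

    # Signal-to-noise ratio (f.g)^2 / (f^T C f) is maximised by this choice of f
    snr = (f @ g) ** 2 / (f @ C @ f)
    print(f"detection variable d = {d:.2f}, optimal SNR = {snr:.2f}")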

Relevance:

30.00%

Publisher:

Abstract:

Inverse methods are widely used in various fields of atmospheric science. However, such methods are not commonly used within the boundary-layer community, where robust observations of surface fluxes are a particular concern. We present a new technique for deriving surface sensible heat fluxes from boundary-layer turbulence observations using an inverse method. Doppler lidar observations of vertical velocity variance are combined with two well-known mixed-layer scaling forward models for a convective boundary layer (CBL). The inverse method is validated using large-eddy simulations of a CBL with increasing wind speed. The majority of the estimated heat fluxes agree within error with the prescribed heat flux, across all wind speeds tested. The method is then applied to Doppler lidar data from the Chilbolton Observatory, UK. Heat fluxes are compared with those from a mast-mounted sonic anemometer. Errors in estimated heat fluxes are on average 18%, an improvement on previous techniques. However, a significant negative bias is observed (on average −63%) that is more pronounced in the morning. Results are improved for the fully developed CBL later in the day, which suggests that the bias is largely related to the choice of forward model, which is kept deliberately simple for this study. Overall, the inverse method provided reasonable flux estimates for the simple case of a CBL. The results shown here demonstrate that this method has promise for using ground-based remote sensing to derive surface fluxes. Extension of the method is relatively straightforward, and could include more complex forward models or other measurements.
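
A minimal sketch of such an inversion, assuming a commonly used Lenschow-type mixed-layer scaling profile of vertical velocity variance as the forward model; the profile constants, lidar observations and boundary-layer values below are illustrative and are not those used in the study.

    import numpy as np
    from scipy.optimize import least_squares

    g, rho, cp = 9.81, 1.2, 1005.0           # gravity, air density, specific heat
    theta_v = 300.0                           # mean virtual potential temperature [K]
    zi = 1200.0                               # boundary-layer depth [m], e.g. from lidar

    def forward_sigma_w2(z, w_star):
        """Mixed-layer scaling profile of vertical velocity variance (Lenschow-type):
        sigma_w^2 / w*^2 = 1.8 (z/zi)^(2/3) (1 - 0.8 z/zi)^2."""
        zn = z / zi
        return w_star**2 * 1.8 * zn**(2.0 / 3.0) * (1.0 - 0.8 * zn)**2

    # Hypothetical Doppler-lidar variance observations within the CBL
    z_obs = np.array([120., 240., 360., 480., 600., 720., 840.])
    sigma_w2_obs = forward_sigma_w2(z_obs, w_star=1.6) + \
                   np.random.default_rng(2).normal(0, 0.05, z_obs.size)

    # Invert the forward model for the convective velocity scale w* by least squares
    res = least_squares(lambda w: forward_sigma_w2(z_obs, w[0]) - sigma_w2_obs,
                        x0=[1.0], bounds=(0.0, 5.0))
    w_star_est = res.x[0]

    # Convert w* back to a surface sensible heat flux:
    # w*^3 = (g / theta_v) * (w'theta')_s * zi  =>  H = rho * cp * w*^3 * theta_v / (g * zi)
    H = rho * cp * w_star_est**3 * theta_v / (g * zi)
    print(f"estimated w* = {w_star_est:.2f} m/s, sensible heat flux H = {H:.0f} W/m2")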

Relevance:

30.00%

Publisher:

Abstract:

The joint and alternative uses of attribute non-attendance and importance ranking data within discrete choice experiments are investigated using data from Lebanon examining consumers’ preferences for safety certification in food. We find that both types of information, attribute non-attendance and importance rankings, improve estimates of respondent utility. We introduce a method of integrating both types of information simultaneously and find that this outperforms models where either importance ranking or non-attendance data are used alone. As in previous studies, stated non-attendance of attributes was not found to be consistent with respondents having zero marginal utility for those attributes.

Relevance:

30.00%

Publisher:

Abstract:

Despite the many existing crosslinking procedures, glutaraldehyde (GA) is still the method of choice in the manufacture of bioprostheses. The major problems with GA are: (a) uncontrolled reactivity due to the chemical complexity of GA solutions; (b) toxicity due to the release of GA from polymeric crosslinks; and (c) tissue impermeabilization due to the formation of polymeric and heterogeneous crosslinks, which is partially responsible for the undesirable calcification of the bioprosthesis. A new crosslinking method based on glutaraldehyde acetals has been developed: GA is applied in acidic ethanolic solution and, after its distribution inside the matrix, is released for crosslinking. Solutions with hydrochloric acid concentrations between 0.1 and 0.001 mol/L and GA concentrations between 0.1 and 1.0% were measured in an ultraviolet spectrophotometer to verify the presence of free aldehyde groups and polymeric GA compounds. After these measurements, the solutions were used to crosslink bovine pericardium. The spectrophotometric results showed that GA was best protected in its acetal forms in acidic ethanolic solution with HCl at 0.003 mol/L and GA at 1.0% (v/v). The shrinkage temperature of bovine pericardium crosslinked with the acetal solutions was close to 85 °C after exposure to triethylamine vapors.

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with the use of the choice experiment method for modeling the demand for snowmobiling. The choice experiment includes five attributes: standard, composition, length, price of a day card, and experience along the trail. The paper estimates snowmobile owners’ preferences and the most preferred attributes, including their willingness to pay for a day trip on a groomed snowmobile trail. The data consist of the answers from 479 registered snowmobile owners, who each answered two hypothetical choice questions. Estimation with the multinomial logit model shows that snowmobilers on average are willing to pay 22.5 SEK for one day of snowmobiling on a trail with quality described as skidded every 14th day. Furthermore, the WTP increases with the quality of trail grooming. The results of this paper can be used as a yardstick by snowmobile clubs wanting to develop their trail networks, organizations and companies developing snowmobiling as a recreational activity, and marketers interested in marketing snowmobiling as a recreational activity.