936 results for Target Field Method


Relevance:

30.00%

Publisher:

Abstract:

A study of the angular distributions of leptons from decays of J/ψ's produced in p-C and p-W collisions at √s = 41.6 GeV has been performed in the J/ψ Feynman-x region extending down to xF = −0.34. For transverse momenta above 1 GeV/c a significant dependence on the reference frame is found: the polar anisotropy is more pronounced in the Collins-Soper frame and almost vanishes in the helicity frame, where, instead, a significant azimuthal anisotropy arises.
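
For reference, analyses of this kind conventionally describe the dilepton decay angular distribution with frame-dependent anisotropy parameters; a standard parametrization from the quarkonium-polarization literature (the abstract itself does not quote the exact form used) is

W(\cos\theta,\varphi) \;\propto\; \frac{1}{3+\lambda_{\theta}}\left(1 + \lambda_{\theta}\cos^{2}\theta + \lambda_{\varphi}\sin^{2}\theta\cos 2\varphi + \lambda_{\theta\varphi}\sin 2\theta\cos\varphi\right),

where \lambda_{\theta} quantifies the polar anisotropy and \lambda_{\varphi}, \lambda_{\theta\varphi} the azimuthal terms; the frame dependence reported above corresponds to different values of these parameters in the Collins-Soper and helicity frames.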

Relevance:

30.00%

Publisher:

Abstract:

As traditional marketing communication channels keep losing their effectiveness while the media field and target groups fragment into ever smaller units, marketing organisations are searching for alternative ways to reach their target audiences. One alternative marketing communication tool is product placement, in which (branded) products are placed within the stories of various entertainment productions, such as films, television programmes and computer games, so that the increasingly media-literate target audience cannot avoid the commercial message by, for example, changing the television channel or turning the page of a magazine. Because the product is embedded in the story being told, avoiding the marketing message, that is, avoiding perceiving the product, is considerably harder than with traditional marketing communication methods. In addition, placed products are usually closely connected to the plot and the characters, so that the product's image receives positive reinforcement from these associations. The purpose of this Master's thesis was to examine the usefulness of product placement in marketing communications and how consumer goods marketers can exploit the method in their marketing communication strategies. The unconventional nature of product placement as a marketing communication tool raised the question of how it could be used together with other marketing communication instruments. To this end, the study connected product placement to the framework of integrated marketing communications (IMC). The IMC concept emerged in marketing communications to answer the same need as product placement itself: a fragmented media field and individualised target groups demand more sophisticated and more unified planning and execution of marketing communications. The study concludes that product placement is a useful marketing communication tool when the objective of the communication is something other than directly influencing product sales. Product placement is, by contrast, very effective at increasing product awareness, particularly recognition. Product placement can also generate a direct purchase need, but in that case the recipient of the message must already have an existing need within the product category in question before being exposed to the marketing message. Product placement can be included in the IMC planning process as an integral part of the marketing communication strategy. Integration in which product placement is supported by other communication instruments to build a unified campaign is, however, much rarer than anticipated, probably owing above all to the unconventional nature of product placement and to how difficult this form of communication is for the marketer to control. The study was carried out as a normative case study relying mainly on secondary data sources. For the case studies, primary data were collected with a questionnaire from two international companies that use product placement, and secondary data sources were also used in the data collection for the case section.

Relevance:

30.00%

Publisher:

Abstract:

Background: Average energies of nuclear collective modes may be efficiently and accurately computed using a nonrelativistic constrained approach without reliance on a random phase approximation (RPA). Purpose: To extend the constrained approach to the relativistic domain and to establish its impact on the calibration of energy density functionals. Methods: Relativistic RPA calculations of the giant monopole resonance (GMR) are compared against the predictions of the corresponding constrained approach using two accurately calibrated energy density functionals. Results: We find excellent agreement at the 2% level or better between the predictions of the relativistic RPA and the corresponding constrained approach for magic (or semimagic) nuclei ranging from 16O to 208Pb. Conclusions: An efficient and accurate method is proposed for incorporating nuclear collective excitations into the calibration of future energy density functionals.
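
For reference, the constrained estimate of the GMR centroid energy follows from the ratio of the energy-weighted (m_1) and inverse-energy-weighted (m_{-1}) sum rules; a standard non-relativistic statement of the relation (the relativistic extension is what the paper develops) is

E_{\mathrm{GMR}} \simeq \sqrt{\frac{m_{1}}{m_{-1}}}, \qquad m_{1} = \frac{2\hbar^{2}}{m}\,A\,\langle r^{2}\rangle, \qquad m_{-1} = -\frac{1}{2}\left.\frac{\partial \langle \hat{Q}\rangle_{\lambda}}{\partial \lambda}\right|_{\lambda=0}, \qquad \hat{Q} = \sum_{i=1}^{A} r_{i}^{2},

where \langle \hat{Q}\rangle_{\lambda} is evaluated in the ground state of the constrained Hamiltonian \hat{H} + \lambda\hat{Q}, so that m_{-1} is obtained without any RPA calculation.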

Relevance:

30.00%

Publisher:

Abstract:

The aim of the study was to build a model for the case company for estimating short-term profitability. The research method is constructive, and the model was developed with the assistance of the company's accounting staff. The theoretical part reviews profitability, budgeting and forecasting itself by means of a literature review, with the aim of identifying methods that could be used in estimating short-term profitability. In line with the requirements set for the model, a judgmental method was chosen. According to the study, profitability is affected by the sales price and volume, production, raw material prices and the change in inventories. The model works reasonably well in the target company, and it is notable that the differences between plants and between machines can be fairly large. These differences are mainly due to plant size and to differences between the models. The practical usefulness of the model will, however, become fully apparent only once it is in use by the accounting staff. Forecasting always involves problems of its own, and even new methods will not necessarily eliminate them.

Relevance:

30.00%

Publisher:

Abstract:

"How old is this fingermark?" This question is relatively often raised in trials when suspects admit that they have left their fingermarks on a crime scene but allege that the contact occurred at a time different to that of the crime and for legitimate reasons. However, no answer can currently be given to this question, because no fingermark dating methodology has been validated and accepted by the whole forensic community. Nevertheless, a review of past American cases highlighted that experts actually gave, and still give, testimony in court about the age of fingermarks, even if it is mostly based on subjective and poorly documented parameters.
It was relatively easy to access fully described American cases, which explains the origin of the given examples. However, fingermark dating issues are encountered worldwide, and the lack of consensus among the answers given highlights the need for research on the subject. The present work therefore aims at studying the possibility of developing an objective fingermark dating method. As the questions surrounding the development of dating procedures are not new, different attempts have already been described in the literature. This research proposes a critical review of these attempts and highlights that most of the reported methodologies still suffer from limitations preventing their use in actual practice. Nevertheless, some approaches based on the evolution over time of intrinsic compounds detected in fingermark residue appear to be promising. Thus, an exhaustive review of the literature was conducted in order to identify the compounds available in fingermark residue and the analytical techniques capable of analysing them. It was decided to concentrate on sebaceous compounds analysed using gas chromatography coupled with mass spectrometry (GC/MS) or Fourier transform infrared spectroscopy (FTIR). GC/MS analyses were conducted in order to characterize the initial variability of target lipids among fresh fingermarks of the same donor (intra-variability) and between fingermarks of different donors (inter-variability). As a result, many molecules were identified and quantified for the first time in fingermark residue. Furthermore, it was determined that the intra-variability of the fingermark residue was significantly lower than the inter-variability, but that both kinds of variability could be reduced using different statistical pre-treatments inspired by the drug profiling field. It was also possible to propose an objective donor classification model allowing donors to be grouped into two main classes based on their initial lipid composition. These classes correspond to what is relatively subjectively called "good" or "bad" donors. The potential of such a model is high for the fingermark research field, as it allows the selection of representative donors based on compounds of interest. Using GC/MS and FTIR, an in-depth study of the effects of different influence factors on the initial composition and aging of target lipid molecules found in fingermark residue was conducted. It was determined that univariate and multivariate models could be built to describe the aging of target compounds (transformed into aging parameters through pre-processing techniques), but that some influence factors affected these models more than others. In fact, the donor, the substrate and the application of enhancement techniques seemed to hinder the construction of reproducible models. The other factors tested (deposition moment, pressure, temperature and illumination) also affected the residue and its aging, but models combining different values of these factors still proved to be robust. Furthermore, test fingermarks were analysed with GC/MS in order to be dated using some of the generated models. Correct estimations were obtained for 60% of the dated test fingermarks, and up to 100% when the storage conditions were known. These results are interesting, but further research should be conducted to evaluate whether these models could be used under uncontrolled casework conditions.
From a more fundamental perspective, a pilot study was also conducted on the use of infrared spectroscopy combined with chemical imaging (FTIR-CI) in order to gain information about fingermark composition and aging. More precisely, its ability to highlight influence factors and aging effects over large areas of fingermarks was investigated. This information was then compared with that given by individual FTIR spectra. It was concluded that while FTIR-CI is a powerful tool, its use to study natural fingermark residue for forensic purposes has to be carefully considered. In fact, in this study, the technique did not yield more information on residue distribution than traditional FTIR spectra and also suffered from major drawbacks, such as long analysis and processing times, particularly when large fingermark areas need to be covered. Finally, the results obtained in this research allowed a formal and pragmatic framework for approaching fingermark dating questions to be proposed and discussed. It identifies which type of information scientists are currently able to provide to investigators and/or the courts. Furthermore, the proposed framework also describes the different iterative development steps that research should follow in order to achieve the validation of an objective fingermark dating methodology whose capacities and limits are well known and properly documented.
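
As an illustration of the kind of univariate aging model referred to above (not one of the thesis's actual models), the sketch below fits an exponential decay to a hypothetical pre-processed aging parameter, for example a target-lipid peak-area ratio, measured on fingermarks of known age, and then inverts the fit to date a questioned mark; the compound choice, data values and functional form are all assumptions made for illustration.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical aging parameter (e.g. a normalised GC/MS peak-area ratio of a
# target lipid) measured on reference fingermarks stored under known conditions.
age_days    = np.array([0.0, 3.0, 7.0, 14.0, 21.0, 28.0, 34.0])
aging_param = np.array([1.00, 0.78, 0.61, 0.42, 0.30, 0.24, 0.20])

def exp_decay(t, a, k, c):
    """Univariate aging model: exponential decay towards a plateau."""
    return a * np.exp(-k * t) + c

popt, _ = curve_fit(exp_decay, age_days, aging_param, p0=(1.0, 0.1, 0.1))
a, k, c = popt

# Date a questioned fingermark by inverting the fitted curve.
observed = 0.50                                  # aging parameter of the questioned mark
estimated_age = -np.log((observed - c) / a) / k
print(f"a={a:.2f}, k={k:.3f} per day, c={c:.2f}; estimated age ~ {estimated_age:.1f} days")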

Relevance:

30.00%

Publisher:

Abstract:

This book is the transcript of a witness seminar on the history of experimental economics, in which eleven high-profile experimental economists participated, including Nobel Laureates Vernon Smith, Reinhard Selten and Alvin Roth. The witness seminar was structured around four topics: skills, community, laboratory, and funding. The transcript is preceded by an introduction explaining the method of the witness seminar and its specific set-up and summarizing its results. The participants' contributions and their lively discussion provide a wealth of insights into the emergence of experimental economics as a field of research.

Relevance:

30.00%

Publisher:

Abstract:

This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM supplement 1, but here we present a more restrictive approach, where the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a 103Pd source by liquid scintillation counting and the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms which may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
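
To make the propagation step concrete, the following is a minimal sketch of a Monte Carlo propagation of input uncertainties through a measurement model, in the spirit described above; the model A = R / (eps * m), the input distributions and all numerical values are illustrative assumptions and are not taken from the chapter.

import numpy as np

rng = np.random.default_rng(seed=1)
N = 1_000_000  # number of Monte Carlo trials

# Hypothetical measurement model for a massic activity A = R / (eps * m):
# counting rate R (1/s), detection efficiency eps, source mass m (g).
R   = rng.normal(loc=1250.0, scale=8.0, size=N)      # Gaussian input
eps = rng.normal(loc=0.92, scale=0.01, size=N)       # Gaussian input
m   = rng.uniform(low=0.0995, high=0.1005, size=N)   # rectangular input

A = R / (eps * m)   # propagate the input distributions through the model

# Monte Carlo estimators of the expectation and standard uncertainty of the
# measurand, plus a 95 % coverage interval, in the style of GUM Supplement 1.
print(f"A    = {A.mean():.1f} Bq/g")
print(f"u(A) = {A.std(ddof=1):.1f} Bq/g")
low, high = np.percentile(A, [2.5, 97.5])
print(f"95 % coverage interval: [{low:.1f}, {high:.1f}] Bq/g")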

Relevance:

30.00%

Publisher:

Abstract:

Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, to highlight links between documents produced by the same modus operandi or the same source, and thus to support forensic intelligence efforts. Inspired by previous research work on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from the images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licenses and 96 Portuguese passports seized in Switzerland, among which some were known to come from common sources. Results indicate that extracting profiles from the images using Hue and Edge filters, or their combination, and then comparing the profiles with a Canberra distance-based metric provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can easily be operated from remote locations and shared among different organisations, which makes it very convenient for future operational applications. The method could serve as a fast first triage step that may help target more resource-intensive profiling methods (based, for instance, on a visual, physical or chemical examination of documents). Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (Part II).
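
As a sketch of the comparison step described above (not the authors' actual prototype), a hue profile can be extracted from a region of interest and two such profiles compared with the Canberra distance; the bin count, region shapes and the random placeholder data below are assumptions made purely for illustration.

import colorsys
import numpy as np

def hue_profile(rgb: np.ndarray, bins: int = 64) -> np.ndarray:
    """Normalised hue histogram of an RGB region of interest (8-bit values)."""
    pixels = rgb.reshape(-1, 3) / 255.0
    hues = np.array([colorsys.rgb_to_hsv(r, g, b)[0] for r, g, b in pixels])
    hist, _ = np.histogram(hues, bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def canberra(p: np.ndarray, q: np.ndarray) -> float:
    """Canberra distance between two equal-length profiles."""
    num = np.abs(p - q)
    den = np.abs(p) + np.abs(q)
    mask = den > 0            # ignore bins that are empty in both profiles
    return float(np.sum(num[mask] / den[mask]))

# Two regions of interest cropped from scanned documents (random placeholders
# here); a small distance suggests a possible common source or modus operandi.
rng = np.random.default_rng(0)
roi_a = rng.integers(0, 256, size=(40, 120, 3))
roi_b = rng.integers(0, 256, size=(40, 120, 3))
print("Canberra distance:", canberra(hue_profile(roi_a), hue_profile(roi_b)))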

Relevance:

30.00%

Publisher:

Abstract:

This thesis deals with a hardware-accelerated Java virtual machine, named REALJava. The REALJava virtual machine is targeted for resource constrained embedded systems. The goal is to attain increased computational performance with reduced power consumption. While these objectives are often seen as trade-offs, in this context both of them can be attained simultaneously by using dedicated hardware. The target level of the computational performance of the REALJava virtual machine is initially set to be as fast as the currently available full custom ASIC Java processors. As a secondary goal, all of the components of the virtual machine are designed so that the resulting system can be scaled to support multiple co-processor cores. The virtual machine is designed using the hardware/software co-design paradigm. The partitioning between the two domains is flexible, allowing customizations to the resulting system; for instance, floating-point support can be omitted from the hardware in order to decrease the size of the co-processor core. The communication between the hardware and the software domains is encapsulated into modules. This allows the REALJava virtual machine to be easily integrated into any system, simply by redesigning the communication modules. Besides the virtual machine and the related co-processor architecture, several performance enhancing techniques are presented. These include techniques related to instruction folding, stack handling, method invocation, constant loading and control in the time domain. The REALJava virtual machine is prototyped using three different FPGA platforms. The original pipeline structure is modified to suit the FPGA environment. The performance of the resulting Java virtual machine is evaluated against existing Java solutions in the embedded systems field. The results show that the goals are attained, both in terms of computational performance and power consumption. The computational performance in particular is evaluated thoroughly, and the results show that REALJava is more than twice as fast as the fastest full custom ASIC Java processor. In addition to standard Java virtual machine benchmarks, several new Java applications are designed to both verify the results and broaden the spectrum of the tests.

Relevance:

30.00%

Publisher:

Abstract:

Needle trap devices (NTDs) are a relatively new and promising tool for headspace (HS) analysis. In this study, a dynamic HS sampling procedure is evaluated for the determination of volatile organic compounds (VOCs) in whole blood samples. A full factorial design was used to evaluate the influence of the number of cycles and of the incubation time, and it is demonstrated that the controlling factor in the process is the number of cycles. A mathematical model can be used to determine the most appropriate number of cycles required to adsorb a prefixed amount of the VOCs present in the HS phase, provided that quantitative adsorption is reached in each cycle. Matrix effects are of great importance when complex biological samples, such as blood, are analyzed. The evaluation of the salting-out effect showed a significant improvement in the volatilization of VOCs to the HS in this type of matrix. Moreover, a 1:4 (blood:water) dilution is required to obtain quantitative recoveries of the target analytes when external calibration is used. The method developed gives detection limits in the 0.020–0.080 μg L−1 range (0.1–0.4 μg L−1 for undiluted blood samples) with appropriate repeatability values (RSD < 15% at the high level and < 23% at the LOQ level). The figures of merit of the method can be improved by using a smaller phase ratio (i.e., an increase in the blood volume and a decrease in the HS volume), which leads to lower detection limits, better repeatability and greater sensitivity. Twenty-eight blood samples were evaluated with the proposed method, and the results agree with those reported in other studies. Benzene was the only target compound that showed significant differences between the blood levels detected in non-smoking and smoking volunteers.
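
The mathematical model mentioned above for choosing the number of cycles can be sketched as follows, under the assumptions that each cycle quantitatively traps the analyte present in the headspace and that the sample and headspace fully re-equilibrate between cycles; the partition coefficient, volumes and target recovery used here are illustrative values, not those of the study.

import math

def extracted_fraction(n_cycles: int, K: float, v_sample: float, v_headspace: float) -> float:
    """Cumulative fraction of analyte trapped after n dynamic headspace cycles,
    assuming quantitative adsorption of the headspace content in every cycle
    and full re-equilibration of the sample/headspace system between cycles."""
    f_hs = K * v_headspace / (v_sample + K * v_headspace)  # headspace fraction at equilibrium
    return 1.0 - (1.0 - f_hs) ** n_cycles

def cycles_needed(target: float, K: float, v_sample: float, v_headspace: float) -> int:
    """Smallest number of cycles reaching a prefixed extracted fraction."""
    f_hs = K * v_headspace / (v_sample + K * v_headspace)
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - f_hs))

# Illustrative numbers: headspace/sample partition coefficient K = 0.2,
# 2 mL of diluted blood, 8 mL of headspace, 90 % target recovery.
print(extracted_fraction(5, K=0.2, v_sample=2.0, v_headspace=8.0))   # about 0.95
print(cycles_needed(0.90, K=0.2, v_sample=2.0, v_headspace=8.0))     # 4 cycles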

Relevance:

30.00%

Publisher:

Abstract:

A theoretical model for the noise properties of n+nn+ diodes in the drift-diffusion framework is presented. In contrast with previous approaches, our model incorporates both the drift and diffusive parts of the current under inhomogeneous and hot-carrier conditions. Closed analytical expressions describing the transport and noise characteristics of submicrometer n+nn+ diodes, in which the diode base (n part) and the contacts (n+ parts) are coupled in a self-consistent way, are obtained.

Relevance:

30.00%

Publisher:

Abstract:

Construction in Russia is exposed to the same risk factors as in Finland, but owing to the different operating environment the significance of these risk factors varies. In addition, construction in Russia is exposed to many other risks. The main purpose of this thesis is to identify the most significant risk factors affecting the housing production that the case company carries out in Russia through its subsidiary. The significance of the risk factors in the Russian operations is assessed relative to the operating environment and the construction-industry risk landscape prevailing in Finland. The thesis was carried out with a qualitative research approach. The theoretical part lays the foundation for understanding the empirical part and the research itself. The significant risk factors identified for housing production in Russia stem largely from factors related to the operating environment. The Russian economy is currently in the grip of the global economic crisis. Based on the results, significant risk factors also apply to housing production itself. Risk factors related to the operating environment and the market are, however, generally more significant when building in Russia than the risks related to carrying out the housing production itself.

Relevance:

30.00%

Publisher:

Abstract:

This thesis was produced for the Technology Marketing unit at the Nokia Research Center. Technology marketing was a new function at the Nokia Research Center and needed an established framework capable of taking multiple aspects into account when measuring team performance. Technology marketing functions existed in other parts of Nokia, yet no single method had been agreed upon for measuring their performance. The purpose of this study was to develop a performance measurement system for Nokia Research Center Technology Marketing. The target was for Nokia Research Center Technology Marketing to have a framework of separate metrics, including benchmarks for the starting level and target values for future planning (numeric values were kept confidential within the company). As a result of this research, the Balanced Scorecard model of Kaplan and Norton was chosen as the performance measurement system for Nokia Research Center Technology Marketing. The research selected the indicators to be utilized in the chosen performance measurement system. Furthermore, the performance measurement system was defined to guide the Head of Marketing in managing the Nokia Research Center Technology Marketing team. During the research process the team mission, vision, strategy and critical success factors were outlined.

Relevance:

30.00%

Publisher:

Abstract:

In animal psychology, the open-field (OF) test is a traditional method for studying different aspects of rodent behavior, with thigmotaxis (i.e., wall-seeking behavior) being one of the best validated OF parameters employed to measure emotionality. The main purpose of the present study was to investigate the selection response in mice selectively bred for high and low levels of OF thigmotaxis (the HOFT and LOFT lines, respectively). The mice (N = 2048) were selected for 23 generations, resulting in bidirectional phenotypic divergence between the two lines; that is, the HOFT mice were more thigmotactic (i.e., more emotional) than the LOFT mice across the different generations. The origin of the line difference in thigmotaxis was further investigated by using the cross-fostering paradigm, with the results suggesting that the divergence between the two lines was primarily innate in origin and not influenced by differing maternal behavior. The stability of the selection trait was examined by testing the animals at different ages as well as in varying conditions. The results indicated that the line difference in thigmotaxis was not affected by age at the time of testing, and it also persisted in the different OF testing situations as well as during pregnancy and lactation. The examination of a possible co-selection of other characteristics revealed that the more thigmotactic HOFT mice lived longer than the less thigmotactic LOFT mice. In addition, the HOFT mice tended to rear and explore less than the LOFT mice, supporting the general assumption that emotionality and exploration are inversely related. The two lines did not generally differ in ambulation and defecation, that is, in the traditional OF indexes of emotionality, conforming to the suggestion that emotionality is a multidimensional construct. The effects of sex on different OF parameters were also assessed, with the results suggesting that among the HOFT and LOFT lines, the female mice were more emotional than the male mice. The examination of the temporal changes in the HOFT and LOFT lines' OF behavior revealed some contradictory findings that also partially conflicted with general assumptions. Although this study did not show prominent differences in maternal responsiveness between the HOFT and LOFT mothers, the results suggested that the line divergence in emotionality was more pronounced in the presence of a pup after parturition than during pregnancy. The present study clearly demonstrates that OF thigmotaxis is a strong characteristic for producing two diverging lines of mice. The difference in thigmotaxis between the selectively bred HOFT and LOFT mice seemed to be a stable and robust feature of these animals, and it appeared to stem from a genetic background.

Relevance:

30.00%

Publisher:

Abstract:

The Neem tree, Azadirachta indica, provides many useful compounds that are used as pesticides. However, the field efficiency of products such as neem oil can be compromised when a reproducible content of secondary metabolites such as azadirachtin is not ensured. A new method based on reversed-phase high-performance liquid chromatography (HPLC) was developed to permit the rapid quantitative analysis of azadirachtin in Neem seeds, extracts and oil. In the present study, the quantitative variation of azadirachtin among various Neem extracts and seeds was evaluated, showing the importance of quality control for reproducing insecticidal efficiency, using S. frugiperda as the target insect.