980 results for Instrumentation and Applied Physics (Formerly ISU)


Relevance:

100.00%

Publisher:

Abstract:

Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a non-exhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, together with their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.

Relevance:

100.00%

Publisher:

Abstract:

Sexual reproduction is nearly universal in eukaryotes and genetic determination of sex prevails among animals. The astonishing diversity of sex-determining systems and sex chromosomes is nonetheless bewildering. Some taxonomic groups possess conserved and dimorphic sex chromosomes, involving a functional copy (e.g. mammals' X, birds' Z) and a degenerated copy (mammals' Y, birds' W), implying that sex chromosomes are expected to decay. In contrast, others, such as amphibians, reptiles and fishes, have maintained undifferentiated sex chromosomes. Why such different evolutionary trajectories? In this thesis, we empirically test and characterize the main hypotheses proposed to prevent the genetic decay of sex chromosomes, namely occasional X-Y recombination and frequent sex-chromosome transitions, using the Palearctic radiation of Hyla tree frogs as a model system. We take a phylogeographic and phylogenetic approach to relate sex-chromosome recombination, differentiation, and transitions in a spatial and temporal framework. By reconstructing the recent evolutionary history of the widespread European tree frog H. arborea, we showed that sex chromosomes can recombine in males, preventing their differentiation, a situation that can potentially evolve rapidly. At the scale of the entire radiation, X-Y recombination combines with frequent transitions to prevent sex-chromosome degeneration in Hyla: we traced several turnovers of the sex-determining system within the last 10 My. These rapid changes seem less random than usually assumed: we gathered evidence that one chromosome pair is a "sex expert", carrying genes with key roles in animal sex determination, which probably became specialized through frequent reuse as a sex chromosome in Hyla and other amphibians. Finally, we took advantage of secondary contact zones between closely related Hyla lineages to evaluate the consequences of sex-chromosome homomorphy for the genetics of speciation. In comparison with other systems, the evolution of sex chromosomes in Hyla highlights consistent evolutionary patterns within the chaotic diversity and flexibility of cold-blooded vertebrates' sex-determining systems, and provides insights into the evolution of recombination. Beyond sex-chromosome evolution, this work also contributes significantly to speciation, phylogeography and applied conservation research. -- Sexual reproduction is almost universal among eukaryotes, and sex is most often determined genetically within the animal kingdom. The remarkable diversity of reproductive systems and sex chromosomes is particularly striking. Some taxonomic groups possess dimorphic and highly conserved sex chromosomes, with one fully functional copy (e.g. the mammalian X, the avian Z) and one degenerated copy (e.g. the mammalian Y, the avian W), suggesting that sex chromosomes are doomed to deteriorate. However, the sex chromosomes of other groups, such as amphibians, reptiles and fishes, are mostly undifferentiated. How can such different evolutionary trajectories be explained? In this thesis, we empirically studied the evolutionary processes that can keep sex chromosomes intact, namely occasional X-Y recombination and frequent sex-chromosome transitions, using the Palearctic tree frogs of the genus Hyla as a study system.
We adopted a phylogeographic and phylogenetic approach to capture recombination, differentiation and sex-chromosome transition events in a spatio-temporal framework. By retracing the recent evolutionary history of the European tree frog H. arborea, we showed that sex chromosomes can recombine in males, thereby preventing their differentiation, and that this process has the potential to evolve very rapidly. At the broader scale of the radiation, X-Y recombination appears to act together with sex-chromosome transitions to keep sex chromosomes intact in populations: the sex-determination system of tree frogs has changed several times over the last 10 million years. These frequent transitions do not seem random: we identified one chromosome pair whose characteristics point to a specialization in sex determination (notably because it carries genes important for this function) and that has been reused several times as a sex chromosome in tree frogs as well as in other amphibians. Finally, we studied hybridization between different species in their contact zones, to assess whether the lack of differentiation between X and Y plays a role in the genetic processes of speciation. Beyond its interest for understanding sex-chromosome evolution, this work also contributes significantly to other research fields such as speciation, phylogeography and conservation biology.

Relevance:

100.00%

Publisher:

Abstract:

Various studies suggest that oxidative modifications of low density lipoprotein (LDL), and also of other lipoproteins, have an important role in the development of atherosclerosis. In addition to the oxidation products formed endogenously, oxidised triacylglycerols (TAG) and oxysterols in the diet contribute to the oxidised lipoproteins found in circulation. However, studies on both the effect of oxidised dietary lipids on lipoprotein lipid oxidation and the reactions that modify oxidised fat after ingestion have been scarce. Studies on the effects of dietary antioxidants on lipid oxidation in vivo and on the risk of atherosclerosis have been inconclusive. More clinical trials are needed to test the importance of lipoprotein oxidation as a cardiovascular risk factor in humans. In recent years, various methods have been optimised and applied to the analysis of lipid oxidation products in vivo, and information on the molecular structures of oxidised lipids in plasma, lipoproteins and atherosclerotic plaques has started to accumulate. However, the specific structures of oxidised TAG molecules present in these tissues and lipoprotein fractions have not been investigated earlier. In the original research in this thesis, an approach based on high-performance liquid chromatography-electrospray ionisation-mass spectrometry (HPLC-ESI-MS) and baseline diene conjugation (BDC) methods was used to investigate the lipid oxidation level and oxidised TAG molecular structures in pig and human lipoproteins after dietary interventions. The approach was optimised with human LDL samples, which contained various oxidation products of TAG. LDL particles of hyperlipidaemic subjects contained an elevated amount of conjugated dienes. In the pig studies, several oxidised TAG structures with hydroxy, keto, epoxy or aldehydic groups were found in chylomicrons and VLDL after diets rich in sunflower seed oil. The results also showed that oxidised sunflower seed oil increased the oxidation of lipoprotein lipids and their TAG molecules. TAG hydroperoxides could be detected neither in the small intestinal mucosa of the pigs fed the oxidised oil nor in their chylomicrons or VLDL. In the clinical studies, dietary flavonol aglycones extracted from sea buckthorn berries did not affect lipoprotein lipid oxidation or other potential risk factors of atherosclerosis, but their absorption was demonstrated. Oil supplementation seemed to increase the bioavailability of the flavonols. Oxidised TAG molecules were detected in LDL particles of the subjects after both the flavonol and control diets.
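
The baseline diene conjugation (BDC) measurement mentioned above is, at its core, a Beer-Lambert calculation on the UV absorbance of conjugated dienes in a lipid extract (typically read near 234 nm). The following minimal Python sketch shows only that arithmetic; the wavelength and the molar absorptivity are assumed, commonly cited literature values, not parameters taken from this thesis.

    def conjugated_dienes_umol_per_l(absorbance_234nm, path_length_cm=1.0,
                                     molar_absorptivity=29500.0):
        """Estimate the conjugated-diene concentration via the Beer-Lambert law.

        molar_absorptivity (L mol^-1 cm^-1) is an assumed typical literature
        value for conjugated dienes; substitute the coefficient from the
        protocol actually used. Returns micromoles per litre of extract.
        """
        mol_per_l = absorbance_234nm / (molar_absorptivity * path_length_cm)
        return mol_per_l * 1e6

    # Example: a lipid extract reading A(234 nm) = 0.15 in a 1 cm cuvette
    print(round(conjugated_dienes_umol_per_l(0.15), 2))  # about 5.08 umol/L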

Relevance:

100.00%

Publisher:

Abstract:

In two previous papers [J. Differential Equations, 228 (2006), pp. 530-579; Discrete Contin. Dyn. Syst. Ser. B, 6 (2006), pp. 1261-1300] we developed fast algorithms for the computation of invariant tori in quasi-periodic systems and proved theorems that assess their accuracy. In this paper, we report on the implementation of these algorithms and study their performance in practice. More importantly, we note that, owing to the speed of the algorithms and the theoretical results on their reliability, we can compute with confidence invariant objects close to the breakdown of their hyperbolicity properties. This allows us to identify a mechanism of loss of hyperbolicity and to measure some of its quantitative regularities. We find that some systems lose hyperbolicity because the stable and unstable bundles approach each other while the Lyapunov multipliers remain away from 1. We find empirically that, close to the breakdown, the distances between the invariant bundles and the Lyapunov multipliers, which are natural measures of hyperbolicity, depend on the parameters according to power laws with universal exponents. We also observe that, even though the rigorous justifications in [J. Differential Equations, 228 (2006), pp. 530-579] are developed only for hyperbolic tori, the algorithms also work for elliptic tori in Hamiltonian systems. We can continue these tori and also compute some bifurcations at resonance, which may lead to the existence of hyperbolic tori with nonorientable bundles. We compute manifolds tangent to nonorientable bundles.
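
The empirical power laws reported here (for example, the distance between the stable and unstable bundles as a function of the distance of the parameter from its breakdown value) are the kind of scaling that can be extracted from numerical data with a log-log least-squares fit. A minimal Python sketch with invented data and a hypothetical breakdown value eps_c, not output of the algorithms described in the paper:

    import numpy as np

    # Hypothetical measurements of the distance between the stable and
    # unstable bundles at parameter values eps approaching an assumed
    # breakdown value eps_c (all numbers invented for illustration).
    eps_c = 1.0
    eps = np.array([0.90, 0.95, 0.98, 0.99, 0.995])
    bundle_distance = np.array([0.31, 0.22, 0.13, 0.091, 0.063])

    # Fit bundle_distance ~ C * (eps_c - eps)**beta on a log-log scale.
    slope, intercept = np.polyfit(np.log(eps_c - eps), np.log(bundle_distance), 1)
    print(f"fitted exponent beta = {slope:.3f}, prefactor C = {np.exp(intercept):.3f}")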

Relevance:

100.00%

Publisher:

Abstract:

In his version of the theory of multicomponent systems, Friedman used the analogy that exists between the virial expansion for the osmotic pressure obtained from the McMillan-Mayer (MM) theory of solutions in the grand canonical ensemble and the virial expansion for the pressure of a real gas. For the calculation of the thermodynamic properties of the solution, Friedman proposed a definition for the "excess free energy" that is reminiscent of the old idea of "osmotic work". However, the precise meaning to be attached to his free energy is, among other reasons, not well defined because in osmotic equilibrium the solution is not a closed system and, for a given process, the total amount of solvent in the solution varies. In this paper, an analysis based on thermodynamics is presented in order to obtain an exact and precise definition of Friedman's excess free energy and of its use in comparisons with experimental data.
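
For readers unfamiliar with the analogy invoked here: in the McMillan-Mayer framework the osmotic pressure $\Pi$ of the solution admits a virial expansion in the solute number densities that is formally identical to the virial expansion for the pressure of a real gas. In standard textbook form (not a formula taken from this paper), $\Pi / k_{\mathrm{B}} T = \sum_j \rho_j + \sum_{j,k} B_{jk}\,\rho_j \rho_k + \cdots$, to be compared with $p / k_{\mathrm{B}} T = \rho + B_2(T)\,\rho^2 + \cdots$ for the gas, where the osmotic virial coefficients $B_{jk}$ are evaluated at fixed solvent chemical potential. It is precisely this fixed-solvent-potential, open-system character of osmotic equilibrium that underlies the ambiguity in Friedman's excess free energy discussed in the paper.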

Relevance:

100.00%

Publisher:

Abstract:

Two graphs with adjacency matrices $\mathbf{A}$ and $\mathbf{B}$ are isomorphic if there exists a permutation matrix $\mathbf{P}$ for which the identity $\mathbf{P}^{\mathrm{T}} \mathbf{A} \mathbf{P} = \mathbf{B}$ holds. Multiplying through by $\mathbf{P}$ and relaxing the permutation matrix to a doubly stochastic matrix leads to the linear programming relaxation known as fractional isomorphism. We show that the levels of the Sherali--Adams (SA) hierarchy of linear programming relaxations applied to fractional isomorphism interleave in power with the levels of a well-known color-refinement heuristic for graph isomorphism called the Weisfeiler--Lehman algorithm, or, equivalently, with the levels of indistinguishability in a logic with counting quantifiers and a bounded number of variables. This tight connection has quite striking consequences. For example, it follows immediately from a deep result of Grohe in the context of logics with counting quantifiers that a fixed number of levels of SA suffice to determine isomorphism of planar and minor-free graphs. We also offer applications in both finite model theory and polyhedral combinatorics. First, we show that certain properties of graphs, such as that of having a flow circulation of a prescribed value, are definable in the infinitary logic with counting with a bounded number of variables. Second, we exploit a lower bound construction due to Cai, Fürer, and Immerman in the context of counting logics to give simple explicit instances that show that the SA relaxations of the vertex-cover and cut polytopes do not reach their integer hulls for up to $\Omega(n)$ levels, where $n$ is the number of vertices in the graph.
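
To make the relaxation concrete: since $\mathbf{P}\mathbf{P}^{\mathrm{T}} = \mathbf{I}$ for a permutation matrix, the identity $\mathbf{P}^{\mathrm{T}} \mathbf{A} \mathbf{P} = \mathbf{B}$ is equivalent to $\mathbf{A}\mathbf{P} = \mathbf{P}\mathbf{B}$, and dropping integrality yields the fractional isomorphism linear program: find $\mathbf{X} \geq 0$ with $\mathbf{A}\mathbf{X} = \mathbf{X}\mathbf{B}$, $\mathbf{X}\mathbf{1} = \mathbf{1}$ and $\mathbf{1}^{\mathrm{T}}\mathbf{X} = \mathbf{1}^{\mathrm{T}}$, i.e. a doubly stochastic $\mathbf{X}$ that intertwines the two adjacency matrices. The levels studied in the paper are obtained by applying the Sherali--Adams lift-and-project operator to this feasibility system.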

Relevance:

100.00%

Publisher:

Abstract:

Industrial applications increasingly require real-time data processing. Reliability is one of the most important properties of a system capable of real-time data processing, and achieving it requires testing both the hardware and the software. The main focus of this work is hardware testing and hardware testability, because a reliable hardware platform is the foundation of future real-time systems. The thesis presents the design of a processor board intended for digital signal processing. The board is intended for predictive condition monitoring of electrical machines. The latest DFT (Design for Testability) methods are introduced and applied to the design of the processor board together with older methods. Experiences and observations on the applicability of the methods are reported at the end of the work. The aim of the work is to develop a component of a web-based condition monitoring system that has been developed at the Department of Electrical Engineering of Lappeenranta University of Technology.

Relevance:

100.00%

Publisher:

Abstract:

The objective of the study was to find out how to develop the company's current e-service system, an electronic communication and information-sharing system based on Internet technology, in the management of the company's business-to-business customer relationships. A further objective was to draw up proposals for new e-service contract models. In the theoretical part of the study, the aim was to develop a framework model based on earlier research, the literature and expert input. In the empirical part, the objectives were pursued by interviewing the company's customers and personnel and by examining the current state and development of customer contacts. On the basis of this information, the needs, profiles and readiness of e-service users, as well as the current attractiveness of the service, were examined. The source material for the theoretical part consisted of literature, articles and statistics on customer relationship management and on the marketing, current state and development of e-services, in particular Internet and online services. In addition, literature on value network analysis, customer value, information technology, service quality and customer satisfaction was reviewed. The empirical part of the study is based on information collected in interviews with the company's personnel and customers, on material previously collected by the company, and on data collected by Taloustutkimus. The study used a case method combining qualitative and quantitative research. The purpose of the case was to test the validity and usability of the model and to find out whether there are further factors that affect the value received by the customer. The qualitative material is based on customer and employee interviews conducted using the thematic interview method. The quantitative research is based on a Taloustutkimus survey and on data collected on the company's customer contacts. Based on the interviews, e-services were seen as useful and as very important in the future. E-services are seen as one important channel, alongside traditional channels, for making the management of business-to-business customer relationships more effective. According to the results of the study, the variation in customers' levels of knowledge, skills, perceived need and interest related to the service shows a clear need for e-service package solutions at different levels. The solution proposal formed from the results comprises building four different e-service packages tailored to the different needs of the customers.

Relevance:

100.00%

Publisher:

Abstract:

Characterizing the geological features and structures in three dimensions over inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Indeed, detailed 3D data, such as LiDAR point clouds, allow the hazard processes and the structure of geological features to be studied accurately, in particular in vertical and overhanging rock slopes. Thus, 3D geological models have great potential to be applied to a wide range of geological investigations, both in research and in applied geology projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral / hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies, as well as of failure mechanisms and stability conditions, by integrating detailed remote data. During the past ten years several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky Highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain residential settlements and roads exposed to high risk. It is necessary to understand the main factors that destabilize rocky outcrops even where inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to increase the possibilities of forecasting potential future landslides, it is crucial to understand the evolution of rock slope stability. Defining the areas theoretically most prone to rockfalls can be particularly useful to simulate trajectory profiles and to generate hazard maps, which are the basis for land use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. Afterwards, I developed a model to compute the failure mechanisms on terrestrial point clouds in order to assess the susceptibility to rockfalls at the cliff scale. Similar procedures were already available to evaluate the susceptibility to rockfalls based on aerial digital elevation models. This new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The results of the computation of the most probable rockfall source areas in the granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared to the inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because it has a particularly strong rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its significant rockfall activity and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs that are difficult to study with classical methods.
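
To give a flavour of the kind of computation such a point-cloud model builds on, the structural analysis of a terrestrial point cloud typically starts from local plane fits: the normal of the best-fit plane through a patch of points gives the orientation (dip and dip direction) of the discontinuity exposed there. The following is a minimal, self-contained Python sketch of that single step, assuming z up and y pointing north; it is an illustration, not the thesis code.

    import numpy as np

    def plane_orientation(points):
        """Dip and dip direction (degrees) of the best-fit plane through an
        (N, 3) array of x, y, z coordinates (assumes z up, y north, x east)."""
        centered = points - points.mean(axis=0)
        # The plane normal is the right singular vector associated with the
        # smallest singular value of the centered coordinates.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        normal = vt[-1]
        if normal[2] < 0:                      # make the normal point upward
            normal = -normal
        dip = np.degrees(np.arccos(normal[2]))
        dip_direction = np.degrees(np.arctan2(normal[0], normal[1])) % 360.0
        return dip, dip_direction

    # Synthetic patch dipping about 45 degrees towards the east (azimuth 90)
    xs, ys = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
    patch = np.column_stack([xs.ravel(), ys.ravel(), (-xs).ravel()])
    print(plane_orientation(patch))            # approximately (45.0, 90.0)
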
Limit equilibrium models have been applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling. In particular, I conducted a back-analysis of the large rockfall event of 2005 (265'000 m3) by integrating field observations of joint conditions, characteristics of the fracturing pattern and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique to obtain vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of the unstable rock compartments. Finally, the following points summarize the main outputs of my research: The new model to compute the failure mechanisms and the rockfall susceptibility with 3D point clouds allows the most probable rockfall source areas to be defined accurately at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed measurements of the fractures in geomechanical models of rock mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to the rock type and then use this information to model complex geological structures. The integration of these results, on rock mass fracturing and composition, with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes. -- Characterizing the geology in 3D for inaccessible rock walls is a necessary step in assessing natural hazards such as rockfalls and rockslides, but also in building stratigraphic or fold-structure models. 3D geological models have great potential to be applied in a wide range of geological work, both in research and in applied projects such as mines, tunnels or reservoirs. Recent developments in ground-based remote sensing tools (LiDAR, photogrammetry and multispectral / hyperspectral imaging) are revolutionizing the acquisition of geomorphological and geological information. Consequently, there is great potential for improving the modeling of geological objects, as well as of failure mechanisms and stability conditions, by integrating detailed remotely acquired data. To increase the possibilities of forecasting future rockfalls, it is fundamental to understand the current evolution of rock slope stability.
Defining the zones that are theoretically most prone to rockfalls can be very useful for simulating block propagation trajectories and for producing hazard maps, which form the basis of land-use planning in mountain regions. The most important questions to answer in order to estimate rockfall hazard are: Where are the most probable sources of future rockfalls located? How frequently will these events occur? I therefore characterized the fracture networks in the field and with LiDAR point clouds. I then developed a model to compute failure mechanisms directly on the point clouds, in order to assess rockfall susceptibility at the cliff scale. The most probable rockfall source areas in the granitic cliffs of Yosemite Valley and the Mont-Blanc massif were computed and then compared with event inventories to verify the methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on cliff stability. The impact of rock-bridge degradation on the stability of large rock compartments in the west face of the Petit Dru was assessed using finite element modeling. In particular, I analyzed the large 2005 rockfall (265'000 m3), which removed the entire south-west pillar. In the model I integrated observations of joint conditions, the characteristics of the fracture network and the results of geomechanical tests on the intact rock. These analyses improved the estimation of the parameters that influence the stability of rock compartments and were used to define probable volumes for future rockfalls. The point clouds obtained with the terrestrial laser scanner were also successfully used to produce 3D geological maps, using the intensity of the reflected signal. Another technique for obtaining geological maps of vertical zones consists of combining a LiDAR mesh with a 2D geological map. At El Capitan (Yosemite Valley) we were able to georeference a vertical map of the main plutonic rocks, which I then used to investigate the reasons for the preferential erosion of certain zones of the wall. Further efforts to quantify the erosion rate were made at Monte Generoso (Ticino, Switzerland), where I tried to improve the estimation of long-term erosion by also taking into account the volumes of the unstable rock compartments. Integrating these results on rock mass fracturing and composition with existing methods improves rockfall hazard assessment and enhances the interpretation of the evolution of steep rock walls.

Relevance:

100.00%

Publisher:

Abstract:

Insects are the most diverse group of animals on the planet, comprising over 90% of all metazoan life forms, and have adapted to a wide diversity of ecosystems in nearly all environments. They have evolved highly sensitive chemical senses that are central to their interaction with their environment and to communication between individuals. Understanding the molecular bases of insect olfaction is therefore of great importance from both a basic and an applied perspective. Odorant binding proteins (OBPs) are among the most abundant proteins found in insect olfactory organs, where they are the first component of the olfactory transduction cascade, carrying odorant molecules to the olfactory receptors. We carried out a search for OBPs in the genome of the parasitoid wasp Nasonia vitripennis and identified 90 sequences encoding putative OBPs. This is the largest OBP family so far reported in insects. We report unique features of the N. vitripennis OBPs, including the presence and evolutionary origin of a new subfamily of double-domain OBPs (consisting of two concatenated OBP domains), the loss of conserved cysteine residues and the expression of pseudogenes. This study also demonstrates the extremely dynamic evolution of the insect OBP family: (i) the number of different OBPs can vary greatly between species; (ii) the sequences are highly diverse, sometimes as a result of positive selection pressure, with even the canonical cysteines being lost; (iii) new lineage-specific domain arrangements can arise, such as the double-domain OBP subfamily of wasps and mosquitoes.
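
As a toy illustration of the kind of sequence screening such a genome survey involves: classic insect OBPs are recognized in part by a hallmark of six conserved cysteine residues, and the abstract notes that some N. vitripennis OBPs have lost canonical cysteines. A minimal Python sketch of that check, run on made-up sequences rather than data from the study:

    # Toy screen: count cysteine residues in candidate protein sequences and
    # flag those missing the six-cysteine hallmark of classic insect OBPs.
    # Both sequences are invented for illustration only.
    candidates = {
        "obp_candidate_1": "MKTLIACLLAVCLACDEQLKCMRACAPHCAEKFNCLMACPK",
        "obp_candidate_2": "MKFLVAADEQLKAMRASAPHAAEKFNALMAAPK",
    }

    for name, seq in candidates.items():
        n_cys = seq.count("C")
        label = "classic-like" if n_cys >= 6 else "atypical (lost cysteines?)"
        print(f"{name}: {n_cys} cysteines -> {label}")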

Relevance:

100.00%

Publisher:

Abstract:

The objective of this master's thesis was to develop a model for mobile subscription acquisition cost (SAC) and mobile subscription retention cost (SRC) by applying activity-based cost accounting principles. The thesis was conducted as a case study for a telecommunication company operating on the Finnish telecommunication market. In addition to activity-based cost accounting, other theories were studied and applied in order to establish a theoretical framework for the thesis. The concepts of acquisition and retention were explored in a broader context together with the concepts of customer satisfaction, loyalty, profitability and, eventually, customer relationship management, to understand the background and meaning of the theme of this thesis. The utilization of SAC and SRC information is discussed through the theories of decision making and activity-based management. The present state and future needs of SAC and SRC information usage at the case company, as well as the functions of the company, were also examined by interviewing members of the company personnel. With the help of these theories and methods, the aim was to identify both the theory-based and the practical factors that affect the structure of the model. During the thesis study it was confirmed that the existing SAC and SRC model of the case company should be used as the basis for developing the activity-based model. As a result, the indirect costs of the old model were transformed into activities, while the direct costs continued to be allocated directly to the acquisition of new subscriptions and the retention of old subscriptions. The refined model enables better management of subscription acquisition, retention and the related costs through the activity information. During the interviews it was found that the SAC and SRC information is also used in performance measurement and in operational and strategic planning. SAC and SRC are not fully absorbed costs, and it was concluded that the model serves best as a source of indicative cost information. This thesis does not include calculating costs. Instead, the refined model, together with both the theory-based and the interview findings concerning the utilization of the information produced by the model, serves as a framework for possible future development aiming at completing the model.
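
To illustrate the allocation logic described above: in an activity-based model the indirect costs are first assigned to activities and then driven to the acquisition (SAC) and retention (SRC) cost pools via activity drivers, while direct costs flow straight to each pool. A minimal Python sketch with hypothetical activities, drivers and figures; nothing here comes from the case company.

    # Minimal activity-based costing sketch: indirect costs are assigned to
    # activities and then driven to the acquisition (SAC) or retention (SRC)
    # pool by driver shares; direct costs go straight to each pool.
    # All activities, shares and amounts are hypothetical.
    indirect_costs = {"order handling": 40_000.0, "customer service": 60_000.0}
    driver_share_to_acquisition = {"order handling": 0.7, "customer service": 0.2}

    direct_costs = {"acquisition": 250_000.0, "retention": 120_000.0}

    sac_pool = direct_costs["acquisition"]
    src_pool = direct_costs["retention"]
    for activity, cost in indirect_costs.items():
        share = driver_share_to_acquisition[activity]
        sac_pool += cost * share
        src_pool += cost * (1.0 - share)

    new_subscriptions, retained_subscriptions = 5_000, 20_000
    print(f"SAC per new subscription: {sac_pool / new_subscriptions:.2f}")
    print(f"SRC per retained subscription: {src_pool / retained_subscriptions:.2f}")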

Relevance:

100.00%

Publisher:

Abstract:

The central goal of food safety policy in the European Union (EU) is to protect consumer health by guaranteeing a high level of food safety throughout the food chain. This goal can in part be achieved by testing foodstuffs for the presence of various chemical and biological hazards. The aim of this study was to facilitate food safety testing by providing rapid and user-friendly methods for the detection of particular food-related hazards. Heterogeneous competitive time-resolved fluoroimmunoassays were developed for the detection of selected veterinary residues, that is, coccidiostat residues, in eggs and chicken liver. After a simplified sample preparation procedure, the immunoassays were performed either in a manual format with dissociation-enhanced measurement or in an automated format with pre-dried assay reagents and surface measurement. Although the assays were primarily designed for screening purposes, providing only qualitative results, they could also be used in a quantitative mode. All the developed assays had good performance characteristics, enabling reliable screening of samples at the concentration levels required by the authorities. A novel polymerase chain reaction (PCR)-based assay system was developed for the detection of Salmonella spp. in food. The sample preparation included a short non-selective pre-enrichment step, after which the target cells were collected with immunomagnetic beads and applied to PCR reaction vessels containing all the reagents required for the assay in dry form. The homogeneous PCR assay was performed with a novel instrument platform, GenomEra, and the qualitative assay results were automatically interpreted based on end-point time-resolved fluorescence measurements and cut-off values. The assay was validated using various food matrices spiked with sub-lethally injured Salmonella cells at levels of 1-10 colony forming units (CFU)/25 g of food. The main advantage of the system was the exceptionally short time to result: the entire process, starting from the pre-enrichment and ending with the PCR result, could be completed in eight hours. In conclusion, molecular methods using state-of-the-art assay techniques were developed for food safety testing. The combination of time-resolved fluorescence detection and ready-to-use reagents enabled sensitive assays that are easily amenable to automation. Consequently, together with the simplified sample preparation, these methods could prove applicable in routine testing.
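
The qualitative read-out described here amounts to comparing a measured end-point signal with a pre-set cut-off. A minimal Python sketch of that decision logic; the cut-off, the equivocal band and the signal values are invented for illustration and are not the validated GenomEra parameters.

    # Qualitative interpretation of end-point fluorescence signals against a
    # cut-off, as in screening assays; all numbers are invented.
    CUTOFF = 1.25            # signal-to-background ratio used as the decision limit
    EQUIVOCAL_BAND = 0.10    # ratios within +/- this band of the cut-off are retested

    def interpret(signal, background):
        ratio = signal / background
        if abs(ratio - CUTOFF) <= EQUIVOCAL_BAND:
            return "equivocal - retest"
        return "positive" if ratio > CUTOFF else "negative"

    for sample, (sig, bg) in {"A": (980.0, 520.0), "B": (640.0, 510.0)}.items():
        print(sample, interpret(sig, bg))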

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we obtain sharp asymptotic formulas with error estimates for the Mellin convolution of functions defined on $(0, \infty)$, and use these formulas to characterize the asymptotic behavior of the marginal distribution densities of stock price processes in mixed stochastic models. Special examples of mixed models are jump-diffusion models and stochastic volatility models with jumps. We apply our general results to the Heston model with double exponential jumps, and make a detailed analysis of the asymptotic behavior of the stock price density, the call option pricing function, and the implied volatility in this model. We also obtain similar results for the Heston model with jumps distributed according to the NIG law.
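
For reference, the Mellin convolution referred to here is the standard one for functions $f$ and $g$ on $(0, \infty)$, namely $h(x) = \int_0^{\infty} f(x/y)\, g(y)\, \frac{dy}{y}$. It arises naturally because the density of a product of two independent positive random variables is exactly the Mellin convolution of their densities, which is why such convolutions appear when an independent jump factor multiplies the continuous part of the stock price in mixed models.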

Relevance:

100.00%

Publisher:

Abstract:

Efforts made by the scientific community in recent years towards the development of numerous green chemical processes and wastewater treatment technologies are presented and discussed. In the light of these approaches, environmentally friendly technologies, as well as the key role played by the well-known advanced oxidation processes, are discussed, with special attention given to those involving ozone applications. Fundamental and applied aspects of ozone technology and its application are also presented.

Relevance:

100.00%

Publisher:

Abstract:

A dual model with a nonlinear proton Regge trajectory in the missing mass (M_X^2) channel is constructed. A background based on a direct-channel exotic trajectory, developed and applied earlier to the description of the inclusive electron-proton cross section in the nucleon resonance region, is used. The parameters of the model are determined by extrapolation from earlier experiments. Predictions for the low-mass (2 < M_X^2 < 8 GeV^2) diffraction dissociation cross sections at the LHC energies are given.
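
For orientation, and as general background rather than a formula from this paper: Regge trajectories are conventionally written in the linear form $\alpha(t) = \alpha(0) + \alpha' t$, and in dual models of diffraction dissociation the direct-channel trajectory is evaluated at the missing mass, i.e. as $\alpha(M_X^2)$. The model summarized above replaces this linear baseline by a nonlinear proton trajectory in the $M_X^2$ channel; its specific functional form is given in the paper and is not reproduced here.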