971 results for three-shell model


Relevance: 30.00%

Abstract:

Aim Species distribution models (SDMs) based on current species ranges underestimate the potential distribution when projected in time and/or space. A multi-temporal model calibration approach has been suggested as an alternative, and we evaluate this using 13,000 years of data. Location Europe. Methods We used fossil-based records of presence for Picea abies, Abies alba and Fagus sylvatica and six climatic variables for the period 13,000 to 1000 yr BP. To measure the contribution of each 1000-year time step to the total niche of each species (the niche measured by pooling all the data), we employed a principal components analysis (PCA) calibrated with data covering the entire range of possible climates. We then projected both the total niche and the partial niches from single time frames into the PCA space, and tested whether the partial niches were more similar to the total niche than expected by chance. Using an ensemble forecasting approach, we calibrated SDMs for each time frame and for the pooled database. We projected each model to current climate and evaluated the results against current pollen data. We also projected all models into the future. Results Niche similarity between the partial and the total SDMs was almost always statistically significant and increased through time. SDMs calibrated from single time frames gave different results when projected to current climate, providing evidence of a change in the species' realized niches through time. Moreover, they predicted limited climate suitability compared with the total SDMs. The same results were obtained when projecting to future climates. Main conclusions The realized climatic niche of the species differed for current and future climates when SDMs were calibrated on different past climates. Building the niche as an ensemble through time represents a way forward to a better understanding of a species' range and its ecology in a changing climate.
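A minimal sketch of the core idea described above, not the authors' pipeline: project occurrences from each time slice into a PCA space calibrated on the full range of available climates and compare each partial niche with the niche pooled over all slices. The data, the number of PCA axes and the overlap index below are illustrative placeholders (the study used ensemble SDMs and formal niche-similarity tests).

```python
# Sketch: partial vs. pooled ("total") niche in a PCA climate space.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical inputs: rows are sites, columns are six climatic variables.
background_climate = rng.normal(size=(5000, 6))                                # full climate space calibrating the PCA
occurrences_by_slice = [rng.normal(loc=0.3 * k, size=(200, 6)) for k in range(13)]  # one array per 1000-yr step

pca = PCA(n_components=2).fit(background_climate)
total = np.vstack(occurrences_by_slice)

def overlap(a, b, bins=25):
    """Crude niche-overlap index: shared fraction of occupied PCA grid cells."""
    H_a, xe, ye = np.histogram2d(a[:, 0], a[:, 1], bins=bins)
    H_b, _, _ = np.histogram2d(b[:, 0], b[:, 1], bins=[xe, ye])
    return np.minimum(H_a / H_a.sum(), H_b / H_b.sum()).sum()

for k, X in enumerate(occurrences_by_slice):
    score = overlap(pca.transform(X), pca.transform(total))
    print(f"slice {k}: overlap with total niche = {score:.2f}")
```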

Relevance: 30.00%

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because of their significantly lower cost than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike the CISC world, the RISC processor-architecture business is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers and is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated stacks of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, firstly through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and secondly through research on process re-engineering in the case of global software support for complex systems. Thirdly, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix and Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 30.00%

Abstract:

Cutaneous leishmaniasis (CL) caused by Leishmania aethiopica is a public health and social problem that leaves severe and mutilating skin lesions as sequelae. It manifests in three forms: localized CL (LCL), mucosal CL (MCL) and diffuse CL (DCL). Unresponsiveness to sodium stibogluconate (Sb(V)) is common in Ethiopian CL patients. Using the amastigote-macrophage in vitro model, the susceptibility of 24 clinical isolates of L. aethiopica derived from untreated patients was investigated. Eight strains from LCL, nine from MCL and seven from DCL patients, together with a reference strain (MHOM/ET/82/117/82), were tested against four antileishmanial drugs: amphotericin B, miltefosine, Sb(V) and paromomycin. In the same order of drugs, IC50 values (μg/ml ± SD) for the 24 strains tested were 0.16 ± 0.18, 5.88 ± 4.79, 10.23 ± 8.12 and 13.63 ± 18.74. The susceptibility of isolates originating from the three categories of patients did not differ for any of the four drugs (p > 0.05). Maximal efficacy was superior for miltefosine across all the strains. Further susceptibility testing could validate miltefosine as a potential alternative drug in cases of sodium stibogluconate treatment failure in CL patients.
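A generic illustration, not the study's analysis pipeline, of how IC50 values such as those reported are typically obtained: fit a Hill (logistic) dose-response curve to parasite-survival data and read off the concentration giving 50% inhibition. The concentrations, survival fractions and two-parameter model below are invented for the example.

```python
# Sketch: estimating an IC50 from amastigote-survival dose-response data.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, ic50, slope):
    """Fraction of parasites surviving at a given drug concentration (μg/ml)."""
    return 1.0 / (1.0 + (conc / ic50) ** slope)

# Hypothetical measurements for one isolate and one drug.
conc = np.array([0.01, 0.1, 1.0, 5.0, 10.0, 50.0, 100.0])        # μg/ml
survival = np.array([0.98, 0.95, 0.80, 0.55, 0.40, 0.12, 0.05])  # fraction vs. untreated control

(ic50, slope), _ = curve_fit(hill, conc, survival, p0=[5.0, 1.0])
print(f"estimated IC50 = {ic50:.2f} μg/ml (Hill slope {slope:.2f})")
```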

Relevance: 30.00%

Abstract:

Four classes of variables are apparent in the problem of scour around bridge piers and abutments: geometry of piers and abutments, stream-flow characteristics, sediment characteristics, and geometry of the site. The laboratory investigation, from its inception, has been divided into four phases based on these classes. In each phase the variables in three of the classes are held constant and those in the pertinent class are varied. To date, the first three phases have been studied. Typical scour hole patterns related to the geometry of the pier or abutment have been found. For equilibrium conditions of scour with uniform sand, the velocity of flow and the sand size do not appear to have any measurable effect on the depth of scour. This result is especially encouraging in the search for correlation between model and prototype, since it indicates that, primarily, only the depth of flow might be involved in the scale effect. The technique of model testing is therefore simplified, because the rate of sediment transportation does not need to be scaled. Prior to the establishment of equilibrium conditions, however, depths of scour in excess of those for equilibrium conditions have been found. A concept of active scour as an imbalance between sediment transport capacity and the rate of sediment supply has been used to explain the laboratory observations.
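A toy numerical sketch of the imbalance concept mentioned above, not the laboratory model: the scour hole deepens while local sediment transport capacity exceeds the rate of sediment supply, and equilibrium scour depth is reached when the two balance. All constants are hypothetical.

```python
# Toy model: deepening rate proportional to (transport capacity - supply).
import math

capacity_0 = 10.0   # transport capacity out of the hole at zero depth (volume/time)
supply = 2.0        # rate of sediment supplied into the hole (volume/time)
decay = 0.15        # capacity drops as the hole deepens and local velocities fall
k, dt = 0.5, 0.1    # rate constant and time step

depth = 0.0
for _ in range(20000):
    capacity = capacity_0 * math.exp(-decay * depth)
    d_depth = k * max(capacity - supply, 0.0) * dt
    depth += d_depth
    if d_depth < 1e-7:   # capacity has fallen to the supply rate: equilibrium depth
        break

print(f"equilibrium scour depth ≈ {depth:.2f} (arbitrary units)")
print(f"analytic check: ln(capacity_0/supply)/decay = {math.log(capacity_0 / supply) / decay:.2f}")
```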

Relevance: 30.00%

Abstract:

PURPOSE: A homozygous mutation in the H6 family homeobox 1 (HMX1) gene is responsible for a new oculoauricular defect leading to eye and auricular developmental abnormalities as well as early retinal degeneration (MIM 612109). However, the HMX1 pathway remains poorly understood, and as a first step toward a better understanding of its function, we sought to identify its target genes. METHODS: We developed a predictive promoter model (PPM) approach using a comparative transcriptomic analysis of the retina at P15 in a mouse model lacking functional Hmx1 (dmbo mouse) and its wild-type counterpart. The PPM was based on the hypothesis that HMX1 binding site (HMX1-BS) clusters should be overrepresented in the promoters of HMX1 target genes. The most differentially expressed genes in the microarray experiment that contained HMX1-BS clusters were used to generate the PPM, which was then statistically validated. Finally, we developed two genome-wide target prediction methods: one focused on the conservation of PPM features between human and mouse, and one based on the co-occurrence of HMX1-BS pairs fitting the PPM in human or in mouse independently. RESULTS: The PPM construction revealed that the sarcoglycan, gamma (35 kDa dystrophin-associated glycoprotein) (Sgcg), teashirt zinc finger homeobox 2 (Tshz2), and solute carrier family 6 (neurotransmitter transporter, glycine) (Slc6a9) genes represent Hmx1 targets in the mouse retina at P15. Expression of these three genes was experimentally validated using a quantitative reverse transcription PCR approach. Moreover, the genome-wide target prediction revealed that mouse genes belonging to the retinal axon guidance pathway are targeted by Hmx1. The inhibitory activity of Hmx1 on Sgcg, as well as on protein tyrosine phosphatase, receptor type, O (Ptpro) and Sema3f, two further targets identified by the PPM, was validated with luciferase assays. CONCLUSIONS: Gene expression analysis between wild-type and dmbo mice allowed us to develop a PPM that identified the first target genes of Hmx1.
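A minimal sketch of the idea behind a predictive promoter model of this kind: promoters containing clusters of putative HMX1 binding sites score as candidate targets. The motif, window size, cluster threshold and promoter sequences below are illustrative placeholders, not the values or data used in the study.

```python
# Sketch: flag promoters with clustered putative HMX1-BS matches.
import re

HMX1_BS = re.compile(r"CAAGTG")   # hypothetical binding-site motif

def bs_positions(promoter_seq):
    return [m.start() for m in HMX1_BS.finditer(promoter_seq.upper())]

def has_bs_cluster(promoter_seq, window=200, min_sites=2):
    """True if at least `min_sites` binding sites fall within any `window`-bp stretch."""
    pos = bs_positions(promoter_seq)
    for i in range(len(pos)):
        if sum(1 for p in pos if pos[i] <= p < pos[i] + window) >= min_sites:
            return True
    return False

# Hypothetical promoters of candidate target genes.
promoters = {"geneA": "ATGCAAGTGTTTCAAGTGGGC", "geneB": "ATGCCCGGGTTTAAA"}
for gene, seq in promoters.items():
    print(gene, "cluster" if has_bs_cluster(seq, window=50) else "no cluster")
```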

Relevance: 30.00%

Abstract:

It is well established that at ambient and supercooled conditions water can be described as a percolating network of H bonds. This work is aimed at identifying, by neutron diffraction experiments combined with computer simulations, a percolation line in supercritical water, where the extension of the H-bond network is in question. It is found that in real supercritical water liquidlike states are observed at or above the percolation threshold, while below this threshold gaslike water forms small, sheetlike configurations. Inspection of the three-dimensional arrangement of water molecules suggests that crossing of this percolation line is accompanied by a change of symmetry in the first neighboring shell of molecules, from trigonal below the line to tetrahedral above.
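A toy sketch of how a percolation threshold is located numerically, not the paper's analysis: track the size of the largest connected cluster as the occupation probability varies and look for the point where it begins to span the system. A simple-cubic site-percolation lattice stands in here for a hydrogen-bond network built from simulated water configurations with a geometric or energetic H-bond criterion.

```python
# Sketch: largest-cluster fraction vs. occupation probability on a cubic lattice.
import numpy as np
from scipy.ndimage import label

def largest_cluster_fraction(p, n=64, rng=np.random.default_rng(1)):
    """Site-percolation stand-in for an H-bond network on an n x n x n lattice."""
    occupied = rng.random((n, n, n)) < p
    labels, num = label(occupied)          # 6-connected clusters
    if num == 0:
        return 0.0
    sizes = np.bincount(labels.ravel())[1:]
    return sizes.max() / occupied.size

for p in (0.15, 0.25, 0.3116, 0.40, 0.60):   # 0.3116 ~ simple-cubic site-percolation threshold
    print(f"p = {p:.4f}: largest cluster fraction = {largest_cluster_fraction(p):.3f}")
```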

Relevance: 30.00%

Abstract:

With over 68,000 miles of gravel roads in Iowa and the importance of these roads within the farm-to-market transportation system, proper water management becomes critical for maintaining the integrity of the roadway materials. The build-up of water within the aggregate subbase can lead to frost boils and ultimately to potholes forming at the road surface. The aggregate subbase and subgrade soils under these gravel roads are built with material opportunistically chosen from local sources near the site, and in many cases the composition of these sublayers is far from ideal for water drainage, with the full effects of this shortcut not well understood. The primary objective of this project was to provide a physically based model for evaluating the drainability of potential subbase and subgrade materials for gravel roads in Iowa. The Richards equation provides the appropriate framework for studying the transient unsaturated flow that occurs through the subbase and subgrade of a gravel road. From this framework we identified the saturated hydraulic conductivity, Ks, as a key parameter driving the time to drain of subgrade soils found in Iowa, and thus a good proxy for roadway drainability. Using Ks derived from soil texture, we were able to identify potential problem areas in terms of roadway drainage. A threshold Ks of 15 cm/day was found to determine whether the roadway will drain efficiently, based on the requirement that the time to drain, Td, of the surface roadway layer not exceed a 2-hour limit. Two of the three most abundant textures (loam and silty clay loam), which together cover nearly 60% of the state of Iowa, were found to have average Td values greater than the 2-hour limit. With such a large percentage of the state at risk of boil formation because of soils with relatively low saturated hydraulic conductivity, it seems pertinent to propose alternative design and/or maintenance practices to limit expensive repair work in Iowa. The addition of drain tiles or French mattresses may help address drainage problems. However, before pursuing this recommendation, a comprehensive cost-benefit analysis is needed.
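A screening sketch consistent with the rule described above (Ks threshold of 15 cm/day versus a 2-hour time-to-drain limit). The study derived Td from Richards-equation simulations; the simple layer calculation below is only an illustrative proxy, and the layer thickness, drainable porosity and texture-specific Ks values are hypothetical.

```python
# Sketch: compare texture classes against the Ks threshold and Td limit.
KS_THRESHOLD_CM_PER_DAY = 15.0
TD_LIMIT_HR = 2.0

def time_to_drain_hr(ks_cm_per_day, layer_cm=30.0, drainable_porosity=0.05):
    """Hours to drain the free water in a surface layer at a rate ~ Ks."""
    ks_cm_per_hr = ks_cm_per_day / 24.0
    return layer_cm * drainable_porosity / ks_cm_per_hr

for texture, ks in {"sand": 300.0, "loam": 13.0, "silty clay loam": 4.0}.items():
    td = time_to_drain_hr(ks)
    ok = ks >= KS_THRESHOLD_CM_PER_DAY and td <= TD_LIMIT_HR
    print(f"{texture:16s} Ks = {ks:6.1f} cm/day  Td ≈ {td:5.1f} h  -> {'drains within limit' if ok else 'at risk'}")
```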

Relevance: 30.00%

Abstract:

EXECUTIVE SUMMARY: Evaluating the Information Security posture of an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately this is ineffective, because it does not take into account the need for a global, systemic and multidimensional approach to Information Security evaluation. At the same time, the overall security level is generally considered to be only as strong as its weakest link. This thesis proposes a model aiming to assess all dimensions of security holistically, in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; it is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective. The dissertation is divided into three parts. Part One, Information Security Evaluation Issues, consists of four chapters. Chapter 1 introduces the purpose of this research and the model that will be proposed. In this chapter we raise some questions with respect to "traditional evaluation methods" and identify the principal elements to be addressed. We then introduce the baseline attributes of our model and set out the expected results of evaluations performed according to it. Chapter 2 focuses on the definition of Information Security to be used as a reference point for our evaluation model. The concepts inherent in a holistic, baseline Information Security program are defined. Based on this, the most common roots of trust in Information Security are identified. Chapter 3 analyses the difference and the relationship between the concepts of Information Risk Management and Security Management. Comparing these two concepts allows us to identify the most relevant elements to be included in our evaluation model, while clearly situating these two notions within a defined framework is of the utmost importance for the results obtained from the evaluation process. Chapter 4 sets out our evaluation model and the way it addresses issues relating to the evaluation of Information Security. Within this chapter the underlying concepts of assurance and trust are discussed. Based on these two concepts, the structure of the model is developed in order to provide an assurance-related platform together with three evaluation attributes: "assurance structure", "quality issues", and "requirements achievement". Issues relating to each of these evaluation attributes are analysed with reference to sources such as methodologies, standards and published research papers. The operation of the model is then discussed: assurance levels, quality levels and maturity levels are defined in order to perform the evaluation according to the model. Part Two, Implementation of the Information Security Assurance Assessment Model (ISAAM) according to the Information Security Domains, consists of four chapters. This is the section where the evaluation model is put into a well-defined context with respect to the four pre-defined Information Security dimensions: the Organizational, Functional, Human, and Legal dimensions. Each Information Security dimension is discussed in a separate chapter.
For each dimension, the following two-phase evaluation path is followed. The first phase concerns the identification of the elements which constitute the basis of the evaluation:
- identification of the key elements within the dimension;
- identification of the Focus Areas for the dimension, consisting of the security issues identified for that dimension;
- identification of the Specific Factors for the dimension, consisting of the security measures or controls addressing those security issues.
The second phase concerns the evaluation of each Information Security dimension by:
- implementing the evaluation model, based on the elements identified in the first phase, and identifying the security tasks, processes, procedures and actions that should have been performed by the organization to reach the desired level of protection;
- applying the maturity model for the dimension as a basis for reliance on security; for each dimension we propose a generic maturity model that any organization can use to define its own security requirements.
Part Three of this dissertation contains the Final Remarks, Supporting Resources and Annexes. With reference to the objectives of the thesis, the Final Remarks briefly analyse whether these objectives were achieved and suggest directions for future related research. The Supporting Resources comprise the bibliographic sources used to elaborate and justify our approach. The Annexes include the relevant topics identified within the literature to illustrate certain aspects of our approach. Our Information Security evaluation model is based on and integrates different Information Security best practices, standards, methodologies and research expertise, which are combined to define a reliable categorization of Information Security. After the definition of terms and requirements, an evaluation process should be performed in order to obtain evidence that the Information Security within the organization in question is adequately managed. We have specifically integrated into our model the most useful elements of these sources of information in order to provide a generic model able to be implemented in all kinds of organizations. The value added by our evaluation model is that it is easy to implement and operate, and it answers concrete needs for a reliable, efficient and dynamic evaluation tool within a coherent evaluation system. On that basis, our model could be implemented internally within organizations, allowing them to better govern their Information Security. RÉSUMÉ: General context of the thesis. The evaluation of security in general, and of information security in particular, has become for organizations not only a crucial mission but also an increasingly complex one. At present, this evaluation relies mainly on methodologies, best practices, norms or standards that address separately the different aspects making up information security. We consider this way of evaluating security inefficient, because it does not take into account the interaction between the different dimensions and components of security, even though it has long been accepted that the overall security level of an organization is always that of the weakest link in the security chain.
We identified the need for a global, integrated, systemic and multidimensional approach to the evaluation of information security. Indeed, and this is the starting point of our thesis, we show that only a global treatment of security can meet the requirements of optimal security as well as the specific protection needs of an organization. Our thesis therefore proposes a new evaluation paradigm intended to satisfy the effectiveness and efficiency needs of a given organization. We propose a model that aims to evaluate all dimensions of security holistically, in order to minimize the probability that a potential threat could exploit vulnerabilities and cause direct or indirect damage. This model rests on a formalized structure that takes into account all the elements of a security system or program. We thus propose a methodological evaluation framework that considers information security from a global perspective. Structure of the thesis and topics covered: the document is organized in three parts. The first, entitled "The problem of information security evaluation", comprises four chapters. Chapter 1 introduces the object of the research and the basic concepts of the proposed evaluation model. The traditional way of evaluating security is critically analysed in order to identify the principal, invariant elements to be taken into account in our holistic approach. The building blocks of our evaluation model and its expected operation are then presented so that the expected results of the model can be outlined. Chapter 2 focuses on the definition of the notion of Information Security. It is not a redefinition of security, but a putting into perspective of the dimensions, criteria and indicators to be used as a reference baseline in order to determine the object of evaluation used throughout our work. The concepts inherent in what constitutes the holistic character of security, together with the constituent elements of a security baseline, are defined accordingly. This allows the identification of what we have called the "roots of trust". Chapter 3 presents and analyses the difference and the relationships between the Risk Management and Security Management processes, in order to identify the constituent elements of the protection framework to be included in our evaluation model. Chapter 4 is devoted to the presentation of our evaluation model, the Information Security Assurance Assessment Model (ISAAM), and the way it meets the evaluation requirements presented earlier. In this chapter the underlying concepts of assurance and trust are analysed. Based on these two concepts, the structure of the evaluation model is developed to obtain a platform that offers a certain level of guarantee, relying on three evaluation attributes: "assurance structure", "process quality" and "achievement of requirements and objectives".
The issues related to each of these evaluation attributes are analysed on the basis of the state of the art in research and the literature, the various existing methods, and the norms and standards most commonly used in the security field. On this basis, three different evaluation levels are constructed: the assurance level, the quality level and the maturity level, which together form the basis for evaluating the overall security posture of an organization. The second part, "Application of the Information Security Assurance Assessment Model by security domain", is also composed of four chapters. The evaluation model previously built and analysed is placed in a specific context according to the four predefined security dimensions: the Organizational dimension, the Functional dimension, the Human dimension, and the Legal dimension. Each of these dimensions and its specific evaluation is the subject of a separate chapter. For each dimension, a two-phase evaluation is constructed as follows. The first phase concerns the identification of the elements that form the basis of the evaluation:
- identification of the key elements of the evaluation;
- identification of the Focus Areas for each dimension, which represent the security issues found in that dimension;
- identification of the Specific Factors for each Focus Area, which represent the security and control measures that help resolve or reduce the impact of the risks.
The second phase concerns the evaluation of each of the dimensions presented above. It consists, on the one hand, of applying the general evaluation model to the dimension concerned by:
- relying on the elements specified in the first phase;
- identifying the specific security tasks, processes and procedures that should have been carried out to reach the desired level of protection.
On the other hand, the evaluation of each dimension is completed by the proposal of a maturity model specific to that dimension, to be considered as a reference baseline for the overall security level. For each dimension we propose a generic maturity model that any organization can use to specify its own security requirements. This constitutes an innovation in the field of evaluation, which we justify for each dimension and whose added value we systematically highlight. The third part of the document concerns the overall validation of our proposal and contains, by way of conclusion, a critical perspective on our work and final remarks. This last part is completed by a bibliography and annexes. Our security evaluation model integrates and builds on numerous sources of expertise, such as best practices, norms, standards, methods and the expertise of scientific research in the field. Our constructive proposal addresses a real, as yet unresolved problem that all organizations must face, regardless of their size and profile.
This would allow organizations to specify their particular requirements regarding the level of security to be met and to instantiate an evaluation process specific to their needs, so that they can ensure that their information security is managed appropriately, thus offering a certain level of confidence in the degree of protection provided. We have incorporated into our model the best of the know-how, experience and expertise currently available internationally, with the aim of providing an evaluation model that is simple, generic and applicable to a large number of public or private organizations. The added value of our evaluation model lies precisely in the fact that it is sufficiently generic and easy to implement while answering the concrete needs of organizations. Our proposal thus constitutes a reliable, efficient and dynamic evaluation tool stemming from a coherent evaluation approach. As a result, our evaluation system can be implemented internally by the organization itself, without calling on additional resources, and likewise gives it the possibility of better governing its information security.
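An illustrative sketch only: one possible way to represent the hierarchy described above (security dimension, Focus Areas, Specific Factors) with a maturity level per factor, and a weakest-link view of the dimension. The example entries and the 0-5 scale are placeholders, not the thesis's actual focus areas, factors or formal model.

```python
# Sketch: dimension -> focus areas -> specific factors, scored by maturity.
from dataclasses import dataclass, field

@dataclass
class SpecificFactor:
    name: str
    maturity: int = 0          # e.g. 0 (non-existent) .. 5 (optimized), placeholder scale

@dataclass
class FocusArea:
    name: str
    factors: list = field(default_factory=list)

@dataclass
class Dimension:
    name: str                  # Organizational, Functional, Human or Legal
    focus_areas: list = field(default_factory=list)

    def maturity(self):
        factors = [f for fa in self.focus_areas for f in fa.factors]
        return min((f.maturity for f in factors), default=0)   # weakest-link view

org = Dimension("Organizational", [
    FocusArea("Security governance", [SpecificFactor("Security policy", 3),
                                      SpecificFactor("Roles and responsibilities", 2)]),
])
print(org.name, "maturity (weakest link):", org.maturity())
```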

Relevance: 30.00%

Abstract:

Turtle Mountain in Alberta, Canada, has become an important field laboratory for testing different techniques related to the characterization and monitoring of large slope mass movements, as the stability of large portions of the eastern face of the mountain is still questionable. In order to better quantify the potentially unstable volumes, the most probable failure mechanisms and the potential consequences, structural analysis and runout modeling were performed. The structural features of the eastern face were investigated using a high-resolution digital elevation model (HRDEM). According to displacement datasets and structural observations, potential failure mechanisms affecting different portions of the mountain have been assessed. The volumes of the different potentially unstable blocks have been calculated using the Sloping Local Base Level (SLBL) method. Based on the volume estimation, two- and three-dimensional dynamic runout analyses have been performed. Calibration of these analyses is based on experience from the adjacent Frank Slide and other similar rock avalanches. The results will be used to improve the contingency plans within the hazard area.
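A minimal sketch of the SLBL idea used for such volume estimates, not the study's implementation: iteratively lower each cell of the DEM toward the mean of its neighbours (minus a small tolerance) without ever rising above the original surface, then take the difference between the original DEM and the converged surface as the potentially unstable volume. The grid, tolerance, iteration count and periodic boundary handling below are illustrative simplifications.

```python
# Sketch: SLBL-style failure-surface estimation and volume calculation.
import numpy as np

def slbl(dem, tolerance=0.1, iterations=500):
    surf = dem.copy()
    for _ in range(iterations):
        # Mean of the four neighbours (np.roll gives periodic boundaries, a simplification).
        neigh_mean = (np.roll(surf, 1, 0) + np.roll(surf, -1, 0) +
                      np.roll(surf, 1, 1) + np.roll(surf, -1, 1)) / 4.0
        surf = np.minimum(surf, neigh_mean - tolerance)   # never rise above the current surface
    return surf

# Hypothetical 5 m resolution DEM patch of the unstable slope (elevations in m).
rng = np.random.default_rng(2)
dem = 1000.0 + rng.random((100, 100)) * 50.0

failure_surface = slbl(dem)
cell_area = 5.0 * 5.0
volume_m3 = ((dem - failure_surface) * cell_area).sum()
print(f"potentially unstable volume ≈ {volume_m3:.0f} m³")
```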

Relevance: 30.00%

Abstract:

The objective of this study was to assess genotype-by-environment interaction for seed yield per plant in rapeseed cultivars grown in Northern Serbia using the AMMI (additive main effects and multiplicative interaction) model. The study comprised 19 rapeseed genotypes, analyzed over seven years in field trials arranged in a randomized complete block design with three replicates. Seed yield per plant of the tested cultivars varied from 1.82 to 19.47 g across the seven seasons, with an average of 7.41 g. In the analysis of variance, 72.49% of the total yield variation was explained by environment, 7.71% by differences between genotypes, and 19.09% by genotype-by-environment interaction. On the biplot, cultivars with high yield genetic potential were positively correlated with the seasons with optimal growing conditions, while cultivars with lower yield potential were correlated with the years with unfavorable conditions. Seed yield per plant is highly influenced by environmental factors, which indicates the adaptability of specific genotypes to specific seasons.
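A short sketch of the AMMI decomposition itself: fit additive genotype and environment main effects, then apply a singular value decomposition to the residual interaction matrix to obtain the IPCA scores used for the biplot. The yield matrix below is random placeholder data, not the trial results.

```python
# Sketch: AMMI = additive main effects + SVD of the G x E interaction residual.
import numpy as np

rng = np.random.default_rng(3)
yields = rng.normal(loc=7.4, scale=2.0, size=(19, 7))   # 19 genotypes x 7 seasons (placeholder means)

grand = yields.mean()
gen_eff = yields.mean(axis=1) - grand          # genotype main effects
env_eff = yields.mean(axis=0) - grand          # environment main effects

# Residual interaction matrix = data minus the additive model.
interaction = yields - grand - gen_eff[:, None] - env_eff[None, :]

# Multiplicative part: SVD of the interaction.
U, s, Vt = np.linalg.svd(interaction, full_matrices=False)
ipca1_genotype = U[:, 0] * np.sqrt(s[0])       # genotype scores on IPCA1 (biplot axis)
ipca1_environment = Vt[0, :] * np.sqrt(s[0])   # environment scores on IPCA1

explained = s**2 / (s**2).sum()
print("share of G x E interaction captured by IPCA1:", round(float(explained[0]), 3))
```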

Relevance: 30.00%

Abstract:

BACKGROUND: Colonic endoscopic submucosal dissection (ESD) is challenging because of the limited ability of conventional endoscopic instruments to achieve traction and exposure. The aim of this study was to evaluate the feasibility of colonic ESD in a porcine model using a novel endoscopic surgical platform, the Anubiscope (Karl Storz, Tuttlingen, Germany), equipped with two working channels for surgical instruments with four degrees of freedom, offering surgical triangulation. METHODS: Nine ESDs were performed with the Anubiscope by a surgeon without any ESD experience in three swine, at 25, 15, and 10 cm above the anal verge. Sixteen ESDs were performed by an experienced endoscopist in five swine using conventional endoscopic instruments. Major ESD steps were the same for both groups: scoring the area, submucosal injection of glycerol, precut, and submucosal dissection. Outcomes measured were dissection time and speed, specimen size, en bloc dissection, and complications. RESULTS: No perforations occurred in the Anubis group, whereas there were eight perforations (50%) in the conventional group (p = 0.02). Complete and en bloc dissections were achieved in all cases in the Anubis group. Mean dissection time for completed cases was significantly shorter in the Anubis group (32.3 ± 16.1 vs. 55.87 ± 7.66 min; p = 0.0019). Mean specimen size was larger in the conventional group (1321 ± 230 vs. 927.77 ± 229.96 mm²; p = 0.003), but mean dissection speed was similar (35.95 ± 18.93 vs. 23.98 ± 5.02 mm²/min in the Anubis and conventional groups, respectively; p = 0.1). CONCLUSIONS: Colonic ESD was feasible in pig models with the Anubiscope. This surgical endoscopic platform is promising for endoluminal surgical procedures such as ESD, as it is user-friendly, effective, and safe.

Relevance: 30.00%

Abstract:

The purpose of this project is to develop an investment analysis model that integrates the capabilities of four types of analysis for use in evaluating interurban transportation system improvements. The project will also explore the use of new data warehousing and mining techniques to design the types of databases required for supporting such a comprehensive transportation model. The project consists of four phases. The first phase, which is documented in this report, involves development of the conceptual foundation for the model. Prior research is reviewed in Chapter 1, which is composed of three major sections providing demand modeling background information for passenger transportation, transportation of freight (manufactured products and supplies), and transportation of natural resources and agricultural commodities. Material from the literature on geographic information systems makes up Chapter 2. Database models for the national and regional economies and for the transportation and logistics network are conceptualized in Chapter 3. Demand forecasting of transportation service requirements is introduced in Chapter 4, with separate sections for passenger transportation, freight transportation, and transportation of natural resources and commodities. Characteristics and capacities of the different modes, modal choices, and route assignments are discussed in Chapter 5. Chapter 6 concludes with a general discussion of the economic impacts and feedback of multimodal transportation activities and facilities.

Relevance: 30.00%

Abstract:

We report Monte Carlo results for a nonequilibrium Ising-like model in two and three dimensions. Nearest-neighbor interactions J change sign randomly with time due to competing kinetics. There follows a fast and random, i.e., spin-configuration-independent diffusion of Js, of the kind that takes place in dilute metallic alloys when magnetic ions diffuse. The system exhibits steady states of the ferromagnetic (antiferromagnetic) type when the probability p that J>0 is large (small) enough. No counterpart to the freezing phenomena found in quenched spin glasses occurs. We compare our results with existing mean-field and exact ones, and obtain information about critical behavior.
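A minimal two-dimensional sketch of the competing kinetics described above, not the paper's production code: standard Metropolis spin flips on a square lattice whose nearest-neighbour couplings are repeatedly reassigned at random, +|J| with probability p and -|J| otherwise, independently of the spin configuration. Lattice size, temperature, p and sweep count are arbitrary illustrative values.

```python
# Sketch: Ising-like model with configuration-independent random re-assignment of J signs.
import numpy as np

rng = np.random.default_rng(4)
L, T, p, sweeps = 32, 2.0, 0.9, 200
spins = rng.choice([-1, 1], size=(L, L))
Jr = rng.choice([1.0, -1.0], p=[p, 1 - p], size=(L, L))   # bond to the right neighbour
Jd = rng.choice([1.0, -1.0], p=[p, 1 - p], size=(L, L))   # bond to the bottom neighbour

def local_field(i, j):
    return (Jr[i, j] * spins[i, (j + 1) % L] + Jr[i, (j - 1) % L] * spins[i, (j - 1) % L] +
            Jd[i, j] * spins[(i + 1) % L, j] + Jd[(i - 1) % L, j] * spins[(i - 1) % L, j])

for _ in range(sweeps):
    # Fast, spin-configuration-independent "diffusion" of the couplings.
    Jr = rng.choice([1.0, -1.0], p=[p, 1 - p], size=(L, L))
    Jd = rng.choice([1.0, -1.0], p=[p, 1 - p], size=(L, L))
    for _ in range(L * L):                       # one Metropolis sweep
        i, j = rng.integers(L), rng.integers(L)
        dE = 2.0 * spins[i, j] * local_field(i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

print("magnetisation per spin:", abs(spins.mean()))
```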

Relevance: 30.00%

Abstract:

Vitamin K antagonists (VKAs) are prescribed worldwide and remain the oral anticoagulants of choice. These drugs are characterized by a narrow therapeutic index and large inter- and intra-individual variability. P-glycoprotein (P-gp) could contribute to this variability. The aim of this study was to investigate the involvement of P-gp in the transport of acenocoumarol, phenprocoumon and warfarin using an in vitro Caco-2 cell monolayer model. The results were compared with those obtained with rivaroxaban, a new oral anticoagulant known to be a P-gp substrate. The transport of these four drugs was assessed at pH 6.8/7.4 in the presence or absence of the P-gp inhibitor cyclosporine A (10 μM) and the more potent and specific P-gp inhibitor valspodar (5 μM). Analytical quantification was performed by LC/MS. With an efflux ratio of 1.7 and a significant decrease in the efflux (Papp B-A) in the presence of P-gp inhibitors at a concentration of 50 μM, acenocoumarol can be considered a weak P-gp substrate. Concerning phenprocoumon, the results suggest that this molecule is a poor P-gp substrate. The P-gp inhibitors did not significantly affect the transport of warfarin. The efflux of rivaroxaban was strongly inhibited by the two P-gp inhibitors. In conclusion, none of the three VKAs tested is a strong P-gp substrate; however, acenocoumarol can be considered a weak P-gp substrate and phenprocoumon a poor P-gp substrate.
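An illustrative calculation, with hypothetical numbers rather than the study's data, of the quantities used to judge P-gp substrates in a Caco-2 monolayer assay: the apparent permeability Papp in each direction and the efflux ratio Papp(B-A)/Papp(A-B). An efflux ratio near 1.7, as reported for acenocoumarol, corresponds to only mildly asymmetric transport.

```python
# Sketch: apparent permeability and efflux ratio from a bidirectional transport experiment.
def papp_cm_per_s(dq_dt_ng_per_s, area_cm2, c0_ng_per_ml):
    """Papp = (dQ/dt) / (A * C0), with dQ/dt the rate of drug appearance in the receiver chamber."""
    return dq_dt_ng_per_s / (area_cm2 * c0_ng_per_ml)   # ng/ml == ng/cm^3, so the result is cm/s

area = 1.12          # cm^2, a typical Transwell insert (assumed)
c0 = 10_000.0        # ng/ml donor concentration (hypothetical)

papp_ab = papp_cm_per_s(dq_dt_ng_per_s=0.9, area_cm2=area, c0_ng_per_ml=c0)   # apical -> basolateral
papp_ba = papp_cm_per_s(dq_dt_ng_per_s=1.5, area_cm2=area, c0_ng_per_ml=c0)   # basolateral -> apical

efflux_ratio = papp_ba / papp_ab
print(f"Papp A->B = {papp_ab:.2e} cm/s, Papp B->A = {papp_ba:.2e} cm/s, efflux ratio = {efflux_ratio:.2f}")
```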

Relevance: 30.00%

Abstract:

Electrical deep brain stimulation (DBS) is an effective method for treating movement disorders. Many models of DBS, based mostly on finite elements, have recently been proposed to better understand the interaction between the electrical stimulation and the brain tissues. In monopolar DBS, which is widely used clinically, the implanted pulse generator (IPG) serves as the reference electrode (RE). In this paper, the influence of the RE model in monopolar DBS is investigated. For that purpose, a finite element model of the full electric loop, including the head, the neck and the superior chest, is used. The head, neck and superior chest are modelled with simple structures such as parallelepipeds and cylinders. The tissues surrounding the electrode are accurately modelled from data provided by diffusion tensor magnetic resonance imaging (DT-MRI). Three different RE configurations are compared with a commonly used model of reduced size. The electrical impedance seen by the DBS system and the potential distribution are computed for each model. Moreover, axons are modelled to compute the area of tissue activated by the stimulation. Results show that these indicators are influenced by the surface and position of the RE. Using an RE model corresponding to the implanted device, rather than the usual simplified model, leads to an increase in system impedance (+48%) and a reduction in the area of activated tissue (-15%).
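A rough sketch of one common way an "area of activated tissue" is estimated from a computed potential distribution, offered only as an illustration of the concept: sample the extracellular potential along straight model axons, take the second spatial difference along each fibre (the activating function), and count fibres whose peak exceeds a threshold. Here a point-source potential in a homogeneous medium stands in for the finite element solution, and the conductivity, current and threshold values are arbitrary assumptions, not the paper's parameters.

```python
# Sketch: activation radius from an activating-function threshold around a point source.
import numpy as np

sigma = 0.2          # S/m, assumed homogeneous conductivity
I = 1e-3             # A, assumed stimulation current magnitude
threshold_mV = 15.0  # arbitrary activation threshold on the activating function

def potential_mV(r_m):
    return 1e3 * I / (4 * np.pi * sigma * r_m)      # point-source potential in mV

def axon_activated(distance_m, node_spacing_m=0.5e-3, n_nodes=41):
    """Straight axon parallel to the z-axis at a given radial distance from the contact."""
    z = (np.arange(n_nodes) - n_nodes // 2) * node_spacing_m
    v = potential_mV(np.sqrt(distance_m**2 + z**2))
    activating_fn = np.diff(v, 2)                    # second difference along the fibre
    return activating_fn.max() > threshold_mV

distances = np.linspace(0.5e-3, 6e-3, 100)
activated = np.array([axon_activated(d) for d in distances])
radius = distances[activated].max() if activated.any() else 0.0
print(f"radius of activation ≈ {radius * 1e3:.2f} mm -> area ≈ {np.pi * (radius * 1e3) ** 2:.2f} mm²")
```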