962 results for Computer Modelling


Relevance:

20.00%

Publisher:

Abstract:

Knowing a company's internal interfaces makes it possible to manage the exchange of information across the organisation. Turning an idea into a profitable innovation requires a seamless process chain and information flow running through the different parts of the organisation. The aim of this thesis was to model the exchange of information between two functionally different parts of an organisation. The exchange of information was described as an interface, an information interface. A three-dimensional organisation model formed the main theory of the study. It was linked to the production and sales units of the company, as well as to the new service development process created in the BestServ project. The new service development process was extended with the process model described in the ISO/IEC 15288 standard. Enterprise architecture frameworks were used as the basis for the modelling. The name "information interface" reflects the view that information [knowledge] is by nature something that exists between individuals or groups. Current modelling methods, however, do not yet make it possible to model all the properties associated with information [knowledge]. The information interface model consists of three parts, two of which are presented graphically and one as a table. The model can be used on its own or as part of an enterprise architecture. In industrial service business, both the information interface modelling method and the model created with it can help a company in the machine-shop industry to understand its development needs and targets when it seeks a larger role in its customers' business through the provision of services. The information interface model can also support the modelling and management of the organisation's information assets and knowledge, and thus help to combine them into a whole that serves the company's strategy. Information interface modelling offers knowledge management research in business studies a methodology for studying the management of innovations and the renewal capability of an organisation. Both research areas need more precise information and better means to manage information flows, the exchange of information and the use of the organisation's information assets.

Relevance:

20.00%

Publisher:

Abstract:

Machines can often be divided into subsystems: control systems, force-producing actuators and force-transmitting mechanisms. The individual subsystems have been simulated with computers for several decades, but combining the subsystems into one model is a more recent development. In mechanism modelling, for example, the force produced by an actuator is often described as a constant or as a force varying as a function of time. Correspondingly, in actuator analysis, the load transmitted from the mechanism to the actuator is described as a constant force or as a time-dependent load representing the duty cycle. When the subsystems are separated from each other in this way, examining the interactions between them is very inaccurate, and taking the effect of one subsystem on the behaviour of the whole system into account is difficult. Numerical modelling methods particularly suited to computers have been developed for the dynamics of mechanisms. Most of them are based on the Lagrangian method, which allows the model to be formulated in freely chosen coordinate variables. To make a numerical solution possible, the system of differential-algebraic equations produced by the method has to be manipulated, for example by differentiating the constraint equations twice. In the original numerical solutions of the method, all generalized coordinates describing the mechanism are integrated at every time step. In methods derived from this basic approach, either the independent generalized coordinates are integrated and the dependent coordinates are solved from the constraint equations, or the size of the equation system is reduced, for example by using different rotational coordinates in the velocity and acceleration analyses than in the position analysis. Most integration methods were originally intended for solving ordinary differential equations (ODEs), so the algebraic constraint equations describing the joints may cause problems. Correcting the violations of the joint constraints, i.e. stabilization, is essential for the success of a mechanism dynamics simulation and for the correctness of its results. The principle of virtual work used in deriving the modelling methods assumes that the constraint forces do no work, that is, that no displacement against the constraints occurs. Especially in longer analyses of complex systems the joint constraints are not satisfied exactly; the energy balance of the system is then violated and virtual energy is created in the system, which breaks the principle of virtual work, and the results are no longer valid. This report examines different types of modelling and solution methods and compares how well they perform in the numerical solution of simple mechanisms. The methods are evaluated in terms of solution efficiency, satisfaction of the joint constraints and conservation of the energy balance.
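As a minimal illustration of the equations at issue (standard textbook forms, not equations quoted from the report), the Lagrangian method with multipliers yields an index-3 differential-algebraic system; differentiating the constraints twice and adding Baumgarte-type feedback terms is one common way of stabilizing the joint constraints:

```latex
% Equations of motion from the Lagrangian method: generalized coordinates q,
% constraint equations C(q,t) = 0 and Lagrange multipliers \lambda.
M(q)\,\ddot{q} + C_q^{\mathsf T}\,\lambda = Q(q,\dot{q},t), \qquad C(q,t) = 0

% Constraints differentiated twice for the numerical solution, with Baumgarte
% feedback terms added to damp the constraint violation (\alpha, \beta chosen by the user).
C_q\,\ddot{q} = Q_d - 2\alpha\,\dot{C} - \beta^{2}\,C
```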

Relevance:

20.00%

Publisher:

Abstract:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructers" try to decipher its past. Since they have known that continents move, geologists have been trying to retrieve the distribution of the continents through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly in the oceanic realm but are included in a larger set combining continental and oceanic crust: the tectonic plates. Unfortunately, mainly for technical and historical reasons, this idea still does not receive a sufficient echo among the reconstruction community. However, we are intimately convinced that, by applying specific methods and principles, we can escape the traditional "Wegenerian" point of view and, at last, reach real plate tectonics. The main aim of this study is to defend this point of view by exposing, with all necessary details, our methods and tools. Starting with the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a new methodology placing the plates and their kinematics at the centre of the problem. Using assemblies of continents (referred to as "key assemblies") as anchors distributed over the whole scope of the study (ranging from Eocene to Cambrian time), we develop geodynamic scenarios leading from one to the next, from the past to the present. In between, lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution and always form an interconnected network through space. This "dynamic plate boundaries" approach integrates plate buoyancy, ocean spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides strong constraints for the model. This multi-source approach requires efficient data management. Prior to this study, the critical mass of necessary data had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatics tools specifically devoted to storing, analysing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in the ArcGIS software, we converted the mass of scattered data offered by the geological record into valuable geodynamic information easily accessible for the creation of reconstructions. At the same time, by programming specific tools, we both facilitated the reconstruction work (task automation) and enhanced the model, greatly increasing the kinematic control of plate motions through plate velocity models. Based on the 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset, we can now tackle major issues of modern geology such as global sea-level variations and climate change. We started by studying one of the major unsolved issues of modern plate tectonics: the driving mechanism of plate motions. We observed that, all along the Earth's history, plate rotation poles (describing plate motions across the Earth's surface) tend to follow a roughly linear distribution along a band going from the northern Pacific through northern South America, the central Atlantic, North Africa and Central Asia up to Japan. Essentially, this means that plates tend to escape this median plane. In the absence of an unidentified methodological bias, we interpreted this as a potential secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstructed the margins (active or passive), the mid-ocean ridges and the intra-oceanic subductions. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering better precision than all previously existing ones.
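As a toy illustration of the kind of kinematic bookkeeping such a reconstruction workflow relies on (not code from the PaleoDyn project; the pole and point below are invented), the following sketch rotates a point on the sphere about an assumed Euler pole using Rodrigues' rotation formula:

```python
import numpy as np

def to_cartesian(lat_deg, lon_deg):
    """Unit vector for a (lat, lon) point on a spherical Earth."""
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def rotate_about_pole(point, pole, angle_deg):
    """Rotate 'point' about the Euler 'pole' (both unit vectors) by angle_deg,
    using Rodrigues' rotation formula."""
    a = np.radians(angle_deg)
    k = pole / np.linalg.norm(pole)
    return (point * np.cos(a)
            + np.cross(k, point) * np.sin(a)
            + k * np.dot(k, point) * (1.0 - np.cos(a)))

def to_latlon(v):
    """Back to (lat, lon) in degrees."""
    v = v / np.linalg.norm(v)
    return np.degrees(np.arcsin(v[2])), np.degrees(np.arctan2(v[1], v[0]))

# Hypothetical example: rotate a point by 20 degrees about an assumed pole.
point = to_cartesian(10.0, -30.0)
pole = to_cartesian(60.0, 45.0)   # assumed Euler pole, for illustration only
print(to_latlon(rotate_about_pole(point, pole, 20.0)))
```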

Relevance:

20.00%

Publisher:

Abstract:

Over 70% of the total costs of an end product are consequences of decisions made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems; an appropriate algorithm must therefore be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and a computer program that reduce the modelling and optimisation time in structural design. The program needed an optimisation algorithm suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and the optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed; the programs combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and in the steel structures of flat and ridge roofs. This thesis demonstrates that the most time-consuming part of the work, the modelling, is significantly reduced, modelling errors are reduced and the results are more reliable. A new selection rule for the evolutionary algorithm, which eliminates the need for constraint weight factors, is tested with optimisation cases of steel structures that include hundreds of constraints. The tested algorithm can be used nearly as a black box, without parameter settings or penalty factors for the constraints.
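The abstract does not spell out the new selection rule itself; as a hedged sketch of the general idea of penalty-free constraint handling in an evolutionary algorithm, the comparison below follows the widely used feasibility rules (feasible beats infeasible, smaller total violation beats larger), which avoid constraint weight factors. The design data are invented.

```python
def total_violation(constraints):
    """Sum of positive parts of g_i(x) <= 0 constraint values; 0 means feasible."""
    return sum(max(0.0, g) for g in constraints)

def better(cand_a, cand_b):
    """Penalty-free tournament rule between two candidates.

    Each candidate is a tuple (objective_value, constraint_values).
    Returns True if cand_a should be preferred over cand_b.
    """
    fa, ga = cand_a
    fb, gb = cand_b
    va, vb = total_violation(ga), total_violation(gb)
    if va == 0.0 and vb == 0.0:      # both feasible: compare objectives
        return fa < fb
    if va == 0.0 or vb == 0.0:       # only one feasible: it wins
        return va == 0.0
    return va < vb                   # both infeasible: smaller violation wins

# Hypothetical usage: choose the survivor between two steel-frame designs.
design_a = (1520.0, [-0.1, -0.3])    # mass in kg, all constraints satisfied
design_b = (1480.0, [0.05, -0.2])    # lighter but violates one constraint
print("keep design A" if better(design_a, design_b) else "keep design B")
```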

Relevance:

20.00%

Publisher:

Abstract:

The present study deals with two different servo systems. In the first, a servo-hydraulic system is identified and then controlled by a fuzzy gain-scheduling controller. In the second, an electro-magnetic linear motor, the suppression of mechanical vibration and the position tracking of a reference model are studied using a neural network and an adaptive backstepping controller, respectively. The research methods are described below. Electro-hydraulic servo systems (EHSS) are commonly used in industry. These systems are nonlinear in nature and their dynamic equations contain several unknown parameters. System identification is a prerequisite for the analysis of a dynamic system. One of the most promising recent evolutionary algorithms for solving global optimization problems is Differential Evolution (DE). In this study, the DE algorithm is proposed for handling nonlinear constraint functions with boundary limits on the variables in order to find the best parameters of a servo-hydraulic system with a flexible load. DE provides fast convergence and accurate solutions regardless of the initial parameter values. The control of hydraulic servo systems has been the focus of intense research over the past decades. These systems are nonlinear in nature and generally difficult to control, since a change in system parameters under fixed gains will cause overshoot or even loss of stability. The highly nonlinear behaviour of these devices makes them ideal subjects for sophisticated controllers. The study is concerned with positioning control of a flexible-load servo-hydraulic system against a second-order reference model using fuzzy gain scheduling. To compensate for the lack of damping in the hydraulic system, acceleration feedback was used. For comparison, a P controller with feed-forward acceleration and different gains in extension and retraction was used. The design procedure for the controller and the experimental results are discussed. The results suggest that the fuzzy gain-scheduling controller decreases the position reference tracking error. The second part of the research was done on a permanent magnet linear synchronous motor (PMLSM). Here, a recurrent neural network compensator for suppressing mechanical vibration in a PMLSM with a flexible load is studied. The linear motor is controlled by a conventional PI velocity controller, and the vibration of the flexible mechanism is suppressed by a hybrid recurrent neural network. A differential evolution strategy and a Kalman filter are used to avoid the local minimum problem and to estimate the states of the system, respectively. The proposed control method is first designed using a nonlinear simulation model built in Matlab Simulink and then implemented on a practical test rig. The method works satisfactorily and suppresses the vibration successfully. In the last part of the research, a nonlinear load control method is developed and implemented for a PMLSM with a flexible load. The purpose of the controller is to drive the flexible load to the desired position reference as fast as possible and without awkward oscillation. The control method is based on an adaptive backstepping algorithm whose stability is ensured by the Lyapunov stability theorem. The states of the system needed by the controller are estimated using the Kalman filter.
The proposed controller is implemented and tested in a linear motor test drive and responses are presented.
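As a minimal, hedged illustration of how Differential Evolution can be used for parameter identification (not the study's actual hydraulic model; the "measured" data and true parameters below are simulated placeholders), the sketch fits two unknown parameters of a first-order system to noisy data by minimizing the squared output error within box bounds, using SciPy's differential_evolution:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Simulated "measurement": first-order system dy/dt = -a*y + b*u with a unit step
# input, discretized by forward Euler. The true parameters are assumed for illustration.
DT, N = 0.01, 500
TRUE_A, TRUE_B = 4.0, 2.0

def simulate(a, b):
    y = np.zeros(N)
    for k in range(N - 1):
        y[k + 1] = y[k] + DT * (-a * y[k] + b * 1.0)   # unit step input
    return y

measured = simulate(TRUE_A, TRUE_B) + np.random.normal(0.0, 0.01, N)

def cost(params):
    """Sum of squared errors between the simulated and the measured response."""
    a, b = params
    return np.sum((simulate(a, b) - measured) ** 2)

# Box limits for the unknown parameters (boundary limits of the variables).
result = differential_evolution(cost, bounds=[(0.1, 20.0), (0.1, 10.0)], seed=1)
print("identified parameters:", result.x)
```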

Relevance:

20.00%

Publisher:

Abstract:

Existing digital rights management (DRM) systems, initiatives like Creative Commons, and research works such as some digital rights ontologies provide limited support for modelling and managing content value chains. This is becoming a critical issue as content markets start to profit from the possibilities of digital networks and the World Wide Web. The objective is to support the whole copyrighted-content value chain across enterprise and business-niche boundaries. Our proposal provides a framework that accommodates copyright law and a rich creation model in order to cope with all stages of the creation life cycle. The dynamic aspects of value chains are modelled using a hybrid approach that combines ontology-based and rule-based mechanisms. The ontology implementation is based on the Web Ontology Language with Description Logic (OWL-DL); reasoners are used directly for license checking. For the more complex aspects of the dynamics of content value chains, rule languages are the choice.
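As a schematic, hypothetical illustration of the rule-based side of such a hybrid approach (none of the names or rules below come from the paper), the sketch walks a small value chain of derived works and checks whether every upstream license grants the requested action:

```python
# Hypothetical value chain: each work may derive from a parent work and carries
# the set of actions its license grants to downstream users.
WORKS = {
    "photo":     {"parent": None,      "granted": {"reproduce", "adapt"}},
    "collage":   {"parent": "photo",   "granted": {"reproduce"}},
    "anthology": {"parent": "collage", "granted": set()},
}

def action_permitted(work, action):
    """Toy rule: an action on a work is permitted only if every ancestor in the
    value chain has granted that action downstream."""
    current = WORKS[work]["parent"]
    while current is not None:
        if action not in WORKS[current]["granted"]:
            return False
        current = WORKS[current]["parent"]
    return True

print(action_permitted("collage", "adapt"))     # True: the photo grants adaptation
print(action_permitted("anthology", "adapt"))   # False: the collage did not grant adaptation
```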

Relevance:

20.00%

Publisher:

Abstract:

It has been convincingly argued that computer simulation modeling differs from traditional science. If we understand simulation modeling as a new way of doing science, the manner in which scientists learn about the world through models must also be considered differently. This article examines how researchers learn about environmental processes through computer simulation modeling. Suggesting a conceptual framework anchored in a performative philosophical approach, we examine two modeling projects undertaken by research teams in England, both aiming to inform flood risk management. One of the modeling teams operated in the research wing of a consultancy firm; the other consisted of university scientists taking part in an interdisciplinary project experimenting with public engagement. We found that in the first context the use of standardized software was critical to the process of improvisation; the obstacles that emerged concerned data and were resolved by exploiting affordances for generating, organizing, and combining scientific information in new ways. In the second context, an environmental competency group, the obstacles were related to the computer program, and affordances emerged from combining experience-based knowledge with the scientists' skill, enabling a reconfiguration of the mathematical structure of the model and allowing the group to learn about local flooding.

Relevance:

20.00%

Publisher:

Abstract:

Many classification systems rely on clustering techniques in which a collection of training examples is provided as input and a number of clusters c1, ..., cm modelling some concept C results as output, such that every cluster ci is labelled as positive or negative. Given a new, unlabelled instance e_new, this clustering is used to determine to which particular cluster ci the new instance belongs. In such a setting clusters can overlap, and a new unlabelled instance can be assigned to more than one cluster with conflicting labels. In the literature, such a case is usually resolved non-deterministically by making a random choice. This paper presents a novel, hybrid approach to this situation, combining a neural network for classification with a defeasible argumentation framework that models preference criteria for performing the clustering.
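A toy sketch of the situation being addressed (not the paper's actual system, and with a plain distance-based preference standing in for the defeasible argumentation framework): an instance falls inside two overlapping clusters with conflicting labels, and instead of a random choice a simple preference criterion decides the label.

```python
import numpy as np

# Hypothetical clusters: centre, radius and label (positive / negative).
CLUSTERS = [
    {"centre": np.array([0.0, 0.0]), "radius": 2.0, "label": "positive"},
    {"centre": np.array([1.5, 0.0]), "radius": 2.0, "label": "negative"},
]

def classify(instance):
    """Assign the instance to every cluster containing it; if the labels of the
    containing clusters conflict, prefer the cluster whose centre is closest
    instead of choosing at random."""
    containing = [c for c in CLUSTERS
                  if np.linalg.norm(instance - c["centre"]) <= c["radius"]]
    if not containing:
        return "unknown"
    labels = {c["label"] for c in containing}
    if len(labels) == 1:
        return labels.pop()
    # Conflict: apply the preference criterion (distance to the cluster centre).
    best = min(containing, key=lambda c: np.linalg.norm(instance - c["centre"]))
    return best["label"]

print(classify(np.array([0.7, 0.0])))   # inside both clusters; closer to the first
```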

Relevance:

20.00%

Publisher:

Abstract:

This study was carried out to develop functions that could explain the growth of Oxalis latifolia, both in its early stages and throughout the season, contributing to the improvement of its cultural control. Bulbs of the Cornwall form of O. latifolia were buried at 1 and 8 cm in March 1999 and 2000. Samples were taken destructively at fixed times, and at each time the corresponding BBCH scale codes as well as the absolute numbers of growing and adult leaves were recorded. Using the absolute number of adult leaves (transformed to percentages), a three-parameter Gaussian curve that explains the growth during the season (R² = 0.9355) was developed. The BBCH scale permitted the fit of two regression lines that were accurately adjusted for each burial depth (R² = 0.9969 and R² = 0.9930 for 1 and 8 cm, respectively). The best moment for an early defoliation in northern Spain could be calculated from these regression lines and was found to be the second week of May. In addition, it was observed that a burial depth of 8 cm does not affect the growth pattern of the weed, but it does affect the number of leaves produced, which decreases to less than half of that produced at 1 cm.
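As a hedged sketch of fitting the kind of three-parameter Gaussian growth curve described above (the observations below are invented placeholders, not the study's measurements), SciPy's curve_fit can estimate the amplitude, peak timing and spread from percentage leaf counts over the season:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(day, amplitude, peak_day, spread):
    """Three-parameter Gaussian: percentage of adult leaves versus day of year."""
    return amplitude * np.exp(-((day - peak_day) ** 2) / (2.0 * spread ** 2))

# Placeholder observations (day of year, % of the seasonal maximum of adult leaves).
days = np.array([100, 120, 140, 160, 180, 200, 220, 240, 260], dtype=float)
leaf_pct = np.array([5, 18, 45, 78, 98, 90, 60, 28, 10], dtype=float)

params, _ = curve_fit(gaussian, days, leaf_pct, p0=[100.0, 180.0, 40.0])
amplitude, peak_day, spread = params
print(f"amplitude={amplitude:.1f}%, peak day={peak_day:.0f}, spread={spread:.1f} days")
```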

Relevance:

20.00%

Publisher:

Abstract:

To preserve the biodiversity of Mediterranean European forest ecosystems under current and future global change scenarios through sustainable forest management, it is necessary to determine how the environment and the characteristics of the forests themselves influence the biodiversity they host. For this purpose, the influence of different environmental factors and of forest structure and composition on forest bird richness was analysed at a 1 × 1 km scale in Catalonia (NE Spain). Univariate and multivariate neural network models were built, respectively, to explore the individual response to each variable and to obtain a parsimonious (ecologically interpretable) and accurate model. Forest area (with canopy cover above 5%), mean canopy cover, mean annual temperature and mean summer precipitation were the best predictors of forest bird richness. The resulting multivariate neural network generalized well except at the sites with the highest richness. In addition, forests with different degrees of canopy openness, greater maturity and a more diverse tree species composition were positively associated with higher forest bird richness. Finally, management guidelines for forest planning are provided to promote bird diversity in this Mediterranean European region.
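A minimal sketch of the modelling approach described above, using invented predictor names and random placeholder data rather than the Catalan dataset; scikit-learn's MLPRegressor stands in for the neural networks relating environmental and forest-structure variables to bird richness:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder 1x1 km cells: forest area, canopy cover, annual temperature, summer rain.
X = rng.uniform(size=(500, 4))
# Invented response: richness increasing with forest area and canopy cover, plus noise.
y = 10 * X[:, 0] + 5 * X[:, 1] + rng.normal(scale=1.0, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out cells:", model.score(X_test, y_test))
```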

Relevance:

20.00%

Publisher:

Abstract:

Magneto-active polymers are a class of smart materials commonly manufactured by mixing micron-sized iron particles in a rubber-like matrix. When cured in the presence of an externally applied magnetic field, the iron particles arrange themselves into chain-like structures that lend an overall anisotropy to the material. It has been observed through electron micrographs and X-ray tomographs that these chains are not always perfect in structure, and may have dispersion due to the conditions present during manufacturing or some undesirable material properties. We model the response of these materials to coupled magneto-mechanical loading in this paper using a probability based structure tensor that accounts for this imperfect anisotropy. The response of the matrix material is decoupled from the chain phase, though still being connected through kinematic constraints. The latter is based on the definition of a 'chain deformation gradient' and a 'chain magnetic field'. We conclude with numerical examples that demonstrate the effect of chain dispersion on the response of the material to magnetoelastic loading.
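The abstract does not give the exact form of the probability-based structure tensor; a commonly used dispersion form (shown here only as an illustrative assumption, not as the paper's definition) replaces the perfect-chain tensor a₀ ⊗ a₀ with a weighted mixture of an isotropic part and the mean chain direction:

```latex
% Generalized structure tensor with a scalar dispersion parameter \kappa
% (\kappa = 0: perfectly aligned chains, \kappa = 1/3: isotropic distribution),
% where \rho(\theta) is the orientation density about the mean direction a_0.
\mathbf{H} \;=\; \kappa\,\mathbf{I} \;+\; (1 - 3\kappa)\,\mathbf{a}_0 \otimes \mathbf{a}_0,
\qquad
\kappa \;=\; \tfrac{1}{4}\int_{0}^{\pi} \rho(\theta)\,\sin^{3}\theta\,\mathrm{d}\theta
```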

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To determine whether a mono-, bi- or tri-exponential model best fits the intravoxel incoherent motion (IVIM) diffusion-weighted imaging (DWI) signal of normal livers. MATERIALS AND METHODS: The pilot and validation studies were conducted in 38 and 36 patients with normal livers, respectively. The DWI sequence was performed using single-shot echo-planar imaging with 11 (pilot study) and 16 (validation study) b values. In each study, data from all patients were used to model the IVIM signal of normal liver. Diffusion coefficients (Di ± standard deviations) and their fractions (fi ± standard deviations) were determined for each model. The models were compared using the extra sum-of-squares test and information criteria. RESULTS: The tri-exponential model provided a better fit than both the bi- and mono-exponential models. The tri-exponential IVIM model determined three diffusion compartments: a slow (D1 = 1.35 ± 0.03 × 10⁻³ mm²/s; f1 = 72.7 ± 0.9 %), a fast (D2 = 26.50 ± 2.49 × 10⁻³ mm²/s; f2 = 13.7 ± 0.6 %) and a very fast (D3 = 404.00 ± 43.7 × 10⁻³ mm²/s; f3 = 13.5 ± 0.8 %) diffusion compartment [results from the validation study]. The very fast compartment contributed to the IVIM signal only for b values ≤ 15 s/mm². CONCLUSION: The tri-exponential model provided the best fit for IVIM signal decay in the liver over the 0-800 s/mm² range. In IVIM analysis of normal liver, a third very fast (pseudo)diffusion component might be relevant. KEY POINTS: • For normal liver, the tri-exponential IVIM model might be superior to the bi-exponential one. • A very fast compartment (D = 404.00 ± 43.7 × 10⁻³ mm²/s; f = 13.5 ± 0.8 %) is determined from the tri-exponential model. • This compartment contributes to the IVIM signal only for b ≤ 15 s/mm².
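For reference, the tri-exponential IVIM signal model discussed above has the general form below (standard notation; the unit-sum constraint on the fractions is the usual convention, not a detail quoted from the paper):

```latex
% Tri-exponential IVIM signal decay as a function of the b value
\frac{S(b)}{S_0} \;=\; f_1\,e^{-b D_1} \;+\; f_2\,e^{-b D_2} \;+\; f_3\,e^{-b D_3},
\qquad f_1 + f_2 + f_3 = 1
```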

Relevance:

20.00%

Publisher:

Abstract:

Summary: On the effects of agricultural policy reforms on the diversity of field use

Relevance:

20.00%

Publisher:

Abstract:

Monte Carlo (MC) simulations have been used to study the structure of an intermediate thermal phase of poly(α-octadecyl γ,D-glutamate). This is a comblike poly(γ-peptide) able to adopt a biphasic structure that has been described as a layered arrangement of backbone helical rods immersed in a paraffinic pool of polymethylene side chains. Simulations were performed at two different temperatures (348 and 363 K), both of them above the melting point of the paraffinic phase, using the configurational bias MC algorithm. Results indicate that the layers are constituted by a side-by-side packing of 17/5 helices. The organization of the interlayer paraffinic region is described in atomistic terms by examining the torsional angles and the end-to-end distances of the octadecyl side chains. Comparison with previously reported comblike poly(β-peptide)s revealed significant differences in the organization of the alkyl side chains.

Relevance:

20.00%

Publisher:

Abstract:

The current challenge in a context of major environmental changes is to anticipate the responses of species to future landscape and climate scenarios. In the Mediterranean basin, climate change is one of the most powerful driving forces of fire dynamics, with fire frequency and impact having markedly increased in recent years. Species distribution modelling plays a fundamental role in this challenge, but better integration of available ecological knowledge is needed to adequately guide conservation efforts. Here, we quantified changes in habitat suitability of an early-succession bird in Catalonia, the Dartford Warbler (Sylvia undata), globally evaluated as Near Threatened on the IUCN Red List. We assessed potential changes in species distributions between 2000 and 2050 under different fire management and climate change scenarios and described landscape dynamics using a spatially explicit fire-succession model that simulates fire impacts in the landscape and post-fire regeneration (the MEDFIRE model). Dartford Warbler occurrence data were acquired at two different spatial scales from 1) the Atlas of European Breeding Birds (EBCC) and 2) the Catalan Breeding Bird Atlas (CBBA). Habitat suitability was modelled using five widely used modelling techniques in an ensemble forecasting framework. Our results indicated considerable habitat suitability losses (ranging between 47% and 57% in baseline scenarios), which were modulated to a large extent by fire regime changes derived from fire management policies and climate change. This result highlights the need to take the spatial interaction between climate change, fire-mediated landscape dynamics and fire management policies into account in order to coherently anticipate habitat suitability changes for early-succession bird species. We conclude that fire management programs need to be integrated into conservation plans to effectively preserve sparsely forested and early-succession habitats and their associated species in the face of global environmental change.
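As a rough sketch of the ensemble forecasting idea mentioned above (invented data and only three common techniques, not the study's actual five-model setup or its predictors), several classifiers are fitted to presence/absence data and their suitability predictions are averaged:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)

# Placeholder cells: two invented environmental predictors and presence/absence.
X = rng.uniform(size=(300, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.2, size=300) > 0.8).astype(int)

models = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=200, random_state=0),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=3000, random_state=0),
]
for m in models:
    m.fit(X, y)

# Ensemble habitat suitability = mean of the per-model presence probabilities.
new_cells = rng.uniform(size=(5, 2))
suitability = np.mean([m.predict_proba(new_cells)[:, 1] for m in models], axis=0)
print(suitability)
```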