907 results for Strut-and-Tie Model


Relevance: 100.00%

Publisher:

Abstract:

A very accurate archaeological dating of a Roman site in NE Spain (El Vila-sec) was made based on the typology of pottery artifacts. Three different phases were identified, with activity ranging from the mid-1st century BC to the early-3rd century AD. Analyses of bricks from kilns at El Vila-sec produced data on their stored archaeomagnetic vector. These data were compared with the secular variation curve for the Iberian Peninsula and the SCHA.DIF.3K regional archaeomagnetic model. Both the reference curve and the model produced probability distributions for the final period of use for two kilns from the second archaeological phase that were not used during the third phase. At a 95% confidence level, both time distributions cover a wide chronological range including the presumed archaeological age. Both the Iberian secular variation curve and the SCHA.DIF.3K regional model proved to be suitable models for dating the site, although on their own they do not produce a single unambiguous solution. This archaeomagnetic approach could also be applied to neighbouring archaeological sites that have an imprecise archaeological age.
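The dating step described here, turning the comparison between a measured archaeomagnetic direction and a dated reference curve into a probability distribution of age, can be sketched as follows. This is a simplified illustration, not the study's actual algorithm: the Gaussian misfit, the regular age grid, and the function names are assumptions made for the example.

```python
import numpy as np

def dating_pdf(ages, ref_dec, ref_inc, ref_sigma, meas_dec, meas_inc, meas_sigma):
    """Probability density of the age of last use, from the misfit between a measured
    archaeomagnetic direction and a reference curve sampled on a regular age grid.
    All angles in degrees; `ref_sigma`/`meas_sigma` are directional uncertainties."""
    sigma2 = ref_sigma**2 + meas_sigma**2
    misfit = (meas_dec - ref_dec)**2 + (meas_inc - ref_inc)**2
    likelihood = np.exp(-0.5 * misfit / sigma2)
    return likelihood / np.trapz(likelihood, ages)   # normalise to a PDF over age

def credible_interval(ages, pdf, level=0.95):
    """Highest-density age range containing `level` of the probability (e.g. 95%)."""
    order = np.argsort(pdf)[::-1]
    step = ages[1] - ages[0]
    cumulative = np.cumsum(pdf[order]) * step
    kept = order[: np.searchsorted(cumulative, level) + 1]
    return ages[kept].min(), ages[kept].max()
```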

Relevance: 100.00%

Publisher:

Abstract:

In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings. While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used, but they aren't oriented towards helping find which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what's available rather than helping determine how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools aren't very scalable: a system-level method of analysis seldom works at the project level and vice versa.

In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or a county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded and in what sequence? Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective?

In considering these issues and questions, the author concluded that it would be of value to identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and utilize a broad viewpoint. After trying out a number of concepts, it appeared that a good approach would be to view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) that roads are but one sub-system of a much larger 'Road Based Transportation System', 2) that the size and activity level of the overall system are determined by market forces, 3) that the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from the market forces represents the total cost of transportation, and 4) that the economic purpose of making road improvements is to minimize that total cost. To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed.
This involved creating a physical model to represent the size, characteristics, activity levels, and the rates at which the activities take place, developing a companion economic cost model, and then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used at the full-system, partial-system, single-segment, project, and general design guide levels of analysis. The method appeared to be capable of remedying many of the defects of existing work methods and of answering society's transportation questions from a new perspective.
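To make the "Total Cost of Transportation" idea concrete, the sketch below scores two hypothetical alternatives for one road segment by summing all system costs rather than agency costs alone and picking the alternative with the lowest total. The cost categories, figures, and names are illustrative placeholders, not the thesis's Iowa database or spreadsheet model.

```python
from dataclasses import dataclass

@dataclass
class SystemCosts:
    road_construction: float      # agency cost of building/upgrading the segment
    road_maintenance: float       # maintenance, resurfacing, snow removal, ...
    vehicle_operation: float      # fuel, tires, depreciation for all users
    travel_time: float            # value of time spent by people and goods in transit
    crashes: float                # expected crash and injury costs
    land_and_environment: float   # land permanently reserved, environmental externalities

    def total(self) -> float:
        """Total cost of transportation for one alternative over the analysis period."""
        return (self.road_construction + self.road_maintenance + self.vehicle_operation
                + self.travel_time + self.crashes + self.land_and_environment)

def preferred(alternatives: dict) -> str:
    """Pick the alternative that minimises the total system cost."""
    return min(alternatives, key=lambda name: alternatives[name].total())

# Hypothetical example: paving a gravel segment raises agency cost but lowers user costs.
keep_gravel = SystemCosts(0.0, 40_000, 310_000, 95_000, 18_000, 12_000)
pave = SystemCosts(250_000, 25_000, 260_000, 80_000, 15_000, 12_000)
print(preferred({"keep gravel": keep_gravel, "pave": pave}))
```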

Relevance: 100.00%

Publisher:

Abstract:

Quality of life has been extensively discussed in acute and chronic illnesses. However, a dynamic model grounded in the experience of patients over the course of transplantation has not, to our knowledge, been developed. In a qualitative longitudinal study, patients awaiting solid organ transplantation participated in semi-structured interviews exploring topics pre-selected on the basis of a previous review of the research literature. The creative interview format was privileged, open to themes patients wished to discuss at the different steps of the transplantation process. A qualitative thematic and reflexive analysis was performed, and a model of the dimensions constitutive of quality of life from the perspective of the patients was elaborated. Quality of life is not a stable construct over a long-lasting illness course, but evolves with illness constraints, treatments, and outcomes. The dimensions constitutive of quality of life are defined, each containing different sub-categories depending on the organ-related illness co-morbidities and the stage of the illness course.

Relevance: 100.00%

Publisher:

Abstract:

The value of earmarks as an efficient means of personal identification is still subject to debate. It has been argued that the field lacks a firm, systematic and structured data basis to help practitioners form their conclusions. Typically, there is a paucity of research guidance as to the selectivity of the features used in the comparison process between an earmark and reference earprints taken from an individual. This study proposes a system for the automatic comparison of earprints and earmarks, operating without any manual extraction of key points or manual annotations. For each donor, a model is created using multiple reference prints, hence capturing the donor's within-source variability. For each comparison between a mark and a model, images are automatically aligned and a proximity score, based on a normalized 2D correlation coefficient, is calculated. Appropriate use of this score allows a likelihood ratio to be derived that can be explored under known states of affairs (both in cases where it is known that the mark has been left by the donor that gave the model and, conversely, in cases where it is established that the mark originates from a different source). To assess the system performance, a first dataset containing 1229 donors, compiled during the FearID research project, was used. Based on these data, for mark-to-print comparisons the system performed with an equal error rate (EER) of 2.3%, and about 88% of marks were found in the first 3 positions of a hitlist. When performing print-to-print comparisons, results show an equal error rate of 0.5%. The system was then tested using real-case data obtained from police forces.
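A minimal sketch of the two quantities named above: the normalized 2D correlation coefficient used as a proximity score, and the equal error rate used to summarise performance. It assumes the mark and model images are already aligned and of equal size; the study's alignment step and its score-to-likelihood-ratio conversion are not reproduced, and the function names are illustrative.

```python
import numpy as np

def normalized_correlation(mark: np.ndarray, model: np.ndarray) -> float:
    """Proximity score between two aligned, same-size grayscale images:
    the normalized 2D correlation coefficient (Pearson r over all pixels)."""
    a = mark.astype(float).ravel()
    b = model.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def equal_error_rate(genuine_scores, impostor_scores) -> float:
    """EER: error rate at the threshold where the false-acceptance rate
    (impostor scores above threshold) equals the false-rejection rate
    (genuine scores below threshold)."""
    genuine = np.asarray(genuine_scores, float)
    impostor = np.asarray(impostor_scores, float)
    best_gap, eer = np.inf, 1.0
    for t in np.sort(np.concatenate([genuine, impostor])):
        far = np.mean(impostor >= t)
        frr = np.mean(genuine < t)
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return float(eer)
```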

Relevance: 100.00%

Publisher:

Abstract:

Climate-driven range fluctuations during the Pleistocene have continuously reshaped species distribution leading to populations of contrasting genetic diversity. Contemporary climate change is similarly influencing species distribution and population structure, with important consequences for patterns of genetic diversity and species' evolutionary potential [1]. Yet few studies assess the impacts of global climatic changes on intraspecific genetic variation [2,3,4,5]. Here, combining analyses of molecular data with time series of predicted species distributions and a model of diffusion through time over the past 21 kyr, we unravel caribou response to past and future climate changes across its entire Holarctic distribution. We found that genetic diversity is geographically structured with two main caribou lineages, one originating from and confined to Northeastern America, the other originating from Euro-Beringia but also currently distributed in western North America. Regions that remained climatically stable over the past 21 kyr maintained a high genetic diversity and are also predicted to experience higher climatic stability under future climate change scenarios. Our interdisciplinary approach, combining genetic data and spatial analyses of climatic stability (applicable to virtually any taxon), represents a significant advance in inferring how climate shapes genetic diversity and impacts genetic structure.

Relevance: 100.00%

Publisher:

Abstract:

EXECUTIVE SUMMARY: Evaluating Information Security posture within an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately, this is ineffective because it does not take into consideration the necessity of having a global and systemic multidimensional approach to Information Security evaluation. At the same time, the overall security level is generally considered to be only as strong as its weakest link. This thesis proposes a model aiming to holistically assess all dimensions of security in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; this is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective. This dissertation is divided into three parts. Part One, Information Security Evaluation Issues, consists of four chapters. Chapter 1 is an introduction to the purpose of this research and the model that will be proposed. In this chapter we raise some questions with respect to "traditional evaluation methods" and identify the principal elements to be addressed in this direction. We then introduce the baseline attributes of our model and set out the expected results of evaluations according to our model. Chapter 2 is focused on the definition of Information Security to be used as a reference point for our evaluation model. The concepts inherent in a holistic, baseline Information Security Program are defined. Based on this, the most common roots-of-trust in Information Security are identified. Chapter 3 focuses on an analysis of the difference and the relationship between the concepts of Information Risk Management and Security Management. Comparing these two concepts allows us to identify the most relevant elements to be included within our evaluation model, while clearly situating these two notions within a defined framework is of the utmost importance for the results that will be obtained from the evaluation process. Chapter 4 sets out our evaluation model and the way it addresses issues relating to the evaluation of Information Security. Within this chapter the underlying concepts of assurance and trust are discussed. Based on these two concepts, the structure of the model is developed in order to provide an assurance-related platform as well as three evaluation attributes: "assurance structure", "quality issues", and "requirements achievement". Issues relating to each of these evaluation attributes are analysed with reference to sources such as methodologies, standards and published research papers. The operation of the model is then discussed: assurance levels, quality levels and maturity levels are defined in order to perform the evaluation according to the model. Part Two, Implementation of the Information Security Assurance Assessment Model (ISAAM) according to the Information Security Domains, also consists of four chapters. This is the section where our evaluation model is put into a well-defined context with respect to the four pre-defined Information Security dimensions: the Organizational dimension, the Functional dimension, the Human dimension, and the Legal dimension. Each Information Security dimension is discussed in a separate chapter.
For each dimension, the following two-phase evaluation path is followed. The first phase concerns the identification of the elements which will constitute the basis of the evaluation: identification of the key elements within the dimension; identification of the Focus Areas for each dimension, consisting of the security issues identified for that dimension; and identification of the Specific Factors for each dimension, consisting of the security measures or controls addressing the security issues identified for that dimension. The second phase concerns the evaluation of each Information Security dimension by: implementing the evaluation model, based on the elements identified for each dimension within the first phase, by identifying the security tasks, processes, procedures, and actions that should have been performed by the organization to reach the desired level of protection; and proposing a maturity model for each dimension as a basis for reliance on security. For each dimension we propose a generic maturity model that could be used by every organization in order to define its own security requirements. Part Three of this dissertation contains the Final Remarks, Supporting Resources and Annexes. With reference to the objectives of our thesis, the Final Remarks briefly analyse whether these objectives were achieved and suggest directions for future related research. The Supporting Resources comprise the bibliographic resources that were used to elaborate and justify our approach. The Annexes include all the relevant topics identified within the literature to illustrate certain aspects of our approach. Our Information Security evaluation model is based on and integrates different Information Security best practices, standards, methodologies and research expertise, which can be combined in order to define a reliable categorization of Information Security. After the definition of terms and requirements, an evaluation process should be performed in order to obtain evidence that the Information Security within the organization in question is adequately managed. We have specifically integrated into our model the most useful elements of these sources of information in order to provide a generic model able to be implemented in all kinds of organizations. The value added by our evaluation model is that it is easy to implement and operate and answers concrete needs in terms of reliance upon an efficient and dynamic evaluation tool through a coherent evaluation system. On that basis, our model could be implemented internally within organizations, allowing them to better govern their Information Security. RÉSUMÉ: General context of the thesis. Evaluating security in general, and information security in particular, has become for organizations not only a crucial mission to carry out, but also an increasingly complex one. At present, this evaluation relies mainly on methodologies, best practices, norms or standards that address the different aspects making up information security separately. We believe that this way of evaluating security is inefficient, because it does not take into account the interaction of the different dimensions and components of security with one another, even though it has long been accepted that the overall security level of an organization is always that of the weakest link in the security chain.
We have identified the need for a global, integrated, systemic and multidimensional approach to information security evaluation. Indeed, and this is the starting point of our thesis, we demonstrate that only a global consideration of security can meet the requirements of optimal security and the specific protection needs of an organization. Our thesis therefore proposes a new paradigm for security evaluation, designed to satisfy the effectiveness and efficiency needs of a given organization. We then propose a model that aims to evaluate all dimensions of security holistically, in order to minimize the probability that a potential threat could exploit vulnerabilities and cause direct or indirect damage. This model is based on a formalized structure that takes into account all the elements of a security system or program. We thus propose a methodological evaluation framework that considers information security from a global perspective. Structure of the thesis and topics addressed: the document is structured in three parts. The first, entitled "The problem of information security evaluation", consists of four chapters. Chapter 1 introduces the object of the research as well as the basic concepts of the proposed evaluation model. The traditional way of evaluating security is critically analysed in order to identify the principal, invariant elements to be taken into account in our holistic approach. The basic elements of our evaluation model and its expected operation are then presented so as to outline the results expected from the model. Chapter 2 focuses on the definition of the notion of Information Security. It is not a redefinition of the notion of security, but rather a presentation of the dimensions, criteria and indicators to be used as a reference base in order to determine the object of evaluation that will be used throughout our work. The concepts inherent in the holistic character of security, as well as the elements constituting a security reference level, are defined accordingly. This makes it possible to identify what we have called "the roots of trust". Chapter 3 presents and analyses the difference and the relations between the Risk Management and Security Management processes, in order to identify the constituent elements of the protection framework to be included in our evaluation model. Chapter 4 is devoted to presenting our evaluation model, the Information Security Assurance Assessment Model (ISAAM), and the way it meets the evaluation requirements presented earlier. In this chapter the underlying concepts of assurance and trust are analysed. Based on these two concepts, the structure of the evaluation model is developed to obtain a platform offering a certain level of guarantee, relying on three evaluation attributes, namely: "the assurance structure", "process quality", and "achievement of requirements and objectives".
The issues related to each of these evaluation attributes are analysed on the basis of the state of the art in research and the literature, the various existing methods, and the norms and standards most commonly used in the security field. On this basis, three different evaluation levels are constructed, namely the assurance level, the quality level and the maturity level, which form the basis for evaluating the overall security state of an organization. The second part, "Application of the Information Security Assurance Assessment Model by security domain", likewise consists of four chapters. The evaluation model already constructed and analysed is here placed in a specific context according to the four predefined security dimensions: the Organizational dimension, the Functional dimension, the Human dimension and the Legal dimension. Each of these dimensions and its specific evaluation is the subject of a separate chapter. For each dimension, a two-phase evaluation is constructed as follows. The first phase concerns the identification of the elements that form the basis of the evaluation: identification of the key elements of the evaluation; identification of the "Focus Areas" for each dimension, which represent the issues found within that dimension; and identification of the "Specific Factors" for each Focus Area, which represent the security and control measures that help to resolve or reduce the impacts of the risks. The second phase concerns the evaluation of each of the dimensions presented above. It consists, on the one hand, of applying the general evaluation model to the dimension concerned by building on the elements specified in the first phase and identifying the specific security tasks, processes and procedures that should have been carried out to reach the desired level of protection. On the other hand, the evaluation of each dimension is completed by proposing a maturity model specific to that dimension, to be considered as a reference base for the overall security level. For each dimension we propose a generic maturity model that can be used by any organization to specify its own security requirements. This constitutes an innovation in the field of evaluation, which we justify for each dimension and whose added value we systematically highlight. The third part of the document concerns the overall validation of our proposal and contains, by way of conclusion, a critical perspective on our work and final remarks. This last part is completed by a bibliography and annexes. Our security evaluation model integrates and builds on numerous sources of expertise, such as best practices, norms, standards, methods and the expertise of scientific research in the field. Our constructive proposal addresses a genuine, as yet unsolved problem faced by all organizations, regardless of size and profile.
It would enable them to specify their particular requirements regarding the level of security to be met and to instantiate an evaluation process specific to their needs, so that they can ensure that their information security is managed appropriately, thus providing a certain level of confidence in the degree of protection delivered. We have integrated into our model the best of the know-how, experience and expertise currently available internationally, with the aim of providing an evaluation model that is simple, generic and applicable to a large number of public or private organizations. The added value of our evaluation model lies precisely in the fact that it is sufficiently generic and easy to implement while answering the concrete needs of organizations. Our proposal thus constitutes a reliable, efficient and dynamic evaluation tool derived from a coherent evaluation approach. As a result, our evaluation system can be implemented internally by the organization itself, without additional resources, and also gives it the opportunity to better govern its information security.
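The structure described above (dimensions broken into Focus Areas, Focus Areas into Specific Factors rated on a maturity scale, with the overall posture driven by the weakest link) can be sketched as a small scoring exercise. This is a hypothetical illustration only: the rating scale, the aggregation rules (minimum and average), and all the names are assumptions, not ISAAM's actual formulas.

```python
# Specific Factor maturity ratings per Focus Area, e.g. 0 (non-existent) .. 5 (optimized).
DIMENSIONS = {
    "Organizational": {"governance": [3, 4, 2], "policy": [3, 3]},
    "Functional":     {"access control": [4, 4], "operations": [2, 3, 3]},
    "Human":          {"awareness": [2, 2], "training": [3]},
    "Legal":          {"compliance": [4, 3]},
}

def focus_area_maturity(ratings):
    """A Focus Area is only as mature as its weakest Specific Factor (illustrative rule)."""
    return min(ratings)

def dimension_maturity(focus_areas):
    """Average of Focus Area maturities within one dimension (illustrative rule)."""
    levels = [focus_area_maturity(r) for r in focus_areas.values()]
    return sum(levels) / len(levels)

def overall_maturity(dimensions):
    """Weakest-link principle: overall security is bounded by the weakest dimension."""
    return min(dimension_maturity(fa) for fa in dimensions.values())

for name, focus_areas in DIMENSIONS.items():
    print(f"{name:15s} maturity = {dimension_maturity(focus_areas):.1f}")
print("Overall (weakest link):", overall_maturity(DIMENSIONS))
```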

Relevance: 100.00%

Publisher:

Abstract:

The objective of this work was to parameterize, calibrate, and validate a new version of the soybean growth and yield model developed by Sinclair, under natural field conditions in the northeastern Amazon. The meteorological data and the values of soybean growth and leaf area were obtained from an agrometeorological experiment carried out in Paragominas, PA, Brazil, from 2006 to 2009. The climatic conditions during the experiment were very distinct, with a slight reduction in rainfall in 2007 due to the El Niño phenomenon. There was a reduction in the leaf area index (LAI) and in biomass production during this year, which was reproduced by the model. The simulation of the LAI had a root mean square error (RMSE) of 0.55 to 0.82 m² m⁻² from 2006 to 2009. The simulation of soybean yield for independent data showed an RMSE of 198 kg ha⁻¹, i.e., an overestimation of 3%. The model was calibrated and validated for Amazonian climatic conditions and can contribute positively to improving simulations of the impacts of land-use change in the Amazon region. The modified version of the Sinclair model is able to adequately simulate leaf area formation, total biomass, and soybean yield under northeastern Amazon climatic conditions.
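For reference, the two goodness-of-fit measures quoted above (RMSE and percentage over-/underestimation) can be computed as in the short sketch below; the arrays are placeholders, not the experiment's data.

```python
import numpy as np

def rmse(simulated, observed):
    """Root mean square error between simulated and observed series."""
    simulated, observed = np.asarray(simulated, float), np.asarray(observed, float)
    return float(np.sqrt(np.mean((simulated - observed) ** 2)))

def bias_percent(simulated, observed):
    """Mean bias in percent; positive values mean the model overestimates on average."""
    simulated, observed = np.asarray(simulated, float), np.asarray(observed, float)
    return float(100.0 * (simulated.mean() - observed.mean()) / observed.mean())

obs_yield = [3100, 2850, 3300, 2950]   # kg/ha (placeholder observations)
sim_yield = [3200, 2900, 3350, 3100]   # kg/ha (placeholder simulations)
print(rmse(sim_yield, obs_yield), bias_percent(sim_yield, obs_yield))
```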

Relevance: 100.00%

Publisher:

Abstract:

Résumé: This study deals with the overturned limb of the Siviez-Mischabel nappe and the underlying tectonic units (Upper Stalden zone and Houillère zone) in the valley leading to Zermatt. The structural study of the Permian Randa granite (augen orthogneiss) provides a better understanding of the effects of Alpine deformation on basement rocks. Detailed mapping of the orthogneiss and its host rocks, together with the lithostratigraphic study of the associated sedimentary units, makes it possible to propose a structural and kinematic scheme for the overturned limb of the Siviez-Mischabel nappe and to better understand its relations with the underlying tectonic units. Structural analysis of the Randa orthogneiss and its host rocks reveals the superposition of several phases of ductile deformation. This orthogneiss, formed under greenschist-facies metamorphic conditions, shows a strong Alpine schistosity with at least two stretching lineations. The first, L1, oriented NW-SE, is associated with the emplacement of the nappe. The second, L2, oriented SW-NE, correlates with the Simplon ductile shear zone. Strain quantification using the Fry method on the porphyritic facies yields ellipses with axial ratios between 1.9 and 5.3, in agreement with the values obtained from other markers (stretched tourmalines, fibres). The values measured parallel to L1 or L2 are very similar. The Fry method required a preliminary theoretical study to verify its applicability to augen orthogneisses. The method requires a homogeneous and isotropic spatial distribution of the markers used. The statistical tests performed showed that alkali feldspar phenocrysts satisfy this condition and can be used as strain markers with the Fry method. The values obtained reveal the importance of the Simplon ductile shear zone for the geometry of the nappe in the study area. The mapping survey has improved the lithostratigraphy of the base of the Siviez-Mischabel nappe. Three formations in overturned position can be observed below the gneisses forming the core of the nappe. These three formations form the core of the St-Niklaus syncline, which connects the Siviez-Mischabel nappe to the Upper Stalden zone. U-Pb dating of detrital and magmatic zircons by LA-ICP-MS constrains the age of the observed formations (probably Carboniferous to Early Triassic). These data have important implications for the structure of the nappe in the region, proving the existence of several folds with well-preserved normal and overturned limbs. The definition and dating of these formations, as well as their identification in the neighbouring Houillère Zone, provide a better understanding of the initial geometry and tectonic relations of the Middle Penninic nappes in the Zermatt valley.

Summary: This study investigates the overturned limb of the Siviez-Mischabel nappe and underlying tectonic units (Upper Stalden zone and Houillère zone) in the Mattertal area. Detailed structural analysis of the Permian Randa granite (augen orthogneiss) allows a better understanding of the effects of Alpine deformation on basement rocks.
Detailed mapping of this orthogneiss and the surrounding rocks, together with a study of the lithostratigraphy of the related sedimentary horizons, allows a structural and kinematic model to be proposed for the overturned limb of the Siviez-Mischabel nappe and its relations with the underlying tectonic units to be better understood. The structural analysis of the Randa orthogneiss and surrounding rocks revealed the superposition of several phases of ductile deformation. This orthogneiss, formed under greenschist-facies metamorphic conditions, displays a strong Alpine foliation with at least two stretching lineations. The first lineation, L1, is oriented NW-SE and is related to the northward emplacement of the nappe. The second one, L2, is related to the Simplon ductile shear zone. Strain estimation using the Fry method was performed on the porphyritic facies of the Randa orthogneiss. The obtained ellipses have axial ratios varying between 1.9 and 5.3, in agreement with strain estimates obtained from other markers (stretched tourmalines, fringes). The strain values are very similar whether measured parallel to L1 or to L2. A theoretical approach was necessary to verify that the Fry method can be properly applied to augen orthogneiss. The method requires the distribution of the markers used to be homogeneous and isotropic. Statistical tests were performed and revealed that K-feldspar phenocrysts satisfy these conditions and can be used as strain markers with the Fry method. The strain measurements obtained reveal the importance of the Simplon ductile shear zone for the geometry of the nappe in the studied area. Mapping has improved the lithostratigraphy at the base of the Siviez-Mischabel nappe. Three overturned formations can be observed below the gneisses forming the core of the nappe. These three formations form the St-Niklaus syncline, which connects the Siviez-Mischabel nappe to the underlying Upper Stalden zone. U-Pb dating of detrital and magmatic zircons by LA-ICP-MS allowed the age of the observed formations to be constrained (presumably Carboniferous to Early Triassic). These data have critical implications for the nappe structure in the region, which is composed of a few recumbent folds with well-preserved normal and overturned limbs. The definition and dating of these formations, as well as their identification in the adjacent "Houillère Zone", improve the understanding of the geometry and tectonic relations of the Middle Penninic nappes in the Mattertal.
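A minimal sketch of the idea behind the Fry strain analysis mentioned above (not the thesis's implementation): all centre-to-centre separation vectors between marker points, such as K-feldspar phenocryst centroids, are plotted around a common origin; the empty "vacancy" at the centre of this Fry plot approximates the finite-strain ellipse whose axial ratio is reported. The input points here are synthetic placeholders; in a real analysis the markers must be anticlustered, homogeneously and isotropically distributed, which is what the statistical tests described above check.

```python
import numpy as np
import matplotlib.pyplot as plt

def fry_points(centers: np.ndarray) -> np.ndarray:
    """All pairwise separation vectors between marker centroids (input: N x 2 array)."""
    diffs = centers[:, None, :] - centers[None, :, :]   # shape (N, N, 2)
    mask = ~np.eye(len(centers), dtype=bool)            # drop self-pairs
    return diffs[mask]

# Placeholder data: random centroids sheared by a known strain matrix, standing in
# for digitised phenocryst centres from a thin section or outcrop photograph.
rng = np.random.default_rng(0)
centers = rng.uniform(0, 100, size=(300, 2))
strain = np.array([[1.6, 0.0], [0.0, 1 / 1.6]])         # axial ratio ~2.56
deformed = centers @ strain.T

pts = fry_points(deformed)
plt.scatter(pts[:, 0], pts[:, 1], s=1)
plt.gca().set_aspect("equal")
plt.title("Fry plot: central vacancy outlines the strain ellipse")
plt.show()
```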

Relevance: 100.00%

Publisher:

Abstract:

The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concentrate on decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms by analyzing the quality and quantity of the spatially structured information extracted from the data with the ML algorithms. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study from the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be efficiently used in the decision-making process. (C) 2003 Elsevier Ltd. All rights reserved.
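A condensed sketch of the hybrid MLRSS idea described above: a machine-learning model captures the long-range spatial trend, and the residuals, which carry the shorter-range spatially structured variability, are handed to geostatistical sequential simulation. The simulation step is only stubbed in a comment here, and all data, names, and network settings are illustrative placeholders rather than the paper's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def fit_trend(coords: np.ndarray, values: np.ndarray) -> MLPRegressor:
    """Multilayer perceptron modelling the long-range trend value = f(x, y)."""
    mlp = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
    return mlp.fit(coords, values)

def residuals(mlp: MLPRegressor, coords: np.ndarray, values: np.ndarray) -> np.ndarray:
    """De-trended data: closer to stationary, so variography and sequential
    simulation can reasonably be applied to them."""
    return values - mlp.predict(coords)

# Placeholder measurements (e.g. deposition values at monitoring locations).
rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(200, 2))
values = 0.05 * coords[:, 0] + np.sin(coords[:, 1] / 10) + rng.normal(0, 0.2, 200)

trend_model = fit_trend(coords, values)
res = residuals(trend_model, coords, values)
# `res` would next be simulated conditionally (many equiprobable realisations), and
# each realisation added back to the MLP trend to map uncertainty and probabilities.
```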

Relevance: 100.00%

Publisher:

Abstract:

There is increasing evidence that glial cells, in particular astrocytes, interact dynamically with neurons. The well-known anatomo-functional organization of neurons in the barrel cortex offers a suitable and promising model to study such neuroglial interactions. This review summarizes and discusses recent in vitro as well as in vivo work demonstrating that astrocytes receive, integrate, and respond to neuronal signals. In addition, they are active elements of brain metabolism and exhibit a certain degree of plasticity that affects neuronal activity. Altogether, these findings indicate that the barrel cortex presents glial compartments overlapping and interacting with neuronal compartments, and that these properties help define barrels as functional and independent units. Finally, this review outlines how the use of the barrel cortex as a model might in the future help to address important questions related to dynamic neuroglial interactions.

Relevance: 100.00%

Publisher:

Abstract:

Knowing a company's internal interfaces makes it possible to manage the exchange of information throughout the organization. Turning an idea into a profitable innovation requires a seamless process chain and information flow running through the different parts of the organization. The aim of this thesis was to model the exchange of information between two functionally different parts of an organization. This exchange of information was described as an interface, a knowledge interface. A three-dimensional organization model formed the main theory of the study. It was linked to the company's production and sales units, as well as to the new service development process created in the BestServ project. The new service development process was extended with the process model described in the ISO/IEC 15288 standard. Enterprise architecture frameworks were used as the basis for the modelling. The name "knowledge interface" reflects the view that knowledge is, by nature, something that exists between individuals or groups. However, current modelling methods do not yet make it possible to model all the properties associated with knowledge. The knowledge interface model consists of three parts, two of which are presented graphically and one as a table. The model can be used on its own or as part of an enterprise architecture. In industrial service business, both the knowledge interface modelling method and the model created with it can help a mechanical engineering company to understand its development needs and targets when it seeks, by providing services, a larger role in its customers' business. The knowledge interface model can be used to support the modelling and management of the organization's information assets and knowledge, and thus to combine them into a whole that serves the company's strategy. Modelling the knowledge interface offers knowledge management research in business studies a methodology for investigating innovation management and an organization's capacity for renewal. Both research areas need more detailed information on, and means of managing, information flows, information exchange, and the use of the organization's information assets.

Relevance: 100.00%

Publisher:

Abstract:

The aim of the paper is to describe some of the challenges faced by schools, or by formal education in general, as a consequence of today's mobile-centric society (henceforth MCS), the term we will use to denote the new, networked learning ecology that has arisen from the massive penetration of digital media in everyday life. After revisiting some of the ideas of McLuhan and Vygotsky in the light of this new technological scenario, we describe five traits of the MCS and, illustrated through educational practices, the challenges that we believe schools will face if they wish to preserve their function of individualization and socialization. We believe that despite the emergence of the MCS, the main function of the school is still to provide the "box of tools" (a set of psychological instruments, such as reading, writing, mathematical notation, digital literacy, etc.) that enables people to develop their learning skills and life projects and to become part of communities and groups. However, the complexity and mobility of the new learning environments mean that the position held by schools needs to be reevaluated in the face of the informal learning paths and experiences, both online and offline, to which learners now have access. We also need to reevaluate the meaning of the school itself as an institution and the model of learner it should be training.

Relevance: 100.00%

Publisher:

Abstract:

The goal of this dissertation is to find and provide the basis for a managerial tool that allows a firm to easily express its business logic. The methodological basis for this work is design science, where the researcher builds an artifact to solve a specific problem. In this case the aim is to provide an ontology that makes it possible to make a firm's business model explicit. In other words, the proposed artifact helps a firm to formally describe its value proposition, its customers, the relationship with them, the necessary intra- and inter-firm infrastructure, and its profit model. Such an ontology is relevant because until now there has been no model that expresses a company's global business logic from a purely business point of view. Previous models essentially take an organizational or process perspective or cover only parts of a firm's business logic. The four main pillars of the ontology, which are inspired by management science and enterprise and process modeling, are product, customer interface, infrastructure and finance. The ontology is validated by case studies and a panel of experts and managers. The dissertation also provides a software prototype to capture a company's business model in an information system. The last part of the thesis consists of a demonstration of the value of the ontology for business strategy and Information Systems (IS) alignment. Structure of this thesis: The dissertation is structured in nine parts. Chapter 1 presents the motivation for this research, the research methodology with which the goals shall be achieved, and why this dissertation presents a contribution to research. Chapter 2 investigates the origins, the term and the concept of business models. It defines what is meant by business models in this dissertation and how they are situated in the context of the firm. In addition, this chapter outlines the possible uses of the business model concept. Chapter 3 gives an overview of the research done in the field of business models and enterprise ontologies. Chapter 4 introduces the major contribution of this dissertation: the business model ontology. In this part of the thesis the elements, attributes and relationships of the ontology are explained and described in detail. Chapter 5 presents a case study of the Montreux Jazz Festival, whose business model was captured by applying the structure and concepts of the ontology. In fact, it gives an impression of what a business model description based on the ontology looks like. Chapter 6 shows an instantiation of the ontology in a prototype tool: the Business Model Modelling Language BM2L. This is an XML-based description language that allows the business model of a firm to be captured and described and has a large potential for further applications. Chapter 7 is about the evaluation of the business model ontology. The evaluation builds on a literature review, a set of interviews with practitioners, and case studies. Chapter 8 gives an outlook on possible future research and applications of the business model ontology. The main areas of interest are the alignment of business and information technology (IT)/information systems (IS) and business model comparison. Finally, Chapter 9 presents some conclusions.
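As a toy illustration of the four pillars described above, the sketch below captures a business model as a small data structure and instantiates it for the Montreux Jazz Festival case mentioned in the abstract. The element names and example values are simplified stand-ins, not the ontology's actual elements or the BM2L schema.

```python
from dataclasses import dataclass, field

@dataclass
class BusinessModel:
    firm: str
    product: dict = field(default_factory=dict)            # value proposition(s)
    customer_interface: dict = field(default_factory=dict) # target customers, channels, relationships
    infrastructure: dict = field(default_factory=dict)     # activities, capabilities, partnerships
    finance: dict = field(default_factory=dict)            # cost structure, revenue model

festival = BusinessModel(
    firm="Montreux Jazz Festival",
    product={"value_proposition": "live music experience"},
    customer_interface={"customers": ["visitors", "sponsors"], "channels": ["ticketing", "sponsoring"]},
    infrastructure={"partnerships": ["artists", "venue", "media"]},
    finance={"revenue_streams": ["tickets", "sponsorship"], "costs": ["artists' fees", "logistics"]},
)
print(festival)
```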

Relevance: 100.00%

Publisher:

Abstract:

Concerning the process control of batch cooling crystallization, the present work focused on the cooling profile and the seeding technique. Secondly, the influence of additives on a batch-wise precipitation process was investigated. Moreover, a Computational Fluid Dynamics (CFD) model for the simulation of controlled batch cooling crystallization was developed. A novel cooling model to control the supersaturation level during batch-wise cooling crystallization was introduced. The crystallization kinetics, together with the operating conditions, i.e. seed loading, cooling rate and batch time, were taken into account in the model. In particular, supersaturation- and suspension-density-dependent secondary nucleation was included in the model. The interaction between the operating conditions and their influence on the control target, i.e. a constant level of supersaturation, were studied with the aid of a numerical solution of the cooling model. Further, the batch cooling crystallization was simulated with an ideal mixing model and the CFD model. The moment transformation of the population balance, together with the mass and heat balances, was solved numerically in the simulation. In order to clarify the relationship between the operating conditions and product sizes, a system chart was developed for the ideal mixing condition. The use of the system chart to determine the appropriate operating conditions to meet a required product size was introduced. With CFD simulation, batch crystallization operated following a specified cooling mode was studied in crystallizers of different geometries and scales. The introduced cooling model and simulation results were verified experimentally for potassium dihydrogen phosphate (KDP), and the novelty of the proposed control policies was demonstrated for potassium sulfate by comparison with published results in the literature. The study of the batch-wise precipitation showed that immiscible additives could promote the agglomeration of a derivative of benzoic acid, which improved the filterability of the crystal product.
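A generic sketch, not the thesis's model, of the moment transformation of the population balance that such a simulation solves: with size-independent growth rate G and nucleation rate B, the moments of the crystal size distribution evolve as dμ0/dt = B and dμj/dt = j·G·μ(j-1) for j = 1..3, coupled to a solute mass balance and a prescribed cooling profile. The kinetic constants, solubility curve, and seed loading below are placeholders chosen only to make the example run.

```python
import numpy as np
from scipy.integrate import solve_ivp

kv, rho_c = 0.5, 2660.0          # crystal shape factor [-] and density [kg/m^3]
kb, b = 1e8, 2.0                 # nucleation rate constant and order (placeholders)
kg, g = 1e-7, 1.5                # growth rate constant and order (placeholders)

def c_sat(T):                    # placeholder solubility curve [kg solute / kg solvent]
    return 0.10 + 2.5e-3 * T

def rhs(t, y, T_of_t):
    mu0, mu1, mu2, mu3, c = y
    s = max(c - c_sat(T_of_t(t)), 0.0)        # supersaturation driving force
    B, G = kb * s**b, kg * s**g               # nucleation and growth rates
    dmu = [B, G * mu0, 2 * G * mu1, 3 * G * mu2]
    dc = -3.0 * rho_c * kv * G * mu2          # solute consumed by crystal growth
    return [*dmu, dc]

T_profile = lambda t: 40.0 - 20.0 * t / 7200.0          # linear cooling over a 2 h batch
y0 = [1e4, 1e4 * 50e-6, 1e4 * (50e-6)**2, 1e4 * (50e-6)**3, 0.22]  # seeded start
sol = solve_ivp(rhs, (0, 7200), y0, args=(T_profile,), max_step=10.0)
print("mean crystal size [m]:", sol.y[3, -1] / sol.y[2, -1])        # mu3 / mu2
```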

Relevance: 100.00%

Publisher:

Abstract:

Salmonella is distributed worldwide and is a pathogen of economic and public health importance. As a multi-host pathogen with long environmental persistence, it is a suitable model for the study of wildlife-livestock interactions. In this work, we aim to explore the spill-over of Salmonella between free-ranging wild boar and livestock in a protected natural area in NE Spain, and the presence of antimicrobial resistance. Salmonella prevalence, serotypes and diversity were compared between wild boars, sympatric cattle, and wild boars from cattle-free areas. The effect of age, sex, cattle presence and cattle herd size on the probability of Salmonella infection in wild boars was explored by means of Generalized Linear Models and a model selection based on Akaike's Information Criterion. Prevalence was higher in wild boars co-habiting with cattle (35.67%, 95% CI 28.19–43.70) than in wild boars from cattle-free areas (17.54%, 95% CI 8.74–29.91). The probability of a wild boar being a Salmonella carrier increased with cattle herd size but decreased with host age. Serotypes Meleagridis, Anatum and Othmarschen were isolated concurrently from cattle and sympatric wild boars. Apart from serotypes shared with cattle, wild boars appear to have their own serotypes, which are also found in wild boars from cattle-free areas (Enteritidis, Mikawasima, 4:b:- and 35:r:z35). Serotype richness (diversity) was higher in wild boars co-habiting with cattle, but evenness was not altered by the introduction of serotypes from cattle. The finding of an S. Mbandaka strain resistant to sulfamethoxazole, streptomycin and chloramphenicol and an S. Enteritidis strain resistant to ciprofloxacin and nalidixic acid in wild boars is a cause for public health concern.
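A minimal sketch of the modelling approach described above: binomial Generalized Linear Models for the probability that a wild boar carries Salmonella, with candidate models compared by AIC. The data frame, its values, and the column names are placeholders, not the study's dataset, and the exhaustive subset search is only one simple way to run an AIC-based selection.

```python
import itertools
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Placeholder data: one row per sampled wild boar.
boars = pd.DataFrame({
    "salmonella": [1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1],   # carrier yes/no
    "age":        [1, 3, 2, 4, 5, 4, 3, 2, 4, 2, 1, 2],   # years
    "sex":        ["m", "f", "m", "f", "m", "f", "m", "f", "m", "f", "m", "f"],
    "herd_size":  [120, 0, 200, 150, 50, 0, 0, 150, 30, 200, 220, 90],  # sympatric cattle
})

predictors = ["age", "sex", "herd_size"]
aic_by_model = {}
for k in range(len(predictors) + 1):
    for terms in itertools.combinations(predictors, k):
        formula = "salmonella ~ " + (" + ".join(terms) if terms else "1")
        fit = smf.glm(formula, data=boars, family=sm.families.Binomial()).fit()
        aic_by_model[formula] = fit.aic

# Rank candidate models; the lowest AIC identifies the best-supported model.
for formula, aic in sorted(aic_by_model.items(), key=lambda item: item[1]):
    print(f"AIC = {aic:6.1f}   {formula}")
```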