756 results for Industry relationship model
Abstract:
One third of all stroke survivors develop post-stroke depression (PSD). Depressive symptoms adversely affect rehabilitation and significantly increase risk of death in the post-stroke period. One of the theoretical views on the determinants of PSD focuses on psychosocial factors like disability and social support. Others emphasize biologic mechanisms such as disruption of biogenic amine neurotransmission and release of proinflammatory cytokines. The "lesion location" perspective attempts to establish a relationship between localization of stroke and occurrence of depression, but empirical results remain contradictory. These divergences are partly related to the fact that neuroimaging methods, unlike neuropathology, are not able to assess precisely the full extent of stroke-affected areas and do not specify the different types of vascular lesions. We provide here an overview of the known phenomenological profile and current pathogenic hypotheses of PSD and present neuropathological data challenging the classic "single-stroke"-based neuroanatomical model of PSD. We suggest that vascular burden due to the chronic accumulation of small macrovascular and microvascular lesions may be a crucial determinant of the development and evolution of PSD.
Abstract:
BACKGROUND: It is well established that high adherence among HIV-infected patients on highly active antiretroviral treatment (HAART) is a major determinant of virological and immunologic success. Furthermore, psychosocial research has identified a wide range of adherence factors, including patients' subjective beliefs about the effectiveness of HAART. Current statistical approaches, mainly based on the separate identification either of factors associated with treatment effectiveness or of those associated with adherence, fail to properly explore the true relationship between adherence and treatment effectiveness. Adherence behavior may be influenced not only by perceived benefits, which are usually the focus of related studies, but also by objective treatment benefits reflected in biological outcomes. METHODS: Our objective was to assess the bidirectional relationship between adherence and response to treatment among patients enrolled in the ANRS CO8 APROCO-COPILOTE study. We compared a conventional statistical approach, based on separate estimation of an adherence equation and an effectiveness equation, with an econometric approach using a 2-equation simultaneous system built from the same 2 equations. RESULTS: Our results highlight a reciprocal relationship between adherence and treatment effectiveness. After controlling for endogeneity, adherence was positively associated with treatment effectiveness. Furthermore, CD4 count gain after baseline had a significant positive effect on adherence at each observation period; this immunologic parameter was not significant when the adherence equation was estimated separately. In the 2-equation model, the covariances between the disturbances of the two equations were significant, confirming the statistical appropriateness of studying adherence and treatment effectiveness jointly. CONCLUSIONS: Our results suggest that positive biological outcomes arising from high adherence levels in turn reinforce continued adherence. They strengthen the argument that patients who do not experience rapid improvement in their immunologic and clinical status after HAART initiation should be prioritized when developing adherence support interventions. Furthermore, they invalidate the hypothesis that HAART leads to "false reassurance" among HIV-infected patients.
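A minimal way to write down the kind of 2-equation simultaneous system described above, with illustrative variable names that are ours rather than those of the APROCO-COPILOTE analysis: adherence A_{it} and treatment effectiveness E_{it} for patient i at period t are modelled jointly as

    A_{it} = \alpha_0 + \alpha_1 E_{it} + \alpha_2' x_{it} + u_{it}
    E_{it} = \beta_0 + \beta_1 A_{it} + \beta_2' z_{it} + v_{it},  with  \mathrm{Cov}(u_{it}, v_{it}) = \sigma_{uv}

where x_{it} and z_{it} are exogenous covariates. Because each outcome appears on the right-hand side of the other equation and \sigma_{uv} may be non-zero, estimating the equations separately treats an endogenous regressor as exogenous and biases the coefficients; joint estimation of the system is what allows the CD4-gain effect on adherence, and the significant error covariance, to be detected.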
Abstract:
This paper presents a statistical model for quantifying the weight of fingerprint evidence. In contrast to previous models (generative and score-based), our model estimates the probability distributions of the spatial relationships, directions and types of minutiae observed on fingerprints for any given fingermark. The model relies on an AFIS algorithm provided by 3M Cogent and on a dataset of more than 4,000,000 fingerprints to represent a sample from a relevant population of potential sources. Its performance was tested using several hundred minutiae configurations observed on a set of 565 fingermarks. In particular, the effects of various sub-populations of fingers (i.e., finger number, finger general pattern) on the expected evidential value of our test configurations were investigated. The performance of our model indicates that the spatial relationship between minutiae carries more evidential weight than their type or direction. Our results also indicate that the AFIS component of the model directly enables us to assign weight to fingerprint evidence without the additional layer of complex statistical modeling entailed by estimating the probability distributions of fingerprint features. In fact, the AFIS component appears to be more sensitive to sub-population effects than the other components of the model. Overall, the data generated during this research project support the idea that fingerprint evidence is a valuable forensic tool for the identification of individuals.
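For context, the weight of evidence in this setting is conventionally expressed as a likelihood ratio; the form below is the standard forensic formulation rather than necessarily the paper's exact notation:

    LR = \frac{\Pr(E \mid H_p, I)}{\Pr(E \mid H_d, I)}

where E is the observed correspondence between the fingermark and the print, H_p and H_d are the propositions that the mark was, or was not, left by the finger under consideration, and I is the background information, here the feature distributions estimated from the 4,000,000-print sample. Values of LR greater than 1 support H_p, and the model described above assigns such values configuration by configuration.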
Abstract:
The bio-economic model "Heures" is a first attempt to develop a simulation procedure to understand the Northwestern Mediterranean fisheries, to evaluate management strategies and to analyze the feasibility of implementing adaptive management. The model is built on the interaction among three boxes simulating the dynamics of each of the basic actors of a fishery: the stock, the market and the fishermen. A fourth actor, the manager, imposes or modifies the rules or, in terms of the model, modifies some particular parameters. Thus, the model allows us to simulate and evaluate the mid-term biological and economic effects of particular management measures. The bio-economic nature of the model arises from the interaction among the three boxes, from the market simulation and, particularly, from the fishermen's behaviour. This last element confers on the model its Mediterranean "self-regulated" character. The fishermen allocate their investments to maximize fishing mortality but, facing a legal effort limit, they invest in maintenance and technology in order to increase catchability, which, as a consequence, will be a function of the invested capital.
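A minimal sketch of the kind of three-box simulation loop the abstract describes; every functional form, parameter and value below is an illustrative assumption, not part of the Heures model:

    import math

    def simulate(years=20, B=50_000.0, K_inv=10.0):
        """Toy stock / market / fishermen loop under a fixed legal effort."""
        r, B_max = 0.4, 100_000.0          # assumed stock growth rate and carrying capacity
        E_legal = 1.0                      # nominal effort cap imposed by the manager
        out = []
        for t in range(years):
            q = 0.2 + 0.02 * math.log1p(K_inv)          # catchability rises with invested capital
            catch = B * (1.0 - math.exp(-q * E_legal))  # harvest cannot exceed biomass
            B = max(B + r * B * (1 - B / B_max) - catch, 1.0)   # stock box
            price = 5.0 * (catch + 1.0) ** -0.2         # market box: assumed inverse demand
            profit = price * catch - 100.0 * E_legal - 1.0 * K_inv
            K_inv += 0.05 * max(profit, 0.0)            # fishermen box: reinvest part of profit
            out.append((t, B, catch, profit))
        return out

    for t, B, catch, profit in simulate():
        print(f"year {t:2d}  biomass {B:8.0f}  catch {catch:7.0f}  profit {profit:8.1f}")

The point of the sketch is the feedback the abstract emphasizes: with nominal effort capped, profit is recycled into catchability, so fishing mortality keeps rising even though effort does not.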
Abstract:
The theoretical aspects and the associated software of a bioeconomic model for Mediterranean fisheries are presented. The first objective of the model is to reproduce the bioeconomic conditions in which the fisheries occur. The model is, perforce, multispecies and multigear. The main management procedure is effort limitation. The model also incorporates the fishermen's usual strategy of increasing efficiency to obtain increased fishing mortality while maintaining the nominal effort. This is modelled by means of a function relating efficiency (or technological progress) to the capital invested in the fishery and to time. A second objective is to simulate alternative management strategies. The model allows the operation of technical and economic management measures in the presence of different kinds of events. Both deterministic and stochastic simulations can be performed. An application of this tool to the hake fishery off Catalonia is presented, considering the other species caught and the different gears used. Several alternative management measures are tested and their consequences for the stock and for the fishermen's economy are analysed.
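A hedged way to write the efficiency relation described above; the functional form is an assumption chosen for illustration, the paper specifies its own:

    F_t = q(K_t, t)\, E_t, \qquad q(K_t, t) = q_0 (1 + \gamma)^{t} \left( K_t / K_0 \right)^{\delta}

where E_t is the (capped) nominal effort, F_t the resulting fishing mortality, K_t the capital invested in the fishery, \gamma a rate of technological progress and \delta the elasticity of catchability with respect to capital. Under such a relation, mortality can keep growing even while the nominal effort limit is respected, which is the behaviour the model is meant to capture.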
Abstract:
BACKGROUND: Understanding cancer-related modifications to transcriptional programs requires detailed knowledge about the activation of signal-transduction pathways and gene expression programs. To investigate the mechanisms of target gene regulation by human estrogen receptor alpha (hERalpha), we combine extensive location and expression datasets with genomic sequence analysis. In particular, we study the influence of patterns of DNA occupancy by hERalpha on expression phenotypes. RESULTS: We find that strong ChIP-chip sites co-localize with strong hERalpha consensus sites and detect nucleotide bias near hERalpha sites. The localization of ChIP-chip sites relative to annotated genes shows that weak sites are enriched near transcription start sites, while stronger sites show no positional bias. Assessing the relationship between binding configurations and expression phenotypes, we find that binding sites downstream of the transcription start site (TSS) predict hERalpha-mediated expression as well as, or better than, upstream sites. The study of FOX and SP1 cofactor sites near hERalpha ChIP sites shows that induced genes frequently have FOX or SP1 sites. Finally, we integrate these multiple datasets to define a high-confidence set of primary hERalpha target genes. CONCLUSION: Our results support the model of long-range interactions of hERalpha with the promoter-bound cofactor SP1 residing at the promoter of hERalpha target genes. FOX motifs co-occur with hERalpha motifs along responsive genes. Importantly, we show that the spatial arrangement of sites near the start sites and within the full transcript is important in determining the response to estrogen signaling.
Abstract:
The objective of this work was to investigate glyphosate adsorption by soils and its relationship with unoccupied binding sites for phosphate adsorption. Samples of three Chilean soil series - Valdivia (Andisol), Clarillo (Inceptisol) and Chicureo (Vertisol) - were incubated with different herbicide concentrations. Glyphosate remaining in solution was determined with an HPLC method adapted for UV detection. Experimental maximum adsorption capacities were 15,000, 14,300 and 4,700 mg g⁻¹ for the Valdivia, Clarillo, and Chicureo soils, respectively. Linear, Freundlich, and Langmuir models were used to describe glyphosate adsorption, and the fitted isotherms differed among soils. The maximum fitted adsorption capacity with the Langmuir model was 231,884, 17,874 and 5,670 mg g⁻¹ for the Valdivia, Clarillo, and Chicureo soils, respectively. Glyphosate adsorption on the Valdivia soil was linear over the range of concentrations used, and none of the fitted models became asymptotic. The high glyphosate adsorption capacity of the Valdivia soil was probably a result of its high exchangeable Al, extractable Fe, and allophane and imogolite clay types. Adsorption was closely related to phosphate dynamics in the Valdivia soil, which showed the largest amount of unoccupied phosphate binding sites. However, the relationship between unoccupied phosphate binding sites and glyphosate adsorption in the other two soils (Clarillo and Chicureo) was not clear.
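For reference, the standard forms of the isotherms mentioned, with q the amount adsorbed and C the equilibrium solution concentration (the usual textbook symbols, not necessarily those used in the paper):

    Linear:      q = K_d\, C
    Freundlich:  q = K_F\, C^{1/n}
    Langmuir:    q = \dfrac{q_{\max} K_L C}{1 + K_L C}

The Langmuir q_{\max} corresponds to the "maximum fitted adsorption capacity" reported above; that the Valdivia isotherm stayed linear and never became asymptotic over the tested range is consistent with its fitted q_{\max} (231,884 mg g⁻¹) far exceeding the experimental value.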
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications owing to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advance of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating-system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability levels on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels formerly occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation under competition among the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and it will eventually also affect industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
We investigate the selective pressures on a social trait when evolution occurs in a population of constant size. We show that any social trait that is spiteful simultaneously qualifies as altruistic. In other words, any trait that reduces the fitness of less related individuals necessarily increases that of related ones. Our analysis demonstrates that the distinction between "Hamiltonian spite" and "Wilsonian spite" is not justified on the basis of fitness effects. We illustrate this general result with an explicit model for the evolution of a social act that reduces the recipient's survival ("harming trait"). This model shows that the evolution of harming is favoured if local demes are of small size and migration is low (philopatry). Further, deme size and migration rate determine whether harming evolves as a selfish strategy by increasing the fitness of the actor, or as a spiteful/altruistic strategy through its positive effect on the fitness of close kin.
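As a hedged, textbook-style illustration of the bookkeeping behind this equivalence (our notation, not the paper's): in a population of fixed size, fitness is relative, so a harming act that costs the actor c and reduces a recipient's fitness by d effectively redistributes that loss as a gain spread over the rest of the population. Writing r_h for the actor's relatedness to the harmed recipient and r_o for its average relatedness to the other population members, the inclusive-fitness condition for the trait to spread can be written

    -c - r_h\, d + r_o\, d > 0, \qquad \text{i.e.} \qquad d\,(r_o - r_h) > c,

so harming individuals that are less related than average (r_h < r_o) is, by the same accounting, an altruistic benefit conferred on the more related remainder.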
Abstract:
This study aimed to test subjective indicators designed to analyze the role food plays in children’s lives, explore children’s personal well-being, and evaluate the relationship between these two phenomena. It was conducted on 371 children aged 10 to 12 by means of a self-administered questionnaire. Results showed a marked interest in food on the part of children, who consider taste and health the most important indicators when it comes to eating. They demonstrated a high level of personal well-being, measured using Cummins & Lau’s (2005) adapted version of the Personal Well-Being Index–School Children (PWI-SC), overall life satisfaction (OLS) and satisfaction with various life domains (friends, family, sports, food and body). Regression models were estimated to explain satisfaction with food, taking as independent variables the interest children have in food, the importance they give to different reasons for eating, scores from the PWI-SC, OLS and satisfaction with various life domains. In the final model, it was found that OLS, health indicators, satisfaction with health from the PWI-SC and satisfaction with one's body contribute to explaining satisfaction with food. The results obtained suggest that satisfaction with food is a relevant indicator in the exploration of children’s subjective well-being, calling into question the widespread belief that these aspects are of exclusive interest to adults. They also seem to reinforce the importance of including food indicators in any study aimed at exploring the well-being of the 10 to 12 year-old population.
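A minimal sketch of the kind of regression described above; the variable names, the synthetic data and the chosen predictors are illustrative assumptions, not the study's data or exact specification:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in data for 371 children; all variables on illustrative 0-10 scales.
    rng = np.random.default_rng(0)
    n = 371
    df = pd.DataFrame({
        "food_interest": rng.integers(0, 11, n),
        "reason_health": rng.integers(0, 11, n),
        "pwi_sc_health": rng.integers(0, 11, n),
        "overall_life_satisfaction": rng.integers(0, 11, n),
        "body_satisfaction": rng.integers(0, 11, n),
    })
    df["food_satisfaction"] = (
        0.3 * df["overall_life_satisfaction"] + 0.2 * df["pwi_sc_health"]
        + 0.2 * df["body_satisfaction"] + rng.normal(0, 1, n)
    )

    # Satisfaction with food regressed on interest in food, reasons for eating and well-being scores.
    model = smf.ols(
        "food_satisfaction ~ food_interest + reason_health + pwi_sc_health "
        "+ overall_life_satisfaction + body_satisfaction",
        data=df,
    ).fit()
    print(model.summary())

In the study itself, the analogous final model retained overall life satisfaction, health indicators, PWI-SC satisfaction with health and satisfaction with one's body as the predictors contributing to explain satisfaction with food.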
Abstract:
An Adobe® animation is presented for use in undergraduate Biochemistry courses, illustrating the mechanism of Na+ and K+ translocation coupled to ATP hydrolysis by the (Na,K)-ATPase, a P2C-type ATPase, or ATP-powered ion pump, that actively translocates cations across plasma membranes. The enzyme is also known as an E1/E2-ATPase as it undergoes conformational changes between the E1 and E2 forms during the pumping cycle, altering the affinity and accessibility of the transmembrane ion-binding sites. The animation is based on Horisberger's scheme, which incorporates the most recent significant findings to have improved our understanding of the (Na,K)-ATPase structure-function relationship. The movements of the various domains within the (Na,K)-ATPase alpha-subunit illustrate the conformational changes that occur during Na+ and K+ translocation across the membrane and emphasize involvement of the actuator, nucleotide, and phosphorylation domains, that is, the "core engine" of the pump, with respect to ATP binding, cation transport, and ADP and Pi release.
Abstract:
The relationship between the non-institutional free press and local communication is quite particular, since this type of press constitutes a very characteristic model of local communication, showing that advertising suffices to finance an information product addressed to a fairly well-defined readership, as long as the product has a good advertising sales department and effective distribution in its operating area. This paper discusses the present situation of the free press in Catalonia, where this phenomenon has been quite prominent. It points out the main features of this type of press and reviews its history, which runs from the euphoria of its early years, through its expansion and consolidation, to the current crisis.
Abstract:
The hyperpolarization-activated cyclic nucleotide-gated (HCN) channels are expressed in pacemaker cells very early during cardiogenesis. This work aimed to determine to what extent these channels are implicated in the electromechanical disturbances induced by a transient lack of oxygen such as may occur in utero. Spontaneously beating hearts, or isolated ventricles and outflow tracts, dissected from 4-day-old chick embryos were exposed to a selective inhibitor of HCN channels (ivabradine, 0.1-10 µM) to establish a dose-response relationship. The effects of ivabradine on the electrocardiogram, excitation-contraction coupling and contractility of hearts submitted to anoxia (30 min) and reoxygenation (60 min) were also determined. The distribution of the predominant channel isoform, HCN4, was established in atria, ventricle and outflow tract by immunoblotting. The intrinsic beating rate of atria, ventricle and outflow tract was 164 ± 22 (n = 10), 78 ± 24 (n = 8) and 40 ± 12 bpm (n = 23; mean ± SD), respectively. In the whole heart, ivabradine (0.3 µM) slowed the firing rate of the atria by 16% and stabilized the PR interval. These effects persisted throughout anoxia-reoxygenation, whereas the variations of QT duration, excitation-contraction coupling and contractility, as well as the types and duration of arrhythmias, were not altered. Ivabradine (10 µM) reduced the intrinsic rate of the atria and the isolated ventricle by 27% and 52%, respectively, whereas it abolished the activity of the isolated outflow tract. Protein expression of HCN4 channels was higher in atria and ventricle than in the outflow tract. Thus, HCN channels are specifically distributed and finely control the atrial, ventricular and outflow-tract pacemakers, as well as conduction, in the embryonic heart under normoxia and throughout anoxia-reoxygenation.
Abstract:
EXECUTIVE SUMMARY: Evaluating the Information Security posture of an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately this is ineffective, because it does not take into consideration the necessity of a global and systemic multidimensional approach to Information Security evaluation. At the same time, the overall security level is generally considered to be only as strong as its weakest link. This thesis proposes a model aiming to holistically assess all dimensions of security in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; it is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective. This dissertation is divided into three parts. Part One: Information Security Evaluation Issues consists of four chapters. Chapter 1 is an introduction to the purpose of this research and to the model that will be proposed. In this chapter we raise some questions with respect to "traditional evaluation methods" and identify the principal elements to be addressed in this direction. We then introduce the baseline attributes of our model and set out the expected results of evaluations performed according to it. Chapter 2 focuses on the definition of Information Security to be used as a reference point for our evaluation model. The concepts inherent in a holistic, baseline Information Security program are defined and, on this basis, the most common roots of trust in Information Security are identified. Chapter 3 focuses on an analysis of the difference and the relationship between the concepts of Information Risk Management and Information Security Management. Comparing these two concepts allows us to identify the most relevant elements to be included in our evaluation model; clearly situating the two notions within a defined framework is of the utmost importance for the results obtained from the evaluation process. Chapter 4 sets out our evaluation model and the way it addresses issues relating to the evaluation of Information Security. Within this chapter the underlying concepts of assurance and trust are discussed. Based on these two concepts, the structure of the model is developed in order to provide an assurance-related platform together with three evaluation attributes: "assurance structure", "quality issues", and "requirements achievement". Issues relating to each of these evaluation attributes are analysed with reference to sources such as methodologies, standards and published research papers, and the operation of the model is then discussed. Assurance levels, quality levels and maturity levels are defined in order to perform the evaluation according to the model. Part Two: Implementation of the Information Security Assurance Assessment Model (ISAAM) according to the Information Security Domains consists of four chapters. This is the section in which our evaluation model is put into a well-defined context with respect to the four pre-defined Information Security dimensions: the Organizational dimension, the Functional dimension, the Human dimension and the Legal dimension. Each Information Security dimension is discussed in a separate chapter.
For each dimension, the following two-phase evaluation path is followed. The first phase concerns the identification of the elements that will constitute the basis of the evaluation: (i) identification of the key elements within the dimension; (ii) identification of the Focus Areas for the dimension, consisting of the security issues identified for that dimension; and (iii) identification of the Specific Factors for the dimension, consisting of the security measures or controls addressing those security issues. The second phase concerns the evaluation of each Information Security dimension through: (i) the implementation of the evaluation model, based on the elements identified in the first phase, by identifying the security tasks, processes, procedures and actions that should have been performed by the organization to reach the desired level of protection; and (ii) a maturity model for each dimension as a basis for reliance on security. For each dimension we propose a generic maturity model that can be used by any organization in order to define its own security requirements. Part Three of this dissertation contains the Final Remarks, Supporting Resources and Annexes. With reference to the objectives of the thesis, the Final Remarks briefly analyse whether these objectives were achieved and suggest directions for future related research. The Supporting Resources comprise the bibliographic resources used to elaborate and justify our approach, and the Annexes include the relevant topics identified in the literature to illustrate certain aspects of our approach. Our Information Security evaluation model is based on and integrates different Information Security best practices, standards, methodologies and research expertise, which can be combined in order to define a reliable categorization of Information Security. After the definition of terms and requirements, an evaluation process should be performed in order to obtain evidence that Information Security within the organization in question is adequately managed. We have specifically integrated into our model the most useful elements of these sources in order to provide a generic model able to be implemented in all kinds of organizations. The value added by our evaluation model is that it is easy to implement and operate and answers concrete needs for a reliable, efficient and dynamic evaluation tool within a coherent evaluation system. On that basis, our model can be implemented internally within organizations, allowing them to better govern their Information Security. RÉSUMÉ: General context of the thesis. The evaluation of security in general, and of information security in particular, has become for organizations not only a crucial task but also an increasingly complex one. At present, this evaluation is mainly based on methodologies, good practices, norms or standards that address separately the different aspects that make up information security. We argue that this way of evaluating security is inefficient, because it does not take into account the interaction between the different dimensions and components of security, even though it has long been accepted that the overall security level of an organization is always that of the weakest link in the security chain.
We identified the need for a global, integrated, systemic and multidimensional approach to the evaluation of information security. Indeed, and this is the starting point of our thesis, we show that only a global consideration of security makes it possible to meet the requirements of optimal security and the specific protection needs of an organization. Our thesis therefore proposes a new evaluation paradigm for security, intended to satisfy the effectiveness and efficiency needs of a given organization. We propose a model that aims to evaluate all dimensions of security holistically, in order to minimize the probability that a potential threat could exploit vulnerabilities and cause direct or indirect damage. This model is based on a formalized structure that takes into account all the elements of a security system or program. We thus propose a methodological evaluation framework that considers information security from a global perspective. Structure of the thesis and topics covered. The document is structured in three parts. The first, entitled "The problem of information security evaluation", consists of four chapters. Chapter 1 introduces the object of the research and the basic concepts of the proposed evaluation model. The traditional way of evaluating security is critically analysed in order to identify the principal, invariant elements to be taken into account in our holistic approach. The basic elements of our evaluation model and its expected operation are then presented in order to outline the results expected from the model. Chapter 2 focuses on the definition of the notion of Information Security. It is not a redefinition of the notion of security, but a putting into perspective of the dimensions, criteria and indicators to be used as a reference basis, in order to determine the object of evaluation that will be used throughout our work. The concepts inherent in what constitutes the holistic character of security, as well as the constituent elements of a security baseline, are defined accordingly. This makes it possible to identify what we have called the "roots of trust". Chapter 3 presents and analyses the difference and the relationships that exist between the Risk Management and Security Management processes, in order to identify the constituent elements of the protection framework to be included in our evaluation model. Chapter 4 is devoted to the presentation of our evaluation model, the Information Security Assurance Assessment Model (ISAAM), and the way in which it meets the evaluation requirements presented earlier. In this chapter the underlying concepts of assurance and trust are analysed. Based on these two concepts, the structure of the evaluation model is developed to obtain a platform that offers a certain level of guarantee, relying on three evaluation attributes, namely: "the trust structure", "the quality of the process", and "the achievement of requirements and objectives".
The issues related to each of these evaluation attributes are analysed on the basis of the state of the art in research and in the literature, of the various existing methods, and of the norms and standards most widely used in the security field. On this basis, three different evaluation levels are constructed, namely the assurance level, the quality level and the maturity level, which together form the basis for evaluating the overall security state of an organization. The second part, "Application of the Information Security Assurance Assessment Model by security domain", also consists of four chapters. The evaluation model already constructed and analysed is, in this part, placed in a specific context according to the four pre-defined security dimensions: the Organizational dimension, the Functional dimension, the Human dimension and the Legal dimension. Each of these dimensions and its specific evaluation is the subject of a separate chapter. For each dimension, a two-phase evaluation is constructed as follows. The first phase concerns the identification of the elements that constitute the basis of the evaluation: (i) identification of the key elements of the evaluation; (ii) identification of the Focus Areas for each dimension, which represent the issues found in that dimension; and (iii) identification of the Specific Factors for each Focus Area, which represent the security and control measures that help resolve or reduce the impact of the risks. The second phase concerns the evaluation of each of the dimensions presented above. It consists, on the one hand, of applying the general evaluation model to the dimension concerned by relying on the elements specified in the first phase and by identifying the specific security tasks, processes and procedures that should have been carried out to reach the desired level of protection. On the other hand, the evaluation of each dimension is completed by the proposal of a maturity model specific to that dimension, to be considered as a reference basis for the overall security level. For each dimension we propose a generic maturity model that can be used by any organization in order to specify its own security requirements. This constitutes an innovation in the field of evaluation, which we justify for each dimension and whose added value we systematically highlight. The third part of the document concerns the overall validation of our proposal and contains, by way of conclusion, a critical perspective on our work and final remarks. This last part is completed by a bibliography and annexes. Our security evaluation model integrates and builds on numerous sources of expertise, such as good practices, norms, standards, methods and the expertise of scientific research in the field. Our constructive proposal addresses a genuine, as yet unresolved problem faced by all organizations, regardless of size and profile.
It would allow organizations to specify their particular requirements as to the level of security to be met and to instantiate an evaluation process tailored to their needs, so that they can ensure that their information security is managed appropriately, thereby providing a certain level of confidence in the degree of protection obtained. We have integrated into our model the best of the know-how, experience and expertise currently available internationally, with the aim of providing an evaluation model that is simple, generic and applicable to a large number of public or private organizations. The added value of our evaluation model lies precisely in the fact that it is sufficiently generic and easy to implement while providing answers to the concrete needs of organizations. Our proposal thus constitutes a reliable, efficient and dynamic evaluation tool derived from a coherent evaluation approach. As a result, our evaluation system can be implemented internally by the organization itself, without recourse to additional resources, and it also gives the organization the possibility of better governing its information security.
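A minimal sketch of how the dimension / Focus Area / Specific Factor structure and the per-dimension maturity scoring described above could be represented; all class names, example factors and the scoring rule are illustrative assumptions, not part of ISAAM itself:

    from dataclasses import dataclass, field
    from statistics import mean

    @dataclass
    class SpecificFactor:
        name: str        # a security measure or control addressing an identified issue
        maturity: int    # assessed maturity level, e.g. 0 (absent) to 5 (optimized)

    @dataclass
    class FocusArea:
        name: str        # a security issue identified within the dimension
        factors: list[SpecificFactor] = field(default_factory=list)

        def maturity(self) -> float:
            return mean(f.maturity for f in self.factors) if self.factors else 0.0

    @dataclass
    class Dimension:
        name: str        # Organizational, Functional, Human or Legal
        focus_areas: list[FocusArea] = field(default_factory=list)

        def maturity(self) -> float:
            return mean(fa.maturity() for fa in self.focus_areas) if self.focus_areas else 0.0

    # Hypothetical fragment of the Organizational dimension.
    org = Dimension("Organizational", [
        FocusArea("Security governance", [
            SpecificFactor("Information security policy", 3),
            SpecificFactor("Defined roles and responsibilities", 2),
        ]),
        FocusArea("Awareness and training", [
            SpecificFactor("Annual awareness programme", 1),
        ]),
    ])
    print(f"{org.name} dimension maturity: {org.maturity():.1f}")

Whether the per-dimension score should be an average, a minimum (the weakest link) or a weighted combination is precisely the kind of choice the thesis addresses through its generic maturity models; the sketch simply averages for brevity.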