907 results for Integration of Programming Techniques


Relevance: 100.00%

Abstract:

The environmental aspect of corporate social responsibility (CSR), expressed through the process of EMS implementation in oil and gas companies, is the main subject of this research. The theoretical part focuses on justifying the link between CSR and environmental management. The achievement of sustainable competitive advantage as a result of environmental capital growth, and the inclusion of socially responsible activities in corporate strategy, is another issue of special significance here. In addition, two basic forms of environmental management systems (environmental decision support systems and environmental information management systems) are explored, and their role in effective stakeholder interaction is addressed. The most crucial benefits of EMS are also analyzed to underline its importance as a source of sustainable development. The empirical research is based on a survey of 51 sampled oil and gas companies (both publicly owned and state owned), originating from different countries around the world, that publish sustainability reports in open access. To analyze their approach to sustainable development, a specifically designed evaluation matrix with 37 indicators was prepared in accordance with the Global Reporting Initiative (GRI) guidelines for non-financial reporting. Additionally, the quality of environmental information disclosure was measured on the basis of a quality-quantity matrix. According to the results, oil and gas companies prefer implementing reactive measures over costly and knowledge-intensive proactive techniques for eliminating negative environmental impacts. It was also found that environmental performance disclosure is mostly rather limited, so the quality of non-financial reporting can be judged as insufficient.
Although most of the sampled oil and gas companies claim that an EMS is currently embedded in their structure, they often provide no details about the implementation process. As potential directions for the further development of EMS, the author mentions the integration of its different forms into a single entity, the extension of the existing structure by consolidating structural and strategic precautions, and the development of a unified certification standard to replace the several that exist today, in order to enhance control over EMS implementation.
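The indicator-based evaluation described above can be sketched as a simple scoring matrix. This is a hedged illustration: the indicator names, scoring scale, and company scores below are invented stand-ins, not the thesis's actual 37 GRI indicators or data.

```python
# Sketch of an indicator-based disclosure evaluation matrix, loosely following
# the GRI-style quality-quantity scoring described in the abstract.
# Indicator names, scores, and the scoring scale are illustrative assumptions.

indicators = ["emissions", "energy_use", "water", "biodiversity"]  # stand-in subset

# Each company is scored per indicator: 0 = not disclosed,
# 1 = qualitative disclosure, 2 = quantitative disclosure.
companies = {
    "CompanyA": [2, 2, 1, 0],
    "CompanyB": [1, 0, 1, 1],
}

def disclosure_profile(scores):
    """Return (quantity, quality): the share of indicators covered at all,
    and the share of covered indicators reported quantitatively."""
    covered = [s for s in scores if s > 0]
    quantity = len(covered) / len(scores)
    quality = sum(1 for s in covered if s == 2) / len(covered) if covered else 0.0
    return quantity, quality

for name, scores in companies.items():
    qty, qual = disclosure_profile(scores)
    print(f"{name}: quantity={qty:.2f}, quality={qual:.2f}")
```

A company can then be placed in a quality-quantity matrix by thresholding the two shares, mirroring the abstract's distinction between limited and substantive disclosure.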

Relevance: 100.00%

Abstract:

This thesis focuses on integration in project business, i.e. how project-based companies organize their product and process structures when they deliver industrial solutions to their customers. The customers that invest in these solutions run their businesses in different geographical, political and economic environments, which should be acknowledged by the supplier when providing solutions of larger and more complex scope than previously supplied to these customers. This means that suppliers are increasing their supply range by taking over some of the activities in the value chain that have traditionally been handled by the customer. In order to provide functioning solutions, which involve more engineering hours, more technical equipment and a wider project network, a change in mindset is needed so that the supplier can carry out, and take on, the responsibility that these new approaches bring. For the supplier it is important to be able to integrate technical products, systems and services, but the supplier also needs the capability to integrate the cross-functional organizations and departments in the project network, the knowledge and information between and within these organizations and departments, and the inputs from the customer into the product and process structures during the lifecycle of the project under development. Hence, the main objective of this thesis is to explore the integration challenges that industrial projects meet and, on that basis, to suggest a concept for managing integration in project business by making use of integration mechanisms. Integration is considered the essential process for accomplishing an industrial project, and the accomplishment of the industrial project is considered the result of the integration.
The thesis consists of an extended summary and four papers that are based on three studies, in which integration mechanisms for value creation in industrial project networks and the management of integration in project business have been explored. The research is based on an inductive approach in which, in particular, the design, commissioning and operations functions of industrial projects have been studied, addressing entire project lifecycles. The studies were conducted in the shipbuilding and power generation industries, where the scopes of supply consist of stand-alone equipment, equipment and engineering, and turnkey solutions. These industrial solutions involve demanding efforts in engineering and organization. Addressing the calls for more studies on the evolving value chains of integrated solutions, mechanisms for inter- and intra-organizational integration and the subsequent value creation in project networks have been explored. The research results in thirteen integration mechanisms, and a typology for integration is proposed. Managing integration consists of integrating the project network (the supplier and the sub-suppliers) and the customer (the customer's business purpose, operations environment and the end-user) into the project by making use of integration mechanisms. The findings bring new insight into research on industrial project business by proposing the integration of technology- and engineering-related elements with elements related to customer-oriented business performance in contemporary project environments. Thirteen mechanisms for combining products with the processes needed to deliver projects are described and categorized according to the impact that they have on the management of knowledge and information. These mechanisms relate directly to the performance of the supplier, and consequently to the functioning of the solution that the project provides.
This thesis offers ways to promote integration of knowledge and information during the lifecycle of industrial projects, enhancing the development towards innovative solutions in project business.

Relevance: 100.00%

Abstract:

Currently, a high penetration level of Distributed Generation (DG) is observed in Danish distribution systems, and even more DG is foreseen in the upcoming years. How to utilize it to maintain the security of the power supply in emergency situations is of great interest. This master's project develops a control architecture for studying distribution systems with large-scale integration of solar power. As part of the EcoGrid EU Smart Grid project, it focuses on the modelling and simulation of a representative Danish LV network located on the island of Bornholm. Regarding the control architecture, two types of reactive power control techniques are implemented and compared. In addition, network voltage control based on a tap-changing transformer is tested. After applying a genetic algorithm to five typical Danish domestic loads, the optimized results show lower power losses and voltage deviation using Q(U) control, especially under large consumption. Finally, a communication and information exchange system is developed with the objective of regulating the reactive power, and thereby the network voltage, remotely and in real time. Validation tests of the simulated parameters are performed as well.
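The Q(U) control mentioned above is typically a piecewise-linear droop curve mapping local voltage to inverter reactive power. A minimal sketch follows; the voltage thresholds and reactive-power limit are illustrative assumptions, not the thesis's values.

```python
# Minimal sketch of a Q(U) droop characteristic for a PV inverter, of the
# kind compared in the abstract. Dead-band and saturation voltages and the
# reactive-power limit are illustrative assumptions (all values in per unit).

def q_of_u(u_pu, q_max=0.3, dead_band=(0.98, 1.02), saturation=(0.94, 1.06)):
    """Piecewise-linear Q(U) droop: inject reactive power (positive Q) at
    low voltage, absorb it (negative Q) at high voltage, with a dead band
    around nominal voltage."""
    lo_sat, hi_sat = saturation
    lo_db, hi_db = dead_band
    if u_pu <= lo_sat:                      # low-voltage saturation
        return q_max
    if u_pu < lo_db:                        # ramp from q_max down to 0
        return q_max * (lo_db - u_pu) / (lo_db - lo_sat)
    if u_pu <= hi_db:                       # dead band: no reactive power
        return 0.0
    if u_pu < hi_sat:                       # ramp from 0 down to -q_max
        return -q_max * (u_pu - hi_db) / (hi_sat - hi_db)
    return -q_max                           # high-voltage saturation

for u in (0.93, 0.96, 1.00, 1.04, 1.07):
    print(f"U={u:.2f} pu -> Q={q_of_u(u):+.3f} pu")
```

Raising Q at low voltage and absorbing it at high voltage counteracts the voltage rise caused by PV infeed, which is consistent with the reported reduction in voltage deviation under heavy consumption.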

Relevance: 100.00%

Abstract:

Standard techniques for radioautography used in biological and medical research can be classified into three categories: macroscopic radioautography, light microscopic radioautography and electron microscopic radioautography. The routine techniques used in these three procedures are described. With regard to macroscopic radioautography, whole-body radioautography is a standard technique which employs freezing and cryosectioning and can demonstrate organ distributions of both soluble and insoluble compounds. In contrast, in light and electron microscopic radioautography, soluble and insoluble techniques are separate. In order to demonstrate insoluble labeled compounds, conventional chemical fixations, such as formalin for light microscopy or buffered glutaraldehyde and osmium tetroxide for both light and electron microscopy, followed by dehydration, embedding and wet-mounting application of radioautographic emulsions, can be used. For the demonstration of soluble labeled compounds, however, cryotechniques such as cryofixation, cryosectioning, freeze-drying and freeze-substitution, followed by dry-sectioning and dry-mounting radioautography, should be employed for both light and electron microscopy. The outlines of these techniques, which should be utilized in various fields of biological and medical research, are described in detail.

Relevance: 100.00%

Abstract:

In this thesis, stepwise titration with hydrochloric acid was used to obtain the chemical reactivities and dissolution rates of ground limestones and dolostones of varying geological backgrounds (sedimentary, metamorphic or magmatic). Two different ways of conducting the calculations were used: 1) a first-order mathematical model was used to calculate extrapolated initial reactivities (and dissolution rates) at pH 4, and 2) a second-order mathematical model was used to acquire integrated mean specific chemical reaction constants (and dissolution rates) at pH 5. The calculations of the reactivities and dissolution rates were based on the rate of change of pH and on the particle size distributions of the sample powders obtained by laser diffraction. The initial dissolution rates at pH 4 were repeatedly higher than previously reported literature values, whereas the dissolution rates at pH 5 were consistent with former observations. Reactivities and dissolution rates varied substantially for dolostones, whereas for limestones and calcareous rocks the variation can be primarily explained by relatively large sample standard deviations. In decreasing order of initial reactivity at pH 4, the dolostone samples rank as follows:
1) metamorphic dolostones with a calcite/dolomite ratio higher than about 6%
2) sedimentary dolostones without calcite
3) metamorphic dolostones with a calcite/dolomite ratio lower than about 6%
The reactivity and dissolution rate measurements were accompanied by a wide range of experimental techniques to characterise the samples, to reveal how different rocks changed during the dissolution process, and to find out which factors influenced their chemical reactivities. An emphasis was put on chemical and morphological changes taking place at the surfaces of the particles, studied via X-ray Photoelectron Spectroscopy (XPS) and Scanning Electron Microscopy (SEM).
Supporting chemical information was obtained with X-Ray Fluorescence (XRF) measurements of the samples, and with Inductively Coupled Plasma-Mass Spectrometry (ICP-MS) and Inductively Coupled Plasma-Optical Emission Spectrometry (ICP-OES) measurements of the solutions used in the reactivity experiments. Information on mineral (modal) compositions and their occurrence was provided by X-Ray Diffraction (XRD), Energy Dispersive X-ray analysis (EDX) and the study of thin sections with a petrographic microscope. BET (Brunauer, Emmett, Teller) surface areas were determined from nitrogen physisorption data. Factors increasing the chemical reactivity of dolostones and calcareous rocks were found to be sedimentary origin, higher calcite concentration and lower quartz concentration. It is also assumed that finer grain size and larger BET surface areas increase reactivity, although no definite correlation was found in this thesis. Atomic concentrations did not correlate with the reactivities. Sedimentary dolostones, unlike metamorphic ones, were found to have porous surface structures after dissolution. In addition, conventional (XPS) and synchrotron-based (HRXPS) X-ray Photoelectron Spectroscopy were used to study bonding environments on calcite and dolomite surfaces. Both minerals are insulators, which is why charge-neutralisation measures such as an electron flood gun and a conductive mask were used. Surface core level shifts of 0.7 ± 0.1 eV for the Ca 2p spectrum of calcite and 0.75 ± 0.05 eV for the Mg 2p and Ca 3s spectra of dolomite were obtained. Some satellite features of the Ca 2p, C 1s and O 1s spectra are suggested to be bulk plasmons. The origin of carbide bonds was suggested to be beam-assisted interaction with hydrocarbons found on the surface. The results presented in this thesis are of particular importance for choosing raw materials for wet Flue Gas Desulphurisation (FGD) and for the construction industry.
Wet FGD benefits from high reactivity, whereas the construction industry can take advantage of the slow reactivity of the carbonate rocks often used in the facades of fine buildings. Information on chemical bonding environments may help to create more accurate models for water-rock interactions of carbonates.
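The first-order treatment mentioned in the abstract amounts to fitting an exponential decay to the measured rates and extrapolating back to the initial reactivity. The sketch below shows that idea on synthetic data; the data points and parameter values are invented for illustration, not taken from the thesis.

```python
# Hedged sketch of a first-order extrapolation of initial reactivity:
# fit r(t) = r0 * exp(-k*t) to measured dissolution rates by linear
# least squares on ln(rate), then read off r0. Data are synthetic.

import math

def fit_first_order(times, rates):
    """Fit ln(rate) = ln(r0) - k*t by least squares; return (r0, k)."""
    n = len(times)
    ys = [math.log(r) for r in rates]
    t_mean = sum(times) / n
    y_mean = sum(ys) / n
    # Slope of the regression line is -k.
    k = -sum((t - t_mean) * (y - y_mean) for t, y in zip(times, ys)) \
        / sum((t - t_mean) ** 2 for t in times)
    r0 = math.exp(y_mean + k * t_mean)
    return r0, k

# Synthetic measurements generated from r(t) = 2.0 * exp(-0.5*t).
times = [1.0, 2.0, 3.0, 4.0]
rates = [2.0 * math.exp(-0.5 * t) for t in times]
r0, k = fit_first_order(times, rates)
print(f"extrapolated initial rate r0 = {r0:.3f}, rate constant k = {k:.3f}")
```

Because the fit extrapolates to t = 0, it recovers an initial rate that is higher than any directly measured value, which is consistent with the abstract's remark that initial rates at pH 4 exceeded earlier literature values based on later-time measurements.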

Relevance: 100.00%

Abstract:

Nowadays, computer-based systems tend to become more complex and to control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in the loss of human lives as well as significant damage to the environment. Therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, various industrial standards prescribe the use of rigorous techniques for the development and verification of such systems. The more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on this system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined, for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for the integration of formal verification results into safety cases.
This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
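The core proof obligation in the Event-B style of development is that every event preserves the machine's safety invariant. The toy sketch below illustrates that idea in Python (it is not Event-B, and the machine, guards, and invariant are invented for illustration): guarded events on a tank model are applied exhaustively for a few steps, and the invariant is checked after each one.

```python
# Toy illustration (not Event-B itself) of the obligation that every event
# preserves a safety invariant, in the spirit of the approach described.
# The machine, its events, and the invariant are invented for illustration.

def invariant(state):
    # Safety requirement: the tank level never exceeds its capacity.
    return 0 <= state["level"] <= state["capacity"]

def fill(state, amount):
    # Guard: only fill if the result stays within capacity.
    if state["level"] + amount <= state["capacity"]:
        state = dict(state, level=state["level"] + amount)
    return state

def drain(state, amount):
    # Guard: only drain if the level stays non-negative.
    if state["level"] - amount >= 0:
        state = dict(state, level=state["level"] - amount)
    return state

def check_preservation(events, initial, steps):
    """Apply all event sequences up to `steps` long and assert the invariant
    after every step (a brute-force stand-in for a preservation proof)."""
    frontier = [initial]
    for _ in range(steps):
        nxt = []
        for st in frontier:
            for ev, arg in events:
                new = ev(st, arg)
                assert invariant(new), f"invariant violated: {new}"
                nxt.append(new)
        frontier = nxt
    return True

init = {"level": 0, "capacity": 10}
events = [(fill, 4), (drain, 3)]
print(check_preservation(events, init, steps=3))
```

In Event-B proper, Rodin discharges this obligation by proof over all states rather than by bounded enumeration; the sketch only conveys what is being proved.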

Relevance: 100.00%

Abstract:

Visceral afferents send information via cranial nerves to the nucleus tractus solitarius (NTS). The NTS is the initial step of information processing that culminates in homeostatic reflex responses. Recent evidence suggests that strong afferent synaptic responses in the NTS are most often modulated by depression and this forms a basic principle of central integration of these autonomic pathways. The visceral afferent synapse is uncommonly powerful at the NTS with large unitary response amplitudes and depression rather than facilitation at moderate to high frequencies of activation. Substantial signal depression occurs through multiple mechanisms at this very first brainstem synapse onto second order NTS neurons. This review highlights new approaches to the study of these basic processes featuring patch clamp recordings in NTS brain slices and optical techniques with fluorescent tracers. The vanilloid receptor agonist, capsaicin, distinguishes two classes of second order neurons (capsaicin sensitive or capsaicin resistant) that appear to reflect unmyelinated and myelinated afferent pathways. The differences in cellular properties of these two classes of NTS neurons indicate clear functional differentiation at both the pre- and postsynaptic portions of these first synapses. By virtue of their position at the earliest stage of these pathways, such mechanistic differences probably impart important differentiation in the performance over the entire reflex pathways.
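The frequency-dependent depression described above is often captured by a resource-depletion model: each afferent spike consumes a fraction of available synaptic resources, which recover exponentially between spikes. The sketch below uses that standard form with invented parameters (not fitted NTS values).

```python
# Sketch of frequency-dependent synaptic depression of the kind described:
# a depletion model in which each spike uses a fraction of the available
# synaptic resources, and resources recover exponentially between spikes.
# Parameters are illustrative assumptions, not fitted NTS values.

import math

def depressed_amplitudes(n_spikes, freq_hz, use=0.5, tau_rec=0.8):
    """Relative response amplitude for each spike in a train at freq_hz.
    `use`: fraction of resources consumed per spike; `tau_rec`: recovery
    time constant in seconds."""
    dt = 1.0 / freq_hz
    resources = 1.0
    amps = []
    for _ in range(n_spikes):
        amps.append(use * resources)
        resources -= use * resources                               # depletion
        resources += (1.0 - resources) * (1 - math.exp(-dt / tau_rec))  # recovery
    return amps

low = depressed_amplitudes(5, freq_hz=1.0)
high = depressed_amplitudes(5, freq_hz=20.0)
print([round(a, 3) for a in low])
print([round(a, 3) for a in high])
```

At the higher stimulation frequency there is less time for recovery between spikes, so the steady-state amplitude falls further, matching the review's point that depression dominates at moderate to high activation frequencies.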

Relevance: 100.00%

Abstract:

In the field of molecular biology, scientists for decades adopted a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. However, integrative thinking has also been applied at a smaller scale in molecular biology, to understand the underlying processes of cellular behaviour, for at least half a century. It was not until the genomic revolution at the end of the previous century that model building was required to account for the systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability to predict cellular behaviour in a way that reflects system dynamics and system structures. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology brings about a high volume of data, whose comprehension we cannot even aim for in the absence of computational support. Computational modelling hence bridges modern biology to computer science, enabling a number of assets that prove invaluable in the analysis of complex biological systems, such as a rigorous characterization of the system structure, simulation techniques, and perturbation analysis. Computational biomodels have grown considerably in size in recent years, with major contributions made towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating with whole-cell models, tissue-level models, organ models and full-scale patient models. The simulation and analysis of models of such complexity very often requires, in fact, the integration of various sub-models, entwined at different levels of resolution and whose organization spans several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology.
The thesis proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. Subsequently, the model is refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the second for the ErbB signalling pathway. The thesis spans several inherently quantitative formalisms used in computational systems biology, namely reaction-network models, rule-based models and Petri net models, as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
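The notion of quantitative model refinement can be illustrated on a toy reaction network: a species is split into subtypes while the kinetics are chosen so that the refined model reproduces the original dynamics for the lumped variable. The model and numbers below are invented for illustration, not taken from the thesis's heat shock or ErbB case studies.

```python
# Sketch of quantitative model refinement: species A in a toy reaction
# A -> B is refined into subtypes A1, A2 with the same rate constant;
# the refined model should reproduce the original dynamics for A1 + A2.
# The network and numbers are illustrative assumptions.

def simulate(reactions, init, dt=0.01, steps=500):
    """Euler integration of first-order conversions: each (src, dst, k)
    moves mass from src to dst at rate k * [src]."""
    state = dict(init)
    for _ in range(steps):
        flux = {}
        for src, dst, k in reactions:
            f = k * state[src] * dt
            flux[src] = flux.get(src, 0.0) - f
            flux[dst] = flux.get(dst, 0.0) + f
        for species, df in flux.items():
            state[species] += df
    return state

basic = simulate([("A", "B", 1.0)], {"A": 1.0, "B": 0.0})
refined = simulate([("A1", "B", 1.0), ("A2", "B", 1.0)],
                   {"A1": 0.6, "A2": 0.4, "B": 0.0})

# Consistency check of the refinement: A1 + A2 tracks A, and B matches.
print(round(basic["A"], 4), round(refined["A1"] + refined["A2"], 4))
```

The refined model adds detail (two subtypes with their own initial amounts) without invalidating the data fit of the original model, which is the essence of data-preserving refinement.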

Relevance: 100.00%

Abstract:

This master's thesis was written in order to answer the question of how the integration of marketing communications, and the decision making related to it, could be improved in a geographically dispersed service organization that has gone through a merger. The main focus was the effect of organizational design dimensions on the integration of marketing communications and the related decision making. A case study as a research strategy offered a suitable frame for an exploratory study, and data collection was conducted through semi-structured interviews and observation. The main finding was that within the chosen design dimensions of decentralization, coordination and power, specific factors can be identified that negatively affect the integration of marketing communications in a geographically dispersed organization. The effects are seen mostly in decision-making processes, roles and the division of responsibility, which in turn affect the other dimensions and, through them, the integration. In a post-merger situation, the coordination dimension, and especially information asymmetry and information flow, seem to have the largest effect on the integration of marketing communications. Asymmetric information distribution, combined with a lack of business and marketing education, resulted in low self-assurance and, ultimately, in fragmented management and an inability to set targets and make independent decisions. In conclusion, the organizational design dimensions can be used to evaluate the effects of a merger on the integration process of marketing communications.

Relevance: 100.00%

Abstract:

Solar and wind power produce electricity irregularly, so production can at times exceed demand; sufficient energy storage solutions are therefore needed. Some storage technologies, such as flywheels, already exist, but they are quite short-term. Power-to-Gas (P2G) offers a solution for storing energy as synthetic natural gas, and it also improves a nation's energy self-sufficiency. Power-to-Gas can be integrated into an industrial or municipal facility to reduce production costs. This master's thesis studies the integration of Power-to-Gas technologies into wastewater treatment as a part of VTT's Neo-Carbon Energy project. Power-to-Gas produces synthetic methane (SNG) from water and carbon dioxide using electricity; this SNG can be regarded as stored energy. Basic wastewater treatment technologies and the production of biogas in the treatment plant are studied, as is the utilisation of biogas and SNG in heat and power production and in transportation. The integration of P2G into a wastewater treatment plant (WWTP) is examined mainly from an economic point of view. First the mass flows of the materials involved are calculated, and then the economic impact is evaluated on the basis of these mass flows. Economic efficiency is evaluated with the Net Present Value method. The thesis also studies the overall profitability of the integration and its key economic factors.
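The Net Present Value method named above discounts future cash flows back to the present. A minimal sketch follows; the investment, yearly revenue, lifetime, and discount rate are illustrative assumptions, not the thesis's figures for the P2G/WWTP integration.

```python
# Minimal Net Present Value sketch, the profitability measure named in the
# abstract. All cash-flow figures and the discount rate are illustrative.

def npv(rate, cash_flows):
    """NPV of cash_flows, where cash_flows[0] is the initial investment
    (negative) at t=0 and later entries are yearly net cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Example: 1.0 MEUR investment, 0.2 MEUR/year net revenue for 8 years,
# discounted at 5%. A positive NPV indicates a profitable investment.
flows = [-1.0] + [0.2] * 8
result = npv(0.05, flows)
print(f"NPV = {result:.3f} MEUR")
```

For an integration study like this one, the yearly net cash flow would come from the calculated mass flows (SNG sold, electricity bought, CO2 and heat credits), so the NPV ties the economic evaluation directly to the mass balance.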

Relevance: 100.00%

Abstract:

This study examined the efficacy of providing four Grade 7 and 8 students with reading difficulties with explicit instruction in the use of reading comprehension strategies while using text-reader software. Specifically, the study explored participants' combined use of a text-reader and a question-answering comprehension strategy during a 6-week instructional program. Using a qualitative case study methodology, participants' experiences using text-reader software, in the presence of explicit instruction in evidence-based reading comprehension strategies, were examined. The study involved three phases: (a) the first phase consisted of individual interviews with the participants and their parents; (b) the second phase consisted of a nine-session course; and (c) the third phase consisted of individual exit interviews and a focus group discussion. After the data collection phases were completed, data were analyzed and coded for emerging themes, with quantitative measures of participants' reading performance used as descriptive data. The data suggested that assistive technology can serve as an instructional "hook", motivating students to engage actively in the reading process, especially when accompanied by explicit strategy instruction. Participants' experiences also reflected the development of strategy use and of text-reader software use, and the importance of social interactions in developing reading comprehension skills. The findings of this study support the view that instruction using evidence-based practices is an important and vital component in the inclusion of text-reader software as part of students' educational programming. The findings can also be extended to develop in-class programming for students using text-reader software.

Relevance: 100.00%

Abstract:

This research assesses whether the integration of Accreditation Canada's MIRE accreditation program (Mesures implantées pour le renouveau de l'évaluation), formerly of the Canadian Council on Health Services Accreditation, generates organizational change and learning. It studies the case of two health organizations, the Health Authority of Anguilla (HAA) and the Ca' Foncella Opetale de Treviso (CFOT). The research comprises three levels of analysis for which qualitative and quantitative data were collected: 1) the members of the accreditation teams; 2) the accreditation teams; 3) the organization as a whole. The data collection techniques used were individual questionnaires administered to team members, semi-structured interviews with team leaders and quality coordinators, a document review, and several periodic measurements of the level of compliance with the MIRE standards. The results indicate that the organizations underwent transformations: 1) strategic; 2) organizational; 3) in their relations with their environment. They improved their management systems and practices as well as their internal and external communications. There was also useful learning by individuals, teams, and the organizations. Individual learning concerned quality programs, the client-centred approach, risk management, professional ethics, participatory management, and service evaluation. The "self-assessment" and "make improvements and follow up on recommendations" steps of the accreditation cycle contributed the most to organizational change and learning. The interdisciplinary accreditation teams were the preferred vehicle for achieving these changes and this learning.
The HAA and the CFOT progressively improved their level of compliance with the standards in all dimensions of quality, at the level of the accreditation teams and for the organization as a whole. Nevertheless, the improvement in the overall level of compliance remained below the program's minimum requirement for obtaining accreditation status without major restrictions. The scope of the changes and learning achieved raises the question of the organizations' capacity to institutionalize this new knowledge. The CFOT could succeed in doing so, given the resources and competencies at its disposal.

Relevance: 100.00%

Abstract:

Integration of the human papillomavirus (HPV) genome was until recently considered a frequent yet late event in the progression of cervical disease. This temporal view has, however, been challenged by the detection of integrated forms of HPV in normal tissues and in preneoplastic lesions. Our objective was to determine the HPV-16 viral load and its physical state in a series of 220 samples from normal cervices and low-grade cervical lesions. Quantitative real-time PCR (TaqMan method) was used to quantify the copy numbers of the E6, E2, and beta-globin genes, allowing evaluation of the viral load and of the E6/E2 ratio for each specimen. An E6/E2 ratio of 1.2 or higher was considered suggestive of integration. The HPV integration site in the human genome was then determined by RS-PCR. The mean viral load was 57.5 ± 324.6 DNA copies per cell, and the E6/E2 ratio identified nine samples with integrated forms of HPV. These integrants were amplified by RS-PCR, followed by sequencing, and amplicon homology was determined with NCBI's BLAST program in order to identify the viral-human junctions. We succeeded in identifying the human-viral junctions for the positive control, i.e. SiHa cells, but did not detect integration by RS-PCR in the exfoliated cervical cell samples from normal tissues and low-grade lesions. HPV-16 is thus rarely integrated in specimens from young patients.
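The E6/E2 ratio criterion described above can be sketched as a small calculation: copy numbers are normalized per cell via the beta-globin gene (two copies per diploid cell), and a ratio at or above the cutoff flags possible integration. The copy-number values below are invented for illustration.

```python
# Sketch of the E6/E2 ratio computation described above. Since E2 is often
# disrupted on integration while E6 is retained, an excess of E6 over E2
# copies (ratio >= 1.2 in the abstract) suggests integrated HPV.
# The input copy numbers below are illustrative, not study data.

RATIO_CUTOFF = 1.2  # E6/E2 threshold used in the abstract

def analyze(e6_copies, e2_copies, b_globin_copies):
    """Return (viral load in copies per cell, E6/E2 ratio, integration suspected?)."""
    cells = b_globin_copies / 2.0       # beta-globin: 2 copies per diploid cell
    viral_load = e6_copies / cells
    ratio = e6_copies / e2_copies
    return viral_load, ratio, ratio >= RATIO_CUTOFF

load, ratio, suspected = analyze(e6_copies=6000, e2_copies=4000, b_globin_copies=2000)
print(f"viral load = {load:.1f} copies/cell, E6/E2 = {ratio:.2f}, integrated? {suspected}")
```

A purely episomal specimen would have roughly equal E6 and E2 copy numbers (ratio near 1), so it would fall below the cutoff.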

Relevance: 100.00%

Abstract:

Economic evaluation in health care consists of the comparative analysis of alternative services with respect to both their costs and their consequences. It is a decision-support tool. The vast majority of resource allocation decisions are made in the clinic, particularly in primary care. Since every decision carries an opportunity cost, disregarding economic considerations in family physicians' practice can have a major impact on the efficiency of the health care system. Little is known about the influence of economic evaluations on clinical practice. The purpose of this thesis is to understand the role of economic evaluation in family physicians' practice. Its contributions are presented in four original articles (philosophical, theoretical, methodological, and empirical). The philosophical article argues for the importance of questions of complexity and reflexivity in economic evaluation. Complexity is the philosophical perspective (general epistemological approach) underlying the thesis. This worldview focuses on explanation and understanding and on relations and interactions (interactive causality). This emphasis on context and on the process of data production underlines the importance of reflexivity in the research process. The theoretical article develops a new and different conception of the research problem. The originality of the thesis also lies in its approach, which draws on the perspective of Pierre Bourdieu's sociological theory, a theoretical approach consistent with complexity.
In contrast to individualist models of rational action, Bourdieu advocates a sociological approach that seeks a fuller and more complex understanding of social phenomena by shedding light on the often implicit influences that exert daily pressure on individuals and their practices. The methodological article presents the protocol of a qualitative multiple-case study with embedded levels of analysis: family physicians (micro-individual level) and the field of family medicine (macro-structural level). Eight case studies were carried out with the family physician as the main unit of analysis. At the micro level, data were collected through life-history interviews, documents, and observation. At the macro level, data were collected through documents and semi-structured interviews with eight key informants from nine medical organizations. Analytic induction was used. The empirical article presents the thesis's empirical results as a whole. The results show a growing integration of economic concepts into the official discourse of family medicine organizations. At the level of practice, however, this economization of discourse does not appear to be a faithful representation of reality, since the vast majority of participants do not embody it. The contributions include an in-depth understanding of the social processes that shape family physicians' schemes of perception, thought, appreciation, and action regarding the role of economic evaluation in clinical practice, and their willingness to contribute to an efficient, equitable, and legitimate allocation of resources.

Relevância:

100.00%

Publicador:

Resumo:

A central goal of software engineering is to produce complex, large, and reliable software within a reasonable time. Object-oriented (OO) technology has provided sound concepts and modeling and programming techniques that have enabled the development of complex applications in both academia and industry. This experience has also exposed weaknesses of the object paradigm (for example, code scattering and the traceability problem). Aspect-oriented (AO) programming offers a simple solution to limitations of OO programming such as the problem of cross-cutting concerns. Cross-cutting concerns manifest either as the same code scattered across several modules of a system, or as several pieces of code tangled within a single module. This programming style makes it possible to implement each concern independently of the others and then to assemble them according to well-defined rules. AO programming thus promises better productivity, better code reuse, and better adaptability of code to change. This approach quickly spread across the entire software development process, with the goal of preserving modularity and traceability, two important properties of high-quality software. AO technology nonetheless presents many challenges. Reasoning about, specifying, and verifying AO programs is difficult, all the more so as these programs evolve over time. Modular reasoning about such programs is therefore required; otherwise they would have to be re-examined in full every time a component is changed or added.
It is, however, well known in the literature that modular reasoning about AO programs is difficult, since the aspects that are applied often change the behavior of their base components [47]. The same difficulties arise in the specification and verification phases of the software development process. To the best of our knowledge, modular specification and modular verification are only weakly covered and constitute a very interesting field of research. Likewise, interactions between aspects are a serious problem in the aspect community. To address these problems, we chose to use category theory and algebraic-specification techniques. To solve the problems cited above, we built on the work of Wiels [110] and on other contributions such as those described in the book [25]. We assume that the system under development has already been decomposed into aspects and classes. The first contribution of this thesis is the extension of algebraic-specification techniques to the notion of aspect. Second, we defined a logic, LA, which is used in the body of specifications to describe the behavior of these components. The third contribution is the definition of the weaving operator, which corresponds to the interconnection relation between aspect modules and class modules. The fourth contribution concerns the development of a prevention mechanism that prevents undesirable interactions in aspect-oriented systems.
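The scatter-then-weave idea the abstract describes can be illustrated with a small sketch: a logging concern is written once as an "aspect" and then woven into the methods of a base class by a simple rule, instead of being duplicated inside every method. This uses Python decorators as a stand-in for weaving; it is an analogy for the cross-cutting-concern problem, not the thesis's category-theoretic weaving operator, and all names here are illustrative.

```python
# Minimal aspect-weaving analogy: the logging concern lives in one place
# (logging_aspect) and a weaving rule applies it to the base class's methods.
import functools

log: list = []  # records the advice's enter/exit events

def logging_aspect(func):
    """The cross-cutting concern, implemented once instead of scattered."""
    @functools.wraps(func)
    def advice(*args, **kwargs):
        log.append(f"enter {func.__name__}")
        result = func(*args, **kwargs)
        log.append(f"exit {func.__name__}")
        return result
    return advice

class Account:
    """Base component with no logging code of its own."""
    def __init__(self, balance):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        self.balance -= amount

# Weaving rule: wrap the listed methods of the base class with the aspect.
for name in ("deposit", "withdraw"):
    setattr(Account, name, logging_aspect(getattr(Account, name)))

acct = Account(100)
acct.deposit(50)
acct.withdraw(30)
```

After weaving, `acct.balance` is 120 and `log` holds an enter/exit pair for each call. Changing the logging policy now means editing one function, which is exactly the modularity gain the abstract attributes to AO programming; the flip side, also noted in the abstract, is that the woven advice changes the observable behavior of the base component, which is what makes modular reasoning hard.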