406 results for reconfiguration
Abstract:
The purpose of this work was to realize a high-speed digital data transfer system for the RPC muon chambers of the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Frontend Boards (FEB) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between the two is about 80 metres, and the speed required of the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would develop in the course of the design work, just as they did. By choosing a high speed it was possible to multiplex the data from some of the chambers onto the same fibres and so reduce the number of links needed. Further reduction was achieved by employing zero suppression and data compression, so that a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be radiation tolerant to an ionizing dose of 100 Gy and to have a moderate tolerance to Single Event Effects (SEEs). This required several radiation test campaigns, and eventually led to ASICs being chosen for some of the critical parts. The system was made as reconfigurable as possible. The reconfiguration needs to be done remotely, as the electronics is not accessible except during short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is used extensively, and the firmware development for the FPGAs constituted a sizable part of the work. Special techniques were needed there too, to achieve the required radiation tolerance. The system has been demonstrated to work in several laboratory and beam tests, and we are now waiting to see it in action when the LHC starts running in the autumn of 2008.
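As a rough illustration of the zero-suppression step mentioned in the abstract, here is a minimal Python sketch, assuming a simple hit-address encoding; it is not the actual link-board logic, and the frame size and channel numbers are invented.

```python
# Toy sketch of zero suppression (not the CMS link-board firmware): rather
# than shipping every channel's bit for each bunch crossing, transmit only
# the addresses of the few channels that fired.

def zero_suppress(channel_bits):
    """Return the addresses of fired channels only."""
    return [addr for addr, fired in enumerate(channel_bits) if fired]

frame = [0] * 96                 # one 96-channel frame, mostly empty
for hit in (12, 47, 90):
    frame[hit] = 1

hits = zero_suppress(frame)
print(hits)                                          # [12, 47, 90]
print(f"raw bits: {len(frame)}, suppressed words: {len(hits)}")
```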
Abstract:
"Science as culture" is based on the assumption that science is a valuable component of human culture. We therefore have to build the bridge, in cultural terms, from the scientific community to the common citizen. Teaching science as culture requires the co-construction of knowledge and citizenship. Ways of articulating science/technology with society are invoked, pondering on the ethical ambivalence of such connections. The goals of this reflection are to think about: a) epistemological obstacles that, in favouring the logic of monoculture, oppose the implantation of the science as culture; b) epistemological strategies that point towards a diversity of cultural practices and "constellations" of knowledge leading to the reconfiguration of the being through knowledge; c) imperatives that force us to (re)think the epistemological bases suited to the paradigmatic changes and which translate the dynamics and complexity of the evolution of the frameworks that currently sustain science and school scientific education.
Abstract:
Owing to advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has long been to raise the operating frequency of the chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications, and such platforms are likely to be used in various application domains: from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the resources have to be utilized efficiently in terms of performance and power consumption. Moreover, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but is becoming an issue at ground level as well, can cause transient faults. These can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore various dynamic reconfiguration mechanisms and integrate them into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach in which the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.
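As a hypothetical illustration of the dynamic reconfiguration mechanisms described above, the Python sketch below shows an agent remapping tasks away from a core flagged as faulty. It is a toy, not derived from the thesis's Event-B models or VHDL output; Core, PlatformAgent and remap are invented names.

```python
# Toy agent-based reconfiguration on a many-core platform: the agent detects
# unhealthy cores and remaps their tasks onto the least-loaded healthy core.

from dataclasses import dataclass, field

@dataclass
class Core:
    ident: int
    healthy: bool = True
    tasks: list = field(default_factory=list)

class PlatformAgent:
    def __init__(self, cores):
        self.cores = cores

    def remap(self):
        """Move tasks off unhealthy cores onto the least-loaded healthy one."""
        healthy = [c for c in self.cores if c.healthy]
        if not healthy:
            return                      # nothing left to reconfigure onto
        for core in self.cores:
            if not core.healthy and core.tasks:
                target = min(healthy, key=lambda c: len(c.tasks))
                target.tasks.extend(core.tasks)
                core.tasks.clear()

cores = [Core(0, tasks=["fft"]), Core(1, tasks=["decode"]), Core(2)]
cores[0].healthy = False                # e.g. a radiation-induced fault
PlatformAgent(cores).remap()
print([(c.ident, c.tasks) for c in cores])  # task "fft" now runs on core 2
```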
Improving oral healthcare in Scotland with special reference to sustainability and caries prevention
Abstract:
Brett Duane Improving oral healthcare in Scotland with special reference to sustainability and caries prevention University of Turku, Faculty of Medicine, Institute of Dentistry, Community Dentistry, Finnish Doctoral Program in Oral Sciences (FINDOS-Turku), Turku, Finland Annales Universitatis Turkuensis, Sarja- Ser. D, Medica-Odontologica. Painosalama Oy, Turku, Finland, 2015. Dentistry must provide sustainable, evidence-based, and prevention-focused care. In Scotland oral health prevention is delivered through the Childsmile programme, with an increasing use of high concentration fluoride toothpaste (HCFT). Compared with other countries there is little knowledge of xylitol prevention. The UK government has set strict carbon emission limits with which all national health services (NHS) must comply. The purpose of these studies was firstly to describe the Scottish national oral health prevention programme Childsmile (CS) and to determine whether the additional maternal use of xylitol (CS+X) was more effective at affecting the early colonisation of mutans streptococci (MS) than this programme alone; secondly to analyse trends in the prescribing and management of HCFT by dentists; and thirdly to analyse data from a dental service in order to improve its sustainability. In all, 182 mother/child pairs were selected on the basis of high maternal MS levels. Mothers were randomly allocated to a CS or CS+X group, with both groups receiving Childsmile. The intervention group consumed xylitol three times a day, from when the child was 3 months old until 24 months. Children were examined at age two to assess MS levels. In order to understand patterns of HCFT prescribing, a retrospective secondary analysis of routine prescribing data for the years 2006-2012 was performed. To understand the sustainability of dental services, carbon accounting combined a top-down approach with a process analysis approach, followed by the use of Pollard's decision model (used in other healthcare areas) to analyse and support sustainable service reconfiguration. Of the CS children, 17% were colonised with MS, compared with 5% of the CS+X group. This difference was not statistically significant (P=0.1744). The cost of HCFT prescribing increased fourteen-fold over five years, with 4% of dentists prescribing 70% of the total product. Travel (45%), procurement (36%) and building energy (18%) all contributed to the 1800 tonnes of carbon emissions produced by the service, around 4% of total NHS emissions. Using the analytical model, clinic utilisation rates improved by 56% and patient travel halved, significantly reducing carbon emissions. It can be concluded that the Childsmile programme was effective in reducing the risk of MS transmission. HCFT prescribing is increasing in Scotland and needs to be managed. Dentistry's carbon emissions are proportionally similar to those of the NHS overall, and an analytic tool can be useful in helping identify these emissions. Key words: Sustainability, carbon emissions, xylitol, mutans streptococci, fluoride toothpaste, caries prevention.
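The reported emission breakdown can be turned back into absolute figures with a line of arithmetic; the snippet below simply restates the abstract's shares of the ~1800 tonnes (the missing 1% is rounding in the source):

```python
# Quick arithmetic check of the emission breakdown reported above,
# as shares of the service's ~1800 tonnes CO2e.
total_tonnes = 1800
shares = {"travel": 0.45, "procurement": 0.36, "building energy": 0.18}
for source, share in shares.items():
    print(f"{source:16s} {share * total_tonnes:6.0f} t")
print(f"accounted for: {sum(shares.values()):.0%}")
```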
Abstract:
Resilience is the property of a system to remain trustworthy despite changes. Changes of various kinds, whether due to failures of system components or to varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, and sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Owing to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature, industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy into formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect overall system resilience. The approach proposed in this thesis is validated by a number of case studies from such areas as robotics, space, healthcare and cloud computing.
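To give a flavour of the SimPy side of such an integration, here is a generic discrete-event fragment, assuming a simple fail-and-reconfigure cycle with invented rates; it is not taken from the thesis's models, but it shows the kind of quantitative question (availability under a given reconfiguration delay) such a simulation can answer.

```python
# Generic SimPy sketch: a node fails at random, the reconfiguration
# mechanism needs RECONF_DELAY time units to recover, and we estimate
# the resulting availability. All parameters are made up.

import random
import simpy

MTTF = 100.0          # mean time to (transient) failure, arbitrary units
RECONF_DELAY = 2.0    # time the reconfiguration mechanism needs to recover
HORIZON = 10_000
downtime = 0.0

def node(env):
    global downtime
    while True:
        yield env.timeout(random.expovariate(1.0 / MTTF))  # run until a fault
        start = env.now
        yield env.timeout(RECONF_DELAY)                    # reconfigure
        downtime += env.now - start

random.seed(1)
env = simpy.Environment()
env.process(node(env))
env.run(until=HORIZON)
print(f"estimated availability: {1 - downtime / HORIZON:.4f}")
```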
Abstract:
The international economic reconfiguration. Developed and developing countries have adjusted with varying degrees of success to the new international order. The world's evolution has not stopped: Europe and the emerging Asian economies are struggling to create a multipolar world. In the periphery, some countries (in Asia) are modernizing at a fast rate, while others (in Latin America) are lagging behind and need to revise their growth strategies. The decentralization of production and trade driven by transnational firms is shifting the geographic distribution of investment and employment. As a result, the industrialized countries have ceased to provide the bulk of the world's savings, somewhat changing the foundations of the international financial system.
Abstract:
Decisive factors affecting the recent increase in formal employment in Brazil. This paper gives a general overview of the evolution of labour-market indicators in Brazil between 1995 and 2005. It shows an overall increase in formal employment rates from 2001 to 2005, as opposed to what had happened from 1995 to 1999. It is argued that these recent trends might indicate a reconfiguration of the labour market on better terms, with potentially positive consequences for the financial performance of the Social Security sector. The paper also examines some of the major factors associated with this new trend and the chances of its being maintained in the near future. It is important to note that all of these factors may be subject to some kind of political management by the State. In other words, we suggest that the Brazilian State has sufficient instruments and operative skills to make these and other factors work in favour of a more persistent strategy of development with social inclusion through labour.
Abstract:
We examined three different algorithms used in diffusion Monte Carlo (DMC) to study their precision and accuracy in predicting properties of isolated atoms: the H atom ground state, the Be atom ground state and the H atom first excited state. All three algorithms (basic DMC, minimal stochastic reconfiguration DMC, and pure DMC, each with future-walking) were successfully implemented for ground-state energy and simple moment calculations, with satisfactory results. Pure diffusion Monte Carlo with the future-walking algorithm proved to be the simplest approach with the least variance. Polarizabilities for the Be atom ground state and the H atom first excited state are not satisfactorily estimated by the infinitesimal-differentiation approach. Likewise, an approach using the finite-field approximation with an unperturbed wavefunction for the latter system also fails. However, accurate estimates of the α-polarizabilities are obtained by using wavefunctions derived from time-independent perturbation theory. This suggests that the flaw in our approach to polarizability estimation for these difficult cases rests with our having assumed that the trial function is unaffected by infinitesimal perturbations in the Hamiltonian.
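For readers unfamiliar with the method, the following is a compact sketch of basic DMC (the first of the three algorithms compared above) applied to a toy system, the 1-D harmonic oscillator with exact ground-state energy 0.5 a.u., rather than the atoms studied in the thesis; the time step, walker count and population-control rule are illustrative choices.

```python
# Basic DMC for the 1-D harmonic oscillator: walkers diffuse freely, then
# branch (replicate/die) according to the local potential, while a running
# reference energy e_ref is steered to keep the population stable.

import numpy as np

rng = np.random.default_rng(0)
dt, n_target, n_steps = 0.01, 2000, 4000
walkers = rng.normal(0.0, 1.0, n_target)    # initial walker ensemble
e_ref, estimates = 0.5, []                  # reference energy and samples

for step in range(n_steps):
    # Diffuse: free Gaussian move of each walker over one time step.
    walkers = walkers + rng.normal(0.0, np.sqrt(dt), walkers.size)
    # Branch: stochastically round the branching weight into walker copies.
    v = 0.5 * walkers**2
    weights = np.exp(-dt * (v - e_ref))
    copies = (weights + rng.random(walkers.size)).astype(int)
    walkers = np.repeat(walkers, copies)
    # Population control: nudge e_ref to hold the population near n_target.
    e_ref = v.mean() + 0.1 * (1.0 - walkers.size / n_target) / dt
    if step >= n_steps // 2:
        estimates.append(e_ref)

print(f"basic-DMC estimate of E0: {np.mean(estimates):.3f} (exact 0.5)")
```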
Abstract:
This thesis critically examines the online marketing tactics of 10 (English language) Canadian cosmetic surgery clinics’ websites that offer Female Genital Cosmetic Surgery (FGCS), specifically, labiaplasty (labial reduction) and vaginoplasty (vaginal tightening). Drawing on a qualitative Multimodal Critical Discourse Analysis (MCDA) and a feminist-informed social constructionist framework (Lazar, 2007), I examine how FGCS discourses reiterate and reinforce heteronormative sexual scripts for women, and impose restrictive models of femininity through the pathologization of genital diversity and the appropriation of postfeminist and neoliberal discourses of individual choice and empowerment. I explore feminist analyses of the links between FGCS and contemporary Western women’s postfeminist subjectivity, and the reconfiguration of women’s sexual agency, to better understand what these contemporary shifts may mean for women’s sexual anxiety and expression. My analysis highlights several discourses that organize the online marketing material of Canadian FGCS websites, including: the pathologization of genital diversity; restrictive models of femininity; heteronormative sexual scripts; neoliberal and post-feminist rhetorics of individual choice and empowerment; and psychological and sexual transformation. Overall, these discourses undermine acceptance of women’s genital diversity, legitimize the FGCS industry and frame FGCS as the only viable solution to alleviate women’s genital and sexual distress despite the lack of evidence regarding the long-term benefits and risks of these procedures, and the recommendations against FGCS by professional medical organizations.
Abstract:
Anchored in a historical perspective, this thesis seeks to apply a rereading of the Weberian theory of "ethical rationalization" as an explanatory factor for the modern reconfiguration of the relationship between individuals and religion. A review of the changes in the religious thought of the Renaissance, contrasted with the religious situation of medieval populations, highlights the passage from a syncretic, ritualistic religion steeped in magic to a purified, interiorized and rational Christianity. The study of the religious thought of the humanist Erasmus of Rotterdam, taken as a "historical figure" carrying this transformation, points to the diffusion during the Renaissance of a Christianity understood as a comprehensive philosophical system stripped of its mystical character. This diffusion of a Christian "spirit", and the importance attached to a methodical conduct of life specifically oriented towards salvation, contributes first and foremost to a process of "routinization" of religious charisma, an essential prelude, in a Weberian perspective, to "ethical rationalization" and to the autonomization of the religious sphere in social life.
Abstract:
This thesis examines the controversy that took place in Quebec between March 2006 and December 2007 over the practice of "reasonable accommodation" on religious grounds. Using a comprehensive approach and a theoretical framework drawn from the sociology of ethnic relations, it offers a qualitative analysis of opinion letters published in Quebec daily newspapers. A first, thematic analysis identified the argumentative registers on which participants in the public debate on "reasonable accommodation" drew in their opinion letters. A second, comparative analysis constructed figures of participants in the public debate that reflect not only the ideological forces that confronted one another, but also their positioning at the intersection of the salient axes of social differentiation in this controversy. The results of these analyses suggest, first, that the controversy resulted from a conflict between identity markers meant to serve in the positioning of ethnic boundaries, and, second, that the "reasonable accommodation" controversy gave rise to a reconfiguration of ethnic relations in Quebec, attributable to the dissociation between the conflict between two nations and the conflict over the criteria for inclusion in the nation.
Abstract:
The repercussions of AIDS on the intellectual community foreshadowed a definite change in contemporary literary aesthetics. The writer's testimony of individual experience, at that moment of collective distress and social repression of the homosexual community, sought to provoke a reconfiguration of the space of confession through the projection of the private subject into the public sphere. This posture of laying oneself bare had already emerged in the feminist writings of the 1970s, but it underwent an important transformation in the 1980s and 1990s, since it was now the male subject who exposed himself, through the mediation of the body, in the narrative of illness in the age of AIDS. These discourses of the intimate attempted to bring the social and literary spaces closer together while asserting forms defined by heterogeneous ethics and aesthetics. The period of illness writing that closes Guibert's oeuvre is characterized by its anchoring in the social context of the AIDS epidemic. Consequently, the three narratives that constitute it, À l’ami qui ne m’a pas sauvé la vie (1990), Le protocole compassionnel (1991) and Cytomégalovirus (1992), form the triptych on which my reflection rests, together with the journal Guibert kept from adolescence until his death, Le mausolée des amants (2001), published ten years after the author's passing. This body of work belongs in part to the movement of illness testimony that took place between 1987 and 1991, a period during which the writer felt his vulnerability with respect to his health. Through these writings, I propose to study the writing of confession and denunciation as Guibert conceived it. The aim is to reflect on the strategies and functions of the literary testimony of such an experience through the narrativization of the subject. This posture of individual self-endangerment nonetheless raises a problem: there is no consensus on the necessity of self-revelation. Yet this search for intensity through confession, grounded in illness, sexuality and death, seeks to go beyond its apocalyptic dimension by attempting to inscribe the work within a social ethics. Hence the act of disclosure, in the mode of denunciation, is oriented towards the collective dimension by taking society and the community to task.
Abstract:
The obligation of information security, that is, the task incumbent on enterprises to ensure the integrity, confidentiality and availability of information, derives, in Quebec law as in most Western jurisdictions, from a series of legislative provisions that impose not the adoption of identifiable behaviours or the use of identifiable technologies or processes, but rather the implementation of "reasonable", "adequate" or "sufficient" security measures. Yet in a field as embryonic and complex as information security, where the available solutions are numerous and case law is sparse, how can an enterprise accurately gauge the extent of its obligation? In short, how can one establish what a reasonably prudent and diligent enterprise would do in a field where there is currently no legislative, jurisprudential or even customary guidepost for fixing the level of diligence imposed by the legislator? The lack of legal certainty in such a situation is patent and calls for a reconfiguration of the operative framework of the obligation of information security in order to identify its components and objectives. This exercise proceeds through a redefinition of the obligation of information security as an obligation to reduce the risks to information to a socially acceptable level. Indeed, since security can be defined as the management of risk, it is risk that lies at the heart of this obligation. By analysing the risks to a system, that is, by analysing the threats that aim to exploit its vulnerabilities, it is possible to establish which countermeasures are useful and the costs associated with their implementation. It then becomes feasible, by drawing on the economic definition of negligence and taking into account the probabilities of security breaches and the expected damages, to establish the optimal sums to invest in the purchase, maintenance and updating of these countermeasures. Such an analysis makes it possible to quantify with some degree of precision the extent of the obligation of information security, offering enterprises a tool based on tangible data to which they have free access and which fits easily into the contemporary legal context.
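The economic definition of negligence invoked above is commonly formalized in the style of the Learned Hand rule: a precaution is warranted when its burden B is less than the reduction in expected loss it buys, P × L. A toy calculation with invented figures:

```python
# Toy illustration of the expected-loss calculus the abstract invokes:
# choose the countermeasure level that minimizes burden + residual
# expected loss. All figures are invented for the example.

countermeasures = [
    # (name, annual cost B, breach probability after adoption)
    ("none",                    0, 0.10),
    ("encryption at rest", 20_000, 0.04),
    ("encryption + IDS",   55_000, 0.01),
]
L = 1_000_000  # expected damages from one breach

def total_cost(burden, p):
    return burden + p * L   # burden plus residual expected loss

best = min(countermeasures, key=lambda c: total_cost(c[1], c[2]))
for name, b, p in countermeasures:
    print(f"{name:20s} total expected cost: {total_cost(b, p):>10,.0f}")
print("optimal level of precaution:", best[0])
```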
Abstract:
This ethnography of border detention in France examines the phenomenon of transnational migration and the government of borders that responds to it, through practices of confinement and deportation of foreigners in an airport "waiting zone" (zone d'attente). The construction of camps for foreigners, of which this field is an instance, testifies to new distributions of power that operate through access to mobility. The empirical study of one particular form of this control is thus the starting point for a broader reflection on this regime of government, which sketches another political topography of globalization. The inquiry invites us to understand the practices by which democratic governments administer non-citizen populations, and the way in which these modes of care and surveillance bring about a reconfiguration of physical, moral and political borders. The ethnography explores the everyday experience of those who are the subjects of these intersecting legal and humanitarian regimes. The confinement of foreigners intertwines several dimensions, which organize the research: the construction of a humanitarian confinement, and the institutional and activist uses of the different regimes of rights at stake in it; the practices of care for populations identified as vulnerable; the reconfigurations of the border into new reticular and zonal forms; and, finally, the experience of circulation traced by the archipelagos of surveillance, and the practices of differential management of mobilities in which the waiting zones take part.
Abstract:
The Portuguese translation of the French Post-Vulgate version of the Quest of the Holy Grail, completed towards the end of the 13th century and entitled A Demanda do Santo Graal, offers an interesting prism through which to grasp, in context and through the play of displacement and reconfiguration, the imaginary characteristic of the Post-Vulgate cycle, which is otherwise difficult to access. The Demanda do Santo Graal allows a better understanding of the imaginary of the French romance and lets us follow its evolution beyond its linguistic borders, at a time when the romancer was no longer a translator but was becoming a figure of authority in his own right. This work essentially consists in seeing how the Post-Vulgate Queste, read in parallel with the Portuguese translation/adaptation, allows us to understand the evolution of the romance towards the end of the Middle Ages and the notion of the author, distinguishing between the translator, the creator and the romancer in medieval narrative, while bearing witness to the evolution of their literary subjectivity from the romance of the origins to that of the late Middle Ages.