84 results for generic finiteness
Abstract:
South Peak is a 7-Mm³ potentially unstable rock mass located adjacent to the 1903 Frank Slide on Turtle Mountain, Alberta. This paper presents three-dimensional numerical rock slope stability models and compares them with a previous conceptual slope instability model based on discontinuity surfaces identified using an airborne LiDAR digital elevation model (DEM). Rock mass conditions at South Peak are described using the Geological Strength Index and point load tests, whilst the mean discontinuity set orientations and characteristics are based on approximately 500 field measurements. A kinematic analysis was first conducted to evaluate probable simple discontinuity-controlled failure modes. The potential for wedge failure was further assessed by considering the orientation of wedge intersections over the airborne LiDAR DEM and through a limit equilibrium combination analysis. Block theory was used to evaluate the finiteness and removability of blocks in the rock mass. Finally, the complex interaction between discontinuity sets and the topography within South Peak was investigated through three-dimensional distinct element models using the code 3DEC. The influence of individual discontinuity sets, scale effects, friction angle, and persistence along the discontinuity surfaces on the slope stability conditions was also investigated using this code.
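The kinematic and limit-equilibrium screening steps described above can be illustrated with a short sketch. The snippet below is not the 3DEC model or the authors' analysis; it is a minimal Python illustration of a Markland-style kinematic test for planar sliding and a dry, single-plane limit-equilibrium factor of safety, with all geometry, strength and weight values invented for the example.

```python
import math

def planar_kinematic_feasible(slope_dip, slope_dip_dir, joint_dip, joint_dip_dir,
                              friction_angle, lateral_limit=20.0):
    """Markland-style test for planar sliding: the joint must daylight in the
    slope face, dip more steeply than the friction angle, and dip roughly in
    the same direction as the face (within `lateral_limit` degrees)."""
    daylights = joint_dip < slope_dip
    steeper_than_friction = joint_dip > friction_angle
    sub_parallel = abs((joint_dip_dir - slope_dip_dir + 180) % 360 - 180) <= lateral_limit
    return daylights and steeper_than_friction and sub_parallel

def planar_factor_of_safety(weight_kN, dip_deg, friction_deg, cohesion_kPa=0.0, area_m2=0.0):
    """Dry limit-equilibrium factor of safety for a block sliding on one plane:
    FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi))."""
    psi = math.radians(dip_deg)
    phi = math.radians(friction_deg)
    resisting = cohesion_kPa * area_m2 + weight_kN * math.cos(psi) * math.tan(phi)
    driving = weight_kN * math.sin(psi)
    return resisting / driving

if __name__ == "__main__":
    # Hypothetical values, not measurements from South Peak.
    feasible = planar_kinematic_feasible(slope_dip=55, slope_dip_dir=120,
                                         joint_dip=38, joint_dip_dir=128,
                                         friction_angle=30)
    fs = planar_factor_of_safety(weight_kN=5.0e4, dip_deg=38, friction_deg=30)
    print(f"kinematically feasible: {feasible}, FS = {fs:.2f}")
```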
Abstract:
Introduction: The general strategy for anti-doping analysis starts with a screening step followed by a confirmatory step when a sample is suspected to be positive. The screening step should be fast, generic and able to highlight any sample that may contain a prohibited substance, avoiding false negatives and reducing false positives. The confirmatory step is a dedicated procedure comprising a selective sample preparation and detection mode. Aim: The purpose of the study is to develop a rapid screening strategy and a selective confirmatory strategy to detect and identify 103 doping agents in urine. Methods: For the screening, urine samples were simply diluted by a factor of 2 with ultra-pure water and directly injected ("dilute and shoot") into the ultra-high-pressure liquid chromatography (UHPLC) system. The UHPLC separation was performed with two gradients (ESI positive and negative) from 5/95 to 95/5% MeCN/water containing 0.1% formic acid. The gradient analysis time is 9 min, including 3 min re-equilibration. Analyte detection was performed in full-scan mode on a quadrupole time-of-flight (QTOF) mass spectrometer by acquiring the exact mass of the protonated (ESI positive) or deprotonated (ESI negative) molecular ion. For the confirmatory analysis, urine samples were extracted on a 96-well SPE plate with mixed-mode cation-exchange (MCX) sorbents for basic and neutral compounds or mixed-mode anion-exchange (MAX) sorbents for acidic molecules. The analytes were eluted in 3 min (including 1.5 min re-equilibration) with a gradient from 5/95 to 95/5% MeCN/water containing 0.1% formic acid. Analyte confirmation was performed in MS and MS/MS mode on a QTOF mass spectrometer. Results: In the screening and confirmatory analyses, basic and neutral analytes were analysed in positive ESI mode, whereas acidic compounds were analysed in negative mode. Analyte identification was based on retention time (tR) and exact mass measurement. "Dilute and shoot" was used as a generic sample treatment in the screening procedure, but matrix effects (e.g., ion suppression) cannot be avoided. However, the sensitivity was sufficient for all analytes to reach the minimum required performance limit (MRPL) set by the World Anti-Doping Agency (WADA). To avoid time-consuming confirmatory analysis of false positive samples, a pre-confirmatory step was added: the sample is re-injected, MS/MS spectra are acquired, and these are compared to reference material. For the confirmatory analysis, urine samples were extracted by SPE, allowing pre-concentration of the analyte. A fast chromatographic separation was developed because only a single analyte has to be confirmed. A dedicated QTOF-MS and MS/MS acquisition was performed to acquire two scan functions in parallel within the same run: low collision energy was applied in the first channel to obtain the protonated molecular ion (QTOF-MS), while dedicated collision energy was set in the second channel to obtain fragment ions (QTOF-MS/MS). Enough identification points were obtained to compare the spectra with reference material and a negative urine sample. Finally, the entire process was validated and matrix effects were quantified. Conclusion: Thanks to the coupling of UHPLC with the QTOF mass spectrometer, high tR repeatability, sensitivity, mass accuracy and mass resolution over a broad mass range were obtained. The method was sensitive, robust and reliable enough to detect and identify doping agents in urine.
Keywords: screening, confirmatory analysis, UHPLC, QTOF, doping agents
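As a rough illustration of the screening logic described above (identification by retention time and exact mass), the following Python sketch matches hypothetical QTOF features against a reference list using a ppm mass tolerance and a retention-time window. Compound names, masses, retention times and tolerances are placeholders, not values from the study.

```python
# Placeholder reference list: (name, expected [M+H]+ m/z, expected tR in minutes).
REFERENCE = [
    ("analyte_A", 300.1594, 2.45),
    ("analyte_B", 415.2122, 4.10),
]

def ppm_error(measured_mz, expected_mz):
    """Mass error of a measured m/z relative to the expected value, in ppm."""
    return (measured_mz - expected_mz) / expected_mz * 1e6

def screen(features, mz_tol_ppm=5.0, rt_tol_min=0.2):
    """Return presumptive hits: features matching a reference analyte on both
    accurate mass and retention time within the given tolerances."""
    hits = []
    for mz, rt in features:
        for name, ref_mz, ref_rt in REFERENCE:
            if abs(ppm_error(mz, ref_mz)) <= mz_tol_ppm and abs(rt - ref_rt) <= rt_tol_min:
                hits.append((name, mz, rt, ppm_error(mz, ref_mz)))
    return hits

if __name__ == "__main__":
    detected = [(300.1590, 2.47), (500.3001, 6.00)]   # hypothetical QTOF features
    for name, mz, rt, err in screen(detected):
        print(f"presumptive {name}: m/z {mz:.4f} ({err:+.1f} ppm), tR {rt:.2f} min")
```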
Abstract:
This thesis examines the use of Industrial Symbiosis concepts in developing countries and studies the potential of this strategy to stimulate sustainable regional development in the rural areas of West Africa. In particular, when an industrial symbiosis is established between a factory and the surrounding population, evaluation tools are needed to ensure that the project achieves truly sustainable development. Existing tools, developed in industrialized countries, are not entirely suited to assessing projects in developing countries: they carry implicit assumptions specific to the socio-economic context in which they were designed. The goal of this thesis is therefore to develop a methodological framework for evaluating the sustainability of Industrial Symbiosis projects in developing countries. To do so, I draw on a case study of the implementation of an industrial symbiosis in northern Nigeria, which I have followed as an observer since 2007. AshakaCem, a cement plant of the Lafarge group, faces numerous tensions with the surrounding rural population and therefore decided to adopt a new approach inspired by the concepts of Industrial Symbiosis. The project consists of replacing up to 10% of the fossil fuel used to fire the raw meal (limestone and additives) with biomass produced by local farmers. To avoid jeopardizing the fragile regional food security, farmers are taught erosion-control and natural soil-fertilization techniques, so that biomass cultivation also improves their subsistence crops. Through this industrial symbiosis the company pursues social objectives (laying the foundations for regional development), as well as environmental ones (reducing its overall CO2 emissions) and economic ones (reducing its energy costs), anchoring itself in a perspective of sustainable development that is conditional on the project's execution. From observing this symbiosis and from knowledge of existing tools, I find that assessing the sustainability of projects in developing countries requires evaluation criteria specific to each project: in this context, generic criteria produce an assessment too far removed from local needs and realities. Drawing inspiration from internationally recognized tools such as Life Cycle Analysis and the Global Reporting Initiative, I therefore define a methodological framework that can itself be the same for all projects and that supports the participative establishment of an evaluation methodology specific to each one.
The strategy follows six phases, carried out iteratively so that both the evaluation methodology and the project itself improve over time. During these phases, the social, economic and environmental needs and objectives of the stakeholders are identified, grouped, ranked and expressed as criteria to be evaluated; quantitative or qualitative indicators are then defined for each criterion. One characteristic of this strategy is a five-point evaluation scale, the same for every indicator, ranging from a goal fully reached (++) to a goal not reached at all (--). Applying the methodological framework to the Nigerian symbiosis yielded four economic criteria, four socio-economic criteria and six environmental criteria to assess, characterized by 22 indicators. Evaluating these indicators showed that the project meets the sustainability goals set for the majority of the criteria; four indicators had a neutral result (0), and a fifth showed that one criterion had not been met (--). These results are explained by the fact that the project is still in its pilot phase and has therefore not yet reached its optimum size and scope; follow-up over several years will make it possible to ensure that these gaps are filled. The methodological framework developed in this thesis is a participative evaluation tool that can be used in a broader context than developing countries; its generic nature makes it well suited to defining criteria and follow-up indicators for sustainable development.
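The five-point rating scale described above (from "--" to "++") lends itself to a small illustration. The Python sketch below shows one hypothetical way such a scale could be applied to quantitative indicators via threshold cut points; the indicators and thresholds are invented and are not the 22 indicators defined in the thesis.

```python
# Each indicator is rated on the same five-point scale, from "--" (goal not
# reached at all) to "++" (goal fully reached).
SCALE = ["--", "-", "0", "+", "++"]

def rate(value, thresholds):
    """Map a quantitative indicator value onto the five-point scale.
    `thresholds` are the four cut points separating the five grades,
    in increasing order (higher value = better)."""
    grade = sum(value >= t for t in thresholds)   # 0..4
    return SCALE[grade]

if __name__ == "__main__":
    # Hypothetical indicator: share of fossil fuel replaced by biomass (%).
    print(rate(6.5, thresholds=[2, 4, 6, 8]))     # -> "+"
    # Hypothetical indicator: increase in farmers' income (%).
    print(rate(1.0, thresholds=[5, 10, 15, 20]))  # -> "--"
```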
Abstract:
The central issue of this work is the relationship between environmental finiteness and individual liberty. By environmental finiteness one should understand the set of diverse ecological constraints that place limits on human action. These limits are of two general kinds: on the one hand the availability of natural resources, and on the other hand the carrying capacity of ecosystems and of the major global biogeochemical cycles (chapter 1). The thesis defended here is that libertarian and liberal conceptions of liberty conflict with the necessity of taking such limits into account, and that a neo-republican approach is better suited to addressing these ecological issues. Libertarian theories, right-wing as well as left-wing, cannot take the finiteness of natural resources into account because they maintain an unlimited right of individuals to appropriate those resources. This point contradicts the systemic nature of scarcity and the absence of substitutes for some resources essential to the pursuit of a decent life (chapters 2 and 3). The liberal doctrine of neutrality, as associated with the harm principle, is unsuitable for addressing global environmental problems such as climate change: the causal mechanisms leading to environmental harm are indirect and diffuse, which prevents the assignment of responsibilities at the individual level.
This makes the justification of coercive environmental policies difficult (chapter 4). These difficulties stem above all from two characteristic features of libertarian and liberal doctrines: their atomistic social ontology and their conception of freedom as liberty of choice. Philip Pettit's neo-republicanism, on the other hand, is able to address both problems thanks to its holist social ontology and its conception of liberty as non-domination. This doctrine thus offers a conception of liberty compatible with environmental finiteness, together with the theoretical resources to justify demanding environmental policies without the sacrifice in terms of liberty appearing too great (chapter 5).
Abstract:
GENERIC EXPERIMENTATION AND INTERTEXTUAL DIALOGISM: PERRAULT, LA FONTAINE, APULEIUS, STRAPAROLA AND BASILE. According to the hypothesis presented in this study by Ute Heidmann, Perrault's tales in verse and in prose engage in a very complex intertextual dialogism with the generic forms of the tale already existing in the European literatures. The academician undertakes a genuine "generic experimentation" in the course of which he creates new generic forms, new plots and new figures from the fabella of Psyche embedded in Apuleius's Metamorphoses and from its galant rewriting by La Fontaine in Les Amours de Psiché et de Cupidon (1669). This double intertextual dialogue is further underpinned by recourse to the favole of Straparola (Le Piacevole notti) and the cunti of Basile (Lo cunto de li cunti), celebrated Italian storytellers who had already reconfigured in original ways certain episodes and characters of the Psyche fabella. This complex dialogic process is brought to light through the successive (inter)textual analysis of the first three prose tales, La Belle au bois dormant (Sleeping Beauty in the Woods), Le Petit Chaperon rouge (Little Red Riding Hood) and La Barbe bleue (Blue Beard). The comparative analysis shows that Perrault invents them from the three successive moments of the most difficult ordeal that Venus inflicts on Psyche, the descent into the Underworld. By introducing significant differences with respect to the Latin, Italian and French models, the academician succeeds in creating a new generic variation: the pseudo-naïve tale endowed with an encrypted meaning that reveals itself "more or less according to the degree of penetration of those who read them".
Abstract:
Rhinoviruses and enteroviruses are leading causes of respiratory infections. To evaluate genotypic diversity and identify forces shaping picornavirus evolution, we screened persons with respiratory illnesses by using rhinovirus-specific or generic real-time PCR assays. We then sequenced the 5′ untranslated region, capsid protein VP1, and protease precursor 3CD regions of virus-positive samples. Subsequent phylogenetic analysis identified the large genotypic diversity of rhinoviruses circulating in humans. We identified and completed the genome sequence of a new enterovirus genotype associated with respiratory symptoms and acute otitis media, confirming the close relationship between rhinoviruses and enteroviruses and the need to detect both viruses in respiratory specimens. Finally, we identified recombinants among circulating rhinoviruses and mapped their recombination sites, thereby demonstrating that rhinoviruses can recombine in their natural host. This study clarifies the diversity of these viruses and the forces driving their evolution.
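As a toy illustration of one ingredient of genotype comparison (the study itself relies on phylogenetic analysis of the 5′UTR, VP1 and 3CD regions), the sketch below computes pairwise nucleotide identity between two aligned sequences; the sequences and the similarity cut-off are invented.

```python
def pairwise_identity(seq_a, seq_b):
    """Percent identity over aligned positions, ignoring gap-gap columns."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    compared = matches = 0
    for a, b in zip(seq_a.upper(), seq_b.upper()):
        if a == "-" and b == "-":
            continue
        compared += 1
        matches += (a == b)
    return 100.0 * matches / compared

if __name__ == "__main__":
    query = "ATGGCTTTCACCGA-TGCA"   # invented aligned fragments
    ref   = "ATGGCATTCACCGAATGCA"
    ident = pairwise_identity(query, ref)
    # Purely illustrative cut-off; the study does not define genotypes this way.
    print(f"identity {ident:.1f}%, above 90% threshold: {ident >= 90.0}")
```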
Abstract:
This article is a taxonomic study of the radiolarian species of the superfamilies Eptingiacea and Saturnaliacea occurring in the middle Carnian fauna from the Koseyahya section, near the town of Elbistan, southeastern Turkey. This fauna is characteristic of the Tetraporobrachia haeckeli Radiolarian Zone as defined in Austria and later found also in Turkey and Oman. It comes from an 8-m-thick succession of clayey/cherty limestones from the lower part of the section. In addition, a few species from the late Ladinian and Carnian of Oman and the early Norian of Alaska have also been included in this study, in order to improve some generic diagnoses and to show the diversity and evolutionary trends of some genera. In total, 32 radiolarian species, of which 22 are new, are described and illustrated and assigned to 16 genera, of which three are new (Capnuchospyris, Veleptingium, and Triassolaguncula). The diagnoses of some species, genera, subfamilies and families have been revised, and the family Eptingiidae has been raised to the rank of superfamily.
Abstract:
OBJECTIVE: In general population survey instruments that measure volume of drinking, additional questions and shorter reference periods yield higher volumes. Comparison studies have focused on volume but not on associations between volume and consequences. METHOD: From a cohort study on substance use risk factors (Cohort Study on Substance Use Risk Factors [C-SURF]), baseline data were analyzed for 5,074 young (approximately 20-year-old) men who were drinkers in the past 12 months. Volume of drinking was measured by a generic quantity-frequency (QF) instrument, an extended QF instrument (asked separately for weekends and weekdays) with 12-month recall, and a retrospective past-week diary. Associations of consequences with and without attribution of alcohol as a cause, Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV), criteria for dependence, and DSM-5 alcohol use disorder in the past 12 months were analyzed. RESULTS: The generic QF measure resulted in lower volume compared with either the extended QF measure (more questions) or the retrospective diary (the most questions and the shortest recall period). For outcomes, however, the extended QF assessment performed best and the diary the worst. CONCLUSIONS: Higher volume yields are not always better regarding associations with outcomes. The extended QF instrument better captures the variability of drinking. The retrospective diary performs poorly for associations because of the mismatch with the recall period for past-12-month consequences and the potential for misclassification of past-week abstainers and heavy drinkers because of an uncommon past week. Diaries are not recommended for research investigating individual associations between exposure and outcomes in young populations if consequences are measured over a sufficiently long interval to capture rare consequences. (J. Stud. Alcohol Drugs, 75, 880-888, 2014).
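The three volume measures compared in the study can be illustrated schematically. The Python sketch below computes a weekly volume from a generic quantity-frequency answer, an extended weekday/weekend QF answer, and a past-week diary; the questions are paraphrased and the respondent's answers are invented, so this is not the C-SURF instrument itself.

```python
def generic_qf(drinks_per_occasion, occasions_per_week):
    """Generic quantity-frequency: one quantity and one frequency question."""
    return drinks_per_occasion * occasions_per_week

def extended_qf(weekday_drinks, weekday_days, weekend_drinks, weekend_days):
    """Extended QF: quantity and frequency asked separately for weekdays
    and weekend days, then summed to a weekly volume."""
    return weekday_drinks * weekday_days + weekend_drinks * weekend_days

def past_week_diary(daily_drinks):
    """Retrospective diary: drinks recalled for each of the last 7 days."""
    return sum(daily_drinks)

if __name__ == "__main__":
    # Hypothetical respondent who drinks mostly on weekends.
    print("generic QF :", generic_qf(drinks_per_occasion=4, occasions_per_week=2))
    print("extended QF:", extended_qf(weekday_drinks=1, weekday_days=1,
                                      weekend_drinks=6, weekend_days=2))
    print("diary      :", past_week_diary([0, 0, 1, 0, 0, 7, 5]))
```

With these invented answers the generic QF yields the lowest weekly volume, mirroring the pattern reported in the abstract that more questions and shorter recall periods produce higher volumes.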
Abstract:
This contribution explores the role of international standards in the rules governing the internationalisation of the service economy. It analyses, on a cross-institutional basis, patterns of authority in the institutional setting of service standards in the European and American contexts. The entry into force of the World Trade Organisation (WTO) in 1995 gave international standards a major role in harmonising the technical specifications of goods and services traded on the global market. Despite the careful wording of the WTO, a whole range of international bodies still have the capacity to define generic as well as detailed technical specifications affecting how swelling offshore services are expected to be traded on a worldwide basis. The analysis relies on global political economy approaches to identify constitutive patterns of authority mediating between the political and the economic spheres in a transnational space. It extends to the area of service standards the assumption that the process of globalisation does not oppose states and markets, but is a joint expression of both of them, including new patterns and agents of structural change through formal and informal power and regulatory practices. The paper argues that service standards reflect the significant development of a form of transnational hybrid authority that blurs the distinction between private and public actors, whose scope spreads from physical measures to societal values, and which reinforces the deterritorialisation of regulatory practices in contemporary capitalism. It provides evidence for this argument by analysing the current European strategy regarding service standardisation in response to several programming mandates of the European Commission, and the American views on the future development of service standards.
Abstract:
Background: Disease management, a system of coordinated health care interventions for populations with chronic diseases in which patient self-care is a key aspect, has been shown to be effective for several conditions. Little is known about the supply of disease management programs in Switzerland. Objectives: To systematically search, record and evaluate data on existing disease management programs in Switzerland. Methods: Programs met our operational definition of disease management if their interventions targeted a chronic disease, included a multidisciplinary team and lasted at least 6 months. To find existing programs, we searched official Swiss websites, Swiss web pages using Google and a medical electronic database (Medline), and checked references from selected documents. We also contacted personally known individuals, those identified as possibly working in the field, individuals working in major Swiss health insurance companies and people recommended by previously contacted persons (snowball strategy). We developed an extraction grid and collected information pertaining to the following 8 domains: patient population, intervention recipient, intervention content, delivery personnel, method of communication, intensity and complexity, environment, and clinical outcomes. Results: We identified 8 programs fulfilling our operational definition of disease management. Programs targeted patients with diabetes, hypertension, heart failure, obesity, alcohol dependence, psychiatric disorders or breast cancer, and were mainly directed towards patients. The interventions were multifaceted and included education in almost all cases. Half of the programs included regularly scheduled follow-up, by phone in 3 instances. The healthcare professionals involved were physicians, nurses, case managers, social workers, psychologists and dietitians. None fulfilled the 6 criteria established by the Disease Management Association of America. Conclusions: Our study shows that disease management programs are scarce in a country with universal health insurance coverage and little incentive to develop new healthcare strategies, although we may have missed existing programs. Nonetheless, those already implemented are very interesting and rather comprehensive. Appropriate evaluation of these programs should be performed in order to build upon them and to try to design a generic disease management framework suited to the Swiss healthcare system.
Abstract:
MOTIVATION: Regulatory gene networks contain generic modules, such as feedback loops, that are essential for the regulation of many biological functions. The study of the stochastic mechanisms of gene regulation is instrumental for understanding how cells maintain their expression at levels commensurate with their biological role, as well as for engineering gene expression switches with appropriate behavior. The lack of precise knowledge of the steady-state distribution of gene expression requires the use of Gillespie algorithms and Monte Carlo approximations. METHODOLOGY: In this study, we provide new exact formulas and efficient numerical algorithms for computing the steady state of a class of self-regulated genes, and we use them to model the stochastic expression of a gene of interest in an engineered network introduced into mammalian cells. The behavior of the genetic network is then analyzed experimentally in living cells. RESULTS: Stochastic models often reveal counter-intuitive experimental behaviors, and we find that this genetic architecture displays a unimodal behavior in mammalian cells, which was unexpected given its known bimodal response in unicellular organisms. We provide a molecular rationale for this behavior and incorporate it into the mathematical model to explain the experimental results obtained with this network.
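For readers unfamiliar with the Monte Carlo route mentioned above, the sketch below runs a minimal Gillespie simulation of a negatively self-regulated gene (synthesis repressed by the protein's own copy number, first-order degradation) and estimates its steady-state mean. It only illustrates the stochastic simulation approach that the exact formulas are meant to replace; the rate constants are invented and the model is far simpler than the engineered network studied.

```python
import random

# Minimal Gillespie stochastic simulation of a negatively self-regulated gene:
# production = k_max / (1 + (x/K)^hill), degradation = gamma * x.
def gillespie_ssa(k_max=20.0, K=30.0, hill=2.0, gamma=0.5, x0=0, t_end=200.0, seed=1):
    random.seed(seed)
    t, x, samples = 0.0, x0, []
    while t < t_end:
        production = k_max / (1.0 + (x / K) ** hill)   # repressed synthesis rate
        degradation = gamma * x                         # first-order decay rate
        total = production + degradation
        t += random.expovariate(total)                  # waiting time to next event
        if random.random() < production / total:
            x += 1                                      # one protein produced
        else:
            x -= 1                                      # one protein degraded
        samples.append(x)
    return samples

if __name__ == "__main__":
    traj = gillespie_ssa()
    tail = traj[len(traj) // 2:]                        # discard the burn-in half
    print("approximate steady-state mean copy number:", sum(tail) / len(tail))
```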
Abstract:
This article examines the extent and limits of non-state forms of authority in international relations. It analyses how the information and communication technology (ICT) infrastructure underlying the tradability of services in a global knowledge-based economy relies on informal regulatory practices for the adjustment of ICT-related skills. Companies and associations provide training and certification programmes as part of a growing market for educational services, setting their own standards. The existing literature on non-conventional forms of authority in the global political economy has emphasised that the consent of actors subject to informal rules, together with explicit or implicit state recognition, remains crucial for the effectiveness of these new forms of power. However, analyses based on a limited sample of actors tend toward a narrow understanding of the issues and fail to fully explore the differentiated space in which non-state authority is emerging. This paper examines the form of authority underpinning the global knowledge-based economy within the broader perspective of the issues likely to be standardised by technical ICT specifications, the wide range of actors involved, and the highly differentiated space in which standards become authoritative. The empirical findings highlight the role of different private actors in establishing international educational norms in this field. They also pinpoint the limits of profit-oriented standard-setting, notably with regard to generic norms.
Abstract:
General introduction: The Human Immunodeficiency Virus/Acquired Immunodeficiency Syndrome (HIV/AIDS) epidemic, despite recent encouraging announcements by the World Health Organization (WHO), is still today one of the world's major health care challenges. The present work lies in the field of health care management; in particular, we aim to evaluate the behavioural and non-behavioural interventions against HIV/AIDS in developing countries through a deterministic simulation model, in both human and economic terms. We focus on assessing the effectiveness of antiretroviral therapies (ART) in heterosexual populations living in less developed countries where the epidemic has generalized (formerly defined by the WHO as type II countries). The model is calibrated using Botswana as a case study; however, it can be adapted to other countries with similar transmission dynamics. The first part of this thesis reviews the main mathematical concepts describing the transmission of infectious agents in general, with a focus on human immunodeficiency virus (HIV) transmission. We also review deterministic models assessing HIV interventions, with a focus on models aimed at African countries. This review helps us recognize the need for a generic model and allows us to define a typical structure for such a generic deterministic model. The second part describes the main feedback loops underlying the dynamics of HIV transmission. These loops represent the foundation of our model. This part also provides a detailed description of the model, including the various infected and non-infected population groups, the types of sexual relationships, the infection matrices, and important factors impacting HIV transmission such as condom use, other sexually transmitted diseases (STD) and male circumcision. We also included in the model a dynamic life expectancy calculator which, to our knowledge, is a unique feature allowing more realistic cost-efficiency calculations. Various intervention scenarios are evaluated using the model, each of them combining ART with other interventions, namely circumcision, campaigns aimed at behavioural change (Abstain, Be faithful or use Condoms, also named ABC campaigns), and treatment of other STDs. A cost-efficiency analysis (CEA) is performed for each scenario; the CEA consists of measuring the cost per disability-adjusted life year (DALY) averted. This part also describes the model calibration and validation, including a sensitivity analysis. The third part reports the results and discusses the model limitations; in particular, we argue that the combinations of ART with ABC campaigns and of ART with treatment of other STDs are the most cost-efficient interventions through 2020. The main model limitations include modeling the complexity of sexual relationships, the omission of international migration, and ignoring variability in infectiousness according to the AIDS stage. The fourth part reviews the major contributions of the thesis and discusses model generalizability and flexibility. Finally, we conclude that by selecting an adequate mix of interventions, policy makers can significantly reduce adult prevalence in Botswana in the coming twenty years, provided the country and its donors can bear the cost involved.
Part I: Context and literature review. In this section, after a brief introduction to the general literature, we focus in section two on the key mathematical concepts describing the transmission of infectious agents in general, with a focus on HIV transmission. Section three provides a description of HIV policy models, with a focus on deterministic models. This leads us in section four to envision the need for a generic deterministic HIV policy model and to briefly describe the structure of such a generic model applicable to countries with a generalized HIV/AIDS epidemic, also defined as pattern II countries by the WHO.
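To make the modelling approach concrete, the sketch below integrates a deliberately tiny deterministic compartment model (susceptible and infected only) with and without a generic intervention that lowers transmission, and derives an illustrative cost per DALY averted. All parameters, the disability weight and the cost figure are invented, and the structure omits the population groups, mixing matrices, STD cofactors and life-expectancy calculator of the actual thesis model.

```python
# Toy deterministic S-I model, integrated with a fixed time step.
def simulate(beta, years=20, steps_per_year=52, s0=750_000.0, i0=250_000.0,
             recruitment=0.02, mortality=0.02, aids_mortality=0.08):
    dt = 1.0 / steps_per_year
    s, i = s0, i0
    person_years_infected = 0.0
    for _ in range(years * steps_per_year):
        n = s + i
        new_infections = beta * s * i / n            # frequency-dependent transmission
        ds = recruitment * n - new_infections - mortality * s
        di = new_infections - (mortality + aids_mortality) * i
        s, i = s + ds * dt, i + di * dt
        person_years_infected += i * dt
    return i / (s + i), person_years_infected

if __name__ == "__main__":
    prev_base, pyi_base = simulate(beta=0.35)            # no intervention
    prev_int,  pyi_int  = simulate(beta=0.35 * 0.6)      # hypothetical ART + ABC effect
    dalys_averted = (pyi_base - pyi_int) * 0.5           # illustrative disability weight
    programme_cost = 2_000_000.0                          # illustrative total cost
    print(f"prevalence after 20 years: {prev_base:.1%} -> {prev_int:.1%}")
    print(f"cost per DALY averted: {programme_cost / dalys_averted:,.0f}")
```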
Abstract:
BACKGROUND: Little is known about the health status of prisoners in Switzerland. The aim of this study was to provide a detailed description of the health problems presented by detainees in Switzerland's largest remand prison. METHODS: In this retrospective cross-sectional study we reviewed the health records of all detainees leaving Switzerland's largest remand prison in 2007. The health problems were coded using the International Classification of Primary Care (ICPC-2). Analyses were descriptive, stratified by gender. RESULTS: A total of 2195 health records were reviewed. Mean age was 29.5 years (SD 9.5); 95% were male; 87.8% were migrants. Mean length of stay was 80 days (SD 160). Illicit drug use (40.2%) and mental health problems (32.6%) were frequent, but most of these detainees (57.6%) had more generic primary care problems, such as skin (27.0%), infectious disease (23.5%), musculoskeletal (19.2%), injury-related (18.3%), digestive (15.0%) or respiratory (14.0%) problems. Furthermore, 7.9% reported exposure to violence during arrest by the police. CONCLUSION: Morbidity is high in this young, predominantly male population of detainees, in particular in relation to substance abuse. Other health problems more commonly seen in general practice are also frequent. These findings support the further development of coordinated primary care and mental health services within detention centers.
Abstract:
In the context of the investigation of the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study presents investigations into the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios, which allows the evaluation of mark-to-print comparisons. In particular, this model, through its use of AFIS technology, benefits from the possibility of using a large amount of data, as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores derived from comparisons between impressions from the same source and showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR, which we refer to by the generic term of between-finger variability. The issues addressed in this paper in relation to between-finger variability are the required sample size, the influence of the finger number and general pattern, and the influence of the number of minutiae included and their configuration on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies for obtaining between-finger variability when these elements (and, for the finger number, the position of the mark with respect to other marks) cannot be conclusively determined from the mark are also presented. These results immediately allow case-by-case estimation of between-finger variability in an operational setting.
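A minimal sketch of the likelihood-ratio construction described above is given below: same-source scores feed the numerator density, a large set of scores against non-matching fingers (the between-finger variability) feeds the denominator, and the LR is the ratio of the two densities at the observed score. The score distributions here are simulated Gaussians rather than real AFIS output, and the kernel density estimator is a generic choice, not necessarily the one used by the authors.

```python
import math
import random
import statistics

def gaussian_kde(samples):
    """Return a density estimate f(x) using a Gaussian kernel with Silverman's
    rule-of-thumb bandwidth."""
    n = len(samples)
    h = 1.06 * statistics.stdev(samples) * n ** (-1 / 5)
    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples) / (n * h * math.sqrt(2 * math.pi))
    return density

if __name__ == "__main__":
    random.seed(0)
    # Hypothetical score distributions: non-matches score low, true matches higher.
    between_finger = [random.gauss(120, 25) for _ in range(10_000)]   # denominator data
    same_source    = [random.gauss(260, 60) for _ in range(500)]      # numerator data
    f_diff = gaussian_kde(between_finger)
    f_same = gaussian_kde(same_source)
    score = 200.0   # AFIS score of the questioned mark against the suspect's print
    lr = f_same(score) / f_diff(score)
    print(f"LR at score {score}: {lr:.1f}")
```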