891 results for Resource Constrained


Relevance:

60.00%

Publisher:

Abstract:

Introduction: Global efforts to control tuberculosis are currently constrained by the rising prevalence of HIV/AIDS. Although outbreaks of multidrug-resistant tuberculosis (MDR-TB) are frequently reported among populations with AIDS, the link between HIV/AIDS and the development of drug resistance remains unclear. Objectives: This research aimed to: (1) build a knowledge base on the factors associated with MDR-TB outbreaks among patients with HIV/AIDS; (2) use this framework to develop preliminary measures for better control of pulmonary tuberculosis in patients with HIV/AIDS; and (3) refine existing bacteriological techniques for Mycobacterium tuberculosis in order to improve the application of these measures. Methods: Four studies were conducted: (1) a longitudinal study to identify factors associated with an MDR-TB outbreak among AIDS patients who received directly observed treatment, short-course (DOTS) for pulmonary tuberculosis in Lima, Peru, between 1999 and 2005; (2) a cross-sectional study describing, at different stages of the natural history of tuberculosis, the prevalence of and factors associated with mycobacteria found in the stools of AIDS patients; (3) a pilot project to develop screening strategies for pulmonary tuberculosis among hospitalized AIDS patients using the Microscopic Observation Drug Susceptibility (MODS) assay; and (4) a laboratory study to identify the best critical concentrations for detecting MDR strains of M. tuberculosis with the MODS assay. Results: Study 1 showed that an MDR-TB epidemic among AIDS patients who received DOTS for pulmonary tuberculosis was caused by superinfection with a single M. tuberculosis clone rather than by the development of secondary resistance. Although this clone was more common in the AIDS cohort, the risk of superinfection did not differ between patients with and without AIDS. These results suggest that another factor, possibly associated with diarrhea, may contribute to the high prevalence of this clone in AIDS patients. Study 2 suggests that mycobacteria were found in the stools of most AIDS patients in the terminal phase of pulmonary tuberculosis; however, AIDS patients who had been hospitalized during the previous two years for another medical condition were at lower risk of having mycobacteria in their stools. Study 3 confirmed that pulmonary tuberculosis was common among hospitalized AIDS patients but was incorrectly diagnosed using the currently recommended clinical criteria for tuberculosis, whereas the MODS assay detected most of these cases. Moreover, MODS was equally effective when targeted at patients suspected of having tuberculosis on the basis of their symptoms. Study 4 demonstrates the difficulty of detecting M. tuberculosis strains with low-level resistance to ethambutol and streptomycin when using the MODS assay at the drug concentrations currently recommended for the culture medium. However, the diagnostic utility of MODS can be improved by modifying the critical concentrations and by using two plates instead of one for routine testing. Conclusion: Our studies highlight the need to improve the diagnosis and treatment of tuberculosis among AIDS patients, particularly those living in resource-poor regions. Our results also underline the indirect effects that health care provided to HIV-infected patients can have on the development of tuberculosis.

Relevance:

60.00%

Publisher:

Abstract:

Background: For patients with early rheumatoid arthritis (ERA), the use of disease-modifying antirheumatic drugs significantly improves patient outcomes. Patients treated by a rheumatologist are more likely to receive these treatments and therefore to have better health outcomes. However, the observed delays between symptom onset and a first visit with a rheumatologist often exceed the three months recommended by practice guidelines. In Quebec, the wait time to see a rheumatologist after a referral is generally long and contributes to the total delay. Objectives: We evaluated the ability of a rapid-access program with nurse-led triage to correctly identify patients with ERA and to reduce their wait time, with the goal of improving the care process. Methods: A nurse assessed all new patients referred in 2009 and 2010 to a rheumatology clinic in suburban Montreal. A priority level was assigned to each patient based on the content of the referral, information obtained in a telephone interview with the patient and, if required, a partial joint examination. Patients with ERA, with undifferentiated inflammatory arthritis, or with another acute rheumatologic condition were prioritized and given an appointment as quickly as possible. The main outcome measures were the validity (sensitivity and specificity) of the triage for patients with ERA and the delay between referral and the first visit with a rheumatologist. Results: Of the 701 newly referred patients, 65 received a final diagnosis of ERA. Triage correctly identified 85.9% of these patients and 87.2% of patients with one of the priority conditions. The median delay between referral and first visit was 22 days for patients with ERA and 115 days for all others. Discussion and conclusion: This rapid-access program with nurse-led triage correctly identified most patients with ERA, who could then be seen quickly by the rheumatologist. Since the program requires a substantial investment of time and staff, feasibility issues must be resolved before such a program can be implemented in a health care system with very limited resources.

Relevance:

60.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis can significantly improve software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making it more effective at early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, thus improving both the debugging process and the quality of the code.

Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified.

This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
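As an illustration of the redundant bank-switch detection described above, here is a minimal sketch assuming a straight-line (single execution path) instruction stream and a simplified (mnemonic, register, bit) encoding; the dissertation's tool tracks active-bank states across all execution paths via the state transition diagram, so this shows only the single-path core of the idea, with illustrative names.

```python
# Sketch: flag redundant bank-select instructions in a linearized
# PIC16F87X instruction stream by tracking the active-bank state.
# On this family, BSF/BCF on the STATUS bits RP0/RP1 select the bank.

def find_redundant_bank_switches(instructions):
    """Return indices of bank-select instructions that rewrite a
    STATUS bank bit with the value it already holds."""
    state = {"RP0": 0, "RP1": 0}  # reset state: bank 0 active
    findings = []
    for idx, (mnemonic, reg, bit) in enumerate(instructions):
        if reg == "STATUS" and bit in state:
            target = 1 if mnemonic == "BSF" else 0
            if state[bit] == target:
                findings.append(idx)  # redundant: no state change
            state[bit] = target
    return findings

program = [
    ("BSF", "STATUS", "RP0"),   # switch to bank 1
    ("MOVWF", "TRISB", None),   # configure port B (a bank-1 register)
    ("BSF", "STATUS", "RP0"),   # redundant: bank 1 already active
    ("BCF", "STATUS", "RP0"),   # back to bank 0
]
print(find_redundant_bank_switches(program))  # -> [2]
```

At branch and join points the single `state` dictionary would have to be replaced by per-path states, which is what the relation matrix and state transition diagram in the dissertation handle.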

Relevance:

60.00%

Publisher:

Abstract:

Data caching can remarkably improve the efficiency of information access in a wireless ad hoc network by reducing access latency and bandwidth usage. The cache placement problem asks where to place copies of multiple data items so as to minimize the total data access cost. Ad hoc networks are multi-hop networks without a central base station and are resource-constrained in terms of channel bandwidth and battery power; data caching therefore reduces communication cost in terms of both bandwidth and battery energy. Because each network node has limited memory, cache placement is a vital issue. This paper surveys the existing cooperative caching techniques and their suitability for mobile ad hoc networks.
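To make the cost model concrete, below is a minimal sketch of a greedy cache placement heuristic, a common approach to this problem (not necessarily one of the techniques surveyed in the paper); it assumes known pairwise hop counts, per-node access frequencies for each item, and a uniform per-node cache capacity, all with illustrative names.

```python
# Access cost of an item at a node = hops to the nearest copy
# (origin server or any cooperative cache holding the item).
def access_cost(node, item, hops, server, caches):
    holders = [server[item]] + [n for n, held in caches.items() if item in held]
    return min(hops[node][h] for h in holders)

def greedy_placement(nodes, items, hops, freq, server, capacity):
    """Repeatedly cache the (node, item) pair with the largest total
    cost saving until no placement improves the total access cost."""
    caches = {n: set() for n in nodes}
    while True:
        best_gain, best_move = 0, None
        for n in nodes:
            if len(caches[n]) >= capacity:  # memory constraint
                continue
            for it in items:
                if it in caches[n]:
                    continue
                gain = sum(
                    freq[m][it]
                    * max(0, access_cost(m, it, hops, server, caches) - hops[m][n])
                    for m in nodes
                )
                if gain > best_gain:
                    best_gain, best_move = gain, (n, it)
        if best_move is None:
            return caches
        caches[best_move[0]].add(best_move[1])

# Tiny 3-node chain: node 0 hosts item "d"; node 2 reads it most often,
# so greedy placement pushes copies toward the frequent readers.
hops = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
freq = [{"d": 1}, {"d": 1}, {"d": 5}]
print(greedy_placement([0, 1, 2], ["d"], hops, freq, {"d": 0}, capacity=1))
```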

Relevance:

60.00%

Publisher:

Abstract:

In Mark Weiser's vision of ubiquitous computing, computers disappear from the focus of the users and interact seamlessly with other computers and users in order to provide information and services. This shift away from direct interaction requires another way for applications to obtain what they need without bothering the user. Context is information that can be used to characterize the situation of persons, locations, or other objects relevant to an application. Context-aware applications are capable of monitoring and exploiting knowledge about external operating conditions; they can adapt their behaviour based on the retrieved information and thus replace, at least to a certain extent, the missing user interaction. Context awareness can therefore be assumed to be an important ingredient for applications in ubiquitous computing environments. However, context management in ubiquitous computing environments must reflect the specific characteristics of these environments, for example distribution, mobility, resource-constrained devices, and heterogeneity of context sources. Modern mobile devices are equipped with fast processors, sufficient memory, and several sensors, such as a Global Positioning System (GPS) sensor, a light sensor, or an accelerometer. Since many applications in ubiquitous computing environments can exploit context information to enhance their service to the user, these devices are highly useful for context-aware applications. Additionally, context reasoners and external context providers can be incorporated. Several context sensors, reasoners, and context providers may offer the same type of information; however, the providers can differ in the quality level (e.g. accuracy), the representation (e.g. a position given as coordinates or as an address), and the cost (such as battery consumption) of the offered information. In order to simplify the development of context-aware applications, developers should be able to access context information transparently, without dealing with the underlying context-access techniques and distribution aspects. They should rather be able to express which kind of information they require, which quality criteria it should fulfil, and how much its provision may cost (not only monetary cost but also energy or performance usage). For this purpose, application developers as well as developers of context providers need a common language and vocabulary to specify which information they require and which they provide, respectively. These offer descriptions and query criteria then have to be matched, and it is likely that a transformation of the provided information will be needed to fulfil the criteria of the context-aware application. As more than one provider may fulfil the criteria, a selection process is required, in which the system trades off the provided quality of context and the cost of each context provider against the quality of context requested by the context consumer. This selection also allows context sources to be turned on only when required: explicitly selecting context services, and thereby dynamically activating and deactivating local context providers, reduces resource consumption because unused context sensors are deactivated.
One promising solution is a middleware that provides appropriate support following the principles of service-oriented computing, such as loose coupling, abstraction, reusability, and discoverability of context providers. This allows us to abstract context sensors, context reasoners, and external context providers alike as context services. In this thesis we present our solution, consisting of a context model and ontology, a context offer and query language, a comprehensive matching and mediation process, and a selection service. The matching and mediation process and the selection service in particular differ from existing work. The matching and mediation process allows the autonomous establishment of mediation chains that transfer information from an offered representation into a requested representation. In contrast to other approaches, the selection service does not select one service per request; rather, it selects a set of services that together fulfil all requests, which also facilitates the sharing of services. The approach is extensively reviewed against the different requirements, and a set of demonstrators shows its usability in real-world scenarios.
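As a concrete illustration of offer/query matching and cost-based selection, here is a minimal sketch in Python; the field names, the mediator table, and the numbers are illustrative assumptions, not the thesis's actual context offer and query language.

```python
from dataclasses import dataclass

@dataclass
class ContextOffer:
    provider: str
    entity: str          # e.g. "position"
    representation: str  # e.g. "wgs84" or "address"
    accuracy_m: float    # quality of context: accuracy in meters
    cost: float          # e.g. battery cost per reading

@dataclass
class ContextQuery:
    entity: str
    representation: str
    max_accuracy_m: float

# Mediators able to transform one representation into another.
MEDIATORS = {("wgs84", "address"): "reverse-geocoder"}

def matches(offer, query):
    """An offer matches if it meets the quality criterion and its
    representation fits directly or via a known mediation step."""
    if offer.entity != query.entity or offer.accuracy_m > query.max_accuracy_m:
        return False
    return (offer.representation == query.representation
            or (offer.representation, query.representation) in MEDIATORS)

def select(offers, query):
    """Among matching providers, pick the cheapest one; unselected
    providers can stay deactivated, saving their resource cost."""
    candidates = [o for o in offers if matches(o, query)]
    return min(candidates, key=lambda o: o.cost, default=None)

offers = [
    ContextOffer("gps", "position", "wgs84", accuracy_m=5.0, cost=3.0),
    ContextOffer("wifi", "position", "wgs84", accuracy_m=30.0, cost=1.0),
]
query = ContextQuery("position", "address", max_accuracy_m=10.0)
print(select(offers, query))  # gps offer, mediated wgs84 -> address
```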

Relevance:

60.00%

Publisher:

Abstract:

This paper reviews the concept of “organic”, explains its meaning, and emphasizes a comparison with conventional goods. It traces the background of organic goods over the past 20 years, quotes different definitions of organic, and develops a main definition. It also states certain criteria and variables in order to develop a deeper business analysis. Its objective is to define the advantages, disadvantages, key points, and strategies for companies that want to venture into organic production, and whether doing so is advisable. After a cross-case and SWOT analysis, it is possible to determine, depending on its core strategy and type, whether an enterprise should decide to venture into the organic market.

Relevance:

60.00%

Publisher:

Abstract:

Accompanying the call for increased evidence-based policy, the developed world is implementing more longitudinal panel studies, which periodically gather information about the same people over a number of years. Panel studies distinguish between transitory and persistent states (e.g. poverty, unemployment) and facilitate causal explanations of relationships between variables. However, they are complex and costly. A growing number of developing countries are now implementing or considering starting panel studies. The objectives of this paper are to identify challenges that arise in panel studies and to give examples of how these have been addressed in resource-constrained environments. The main issues considered are: the development of a conceptual framework which links macro and micro contexts; sampling the cohort in a cost-effective way; tracking individuals; ethics; and data management and analysis. Panel studies require long-term funding, a stable institution, and an acceptance that there will be limited value for money in terms of results from the early stages, with greater benefits accumulating in the study's mature years. Copyright © 2003 John Wiley & Sons, Ltd.

Relevance:

60.00%

Publisher:

Abstract:

The vulnerability of smallholder farmers to climate change and variability is rising. As agriculture is the only source of income for most of them, agricultural adaptation to climate change is vital for their sustenance and for food security. In order to develop appropriate strategies and institutional responses, it is necessary to have a clear understanding of farmers’ perception of climate change, of actual farm-level adaptations, and of the factors that drive and constrain their decision to adapt. This study therefore investigates farm-level adaptation to climate change based on the case of a farming community in Sri Lanka. The findings revealed that farmers perceived the ongoing climate change based on their experiences, and the majority of them adopted measures to address climate change and variability. These adaptation measures fall into five groups: crop management, land management, irrigation management, income diversification, and rituals. The results showed that management of non-climatic factors was an important strategy for enhancing farmers’ adaptation, particularly in a resource-constrained smallholder farming context. Regression analysis indicated that human cognition was an important determinant of climate change adaptation, and social networks were also found to significantly influence adaptation. The study further revealed that social barriers, such as cognitive and normative factors, are as important as economic barriers to adaptation. For formulating and implementing adaptation strategies, this study underscores the importance of understanding the socio-economic, cognitive, and normative aspects of local communities.

Relevance:

60.00%

Publisher:

Abstract:

Supreme audit institutions (SAIs) have an important role in assessing value for money in the delivery of public services. Assessing value for money necessarily involves assessing counterfactuals: good value for money has been achieved if a policy could not reasonably have been delivered more efficiently, effectively, or economically. Operations research modelling has the potential to help in the assessment of these counterfactuals. However, is such modelling too arcane, complex, and technically burdensome for organisations that, like SAIs, operate in a time- and resource-constrained and politically charged environment? We report on three applications of modelling at the UK's SAI, the National Audit Office, in the context of studies on demand management in tax collection, end-of-life care, and health-care associated infections. In all cases, the models have featured in the audit reports and helped study teams come to a value-for-money judgment. We conclude that OR modelling is indeed a valuable addition to the value-for-money auditor's methodological tool box.

Relevance:

60.00%

Publisher:

Abstract:

The term Ambient Intelligence (AmI) refers to a vision of the future information society in which smart electronic environments are sensitive and responsive to the presence of people and their activities (context awareness). In an ambient intelligence world, devices work in concert to support people in carrying out their everyday activities, tasks, and rituals in an easy, natural way, using information and intelligence hidden in the network connecting these devices. This promotes the creation of pervasive environments, improving the quality of life of the occupants and enhancing the human experience. AmI stems from the convergence of three key technologies: ubiquitous computing, ubiquitous communication, and natural interfaces. Ambient intelligent systems are heterogeneous and require excellent cooperation between several hardware/software technologies and disciplines, including signal processing, networking and protocols, embedded systems, information management, and distributed algorithms. Since a large number of fixed and mobile sensors is deployed into the environment, the Wireless Sensor Network (WSN) is one of the most relevant enabling technologies for AmI. WSNs are complex systems made up of a number of sensor nodes deployed in a target area to sense physical phenomena and communicate with other nodes and base stations. These simple devices typically embed a low-power computational unit (microcontroller, FPGA, etc.), a wireless communication unit, one or more sensors, and some form of energy supply (either batteries or energy-scavenging modules). WSNs promise to revolutionize the interaction between the real physical world and human beings. Low cost, low computational power, low energy consumption, and small size are characteristics that must be taken into consideration when designing and dealing with WSNs.

To fully exploit the potential of distributed sensing approaches, a set of challenges must be addressed. Sensor nodes are inherently resource-constrained systems with very low power consumption and small size requirements, which enables them to reduce interference with the physical phenomena being sensed and allows easy, low-cost deployment. They have limited processing speed, storage capacity, and communication bandwidth, which must be used efficiently to increase the degree of local “understanding” of the observed phenomena. A particular kind of sensor node is the video sensor. This topic holds strong interest for a wide range of contexts, such as military, security, robotics, and, most recently, consumer applications. Vision sensors are extremely effective for medium- to long-range sensing because vision provides rich information to human operators. However, image sensors generate a huge amount of data, which must be heavily processed before transmission due to the scarce bandwidth of radio interfaces. In particular, in video surveillance it has been shown that source-side compression is mandatory due to limited bandwidth and delay constraints. Moreover, there is ample opportunity for performing higher-level processing functions, such as object recognition, which has the potential to drastically reduce the required bandwidth (e.g. by transmitting compressed images only when something “interesting” is detected). The energy cost of image processing must, however, be carefully minimized. Imaging can and does play an important role in sensing devices for ambient intelligence. Computer vision can, for instance, be used for recognizing persons and objects and for recognizing behaviour such as illness and rioting. Having a wireless camera as a camera mote opens the way for distributed scene analysis: more eyes see more than one, and a camera system that can observe a scene from multiple directions would be able to overcome occlusion problems and describe objects in their true 3D appearance. In real time, these approaches are a recently opened field of research.

In this thesis we pay attention to the realities of hardware/software technologies and to the design needed to realize systems for distributed monitoring, attempting to propose solutions to open issues and to fill the gap between AmI scenarios and hardware reality. Although the design of a sensor network and its sensor nodes is strictly application dependent, the physical implementation of an individual wireless node is constrained by a number of metrics that should almost always be considered. Among them:
• Small form factor, to reduce node intrusiveness.
• Low power consumption, to reduce battery size and extend node lifetime.
• Low cost, for widespread diffusion.
These limitations typically result in the adoption of low-power, low-cost devices such as low-power microcontrollers with a few kilobytes of RAM and tens of kilobytes of program memory, on which only simple data-processing algorithms can be implemented. However, the overall computational power of the WSN can be very large, since the network presents a high degree of parallelism that can be exploited through ad-hoc techniques. Furthermore, through the fusion of information from the dense mesh of sensors, even complex phenomena can be monitored. In this dissertation we present our results in building several AmI applications suitable for a WSN implementation. The work can be divided into two main areas: Low-Power Video Sensor Nodes and Video Processing Algorithms, and Multimodal Surveillance.

Low-Power Video Sensor Nodes and Video Processing Algorithms: In comparison to scalar sensors, such as temperature, pressure, humidity, velocity, and acceleration sensors, vision sensors generate much higher-bandwidth data due to the two-dimensional nature of their pixel arrays. We have tackled all the constraints listed above and have proposed solutions to overcome the current WSN limits for video sensor nodes. We have designed and developed wireless video sensor nodes focusing on small size and flexibility of reuse in different applications. The video nodes target a different design point: portability (on-board power supply, wireless communication) and a scanty power budget (500 mW), while still providing a prominent level of intelligence, namely sophisticated classification algorithms and a high level of reconfigurability. We developed two different video sensor nodes: the device architecture of the first is based on a low-cost, low-power FPGA-plus-microcontroller system-on-chip, while the second is based on an ARM9 processor. Both systems, designed within the above-mentioned power envelope, can operate continuously with a Li-Polymer battery pack and a solar panel. We present novel low-power, low-cost video sensor nodes which, in contrast to sensors that just watch the world, are capable of comprehending the perceived information in order to interpret it locally. Featuring such intelligence, these nodes can cope with tasks that normally require a human operator, such as recognizing unattended bags in airports or persons carrying potentially dangerous objects. Vision algorithms for object detection and acquisition, such as human detection with Support Vector Machine (SVM) classification and abandoned/removed object detection, are implemented, described, and illustrated on real-world data.

Multimodal Surveillance: In several setups the use of wired video cameras may not be possible, and building an energy-efficient wireless vision network for monitoring and surveillance is therefore one of the major efforts in the sensor network community. Pyroelectric Infra-Red (PIR) sensors have been used to extend the lifetime of a solar-powered video sensor node by providing an energy-level-dependent trigger to the video camera and the wireless module; this approach has been shown to extend node lifetime and can enable continuous operation of the node. Being low-cost, passive (thus low-power), and of limited form factor, PIR sensors are well suited to WSN applications. Moreover, aggressive power-management policies are essential for achieving long-term operation of standalone distributed cameras. We have used an adaptive controller based on Model Predictive Control (MPC) to improve system performance, outperforming naive power-management policies.
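As an illustration of the energy-level-dependent PIR trigger described above, here is a minimal sketch with hypothetical battery, harvesting, and camera interfaces; the fixed-threshold rule below stands in for the naive policies mentioned, whereas the thesis's controller uses MPC. All names and numbers are illustrative.

```python
# Sketch: PIR-triggered, energy-aware wake-up policy for a video sensor node.
# The PIR sensor stays always on (passive, low-power); the camera and radio
# are woken only when an event fires and the energy budget allows it.

class PirTriggeredNode:
    def __init__(self, capture_cost_j=2.0, reserve_j=50.0):
        self.capture_cost_j = capture_cost_j  # energy per capture + transmit
        self.reserve_j = reserve_j            # energy floor to preserve

    def on_pir_event(self, battery_energy_j, harvest_w):
        """Decide what to do when the PIR sensor detects motion."""
        # With enough stored energy or active solar harvesting,
        # run the full capture-classify-transmit pipeline.
        if (battery_energy_j - self.capture_cost_j > self.reserve_j
                or harvest_w > 0.5):
            return "wake-camera-and-radio"
        # Otherwise log the PIR detection only and stay in deep sleep.
        return "log-pir-only"

node = PirTriggeredNode()
print(node.on_pir_event(battery_energy_j=60.0, harvest_w=0.0))  # wake-camera-and-radio
print(node.on_pir_event(battery_energy_j=51.0, harvest_w=0.0))  # log-pir-only
```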

Relevance:

60.00%

Publisher:

Abstract:

This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic scheduling problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (implementing scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns time and resource assignment for a set of activities, to be repeated indefinitely, subject to precedence and resource capacity constraints. In this work we present two constraint programming frameworks addressing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources. Instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for computing a maximum-throughput mapping of applications, specified as SDFGs, onto multi-core architectures. Results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a Constraint Programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions. The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on practical-size problems.
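For intuition, the following is a standard modular formulation of a cyclic precedence constraint from the cyclic scheduling literature; the notation (start times s, processing times p, period λ, iteration distance ω) is illustrative and not necessarily the thesis's exact formulation.

```latex
% Cyclic precedence (i -> j): every occurrence k of activity i, starting
% at s_i + k*lambda, must finish before occurrence k + omega_{ij} of j.
\begin{equation*}
  s_j + \omega_{ij}\,\lambda \;\ge\; s_i + p_i,
  \qquad 0 \le s_i < \lambda,
  \qquad \omega_{ij} \in \mathbb{Z}_{\ge 0}.
\end{equation*}
```

Because the period λ multiplies the decision variable ω, the constraint is non-linear in the period; fixing λ (the generate-and-test route) linearizes it, whereas the approach described above treats λ itself as inferred from the scheduling decisions.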

Relevance:

60.00%

Publisher:

Abstract:

In this thesis we focus on optimization and simulation techniques applied to solve strategic, tactical, and operational problems arising in the healthcare sector. First, we present three applications for the Emilia-Romagna Public Health System (SSR), developed in collaboration with the Agenzia Sanitaria e Sociale dell'Emilia-Romagna (ASSR), a regional center for innovation and improvement in health. The Agenzia launched a strategic campaign aimed at introducing Operations Research techniques as decision-making tools to support technological and organizational innovation. The three applications concern the forecasting and fund allocation of medical specialty positions, the extension of the breast screening program, and operating theater planning. The case studies exploit the potential of combinatorial optimization, discrete event simulation, and system dynamics techniques to solve resource-constrained problems arising within the Emilia-Romagna territory. We then present an application, in collaboration with the Dipartimento di Epidemiologia del Lazio, that focuses on allocating population demand for service to regional emergency departments. Finally, a simulation-optimization approach, developed in collaboration with the INESC TECH center of Porto, to evaluate matching policies for the kidney exchange problem is discussed.

Relevance:

60.00%

Publisher:

Abstract:

Opportunistic diseases caused by the Human Immunodeficiency Virus (HIV) and the Hepatitis B Virus (HBV) are an omnipresent global challenge. In order to manage these epidemics, we need low-cost, easily deployable point-of-care platforms for high-congestion regions like airports and public transit systems. In this dissertation we present our findings on Localized Surface Plasmon Resonance (LSPR)-based detection of pathogens, and other clinically relevant applications, using microfluidic platforms in point-of-care settings in resource-constrained environments. The work presented here adopts the novel LSPR technique to multiplex a lab-on-a-chip (LOC) device capable of quantitatively detecting various types of intact viruses and their subtypes, based on the principle that a wavelength shift occurs when a metal nanoparticle surface is modified with a specific surface chemistry, allowing the binding of a desired pathogen to a specific antibody. We demonstrate the ability to detect and quantify HIV subtypes A, B, C, D, E, G, and panel HIV, down to 100 copies/mL, using both whole-blood samples and HIV-patient blood samples discarded from clinics. These results were compared against the gold standard, Reverse Transcriptase quantitative Polymerase Chain Reaction (RT-qPCR). The microfluidic device has a total assay evaluation time of about 70 minutes: 60 minutes for capture and 10 minutes for data acquisition and processing. This LOC platform eliminates the need for any sample preparation before processing, and it is highly multiplexable, as the same surface chemistry can be adapted to capture and detect several other pathogens, such as dengue virus, E. coli, M. tuberculosis, etc.

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVES In resource-constrained settings, tuberculosis (TB) is a common opportunistic infection and cause of death in HIV-infected persons. TB may be present at the start of antiretroviral therapy (ART), but it is often under-diagnosed. We describe approaches to TB diagnosis and screening in ART programs in low- and middle-income countries. METHODS AND FINDINGS We surveyed ART programs treating HIV-infected adults in sub-Saharan Africa, Asia and Latin America in 2012, using online questionnaires to collect program-level and patient-level data. Forty-seven sites from 26 countries participated. Patient-level data were collected on 987 adult TB patients from 40 sites (median age 34.7 years; 54% female). Sputum smear microscopy and chest radiography were available in all 47 (100%) sites, TB culture in 44 (94%), and Xpert MTB/RIF in 23 (49%). Xpert MTB/RIF was rarely available in Central Africa and South America. In sites with access to these diagnostics, microscopy was used in 745 (76%) patients diagnosed with TB, culture in 220 (24%), and chest X-ray in 688 (70%). When culture was free of charge, it was done in 27% of patients, compared to 21% when there was a fee (p = 0.033); the corresponding percentages for Xpert MTB/RIF were 26% and 15% (p = 0.001). Screening practices for active disease before starting ART included symptom screening (46 sites, 98%), chest X-ray (38, 81%), sputum microscopy (37, 79%), culture (16, 34%), and Xpert MTB/RIF (5, 11%). CONCLUSIONS Mycobacterial culture was infrequently used despite its availability at most sites, while Xpert MTB/RIF was not generally available. Use of the available diagnostics was higher when they were offered free of charge.