949 results for Optimal vaccine distribution
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing the risks that may arise during process execution. Risk reduction involves decreasing both the likelihood of a process fault occurring and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between the risks of different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated on a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that the process instances executed concurrently complete with significantly fewer faults and with lower fault severities when the recommendations provided by our system are taken into account.
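As a rough illustration of the cross-instance step described in this abstract, the sketch below solves a small resource-to-task assignment that minimizes total predicted risk. The risk values, task and resource labels are hypothetical, and the classical assignment solver stands in for the paper's integer linear programming formulation.

```python
# Minimal sketch: given a matrix of predicted risks (one row per pending task
# across running instances, one column per available resource), choose a
# one-to-one assignment that minimises the total predicted risk.
import numpy as np
from scipy.optimize import linear_sum_assignment

# predicted_risk[i][j] = risk predicted (e.g. by decision trees mined from
# past executions) if resource j performs pending task i (hypothetical values)
predicted_risk = np.array([
    [0.30, 0.10, 0.55],   # task A of instance 1
    [0.20, 0.40, 0.25],   # task B of instance 2
    [0.60, 0.35, 0.15],   # task C of instance 3
])

task_idx, resource_idx = linear_sum_assignment(predicted_risk)
for t, r in zip(task_idx, resource_idx):
    print(f"assign resource {r} to task {t} (predicted risk {predicted_risk[t, r]:.2f})")
print("total predicted risk:", predicted_risk[task_idx, resource_idx].sum())
```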
Abstract:
In this thesis, various schemes using custom power devices for power quality improvement in low-voltage distribution networks are studied. Customer-operated distributed generators make a typical network non-radial and affect its power quality. A scheme considering different DSTATCOM control algorithms is proposed for power circulation and islanded operation of the system. To compensate for reactive power overflow and facilitate unity power factor, a UPQC is introduced. Stochastic analysis is carried out for different scenarios to obtain a comprehensive picture of a real-life distribution network. The combined operation of a static compensator and a voltage regulator is tested for the optimum quality and stability of the system.
Abstract:
Particulates with specific sizes and characteristics can induce potent immune responses by promoting antigen uptake by appropriate immuno-stimulatory cell types. Magnetite (Fe3O4) nanoparticles have shown many potential bioapplications due to their biocompatibility and special characteristics. Here, superparamagnetic Fe3O4 nanoparticles (SPIONs) with a high magnetization value (70 emu g−1) were stabilized with trisodium citrate and successfully conjugated with a model antigen (ovalbumin, OVA) via an N,N'-carbonyldiimidazole (CDI) mediated reaction, achieving a maximum conjugation capacity of approximately 13 μg μm−2. It was shown that different mechanisms governed the interactions between the OVA molecules and the magnetite nanoparticles at different pH conditions. We evaluated the as-synthesized SPIONs against commercially available magnetite nanoparticles. The cytotoxicity of these nanoparticles was investigated using mammalian cells. The reported CDI-mediated reaction can be considered a potential approach for conjugating biomolecules onto magnetite or other biodegradable nanoparticles for vaccine delivery.
Abstract:
Monash University in Australia has developed a new approach to DNA vaccine development that has the potential to cut the time it takes to produce a vaccine from up to nine months to four weeks or less. The university has designed and filed a patent on a commercially viable, single-stage technology for manufacturing DNA molecules. The technology was used to produce malaria and measles DNA vaccines, which were tested to be homogeneous supercoiled DNA, free from RNA and protein contamination and meeting FDA regulatory standards for DNA vaccines. The technique is based on customized, smart, polymeric, monolithic adsorbents that can purify DNA very rapidly. The design criteria for the solid-phase adsorbent include rapid adsorption and desorption kinetics, physical composition, and adequate selectivity, capacity and recovery. The new technology showed significantly improved binding capacities, higher recovery, drastically reduced use of buffers and processing time, less clogging, and higher yields of DNA.
Abstract:
Malaria is a global health problem; an effective vaccine is urgently needed. Due to the relative poverty and lack of infrastructure in malaria-endemic areas, DNA-based vaccines that are stable at ambient temperatures and easy to formulate have great potential. While attention has been focused mainly on antigen selection, vector design and efficacy assessment, the development of a rapid and commercially viable process to manufacture DNA is generally overlooked. We report here a continuous purification technique employing an optimized stationary adsorbent to allow high vaccine recovery, low processing time and, hence, high productivity. A 40.0 mL monolithic stationary phase was synthesized and functionalized with amino groups from 2-chloro-N,N-diethylethylamine hydrochloride for anion-exchange isolation of a plasmid DNA (pDNA) that encodes a malaria vaccine candidate, VR1020-PyMSP4/5. Physical characterization of the monolithic polymer showed a macroporous material with a modal pore diameter of 750 nm. The final vaccine product isolated after 3 min of elution was homogeneous supercoiled plasmid with gDNA, RNA and protein levels in keeping with clinical regulatory standards. Toxicological studies of pVR1020-PyMSP4/5 showed a minimum endotoxin level of 0.28 EU/mg pDNA. This cost-effective technique is cGMP compatible and highly scalable for the production of DNA-based vaccines in commercial quantities, when such vaccines prove to be effective against malaria. © 2008 American Institute of Chemical Engineers.
Abstract:
Infectious diseases such as SARS, influenza and bird flu have the potential to cause global pandemics; a key intervention will be vaccination. Hence, it is imperative to have in place the capacity to create vaccines against new diseases in the shortest time possible. In 2004, the Institute of Medicine asserted that the world is tottering on the verge of a colossal influenza outbreak. The institute stated that an inadequate production system for influenza vaccines is a major obstruction to preparedness for influenza outbreaks. Because of production issues, the vaccine industry is facing financial and technological bottlenecks: in October 2004, the FDA was caught off guard by a shortage of flu vaccine, caused by contamination at a US-based plant (Chiron Corporation), one of only two suppliers of US flu vaccine. Due to difficulties in production and long processing times, the bulk of the world's vaccine production comes from a very small number of companies compared to the number of companies producing drugs. Conventional vaccines are made of attenuated or modified forms of viruses. Relatively high and continuous doses are administered when a non-viable vaccine is used, and the overall protective immunity obtained is ephemeral. The safety concerns of viral vaccines have propelled interest in creating a viable replacement that would be more effective and safer to use.
Abstract:
Plasmid DNA offers the promise of a new generation of pharmaceuticals that will address the often overlooked issue of vaccine production by offering a simple and reproducible method for producing a vaccine. Through reverse engineering, production could be reduced from up to 9 months to as little as 1 month. Simplified development and faster turn-around times mean that DNA offers a solution to the vaccine crisis and will help to contain future viral outbreaks by enabling the production of a vaccine against new viral strains in the shortest possible time. Work currently being completed in the area of plasmid DNA production, purification and encapsulation will be presented.
Abstract:
A major drawback to the immunological potency of conventional vaccines is the presence of high contaminant concentrations in vaccine titers, which can result in reduced immune responses, tissue injury, shock and high cytotoxicity, making their application contraindicated in immunodeficiency diseases. Vaccine contamination arises from competitive pathways during cellular metabolism that form other bio-products alongside the pathways necessary for the production of vaccine molecules. One such class of vaccine-contaminating molecules is endotoxins, which are mainly lipopolysaccharide (LPS) complexes found in the membrane of the bacterial cell wall. The structural dynamics of these molecules make their removal from vaccine titers problematic, making vaccine endotoxin removal a major research endeavour. This presentation will discuss a novel technique for reducing the endotoxin level of vaccines. The technique commences with the disentanglement of endotoxin-vaccine molecular bonding and then captures the vaccine molecules on an affinity monolith to separate them from the endotoxins.
Abstract:
Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods which have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study aims to apply several clustering techniques (i.e. K-means, PAM, CLARA and SOM) to PNSD data, in order to identify and apply the optimum technique to PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and Silhouette width validation values, and the K-means technique was found to have the highest performance, with five clusters being the optimum. Therefore, five clusters were found within the data using the K-means technique. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster, in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins, including regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. These two techniques can help researchers immensely in analysing PNSD data for characterisation and source apportionment purposes.
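A minimal sketch of the cluster-number selection step described above, assuming synthetic data in place of the Brisbane PNSD measurements: K-means is run for several candidate cluster counts and the silhouette width is used to pick the optimum (the Dunn index and the GAM parameterisation are omitted here).

```python
# Illustrative only: cluster synthetic "spectra" with K-means and choose the
# number of clusters by silhouette width.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# 300 hourly spectra x 50 size bins, built from three loose synthetic modes;
# real rows would be size-resolved particle number concentrations.
pnsd = np.vstack([
    rng.normal(loc=m, scale=0.5, size=(100, 50))
    for m in (1.0, 3.0, 5.0)
])

best_k, best_score = None, -1.0
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pnsd)
    score = silhouette_score(pnsd, labels)
    if score > best_score:
        best_k, best_score = k, score

print(f"optimum number of clusters by silhouette width: {best_k} (score {best_score:.2f})")
```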
Abstract:
An expanding education market, targeted through ‘bridging material' enabling cineliteracies, has the potential to offer Australian producers increased distribution opportunities, educators targeted teaching aids, and students enhanced learning outcomes. For Australian documentary producers, the key to unlocking the potential of the education sector is engaging with its curriculum-based requirements at the earliest stages of pre-production. Two key mechanisms can lead to effective educational engagement: the established area of study guides produced in association with the Australian Teachers of Media (ATOM), and the emerging area of philanthropic funding coordinated by the Documentary Australia Foundation (DAF). DAF has acted as a key financial and cultural philanthropic bridge between individuals, foundations, corporations and the Australian documentary sector for over 14 years. DAF does not make or commission films but, through the management and receipt of grants and donations, provides ‘expertise, information, guidance and resources to help each sector work together to achieve their goals'. The DAF application process also requires film-makers to detail their ‘Education and Outreach Strategy' for each film, with 582 films registered and 39 completed as of June 2014. These education strategies, which range from detailed to cursory efforts, offer valuable insights into the Australian documentary sector's historical and current expectations of education as a receptive and dynamic audience for quality factual content. A recurring film-maker education strategy found in the DAF data is an engagement with ATOM to create a study guide for their film. This study guide then acts as ‘bridging material' between content and the education audience. The frequency of this effort suggests these study guides enable greater educator engagement with content and increased interest in, and distribution of, the film among educators. The paper ‘Education paths for documentary distribution: DAF, ATOM and the study guides that bind them' will address issues arising out of the changing needs of the education sector and the impact that targeting ‘cineliteracy' outcomes may have on Australian documentary distribution.
Abstract:
Overvoltage and overloading due to high utilization of PVs are the main power quality concerns for future distribution power systems. This paper proposes a distributed control coordination strategy to manage multiple PVs within a network to overcome these issues. PV reactive power is used to deal with overvoltages, and PV active power curtailment is regulated to avoid overloading. The proposed control structure is used to share the required contribution fairly among the PVs, in proportion to their ratings. This approach is examined on a practical distribution network with multiple PVs.
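A minimal sketch of the proportional-sharing idea described in this abstract, with hypothetical PV ratings and support requirements; in the paper the coordination runs as a distributed control strategy over the network rather than in a single loop.

```python
# Distribute a required total reactive-power contribution and active-power
# curtailment among PV units in proportion to their ratings (hypothetical values).
pv_ratings_kva = {"PV1": 5.0, "PV2": 10.0, "PV3": 15.0}
required_reactive_kvar = 9.0   # total support needed to correct an over-voltage
required_curtailment_kw = 6.0  # total curtailment needed to avoid overloading

total_rating = sum(pv_ratings_kva.values())
for name, rating in pv_ratings_kva.items():
    share = rating / total_rating
    print(f"{name}: {share * required_reactive_kvar:.2f} kvar reactive support, "
          f"{share * required_curtailment_kw:.2f} kW curtailed")
```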
Abstract:
Money is often a limiting factor in conservation, and attempting to conserve endangered species can be costly. Consequently, a framework for optimizing fiscally constrained conservation decisions for a single species is needed. In this paper we find the optimal budget allocation among isolated subpopulations of a threatened species to minimize local extinction probability. We solve the problem using stochastic dynamic programming, derive a useful and simple alternative guideline for allocating funds, and test its performance using forward simulation. The model considers subpopulations that persist in habitat patches of differing quality, which in our model is reflected in different relationships between money invested and extinction risk. We discover that, in most cases, subpopulations that are less efficient to manage should receive more money than those that are more efficient to manage, due to higher investment needed to reduce extinction risk. Our simple investment guideline performs almost as well as the exact optimal strategy. We illustrate our approach with a case study of the management of the Sumatran tiger, Panthera tigris sumatrae, in Kerinci Seblat National Park (KSNP), Indonesia. We find that different budgets should be allocated to the separate tiger subpopulations in KSNP. The subpopulation that is not at risk of extinction does not require any management investment. Based on the combination of risks of extinction and habitat quality, the optimal allocation for these particular tiger subpopulations is an unusual case: subpopulations that occur in higher-quality habitat (more efficient to manage) should receive more funds than the remaining subpopulation that is in lower-quality habitat. Because the yearly budget allocated to the KSNP for tiger conservation is small, to guarantee the persistence of all the subpopulations that are currently under threat we need to prioritize those that are easier to save. When allocating resources among subpopulations of a threatened species, the combined effects of differences in habitat quality, cost of action, and current subpopulation probability of extinction need to be integrated. We provide a useful guideline for allocating resources among isolated subpopulations of any threatened species. © 2010 by the Ecological Society of America.
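As an illustration of the budget-allocation idea, and not the authors' model, the sketch below allocates a discrete budget among subpopulations with a simple dynamic program; the exponential investment-versus-extinction-risk curves and all parameter values are hypothetical stand-ins for the habitat-quality-dependent relationships described above.

```python
# Allocate a discrete budget among subpopulations to minimise the expected
# number of local extinctions, via dynamic programming (hypothetical model).
import math

p0  = [0.8, 0.6, 0.4]     # baseline extinction probability of each subpopulation
eff = [0.05, 0.12, 0.20]  # risk reduction per budget unit (management efficiency)
BUDGET = 30               # total budget units available

def risk(i: int, x: int) -> float:
    """Extinction probability of subpopulation i when given x budget units."""
    return p0[i] * math.exp(-eff[i] * x)

# best[b] = (minimum summed risk, allocation list) over the subpopulations
# processed so far, given b budget units spent
best = {b: (0.0, []) for b in range(BUDGET + 1)}
for i in range(len(p0)):
    new_best = {}
    for b in range(BUDGET + 1):
        options = (
            (best[b - x][0] + risk(i, x), best[b - x][1] + [x])
            for x in range(b + 1)
        )
        new_best[b] = min(options, key=lambda t: t[0])
    best = new_best

total_risk, allocation = best[BUDGET]
print("allocation per subpopulation:", allocation, "summed risk:", round(total_risk, 3))
```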
Abstract:
The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or on setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution using stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter. © 2006 Blackwell Publishing Ltd/CNRS.
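A rough sketch of the stopping-rule trade-off described above, assuming hypothetical parameter values: the probability the species is still present shrinks with each survey in which it is not detected, and the declaration year is chosen to minimize the net expected cost of continued surveying plus escape damage (the exact stochastic dynamic program is not reproduced here).

```python
# Choose how many consecutive absent surveys to require before declaring
# eradication, by minimising survey cost plus expected damage from escape.
def prob_present(n: int, prior: float, detect: float) -> float:
    """P(species still present | n consecutive non-detection surveys), by Bayes."""
    miss = prior * (1.0 - detect) ** n
    return miss / (miss + (1.0 - prior))

def net_expected_cost(n: int, prior: float, detect: float,
                      survey_cost: float, damage_cost: float) -> float:
    """Cost of surveying for n more years, then declaring eradication."""
    return n * survey_cost + prob_present(n, prior, detect) * damage_cost

prior, detect = 0.5, 0.7            # chance of persistence; yearly detection probability
survey_cost, damage_cost = 10_000, 1_000_000   # hypothetical dollar values

costs = [(n, net_expected_cost(n, prior, detect, survey_cost, damage_cost))
         for n in range(0, 26)]
best_n, best_cost = min(costs, key=lambda t: t[1])
print(f"declare eradication after {best_n} absent surveys "
      f"(net expected cost ~ ${best_cost:,.0f})")
```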