879 results for Scheduling algorithms and analysis


Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

The evolution and maturation of Cloud Computing has created an opportunity for the emergence of new Cloud applications. High-Performance Computing (HPC), a class of complex problem solving, arises as a new business consumer by taking advantage of Cloud premises and leaving behind expensive datacenter management and difficult grid development. Now in an advanced stage of maturity, today's Cloud has shed many of its drawbacks, becoming more efficient and widespread. Performance enhancements, price drops driven by massification, and customizable on-demand services have attracted strong attention from other markets. HPC, despite being a very well established field, traditionally has a narrow deployment frontier and runs on dedicated datacenters or large computing grids. The main problems with such placements are the initial cost and the inability to fully use the resources, which not all research labs can afford. The main objective of this work was to investigate new technical solutions to allow the deployment of HPC applications on the Cloud, with particular emphasis on private on-premise resources – the lower end of the chain, which reduces costs. The work includes many experiments and analyses to identify obstacles and technology limitations. The feasibility of the objective was tested with new modeling, a new architecture and the migration of several applications. The final application integrates a simplified incorporation of both public and private Cloud resources, as well as HPC application scheduling, deployment and management. It uses a well-defined user-role strategy based on federated authentication and a seamless procedure for daily usage that balances low cost and performance.

Relevance: 100.00%

Abstract:

Carbonic anhydrases are enzymes, found ubiquitously in all organisms, that catalyze the hydration of carbon dioxide to form a bicarbonate ion and a proton, and the reverse reaction. They are crucial in respiration, bone resorption, pH regulation, ion transport and photosynthesis in plants. Of the five classes of carbonic anhydrase (α, β, γ, δ, ζ), this study focused on the α carbonic anhydrases. In mammals this class of CAs comprises 16 subfamilies, including 3 non-active enzymes known as carbonic anhydrase related proteins. The inactivity of these enzymes is due to the loss of one or more histidine residues in the active site. The aim of this thesis was an evolutionary analysis of carbonic anhydrase sequences from organisms spanning back to the Cambrian period. It was carried out in two phases. The first phase was sequence collection, drawing on many biological sequence databases, and included sequence alignments and both manual and automated sequence analysis using several analysis tools. The second phase comprised phylogenetic analysis and exploration of the subcellular localization of the proteins, which was key to the evolutionary analysis. Through these methods the desired result was accomplished: several thought-provoking sequences were encountered and analyzed thoroughly, and the phylogenetic analysis produced interesting results that bolster previous findings as well as new findings, laying the groundwork for more intensive future studies.

Relevance: 100.00%

Abstract:

Understanding how biodiversity is spatially distributed over both the short term and the long term, and what factors affect that distribution, is critical for modeling the spatial pattern of biodiversity as well as for promoting effective conservation planning and practice. This dissertation examines factors that influence short-term and long-term avian distribution from the perspective of the geographical sciences. The research develops landscape-level habitat metrics to characterize forest height heterogeneity and examines their efficacy in modeling avian richness at the continental scale. Two types of novel vegetation-height-structured habitat metrics are created based on second-order texture algorithms and on the concepts of patch-based habitat metrics. I correlate the height-structured metrics with the richness of different forest guilds and also examine their efficacy in multivariate richness models. The results suggest that height heterogeneity, beyond canopy height alone, supplements habitat characterization and the richness models of two forest bird guilds. The metrics and models derived in this study demonstrate practical ways of utilizing three-dimensional vegetation data for improved characterization of spatial patterns in species richness. The second and third projects focus on analyzing the centroids of avian distributions and testing hypotheses regarding the direction and speed of distribution shifts. I first showcase the usefulness of centroid analysis for characterizing the distribution changes of a few case-study species. Applying the centroid method to 57 permanent resident bird species, I show that multi-directional distribution shifts occurred in a large number of the studied species. I also demonstrate that plains birds are not shifting their distributions faster than mountain birds, contrary to the prediction of the climate-change velocity hypothesis. By modeling the rate of abundance change at the regional level, I show that extreme climate events and precipitation measures are closely associated with some of the long-term distribution shifts. This dissertation improves our understanding of bird habitat characterization for species richness modeling and expands our knowledge of how avian populations have shifted their ranges in North America in response to changing environments over the past four decades. The results provide an important scientific foundation for more accurate predictive species distribution modeling in the future.
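As a rough illustration of the centroid approach described above, the Python sketch below computes an abundance-weighted distribution centroid from survey records and the great-circle distance between the centroids of two periods. The coordinates, abundances and the haversine distance are illustrative assumptions, not the dissertation's actual data, weighting or projection.

```python
# Hedged sketch: abundance-weighted centroid of a species' distribution and its
# shift between two survey periods. All values are toy data for illustration.
import numpy as np

def weighted_centroid(lat, lon, abundance):
    """Return the abundance-weighted mean (latitude, longitude) of survey records."""
    w = np.asarray(abundance, dtype=float)
    return (np.average(lat, weights=w), np.average(lon, weights=w))

def shift_km(c1, c2):
    """Great-circle (haversine) distance in km between two (lat, lon) centroids."""
    r = 6371.0
    lat1, lon1, lat2, lon2 = map(np.radians, (*c1, *c2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * np.arcsin(np.sqrt(a))

# Toy example: survey routes from an early and a late period.
early = weighted_centroid([40.1, 41.0, 42.3], [-75.2, -74.8, -73.9], [12, 30, 5])
late  = weighted_centroid([40.9, 41.8, 43.0], [-75.0, -74.5, -73.5], [10, 28, 9])
print(f"centroid shift: {shift_km(early, late):.1f} km")
```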

Relevance: 100.00%

Abstract:

Betanodavirus infections have a significant impact on aquaculture sectors in Australia through direct losses and trade restrictions. The giant grouper, Epinephelus lanceolatus, is a high-value, fast-growing species with significant aquaculture potential. Following reports of subacute to chronic mortalities at a commercial aquaculture facility in northern Queensland, viral nervous necrosis was confirmed in the affected fish using RT-qPCR, followed by virus isolation in the SSN-1 cell line. The RNA1 and RNA2 segments were sequenced and their nucleotide sequences were compared with betanodavirus sequences from GenBank. Phylogenetic analysis revealed that both sequences clustered with sequences representing the red-spotted grouper nervous necrosis virus genotype and showed high sequence identity to viruses affecting other grouper species. This is the first report confirming betanodavirus infection in E. lanceolatus from Australia, with successful isolation of the virus in a cell culture system and analysis of nearly full-length RNA1 and RNA2 sequences.

Relevance: 100.00%

Abstract:

The impact of end-customer quality complaints directly related to automotive components has shown a negative trend at the European level for the entire automotive industry. This research therefore concentrates on the most important items of the Pareto chart in order to understand the failure type and mechanism involved and the link and impact of design and process parameters, concluding with the development of one of the most desired tools of the company that hosted this project – a European methodology for terminal defect classification – and with a list of real improvement opportunities based on the measurement and analysis of actual data. Through the development of the terminal defect classification methodology, which is considered a valuable asset, all companies of the YAZAKI group will be able to characterize terminals as brittle or ductile and thus set in motion, more efficiently, the various existing internal procedures for safeguarding the components, improving manufacturing efficiency. From a brief observation, nothing absolute can be said about the failure causes. Base materials, design, handling during manufacture and storage, as well as the cold work performed by plastic deformation, all play an important role. However, the failure was expected to be due to a combination of factors rather than a single cause. In order to acquire greater knowledge about this problem, which the company had not explored before this study began, a thorough review of the existing literature was conducted, real production sites were visited and the actual parts were tested in a laboratory environment. To answer many of the major issues raised throughout the investigation, theoretical concepts from the literature review were used extensively in order to understand the relationships between the parameters concerned. It should be noted that technical studies on copper and its alloys are hard to find and do not provide all the desirable information. This investigation was performed as a YAZAKI Europe Limited company project and as a Master's thesis for Instituto Superior de Engenharia do Porto, conducted over 9 months in 2012–2013.

Relevance: 100.00%

Abstract:

This dissertation research points out major challenges with current Knowledge Organization (KO) systems such as subject gateways and web directories: (1) these systems rely on traditional knowledge organization schemes based on controlled vocabulary, which is not well suited to web resources, and (2) information is organized by professionals rather than by users, so it does not reflect users' intuitively and instantaneously expressed current needs. In order to explore users' needs, I examined social tags, which are user-generated uncontrolled vocabulary. As investment in professionally developed subject gateways and web directories diminishes (support for both BUBL and Intute, examined in this study, is being discontinued), understanding the characteristics of social tagging becomes even more critical. Several researchers have discussed social tagging behavior and its usefulness for classification or retrieval; however, further qualitative and quantitative research is needed to verify its quality and benefit. This research examined in particular the indexing consistency of social tagging in comparison with professional indexing in order to assess the quality and efficacy of tagging. The data analysis was divided into three phases: analysis of indexing consistency, analysis of tagging effectiveness, and analysis of tag attributes. Most indexing consistency studies have been conducted with a small number of professional indexers, have tended to exclude users, and have focused mainly on physical library collections. This dissertation bridged these gaps by (1) extending the scope of resources to various web documents indexed by users and (2) employing an Information Retrieval (IR) Vector Space Model (VSM) based indexing consistency method, since it is suitable for dealing with a large number of indexers. In the second phase, tagging effectiveness was analyzed through tagging exhaustivity and tag specificity, to compensate for the drawbacks of a consistency analysis based only on quantitative measures of vocabulary matching. Finally, to investigate tagging patterns and behaviors, a content analysis of tag attributes was conducted based on the FRBR model. The findings revealed greater consistency across all subjects among taggers than between two groups of professionals. Examination of the exhaustivity and specificity of social tags provided insight into particular characteristics of tagging behavior and its variation across subjects. To further investigate tag quality, a Latent Semantic Analysis (LSA) was conducted to determine to what extent tags are conceptually related to professionals' keywords; tags of higher specificity tended to have higher semantic relatedness to professionals' keywords, leading to the conclusion that a term's power as a differentiator is related to its semantic relatedness to documents. The findings on tag attributes identified important bibliographic attributes of tags beyond describing the subjects or topics of a document, and showed that tags have essential attributes matching those defined in FRBR.
Furthermore, in terms of specific subject areas, the findings newly identified that taggers exhibited different tagging behaviors reflecting the distinctive features and tendencies of web documents that represent heterogeneous digital media resources. These results lead to the conclusion that awareness of diverse user needs by subject should be increased in order to improve metadata in practical applications. This dissertation research is the first necessary step towards utilizing social tagging in digital information organization by verifying its quality and efficacy. It combined quantitative (statistical) and qualitative (content analysis using FRBR) approaches to the vocabulary analysis of tags, providing a more complete examination of tag quality. Through the detailed analysis of tag properties undertaken in this dissertation, we have a clearer understanding of the extent to which social tagging can be used to replace (and in some cases improve upon) professional indexing.
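The following Python sketch illustrates the mechanics of a vector-space (VSM) consistency measure of the kind described above: each indexer's term assignments become a term-count vector, and consistency is taken as their cosine similarity. The toy vocabulary and the simple count weighting are assumptions for illustration, not the dissertation's exact procedure.

```python
# Hedged sketch of a VSM-based indexing-consistency measure between two indexers.
import numpy as np

def cosine_consistency(terms_a, terms_b):
    """Cosine similarity between the term-count vectors of two indexers."""
    vocab = sorted(set(terms_a) | set(terms_b))
    a = np.array([terms_a.count(t) for t in vocab], dtype=float)
    b = np.array([terms_b.count(t) for t in vocab], dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Tags from a social tagger vs. descriptors from a professional indexer (toy data).
tagger = ["python", "programming", "tutorial", "web"]
professional = ["programming languages", "python", "web development"]
print(f"VSM consistency: {cosine_consistency(tagger, professional):.2f}")
```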

Relevance: 100.00%

Abstract:

Evolutionary algorithms alone cannot solve optimization problems very efficiently, since these algorithms involve many random (not very rational) decisions. Combining evolutionary algorithms with other techniques has proven to be an efficient optimization methodology. In this talk, I will explain the basic ideas of our three algorithms along this line: (1) the orthogonal genetic algorithm, which treats crossover/mutation as an experimental design problem; (2) the multiobjective evolutionary algorithm based on decomposition (MOEA/D), which uses decomposition techniques from traditional mathematical programming in a multiobjective evolutionary algorithm; and (3) the regularity model based multiobjective estimation of distribution algorithm (RM-MEDA), which uses the regularity property and machine learning methods to improve multiobjective evolutionary algorithms.
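As a minimal sketch of the decomposition idea behind MOEA/D, the snippet below scalarises a bi-objective problem with the weighted Tchebycheff function and picks the best candidate for one subproblem. The weight vector, ideal point and candidate objective values are made-up toy numbers, and the neighbourhood update and reproduction steps of the full algorithm are omitted.

```python
# Hedged sketch of MOEA/D's decomposition: one scalar subproblem per weight vector.
import numpy as np

def tchebycheff(objectives, weights, ideal):
    """Scalar fitness of one solution for one weight vector (smaller is better)."""
    return np.max(weights * np.abs(np.asarray(objectives) - np.asarray(ideal)))

# Toy bi-objective values for three candidate solutions and one subproblem's weights.
ideal_point = np.array([0.0, 0.0])
weight_vector = np.array([0.7, 0.3])
candidates = [np.array([0.2, 0.9]), np.array([0.5, 0.4]), np.array([0.8, 0.1])]
best = min(candidates, key=lambda f: tchebycheff(f, weight_vector, ideal_point))
print("best candidate for this subproblem:", best)
```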

Relevance: 100.00%

Abstract:

Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and chemically more complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite the effectiveness of this tool, it can only provide two-dimensional projection (shadow) images of the 3D structure, leaving the three-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool to explore the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment study was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram. This motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method is proposed, named dictionary learning electron tomography (DLET). DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) in an adaptive way and simultaneously reconstructs the tomogram from a highly undersampled tilt series. In this method, sparsity is applied to overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm is based on an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulated and real ET experiments on several morphologies were performed with a variety of setups. The reconstruction results validate the method's efficiency in both noiseless and noisy cases and show that it yields improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select or whether the images strictly follow the pre-conditions of a certain transform (e.g. strictly piecewise constant for Total Variation minimisation). This also avoids artifacts that can be introduced by specific sparsifying transforms (e.g. the staircase artifacts that may result from Total Variation minimisation).
Moreover, this thesis shows how reliable elementally sensitive tomography using EELS is possible with the aid of both appropriate use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss electron energy loss spectroscopy (EELS) of nanoparticles of an industrially important material. Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
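A compact sketch of the alternating reconstruction idea (a data-consistency step followed by patch-wise dictionary learning and sparse coding) is given below. It only stands in for the spirit of DLET: the measurement operator is a toy random matrix rather than a (S)TEM tilt-series projector, and the scikit-learn dictionary learner and all parameter values are illustrative assumptions, not the thesis's implementation.

```python
# Hedged sketch of an alternating dictionary-learning reconstruction on toy data.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d

rng = np.random.default_rng(0)
n = 32                                        # toy 32x32 "tomogram"
truth = np.zeros((n, n)); truth[8:24, 8:24] = 1.0
A = rng.normal(size=(400, n * n)) / n         # toy undersampled linear measurement operator
y = A @ truth.ravel()                         # simulated measurements

x = np.zeros((n, n))                          # current estimate
for it in range(3):                           # alternating reconstruction loop
    # (1) data consistency: gradient step on ||A x - y||^2
    x = (x.ravel() - 0.3 * A.T @ (A @ x.ravel() - y)).reshape(n, n)
    # (2) regularisation: learn a dictionary on patches of x and sparse-code them
    patches = extract_patches_2d(x, (6, 6))
    flat = patches.reshape(len(patches), -1)
    mean = flat.mean(axis=1, keepdims=True)
    dico = MiniBatchDictionaryLearning(n_components=32, alpha=0.5, batch_size=64,
                                       transform_algorithm="omp",
                                       transform_n_nonzero_coefs=2, random_state=0)
    codes = dico.fit(flat - mean).transform(flat - mean)
    denoised = (codes @ dico.components_ + mean).reshape(patches.shape)
    x = reconstruct_from_patches_2d(denoised, (n, n))

print("relative reconstruction error:", np.linalg.norm(x - truth) / np.linalg.norm(truth))
```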

Relevance: 100.00%

Abstract:

Ancient starch analysis is a microbotanical method in which starch granules are extracted from archaeological residues and their botanical source is identified. The method is an important addition to established palaeoethnobotanical research, as it can reveal ancient microremains of starchy staples such as cereal grains and seeds. In addition, starch analysis can detect starch originating from underground storage organs, which are rarely discovered using other methods. Because starch, unlike most organic matter, is tolerant of acidic soils, starch analysis can be successful in northern boreal regions. Starch analysis has potential in the study of cultivation, plant domestication, wild plant usage and tool function, as well as in locating activity areas at sites and discovering human impact on the environment. The aim of this study was to experiment with the starch analysis method in Finnish and Estonian archaeology by building a starch reference collection from cultivated and native plant species, by developing sampling, measuring and analysis protocols, by extracting starch residues from archaeological artefacts and soils, and by identifying their origin. The purpose of this experiment was to evaluate the suitability of the method for the study of subsistence strategies in prehistoric Finland and Estonia. A total of 64 archaeological samples were analysed from four Late Neolithic sites in Finland and Estonia, with radiocarbon dates ranging between 2904 calBC and 1770 calBC. The samples yielded starch granules, which were compared with the starch reference collection and descriptions in the literature. Cereal-type starch was identified at the Finnish Kiukainen culture site and at the Estonian Corded Ware site. The samples from the Finnish Corded Ware site yielded starch from underground storage organs, which may be the first evidence of the use of rhizomes as food in Finland, but no cereal-type starch. Although the sample sets were limited, the experiment confirmed that starch granules are well preserved in the archaeological material of Finland and Estonia, and that differences between subsistence patterns, as well as evidence of cultivation and wild plant gathering, can be discovered using starch analysis. By collecting large sample sets and addressing the three most important issues – preventing contamination, collecting adequate references and understanding taphonomic processes – starch analysis can substantially contribute to research on ancient subsistence in Finland and Estonia.

Relevance: 100.00%

Abstract:

With global markets and global competition, pressures are placed on manufacturing organizations to compress order fulfillment times, meet delivery commitments consistently and maintain efficiency in operations to address cost issues. This chapter argues for a process perspective on planning, scheduling and control that integrates organizational planning structures and information systems as well as human decision makers. The chapter begins with a reconsideration of the gap between theory and practice, in particular for classical scheduling theory and hierarchical production planning and control. A number of key studies of industrial practice are then described and their implications noted. A recent model of scheduling practice, derived from a detailed study of real businesses, is described. Socio-technical concepts are then introduced and their implications for the design and management of planning, scheduling and control systems are discussed. The implications of adopting a process perspective are noted, along with insights from knowledge management. An overview is presented of a methodology for the (re-)design of planning, scheduling and control systems that integrates organizational, system and human perspectives. The most important messages from the chapter are then summarized.

Relevance: 100.00%

Abstract:

Mass spectrometry measures ions according to their mass-to-charge ratio. This technique is used in several fields and can analyze complex mixtures. Imaging Mass Spectrometry (IMS), a branch of mass spectrometry, allows ions to be analyzed across a surface while preserving the spatial organization of the detected ions. To date, the most studied samples in IMS are plant or animal tissue sections. Among the molecules commonly analyzed by IMS, lipids have attracted much interest. Lipids are involved in disease and in normal cell function; they form the cell membrane and have several roles, such as regulating cellular events. Given the involvement of lipids in biology and the ability of MALDI IMS to analyze them, we developed analytical strategies for sample handling and for the analysis of large lipid datasets. Lipid degradation is a major concern in the food industry, and lipids in tissue sections are likewise at risk of degrading. Their degradation products can introduce artifacts into the IMS analysis, and the loss of lipid species can compromise the accuracy of abundance measurements. Since oxidized lipids are also important mediators in the development of several diseases, their genuine preservation becomes critical. In multi-institutional studies, where samples are often transported from one site to another, adapted and validated protocols and measurements of degradation are necessary. Our main results are as follows: a time-dependent increase in oxidized phospholipids and lysophospholipids under ambient conditions, a decrease in the presence of lipids containing unsaturated fatty acids, and an inhibitory effect on these phenomena when sections are stored cold under N2. At ambient temperature and atmosphere, phospholipids are oxidized on the time scale of a typical IMS preparation (~30 minutes), and they are also degraded into lysophospholipids on a time scale of several days. Validating a sample-handling method becomes all the more important when a larger number of samples must be analyzed. Atherosclerosis is a cardiovascular disease driven by the accumulation of cellular material on the arterial wall. Since atherosclerosis is a three-dimensional (3D) phenomenon, serial 3D IMS becomes useful, both because it can localize molecules along the full length of an atheromatous plaque and because it can identify molecular mechanisms of plaque development or rupture. Serial 3D IMS faces specific challenges, many of which relate simply to 3D reconstruction and to the real-time interpretation of the molecular reconstruction. With these objectives in mind, and using lipid IMS to study atherosclerotic plaques from a human carotid artery and from a mouse model of atherosclerosis, we developed open-source methods for reconstructing IMS data in 3D.
Our methodology provides a means of obtaining high-quality visualizations and demonstrates a strategy for rapid interpretation of 3D IMS data through multivariate segmentation. The analysis of aortas from a mouse model was the starting point for method development because these samples are better controlled. By correlating data acquired in positive and negative ionization modes, 3D IMS demonstrated an accumulation of phospholipids in the aortic sinuses. In addition, AgLDI IMS revealed differential localization of free fatty acids, cholesterol, cholesteryl esters and triglycerides. Multivariate segmentation of the lipid signals from the IMS analysis of a human carotid artery demonstrates a molecular histology that correlates with the degree of arterial stenosis. This research helps to better understand the biological complexity of atherosclerosis and may possibly predict the progression of some clinical cases. Colorectal cancer liver metastasis (CRCLM) is the metastatic disease of primary colorectal cancer, one of the most frequent cancers in the world. The evaluation and prognosis of CRCLM tumours are performed by histopathology, with a margin of error. We used lipid IMS to identify the histological compartments of CRCLM and to extract their lipid signatures. By exploiting these molecular signatures, we were able to determine a quantitative and objective histopathological score that correlates with prognosis. In addition, by dissecting the lipid signatures, we identified individual lipid species that discriminate between the different CRCLM histologies and could potentially be used as biomarkers of therapeutic response. More specifically, we found a series of plasmalogens and sphingolipids that distinguish two different types of necrosis (infarct-like necrosis, ILN, and usual necrosis, UN). ILN is associated with response to chemotherapeutic treatment, whereas UN is associated with normal tumour function.
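A minimal sketch of multivariate segmentation of an IMS dataset is shown below, assuming a pixel-by-m/z intensity matrix and using k-means purely as an illustrative clustering choice; the open-source pipeline developed in the thesis may use a different segmentation method, and all data here are synthetic.

```python
# Hedged sketch: cluster every pixel's spectrum to obtain a "molecular histology" map.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
h, w, n_mz = 40, 40, 200                  # toy image: 40x40 pixels, 200 m/z channels
spectra = rng.poisson(5, size=(h * w, n_mz)).astype(float)
spectra[:800, :50] += 20                  # fake lipid signature in part of the tissue

# TIC normalisation, then cluster pixels by their full spectra.
spectra /= spectra.sum(axis=1, keepdims=True)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(spectra)
segmentation = labels.reshape(h, w)       # each pixel assigned to a molecular class
print("pixels per class:", np.bincount(labels))
```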

Relevance: 100.00%

Abstract:

Phylogenetic inference consists in the search for an evolutionary tree that explains, as well as possible, the genealogical relationships of a set of species. Phylogenetic analysis has a large number of applications in areas such as biology, ecology and paleontology. Several criteria have been defined in order to infer phylogenies, among which are maximum parsimony and maximum likelihood. The first tries to find the phylogenetic tree that minimizes the number of evolutionary steps needed to describe the evolutionary history among species, while the second tries to find the tree that has the highest probability of producing the observed data according to an evolutionary model. The search for a phylogenetic tree can be formulated as a multi-objective optimization problem, which aims to find trees that satisfy both the parsimony and the likelihood criteria simultaneously (and as far as possible). Because these criteria are different, there will not be a single optimal solution (a single tree) but a set of compromise solutions, called the "Pareto optimal" set (see the sketch below). To find these solutions, evolutionary algorithms are nowadays used with success. These algorithms are a family of non-exact techniques inspired by the process of natural selection. They usually find high-quality solutions to difficult optimization problems. They work by manipulating a set of trial solutions (trees, in the case of phylogeny) with operators, some of which exchange information between solutions, simulating DNA crossover, while others apply random modifications, simulating mutation. The result of these algorithms is an approximation of the Pareto-optimal set, which can be shown on a graph so that the expert in the problem (the biologist, in the case of inference) can choose the compromise solution of greatest interest. For multi-objective optimization applied to phylogenetic inference, there is an open-source software tool called MO-Phylogenetics, designed to solve inference problems with both classic and last-generation evolutionary algorithms.
REFERENCES
[1] C.A. Coello Coello, G.B. Lamont, D.A. van Veldhuizen. Evolutionary Algorithms for Solving Multi-Objective Problems. Springer, August 2007.
[2] C. Zambrano-Vega, A.J. Nebro, J.F. Aldana-Montes. MO-Phylogenetics: a phylogenetic inference software tool with multi-objective evolutionary metaheuristics. Methods in Ecology and Evolution, in press, February 2016.
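The sketch below illustrates the Pareto filtering step referred to above, assuming toy (parsimony score, negative log-likelihood) values for a handful of candidate trees; tree construction and the evolutionary operators themselves are outside this snippet.

```python
# Hedged sketch: keep only the non-dominated (Pareto-optimal) candidate trees.
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly better in one
    (both objectives stored as values to minimise)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# (parsimony score, negative log-likelihood) for five candidate trees (toy numbers).
trees = {"T1": (812, 15432.1), "T2": (805, 15498.7), "T3": (820, 15390.4),
         "T4": (805, 15410.2), "T5": (830, 15510.0)}
pareto = {name for name, score in trees.items()
          if not any(dominates(other, score) for other in trees.values() if other != score)}
print("Pareto-optimal trees:", sorted(pareto))   # -> ['T3', 'T4']
```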

Relevance: 100.00%

Abstract:

The question of why most health policies do not achieve their intended results continues to receive considerable attention in the literature. This reflects the recognized gap between policy as intent and policy as practice, which calls for substantial research to understand the factors that improve policy implementation. Although there is substantial work explaining why policies achieve or fail to achieve their intended outcomes, there are limited case studies illustrating how to analyze policies from a methodological perspective. In this article, we report and discuss how a mixed qualitative research method was applied to analyzing maternal and child health policies in Malawi. For the purposes of this article, we do not report research findings; instead we focus our discussion on the methodology of the study and draw lessons for policy analysis research. We base our discussion on our experiences from a study in which we analyzed maternal and child health policies in Malawi over the period from 1964 to 2008. Noting the multifaceted nature of maternal and child health policies, we adopted a mixed qualitative research method in which a number of data collection methods were employed. This approach allowed different perspectives on maternal and child health policies in Malawi to be captured and compensated for the weaknesses of each individual method, especially in terms of data validity. This research suggested that the multidimensional nature of maternal and child health policies, like other health policies, calls for a combination of research designs as well as a variety of methods of data collection and analysis. In addition, we suggest that, as an emerging research field, health policy analysis will benefit most from case study designs because they provide rich experiences of the actual policy context.

Relevance: 100.00%

Abstract:

The quest for robust heuristics that are able to solve more than one problem is ongoing. In this paper, we present, discuss and analyse a technique called Evolutionary Squeaky Wheel Optimisation and apply it to two different personnel scheduling problems. Evolutionary Squeaky Wheel Optimisation improves the original Squeaky Wheel Optimisation's effectiveness and execution speed by incorporating two additional steps (Selection and Mutation) for added evolution. In Evolutionary Squeaky Wheel Optimisation, a cycle of Analysis-Selection-Mutation-Prioritization-Construction continues until stopping conditions are reached. The aim of the Analysis step is to identify below-average solution components by calculating a fitness value for all components. The Selection step then chooses amongst these underperformers and discards some probabilistically based on fitness. The Mutation step further discards a few components at random. Solutions can become incomplete and thus repairs may be required. The repair is carried out by using the Prioritization step to first produce priorities that determine the order in which the following Construction step then schedules the remaining components. Therefore, improvement in Evolutionary Squeaky Wheel Optimisation is achieved by selective solution disruption mixed with iterative improvement and constructive repair. Strong experimental results are reported on two different domains of personnel scheduling: bus and rail driver scheduling and hospital nurse scheduling.
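The sketch below walks through several Evolutionary Squeaky Wheel Optimisation cycles (Analysis-Selection-Mutation-Prioritization-Construction) on a toy shift-assignment problem; the cost model, discard probability and greedy repair rule are illustrative assumptions rather than the paper's actual personnel scheduling formulation.

```python
# Hedged sketch of the Evolutionary SWO cycle on a toy shift-to-worker assignment.
import random

random.seed(0)
shifts, workers = list(range(8)), ["A", "B", "C"]
cost = {(s, w): random.randint(1, 9) for s in shifts for w in workers}  # toy penalties

solution = {s: random.choice(workers) for s in shifts}        # initial construction
for cycle in range(5):
    # Analysis: fitness of each component (lower cost = fitter).
    fitness = {s: -cost[(s, solution[s])] for s in solution}
    avg = sum(fitness.values()) / len(fitness)
    # Selection: probabilistically discard below-average components.
    for s in [s for s, f in fitness.items() if f < avg]:
        if random.random() < 0.5:
            del solution[s]
    # Mutation: discard one further component at random.
    if solution:
        del solution[random.choice(list(solution))]
    # Prioritization: order the vacated shifts, hardest (highest best-case cost) first.
    unassigned = sorted(set(shifts) - set(solution),
                        key=lambda s: -min(cost[(s, w)] for w in workers))
    # Construction: greedy repair following the priorities.
    for s in unassigned:
        solution[s] = min(workers, key=lambda w: cost[(s, w)])

print("final assignment cost:", sum(cost[(s, solution[s])] for s in shifts))
```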