914 results for Load analyser
Abstract:
A parallel method for dynamic partitioning of unstructured meshes is described. The method employs a new iterative optimisation technique which both balances the workload and attempts to minimise the interprocessor communications overhead. Experiments on a series of adaptively refined meshes indicate that the algorithm provides partitions of quality equivalent to or higher than those of static partitioners (which do not reuse the existing partition), and does so much more quickly. Perhaps more importantly, the algorithm results in only a small fraction of the data migration incurred by the static partitioners.
Abstract:
In parallel adaptive finite element simulations the workload on the individual processors may change frequently. To (re)distribute the load evenly over the processors, a load balancing heuristic is needed. Common strategies try to minimise subdomain dependencies by optimising the cut size of the partitioning. However, for certain solvers the cut size plays only a minor role, and their convergence is highly dependent on the subdomain shapes. Degenerate subdomain shapes cause them to need significantly more iterations to converge. In this work a new parallel load balancing strategy is introduced which directly addresses the problem of generating and conserving reasonably good subdomain shapes in a dynamically changing finite element simulation. Geometric data is used to formulate several cost functions that rate elements in terms of their suitability to be migrated. The well-known diffusive method, which calculates the necessary load flow, is enhanced by weighting the subdomain edges with the help of these cost functions. The proposed methods have been tested and results are presented.
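To make the diffusive step concrete, here is a minimal sketch of first-order diffusion on a processor graph, assuming a toy ring topology and damping factor: each processor repeatedly exchanges a fixed fraction of its load difference with every neighbour, and the accumulated per-edge exchanges form the balancing flow. The paper's refinement would additionally weight each subdomain edge by its shape-based cost function.

```python
def diffusion_flow(load, adj, alpha=0.2, iters=300):
    """Iterate l_i <- l_i - alpha * sum_j (l_i - l_j) over neighbours j,
    accumulating the per-edge exchanges as the balancing flow."""
    load = dict(load)
    flow = {}  # ordered edge (i, j) with i < j -> net load to move i -> j
    for _ in range(iters):
        exchange = {(i, j): alpha * (load[i] - load[j])
                    for i in load for j in adj[i] if i < j}
        for (i, j), f in exchange.items():
            load[i] -= f
            load[j] += f
            flow[(i, j)] = flow.get((i, j), 0.0) + f
    return load, flow

# Four processors in a ring, one overloaded:
loads = {0: 40.0, 1: 10.0, 2: 10.0, 3: 20.0}
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
balanced, flow = diffusion_flow(loads, ring)
print({p: round(l, 2) for p, l in balanced.items()})   # ~20.0 each
print({e: round(f, 2) for e, f in flow.items()})       # net migration plan
```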
Abstract:
We present a dynamic distributed load balancing algorithm for parallel, adaptive finite element simulations using preconditioned conjugate gradient solvers based on domain decomposition. The load balancer is designed to maintain good partition aspect ratios. It calculates a balancing flow using different versions of diffusion and a variant of breadth-first search. Elements to be migrated are chosen according to a cost function aimed at optimising subdomain shapes. We show how to use information from the second step to guide the first. Experimental results using Bramble's preconditioner and comparisons to existing state-of-the-art balancers show the benefits of the construction.
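The migration step can be sketched as follows: given a balancing flow telling subdomain A to send some amount of work to neighbour B, rank border elements by a shape cost and migrate the best candidates until the flow is covered. The particular cost function below (prefer elements far from A's centroid and close to B's) is an illustrative stand-in for the paper's cost function, not its exact formulation.

```python
import math

def pick_elements(border, centroid_a, centroid_b, amount):
    """border: list of (element_id, weight, (x, y)) on the A/B interface.
    Returns elements to migrate, best shape-cost first, until their
    combined weight covers the requested flow."""
    def cost(pos):
        # Low cost: far from A's centroid, close to B's -> keeps A compact.
        return math.dist(pos, centroid_b) - math.dist(pos, centroid_a)
    chosen, moved = [], 0.0
    for eid, w, pos in sorted(border, key=lambda e: cost(e[2])):
        if moved >= amount:
            break
        chosen.append(eid)
        moved += w
    return chosen

border = [(1, 1.0, (0.9, 0.5)), (2, 1.0, (0.8, 0.1)), (3, 1.0, (0.95, 0.9))]
print(pick_elements(border, (0.3, 0.5), (1.5, 0.5), amount=2.0))  # [3, 1]
```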
Abstract:
As the complexity of parallel applications increases, the performance limitations resulting from computational load imbalance become dominant. Mapping the problem space to the processors of a parallel machine in a manner that balances the workload of each processor will typically reduce the run-time. In many cases the computation time required for a given calculation cannot be predetermined, even at run-time, and so a static partition of the problem yields poor performance. For problems in which the computational load across the discretisation is dynamic and inhomogeneous, for example multi-physics problems involving fluid and solid mechanics with phase changes, the workload of a static subdomain will change over the course of the computation and cannot be estimated beforehand. For such applications the mapping of load to processors must change dynamically at run-time in order to maintain reasonable efficiency. The issues of dynamic load balancing are examined in the context of PHYSICA, a three-dimensional unstructured mesh multi-physics continuum mechanics computational modelling code.
Abstract:
A method is outlined for optimising graph partitions which arise in mapping unstructured mesh calculations to parallel computers. The method employs a relative gain iterative technique to both evenly balance the workload and minimise the number and volume of interprocessor communications. A parallel graph reduction technique is also briefly described, which can be used to give a global perspective to the optimisation. The algorithms work efficiently in parallel as well as sequentially, and when combined with a fast direct partitioning technique (such as the Greedy algorithm) to give an initial partition, the resulting two-stage process proves to be a powerful and flexible solution to the static graph-partitioning problem. Experiments indicate that the resulting parallel code can provide high quality partitions, independent of the initial partition, within a few seconds. The algorithms can also be used for dynamic load balancing, reusing existing partitions; in this case the procedures are much faster than static techniques, provide partitions of similar or higher quality and, in comparison, involve the migration of only a fraction of the data.
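A minimal sketch of the gain idea underlying such iterative optimisation, assuming a toy graph and two subdomains: the gain of moving a vertex is the reduction in cut edges the move achieves (the paper's "relative gain" scheme further normalises such gains when arbitrating between competing moves).

```python
from collections import Counter

def gain(v, adj, part, q):
    """Reduction in cut edges if vertex v moves from part[v] to subdomain q."""
    c = Counter(part[u] for u in adj[v])
    return c[q] - c[part[v]]

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
part = {0: 0, 1: 0, 2: 1, 3: 1}
print(gain(1, adj, part, 1))  # moving vertex 1 into subdomain 1 saves 1 cut edge
```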
Abstract:
This paper presents a new dynamic load balancing technique for structured mesh computational mechanics codes in which the processor partition range limits of just one of the partitioned dimensions are non-coincident, as opposed to using coincident limits in all of the partitioned dimensions. The partition range limits are 'staggered', allowing greater flexibility in obtaining a balanced load distribution than when the limits are changed 'globally', since the load increase/decrease on one processor no longer restricts the load decrease/increase on a neighbouring processor. The automatic implementation of this 'staggered' load balancing strategy within an existing parallel code is presented in this paper, along with some preliminary results.
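A rough sketch of the staggered idea, with an illustrative 8x8 workload array: the first dimension is cut into strips with coincident limits, and each strip then chooses its own split points in the second dimension, so one strip's limits no longer constrain its neighbour's.

```python
import numpy as np

def split_points(weights, parts):
    """Indices dividing `weights` into `parts` contiguous chunks of
    roughly equal total weight."""
    cum = np.cumsum(weights)
    targets = cum[-1] * np.arange(1, parts) / parts
    return list(np.searchsorted(cum, targets))

work = np.random.default_rng(0).random((8, 8))   # per-cell workload estimate
row_cuts = split_points(work.sum(axis=1), 2)     # coincident limits, dim 1
for strip in np.split(work, row_cuts, axis=0):   # staggered limits, dim 2:
    print(split_points(strip.sum(axis=0), 2))    # each strip cuts independently
```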
Abstract:
Elasticity is one of the best-known capabilities of cloud computing, and it is largely deployed reactively using thresholds. In this approach, maximum and minimum limits are used to drive resource allocation and deallocation actions, leading to the following problem statements: How can cloud users set the threshold values to enable elasticity in their cloud applications? And what is the impact of the application's load pattern on elasticity? This article tries to answer these questions for iterative high performance computing applications, showing the impact of both thresholds and load patterns on application performance and resource consumption. To accomplish this, we developed a reactive, PaaS-based elasticity model called AutoElastic and employed it on a private cloud to execute a numerical integration application. Here, we present an analysis of best practices and possible optimizations regarding the elasticity and HPC pair. Considering the results, we observed that the maximum threshold influences the application time more than the minimum one. We concluded that threshold values close to 100% of CPU load are directly related to weaker reactivity, postponing resource reconfiguration in situations where acting earlier could have reduced the application runtime.
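A minimal sketch of the reactive threshold mechanism discussed above: sample CPU load, allocate a node when it exceeds the upper threshold, deallocate when it falls below the lower one. The thresholds, monitoring hook and scaling actions are illustrative assumptions, not AutoElastic's actual interface; note how an upper threshold near 1.0 delays scale-out, matching the reactivity finding.

```python
def elastic_step(read_cpu, scale_out, scale_in,
                 upper=0.7, lower=0.3, nodes=2, max_nodes=8):
    """One monitoring step of a reactive controller; returns new node count."""
    cpu = read_cpu()  # mean CPU utilisation across current nodes, in [0, 1]
    if cpu > upper and nodes < max_nodes:
        scale_out()   # fires sooner when `upper` is well below 1.0
        nodes += 1
    elif cpu < lower and nodes > 1:
        scale_in()
        nodes -= 1
    return nodes

# Toy usage: a load spike triggers one allocation.
n = elastic_step(lambda: 0.92, lambda: print("allocate node"),
                 lambda: print("release node"))
print(n)  # 3
```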
Abstract:
Excess nutrient loads carried by streams and rivers are a great concern for environmental resource managers. In agricultural regions, excess loads are transported downstream to receiving water bodies, potentially causing algal blooms, which can lead to numerous ecological problems. To better understand nutrient load transport, and to develop appropriate water management plans, it is important to have accurate estimates of annual nutrient loads. This study used a Monte Carlo sub-sampling method and error-corrected statistical models to estimate annual nitrate-N loads from two watersheds in central Illinois. The performance of three load estimation methods (the seven-parameter log-linear model, the ratio estimator, and the flow-weighted averaging estimator) applied at one-, two-, four-, six-, and eight-week sampling frequencies was compared. Five error correction techniques (the existing composite method and four new error correction techniques developed in this study) were applied to each combination of sampling frequency and load estimation method. On average, the most accurate error correction technique (proportional rectangular) produced load estimates 15% and 30% more accurate than those of the most accurate uncorrected load estimation method (the ratio estimator) for the two watersheds. Using error correction methods, it is possible to design more cost-effective monitoring plans by achieving the same load estimation accuracy with fewer observations. Finally, the optimum combinations of monitoring threshold and sampling frequency that minimize the number of samples required to achieve specified levels of accuracy in load estimation were determined. For one- to three-week sampling frequencies, combined threshold/fixed-interval monitoring approaches produced the best outcomes, while fixed-interval-only approaches produced the most accurate results for four- to eight-week sampling frequencies.
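A hedged sketch of the ratio-estimator idea named above: because flow is recorded continuously while concentration is sampled sparsely, the mean load on sampled days is scaled by the ratio of total annual flow to mean sampled flow. This omits the bias-correction term of the full Beale ratio estimator, and all numbers are synthetic.

```python
import numpy as np

def ratio_estimate(conc_sampled, flow_sampled, flow_all):
    """Annual load from sparse concentration samples plus a full flow record:
    (mean sampled daily load / mean sampled flow) * total annual flow."""
    daily_load = conc_sampled * flow_sampled
    return daily_load.mean() / flow_sampled.mean() * flow_all.sum()

rng = np.random.default_rng(1)
flow_all = rng.lognormal(1.0, 0.5, 365)   # daily flow, full year
days = np.arange(0, 365, 14)              # two-week sampling frequency
conc = rng.normal(5.0, 1.0, days.size)    # nitrate-N concentration samples
print(ratio_estimate(conc, flow_all[days], flow_all))
```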
Abstract:
In this study, magnesium is alloyed with varying amounts of the ferromagnetic alloying element cobalt in order to obtain lightweight load-sensitive materials with sensory properties that allow online monitoring of mechanical forces applied to components made from Mg-Co alloys. An optimized casting process using extruded Mg-Co powder rods enables the production of magnetic magnesium alloys with a reproducible Co concentration. The efficiency of the casting process is confirmed by SEM analyses. Microstructures and Co-rich precipitates of various Mg-Co alloys are investigated by means of EDS and XRD analyses. The Mg-Co alloys' mechanical strengths are determined by tensile tests. Magnetic properties of the Mg-Co sensor alloys, as functions of the cobalt content and the applied mechanical load, are measured using harmonic analysis of eddy-current signals. Within the scope of this work, the influence of the element cobalt on magnesium is investigated in detail and an optimal cobalt concentration is defined based on the examinations performed.
Abstract:
Shellfish farming, primarily the culture of the Pacific oyster, Crassostrea gigas, is the main French aquaculture activity. It relies largely on the natural recruitment of the species, which supplies 60 to 70% of the demand for young oysters (spat); this collection activity is known as spat collection ("captage"). The two main spat collection centres in France are the basins of Arcachon and Marennes-Oléron. However, over the last ten years or so, spat collection in the Arcachon basin has become highly variable: years of zero settlement (e.g. 2002, 2005, 2007) or weak settlement (2009, 2010, 2011) are followed by excellent, even overabundant, years (2003, 2006, 2008, 2012, 2014). At Marennes-Oléron this variability exists but is much less pronounced. Moreover, with the slow warming of coastal waters, spat collection can now be practised further and further north. Thus the Bay of Bourgneuf, and also the Bay of Brest, have in recent years become areas where a growing number of oyster farmers collect spat successfully, though there too with irregularities in recruitment that need to be understood. Finally, since the mortality crisis of 2008, interest in spat collection has also been growing in the Thau lagoon. To better understand the factors driving variability in spat collection, Ifremer set up, at the request of the Comité National de la Conchyliculture, a national network for monitoring oyster reproduction: the Réseau Velyger. Created in 2008 with European funding and now funded by the Direction des Pêches Maritimes et de l'Aquaculture, this network provides each year, for the ecosystems mentioned above, a series of biological indicators (maturation, fecundity, spawning date, larval abundance and survival, recruitment intensity, spat survival) whose cross-analysis with hydrological and climatic indicators progressively improves understanding of the causes of variability in the recruitment of the Pacific oyster in France, a biological model and the key species of French shellfish farming. This report presents the 2015 results of this observation network and draws, for the hydro-climatic part, on data acquired by other regional and national networks. It details and analyses, site by site, all the characteristics of the reproductive cycle of the Pacific oyster: maturation and fecundity of adults, spawning period, larval abundance and survival, settlement intensity and early mortality. It then provides an interpretation and synthesis of the 2015 results in the light of results from previous years. For 2015, the main findings are as follows:
• Hydro-climatically, the year was characterised by a mild winter and a spring within the norms, followed by a summer also very close to the norms, with a few exceptions: the Thau lagoon showed markedly above-normal temperatures throughout the summer. Given rainfall also close to the norms, phytoplankton concentrations remained at an average level from the Bay of Brest to the Pertuis Charentais, and rather below normal in the Arcachon basin and the Thau lagoon.
• Biologically, these hydro-climatic conditions translated, in adult oyster populations, into condition indices (a proxy for fecundity) generally close to the averages, with the north-south gradient observed every year, correlated with phytoplankton concentration. Moreover, the absence of a thermal surplus in spring and early summer prevented early spawning (except in the Thau lagoon); spawning was in fact rather late, especially in the Arcachon basin.
• On the Atlantic coast, water temperatures during the larval development of the main cohorts were rather low (below 20°C in the Bay of Brest and below 21°C elsewhere), so larval growth was slowed and survival reduced. Larval yields were indeed very low (e.g. 0.002% at Arcachon). In the end there were few late-stage ("large") larvae in the water, which translated into low to moderate settlement. One exception nonetheless: in the Thau lagoon, the heatwave temperatures throughout the summer allowed a moderate mean concentration of "large" larvae (80 larvae per 1.5 m³). However, spat collection methods and techniques are still being optimised in this area, and this year, despite the presence of large larvae, settlement remained low (< 10 spat per collector cup in autumn).
• Consequently, 2015 was characterised by overall "low to moderate" settlement in all areas, ranging from around 10 spat per collector cup in the Thau lagoon and the Bay of Bourgneuf to more than 200 spat per collector cup in the Pertuis Charentais. Finally, based on all the results acquired since 2008, the report concludes with a series of recommendations to take into account in order to preserve spat collection in the coming years.
Abstract:
This thesis describes the development and correlation of a thermal model that forms the foundation of a thermal capacitance spacecraft propellant load estimator. Specific details of creating the thermal model for the diaphragm propellant tank used on NASA's Magnetospheric Multiscale (MMS) spacecraft using ANSYS, and the correlation process implemented, are presented. The thermal model was correlated to within ±3 °C of the thermal vacuum test data, and was determined sufficient for making future propellant predictions on MMS. The model was also found to be relatively sensitive to uncertainties in applied heat flux and mass knowledge of the tank. More work is needed to improve temperature predictions in the upper hemisphere of the propellant tank, where predictions were found to be 2-2.5 °C lower than the test data. A road map for applying the model to predict propellant loads on the actual MMS spacecraft in 2017-2018 is also presented.
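The thermal capacitance principle behind such an estimator can be sketched with a lumped-capacitance approximation, assuming illustrative (non-MMS) numbers: apply a known heat input, measure the bulk temperature rise, subtract the dry tank's heat capacity, and attribute the remainder to propellant.

```python
def propellant_mass(q_joules, delta_t, tank_mass, c_tank, c_prop):
    """Lumped-capacitance gauging: m_prop = (Q/dT - m_tank*c_tank) / c_prop."""
    return (q_joules / delta_t - tank_mass * c_tank) / c_prop

# Illustrative values only: 50 kJ raising the assembly by 2.0 K; specific
# heats roughly those of a titanium tank and hydrazine propellant.
print(propellant_mass(5.0e4, 2.0, tank_mass=10.0,
                      c_tank=520.0, c_prop=3080.0))  # ~6.4 kg
```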
Abstract:
Background: Depression is a major health problem worldwide and the majority of patients presenting with depressive symptoms are managed in primary care. Current approaches for assessing depressive symptoms in primary care are not accurate in predicting future clinical outcomes, which may potentially lead to over- or under-treatment. The Allostatic Load (AL) theory suggests that by measuring multi-system biomarker levels as a proxy for multi-system physiological dysregulation, it is possible to identify individuals at risk of adverse health outcomes at a prodromal stage. The Allostatic Index (AI) score, calculated by applying statistical formulations to different multi-system biomarkers, has been associated with depressive symptoms. Aims and Objectives: To test the hypothesis that a combination of allostatic load (AL) biomarkers will form a predictive algorithm in defining clinically meaningful outcomes in a population of patients presenting with depressive symptoms. The key objectives were: 1. To explore the relationship between various allostatic load biomarkers and the prevalence of depressive symptoms in patients, especially patients diagnosed with three common cardiometabolic diseases (Coronary Heart Disease (CHD), Diabetes and Stroke). 2. To explore whether allostatic load biomarkers predict clinical outcomes in patients with depressive symptoms, especially patients with three common cardiometabolic diseases (CHD, Diabetes and Stroke). 3. To develop a predictive tool to identify individuals with depressive symptoms at highest risk of adverse clinical outcomes. Methods: Datasets used: 'DepChron' was a dataset of 35,537 patients with existing cardiometabolic disease collected as a part of routine clinical practice. 'Psobid' was a research data source containing health-related information from 666 participants recruited from the general population. The clinical outcomes for both datasets were studied using electronic data linkage to hospital and mortality health records, undertaken by the Information Services Division, Scotland. Cross-sectional associations between allostatic load biomarkers calculated at baseline and the clinical severity of depression assessed by a symptom score were assessed using logistic and linear regression models in both datasets. Cox's proportional hazards survival analysis models were used to assess the relationship between allostatic load biomarkers at baseline and the risk of adverse physical health outcomes at follow-up in patients with depressive symptoms. The possibility of interaction between depressive symptoms and allostatic load biomarkers in risk prediction of adverse clinical outcomes was studied using analysis of variance (ANOVA) tests. Finally, the value of constructing a risk scoring scale using patient demographics and allostatic load biomarkers for predicting adverse outcomes in depressed patients was investigated using clinical risk prediction modelling and Area Under Curve (AUC) statistics. Key Results: Literature Review Findings: The literature review showed that twelve blood-based peripheral biomarkers were statistically significant in predicting six different clinical outcomes in participants with depressive symptoms. Outcomes related to both mental health (depressive symptoms) and physical health were statistically associated with pre-treatment levels of peripheral biomarkers; however, only two studies investigated outcomes related to physical health.
Cross-sectional Analysis Findings: In DepChron, dysregulation of individual allostatic biomarkers (mainly cardiometabolic) was found to have a non-linear association with an increased probability of co-morbid depressive symptoms (as assessed by a Hospital Anxiety and Depression Score of HADS-D≥8). A composite AI score constructed using five biomarkers did not lead to any improvement in the observed strength of the association. In Psobid, BMI was found to have a significant cross-sectional association with the probability of depressive symptoms (assessed by a General Health Questionnaire score of GHQ-28≥5). BMI, triglycerides, high-sensitivity C-reactive protein (CRP) and high-density lipoprotein (HDL) cholesterol were found to have a significant cross-sectional relationship with the continuous measure of GHQ-28. A composite AI score constructed using 12 biomarkers did not show a significant association with depressive symptoms among Psobid participants. Longitudinal Analysis Findings: In DepChron, three clinical outcomes were studied over four years: all-cause death, all-cause hospital admissions, and a composite major adverse cardiovascular event outcome, MACE (cardiovascular death or admission due to MI/stroke/HF). The presence of depressive symptoms and a composite AI score calculated using mainly peripheral cardiometabolic biomarkers were found to have significant associations with all three clinical outcomes over the following four years in DepChron patients. There was no evidence of an interaction between AI score and the presence of depressive symptoms in risk prediction of any of the three clinical outcomes. There was a statistically significant interaction noted between SBP and depressive symptoms in risk prediction of the major adverse cardiovascular outcome, and also between HbA1c and depressive symptoms in risk prediction of all-cause mortality for patients with diabetes. In Psobid, depressive symptoms (assessed by GHQ-28≥5) did not have a statistically significant association with any of the four outcomes under study at seven years: all-cause death, all-cause hospital admission, MACE and incidence of new cancer. A composite AI score at baseline had a significant association with the risk of MACE at seven years, after adjusting for confounders. A continuous measure of IL-6 observed at baseline had a significant association with the risk of three clinical outcomes: all-cause mortality, all-cause hospital admissions and major adverse cardiovascular events. Raised total cholesterol at baseline was associated with a lower risk of all-cause death at seven years, while a raised waist-hip ratio (WHR) at baseline was associated with a higher risk of MACE at seven years among Psobid participants. There was no significant interaction between depressive symptoms and peripheral biomarkers (individual or combined) in risk prediction of any of the four clinical outcomes under consideration. Risk Scoring System Development: In the DepChron cohort, a scoring system was constructed based on eight baseline demographic and clinical variables to predict the risk of MACE over four years. The AUC value for the risk scoring system was modest at 56.7% (95% CI 55.6 to 57.5%). In Psobid, it was not possible to perform this analysis due to the low event rate observed for the clinical outcomes. Conclusion: Individual peripheral biomarkers were found to have a cross-sectional association with depressive symptoms both in patients with cardiometabolic disease and in middle-aged participants recruited from the general population.
AI score calculated with different statistical formulations was of no greater benefit in predicting concurrent depressive symptoms or clinical outcomes at follow-up, over and above its individual constituent biomarkers, in either patient cohort. SBP had a significant interaction with depressive symptoms in predicting cardiovascular events in patients with cardiometabolic disease; HbA1c had a significant interaction with depressive symptoms in predicting all-cause mortality in patients with diabetes. Peripheral biomarkers may have a role in predicting clinical outcomes in patients with depressive symptoms, especially for those with existing cardiometabolic disease, and this merits further investigation.
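As a rough illustration of one common composite-AI formulation (a quartile-count index; the thesis compares several statistical formulations, and the biomarker panel below is an assumption, not its actual data):

```python
import numpy as np

def allostatic_index(values, high_risk_upper=True):
    """values: (n_subjects, n_biomarkers) array. Scores 1 per biomarker in
    its high-risk quartile and sums across biomarkers per subject."""
    q75 = np.percentile(values, 75, axis=0)
    q25 = np.percentile(values, 25, axis=0)
    # Direction matters: e.g. high SBP is risky, but low HDL is risky.
    flags = values >= q75 if high_risk_upper else values <= q25
    return flags.sum(axis=1)

rng = np.random.default_rng(2)
panel = rng.normal(size=(100, 5))    # e.g. SBP, HbA1c, CRP, BMI, triglycerides
print(allostatic_index(panel)[:10])  # AI score per subject, range 0..5
```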
Abstract:
Background: This article examines the concepts of low glycemic index (GI) and low glycemic load (GL) foods as key drivers in the dietary management of type 2 diabetes, as well as their shortcomings. The controversies arising from the analysis of the GI and GL of foods, such as their reproducibility and their relevance to the dietary management of type 2 diabetes, are also discussed. Methods: Searches were conducted in relevant electronic databases (PubMed, Google Scholar, HINARI, the Cochrane Library, Popline, LILACS, CINAHL, EMBASE) to identify the current status of knowledge regarding the controversies surrounding the management of diabetes with low GI and GL foods. Conclusion: This article suggests that, in view of the discrepancies that surround the results of GI versus GL of foods, any assay of the GI and GL of a food undertaken with the aim of recommending the food for the dietary management of type 2 diabetes should be balanced with glycated hemoglobin assays before the food is adopted as a useful antidiabetic food.
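For reference, glycemic load combines a food's glycemic index with the carbohydrate dose actually eaten: GL = GI × available carbohydrate (g) / 100. A trivial sketch, using an illustrative textbook-style GI value rather than an assay result:

```python
def glycemic_load(gi, carbs_g):
    """GL = GI * available carbohydrate in grams / 100."""
    return gi * carbs_g / 100.0

# e.g. a 45 g-carbohydrate serving of white rice at GI ~73 -> GL ~33 ("high",
# since GL >= 20 is conventionally classed as high, GL <= 10 as low).
print(glycemic_load(gi=73, carbs_g=45))
```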