583 results for optimising compiler


Relevance:

10.00%

Publisher:

Abstract:

This paper combines the idea of a hierarchical distributed genetic algorithm with different inter-agent partnering strategies. Cascading clusters of sub-populations are built from the bottom up, with higher-level sub-populations optimising larger parts of the problem. Hence, higher-level sub-populations search a larger search space at a lower resolution, whilst lower-level sub-populations search a smaller search space at a higher resolution. The effects of different partner selection schemes amongst the agents on solution quality are examined for two multiple-choice optimisation problems. It is shown that partnering strategies that exploit problem-specific knowledge are superior and can counter inappropriate (sub-)fitness measurements.
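The partnering idea above can be sketched in a few lines. The following is an illustrative toy, not the paper's implementation: the multiple-choice problem, the two-agent split of the groups, and the deterministic hill-climbing step are all invented for demonstration, and each agent scores its partial solutions by completing them with its partner's current best part.

```python
# Hypothetical multiple-choice problem: pick one option per group, maximise the sum.
VALUES = [[3, 1, 4], [1, 5, 9], [2, 6, 5], [3, 5, 8]]  # 4 groups x 3 options

def fitness(choices):
    return sum(VALUES[g][c] for g, c in enumerate(choices))

def partnered_fitness(part, my_groups, partner_part, partner_groups):
    # Sub-fitness: complete this agent's partial solution with the partner's part.
    full = [0] * len(VALUES)
    for g, c in zip(my_groups, part):
        full[g] = c
    for g, c in zip(partner_groups, partner_part):
        full[g] = c
    return fitness(full)

def improve(part, my_groups, partner_part, partner_groups):
    # The evolutionary step, reduced here to deterministic hill climbing:
    # try every single-gene variant and keep the best under partnered fitness.
    best = part
    for i in range(len(part)):
        for option in range(3):
            cand = best[:i] + [option] + best[i + 1:]
            if (partnered_fitness(cand, my_groups, partner_part, partner_groups)
                    > partnered_fitness(best, my_groups, partner_part, partner_groups)):
                best = cand
    return best

# Two low-level agents split the groups; a higher level would stitch parts together.
groups_a, groups_b = [0, 1], [2, 3]
part_a, part_b = [0, 0], [0, 0]
for _ in range(3):  # co-evolve, always partnering with the other agent's current best
    part_a = improve(part_a, groups_a, part_b, groups_b)
    part_b = improve(part_b, groups_b, part_a, groups_a)

print(fitness(part_a + part_b))  # 27 = 4 + 9 + 6 + 8
```

Because the toy fitness is additive across groups, this "best-partner" scheme converges to the optimum; the paper's point is precisely that with inappropriate sub-fitness measurements the choice of partnering strategy starts to matter.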

Relevance:

10.00%

Publisher:

Abstract:

This paper addresses current changes in the highly diverse European landscape, and the way these transitions are being treated in policy and landscape management in the fragmented, heterogeneous and dynamic context of today’s Europe. It appears that intersecting driving forces are increasing the complexity of European landscapes and causing polarising developments in agricultural land use, biodiversity conservation and cultural landscape management. On the one hand, multifunctional rural landscapes, especially in peri-urban regions, provide services and functions that serve the citizens in their demand for identity, support their sense of belonging and offer opportunities for recreation and involvement in practical landscape management. On the other hand, industrial agricultural production on increasingly large farms produces food, feed, fibre and energy to serve expanding international markets, with rural liveability and accessibility as a minor issue. The intermediate areas of traditionally dominant small and family farms in Europe seem to be gradually declining in profitability. The paper discusses the potential of a governance approach that can cope with the requirement of optimising land-sharing conditions and community-based landscape development, while adapting to global market conditions.

Relevance:

10.00%

Publisher:

Abstract:

This paper reports on an attempt to apply Genetic Algorithms to the problem of optimising a complex system through discrete event simulation (simulation optimisation), with a view to reducing the noise associated with such a procedure. We apply the proposed solution approach to our application test bed, a crossdocking distribution centre, because it is a good representative of the random and unpredictable behaviour of complex systems, e.g. random failures of automated machinery and variability in manual order-picker skill. It is known that the output of discrete event simulation modelling is noisy. Our interest, however, focuses on the effect of this noise on the evaluation of the fitness of candidate solutions within the search space, and on the development of techniques to handle it. The unique quality of our proposed approach is that we intend to embed a noise-reduction technique in our Genetic Algorithm based optimisation procedure, so that it is robust enough to handle noise, efficiently estimates a suitable fitness function, and produces good-quality solutions with minimal computational effort.
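The core noise-handling idea, that a single simulation run gives an unreliable fitness value while averaging replications stabilises the ranking of candidates, can be sketched as follows. The quadratic "simulation" and all numbers below are hypothetical stand-ins, not the crossdocking model.

```python
import random

random.seed(42)

# Hypothetical stand-in for one run of a stochastic simulation: the true
# fitness of candidate x is -(x - 3)**2 (best at x = 3), but every run adds
# Gaussian noise, as the output of a discrete event simulation would.
def simulate(x):
    return -(x - 3) ** 2 + random.gauss(0, 4)

def estimated_fitness(x, replications):
    # Noise reduction in its simplest form: average several independent
    # replications instead of trusting a single noisy run.
    return sum(simulate(x) for _ in range(replications)) / replications

candidates = list(range(7))  # stand-in for the individuals a GA would rank

noisy_best = max(candidates, key=lambda x: estimated_fitness(x, 1))
stable_best = max(candidates, key=lambda x: estimated_fitness(x, 500))
print(noisy_best, stable_best)
```

Ranking on a single run can select the wrong candidate; with enough replications the estimate reliably recovers the true optimum (x = 3). Embedding such re-evaluation inside a GA trades extra simulation runs for more trustworthy selection.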

Relevance:

10.00%

Publisher:

Abstract:

Compiler varies.

Relevance:

10.00%

Publisher:

Abstract:

Dynamically typed programming languages such as JavaScript and Python defer type checking until run time. To optimise the performance of these languages, virtual machine implementations for dynamic languages must try to eliminate redundant dynamic type tests. This is usually done with a type inference analysis. However, such analyses are often costly and involve trade-offs between compilation time and the precision of the results obtained. This has led to the design of increasingly complex VM architectures. We propose lazy basic block versioning, a simple just-in-time compilation technique that effectively eliminates redundant dynamic type tests on critical execution paths. This novel approach lazily generates specialised versions of basic blocks while propagating contextualised typing information. Our technique does not require costly program analyses, is not constrained by the precision limitations of traditional type inference analyses, and avoids the complexity of speculative optimisation techniques. Three extensions are made to basic block versioning to give it interprocedural optimisation capabilities. A first extension gives it the ability to attach typing information to object properties and global variables. Entry point specialisation then allows it to pass typing information from calling functions to callees. Finally, call continuation specialisation allows the types of callees' return values to be transmitted back to callers without dynamic overhead.
We demonstrate empirically that these extensions allow basic block versioning to eliminate more dynamic type tests than any static type inference analysis.

Relevance:

10.00%

Publisher:

Abstract:

This paper combines the idea of a hierarchical distributed genetic algorithm with different inter-agent partnering strategies. Cascading clusters of sub-populations are built from the bottom up, with higher-level sub-populations optimising larger parts of the problem. Hence, higher-level sub-populations search a larger search space at a lower resolution, whilst lower-level sub-populations search a smaller search space at a higher resolution. The effects of different partner selection schemes for (sub-)fitness evaluation purposes are examined for two multiple-choice optimisation problems. It is shown that random partnering strategies perform best by providing better sampling and more diversity.


Relevance:

10.00%

Publisher:

Abstract:

This paper combines the idea of a hierarchical distributed genetic algorithm with different inter-agent partnering strategies. Cascading clusters of sub-populations are built from the bottom up, with higher-level sub-populations optimising larger parts of the problem. Hence, higher-level sub-populations search a larger search space at a lower resolution, whilst lower-level sub-populations search a smaller search space at a higher resolution. The effects of different partner selection schemes amongst the agents on solution quality are examined for two multiple-choice optimisation problems. It is shown that partnering strategies that exploit problem-specific knowledge are superior and can counter inappropriate (sub-)fitness measurements.

Relevance:

10.00%

Publisher:

Abstract:

Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for progressing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists. We will discuss our experiences of implementing these concepts working with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes down to the actual application of these guidelines considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented. Often they are not very useful for giving answers to speculative ‘what-if’ questions, particularly when one is interested in the development of the system over time rather than just the state of the system at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of simulated actors at the micro level. Structures emerge at the macro level as a result of the actions of the agents and their interactions with other agents and the environment. We will show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as “will staff setting their own break times improve performance?” can be investigated.
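The shape of such a "what-if" experiment can be sketched in a few lines. Everything below (shift length, arrival rate, break rules) is a hypothetical illustration, not the department-store model: two runs differ only in the management practice, and the macro-level outcome, customers served, emerges from the agents' micro-level decisions.

```python
import random

random.seed(7)

def run_shift(self_set_breaks, minutes=480, n_staff=4):
    served, queue = 0, 0
    break_end = [None] * n_staff    # minute at which each agent's break ends
    had_break = [False] * n_staff
    for t in range(minutes):
        queue += random.randint(0, 6)          # customers arriving this minute
        for i in range(n_staff):
            if break_end[i] is not None and t < break_end[i]:
                continue                       # this agent is on a break
            if not had_break[i]:
                # Micro-level decision rule: either management fixes the break
                # time, or the agent chooses one when the queue happens to be empty.
                due = (not self_set_breaks and t >= 240) or \
                      (self_set_breaks and t >= 120 and queue == 0)
                if due:
                    break_end[i], had_break[i] = t + 30, True
                    continue
            if queue > 0:                      # serve one customer per minute
                queue -= 1
                served += 1
    return served

# Macro-level outcomes emerge from the agents' decisions and interactions:
fixed = run_shift(self_set_breaks=False)
flexible = run_shift(self_set_breaks=True)
print(fixed, flexible)
```

Comparing the two outputs over many replications is exactly the kind of experiment that lets one ask "will staff setting their own break times improve performance?" before implementing the practice.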

Relevance:

10.00%

Publisher:

Abstract:

Medical image processing in general, and brain image processing in particular, are computationally intensive tasks. Fortunately, their cost can be alleviated by means of techniques such as GPU programming. In this article we study NiftyReg, a brain image processing library with a GPU implementation using CUDA, and analyse different possible ways of further optimising the existing code. We focus on fully using the memory hierarchy and on exploiting the computational power of the CPU. The ideas that lead us towards the different attempts to change and optimise the code are presented as hypotheses, which we then test empirically using the results obtained from running the application. Finally, for each set of related optimisations we study the validity of the obtained results in terms of both performance and the accuracy of the resulting images.
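The hypothesis-then-measure methodology described above can be illustrated in miniature. The kernel below is a hypothetical 1-D filter, not NiftyReg code: an optimisation hypothesis is applied, and both performance and accuracy are then checked empirically, as the article does for each set of related optimisations.

```python
import time

def blur_naive(img):
    # Reference: 1-D three-point mean filter, indexing neighbours every time.
    n = len(img)
    return [(img[max(i - 1, 0)] + img[i] + img[min(i + 1, n - 1)]) / 3
            for i in range(n)]

def blur_optimised(img):
    # Hypothesis: carrying the window in locals (the "fast memory" of this toy)
    # avoids repeated indexing and should be faster.
    n = len(img)
    out = []
    left, centre = img[0], img[0]
    for i in range(n):
        right = img[i + 1] if i + 1 < n else img[n - 1]
        out.append((left + centre + right) / 3)
        left, centre = centre, right
    return out

img = [float(i % 97) for i in range(100_000)]

t0 = time.perf_counter(); ref = blur_naive(img); t1 = time.perf_counter()
opt = blur_optimised(img); t2 = time.perf_counter()

# Accuracy check: the optimised code must reproduce the reference output.
max_err = max(abs(a - b) for a, b in zip(ref, opt))
print(f"naive {t1 - t0:.3f}s  optimised {t2 - t1:.3f}s  max_err {max_err:.1e}")
```

Whether the hypothesis holds is decided by the measured timings, not assumed in advance; the accuracy check guards against an "optimisation" that silently changes the resulting image.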

Relevance:

10.00%

Publisher:

Abstract:

Background: Coronary heart disease (CHD) is a common and potentially fatal condition with a lifetime prevalence of over 20%. In Germany alone, the number of deaths caused annually by ischaemic heart disease and acute myocardial infarction is estimated at around 140,000. A link between lifestyle-related risk factors and the onset and prognosis of CHD has been established. Non-pharmacological secondary prevention measures attempt to modify these risk factors favourably and, in contrast to palliative interventional treatment strategies, to treat CHD causally. A whole series of individual studies on the effectiveness of non-pharmacological secondary prevention exists; to our knowledge, however, a systematic analysis summarising the evidence for all the main secondary prevention strategies has so far been lacking. An evaluation of existing studies on the cost-effectiveness of these measures is also to be integrated. Research question: The aim of this HTA report (HTA = Health Technology Assessment) is to compile a comprehensive overview of the current literature on non-pharmacological secondary prevention measures in the treatment of CHD, in order to assess these measures and their components with regard to their medical effectiveness and cost-effectiveness. Furthermore, the ethical, social and legal aspects of non-pharmacological secondary prevention and the transferability of the results to everyday care in Germany are to be examined. Methods: Relevant publications are identified through a structured, highly sensitive database search as well as by hand search. The literature search was carried out on 18 September 2008 in four individual searches covering medical, health-economic, ethical and legal topics, and covers the past five years.
The methodological quality of the publications is systematically assessed by two independent reviewers each, taking into account criteria of evidence-based medicine (EBM). Results: Of a total of 9,074 hits, 43 medical publications meet the selection criteria, with follow-up periods between twelve and 120 months. Overall, study quality is satisfactory; however, only about half of the studies report all-cause mortality separately, while the remaining studies use other outcome measures. The effectiveness of individual secondary prevention measures is very heterogeneous. Overall, a long-term reduction in cardiac and all-cause mortality and in the frequency of cardiac events, as well as an increase in quality of life, can be observed. An effective reduction in mortality is observed above all for exercise-based and multimodal interventions, while psychosocial interventions appear to be particularly effective in improving quality of life. For the economic evaluations, 26 publications are identified whose topic and study type can be assigned to the context considered here. Overall, the evidence on multimodal rehabilitation is better, both in quantity and in the quality of the analyses, than that on evaluations of individual measures. The international literature confirms a good ratio of cost to effectiveness for multimodal approaches, but examines almost exclusively outpatient or home-based measures. The evaluation of studies dealing with the cost-effectiveness of individual preventive measures yields positive tendencies only for smoking-cessation and physical-exercise interventions.
With regard to psychosocial measures and dietary change, no statements on cost-effectiveness can be made owing to the insufficient evidence. A total of eleven publications are included in the consideration of social aspects of non-pharmacological secondary prevention. These relatively recent studies confirm that patients of low socioeconomic status have poorer starting conditions overall and therefore a specific need for rehabilitative support. At the same time, researchers disagree as to whether this patient group participates in rehabilitation measures relatively more or less frequently. Psychological factors, physical limitations, and societal and system-related influences are cited as barriers that keep patients from participating in preventive measures. Discussion: Non-pharmacological secondary prevention measures are safe and able to reduce mortality and the frequency of cardiac events and to increase quality of life. Since only a few of the methodologically reliable studies follow participants over a longer period of at least 60 months, statements on sustainability must be regarded as limited. Reliable statements regarding relevant patient subgroups, as well as comparative assessments of different secondary prevention measures, can be made only to a very limited extent, since these were insufficiently investigated by the included studies. Future methodologically reliable studies are needed to examine and answer these questions. With regard to the cost-effectiveness of non-pharmacological secondary prevention measures, an overall positive conclusion can be drawn from the international studies.
However, this conclusion is limited on the one hand by the particular features of the German system of inpatient rehabilitation services, and on the other by the methodologically poor evaluations of individual measures. Studies aimed at assessing the cost-effectiveness of inpatient rehabilitation services are needed, as are high-quality investigations of individually delivered preventive measures. From a social perspective, it should be investigated in particular which patient groups refrain from participating in rehabilitation or preventive measures, for what reasons, and how these arguments could be addressed. Conclusion: Non-pharmacological secondary prevention measures can reduce mortality and the frequency of cardiac events and increase quality of life. Against this background, strengthening the role of non-pharmacological secondary prevention appears necessary. An appropriate ratio of effectiveness to cost can also be assumed for some interventions. However, there remains a considerable need for research on the effectiveness of non-pharmacological secondary prevention in important patient subgroups and on the efficiency of the many programmes offered. In addition, further research is needed to examine in detail the sustainability of the measures and the reasons for non-participation. Above all, everyday care in Germany, as experienced by physicians, patients and other healthcare actors, needs to be investigated, and the current role of non-pharmacological measures demonstrated.

Relevance:

10.00%

Publisher:

Abstract:

Chronic kidney disease (CKD) is associated with increased cardiovascular risk in comparison with the general population. This can be observed even in the early stages of CKD, and rises in proportion to the degree of renal impairment. Not only is cardiovascular disease (CVD) more prevalent in CKD, but its nature differs too, with an excess of morbidity and mortality associated with congestive cardiac failure, arrhythmia and sudden death, as well as the accelerated atherosclerosis which is also observed. Conventional cardiovascular risk factors such as hypertension, dyslipidaemia, obesity, glycaemia and smoking, are highly prevalent amongst patients with CKD, although in many of these examples the interaction between risk factor and disease differs from that which exists in normal renal function. Nevertheless, the extent of CVD cannot be fully explained by these conventional risk factors, and non-conventional factors specific to CKD are now recognised to contribute to the burden of CVD. Oxidative stress is a state characterised by excessive production of reactive oxygen species (ROS) and other radical species, a reduction in the capacity of antioxidant systems, and disturbance in normal redox homeostasis with depletion of protective vascular signalling molecules such as nitric oxide (NO). This results in oxidative damage to macromolecules such as lipids, proteins and DNA which can alter their functionality. Moreover, many enzymes are sensitive to redox regulation such that oxidative modification to cysteine thiol groups results in activation of signalling cascades which result in adverse cardiovascular effects such as vascular and endothelial dysfunction. Endothelial dysfunction and oxidative stress are present in association with many conventional cardiovascular risk factors, and can be observed even prior to the development of overt, clinical, vascular pathology, suggesting that these phenomena represent the earliest stages of CVD. 
In the presence of CKD, there is increased ROS production due to upregulated NADPH oxidase (NOX), an increase in circulating asymmetric dimethylarginine (ADMA), and uncoupling of endothelial nitric oxide synthase (eNOS), as well as other mechanisms. There is also depletion of exogenous antioxidants such as ascorbic acid and tocopherol, and a reduction in the activity of endogenous antioxidant systems regulated by the master gene regulator Nrf-2. In previous studies, circulating markers of oxidative stress have been shown to be increased in CKD, together with a stepwise reduction in endothelial function relating to the severity of renal impairment. Not only is CVD linked to oxidative stress, but the progression of CKD itself is also in part dependent on redox-sensitive mechanisms. For example, administration of the ROS scavenger tempol attenuates renal injury and reduces renal fibrosis seen on biopsy in a mouse model of CKD, whilst conversely, supplementation with the NOS inhibitor L-NAME causes proteinuria and renal impairment. However, previous human studies examining the effect of antioxidant administration on vascular and renal function have been conflicting. The work contained in this thesis therefore examines the effect of antioxidant administration on vascular and endothelial function in CKD. Firstly, 30 patients with CKD stages 3–5 and 20 matched hypertensive controls were recruited. Participants with CKD had lower ascorbic acid, higher TAP and ADMA, together with a higher augmentation index and pulse wave velocity. There was no difference in baseline flow-mediated dilatation (FMD) between the groups. Intravenous ascorbic acid increased TAP and O2−, reduced central BP and augmentation index in both groups, and lowered ADMA in the CKD group only. No effect on FMD was observed.
The effects of ascorbic acid on kidney function were then investigated; however, this was hindered by the inherent drawbacks of existing methods of non-invasively measuring kidney function. Arterial spin labelling (ASL) MRI is an emerging imaging technique which allows measurement of renal perfusion without administration of an exogenous contrast agent. The technique relies upon application of an inversion pulse to blood within the vasculature proximal to the kidneys, which magnetically labels protons, allowing measurement upon transit to the kidney. At the outset of this project local experience using ASL MRI was limited, and there ensued a prolonged pre-clinical phase of testing with the aim of optimising the imaging strategy. A study was then designed to investigate the repeatability of ASL MRI in a group of 12 healthy volunteers with normal renal function. The measured T1 longitudinal relaxation times and ASL MRI perfusion values were in keeping with those found in the literature; T1 time was 1376 ms in the cortex and 1491 ms in the whole-kidney ROI, whilst perfusion was 321 mL/min/100g in the cortex and 228 mL/min/100g in the whole-kidney ROI. Good reproducibility was demonstrated on Bland-Altman analysis, with a CVws of 9.2% for cortical perfusion and 7.1% for whole-kidney perfusion. Subsequently, in a study of 17 patients with CKD and 24 healthy volunteers, the effects of ascorbic acid on renal perfusion were investigated. Although no change in renal perfusion was found following ascorbic acid, ASL MRI demonstrated significant differences between those with normal renal function and participants with CKD stages 3–5, with increased cortical and whole-kidney T1, and reduced cortical and whole-kidney perfusion. Interestingly, absolute perfusion showed a weak but significant correlation with progression of kidney disease over the preceding year.
Ascorbic acid was therefore shown to have a significant effect on vascular biology both in CKD and in those with normal renal function, and to reduce ADMA only in patients with CKD. ASL MRI has shown promise as a non-invasive investigation of renal function and as a biomarker to identify individuals at high risk of progressive renal impairment.
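The repeatability figure quoted above (CVws) is conventionally computed from scan/rescan pairs as the within-subject standard deviation, sqrt(sum(d_i^2) / 2n), divided by the grand mean. The measurements below are invented for illustration and are not the study's data.

```python
from math import sqrt

# Hypothetical paired repeat perfusion measurements (mL/min/100g) in 5 subjects.
scan1 = [310.0, 335.0, 298.0, 342.0, 315.0]
scan2 = [322.0, 318.0, 305.0, 330.0, 327.0]

n = len(scan1)
diffs = [a - b for a, b in zip(scan1, scan2)]
s_w = sqrt(sum(d * d for d in diffs) / (2 * n))    # within-subject SD (two repeats)
grand_mean = (sum(scan1) + sum(scan2)) / (2 * n)
cv_ws = 100 * s_w / grand_mean                     # within-subject CV, in percent
print(f"CVws = {cv_ws:.1f}%")
```

A lower CVws indicates that scan/rescan differences are small relative to the typical perfusion value, which is what qualifies the technique as repeatable.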

Relevance:

10.00%

Publisher:

Abstract:

A smart solar photovoltaic grid system is an innovation that brings together information and communications technology (ICT) and power systems control engineering via the internet [1]. This thesis designs and demonstrates a smart solar photovoltaic grid system that is self-healing and environmentally and consumer friendly, with the ability to accommodate other renewable sources of energy generation seamlessly, creating a healthy, competitive energy industry and optimising the efficiency of energy assets. The thesis also presents the modelling of an efficient, dynamic smart solar photovoltaic power grid system, exploring maximum power point tracking efficiency and the optimisation of the smart solar photovoltaic array through modelling and simulation, to improve the quality of design of the solar photovoltaic module. Although quite promising results have been published in the literature over the past decade, most have not addressed the research questions posed in this thesis. The Levenberg-Marquardt and sparse-based algorithms have proven to be very effective tools in helping to improve the quality of design of solar photovoltaic modules, minimising the possible relative errors. Guided by theoretical and analytical reviews of the literature, this research has chosen the MatLab/Simulink software toolbox for the modelling and simulation experiments performed on the static smart solar grid system. The auto-correlation coefficient results obtained from the modelling experiments give an accuracy of 99% with negligible mean square error (MSE), root mean square error (RMSE) and standard deviation. This thesis further explores the design and implementation of a robust real-time online solar photovoltaic monitoring system, establishing a comparative study of two solar photovoltaic tracking systems which provide remote access to the harvested energy data.
This research made a landmark innovation in designing and implementing a unique approach for online remote-access solar photovoltaic monitoring systems, providing updated information on the energy produced by the solar photovoltaic module at the site location. In addressing the challenge of online solar photovoltaic monitoring, the Darfon online data logger device was systematically integrated into the design for a comparative study of the two solar photovoltaic tracking systems examined in this thesis. The site for the comparative study of the solar photovoltaic tracking systems is the National Kaohsiung University of Applied Sciences, Taiwan, R.O.C. The overall comparative energy output efficiency of the azimuth-altitude dual-axis system over the 45° stationary solar photovoltaic monitoring system, as observed at the research site, is about 72%, based on the total energy produced, the estimated money saved and the amount of CO2 reduction achieved. Similarly, in comparing the total amount of energy produced by the two solar photovoltaic tracking systems, the overall daily generated energy for the month of July shows the effectiveness of the azimuth-altitude tracking system over the 45° stationary solar photovoltaic system. It was found that the azimuth-altitude dual-axis tracking system was about 68.43% efficient compared with the 45° stationary solar photovoltaic system. Lastly, the overall comparative hourly energy efficiency of the azimuth-altitude dual-axis system over the 45° stationary solar photovoltaic energy system was found to be 74.2%. Results from this research are promising and significant in satisfying the research objectives and questions posed in the thesis. The new algorithms introduced in this research and the statistical measures applied to the modelling and simulation of the smart static solar photovoltaic grid system outperformed previous works in the reviewed literature.
Based on this new implementation design of the online data logging system for solar photovoltaic monitoring, it is possible for the first time to have online, on-site information on the energy produced remotely, with fault identification and rectification, maintenance and recovery deployed as fast as possible. The results presented in this research on the Internet of Things (IoT) for smart solar grid systems are likely to offer real-life experience both to the existing body of knowledge and to the future solar photovoltaic energy industry, irrespective of the study site location for the comparative solar photovoltaic tracking systems. While the thesis has contributed to the smart solar photovoltaic grid system, it has also highlighted areas for further research and the need to investigate further improvements to the choice and quality of design of solar photovoltaic modules. Finally, it has made recommendations for further research on the minimisation of the absolute or relative errors in the quality and design of the smart static solar photovoltaic module.
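Maximum power point tracking, mentioned above, can be sketched with the classic perturb-and-observe scheme. This is a generic textbook algorithm, not the thesis's model; the power curve and step size below are hypothetical.

```python
def pv_power(v):
    # Toy PV power curve (watts vs volts) with a single maximum near v = 17.5 V.
    return max(0.0, -0.2 * (v - 17.5) ** 2 + 60.0)

def track_mpp(v=12.0, step=0.25, iterations=200):
    # Perturb-and-observe: nudge the operating voltage; if power fell,
    # reverse the direction of the perturbation.
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:
            direction = -direction
        p_prev = p
    return v

v_mpp = track_mpp()
print(round(v_mpp, 2), round(pv_power(v_mpp), 2))
```

The tracker climbs the curve and then oscillates within one step of the maximum power point, which is why the step size trades tracking speed against steady-state ripple.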

Relevance:

10.00%

Publisher:

Abstract:

A poster of this paper will be presented at the 25th International Conference on Parallel Architecture and Compilation Technology (PACT ’16), September 11-15, 2016, Haifa, Israel.

Relevance:

10.00%

Publisher:

Abstract:

Background: Many acute stroke trials have given neutral results. Sub-optimal statistical analyses may be failing to detect efficacy. Methods which take account of the ordinal nature of functional outcome data are more efficient. We compare sample size calculations for dichotomous and ordinal outcomes for use in stroke trials. Methods: Data from stroke trials studying the effects of interventions known to positively or negatively alter functional outcome – Rankin Scale and Barthel Index – were assessed. Sample size was calculated using comparisons of proportions, means, medians (according to Payne), and ordinal data (according to Whitehead). The sample sizes obtained from each method were compared using Friedman two-way ANOVA. Results: Fifty-five comparisons (54,173 patients) of active vs. control treatment were assessed. Estimated sample sizes differed significantly depending on the method of calculation (P < 0.00001). The ordering of the methods showed that the ordinal method of Whitehead and the comparison of means produced significantly lower sample sizes than the other methods. The ordinal data method on average reduced sample size by 28% (inter-quartile range 14–53%) compared with the comparison of proportions; however, a 22% increase in sample size was seen with the ordinal method for trials assessing thrombolysis. The comparison of medians method of Payne gave the largest sample sizes. Conclusions: Choosing an ordinal rather than a binary method of analysis allows most trials to be, on average, smaller by approximately 28% for a given statistical power. Smaller trial sample sizes may help by reducing time to completion, complexity, and financial expense. However, ordinal methods may not be optimal for interventions which both improve functional outcome
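The two sample-size calculations being compared can be sketched as follows, using Whitehead's proportional-odds formula for the ordinal case and the standard comparison of proportions for the dichotomised case. The control-arm outcome distribution and the odds ratio below are illustrative assumptions, not the trial data analysed in the paper.

```python
from math import ceil, log

# Whitehead's total sample size for an ordinal outcome under proportional odds:
#   N = 6 * (z_alpha + z_beta)^2 / (log(OR)^2 * (1 - sum(pbar_i^3)))

z_alpha, z_beta = 1.96, 0.84   # 5% two-sided significance, 80% power
odds_ratio = 1.5               # assumed common (proportional) odds ratio

# Illustrative control-group probabilities over 7 outcome categories.
p_control = [0.05, 0.10, 0.15, 0.20, 0.20, 0.15, 0.15]

def proportional_odds_shift(p, or_):
    # Treatment-arm distribution implied by the common odds ratio applied
    # to each cumulative probability.
    out, cum, prev = [], 0.0, 0.0
    for pi in p:
        cum = min(1.0, cum + pi)
        new_cum = or_ * cum / (1.0 - cum + or_ * cum)
        out.append(new_cum - prev)
        prev = new_cum
    return out

p_treat = proportional_odds_shift(p_control, odds_ratio)
p_bar = [(a + b) / 2 for a, b in zip(p_control, p_treat)]

n_ordinal = 6 * (z_alpha + z_beta) ** 2 / (
    log(odds_ratio) ** 2 * (1 - sum(x ** 3 for x in p_bar)))

# Dichotomised comparison of proportions ("good outcome" = first 3 categories):
p1, p2 = sum(p_control[:3]), sum(p_treat[:3])
n_binary = 2 * (z_alpha + z_beta) ** 2 * (
    p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2

print(ceil(n_ordinal), ceil(n_binary))
```

With these illustrative inputs the ordinal calculation needs roughly a third of the dichotomised total; the 28% average reduction reported above is what the paper observed over the real trial datasets, so the gain in practice depends on the outcome distribution.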