905 results for "optimising compiler"


Relevance:

10.00%

Publisher:

Abstract:

Background and Objective: Clozapine has been available since the early 1990s, and studies continue to demonstrate its superior efficacy in treatment-resistant schizophrenia. Despite this, numerous studies show under-utilisation, delayed access and reluctance by psychiatrists to prescribe clozapine. This retrospective cross-sectional study compared the prescribing of clozapine in two adult cohorts under the care of large public mental health services in Auckland (New Zealand) and Birmingham (United Kingdom) on 31 March 2007. Method: Time from first presentation to clozapine initiation, prior antipsychotics trialled and antipsychotic co-prescribing were compared. Data included demographics, psychiatric diagnosis, co-morbid conditions, year of first presentation, admissions and pharmacological treatment (clozapine dose, start date, prior antipsychotics, co-prescribed antipsychotics). Results: Overall, 664 people were prescribed clozapine (402 Auckland; 262 Birmingham), with a mean daily dose of 384 mg (Auckland) and 429 mg (Birmingham). Of these, 53% presented after 1990. The average time before starting clozapine was significantly longer in the Birmingham cohort (6.5 vs. 5.3 years), but this fell in both cohorts to a mean of 1 year among those presenting within the last 3 years. The average number of antipsychotics trialled pre-clozapine for those presenting since 1990 was significantly higher in the Birmingham cohort (4.3 vs. 3.1), but in both cohorts this similarly declined among those presenting within the last 3 years. Antipsychotic co-prescribing was significantly higher in the Birmingham cohort (22.9% vs. 10.7%). Conclusions: There is evidence that access to clozapine has improved over time in both cohorts, with a reduction in both the duration between presentation and initiation of clozapine and the number of different antipsychotics trialled pre-clozapine. These are very positive findings in terms of optimising outcomes with clozapine and are possibly due to the impact of guideline recommendations; increasing clinician, consumer and carer knowledge of and experience with clozapine; and funding changes. © 2014 Springer International Publishing.

Abstract:

Link quality-based rate adaptation has been widely used for IEEE 802.11 networks. However, network performance is affected by both link quality and random channel access. Selecting transmit modes for optimal link throughput can cause medium access control (MAC) throughput loss. In this paper, we investigate this issue and propose a generalised cross-layer rate adaptation algorithm. It jointly considers link quality and channel access to optimise network throughput. The objective is to examine the potential benefits of cross-layer design. An efficient analytic model is proposed to evaluate rate adaptation algorithms under dynamic channel and multi-user access environments. The proposed algorithm is compared to a link throughput optimisation-based algorithm. It is found that rate adaptation optimising link-layer throughput alone can result in a large performance loss, which cannot be compensated for by optimising the MAC access mechanism alone. Results show that cross-layer design can achieve consistent and considerable performance gains of up to 20%, and it deserves to be exploited in practical designs for IEEE 802.11 networks.
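The trade-off described above can be illustrated with a toy model (not the paper's actual analytic model; the rates, success probabilities and overhead figure below are invented for illustration): a mode chosen for best link-layer goodput can lose at the MAC layer once per-frame access overhead and retransmissions are accounted for.

```python
# Illustrative sketch: link-layer vs. MAC-layer rate selection.
# PHY rates (Mbit/s) and hypothetical frame-success probabilities.
RATES_MBPS = [6, 12, 24, 48, 54]
P_SUCCESS = {6: 0.99, 12: 0.98, 24: 0.95, 48: 0.5, 54: 0.5}

def link_throughput(rate):
    # Link-layer view: goodput over the air only.
    return rate * P_SUCCESS[rate]

def mac_throughput(rate, payload_bits=12000, overhead_us=100.0):
    # MAC view: fixed per-attempt overhead (backoff, IFS, ACK, headers)
    # is paid again on every retransmission, diluting lossy high rates.
    tx_us = payload_bits / rate           # air time in microseconds
    per_attempt_us = tx_us + overhead_us
    expected_attempts = 1.0 / P_SUCCESS[rate]
    return payload_bits / (per_attempt_us * expected_attempts)

link_best = max(RATES_MBPS, key=link_throughput)   # picks 54 Mbit/s
mac_best = max(RATES_MBPS, key=mac_throughput)     # picks 24 Mbit/s
```

With these invented numbers, the link-optimal mode (54 Mbit/s) delivers less MAC throughput than the more reliable 24 Mbit/s mode, which is the loss the cross-layer design avoids.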

Abstract:

It is proposed to use one common model of computer for teaching different parts of the informatics course, covering both hardware and software subjects. The rationale for this approach is presented, and the most suitable themes of the course where it is practical are enumerated. The author's own development (including software support) – the educational model of the virtual computer "E97" and a Pascal compiler targeting it – is described. It is emphasised that the ideas discussed are applicable to any other similar model.

Abstract:

Purpose – The purpose of this paper is twofold: first, to provide a critical assessment of the literature on business incubation effectiveness and, second, to submit a situated theoretical perspective on how business incubation management can provide an environment that supports the development of incubatee entrepreneurs and their businesses.

Design/methodology/approach – The paper provides a narrative critical assessment of the literature on business incubation effectiveness. Definitional issues, performance aspects and approaches to establishing critical success factors in business incubation are discussed. Business incubation management is identified as an overarching factor for theorising on business incubation effectiveness.

Findings – The literature on business incubation effectiveness suffers from several deficiencies, including definitional incongruence, descriptive accounts, fragmentation and lack of strong conceptual grounding. Notwithstanding the growth of research in this domain, understanding of how entrepreneurs and their businesses develop within the business incubator environment remains limited. Given the importance of relational, intangible factors in business incubation and the critical role of business incubation management in orchestrating and optimising such factors, it is suggested that theorising efforts would benefit from a situated perspective.

Originality/value – The identification of specific shortcomings in the literature on business incubation highlights the need for more systematic efforts towards theory building. It is suggested that focusing on the role of business incubation management from a situated learning theory perspective can lend itself to a more profound understanding of the development process of incubatee entrepreneurs and their firms. Theoretical propositions are offered to this effect, as well as avenues for future research.

Abstract:

Science and art are considered as two distinct areas in the spectrum of human activities. Many scientists are inspired by art and many artists embed science in their work. This paper presents a one-year experiment, which started with benchmark tests of a compiler, passed through dynamic systems based on complex numbers and ended as a scientific art exhibition. The paper demonstrates that it is possible to blend science and art in a mutually beneficial way. It also shows how science can inspire the creation of artistic works, as well as how these works can inspire further scientific research.
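The paper does not specify which dynamic system on complex numbers it used; a classic example of the genre, sketched here for illustration, is the escape-time iteration z → z² + c behind Mandelbrot-set imagery, one of the most common bridges between numerical computing and visual art.

```python
def escape_time(c, max_iter=100):
    # Iterate z -> z^2 + c from z = 0 and count steps until |z| > 2.
    # Points that never escape belong to the Mandelbrot set.
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

def render_ascii(width=60, height=24):
    # Map escape counts to characters: a minimal "scientific art" render.
    chars = " .:-=+*#%@"
    rows = []
    for j in range(height):
        row = []
        for i in range(width):
            c = complex(-2.0 + 2.8 * i / width, -1.2 + 2.4 * j / height)
            n = escape_time(c)
            row.append(chars[min(n * len(chars) // 101, len(chars) - 1)])
        rows.append("".join(row))
    return "\n".join(rows)
```

Printing `render_ascii()` produces a recognisable Mandelbrot silhouette in a terminal; swapping the iteration function yields different artistic output from the same scientific machinery.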

Abstract:

This thesis begins with a review of the literature on wisdom models, theories of wise leadership, and existing wisdom measures. It continues with a review of how the concept of wisdom may add value to existing leadership models, highlighting the need to empirically identify the characteristics of wise leaders and to develop a wise leadership measure. A nomological framework for wise leadership is then presented. Based on a review of the wisdom and leadership paradigms, a mixed-methods research design is described for three studies to define the characteristics of wise leadership in organisations, identify specific leadership challenges that might require wise responses, and develop a wise leadership measure comprising vignettes. The first study involves critical incident interviews with 26 nominated wise leaders and 23 of their nominators, leading to the identification of nine wise leadership dimensions: Strong Ethical Code, Strong Judgement, Optimising Positive Outcomes, Managing Uncertainty, Strong Legacy, Leading with Purpose, Humanity, Humility, and Self-Awareness. The second study comprises critical incident interviews with 20 leaders about organisational challenges associated with the nine dimensions, to inform the wise leadership measure. The third study covers the design of 45 vignettes, based on organisational challenges, that measure the nine wise leadership dimensions. The measure is then administered to 250 organisational leaders to establish its construct validity, leading to the selection of the 18 vignettes forming the final wise leadership measure. Theoretical, methodological and practical implications of this research are then discussed, with recommendations for future research.

Abstract:

Lowering glucose levels while avoiding hypoglycaemia can be challenging in insulin-treated patients with diabetes. We evaluated the role of the ambulatory glucose profile in optimising glycaemic control in this population. Insulin-treated patients with type 1 and type 2 diabetes were recruited into a prospective, multicentre, 100-day study and randomised to control (n = 28) or intervention (n = 59) groups. The intervention group used the ambulatory glucose profile, generated by continuous glucose monitoring, to assess daily glucose levels, whereas the controls relied on capillary glucose testing. Patients were reviewed at days 30 and 45 by the health care professional to adjust insulin therapy. Comparing the first and last 2 weeks of the study, ambulatory glucose profile-monitored type 2 diabetes patients (n = 28) showed increased time in euglycaemia (mean ± standard deviation) of 1.4 ± 3.5 h/day (p = 0.0427), associated with a reduction in HbA1c from 77 ± 15 to 67 ± 13 mmol/mol (p = 0.0002) without increased hypoglycaemia. Type 1 diabetes patients (n = 25) showed a reduction in hypoglycaemia from 1.4 ± 1.7 to 0.8 ± 0.8 h/day (p = 0.0472), associated with a marginal HbA1c decrease from 75 ± 10 to 72 ± 8 mmol/mol (p = 0.0508). Largely similar findings were observed when comparing the intervention and control groups at the end of the study. In conclusion, the ambulatory glucose profile helps glycaemic management in insulin-treated diabetes patients by increasing time spent in euglycaemia and decreasing HbA1c in type 2 diabetes patients, while reducing hypoglycaemia in type 1 diabetes patients.

Abstract:

From 1992 to 2012, 4.4 billion people were affected by disasters worldwide, with almost 2 trillion USD in damages and 1.3 million people killed. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation and relief distribution have been identified as directly impacting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, along with room for improvement for Mexican organisations in flood management.
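As an illustration of the kind of decisions such models optimise (this is a generic sketch, not the thesis's actual formulation; the coordinates and weights are invented), the following fragment first screens out floodable candidate sites, mirroring the GIS filtering stage, then greedily opens facilities to minimise demand-weighted distance, a simple p-median heuristic:

```python
import math

def greedy_facility_location(candidates, demand_points, floodable, p):
    # Step 1 (cartographic filter): discard candidate sites flagged
    # as floodable.
    safe = [c for c in candidates if c not in floodable]

    def total_cost(open_sites):
        # Demand-weighted distance from each demand point to its
        # nearest open facility.
        return sum(w * min(math.dist(d, s) for s in open_sites)
                   for d, w in demand_points)

    # Step 2: greedily open p facilities, each time adding the site
    # that most reduces total cost (classic p-median heuristic).
    opened = []
    for _ in range(min(p, len(safe))):
        best = min((s for s in safe if s not in opened),
                   key=lambda s: total_cost(opened + [s]))
        opened.append(best)
    return opened
```

A real preparedness model would add stock prepositioning, capacities and the number of participating organisations as decision variables, but the structure of the location decision is the same.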

Abstract:

The Finance research team covered a wide range of research fields while taking part in project TÁMOP-4.2.1.B-09/1/KMR-2010-0005. It was shown that increasing financial gearing among economic actors at different levels clearly leads to growth in systemic risk, as the probability of bankruptcy of the individual actors rises. If leverage is limited to different degrees and at different points in time across sectors and countries, the actors subject to later restrictions clearly gain a competitive advantage. Investigating capital allocation at financial institutions, we showed that the capital (risk) covering the operation can always be divided among divisions in such a way that no party has an interest in cancelling the arrangement. However, this cannot always be done fairly from every point of view, so some divisions may face a competitive disadvantage if competitors burden the corresponding activity less unfairly. Research also showed that the regulation of private pension funds has a vital effect on the profitability of the funds' investment activity; these laws and regulations affect the long-term competitiveness of the whole society. We also found that, before the economic crisis, Hungarian banks were unable to correctly assess the risk-bearing capacity of their clients; on top of that, their commission-based income model gave them no interest in doing so. Several of our studies examined the competitiveness of Hungarian firms. The effects of various taxes, currency risks and financing policies on competitiveness were analysed in detail. A separate research project was dedicated to the effects on firm value of interest-rate volatility and of asset collateral linked to debt. The increasing risk of non-payment was underlined, and we also reviewed the management strategies potentially available and those actually used in practice. We also investigated how the shareholders of listed companies exploit the tax-optimisation possibilities linked to dividend payments; based on market evidence, investors carry out tax-avoiding trades for a significant share of the stocks. A separate study examined the role played by intellectual capital at Hungarian companies, finding that firms handled the issue with considerably higher proficiency in 2009 than five years earlier. We also pointed out that the ownership structure has a considerable influence on how firms construct their systems of goals and how they view their intangible assets.

Abstract:

The intent of this work was to develop a mobile robotic platform that was controlled by a Palm Pilot PDA. Advances in consumer electronics are producing powerful yet small handheld devices. Some of these devices present quasi-PC capabilities for a fraction of the cost; furthermore, they are compact enough that they fit in all but the smallest of platforms. The platform prototype built for testing purposes has a differential-drive configuration to provide simple but agile movement control. The sensor package consisted of two infrared ranging sensors mounted on servomotors that provide a wide area of detection. Building such a platform involved selection of hardware, circuit integration and software development. The software suite selected to develop code for the Palm Pilot was CodeWarrior, a C compiler that can generate code in Palm-native PRC files.
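The differential-drive configuration mentioned above has simple kinematics. As a sketch (the platform's actual control code was written in C with CodeWarrior; the wheel-base value in the usage below is hypothetical), the left and right wheel speeds follow directly from a desired forward speed and turn rate:

```python
def wheel_speeds(v, omega, wheel_base):
    # Differential-drive kinematics: convert forward speed v (m/s)
    # and turn rate omega (rad/s) into left/right wheel speeds.
    # Equal speeds drive straight; opposite speeds spin in place.
    left = v - omega * wheel_base / 2.0
    right = v + omega * wheel_base / 2.0
    return left, right
```

For example, `wheel_speeds(1.0, 0.0, 0.2)` drives straight ahead with both wheels at 1.0 m/s, while a pure turn rate yields equal and opposite wheel speeds, which is what makes the configuration agile.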

Abstract:

Water-alternating-gas (WAG) is an enhanced oil recovery method combining the improved macroscopic sweep of water flooding with the improved microscopic displacement of gas injection. The optimal design of the WAG parameters is usually based on numerical reservoir simulation via trial and error, limited by the reservoir engineer's availability. Employing optimisation techniques can guide the simulation runs and reduce the number of function evaluations. In this study, robust evolutionary algorithms are utilised to optimise hydrocarbon WAG performance in the E-segment of the Norne field. The first objective function is selected to be the net present value (NPV), and two global semi-random search strategies, a genetic algorithm (GA) and particle swarm optimisation (PSO), are tested on different case studies with different numbers of controlling variables, which are sampled from the set of water and gas injection rates, bottom-hole pressures of the oil production wells, cycle ratio, cycle time, the composition of the injected hydrocarbon gas (miscible/immiscible WAG) and the total WAG period. In progressive experiments, the number of decision-making variables is increased, increasing the problem complexity while potentially improving the efficacy of the WAG process. The second objective function is selected to be the incremental recovery factor (IRF) within a fixed total WAG simulation time, and it is optimised using the same optimisation algorithms. The results from the two optimisation techniques are analysed, and their performance, convergence speed and the quality of the optimal solutions found by the algorithms in multiple trials are compared for each experiment. The distinctions between the optimal WAG parameters resulting from NPV and oil recovery optimisation are also examined. This is the first known work optimising over this complete set of WAG variables. The first use of PSO to optimise a WAG project at the field scale is also illustrated.
Compared to the reference cases, the best overall values of the objective functions found by GA and PSO were 13.8% and 14.2% higher, respectively, if NPV is optimised over all the above variables, and 14.2% and 16.2% higher, respectively, if IRF is optimised.
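For readers unfamiliar with the second algorithm, a minimal particle swarm optimiser looks like the sketch below (a generic textbook PSO, not the study's implementation; the inertia and acceleration weights are conventional defaults, and in the usage a toy quadratic stands in for the reservoir simulator's NPV response):

```python
import random

def pso_maximise(objective, bounds, n_particles=20, iters=60, seed=1):
    # Minimal particle swarm optimiser (maximisation). Each particle
    # tracks its personal best; the swarm shares one global best.
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5      # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the WAG setting each position would encode injection rates, bottom-hole pressures, cycle ratio and so on, and `objective` would invoke a full reservoir simulation, which is why reducing the number of function evaluations matters so much.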

Abstract:

Humans are profoundly affected by the surroundings which they inhabit. Environmental psychologists have produced numerous credible theories describing optimal human environments, based on the concept of congruence or "fit" (1, 2). Lack of person/environment fit can lead to stress-related illness and lack of psychosocial well-being (3). Conversely, appropriately designed environments can promote wellness (4) or "salutogenesis" (5). Increasingly, research in the area of Evidence-Based Design, largely concentrated in healthcare architecture, has tended to bear out these theories (6). Patients and long-term care residents, because of injury, illness or physical/cognitive impairment, are less likely to be able to intervene to modify their immediate environment, unless it is designed specifically to facilitate their particular needs. In the context of care settings, the detailed design of personal space therefore takes on enormous significance. MyRoom conceptualises a personalisable room, utilising sensing and networked computing to enable the environment to respond directly and continuously to the occupant. Bio-signals collected and relayed to the system will actuate application(s) intended to positively influence user well-being. Drawing on the evidence base in relation to therapeutic design interventions (7), real-time changes in ambient lighting, colour, image, etc. respond continuously to the user's physiological state, optimising congruence. Based on research evidence, consideration is also given to the development of an application which uses natural images (8). It is envisaged that actuation will require machine learning based on interpretation of the data gathered by sensors; sensing arrangements may vary depending on context and end-user. Such interventions aim to reduce inappropriate stress or provide stimulation, supporting both instrumental and cognitive tasks.

Abstract:

We propose an ISA extension that decouples the data access and register write operations in a load instruction. We describe system and hardware support for decoupled loads. Furthermore, we show how compilers can generate better static instruction schedules by hoisting a decoupled load’s data access above may-alias stores and branches. We find that decoupled loads improve performance with geometric mean speedups of 8.4%.
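A toy cycle-count model (our illustration, not the paper's evaluation methodology; the latency and instruction counts are arbitrary) shows why hoisting the data access pays off: a coupled load that cannot move above a may-alias store serialises its latency with the preceding instructions, whereas a decoupled data access overlaps with them and only the register write must wait:

```python
def schedule_cycles(load_latency, insns_before_use):
    # Coupled: the load issues just before its consumer (it could not
    # be hoisted past a may-alias store), so the consumer waits out
    # the full memory latency after the earlier instructions finish.
    coupled = insns_before_use + load_latency + 1
    # Decoupled: the data access is hoisted to the top of the block
    # and overlaps with the independent instructions; the register
    # write completes just in time for the consumer.
    decoupled = max(insns_before_use, load_latency) + 1
    return coupled, decoupled
```

With a 10-cycle load and 6 independent instructions, the model gives 17 vs. 11 cycles, the kind of latency hiding behind the reported speedups.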

Abstract:

Dynamically typed programming languages such as JavaScript and Python defer type checking until run time. To optimise the performance of these languages, virtual machine implementations for dynamic languages must try to eliminate redundant dynamic type tests. This is usually done with a type-inference analysis. However, such analyses are often costly and involve trade-offs between compilation time and the precision of the results obtained, which has led to the design of increasingly complex VM architectures. We propose lazy basic block versioning, a simple just-in-time compilation technique that effectively eliminates redundant dynamic type tests on critical execution paths. This new approach lazily generates specialised versions of basic blocks while propagating contextualised type information. Our technique requires no costly program analyses, is not constrained by the precision limitations of traditional type-inference analyses, and avoids the complexity of speculative optimisation techniques. Three extensions give basic block versioning interprocedural optimisation capabilities. The first attaches type information to object properties and global variables. Entry-point specialisation then passes type information from calling functions to callees, and call-continuation specialisation propagates the types of callees' return values back to callers at no dynamic cost. We demonstrate empirically that these extensions allow basic block versioning to eliminate more dynamic type tests than any static type-inference analysis.
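A heavily simplified sketch of the lazy-generation idea follows (real lazy basic block versioning specialises machine-code basic blocks and propagates type contexts through compiled code; this fragment only mimics the cache of versions keyed by incoming types):

```python
def make_add_block():
    # versions maps a type context to a specialised implementation in
    # which the dynamic type test has already been resolved.
    versions = {}

    def specialise(ctx):
        # "Compile" a version for this context; the per-context body
        # would contain no further type checks in a real JIT.
        if ctx == (int, int):
            return lambda a, b: a + b        # integer addition
        if ctx == (str, str):
            return lambda a, b: a + b        # string concatenation
        return lambda a, b: a + b            # generic fallback

    def add(a, b):
        ctx = (type(a), type(b))
        if ctx not in versions:              # lazily generate version
            versions[ctx] = specialise(ctx)
        return versions[ctx](a, b)

    return add, versions

add, versions = make_add_block()
```

Only contexts that actually occur at run time get a version, so hot monomorphic paths pay for their type test once rather than on every execution.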

Abstract:

Regulating intracellular pH (pHi) is critical for optimising the metabolic activity of corals, yet the mechanisms involved in pH regulation and the buffering capacity within coral cells are not well understood. Our study investigated how the presence of symbiotic dinoflagellates affects the response of pHi to pCO2-driven seawater acidification in cells isolated from Pocillopora damicornis. Using the fluorescent dye BCECF-AM, in conjunction with confocal microscopy, we simultaneously characterised the response of pHi in host coral cells and their dinoflagellate symbionts, in symbiotic and non-symbiotic states under saturating light, with and without the photosynthetic inhibitor DCMU. Each treatment was run under control (pH 7.8) and CO2-acidified seawater conditions (pH decreasing from 7.8 to 6.8). After two hours of CO2 addition, by which time the external pH (pHe) had declined to 6.8, the dinoflagellate symbionts had increased their pHi by 0.5 pH units above control levels. In contrast, in both symbiotic and non-symbiotic host coral cells, 15 min of CO2 addition (a 0.2 pH unit drop in pHe) led to cytoplasmic acidosis equivalent to 0.4 pH units. Despite further seawater acidification over the duration of the experiment, the pHi of non-symbiotic coral cells did not change, whereas in host cells containing a symbiont cell the pHi recovered to control levels. This recovery was negated when cells were incubated with DCMU. Our results reveal that photosynthetic activity of the endosymbiont is tightly coupled with the ability of the host cell to recover from cellular acidosis after exposure to high CO2/low pH.