583 results for optimising compiler
Abstract:
This work was carried out with the support of the Universidade de Lisboa, Instituto Superior de Agronomia, together with the Centro de Engenharia dos Biossistemas (CEER)
Abstract:
OutSystems Platform is used to develop, deploy, and maintain enterprise web and mobile web applications. Applications are developed through a visual domain-specific language in an integrated development environment, and compiled to a standard stack of web technologies. At the platform's core, a compiler and a deployment service transform the visual model into a running web application. As applications grow, compilation and deployment times increase as well, hurting developer productivity. In the previous model, the full application was the only compilation and deployment unit: when developers published an application, even if they had changed only a very small part of it, the application was fully compiled and deployed. Our goal is to reduce compilation and deployment times for the most common use case, in which the developer performs small changes to an application before compiling and deploying it. We modified the OutSystems Platform to support a new incremental compilation and deployment model that reuses previous computations as much as possible in order to improve performance. In our approach, the full application is broken down into smaller compilation and deployment units, increasing what can be cached and reused. We also observed that this finer-grained model would benefit from parallel execution, so we created a task-driven scheduler that executes compilation and deployment tasks in parallel. Our benchmarks show a substantial improvement in compilation and deployment times for the aforementioned development scenario.
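The two core ideas, hash-keyed caching of per-unit compilation results and parallel execution of independent tasks, can be pictured with a minimal sketch. This is not the OutSystems implementation; the unit names, the cache keying, and the thread pool are illustrative assumptions:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

# Cache keyed by a hash of each unit's source: unchanged units are reused.
cache: dict[str, str] = {}

def compile_unit(name: str, source: str) -> str:
    key = hashlib.sha256(source.encode()).hexdigest()
    if key in cache:                       # unchanged since the last publish
        return cache[key]
    artifact = f"compiled({name})"         # stand-in for real code generation
    cache[key] = artifact
    return artifact

def publish(units: dict[str, str]) -> list[str]:
    # Independent units compile in parallel; only changed ones do real work.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(compile_unit, n, s) for n, s in units.items()]
        return [f.result() for f in futures]

app = {"ScreenA": "source A", "LogicB": "source B", "DataC": "source C"}
publish(app)                   # first publish compiles all three units
app["ScreenA"] = "source A v2"
publish(app)                   # second publish recompiles only ScreenA
```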
Abstract:
Digital Businesses have become a major driver of economic growth and have seen an explosion of new startups; the sector also includes mature enterprises that have become global giants in a relatively short period of time. Digital Businesses have unique characteristics that make running and managing one much different from running a traditional offline business. Digital businesses respond to online users who are highly interconnected and networked, enabling word of mouth to flow at a pace far greater than ever envisioned for traditional products and services. The relatively low cost of adding an incremental user has led to a variety of innovations in the pricing of digital products, including various forms of free and freemium pricing models. This thesis explores the unique characteristics and complexities of Digital Businesses and their implications for the design of Digital Business Models and Revenue Models. The thesis proposes an Agent-Based Modeling Framework that can be used to develop simulation models of the complex dynamics of Digital Businesses and of the interactions between users of a digital product. Such simulation models can be used for a variety of purposes, such as simple forecasting, analysing the impact of market disturbances, analysing the impact of changes in pricing models, and optimising pricing for maximum revenue generation or for a balance between growth in usage and revenue generation. These models can be developed for a mature enterprise with a long historical record of user growth as well as for early-stage enterprises without much historical data. Through three case studies, the thesis demonstrates the applicability of the Framework and its potential applications.
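As a flavour of what such an agent-based simulation looks like, here is a minimal sketch of word-of-mouth adoption under a freemium model; all parameters (population size, adoption, conversion, and churn probabilities) are illustrative assumptions, not values from the thesis:

```python
import random

# Minimal agent-based sketch of word-of-mouth growth under a freemium model.
random.seed(1)
N, WEEKS = 10_000, 52
P_WOM, P_PAY, P_CHURN = 0.05, 0.02, 0.01

users = set(random.sample(range(N), 10))          # initial seed of free users
paying: set[int] = set()

for week in range(WEEKS):
    # Word of mouth: each user may recruit one randomly chosen contact.
    recruits = {random.randrange(N) for u in users if random.random() < P_WOM}
    users |= recruits
    # Freemium conversion: some free users start paying; some users churn.
    paying |= {u for u in users - paying if random.random() < P_PAY}
    users -= {u for u in users if random.random() < P_CHURN}
    paying &= users

print(f"free: {len(users - paying)}, paying: {len(paying)}")
```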
Abstract:
This project concerns the use of formal methods (more precisely, of type theory) to guarantee the absence of errors in programs. On the one hand, it addresses the design of new type-checking algorithms. To this end, new algorithms based on the idea of normalization by evaluation are proposed, designed to be extensible to other type systems. In the near future we will extend results we obtained recently [16,17] in order to achieve: a simplification of the existing work for systems without the eta rule (two systems will be studied here: à la Martin-Löf and à la PTS); the formulation of these checkers for systems with variables (in addition to the systems in [16,17], which are à la de Bruijn); a generalization of the notion of category with families used to give semantics to type theory; a categorical formulation of the notion of normalization by evaluation; and, finally, the application of these algorithms to systems with rewrite rules. For the first expected results mentioned, our method will be to adapt the proofs of [16,17] to the new systems. The significance is that this will make proof assistants based on type theory more automatable, and therefore easier to use. On the other hand, type theory will be used to certify compilers, pursuing the never-explored proposal of [22] to use an abstract approach based on functor categories. The method will consist of certifying the language "Peal" [29] in type theory and then successively adding functionality until a correct compiler for the language Forsythe [23] is obtained; over this period we expect to add several extensions. The importance of this line of work is that only a certified compiler guarantees that a correct source program is compiled to a correct object program, which is crucial for any verification process based on verifying source code. Finally, the formalization of systems with session types will be addressed. Several formulations of these systems have been shown to be faulty [30], which makes their formalization worthwhile. During the course of this project we expect to obtain a formalization that yields a type-checking algorithm, and to prove the usual properties of these systems. The contribution is to shed some light on formulations whose errors reveal that the topic has not yet reached sufficient maturity or understanding within the community.
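To make the central technique concrete, here is a minimal normalization-by-evaluation sketch for the untyped lambda calculus; the project targets far richer systems (Martin-Löf type theory, PTSs, systems with rewrite rules), so the term representation and helper names here are illustrative assumptions:

```python
import itertools

# Terms: ("var", name) | ("lam", name, body) | ("app", fun, arg)
fresh = (f"x{i}" for i in itertools.count())

def evaluate(term, env):
    tag = term[0]
    if tag == "var":
        return env.get(term[1], ("nvar", term[1]))   # free var -> neutral
    if tag == "lam":
        _, x, body = term
        return ("closure", lambda v: evaluate(body, {**env, x: v}))
    _, f, a = term
    fv, av = evaluate(f, env), evaluate(a, env)
    if fv[0] == "closure":
        return fv[1](av)                             # beta reduction
    return ("napp", fv, av)                          # stuck application

def reify(value):
    """Read a semantic value back into a syntactic normal form."""
    tag = value[0]
    if tag == "closure":
        x = next(fresh)
        return ("lam", x, reify(value[1](("nvar", x))))
    if tag == "nvar":
        return ("var", value[1])
    _, f, a = value
    return ("app", reify(f), reify(a))

def normalize(term):
    return reify(evaluate(term, {}))

# (\f.\x. f x) (\y. y) normalizes to \x. x
ident = ("lam", "y", ("var", "y"))
term = ("app",
        ("lam", "f", ("lam", "x", ("app", ("var", "f"), ("var", "x")))),
        ident)
print(normalize(term))   # ('lam', 'x0', ('var', 'x0'))
```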
Abstract:
A three-year research proposal of international impact and quality was developed between the author and CREVER researchers, taking into account the infrastructure and research lines already in place at CREVER. The project concerns the study of new mixtures in thermally activated systems, in which a third component is added to aid the heat-transfer processes. The project proposes the use of water and lithium nitrate as absorbents in ternary ammonia mixtures, varying the concentration with the objective of optimising the mixture for solar air-conditioning purposes. The proposal will also promote an intensive collaboration in the coming years between CREVER and the Centro de Investigación en Energía-UNAM, Mexico.
Abstract:
Labour market regulations aimed at enhancing job security are dominant in several OECD countries. These regulations seek to reduce dismissals of workers and fluctuations in employment. Our main theoretical contribution is to gauge the effects of such regulations on labour demand across establishment sizes. To achieve this, we investigate an optimising model of labour demand under uncertainty through the application of real option theory. We also consider other forms of employment which increase the flexibility of the labour market; in particular, we model the contribution of temporary employment agencies (Zeitarbeit), which allow for quick personnel adjustments in client firms. The calibration results indicate that labour market rigidities may be crucial for understanding sluggishness in firms' labour demand and the emergence and growth of temporary work.
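The real-option intuition, that firing costs create a band of inaction in which firms neither hire nor fire, can be illustrated with a deliberately stylized simulation. This is not the paper's calibrated model; the band rule and every parameter value below are assumptions:

```python
import random

# Stylized sketch: firing costs widen the inaction band around the wage,
# making labour demand sluggish in the face of productivity shocks.
random.seed(0)
WAGE, HIRE_COST, FIRE_COST, SIGMA = 1.0, 0.05, 0.25, 0.08
upper = WAGE + HIRE_COST    # hire only when productivity exceeds this
lower = WAGE - FIRE_COST    # fire only when productivity falls below this

employment, productivity = 100, 1.0
for t in range(200):
    productivity *= 1 + random.gauss(0, SIGMA)   # demand shock
    if productivity > upper:
        employment += 1      # expand: waiting is no longer worthwhile
    elif productivity < lower:
        employment -= 1      # contract: only under large negative shocks
    # inside the band the firm waits (the real-option "value of waiting")

print(f"employment after 200 shocks: {employment}")
```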
Abstract:
Modern macroeconomic theory utilises optimal control techniques to model the maximisation of individual well-being using a lifetime utility function. Agents face choices over current and future consumption (with the implied savings decisions) and seek to maximise the present value of current plus future well-being. However, such inter-temporal welfare-maximising assumptions remain empirically untested. In the work presented here we test whether welfare was in (historical) fact maximised in the US between 1870 and 2000, and find empirical support for the optimising basis of growth theory, but only once a comprehensive view of what constitutes a country's wealth or capital is taken into account.
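The inter-temporal problem whose empirical validity is being tested has a textbook form; in standard (assumed) notation, with discount factor \(\beta\), utility \(u\), capital \(k_t\) and depreciation rate \(\delta\), it reads:

```latex
\max_{\{c_t\}_{t \ge 0}} \ \sum_{t=0}^{\infty} \beta^{t}\, u(c_t)
\qquad \text{subject to} \qquad
k_{t+1} = f(k_t) + (1-\delta)\, k_t - c_t ,
```

with first-order (Euler) condition \(u'(c_t) = \beta\, u'(c_{t+1})\,[\,f'(k_{t+1}) + 1 - \delta\,]\); it is this optimality restriction that the historical US series are tested against, under a suitably comprehensive measure of capital.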
Abstract:
The fundamental objective of this project is to create a compiler generator based on bottom-up parsers. As a basis for building this parser, the Cosel language and the Com module will be used; the latter is a compiler generator based on top-down parsers that is currently used in the lab assignments of the Compiladores I course. The new generator, which takes a grammar as input, must check whether it is an LALR(1) bottom-up grammar and parse an input string of symbols using that grammar.
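To illustrate the kind of artefact such a generator produces, here is a minimal table-driven shift-reduce parser with a hand-built LALR(1) table for the toy grammar S -> S + n | n; Cosel and Com specifics are not shown, and the table layout is an illustrative assumption:

```python
# ACTION maps (state, lookahead) to shift/reduce/accept; GOTO handles
# the nonterminal transition after a reduce.
ACTION = {
    (0, "n"): ("shift", 2),
    (1, "+"): ("shift", 3), (1, "$"): ("accept",),
    (2, "+"): ("reduce", "S", 1), (2, "$"): ("reduce", "S", 1),  # S -> n
    (3, "n"): ("shift", 4),
    (4, "+"): ("reduce", "S", 3), (4, "$"): ("reduce", "S", 3),  # S -> S + n
}
GOTO = {(0, "S"): 1}

def parse(tokens):
    stack, tokens = [0], tokens + ["$"]
    while True:
        act = ACTION.get((stack[-1], tokens[0]))
        if act is None:
            return False                      # parse error
        if act[0] == "accept":
            return True
        if act[0] == "shift":
            stack += [tokens.pop(0), act[1]]
        else:                                 # reduce: pop RHS, push GOTO state
            _, lhs, rhs_len = act
            del stack[-2 * rhs_len:]
            stack += [lhs, GOTO[(stack[-1], lhs)]]

print(parse(["n", "+", "n", "+", "n"]))  # True
print(parse(["n", "+", "+"]))            # False
```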
Abstract:
Nowadays, many health care systems are large, complex, and highly dynamic environments, Emergency Departments (EDs) in particular. An ED operates 24 hours a day throughout the year with limited resources, and is frequently overcrowded. Simulating EDs is therefore essential to improve their performance qualitatively and quantitatively. This improvement can be achieved by modelling and simulating EDs with an Agent-Based Model (ABM) and optimising many different staff scenarios. This work optimises the staff configuration of an ED. To perform the optimisation, objective functions to minimise or maximise must be defined; one such objective is to find the best, or optimal, staff configuration that minimises patient waiting time. The staff configuration comprises doctors, triage nurses, and admissions staff, in both number and type. Finding a staff configuration is a combinatorial problem that can take a long time to solve. HPC was used to run the experiments, and encouraging results were obtained. However, even with the basic ED used in this work the search space is very large, so as the problem size increases, more processing resources will be needed to obtain results in an acceptable time.
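A minimal sketch of the combinatorial search is shown below. The real work evaluates each configuration with a full agent-based ED simulator on HPC; the waiting-time proxy, staffing ranges, and cost weights here are placeholder assumptions:

```python
from itertools import product

def simulated_wait(doctors, triage, admissions, arrivals_per_hour=20):
    # Placeholder for the agent-based ED simulation: total service capacity
    # versus arrivals drives a simple queueing-style waiting-time proxy.
    service = doctors * 4 + triage * 6 + admissions * 8   # patients/hour
    if service <= arrivals_per_hour:
        return float("inf")                 # unstable: the queue only grows
    return 60.0 / (service - arrivals_per_hour)           # minutes

def cost(cfg):
    doctors, triage, admissions = cfg
    staff_cost = doctors * 3 + triage * 2 + admissions    # relative salaries
    return simulated_wait(*cfg) + staff_cost   # wait vs. staffing trade-off

# Exhaustive search over every staff configuration (the combinatorial part).
configurations = product(range(1, 6), range(1, 4), range(1, 4))
best = min(configurations, key=cost)
print(f"best (doctors, triage nurses, admissions): {best}")
```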
Abstract:
The outcome for patients after an out-of-hospital cardiac arrest (OHCA) has been poor over many decades, and single interventions have mostly yielded disappointing results. More recently, some regions have observed better outcomes after redesigning their cardiac arrest pathways. Optimised resuscitation and prehospital care are absolutely key, but in-hospital care appears to be at least as important. OHCA treatment requires a multidisciplinary approach, comparable to trauma care; the development of cardiac arrest pathways and cardiac arrest centres may dramatically improve patient care and outcomes. Besides emergency medicine physicians, intensivists and neurologists, cardiologists are playing an increasingly crucial role in post-resuscitation management, especially by optimising cardiac output and undertaking urgent coronary angiography/intervention.
Abstract:
Lay summary: The brain is composed of nerve cells called neurons and of glial cells, which include astrocytes. Neurons communicate with one another through electrical signals and by releasing signalling molecules such as glutamate. Astrocytes, for their part, take up glucose from the blood circulating in blood vessels, transform it, and pass it on to neurons to be used as a source of energy. An astrocyte can then use this glucose in two different ways to produce energy: the first takes place in structures called mitochondria, which can produce more than thirty energy-rich molecules (ATP) from a single molecule of glucose; the second, called glycolysis, produces two molecules of ATP and a glucose derivative called lactate. A widely debated theory proposes that when astrocytes take up the glutamate released by neurons, they release lactate in response, which would serve as an energy substrate for the neurons. However, this mechanism does not involve any increase in the activity of astrocyte mitochondria, even though that would be a far more efficient way to produce energy. Using fluorescence microscopy, we measured changes in ionic concentrations in the mitochondria of astrocytes subjected to glutamatergic stimulation. We demonstrated that astrocyte mitochondria exhibit spontaneous, transient increases in their ionic concentrations, whose frequency decreased during stimulation with glutamate. We then showed that glutamate uptake increased the sodium concentration in astrocyte mitochondria and acidified them. Probing these mechanisms further, several findings suggested that the induced acidification reduces the potential for mitochondrial energy synthesis and oxygen consumption in astrocytes. In summary, this work suggests that glutamate-mediated neuronal signalling instructs astrocytes to temporarily sacrifice the efficiency of their energy metabolism, by reducing the activity of their mitochondria, in order to increase the availability of energy resources for the neurons.
Abstract: The remarkable efficiency of the brain in computing and communicating information costs the body 20% of its total energy budget. The cellular mechanisms responsible for brain energy metabolism have therefore developed to meet these energy needs. Recent advances in neuroenergetics indicate that the main site of energy consumption in the brain is the astroglial process ensheathing activated excitatory synapses. A large body of evidence has now shown that glutamate uptake by astrocytes surrounding synapses is responsible for a significant metabolic cost, whose metabolic response is apparently mainly glycolytic. However, astrocytes also have a significant mitochondrial oxidative metabolism. The location of mitochondria close to glutamate transporters therefore raises the question of the existence of mechanisms for tuning their energy metabolism, in particular their mitochondrial metabolism. To tackle these issues, we used real-time imaging techniques to study mitochondrial ionic alterations occurring at rest and during glutamatergic stimulation of mouse cortical astrocytes. We showed that mitochondria of intact resting astrocytes exhibit individual, spontaneous, and selective alterations of their electrical potential, pH, and Na+ concentration. We found that glutamate decreased the frequency of mitochondrial Na+ transient activity by decreasing the cellular level of ATP. We then investigated a possible link between glutamatergic transmission and mitochondrial metabolism in astrocytes. We showed that glutamate triggered a rapid Na+ concentration increase in the mitochondrial population as a result of plasma-membrane Na+-dependent uptake. We then demonstrated that neuronally released glutamate also induced a mitochondrial acidification in astrocytes. Glutamate induced a pH-mediated and cytoprotective decrease of mitochondrial metabolism that diminished oxygen consumption. Taken together, these studies showed that astrocytes contain mitochondria that are individually regulated and that sense the intracellular environment to modulate their own activity. The dynamic regulation of astrocyte mitochondrial energy output operated by glutamate increases oxygen availability and lactate production, both beneficial for neurons.
Abstract:
Recently published guidelines in the UK relating to sight tests among people with dementia go some of the way to addressing the specific needs of this group. However, there is still a long way to go in terms of improving the provision of eye care services and optimising the visual health of this group. A study published by the Thomas Pocklington Trust which examines this subject, The development of professional guidelines for the eye examination of people with dementia, was presented at the first ever national "Dementia and Sight Loss" conference in London (1st December), a forum where 100 dementia and sight loss professionals met to discuss ways to tackle the challenge of concurrent dementia and sight loss. The study, by researchers at the University of Bradford Schools of Optometry and Health Studies, reviewed procedures for sight tests and eye examinations among those with dementia. It found that policy and practice were hampered by a serious lack of basic research into concurrent dementia and sight loss, and prompted recommendations which could lead to improved procedures, tools and techniques. The recommendations outline seven steps towards improving policy and practice: (1) conduct a systematic study of the availability and uptake of sight tests among people with dementia; (2) set up a website for people with dementia and their carers with information on how dementia affects eye health and on the importance of eye examinations; (3) develop education and training for optometrists and care home staff; (4) compile a list of optometrists experienced in providing eye care for people with dementia; (5) develop a template for recording the results of eye examinations in people with dementia, one which can be endorsed by professional bodies and made available to care homes; (6) measure the effectiveness of eye care, such as sight tests and cataract removals, on the quality of life of people with dementia; (7) research clinical testing methods so that guidelines can be strengthened. Measuring contrast sensitivity, for example, in someone with dementia could be vital, as an inability to judge contrasts can make daily tasks impossible. To access the discussion paper, please follow this link: The development of professional guidelines for the eye examination of people with dementia
Abstract:
Introduction: Progress in ICT is making it easier to set up virtual libraries, with great advantages for users in terms of immediate access to vast amounts of library information, online resources, and services. Most of the Spanish Autonomous Regions are presently working on establishing virtual libraries, with the aim of optimising the economic resources allocated to subscribing to bibliographic information sources and of offering qualified documentary services to Public Health System professionals. The Ministry of Health and Social Policy, the institution unifying health matters, issued this project to study the viability of creating a National Health System Virtual Library, in order to guarantee equity of access to relevant scientific information. Objectives: to study the viability of setting up the National Health System (NHS) Virtual Library; to draw up a map of health information resources across Spanish territory; to identify the appropriate technological model; to identify the documentary services to be offered; to identify the optimal functional and structural model; to identify the economic model. Method: to create an organizational chart for the project's management; to organize work groups; to prepare a standardised survey for the libraries of the Autonomous Regions. Results: identification of the health library models of the different Autonomous Regions; recommendations for the NHS Virtual Library covering the functional and structural model, the technological model, and the financial-economic model. Conclusions: It is confirmed that the Virtual Library would be an invaluable space for access to quality scientific information, and that a minimum set of services could be offered to every NHS user regardless of geographical location. Accordingly, based on this initial analysis we considered three different models for developing the Virtual Library, which will allow the most suitable model for the Spanish National Health System to be established, considering its economic, functional and technological recommendations. A study on setting up the NHS Virtual Library, following the recommendations arising from this project, would be the necessary next step.
Abstract:
This paper presents the design and implementation of a mission control system (MCS) for an autonomous underwater vehicle (AUV) based on Petri nets. In the proposed approach, Petri nets are used both to specify and to execute the desired autonomous vehicle mission. The mission is easily described using an imperative programming language called the mission control language (MCL) that formally describes the mission execution thread. A mission control language compiler (MCL-C), able to automatically translate MCL into a Petri net, and a real-time Petri net player that executes the resulting Petri net onboard an AUV are also presented.
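The execution side can be pictured as a token game over the net. The following is a minimal sketch of such a player, not the paper's real-time implementation; the net structure, place names, and firing rule are chosen here for illustration:

```python
from dataclasses import dataclass

@dataclass
class PetriNet:
    marking: dict[str, int]                              # tokens per place
    transitions: dict[str, tuple[list[str], list[str]]]  # name -> (inputs, outputs)

    def enabled(self, t: str) -> bool:
        inputs, _ = self.transitions[t]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, t: str) -> None:
        inputs, outputs = self.transitions[t]
        assert self.enabled(t), f"transition {t} not enabled"
        for p in inputs:
            self.marking[p] -= 1                          # consume input tokens
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1  # produce output tokens

# A two-step mission: descend, then survey.
net = PetriNet(
    marking={"at_surface": 1},
    transitions={
        "descend": (["at_surface"], ["at_depth"]),
        "survey":  (["at_depth"], ["mission_done"]),
    },
)
net.fire("descend")
net.fire("survey")
print(net.marking)   # {'at_surface': 0, 'at_depth': 0, 'mission_done': 1}
```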
Abstract:
The focus of this review is to highlight the need for improved communication between medical and dental professionals in order to deliver more effective care to patients. Communication is increasingly required to capitalise on recent advances in the biological sciences and in medicine for the management of patients with chronic diseases. Improvements in longevity have resulted in populations with increasing special oral-care needs, including those who have cancer of the head and neck, those who are immunocompromised due to HIV/AIDS, advanced age, residence in long-term care facilities or the presence of life-long conditions, and those who are receiving long-term prescription medications for chronic conditions (e.g., anti-hypertensives, anticoagulants, immunosuppressants, antidepressants). These medications can cause adverse reactions in the oral cavity, such as xerostomia and ulceration. Patients with xerostomia are at increased risk of tooth decay, periodontal disease and infection. The ideal management of such individuals should involve the collaborative efforts of physicians, nurses, dentists and dental hygienists, thus optimising treatment and minimising secondary complications arising in the oral cavity.