15 results for performance management framework
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Background: To enhance our understanding of complex biological systems like diseases we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, which are organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects.
The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
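The kind of sub-network retrieval described above can be sketched as a typed graph traversal. The code below is a minimal illustration only: the edge data, gene names and function are invented stand-ins, not the BioXM API or the actual COPD knowledge base.

```python
from collections import deque

# Each edge: (source, relation, target). Entities and relations are invented
# examples of the relation types the abstract mentions.
EDGES = [
    ("SERPINA1", "protein-protein", "ELANE"),
    ("SERPINA1", "gene-disease", "COPD"),
    ("COPD", "gene-disease", "MMP12"),
    ("MMP12", "gene-compound", "marimastat"),
    ("TP53", "protein-protein", "MDM2"),  # unrelated branch, should be excluded
]

def sub_network(seed, relations, max_depth=2):
    """Breadth-first extraction of edges of the requested relation types."""
    adjacency = {}
    for s, r, t in EDGES:
        if r in relations:
            adjacency.setdefault(s, []).append((r, t))
            adjacency.setdefault(t, []).append((r, s))
    seen, result = {seed}, []
    queue = deque([(seed, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == max_depth:
            continue
        for rel, nxt in adjacency.get(node, []):
            result.append((node, rel, nxt))
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return result

print(sub_network("COPD", {"gene-disease", "gene-compound"}))
```

The retrieved edge list would then be the input to downstream analysis, modelling and simulation.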
Abstract:
This article aims to briefly explain the feasibility of the future management and use of the forest biomass of Bellver de Cerdanya through district heating for the future Pla de Tomet neighbourhood. The features that make this town ideal for the project are that the town council owns almost 90% of the forests in the municipality, and that it has already carried out several installations that use forest biomass for heating and domestic hot water (DHW). The economic situation of the region is rather difficult, since it has relied on tourism and construction, and neither sector is currently at its best. The project would give forest biomass a value it has so far lacked, while also seeking new economic inputs for the Cerdanya. This work also analyses the future treatments that should be applied to the forest, taking into account the activities currently carried out and avoiding possible negative effects such as overexploitation. Part of the project is also devoted to explaining the systems for obtaining and correctly managing the biomass. The more technical part follows, with an estimate of the possible future energy consumption of the still-unbuilt Pla de Tomet neighbourhood, and a decision on the most suitable boiler system, the most appropriate type of storage, and the steps to follow to improve the efficiency of the biomass management and extraction process. Following all these steps leads to the conclusion that using forest biomass is a better solution than using fossil fuels: besides the obvious environmental benefits, it is also better economically, both for the future residents and for the town council.
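The energy estimate mentioned above boils down to simple arithmetic. The sketch below illustrates the calculation with purely assumed figures (demand, boiler efficiency, wood-chip heating value); none of the numbers come from the project itself.

```python
# Back-of-the-envelope sketch of a district-heating biomass estimate.
# All figures below are illustrative assumptions, not values from the study.

annual_heat_demand_kwh = 1_200_000   # assumed demand of the future district
boiler_efficiency = 0.85             # assumed biomass boiler efficiency
chip_lhv_kwh_per_kg = 3.5            # wood chips at roughly 30% moisture

fuel_energy_kwh = annual_heat_demand_kwh / boiler_efficiency
chips_tonnes = fuel_energy_kwh / chip_lhv_kwh_per_kg / 1000

print(f"Fuel energy needed: {fuel_energy_kwh:,.0f} kWh/year")
print(f"Wood chips needed: {chips_tonnes:,.0f} t/year")
```

The resulting tonnage would then be compared against the sustainable yield of the municipal forests.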
Abstract:
This paper proposes a two-dimensional Strategic Performance Measure (SPM) to evaluate the achievement of sustained superior performance. This proposal builds primarily on the fact that, under the strategic management perspective, a firm's prevalent objective is the pursuit of sustained superior performance. Three basic conceptual dimensions stem from this objective: relativity, sign dependence, and dynamism. These are the foundations of the SPM, which carries out a separate evaluation of the attained superior performance and of its sustainability over time. In contrast to existing measures of performance, the SPM provides: (i) a dynamic approach, by considering the progress or regress in performance over time, and (ii) a cardinal measurement of performance differences and their changes over time. The paper also proposes an axiomatic framework that a measure of strategic performance should comply with to be theoretically and managerially sound. Finally, an empirical illustration of the Spanish banking sector during 1987-1999 is provided by discussing some relevant cases.
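A toy numerical sketch of the two dimensions the SPM separates, relative superior performance per period and its evolution over time, is given below. The formulas and figures are invented stand-ins for illustration, not the paper's actual measure.

```python
# Illustrative sketch only: performance relative to rivals (relativity and
# sign dependence) and its change over time (dynamism). Data are assumed.

firm   = [0.12, 0.13, 0.15]     # firm's return per period (assumed)
rivals = [0.10, 0.11, 0.11]     # rival-average return per period (assumed)

# Superior performance per period: positive sign means the firm outperforms.
superior = [f - r for f, r in zip(firm, rivals)]

# Dynamism: change in the advantage from one period to the next.
dynamism = [b - a for a, b in zip(superior, superior[1:])]

print(superior)   # per-period superior performance
print(dynamism)   # growth or decay of that advantage
```

A cardinal measure like this reports not just whether performance is superior, but by how much the gap moves.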
Abstract:
Markowitz portfolio theory (1952) has induced research into the efficiency of portfolio management. This paper studies existing nonparametric efficiency measurement approaches for single period portfolio selection from a theoretical perspective and generalises currently used efficiency measures into the full mean-variance space. Therefore, we introduce the efficiency improvement possibility function (a variation on the shortage function), study its axiomatic properties in the context of Markowitz efficient frontier, and establish a link to the indirect mean-variance utility function. This framework allows distinguishing between portfolio efficiency and allocative efficiency. Furthermore, it permits retrieving information about the revealed risk aversion of investors. The efficiency improvement possibility function thus provides a more general framework for gauging the efficiency of portfolio management using nonparametric frontier envelopment methods based on quadratic optimisation.
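The efficiency measure described above can be illustrated on a toy two-asset market. The sketch below approximates a shortage-function-style measure by a grid search over the feasible portfolios; all parameters are assumed, and the actual framework uses nonparametric frontier envelopment with quadratic optimisation over many assets.

```python
# Toy sketch of an efficiency-improvement measure in mean-variance space:
# how far a holding can move (mean up, variance down) while staying feasible.
# The two-asset market below is entirely assumed.

mu = [0.08, 0.14]          # expected returns of the two assets
var = [0.04, 0.10]         # variances
cov = 0.01                 # covariance

def portfolio(w):
    """Mean and variance of the portfolio with weight w in asset 1."""
    m = w * mu[0] + (1 - w) * mu[1]
    v = w**2 * var[0] + (1 - w)**2 * var[1] + 2 * w * (1 - w) * cov
    return m, v

def shortage(m0, v0, g_m=1.0, g_v=1.0, steps=10_000):
    """Largest delta such that some feasible portfolio dominates
    (m0 + delta*g_m, v0 - delta*g_v); grid search stands in for the
    quadratic programme."""
    best = 0.0
    for i in range(steps + 1):
        m, v = portfolio(i / steps)
        d = min((m - m0) / g_m, (v0 - v) / g_v)
        best = max(best, d)
    return best

# An inefficient holding, strictly dominated by the feasible set
print(round(shortage(0.09, 0.09), 4))
```

A value of zero would indicate an efficient portfolio; a positive value measures the simultaneous mean gain and variance reduction available.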
Abstract:
This project deals with the generation of profitability and the distribution of its benefits. Inspired by Davis (1947, 1955), we define profitability as the ratio of revenue to cost. Profitability is not as popular a measure of business financial performance as profit, the difference between revenue and cost. Regardless of its popularity, however, profitability is surely a useful financial performance measure. Our primary objective in this project is to identify the factors that generate change in profitability. One set of factors, which we refer to as sources, consists of changes in quantities and prices of outputs and inputs. Individual quantity changes aggregate to the overall impact of quantity change on profitability change, which we call productivity change. Individual price changes aggregate to the overall impact of price change on profitability change, which we call price recovery change. In this framework profitability change consists exclusively of productivity change and price recovery change. A second set of factors, which we refer to as drivers, consists of phenomena such as technical change, change in the efficiency of resource allocation, and the impact of economies of scale. The ability of management to harness these factors drives productivity change, which is one component of profitability change. Thus the term sources refers to quantities and prices of individual outputs and inputs, whose changes influence productivity change or price recovery change, either of which influences profitability change. The term drivers refers to phenomena related to technology and management that influence productivity change (but not price recovery change), and hence profitability change.
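In the simplest one-output, one-input case, the decomposition described above is exact and easy to verify numerically. The figures below are illustrative, not from the project.

```python
# Toy single-output, single-input decomposition of profitability change
# (revenue/cost) into productivity change and price-recovery change.
# All numbers are invented for illustration.

p0, y0, w0, x0 = 10.0, 100.0, 4.0, 200.0   # base period: price, output, input price, input
p1, y1, w1, x1 = 11.0, 120.0, 5.0, 210.0   # comparison period

prof0 = p0 * y0 / (w0 * x0)   # profitability = revenue / cost
prof1 = p1 * y1 / (w1 * x1)

productivity_change = (y1 / y0) / (x1 / x0)    # quantity effect
price_recovery_change = (p1 / p0) / (w1 / w0)  # price effect

# With one output and one input the decomposition is exact:
assert abs(prof1 / prof0 - productivity_change * price_recovery_change) < 1e-12

print(round(prof1 / prof0, 4), round(productivity_change, 4),
      round(price_recovery_change, 4))
```

With many outputs and inputs the same logic holds, but the quantity and price changes must be aggregated with index numbers.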
Abstract:
In this paper a novel methodology aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption is introduced. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, some features for reducing the failure impact while offering minimum failure probabilities are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
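One common way to cast the failure-probability objective as a routing problem is to minimise the sum of -log(1 - p_fail) over links, which maximises the end-to-end survival probability. The sketch below illustrates that idea only; it is not the paper's two-step algorithm, and the topology and failure probabilities are invented.

```python
import heapq
from math import log

# Toy network: (node, node) -> link failure probability (all assumed).
LINKS = {
    ("A", "B"): 0.01, ("B", "D"): 0.02,
    ("A", "C"): 0.05, ("C", "D"): 0.001,
}

graph = {}
for (u, v), p in LINKS.items():
    cost = -log(1 - p)   # additive cost <=> multiplicative survival probability
    graph.setdefault(u, []).append((v, cost))
    graph.setdefault(v, []).append((u, cost))

def most_reliable_path(src, dst):
    """Dijkstra on -log(1 - p_fail): the shortest path is the most reliable."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, c in graph.get(u, []):
            if d + c < dist.get(v, float("inf")):
                dist[v], prev[v] = d + c, u
                heapq.heappush(heap, (d + c, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

print(most_reliable_path("A", "D"))
```

Here the path A-B-D survives with probability 0.99 × 0.98, beating A-C-D despite the latter's very reliable second hop.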
Abstract:
In the accounting literature, interaction or moderating effects are usually assessed by means of OLS regression, and summated rating scales are constructed to reduce measurement error bias. Structural equation models and two-stage least squares regression could be used to completely eliminate this bias, but large samples are needed. Partial Least Squares is appropriate for small samples but does not correct measurement error bias. In this article, disattenuated regression is discussed as a small-sample alternative and is illustrated on data of Bisbe and Otley (in press) that examine the interaction effect of innovation and style of use of budgets on performance. Sizeable differences emerge between OLS and disattenuated regression.
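The classical correction for attenuation underlying disattenuated regression divides an observed correlation by the square root of the product of the scales' reliabilities. A minimal sketch, with illustrative figures rather than Bisbe and Otley's data:

```python
from math import sqrt

def disattenuate(r_xy, rel_x, rel_y):
    """Classical correction for attenuation:
    r_true = r_observed / sqrt(rel_x * rel_y)."""
    return r_xy / sqrt(rel_x * rel_y)

# Assumed inputs: an observed correlation between two summated rating
# scales and their reliabilities (e.g. Cronbach's alpha).
r_observed = 0.30
reliability_x = 0.80
reliability_y = 0.70

r_corrected = disattenuate(r_observed, reliability_x, reliability_y)
print(round(r_corrected, 3))
```

Regressions run on the corrected correlations can therefore yield noticeably larger coefficients than OLS on the raw scales, which is the "sizeable difference" the abstract refers to.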
Abstract:
Technological limitations and power constraints are resulting in high-performance parallel computing architectures that are based on large numbers of high-core-count processors. Commercially available processors are now at 8 and 16 cores, and experimental platforms, such as the many-core Intel Single-chip Cloud Computer (SCC) platform, provide much higher core counts. These trends are presenting new sets of challenges to HPC applications, including programming complexity and the need for extreme energy efficiency. In this work, we first investigate the power behavior of scientific PGAS application kernels on the SCC platform, and explore opportunities and challenges for power management within the PGAS framework. Results obtained via empirical evaluation of Unified Parallel C (UPC) applications on the SCC platform under different constraints show that, for specific operations, the potential for energy savings in PGAS is large, and power/performance trade-offs can be effectively managed using a cross-layer approach. We investigate cross-layer power management using PGAS language extensions and runtime mechanisms that manipulate power/performance trade-offs. Specifically, we present the design, implementation and evaluation of such a middleware for application-aware cross-layer power management of UPC applications on the SCC platform. Finally, based on our observations, we provide a set of recommendations and insights that can be used to support similar power management for PGAS applications on other many-core platforms.
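The cross-layer idea can be caricatured as the runtime lowering the power state during communication-bound phases, where extra CPU speed is largely wasted. The sketch below is a generic illustration with invented phase names and power states, not the UPC middleware described in the paper.

```python
# Generic sketch of application-aware power scheduling: compute-bound phases
# run at the high power state, communication-bound phases at the low one.
# Phase names and relative power draws are invented for illustration.

P_STATES = {"high": 1.0, "low": 0.6}   # relative power draw per state

def power_schedule(phases):
    """Pick a power state per phase: compute-bound -> high, comm-bound -> low."""
    return ["low" if p == "comm" else "high" for p in phases]

def relative_energy(phases, schedule):
    """Energy relative to running every phase at the high state
    (assuming equal-length phases)."""
    return sum(P_STATES[s] for s in schedule) / len(phases)

phases = ["compute", "comm", "compute", "comm", "comm"]
schedule = power_schedule(phases)
print(schedule, round(relative_energy(phases, schedule), 2))
```

In a real system the phase information would come from language extensions or runtime instrumentation, and the power states from the platform's voltage/frequency controls.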
Abstract:
Objective: The importance of hemodynamics in the etiopathogenesis of intracranial aneurysms (IAs) is widely accepted. Computational fluid dynamics (CFD) is being used increasingly for hemodynamic predictions. However, along with the continuing development and validation of these tools, it is imperative to collect the opinion of clinicians. Methods: A workshop on CFD was conducted during the European Society of Minimally Invasive Neurological Therapy (ESMINT) Teaching Course, Lisbon, Portugal. 36 delegates, mostly clinicians, performed supervised CFD analysis for an IA using the @neuFuse software developed within the European project @neurIST. Feedback on the workshop was collected and analyzed. Performance was assessed on a scale of 1 to 4 and compared with experts' performance. Results: Current dilemmas in the management of unruptured IAs remained the most important motivating factor to attend the workshop, and the majority of participants showed interest in participating in a multicentric trial. The participants achieved an average score of 2.52 (range 0–4), which was 63% (range 0–100%) of an expert user's. Conclusions: Although participants showed a manifest interest in CFD, there was a clear lack of awareness concerning the role of hemodynamics in the etiopathogenesis of IAs and the use of CFD in this context. More effort is therefore required to enhance clinicians' understanding of the subject.
Abstract:
Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.

A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent necessity of fixing some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least with no consensus on their validity.

How then to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model as the cause of performance, and to compare against an (incumbent) process rather than against an alternate model.

In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and analysis of results is less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
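The kind of comparison such a live test enables can be sketched as a paired treatment-vs-control analysis over matched weeks. All numbers below are invented, and the paper's econometric analysis is considerably more involved; this only shows the shape of the comparison.

```python
from statistics import mean, stdev
from math import sqrt

# Indexed weekly revenue for algorithm-controlled (treatment) flights and
# incumbent-controlled (control) flights over matched weeks. All invented.
treatment = [102.0, 98.5, 105.2, 101.1, 99.8]
control   = [100.0, 97.0, 101.5, 100.2, 98.1]

# Paired differences and a simple one-sample t statistic on their mean.
diffs = [t - c for t, c in zip(treatment, control)]
t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

print(round(mean(diffs), 2), round(t_stat, 2))
```

The pairing over matched weeks is what lets the incumbent process, rather than some simulated model universe, serve as the baseline.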
Abstract:
In order to improve the management of copyright on the Internet, known as Digital Rights Management, there is a need for a shared language for copyright representation. Current approaches are based on purely syntactic solutions, i.e. a grammar that defines a rights expression language. These languages are difficult to put into practice due to the lack of explicit semantics that would facilitate their implementation. Moreover, they are simple from the legal point of view because they are intended just to model the usage licenses granted by content providers to end-users. Thus, they ignore the copyright framework that lies behind and the whole value chain from creators to end-users. Our proposal is to use a semantic approach based on semantic web ontologies. We detail the development of a copyright ontology in order to put this approach into practice. It models the copyright core concepts for creation, rights and the basic kinds of actions that operate on content. Altogether, it allows building a copyright framework for the complete value chain. The set of actions operating on content are our smallest building blocks, enabling us to cope with the complexity of copyright value chains and statements and, at the same time, guarantee a high level of interoperability and evolvability. The resulting copyright modelling framework is flexible and complete enough to model many copyright scenarios, not just those related to the economic exploitation of content. The ontology also includes moral rights, so such situations can also be modelled, as shown in the included example model of a withdrawal scenario. Finally, the ontology design and the selection of tools result in a straightforward implementation. Description Logic reasoners are used for license checking and retrieval. Rights are modelled as classes of actions, action patterns are also modelled as classes, and the same is done for concrete actions. Then, checking whether some right or license grants an action reduces to checking class subsumption, which is a direct functionality of these reasoners.
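The subsumption idea can be caricatured with ordinary class hierarchies: a license permits an action if the action's class is subsumed by some granted class of actions. The class names below are invented for illustration; the ontology itself relies on a Description Logic reasoner, not Python classes.

```python
# Toy illustration of "rights as classes of actions": checking whether a
# license grants an action reduces to a subclass (subsumption) test.

class Action: pass
class Communicate(Action): pass     # e.g. communication to the public
class Stream(Communicate): pass     # streaming is a kind of communication
class Reproduce(Action): pass       # copying

class License:
    def __init__(self, granted_actions):
        self.granted = granted_actions

    def permits(self, action_cls):
        # Subsumption check: the action falls under some granted action class.
        return any(issubclass(action_cls, g) for g in self.granted)

webcast_license = License([Communicate])
print(webcast_license.permits(Stream))      # streaming falls under Communicate
print(webcast_license.permits(Reproduce))   # copying was never granted
```

In the ontology, a reasoner performs the same test over DL class expressions, which also covers action patterns and concrete actions.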
Abstract:
This study compares the impact of quality management tools on the performance of organisations using the ISO 9001:2000 standard as the basis for a quality-management system and those using the EFQM model for this purpose. A survey is conducted among 107 experienced and independent quality-management assessors. The study finds that organisations with quality-management systems based on the ISO 9001:2000 standard tend to use general-purpose qualitative tools, and that these do have a relatively positive impact on their general performance. In contrast, organisations adopting the EFQM model tend to use more specialised quantitative tools, which produce significant improvements in specific aspects of their performance. The findings of the study will enable organisations to choose the most effective quality-improvement tools for their particular quality strategy.
Abstract:
The purpose of this work is to optimize the COMPSs workflow management system by characterizing the behaviour of different memory devices in terms of energy consumption and execution time. To this end, a cache service has been implemented for COMPSs to make it aware of the memory hierarchy, and multiple experiments have been carried out to characterize the memory devices and the performance improvements.
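A hierarchy-aware placement policy of the kind such a cache service needs can be sketched as "put each object in the fastest tier that still has room". Tier names, capacities and object sizes below are invented for illustration; this is not COMPSs code.

```python
# Toy memory-hierarchy-aware placement: assign each object to the fastest
# tier with enough free capacity. All tiers and sizes are assumed.

TIERS = [                       # ordered fastest -> slowest
    {"name": "DRAM", "capacity": 4, "used": 0},
    {"name": "NVRAM", "capacity": 8, "used": 0},
    {"name": "SSD", "capacity": 64, "used": 0},
]

placement = {}

def place(obj_id, size):
    """Assign obj_id to the fastest tier with enough free capacity."""
    for tier in TIERS:
        if tier["capacity"] - tier["used"] >= size:
            tier["used"] += size
            placement[obj_id] = tier["name"]
            return tier["name"]
    raise MemoryError("no tier can hold the object")

print(place("block_0", 3))   # fits in the fastest tier
print(place("block_1", 3))   # fastest tier now too full, spills to the next
```

A real cache service would additionally weigh the measured energy and access-time characteristics of each device, which is exactly what the experiments above characterize.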