939 results for resource consumption accounting
Abstract:
As the built environment accounts for much of the world's emissions, resource consumption and waste, concerns remain as to how sustainable the sector is. Understanding how such concerns can be better managed is complex, with a range of competing agendas and institutional forces at play. This is especially the case in Nigeria, where differing priorities, weak regulations and weak institutions often stand in the way of addressing this challenge. Construction firms compete with each other in a market that is growing in size and sophistication every year. The business case for sustainability has been made repeatedly in the literature. However, the sustainability-related capabilities of construction firms in Nigeria have not been studied. This paper presents the preliminary findings of an exploratory multi-case study carried out to understand firms' views on sustainability as a source of competitive advantage. A multinational firm and a lower-medium-sized indigenous firm were selected for this purpose. Qualitative interviews were conducted with the top-level management of both firms, with key themes from the sustainable construction and dynamic capabilities literature informing the case study protocol. The interviews were transcribed and analysed using NVivo software. The findings suggest that the multinational firm is better grounded in sustainability knowledge. Although the level of awareness of, and demand for, sustainable construction is generally very poor, a few international clients are beginning to stimulate interest in sustainable buildings. This has prompted both firms to build their capabilities in that regard, albeit in an unhurried manner. Both firms agree on the potential of market-driven sustainability in the long term. Nonetheless, more drastic action is required to accelerate the sustainable construction agenda in Nigeria.
Abstract:
Wavelength-routed networks (WRN) are very promising candidates for next-generation Internet and telecommunication backbones. In such a network, optical-layer protection is of paramount importance due to the risk of losing large amounts of data under a failure. To protect the network against this risk, service providers usually provide a pair of risk-independent working and protection paths for each optical connection. However, the investment made in optical-layer protection increases network cost. To reduce this capital expenditure, service providers need to utilize their network resources efficiently. Among the existing approaches, shared-path protection has proven to be practical and cost-efficient [1]. In shared-path protection, several protection paths can share a wavelength on a fiber link if their working paths are risk-independent. In real-world networks, provisioning is usually implemented without knowledge of future network resource utilization, and as connections are added and deleted, network utilization becomes sub-optimal. Reconfiguration, i.e., re-provisioning the existing connections, is an attractive way to close the gap between the current network utilization and its optimal value [2]. In this paper, we propose a new shared-protection-path reconfiguration approach. Unlike some previous reconfiguration approaches that alter the working paths, our approach changes only the protection paths; it therefore does not interfere with the ongoing services on the working paths and is risk-free. Previous studies have verified the benefits of reconfiguring existing connections [2][3][4]. Most of them aim to minimize the total number of used wavelength-links or ports. However, this objective does not directly translate into cost savings, because minimizing total network resource consumption does not necessarily maximize the capacity to accommodate future connections; service providers may still need to pay for early network upgrades. Our shared-protection-path reconfiguration approach is instead based on a load-balancing objective, which minimizes the network load distribution vector (LDV, see Section 2). This objective is designed to postpone network upgrades, bringing extra cost savings to service providers: they can establish as many connections as possible before a network upgrade, resulting in increased revenue. We develop a heuristic load-balancing (LB) reconfiguration approach based on this objective and compare its performance with an approach previously introduced in [2] and [4], whose objective is minimizing total network resource consumption.
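To make the load-balancing idea concrete, here is a minimal sketch of an LDV-driven reconfiguration pass. It is a simplification under stated assumptions: unit capacity per protection path, no modelling of the wavelength-sharing or risk-independence constraints of real shared-path protection, and all names (`ldv`, `reconfigure`) are illustrative rather than taken from the paper. LDVs are compared lexicographically, so lowering the most loaded link takes priority.

```python
def ldv(loads):
    """Load distribution vector: link loads sorted in decreasing order.
    Comparing LDVs lexicographically favours flatter load distributions."""
    return tuple(sorted(loads.values(), reverse=True))

def reconfigure(protection_paths, candidate_routes, loads):
    """Greedy LB pass: repeatedly move a protection path to the candidate
    route that lexicographically lowers the LDV the most. Working paths
    are never touched, so ongoing services are unaffected."""
    improved = True
    while improved:
        improved = False
        for conn, path in protection_paths.items():
            best_ldv, best_route = ldv(loads), None
            for alt in candidate_routes[conn]:
                if alt == path:
                    continue
                trial = dict(loads)
                for link in path:   # release the current reservation
                    trial[link] -= 1
                for link in alt:    # reserve capacity on the new route
                    trial[link] += 1
                if ldv(trial) < best_ldv:
                    best_ldv, best_route = ldv(trial), alt
            if best_route is not None:
                for link in path:
                    loads[link] -= 1
                for link in best_route:
                    loads[link] += 1
                protection_paths[conn] = best_route
                improved = True
    return protection_paths

# Two connections whose protection paths both sit on overloaded link 'a'.
loads = {"a": 2, "b": 0, "c": 0, "d": 0}
paths = {1: ("a",), 2: ("a",)}
candidates = {1: [("a",), ("b", "c")], 2: [("a",), ("d",)]}
print(reconfigure(paths, candidates, loads), ldv(loads))
```

Note that the pass accepts a flatter load even at the cost of more total wavelength-links, which is precisely how the LDV objective differs from total-resource minimization.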
Abstract:
This quantitative study aimed to identify the costs of the most frequent nursing activities performed for highly dependent hospitalized patients at a medical clinic. The non-probabilistic convenience sample comprised 607 observations of oral feeding (OF), blood pressure (BP)/heart rate (HR) verification, body temperature checking (BTC), intimate hygiene, and feeding tube management. The costs identified were R$2.40 (SD ±2.64) for OF; R$1.26 (SD ±0.48) for BP/HR verification; R$1.17 (SD ±0.46) for BTC; R$15.59 (SD ±8.62) for intimate hygiene; and R$5.95 (SD ±2.13) for feeding tube management. This study will facilitate cost management, with a view to avoiding waste related to unnecessary resource consumption and to establishing a correlation between costs and care delivery results. Supported by Pro-Reitoria de Pesquisa, Universidade de Sao Paulo, Brazil.
Abstract:
OBJECTIVES Patients with inflammatory bowel disease (IBD) have a high resource consumption, with considerable costs for the healthcare system. In a system with sparse resources, treatment is influenced not only by clinical judgement but also by resource consumption. We aimed to determine the resource consumption of IBD patients and to identify its significant predictors. MATERIALS AND METHODS Data from the prospective Swiss Inflammatory Bowel Disease Cohort Study were analysed for the resource consumption endpoints hospitalization and outpatient consultations at enrolment [1187 patients; 41.1% ulcerative colitis (UC), 58.9% Crohn's disease (CD)] and at 1-year follow-up (794 patients). Predictors of interest were chosen through an expert panel and a review of the relevant literature. Logistic regressions were used for binary endpoints; negative binomial regressions and zero-inflated Poisson regressions were used for count data. RESULTS For CD, fistula, use of biologics and disease activity were significant predictors of hospitalization days (all P-values <0.001); age, sex, steroid therapy and biologics were significant predictors of the number of outpatient visits (P=0.0368, 0.023, 0.0002 and 0.0003, respectively). For UC, biologics, C-reactive protein, former-smoker status, age and sex were significantly predictive of hospitalization days (P=0.0167, 0.0003, 0.0003, 0.0076 and 0.0175, respectively); disease activity and immunosuppressive therapy predicted the number of outpatient visits (P=0.0009 and 0.0017, respectively). The results of the multivariate regressions are shown in detail. CONCLUSION Several highly significant clinical predictors of resource consumption in IBD were identified that might be considered in medical decision-making. In terms of resource consumption and its predictors, CD and UC behave differently.
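The regression choices named above can be reproduced with standard tooling. Below is a minimal sketch using statsmodels; the file name and all column names are hypothetical placeholders, not the actual Swiss IBD Cohort Study schema.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort extract; every column name here is illustrative.
df = pd.read_csv("ibd_cohort.csv")

# Binary endpoint (any hospitalization): logistic regression.
logit_fit = smf.logit(
    "hospitalized ~ fistula + biologics + disease_activity", data=df
).fit()

# Over-dispersed count endpoint (hospitalization days): negative binomial.
nb_fit = smf.negativebinomial(
    "hosp_days ~ age + sex + steroids + biologics + disease_activity",
    data=df,
).fit()

print(logit_fit.summary())
print(nb_fit.summary())
```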
Abstract:
The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, nonfailure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
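As a toy illustration of the underlying idea, the sketch below abstractly evaluates arithmetic expressions over an interval domain. It is a didactic stand-in only: CiaoPP's analyses operate on (constraint) logic programs and on much richer domains (shapes, sharing, sizes, cost), as the abstract describes.

```python
# Toy interval abstract interpreter for arithmetic expressions: computes a
# sound over-approximation of the set of values an expression can take.

TOP = (float("-inf"), float("inf"))

def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def mul(a, b):
    prods = [x * y for x in a for y in b]
    return (min(prods), max(prods))

def eval_abs(expr, env):
    """expr is a nested-tuple AST: ('num', n), ('var', x), or
    ('add'|'mul', lhs, rhs). env maps variable names to intervals."""
    op = expr[0]
    if op == "num":
        return (expr[1], expr[1])
    if op == "var":
        return env.get(expr[1], TOP)
    lhs, rhs = eval_abs(expr[1], env), eval_abs(expr[2], env)
    return add(lhs, rhs) if op == "add" else mul(lhs, rhs)

# x in [0, 10]  =>  3*x + 1 in [1, 31]
print(eval_abs(("add", ("mul", ("num", 3), ("var", "x")), ("num", 1)),
               {"x": (0, 10)}))
```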
Abstract:
We propose an analysis for detecting procedures and goals that are deterministic (i.e., that produce at most one solution at most once), or predicates whose clause tests are mutually exclusive (which implies that at most one of their clauses will succeed) even if they are not deterministic. The analysis takes advantage of the pruning operator in order to improve the detection of mutual exclusion and determinacy. It also supports arithmetic equations and disequations, as well as equations and disequations on terms, for which we give a complete satisfiability testing algorithm w.r.t. available type information. Information about determinacy can be used for program debugging and optimization, resource consumption and granularity control, abstraction-carrying code, etc. We have implemented the analysis and integrated it in the CiaoPP system, which also automatically infers the mode and type information that our analysis takes as input. Experiments performed on this implementation show that the analysis is fairly accurate and efficient.
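The clause-level check can be pictured as a satisfiability test on conjoined guards: clause tests are mutually exclusive iff every pairwise conjunction is unsatisfiable. The sketch below does this for the much simpler case of single-variable integer interval guards; the real analysis handles general arithmetic and term (dis)equations with type information.

```python
# Toy mutual-exclusion check: each clause guard is a closed integer
# interval (lo, hi) of accepted values for one variable.

def satisfiable(lo, hi):
    return lo <= hi

def conjoin(g1, g2):
    """Conjunction of two interval guards = interval intersection."""
    return (max(g1[0], g2[0]), min(g1[1], g2[1]))

def mutually_exclusive(guards):
    """True iff no two guards admit the same input, i.e. every pairwise
    conjunction is unsatisfiable -- so at most one clause can succeed."""
    return all(
        not satisfiable(*conjoin(guards[i], guards[j]))
        for i in range(len(guards))
        for j in range(i + 1, len(guards))
    )

# p(X) :- X < 0.   p(X) :- X >= 0.   (over the integers)
print(mutually_exclusive([(float("-inf"), -1), (0, float("inf"))]))  # True
```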
Abstract:
In times of crisis, it is imperative to make the consumption of public resources as rational as possible. Urban public transport is a sector that receives large investments and whose services are heavily subsidized. Increasing the technical efficiency of the sector, defined as the ratio of service output to resource consumption, can help achieve better management of public funds. A first step towards such an improvement is the development of a methodology for evaluating the technical efficiency of public transport companies. There are different methods for the technical evaluation of a set of companies within an industry. One of the most widely used is the frontier method, in particular Data Envelopment Analysis (DEA). This method establishes a relative technical-efficiency frontier for a specific group of companies, based on a limited number of variables. The variables must quantify, on the one hand, the services provided by the different companies (outputs) and, on the other, the resources consumed in producing those services (inputs). The objective of this thesis is to analyze, using the DEA method, the technical efficiency of urban bus services in Spain. To this end, the most suitable number of variables for the models from which the efficiency frontiers are obtained is studied. The methodology is developed using indicators of the urban bus services of the main cities of the Spanish metropolitan areas for the period 2004-2009.
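The input-oriented CCR envelopment model at the heart of DEA reduces to one small linear program per company. Below is a minimal sketch using scipy's linprog; the data are invented for illustration (one input, one output), whereas the thesis works with several input and output indicators.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of company o.
    X: (n_companies, n_inputs), Y: (n_companies, n_outputs).
    Returns theta in (0, 1]; theta == 1 means o lies on the frontier."""
    n, m = X.shape
    _, s = Y.shape
    # Decision variables: [theta, lambda_1 .. lambda_n]; minimize theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):   # inputs: sum_j lam_j * x_ji <= theta * x_oi
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):   # outputs: sum_j lam_j * y_jr >= y_or
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds)
    return res.x[0]

# Toy data: operating cost (input) vs. vehicle-km produced (output).
X = np.array([[100.0], [120.0], [150.0]])
Y = np.array([[500.0], [540.0], [560.0]])
for o in range(3):
    print(f"company {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```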
Abstract:
We propose an analysis for detecting procedures and goals that are deterministic (i.e., that produce at most one solution), or predicates whose clause tests are mutually exclusive (which implies that at most one of their clauses will succeed) even if they are not deterministic (because they call other predicates that can produce more than one solution). Applications of such determinacy information include detecting programming errors, performing certain high-level program transformations for improving search efficiency, optimizing low-level code generation and parallel execution, and estimating tighter upper bounds on the computational costs of goals and data sizes, which can be used for program debugging, resource consumption and granularity control, etc. We have implemented the analysis and integrated it in the CiaoPP system, which also infers automatically the mode and type information that our analysis takes as input. Experiments performed on this implementation show that the analysis is fairly accurate and efficient.
Abstract:
Concurrency in Logic Programming has received much attention in the past. One problem with many proposals, when applied to Prolog, is that they involve large modifications to the standard implementations, and/or the communication and synchronization facilities provided do not fit as naturally within the language model as we feel is possible. In this paper we propose a new mechanism for implementing synchronization and communication for concurrency, based on atomic accesses to designated facts in the (shared) database. We argue that this model is comparatively easy to implement and harmonizes better than previous proposals with the Prolog control model and standard set of built-ins. We show how in the proposed model it is easy to express classical concurrency algorithms and to subsume other mechanisms such as Linda, variable-based communication, or classical parallelism-oriented primitives. We also report on an implementation of the model and provide performance and resource consumption data.
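The flavour of the mechanism, blocking atomic access to shared facts used for synchronization, can be sketched outside Prolog as well. The toy Python fact store below is only an analogy for the model (the paper realizes it with Prolog primitives over the shared database); `assertz` and `retract_wait` are illustrative names echoing Prolog's assert/retract.

```python
import threading

class FactBase:
    """Toy shared fact store with atomic, blocking access, in the spirit
    of synchronizing via designated facts in a shared database."""
    def __init__(self):
        self._facts = set()
        self._cond = threading.Condition()

    def assertz(self, fact):
        """Atomically add a fact and wake up waiting consumers."""
        with self._cond:
            self._facts.add(fact)
            self._cond.notify_all()

    def retract_wait(self, pred):
        """Atomically remove and return a matching fact, blocking until
        one exists -- the synchronization primitive (cf. Linda's in)."""
        with self._cond:
            while True:
                for f in self._facts:
                    if pred(f):
                        self._facts.remove(f)
                        return f
                self._cond.wait()

# Producer and consumer synchronizing through the shared fact base.
fb = FactBase()
out = []
t = threading.Thread(
    target=lambda: out.append(fb.retract_wait(lambda f: f[0] == "job")))
t.start()
fb.assertz(("job", 42))
t.join()
print(out)  # [('job', 42)]
```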
Abstract:
Ciao is a logic-based, multi-paradigm programming system. One of its most distinguishing features is that it supports a large number of semantic and syntactic language features which can be selectively activated or deactivated for each program module. As a result, a module can be written in, for example, ISO-Prolog plus constraints and higher order, while another can be a pure logic module with a different control rule such as iterative deepening and/or tabling, and perhaps using constructive negation. A powerful and modular extension mechanism allows user-level design and implementation of such features and sub-languages. Another distinguishing feature of Ciao is its powerful assertion language, which allows expressing many kinds of program properties (ranging from, e.g., moded types to resource consumption), as well as tests and documentation. The compiler is capable of statically finding violations of these properties or verifying that programs comply with them, and issuing certificates of this compliance. The compiler also performs many types of optimizations, including automatic parallelization. It offers very competitive performance, while retaining the flexibility and interactive development of a dynamic language. We will present a hands-on overview of the system, through small examples which emphasize the novel aspects and the motivations which lie behind Ciao's design and implementation.
Abstract:
CiaoPP is the abstract interpretation-based preprocessor of the Ciao multi-paradigm (Constraint) Logic Programming system. It uses modular, incremental abstract interpretation as a fundamental tool to obtain information about programs. In CiaoPP, the semantic approximations thus produced have been applied to perform high- and low-level optimizations during program compilation, including transformations such as multiple abstract specialization, parallelization, partial evaluation, resource usage control, and program verification. More recently, novel and promising applications of such semantic approximations are being applied in the more general context of program development, such as program verification. In this work, we describe our extension of the system to incorporate Abstraction-Carrying Code (ACC), a novel approach to mobile code safety. ACC follows the standard strategy of associating safety certificates to programs, originally proposed in Proof-Carrying Code. A distinguishing feature of ACC is that we use an abstraction (or abstract model) of the program computed by standard static analyzers as a certificate. The validity of the abstraction on the consumer side is checked in a single pass by a very efficient and specialized abstract interpreter. We have implemented and benchmarked ACC within CiaoPP. The experimental results show that the checking phase is indeed faster than the proof generation phase, and that the sizes of certificates are reasonable. Moreover, the preprocessor is based on compile-time (and run-time) tools for the certification of CLP programs with resource consumption assurances.
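The key asymmetry in ACC, that producing the abstraction requires a fixpoint computation while checking it does not, can be illustrated with a toy interval certificate over a small control-flow graph. Everything below (domain, transfer functions, program) is a made-up miniature, not CiaoPP's actual certificate format.

```python
# One-pass certificate checking in the spirit of Abstraction-Carrying
# Code: the producer ships an abstraction (per-point intervals for one
# variable); the consumer merely verifies it is a post-fixpoint of the
# abstract transfer functions -- no fixpoint iteration needed.

def leq(a, b):                 # interval inclusion: a within b
    return b[0] <= a[0] and a[1] <= b[1]

def check(certificate, edges, transfer):
    """edges: list of (src_point, dst_point, op) control-flow edges.
    The certificate is valid iff, for every edge,
    transfer(op, cert[src]) is included in cert[dst]."""
    return all(
        leq(transfer(op, certificate[src]), certificate[dst])
        for src, dst, op in edges
    )

def transfer(op, itv):
    if op == "incr":           # x := x + 1
        return (itv[0] + 1, itv[1] + 1)
    if op == "clamp10":        # x := min(x, 10)
        return (min(itv[0], 10), min(itv[1], 10))
    return itv

cert = {0: (0, 0), 1: (1, 11), 2: (1, 10)}
edges = [(0, 1, "incr"), (2, 1, "incr"), (1, 2, "clamp10")]
print(check(cert, edges, transfer))  # True: the certificate is valid
```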
Abstract:
The technique of Abstract Interpretation has allowed the development of very sophisticated global program analyses which are at the same time provably correct and practical. We present in a tutorial fashion a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, non-failure, and bounds on resource consumption (time or space cost). CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements the described functionality, will be used to illustrate the fundamental ideas.
Abstract:
We present in a tutorial fashion CiaoPP, the preprocessor of the Ciao multi-paradigm programming system, which implements a novel program development framework which uses abstract interpretation as a fundamental tool. The framework uses modular, incremental abstract interpretation to obtain information about the program. This information is used to validate programs, to detect bugs with respect to partial specifications written using assertions (in the program itself and/or in system libraries), to generate and simplify run-time tests, and to perform high-level program transformations such as multiple abstract specialization, parallelization, and resource usage control, all in a provably correct way. In the case of validation and debugging, the assertions can refer to a variety of program points such as procedure entry, procedure exit, points within procedures, or global computations. The system can reason with much richer information than, for example, traditional types. This includes data structure shape (including pointer sharing), bounds on data structure sizes, and other operational variable instantiation properties, as well as procedure-level properties such as determinacy, termination, non-failure, and bounds on resource consumption (time or space cost).
Abstract:
Due to recent scientific and technological advances in information systems, it is now possible to perform almost every application on a mobile device. The need to make such devices more intelligent opens an opportunity to design data mining algorithms that are able to execute autonomously on local devices to provide the device with knowledge. The problem behind autonomous mining is the proper configuration of the algorithm to produce the most appropriate results. Contextual information, together with resource information of the device, has a strong impact both on the feasibility of a particular execution and on the production of the proper patterns. On the other hand, the performance of the algorithm, expressed in terms of efficacy and efficiency, highly depends on the features of the dataset to be analyzed together with the parameter values of a particular implementation of an algorithm. However, few existing approaches deal with the autonomous configuration of data mining algorithms, and in any case they do not deal with contextual or resource information. Both issues are of particular significance for social network applications. In fact, the widespread use of social networks, and consequently the amount of information shared, have made the need to model context in social applications a priority. Resource consumption also plays a crucial role in such platforms, as users access social networks mainly from their mobile devices. This PhD thesis addresses the aforementioned open issues, focusing on i) analyzing the behavior of algorithms, ii) mapping contextual and resource information to find the most appropriate configuration, and iii) applying the model to the case of a social recommender. Four main contributions are presented: - The EE-Model: able to predict the behavior of a data mining algorithm in terms of resources consumed and the accuracy of the mining model it will obtain. - The SC-Mapper: maps a situation defined by the context and resource state to a data mining configuration. - SOMAR: a social activity (events and informal ongoings) recommender for mobile devices. - D-SOMAR: an evolution of SOMAR which incorporates the configurator in order to provide updated recommendations. Finally, the experimental validation of the proposed contributions using synthetic and real datasets allows us to achieve the objectives and answer the research questions proposed for this dissertation.
Abstract:
Testing is nowadays the most widely used technique to validate software and assess its quality. It is integrated into all practical software development methodologies and plays a crucial role in the success of any software project. From the smallest units of code to the most complex components, their integration into a software system and later deployment, all pieces of a software product must be tested thoroughly before the product can be released. The main limitation of software testing is that it remains a mostly manual task, representing a large fraction of the total development cost. In this scenario, test automation is paramount to alleviating such high costs. Test case generation (TCG) is the process of automatically generating test inputs that achieve high coverage of the system under test. Among a wide variety of approaches to TCG, this thesis focuses on structural (white-box) TCG, where one of the most successful enabling techniques is symbolic execution. In symbolic execution, the program under test is executed with its input arguments being symbolic expressions rather than concrete values. This thesis relies on a previously developed constraint-based TCG framework for imperative object-oriented programs (e.g., Java), in which the imperative program under test is first translated into an equivalent constraint logic program, which is then executed symbolically using the standard evaluation mechanisms of Constraint Logic Programming (CLP), extended with special treatment for dynamically allocated data structures. Improving the scalability and efficiency of symbolic execution constitutes a major challenge. It is well known that symbolic execution quickly becomes impractical due to the large number of paths that must be explored and the size of the constraints that must be handled. Moreover, symbolic-execution-based TCG tends to produce an unnecessarily large number of test cases when applied to medium-sized or large programs. The contributions of this dissertation can be summarized as follows. (1) A compositional approach to CLP-based TCG is developed which overcomes inter-procedural path explosion by separately analyzing each component (method) of the program under test, storing the results as method summaries and incrementally reusing them to obtain whole-program results. A similar compositional strategy based on program specialization (partial evaluation) is also developed for the state-of-the-art symbolic execution tool Symbolic PathFinder (SPF). (2) Resource-driven TCG is proposed as a methodology that uses resource consumption information to drive symbolic execution towards those parts of the program under test that comply with a user-provided resource policy, avoiding the exploration of the parts that violate it. (3) A generic methodology is proposed to guide symbolic execution towards the most interesting parts of a program, using abstractions as oracles to steer the execution through the parts that interest the programmer/tester most. (4) A new heap-constraint solver is proposed which efficiently handles heap-related constraints and aliasing of references during symbolic execution, greatly outperforming the standard state-of-the-art technique known as lazy initialization. (5) All of the above techniques have been implemented in the PET system (the compositional approach also in the SPF tool). Experimental evaluation has confirmed that they considerably improve the scalability and efficiency of symbolic execution and TCG.
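A miniature of the resource-driven idea in contribution (2): symbolically explore a branching program over one integer input, carrying the path condition as an interval and pruning any path whose accumulated cost already violates the resource policy. The program encoding, cost model and budget below are invented for illustration; the actual framework works on CLP translations of imperative programs with real constraint solving.

```python
# Toy resource-driven symbolic execution over a single integer input x.
# Path conditions are kept as integer intervals [lo, hi]; paths that
# exceed the resource budget are pruned instead of explored further.

INF = float("inf")

def explore(node, lo, hi, cost, budget, tests):
    """node is ('leaf', leaf_cost) or ('if_le', k, branch_cost, then, else),
    the latter meaning 'if x <= k'. cost is the resource consumed so far."""
    if lo > hi or cost > budget:          # infeasible path, or over budget
        return
    if node[0] == "leaf":
        total = cost + node[1]
        if total <= budget:               # keep only policy-compliant paths
            tests.append(((lo, hi), total))
        return
    _, k, step, then_br, else_br = node
    explore(then_br, lo, min(hi, k), cost + step, budget, tests)      # x <= k
    explore(else_br, max(lo, k + 1), hi, cost + step, budget, tests)  # x > k

prog = ("if_le", 0, 1,
        ("leaf", 0),                      # x <= 0: cheap path
        ("if_le", 10, 5,
         ("leaf", 0),                     # 0 < x <= 10: moderate path
         ("leaf", 100)))                  # x > 10: too expensive, pruned

tests = []
explore(prog, -INF, INF, 0, budget=20, tests=tests)
print(tests)   # [((-inf, 0), 1), ((1, 10), 6)]
```

Each surviving entry is a satisfiable path condition plus its cost; a concrete test input can be picked from each interval, while the expensive `x > 10` path is never enumerated.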