923 results for Building Certification Systems
Abstract:
In today’s big data world, data is being produced in massive volumes, at great velocity, and from a variety of different sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (Internet of Things), social networks, communication networks, and many others. Interactive querying and large-scale analytics are increasingly used to derive value from this big data. A large portion of this data is stored and processed in the Cloud due to the several advantages it provides, such as scalability, elasticity, availability, low cost of ownership, and overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage, and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built, and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! provides early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
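As an illustration of the progressive-analytics idea behind NOW!, here is a minimal Python sketch (hypothetical code, not the actual system) that computes a running aggregate over progressively growing samples; fixing the sample order is what gives the deterministic, repeatable early results described above.

    import random

    def progressive_mean(data, batch_size=1000, seed=42):
        """Yield successive estimates of the mean over growing samples.

        Each progressive sample extends the previous one, so for a fixed
        seed the sequence of early results is deterministic and repeatable.
        """
        rng = random.Random(seed)
        order = list(range(len(data)))
        rng.shuffle(order)            # fixed random order => repeatable samples
        total, seen = 0.0, 0
        for start in range(0, len(order), batch_size):
            for idx in order[start:start + batch_size]:
                total += data[idx]
                seen += 1
            yield seen, total / seen  # early result after each batch

    # Example: watch the estimate converge on synthetic data.
    gen = random.Random(7)
    values = [gen.uniform(0, 100) for _ in range(100_000)]
    for n, est in progressive_mean(values, batch_size=20_000):
        print(f"after {n} rows: mean ~= {est:.2f}")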
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, and link prediction. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
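The neighborhood-centric programming model can be sketched on a single machine with networkx (an illustrative analogue, not NSCALE's API): the user program runs against each node's extracted k-hop subgraph rather than against the state of a single vertex.

    import networkx as nx

    def neighborhood_centric(G, radius, subgraph_fn, nodes=None):
        """Apply a user program to each node's k-hop neighborhood subgraph.

        Single-machine analogue of the neighborhood-centric model; the real
        system extracts and packs subgraphs across a distributed cluster to
        bound memory use and communication.
        """
        results = {}
        for v in (nodes if nodes is not None else G.nodes):
            ego = nx.ego_graph(G, v, radius=radius)   # v's k-hop neighborhood
            results[v] = subgraph_fn(v, ego)
        return results

    # Example user program: local clustering measured within each 2-hop ego net.
    G = nx.karate_club_graph()
    cc = neighborhood_centric(G, radius=2,
                              subgraph_fn=lambda v, sg: nx.clustering(sg, v))
    print(sorted(cc.items())[:5])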
Abstract:
Part 4: Transition Towards Product-Service Systems
Abstract:
The performance of building envelopes and roofing systems significantly depends on accurate knowledge of wind loads and the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with Partial Turbulence Simulation. In this dissertation, the test procedure and scaling requirements for tests in partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory the turbulence spectrum is divided into two distinct statistical processes, one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data. For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings like roof pavers are not well dealt with in many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of the existing information in codes and standards such as ASCE 7-10 on pressure coefficients for components and cladding.
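One plausible formalization of this two-process decomposition (the notation below is assumed, not taken from the dissertation) treats the low-frequency fluctuation $u_L$ as a quasi-steady modulation of the tunnel-measured high-frequency pressure coefficient:

    C_p(t) \approx c_{p,\mathrm{HF}}(t)\left(1 + \frac{u_L(t)}{\bar{U}}\right)^{2}

so that the full-scale peak distribution follows by conditioning on $u_L$:

    F_{\hat{C}_p}(x) = \int_{-\infty}^{\infty} F_{\hat{c}_{p,\mathrm{HF}}}\!\left(\frac{x}{(1 + u/\bar{U})^{2}}\right) p_{u_L}(u)\, du

where $\bar{U}$ is the mean wind speed, $p_{u_L}$ is the density of the low-frequency velocity fluctuation, and $F$ denotes peak cumulative distribution functions; full-scale equivalent peak pressure coefficients are then read off $F_{\hat{C}_p}$.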
Abstract:
This paper presents a framework for building medical training applications using virtual reality, along with a tool that assists in instantiating the framework's classes. The main purpose is to make it easier to build virtual reality applications for medical training, targeting systems that simulate biopsy exams and providing deformation, collision detection, and stereoscopy functionalities. Instantiating the classes allows quick implementation of tools for this purpose, thus reducing errors and keeping costs low through the use of open-source tools. With the instantiation tool, the process of building applications is fast and easy, so computer programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters in the chosen functionalities, and to store these parameters for future use. In order to verify the efficiency of the framework, some case studies are presented.
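A minimal Python sketch of what such framework instantiation could look like (all class and parameter names are hypothetical; the abstract does not show the actual API): the tool records the chosen functionalities with their parameters and yields an initial application the programmer can adapt.

    from dataclasses import dataclass, field

    @dataclass
    class SimulationConfig:
        deformation: bool = True          # soft-tissue deformation
        collision_detection: bool = True  # needle/organ collision tests
        stereoscopy: bool = False         # stereoscopic rendering
        parameters: dict = field(default_factory=dict)  # stored for future reuse

    class BiopsyTrainer:
        def __init__(self, config: SimulationConfig):
            self.config = config

        def describe(self) -> str:
            names = ("deformation", "collision_detection", "stereoscopy")
            enabled = [n for n in names if getattr(self.config, n)]
            return "Biopsy trainer with: " + ", ".join(enabled)

    # The instantiation tool would persist this configuration for reuse.
    cfg = SimulationConfig(stereoscopy=True, parameters={"needle_stiffness": 0.8})
    print(BiopsyTrainer(cfg).describe())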
Abstract:
Performance assessment of water consumption in WC cisterns has contributed to the development of flushing system technologies that allow smaller flushing volumes. The purpose of this work is to assess the low-water-consumption performance of WC cisterns with a dual flushing system (6/3 L) compared to 6 L single-flush WC cisterns in multifamily buildings. The research methodology consisted of a case study in a multifamily residential building with a submetering system, monitoring the total water consumption and that of the two flushing systems using water meters installed in the WC cisterns. By means of a mathematical model, the design flowrates in the main branch were compared for the two types of WC cisterns. The results indicated that the water consumption of the 6 L WC cistern accounted for 20% of total domestic consumption, whereas that of the dual-flush WC cistern (6/3 L) accounted for 16%. The dual flushing system (6/3 L) yielded a consumption reduction of about 18% compared to the 6 L system. The design flowrate values in the main branch, obtained with the mathematical model, were 0.35 L/s for systems with the 6 L WC cistern and 0.34 L/s with the dual-flush WC cistern (6/3 L), that is, a reduction of approximately 3%. Practical application: knowledge of the field performance of dual-flush WC cisterns helps industry improve these systems and aids users in choosing technologies aimed at water conservation, thus supporting the development of sustainable buildings.
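A quick arithmetic check of the reported flowrate figures (values taken directly from the abstract):

    # Design flowrates in the main branch, L/s.
    q_single, q_dual = 0.35, 0.34
    print(f"flowrate reduction: {(q_single - q_dual) / q_single:.1%}")  # ~2.9%, i.e. about 3%

    # Shares of total domestic consumption: 20% (6 L) versus 16% (6/3 L).
    # The ~18% consumption reduction is a monitored field result; it is not
    # derivable from these two shares alone, since the underlying totals differ.
    share_single, share_dual = 0.20, 0.16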
Abstract:
Many factors affect the airflow patterns, thermal comfort, contaminant removal efficiency, and indoor air quality at individual workstations in office buildings. In this study, four ventilation systems were used in a test chamber designed to represent an area of a typical office building floor and to reproduce the real characteristics of a modern office space. Measurements of particle concentration and thermal parameters (temperature and velocity) were carried out for each of the following types of ventilation systems: (a) a conventional air distribution system with ceiling supply and return; (b) a conventional air distribution system with ceiling supply and return near the floor; (c) an underfloor air distribution system; and (d) a split system. The measurements aimed to analyse the particle removal efficiency in the breathing zone and the impact of particle concentration on an individual at the workstation. The efficiency of each ventilation system was analysed by measuring particle size and concentration, ventilation effectiveness, and the indoor/outdoor ratio. Each ventilation system showed different airflow patterns, and the efficiency of each system in removing particles from the breathing zone showed no correlation with particle size across the various methods of analysis used. (C) 2008 Elsevier Ltd. All rights reserved.
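The abstract does not state how ventilation effectiveness and the indoor/outdoor ratio were defined; commonly used forms (given here as assumed definitions, not necessarily the paper's) are

    \varepsilon = \frac{C_e - C_s}{C_b - C_s}, \qquad \mathrm{I/O} = \frac{C_{\mathrm{indoor}}}{C_{\mathrm{outdoor}}}

where $C_e$, $C_s$, and $C_b$ are contaminant concentrations at the exhaust, at the supply, and in the breathing zone, respectively; $\varepsilon = 1$ corresponds to perfectly mixed air, and $\varepsilon > 1$ to better-than-mixing removal from the breathing zone.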
Abstract:
The most widely used refrigeration system is the vapor-compression system. In this cycle, the compressor is the most complex and expensive component, especially the semihermetic reciprocating type, which is often used in food product conservation. This component is very sensitive to variations in its operating conditions; if these conditions reach unacceptable levels, failures are practically inevitable. Therefore, maintenance actions should be taken in order to maintain good performance of such compressors and to avoid undesired stops of the system. To achieve this goal, one has to evaluate the reliability of the system and/or its components. In this context, reliability means the probability that a piece of equipment can perform its required functions for an established time period, under defined operating conditions. One of the tools used to improve component reliability is failure mode and effect analysis (FMEA). This paper proposes the FMEA methodology as a tool to evaluate the main failures found in semihermetic reciprocating compressors used in refrigeration systems. Based on the results, some suggestions for maintenance are given.
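As a sketch of how FMEA prioritizes such failures, the conventional risk priority number (RPN) is the product of severity, occurrence, and detection ratings (each on a 1-10 scale); the failure modes and ratings below are illustrative examples, not the paper's data.

    # Illustrative FMEA entries for a semihermetic reciprocating compressor.
    # RPN = severity x occurrence x detection; a higher RPN means higher priority.
    failure_modes = [
        # (failure mode,                      severity, occurrence, detection)
        ("broken suction valve",              7,        5,          4),
        ("motor winding insulation failure",  9,        3,          6),
        ("bearing wear",                      6,        4,          5),
    ]

    for mode, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
        print(f"{mode:34s} RPN = {s * o * d:3d}")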
Abstract:
We address here aspects of the implementation of a memory evolutive system (MES), based on the model proposed by A. Ehresmann and J. Vanbremeersch (2007), by means of a simulated network of spiking neurons with time-dependent plasticity. We point out the advantages and challenges of applying category theory to the representation of cognition using the MES architecture. Then we discuss the minimum requirements that an artificial neural network (ANN) should fulfill so that it is capable of expressing the categories, and the mappings between them, underlying the MES. We conclude that a pulsed ANN based on Izhikevich's formal neuron with STDP (spike-timing-dependent plasticity) has sufficient dynamical properties to meet these requirements, provided it can cope with the topological requirements. Finally, we present some perspectives for future research concerning the proposed ANN topology.
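For reference, Izhikevich's formal neuron is compact enough to state in a few lines; the Python sketch below simulates a single neuron with the standard regular-spiking parameters (STDP and the network topology discussed above are omitted for brevity).

    def izhikevich(I, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5, T=200.0):
        """Single Izhikevich neuron under constant input current I.

        dv/dt = 0.04 v^2 + 5 v + 140 - u + I
        du/dt = a (b v - u); on spike (v >= 30 mV): v <- c, u <- u + d
        """
        v, u = c, b * c
        spikes = []
        for k in range(int(T / dt)):
            v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
            u += dt * a * (b * v - u)
            if v >= 30.0:               # spike: record time and reset
                spikes.append(k * dt)
                v, u = c, u + d
        return spikes

    print(izhikevich(I=10.0)[:5])  # first few spike times, in ms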
Abstract:
Forest Stewardship Council (FSC) certification promises international consumers that 'green-label' timber has been logged sustainably. However, recent research indicates that this is not true for ipe (Tabebuia spp.), currently flooding the US residential decking market, much of it logged in Brazil. Uneven or non-application of minimum technical standards for certification could undermine added value and eventually the certification process itself. We examine public summary reports by third-party certifiers describing the evaluation process for certified companies in the Brazilian Amazon to determine the extent to which standards are uniformly applied and the degree to which third-party certifier requirements for compliance are consistent among properties. Current best-practice harvest systems, combined with Brazilian legal norms for harvest levels, guarantee that no certified company or community complies with FSC criteria and indicators specifying species-level management. No guidelines indicate which criteria and indicators must be enforced, or to what degree, for certification to be conferred by third-party assessors; nor do objective guidelines exist for evaluating compliance for criteria and indicators for which adequate scientific information is not yet available to identify acceptable levels. Meanwhile, certified companies are expected to monitor the long-term impacts of logging on biodiversity in addition to conducting best-practice forest management. This burden should reside elsewhere. We recommend a clarification of 'sustained timber yield' that reflects the current state of knowledge and practice in Amazonia. Quantifiable verifiers for best-practice forest management must be developed and consistently employed. These will need to be flexible to reflect the diversity in forest structure and dynamics that prevails across this vast region. We offer suggestions for how to achieve these goals.
Abstract:
This paper investigates whether initiatives for sustainability certification of Brazilian ethanol can be expected to stimulate a change among producers toward more sustainable production and, if so, what those changes would likely be. Connected to this, several questions are raised, including whether producers might prefer to target other markets with less stringent demands, and whether certification might lead to structural changes in the sector because producers who lack the capacity to meet the new requirements cannot remain competitive. The analysis of interviews with a diverse group of stakeholders, guided by the Technological Innovation Systems framework, allowed us to identify different actions taken by the Brazilian sugarcane ethanol sector in response to sustainability requirements. The interviewees agreed that sustainability certification is an important element for the expansion of biofuel production in Brazil. Brazilian stakeholders have created a platform for more competitive sustainable production and have initiated relevant processes in response to developments connected to sustainability certification. Yet the certification activities have had a limited impact in terms of the number of stakeholders involved. Interview responses nevertheless indicate that the sector is likely to adapt to new certification requirements rather than leave markets where such requirements become established. Structural changes can be expected if certification requirements, as they exist in many initiatives, are introduced in inflexible ways. The ethanol industry is of great social importance in Brazil, and some adjustments to certification may be required. The paper concludes by suggesting some actions for the industry. (C) 2010 Society of Chemical Industry and John Wiley & Sons, Ltd
Abstract:
The development of cropping systems simulation capabilities world-wide, combined with easy access to powerful computing, has resulted in a plethora of agricultural models and, consequently, model applications. Nonetheless, the scientific credibility of such applications and their relevance to farming practice are still being questioned. Our objective in this paper is to highlight some of the model applications from which benefits for farmers were or could be obtained via changed agricultural practice or policy. Changed on-farm practice due to the direct contribution of modelling, while keenly sought after, may in some cases be less achievable than a contribution via agricultural policies. This paper is intended to give some guidance for future model applications. It is not a comprehensive review of model applications, nor is it intended to discuss modelling in the context of social science or extension policy. Rather, we take snapshots around the globe to 'take stock' and to demonstrate that well-defined financial and environmental benefits can be obtained on-farm from the use of models. We highlight the importance of 'relevance' and hence the importance of true partnerships between all stakeholders (farmers, scientists, advisers) for the successful development and adoption of simulation approaches. Specifically, we address some key points that are essential for successful model applications: (1) the issues to be addressed must be neither trivial nor obvious; (2) a modelling approach must reduce complexity rather than proliferate choices in order to aid the decision-making process; and (3) the cropping systems must be sufficiently flexible to allow management interventions based on insights gained from models. The pros and cons of normative approaches (e.g. decision support software that can reach a wide audience quickly but is often poorly contextualized for any individual client) versus model applications within the context of an individual client's situation are also discussed. We suggest that a tandem approach is necessary, whereby the latter is used in the early stages of model application for confidence building amongst client groups. This paper focuses on five specific regions that differ fundamentally in terms of environment and socio-economic structure, and hence in their requirements for successful model applications. Specifically, we give examples from Australia and South America (high climatic variability, large areas, low input, technologically advanced); Africa (high climatic variability, small areas, low input, subsistence agriculture); India (high climatic variability, small areas, medium-level inputs, technologically progressing); and Europe (relatively low climatic variability, small areas, high input, technologically advanced). The contrast between Australia and Europe further demonstrates how successful model applications are strongly influenced by the policy framework within which producers operate. We suggest that this might eventually lead to better adoption of fully integrated systems approaches and result in the development of resilient farming systems that are in tune with current climatic conditions and are adaptable to biophysical and socioeconomic variability and change. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
Poultry can be managed under different feeding systems, depending on husbandry skills and the feed available. These systems include: (1) a complete dry feed offered as a mash ad libitum; (2) the same feed offered as pellets or crumbles ad libitum; (3) a complete feed with added whole grain; (4) a complete wet feed given once or twice a day; (5) a complete feed offered on a restricted basis; and (6) choice feeding. Of all these, an interesting alternative to offering complete diets is choice feeding, which can be applied on either a small or a large commercial scale. Under choice feeding, or free-choice feeding, birds are usually offered a choice between three types of feedstuffs: (a) an energy source (e.g. maize, rice bran, sorghum or wheat); (b) a protein source (e.g. soyabean meal, meat meal, fish meal or coconut meal) plus vitamins and minerals; and (c), in the case of laying hens, calcium in granular form (i.e. oyster-shell grit). This system differs from the modern commercial practice of offering a complete diet comprising energy and protein sources ground and mixed together. Under the complete diet system, birds are mainly only able to exercise their appetite for energy; when the environmental temperature varies, the birds either over- or under-consume protein and calcium. The basic principle behind practising choice feeding with laying hens is that individual hens are able to select from the various feed ingredients on offer and compose their own diet according to their actual needs and production capacity. A choice-feeding system is of particular importance to small poultry producers in developing countries, such as Indonesia, because it can substantially reduce the cost of feed. The system is flexible and can be constructed in such a way that the various needs of a flock of different breeds, including village chickens, under different climates can be met. The system also offers a more effective way to use home-produced grain, such as maize, and by-products, such as rice bran, in developing countries. Because oyster-shell grit is readily available in developing countries at lower cost than limestone, its use can further benefit small-holders in these countries. These benefits apart, simpler equipment suffices when designing and building a feed mixer on the farm, and transport costs are lower. If whole (unground) grain is used, the intake of which is accompanied by increased efficiency of feed utilisation, the costs of grinding, mixing, and many of the handling procedures associated with mash and pellet preparation are eliminated. The choice feedstuffs can all be offered through current feed distribution systems, either by mixing the ingredients first or by using a bulk bin divided into three compartments.
Abstract:
According to Wright [1], certification of products and processes began during the 1960s in the manufacturing industry, as a tool to control and assure the quality/conformity of products and services provided by suppliers to customers/consumers. The ISO 9000 series was first published in 1987 and was created with a flexible character, to be reviewed periodically. Later, other normative references were published, most notably ISO 14001 in 1996 and OHSAS 18001 in 1999. This was also the natural sequence of certification processes in organizations: they began with the certification of quality management systems (QMS), followed by environmental management systems (EMS), and then occupational health and safety management systems (OHSMS). Hence, a high percentage of organizations with an EMS certified in accordance with ISO 14001 had also implemented a certified QMS in accordance with ISO 9001. At first the implementation of a QMS was particularly relevant in highly demanding activity sectors, such as the automotive and aeronautical industries, but it rapidly extended to every activity sector, becoming a common requisite of any company worldwide and a factor of competitiveness and survival. Owing to increasingly demanding environmental legislation in developed countries, companies nowadays are required to seriously take into consideration not only the environmental aspects associated with the production chain itself, but also the life cycle of their products.