870 results for repository, process model, version, storage
Abstract:
The study reported here is part of a large project for evaluation of the Thermo-Chemical Accumulator (TCA), a technology under development by the Swedish company ClimateWell AB. The studies concentrate on the use of the technology for comfort cooling, and this report concentrates on laboratory measurements, modelling and system simulation. The TCA is a three-phase absorption heat pump that stores energy in the form of crystallised salt, in this case Lithium Chloride (LiCl) with water as the second substance. The process requires vacuum conditions, as do standard absorption chillers using LiBr/water. Measurements were carried out in the laboratories of the Solar Energy Research Center (SERC) at Högskolan Dalarna as well as at ClimateWell AB. The measurements at SERC were performed on prototype version 7:1 and showed that this prototype had several problems resulting in poor and unreliable performance. The main findings were that: significant corrosion led to non-condensable gases, which in turn caused very poor performance; unwanted crystallisation caused blockages as well as inconsistent behaviour; and poor wetting of the heat exchangers resulted in relatively high temperature drops across them. A measured thermal COP for cooling of 0.46 was found, which is significantly lower than the theoretical value.

These findings resulted in a thorough redesign for the new prototype, called ClimateWell 10 (CW10), which was tested briefly by the authors at ClimateWell. The data set collected there was small, but sufficient to show that the machine worked consistently with no noticeable vacuum problems. It was also sufficient for identifying the main parameters of a simulation model developed for the TRNSYS simulation environment, but not enough to verify the model properly. This model was shown to be able to simulate the dynamic as well as the static performance of the CW10, and was then used in a series of system simulations.

A single system model was developed as the basis of the system simulations, consisting of a CW10 machine, 30 m² of flat-plate solar collectors with a backup boiler, and an office with a design cooling load in Stockholm of 50 W/m², resulting in a 7.5 kW design load for the 150 m² floor area. Two base cases were defined on this basis: one for Stockholm using a dry cooler with a design cooling rate of 30 kW, and one for Madrid with a cooling tower with a design cooling rate of 34 kW. A number of parametric studies were performed on these two base cases. These showed that the temperature lift is a limiting factor for cooling at higher ambient temperatures and for charging with a fixed-temperature source such as district heating. The simulated evacuated tube collector performs only marginally better than a good flat plate collector when compared on gross area, the margin being greater for larger solar fractions. For a 30 m² collector area, solar fractions of 49% and 67% were achieved for the Stockholm and Madrid base cases respectively. The average annual efficiency of the collector in Stockholm (12%) was much lower than that in Madrid (19%). The thermal COP was simulated to be approximately 0.70, but it has not been possible to verify this against measured data. The annual electrical COP was shown to be very dependent on the cooling load, as a large proportion of the electricity use is for components that are permanently on. For the cooling loads studied, the annual electrical COP ranged from 2.2 for a 2000 kWh cooling load to 18.0 for a 21000 kWh cooling load.

There is, however, potential to reduce the electricity consumption of the machine, which would improve these figures significantly. It was shown that a cooling tower is necessary for the Madrid climate, whereas a dry cooler is sufficient for Stockholm, although a cooling tower does improve performance there. The simulation study was relatively shallow and has identified a number of areas that are important to study in more depth. One such area is advanced control strategies, which are necessary to mitigate the weakness of the technology (low temperature lift for cooling) and to make optimal use of its strength (storage).
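As a rough illustration of the figures above, here is a minimal Python sketch; the function names are ours, and the parasitic electricity and annual energy totals used are invented assumptions, not values from the report.

    # Illustrative only: how the design load and the thermal/electrical COPs
    # quoted above relate to energy totals. The parasitic electricity and the
    # annual energy totals below are assumptions, not data from the report.

    def design_load_kw(specific_load_w_per_m2, floor_area_m2):
        """Design cooling load in kW, e.g. 50 W/m2 over 150 m2 gives 7.5 kW."""
        return specific_load_w_per_m2 * floor_area_m2 / 1000.0

    def thermal_cop(cooling_delivered_kwh, driving_heat_kwh):
        """Cooling delivered per unit of driving heat."""
        return cooling_delivered_kwh / driving_heat_kwh

    def electrical_cop(cooling_delivered_kwh, electricity_used_kwh):
        """Annual cooling delivered per unit of electricity used."""
        return cooling_delivered_kwh / electricity_used_kwh

    print(design_load_kw(50.0, 150.0))        # 7.5 kW, as in the base case
    print(thermal_cop(7000.0, 10000.0))       # 0.70 for assumed annual totals
    # A fixed always-on electricity use (assumed 900 kWh/a here) plus a small
    # load-proportional part makes the electrical COP rise steeply with the
    # annual cooling load, which is the trend reported above.
    for cooling_kwh in (2000.0, 21000.0):
        print(cooling_kwh, electrical_cop(cooling_kwh, 900.0 + 0.01 * cooling_kwh))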
Abstract:
Background: There is emerging evidence that the physical environment is important for health, quality of life and care, but there is a lack of valid instruments to assess health care environments. The Sheffield Care Environment Assessment Matrix (SCEAM), developed in the United Kingdom, provides a comprehensive assessment of the physical environment of residential care facilities for older people. This paper reports on the translation and adaptation of SCEAM for use in Swedish residential care facilities for older people, including information on its validity and reliability.

Methods: SCEAM was translated into Swedish and back-translated into English, and assessed for its relevance by experts using the content validity index (CVI) together with qualitative data. After modification, the validity assessments were repeated and followed by test-retest and inter-rater reliability tests in six units within a Swedish residential care facility that varied in terms of their environmental characteristics.

Results: Translation and back-translation identified linguistic and semantic issues. The first content validity analysis showed that more than one third of the items had item-CVI (I-CVI) values below the critical value of 0.78. After the instrument was modified, the second content validity analysis resulted in I-CVI scores above 0.78, the suggested criterion for excellent content validity. Test-retest reliability showed high stability (96% and 95% for the two independent raters, respectively), and inter-rater reliability demonstrated high levels of agreement (95% and 94% on the two separate rating occasions). Kappa values were very good for test-retest (κ = 0.903 and 0.869) and inter-rater reliability (κ = 0.851 and 0.832).

Conclusions: Adapting an instrument to a new national context is a complex and time-consuming process, requiring an understanding of the culture where the instrument was developed and where it is to be used. A team including the instrument's developers, translators, and researchers is necessary to ensure a valid translation and adaptation. This study provides preliminary validity and reliability evidence for the Swedish version (S-SCEAM) when used in a Swedish context. Further, we believe that the S-SCEAM has improved compared to the original instrument and suggest that it can be used as a foundation for future developments of the SCEAM model.
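As a pointer to how these indices are typically computed, here is a minimal Python sketch using the standard formulas for I-CVI and Cohen's kappa; the rating data are invented for illustration, not data from the S-SCEAM study.

    # Illustrative computation of I-CVI and Cohen's kappa (standard formulas);
    # the example ratings are made up, not data from the study above.
    from collections import Counter

    def i_cvi(ratings):
        """I-CVI: share of experts rating the item 3 or 4 on a 1-4 relevance scale."""
        return sum(1 for r in ratings if r >= 3) / len(ratings)

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters over the same categorical items."""
        n = len(rater_a)
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        p_e = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / n**2
        return (p_o - p_e) / (1 - p_e)

    print(i_cvi([4, 3, 4, 2, 4]))    # 0.8, above the 0.78 criterion cited above
    print(cohens_kappa([1, 1, 0, 1, 0, 1], [1, 1, 0, 0, 0, 1]))   # about 0.67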
Abstract:
The arrangement of storage bins (bin planning) is a key factor in the timber industry. Improper planning of the storage bins may lead to inefficient transportation of resources, which threatens overall efficiency and thereby limits the profit margins of sawmills. To address this challenge, a simulation model has been developed. However, as numerous alternatives are available for arranging the bins, simulating all possibilities would take an enormous amount of time and is computationally infeasible. A discrete-event simulation model incorporating meta-heuristic algorithms has therefore been investigated in this study. Preliminary investigations indicate that the results achieved by the GA-based simulation model are promising and better than those of the other meta-heuristic algorithm. Further, a sensitivity analysis has been carried out on the GA-based optimal arrangement, which contributes to gaining insights and knowledge about the real system and ultimately leads to improved efficiency in sawmill yards. It is expected that the results achieved in this work will support timber industries in making optimal decisions with respect to the arrangement of storage bins in a sawmill yard.
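As a rough illustration of how a genetic algorithm can be coupled to a simulation-based evaluation of bin arrangements, here is a minimal Python sketch; the permutation encoding, the GA parameters and the placeholder cost function are our assumptions and stand in for the authors' discrete-event simulation, which is not reproduced here.

    # Minimal GA over bin-arrangement permutations; the transport-cost function
    # below is a stand-in for the discrete-event simulation, not the real model.
    import random

    def simulated_transport_cost(arrangement, demand):
        # Placeholder objective: frequently handled timber classes should sit
        # in bins close to the sorting station (low index = close).
        return sum(pos * demand[cls] for pos, cls in enumerate(arrangement))

    def crossover(p1, p2):
        cut = random.randrange(1, len(p1))
        return p1[:cut] + [c for c in p2 if c not in p1[:cut]]

    def mutate(perm, rate=0.1):
        perm = perm[:]
        if random.random() < rate:
            i, j = random.sample(range(len(perm)), 2)
            perm[i], perm[j] = perm[j], perm[i]
        return perm

    def ga(demand, pop_size=30, generations=200):
        classes = list(demand)
        pop = [random.sample(classes, len(classes)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda a: simulated_transport_cost(a, demand))
            parents = pop[: pop_size // 2]
            pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                             for _ in range(pop_size - len(parents))]
        return min(pop, key=lambda a: simulated_transport_cost(a, demand))

    demand = {"class_%d" % i: random.randint(1, 100) for i in range(12)}
    print(ga(demand))   # arrangement with the lowest simulated transport cost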
Abstract:
The open provenance architecture (OPA) approach to the challenge was distinct in several regards. In particular, it is based on an open, well-defined data model and architecture, allowing different components of the challenge workflow to independently record documentation, and allowing the workflow to be executed in any environment. Another notable feature is that we distinguish between the data recorded about what has occurred, the process documentation, and the provenance of a data item, which is all that caused the data item to be as it is and is obtained as the result of a query over process documentation. This distinction allows us to tailor the system to separately best address the requirements of recording and of querying documentation. Other notable features include the explicit recording of causal relationships between both events and data items, an interaction-based world model, intensional definition of data items in queries rather than reliance on explicit naming mechanisms, and the styling of documentation to support non-functional application requirements such as reducing storage costs or ensuring privacy of data. In this paper we describe how each of these features aids us in answering the challenge provenance queries.
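A toy Python sketch of the distinction described above (not the OPA implementation): process documentation is recorded as causal relationships, and the provenance of a data item is the result of a query that traverses that documentation.

    # Toy illustration only: process documentation as recorded causal edges;
    # provenance of an item = everything that transitively caused it.
    from collections import defaultdict

    class ProcessDocumentation:
        def __init__(self):
            self.caused_by = defaultdict(set)   # effect -> set of direct causes

        def record(self, effect, cause):
            """Record one causal relationship between events/data items."""
            self.caused_by[effect].add(cause)

        def provenance(self, item):
            """Query: all that (transitively) caused `item` to be as it is."""
            seen, stack = set(), [item]
            while stack:
                for cause in self.caused_by[stack.pop()]:
                    if cause not in seen:
                        seen.add(cause)
                        stack.append(cause)
            return seen

    doc = ProcessDocumentation()
    doc.record("result.csv", "analysis_run_42")
    doc.record("analysis_run_42", "raw_data.dat")
    doc.record("analysis_run_42", "config.xml")
    print(doc.provenance("result.csv"))   # run, raw data and config are all causes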
Abstract:
I study the welfare cost of inflation and the effect on prices of a permanent increase in the interest rate. In the steady state, real money demand is homogeneous of degree one in income and its interest-rate elasticity is approximately equal to −1/2. Consumers are indifferent between an economy with 10% p.a. inflation and one with zero inflation if their income is 1% higher in the first economy. A permanent increase in the interest rate causes the price level to drop initially and inflation to adjust slowly to its steady-state level.
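For illustration, a money demand function consistent with the two steady-state properties stated above can be written as follows; the specific functional form is an assumption for exposition, not necessarily the paper's specification.

    % Illustrative functional form only (an assumption, not necessarily the
    % paper's model): real balances proportional to income y, elasticity -1/2
    % with respect to the nominal interest rate i.
    \[
      \frac{M}{P} \;=\; m(y, i) \;=\; A\, y\, i^{-1/2},
      \qquad
      \frac{\partial \ln m}{\partial \ln y} \;=\; 1,
      \qquad
      \frac{\partial \ln m}{\partial \ln i} \;=\; -\tfrac{1}{2}.
    \]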
Abstract:
In this work we focus on tests for the parameter of an endogenous variable in a weakly identified instrumental variable regression model. We propose a new unbiasedness restriction for the weighted average power (WAP) tests introduced by Moreira and Moreira (2013). This new boundary condition is motivated by score efficiency under strong identification. It allows the computational costs of WAP tests to be reduced by replacing the strongly unbiased condition. That restriction requires the test, under the null hypothesis, to be uncorrelated with a given statistic whose dimension equals the number of instruments. The newly proposed boundary condition only requires the test to be uncorrelated with a linear combination of that statistic. WAP tests under both restrictions are found to perform similarly numerically. We apply the different tests discussed to an empirical example: using data from Yogo (2004), we assess the effect of weak instruments on the estimation of the elasticity of intertemporal substitution in a CCAPM model.
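Schematically, the two restrictions can be contrasted as below; the notation is ours and only mirrors the verbal description above, not the paper's formal definitions.

    % Notation assumed for illustration: phi is the test, W the k-dimensional
    % statistic (k = number of instruments), a a fixed k-vector.
    \[
      \text{strongly unbiased condition:}\quad
      \operatorname{Cov}_{H_0}\!\bigl(\phi,\, W\bigr) = 0 \in \mathbb{R}^{k},
      \qquad
      \text{proposed condition:}\quad
      \operatorname{Cov}_{H_0}\!\bigl(\phi,\, a' W\bigr) = 0 \in \mathbb{R}.
    \]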
Abstract:
Organizations are complex systems. A conceptual model of the enterprise is needed that is: coherent (the distinguished aspect models constitute a logical and truly integral whole); comprehensive (all relevant issues are covered); consistent (the aspect models are free from contradictions or irregularities); concise (no superfluous matters are contained in it); and essential (it shows only the essence of the enterprise, i.e., the model abstracts from all realization and implementation issues). The world is in great need of transparency about the operation of all the systems we work with daily, ranging from domestic appliances to large societal institutions. In this context the field of enterprise ontology has emerged, with the aim of creating models that help to understand the essence of the construction and operation of complete systems; more specifically, of enterprises. Enterprise ontology is a way of looking through the distracting and confusing appearance of an enterprise right into its deep kernel, which, from the perspective of the system designer, provides the tools needed to design a successful system in a way that reflects the desires and needs of the workers of the enterprise. This project's context is the use of DEMO (Design and Engineering Methodology for Organizations) for (re)designing or (re)engineering an enterprise, namely a process of the construction department of a city hall; the lack of a well-founded theory about the construction and operation of this process was the motivation behind this work. The purpose of studying and applying the DEMO theory and method was to optimize the process, automating it as much as possible, while reducing paper use and the time spent between tasks, and providing a better service to citizens.
Abstract:
A study was conducted in a 1566 ha watershed situated in the Capivara River basin, municipality of Botucatu, São Paulo State, Brazil. This environment is fragile and can be subjected to different forms of negative impact, among them soil erosion by water. The main objective of the research was to develop a methodology for assessing soil erosion fragility at different positions in the watershed, using the geographic information system ILWIS version 3.3 for Windows. An impact model was created to generate the soil erosion fragility map, based on four indicators of fragility to water erosion: land use and cover, slope, percentage of fine sand in the soil, and accumulated water flow. Thematic maps were generated in a geographic information system (GIS) environment. First, all the variables except land use and cover were described by continuous numerical maps in a raster structure. The land use and cover map was also represented by numerical values, associated with the weights attributed to each class, derived from a pairwise comparison matrix using the analytic hierarchy process. A final field check was carried out to record evidence of erosive processes in the areas indicated as presenting the highest levels of fragility, i.e., sites with steep slopes, a high percentage of fine sand in the soil, a tendency to accumulate surface water flow, and pastureland. The methodology used to diagnose the environmental problems of the study area can be employed at places with similar relief, soil and climatic conditions.
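For illustration, class weights can be derived from a pairwise comparison matrix with the analytic hierarchy process as sketched below in Python; the comparison values are hypothetical, and the principal eigenvector is approximated by the common row-geometric-mean method rather than the exact eigenvector used in some AHP implementations.

    # AHP weights from a pairwise comparison matrix (row geometric mean
    # approximation of the principal eigenvector); the matrix is illustrative.
    from math import prod

    def ahp_weights(matrix):
        n = len(matrix)
        geo_means = [prod(row) ** (1.0 / n) for row in matrix]
        total = sum(geo_means)
        return [g / total for g in geo_means]

    # Hypothetical comparisons of three land use/cover classes on Saaty's 1-9
    # scale, e.g. pasture judged far more erosion-prone than forest (value 5).
    comparisons = [
        [1.0,   5.0, 3.0],     # pasture
        [1/5.0, 1.0, 1/2.0],   # forest
        [1/3.0, 2.0, 1.0],     # annual crops
    ]
    print(ahp_weights(comparisons))   # weights sum to 1; largest for pasture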
Abstract:
A mathematical model was developed in order to study the behavior of thermal stratification of the liquid in a typical storage tank with a porous medium. The model employs a transient stream function-vorticity formulation to predict the development of the stream function and temperature fields during a charging process. The parameters analyzed include the Biot, Darcy, Reynolds and Richardson numbers, as well as the position and thickness of the porous medium. The results show the influence of these physical parameters, which should be considered for a good design of thermally stratified storage tanks.
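For reference, the dimensionless groups named above are commonly defined as follows; the characteristic scales are assumptions here, and the paper's exact definitions may differ.

    % Common definitions; U, L and \Delta T are assumed characteristic velocity,
    % length and temperature-difference scales, K the permeability of the porous
    % medium, h the convective coefficient and k the thermal conductivity.
    \[
      \mathrm{Bi} = \frac{hL}{k}, \qquad
      \mathrm{Da} = \frac{K}{L^{2}}, \qquad
      \mathrm{Re} = \frac{\rho U L}{\mu}, \qquad
      \mathrm{Ri} = \frac{g \beta\, \Delta T\, L}{U^{2}} .
    \]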
Abstract:
Research in Requirements Engineering has been growing in the last few years. Researchers are concerned with a set of open issues such as: communication between the several user profiles involved in software engineering; scope definition; and volatility and traceability issues. To cope with these issues, a set of works has concentrated on (i) defining processes to collect clients' specifications in order to solve scope issues; (ii) defining models to represent requirements so as to address communication and traceability issues; and (iii) working on mechanisms and processes to be applied to requirements modeling in order to facilitate requirements evolution and maintenance, addressing volatility and traceability issues. We propose an iterative Model-Driven process to solve these issues, based on a double-layered CIM to communicate requirements-related knowledge to a wider range of stakeholders. We also present a tool to support the requirements engineer throughout the RE process. Finally, we present a case study to illustrate the benefits and usage of the process and the tool.
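As a loose illustration of a double-layered CIM with traceability between layers, here is a small Python sketch; the layer contents, class names and link semantics are our assumptions, not the authors' metamodel.

    # Toy illustration (not the proposed metamodel): a two-layer CIM in which
    # business-level elements trace to requirement-level elements, so changes
    # can be propagated across layers (traceability / volatility support).
    from dataclasses import dataclass, field

    @dataclass
    class BusinessProcess:          # upper CIM layer: business vocabulary
        name: str

    @dataclass
    class Requirement:              # lower CIM layer: requirements model
        identifier: str
        text: str
        realizes: list = field(default_factory=list)   # links to BusinessProcess

    def impacted_requirements(process, requirements):
        """Traceability query: which requirements realize a given business process?"""
        return [r for r in requirements if process in r.realizes]

    billing = BusinessProcess("Customer billing")
    reqs = [
        Requirement("R1", "The system shall issue monthly invoices.", [billing]),
        Requirement("R2", "The system shall log all user actions.", []),
    ]
    print([r.identifier for r in impacted_requirements(billing, reqs)])   # ['R1']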
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Several local factors that influence the healing process of replanted teeth have been investigated. However, it remains unclear how systemic alterations, such as diabetes mellitus, affect the prognosis of these cases. The purpose of this study was to evaluate the healing process of incisors of non-controlled diabetic rats replanted after storage in bovine long-shelf-life (UHT) whole milk. Thirty-two rats were randomly assigned to receive an endovenous injection of either citrate buffer solution (group I, control; n = 16) or streptozotocin dissolved in citrate buffer solution to induce diabetes (group II; n = 16). After confirmation of the diabetic status by analysis of the glycemic levels, the maxillary right incisor of each animal was extracted and immersed in milk for 60 min. The root canals of the teeth were then instrumented and filled with a calcium hydroxide-based dressing, and the teeth were replanted into their sockets. All animals received systemic antibiotics and were killed by anesthetic overdose 10 or 60 days after replantation. The specimens containing the replanted teeth were removed, fixed, decalcified, and embedded in paraffin. Semi-serial 6-μm-thick sections were obtained and stained with hematoxylin and eosin for histologic and histometric analyses. The results showed that the connective tissue adjacent to the root surface was less organized in the diabetic animals than in the control animals at both periods; the root dentin was less severely affected by root resorption in the diabetic rats; and there were no significant differences between the control and diabetic groups regarding the occurrence of replacement resorption and inflammatory resorption.
Abstract:
Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type have to be produced in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups, which are typically present in lot sizing problems, are relaxed, together with the integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. The results show that, by combining the problems and using an exact method, it is possible to obtain significant gains when compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
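For illustration, the pricing step commonly used in column generation for the cutting stock side is a knapsack over the dual prices of the restricted master problem, as sketched below in Python; the widths and dual values are invented, and the integrated lot-sizing master problem of the paper is not reproduced here.

    # Pricing subproblem of column generation for cutting stock: find a cutting
    # pattern whose reduced cost 1 - sum(dual_i * a_i) is most negative.
    # The widths and dual prices below are illustrative, not data from the paper.

    def best_pattern(roll_width, piece_widths, duals):
        """Unbounded knapsack: maximize total dual value packed into one roll."""
        n = len(piece_widths)
        value = [0.0] * (roll_width + 1)
        choice = [None] * (roll_width + 1)
        for w in range(1, roll_width + 1):
            for i in range(n):
                if piece_widths[i] <= w:
                    cand = value[w - piece_widths[i]] + duals[i]
                    if cand > value[w]:
                        value[w], choice[w] = cand, i
        # Recover the pattern (pieces of each type) from the recorded choices.
        pattern, w = [0] * n, roll_width
        while choice[w] is not None:
            pattern[choice[w]] += 1
            w -= piece_widths[choice[w]]
        reduced_cost = 1.0 - value[roll_width]
        return pattern, reduced_cost

    widths = [45, 36, 31]          # piece widths (illustrative)
    duals = [0.55, 0.45, 0.30]     # dual prices from the restricted master
    print(best_pattern(100, widths, duals))
    # -> pattern [2, 0, 0] with reduced cost about -0.10: negative, so this
    #    pattern would be added to the restricted master as a new column.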
Abstract:
An economic model including the labor resource and the process stage configuration is proposed to design g charts that allow all the design parameters to be varied in an adaptive way. A random shift size is considered during the economic design selection. The results obtained for a benchmark of 64 process stage scenarios show that the activities configuration and some process operating parameters influence the selection of the best control chart strategy. To model the random shift size, its exact distribution can be approximately fitted by a discrete distribution obtained from a relatively small sample of historical data; however, an accurate estimation of the inspection costs associated with the SPC activities is far from being achieved. An illustrative example shows the implementation of the proposed economic model in a real industrial case. (C) 2011 Elsevier B.V. All rights reserved.
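As a small illustration of approximating the shift size with a discrete distribution estimated from a modest historical sample, here is a minimal Python sketch; the sample values and the binning choice are invented assumptions.

    # Illustrative only: approximate the distribution of the process shift size
    # by the empirical discrete distribution of a small historical sample
    # (binned to one decimal place); the sample values are made up.
    from collections import Counter

    def empirical_shift_pmf(shift_samples, ndigits=1):
        binned = [round(s, ndigits) for s in shift_samples]
        counts = Counter(binned)
        n = len(binned)
        return {size: count / n for size, count in sorted(counts.items())}

    historical_shifts = [0.8, 1.2, 1.1, 0.8, 2.1, 1.2, 0.9, 1.2, 1.6, 0.8]
    print(empirical_shift_pmf(historical_shifts))
    # {0.8: 0.3, 0.9: 0.1, 1.1: 0.1, 1.2: 0.3, 1.6: 0.1, 2.1: 0.1}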