818 results for Efficient lighting
Abstract:
New ways of combining observations with numerical models are discussed in which the size of the state space can be very large and the model can be highly nonlinear. The observations of the system can also be related to the model variables in highly nonlinear ways, making this data-assimilation (or inverse) problem highly nonlinear. First we discuss the connection between data assimilation and inverse problems, including regularization. We then explore the choice of proposal density in a Particle Filter and show how the 'curse of dimensionality' might be beaten. In the standard Particle Filter, ensembles of model runs are propagated forward in time until observations are encountered, rendering it a pure Monte Carlo method. In high-dimensional systems this is very inefficient, and very large numbers of model runs are needed to solve the data-assimilation problem realistically. In our approach we steer all model runs towards the observations, resulting in a much more efficient method. By further ensuring 'almost equal weights' we avoid performing model runs that turn out to be useless. Results are shown for the 40- and 1000-dimensional Lorenz 1995 model.
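As a point of reference for the method summarised above, the sketch below implements the standard (bootstrap) particle filter on a Lorenz 1995/96-type model; the time step, observation operator and noise levels are illustrative assumptions, not the paper's configuration. The paper's contribution is precisely to replace this pure Monte Carlo proposal with one that steers the model runs towards the observations.

```python
import numpy as np

def lorenz96_step(x, dt=0.05, forcing=8.0):
    # One forward-Euler step of dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
    dxdt = (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing
    return x + dt * dxdt

def bootstrap_particle_filter(ensemble, observations, obs_indices, obs_std,
                              steps_between_obs=4, rng=None):
    """Propagate every particle with the model until an observation time,
    weight each particle by its likelihood, then resample. In high-dimensional
    systems the weights degenerate quickly, which is the inefficiency the
    paper addresses by steering particles towards the observations."""
    rng = rng or np.random.default_rng(0)
    ens = np.asarray(ensemble, dtype=float)          # shape (n_particles, n_state)
    n_particles = ens.shape[0]
    for y in observations:
        for _ in range(steps_between_obs):           # pure model propagation (the proposal)
            ens = np.apply_along_axis(lorenz96_step, 1, ens)
        innovations = y - ens[:, obs_indices]        # observed minus predicted
        log_w = -0.5 * np.sum((innovations / obs_std) ** 2, axis=1)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        ens = ens[rng.choice(n_particles, size=n_particles, p=w)]  # resampling
    return ens

# Illustrative use: 40-variable state, 100 particles, every second variable observed.
rng = np.random.default_rng(1)
ensemble = 8.0 + rng.normal(scale=1.0, size=(100, 40))
observations = rng.normal(loc=8.0, scale=1.0, size=(5, 20))   # placeholder observations
posterior = bootstrap_particle_filter(ensemble, observations,
                                      obs_indices=np.arange(0, 40, 2), obs_std=1.0)
```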
Abstract:
Aims: Therapeutic limbal epithelial stem cells could be managed more efficiently if clinically validated batches were transported for ‘on-demand’ use. Materials & methods: In this study, corneal epithelial cell viability in calcium alginate hydrogels was examined under cell culture, ambient and chilled conditions for up to 7 days. Results: Cell viability improved as gel internal pore size increased, and was further enhanced with modification of the gel from a mass to a thin disc. Ambient storage conditions were optimal for supporting cell viability in gel discs. Cell viability in gel discs was significantly enhanced with increases in pore size mediated by hydroxyethyl cellulose. Conclusion: Our novel methodology of controlling alginate gel shape and pore size together provides a more practical and economical alternative to established corneal tissue/cell storage methods.
Abstract:
Markowitz showed that assets can be combined to produce an 'Efficient' portfolio that will give the highest level of portfolio return for any level of portfolio risk, as measured by the variance or standard deviation. These portfolios can then be connected to generate what is termed an 'Efficient Frontier' (EF). In this paper we discuss the calculation of the Efficient Frontier for combinations of assets, again using the spreadsheet Optimiser. To illustrate the derivation of the Efficient Frontier, we use the data from the Investment Property Databank Long Term Index of Investment Returns for the period 1971 to 1993. Many investors might require a specific level of holding in, or a restriction on holdings of, at least some of the assets. Such additional constraints may be readily incorporated into the model to generate a constrained EF with upper and/or lower bounds. This can then be compared with the unconstrained EF to see whether the reduction in return is acceptable. To see the effect that these additional constraints may have, we adopt a fairly typical pension fund profile, with no more than 20% of the total held in Property. The paper shows that it is now relatively easy to use the Optimiser available in at least one spreadsheet (EXCEL) to calculate efficient portfolios for various levels of risk and return, both constrained and unconstrained, so as to be able to generate any number of Efficient Frontiers.
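To make the calculation concrete, here is a minimal sketch of how a constrained and an unconstrained efficient frontier might be computed programmatically, using scipy rather than the EXCEL Optimiser used in the paper; the expected returns, covariances and the identity of the three assets are illustrative assumptions, not the IPD index data.

```python
import numpy as np
from scipy.optimize import minimize

def efficient_frontier(mean_returns, cov, target_returns, upper_bounds=None):
    """For each target return, find the minimum-variance long-only portfolio
    whose weights sum to 1, optionally subject to per-asset upper bounds."""
    n = len(mean_returns)
    bounds = [(0.0, ub) for ub in (upper_bounds or [1.0] * n)]
    frontier = []
    for target in target_returns:
        constraints = [
            {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},                   # fully invested
            {"type": "eq", "fun": lambda w, t=target: w @ mean_returns - t},    # hit target return
        ]
        res = minimize(lambda w: w @ cov @ w,            # objective: portfolio variance
                       x0=np.full(n, 1.0 / n),
                       method="SLSQP", bounds=bounds, constraints=constraints)
        if res.success:
            frontier.append((target, np.sqrt(res.fun), res.x))  # (return, risk, weights)
    return frontier

# Illustrative annual figures for three assets, e.g. equities, gilts, property.
mu = np.array([0.12, 0.08, 0.10])
cov = np.array([[0.040, 0.006, 0.010],
                [0.006, 0.010, 0.004],
                [0.010, 0.004, 0.020]])
targets = np.linspace(mu.min(), mu.max(), 11)

unconstrained = efficient_frontier(mu, cov, targets)
constrained = efficient_frontier(mu, cov, targets, upper_bounds=[1.0, 1.0, 0.20])  # <= 20% property
```

Comparing the risk attached to each target return in the two frontiers shows directly how much return (or how much extra risk) the 20% property cap costs, which is the comparison the paper makes between the constrained and unconstrained EFs.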
Abstract:
This paper describes a simulation program, which uses Trengenza's average room illuminance method in conjunction with hourly solar irradiance and luminous efficacy, to predict the potential lighting energy saving for a side-lit room. Two lighting control algorithms, photoelectric switching (on/off) and photoelectric dimming (top-up), have been coded in the program. A simulation for a typical UK office room has been conducted, and the results show that the energy saving due to sunlight depends on various factors such as orientation, control method, building depth, glazing area and shading type. This simple tool can be used for estimating the potential lighting energy saving of windows with various shading devices at the early design stage.
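To illustrate the two control algorithms, the sketch below gives one plausible reading of photoelectric switching and top-up dimming; the setpoint, installed power and hourly daylight illuminances are illustrative assumptions, and in the paper's program the daylight values would come from Trengenza's average room illuminance method applied to hourly irradiance and luminous efficacy.

```python
def power_onoff(daylight_lux, setpoint_lux, installed_w):
    """Photoelectric switching: lights are off whenever daylight alone meets
    the design illuminance, otherwise they run at full installed power."""
    return 0.0 if daylight_lux >= setpoint_lux else installed_w

def power_dimming(daylight_lux, setpoint_lux, installed_w):
    """Photoelectric dimming (top-up): electric light output is reduced so that
    daylight plus electric light just meets the design illuminance."""
    if daylight_lux >= setpoint_lux:
        return 0.0
    return installed_w * (setpoint_lux - daylight_lux) / setpoint_lux

# Illustrative hourly daylight illuminances (lux) on the working plane for one day.
daylight = [0, 120, 380, 750, 900, 640, 260, 40]
setpoint, installed = 500.0, 120.0          # assumed design illuminance (lux) and load (W)

baseline = installed * len(daylight)        # lights fully on all occupied hours
onoff = sum(power_onoff(e, setpoint, installed) for e in daylight)
dimming = sum(power_dimming(e, setpoint, installed) for e in daylight)
print(f"saving with on/off control:  {100 * (1 - onoff / baseline):.0f}%")
print(f"saving with dimming control: {100 * (1 - dimming / baseline):.0f}%")
```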
Abstract:
Ulrike Heuer argues that there can be a reason for a person to perform an action that this person cannot perform, as long as this person can take efficient steps towards performing this action. In this reply, I first argue that Heuer’s examples fail to undermine my claim that there cannot be a reason for a person to perform an action if it is impossible that this person will perform this action. I then argue that, on a plausible interpretation of what ‘efficient steps’ are, Heuer’s claim is consistent with my claim. I end by showing that Heuer fails to undermine the arguments I gave for my claim.
Abstract:
This chapter covers the basic concepts of passive building design and its relevant strategies, including passive solar heating, shading, natural ventilation, daylighting and thermal mass. In environments with high seasonal peak temperatures and/or humidity (e.g. cities in temperate regions experiencing the Urban Heat Island effect), wholly passive measures may need to be supplemented with low and zero carbon technologies (LZCs). The chapter also includes three case studies: one residential, one demonstrational and one academic facility (that includes an innovative passive downdraught cooling (PDC) strategy) to illustrate a selection of passive measures.
Abstract:
Lighting and small power will typically account for more than half of the total electricity consumption in an office building. Large variations in the electricity used by different tenants suggest that occupants can have a significant impact on the electricity demand for these end-uses. Yet current modelling techniques fail to represent the interaction between occupants and the building environment in a realistic manner. Understanding the impact of such behaviours is crucial to improving the methodology behind current energy modelling techniques, with the aim of minimising the significant gap between the predicted and in-use performance of buildings. A better understanding of the impact of occupant behaviour on electricity consumption can also inform appropriate energy saving strategies focused on behavioural change. This paper reports on a study assessing the intent of occupants in office buildings to switch off lighting and appliances when not in use. Based on the Theory of Planned Behaviour, the assessment takes the form of a questionnaire and investigates three predictors of behaviour individually: 1) behavioural attitude; 2) subjective norms; 3) perceived behavioural control. The paper details the development of the assessment procedure and discusses preliminary findings from the study. The questionnaire results are compared against electricity consumption data for individual zones within a multi-tenanted office building. Initial results demonstrate a statistically significant correlation between perceived behavioural control and energy consumption for lighting and small power.
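By way of illustration of the comparison described above, the snippet below sketches how zone-level predictor scores might be correlated with metered consumption; the scores, zone count and consumption figures are invented placeholders, not the study's data.

```python
from scipy import stats

# Placeholder values: mean questionnaire score per zone for each Theory of
# Planned Behaviour predictor, alongside metered consumption (kWh/m2) per zone.
predictor_scores = {
    "behavioural attitude":          [4.1, 3.8, 4.5, 3.2, 4.0, 3.6],
    "subjective norms":              [3.9, 3.5, 4.2, 3.0, 3.7, 3.4],
    "perceived behavioural control": [4.3, 3.1, 4.6, 2.8, 3.9, 3.3],
}
consumption = [38.0, 51.5, 35.2, 55.8, 41.0, 49.3]

for predictor, scores in predictor_scores.items():
    r, p = stats.pearsonr(scores, consumption)   # correlation of scores with consumption
    print(f"{predictor}: r = {r:.2f}, p = {p:.3f}")
```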
Abstract:
There is growing pressure on the construction industry to deliver energy efficient, sustainable buildings but there is evidence to suggest that, in practice, designs regularly fail to achieve the anticipated levels of in-use energy consumption. One of the key factors behind this discrepancy is the behaviour of the building occupants. This paper explores how insights from experimental psychology could potentially be used to reduce the gap between the predicted and actual energy performance of buildings. It demonstrates why traditional methods to engage with the occupants are not always successful and proposes a model for a more holistic approach to this issue. The paper concludes that achieving energy efficiency in buildings is not solely a technological issue and that the construction industry needs to adopt a more user-centred approach.
Abstract:
Foot-and-mouth disease virus (FMDV) is an economically significant and globally distributed pathogen of Artiodactyla. Current vaccines are chemically inactivated whole virus particles that require large-scale virus growth in strict bio-containment, with the associated risks of accidental release or incomplete inactivation. Non-infectious empty capsids are structural mimics of authentic particles with no associated risk and constitute an alternative vaccine candidate. Capsids self-assemble from the processed virus structural proteins, VP0, VP3 and VP1, which are released from the structural protein precursor P1-2A by the action of the virus-encoded 3C protease. To date, recombinant empty capsid assembly has been limited by poor expression levels, restricting the development of empty capsids as a viable vaccine. Here, expression of the FMDV structural protein precursor P1-2A in insect cells is shown to be efficient, but linkage of the cognate 3C protease to the C-terminus reduces expression significantly. Inactivation of the 3C enzyme in a P1-2A-3C cassette allows expression, and intermediate levels of 3C activity result in efficient processing of the P1-2A precursor into the structural proteins, which assemble into empty capsids. Expression was independent of the insect host cell background and led to capsids that are recognised as authentic by a range of anti-FMDV bovine sera, suggesting their feasibility as an alternative vaccine.
Abstract:
Top Down Induction of Decision Trees (TDIDT) is the most commonly used method of constructing a model, in the form of classification rules, from a dataset in order to classify previously unseen data. Alternative algorithms have been developed, such as the Prism algorithm. Prism constructs modular rules which are qualitatively better than the rules induced by TDIDT. However, as databases grow in size, many existing rule learning algorithms have proved to be computationally expensive on large datasets. To tackle the problem of scalability, parallel classification rule induction algorithms have been introduced. As TDIDT is the most popular classifier, even though there are strongly competitive alternative algorithms, most parallel approaches to inducing classification rules are based on it. In this paper we describe work on a distributed classifier, based on Prism, that induces classification rules in a parallel manner.
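As background to the distributed classifier, the following is a minimal, hypothetical sketch of sequential Prism-style rule induction for categorical attributes (tie-breaking and pruning are omitted); it is not the parallel or distributed implementation described in the paper.

```python
def prism(instances, classes=None):
    """Sequential Prism-style rule induction.
    instances: list of (attribute_dict, class_label) pairs with categorical values.
    Returns a list of (class_label, {attribute: value}) modular rules."""
    classes = classes or {c for _, c in instances}
    rules = []
    for target in classes:
        remaining = list(instances)
        while any(c == target for _, c in remaining):        # uncovered instances of this class remain
            covered, rule = list(remaining), {}
            while any(c != target for _, c in covered):      # rule is not yet perfect
                best, best_prob = None, -1.0
                candidates = {(a, x[a]) for x, _ in covered for a in x if a not in rule}
                for attr, val in candidates:                  # choose the pair maximising P(class | pair)
                    subset = [(x, c) for x, c in covered if x[attr] == val]
                    prob = sum(c == target for _, c in subset) / len(subset)
                    if prob > best_prob:
                        best, best_prob = (attr, val), prob
                if best is None:                              # no attributes left to specialise on
                    break
                rule[best[0]] = best[1]
                covered = [(x, c) for x, c in covered if x[best[0]] == best[1]]
            rules.append((target, dict(rule)))
            remaining = [(x, c) for x, c in remaining         # remove instances the new rule covers
                         if not all(x.get(a) == v for a, v in rule.items())]
    return rules

# Illustrative use with two categorical attributes (hypothetical data).
data = [({"outlook": "sunny", "windy": "no"}, "play"),
        ({"outlook": "sunny", "windy": "yes"}, "dont"),
        ({"outlook": "rain",  "windy": "no"}, "play")]
print(prism(data))
```

Because each class is induced independently, and the scan over candidate attribute-value pairs is itself data-parallel, Prism lends itself naturally to the kind of distributed rule induction the paper describes.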