814 results for Efficient Frontier
Abstract:
An approach to incorporate spatial dependence into stochastic frontier analysis is developed and applied to a sample of 215 dairy farms in England and Wales. A number of alternative specifications for the spatial weight matrix are used to analyse their effect on the estimation of spatial dependence. Estimation is conducted using a Bayesian approach, and the results indicate that spatial dependence is present when explaining technical inefficiency.
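As an illustration of the kind of choice being compared, below is a minimal sketch of two common spatial weight matrix specifications, k-nearest-neighbour and inverse-distance, both row-standardised. The function names and parameters are assumptions for illustration only; the paper's actual specifications and its Bayesian estimation procedure are not reproduced here.

```python
import numpy as np

def knn_weights(coords, k=5):
    """Row-standardised k-nearest-neighbour spatial weight matrix.

    coords: (n, 2) array of unit locations. Illustrative only.
    """
    n = len(coords)
    # Pairwise Euclidean distances between all units
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    W = np.zeros((n, n))
    for i in range(n):
        # Indices of the k nearest neighbours, excluding unit i itself
        nbrs = np.argsort(d[i])[1:k + 1]
        W[i, nbrs] = 1.0
    return W / W.sum(axis=1, keepdims=True)  # each row sums to one

def inverse_distance_weights(coords, cutoff=np.inf):
    """Row-standardised inverse-distance weights with an optional cutoff.

    Assumes every unit has at least one neighbour within the cutoff.
    """
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    with np.errstate(divide="ignore"):
        W = np.where((d > 0) & (d <= cutoff), 1.0 / d, 0.0)
    return W / W.sum(axis=1, keepdims=True)
```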
Abstract:
Aims: Therapeutic limbal epithelial stem cells could be managed more efficiently if clinically validated batches were transported for ‘on-demand’ use. Materials & methods: In this study, corneal epithelial cell viability in calcium alginate hydrogels was examined under cell culture, ambient and chilled conditions for up to 7 days. Results: Cell viability improved as gel internal pore size increased, and was further enhanced with modification of the gel from a mass to a thin disc. Ambient storage conditions were optimal for supporting cell viability in gel discs. Cell viability in gel discs was significantly enhanced with increases in pore size mediated by hydroxyethyl cellulose. Conclusion: Our novel methodology of controlling alginate gel shape and pore size together provides a more practical and economical alternative to established corneal tissue/cell storage methods.
Abstract:
Ulrike Heuer argues that there can be a reason for a person to perform an action that this person cannot perform, as long as this person can take efficient steps towards performing this action. In this reply, I first argue that Heuer’s examples fail to undermine my claim that there cannot be a reason for a person to perform an action if it is impossible that this person will perform this action. I then argue that, on a plausible interpretation of what ‘efficient steps’ are, Heuer’s claim is consistent with my claim. I end by showing that Heuer fails to undermine the arguments I gave for my claim.
Abstract:
This chapter covers the basic concepts of passive building design and its relevant strategies, including passive solar heating, shading, natural ventilation, daylighting and thermal mass. In environments with high seasonal peak temperatures and/or humidity (e.g. cities in temperate regions experiencing the Urban Heat Island effect), wholly passive measures may need to be supplemented with low and zero carbon technologies (LZCs). The chapter also includes three case studies: one residential, one demonstrational and one academic facility (that includes an innovative passive downdraught cooling (PDC) strategy) to illustrate a selection of passive measures.
Abstract:
There is growing pressure on the construction industry to deliver energy efficient, sustainable buildings but there is evidence to suggest that, in practice, designs regularly fail to achieve the anticipated levels of in-use energy consumption. One of the key factors behind this discrepancy is the behaviour of the building occupants. This paper explores how insights from experimental psychology could potentially be used to reduce the gap between the predicted and actual energy performance of buildings. It demonstrates why traditional methods to engage with the occupants are not always successful and proposes a model for a more holistic approach to this issue. The paper concludes that achieving energy efficiency in buildings is not solely a technological issue and that the construction industry needs to adopt a more user-centred approach.
Abstract:
Foot-and-mouth disease virus (FMDV) is an economically significant and globally distributed pathogen of Artiodactyla. Current vaccines are chemically inactivated whole virus particles that require large-scale virus growth in strict bio-containment, with the associated risks of accidental release or incomplete inactivation. Non-infectious empty capsids are structural mimics of authentic particles with no associated risk and constitute an alternative vaccine candidate. Capsids self-assemble from the processed virus structural proteins, VP0, VP3 and VP1, which are released from the structural protein precursor P1-2A by the action of the virus-encoded 3C protease. To date, recombinant empty capsid assembly has been limited by poor expression levels, restricting the development of empty capsids as a viable vaccine. Here, expression of the FMDV structural protein precursor P1-2A in insect cells is shown to be efficient, but linkage of the cognate 3C protease to the C-terminus significantly reduces expression. Inactivation of the 3C enzyme in a P1-2A-3C cassette allows expression, and intermediate levels of 3C activity result in efficient processing of the P1-2A precursor into the structural proteins, which assemble into empty capsids. Expression is independent of the insect host cell background and yields capsids that are recognised as authentic by a range of anti-FMDV bovine sera, suggesting their feasibility as an alternative vaccine.
Abstract:
Top Down Induction of Decision Trees (TDIDT) is the most commonly used method of constructing, from a dataset, a model in the form of classification rules for classifying previously unseen data. Alternative algorithms have been developed, such as the Prism algorithm. Prism constructs modular rules that are qualitatively better than those induced by TDIDT. However, as databases grow in size, many existing rule learning algorithms have proved to be computationally expensive on large datasets. To tackle the problem of scalability, parallel classification rule induction algorithms have been introduced. As TDIDT is the most popular classifier, even though there are strongly competitive alternative algorithms, most parallel approaches to inducing classification rules are based on TDIDT. In this paper we describe work on a distributed classifier that induces classification rules in a parallel manner based on Prism.
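For readers unfamiliar with Prism, the following is a minimal sequential sketch of its covering ('separate and conquer') loop, assuming categorical attributes: grow each rule one attribute-value term at a time, always adding the term with the highest probability of the target class, then remove the covered instances and repeat. This is a simplified reading of the published algorithm (no pruning or coverage-based tie-breaking), with illustrative names, and is not the distributed classifier described in the paper.

```python
def induce_prism_rules(instances, target_class):
    """Simplified sketch of Prism's covering ('separate and conquer') loop.

    instances: list of (attrs, label) pairs, where attrs is a dict
    mapping attribute -> value. Illustrative only.
    """
    rules, remaining = [], list(instances)
    while any(label == target_class for _, label in remaining):
        rule, covered = [], remaining
        # Conquer: specialise until the rule covers only the target class
        while any(label != target_class for _, label in covered):
            used = {a for a, _ in rule}  # each attribute used at most once
            best, best_p = None, -1.0
            for a, v in {(a, v) for attrs, _ in covered
                         for a, v in attrs.items() if a not in used}:
                subset = [label for attrs, label in covered
                          if attrs.get(a) == v]
                p = sum(l == target_class for l in subset) / len(subset)
                if p > best_p:
                    best, best_p = (a, v), p
            if best is None:
                break  # duplicate/noisy instances: cannot specialise further
            rule.append(best)
            covered = [(attrs, label) for attrs, label in covered
                       if attrs.get(best[0]) == best[1]]
        rules.append(rule)
        # Separate: remove the instances the new rule covers
        remaining = [(attrs, label) for attrs, label in remaining
                     if not all(attrs.get(a) == v for a, v in rule)]
    return rules
```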
Abstract:
Induction of classification rules is one of the most important technologies in data mining. Most of the work in this field has concentrated on the Top Down Induction of Decision Trees (TDIDT) approach. However, alternative approaches have been developed, such as the Prism algorithm for inducing modular rules. Prism often produces qualitatively better rules than TDIDT but suffers from higher computational requirements. We investigate techniques that have been developed to reduce the computational requirements of TDIDT, in order to identify analogous optimisations that could reduce those of Prism.
Abstract:
In order to gain knowledge from large databases, scalable data mining technologies are needed. Data are captured on a large scale, so databases are growing at a fast pace, which drives the use of parallel computing technologies to cope with large amounts of data. In the area of classification rule induction, parallelisation efforts have focused on the divide-and-conquer approach, also known as Top Down Induction of Decision Trees (TDIDT). An alternative approach to classification rule induction is separate and conquer, which has only recently become a focus of parallelisation. This work introduces and empirically evaluates a framework for the parallel induction of classification rules generated by members of the Prism family of algorithms, all of which follow the separate and conquer approach.
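Purely to illustrate where the parallelism can sit in a separate-and-conquer learner, the sketch below distributes the evaluation of candidate attribute-value terms, the step that dominates the covering loop, across worker processes. This is a toy sketch under assumed names, not the framework introduced in the paper.

```python
from concurrent.futures import ProcessPoolExecutor

def term_probability(args):
    """Score one candidate term: P(target_class | attribute == value)."""
    (attribute, value), covered, target_class = args
    subset = [label for attrs, label in covered
              if attrs.get(attribute) == value]
    p = sum(l == target_class for l in subset) / len(subset) if subset else 0.0
    return (attribute, value), p

def best_term_parallel(candidates, covered, target_class, workers=4):
    """Evaluate candidate terms in parallel and keep the best one.

    Assumes `candidates` is an iterable of (attribute, value) pairs and
    `covered` the list of (attrs, label) instances the rule currently
    covers. Each job carries its own copy of the data, which is the
    communication cost any real framework has to manage.
    """
    jobs = [(term, covered, target_class) for term in candidates]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        scored = list(pool.map(term_probability, jobs))
    return max(scored, key=lambda kv: kv[1])[0]
```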
Abstract:
Classifiers generally tend to overfit when the training data are noisy or contain missing values. Ensemble learning methods are often used to improve a classifier's classification accuracy, and most ensemble learning approaches aim to improve the accuracy of decision trees. However, alternative classifiers to decision trees exist. The recently developed Random Prism ensemble learner aims to improve an alternative classification rule induction approach, the Prism family of algorithms, which addresses some of the limitations of decision trees. Like any ensemble learner, however, Random Prism suffers from a high computational overhead due to replication of the data and the induction of multiple base classifiers. Hence even modest-sized datasets may pose a computational challenge to ensemble learners such as Random Prism. Parallelism is often used to scale up algorithms to deal with large datasets. This paper investigates parallelisation for Random Prism, implements a prototype and evaluates it empirically using a Hadoop computing cluster.
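To make the replication overhead concrete, here is a bagging-style sketch of the ensemble loop: each base classifier is trained on a bootstrap sample of the data and predictions are combined by majority vote. It reuses the induce_prism_rules sketch above, uses local processes where the paper uses a Hadoop cluster, and omits parts of Random Prism (such as its randomised feature handling), so it stands in for the idea rather than reproducing the published learner.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def train_base_classifier(args):
    """Train one base learner on a bootstrap sample of the data."""
    instances, target_class, seed = args
    rng = random.Random(seed)
    # Sample with replacement: this replication is the overhead in question
    sample = [rng.choice(instances) for _ in instances]
    return induce_prism_rules(sample, target_class)  # sketch from above

def bagged_ensemble(instances, target_class, n_estimators=10, workers=4):
    """Train the base classifiers in parallel across worker processes."""
    jobs = [(instances, target_class, seed) for seed in range(n_estimators)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(train_base_classifier, jobs))

def predict(ensemble, attrs, target_class):
    """Majority vote: predict the target class if most base learners fire."""
    votes = sum(any(all(attrs.get(a) == v for a, v in rule) for rule in rules)
                for rules in ensemble)
    return target_class if votes > len(ensemble) / 2 else "other"
```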