930 results for Efficient Production Scale
Abstract:
Performance indicators in the public sector have often been criticised for being inadequate and not conducive to analysing efficiency. The main objective of this study is to use data envelopment analysis (DEA) to examine the relative efficiency of Australian universities. Three performance models are developed, namely, overall performance, performance on delivery of educational services, and performance on fee-paying enrolments. The findings based on 1995 data show that the university sector was performing well on technical and scale efficiency but there was room for improving performance on fee-paying enrolments. There were also small slacks in input utilisation. More universities were operating at decreasing returns to scale, indicating a potential to downsize. DEA helps in identifying the reference sets for inefficient institutions and objectively determines productivity improvements. As such, it can be a valuable benchmarking tool for educational administrators and assist in more efficient allocation of scarce resources. In the absence of market mechanisms to price educational outputs, which renders traditional production or cost functions inappropriate, universities are particularly obliged to seek alternative efficiency analysis methods such as DEA.
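As a hedged illustration of the kind of linear program that underlies a DEA efficiency score (this sketch is not taken from the study; the data, the choice of inputs and outputs, and the identifiers X, Y and ccr_efficiency are hypothetical), the input-oriented, constant-returns CCR multiplier model can be solved per institution as follows:

# Minimal sketch of an input-oriented CCR (constant-returns) DEA model,
# solved as a linear program with scipy. All figures are hypothetical.
import numpy as np
from scipy.optimize import linprog

# Rows = decision-making units (e.g. universities), columns = measures.
X = np.array([[50.0, 200.0],    # inputs: staff, expenditure
              [60.0, 180.0],
              [55.0, 220.0]])
Y = np.array([[300.0, 40.0],    # outputs: graduates, fee-paying enrolments
              [280.0, 55.0],
              [310.0, 35.0]])

def ccr_efficiency(k):
    """Efficiency of unit k: max u.y_k  s.t.  v.x_k = 1,  u.Y_j - v.X_j <= 0."""
    n_out, n_in = Y.shape[1], X.shape[1]
    # Decision vector z = [u (output weights), v (input weights)].
    c = np.concatenate([-Y[k], np.zeros(n_in)])            # maximise u.y_k
    A_ub = np.hstack([Y, -X])                              # u.Y_j - v.X_j <= 0
    b_ub = np.zeros(X.shape[0])
    A_eq = np.concatenate([np.zeros(n_out), X[k]])[None]   # v.x_k = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n_out + n_in))
    return -res.fun    # efficiency score in (0, 1]

for k in range(X.shape[0]):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")

Units scoring below 1 are dominated by a convex combination of their peers (their reference set); comparing such constant-returns scores with a variable-returns variant is what separates technical from scale efficiency.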
Abstract:
TRANSCREA, Converting research and knowledge into innovation, intellectual and industrial property. Terceira, 16 and 17 February 2011.
Abstract:
Dissertation submitted for the degree of Master in Chemical Engineering
Abstract:
We focus on large-scale and dense deeply embedded systems where, due to the large amount of information generated by all nodes, even simple aggregate computations such as the minimum value (MIN) of the sensor readings become notoriously expensive to obtain. Recent research has exploited a dominance-based medium access control (MAC) protocol, the CAN bus, for computing aggregated quantities in wired systems. For example, MIN can be computed efficiently, and an interpolation function which approximates sensor data in an area can be obtained efficiently as well. Dominance-based MAC protocols have recently been proposed for wireless channels, and these protocols can be expected to be used for achieving highly scalable aggregate computations in wireless systems. However, no experimental demonstration is currently available in the research literature. In this paper, we demonstrate that highly scalable aggregate computations in wireless networks are possible. We do so by (i) building a new wireless hardware platform with appropriate characteristics for making dominance-based MAC protocols efficient, (ii) implementing dominance-based MAC protocols on this platform, (iii) implementing distributed algorithms for aggregate computations (MIN, MAX, interpolation) using the new implementation of the dominance-based MAC protocol, and (iv) performing experiments to prove that such highly scalable aggregate computations in wireless networks are possible.
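As a hedged sketch of the core idea, not the authors' implementation: in a dominance-based (CAN-like) MAC, every node transmits its reading bit by bit, most significant bit first; a dominant '0' overwrites a recessive '1' on the shared channel, and a node that hears a dominant bit while sending a recessive one withdraws, so the bit pattern left on the channel is the network-wide minimum. A toy software simulation (hypothetical values and function name) of that arbitration:

# Toy simulation of MIN aggregation over a dominance-based (CAN-like) MAC.
# Nodes transmit readings MSB-first; a '0' bit is dominant, '1' recessive.
# A node that sends '1' but observes '0' on the channel backs off.
def dominance_min(readings, bits=16):
    active = list(readings)           # nodes still contending
    result = 0
    for i in reversed(range(bits)):   # most significant bit first
        sent = [(v >> i) & 1 for v in active]
        channel = 0 if 0 in sent else 1    # wired-AND: any dominant 0 wins
        result = (result << 1) | channel
        # Nodes whose recessive bit lost the arbitration withdraw.
        active = [v for v, b in zip(active, sent) if b == channel]
    return result

readings = [413, 97, 250, 97, 1023]
assert dominance_min(readings) == min(readings)
print(dominance_min(readings))    # 97, obtained in `bits` channel rounds

The appeal for dense networks is that the number of channel rounds depends only on the bit width of the readings, not on the number of nodes.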
Abstract:
Dissertation submitted for the degree of Master in Chemical and Biochemical Engineering
Abstract:
[Excerpt] Lignocellulosic plant biomass is being envisioned by the biorefinery industry as an alternative to the current petroleum platform because of its large-scale availability, low cost and environmentally benign production. The industrial bioprocesses designed to transform lignocellulosic biomass into biofuels are harsh, and the enzymatic reactions may be severely compromised, reducing the production of fermentable sugars from lignocellulosic biomass. Thermophilic bacterial consortia are a potential source of cellulases and hemicellulases adapted to extreme environmental conditions, which can be exploited as a new source for the development of more robust enzymatic cocktails. (...)
Abstract:
In microeconomic analysis, functions with diminishing returns to scale (DRS) have frequently been employed. Various properties of increasing quasiconcave aggregator functions with DRS are derived. Furthermore, duality in the classical sense, as well as of a new type, is studied for such aggregator functions in production and consumer theory. In particular, representation theorems for direct and indirect aggregator functions are obtained. These involve only small sets of generator functions. The study is carried out in the contemporary framework of abstract convexity and abstract concavity.
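As a hedged, illustrative example (chosen for exposition, not taken from the paper), a familiar instance of an increasing quasiconcave aggregator with diminishing returns to scale is a Cobb-Douglas function whose exponents sum to less than one:

% Illustrative DRS aggregator (assumption, not from the paper)
\[
  f(x_1, x_2) = x_1^{\alpha} x_2^{\beta}, \qquad \alpha, \beta > 0, \quad \alpha + \beta < 1,
\]
\[
  f(\lambda x_1, \lambda x_2) = \lambda^{\alpha + \beta} f(x_1, x_2) < \lambda\, f(x_1, x_2)
  \quad \text{for all } \lambda > 1,
\]

so scaling all inputs up by a factor lambda raises output by strictly less than lambda, which is the DRS property the paper works with.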
Abstract:
We characterize the sharing rule for which a contribution mechanism achieves efficiency in a cooperative production setting when agents are heterogeneous. The sharing rule bears no resemblance to those considered in the previous literature. We also show, for a large class of sharing rules, that if Nash equilibrium yields efficient allocations, the production function displays constant returns to scale, a case in which cooperation in production is useless.
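As a hedged, textbook-style illustration of why constant returns to scale is the knife-edge case (the proportional sharing rule and linear technology below are assumptions for exposition, not the rule characterized in the paper): with a linear production function and proportional output sharing, each agent's share depends only on her own contribution, so private and social marginal returns coincide and the Nash first-order conditions match the efficiency conditions,

% Illustrative special case (assumption, not the paper's sharing rule)
\[
  f(X) = aX, \quad X = \sum_j x_j, \qquad
  s_i = \frac{x_i}{X}\, f(X) = a\, x_i, \qquad
  \frac{\partial s_i}{\partial x_i} = a = f'(X).
\]

Because each agent then earns exactly what she could produce alone, pooling contributions adds nothing, which is the sense in which cooperation in production is useless under constant returns.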
Abstract:
Modern dietary habits are characterized by high sodium and low potassium intakes, each of which is correlated with a higher risk of hypertension. In this study, we examined whether long-term variations in the intake of sodium and potassium induce lasting changes in the plasma concentration of circulating steroids by developing a mathematical model of steroidogenesis in mice. One finding of this model was that mice increase their plasma progesterone levels specifically in response to potassium depletion. This prediction was confirmed by measurements in both male mice and men. Further investigation showed that progesterone regulates renal potassium handling in both males and females under potassium restriction, independent of its role in reproduction. The increase in progesterone production by male mice was time dependent and correlated with decreased urinary potassium content. The progesterone-dependent ability to efficiently retain potassium was due to an RU486-sensitive (RU486 is a progesterone receptor antagonist) stimulation of the colonic H+,K+-ATPase (known as the non-gastric or type 2 H+,K+-ATPase) in the kidney. Thus, in males, a specific progesterone concentration profile induced by chronic potassium restriction regulates potassium balance.
Abstract:
Many research projects in life sciences require purified biologically active recombinant protein. In addition, different formats of a given protein may be needed at different steps of experimental studies. Thus, the number of protein variants to be expressed and purified in short periods of time can expand very quickly. We have therefore developed a rapid and flexible expression system based on described episomal vector replication to generate semi-stable cell pools that secrete recombinant proteins. We cultured these pools in serum-containing medium to avoid time-consuming adaptation of cells to serum-free conditions, maintain cell viability and reuse the cultures for multiple rounds of protein production. Accordingly, an efficient single-step affinity process to purify recombinant proteins from serum-containing medium was optimized. Furthermore, a series of multi-cistronic vectors were designed to enable simultaneous expression of proteins and their biotinylation in vivo as well as fast selection of protein-expressing cell pools. Combining these improved procedures and innovative steps, exemplified with seven cytokines and cytokine receptors, we were able to produce biologically active, endotoxin-free recombinant protein at the milligram scale in 4-6 weeks from molecular cloning to protein purification.
Abstract:
This publication was prepared to describe how the Iowa State University distillery has been operating, including information on distillery size, equipment, tanks, condenser, heat exchanger, pumps and the process. Photos and diagrams are also included.
Abstract:
Background: Enzymatic biodiesel is becoming an increasingly popular topic in the bioenergy literature because of its potential to overcome the problems posed by chemical processes. However, the high cost of the enzymatic process still remains the main drawback for its industrial application, mostly because of the high price of refined oils. Unfortunately, low-cost substrates, such as crude soybean oil, often yield a product that hardly meets the final required biodiesel specifications and needs an additional pretreatment for gum removal. In order to reduce costs and to make the enzymatic process more efficient, we developed an innovative system for enzymatic biodiesel production involving a combination of a lipase and two phospholipases. This allows performing the enzymatic degumming and transesterification in a single step, using crude soybean oil as feedstock, and converting part of the phospholipids into biodiesel. Since the two processes have never been studied together, an accurate analysis of the different reaction components and conditions was carried out.
Results: Crude soybean oil, used as a low-cost feedstock, is characterized by a high content of phospholipids (900 ppm of phosphorus). However, after the combined activity of different phospholipases and the liquid lipase Callera Trans L, a complete transformation into fatty acid methyl esters (FAMEs >95%) and a good reduction of phosphorus (P <5 ppm) were achieved. The combination of enzymes made it possible to avoid the acid treatment required for gum removal, the consequent caustic neutralization, and the high temperature commonly used in degumming systems, making the overall process more eco-friendly and higher yielding. Once the conditions were established, the process was also tested with different vegetable oils with variable phosphorus contents.
Conclusions: Use of the liquid lipase Callera Trans L in biodiesel production can provide numerous and sustainable benefits. Besides reducing the costs derived from enzyme immobilization, the lipase can be used in combination with other enzymes such as phospholipases for gum removal, thus allowing the use of much cheaper, non-refined oils. The possibility of performing degumming and transesterification in a single tank brings a great efficiency increase in the new era of enzymatic biodiesel production at industrial scale.
Abstract:
A crucial step for understanding how lexical knowledge is represented is to describe the relative similarity of lexical items, and how it influences language processing. Previous studies of the effects of form similarity on word production have reported conflicting results, notably within and across languages. The aim of the present study was to clarify this empirical issue to provide specific constraints for theoretical models of language production. We investigated the role of phonological neighborhood density in a large-scale picture naming experiment using fine-grained statistical models. The results showed that increasing phonological neighborhood density has a detrimental effect on naming latencies, and re-analyses of independently obtained data sets provide supplementary evidence for this effect. Finally, we reviewed a large body of evidence concerning phonological neighborhood density effects in word production, and discussed the occurrence of facilitatory and inhibitory effects in accuracy measures. The overall pattern shows that phonological neighborhood generates two opposite forces, one facilitatory and one inhibitory. In cases where speech production is disrupted (e.g. certain aphasic symptoms), the facilitatory component may emerge, but inhibitory processes dominate in efficient naming by healthy speakers. These findings are difficult to accommodate in terms of monitoring processes, but can be explained within interactive activation accounts combining phonological facilitation and lexical competition.
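As a hedged sketch of the kind of fine-grained statistical model this abstract refers to (the variable names, data file, and model form below are assumptions for illustration, not the authors' analysis), a linear mixed-effects regression of naming latencies on phonological neighborhood density with by-participant random intercepts could look like this:

# Hypothetical mixed-effects model of naming latencies (not the original analysis).
import pandas as pd
import statsmodels.formula.api as smf

# Assumed data layout: one row per naming trial, with columns
# rt (latency, ms), pnd (phonological neighborhood density),
# logfreq (log word frequency), participant, item.
trials = pd.read_csv("naming_trials.csv")

# Fixed effects of density and frequency, random intercept per participant.
model = smf.mixedlm("rt ~ pnd + logfreq", data=trials,
                    groups=trials["participant"])
fit = model.fit()
print(fit.summary())
# A positive coefficient on pnd would correspond to the inhibitory effect
# of denser phonological neighborhoods on naming latencies reported above.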
Abstract:
Emissions trading with greenhouse gases and green certificates are part of the climate policy whose main target is to reduce greenhouse gas emissions. In this study, the carbon dioxide and fine particle emissions of energy production in the Helsinki Metropolitan Area are calculated. The analysis is made mainly from the district heating point of view, and changes to the district heating network are assessed. Carbon dioxide emissions would be slightly higher if the district heating network were expanded, but fine particle emissions would then be much lower. By 2030, carbon dioxide emissions would be roughly 10% higher if the district heating network were expanded at the same rate as in the past five years, while the expansion would decrease fine particle emissions by about 40%. The cost of the expansion is allocated as the reduction cost of fine particle emissions, which is considerably higher than the costs of traditional reduction methods. A possible new nuclear plant would reduce emissions considerably, and its costs would be relatively low compared with other energy production methods.