947 results for model efficiency
Abstract:
Within recent years, increasing international competition has caused an increase in job transitions worldwide. Many countries find it difficult to manage these transitions in a way that ensures a match between labour supply and demand. One of the countries that seem to manage the transitions successfully is Denmark, where unemployment has dropped dramatically over the last decade without a drop in job quality. This success is ascribed to the so-called Danish flexicurity model, where easy access to hiring and firing employees (flexibility) is combined with extensive active and passive labour market policies (security). The Danish results have gained interest not only among other European countries, where unemployment rates remain high, but also in the US, where job loss is often related to lower job quality. It has, however, been subject to much debate both in Europe and in the US whether countries with distinctively different political-economic settings can learn from one another. Some have argued that cultural differences impose barriers to successful policy transfer, whereas others see it as a perfectly rational calculus to introduce 'best practices' from elsewhere. This paper presents a third strategy. Recent literature on policy transfer suggests that successful cross-national policy transfer is possible, even across the Atlantic, but that one must be cautious in choosing the form, content and level of the learning process. By analysing and comparing the labour market policies and their settings in Denmark and the US in detail, this paper addresses the question of what, and how, the US can learn from the Danish model. While the US and Denmark share a high degree of flexibility, they differ significantly in their level of security. This also means that the Danish budget for active and passive labour market policies is significantly higher than the American one, and it seems unlikely that political support for introducing Danish levels of security in the US can be established. However, the paper concludes that there is learning potential between the US and Denmark regarding the local-level efficiency of the money already spent. A major reason for the Danish success has been the introduction of initiatives tailor-made to the individual displaced worker and a stronger coordination between local-level actors; both are areas where a lack of efficiency in the implementation of American active labour market policies has been reported.
Abstract:
Addressing high and volatile natural resource prices, uncertain supply prospects, reindustrialization attempts and environmental damages related to resource use, resource efficiency has evolved into a highly debated proposal among academia, policy makers, firms and international financial institutions (IFIs). In 2011, the European Union (EU) declared resource efficiency one of the seven flagship initiatives of its Europe 2020 strategy. This paper contributes to the discussion by assessing its key initiative, the Roadmap to a Resource Efficient Europe (EC 2011 571), along two streams of evaluation. In a first step, resource efficiency is linked to two theoretical frameworks regarding sustainability: (i) the sustainability triangle (consisting of economic, social and ecological dimensions) and (ii) balanced sustainability (combining weak and strong sustainability). Both sustainability frameworks are then used to assess the degree to which the Roadmap follows the concept of sustainability. It can be concluded that the Roadmap partially respects the sustainability triangle as well as balanced sustainability, primarily lacking a social dimension. In a second step, following Steger and Bleischwitz (2009), the impact of resource efficiency on competitiveness as advocated in the Roadmap is evaluated empirically. An Arellano–Bond dynamic panel data model reveals no robust impact of resource efficiency on competitiveness in the EU between 2004 and 2009 – a puzzling result. Further empirical research and enhanced data availability are needed to better understand the impacts of resource efficiency on competitiveness at the macroeconomic, microeconomic and industry levels. In that regard, strengthening the methodologies behind resource indicators seems essential. Last but certainly not least, political will is required to achieve the transition of the EU economy to a resource-efficient future.
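As a rough illustration of the kind of dynamic panel estimation referred to above, the sketch below implements an Anderson–Hsiao style first-difference IV estimator, a simpler precursor of the Arellano–Bond difference GMM used in the paper, on synthetic data. The variable names, data-generating process and instrument choice are illustrative assumptions, not the paper's dataset or exact specification.

```python
# Minimal Anderson-Hsiao style first-difference IV sketch (a simpler
# precursor of the Arellano-Bond difference GMM estimator referenced in
# the abstract). All data here are synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 8                      # panels and periods (synthetic)
alpha, beta = 0.5, 0.3             # true persistence and covariate effect

# Simulate y_it = alpha*y_{i,t-1} + beta*x_it + fixed effect + noise
x = rng.normal(size=(N, T))
fe = rng.normal(size=(N, 1))
y = np.zeros((N, T))
for t in range(1, T):
    y[:, t] = alpha * y[:, t - 1] + beta * x[:, t] + fe[:, 0] \
              + rng.normal(scale=0.5, size=N)

# First-difference away the fixed effect; instrument dy_{t-1} with y_{t-2}
dy   = (y[:, 3:] - y[:, 2:-1]).ravel()    # dy_it
dyl  = (y[:, 2:-1] - y[:, 1:-2]).ravel()  # dy_{i,t-1} (endogenous regressor)
dx   = (x[:, 3:] - x[:, 2:-1]).ravel()    # dx_it (treated as exogenous)
inst = y[:, 1:-2].ravel()                 # y_{i,t-2} as the instrument

X = np.column_stack([dyl, dx])
Z = np.column_stack([inst, dx])

# 2SLS: beta_hat = (X'Pz X)^{-1} X'Pz y with Pz = Z (Z'Z)^{-1} Z'
Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)
beta_hat = np.linalg.solve(X.T @ Pz @ X, X.T @ Pz @ dy)
print("estimated (alpha, beta):", beta_hat)
```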
Abstract:
Mode of access: Internet.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-03
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-05
Abstract:
In this paper we investigate the trade-off faced by regulators who must set a price for an intermediate good somewhere between marginal cost and the monopoly price. We utilize a growth model with monopolistic suppliers of intermediate goods, in which investment in innovation is required to produce a new intermediate good. Marginal cost pricing deters innovation, while monopoly pricing maximizes innovation and economic growth at the cost of some static inefficiency. We demonstrate the existence of a second-best price above marginal cost but below the monopoly price which maximizes consumer welfare. Simulation results suggest that substantial reductions in consumption, production, growth, and welfare occur when regulators focus on static efficiency by setting prices at or near marginal cost.
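The trade-off described above (marginal-cost pricing removes the innovation incentive, monopoly pricing creates static inefficiency) can be illustrated with a deliberately stylized static toy model, not the paper's dynamic growth model: linear demand, a variety count that rises with per-variety profit, and welfare equal to the number of varieties times consumer surplus per variety. Under these assumed functional forms the welfare-maximizing price lies strictly between marginal cost and the monopoly price.

```python
# Stylized illustration (not the paper's model): with linear demand
# q(p) = A - p and marginal cost c, per-variety profit funds entry of
# new varieties n(p), while consumer surplus per variety CS(p) shrinks
# as p rises. Total welfare n(p)*CS(p) peaks at a "second-best" price
# strictly between marginal cost c and the monopoly price (A + c)/2.
import numpy as np

A, c, k = 10.0, 2.0, 1.0                      # illustrative parameters

prices  = np.linspace(c, (A + c) / 2, 1001)   # candidate regulated prices
profit  = (prices - c) * (A - prices)         # per-variety operating profit
n       = k * profit                          # varieties induced by the innovation incentive
cs      = (A - prices) ** 2 / 2               # consumer surplus per variety
welfare = n * cs

p_star = prices[np.argmax(welfare)]
print(f"marginal cost {c}, monopoly price {(A + c) / 2}, second-best price {p_star:.3f}")
# Analytically p* = (A + 3c)/4 = 4.0 here, strictly between 2.0 and 6.0.
```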
Abstract:
This paper summarises test results used to validate a model and scale-up procedure for the high pressure grinding roll (HPGR) developed at the JKMRC by Morrell et al. [Morrell, Lim, Tondo, David, 1996. Modelling the high pressure grinding rolls. In: Mining Technology Conference, pp. 169-176.]. Verification of the model is based on four data sets that describe the performance of three industrial-scale units fitted with both studded and smooth roll surfaces. The industrial units are currently in operation within the diamond mining industry and are represented by De Beers, BHP Billiton and Rio Tinto. Ore samples from the De Beers and BHP Billiton operations were sent to the JKMRC for ore characterisation and HPGR laboratory-scale tests; Rio Tinto contributed an historical data set of tests completed during a previous research project. The results conclude that modelling of the HPGR process has matured to a point where the model may be used to evaluate new comminution circuits and to optimise existing ones. The model prediction of product size distribution is good and has been found to be strongly dependent on the characteristics of the material being tested. The prediction of throughput and corresponding power draw (based on throughput) is sensitive to inconsistent gap/diameter ratios observed between laboratory-scale tests and full-scale operations.
Abstract:
Background and Aims: We have optimized the isolated perfused mouse kidney (IPMK) model for studying renal vascular and tubular function in vitro using 24-28 g C57BL6J mice, the wild-type controls for many transgenic mice. Methods and Results: Buffer composition was optimized for bovine serum albumin (BSA) concentration, and the effect of adding erythrocytes on renal function and morphology was assessed. Autoregulation was investigated during stepped increases in perfusion pressure. Perfusion for 60 min at 90-110 mmHg with Krebs bicarbonate buffer containing 5.5% BSA and amino acids produced functional parameters within the in vivo range. Erythrocytes increased renal vascular resistance (3.8 ± 0.2 vs 2.4 ± 0.1 mL/min.mmHg, P < 0.05), enhanced sodium reabsorption (FENa = 0.3 ± 0.08 vs 1.5 ± 0.7%, P < 0.05), produced equivalent glomerular filtration rates (GFR; 364 ± 38 vs 400 ± 9 µL/min per g kidney weight) and reduced distal tubular cell injury in the inner stripe (5.8 ± 1.7 vs 23.7 ± 3.1%, P < 0.001) compared with cell-free perfusion. The IPMK was responsive to vasoconstrictor (angiotensin II, EC50 100 pM) and vasodilator (methacholine, EC50 75 nM) mediators and showed partial autoregulation of perfusate flow under control conditions over 65-85 mmHg (autoregulatory index, ARI, 0.66 ± 0.11). Angiotensin II (100 pM) extended this range (to 65-120 mmHg) and enhanced autoregulatory efficiency (ARI 0.21 ± 0.02, P < 0.05). Angiotensin II facilitation was antagonized by methacholine (ARI 0.76 ± 0.08) and papaverine (ARI 0.91 ± 0.13). Conclusion: The IPMK model is useful for studying renal physiology and pathophysiology without systemic neurohormonal influences.
Abstract:
The paper investigates the effects of trade liberalisation on the technical efficiency of the Bangladesh manufacturing sector by estimating a combined stochastic frontier-inefficiency model using panel data for the period 1978-94 for 25 three-digit-level industries. The results show that the overall technical efficiency of the manufacturing sector, as well as the technical efficiencies of the majority of the individual industries, has increased over time. The findings also clearly suggest that trade liberalisation, proxied by export orientation and capital deepening, has had a significant impact on the reduction of overall technical inefficiency. Similarly, the scale of operation and the proportion of non-production labour in total employment appear to be important determinants of technical inefficiency. The evidence also indicates that both export-promoting and import-substituting industries have experienced rises in technical efficiency over time. The results are suggestive of neutral technical change, although at the 5 per cent level of significance the empirical estimates indicate that there was no technical change in the manufacturing industries. Finally, a joint likelihood ratio (LR) test rejects the Cobb-Douglas production technology as a description of the data, given the specification of the translog production technology.
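For readers unfamiliar with the estimation approach, the sketch below fits the simpler cross-sectional Aigner–Lovell–Schmidt half-normal stochastic frontier by maximum likelihood on synthetic data and recovers Jondrow et al. (JLMS) technical-efficiency scores. The paper itself uses a panel frontier-inefficiency specification with additional inefficiency determinants, which this minimal sketch does not reproduce; all data and parameter values here are assumptions.

```python
# Minimal cross-sectional stochastic frontier (half-normal) sketch on
# synthetic data: y = X b + v - u, v ~ N(0, s_v^2), u ~ |N(0, s_u^2)|.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=(n, 2))                    # log inputs (synthetic)
beta_true, sv, su = np.array([1.0, 0.6, 0.3]), 0.2, 0.4
X = np.column_stack([np.ones(n), x])
u = np.abs(rng.normal(scale=su, size=n))       # inefficiency >= 0
v = rng.normal(scale=sv, size=n)               # noise
y = X @ beta_true + v - u                      # log output

def negloglik(theta):
    b, log_sv, log_su = theta[:3], theta[3], theta[4]
    s_v, s_u = np.exp(log_sv), np.exp(log_su)
    sigma, lam = np.hypot(s_v, s_u), s_u / s_v
    eps = y - X @ b
    # ALS (1977) log-density of the composed error eps = v - u
    ll = (np.log(2) - np.log(sigma)
          + stats.norm.logpdf(eps / sigma)
          + stats.norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = optimize.minimize(negloglik,
                        x0=np.r_[np.zeros(3), np.log(0.3), np.log(0.3)],
                        method="BFGS")
b_hat = res.x[:3]
s_v, s_u = np.exp(res.x[3]), np.exp(res.x[4])

# JLMS (1982) point estimates of inefficiency; TE_i = exp(-E[u_i | eps_i])
eps = y - X @ b_hat
sigma2 = s_v**2 + s_u**2
mu_star = -eps * s_u**2 / sigma2
s_star = s_v * s_u / np.sqrt(sigma2)
z = mu_star / s_star
Eu = mu_star + s_star * stats.norm.pdf(z) / stats.norm.cdf(z)
te = np.exp(-Eu)
print("beta:", b_hat.round(3), " mean TE:", te.mean().round(3))
```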
Abstract:
Explants of the hard coral Seriatopora hystrix were exposed to sublethal concentrations of the herbicide diuron (DCMU; N'-(3,4-dichlorophenyl)-N,N-dimethylurea) and the heavy metal copper. Pulse amplitude modulated (PAM) chlorophyll fluorescence techniques were used to assess the effects on the photosynthetic efficiency of the algal symbionts in the tissue (in symbio), and chlorophyll fluorescence and counts of symbiotic algae (normalised to surface area) were used to assess the extent of coral bleaching. At 30 µg DCMU l⁻¹, there was a reduction in both the maximum effective quantum yield (ΔF/Fm') and the maximum potential quantum yield (Fv/Fm) of the algal symbionts in symbio. Corals subsequently lost their algal symbionts and discoloured (bleached), especially on their upper, sunlight-exposed surfaces. At the same DCMU concentration but under low light (5% of growth irradiance), there was a marked reduction in ΔF/Fm' but only a slight reduction in Fv/Fm and a slight loss of algae. Loss of algal symbionts was also noted after a 7 d exposure to concentrations as low as 10 µg DCMU l⁻¹ under normal growth irradiance, and after a 14 d exposure to 10 µg DCMU l⁻¹ under reduced irradiance. Collectively, the results indicate that DCMU-induced bleaching is caused by a light-dependent photoinactivation of the algal symbionts, and that bleaching occurs when Fv/Fm (measured 2 h after sunset) is reduced to a value of less than or equal to 0.6. Elevated copper concentrations (60 µg Cu l⁻¹ for 10 h) also induced rapid bleaching in S. hystrix, but without affecting the quantum yield of the algae in symbio. Tests with isolated algae indicated that substantially higher concentrations (300 µg Cu l⁻¹ for 8 h) were needed to significantly reduce the quantum yield. Thus, copper-induced bleaching occurs without affecting algal photosynthesis and may be related to effects on the host (animal). It is argued that warm-water bleaching of corals resembles both types of chemically induced bleaching, suggesting the need for an integrated model of coral bleaching involving the effect of temperature on both the host (coral) and the algal symbionts.
Abstract:
The relationship between reported treatments of lameness, metabolic disorders (milk fever, ketosis), digestive disorders, and technical efficiency (TE) was investigated using neutral and non-neutral stochastic frontier analysis (SFA). TE is estimated relative to the stochastic frontier production function for a sample of 574 Danish dairy herds collected in 1997. Contrary to most published results, but in line with the expected negative impact of disorders on average cow milk production, herds reporting higher frequencies of milk fever are less technically efficient. Unexpectedly, however, the opposite result was observed for lameness, ketosis, and digestive disorders. The non-neutral stochastic frontier indicated that these opposite results are due to the relatively high productivities of inputs. The productivity of the cows is also reflected in the direction of impact of the herd management variables: whereas efficient farms replace cows more frequently, enroll heifers in production at an earlier age, and have shorter calving intervals, they also report a higher frequency of disorder treatments. The average estimated energy-corrected milk loss per cow is 1036, 451 and 242 kg for low-, medium- and high-efficiency farms, respectively. The study demonstrates the benefit of a stochastic frontier production function involving the estimation of individual technical efficiencies to evaluate farm performance and investigate the sources of inefficiency.
Abstract:
Relationships of various reproductive disorders and milk production performance of Danish dairy farms were investigated. A stochastic frontier production function was estimated using data collected in 1998 from 514 Danish dairy farms. Measures of farm-level milk production efficiency relative to this production frontier were obtained, and relationships between milk production efficiency and the incidence risk of reproductive disorders were examined. There were moderate positive relationships between milk production efficiency and retained placenta, induction of estrus, uterine infections, ovarian cysts, and induction of birth. Inclusion of reproductive management variables caused these moderate relationships to disappear, although the directions of the coefficients for almost all of those variables remained the same. Dystocia showed a weak negative correlation with milk production efficiency. Farms that were mainly managed by young farmers had the highest average efficiency scores. The estimated milk losses due to inefficiency averaged 1142, 488, and 256 kg of energy-corrected milk per cow, respectively, for low-, medium-, and high-efficiency herds. It is concluded that the availability of younger cows, which enabled farmers to replace cows with reproductive disorders, contributed to high cow productivity in efficient farms. Thus, a high replacement rate more than compensates for the possible negative effect of reproductive disorders. The use of frontier production and efficiency/inefficiency functions to analyze herd data may enable dairy advisors to identify inefficient herds and to simulate the effect of alternative management procedures on the individual herd's efficiency.
Abstract:
The resource potential of shallow water tables for cropping systems has been investigated using the Australian sugar industry as a case study. Literature concerning shallow water table contributions to sugarcane crops is summarised, and the irrigation required for water tables to depths of 2 m is assessed using the SWIMv2.1 soil water balance model for three different soils. The study was undertaken because water availability is a major limitation for sugarcane and other crop production systems in Australia, and knowledge of how best to incorporate upflow from water tables in irrigation scheduling is limited. Our results showed that for the three soils studied (representing a range of permeabilities as defined by near-saturated hydraulic conductivities), no irrigation would be required for static water tables within 1 m of the soil surface. Irrigation requirements for static water tables deeper than 1 m were dependent on the soil type and rooting characteristics (root depth and density). Our results also show that near-saturated hydraulic conductivities are a better indicator than soil textural classifications of the ability of water tables below 1 m to supply sufficient upflow. We conclude that there is potential for reductions in irrigation, and hence improvements in irrigation water use efficiency, in areas where shallow water tables pose a low salinity risk: either the groundwater is fresh, or the local hydrology results in net recharge.
Abstract:
The Lattice Solid Model has been used successfully as a virtual laboratory to simulate the fracturing of rocks, the dynamics of faults, earthquakes and gouge processes. However, results from those simulations show that, in order to take the next step towards more realistic experiments, it will be necessary to use models containing a significantly larger number of particles than current models, and those simulations will therefore require a greatly increased amount of computational resources. Whereas the computing power provided by single processors can be expected to increase according to Moore's law, i.e., to double every 18-24 months, parallel computers can provide significantly larger computing power today. In order to make this computing power available for simulating the microphysics of earthquakes, a parallel version of the Lattice Solid Model has been implemented. Benchmarks using large models with several million particles have shown that the parallel implementation of the Lattice Solid Model can achieve a high parallel efficiency of about 80% for large numbers of processors on different computer architectures.
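The quoted parallel efficiency is conventionally speedup divided by processor count, E(N) = T(1) / (N · T(N)); a small helper with illustrative timings (not the paper's benchmark data) is sketched below.

```python
# Parallel speedup and efficiency from wall-clock timings:
# speedup S(N) = T(1) / T(N), efficiency E(N) = S(N) / N.
# The timings below are illustrative, not the paper's benchmarks.
def parallel_efficiency(t_serial: float, t_parallel: float, n_procs: int) -> float:
    speedup = t_serial / t_parallel
    return speedup / n_procs

# e.g. a run taking 1000 s on one processor and 9.8 s on 128 processors:
print(f"{parallel_efficiency(1000.0, 9.8, 128):.2f}")   # ~0.80, i.e. ~80% efficiency
```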
Abstract:
Demonstrating the existence of trends in monitoring data is of increasing practical importance to conservation managers wishing to preserve threatened species or reduce the impact of pest species. However, the ability to do so can be compromised if the species in question has low detectability and the true occupancy level or abundance of the species is thus obscured. Zero-inflated models that explicitly model detectability improve the ability to make sound ecological inference in such situations. In this paper we apply an occupancy model including detectability to data from the initial stages of a fox-monitoring program on the Eyre Peninsula, South Australia. We find that detectability is extremely low (< 18%) and varies according to season and the presence or absence of roadside vegetation. We show that simple methods of using monitoring data to inform management, such as plotting the raw data or performing logistic regression, fail to accurately diagnose either the status of the fox population or its trajectory over time. We use the results of the detectability model to consider how future monitoring could be redesigned to achieve efficiency gains. A wide range of monitoring programs could benefit from similar analyses, as part of an active adaptive approach to improving monitoring and management.
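The occupancy analysis with imperfect detection described above follows the single-season MacKenzie et al. (2002) likelihood, which mixes "occupied but never detected" with "truly absent". The sketch below fits a minimal constant-occupancy, constant-detection version on synthetic detection histories and contrasts the estimate with the naive proportion of sites with at least one detection; the covariate effects (season, roadside vegetation) reported in the paper are not modelled here, and all data are simulated assumptions.

```python
# Minimal single-season occupancy model with imperfect detection
# (MacKenzie et al. 2002 style), constant occupancy psi and detection p.
# Synthetic data only; the paper additionally models covariates.
import numpy as np
from scipy import optimize
from scipy.special import expit

rng = np.random.default_rng(2)
n_sites, n_visits = 200, 6
psi_true, p_true = 0.6, 0.15                 # low detectability, as in the abstract

z = rng.binomial(1, psi_true, size=n_sites)  # latent occupancy state per site
Y = rng.binomial(1, p_true, size=(n_sites, n_visits)) * z[:, None]  # detection histories

def negloglik(theta):
    psi, p = expit(theta)                    # keep both probabilities in (0, 1)
    d = Y.sum(axis=1)                        # detections per site
    # occupied with this detection history  +  never detected because truly absent
    lik = psi * p**d * (1 - p)**(n_visits - d) + (1 - psi) * (d == 0)
    return -np.log(lik).sum()

res = optimize.minimize(negloglik, x0=np.zeros(2), method="BFGS")
psi_hat, p_hat = expit(res.x)
naive = (Y.sum(axis=1) > 0).mean()           # what the raw data alone would suggest
print(f"naive occupancy {naive:.2f}  vs  estimated psi {psi_hat:.2f}, p {p_hat:.2f}")
```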