887 results for Availability and efficiency
Abstract:
This thesis investigates the design of optimal tax systems in dynamic environments. The first essay characterizes the optimal tax system when wages depend on stochastic shocks and work experience. In addition to redistributive and efficiency motives, the taxation of inexperienced workers depends on a second-best requirement that encourages work experience, a social insurance motive, and incentive effects. For most inexperienced workers, calibrations using U.S. data yield higher expected optimal marginal income tax rates once they become experienced. They confirm that the average marginal income tax rate increases (decreases) with age when shocks and work experience are substitutes (complements). Finally, more variability in experienced workers' earnings prospects leads to increasing tax rates, since income taxation acts as a social insurance mechanism. In the second essay, the properties of an optimal tax system are investigated in a dynamic private information economy where labor market frictions create unemployment that destroys workers' human capital. A two-skill-type model is considered where wages and employment are endogenous. I find that the optimal tax system distorts the first-period wages of all workers below their efficient levels, which leads to more employment. The standard no-distortion-at-the-top result no longer holds due to the combination of private information and the destruction of human capital. I show this result analytically under the Maximin social welfare function and confirm it numerically for a general social welfare function. I also investigate the use of a training program and job creation subsidies. The final essay analyzes the optimal linear tax system when there is a population of individuals whose perceptions of savings are linked to their disposable income and their family background through family cultural transmission. Aside from the standard equity/efficiency trade-off, taxes account for the endogeneity of perceptions through two channels. First, taxing labor decreases income, which decreases the perception of savings over time. Second, taxing savings corrects for workers' misperceptions and thus their savings and labor decisions. Numerical simulations confirm that behavioral issues push labor income taxes upward to finance savings subsidies. Government transfers to individuals are also decreased to finance those same subsidies.
Abstract:
As the globalization of knowledge has taken hold over the past decade, and as governments around the world review their new roles in support of the production of knowledge, several factors have shaped governments' approach to public research. Arguably, no organizations have been more affected by these pressures for reform than government scientific and technology laboratories or institutes. Often ignored in the re-shaping of national systems of innovation, these organizations play an important role in advancing national economic and social objectives. This paper, by reviewing examples of reforms underway in several countries, including Canada, France, Germany, the UK, Japan, the USA and Latin America, will argue that government research and technology institutes, often historically surrogates for industrial research, are gradually re-defining their mandates to meet the new pressures of globalization while satisfying growing public demands for increased relevance and efficiency in responding to the needs of citizens and industry.
Abstract:
The embedding of third sector organisations in the policy world is fraught with tensions. Accountability and autonomy become oppositional forces, causing an uneasy relationship. Government agencies are concerned that their equity and efficiency goals and objectives be met when they enter partnerships with the third sector for the delivery of programs and services. Third sector agencies question the impact of accountability mechanisms on their independence and identities. Even if the relationship between government and third sector agencies seems to be based on cooperation, concerns about cooptation (for nonprofits) and capturing (for governments) may linger, calling the legitimacy of the partnership into question. Two means of improving the relationship between the governing and third sectors have been proposed recently in Canada by the Panel on Accountability and Governance in the Voluntary Sector (PAGVS) and the Joint Tables sponsored by the Voluntary Sector Task Force (VSTF). The two endeavours represent a historic undertaking in Canada aimed at improving and facilitating the relationship between the federal government and the nonprofit sector. The reports borrow from other countries' models but offer new insights into mediating the relationship, including new models for a regulatory body and a charity compact for Canada. Do these recommendations adequately address concerns of autonomy, accountability and cooptation or capturing? The Canadian reports do offer new insights into resolving the four tensions inherent in partnerships between the governing and third sectors, but they also raise important questions about the nature of these relationships and the evolution of democracy within the Canadian political system.
Abstract:
Climate change is expected to have marked impacts on forest ecosystems. In Ontario forests, this includes changes in tree growth, stand composition and disturbance regimes, with expected impacts on many forest-dependent communities, the bioeconomy, and other environmental considerations. In response to climate change, renewable energy systems, such as forest bioenergy, are emerging as critical tools for carbon emissions reductions and climate change mitigation. However, these systems may also need to adapt to changing forest conditions. Therefore, the aim of this research was to estimate changes in forest growth and forest cover in Ontario forests in response to anticipated climatic changes by the year 2100, and ultimately to explore the sustainability of bioenergy in the future. Using the Haliburton Forest and Wildlife Reserve in Ontario as a case study, this research used a spatial climate analog approach to match modeled Haliburton temperature and precipitation (via the Fourth Canadian Regional Climate Model) to regions currently exhibiting a similar climate (climate analogs). From there, current forest cover and growth rates of core species in Haliburton were compared to forest plots in analog regions from the US Forest Service Forest Inventory and Analysis (FIA). This comparison used two different emission scenarios, corresponding to a high and a mid-range emission future. This research then explored how these changes in forests may influence bioenergy feasibility in the future, examining possible volume availability and composition of bioenergy feedstock under future conditions. This research points to a potential decline of softwoods in the Haliburton region with a simultaneous expansion of pre-established hardwoods such as northern red oak and red maple, as well as a potential loss in sugar maple cover. From a bioenergy perspective, hardwood residues may be the most feasible feedstock in the future, with minimal change in biomass availability for energy production; under these possible conditions, small-scale combined heat and power (CHP) and residential pellet use may be the most viable and ecologically sustainable options. Ultimately, understanding the way in which forests may change is important in informing meaningful policy and management, allowing for improved forest bioenergy systems, now and in the future.
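As a rough illustration of the spatial climate-analog matching described above, the sketch below ranks hypothetical candidate regions by their standardised distance to Haliburton's modelled future temperature and precipitation; all numbers and region names are illustrative assumptions, not the study's data.

```python
# Illustrative sketch of a spatial climate-analog match: find the present-day region
# whose observed climate is closest to Haliburton's modelled future climate.
# The values and region names below are hypothetical placeholders.
import numpy as np

# Modelled future climate for Haliburton (mean annual temperature in deg C, precipitation in mm).
future_haliburton = np.array([8.5, 950.0])

# Present-day climate normals for candidate analog regions.
candidates = {
    "Region A": np.array([8.2, 930.0]),
    "Region B": np.array([6.1, 1010.0]),
    "Region C": np.array([9.0, 760.0]),
}

# Standardise each variable so temperature and precipitation contribute comparably,
# then rank candidates by Euclidean distance in the standardised climate space.
values = np.array(list(candidates.values()))
scale = values.std(axis=0)
distances = {
    name: float(np.linalg.norm((clim - future_haliburton) / scale))
    for name, clim in candidates.items()
}
best = min(distances, key=distances.get)
print(f"Closest climate analog: {best} (distance {distances[best]:.2f})")
```

The same nearest-climate principle underlies matching the modelled Haliburton climate to the regions whose FIA plots were then used for the forest comparison.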
Abstract:
This paper discusses some aspects of hunter-gatherer spatial organization in southern South Patagonia in times subsequent to 10,000 cal yr BP. Various methods of spatial analysis, elaborated with a Geographic Information System (GIS), were applied to the distributional pattern of archaeological sites with radiocarbon dates. The shift in the distributional pattern of chronological information was assessed in conjunction with other lines of evidence within a biogeographic framework. Accordingly, the varying degrees of occupation and integration of coastal and interior spaces in human spatial organization are explained in association with the adaptive strategies hunter-gatherers have used over time. Both are part of the same human response to changes in risk and uncertainty in the region, in terms of resource availability and environmental dynamics.
Abstract:
Kelp forests represent some of the most productive and diverse habitats on Earth. Understanding drivers of ecological patterns at large spatial scales is critical for effective management and conservation of marine habitats. We surveyed kelp forests dominated by Laminaria hyperborea (Gunnerus) Foslie 1884 across 9° latitude and >1000 km of coastline and measured a number of physical parameters at multiple scales to link ecological structure and standing stock of carbon with environmental variables. Kelp density, biomass, morphology and age were generally greater in exposed sites within regions, highlighting the importance of wave exposure in structuring L. hyperborea populations. At the regional scale, wave-exposed kelp canopies in the cooler regions (the north and west of Scotland) were greater in biomass, height and age than in warmer regions (southwest Wales and England). The range and maximal values of estimated standing stock of carbon contained within kelp forests were greater than in historical studies, suggesting that this ecosystem property may have been previously undervalued. Kelp canopy density was positively correlated with large-scale wave fetch and fine-scale water motion, whereas kelp canopy biomass and the standing stock of carbon were positively correlated with large-scale wave fetch and light levels and negatively correlated with temperature. As light availability and summer temperature were important drivers of kelp forest biomass, effective management of human activities that may affect coastal water quality is necessary to maintain ecosystem functioning, while increased temperatures related to anthropogenic climate change may impact the structure of kelp forests and the ecosystem services they provide.
Abstract:
This paper describes an implementation of a method capable of integrating parametric, feature-based CAD models built in commercial software (CATIA) with the SU2 software framework. To exploit adjoint-based methods for aerodynamic optimisation within SU2, a formulation to obtain geometric sensitivities directly from the commercial CAD parameterisation is introduced, enabling the calculation of gradients with respect to CAD-based design variables. To assess the accuracy and efficiency of the alternative approach, two aerodynamic optimisation problems are investigated: an inviscid 3D problem with multiple constraints, and a viscous 2D high-lift aerofoil problem without any constraints. Initial results show the new parameterisation obtaining reliable optima, with levels of performance similar to those of the software's native parameterisations. In the final paper, details of computing CAD sensitivities will be provided, including their accuracy as well as the linking of geometric sensitivities to aerodynamic objective functions and constraints; the impact on the robustness of the overall method will be assessed and alternative parameterisations will be included.
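For orientation, the sketch below shows the chain rule on which CAD-based adjoint gradients of this kind typically rest; the notation is illustrative and not taken from the paper.

```latex
% Gradient of an aerodynamic objective J with respect to a CAD design variable alpha_i:
% dJ/dX_s are the adjoint surface sensitivities from the flow solver (here SU2), and
% dX_s/dalpha_i are the geometric sensitivities of the surface points to the CAD parameter.
\[
\frac{\mathrm{d}J}{\mathrm{d}\alpha_i}
  = \sum_{s} \frac{\partial J}{\partial X_s}\,\frac{\partial X_s}{\partial \alpha_i}
\]
```

The first factor comes from the adjoint solution, while the second is the geometric sensitivity of the CAD surface, which is what the formulation described above extracts directly from the commercial parameterisation.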
Abstract:
Background: To our knowledge, there are few studies on the interaction between nutrient availability and molecular structure changes induced by different processing methods in dairy cattle. The objective of this study was to investigate the effect of heat processing methods on the interaction between nutrient availability and molecular structure, in terms of functional groups related to the inherent protein and starch structure of oat grains, over two consecutive years with three replications per year. Method: The oat grains were kept raw (control), heated in an air-draft oven (dry roasting, DO) at 120 °C for 60 min, or heated under microwave irradiation (MIO) for 6 min. The molecular structure features were revealed by vibrational infrared molecular spectroscopy. Results: Rumen degradability of dry matter, protein and starch was significantly lower (P < 0.05) for MIO compared to the control and DO treatments. A higher protein α-helix to β-sheet ratio and a lower amide I to starch area ratio were observed for MIO compared to the DO and/or raw treatment. A negative correlation (−0.99, P < 0.01) was observed between the α-helix or amide I to starch area ratio and dry matter. A positive correlation (0.99, P < 0.01) was found between protein β-sheet and crude protein. Conclusion: The results reveal that oat grains are more sensitive to microwave irradiation than to dry heating in terms of protein and starch molecular profile and nutrient availability in ruminants.
Abstract:
A new variant of the Element-Free Galerkin (EFG) method, which combines the diffraction method to characterize the crack-tip solution with the Heaviside enrichment function to represent the discontinuity due to a crack, has been used to model crack propagation through non-homogeneous materials. In the case of interface crack propagation, the kink angle is predicted by applying the maximum tangential principal stress (MTPS) criterion in conjunction with consideration of the energy release rate (ERR). The MTPS criterion is applied to the crack-tip stress field described by both the stress intensity factor (SIF) and the T-stress, which are extracted using the interaction integral method. The proposed EFG method has been developed and applied to 2D case studies involving a crack in an orthotropic material, a crack along an interface and a crack terminating at a bi-material interface, under mechanical or thermal loading; this is done to demonstrate the advantages and efficiency of the proposed methodology. The computed SIFs, T-stress and the predicted interface crack kink angles are compared with existing results in the literature and are found to be in good agreement. An example of crack growth through a particle-reinforced composite material, which may involve crack meandering around the particle, is also reported.
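For context, the sketch below gives the classical maximum tangential stress condition (Erdogan and Sih) from which a kink angle is commonly obtained when the T-stress is neglected; the MTPS criterion applied in the paper additionally accounts for the T-stress contribution to the near-tip field.

```latex
% Classical maximum tangential stress condition (T-stress neglected): the kink angle
% theta_c maximizes the tangential stress sigma_{theta theta} at the crack tip, giving
\[
K_I \sin\theta_c + K_{II}\left(3\cos\theta_c - 1\right) = 0 .
\]
```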
Abstract:
Future warming is predicted to shift the Earth system into a mode with a progressive increase in the occurrence and vigour of extreme climate events, possibly stimulating other mechanisms that invigorate global warming. This study provides new data and modelling investigating the climatic consequences and biogeochemical feedbacks that occurred in a warmer world ~112 Myr ago. Our study focuses on the Cretaceous Oceanic Anoxic Event (OAE) 1b and explores how the Earth system responded to a moderate climate perturbation lasting ~25,000 yr that is modelled to be less than 1 °C in global average temperature. Using a new chronological model for OAE 1b, we present high-resolution elemental and bulk carbon isotope records from DSDP Site 545 on the Mazagan Plateau off NW Africa and combine this information with a coupled atmosphere-land-ocean model. The simulations suggest that a perturbation at the onset of OAE 1b caused almost instantaneous warming of the atmosphere on the order of 0.3 °C, followed by a longer (~45,000 yr) period of ~0.8 °C cooling. The marine records from DSDP Site 545 support the view that these moderate swings in global climate had immediate consequences for African continental supply of mineral matter and nutrients (phosphorus), subsequent oxygen availability, and organic carbon burial in the eastern subtropical Atlantic, without, however, turning the ocean anoxic. The match between modelling results and stratigraphic isotopic data supports previous studies [summarized in Jenkyns 2003, doi:10.1098/rsta.2003.1240] in that methane emission from marine hydrates, albeit moderate in dimension, may have been the trigger for OAE 1b, though we cannot definitively rule out alternative mechanisms. Under the hydrate mechanism, a total of 1.15 × 10^18 g of methane carbon (δ13C = −60‰), equivalent to about 10% of the total modern gas hydrate inventory, generated the δ13Ccarb profile recorded in the section. Modelling suggests a combination of moderate-scale methane pulses supplemented by continuous methane emission at elevated levels over ~25,000 yr. The proposed mechanism, though difficult to confirm definitively in the geological past, is arguably more likely to occur in a warmer world and apparently perturbs global climate and ocean chemistry almost instantaneously. This study shows that, once set off, this mechanism can maintain Earth's climate in a perturbed mode over geological time, leading to pronounced changes in regional climate.
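As a generic aid to the argument, the sketch below gives the standard isotope mass-balance relation often used to link an added mass of isotopically light carbon to a recorded δ13C shift; the symbols are illustrative assumptions, not the authors' notation.

```latex
% Isotope mass balance: adding M_a grams of carbon with signature delta_a (about -60 per mil
% for hydrate-derived methane) to a reservoir of mass M_0 and signature delta_0 shifts the
% mixture to delta_f, so the required mass of added carbon is
\[
M_a = M_0 \,\frac{\delta_0 - \delta_f}{\delta_f - \delta_a} .
\]
```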
Abstract:
Efficiency represents the ratio of work done to energy expended. In human movement, it is desirable to maximise the work done or minimise the energy expenditure. Whilst research has examined the efficiency of human movement for the lower and upper body, there is a paucity of research considering the efficiency of a total body movement. Rowing is a movement which encompasses all parts of the body to generate locomotion and is a useful modality for measuring total body efficiency. The aim of this research was to develop a total body model of efficiency and to explore how the skill level of participants and the assumptions of the modelling process affected the efficiency estimates. Three studies were used to develop and evaluate the efficiency model. Firstly, the efficiency of ten healthy males was established using rowing, cycling and arm cranking. The model included internal work from motion capture, and efficiency estimates were comparable to published literature, indicating the suitability of the model for estimating efficiency. Secondly, the model was developed to include a multi-segmented trunk, and twelve novice and twelve skilled participants were assessed for efficiency. Whilst the efficiency estimates were similar to published results, novice participants were assessed as more efficient. Issues such as the unique physiology of trained rowers and a lack of energy transfers in the model were considered contributing factors. Finally, the model was redeveloped to account for energy transfers, with which skilled participants showed higher efficiency at large workloads. This work presents a novel model for estimating efficiency during a rowing motion. The specific inclusion of energy transfers expands previous knowledge of internal work and efficiency, demonstrating a need to include energy transfers in the assessment of the efficiency of a total body action.
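For reference, the expressions below sketch the generic gross-efficiency definition used in movement research; they are a simplified stand-in, not the thesis's exact model.

```latex
% Gross efficiency: mechanical work done divided by metabolic energy expended.
% Total mechanical work is commonly split into external and internal components;
% allowing energy transfers between body segments reduces the internal-work term.
\[
\eta = \frac{W_{\text{done}}}{E_{\text{expended}}},
\qquad
W_{\text{done}} = W_{\text{external}} + W_{\text{internal}} .
\]
```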
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
BACKGROUND: Even though physician rating websites (PRWs) have been gaining in importance in both practice and research, little evidence is available on the association of patients' online ratings with the quality of care delivered by physicians. It thus remains unclear whether patients should rely on these ratings when selecting a physician. The objective of this study was to measure the association between online ratings and structural and quality of care measures for 65 physician practices from the German Integrated Health Care Network "Quality and Efficiency" (QuE). METHODS: Online reviews from two German PRWs covering a three-year period (2011 to 2013) were included, comprising 1179 and 991 ratings, respectively. Information for the 65 QuE practices was obtained for the year 2012 and included 21 measures related to structural information (N = 6), process quality (N = 10), intermediate outcomes (N = 2), patient satisfaction (N = 1), and costs (N = 2). The Spearman rank correlation coefficient was applied to measure the association between ratings and practice-related information. RESULTS: Patient satisfaction results from offline surveys and the patients-per-doctor ratio in a practice were shown to be significantly associated with online ratings on both PRWs. For one PRW, additional significant associations could be shown between online ratings and cost-related measures for medication, preventative examinations, and one diabetes type 2-related intermediate outcome measure. In turn, results from the second PRW showed significant associations with the age of the physicians and the number of patients per practice, four process-related quality measures for diabetes type 2 and asthma, and one cost-related measure for medication. CONCLUSIONS: Several significant associations were found, which varied between the PRWs. Patients interested in the satisfaction of other patients with a physician might select a physician on the basis of online ratings. Even though our results indicate associations with some diabetes and asthma measures, but not with coronary heart disease measures, there is still insufficient evidence to draw strong conclusions. The limited number of practices in our study may have weakened our findings.
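As an illustration of the correlation analysis described above, the sketch below applies the Spearman rank correlation to made-up per-practice data; the arrays and variable names are hypothetical, not the study's.

```python
# Illustrative sketch of the Spearman rank correlation used in the study; the arrays
# below are made-up stand-ins for per-practice online ratings and quality measures.
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-practice data: mean online rating and a process-quality score.
online_rating = np.array([1.2, 1.5, 1.8, 2.1, 1.4, 2.6, 1.9, 2.3])  # lower = better on German PRWs
quality_score = np.array([82, 78, 75, 70, 80, 64, 73, 69])          # e.g. % of patients meeting a target

rho, p_value = spearmanr(online_rating, quality_score)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```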
Abstract:
In the aluminium industry, calcined petroleum coke is considered to be the main component of the anode. A decline in the quality of petroleum coke has been observed following an increase in its impurity concentration. This is very important for aluminium smelters because these impurities, in addition to reducing anode performance, contaminate the metal produced. Petroleum coke is also a source of fossil carbon and, as it is consumed during the electrolysis process, CO2 is produced. The latter is considered a greenhouse gas and is well known for its role in global warming and in climate change. Charcoal is available and is produced worldwide in large quantities. It could be an attractive alternative to petroleum coke in the manufacture of the carbon anodes used in electrolysis cells for aluminium production. However, since it does not meet the anode manufacturing criteria, its use represents a great challenge. Indeed, its main known disadvantages are its high porosity, its disordered structure and its high mineral content. In addition, its density and electrical conductivity have been reported to be lower than those of petroleum coke. The objective of this work is to explore the effect of heat treatment on the properties of charcoal, with the aim of finding those that come closest to the specifications required for anode production. The evolution of the structure of charcoal calcined at high temperature was followed using various techniques. The reduction of its mineral content was achieved through treatments with hydrochloric acid at various concentrations. Finally, various combinations of these two treatments, calcination and leaching, were tested in order to find the best treatment conditions.