43 results for Analysis of teaching process
Abstract:
Classical and contemporary scholarship on leadership has referred to political performance and the ability of political actors to deploy the self to political purpose. Literature on contemporary British politics (Hennessy, 2001; Marquand, 2008; King, 2009) has highlighted the qualitative shift in political leadership from the mid-1990s towards a focus upon the image, style, celebrity and performance of political leaders, and the shift towards the presidentialisation or semi-presidentialisation of the prime minister (Foley, 2001). However, the literature has lacked a focus upon political performance and a methodology for assessing leadership performance within cultural and institutional contexts. This thesis assesses British political leadership performance from 1997 to 2010 by proposing a framework of political performance suited to comparative purposes. The framework, consisting of culture, institutions and performance, is used to assess the performance of the case studies (Tony Blair, Gordon Brown and David Cameron, and Gordon Brown, David Cameron and Nick Clegg in the televised Leaders’ Debates of 2010). The application of the framework to the case studies allows us to a) analyse political performance within given cultural and institutional contexts; b) establish the character traits and other aspects of a politician’s political persona; and c) appraise the role and effects of performance and persona upon the political process.
Abstract:
Biomass-to-Liquid (BTL) is one of the most promising low-carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass: the so-called “second generation biofuels” that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialised. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes, including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect its output (i.e. production cost). This was the first time that an uncertainty analysis was included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33% or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large-scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
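As a rough illustration of the Monte Carlo uncertainty analysis described above, the following Python sketch propagates uncertain cost-model inputs through to a production-cost distribution. All parameter names, distributions and values are hypothetical, not drawn from the thesis:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 100_000  # Monte Carlo draws

# Hypothetical input distributions for a simplified BTL cost model.
capital_charge = rng.normal(60.0, 9.0, n)       # £/MWh fuel, annualised capital
biomass_price  = rng.triangular(40, 55, 80, n)  # £/t dry biomass
efficiency     = rng.uniform(0.45, 0.55, n)     # biomass-to-fuel energy efficiency
opex           = rng.normal(15.0, 2.0, n)       # £/MWh fuel, fixed + variable O&M

biomass_lhv = 5.0  # MWh per t dry biomass (illustrative heating value)

# Production cost per MWh of fuel: feedstock cost scales inversely with efficiency.
feedstock_cost = biomass_price / (biomass_lhv * efficiency)
production_cost = capital_charge + feedstock_cost + opex

print(f"mean production cost : {production_cost.mean():.1f} £/MWh")
print(f"5th-95th percentile  : {np.percentile(production_cost, [5, 95])}")
```

The percentile spread, rather than the point estimate alone, is what such an analysis contributes to a techno-economic assessment.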
Abstract:
The open content creation process has proven itself to be a powerful and influential way of developing text-based content, as demonstrated by the success of Wikipedia and related sites. Distributed individuals independently edit, revise, or refine content, thereby creating knowledge artifacts of considerable breadth and quality. Our study explores the mechanisms that control and guide the content creation process and develops an understanding of open content governance. The repertory grid method is employed to systematically capture the experiences of individuals involved in the open content creation process and to determine the relative importance of the diverse control and guiding mechanisms. Our findings illustrate the important control and guiding mechanisms and highlight the multifaceted nature of open content governance. A range of governance mechanisms is discussed with regard to the varied levels of formality, the different loci of authority, and the diverse interaction environments involved. Limitations and opportunities for future research are provided.
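As a toy illustration of how repertory grid ratings might be aggregated to rank governance mechanisms, consider the Python sketch below. The real method elicits personal constructs from each participant (e.g. through triadic comparison); the mechanisms and ratings shown here are invented for illustration:

```python
import numpy as np

# Hypothetical repertory grid: rows are control/guiding mechanisms,
# columns are interviewed contributors (names and ratings are illustrative).
mechanisms = ["peer review", "revert rules", "talk-page norms",
              "admin authority", "quality templates"]
ratings = np.array([            # 1 = unimportant ... 5 = very important
    [5, 4, 5, 4],
    [4, 4, 3, 5],
    [3, 5, 4, 4],
    [2, 3, 2, 3],
    [4, 3, 4, 4],
])

# Rank mechanisms by mean rated importance across participants.
for mean, name in sorted(zip(ratings.mean(axis=1), mechanisms), reverse=True):
    print(f"{name:18s} {mean:.2f}")
```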
Abstract:
Business organisations are going through rapid external environmental and internal organisational changes due to increasing globalisation, E-business, and outsourcing. As a result, the future of purchasing and supply management—as a function within organisations, as a process that spans organisation boundaries and as a profession—raises important concerns for both organisations and the purchasing professional. This paper considers a broad and rather fragmented body of empirical evidence and analyses 42 relevant empirical studies on the future of purchasing and supply management. The major findings are reported in terms of changes in business contexts, purchasing strategy, structure, role and responsibility, system development and skills. Cross-sectional comparative analyses were also conducted to examine variation by sector, firm type, people's roles in purchasing, and country. A number of major implications for the purchasing function, process and professional bodies are presented together with suggestions for future research to address significant gaps in the current body of knowledge.
Abstract:
A three-dimensional finite element analysis (FEA) model with elastic-plastic anisotropy was built to investigate the effects of anisotropy on nanoindentation measurements for cortical bone. The FEA model has demonstrated a capability to capture the cortical bone material response under the indentation process. By comparison with the contact area obtained from monitoring the contact profile in FEA simulations, the Oliver-Pharr method was found to underpredict or overpredict the contact area due to the effects of anisotropy. The amount of error (less than 10% for cortical bone) depended on the indentation orientation. The indentation modulus results obtained from FEA simulations at different surface orientations showed a trend similar to experimental results and were also similar to moduli calculated from a mathematical model. The Oliver-Pharr method has been shown to be useful for providing first-order approximations in the analysis of anisotropic mechanical properties of cortical bone, although the indentation modulus is influenced by anisotropy.
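For context, the Oliver-Pharr method referred to above derives an indentation modulus from the unloading stiffness and an assumed indenter area function. The minimal Python sketch below applies the standard Oliver-Pharr relations for a Berkovich tip; the input values are merely illustrative of the cortical bone range and are not taken from the paper:

```python
import math

def oliver_pharr_modulus(P_max, S, h_max, nu_s=0.3,
                         E_i=1141.0, nu_i=0.07, eps=0.75, beta=1.034):
    """Sample indentation modulus (GPa) from one unloading curve.

    Units: P_max in mN, S = dP/dh in mN/nm, h_max in nm.
    Assumes a perfect Berkovich area function A = 24.56*h_c^2 and a
    diamond indenter (E_i = 1141 GPa, nu_i = 0.07).
    """
    h_c = h_max - eps * P_max / S                 # contact depth (nm)
    A = 24.56 * h_c ** 2                          # projected contact area (nm^2)
    E_r = math.sqrt(math.pi) / (2.0 * beta) * S / math.sqrt(A)
    E_r *= 1.0e6                                  # mN/nm^2 -> GPa
    # Reduced modulus relation: 1/E_r = (1 - nu_s^2)/E_s + (1 - nu_i^2)/E_i
    return (1.0 - nu_s ** 2) / (1.0 / E_r - (1.0 - nu_i ** 2) / E_i)

# Illustrative values in the range reported for cortical bone.
print(f"E = {oliver_pharr_modulus(P_max=2.0, S=0.05, h_max=500.0):.1f} GPa")
```

The paper's point is that the area function assumed in this calculation is exactly what anisotropy perturbs, so the returned modulus carries an orientation-dependent error.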
Abstract:
This paper critically reviews the evolution of financial reporting in the banking sector with specific reference to the reporting of market risk and the growing use of the measure known as Value at Risk (VaR). The paper investigates the process by which VaR became 'institutionalised'. The analysis highlights a number of inherent limitations of VaR as a risk measure and questions the usefulness of published VaR disclosures, concluding that risk 'disclosure' might be more apparent than real. It also looks at some of the implications for risk reporting practice and the accounting profession more generally.
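For readers unfamiliar with the measure, VaR summarises the loss that should not be exceeded at a given confidence level over a given horizon. A minimal Python sketch of two textbook estimators follows; the P&L series is simulated and the figures are purely illustrative (the paper's point is precisely that such single-number disclosures can obscure more than they reveal):

```python
import numpy as np
from scipy.stats import norm

def historical_var(pnl, confidence=0.99):
    """Historical-simulation VaR: the loss not exceeded with given confidence."""
    return -np.percentile(pnl, 100.0 * (1.0 - confidence))

def parametric_var(pnl, confidence=0.99):
    """Variance-covariance (normal) VaR from sample mean and volatility."""
    return -(pnl.mean() + norm.ppf(1.0 - confidence) * pnl.std(ddof=1))

rng = np.random.default_rng(1)
pnl = rng.normal(0.0, 1.0e6, 250)  # one trading year of hypothetical daily P&L (£)
print(f"99% 1-day historical VaR: £{historical_var(pnl):,.0f}")
print(f"99% 1-day parametric VaR: £{parametric_var(pnl):,.0f}")
```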
Abstract:
Preliminary work is reported on 2-D and 3-D microstructures written directly with a Yb:YAG 1026 nm femtosecond (fs) laser on bulk chemical vapour deposition (CVD) single-crystalline diamond. Smooth graphitic lines and other structures were written on the surface of a CVD diamond sample with a thickness of 0.7 mm under low laser fluences. This capability opens up the opportunity for making electronic devices and micro-electromechanical structures on diamond substrates. The fabrication process was optimised through testing a range of laser energies at a 100 kHz repetition rate with sub-500 fs pulses. These graphitic lines and structures have been characterised using optical microscopy, Raman spectroscopy, X-ray photoelectron spectroscopy and atomic force microscopy. Using these analysis techniques, the formation of sp2 and sp3 bonds is explored and the ratio between sp2 and sp3 bonds after fs laser patterning is quantified. We present the early findings from this study and characterise the relationship between the graphitic line formation and the different fs laser exposure conditions. © 2012 Taylor & Francis.
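A plausible way to quantify an sp2/sp3 band-area ratio from a Raman spectrum is to fit the diamond line (≈1332 cm−1) and the graphitic G band (≈1580 cm−1) and compare integrated areas, as in the Python sketch below. This is a generic illustration, not the authors' procedure: the spectrum is synthesised, and note that raw band areas are not calibrated bond fractions, since visible Raman scattering is far more sensitive to sp2 than sp3 carbon:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, a, x0, w):
    return a * w**2 / ((x - x0)**2 + w**2)

def two_band_model(x, a_d, w_d, a_g, x_g, w_g):
    # Diamond sp3 line pinned near 1332 cm^-1; graphitic G band fitted freely.
    return lorentzian(x, a_d, 1332.0, w_d) + lorentzian(x, a_g, x_g, w_g)

# shifts, counts = np.loadtxt("raman_spectrum.txt", unpack=True)  # hypothetical file
shifts = np.linspace(1200, 1700, 500)                             # synthetic spectrum
counts = (two_band_model(shifts, 900, 8, 300, 1580, 60)
          + np.random.default_rng(0).normal(0, 5, 500))

p0 = [800, 10, 200, 1580, 50]                 # initial guesses for the fit
popt, _ = curve_fit(two_band_model, shifts, counts, p0=p0)
a_d, w_d, a_g, x_g, w_g = popt

# Integrated Lorentzian area = pi * amplitude * half-width.
area_sp3 = np.pi * a_d * w_d
area_sp2 = np.pi * a_g * w_g
print(f"sp2/sp3 band-area ratio ~ {area_sp2 / area_sp3:.2f}")
```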
Abstract:
Despite growing interest in learning and teaching as emotional activities, there is still very little research on experiences of sensitive issues. Using qualitative data from students from a range of social science disciplines, this study investigates students’ experiences of encountering sensitive issues. The paper highlights how, although they found it difficult and distressing at times, the students all valued being able to explore sensitive issues during their studies. The paper argues that it is through repeated exposure to sensitive issues within the classroom that the students became more comfortable with the issues. This process of lessening sensitivity is an important part of the emotional journey through higher education. It also argues that good student experiences need not always involve positive emotions and that sensitive issues should be seen as an important part of transformational education.
A simulation analysis of spoke-terminals operating in LTL Hub-and-Spoke freight distribution systems
Abstract:
The research presented in this thesis is concerned with Discrete-Event Simulation (DES) modelling as a method to facilitate logistical policy development within the UK Less-than-Truckload (LTL) freight distribution sector, which has been typified by “Pallet Networks” operating on a hub-and-spoke philosophy. Current literature relating to LTL hub-and-spoke and cross-dock freight distribution systems traditionally examines a variety of network and hub design configurations. Each is consistent with classical notions of creating process efficiency, improving productivity, reducing costs and generally creating economies of scale through bulk optimisation. Whilst there is a growing abundance of papers discussing both the network design and hub operational components mentioned above, there is a shortcoming in the overall analysis when it comes to the “spoke-terminal” of hub-and-spoke freight distribution systems and its capability for handling the diverse and discrete customer profiles of freight that multi-user LTL hub-and-spoke networks typically handle over the “last mile” of the delivery, in particular a mix of retail and non-retail customers. A simulation study is undertaken to investigate the impact on operational performance when the current combined spoke-terminal delivery tours are separated by profile type (i.e. retail or non-retail). The results indicate that a potential improvement in delivery performance can be made by separating retail and non-retail delivery runs at the spoke-terminal, and that dedicated retail and non-retail delivery tours could be adopted in order to better meet customer delivery requirements and adapt hub-deployed policies. The study also leverages key operator experiences to highlight the main practical challenges of integrating the observed simulation results into the real world. The study concludes that DES can be harnessed as an enabling device to develop a ‘guide policy’; this policy needs to be flexible, should be applied in stages, and should take into account the growing retail exposure.
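A minimal sketch of the kind of DES comparison described, written with the SimPy library (which must be installed separately), is shown below. It contrasts one combined tour against dedicated retail/non-retail tours run on two vehicles; all drop counts, travel times and service times are hypothetical:

```python
import random
import simpy

SERVICE_MIN = {"retail": 25.0, "non-retail": 10.0}  # hypothetical drop times (min)

def tour(env, name, drops, stats):
    """One delivery tour: drive to each drop, then service it."""
    for profile in drops:
        yield env.timeout(random.expovariate(1 / 15.0))              # travel leg
        yield env.timeout(random.expovariate(1 / SERVICE_MIN[profile]))
    stats[name] = env.now                                            # finish time

def run(separated, seed=7):
    random.seed(seed)
    env = simpy.Environment()
    drops = ["retail"] * 8 + ["non-retail"] * 8
    stats = {}
    if separated:   # dedicated tours by customer profile (two vehicles)
        env.process(tour(env, "retail tour", ["retail"] * 8, stats))
        env.process(tour(env, "non-retail tour", ["non-retail"] * 8, stats))
    else:           # one combined tour with interleaved profiles
        random.shuffle(drops)
        env.process(tour(env, "combined tour", drops, stats))
    env.run()
    return stats

print("combined :", run(separated=False))
print("separated:", run(separated=True))
```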
Abstract:
The IEEE 802.11 standard is the dominant technology for wireless local area networks (WLANs). Over the last two decades, the Distributed Coordination Function (DCF) of the IEEE 802.11 standard has become one of the most important media access control (MAC) protocols for mobile ad hoc networks (MANETs). The DCF protocol can also be combined with cognitive radio, giving rise to IEEE 802.11 cognitive radio ad hoc networks (CRAHNs). Several studies have focused on the modeling of IEEE 802.11 CRAHNs; however, there is still no thorough and scalable analytical model for IEEE 802.11 CRAHNs in which a cognitive node (i.e., secondary user, SU) performs spectrum sensing and a possible channel-silence process before the MAC contention process. This paper develops a unified analytical model for IEEE 802.11 CRAHNs for comprehensive MAC-layer queuing analysis. In the proposed model, the SUs are modeled by a hyper-generalized 2D Markov chain with an M/G/1/K queue, while the primary users (PUs) are modeled by a generalized 2D Markov chain and an M/G/1/K queue. The performance evaluation results show that the quality of service (QoS) of both the PUs and SUs can be statistically guaranteed with suitable settings of the duration of the channel sensing and silence phases under light loading.
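The paper's hyper-generalized 2D Markov chain is not reproduced here, but models of this family extend Bianchi's classic saturated-DCF fixed point, which the following Python sketch solves by damped iteration (the values of W, m and the printed figures are illustrative defaults, not the paper's settings):

```python
def bianchi_fixed_point(n, W=32, m=5, tol=1e-12, iters=10_000):
    """Solve the saturated-DCF fixed point (Bianchi, 2000).

    n: contending stations, W: minimum contention window, m: max backoff stages.
    Returns (tau, p): per-slot transmission and conditional collision probability.
    """
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n - 1)          # collision prob. seen by a station
        tau_new = (2.0 * (1.0 - 2.0 * p) /
                   ((1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)))
        if abs(tau_new - tau) < tol:
            break
        tau = 0.5 * (tau + tau_new)               # damped update for stability
    return tau, p

for n in (5, 10, 20):
    tau, p = bianchi_fixed_point(n)
    print(f"n={n:2d}: tau={tau:.4f}, p={p:.4f}")
```

Extensions of the kind the paper proposes graft sensing and silence states onto this chain and couple it to an M/G/1/K queue per node.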
Abstract:
We investigated family members’ lived experience of Parkinson’s disease (PD), aiming to identify opportunities for well-being. A lifeworld-led approach to healthcare was adopted. Interpretative phenomenological analysis was used to explore in-depth interviews with people living with PD and their partners. The analysis generated four themes: It’s more than just an illness revealed the existential challenge of diagnosis; Like a bird with a broken wing emphasized the need to adapt to increasing immobility through embodied agency; Being together with PD explored the kinship within couples and the belonging experienced through support groups; and Carpe diem! illuminated the significance of time and the fractured future orientation created by diagnosis. Findings were interpreted using an existential-phenomenological theory of well-being. We highlighted how partners shared the impact of PD through their own ontological challenges. Further research with different types of families and in different situations is required to identify the services needed to facilitate the process of learning to live with PD. Care and support for the family unit need to provide emotional support to manage threats to identity and agency, alongside problem-solving for bodily changes. Adopting a lifeworld-led healthcare approach would increase opportunities for well-being within the PD illness journey.
Abstract:
Quantitative analysis of solid-state processes from isothermal microcalorimetric data is straightforward if data for the total process have been recorded, and problematic (in the more likely case) when they have not. Data are usually plotted as a function of fraction reacted (α); for calorimetric data, this requires knowledge of the total heat change (Q) upon completion of the process. Determination of Q is difficult in cases where the process is fast (initial data missing) or slow (final data missing). Here we introduce several mathematical methods that allow the direct calculation of Q by selection of data points when only partial data are present, based on analysis with the Pérez-Maqueda model. All methods in addition allow direct determination of the reaction-mechanism descriptors m and n, and from these the rate constant k. The validity of the methods is tested with the use of simulated calorimetric data, and we introduce a graphical method for generating solid-state power-time data. The methods are then applied to the crystallization of indomethacin from a glass. All methods correctly recovered the total reaction enthalpy (16.6 J) and suggested that the crystallization followed an Avrami model. The rate constants for crystallization were determined to be 3.98 × 10−6, 4.13 × 10−6, and 3.98 × 10−6 s−1 with methods 1, 2, and 3, respectively. © 2010 American Chemical Society.
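To make the underlying kinetics concrete: with the Sesták-Berggren form that the Pérez-Maqueda analysis reduces to, dα/dt = kα^m(1−α)^n, and the calorimetric power is φ = Q·dα/dt. The Python sketch below fits Q, k, m and n directly to partial power-time data by nonlinear least squares; this is a generic fit, not the three point-selection methods of the paper, and all data are simulated:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def rhs(_, a, k, m, n):
    a = np.clip(a, 0.0, 1.0)              # guard against solver overshoot
    return k * a**m * (1.0 - a)**n        # Sestak-Berggren: da/dt = k a^m (1-a)^n

def power_curve(t, Q, k, m, n, alpha0=1e-6):
    """Calorimetric power phi = Q * da/dt at the observed times t."""
    sol = solve_ivp(rhs, (t[0], t[-1]), [alpha0], t_eval=t,
                    args=(k, m, n), rtol=1e-8)
    alpha = np.clip(sol.y[0], 0.0, 1.0)
    return Q * k * alpha**m * (1.0 - alpha)**n

# Hypothetical partial power-time data (early data missing, as for a fast onset).
t = np.linspace(20_000, 800_000, 200)                       # s
true = power_curve(t, Q=16.6, k=4.0e-6, m=0.4, n=1.1)       # Q in J, k in s^-1
data = true + np.random.default_rng(3).normal(0, 1e-6, t.size)

fit = least_squares(
    lambda p: power_curve(t, *p) - data,
    x0=[10.0, 1e-5, 0.5, 1.0],
    bounds=([1.0, 1e-8, 0.0, 0.0], [100.0, 1e-3, 2.0, 3.0]),
)
Q, k, m, n = fit.x
print(f"Q = {Q:.1f} J, k = {k:.2e} s^-1, m = {m:.2f}, n = {n:.2f}")
```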
Drying kinetic analysis of municipal solid waste using modified Page model and pattern search method
Abstract:
This work studied the drying kinetics of the organic fractions of municipal solid waste (MSW) samples with different initial moisture contents and presented a new method for determining drying kinetic parameters. A series of drying experiments at different temperatures was performed using a thermogravimetric technique. Based on the modified Page drying model and the general pattern search method, a new drying kinetic method was developed that uses multiple isothermal drying curves simultaneously. The new method fitted the experimental data more accurately than the traditional method. Drying kinetic behaviors under extrapolated conditions were also predicted and validated. The new method indicated that the drying activation energies for the samples with initial moisture contents of 31.1 and 17.2% on a wet basis were 25.97 and 24.73 kJ mol−1, respectively. These results are useful for drying process simulation and industrial dryer design. The new method can also be applied to determine the drying parameters of other materials with high reliability.
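A sketch of a simultaneous multi-curve fit in this spirit is given below, using the modified Page model MR = exp[−(kt)^n] with an Arrhenius rate constant k(T) = k0·exp(−Ea/RT). The curves are synthetic, and SciPy's derivative-free Nelder-Mead is used as a readily available stand-in for the paper's general pattern search method:

```python
import numpy as np
from scipy.optimize import minimize

R = 8.314  # J mol^-1 K^-1

def moisture_ratio(t, T, k0, Ea, n):
    """Modified Page model MR = exp(-(k t)^n) with Arrhenius k(T)."""
    k = k0 * np.exp(-Ea / (R * T))
    return np.exp(-(k * t) ** n)

# Synthetic isothermal drying curves at three temperatures (all values invented).
rng = np.random.default_rng(5)
t = np.linspace(0.0, 3600.0, 60)                                    # s
curves = {T: moisture_ratio(t, T, 5.0, 25_000.0, 1.1)
             + rng.normal(0.0, 0.01, t.size)
          for T in (323.0, 348.0, 373.0)}

def sse(params):
    """Pooled squared error over all curves, fitted simultaneously."""
    ln_k0, Ea, n = params                 # log-scale k0 for better conditioning
    return sum(((moisture_ratio(t, T, np.exp(ln_k0), Ea, n) - mr) ** 2).sum()
               for T, mr in curves.items())

res = minimize(sse, x0=[1.0, 20_000.0, 1.0], method="Nelder-Mead")
ln_k0, Ea, n = res.x
print(f"k0 = {np.exp(ln_k0):.3g} s^-1, Ea = {Ea / 1000:.1f} kJ/mol, n = {n:.2f}")
```

Fitting all isotherms against one shared (k0, Ea, n) set is what distinguishes this approach from fitting each curve separately and regressing k against 1/T afterwards.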