783 results for Resource-based theory


Relevance:

40.00%

Publisher:

Abstract:

The dehydrogenation of cyclohexanol to cyclohexanone is very important in the manufacture of nylon. Copper-based catalysts are the most popular catalysts for this reaction, yet the reaction mechanism and active site on these catalysts are still under debate. In order to elucidate the mechanism and active site of cyclohexanol dehydrogenation on copper-based catalysts, density functional theory calculations with dispersion corrections were performed on up to six facets of copper in two different oxidation states: monovalent copper and metallic copper. By calculating the surface energies of these facets, Cu(111) and Cu2O(111) were found to be the most stable facets for metallic copper and for monovalent copper, respectively. On these two facets, all the possible elementary steps in the dehydrogenation pathway of cyclohexanol were calculated, including adsorption, dehydrogenation, hydrogen coupling and desorption. Two different reaction pathways for dehydrogenation were considered on both surfaces. It was revealed that the dehydrogenation mechanisms differ on the two surfaces: on Cu(111) the hydroxyl hydrogen is removed first and the carbon-bound hydrogen is abstracted second, while on Cu2O(111) the carbon-bound hydrogen is removed first, followed by abstraction of the hydroxyl hydrogen. Furthermore, by comparing the energy profiles of the two surfaces, Cu2O(111) was found to be more active for cyclohexanol dehydrogenation than Cu(111). In addition, the coordinatively unsaturated copper sites on Cu2O(111) were found to be the reaction sites for all the steps. Therefore, the coordinatively unsaturated copper site on Cu2O(111) is likely to be the active site for cyclohexanol dehydrogenation on copper-based catalysts.
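
As a hedged aside on the facet-stability screening described above (the paper's exact slab setup is not given in this abstract), such comparisons typically rely on the standard symmetric-slab surface-energy formula
\[
\gamma = \frac{E_{\text{slab}} - N\,E_{\text{bulk}}}{2A},
\]
where E_slab is the total energy of a slab containing N bulk formula units, E_bulk is the bulk energy per formula unit, A is the area of one exposed face, and the factor 2 accounts for the two surfaces of the slab; the facet with the lowest γ (here Cu(111) for metallic copper and Cu2O(111) for Cu2O) is taken as the most stable.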

Relevance:

40.00%

Publisher:

Abstract:

In the presence of a (time-dependent) macroscopic electric field, the electron dynamics of dielectrics cannot be described by the time-dependent density alone. We present a real-time formalism that has the density and the macroscopic polarization P as key quantities. We show that a simple local function of P already captures long-range correlation in linear and nonlinear optical response functions. Specifically, after detailing the numerical implementation, we examine the optical absorption and the second- and third-harmonic generation of bulk Si, GaAs, AlAs and CdTe at different levels of approximation. We highlight links with ultranonlocal exchange-correlation functional approximations proposed within the linear-response time-dependent density functional theory framework.
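
As an illustrative sketch of the "simple local function of P" mentioned above (the paper's actual functional form is not reproduced here), one common polarization-functional approximation writes the macroscopic exchange-correlation field as a term proportional to the polarization itself,
\[
\boldsymbol{\mathcal{E}}_{\text{xc}}(t) \;\approx\; -\,\alpha\,\mathbf{P}(t),
\]
with α a material-dependent constant. A term of this kind mimics the long-range (1/q²-like) behaviour of ultranonlocal xc kernels in linear-response TDDFT, which is why it can capture long-range correlation in both the linear and the nonlinear response.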

Relevance:

40.00%

Publisher:

Abstract:

Existing Workflow Management Systems (WFMSs) follow a pragmatic approach. They often use a proprietary modelling language with an intuitive graphical layout, but the underlying semantics lack a formal foundation. As a consequence, analysis issues, such as proving correctness (i.e., soundness and completeness), and reliable execution are not supported at design level. This project will use an applied ontology approach, formally defining key terms such as process, sub-process and action/task on the basis of formal temporal theory. Current business process modelling (BPM) standards such as Business Process Modelling Notation (BPMN) and the Unified Modelling Language (UML) Activity Diagram (AD) model their constructs with no logical basis. This investigation will contribute to research and industry by providing a framework that gives BPM a formal grounding for reasoning about and representing a correct business process (BP). Such grounding is missing in the current BPM domain, and providing it may reduce design costs and avoid the burden of redundant terms used by the current standards. A graphical tool will be introduced that implements the formal ontology defined in the framework; this tool can be used both for modelling and, at the same time, for validating the model. The research will also fill the existing gap by providing a unified graphical representation of a BP, in a logically consistent manner, for the mainstream modelling standards in business and IT. A case study will be conducted to analyse a catalogue of existing ‘patient pathways’ (i.e., processes) of King’s College Hospital NHS Trust, including current performance statistics. Following the application of the framework, a mapping will be conducted and new performance statistics will be collected. A cost/benefit analysis report will be produced comparing the results of the two approaches.
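
As a purely hypothetical illustration of the kind of temporal-ontology axiom such a framework might contain (the predicates below are not taken from the project), a sub-process can be constrained to occur within the temporal extent of its parent process:
\[
\forall x\,\forall y\;\bigl(\mathit{subProcessOf}(x,y)\rightarrow \mathit{during}(\mathit{occ}(x),\mathit{occ}(y))\bigr),
\]
where occ(·) maps a process to the interval over which it occurs and during is an Allen-style interval relation. Axioms of this form are what allow soundness checks to be phrased as logical entailment rather than left implicit in a graphical notation.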

Relevance:

40.00%

Publisher:

Abstract:

Future distribution systems will have to deal with intensive penetration of distributed energy resources while ensuring reliable and secure operation according to the smart grid paradigm. SCADA (Supervisory Control and Data Acquisition) is an essential infrastructure for this evolution. This paper proposes a new conceptual design of an intelligent SCADA with a decentralized, flexible and intelligent approach, adaptive to its context (context awareness). This SCADA model is used to support the energy resource management undertaken by a distribution network operator (DNO). Resource management considers all the involved costs, power flows and electricity prices, allowing the use of network reconfiguration and load curtailment. Locational Marginal Prices (LMP) are evaluated and used in specific situations to apply Demand Response (DR) programs on a global or a local basis. The paper includes a case study using a 114-bus distribution network and load demand based on real data.
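
A minimal, hypothetical sketch of how locational marginal prices could trigger global or local demand response in the way the abstract describes; the thresholds, bus names and decision rule below are illustrative assumptions, not the paper's model:

```python
# Hypothetical sketch: apply a global DR program if the system-wide average
# LMP is too high, otherwise flag only the buses whose LMP exceeds a local
# trigger price for local DR.
def select_dr_actions(lmp_by_bus, local_trigger, global_trigger):
    avg_lmp = sum(lmp_by_bus.values()) / len(lmp_by_bus)
    if avg_lmp > global_trigger:
        return {"mode": "global", "buses": sorted(lmp_by_bus)}
    local = [b for b, lmp in lmp_by_bus.items() if lmp > local_trigger]
    return {"mode": "local", "buses": sorted(local)}

# Toy usage with three buses (prices in currency units per MWh)
print(select_dr_actions({"bus_12": 38.0, "bus_47": 95.0, "bus_80": 41.0},
                        local_trigger=60.0, global_trigger=90.0))
```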

Relevance:

40.00%

Publisher:

Abstract:

This paper presents a genetic algorithm for the Resource-Constrained Project Scheduling Problem (RCPSP). The chromosome representation of the problem is based on random keys. The schedule is constructed using a heuristic priority rule in which the priorities of the activities are defined by the genetic algorithm; the heuristic generates parameterized active schedules. The approach was tested on a set of standard problems taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
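
A minimal sketch of the random-key decoding idea, assuming a plain serial schedule-generation scheme and a single renewable resource; the paper's parameterized active-schedule heuristic and its GA operators are not reproduced here:

```python
# Sketch (not the authors' implementation): decode a random-key chromosome
# into a schedule. Among precedence-eligible activities, the one with the
# highest key is scheduled next, at the earliest resource-feasible time.
def decode(keys, durations, preds, demands, capacity, horizon):
    """keys[i]: random-key priority of activity i; preds[i]: set of predecessors;
    demands[i]: per-period resource demand; horizon must be long enough."""
    free = [capacity] * horizon           # remaining capacity per period
    finish = {}                           # activity -> finish time
    unscheduled = set(range(len(keys)))
    while unscheduled:
        eligible = [i for i in unscheduled if all(p in finish for p in preds[i])]
        i = max(eligible, key=lambda a: keys[a])      # GA-evolved priority
        t = max((finish[p] for p in preds[i]), default=0)
        while any(free[t + k] < demands[i] for k in range(durations[i])):
            t += 1                        # delay until resource-feasible
        for k in range(durations[i]):
            free[t + k] -= demands[i]
        finish[i] = t + durations[i]
        unscheduled.remove(i)
    return max(finish.values())           # makespan

# Example: 3 activities, single resource of capacity 2
print(decode(keys=[0.7, 0.2, 0.9], durations=[2, 3, 1],
             preds=[set(), {0}, {0}], demands=[1, 2, 1],
             capacity=2, horizon=20))
```

The fitness of a chromosome would then be the makespan returned by such a decoder, with the GA evolving only the key vector.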

Relevance:

40.00%

Publisher:

Abstract:

Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments, so its application in electricity markets can prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and provide players with the ability to react strategically, exhibiting the behavior that best fits their objectives. The model includes forecasts of competitor players' actions, used to build models of their behavior and thereby define the most probable scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent, not to reach the market equilibrium. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings from a case study based on real data from the Iberian Electricity Market are presented and discussed.
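
A minimal sketch of the final decision step, under the assumption that the scenario analysis yields a small set of forecast competitor profiles with probabilities; this is illustrative only and not MASCEM's actual algorithm:

```python
# Hypothetical sketch: choose the supported player's action that maximizes
# expected payoff over the forecast competitor scenarios.
def best_action(actions, scenarios, payoff):
    """scenarios: list of (probability, competitor_profile);
    payoff(action, competitor_profile) -> float (hypothetical market model)."""
    return max(actions,
               key=lambda a: sum(p * payoff(a, s) for p, s in scenarios))

# Toy usage: two candidate bid prices, two forecast scenarios
scenarios = [(0.7, {"rival_bid": 42.0}), (0.3, {"rival_bid": 55.0})]
payoff = lambda bid, s: (bid - 30.0) if bid < s["rival_bid"] else 0.0
print(best_action([40.0, 50.0], scenarios, payoff))   # -> 40.0
```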

Relevance:

40.00%

Publisher:

Abstract:

Smart Grids (SGs) have emerged as the new paradigm for power system operation and management, being designed to accommodate large amounts of distributed energy resources. This new paradigm requires new Energy Resource Management (ERM) methodologies that consider different operation strategies and the existence of new management players, such as several types of aggregators. This paper proposes a methodology, based on a game theory approach, to facilitate coalitions between distributed generation units, giving rise to Virtual Power Players (VPP). The proposed approach consists of analysing the classifications attributed by each VPP to the distributed generation units, as well as the contracts previously established by each player. The proposed classification model is based on fourteen parameters, including technical, economic and behavioural ones. Depending on each VPP's strategies, size and goals, each parameter has a different importance. VPPs can also manage other types of energy resources, such as storage units, electric vehicles, demand response programs, or even parts of the MV and LV distribution networks. A case study with twelve VPPs with different characteristics and one hundred and fifty real distributed generation units is included in the paper.
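
A hypothetical sketch of the weighted classification idea: each VPP scores candidate DG units with its own parameter weights. The paper's fourteen actual parameters are not listed in the abstract, so the parameter names, weights and values below are invented:

```python
# Illustrative only: weighted-sum scoring of DG units from one VPP's viewpoint.
def score_unit(unit, weights):
    return sum(weights[p] * unit[p] for p in weights)

weights = {"availability": 0.4, "price": 0.35, "contract_history": 0.25}
units = {
    "wind_A":  {"availability": 0.8, "price": 0.6, "contract_history": 0.9},
    "solar_B": {"availability": 0.6, "price": 0.9, "contract_history": 0.5},
}
ranking = sorted(units, key=lambda u: score_unit(units[u], weights), reverse=True)
print(ranking)   # -> ['wind_A', 'solar_B'] for this VPP's weights
```

A different VPP would plug in different weights for the same parameters, which is how the strategy-, size- and goal-dependent importance described above enters the coalition formation.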

Relevance:

40.00%

Publisher:

Abstract:

The threat of punishment usually promotes cooperation. However, punishing is itself costly, is rare in nonhuman animals, and humans who punish often finish with low payoffs in economic experiments. The evolution of punishment has therefore been unclear. Recent theoretical developments suggest that punishment has evolved in the context of reputation games. We tested this idea in a simple helping game with observers and with punishment and punishment reputation (experimentally controlling for other possible reputational effects). We show that punishers fully recoup their costs because they receive help more often. The more likely it is that defection is punished within a group, the higher the level of within-group cooperation. These beneficial effects disappear if the punishment reputation is removed. We conclude that reputation is key to the evolution of punishment.
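
In back-of-the-envelope terms (not the paper's formal model), punishment fully compensates its cost whenever the extra help a punisher attracts outweighs what punishing costs,
\[
b\,\Delta h \;>\; c_{\text{pun}},
\]
where Δh is the number of additional helping acts received thanks to the punishment reputation, b the benefit of each act of help, and c_pun the total cost of punishing; removing the punishment reputation sets Δh to zero, which is why the benefit vanishes.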

Relevance:

40.00%

Publisher:

Abstract:

Over the past several decades, many theories have been advanced as to why efforts to reform the public service have met with only limited success. Clearly, the role of leadership with respect to reform must be examined, since successful organizational leaders should be extremely accomplished in the promotion and protection of the values that underlie decision-making. The issue of effective leadership is particularly significant for the future of the public service of Canada. Large numbers of public servants in the executive ranks are due to retire within the next five years. Given their central role, it is vital that there be enough dedicated and committed public servants to staff future vacancies. It is also essential that future public service leaders possess the competencies and values associated with a world-class public service and a new type of public organization. Related to this point is the importance of people-management skills. People management in the public service is an issue that has historically faced - and will continue to face - major challenges with respect to recruiting and retaining the leaders it requires for its continued success. It is imperative that the public service not only be revitalized and be seen as an employer of choice, but also that the process by which it accomplishes this goal - the practice of human resource management - be modernized. To achieve the flexibility needed to remain effective, the public service requires the kind of leadership that supports new public service values such as innovation and that emphasizes a "people-first" approach. This thesis examines the kind of public service leadership needed to modernize the human resource management regime in the federal public service. A historical examination of public service values is presented to help determine the values that are important for public service leaders with respect to modernizing human resource management. Since replenishing the ranks of public service leaders is crucial to ensure the quality of service to Canadians, leadership that supports career planning will be a major focus of this paper. In addition, this thesis demonstrates that while traditional public service values continue to endure, innovative leaders must effectively reconcile new public service values with traditional values in order to increase the possibilities for successful reform as well as achieve business objectives. Much of the thesis is devoted to explaining the crucial role of post-bureaucratic leadership in successful reform. One of the major findings of the thesis is that leaders who demonstrate a blending of new public service values and traditional values are critical to creating effective employment relationships, which are key to modernizing human resource management in the federal public service. It will be apparent that public service leaders must ensure that an appropriate accountability framework is in place before embarking on reform. However, leaders who support new public service values such as innovation and empowerment and who emphasize the importance of people are essential to successful reform.

Relevance:

40.00%

Publisher:

Abstract:

This thesis, based on the results of an organizational ethnography of a university-based feminist organization in Southern Ontario (the Centre), traces how third wave feminism is being constituted in the goals, initiatives, mandate, organizational structure, and overall culture of university-based feminist organizations. I argue that, from its inception, the meanings and goals of the Centre have been contested through internal critique, reflection, and discussion inspired by significant shifts in feminist theory that challenge the fundamental principles of second wave feminism. I identify a major shift in the development and direction of the Centre that occurs in two distinct phases. The first phase of the shift occurs with the emergence of an anti-oppression framework, which broadens the Centre's mandate beyond gender and sexism to consider multiple axes of identity and oppression that affect women's lives. The second phase of this shift is characterized by a focus on (trans) inclusion and accessibility and has involved changing the Centre's name so that it is no longer identified as a women's centre, in order to reflect more accurately its focus on multiple axes of identity and oppression. Along with identifying two phases of a major shift in the direction of the Centre, I trace two discourses about its development. The dominant discourse of the Centre's development is one of progress and evolution; it characterizes the Centre as a dynamic feminist organization that consistently strives to be more inclusive and diverse. The reverse discourse undermines the dominant discourse by emphasizing that, despite the Centre's official attempts to be inclusive and to build diversity, little has actually changed, leaving women of colour marginalized in the Centre's dominant culture of whiteness. This research reveals that, while many of their strategies have unintended (negative) consequences, members of the Centre are working to build an inclusive politics of resistance that avoids the mistakes of earlier feminist movements and organizations. These members, along with other activists, actively constitute third wave feminism in a process that is challenging, contradictory, and often painful. A critical analysis of this process and the strategies it involves provides an opportunity for activists to reflect on their experiences and develop new strategies in an effort to further struggles for social justice and equity.

Relevance:

40.00%

Publisher:

Abstract:

A review of the literature reveals that there are a number of children in the educational system who are characterized by Attention Deficit Disorder. Further review of the literature reveals that there are information processing programs which have had some success in increasing the learning of these children. Currently, an information processing program based on schema theory is being implemented in Lincoln County. Since schema-theory-based programs build structural, conditional, factual, and procedural schemata which assist the learner in attending to salient factors, learning should be increased. Thirty-four children were selected from a random sampling of Grade Seven classes in Lincoln County. Seventeen of these children were identified by the researcher and classroom teacher as being characterized by Attention Deficit Disorder. From the remaining population, seventeen children who were not characterized by Attention Deficit Disorder were randomly selected. The data collected were compared using independent t-tests, paired t-tests, and correlation analysis. Significant differences were found in all cases. The non-Attention Deficit Disorder children scored significantly higher on all the tests, but the Attention Deficit Disorder children had a significantly higher ratio of gain between the pretests and posttests.
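
A hypothetical sketch of the kinds of comparisons described above; the study's actual scores are not reported in this abstract, so the data below are invented, and the gain ratio shown is just one possible definition:

```python
# Illustrative only: the statistical comparisons named in the abstract,
# run on made-up pretest/posttest scores for the two groups.
from scipy import stats

add_pre,  add_post  = [12, 10, 14, 11], [18, 17, 20, 16]   # ADD group
nadd_pre, nadd_post = [20, 22, 19, 21], [24, 26, 23, 25]   # non-ADD group

# Independent t-test: do the two groups differ on the posttest?
print(stats.ttest_ind(add_post, nadd_post))
# Paired t-test: pretest-to-posttest change within the ADD group
print(stats.ttest_rel(add_pre, add_post))
# A simple gain ratio (total posttest / total pretest) for each group
gain = lambda pre, post: sum(post) / sum(pre)
print(gain(add_pre, add_post), gain(nadd_pre, nadd_post))
```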

Relevance:

40.00%

Publisher:

Abstract:

Second-rank tensor interactions, such as quadrupolar interactions between spin-1 deuterium nuclei and the electric field gradients created by chemical bonds, are affected by rapid random molecular motions that modulate the orientation of the molecule with respect to the external magnetic field. In biological and model membrane systems, where a distribution of dynamically averaged anisotropies (quadrupolar splittings, chemical shift anisotropies, etc.) is present and where, in addition, various parts of the sample may undergo a partial magnetic alignment, the numerical analysis of the resulting Nuclear Magnetic Resonance (NMR) spectra is a mathematically ill-posed problem. However, numerical methods (de-Pakeing, Tikhonov regularization) exist that allow for a simultaneous determination of both the anisotropy and orientational distributions. An additional complication arises when relaxation is taken into account. This work presents a method of obtaining the orientation dependence of the relaxation rates that can be used for the analysis of molecular motions on a broad range of time scales. An arbitrary set of exponential decay rates is described by a three-term truncated Legendre polynomial expansion in the orientation dependence, as appropriate for a second-rank tensor interaction, and a linear approximation to the individual decay rates is made. Thus a severe numerical instability caused by the presence of noise in the experimental data is avoided. At the same time, enough flexibility in the inversion algorithm is retained to achieve a meaningful mapping from raw experimental data to a set of intermediate, model-free parameters.
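
Concretely, for a second-rank tensor interaction the three-term truncated expansion referred to above takes the form (a standard parameterization, shown here only for orientation):
\[
R(\theta) \;\approx\; a_0\,P_0(\cos\theta) + a_2\,P_2(\cos\theta) + a_4\,P_4(\cos\theta),
\]
where θ is the angle between the motionally averaged symmetry axis and the external magnetic field, the P_n are Legendre polynomials (only even orders up to 4 contribute for a rank-2 interaction), and the coefficients a_0, a_2 and a_4 are the quantities estimated, approximately linearly, from the experimental decay data.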

Relevance:

40.00%

Publisher:

Abstract:

Ongoing changes in the global economic structure, along with the information revolution, have produced an environment in which knowledge and skills, or education and training, are considered increasingly valuable commodities. This is based on the simple notion that a nation's economic progress is linked to education and training. The idea is embodied in the theory of human capital, according to which the knowledge and skill found in labour represent valuable resources for the market. The important assumptions of human capital theory are: (1) human capital is an investment for the future; (2) more training and education lead to better work skills; (3) educational institutions play a central role in the development of human capital; and (4) the technological revolution is often cited as the most pressing reason why education and knowledge are becoming valuable economic commodities. The objectives of the present study are: the investment and institutional or structural framework of higher education in Kerala; the higher education market and the strengths and weaknesses of supply and demand conditions; the costs and benefits of higher education in Kerala; the impact of recent policy changes in higher education; the need to expand the higher education market, on the basis of systematic manpower planning, to solve the grave problem of unemployment; and higher education's association with income and employment.