863 results for Job Demands-Resources Model


Relevance: 30.00%

Publisher:

Abstract:

A description and model of the near-surface hydrothermal system at Casa Diablo, with its implications for the larger-scale hydrothermal system of Long Valley, California, is presented. The data include resistivity profiles with penetration to three different depth ranges, and analyses of inorganic mercury concentrations in 144 soil samples taken over a 1.3 by 1.7 km area. Analyses of the data, together with mapping of active surface hydrothermal features (fumaroles, mudpots, etc.), have revealed that the relationship between the hydrothermal system, surface hydrothermal activity, and mercury anomalies is strongly controlled by faults and topography. There are, however, more subtle factors responsible for the location of many active and anomalous zones, such as fractures, zones of high permeability, and interactions between hydrothermal and cooler groundwater. In addition, the near-surface location of the upwelling from the deep hydrothermal reservoir, which supplies the geothermal power plants at Casa Diablo and the numerous hot pools in the caldera with hydrothermal water, has been detected. The data indicate that after upwelling, the hydrothermal water flows eastward at shallow depth for at least 2 km and probably continues another 10 km to the east, all the way to Lake Crowley.

Relevance: 30.00%

Publisher:

Abstract:

Virtual machines (VMs) are powerful platforms for building agile datacenters and emerging cloud systems. However, resource management for a VM-based system is still a challenging task. First, the complexity of application workloads, as well as interference among competing workloads, makes it difficult to understand a VM's resource demands for meeting its Quality of Service (QoS) targets. Second, the dynamics of the applications and system also make it difficult to maintain the desired QoS target while the environment changes. Third, the transparency of virtualization presents a hurdle for the guest-layer application and the host-layer VM scheduler to cooperate to improve application QoS and system efficiency. This dissertation proposes to address the above challenges through fuzzy modeling and control-theory-based VM resource management. First, a fuzzy-logic-based nonlinear modeling approach is proposed to accurately capture a VM's complex demands for multiple types of resources, automatically and online, based on the observed workload and resource usage. Second, to enable fast adaptation in resource management, the fuzzy modeling approach is integrated with a predictive-control-based controller to form a new Fuzzy Modeling Predictive Control (FMPC) approach, which can quickly track applications' QoS targets and optimize resource allocations under dynamic changes in the system. Finally, to address the limitations of black-box resource management solutions, a cross-layer optimization approach is proposed to enable cooperation between a VM's host and guest layers and further improve application QoS and resource usage efficiency. The proposed approaches are prototyped on a Xen-based virtualized system and evaluated with representative benchmarks including TPC-H, RUBiS, and TerraFly.
The results demonstrate that the fuzzy-modeling-based approach improves the accuracy in resource prediction by up to 31.4% compared to conventional regression approaches. The FMPC approach substantially outperforms the traditional linear-model-based predictive control approach in meeting application QoS targets for an oversubscribed system. It is able to manage dynamic VM resource allocations and migrations for over 100 concurrent VMs across multiple hosts with good efficiency. Finally, the cross-layer optimization approach further improves the performance of a virtualized application by up to 40% when the resources are contended by dynamic workloads.
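As a rough illustration of the fuzzy-modeling idea described above, the sketch below maps an observed workload metric to a CPU allocation through a few triangular membership rules combined by a weighted average. The rule breakpoints and output levels are invented for illustration; the dissertation's actual model is learned online from observed usage and drives a predictive controller.

```python
# Sketch of a Sugeno-style fuzzy rule model mapping an observed workload
# metric to a CPU allocation. The breakpoints and outputs are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_cpu_share(request_rate):
    """Fuzzy estimate of the CPU share a VM needs at a given request rate,
    combining three rules by a membership-weighted average."""
    rules = [
        (tri(request_rate, -1, 0, 500), 0.2),       # low load    -> 20% CPU
        (tri(request_rate, 250, 500, 750), 0.5),    # medium load -> 50% CPU
        (tri(request_rate, 500, 1000, 2000), 0.9),  # high load   -> 90% CPU
    ]
    weight_sum = sum(w for w, _ in rules)
    if weight_sum == 0:
        return 0.2  # fall back to the low-load allocation
    return sum(w * out for w, out in rules) / weight_sum
```

A model like this is nonlinear in the workload metric yet cheap to evaluate, which is what makes it usable inside a control loop that re-allocates resources online.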

Relevance: 30.00%

Publisher:

Abstract:

The future power grid will effectively utilize renewable energy resources and distributed generation to respond to energy demand while incorporating information technology and communication infrastructure for optimal operation. This dissertation contributes to the development of real-time techniques for wide-area monitoring and secure real-time control and operation of hybrid power systems. To handle the increased level of real-time data exchange, this dissertation develops a supervisory control and data acquisition (SCADA) system equipped with a state estimation scheme driven by the real-time data. This system is verified on a specially developed laboratory-based test bed, a hardware and software platform that emulates the scenarios of a real hybrid power system with the highest practical fidelity to utility systems. It includes phasor measurements at hundreds of measurement points on the system, obtained from a specially developed laboratory-based Phasor Measurement Unit (PMU) used on the interconnected system alongside existing commercial PMUs. The studies included a new technique for detecting partially islanded microgrids, in addition to several real-time techniques for synchronization and parameter identification of hybrid systems. Moreover, given the extensive integration of renewable energy resources through DC microgrids, this dissertation examines several practical cases for improving the interoperability of such systems.

Furthermore, the increasing number of small, dispersed generating stations and their need to connect quickly and properly to AC grids led this work to explore the challenges that arise in synchronizing generators to the grid, and to introduce a Dynamic Brake system that improves the process of connecting distributed generators to the power grid. Real-time operation and control require secure data communication. A research effort in this dissertation developed a Trusted Sensing Base (TSB) process for data communication security. The innovative TSB approach improves the security of the power grid as a cyber-physical system; it is based on available GPS synchronization technology and provides protection against confidentiality attacks on critical power system infrastructure.
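The state-estimation idea mentioned above can be reduced to its simplest textbook form: fusing redundant, differently noisy measurements of a single quantity by weighted least squares, with weights equal to inverse noise variances. The single-state sketch below is illustrative only; a real SCADA estimator solves a large (generally nonlinear) WLS problem over the whole network state vector.

```python
def wls_estimate(measurements, variances):
    """Weighted least-squares fusion of redundant measurements of a single
    quantity (e.g., one bus angle seen by several PMUs): each measurement
    is weighted by its inverse noise variance. This is the textbook WLS
    estimator reduced to one state."""
    weights = [1.0 / v for v in variances]
    return sum(w * z for w, z in zip(weights, measurements)) / sum(weights)

# A low-noise commercial PMU (variance 0.1) dominates a noisier lab unit
# (variance 1.0), pulling the estimate toward its reading:
angle = wls_estimate([1.0, 2.0], [0.1, 1.0])
```

With equal variances this degenerates to a plain average; the value of redundant PMU placement is precisely that the estimator can down-weight the noisier sources.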

Relevance: 30.00%

Publisher:

Abstract:

The increasing emphasis on mass customization, shortened product lifecycles, and synchronized supply chains, coupled with advances in information systems, is driving most firms toward make-to-order (MTO) operations. Increasing global competition, lower profit margins, and higher customer expectations force MTO firms to plan their capacity by managing effective demand. The goal of this research was to maximize the operational profit of a make-to-order operation by selectively accepting incoming customer orders and simultaneously allocating capacity for them at the sales stage. To integrate the two decisions, a Mixed-Integer Linear Program (MILP) was formulated that can aid an operations manager in an MTO environment in selecting a set of potential customer orders such that all selected orders are fulfilled by their deadlines. The proposed model combines the order acceptance/rejection decision with detailed scheduling. Experiments with the formulation indicate that for larger problem sizes, the computational time required to determine an optimal solution is prohibitive. The formulation has a block diagonal structure and can be decomposed into a master problem and one or more sub-problems (one per customer order) by applying Dantzig-Wolfe decomposition. To solve the original MILP efficiently, an exact branch-and-price algorithm was developed. Various approximation algorithms were developed to further improve the runtime. Experiments unequivocally show the efficiency of these algorithms compared to a commercial optimization solver. The existing literature addresses the static order acceptance problem for a single-machine environment with regular capacity, an objective of maximizing profit, and a penalty for tardiness. This dissertation solves the order acceptance and capacity planning problem for a job shop environment with multiple resources.

Both regular and overtime resources are considered. The branch-and-price algorithms developed in this dissertation are fast and can be incorporated into a decision support system used on a daily basis to help make intelligent decisions in an MTO operation.
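The core acceptance-with-deadlines decision can be shown on a toy single-machine instance: enumerate subsets of orders, keep only those that can all meet their deadlines (checked by earliest-deadline-first scheduling), and pick the most profitable one. The order data below are hypothetical; this brute force is exponential, which is exactly why the dissertation's branch-and-price decomposition matters for realistic job-shop instances.

```python
from itertools import combinations

# Hypothetical orders: (name, profit, processing_time, deadline).
ORDERS = [
    ("A", 10, 4, 6),
    ("B", 8, 3, 5),
    ("C", 15, 5, 9),
]

def feasible(subset):
    """A set of orders fits on one machine iff running them in
    earliest-deadline-first order meets every deadline."""
    t = 0
    for _, _, proc, due in sorted(subset, key=lambda o: o[3]):
        t += proc
        if t > due:
            return False
    return True

def best_selection(orders):
    """Brute-force order acceptance: maximize total profit over all
    feasible subsets. Exponential in the number of orders; decomposition
    methods are what make realistic instances tractable."""
    best, best_profit = (), 0
    for r in range(1, len(orders) + 1):
        for subset in combinations(orders, r):
            profit = sum(o[1] for o in subset)
            if profit > best_profit and feasible(subset):
                best, best_profit = subset, profit
    return best_profit, sorted(o[0] for o in best)
```

On this instance, accepting all three orders is infeasible, so the optimizer must reject one; the interplay between rejection and scheduling is what the integrated MILP captures.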

Relevance: 30.00%

Publisher:

Abstract:

Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated in parallel lines. An example of this manufacturing system configuration is observed at a facility equipped to assemble and test web servers. Characteristics of a typical web server assembly line are multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and various configurations of parallel processing with multiple product classes, and job circulation due to random part failures. In addition, correction terms obtained via regression analysis were added to the approximations to minimize the error between the analytical approximations and the simulation models. Markovian and general-type manufacturing systems were studied, with multiple product classes, job circulation due to failures, and fork-join subsystems to model parallel processing. In both the Markovian and general cases, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to handle more than two products. Numerical comparisons showed that the approximations perform remarkably well when the correction factors are used. Overall, the average flow time error was reduced from 38.19% to 5.59% in the Markovian case, and from 26.39% to 7.23% in the general case.

All the equations in the analytical formulations were implemented as a set of Matlab scripts. Using these scripts, operations managers of web server assembly lines, or of manufacturing and service systems with similar characteristics, can estimate various system performance measures and make judicious decisions, especially in setting delivery due dates, capacity planning, and bottleneck mitigation.
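For a single Markovian station, the kind of analytical approximation described above starts from the M/M/1 flow-time formula. The sketch below shows that base formula plus a linear correction term of the same general shape; the coefficients are placeholders, not the regression-fitted values from the dissertation.

```python
def mm1_flow_time(arrival_rate, service_rate):
    """Expected time in system at an M/M/1 station: W = 1 / (mu - lambda),
    valid only when utilization rho = lambda / mu < 1."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable station: utilization >= 1")
    return 1.0 / (service_rate - arrival_rate)

def corrected_flow_time(arrival_rate, service_rate, a=1.0, b=0.0):
    """Analytical flow time plus a regression-style correction a*W + b.
    The coefficients a and b here are illustrative placeholders."""
    return a * mm1_flow_time(arrival_rate, service_rate) + b
```

The appeal of this structure is that the closed-form term carries the queueing behavior while the fitted correction absorbs what the approximation misses at high product counts and traffic intensities.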

Relevance: 30.00%

Publisher:

Abstract:

This article discusses the policy directives for initial teacher education introduced in Brazil in the 1990s. These directives brought new directions and demands, both for the training institutions and for teacher education itself. The work analyzes the expansion and diversification of training venues, a topic criticized by teacher educators for enabling varied and flexible models that have led to a more technical and instrumental preparation at the expense of a solid theoretical and practical formation. Overall, the study reveals that the use of the CEFETs as spaces for teacher education has prioritized quantitative targets, the optimization of resources, and an instrumental preparation of educators, even under a discourse of educational quality. By contrast, the implementation of this policy at CEFET-RN has favored a formative model that combines research, extension, and teaching, guaranteeing a solid formation, the articulation between theory and practice, and the investigative disposition needed to form educators committed to the quality of public education.

Relevance: 30.00%

Publisher:

Abstract:

Successful conservation of migratory birds demands that we understand how habitat factors on the breeding grounds influence breeding success. Multiple factors are known to directly influence breeding success in territorial songbirds. For example, greater food availability and fewer predators can have direct effects on breeding success. However, many of these same habitat factors can also result in higher conspecific density, which may ultimately reduce breeding success through density dependence. In this case, there is a negative indirect effect of habitat on breeding success through its effects on conspecific density and territory size. Therefore, a key uncertainty facing land managers is whether important habitat attributes influence breeding success directly or indirectly through territory size. We used radio-telemetry, point counts, vegetation sampling, predator observations, and insect sampling over two years to collect data on habitat selection by a steeply declining songbird species, the Canada Warbler (Cardellina canadensis). These data were then applied in a hierarchical path modeling framework with an AIC model selection approach to determine the habitat attributes that best predict breeding success. Canada Warblers had smaller territories in areas with high shrub cover, in the presence of red squirrels (Tamiasciurus hudsonicus), at shoreline sites relative to forest-interior sites, and as conspecific density increased. Breeding success was lower for birds with smaller territories, which suggests competition for limited food resources, but there was no direct evidence that food availability influenced territory size or breeding success.

The negative relationship between shrub cover and territory size in our study may arise because these habitat conditions are spatially heterogeneous: individuals pack into patches of preferred breeding habitat scattered throughout the landscape, resulting in reduced territory size and an associated reduction in resource availability per territory. Our results therefore highlight the importance of considering both direct and indirect effects for Canada Warblers; efforts to increase the amount of breeding habitat may ultimately result in lower breeding success if habitat availability is limited and negative density-dependent effects occur.
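The AIC model selection used above can be sketched in a few lines: each candidate path model receives an AIC score from its fit and parameter count, and Akaike weights convert score differences into relative support. Function names here are illustrative, not from the study's analysis code.

```python
from math import exp

def aic(num_params, log_likelihood):
    """Akaike Information Criterion: AIC = 2k - 2 ln(L); lower is better."""
    return 2 * num_params - 2 * log_likelihood

def akaike_weights(aics):
    """Convert AIC differences into relative support for each candidate
    model (weights sum to 1; the best-scoring model gets the largest)."""
    best = min(aics)
    raw = [exp(-0.5 * (a - best)) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]
```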

Relevance: 30.00%

Publisher:

Abstract:

Many service firms require frontline service employees (FLEs) to follow routines and standardized operating procedures during the service encounter, to deliver consistently high service standards. However, to create superior, pleasurable experiences for customers, featuring both helpful services and novel approaches to meeting their needs, firms in various sectors also have begun to encourage FLEs to engage in more innovative service behaviors. This study therefore investigates a new and complementary route to customer loyalty, beyond the conventional service-profit chain, that moves through FLEs' innovative service behavior. Drawing on conservation of resources (COR) theory, this study introduces a resource gain spiral at the service encounter, which runs from FLEs' emotional job engagement to innovative service behavior, and then leads to customer delight and finally customer loyalty. In accordance with COR theory, the proposed model also includes factors that might hinder (customer aggression, underemployment) or foster (colleague support, supervisor support) FLEs' resource gain spiral. A multilevel analysis of a large-scale, dyadic data set that contains responses from both FLEs and customers in multiple industries strongly supports the proposed resource gain spiral as a complementary route to customer loyalty. The positive emotional job engagement-innovative service behavior relationship is undermined by customer aggression and underemployment, as hypothesized. Surprisingly though, and contrary to the hypotheses, colleague and supervisor support do not seem to foster FLEs' resource gain spiral. Instead, colleague support weakens the engagement-innovative service behavior relationship, and supervisor support does not affect it. These results indicate that if FLEs can solicit resources from other sources, they may not need to invest as many of their individual resources. 
In particular, colleague support even appears to serve as a substitute for FLEs' individual resource investments in the resource gain spiral.

Relevance: 30.00%

Publisher:

Abstract:

Research into the dynamicity of job performance criteria has found evidence of rank-order changes in job performance scores across time, as well as intraindividual trajectories in job performance scores across time. These findings have influenced a large body of research into (a) the dynamicity of validities of individual differences predictors of job performance and (b) the relationship between individual differences predictors of job performance and intraindividual trajectories of job performance. In the present dissertation, I addressed these issues within the context of the Five Factor Model of personality. The Five Factor Model is arranged hierarchically, with five broad higher-order factors subsuming a number of more narrowly tailored personality facets. Research has debated the relative merits of broad versus narrow traits for predicting job performance, but this entire body of research has addressed the issue from a static perspective, by examining the relative magnitude of validities of global factors versus their facets. While research along these lines has been enlightening, theoretical perspectives suggest that the validities of global factors versus their facets may differ in their stability across time. Thus, research is needed not only to compare the relative magnitude of validities of global factors versus their facets at a single point in time, but also to compare the relative stability of those validities across time. Also necessary to advance cumulative knowledge concerning intraindividual performance trajectories is research into broad versus narrow traits for predicting such trajectories. In the present dissertation, I addressed these issues using a four-year longitudinal design. The results indicated that the validities of global conscientiousness were stable across time, while the validities of conscientiousness facets were more likely to fluctuate.
However, the validities of emotional stability and extraversion facets were no more likely to fluctuate across time than those of the factors. Finally, while some personality factors and facets predicted performance intercepts (i.e., performance at the first measurement occasion), my results failed to indicate a significant effect of any personality variable on performance growth. Implications for research and practice are discussed.

Relevance: 30.00%

Publisher:

Abstract:

Software engineering researchers are challenged to provide increasingly powerful levels of abstraction to address the rising complexity inherent in software solutions. One development paradigm that places models as abstractions at the forefront of the development process is Model-Driven Software Development (MDSD). MDSD considers models first-class artifacts, extending engineers' capability to use concepts from the problem domain of discourse to specify apropos solutions. A key component of MDSD is domain-specific modeling languages (DSMLs), languages with focused expressiveness targeting a specific taxonomy of problems. The de facto approach is to first transform DSML models into an intermediate artifact in a high-level language (HLL), e.g., Java or C++, and then execute the resulting code. Our research group has developed a class of DSMLs, referred to as interpreted DSMLs (i-DSMLs), in which models are directly interpreted by a specialized execution engine with semantics based on model changes at runtime. This execution engine uses a layered architecture and is referred to as a domain-specific virtual machine (DSVM). As the domain-specific model being executed descends the layers of the DSVM, the semantic gap between the user-defined model and the services provided by the underlying infrastructure is closed. The focus of this research is the synthesis engine, the layer in the DSVM that transforms i-DSML models into executable scripts for the next lower layer to process. The appeal of an i-DSML is constrained because it possesses unique semantics contained within the DSVM. Existing DSVMs for i-DSMLs exhibit tight coupling between the implicit model of execution and the semantics of the domain, making it difficult to develop DSVMs for new i-DSMLs without a significant investment of resources. At the onset of this research, only one i-DSML had been created using the aforementioned approach, for the user-centric communication domain.

This i-DSML is the Communication Modeling Language (CML), and its DSVM is the Communication Virtual Machine (CVM). A major problem with the CVM's synthesis engine is that the domain-specific knowledge (DSK) and the model of execution (MoE) are tightly interwoven; consequently, subsequent DSVMs would need to be developed from inception with no reuse of expertise. This dissertation investigates how to decouple the DSK from the MoE and subsequently produce a generic model of execution (GMoE) from the remaining application logic. This GMoE can be reused to instantiate synthesis engines for DSVMs in other domains. The generalized approach to developing the model synthesis component of i-DSML interpreters utilizes a reusable framework loosely coupled to the DSK through swappable framework extensions. The approach involves first creating an i-DSML and its DSVM for a second domain, demand-side smart grid (microgrid) energy management, and designing the synthesis engine so that the DSK and MoE are easily decoupled. To validate the utility of the approach, synthesis engines are instantiated using the GMoE and the DSKs of the two aforementioned domains, and an empirical study is performed to support our claim of reduced development effort.
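One way to picture the proposed decoupling is the classic strategy pattern: a single generic execution loop (the GMoE) parameterized by swappable domain-specific knowledge (the DSK). The class and method names below are hypothetical and greatly simplified relative to the actual CVM and microgrid DSVMs.

```python
# Sketch of GMoE/DSK decoupling as a strategy pattern. Names are
# illustrative, not taken from the CVM or microgrid implementations.

class DomainKnowledge:
    """DSK extension point: each domain maps model changes to commands."""
    def synthesize(self, model_change):
        raise NotImplementedError

class CommunicationDSK(DomainKnowledge):
    def synthesize(self, model_change):
        return f"cvm: setup {model_change}"

class MicrogridDSK(DomainKnowledge):
    def synthesize(self, model_change):
        return f"grid: dispatch {model_change}"

class SynthesisEngine:
    """GMoE: one generic execution loop, instantiated with different DSKs."""
    def __init__(self, dsk):
        self.dsk = dsk

    def run(self, model_changes):
        return [self.dsk.synthesize(c) for c in model_changes]
```

The point of the pattern is that supporting a new domain means writing only a new `DomainKnowledge` subclass; the execution loop, and the expertise embodied in it, is reused unchanged.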

Relevance: 30.00%

Publisher:

Abstract:

The Model for Prediction Across Scales (MPAS) is a novel set of Earth system simulation components consisting of an atmospheric model, an ocean model, and a land-ice model. Its distinct features are the use of unstructured Voronoi meshes and C-grid discretisation to address the shortcomings, with respect to parallel scalability, numerical accuracy, and physical consistency, of global models on regular grids and of limited-area models nested in a forcing data set. This concept allows one to include the feedback of regional land use information on weather and climate at local and global scales in a consistent way, which is impossible to achieve with traditional limited-area modelling approaches. Here, we present an in-depth evaluation of MPAS with regard to the technical aspects of performing model runs and scalability for three medium-size meshes on four high-performance computing (HPC) sites with different architectures and compilers. We uncover model limitations and identify new aspects of model optimisation introduced by the use of unstructured Voronoi meshes. We further demonstrate the performance of MPAS in terms of its capability to reproduce the dynamics of the West African monsoon (WAM) and its associated precipitation in a pilot study. Constrained by available computational resources, we compare 11-month runs for two meshes with observations and a reference simulation from the Weather Research and Forecasting (WRF) model. We show that MPAS can reproduce the atmospheric dynamics on global and local scales in this experiment, but identify a precipitation excess for the West African region. Finally, we conduct extreme scaling tests on a global 3 km mesh with more than 65 million horizontal grid cells on up to half a million cores. We discuss the modifications of the model code necessary to improve its parallel performance, both in general and specific to the HPC environment.

We confirm good scaling (70% parallel efficiency or better) of the MPAS model and provide numbers on the computational requirements for experiments with the 3 km mesh. In doing so, we show that global, convection-resolving atmospheric simulations with MPAS are within reach of current and next generations of high-end computing facilities.
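The parallel-efficiency figure quoted above follows from the standard strong-scaling definition relative to a reference run: E = (p_ref * T_ref) / (p * T), where p is the core count and T the runtime. A minimal helper:

```python
def parallel_efficiency(ref_cores, ref_time, cores, time):
    """Strong-scaling efficiency relative to a reference run:
    E = (ref_cores * ref_time) / (cores * time); 1.0 is ideal scaling."""
    return (ref_cores * ref_time) / (cores * time)
```

For example, doubling the cores while the runtime falls from 100 s to 50 s gives E = 1.0 (ideal); a runtime of about 71 s on the doubled core count corresponds to roughly 70% efficiency, the threshold cited above.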

Relevance: 30.00%

Publisher:

Abstract:

This dissertation presents an account and analysis of published mainland Chinese media coverage surrounding three major events of public protest during the Hu-Wen era (2003-2013). The research makes a qualitative analysis of printed material drawn from a range of news outlets, differentiated by their specific political and commercial affiliations. The goal of the research is to better understand the role of mainstream media in social conflict resolution, a hitherto under-studied area, and to identify gradations within the ostensibly monolithic mainland Chinese media on issues of political sensitivity. China's modern media formation displays certain characteristics of Anglophone media at its hyper-commercialised, populist core. However, the Chinese state retains an explicit, though often ambiguous, remit to engage with news production. Because of this, Chinese newspapers are often assumed to be one-dimensional propaganda 'tools' and, accordingly, easily dismissed from analyses of public protest. This research finds that, in an area where political actors have rescinded their monopoly on communicative power, a result of both policy decisions and the rise of Internet-based media platforms, established purveyors of news have acquired greater latitude to report on hitherto sensitive episodes of conflict, but do so under the burden of having to correctly guide public opinion. The thesis examines the discursive resources deployed in this task, as well as reporting patterns suggestive of a new propaganda approach to handling social conflict within public media. Besides the explicitly political nature of coverage of protest events, the study sheds light on gradations within China's complex, hybrid media landscape, both in terms of institutional purpose and qualitative performance.

Relevance: 30.00%

Publisher:

Abstract:

The research addresses the impact of long-term reward patterns on the contents of personal work goals among young Finnish managers (N = 747). Reward patterns were formed on the basis of perceived and objective career rewards (i.e., career stability and promotions) across four measurements (2006–2012). Goals were measured in 2012 and classified into categories of competence, progression, well-being, job change, job security, organization, and financial goals. Factor mixture analysis identified a three-class solution as the best model of reward patterns: High rewards (77%), Increasing rewards (17%), and Reducing rewards (7%). Participants with Reducing rewards reported more progression, well-being, job change, and financial goals than participants with High rewards, as well as fewer competence and organizational goals than participants with Increasing rewards. Workplace resources can play a key role in facilitating goals toward building competence and organizational performance.

Relevance: 30.00%

Publisher:

Abstract:

We review and compare four broad categories of spatially-explicit modelling approaches currently used to understand and project changes in the distribution and productivity of living marine resources including: 1) statistical species distribution models, 2) physiology-based, biophysical models of single life stages or the whole life cycle of species, 3) food web models, and 4) end-to-end models. Single pressures are rare and, in the future, models must be able to examine multiple factors affecting living marine resources such as interactions between: i) climate-driven changes in temperature regimes and acidification, ii) reductions in water quality due to eutrophication, iii) the introduction of alien invasive species, and/or iv) (over-)exploitation by fisheries. Statistical (correlative) approaches can be used to detect historical patterns which may not be relevant in the future. Advancing predictive capacity of changes in distribution and productivity of living marine resources requires explicit modelling of biological and physical mechanisms. New formulations are needed which (depending on the question) will need to strive for more realism in ecophysiology and behaviour of individuals, life history strategies of species, as well as trophodynamic interactions occurring at different spatial scales. Coupling existing models (e.g. physical, biological, economic) is one avenue that has proven successful. However, fundamental advancements are needed to address key issues such as the adaptive capacity of species/groups and ecosystems. The continued development of end-to-end models (e.g., physics to fish to human sectors) will be critical if we hope to assess how multiple pressures may interact to cause changes in living marine resources including the ecological and economic costs and trade-offs of different spatial management strategies. 
Given the strengths and weaknesses of the various types of models reviewed here, confidence in projections of changes in the distribution and productivity of living marine resources will be increased by assessing model structural uncertainty through biological ensemble modelling.
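The biological ensemble modelling suggested above reduces, in its simplest form, to combining projections from structurally different models and reporting their spread as a first-order measure of structural uncertainty. This sketch uses a plain mean and sample standard deviation; weighted or Bayesian model combinations are common refinements.

```python
from statistics import mean, stdev

def ensemble_projection(model_outputs):
    """Combine projections (e.g., of future fish biomass) from structurally
    different models: the ensemble mean is the central estimate and the
    sample standard deviation a rough measure of structural uncertainty."""
    return mean(model_outputs), stdev(model_outputs)
```

A small spread across a statistical model, a biophysical model, and an end-to-end model lends more confidence to a projection than any single model's output can.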
