25 results for resource-based theory
Abstract:
The aim of this article is to present the main contributions of human resource management to the development of sustainable organizations. The relationship between human resources and organizational sustainability, which rests on economic, social and environmental performance, involves important management aspects such as innovation, cultural diversity and the environment. Integrating items from the triple bottom line approach leads to a model based on a strategic and central positioning of human resource management. Based on this model, propositions and recommendations for future research on this theme are presented.
Abstract:
The aim of this article is to analyze the theoretical model proposed by [Jabbour CJC, Santos FCA. Relationships between human resource dimensions and environmental management in companies: proposal of a model. Journal of Cleaner Production 2008;16(1):51-8.] based on data collected in four Brazilian companies. This model investigates how the phases of the environmental management system can be linked to human resource practices in order to attain continuous improvement of a company's environmental performance. Our aim is to contribute to a field that has little empirical evidence. Although the interaction between the phases of the environmental management system and human resource practices is recommended by the specialized literature [Daily BF, Huang S. Achieving sustainability through attention to human resource factors in environmental management. International Journal of Operations and Production Management 2001;21(12):1539-52.], the results indicate that most of the theoretical assumptions could not be confirmed in these Brazilian companies. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
In this paper a bond graph methodology is used to model incompressible fluid flows with viscous and thermal effects. The distinctive characteristic of these flows is the role of pressure, which does not behave as a state variable but as a function that must act in such a way that the resulting velocity field has zero divergence. Velocity and entropy per unit volume are used as independent variables for a single-phase, single-component flow. Time-dependent nodal values and interpolation functions are introduced to represent the flow field, from which nodal vectors of velocity and entropy are defined as state variables. The system of momentum and continuity equations coincides with the one obtained by applying the Galerkin method to the weak formulation of the problem in finite elements. The integral incompressibility constraint is derived from the integral conservation of mechanical energy. The weak formulation of the thermal energy equation is modeled with true bond graph elements in terms of nodal vectors of temperature and entropy rates, resulting in a Petrov-Galerkin method. The resulting bond graph shows the coupling between the mechanical and thermal energy domains through the viscous dissipation term. All kinds of boundary conditions are handled consistently and can be represented as generalized effort or flow sources. A procedure for causality assignment is derived for the resulting graph, satisfying the second principle of thermodynamics. (C) 2007 Elsevier B.V. All rights reserved.
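A minimal sketch, in notation that is not the paper's own, of the nodal expansion and weak-form incompressibility constraint the abstract describes:

```latex
% Sketch only: \phi_i are assumed interpolation functions and
% \mathbf{v}_i(t) the nodal velocity vectors used as state variables.
\mathbf{v}(\mathbf{x},t) \approx \sum_{i=1}^{n} \phi_i(\mathbf{x})\,\mathbf{v}_i(t),
\qquad
\int_{\Omega} \phi_j \,\bigl(\nabla \cdot \mathbf{v}\bigr)\, d\Omega = 0 \quad \text{for all } j.
```

In this reading, pressure appears only as the multiplier enforcing the divergence-free constraint, which is consistent with the abstract's remark that it does not behave as a state variable.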
Abstract:
The paper presents the development of a mechanical actuator using a shape memory alloy with a cooling system based on the thermoelectric effect (Seebeck-Peltier effect). Such a method has the advantages of reduced weight and a simpler control strategy compared to other forced cooling systems. A complete mathematical model of the actuator was derived, and an experimental prototype was implemented. Several experiments were used to validate the model and to identify all of its parameters. A robust nonlinear controller, based on sliding-mode theory, was derived and implemented. Experiments were used to evaluate the actuator's closed-loop performance, stability, and robustness properties. The results showed that the proposed cooling system and controller are able to improve the dynamic response of the actuator. (C) 2009 Elsevier Ltd. All rights reserved.
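Purely as an illustration of the sliding-mode approach named in the abstract, and not the authors' actual controller, a first-order sliding-mode law for a tracking error might look like the following sketch (gains, surface, and smoothing width are hypothetical):

```python
import numpy as np

LAMBDA = 5.0   # hypothetical sliding-surface slope
K = 2.0        # hypothetical switching gain; must dominate model uncertainty
PHI = 0.05     # boundary-layer width used to smooth the sign function

def sliding_mode_control(e: float, e_dot: float) -> float:
    """Control effort for tracking error e and its time derivative e_dot."""
    s = e_dot + LAMBDA * e        # sliding surface s = de/dt + lambda * e
    return -K * np.tanh(s / PHI)  # smoothed sign(s) to reduce chattering

print(sliding_mode_control(0.1, -0.02))  # example call with illustrative errors
```

The tanh smoothing trades a little robustness for reduced chattering, a common practical choice; the paper's own design may differ.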
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug. However, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as functional coverage definition, must be carefully planned to guarantee their efficacy. In general, static input space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although they act on different facets of the monitored system and are not exclusive. This work presents a constrained-random simulation-based functional verification methodology where, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. For this purpose, a tool to automatically generate PD-based stimuli sources was developed. Additionally, we developed a second tool to generate functional coverage models that fit exactly to the PD-based input space. Both the input-stimulus and coverage-model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation time reduction when combining our stimuli sources with their corresponding, automatically generated coverage models.
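As a minimal sketch of constrained-random stimulus generation over explicit parameter domains (the transaction fields and domain values below are hypothetical and do not come from the PD tooling the abstract describes):

```python
import random

# Hypothetical parameter domains: each field draws only from the values
# that remain after invalid and irrelevant scenarios are excluded.
DOMAINS = {
    "opcode": ["READ", "WRITE"],      # invalid opcodes removed up front
    "addr":   range(0x00, 0x40, 4),   # word-aligned addresses only
    "burst":  [1, 2, 4, 8],
}

def random_stimulus() -> dict:
    """Draw one test-case scenario from the constrained input space."""
    return {name: random.choice(list(dom)) for name, dom in DOMAINS.items()}

# A coverage model that fits the domains exactly: every reachable bin is legal.
coverage = set()
for _ in range(1000):
    stim = random_stimulus()
    coverage.add((stim["opcode"], stim["burst"]))
print(f"covered {len(coverage)} of {2 * 4} opcode/burst cross bins")
```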
Abstract:
Scheduling parallel and distributed applications efficiently onto grid environments is a difficult task, and a great variety of scheduling heuristics has been developed to address this issue. A successful grid resource allocation depends, among other things, on the quality of the available information about software artifacts and grid resources. In this article, we propose a semantic approach that integrates the selection of equivalent resources and the selection of equivalent software artifacts to improve the scheduling of resources suitable for a given set of application execution requirements. We also describe a prototype implementation of our approach based on the Integrade grid middleware, along with experimental results that illustrate its benefits. Copyright (C) 2009 John Wiley & Sons, Ltd.
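A toy sketch of the semantic matching idea, with hypothetical equivalence classes that are not Integrade's actual data model:

```python
# Hypothetical equivalence classes between requested and offered capabilities.
EQUIVALENT = {
    "x86_64": {"x86_64", "amd64"},
    "jre":    {"jre", "openjdk", "oracle-jdk"},
}

def satisfies(required: str, offered: str) -> bool:
    """True if the offered capability is semantically equivalent to the requirement."""
    return offered in EQUIVALENT.get(required, {required})

resources = [
    {"name": "node-a", "arch": "amd64", "runtime": "openjdk"},
    {"name": "node-b", "arch": "sparc", "runtime": "jre"},
]
requirements = {"arch": "x86_64", "runtime": "jre"}

matches = [r["name"] for r in resources
           if all(satisfies(v, r[k]) for k, v in requirements.items())]
print(matches)  # ['node-a']: equivalent, though not identical, labels still match
```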
Abstract:
Susceptible-infective-removed (SIR) models are commonly used for representing the spread of contagious diseases. An SIR model can be described in terms of a probabilistic cellular automaton (PCA), where each individual (corresponding to a cell of the PCA lattice) is connected to others by a random network favoring local contacts. Here, this framework is employed to investigate the consequences of applying vaccine against the propagation of a contagious infection, by considering vaccination as a game in the sense of game theory. In this game, the players are the government and the susceptible newborns. In order to maximize their own payoffs, the government attempts to reduce the costs of combating the epidemic, and the newborns may be vaccinated only when infective individuals are found in their neighborhoods and/or the government promotes an immunization program. As a consequence of these strategies, supported by cost-benefit analysis and perceived risk, numerical simulations show that the disease is not fully eliminated and the government implements quasi-periodic vaccination campaigns. (C) 2011 Elsevier B.V. All rights reserved.
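A minimal sketch of one synchronous PCA update for an SIR lattice with purely local contacts (lattice size, probabilities, and neighborhood are illustrative choices, not the paper's):

```python
import random

# States: 0 = susceptible, 1 = infective, 2 = removed.
P_INFECT, P_RECOVER = 0.3, 0.2  # hypothetical per-contact / per-step probabilities
N = 50                          # lattice side length

def step(grid):
    """One synchronous update using a von Neumann neighborhood on a torus."""
    new = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if grid[i][j] == 0:  # susceptible: may catch from an infective neighbor
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    if grid[(i + di) % N][(j + dj) % N] == 1 and random.random() < P_INFECT:
                        new[i][j] = 1
                        break
            elif grid[i][j] == 1 and random.random() < P_RECOVER:
                new[i][j] = 2    # infective: removed
    return new

grid = [[1 if random.random() < 0.01 else 0 for _ in range(N)] for _ in range(N)]
grid = step(grid)  # the vaccination game would act on state-0 cells before each step
```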
Abstract:
Genetic variation and environmental heterogeneity fundamentally shape the interactions between plants of the same species. According to the resource partitioning hypothesis, competition between neighbors intensifies as their similarity increases. Such competition may change in response to increasing supplies of limiting resources. We tested the resource partitioning hypothesis in stands of genetically identical (clone-origin) and genetically diverse (seed-origin) Eucalyptus trees with different water and nutrient supplies, using individual-based tree growth models. We found that genetic variation greatly reduced competitive interactions between neighboring trees, supporting the resource partitioning hypothesis. The importance of genetic variation for Eucalyptus growth patterns depended strongly on local stand structure and focal tree size. This suggests that spatial and temporal variation in the strength of species interactions leads to reversals in the growth rank of seed-origin and clone-origin trees. This study is one of the first to experimentally test the resource partitioning hypothesis for intergenotypic vs. intragenotypic interactions in trees. We provide evidence that variation at the level of genes, and not just species, is functionally important for driving individual and community-level processes in forested ecosystems.
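As an illustration of the individual-based growth-model family named above (the functional form and symbols are generic, not the study's fitted model), focal-tree growth is often regressed on size and a neighborhood competition index:

```latex
% Generic neighborhood competition model; all symbols are illustrative.
\Delta D_i = \beta_0 \, D_i^{\beta_1}
             \exp\!\Bigl(-\beta_2 \sum_{j \ne i} \tfrac{D_j^{2}}{d_{ij}}\Bigr),
```

where $D_i$ is the focal tree's diameter, $d_{ij}$ its distance to neighbor $j$, and the $\beta$ coefficients can be allowed to differ between clone-origin and seed-origin neighbors to test the resource partitioning hypothesis.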
Abstract:
Overcommitment of development capacity and development resource deficiencies are important problems in new product development (NPD). Existing approaches to development resource planning have largely neglected the question of the magnitude of resources required for NPD. This research aims to fill that void by developing a simple higher-level aggregate model based on an intuitive idea: the number of new product families that a firm can effectively undertake is bounded by the complexity of its products or systems and the total amount of resources allocated to NPD. This study examines three manufacturing companies to verify the proposed model. The empirical results confirm the study's initial hypothesis: the more complex the product family, the smaller the number of product families that are launched per unit of revenue. Several suggestions and implications for managing NPD resources are discussed, such as how the model can establish an upper limit for a firm's capacity to develop and launch new product families.
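One hedged way to write the aggregate bound the abstract sketches (the symbols are illustrative, not the paper's notation):

```latex
% Illustrative aggregate bound: N = product families undertaken concurrently,
% R = total resources allocated to NPD, C = product/system complexity,
% \alpha = an empirically fitted constant.
N \;\lesssim\; \alpha \, \frac{R}{C}
```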
Abstract:
Discussions opposing the Theory of the Firm to the Theory of Stakeholders are contemporaneous and polemical. One focal point of such debates refers to which objective function companies should choose, whether that of the shareholders or that of the stakeholders, and whether it is possible to opt for both simultaneously. Several empirical studies have attempted to test a possible correlation between both functions, and there has been no consensus so far. The objective of the present research is to examine a gap in such discussions: is there (or not) a subordination of the stakeholders' objective function to that of the shareholders? The research is empirical and analytical and employs quantitative methods. Hypotheses were tested and data analyzed using non-parametric (chi-square test) and parametric procedures (frequency, correlation coefficient). Secondary data were collected from the Economática database and from the Brazilian Institute of Social and Economic Analyses (IBASE) website, relative to public companies that published their Social Balance Statements following the IBASE model from 1999 to 2006; the sample amounted to 65 companies. In order to assess the shareholders' objective function, a proxy was created based on the following three indices: ROE (return on equity), Enterprise Value and Tobin's Q. In order to assess the stakeholders' objective function, a proxy was created by employing the following IBASE social balance indices: internal (ISI), external (ISE), and environmental (IAM). The results show no evidence of subordination of the stakeholders' objective function to that of the shareholders in the analyzed companies, negating initial expectations and calling for deeper investigation of the results. The main conclusion, that the hypothesized subordination does not take place, is limited to the sample investigated here and calls for ongoing research aiming at improvements that may lead to sample enlargement and, as a consequence, may make feasible the application of other statistical techniques yielding a more thorough analysis of the studied phenomenon.
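Purely as a sketch of the kind of tests the study names (the counts and proxy values below are illustrative placeholders, not the study's data):

```python
from scipy.stats import chi2_contingency, pearsonr

# Hypothetical 2x2 table: high/low shareholder proxy vs. high/low stakeholder proxy.
table = [[18, 14],
         [15, 18]]
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.3f}, dof={dof}")

# Parametric complement: correlation between the two proxy scores.
shareholder_proxy = [0.12, 0.08, 0.15, 0.11, 0.09]  # illustrative values
stakeholder_proxy = [0.40, 0.35, 0.50, 0.42, 0.33]
r, p = pearsonr(shareholder_proxy, stakeholder_proxy)
print(f"Pearson r={r:.2f}, p={p:.3f}")
```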