916 results for continuous-resource model
Abstract:
This study presents a two-stage process to determine areas suitable for growing fuel crops: i) the FAO Agro-Ecological Zones (AEZ) procedure is applied to four Indian states with different geographical characteristics; and ii) the growth of candidate crops is modelled with the GEPIC water and nutrient model, which is used to determine the potential yield of candidate crops in areas where irrigation water is brackish or the soil is saline. The absence of digital soil maps, the paucity of readily available climate data and limited knowledge of the detailed requirements of candidate crops are among the major obstacles; once these are addressed, a series of detailed maps will make it possible to evaluate the true potential of biofuels in India.
Abstract:
Small and Medium Enterprises (SMEs) play an important part in the economy of any country. Initially, a flat management hierarchy, quick response to market changes and cost competitiveness were seen as the competitive characteristics of an SME. Recently, in developed economies, the management of technological capabilities (TCs), that is, managing existing capabilities and developing or assimilating new ones for continuous process and product innovation, has become important for both large organisations and SMEs in achieving sustained competitiveness. Accordingly, various technological innovation capability (TIC) models have been developed at the firm level to assess a firm's innovation capability. The output of these models helps policy makers and firm managers devise policies for deepening a firm's capabilities for generating, acquiring and exploiting technical knowledge, so as to sustain a technological competitive edge. In developing countries, however, TC management is more a matter of TC upgrading: acquiring TCs from abroad and then assimilating, innovating upon and exploiting them. Most TIC models for developing countries delineate the level of TIC required as firms move from the acquisition level to the innovative level. However, these models do not provide tools for assessing a firm's existing TIC level and the various factors affecting it, which would support practical interventions for upgrading firms' TCs towards improved or new processes and products. Recently, the Government of Pakistan (GOP) has recognised the importance of TC upgrading in SMEs, especially export-oriented ones, for their sustained competitiveness, and has launched various initiatives with local and foreign assistance to identify ways and means of upgrading local SMEs' capabilities. This research targets this gap by developing a TIC assessment model for identifying the existing TIC level of manufacturing SMEs located in clusters in Sialkot, Pakistan. SME executives in three export-oriented clusters in Sialkot were interviewed to analyse the technological capability development initiatives (CDIs) they had taken to develop and upgrade their firms' TCs. Analysing the data at the CDI, firm, cluster and cross-cluster levels first allowed the interviewed firms to be classified as leaders, followers or reactors, with leader firms claiming to introduce mostly CDIs that were new to their cluster. Second, the analysis showed that the interviewed leader firms mostly exhibited 'learning by interacting' and 'learning by training' capabilities for acquiring expertise from customers and international consultants. However, these leader firms showed little evidence of learning by using, reverse engineering and R&D capabilities, which according to the extant literature are necessary for upgrading a firm's existing TIC level, and thus its TCs, towards better value-added processes and products. The results are supported by the extant literature on the Sialkot clusters. In sum, this research developed a TIC assessment model that qualitatively identified the interviewed firms' TIC levels and the factors affecting them, validated against the existing literature on the Sialkot clusters studied. The research also gives policy-level recommendations for upgrading TIC, and thus TCs, at the firm and cluster levels so as to target better value-added markets.
Abstract:
We investigate a class of simple models for Langevin dynamics of turbulent flows, including the one-layer quasi-geostrophic equation and the two-dimensional Euler equations. Starting from a path integral representation of the transition probability, we compute the most probable fluctuation paths from one attractor to any state within its basin of attraction. We prove that such fluctuation paths are the time-reversed trajectories of the relaxation paths for a corresponding dual dynamics, which is also within the framework of quasi-geostrophic Langevin dynamics. Cases with and without detailed balance are studied. We discuss a specific example for which the stationary measure displays either a second-order (continuous) or a first-order (discontinuous) phase transition and a tricritical point. In situations where a first-order phase transition is observed, the dynamics are bistable. Then, the transition paths between two coexisting attractors are instantons (fluctuation paths from an attractor to a saddle), which are related to the relaxation paths of the corresponding dual dynamics. For this example, we show how one can analytically determine the instantons and compute the transition probabilities for rare transitions between two attractors.
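In this path-integral setting, the most probable fluctuation path is the minimiser of a Freidlin-Wentzell (Onsager-Machlup) action; schematically (a generic form for illustration, not the paper's specific functional),

$$ \mathcal{A}[q] \;=\; \frac{1}{2}\int_{t_0}^{t_1} \left\lVert \dot q - b(q) \right\rVert^2_{C^{-1}} \, \mathrm{d}t , $$

where $b$ is the drift of the Langevin dynamics and $C$ the covariance of the stochastic forcing. An instanton is then the minimiser connecting an attractor to a saddle on the boundary of its basin, and rare-transition probabilities scale as $\exp(-\mathcal{A}^\ast/\epsilon)$ in the small-noise amplitude $\epsilon$.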
Abstract:
This work introduces a model in which the agents of a network act upon one another according to three different kinds of moral decision. These decisions are based on an increasing level of sophistication in the empathy capacity of the agent, a hierarchy we name Piaget's ladder. The decision strategies of the agents are non-rational, in the sense that they are arbitrarily fixed, and the model presents quenched disorder given by the distribution of its defining parameters. An analytical solution for this model is obtained in the large-system limit, together with a leading-order correction for finite-size systems, which shows that typical realisations of the model develop a phase structure with both continuous and discontinuous non-thermal transitions.
Abstract:
From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation and relief distribution have been identified as directly affecting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and showed that there is room for improvement in Mexican organisations' flood management.
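To illustrate the style of optimisation model described above, here is a minimal capacitated facility-location sketch. All data (candidate shelters, demand points, capacities, costs) are hypothetical, and the thesis's actual models are considerably richer (stock prepositioning, number of actors, GIS screening of floodable sites).

```python
# Minimal capacitated facility-location sketch (hypothetical data),
# illustrating the kind of model used to locate emergency facilities
# and plan relief distribution.
import pulp

shelters = ["S1", "S2", "S3"]          # candidate emergency facilities
villages = ["V1", "V2", "V3", "V4"]    # demand points
demand = {"V1": 120, "V2": 80, "V3": 150, "V4": 60}   # people needing relief
capacity = {"S1": 200, "S2": 180, "S3": 150}
open_cost = {"S1": 500, "S2": 450, "S3": 400}
# unit cost of serving village v from shelter s (stand-in for travel time)
serve_cost = {(s, v): 1 + abs(i - j)
              for i, s in enumerate(shelters)
              for j, v in enumerate(villages)}

prob = pulp.LpProblem("relief_location", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", shelters, cat="Binary")
x = pulp.LpVariable.dicts("serve", (shelters, villages), lowBound=0)

# objective: fixed opening costs plus distribution costs
prob += (pulp.lpSum(open_cost[s] * y[s] for s in shelters)
         + pulp.lpSum(serve_cost[s, v] * x[s][v]
                      for s in shelters for v in villages))
for v in villages:                      # every demand point fully served
    prob += pulp.lpSum(x[s][v] for s in shelters) == demand[v]
for s in shelters:                      # capacity available only if open
    prob += pulp.lpSum(x[s][v] for v in villages) <= capacity[s] * y[s]

prob.solve()
print([s for s in shelters if y[s].value() == 1])
```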
Abstract:
For today's industrialised economies, remanufacturing represents perhaps the largest unexploited resource and an opportunity for realising greater economic growth in an environmentally conscious manner. The aim of this paper is to investigate the impact of remanufacturing on the economy from an economic-efficiency point of view. This phenomenon has been analysed in the literature in a static context. We use the multi-sector input–output framework in a dynamic context to study the intra-period relationships among the sectors of the economy. We extend the classical dynamic input–output model by taking the activity of remanufacturing into consideration, and ask whether remanufacturing and reuse increase the growth potential of an economy. We present a sufficient condition for the efficiency of an economy with remanufacturing. On the basis of this evaluation, we analyse the possibility of sustainable development of the economy through the product-recovery management of industries.
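For context, the classical dynamic input–output (Leontief) model that the paper extends can be written in a standard textbook form (our notation, not necessarily the paper's) as

$$ x_t \;=\; A x_t + B\,(x_{t+1} - x_t) + c_t , $$

where $x_t$ is the vector of sectoral outputs in period $t$, $A$ the matrix of current input coefficients, $B$ the matrix of capital coefficients and $c_t$ final demand. Remanufacturing would presumably enter by modifying the input requirements embodied in $A$ and by returning recovered products to the supply side.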
Abstract:
Increasing use of the term Strategic Human Resource Management (SHRM) reflects the recognition of the interdependencies between corporate strategy, organization and human resource management in the functioning of the firm. Dyer and Holder (1988) proposed a comprehensive human resource strategic typology consisting of three strategic types: inducement, investment and involvement. This research attempted to empirically validate their typology and to test the performance implications of the match between corporate strategy and HR strategy. Hypotheses were tested to determine the relationships between internal consistency in HRM sub-systems, the match between corporate strategy and HR strategy, and firm performance. Data were collected by a mail survey of 998 senior HR executives, of whom 263 returned the completed questionnaire. Financial information on 909 firms was collected from secondary sources such as 10-K reports and CD-Disclosure. Profitability ratios were indexed to industry averages. Confirmatory factor analysis using LISREL supported the six-factor HR measurement model; the six factors were staffing, training, compensation, appraisal, job design and corporate involvement. Support was also found for a second-order factor, labeled "HR Strategic Orientation", explaining the variation among the six factors. LISREL analysis also supported the congruence hypothesis that HR Strategic Orientation significantly affects firm performance. There was a significant associative relationship between HR strategy and corporate strategy. However, the contingency effects of the match between HR and corporate strategies were not supported. Several tests were conducted to show that the survey results are affected neither by non-response bias nor by mono-method bias. Implications of these findings for both researchers and practitioners are discussed.
Abstract:
Disturbances alter competitive hierarchies by reducing populations and altering resource regimes. The interaction between disturbance and resource availability may strongly influence the structure of plant communities, as observed in the recolonization of seagrass beds in outer Florida Bay that were denuded by sea-urchin overgrazing. There is no consensus concerning the interactive effect of disturbance and resource availability on competition intensity (CI). On the other hand, species diversity depends on both factors: peaks in species diversity have been observed when both resource availability and disturbance intensity are high, implying that CI is then low. Based on this supposition of previous models, I presented the resource-disturbance hypothesis as a graphical model that predicts CI as a function of both disturbance intensity and the availability of a limiting resource. The predictions of this model were tested in two experiments within a seagrass community in south Florida, in which transplants of Halodule wrightii were placed into near-monocultures of Syringodium filiforme in a full-factorial array. In the first experiment, two measures of relative CI were calculated, based on the changes in short-shoot number (SS) and in rhizome length (RHL) of the transplants. Both light and disturbance were identified as important factors, though the light × disturbance interaction was not significant. Relative CI_SS ranged between 0.2 and 1.0 for the high-light and high-disturbance treatments, and relative CI_RHL < 0 for the same treatments, though the results were not significantly different owing to high variability and low sample size. These results, including a contour schematic using six data points from the different treatment combinations, preliminarily suggest that the resource-disturbance hypothesis may be used as a next step in developing our understanding of the mechanisms that structure plant communities. Furthermore, the model focuses on the outcome of CI, which may be a useful predictor of changes in species diversity. Further study is needed to confirm these results and to validate the usefulness of the model in other systems.
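Relative competition intensity measures of the kind referred to above are commonly computed in the standard form (a generic formulation; the dissertation's exact definition may differ)

$$ \mathrm{CI} \;=\; \frac{P_{-N} - P_{+N}}{P_{-N}} , $$

where $P_{-N}$ and $P_{+N}$ denote transplant performance (here, short-shoot number or rhizome length) without and with neighbours. Values near 1 indicate strong competition, values near 0 weak competition, and negative values (as found for CI_RHL) indicate facilitation.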
Abstract:
Next-generation integrated wireless local area network (WLAN) and 3G cellular networks aim to combine the roaming ability of a cellular network with the high data rates of a WLAN. To ensure the successful implementation of an integrated network, many issues must be carefully addressed, including network architecture design, resource management, quality of service (QoS), call admission control (CAC) and mobility management.

This dissertation focuses on QoS provisioning, CAC and network architecture design in the integration of WLANs and cellular networks. First, a new scheduling algorithm and a call admission control mechanism for the IEEE 802.11 WLAN are presented to support multimedia services with QoS provisioning. The proposed scheduling algorithm uses idle system time to reduce the average packet loss of real-time (RT) services. The admission control mechanism provides long-term transmission quality for both RT and non-real-time (NRT) services by bounding the packet loss ratio for RT services and guaranteeing the throughput for NRT services.

Second, a joint CAC scheme is proposed to efficiently balance traffic load in the integrated environment. A channel searching and replacement algorithm (CSR) is developed to relieve traffic congestion in the cellular network by using idle channels in the WLAN. The CSR is optimized to minimize the system cost in terms of the blocking probability in the interworking environment. In particular, it is proved that there exists an optimal admission probability for passive handoffs that minimizes the total system cost, and a method for finding this probability is designed based on linear-programming techniques.

Finally, a new integration architecture, Hybrid Coupling with Radio Access System (HCRAS), is proposed to lower the average cost of intersystem communication (IC) and the vertical handoff latency. An analytical model is presented to evaluate the performance of HCRAS in terms of the intersystem communication cost function and the handoff cost function. Based on this model, an algorithm is designed to determine the optimal route for each intersystem communication. Additionally, a fast handoff algorithm is developed to reduce the vertical handoff latency.
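To illustrate the idea of an optimal admission probability for passive handoffs, here is a minimal sketch: a hypothetical cost function built from Erlang-B blocking in the cellular network and the WLAN, minimised by a simple grid search over the admission probability. All loads, channel counts and weights are made up, and the dissertation derives the optimum analytically rather than numerically.

```python
# Sketch: find the admission probability p for passive handoffs into the
# WLAN that minimises a weighted blocking cost (hypothetical parameters).
import numpy as np

def erlang_b(load, servers):
    """Erlang-B blocking probability via the standard recursion."""
    b = 1.0
    for m in range(1, servers + 1):
        b = load * b / (m + load * b)
    return b

cell_load, wlan_load, handoff_load = 30.0, 10.0, 8.0   # Erlangs (made up)
cell_ch, wlan_ch = 40, 20                              # channels (made up)
w_cell, w_wlan = 1.0, 0.5                              # blocking cost weights

def total_cost(p):
    # with probability p a passive handoff is admitted to the WLAN,
    # otherwise it stays in (and loads) the cellular network
    b_cell = erlang_b(cell_load + (1 - p) * handoff_load, cell_ch)
    b_wlan = erlang_b(wlan_load + p * handoff_load, wlan_ch)
    return w_cell * b_cell + w_wlan * b_wlan

ps = np.linspace(0.0, 1.0, 101)
best = min(ps, key=total_cost)
print(f"optimal admission probability ~ {best:.2f}")
```

The grid search merely illustrates how such an optimum can be located numerically for a given cost model.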
Abstract:
An integrated flow and transport model using MIKE SHE/MIKE 11 software was developed to predict the flow and transport of mercury, Hg(II), under varying environmental conditions. The model analyzed the impact of remediation scenarios within the East Fork Poplar Creek watershed of the Oak Ridge Reservation with respect to downstream mercury concentrations. The numerical simulations included the entire hydrological cycle: flow in rivers, overland flow, groundwater flow in the saturated and unsaturated zones, and evapotranspiration and precipitation time series. Stochastic parameters and hydrologic conditions over a five-year period of historical hydrological data were used to analyze the hydrological cycle and to determine the prevailing mercury transport mechanism within the watershed. Simulations of remediation scenarios revealed that reducing the highly contaminated point sources, rather than generally remediating the contaminant plume, has a more direct impact on downstream mercury concentrations.
Abstract:
In the U.S., construction accidents remain a significant economic and social problem. Despite recent improvement, the construction industry generally has lagged behind other industries in implementing safety as a total management process for achieving zero accidents and developing a high-performance safety culture. The aspect of this total approach to safety that has frustrated the construction industry the most has been "measurement", which involves identifying and quantifying the factors that critically influence safe work behaviors. The basic problem is attributed to the difficulty of deciding what to measure and how to measure it, particularly the intangible aspects of safety. Without measurement, the notion of continuous improvement is hard to follow. This research was undertaken to develop a strategic framework for the measurement and continuous improvement of total safety, in order to achieve and sustain the goal of zero accidents while improving the quality, productivity and competitiveness of the construction industry as it moves forward. The research was based on an integral model of total safety that allowed safety to be decomposed into interior and exterior characteristics using a multi-attribute analysis technique. Statistical relationships between total safety dimensions and safety performance (measured by safe work behavior) were revealed through a series of latent variables (factors) that describe the total safety environment of a construction organization. A structural equation model (SEM) was estimated for the latent variables to quantify the relationships among them and between these total safety determinants and the safety performance of a construction organization. The resulting SEM constitutes a strategic framework for identifying, measuring and continuously improving safety as a total concern, for achieving and sustaining the goal of zero accidents.
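In standard LISREL notation, a structural equation model of the kind estimated here combines a structural part with measurement parts (a generic formulation, not the dissertation's specific model):

$$ \eta = B\eta + \Gamma\xi + \zeta, \qquad y = \Lambda_y \eta + \varepsilon, \qquad x = \Lambda_x \xi + \delta , $$

where the latent factors $\eta$ (here, e.g., safety performance) and $\xi$ (the total-safety determinants) are linked through the coefficient matrices $B$ and $\Gamma$, and $y$, $x$ are their observed indicators with loadings $\Lambda_y$, $\Lambda_x$.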
Abstract:
This paper examines the inability of work organizations to achieve racial balance and to use training and development, organization development, and multicultural organizational development principles to manage and enable diversity initiatives. The paper proposes a conceptual framework comprising a micro and a macro model as an approach to diversity initiatives.
Abstract:
The lack of analytical models that can accurately describe large-scale networked systems makes empirical experimentation indispensable for understanding complex behaviors. Research on network testbeds for testing network protocols and distributed services, including physical, emulated, and federated testbeds, has made steady progress. Although the success of these testbeds is undeniable, they fail to provide: 1) scalability, for handling large-scale networks with hundreds or thousands of hosts and routers organized in different scenarios; 2) flexibility, for testing new protocols or applications in diverse settings; and 3) interoperability, for combining simulated and real network entities in experiments. This dissertation tackles these issues along three different dimensions. First, we present SVEET, a system that enables interoperability between real and simulated hosts. To increase the scale of the networks under study, SVEET enables time-dilated synchronization between real hosts and the discrete-event simulator. Realistic TCP congestion control algorithms are implemented in the simulator to allow seamless interactions between real and simulated hosts. SVEET is validated via extensive experiments and its capabilities are assessed through case studies involving real applications. Second, we present PrimoGENI, a system that allows a distributed discrete-event simulator, running in real time, to interact with real network entities in a federated environment. PrimoGENI greatly enhances the flexibility of network experiments, through which a great variety of network conditions can be reproduced to examine what-if questions. Furthermore, PrimoGENI performs resource management functions, on behalf of the user, for instantiating network experiments on shared infrastructures. Finally, to further increase the scalability of network testbeds to handle large-scale, high-capacity networks, we present a novel symbiotic simulation approach. We present SymbioSim, a testbed for large-scale network experimentation in which a high-performance simulation system closely cooperates with an emulation system in a mutually beneficial way. On the one hand, the simulation system benefits from incorporating traffic metadata from the real applications in the emulation system to reproduce realistic traffic conditions. On the other hand, the emulation system benefits from receiving continuous updates from the simulation system to calibrate the traffic between real applications. Specific techniques that support the symbiotic approach include: 1) a model downscaling scheme that can significantly reduce the complexity of the large-scale simulation model, resulting in an efficient emulation system for modulating the high-capacity network traffic between real applications; 2) a queuing network model for the downscaled emulation system to accurately represent the network effects of the simulated traffic; and 3) techniques for reducing the synchronization overhead between the simulation and emulation systems.
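The time-dilated synchronization idea can be sketched as follows: real hosts perceive a virtual clock slowed by a time-dilation factor, so a simulator that cannot keep up with wall-clock time can still appear to interact with them in real time. This is an illustrative sketch only; the factor TDF, the clock mapping and the capacity figures are assumptions, not SVEET's actual implementation.

```python
import time

TDF = 10.0  # hypothetical time-dilation factor: 1 virtual second
            # corresponds to TDF real (wall-clock) seconds

_epoch = time.time()

def virtual_now():
    """Virtual time perceived by dilated hosts: real elapsed time / TDF."""
    return (time.time() - _epoch) / TDF

# intuition: a 100 Mbps virtual link exercised under dilation only needs
# 100 / TDF = 10 Mbps of real capacity, letting the simulator keep pace
```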
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity.

We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold: cloud users can size their VMs appropriately and pay only for the resources they need, while service providers can offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients pay exactly for the performance they actually experience, and administrators can maximize their total revenue by exploiting application performance models and SLAs.

This thesis made the following contributions. First, we identified the resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications; we also suggested and evaluated modeling optimizations necessary to improve prediction accuracy with these tools. Third, we presented an approach to optimal VM sizing employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue of a data center.
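As a sketch of the modeling approach, here is how one of the two tools mentioned (a Support Vector Machine, in regression form) could map resource allocations to application performance. The features, the synthetic data and the hyperparameters are hypothetical stand-ins for the thesis's experimental setup.

```python
# Sketch: model virtualized-application performance as a function of
# resource allocations using support vector regression (synthetic data).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# hypothetical resource-control parameters per sample:
# CPU cap (%), memory (MB), I/O bandwidth (MB/s)
X = rng.uniform([10, 256, 5], [100, 4096, 100], size=(200, 3))
# synthetic "performance" (e.g. throughput) with diminishing returns + noise
y = (np.log(X[:, 0]) + 0.5 * np.log(X[:, 1]) + 0.3 * np.log(X[:, 2])
     + rng.normal(0, 0.05, 200))

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, y)

# predict the performance of a candidate VM sizing
print(model.predict([[50, 1024, 20]]))
```

Such a fitted model is what makes performance-based VM sizing possible: one can search over candidate allocations for the cheapest configuration whose predicted performance meets the SLA.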
Abstract:
Community ecology seeks to understand and predict the characteristics of communities that can develop under different environmental conditions, but most theory has been built on analytical models that are limited in the diversity of species traits that can be considered simultaneously. We address that limitation with an individual-based model that simulates the assembly of fish communities characterized by life history and trophic interactions, with multiple physiological trade-offs as constraints on species performance. Simulation experiments were carried out to evaluate the distribution of six life-history and four feeding traits along gradients of resource productivity and prey accessibility. These experiments revealed that traits differ greatly in their importance for species sorting along the gradients. Body growth rate emerged as a key factor distinguishing community types and defining patterns of community stability and coexistence, followed by egg size and maximum body size. Dominance by fast-growing, relatively large and fecund species occurred more frequently where functional responses were saturated (i.e. under high productivity and/or prey accessibility). Such dominance was associated with large biomass fluctuations and priority effects, which prevented richness from increasing with productivity and may have limited selection on secondary traits such as spawning strategies and relative size at maturation. Our results illustrate that the distribution of species traits and its consequences for community dynamics are intimately linked and strictly dependent on how the benefits and costs of these traits are balanced across different conditions.
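The saturating functional response invoked here is commonly written in Holling type II form (a standard formulation; the paper's model may differ in detail):

$$ f(R) \;=\; \frac{aR}{1 + ahR} , $$

where $R$ is resource (prey) density, $a$ the attack rate (related to prey accessibility) and $h$ the handling time. For large $aR$ the intake rate saturates at $1/h$, which is the regime the authors associate with high productivity and/or prey accessibility.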