976 results for Multi-sector models


Relevance:

30.00%

Publisher:

Abstract:

Femtosecond laser microfabrication has emerged over the last decade as a flexible 3D technology in photonics. Numerical simulations provide important insight into spatial and temporal beam and pulse shaping during the course of extremely intricate nonlinear propagation (see e.g. [1,2]). The electromagnetics of such propagation is typically described by the generalized Non-Linear Schrödinger Equation (NLSE) coupled with a Drude model for the plasma [3]. In this paper we consider a multi-threaded parallel numerical solution for a specific model that describes femtosecond laser pulse propagation in transparent media [4, 5]; however, our approach can be extended to similar models. The numerical code is implemented on an NVIDIA Graphics Processing Unit (GPU), which provides an efficient hardware platform for multi-threaded computing. We compare the performance of the parallel GPU code described below, implemented using the CUDA programming interface [3], with the serial CPU version used in our previous papers [4,5]. © 2011 IEEE.
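As an illustration of the numerical core that such codes parallelize, the sketch below advances a 1-D pulse with the symmetric split-step method commonly used for NLSE-type models. This is a minimal pure-Python sketch with a naive DFT and invented parameter values, not the paper's CUDA code; a GPU version would execute the transform and the per-grid-point nonlinear phase step across many threads.

```python
import cmath

def dft(x, inverse=False):
    # Naive O(N^2) discrete Fourier transform (illustration only; real codes use FFTs).
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(s * 2j * cmath.pi * j * k / n) for k in range(n))
           for j in range(n)]
    return [v / n for v in out] if inverse else out

def split_step_nlse(u, dz, steps, beta2=-1.0, gamma=1.0):
    """1-D NLSE advanced with the symmetric split-step method:
    half a linear (dispersion) step in Fourier space, a full nonlinear
    phase step in physical space, then another half linear step."""
    n = len(u)
    # Discrete angular frequencies for a unit-spacing periodic grid.
    w = [2 * cmath.pi * (k if k < n // 2 else k - n) / n for k in range(n)]
    half_disp = [cmath.exp(0.5j * (beta2 / 2) * wk * wk * dz) for wk in w]
    for _ in range(steps):
        U = dft(u)
        u = dft([Uk * h for Uk, h in zip(U, half_disp)], inverse=True)
        u = [uk * cmath.exp(1j * gamma * abs(uk) ** 2 * dz) for uk in u]
        U = dft(u)
        u = dft([Uk * h for Uk, h in zip(U, half_disp)], inverse=True)
    return u

u0 = [cmath.exp(-((t - 16) / 4.0) ** 2) for t in range(32)]   # Gaussian pulse
u1 = split_step_nlse(u0, dz=0.01, steps=10)
power0 = sum(abs(v) ** 2 for v in u0)
power1 = sum(abs(v) ** 2 for v in u1)
print(abs(power0 - power1) < 1e-6 * power0)
```

Conserved pulse energy is a quick correctness check here: the linear step is unitary and the nonlinear step is a pure phase rotation, so the total power should not change.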

Relevance:

30.00%

Publisher:

Abstract:

From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly affecting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reducing suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. 
The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and highlighted room for improvement for Mexican organisations in flood management.
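The facility-location part of such preparedness models can be illustrated with a toy p-median formulation: choose p facility sites minimizing total demand-weighted distance to the nearest open site. This brute-force sketch uses invented distances and demands; the dissertation's models are far larger mixed-integer programs with stock, actor and flood-risk constraints.

```python
from itertools import combinations

def p_median(dist, demand, p):
    """Pick p facility sites minimizing total demand-weighted distance,
    each demand point served by its nearest open facility (brute force)."""
    sites = range(len(dist))
    best = None
    for open_sites in combinations(sites, p):
        cost = sum(d * min(dist[s][i] for s in open_sites)
                   for i, d in enumerate(demand))
        if best is None or cost < best[0]:
            best = (cost, open_sites)
    return best

# dist[s][i]: distance from candidate site s to demand point i (toy numbers)
dist = [[0, 4, 7, 9],
        [4, 0, 3, 6],
        [7, 3, 0, 2],
        [9, 6, 2, 0]]
demand = [10, 5, 5, 20]
cost, chosen = p_median(dist, demand, p=2)
print(chosen, cost)  # opens the two sites closest to the heavy demand points
```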

Relevance:

30.00%

Publisher:

Abstract:

Private label branding strategies differ from those of manufacturers. The study aims to identify optimal private label branding strategies for (a) utilitarian products and (b) hedonistic products, considering the special factors reflected in consumer behavior related to private labels in Hungary. The choice between House of Brands and Branded House strategies is discussed and evaluated in the light of retail business models. Focus group interviews and factor analysis of the survey revealed differences in the branding strategies preferred by consumers for the two product categories. The study also outlines a strong trend in possible private label development based on consumers' changing attitudes in favor of national products.

Relevance:

30.00%

Publisher:

Abstract:

The paper investigates teachers' decisions to leave the profession, asking two questions: what role do earnings and alternative earning opportunities play in these decisions, and how did the 2002 public-sector pay increase affect teacher attrition? Duration models were estimated using the large linked OEP-ONYF-FH administrative data set: first, binary-choice Cox proportional hazard models (leaving the teaching profession or not), then competing-risk models distinguishing exits to another occupation from exits to a non-working state. The results show that earnings matter: higher wages and higher relative earnings reduce the probability of a teacher leaving the profession, whether for another occupation or for non-employment. 
The public-sector pay increase temporarily reduced the probability of young teachers leaving the profession, but the effect disappeared after one or two years. For teachers over 51 years old, the wage increase was found to reduce attrition, and it also lowered the probability of their moving to another occupation or to non-employment.

Relevance:

30.00%

Publisher:

Abstract:

In the successful strategic management of modern companies, each function plays its specific role. While today's businesses differ in many ways from their predecessors, the key fundamentals are derived from the same roots: their main purpose is to serve the needs of their shareholders and stakeholders by creating value (Pike et al., 1993). To achieve this effectively and efficiently, the various functions need to work in close cooperation with each other. The global crisis that started in 2008 proved that volatility is higher for financial markets and ordinary businesses than had been anticipated. As the recession started as a financial crisis, many people blamed, among others, banks and financial institutions for excessive risk taking and for prioritizing short-term profits over long-term sustainable growth. The resulting loss of confidence in financial institutions has taken a toll on the reputation of other finance professionals, such as accountants, bookkeepers, and treasury and tax specialists. The finance function's strategic importance is linked to its ability to help interpret business performance and provide transparency. To restore this trust, the finance profession now faces one of the biggest challenges in its history: the need to reinvent itself. This paper presents the findings of a recent international study conducted in the United Kingdom, France, Hungary and Poland, comprising interviews with 169 executives from the business sector and a review of 237 job descriptions of finance professionals, undertaken to understand the challenges of the modern finance function. The findings could provide relevant answers to a very current problem facing finance today: how to rebuild its reputation and remain a trusted partner and enabler of long-term business strategy.

Relevance:

30.00%

Publisher:

Abstract:

A major challenge for modern teams lies in coordinating the efforts not just of individuals within a team, but also of teams whose efforts are ultimately entwined with those of other teams. Despite this fact, much of the research on work teams fails to consider the external dependencies that exist in organizational teams and instead focuses on internal, within-team processes. Multi-Team Systems Theory is used as a theoretical framework for understanding teams-of-teams organizational forms (Multi-Team Systems; MTSs), and leadership teams are proposed as one remedy that enables MTS members to dedicate needed resources to intra-team activities while ensuring effective synchronization of between-team activities. Two functions of leader teams were identified, strategy development and coordination facilitation, and a model was developed delineating the effects of the two leader roles on multi-team cognitions, processes, and performance. Three hundred eighty-four undergraduate psychology and business students participated in a laboratory simulation that modeled an MTS; each MTS comprised three two-member teams, each performing distinct but interdependent components of an F-22 battle simulation task. Two roles of leader teams supported in the literature were manipulated through training in a 2 (strategy training vs. control) x 2 (coordination training vs. control) design. Multivariate analysis of variance (MANOVA) and mediated regression analysis were used to test the study's hypotheses. Results indicate that both training manipulations produced differences in the effectiveness of the intended form of leader behavior. The enhanced leader strategy training resulted in more accurate (but not more similar) MTS mental models, better inter-team coordination, and higher levels of multi-team (but not component-team) performance. 
Moreover, mental model accuracy fully mediated the relationship between leader strategy and inter-team coordination, and inter-team coordination fully mediated the effect of leader strategy on multi-team performance. Leader coordination training led to better inter-team coordination, but not to higher levels of either team or multi-team performance. Mediated Input-Process-Output (I-P-O) relationships were not supported with leader coordination; rather, leader coordination facilitation and inter-team coordination uniquely contributed to component-team and multi-team performance. The implications of these findings and future research directions are also discussed.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation discussed resource allocation mechanisms in several network topologies, including infrastructure wireless networks, non-infrastructure wireless networks and wire-cum-wireless networks. Different networks may have different resource constraints. Based on actual technologies and implementation models, utility functions, game theory and a modern control algorithm were introduced to balance power, bandwidth and customer satisfaction in the system. In infrastructure wireless networks, a utility function was used in the Third Generation (3G) cellular network, with the network trying to maximize total utility. In this dissertation, revenue maximization was set as the objective; compared with previous work on utility maximization, revenue maximization is more practical for cellular network operators to implement. Pricing strategies were studied and algorithms were given to find the optimal price combination of power and rate that maximizes profit without degrading Quality of Service (QoS) performance. In non-infrastructure wireless networks, power capacity is limited by the small size of the nodes. In such a network, nodes need to transmit traffic not only for themselves but also for their neighbors, so power management becomes the most important issue for overall network performance. Our innovative routing algorithm based on a utility function sets up a flexible framework for different users with different concerns in the same network. This algorithm allows users to make trade-offs between multiple resource parameters; its flexibility makes it a suitable solution for large-scale non-infrastructure networks. This dissertation also covers non-cooperation problems: by combining game theory and utility functions, equilibrium points can be found among rational users, which can enhance cooperation in the network. Finally, a wire-cum-wireless network architecture was introduced. 
This network architecture can support multiple services over multiple networks with smart resource allocation methods. Although a SONET-to-WiMAX case was used for the analysis, the mathematical procedure and resource allocation scheme could be universal solutions for all infrastructure, non-infrastructure and combined networks.
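The non-cooperation idea can be sketched as iterated best responses between two rational users sharing a channel. The log-rate utility, linear power price and all constants below are illustrative assumptions, not the dissertation's model.

```python
import math

def utility(p_i, interference, price=0.1, noise=1.0):
    # Log-rate benefit minus a linear price on transmit power (illustrative form).
    return math.log(1 + p_i / (noise + interference)) - price * p_i

def best_response(interference, grid):
    # A rational user picks the power on the grid maximizing its own utility.
    return max(grid, key=lambda p: utility(p, interference))

grid = [0.1 * k for k in range(101)]   # candidate powers 0..10
p = [5.0, 5.0]                         # the two users' initial transmit powers
for _ in range(50):                    # sequential (Gauss-Seidel) best responses
    old = list(p)
    p[0] = best_response(p[1], grid)
    p[1] = best_response(p[0], grid)
    if p == old:                       # no user wants to deviate: equilibrium
        break
print(p)
```

For this utility, each user's unconstrained best response is p* = 1/price − (noise + interference), so the iteration settles where neither user gains by deviating.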

Relevance:

30.00%

Publisher:

Abstract:

With the rapid globalization and integration of world capital markets, more and more stocks are listed in multiple markets. For multi-listed stocks, the traditional measure of systematic risk, the domestic beta, is not appropriate since it only contains information from one market. Prakash et al. (1993) developed a technique, the global beta, to capture information from the multiple markets in which the stocks are listed. In this study, global betas as well as domestic betas are obtained for 704 multi-listed stocks from 59 world equity markets. Welch tests show that domestic betas are not equal across markets; therefore, the global beta is more appropriate in a global investment setting. The traditional Capital Asset Pricing Model (CAPM) is also tested with regard to both the domestic beta and the global beta. The results generally support a positive relationship between stock returns and the global beta, while tending to reject this relationship between stock returns and the domestic beta. Further tests of the international CAPM with the domestic beta and the global beta strengthen this conclusion.
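Both betas reduce to the usual covariance-over-variance regression slope, differing only in the market index used. In this sketch a single broad "world" index stands in for the multi-market information set (the study's global beta aggregates the actual listing markets), and all return series are invented.

```python
def beta(asset, market):
    """OLS beta: cov(asset, market) / var(market), on simple return series."""
    n = len(asset)
    ma = sum(asset) / n
    mm = sum(market) / n
    cov = sum((a - ma) * (m - mm) for a, m in zip(asset, market)) / n
    var = sum((m - mm) ** 2 for m in market) / n
    return cov / var

# Toy monthly returns for one dual-listed stock, its home-market index,
# and a broad world index (numbers invented for illustration).
stock = [0.02, -0.01, 0.03, 0.01, -0.02, 0.04]
home  = [0.01, -0.02, 0.02, 0.00, -0.01, 0.03]
world = [0.015, -0.005, 0.02, 0.005, -0.015, 0.025]

domestic_beta = beta(stock, home)    # information from one market only
global_beta = beta(stock, world)     # index spanning several markets
print(round(domestic_beta, 3), round(global_beta, 3))
```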

Relevance:

30.00%

Publisher:

Abstract:

Numerical optimization is a technique in which a computer is used to explore design parameter combinations to find extremes in performance factors. In multi-objective optimization, several performance factors can be optimized simultaneously. The solution to a multi-objective optimization problem is not a single design but a family of optimized designs referred to as the Pareto frontier: a trade-off curve in the objective function space composed of solutions where performance in one objective function is traded for performance in others. A Multi-Objective Hybridized Optimizer (MOHO) was created to solve multi-objective optimization problems using a set of constituent optimization algorithms. MOHO tracks the progress of the Pareto frontier approximation and automatically switches among its constituent evolutionary optimization algorithms to speed the formation of an accurate Pareto frontier approximation. Aerodynamic shape optimization is one of the oldest applications of numerical optimization. MOHO was used to perform shape optimization of a 0.5-inch ballistic penetrator traveling at Mach 2.5. Two objectives were optimized simultaneously: minimizing aerodynamic drag and maximizing penetrator volume. This problem was solved twice: first using Modified Newton Impact Theory (MNIT) to determine the pressure drag on the penetrator, and then using a Parabolized Navier-Stokes (PNS) solver that includes viscosity to evaluate the drag. The studies show how the optimized penetrator shapes differ when viscosity is absent and when it is present in the optimization. In modern optimization problems, objective function evaluations may require many hours on a computer cluster. One solution is to create a response surface that models the behavior of the objective function. 
Once enough data about the behavior of the objective function have been collected, a response surface can be used to represent the actual objective function in the optimization process. The Hybrid Self-Organizing Response Surface Method (HYBSORSM) algorithm was developed and used to build response surfaces of objective functions. HYBSORSM was evaluated using a suite of 295 nonlinear functions involving from 2 to 100 variables, demonstrating its robustness and accuracy.
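Whatever constituent algorithm generates candidate designs, the Pareto frontier approximation is simply the non-dominated subset of the designs seen so far. A minimal sketch with invented (drag, −volume) pairs, both objectives minimized:

```python
def pareto_frontier(points):
    """Return the non-dominated points, minimizing both objectives.
    A point is dominated if another point is <= in both objectives
    (and, being a different point, strictly better in at least one)."""
    front = []
    for p in points:
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points):
            front.append(p)
    return front

# Toy (drag, -volume) designs: maximizing volume becomes minimizing -volume.
designs = [(1.0, -3.0), (0.8, -2.5), (1.2, -3.5), (0.9, -3.2), (1.1, -2.0)]
print(sorted(pareto_frontier(designs)))
```

The surviving points form the trade-off curve: reducing drag further costs volume, and vice versa.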

Relevance:

30.00%

Publisher:

Abstract:

Bus stops are key links in the journeys of transit patrons with disabilities. Inaccessible bus stops prevent people with disabilities from using fixed-route bus services, thus limiting their mobility. The Americans with Disabilities Act (ADA) of 1990 prescribes the minimum requirements for bus stop accessibility by riders with disabilities. Due to limited budgets, transit agencies can select only a limited number of bus stop locations for ADA improvements annually. These locations should preferably be selected such that they maximize the overall benefits to patrons with disabilities. In addition, transit agencies may also choose to implement the universal design paradigm, which involves higher design standards than current ADA requirements and can provide amenities that are useful for all riders, such as shelters and lighting. Many factors can affect the decision to improve a bus stop, including rider-based aspects such as the number of riders with disabilities, total ridership, customer complaints, accidents and deployment costs, as well as locational aspects such as proximity to employment centers, schools and shopping areas. These interlacing factors make it difficult to identify optimal improvement locations without the aid of an optimization model. This dissertation proposes two integer programming models to help identify a priority list of bus stops for accessibility improvements. The first is a binary integer programming model designed to identify bus stops that need improvements to meet the minimum ADA requirements. The second is a multi-objective nonlinear mixed integer programming model that attempts to achieve an optimal compromise between the two accessibility design standards. Geographic Information System (GIS) techniques were used extensively both to prepare the model input and to examine the model output. An analytic hierarchy process (AHP) was applied to combine all of the factors affecting the benefits to patrons with disabilities. 
An extensive sensitivity analysis was performed to assess the reasonableness of the model outputs in response to changes in model constraints. Based on a case study using data from Broward County Transit (BCT) in Florida, the models were found to produce a list of bus stops that upon close examination were determined to be highly logical. Compared to traditional approaches using staff experience, requests from elected officials, customer complaints, etc., these optimization models offer a more objective and efficient platform on which to make bus stop improvement suggestions.
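The budget-limited selection at the heart of the first model can be sketched as a 0-1 knapsack: maximize total benefit, with each stop's benefit an AHP-style composite score, subject to an improvement budget. The scores, costs and budget below are invented, and the real models carry many additional constraints.

```python
def select_stops(benefit, cost, budget):
    """0-1 knapsack: pick bus stops maximizing total benefit within budget.
    benefit/cost are per-stop lists; budget and costs share integer units."""
    n = len(benefit)
    # dp[b] = best benefit achievable with budget b; take[i][b] records
    # whether stop i entered the optimum over stops 0..i at budget b.
    dp = [0] * (budget + 1)
    take = [[False] * (budget + 1) for _ in range(n)]
    for i in range(n):
        for b in range(budget, cost[i] - 1, -1):
            if dp[b - cost[i]] + benefit[i] > dp[b]:
                dp[b] = dp[b - cost[i]] + benefit[i]
                take[i][b] = True
    # Backtrack to recover the chosen stops.
    chosen, b = [], budget
    for i in range(n - 1, -1, -1):
        if take[i][b]:
            chosen.append(i)
            b -= cost[i]
    return dp[budget], sorted(chosen)

# AHP-style composite benefit scores and improvement costs (invented numbers).
benefit = [60, 45, 30, 80, 25]
cost = [4, 3, 2, 5, 2]
print(select_stops(benefit, cost, budget=9))
```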

Relevance:

30.00%

Publisher:

Abstract:

In the past two decades, multi-agent systems (MAS) have emerged as a new paradigm for conceptualizing large and complex distributed software systems. A multi-agent system view provides a natural abstraction for both the structure and the behavior of modern-day software systems. Although there were many conceptual frameworks for using multi-agent systems, there was no well-established and widely accepted method for modeling them. This dissertation research addressed the representation and analysis of multi-agent systems based on model-oriented formal methods. The objective was to provide a systematic approach for studying MAS at an early stage of system development to ensure the quality of the design. Given that there was no well-defined formal model directly supporting agent-oriented modeling, this study centered on three main topics: (1) adapting a well-known formal model, predicate transition nets (PrT nets), to support MAS modeling; (2) formulating a modeling methodology to ease the construction of formal MAS models; and (3) developing a technique to support machine analysis of formal MAS models using model-checking technology. PrT nets were extended to include the notions of dynamic structure, agent communication and coordination to support agent-oriented modeling. An aspect-oriented technique was developed to address the modularity of agent models and the compositionality of incremental analysis. A set of translation rules was defined to systematically translate formal MAS models into concrete models that can be verified with the model checker SPIN (Simple Promela Interpreter). This dissertation presents the framework developed for modeling and analyzing MAS, including a well-defined process model based on nested PrT nets, and a comprehensive methodology to guide the construction and analysis of formal MAS models.
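The execution rule underlying such nets — a transition fires when every input place holds enough tokens, consuming them and producing output tokens — can be sketched for a plain place/transition net. PrT nets additionally attach predicates and typed tokens to places, which this sketch omits; the place and transition names are illustrative.

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# Toy net for one agent: an 'idle' agent plus a pending 'msg' become 'busy',
# then the agent returns to 'idle' when done (names invented).
receive = {"in": {"idle": 1, "msg": 1}, "out": {"busy": 1}}
finish = {"in": {"busy": 1}, "out": {"idle": 1}}

m0 = {"idle": 1, "msg": 1, "busy": 0}
assert enabled(m0, receive)
m1 = fire(m0, receive)
m2 = fire(m1, finish)
print(m1, m2)
```

Model checking then amounts to exhaustively exploring the reachable markings for property violations, which is what the SPIN translation automates at scale.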

Relevance:

30.00%

Publisher:

Abstract:

In the finance literature, many economic theories and models have been proposed to explain and estimate the relationship between risk and return. Assuming risk averseness and rational behavior on the part of the investor, models are developed that are supposed to help in forming efficient portfolios that either maximize (minimize) the expected rate of return (risk) for a given level of risk (rate of return). One of the most widely used models for forming these efficient portfolios is Sharpe's Capital Asset Pricing Model (CAPM). In the development of this model it is assumed that investors have homogeneous expectations about the future probability distribution of the rates of return; that is, every investor assumes the same values of the parameters of the probability distribution. Likewise, financial volatility homogeneity is commonly assumed, where volatility is taken as investment risk, usually measured by the variance of the rates of return; typically, the square root of the variance is used to define financial volatility. Furthermore, it is often assumed that the data-generating process is made of independent and identically distributed random variables, which again implies that financial volatility is measured from homogeneous time series with stationary parameters. In this dissertation, we investigate these homogeneity assumptions and provide evidence of heterogeneity in market participants' information, objectives, and expectations about the parameters of the probability distribution of prices, as given by the differences in the empirical distributions corresponding to different time scales, which in this study are associated with different classes of investors. We also demonstrate that the statistical properties of the underlying data-generating processes, including the volatility in the rates of return, are quite heterogeneous. 
In other words, we provide empirical evidence against the traditional views about homogeneity using non-parametric wavelet analysis on trading data. The results show heterogeneity of financial volatility at different time scales, and time scale is one of the most important aspects in which trading behavior differs. In fact, we conclude that heterogeneity, as posited by the Heterogeneous Markets Hypothesis, is the norm and not the exception.
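The scale decomposition behind such an analysis can be sketched with the Haar wavelet: the energy of the detail coefficients at each level gives a simple proxy for how volatility distributes across time scales (trading horizons). The series below is invented (a fast oscillation plus a slow drift), not market data, and real studies use more refined wavelet estimators.

```python
def haar_step(x):
    """One Haar level: pairwise averages (approximation) and
    pairwise differences (detail)."""
    approx = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return approx, detail

def variance_by_scale(returns, levels):
    """Mean squared Haar detail coefficient per scale: a rough measure of
    how much variability lives at each time horizon."""
    out, x = [], list(returns)
    for _ in range(levels):
        x, d = haar_step(x)
        out.append(sum(v * v for v in d) / len(d))
    return out

# Alternating fast oscillation plus a slow drift: the fast component should
# dominate the finest-scale energy (toy 16-point return series).
r = [(0.01 if i % 2 == 0 else -0.01) + 0.001 * i for i in range(16)]
vs = variance_by_scale(r, 3)
print([round(v, 6) for v in vs])
```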

Relevance:

30.00%

Publisher:

Abstract:

Quantitative Structure-Activity Relationship (QSAR) modeling has been applied extensively in predicting the toxicity of Disinfection By-Products (DBPs) in drinking water. Among many toxicological properties, the acute and chronic toxicities of DBPs have been widely used in health risk assessment of DBPs. These toxicities are correlated with molecular properties, which in turn are usually correlated with molecular descriptors. The primary goals of this thesis are: (1) to investigate the effects of molecular descriptors (e.g., chlorine number) on molecular properties such as the energy of the lowest unoccupied molecular orbital (ELUMO) via QSAR modeling and analysis; (2) to validate the models using internal and external cross-validation techniques; and (3) to quantify the model uncertainties through Taylor-series and Monte Carlo simulation methods. QSAR analysis is one of the most important ways to predict molecular properties such as ELUMO. In this study, the number of chlorine atoms (NCl), the number of carbon atoms (NC) and the energy of the highest occupied molecular orbital (EHOMO) are used as molecular descriptors. Three approaches are typically used in QSAR model development: (1) Linear or Multi-Linear Regression (MLR); (2) Partial Least Squares (PLS); and (3) Principal Component Regression (PCR). A critical step in QSAR analysis is model validation, after the QSAR models are established and before they are applied to toxicity prediction. The DBPs studied span five chemical classes of chlorinated alkanes, alkenes, and aromatics. In addition, validated QSARs are developed to describe the toxicity of selected groups (i.e., chloro-alkanes and aromatic compounds with a nitro or cyano group) of DBP chemicals to three types of organisms (Fish, T. pyriformis, and P. phosphoreum) based on experimental toxicity data from the literature. 
The results show that: (1) QSAR models to predict molecular properties built by MLR, PLS or PCR can be used either to select valid data points or to eliminate outliers; (2) the Leave-One-Out cross-validation procedure by itself is not enough to give a reliable representation of the predictive ability of a QSAR model, but Leave-Many-Out/K-fold cross-validation and external validation can be applied together to achieve more reliable results; (3) ELUMO is shown to correlate highly with NCl for several classes of DBPs; and (4) according to uncertainty analysis using the Taylor method, the uncertainty of the QSAR models is contributed mostly by NCl for all DBP classes.
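The Leave-One-Out procedure discussed in point (2) can be sketched for a one-descriptor linear model: refit the model without each observation, predict it, and form the cross-validated Q². The descriptor and property values below are invented toy numbers, not the thesis data.

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def q2_loo(xs, ys):
    """Leave-one-out cross-validated Q^2: refit without each point,
    predict it, and compare the prediction errors (PRESS) to total variance."""
    n = len(ys)
    my = sum(ys) / n
    press = 0.0
    for i in range(n):
        a, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (a * xs[i] + b)) ** 2
    return 1 - press / sum((y - my) ** 2 for y in ys)

# Toy QSAR-like data: an ELUMO-like property falling with chlorine count
# (values invented; real models regress on several descriptors at once).
n_cl = [1, 2, 3, 4, 5, 6]
prop = [0.9, 0.62, 0.41, 0.18, -0.05, -0.3]
q2 = q2_loo(n_cl, prop)
print(round(q2, 3))
```

A high LOO Q² on data this small is exactly why the thesis pairs it with Leave-Many-Out and external validation before trusting a model.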

Relevance:

30.00%

Publisher:

Abstract:

A two-dimensional (2D) finite-difference time-domain (FDTD) method is used to analyze two different models of multi-conductor transmission lines (MTLs): a two-conductor MTL and a three-conductor MTL. In addition to the MTLs, a three-dimensional (3D) FDTD method is used to analyze a three-patch microstrip parasitic array. While the MTL analysis is carried out entirely in the time domain, the microstrip parasitic array is studied through the scattering parameter S11 in the frequency domain. The results clearly indicate that FDTD is an efficient and accurate tool for modeling and analyzing multi-conductor transmission lines as well as microstrip antennas and arrays.
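In one dimension, the FDTD method reduces to a leapfrog update pair over staggered E and H grids. Below is a normalized pure-Python sketch at the "magic" time step (c·dt = dx) with a hard Gaussian source; the grid size and source parameters are illustrative, and the 2D/3D MTL and microstrip models of the study add materials, boundaries and conductor coupling that this sketch omits.

```python
import math

def fdtd_1d(steps, n=200, src=100):
    """1-D FDTD leapfrog over staggered Ez/Hy grids in normalized units,
    run at the 'magic' time step c*dt = dx, with a hard source at cell src."""
    ez = [0.0] * n
    hy = [0.0] * n
    for t in range(steps):
        for k in range(n - 1):
            hy[k] += ez[k + 1] - ez[k]      # H update from the curl of E
        for k in range(1, n):
            ez[k] += hy[k] - hy[k - 1]      # E update from the curl of H
        ez[src] = math.exp(-((t - 30) / 10.0) ** 2)  # hard Gaussian source
    return ez

ez = fdtd_1d(steps=80)
emax = max(abs(v) for v in ez)
print(round(emax, 3))   # field stays bounded: the scheme is stable at this step
```

At the magic time step the 1-D scheme is dispersion-free, which makes it a convenient sanity check before moving to 2D or 3D grids.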

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work was to develop a new methodology for designing refrigerants that are better than those currently in use. The methodology draws some parallels with the general approach of computer-aided molecular design; however, the mathematical representation of the molecular structure of an organic compound and the use of meta-models during the optimization process make it different. In essence, the approach aims to generate molecules that conform to various property requirements that are known and specified a priori. A modified way of mathematically representing the molecular structure of an organic compound with up to four carbon atoms, along with atoms of other elements such as hydrogen, oxygen, fluorine, chlorine and bromine, was developed. The normal boiling temperature, enthalpy of vaporization, vapor pressure, tropospheric lifetime and biodegradability of 295 different organic compounds were collected from the open literature and databases, or estimated. Surrogate models linking these quantities to molecular structure were developed. Constraints ensuring the generation of structurally feasible molecules were formulated and used with commercially available optimization algorithms to generate molecular structures of promising new refrigerants. This study was intended to serve as a proof of concept of designing refrigerants using the newly developed methodology.
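The interplay of structural-feasibility constraints and surrogate models can be sketched by brute force over a reduced design space (saturated acyclic C/H/F/Cl molecules only; the study's representation also handles oxygen, bromine and a richer structure encoding). The linear "surrogate" coefficients and target value are invented stand-ins for the fitted property models.

```python
from itertools import product

def feasible(n_c, n_h, n_f, n_cl):
    """Structural feasibility for a saturated acyclic molecule: the
    monovalent atoms must exactly fill carbon's remaining valence
    (H + F + Cl == 2*C + 2 for an alkane-type skeleton)."""
    return n_c >= 1 and n_h + n_f + n_cl == 2 * n_c + 2

def surrogate_score(n_c, n_h, n_f, n_cl):
    # Stand-in surrogate: a linear form in atom counts (coefficients invented;
    # the real study fits such models to collected property data).
    return 20.0 * n_c - 5.0 * n_f + 12.0 * n_cl - 240.0  # boiling-point proxy

target = -30.0   # desired normal boiling temperature, illustrative
best = min(
    (c for c in product(range(1, 5), range(0, 11), range(0, 11), range(0, 11))
     if feasible(*c)),
    key=lambda c: abs(surrogate_score(*c) - target),
)
print(best, surrogate_score(*best))
```

Real runs replace the enumeration with the optimization algorithms mentioned above, but the pattern is the same: search only the structurally feasible region, scored by the surrogate models.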