906 results for Network Synthesis, Disaster Response, Internet, Network Management
Abstract:
Our study focuses on business networks. Building on several decades of research results and the conceptual framework of the IMP (Industrial Marketing and Purchasing) Group on business relationships and business networks, we review the basic approaches to the topic and then, using data from the 2009 survey of the Competitiveness Research program, examine how network pictures relate to competitiveness among Hungarian companies. We characterise the firms that, according to managers' assessments, play a central and influential role in their industry network (i.e. hold a dominant network position), with particular attention to their business performance and competitiveness characteristics.
Abstract:
The aim of the study is to interpret the business network, a fundamental building block of the global economy, and to examine its structure and the main principles governing its operation. First, the basic concepts of business network, supply chain and supply network are defined and their structure is described. The article then briefly reviews the changes in the business environment that fostered the networking of the economy and thereby strengthened the role business networks play in competitiveness. The author also presents the essential new characteristics of the emerging economic model, the so-called network economy. The study then describes the marked changes observable in the coordination mechanisms that govern the operation of business networks, and of supply chains within them. Finally, it describes in detail the two main building blocks of a business network: the basic types of nodes (the business units that make up the network) and of threads (the relationships, or partnerships, that form between them).
Abstract:
A shift in thinking in economics now seems inevitable if we are to move closer to solving the problems that brought about the triple (economic, social, environmental) crisis. In their article, the authors argue that embracing complexity can be understood as an initial step on the path towards solving the social and ecological problems we perceive every day. Economic and management theories already exist that have begun to question mainstream concepts and are ready to build their principles and theories on an acceptance of the complexity of the operating environment, and thus to find practical solutions for everyday economic activity. One such example, discussed in the article, is the network theory of the IMP (Industrial Marketing and Purchasing) Group. The authors aim to show that such network theories can influence, beyond the inter-firm sphere, the way companies make their decisions and manage their relationships.
Abstract:
On January 28-30, 2015 Corvinus University of Budapest hosted the latest workshop of the Regional Studies Association's Tourism Research Network. The event had previously been held in Izmir, Aalborg, Warsaw, Östersund, Antalya, Leeds and Vila-seca (Catalonia). The aim of the RSA research network is to examine tourism diversity from the perspective of regional development in order to identify current challenges and opportunities in a systematic manner, and hence to provide the basis for a better-informed integration of tourism into regional development strategies that moves beyond political short-termism and buzzword fascination. Within the network, a series of workshops has been organised on topics ranging from destination management to rural tourism.
Abstract:
One of the key tasks of quality management is to identify the factors that are critical to value creation, to determine their value, and to take action to prevent and reduce their negative effects. Value creation largely takes place through processes, which consist of activities and tasks to be performed. These require suitable employees, one of whose most important attributes is the knowledge they possess. On this basis, knowing and managing the task-knowledge-resource structure is itself a quality management task. Network science, which analyses complex systems, can provide tools for this, so it is worth examining its applicability in the quality management field. To systematise the possible applications, the authors first categorised quality networks according to the types of nodes (vertices) and links (edges or arcs). Focusing on knowledge management, they then defined the multimodal knowledge network (one composed of several different node types), built from tasks, resources, knowledge items and the connections between them. Using this network, they assigned knowledge items to categories and determined their value on the basis of node degrees. In the knowledge-item network derived from the multimodal network, they defined the meaning of cohesive subgroups and finally proposed a formula for determining the risk of knowledge loss.
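A minimal sketch in Python (using networkx) of the kind of multimodal task-resource-knowledge network the abstract describes. The node names, edge sets and the degree-based loss-risk proxy below are illustrative assumptions, not the authors' formula.

# Multimodal knowledge network: tasks, resources and knowledge items as
# typed nodes; edges mean "task requires knowledge" or "resource holds knowledge".
import networkx as nx

G = nx.Graph()
tasks = ["audit", "calibration"]
resources = ["alice", "bob"]
knowledge = ["ISO9001", "SPC", "metrology"]

G.add_nodes_from(tasks, kind="task")
G.add_nodes_from(resources, kind="resource")
G.add_nodes_from(knowledge, kind="knowledge")

G.add_edges_from([("audit", "ISO9001"), ("audit", "SPC"),
                  ("calibration", "metrology"), ("calibration", "SPC")])
G.add_edges_from([("alice", "ISO9001"), ("alice", "SPC"),
                  ("bob", "metrology")])

for k in knowledge:
    demand = sum(1 for n in G[k] if G.nodes[n]["kind"] == "task")
    holders = sum(1 for n in G[k] if G.nodes[n]["kind"] == "resource")
    # Toy proxy: knowledge needed by many tasks but held by few resources is
    # at higher risk of being lost (e.g. if its single holder leaves).
    risk = demand / holders if holders else float("inf")
    print(k, "demand:", demand, "holders:", holders, "loss risk proxy:", risk)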
Abstract:
3D geographic information systems (GIS) are data and computation intensive in nature. Internet users are usually equipped with low-end personal computers and network connections of limited bandwidth. Data reduction and performance optimization techniques are therefore of critical importance in quality of service (QoS) management for online 3D GIS. In this research, QoS management issues regarding distributed 3D GIS presentation were studied to develop 3D TerraFly, an interactive 3D GIS that supports high quality online terrain visualization and navigation. To tackle the QoS management challenges, a multi-resolution rendering model, adaptive level of detail (LOD) control, and mesh simplification algorithms were proposed to effectively reduce terrain model complexity. The rendering model is adaptively decomposed into sub-regions of up to three detail levels according to viewing distance and other dynamic quality measurements. The mesh simplification algorithm was designed as a hybrid algorithm that combines edge straightening and quad-tree compression to reduce mesh complexity by removing geometrically redundant vertices. The main advantage of this mesh simplification algorithm is that the grid mesh can be processed directly in parallel without triangulation overhead. Algorithms facilitating remote access and distributed processing of volumetric GIS data, such as data replication, directory service, request scheduling, predictive data retrieving and caching, were also proposed. A prototype of the proposed 3D TerraFly implemented in this research demonstrates the effectiveness of the proposed QoS management framework in handling interactive online 3D GIS. The system implementation details and future directions of this research are also addressed in this thesis.
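A minimal sketch of distance-based level-of-detail selection in the spirit of the rendering model described above, which assigns each terrain sub-region one of up to three detail levels by viewing distance. The thresholds and coordinates are hypothetical; the actual adaptive decomposition and quality measurements of 3D TerraFly are not reproduced here.

import math

def lod_for_region(viewer, region_center, near=1_000.0, far=5_000.0):
    """Return 0 (full detail), 1 (medium) or 2 (coarse) for one sub-region."""
    dx = viewer[0] - region_center[0]
    dy = viewer[1] - region_center[1]
    d = math.hypot(dx, dy)          # planar distance from viewer to sub-region
    if d < near:
        return 0
    if d < far:
        return 1
    return 2

# Example: a small grid of sub-region centres, each assigned an LOD.
viewer = (0.0, 0.0)
regions = [(x * 2_000.0, y * 2_000.0) for x in range(3) for y in range(3)]
for c in regions:
    print(c, "-> LOD", lod_for_region(viewer, c))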
Abstract:
This research presents several components encompassing the scope of the objective of Data Partitioning and Replication Management in Distributed GIS Databases. Modern Geographic Information System (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographical raster data processing and to propose algorithms to improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases, in order to achieve high data availability and Quality of Service (QoS) under distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach for mosaicking digital images of different temporal and spatial characteristics into tiles is proposed. This dynamic approach reuses digital images on demand and generates mosaicked tiles only for the required region, according to user requirements such as resolution, temporal range, and target bands, to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for the efficient acquisition of GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation, and 3D virtual reality presentation. Vast numbers of computing, network, and storage resources on the Internet are idle or not fully utilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to be used in the GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach developed in this dissertation has resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
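A minimal sketch of on-demand tile selection in the spirit of the dynamic mosaicking approach: given a requested region, time range and resolution, only the source images that satisfy the request are pulled into the mosaic. Field names and the selection rule are illustrative assumptions, not the dissertation's API.

from dataclasses import dataclass

@dataclass
class SourceImage:
    name: str
    bbox: tuple          # (min_x, min_y, max_x, max_y)
    year: int
    resolution_m: float  # ground resolution in metres

def intersects(a, b):
    """Axis-aligned bounding-box overlap test."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def select_for_mosaic(images, bbox, year_range, max_resolution_m):
    """Keep only images that cover the request region, period and resolution."""
    return [img for img in images
            if intersects(img.bbox, bbox)
            and year_range[0] <= img.year <= year_range[1]
            and img.resolution_m <= max_resolution_m]

catalog = [
    SourceImage("scene_a", (0, 0, 10, 10), 2004, 1.0),
    SourceImage("scene_b", (5, 5, 15, 15), 2005, 0.5),
    SourceImage("scene_c", (20, 20, 30, 30), 2005, 0.5),
]
print([i.name for i in select_for_mosaic(catalog, (8, 8, 12, 12), (2004, 2006), 1.0)])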
Abstract:
In recent years, a surprising new phenomenon has emerged in which globally distributed online communities collaborate to create useful and sophisticated computer software. These open source software groups are composed of generally unaffiliated individuals and organizations who work in a seemingly chaotic fashion and who participate on a voluntary basis without direct financial incentive. The purpose of this research is to investigate the relationship between the social network structure of these intriguing groups and their level of output and activity, where social network structure is defined as (1) closure or connectedness within the group, (2) bridging ties which extend outside of the group, and (3) leader centrality within the group. Based on well-tested theories of social capital and centrality in teams, propositions were formulated which suggest that social network structures associated with successful open source software project communities will exhibit high levels of bridging and moderate levels of closure and leader centrality. The research setting was the SourceForge hosting organization, and a study population of 143 project communities was identified. Independent variables included measures of closure and leader centrality defined over conversational ties, along with measures of bridging defined over membership ties. Dependent variables included source code commits and software releases for community output, and software downloads and project site page views for community activity. A cross-sectional study design was used, and archival data were extracted and aggregated for the two-year period following the first release of project software. The resulting compiled variables were analyzed using multiple linear and quadratic regressions, controlling for group size and conversational volume. Contrary to theory-based expectations, the results showed that successful project groups exhibited low levels of closure and that the levels of bridging and leader centrality were not important factors of success. These findings suggest that the creation and use of open source software may represent a fundamentally new socio-technical development process which disrupts the team paradigm and which triggers the need for building new theories of collaborative development. These new theories could point towards the broader application of open source methods for the creation of knowledge-based products other than software.
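A minimal sketch of the three network-structure constructs named above, using simple networkx stand-ins: within-group density for closure, boundary-crossing edges for bridging ties, and the highest degree centrality inside the group for leader centrality. The study's actual operationalisations over conversational and membership ties differ; the graph below is invented for illustration.

import networkx as nx

G = nx.Graph()
group = {"dev1", "dev2", "dev3", "dev4"}   # members of one project community
G.add_edges_from([("dev1", "dev2"), ("dev1", "dev3"), ("dev1", "dev4"),
                  ("dev2", "dev3"),
                  ("dev2", "other_proj_a"), ("dev4", "other_proj_b")])

sub = G.subgraph(group)
closure = nx.density(sub)                                   # connectedness within the group
bridging = sum(1 for u, v in G.edges() if (u in group) != (v in group))  # ties leaving the group
leader_centrality = max(nx.degree_centrality(sub).values()) # most central member

print(f"closure={closure:.2f} bridging={bridging} leader_centrality={leader_centrality:.2f}")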
Abstract:
Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation’s highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. In addition, DWT was found to be able to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights on the design of AID for arterial street applications.
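A minimal sketch of the preprocessing idea described above: a one-level Haar wavelet transform smooths a detector speed series, and a feature vector stacks the current and previous cycles from the mid-block and upstream detectors. The actual ANN architecture, wavelet configuration and normalization used in the dissertation are not reproduced; the numbers are invented.

import numpy as np

def haar_dwt(signal):
    """One-level Haar DWT: returns (approximation, detail) coefficients."""
    x = np.asarray(signal, dtype=float)
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2.0)   # smoothed trend
    detail = (even - odd) / np.sqrt(2.0)   # high-frequency component
    return approx, detail

speed_midblock = [55, 54, 53, 30, 28, 27, 26, 25]   # sudden drop suggests an incident
speed_upstream = [56, 55, 54, 52, 50, 35, 30, 28]

approx_mb, _ = haar_dwt(speed_midblock)
approx_up, _ = haar_dwt(speed_upstream)

# Feature vector for one decision point: smoothed current cycle plus the
# previous cycle (the "historical data" the study found useful), min-max scaled.
features = np.concatenate([approx_mb[-2:], approx_up[-2:]])
features = (features - features.min()) / (features.max() - features.min())
print(features)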
Abstract:
An emergency is a deviation from a planned course of events that endangers people, properties, or the environment. It can be described as an unexpected event that causes economic damage, destruction, and human suffering. When a disaster happens, Emergency Managers are expected to have a response plan to most likely disaster scenarios. Unlike earthquakes and terrorist attacks, a hurricane response plan can be activated ahead of time, since a hurricane is predicted at least five days before it makes landfall. This research looked into the logistics aspects of the problem, in an attempt to develop a hurricane relief distribution network model. We addressed the problem of how to efficiently and effectively deliver basic relief goods to victims of a hurricane disaster. Specifically, where to preposition State Staging Areas (SSA), which Points of Distribution (PODs) to activate, and the allocation of commodities to each POD. Previous research has addressed several of these issues, but not with the incorporation of the random behavior of the hurricane's intensity and path. This research presents a stochastic meta-model that deals with the location of SSAs and the allocation of commodities. The novelty of the model is that it treats the strength and path of the hurricane as stochastic processes, and models them as Discrete Markov Chains. The demand is also treated as a stochastic parameter because it depends on the stochastic behavior of the hurricane. However, for the meta-model, the demand is an input that is determined using Hazards United States (HAZUS), a software developed by the Federal Emergency Management Agency (FEMA) that estimates losses due to hurricanes and floods. A solution heuristic has been developed based on simulated annealing. Since the meta-model is a multi-objective problem, the heuristic is a multi-objective simulated annealing (MOSA), in which the initial solution and the cooling rate were determined via a Design of Experiments. The experiment showed that the initial temperature (T0) is irrelevant, but temperature reduction (δ) must be very gradual. Assessment of the meta-model indicates that the Markov Chains performed as well or better than forecasts made by the National Hurricane Center (NHC). Tests of the MOSA showed that it provides solutions in an efficient manner. Finally, an illustrative example shows that the meta-model is practical.
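A minimal sketch of hurricane intensity modelled as a discrete Markov chain, as the meta-model does for strength and path. The states and transition probabilities below are invented for illustration; the dissertation estimates its chains from forecast and track data.

import random

states = ["Cat1", "Cat2", "Cat3", "Cat4"]
# transition[i][j] = P(next state = j | current state = i); rows sum to 1.
transition = [
    [0.60, 0.30, 0.10, 0.00],
    [0.20, 0.50, 0.25, 0.05],
    [0.10, 0.30, 0.40, 0.20],
    [0.05, 0.15, 0.40, 0.40],
]

def simulate(start="Cat1", steps=5, seed=42):
    """Draw one realisation of the intensity evolution before landfall."""
    random.seed(seed)
    i = states.index(start)
    path = [states[i]]
    for _ in range(steps):
        i = random.choices(range(len(states)), weights=transition[i])[0]
        path.append(states[i])
    return path

print(simulate())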
Abstract:
This ex post facto study (N = 209) examined the relationships between employer job strategies and job retention among organizations participating in Florida welfare-to-work network programs and associated the strategies with job retention data to determine best practices. An internet-based self-report survey battery was administered to a heterogeneous sampling of organizations participating in the Florida welfare-to-work network program. Hypotheses were tested through correlational and hierarchical regression analytic procedures. The partial correlation results linked each of the job retention strategies to job retention. Wages, benefits, training and supervision, communication, job growth, work/life balance, fairness and respect were all significantly related to job retention. Hierarchical regression results indicated that the training and supervision variable was the best predictor of job retention in the regression equation. The size of the organization was also a significant predictor of job retention. Large organizations reported higher job retention rates than small organizations. There was no statistical difference between the types of organizations (profit-making and non-profit) and job retention. The standardized betas ranged from .26 to .41 in the regression equation. Twenty percent of the variance in job retention was explained by the combination of demographic and job retention strategy predictors, supporting the theoretical, empirical, and practical relevance of understanding the association between employer job strategies and job retention outcomes. Implications for adult education and human resource development theory, research, and practice are highlighted as possible strategic leverage points for creating conditions that facilitate the development of job strategies as a means for improving former welfare workers' job retention.
Abstract:
The trend of green consumerism and increased standardization of environmental regulations has driven multinational corporations (MNCs) to seek standardization of environmental practices, or at least to seek to be associated with such behavior. In fact, many firms are seeking to free ride on this global green movement without having the actual ecological footprint to substantiate their environmental claims. While scholars have articulated the benefits of such optimization of uniform global green operations, the challenges MNCs face in controlling and implementing such operations are understudied. For firms to translate environmental commitment into actual performance, the obstacles are substantial, particularly for the MNC. This is attributed to (1) headquarters' (HQ) control challenges in managing core elements of the corporate environmental management (CEM) process, specifically matching verbal commitment and policy with ecological performance, and (2) the fact that the MNC operates in multiple markets and HQ is required to implement policy across complex subsidiary networks consisting of diverse and distant units. Drawing from the literature on HQ challenges of MNC management and control, this study examines (1) how core components of the CEM process impact optimization of global environmental performance (GEP) and then uses network theory to examine how (2) a subsidiary network's dimensions can present challenges to the implementation of green management policies. It presents a framework for CEM which includes (1) MNCs' Verbal environmental commitment, (2) green policy Management which guides standards for operations, (3) actual environmental Performance reflected in a firm's ecological footprint, and (4) corporate environmental Reputation (VMPR). It then explains how an MNC's key subsidiary network dimensions (density, diversity, and dispersion) create challenges that hinder the relationship between green policy management and actual environmental performance. It combines content analysis, multiple regression, and post-hoc hierarchical cluster analysis to study US manufacturing MNCs. The findings support a positive significant effect of verbal environmental commitment and green policy management on actual global environmental performance and environmental reputation, as well as a direct impact of verbal environmental commitment on green policy management. Unexpectedly, network dimensions were not found to moderate the relationship between green management policy and GEP.