991 results for Efficient implementation
Abstract:
Given the uncertainties of today's environment, strategic planning is again being discussed as an important tool for positioning the organization in likely futures. To address this in a more dynamic and less linear way, adapting to multiple realities, the scenario method can be one of the strategies for anticipating future configurations. In the public sphere, the efficient deployment of human and financial resources requires a new approach to the formulation of strategies. Tourism, in turn, is an important segment of the national economy and a major source of funds for the Gross Domestic Product (GDP) of states and municipalities. This study aims to understand the guidelines and perspectives of municipal tourism planning in the city of Natal/RN, as seen by representatives of the sector. The research followed a qualitative, exploratory approach, based on a case study of the Secretaria Municipal de Turismo e Desenvolvimento Econômico (SETURDE). The results show that the agency is going through a period of change in its organizational structure and in the definition of its role in local tourism. The national tourism plan and the selection of Natal as a host city for the 2014 World Cup have stimulated interest in developing formal strategic planning in the organization. However, when it comes to more complex tools, such as the method of future scenarios, the technical staff have only a limited grasp of its definition and importance for future actions. The results support the conclusion that actions are designed intuitively, without following the scientific methods developed for this purpose, such as the method of strategic scenarios. Nevertheless, early evidence from the plans and documents issued by the federal government, as well as the Secretariat's own initiative, points the agency toward coordinating and acting as a fulcrum for local efforts to increase tourism.
Abstract:
Continuous delivery (CD) is a software engineering approach that shortens the delivery cycle by automating parts of the deployment pipeline: the build, deploy, test and release processes. CD is based on the idea that, during development, it should always be possible to generate a release automatically from the source code in its current state. One of CD's many advantages is that continuous releases create a quick feedback loop, leading to faster and more efficient implementation of new functions while errors are being fixed. Although CD has many advantages, there are also several challenges a maintenance management project must handle in the transition to CD. These challenges may differ depending on the project's maturity level and on its strengths and weaknesses. Our research question was: "What challenges can a maintenance management project face in the transition to Continuous delivery?" The purpose of this study is to describe Continuous delivery and the challenges a maintenance management project may face during such a transition. A descriptive case study was carried out, with interviews and documents as data collection methods. Based on the collected data, a situation analysis was created in the shape of a process model representing the maintenance management project's release process. The process model was used as the basis for a SWOT analysis and for an analysis using the maturity model of Rehn et al. From these analyses we identified the challenges a maintenance management project may face in the transition to CD. Some concern the customers' and management's attitude towards a transition to CD, but the biggest challenge is the automation of the deployment pipeline steps.
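The fail-fast pipeline described above can be sketched in a few lines. This is a minimal illustration, not the study's tooling: the stage names and their behaviours are hypothetical, and each stage simply reports success or failure so the pipeline stops at the first broken step.

```python
# Minimal sketch of an automated deployment pipeline: each stage is a
# function returning True on success; the pipeline stops at the first
# failure, which is what gives CD its fast feedback loop. All stage
# names and behaviours here are hypothetical illustrations.

def build(artifact):
    artifact["built"] = True
    return True

def deploy_to_test(artifact):
    artifact["deployed"] = artifact.get("built", False)
    return artifact["deployed"]

def run_tests(artifact):
    return artifact.get("deployed", False)

def release(artifact):
    artifact["released"] = True
    return True

def run_pipeline(stages, artifact):
    """Run stages in order; report how far the pipeline got."""
    completed = []
    for stage in stages:
        if not stage(artifact):
            return completed, False  # fail fast: stop the pipeline
        completed.append(stage.__name__)
    return completed, True

stages = [build, deploy_to_test, run_tests, release]
completed, ok = run_pipeline(stages, {})
```

Automating exactly these steps is, per the abstract, the biggest transition challenge the studied project faced.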
Abstract:
In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and an Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is better suited to modelling human reactive behaviour in the retail sector. In order to study the output accuracy of both models, we carried out a validation experiment in which we compared the results of our simulation models to the performance of a real system. Our experiment used a large UK department store as a case study, in which we had to determine an efficient implementation of the management policy for the store's fitting room using DES and ABS. Overall, we found that both simulation models were a good representation of the real system when modelling human reactive behaviour.
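To make the DES side of the comparison concrete, here is a toy event-queue simulation of a single fitting-room booth. The arrival and service times are invented numbers, purely to illustrate the mechanics of a discrete-event model; the paper's actual models of the department store are far richer.

```python
import heapq

# Tiny discrete-event sketch of a one-booth fitting room: customers
# arrive at fixed times and are served in arrival order; we track when
# each customer finishes. Times are made up for illustration only.

def simulate_fitting_room(arrivals, service_time):
    events = [(t, "arrive") for t in arrivals]
    heapq.heapify(events)            # event queue ordered by time
    booth_free_at = 0.0
    finish_times = []
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrive":
            start = max(t, booth_free_at)   # wait if the booth is busy
            booth_free_at = start + service_time
            finish_times.append(booth_free_at)
    return finish_times

# Customers arriving at t=0, 1, 5 with a 3-minute try-on each
# finish at t=3, 6 (queued behind the first) and 9.
times = simulate_fitting_room([0, 1, 5], 3)
```

An ABS model of the same scene would instead give each customer an individual decision rule (e.g. balk if the queue is too long), which is the behavioural flexibility the paper contrasts against DES.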
Abstract:
In the multi-core CPU world, transactional memory (TM) has emerged as an alternative to lock-based programming for thread synchronization. Recent research proposes the use of TM in GPU architectures, where a high number of computing threads, organized in SIMT fashion, requires an effective synchronization method. In contrast to CPUs, GPUs offer two memory spaces: global memory and local memory. The local memory space serves as a shared scratch-pad for a subset of the computing threads, and programmers use it to speed up their applications thanks to its low latency. Prior work by the authors proposed lightweight hardware TM (HTM) support based on local memory, modifying the SIMT execution model and adding a conflict detection mechanism. An efficient implementation of these features is key to providing an effective synchronization mechanism at the local memory level. After a brief description of the main features of our HTM design for GPU local memory, in this work we gather a number of proposals aimed at improving the mechanisms with the highest impact on performance. Firstly, the SIMT execution model is modified to increase the parallelism of the application when transactions must be serialized in order to make forward progress. Secondly, the conflict detection mechanism is optimized depending on application characteristics, such as the read/write sets, the probability of conflict between transactions and the existence of read-only transactions. As these features can be present in hardware simultaneously, it is the task of the compiler and runtime to determine which ones matter most for a given application. This work includes a discussion of the analysis to be done in order to choose the best configuration.
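The read/write-set conflict check at the heart of any TM design can be stated very compactly. The following is a simplified software illustration of that check, not the paper's hardware mechanism: two transactions conflict when one writes an address the other reads or writes.

```python
# Sketch of read/write-set conflict detection between transactions,
# the kind of check a (hardware or software) TM performs. Simplified
# illustration: addresses and transactions below are invented.

def conflicts(tx_a, tx_b):
    """tx = {'read': set_of_addrs, 'write': set_of_addrs}."""
    return bool(
        tx_a["write"] & (tx_b["read"] | tx_b["write"])
        or tx_b["write"] & tx_a["read"]
    )

t1 = {"read": {0x10, 0x14}, "write": {0x20}}
t2 = {"read": {0x20}, "write": {0x30}}   # reads what t1 writes
t3 = {"read": {0x40}, "write": {0x44}}   # disjoint address ranges
c12 = conflicts(t1, t2)   # write/read overlap on 0x20
c13 = conflicts(t1, t3)   # no overlapping addresses
```

Note how a read-only transaction (empty write set) can never trigger the first clause, which is exactly why the abstract singles out read-only transactions as a characteristic worth optimizing for.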
Abstract:
Effective and efficient implementation of intelligent and/or recently emerged networked manufacturing systems requires enterprise-level integration. Networked manufacturing offers several advantages in the current competitive environment, shortening the manufacturing cycle time while maintaining production flexibility, and thereby yielding several feasible process plans. The first step in this direction is to integrate manufacturing functions such as process planning and scheduling for multiple jobs in a network-based manufacturing system. It is difficult to determine a proper plan that meets conflicting objectives simultaneously. This paper describes a mobile-agent based negotiation approach that integrates manufacturing functions in a distributed manner, and presents its fundamental framework and functions. Moreover, an ontology has been constructed using the Protégé software, which provides the flexibility to convert knowledge into Extensible Markup Language (XML) schemas of Web Ontology Language (OWL) documents. The generated XML schemas have been used to transfer information throughout the manufacturing network for the intelligent, interoperable integration of product data models and manufacturing resources. To validate the feasibility of the proposed approach, an illustrative example covering varied production environments, including production demand fluctuations, is presented, and the performance and effectiveness of the proposed approach are compared with the evolutionary Hybrid Dynamic-DNA (HD-DNA) algorithm. The results show that the proposed scheme is very effective and reasonably acceptable for the integration of manufacturing functions.
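A contract-net-style round, one common shape for such agent negotiation, can be sketched as follows. This is a hypothetical illustration of the general idea (announce, bid, award), not the paper's protocol: the machines, processing times and bid rule are all invented.

```python
# Hypothetical contract-net-style sketch of agent negotiation for
# integrating process planning and scheduling: a job agent announces
# an operation, machine agents bid their completion time, and the
# earliest-finishing bid wins. All data below is invented.

def negotiate(operation, machines):
    """machines: {name: {'busy_until': t, 'proc_time': {op: dt}}}."""
    bids = {}
    for name, m in machines.items():
        if operation in m["proc_time"]:          # capable machines bid
            bids[name] = m["busy_until"] + m["proc_time"][operation]
    winner = min(bids, key=bids.get)             # award earliest finish
    machines[winner]["busy_until"] = bids[winner]  # book the machine
    return winner, bids[winner]

machines = {
    "M1": {"busy_until": 4, "proc_time": {"milling": 3, "drilling": 2}},
    "M2": {"busy_until": 0, "proc_time": {"milling": 5}},
}
winner, finish = negotiate("milling", machines)  # M2 finishes at 5 < M1's 7
```

Repeating such rounds per operation yields a schedule that emerges from local decisions, which is the distributed integration the abstract describes.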
Abstract:
Nowadays the urgency of addressing climate change and global warming is growing rapidly: industry and the energy sector must be decarbonized. Hydrogen can play a key role in the energy transition: it is expected to progressively replace fossil fuels, penetrating economies and gaining public interest. However, this possible new energy scenario requires further investigation of safety aspects, which currently represent a challenge. The present study aims to contribute to this field. The focus is on the analysis and modeling of hazardous scenarios involving liquid hydrogen. The investigation of the consequences of BLEVEs (Boiling Liquid Expanding Vapor Explosions) lies at the core of this research: among the various consequences (overpressure, radiation), the interest is in the generation and projection of fragments. The goal is to investigate whether the models developed for conventional fuels and tanks also give good predictions when handling hydrogen. Experimental data from the SH2IFT - Safe Hydrogen Fuel Handling and Use for Efficient Implementation project are used to validate those models. The project's objective was to increase competence in the safety of hydrogen technology, focusing especially on the consequences of handling large amounts of this substance.
Abstract:
This report documents the Iowa Department of Transportation's accomplishments and ongoing efforts in response to 39 recommendations proposed by the Governor's Blue Ribbon Transportation Task Force at the end of 1995. Governor Terry Branstad challenged the Task Force to "maximize the benefits of each dollar spent from the Road Use Tax Fund."
Abstract:
This paper discusses the design, implementation and synthesis of an FFT module that has been specifically optimized for use in the OFDM-based Multiband UWB system, although the work is generally applicable to many other OFDM-based receiver systems. Previous work has detailed the requirements for the receiver FFT module within the Multiband UWB OFDM-based system, and this paper draws on those requirements, coupled with modern digital architecture principles and low-power design criteria, to converge on our optimized solution. The FFT design obtained in this paper is also applicable to the implementation of the transmitter IFFT module, so only one FFT module is needed for half-duplex operation. The results of this paper enable the baseband designers of the 200 Mbit/s variant of Multiband UWB systems (and indeed other OFDM-based receivers) using System-on-Chip (SoC), FPGA and ASIC technology to create cost-effective and low-power solutions aimed at the competitive consumer electronics market.
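For readers unfamiliar with what such a module computes, here is the textbook recursive radix-2 decimation-in-time FFT. It only illustrates the butterfly structure an OFDM FFT/IFFT block realizes; the paper's optimized fixed-point hardware architecture is far more involved.

```python
import cmath

# Textbook recursive radix-2 decimation-in-time FFT, shown only to
# illustrate the butterfly structure behind an OFDM FFT module.

def fft(x):
    n = len(x)                      # n must be a power of two
    if n == 1:
        return x[:]
    even = fft(x[0::2])             # FFT of even-indexed samples
    odd = fft(x[1::2])              # FFT of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]  # twiddle factor
        out[k] = even[k] + tw                            # butterfly
        out[k + n // 2] = even[k] - tw
    return out

spec = fft([1, 1, 1, 1, 0, 0, 0, 0])   # 8-point FFT of a rectangular pulse
```

The same butterfly network run with conjugated twiddle factors (plus scaling) gives the IFFT, which is why a single module can serve both directions in half-duplex operation, as the paper notes.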
Abstract:
The conventional way of calculating hard scattering processes in perturbation theory using Feynman diagrams is not efficient enough to calculate all necessary processes - for example for the Large Hadron Collider - to a sufficient precision. Two alternatives to order-by-order calculations are studied in this thesis.

In the first part we compare the numerical implementations of four different recursive methods for the efficient computation of Born gluon amplitudes: Berends-Giele recurrence relations and recursive calculations with scalar diagrams, with maximal helicity violating vertices, and with shifted momenta. Of the four methods considered, the Berends-Giele method performs best if the number of external partons is eight or larger; for fewer than eight external partons, the recursion relation with shifted momenta offers the best performance. When investigating numerical stability and accuracy, we found that all methods give satisfactory results.

In the second part of this thesis we present an implementation of a parton shower algorithm based on the dipole formalism. The formalism treats initial- and final-state partons on the same footing. The shower algorithm can be used for hadron colliders and electron-positron colliders, and massive partons in the final state are also included. Finally, we study numerical results for an electron-positron collider, the Tevatron and the Large Hadron Collider.
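The efficiency gain of recursive methods comes from reusing sub-results. The toy scalar sketch below shows that structure: the "current" for a contiguous range of external legs is assembled from all splittings into two smaller currents, and memoization makes each sub-current computed once. The "vertex" here is a made-up toy coupling, not a gauge-theory Feynman rule, so this illustrates only the bookkeeping of a Berends-Giele-style recursion.

```python
from functools import lru_cache

# Toy scalar analogue of the recursive-current idea: the current for
# legs[i:k] sums over all two-way splittings into smaller currents.
# lru_cache reuses each sub-current, which is what keeps the cost
# polynomial while the number of Feynman diagrams grows much faster.

def current(legs, coupling=0.1):
    @lru_cache(maxsize=None)
    def j(i, k):                       # current for legs[i:k]
        if k - i == 1:
            return legs[i]             # single external leg
        total = 0.0
        for m in range(i + 1, k):      # all contiguous splittings
            total += coupling * j(i, m) * j(m, k)
        return total
    return j(0, len(legs))

amp = current([1.0, 2.0, 3.0])
```

With three legs the recursion combines two splittings, (1)(2 3) and (1 2)(3), mirroring how real Berends-Giele currents gather sub-amplitudes; only O(n^2) distinct ranges ever need to be computed.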
Abstract:
Several strategies relying on kriging have recently been proposed for adaptively estimating contour lines and excursion sets of functions under a severely limited evaluation budget. The recently released R package KrigInv 3 is presented, offering a sound implementation of various sampling criteria for these kinds of inverse problems. KrigInv is based on the DiceKriging package, and thus benefits from a number of options concerning the underlying kriging models. Six implemented sampling criteria are detailed in a tutorial and illustrated with graphical examples, and the different functionalities of KrigInv are explained step by step. Additionally, two recently proposed criteria for batch-sequential inversion are presented, enabling advanced users to distribute function evaluations in parallel on clusters or clouds of machines. Finally, auxiliary problems are discussed, including the fine tuning of the numerical integration and optimization procedures used within the computation and optimization of the considered criteria.
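To give a flavour of what such a sampling criterion looks like, here is a Python sketch of one simple pointwise rule of the kind used for inversion (KrigInv itself is an R package, and its actual criteria are more refined): given the kriging mean m and standard deviation s at a candidate point, the probability of exceeding the threshold T is Phi((m - T)/s), and the next evaluation goes where that probability is most ambiguous. The candidate values below are invented for illustration.

```python
import math

# Pointwise "ambiguity" sampling sketch for contour/excursion-set
# estimation: sample where the kriging model is least sure whether
# the function is above or below the threshold.

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def next_point(candidates, threshold):
    """candidates: list of (x, kriging_mean, kriging_sd)."""
    def ambiguity(c):
        p = phi((c[1] - threshold) / c[2])
        return p * (1.0 - p)           # maximal when p == 0.5
    return max(candidates, key=ambiguity)[0]

cands = [(0.1, 2.0, 0.1),    # clearly above T: low ambiguity
         (0.5, 1.02, 0.5),   # mean near T, large sd: ambiguous
         (0.9, -3.0, 0.2)]   # clearly below T
x_star = next_point(cands, threshold=1.0)
```

The batch-sequential criteria mentioned in the abstract generalize this idea to choosing several points at once, so evaluations can run in parallel.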
Abstract:
Simulating the spatio-temporal dynamics of inundation is key to understanding the role of wetlands under past and future climate change. Earlier modelling studies have mostly relied on fixed prescribed peatland maps and inundation time series of limited temporal coverage. Here, we describe and assess the Dynamical Peatland Model Based on TOPMODEL (DYPTOP), which predicts the extent of inundation based on a computationally efficient TOPMODEL implementation. This approach rests on an empirical, grid-cell-specific relationship between the mean soil water balance and the flooded area. DYPTOP combines the simulated inundation extent and its temporal persistency with criteria for the ecosystem water balance and the modelled peatland-specific soil carbon balance to predict the global distribution of peatlands. We apply DYPTOP in combination with the LPX-Bern DGVM and benchmark the global-scale distribution, extent, and seasonality of inundation against satellite data. DYPTOP successfully predicts the spatial distribution and extent of wetlands and of the major boreal and tropical peatland complexes, and reveals the governing limitations to peatland occurrence across the globe. Peatlands covering large boreal lowlands are reproduced only when accounting for a positive feedback induced by the enhanced mean soil water holding capacity in peatland-dominated regions. DYPTOP is designed to minimize input data requirements, optimize computational efficiency and allow for modular adoption in Earth system models.
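The grid-cell relationship at the heart of this approach maps the mean water balance to a flooded fraction through an empirically fitted monotone curve. In the sketch below a plain logistic stands in for the fitted function, and the parameters are illustrative, not values from the paper.

```python
import math

# Sketch of the grid-cell idea: an empirically fitted monotone
# function maps mean water table position to the flooded fraction
# of the cell. A plain logistic stands in for the fitted curve;
# midpoint and steepness are invented illustrative parameters.

def flooded_fraction(water_table_m, midpoint=-0.1, steepness=20.0):
    """Fraction of the grid cell inundated for a given mean water
    table position (metres; negative = below the surface)."""
    return 1.0 / (1.0 + math.exp(-steepness * (water_table_m - midpoint)))

dry = flooded_fraction(-0.5)    # deep water table: almost no flooding
wet = flooded_fraction(0.2)     # water table above surface: mostly flooded
```

Because only a few parameters per grid cell are needed, such a parameterization keeps input requirements and computational cost low, which is exactly the design goal the abstract states.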
Abstract:
The construction industry has long been considered a highly fragmented and non-collaborative industry. This fragmentation stemmed from the complex and unstructured traditional coordination processes and information exchanges amongst all parties involved in a construction project. This nature, coupled with risk and uncertainty, has pushed clients and their supply chains to search for new ways of improving their business processes to deliver better-quality, higher-performing products. This research closely investigates the need to implement a Digital Nervous System (DNS), analogous to a biological nervous system, for the flow and management of digital information across the project lifecycle. This is done through direct examination of the key processes and information produced in a construction project and of how a DNS can provide a well-integrated flow of digital information throughout the project lifecycle. This research also investigates how a DNS can create a tight digital feedback loop that enables the organisation to sense, react and adapt to changing project conditions. A Digital Nervous System is a digital infrastructure that provides a well-integrated flow of digital information to the right part of the organisation at the right time. It provides the organisation with the relevant and up-to-date information it needs, on critical project issues, to aid near real-time decision-making. A literature review and survey questionnaires were used in this research to collect and analyse data about the industry's information management problems - e.g. disruption and discontinuity of digital information flow due to interoperability issues, disintegration/fragmentation of the adopted digital solutions, and paper-based transactions. Analysis of the results revealed that efficient and effective information management requires the creation and implementation of a DNS.
Abstract:
Cost efficiency has been a dominant perspective in the traditional IT literature. However, in a complex technology and business environment, the widely recognized cost-efficiency assumption of information technology has been increasingly challenged. Drawing on a case study of a wireless network implementation situated in a politically sensitive workplace, this paper provides practical insights for IT managers in today's networked economy. More specifically, the experiences in the case study illustrate that despite the well-calculated cost efficiency of the wireless network infrastructure, the radical implementation process in the case organization encountered enormous challenges and opposition because administrators failed to consider various stakeholders' positions and interests. Eventually, the implementation objectives and outcomes were considerably undermined. Implications from this empirical case research reemphasize the significance of understanding the political forces present in any business environment where different stakeholders hold conflicting interests. Lessons learned from the case further encourage IT managers and policy makers to better strategize emerging information technology in general, and wireless networks in particular, as the global society and business environment increasingly face an emerging wireless world.