794 results for Information Systems Applications
Abstract:
Growing demand for industrial products is imposing an increasingly intense level of competitiveness on industrial operations. At the same time, the convergence of information technology (IT) and automation technology (AT) is proving to be a tool of great potential for the modernization and improvement of industrial plants. However, for this technology to achieve its full potential, several obstacles need to be overcome, including demonstrating the reasoning behind the estimates of benefits, investments, and risks used to plan the implementation of corporate technology solutions. This article focuses on the evolutionary development of processes for planning and adopting IT & AT convergence. It proposes incorporating IT & AT convergence practices into Lean Thinking/Six Sigma via a method for planning convergence activities known as the Smarter Operation Transformation (SOT) methodology. The article illustrates the SOT methodology through its application in a Brazilian consumer goods company, showing that with IT & AT convergence it is possible, at low investment, to reduce the risk of not meeting the targets of key indicators.
Abstract:
This paper describes a wildfire forecasting application based on a 3D virtual environment and a fire simulation engine. A new open-source framework is presented for developing 3D graphics applications over large geographic areas, offering high-performance 3D visualization and powerful interaction tools for the Geographic Information Systems (GIS) community. The application includes a remote module that allows several users to connect simultaneously to monitor a real wildfire event. Users can simulate and visualize a wildfire spreading over the terrain, driven by spatial information on topography and fuels together with weather and wind data.
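The abstract does not specify the internals of the fire simulation engine. As a purely illustrative sketch of the kind of computation such an engine performs, the following Python fragment propagates a fire front over a gridded fuel bed with a simple wind-biased cellular automaton; the grid, probabilities, and wind handling are assumptions and do not reflect the application's actual model.

```python
# Minimal cellular-automaton sketch of wildfire spread on a grid.
# Illustrative only: the real engine's fuel models, topography, and
# weather handling are not described in the abstract.
import random

def step(burning, fuel, wind=(0, 1), p_base=0.3, p_wind=0.3):
    """Advance the fire front by one time step.

    burning: set of (row, col) cells currently on fire
    fuel:    set of (row, col) cells that can still ignite
    wind:    (drow, dcol) direction that boosts spread probability
    """
    ignited = set()
    for (r, c) in burning:
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            nbr = (r + dr, c + dc)
            if nbr in fuel:
                p = p_base + (p_wind if (dr, dc) == wind else 0.0)
                if random.random() < p:
                    ignited.add(nbr)
    fuel -= ignited
    return burning | ignited, fuel

# Toy run: a 20x20 fuel bed, ignition at the center, wind blowing "east".
fuel = {(r, c) for r in range(20) for c in range(20)}
burning = {(10, 10)}
fuel.discard((10, 10))
for _ in range(10):
    burning, fuel = step(burning, fuel)
print(f"{len(burning)} cells burning after 10 steps")
```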
Abstract:
This abstract describes the development of a wildfire forecasting plugin using Capaware. Capaware is designed as an easy-to-use open-source framework for developing 3D graphics applications over large geographic areas, offering high-performance 3D visualization and powerful interaction tools for the Geographic Information Systems (GIS) community.
Abstract:
Salt deposits characterize the subsurface of Tuzla (Bosnia and Herzegovina) and have made the town famous since ancient times. Archaeological discoveries demonstrate the presence of a Neolithic pile-dwelling settlement related to saltwater springs that turned much of the area into swampy ground. Since Roman times the town has been described as "the City of Salt Deposits and Springs"; "tuz" is the Turkish word for salt, the name the Ottomans gave the settlement in the 15th century following their conquest of medieval Bosnia (Donia and Fine, 1994). Natural brine springs were found everywhere, and salt has been evaporated over hot charcoal since pre-Roman times. This ancient use of salt was small-scale compared to the massive production carried out during the 20th century by means of classical mining methods and, especially, wild brine pumping. In the past, salt was extracted by tapping natural brine springs, while the modern technique uses about 100 boreholes with pumps tapping the natural underground brine runs at an average depth of 400-500 m. The mining operation changed the hydrogeological conditions, enabling the downward flow of fresh water and causing additional salt dissolution. Over the last 60 years this process has induced severe ground subsidence, reaching up to 10 meters of sinking in the most affected area. Stress and strain in the overlying rocks led to the formation of numerous fractures over a considerable area (3 km²). Consequently, serious damage occurred to buildings and to infrastructure such as the water supply system, sewage networks, and power lines. Urban life downtown was compromised by the loss of more than 2,000 buildings that collapsed or had to be demolished, forcing the resettlement of about 15,000 inhabitants (Tatić, 1979). Salt extraction has recently been strongly reduced, but the underground water system is returning to its natural state, threatening to flood the most subsided area. Over the last 60 years the local government has developed a monitoring system, collecting data on geodetic measurements, the amount of brine pumped, piezometry, lithostratigraphy, the extent of the salt body, and geotechnical parameters. A database was created within a scientific cooperation between the municipality of Tuzla and the city of Rotterdam (D.O.O. Mining Institute Tuzla, 2000). The scientific investigation presented in this dissertation was financially supported by a cooperation project between the Municipality of Tuzla, the University of Bologna (CIRSA), and the Province of Ravenna. The University of Tuzla (RGGF) provided important scientific support, in particular on the geological and hydrogeological features. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas (Gutierrez et al., 2008). The subject of this study is the collapse phenomenon occurring in the Tuzla area, with the aim of identifying and quantifying the various factors involved in the system and their correlations. The Tuzla subsidence phenomenon can be defined as a geohazard: the consequence of an adverse combination of geological processes and ground conditions, precipitated by human activity, with the potential to cause harm (Rosenbaum and Culshaw, 2003). Where a hazard poses a risk to a vulnerable element, a risk management process is required.
The individual factors involved in the subsidence of Tuzla can each be considered a hazard. The final objective of this dissertation is a preliminary risk assessment procedure and guidelines, developed to quantify building vulnerability in relation to the overall geohazard affecting the town. The available historical database, never fully processed before, has been analyzed by means of geographic information systems and mathematical interpolators (PART I). Modern geomatic applications have been implemented to investigate the most relevant hazards in depth (PART II). To monitor and quantify current subsidence rates, geodetic GPS technologies have been employed and four yearly survey campaigns have been carried out. The subsidence-related fracture system has been identified by means of field surveys and a mathematical interpretation of the sinking surface known as curvature analysis. Comparing mapped and predicted fractures led to a better understanding of the problem. The results confirmed the reliability of fracture identification using curvature analysis applied to sinking data rather than topographic or seismic data. The evolution of urban change has been reconstructed by analyzing topographic maps and satellite imagery, identifying the most damaged areas. This part of the investigation was essential for quantifying building vulnerability.
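As a hedged illustration of the curvature analysis mentioned above, the sketch below computes the mean curvature of a gridded sinking surface with finite differences and flags high-curvature cells as candidate fracture zones; the grid, the bowl-shaped test surface, and the percentile threshold are assumptions, not the dissertation's actual data or parameters.

```python
# Hedged sketch of curvature analysis on a gridded subsidence surface.
# The dissertation's exact formulation is not given in the abstract;
# mean curvature from finite differences is used as one common choice.
import numpy as np

def mean_curvature(z, cell=1.0):
    """Mean curvature of a surface z(x, y) sampled on a square grid."""
    zy, zx = np.gradient(z, cell)        # first derivatives
    zxy, zxx = np.gradient(zx, cell)     # second derivatives
    zyy, _ = np.gradient(zy, cell)
    num = (1 + zx**2) * zyy - 2 * zx * zy * zxy + (1 + zy**2) * zxx
    den = 2 * (1 + zx**2 + zy**2) ** 1.5
    return num / den

# Toy example: a bowl-shaped sinking surface (meters) on a 1 km grid.
x = np.linspace(-500, 500, 101)
X, Y = np.meshgrid(x, x)
z = -10.0 * np.exp(-(X**2 + Y**2) / (2 * 200.0**2))  # up to 10 m sinking

k = mean_curvature(z, cell=10.0)
likely_fractures = np.abs(k) > np.percentile(np.abs(k), 99)
print(f"{likely_fractures.sum()} cells flagged as high-curvature zones")
```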
Abstract:
Environmental management includes many components, among them Environmental Management Systems (EMS), environmental reporting and analysis, environmental information systems, and environmental communication. In this work two applications are presented: the development and implementation of an Environmental Management System in local administrations, according to the European scheme "EMAS", and the analysis of a territorial energy system through scenario building and environmental sustainability assessment. Both applications share the same objective: the quest for more scientifically sound elements; in fact, both EMS and energy planning are often characterized by localism and poor comparability. Emergy synthesis, proposed by ecologist H.T. Odum and described in his book "Environmental Accounting: Emergy and Environmental Decision Making" (1996), has been chosen and applied as an environmental evaluation tool in order to complete the analysis with an assessment of the "global value" of goods and processes. In particular, emergy synthesis has been applied to improve the evaluation of the significance of environmental aspects in an EMS, and to evaluate the environmental performance of three scenarios for the future evolution of the energy system. Regarding EMS, this work discusses the application of an EMS together with the CLEAR methodology for environmental accounting, in order to improve the identification of environmental aspects; data on the environmental aspects, and the significant ones, of four local authorities are also presented, together with a preliminary proposal for integrating the assessment of the significance of environmental aspects with emergy synthesis. Regarding the analysis of the energy system, this work presents the characterization of the current situation together with the overall energy balance and an evaluation of greenhouse gas emissions; moreover, three scenarios of future evolution are described and discussed. The scenarios were built with the support of the LEAP software (Long-range Energy Alternatives Planning System, by SEI, the Stockholm Environment Institute). Finally, the emergy synthesis of the current situation and of the three scenarios is shown.
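At its core, an emergy synthesis multiplies each input flow by its unit emergy value (UEV, or transformity) and sums the results in solar emjoules. The minimal sketch below illustrates this bookkeeping; the input flows and UEVs are placeholders, not figures from this work.

```python
# Minimal sketch of an emergy synthesis table (after Odum, 1996).
# The UEVs below are placeholders; real studies take them from
# published emergy tables for the relevant region and process.
inputs = [
    # (name, annual flow, unit, UEV in solar emjoules per unit)
    ("electricity", 3.6e12, "J", 2.0e5),
    ("natural gas", 8.0e12, "J", 4.8e4),
    ("labour",      1.2e5,  "h", 3.0e12),
]

total_emergy = sum(flow * uev for _, flow, _, uev in inputs)
print(f"Total emergy: {total_emergy:.2e} seJ/yr")
for name, flow, unit, uev in inputs:
    share = flow * uev / total_emergy
    print(f"  {name:12s} {share:6.1%} of total")
```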
Abstract:
Rapid development in the field of lighting and illumination has enabled low energy consumption and driven rapid growth in the use and development of solid-state sources. As the efficiency of these devices increases and their cost decreases, they are predicted to become the dominant source for general illumination in the short term. The objective of this thesis is to study, through extensive simulations in realistic scenarios, the feasibility and exploitation of visible light communication (VLC) for vehicular ad hoc network (VANET) applications. A brief introduction presents the emerging scenario of smart cities, in which VLC is expected to become a fundamental enabling technology for future communication systems. Specifically, this thesis focuses on the acquisition of many frequent, small data packets from vehicles exploited as sensors of the environment. The use of vehicles as sensors is a new paradigm enabling efficient environmental monitoring and improved traffic management. In most cases, the sensed information must be collected at a remote control centre, and one of the most challenging aspects is the uplink acquisition of data from vehicles. The thesis discusses the opportunity to exploit short-range vehicle-to-vehicle (V2V) and vehicle-to-roadside (V2R) communications to offload the cellular network. More specifically, it discusses the system design and assesses the obtainable cellular resource savings, considering the impact of the percentage of vehicles equipped with short-range communication devices, the number of deployed roadside units, and the adopted routing protocol. As for short-range communications, WAVE/IEEE 802.11p is considered the standard for VANETs. Its use together with VLC is considered in urban vehicular scenarios to let vehicles communicate without involving the cellular network. The study is conducted by simulation, using both SHINE (simulation platform for heterogeneous interworking networks), developed within the Wireless Communication Laboratory (WiLab) of the University of Bologna and CNR, and the network simulator ns-3, aiming to realistically represent all aspects of wireless network communication. Specifically, the vehicular system was simulated in ns-3 by creating a new module for the simulator; this module will help in studying VLC applications in VANETs. The final observations are intended to encourage further research in the area and to help optimize the performance of future VLC system applications.
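The thesis's own channel models are not reproduced here; as an illustration of the physical layer underlying such studies, the following sketch evaluates the widely used Lambertian line-of-sight model for a VLC link (Kahn and Barry), with all geometry and device parameters assumed for the example and not taken from the thesis or from SHINE/ns-3.

```python
# Hedged sketch of a line-of-sight VLC link budget using the standard
# Lambertian LED model; parameter values are illustrative only.
import math

def los_gain(d, phi, psi, half_angle_deg=60.0, area=1e-4,
             fov_deg=70.0, ts=1.0, g=1.0):
    """DC channel gain of a line-of-sight VLC link.

    d:    transmitter-receiver distance (m)
    phi:  irradiance angle at the LED (rad)
    psi:  incidence angle at the photodiode (rad)
    """
    if psi > math.radians(fov_deg):
        return 0.0  # outside the receiver field of view
    # Lambertian order from the LED's semi-angle at half power
    m = -math.log(2) / math.log(math.cos(math.radians(half_angle_deg)))
    return ((m + 1) * area / (2 * math.pi * d**2)
            * math.cos(phi) ** m * ts * g * math.cos(psi))

# Toy vehicle-to-vehicle geometry: 20 m gap, near-aligned lamp/sensor.
p_tx = 1.0  # transmitted optical power (W), assumed
p_rx = p_tx * los_gain(d=20.0, phi=math.radians(10), psi=math.radians(10))
print(f"Received optical power: {p_rx:.3e} W")
```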
Abstract:
In this paper we provide a framework that enables the rapid development of applications using non-standard input devices. Flash is chosen as the programming language since it allows applications to be assembled quickly. We overcome Flash's difficulties in accessing external devices by introducing a very generic concept: the state information generated by input devices is transferred to a PC, where a program collects it, interprets it, and makes it available on a web server. Application developers can then integrate a Flash component that accesses the data, stored in XML format, and use it directly in their application.
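A minimal sketch of the PC-side bridge this concept describes, written in Python rather than as the original implementation: device state is published as XML over HTTP so that a Flash (or any other) client can poll it. The port, the sample state, and the XML element names are assumptions for illustration.

```python
# Hedged sketch of the device-state bridge: a PC-side program exposes
# the current input-device state as XML over HTTP for polling clients.
from http.server import BaseHTTPRequestHandler, HTTPServer

state = {"x": 0.42, "y": -0.17, "button": 1}  # stand-in for device input

class StateHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serialize the current state into a simple XML document.
        fields = "".join(f"<{k}>{v}</{k}>" for k, v in state.items())
        body = f"<device>{fields}</device>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/xml")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), StateHandler).serve_forever()
```

A client then simply fetches http://localhost:8000/ and parses the XML, which mirrors how the described Flash component consumes the data.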
Abstract:
In manual order picking systems, order pickers walk or drive through a distribution warehouse in order to collect items requested by (internal or external) customers. To perform these operations efficiently, customer orders usually have to be combined into (more substantial) picking orders of limited size. The Order Batching Problem considered in this paper deals with the question of how a given set of customer orders should be combined such that the total length of all tours necessary to collect all items is minimized. The authors introduce two metaheuristic approaches for the solution of this problem: the first is based on Iterated Local Search, the second on Ant Colony Optimization. In a series of extensive numerical experiments, the newly developed approaches are benchmarked against classic solution methods. It is demonstrated that the proposed methods are not only superior to existing methods but provide solutions that may allow distribution warehouses to be operated significantly more efficiently.
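As a hedged illustration of the first approach, the sketch below applies a bare-bones Iterated Local Search to a toy batching instance: a move-based local search alternates with a random swap perturbation, using the span of visited aisle slots as a crude proxy for tour length. The instance, capacity limit, and routing proxy are invented for the example and are far simpler than the paper's actual procedures.

```python
# Hedged Iterated Local Search sketch for order batching (toy instance).
import random
random.seed(1)

# 12 customer orders; each order lists the aisle slots of its items.
orders = {i: random.sample(range(100), 5) for i in range(12)}
CAP = 3  # maximum number of orders per picking batch (assumed)

def tour_len(batch):
    """Crude proxy for tour length: span of slots the picker covers."""
    slots = [s for o in batch for s in orders[o]]
    return max(slots) - min(slots) if slots else 0

def cost(batches):
    return sum(tour_len(b) for b in batches)

def local_search(batches):
    """Move single orders between batches while total cost decreases."""
    improved = True
    while improved:
        improved = False
        for src in batches:
            for dst in batches:
                if src is dst:
                    continue
                for o in list(src):
                    if len(dst) >= CAP:
                        break
                    before = cost(batches)
                    src.remove(o)
                    dst.append(o)
                    if cost(batches) < before:
                        improved = True
                    else:           # undo non-improving move
                        dst.remove(o)
                        src.append(o)
    return batches

def perturb(batches):
    """Swap two random orders between two non-empty batches."""
    a, b = random.sample([x for x in batches if x], 2)
    i, j = random.randrange(len(a)), random.randrange(len(b))
    a[i], b[j] = b[j], a[i]

best = local_search([[o] for o in orders])  # start: one order per batch
for _ in range(50):                          # ILS main loop
    cand = [list(b) for b in best]
    perturb(cand)
    cand = local_search(cand)
    if cost(cand) < cost(best):
        best = cand
print("total tour length (proxy):", cost(best))
```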
Abstract:
While revenue management (RM) is traditionally considered a tool for service operations, it shows considerable potential for application in manufacturing. The typical challenges in make-to-order manufacturing are fixed capacities and a great variety of offered products, accompanied by pronounced fluctuations in demand and profitability. Since the work of Harris and Pinder in the mid-1990s, numerous papers have furthered the understanding of RM theory in this environment. Nevertheless, the results to be expected from applying the developed methods in a practical industry setting have yet to be reported. To this end, this paper investigates a possible application of RM at ThyssenKrupp VDM, leading to considerable improvements in several areas.
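The paper's concrete methods are not detailed in the abstract; one standard RM idea in make-to-order settings is bid-price order acceptance: accept an order only if its margin per unit of bottleneck capacity exceeds an opportunity-cost threshold that rises as capacity fills. The sketch below illustrates this with invented numbers, not data or methods from the ThyssenKrupp VDM case.

```python
# Hedged sketch of bid-price order acceptance for a make-to-order shop.
# The bid-price function and all numbers are illustrative assumptions.
def bid_price(remaining_hours, total_hours, base=60.0, steep=3.0):
    """Opportunity cost per bottleneck hour; grows as capacity fills."""
    utilization = 1.0 - remaining_hours / total_hours
    return base * (1.0 + steep * utilization**2)

def accept(margin, hours, remaining_hours, total_hours):
    if hours > remaining_hours:
        return False  # order does not fit into remaining capacity
    return margin / hours >= bid_price(remaining_hours, total_hours)

remaining, total = 400.0, 1000.0  # bottleneck hours left / in period
for margin, hours in [(5000, 40), (1500, 30), (9000, 60)]:
    ok = accept(margin, hours, remaining, total)
    print(f"margin {margin:>5} for {hours:>2} h ->",
          "accept" if ok else "reject")
    if ok:
        remaining -= hours
```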
Abstract:
The e-learning platform VBA@HfTL supports the learning of fundamental programming concepts using the programming language Visual Basic for Applications (VBA). The platform was developed by students for students of business informatics, thereby implementing a Student2Student (S2S) approach. This contribution introduces the conceptual foundations of this approach and explains the organizational and technical framework of the development project as a research case study. The project outcome shows that students can develop e-learning resources in a self-organized manner and, in doing so, acquire interdisciplinary knowledge in business informatics. Thanks to the strong response it has received, the resulting e-learning platform not only makes a valuable contribution to supporting learning processes in education and further training, but also offers the university a way to raise the profile of its educational offerings as part of its public relations work.
Abstract:
Thanks to current developments in online teaching (video platforms, MOOCs) on the one hand, and to a huge selection as well as simple production and distribution on the other, instructional videos enjoy great popularity in knowledge transfer. Nevertheless, videos carry a decisive disadvantage, which lies in the nature of the data format: searching for specific subject matter within a video, and the semantic preparation needed to automatically link it with further specific content, involve considerable effort. This hampers the learning-outcome-oriented selection of instructional segments and their arrangement for guidance tuned to learning processes. While watching a video, learners may be forced to sit through content they already know, or can skip it only by tedious manual seeking; the same problem arises when deliberately repeating video sections. As a solution to this problem, a web application is presented that enables the semantic preparation of videos into adaptive learning content: by integrating self-test exercises with defined follow-up actions, video sections can be automatically skipped or repeated and external content linked, based on the learner's current knowledge. The presented approach thus builds on an extension of Crowder's behaviorist learning theory of branched programmes, which provides sequences of learning units adapted to the course of learning. At the same time, regularly interspersed self-test exercises foster the learner's motivation and attention according to the rules of Skinner's programmed instruction and reinforcement theory. By explicitly marking up related sections in videos, the information they contain can additionally be made machine-readable, creating further possibilities for finding and linking learning content.
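As a hedged sketch of the branching mechanism described above, the fragment below encodes per-segment follow-up actions and decides, after each self-test, whether to advance, repeat, or move on with a link to external help; the segment data, branching rules, and URLs are illustrative assumptions, not the web application's actual data model.

```python
# Hedged sketch of Crowder-style branching for adaptive video segments.
segments = {
    "intro": {"next": "loops", "on_pass": "loops", "on_fail": "intro",
              "external": "https://example.org/vba-basics"},      # assumed
    "loops": {"next": "end", "on_pass": "end", "on_fail": "loops",
              "external": "https://example.org/loops-refresher"},  # assumed
}

def next_segment(current, answer_correct, attempts):
    """Decide what to play next after a self-test exercise."""
    seg = segments[current]
    if answer_correct:
        return seg["on_pass"], None          # learner advances (or skips)
    if attempts >= 2:
        return seg["next"], seg["external"]  # move on, but link extra help
    return seg["on_fail"], None              # repeat the current segment

pos, tries = "intro", 1
for correct in [False, True, True]:  # simulated self-test answers
    nxt, link = next_segment(pos, correct, tries)
    print(f"{pos}: {'pass' if correct else 'fail'} -> {nxt}"
          + (f" (see {link})" if link else ""))
    tries = tries + 1 if nxt == pos else 1
    pos = nxt
```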
Abstract:
Current advanced cloud infrastructure management solutions allow scheduling actions that dynamically change the number of running virtual machines (VMs). This approach, however, does not guarantee that the scheduled number of VMs will properly handle the actual user-generated workload, especially if usage patterns change. We propose a dynamically generated scaling model for the VMs hosting the services of distributed applications, able to react to variations in the number of application users. We answer the following question: how can we dynamically decide how many services of each type are needed to handle a larger workload within the same time constraints? We describe a mechanism for dynamically composing the SLAs that control the scaling of distributed services, combining data analysis with application benchmarking over multiple VM configurations. By processing the data sets generated by multiple application benchmarks, we discover a set of service monitoring metrics able to predict critical Service Level Agreement (SLA) parameters. By combining this set of predictor metrics with a heuristic for selecting appropriate scaling-out paths for the services of distributed applications, we show how SLA scaling rules can be inferred and then used to control the runtime scale-in and scale-out of distributed services. We validate our architecture and models through scaling experiments with a distributed application representative of enterprise-class information systems, and show how dynamically generated SLAs can successfully control the scaling of distributed services.
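As a hedged illustration of inferring a scaling rule from benchmarks, the sketch below derives a scale-out threshold for one predictor metric from (metric, response time) benchmark pairs and uses it in a simple runtime decision; the data, the single-metric rule, and the 0.9 safety margin are invented and far simpler than the mechanism described above.

```python
# Hedged sketch: infer a scale-out threshold from benchmark data, then
# apply it as a runtime scaling rule. All numbers are illustrative.
benchmarks = [  # (cpu_load, response_time_ms) from benchmark runs
    (0.30, 120), (0.45, 150), (0.60, 210), (0.72, 340),
    (0.80, 520), (0.88, 900), (0.95, 1600),
]
SLA_MS = 400.0  # assumed SLA target on response time

# Lowest predictor value at which the SLA was breached, with a margin.
breaches = [cpu for cpu, rt in benchmarks if rt > SLA_MS]
threshold = 0.9 * min(breaches)  # scale out before the breach region

def scale_decision(current_cpu, instances):
    if current_cpu > threshold:
        return instances + 1   # scale out
    if current_cpu < threshold / 2 and instances > 1:
        return instances - 1   # scale in
    return instances

print(f"scale-out threshold: cpu > {threshold:.2f}")
print(scale_decision(0.70, 3), scale_decision(0.20, 3))
```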