920 results for INTELLIGENCE SYSTEMS METHODOLOGY


Relevance: 30.00%

Abstract:

This paper examines two passive techniques for vibration reduction in mechanical systems: the first one is based on dynamic vibration absorbers (DVAs) and the second uses resonant circuit shunted (RCS) piezoceramics. Genetic algorithms are used to determine the optimal design parameters with respect to performance indexes, which are associated with the dynamical behavior of the system over selected frequency bands. The calculation of the frequency response functions (FRFs) of the composite structure (primary system + DVAs) is performed through a substructure coupling technique. A modal technique is used to determine the frequency response function of the structure containing shunted piezoceramics which are bonded to the primary structure. The use of both techniques simultaneously on the same structure is investigated. The methodology developed is illustrated by numerical applications in which the primary structure is represented by simple Euler-Bernoulli beams. However, the design aspects of vibration control devices presented in this paper can be extended to more complex structures.
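The coupling of a genetic algorithm with FRF evaluation described above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: a two-degree-of-freedom primary-plus-absorber model with made-up parameter values, and a simple elitist GA searching for the absorber stiffness and damping that minimize the peak receptance over a frequency band.

```python
import random

# Primary system (illustrative values, not taken from the paper)
M1, K1, C1 = 1.0, 1.0e4, 2.0   # mass [kg], stiffness [N/m], damping [Ns/m]
MU = 0.05                      # absorber-to-primary mass ratio
M2 = MU * M1

def frf_peak(k2, c2, band=(60.0, 140.0), n=200):
    """Peak receptance magnitude |H11(w)| of the primary mass over a band [rad/s]."""
    lo, hi = band
    peak = 0.0
    for i in range(n):
        w = lo + (hi - lo) * i / (n - 1)
        z11 = complex(K1 + k2 - w * w * M1, w * (C1 + c2))  # dynamic stiffness terms
        z12 = complex(-k2, -w * c2)
        z22 = complex(k2 - w * w * M2, w * c2)
        det = z11 * z22 - z12 * z12
        h11 = z22 / det
        peak = max(peak, abs(h11))
    return peak

def genetic_search(pop_size=30, gens=40, seed=1):
    """Elitist GA over (k2, c2) with blend crossover and Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(1e2, 1e3), rng.uniform(0.1, 10.0)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: frf_peak(*p))
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]      # blend crossover
            child = [max(g * rng.gauss(1.0, 0.05), 1e-6) for g in child]  # mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: frf_peak(*p))
```

A well-tuned absorber (near k2 = M2 * wn^2) splits and damps the primary resonance, so the optimized design should beat any mistuned one such as (k2, c2) = (150, 0.5).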

Relevance: 30.00%

Abstract:

The necessity of integrating EC (Electronic Commerce) with enterprise systems follows from the integrated nature of enterprise systems. The proven ability of EC to provide competitive advantages forces enterprises to adopt it and integrate it with their enterprise systems. Integration is a complex task: it must enable a seamless flow of information and data between different systems within and across enterprises. Because different systems run on different platforms and infrastructures, integration technologies such as middleware, SOA (Service-Oriented Architecture), ESB (Enterprise Service Bus), JCA (J2EE Connector Architecture), and B2B (Business-to-Business) integration standards are required. Major software vendors, such as Oracle, IBM, Microsoft, and SAP, offer various solutions to EC and enterprise systems integration problems. There is, however, only limited literature covering the integration of EC and enterprise systems in detail: most studies in this area focus on the factors influencing the adoption of EC by enterprises, or provide limited information about a specific platform or integration methodology. This thesis was therefore conducted to cover the technical details of EC and enterprise systems integration, addressing both adoption factors and integration solutions. A broad body of literature was reviewed and different solutions were investigated, covering enterprise integration approaches as well as the most popular integration technologies. Various methodologies for integrating EC and enterprise systems were studied in detail and different solutions were examined. The factors influencing EC adoption in enterprises were identified from previous literature and categorized into technical, social, managerial, financial, and human resource factors.
Moreover, integration technologies were categorized by the three levels of integration: data, application, and process. Different integration approaches were identified and categorized by their communication style and platform, and EC integration solutions were investigated and categorized according to these approaches. By considering these different aspects of integration, this study is a valuable asset to architects, developers, and system integrators who integrate EC with enterprise systems.
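Application-level integration through a bus, as opposed to point-to-point links, can be illustrated with a toy publish/subscribe broker. This is a hypothetical sketch (the class, topic, and endpoint names are invented), showing how an EC storefront and an ERP system could be decoupled in the ESB style discussed above:

```python
from collections import defaultdict

class MiniBus:
    """A toy ESB-style publish/subscribe broker: applications integrate by
    exchanging messages on topics instead of calling each other directly."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subscribers[topic]:
            handler(message)

# Hypothetical EC storefront and ERP endpoints (illustrative only)
orders_in_erp = []
bus = MiniBus()
bus.subscribe("order.created",
              lambda msg: orders_in_erp.append({"sku": msg["sku"], "qty": msg["qty"]}))
# The web shop publishes an event; the ERP adapter consumes it.
bus.publish("order.created", {"sku": "A-100", "qty": 2})
```

Adding a second subscriber (say, a warehouse system) requires no change to the publisher, which is the point of bus-based over point-to-point integration.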

Relevance: 30.00%

Abstract:

This work presents a methodology for the development of teleoperated robotic systems over the Internet. First, a literature review of telerobotic systems that use the Internet as the control channel is presented. The methodology is then implemented and tested through the development of two systems. The first is a manipulator with two degrees of freedom commanded remotely through the Internet, called RobWebCam (http://www.graco.unb.br/robwebcam). The second, called RobWebLink (http://webrobot.graco.unb.br), teleoperates an ABB (Asea Brown Boveri) industrial robot with six degrees of freedom. RobWebCam comprises a two-degree-of-freedom manipulator, a video camera, the Internet, computers, and a communication driver between the manipulator and the Unix system; RobWebLink comprises the same components plus the industrial robot. With this technology it is possible to manipulate objects at distant locations, minimizing the cost of transporting materials and people, while acting in real time on the process to be controlled. This work demonstrates that teleoperating robotic systems and other equipment via the Internet is viable despite low-bandwidth data transmission. Possible applications include remote surveillance, remote control, and remote diagnosis and maintenance of machines and equipment.
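Low-bandwidth teleoperation of the kind described benefits from compact command encoding. The sketch below is purely illustrative and is not the RobWebCam/RobWebLink protocol: it packs a sequence number and six joint angles into a fixed 28-byte frame.

```python
import struct

# Hypothetical wire format (an assumption, not the systems' actual protocol):
# an unsigned 32-bit sequence number plus six float32 joint angles in radians,
# packed little-endian for compactness over a slow link.
PACKET = struct.Struct("<I6f")

def encode_command(seq, joints):
    """Serialize one joint-space command into a 28-byte frame."""
    assert len(joints) == 6
    return PACKET.pack(seq, *joints)

def decode_command(payload):
    """Deserialize a frame back into (sequence number, joint angles)."""
    seq, *joints = PACKET.unpack(payload)
    return seq, list(joints)
```

The sequence number lets the receiver drop stale or reordered commands, a common concern when commanding a robot over a best-effort network.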

Relevance: 30.00%

Abstract:

Industrial applications demand that robots operate according to a prescribed position and orientation of their end effector, which requires solving the inverse kinematics problem: determining the joint displacements of the manipulator that accomplish a given objective. Complete studies of the dynamic control of robotic joints are also necessary. This article first focuses on the implementation of numerical algorithms for solving the inverse kinematics problem and on the modeling and simulation of dynamic systems, carried out as a real-time implementation; the modeling and simulation of dynamic systems emphasize off-line programming. Next, a complete study of control strategies is carried out by examining the elements of a robotic joint: DC motor, inertia, and gearbox. Finally, a trajectory generator, used as input for a generic group of joints, is developed, and an implementation of the joint controllers on an EPLD development system is proposed.
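As a minimal illustration of the numerical inverse kinematics algorithms mentioned above, the sketch below applies Newton-Raphson iteration to a two-link planar arm with assumed link lengths; it is not the article's implementation.

```python
import math

L1_LEN, L2_LEN = 1.0, 0.8  # link lengths (illustrative values)

def forward(theta1, theta2):
    """Forward kinematics: joint angles -> end-effector position."""
    x = L1_LEN * math.cos(theta1) + L2_LEN * math.cos(theta1 + theta2)
    y = L1_LEN * math.sin(theta1) + L2_LEN * math.sin(theta1 + theta2)
    return x, y

def inverse(x, y, theta=(0.5, 1.0), iters=50):
    """Newton-Raphson iteration on the 2x2 Jacobian of the forward map.
    A reasonable initial guess matters near workspace boundaries."""
    t1, t2 = theta
    for _ in range(iters):
        fx, fy = forward(t1, t2)
        ex, ey = x - fx, y - fy
        # Jacobian entries of the forward map
        j11 = -L1_LEN * math.sin(t1) - L2_LEN * math.sin(t1 + t2)
        j12 = -L2_LEN * math.sin(t1 + t2)
        j21 = L1_LEN * math.cos(t1) + L2_LEN * math.cos(t1 + t2)
        j22 = L2_LEN * math.cos(t1 + t2)
        det = j11 * j22 - j12 * j21
        if abs(det) < 1e-12:
            t1 += 1e-3  # nudge away from a singular pose
            continue
        # Solve J * dtheta = e by Cramer's rule
        dt1 = (j22 * ex - j12 * ey) / det
        dt2 = (-j21 * ex + j11 * ey) / det
        t1, t2 = t1 + dt1, t2 + dt2
    return t1, t2

t1, t2 = inverse(1.2, 0.5)  # a reachable target inside the workspace
```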

Relevance: 30.00%

Abstract:

State-of-the-art predictions of atmospheric states rely on large-scale numerical models of chaotic systems. This dissertation studies numerical methods for state and parameter estimation in such systems. The motivation comes from weather and climate models, and a methodological perspective is adopted. The dissertation comprises three parts: state estimation, parameter estimation, and chemical data assimilation with real atmospheric satellite data. In the state estimation part, a new filtering technique based on a combination of ensemble and variational Kalman filtering approaches is presented, tested, and discussed. This new filter is developed for large-scale Kalman filtering applications. In the parameter estimation part, three different techniques for parameter estimation in chaotic systems are considered. The methods are studied using the parameterized Lorenz 95 system, a benchmark model for data assimilation. In addition, a dilemma related to the uniqueness of weather and climate model closure parameters is discussed. In the data-oriented part, data from the Global Ozone Monitoring by Occultation of Stars (GOMOS) satellite instrument are considered, and an alternative algorithm to retrieve atmospheric parameters from the measurements is presented. The validation study presents the first global comparisons between two unique satellite-borne datasets of vertical profiles of nitrogen trioxide (NO3), retrieved using the GOMOS and Stratospheric Aerosol and Gas Experiment III (SAGE III) satellite instruments. The GOMOS NO3 observations are also used in a chemical state estimation study to retrieve stratospheric temperature profiles. The main result of this dissertation is the formulation of likelihood calculations via Kalman filtering outputs. The concept has previously been used together with stochastic differential equations and in time series analysis.
In this work, the concept is applied to chaotic dynamical systems and used together with Markov chain Monte Carlo (MCMC) methods for statistical analysis. In particular, this methodology is advocated for use in numerical weather prediction (NWP) and climate model applications. In addition, the concept is shown to be useful in estimating filter-specific parameters related to, e.g., the model error covariance matrix.
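The dissertation's central device, evaluating a likelihood through Kalman filter outputs, can be shown on a toy linear-Gaussian model (a deliberately simple stand-in for the chaotic systems actually studied). The log-likelihood is accumulated from the filter's one-step prediction errors; in an MCMC setting this value would feed the Metropolis acceptance ratio. All parameter values are illustrative.

```python
import math
import random

def simulate(a, q, r, T, seed=0):
    """Generate observations from x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r)."""
    rng = random.Random(seed)
    x, ys = 0.0, []
    for _ in range(T):
        x = a * x + rng.gauss(0.0, math.sqrt(q))
        ys.append(x + rng.gauss(0.0, math.sqrt(r)))
    return ys

def kf_loglik(ys, a, q, r):
    """Log-likelihood from the Kalman filter's one-step prediction errors."""
    m, p, ll = 0.0, 1.0, 0.0
    for y in ys:
        m_pred, p_pred = a * m, a * a * p + q          # predict
        s = p_pred + r                                 # innovation variance
        v = y - m_pred                                 # innovation
        ll += -0.5 * (math.log(2 * math.pi * s) + v * v / s)
        k = p_pred / s                                 # Kalman gain, then update
        m, p = m_pred + k * v, (1 - k) * p_pred
    return ll

ys = simulate(a=0.85, q=0.3, r=0.5, T=400)
```

An MCMC chain over the model parameter a would simply compare kf_loglik values at proposed and current parameter values; the true value should score higher than a badly wrong one.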

Relevance: 30.00%

Abstract:

More discussion is required on how, and which types of, biomass should be used to achieve a significant reduction in the carbon load released into the atmosphere in the short term. The energy sector is one of the largest greenhouse gas (GHG) emitters, and thus its role in climate change mitigation is important. Replacing fossil fuels with biomass has been a simple way to reduce carbon emissions because the carbon bound in biomass is considered carbon neutral. With this in mind, this thesis has the following objectives: (1) to study the significance of the different GHG emission sources related to energy production from peat and biomass, (2) to explore opportunities to develop more climate-friendly biomass energy options, and (3) to discuss the importance of the biogenic emissions of biomass systems. The discussion on biogenic carbon and other GHG emissions comprises four case studies, of which two consider peat utilization, one forest biomass, and one cultivated biomasses. Various biomass types (peat, pine logs and forest residues, palm oil, rapeseed oil, and jatropha oil) are used as examples to demonstrate the importance of biogenic carbon to life-cycle GHG emissions. The biogenic carbon emissions of biomass are defined as the difference in the carbon stock between the utilization and non-utilization scenarios of the biomass. Forestry-drained peatlands were studied using the high emission values of the peatland types in question to discuss the emission reduction potential of the peatlands. The results are presented as global warming potential (GWP) values. Based on the results, the climate impact of peat production can be reduced by selecting high-emission-level peatlands for peat production. The comparison of two types of forest biomass in integrated ethanol production at a pulp mill shows that the type of forest biomass affects the biogenic carbon emissions of biofuel production.
The assessment of cultivated biomasses demonstrates that several choices made in the production chain significantly affect the GHG emissions of biofuels. The emissions caused by a biofuel can exceed the emissions from fossil fuels in the short term if biomass is partly consumed in the process itself and does not end up in the final product. Including biogenic carbon and other land-use carbon emissions in the carbon footprint calculations of a biofuel reveals the importance of the time frame and of the efficiency with which the biomass carbon content is utilized. As regards the climate impact of biomass energy use, the net impact on carbon stocks (in the organic matter of soils and biomass), compared to the impact of the replaced energy source, is the key issue. Promoting renewable biomass regardless of biogenic GHG emissions can increase GHG emissions in the short term, and possibly also in the long term.
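The definition quoted above, biogenic emissions as the carbon-stock difference between the utilization and non-utilization scenarios, reduces to simple arithmetic. The numbers below are invented for illustration and are not results from the thesis.

```python
def biogenic_emissions(stock_no_use, stock_use):
    """Biogenic CO2 emissions as the carbon-stock difference between the
    non-utilization and utilization scenarios, converted from tonnes of
    carbon to tonnes of CO2 via the molar mass ratio 44/12."""
    return (stock_no_use - stock_use) * 44.0 / 12.0

# Illustrative scenario: a stand left to grow would hold 120 t C over the
# time frame considered; the harvesting scenario leaves 80 t C.
co2 = biogenic_emissions(120.0, 80.0)  # 40 t C difference, expressed as CO2
```

The chosen time frame drives the result: the longer the horizon, the larger the regrowth in the utilization scenario and the smaller the stock difference.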

Relevance: 30.00%

Abstract:

This pro gradu thesis discusses generating competitive advantage through competitor information systems. The structure of the thesis follows the WCA model by Alter (1996), in which a business process is influenced by three separate but connected elements: information, technology, and process participants. The main research question is how competitor information can be incorporated into, or made into, a tool creating competitive advantage. The research subquestions are: How does competitor information act as part of a business process creating competitive advantage? How is a good competitor information system situated and structured in an organisation? How can management help information generate competitive advantage in the business process with participants, information, and technology? This thesis discusses each of the elements separately, but the elements are connected to each other and to competitive advantage. Information is discussed by delving into competitor information and competitor analysis. Competitive intelligence and competitor analysis require commitment throughout the organisation, including top management, a desire to perform competitive intelligence, and a desire to use its end products. To be successful, systematic competitive intelligence and competitor analysis require vision, willingness to strive for the goals set, and clear strategies for proceeding. Technology is discussed by examining the function competitor information systems play and the place they occupy within an organisation, their basic infrastructure, and the problems that can plague them.
For competitor information systems to be useful and worth the resources it takes to develop and maintain them, they require ongoing resource allocation and high-quality information, and business process participants need to maintain and utilize them on all levels. Business process participants are discussed through management practices. This thesis discusses ways to manage information, technology, and process participants when the goal is to generate competitive advantage through competitor information systems. This is possible when information is treated as a resource with value, technology is deployed with a strategy, and process participants are treated as an important resource. Generating competitive advantage through competitor information systems is possible when the elements of information, technology, and business process participants all align advantageously.

Relevance: 30.00%

Abstract:

The purpose of this study is to explore the possibilities of utilizing business intelligence (BI) systems in management control (MC). The topic is explored through four research questions. Firstly, what kinds of management control systems (MCS) use, or could use, the data and information enabled by the BI system? Secondly, how is the BI system utilized, or how could it be utilized? Thirdly, has the BI system enabled new forms of control or changed old ones? Fourthly, does the BI system support forms of control that the literature has not considered, or is it not used for forms of control for which the literature suggests it should be? The study is conducted as an extensive case study for which three different organizations were interviewed. As the theoretical basis of the study, central theories in the field of management control are introduced, the term business intelligence is discussed in detail, and the mechanisms for the governance of business intelligence are presented. A literature analysis of the uses of BI for management control is introduced, and the theoretical part of the study ends with the construction of a framework for business intelligence in management control. In the empirical part of the study, the case organizations, their BI systems, and the ways they utilize these systems for management control are presented. The main findings are that BI systems can be utilized in the fields suggested in the literature, namely in planning, cybernetic, reward, boundary, and interactive control. The systems are used both as data or information feeders and directly as tools. Using BI systems has also enabled entirely new forms of control in the studied organizations, most significantly in the area of interactive control. They have also changed the old control systems by making information more readily available to the whole organization.
No evidence was found of the BI systems being used for forms of control that the literature had not suggested. The systems were mostly used for cybernetic control and interactive control, whereas support for other types of control was not as prevalent. The main contribution of the study to the existing literature is the insight it provides into how BI systems are used for management control, both theoretically and empirically. The framework for business intelligence in management control presented in the study can also be utilized in further studies on the subject.

Relevance: 30.00%

Abstract:

In recent decades, business intelligence (BI) has gained momentum in real-world practice. At the same time, business intelligence has evolved as an important research subject of Information Systems (IS) within the decision support domain. Today's growing competitive pressure in business has led to increased needs for real-time analytics, i.e., so-called real-time BI or operational BI. This is especially true of the electricity production, transmission, distribution, and retail business, since the laws of physics dictate that electricity as a commodity is nearly impossible to store economically, and therefore demand and supply need to be constantly in balance. The current power sector is subject to complex changes, innovation opportunities, and technical and regulatory constraints. These range from the low-carbon transition, renewable energy sources (RES) development, and market design to new technologies (e.g., smart metering, smart grids, electric vehicles) and new independent power producers (e.g., commercial buildings or households with rooftop solar panel installations, a.k.a. Distributed Generation). Among them, the ongoing deployment of Advanced Metering Infrastructure (AMI) has profound impacts on the electricity retail market. From the viewpoint of BI research, AMI enables real-time or near real-time analytics in the electricity retail business. Following the Design Science Research (DSR) paradigm in the IS field, this research presents four aspects of BI for efficient pricing in a competitive electricity retail market: (i) visual data-mining based descriptive analytics, namely electricity consumption profiling, for pricing decision-making support; (ii) real-time BI enterprise architecture for enhancing management's capacity for real-time decision-making; (iii) prescriptive analytics through agent-based modeling for price-responsive demand simulation; (iv) a visual data-mining application for electricity distribution benchmarking.
Even though this study is from the perspective of the European electricity industry, particularly focused on Finland and Estonia, the BI approaches investigated can: (i) provide managerial implications to support the utility’s pricing decision-making; (ii) add empirical knowledge to the landscape of BI research; (iii) be transferred to a wide body of practice in the power sector and BI research community.
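Descriptive analytics of the consumption-profiling kind mentioned in point (i) often starts from clustering daily load curves. The sketch below is a generic k-means illustration on synthetic profiles, not the study's method or data.

```python
def kmeans(profiles, k=2, iters=10):
    """Plain k-means over daily load profiles (lists of hourly kWh readings).
    Centers are seeded deterministically by spreading over the input order."""
    centers = [list(profiles[i * (len(profiles) - 1) // (k - 1)]) for i in range(k)]
    assign = [0] * len(profiles)
    for _ in range(iters):
        # Assignment step: nearest center by squared Euclidean distance
        for i, p in enumerate(profiles):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
        # Update step: recompute each center as the mean of its members
        for c in range(k):
            members = [p for i, p in enumerate(profiles) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centers

# Two synthetic customer types: evening-peaking households vs. daytime offices
# (four "hours" per profile keep the example compact).
households = [[1, 1, 1, 5 + 0.1 * i] for i in range(5)]
offices = [[5, 5 + 0.1 * i, 1, 1] for i in range(5)]
labels, _ = kmeans(households + offices)
```

In a retail-pricing setting, each resulting cluster center is a representative load shape that a tariff can be designed against.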

Relevance: 30.00%

Abstract:

In the future, business intelligence, a knowledge-based approach, and knowledge management will play a significant role in rescue departments' decisions about services. The challenge for rescue departments, which operate as municipal enterprises and separate balance-sheet units of the public rescue services, will lie in the strategic management and planning of efficient and effective services. Deciding on these matters is a critical phase for success. Decision-making at different levels needs the support of analyzed information channeled from operations and services, and effectiveness and quality driven by customer needs are emphasized. Business intelligence and a knowledge-based approach challenge the rescue department's management system: management capability and staff competence are at the core of knowledge-based operations and information management. What distinguishes systematic business intelligence and a knowledge-based approach from a civil servant's traditional use of information is the comprehensiveness and systematic nature of the concept across all knowledge-related activity. This covers information systems, metrics, processes, strategy plans, documents, reporting, development, and research. Business intelligence and knowledge management link everything together, forming an interdependent, unified system and a holistic understanding. This study is qualitative research in which data collection and analysis were carried out with mutually supporting research approaches. The methodology rests on theory-driven systematic analysis with selected elements of content analysis, using data and method triangulation. The material was collected through thematic interviews with experts from the selected rescue departments at the decision-making and planning level of services, from management groups and boards. For the interviews, the researcher studied the documentation defining the target rescue departments' services, such as service level decisions and risk analyses.
The rescue departments of the Helsinki metropolitan area were selected as data collection targets: the Helsinki City Rescue Department and the Eastern, Central, and Western Uusimaa rescue departments. According to the results, the key obstacles to business intelligence in rescue departments consist of management problems, organizational resistance to change, and the lack of a knowledge base for decision-making. These manifest as shortcomings in strategic management and as problems in measuring effectiveness and refining information. A central unifying and linking knowledge factor is not recognized or found; according to the results, business intelligence process work could offer possibilities for filling this vacuum. In the future, rescue departments face a choice of direction for information management, knowledge management, and the knowledge-based approach. This will affect the development of, and goals for, the management and information systems that support service decision-making, the documents that aggregate and create knowledge, and a flexible organizational structure. According to the results of the study, moving toward a knowledge process, process-like management of knowledge, and systematic information management appears to be a promising opportunity. At the same time, it challenges rescue departments to a major cultural change and requires strategic planning to accept in advance the information produced by new effectiveness indicators. This demands competence, mutual understanding, and acceptance of the need for change from rescue departments' management and staff, as well as placing the customer at the center of effectiveness.

Relevance: 30.00%

Abstract:

In this work, the separation of multicomponent mixtures in counter-current columns with supercritical carbon dioxide has been investigated using a process design methodology. First the separation task is defined, then phase equilibria experiments are carried out, and the data obtained are correlated with thermodynamic models or empirical functions. Mutual solubilities, Ki-values, and separation factors αij are determined. Based on these data, possible operating conditions for further extraction experiments can be determined. Separation analyses using graphical methods are performed to optimize the process parameters. Hydrodynamic experiments are carried out to determine the flow capacity diagram. Extraction experiments at laboratory scale are planned and carried out to determine HETP values, to validate the simulation results, and to provide new materials for additional phase equilibria experiments, which are needed to determine the dependence of the separation factors on concentration. Numerical simulation of the separation process and auxiliary systems is carried out to optimize the number of stages, solvent-to-feed ratio, product purity, yield, and energy consumption. Scale-up and cost analysis close the process design. The separation of palmitic acid and (oleic + linoleic) acids from PFAD (Palm Fatty Acid Distillates) was used as a case study.
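The Ki-values and separation factors mentioned above follow directly from measured phase compositions. A minimal sketch with invented equilibrium numbers (not data from the study):

```python
def k_value(y_solvent_phase, x_liquid_phase):
    """Distribution coefficient K_i = y_i / x_i between the CO2-rich
    phase and the liquid phase (compositions on a consistent basis)."""
    return y_solvent_phase / x_liquid_phase

def separation_factor(k_i, k_j):
    """alpha_ij = K_i / K_j; values well above 1 favor separating i from j."""
    return k_i / k_j

# Illustrative equilibrium compositions (made-up numbers, not from the study):
k_palmitic = k_value(0.012, 0.30)   # palmitic acid: CO2 phase vs. liquid phase
k_oleic = k_value(0.006, 0.55)      # (oleic + linoleic) fraction
alpha = separation_factor(k_palmitic, k_oleic)
```

Operating conditions for the extraction runs would then be screened where such alpha values stay comfortably above 1 over the relevant concentration range.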

Relevance: 30.00%

Abstract:

The aim of this thesis is to propose a novel control method for teleoperated electrohydraulic servo systems that implements a reliable haptic sense between the human and the manipulator and ideal position control between the manipulator and the task environment. The proposed method is a universal technique independent of the actual control algorithm and can be applied together with other suitable control methods as a real-time control strategy. The motivation for developing this control method is the need for a reliable real-time controller for teleoperated electrohydraulic servo systems that provides highly accurate position control based on joystick inputs with haptic capabilities. The contribution of the research is that the proposed control method combines a directed random search method and real-time simulation to develop an intelligent controller in which each generation of parameters is tested on-line by the real-time simulator before being applied to the real process. The controller was evaluated on a hydraulic position servo system. The simulator of the hydraulic system was built based on the Markov chain Monte Carlo (MCMC) method. A particle swarm optimization algorithm combined with the foraging behavior of E. coli bacteria was utilized as the directed random search engine. The control strategy allows the operator to be plugged into the work environment dynamically and kinetically, which helps to ensure the system has a haptic sense with high stability without abstracting away the dynamics of the hydraulic system. The new control algorithm provides asymptotically exact tracking of both the position and the contact force. In addition, this research proposes a novel method for the re-calibration of multi-axis force/torque sensors that makes several improvements over traditional methods.
It can be used without dismantling the sensor from its application, requires a smaller number of standard loads for calibration, and is more cost-efficient and faster than traditional calibration methods. The method was developed in response to re-calibration issues with the force sensors utilized in teleoperated systems; the new approach aims to avoid dismantling the sensors from their applications for calibration. A major complication with many manipulators is the difficulty of accessing them when they operate inside an inaccessible environment, especially a harsh one such as a radioactive area. The proposed technique is based on design-of-experiments methodology. It has been successfully applied to different force/torque sensors, and this research presents experimental validation of the calibration method with one of the force sensors to which it has been applied.
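The directed-random-search idea can be illustrated with a plain particle swarm (without the E. coli foraging hybrid used in the thesis) tuning a proportional gain against a simulated plant. The first-order plant here is a deliberately simplified stand-in for the hydraulic servo dynamics, and all values are illustrative.

```python
import random

def tracking_cost(kp, setpoint=1.0, dt=0.01, steps=300):
    """Integrated squared error of a P-controller on the toy plant x' = -x + u."""
    x, cost = 0.0, 0.0
    for _ in range(steps):
        u = kp * (setpoint - x)
        x += dt * (-x + u)              # explicit Euler step of the plant
        cost += dt * (setpoint - x) ** 2
    return cost

def pso(n=15, iters=40, lo=0.0, hi=50.0, seed=2):
    """Minimal particle swarm: each candidate gain is 'tested' on the
    simulator before the best one would be applied to the real process."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    best = pos[:]                       # per-particle best positions
    gbest = min(pos, key=tracking_cost)
    for _ in range(iters):
        for i in range(n):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (0.7 * vel[i] + 1.5 * r1 * (best[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)
            if tracking_cost(pos[i]) < tracking_cost(best[i]):
                best[i] = pos[i]
        gbest = min(best, key=tracking_cost)
    return gbest

kp = pso()
```

Evaluating every candidate on the simulator before deployment mirrors, in miniature, the thesis's idea of testing each generation of parameters on-line against a real-time model.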

Relevance: 30.00%

Abstract:

The number of security violations is increasing, and a security breach could have irreversible impacts on a business. There are several ways to improve organizational security, but some of them may be difficult to comprehend. This thesis demystifies threat modeling as part of secure system development. Threat modeling enables developers to reveal previously undetected security issues in computer systems. It offers a structured approach for organizations to find and address threats against vulnerabilities. When implemented correctly, threat modeling reduces the number of defects and malicious attempts against the target environment. In this thesis, the Microsoft Security Development Lifecycle (SDL) is introduced as an effective methodology for reducing defects in the target system. The SDL is traditionally meant to be used in software development, but its principles can be partially adapted to IT infrastructure development. The Microsoft threat modeling methodology is an important part of the SDL, and it is utilized in this thesis to find threats in the factory environment of Acme Corporation, a pseudonym for a company providing high-technology consumer electronics. The target of the threat modeling is the IT infrastructure of the factory's manufacturing execution system. The Microsoft threat modeling methodology uses the STRIDE mnemonic and data flow diagrams to find threats. The threat modeling in this thesis returned results that were important for the organization: Acme Corporation now has a more comprehensive understanding of the IT infrastructure of its manufacturing execution system. On top of the vulnerability-related results, threat modeling provided coherent views of the target system, so that subject matter experts from different areas can now agree on the functions and dependencies of the target system. Threat modeling was recognized as a useful activity for improving security.
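The STRIDE-per-element idea used in the Microsoft methodology can be sketched as a lookup from data-flow-diagram element types to applicable threat categories. The mapping below is a simplified rendering of the commonly published chart, not an authoritative copy; consult the SDL documentation for the definitive version.

```python
# STRIDE categories and a simplified per-element applicability chart
# (assumed for illustration; the official SDL chart is the authority).
STRIDE = {
    "S": "Spoofing", "T": "Tampering", "R": "Repudiation",
    "I": "Information disclosure", "D": "Denial of service",
    "E": "Elevation of privilege",
}
APPLICABLE = {
    "external_entity": "SR",     # e.g. an operator or a partner system
    "process": "STRIDE",         # processes are exposed to all six
    "data_store": "TRID",        # e.g. the MES database, log stores
    "data_flow": "TID",          # e.g. traffic between factory systems
}

def threats_for(element_type):
    """Enumerate the STRIDE threat categories applicable to a DFD element."""
    return [STRIDE[letter] for letter in APPLICABLE[element_type]]

# e.g. every data flow in the factory DFD should be reviewed for these:
flow_threats = threats_for("data_flow")
```

Walking each element of the data flow diagram through such a table is what turns STRIDE from a mnemonic into the structured enumeration the thesis describes.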

Relevance: 30.00%

Abstract:

The goal of this thesis is to define and validate a software engineering approach for the development of a distributed system for the modeling of composite materials, based on an analysis of existing software development methods. We reviewed the main features of: (1) software engineering methodologies; (2) the characteristics of distributed systems and their effect on software development; and (3) composite materials modeling activities and the requirements they place on software development. Using design science as the research methodology, the distributed system for creating models of composite materials was created and evaluated. The empirical experiments we conducted showed good convergence between the modeled and real processes. Throughout the study, attention was paid to the complexity and importance of the distributed system and to a deep understanding of modern software engineering methods and tools.

Relevance: 30.00%

Abstract:

This Master's thesis investigates, as a case study, best practices for establishing a Business Intelligence Competency Center (BICC). The work is done for LähiTapiola, which faces challenges in BI governance because development is scattered across different units and companies, and the system environment is diverse. The BICC aims at better visibility into business needs and at more effective use of information both in management and in operational-level work. A further goal is to reduce costs by consolidating system environments and BI tools as well as operating models. The work comprises a literature review and expert interviews in three companies. Based on the study, it can be concluded that business BI needs should be enabled at different levels, from basic reporting to ad hoc reporting and advanced analytics, by taking these into account in operating models and system architecture. In establishing a BICC, responding to business needs comes first.