981 results for validation process
Abstract:
This study is specifically concerned with the effect of Enterprise Resource Planning (ERP) on Business Process Redesign (BPR). The researchers' experience and a review of previous studies suggest that BPR and ERP are deeply related, and that further investigation of this relation is necessary. To examine the hypothesis, a case study of the Turkish electricity distribution market during its privatization phase is carried out. Eight companies that took part in the privatization process and executed BPR serve as cases. The cases are evaluated against critical success factors for both BPR and ERP. It was found that aligning ERP solution features with business processes leads companies to succeed in ERP and BPR implementation. When the companies' success and efficiency were compared before and after ERP implementation, a considerable change was observed in organizational structure. Team composition was also found to be important to the success of ERP projects. Furthermore, when ERP plays a driver or enabler role, the companies can be considered successful; conversely, when ERP plays a neutral role with respect to business processes, the project fails. In conclusion, the companies that implemented ERP successfully also accomplished the goals of BPR.
Abstract:
Pharmaceuticals and personal care products (PPCPs) are widely used on a daily basis. After their use they reach wastewater treatment plants (WWTPs). These compounds have diverse physico-chemical characteristics, which makes them difficult to remove completely in WWTPs through conventional treatments. Currently, there is no legislation setting PPCP thresholds for effluent discharge, yet even at trace concentrations these compounds pose environmental risks due to, e.g., their endocrine disruption potential. Alternative techniques for their removal in WWTPs are therefore needed. The main goal of this work was to assess the use of an electrodialytic (ED) process to remove PPCPs from effluent prior to discharge. A two-compartment ED cell was used to test (i) the effluent position in the cell (anode or cathode compartment); (ii) the use of an anion (AEM) or cation exchange membrane (CEM); (iii) the treatment period (6, 12 and 24 hours); (iv) effluent recirculation and current steps; and (v) the feasibility of sequential treatments. Phosphorus (P) removal from the effluent and the energy costs associated with the process were also evaluated. Five PPCPs were studied: caffeine (CAF), bisphenol A (BPA), 17β-estradiol (E2), ethinyl estradiol (EE2) and oxybenzone (MBPh). The ED process proved effective when the effluent is placed in the anode compartment. Oxidation is suggested to be the main removal mechanism, with removals between 88 and 96% for all compounds within 6 hours, although intermediates and/or by-products were also observed in some cases. With effluent recirculation, the retention time in the ED cell must be long enough to promote removal, whereas current steps (effluent in the anode compartment) slightly increased removal efficiencies (above 80% for all PPCPs). The sequential ED treatments (effluent in the anode compartment) were effective in both periods, with removals between 80 and 95% for the AEM and 73 to 88% for the CEM. Again, the main removal mechanism is strongly suggested to be oxidation in the anode compartment. However, there was an increase in BOD5 and COD, which might be explained by effluent spiking; these parameters limit effluent discharge. Among these treatments, the use of the AEM enhanced P removal from the effluent, minimizing the risk of eutrophication. The energy cost of the best set-up (6 hours) is approximately 0.8 €/m³ of wastewater, a value considered low compared with the prices of other treatment processes.
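As an illustration of how a per-cubic-metre energy cost of this kind can be estimated (not the authors' calculation), a minimal sketch assuming a constant applied voltage and current and a hypothetical electricity price:

```python
# Minimal sketch of a specific-energy / cost estimate for an electrodialytic run.
# All numbers below are hypothetical placeholders, not values from the study.

def ed_energy_cost(voltage_v, current_a, hours, treated_volume_m3, price_eur_per_kwh):
    """Return (specific energy in kWh/m3, cost in EUR/m3) for a constant U*I run."""
    energy_kwh = voltage_v * current_a * hours / 1000.0   # E = U * I * t
    specific_energy = energy_kwh / treated_volume_m3
    return specific_energy, specific_energy * price_eur_per_kwh

if __name__ == "__main__":
    e, c = ed_energy_cost(voltage_v=10.0, current_a=0.05, hours=6,
                          treated_volume_m3=0.0005, price_eur_per_kwh=0.15)
    print(f"{e:.1f} kWh/m3 -> {c:.2f} EUR/m3")
```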
Abstract:
The moisture content in concrete structures has an important influence on their behavior and performance. Several validated numerical approaches adopt the governing equation for relative humidity fields proposed in Model Code 1990/2010. Nevertheless, there is no integrative study addressing the choice of parameters for the simulation of the humidity diffusion phenomenon, particularly regarding the range of parameters put forward by Model Code 1990/2010. Software based on a finite difference method algorithm (1D and axisymmetric cases) is used to perform sensitivity analyses on the main parameters for a normal strength concrete. Then, based on the conclusions of the sensitivity analyses, experimental results from nine different concrete compositions are analyzed. The software is used to identify the material parameters that best fit the experimental data. In general, the model was able to satisfactorily fit the experimental results, and new correlations were proposed, particularly focusing on the boundary transfer coefficient.
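For readers unfamiliar with this class of model, a minimal 1D explicit finite-difference sketch of nonlinear humidity diffusion is given below. The diffusivity follows the usual Model Code 1990 form D(h) = D1·[α + (1−α)/(1 + ((1−h)/(1−hc))^n)], but the parameter values, grid, simulated period and boundary treatment are illustrative assumptions, not those used in the paper.

```python
import numpy as np

# Illustrative 1D explicit FDM for relative-humidity diffusion in concrete.
# Governing equation: dh/dt = d/dx( D(h) dh/dx ), with the Model Code 1990-type
# diffusivity D(h) = D1*(alpha + (1-alpha)/(1 + ((1-h)/(1-hc))**n)).
# All parameter values below are placeholders for illustration only.

D1, alpha, hc, n = 1.0e-10, 0.05, 0.80, 15.0   # m2/s and dimensionless shape parameters
L, nx = 0.10, 51                                # 100 mm thick slab, 51 nodes
dx = L / (nx - 1)
h = np.full(nx, 0.95)                           # initial internal relative humidity
h_env = 0.60                                    # environmental relative humidity (Dirichlet boundary)

def D(h):
    return D1 * (alpha + (1.0 - alpha) / (1.0 + ((1.0 - h) / (1.0 - hc)) ** n))

dt = 0.4 * dx**2 / D(np.array([1.0]))[0]        # conservative explicit stability limit
t, t_end = 0.0, 30 * 24 * 3600.0                # simulate 30 days of drying
while t < t_end:
    h[0] = h[-1] = h_env                        # drying from both faces
    Dm = D(0.5 * (h[1:] + h[:-1]))              # diffusivity at cell interfaces
    flux = Dm * (h[1:] - h[:-1]) / dx
    h[1:-1] += dt / dx * (flux[1:] - flux[:-1])
    t += dt

print("Mid-depth relative humidity after 30 days:", round(h[nx // 2], 3))
```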
Abstract:
This paper aims at developing a collision prediction model for three-leg junctions located on national roads (NR) in Northern Portugal. The focus is to identify the factors that contribute to collision-type crashes at those locations, mainly factors related to road geometric consistency, on which the literature is scarce, and to investigate the impact of three modeling methods on the factors of those models: generalized estimating equations, random-effects negative binomial models and random-parameters negative binomial models. The database included data from 2008 to 2010 for 177 three-leg junctions. It was split into three groups of contributing factors, which were tested sequentially for each of the adopted models: first, traffic only; then, traffic and the geometric characteristics of the junctions within their area of influence; and, lastly, factors expressing the difference between the geometric characteristics of the segments bordering the junctions' area of influence and the segment included in that area. The choice of the best modeling technique was supported by a cross-validation carried out to ascertain the best model for the three sets of contributing factors. The models fitted with random-parameters negative binomial models performed best. In the best models obtained for each modeling technique, the characteristics of the road environment, including proxy measures of geometric consistency, along with traffic volume, contribute significantly to the number of collisions. Both the variables concerning the junctions and the national road segments within their area of influence, as well as the variations of those characteristics relative to the roadway segments bordering that area of influence, proved relevant; there is therefore a clear need to incorporate the effect of geometric consistency in safety studies of three-leg junctions.
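As a point of reference for the simplest fixed-effects baseline (not the random-parameters specification evaluated in the paper), a negative binomial crash-frequency model can be fitted along the following lines; the variable names and synthetic data are illustrative assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Illustrative negative binomial crash-frequency model (fixed effects only).
# Column names are hypothetical; 'aadt' stands for annual average daily traffic.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "crashes": rng.poisson(2, 177),
    "aadt": rng.uniform(2000, 15000, 177),
    "curvature_change": rng.uniform(0, 50, 177),   # crude proxy for geometric consistency
})

X = sm.add_constant(np.column_stack([np.log(df["aadt"]), df["curvature_change"]]))
model = sm.GLM(df["crashes"], X, family=sm.families.NegativeBinomial(alpha=1.0))
result = model.fit()
print(result.summary())
```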
Abstract:
Given the current economic situation of Portuguese municipalities, it is necessary to identify priority investments in order to achieve more efficient financial management. Classifying the municipal road network according to the occurrence of traffic accidents is fundamental to setting priorities for road interventions. This paper presents a model for road network classification based on traffic accidents, integrated into a geographic information system. Its practical application was developed through a case study in the municipality of Barcelos. An equation was defined to obtain a road safety index by combining the following indicators: severity, property damage only and accident costs. In addition to the road network classification, the application of the model makes it possible to analyze the spatial coverage of accidents in order to determine the centrality and dispersion of the locations with the highest incidence of road accidents. This analysis can be further refined according to the nature of the accidents, namely collisions, run-off-road crashes and pedestrian crashes.
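A road safety index of this kind is essentially a weighted combination of normalized indicators per road segment; the sketch below shows the general idea only, with hypothetical weights and values rather than the equation defined in the study.

```python
# Illustrative road-safety index for ranking road segments.
# The study combines severity, property-damage-only and accident-cost indicators;
# the weights, normalization and segment names below are hypothetical placeholders.

def safety_index(severity, pdo, cost, w_severity=0.5, w_pdo=0.2, w_cost=0.3):
    """Weighted combination of normalized indicators (higher = worse segment)."""
    return w_severity * severity + w_pdo * pdo + w_cost * cost

segments = {
    "EN103-km12": safety_index(severity=0.8, pdo=0.3, cost=0.6),
    "EN205-km04": safety_index(severity=0.2, pdo=0.9, cost=0.4),
}
for name, idx in sorted(segments.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {idx:.2f}")
```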
Abstract:
Given the need for more sustainable construction solutions, an innovative composite material based on a combination of distinct industrial by-products is proposed, aiming to reduce waste and energy consumption in the production of construction materials. The raw materials are thermally activated flue-gas desulphurization (FGD) gypsum, which acts as the binder, granulated cork as the aggregate, and recycled textile fibres from used tyres to reinforce the material. This paper presents the results of the design of the composite mortar mixes, the characterization of the key physical properties (density, porosity and ultrasonic pulse velocity) and the mechanical validation based on uniaxial compressive tests and fracture energy tests. In the experimental campaign, the influence of the percentage of each raw material, expressed in terms of gypsum mass, on the mechanical properties of the composite was assessed. It was observed that increasing the percentage of granulated cork decreases the compressive strength of the composite but contributes to an increase in the compressive fracture energy. Moreover, the recycled textile fibres play an important role in the mode I fracture process, resulting in a considerable increase in the mode I fracture energy.
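Fracture energy in such tests is commonly obtained as the work of fracture (area under the load-displacement curve) divided by the ligament area; a minimal sketch of that calculation, with hypothetical measurements rather than data from this campaign, is:

```python
import numpy as np

# Illustrative fracture-energy calculation: area under the load-displacement
# curve divided by the fractured ligament area. The measurements and ligament
# size below are hypothetical, not data from the experimental campaign.
displacement_mm = np.array([0.0, 0.05, 0.10, 0.20, 0.40, 0.80])
load_N = np.array([0.0, 120.0, 180.0, 140.0, 60.0, 0.0])

work_N_mm = np.trapz(load_N, displacement_mm)      # work of fracture in N*mm
ligament_area_mm2 = 40.0 * 40.0                    # assumed ligament cross-section
Gf_N_per_mm = work_N_mm / ligament_area_mm2        # 1 N/mm = 1000 J/m2
print(f"Fracture energy: {Gf_N_per_mm * 1000:.1f} J/m2")
```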
Abstract:
Due to the increasing acceptance of BPM, BPM tools are nowadays extensively used in organizations. Core to BPM are the process modeling languages, of which BPMN is currently the one receiving the most attention. Once a business process is described in BPMN, a process simulation approach can be used to find its optimized form. In this context, the simulation of business processes, such as those defined in BPMN, appears as an obvious way of improving processes. This paper analyzes the business process modeling and simulation areas, identifying the elements that must be present in the BPMN language so that processes described in BPMN can be simulated. During this analysis, a set of existing BPM tools that support BPMN is compared with regard to their limitations in terms of simulation support.
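To make the idea of process simulation concrete, a minimal discrete-event sketch of a two-activity process is shown below; it is a generic stand-in for what a BPMN simulation engine would derive from a model, not one of the tools compared in the paper, and all timing parameters are assumptions.

```python
import random
import simpy

# Minimal discrete-event simulation of a two-activity business process:
# requests arrive, queue for a single clerk, then pass through "review" and
# "approval" activities. Service times and arrival rate are illustrative.
def request(env, clerk, waits):
    arrival = env.now
    with clerk.request() as slot:
        yield slot
        waits.append(env.now - arrival)                  # time spent queuing
        yield env.timeout(random.expovariate(1 / 5.0))   # "review" activity
        yield env.timeout(random.expovariate(1 / 3.0))   # "approval" activity

def source(env, clerk, waits):
    for _ in range(200):
        env.process(request(env, clerk, waits))
        yield env.timeout(random.expovariate(1 / 10.0))  # inter-arrival time

random.seed(1)
env = simpy.Environment()
clerk = simpy.Resource(env, capacity=1)
waits = []
env.process(source(env, clerk, waits))
env.run()
print(f"Average queuing time: {sum(waits) / len(waits):.1f} time units")
```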
Abstract:
IP networks are currently the major communication infrastructure used by an increasing number of applications and heterogeneous services, including voice services. In this context, the Session Initiation Protocol (SIP) is a signaling protocol widely used for controlling multimedia communication sessions, such as voice or video calls over IP networks, thus performing vital functions in an extensive set of public and enterprise solutions. However, the dissemination of the SIP protocol also entails some challenges, such as the complexity associated with the testing/validation processes of IMS/SIP networks. As a consequence, manual IMS/SIP testing solutions are inherently costly and time-consuming, making it crucial to develop automated approaches in this specific area. In this perspective, this article presents an experimental approach for automated testing/validation of SIP scenarios in IMS networks. For that purpose, an automation framework is proposed that replicates the configuration of SIP equipment from the production network and submits such equipment to a battery of tests in the testing network. The proposed solution drastically reduces test and validation times when compared with traditional manual approaches, while also enhancing testing reliability and coverage. The automation framework comprises freely available tools conveniently integrated with other specific modules implemented within the context of this work. In order to illustrate the advantages of the proposed framework, a real case study taken from a PT Inovação customer is presented, comparing the time required by a manual SIP testing approach with the time required when using the proposed automated framework. The results clearly corroborate the advantages of using the presented framework.
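The framework itself is not reproduced here, but a flavour of automated SIP validation can be given with a minimal liveness probe that sends a SIP OPTIONS request over UDP and reports the first response line; the host, port and identities below are hypothetical, and a real test campaign would drive full call scenarios rather than a single probe.

```python
import socket
import uuid

# Minimal SIP OPTIONS liveness probe (illustrative only).
def sip_options_probe(host="192.0.2.10", port=5060, timeout=2.0):
    call_id = uuid.uuid4().hex
    local_ip = socket.gethostbyname(socket.gethostname())
    msg = (
        f"OPTIONS sip:{host} SIP/2.0\r\n"
        f"Via: SIP/2.0/UDP {local_ip}:5060;branch=z9hG4bK{call_id[:8]}\r\n"
        "Max-Forwards: 70\r\n"
        f"From: <sip:probe@{local_ip}>;tag={call_id[:6]}\r\n"
        f"To: <sip:{host}>\r\n"
        f"Call-ID: {call_id}\r\n"
        "CSeq: 1 OPTIONS\r\n"
        "Content-Length: 0\r\n\r\n"
    )
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(msg.encode(), (host, port))
        try:
            reply, _ = sock.recvfrom(4096)
            return reply.decode(errors="replace").splitlines()[0]  # e.g. "SIP/2.0 200 OK"
        except socket.timeout:
            return "no response"

print(sip_options_probe())
```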
Abstract:
Although most of the accidents occurring in Olive Oil Mills (OOM) result from "basic" risks, there is a need to apply adequate tools to support risk decisions that can meet the specificities of this sector. This study aims to analyse the views of Occupational Safety & Health (OSH) practitioners about the risk assessment process in OOM, identifying the key difficulties inherent to the risk assessment process in this sector, as well as identifying some improvements to current practice. The analysis was based on a questionnaire developed and applied to 13 OSH practitioners working at OOM. The results showed that the time available to perform the risk assessment is the most frequent limitation. The practitioners do not regard the available methodologies as an important limitation to this process. However, a specific risk assessment methodology that includes acceptance criteria adjusted to the OOM reality, using risk metrics based on accident frequency and workdays lost, was also indicated as an important contribution to improving the process. A semi-quantitative approach, complemented with the use of the sector's accident statistics, can be a good solution for this sector. However, further strategies should also be adopted, mainly those that can lead to an easier application of the risk assessment process.
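A semi-quantitative risk assessment of this kind is usually built around a frequency x severity matrix with explicit acceptance bands; the scales and thresholds in the sketch below are illustrative assumptions, not the criteria proposed by the study.

```python
# Illustrative semi-quantitative risk matrix: risk = frequency level * severity level,
# with acceptance bands. Scales and thresholds are hypothetical placeholders.
FREQUENCY = {"rare": 1, "occasional": 2, "frequent": 3, "very frequent": 4}
SEVERITY = {"minor": 1, "lost workdays": 2, "major injury": 3, "fatality": 4}

def risk_level(freq_label, sev_label):
    score = FREQUENCY[freq_label] * SEVERITY[sev_label]
    if score <= 2:
        return score, "acceptable"
    if score <= 8:
        return score, "tolerable - plan improvements"
    return score, "not acceptable - act immediately"

# Example: entanglement in a mill decanter, judged occasional with major injury potential.
print(risk_level("occasional", "major injury"))   # -> (6, 'tolerable - plan improvements')
```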
Abstract:
Developing and implementing data-oriented workflows for data migration processes are complex tasks involving several problems related to the integration of data coming from different schemas. Usually, they involve very specific requirements - every process is almost unique. Having a way to abstract their representation helps us to better understand and validate them with business users, which is a crucial step for requirements validation. In this demo we present an approach that incrementally enriches conceptual models in order to support the automatic production of their corresponding physical implementation. We will show how the B2K (Business to Kettle) system transforms BPMN 2.0 conceptual models into executable Kettle data-integration processes, addressing the most relevant aspects of model design and enrichment, model-to-system transformation, and system execution.
Abstract:
Master's dissertation in Systems Engineering
Abstract:
The present paper focuses on a damage identification method based on the second-order spectral properties of the nodal response processes. The explicit dependence of the output power spectral densities on the frequency content of the response makes them suitable for damage detection and localization. The well-known case study of the Z24 Bridge in Switzerland is chosen to apply and further investigate this technique, with the aim of validating its reliability. Numerical simulations of the dynamic response of the structure subjected to different types of excitation are carried out to assess the variability of the spectrum-driven method with respect to both the type and the position of the excitation sources. The simulated data obtained from random vibrations, impulse, ramp and shaking forces make it possible to build the power spectrum matrix from which the main eigenparameters of the reference and damage scenarios are extracted. Afterwards, complex eigenvectors and real eigenvalues are properly weighted and combined, and a damage index based on the difference between spectral modes is computed to pinpoint the damage. Finally, a group of vibration-based damage identification methods selected from the literature is used to compare the results obtained and to evaluate the performance of the spectral index.
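The core objects of such a spectrum-driven method are the output cross-power-spectral-density matrix and its eigendecomposition at each frequency line; a minimal sketch with two synthetic channels (not the Z24 simulations, and with a placeholder "damaged" state) is shown below.

```python
import numpy as np
from scipy.signal import csd

# Illustrative construction of an output PSD matrix and its spectral eigenparameters
# from two synthetic acceleration channels; not the Z24 Bridge data.
fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
x1 = np.sin(2 * np.pi * 4.0 * t) + 0.3 * rng.standard_normal(t.size)
x2 = 0.8 * np.sin(2 * np.pi * 4.0 * t + 0.5) + 0.3 * rng.standard_normal(t.size)
channels = [x1, x2]

nperseg = 512
f, _ = csd(x1, x1, fs=fs, nperseg=nperseg)
S = np.zeros((len(f), 2, 2), dtype=complex)
for i in range(2):
    for j in range(2):
        _, S[:, i, j] = csd(channels[i], channels[j], fs=fs, nperseg=nperseg)

# Eigenvalues/eigenvectors of the (Hermitian) PSD matrix at each frequency line.
eigvals, eigvecs = np.linalg.eigh(S)

# A crude damage index: difference between dominant spectral modes of a reference
# state and a damaged state (identical here, as a placeholder).
ref_mode = eigvecs[:, :, -1]
dam_mode = eigvecs[:, :, -1]          # in practice, from the damaged scenario
damage_index = np.linalg.norm(np.abs(ref_mode) - np.abs(dam_mode), axis=1)
print("Peak frequency [Hz]:", f[np.argmax(eigvals[:, -1])])
```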
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation to the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
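As an indication of the relative magnitude of such heat-transfer contributions, a lumped-capacitance cooling estimate for a single deposited filament can be written as below; the material properties, coefficients and dimensions are illustrative assumptions, not values analysed in the work.

```python
# Illustrative lumped-capacitance cooling of a single extruded filament:
# dT/dt = -(h_conv + h_rad) * (A/V) / (rho * c_p) * (T - T_env),
# with a linearized radiation coefficient. Properties are placeholders (ABS-like).
rho, c_p = 1050.0, 2000.0          # kg/m3, J/(kg.K)
d = 0.4e-3                         # filament diameter, m
h_conv = 30.0                      # convection coefficient, W/(m2.K)
eps, sigma = 0.9, 5.67e-8          # emissivity, Stefan-Boltzmann constant
T, T_env = 230.0 + 273.15, 25.0 + 273.15
A_over_V = 4.0 / d                 # lateral area / volume for a cylinder

dt, t = 0.001, 0.0
while T - T_env > 5.0:             # cool until within 5 K of the environment
    h_rad = eps * sigma * (T**2 + T_env**2) * (T + T_env)   # linearized radiation
    dTdt = -(h_conv + h_rad) * A_over_V / (rho * c_p) * (T - T_env)
    T += dTdt * dt
    t += dt
print(f"Approximate time to approach ambient: {t:.1f} s")
```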
Abstract:
Integrated master's dissertation in Industrial Engineering and Management
Abstract:
Business Intelligence (BI) can be seen as a method that gathers information and data from information systems in order to help companies be more accurate in their decision-making process. Traditionally, BI systems have been associated with the use of Data Warehouses (DW), whose prime purpose is to serve as a repository storing all the relevant information required for making correct decisions. The need to integrate streaming data became crucial with the need to improve the efficiency and effectiveness of the decision process. In primary and secondary education there is a lack of BI solutions. Given the schools' reality, the main purpose of this study is to provide a Pervasive BI solution able to monitor school and student data anywhere and anytime, in real time, as well as to disseminate the information through ubiquitous devices. The first task consisted in gathering data regarding the different choices made by each student from enrolment in a given school year until its end. Thereafter, a dimensional model was developed in order to make it possible to build a BI platform. This paper presents the dimensional model, a set of pre-defined indicators, the Pervasive Business Intelligence characteristics and the prototype designed. The main contribution of this study was to offer schools a tool that can help them make accurate decisions in real time. Data dissemination was achieved through a localized application that can be accessed anywhere and anytime.
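To illustrate what a dimensional model of this kind looks like in practice, the sketch below creates a tiny star schema with one fact table and two dimensions and computes one pre-defined indicator; the table and column names are hypothetical, not the model developed in the study.

```python
import sqlite3

# Illustrative star schema for school enrolment data: one fact table keyed to
# two dimension tables. Names and values are hypothetical placeholders.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_student (student_key INTEGER PRIMARY KEY, gender TEXT, birth_year INTEGER);
CREATE TABLE dim_school  (school_key  INTEGER PRIMARY KEY, name TEXT, municipality TEXT);
CREATE TABLE fact_enrolment (
    student_key INTEGER REFERENCES dim_student(student_key),
    school_key  INTEGER REFERENCES dim_school(school_key),
    school_year TEXT,
    course      TEXT,
    final_grade REAL
);
""")
conn.execute("INSERT INTO dim_student VALUES (1, 'F', 2008)")
conn.execute("INSERT INTO dim_school VALUES (1, 'Escola Basica X', 'Braga')")
conn.execute("INSERT INTO fact_enrolment VALUES (1, 1, '2015/2016', 'Sciences', 14.2)")

# A pre-defined indicator, e.g. average final grade per school and year.
for row in conn.execute("""
    SELECT s.name, f.school_year, AVG(f.final_grade)
    FROM fact_enrolment f JOIN dim_school s USING (school_key)
    GROUP BY s.name, f.school_year
"""):
    print(row)
```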