938 results for Software process improvement


Relevance:

30.00%

Publisher:

Abstract:

INVESTIGATION INTO CURRENT EFFICIENCY FOR PULSE ELECTROCHEMICAL MACHINING OF NICKEL ALLOY. Yu Zhang, M.S. University of Nebraska, 2010. Adviser: Kamlakar P. Rajurkar. Electrochemical machining (ECM) is a nontraditional manufacturing process that can machine difficult-to-cut materials. In ECM, material is removed by controlled electrochemical dissolution of an anodic workpiece in an electrochemical cell. ECM has extensive applications in the automotive, petroleum, aerospace, textile, medical, and electronics industries. Improving current efficiency is a challenging task for any electro-physical or electrochemical machining process. Current efficiency is defined as the ratio of the observed amount of metal dissolved to the theoretical amount predicted from Faraday's law for the same specified conditions of electrochemical equivalent, current, etc. [1]. In macro ECM, electrolyte conductivity greatly influences the current efficiency of the process. Since there is a limit to how far electrolyte conductivity can be enhanced, a process innovation is needed for further improvement of current efficiency in ECM. Pulse electrochemical machining (PECM) is one such approach, in which effective electrolyte conductivity is improved by electrolyte flushing during the pulse off-time. The aim of this research is to study the influence of the major factors on current efficiency in a macro-scale pulse electrochemical machining process and to develop a linear regression model for predicting the current efficiency of the process. An in-house designed electrochemical cell was used for machining a nickel alloy (ASTM B435) by PECM. The effects of current density, type of electrolyte, and electrolyte flow rate on current efficiency under different experimental conditions were studied. Results indicated that current efficiency depends on the electrolyte, the electrolyte flow rate, and the current density. The linear regression models of current efficiency were compared with twenty new data points graphically and quantitatively, and the models were close enough to the actual results to be considered reliable. In addition, an attempt was made to consider factors in PECM that had not been investigated in earlier works, by simulating the process with the COMSOL software; however, the results of this attempt were not substantially different from earlier reported studies.
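
As a rough illustration of the kind of regression model described in this abstract, the sketch below fits current efficiency to current density, electrolyte type, and flow rate by ordinary least squares. The variable names and data values are hypothetical placeholders, not the thesis's data:

```python
# Minimal sketch of a linear regression for PECM current efficiency.
# Predictors (current density, electrolyte type, flow rate) follow the
# abstract; the sample values below are invented for illustration.
import numpy as np

# Each row: [current density (A/cm^2), NaCl electrolyte (1) vs NaNO3 (0),
#            electrolyte flow rate (L/min)]
X = np.array([
    [10.0, 1, 2.0],
    [20.0, 1, 3.0],
    [30.0, 0, 2.5],
    [40.0, 0, 4.0],
    [15.0, 0, 3.5],
    [35.0, 1, 2.0],
])
y = np.array([0.62, 0.71, 0.55, 0.68, 0.58, 0.66])  # measured efficiency

# Ordinary least squares with an intercept term
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(current_density, is_nacl, flow_rate):
    return coef @ [1.0, current_density, is_nacl, flow_rate]

print(predict(25.0, 1, 3.0))
```

Validating such a model against held-out data, as the thesis does with twenty new points, amounts to running predict() on the new conditions and comparing against the measured efficiencies.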

Relevance:

30.00%

Publisher:

Abstract:

Stricter environmental policies have been shown to be necessary to ensure effective control of pollutant emissions. Brazil is expected, at the 21st United Nations Climate Change Conference (COP21) in 2015, to commit to a low-carbon economy. This position affects the industrial environment, making it necessary to search for new technologies that are less aggressive to the environment, so that compliance with the new emission policies does not have a negative effect on production. Almost all processes performed in the steel industry demand burning fuel, and the resulting flue gases are sent to the atmosphere. This work discusses the use of heat exchangers to preheat the combustion air by recovering part of the heat available in the flue gases of a given industrial process. Preheating the combustion air reduces the energy requirement, i.e., less fuel is consumed and, in addition, a smaller amount of pollutants is emitted. Because they fit the process better, spiral plate heat exchangers are studied. The heat exchanger is dimensioned by an iterative method implemented in Microsoft Excel. The gains in the thermal efficiency of the process and the percentage of fuel saved are then analyzed; the latter implies an equal percentage reduction in greenhouse gas emissions.
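
The abstract mentions an iterative dimensioning method implemented in Excel; the sketch below reproduces the general idea with an effectiveness-NTU iteration for a counterflow exchanger. All physical values (U, heat capacity rates, temperatures) are assumed placeholders, and a real spiral-plate design would use its own correlations for U:

```python
# Sketch of an iterative heat-exchanger sizing loop (effectiveness-NTU,
# counterflow). Every numeric value here is a hypothetical placeholder.
import math

U = 25.0          # overall heat transfer coefficient, W/(m^2 K) (assumed)
C_gas = 1100.0    # flue gas heat capacity rate m*cp, W/K (assumed)
C_air = 1000.0    # combustion air heat capacity rate, W/K (assumed)
T_gas_in, T_air_in = 450.0, 25.0   # inlet temperatures, deg C (assumed)
T_air_target = 250.0               # desired preheat temperature, deg C

C_min, C_max = min(C_gas, C_air), max(C_gas, C_air)
Cr = C_min / C_max   # heat capacity ratio (< 1 here)

area = 1.0  # initial guess, m^2
while True:
    ntu = U * area / C_min
    eff = (1 - math.exp(-ntu * (1 - Cr))) / (1 - Cr * math.exp(-ntu * (1 - Cr)))
    q = eff * C_min * (T_gas_in - T_air_in)   # recovered heat duty, W
    T_air_out = T_air_in + q / C_air
    if T_air_out >= T_air_target:
        break
    area += 0.5  # grow the surface until the target preheat is met

print(f"required area ~ {area:.1f} m^2, air outlet {T_air_out:.0f} C")
```

The fuel saving then follows from the recovered duty q: every watt recovered from the flue gas is a watt that no longer has to come from burning fuel.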

Relevance:

30.00%

Publisher:

Abstract:

Producing building budgets quickly and accurately is a challenge faced by companies in the sector. Cost estimation is performed from the quantity takeoff, and this quantification has historically been carried out through analysis of the project, the scope of work, and project information contained in 2D drawings, text files, and spreadsheets. In many cases this method proves flawed, influencing management decisions, since it is closely coupled to time and cost management. In this scenario, this work presents a critical analysis of the conventional quantity takeoff process, based on quantification from 2D drawings, against the use of Autodesk Revit 2016, which applies the concepts of building information modeling (BIM) to automated quantity takeoff from a 3D model of the construction. It is noted that the 3D modeling process should be aligned with the goals of the budgeting. The use of BIM programs provides several benefits compared with the traditional quantity takeoff process, representing gains in productivity, transparency, and assertiveness.
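
As a minimal illustration of what automated quantity takeoff from a 3D model amounts to, the sketch below aggregates element quantities from a hypothetical model export; a real workflow would query the Revit API or an IFC file rather than a hand-written list:

```python
# Toy quantity takeoff: sum element volumes by category and material.
# The element list is a hypothetical stand-in for a BIM model export.
from collections import defaultdict

elements = [
    {"category": "Walls",   "material": "concrete", "volume_m3": 12.4},
    {"category": "Walls",   "material": "masonry",  "volume_m3": 8.1},
    {"category": "Columns", "material": "concrete", "volume_m3": 3.6},
]

totals = defaultdict(float)
for e in elements:
    totals[(e["category"], e["material"])] += e["volume_m3"]

for (category, material), volume in sorted(totals.items()):
    print(f"{category:<10} {material:<10} {volume:7.2f} m3")
```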

Relevance:

30.00%

Publisher:

Abstract:

This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software was based on a model that uses fuzzy logic concepts and was built with Perl, a MySQL database for Internet accessibility, and the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors, and clinical cases. The relationship values determined by the students are compared with those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, helped in their learning and may become an educational tool for teaching the nursing diagnosis process.
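
The comparison of student and specialist relationship values can be pictured with the toy sketch below. The diagnoses, the values, and the scoring rule (one minus the mean absolute deviation) are illustrative assumptions, not the article's actual fuzzy model:

```python
# Sketch of scoring a student's diagnosis relationship values against
# specialist consensus. Values in [0, 1] mimic fuzzy relationship
# degrees; all data and the scoring rule are invented for illustration.
specialist = {"ineffective airway clearance": 0.9,
              "impaired gas exchange": 0.7,
              "acute pain": 0.2}

student = {"ineffective airway clearance": 0.8,
           "impaired gas exchange": 0.4,
           "acute pain": 0.3}

# Score = 1 - mean absolute deviation from the specialist values
deviations = [abs(specialist[d] - student[d]) for d in specialist]
score = 1.0 - sum(deviations) / len(deviations)
print(f"performance score: {score:.2f}")  # 1.00 = perfect agreement
```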

Relevance:

30.00%

Publisher:

Abstract:

doi: 10.1111/j.1741-2358.2011.00526.x. Biological evaluation of the bone healing process after application of two potentially osteogenic proteins: an animal experimental model. Objective: The aim of this work was to analyse, qualitatively and quantitatively, the newly formed bone after insertion of rhBMP-2 and a protein extracted from Hevea brasiliensis (P-1), associated or not with a carrier, in critical bone defects created in Wistar rat calvarial bone, using histological and histomorphometrical analyses. Materials and methods: Eighty-four male Wistar rats were used, divided into two groups according to the period of time until sacrifice (2 and 6 weeks). Each of these groups was subdivided into six groups of seven animals each, according to the treatments: (1) 5 µg of pure rhBMP-2, (2) 5 µg of rhBMP-2/monoolein gel, (3) pure monoolein gel, (4) 5 µg of pure P-1, (5) 5 µg of P-1/monoolein gel and (6) critical bone defect controls. The animals were euthanised and the calvarial bone tissue removed for histological and histomorphometrical analyses. Results and conclusion: The results showed an improvement in the bone healing process with the rhBMP-2 protein, associated or not with a carrier material, in relation to the other groups, and this process was shown to be time dependent.

Relevance:

30.00%

Publisher:

Abstract:

A gene encoding α-L-arabinofuranosidase (abfA) from Aspergillus niveus was identified, cloned, and successfully expressed in Aspergillus nidulans. Based on amino acid sequence comparison, the 88.6 kDa enzyme could be assigned to GH family 51. Characterization of the purified recombinant AbfA revealed that the enzyme was active over a limited pH range (pH 4.0-5.0), with an optimum temperature of 70 °C. AbfA was able to hydrolyze arabinoxylan, xylan from birchwood, debranched arabinan, and 4-nitrophenyl arabinofuranoside. Synergistic reactions using both AbfA and endoxylanase were also assessed. The highest degree of synergy was obtained after sequential treatment of the substrate with endoxylanase followed by AbfA, which released noticeably more reducing sugars than either enzyme acting individually. Immobilization of AbfA was performed via ionic adsorption onto various supports: agarose activated by polyethyleneimine polymers, cyanogen bromide-activated Sepharose, DEAE-Sepharose, and Sepharose-Q. The Sepharose-Q derivative remained fully active at pH 5 after 360 min at 60 °C, whereas free AbfA was inactivated after 60 min. A synergistic effect on arabinoxylan hydrolysis by AbfA immobilized on Sepharose-Q and endoxylanase immobilized on glyoxyl agarose was also observed. The stabilization of arabinofuranosidases using immobilization tools is a novel and interesting topic. (C) 2012 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Many pathways can be used to synthesize polythiophene derivatives. Polycondensation reactions performed with organometallics are preferred, since they lead to regioregular polymers (with a high content of head-to-tail coupling), which have enhanced conductivity and luminescence. However, these routes have several steps, and the reactants are highly moisture sensitive and expensive. On the other hand, oxidative polymerization using FeCl3 is a one-pot reaction that requires less moisture-sensitive, lower-cost reactants, although the most common reaction conditions lead to polymers with low regioregularity. Here, we report that by changing the reaction conditions, such as the FeCl3 addition rate and the reaction temperature, poly(3-octylthiophene)s with different regioregularities can be obtained, reaching about 80% head-to-tail coupling. Different molar mass distributions and polydispersities were obtained. These preliminary results suggest that the oxidative polymerization process could be improved to yield polythiophenes with a higher degree of regioregularity and narrower molar mass distributions simply by setting certain reaction conditions. We also verified that it is possible to extract with solvent part of the less regioregular fraction of the polymer, further improving the degree of regioregularity. (C) 2011 Wiley Periodicals, Inc. J Appl Polym Sci, 2012

Relevance:

30.00%

Publisher:

Abstract:

Quality concepts represent one of the important factors for the success of organizations, and among these concepts the stabilization of the production process contributes to improvement, waste reduction, and increased competitiveness. This study therefore aimed to evaluate the predictability and capability of a solid wood flooring production process, based on its critical points. The research was divided into three stages. The first was the mapping of the company's processes and the elaboration of flowcharts for the activities. The second was the identification and evaluation of the critical points using an adapted FMEA (Failure Mode and Effect Analysis) methodology. The third was the evaluation of the critical points by applying statistical process control and determining the process capability through the Cpk index. The results showed the existence of six processes, two of which are critical. In those two, fifteen points were considered critical, and two of them, related to the dimensions of the pieces and to defects caused by sandpaper, were selected for evaluation. The company's production process is unstable and not capable of producing wood flooring according to the specifications; these specifications should therefore be re-evaluated.
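
For reference, the Cpk index used in the third stage is computed as min(USL − mean, mean − LSL) / 3σ. The sketch below shows the computation on hypothetical thickness measurements and specification limits:

```python
# Sketch of the process capability index (Cpk) computation used to
# judge whether the flooring process meets its specifications.
# Specification limits and measurements are hypothetical.
import statistics

measurements = [18.9, 19.1, 19.0, 19.3, 18.8, 19.2, 19.1, 18.7]  # mm
LSL, USL = 18.5, 19.5  # lower/upper specification limits, mm

mean = statistics.mean(measurements)
sigma = statistics.stdev(measurements)

cpk = min(USL - mean, mean - LSL) / (3 * sigma)
print(f"Cpk = {cpk:.2f}")  # values below ~1.33 commonly flag an incapable process
```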

Relevance:

30.00%

Publisher:

Abstract:

Background: In recent years, biorefining of lignocellulosic biomass to produce multiple products, such as ethanol and other biomaterials, has become a dynamic research area. Pretreatment technologies that fractionate sugarcane bagasse are essential for the successful use of this feedstock in ethanol production. In this paper, we investigate modifications in the morphology and chemical composition of sugarcane bagasse submitted to a two-step treatment, using diluted acid followed by a delignification process with increasing sodium hydroxide concentrations. Detailed chemical and morphological characterization of the samples after each pretreatment condition, studied by high-performance liquid chromatography, solid-state nuclear magnetic resonance, diffuse reflectance Fourier transform infrared spectroscopy, and scanning electron microscopy, is reported, together with sample crystallinity and enzymatic digestibility.

Results: Chemical composition analysis performed on samples obtained after the different pretreatment conditions showed that up to 96% and 85% of the hemicellulose and lignin fractions, respectively, were removed by this two-step method when sodium hydroxide concentrations of 1% (m/v) or higher were used. The efficient lignin removal resulted in an enhanced hydrolysis yield, reaching values around 100%. Considering the cellulose loss due to the pretreatment (a maximum of 30%, depending on the process), the total cellulose conversion increases significantly, from 22.0% for the untreated bagasse to 72.4%. The delignification process, with a consequent increase in the cellulose-to-lignin ratio, is also clearly observed in the nuclear magnetic resonance and diffuse reflectance Fourier transform infrared spectroscopy experiments. We also demonstrate that the morphological changes contributing to this remarkable improvement occur as a consequence of lignin removal from the sample. Unstructuring of the bagasse is favored by the loss of cohesion between neighboring cell walls, as well as by changes in the inner cell wall structure, such as damage, hole formation, and loss of mechanical resistance, facilitating liquid and enzyme access to the crystalline cellulose.

Conclusions: The results presented here show the efficiency of the proposed method for improving the enzymatic digestibility of sugarcane bagasse and provide an understanding of the pretreatment's mechanism of action. Combining the different techniques applied in this work provided thorough information about the morphological and chemical changes under way and was an efficient approach to understanding the morphological effects resulting from sample delignification and their influence on the enhanced hydrolysis results.
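
As a rough consistency check of the figures above (an assumed relation, not one stated by the authors), the total conversion should be approximately the hydrolysis yield times the fraction of cellulose retained after pretreatment:

```latex
% Assumed relation: total conversion ~ hydrolysis yield x cellulose retained
\text{total conversion} \approx
\underbrace{1.00}_{\text{hydrolysis yield}} \times
\underbrace{(1 - 0.30)}_{\text{cellulose retained}} = 0.70
```

which is consistent with the reported 72.4%.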

Relevance:

30.00%

Publisher:

Abstract:

Background: The use of lignocellulosic constituents in biotechnological processes requires a selective separation of the main fractions (cellulose, hemicellulose, and lignin). During dilute acid hydrolysis for hemicellulose extraction, several toxic compounds are formed by the degradation of sugars and lignin, and these have the ability to inhibit microbial metabolism. The use of a detoxification step is therefore an important consideration for improving fermentation processes based on hydrolysates. In this paper, we evaluated the application of Advanced Oxidative Processes (AOPs) for the detoxification of rice straw hemicellulosic hydrolysate, with the goal of improving ethanol bioproduction by the yeast Pichia stipitis. Aiming to reduce the toxicity of the hemicellulosic hydrolysate, different treatment conditions were analyzed. The treatments were carried out according to a Taguchi L16 orthogonal array to evaluate the influence of Fe2+, H2O2, UV, O3, and pH on the concentration of aromatic compounds and on the fermentative process.

Results: The results showed that the AOPs were able to remove aromatic compounds (furans and phenolic compounds derived from lignin) without affecting the sugar concentration in the hydrolysate. Ozonation in alkaline medium (pH 8) in the presence of H2O2 (treatment A3) or UV radiation (treatment A5) was the most effective for hydrolysate detoxification and had a positive effect on the fermentability of the rice straw hemicellulosic hydrolysate by the yeast. Under these conditions, the highest removals of total phenols (above 40%), low-molecular-weight phenolic compounds (above 95%), and furans (above 52%) were observed. In addition, the volumetric ethanol productivity of P. stipitis approximately doubled relative to the untreated hydrolysate.

Conclusion: These results demonstrate that AOPs are a promising method for reducing the toxicity and improving the fermentability of lignocellulosic hydrolysates.
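
A main-effects analysis is the usual way to read a Taguchi orthogonal-array experiment like the L16 design mentioned above. The fragment below shows the computation on a few hypothetical runs; the factors match the abstract, but the levels and responses are invented for illustration:

```python
# Sketch of a main-effects analysis for an orthogonal-array (Taguchi)
# experiment. Only a fragment of a design is shown, with hypothetical
# factor levels (0/1) and responses (e.g., % phenolics removed).
from collections import defaultdict

runs = [
    ({"H2O2": 0, "UV": 0, "O3": 1, "pH": 1}, 41.0),
    ({"H2O2": 1, "UV": 0, "O3": 1, "pH": 1}, 58.0),
    ({"H2O2": 0, "UV": 1, "O3": 1, "pH": 0}, 44.0),
    ({"H2O2": 1, "UV": 1, "O3": 0, "pH": 0}, 30.0),
]

# Average response at each level of each factor; the level-to-level
# difference estimates that factor's main effect.
sums = defaultdict(lambda: [0.0, 0])
for levels, response in runs:
    for factor, level in levels.items():
        s = sums[(factor, level)]
        s[0] += response
        s[1] += 1

for (factor, level), (total, n) in sorted(sums.items()):
    print(f"{factor} at level {level}: mean response {total / n:.1f}")
```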

Relevance:

30.00%

Publisher:

Abstract:

This research was based on a study of social enterprises in Brazil, to find out whether and how these organizations plan and manage the succession process for their senior positions. The study investigated the subset of associations dedicated to collectively producing goods and services, because they are formally constituted and aimed at speeding up the dynamism of local development. The empirical research consisted of two stages. The first was a survey covering a sample of 378 organizations, to find out which of them had already undergone or were undergoing a succession process. The second interviewed the main manager of 32 organizations, to obtain a description of their succession experience. In this stage, the research analyzed how the Individual, Organization, and Environment dimensions interact to configure the succession process, identifying which factors of each dimension can facilitate or limit the process. The following guiding elements were taken as the analytical basis: Individual dimension - leadership roles, skills, and styles; Organization dimension - structure, planning, advisory boards, communication (transparency), control, and evaluation; Environment dimension - influence of the stakeholders (community, suppliers, clients, and business partners) on the succession process. The results indicated that succession in the associations studied is still under construction: it adapts to the requirements of current circumstances but is evidently in need of improvement if more effective planning and shared management of the process are to be achieved.

Relevance:

30.00%

Publisher:

Abstract:

Background: Over the last years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks have been designed around the abstractions provided by this paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring two main technical skills: (i) knowing the syntax details of the programming language used to build the framework, and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only begin once the development process reaches the implementation phase, preventing it from starting earlier.

Method: To solve these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter is in charge of supporting the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be generated automatically.

Results: We also present the results of two comparative experiments using two versions of a persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions. The results show an improvement of 97% in productivity; however, little difference was perceived regarding the effort of maintaining the applications.

Conclusion: Using the approach presented here, it was possible to conclude the following: (i) it is possible to automate the instantiation of CFs, and (ii) developer productivity improves as long as a model-based instantiation approach is used.
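
The paper's RRM/RM notation is not reproduced here, but the sketch below conveys the idea of generating framework-binding code from a filled-in reuse model: the engineer supplies application-level names, and the generator handles the framework's internal nomenclature. The model fields and the emitted output are hypothetical:

```python
# Sketch of model-based framework instantiation: a filled-in "reuse
# model" drives code generation, so the application engineer never
# touches the framework internals. All names are invented.
reuse_model = {
    "framework_hook": "PersistenceAspect",
    "application_class": "Customer",
    "persistent_fields": ["name", "email"],
}

def generate_binding(model):
    # Emit glue code from the model; a real generator would target the
    # CF's actual language and internal nomenclature.
    lines = [f"class {model['application_class']}Binding({model['framework_hook']}):"]
    fields = ", ".join(repr(f) for f in model["persistent_fields"])
    lines.append(f"    persistent_fields = [{fields}]")
    return "\n".join(lines)

print(generate_binding(reuse_model))
```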

Relevance:

30.00%

Publisher:

Abstract:

Comparison of two software packages for risk analysis in the road transport of dangerous goods (TRAT GIS 4.1 and QRAM 3.6), through their application to a simple case study and to the real case of Casalecchio di Reno, a municipality in the province of Bologna.

Relevance:

30.00%

Publisher:

Abstract:

Process algebraic architectural description languages provide a formal means for modeling software systems and assessing their properties. In order to bridge the gap between system modeling and system implementation, this thesis proposes an approach for automatically generating multithreaded object-oriented code from process algebraic architectural descriptions, in a way that preserves, under certain assumptions, the properties proved at the architectural level. The approach is divided into three phases, which are illustrated by means of a running example based on an audio processing system. First, we develop an architecture-driven technique for thread coordination management, which is completely automated through a suitable package. Second, we address the translation of the algebraically specified behavior of the individual software units into thread templates, which are to be filled in by the software developer according to certain guidelines. Third, we discuss performance issues related to the suitability of synthesizing monitors rather than threads from software unit descriptions that satisfy specific constraints. In addition to the running example, we present two case studies, one about a video animation repainting system and one about the implementation of a leader election algorithm, in order to exercise the whole approach. The outcome of this thesis is the implementation of the proposed approach in a translator called PADL2Java and its integration into the architecture-centric verification tool TwoTowers.
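
The thesis targets multithreaded Java generated by PADL2Java; as a language-neutral illustration of the second phase, the Python sketch below shows a software unit turned into a thread template whose behavior slot is filled in by the developer, with coordination handled through channels. The names and the poison-pill stop protocol are illustrative assumptions, not the thesis's generated code:

```python
# Sketch of a "thread template" for one software unit in a pipeline
# (cf. the audio processing running example). Coordination happens
# through queues; the process() body is the developer-filled slot.
import queue
import threading

def unit_template(inbox, outbox):
    def process(sample):
        return sample * 0.5  # placeholder behavior (e.g., attenuation)
    while True:
        item = inbox.get()
        if item is None:      # poison pill: propagate and terminate
            outbox.put(None)
            break
        outbox.put(process(item))

source_to_filter = queue.Queue()
filter_to_sink = queue.Queue()
threading.Thread(target=unit_template,
                 args=(source_to_filter, filter_to_sink)).start()

for sample in [1.0, 2.0, None]:   # feed two samples, then stop
    source_to_filter.put(sample)

while (out := filter_to_sink.get()) is not None:
    print(out)
```

Synthesizing a monitor instead of a thread, as discussed in the third phase, would replace the queue-driven loop with synchronized methods invoked by the callers' own threads.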

Relevance:

30.00%

Publisher:

Abstract:

Subduction zones are the most common sites of tsunamigenic earthquakes, where friction between the oceanic and continental plates causes strong seismicity. The topics and methodologies discussed in this thesis are focused on understanding the rupture process of the seismic sources of great tsunami-generating earthquakes. Tsunamigenesis is controlled by several kinematic characteristics of the parent earthquake, such as the focal mechanism, the depth of the rupture, and the slip distribution along the fault area, as well as by the mechanical properties of the source zone. Each of these factors plays a fundamental role in tsunami generation, so inferring the source parameters of tsunamigenic earthquakes is crucial to understanding the generation of the consequent tsunami and hence to mitigating the risk along the coasts. The typical way to gather information on the source process is to invert the available geophysical data. Tsunami data, moreover, are useful for constraining the portion of the fault that extends offshore, generally close to the trench, which other kinds of data are unable to constrain. In this thesis I discuss the rupture process of some recent tsunamigenic events, as inferred by means of an inverse method. I first present the 2003 Tokachi-Oki (Japan) earthquake (Mw 8.1), for which the slip distribution on the fault was inferred by inverting tsunami waveform, GPS, and bottom-pressure data. The joint inversion of tsunami and geodetic data constrained the slip distribution on the fault much better than separate inversions of the single datasets. We then studied the earthquake that occurred in 2007 in southern Sumatra (Mw 8.4). By inverting several tsunami waveforms, both in the near and in the far field, we determined the slip distribution and the mean rupture velocity along the causative fault. The largest patch of slip was concentrated on the deepest part of the fault, which is the likely reason for the small tsunami waves that followed the earthquake and highlights the crucial role the depth of the rupture plays in controlling tsunamigenesis. Finally, we present a new rupture model for the great 2004 Sumatra earthquake (Mw 9.2). We performed a joint inversion of tsunami waveform, GPS, and satellite altimetry data to infer the slip distribution, the slip direction, and the rupture velocity on the fault. Furthermore, we present a novel method to estimate, in a self-consistent way, the average rigidity of the source zone. Estimating the source zone rigidity is important because it may play a significant role in tsunami generation; particularly for slow earthquakes, a low rigidity value is sometimes necessary to explain how an earthquake with a relatively low seismic moment may generate a significant tsunami. This point may be relevant to explaining the mechanics of tsunami earthquakes, one of the open issues in present-day seismology. The investigation of these tsunamigenic earthquakes has underlined the importance of jointly inverting different geophysical data to determine the rupture characteristics.
The results shown here have important implications for the implementation of new tsunami warning systems, particularly in the near field, for the improvement of the current ones, and for the planning of inundation maps for tsunami hazard assessment along coastal areas.
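
At the core of the slip-distribution studies summarized above is a linear inverse problem d = Gm, where the columns of G hold Green's functions for the individual fault patches and m is the slip vector. The sketch below solves a damped least-squares version on synthetic data; G, the damping factor, and the patch count are placeholders, not values from the thesis:

```python
# Sketch of a linear slip inversion: observed data d (waveforms,
# offsets) are modeled as G @ m. G and the "true" slip are synthetic
# placeholders used only to exercise the solver.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_patches = 40, 6
G = rng.normal(size=(n_obs, n_patches))            # Green's functions (assumed)
m_true = np.array([0.0, 1.5, 3.0, 2.0, 0.5, 0.0])  # "true" slip per patch, m
d = G @ m_true + 0.05 * rng.normal(size=n_obs)     # synthetic data + noise

# Damped least squares: minimize ||G m - d||^2 + k^2 ||m||^2
k = 0.1
A = np.vstack([G, k * np.eye(n_patches)])
b = np.concatenate([d, np.zeros(n_patches)])
m_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.round(m_est, 2))  # recovered slip per fault patch
```

A joint inversion, as used for the Tokachi-Oki and 2004 Sumatra events, stacks the G matrices and data vectors of the different datasets (tsunami, GPS, altimetry), usually with per-dataset weights, before solving the same least-squares problem.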