Abstract:
In this paper, composites of polypropylene and Kraft pulp (from Pinus radiata) were prepared. Phenyl isocyanate and unblocked and phenol-blocked derivatives of 4,4'-methylenebis(phenyl isocyanate) (MDI) were used as coupling agents, and the mechanical properties of the obtained composites were analyzed. The results showed that the addition of such compatibilizers readily improved the tensile and flexural strengths of the composites. However, no significant variation in mechanical properties was observed among composite formulations comprising different isocyanate compounds. Accordingly, the chemical structure of the isocyanate derivatives did not extensively affect the mechanical properties of MDI-coupled pine fiber reinforced composites. These results were similar to those obtained in previous studies regarding the efficiency of organosilane coupling agents. In comparison to monoreactive isocyanates, the addition of MDI considerably increased the mechanical properties of pine fiber-polypropylene composites. The mechanical anchoring of polymeric PP chains onto the irregular reinforcement surface supported this result. Non-isothermal DSC analysis showed a slowing effect of MDI on the crystallization kinetics of the coupled composites. This may have been the result of diminished polymer chain mobility in the matrix due to mechanical anchoring onto the fiber surface. Considering these results, the existence of strong bonds between the composite components was inferred, rather than merely van der Waals interactions among the non-polar structures. (c) 2008 Elsevier Ltd. All rights reserved.
Abstract:
This paper presents the results of an experimental investigation carried out to determine the effects of the surface roughness of different materials on nucleate boiling heat transfer of refrigerants R-134a and R-123. Experiments have been performed on cylindrical surfaces of copper, brass and stainless steel. The surfaces have been treated by different methods in order to obtain an average roughness, Ra, varying from 0.03 µm to 10.5 µm. Boiling curves at different reduced pressures have been obtained as part of the investigation. The results have shown significant effects of the surface material, with brass being the best performing and stainless steel the worst. Polished surfaces seem to present slightly better performance than sandpaper-roughened ones. Boiling on very rough surfaces presents a peculiar behavior characterized by good thermal performance at low heat fluxes, with performance deteriorating at high heat fluxes with respect to smoother surfaces. (C) 2008 Elsevier Inc. All rights reserved.
Abstract:
In this paper, the microbial characteristics of granular sludge in the presence of oxygen (3.0 ± 0.7 mg O₂ L⁻¹) were analyzed using molecular biology techniques. The granules were provided by an upflow anaerobic sludge blanket (UASB) reactor operated for 469 days and fed with a synthetic substrate. Ethanol and sulfate were added to obtain different COD/SO₄²⁻ ratios (3.0, 2.0, and 1.6). The results of fluorescent in situ hybridization (FISH) analyses showed that archaeal cells, detected by the ARC915 probe, accounted for 77%, 84%, and 75% at COD/SO₄²⁻ ratios of 3.0, 2.0, and 1.6, respectively. Methanosaeta sp. was the predominant acetoclastic archaeon observed by optical microscopy and FISH analyses, as confirmed by sequencing of the excised bands of the DGGE gel with a similarity of 96%. The sulfate-reducing bacterium Desulfovibrio vulgaris subsp. vulgaris (similarity of 99%) was also identified by sequencing of a DGGE band. Other identified microorganisms were similar to Shewanella sp. and Desulfitobacterium hafniense, with similarities of 95% and 99%, respectively. These results confirmed that the presence of oxygen did not severely affect the metabolism of microorganisms that are commonly considered strictly anaerobic. Mean organic matter conversion and sulfate reduction efficiencies higher than 74% were obtained. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Swallowing dynamics involves the coordination and interaction of several muscles and nerves, which allow correct food transport from the mouth to the stomach without laryngotracheal penetration or aspiration. Clinical swallowing assessment depends on the evaluator's knowledge of the anatomic structures and neurophysiological processes involved in swallowing. Any alteration in these steps is termed oropharyngeal dysphagia, which may have many causes, such as neurological or mechanical disorders. Videofluoroscopy of swallowing is presently considered the best exam to objectively assess the dynamics of swallowing, but it must be conducted under certain restrictions due to the patient's exposure to radiation, which limits its periodic repetition for monitoring swallowing therapy. Another method, cervical auscultation, is a promising new diagnostic tool for the assessment of swallowing disorders. The potential to diagnose dysphagia noninvasively by assessing the sounds of swallowing is a highly attractive option for the dysphagia clinician. Even so, the captured sound contains noise, which can hamper the evaluator's decision. Accordingly, the present paper proposes the use of a filter to improve the quality of the audible sound and facilitate the interpretation of the examination. The wavelet denoising approach is used to decompose the noisy signal. The signal-to-noise ratio was evaluated to demonstrate the quantitative results of the proposed methodology. (C) 2007 Elsevier Ltd. All rights reserved.
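For illustration, a minimal sketch of wavelet soft-threshold denoising and of a signal-to-noise ratio check is given below, assuming the PyWavelets package; the wavelet family, decomposition level and threshold rule are illustrative choices, not the authors' exact settings.

```python
# Minimal sketch of wavelet-based denoising of a swallowing-sound recording.
# Assumes PyWavelets; wavelet, level and threshold rule are illustrative only.
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest detail coefficients (universal threshold).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

def snr_db(reference, estimate):
    """Signal-to-noise ratio in dB between a reference signal and its estimate."""
    noise = reference - estimate
    return 10.0 * np.log10(np.sum(reference**2) / np.sum(noise**2))
```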
Abstract:
Chloride attack in marine environments, or in structures where deicing salts are used, does not always produce profiles with concentrations that decrease from the external surface towards the interior of the concrete. Some profiles show chloride concentrations that increase from the surface up to a certain depth, where a peak is formed. This type of profile must be analyzed differently from the traditional model based on Fick's second law in order to generate more precise service life models. A model has previously been proposed for forecasting the penetration of chloride ions as a function of time for profiles that have formed a peak. To confirm the efficiency of this model, it is necessary to observe the behavior of a chloride profile with a peak in a specific structure over a period of time. To achieve this, two chloride profiles of different ages (22 and 27 years) were extracted from the same structure. The profile obtained from the 22-year sample was used to estimate the chloride profile at 27 years using three models: (a) the traditional model using Fick's second law and extrapolating the value of Cs, the external surface chloride concentration; (b) the traditional model using Fick's second law and shifting the x-axis to the peak depth; (c) the previously proposed model. The results from these models were compared with the actual profile measured in the 27-year sample and analyzed. The proposed model showed good precision in this case study, although it still needs to be tested on other structures in use.
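Models (a) and (b) rely on the well-known error-function solution of Fick's second law; a minimal sketch is given below, with the diffusion coefficient, surface concentration and peak-depth shift treated as illustrative inputs rather than values from the study.

```python
# Sketch of the error-function solution of Fick's second law for chloride ingress.
# Model (b) amounts to shifting the x-axis to the peak depth before evaluating it.
import numpy as np
from scipy.special import erf

def chloride_profile(x_mm, t_years, Cs, D_mm2_per_year, x_peak_mm=0.0):
    """Chloride concentration at depth x (mm) and age t (years).

    Cs is the (extrapolated) surface concentration, D the apparent diffusion
    coefficient; x_peak_mm shifts the origin to the peak depth (model b).
    """
    xs = np.maximum(np.asarray(x_mm, dtype=float) - x_peak_mm, 0.0)
    return Cs * (1.0 - erf(xs / (2.0 * np.sqrt(D_mm2_per_year * t_years))))
```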
Abstract:
The continuous growth of peer-to-peer networks has made them responsible for a considerable portion of current Internet traffic. For this reason, improvements in the use of P2P network resources are of central importance. One effective approach for addressing this issue is the deployment of locality algorithms, which allow the system to optimize the peer selection policy for different network situations and, thus, maximize performance. To date, several locality algorithms have been proposed for use in P2P networks. However, they usually adopt heterogeneous criteria for measuring the proximity between peers, which hinders a coherent comparison between the different solutions. In this paper, we develop a thorough review of popular locality algorithms, based on three main characteristics: the adopted network architecture, the distance metric, and the resulting peer selection algorithm. As a result of this study, we propose a novel and generic taxonomy for locality algorithms in peer-to-peer networks, aiming to enable a better and more coherent evaluation of any individual locality algorithm.
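As a minimal illustration of the kind of locality-aware peer selection policy the taxonomy classifies, the sketch below ranks candidate peers by a pluggable distance metric; the peer names and the round-trip-time metric are hypothetical and stand in for whichever metric a concrete algorithm adopts.

```python
# Generic locality-aware peer selection: rank candidates by any distance metric
# (RTT, AS-hop count, geographic distance, ...) and keep the k closest peers.
from typing import Callable, Dict, List

def select_peers(candidates: List[str],
                 distance: Callable[[str], float],
                 k: int) -> List[str]:
    """Return the k candidate peers closest to the local node under `distance`."""
    return sorted(candidates, key=distance)[:k]

# Hypothetical example: proximity measured as previously probed round-trip time (ms).
rtt: Dict[str, float] = {"peerA": 12.0, "peerB": 85.0, "peerC": 30.0}
nearest = select_peers(list(rtt), distance=lambda p: rtt[p], k=2)  # ['peerA', 'peerC']
```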
Abstract:
In recent decades, the air traffic system has been changing to adapt to new social demands, mainly the safe growth of worldwide traffic capacity. These changes are governed by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In the face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the "absolute" and "relative" safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689 [14], uses Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the "Automatic Dependent Surveillance-Broadcast" (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess CNS/ATM system safety properties, with the FSPN formalism providing important modeling capabilities and discrete event simulation allowing the estimation of the desired safety metrics. (C) 2011 Elsevier Ltd. All rights reserved.
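As a highly simplified illustration of the "relative" comparison step, the sketch below estimates the same safety metric by simulation for a legacy and a proposed model and compares them; the two-parameter toy models and their per-hour hazard probabilities are hypothetical stand-ins for the FSPN models actually used in the paper.

```python
# Toy Monte Carlo comparison of a safety metric between a legacy and a proposed
# system model; the per-hour hazard probabilities below are illustrative only.
import random

def simulate_hazard_probability(p_hazard_per_hour: float, hours: int, runs: int) -> float:
    """Fraction of simulated missions in which at least one hazardous event occurs."""
    unsafe = 0
    for _ in range(runs):
        if any(random.random() < p_hazard_per_hour for _ in range(hours)):
            unsafe += 1
    return unsafe / runs

legacy = simulate_hazard_probability(1e-4, hours=1000, runs=2000)
proposed = simulate_hazard_probability(5e-5, hours=1000, runs=2000)
relative_gain = legacy / proposed if proposed else float("inf")
```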
Abstract:
This paper presents a free software tool that supports next-generation mobile communications through the automatic generation of neural network models of components and electronic devices. The tool enables the creation, training, validation and simulation of the model directly from measurements made on the devices of interest, through an interface fully oriented to non-experts in neural models. The resulting model can be exported automatically to a traditional circuit simulator to test different scenarios.
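A minimal sketch of the kind of measurement-driven behavioral model such a tool produces is shown below; scikit-learn's MLPRegressor and the synthetic bias/current data are assumptions used only for illustration, not the tool's own training engine or export format.

```python
# Fit a small neural network to hypothetical device measurements and query it as
# a behavioral model; data and network size are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical measurements: bias voltage (V) -> measured output current (mA).
v_bias = np.linspace(0.0, 3.0, 50).reshape(-1, 1)
i_out = 2.0 * np.tanh(1.5 * v_bias).ravel() + 0.05 * np.random.randn(50)

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(v_bias, i_out)                      # training step
i_pred = model.predict(np.array([[1.2]]))     # simulate the device at a new bias point
```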
Abstract:
This paper presents a new hybrid method for risk assessment regarding interruptions of sensitive processes due to faults in electric power distribution systems. The method determines indices related to long duration interruptions and short duration voltage variations (SDVV), such as voltage sags and swells, for each customer supplied by the distribution network. The frequency of such occurrences and their impact on customer processes are determined for each bus and classified according to their magnitude and duration. The method is based on information regarding network configuration, system parameters and protective devices. It randomly generates a number of fault scenarios in order to assess risk areas regarding long duration interruptions and voltage sags and swells in an especially inventive way, including the frequency of events according to their magnitude and duration. Based on sensitivity curves, the method determines frequency indices regarding disruptions in customer processes that represent equipment malfunction and possible process interruptions due to voltage sags and swells. Such an approach allows for the assessment of the annual costs associated with each of the evaluated power quality indices.
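As an illustration of the disruption-counting step, the sketch below checks simulated sag events against a piecewise equipment sensitivity curve; the ITIC-like thresholds are hypothetical and do not reproduce the customer sensitivity curves used in the paper.

```python
# Count process disruptions by testing (magnitude, duration) sag events against
# a hypothetical piecewise sensitivity curve.
def disrupts(magnitude_pu: float, duration_s: float) -> bool:
    """True if a sag with the given retained voltage and duration is assumed to
    fall outside the tolerated region of the (illustrative) sensitivity curve."""
    if duration_s <= 0.02:
        return False                    # too brief to affect the process
    if duration_s <= 0.5:
        return magnitude_pu < 0.70      # short events tolerated down to 0.70 pu
    return magnitude_pu < 0.90          # longer events require at least 0.90 pu

events = [(0.65, 0.10), (0.85, 1.20), (0.95, 2.00)]   # (magnitude pu, duration s)
annual_disruptions = sum(disrupts(m, d) for m, d in events)
```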
Abstract:
This paper discusses the need to monitor voltage unbalance and harmonic distortions simultaneously, in addition to root-mean-square voltage values. An alternative way to obtain the parameters related to voltage unbalance at the fundamental frequency, as well as voltage harmonic distortions, is proposed here, based on the representation of instantaneous values on the axes and on the instantaneous Euclidean norm. A new power-quality (PQ) index is then proposed to combine the effects of voltage unbalance and harmonic distortions. This new index is easily implemented in existing electronic power meters. The PQ index is determined from the analysis of the temperature rise in induction motor windings, which were tested for long periods of time. This paper also shows that these voltage disturbances, which are harmful to the life expectancy of motors, can be measured in ways alternative to conventional methods. Although this paper deals only with induction motors, the results show the relevance of further studies on other types of equipment.
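A minimal sketch of the instantaneous Euclidean norm on which the representation is based is given below: for a balanced, undistorted three-phase set the norm is constant, while unbalance and harmonics appear as ripple. The combined PQ index itself is defined in the paper and is not reproduced here; the waveform parameters are illustrative.

```python
# Instantaneous Euclidean norm of three phase voltages; the ripple of the norm
# reflects unbalance and harmonic distortion (waveform parameters illustrative).
import numpy as np

f, fs = 60.0, 15360.0
t = np.arange(0.0, 0.1, 1.0 / fs)
va = 1.00 * np.sin(2 * np.pi * f * t)
vb = 0.95 * np.sin(2 * np.pi * f * t - 2 * np.pi / 3)        # slight unbalance
vc = 1.00 * np.sin(2 * np.pi * f * t + 2 * np.pi / 3) \
     + 0.05 * np.sin(2 * np.pi * 5 * f * t)                  # 5th-harmonic distortion

norm = np.sqrt(va**2 + vb**2 + vc**2)                        # constant if balanced/clean
ripple = (norm.max() - norm.min()) / norm.mean()             # simple disturbance indicator
```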
Abstract:
In this paper, a computational implementation of an evolutionary algorithm (EA) is presented to tackle the problem of reconfiguring radial distribution systems. The developed module considers power quality indices such as long duration interruptions and customer process disruptions due to voltage sags, using the Monte Carlo simulation method. Power quality costs are modeled in the mathematical problem formulation and added to the cost of network losses. For the proposed EA codification, a decimal representation is used. The EA operators considered for the reconfiguration algorithm, namely selection, recombination and mutation, are analyzed herein. Several selection procedures are analyzed, namely tournament, elitism and a mixed technique using both elitism and tournament. The recombination operator was developed by considering a chromosome structure representation that maps the network branches and system radiality, and another structure that takes into account the network topology and the feasibility of network operation when exchanging genetic material. The topologies of the initial population are randomly produced, with radial configurations generated through the Prim and Kruskal algorithms, which rapidly build minimum spanning trees. (C) 2009 Elsevier B.V. All rights reserved.
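As an illustration of how a random radial topology can be drawn for the initial population, the sketch below grows a spanning tree with a randomized Prim-style procedure; the graph representation is illustrative and does not reproduce the chromosome structures described in the abstract.

```python
# Draw a random radial configuration (spanning tree) of a distribution network
# graph by randomized Prim-style growth; buses are assumed numbered 0..n-1.
import random

def random_radial_config(edges, n_buses):
    """edges: list of (u, v) branches; returns a branch subset forming a spanning tree."""
    adjacency = {b: [] for b in range(n_buses)}
    for u, v in edges:
        adjacency[u].append((u, v))
        adjacency[v].append((u, v))
    start = random.randrange(n_buses)
    visited, frontier, tree = {start}, list(adjacency[start]), []
    while frontier and len(visited) < n_buses:
        u, v = frontier.pop(random.randrange(len(frontier)))  # pick a frontier branch at random
        new = v if u in visited else u
        if new in visited:
            continue                                          # branch would close a loop
        visited.add(new)
        tree.append((u, v))
        frontier.extend(adjacency[new])
    return tree

# Example on a tiny 4-bus meshed network:
radial = random_radial_config([(0, 1), (1, 2), (0, 2), (2, 3)], n_buses=4)
```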
Abstract:
A novel methodology is presented to assess the risk of power transformer failures caused by external faults, such as short-circuits, taking the condition of the paper insulation into account. The risk index is obtained by contrasting the insulation paper condition with the probability that the transformer withstands the short-circuit current flowing along the winding during an external fault. In order to assess the risk, this probability and the degree of polymerization of the insulating paper are used as inputs of a type-2 fuzzy logic system (T2-FLS), which computes the fuzzy risk level. A Monte Carlo simulation has been used to find the survival function of the currents flowing through the transformer winding during a single-phase or three-phase short-circuit. The Roy Billinton Test System and a real power system have been used to test the results. (C) 2008 Elsevier B.V. All rights reserved.
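As an illustration of the Monte Carlo step, the sketch below samples winding fault currents and builds the empirical survival function P(I > i); the lognormal sampling model and the withstand value are hypothetical, and the type-2 fuzzy evaluation is not reproduced here.

```python
# Empirical survival function of simulated short-circuit currents; the lognormal
# sampling model and the 12 kA withstand value are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
currents_ka = rng.lognormal(mean=2.0, sigma=0.35, size=10_000)  # hypothetical fault currents (kA)

def survival(i_ka: float) -> float:
    """Empirical probability that the winding fault current exceeds i_ka."""
    return float(np.mean(currents_ka > i_ka))

p_exceed_withstand = survival(12.0)   # probability of exceeding a 12 kA withstand level
```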
Abstract:
A geometrical approach to finite-element analysis applied to electrostatic fields is presented. This approach is particularly well adapted to teaching finite elements in electrical engineering courses at the undergraduate level. The procedure leads to the same system of algebraic equations as that derived by classical approaches, such as the variational principle or weighted residuals, for nodal elements with plane symmetry. It is shown that the extension of the original procedure to three dimensions is straightforward, provided the domain is meshed with first-order tetrahedral elements. The element matrices are derived by applying Maxwell's equations in integral form to suitably chosen surfaces in the finite-element mesh.
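For reference, the sketch below assembles the classical first-order triangular element matrix for 2-D electrostatics, i.e. the algebraic result that, per the abstract, the geometric procedure also reaches for nodal elements with plane symmetry; vertex coordinates and permittivity are illustrative.

```python
# First-order (P1) triangular element matrix for 2-D electrostatics.
import numpy as np

def p1_element_matrix(xy, eps):
    """xy: (3, 2) array of vertex coordinates; eps: element permittivity."""
    x, y = xy[:, 0], xy[:, 1]
    b = np.array([y[1] - y[2], y[2] - y[0], y[0] - y[1]])   # gradient coefficients
    c = np.array([x[2] - x[1], x[0] - x[2], x[1] - x[0]])
    area = 0.5 * abs(b[0] * c[1] - b[1] * c[0])             # triangle area
    return eps * (np.outer(b, b) + np.outer(c, c)) / (4.0 * area)

# Example: unit right triangle with unit permittivity.
K = p1_element_matrix(np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]), eps=1.0)
```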
Abstract:
A procedure is proposed to accurately model thin wires in lossy media by finite element analysis. It is based on the determination of a suitable element width in the vicinity of the wire; this width strongly depends on the wire radius and must be chosen appropriately to yield accurate results. The approach is well suited to the analysis of grounding systems. The numerical results of finite element analysis with the suitably chosen element width are compared with both analytical results and those computed by a commercial package for the analysis of grounding systems, showing very good agreement.