163 results for Modeling technique
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Honeycomb structures have been used in different engineering fields. In civil engineering, honeycomb fiber-reinforced polymer (FRP) structures have been used as bridge decks to rehabilitate highway bridges in the United States. In this work, a simplified finite-element modeling technique for honeycomb FRP bridge decks is presented. The motivation is the combination of the complex geometry of honeycomb FRP decks and computational limits, which may prevent modeling of these decks in detail. The results from static and modal analyses indicate that the proposed modeling technique provides a viable tool for modeling the complex geometry of honeycomb FRP bridge decks. The modeling of other bridge components (e.g., steel girders, steel guardrails, deck-to-girder connections, and pier supports) is also presented in this work.
Abstract:
In this study, the concept of cellular automata is applied in an innovative way to simulate the separation of phases in a water/oil emulsion. The velocity of the water droplets is calculated by the balance of forces acting on a pair of droplets in a group, and cellular automata is used to simulate the whole group of droplets. Thus, it is possible to solve the problem stochastically and to show the sequence of collisions of droplets and coalescence phenomena. This methodology enables the calculation of the amount of water that can be separated from the emulsion under different operating conditions, thus enabling the process to be optimized. Comparisons between the results obtained from the developed model and the operational performance of an actual desalting unit are carried out. The accuracy observed shows that the developed model is a good representation of the actual process. (C) 2010 Published by Elsevier Ltd.
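The idea of simulating the droplet group with a cellular automaton can be illustrated with a minimal one-dimensional sketch. Everything below is an illustrative assumption, not the paper's model: the rule set, the settling probability law, and the `v_ref` parameter are invented for the example; the paper derives droplet velocities from a force balance instead.

```python
import random

def ca_step(column, rng, v_ref=1.0):
    """One cellular-automaton step on a 1-D column of droplet volumes.

    Each nonzero cell holds a water droplet; a droplet settles one cell
    toward the bottom (end of the list) with probability min(1, v/v_ref),
    mimicking larger droplets falling faster, and droplets meeting in the
    same cell coalesce.  Illustrative rules only, not the paper's.
    """
    new = [0.0] * len(column)
    for i, v in enumerate(column):
        if v == 0.0:
            continue
        moves = rng.random() < min(1.0, v / v_ref)
        j = min(i + 1, len(column) - 1) if moves else i
        new[j] += v  # coalescence: volumes simply add
    return new

def separated_water(column, steps, seed=0):
    """Run the automaton and report the volume collected at the bottom cell."""
    rng = random.Random(seed)
    for _ in range(steps):
        column = ca_step(column, rng)
    return column[-1], column
```

Because every step conserves total volume, the bottom-cell volume directly gives the amount of water separated under the chosen rule parameters, which is the kind of quantity the paper's model optimizes over operating conditions.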
Abstract:
We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared difference between the model and observed images. The model image is constructed by summing N(s) elliptical Gaussian sources characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. Those images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with a similar accuracy to that obtained from the very traditional Astronomical Image Processing System Package task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to show quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique must be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower for a single processor, depending on the number of sources to be optimized).
As in the case of any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
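The core loop of a cross-entropy fit — sample candidate parameter vectors, rank them by the squared image difference, re-fit the sampling distribution to the elite — can be sketched as follows. This is a toy version under stated assumptions: a single circular Gaussian of fixed width instead of the paper's six-parameter elliptical sources, and arbitrary population/elite/iteration settings.

```python
import math
import random

def gauss_image(params, size=16, sigma=2.0):
    """Render a circular Gaussian (x0, y0, amp) on a size x size grid.
    (The paper uses six-parameter elliptical Gaussians; this sketch fixes
    the width and drops eccentricity/orientation for brevity.)"""
    x0, y0, amp = params
    return [[amp * math.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
             for x in range(size)] for y in range(size)]

def sq_diff(a, b):
    """Performance function: sum of squared pixel differences."""
    return sum((ra[i] - rb[i]) ** 2
               for ra, rb in zip(a, b) for i in range(len(ra)))

def ce_fit(observed, n_iter=50, pop=80, elite=8, seed=1):
    """Cross-entropy minimization of the squared image difference."""
    rng = random.Random(seed)
    mu = [8.0, 8.0, 1.0]          # initial parameter means (center of grid)
    sd = [4.0, 4.0, 1.0]          # initial parameter spreads
    for _ in range(n_iter):
        samples = [[rng.gauss(m, s) for m, s in zip(mu, sd)] for _ in range(pop)]
        samples.sort(key=lambda p: sq_diff(gauss_image(p), observed))
        best = samples[:elite]    # the elite set drives the next distribution
        mu = [sum(p[k] for p in best) / elite for k in range(3)]
        sd = [max(1e-3, (sum((p[k] - mu[k]) ** 2 for p in best) / elite) ** 0.5)
              for k in range(3)]
    return mu
```

On a noiseless synthetic image the loop shrinks the sampling distribution around the true source parameters; the paper's version does the same over all 6·N(s) parameters, which is where its cost (and its robustness on crowded maps) comes from.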
Abstract:
We describe an estimation technique for biomass burning emissions in South America based on a combination of remote-sensing fire products and field observations, the Brazilian Biomass Burning Emission Model (3BEM). For each fire pixel detected by remote sensing, the mass of the emitted tracer is calculated based on field observations of fire properties related to the type of vegetation burning. The burnt area is estimated from the instantaneous fire size retrieved by remote sensing, when available, or from statistical properties of the burn scars. The sources are then spatially and temporally distributed and assimilated daily by the Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS) in order to perform the prognosis of related tracer concentrations. Three other biomass burning inventories, including GFEDv2 and EDGAR, are simultaneously used to compare the emission strength in terms of the resultant tracer distribution. We also assess the effect of using the daily time resolution of fire emissions by including runs with monthly-averaged emissions. We evaluate the performance of the model using the different emission estimation techniques by comparing the model results with direct measurements of carbon monoxide both near-surface and airborne, as well as remote sensing derived products. The model results obtained using the 3BEM methodology of estimation introduced in this paper show relatively good agreement with the direct measurements and MOPITT data product, suggesting the reliability of the model at local to regional scales.
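The per-pixel emission calculation described above follows the classic Seiler-Crutzen decomposition; a minimal sketch is below. The function name and argument names are illustrative; 3BEM itself tabulates the vegetation-dependent factors from the field observations cited in the paper.

```python
def fire_emission(burnt_area_m2, fuel_load_kg_m2, combustion_factor,
                  emission_factor_g_kg):
    """Emitted tracer mass for one detected fire pixel (illustrative):

        mass [g] = burnt area [m2]
                 * above-ground fuel load [kg/m2]
                 * fraction of the fuel actually combusted
                 * emission factor for the tracer [g per kg burnt]

    Each factor depends on the vegetation type at the fire pixel.
    """
    return burnt_area_m2 * fuel_load_kg_m2 * combustion_factor * emission_factor_g_kg
```

For example, a 1-hectare fire over 10 kg/m2 of fuel, 50% combustion completeness, and a 100 g/kg emission factor yields 5 tonnes of the tracer; summing such terms over all fire pixels and distributing them in space and time is what produces the daily source fields assimilated by CATT-BRAMS.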
Abstract:
Confined flows in tubes with permeable surfaces are associated with tangential filtration processes (microfiltration or ultrafiltration). The complexity of the phenomena does not allow for the development of exact analytical solutions; however, approximate solutions are of great interest for the calculation of the transmembrane outflow and the estimation of the concentration polarization phenomenon. In the present work, the generalized integral transform technique (GITT) was employed in solving the steady laminar flow of a Newtonian, incompressible fluid in permeable tubes. The mathematical formulation employed the parabolic differential equation of chemical species conservation (the convective-diffusive equation). The velocity profiles for the entrance-region flow, which appear in the convective terms of the equation, were taken from solutions available in the literature. The velocity at the permeable wall was considered uniform, while the concentration at the tube wall was allowed to vary with axial position. A computational methodology using global error control was applied to determine the wall concentration and the concentration boundary layer thickness. The results obtained for the local transmembrane flux and the concentration boundary layer thickness were compared against others in the literature. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
Cementitious stabilization of aggregates and soils is an effective technique to increase the stiffness of base and subbase layers. Furthermore, cementitious bases can improve the fatigue behavior of asphalt surface layers and reduce subgrade rutting over the short and long term. However, stabilization can lead to additional distresses such as shrinkage and fatigue in the stabilized layers. Extensive research has tested and characterized these materials experimentally; however, very little of this research attempts to correlate the mechanical properties of the stabilized layers with their performance. The Mechanistic-Empirical Pavement Design Guide (MEPDG) provides a promising theoretical framework for the modeling of pavements containing cementitiously stabilized materials (CSMs). However, significant improvements are needed to bring the modeling of semirigid pavements in the MEPDG to the same level as that of flexible and rigid pavements. Furthermore, the MEPDG does not model CSMs in a manner similar to that for hot-mix asphalt or portland cement concrete materials. As a result, performance gains from stabilized layers are difficult to assess using the MEPDG. The current characterization of CSMs was evaluated, and issues with CSM modeling and characterization in the MEPDG were discussed. Addressing these issues will help designers quantify the benefits of stabilization for pavement service life.
Abstract:
Pervasive and ubiquitous computing has motivated research on multimedia adaptation, which aims at matching the video quality to user needs and device restrictions. This technique has a high computational cost that needs to be studied and estimated when designing architectures and applications. This paper presents an analytical model to quantify these video transcoding costs in a hardware-independent way. The model was used to analyze the impact of transcoding delays on end-to-end live-video transmissions over LANs, MANs and WANs. Experiments confirm that the proposed model helps to define the best transcoding architecture for different scenarios.
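The flavor of such an analytical cost model can be sketched with a toy per-frame delay decomposition. This is not the paper's model: the additive form, the parameter names, and the idea of expressing transcoding cost as an effective throughput are all assumptions made for illustration.

```python
def end_to_end_delay(frame_bits, transcode_rate_bps, link_rate_bps,
                     propagation_s, hops=1):
    """Toy per-frame delay budget for live video with in-path transcoding:

        delay = transcoding time          (frame size / transcoder throughput)
              + store-and-forward time    (per hop: frame size / link rate)
              + propagation delay         (dominant on WANs)

    Illustrative decomposition only; parameter names are assumptions.
    """
    transcode = frame_bits / transcode_rate_bps
    transmit = hops * frame_bits / link_rate_bps
    return transcode + transmit + propagation_s
```

Plugging in LAN-like versus WAN-like propagation values shows how the same transcoding cost can be negligible in one scenario and the bottleneck in another, which is the kind of trade-off the paper's model is built to expose.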
Abstract:
The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least square approximations. The approximations compute the coordinates of a set of projected points based on the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projections (LSP). From an initial projection of the control points, LSP defines the positioning of their neighboring points through a numerical solution that aims at preserving a similarity relationship between the points, given by a metric in mD. In order to perform the projection, only a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly where it was mostly tested, that is, for mapping text sets.
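The heart of the approach — non-control points placed so that each sits at the average of its neighbors, with the control points pinned at their initial projection — can be sketched as follows. A Jacobi-style iteration stands in here for the sparse least-squares solve; the function name and the neighbor encoding are illustrative.

```python
def lsp_positions(neighbors, control, n_iter=500):
    """Place points so each non-control point is the centroid of its
    neighbors, with control points fixed (a Laplacian-like system
    solved by fixed-point iteration instead of a least-squares solver).

    neighbors: neighbors[i] is the list of neighbor indices of point i
    control:   dict {index: (x, y)} of pre-positioned control points
    """
    n = len(neighbors)
    pos = [control.get(i, (0.0, 0.0)) for i in range(n)]
    for _ in range(n_iter):
        new = []
        for i in range(n):
            if i in control:
                new.append(control[i])      # control points never move
            else:
                xs = [pos[j][0] for j in neighbors[i]]
                ys = [pos[j][1] for j in neighbors[i]]
                new.append((sum(xs) / len(xs), sum(ys) / len(ys)))
        pos = new
    return pos
```

Because positions come from neighborhood relations rather than from all pairwise distances, only the distances needed to build the neighbor lists are ever computed, which matches the small-number-of-distance-calculations property claimed in the abstract.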
Abstract:
In this paper, the laminar flow of Newtonian and non-Newtonian aqueous solutions in a tubular membrane is studied numerically. The mathematical formulation, with the associated initial and boundary conditions in cylindrical coordinates, comprises the mass conservation, momentum conservation and mass transfer equations. These equations are discretized using the finite-difference technique on a staggered grid system. Comparisons of three upwinding schemes for the discretization of the non-linear (convective) terms are presented. The effects of several physical parameters on the concentration profile are investigated. The numerical results compare favorably with experimental data and analytical solutions. (C) 2011 Elsevier Inc. All rights reserved.
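The simplest of the upwinding ideas compared in such studies is first-order upwind differencing of the convective term. The sketch below applies it to 1-D advection on a uniform grid; the paper works with the full convective-diffusive system on a staggered grid in cylindrical coordinates, so this is only the core idea, not the paper's scheme.

```python
def upwind_step(c, u, dx, dt):
    """One explicit time step of 1-D advection  dc/dt + u dc/dx = 0
    with the first-order upwind scheme.  Assumes u > 0 (information
    travels left to right) and a fixed inlet value c[0]."""
    cfl = u * dt / dx
    assert u > 0 and cfl <= 1.0   # CFL condition for stability
    return [c[0]] + [c[i] - cfl * (c[i] - c[i - 1]) for i in range(1, len(c))]
```

Taking the difference from the upstream side is what keeps the explicit scheme stable under the CFL condition; central differencing of the same convective term would oscillate, which is exactly why upwinding variants are compared.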
Rehabilitation of a severely resorbed edentulous mandible using the modified visor osteotomy technique
Abstract:
The prosthetic rehabilitation of an atrophic mandible is usually unsatisfactory due to the lack of supporting tissues, mainly bone and keratinized mucosa, for treatment with osseointegrated implants or even a conventional prosthesis. Prosthetic instability leads to social and functional limitations, and chronic physical trauma decreases the patient's quality of life. A 53-year-old female patient sought care at our surgical service complaining of impaired masticatory function associated with instability of her lower complete denture. The clinical and complementary exams revealed edentulism in both arches, while the mandibular arch presented severe resorption resulting in denture instability and chronic trauma to the oral mucosa. The proposed treatment plan consisted of mandibular rehabilitation with osseointegrated implants and a fixed Brånemark-protocol prosthesis after mandibular reconstruction applying the modified visor osteotomy technique. The proposed technique offered predictable results for reconstruction of the severely resorbed edentulous mandible and subsequent rehabilitation with osseointegrated implants.
Abstract:
Onion (Allium cepa) is one of the most cultivated and consumed vegetables in Brazil, and its importance is due to the large labor force involved. One of the main pests affecting this crop is the onion thrips (Thrips tabaci), but the spatial distribution of this insect, although important, has not been considered in crop management recommendations, experimental planning or sampling procedures. Our purpose here is to consider statistical tools to detect and model spatial patterns of the occurrence of the onion thrips. To characterize the spatial distribution pattern of the onion thrips, a survey was carried out to record the number of insects in each development phase on onion plant leaves, on different dates and sample locations, in four rural properties with neighboring farms under different infestation levels and planting methods. The Mantel randomization test proved to be a useful tool to test for spatial correlation which, when detected, was described by a mixed spatial Poisson model with a geostatistical random component and parameters allowing for a characterization of the spatial pattern, as well as the production of prediction maps of susceptibility to levels of infestation throughout the area.
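The Mantel randomization test used for the spatial-correlation step is compact enough to sketch in full: correlate the off-diagonal entries of two distance matrices, then build the null distribution by permuting the sample labels of one of them. The matrix encoding, permutation count, and seed below are arbitrary choices for illustration.

```python
import random

def pearson(x, y):
    """Plain Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def mantel_test(d_space, d_counts, n_perm=999, seed=7):
    """Mantel randomization test between two n x n distance matrices.
    Returns (observed correlation, one-sided permutation p-value)."""
    n = len(d_space)
    idx = [(i, j) for i in range(n) for j in range(i + 1, n)]
    flat = lambda m, order: [m[order[i]][order[j]] for i, j in idx]
    ident = list(range(n))
    r_obs = pearson(flat(d_space, ident), flat(d_counts, ident))
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_perm):
        order = ident[:]
        rng.shuffle(order)   # relabel samples of one matrix only
        if pearson(flat(d_space, ident), flat(d_counts, order)) >= r_obs:
            hits += 1
    return r_obs, (hits + 1) / (n_perm + 1)
```

Permuting whole rows and columns together (rather than individual entries) is what keeps the null distribution honest: distances from the same sample stay tied to one label throughout.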
Abstract:
Below-cloud scavenging processes have been investigated by means of a numerical simulation, local atmospheric conditions and particulate matter (PM) concentrations at different sites in Germany. The below-cloud scavenging model was coupled with a bulk particulate matter counter (TSI Portacount) dataset, consisting of the variability prediction of the particulate air concentrations during chosen rain events. The TSI samples and meteorological parameters were obtained during three winter campaigns: at Deuselbach, March 1994, consisting of three different events; Sylt, April 1994; and Freiburg, March 1995. The results show a good agreement between modeled and observed air concentrations, emphasizing the quality of the conceptual model used in the below-cloud scavenging numerical modeling. The comparisons between modeled and observed data also yielded squared Pearson correlation coefficients above 0.7, statistically significant in all cases except the Freiburg campaign event. The differences between the numerical simulations and the observed dataset are explained by wind direction changes and, perhaps, the absence of mass advection terms in the model. These results validate previous works based on the same conceptual model.
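The conceptual core of below-cloud (washout) scavenging is first-order removal of the aerosol concentration by falling raindrops. The sketch below shows only that core, with the scavenging coefficient taken as a given constant; in practice it depends on rain rate and on the drop and particle size distributions, which is where the model's real complexity lives.

```python
import math

def washout(c0, scav_coeff_per_s, t_s):
    """Below-cloud scavenging of an aerosol concentration during rain:

        dC/dt = -L * C   =>   C(t) = C0 * exp(-L * t)

    with L the (here constant) scavenging coefficient in 1/s.
    """
    return c0 * math.exp(-scav_coeff_per_s * t_s)
```

The exponential form is what lets observed in-rain concentration decays be compared directly against the model: on a log scale the decay during one rain event is a straight line with slope -L.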
Abstract:
In this work we study the problem of model identification for a population, employing a discrete dynamic model based on the Richards growth model. The population is subjected to interventions due to consumption, such as hunting or the farming of animals. The model identification allows us to estimate the probability of, or the average time for, a population to reach a certain level. Parameter inference for these models is obtained with the use of the likelihood profile technique as developed in this paper. The identification method developed here can be applied to evaluate the productivity of animal husbandry or to evaluate the risk of extinction of autochthonous populations. It is applied to data on the Brazilian beef cattle herd population, and the time for the population to reach a certain goal level is investigated.
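A discrete Richards-type growth model with an intervention (consumption) term can be sketched as a one-line recursion. The discretization below is a common textbook form and an assumption here; the authors' exact formulation, and the stochastic component their likelihood inference targets, may differ.

```python
def richards_step(n, r, k, q, harvest=0.0):
    """One step of a discrete Richards growth model with removal:

        N[t+1] = N[t] + r * N[t] * (1 - (N[t]/K)**q) - harvest

    q = 1 recovers the logistic model; harvest models consumption.
    """
    return n + r * n * (1.0 - (n / k) ** q) - harvest

def trajectory(n0, r, k, q, harvest, steps):
    """Iterate the model and return the full population path."""
    xs = [n0]
    for _ in range(steps):
        xs.append(richards_step(xs[-1], r, k, q, harvest))
    return xs
```

Running trajectories like this for many candidate parameter sets, and asking when each path crosses a goal level, is the deterministic skeleton of the "time to reach a certain level" question the paper answers probabilistically.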
Abstract:
The enzyme purine nucleoside phosphorylase from Schistosoma mansoni (SmPNP) is an attractive molecular target for the development of novel drugs against schistosomiasis, a neglected tropical disease that affects about 200 million people worldwide. In the present work, enzyme kinetic studies were carried out in order to determine the potency and mechanism of inhibition of a series of SmPNP inhibitors. In addition to the biochemical investigations, crystallographic and molecular modeling studies revealed important molecular features for binding affinity towards the target enzyme, leading to the development of structure-activity relationships (SAR).
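Kinetic studies of the kind described determine potency (e.g., Ki) and mechanism by fitting rate equations to assay data. As one concrete candidate mechanism, the classical Michaelis-Menten rate with competitive inhibition is sketched below; whether a given SmPNP inhibitor actually follows this scheme is precisely what such assays establish, so this formula is illustrative, not a statement about the paper's compounds.

```python
def velocity(s, vmax, km, inhibitor=0.0, ki=float("inf")):
    """Michaelis-Menten rate with competitive inhibition:

        v = Vmax * [S] / (Km * (1 + [I]/Ki) + [S])

    With [I] = 0 (or Ki = infinity) this reduces to plain
    Michaelis-Menten kinetics.
    """
    return vmax * s / (km * (1.0 + inhibitor / ki) + s)
```

The diagnostic signature of this mechanism is that the inhibitor raises the apparent Km while leaving Vmax untouched: at saturating substrate the inhibited and uninhibited rates converge.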
Abstract:
An important approach to cancer therapy is the design of small-molecule modulators that interfere with microtubule dynamics through their specific binding to the β-subunit of tubulin. In the present work, comparative molecular field analysis (CoMFA) studies were conducted on a series of discodermolide analogs with antimitotic properties. Significant correlation coefficients were obtained (CoMFA(i), q² = 0.68, r² = 0.94; CoMFA(ii), q² = 0.63, r² = 0.91), indicating the good internal and external consistency of the models generated using two independent structural alignment strategies. The models were externally validated employing a test set, and the predicted values were in good agreement with the experimental results. The final QSAR models and the 3D contour maps provided important insights into the chemical and structural basis involved in the molecular recognition process of this family of discodermolide analogs, and should be useful for the design of new specific β-tubulin modulators with potent anticancer activity.
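The q² values quoted in such QSAR work are leave-one-out cross-validated correlation coefficients, q² = 1 - PRESS / SS_total. The statistic itself is easy to sketch; here a plain least-squares line stands in for the PLS regression that CoMFA actually runs on the molecular-field descriptors (an assumption made to keep the example self-contained).

```python
def fit_line(xs, ys):
    """Ordinary least-squares line (stand-in for the PLS engine)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    return slope, my - slope * mx

def predict_line(model, x):
    slope, intercept = model
    return slope * x + intercept

def q2_loo(x, y, fit=fit_line, predict=predict_line):
    """Leave-one-out cross-validated q^2 = 1 - PRESS / SS_total."""
    press = 0.0
    for i in range(len(y)):
        model = fit(x[:i] + x[i + 1:], y[:i] + y[i + 1:])  # refit without sample i
        press += (y[i] - predict(model, x[i])) ** 2        # predict the held-out value
    my = sum(y) / len(y)
    return 1.0 - press / sum((v - my) ** 2 for v in y)
```

Unlike r², which is computed on the fitted training set, q² penalizes models that only interpolate their training data, which is why both values are reported for each CoMFA model.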