232 results for Statistical modeling technique


Relevance:

100.00%

Publisher:

Abstract:

Honeycomb structures have been used in different engineering fields. In civil engineering, honeycomb fiber-reinforced polymer (FRP) structures have been used as bridge decks to rehabilitate highway bridges in the United States. In this work, a simplified finite-element modeling technique for honeycomb FRP bridge decks is presented. The motivation is that the complex geometry of honeycomb FRP decks, combined with computational limits, may prevent these decks from being modeled in detail. The results from static and modal analyses indicate that the proposed modeling technique provides a viable tool for modeling the complex geometry of honeycomb FRP bridge decks. The modeling of other bridge components (e.g., steel girders, steel guardrails, deck-to-girder connections, and pier supports) is also presented in this work.

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: To establish the evolution of the prevalence of malnutrition in the Brazilian population of children under five years of age between 1996 and 2007 and to identify the main factors responsible for this evolution. METHODS: The data analyzed come from Demographic Health Surveys carried out in Brazil in 1996 and 2006/7 on probability samples of about 4,000 children under five years of age. The identification of the factors responsible for the temporal variation in the prevalence of malnutrition (height-for-age below -2 z-scores; 2006 WHO standard) considered changes in the distribution of four potential determinants of nutritional status. Statistical modeling of the independent association between each determinant and the risk of malnutrition in each survey, together with the calculation of partial attributable fractions, was used to assess the relative importance of each factor in the evolution of child malnutrition. RESULTS: The prevalence of malnutrition was reduced by about 50%: from 13.5% (95% CI: 12.1%;14.8%) in 1996 to 6.8% (5.4%;8.3%) in 2006/7. Two thirds of this reduction could be attributed to the favorable evolution of the four factors studied: 25.7% to the increase in maternal schooling; 21.7% to the growth in family purchasing power; 11.6% to the expansion of health care; and 4.3% to the improvement in sanitation conditions. CONCLUSIONS: The annual decline of 6.3% in the proportion of children with height-for-age deficits indicates that, in about ten more years, child malnutrition could cease to be a public health problem in Brazil. Achieving this result will depend on maintaining the economic and social policies that have favored the increase in the purchasing power of the poorest, and on public investments that complete the universalization of access of the Brazilian population to essential education, health, and sanitation services.
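As a rough illustration of the attributable-fraction reasoning described above, the sketch below computes a population attributable fraction with Levin's formula in Python. The exposure prevalence and relative risk are hypothetical placeholders; the study's actual partial attributable fractions were derived from its fitted survey models.

# Hypothetical illustration of a population attributable fraction (PAF) calculation,
# a simplified stand-in for the partial attributable fractions used in the study.
# The prevalence and relative-risk values below are made up for demonstration only.

def attributable_fraction(p_exposed, relative_risk):
    """Levin's formula: share of cases attributable to the exposure."""
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical exposure: low maternal schooling, with an assumed relative risk of stunting.
paf_1996 = attributable_fraction(p_exposed=0.40, relative_risk=2.0)
paf_2006 = attributable_fraction(p_exposed=0.20, relative_risk=2.0)
print(f"PAF 1996: {paf_1996:.2%}, PAF 2006: {paf_2006:.2%}")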

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: To describe the temporal variation in the prevalence of child malnutrition in the Northeast region of Brazil over two successive periods, identifying the main factors responsible for the evolution observed in each period. METHODS: The data analyzed come from probability samples of the population of children under five years of age studied by household surveys of the Demographic Health Surveys program carried out in 1986 (n=1,302), 1996 (n=1,108), and 2006 (n=950). The identification of the factors responsible for the variation in the prevalence of malnutrition (height-for-age < -2 z) took into account changes in the frequency of five potential determinants of nutritional status, statistical modeling of the independent association between each determinant and the risk of malnutrition at the beginning of each period, and the calculation of attributable fractions. RESULTS: The prevalence of malnutrition fell by one third from 1986 to 1996 (from 33.9% to 22.2%) and by almost three quarters from 1996 to 2006 (from 22.2% to 5.9%). Improvements in maternal schooling and in the availability of sanitation services were particularly important for the decline in malnutrition in the first period, while in the second period the increase in the purchasing power of the poorest families and, again, the improvement in maternal schooling were decisive. CONCLUSIONS: The acceleration of the decline in malnutrition from the first to the second period was consistent with the acceleration of improvements in maternal schooling, sanitation, health care, and reproductive history and, above all, with the exceptional increase in family purchasing power observed only in the second period. If the rate of decline observed between 1996 and 2006 is maintained, child malnutrition in the Northeast region could be considered under control in less than ten years. Achieving this result will require maintaining the increase in the purchasing power of the poorest and ensuring public investments to complete the universalization of access to essential education, health, and sanitation services.

Relevance:

50.00%

Publisher:

Abstract:

We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N(s) elliptical Gaussian sources characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. Those images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained from the traditional Astronomical Image Processing System (AIPS) task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to show quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique must be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower for a single processor, depending on the number of sources to be optimized). As in the case of any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
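The cross-entropy fitting idea can be sketched compactly: sample candidate parameter vectors, score them by the summed squared difference between model and observed images, and re-center the sampling distribution on the best candidates. The Python sketch below fits a single elliptical Gaussian (parameterized here by center, amplitude, two widths, and orientation angle, a slight variation on the paper's six parameters) to a synthetic, noise-free image; it is a minimal illustration under stated assumptions, not the authors' implementation.

# Minimal sketch of cross-entropy (CE) model fitting of a single elliptical Gaussian
# to an image.  The image, parameter ranges, and CE settings are hypothetical and much
# simpler than the paper's benchmark jets.
import numpy as np

ny, nx = 64, 64
yy, xx = np.mgrid[0:ny, 0:nx]

def elliptical_gaussian(p):
    x0, y0, amp, sx, sy, theta = p
    xr = (xx - x0) * np.cos(theta) + (yy - y0) * np.sin(theta)
    yr = -(xx - x0) * np.sin(theta) + (yy - y0) * np.cos(theta)
    return amp * np.exp(-0.5 * ((xr / sx) ** 2 + (yr / sy) ** 2))

true_params = np.array([30.0, 34.0, 1.0, 6.0, 3.0, 0.5])   # hypothetical "observed" source
observed = elliptical_gaussian(true_params)

rng = np.random.default_rng(0)
mean = np.array([32.0, 32.0, 0.5, 5.0, 5.0, 0.0])           # initial parameter guesses
std = np.array([10.0, 10.0, 0.5, 3.0, 3.0, 1.0])
n_samples, n_elite = 200, 20

for _ in range(60):
    samples = rng.normal(mean, std, size=(n_samples, 6))
    costs = [np.sum((elliptical_gaussian(p) - observed) ** 2) for p in samples]
    elite = samples[np.argsort(costs)[:n_elite]]
    mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6  # re-center; avoid premature collapse
print("recovered parameters:", np.round(mean, 2))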

Relevance:

40.00%

Publisher:

Abstract:

We describe an estimation technique for biomass burning emissions in South America based on a combination of remote-sensing fire products and field observations: the Brazilian Biomass Burning Emission Model (3BEM). For each fire pixel detected by remote sensing, the mass of the emitted tracer is calculated based on field observations of fire properties related to the type of vegetation burning. The burnt area is estimated from the instantaneous fire size retrieved by remote sensing, when available, or from statistical properties of the burn scars. The sources are then spatially and temporally distributed and assimilated daily by the Coupled Aerosol and Tracer Transport model to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS) in order to produce prognoses of the related tracer concentrations. Three other biomass burning inventories, including GFEDv2 and EDGAR, are used simultaneously to compare the emission strength in terms of the resultant tracer distribution. We also assess the effect of using the daily time resolution of fire emissions by including runs with monthly averaged emissions. We evaluate the performance of the model using the different emission estimation techniques by comparing the model results with direct near-surface and airborne measurements of carbon monoxide, as well as with remote-sensing-derived products. The model results obtained using the 3BEM methodology introduced in this paper show relatively good agreement with the direct measurements and the MOPITT data product, suggesting the reliability of the model at local to regional scales.
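A schematic of the per-pixel bookkeeping described above, assuming a Seiler-and-Crutzen-type relation (emitted mass = burnt area x biomass density x combustion factor x emission factor); the vegetation properties below are placeholders rather than 3BEM's field-derived values.

# Schematic sketch of a per-fire-pixel emission estimate in the spirit of 3BEM,
# assuming the classic relation
#   mass_emitted = burnt_area * biomass_density * combustion_factor * emission_factor.
# All coefficient values below are placeholders, not the model's field-derived numbers.

VEGETATION_PROPERTIES = {
    # hypothetical values: (biomass density [kg/m2], combustion factor [-], CO emission factor [g/kg])
    "tropical_forest": (30.0, 0.4, 100.0),
    "savanna": (7.0, 0.9, 65.0),
}

def co_emission_for_pixel(burnt_area_m2, vegetation_type):
    """Return the CO mass (kg) emitted by one detected fire pixel."""
    biomass, comb_factor, ef_g_per_kg = VEGETATION_PROPERTIES[vegetation_type]
    dry_matter_burned = burnt_area_m2 * biomass * comb_factor   # kg of dry matter consumed
    return dry_matter_burned * ef_g_per_kg / 1000.0             # kg of CO emitted

print(co_emission_for_pixel(burnt_area_m2=0.22e6, vegetation_type="savanna"))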

Relevance:

40.00%

Publisher:

Abstract:

In this study, the concept of cellular automata is applied in an innovative way to simulate the separation of phases in a water/oil emulsion. The velocity of the water droplets is calculated from the balance of forces acting on a pair of droplets in a group, and a cellular automaton is used to simulate the whole group of droplets. Thus, it is possible to solve the problem stochastically and to follow the sequence of droplet collisions and coalescence phenomena. This methodology enables the calculation of the amount of water that can be separated from the emulsion under different operating conditions, thus allowing the process to be optimized. Comparisons between the results obtained from the developed model and the operational performance of an actual desalting unit are carried out. The accuracy observed shows that the developed model is a good representation of the actual process.
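A toy version of the idea is sketched below under strong simplifying assumptions: droplets on a 1-D lattice settle one cell per step and coalesce on contact, and droplets reaching the bottom cell are counted as separated water. The rules are illustrative only and do not reproduce the authors' force-balance velocities.

# Toy cellular-automaton sketch of water droplets settling and coalescing in an oil
# column.  The 1-D lattice, the one-cell-per-step settling rule, and the coalescence
# rule are simplifying assumptions for illustration, not the authors' model.
import random

N_CELLS, N_STEPS = 60, 80
random.seed(1)
# cell value 0 = oil, positive value = droplet volume (arbitrary units); the last cell is the bottom
column = [random.choice([0, 0, 0, 1]) for _ in range(N_CELLS)]
separated_water = 0.0

for _ in range(N_STEPS):
    # droplets in the bottom cell leave the emulsion and join the separated water phase
    separated_water += column[-1]
    column[-1] = 0
    # visit cells from near-bottom to top so each droplet moves at most one cell per step
    for i in range(N_CELLS - 2, -1, -1):
        if column[i] > 0:
            if column[i + 1] == 0:
                column[i + 1], column[i] = column[i], 0                     # settle one cell
            else:
                column[i + 1], column[i] = column[i + 1] + column[i], 0     # coalesce

total_water = separated_water + sum(column)
print(f"water separated after {N_STEPS} steps: {separated_water / total_water:.0%}")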

Relevance:

30.00%

Publisher:

Abstract:

Onion (Allium cepa) is one of the most widely cultivated and consumed vegetables in Brazil, and its importance is due in part to the large labor force involved. One of the main pests affecting this crop is the onion thrips (Thrips tabaci), but the spatial distribution of this insect, although important, has not been considered in crop management recommendations, experimental planning, or sampling procedures. Our purpose here is to consider statistical tools to detect and model spatial patterns of onion thrips occurrence. To characterize the spatial distribution pattern of the onion thrips, a survey was carried out to record the number of insects in each development phase on onion plant leaves, on different dates and at different sample locations, in four rural properties with neighboring farms under different infestation levels and planting methods. The Mantel randomization test proved to be a useful tool to test for spatial correlation which, when detected, was described by a mixed spatial Poisson model with a geostatistical random component and parameters allowing for a characterization of the spatial pattern, as well as the production of prediction maps of susceptibility to different levels of infestation throughout the area.
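The Mantel randomization test mentioned above can be sketched in a few lines: correlate the geographic distance matrix with a count-dissimilarity matrix and build the null distribution by permuting sample labels. The coordinates and counts below are hypothetical.

# Minimal sketch of a Mantel randomization test for spatial correlation.
# Sample locations and thrips counts are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 40
coords = rng.uniform(0, 100, size=(n, 2))           # hypothetical sample locations (m)
counts = rng.poisson(5.0, size=n).astype(float)     # hypothetical thrips counts

# Pairwise matrices: geographic distance and dissimilarity in counts
geo = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(axis=-1))
attr = np.abs(counts[:, None] - counts[None, :])

iu = np.triu_indices(n, k=1)                        # upper triangle, excluding the diagonal

def mantel_statistic(a, b):
    return np.corrcoef(a[iu], b[iu])[0, 1]

observed = mantel_statistic(geo, attr)

# Null distribution: permute the labels of one matrix (rows and columns together)
n_perm = 999
null = np.empty(n_perm)
for i in range(n_perm):
    p = rng.permutation(n)
    null[i] = mantel_statistic(geo, attr[np.ix_(p, p)])

p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
print(f"Mantel r = {observed:.3f}, one-sided p = {p_value:.3f}")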

Relevance:

30.00%

Publisher:

Abstract:

study-specific results, their findings should be interpreted with caution

Relevance:

30.00%

Publisher:

Abstract:

Objective: The biochemical alterations between inflammatory fibrous hyperplasia (IFH) and normal tissues of the buccal mucosa were probed by using the FT-Raman spectroscopy technique. The aim was to find the minimal set of Raman bands that would furnish the best discrimination. Background: Raman-based optical biopsy is widely recognized as a potential technique for noninvasive real-time diagnosis. However, few studies have been devoted to the discrimination of very common subtle or early pathologic states, such as the inflammatory processes that are always present on, for example, cancer lesion borders. Methods: Seventy IFH spectra from 14 patients were compared with 30 spectra of normal tissues from six patients. The statistical analysis was performed with principal components analysis and soft independent modeling of class analogy (SIMCA), using cross-validated, leave-one-out methods. Results: Bands close to 574, 1,100, 1,250 to 1,350, and 1,500 cm(-1) (mainly amino acid and collagen bands) showed the main intragroup variations, which are due to the acanthosis process in the IFH epithelium. The 1,200 (C-C aromatic/DNA), 1,350 (CH(2) bending/collagen I), and 1,730 cm(-1) (collagen III) regions presented the main intergroup variations. This finding was interpreted as originating in an extracellular matrix degeneration process occurring in the inflammatory tissues. The statistical analysis indicated that the best discrimination capability (sensitivity of 95% and specificity of 100%) was obtained by using the 530-580 cm(-1) spectral region. Conclusions: The existence of this narrow spectral window enabling the diagnosis of normal and inflammatory tissue also has useful implications for an in vivo dispersive Raman setup for clinical applications.
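A simplified stand-in for the reported workflow is sketched below: synthetic spectra are classified with PCA followed by a linear discriminant under leave-one-out cross-validation (scikit-learn has no SIMCA implementation, so a discriminant in PCA space is used instead).

# Simplified sketch of PCA-based discrimination of spectra with leave-one-out
# cross-validation.  Synthetic spectra and a linear discriminant in PCA space stand
# in for the SIMCA analysis reported in the abstract.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_normal, n_lesion, n_points = 30, 70, 200            # mimicking the study's group sizes
wavenumbers = np.linspace(400, 1800, n_points)

# Synthetic "spectra": a broad background plus a group-dependent band near 1,350 cm^-1
def synth(n, band_height):
    base = np.exp(-((wavenumbers - 1000) / 600) ** 2)
    band = band_height * np.exp(-((wavenumbers - 1350) / 30) ** 2)
    return base + band + 0.05 * rng.standard_normal((n, n_points))

X = np.vstack([synth(n_normal, 0.2), synth(n_lesion, 0.5)])
y = np.array([0] * n_normal + [1] * n_lesion)

model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2%}")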

Relevance:

30.00%

Publisher:

Abstract:

Confined flows in tubes with permeable surfaces are associated with tangential filtration processes (microfiltration or ultrafiltration). The complexity of the phenomena does not allow for the development of exact analytical solutions; however, approximate solutions are of great interest for the calculation of the transmembrane outflow and the estimation of the concentration polarization phenomenon. In the present work, the generalized integral transform technique (GITT) was employed in solving the steady laminar flow of a Newtonian, incompressible fluid in permeable tubes. The mathematical formulation employed the parabolic differential equation of chemical species conservation (the convective-diffusive equation). The velocity profiles for the entrance-region flow, which appear in the convective terms of the equation, were taken from solutions available in the literature. The velocity at the permeable wall was considered uniform, with the concentration at the tube wall regarded as varying with axial position. A computational methodology using global error control was applied to determine the wall concentration and the concentration boundary layer thickness. The results obtained for the local transmembrane flux and the concentration boundary layer thickness were compared against others in the literature.
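For reference, a commonly used form of the species-conservation (convective-diffusive) equation for this geometry, written in cylindrical coordinates with axial diffusion neglected and complete solute rejection assumed at the permeable wall; this is a generic statement of the problem, not necessarily the exact formulation solved in the paper:

u(r,z)\,\frac{\partial C}{\partial z} + v(r,z)\,\frac{\partial C}{\partial r}
  = \frac{D}{r}\,\frac{\partial}{\partial r}\!\left( r\,\frac{\partial C}{\partial r} \right),
\qquad
\left.\frac{\partial C}{\partial r}\right|_{r=0} = 0,
\qquad
\left. D\,\frac{\partial C}{\partial r}\right|_{r=R} = v_w\,C(R,z),

where u and v are the axial and radial velocity components, D is the solute diffusivity, R is the tube radius, and v_w is the (uniform) permeation velocity at the wall.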

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm. Instead of using the same set of features throughout training, the AME approach tries to insert or remove a single feature at each iteration. The aim is to reach convergence faster without affecting the performance of the generated models. Preliminary experiments showed gains in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several important research directions are proposed as future work.
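The insert-or-remove-one-feature idea can be sketched as a greedy wrapper around a MaxEnt-like classifier. In the illustration below, a logistic-regression model (closely related to MaxEnt) and synthetic presence/absence data stand in for the real setting; this is not the authors' algorithm, only the general search scheme.

# Schematic sketch of the "insert or remove one feature per iteration" idea.
# A logistic-regression model stands in for MaxEnt, and the data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, n_features = 500, 12
X = rng.standard_normal((n, n_features))            # hypothetical environmental features
y = (X[:, 0] + 0.5 * X[:, 3] - 0.8 * X[:, 7] + rng.standard_normal(n) > 0).astype(int)

def score(feature_set):
    if not feature_set:
        return -np.inf
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[:, sorted(feature_set)], y, cv=5).mean()

selected, best = set(), -np.inf
improved = True
while improved:
    improved = False
    # candidate moves: insert one unused feature or remove one selected feature
    moves = [selected | {f} for f in range(n_features) if f not in selected]
    moves += [selected - {f} for f in selected]
    for candidate in moves:
        s = score(candidate)
        if s > best + 1e-4:                          # accept any single-feature change that helps
            selected, best, improved = candidate, s, True
print("selected features:", sorted(selected), "cv accuracy:", round(best, 3))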

Relevance:

30.00%

Publisher:

Abstract:

Cementitious stabilization of aggregates and soils is an effective technique for increasing the stiffness of base and subbase layers. Furthermore, cementitious bases can improve the fatigue behavior of asphalt surface layers and reduce subgrade rutting over the short and long term. However, stabilization can lead to additional distresses, such as shrinkage and fatigue, in the stabilized layers. Extensive research has tested and characterized these materials experimentally; however, very little of this research attempts to correlate the mechanical properties of the stabilized layers with their performance. The Mechanistic-Empirical Pavement Design Guide (MEPDG) provides a promising theoretical framework for modeling pavements containing cementitiously stabilized materials (CSMs). However, significant improvements are needed to bring the modeling of semirigid pavements in the MEPDG to the same level as that of flexible and rigid pavements. Furthermore, the MEPDG does not model CSMs in a manner similar to that used for hot-mix asphalt or portland cement concrete materials. As a result, performance gains from stabilized layers are difficult to assess using the MEPDG. The current characterization of CSMs was evaluated, and issues with CSM modeling and characterization in the MEPDG were discussed. Addressing these issues will help designers quantify the benefits of stabilization for pavement service life.

Relevance:

30.00%

Publisher:

Abstract:

Interval-censored survival data, in which the event of interest is not observed exactly but is only known to occur within some time interval, occur very frequently. In some situations, event times might be censored into different, possibly overlapping intervals of variable widths; however, in other situations, information is available for all units at the same observed visit time. In the latter cases, interval-censored data are termed grouped survival data. Here we present alternative approaches for analyzing interval-censored data. We illustrate these techniques using a survival data set involving mango tree lifetimes. This study is an example of grouped survival data.
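As an illustration of the likelihood behind such analyses, the sketch below fits a Weibull model to interval-censored lifetimes by maximizing the sum over units of log[S(L_i) - S(R_i)]; the intervals are hypothetical and are not the mango-tree data.

# Minimal sketch of a parametric (Weibull) fit to interval-censored lifetimes by
# maximizing the likelihood sum(log(S(L_i) - S(R_i))).  The intervals below are
# hypothetical grouped-survival data, not the paper's data set.
import numpy as np
from scipy.optimize import minimize

# (lower, upper) bounds of the interval in which each failure is known to occur;
# np.inf marks a unit still alive at its last visit (right-censored)
intervals = np.array([(2, 4), (4, 6), (0, 2), (6, np.inf), (4, 6), (2, 4), (6, np.inf)], dtype=float)

def weibull_survival(t, shape, scale):
    return np.exp(-(t / scale) ** shape)

def negative_log_likelihood(log_params):
    shape, scale = np.exp(log_params)                 # log scale keeps both parameters positive
    lower, upper = intervals[:, 0], intervals[:, 1]
    prob = weibull_survival(lower, shape, scale) - weibull_survival(upper, shape, scale)
    return -np.sum(np.log(prob + 1e-12))              # P(L < T <= U) for each unit

fit = minimize(negative_log_likelihood, x0=np.log([1.0, 5.0]), method="Nelder-Mead")
shape_hat, scale_hat = np.exp(fit.x)
print(f"fitted Weibull: shape = {shape_hat:.2f}, scale = {scale_hat:.2f}")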

Relevance:

30.00%

Publisher:

Abstract:

This article presents a statistical model of agricultural yield data based on a set of hierarchical Bayesian models that allows joint modeling of temporal and spatial autocorrelation. This method captures a comprehensive range of the various uncertainties involved in predicting crop insurance premium rates as opposed to the more traditional ad hoc, two-stage methods that are typically based on independent estimation and prediction. A panel data set of county-average yield data was analyzed for 290 counties in the State of Parana (Brazil) for the period of 1990 through 2002. Posterior predictive criteria are used to evaluate different model specifications. This article provides substantial improvements in the statistical and actuarial methods often applied to the calculation of insurance premium rates. These improvements are especially relevant to situations where data are limited.
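Once a model of this kind yields a posterior predictive distribution for county yield, the premium-rate step reduces to simple arithmetic, as in the hypothetical sketch below; the normal draws stand in for the hierarchical model's posterior predictive output, and the coverage level is arbitrary.

# Schematic sketch of turning posterior predictive yield draws into a crop insurance
# premium rate.  The normal "posterior predictive" draws are a synthetic stand-in for
# the hierarchical Bayesian model's output; the coverage level is hypothetical.
import numpy as np

rng = np.random.default_rng(7)
predictive_yield = rng.normal(loc=2800.0, scale=450.0, size=20_000)   # kg/ha, hypothetical draws
coverage = 0.70                                                       # 70% coverage level
guarantee = coverage * predictive_yield.mean()                        # insured yield guarantee

indemnity = np.maximum(guarantee - predictive_yield, 0.0)             # shortfall paid per draw
premium_rate = indemnity.mean() / guarantee                           # fair premium per unit of liability
print(f"actuarially fair premium rate: {premium_rate:.2%}")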

Relevance:

30.00%

Publisher:

Abstract:

Histamine is an important biogenic amine that acts on a family of four G-protein-coupled receptors (GPCRs), namely the H(1) to H(4) (H(1)R-H(4)R) receptors. The actions of histamine at H(4)R are related to immunological and inflammatory processes, particularly in the pathophysiology of asthma, and H(4)R ligands with antagonistic properties could be helpful as anti-inflammatory agents. In this work, molecular modeling and QSAR studies of a set of 30 compounds, indole and benzimidazole derivatives acting as H(4)R antagonists, were performed. The QSAR models were built and optimized using a genetic algorithm function and partial least squares regression (WOLF 5.5 program). The best QSAR model, constructed with the training set (N = 25), presented the following statistical measures: r(2) = 0.76, q(2) = 0.62, LOF = 0.15, and LSE = 0.07, and was validated using the leave-N-out (LNO) and y-randomization techniques. Four of the five compounds in the test set were well predicted by the selected QSAR model, which presented an external prediction power of 80%. These findings can be quite useful in aiding the design of new anti-H(4) compounds with improved biological response.
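The r(2) and q(2) (leave-one-out) statistics quoted above can be computed mechanically for any descriptor matrix, as in the sketch below, which fits a PLS regression model to synthetic descriptors and activities; the genetic-algorithm descriptor selection used in the paper is omitted.

# Minimal sketch of fitting a PLS regression QSAR model and computing r2 and a
# leave-one-out q2.  The descriptor matrix and activities are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(3)
n_compounds, n_descriptors = 25, 8                  # training-set size as in the abstract
X = rng.standard_normal((n_compounds, n_descriptors))
y = X[:, 0] - 0.5 * X[:, 2] + 0.2 * rng.standard_normal(n_compounds)   # synthetic activity values

pls = PLSRegression(n_components=3).fit(X, y)
r2 = pls.score(X, y)                                # coefficient of determination on the training set

y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut())
q2 = 1 - np.sum((y - y_loo.ravel()) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"r2 = {r2:.2f}, q2(LOO) = {q2:.2f}")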