916 results for Nelson and Siegel model


Relevance: 100.00%

Abstract:

Cuboctahedron (CUB) and icosahedron (ICO) model structures are widely used in the study of transition-metal (TM) nanoparticles (NPs); however, they might not provide a reliable description of small TM NPs such as the Pt55 and Au55 systems in the gas phase. In this work, we combined density-functional theory calculations with atomic configurations generated by the basin-hopping Monte Carlo algorithm within the empirical Sutton-Chen embedded-atom potential. We identified alternative configurations lower in energy than the ICO and CUB model structures; e.g., our lowest-energy structures are 5.22 eV (Pt55) and 2.01 eV (Au55) below ICO. The energy gain arises from the diffusion of Pt and Au atoms from the ICO core region to the NP surface, driven by the surface compression of the ICO core region (only 12 atoms). Consequently, in the lowest-energy configurations the core shrinks from 13 atoms (ICO, CUB) to about 9 atoms, while the NP surface grows from 42 atoms (ICO, CUB) to about 46 atoms. This mechanism can provide an improved atom-level understanding of the reconstruction of small TM NPs.
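The global-optimization step named above (basin-hopping Monte Carlo) can be sketched in miniature. The toy below replaces the Sutton-Chen energy surface with a hypothetical 1-D double-well potential and uses a crude coordinate-descent relaxation; it only illustrates the perturb / relax / Metropolis-accept loop, not the authors' workflow:

```python
import math
import random

def basin_hopping(energy, x0, steps=200, step_size=0.5, temperature=1.0, seed=0):
    """Minimal basin-hopping sketch: perturb, locally relax, accept by Metropolis."""
    rng = random.Random(seed)

    def relax(x):
        # crude local minimisation on a 1-D function (stand-in for a real relaxer)
        for _ in range(100):
            e = energy(x)
            for dx in (-0.01, 0.01):
                if energy(x + dx) < e:
                    x += dx
                    break
            else:
                break
        return x

    x = relax(x0)
    best_x, best_e = x, energy(x)
    for _ in range(steps):
        trial = relax(x + rng.uniform(-step_size, step_size))
        de = energy(trial) - energy(x)
        if de < 0 or rng.random() < math.exp(-de / temperature):
            x = trial
            if energy(x) < best_e:
                best_x, best_e = x, energy(x)
    return best_x, best_e

# hypothetical asymmetric double-well potential; global minimum near x = -2
f = lambda x: (x * x - 4) ** 2 + x
xmin, emin = basin_hopping(f, x0=-2.0)
```

The Metropolis criterion lets the walk escape the local basin; tracking `best_x` separately keeps the lowest-energy configuration found.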

Relevance: 100.00%

Abstract:

The Brazilian Atlantic Forest is one of the richest biodiversity hotspots in the world. Paleoclimatic models have predicted two large stability regions in its northern and central parts, whereas southern regions might have suffered strong instability during Pleistocene glaciations. Molecular phylogeographic and endemism studies nevertheless show contradictory results: although some results validate these predictions, other data suggest that paleoclimatic models fail to predict stable rainforest areas in the south. Most studies, however, have surveyed species with relatively high dispersal rates, whereas taxa with lower dispersal capabilities should be better predictors of habitat stability. Here, we have used two land planarian species as model organisms to analyse the patterns and levels of nucleotide diversity at a locality within the Southern Atlantic Forest. We find that both species harbour high levels of genetic variability without exhibiting the molecular footprint of recent colonization or population expansion, suggesting a long-term stability scenario. The results therefore indicate that paleoclimatic models may fail to detect refugia in the Southern Atlantic Forest, and that model organisms with low dispersal capability can improve the resolution of these models.
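The core statistic behind such analyses, nucleotide diversity (pi, the average pairwise proportion of differing sites), can be computed from aligned sequences as follows; the sequences here are made up, not planarian data:

```python
from itertools import combinations

def nucleotide_diversity(seqs):
    """Average pairwise proportion of differing sites (pi) for aligned sequences."""
    if len(seqs) < 2:
        return 0.0
    length = len(seqs[0])
    assert all(len(s) == length for s in seqs), "sequences must be aligned"
    pairs = list(combinations(seqs, 2))
    diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
    return diffs / (len(pairs) * length)

# three toy aligned haplotypes
pi = nucleotide_diversity(["ACGTACGT", "ACGTACGA", "ACGAACGT"])
```

High pi without an excess of low-frequency variants is the kind of signature the abstract interprets as long-term stability.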

Relevance: 100.00%

Abstract:

Banana, an important component in the diet of the global population, is one of the most consumed fruits in the world. The fruit is also well suited to industrial processing (e.g., fermented beverages) owing to its rich content of soluble solids and minerals and its low acidity. The main objective of this work was to evaluate the influence of factors such as banana weight and extraction time on the total soluble solids content of banana obtained during a hot aqueous extraction process. The extract is intended for use by the food and beverage industries. The experiments were performed with 105 mL of water, taking into account the moisture content of the ripe banana (65%). Total sugar concentrations were obtained in a beer analyzer and the results expressed in degrees Plato (°P, the weight of the extract, or its sugar equivalent, in 100 g of solution at 20 °C), to facilitate the use of these results by the beverage industries. After preliminary studies of fruit characterization and ripening performance, a 2^2 full-factorial star (central composite) design was carried out, and a model was developed to describe the behavior of the dependent variable (total soluble solids) as a function of the factors (banana weight and extraction time), indicating 38.5 g of banana and 39.7 min as the optimum extraction conditions.
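A 2^2 full-factorial design with star points is a central composite design. The sketch below generates its coded points (factorial corners, axial "star" points at the rotatable distance alpha = (2^k)^(1/4), and a center point); it illustrates the design geometry, not the authors' statistical software:

```python
import itertools

def central_composite_design(k=2, alpha=None):
    """Coded points of a 2^k full-factorial star (central composite) design."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25      # rotatable axial distance
    factorial = list(itertools.product([-1.0, 1.0], repeat=k))
    star = []
    for i in range(k):
        for a in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = a
            star.append(tuple(pt))
    center = [tuple([0.0] * k)]
    return factorial + star + center

points = central_composite_design()   # k=2: 4 corners + 4 star points + 1 center
```

Fitting a second-order polynomial over these nine runs is what yields an optimum such as the 38.5 g / 39.7 min combination reported above.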

Relevance: 100.00%

Abstract:

The kinetics of the ethoxylation of fatty alcohols catalyzed by potassium hydroxide was studied to obtain the rate constants for modeling of the industrial process. Experimental data obtained in a lab-scale semibatch autoclave reactor were used to evaluate kinetic and equilibrium parameters. The kinetic model was employed to model the performance of an industrial-scale spray tower reactor for fatty alcohol ethoxylation. The reactor model considers that mass transfer and reaction occur independently in two distinct zones of the reactor. Good agreement between the model predictions and real data was found. These findings confirm the reliability of the kinetic and reactor model for simulating fatty alcohol ethoxylation processes under industrial conditions.
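A semibatch ethoxylation balance of the general kind described above can be sketched with explicit Euler integration. Everything here is hypothetical (second-order rate law, constant catalytic site count, arbitrary parameter values); the paper's two-zone spray-tower model is far more detailed:

```python
def semibatch_ethoxylation(k=0.05, feed_rate=0.1, t_end=50.0, dt=0.01):
    """Toy semibatch balance: ethylene oxide (EO) fed at a constant rate
    reacts with alkoxide sites R at rate r = k*EO*R (sites regenerate)."""
    eo, r_sites, adduct = 0.0, 1.0, 0.0   # mol, arbitrary basis
    t = 0.0
    while t < t_end:
        rate = k * eo * r_sites
        eo += (feed_rate - rate) * dt     # EO accumulates minus consumption
        adduct += rate * dt               # moles of EO incorporated into product
        t += dt
    return eo, adduct

eo_left, eo_reacted = semibatch_ethoxylation()
```

The dissolved EO concentration approaches the quasi-steady value feed_rate/(k*r_sites), which is the kind of balance a kinetic model fitted to lab autoclave data must reproduce.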

Relevance: 100.00%

Abstract:

Oxidation processes can be used to treat industrial wastewater containing non-biodegradable organic compounds. However, the presence of dissolved salts may inhibit or retard the treatment process. In this study, wastewater desalination by electrodialysis (ED) associated with an advanced oxidation process (photo-Fenton) was applied to an aqueous NaCl solution containing phenol. The influence of process variables on the demineralization factor was investigated for ED at pilot scale, and a correlation was obtained between the phenol, salt, and water fluxes and the driving force. The oxidation process was investigated in a laboratory batch reactor, and a model based on artificial neural networks was developed by fitting the experimental data, describing the reaction rate as a function of the input variables. With the experimental parameters of both processes, a dynamic model was developed for ED and a continuous model, using a plug-flow reactor approach, for the oxidation process. Finally, the hybrid model simulation was validated against different scenarios of the integrated system and can be used for process optimization.
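The plug-flow reactor approach mentioned above reduces, for a first-order rate law, to an exponential decay over the residence time. The snippet below shows that textbook relationship (the actual photo-Fenton rate law in the study was fitted with a neural network, so first-order kinetics and these numbers are assumptions for illustration):

```python
import math

def pfr_outlet_concentration(c_in, k, tau):
    """Plug-flow reactor with first-order decay: c_out = c_in * exp(-k * tau).

    c_in: inlet concentration, k: first-order rate constant (1/min),
    tau: residence time (min). All values illustrative.
    """
    return c_in * math.exp(-k * tau)

c_out = pfr_outlet_concentration(c_in=100.0, k=0.2, tau=10.0)  # ~86% conversion
```

In a hybrid flowsheet, the outlet of the dynamic ED model would feed `c_in` and the flow rate would set `tau`.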

Relevance: 100.00%

Abstract:

Modern Integrated Circuit (IC) design is characterized by a strong trend of Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug; however, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee their efficacy. In general, static input-space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although they act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test-case scenarios are removed from the input space. To this end, a tool to automatically generate PD-based stimuli sources was developed. Additionally, we developed a second tool to generate functional coverage models that fit the PD-based input space exactly. Together, the input-stimuli and coverage-model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation-time reduction when generating stimuli with our PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation-time reduction when combining our stimuli sources with their corresponding, automatically generated, coverage models.
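The idea of constrained-random stimulus generation over declared parameter domains can be sketched generically: draw each parameter from its domain and reject combinations that violate a validity constraint, so invalid scenarios never reach the simulator. The domain names and the constraint below are invented for illustration; the paper's PD formalism and tooling are more elaborate:

```python
import random

def pd_stimuli(domains, constraint, n, seed=0):
    """Generate n constrained-random stimuli from per-parameter domains."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        stim = {name: rng.choice(values) for name, values in domains.items()}
        if constraint(stim):            # reject invalid combinations
            out.append(stim)
    return out

# hypothetical bus-transaction parameters and validity rule
domains = {"burst_len": [1, 2, 4, 8], "addr_align": [1, 2, 4]}
valid = lambda s: s["burst_len"] % s["addr_align"] == 0
stims = pd_stimuli(domains, valid, n=20)
```

Restricting generation to the valid cross-product is what lets a matching coverage model close faster, as in the simulation-time reductions quoted above.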

Relevance: 100.00%

Abstract:

Lightning-induced overvoltages have a considerable impact on the power quality of overhead distribution and telecommunications systems, and various models have been developed for the computation of the electromagnetic transients caused by indirect strokes. The most adequate has been shown to be the one proposed by Agrawal et al.; the Rusck model can be viewed as a particular case, as the two models are equivalent when the lightning channel is perpendicular to the ground plane. In this paper, an extension of the Rusck model that enables the calculation of lightning-induced transients considering flashes to nearby elevated structures and realistic line configurations is tested against data obtained from both natural lightning and scale-model experiments. The latter, performed under controlled conditions, can also be used to verify the validity of other coupling models and relevant codes. The so-called Extended Rusck Model, which is shown to be sufficiently accurate, is applied to the analysis of lightning-induced voltages on lines with a shield wire and/or surge arresters. The investigation indicates that the ratio between the peak values of the voltages induced by typical first and subsequent strokes can be either greater or smaller than unity, depending on the line configuration.
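For orientation, the classical Rusck estimate of the peak induced voltage on an infinitely long line above perfectly conducting ground is often quoted as V = Z0*I*h/d times a return-stroke-speed correction, with Z0 = 30 ohm. The sketch below implements that textbook formula (the baseline the Extended Rusck Model generalises, not the extension itself); treat the exact form as an assumption to check against a reference:

```python
import math

def rusck_peak_voltage(i_peak, h, d, v_ratio=0.3):
    """Classical Rusck peak induced voltage.

    i_peak: stroke peak current (A), h: line height (m),
    d: perpendicular distance to the stroke (m),
    v_ratio: return-stroke speed as a fraction of the speed of light.
    """
    z0 = 30.0  # ohm
    factor = 1.0 + v_ratio / (math.sqrt(2.0) * math.sqrt(1.0 - v_ratio ** 2 / 2.0))
    return z0 * i_peak * h / d * factor

v = rusck_peak_voltage(i_peak=30e3, h=10.0, d=100.0)  # 30 kA stroke, 100 m away
```

With these illustrative numbers the estimate is on the order of 100 kV, which is why indirect strokes matter for distribution-class insulation.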

Relevance: 100.00%

Abstract:

This paper proposes a regression model based on the modified Weibull distribution, which can be used to model bathtub-shaped failure rate functions. Assuming censored data, we consider maximum likelihood and jackknife estimators for the parameters of the model. We derive the appropriate matrices for assessing local influence on the parameter estimates under different perturbation schemes, and we also present some approaches to global influence analysis. In addition, for different parameter settings, sample sizes, and censoring percentages, various simulations are performed, and the empirical distribution of the modified deviance residual is displayed and compared with the standard normal distribution. These studies suggest that the residual analysis usually performed in normal linear regression models can be straightforwardly extended to a martingale-type residual in log-modified Weibull regression models with censored data. Finally, we analyze a real data set under log-modified Weibull regression models. A diagnostic analysis and model checking based on the modified deviance residual are performed to select appropriate models.
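To see why the modified Weibull accommodates bathtub-shaped hazards, the snippet below evaluates the hazard in the common Lai-Xie-Murthy parameterisation, F(t) = 1 - exp(-a*t^b*exp(lam*t)); treat this parameterisation as an assumption (conventions vary), and the parameter values are illustrative, not fitted:

```python
import math

def modified_weibull_hazard(t, a, b, lam):
    """Hazard h(t) = a * t**(b-1) * (b + lam*t) * exp(lam*t).

    For 0 < b < 1 and lam > 0 the hazard decreases at small t and increases
    at large t: a bathtub shape.
    """
    return a * t ** (b - 1.0) * (b + lam * t) * math.exp(lam * t)

# evaluate at early, middle, and late times to expose the bathtub shape
h = [modified_weibull_hazard(t, a=1.0, b=0.5, lam=0.1) for t in (0.1, 1.0, 20.0)]
```

The early decrease captures infant mortality and the late increase wear-out, the two regimes a standard Weibull cannot represent simultaneously.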

Relevance: 100.00%

Abstract:

Using data from a logging experiment in the eastern Brazilian Amazon region, we develop a matrix growth-and-yield model that captures the dynamic effects of harvest system choice on forest structure and composition. Multinomial logistic regression is used to estimate the growth transition parameters for a 10-year time step, while a Poisson regression model is used to estimate recruitment parameters. The model is designed to be easily integrated with an economic model of decision-making to perform tropical forest policy analysis. The model is used to compare the long-run structure and composition of a stand arising from the choice of implementing either conventional logging techniques or more carefully planned and executed reduced-impact logging (RIL) techniques, contrasted against a baseline projection of an unlogged forest. Results from log-and-leave scenarios show that a stand logged according to Brazilian management requirements will require well over 120 years to recover its initial commercial volume, regardless of the logging technique employed. Implementing RIL, however, accelerates this recovery. Scenarios imposing a 40-year cutting cycle raise the possibility of sustainable harvest volumes, although at significantly lower levels than is implied by current regulations. Meeting current Brazilian forest policy goals may require an increase in the planned total area of permanent production forest or the widespread adoption of silvicultural practices that increase stand recovery and volume accumulation rates after RIL harvests.
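A matrix growth model projects a vector of stems per size class forward one time step as new_stand = T @ stand + r, where T holds stay/grow/die transition probabilities and r is recruitment. The numbers below are toy values for a three-class stand (the paper estimates T by multinomial logit and r by Poisson regression):

```python
def project_stand(stand, transition, recruitment, steps=1):
    """Project a size-class stand vector forward by repeated matrix steps."""
    n = len(stand)
    for _ in range(steps):
        stand = [sum(transition[i][j] * stand[j] for j in range(n)) + recruitment[i]
                 for i in range(n)]
    return stand

# illustrative 10-year transitions: columns sum to < 1, the remainder is mortality
T = [[0.80, 0.00, 0.00],   # stay in class 1
     [0.15, 0.85, 0.00],   # grow 1 -> 2, stay in 2
     [0.00, 0.10, 0.90]]   # grow 2 -> 3, stay in 3
r = [25.0, 0.0, 0.0]       # recruitment enters the smallest class
stand10 = project_stand([100.0, 50.0, 20.0], T, r)
```

Long-run recovery questions, such as the 120-year volume-recovery result above, come from iterating this step and tracking commercial-class volume.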

Relevance: 100.00%

Abstract:

Colletotrichum gossypii var. cephalosporioides, the fungus that causes ramulosis disease of cotton, is widespread in Brazil and can cause severe yield loss. Because weather conditions greatly affect disease development, the objective of this work was to develop weather-based models to assess disease favorability. Latent period, incidence, and severity of ramulosis symptoms were evaluated in controlled environment experiments using factorial combinations of temperature (15, 20, 25, 30, and 35 degrees C) and leaf wetness duration (0, 4, 8, 16, 32, and 64 h after inoculation). Severity was modeled as an exponential function of leaf wetness duration and temperature. At the optimum temperature of disease development, 27 degrees C, average latent period was 10 days. Maximum ramulosis severity occurred from 20 to 30 degrees C, with sharp decreases at lower and higher temperatures. Ramulosis severity increased as wetness periods were increased from 4 to 32 h. In field experiments at Piracicaba, Sao Paulo State, Brazil, cotton plots were inoculated (10^5 conidia ml^-1) and ramulosis severity was evaluated weekly. The model obtained from the controlled environment study was used to generate a disease favorability index for comparison with disease progress rate in the field. Hourly measurements of solar radiation, temperature, relative humidity, leaf wetness duration, rainfall, and wind speed were also evaluated as possible explanatory variables. Both the disease favorability model and a model based on rainfall explained ramulosis growth rate well, with R^2 of 0.89 and 0.91, respectively. They are proposed as models of ramulosis development rate on cotton in Brazil, and weather-disease relationships revealed by this work can form the basis of a warning system for ramulosis development.
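A weather-based favorability index of the general shape described above can be sketched as a temperature response centred on the optimum (27 degrees C in the study) multiplied by a saturating response to leaf wetness duration. The functional forms and constants below are invented for illustration and are not the published fit:

```python
import math

def disease_favorability(temp_c, wetness_h, t_opt=27.0, t_spread=5.0,
                         k=0.08, s_max=1.0):
    """Generic favorability index in [0, s_max].

    Gaussian temperature response times a saturating exponential wetness
    response; all parameters are hypothetical.
    """
    f_temp = math.exp(-((temp_c - t_opt) / t_spread) ** 2)
    f_wet = 1.0 - math.exp(-k * wetness_h)
    return s_max * f_temp * f_wet

mild = disease_favorability(27.0, 32.0)   # optimum temperature, long wetness
cool = disease_favorability(15.0, 32.0)   # too cool: far less favourable
```

Summing such an hourly index over a week is the usual way a favorability model is compared against observed disease progress rates.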

Relevance: 100.00%

Abstract:

Superconducting pairing of electrons in nanoscale metallic particles with discrete energy levels and a fixed number of electrons is described by the reduced Bardeen, Cooper, and Schrieffer model Hamiltonian. We show that this model is integrable by the algebraic Bethe ansatz. The eigenstates, spectrum, conserved operators, integrals of motion, and norms of wave functions are obtained. Furthermore, the quantum inverse problem is solved, meaning that form factors and correlation functions can be explicitly evaluated. Closed form expressions are given for the form factors and correlation functions that describe superconducting pairing.
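For reference, the reduced BCS pairing Hamiltonian discussed above can be written, in one common convention (discrete level energies epsilon_j, uniform pairing strength g), as

```latex
H \;=\; \sum_{j,\sigma} \varepsilon_j\, c^{\dagger}_{j\sigma} c_{j\sigma}
\;-\; g \sum_{j,k} c^{\dagger}_{j\uparrow} c^{\dagger}_{j\downarrow}
                   c_{k\downarrow} c_{k\uparrow} .
```

Its exact eigenstates are parameterised by pair rapidities E_alpha satisfying Richardson's equations, quoted here up to sign conventions that vary across the literature:

```latex
\frac{1}{g} \;+\; \sum_{\beta \neq \alpha} \frac{2}{E_{\alpha} - E_{\beta}}
\;=\; \sum_{j} \frac{1}{2\varepsilon_{j} - E_{\alpha}} .
```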

Relevance: 100.00%

Abstract:

Objective: To improve the success of culturing olfactory neurons from human nasal mucosa by investigating the intranasal distribution of the olfactory epithelium and devising new techniques for growing human olfactory epithelium in vitro. Design: Ninety-seven biopsy specimens were obtained from 33 individuals, aged 21 to 74 years, collected from 6 regions of the nasal cavity. Each biopsy specimen was bisected, and 1 piece was processed for immunohistochemistry or electron microscopy while the other piece was dissected further for explant culture. Four culture techniques were performed, including whole explants and explanted biopsy slices. Five days after plating, neuronal differentiation was induced by means of a medium that contained basic fibroblast growth factor. After another 5 days, cultures were processed for immunocytochemical analysis. Results: The probability of finding olfactory epithelium in a biopsy specimen ranged from 30% to 76%, depending on its location. The dorsoposterior regions of the nasal septum and the superior turbinate provided the highest probability, but, surprisingly, olfactory epithelium was also found anteriorly and ventrally on both septum and turbinates. A new method of culturing the olfactory epithelium was devised. This slice culture technique improved the success rate for generating olfactory neurons from 10% to 90%. Conclusions: This study explains and overcomes most of the variability in the success in observing neurogenesis in cultures of adult human olfactory epithelium. The techniques presented here make the human olfactory epithelium a useful model for clinical research into certain olfactory dysfunctions and a model for the causes of neurodevelopmental and neurodegenerative diseases.

Relevance: 100.00%

Abstract:

OBJECTIVE: To use magnetic resonance imaging (MRI) to validate estimates of muscle and adipose tissue (AT) in lower limb sections obtained by dual-energy X-ray absorptiometry (DXA) modelling. DESIGN: MRI measurements were used as reference for validating limb muscle and AT estimates obtained by DXA models that assume fat-free soft tissue (FFST) comprised mainly muscle: model A accounted for bone hydration only; model B also applied constants for FFST in bone and skin and fat in muscle and AT; model C was as model B but allowing for variable fat in muscle and AT. SUBJECTS: Healthy men (n = 8) and women (n = 8), ages 41-62 y; mean (s.d.) body mass indices (BMIs) of 28.6 (5.4) kg/m^2 and 25.1 (5.4) kg/m^2, respectively. MEASUREMENTS: MRI scans of the legs and whole body DXA scans were analysed for muscle and AT content of thigh (20 cm) and lower leg (10 cm) sections; 24 h creatinine excretion was measured. RESULTS: Model A overestimated thigh muscle volume (MRI mean, 2.3 l) substantially (bias 0.36 l), whereas model B underestimated it by only 2% (bias 0.045 l). Lower leg muscle (MRI mean, 0.6 l) was better predicted using model A (bias 0.04 l, 7% overestimate) than model B (bias 0.1 l, 17% underestimate). The 95% limits of agreement were high for these models (thigh, ±20%; lower leg, ±47%). Model C predictions were more discrepant than those of model B. There was generally less agreement between MRI and all DXA models for AT. Measurement variability was generally less for DXA measurements of FFST (coefficient of variation 0.7-1.8%) and fat (0.8-3.3%) than model B estimates of muscle (0.5-2.6%) and AT (3.3-6.8%), respectively. Despite strong relationships between them, muscle mass was overestimated by creatinine excretion with highly variable predictability. CONCLUSION: This study has shown the value of DXA models for assessment of muscle and AT in leg sections, but suggests the need to re-evaluate some of the assumptions upon which they are based.
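The bias and 95% limits of agreement quoted above come from a Bland-Altman style method comparison, which can be computed as follows; the measurement values here are made up, not the study's data:

```python
import math

def bland_altman(reference, estimate):
    """Bias and 95% limits of agreement between paired method measurements."""
    diffs = [e - r for r, e in zip(reference, estimate)]
    n = len(diffs)
    bias = sum(diffs) / n
    # sample standard deviation of the differences
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical paired thigh-muscle volumes (litres): MRI reference vs DXA model
mri = [2.1, 2.4, 2.2, 2.6, 2.3]
dxa = [2.2, 2.4, 2.3, 2.7, 2.5]
bias, lo, hi = bland_altman(mri, dxa)
```

Wide limits of agreement (such as the ±47% for the lower leg) indicate poor individual-level interchangeability even when the mean bias is small.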

Relevance: 100.00%

Abstract:

The evolution of event time and size statistics in two heterogeneous cellular automaton models of earthquake behavior is studied and compared to the evolution of these quantities during observed periods of accelerating seismic energy release prior to large earthquakes. The two automata have different nearest-neighbor laws, one of which produces self-organized critical (SOC) behavior (PSD model) and the other quasi-periodic large events (crack model). In the PSD model, periods of accelerating energy release before large events are rare. In the crack model, many large events are preceded by periods of accelerating energy release. When compared to randomized event catalogs, accelerating energy release before large events occurs more often than random in the crack model but less often than random in the PSD model; it is easier to tell the crack and PSD model results apart from each other than to tell either model apart from a random catalog. The evolution of event sizes during the accelerating energy release sequences in all models is compared to that of observed sequences. The accelerating sequences in the crack model consist of an increase in the rate of events of all sizes, consistent with observations from a small number of natural cases, but inconsistent with a larger number of cases in which only the rate of moderate-sized events increases. On average, no increase in the rate of events of any size is seen before large events in the PSD model.
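The SOC behaviour referenced above can be illustrated with the standard sandpile cellular automaton: grains are dropped at random sites, any site reaching a threshold topples to its neighbours, and grains fall off at the open boundaries. This is a generic toy (a 1-D Bak-Tang-Wiesenfeld-style pile), not the heterogeneous earthquake automata of the study:

```python
import random

def sandpile(width=20, grains=2000, threshold=4, seed=1):
    """Drop grains, topple over-threshold sites, record avalanche sizes."""
    rng = random.Random(seed)
    z = [0] * width
    sizes = []                       # avalanche sizes (number of topplings)
    for _ in range(grains):
        z[rng.randrange(width)] += 1
        size = 0
        unstable = [i for i in range(width) if z[i] >= threshold]
        while unstable:
            i = unstable.pop()
            if z[i] < threshold:     # may have been queued twice
                continue
            z[i] -= threshold
            size += 1
            touched = [i]            # a toppled site may still be unstable
            for j in (i - 1, i + 1):
                if 0 <= j < width:   # grains at the boundary fall off
                    z[j] += threshold // 2
                    touched.append(j)
            for j in touched:
                if z[j] >= threshold and j not in unstable:
                    unstable.append(j)
        if size:
            sizes.append(size)
    return sizes

avalanches = sandpile()
```

In an SOC model like this, event sizes span many scales with no characteristic precursor, which is consistent with the rarity of accelerating-release sequences in the PSD model.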

Relevance: 100.00%

Abstract:

Animal venom components are of considerable interest to researchers across a wide variety of disciplines, including molecular biology, biochemistry, medicine, and evolutionary genetics. The three-finger family of snake venom peptides is a particularly interesting and biochemically complex group of venom peptides, because they are encoded by a large multigene family and display a diverse array of functional activities. In addition, understanding how this complex and highly varied multigene family evolved is an interesting question to researchers investigating the biochemical diversity of these peptides and their impact on human health. Therefore, the purpose of our study was to investigate the long-term evolutionary patterns exhibited by these snake venom toxins to understand the mechanisms by which they diversified into a large, biochemically diverse, multigene family. Our results show a much greater diversity of family members than was previously known, including a number of subfamilies that did not fall within any previously identified groups with characterized activities. In addition, we found that the long-term evolutionary processes that gave rise to the diversity of three-finger toxins are consistent with the birth-and-death model of multigene family evolution. It is anticipated that this three-finger toxin toolkit will prove to be useful in providing a clearer picture of the diversity of investigational ligands or potential therapeutics available within this important family.
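The birth-and-death model of multigene family evolution invoked above can be caricatured with a tiny simulation in which each gene copy independently duplicates ("birth") or is lost ("death") each generation; the rates and generation count are invented purely for illustration:

```python
import random

def gene_family_size(rate_birth=0.02, rate_death=0.015, generations=2000, seed=3):
    """Toy birth-and-death trajectory of a multigene family's copy number."""
    rng = random.Random(seed)
    n = 1                                    # start from a single ancestral gene
    for _ in range(generations):
        births = sum(rng.random() < rate_birth for _ in range(n))
        deaths = sum(rng.random() < rate_death for _ in range(n))
        n = max(n + births - deaths, 0)
        if n == 0:                           # family extinct
            break
    return n

final_size = gene_family_size()
```

Repeated duplication followed by divergence or loss is how such a process generates the large, functionally varied subfamilies described for the three-finger toxins.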