996 results for Distributed parameters
Abstract:
Apricot is one of the fruits dried using different methods, such as sun, convective or microwave drying. The effects of drying methods on the components of this fruit differ depending upon the temperature or time parameters. In this research, the impacts of convective, microwave and microwave-convective drying techniques on the color, β-carotene, minerals and antioxidant activity of apricots were investigated. The color values (L*, b*, ΔEab, h° and C*ab) of the dried fruit decreased, while the a* values increased. Compared with a fresh sample, the dried apricots showed a 1.4- to 3.9-fold proportional increase in β-carotene based on the increment of dry matter. The samples dried at the high temperature and microwave power levels, 75 °C + 90 W and 75 °C + 160 W, showed lower antioxidant activity. Of the different drying treatments, the microwave-convective method (50 °C + 160 W) yielded a higher β-carotene content while maintaining antioxidant activity with a short drying time.
Abstract:
A wide range of quality parameters have been used to describe maize flours for food use, but there is no general agreement about the most suitable parameters for breadmaking. The objective of this study was to identify the maize flour parameters related to the consumer-perceived quality of Portuguese broa bread (more than 50% maize flour). The influence of eleven maize landraces was assessed and compared with a commercial flour using baking tests. The broa were evaluated by instrumental (colour, firmness) and sensory hedonic analysis with a consumer panel of 52 assessors. The broa sensory analysis revealed similar assessments among the landraces and the lowest scores for the commercial flour. Flour particle size distribution was the major influence, with the commercial flour showing the highest mean diameter and a large particle distribution range. Consumer panel linkage associations and specific sensory descriptors were identified: age influenced the assessment of colour and cohesiveness, and source region influenced the assessment of appearance and texture.
Abstract:
The goal of this thesis is to define and validate a software engineering approach for the development of a distributed system for the modeling of composite materials, based on an analysis of various existing software development methods. We reviewed the main features of: (1) software engineering methodologies; (2) distributed system characteristics and their effect on software development; (3) composite materials modeling activities and the requirements for the software development. Using design science as the research methodology, the distributed system for creating models of composite materials is created and evaluated. The empirical experiments we conducted showed good convergence of the modeled and real processes. Throughout the study, we paid particular attention to the complexity and importance of distributed systems and to a deep understanding of modern software engineering methods and tools.
Abstract:
Liberalization of electricity markets has resulted in a competitive Nordic electricity market, in which electricity retailers play a key role as electricity suppliers, market intermediaries, and service providers. Although these roles may remain unchanged in the near future, the retailers' operation may change fundamentally as a result of the emerging smart grid environment. In particular, the increasing amount of distributed energy resources (DER), and improving opportunities for their control, are reshaping the operating environment of the retailers. This requires that the retailers' operation models be developed to match an operating environment in which the active use of DER plays a major role. Electricity retailers have a clientele, and they operate actively in the electricity markets, which makes them a natural market party to offer end-users new services aiming at an efficient and market-based use of DER. From the retailer's point of view, the active use of DER can provide means to adapt the operation to meet the challenges posed by the smart grid environment, and to pursue the ultimate objective of the retailer, which is to maximize the profit of operation. This doctoral dissertation introduces a methodology for the comprehensive use of DER in an electricity retailer's short-term profit optimization that covers operation in a variety of marketplaces including day-ahead, intra-day, and reserve markets. The analysis results provide data on the key profit-making opportunities and the risks associated with different types of DER use. The methodology may therefore serve as an efficient tool for an experienced operator in planning the optimal market-based use of DER. The key contributions of this doctoral dissertation lie in the analysis and development of a model that allows the retailer not only to benefit from the profit-making opportunities brought by the use of DER in different marketplaces, but also to manage the major risks involved in the active use of DER.
In addition, the dissertation introduces an analysis of the economic potential of DER control actions in different marketplaces including the day-ahead Elspot market, balancing power market, and the hourly market of Frequency Containment Reserve for Disturbances (FCR-D).
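The core allocation question behind such short-term optimization can be caricatured as assigning controllable DER capacity to the highest-value marketplace first, subject to per-market volume limits. The sketch below is not the dissertation's model; the market names, margins, and limits are hypothetical illustration values, and a greedy rule is only optimal in this coupling-free toy setting.

```python
# Toy sketch (not the dissertation's methodology): allocate a retailer's
# controllable DER capacity across marketplaces by expected margin.
# All names and numbers are hypothetical illustration values.

def allocate_der(capacity_mwh, margins, limits):
    """Greedy allocation: fill the highest-margin marketplace first,
    up to its volume limit, then move on to the next one."""
    allocation = {m: 0.0 for m in margins}
    remaining = capacity_mwh
    for market in sorted(margins, key=margins.get, reverse=True):
        take = min(remaining, limits[market])
        allocation[market] = take
        remaining -= take
        if remaining <= 0:
            break
    return allocation

# Hypothetical margins (EUR/MWh) and volume limits (MWh) for the
# day-ahead (Elspot), balancing, and FCR-D marketplaces
margins = {"elspot": 4.0, "balancing": 12.0, "fcr_d": 25.0}
limits = {"elspot": 50.0, "balancing": 20.0, "fcr_d": 5.0}
plan = allocate_der(30.0, margins, limits)
print(plan)  # fills fcr_d (5 MWh), then balancing (20), then elspot (5)
```

A real model would add inter-market coupling, forecast uncertainty, and risk constraints, which is precisely where the dissertation's contribution lies.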
Abstract:
An interspecific hybrid resulting from the crossing of elephant grass (Pennisetum purpureum Schumach) x pearl millet (Pennisetum glaucum (L.) R. Brown) has been developed. This hybrid, however, revealed low phenotypic uniformity and low production of pure seeds. Through recurrent selection, two improved populations were obtained (genotypes Corte and Pastoreio). The aim of this study was to assess the seed quality of three hybrids (genotypes Corte, Pastoreio and Paraiso) by tests of: seed purity; seed germination; accelerated aging, at 42 ºC; 1,000-seed weight; drying curves; and sorption and desorption isotherms. Recurrent selection altered the seed size and increased the initial seed quality of the Pastoreio population. The drying curves of the three hybrids showed similar behavior, reaching moisture contents of 2.1%, 1.9% and 1.8%, respectively, after 63 days. The accelerated aging test showed that hybrid Pastoreio was the most vigorous.
Abstract:
A new approach to the determination of the thermal parameters of high-power batteries is introduced here. Application of local heat flux measurement with a gradient heat flux sensor (GHFS) allows determination of the cell thermal parameters at different surface points of the cell. The suggested methodology is not cell destructive, as it does not require deep discharge of the cell or application of any charge/discharge cycles during measurements of the thermal parameters of the cell. The complete procedure is demonstrated on a high-power Li-ion pouch cell, and it is verified on a sample with well-known thermal parameters. A comparison of the experimental results with conventional thermal characterization methods shows an acceptably low error. The dependence of the cell thermal parameters on state of charge (SoC) and on the measurement points on the surface was studied with the proposed measurement approach.
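To make the idea of extracting a thermal parameter from a local heat-flux trace concrete, here is a minimal lumped-parameter sketch: a thermal capacitance is estimated from the energy passing through the sensed surface and the observed temperature rise. The numbers and the single-lump model are illustrative assumptions, not the paper's GHFS procedure.

```python
# Hypothetical lumped-parameter sketch: estimate a cell's thermal
# capacitance from a measured heat-flux trace and a temperature rise.
# All values and the one-lump model are illustrative assumptions.
q = [120.0] * 60          # heat flux samples (W/m^2), one per second
area = 0.02               # sensor-covered surface area (m^2, assumed)
dt = 1.0                  # sample interval (s)
delta_T = 1.5             # observed temperature rise over the trace (K)

energy = sum(x * area * dt for x in q)   # J absorbed through the surface
C_th = energy / delta_T                  # lumped thermal capacitance (J/K)
print(C_th)  # 120 * 0.02 * 60 / 1.5 = 96.0
```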
Abstract:
Breeding parameters of Great Cormorants (Phalacrocorax carbo carbo) and Double-crested Cormorants (P. auritus auritus) were examined at two mixed-species colonies at Cape Tryon and Durell Point, Prince Edward Island, from 1976 to 1978. Differential access to nests at the two colony sites resulted in more complete demographic data for P. carbo than for P. auritus. In 1977, P. carbo was present at both colonies by 21 March, whereas P. auritus did not return until 1 April and 16 April at Cape Tryon and Durell Point, respectively. Differences in the arrival chronology by individuals of each species, and differences in the time of nest site occupation according to age, are suggested as factors influencing the nest site distribution of P. carbo and P. auritus at Cape Tryon. Forty-eight P. carbo chicks banded at the Durell Point colony between 1974 and 1976 returned there to nest as two- to four-year-olds in 1977 and 1978. Unmarked individuals with clutch-starts in April were likely greater than four years old, as all marked two- to four-year-olds (with one possible exception) in 1977 and 1978 had clutch-starts in May and June. Seasonal variation in the breeding success of P. carbo individuals was examined at Durell Point in 1977. Mean clutch size, hatching success and fledging success exhibited a seasonal decline. Four- and 5-egg clutches represented the majority (75%) of all P. carbo clutches at Durell Point in 1977 and had the highest reproductive success (0.48 and 0.43 chicks fledged per egg laid, respectively). Smaller clutches produced small broods with significantly higher chick mortality, while larger clutches suffered high egg loss prior to clutch completion.
Abstract:
Order parameter profiles extracted from the NMR spectra of model membranes are a valuable source of information about their structure and molecular motions. To analyze powder spectra, the de-Pake-ing (numerical deconvolution) technique can be used, but it assumes a random (spherical) distribution of orientations in the sample. Multilamellar vesicles are known to deform and orient in the strong magnetic fields of NMR magnets, producing non-spherical orientation distributions. A recently developed technique for simultaneously extracting the anisotropies of the system as well as the orientation distributions is applied to the analysis of partially magnetically oriented 31P NMR spectra of phospholipids. A mixture of synthetic lipids, POPE and POPG, is analyzed to measure the distortion of multilamellar vesicles in a magnetic field. In the analysis, three models describing the shape of the distorted vesicles are examined. Ellipsoids of rotation with a semiaxis ratio of about 1.14 are found to provide a good approximation of the shape of the distorted vesicles, in reasonable agreement with published experimental work. All three models yield clearly non-spherical orientational distributions, as well as a precise measure of the anisotropy of the chemical shift. Noise in the experimental data prevented the analysis from concluding which of the three models is the best approximation. A discretization scheme for stabilizing the algorithm is outlined.
Abstract:
Traditional psychometric theory and practice classify people according to broad ability dimensions but do not examine how these mental processes occur. Hunt and Lansman (1975) proposed a 'distributed memory' model of cognitive processes with emphasis on how to describe individual differences, based on the assumption that each individual possesses the same components. It is in the quality of these components that individual differences arise. Carroll (1974) expands Hunt's model to include a production system (after Newell and Simon, 1973) and a response system. He developed a framework of factor analytic (FA) factors for the purpose of describing how individual differences may arise from them. This scheme is to be used in the analysis of psychometric tests. Recent advances in the field of information processing are examined and include: 1) Hunt's development of differences between subjects designated as high or low verbal; 2) Miller's pursuit of the magic number seven, plus or minus two; 3) Ferguson's examination of transfer and abilities; and 4) Brown's discoveries concerning strategy teaching and retardates. In order to examine possible sources of individual differences arising from cognitive tasks, traditional psychometric tests were searched for a suitable perceptual task which could be varied slightly and administered to gauge learning effects produced by controlling independent variables. It also had to be suitable for analysis using Carroll's framework. The Coding Task (a symbol substitution test) found in the Performance Scale of the WISC was chosen.

Two experiments were devised to test the following hypotheses. 1) High verbals should be able to complete significantly more items on the Symbol Substitution Task than low verbals (Hunt, Lansman, 1975). 2) Having previous practice on a task, where strategies involved in the task may be identified, increases the amount of output on a similar task (Carroll, 1974). 3) There should be a substantial decrease in the amount of output as the load on STM is increased (Miller, 1956). 4) Repeated measures should produce an increase in output over trials, and where individual differences in previously acquired abilities are involved, these should differentiate individuals over trials (Ferguson, 1956). 5) Teaching slow learners a rehearsal strategy would improve their learning such that it would resemble that of normals on the same task (Brown, 1974).

In the first experiment, 60 subjects were divided into high and low verbal groups, each further divided randomly into a practice group and a nonpractice group. Five subjects in each group were assigned randomly to work on a five-, seven- or nine-digit code throughout the experiment. The practice group was given three trials of two minutes each on the practice code (designed to eliminate transfer effects due to symbol similarity) and then three trials of two minutes each on the actual SST task. The nonpractice group was given three trials of two minutes each on the same actual SST task. Results were analyzed using a four-way analysis of variance. In the second experiment, 18 slow learners were divided randomly into two groups, one group receiving planned strategy practice, the other receiving random practice. Both groups worked on the actual code to be used later in the actual task. Within each group, subjects were randomly assigned to work on a five-, seven- or nine-digit code throughout. Both practice and actual tests consisted of three trials of two minutes each. Results were analyzed using a three-way analysis of variance.

It was found in the first experiment that 1) high or low verbal ability by itself did not produce significantly different results; however, in interaction with the other independent variables, a difference in performance was noted. 2) The previous practice variable was significant over all segments of the experiment: those who received previous practice were able to score significantly higher than those without it. 3) Increasing the size of the load on STM severely restricts performance. 4) The effect of repeated trials proved to be beneficial; generally, gains were made on each successive trial within each group. 5) In the second experiment, slow learners who were allowed to practice randomly performed better on the actual task than subjects who were taught the code by means of a planned strategy. Upon analysis using the Carroll scheme, individual differences were noted in the ability to develop strategies of storing, searching and retrieving items from STM, and in adopting necessary rehearsals for retention in STM. While these strategies may benefit some, it was found that for others they may be harmful. Temporal aspects and perceptual speed were also found to be sources of variance within individuals. Generally, it was found that the largest single factor influencing learning on this task was the repeated measures. What enables gains to be made varies with individuals. There are environmental factors, specific abilities, strategy development, previous learning, amount of load on STM, and perceptual and temporal parameters which influence learning, and these have serious implications for educational programs.
Abstract:
Thesis (Master of Science with an Orientation in Mathematics), UANL, 2013.
Abstract:
In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
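The "two-sided autoregression" form mentioned above can be illustrated numerically: after conditioning, each interior observation is regressed on its two neighbours, and standard linear regression machinery applies. The sketch below only shows this regression shape on a simulated AR(1) series; it is not the paper's finite-sample test construction, and the sample size and AR coefficient are arbitrary illustration values.

```python
import numpy as np

# Illustrative sketch of a two-sided autoregression on simulated data,
# not the paper's exact inference procedure.
rng = np.random.default_rng(0)
n, phi = 500, 0.6
y = np.zeros(n)
for t in range(1, n):            # simulate an AR(1) process
    y[t] = phi * y[t - 1] + rng.standard_normal()

# Two-sided autoregression: regress y_t on (y_{t-1}, y_{t+1})
# for the interior observations, via ordinary least squares.
X = np.column_stack([y[:-2], y[2:]])
beta, *_ = np.linalg.lstsq(X, y[1:-1], rcond=None)
print(beta)  # the two neighbour coefficients come out roughly equal
```

For a stationary Gaussian AR(1), both neighbour coefficients equal phi / (1 + phi^2) in population (about 0.44 here), which the OLS fit recovers approximately.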
Abstract:
In this paper, we characterize the asymmetries of the smile through multiple leverage effects in a stochastic dynamic asset pricing framework. The dependence between price movements and future volatility is introduced through a set of latent state variables. These latent variables can capture not only the volatility risk and the interest rate risk which potentially affect option prices, but also any kind of correlation risk and jump risk. The standard financial leverage effect is produced by a cross-correlation effect between the state variables which enter into the stochastic volatility process of the stock price and the stock price process itself. However, we provide a more general framework where asymmetric implied volatility curves result from any source of instantaneous correlation between the state variables and either the return on the stock or the stochastic discount factor. In order to draw the shapes of the implied volatility curves generated by a model with latent variables, we specify an equilibrium-based stochastic discount factor with time non-separable preferences. When we calibrate this model to empirically reasonable values of the parameters, we are able to reproduce the various types of implied volatility curves inferred from option market data.
Abstract:
We consider the problem of assessing the uncertainty of calibrated parameters in computable general equilibrium (CGE) models through the construction of confidence sets (or intervals) for these parameters. We study two different setups under which this can be done.
Abstract:
The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable possibly discrete test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified asymptotically justified versions of the MMC method are also proposed and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter).
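The basic Dwass/Barnard construction that this abstract extends can be stated in a few lines: the p-value is computed from N simulated draws of the test statistic under the null. The statistic and null distribution below are illustrative stand-ins, not taken from the paper.

```python
import random

# Minimal Monte Carlo test in the spirit of Dwass (1957) and
# Barnard (1963); the statistic and null here are illustrative.

def mc_pvalue(observed, simulate_null, n_sim=99, seed=42):
    """p = (1 + #{simulated >= observed}) / (n_sim + 1). With a
    continuous statistic the test is exact at level a whenever
    a * (n_sim + 1) is an integer."""
    rng = random.Random(seed)
    exceed = sum(simulate_null(rng) >= observed for _ in range(n_sim))
    return (1 + exceed) / (n_sim + 1)

# Illustrative null: the statistic is the mean of 20 standard uniforms
def sim(rng):
    return sum(rng.random() for _ in range(20)) / 20

p = mc_pvalue(0.75, sim, n_sim=199)
print(p)  # 0.75 is far in the null's right tail, so p is small
```

The maximized MC (MMC) extension described above replaces this single simulation with a maximization of the p-value over the nuisance-parameter space.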
Abstract:
Affiliation: Claudia Kleinman, Nicolas Rodrigue & Hervé Philippe : Département de biochimie, Faculté de médecine, Université de Montréal