901 results for Creo Parametric


Relevance:

20.00%

Publisher:

Abstract:

Are the learning procedures of genetic algorithms (GAs) able to generate optimal architectures for artificial neural networks (ANNs) in high-frequency data? In this experimental study, GAs are used to identify the best architecture for ANNs. Additional learning is then undertaken by the ANNs to forecast daily excess stock returns. No ANN architecture was able to outperform a random walk, despite the finding of non-linearity in the excess returns. This failure is attributed to the absence of suitable ANN structures and further implies that researchers need to be cautious when making inferences from ANN results that use high-frequency data.

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses the use of the non-parametric free disposal hull (FDH) and the parametric multi-level model (MLM) as alternative methods for measuring pupil and school attainment where hierarchically structured data are available. Using robust FDH estimates, we show how to decompose the overall inefficiency of a unit (a pupil) into a unit-specific and a higher-level (a school) component. Using a sample of entry and exit attainments of 3,017 girls in British ordinary single-sex schools, we test the robustness of the non-parametric and parametric estimates. Finally, the paper uses the traditional MLM model in a best-practice framework so that pupil and school efficiencies can be computed.

Relevance:

20.00%

Publisher:

Abstract:

Financial prediction has attracted a lot of interest due to the financial implications that accurate prediction of financial markets can have. A variety of data-driven modelling approaches have been applied, but their performance has produced mixed results. In this study we apply both parametric (neural networks with active neurons) and non-parametric (analog complexing) self-organising modelling methods for the daily prediction of the exchange-rate market. We also propose a combined approach in which the parametric and non-parametric self-organising methods are combined sequentially, exploiting the advantages of the individual methods with the aim of improving their performance. The combined method is found to produce promising results and to outperform the individual methods when tested with two exchange rates: the American Dollar and the Deutsche Mark against the British Pound.

Relevance:

20.00%

Publisher:

Abstract:

To carry out an analysis of variance, several assumptions are made about the nature of the experimental data which have to be at least approximately true for the tests to be valid. One of the most important of these assumptions is that a measured quantity must be a parametric variable, i.e., a member of a normally distributed population. If the data are not normally distributed, then one method of approach is to transform the data to a different scale so that the new variable is more likely to be normally distributed. An alternative method, however, is to use a non-parametric analysis of variance. There are a limited number of such tests available, but two useful tests are described in this Statnote, viz., the Kruskal-Wallis test and Friedman's analysis of variance.
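
As an illustration only (not part of the Statnote itself), the two tests might be run as follows in Python with SciPy; the group and condition data below are hypothetical:

```python
# Minimal sketch: non-parametric alternatives to one-way and repeated-measures ANOVA.
# Assumes SciPy is installed; the sample data are hypothetical.
from scipy import stats

# Three independent groups (e.g. three treatments measured on different subjects).
group_a = [12, 15, 14, 10, 13]
group_b = [22, 25, 19, 24, 21]
group_c = [16, 18, 17, 15, 19]

# Kruskal-Wallis test: non-parametric analogue of one-way ANOVA, based on ranks.
h_stat, p_kw = stats.kruskal(group_a, group_b, group_c)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_kw:.4f}")

# Friedman's test: non-parametric analogue of repeated-measures ANOVA.
# Each argument is one condition measured on the same subjects (paired by position).
cond_1 = [7.1, 6.8, 7.4, 6.9, 7.0]
cond_2 = [7.9, 7.5, 8.1, 7.7, 7.8]
cond_3 = [6.5, 6.2, 6.8, 6.4, 6.6]
chi2_stat, p_fr = stats.friedmanchisquare(cond_1, cond_2, cond_3)
print(f"Friedman chi-square = {chi2_stat:.2f}, p = {p_fr:.4f}")
```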

Relevance:

20.00%

Publisher:

Abstract:

Different types of numerical data can be collected in a scientific investigation, and the choice of statistical analysis will often depend on the distribution of the data. A basic distinction between variables is whether they are 'parametric' or 'non-parametric'. When a variable is parametric, the data come from a symmetrically shaped distribution known as the 'Gaussian' or 'normal' distribution, whereas non-parametric variables may have a distribution which deviates markedly in shape from normal. This article describes several aspects of the problem of non-normality, including: (1) how to test for two common types of deviation from a normal distribution, viz., 'skew' and 'kurtosis', (2) how to fit the normal distribution to a sample of data, (3) the transformation of non-normally distributed data and scores, and (4) commonly used 'non-parametric' statistics applicable in a variety of circumstances.
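
Purely as an illustrative sketch (the article itself is tool-agnostic), points (1) and (3) might look like this in Python with SciPy; the simulated sample and the choice of a log transformation are assumptions:

```python
# Minimal sketch: checking a sample for skew, kurtosis, and overall normality.
# Assumes SciPy/NumPy are installed; the data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=0.5, size=200)  # deliberately right-skewed

# (1) Tests for the two common deviations from normality.
z_skew, p_skew = stats.skewtest(sample)
z_kurt, p_kurt = stats.kurtosistest(sample)
print(f"skew test:     z = {z_skew:.2f}, p = {p_skew:.4f}")
print(f"kurtosis test: z = {z_kurt:.2f}, p = {p_kurt:.4f}")

# (3) A log transformation often brings right-skewed data closer to normal.
transformed = np.log(sample)
print("Shapiro-Wilk on raw data:        p =", round(stats.shapiro(sample).pvalue, 4))
print("Shapiro-Wilk on log-transformed: p =", round(stats.shapiro(transformed).pvalue, 4))
```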

Relevance:

20.00%

Publisher:

Abstract:

The use of Diagnosis Related Groups (DRGs) as a mechanism for hospital financing is a currently debated topic in Portugal. The DRG system was scheduled to be initiated by the Health Ministry of Portugal on January 1, 1990 as an instrument for the allocation of public hospital budgets funded by the National Health Service (NHS), and as a method of payment for other third-party payers (e.g., Public Employees (ADSE), private insurers, etc.). Based on experience from other countries such as the United States, it was expected that implementation of this system would result in more efficient hospital resource utilisation and a more equitable distribution of hospital budgets. However, in order to minimise the potentially adverse financial impact on hospitals, the Portuguese Health Ministry decided to phase in the use of the DRG system for budget allocation gradually, using blended hospital-specific and national DRG casemix rates. Since implementation in 1990, the percentage of each hospital's budget based on hospital-specific costs was to decrease, while the percentage based on DRG casemix was to increase. This was scheduled to continue until 1995, when the plan called for allocating yearly budgets on a 50% national and 50% hospital-specific cost basis. While all other non-NHS third-party payers are currently paying based on DRGs, the adoption of DRG casemix as a National Health Service budget-setting tool has been slower than anticipated. There is now some argument in both the political and academic communities as to the appropriateness of DRGs as a budget-setting criterion as well as to their impact on hospital efficiency in Portugal. This paper uses a two-stage procedure to assess the impact of actual DRG payment on the productivity (through its components, i.e., technological change and technical efficiency change) of diagnostic technology in Portuguese hospitals during the years 1992–1994, using both parametric and non-parametric frontier models. We find evidence that the DRG payment system does appear to have had a positive impact on the productivity and technical efficiency of some commonly employed diagnostic technologies in Portugal during this time span.

Relevance:

20.00%

Publisher:

Abstract:

Often observations are nested within other units. This is particularly the case in the educational sector, where school performance in terms of value added is the result of the school's contribution as well as of pupil academic ability and other features relating to the pupil. Traditionally, the literature uses parametric Multi-Level Models (i.e. models that assume a priori a particular functional form for the production process) to estimate the performance of nested entities. This paper discusses the use of the non-parametric Free Disposal Hull (FDH) model (i.e. one making no a priori assumptions about the production process) as an alternative approach. While taking into account contextual characteristics as well as atypical observations, we show how to decompose non-parametrically the overall inefficiency of a pupil into a unit-specific and a higher-level (i.e. a school) component. Using a sample of entry and exit attainments of 3,017 girls in British ordinary single-sex schools, we test the robustness of the non-parametric and parametric estimates. We find that the two methods agree in their relative measures of the scope for potential attainment improvement. Further, the two methods agree on the variation in pupil attainment and the proportion attributable to the pupil and school levels.
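
As a rough, hypothetical sketch of the basic FDH idea only, and not of the robust estimators or the pupil/school decomposition used in the paper, input-oriented FDH efficiency benchmarks each unit against observed units that produce at least as much output:

```python
# Minimal sketch of input-oriented FDH efficiency (hypothetical data, not the paper's model).
# Each unit uses inputs to produce outputs; a unit is benchmarked only against
# observed units whose outputs are all at least as large as its own.
import numpy as np

inputs = np.array([[4.0], [6.0], [5.0], [8.0]])     # e.g. resources used
outputs = np.array([[10.0], [12.0], [9.0], [12.0]])  # e.g. attainment produced

def fdh_input_efficiency(x0, y0, X, Y):
    # Candidate benchmarks: units producing at least as much of every output.
    dominating = np.all(Y >= y0, axis=1)
    # For each benchmark, the input contraction factor needed to match it;
    # FDH efficiency is the smallest such factor (1.0 means no dominating peer does better).
    ratios = np.max(X[dominating] / x0, axis=1)
    return ratios.min()

for i, (x0, y0) in enumerate(zip(inputs, outputs)):
    score = fdh_input_efficiency(x0, y0, inputs, outputs)
    print(f"unit {i}: FDH input efficiency = {score:.2f}")
```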

Relevance:

20.00%

Publisher:

Abstract:

If, in a correlation test, one or both variables are small whole numbers, scores based on a limited scale, or percentages, a non-parametric correlation coefficient should be considered as an alternative to Pearson's 'r'. Kendall's tau and Spearman's rank correlation (rs) are similar tests, but the former should be considered if the analysis is to be extended to include partial correlations. If the data contain many tied values, then gamma should be considered as a suitable test.
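
For illustration only (the Statnote itself prescribes no software), Pearson's, Spearman's, and Kendall's coefficients might be computed in Python with SciPy; the score data below are invented:

```python
# Minimal sketch: rank-based alternatives to Pearson's r for ordinal / limited-scale data.
# Assumes SciPy is installed; scores below are hypothetical small-whole-number data.
from scipy import stats

scores_x = [1, 2, 2, 3, 4, 4, 5, 5, 6, 7]
scores_y = [2, 1, 3, 3, 5, 4, 6, 5, 6, 8]

r, p_r = stats.pearsonr(scores_x, scores_y)        # parametric, assumes normality
rho, p_rho = stats.spearmanr(scores_x, scores_y)   # rank correlation
tau, p_tau = stats.kendalltau(scores_x, scores_y)  # handles ties; extends to partial correlation

print(f"Pearson r    = {r:.3f} (p = {p_r:.4f})")
print(f"Spearman rho = {rho:.3f} (p = {p_rho:.4f})")
print(f"Kendall tau  = {tau:.3f} (p = {p_tau:.4f})")
```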

Relevance:

20.00%

Publisher:

Abstract:

Nonlinear instabilities are responsible for spontaneous pattern formation in a vast number of natural and engineered systems, ranging from biology to galaxy build-up. We propose a new instability mechanism leading to pattern formation in spatially extended nonlinear systems, based on a periodic antiphase modulation of spectrally dependent losses arranged in a zigzag way: an effective filtering is imposed at symmetrically located wave numbers k and -k in alternating order. The properties of this dissipative parametric instability differ from the features of both key classical concepts of modulation instability, i.e., the Benjamin-Feir instability and the Faraday instability. We demonstrate how the dissipative parametric instability can lead to the formation of stable patterns in one- and two-dimensional systems. The proposed instability mechanism is generic and can occur naturally or be implemented in various physical systems.

Relevance:

20.00%

Publisher:

Abstract:

Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, their lack of universality means they cannot provide adequate estimates of effort and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MKII function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification as opposed to the largely data-oriented view traditionally used by FPA. The method uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and utilising this figure to predict the effort and hence the cost of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
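
For context only: the constants below are the published Basic COCOMO organic-mode coefficients rather than figures from this work, and the size figure is invented; the sketch shows how a predicted size in thousands of delivered source instructions (KDSI) feeds the traditional COCOMO effort equation:

```python
# Minimal sketch: Basic COCOMO effort/schedule from an estimated size in KDSI
# (thousands of delivered source instructions). Organic-mode constants from
# Boehm's Basic COCOMO; the size estimate is purely illustrative.
def basic_cocomo_organic(kdsi: float) -> tuple[float, float]:
    effort_pm = 2.4 * kdsi ** 1.05        # effort in person-months
    schedule_m = 2.5 * effort_pm ** 0.38  # development time in months
    return effort_pm, schedule_m

# Suppose a JSD-style sizing step has predicted roughly 32 KDSI for a project.
effort, schedule = basic_cocomo_organic(32.0)
print(f"Estimated effort:   {effort:.1f} person-months")
print(f"Estimated schedule: {schedule:.1f} months")
```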

Relevance:

20.00%

Publisher:

Abstract:

By using an alternative setup for photorefractive parametric oscillation in which wave mixing between the recording beams is avoided, it has become possible to make more detailed comparisons with the space-charge wave theory. In the present paper we compare the experimental features of longitudinal parametric oscillation observed in a crystal of Bi12SiO20 with the theoretical predictions.

Relevance:

20.00%

Publisher:

Abstract:

The influence of beam coupling on photorefractive parametric oscillation generated in a Bi12SiO20 crystal is investigated experimentally by comparing two configurations with and without the presence of beam coupling. It is shown that beam coupling has a great influence; for example, the transversal split of the K/2 subharmonic grating is seen only in the beam-coupling geometry. A case that resembles K/4 subharmonic generation can, however, still be found in the absence of beam coupling.

Relevance:

20.00%

Publisher:

Abstract:

The research is concerned with the application of the computer simulation technique to study the performance of reinforced concrete columns in a fire environment. The effect of three different concrete constitutive models, incorporated in the computer simulation, on the structural response of reinforced concrete columns exposed to fire is investigated. The material models differed mainly in the formulation of the mechanical properties of concrete. The results from the simulation clearly illustrate that a more realistic response of a reinforced concrete column exposed to fire is given by a constitutive model with transient creep or an appropriate strain effect. The relative effect of the three concrete material models is assessed by means of a parametric study, carried out using the results from a series of analyses on columns heated on three sides, which produces substantial thermal gradients. Three different loading conditions were used on the columns: axial loading, and eccentric loading inducing moments in either the same sense as or the opposite sense to those induced by the thermal gradient. An axially loaded column heated on four sides was also considered. The computer modelling technique adopted separated the thermal and structural responses into two distinct computer programs. A finite element heat transfer analysis was used to determine the thermal response of the reinforced concrete columns when exposed to the ISO 834 furnace environment. The temperature distribution histories obtained were then used in conjunction with a structural response program. The effect of the occurrence of spalling on the structural behaviour of reinforced concrete columns is also investigated. There is general recognition of the potential problems of spalling, but no real investigation into what effect spalling has on the fire resistance of reinforced concrete members. In an attempt to address this situation, a method has been developed to model concrete columns exposed to fire which incorporates the effect of spalling. A total of 224 computer simulations were undertaken by varying the amount of concrete lost during a specified period of exposure to fire. An array of six spalling percentages was chosen for one range of simulations, while a two-stage progressive spalling regime was used for a second range. Quantifying the reduction in fire resistance of the columns against the amount of spalling, the heating and loading patterns, and the time at which the concrete spalls indicates that the amount of spalling is the most significant variable in the reduction of fire resistance.