16 results for model determination

in Deakin Research Online - Australia


Relevance: 60.00%

Abstract:

This paper addresses the problem of determining which 3D shape is present, and more importantly, the dimensions of the shape in a scene. This is performed in an active vision system because it reduces the complexity of the problem through the use of gaze stabilization, choice of foveation point, and selective processing by adaptively processing regions of interest. In our case, only a small number of equations and parameters are needed for each shape and these are incorporated into functional descriptions of the shapes.

Relevance: 60.00%

Abstract:

An important task in multiple-criteria decision making is how to learn the weights and parameters of an aggregation function from empirical data. We consider this in the context of quantifying ecological diversity, where such data is to be obtained as a set of pairwise comparisons specifying that one community should be considered more diverse than another. A problem that arises is how to collect a sufficient amount of data for reliable model determination without overloading individuals with the number of comparisons they need to make. After providing an algorithm for determining criteria weights and an overall ranking from such information, we then investigate the improvement in accuracy if ranked 3-tuples are supplied instead of pairs. We found that aggregation models could be determined accurately from significantly fewer 3-tuple comparisons than pairs. © 2013 IEEE.
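
The learning step described above can be illustrated with a small largest-margin linear program for a weighted arithmetic mean; the formulation, variable names and toy data below are an assumed sketch for illustration, not the authors' algorithm.

    # Sketch: learn weights of a weighted mean from comparisons "community a is
    # judged more diverse than community b" by maximising the smallest margin
    # w.(a - b) subject to w >= 0 and sum(w) = 1 (assumed formulation).
    import numpy as np
    from scipy.optimize import linprog

    def fit_weights(pairs, n_criteria):
        """pairs: list of (a, b) criterion-score vectors, a judged more diverse than b."""
        n = n_criteria
        c = np.zeros(n + 1); c[-1] = -1.0            # variables [w_1..w_n, t]; maximise t
        # margin constraints: t - w.(a - b) <= 0 for every comparison
        A_ub = np.array([np.append(-(np.asarray(a) - np.asarray(b)), 1.0) for a, b in pairs])
        b_ub = np.zeros(len(pairs))
        A_eq = np.array([[1.0] * n + [0.0]]); b_eq = np.array([1.0])   # weights sum to one
        bounds = [(0.0, 1.0)] * n + [(None, None)]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        return res.x[:n]

    # toy usage: three criteria, two comparisons
    w = fit_weights([([0.9, 0.2, 0.4], [0.5, 0.3, 0.1]),
                     ([0.6, 0.8, 0.7], [0.7, 0.4, 0.2])], n_criteria=3)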

Relevance: 30.00%

Abstract:

The aim of this study is to assess whether universities are meeting the needs of students in large marketing classes. In so doing, the study investigates the application of self-determination theory and psychological needs satisfaction. The basic needs scale, comprising three constructs (Control, Competence and Caring), was adapted and used to evaluate students’ perception of an introductory marketing subject. The study used a multi-method approach consisting of a literature review, a qualitative phase involving in-depth interviews with marketing teaching staff and focus groups with marketing students, and a survey of students about introductory-level marketing. An adapted version of the basic psychological needs scale was included in a questionnaire administered to a convenience sample of 366 students. MANOVA, ANOVA and descriptive statistics were used to analyse the data. The results show that the psychological needs satisfaction of many students is not being fully realised. It was also found that marketing degree students enjoyed the challenges and were more stimulated by the subject. The higher-achieving students enjoyed the challenge of the subject more than the lower-achieving students. As a result of this study, there are three suggestions for further research. Firstly, further study should compare subjects with relatively small enrolments to those with large enrolments to corroborate the value of this method of assessing student satisfaction. Secondly, the use of a larger sample across other universities would confirm whether these findings hold for other institutions. Finally, it is suggested that a structural model should be developed to extend this investigation of student satisfaction and the constructs used in the study.

Relevance: 30.00%

Abstract:

This paper develops a model of exchange rate determination within an error correction framework. The intention is to identify both long- and short-term determinants that can be used to forecast the AUD/US exchange rate. The paper identifies a set of significant variables associated with exchange rate movements over a twenty-year period from 1984 to 2004. Specifically, the overnight interest rate differential, Australia's foreign trade-weighted exposure to commodity prices and exchange rate volatility are variables identified as able to explain movements in the AUD/US dollar relationship. An error correction model is subsequently constructed that incorporates an equilibrium correction term, a short-term interest rate differential variable, a commodity price variable and a proxy for exchange rate volatility. The model is then used to forecast out of sample and is found to dominate a naïve random walk model based on three different metrics.
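
As a rough illustration of the modelling approach described above, the sketch below fits a single-equation error correction model in two steps (Engle-Granger style) with statsmodels; the column names and the exact variable set are assumptions, not the paper's specification.

    # Two-step error correction sketch: long-run levels regression, then short-run
    # dynamics in differences with the lagged equilibrium error (assumed columns).
    import pandas as pd
    import statsmodels.api as sm

    def fit_ecm(df):
        """df columns (assumed): 'aud_usd', 'rate_diff', 'commodity', 'volatility'."""
        # 1) long-run (cointegrating) regression in levels
        X_lr = sm.add_constant(df[['rate_diff', 'commodity']])
        long_run = sm.OLS(df['aud_usd'], X_lr).fit()
        ec_term = long_run.resid.shift(1).rename('ec_term')   # lagged equilibrium error

        # 2) short-run dynamics in first differences plus the correction term
        d = df.diff()
        X_sr = pd.concat([ec_term, d[['rate_diff', 'commodity', 'volatility']]], axis=1)
        X_sr = sm.add_constant(X_sr).dropna()
        y = d['aud_usd'].loc[X_sr.index]
        return sm.OLS(y, X_sr).fit()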

Relevance: 30.00%

Abstract:

The overarching goal of this dissertation was to evaluate the contextual components of instructional strategies for the acquisition of complex programming concepts. A meta-knowledge processing model is proposed, on the basis of the research findings, thereby facilitating the selection of media treatment for electronic courseware. When implemented, this model extends the work of Smith (1998), as a front-end methodology, for his glass-box interpreter called Bradman, for teaching novice programmers. Technology now provides the means to produce individualized instructional packages with relative ease. Multimedia and Web courseware development accentuate a highly graphical (or visual) approach to instructional formats. Typically, little consideration is given to the effectiveness of screen-based visual stimuli, and curiously, students are expected to be visually literate, despite the complexity of human-computer interaction. Visual literacy is much harder for some people to acquire than for others (see Chapter Four: Conditions-of-the-Learner). An innovative research programme was devised to investigate the interactive effect of instructional strategies, enhanced with text-plus-textual metaphors or text-plus-graphical metaphors, and cognitive style, on the acquisition of a special category of abstract (process) programming concept. This type of concept was chosen to focus on the role of analogic knowledge involved in computer programming. The results are discussed within the context of the internal/external exchange process, drawing on Ritchey's (1980) concepts of within-item and between-item encoding elaborations. The methodology developed for the doctoral project integrates earlier research knowledge in a novel, interdisciplinary, conceptual framework: concept learning models from instructional science in the USA; the cognitive style construct as defined in British cognitive psychology and human memory research; and measurement tools for instructional outcomes from Australian educational research. The experimental design consisted of a screening test to determine cognitive style, a pretest to determine prior domain knowledge in abstract programming knowledge elements, the instruction period, and a post-test to measure improved performance. This research design provides a three-level discovery process to articulate: 1) the fusion of strategic knowledge required by the novice learner for dealing with contexts within instructional strategies; 2) acquisition of knowledge using measurable instructional outcomes and learner characteristics; and 3) knowledge of the innate environmental factors which influence the instructional outcomes. This research has successfully identified the interactive effect of instructional strategy, within an individual's cognitive style construct, on the acquisition of complex programming concepts. However, the significance of the three-level discovery process lies in the scope of the methodology to inform the design of a meta-knowledge processing model for instructional science. Firstly, the British cognitive style testing procedure is a low-cost, user-friendly computer application that effectively measures an individual's position on the two cognitive style continua (Riding & Cheema, 1991). Secondly, the QUEST Interactive Test Analysis System (Izard, 1995) allows for a probabilistic determination of an individual's knowledge level, relative to other participants and relative to test-item difficulties.
Test-items can be related to skill levels and, consequently, can be used by instructional scientists to measure knowledge acquisition. Finally, an Effect Size Analysis (Cohen, 1977) allows for a direct comparison between treatment groups, giving a statistical measurement of how large an effect the independent variables have on the dependent outcomes. Combined with QUEST's hierarchical positioning of participants, this tool can assist in identifying preferred learning conditions for the evaluation of treatment groups. By combining these three assessment analysis tools into instructional research, a computerized learning shell, customised for individuals' cognitive constructs, can be created (McKay & Garner, 1999). While this approach has widespread application, individual researchers/trainers would nonetheless need to validate, with an extensive pilot study programme (McKay, 1999a; McKay, 1999b), the interactive effects within their specific learning domain. Furthermore, the instructional material need not be limited to a textual/graphical comparison, but could be applied to any two or more instructional treatments of any kind, for instance a structured versus an exploratory strategy. The possibilities and combinations are believed to be endless, provided the focus is maintained on linking the front-end identification of cognitive style with an improved performance outcome. My in-depth analysis provides a better understanding of the interactive effects of the cognitive style construct and instructional format on the acquisition of abstract concepts involving spatial relations and logical reasoning. In providing the basis for a meta-knowledge processing model, this research is expected to be of interest to educators, cognitive psychologists, communications engineers and computer scientists specialising in computer-human interactions.
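
The Effect Size Analysis mentioned above reduces, in its simplest form, to Cohen's d between two treatment groups; the sketch below shows that calculation with placeholder post-test scores.

    # Cohen's d: standardised mean difference between two treatment groups
    import numpy as np

    def cohens_d(group_a, group_b):
        a, b = np.asarray(group_a, float), np.asarray(group_b, float)
        pooled_var = (((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                      / (len(a) + len(b) - 2))                 # pooled variance
        return (a.mean() - b.mean()) / np.sqrt(pooled_var)

    d = cohens_d([72, 80, 65, 90, 78], [60, 68, 55, 70, 62])   # toy post-test scores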

Relevance: 30.00%

Abstract:

We consider a random design model based on independent and identically distributed pairs of observations (Xi, Yi), where the regression function m(x) is given by m(x) = E(Yi | Xi = x) with one independent variable. In a nonparametric setting the aim is to produce a reasonable approximation to the unknown function m(x) when we have no precise information about the form of the true density f(x) of X. We describe a procedure for estimating the nonparametric regression model at a given point by an appropriately constructed fixed-width (2d) confidence interval with confidence coefficient of at least 1 − α, where d (> 0) and α ∈ (0, 1) are two preassigned values. Fixed-width confidence intervals are developed using both Nadaraya-Watson and local linear kernel estimators of nonparametric regression with data-driven bandwidths. The sample size was optimized using purely sequential and two-stage sequential procedures together with asymptotic properties of the Nadaraya-Watson and local linear estimators. A large-scale simulation study was performed to compare their coverage accuracy. The numerical results indicate that the confidence bands based on the local linear estimator perform better than those constructed using the Nadaraya-Watson estimator. However, both estimators are shown to have asymptotically correct coverage properties.
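
The two point estimators being compared are standard; a minimal sketch with a Gaussian kernel is given below. The bandwidth is fixed here for brevity, whereas the paper uses data-driven bandwidths and sequential sample-size rules.

    # Nadaraya-Watson and local linear estimates of m(x0) from pairs (X, Y)
    import numpy as np

    def gauss_kernel(u):
        return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

    def nadaraya_watson(x0, X, Y, h):
        w = gauss_kernel((X - x0) / h)
        return np.sum(w * Y) / np.sum(w)

    def local_linear(x0, X, Y, h):
        # weighted least squares of Y on (1, X - x0); the intercept estimates m(x0)
        w = gauss_kernel((X - x0) / h)
        Z = np.column_stack([np.ones_like(X), X - x0])
        beta = np.linalg.solve(Z.T @ (w[:, None] * Z), Z.T @ (w * Y))
        return beta[0]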

Relevance: 30.00%

Abstract:

The ordered weighted averaging (OWA) determination method with a stress function was proposed by Yager; it makes the OWA operator elements scatter in the shape of the stress function. In this paper, we extend the OWA determination with the stress function method using an optimization model. The proposed method transforms the OWA optimal solution elements into the interpolation points of the stress function. The proposed method extends the basic form of the stress function method with both scale and vertical shift transformations. We also explore a number of properties of this optimization-based stress function method. The OWA operator optimal solution elements can be distributed in the shape of the given stress function in a parameterized way, in which case the solution always possesses the arithmetic average operator as a special case.
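
For orientation, the basic (non-optimised) form of the stress-function idea takes the OWA weights as normalised samples of the stress function; the sketch below shows only that basic form, not the optimisation model or the scale and vertical-shift extensions proposed in the paper.

    # OWA weights as normalised samples of a stress function f on (0, 1]
    import numpy as np

    def owa_weights_from_stress(stress, n):
        raw = np.array([stress((j + 1) / n) for j in range(n)])
        return raw / raw.sum()

    def owa(values, weights):
        return np.dot(np.sort(values)[::-1], weights)   # weights act on ordered values

    w = owa_weights_from_stress(lambda t: t ** 2, n=5)  # example stress function
    print(owa([0.3, 0.9, 0.5, 0.7, 0.1], w))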

Relevance: 30.00%

Abstract:

We present a method for foreground/background separation of audio using a background modelling technique. The technique models the background in an online, unsupervised, and adaptive fashion, and is designed for application to long term surveillance and monitoring problems. The background is determined using a statistical method to model the states of the audio over time. In addition, three methods are used to increase the accuracy of background modelling in complex audio environments. Such environments can cause the failure of the statistical model to accurately capture the background states. An entropy-based approach is used to unify background representations fragmented over multiple states of the statistical model. The approach successfully unifies such background states, resulting in a more robust background model. We adaptively adjust the number of states considered background according to background complexity, resulting in the more accurate classification of background models. Finally, we use an auxiliary model cache to retain potential background states in the system. This prevents the deletion of such states due to a rapid influx of observed states that can occur for highly dynamic sections of the audio signal. The separation algorithm was successfully applied to a number of audio environments representing monitoring applications.
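
A heavily simplified sketch of an online background-state model in this spirit is shown below (nearest-state matching, observation counts, a slow adaptation rate and a background-share threshold); the entropy-based state unification, adaptive state count and auxiliary model cache described above are omitted, and all thresholds are assumptions.

    # Toy online audio background model: nearest-state matching with counts
    import numpy as np

    class AudioBackgroundModel:
        def __init__(self, match_dist=1.0, bg_share=0.3, max_states=8, lr=0.05):
            self.states, self.counts = [], []
            self.match_dist, self.bg_share = match_dist, bg_share
            self.max_states, self.lr = max_states, lr

        def update(self, feature):
            """feature: 1-D frame feature (e.g. an MFCC vector); returns True if background."""
            feature = np.asarray(feature, float)
            if self.states:
                dists = [np.linalg.norm(feature - s) for s in self.states]
                k = int(np.argmin(dists))
                if dists[k] < self.match_dist:
                    self.counts[k] += 1
                    self.states[k] += self.lr * (feature - self.states[k])   # slow adaptation
                    return self.counts[k] / sum(self.counts) >= self.bg_share
            if len(self.states) < self.max_states:
                self.states.append(feature.copy()); self.counts.append(1)
            return False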

Relevance: 30.00%

Abstract:

Following the recent success in quantitative analysis of essential fatty acid compositions in a commercial microencapsulated fish oil (μEFO) supplement, we extended the application of the portable attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopic technique and partial least squares regression (PLSR) analysis for rapid determination of total protein content, the other major component in most commercial μEFO powders. In contrast to the traditional chromatographic methodology used in routine amino acid analysis (AAA), the ATR-FTIR spectra of the μEFO powder can be acquired directly from its original powder form with no requirement for any sample preparation, making the technique exceptionally fast, noninvasive, and environmentally friendly as well as cost effective, and hence eminently suitable for routine use by industry. By optimizing the spectral region of interest and the number of latent factors through the developed PLSR strategy, a good linear calibration model was produced, as indicated by an excellent coefficient of determination (R² = 0.9975), using standard μEFO powders with total protein contents in the range of 140-450 mg/g. The prediction of the protein contents acquired from an independent validation set through the optimized PLSR model was highly accurate, as evidenced by (1) a good linear fit (R² = 0.9759) in the plot of predicted versus reference values, which were obtained from a standard AAA method, (2) the lowest root mean square error of prediction (11.64 mg/g), and (3) a high residual predictive deviation (6.83), ranked at a very good level of predictive quality, indicating high robustness and good predictive performance of the achieved PLSR calibration model. The study therefore demonstrated the potential application of the portable ATR-FTIR technique when used together with PLSR analysis for rapid online monitoring of the two major components (i.e., oil and protein contents) in finished μEFO powders in the actual manufacturing setting.
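
The PLSR calibration step can be sketched with scikit-learn as below; the spectra and protein values are synthetic placeholders, and the spectral-region and latent-factor optimisation described above is reduced to a fixed choice of components.

    # PLSR calibration sketch: spectra X -> protein content y, cross-validated
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 600))        # 40 ATR-FTIR spectra, 600 wavenumber points (synthetic)
    y = rng.uniform(140, 450, size=40)    # reference protein content, mg/g (placeholder)

    pls = PLSRegression(n_components=6)   # number of latent factors would normally be optimised
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
    rmse = float(np.sqrt(np.mean((y - y_cv) ** 2)))
    r2 = float(np.corrcoef(y, y_cv)[0, 1] ** 2)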

Relevance: 30.00%

Abstract:

The mechanical behaviour of Fe-18Mn-0.6C-1Al (wt%) TWIP steel was modelled in the temperature range from room temperature to 400°C. The proposed constitutive model was based on the Kocks-Mecking-Estrin (KME) model. The model parameters were determined using extensive experimental measurements of the physical parameters such as the dislocation mean free path and the volume fraction of twinned grains. More than 100 grains with a total area of ~300 μm² were examined at different strain levels over the entire stress-strain curve. Uniaxial tensile deformation of the TWIP steel was modelled for different deformation temperatures using a modelling approach which considers two distinct populations of grains: twinned and twin-free ones. A key point of the work was a meticulous experimental determination of the evolution of the volume fraction of twinned grains during uniaxial tensile deformation. This information was implemented in a phase-mixture model that yielded a very good agreement with the experimental tensile behaviour for the tested range of deformation temperatures. © 2014 Elsevier B.V.
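
The single-phase Kocks-Mecking-Estrin backbone underlying such a constitutive model can be sketched as a dislocation-density evolution law with a Taylor-type flow stress; the parameter values below are placeholders, and the paper's two-population (twinned / twin-free) phase-mixture treatment is not reproduced.

    # KME-type sketch: drho/deps = k1*sqrt(rho) - k2*rho, sigma = sigma0 + alpha*M*G*b*sqrt(rho)
    import numpy as np

    def kme_flow_curve(eps, rho0=1e12, k1=2e8, k2=5.0,
                       sigma0=150e6, alpha=0.4, M=3.06, G=65e9, b=2.5e-10):
        d_eps = np.diff(eps, prepend=0.0)
        rho = np.empty_like(eps)
        r = rho0
        for i, de in enumerate(d_eps):
            r += (k1 * np.sqrt(r) - k2 * r) * de          # forward-Euler integration
            rho[i] = r
        return sigma0 + alpha * M * G * b * np.sqrt(rho)  # flow stress in Pa

    strain = np.linspace(0.0, 0.4, 200)
    stress = kme_flow_curve(strain)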

Relevance: 30.00%

Abstract:

Acidic potassium permanganate chemiluminescence enables direct post-column detection of glutathione, but its application to assess the redox state of a wider range of biological fluids and tissues is limited by its sensitivity. Herein we show that the simple on-line addition of an aqueous formaldehyde solution not only enhances the sensitivity of the procedure by two orders of magnitude, but also provides a remarkable improvement in the selectivity of the reagent towards thiols such as glutathione (compared to phenols and amino acids that do not possess a thiol group). This enhanced mode of detection was applied to the determination of glutathione and its corresponding disulfide species in homogenised striatum samples taken from both wild type mice and the R6/1 transgenic mouse model of Huntington's disease, at both 8 and 12 weeks of age. No significant difference was observed between the GSH/GSSG ratios of wild type mice and R6/1 mice at either age group, suggesting that the early disease progression had not significantly altered the intracellular redox environment.

Relevance: 30.00%

Abstract:

As is well known, when using an information criterion to select the number of common factors in factor models, the appropriate penalty is generally indeterminate in the sense that it can be scaled by an arbitrary constant, c say, without affecting consistency. In an influential paper, Hallin and Liška (J Am Stat Assoc 102:603–617, 2007) propose a data-driven procedure for selecting the appropriate value of c. However, by removing one source of indeterminacy, the new procedure simultaneously creates several new ones, which make for rather complicated implementation, a problem that has been largely overlooked in the literature. By providing an extensive analysis using both simulated and real data, the current paper fills this gap.
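
A minimal sketch of the scaled-penalty criterion under discussion is given below: IC_c(k) = log V(k) + c·k·g(N, T), with V(k) proxied by the principal-component residual variance. The loop at the end merely scans a few values of c; the Hallin-Liška stability-based calibration of c is not implemented here, and all details are assumed for illustration.

    # Information criterion with a penalty scaled by an arbitrary constant c
    import numpy as np

    def select_factors(X, kmax, c):
        T, N = X.shape
        eig = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
        penalty = (N + T) / (N * T) * np.log(min(N, T))
        ic = [np.log(eig[k:].sum() / eig.sum()) + c * k * penalty for k in range(1, kmax + 1)]
        return int(np.argmin(ic)) + 1

    rng = np.random.default_rng(1)
    F = rng.normal(size=(200, 3))                              # three true factors
    X = F @ rng.normal(size=(3, 50)) + 0.5 * rng.normal(size=(200, 50))
    for c in (0.5, 1.0, 2.0, 4.0):                             # how the chosen k responds to c
        print(c, select_factors(X, kmax=8, c=c))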

Relevance: 30.00%

Abstract:

The often violent emergence of new independent states following the end of the Cold War generated discussion about the normative grounds of territorial separatism. A number of opposing approaches surfaced debating whether and under which circumstances there is a right for a community to secede from its host country. Overwhelmingly, these studies placed emphasis on the right to secession and neglected the moral stance of secessionist movements as agents in international relations. In this book Costas Laoutides explores the collective moral agency involved in secessionist struggles offering a theoretical model for the collective responsibility of secessionist groups. Case-studies on the Kurds and the people of Moldova-Transdniestria illustrate the author’s theoretical arguments as he seeks to establish how, although the principle of self-determination was envisaged as a means of gradually bestowing political power upon the people, it never managed to realize its full potential because it was interpreted strictly within a framework of exclusionary politics of identity.

Relevance: 30.00%

Abstract:

In-silico optimisation of a two-dimensional high performance liquid chromatography (2D-HPLC) separation protocol has been developed for the interrogation of methamphetamine samples, including model, real-world seizure, and laboratory-synthesised samples. The protocol used Drylab® software to rapidly identify the optimum separation conditions from a library of chromatography columns. The optimum separation space was provided by the Phenomenex Kinetex PFP column (first dimension) and an Agilent Poroshell 120 EC-C18 column (second dimension). To facilitate a rapid 2D-HPLC analysis, the particle-packed C18 column was replaced with a Phenomenex Onyx Monolithic C18 without sacrificing separation performance. The Drylab® optimised and experimental separations matched very closely, highlighting the robust nature of HPLC simulations. The chemical information gained from an intermediate methamphetamine sample was significant and complemented that generated from a pure seizure sample. The influence of the two-dimensional separation on the analytical figures of merit was also investigated. The limits of detection for key analytes in the second dimension were determined for methamphetamine (4.59 × 10⁻⁴ M), pseudoephedrine (4.03 × 10⁻⁴ M), caffeine (5.16 × 10⁻⁴ M), aspirin (9.32 × 10⁻⁴ M), paracetamol (5.93 × 10⁻⁴ M) and procaine (2.02 × 10⁻³ M).
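
Limit-of-detection figures like those quoted above are typically derived from a calibration line; the sketch below uses the common 3.3·σ/slope convention, which is an assumption here (the listing does not state which convention was used), with placeholder standards and responses.

    # Limit of detection from a linear calibration: LOD = 3.3 * residual_sd / slope
    import numpy as np

    conc = np.array([1e-4, 2e-4, 4e-4, 8e-4, 1.6e-3])   # standard concentrations, mol/L (toy)
    area = np.array([12.0, 25.1, 49.8, 101.0, 198.5])   # detector response (toy)

    slope, intercept = np.polyfit(conc, area, 1)
    residual_sd = np.std(area - (slope * conc + intercept), ddof=2)
    lod = 3.3 * residual_sd / slope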