964 results for statistical framework


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a framework, design and study of an ambient persuasive interface. We introduce a novel framework of persuasive Cues in Ambient Intelligence (perCues). Based on this framework we designed an application for mobile devices. The application aims to persuade people to abstain from using their cars and to use public mass transportation instead in order to reduce emissions. It contains a bus schedule and information about the pollution status. We evaluated the application in two successive studies regarding user acceptance, opportune moments of use and persuasive effects. The perCues received high acceptance due to their benefit for the users. The results confirm the importance of opportune moments and user acceptance for persuasion. The findings also indicate the persuasive potential of perCues.

Relevance:

20.00%

Publisher:

Abstract:

The tackling of coastal eutrophication requires water protection measures based on status assessments of water quality. The main purpose of this thesis was to evaluate whether it is possible, both scientifically and within the terms of the European Union Water Framework Directive (WFD), to assess the status of coastal marine waters reliably by using phytoplankton biomass (ww) and chlorophyll a (Chl) as indicators of eutrophication in Finnish coastal waters. Empirical approaches were used to study whether the criteria established for determining an indicator are fulfilled. The first criterion (i) was that an indicator should respond to anthropogenic stresses in a predictable manner and have low variability in its response. Summertime Chl could be predicted accurately by nutrient concentrations, but not from the external annual loads alone, because of the rapid effect of primary production and sedimentation close to the loading sources in summer. The most accurate predictions were achieved in the Archipelago Sea, where total phosphorus (TP) and total nitrogen (TN) alone accounted for 87% and 78% of the variation in Chl, respectively. In river estuaries, the TP mass-balance regression model predicted Chl most accurately when nutrients originated from point sources, whereas land-use regression models were most accurate when nutrients originated mainly from diffuse sources. The inclusion of morphometry (e.g. mean depth) in the nutrient models improved the accuracy of the predictions. The second criterion (ii) was associated with the WFD. It requires that an indicator should have type-specific reference conditions, which are defined as "conditions where the values of the biological quality elements are at high ecological status". In establishing reference conditions, the empirical approach could only be used in the outer coastal water types, where historical observations of Secchi depth from the early 1900s are available. The most accurate prediction was achieved in the Quark. In the inner coastal water types, reference Chl values, estimated from present monitoring data, are imprecise - not only because of the less accurate estimation method but also because the intrinsic characteristics, described for instance by morphometry, vary considerably within these extensive inner coastal types. As for phytoplankton biomass, the reference values were less accurate than in the case of Chl, because it was possible to estimate reference conditions for biomass only by using the reconstructed Chl values, not the historical Secchi observations. A paleoecological approach was also applied to estimate annual average reference conditions for Chl. In Laajalahti, an urban embayment off Helsinki that was heavily loaded by municipal waste waters in the 1960s and 1970s, reference conditions prevailed in the mid- and late 1800s. The recovery of the bay from pollution has been delayed as a consequence of benthic release of nutrients, and Laajalahti will probably not achieve the good quality objectives of the WFD on time. The third criterion (iii) was associated with coastal management, including the resources it has available. Analyses of Chl are cheap and fast to carry out compared with analyses of phytoplankton biomass and species composition, a fact which affects the number of samples that can be taken and thereby the reliability of the assessments. However, analyses of phytoplankton biomass and species composition provide more metrics for ecological classification, metrics which reveal aspects of eutrophication that Chl alone does not.
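As an illustration of the kind of nutrient-chlorophyll regression described above, the sketch below fits a least-squares model of log-transformed Chl against log-transformed TP and reports the explained variance. The data, the log-log functional form and the resulting coefficients are assumptions for illustration only; they are not taken from the thesis.

```python
import numpy as np

# Hypothetical illustration of a nutrient-chlorophyll regression:
# log10(Chl) regressed on log10(TP). The data below are synthetic;
# the coefficients and R^2 carry no empirical meaning.
rng = np.random.default_rng(1)
log_tp = rng.uniform(1.0, 2.0, size=50)                  # log10 total phosphorus, synthetic
log_chl = -0.5 + 1.1 * log_tp + rng.normal(0, 0.1, 50)   # synthetic response

# Ordinary least squares: log10(Chl) = b0 + b1 * log10(TP)
X = np.column_stack([np.ones_like(log_tp), log_tp])
beta, *_ = np.linalg.lstsq(X, log_chl, rcond=None)

fitted = X @ beta
r2 = 1 - np.sum((log_chl - fitted) ** 2) / np.sum((log_chl - log_chl.mean()) ** 2)
print(f"intercept={beta[0]:.2f}, slope={beta[1]:.2f}, R^2={r2:.2f}")
```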

Relevance:

20.00%

Publisher:

Abstract:

Many species inhabit fragmented landscapes, resulting either from anthropogenic or from natural processes. The ecological and evolutionary dynamics of spatially structured populations are affected by a complex interplay between endogenous and exogenous factors. The metapopulation approach, simplifying the landscape to a discrete set of patches of breeding habitat surrounded by unsuitable matrix, has become a widely applied paradigm for the study of species inhabiting highly fragmented landscapes. In this thesis, I focus on the construction of biologically realistic models and their parameterization with empirical data, with the general objective of understanding how the interactions between individuals and their spatially structured environment affect ecological and evolutionary processes in fragmented landscapes. I study two hierarchically structured model systems: the Glanville fritillary butterfly in the Åland Islands, and a system of two interacting aphid species in the Tvärminne archipelago, both located in south-western Finland. The interesting and challenging feature of both study systems is that the population dynamics occur over multiple spatial scales that are linked by various processes. My main emphasis is on the development of mathematical and statistical methodologies. For the Glanville fritillary case study, I first build a Bayesian framework for the estimation of death rates and capture probabilities from mark-recapture data, with the novelty of accounting for variation among individuals in capture probabilities and survival. I then characterize the dispersal phase of the butterflies by deriving a mathematical approximation of a diffusion-based movement model applied to a network of patches. I use the movement model as a building block to construct an individual-based evolutionary model for the Glanville fritillary butterfly metapopulation. I parameterize the evolutionary model using a pattern-oriented approach, and use it to study how landscape structure affects the evolution of dispersal. For the aphid case study, I develop a Bayesian model of hierarchical multi-scale metapopulation dynamics, where the observed extinction and colonization rates are decomposed into intrinsic rates operating specifically at each spatial scale. In summary, I show how analytical approaches, hierarchical Bayesian methods and individual-based simulations can be used individually or in combination to tackle complex problems from many different viewpoints. In particular, hierarchical Bayesian methods provide a useful tool for decomposing ecological complexity into more tractable components.
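To make the colonization-extinction dynamics of a patch network concrete, here is a minimal stochastic patch-occupancy simulation in the spirit of incidence-function metapopulation models. Patch locations, areas and all parameter values are invented for illustration; this is not the hierarchical Bayesian model developed in the thesis.

```python
import numpy as np

# Minimal stochastic patch-occupancy simulation (incidence-function style):
# yearly extinctions and colonizations on a network of habitat patches.
# Patch coordinates, areas and all parameters are invented for illustration.
rng = np.random.default_rng(0)
n = 30
xy = rng.uniform(0, 10, size=(n, 2))       # patch coordinates (km), synthetic
area = rng.lognormal(0.0, 0.5, n)          # patch areas (ha), synthetic
dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)

alpha, e, y = 1.0, 0.2, 3.0                # dispersal, extinction, colonization parameters (assumed)
occ = rng.random(n) < 0.5                  # initial occupancy pattern

for year in range(50):
    # connectivity: contribution of occupied patches, excluding the focal patch itself
    S = (np.exp(-alpha * dist) * (occ * area)[None, :]).sum(axis=1) - occ * area
    col = S**2 / (S**2 + y**2)             # colonization probability of empty patches
    ext = np.minimum(1.0, e / area)        # extinction probability of occupied patches
    u = rng.random(n)
    occ = np.where(occ, u > ext, u < col)

print("patches occupied after 50 years:", int(occ.sum()))
```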

Relevance:

20.00%

Publisher:

Abstract:

There is growing interest in autonomously collecting or manipulating objects in remote or unknown environments, such as mountains, gullies, bushland, or rough terrain. Conventional methods using manned or remotely controlled aircraft have several limitations. The capability of small Unmanned Aerial Vehicles (UAVs) used in parallel with robotic manipulators could overcome some of these limitations. By enabling the autonomous exploration of naturally hazardous environments, as well as areas which are biologically, chemically, or radioactively contaminated, it is possible to collect samples and data from such environments without directly exposing personnel to those risks. This paper covers the design, integration, and initial testing of a framework for an outdoor mobile manipulation UAV. The framework is designed to allow further integration and testing of complex control theories, with the capability to operate outdoors in unknown environments. The results obtained act as a reference for the effectiveness of the integrated sensors and low-level control methods used for the preliminary testing, as well as identifying the key technologies needed for the development of an outdoor-capable system.

Relevance:

20.00%

Publisher:

Abstract:

The development and sustained contribution of the Systems Theory Framework to career development theory and practice are well documented in national and international literatures. In addition to its contribution to theory integration, it has added to the growing literature on connecting career theory and practice, in particular for non-Western populations. It has also been the basis of the development of a broad array of constructivist approaches to career counselling, and indeed of specific reflective career assessment activities. This article begins with a brief history of the Systems Theory Framework, which is then followed by a rationale for its development. The contribution of the Systems Theory Framework to theory and practice is then described prior to concluding comments by the authors.

Relevance:

20.00%

Publisher:

Abstract:

Since its inception, the Systems Theory Framework of career development has afforded ready translation into practice, especially into career counselling and qualitative career assessment. Through its clearly articulated constructs and the clarity of its diagrammatic representation, the Systems Theory Framework has facilitated the development of qualitative career assessment instruments as well as a quantitative measure. This article briefly overviews these practical applications of the Systems Theory Framework as well as its application in career counselling through a storytelling approach. The article concludes by offering a synthesis of and considering future directions for the Systems Theory Framework’s practical applications.

Relevance:

20.00%

Publisher:

Abstract:

Cyclostationary analysis has proven effective in identifying signal components for diagnostic purposes. A key descriptor in this framework is the cyclic power spectrum, traditionally estimated by the averaged cyclic periodogram and the smoothed cyclic periodogram. A lengthy debate about the best estimator finally found a solution in a cornerstone work by Antoni, who proposed a unified form for the two families, thus allowing a detailed statistical study of their properties. Since then, the focus of cyclostationary research has shifted towards algorithms, in terms of computational efficiency and simplicity of implementation. Traditional algorithms have proven computationally inefficient, and the sophisticated "cyclostationary" definition of these estimators slowed their spread in industry. The only attempt to increase the computational efficiency of cyclostationary estimators is represented by the cyclic modulation spectrum. This indicator exploits the relationship between cyclostationarity and envelope analysis. The link with envelope analysis allows a leap in computational efficiency and provides a "way in" for understanding by industrial engineers. However, the new estimator lies outside the unified form described above, and an unbiased version of the indicator has not been proposed. This paper will therefore extend the analysis of envelope-based estimators of the cyclic spectrum, proposing a new approach to include them in the unified form of cyclostationary estimators. This will enable the definition of a new envelope-based algorithm and a detailed analysis of the properties of the cyclic modulation spectrum. The computational efficiency of envelope-based algorithms will also be discussed quantitatively for the first time in comparison with the averaged cyclic periodogram. Finally, the algorithms will be validated with numerical and experimental examples.
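The link between cyclostationarity and envelope analysis mentioned above can be sketched as follows: take a short-time Fourier transform, square the magnitude of each carrier-frequency band, and Fourier transform that band-wise envelope along time to obtain cyclic (modulation) frequencies. This is a rough illustration with an assumed test signal and assumed parameters, not the estimator or the unified form analysed in the paper.

```python
import numpy as np
from scipy.signal import stft

# Rough sketch of a cyclic-modulation-spectrum style estimate:
# STFT -> squared magnitude per carrier band -> FFT along time (cyclic frequency).
# The test signal and all parameters are assumed for illustration only.
fs = 20_000.0
t = np.arange(0, 2.0, 1 / fs)
carrier = np.sin(2 * np.pi * 3000 * t)                         # resonance-like carrier
modulation = 0.5 * (1 + np.sign(np.sin(2 * np.pi * 50 * t)))   # 50 Hz amplitude modulation
x = modulation * carrier + 0.1 * np.random.default_rng(0).normal(size=t.size)

f, tau, Z = stft(x, fs=fs, nperseg=256, noverlap=192)  # time-frequency map
env = np.abs(Z) ** 2                                   # squared envelope per carrier band
env -= env.mean(axis=1, keepdims=True)                 # remove DC before cyclic FFT
cms = np.abs(np.fft.rfft(env, axis=1))                 # modulation content per band
dt = tau[1] - tau[0]
alpha = np.fft.rfftfreq(env.shape[1], d=dt)            # cyclic frequencies (Hz)

band = np.argmin(np.abs(f - 3000))                     # inspect the band near the 3 kHz carrier
peak = alpha[1 + np.argmax(cms[band, 1:])]             # strongest nonzero cyclic frequency
print(f"dominant cyclic frequency near 3 kHz band: {peak:.1f} Hz")
```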

Relevance:

20.00%

Publisher:

Abstract:

Population dynamics are generally viewed as the result of intrinsic (purely density dependent) and extrinsic (environmental) processes. Both components, and potential interactions between the two, have to be modelled in order to understand and predict the dynamics of natural populations; a topic that is of great importance in population management and conservation. This thesis focuses on modelling environmental effects in population dynamics and how the effects of potentially relevant environmental variables can be statistically identified and quantified from time series data. Chapter I presents some useful models of multiplicative environmental effects for unstructured density dependent populations. The presented models can be written as standard multiple regression models that are easy to fit to data. Chapters II-IV constitute empirical studies that statistically model environmental effects on the population dynamics of several migratory bird species with different life history characteristics and migration strategies. In Chapter II, spruce cone crops are found to have a strong positive effect on the population growth of the great spotted woodpecker (Dendrocopos major), while cone crops of pine, another important food resource for the species, do not effectively explain population growth. The study compares rate- and ratio-dependent effects of cone availability, using state-space models that distinguish between process and observation error in the time series data. Chapter III shows how drought, in combination with settling behaviour during migration, produces asymmetric spatially synchronous patterns of population dynamics in North American ducks (genus Anas). Chapter IV investigates the dynamics of a Finnish population of skylark (Alauda arvensis), and points out effects of rainfall and habitat quality on population growth. Because the skylark time series and some of the environmental variables included show strong positive autocorrelation, the statistical significances are calculated using a Monte Carlo method in which random autocorrelated time series are generated. Chapter V is a simulation-based study showing that ignoring observation error in analyses of population time series data can bias the estimated effects and measures of uncertainty if the environmental variables are autocorrelated. It is concluded that the use of state-space models is an effective way to reach more accurate results. In summary, there are several biological assumptions and methodological issues that can affect the inferential outcome when estimating environmental effects from time series data, and that therefore need special attention. The functional form of the environmental effects and potential interactions between environment and population density are important to deal with. Other issues that should be considered are assumptions about density dependent regulation, modelling potential observation error, and, when needed, accounting for spatial and/or temporal autocorrelation.
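A concrete instance of the multiple-regression formulation referred to above is a Gompertz-type model in which the log growth rate depends linearly on log density and an environmental covariate. The sketch below simulates such a series and recovers the coefficients by ordinary least squares; all parameter values and the covariate are assumptions for illustration, not results from the thesis.

```python
import numpy as np

# Sketch of a density-dependent model with a multiplicative environmental effect:
#   log N[t+1] - log N[t] = a + b*log N[t] + c*E[t] + noise,
# which becomes an ordinary multiple regression after log-transforming the series.
# Parameters and the simulated covariate are assumptions for illustration only.
rng = np.random.default_rng(3)
T = 60
a, b, c = 1.0, -0.25, 0.4
E = rng.normal(size=T)                     # environmental covariate (e.g. a cone crop index)
logN = np.empty(T + 1)
logN[0] = 4.0
for t in range(T):
    logN[t + 1] = logN[t] + a + b * logN[t] + c * E[t] + rng.normal(0, 0.1)

r = np.diff(logN)                          # realised log growth rates
X = np.column_stack([np.ones(T), logN[:-1], E])
est, *_ = np.linalg.lstsq(X, r, rcond=None)
print("estimated (a, b, c):", np.round(est, 2))
```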

Relevance:

20.00%

Publisher:

Abstract:

Robotics is taught in many Australian ICT classrooms, in both primary and secondary schools. Robotics activities, including those developed using the LEGO Mindstorms NXT technology, are mathematics-rich and provide fertile ground for learners to develop and extend their mathematical thinking. However, this context for learning mathematics is often under-exploited. In this paper a variant of the model construction sequence (Lesh, Cramer, Doerr, Post, & Zawojewski, 2003) is proposed, with the purpose of explicitly integrating robotics and mathematics teaching and learning. Lesh et al.’s model construction sequence and the model eliciting activities it embeds were initially researched in primary mathematics classrooms and more recently in university engineering courses. The model construction sequence involves learners working collaboratively on product-focussed tasks, through which they develop and expose their conceptual understanding. The integrating model proposed in this paper has been used to design and analyse a sequence of activities in an Australian Year 4 classroom. In that sequence more traditional classroom learning was complemented by the programming of LEGO-based robots to ‘act out’ the addition and subtraction of simple fractions (tenths) on a number line. The framework was found to be useful for planning the sequence of learning and, more importantly, provided the participating teacher with the ability to critically reflect upon robotics technology as a tool to scaffold the learning of mathematics.
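As a toy illustration of the fraction task described above, the snippet below converts an addition of tenths on a floor number line into robot wheel rotations. The scale and wheel circumference are assumed values; the classroom activity itself was programmed on LEGO Mindstorms NXT rather than in Python.

```python
# Toy sketch: translate addition of tenths on a number line into robot moves.
# The scale (10 cm per tenth) and wheel circumference are assumed values,
# not taken from the classroom activity described above.
CM_PER_TENTH = 10.0            # one tenth on the floor number line = 10 cm
WHEEL_CIRCUMFERENCE_CM = 17.6  # assumed wheel size

def tenths_to_wheel_rotations(tenths: int) -> float:
    """Distance (in wheel rotations) to move for a signed number of tenths."""
    return tenths * CM_PER_TENTH / WHEEL_CIRCUMFERENCE_CM

def act_out_sum(a_tenths: int, b_tenths: int) -> None:
    """Print the moves a robot would make to show a/10 + b/10 on the line."""
    for label, step in (("first addend", a_tenths), ("second addend", b_tenths)):
        print(f"{label}: move {tenths_to_wheel_rotations(step):+.2f} rotations")
    total = a_tenths + b_tenths
    print(f"result: {total}/10 = {total / 10}")

act_out_sum(3, 4)   # 3/10 + 4/10 = 7/10
```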

Relevance:

20.00%

Publisher:

Abstract:

There has been a recent spate of high profile infrastructure cost overruns in Australia and internationally. This is just the tip of a longer-term and more deeply-seated problem with initial budget estimating practice, well recognised in both academic research and industry reviews: the problem of uncertainty. A case study of the Sydney Opera House is used to identify and illustrate the key causal factors and system dynamics of cost overruns. It is conventionally the role of risk management to deal with such uncertainty, but the type and extent of the uncertainty involved in complex projects is shown to render established risk management techniques ineffective. This paper considers a radical advance on current budget estimating practice which involves a particular approach to statistical modelling complemented by explicit training in estimating practice. The statistical modelling approach combines the probability management techniques of Savage, which operate on actual distributions of values rather than flawed representations of distributions, and the data pooling technique of Skitmore, where the size of the reference set is optimised. Estimating training employs particular calibration development methods pioneered by Hubbard, which reduce the bias of experts caused by over-confidence and improve the consistency of subjective decision-making. A new framework for initial budget estimating practice is developed based on the combined statistical and training methods, with each technique being explained and discussed.
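The combination of pooled reference data and probability management described above can be illustrated with a small Monte Carlo sketch: resample an empirical reference set of cost-overrun ratios to obtain a budget distribution and read off contingency percentiles. The pooled ratios and base estimate below are invented; the sketch stands in for, rather than reproduces, the Savage, Skitmore and Hubbard methods developed in the paper.

```python
import numpy as np

# Minimal sketch of resampling an empirical reference distribution of
# cost-overrun ratios (final cost / initial estimate) to set a budget with
# an explicit confidence level. The pooled ratios below are invented.
rng = np.random.default_rng(7)
pooled_ratios = np.array([0.95, 1.02, 1.08, 1.10, 1.15, 1.22, 1.30, 1.45, 1.60, 2.10])

base_estimate = 50.0e6                                # initial point estimate ($, assumed)
samples = base_estimate * rng.choice(pooled_ratios, size=100_000, replace=True)

p50, p80, p95 = np.percentile(samples, [50, 80, 95])
print(f"P50 budget: {p50 / 1e6:.1f} M")
print(f"P80 budget: {p80 / 1e6:.1f} M  (contingency {100 * (p80 / base_estimate - 1):.0f}%)")
print(f"P95 budget: {p95 / 1e6:.1f} M")
```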

Relevance:

20.00%

Publisher:

Abstract:

Understanding the functioning of a neural system in terms of its underlying circuitry is an important problem in neuroscience. Recent developments in electrophysiology and imaging allow one to simultaneously record the activities of hundreds of neurons. Inferring the underlying neuronal connectivity patterns from such multi-neuronal spike train data streams is a challenging statistical and computational problem. This task involves finding significant temporal patterns in vast amounts of symbolic time series data. In this paper we show that frequent episode mining methods from the field of temporal data mining can be very useful in this context. In the frequent episode discovery framework, the data is viewed as a sequence of events, each of which is characterized by an event type and its time of occurrence, and episodes are certain types of temporal patterns in such data. Here we show that, using the set of discovered frequent episodes from multi-neuronal data, one can infer different types of connectivity patterns in the neural system that generated it. For this purpose, we introduce the notion of mining for frequent episodes under certain temporal constraints; the structure of these temporal constraints is motivated by the application. We present algorithms for discovering serial and parallel episodes under these temporal constraints. Through extensive simulation studies we demonstrate that these methods are useful for unearthing patterns of neuronal network connectivity.
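To make the notion of a serial episode under temporal constraints concrete, the sketch below counts non-overlapped occurrences of a single episode A → B → C in a toy event stream, requiring each consecutive pair of events to fall within a given delay window. This greedy counter illustrates the idea only; it is not the discovery algorithms presented in the paper.

```python
# Minimal sketch: count non-overlapped occurrences of a serial episode
# A -> B -> C where each consecutive pair must occur within (t_low, t_high).
# The episode, the delay window and the toy event stream are assumptions;
# this is not the mining algorithm analysed in the paper.
def count_serial_episode(events, episode, t_low, t_high):
    """events: list of (time, type) pairs, sorted by time."""
    count, i, stage, last_time = 0, 0, 0, None
    while i < len(events):
        t, etype = events[i]
        if etype == episode[stage] and (stage == 0 or t_low <= t - last_time <= t_high):
            last_time, stage = t, stage + 1
            if stage == len(episode):        # full occurrence found
                count, stage, last_time = count + 1, 0, None
        elif stage > 0 and last_time is not None and t - last_time > t_high:
            stage, last_time = 0, None       # constraint violated, restart
            continue                         # re-examine this event as a fresh start
        i += 1
    return count

stream = [(0.0, "A"), (0.4, "B"), (0.9, "C"), (1.0, "A"), (3.0, "B"),
          (3.2, "C"), (4.0, "A"), (4.3, "B"), (4.8, "C")]
print(count_serial_episode(stream, ("A", "B", "C"), 0.1, 1.0))  # expect 2
```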

Relevance:

20.00%

Publisher:

Abstract:

Eight new open-framework inorganic-organic hybrid compounds based on indium have been synthesized employing hydrothermal methods. All of the compounds have InO6, C2O4, and HPO3/HPO4/SO4 units connected to form structures of different dimensionality. Thus, the compounds have zero- (I), two- (II, III, IV, V, VII, and VIII), and three-dimensionally (VI) extended networks. The formation of the first zero-dimensional hybrid compound is noteworthy. In addition, concomitant polymorphic structures have been observed in the present study. The molecular compound, I, was found to be reactive, and transformation studies in the presence of a base (pyridine) give rise to the polymorphic structures of II and III, while the addition of an acid (H3PO3) gives rise to a new indium phosphite with a pillared layer structure (T1). Preliminary density functional theory calculations suggest that the stabilities of the polymorphs are different, with one of the forms (II) being preferred over the other, which is consistent with the observed experimental behavior. The oxalate units perform more than one role in the present structures. Thus, the oxalate units connect two In centers to satisfy the coordination requirements as well as to achieve charge balance in compounds II, IV, and VI. The terminal oxalate units observed in compounds I, IV, and V suggest the possibility of intermediate structures. Both in-plane and out-of-plane connectivity of the oxalate units were observed in compound VI. The compounds have been characterized by powder X-ray diffraction, IR spectroscopy, thermogravimetric analysis, and P-31 NMR studies.

Relevance:

20.00%

Publisher:

Abstract:

A three-dimensional zinc arsenate with an interrupted zeolitic framework (-IIO), [C4N3H16]2[Zn5(AsO4)4(HAsO4)2] (I), has been synthesized solvothermally. The structure is built up from ZnO4, AsO4 and HAsO4 tetrahedral units connected alternately through their vertices, forming a 3-D structure possessing one-dimensional channels bounded by 10 T-atoms (T = Zn, As). The framework density of the structure is 10.4 T-atoms, which indicates considerable openness in its structure.

Relevance:

20.00%

Publisher:

Abstract:

A new framework is proposed in this work to solve multidimensional population balance equations (PBEs) using the method of discretization. A continuous PBE is considered as a statement of evolution of one evolving property of particles and conservation of their n internal attributes. Discretization must therefore preserve n + 1 properties of particles. The continuously distributed population is represented on discrete fixed pivots as in the fixed pivot technique of Kumar and Ramkrishna [1996a. On the solution of population balance equation by discretization-I. A fixed pivot technique. Chemical Engineering Science 51(8), 1311-1332] for 1-d PBEs, but instead of the earlier extensions of this technique proposed in the literature, which preserve 2^n properties of non-pivot particles, the new framework requires n + 1 properties to be preserved. This opens up the use of triangular and tetrahedral elements to solve 2-d and 3-d PBEs, instead of the rectangles and cuboids that are suggested in the literature. Capabilities of computational fluid dynamics and other packages available for generating complex meshes can also be harnessed. The numerical results obtained indeed show the effectiveness of the new framework. It also brings out the hitherto unknown role of the directionality of the grid in controlling the accuracy of the numerical solution of multidimensional PBEs. The numerical results obtained show that the quality of the numerical solution can be improved significantly just by altering the directionality of the grid, which does not require any increase in the number of points, any refinement of the grid, or even redistribution of pivots in space. The directionality of a grid can be altered simply by regrouping of pivots.
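On a triangular element in two dimensions, preserving n + 1 = 3 properties of a non-pivot particle (its number and both internal coordinates) amounts to splitting it among the triangle's three pivots with barycentric weights. The sketch below shows that redistribution step for assumed pivot locations; it is only one building block of such a scheme, not the full discretization framework of the paper.

```python
import numpy as np

# Sketch of the redistribution step on a triangular element: a new particle at
# point x (with number w) is split among the three pivots so that number and
# both internal coordinates are conserved (n + 1 = 3 properties).
# Pivot locations and the particle are assumed values for illustration.
def barycentric_weights(x, pivots):
    """Fractions assigned to the 3 pivots of a triangle containing x."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in pivots)
    T = np.column_stack([p1 - p3, p2 - p3])       # 2x2 coordinate matrix
    l1, l2 = np.linalg.solve(T, np.asarray(x, dtype=float) - p3)
    return np.array([l1, l2, 1.0 - l1 - l2])

pivots = [(1.0, 1.0), (3.0, 1.0), (1.0, 4.0)]     # assumed pivot coordinates
x, w = (1.8, 2.0), 5.0                            # new particle: location and number
frac = barycentric_weights(x, pivots)

assigned = w * frac                                # number placed on each pivot
print("fractions:", np.round(frac, 3))
print("number conserved:", np.isclose(assigned.sum(), w))
print("coordinates conserved:", np.allclose((frac[:, None] * np.asarray(pivots)).sum(axis=0), x))
```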

Relevance:

20.00%

Publisher:

Abstract:

A Batch Processing Machine (BPM) is one which processes a number of jobs simultaneously as a batch with common beginning and ending times. Also, a BPM, once started, cannot be interrupted (pre-emption is not allowed). This research is motivated by a BPM in the steel casting industry. There are three main stages in any steel casting industry, viz. the pre-casting stage, the casting stage and the post-casting stage. A quick overview of the entire process is shown in Figure 1. There are two BPMs: (1) the melting furnace in the pre-casting stage and (2) the Heat Treatment Furnace (HTF) in the post-casting stage of the steel casting manufacturing process. This study focuses on scheduling the latter, namely the HTF. The heat-treatment operation is one of the most important stages of steel casting industries. It determines the final properties that enable components to perform under demanding service conditions such as large mechanical loads, high temperatures and anti-corrosive processing. In general, different types of castings have to undergo more than one type of heat-treatment operation, so the total heat-treatment processing times vary. For better control, castings are primarily classified into a number of job families based on the alloy type, such as low-alloy castings and high-alloy castings. For technical reasons such as the type of alloy, the temperature level and the expected combination of heat-treatment operations, castings from different families cannot be processed together in the same batch.
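As a toy illustration of the batching constraint described above (castings from different job families never share a batch), the sketch below greedily groups jobs of the same family into batches up to an assumed furnace capacity. The job data, capacity and greedy rule are invented for illustration and are not the scheduling approach developed in this research.

```python
from collections import defaultdict

# Toy greedy batching for a batch processing machine (heat treatment furnace):
# jobs of different families never share a batch; each batch is filled up to
# an assumed furnace capacity. Job data and the capacity are invented values.
CAPACITY = 100.0  # assumed furnace capacity (weight units)

jobs = [  # (job_id, family, size)
    ("J1", "low-alloy", 40), ("J2", "low-alloy", 35), ("J3", "low-alloy", 50),
    ("J4", "high-alloy", 60), ("J5", "high-alloy", 30), ("J6", "high-alloy", 30),
]

by_family = defaultdict(list)
for job in jobs:
    by_family[job[1]].append(job)

batches = []
for family, family_jobs in by_family.items():
    current, load = [], 0.0
    for job_id, _, size in sorted(family_jobs, key=lambda j: -j[2]):  # biggest first
        if current and load + size > CAPACITY:   # start a new batch when full
            batches.append((family, current))
            current, load = [], 0.0
        current.append(job_id)
        load += size
    if current:
        batches.append((family, current))

for family, members in batches:
    print(family, members)
```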