915 results for Non-linear programming


Relevance: 80.00%

Abstract:

This paper argues for a model of open-system design for sustainable architecture, based on a thermodynamic framework that takes entropy as an evolutionary paradigm. The framework can be summarised as stating that an open system evolves in a non-linear pattern from a far-from-equilibrium state towards a non-equilibrium state of entropy balance: a highly ordered organisation of the system in which order emerges out of chaos. This paper reports work in progress on a PhD research project that aims to propose building information modelling for the optimisation and adaptation of buildings' environmental performance as an alternative sustainable design program in architecture. It will be used for the efficient distribution and consumption of energy and material resources over buildings' life cycles, with the active involvement of end-users and the physical constraints of the natural environment.

Relevance: 80.00%

Abstract:

This paper presents a robust stochastic framework for the incorporation of visual observations into conventional estimation, data fusion, navigation and control algorithms. The representation combines Isomap, a non-linear dimensionality reduction algorithm, with expectation maximization, a statistical learning scheme. The joint probability distribution of this representation is computed offline based on existing training data. The training phase of the algorithm results in a nonlinear and non-Gaussian likelihood model of natural features conditioned on the underlying visual states. This generative model can be used online to instantiate likelihoods corresponding to observed visual features in real-time. The instantiated likelihoods are expressed as a Gaussian mixture model and are conveniently integrated within existing non-linear filtering algorithms. Example applications based on real visual data from heterogeneous, unstructured environments demonstrate the versatility of the generative models.
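
The offline-trained likelihood model described above is, at its core, a Gaussian mixture evaluated at an observed feature value. A minimal sketch of that step (not the authors' implementation; the univariate feature and all mixture parameters below are hypothetical):

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def gmm_likelihood(x, weights, means, variances):
    """Evaluate a Gaussian mixture likelihood p(x) = sum_k w_k N(x; mu_k, var_k)."""
    return sum(w * gaussian_pdf(x, m, v)
               for w, m, v in zip(weights, means, variances))

# Hypothetical mixture learnt offline over a 1-D visual feature.
weights, means, variances = [0.6, 0.4], [0.0, 3.0], [1.0, 0.5]

# Online phase: instantiate the likelihood of an observed feature value.
lik = gmm_likelihood(0.0, weights, means, variances)
```

In the paper the mixture parameters come from Isomap-embedded training data fitted with expectation maximization; here they are hard-coded to isolate the online instantiation step.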

Relevance: 80.00%

Abstract:

This paper presents a robust stochastic model for the incorporation of natural features within data fusion algorithms. The representation combines Isomap, a non-linear manifold learning algorithm, with Expectation Maximization, a statistical learning scheme. The representation is computed offline and results in a non-linear, non-Gaussian likelihood model relating visual observations such as color and texture to the underlying visual states. The likelihood model can be used online to instantiate likelihoods corresponding to observed visual features in real-time. The likelihoods are expressed as a Gaussian Mixture Model so as to permit convenient integration within existing nonlinear filtering algorithms. The resulting compactness of the representation is especially suitable to decentralized sensor networks. Real visual data consisting of natural imagery acquired from an Unmanned Aerial Vehicle is used to demonstrate the versatility of the feature representation.
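
Once instantiated, such a likelihood can reweight the hypotheses of an existing non-linear filter. A rough sketch of a particle-style Bayes update using a mixture likelihood over the observation innovation (illustrative only; the scalar state model and all parameters are assumptions, not the paper's):

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def update_weights(particles, weights, obs, mixture):
    """Bayes update: scale each particle weight by the GMM likelihood of the
    observation innovation (obs - particle state), then renormalise.
    mixture is a list of (component_weight, mean, variance) triples."""
    new = [w * sum(c * gaussian_pdf(obs - p, m, v) for c, m, v in mixture)
           for p, w in zip(particles, weights)]
    total = sum(new)
    return [w / total for w in new]

# Two hypothetical state hypotheses; the one near the observation gains weight.
posterior = update_weights([0.0, 5.0], [0.5, 0.5], 0.1, [(1.0, 0.0, 1.0)])
```

The compactness the abstract mentions comes from transmitting only the mixture parameters rather than raw imagery, which is what makes the representation attractive for decentralized sensor networks.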

Relevance: 80.00%

Abstract:

This paper presents a general methodology for learning articulated motions that, despite having non-linear correlations, are cyclical and follow a defined pattern of behavior. Using conventional algorithms to extract features from images, a Bayesian classifier is applied to cluster and classify features of the moving object. Clusters are then associated across different frames, and structure learning algorithms for Bayesian networks are used to recover the structure of the motion. This framework is applied to human gait analysis and tracking, but its applications include any coordinated movement, such as multi-robot behavior analysis.
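
The clustering/classification step can be pictured as a minimal Bayes classifier over extracted features. A toy sketch with equal priors and a hypothetical 1-D Gaussian feature model per cluster (not the paper's actual model):

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def classify(feature, classes):
    """Assign a feature to the cluster with the highest likelihood, assuming
    equal priors and a (mean, variance) Gaussian model per cluster."""
    return max(classes, key=lambda name: gaussian_pdf(feature, *classes[name]))

# Hypothetical clusters of image features for two parts of a moving body.
clusters = {"torso": (0.0, 1.0), "limb": (5.0, 1.0)}
```

Associating the resulting cluster labels across frames is what provides the data for the subsequent Bayesian-network structure learning.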

Relevance: 80.00%

Abstract:

In this paper, we present the application of a non-linear dimensionality reduction technique for the learning and probabilistic classification of hyperspectral images. Hyperspectral image spectroscopy is an emerging technique for geological investigations from airborne or orbital sensors. It gives much greater information content per pixel than a normal colour image, which should greatly help with the autonomous identification of natural and man-made objects in unfamiliar terrains for robotic vehicles. However, the large information content of such data makes interpretation of hyperspectral images time-consuming and user-intensive. We propose the use of Isomap, a non-linear manifold learning technique, combined with Expectation Maximisation in graphical probabilistic models for learning and classification. Isomap is used to find the underlying manifold of the training data. This low-dimensional representation of the hyperspectral data facilitates the learning of a Gaussian Mixture Model representation, whose joint probability distributions can be calculated offline. The learnt model is then applied to the hyperspectral image at runtime, and data classification can be performed.
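
Isomap's key idea is to replace Euclidean distances with geodesic distances estimated over a k-nearest-neighbour graph before embedding. A compact sketch of that distance-estimation stage (illustrative only; real hyperspectral pixels are high-dimensional, and the final embedding step via classical MDS is omitted):

```python
def knn_graph(points, k):
    """Symmetric k-nearest-neighbour graph with Euclidean edge weights;
    non-neighbours start at infinity."""
    n = len(points)
    inf = float("inf")
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    d = [[inf] * n for _ in range(n)]
    for i in range(n):
        d[i][i] = 0.0
        for j in sorted(range(n), key=lambda j: dist(points[i], points[j]))[1:k + 1]:
            w = dist(points[i], points[j])
            d[i][j] = min(d[i][j], w)
            d[j][i] = min(d[j][i], w)
    return d

def geodesic_distances(d):
    """Floyd-Warshall all-pairs shortest paths: Isomap's geodesic estimate."""
    n = len(d)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

Distances along the graph approximate distances along the data manifold, which is why the subsequent low-dimensional representation preserves the non-linear structure of the spectra.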

Relevance: 80.00%

Abstract:

For many decades, correlation and power spectrum have been the primary tools for digital signal processing applications in the biomedical area. The information contained in the power spectrum is essentially that of the autocorrelation sequence, which is sufficient for a complete statistical description of Gaussian signals with known means. However, there are practical situations where one needs to look beyond the autocorrelation of a signal to extract information regarding deviations from Gaussianity and the presence of phase relations. Higher order spectra, also known as polyspectra, are spectral representations of higher order statistics, i.e. moments and cumulants of third order and beyond. HOS (higher order statistics or higher order spectra) can detect deviations from linearity, stationarity or Gaussianity in a signal. Most biomedical signals are non-linear, non-stationary and non-Gaussian in nature, and it can therefore be more advantageous to analyze them with HOS than with second order correlations and power spectra. In this paper we discuss the application of HOS to different bio-signals. HOS methods of analysis are explained using a typical heart rate variability (HRV) signal, and applications to other signals are reviewed.
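
The simplest higher-order statistic beyond autocorrelation is the third-order cumulant, whose 2-D Fourier transform over the lags is the bispectrum. For a zero-mean signal it can be estimated directly from samples; a minimal sketch (sample-mean estimator, illustrative signals only):

```python
def third_order_cumulant(x, tau1, tau2):
    """Sample estimate of C3(tau1, tau2) = E[x(n) x(n+tau1) x(n+tau2)] for a
    zero-mean signal. C3 vanishes for Gaussian (and any symmetrically
    distributed) data, so a nonzero value flags deviation from Gaussianity."""
    m = len(x) - max(tau1, tau2, 0)
    return sum(x[n] * x[n + tau1] * x[n + tau2] for n in range(m)) / m

symmetric = [1.0, -1.0] * 50        # zero-mean, symmetric amplitude distribution
skewed = [2.0, -1.0, -1.0] * 40     # zero-mean but asymmetric
```

At zero lags this reduces to the third moment (unnormalised skewness); scanning over both lags builds the cumulant surface from which the bispectrum is computed.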

Relevance: 80.00%

Abstract:

Young people are arguably facing more ‘complex and contested’ transitions to adulthood and an increasing array of ‘non-linear’ paths. Education and training have been extended, identity is increasingly shaped through leisure and consumerism, and youth must navigate their life trajectories in highly individualised ways. The study utilises 819 short essays compiled by students aged 14–16 years from 19 schools in Australia. It examines how young people understand their own unique positions and the possibilities open to them through their aspirations and future orientations to employment and family life. These young people do not anticipate postponing work identities, but rather embrace post-school options such as gaining qualifications, work experience and achieving financial security. Boys expected a distant involvement in family life, secondary to participation in paid work. In contrast, around half the girls simultaneously expected a future involving primary care-giving and an autonomous, independent career, suggesting attempts to remake gendered inequalities.

Relevance: 80.00%

Abstract:

The Electrocardiogram (ECG) is an important bio-signal representing the sum total of millions of cardiac cell depolarization potentials. It contains important insight into the state of health and the nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. The HRV signal can be used as a base signal to observe the heart's functioning. These signals are non-linear and non-stationary in nature, so higher order spectral (HOS) analysis, which is more suitable for non-linear systems and is robust to noise, was used. An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, we extracted seven features from the heart rate signals using HOS and fed them to a support vector machine (SVM) for classification. Our performance evaluation protocol uses 330 subjects covering five different kinds of cardiac disease conditions. We demonstrate a sensitivity of 90% for the classifier, with a specificity of 87.93%. Our system is ready to run on larger data sets.
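
The reported figures follow from the standard confusion-matrix definitions; a minimal reference implementation (the counts used below are illustrative, not the study's raw data):

```python
def sensitivity(tp, fn):
    """True positive rate: fraction of diseased cases correctly identified."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: fraction of healthy cases correctly identified."""
    return tn / (tn + fp)
```

Reporting both together matters because a classifier can trivially maximise one at the expense of the other; the paper's 90% / 87.93% pair shows neither error type dominates.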

Relevance: 80.00%

Abstract:

Agricultural soils emit about 50% of the global flux of N2O attributable to human influence, mostly in response to nitrogen fertilizer use. Recent evidence that the relationship between N2O fluxes and N-fertilizer additions to cereal maize is non-linear provides an opportunity to estimate regional N2O fluxes from estimates of N application rates, rather than as a simple percentage of N inputs as used by the Intergovernmental Panel on Climate Change (IPCC). We combined a simple empirical model of N2O production with the SOCRATES soil carbon dynamics model to estimate N2O and other sources of Global Warming Potential (GWP) from cereal maize across 19,000 cropland polygons in the North Central Region (NCR) of the US over the period 1964–2005. Results indicate that the loading of greenhouse gases to the atmosphere from cereal maize production in the NCR was 1.7 Gt CO2e, with an average 268 t CO2e produced per tonne of grain. From 1970 until 2005, GHG emissions per unit product declined on average by 2.8 t CO2e ha−1 annum−1, coinciding with a stabilisation in N application rate and consistent increases in grain yield from the mid-1970s. Nitrous oxide production from N fertilizer inputs represented 59% of these emissions and soil C decline (0–30 cm) represented 11%, with the remaining 30% (517 Mt) from the combustion of fuel associated with farm operations. Of the 126 Mt of N fertilizer applied to cereal maize from 1964 to 2005, we estimate that 2.2 Mt N was emitted as N2O when using a non-linear response model, equivalent to 1.75% of the applied N.
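
The contrast between an IPCC-style fixed emission factor and a non-linear response can be sketched as follows; the exponential form and its parameters are purely illustrative assumptions, not the paper's fitted model:

```python
import math

def n2o_linear(n_rate, ef=0.01):
    """IPCC-style accounting: N2O-N emitted is a fixed fraction of applied N
    (kg N2O-N per kg N applied per hectare)."""
    return ef * n_rate

def n2o_nonlinear(n_rate, a=0.5, b=0.02):
    """Illustrative exponential response (hypothetical parameters): emissions
    grow faster than linearly as the N rate rises past crop demand."""
    return a * (math.exp(b * n_rate) - 1.0)
```

Under such a curve, the same total N spread over more hectares at lower rates yields less N2O than the fixed-percentage rule predicts, which is why rate-based regional estimates can diverge from simple N-input accounting.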

Relevance: 80.00%

Abstract:

Mock circulation loops (MCLs) are used to evaluate cardiovascular devices prior to in-vivo trials; however, they lack the vital autoregulatory responses that occur in humans. This study aimed to develop and implement a left and right ventricular Frank-Starling response in a MCL. A proportional controller based on ventricular end diastolic volume was used to control the driving pressure of the MCL’s pneumatically operated ventricles. Ventricular pressure-volume loops and end systolic pressure-volume relationships were produced for a variety of healthy and pathological conditions and compared with human data to validate the simulated Frank-Starling response. The non-linear Frank-Starling response produced in this study successfully altered left and right ventricular contractility with changing preload and was validated against previously reported data. This improvement to an already detailed MCL has resulted in a test rig capable of further refining cardiovascular devices and reducing the number of in-vivo trials.
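
The described controller maps end-diastolic volume to pneumatic driving pressure through a proportional gain, mimicking the Frank-Starling increase of contractility with preload. A minimal sketch (variable names, units and gain values are hypothetical, not the study's):

```python
def drive_pressure(edv, edv_ref, p_base, kp):
    """Proportional Frank-Starling law: raise the ventricle's pneumatic drive
    pressure (mmHg) in proportion to how far the end-diastolic volume (mL)
    exceeds a reference preload."""
    return p_base + kp * (edv - edv_ref)
```

In the actual rig the response was non-linear rather than a single fixed gain; a proportional term per operating region is the simplest way to picture the control loop.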

Relevance: 80.00%

Abstract:

The aim of this chapter is to increase understanding of how a sound theoretical model of the learner and learning processes informs the organisation of learning environments and effective and efficient use of practice time. Drawing on an in-depth interview with Greg Chappell, the head coach at the Centre of Excellence—the Brisbane-based centre for training and development in cricket of the Australian Institute of Sport (AIS) and Cricket Australia—it describes and explains many of the key features of non-linear pedagogy. Specifically, after backgrounding the constraints-led approach, it deals with environmental constraints; the focus of the individual and the implications of self-organisation for coaching strategies; implications for the coach–athlete relationship; manipulating constraints; representative practice; developing decision-makers and learning design including discovery and implicit learning. It then moves on to a discussion of more global issues such as the reactions of coaches and players when a constraints-led approach is introduced, before finally considering the widely held belief among coaches that approaches such as Teaching Games for Understanding (TGfU) ‘take longer’ than traditional coaching methods.

Relevance: 80.00%

Abstract:

Introduction: Why we need to base children's sport and physical education on the principles of dynamical systems theory and ecological psychology. As the childhood years are crucial for developing many physical skills as well as establishing the groundwork leading to lifelong participation in sport and physical activities (Orlick & Botterill, 1977, p. 11), it is essential to examine current practice to make sure it is meeting the needs of children. In recent papers (e.g. Renshaw, Davids, Chow & Shuttleworth, in press; Renshaw, Davids, Chow & Hammond, in review; Chow et al., 2009) we have highlighted that a guiding theoretical framework is needed to provide a principled approach to teaching and coaching, and that the approach must be evidence-based and focused on mechanism, not just on operational issues such as practice, competition and programme management (Lyle, 2002). There is a need to demonstrate how non-linear pedagogy underpins teaching and coaching practice for children, given that some of the current approaches underpinning children's sport and P.E. may not be leading to optimal results. For example, little time is spent undertaking physical activities (Tinning, 2006), and much of this practice is not representative of the competition demands of the performance environment (Kirk & McPhail, 2002; Renshaw et al., 2008). Proponents of a non-linear pedagogy advocate the design of practice by applying key concepts such as the mutuality of the performer and environment, the tight coupling of perception and action, and the emergence of movement solutions due to self-organisation under constraints (see Renshaw et al., in press). As skills are shaped by the unique interacting individual, task and environmental constraints in these learning environments, small changes to individual structural constraints (e.g. factors such as height or limb length) or functional constraints (e.g. factors such as motivation, perceptual skills, or strength that can be acquired), task rules, equipment, or environmental constraints can lead to dramatic changes in the movement patterns adopted by learners to solve performance problems. The aim of this chapter is to provide real-life examples for teachers and coaches who wish to adopt the ideas of non-linear pedagogy in their practice. Specifically, I will provide examples of issues related to individual constraints in children, and in particular the unique challenges facing coaches when individual constraints are changing due to growth and development. Part two focuses on understanding how cultural environmental constraints impact on children's sport. This is an area that has received very little attention but plays a very important part in the long-term development of sporting expertise. Finally, I will look at how coaches can manipulate task constraints to create effective learning environments for young children.

Relevance: 80.00%

Abstract:

Columns are among the key load-bearing elements that are highly susceptible to vehicle impacts. The resulting severe damage to columns may lead to failures of the supporting structure that are catastrophic in nature. However, columns in existing structures are seldom designed for impact, owing to inadequacies in design guidelines. The impact behaviour of columns designed for gravity loads and actions other than impact is therefore of interest. A comprehensive investigation is conducted on reinforced concrete columns, with a particular focus on the vulnerability of exposed columns and on mitigation techniques under low- to medium-velocity car and truck impacts. The investigation is based on non-linear explicit computer simulations of impacted columns, followed by a comprehensive validation process. The impact is simulated using force pulses generated from full-scale vehicle impact tests. A material model capable of simulating triaxial loading conditions is used in the analyses. Circular columns adequate in capacity for five- to twenty-storey buildings, designed according to Australian standards, are considered in the investigation. The crucial parameters associated with routine column designs and the different load combinations applied at the serviceability stage on typical columns are considered in detail. Axially loaded columns are examined at the initial stage, and the investigation is extended to analyse the impact behaviour under single-axis bending and biaxial bending. The impact capacity reduction under varying axial loads is also investigated. The effects of the various load combinations are quantified, and the residual capacity of the impacted columns based on the status of the damage, together with mitigation techniques, is also presented.

In addition, the contribution of each individual parameter to the failure load is scrutinised, and analytical equations are developed to identify the critical impulses in terms of the geometrical and material properties of the impacted column. In particular, an innovative technique was developed and introduced to improve the accuracy of the equations where other techniques fail due to the shape of the error distribution. Above all, the equations can be used to quantify the critical impulse for three consecutive points (load combinations) located on the interaction diagram for one particular column. Consequently, linear interpolation can be used to quantify the critical impulse for loading points located in between on the interaction diagram. Given a known force and impulse pair for an average impact duration, this method can be extended to assess the vulnerability of columns for a general vehicle population, based on an analytical method that can quantify the critical peak forces under different impact durations. The contribution of this research is therefore not limited to producing simplified yet rational design guidelines and equations; it also provides a comprehensive solution for quantifying impact capacity while delivering new insight to the scientific community for dealing with impacts.
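
The interpolation step described above, between two load points on the interaction diagram with known critical impulses, is plain linear interpolation; a minimal sketch (the load and impulse values used are hypothetical, not from the thesis):

```python
def interpolate_impulse(axial_load, p1, i1, p2, i2):
    """Linearly interpolate the critical impulse for an axial load lying
    between two interaction-diagram points (p1, p2) whose critical
    impulses (i1, i2) were obtained from the analytical equations."""
    t = (axial_load - p1) / (p2 - p1)
    return i1 + t * (i2 - i1)
```

This is what lets three computed load points cover the intermediate loading states on the diagram without rerunning the explicit simulations.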

Relevance: 80.00%

Abstract:

Background: Birth weight and length show seasonal fluctuations. Previous analyses of birth weight by latitude identified seemingly contradictory results, showing both 6- and 12-monthly periodicities in weight. The aims of this paper are twofold: (a) to explore seasonal patterns in the large Danish Medical Birth Register, and (b) to explore models based on seasonal exposures and a non-linear exposure-risk relationship.
Methods: Birth weights and birth lengths of over 1.5 million Danish singleton live births were examined for seasonality. We modelled seasonal patterns based on linear, U- and J-shaped exposure-risk relationships. We then added an extra layer of complexity by modelling weighted population-based exposure patterns.
Results: The Danish data showed clear seasonal fluctuations for both birth weight and birth length. A bimodal model best fits the data; however, the amplitude of the 6- and 12-month peaks changed over time. In the modelling exercises, U- and J-shaped exposure-risk relationships generate time series with both 6- and 12-month periodicities. Changing the weightings of the population exposure risks results in unexpected properties. A J-shaped exposure-risk relationship with a diminishing population exposure over time fitted the observed seasonal pattern in the Danish birth weight data.
Conclusion: In keeping with many other studies, Danish birth anthropometric data show complex and shifting seasonal patterns. We speculate that annual periodicities with non-linear exposure-risk models may underlie these findings. Understanding the nature of seasonal fluctuations can help generate candidate exposures.
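
The mechanism proposed here, that a non-linear (U- or J-shaped) exposure-risk curve turns a purely annual exposure into a series with both 12- and 6-month components, can be checked in a few lines: a quadratic risk applied to a 12-month cosine contributes a cos² term, which has half the period. A sketch with arbitrary coefficients (not the paper's fitted model):

```python
import math

def seasonal_outcome(month, a=1.0, b=0.5, c=0.2):
    """Outcome under a J-shaped risk r(E) = a + b*E + c*E^2 driven by a
    12-month sinusoidal exposure E(t); the quadratic term adds a 6-month
    cycle because cos^2 oscillates at twice the frequency."""
    e = math.cos(2 * math.pi * month / 12)
    return a + b * e + c * e * e

def fourier_amplitude(series, cycles_per_year):
    """Amplitude of a given annual harmonic in a 12-point monthly series."""
    n = len(series)
    re = sum(v * math.cos(2 * math.pi * cycles_per_year * i / n)
             for i, v in enumerate(series))
    im = sum(v * math.sin(2 * math.pi * cycles_per_year * i / n)
             for i, v in enumerate(series))
    return 2 * math.hypot(re, im) / n

monthly = [seasonal_outcome(m) for m in range(12)]
```

With these coefficients the annual harmonic has amplitude b and the semi-annual harmonic amplitude c/2, so a purely linear risk (c = 0) would show no 6-month peak at all.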

Relevance: 80.00%

Abstract:

OBJECTIVE: This paper reviews the epidemiological evidence on the relationship between ambient temperature and morbidity. It assesses the methodological issues in previous studies and proposes future research directions. DATA SOURCES AND DATA EXTRACTION: We searched the PubMed database for epidemiological studies on ambient temperature and morbidity of non-communicable diseases published in refereed English-language journals prior to June 2010. Forty relevant studies were identified. Of these, 24 examined the relationship between ambient temperature and morbidity, 15 investigated the short-term effects of heatwaves on morbidity, and one assessed both temperature and heatwave effects. DATA SYNTHESIS: Descriptive and time-series studies were the two main research designs used to investigate the temperature–morbidity relationship. Measurements of temperature exposure and health outcomes used in these studies differed widely. The majority of studies reported a significant relationship between ambient temperature and total or cause-specific morbidities. However, there were some inconsistencies in the direction and magnitude of non-linear lag effects. The lag effect of hot temperatures on morbidity was shorter (several days) than that of cold temperatures (up to a few weeks). The temperature–morbidity relationship may be confounded and/or modified by socio-demographic factors and air pollution. CONCLUSIONS: There is a significant short-term effect of ambient temperature on total and cause-specific morbidities. However, further research is needed to determine an appropriate temperature measure, consider a diverse range of morbidities, and use consistent methodology to make different studies more comparable.