12 results for Non linearity

in Deakin Research Online - Australia


Relevance:

60.00%

Publisher:

Abstract:

Orientation detection and discrimination thresholds were measured for Gabor ‘envelopes’ formed from contrast-modulation of luminance ‘carriers’. Consistent with previous research, differences between carrier and envelope orientation had no effect on sensitivity to envelopes. Using plaid carriers in which the proportion of contrast modulation ‘carried’ by each plaid component was systematically manipulated, it was shown that this tolerance to carrier-envelope orientation difference reflects linear summation across orientation, indicative of a single second-stage channel coding for contrast-defined structure. That contrast envelopes did not exhibit linear summation across spatial frequency, nor across combinations of orientation and spatial-frequency differences, suggests that these second-order channels operate only within certain spatial scales. Using arrays of Gabor micropatterns as carriers, in which the orientation distribution of the carriers was manipulated independently of the difference between envelope orientation and mean carrier orientation, it was further demonstrated that the locus of orientation integration must occur prior to envelope detection. In the context of two-stage models that incorporate a non-linearity between the stages, the pattern of results obtained is consistent with the operation of an orientation pooling process between first-stage and second-stage channels, analogous to having all filters of the first stage feed into all filters of the second stage within the same spatial-frequency band.
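The two-stage architecture described above is often sketched as a filter-rectify-filter cascade: a first stage responds to the fine-scale carrier, a pointwise non-linearity rectifies that response, and a coarser second-stage filter recovers the contrast envelope. The 1-D signal and all parameters below are hypothetical illustrations, not the study's stimuli or model.

```python
import numpy as np

# 1-D filter-rectify-filter sketch (illustrative parameters only):
# a fine-scale carrier whose contrast is modulated by a coarse envelope.
x = np.linspace(0, 1, 1024, endpoint=False)
carrier = np.sin(2 * np.pi * 64 * x)               # 64-cycle carrier
envelope = 0.5 * (1 + np.sin(2 * np.pi * 4 * x))   # 4-cycle modulation
stimulus = envelope * carrier                      # contrast-defined signal

# First-stage output passed through a rectifying non-linearity.
rectified = np.abs(stimulus)

# Second stage: a coarse low-pass filter (moving average spanning several
# carrier cycles) removes the carrier and leaves the envelope.
k = 64
recovered = np.convolve(rectified, np.ones(k) / k, mode="same")
```

Away from the array edges, `recovered` tracks the envelope (scaled by the mean of the rectified carrier), which is why the envelope is only detectable after the non-linearity: a purely linear filter would average the contrast modulation away.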

Relevance:

60.00%

Publisher:

Abstract:

This paper examines the applicability of Zipf's law to tourism. It is established that a variation of this law holds in this case: a rank-size rule with concavity. Due to this non-linearity, it is shown that a spline regression provides an extremely convenient tool for predicting tourist arrivals in a country. The concavity is explained by appealing to random growth theory (lognormal distribution; Gibrat's law) and locational fundamentals.
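The spline idea can be illustrated with a minimal linear-spline (broken-stick) regression in log-log coordinates: one knot lets the slope steepen at higher ranks, capturing the concavity. The data and knot location below are synthetic placeholders, not the paper's tourism figures.

```python
import numpy as np

# Synthetic rank-size data with concavity in log-log space
# (illustrative only; not the paper's tourism data).
rng = np.random.default_rng(0)
rank = np.arange(1, 51)
log_rank = np.log(rank)
knot = np.log(10.0)  # slope is allowed to change at rank 10
true = 12.0 - 0.8 * log_rank - 0.6 * np.maximum(log_rank - knot, 0.0)
log_arrivals = true + rng.normal(0.0, 0.05, size=rank.size)

# Linear spline regression: basis [1, x, (x - knot)+]
X = np.column_stack([
    np.ones_like(log_rank),
    log_rank,
    np.maximum(log_rank - knot, 0.0),
])
coef, *_ = np.linalg.lstsq(X, log_arrivals, rcond=None)
pred = X @ coef  # fitted log arrivals, usable for prediction
```

A strict Zipf law would force `coef[2]` to zero; a significantly negative `coef[2]` is the concavity the paper describes.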

Relevance:

60.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to describe how order-generated rules applied to organizing form dualities can assist in creating the conditions for emergent, self-organized behavior in organizations, thereby offering an operational deployment of complexity theory.

Design/methodology/approach – The paper begins by showing that the concept of dualities is consistent with complexity-thinking. In addition, when applied to organizing forms, dualities represent a practical way of affecting an organization's balance between chaos and order. Thus, when augmented with order-generating rules, organizing form dualities provide an access point for the practical instigation of edge of chaos conditions and the potential for emergence.

Findings – The paper maintains that many attempts to “manage” complexity have been associated with changes to organizing forms, specifically toward new forms of organizing. It is suggested that organizing form dualities provide some management guidance for encouraging the “edge of chaos” conditions advocated in complexity theory, although the details of self-organization cannot be prescribed given the assumptions of non-linearity associated with complexity theory perspectives. Finally, it is proposed that organizing dualities can elucidate the nature and application of order-generating rules in non-linear complex systems.

Practical implications – Dualities offer some guidance toward the practical implementation of complexity theory as they represent an accessible sub-system where the forces for order and chaos – traditional and new forms of organizing respectively – are accessible and subject to manipulation.

Originality/value – The commonalities between dualities and complexity theory are intuitive, but little conceptual work has shown how the former can be employed as a guide to managing organizing forms. Moreover, this approach demonstrates that managers may be able to stimulate “edge of chaos” conditions in a practical way, without making positivistic assumptions about the causality associated with their efforts.

Relevance:

60.00%

Publisher:

Abstract:

Replication, or repeated tests at the same stress amplitude, is used to provide statistical confidence in life data during the development of S-N curves. This paper discusses the effects of replication on the measurement of S-N curves and presents an alternative to traditional replication methods for the determination of S-N curves, particularly for the development of preliminary S-N curves. Using specimens made from extruded bars of a magnesium alloy, it is demonstrated that the S-N curve estimated using data from non-replication tests is almost the same as that from replication tests. The advantage of non-replication fatigue tests is that they use fewer specimens, in this instance only half that required for a 50% replication fatigue test, to achieve the same estimate as the replication fatigue tests. Another advantage of non-replication fatigue tests is that they can detect non-linearity using a limited number of specimens.
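The fitting step behind such a comparison is commonly a Basquin-type power law, S = A·N^b, fitted as a straight line in log-log coordinates. The sketch below fits both a replicated design (repeats at few amplitudes) and a non-replicated one (single tests spread over more amplitudes) on synthetic data; the material constants and scatter level are invented, not the paper's alloy data.

```python
import numpy as np

# Basquin-type S-N fit: S = A * N**b, i.e. log S = log A + b * log N.
# All constants and noise are illustrative, not the paper's measurements.
rng = np.random.default_rng(1)
TRUE_LOG_A, TRUE_B = np.log(400.0), -0.1

def simulate_life(stress):
    """Synthetic cycles-to-failure with lognormal scatter."""
    log_n = (np.log(stress) - TRUE_LOG_A) / TRUE_B
    return np.exp(log_n + rng.normal(0.0, 0.1, len(stress)))

def fit_sn(stress, cycles):
    """Least-squares fit of log S on log N; returns (log_A, b)."""
    X = np.column_stack([np.ones(len(cycles)), np.log(cycles)])
    coef, *_ = np.linalg.lstsq(X, np.log(stress), rcond=None)
    return coef

# Replication: two specimens at each of four amplitudes (8 specimens).
rep_stress = np.repeat([300.0, 280.0, 260.0, 240.0], 2)
rep_fit = fit_sn(rep_stress, simulate_life(rep_stress))

# Non-replication: one specimen per amplitude (4 specimens, half as many).
non_stress = np.array([300.0, 285.0, 270.0, 255.0])
non_fit = fit_sn(non_stress, simulate_life(non_stress))
```

Comparing `rep_fit` and `non_fit` on simulations like this is one way to reproduce the paper's qualitative claim: the two designs yield nearly the same fitted curve, with the non-replicated design using half the specimens.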

Relevance:

60.00%

Publisher:

Abstract:

This paper offers new findings which support the hypothesis that a causal link from happiness to social capital might exist. The paper exploits the very long German socio-economic panel of around 15,000 people. Using the prospective study methodology, it finds that happier people contribute more to social capital. Both parametric and nonparametric results suggest that there exists an inverted-U-shaped relationship between happiness and social capital. Moreover, optimism appears to be an important channel through which happiness is linked to social capital. The paper also presents residual happiness as a measure of optimism, which might be a valuable tool for empirical researchers. The results are robust to the inclusion of various controls including the initial level of social capital, random sampling, non-linearity, different measures of social capital, and estimation techniques.
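A standard parametric check for such an inverted-U is a quadratic regression: a negative coefficient on the squared term, with the implied turning point inside the data range, indicates an inverted-U shape. The data below are synthetic stand-ins for the happiness and social-capital measures, not the German panel.

```python
import numpy as np

# Quadratic regression as an inverted-U test (synthetic data only).
rng = np.random.default_rng(2)
happiness = rng.uniform(0.0, 10.0, 500)
social_capital = (1.0 + 0.8 * happiness - 0.06 * happiness**2
                  + rng.normal(0.0, 0.3, 500))

# Regress the outcome on the regressor and its square.
X = np.column_stack([np.ones(500), happiness, happiness**2])
b, *_ = np.linalg.lstsq(X, social_capital, rcond=None)
turning_point = -b[1] / (2 * b[2])  # peak of the fitted parabola
```

An inverted-U is supported when `b[2] < 0` and `turning_point` lies within the observed range of the regressor; nonparametric methods (as the paper also uses) relax the assumption that the curve is exactly parabolic.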

Relevance:

60.00%

Publisher:

Abstract:

Wetland and floodplain ecosystems along many regulated rivers are highly stressed, primarily due to a lack of environmental flows of appropriate magnitude, frequency, duration, and timing to support ecological functions. In the absence of increased environmental flows, the ecological health of river ecosystems can be enhanced by the operation of existing and new flow-control infrastructure (weirs and regulators) to return more natural environmental flow regimes to specific areas. However, determining the optimal investment and operation strategies over time is a complex task due to several factors including the multiple environmental values attached to wetlands, spatial and temporal heterogeneity and dependencies, nonlinearity, and time-dependent decisions. This makes for a very large number of decision variables over a long planning horizon. The focus of this paper is the development of a nonlinear integer programming model that accommodates these complexities. The mathematical objective aims to return the natural flow regime of key components of river ecosystems in terms of flood timing, flood duration, and interflood period. We applied a 2-stage recursive heuristic using tabu search to solve the model and tested it on the entire South Australian River Murray floodplain. We conclude that modern meta-heuristics can be used to solve the very complex nonlinear problems with spatial and temporal dependencies typical of environmental flow allocation in regulated river ecosystems. The model has been used to inform the investment in, and operation of, flow-control infrastructure in the South Australian River Murray.
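The tabu-search component can be sketched on a toy version of the decision problem: binary operate/do-not-operate choices for a handful of hypothetical regulators, a concave (nonlinear) benefit, and a budget cap. All numbers below are invented for illustration; the paper's model is vastly larger and includes spatial and temporal dependencies this sketch omits.

```python
# Toy tabu search over binary operate/do-not-operate decisions.
# Costs, benefits, and budget are hypothetical illustration values.
COST = [4, 3, 5, 2, 6]        # cost of operating each regulator
BENEFIT = [7, 5, 9, 3, 10]    # ecological benefit score per regulator
BUDGET = 12

def objective(x):
    """Concave (nonlinear) benefit; -inf marks a budget violation."""
    if sum(c for c, xi in zip(COST, x) if xi) > BUDGET:
        return float("-inf")
    return sum(b for b, xi in zip(BENEFIT, x) if xi) ** 0.9

def tabu_search(iters=100, tenure=3):
    x = [0] * len(COST)
    best, best_val = x[:], objective(x)
    tabu = {}                                  # flipped index -> expiry
    for t in range(iters):
        moves = []
        for i in range(len(x)):
            if tabu.get(i, -1) > t:
                continue                       # recently flipped: tabu
            y = x[:]
            y[i] = 1 - y[i]
            moves.append((objective(y), i, y))
        val, i, x = max(moves)                 # best admissible move
        tabu[i] = t + tenure                   # forbid undoing it for a while
        if val > best_val:
            best, best_val = x[:], val
    return best, best_val

best, best_val = tabu_search()
```

The tabu list is what lets the search escape local optima: the best admissible move is taken even when it worsens the objective, while recently reversed decisions stay temporarily forbidden.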

Relevance:

60.00%

Publisher:

Abstract:

Stochastic search techniques such as evolutionary algorithms (EAs) are known to be better explorers of the search space than conventional techniques, including deterministic methods. However, in the era of big data, the suitability of evolutionary algorithms, like that of most other search methods and learning algorithms, is naturally questioned. Big data poses new computational challenges, including very high dimensionality and sparseness of data. Evolutionary algorithms' superior exploration skills should make them promising candidates for handling optimization problems involving big data. High-dimensional problems introduce added complexity to the search space, and EAs need to be enhanced to ensure that the majority of potential winner solutions get the chance to survive and mature. In this paper we present an evolutionary algorithm with an enhanced ability to deal with the problems of high dimensionality and sparseness of data. In addition to an informed exploration of the solution space, this technique balances exploration and exploitation using a hierarchical multi-population approach. The proposed model uses informed genetic operators to introduce diversity by expanding the scope of the search process at the expense of redundant, less promising members of the population. The next phase of the algorithm attempts to deal with the problem of high dimensionality by ensuring a broader and more exhaustive search and preventing the premature death of potential solutions. To achieve this, in addition to the above exploration-controlling mechanism, a multi-tier hierarchical architecture is employed in which, in separate layers, the less fit, isolated individuals evolve in dynamic sub-populations that coexist alongside the original or main population. Evaluation of the proposed technique on well-known benchmark problems ascertains its superior performance. The algorithm has also been successfully applied to a real-world problem of financial portfolio management. Although the proposed method cannot be considered big data-ready, it is certainly a move in the right direction.
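The hierarchical multi-population idea can be illustrated with a toy two-tier genetic algorithm on the OneMax problem: the weaker half of the main population is demoted to a sub-population instead of being discarded, and its best members periodically migrate back. This is a minimal sketch of the general "demote, don't discard" mechanism, not the authors' algorithm, operators, or benchmarks.

```python
import random

# Toy two-tier multi-population GA on OneMax (maximise the number of ones).
random.seed(4)
GENES, POP, GENS = 30, 20, 60

def fitness(ind):
    return sum(ind)

def breed(parents):
    a, b = random.sample(parents, 2)
    cut = random.randrange(1, GENES)
    child = a[:cut] + b[cut:]          # one-point crossover
    i = random.randrange(GENES)
    child[i] = 1 - child[i]            # single-bit mutation
    return child

main = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
sub = []                               # refuge tier for less fit individuals

for gen in range(GENS):
    main.sort(key=fitness, reverse=True)
    sub.extend(main[POP // 2:])        # demote the weaker half, keep them alive
    main = main[:POP // 2]
    main += [breed(main) for _ in range(POP - len(main))]
    sub.sort(key=fitness, reverse=True)
    sub = sub[:POP]                    # cap the refuge tier
    if sub and gen % 10 == 9:          # periodic upward migration
        main[-1] = sub.pop(0)

best = max(main, key=fitness)
```

Keeping the demoted individuals evolving in a side population preserves diversity that a plain truncation-selection GA would lose, which is the mechanism the abstract credits for preventing the premature death of potential solutions.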

Relevance:

60.00%

Publisher:

Abstract:

Vascular implants belong to a specialised class of medical textiles. The basic purpose of a vascular implant (graft or stent) is to act as an artificial conduit or substitute for a diseased artery. However, its long-term healing function depends on its ability to mimic the mechanical and biological behaviour of the artery. This requires a thorough understanding of the structure and function of an artery, which can then be translated into a synthetic structure based on the capabilities of the manufacturing method utilised. Common textile manufacturing techniques, such as weaving, knitting, braiding, and electrospinning, have frequently been used to design vascular implants for research and commercial purposes over the past decades. However, the ability to match the attributes of a vascular substitute to those of a native artery remains a challenge. Synthetic implants have been found to disturb biological, biomechanical, and hemodynamic parameters at the implant site, which has been widely attributed to their structural design. In this work, we reviewed the design aspects of textile vascular implants and compared them to the structure of a natural artery as a basis for assessing their level of success as implants. The outcome of this work is expected to encourage future design strategies for developing improved, long-lasting vascular implants.

Relevance:

60.00%

Publisher:

Abstract:

This study investigates the effects of oil price shocks on three measures of oil exporters' and oil importers' external balances: the total trade balance, the oil trade balance and the non-oil trade balance. We employ three second-generation heterogeneous linear panel models and one recently developed non-linear panel estimation technique that allows for cross-sectional dependence. With respect to 28 major oil exporting countries, an increase in oil prices leads to an improved real oil trade balance, although it is detrimental to the non-oil and total trade balances. This finding might be due to the expenditure effect arising from increases in proceeds from oil exports. A decrease in oil prices is found to be beneficial for both total and oil trade balances in these oil exporting countries. Forty major oil importers seem to have become increasingly shielded from positive oil shocks over the 1970s and 1980s; however, they must worry about oil price declines. A decline in oil prices has a negative impact on both total and real oil trade balances, resulting from increased oil imports in emerging economies. Hence, a decline in oil prices is beneficial to oil exporters due to the quantity effect outweighing the price effect, while for oil importers a stable oil price is more desirable than a price decline. These results are important to take into account if we are to gain a full understanding of the magnitude of the trade and macroeconomic effects of oil price changes and of what the policy responses should be.

Relevance:

60.00%

Publisher:

Abstract:

In the late 1900s, suitable key lengths were determined by cryptographers who considered four main features based on implementation, expected lifespan and associated security. By 2010, recommendations were aimed at governmental and commercial institutions, taking into consideration practical implementations that provide data security. By aggregating the key-length predictive data since 1985, we notice that while the figures proposed between 1990 and 2010 increase linearly, those proposed for 2010 to 2050 do not. This motivates us to re-think the factors used as a basis for key-length predictions, and we initiate this re-evaluation in this paper. Focusing first on implementation, we clarify the meaning of Moore's law by going back to his original papers and commentary. We then focus on the period 2010-2015, when the non-linearity appears, and test Moore's law on three different hardware platforms. Our conclusion is that current assumptions about Moore's law are still reasonable and that the non-linearity is likely caused by other factors, which we will investigate in future work.
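The linearity observed up to 2010 is what a naive reading of Moore's law predicts: if attacker computing power doubles every fixed interval, maintaining a constant security margin requires one extra symmetric key bit per doubling, i.e. key length grows linearly in time. A minimal sketch of that arithmetic, using an illustrative 80-bit 2010 baseline rather than a figure from the paper:

```python
# One extra key bit per compute doubling: a doubling law in hardware
# implies linear growth in symmetric key length. Illustrative only.
def projected_key_bits(base_bits, base_year, target_year,
                       months_per_doubling=18):
    doublings = (target_year - base_year) * 12 / months_per_doubling
    return base_bits + doublings

# e.g. from a hypothetical 80-bit baseline in 2010:
bits_2025 = projected_key_bits(80, 2010, 2025)  # 80 + 10 = 90
```

Under this reading, any departure from linearity in the 2010-2050 proposals must come from factors other than the doubling rate itself, which is the re-evaluation the paper undertakes.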