898 results for Non-linear series


Relevance:

80.00%

Publisher:

Abstract:

The standard approach to tax compliance applies the economics-of-crime methodology pioneered by Becker (1968): in its first application, due to Allingham and Sandmo (1972), it models the behaviour of agents as a decision about the extent of their income to report to tax authorities, given a certain institutional environment represented by parameters such as the probability of detection and the penalties imposed in the event the agent is caught. While this basic framework yields important insights on tax compliance behaviour, it has some critical limitations. Specifically, it predicts a level of compliance that is significantly below what is observed in the data. This thesis revisits the original framework with a view to addressing this issue and examining the political economy implications of tax evasion for progressivity in the tax structure. The approach involves building a macroeconomic, dynamic equilibrium model to examine these issues, using a step-wise model-building procedure that starts with some very simple variations of the basic Allingham and Sandmo construct, which are eventually integrated into a dynamic general equilibrium overlapping generations framework with heterogeneous agents. One of the variations involves incorporating the Allingham and Sandmo construct into a two-period model of a small open economy of the type originally attributed to Fisher (1930). A further variation of this simple construct involves allowing agents first to decide whether or not to evade taxes. In the event they decide to evade, the agents then have to decide the extent of income or wealth they wish to under-report. We find that the ‘evade or not’ assumption has strikingly different and more realistic implications for the extent of evasion, and demonstrate that it is a more appropriate modelling strategy in the context of macroeconomic models, which are essentially dynamic in nature and involve consumption smoothing across time and across various states of nature. Specifically, since deciding to undertake tax evasion impacts on the consumption smoothing ability of the agent by creating two states of nature in which the agent is ‘caught’ or ‘not caught’, there is a possibility that their utility under certainty, when they choose not to evade, is higher than the expected utility obtained when they choose to evade. Furthermore, the simple two-period model incorporating an ‘evade or not’ choice can be used to demonstrate some strikingly different political economy implications relative to its Allingham and Sandmo counterpart. In variations of the two models that allow for voting on the tax parameter, we find that agents typically vote for a high degree of progressivity by choosing the highest available tax rate from the menu of choices available to them. There is, however, a small range of inequality levels for which agents in the ‘evade or not’ model vote for a relatively low value of the tax rate. The final steps in the model building procedure involve grafting the two-period models with a political economy choice into a dynamic overlapping generations setting with more general, non-linear tax schedules and a ‘cost-of-evasion’ function that is increasing in the extent of evasion. Results based on numerical simulations of these models show further improvement in the models’ ability to match empirically plausible levels of tax evasion.
In addition, the differences between the political economy implications of the ‘evade or not’ version of the model and its Allingham and Sandmo counterpart are now very striking: there is now a large range of values of the inequality parameter for which agents in the ‘evade or not’ model vote for a low degree of progressivity. This is because, in the ‘evade or not’ version of the model, low values of the tax rate encourage a large number of agents to choose the ‘not-evade’ option, so that the redistributive mechanism is more ‘efficient’ relative to situations in which tax rates are high. Some further implications of the models of this thesis concern whether variations in the level of inequality, and parameters such as the probability of detection and penalties for tax evasion, matter for the political economy results. We find that (i) the political economy outcomes for the tax rate are quite insensitive to changes in inequality, and (ii) the voting outcomes change in non-monotonic ways in response to changes in the probability of detection and penalty rates. Specifically, the model suggests that changes in inequality should not matter, although the political outcome for the tax rate at a given level of inequality is conditional on whether there is a large or small extent of evasion in the economy. We conclude that further theoretical research into macroeconomic models of tax evasion is required to identify the structural relationships underpinning the link between inequality and redistribution in the presence of tax evasion. The models of this thesis provide a necessary first step in that direction.
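To make the ‘evade or not’ margin described above concrete, the Python sketch below compares the agent’s utility under certainty (full compliance) with the expected utility of an optimally chosen level of evasion in a one-shot Allingham–Sandmo style setting. The CRRA utility function, the penalty rule and all parameter values are illustrative assumptions, not the thesis’s specification or calibration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative sketch of the 'evade or not' choice: all functional forms and
# parameter values are assumptions for demonstration, not the thesis's calibration.

def crra(c, sigma=2.0):
    """CRRA utility."""
    return c ** (1 - sigma) / (1 - sigma) if sigma != 1 else np.log(c)

def expected_utility_of_evasion(e, y=1.0, tau=0.3, p=0.05, theta=1.5):
    """Expected utility when a fraction e of income is concealed.
    If caught (prob. p), the agent pays the full tax plus a penalty theta on the evaded tax."""
    c_not_caught = y - tau * (1 - e) * y
    c_caught = y - tau * y - theta * tau * e * y
    return p * crra(c_caught) + (1 - p) * crra(c_not_caught)

def decide(y=1.0, tau=0.3, p=0.05, theta=1.5):
    # Utility under certainty when the agent chooses not to evade.
    u_comply = crra(y - tau * y)
    # Best interior evasion level, found numerically.
    res = minimize_scalar(lambda e: -expected_utility_of_evasion(e, y, tau, p, theta),
                          bounds=(0.0, 1.0), method="bounded")
    u_evade = -res.fun
    return ("evade", res.x) if u_evade > u_comply else ("comply", 0.0)

print(decide())        # low detection probability: evading yields higher expected utility
print(decide(p=0.5))   # higher detection probability: the certain, compliant path wins
```

With a low detection probability the interior Allingham–Sandmo optimum implies implausibly high evasion, while raising detection enough makes the certain, non-evading consumption path preferable; that switch is the mechanism behind the ‘evade or not’ results summarised above.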

Relevance:

80.00%

Publisher:

Abstract:

This paper proposes a new approach to state estimation of the angles and frequencies of equivalent areas in large power systems using synchronized phasor measurement units. Coherent generators and their corresponding areas are first identified; the generators are then aggregated and system reduction is performed in each area of the interconnected power system. The structure of the reduced system is obtained from the characteristics of the reduced linear model and the measurement data, which together form the non-linear model of the reduced system. A Kalman estimator is then designed for the reduced system to provide equivalent dynamic state estimation using the synchronized phasor measurement data. The method is simulated on two test systems to evaluate its feasibility.
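As a rough illustration of the final estimation step, the sketch below runs a discrete-time Kalman filter on an equivalent-area state of angle and frequency deviation driven by phasor measurements. The system matrices, noise covariances and measurement model are placeholder assumptions; the paper builds a non-linear reduced model, for which an extended Kalman filter with linearized Jacobians would play the analogous role.

```python
import numpy as np

# Minimal discrete-time Kalman filter sketch for an equivalent-area state
# (angle, frequency deviation) driven by synchronized phasor measurements.
# A, H, Q, R and the measurement model are illustrative assumptions only.

dt = 0.02                                   # PMU reporting interval (s), assumed
A = np.array([[1.0, dt],                    # angle integrates frequency deviation
              [0.0, 0.98]])                 # frequency deviation with damping
H = np.array([[1.0, 0.0]])                  # PMU measures the equivalent angle
Q = np.diag([1e-6, 1e-4])                   # process noise covariance
R = np.array([[1e-4]])                      # measurement noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle; x is the state estimate, P its covariance."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update with phasor measurement z
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Toy run on synthetic measurements
rng = np.random.default_rng(0)
x, P = np.zeros(2), np.eye(2)
true_angle = 0.1
for _ in range(50):
    z = np.array([true_angle + rng.normal(scale=0.01)])
    x, P = kalman_step(x, P, z)
print(x)   # estimated [angle, frequency deviation]
```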

Relevance:

80.00%

Publisher:

Abstract:

We examined the year-to-year variation in the association between high temperatures and elderly mortality (age ≥ 75 years) in 83 US cities between 1987 and 2000. We used a Poisson regression model and decomposed the mortality risk for high temperatures into a “main effect” due to high temperatures, modelled using a lagged non-linear function, and an “added effect” due to consecutive high-temperature days. We pooled yearly effects at both regional and national levels. The high temperature effects (both main and added effects) on elderly mortality varied greatly from year to year. In every city there was at least one year in which higher temperatures were associated with lower mortality. Years with relatively high heat-related mortality were often followed by years with relatively low mortality. These year-to-year changes have important consequences for heat-warning systems and for predictions of heat-related mortality due to climate change.
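A minimal sketch of the “main effect” plus “added effect” decomposition described above is shown below, using a Poisson regression of daily deaths on current and lagged temperature together with an indicator for consecutive hot days. The quadratic lag terms stand in for the study’s distributed-lag non-linear basis, and the data are synthetic placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic data and a crude lag/basis choice stand in for the study's actual
# distributed-lag non-linear specification.

rng = np.random.default_rng(1)
n = 365 * 3
temp = 20 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
deaths = rng.poisson(20, n)                        # placeholder daily death counts

df = pd.DataFrame({"deaths": deaths, "temp_c": temp - temp.mean()})
lag_cols = []
for lag in range(4):                               # temperature at lags 0-3 days
    df[f"t{lag}"] = df["temp_c"].shift(lag)
    df[f"t{lag}_sq"] = df[f"t{lag}"] ** 2
    lag_cols += [f"t{lag}", f"t{lag}_sq"]
# 'Added effect': indicator for at least two consecutive days above the 95th percentile.
hot = df["temp_c"] > df["temp_c"].quantile(0.95)
df["added"] = (hot & hot.shift(1, fill_value=False)).astype(int)

df = df.dropna()
X = sm.add_constant(df[lag_cols + ["added"]])
fit = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
print(fit.params["added"])   # log relative risk attributed to consecutive hot days
```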

Relevance:

80.00%

Publisher:

Abstract:

Cold-formed steel beams are increasingly used as floor joists and bearers in buildings, and their behaviour and moment capacities are often influenced by lateral-torsional buckling. With the increasing use of cold-formed steel beams, their fire safety design has become an important issue. Fire design rules are commonly based on past research on hot-rolled steel beams. Hence a detailed parametric study was undertaken using validated finite element models to investigate the lateral-torsional buckling behaviour of simply supported cold-formed steel lipped channel beams subjected to uniform bending at uniform elevated temperatures. The moment capacity results were compared with the predictions from the available ambient temperature and fire design rules, and suitable recommendations were made. European fire design rules were found to be over-conservative, while the ambient temperature design rules could not be used on the basis of a single buckling curve. Hence a new design method was proposed that includes the important non-linear stress-strain characteristics observed for cold-formed steels at elevated temperatures. Comparison with numerical moment capacities demonstrated the accuracy of the new design method. This paper presents the details of the parametric study, comparisons with current design rules and the new design rules proposed in this research for lateral-torsional buckling of cold-formed steel lipped channel beams at elevated temperatures.
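The kind of non-linear stress-strain behaviour referred to above is often represented with a Ramberg-Osgood type expression whose elastic modulus and yield strength are reduced with temperature. The sketch below is only an illustration of that idea: the reduction factors, the exponent and the material constants are placeholders, not the reduction factors or design equations proposed in the paper.

```python
import numpy as np

# Illustrative Ramberg-Osgood type stress-strain model of the kind used to capture
# the non-linear behaviour of cold-formed steel at elevated temperature. All values
# below are placeholder assumptions, not the paper's proposed design rules.

def stress_strain(strain, T):
    """Return stress (MPa) for a given strain at temperature T (deg C)."""
    E20, fy20 = 200e3, 450.0                 # ambient modulus and yield strength (assumed)
    # Crude linear reduction factors for modulus and yield strength (placeholders).
    kE = max(0.1, 1.0 - 0.0006 * max(T - 100, 0))
    ky = max(0.1, 1.0 - 0.0008 * max(T - 100, 0))
    E, fy, n = kE * E20, ky * fy20, 15.0     # n controls the roundness of the curve
    # Invert strain = stress/E + 0.002*(stress/fy)**n numerically (bisection).
    lo, hi = 0.0, 2.0 * fy
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        eps = mid / E + 0.002 * (mid / fy) ** n
        lo, hi = (mid, hi) if eps < strain else (lo, mid)
    return 0.5 * (lo + hi)

for T in (20, 400, 600):
    print(T, round(stress_strain(0.004, T), 1))   # stress at 0.4% strain
```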

Relevance:

80.00%

Publisher:

Abstract:

Research background: Communicating the diverse nature of multimodal practice is inherently difficult for the design-led research academic. Websites are an effective means of displaying images and text, but for the user/viewer the act of viewing is often random and disorienting, due to the non-linear means of accessing the information. This characteristic of websites limits the medium’s efficacy in presenting an overarching philosophical standpoint or theme, the key driver behind most academic research. Research contribution: This website, http://www.ianweirarchitect.com, addresses this problem through a deceptively simple graphic and temporal layout, which limits the opportunity for the user/viewer to become disoriented and miss the key themes and issues that bind the otherwise divergent research material together. Research significance: http://www.ianweirarchitect.com is a creative work that supplements Dr Ian Weir’s exhibition “Enacted Cartography”, held in August 2012 in Brisbane and in August/September 2012 in Venice, Italy, for the 13th International Architecture Exhibition (Venice Architecture Biennale). Dr Weir was selected by the Australian Institute of Architects to represent innovation in architectural practice for the Institute’s Formations: New Practices in Australian Architecture exhibition and catalogue (of the same name) held in the Australian Pavilion, The Giardini, Venice. This website is a creative output that complements Dr Weir’s other multimodal outputs, including photographic artworks, cartographic maps and architectural designs.

Relevance:

80.00%

Publisher:

Abstract:

This work has led to the development of empirical mathematical models to quantitatively predict the changes in morphology of osteocyte-like cell lines (MLO-Y4) in culture. MLO-Y4 cells were cultured at low density and the changes in morphology recorded over 11 hours. Cell area and three dimensionless shape features, including aspect ratio, circularity and solidity, were then determined using widely accepted image analysis software (ImageJ). Based on the data obtained from the image analysis, mathematical models were developed using non-linear regression. The developed mathematical models accurately predict the morphology of MLO-Y4 cells for different culture times and can, therefore, be used as a reference model for analysing MLO-Y4 cell morphology changes within various biological/mechanical studies, as necessary.
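As an illustration of the model-building step, the sketch below fits an empirical non-linear model of one morphology feature (cell area) against culture time with scipy’s curve_fit. The exponential-plateau functional form and the synthetic data are assumptions for demonstration; they are not the fitted models reported in this work.

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of fitting an empirical non-linear model to a morphology feature over
# culture time. The functional form and data are illustrative assumptions only.

def spreading_model(t, a0, a_max, k):
    """Cell area approaching a plateau a_max from a0 with rate k (1/h)."""
    return a_max - (a_max - a0) * np.exp(-k * t)

rng = np.random.default_rng(2)
t_hours = np.linspace(0, 11, 23)                       # observation times (h)
area = spreading_model(t_hours, 300, 900, 0.4) + rng.normal(0, 25, t_hours.size)

params, cov = curve_fit(spreading_model, t_hours, area, p0=(200, 800, 0.3))
a0, a_max, k = params
print(f"a0={a0:.0f} um^2, a_max={a_max:.0f} um^2, k={k:.2f} 1/h")

# The fitted curve can then serve as a reference model for the feature at any
# culture time, e.g. the predicted area at 6 h:
print(spreading_model(6.0, *params))
```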

Relevance:

80.00%

Publisher:

Abstract:

Carbon fibre reinforced polymer (CFRP) sheets have many outstanding properties, such as high strength, high elastic modulus, light weight and good durability, which make them a suitable alternative to steel in strengthening work. This paper describes the ultimate load-carrying capacity of steel hollow sections at the effective bond length in terms of their cross-sectional area, and the stress distribution within the bond region for different numbers of CFRP layers. It was found that the ultimate tensile load differed depending on the size and orientation of the unidirectional CFRP layers. Along with these tests, non-linear finite element analysis was also performed to validate the ultimate load-carrying capacity for the different cross-sections. The predicted ultimate loads from the FE analysis were found to be very close to the laboratory test results. The validated model was then used to determine the stress distribution at the bond joint for different orientations of CFRP. This research shows the effect of the stress distribution and identifies suitable wrapping layers for the strengthening of steel hollow sections in tension.

Relevance:

80.00%

Publisher:

Abstract:

We propose a new active noise control (ANC) technique. The technique has a feedback structure, giving it a simple configuration for practical implementation. In this approach, the secondary path is modelled online to ensure convergence of the system, as secondary paths are, in practice, time-varying or non-linear. The proposed method consists of two components: a noise controller based on a modified FxLMS algorithm, and a new variable step size (VSS) LMS algorithm used to adapt the modelling filter to the secondary path. The proposed algorithm stops injection of the white noise at the optimum point and reactivates the injection during operation, if needed, to maintain the performance of the system. Eliminating continuous injection of the white noise improves the performance of the proposed method significantly and makes it more desirable for practical ANC systems. Computer simulations are presented to show the effectiveness of the proposed method.
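The sketch below illustrates the two building blocks named above: LMS modelling of the secondary path from injected white noise, followed by an FxLMS controller that filters the reference through the learned model. The paper’s feedback structure, its variable-step-size rule and its criterion for stopping and restarting the noise injection are not reproduced; this toy simply stops injecting once a fixed modelling phase ends, and all plant models and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
s_true = np.array([0.0, 0.8, 0.3, -0.1])      # "true" secondary path (assumed FIR)
L, N_model, N_ctrl = 16, 4000, 20000

# --- Phase 1: secondary-path modelling with injected white noise only ----------
s_hat = np.zeros_like(s_true)
vbuf = np.zeros_like(s_true)
for n in range(N_model):
    v = rng.normal()
    vbuf = np.roll(vbuf, 1); vbuf[0] = v
    d_mic = s_true @ vbuf                      # mic hears the injected noise through the path
    e_model = d_mic - s_hat @ vbuf
    s_hat += 0.01 * e_model * vbuf             # fixed-step LMS (the VSS rule is omitted here)
print("estimated secondary path:", np.round(s_hat, 3))

# --- Phase 2: FxLMS control using the learned model, injection stopped ---------
w = np.zeros(L)                                # control filter
xbuf, fxbuf = np.zeros(L), np.zeros(L)
ybuf = np.zeros_like(s_true)
mu_w, errs = 0.002, []
x = np.sin(2 * np.pi * 0.05 * np.arange(N_ctrl)) + 0.05 * rng.normal(size=N_ctrl)
for n in range(N_ctrl):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[n]
    y = w @ xbuf                               # anti-noise sent to the loudspeaker
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e = x[n] + s_true @ ybuf                   # residual at the error microphone
    fxbuf = np.roll(fxbuf, 1); fxbuf[0] = s_hat @ xbuf[: s_hat.size]
    w -= mu_w * e * fxbuf                      # FxLMS update with the filtered reference
    errs.append(e)

print("error power, first 1000 samples:", np.mean(np.square(errs[:1000])))
print("error power, last 1000 samples :", np.mean(np.square(errs[-1000:])))
```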

Relevance:

80.00%

Publisher:

Abstract:

Objective: To examine the effects of extremely cold and hot temperatures on ischaemic heart disease (IHD) mortality in five cities (Beijing, Tianjin, Shanghai, Wuhan and Guangzhou) in China, and to examine the time relationships between cold and hot temperatures and IHD mortality in each city. Design: A negative binomial regression model combined with a distributed lag non-linear model was used to examine city-specific temperature effects on IHD mortality up to 20 lag days. A meta-analysis was used to pool the cold effects and hot effects across the five cities. Patients: 16 559 IHD deaths were monitored by a sentinel surveillance system in the five cities during 2004–2008. Results: The relationships between temperature and IHD mortality were non-linear in all five cities. The minimum-mortality temperatures in the northern cities were lower than in the southern cities. In Beijing, Tianjin and Guangzhou, the effects of extremely cold temperatures were delayed, while Shanghai and Wuhan had immediate cold effects. The effects of extremely hot temperatures appeared immediately in all the cities except Wuhan. Meta-analysis showed that IHD mortality increased by 48% at the 1st percentile of temperature (extremely cold) compared with the 10th percentile, and by 18% at the 99th percentile of temperature (extremely hot) compared with the 90th percentile. Conclusions: The results indicate that both extremely cold and hot temperatures increase IHD mortality in China. Each city has its own characteristic heat effects on IHD mortality. Policy responses to climate change should consider local climate–IHD mortality relationships.
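The pooling step can be illustrated with a standard DerSimonian-Laird random-effects meta-analysis of city-specific log relative risks, as sketched below. The five estimates and standard errors are placeholders, not the study’s results.

```python
import numpy as np

# Sketch of pooling city-specific cold-effect estimates (log relative risks and
# their standard errors) with a DerSimonian-Laird random-effects model.

log_rr = np.array([0.45, 0.30, 0.40, 0.35, 0.50])   # city-specific log RRs (assumed)
se = np.array([0.10, 0.12, 0.09, 0.15, 0.11])       # their standard errors (assumed)

w_fixed = 1.0 / se**2
theta_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
# Between-city heterogeneity (DerSimonian-Laird estimator of tau^2).
q = np.sum(w_fixed * (log_rr - theta_fixed) ** 2)
df = len(log_rr) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_rand = 1.0 / (se**2 + tau2)
theta = np.sum(w_rand * log_rr) / np.sum(w_rand)
se_theta = np.sqrt(1.0 / np.sum(w_rand))
rr = np.exp(theta)
ci = np.exp(theta + np.array([-1.96, 1.96]) * se_theta)
print(f"pooled RR = {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```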

Relevance:

80.00%

Publisher:

Abstract:

This study investigates whether and how a firm’s ownership and corporate governance affect its timeliness of price discovery, which refers to the speed at which value-relevant information is incorporated into the stock price. Using panel data of 1,138 Australian firm-year observations from 2001 to 2008, we predict and find a non-linear relationship between ownership concentration and the timeliness of price discovery. We test the identity of the largest shareholder and find that only firms with a family as the largest shareholder exhibit faster price discovery. There is no evidence that the presence of a second-largest shareholder materially affects the timeliness of price discovery. Although we find the expected positive association between corporate governance quality and the timeliness of price discovery, there is no interaction effect between the largest shareholding and corporate governance in relation to the timeliness of price discovery. Further tests show no evidence of severe endogeneity problems in our study.
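A simple way to test the kind of non-linear ownership-timeliness relation described above is to include a quadratic ownership term in the regression, as in the hedged sketch below. The variable names, the synthetic panel and the plain OLS-with-clustered-errors estimator are illustrative assumptions; the study’s actual timeliness measure, controls and estimation strategy are not reproduced here.

```python
import pandas as pd
import numpy as np
import statsmodels.formula.api as smf

# Synthetic panel illustrating a quadratic (inverted-U) ownership-timeliness relation.
rng = np.random.default_rng(4)
n = 1138
df = pd.DataFrame({
    "own": rng.uniform(0.05, 0.8, n),          # largest-shareholder stake
    "family": rng.integers(0, 2, n),           # family is the largest holder (0/1)
    "gov": rng.normal(0, 1, n),                # governance quality score
    "firm": rng.integers(0, 200, n),           # firm identifier for clustering
})
df["own_sq"] = df["own"] ** 2
df["timeliness"] = (0.5 + 1.2 * df["own"] - 1.5 * df["own_sq"]
                    + 0.10 * df["family"] + 0.05 * df["gov"]
                    + rng.normal(0, 0.2, n))

model = smf.ols("timeliness ~ own + own_sq + family + gov", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["firm"]})
print(model.params[["own", "own_sq"]])         # non-linearity: own > 0, own_sq < 0
```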

Relevance:

80.00%

Publisher:

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions. Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While the existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
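The quantization-and-encoding stage discussed above can be sketched as follows: thresholds are learned from training features (here, equal-probability bins per dimension), and each feature is then quantized and Gray-coded into bits so that neighbouring bins differ by a single bit. The synthetic features and the 2-bit design are illustrative assumptions rather than any specific published scheme.

```python
import numpy as np

def train_thresholds(features, bits=2):
    """Learn per-dimension thresholds giving 2**bits equi-probable bins."""
    levels = 2 ** bits
    qs = np.linspace(0, 100, levels + 1)[1:-1]          # interior quantiles
    return np.percentile(features, qs, axis=0)          # shape (levels-1, dims)

GRAY_2BIT = np.array([[0, 0], [0, 1], [1, 1], [1, 0]])  # adjacent bins differ by 1 bit

def robust_hash(feature_vec, thresholds):
    """Quantize one feature vector and concatenate Gray-coded bin indices."""
    bins = np.array([np.searchsorted(thresholds[:, d], feature_vec[d])
                     for d in range(feature_vec.size)])
    return np.concatenate([GRAY_2BIT[b] for b in bins])

rng = np.random.default_rng(5)
train = rng.normal(0, 1, (1000, 8))                     # training features (synthetic)
thr = train_thresholds(train)

x = rng.normal(0, 1, 8)                                 # features from an image
x_noisy = x + rng.normal(0, 0.05, 8)                    # same image after a minor change
h1, h2 = robust_hash(x, thr), robust_hash(x_noisy, thr)
print("hash length:", h1.size, "Hamming distance:", np.sum(h1 != h2))
```

The Gray coding keeps small feature perturbations that cross one threshold to a one-bit change in the hash, while the learned thresholds are exactly the kind of trained parameters whose accuracy and leakage effects the dissertation investigates.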

Relevance:

80.00%

Publisher:

Abstract:

Reliability analysis is crucial to reducing the unexpected downtime, severe failures and ever-tightening maintenance budgets of engineering assets. Hazard-based reliability methods are of particular interest as the hazard reflects the current health status of engineering assets and their imminent failure risks. Most existing hazard models were constructed using statistical methods. However, these methods rest largely on two assumptions: that the baseline failure distribution is accurate for the population concerned, and that the assumed effects of covariates on the hazard are correct. These two assumptions may be difficult to satisfy and therefore compromise the effectiveness of hazard models in application. To address this issue, a non-linear hazard modelling approach is developed in this research using neural networks (NNs), resulting in neural network hazard models (NNHMs), to deal with the limitations arising from the two assumptions of statistical models. With the success of failure prevention efforts, less failure history becomes available for reliability analysis. Involving condition data, or covariates, is a natural solution to this challenge. A critical issue in involving covariates in reliability analysis is that complete and consistent covariate data are often unavailable in reality, due to inconsistent measuring frequencies of multiple covariates, sensor failure, and sparse intrusive measurements. This problem has not been studied adequately in current reliability applications. This research therefore investigates the incomplete-covariates problem in reliability analysis. Typical approaches to handling incomplete covariates have been studied to investigate their performance and their effects on the reliability analysis results. Since these existing approaches can underestimate the variance in regressions and introduce extra uncertainties into reliability analysis, the developed NNHMs are extended to handle incomplete covariates as an integral part of the model. The extended versions of the NNHMs have been validated using simulated bearing data and real data from a liquefied natural gas pump. The results demonstrate that the new approach outperforms the typical approaches to handling incomplete covariates. Another problem in reliability analysis is that future covariates of engineering assets are generally unavailable. In existing practice for multi-step reliability analysis, historical covariates are used to estimate future covariates. Covariates of engineering assets, however, are often subject to substantial fluctuation due to the influence of both engineering degradation and changes in environmental settings. The commonly used covariate extrapolation methods are thus unsuitable because of error accumulation and uncertainty propagation. To overcome this difficulty, instead of directly extrapolating covariate values, projection of covariate states is conducted in this research. The estimated covariate states and the unknown covariate values in future running steps of the assets constitute an incomplete covariate set, which is then analysed by the extended NNHMs. A new assessment function is also proposed to evaluate the risks of underestimated and overestimated reliability analysis results. A case study using field data from a paper and pulp mill has been conducted, and it demonstrates that the new multi-step reliability analysis procedure is able to generate more accurate analysis results.
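A forward-pass sketch of the neural-network hazard idea is given below: a small multilayer perceptron maps operating time and condition covariates to a non-negative hazard, from which reliability follows by numerical integration. The network weights here are random placeholders; in the approach described above they would be trained on failure and condition-monitoring data, and missing covariate values would be handled within the model rather than by this simple forward pass.

```python
import numpy as np

rng = np.random.default_rng(6)

def softplus(z):
    return np.log1p(np.exp(-np.abs(z))) + np.maximum(z, 0)   # numerically stable softplus

class HazardMLP:
    """Tiny MLP mapping [time, covariates] to a non-negative hazard value."""
    def __init__(self, n_in, n_hidden=8):
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))        # random placeholder weights
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)

    def hazard(self, x):
        h = np.tanh(x @ self.W1 + self.b1)
        return softplus(h @ self.W2 + self.b2).ravel()         # hazard >= 0

# Inputs: normalized operating time plus two synthetic condition covariates.
t = np.linspace(0, 100, 200)
covariates = np.column_stack([0.01 * t + 0.05 * rng.normal(size=t.size),
                              0.002 * t + 0.02 * rng.normal(size=t.size)])
X = np.column_stack([t / 100.0, covariates])

model = HazardMLP(n_in=3)
h = model.hazard(X)                                            # hazard over time
R = np.exp(-np.cumsum(h) * (t[1] - t[0]))                      # reliability R(t) from the hazard
print("hazard at t=100:", h[-1], " reliability at t=100:", R[-1])
```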

Relevance:

80.00%

Publisher:

Abstract:

Application of "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A research project has been conducted with the aim of developing concentrated plasticity methods suitable for practical advanced analysis of steel frame structures comprising non-compact sections. This paper contains a comprehensive set of analytical benchmark solutions for steel frames comprising non-compact sections, which can be used to verify the accuracy of simplified concentrated plasticity methods of advanced analysis. The analytical benchmark solutions were obtained using a distributed plasticity shell finite element model that explicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. A brief description and verification of the shell finite element model is provided in this paper.

Relevance:

80.00%

Publisher:

Abstract:

Objectives: To examine the effect of extreme temperatures on emergency department admissions (EDAs) for childhood asthma. Methods: An ecological design was used in this study. A Poisson linear regression model combined with a distributed lag non-linear model was used to quantify the effect of temperature on EDAs for asthma among children aged 0–14 years in Brisbane, Australia, during January 2003–December 2009, while controlling for air pollution, relative humidity, day of the week, season and long-term trends. The model residuals were checked to identify whether there was an added effect due to heat waves or cold spells. Results: There were 13 324 EDAs for childhood asthma during the study period. Both hot and cold temperatures were associated with increases in EDAs for childhood asthma, and both effects appeared to be acute. An added effect of heat waves on EDAs for childhood asthma was observed, but no added effect of cold spells was found. Male children and children aged 0–4 years were most vulnerable to heat effects, while children aged 10–14 years were most vulnerable to cold effects. Conclusions: Both hot and cold temperatures seemed to affect EDAs for childhood asthma. As climate change continues, children aged 0–4 years are at particular risk of asthma.

Relevance:

80.00%

Publisher:

Abstract:

Osteocyte cells are the most abundant cells in human bone tissue. Due to their unique morphology and location, osteocyte cells are thought to act as regulators in the bone remodelling process, and are believed to play an important role in astronauts’ bone mass loss after long-term space missions. There is increasing evidence that an osteocyte’s functions are highly affected by its morphology. However, changes in an osteocyte’s morphology under an altered gravity environment are still not well documented. Several in vitro studies have recently been conducted to investigate the morphological response of osteocyte cells to the microgravity environment, in which osteocyte cells were cultured on a two-dimensional flat surface for at least 24 hours before the microgravity experiments. Morphology changes of osteocyte cells in microgravity were then studied by comparing cell area with that of 1g control cells. However, osteocytes found in vivo have a more three-dimensional morphology, and both the cell body and the dendritic processes are sensitive to mechanical loading. Round osteocytes support a less stiff cytoskeleton and are more sensitive to mechanical stimulation than cells with a flat morphology. Thus, the relatively flat and spread shape of isolated osteocytes in 2D culture may greatly hamper their sensitivity to a mechanical stimulus, and the lack of knowledge of osteocytes’ morphological characteristics in culture may lead to subjective and incomplete conclusions about how altered gravity affects an osteocyte’s morphology. Through this work, empirical models were developed to quantitatively predict the changes in morphology of osteocyte cell lines (MLO-Y4) in culture, and the response of relatively round osteocyte cells to hyper-gravity stimulation was also investigated. The morphology changes of MLO-Y4 cells in culture were quantified by measuring cell area and three dimensionless shape features, including aspect ratio, circularity and solidity, using widely accepted image analysis software (ImageJ). MLO-Y4 cells were cultured at low density (5×10³ per well) and the changes in morphology were recorded over 10 hours. Based on the data obtained from the imaging analysis, empirical models were developed using non-linear regression. The developed empirical models accurately predict the morphology of MLO-Y4 cells for different culture times and can, therefore, be used as a reference model for analysing MLO-Y4 cell morphology changes within various biological/mechanical studies, as necessary. The morphological response of MLO-Y4 cells with a relatively round morphology to a hyper-gravity environment was investigated using a centrifuge. After 2 hours of culture, MLO-Y4 cells were exposed to 20g for 30 minutes. Changes in the morphology of the MLO-Y4 cells were quantitatively analysed by measuring the average value of cell area and dimensionless shape factors such as aspect ratio, solidity and circularity. In this study, no significant morphology changes were detected in MLO-Y4 cells under the hyper-gravity environment (20g for 30 minutes) compared with 1g control cells.
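The dimensionless shape factors used above can be computed directly from a cell outline, as in the sketch below. Circularity and solidity follow the usual ImageJ definitions (4πA/P², and A divided by the convex-hull area); the aspect ratio here is approximated from the eigenvalues of the boundary-point covariance rather than ImageJ’s fitted ellipse, and the outline itself is synthetic.

```python
import numpy as np
from scipy.spatial import ConvexHull

def shoelace_area(pts):
    """Polygon area from boundary points via the shoelace formula."""
    x, y = pts[:, 0], pts[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def shape_factors(pts):
    area = shoelace_area(pts)
    perim = np.sum(np.linalg.norm(np.diff(np.vstack([pts, pts[:1]]), axis=0), axis=1))
    hull_area = ConvexHull(pts).volume          # in 2D, .volume is the hull area
    evals = np.linalg.eigvalsh(np.cov(pts.T))   # covariance eigenvalues (ascending)
    return {"area": area,
            "circularity": 4 * np.pi * area / perim**2,
            "solidity": area / hull_area,
            "aspect_ratio": np.sqrt(evals[1] / evals[0])}

# Synthetic "cell": an ellipse-like outline with some boundary roughness.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
r = 1.0 + 0.05 * np.sin(7 * theta)
outline = np.column_stack([30 * r * np.cos(theta), 18 * r * np.sin(theta)])
print(shape_factors(outline))
```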