348 results for Breakdown Probability


Relevance:

10.00%

Publisher:

Abstract:

Rodenticide use in agriculture can lead to the secondary poisoning of avian predators. Currently the Australian sugarcane industry has two rodenticides, Racumin® and Rattoff®, available for in-crop use but, like many agricultural industries, it lacks an ecologically based method of determining the potential secondary poisoning risk these rodenticides pose to avian predators. The material presented in this thesis addresses this by: (a) determining where predator/prey interactions take place in sugar-producing districts; (b) quantifying the amount of rodenticide available to avian predators and the probability of encounter; and (c) developing a stochastic model that allows secondary poisoning risk under various rodenticide application scenarios to be investigated. Results demonstrate that predator/prey interactions are highly constrained by environmental structure. Rodents used crops that provided high levels of canopy cover, and therefore protection from predators, and made little use of open canopy areas. In contrast, raptors over-utilised areas with low canopy cover and low rodent densities, which provided high accessibility to prey. Given this pattern of habitat use, and given that industry baiting protocols preclude rodenticide application in open canopy crops, these results indicate that secondary poisoning can only occur if poisoned rodents leave closed canopy crops and become available for predation in open canopy areas. Results further demonstrate that after in-crop rodenticide application, only a small proportion of the rodents available in open areas are poisoned, and that these rodents carry low levels of toxicant. Coupled with the low level of rodenticide use in the sugar industry, the high toxic thresholds raptors have for these toxicants and the low probability of encountering poisoned rodents, the results indicate that the risk of secondary poisoning events is minimal. A stochastic model was developed to investigate the effect of manipulating factors that might influence secondary poisoning hazard in a sugarcane agro-ecosystem; these simulations further suggest that in all but extreme scenarios the risk of secondary poisoning is minimal. Collectively, these studies demonstrate that the secondary poisoning of avian predators associated with the use of the currently available rodenticides in Australian sugar-producing districts is minimal. Further, the ecologically based method of assessing secondary poisoning risk developed in this thesis has broader applications in other agricultural systems where rodenticide use may pose risks to avian predators.
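
The encounter-and-dose logic of such a stochastic risk model can be sketched in a few lines of Python. This is a minimal illustration only; every parameter value below (encounter probability, prey per bout, residue load, toxic threshold) is a hypothetical placeholder, not a value from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# All parameter values below are hypothetical placeholders, not thesis values.
N_SIMS = 100_000          # simulated raptor foraging bouts
P_POISONED = 0.02         # probability an open-area rodent carries bait residue
PREY_PER_BOUT = 3         # rodents taken per foraging bout
RESIDUE_MEAN_MG = 0.5     # mean toxicant load per poisoned rodent (mg)
LETHAL_DOSE_MG = 20.0     # assumed raptor toxic threshold (mg)

# For each bout, draw how many captured rodents were poisoned and the
# total toxicant dose the raptor would ingest.
n_poisoned = rng.binomial(PREY_PER_BOUT, P_POISONED, size=N_SIMS)
dose = np.array([rng.exponential(RESIDUE_MEAN_MG, k).sum() for k in n_poisoned])

print(f"P(lethal secondary dose) ~ {np.mean(dose >= LETHAL_DOSE_MG):.1e}")
```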

Relevance:

10.00%

Publisher:

Abstract:

Secondary tasks such as cell phone calls or interaction with automated speech dialog systems (SDSs) increase the driver’s cognitive load as well as the probability of driving errors. This study analyzes speech production variations due to cognitive load and emotional state of drivers in real driving conditions. Speech samples were acquired from 24 female and 17 male subjects (approximately 8.5 h of data) while talking to a co-driver and communicating with two automated call centers, with emotional states (neutral, negative) and the number of necessary SDS query repetitions also labeled. A consistent shift in a number of speech production parameters (pitch, first formant center frequency, spectral center of gravity, spectral energy spread, and duration of voiced segments) was observed when comparing SDS interaction against co-driver interaction; further increases were observed when considering negative emotion segments and the number of requested SDS query repetitions. A mel-frequency cepstral coefficient (MFCC) based Gaussian mixture classifier trained on 10 male and 10 female sessions provided 91% accuracy in the open test set task of distinguishing co-driver interactions from SDS interactions, suggesting—together with the acoustic analysis—that it is possible to monitor the level of driver distraction directly from their speech.
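
A minimal sketch of the two-model GMM likelihood test described above, assuming placeholder MFCC data rather than the study's driving-speech corpus:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Placeholder MFCC matrices (frames x coefficients); in practice these
# would be extracted from the labelled driving-speech sessions.
rng = np.random.default_rng(1)
mfcc_codriver = rng.normal(0.0, 1.0, (5000, 13))   # co-driver interaction frames
mfcc_sds      = rng.normal(0.3, 1.1, (5000, 13))   # SDS interaction frames

# One GMM per interaction type, as in the abstract's MFCC/GMM classifier.
gmm_codriver = GaussianMixture(n_components=8, covariance_type='diag').fit(mfcc_codriver)
gmm_sds      = GaussianMixture(n_components=8, covariance_type='diag').fit(mfcc_sds)

def classify(utterance_mfcc):
    """Label an utterance by comparing mean per-frame log-likelihoods."""
    ll_codriver = gmm_codriver.score(utterance_mfcc)
    ll_sds = gmm_sds.score(utterance_mfcc)
    return 'co-driver' if ll_codriver > ll_sds else 'SDS'

print(classify(rng.normal(0.3, 1.1, (200, 13))))   # expected: 'SDS'
```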

Relevance:

10.00%

Publisher:

Abstract:

The expansion of economics to ‘non-market topics’ has received increased attention in recent years. The economics of sports (football) is one such sub-field. This paper reports empirical evidence on team and referee performances in the FIFA World Cup 2002. The results reveal that being the host nation has a significant impact on the probability of winning a game. Furthermore, the strength of a team as measured by the FIFA World Ranking does not play the important role presumed, which indicates that the element of outcome uncertainty is at work. The findings also indicate that the influence of the referee on the game result should not be neglected. Finally, previous World Cup experience seems to have the strongest impact on referees' performances during the game.

Relevance:

10.00%

Publisher:

Abstract:

Recent measurements of particle size distributions and particle concentrations near a busy road cannot be explained by the conventional mechanisms for the evolution of combustion aerosols. Specifically, these mechanisms appear inadequate to explain the experimental observations of particle transformation and the evolution of the total number concentration. This led to the development of a new mechanism for the evolution of combustion aerosol nano-particles, based on their thermal fragmentation. A complex and comprehensive pattern of evolution of combustion aerosols, involving particle fragmentation, was then proposed and justified. In that model it was suggested that thermal fragmentation occurs in aggregates of primary particles, each of which contains a solid graphite/carbon core surrounded by volatile molecules bonded to the core by strong covalent bonds. Due to these strong covalent bonds between the core and the volatile (frill) molecules, such primary composite particles can be regarded as solid, despite the presence of a significant (possibly dominant) volatile component. Fragmentation occurs when the weak van der Waals forces between such primary particles are overcome by their thermal (Brownian) motion. In this work, the accepted concept of thermal fragmentation is advanced to determine whether fragmentation is likely in liquid composite nano-particles. It has been demonstrated that, at least at some stages of evolution, combustion aerosols contain a large number of composite liquid particles presumably containing several components such as water, oil, volatile compounds, and minerals. It is possible that such composite liquid particles may also experience thermal fragmentation and thus contribute to, for example, the evolution of the total number concentration as a function of distance from the source. Therefore, the aim of this project is to examine theoretically the possibility of thermal fragmentation of composite liquid nano-particles consisting of immiscible liquid components. The specific focus is on ternary systems comprising two immiscible liquid droplets surrounded by another medium (e.g., air). The analysis shows that three different structures are possible: complete encapsulation of one liquid by the other, partial encapsulation of the two liquids in a composite particle, and the two droplets separated from each other. The probability of thermal fragmentation of two coagulated liquid droplets is examined for different volumes of the immiscible fluids in a composite liquid particle and for different surface and interfacial tensions, by determining the Gibbs free energy difference between the coagulated and fragmented states and comparing this energy difference with the typical thermal energy kT. The analysis reveals that fragmentation is much more likely for a partially encapsulated particle than for a completely encapsulated particle. In particular, thermal fragmentation is much more likely when the volumes of the two liquid droplets that constitute the composite particle are very different; conversely, when the two droplets are of similar volumes, the probability of thermal fragmentation is small. It is also demonstrated that the Gibbs free energy difference between the coagulated and fragmented states is not the only important factor determining the probability of thermal fragmentation of composite liquid particles; the second essential factor is the actual structure of the composite particle.
It is shown that the probability of thermal fragmentation also depends strongly on the distance that each of the liquid droplets must travel to reach the fragmented state. In particular, if this distance is larger than the mean free path of the considered droplets in air, the probability of thermal fragmentation should be negligible. It follows that fragmentation of a composite particle in the completely encapsulated state is highly unlikely, because of the larger distance the two droplets must travel in order to separate. The analysis of composite liquid particles with the interfacial parameters expected in combustion aerosols demonstrates that thermal fragmentation of these particles may occur, and that this mechanism may play a role in the evolution of combustion aerosols. Conditions for thermal fragmentation to play a significant role (for aerosol particles other than those from motor vehicle exhaust) are determined and examined theoretically. Conditions for spontaneous transformation between the states of composite particles with complete and partial encapsulation are also examined, demonstrating the possibility of such transformation in combustion aerosols. Indeed, it is shown that for some typical components found in aerosols this transformation could take place on time scales of less than 20 s. The analysis shows that factors influencing surface and interfacial tension play an important role in this transformation process. It is suggested that such transformation may, for example, result in delayed evaporation of composite particles with a significant water component, leading to observable effects in the evolution of combustion aerosols (including possible local humidity maxima near a source, such as a busy road). These results will be important for the further development and understanding of aerosol physics and technologies, including combustion aerosols and their evolution near a source.
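
The central comparison described above, the Gibbs free energy difference between coagulated and fragmented states versus the thermal energy kT, can be sketched numerically. The fragment below is illustrative only: the droplet radii and surface/interfacial tensions are invented, and only the completely encapsulated geometry is evaluated.

```python
import numpy as np

kB = 1.380649e-23   # Boltzmann constant, J/K
T  = 300.0          # K

# Illustrative parameters (not the thesis values): two immiscible
# droplets, radii in metres, tensions in N/m.
r1, r2 = 5e-9, 20e-9          # inner (water-like) and outer (oil-like) radii
sigma1  = 0.072               # liquid 1 / air surface tension
sigma2  = 0.025               # liquid 2 / air surface tension
sigma12 = 0.050               # liquid 1 / liquid 2 interfacial tension

area = lambda r: 4.0 * np.pi * r**2

# Coagulated state, complete encapsulation of droplet 1 inside droplet 2:
# the outer radius is fixed by volume conservation.
r_outer = (r1**3 + r2**3) ** (1.0 / 3.0)
G_coag = sigma12 * area(r1) + sigma2 * area(r_outer)

# Fragmented state: two separate droplets, each with its own free surface.
G_frag = sigma1 * area(r1) + sigma2 * area(r2)

# Boltzmann-type estimate: fragmentation is plausible only when dG <~ kT.
# Here dG >> kT, consistent with complete encapsulation being hard to break.
dG = G_frag - G_coag
print(f"Delta G = {dG / (kB * T):.0f} kT, exp(-dG/kT) = {np.exp(-dG / (kB * T)):.1e}")
```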

Relevance:

10.00%

Publisher:

Abstract:

Background: Currently used Trauma and Injury Severity Score (TRISS) coefficients, which are used to estimate the probability of survival (Ps), were derived from the Major Trauma Outcome Study (MTOS) in 1995 and are now unlikely to be optimal. This study aims to estimate new TRISS coefficients using a contemporary database of injured patients presenting to emergency departments in the United States, and to compare these against the MTOS coefficients.---------- Methods: Data were obtained from the National Trauma Data Bank (NTDB) and the NTDB National Sample Project (NSP). TRISS coefficients were estimated using logistic regression. Separate coefficients were derived from complete case and multistage multiple imputation analyses for each of the NTDB and NSP datasets. Associated Ps over Injury Severity Score values were graphed and compared by age (adult ≥ 15 years; pediatric < 15 years) and injury mechanism (blunt; penetrating) groups. The area under the receiver operating characteristic (ROC) curve was used to assess the coefficients’ predictive performance.---------- Results: Overall, 1,072,033 NTDB and 1,278,563 weighted NSP injury events were included, compared with 23,177 used in the original MTOS analyses. Large differences were seen between results from complete case and imputed analyses. For blunt mechanism and adult penetrating mechanism injuries, there were similarities between coefficients estimated on imputed samples, and marked divergences between the associated Ps estimates and those from the MTOS. However, negligible differences existed between ROC curve area estimates, because the overwhelming majority of patients had minor trauma and survived. For pediatric penetrating mechanism injuries, variability in coefficients was large and Ps estimates unreliable.---------- Conclusions: Imputed NTDB coefficients are recommended as the 2009 revision of the TRISS coefficients for blunt mechanism and adult penetrating mechanism injuries. Coefficients for pediatric penetrating mechanism injuries could not be reliably estimated.
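
For context, TRISS computes Ps as a logistic function of the Revised Trauma Score (RTS), the Injury Severity Score (ISS) and an age index, so re-estimating the coefficients amounts to refitting this logistic regression on new data. A minimal sketch with placeholder coefficients (not the MTOS values and not the revised estimates):

```python
import math

def triss_ps(rts, iss, age_index, b):
    """TRISS probability of survival: logistic in RTS, ISS and age index.
    b = (b0, b1, b2, b3); age_index is 0 for age < 55, 1 otherwise."""
    b0, b1, b2, b3 = b
    z = b0 + b1 * rts + b2 * iss + b3 * age_index
    return 1.0 / (1.0 + math.exp(-z))

# Placeholder coefficients for illustration only -- the point of the paper
# is precisely that such coefficients should be re-estimated by logistic
# regression on contemporary NTDB/NSP data.
b_blunt = (-1.25, 0.95, -0.08, -1.9)
print(f"Ps = {triss_ps(rts=7.84, iss=9, age_index=0, b=b_blunt):.3f}")
```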

Relevance:

10.00%

Publisher:

Abstract:

In this thesis an investigation into theoretical models for the formation and interaction of nanoparticles is presented. The work includes a literature review of current models followed by five chapters of original research. This thesis has been submitted in partial fulfilment of the requirements for the degree of doctor of philosophy by publication, and therefore each of the five chapters consists of a peer-reviewed journal article. The thesis concludes with a discussion of what has been achieved during the PhD candidature, the potential applications of this research, and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to standing oscillating drag forces (as would occur in a standing sound wave), and finally the motion of nanoparticles on lattice surfaces in the presence of high temperature gradients. We describe a number of new models for multi-compartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle together may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks/aggregates, and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes that best describe different physical situations; these processes are analysed using a number of mathematical methods. The fragmentation of the network/aggregate is then predicted using combinatorial arguments. Whilst there is some scepticism in the scientific community pertaining to the proposed mechanism of thermal fragmentation, we present compelling evidence in this thesis supporting the currently proposed mechanism and show that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, in this thesis a method of manipulation using acoustic standing waves is investigated. We analyse the effect of frequency and particle size on the ability of a particle to be manipulated by means of a standing acoustic wave, and report the existence of a critical frequency for a particular particle size; this frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that for large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation, owing to the decreasing size of the boundary layer between acoustic nodes. Our model utilises a multiple-time-scale approach to calculate the long-term effects of the standing acoustic field on the particles interacting with the sound. These effects are then combined with the effects of Brownian motion in order to obtain a complete mathematical description of the particle dynamics in such acoustic fields.
Finally, in this thesis, we develop a numerical routine for the description of "thermal tweezers". Currently, the technique of thermal tweezers is predominantly theoretical; however, a handful of successful experiments have demonstrated the effect in practice. Thermal tweezers is the name given to the way in which particles can be manipulated on a lattice surface by careful selection of a heat distribution over the surface. Typically, theoretical simulations of the effect are rather time-consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for simulating the particle distributions pertaining to the thermal tweezers effect uses the Fokker-Planck equation to derive a quick numerical calculation of the effective diffusion constant resulting from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The method outlined in this thesis can produce a large quantity of accurate results on a household PC in a matter of hours, far exceeding what was previously achievable.
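
A minimal finite-volume sketch of this last step, assuming a hypothetical effective diffusion constant D(x) and the "kinetic" form of the diffusion equation, dp/dt = d²(Dp)/dx², under which particles accumulate where the effective mobility is low:

```python
import numpy as np

# Evolve a particle probability density p(x, t) under a spatially varying
# effective diffusion constant D(x), as a stand-in for the constant
# derived from the Fokker-Planck analysis. Grid and D(x) are invented.
L, n, dt, steps = 1.0, 200, 1e-5, 50_000
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]

# Hypothetical D(x): effective mobility suppressed near the lattice centre.
D = 1.0 - 0.8 * np.exp(-((x - 0.5) / 0.1) ** 2)

p = np.full(n, 1.0)
p /= p.sum() * dx                 # normalised initial density
mass0 = p.sum() * dx

for _ in range(steps):
    flux = -np.diff(D * p) / dx   # flux -d(Dp)/dx at interior cell faces
    # Reflecting boundaries: zero flux at both ends of the domain.
    p -= dt * np.diff(np.concatenate(([0.0], flux, [0.0]))) / dx

print("mass conserved:", np.isclose(p.sum() * dx, mass0))
print("density peaks near x =", round(float(x[np.argmax(p)]), 2))
```

The explicit step size satisfies dt·max(D)/dx² < 0.5, the usual stability bound for this scheme, and the zero-flux padding makes mass conservation exact by construction.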

Relevance:

10.00%

Publisher:

Abstract:

Information Retrieval is an important albeit imperfect component of information technologies. The problem of insufficient diversity in retrieved documents is one of the primary issues studied in this research. This study shows that this problem leads to a decrease in precision and recall, the traditional measures of information retrieval effectiveness. This thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is the optimization of retrieval precision after all feedback has been issued, achieved by increasing the diversity of retrieved documents. This study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the “bedrock” of current probabilistic Information Retrieval theory. Neither the proposed approach nor other methods of diversification of retrieved documents from the literature conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (for which it may not have been designed, but for which it is actively used). Retrieval precision of the search session should be optimized with a multistage stochastic programming model to accomplish this aim; however, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, where the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that Information Retrieval is conducted in units of topics. The use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents. The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR; the main reason for this was the insufficient quality of the clusters generated from the TREC collection, which violated the underlying assumption.
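
A toy illustration (not the thesis model) of the diversity problem with ranking purely by estimated relevance probability, and of a simple greedy diversification; all scores and topic labels are invented:

```python
# Each document: (id, estimated probability of relevance, topic cluster).
docs = [
    ("d1", 0.90, "topicA"), ("d2", 0.88, "topicA"), ("d3", 0.86, "topicA"),
    ("d4", 0.70, "topicB"), ("d5", 0.65, "topicC"),
]

# PRP ranking: sort by probability of relevance alone.
prp = sorted(docs, key=lambda d: d[1], reverse=True)

def diversify(pool, penalty=0.3):
    """Greedy re-ranking that penalises topics already covered."""
    ranked, seen, pool = [], set(), list(pool)
    while pool:
        best = max(pool, key=lambda d: d[1] - penalty * (d[2] in seen))
        ranked.append(best); seen.add(best[2]); pool.remove(best)
    return ranked

print("PRP top 3:        ", [d[0] for d in prp[:3]])           # all topicA
print("Diversified top 3:", [d[0] for d in diversify(docs)[:3]])  # covers A, B, C
```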

Relevance:

10.00%

Publisher:

Abstract:

Carlin and Finch (this issue) compare goodwill impairment discount rates used by a sample of large Australian firms with ‘independently’ generated discount rates. Their objective is to determine empirically whether managers opportunistically select goodwill discount rates subsequent to the 2005 introduction of International Financial Reporting Standards (IFRS) in Australia. This is a worthwhile objective given that IFRS introduced an impairment regime, and within this regime discount rate selection plays a key role in goodwill valuation decisions. It is also timely to consider the goodwill valuation issue. Following the recent downturn in the economy, there is a high probability that many firms will be forced to write down impaired goodwill arising from boom-period acquisitions. Hence, evidence of bias in rate selection is likely to be of major concern to investors, policymakers and corporate regulators. Carlin and Finch claim their findings provide evidence of such bias. In this commentary I review the validity of their claims.

Relevance:

10.00%

Publisher:

Abstract:

Automatic recognition of people is an active field of research with important forensic and security applications. In these applications, it is not always possible for the subject to be in close proximity to the system. Voice represents a human behavioural trait which can be used to recognise people in such situations. Automatic Speaker Verification (ASV) is the process of verifying a person's identity through the analysis of their speech, and enables recognition of a subject at a distance over a telephone channel, wired or wireless. A significant amount of research has focussed on the application of Gaussian mixture model (GMM) techniques to speaker verification systems, providing state-of-the-art performance. GMMs are a type of generative classifier trained to model the probability distribution of the features used to represent a speaker. Recently introduced to the field of ASV research is the support vector machine (SVM). An SVM is a discriminative classifier requiring examples from both positive and negative classes to train a speaker model. The SVM is based on margin maximisation, whereby a hyperplane attempts to separate classes in a high-dimensional space. SVMs applied to the task of speaker verification have shown high potential, particularly when used to complement current GMM-based techniques in hybrid systems. This work aims to improve the performance of ASV systems using novel and innovative SVM-based techniques. Research was divided into three main themes: session variability compensation for SVMs; unsupervised model adaptation; and impostor dataset selection. The first theme investigated the differences between the GMM and SVM domains for the modelling of session variability, an aspect crucial for robust speaker verification. Techniques developed to improve the robustness of GMM-based classification were shown to bring about similar benefits to discriminative SVM classification through their integration in the hybrid GMM mean supervector SVM classifier. Further, the domains for the modelling of session variation were contrasted and found to share a number of common factors; however, the SVM domain consistently provided marginally better session variation compensation. Minimal complementary information was found between the techniques, due to the similarities in how they achieve their objectives. The second theme saw the proposal of a novel model for the purpose of session variation compensation in ASV systems. Continuous progressive model adaptation attempts to improve speaker models by retraining them after exploiting all test utterances encountered during normal use of the system. The introduction of the weight-based factor analysis model provided significant performance improvements of over 60% in an unsupervised scenario. SVM-based classification was then integrated into the progressive system, providing further performance benefits over the GMM counterpart. Analysis demonstrated that SVMs also hold several characteristics beneficial to the task of unsupervised model adaptation, prompting further research in the area. In pursuing the final theme, an innovative background dataset selection technique was developed. This technique selects the most appropriate subset of examples from a large and diverse set of candidate impostor observations for use as the SVM background, by exploiting the SVM training process. This selection was performed on a per-observation basis so as to overcome the shortcomings of the traditional heuristic-based approach to dataset selection.
Results demonstrate that the approach provides performance improvements over both the use of the complete candidate dataset and the best heuristically selected dataset, whilst being only a fraction of the size. The refined dataset was also shown to generalise well to unseen corpora and to be highly applicable to the selection of the impostor cohorts required in alternative techniques for speaker verification.
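
A minimal sketch of the hybrid GMM mean supervector SVM idea underpinning this work. All data, dimensions and component counts are placeholders, and a real system would MAP-adapt a universal background model (UBM) per utterance rather than refit GMMs from scratch as done here:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(2)
N_COMP, N_DIM = 4, 12

# Stand-in background data for a universal background model.
ubm = GaussianMixture(n_components=N_COMP, covariance_type='diag')
ubm.fit(rng.normal(size=(4000, N_DIM)))

def supervector(features):
    """Stack the component means of a GMM fitted to one utterance."""
    gmm = GaussianMixture(n_components=N_COMP, covariance_type='diag',
                          means_init=ubm.means_).fit(features)
    return gmm.means_.ravel()                  # length N_COMP * N_DIM

# Toy target/impostor utterances -> supervectors -> linear SVM.
target   = [supervector(rng.normal(0.5, 1.0, (300, N_DIM))) for _ in range(10)]
impostor = [supervector(rng.normal(0.0, 1.0, (300, N_DIM))) for _ in range(10)]
X = np.vstack(target + impostor)
y = [1] * 10 + [0] * 10

svm = SVC(kernel='linear').fit(X, y)
print(svm.predict([supervector(rng.normal(0.5, 1.0, (300, N_DIM)))]))  # expect [1]
```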

Relevance:

10.00%

Publisher:

Abstract:

There is a need in industry for a commodity polyethylene film with controllable degradation properties that will degrade in an environmentally neutral way, for applications such as shopping bags and packaging film. Additives such as starch have been shown to accelerate the degradation of plastic films; however, control of degradation is required so that the film retains its mechanical properties during storage and use, and then degrades when no longer required. By the addition of a photocatalyst it is hoped that the polymer film will break down with exposure to sunlight. Furthermore, it is desired that the polymer film will degrade in the dark after a short initial exposure to sunlight. Research has been undertaken into the photo- and thermo-oxidative degradation processes of 25 µm thick LLDPE (linear low density polyethylene) film containing titania from different manufacturers. Films were aged in a suntest or in an oven at 50 °C, and the oxidation product formation was followed using IR spectroscopy. Degussa P25, Kronos 1002, and various organic-modified and doped titanias of the types Sachtleben Hombitan and Huntsman Tioxide incorporated into LLDPE films were assessed for photoactivity. Degussa P25 was found to be the most photoactive with UVA and UVC exposure. Surface modification of titania was found to reduce photoactivity. Crystal phase is thought to be among the most important factors when assessing the photoactivity of titania as a photocatalyst for degradation. Pre-irradiation of the film containing 3% Degussa P25 titania with UVA or UVC for 24 hours prior to aging in an oven resulted in embrittlement in ca. 200 days. The multivariate data analysis technique PCA (principal component analysis) was used as an exploratory tool to investigate the IR spectral data. Oxidation products formed in similar relative concentrations across all samples, confirming that titania was catalysing the oxidation of the LLDPE film without changing the oxidation pathway. PCA was also employed to compare rates of degradation in different films, and enabled the discovery of water vapour trapped inside cavities formed by titania-catalysed oxidation. Imaging ATR/FTIR spectroscopy with high lateral resolution was used in a novel experiment to examine the heterogeneous nature of the oxidation of a model polymer compound caused by the presence of titania particles. A model polymer containing Degussa P25 titania was solvent cast onto the internal reflection element of the imaging ATR/FTIR instrument and its oxidation under UVC was examined over time. Sensitisation of 5 µm domains by titania resulted in areas of relatively high oxidation product concentration. The suitability of transmission IR with a synchrotron light source for the study of polymer film oxidation was assessed at the Australian Synchrotron in Melbourne, Australia. Challenges such as interference fringes and poor signal-to-noise ratio need to be addressed before this can become a routine technique.
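
A minimal sketch of the PCA-on-IR-spectra idea described above, using synthetic spectra in which a carbonyl-like oxidation band near 1715 cm⁻¹ grows with ageing time:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)

# Synthetic "spectra": a Gaussian oxidation band growing with ageing time,
# plus noise. Real inputs would be the measured IR absorbance spectra.
wavenumbers = np.linspace(1500.0, 1900.0, 400)
band = np.exp(-((wavenumbers - 1715.0) / 15.0) ** 2)
ages = np.linspace(0.0, 1.0, 30)                    # arbitrary ageing scale
spectra = np.outer(ages, band) + rng.normal(0, 0.01, (30, 400))

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra)

# PC1 should track the growth of the oxidation band with ageing time.
corr = np.corrcoef(scores[:, 0], ages)[0, 1]
print(f"PC1 vs ageing correlation: {corr:+.3f}")
print(f"variance explained by PC1: {pca.explained_variance_ratio_[0]:.1%}")
```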

Relevance:

10.00%

Publisher:

Abstract:

Principal Topic: Entrepreneurship is key to employment, innovation and growth (Acs & Mueller, 2008), and as such has been the subject of tremendous research in both the economic and management literatures since Solow (1957), Schumpeter (1934, 1943) and Penrose (1959). The presence of entrepreneurs in the economy is a key factor in the success or failure of countries' growth (Audretsch and Thurik, 2001; Dejardin, 2001). Further studies focus on the conditions of existence of entrepreneurship; the influential factors invoked are historical, cultural, social, institutional, or purely economic (North, 1997; Thurik, 1996, 1999). Of particular interest, beyond the reasons behind the existence of entrepreneurship, are entrepreneurial survival and good ''performance'' factors. Using cross-country firm data analysis, La Porta & Shleifer (2008) confirm that informal micro-businesses provide on average half of all economic activity in developing countries. They find that these are utterly unproductive compared to formal firms, and conclude that the informal sector serves as a social security net ''keep[ing] millions of people alive, but disappearing over time'' (abstract). Robison (1986) and Hill (1996, 1997) posit that the Indonesian government under Suharto always pointed to the lack of indigenous entrepreneurship, thereby motivating the nationalisation of all industries. Furthermore, the same literature also points to the fact that small businesses were mostly left out of development programmes because they were supposedly less productive, and had less productivity potential, than larger ones. Vial (2008) challenges this view and shows that small firms represent about 70% of firms and 12% of total output, but contribute 25% of total factor productivity growth on average over the period 1975-94 in the industrial sector (Table 10, p.316). ---------- Methodology/Key Propositions: A review of the empirical literature points to several under-researched questions. Firstly, we assess whether there is evidence of small family-business entrepreneurship in Indonesia. Secondly, we examine and present the characteristics of these enterprises, along with the size of the sector and its dynamics. Thirdly, we study whether these enterprises underperform compared to the larger-scale industrial sector, as is suggested in the literature. We reconsider performance measurement for micro family-owned businesses, suggesting that, besides productivity measures, performance could be appraised by both the survival probability of the firm and the amount of household asset formation. We compare the survival probabilities of micro family-owned and larger industrial firms after the 1997 crisis and their capital productivity, then compare the household assets of families involved in business with those of families who are not. Finally, we examine human and social capital as moderators of enterprise performance. In particular, we assess whether a higher level of education and community participation have an effect on the likelihood of running a family business, and whether they have an impact on household asset levels. We use the IFLS database compiled and published by the RAND Corporation. The data form a rich community, household and individual panel dataset in four waves: 1993, 1997, 2000 and 2007. We focus here on the 1997 and 2000 waves in order to investigate entrepreneurial behaviour in turbulent times, i.e. the 1997 Asian crisis. We use aggregated individual data and focus on household data in order to study micro family-owned businesses.
IFLS data cover roughly 7,600 households in 1997 and over 10,000 households in 2000, with about 95% of 1997 households re-interviewed in 2000. Households were interviewed in 13 of the 27 provinces as defined before 2001; those 13 provinces were targeted because they account for 83% of the population. A full description of the data is provided in Frankenberg and Thomas (2000) and Strauss et al. (2004). We deflate all monetary values in Rupiah with the World Development Indicators Consumer Price Index, base 100 in 2000. ---------- Results and Implications: We find that in Indonesia entrepreneurship is widespread and two thirds of households hold one or several family businesses. In rural areas in 2000, 75% of households ran one or several businesses. The proportion of households holding both a farm and a non-farm business is higher in rural areas, underlining the reliance of rural households on self-employment, especially after the crisis. Those businesses come in various sizes, from very small to larger ones. The median business production value represents less than the annual national minimum wage. Figures show that at least 75% of farm businesses produce less than the annual minimum wage, with non-farm businesses more often reaching the minimum wage. However, this is only one part of the story, as production is not the only ''output'' or effect of the business. We show that the survival rate of those businesses ranges between 70 and 82% after the 1997 crisis, which contrasts with the 67% survival rate for the formal industrial sector (Ter Wengel & Rodriguez, 2006). Micro family-owned businesses might be relatively small in terms of production, but they also provide stability in times of crisis. For those businesses that provide business asset figures, we show that capital productivity is fairly high, with rates that are ten times higher for non-farm businesses. Results show that households running a business have larger family assets, and households are better off in urban areas. We run a panel logit model (sketched below) in order to test the effect of human and social capital on the existence of businesses among households. We find that non-farm businesses are more likely to appear in households with higher human and social capital situated in urban areas. Farm businesses are more likely to appear in lower human capital and rural contexts, while still being supported by community participation. The estimation of our panel data model confirms that households are more likely to have higher family assets if situated in urban areas; that the higher the education level, the larger the assets; and that running a business increases the likelihood of having larger assets. This is especially true for non-farm businesses, which have a clearly larger and more significant effect on assets than farm businesses. Finally, social capital in the form of community participation also has a positive effect on assets. Those results confirm the existence of a strong entrepreneurship culture among Indonesian households. Investigating survival rates also shows that those businesses are quite stable, even in the face of a violent crisis such as that of 1997, and as a result can provide a safety net. Finally, considering household assets (the returns of the business to the household) rather than profit or productivity (the returns of the business to itself) shows that households running a business are better off.
While we demonstrate that human and social capital are key to business existence, survival and performance, these results open avenues for further research regarding the factors that could hamper the growth of those businesses in terms of output and employment.
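
A minimal sketch of the kind of logit referred to above, on synthetic data with invented coefficients and with cluster-robust standard errors standing in for the panel structure; it is not the IFLS specification itself:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "household": np.arange(n) % 1000,          # two waves per household
    "education_yrs": rng.integers(0, 16, n),   # human capital proxy
    "community_part": rng.integers(0, 2, n),   # social capital proxy
    "urban": rng.integers(0, 2, n),
})

# Simulated outcome with invented coefficients, for illustration only.
logit_p = -2.0 + 0.15 * df.education_yrs + 0.5 * df.community_part + 0.8 * df.urban
df["runs_business"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))

X = sm.add_constant(df[["education_yrs", "community_part", "urban"]])
res = sm.Logit(df["runs_business"].astype(float), X).fit(
    disp=False, cov_type="cluster", cov_kwds={"groups": df["household"]})
print(res.params.round(2))
```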

Relevance:

10.00%

Publisher:

Abstract:

In this thesis we are interested in financial risk, and the instrument we want to use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task, and it is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, sometimes controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness: is skewness constant, or is there some significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach for modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently; in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs are used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs. Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We use local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the DGP underlying the process and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Saikkonen, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. The GLDs further suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices, ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
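
A minimal sketch of the moving-window idea described above, using the empirical window percentile in place of a fitted generalised lambda distribution, and simulated heavy-tailed returns rather than the ASX/S&P/FT series:

```python
import numpy as np

rng = np.random.default_rng(5)
returns = rng.standard_t(df=4, size=2000) * 0.01   # heavy-tailed daily returns

WINDOW, LEVEL = 250, 0.99       # one trading year, 99% confidence

# Rolling one-day-ahead VaR: the (1 - LEVEL) quantile of the window,
# negated so that VaR is reported as a positive loss threshold.
var = np.array([
    -np.quantile(returns[t - WINDOW:t], 1.0 - LEVEL)
    for t in range(WINDOW, len(returns))
])

# Backtest: how often did the next-day loss exceed the VaR estimate?
exceedances = (-returns[WINDOW:] > var).mean()
print(f"mean 99% VaR: {var.mean():.4f}, exceedance rate: {exceedances:.3f}")
```

With a well-calibrated 99% VaR the exceedance rate should be close to 0.01; a fitted GLD would replace the empirical quantile to smooth the tail estimate.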

Relevance:

10.00%

Publisher:

Abstract:

The sinking of the Titanic in April 1912 took the lives of 68 percent of the people aboard. Who survived? It was women and children who had a higher probability of being saved, not men. Likewise, people traveling in first class had a better chance of survival than those in second and third class. British passengers were more likely to perish than members of other nations. This extreme event represents a rare case of a well-documented life and death situation where social norms were enforced. This paper shows that economic analysis can account for human behavior in such situations.

Relevance:

10.00%

Publisher:

Abstract:

During periods of market stress, electricity prices can rise dramatically. Electricity retailers cannot pass these extreme prices on to customers because of retail price regulation, so improved prediction of these price spikes is important for risk management. This paper builds a time-varying-probability Markov-switching model of Queensland electricity prices, aimed particularly at forecasting price spikes. Variables capturing demand and weather patterns are used to drive the transition probabilities. Unlike traditional Markov-switching models that assume normality of the prices in each state, the model presented here uses a generalised beta distribution to allow for the skewness in the distribution of electricity prices during high-price episodes.
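
A minimal simulation sketch of this model family: a two-regime Markov-switching price process in which the probability of entering the spike regime is a logistic function of demand. All functional forms and parameter values are illustrative assumptions, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 1000
# Toy demand series: seasonal cycle plus noise.
demand = 0.5 + 0.3 * np.sin(2 * np.pi * np.arange(T) / 365) + rng.normal(0, 0.1, T)

def p_to_spike(d):
    """Time-varying transition probability, logistic in demand."""
    return 1.0 / (1.0 + np.exp(-(-4.0 + 5.0 * d)))

prices, state = np.empty(T), 0          # state 0 = normal, 1 = spike
for t in range(T):
    if state == 0:
        state = int(rng.random() < p_to_spike(demand[t]))
    else:
        state = int(rng.random() < 0.6)  # spikes are short-lived
    if state == 0:
        prices[t] = rng.normal(30.0, 5.0)           # $/MWh, normal regime
    else:
        # Skewed high-price regime (stand-in for the generalised beta).
        prices[t] = 100.0 + 900.0 * rng.beta(2.0, 5.0)

print(f"spike frequency: {(prices > 100).mean():.3f}, max price: {prices.max():.0f}")
```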

Relevance:

10.00%

Publisher:

Abstract:

This paper illustrates the prediction of opponent behaviour in a competitive, highly dynamic, multi-agent and partially observable environment, namely RoboCup small size league robot soccer. The performance is illustrated in the context of the highly successful robot soccer team, the RoboRoos. The project is broken into three tasks: the classification of behaviours, the modelling and prediction of behaviours, and the integration of the predictions into the existing planning system. A probabilistic approach is taken to dealing with the uncertainty in the observations and to representing the uncertainty in the prediction of the behaviours. Results are shown for a classification system using a Naïve Bayesian network that determines the opponent's current behaviour; these results are compared to an expert-designed fuzzy behaviour classification system. The paper illustrates how the modelling system uses the information from behaviour classification to produce probability distributions that model the manner in which the opponents perform their behaviours. These probability distributions are shown to match well with the existing multi-agent planning system (MAPS) that forms the core of the RoboRoos system.
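
A minimal sketch of the behaviour-classification step, using a Gaussian naive Bayes classifier over invented opponent features (not the RoboRoos feature set); the posterior over behaviours is the kind of probabilistic output a planner could consume rather than a hard label:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(7)

def simulate(behaviour, n):
    """Toy observations: [speed, distance_to_ball, heading_to_goal]."""
    centres = {"attack": (1.0, 0.2, 0.1), "defend": (0.5, 0.8, 1.2),
               "mark":   (0.3, 0.3, 0.8)}
    return rng.normal(centres[behaviour], 0.15, size=(n, 3))

behaviours = ["attack", "defend", "mark"]
X = np.vstack([simulate(b, 200) for b in behaviours])
y = np.repeat(behaviours, 200)

clf = GaussianNB().fit(X, y)

# Posterior distribution over behaviours for a new observation.
obs = [[0.9, 0.25, 0.15]]
print(dict(zip(clf.classes_, clf.predict_proba(obs)[0].round(3))))
```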