Abstract:
Analytically or computationally intractable likelihood functions can arise in complex statistical inference problems, making them inaccessible to standard Bayesian methods. Approximate Bayesian computation (ABC) addresses such problems by replacing direct likelihood evaluations with repeated sampling from the model. ABC methods have predominantly been applied to parameter estimation and less often to model choice, owing to the added difficulty of handling multiple model spaces. The ABC algorithm proposed here addresses model choice by extending Fearnhead and Prangle (2012, Journal of the Royal Statistical Society, Series B 74, 1–28), in which the posterior mean of the model parameters, estimated through regression, formed the summary statistics used in the discrepancy measure. An additional stepwise multinomial logistic regression is performed on the model indicator variable in the regression step, and the estimated model probabilities are incorporated into the set of summary statistics for model choice. A reversible jump Markov chain Monte Carlo step is also included in the algorithm to increase model diversity and ensure thorough exploration of the model space. The algorithm was applied to a validating example to demonstrate its robustness across a wide range of true model probabilities. Its subsequent use in three pathogen transmission examples of varying complexity illustrates its utility in inferring preference for particular transmission models for the pathogens.
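The regression-based summary statistics at the heart of this approach can be sketched in a few lines. The following is a minimal illustration of the Fearnhead–Prangle idea for a single model, using an invented normal location model (the paper's stepwise multinomial logistic regression and reversible jump step are not reproduced here): a pilot run regresses the parameter on data features, and the fitted values then serve as summary statistics in ABC rejection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pilot run: draw (theta, data) pairs from the prior and model.
# Illustrative model (not from the paper): y_i ~ Normal(theta, 1), n = 5.
n_pilot, n_obs = 2000, 5
theta_pilot = rng.normal(0.0, 2.0, n_pilot)              # prior draws
data_pilot = rng.normal(theta_pilot[:, None], 1.0, (n_pilot, n_obs))

# Fearnhead-Prangle regression step: fit E[theta | data] linearly from
# data features; the fitted value is then used as the summary statistic.
X = np.column_stack([np.ones(n_pilot), data_pilot])      # intercept + raw data
beta, *_ = np.linalg.lstsq(X, theta_pilot, rcond=None)

# ABC rejection using the learned summary statistic.
y_obs = rng.normal(1.5, 1.0, n_obs)                      # "observed" data (theta = 1.5)
s_obs = np.concatenate([[1.0], y_obs]) @ beta

n_abc = 20000
theta_prop = rng.normal(0.0, 2.0, n_abc)                 # proposals from the prior
data_prop = rng.normal(theta_prop[:, None], 1.0, (n_abc, n_obs))
s_prop = np.column_stack([np.ones(n_abc), data_prop]) @ beta
accepted = theta_prop[np.abs(s_prop - s_obs) < 0.1]      # keep small discrepancies only
# accepted now approximates the posterior; its mean should sit near theta = 1.5
```

For model choice, the same pilot regression would additionally fit a multinomial logistic regression to the model indicator, and the fitted model probabilities would be appended to the summary vector.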
Abstract:
Over the past 20 years the labour market, workforce and work organisation of most, if not all, industrialised countries have been significantly refashioned by the increased use of more flexible work arrangements, variously labelled as precarious employment or contingent work. There is now a substantial and growing body of international evidence that many of these arrangements are associated with a significant deterioration in occupational health and safety (OHS), using a range of measures such as injury rates, disease, hazard exposures and work-related stress. Moreover, there is an emerging body of evidence that these arrangements pose particular problems for conventional regulatory regimes. Recognition of these problems has aroused the concern of policy makers - especially in Europe, North America and Australia - and a number of responses have been adopted in terms of modifying legislation, producing new guidance material and codes of practice, and revising enforcement practices. This article describes one such initiative in Australia with regard to home-based clothing workers. The regulatory strategy developed in one Australian jurisdiction (and now being ‘exported’ into others) seeks to counter this process via contractual tracking mechanisms that follow the work, tie in liability and shift overarching legal responsibility to the top of the supply chain. The process also entails the integration of minimum standards relating to wages, hours and working conditions; OHS; and access to workers’ compensation. While home-based clothing manufacture represents a very old type of ‘flexible’ work arrangement, it is one that regulators have found especially difficult to address.
Further, the elaborate multi-tiered subcontracting and diffuse work locations found in this industry are also characteristic of newer forms of contingent work in other industries (such as some telework), as are the regulatory challenges they pose (such as the tendency of elaborate supply chains to attenuate and fracture statutory responsibilities, at least in terms of the attitudes and behaviour of those involved).
Abstract:
Distributed computation and storage have been widely used for processing big data sets. For many big data problems, with data sizes growing rapidly, the distribution of computing tasks and related data can greatly affect the performance of the computing system. In this paper, a distributed computing framework is presented for high-performance computing of All-to-All Comparison Problems. A data distribution strategy is embedded in the framework to reduce storage space and balance computing load. Experiments are conducted to demonstrate the effectiveness of the developed approach; they show that about 88% of the ideal performance capacity is achieved across multiple machines using the approach presented in this paper.
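One simple way to see how a data distribution strategy can trade storage for local pair coverage in an all-to-all comparison is the group-pairing scheme sketched below (an illustrative scheme, not necessarily the exact strategy developed in the paper): split the data into g groups and give each worker the union of two groups, so that every pair of items is co-located on at least one worker while each worker stores only 2/g of the data.

```python
from itertools import combinations

def distribute(items, n_groups):
    """Group-pairing distribution: one worker per unordered pair of groups.
    Every pair of items ends up co-located on at least one worker, while
    each worker stores only 2/n_groups of the data."""
    groups = [items[i::n_groups] for i in range(n_groups)]
    return [groups[a] + groups[b] for a, b in combinations(range(n_groups), 2)]

def pairs_covered(workers):
    """All unordered item pairs that share a worker (comparable locally)."""
    covered = set()
    for w in workers:
        covered |= {frozenset(p) for p in combinations(w, 2)}
    return covered

items = list(range(12))
workers = distribute(items, 4)               # C(4, 2) = 6 workers, 6 items each
all_pairs = {frozenset(p) for p in combinations(items, 2)}
print(pairs_covered(workers) == all_pairs)   # True: every comparison is local
```

With g groups this uses C(g, 2) workers; larger g lowers per-worker storage at the cost of more workers, which is the kind of storage/load trade-off the framework's distribution strategy balances.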
Abstract:
Evaluating agency theory and optimal contracting theory views of corporate philanthropy, we find that as corporate giving increases, shareholders reduce their valuation of firm cash holdings. Dividend increases following the 2003 Tax Reform Act are associated with reduced corporate giving. Using a natural experiment, we find that corporate giving is positively associated with CEO charity preferences and negatively associated with CEO shareholdings and corporate governance quality. Evidence from CEO-affiliated charity donations, market reactions to insider-affiliated donations, the relation of giving to CEO compensation, and firm contributions to director-affiliated charities indicates that corporate donations advance CEO interests and suggests misuse of corporate resources that reduces firm value.
Abstract:
The MOCVD assisted formation of nested WS2 inorganic fullerenes (IF-WS2) was performed by enhancing surface diffusion with iodine, and fullerene growth was monitored by taking TEM snapshots of intermediate products. The internal structure of the core-shell nanoparticles was studied using scanning electron microscopy (SEM) after cross-cutting with a focused ion beam (FIB). Lamellar reaction intermediates were found occluded in the fullerene particles. In contrast to carbon fullerenes, layered metal chalcogenides prefer the formation of planar, plate-like structures where the dangling bonds at the edges are stabilized by excess S atoms. The effects of the reaction and annealing temperatures on the composition and morphology of the final product were investigated, and the strength of the WS2 shell was measured by intermittent contact-mode AFM. The encapsulated lamellar structures inside the hollow spheres may lead to enhanced tribological activities.
Abstract:
This paper explores possible ways in which design thinking can be used to understand the issues involved in wicked problems. Following brief reviews of both the design thinking and wicked problems literature, we offer a synthesis of these two areas. The paper maps the various thinking styles inherent in design thinking methodologies and approaches onto the phases of wicked problems. We create links between design thinking and cognitive mechanisms such as thinking styles, simple heuristics and schemas that can deepen understanding and support effective decision making on the wicked problems in our complex environments.
Abstract:
This thesis proposes three novel models that extend the statistical methodology for motor unit number estimation, a clinical neurology technique. Motor unit number estimation is important in the treatment of degenerative muscular diseases and, potentially, spinal injury. Additionally, a recent and largely untested statistic for statistical model choice is found to be a practical alternative for larger datasets. The existing methods for dose finding in dual-agent clinical trials are found to be suitable only for designs of modest dimensions. The model choice case study is the first of its kind and contains interesting results using so-called unit information prior distributions.
Abstract:
A new mesh adaptivity algorithm that combines a posteriori error estimation with a bubble-type local mesh generation (BLMG) strategy for elliptic differential equations is proposed. The size function used in the BLMG is defined at each vertex during the adaptive process, based on the obtained error estimator. To avoid excessive coarsening and refining at each iterative step, two factor thresholds are introduced into the size function. The advantages of the BLMG-based adaptive finite element method over other known methods are as follows: refining and coarsening are handled smoothly within the same framework; the local a posteriori error estimation is easy to implement through the adjacency list of the BLMG method; and, at all levels of refinement, the updated triangles remain very well shaped, even if the mesh size at any particular refinement level varies by several orders of magnitude. Several numerical examples with singularities for elliptic problems, in which explicit error estimators are used, verify the efficiency of the algorithm. Analysis of the parameters introduced in the size function shows that the algorithm has good flexibility.
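The role of the two factor thresholds in the size function can be illustrated with a toy sketch. The exponent and threshold values below are placeholders, not taken from the paper: the new vertex size follows an equidistribution heuristic driven by the local error estimator, and the two thresholds cap how much a single adaptive step may refine or coarsen.

```python
import math

def target_size(h_old, eta, eta_target, grow_max=2.0, shrink_max=0.5):
    """New mesh size at a vertex from a local error estimator eta.
    Illustrative rule: h_new ~ h_old * sqrt(eta_target / eta), clamped by
    two factor thresholds so one adaptive step never coarsens or refines
    too aggressively (placeholder values, not the paper's)."""
    if eta <= 0:
        factor = grow_max                     # no error -> allow coarsening
    else:
        factor = math.sqrt(eta_target / eta)  # equidistribution heuristic
    factor = max(shrink_max, min(grow_max, factor))
    return h_old * factor

# High error shrinks the element, low error lets it grow, both bounded:
print(target_size(0.1, eta=4.0, eta_target=1.0))   # 0.05 (hits the shrink cap)
print(target_size(0.1, eta=0.01, eta_target=1.0))  # 0.2  (hits the growth cap)
```

Without the two caps, a vertex with a tiny estimator would coarsen by an order of magnitude in one step, which is exactly the oscillatory over-refining/over-coarsening the thresholds are introduced to prevent.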
Abstract:
Translocation is an increasingly popular conservation tool from which a wide range of taxa have benefited. However, to our knowledge, bats have not been translocated successfully. Bats differ behaviourally, morphologically and physiologically from the taxa for which translocation theory has been developed, so existing guidelines may not be directly transferable. We review previous translocations of bats and discuss characteristics of bats that may require special consideration during translocation. Their vagility and homing ability, coloniality, roost requirements, potential ability to transmit diseases, susceptibility to anthropogenic impacts, and cryptic nature have implications for establishing populations, for the effects of these populations on the release site, and for the ability to monitor translocation success following release. We hope that our discussion of potential problems will supplement the existing, more generic guidelines and provide a starting point for the planning of bat translocations.
Abstract:
Under certain conditions, the mathematical models governing the melting of nano-sized particles predict unphysical results, which suggests these models are incomplete. This thesis studies the addition of different physical effects to these models, using analytic and numerical techniques to obtain realistic and meaningful results. In particular, the mathematical "blow-up" of solutions to ill-posed Stefan problems is examined, as is the regularisation of this blow-up via kinetic undercooling. Other effects, such as surface tension, density change and size-dependent latent heat of fusion, are also analysed.
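Schematically, for a melting spherical particle of radius R(t), surface tension and kinetic undercooling enter through the interface temperature condition alongside the usual Stefan condition. The form below is a generic nondimensional sketch, not the thesis's exact formulation; signs and scalings depend on the chosen nondimensionalisation, with σ and ε standing for the surface-tension (Gibbs–Thomson) and kinetic-undercooling parameters:

```latex
u\big(R(t),t\big) = -\frac{\sigma}{R(t)} - \epsilon\,\frac{\mathrm{d}R}{\mathrm{d}t},
\qquad
\frac{\mathrm{d}R}{\mathrm{d}t} = -\left.\frac{\partial u}{\partial r}\right|_{r=R(t)}
```

With ε = 0 the curvature term drives the interface temperature down without bound as R shrinks, which is the route to the finite-time blow-up examined in the thesis; a positive ε penalises fast interface motion and regularises the problem.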
Abstract:
Strain-based failure criteria have several advantages over stress-based failure criteria: they can account for elastic and inelastic strains, they utilise direct, observable effects instead of inferred effects (strain gauges vs. stress estimates), and they can model complete stress-strain curves, including pre-peak non-linear elasticity and post-peak strain weakening. In this study, a strain-based failure criterion derived from thermodynamic first principles, utilising the concepts of continuum damage mechanics, is presented. Furthermore, the implementation of this failure criterion in a finite-element simulation is demonstrated and applied to the stability of underground coal mine pillars. In numerical studies, pillar strength is usually expressed in terms of critical stresses or stress-based failure criteria, where scaling with pillar width and height is common. Previous publications have employed the finite-element method for pillar stability analysis using stress-based failure criteria such as Mohr-Coulomb and Hoek-Brown, or stress-based scalar damage models. A novel constitutive material model, which takes into consideration anisotropy as well as elastic strain and damage as state variables, has been developed and is presented in this paper. The damage threshold and its evolution are strain-controlled, and coupling of the state variables is achieved through the damage-induced degradation of the elasticity tensor. This material model is implemented in the finite-element software ABAQUS and can be applied to 3D problems. Initial results show that the new material model is capable of describing the non-linear behaviour commonly observed in geomaterials before peak strength is reached, as well as post-peak strain softening. Furthermore, it is demonstrated that the model can account for directional dependency of failure behaviour (i.e. anisotropy) and has the potential to be extended to environmental controls such as temperature or moisture.
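The damage-induced degradation of the elasticity tensor can be sketched with a scalar (isotropic) toy version. The paper's model is anisotropic and thermodynamically derived, so this is only a schematic with invented threshold values: damage D grows once an equivalent strain passes a strain-controlled threshold, and the stiffness degrades as (1 - D) times the undamaged tensor.

```python
import numpy as np

def damaged_stiffness(C0, eps, eps0=1e-3, eps_f=5e-3):
    """Scalar-damage sketch (the paper's model is anisotropic):
    damage D grows once an equivalent strain passes the strain-controlled
    threshold eps0, reaching 1 at eps_f, and degrades the elasticity
    tensor as (1 - D) * C0. Threshold values here are invented."""
    eps_eq = np.sqrt(np.sum(eps**2))                    # simple equivalent strain
    if eps_eq <= eps0:
        D = 0.0                                         # below threshold: elastic
    else:
        D = min(1.0, (eps_eq - eps0) / (eps_f - eps0))  # linear softening law
    return (1.0 - D) * C0, D

C0 = np.eye(3) * 50e9                                   # placeholder stiffness (Pa)
C, D = damaged_stiffness(C0, np.array([2e-3, 0.0, 0.0]))
print(round(D, 2))  # 0.25
```

Because D is monotone in the equivalent strain, the stress response rises, peaks, and then softens as strain grows, which is the qualitative pre-peak/post-peak behaviour the paper's model captures; the anisotropic version replaces the scalar (1 - D) factor with a direction-dependent degradation of the tensor components.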
Abstract:
Neuroimaging research has shown localised brain activation in response to different facial expressions. This, along with the finding that schizophrenia patients perform poorly in recognising negative emotions, has raised the suggestion that patients display an emotion-specific impairment. We propose that this asymmetry in performance reflects gradations in task difficulty, rather than aberrant processing in neural pathways subserving the recognition of specific emotions. A neural network model is presented which classifies facial expressions on the basis of measurements derived from human faces. After training, the network showed an accuracy pattern closely resembling that of healthy subjects. Lesioning the network led to an overall decrease in its discriminant capacity, with the greatest decrease in accuracy for fear, disgust and anger stimuli. This implies that the differential pattern of impairment in schizophrenia patients can be explained without postulating impaired processing modules specific to negative emotion recognition.
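The lesioning experiment can be mimicked in miniature. The stand-in below is entirely hypothetical: a fixed linear classifier with invented class prototypes and weights replaces the study's trained network and facial measurements. The point is only to show the procedure: measure baseline accuracy, zero a random fraction of the weights, and re-measure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in: a fixed linear "network" classifying 3 classes
# from 4 measurements (the study uses facial measurements, more classes,
# and a trained network).
W = np.array([[ 2.0, -1.0, 0.0,  0.5],
              [-1.0,  2.0, 0.5,  0.0],
              [ 0.0,  0.5, 2.0, -1.0]])

def classify(X, W):
    return np.argmax(X @ W.T, axis=1)

# Synthetic samples clustered around invented class prototypes.
protos = np.array([[1, 0, 0, 0.2], [0, 1, 0.2, 0], [0, 0.2, 1, 0]], float)
labels = np.repeat(np.arange(3), 200)
X = protos[labels] + rng.normal(0.0, 0.15, (600, 4))

def accuracy(W):
    return float(np.mean(classify(X, W) == labels))

base = accuracy(W)

# "Lesion" the network: zero a random 40% of the weights, then re-measure.
mask = rng.random(W.shape) > 0.4
lesioned = accuracy(W * mask)
# Lesioning typically lowers accuracy, and in the study the drop was
# largest for the hardest (negative-emotion) classes.
```

Repeating the lesion at increasing severities, and recording per-class rather than overall accuracy, reproduces the study's key observation that classes with the smallest discriminative margins degrade first.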