163 results for latent fingermarks
Abstract:
Education is often viewed as a key approach to addressing sexual-health issues; the current concern is the burgeoning HIV/AIDS epidemic. This ethnographic study investigates the gender practices associated with high-risk sexual behaviour in Papua New Guinea (PNG) as viewed by educators there. A number of practices, including gender inequality and associated sexual behaviours, were highlighted by male and female participants as escalating PNG’s HIV/AIDS epidemic. The study finds that although participants were well informed about HIV/AIDS, they held varying beliefs concerning the prevailing gender/sexual issues involved in escalating high-risk behaviour and how to address the problem. The study further examines the behavioural beliefs and intentions of the educators themselves. Within the data, a number of underpinning factors, pertaining to gender, education and life experience, were found to be related to participants’ behavioural beliefs and intentions towards embracing change with regard to behaviours associated with gender equality in PNG. These factors appeared to encourage participants to adopt healthier gender and sexual behavioural intentions and, arguably, could provide the basis for ways to help address the gender inequality and high-risk behaviours associated with HIV/AIDS in PNG.
Abstract:
Motor unit number estimation (MUNE) is a method that aims to provide a quantitative indicator of the progression of diseases that lead to loss of motor units, such as motor neurone disease. However, the development of a reliable, repeatable and fast real-time MUNE method has hitherto proved elusive. Ridall et al. (2007) implement a reversible jump Markov chain Monte Carlo (RJMCMC) algorithm to produce a posterior distribution for the number of motor units, using a Bayesian hierarchical model that takes into account biological information about motor unit activation. However, we find that this approach can be unreliable for some datasets, since it can suffer from poor cross-dimensional mixing. Here we focus on improved inference by marginalising over latent variables to create the likelihood. In particular, we explore how this can improve RJMCMC mixing and investigate alternative approaches that utilise the likelihood (e.g. DIC (Spiegelhalter et al., 2002)). For this model, the marginalisation is over latent variables which, for a larger number of motor units, involves an intractable summation over all combinations of a set of latent binary variables whose joint sample space increases exponentially with the number of motor units. We provide a tractable and accurate approximation for this quantity and also investigate simulation approaches incorporated into RJMCMC using results of Andrieu and Roberts (2009).
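The intractable summation described in this abstract can be made concrete with a small sketch. The following Python enumeration is illustrative, not the authors' model: the unit amplitudes, firing probabilities and Gaussian noise level are hypothetical placeholders, and the exact sum over all 2^N firing patterns is only feasible for small N, which is precisely why an approximation is needed.

```python
import itertools
import math

def marginal_likelihood(y, amps, probs, sigma):
    """Exact marginal likelihood of one observed response y, summing over
    all 2^N binary firing combinations of N motor units.
    amps[i]  -- amplitude contributed by unit i if it fires (hypothetical)
    probs[i] -- firing probability of unit i (hypothetical)
    Cost grows as 2^N, hence the need for an approximation at larger N."""
    n = len(amps)
    total = 0.0
    for s in itertools.product([0, 1], repeat=n):
        # prior probability of this binary firing pattern
        p_s = 1.0
        for si, pi in zip(s, probs):
            p_s *= pi if si else (1.0 - pi)
        # Gaussian likelihood of y given the summed unit amplitudes
        mu = sum(si * ai for si, ai in zip(s, amps))
        p_y = math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
        total += p_s * p_y
    return total
```

For N = 20 units this loop already visits over a million patterns, so the exponential blow-up the abstract refers to appears quickly.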
Abstract:
The Posttraumatic Growth Inventory (PTGI; Tedeschi & Calhoun, 1996) is the most commonly used measure of the positive psychological change that can result from negotiating a traumatic experience. Whilst the PTGI has strong internal reliability, validity studies are still sparse. The presented research details trauma survivors’ understanding of the items comprising the PTGI in order to qualitatively assess its content validity. Participants were 14 trauma survivors who completed the PTGI and took part in a semi-structured interview. Thematic Analysis was conducted on participants’ transcribed interviews. One latent theme was identified, reflecting that questions were consistently understood. A relationship was found between the constituent themes identified and the five factors of the PTGI. Participants answered the PTGI statements in a way that is consistent with the purpose of the instrument, with only a small discrepancy found when some participants used the PTGI scale to indicate that a decrease in an element of the inventory had been experienced. Overall, the results supported the content validity of the PTGI.
Abstract:
The New South Wales Court of Appeal decision of Wood v Balfour [2011] NSWCA 382 presents an interesting factual matrix relating to the obligation of a seller to disclose significant latent defects in quality of title to a buyer, in this instance, severe termite damage. It offers insights into the difficulty of a buyer proving the existence of the element of deceit in the making of a representation with respect to quality and reinforces the importance of the rule caveat emptor as being an article of faith for every buyer of real estate.
Abstract:
Readily accepted knowledge regarding crash causation is consistently omitted from efforts to model, and subsequently understand, motor vehicle crash occurrence and its contributing factors. For instance, distracted and impaired driving accounts for a significant proportion of crash occurrence, yet is rarely modeled explicitly. In addition, spatially allocated influences such as local law enforcement efforts, proximity to bars and schools, and roadside chronic distractions (advertising, pedestrians, etc.) play a role in contributing to crash occurrence and yet are routinely absent from crash models. By and large, these well-established omitted effects are simply assumed to contribute to model error, with the predominant focus on modeling the engineering and operational effects of transportation facilities (e.g. AADT, number of lanes, speed limits, width of lanes, etc.). The typical analytical approach, with a variety of statistical enhancements, has been to model crashes that occur at system locations as negative binomial (NB) distributed events that arise from a singular, underlying crash-generating process. These models and their statistical kin dominate the literature; however, it is argued in this paper that these models fail to capture the underlying complexity of motor vehicle crash causes, and thus thwart deeper insights regarding crash causation and prevention. This paper first describes hypothetical scenarios that collectively illustrate why current models mislead highway safety researchers and engineers. It is argued that current model shortcomings are significant and will lead to poor decision-making. Exploiting our current state of knowledge of crash causation, crash counts are postulated to arise from three processes: observed network features, unobserved spatial effects, and ‘apparent’ random influences that largely reflect behavioral influences of drivers.
It is argued, furthermore, that these three processes can in theory be modeled separately to gain deeper insight into crash causes, and that the resulting model represents a more realistic depiction of reality than state-of-practice NB regression. An admittedly imperfect empirical model that mixes three independent crash occurrence processes is shown to outperform the classical NB model. The questioning of current modeling assumptions and the implications of the latent mixture model for current practice are the most important contributions of this paper, with an initial but rather vulnerable attempt to model the latent mixtures as a secondary contribution.
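The three-process postulate can be illustrated with a toy simulation. This is not the authors' empirical model: the gamma rate components and all numbers below are made-up placeholders. It only shows how summing counts from heterogeneous latent processes produces the overdispersion (variance exceeding the mean) that a single NB fit absorbs without explaining.

```python
import math
import random
import statistics

def poisson(lam, rng=random):
    """Poisson draw via Knuth's algorithm; adequate for the small rates here."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

random.seed(1)
crashes = []
for _ in range(20_000):
    # Hypothetical decomposition of a site's expected crash count into the
    # paper's three processes (all shape/scale values are illustrative):
    mu_network = random.gammavariate(2.0, 1.0)   # observed network features
    mu_spatial = random.gammavariate(1.0, 0.5)   # unobserved spatial effects
    mu_random  = random.gammavariate(0.5, 0.5)   # 'apparent' random behavioural influences
    crashes.append(poisson(mu_network) + poisson(mu_spatial) + poisson(mu_random))

mean = statistics.fmean(crashes)
var = statistics.pvariance(crashes)
# Mixing Poisson rates across sites yields var > mean: overdispersion
# that a single-process NB model soaks up rather than explains.
```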
Abstract:
Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation through repeated sampling of data from the model and comparison of observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way.
If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs, the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists. Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than an indirect technique such as muscle strength assessment, which is generally unable to detect progression due to the body’s natural attempts at compensation.
Part III of this thesis builds upon a previous Bayesian technique, which develops a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainties. More specifically, we develop a more reliable MUNE method by applying marginalisation over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
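The likelihood-free idea at the heart of ABC, described in Part I above, fits in a few lines. The sketch below is plain rejection ABC on a toy normal model, not the thesis's SMC algorithms; the prior range, tolerance and sample-mean summary statistic are all illustrative choices.

```python
import random
import statistics

random.seed(0)

# Toy setting: pretend the likelihood is intractable and only a simulator
# is available (here the simulator is just a normal model, and the true
# parameter value 3.0 is an illustrative choice).
observed = [random.gauss(3.0, 1.0) for _ in range(200)]
s_obs = statistics.fmean(observed)          # summary statistic

def simulate(mu, n=50):
    """Draw a dataset from the model for a proposed parameter value."""
    return [random.gauss(mu, 1.0) for _ in range(n)]

# ABC rejection: draw mu from the prior, simulate data, and keep mu
# whenever the simulated summary lands within a tolerance of the
# observed summary -- no likelihood evaluation anywhere.
tolerance = 0.15
accepted = []
while len(accepted) < 100:
    mu = random.uniform(-10.0, 10.0)        # vague uniform prior
    if abs(statistics.fmean(simulate(mu)) - s_obs) < tolerance:
        accepted.append(mu)

posterior_mean = statistics.fmean(accepted)  # sits near the true value
```

The SMC-ABC algorithms of Part I improve on this by reusing accepted particles and shrinking the tolerance adaptively, which drastically reduces the number of wasted simulations.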
Abstract:
A composite paraffin-based phase change material (PCM) was prepared by blending composite paraffin and calcined diatomite through the fusion adsorption method. In this study, raw diatomite was purified by thermal treatment in order to improve its adsorption capacity; the diatomite acted as a carrier material for preparing shape-stabilized PCMs. Two forms of paraffin (paraffin waxes and liquid paraffin) with different melting points were blended together by the fusion method, and the optimum mixing proportion, with a suitable phase-transition temperature, was obtained through differential scanning calorimetry (DSC) analysis. The prepared composite paraffin was then adsorbed into the calcined diatomite. The resulting paraffin/calcined diatomite composites were characterized by scanning electron microscopy (SEM) and Fourier transform infrared (FT-IR) spectroscopy. The thermal energy storage properties of the composite PCMs were determined by DSC. DSC results showed that there was an optimum adsorption ratio between composite paraffin and calcined diatomite, and that the phase-transition temperature and latent heat of the composite PCMs were 33.04 °C and 89.54 J/g, respectively. Thermal cycling tests of the composite PCMs showed that the prepared material is thermally reliable and chemically stable. The obtained paraffin/calcined diatomite composites have suitable latent heat and melting temperatures, and show good potential for practical application.
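A back-of-envelope sketch shows how the reported DSC figures translate into stored energy over a temperature swing. Only the latent heat (89.54 J/g) and phase-transition temperature (33.04 °C) come from the abstract; the specific heat and mass below are assumed placeholders, and real PCMs melt over a range rather than at a point.

```python
# Reported DSC values for the paraffin/calcined diatomite composite:
LATENT_HEAT = 89.54   # J/g (reported)
T_MELT = 33.04        # degC (reported)

# Assumed placeholders, NOT from the paper:
C_P = 2.0             # J/(g*K), assumed specific heat of the composite
MASS = 1000.0         # g of composite

def heat_stored(t_low, t_high):
    """Sensible heat over the swing, plus the latent heat whenever the
    phase-transition temperature falls inside the interval."""
    q = MASS * C_P * (t_high - t_low)
    if t_low <= T_MELT <= t_high:
        q += MASS * LATENT_HEAT
    return q  # joules
```

For a 25 °C to 40 °C swing the latent term contributes roughly three times the sensible term, which is the point of using a PCM at all.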
Abstract:
Dynamic capabilities are widely considered to incorporate those processes that enable organizations to sustain superior performance over time. In this paper, we argue theoretically and demonstrate empirically that these effects are contingent on organizational structure and the competitive intensity in the market. Results from partial least squares structural equation modeling (PLS-SEM) analyses indicate that organic organizational structures facilitate the impact of dynamic capabilities on organizational performance. Furthermore, we find that the performance effects of dynamic capabilities are contingent on the competitive intensity faced by firms. Our findings demonstrate the performance effects of internal alignment between organizational structure and dynamic capabilities, as well as the external fit of dynamic capabilities with competitive intensity. We outline the advantages of PLS-SEM for modeling latent constructs, such as dynamic capabilities, and conclude with managerial implications.
Abstract:
A quasi-maximum likelihood procedure for estimating the parameters of multi-dimensional diffusions is developed in which the transitional density is a multivariate Gaussian density with first and second moments approximating the true moments of the unknown density. For affine drift and diffusion functions, the moments are exactly those of the true transitional density and for nonlinear drift and diffusion functions the approximation is extremely good and is as effective as alternative methods based on likelihood approximations. The estimation procedure generalises to models with latent factors. A conditioning procedure is developed that allows parameter estimation in the absence of proxies.
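The Ornstein-Uhlenbeck process is the standard concrete instance of the affine case mentioned in this abstract: its drift is affine, so the Gaussian density built from the exact first two transition moments coincides with the true transitional density. The sketch below is a generic illustration of that quasi-likelihood construction, not the paper's estimator; parameter names are conventional (kappa for mean reversion, theta for the long-run mean, sigma for volatility).

```python
import math

def ou_transition_moments(x, delta, kappa, theta, sigma):
    """Exact mean and variance of the Ornstein-Uhlenbeck transition
    density over a time step delta, starting from state x."""
    decay = math.exp(-kappa * delta)
    mean = theta + (x - theta) * decay
    var = sigma ** 2 * (1.0 - decay ** 2) / (2.0 * kappa)
    return mean, var

def quasi_loglik(path, delta, kappa, theta, sigma):
    """Gaussian quasi-log-likelihood of a discretely observed path;
    exact for the OU process because its drift and diffusion are affine."""
    ll = 0.0
    for x0, x1 in zip(path, path[1:]):
        mean, var = ou_transition_moments(x0, delta, kappa, theta, sigma)
        ll += -0.5 * (math.log(2.0 * math.pi * var) + (x1 - mean) ** 2 / var)
    return ll
```

Maximising `quasi_loglik` over (kappa, theta, sigma) gives the quasi-maximum likelihood estimates; for nonlinear drift one would replace the exact moments with moment approximations, as the abstract describes.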
Abstract:
The purpose of this article is to examine the role of the alignment between technological innovation effectiveness and operational effectiveness after the implementation of enterprise information systems, and the impact of this alignment on the improvement in operational performance. Confirmatory factor analysis was used to examine structural relationships between the set of observed variables and the set of continuous latent variables. The findings suggest that the dimensions stemming from technological innovation effectiveness (system quality, information quality, service quality and user satisfaction) and the performance objectives stemming from operational effectiveness (cost, quality, reliability, flexibility and speed) are important and significantly correlated factors. These factors promote the alignment between technological innovation effectiveness and operational effectiveness and should be the focus for managers in achieving effective implementation of technological innovations. In addition, there is a significant and direct influence of this alignment on the improvement of operational performance. The principal limitation of this study is that the findings are based on a small sample.
Abstract:
The validity of the Multidimensional School Anger Inventory (MSAI) was examined with adolescents from 5 Pacific Rim countries (N = 3,181 adolescents; age, M = 14.8 years; 52% females). Confirmatory factor analyses examined configural invariance for the MSAI’s anger experience, hostility, destructive expression, and anger coping subscales. The model did not converge for Peruvian students. Using the top 4 loaded items for anger experience, hostility, and destructive expression, configural invariance and partial metric and scalar invariances were found. Latent means analysis compared mean responses on each subscale to the U.S. sample. Students from other countries showed higher mean responses on the anger experience subscale (ds = .37–.73). Australian (d = .40) and Japanese students (d = .21) had significantly higher mean hostility subscale scores. Australian students had higher mean scores on the destructive expression subscale (d = .30), whereas Japanese students had lower mean scores (d = −.17). The largest latent mean gender differences (females lower than males) were for destructive expression among Australian (d = −.67), Guatemalan (d = −.42), and U.S. (d = −.66) students. This study supported an abbreviated, 12-item MSAI with partial invariance. Implications for the use of the MSAI in comparative research are discussed.
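The d values reported in this abstract are standardised mean differences. A minimal sketch of the classical computation with a pooled standard deviation follows; note this is the observed-score version for illustration, whereas the study's latent means analysis compares latent rather than observed means.

```python
import math
import statistics

def cohens_d(group1, group2):
    """Cohen's d: standardised mean difference using the pooled
    sample standard deviation of the two groups."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = statistics.fmean(group1), statistics.fmean(group2)
    s1, s2 = statistics.variance(group1), statistics.variance(group2)
    pooled_sd = math.sqrt(((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd
```

A negative d, as in the gender comparisons above, simply means the first group's mean is lower than the second's.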
Abstract:
This thesis advances the knowledge of behavioural economics on the importance of individual characteristics, such as gender, personality or culture, for choices relevant to labour and insurance markets. It does so using economic experiments, survey tools and physiological data, collected in economic laboratories and in the field. More specifically, the thesis includes five experimental economic studies investigating individual-specific characteristics (gender, age, personality, cultural background) in decisions influenced by risk attitudes and social preferences. One of these characteristics is the physiological state of decision-makers, measured by heart rate variability. The results show that individual-specific characteristics play an important role in choices affected by social preferences, a finding observable to a lesser degree for risk preferences. This finding is confirmed under revealed incentivised choices and when studying the (latent) physiological responses of decision-makers.
Abstract:
Critical analysis and problem-solving skills are two graduate attributes that are important in ensuring that graduates are well equipped to work across research and practice settings within the discipline of psychology. Despite the importance of these skills, few psychology undergraduate programmes have undertaken any systematic development, implementation, and evaluation of curriculum activities to foster these graduate skills. The current study reports on the development and implementation of a tutorial programme designed to enhance the critical analysis and problem-solving skills of undergraduate psychology students. Underpinned by collaborative learning and problem-based learning, the tutorial programme was administered to 273 third-year undergraduate students in psychology. Latent Growth Curve Modelling revealed that students demonstrated a significant linear increase in self-reported critical analysis and problem-solving skills across the tutorial programme. The findings suggest that the development of an inquiry-based curriculum offers important opportunities for psychology undergraduates to develop critical analysis and problem-solving skills.
Abstract:
The emergence of pseudo-marginal algorithms has led to improved computational efficiency when dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables, so pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely, which can be difficult to achieve in complex applications. In this paper we propose to take advantage of the multiple central processing units (CPUs) that are readily available on most standard desktop computers. The likelihood is estimated independently on the multiple CPUs, and the ultimate estimate is the average of the estimates obtained from the individual CPUs. The estimate remains unbiased, but its variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
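The averaging idea in this abstract is easy to demonstrate numerically. The sketch below uses a toy latent-variable model with a simple Monte Carlo likelihood estimator and runs the "CPU" estimates sequentially for simplicity; in the paper each estimate would be computed on its own CPU, and the model, particle counts and parameter values here are all illustrative.

```python
import math
import random
import statistics

def normal_pdf(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2.0 * math.pi * var)

def likelihood_estimate(y, mu, n_particles, rng):
    """Unbiased Monte Carlo estimate of p(y | mu) for the toy model
    z ~ N(mu, 1), y | z ~ N(z, 1); the true likelihood is N(y; mu, 2)."""
    total = sum(normal_pdf(y, rng.gauss(mu, 1.0), 1.0) for _ in range(n_particles))
    return total / n_particles

rng = random.Random(7)
y, mu, n_cpus = 1.5, 0.0, 8

# One unbiased estimate versus the average of n_cpus independent
# estimates (sequential here; independent CPUs in the paper).
single = [likelihood_estimate(y, mu, 100, rng) for _ in range(1000)]
averaged = [statistics.fmean(likelihood_estimate(y, mu, 100, rng)
                             for _ in range(n_cpus)) for _ in range(1000)]

# Averaging preserves unbiasedness while cutting the variance by roughly
# a factor of n_cpus, which is what improves pseudo-marginal mixing.
var_single = statistics.pvariance(single)
var_averaged = statistics.pvariance(averaged)
```

In Python the two "technologies" for doing this genuinely in parallel would typically be process-based (e.g. `multiprocessing`), since the estimates are independent and embarrassingly parallel.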
Abstract:
The promise of metabonomics, a new "omics" technique, for validating Chinese medicines and the compatibility of Chinese formulas has come to be appreciated. The present study was undertaken to explore the excretion pattern of low-molecular-mass metabolites in the male Wistar-derived rat model of kidney yin deficiency induced with thyroxine and reserpine, as well as the therapeutic effect of Liu Wei Di Huang Wan (LW) and its separated prescriptions, a classic traditional Chinese medicine formula for treating kidney yin deficiency in China. The study utilized ultra-performance liquid chromatography/electrospray ionization synapt high-definition mass spectrometry (UPLC/ESI-SYNAPT-HDMS) in both negative and positive electrospray ionization (ESI) modes. At the same time, blood biochemistry was examined to identify specific changes associated with kidney yin deficiency. Distinct changes in the pattern of metabolites, resulting from daily administration of thyroxine and reserpine, were observed by UPLC-HDMS combined with principal component analysis (PCA). According to the PCA score plots, the changes in metabolic profiling were restored to baseline values after treatment with LW. Altogether, the current metabonomic approach based on UPLC-HDMS and orthogonal projection to latent structures discriminant analysis (OPLS-DA) indicated 20 ions (14 in the negative mode, 8 in the positive mode, and 2 in both) as "differentiating metabolites".
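The PCA score plots referred to above project each sample onto the directions of greatest variance. A miniature two-feature version is sketched below using the closed-form principal axis of a 2x2 covariance matrix; real metabolite profiles are high-dimensional, where PCA is normally computed via an eigendecomposition or SVD, and everything here is illustrative.

```python
import math
import statistics

def first_pc_scores(xs, ys):
    """Scores on the first principal component for two-feature samples,
    using the closed-form principal-axis angle of a 2x2 covariance
    matrix: theta = 0.5 * atan2(2*cov_xy, var_x - var_y)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cx = [x - mx for x in xs]
    cy = [y - my for y in ys]
    n = len(xs)
    vxx = sum(a * a for a in cx) / n
    vyy = sum(b * b for b in cy) / n
    vxy = sum(a * b for a, b in zip(cx, cy)) / n
    theta = 0.5 * math.atan2(2.0 * vxy, vxx - vyy)
    ux, uy = math.cos(theta), math.sin(theta)
    # project each centred sample onto the first principal direction
    return [a * ux + b * uy for a, b in zip(cx, cy)]
```

In a score plot, well-separated clusters of these scores for control, model and treated animals are what indicate the metabolic shift and its restoration after treatment.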