893 results for Boussinesq approximation
Abstract:
A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to estimate the likelihood unbiasedly, and to perform inference and make decisions based on an exact-approximate algorithm. Two estimators are proposed: one based on quasi-Monte Carlo methods and one based on the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
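The unbiased likelihood estimation referred to above can be illustrated with a minimal sketch (an illustration, not the authors' code): for a toy one-observation random-intercept model, averaging the conditional likelihood over draws of the random effect gives an unbiased Monte Carlo estimate of the intractable marginal likelihood, which in this Gaussian toy case has a closed form to check against.

```python
import math
import random

def normpdf(x, mu, sd):
    # Density of N(mu, sd^2) at x.
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def mc_likelihood(y, theta, tau, sigma, n_draws, rng):
    # Unbiased Monte Carlo estimate of p(y | theta) for the toy model
    # y = theta + b + eps, with b ~ N(0, tau^2) and eps ~ N(0, sigma^2):
    # average the conditional likelihood p(y | b, theta) over draws of b.
    total = 0.0
    for _ in range(n_draws):
        b = rng.gauss(0.0, tau)
        total += normpdf(y, theta + b, sigma)
    return total / n_draws

rng = random.Random(1)
y, theta, tau, sigma = 1.2, 0.5, 0.8, 0.4
est = mc_likelihood(y, theta, tau, sigma, 200_000, rng)
exact = normpdf(y, theta, math.sqrt(tau**2 + sigma**2))  # marginal is Gaussian here
print(est, exact)
```

In realistic mixed models no closed form exists, which is exactly why such estimators are plugged into an exact-approximate (pseudo-marginal) scheme.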
Abstract:
Ab initio DFT calculations for the phonon dispersion (PD) and the phonon density of states (PDOS) of the two isotopic forms (10B and 11B) of MgB2 demonstrate that use of a reduced-symmetry super-lattice provides an improved approximation to the dynamical, phonon-distorted P6/mmm crystal structure. Construction of phonon frequency plots using calculated values for these isotopic forms gives linear trends with integer multiples of a base frequency that change in slope in a manner consistent with the isotope effect (IE). Spectral parameters inferred from this method are similar to those determined experimentally for the pure isotopic forms of MgB2. Comparison with AlB2 demonstrates that a coherent phonon decay down to acoustic modes is not possible for this metal. Coherent acoustic phonon decay may be an important contributor to superconductivity in MgB2.
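In the harmonic approximation the isotope effect on phonon frequencies follows the generic textbook relation ω ∝ M^(−1/2), so the expected shift between the two boron isotopes can be checked directly (a general relation, not the paper's DFT calculation):

```python
import math

# Harmonic approximation: phonon frequency scales as the inverse square root
# of the vibrating mass, so for boron-dominated modes of MgB2 the
# 10B/11B frequency ratio is sqrt(m11/m10). Masses in atomic mass units.
m10, m11 = 10.013, 11.009  # isotopic masses of 10B and 11B
ratio = math.sqrt(m11 / m10)
print(f"expected omega(10B)/omega(11B) = {ratio:.4f}")  # roughly a 5% upward shift
```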
A tag-based personalized item recommendation system using tensor modeling and topic model approaches
Abstract:
This research falls in the area of enhancing the quality of tag-based item recommendation systems. It aims to achieve this by employing a multi-dimensional user profile approach and by analyzing the semantic aspects of tags. Tag-based recommender systems have two characteristics that need to be carefully studied in order to build a reliable system. Firstly, the multi-dimensional correlation, known as a tag assignment …
Abstract:
Background Multi attribute utility instruments (MAUIs) are preference-based measures that comprise a health state classification system (HSCS) and a scoring algorithm that assigns a utility value to each health state in the HSCS. When developing a MAUI from a health-related quality of life (HRQOL) questionnaire, first a HSCS must be derived. This typically involves selecting a subset of domains and items because HRQOL questionnaires typically have too many items to be amenable to the valuation task required to develop the scoring algorithm for a MAUI. Currently, exploratory factor analysis (EFA) followed by Rasch analysis is recommended for deriving a MAUI from a HRQOL measure. Aim To determine whether confirmatory factor analysis (CFA) is more appropriate and efficient than EFA to derive a HSCS from the European Organisation for Research and Treatment of Cancer's core HRQOL questionnaire, the Quality of Life Questionnaire (QLQ-C30), given its well-established domain structure. Methods QLQ-C30 (Version 3) data were collected from 356 patients receiving palliative radiotherapy for recurrent/metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter informed by the established QLQ-C30 structure and views of both patients and clinicians on which are the most relevant items. Dimensions determined by EFA or CFA were then subjected to Rasch analysis. Results CFA results generally supported the proposed QLQ-C30 structure (comparative fit index = 0.99, Tucker–Lewis index = 0.99, root mean square error of approximation = 0.04). EFA revealed fewer factors and some items cross-loaded on multiple factors. Further assessment of dimensionality with Rasch analysis allowed better alignment of the EFA dimensions with those detected by CFA. Conclusion CFA was more appropriate and efficient than EFA in producing clinically interpretable results for the HSCS for a proposed new cancer-specific MAUI.
Our findings suggest that CFA should be recommended generally when deriving a preference-based measure from a HRQOL measure that has an established domain structure.
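For reference, the fit indices reported in the results (CFI, TLI, RMSEA) have simple closed forms in terms of the model and baseline chi-square statistics; the sketch below uses the standard formulas with made-up chi-square values, not the study's data:

```python
import math

def rmsea(chi2, df, n):
    # Root mean square error of approximation, from the model chi-square,
    # its degrees of freedom, and the sample size n.
    return math.sqrt(max(chi2 / df - 1.0, 0.0) / (n - 1))

def cfi(chi2_m, df_m, chi2_b, df_b):
    # Comparative fit index: improvement of the fitted model over the
    # baseline (independence) model on the noncentrality scale.
    d_m = max(chi2_m - df_m, 0.0)
    d_b = max(chi2_b - df_b, d_m)
    return 1.0 - d_m / d_b if d_b > 0 else 1.0

# Hypothetical chi-square values for a sample of n = 356 (the study's n):
r = rmsea(120.0, 80, 356)
c = cfi(120.0, 80, 2100.0, 105)
print(r, c)  # a well-fitting model: RMSEA near 0.04, CFI near 0.98
```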
Abstract:
This thesis is a study of control methods for six-legged robots. The study is based on mathematical modeling and simulation. A new joint controller is proposed and tested in simulation that uses joint angles and leg reaction forces as inputs to generate a torque, and a method to optimise this controller is formulated and validated. Simulations show that the hexapod can walk on flat ground using PID controllers with just four target configurations and a set of leg coordination rules, which provided the basis for the design of the new controller.
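The per-joint PID control the simulations rely on takes the standard discrete form; a minimal sketch with hypothetical gains and a crude one-integrator joint model standing in for the thesis's dynamics:

```python
class PID:
    """Discrete PID controller producing a joint torque from an angle error."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def torque(self, target_angle, measured_angle):
        error = target_angle - measured_angle
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy joint toward a 0.5 rad target (gains are illustrative only).
pid = PID(kp=8.0, ki=0.1, kd=0.2, dt=0.01)
angle = 0.0
for _ in range(2000):
    tau = pid.torque(0.5, angle)
    angle += tau * 0.01  # crude integrator standing in for joint dynamics
print(angle)
```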
Abstract:
The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, the corresponding threshold determination methods are still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a theoretically sound way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling error and approximation error are analysed with simulation data to avoid nuisance biases and unrealistic stochastic model effects. The results indicate that the proposed method greatly simplifies the FF-approach without introducing significant modeling error, making fixed failure rate threshold determination feasible for real-time applications.
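The integer bootstrapping success rate used in the approximation step has a well-known closed form (Teunissen's bootstrapping success rate) in terms of the conditional standard deviations of the decorrelated ambiguities; a sketch with illustrative values, not data from the study:

```python
import math

def phi(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ib_success_rate(cond_std_devs):
    # P_IB = prod_i [ 2 * Phi(1 / (2 * sigma_i)) - 1 ], where sigma_i are the
    # conditional standard deviations of the (decorrelated) ambiguities.
    p = 1.0
    for s in cond_std_devs:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

p_precise = ib_success_rate([0.08, 0.10, 0.12])  # precise ambiguities: rate near 1
p_noisy = ib_success_rate([0.40, 0.50, 0.60])    # noisy ambiguities: much lower
print(p_precise, p_noisy)
```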
Abstract:
Background and Aims Research into craving is hampered by lack of theoretical specification and a plethora of substance-specific measures. This study aimed to develop a generic measure of craving based on elaborated intrusion (EI) theory. Confirmatory factor analysis (CFA) examined whether a generic measure replicated the three-factor structure of the Alcohol Craving Experience (ACE) scale over different consummatory targets and time-frames. Design Twelve studies were pooled for CFA. Targets included alcohol, cigarettes, chocolate and food. Focal periods varied from the present moment to the previous week. Separate analyses were conducted for strength and frequency forms. Setting Nine studies included university students, with single studies drawn from an internet survey, a community sample of smokers and alcohol-dependent out-patients. Participants A heterogeneous sample of 1230 participants. Measurements Adaptations of the ACE questionnaire. Findings Both craving strength [comparative fit index (CFI) = 0.974; root mean square error of approximation (RMSEA) = 0.039, 95% confidence interval (CI) = 0.035–0.044] and frequency (CFI = 0.971, RMSEA = 0.049, 95% CI = 0.044–0.055) gave an acceptable three-factor solution across desired targets that mapped onto the structure of the original ACE (intensity, imagery, intrusiveness), after removing an item, re-allocating another and taking intercorrelated error terms into account. Similar structures were obtained across time-frames and targets. Preliminary validity data on the resulting 10-item Craving Experience Questionnaire (CEQ) for cigarettes and alcohol were strong. Conclusions The Craving Experience Questionnaire (CEQ) is a brief, conceptually grounded and psychometrically sound measure of desires. It demonstrates a consistent factor structure across a range of consummatory targets in both laboratory and clinical contexts.
Abstract:
This thesis progresses Bayesian experimental design by developing novel methodologies and extensions to existing algorithms. Through these advancements, this thesis provides solutions to several important and complex experimental design problems, many of which have applications in biology and medicine. This thesis consists of a series of published and submitted papers. In the first paper, we provide a comprehensive literature review on Bayesian design. In the second paper, we discuss methods which may be used to solve design problems in which one is interested in finding a large number of (near) optimal design points. The third paper presents methods for finding fully Bayesian experimental designs for nonlinear mixed effects models, and the fourth paper investigates methods to rapidly approximate the posterior distribution for use in Bayesian utility functions.
Abstract:
A fractional FitzHugh–Nagumo monodomain model with zero Dirichlet boundary conditions is presented, generalising the standard monodomain model that describes the propagation of the electrical potential in heterogeneous cardiac tissue. The model consists of a coupled fractional Riesz space nonlinear reaction-diffusion model and a system of ordinary differential equations describing the ionic fluxes as a function of the membrane potential. We solve this model by decoupling the space-fractional partial differential equation from the system of ordinary differential equations at each time step, which amounts to treating the nonlinear source term of the fractional Riesz space reaction-diffusion model as only locally Lipschitz. The fractional Riesz space nonlinear reaction-diffusion model is solved using an implicit numerical method with the shifted Grünwald–Letnikov approximation, and the stability and convergence are discussed in detail in the context of the local Lipschitz property. Some numerical examples are given to show the consistency of our computational approach.
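The Grünwald–Letnikov coefficients underlying such spatial discretisations satisfy a simple recurrence, g_0 = 1 and g_k = g_{k-1}(k - 1 - α)/k, which is how they are usually generated in practice; a minimal sketch with an illustrative fractional order, not the paper's full scheme:

```python
def gl_weights(alpha, n):
    # Grunwald-Letnikov coefficients g_k = (-1)^k * binom(alpha, k),
    # generated by the recurrence g_k = g_{k-1} * (k - 1 - alpha) / k.
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return w

alpha = 1.8  # fractional order in (1, 2], as in Riesz space-fractional models
w = gl_weights(alpha, 1000)
print(w[:3])   # 1, -alpha, alpha*(alpha-1)/2
print(sum(w))  # partial sums of the series for (1 - 1)^alpha tend to 0
```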
Abstract:
BACKGROUND Measuring disease and injury burden in populations requires a composite metric that captures both premature mortality and the prevalence and severity of ill-health. The 1990 Global Burden of Disease study proposed disability-adjusted life years (DALYs) to measure disease burden. No comprehensive update of disease burden worldwide incorporating a systematic reassessment of disease and injury-specific epidemiology has been done since the 1990 study. We aimed to calculate disease burden worldwide and for 21 regions for 1990, 2005, and 2010 with methods to enable meaningful comparisons over time. METHODS We calculated DALYs as the sum of years of life lost (YLLs) and years lived with disability (YLDs). DALYs were calculated for 291 causes, 20 age groups, both sexes, and for 187 countries, and aggregated to regional and global estimates of disease burden for three points in time with strictly comparable definitions and methods. YLLs were calculated from age-sex-country-time-specific estimates of mortality by cause, with deaths weighted by the standardised lost life expectancy at each age. YLDs were calculated as prevalence of 1160 disabling sequelae, by age, sex, and cause, and weighted by new disability weights for each health state. Neither YLLs nor YLDs were age-weighted or discounted. Uncertainty around cause-specific DALYs was calculated incorporating uncertainty in levels of all-cause mortality, cause-specific mortality, prevalence, and disability weights. FINDINGS Global DALYs remained stable from 1990 (2·503 billion) to 2010 (2·490 billion). Crude DALYs per 1000 decreased by 23% (472 per 1000 to 361 per 1000). An important shift has occurred in DALY composition, with the contribution of deaths and disability among children (younger than 5 years of age) declining from 41% of global DALYs in 1990 to 25% in 2010.
YLLs typically account for about half of disease burden in more developed regions (high-income Asia Pacific, western Europe, high-income North America, and Australasia), rising to over 80% of DALYs in sub-Saharan Africa. In 1990, 47% of DALYs worldwide were from communicable, maternal, neonatal, and nutritional disorders, 43% from non-communicable diseases, and 10% from injuries. By 2010, this had shifted to 35%, 54%, and 11%, respectively. Ischaemic heart disease was the leading cause of DALYs worldwide in 2010 (up from fourth rank in 1990, increasing by 29%), followed by lower respiratory infections (top rank in 1990; 44% decline in DALYs), stroke (fifth in 1990; 19% increase), diarrhoeal diseases (second in 1990; 51% decrease), and HIV/AIDS (33rd in 1990; 351% increase). Major depressive disorder increased from 15th to 11th rank (37% increase) and road injury from 12th to 10th rank (34% increase). Substantial heterogeneity exists in rankings of leading causes of disease burden among regions. INTERPRETATION Global disease burden has continued to shift away from communicable to non-communicable diseases and from premature death to years lived with disability. In sub-Saharan Africa, however, many communicable, maternal, neonatal, and nutritional disorders remain the dominant causes of disease burden. The rising burden from mental and behavioural disorders, musculoskeletal disorders, and diabetes will impose new challenges on health systems. Regional heterogeneity highlights the importance of understanding local burden of disease and setting goals and targets for the post-2015 agenda taking such patterns into account. Because of improved definitions, methods, and data, these results for 1990 and 2010 supersede all previously published Global Burden of Disease results.
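The core identity behind these figures, DALY = YLL + YLD, is straightforward arithmetic; the sketch below uses entirely made-up numbers, not GBD estimates:

```python
def yll(deaths, std_life_expectancy):
    # Years of life lost: deaths times standard life expectancy at age of death.
    return deaths * std_life_expectancy

def yld(prevalent_cases, disability_weight):
    # Years lived with disability: prevalent cases weighted by severity (0 to 1).
    return prevalent_cases * disability_weight

def daly(deaths, std_le, cases, dw):
    # GBD 2010 style: no age-weighting, no discounting.
    return yll(deaths, std_le) + yld(cases, dw)

# Hypothetical cause in a hypothetical population:
print(daly(deaths=1_000, std_le=30.0, cases=50_000, dw=0.2))  # 30000 + 10000 = 40000
```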
Abstract:
Under the legacy of neoliberalism, it is important to consider how the indigenous people, in this case of Australia, are to advance, develop and achieve some approximation of parity with broader societies in terms of health, educational outcomes and economic participation. In this paper, we explore the relationships between welfare dependency, individualism, responsibility, rights, liberty and the role of the state in the provision of Government-funded programmes of sport to Indigenous communities. We consider whether such programmes are a product of ‘white guilt’ and therefore encourage dependency and weaken the capacity for independence within communities and individuals, or whether programmes to increase rates of participation in sport are better viewed as good investments to bring about changes in physical activity as (albeit a small) part of a broader social policy aimed at reducing the gaps between Indigenous and non-Indigenous Australians in health, education and employment.
Abstract:
Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change. Location: Cloud forests in Mexico. Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five. Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas. Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. 
Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses.
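If, as a crude simplification, the remaining local populations are treated as going extinct independently, the joint extinction probability reduces to a product over patches; a hypothetical sketch with illustrative numbers, not the study's patch-occupancy model:

```python
def joint_extinction_prob(per_patch_ext_probs):
    # Probability that every remaining population goes extinct, assuming
    # independence between patches (a simplification of the full model).
    p = 1.0
    for q in per_patch_ext_probs:
        p *= q
    return p

# Regional extinction is scored once fewer than five patches remain:
patches = [0.6, 0.7, 0.8, 0.9]  # hypothetical per-patch extinction probabilities
if len(patches) < 5:
    print(joint_extinction_prob(patches))
```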
Abstract:
In this paper, a class of unconditionally stable difference schemes based on the Padé approximation is presented for the Riesz space-fractional telegraph equation. Firstly, we introduce a new variable to transform the original differential equation into an equivalent differential equation system. Then, we apply a second-order fractional central difference scheme to discretise the Riesz space-fractional operator. Finally, we use (1, 1), (2, 2) and (3, 3) Padé approximations to give a fully discrete difference scheme for the resulting linear system of ordinary differential equations. Matrix analysis is used to show the unconditional stability of the proposed algorithms. Two examples with known exact solutions are chosen to assess the proposed difference schemes. Numerical results demonstrate that these schemes provide accurate and efficient methods for solving a space-fractional hyperbolic equation.
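The low-order diagonal Padé approximants to the exponential that such schemes build on have simple scalar forms, e.g. R11(z) = (1 + z/2)/(1 - z/2) and R22(z) = (1 + z/2 + z²/12)/(1 - z/2 + z²/12); a quick scalar accuracy check (an illustration, not the paper's solver):

```python
import math

def pade11(z):
    # (1,1) diagonal Pade approximant to exp(z); equivalent to Crank-Nicolson.
    return (1.0 + z / 2.0) / (1.0 - z / 2.0)

def pade22(z):
    # (2,2) diagonal Pade approximant to exp(z), accurate to higher order in z.
    return (1.0 + z / 2.0 + z**2 / 12.0) / (1.0 - z / 2.0 + z**2 / 12.0)

z = 0.1
err11 = abs(pade11(z) - math.exp(z))
err22 = abs(pade22(z) - math.exp(z))
print(err11, err22)  # the (2,2) approximant is markedly more accurate
```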