944 results for Model-based bootstrap


Relevance: 80.00%

Abstract:

Background: Discussion is currently taking place among international HIV/AIDS groups around increasing HIV testing and initiating earlier use of antiretroviral therapy (ART) among people diagnosed with HIV as a method to reduce the spread of HIV. In this study, we explore the expected epidemiological impact of this strategy in a small population in which HIV transmission is predominantly confined to men who have sex with men (MSM). Methods: A deterministic mathematical transmission model was constructed to investigate the impacts of strategies that increase testing and treatment rates, and their likely potential to mitigate HIV epidemics among MSM. Our novel model distinguishes men in the population who are more easily accessible to prevention campaigns through engagement with the gay community from men who are not. This model is applied to the population of MSM in South Australia. Results: Our model-based findings suggest that increasing testing rates alone will have minimal impact on reducing the expected number of infections compared to current conditions. However, in combination with increases in treatment coverage, this strategy could lead to a 59–68% reduction in the number of HIV infections over the next 5 years. Targeting men who are socially engaged with the gay community would result in the majority of potential reductions in incidence, with only minor improvements possible by reaching all other MSM. Conclusions: Investing in strategies that will achieve higher coverage and earlier initiation of treatment to reduce infectiousness of HIV-infected individuals could be an effective strategy for reducing incidence in a population of MSM.
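
As a pointer for readers unfamiliar with this class of model, the sketch below shows the general shape of a deterministic compartmental HIV model in which a testing rate moves undiagnosed infections to diagnosed status and a treatment rate reduces infectiousness. This is not the authors' model; the compartments are simplified and every parameter value is hypothetical.

    # Minimal compartmental sketch: susceptible (S), undiagnosed (Iu),
    # diagnosed untreated (Id), treated (T). All parameters hypothetical.
    from scipy.integrate import solve_ivp

    def rhs(t, y, beta, tau, rho, eps):
        S, Iu, Id, T = y
        N = S + Iu + Id + T
        foi = beta * (Iu + Id + eps * T) / N  # treated less infectious (eps < 1)
        return [-foi * S,
                foi * S - tau * Iu,   # tau: testing (diagnosis) rate
                tau * Iu - rho * Id,  # rho: treatment initiation rate
                rho * Id]

    y0 = [9900, 80, 15, 5]            # illustrative population
    sol = solve_ivp(rhs, (0, 5 * 365), y0,
                    args=(0.0005, 1 / 180, 1 / 90, 0.1))
    print("infected after 5 years:", sum(sol.y[1:, -1]))

Raising the testing rate (tau) alone mainly shifts people from Iu to Id; only raising treatment uptake (rho) as well shrinks the effective infectious pool, which mirrors the qualitative conclusion above.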

Relevance: 80.00%

Abstract:

There is a general perception that public confidence in the insolvency profession is low as a result of recent unethical practices by a few high-profile liquidators. Consequently, the effectiveness of the current regulatory mechanisms has been questioned, leading to a review of the performance of insolvency practitioners and subsequent regulation proposals. The challenge for the insolvency profession is balancing the expectations of the general public whilst ensuring that the obligations and duties imposed upon practitioners are performed to acceptable and realistic standards. It is difficult (if not impossible) for the profession to meet this challenge in the absence of a cohesive framework which identifies those issues that require further regulation as opposed to those that relate to general education on the insolvency process. This paper examines the audit expectations gap theory in the context of insolvency practitioners and suggests that a model based on this theory provides an effective framework for evaluating the regulation of the insolvency industry.

Relevance: 80.00%

Abstract:

Thermal transport in graphene-polymer nanocomposites is complex and not yet well understood. Interfacial thermal transport between the graphene nanofiller and the polymer matrix is expected to play a key role in controlling the overall thermal performance of these nanocomposites. In this work, we investigated thermal transport across graphene-polymer interfaces functionalized with end-grafted polymer chains using molecular dynamics simulations. The effects of grafting density, chain length and initial morphology on interfacial thermal transport were systematically investigated. It was found that end-grafted polymer chains can significantly enhance interfacial thermal transport; the underlying mechanism is attributed to enhanced vibrational coupling between graphene and the polymer. In addition, a theoretical model based on effective medium theory was established to predict the thermal conductivity of graphene-polymer nanocomposites.
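
To illustrate how an interfacial resistance enters an effective-medium prediction, here is a minimal sketch using a Hasselman-Johnson-type expression for spherical fillers with a finite interfacial (Kapitza) resistance; the paper derives its own EMT model for graphene platelets, so this form and all numbers are only illustrative.

    # Effective conductivity of a filler/matrix composite with a finite
    # interfacial (Kapitza) resistance, spherical-filler EMT form.
    def k_effective(k_m, k_p, f, R_k, a):
        """k_m, k_p: matrix/filler conductivity (W/m.K); f: volume
        fraction; R_k: interfacial resistance (m^2.K/W); a: radius (m)."""
        alpha = R_k * k_m / a  # dimensionless interface parameter
        num = k_p * (1 + 2 * alpha) + 2 * k_m + 2 * f * (k_p * (1 - alpha) - k_m)
        den = k_p * (1 + 2 * alpha) + 2 * k_m - f * (k_p * (1 - alpha) - k_m)
        return k_m * num / den

    # hypothetical graphene-like filler in a polymer matrix
    print(k_effective(k_m=0.2, k_p=2000.0, f=0.03, R_k=1e-7, a=1e-7))

Lowering R_k, which is what better vibrational coupling from grafted chains amounts to, raises the composite conductivity; this is the qualitative effect reported above.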

Relevance: 80.00%

Abstract:

Objectives: Directly measuring disease incidence in a population is difficult and not feasible to do routinely. We describe the development and application of a new method for estimating, at a population level, the number of incident genital chlamydia infections and the corresponding incidence rates, by age and sex, using routine surveillance data. Methods: A Bayesian statistical approach was developed to calibrate the parameters of a decision-pathway tree against national data on numbers of notifications and tests conducted (2001-2013). Independent beta probability density functions were adopted as priors on the time-independent parameters; the shape parameters of these beta distributions were chosen to match prior estimates sourced from peer-reviewed literature or expert opinion. To best facilitate the calibration, multivariate Gaussian priors on (the logistic transforms of) the time-dependent parameters were adopted, using the Matérn covariance function to favour smooth changes across consecutive years and adjacent age cohorts. The model outcomes were validated by comparing them with independent empirical epidemiological measures, i.e. prevalence and incidence as reported by other studies. Results: Model-based estimates suggest that the total number of people acquiring chlamydia per year in Australia has increased by ~120% over 12 years. Nationally, an estimated 356,000 people acquired chlamydia in 2013, which is 4.3 times the number of reported diagnoses. This corresponds to an estimated annual chlamydia incidence of 1.54% in 2013, up from 0.81% in 2001 (a ~90% increase). Conclusions: We developed a statistical method that uses routine surveillance (notifications and testing) data to produce estimates of the extent of, and trends in, chlamydia incidence.
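
A minimal sketch of the prior construction described, assuming a Matérn 3/2 kernel over calendar years (the paper's exact kernel variant, length-scales and parameter meanings may differ); it draws one smooth prior trajectory for a logit-transformed annual probability.

    import numpy as np
    from scipy.spatial.distance import cdist
    from scipy.special import expit  # inverse logit

    def matern32(x, length_scale=3.0, sigma=0.5):
        d = cdist(x, x) / length_scale
        return sigma**2 * (1 + np.sqrt(3) * d) * np.exp(-np.sqrt(3) * d)

    years = np.arange(2001, 2014, dtype=float).reshape(-1, 1)
    K = matern32(years) + 1e-9 * np.eye(len(years))  # jitter for stability

    rng = np.random.default_rng(0)
    logit_p = rng.multivariate_normal(np.full(len(years), -2.0), K)
    print(np.round(expit(logit_p), 3))  # one smooth prior draw per year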

Relevance: 80.00%

Abstract:

This study presents the results of the first large-scale survey of Australian builders' beliefs about prefabrication, drawing on 454 surveys completed by representatives of building companies in Queensland and Western Australia. Previous literature has identified a number of broad themes affecting the uptake of prefabrication. The current study builds on this work by using a structured theoretical model, based on the Theory of Planned Behaviour (TPB) and the Technology Acceptance Model (TAM), to further explore the specific factors influencing builders' intentions to increase their use of prefabrication. Information was gathered on the characteristics of respondents, in addition to data addressing three aims: (1) to identify the relative importance of a number of key factors which may affect builders' use of prefabrication; (2) to compare the characteristics of builders using various levels of prefabrication (including none); and (3) to determine whether a model based on the TPB, the TAM, and other control variables can explain builders' intentions to adopt prefabrication on their housing projects.
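
As a sketch of what a model for aim (3) might look like in its simplest form, the snippet below regresses a simulated intention score on the five standard TPB/TAM constructs; the construct names, sample values and effect sizes are all hypothetical, and the published analysis may use a structural equation model rather than plain OLS.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 454  # matches the survey size above; the data here are simulated
    X = rng.normal(size=(n, 5))  # attitude, norm, control, usefulness, ease
    intention = X @ np.array([0.4, 0.2, 0.3, 0.3, 0.1]) \
        + rng.normal(scale=0.8, size=n)

    fit = sm.OLS(intention, sm.add_constant(X)).fit()
    print(round(fit.rsquared, 3))  # share of intention variance explained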

Relevance: 80.00%

Abstract:

This thesis introduces a method of applying Bayesian networks to combine information from a range of data sources for effective decision support systems. It develops a set of techniques for the development, validation, visualisation, and application of complex-systems models, with a working demonstration in an Australian airport environment. The methods presented here provide a modelling approach that produces highly flexible, informative and applicable interpretations of a system's behaviour under uncertain conditions, together with measures of confidence in those interpretations. These end-to-end techniques are applied to the development of model-based dashboards that support operators and decision makers in the multi-stakeholder airport environment.
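
A toy sketch of the style of reasoning a Bayesian network supports: two noisy observations are combined into a posterior belief about a hidden system state. The airport variables (congestion, flight delay, long security queue) and all probabilities are invented for illustration.

    # P(congested), P(delay | congested), P(long queue | congested)
    p_congested = 0.2
    p_delay = {True: 0.6, False: 0.1}
    p_queue = {True: 0.7, False: 0.2}

    def posterior_congested(delay_obs, queue_obs):
        """P(congested | observations), by direct enumeration."""
        w = {}
        for c in (True, False):
            prior = p_congested if c else 1 - p_congested
            like = (p_delay[c] if delay_obs else 1 - p_delay[c]) \
                 * (p_queue[c] if queue_obs else 1 - p_queue[c])
            w[c] = prior * like
        return w[True] / (w[True] + w[False])

    print(posterior_congested(delay_obs=True, queue_obs=True))  # 0.84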

Relevance: 80.00%

Abstract:

Introduced predators can have pronounced effects on naïve prey species; thus, predator control is often essential for conservation of threatened native species. Complete eradication of the predator, although desirable, may be elusive in budget-limited situations, whereas predator suppression is more feasible and may still achieve conservation goals. We used a stochastic predator-prey model based on a Lotka-Volterra system to investigate the cost-effectiveness of predator control to achieve prey conservation. We compared five control strategies: immediate eradication, removal of a constant number of predators (fixed-number control), removal of a constant proportion of predators (fixed-rate control), removal of predators that exceed a predetermined threshold (upper-trigger harvest), and removal of predators whenever their population falls below a lower predetermined threshold (lower-trigger harvest). We looked at the performance of these strategies when managers could always remove the full number of predators targeted by each strategy, subject to budget availability. Under this assumption immediate eradication reduced the threat to the prey population the most. We then examined the effect of reduced management success in meeting removal targets, assuming removal is more difficult at low predator densities. In this case there was a pronounced reduction in performance of the immediate eradication, fixed-number, and lower-trigger strategies. Although immediate eradication still yielded the highest expected minimum prey population size, upper-trigger harvest yielded the lowest probability of prey extinction and the greatest return on investment (as measured by improvement in expected minimum population size per amount spent). Upper-trigger harvest was relatively successful because it operated when predator density was highest, which is when predator removal targets can be more easily met and the effect of predators on the prey is most damaging. This suggests that controlling predators only when they are most abundant is the "best" strategy when financial resources are limited and eradication is unlikely.
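
A minimal sketch (not the authors' model) of a stochastic Lotka-Volterra system with an upper-trigger harvest, in which predators are culled only when they exceed a threshold; all rates, thresholds and noise levels are hypothetical.

    import numpy as np

    def simulate(trigger=None, cull=0.5, years=50, seed=0):
        rng = np.random.default_rng(seed)
        prey, pred = 500.0, 50.0
        min_prey = prey
        for _ in range(years):
            growth = 0.5 * prey * (1 - prey / 1000)  # logistic prey growth
            predation = 0.01 * prey * pred
            prey = max(prey + growth - predation + rng.normal(0, 20), 0.0)
            pred = max(pred + 0.001 * prey * pred - 0.3 * pred
                       + rng.normal(0, 5), 0.0)
            if trigger is not None and pred > trigger:
                pred *= 1 - cull  # upper-trigger harvest
            min_prey = min(min_prey, prey)
        return min_prey  # minimum prey population as a risk measure

    print("no control :", round(simulate()))
    print("trigger=60 :", round(simulate(trigger=60)))

Culling at high predator density acts exactly when the predation term (prey x pred) is greatest, which is the intuition behind the upper-trigger result above.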

Relevance: 80.00%

Abstract:

Mode indicator functions (MIFs) are used in modal testing and analysis as a means of identifying modes of vibration, often as a precursor to modal parameter estimation. Various methods have been developed since the MIF was introduced four decades ago. These methods are quite useful in assisting the analyst to identify genuine modes and, in the case of the complex mode indicator function, have even been developed into modal parameter estimation techniques. Although the various MIFs are able to indicate the existence of a mode, they do not provide the analyst with any descriptive information about it. This paper uses the simple summation type of MIF to develop five averaged and normalised MIFs that provide the analyst with enough information to identify whether a mode is longitudinal, vertical, lateral or torsional. The first three functions, termed directional MIFs, have been noted in the literature in one form or another; however, this paper adds a new twist by introducing two further MIFs, termed torsional MIFs, that can be used to identify torsional modes and, moreover, can assist in determining whether a mode is of a pure torsion or sway type (i.e., having a rigid cross-section) or a distorted twisting type. The directional and torsional MIFs are tested on a finite element model-based simulation of an experimental modal test using an impact hammer. Results indicate that the directional and torsional MIFs are indeed useful in assisting the analyst to identify whether a mode is longitudinal, vertical, lateral, sway, or torsional.
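
A minimal sketch of the summation-type MIF and a directional variant computed from a set of frequency response functions (FRFs); the synthetic data and the normalisation are illustrative, and the paper's five functions will differ in detail.

    import numpy as np

    def summation_mif(H):
        """H: complex FRF array, shape (n_freq, n_dof)."""
        return np.abs(H).sum(axis=1)

    def directional_mif(H, dof_axes, axis):
        """Mean |FRF| over the DOFs measuring one axis, normalised by the
        all-DOF mean, so a mode's dominant direction stands out."""
        mask = np.asarray(dof_axes) == axis
        return np.abs(H[:, mask]).mean(axis=1) / np.abs(H).mean(axis=1)

    # synthetic FRFs: six DOFs, one resonance dominated by 'y' motion
    freqs = np.linspace(1, 100, 500)
    H = 1e-3 * np.random.default_rng(2).normal(size=(500, 6)) + 0j
    H[:, 2:4] += (1.0 / (50.0**2 - freqs**2 + 2j * freqs))[:, None]
    print(directional_mif(H, ['x', 'x', 'y', 'y', 'z', 'z'], 'y').max())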

Relevance: 80.00%

Abstract:

To investigate potentially dissociable recognition memory responses in the hippocampus and perirhinal cortex, fMRI studies have often used confidence ratings as an index of memory strength. Confidence ratings, although correlated with memory strength, also reflect sources of variability, including task-irrelevant item effects and differences both within and across individuals in terms of applying decision criteria to separate weak from strong memories. We presented words one, two, or four times at study in each of two different conditions, focused and divided attention, and then conducted separate fMRI analyses of correct old responses on the basis of subjective confidence ratings or estimates from single- versus dual-process recognition memory models. Overall, the effect of focussing attention on spaced repetitions at study manifested as enhanced recognition memory performance. Confidence- versus model-based analyses revealed disparate patterns of hippocampal and perirhinal cortex activity at both study and test and both within and across hemispheres. The failure to observe equivalent patterns of activity indicates that fMRI signals associated with subjective confidence ratings reflect additional sources of variability. The results are consistent with predictions of single-process models of recognition memory.
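
For concreteness, a single-process (equal-variance signal-detection) account treats memory strength as one continuous dimension, with confidence criteria layered on top; the sketch below computes the standard d' strength estimate from hit and false-alarm rates. The rates are invented, not the study's data.

    from scipy.stats import norm

    def d_prime(hit_rate, fa_rate):
        """Equal-variance signal-detection estimate of memory strength."""
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # hypothetical: four study presentations vs one, focused attention
    print(round(d_prime(0.85, 0.20), 2))  # ~1.88
    print(round(d_prime(0.65, 0.20), 2))  # ~1.23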

Relevance: 80.00%

Abstract:

Several common genetic variants have recently been discovered that appear to influence white matter microstructure, as measured by diffusion tensor imaging (DTI). Each genetic variant explains only a small proportion of the variance in brain microstructure, so we set out to explore their combined effect on the white matter integrity of the corpus callosum. We measured six common candidate single-nucleotide polymorphisms (SNPs) in the COMT, NTRK1, BDNF, ErbB4, CLU, and HFE genes, and investigated their individual and aggregate effects on white matter structure in 395 healthy adult twins and siblings (age: 20-30 years). All subjects were scanned with 4-tesla 94-direction high angular resolution diffusion imaging. When combined using mixed-effects linear regression, a joint model based on five of the candidate SNPs (COMT, NTRK1, ErbB4, CLU, and HFE) explained ∼ 6% of the variance in the average fractional anisotropy (FA) of the corpus callosum. This predictive model had detectable effects on FA at 82% of the corpus callosum voxels, including the genu, body, and splenium. Predicting the brain's fiber microstructure from genotypes may ultimately help in early risk assessment, and eventually, in personalized treatment for neuropsychiatric disorders in which brain integrity and connectivity are affected.
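
A minimal sketch (with simulated data) of the kind of joint additive model described: mean FA regressed on minor-allele counts at several candidate SNPs. Effect sizes and allele frequencies are hypothetical, and the paper used mixed-effects models to handle twin/sibling relatedness, which this sketch omits.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n, n_snps = 395, 5
    maf = np.array([0.3, 0.2, 0.4, 0.25, 0.15])   # hypothetical frequencies
    G = rng.binomial(2, maf, size=(n, n_snps))    # 0/1/2 minor-allele counts
    fa = 0.55 + G @ np.array([0.004, -0.003, 0.002, 0.003, -0.002]) \
        + rng.normal(scale=0.02, size=n)

    fit = sm.OLS(fa, sm.add_constant(G)).fit()
    print("variance explained:", round(fit.rsquared, 3))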

Relevance: 80.00%

Abstract:

In recent years, rapid advances in information technology have led to a variety of data collection systems enriching the sources of empirical data available for use in transport systems. Currently, traffic data are collected through various sensors, including loop detectors, probe vehicles, cell phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources will generally result in better accuracy, increased robustness and reduced ambiguity. Despite substantial advances in data assimilation techniques for reconstructing and predicting the traffic state from multiple data sources, such methods are generally data-driven and do not fully utilize the power of traffic models. Furthermore, existing methods are still limited to freeway networks and are not yet applicable in the urban context due to the increased complexity of flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, signalized or not, where the switching of traffic lights and the turning maneuvers of road users lead to shock waves that propagate upstream of the intersections. This paper develops a new model-based methodology for building a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly loop detectors and partial observations from Bluetooth and GPS devices.
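
To show the model side of such a model-based approach, here is one update step of a cell-transmission (discretised LWR) scheme for an arterial link, in which a red signal at the stop line creates a queue, i.e. a shock wave that grows upstream; all parameters are illustrative, and the paper's methodology is not reproduced here.

    import numpy as np

    def ctm_step(k, green, dt=1.0, dx=50.0, vf=15.0, w=5.0, kj=0.15,
                 qmax=0.5, q_in=0.3):
        """One cell-transmission update. k: density (veh/m) per cell."""
        demand = np.minimum(vf * k, qmax)        # flow each cell can send
        supply = np.minimum(w * (kj - k), qmax)  # flow each cell can take
        inflow = np.minimum(np.append(q_in, demand[:-1]), supply)
        out_last = demand[-1] if green else 0.0  # red light blocks the exit
        outflow = np.append(inflow[1:], out_last)
        return k + dt / dx * (inflow - outflow)

    k = np.full(10, 0.03)          # light traffic on a 10-cell link
    for _ in range(60):            # 60 s of red: queue grows upstream
        k = ctm_step(k, green=False)
    print(np.round(k, 3))          # densities rise toward kj at the stop line

In a model-based predictor, the sparse Bluetooth/GPS observations would be assimilated into the state k rather than used to drive a purely data-driven forecast.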

Relevance: 80.00%

Abstract:

A model based on the cluster-process representation of the self-exciting process models of White and Porter (2013) and Ruggeri and Soyer (2008) is derived to allow for variation in the excitation effects for terrorist events in a self-exciting, or cluster, process model. The details of the model derivation and implementation are given and applied to data from the Global Terrorism Database from 2000 to 2012. Results are discussed in terms of practical interpretation, along with implications for a theoretical model paralleling existing criminological theory.
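
A minimal sketch of the self-exciting intensity underlying such cluster models, assuming an exponential excitation kernel: the event rate is a baseline plus a decaying boost from each past event. Parameter values are illustrative, not estimates from the Global Terrorism Database.

    import numpy as np

    def hawkes_intensity(t, event_times, mu=0.1, alpha=0.5, beta=1.0):
        """lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i))."""
        past = np.asarray(event_times, dtype=float)
        past = past[past < t]
        return mu + alpha * np.exp(-beta * (t - past)).sum()

    events = [1.0, 1.5, 1.7, 6.0]  # a burst, then an isolated event
    for t in (2.0, 4.0, 7.0):
        print(t, round(hawkes_intensity(t, events), 3))

Letting alpha or beta vary by event, rather than holding them fixed, is the kind of variation in excitation effects the model above allows for.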

Relevance: 80.00%

Abstract:

The world has experienced a large increase in the amount of available data, which calls for better and more specialized tools for data storage, retrieval and information privacy. Electronic Health Record (EHR) systems have recently emerged to fulfil this need in health systems. They play an important role in medicine by granting access to information that can be used in medical diagnosis. Traditional systems focus on the storage and retrieval of this information, usually leaving issues related to privacy in the background. Doctors and patients may have different objectives when using an EHR system: patients try to restrict sensitive information in their medical records to avoid its misuse, while doctors want to see as much information as possible to ensure a correct diagnosis. One solution to this dilemma is the Accountable e-Health model, an access protocol model based on the Information Accountability Protocol, in which patients are notified when doctors access their restricted data, while authenticated doctors retain non-restrictive access. In this work we use FluxMED, an EHR system, and augment it with aspects of the Information Accountability Protocol to address these issues. Implementing the Information Accountability Framework (IAF) in FluxMED provides ways for both patients and physicians to have their privacy and access needs met. Issues related to storage and data security are handled by FluxMED, which contains mechanisms to ensure security and data integrity. The effort required to develop a platform for the management of medical information is mitigated by FluxMED's workflow-based architecture: the system is flexible enough to allow the type and amount of information to be altered without changes to its source code.
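
A toy sketch of the access pattern described: an authenticated doctor is not blocked from restricted data, but reading it triggers an accountability event, logging the access and notifying the patient. All names and structures here are invented; FluxMED's actual implementation will differ.

    import datetime

    audit_log = []

    def notify_patient(patient, event):
        print(f"notice to {patient}: {event['doctor']} read '{event['field']}'")

    def read_record(doctor, patient, field, records, restricted):
        if field in restricted.get(patient, set()):
            event = {"when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
                     "doctor": doctor, "patient": patient, "field": field}
            audit_log.append(event)  # access is accountable, not blocked
            notify_patient(patient, event)
        return records[patient][field]

    records = {"alice": {"allergies": "none", "psych_notes": "..."}}
    restricted = {"alice": {"psych_notes"}}
    print(read_record("dr_bob", "alice", "psych_notes", records, restricted))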

Relevance: 80.00%

Abstract:

If the land sector is to make significant contributions to mitigating anthropogenic greenhouse gas (GHG) emissions in coming decades, it must do so while concurrently expanding production of food and fiber. In our view, mathematical modeling will be required to provide scientific guidance to meet this challenge. In order to be useful in GHG mitigation policy measures, models must simultaneously meet scientific, software engineering, and human capacity requirements. They can be used to understand GHG fluxes, to evaluate proposed GHG mitigation actions, and to predict and monitor the effects of specific actions; the latter applications require a change in mindset that has parallels with the shift from research modeling to decision support. We compare and contrast six agro-ecosystem models (FullCAM, DayCent, DNDC, APSIM, WNMM, and AgMod), chosen because they are used in Australian agriculture and forestry. Underlying structural similarities in the representations of carbon flows through plants and soils in these models are complemented by a diverse range of emphases and approaches to the subprocesses within the agro-ecosystem. None of these agro-ecosystem models handles all land-sector GHG fluxes, and considerable model-based uncertainty exists for soil C fluxes and enteric methane emissions. The models also show diverse approaches to the initialisation of model simulations, software implementation, distribution, licensing, and software quality assurance; each of these will differentially affect their usefulness for policy-driven GHG mitigation prediction and monitoring. Specific requirements imposed on the use of models by Australian mitigation policy settings are discussed, and areas for further scientific development of agro-ecosystem models for use in GHG mitigation policy are proposed.
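
As an illustration of the structural core these models share, the sketch below steps a two-pool soil carbon model with first-order decay and a stabilisation transfer; the pool names, rates and transfer fraction are illustrative and not taken from any of the six models compared.

    def soil_c_step(fast, slow, litter_in, k_fast=0.3, k_slow=0.01,
                    to_slow=0.2, dt=1.0):
        """One annual step (t C/ha): litter enters the fast pool; part of
        fast-pool decay is stabilised into the slow pool, the rest respired."""
        d_fast = k_fast * fast * dt
        d_slow = k_slow * slow * dt
        fast = fast + litter_in * dt - d_fast
        slow = slow + to_slow * d_fast - d_slow
        co2 = (1 - to_slow) * d_fast + d_slow  # respired to the atmosphere
        return fast, slow, co2

    fast, slow = 10.0, 50.0  # hypothetical initial stocks
    for year in range(3):
        fast, slow, co2 = soil_c_step(fast, slow, litter_in=3.0)
        print(year, round(fast, 2), round(slow, 2), round(co2, 2))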

Relevance: 80.00%

Abstract:

The spontaneous emission (SE) of a quantum emitter depends mainly on the transition strength between the upper and lower energy levels as well as the local density of states (LDOS) [1]. When a quantum dot (QD) is placed near a plasmon waveguide, the LDOS of the QD is increased because non-radiative decay and a plasmonic decay channel are added to free-space emission [2-4]. The slow velocity and dramatic concentration of the electric field of the plasmon can capture the majority of the SE into the guided plasmon mode (Γpl). This paper focuses on the effect of waveguide height on the efficiency of coupling QD decay into the plasmon mode, using a numerical model based on the finite element method (FEM). The symmetric gap waveguide considered in this paper supports a single mode, with the QD treated as a dipole emitter. 2D simulation models are used to find the normalized Γpl, and 3D models are used to find the probability of SE decaying into the plasmon mode (β), including all three decay channels. It is found that changing the gap height can increase QD-plasmon coupling by up to a factor of 5, and by up to a factor of 8 for an optimally placed QD. We also briefly study the effect of the sharpness of the waveguide edge on SE emission into the guided plasmon mode. Preliminary nano-gap waveguide fabrication and testing are already underway, and the authors expect to compare the theoretical results with experimental outcomes in the future.
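
For reference, the quantity β reported above is simply the plasmon channel's share of the total decay rate; a one-line sketch follows, with hypothetical rates normalised to the free-space emission rate.

    def beta_factor(gamma_pl, gamma_rad, gamma_nonrad):
        """beta = plasmon decay rate / total decay rate."""
        return gamma_pl / (gamma_pl + gamma_rad + gamma_nonrad)

    print(beta_factor(gamma_pl=8.0, gamma_rad=1.0, gamma_nonrad=2.0))  # ~0.73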