116 results for ENCEFALITIS MARGINAL


Relevance:

10.00%

Publisher:

Abstract:

Introduction: ‘Store and forward’ teledermoscopy is a technology with potential advantages for melanoma screening. Any large-scale implementation of this technology is dependent on consumer acceptance.

Aim: To investigate preferences for melanoma screening options compared to skin self-examination in adults considered to be at increased risk of developing skin cancer.

Methods: A discrete choice experiment (DCE) was completed by 35 consumers, all of whom had prior experience with the use of teledermoscopy, in Queensland, Australia. Participants made 12 choices between screening alternatives described by seven attributes including monetary cost. A mixed logit model was used to estimate the relative weights that consumers place on different aspects of screening, along with the marginal willingness to pay for teledermoscopy as opposed to screening at a clinic.

Results: Overall, participants preferred screening/diagnosis by a health professional to skin self-examination. Key drivers of screening choice were results being reviewed by a dermatologist; a higher detection rate; fewer non-cancerous moles removed for every skin cancer detected; and less time spent away from usual activities. On average, participants were willing to pay AU$110 to have teledermoscopy with dermatologist review available to them as a screening option.

Discussion & Conclusions: Consumers preferentially value aspects of care that are more feasible with a teledermoscopy screening model, as compared to other skin cancer screening and diagnosis options. This study adds to previous literature in the area, which has relied on consumer satisfaction scales to assess the acceptability of teledermoscopy.
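
In a mixed logit DCE, the marginal willingness to pay for an attribute is commonly computed as the negative ratio of the attribute's utility coefficient to the cost coefficient. A minimal sketch of that calculation, with made-up coefficients chosen only so the ratio reproduces the AU$110 figure reported above:

```python
# Hypothetical mixed logit estimates; the values are assumptions chosen only so
# the ratio reproduces the AU$110 reported in the abstract.
beta_teledermoscopy = 0.55   # utility of dermatologist-reviewed teledermoscopy
beta_cost = -0.005           # disutility per AU$ of out-of-pocket cost

# Marginal WTP: the cost change that exactly offsets a unit change in the attribute.
wtp = -beta_teledermoscopy / beta_cost
print(f"Marginal WTP: AU${wtp:.0f}")   # AU$110 under these assumed coefficients
```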

Relevance:

10.00%

Publisher:

Abstract:

In this paper we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method incorporates an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel pseudo-marginal algorithm, which we refer to as alive SMC^2. The advantages of this approach over competing approaches are that it is naturally adaptive, it does not involve the between-model proposals required in reversible jump Markov chain Monte Carlo, and it does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series, and the cumulative number of prion disease cases in mule deer.
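
For exactly observed low counts, the "alive" idea can be sketched as follows: particles are proposed until enough of them match the next observation, and a negative binomial argument turns the number of proposals into an unbiased per-step likelihood estimate. The toy transition kernel below is an assumption for illustration, not one of the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_step(x):
    # Toy birth-death transition; stands in for an intractable kernel that we
    # can simulate from but not evaluate pointwise.
    return max(int(x) + rng.poisson(0.5) - rng.binomial(int(x), 0.3), 0)

def alive_particle_filter_loglik(y, n=100, max_trials=100_000):
    """Unbiased likelihood estimate for an exactly observed count series y."""
    particles = np.full(n + 1, y[0])      # condition on the first observation
    loglik = 0.0
    for t in range(1, len(y)):
        alive, trials = [], 0
        # Propose until n + 1 particles hit y[t]; sampling to a fixed number of
        # "successes" makes n / (trials - 1) an unbiased per-step estimate
        # (a negative binomial argument).
        while len(alive) < n + 1:
            trials += 1
            if trials > max_trials:
                return -np.inf            # filter died: observation unreachable
            x = simulate_step(rng.choice(particles))
            if x == y[t]:
                alive.append(x)
        loglik += np.log(n / (trials - 1))
        particles = np.array(alive[:n])   # drop the extra particle
    return loglik

print(alive_particle_filter_loglik([2, 3, 2, 2, 4]))
```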

Relevance:

10.00%

Publisher:

Abstract:

Although tactical voting attracts a great deal of attention, it is very hard to measure, as it requires knowledge of both individuals’ voting choices and their unobserved preferences. In this article, we present a simple empirical strategy to nonparametrically identify tactical voting patterns directly from balloting results. This approach allows us to study the magnitude and direction of strategic voting and to verify which information voters and parties take into account to determine marginal constituencies. We show that tactical voting played a significant role in the 2010 election, mainly for Liberal Democrat voters supporting Labour. Moreover, our results suggest that voters form their expectations based on a national swing in vote shares rather than on newspaper guides published in the main media outlets or on previous election outcomes. We also present some evidence suggesting that campaign spending is not driving tactical voting.

Relevance:

10.00%

Publisher:

Abstract:

Conceptual combination plays a fundamental role in creating the broad range of compound phrases utilised in everyday language. While the systematicity and productivity of language provide a strong argument in favour of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and philosophy. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered a function of the semantics of the constituent concepts, or not. Rather than adjudicating between different grades of compositionality, the framework presented here contributes formal methods for determining a clear dividing line between compositional and non-compositional semantics. Compositionality is equated with a joint probability distribution modelling how the constituent concepts in the combination are interpreted. Marginal selectivity is emphasised as a pivotal probabilistic constraint for the application of the Bell/CH and CHSH systems of inequalities (referred to collectively as Bell-type). Non-compositionality is then equated with either a failure of marginal selectivity, or, in the presence of marginal selectivity, with a violation of Bell-type inequalities. In both non-compositional scenarios, the conceptual combination cannot be modelled using a joint probability distribution with variables corresponding to the interpretation of the individual concepts. The framework is demonstrated by applying it to an empirical scenario of twenty-four non-lexicalised conceptual combinations.
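
The two-step test described above can be sketched concretely: given an empirically estimated joint table of binary interpretations for each pairing of contexts, first check marginal selectivity, then check a CHSH inequality. The probability tables below are invented for illustration:

```python
import numpy as np

# p[(i, j)][a, b]: probability of interpretations (a, b) when concept 1 is probed
# in context i and concept 2 in context j; rows/cols ordered (+1, -1).
# Invented numbers, not the paper's data.
p = {
    (0, 0): np.array([[0.45, 0.05], [0.05, 0.45]]),
    (0, 1): np.array([[0.40, 0.10], [0.10, 0.40]]),
    (1, 0): np.array([[0.40, 0.10], [0.10, 0.40]]),
    (1, 1): np.array([[0.05, 0.45], [0.45, 0.05]]),
}

def marginals(tab):
    return tab.sum(axis=1), tab.sum(axis=0)   # concept-1, concept-2 marginals

# Marginal selectivity: each concept's marginal must not depend on the other
# concept's context.
ms = (np.allclose(marginals(p[(0, 0)])[0], marginals(p[(0, 1)])[0]) and
      np.allclose(marginals(p[(1, 0)])[0], marginals(p[(1, 1)])[0]) and
      np.allclose(marginals(p[(0, 0)])[1], marginals(p[(1, 0)])[1]) and
      np.allclose(marginals(p[(0, 1)])[1], marginals(p[(1, 1)])[1]))

def corr(tab):
    # E[AB] with outcomes coded +1/-1.
    return (np.array([[1, -1], [-1, 1]]) * tab).sum()

chsh = abs(corr(p[(0, 0)]) + corr(p[(0, 1)]) + corr(p[(1, 0)]) - corr(p[(1, 1)]))

if not ms:
    print("non-compositional: marginal selectivity fails")
elif chsh > 2:
    print(f"non-compositional: CHSH = {chsh:.2f} > 2")
else:
    print("consistent with a compositional (joint-distribution) model")
```

With these made-up tables, marginal selectivity holds but CHSH = 2.80 > 2, so the combination would be classed as non-compositional under the second criterion.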

Relevance:

10.00%

Publisher:

Abstract:

The study examines the property value impacts of the announcement of a project with potential environmental impacts, as distinct from other studies that address costs associated with the construction and operating phases of developments. The hypothesis is that the announcement of a proposed project with potential environmental impact creates uncertainty in the property market of the affected area, and that this impact is greater on properties closer to the project than on those farther from it. The results of the study confirm the hypothesis and indicate that, after the proposed route was announced, the marginal willingness to pay for properties within a 5 km distance declined by AU$17,020 for each kilometre closer to the proposed heavy vehicle route. The results support the need for more holistic measurement in cost–benefit analysis of projects and provide a basis for improved consideration by policy makers of the rights of affected parties.
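
A minimal sketch of the kind of hedonic difference-in-differences regression such a study implies, where the announcement-by-distance interaction captures the change in marginal willingness to pay per kilometre; the dataset and column names are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical sales records for properties within 5 km of the proposed route.
sales = pd.read_csv("property_sales.csv")

# price: sale price; dist_km: distance to the proposed route;
# post_announcement: 1 if the sale occurred after the announcement.
model = smf.ols(
    "price ~ bedrooms + land_area + dist_km * post_announcement",
    data=sales,
).fit()

# A positive interaction coefficient means prices rise with distance from the
# route after the announcement, i.e. closer properties lost value.
print(model.params["dist_km:post_announcement"])
```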

Relevance:

10.00%

Publisher:

Abstract:

Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to ‘catastrophic fusion’ in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for absolute spatial distance threshold parameters as required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
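
A minimal sketch of such a log-marginal-likelihood consistency test using scikit-learn's Gaussian process regressor; the per-point averaging and the synthetic data are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def avg_lml(X, y):
    gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.01)).fit(X, y)
    # Average per point so models with different numbers of points are comparable.
    return gp.log_marginal_likelihood_value_ / len(y)

def is_consistent(X, y, x_new, y_new):
    """Fuse the new range reading only if the model statistically improves."""
    return avg_lml(np.vstack([X, x_new]), np.append(y, y_new)) >= avg_lml(X, y)

# Synthetic laser returns along a 1-D slice of an object surface.
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel() + 0.01 * np.random.default_rng(0).normal(size=20)

print(is_consistent(X, y, np.array([[0.5]]), y_new=0.0))   # plausible reading
print(is_consistent(X, y, np.array([[0.5]]), y_new=3.0))   # spatially proximal outlier
```

Note that no absolute distance threshold appears anywhere: the decision is relative to how well the existing model already explains the data.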

Relevance:

10.00%

Publisher:

Abstract:

The inverse temperature hyperparameter of the hidden Potts model governs the strength of spatial cohesion and therefore has a substantial influence on the resulting model fit. The difficulty arises from the dependence of an intractable normalising constant on the value of the inverse temperature, so there is no closed-form solution for sampling from the distribution directly. We review three computational approaches to addressing this issue, namely pseudolikelihood, path sampling, and the approximate exchange algorithm. We compare the accuracy and scalability of these methods using a simulation study.
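
A minimal sketch of the approximate exchange algorithm for the inverse temperature of a (here directly observed) Potts field: the auxiliary variable is drawn by a short Gibbs run, which is the "approximate" part, and the intractable normalising constants cancel in the acceptance ratio. Lattice size, prior, and tuning values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
Q, N = 3, 24                              # Potts labels; lattice side length

def suff_stat(z):
    # S(z): number of like-labelled horizontal + vertical neighbour pairs.
    return int(np.sum(z[:, :-1] == z[:, 1:]) + np.sum(z[:-1, :] == z[1:, :]))

def neighbours_agree(z, i, j, q):
    return sum(z[a, b] == q
               for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
               if 0 <= a < N and 0 <= b < N)

def gibbs_draw(beta, sweeps=10):
    # Approximate draw from the Potts model at inverse temperature beta.
    z = rng.integers(Q, size=(N, N))
    for _ in range(sweeps):
        for i in range(N):
            for j in range(N):
                w = np.exp([beta * neighbours_agree(z, i, j, q) for q in range(Q)])
                z[i, j] = rng.choice(Q, p=w / w.sum())
    return z

def exchange_step(beta, z_obs, step=0.05, beta_max=1.5):
    beta_new = beta + rng.normal(0, step)
    if not 0 <= beta_new <= beta_max:     # uniform prior on [0, beta_max]
        return beta
    w = gibbs_draw(beta_new)              # auxiliary draw; Z(beta) terms cancel
    log_ratio = (beta_new - beta) * (suff_stat(z_obs) - suff_stat(w))
    return beta_new if np.log(rng.uniform()) < log_ratio else beta

z_obs = gibbs_draw(0.8)                   # stand-in "observed" labelling
beta, trace = 0.5, []
for _ in range(200):
    beta = exchange_step(beta, z_obs)
    trace.append(beta)
print(np.mean(trace[100:]))               # crude posterior mean for beta
```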

Relevance:

10.00%

Publisher:

Abstract:

Traffic law enforcement sanctions can impact road user behaviour through general and specific deterrence mechanisms. The manner in which specific deterrence can influence recidivist behaviour can be conceptualised in different ways. While any reduction in speeding will have road safety benefits, the way in which a ‘reduction’ is determined deserves greater methodological attention and has implications for countermeasure evaluation more generally. The primary aim of this research was to assess the specific deterrent impact of the 2003 increases in penalties for speeding offences in Queensland, Australia, on two cohorts of drivers detected for speeding before and after the penalty changes. Since the literature is relatively silent on how to assess recidivism in the speeding context, the secondary research aim was to contribute to the literature regarding ways to conceptualise and measure specific deterrence in the speeding context. We propose a novel way of operationalising four measures which reflect different ways in which a specific deterrence effect could be conceptualised: (1) the proportion of offenders who re-offended in the follow-up period; (2) the overall frequency of re-offending in the follow-up period; (3) the length of delay to re-offence among those who re-offended; and (4) the average number of re-offences during the follow-up period among those who re-offended. Consistent with expectations, results suggested an absolute deterrent effect of the penalty changes, as evidenced by significant reductions in the proportion of drivers who re-offended and in the overall frequency of re-offending, although effect sizes were small. Contrary to expectations, however, there was no evidence of a marginal specific deterrent effect among those who re-offended: the length of time to re-offence decreased significantly and there was no significant change in the average number of offences committed. Additional exploratory analyses investigating potential influences of the severity of the index offence, offence history, and method of detection revealed mixed results. Access to additional data from various sources suggested that the main findings were not influenced by changes in speed enforcement activity, public awareness of penalty changes, or driving exposure during the study period. Study limitations and recommendations for future research are discussed with a view to promoting more extensive evaluations of penalty changes and a better understanding of how such changes may impact motorists’ perceptions of enforcement and sanctions, as well as recidivist behaviour.
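
A minimal sketch of how the four measures above could be operationalised from raw offence records; the file and column names are hypothetical, and the follow-up window is assumed to coincide with the data's extent:

```python
import pandas as pd

# Hypothetical offence records: one row per detected speeding offence.
offences = pd.read_csv("speeding_offences.csv", parse_dates=["offence_date"])

index_date = offences.groupby("driver_id")["offence_date"].min()   # index offence
df = offences.join(index_date.rename("index_date"), on="driver_id")
reoff = df[df["offence_date"] > df["index_date"]]                  # re-offences only

n_drivers = offences["driver_id"].nunique()
per_driver = reoff.groupby("driver_id")["offence_date"]
first_re = per_driver.min()

prop_reoffended = len(first_re) / n_drivers                        # measure (1)
overall_frequency = len(reoff) / n_drivers                         # measure (2)
delay_days = (first_re - index_date.loc[first_re.index]).dt.days.mean()  # measure (3)
mean_reoffences = per_driver.size().mean()                         # measure (4)
```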

Relevance:

10.00%

Publisher:

Abstract:

The climate in the Arctic is changing faster than anywhere else on earth. Poorly understood feedback processes relating to Arctic clouds and aerosol–cloud interactions contribute to a poor understanding of the present changes in the Arctic climate system, and also to a large spread in projections of future climate in the Arctic. The problem is exacerbated by the paucity of research-quality observations in the central Arctic. Improved formulations in climate models require such observations, which can only come from measurements in situ in this difficult-to-reach region with logistically demanding environmental conditions. The Arctic Summer Cloud Ocean Study (ASCOS) was the most extensive central Arctic Ocean expedition with an atmospheric focus during the International Polar Year (IPY) 2007–2008. ASCOS focused on the study of the formation and life cycle of low-level Arctic clouds. ASCOS departed from Longyearbyen on Svalbard on 2 August and returned on 9 September 2008. In transit into and out of the pack ice, four short research stations were undertaken in the Fram Strait: two in open water and two in the marginal ice zone. After traversing the pack ice northward, an ice camp was set up on 12 August at 87°21' N, 01°29' W and remained in operation through 1 September, drifting with the ice. During this time, extensive measurements were taken of atmospheric gas and particle chemistry and physics, mesoscale and boundary-layer meteorology, marine biology and chemistry, and upper ocean physics. ASCOS provides a unique interdisciplinary data set for development and testing of new hypotheses on cloud processes, their interactions with the sea ice and ocean and associated physical, chemical, and biological processes and interactions. For example, the first-ever quantitative observation of bubbles in Arctic leads, combined with the unique discovery of marine organic material, polymer gels with an origin in the ocean, inside cloud droplets suggests the possibility of primary marine organically derived cloud condensation nuclei in Arctic stratocumulus clouds. Direct observations of surface fluxes of aerosols could, however, not explain observed variability in aerosol concentrations, and the balance between local and remote aerosol sources remains open. A lack of cloud condensation nuclei (CCN) was at times a controlling factor in low-level cloud formation, and hence for the impact of clouds on the surface energy budget. ASCOS provided detailed measurements of the surface energy balance from late summer melt into the initial autumn freeze-up, and documented the effects of clouds and storms on the surface energy balance during this transition. In addition to such process-level studies, the unique, independent ASCOS data set can be, and is being, used for validation of satellite retrievals, operational models, and reanalysis data sets.

Relevance:

10.00%

Publisher:

Abstract:

Reconstructing 3D motion data is highly under-constrained due to several common sources of data loss during measurement, such as projection, occlusion, or miscorrespondence. We present a statistical model of 3D motion data, based on the Kronecker structure of the spatiotemporal covariance of natural motion, as a prior on 3D motion. This prior is expressed as a matrix normal distribution, composed of separable and compact row and column covariances. We relate the marginals of the distribution to the shape, trajectory, and shape-trajectory models of prior art. When the marginal shape distribution is not available from training data, we show how placing a hierarchical prior over shapes results in a convex MAP solution in terms of the trace-norm. The matrix normal distribution, fit to a single sequence, outperforms state-of-the-art methods at reconstructing 3D motion data in the presence of significant data loss, while providing covariance estimates of the imputed points.
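
A minimal sketch of the modelling ingredient described above: a matrix normal distribution over a motion sequence (rows = frames, columns = stacked 3D coordinates) with separable row and column covariances, fit to a single sequence by the standard "flip-flop" maximum likelihood iteration. The toy data, ridge terms, and scale normalisation are assumptions:

```python
import numpy as np
from scipy.stats import matrix_normal

rng = np.random.default_rng(0)
T, D = 50, 9                                    # 50 frames, 3 points x 3 coordinates
M = np.cumsum(rng.normal(size=(T, D)), axis=0)  # toy motion-like data

mean = np.tile(M.mean(axis=0), (T, 1))
R = M - mean
U, V = np.eye(T), np.eye(D)                     # row (time) / column (shape) covariances
for _ in range(20):                             # flip-flop ML updates
    U = R @ np.linalg.solve(V, R.T) / D + 1e-6 * np.eye(T)
    V = R.T @ np.linalg.solve(U, R) / T + 1e-6 * np.eye(D)
    V *= D / np.trace(V)                        # fix the U/V scale ambiguity

print(matrix_normal(mean=mean, rowcov=U, colcov=V).logpdf(M))
```

The Kronecker (separable) structure is what keeps the fit compact: only a T x T and a D x D covariance are estimated instead of a full TD x TD one.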

Relevance:

10.00%

Publisher:

Abstract:

Purpose We sought to analyse clinical and oncological outcomes of patients after guided resection of periacetabular tumours and endoprosthetic reconstruction of the remaining defect. Methods From 1988 to 2008, we treated 56 consecutive patients (mean age 52.5 years, 41.1 % women). Patients were followed up either until death or February 2011 (mean follow-up 5.5 years, range 0.1–22.5, standard deviation ± 5.3). Kaplan–Meier analysis was used to estimate survival rates. Results Disease-specific survival was 59.9 % at five years and 49.7 % at both ten and 20 years. Wide resection margins were achieved in 38 patients, whereas 11 patients underwent marginal resection and seven intralesional resection. Survival was significantly better in patients with wide or marginal resection than in patients with intralesional resection (p = 0.022). Survival for patients with secondary tumours was significantly worse than for patients with primary tumours (p = 0.003). In 29 patients (51.8 %), at least one reoperation was necessary, resulting in a revision-free survival of 50.5 % at five years, 41.1 % at ten years and 30.6 % at 20 years. Implant survival was 77.0 % at five years, 68.6 % at ten years and 51.8 % at 20 years. A total of 35 patients (62.5 %) experienced one or more complications after surgery. Ten of 56 patients (17.9 %) experienced local recurrence after a mean of 8.9 months. The mean postoperative Musculoskeletal Tumor Society (MSTS) score was 18.1 (60.1 %). Conclusion The surgical approach assessed in this study simplifies the process of tumour resection and prosthesis implantation and leads to acceptable clinical and oncological outcomes.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we analyse two variants of the SIMON family of lightweight block ciphers against variants of linear cryptanalysis and present the best linear cryptanalytic results on these variants of reduced-round SIMON to date. We propose a time-memory trade-off method that finds differential/linear trails for any permutation allowing low Hamming weight differential/linear trails. Our method combines low Hamming weight trails found via the correlation matrix representing the target permutation with heavy Hamming weight trails found using a Mixed Integer Programming model representing the target differential/linear trail. Our method enables us to find a 17-round linear approximation for SIMON-48, which is the best current linear approximation for SIMON-48. Using only the correlation matrix method, we are able to find a 14-round linear approximation for SIMON-32, which is also the current best linear approximation for SIMON-32. The presented linear approximations allow us to mount a 23-round key recovery attack on SIMON-32 and a 24-round key recovery attack on SIMON-48/96, which are the current best results on SIMON-32 and SIMON-48. In addition, we present an attack on 24 rounds of SIMON-32 with marginal complexity.
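
A toy sketch of the correlation idea underlying the trail search: estimate the correlation of a one-round linear approximation of the SIMON-32 round function by sampling. The masks below illustrate a single AND-gate approximation with correlation ±1/2 and are not the paper's 14- or 17-round approximations:

```python
import random

W = 16                                   # SIMON-32 word size
MASK = (1 << W) - 1

def rol(x, r):
    return ((x << r) | (x >> (W - r))) & MASK

def simon_round(x, y, k):
    # One SIMON round: (x, y) -> (y ^ f(x) ^ k, x), f(x) = (x<<<1 & x<<<8) ^ (x<<<2).
    return (y ^ (rol(x, 1) & rol(x, 8)) ^ rol(x, 2) ^ k) & MASK, x

def parity(v):
    return bin(v).count("1") & 1

def correlation(in_mask, out_mask, k=0x1234, samples=100_000):
    # corr = 2 * Pr[<in_mask,(x,y)> == <out_mask,(u,v)>] - 1
    agree = 0
    for _ in range(samples):
        x, y = random.getrandbits(W), random.getrandbits(W)
        u, v = simon_round(x, y, k)
        agree += parity((in_mask[0] & x) ^ (in_mask[1] & y)) == \
                 parity((out_mask[0] & u) ^ (out_mask[1] & v))
    return 2 * agree / samples - 1

# Approximate output bit u_2 by y_2 ^ x_0: the dropped AND term (x_1 & x_10) is 0
# with probability 3/4, so the correlation is +/- 1/2 depending on key bit k_2.
print(correlation(in_mask=(0x0001, 0x0004), out_mask=(0x0004, 0x0000)))
```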

Relevance:

10.00%

Publisher:

Abstract:

South Africa is an emerging and industrializing economy which is experiencing remarkable progress. We contend that, amidst the developments in the economy, the roles of energy, trade openness and financial development are critical. In this article, we revisit the pivotal role of these factors. We use the ARDL bounds test [72], the Bayer and Hanck [11] cointegration technique, and an extended Cobb–Douglas framework to examine the long-run association with output per worker over the sample period 1971–2011. The results support a long-run association between output per worker, capital per worker and the shift parameters. The short-run elasticity coefficients are as follows: energy (0.24), trade (0.07), and financial development (−0.03). In the long run, the elasticity coefficients are: trade openness (0.05), energy (0.29), and financial development (−0.04). In both the short run and the long run, we note that the post-2000 period has a marginal positive effect on the economy. The Toda and Yamamoto [91] Granger causality results show unidirectional causality from capital stock and energy consumption to output and from capital stock to trade openness; bidirectional causality between trade openness and output; and an absence (neutrality) of causality between financial development and output, indicating that these two variables evolve independently of each other.
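
A plausible log-linear form of the extended Cobb–Douglas specification implied above, shown with the reported long-run elasticities; the variable symbols and the post-2000 shift dummy notation are assumptions:

```latex
% Output per worker (y) as a function of capital per worker (k) and the shift
% parameters: energy (e), trade openness (tr), financial development (fd),
% and a post-2000 dummy D_t. Coefficients are the reported long-run estimates.
\ln y_t = \alpha + \beta \ln k_t + 0.29\,\ln e_t + 0.05\,\ln tr_t
          - 0.04\,\ln fd_t + \gamma\, D_t^{\mathrm{post\text{-}2000}} + \varepsilon_t
```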

Relevance:

10.00%

Publisher:

Abstract:

Big Data and predictive analytics have received significant attention from the media and academic literature throughout the past few years, and it is likely that these emerging technologies will materially impact the mining sector. This short communication argues, however, that these technological forces will probably unfold differently in the mining industry than they have in many other sectors because of significant differences in the marginal cost of data capture and storage. To this end, we offer a brief overview of what Big Data and predictive analytics are, and explain how they are bringing about changes in a broad range of sectors. We discuss the “N=all” approach to data collection being promoted by many consultants and technology vendors in the marketplace but, by considering the economic and technical realities of data acquisition and storage, we then explain why a “n « all” data collection strategy probably makes more sense for the mining sector. Finally, towards shaping the industry’s policies with regard to technology-related investments in this area, we conclude by putting forward a conceptual model for leveraging Big Data tools and analytical techniques that is a more appropriate fit for the mining sector.

Relevance:

10.00%

Publisher:

Abstract:

An open-label, inpatient study was undertaken to compare the efficacy of two oral rehydration solutions (ORS) given randomly to children aged 1-10 years who had acute gastroenteritis with mild or moderate dehydration (n = 45). One solution contained 60 mmol/L sodium and 1.8% glucose, total osmolality 240 mOsm/L (Gastrolyte, Rhône-Poulenc Rorer), and the other contained 26 mmol/L sodium, 2.7% glucose and 3.6% sucrose, total osmolality 340 mOsm/L (Glucolyte, Gilseal). Analysis of the data indicated that Gastrolyte therapy resulted in significantly fewer episodes of vomiting and lower vomit volume across all time periods in comparison with Glucolyte, and significantly lower stool volume during the first 8 h and over the 0-24 h period. The differences between treatments in degree of dehydration at each follow-up period, duration of diarrhoea, and duration of hospital stay were not significant. No adverse drug reactions occurred. Six patients received intravenous rehydration treatment and were considered treatment failures. We conclude that oral rehydration therapy is safe and efficacious in the management of dehydration in acute diarrhoea and that the lower-osmolarity rehydration solution has clinically marginal advantages.