941 results for Non-commutative Landau problem


Relevance:

30.00%

Publisher:

Abstract:

Demographic characteristics associated with gambling participation and problem gambling severity were investigated in a stratified random survey in Tasmania, Australia. Computer-assisted telephone interviews were conducted in March 2011, resulting in a representative sample of 4,303 Tasmanian residents aged 18 years or older. Overall, 64.8 % of Tasmanian adults reported participating in some form of gambling in the previous 12 months. The most common forms of gambling were lotteries (46.5 %), keno (24.3 %), instant scratch tickets (24.3 %), and electronic gaming machines (20.5 %). Gambling severity rates were estimated at non-gambling (34.8 %), non-problem gambling (57.4 %), low risk gambling (5.3 %), moderate risk gambling (1.8 %), and problem gambling (0.7 %). Compared to Tasmanian gamblers as a whole, significantly higher annual participation rates were reported by couples with no children, those in full-time paid employment, and people who did not complete secondary school. Compared to Tasmanian gamblers as a whole, significantly higher gambling frequencies were reported by males, people aged 65 or older, and people who were on pensions or were unable to work. Compared to Tasmanian gamblers as a whole, significantly higher gambling expenditure was reported by males. The highest average expenditure was for horse and greyhound racing ($AUD 1,556), double that of the next highest gambling activity, electronic gaming machines ($AUD 767). Compared to Tasmanian gamblers as a whole, problem gamblers were significantly younger, in paid employment, reported lower incomes, and were born in Australia. Although gambling participation rates appear to be falling, problem gambling severity rates remain stable. These changes appear to reflect a maturing gambling market and the need for population-specific harm minimisation strategies. © 2014 Springer Science+Business Media New York.

Relevance:

30.00%

Publisher:

Abstract:

The use of sampling, randomized algorithms, or training based on the unpredictable inputs of users in Information Retrieval often leads to non-deterministic outputs. Evaluating the effectiveness of systems incorporating these methods can be challenging since each run may produce different effectiveness scores. Current IR evaluation techniques do not address this problem. Using the context of distributed information retrieval as a case study for our investigation, we propose a solution based on multivariate linear modeling. We show that the approach provides a consistent and reliable method to compare the effectiveness of non-deterministic IR algorithms, and explain how statistics can safely be used to show that two IR algorithms have equivalent effectiveness. Copyright 2014 ACM.
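
The abstract does not give the exact model specification, so the following is only an illustrative sketch of the general idea, using hypothetical data: repeated runs of two non-deterministic systems over a topic set are pooled into a linear model with system and topic effects, and the system coefficient summarizes the mean effectiveness difference.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical scores: 2 systems x 50 topics x 10 runs; the run-to-run noise
# models the non-determinism described in the abstract.
rng = np.random.default_rng(0)
rows = []
for topic in range(50):
    base = rng.uniform(0.2, 0.6)                  # topic difficulty
    for system in ("A", "B"):
        for run in range(10):
            effect = 0.02 if system == "B" else 0.0
            rows.append({"topic": topic, "system": system,
                         "score": base + effect + rng.normal(0, 0.03)})
df = pd.DataFrame(rows)

# Linear model with system and topic effects; the system coefficient
# estimates the mean difference in effectiveness across runs and topics.
fit = smf.ols("score ~ C(system) + C(topic)", data=df).fit()
print(fit.params["C(system)[T.B]"], fit.pvalues["C(system)[T.B]"])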

Relevance:

30.00%

Publisher:

Abstract:

Aims: To develop and evaluate a screening tool to identify people with diabetes at increased risk of medication problems relating to hypoglycaemia and medication non-adherence. Methods: A retrospective audit of attendances at a diabetes outpatient clinic at a public teaching hospital over a 16-month period was conducted. Logistic regression was undertaken to examine risk factors associated with medication problems relating to hypoglycaemia and medication non-adherence, and the most predictive set of factors comprises the Diabetes Medication Risk Screening Tool. Evaluating the tool involved assessing sensitivity and specificity, positive and negative predictive values, cut-off scores, inter-rater reliability, and content validity. Results: The Diabetes Medication Risk Screening Tool comprises seven predictive factors: age, living alone, English language, mental and behavioural problems, comorbidity index score, number of medications prescribed, and number of high-risk medications prescribed. The tool has 76.5% sensitivity, 59.5% specificity, a 65.1% positive predictive value, and a 71.8% negative predictive value. A score of 27 or more out of 62 was associated with high risk of a medication problem. The inter-rater reliability of the tool was high (κ = 0.79, 95% CI 0.75-0.84) and the content validity index was 99.4%. Conclusion: The Diabetes Medication Risk Screening Tool has good psychometric properties and can proactively identify people with diabetes at greatest risk of medication problems relating to hypoglycaemia and medication non-adherence.
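
For reference, the reported diagnostic properties are simple functions of a 2x2 classification table; the sketch below, with hypothetical counts (not the audit's data), shows how sensitivity, specificity, and the predictive values are computed.

def screening_metrics(tp, fp, tn, fn):
    """Diagnostic properties of a screening tool from a 2x2 table
    (true/false positives and negatives)."""
    sensitivity = tp / (tp + fn)      # flagged among those with a problem
    specificity = tn / (tn + fp)      # not flagged among those without
    ppv = tp / (tp + fp)              # positive predictive value
    npv = tn / (tn + fn)              # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for a cut-off of 27 out of 62, for illustration only.
print(screening_metrics(tp=130, fp=70, tn=120, fn=40))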

Relevance:

30.00%

Publisher:

Abstract:

Retrieval systems with non-deterministic output are widely used in information retrieval. Common examples include sampling, approximation algorithms, and interactive user input. The effectiveness of such systems differs not just across topics, but also across instances of the same system. The inherent variance presents a dilemma: what is the best way to measure the effectiveness of a non-deterministic IR system? Existing approaches to IR evaluation do not consider this problem, or the potential impact on statistical significance. In this paper, we explore how such variance can affect system comparisons, and propose an evaluation framework and methodologies capable of making these comparisons. Using the context of distributed information retrieval as a case study for our investigation, we show that the approaches provide a consistent and reliable methodology for comparing the effectiveness of a non-deterministic system with that of a deterministic system or another non-deterministic system. In addition, we present a statistical best practice that can be used to safely show that a non-deterministic IR system has effectiveness equivalent to that of another IR system, and how to avoid the common pitfall of misusing a lack of significance as proof that two systems have equivalent effectiveness.
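
The abstract does not name the specific statistic, but a standard way to claim equivalence (rather than inferring it from a non-significant difference test) is the two one-sided tests (TOST) procedure; the sketch below applies it to paired per-topic effectiveness scores under an assumed equivalence margin.

import numpy as np
from scipy import stats

def tost_paired(x, y, margin):
    """Two one-sided tests (TOST) for equivalence of paired scores.
    Equivalence within +/- margin is claimed at level alpha when the
    returned value (the larger of the two one-sided p-values) < alpha."""
    d = np.asarray(x, float) - np.asarray(y, float)
    n = d.size
    se = d.std(ddof=1) / np.sqrt(n)
    t_low = (d.mean() + margin) / se          # H0: mean difference <= -margin
    t_upp = (d.mean() - margin) / se          # H0: mean difference >= +margin
    p_low = 1.0 - stats.t.cdf(t_low, df=n - 1)
    p_upp = stats.t.cdf(t_upp, df=n - 1)
    return max(p_low, p_upp)

# Hypothetical mean per-topic scores of two systems over 50 topics.
rng = np.random.default_rng(0)
a = rng.normal(0.40, 0.05, 50)
b = a + rng.normal(0.00, 0.01, 50)
print(tost_paired(a, b, margin=0.02))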

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: The aim of this paper was to systematically review and meta-analyse the prevalence of co-morbid psychiatric disorders (DSM-IV Axis I disorders) among treatment-seeking problem gamblers. METHODS: A systematic search was conducted for peer-reviewed studies that provided prevalence estimates of Axis I psychiatric disorders in individuals seeking psychological or pharmacological treatment for problem gambling (including pathological gambling). Meta-analytic techniques were used to estimate the weighted mean effect size and heterogeneity across studies. RESULTS: Results from 36 studies identified high rates of co-morbid current (74.8%, 95% CI 36.5-93.9) and lifetime (75.5%, 95% CI 46.5-91.8) Axis I disorders. There were high rates of current mood disorders (23.1%, 95% CI 14.9-34.0), alcohol use disorders (21.2%, 95% CI 15.6-28.1), anxiety disorders (17.6%, 95% CI 10.8-27.3) and substance (non-alcohol) use disorders (7.0%, 95% CI 1.7-24.9). Specifically, the highest mean prevalence of current psychiatric disorders was for nicotine dependence (56.4%, 95% CI 35.7-75.2) and major depressive disorder (29.9%, 95% CI 20.5-41.3), with smaller estimates for alcohol abuse (18.2%, 95% CI 13.4-24.2), alcohol dependence (15.2%, 95% CI 10.2-22.0), social phobia (14.9%, 95% CI 2.0-59.8), generalised anxiety disorder (14.4%, 95% CI 3.9-40.8), panic disorder (13.7%, 95% CI 6.7-26.0), post-traumatic stress disorder (12.3%, 95% CI 3.4-35.7), cannabis use disorder (11.5%, 95% CI 4.8-25.0), attention-deficit hyperactivity disorder (9.3%, 95% CI 4.1-19.6), adjustment disorder (9.2%, 95% CI 4.8-17.2), bipolar disorder (8.8%, 95% CI 4.4-17.1) and obsessive-compulsive disorder (8.2%, 95% CI 3.4-18.6). There were no consistent patterns according to gambling problem severity, type of treatment facility, or study jurisdiction. Although these estimates were robust to the inclusion of studies with non-representative sampling biases, they should be interpreted with caution as they were highly variable across studies. CONCLUSIONS: The findings highlight the need for gambling treatment services to undertake routine screening and assessment of psychiatric co-morbidity and provide treatment approaches that adequately manage these co-morbid disorders. Further research is required to explore the reasons for the variability observed in the prevalence estimates.
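
The abstract does not state which pooling model was used, so the sketch below is a generic illustration only: study prevalences are pooled on the logit scale with a DerSimonian-Laird random-effects model, one common way to obtain a weighted mean estimate together with a heterogeneity measure.

import numpy as np

def pooled_prevalence(events, totals):
    """Random-effects (DerSimonian-Laird) pooling of study prevalences on the
    logit scale; returns the back-transformed pooled estimate."""
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p = events / totals
    y = np.log(p / (1 - p))                      # logit-transformed prevalences
    v = 1 / events + 1 / (totals - events)       # approximate logit variances
    w = 1 / v
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)              # Cochran's Q (heterogeneity)
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1 / (v + tau2)                        # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    return 1 / (1 + np.exp(-y_re))

# Hypothetical study counts for illustration only (not the review's data).
print(pooled_prevalence(events=[30, 45, 12], totals=[100, 150, 60]))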

Relevance:

30.00%

Publisher:

Abstract:

The need to estimate a particular quantile of a distribution is an important problem which frequently arises in many computer vision and signal processing applications. For example, our work was motivated by the requirements of many semi-automatic surveillance analytics systems which detect abnormalities in closed-circuit television (CCTV) footage using statistical models of low-level motion features. In this paper we specifically address the problem of estimating the running quantile of a data stream with non-stationary stochasticity when the memory for storing observations is limited. We make several major contributions: (i) we derive an important theoretical result which shows that the change in the quantile of a stream is constrained regardless of the stochastic properties of the data, (ii) we describe a set of high-level design goals for an effective estimation algorithm that emerge as a consequence of our theoretical findings, (iii) we introduce a novel algorithm which implements the aforementioned design goals by retaining a sample of data values in a manner adaptive to changes in the distribution of the data and progressively narrowing down its focus in periods of quasi-stationary stochasticity, and (iv) we present a comprehensive evaluation of the proposed algorithm and compare it with existing methods in the literature on both synthetic data sets and three large 'real-world' streams acquired in the course of operation of an existing commercial surveillance system. Our findings convincingly demonstrate that the proposed method is highly successful and vastly outperforms the existing alternatives, especially when the target quantile is high valued and the available buffer capacity is severely limited.
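
The abstract does not describe the adaptive algorithm itself; as a point of reference for the problem setting only, the sketch below maintains a running quantile estimate from a stream with a fixed-size buffer using plain uniform reservoir sampling, the kind of simple bounded-memory baseline such adaptive methods aim to beat.

import random
import numpy as np

class ReservoirQuantile:
    """Bounded-memory running quantile estimate via uniform reservoir sampling
    (an illustrative baseline, not the adaptive algorithm of the paper)."""

    def __init__(self, capacity, q):
        self.capacity = capacity   # maximum number of stored observations
        self.q = q                 # target quantile, e.g. 0.99
        self.buffer = []
        self.count = 0

    def update(self, x):
        self.count += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(x)
        else:
            j = random.randrange(self.count)   # classic reservoir replacement
            if j < self.capacity:
                self.buffer[j] = x

    def estimate(self):
        return float(np.quantile(self.buffer, self.q))

est = ReservoirQuantile(capacity=500, q=0.99)
for value in np.random.default_rng(0).exponential(1.0, 100_000):
    est.update(value)
print(est.estimate())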

Relevance:

30.00%

Publisher:

Abstract:

We establish a general Lagrangian for the moral hazard problem which generalizes the well-known first-order approach (FOA). It requires that, besides the multiplier of the first-order condition, there exist multipliers for the second-order condition and for the binding actions of the incentive compatibility constraint. Some examples show that our approach can be useful for treating the finite and infinite state space cases. One of the examples is solved by the second-order approach. We also compare our Lagrangian with Mirrlees'.
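
For background only (this is the classical first-order approach that the paper generalizes, not its new Lagrangian): with output density f(x|a), wage schedule w(x), agent utility u and effort cost c, principal utility v, and reservation utility \bar{U}, the FOA replaces incentive compatibility by its first-order condition and works with the Lagrangian

\mathcal{L} = \int v\big(x - w(x)\big)\, f(x \mid a)\,dx
  + \lambda \Big[ \int u\big(w(x)\big)\, f(x \mid a)\,dx - c(a) - \bar{U} \Big]
  + \mu \Big[ \int u\big(w(x)\big)\, f_a(x \mid a)\,dx - c'(a) \Big],

whose pointwise optimality condition is the familiar

\frac{v'\big(x - w(x)\big)}{u'\big(w(x)\big)} = \lambda + \mu\,\frac{f_a(x \mid a)}{f(x \mid a)}.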

Relevance:

30.00%

Publisher:

Abstract:

With the ever-increasing demands for high-complexity consumer electronic products, market pressures demand faster product development and lower cost. SoC-based design can provide the required design flexibility and speed by allowing the use of IP cores. However, testing costs in the SoC environment can reach a substantial percentage of the total production cost. Analog testing costs may dominate the total test cost, as testing of analog circuits usually requires functional verification of the circuit and special testing procedures. For RF analog circuits commonly used in wireless applications, testing is further complicated because of the high frequencies involved. In summary, reducing analog test cost is of major importance in the electronics industry today. BIST techniques for analog circuits, though potentially able to solve the analog test cost problem, have some limitations. Some techniques are circuit dependent, requiring reconfiguration of the circuit being tested, and are generally not usable in RF circuits. In the SoC environment, as processing and memory resources are available, they could be used in the test. However, the overhead of adding additional A/D and D/A converters may be too costly for most systems, and analog routing of signals may not be feasible and may introduce signal distortion. In this work a simple and low-cost digitizer is used instead of an ADC in order to enable analog testing strategies to be implemented in a SoC environment. Thanks to the low analog area overhead of the converter, multiple analog test points can be observed and specific analog test strategies can be enabled. As the digitizer is always connected to the analog test point, it is not necessary to include muxes and switches that would degrade the signal path. For RF analog circuits this is especially useful, as the circuit impedance is fixed and the influence of the digitizer can be accounted for in the design phase. Thanks to the simplicity of the converter, it is able to reach higher frequencies and enables the implementation of low-cost RF test strategies. The digitizer has been applied successfully in the testing of both low-frequency and RF analog circuits. Also, as testing is based on frequency-domain characteristics, nonlinear characteristics like intermodulation products can also be evaluated. Specifically, practical results were obtained for prototyped baseband filters and a 100 MHz mixer. The application of the converter for noise figure evaluation was also addressed, and experimental results for low-frequency amplifiers using conventional opamps were obtained. The proposed method is able to enhance the testability of current mixed-signal designs, being suitable for the SoC environment used in many industrial products nowadays.
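
As a loose illustration of the kind of frequency-domain, two-tone analysis the abstract mentions (with made-up sample rate, tone frequencies, and nonlinearity, not the actual measurement setup): once samples are captured by a digitizer, the fundamentals and third-order intermodulation products can be read off an FFT.

import numpy as np

fs, n = 1.0e6, 1000                 # hypothetical sample rate and record length
f1, f2 = 90e3, 110e3                # two-tone test frequencies (exact FFT bins)
t = np.arange(n) / fs

# Modelled capture of a mildly nonlinear device under a two-tone stimulus.
x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
x = x + 0.01 * x**3                 # third-order nonlinearity for illustration

spec = 2 * np.abs(np.fft.rfft(x)) / n
freqs = np.fft.rfftfreq(n, 1 / fs)

def amp_db(f):
    """Amplitude (dB) of the spectral bin closest to frequency f."""
    return 20 * np.log10(spec[np.argmin(np.abs(freqs - f))])

# Third-order intermodulation products fall at 2*f1 - f2 and 2*f2 - f1.
for f in (f1, f2, 2 * f1 - f2, 2 * f2 - f1):
    print(f"{f / 1e3:7.1f} kHz: {amp_db(f):6.1f} dB")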

Relevance:

30.00%

Publisher:

Abstract:

We study optimal labor income taxation in non-competitive labor markets. Firms offer screening contracts to workers who have private information about their productivity. A planner endowed with a Paretian social welfare function tries to induce allocations that maximize its objective. We provide necessary and sufficient conditions for implementation of constrained efficient allocations using tax schedules. All allocations that are implementable by a tax schedule display negative marginal tax rates for almost all workers. Not all allocations that are implementable in a competitive setting are implementable in this noncompetitive environment.

Relevance:

30.00%

Publisher:

Abstract:

We discuss a general approach to building non-asymptotic confidence bounds for stochastic optimization problems. Our principal contribution is the observation that a Sample Average Approximation of a problem supplies upper and lower bounds for the optimal value of the problem which are essentially better than the quality of the corresponding optimal solutions. At the same time, such bounds are more reliable than “standard” confidence bounds obtained through the asymptotic approach. We also discuss bounding the optimal value of MinMax Stochastic Optimization and stochastically constrained problems. We conclude with a small simulation study illustrating the numerical behavior of the proposed bounds.
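
The abstract does not spell out the bounding recipe, so the sketch below shows only the standard SAA bounds on a toy minimization problem, min over x of E[(x - xi)^2]: averaging the optimal values of several independent SAA replications gives a statistical lower bound, while evaluating a fixed candidate solution on a large independent sample gives a statistical upper bound.

import numpy as np

rng = np.random.default_rng(1)

def saa_optimum(sample):
    """SAA of min_x E[(x - xi)^2]: the minimizer is the sample mean and the
    optimal value is the (downward-biased) sample variance."""
    x_hat = sample.mean()
    return x_hat, np.mean((x_hat - sample) ** 2)

M, N = 30, 200                       # replications and per-replication size
vals, cands = [], []
for _ in range(M):
    x_hat, v = saa_optimum(rng.exponential(2.0, size=N))
    vals.append(v)
    cands.append(x_hat)

# Lower bound: the mean SAA optimal value underestimates the true optimum.
lb = np.mean(vals)
lb_err = 1.96 * np.std(vals, ddof=1) / np.sqrt(M)

# Upper bound: any fixed candidate evaluated on a fresh, large sample.
test = rng.exponential(2.0, size=100_000)
obj = (cands[0] - test) ** 2
ub, ub_err = obj.mean(), 1.96 * obj.std(ddof=1) / np.sqrt(test.size)

print(f"lower bound ~ {lb:.3f} +/- {lb_err:.3f}")   # true optimum: Var(xi) = 4
print(f"upper bound ~ {ub:.3f} +/- {ub_err:.3f}")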

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

We present a new procedure to construct the one-dimensional non-Hermitian imaginary potential with a real energy spectrum in the context of the position-dependent effective mass Dirac equation with the vector-coupling scheme in 1 + 1 dimensions. In the first example, we consider a case for which the mass distribution combines linear and inversely linear forms; the Dirac problem with a PT-symmetric potential is mapped into the exactly solvable Schrödinger-like equation problem with the isotonic oscillator by using the local scaling of the wavefunction. In the second example, we take a mass distribution with a smooth step shape; the Dirac problem with a non-PT-symmetric imaginary potential is mapped into the exactly solvable Schrödinger-like equation problem with the Rosen-Morse potential. The real relativistic energy levels and corresponding wavefunctions for the bound states are obtained in terms of the supersymmetric quantum mechanics approach and the function analysis method.
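
For orientation only (conventions differ between papers, and this is not necessarily the exact form used here), a 1+1-dimensional Dirac Hamiltonian with position-dependent mass m(x) and a vector-coupled potential V(x) is commonly written as

H = c\,\sigma_1\,\hat{p} + \sigma_3\,m(x)\,c^2 + V(x)\,\mathbb{1},
\qquad H\,\psi(x) = E\,\psi(x),

and the point of PT-symmetric or, more generally, pseudo-Hermitian constructions is that a complex V(x) can still yield an entirely real spectrum E.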

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

This work presents the application of a multiobjective evolutionary algorithm (MOEA) to the optimal power flow (OPF) problem. The OPF is modeled as a constrained nonlinear optimization problem, non-convex and large-scale, with continuous and discrete variables. The violated inequality constraints are treated as objective functions of the problem. This strategy allows the physical and operational restrictions to be met without compromising the quality of the solutions found. The developed MOEA is based on Pareto theory and employs a diversity-preserving mechanism to overcome premature convergence of the algorithm to locally optimal solutions. Fuzzy set theory is employed to extract the best compromise solutions from the Pareto set. Results for the IEEE-30, RTS-96 and IEEE-354 test systems are presented to validate the efficiency of the proposed model and solution technique.
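
The abstract does not give the exact membership function, so the sketch below uses one common fuzzy compromise rule from the multiobjective OPF literature: each objective value on the Pareto front is mapped to a membership between 0 (worst on the front) and 1 (best), and the non-dominated solution with the highest normalized aggregate membership is reported as the best compromise.

import numpy as np

def best_compromise(front):
    """Index of the best-compromise solution of a Pareto front (minimization),
    using linear fuzzy memberships aggregated over objectives."""
    f = np.asarray(front, float)                 # shape: (solutions, objectives)
    f_min, f_max = f.min(axis=0), f.max(axis=0)
    mu = (f_max - f) / (f_max - f_min)           # 1 at the best value, 0 at the worst
    score = mu.sum(axis=1) / mu.sum()            # normalized aggregate membership
    return int(np.argmax(score))

# Hypothetical two-objective front, e.g. generation cost vs. constraint violation.
front = [[1.00, 0.30], [0.80, 0.45], [0.60, 0.70], [0.95, 0.35]]
print(best_compromise(front))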