42 results for "Second-order decision analysis"


Relevance:

100.00%

Publisher:

Abstract:

This paper deals with blind equalization of single-input multiple-output (SIMO) finite-impulse-response (FIR) channels driven by an i.i.d. signal, exploiting the second-order statistics (SOS) of the channel outputs. SOS-based blind equalization is usually carried out in two stages. In Stage 1, the SIMO FIR channel is estimated using a blind identification method, such as the recently developed truncated transfer matrix (TTM) method. In Stage 2, an equalizer is derived from the channel estimate to recover the source signal. However, this two-stage approach does not give satisfactory blind equalization results when the channel is ill-conditioned, a situation often encountered in practice. In this paper, we first show that the TTM method fails in some situations. We then propose a novel SOS-based blind equalization method that estimates the equalizer directly, without knowledge of the channel impulse responses. The proposed method obtains the desired equalizer even when the channel is ill-conditioned. Its performance is illustrated by numerical simulations and compared with four benchmark methods. © 2014 Elsevier Inc.
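The Stage-2 step described in the abstract (derive an equalizer from a channel estimate) can be illustrated with a short numpy sketch: given the taps of a two-output SIMO FIR channel, a zero-forcing equalizer is found by least squares so that the combined channel-plus-equalizer response is a delayed unit impulse. The channel taps, equalizer length and delay below are illustrative assumptions, not data from the paper, whose contribution is to estimate the equalizer directly from SOS instead of from the channel.

```python
import numpy as np

# Stage-2 sketch: given an estimated 2-output SIMO FIR channel,
# derive a zero-forcing equalizer by least squares (illustrative only;
# the paper's method estimates the equalizer directly from SOS instead).
h = [np.array([1.0, 0.5]), np.array([0.3, -0.7])]  # assumed subchannel taps
L = 4                                              # equalizer length per subchannel
delay = 2                                          # target recovery delay

def conv_matrix(taps, L):
    # Rows are shifted copies of the subchannel taps (Toeplitz structure),
    # so (C.T @ g) is the convolution of equalizer g with the subchannel.
    M = len(taps) + L - 1
    C = np.zeros((L, M))
    for i in range(L):
        C[i, i:i + len(taps)] = taps
    return C

H = np.vstack([conv_matrix(hk, L) for hk in h])    # stacked (2L) x (L+1) system
d = np.zeros(H.shape[1]); d[delay] = 1.0           # desired combined response
g, *_ = np.linalg.lstsq(H.T, d, rcond=None)        # least-squares equalizer

combined = H.T @ g                                 # overall channel+equalizer response
print(np.round(combined, 6))
```

Because the two subchannels share no common zeros, the stacked system is consistent and the combined response matches the delayed impulse exactly.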

Relevance:

100.00%

Publisher:

Abstract:

There is a growing interest in the use of renewable energy sources to power wireless networks, both to mitigate the detrimental effects of conventional energy production and to enable deployment in off-grid locations. However, renewable energy sources, such as solar and wind, are by nature unstable in their availability and capacity. The dynamics of the energy supply therefore impose new challenges for network planning and resource management. In this paper, the sustainable performance of a wireless mesh network powered by renewable energy sources is studied. To address the intermittently available capacity of the energy supply, adaptive resource management and admission control schemes are proposed. Specifically, the goal is to maximize the energy sustainability of the network or, equivalently, to minimize the probability that the mesh access points (APs) deplete their energy and go out of service due to the unreliable energy supply. To this end, the energy buffer of a mesh AP is modeled as a G/G/1(/N) queue with arbitrary patterns of energy charging and discharging. Diffusion approximation is applied to analyze the transient evolution of the queue length and the energy depletion duration. Based on this analysis, an adaptive resource management scheme is proposed to balance traffic loads across the mesh network according to the energy adequacy at different mesh APs. A distributed admission control strategy is also presented to guarantee high resource utilization and improve energy sustainability. By considering the first- and second-order statistics of the energy charging and discharging processes at each mesh AP, it is demonstrated that the proposed schemes outperform existing state-of-the-art solutions.
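The energy-buffer model above can be illustrated with a Monte Carlo sketch that estimates the probability of an AP depleting its finite buffer within a horizon. The paper analyses the corresponding G/G/1(/N) queue with a diffusion approximation rather than simulation, and all rates, the capacity and the horizon below are hypothetical.

```python
import random

# Monte Carlo sketch of a mesh AP's energy buffer. Charging (harvest) and
# discharging (traffic demand) are random; the buffer is finite (the .../N
# in G/G/1(/N)). All parameters are illustrative assumptions.
def depletion_probability(charge_rate=1.0, discharge_rate=1.1,
                          capacity=50.0, horizon=500, runs=2000, seed=1):
    rng = random.Random(seed)
    depleted = 0
    for _ in range(runs):
        level = capacity / 2                             # start half full
        for _ in range(horizon):
            level += rng.expovariate(1.0) * charge_rate      # random harvest
            level -= rng.expovariate(1.0) * discharge_rate   # random demand
            level = min(level, capacity)                     # finite buffer
            if level <= 0:                               # AP goes out of service
                depleted += 1
                break
    return depleted / runs

# When mean discharge exceeds mean charge, depletion becomes near certain;
# with surplus charging it becomes rare.
p_bad = depletion_probability(discharge_rate=1.3)
p_ok = depletion_probability(discharge_rate=0.7)
print(p_bad, p_ok)
```

The gap between the two estimates is the intuition behind the paper's load balancing: shifting traffic away from energy-poor APs moves them from the first regime toward the second.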

Relevance:

100.00%

Publisher:

Abstract:

Contents:
1. Role of multi-criteria decision making in natural resource management / Gamini Herath and Tony Prato
2. Analysis of forest policy using multi-attribute value theory / Jayanath Ananda and Gamini Herath
3. Comparing riparian revegetation policy options using the analytic hierarchy process / M. E. Qureshi and S. R. Harrison
4. Managing environmental and health risks from a lead and zinc smelter: an application of deliberative multi-criteria evaluation / Wendy Proctor, Chris McQuade and Anne Dekker
5. Multiple attribute evaluation of management alternatives for the Missouri River System / Tony Prato
6. Multi-criteria decision analysis for integrated watershed management / Zeyuan Qiu
7. Fuzzy multiple attribute evaluation of agricultural systems / Leonie A. Marks and Elizabeth G. Dunn
8. Multi-criteria decision support for energy supply assessment / Bram Noble
9. Seaport development in Vietnam: evaluation using the analytic hierarchy process / Tran Phuong Dong and David M. Chapman
10. Valuing wetland aquatic resources using the analytic hierarchy process / Premachandra Wattage and Simon Mardle
11. Multiple attribute evaluation for national park management / Tony Prato
12. The future of MCDA in natural resource management: some generalizations / Gamini Herath and Tony Prato.
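Several chapters above apply the analytic hierarchy process (AHP). Its core computation, deriving priority weights from a reciprocal pairwise comparison matrix via the principal eigenvector, can be sketched as follows; the comparison values over three criteria are hypothetical.

```python
import numpy as np

# AHP sketch: priority weights from a (hypothetical) pairwise comparison
# matrix over three criteria, via the principal eigenvector.
A = np.array([[1.0, 3.0, 5.0],     # criterion 1 vs criteria 1, 2, 3
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])    # reciprocal matrix: a_ji = 1/a_ij

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)        # principal (Perron) eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                       # normalise to priority weights

# Saaty's consistency index: CI = (lambda_max - n) / (n - 1);
# small values indicate a nearly consistent set of judgements.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
print(np.round(w, 3), round(CI, 4))
```

For this matrix, criterion 1 dominates the weights and the consistency index is close to zero, i.e. the hypothetical judgements are nearly consistent.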


Relevance:

100.00%

Publisher:

Abstract:

Introduction:
Low-dose spiral computed tomography (CT) is a sensitive screening tool for lung cancer that is currently being evaluated in both non-randomised studies and randomised controlled trials.
Methods:
We conducted a quantitative decision analysis using a Markov model to determine whether, in the Australian setting, offering spiral CT screening for lung cancer to high-risk individuals would be cost-effective compared with current practice. This exploratory analysis was undertaken predominantly from the perspective of the government as third-party funder. In the base-case analysis, the costs and health outcomes (life-years saved and quality-adjusted life-years) were calculated in a hypothetical cohort of 10,000 male current smokers for two alternatives: (1) screen for lung cancer with annual CT for 5 years starting at age 60 years and treat those diagnosed with cancer, or (2) no screening and treat only those who present with symptomatic cancer.
Results:
For male smokers aged 60–64 years, with an annual lung cancer incidence of 552 per 100,000, the incremental cost-effectiveness ratio was $57,325 per life-year saved and $105,090 per QALY saved. For females aged 60–64 years with the same annual incidence of lung cancer, the ratios were $51,001 per life-year saved and $88,583 per QALY saved. The model was used to examine the relationship between efficacy, in terms of the expected reduction in lung cancer mortality at 7 years, and cost-effectiveness. In the base-case analysis, lung cancer mortality was reduced by 27% and all-cause mortality by 2.1%. Changes in the estimated proportion of stage I cancers detected by screening had the greatest impact on the efficacy of the intervention and on cost-effectiveness. The results were also sensitive to assumptions about the test performance characteristics of CT scanning, the proportion of lung cancer cases overdiagnosed by screening, intervention rates for benign disease, the discount rate, the cost of CT, the quality of life of individuals with early-stage screen-detected cancer, and the disutility associated with false-positive diagnoses. Given current knowledge and practice, even under favourable assumptions, reductions in lung cancer mortality of less than 20% are unlikely to be cost-effective, using $50,000 per life-year saved as the threshold for a "cost-effective" intervention.
Conclusion:
The most feasible scenario under which CT screening for lung cancer could be cost-effective would be if very high-risk individuals are targeted and screening is either highly effective or CT screening costs fall substantially.
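The incremental cost-effectiveness ratios reported above all follow from one definition: incremental cost divided by incremental health outcome for screening versus no screening. A minimal sketch, with hypothetical cohort totals rather than the study's figures:

```python
# ICER sketch: incremental cost per life-year saved for screening vs no
# screening. The cohort totals below are hypothetical, not the paper's data.
def icer(cost_new, effect_new, cost_old, effect_old):
    """(C1 - C0) / (E1 - E0): extra cost per extra unit of health outcome."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical 10,000-person cohort: screening costs more but saves life-years.
cost_screen, ly_screen = 52_000_000, 121_500.0
cost_none, ly_none = 40_000_000, 121_250.0

ratio = icer(cost_screen, ly_screen, cost_none, ly_none)
print(ratio)   # dollars per life-year saved
```

With these made-up totals the ratio is $48,000 per life-year saved, just under the $50,000 threshold the abstract uses to define a cost-effective intervention.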

Relevance:

100.00%

Publisher:

Abstract:

Positive functioning in the developmental period of emerging adulthood has received little investigation. The current study investigated components of positive development using confirmatory factor analysis of Australian Temperament Project data collected from 1,158 young adults aged 19–20 years. Theoretically conceptualised positive development constructs were examined to test core concepts. Five first-order constructs were identified in this sample: Civic Action and Engagement, Social Competence, Life Satisfaction, Trust and Tolerance of Others, and Trust in Authorities and Organisations. A second-order positive development factor defined by these constructs provided a good fit to the data. This model of positive development in emerging adulthood provides an outcome measure that can be used to investigate the developmental processes and pathways involved.

Relevance:

100.00%

Publisher:

Abstract:

Response surface methodology was used to optimize the fermentation medium for enhancing naringinase production by Staphylococcus xylosus. The first step of this process involved the individual adjustment and optimization of various medium components at shake-flask level. The carbon source (sucrose), the nitrogen source (sodium nitrate), an inducer (naringin) and the pH level were all found to be important factors significantly affecting naringinase production. In the second step, a 2² full factorial central composite design was applied to determine the optimal level of each significant variable. A second-order polynomial was derived by multiple regression analysis on the experimental data. Using this methodology, the optimum values for the critical components were obtained as follows: sucrose, 10.0%; sodium nitrate, 10.0%; pH 5.6; biomass concentration, 1.58%; and naringin, 0.50% (w/v). Under optimal conditions, the experimental naringinase production was 8.45 U/mL. The determination coefficients (R²) were 0.9908 and 0.9950 for naringinase activity and biomass production, respectively, indicating an adequate degree of reliability in the model.
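The second step above, fitting a second-order polynomial to central composite design data by multiple regression, can be sketched with numpy least squares. The factor levels follow a generic two-factor CCD in coded units; the surface coefficients and noise level are assumed for illustration, not taken from the study.

```python
import numpy as np

# RSM sketch: fit a second-order polynomial response surface
#   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to synthetic two-factor CCD data by multiple regression (least squares).
rng = np.random.default_rng(42)

# Coded levels of a 2^2 central composite design: factorial points,
# axial (star) points at +/- alpha, and centre replicates.
alpha = np.sqrt(2)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],               # factorial
              [-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha],  # axial
              [0, 0], [0, 0], [0, 0]])                           # centre

true_b = np.array([8.0, 0.6, -0.4, -1.2, -0.9, 0.3])  # assumed true surface

def design(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

y = design(X) @ true_b + rng.normal(0, 0.05, len(X))   # noisy responses
b, *_ = np.linalg.lstsq(design(X), y, rcond=None)      # fitted coefficients

# Stationary point of the fitted quadratic (candidate optimum in coded units):
Q = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
x_opt = np.linalg.solve(Q, -b[1:3])
print(np.round(b, 2), np.round(x_opt, 2))
```

The stationary point of the fitted surface plays the role of the optimal medium composition reported in the abstract, after decoding from coded units back to actual concentrations.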

Relevance:

100.00%

Publisher:

Abstract:

Investigation of the role of hypothesis formation in complex (business) problem solving has resulted in a new approach to hypothesis generation. A prototypical hypothesis generation paradigm for management intelligence has been developed, reflecting a widespread need to support management in such areas as fraud detection and intelligent decision analysis. This dissertation presents this new paradigm and its application to goal-directed problem solving methodologies, including case-based reasoning. The hypothesis generation model, which is supported by a dynamic hypothesis space, consists of three components: Anomaly Detection, Abductive Reasoning and Conflict Resolution. Anomaly detection activates the hypothesis generation model by scanning for anomalous data and relations in its working environment. The respective heuristics are activated by initial indications of anomalous behaviour based on evidence from historical patterns, linkages with other cases, inconsistencies, etc. Abductive reasoning, as implemented in this paradigm, is based on joining conceptual graphs, and provides an inference process that can incorporate a new observation into a world model by determining what assumptions should be added to the world so that it can explain the new observation. Abductive inference is a weak mechanism for generating explanations and hypotheses; although a practical conclusion cannot be guaranteed, the cues provided by the inference are very beneficial. Conflict resolution is crucial for the evaluation of explanations, especially those generated by a weak (abductive) mechanism. The measurements developed in this research for explanations and hypotheses provide an indirect way of estimating the quality of an explanation for given evidence. Such methods are realistic for complex domains such as fraud detection, where the prevailing hypothesis may not always be relevant to the new evidence.
In order to survive in rapidly changing environments, it is necessary to bridge the gap that exists between the system's view of the world and reality. Our research has demonstrated the value of Case-Based Interaction, which utilises a hypothesis structure for the representation of relevant planning and strategic knowledge. Under the guidance of case-based interaction, users are active agents empowered by system knowledge, and the system acquires its auxiliary information and knowledge from this external source. Case studies using the new paradigm and drawn from the insurance industry have attracted wide interest. A prototypical system for fraud detection in motor vehicle insurance, based on a hypothesis-guided problem solving mechanism, is now under commercial development. The initial feedback from claims managers is promising.
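The abductive step described above, determining which assumptions must be added to the world model so that it explains a new observation, can be sketched over simple Horn-style rules. The rule base, facts and predicate names below are illustrative stand-ins, not the dissertation's conceptual-graph machinery.

```python
# Abduction sketch: find sets of assumable facts that, added to the known
# facts, let the rules derive an observed anomaly. The rule base and
# predicate names are illustrative (a toy insurance-fraud domain).
RULES = {                      # head: list of alternative rule bodies
    "inflated_claim": [("staged_accident",), ("exaggerated_damage",)],
    "staged_accident": [("prior_claims", "no_witnesses")],
}
FACTS = {"prior_claims"}       # current world model
ASSUMABLE = {"no_witnesses", "exaggerated_damage"}

def explanations(goal):
    """Return the sets of assumptions under which `goal` becomes derivable."""
    if goal in FACTS:
        return [set()]                   # already explained, nothing to assume
    results = []
    if goal in ASSUMABLE:
        results.append({goal})           # explain by assuming the goal itself
    for body in RULES.get(goal, []):
        partial = [set()]
        for sub in body:                 # conjoin explanations of sub-goals
            partial = [e | s for e in partial for s in explanations(sub)]
        results.extend(partial)
    return results

for e in explanations("inflated_claim"):
    print(sorted(e))
```

Each returned set is a candidate explanation; in the dissertation's terms, conflict resolution would then rank these candidates by estimated quality against the evidence.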

Relevance:

100.00%

Publisher:

Abstract:

Background/Purpose

Hepatocellular carcinoma (HCC) has been the leading cause of cancer death in Taiwan since the 1980s. A two-stage screening intervention was introduced in 1996 and has been implemented in a limited number of hospitals. The present study assessed the costs and health outcomes associated with the introduction of the screening intervention from the perspective of the Taiwanese government. The cost-effectiveness analysis aimed to assist informed decision making by the health authority in Taiwan.
Methods

A two-phase economic model, comprising a 1-year decision analysis and a 60-year Markov simulation, was developed to conceptualize the screening intervention within current practice, and was compared with opportunistic screening alone. Incremental analyses were conducted to compare the incremental costs and outcomes associated with the introduction of the intervention. Sensitivity analyses were performed to investigate the uncertainties surrounding the model.
Results

The Markov model simulation demonstrated an incremental cost-effectiveness ratio (ICER) of NT$498,000 (US$15,600) per life-year saved at a 5% discount rate. An ICER of NT$402,000 (US$12,600) per quality-adjusted life-year was achieved by applying utility weights. Sensitivity analysis showed that the excess mortality reduction of HCC achieved by screening and the HCC incidence rate were the factors with the greatest influence on the ICERs. Scenario analysis also indicated that expanding the HCC screening intervention by focusing on regular monitoring of high-risk individuals could achieve a more favorable result.
Conclusion

Screening the population of high-risk individuals for HCC with the two-stage screening intervention in Taiwan is considered potentially cost-effective compared with opportunistic screening in the target population of an HCC endemic area.
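The 60-year Markov simulation behind these results accumulates life-years over annual cycles, discounted at the 5% rate used in the base case. A minimal cohort sketch with hypothetical transition probabilities, not the study's calibrated values:

```python
import numpy as np

# Markov cohort sketch: three states (well, HCC, dead) with hypothetical
# annual transition probabilities; accumulates life-years discounted at
# the 5% rate mentioned in the abstract.
P = np.array([[0.985, 0.010, 0.005],   # well -> well / HCC / dead
              [0.000, 0.700, 0.300],   # HCC: no recovery; survive or die
              [0.000, 0.000, 1.000]])  # dead is absorbing
assert np.allclose(P.sum(axis=1), 1.0)

def discounted_life_years(P, cycles=60, rate=0.05):
    state = np.array([1.0, 0.0, 0.0])  # whole cohort starts in "well"
    total = 0.0
    for t in range(cycles):
        alive = state[0] + state[1]    # fraction of the cohort alive this cycle
        total += alive / (1 + rate) ** t
        state = state @ P              # advance the cohort one year
    return total

print(round(discounted_life_years(P), 2))
```

An ICER like those above is then the difference in discounted cost between the screening and comparator arms divided by the difference in this discounted life-year total.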

Relevance:

100.00%

Publisher:

Abstract:

In an attempt to enhance debate focused on an established academic controversy, this study re-investigated selected data from the 1994 AMC survey of Australian and New Zealand manufacturing practices to test the hypothesis that best practice and product innovation may be incompatible generic business strategies. A modification of Robert G. Cooper’s Stage-Gate product development model was used as a theoretical framework to create a measurable construct of ‘product innovation’ as a strategy and compare two groups: firms committed to a best practice strategy (BPs) and firms not utilising best practice (Non-BPs). Eight variables were scrutinised. After logical critique was added to statistical data analysis, four major insights emerged.

(1) Tests yielded several statistically significant but substantively inconclusive results, because both groups had nearly identical profiles in rating innovation as the factor of lowest importance to commercial success, and because the definitional framework that guided construction of the survey instrument treated innovation as a second-order issue.
(2) Currently, best practice and product innovation are logically incompatible by definition.
(3) Even if the definition of best practice were changed, it is likely that the additional key process of innovation would remain incompatible with the existing key process of benchmarking.
(4) However, until the definition of best practice does attempt to include innovation as a key process rather than an outcome, testing any hypothesis of strategic compatibility between a best practice focus and an innovation focus will be both empirically difficult and logically unnecessary.