209 results for Individually rational utility set
Abstract:
Numeric sets can be used to store and distribute important information such as currency exchange rates and stock forecasts. It is useful to watermark such data so that ownership can be proved in case of illegal distribution. This paper analyzes the numerical set watermarking model presented by Sion et al. in “On watermarking numeric sets”, identifies its weaknesses, and proposes a novel scheme that overcomes these problems. One weakness of Sion’s watermarking scheme is the requirement of a normally-distributed set, which does not hold for many numeric sets such as forecast figures. Experiments indicate that the scheme is also susceptible to subset addition and secondary watermarking attacks. The watermarking model we propose can be used for numeric sets with arbitrary distributions. Theoretical analysis and experimental results show that the scheme is strongly resilient against sorting, subset selection, subset addition, distortion, and secondary watermarking attacks.
Abstract:
Ever since Cox et al. published their paper, “A Secure, Robust Watermark for Multimedia”, in 1996 [6], there has been tremendous progress in multimedia watermarking. The same pattern re-emerged with Agrawal and Kiernan publishing their work “Watermarking Relational Databases” in 2001 [1]. However, little attention has been given to primitive data collections, with only a handful of research works known to the authors [11, 10]. This is primarily due to the absence of an attribute that differentiates marked items from unmarked items during the insertion and detection processes. This paper presents a distribution-independent watermarking model that is secure against secondary watermarking in addition to conventional attacks such as data addition, deletion and distortion. The low false positives and high capacity provide additional strength to the scheme. These claims are backed by experimental results provided in the paper.
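As background, attribute-free numeric-set watermarking schemes typically use a secret key both to select which items carry a mark and to define how a bit is embedded with bounded distortion. The following is a minimal illustrative sketch, not the scheme proposed in either of the papers above: the HMAC-keyed selection and the parity-based embedding are assumptions chosen for clarity. Keying the selection on the value itself (rather than its position) is what makes the selection stable under sorting or reordering.

```python
import hmac, hashlib

def select_items(values, key, fraction=0.25):
    """Keyed pseudo-random selection of items to mark.
    Keying on the value (not the index) keeps the selection
    stable if the set is sorted or shuffled by an attacker."""
    chosen = []
    for i, v in enumerate(values):
        digest = hmac.new(key, repr(round(v, 6)).encode(), hashlib.sha256).digest()
        if digest[0] < 256 * fraction:
            chosen.append(i)
    return chosen

def embed_bit(value, bit, step=0.01):
    """Encode one bit in the parity of value/step; the distortion
    introduced is bounded by step."""
    q = round(value / step)
    if q % 2 != bit:
        q += 1
    return q * step
```

Detection inverts `embed_bit` by recomputing `round(value / step) % 2` for the keyed items; in a real scheme a majority vote over redundant embeddings would provide resilience against distortion.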
Abstract:
Lower airway inflammation is generally classified as eosinophilic or neutrophilic. In conditions where eosinophilic inflammation predominates, such as asthma in children, corticosteroids are usually beneficial. Traditionally, lower airway eosinophilia is measured using cellular count (through bronchoalveolar lavage or induced sputum). Both methods have limited applicability in children. When instruments to measure fractional exhaled nitric oxide (FeNO) became available, they presented an attractive option, providing a non-invasive method of measuring eosinophilic inflammation suitable for children and adults. Not surprisingly, proposals have been made that FeNO measurement can be used clinically in many scenarios, including monitoring the response to anti-inflammatory medications, verifying adherence to treatment, and predicting upcoming asthma exacerbations. This thesis addresses the utility of FeNO levels in various scenarios, specifically in relation to asthma control and cough, a contentious aspect of the diagnosis of asthma. The thesis consists of a series of systematic reviews (related to the main question) and original studies in children. The over-arching aim of the thesis is to determine if FeNO is a clinically useful tool in the management of asthma and common asthma symptoms. The specific aims of the thesis were to: 1. Determine if children with asthma have more severe acute respiratory symptoms at presentation with an asthma exacerbation and at days 7, 10 and 14 using validated scales. We also examined if children with asthma were more likely to have a persistent cough on day 14 than children with protracted bronchitis and/or controls. 2. Evaluate the efficacy of tailoring asthma interventions based on sputum analysis in comparison to clinical symptoms (with or without spirometry/peak flow) for asthma-related outcomes in children and adults. 3. 
Evaluate the efficacy of tailoring asthma interventions based on exhaled nitric oxide in comparison to clinical symptoms (with or without spirometry/peak flow) for asthma-related outcomes in children and adults. 4. Determine if adjustment of asthma medications based on FeNO levels (compared to management based on clinical symptoms) reduces severe exacerbations in children with asthma. 5. Examine the relationship between FeNO and exercise-induced broncho-constriction and cough in children. The aims above are addressed in respective chapters and all but one have been published or submitted. A synopsis of the findings is: In study-1 (Aim 1), we found that children with protracted bronchitis had the most severe acute respiratory infection symptoms and a higher percentage of respiratory morbidity at day 14 in comparison to children with asthma and healthy controls. The systematic review of study-2 (Aim 2) included 246 randomised adult participants (no children), with 221 completing the trials. In the meta-analysis, a significant reduction in the number of participants who had one or more asthma exacerbations occurred when treatment was based on sputum eosinophils in comparison to clinical symptoms. In the systematic review of study-3 (Aim 3), we found no significant difference between the intervention group (treatment adjusted based on FeNO) and control group (treatment adjusted based on clinical symptoms) for the primary outcome of asthma exacerbations or for the other outcomes (clinical symptoms, FeNO level and spirometry). In post-hoc analysis, a significant reduction in mean final daily ICS dose per adult was found in the group where treatment was based on FeNO in comparison to clinical symptoms. In contrast, in the paediatric studies, there was a significant increase in ICS dose in the FeNO strategy arm. Thus, controversy remains over the benefit, or otherwise, of utilising FeNO in routine clinical practice. 
FeNO levels are dependent on atopy, and none of the 7 published trials has considered atopic status in FeNO levels when medications were adjusted. In study-4 (Aim 4), 64 children with asthma were recruited. Their asthma medications were adjusted according to either FeNO levels or usual clinical care, utilising a management hierarchy that took atopy into account. It was concluded that tailoring of asthma medications in accordance with FeNO levels (compared to usual management), taking into account atopy status, reduced the number of children with severe exacerbations. However, a FeNO-based strategy resulted in higher daily ICS doses and had no benefit on asthma control. In study-5 (Aim 5), 33 children with cough and 17 controls were recruited. They were randomised to undertake either an exercise challenge or a dry powder mannitol challenge on day 1 (with the alternative challenge undertaken on day 2). In addition, a 24-hour cough meter, skin prick test, capsaicin cough sensitivity test and cough diary were undertaken. The change in cough frequency post exercise was significantly increased in the children with cough. FeNO decreased post exercise regardless of whether EIB was present. Limitations of the studies were addressed in the respective chapters. In summary, the studies from this thesis have provided new information on: • The increased severity of respiratory symptoms in the early phase of an asthma exacerbation, but not in the later recovery phase, compared with controls. • The utility of FeNO in the management of children with asthma. • The relationship of FeNO, cough and EIB in children. • Systematic reviews on the efficacy of tailoring asthma interventions based on eosinophilic inflammatory markers (sputum analysis and FeNO) in comparison to clinical symptoms.
Abstract:
This thesis introduces advanced Demand Response algorithms for residential appliances that provide benefits for both the utility and its customers. The algorithms schedule appliances appropriately on a critical peak day to alleviate network peaks, adverse voltage conditions and wholesale price spikes, while also reducing the cost of residential energy consumption. Initially, a demand response technique via customer reward is proposed, where the utility controls appliances to achieve network improvement. Then, an improved real-time pricing scheme is introduced, with customers supported by energy management schedulers to actively participate in it. Finally, the demand response algorithm is extended to provide frequency regulation services.
Abstract:
Firstly, we would like to thank Ms. Alison Brough and her colleagues for their positive commentary on our published work [1] and their appraisal of the utility of our “off-set plane” protocol for anthropometric analysis. The standardized protocols described in our manuscript have wide applications, ranging from forensic anthropology and paleodemographic research to clinical settings such as paediatric practice and orthopaedic surgical design. We affirm that the use of geometrically based reference tools commonly found in computer-aided design (CAD) programs such as Geomagic Design X® is imperative for more automated and precise measurement protocols for quantitative skeletal analysis. Therefore, we stand by our recommendation of the use of software such as Amira and Geomagic Design X® in the contexts described in our manuscript...
Abstract:
Analytically or computationally intractable likelihood functions can arise in complex statistical inference problems, making them inaccessible to standard Bayesian inferential methods. Approximate Bayesian computation (ABC) methods address such problems by replacing direct likelihood evaluations with repeated sampling from the model. ABC methods have been applied predominantly to parameter estimation problems and less to model choice problems, due to the added difficulty of handling multiple model spaces. The ABC algorithm proposed here addresses model choice problems by extending Fearnhead and Prangle (2012, Journal of the Royal Statistical Society, Series B 74, 1–28), in which the posterior means of the model parameters, estimated through regression, form the summary statistics used in the discrepancy measure. An additional stepwise multinomial logistic regression is performed on the model indicator variable in the regression step, and the estimated model probabilities are incorporated into the set of summary statistics for model choice purposes. A reversible jump Markov chain Monte Carlo step is also included in the algorithm to increase model diversity for thorough exploration of the model space. The algorithm was applied to a validation example to demonstrate its robustness across a wide range of true model probabilities. Its subsequent use in three pathogen transmission examples of varying complexity illustrates the utility of the algorithm in inferring preference for particular transmission models for the pathogens.
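The rejection-ABC idea this abstract builds on can be illustrated with a minimal model-choice sketch. This is not the authors' algorithm: there is no regression adjustment, no stepwise multinomial logistic step and no reversible jump move; the two candidate models (Poisson vs. geometric with a common prior on the mean) and the sample-mean summary statistic are illustrative assumptions.

```python
import math
import random

def poisson_draw(rng, lam):
    # Knuth's multiplicative Poisson sampler
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def geometric_draw(rng, p):
    # failures before the first success; mean (1 - p) / p
    k = 0
    while rng.random() > p:
        k += 1
    return k

def abc_model_choice(observed_mean, n_sims=2000, eps=0.2, seed=1):
    """Rejection ABC for model choice: draw a model uniformly, simulate a
    data set, and keep the model indicator whenever the summary statistic
    (here just the sample mean) falls within eps of the observed value.
    Accepted fractions approximate posterior model probabilities."""
    rng = random.Random(seed)
    accepted = {"poisson": 0, "geometric": 0}
    for _ in range(n_sims):
        model = rng.choice(sorted(accepted))
        lam = rng.uniform(0.5, 5.0)          # common prior on the mean
        if model == "poisson":
            data = [poisson_draw(rng, lam) for _ in range(50)]
        else:                                # p chosen so the mean is also lam
            data = [geometric_draw(rng, 1.0 / (1.0 + lam)) for _ in range(50)]
        if abs(sum(data) / len(data) - observed_mean) < eps:
            accepted[model] += 1
    total = sum(accepted.values()) or 1
    return {m: c / total for m, c in accepted.items()}
```

Because both models can match the mean, a single summary statistic discriminates poorly between them; this is exactly the gap that regression-based summary statistics on the model indicator are designed to close.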
Abstract:
The along-track stereo images of the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) sensor, with 15 m resolution, were used to generate a Digital Elevation Model (DEM) over an area of low, near-Mean Sea Level (MSL) elevation in Johor, Malaysia. The absolute DEM was generated using the Rational Polynomial Coefficient (RPC) model, run in ENVI 4.8 software. To generate the absolute DEM, 60 Ground Control Points (GCPs) with vertical accuracies of less than 10 meters were extracted from a topographic map of the study area. The assessment was carried out on the uncorrected and corrected DEMs using dozens of Independent Check Points (ICPs). The uncorrected DEM showed an RMSEz of ± 26.43 meters, which decreased to ± 16.49 meters for the corrected DEM after post-processing. Overall, the corrected DEM from the ASTER stereo images met expectations.
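The RMSEz figures quoted above are the vertical root-mean-square error of the DEM heights against the ICP reference heights. As a sketch of that computation (with hypothetical heights, not the Johor data):

```python
import math

def rmse_z(dem_heights, reference_heights):
    """Vertical RMSE over independent check points (ICPs):
    the square root of the mean squared height difference."""
    diffs = [d - r for d, r in zip(dem_heights, reference_heights)]
    return math.sqrt(sum(e * e for e in diffs) / len(diffs))
```

Post-processing the DEM (e.g. removing a systematic vertical bias estimated from the GCPs) reduces the differences and therefore the RMSEz, which is the improvement from ± 26.43 m to ± 16.49 m reported above.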
Abstract:
Housing affordability and sustainable development are not polarised ideologies, as both are necessary with increasing urbanisation. We must bridge the gap between current median house pricing and target affordable house pricing whilst pursuing sustainability. This paper examines the potential for reducing initial construction costs and ongoing utility and transport costs through the integration of sustainable housing design and transit-oriented development principles in a Commuter Energy and Building Utilities System (CEBUS). It also introduces current research on the development of a Dynamic Simulation Model for CEBUS applications in the Australian property development and construction industry.
Abstract:
A recent review by Panagoulias and Doupis, published in Patient Preference and Adherence, concerned the saxagliptin/metformin fixed combination (SAXA/MET FDC), and was titled "Clinical utility in the treatment of type 2 diabetes with the saxagliptin/metformin fixed combination."1 This review concluded that "The SAXA/MET FDC is a patient-friendly, dosage-flexible, and hypoglycemia-safe regimen with very few adverse events and a neutral or even favorable effect on body weight. It achieves significant glycosylated hemoglobin A1c reduction helping the patient to achieve his/her individual glycemic goals."1
Abstract:
Background Multi-attribute utility instruments (MAUIs) are preference-based measures that comprise a health state classification system (HSCS) and a scoring algorithm that assigns a utility value to each health state in the HSCS. When developing a MAUI from a health-related quality of life (HRQOL) questionnaire, a HSCS must first be derived. This typically involves selecting a subset of domains and items, because HRQOL questionnaires typically have too many items to be amenable to the valuation task required to develop the scoring algorithm for a MAUI. Currently, exploratory factor analysis (EFA) followed by Rasch analysis is recommended for deriving a MAUI from a HRQOL measure. Aim To determine whether confirmatory factor analysis (CFA) is more appropriate and efficient than EFA for deriving a HSCS from the European Organisation for Research and Treatment of Cancer’s core HRQOL questionnaire, the Quality of Life Questionnaire (QLQ-C30), given its well-established domain structure. Methods QLQ-C30 (Version 3) data were collected from 356 patients receiving palliative radiotherapy for recurrent/metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter informed by the established QLQ-C30 structure and the views of both patients and clinicians on which items are most relevant. Dimensions determined by EFA or CFA were then subjected to Rasch analysis. Results CFA results generally supported the proposed QLQ-C30 structure (comparative fit index =0.99, Tucker–Lewis index =0.99, root mean square error of approximation =0.04). EFA revealed fewer factors, and some items cross-loaded on multiple factors. Further assessment of dimensionality with Rasch analysis allowed better alignment of the EFA dimensions with those detected by CFA. Conclusion CFA was more appropriate and efficient than EFA in producing clinically interpretable results for the HSCS for a proposed new cancer-specific MAUI. 
Our findings suggest that CFA should generally be recommended when deriving a preference-based measure from a HRQOL measure that has an established domain structure.
Abstract:
Introduction Measuring occupational performance is an essential part of clinical practice; however, there is little research on service-user perceptions of measures. The aim of this investigation was to explore the acceptability and utility of one occupational performance outcome measure, Goal Attainment Scaling, with young people (12–25 years old) seeking psychological help. Method Semi-structured interviews were conducted with ten young people seeking help from a youth mental health clinic. Interviews were audio-recorded and a field diary was kept. Interviews were transcribed verbatim and analysed using content analysis. Results were verified by member checking. Results All participants were able to engage in using Goal Attainment Scaling to set goals for therapy, and reported the process to be useful. The participants identified that the physical location and ownership of the scale were important in motivating them to work on their goals. Conclusion Young help-seekers see Goal Attainment Scaling as an acceptable tool to facilitate the establishment of functional goals. Young service users were particularly keen to maintain control over the physical location of completed forms.
Abstract:
This paper proposes a highly reliable fault diagnosis approach for low-speed bearings. The proposed approach first extracts wavelet-based fault features that represent diverse symptoms of multiple low-speed bearing defects. The most useful fault features for diagnosis are then selected by utilizing a genetic algorithm (GA)-based kernel discriminative feature analysis cooperating with one-against-all multicategory support vector machines (OAA MCSVMs). Finally, each support vector machine is individually trained with its own feature vector that includes the most discriminative fault features, offering the highest classification performance. In this study, the effectiveness of the proposed GA-based kernel discriminative feature analysis and the classification ability of individually trained OAA MCSVMs are addressed in terms of average classification accuracy. In addition, the proposed GA-based kernel discriminative feature analysis is compared with four other state-of-the-art feature analysis approaches. Experimental results indicate that the proposed approach is superior to other feature analysis methodologies, yielding an average classification accuracy of 98.06% and 94.49% under rotational speeds of 50 revolutions-per-minute (RPM) and 80 RPM, respectively. Furthermore, the individually trained MCSVMs with their own optimal fault features based on the proposed GA-based kernel discriminative feature analysis outperform the standard OAA MCSVMs, showing an average accuracy of 98.66% and 95.01% for bearings under rotational speeds of 50 RPM and 80 RPM, respectively.
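The GA-driven feature-selection loop described above can be sketched generically. This is a hypothetical skeleton, not the paper's implementation: the `fitness` callback stands in for the cross-validated OAA MCSVM classification accuracy, and the population size, selection scheme and operator rates are arbitrary illustrative choices.

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, generations=30, seed=0):
    """Generic GA skeleton for feature-subset selection: a chromosome is a
    bit-mask over the candidate features; fitness scores a subset (e.g.
    cross-validated classifier accuracy). Uses elitism, selection from the
    top of the ranking, one-point crossover and bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                      # elitism: keep the best two
        while len(next_pop) < pop_size:
            a, b = rng.sample(scored[:10], 2)      # select parents from the top half
            cut = rng.randrange(1, n_features)
            child = a[:cut] + b[cut:]              # one-point crossover
            if rng.random() < 0.1:                 # bit-flip mutation
                i = rng.randrange(n_features)
                child[i] ^= 1
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)
```

In the paper's setting, the selected bit-mask would define the discriminative feature vector on which each OAA MCSVM is individually trained.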
Abstract:
Objective To examine the clinical utility of the Cornell Scale for Depression in Dementia (CSDD) in nursing homes. Setting 14 nursing homes in Sydney and Brisbane, Australia. Participants 92 residents with a mean age of 85 years. Measurements Consenting residents were assessed by care staff for depression using the CSDD as part of their routine assessment. Specialist clinicians conducted assessments of depression using the Semi-structured Clinical Diagnostic Interview for DSM-IV-TR Axis I Disorders for residents without dementia, or the Provisional Diagnostic Criteria for Depression in Alzheimer Disease for residents with dementia, to establish expert clinical diagnoses of depression. The diagnostic performance of the staff-completed CSDD was analyzed against expert diagnosis using receiver operating characteristic (ROC) curves. Results The CSDD showed low diagnostic accuracy, with areas under the ROC curve of 0.69, 0.68 and 0.70 for the total sample, residents with dementia and residents without dementia, respectively. At the standard CSDD cutoff score, the sensitivity and specificity were 71% and 59% for the total sample, 69% and 57% for residents with dementia, and 75% and 61% for residents without dementia. The Youden index (for optimizing cut-points) suggested different depression cutoff scores for residents with and without dementia. Conclusion When administered by nursing home staff, the clinical utility of the CSDD for identifying depression is highly questionable. The complexity of the scale, the time required for collecting relevant information, and staff skills and knowledge of assessing depression in older people must be considered when using the CSDD in nursing homes.
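The Youden index mentioned above is J = sensitivity + specificity − 1, and the optimal cut-point maximises J over candidate cutoffs. A small sketch of that scan (with toy scores and diagnoses, not the study data):

```python
def youden_optimal_cutoff(scores, labels):
    """Return the cutoff maximising Youden's J = sensitivity + specificity - 1,
    where a score >= cutoff is classified as positive (labels: 1 = depressed)."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= cut and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < cut and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < cut and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= cut and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j
```

Running this scan separately for residents with and without dementia is what can yield the different optimal cutoff scores reported in the study.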
Abstract:
Measurements of half-field beam penumbra were taken using EBT2 film for a variety of blocking techniques. It was shown that minimizing the SSD reduces the penumbra, as the effects of beam divergence are diminished. The addition of a lead block directly on the surface provides optimal results, with a 10-90% penumbra of 0.53 ± 0.02 cm. To resolve the uncertainties encountered in the film measurements, future Monte Carlo simulations of half-field penumbras are to be conducted.
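For reference, the 10-90% penumbra quoted above is the lateral distance between the points where the dose profile crosses 10% and 90% of its maximum at the field edge. A minimal sketch of that computation from a sampled profile (the positions and doses are illustrative, not the EBT2 film data):

```python
def penumbra_width(positions, doses, low=0.1, high=0.9):
    """10-90% penumbra: lateral distance between the crossings of 10% and
    90% of the maximum dose, found by linear interpolation. Assumes the
    profile increases monotonically across the field edge."""
    dmax = max(doses)

    def crossing(level):
        target = level * dmax
        for i in range(1, len(doses)):
            if doses[i - 1] <= target <= doses[i]:
                frac = (target - doses[i - 1]) / (doses[i] - doses[i - 1])
                return positions[i - 1] + frac * (positions[i] - positions[i - 1])
        raise ValueError("dose level not crossed in profile")

    return abs(crossing(high) - crossing(low))
```

Film digitisation gives exactly this kind of sampled profile, so the measured penumbra depends on the interpolation between pixels, one source of the quoted ± 0.02 cm uncertainty.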
Abstract:
With the overwhelming increase in the amount of data on the web and in databases, many text mining techniques have been proposed for mining useful patterns in text documents. Extracting closed sequential patterns using the Pattern Taxonomy Model (PTM) is one of the pruning methods used to remove noisy, inconsistent, and redundant patterns. However, the PTM treats each extracted pattern as a whole, without considering the terms it contains, which can affect the quality of the extracted patterns. This paper proposes an innovative and effective method that extends random sets to accurately weight patterns based on their distribution in the documents and the distribution of their terms within patterns. The proposed approach then finds the specific closed sequential patterns (SCSP) based on the newly calculated weights. Experimental results on the Reuters Corpus Volume 1 (RCV1) data collection and TREC topics show that the proposed method significantly outperforms other state-of-the-art methods on several popular measures.