991 results for Individually rational utility set


Relevance:

20.00%

Publisher:

Abstract:

Designers and artists have integrated recent advances in interactive, tangible and ubiquitous computing technologies to create new forms of interactive environments in the domains of work, recreation, culture and leisure. Many designs of technology systems begin with the workplace in mind, and with function, ease of use, and efficiency high on the list of priorities. [1] These priorities do not fit well with works designed for an interactive art environment, where the aims are many, and where the focus on utility and functionality is to support a playful, ambiguous or even experimental experience for the participants. To evaluate such works requires an integration of art-criticism techniques with more recent Human Computer Interaction (HCI) methods, and an understanding of the different nature of engagement in these environments. This paper begins a process of mapping a set of priorities for amplifying engagement in interactive art installations. I first define the concept of ludic engagement and its usefulness as a lens for both design and evaluation in these settings. I then detail two fieldwork evaluations I conducted within two exhibitions of interactive artworks, and discuss their outcomes and the future directions of this research.

Relevance:

20.00%

Publisher:

Abstract:

This document outlines a framework that could be used by government agencies in assessing policy interventions aimed at achieving social outcomes from government construction contracts. The framework represents a rational interpretation of the information gathered during the multi-outcomes construction policies project. The multi-outcomes project focused on the costs and benefits of using public construction contracts to promote the achievement of training and employment objectives and public art objectives. The origin of the policy framework in a cost-benefit appraisal of current policy interventions is evidenced by its emphasis on sensitivity to policy commitment and project circumstances (especially project size and scope). The quantitative and qualitative analysis conducted in the multi-outcomes project highlighted, first, that in the absence of strong industry commitment to policy objectives, policy interventions typically result in high levels of avoidance activity, substantial administrative costs and very few benefits. Thus, for policy action on, for example, training or local employment to be successful, compliance issues must be adequately addressed. Currently it appears that pre-qualification schemes (similar to the Priority Access Scheme) and schemes that rely on measuring, for example, the training investments of contractors within particular projects do not achieve high levels of compliance and involve significant administrative costs. An alternative is therefore suggested in the policy framework developed here: a levy on each public construction project, set as a proportion of total project costs. Although a full evaluation of this policy alternative was beyond the scope of the multi-outcomes construction policies project, it appears to offer the potential to minimize the transaction costs on contractors whilst enabling the creation of a training agency dedicated to improving the supply of skilled construction labour.
A recommendation is thus made that this policy alternative be fully researched and evaluated. As noted above, the outcomes of the multi-outcomes research project also highlighted the need for sensitivity to project circumstances in the development and implementation of policies for public construction projects. Ideally, a policy framework would have the flexibility to respond to circumstances where contractors share a commitment to the policy objectives and are able to identify measurable social outcomes from the particular government projects they are involved in. This would involve a project-by-project negotiation of goals and performance measures, and is likely to be practical only for large, longer-term projects.

Relevance:

20.00%

Publisher:

Abstract:

Construction sector policy makers have the opportunity to create improvements and develop economic, social and environmental sustainability through supply chain economics. Using the supply chain concept to improve firm behaviour and industry performance is not a new idea; however, there has been limited application and little or no measurement to monitor successful implementation. Purchasing policies have often been developed with sound strategic procurement principles, but even these have had limited penetration into the processes and practices of infrastructure agencies. The research reported in this paper documents an action research study currently being undertaken in the Australian construction sector, which aims to explore supply chain economic policy implementation for sectoral change by two government agencies. The theory that informs this study is the emerging area of construction supply chain economics. There are five stages to the project: demand analysis, chain analysis, government agency organizational audit, supplier strategy and strategic alignment. The overall objective is the development of a Supplier Group Strategy Map for two public sector agencies. Two construction subsectors are examined in detail: construction and demolition waste, and precast concrete. Both of these subsectors are critical to the economic and environmental sustainability performance of the construction sector and the community as a whole in the particular jurisdictions. The local and state government agencies at the core of the case studies rely individually on the performance of these sectors. The study is set within the context of a sound state purchasing policy that has, however, had limited application by the two agencies. Partial results of the study are presented, and early findings indicate that the standard risk-versus-expenditure procurement model does not capture the complexities of project, owner and government risk considerations.
A new model is proposed in this paper, which incorporates the added dimension of time. The research results have numerous stakeholders: they will hold particular value for those interested in regional construction sector economics, for government agencies that develop and implement policy and have a large construction purchasing imprint, and for the players involved in the two subsectors. Although this is an Australian study, it has widespread applicability, as previous research indicates that procurement reform is of international significance and that policy implementation is problematic.

Relevance:

20.00%

Publisher:

Abstract:

This study investigated the impact of Cognitive-Behavioural Therapy (CBT) and Rational-Emotive Education (REE) self-enhancement programs on children's self-talk, self-esteem and irrational beliefs. A total of 116 children (50.9% girls) with a mean age of 9.8 years, attending Years 4 and 6 at two primary schools, participated in the study. CBT resulted in a reduction in negative self-talk, while REE appeared to enhance independence beliefs. Both programs were associated with increased positive self-talk and with increased rationality in Conformity and Discomfort Intolerance beliefs.

Relevance:

20.00%

Publisher:

Abstract:

Objective: To examine the reliability and validity of the Alcohol Use Disorders Identification Test (AUDIT) compared to a structured diagnostic interview, the Composite International Diagnostic Interview (CIDI; 12-month version), in psychiatric patients with a diagnosis of schizophrenia. Method: Patients (N = 71, 53 men) were interviewed using the CIDI (Alcohol Misuse Section; 12-month version) and then completed the AUDIT. Results: The CIDI identified 32.4% of the sample as having an alcohol use disorder: 5 patients (7.0%) met diagnostic criteria for harmful use of alcohol, 1 (1.4%) for alcohol abuse and 17 (23.9%) for alcohol dependence. The AUDIT was found to have good internal reliability (coefficient = 0.85). An AUDIT cutoff of 8 or more had a sensitivity of 87% and a specificity of 90% in detecting CIDI-diagnosed alcohol disorders. All items except Item 9 contributed significantly to discriminant validity. Conclusions: The findings replicate and extend previous findings of high rates of alcohol use disorders in people with severe mental illness. The AUDIT was found to be reliable and valid in this sample and can be used with confidence as a screening instrument for alcohol use disorders in people with schizophrenia.
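The reported sensitivity and specificity follow the usual confusion-matrix definitions. A minimal sketch in Python; the cell counts used here are hypothetical, chosen only to approximate the reported 87% and 90% figures, since the abstract does not give the exact confusion-matrix cells:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    # Sensitivity = TP / (TP + FN): proportion of true cases the screen detects.
    # Specificity = TN / (TN + FP): proportion of non-cases the screen clears.
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for an AUDIT cutoff of >= 8, approximating the
# reported figures (23 CIDI-positive and 48 CIDI-negative patients):
sens, spec = sensitivity_specificity(tp=20, fn=3, tn=43, fp=5)
```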

Relevance:

20.00%

Publisher:

Abstract:

An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks in developing an IF system. With the document selection criteria better defined according to the users' needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to information overload. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches must also deal with low-frequency patterns. The measures used by data mining techniques to learn the profile (for example, "support" and "confidence") have turned out to be unsuitable for filtering and can lead to a mismatch problem. This thesis uses rough set-based (term-based) reasoning and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of the threshold setting have been developed using rough set decision theory.
The second stage (pattern mining) aims at solving the problem of information mismatch and is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy; the most likely relevant documents are assigned higher scores by the ranking function. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results, and the overall performance of the system is improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms other IF systems, such as the traditional Rocchio IF model, state-of-the-art term-based models including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
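The two-stage design described above can be illustrated with a deliberately simplified sketch: stage one thresholds a term-based profile score, and stage two re-ranks the survivors by weighted pattern matches. The function names, scoring rules and weights here are illustrative stand-ins, not the thesis's rough-set thresholding model or taxonomy-based ranking function:

```python
def topic_filter(docs, profile, threshold):
    """Stage 1 (topic filtering): discard documents whose term-based
    profile score falls below the threshold. A stand-in for the
    rough-set-based threshold model described above."""
    def score(doc):
        return sum(profile.get(w, 0.0) for w in doc.lower().split())
    return [d for d in docs if score(d) >= threshold]

def pattern_rank(docs, patterns):
    """Stage 2 (pattern mining): rank the remaining documents by
    weighted pattern matches. A stand-in for the taxonomy-based
    document-ranking function."""
    def score(doc):
        return sum(w for p, w in patterns.items() if p in doc.lower())
    return sorted(docs, key=score, reverse=True)
```

Running stage two only on the documents that survive stage one is what yields the cost reduction the abstract describes.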

Relevance:

20.00%

Publisher:

Abstract:

The Node-based Local Mesh Generation (NLMG) algorithm, which is free of mesh inconsistency, is one of the core algorithms in the Node-based Local Finite Element Method (NLFEM); it achieves a seamless link between mesh generation and stiffness matrix calculation, which helps to improve the parallel efficiency of FEM. The key to ensuring the efficiency and reliability of NLMG is to determine the candidate satellite-node set of a central node quickly and accurately. This paper develops a Fast Local Search Method based on Uniform Bucket (FLSMUB) and a Fast Local Search Method based on Multilayer Bucket (FLSMMB), and applies them successfully to this decisive problem, i.e. finding the candidate satellite-node set of any central node in the NLMG algorithm. Using FLSMUB or FLSMMB, the NLMG algorithm becomes a practical tool for reducing the parallel computation cost of FEM. Parallel numerical experiments validate that both FLSMUB and FLSMMB are fast, reliable and efficient for their respective problem classes, and that they are especially effective for large-scale parallel problems.
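The abstract does not spell out the FLSMUB data structure, but a uniform-bucket local search generally hashes nodes into a regular grid of cells so that only the cells near a query point are scanned when collecting candidate satellite nodes. A 2D sketch under that assumption (the paper's actual method, cell sizing and dimensionality may differ):

```python
from collections import defaultdict
import math

def build_buckets(points, cell):
    # Hash each node index into the uniform grid cell containing it.
    buckets = defaultdict(list)
    for i, (x, y) in enumerate(points):
        buckets[(int(x // cell), int(y // cell))].append(i)
    return buckets

def candidates_near(point, points, buckets, cell, radius):
    # Scan only the grid cells overlapping the search radius, then
    # keep the nodes actually within `radius` of the central point.
    cx, cy = int(point[0] // cell), int(point[1] // cell)
    r = int(math.ceil(radius / cell))
    out = []
    for bx in range(cx - r, cx + r + 1):
        for by in range(cy - r, cy + r + 1):
            for i in buckets.get((bx, by), []):
                px, py = points[i]
                if (px - point[0]) ** 2 + (py - point[1]) ** 2 <= radius ** 2:
                    out.append(i)
    return out
```

For roughly uniform node densities this makes each candidate query cost proportional to the few cells scanned rather than to the total node count, which is the source of the speed-up the paper reports.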

Relevance:

20.00%

Publisher:

Abstract:

Objective: The objectives of this article are to explore the extent to which the International Statistical Classification of Diseases and Related Health Problems (ICD) has been used in child abuse research, to describe how the ICD system has been applied and to assess factors affecting the reliability of ICD coded data in child abuse research.----- Methods: PubMed, CINAHL, PsychInfo and Google Scholar were searched for peer reviewed articles written since 1989 that used ICD as the classification system to identify cases and research child abuse using health databases. Snowballing strategies were also employed by searching the bibliographies of retrieved references to identify relevant associated articles. The papers identified through the search were independently screened by two authors for inclusion, resulting in 47 studies selected for the review. Due to heterogeneity of studies metaanalysis was not performed.----- Results: This paper highlights both utility and limitations of ICD coded data. ICD codes have been widely used to conduct research into child maltreatment in health data systems. The codes appear to be used primarily to determine child maltreatment patterns within identified diagnoses or to identify child maltreatment cases for research.----- Conclusions: A significant impediment to the use of ICD codes in child maltreatment research is the under-ascertainment of child maltreatment by using coded data alone. This is most clearly identified and, to some degree, quantified, in research where data linkage is used. Practice Implications: The importance of improved child maltreatment identification will assist in identifying risk factors and creating programs that can prevent and treat child maltreatment and assist in meeting reporting obligations under the CRC.
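Case identification with ICD codes, as reviewed above, typically reduces to matching recorded diagnosis codes against maltreatment-related code prefixes (for example, the ICD-10 T74 maltreatment-syndromes block). A minimal sketch; the prefix list is illustrative only and nowhere near a complete case definition, which is exactly the under-ascertainment problem the review discusses:

```python
# Illustrative prefix list: ICD-10 block T74 (maltreatment syndromes).
# A real case definition would include further codes and clinical review.
MALTREATMENT_PREFIXES = ("T74",)

def flag_cases(records, prefixes=MALTREATMENT_PREFIXES):
    """records: iterable of (patient_id, icd_code) pairs.
    Returns the set of patient ids with any matching code."""
    return {pid for pid, code in records if code.startswith(prefixes)}
```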

Relevance:

20.00%

Publisher:

Abstract:

A data-driven background dataset refinement technique was recently proposed for SVM-based speaker verification. This method selects a refined SVM background dataset from a set of candidate impostor examples after individually ranking the examples by their relevance. This paper extends the technique to refinement of the T-norm dataset for SVM-based speaker verification. The independent refinement of the background and T-norm datasets provides a means of investigating the sensitivity of SVM-based speaker verification performance to the selection of each of these datasets. Using refined datasets provided improvements of 13% in min. DCF and 9% in EER over the full set of impostor examples on the 2006 SRE corpus, with the majority of these gains due to refinement of the T-norm dataset. Similar trends were observed for the unseen data of the NIST 2008 SRE.
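The refinement step described above, ranking candidate impostor examples by relevance and keeping only the best, can be sketched as follows. In the paper the relevance scores come from its data-driven metric; here they are simply an input mapping, and all names are illustrative:

```python
def refine_dataset(examples, relevance, keep):
    """Rank candidate impostor examples by their relevance score
    (higher is better) and keep the top `keep` of them.
    `relevance` maps example id -> score."""
    ranked = sorted(examples, key=lambda e: relevance[e], reverse=True)
    return ranked[:keep]
```

Applying the same selection independently to the background pool and to the T-norm pool mirrors the paper's separation of the two refinements.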

Relevance:

20.00%

Publisher:

Abstract:

The Autistic Behavioural Indicators Instrument (ABII) is an 18-item instrument developed to identify children with Autistic Disorder (AD) based on the presence of unique autistic behavioural indicators. The ABII was administered to 20 children with AD, 20 children with speech and language impairment (SLI) and 20 typically developing (TD) children aged 2-6 years. Results indicated that the ABII discriminated children diagnosed with AD from those diagnosed with SLI and those who were TD, based on the presence of specific social attention, sensory and behavioural symptoms. A combination of symptomatology across these domains correctly classified 100% of children with and without AD. The paper concludes that the ABII shows considerable promise as an instrument for the early identification of AD.

Relevance:

20.00%

Publisher:

Abstract:

The book within which this chapter appears is published as a research reference book (not a coursework textbook) on Management Information Systems (MIS) for seniors or graduate students in Chinese universities. It is hoped that this chapter, along with the others, will be helpful to MIS scholars and PhD/Masters research students in China who seek an understanding of several central Information Systems (IS) research topics and related issues. The subject of this chapter, 'Evaluating Information Systems', is broad and cannot be addressed in its entirety in any depth within a single book chapter. The chapter proceeds from the truism that organizations have limited resources, and those resources need to be invested in a way that provides the greatest benefit to the organization. IT expenditure represents a substantial portion of any organization's investment budget, and IT-related innovations have broad organizational impacts. Evaluation of the impact of this major investment is essential to justify the expenditure both pre- and post-investment; evaluation is also important to prioritize possible improvements. The chapter (and most of the literature reviewed herein) admittedly assumes a black-box view of IS/IT, emphasizing measures of its consequences (e.g. for organizational performance or the economy) or perceptions of its quality from a user perspective. This reflects the MIS emphasis, a 'management' emphasis rather than a software engineering emphasis, where a software engineering emphasis might be on technical characteristics and technical performance. Though a black-box approach limits the diagnostic specificity of findings from a technical perspective, it offers many benefits. In addition to superior management information, these benefits may include economy of measurement and comparability of findings (e.g. see Part 4 on Benchmarking IS). The chapter does not purport to be a comprehensive treatment of the relevant literature.
It does, however, reflect many of the more influential works and a representative range of important writings in the area. The author has been somewhat opportunistic in Part 2, employing a single journal, The Journal of Strategic Information Systems, to derive a classification of literature in the broader domain. Nonetheless, the arguments for this approach are believed to be sound, and the value from this exercise real. The chapter drills down from the general to the specific. It commences with a high-level overview of the general topic area, achieved in two parts: Part 1 addresses existing research in the more comprehensive IS research outlets (e.g. MISQ, JAIS, ISR, JMIS, ICIS), and Part 2 addresses existing research in a key specialist outlet (the Journal of Strategic Information Systems). Subsequently, in Part 3, the chapter narrows to focus on the sub-topic 'Information Systems Success Measurement', then drills deeper to become even more focused in Part 4 on 'Benchmarking Information Systems'. In other words, the chapter drills down from the value of IS (Parts 1 and 2), to measuring Information Systems success (Part 3), to benchmarking IS (Part 4). While the commencing parts (1 and 2) are by definition broadly relevant to the chapter topic, the subsequent, more focused parts (3 and 4) admittedly reflect the author's more specific interests. Thus, the three chapter foci, the value of IS, measuring IS success and benchmarking IS, are not mutually exclusive; rather, each subsequent focus is in most respects a sub-set of the former. Parts 1 and 2, 'the Value of IS', take a broad view, with much emphasis on the business value of IS, or the relationship between information technology and organizational performance. Part 3, 'Information System Success Measurement', focuses more specifically on the measures and constructs employed in empirical research into the drivers of IS success (ISS).
DeLone and McLean (1992) inventoried and rationalized disparate prior measures of ISS into six constructs: System Quality, Information Quality, Individual Impact, Organizational Impact, Satisfaction and Use (later suggesting a seventh construct, Service Quality (DeLone and McLean 2003)). These constructs have been used extensively, individually or in some combination, as the dependent variable in research seeking to better understand the important antecedents or drivers of IS success. Part 3 reviews this body of work. Part 4, 'Benchmarking Information Systems', drills deeper again, focusing more specifically on a measure of the IS that can be used as a 'benchmark'. This section consolidates and extends the work of the author and his colleagues to derive a robust, validated IS-Impact measurement model for benchmarking contemporary Information Systems (IS). Though IS-Impact, like ISS, has potential value in empirical, causal research, its design and validation have emphasized its role and value as a comparator: a measure that is simple, robust and generalizable, and which yields results that are as far as possible comparable across time, across stakeholders, and across differing systems and systems contexts.

Relevance:

20.00%

Publisher:

Abstract:

High-speed videokeratoscopy is an emerging technique that enables study of the corneal surface and tear-film dynamics. Unlike its static predecessor, this new technique results in a very large amount of digital data for which storage needs become significant. We aimed to design a compression technique that would use mathematical functions to parsimoniously fit corneal surface data with a minimum number of coefficients. Since the Zernike polynomial functions that have been traditionally used for modeling corneal surfaces may not necessarily correctly represent given corneal surface data in terms of its optical performance, we introduced the concept of Zernike polynomial-based rational functions. Modeling optimality criteria were employed in terms of both the rms surface error as well as the point spread function cross-correlation. The parameters of approximations were estimated using a nonlinear least-squares procedure based on the Levenberg-Marquardt algorithm. A large number of retrospective videokeratoscopic measurements were used to evaluate the performance of the proposed rational-function-based modeling approach. The results indicate that the rational functions almost always outperform the traditional Zernike polynomial approximations with the same number of coefficients.
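As a toy illustration of rational-function fitting, the sketch below fits a first-order rational model y ≈ (a0 + a1·x)/(1 + b1·x) by linearizing and solving a least-squares problem with NumPy. The paper's approach differs substantially: it uses full Zernike polynomial expansions in numerator and denominator and a nonlinear Levenberg-Marquardt fit; a linearized solve like this is at best a starting point for such an iteration:

```python
import numpy as np

def fit_rational(x, y):
    # Linearize y ≈ (a0 + a1*x) / (1 + b1*x):
    #   y * (1 + b1*x) = a0 + a1*x
    #   y = a0 + a1*x - b1*(x*y)
    # and solve the resulting linear least-squares problem.
    A = np.column_stack([np.ones_like(x), x, -x * y])
    (a0, a1, b1), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a0, a1, b1

def eval_rational(x, a0, a1, b1):
    # Evaluate the fitted rational model.
    return (a0 + a1 * x) / (1 + b1 * x)
```

The appeal of a rational form, as the paper's results suggest, is that the denominator lets the same number of coefficients capture surface behaviour a plain polynomial expansion of equal size cannot.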