999 results for Nonsmooth Analysis
Abstract:
Confirmatory factor analyses evaluated the factorial validity of the Observer Alexithymia Scale (OAS) in an alcohol-dependent sample. Observation was conducted by clinical psychologists. All models examined were rejected, given their poor fit. Given the psychometric limitations shown in this study, the OAS may not be the most appropriate measure to use early in treatment among alcohol-dependent individuals.
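As a hedged illustration of the confirmatory factor analysis workflow reported above, the sketch below fits a hypothetical one-factor model with the semopy library and prints standard fit indices; the factor structure, item names, and data file are assumptions for illustration, not the study's actual OAS models or data.

```python
# Minimal CFA sketch using the semopy library (pip install semopy).
# The factor structure and item names are hypothetical placeholders,
# not the actual OAS models tested in the study.
import pandas as pd
import semopy

# Hypothetical model: one factor measured by three observer-rated items.
model_desc = """
distant =~ item1 + item2 + item3
"""

data = pd.read_csv("oas_ratings.csv")  # assumed file of item scores

model = semopy.Model(model_desc)
model.fit(data)

# calc_stats reports chi-square, CFI, RMSEA, etc.; a model would be
# rejected, as in the study, when these indices indicate poor fit
# (e.g., CFI < .90 or RMSEA > .08 by common rules of thumb).
stats = semopy.calc_stats(model)
print(stats.T)
```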
Abstract:
The natural convection thermal boundary layer adjacent to the heated inclined wall of a right-angled triangular enclosure, with an adiabatic fin attached to that wall, is investigated by numerical simulation. A finite-volume-based unsteady numerical model is adopted for the simulation. The numerical results reveal that the development of the boundary layer along the inclined surface is characterized by three distinct stages: a start-up stage, a transitional stage, and a steady stage. These three stages can be clearly identified in the numerical simulations. Moreover, in the presence of the adiabatic fin, the thermal boundary layer adjacent to the inclined wall initially breaks; however, it reattaches to the downstream boundary layer beyond the fin. Particular attention is given to the boundary-layer development near the fin.
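The start-up stage described above is conduction-dominated, so the near-wall thermal layer initially grows roughly like sqrt(kappa*t) before convection takes over. The sketch below illustrates only that scaling with a one-dimensional explicit finite-volume diffusion solve normal to a suddenly heated wall; the grid and parameter values are illustrative assumptions, not the paper's model.

```python
# Illustrative sketch of the conduction-dominated start-up stage:
# 1-D transient diffusion normal to a suddenly heated wall, solved with
# an explicit finite-volume scheme. All parameters are assumed values.
import numpy as np

kappa = 1e-5               # thermal diffusivity (m^2/s), assumed
n, L = 200, 0.05           # cells and domain depth normal to the wall (m)
dx = L / n
dt = 0.4 * dx**2 / kappa   # within the explicit stability limit

T = np.zeros(n)            # ambient temperature (scaled to 0)
T_wall = 1.0               # heated wall temperature (scaled)

def step(T):
    """One finite-volume update; wall held at T_wall, far field at 0."""
    flux_in = kappa * (np.r_[T_wall, T[:-1]] - T) / dx
    flux_out = kappa * (T - np.r_[T[1:], 0.0]) / dx
    return T + dt * (flux_in - flux_out) / dx

t = 0.0
for _ in range(2000):
    T = step(T)
    t += dt

# Boundary-layer thickness: depth where T drops to 1% of the wall value.
delta = dx * np.argmax(T < 0.01 * T_wall)
print(f"t = {t:.1f} s, delta = {delta*1e3:.1f} mm, "
      f"sqrt(kappa*t) = {np.sqrt(kappa*t)*1e3:.1f} mm")
```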
Abstract:
Knowledge of cable parameters is well established, but knowledge of the environment in which cables are buried lags behind. Research at Queensland University of Technology has aimed at obtaining and analysing actual daily field values of the thermal resistivity and diffusivity of the soil around power cables. On-line monitoring systems have been developed and installed, with a data-logger system and buried spheres that use an improved technique to measure thermal resistivity and diffusivity over a short period. Results based on long-term continuous field data are given. A probabilistic approach is developed to establish the correlation between the measured field thermal resistivity values and rainfall data from weather bureau records. These field data can reduce the risk in cable rating decisions and provide a basis for reliable prediction of "hot spots" in an existing cable circuit.
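As a hedged sketch of the correlation step described above, the following code aligns a daily field thermal-resistivity series with bureau rainfall records and computes a lagged rank correlation over trailing rainfall windows; the file names, column names, and window lengths are assumptions for illustration.

```python
# Sketch: correlating field thermal resistivity with rainfall records.
# File and column names are assumed placeholders.
import pandas as pd

rho = pd.read_csv("field_resistivity.csv", parse_dates=["date"],
                  index_col="date")["resistivity_Km_per_W"]
rain = pd.read_csv("bureau_rainfall.csv", parse_dates=["date"],
                   index_col="date")["rain_mm"]

df = pd.concat([rho, rain], axis=1, join="inner")

# Soil dries (resistivity rises) some time after rainfall stops, so test
# correlation against rainfall accumulated over trailing windows.
for window in (7, 14, 28):  # days; assumed window lengths
    acc = df["rain_mm"].rolling(f"{window}D").sum()
    r = df["resistivity_Km_per_W"].corr(acc, method="spearman")
    print(f"{window:2d}-day trailing rainfall vs resistivity: rho = {r:+.2f}")
```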
Abstract:
The Advanced Programmatic Risk Analysis and Management model (APRAM) is one of the recently developed methods that can be used for risk analysis and management, considering schedule, cost, and quality risks simultaneously. However, this model considers only failure risks that occur over the design and construction phases of a project's life cycle. While this can be sufficient for projects whose required cost during the operating life is much less than the budget required over the construction period, the model should be modified for infrastructure projects, where the costs during the operating life cycle are significant. In this paper, a modified APRAM is proposed that can consider potential risks occurring over the entire life cycle of the project, including technical and managerial failure risks. The modified model can therefore be used as an efficient decision-support tool for construction managers in the housing industry, in which various alternatives might be technically available. The modified method is demonstrated on a real building project, and this demonstration shows that it can be employed efficiently by construction managers. The Delphi method was applied to identify the failure events and their associated probabilities. The results show that although the initial cost of a cold-formed steel structural system is higher than that of a conventional construction system, the former's failure cost is much lower than the latter's.
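The comparison reported above reduces, at its core, to expected life-cycle cost: initial cost plus probability-weighted failure costs over design, construction, and operation. The sketch below illustrates that arithmetic with invented figures; the probabilities and costs are placeholders, not the study's Delphi-elicited values.

```python
# Sketch of the expected life-cycle cost comparison underlying the
# modified APRAM analysis. All figures are invented placeholders,
# not the Delphi-elicited values from the study.

def expected_life_cycle_cost(initial, failure_events):
    """Initial cost + sum of probability-weighted failure costs over
    the whole life cycle (design, construction, and operation)."""
    return initial + sum(p * cost for p, cost in failure_events)

# (probability, consequence cost) pairs, hypothetical:
cold_formed_steel = expected_life_cycle_cost(
    initial=1_200_000,
    failure_events=[(0.02, 400_000), (0.01, 900_000)])

conventional = expected_life_cycle_cost(
    initial=1_000_000,
    failure_events=[(0.05, 3_000_000), (0.04, 5_000_000)])

print(f"cold-formed steel: {cold_formed_steel:,.0f}")
print(f"conventional:      {conventional:,.0f}")
# Mirrors the paper's qualitative finding: a higher initial cost can be
# offset by a much lower expected failure cost over the life cycle.
```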
Abstract:
Background: A range of health outcomes at a population level are related to differences in levels of social disadvantage. Understanding the impact of any such differences in palliative care is important. The aim of this study was to assess, by level of socio-economic disadvantage, referral patterns to specialist palliative care and proximity to inpatient services. Methods: All inpatient and community palliative care services nationally were geocoded (using postcode) to one nationally standardised measure of socio-economic deprivation, the Socio-Economic Index for Areas (SEIFA; 2006 census data). Referral to palliative care services and the characteristics of referrals were described through data collected routinely at clinical encounters. Distance to the inpatient location was measured from each person's home postcode and stratified by socio-economic disadvantage. Results: This study covered July–December 2009, with data from 10,064 patients. People from the highest SEIFA group (least disadvantaged) were significantly less likely to be referred to a specialist palliative care service, more likely to be referred closer to death, and had more and longer episodes of inpatient care. The physical proximity of a person's home to inpatient care showed a gradient, with distance increasing as the level of socio-economic advantage decreased. Conclusion: These data suggest that a simple relationship between low socio-economic status and poor access to a referral-based specialty such as palliative care does not exist. Different patterns of referral, and hence different patterns of care, emerge.
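A hedged sketch of the stratification step: grouping patient records by SEIFA disadvantage quintile and summarising referral timing and home-to-inpatient-unit distance. The file name, column names, and quintile coding are assumptions for illustration, not the study's actual dataset.

```python
# Sketch: stratifying referral timing and proximity by SEIFA quintile.
# Column names are assumed placeholders for the routinely collected data.
import pandas as pd

patients = pd.read_csv("palliative_referrals.csv")
# Assumed columns:
#   seifa_quintile          1 = most disadvantaged ... 5 = least disadvantaged
#   days_referral_to_death  referral timing relative to death
#   km_home_to_inpatient    distance from home postcode to inpatient unit

summary = (patients
           .groupby("seifa_quintile")
           .agg(n=("seifa_quintile", "size"),
                median_days=("days_referral_to_death", "median"),
                median_km=("km_home_to_inpatient", "median")))
print(summary)
```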
Abstract:
This study makes the case for the Conversational Analytic (CA) method as a research approach that might both explicate and chronicle the features of the journalism interview. It seeks to encourage such research to help inform understanding of this form and to provide further lessons on the nature of journalism practice. Such studies might follow many paths, but this paper focuses in particular on the outcomes for the debate about the continued relevance of "objectivity" in informing professional journalism practice. To make the case for CA as a means through which the conduct of journalism practice might be explored, the paper examines the theories of the interaction order that gave rise to the CA method, outlines the key features of the journalism interview as explicated through the CA approach, and outlines the implications of such research for establishing the standing of "objectivity". It concludes by considering the wider relevance of such studies of journalism practice for a fracturing journalism field, which lacks benchmarks to measure the public benefit of the range of forms that now proliferate on the internet.
Abstract:
Modelling video sequences by subspaces has recently shown promise for recognising human actions. Subspaces are able to accommodate the effects of various image variations and can capture the dynamic properties of actions. Subspaces form a non-Euclidean, curved Riemannian manifold known as a Grassmann manifold. Inference on such manifolds is usually achieved by embedding them in higher-dimensional Euclidean spaces. In this paper, we instead propose to embed the Grassmann manifolds into reproducing kernel Hilbert spaces and then tackle the problem of discriminant analysis on such manifolds. To obtain efficient machinery, we propose graph-based local discriminant analysis that utilises within-class and between-class similarity graphs to characterise intra-class compactness and inter-class separability, respectively. Experiments on the KTH, UCF Sports, and Ballet datasets show that the proposed approach obtains marked improvements in discrimination accuracy compared with several state-of-the-art methods, such as the kernel version of the affine hull image-set distance, tensor canonical correlation analysis, spatial-temporal words, and the hierarchy of discriminative space-time neighbourhood features.
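As a hedged sketch of the embedding step, the code below represents each video clip as a linear subspace (an orthonormal basis from the SVD of its stacked frame features) and evaluates the standard projection kernel between subspaces, which embeds the Grassmann manifold into an RKHS. The feature extraction and the graph-based discriminant step are omitted, and all dimensions are illustrative.

```python
# Sketch: representing image sets as subspaces (points on a Grassmann
# manifold) and computing the projection kernel that embeds them into a
# reproducing kernel Hilbert space. Dimensions are illustrative only.
import numpy as np

def subspace_basis(frames, p):
    """Orthonormal basis of the p-dimensional subspace spanned by an
    image set; frames has shape (feature_dim, num_frames)."""
    U, _, _ = np.linalg.svd(frames, full_matrices=False)
    return U[:, :p]

def projection_kernel(X, Y):
    """k(X, Y) = ||X^T Y||_F^2 for orthonormal bases X, Y: a positive
    definite kernel on the Grassmann manifold."""
    return np.linalg.norm(X.T @ Y, ord="fro") ** 2

rng = np.random.default_rng(0)
A = subspace_basis(rng.standard_normal((100, 30)), p=5)  # action clip 1
B = subspace_basis(rng.standard_normal((100, 30)), p=5)  # action clip 2

print(projection_kernel(A, A))  # equals p for identical subspaces (5.0)
print(projection_kernel(A, B))  # smaller for dissimilar subspaces
```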
Abstract:
Much has been written on Michel Foucault's reluctance to clearly delineate a research method, particularly with respect to genealogy (Harwood 2000; Meadmore, Hatcher, & McWilliam 2000; Tamboukou 1999). Foucault (1994, p. 288) himself disliked prescription, stating, "I take care not to dictate how things should be", and wrote provocatively to disrupt equilibrium and certainty, so that "all those who speak for others or to others" no longer know what to do. It is doubtful, however, that Foucault ever intended for researchers to be stricken by that malaise to the point of being unwilling to make an intellectual commitment to methodological possibilities. Taking criticism of "Foucauldian" discourse analysis as a convenient point of departure to discuss the objectives of poststructural analyses of language, this paper develops what might be called a discursive analytic: a methodological plan for approaching the analysis of discourses through the location of statements that function with constitutive effects.
Abstract:
Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a shared mapping process that is flexible and user-friendly supports non-programmers in creating these custom connections. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
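A minimal sketch of the kind of user-defined mapping such a shared mapping process implies: a declarative field mapping that translates one tool's JSON output into another tool's expected input, exposed as a small web service. The endpoint, schema, and field names below are illustrative assumptions, not the proposed platform's API.

```python
# Minimal sketch of a user-defined mapping step in a distributed process
# network: a declarative mapping translates one design tool's JSON output
# into an analysis tool's expected input. All names are assumed examples.
from flask import Flask, jsonify, request

app = Flask(__name__)

# A non-programmer-editable mapping: target field -> source field.
MAPPING = {
    "surface_area_m2": "area",
    "glazing_ratio": "window_to_wall",
    "orientation_deg": "azimuth",
}

def apply_mapping(source: dict, mapping: dict) -> dict:
    """Build the target document by pulling mapped fields from the source."""
    return {tgt: source[src] for tgt, src in mapping.items() if src in source}

@app.route("/map", methods=["POST"])
def map_design_to_analysis():
    # Body: JSON exported by the design tool; response: analysis-tool input.
    return jsonify(apply_mapping(request.get_json(), MAPPING))

if __name__ == "__main__":
    app.run(port=5000)
```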
Abstract:
Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research and perform or manage research-related tasks such as data collection and data annotation. Citizen Science is technologically possible and scientifically significant. However, although research teams can save time and money by recruiting citizens who volunteer their time and skills to help with data analysis, the reliability of contributed data varies considerably. Data reliability issues are significant in Citizen Science because of the quantity and diversity of the people and devices involved: participants may submit low-quality, misleading, inaccurate, or even malicious data. Improving data reliability has therefore become an urgent need. This study investigates techniques to enhance the reliability of data contributed by citizens in scientific research projects, particularly acoustic sensing projects. In particular, we propose to design a reputation framework to enhance data reliability, and we investigate critical elements that should be considered when designing and developing new reputation systems.
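As a hedged sketch of one classic building block for such a reputation framework, the code below implements a Beta reputation score that updates a contributor's rating from accepted versus rejected submissions; applying it to acoustic annotations is an assumption for illustration, not the framework the study proposes.

```python
# Sketch: a Beta reputation score for citizen-science contributors,
# one classic building block for the kind of framework proposed here.
# The acoustic-annotation framing is an illustrative assumption.
from dataclasses import dataclass

@dataclass
class BetaReputation:
    alpha: float = 1.0  # prior pseudo-count of good submissions
    beta: float = 1.0   # prior pseudo-count of bad submissions

    def update(self, accepted: bool) -> None:
        """Record one vetted submission (e.g., an audio annotation
        checked against expert or consensus labels)."""
        if accepted:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def score(self) -> float:
        """Expected probability that the next submission is reliable."""
        return self.alpha / (self.alpha + self.beta)

rep = BetaReputation()
for outcome in [True, True, False, True, True, True]:
    rep.update(outcome)
print(f"reputation = {rep.score:.2f}")  # weight new contributions by this
```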
Abstract:
To obtain a more compact Superconducting Fault Current Limiter (SFCL), a special geometry of core and AC coil is required. This results in a unique magnetic flux pattern that differs from those associated with conventional round-core arrangements. In this paper the magnetic flux density within a Fault Current Limiter (FCL) is described. Both experimental and analytical approaches are considered. A small-scale, single-phase prototype of an FCL was constructed to conduct the experiments. The analysis covers both the steady state and the short-circuit condition. Simulation results were obtained using commercial software based on the Finite Element Method (FEM). The magnetic flux saturating the cores, the leakage magnetic flux giving rise to electromagnetic forces, and the leakage magnetic flux flowing in the enclosing tank are computed.
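As a rough illustration of the core-saturation check behind such an analysis, the sketch below applies the elementary magnetic-circuit relation B = mu0·mu_r·N·I / l to estimate core flux density against a saturation limit; all geometry and material values are invented, and this linear model ignores the leakage paths that the FEM analysis captures.

```python
# Back-of-envelope magnetic-circuit sketch for the steady-state case:
# core flux density from the AC coil MMF, checked against saturation.
# All values are invented; leakage flux (forces, tank flux) needs FEM,
# and the linear estimate is only meaningful up to B_sat.
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability (H/m)

mu_r = 4000        # relative permeability of core steel, assumed
N = 200            # AC coil turns, assumed
l_core = 0.8       # mean magnetic path length (m), assumed
B_sat = 1.8        # saturation flux density (T), typical electrical steel

for I in (1.0, 10.0):  # normal load vs fault-level current (A), assumed
    H = N * I / l_core               # magnetising field (A/m)
    B = MU0 * mu_r * H               # linear-core flux density estimate (T)
    state = "core saturates" if B > B_sat else "linear region"
    print(f"I = {I:5.1f} A -> B ~ {B:5.2f} T ({state})")
```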
Abstract:
Process mining is the research area concerned with knowledge discovery from information system event logs. Within process mining, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is of great value for the development and assessment of both process discovery and conformance techniques.
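A hedged sketch of what such an extensible framework's core abstraction might look like: a common metric interface plus a registry, so that different conformance metrics are computed consistently and comparably over the same log-model pair. This is a language-neutral illustration in Python, not the actual ProM plug-in API (ProM itself is Java-based).

```python
# Sketch of the core abstraction of an extensible conformance framework:
# a shared metric interface and a registry so metrics are computed in a
# consistent, comparable, repeatable way. Illustration only, not ProM.
from abc import ABC, abstractmethod

class ConformanceMetric(ABC):
    name: str

    @abstractmethod
    def compute(self, log: list[list[str]], model) -> float:
        """Return a value in [0, 1] for the event log against the model."""

REGISTRY: list[ConformanceMetric] = []

def register(metric: ConformanceMetric) -> None:
    REGISTRY.append(metric)

class TraceFitness(ConformanceMetric):
    """Toy fitness: fraction of traces the model can replay exactly."""
    name = "trace_fitness"

    def compute(self, log, model) -> float:
        accepted = sum(1 for trace in log if model.accepts(trace))
        return accepted / len(log)

class ToyModel:
    """Stand-in process model: a set of allowed activity sequences."""
    def __init__(self, allowed):
        self.allowed = {tuple(t) for t in allowed}
    def accepts(self, trace):
        return tuple(trace) in self.allowed

register(TraceFitness())
log = [["a", "b", "c"], ["a", "c"], ["a", "b", "c"]]
model = ToyModel(allowed=[["a", "b", "c"]])
for metric in REGISTRY:  # every registered metric sees the same inputs
    print(metric.name, "=", round(metric.compute(log, model), 2))
```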
Abstract:
Power system operation and planning face increasing uncertainties, especially with the deregulation process and increasing demand for power. Probabilistic power system stability assessment and probabilistic power system planning have been identified by EPRI as important trends in power system operations and planning. Probabilistic small-signal stability assessment studies the impact of system parameter uncertainties on small-disturbance stability characteristics. Research in this area has covered many uncertainty factors, such as controller parameter uncertainties and generation uncertainties. One of the most important factors in power system stability assessment is load dynamics. In this paper, a composite load model is used to study the impact of load parameter uncertainties on system small-signal stability characteristics. The results provide useful insight into the significant stability impact that load dynamics bring to the system, and they can help system operators in operation and planning analysis.
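A hedged sketch of the probabilistic assessment idea: sample the uncertain load parameter, form the linearised state matrix, and collect the damping ratio of the critical eigenvalue across samples. The toy two-state system and the parameter distribution below are illustrative assumptions, not the composite load model itself.

```python
# Monte Carlo sketch of probabilistic small-signal stability: sample an
# uncertain load parameter, build the linearised state matrix, and look
# at the damping ratio of the critical mode. The 2-state system and the
# parameter distribution are toy assumptions, not the composite load model.
import numpy as np

rng = np.random.default_rng(1)

def state_matrix(k_load):
    """Toy linearised swing dynamics; the damping term varies with the
    (uncertain) load parameter k_load."""
    return np.array([[0.0, 1.0],
                     [-25.0, -(0.5 + k_load)]])

def critical_damping_ratio(A):
    lam = np.linalg.eigvals(A)
    mode = lam[np.argmax(lam.real)]   # least-damped eigenvalue
    return -mode.real / abs(mode)

samples = rng.normal(loc=0.6, scale=0.3, size=5000)  # assumed distribution
zeta = np.array([critical_damping_ratio(state_matrix(k)) for k in samples])

print(f"mean damping ratio: {zeta.mean():.3f}")
print(f"P(zeta < 0.05):     {(zeta < 0.05).mean():.3f}")  # poorly damped
print(f"P(unstable):        {(zeta < 0).mean():.3f}")
```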