899 results for information bottleneck method


Relevance: 30.00%

Abstract:

Photocopy. Springfield, Va., Distributed by Clearinghouse for Federal Scientific and Technical Information [1969]

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

Capturing the voices of women when the issue is of a sensitive nature has been a major concern of feminist researchers. It has often been argued that interpretive methods are the most appropriate way to collect such information, but there are other appropriate ways to approach the design of research. This article explores the use of a mixed-method approach to collect data on incontinence in older women and argues for the use of a variety of creative approaches to collect and analyze data.

Relevance: 30.00%

Abstract:

Effective healthcare integration is underpinned by clinical information transfer that is timely, legible and relevant. The aim of this study was to describe and evaluate a method for best practice information exchange, based on the generic Mater integration methodology. Using this model, Mater Health Services increased effective community fax discharge from 34% in 1999 to 86% in 2002. These results were predicated on excellence in applied information technology, involving the development of the Mater Electronic Health Referral Summary, and on an effective change management methodology that addressed patient consent, clinician engagement, timely and appropriate education and training, executive leadership and commitment, and adequate resourcing. The challenge in achieving best practice information transfer lies not solely in the technology but also in implementing the change process and engaging clinicians. General practitioners valued the intervention highly. Hospital and community providers now have an inexpensive, effective product for exchanging critical information in a timely and relevant manner, enhancing the quality and safety of patient care.

Relevance: 30.00%

Abstract:

Purpose: The aim of this project was to design and evaluate a system that would produce tailored information for stroke patients and their carers, customised according to their informational needs, and facilitate communication between the patient and health professional. Method: A human factors development approach was used to develop a computer system that dynamically compiles stroke education booklets for patients and carers. Patients and carers are able to select the topics about which they wish to receive information, the amount of information they want, and the font size of the printed booklet. The system is designed so that the health professional interacts with it, thereby providing opportunities for communication between the health professional and patient/carer at a number of points in time. Results: Preliminary evaluation of the system by health professionals, patients and carers was positive. A randomised controlled trial that examines the effect of the system on patient and carer outcomes is underway. (C) 2004 Elsevier Ireland Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

This study examined the role of information, efficacy, and 3 stressors in predicting adjustment to organizational change. Participants were 589 government employees undergoing an 18-month process of regionalization. To examine whether the predictor variables had long-term effects on adjustment, the authors assessed psychological well-being, client engagement, and job satisfaction again at a 2-year follow-up. At Time 1, there was evidence to suggest that information was indirectly related to psychological well-being, client engagement, and job satisfaction, via its positive relationship to efficacy. There also was evidence to suggest that efficacy was related to reduced stress appraisals, thereby heightening client engagement. Last, there was consistent support for the stress-buffering role of Time 1 self-efficacy in the prediction of Time 2 job satisfaction.

Relevance: 30.00%

Abstract:

Computer-aided tomography has been used for many years to provide significant information about the internal properties of an object, particularly in the medical fraternity. By reconstructing one-dimensional (1D) X-ray images, 2D cross-sections and 3D renders can provide a wealth of information about an object's internal structure. An extension of the methodology is reported here to enable the characterization of a model agglomerate structure. It is demonstrated that methods based on X-ray microtomography offer considerable potential for validating and utilizing distinct element method simulations, which are also examined.
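The reconstruction idea this abstract relies on can be sketched in its very simplest form: a 2-D slice can be partially recovered by back-projecting 1-D projections across the grid. Real CT uses many angles and filtered back-projection; this unfiltered two-angle toy (row and column sums only, with an invented 5x5 "slice") can merely locate a single dense feature.

```python
def backproject(rows, cols):
    """Smear each 1-D projection back across the grid and sum the two views."""
    return [[rows[i] + cols[j] for j in range(len(cols))] for i in range(len(rows))]

# Toy "object": one dense voxel at (2, 3) in a 5x5 slice.
image = [[0.0] * 5 for _ in range(5)]
image[2][3] = 1.0
row_proj = [sum(r) for r in image]                                  # projection at 0 degrees
col_proj = [sum(image[i][j] for i in range(5)) for j in range(5)]   # projection at 90 degrees
recon = backproject(row_proj, col_proj)
```

The dense voxel shows up as the unique maximum of the back-projection, because it is the only cell where both smeared projections overlap.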

Relevance: 30.00%

Abstract:

This article describes the construction and use of a systematic structured method of mental health country situation appraisal, in order to help meet the need for conceptual tools to assist planners and policy makers develop and audit policy and implementation strategies. The tool encompasses the key domains of context, needs, resources, provisions and outcomes, and provides a framework for synthesizing key qualitative and quantitative information, flagging gaps in knowledge, and reviewing existing policies. It serves as an enabling tool to alert and inform policy makers, professionals and other key stakeholders about important issues which need to be considered in mental health policy development. It provides detailed country-specific information in a systematic format, to facilitate global sharing of experiences of mental health reform and strategies between policy makers and other stakeholders. Lastly, it is designed to be a capacity-building tool for local stakeholders to enhance situation appraisal, and multisectoral policy development and implementation.

Relevance: 30.00%

Abstract:

In this paper we apply a new method for the determination of surface area of carbonaceous materials, using the local surface excess isotherms obtained from Grand Canonical Monte Carlo (GCMC) simulation and a concept of area distribution in terms of the energy well-depth of solid–fluid interaction. The range of well-depth considered in our GCMC simulation is from 10 to 100 K, which is wide enough to cover all carbon surfaces that we dealt with (for comparison, the well-depth for a perfect graphite surface is about 58 K). Having the set of local surface excess isotherms and the differential area distribution, the overall adsorption isotherm can be obtained in an integral form. Thus, given experimental data of nitrogen or argon adsorption on a carbon material, the differential area distribution can be obtained by inversion, using the regularization method. The total surface area is then obtained as the area under this distribution. We test this approach with a number of data sets in the literature, and compare our GCMC surface area with that obtained from the classical BET method. In general, we find that the difference between these two surface areas is about 10%, underscoring the need for a consistent method to determine surface area reliably. We therefore suggest the approach of this paper as an alternative to the BET method, because of the long-recognized unrealistic assumptions used in the BET theory. Besides the surface area, this method also provides the differential area distribution versus well-depth. This distribution can be used as a microscopic fingerprint of the carbon surface; samples prepared from different precursors and different activation conditions are expected to have distinct fingerprints. We illustrate this with Cabot BP120, 280 and 460 samples: the differential area distributions obtained from argon and nitrogen adsorption, both at 77 K, show exactly the same patterns, suggesting that the distribution is indeed characteristic of each carbon.
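The inversion step described above can be sketched as a regularised linear least-squares problem. This is a minimal illustration, not the paper's implementation: the Langmuir-like kernel, the 77 K scaling, and all parameters below are invented stand-ins for the GCMC-simulated local isotherms, and a simple Tikhonov damping replaces whatever regularisation scheme the authors used.

```python
import numpy as np

def local_isotherm(p, eps):
    """Toy Langmuir-like local excess isotherm for a site of well-depth eps (K).
    Stand-in for the GCMC-simulated local isotherms used in the paper."""
    b = np.exp(eps / 77.0)              # deeper wells fill at lower pressure
    return b * p / (1.0 + b * p)

def invert_area_distribution(pressures, overall, eps_grid, lam=1e-3):
    """Recover the differential area distribution f from overall = K @ f by
    Tikhonov-regularised least squares, then clip: areas cannot be negative."""
    K = np.array([[local_isotherm(p, e) for e in eps_grid] for p in pressures])
    f = np.linalg.solve(K.T @ K + lam * np.eye(len(eps_grid)), K.T @ overall)
    return np.clip(f, 0.0, None)

# Synthetic check: build an overall isotherm from a known distribution peaked
# near graphite's ~58 K well-depth, then invert it.
eps_grid = np.linspace(10.0, 100.0, 10)        # well-depth range from the abstract
true_f = np.exp(-0.5 * ((eps_grid - 58.0) / 15.0) ** 2)
pressures = np.linspace(0.01, 2.0, 40)
K = np.array([[local_isotherm(p, e) for e in eps_grid] for p in pressures])
overall = K @ true_f
recovered = invert_area_distribution(pressures, overall, eps_grid)
```

The total surface area would then be the integral (here, the sum) of the recovered distribution over the well-depth grid.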

Relevance: 30.00%

Abstract:

Information security devices must preserve security properties even in the presence of faults. This in turn requires a rigorous evaluation of the system behaviours resulting from component failures, especially how such failures affect information flow. We introduce a compositional method of static analysis for fail-secure behaviour. Our method uses reachability matrices to identify potentially undesirable information flows based on the fault modes of the system's components.
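The reachability-matrix idea can be sketched as follows: model the system as a directed information-flow graph, take its transitive closure, and check whether a given fault mode opens a path from a classified source to a public sink. The component names and the fault mode below are hypothetical, and Warshall's algorithm stands in for whatever compositional analysis the paper defines.

```python
def transitive_closure(adj):
    """Warshall's algorithm on a 0/1 adjacency matrix: reach[i][j] == 1 iff
    information can flow from node i to node j along some path."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            if reach[i][k]:
                for j in range(n):
                    reach[i][j] = reach[i][j] or reach[k][j]
    return reach

# Hypothetical components of a security device.
nodes = ["secret_store", "filter", "log", "public_port"]
normal = [[0, 1, 0, 0],   # secret_store -> filter
          [0, 0, 1, 0],   # filter -> log (sanitised output only)
          [0, 0, 0, 0],   # log goes nowhere externally
          [0, 0, 0, 0]]
# Fault mode: the filter fails open and forwards raw data to the public port.
faulty = [row[:] for row in normal]
faulty[1][3] = 1

def leaks(adj):
    """Undesirable flow: the secret store can reach the public port."""
    reach = transitive_closure(adj)
    return bool(reach[nodes.index("secret_store")][nodes.index("public_port")])
```

Under normal operation no path reaches the public port from the secret store, but the fail-open fault mode introduces one, which is exactly the kind of flow a fail-secure analysis must flag.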

Relevance: 30.00%

Abstract:

We present experimental results on the measurement of fidelity decay under contrasting system dynamics using a nuclear magnetic resonance quantum information processor. The measurements were performed by implementing a scalable circuit in the model of deterministic quantum computation with only one quantum bit. The results show measurable differences between regular and complex behavior and, for complex dynamics, are faithful to the expected theoretical decay rate. Moreover, we illustrate how the experimental method can be seen as an efficient way either to extract coarse-grained information about the dynamics of a large system or to measure the decoherence rate from engineered environments.

Relevance: 30.00%

Abstract:

The notorious "dimensionality curse" is a well-known phenomenon for any multi-dimensional index attempting to scale up to high dimensions. One well-known approach to overcoming the degradation in performance as dimensionality increases is to reduce the dimensionality of the original dataset before constructing the index. However, identifying the correlation among the dimensions and effectively reducing them are challenging tasks. In this paper, we present an adaptive Multi-level Mahalanobis-based Dimensionality Reduction (MMDR) technique for high-dimensional indexing. Our MMDR technique has four notable features compared to existing methods. First, it discovers elliptical clusters for more effective dimensionality reduction by using only the low-dimensional subspaces. Second, data points in the different axis systems are indexed using a single B+-tree. Third, our technique is highly scalable in terms of data size and dimension. Finally, it is also dynamic and adaptive to insertions. An extensive performance study was conducted using both real and synthetic datasets, and the results show that our technique not only achieves higher precision, but also enables queries to be processed efficiently. Copyright Springer-Verlag 2005
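One ingredient of the approach described above can be sketched in isolation: projecting an elliptical cluster onto the principal axes of its own covariance, i.e. the axes implied by the Mahalanobis metric. This toy shows only that reduction step; the single B+-tree indexing, multi-level clustering, and adaptive insertion machinery from the abstract are not shown, and the data below is synthetic.

```python
import numpy as np

def reduce_cluster(points, k):
    """Project a cluster onto the top-k eigenvectors of its covariance matrix,
    so most of the cluster's variance survives in k dimensions."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    axes = vecs[:, ::-1][:, :k]            # top-k principal axes
    return centered @ axes

# Synthetic elongated (elliptical) 5-D cluster: variance concentrated in a few axes.
rng = np.random.default_rng(0)
cluster = rng.normal(size=(200, 5)) * np.array([10.0, 1.0, 0.5, 0.2, 0.1])
low = reduce_cluster(cluster, 2)           # 5-D points reduced to 2-D
```

Because the cluster is elliptical, two axes capture nearly all of its spread, which is what makes indexing the reduced points (e.g. in a B+-tree over a space-filling ordering) effective.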

Relevance: 30.00%

Abstract:

Adsorption of supercritical fluids is increasingly carried out to determine the micropore size distribution. This is largely motivated by advances in the use of supercritical adsorption in high-energy applications, such as hydrogen and methane storage in porous media. Experimental data are reported as mass excess versus pressure, and when these data are matched against the theoretical mass excess, significant errors can occur if the void volume used in the calculation of the experimental mass excess is incorrectly determined [Malbrunot, P.; Vidal, D.; Vermesse, J.; Chahine, R.; Bose, T. K. Langmuir 1997, 13, 539]. An incorrect value for the void volume leads to a wrong description of the maximum in the plot of mass excess versus pressure, as well as of the part of the isotherm over the pressure region where the isotherm is decreasing. Because of this uncertainty in the maximum and the decreasing part of the isotherm, we propose a new method that avoids these problems entirely. Our method involves only the relationship between the amount introduced into the adsorption cell and the equilibrium pressure. Working directly with this experimental information has two distinct advantages: first, the data are raw, involving no further calculation; second, this relationship always increases monotonically with pressure. We illustrate this new method with adsorption data for methane on a commercial sample of activated carbon.
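The contrast drawn above can be illustrated numerically. This is a toy model with invented parameters (a Langmuir absolute isotherm and an ideal-gas bulk density, with units absorbed into the constants), not the paper's data: the excess amount passes through a maximum and then decreases with pressure, while the total amount introduced into the cell rises monotonically and so sidesteps the void-volume ambiguity.

```python
N_M, B_AFF = 10.0, 0.5       # hypothetical Langmuir capacity and affinity
V_ADS, V_VOID = 0.05, 0.5    # adsorbed-phase volume and cell void volume

def rho(p):
    """Ideal-gas bulk density; constants absorbed into the volumes above."""
    return p

pressures = [0.5 * k for k in range(1, 201)]                    # 0.5 .. 100.0
absolute = [N_M * B_AFF * p / (1.0 + B_AFF * p) for p in pressures]
# Excess = absolute amount minus gas that would occupy the adsorbed-phase volume.
excess = [na - V_ADS * rho(p) for na, p in zip(absolute, pressures)]
# Amount introduced = adsorbed amount plus gas filling the void volume.
introduced = [na + V_VOID * rho(p) for na, p in zip(absolute, pressures)]
```

In this toy the excess curve peaks at an interior pressure and then falls, exactly the region the abstract identifies as error-prone, whereas the introduced-amount curve is strictly increasing at every pressure.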

Relevance: 30.00%

Abstract:

Shear strengthening is required when an RC beam is found deficient in shear, or when its shear capacity falls below its flexural capacity after flexural strengthening. A recent technique for the shear strengthening of RC beams is to provide additional FRP web reinforcement, commonly in the form of bonded external FRP strips/sheets. Over the last few years, several experimental studies have been conducted on this new strengthening technique, and these have established its effectiveness. While experimental methods of investigation are extremely useful in obtaining information about the composite behaviour of FRP and reinforced concrete, the use of numerical models such as the one presented in this paper helps in developing a good understanding of the behaviour at lower cost. In the study presented in this paper, the ANSYS finite element program is used to examine the response of beams strengthened in shear by FRPs. The FE model is calibrated against test results obtained at the University of Kentucky. Once validated, the model is used to examine the influence of fibre orientation, compressive strength of concrete, area of tensile and compressive reinforcement, and the amount of and distance between stirrups on the strength and ductility of FRP-strengthened beams.

Relevance: 30.00%

Abstract:

Land-surface processes include a broad class of models that operate at a landscape scale. Current modelling approaches tend to be specialised towards one type of process, yet it is the interaction of processes that is increasingly seen as important to obtaining a more integrated approach to land management. This paper presents a technique and a tool that may be applied generically to landscape processes. The technique tracks moving interfaces across landscapes for processes such as water flow, biochemical diffusion, and plant dispersal. Its theoretical development applies a Lagrangian approach to motion over a Eulerian grid space by tracking quantities across a landscape as an evolving front. An algorithm for this technique, called the level set method, is implemented in a geographical information system (GIS). It fits the field data model in GIS and is implemented as operators in map algebra. The paper describes an implementation of the level set method in a map algebra programming language, called MapScript, and gives example program scripts for applications in ecology and hydrology.
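The front-tracking idea behind the level set method can be sketched on a small raster grid. This is plain Python, not MapScript, and the grid size, seed location, and unit speed are illustrative: the front is the zero contour of a function phi, evolved by the standard first-order upwind scheme for phi_t = -F |grad phi| with outward speed F > 0.

```python
import math

def evolve(phi, speed, dt, steps):
    """Advance phi by phi_t = -F |grad phi| with a first-order upwind scheme
    (boundary cells are left fixed for simplicity)."""
    n = len(phi)
    for _ in range(steps):
        new = [row[:] for row in phi]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                dx_m = phi[i][j] - phi[i - 1][j]   # backward differences
                dx_p = phi[i + 1][j] - phi[i][j]   # forward differences
                dy_m = phi[i][j] - phi[i][j - 1]
                dy_p = phi[i][j + 1] - phi[i][j]
                # Upwind gradient magnitude for an outward-moving front (F > 0).
                grad = math.sqrt(max(dx_m, 0.0) ** 2 + min(dx_p, 0.0) ** 2 +
                                 max(dy_m, 0.0) ** 2 + min(dy_p, 0.0) ** 2)
                new[i][j] = phi[i][j] - dt * speed * grad
        phi = new
    return phi

# Signed distance to a seed near the grid centre; phi < 0 marks cells inside
# the front (e.g. the wetted or colonised area in the landscape analogy).
n = 21
phi0 = [[math.hypot(i - 10, j - 10) - 1.5 for j in range(n)] for i in range(n)]
phi1 = evolve(phi0, speed=1.0, dt=0.5, steps=8)
inside = sum(1 for row in phi1 for v in row if v < 0)
```

Each update is a local neighbourhood operation on a grid, which is why the method maps naturally onto map algebra operators over a GIS field data model.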