512 results for Reliable Computations


Relevance:

10.00%

Publisher:

Abstract:

Sing & Grow is an early intervention music therapy project that provides community group music therapy programs to families with young children who encounter risk factors that may impact on parenting and optimal child development. A variety of evaluation tools were devised and used over the first 3 years of the project. Upon the subsequent funding and expansion of the project at the end of this period, it was necessary to find, test and devise more rigorous, valid and reliable measures to withstand the scrutiny of researchers, and to address the concerns and criticisms associated with the previous methods of data collection. An action inquiry project was therefore undertaken with two groups of project participants to trial the use of the Parenting Stress Index and the Depression, Anxiety and Stress Scales, both recommended by leading psychologists. Key findings that will be discussed include the friction between the deficit-focussed nature of many psychometric tools and the strengths-based approach taken in service delivery, the level of difficulty in terms of literacy and comprehension for vulnerable respondents, and the lack of a single tool able to comprehensively measure all aspects of a broad-scoping program. Keywords: music therapy, evaluation, PSI, DASS, action inquiry.

Relevance:

10.00%

Publisher:

Abstract:

Traditional technology adoption models identified ‘ease of use’ and ‘usefulness’ as the dominant factors in technology adoption. However, recent studies in healthcare have established that these two factors are not always reliable on their own and that other factors may influence technology adoption. To identify these additional factors, a mixed-method approach was used and data were collected through interviews and a survey. The survey instrument was developed specifically for this study so that it would be relevant to the Indian healthcare setting. We identified clinical management and technological barriers as the dominant factors influencing wireless handheld technology adoption in the Indian healthcare environment. The results of this study showed that new technology adoption models will benefit from considering the clinical influences of wireless handheld technology, in addition to the known factors. The scope of this study is restricted to wireless handheld devices such as PDAs, smart phones, and handheld PCs.

Relevance:

10.00%

Publisher:

Abstract:

The responsibility to protect ('R2P') principle articulates the obligations of the international community to prevent conflict occurring, to intervene in conflicts, and to assist in rebuilding after conflicts. The doctrine is about protecting civilians in armed conflicts from four mass atrocity crimes: genocide, war crimes, crimes against humanity and ethnic cleansing. This book examines interventions in East Timor, Sri Lanka, Sudan and Kosovo. The chapters explore and question UN debates on the doctrine both before and after its adoption in 2005; contrasting state attitudes to international military intervention; and what takes place after intervention. It also discusses the ability of the Security Council to access reliable information and to use credible and transparent processes in determining whether atrocities have occurred in a Member State. Questioning whether there is a need for a closer operational link between the responsibilities to prevent and to react, and a normative link between R2P and principles of international law, the contributions examine the effectiveness of the R2P framework for international decision-making in response to mass atrocity crimes, and ask how an international system to deal with threats and mass atrocities can be developed in the absence of a central authority. This book will be valuable to those interested in international law, human rights, and security, peace and conflict studies.

Relevance:

10.00%

Publisher:

Abstract:

The antiretroviral therapy (ART) program for People Living with HIV/AIDS (PLHIV) in Vietnam has been scaled up rapidly in recent years (from 50 clients in 2003 to almost 38,000 in 2009). ART success is highly dependent on the ability of patients to fully adhere to the prescribed treatment regimen. Despite the remarkable extension of ART programs in Vietnam, HIV/AIDS program managers still have little reliable data on levels of ART adherence and the factors that might promote or reduce adherence. Several previous studies in Vietnam estimated extremely high levels of ART adherence among their samples, although there are reasons to question the veracity of the conclusion that adherence is nearly perfect. Further, no study has quantitatively assessed the factors influencing ART adherence. To address these gaps, this study was designed to include several phases and used a multi-method approach to examine levels of ART non-adherence and its relationship to a range of demographic, clinical, social and psychological factors. The study began with an exploratory qualitative phase employing four focus group discussions and 30 in-depth interviews with PLHIV, peer educators, carers and health care providers (HCPs). Survey interviews were completed with 615 PLHIV in five rural and urban out-patient clinics in northern Vietnam using an Audio Computer-Assisted Self-Interview (ACASI) and clinical records extraction. The survey instrument was carefully developed through a systematic procedure to ensure its reliability and validity. Cultural appropriateness was considered in the design and implementation of both the qualitative study and the cross-sectional survey. The qualitative study uncovered several contrary perceptions between health care providers and HIV/AIDS patients regarding the true levels of ART adherence. Health care providers often stated that most of their patients closely adhered to their regimens, while PLHIV and their peers reported that “it is not easy” to do so. The quantitative survey findings supported the view of PLHIV and their peers in the qualitative study, because non-adherence to ART was relatively common in the study sample. Using the ACASI technique, the estimated prevalence of one-month non-adherence measured by the Visual Analogue Scale (VAS) was 24.9%, and the prevalence of four-day not-on-time adherence using the modified Adult AIDS Clinical Trials Group (AACTG) instrument was 29%. Observed agreement between the two measures was 84% and the kappa coefficient was 0.60 (SE = 0.04, p < 0.0001). The good agreement between the two measures in the current study is consistent with previous research and provides cross-validation of the estimated adherence levels. The qualitative study was also valuable in suggesting important variables for the survey conceptual framework and instrument development. The survey confirmed significant correlations between the two measures of ART adherence (i.e. dose adherence and time adherence) and many factors identified in the qualitative study, but failed to find evidence of significant correlations between some other factors and ART adherence. Non-adherence to ART was significantly associated with untreated depression, heavy alcohol use, illicit drug use, experiences of medication side-effects, chance health locus of control, low quality of information from HCPs, low satisfaction with received support and poor social connectedness.
No multivariate association was observed between ART adherence and age, gender, education, duration of ART, the use of adherence aids, disclosure of ART, patients’ ability to initiate communication with HCPs, or distance between clinic and patients’ residence. This is the largest study yet reported in Asia to examine non-adherence to ART and its possible determinants. The evidence strongly supports recent calls from other developing nations for HIV/AIDS services to provide screening, counseling and treatment for patients with depressive symptoms, heavy alcohol use and illicit substance use. Counseling should also address fatalistic beliefs about chance or luck determining health outcomes. The data suggest that adherence could be enhanced by regularly providing information on ART and assisting patients to maintain social connectedness with their family and the community. This study highlights the benefits of using a multi-method approach in examining complex barriers and facilitators of medication adherence. It also demonstrated the utility of the ACASI interview method to enhance open disclosure by people living with HIV/AIDS and thus increase the veracity of self-reported data.
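The agreement statistics reported above can be reproduced with a few lines of code. The sketch below shows how observed agreement and Cohen's kappa are computed from a 2×2 cross-classification of two binary adherence measures; the cell counts are illustrative values chosen to be consistent with the reported 84% agreement and kappa of 0.60, not the study's actual data.

```python
# Minimal sketch: observed agreement and Cohen's kappa for two binary
# adherence classifications (e.g. VAS vs. modified AACTG). Counts are
# illustrative only, not the study's data.
import numpy as np

def cohens_kappa(table):
    """table[i][j] = count of subjects rated i by measure 1 and j by measure 2."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n                             # raw agreement
    p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2  # chance agreement
    return p_observed, (p_observed - p_expected) / (1 - p_expected)

# Hypothetical 2x2 cross-classification: adherent / non-adherent
agreement, kappa = cohens_kappa([[400, 50], [48, 117]])
print(f"observed agreement = {agreement:.2f}, kappa = {kappa:.2f}")
```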

Relevance:

10.00%

Publisher:

Abstract:

Stochastic models for competing clonotypes of T cells, formulated as multivariate, continuous-time, discrete-state Markov processes, have been proposed in the literature by Stirk, Molina-París and van den Berg (2008). A stochastic modelling framework is important because of rare events associated with small populations of some critical cell types. Usually, computational methods for these problems employ a trajectory-based approach based on Monte Carlo simulation, partly because the complementary probability density function (PDF) approaches can be expensive. Here, however, we describe some efficient PDF approaches that directly solve the governing equations, known as the Master Equation. These computations are made very efficient through an approximation of the state space by the Finite State Projection and through the use of Krylov subspace methods when evaluating the action of the matrix exponential. These computational methods allow us to explore the evolution of the PDFs associated with these stochastic models, and bimodal distributions arise in some parameter regimes. Time-dependent propensities naturally arise in immunological processes due to, for example, age-dependent effects. Incorporating time-dependent propensities into the framework of the Master Equation significantly complicates the corresponding computational methods, but here we describe an efficient approach via Magnus formulas. Although this contribution focuses on the example of competing clonotypes, the general principles are relevant to multivariate Markov processes and provide fundamental techniques for computational immunology.
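To illustrate the PDF approach the abstract describes, the sketch below assembles the Master Equation generator for a toy one-dimensional birth-death process (a stand-in for the clonotype model, which is not specified here), truncates the state space in the spirit of the Finite State Projection, and evolves the probability vector by the action of the matrix exponential using SciPy. The propensity constants are arbitrary illustrative values.

```python
# Sketch of the PDF (Master Equation) approach on a toy birth-death process.
# The state space is truncated (Finite State Projection) and the probability
# vector is evolved by the action of the matrix exponential.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import expm_multiply

N = 200                      # FSP truncation: states 0..N-1
birth, death = 2.0, 0.1      # toy propensity constants

A = lil_matrix((N, N))       # generator: dp/dt = A p
for n in range(N):
    if n + 1 < N:
        A[n + 1, n] = birth          # birth inflow; births past the boundary are lost (FSP)
    if n >= 1:
        A[n - 1, n] = death * n      # death inflow
    A[n, n] = -(birth + death * n)   # total outflow, including truncated births

p0 = np.zeros(N); p0[10] = 1.0       # start with 10 cells
p_t = expm_multiply(A.tocsr(), p0, start=0.0, stop=50.0, num=6)   # P(t) at 6 times

print("probability mass retained:", p_t[-1].sum())  # 1 - this = FSP truncation error
```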

Relevance:

10.00%

Publisher:

Abstract:

Introduction and Aims: Alcohol expectancies are associated with drinking behaviour and with post-drinking thoughts, feelings and behaviours. The expectancies held by specific cultural or sub-cultural groups have rarely been investigated. This research maps expectancies specific to gay and other men who have sex with men (MSM) and their relationship with substance use. This study describes the development of a measure of such beliefs for alcohol, the Drinking Expectancy Questionnaire for Men who have Sex with Men (DEQ-MSM). Design and Methods: Items selected through a focus group and interviews were piloted on 220 self-identified gay or other MSM via an online questionnaire. Results: Factor analysis revealed three distinct substance reinforcement domains ('Cognitive impairment', 'Sexual activity' and 'Social and emotional facilitation'). These factors were associated with patterns of alcohol consumption and, in a crucial test of discriminant validity, were not associated with the consumption of cannabis or stimulants. Similarities to and differences from existing measures are also discussed. Discussion and Conclusions: The DEQ-MSM is a reliable and valid measure of outcome expectancies related to alcohol use among MSM, and represents an important advance, as no existing alcohol expectancy measure has been developed or normed for use among this group. Future applications of the DEQ-MSM in health promotion, clinical settings and research may contribute to reducing harm associated with alcohol use among MSM, including the development of alcohol use among young gay men.
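As an aside on the analysis step, the sketch below shows a generic exploratory extraction of three latent factors from item-response data using scikit-learn; the data are simulated placeholders, and the real study's item pool and factoring choices are not given in the abstract.

```python
# Illustrative sketch of the scale-development step: extract three latent
# factors from questionnaire item responses. Data here are simulated; the
# real study used responses from 220 MSM participants.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 220, 18
X = rng.normal(size=(n_respondents, n_items))    # placeholder item scores

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(X)
loadings = fa.components_.T                      # items x factors
print(loadings.shape)                            # (18, 3): inspect which items load where
```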

Relevance:

10.00%

Publisher:

Abstract:

This paper reports the feasibility and methodological considerations of the Short Message Service Experience Sampling (SMS-ES) Method, an experience sampling research method developed to help researchers collect repeated measures of consumers’ affective experiences. The method combines SMS with web-based technology in a simple yet effective way. It is described using a practical implementation study that collected consumers’ emotions in response to using mobile phones in everyday situations. The method is evaluated in terms of the quality of data collected in the study, as well as against the methodological considerations for experience sampling studies. These two evaluations suggest that the SMS-ES Method is a valid and reliable approach for collecting consumers’ affective experiences. Moreover, the method can be applied across a range of for-profit and not-for-profit contexts where researchers want to capture repeated measures of consumers’ affective experiences over a period of time. The benefits of the method are discussed to assist researchers who wish to apply the SMS-ES Method in their own research designs.
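The abstract does not give implementation details, but one core ingredient of experience sampling is the prompt schedule. The sketch below, under my own assumptions, draws a few random signal times per day within a waking window, the kind of signal-contingent schedule an SMS-ES study might use to trigger text messages linking to a web questionnaire; the prompt text and survey URL are placeholders.

```python
# Sketch of signal-contingent scheduling for an experience-sampling study:
# k prompts per day at random times within a waking window, spaced at least
# min_gap_minutes apart. SMS gateway and survey URL are hypothetical.
import random
from datetime import datetime, timedelta

def daily_prompt_times(day, k=5, start_hour=9, end_hour=21, min_gap_minutes=60):
    """Draw k prompt times in [start_hour, end_hour), at least min_gap apart."""
    window = (end_hour - start_hour) * 60
    while True:
        offsets = sorted(random.sample(range(window), k))
        if all(b - a >= min_gap_minutes for a, b in zip(offsets, offsets[1:])):
            break
    base = day.replace(hour=start_hour, minute=0, second=0, microsecond=0)
    return [base + timedelta(minutes=m) for m in offsets]

for t in daily_prompt_times(datetime(2010, 5, 3)):
    print(t.strftime("%H:%M"), "-> send SMS: 'How do you feel right now?",
          "Reply or open https://example.org/survey'")
```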

Relevance:

10.00%

Publisher:

Abstract:

We describe a model of computation of the parallel type, which we call 'computing with bio-agents', based on the concept that the motions of biological objects such as bacteria or protein molecular motors in confined spaces can be regarded as computations. We begin with the observation that the geometric nature of the physical structures in which biological objects move modulates their motions. Consequently, by changing the geometry, one can control the characteristic trajectories of the objects; on this basis, we argue that such systems are computing devices. We investigate the computing power of mobile bio-agent systems and show that they are computationally universal in the sense that they are capable of computing any Boolean function in parallel. We argue also that, under appropriate conditions, bio-agent systems can solve NP-complete problems in probabilistic polynomial time.
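As a loose illustration of the parallelism being claimed (my own toy construction, not the authors' model), the sketch below treats each binary split junction in a network as fixing one input bit, so a wave of agents fanning out explores every assignment of a Boolean formula simultaneously:

```python
# Toy illustration: each split junction in a geometric network is a binary
# choice, so agents fanning out explore every input assignment of a Boolean
# formula in parallel -- the brute-force parallelism behind the NP claim.
from itertools import product

def formula(x):                       # example function: (x0 AND x1) OR NOT x2
    return (x[0] and x[1]) or not x[2]

def agent_wave(n_inputs, f):
    """Each agent's path through n split junctions fixes one assignment."""
    return {path: f(path) for path in product((False, True), repeat=n_inputs)}

results = agent_wave(3, formula)
satisfying = [p for p, out in results.items() if out]
print(f"{len(results)} agents explored all paths; {len(satisfying)} accepted")
```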

Relevance:

10.00%

Publisher:

Abstract:

Background: Bioimpedance techniques provide a reliable method of assessing unilateral lymphedema in a clinical setting. Bioimpedance devices are traditionally used to assess body composition at a current frequency of 50 kHz. However, these devices are not directly transferable to the assessment of lymphedema, as the sensitivity of measuring the impedance of extracellular fluid is frequency-dependent. It has previously been shown that the best frequency at which to detect extracellular fluid is 0 kHz (DC). However, measurement at this frequency is not possible in practice due to the high skin impedance at DC, so an estimate is usually determined from low-frequency measurements. This study investigated the efficacy of various low-frequency ranges for the detection of lymphedema. Methods and Results: Limb impedance was measured at 256 frequencies between 3 kHz and 1000 kHz for a sample control population, an arm lymphedema population, and a leg lymphedema population. Limb impedance was measured using the ImpediMed SFB7 and ImpediMed L-Dex® U400 with equipotential electrode placement on the wrists and ankles. The contralateral limb impedance ratio for arms and legs was used to calculate a lymphedema index (L-Dex) at each measurement frequency. The standard deviation of the limb impedance ratio in a healthy control population has been shown to increase with frequency for both the arm and the leg. Box-and-whisker plots of the spread of the control and lymphedema populations show good differentiation between the arm and leg L-Dex measured for lymphedema subjects and that measured for control subjects up to a frequency of about 30 kHz. Conclusions: It can be concluded that impedance measurements above a frequency of 30 kHz have decreased sensitivity to extracellular fluid and are not reliable for early detection of lymphedema.
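A minimal sketch of the index computation is given below: the inter-limb impedance ratio at a single frequency, standardized against a healthy-control reference distribution. The reference mean/SD, the ×10 scaling and the sample readings are illustrative assumptions, not ImpediMed's calibration.

```python
# Sketch of an L-Dex-style index at one measurement frequency: the impedance
# ratio of the limbs, standardized against a healthy-control reference.
# Reference values and the x10 scaling are illustrative assumptions.
def lymphedema_index(z_at_risk, z_contralateral, ref_mean, ref_sd):
    ratio = z_contralateral / z_at_risk   # extra fluid lowers impedance of the at-risk limb
    return 10.0 * (ratio - ref_mean) / ref_sd

# Hypothetical low-frequency (~10 kHz) arm readings in ohms
print(lymphedema_index(z_at_risk=260.0, z_contralateral=310.0,
                       ref_mean=1.0, ref_sd=0.05))
```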

Relevance:

10.00%

Publisher:

Abstract:

Objective: Involuntary commitment and treatment (IC&T) of people affected by mental illness may have reference to considerations of dangerousness and/or need for care. While attempts have been made to classify mental health legislation according to whether IC&T has obligatory dangerousness criteria, there is no standardised procedure for making classification decisions. The aim of this study was to develop and trial a classification procedure and apply it to Australia's mental health legislation. Method: We developed benchmarks for ‘need for care’ and ‘dangerousness’ and applied these benchmarks to classify the mental health legislation of Australia's 8 states and territories. Our focus was on civil commitment legislation rather than criminal commitment legislation. Results: One state changed its legislation during the course of the study, resulting in two classificatory exercises. In our initial classification, we were able to classify the IC&T provisions in legislation from 6 of the 8 jurisdictions as being based on either ‘need for care’ or ‘dangerousness’. Two jurisdictions used terminology that was outside the established benchmarks. In our second classification, we were again able to classify the IC&T provisions in 6 of the 8 jurisdictions. Of the 6 Acts that could be classified, all based IC&T on ‘need for care’ and none contained mandatory ‘dangerousness’ criteria. Conclusions: The classification system developed for this study provided a transparent and probably reliable means of classifying 75% of Australia's mental health legislation. The inherent ambiguity of the terminology used in two jurisdictions means that further development of the classification may not be possible until the meaning of the terms used has been addressed in case law. With respect to the 6 jurisdictions for which classification was possible, the findings suggest that Australia's mental health legislation relies on ‘need for care’ and not on ‘dangerousness’ as the guiding principle for IC&T. Keywords: involuntary commitment; mental health legislation; dangerousness; Australia

Relevance:

10.00%

Publisher:

Abstract:

There are many applications in aeronautical/aerospace engineering where some values of the design parameters or states cannot be provided or determined accurately. These values can be related to the geometry (wingspan, length, angles) and/or to operational flight conditions that vary due to uncertain parameters (Mach number, angle of attack, air density and temperature, etc.). These uncertain design parameters cannot be ignored in engineering design and must be taken into account in the optimisation task to produce more realistic and reliable solutions. In this paper, a robust/uncertainty design method with statistical constraints is introduced to produce a set of reliable solutions which have high performance and low sensitivity. The robust design concept is coupled with Multi-Objective Evolutionary Algorithms (MOEAs) by applying two statistical sampling formulas, the mean and variance/standard deviation, to the optimisation fitness/objective functions. The methodology is based on a canonical evolution strategy and incorporates the concepts of hierarchical topology, parallel computing and asynchronous evaluation. It is applied to two practical Unmanned Aerial System (UAS) design problems: the first case considers robust multi-objective (single-discipline: aerodynamics) design optimisation, and the second considers robust multidisciplinary (aero-structures) design optimisation. Numerical results show that the solutions obtained by the robust design method with statistical constraints have more reliable performance and lower sensitivity in both aerodynamics and structures when compared to the baseline design.
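The statistical-sampling idea is straightforward to sketch: sample the uncertain flight conditions, evaluate the objective at each sample, and hand the mean and standard deviation to the optimiser as the performance and sensitivity measures. In the sketch below the "drag" objective and the uncertainty distributions are toy placeholders, not an aerodynamic solver.

```python
# Sketch of robust fitness evaluation under uncertainty: sample uncertain
# operating conditions (here Mach and angle of attack), evaluate the
# objective at each sample, and return (mean, standard deviation) as the
# statistics a robust MOEA would optimise. The drag model is a placeholder.
import numpy as np

def drag_objective(design, mach, alpha):
    span, sweep = design
    return (0.02 + 0.001 * sweep) * (1 + 0.5 * (mach - 0.3) ** 2) \
           + 0.004 * alpha**2 / span            # toy surrogate for a CFD output

def robust_fitness(design, n_samples=100, seed=1):
    rng = np.random.default_rng(seed)
    mach = rng.normal(0.30, 0.02, n_samples)    # uncertain flight conditions
    alpha = rng.normal(2.0, 0.5, n_samples)
    f = np.array([drag_objective(design, m, a) for m, a in zip(mach, alpha)])
    return f.mean(), f.std()                    # optimise both: performance and sensitivity

mean_f, std_f = robust_fitness(design=(4.0, 10.0))
print(f"mean objective = {mean_f:.4f}, sensitivity = {std_f:.4f}")
```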

Relevance:

10.00%

Publisher:

Abstract:

Conventional planning and decision making, with its sectoral and territorial emphasis and flat-map-based processes, is no longer adequate or appropriate for the increased complexity confronting airport/city interfaces. These crowded and often contested governance spaces demand a more iterative and relational planning and decision-making approach. Emergent GIS-based planning and decision-making tools provide a mechanism that integrates and visually displays an array of complex data, frameworks and scenarios/expectations, often in ‘real time’ computations. In so doing, these mechanisms provide a common ground for decision making and facilitate a more ‘joined-up’ approach to airport/city planning. This paper analyses the contribution of the Airport Metropolis Planning Support System (PSS) to sub-regional planning in the Brisbane Airport case environment.

Relevance:

10.00%

Publisher:

Abstract:

It is recognised that individuals do not always respond honestly when completing psychological tests. One of the foremost issues for research in this area is the inability to detect individuals attempting to fake. While a number of faking strategies have been identified, a commonality of these strategies is the latent role of long-term memory. Seven studies were conducted to examine whether it is possible to detect the activation of faking-related cognitions using a lexical decision task. Study 1 found that engagement with experiential processing styles predicted the ability to fake successfully, confirming the role of associative processing styles in faking. After identifying appropriate stimuli for the lexical decision task (Studies 2A and 2B), Studies 3 to 5 examined whether a cognitive state of faking could be primed and subsequently identified using a lexical decision task. Throughout these studies, the experimental methodology was increasingly refined in an attempt to isolate the relevant priming mechanisms. The results were consistent and robust across the three priming studies: faking good on a personality test primed positive faking-related words in the lexical decision tasks. Faking bad, however, did not result in reliable priming of negative faking-related cognitions. To more completely address potential issues with the stimuli and the possible role of affective priming, two additional studies were conducted. Studies 6A and 6B revealed that negative faking-related words were more arousing than positive faking-related words, and that positive faking-related words were more abstract than negative faking-related words and neutral words. Study 7 examined whether the priming effects evident in the lexical decision tasks occurred as a result of an unintentional mood induction while faking the psychological tests; results were equivocal in this regard. This program of research aligned the fields of psychological assessment and cognition to inform the preliminary development and validation of a new tool to detect faking. Consequently, an implicit technique to identify attempts to fake good on a psychological test has been identified, using long-established and robust cognitive theories in a novel and innovative way. This approach represents a new paradigm for the detection of individuals responding strategically to psychological testing. With continuing development and validation, this technique may have immense utility in the field of psychological assessment.
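For readers unfamiliar with how priming is quantified in a lexical decision task, the sketch below computes a priming effect as the response-time difference between neutral and faking-related words, with a simple t-test; the response times are simulated, and the actual designs and analyses of the seven studies are not given in the abstract.

```python
# Sketch of a standard lexical-decision priming analysis: compare response
# times for faking-related vs. neutral words after a faking-good induction.
# RT data are simulated; a real analysis would use per-trial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
rt_related = rng.normal(560, 60, 40)   # ms, positive faking-related words
rt_neutral = rng.normal(600, 60, 40)   # ms, neutral words

priming = rt_neutral.mean() - rt_related.mean()   # positive = facilitation
t, p = stats.ttest_ind(rt_neutral, rt_related)
print(f"priming effect = {priming:.0f} ms, t = {t:.2f}, p = {p:.4f}")
```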

Relevance:

10.00%

Publisher:

Abstract:

Cold-formed steel stud walls are a major component of Light Steel Framing (LSF) building systems used in commercial, industrial and residential buildings. In conventional LSF stud wall systems, thin steel studs are protected from fire by placing one or two layers of plasterboard on both sides, with or without cavity insulation. However, there is very limited data about the structural and thermal performance of stud wall systems, and past research showed contradictory results, for example about the benefits of cavity insulation. This research was therefore conducted to improve the knowledge and understanding of the structural and thermal performance of cold-formed steel stud wall systems (both load-bearing and non-load-bearing) under fire conditions, and to develop new improved stud wall systems, including reliable and simple methods to predict their fire resistance rating. Full-scale fire tests of cold-formed steel stud wall systems formed the basis of this research. This research proposed an innovative LSF stud wall system in which a composite panel made of two plasterboards with insulation between them was used to improve the fire rating. Hence fire tests included both conventional steel stud walls with and without cavity insulation and the new composite panel system. A propane-fired gas furnace was specially designed and constructed first. The furnace was designed to deliver heat in accordance with the standard time-temperature curve proposed by AS 1530.4 (SA, 2005). A compression loading frame capable of loading the individual studs of a full-scale steel stud wall system was also designed and built for the load-bearing tests. Fire tests included comprehensive time-temperature measurements across the thickness and along the length of all the specimens using K-type thermocouples. They also included measurements of the load-deformation characteristics of stud walls until failure. The first phase of fire tests included 15 small-scale fire tests of gypsum plasterboards and composite panels using different types of insulating material of varying thickness and density. The fire performance of single and multiple layers of gypsum plasterboard was assessed, including the effect of interfaces between adjacent plasterboards on thermal performance. The effects of insulations such as glass fibre, rock fibre and cellulose fibre were also determined, while the tests provided important data on the temperature at which fall-off of the external plasterboards occurred. In the second phase, nine small-scale non-load-bearing wall specimens were tested to investigate the thermal performance of conventional and innovative steel stud wall systems. The effects of single and multiple layers of plasterboard, with and without vertical joints, were investigated. The new composite panels were seen to offer greater thermal protection to the studs than the conventional panels. In the third phase of fire tests, nine full-scale load-bearing wall specimens were tested to study the thermal and structural performance of the load-bearing wall assemblies. A full-scale test was also conducted at ambient temperature. These tests showed that the use of cavity insulation led to inferior fire performance of the walls, and provided good explanations and supporting research data to overcome the incorrect industry assumptions about cavity insulation.
They demonstrated that using insulation externally in a composite panel enhanced the thermal and structural performance of stud walls and increased their fire resistance rating significantly. Hence this research recommends the use of the new composite panel system for cold-formed LSF walls. This research also included steady-state tensile tests at ambient and elevated temperatures to address the lack of reliable mechanical properties for high-grade cold-formed steels at elevated temperatures. Suitable predictive equations were developed for calculating the yield strength and elastic modulus at elevated temperatures. In summary, this research has developed comprehensive experimental thermal and structural performance data for both the conventional and the proposed non-load-bearing and load-bearing stud wall systems under fire conditions. Idealised hot-flange temperature profiles have been developed for non-insulated, cavity-insulated and externally insulated load-bearing wall models, along with suitable equations for predicting their failure times. A graphical method has also been proposed to predict the failure times (fire rating) of non-load-bearing and load-bearing walls under different load ratios. The results from this research are useful to both fire researchers and engineers working in this field. Most importantly, this research has significantly improved the knowledge and understanding of cold-formed LSF walls under fire conditions, and has developed an innovative LSF wall system with an increased fire rating. It has clearly demonstrated the detrimental effects of using cavity insulation, and has paved the way for Australian building industries to develop new wall panels with increased fire rating for commercial applications worldwide.

Relevance:

10.00%

Publisher:

Abstract:

Web service technology is increasingly being used to build various e-Applications, in domains such as e-Business and e-Science. Characteristic benefits of web service technology are its interoperability, decoupling and just-in-time integration. Using web service technology, an e-Application can be implemented by web service composition: composing existing individual web services in accordance with the business process of the application. This means the application is provided to customers in the form of a value-added composite web service. An important and challenging issue in web service composition is how to meet Quality-of-Service (QoS) requirements. These include customer-focused elements such as response time, price, throughput and reliability, as well as how best to provide QoS results for the composites. This in turn best fulfils customers’ expectations and achieves their satisfaction. Fulfilling these QoS requirements, that is, addressing the QoS-aware web service composition problem, is the focus of this project. From a computational point of view, QoS-aware web service composition can be transformed into diverse optimisation problems. These problems are characterised as complex, large-scale, highly constrained and multi-objective. We therefore use genetic algorithms (GAs) to address QoS-based service composition problems. More precisely, this study addresses three important subproblems of QoS-aware web service composition: QoS-based web service selection for a composite web service accommodating constraints on inter-service dependence and conflict; QoS-based resource allocation and scheduling for multiple composite services on hybrid clouds; and performance-driven composite service partitioning for decentralised execution. Based on operations research theory, we model the three problems as a constrained optimisation problem, a resource allocation and scheduling problem, and a graph partitioning problem, respectively. Then, we present novel GAs to address these problems. We also conduct experiments to evaluate the performance of the new GAs. Finally, verification experiments are performed to show the correctness of the GAs. The major outcomes from the first problem are three novel GAs: a penalty-based GA, a min-conflict hill-climbing repairing GA, and a hybrid GA. These GAs adopt different constraint-handling strategies to handle constraints on inter-service dependence and conflict, an important factor that has been largely ignored by existing algorithms and that might lead to the generation of infeasible composite services. Experimental results demonstrate the effectiveness of our GAs in handling the QoS-based web service selection problem with constraints on inter-service dependence and conflict, as well as their better scalability than the existing integer-programming-based method for large-scale web service selection problems. The major outcomes from the second problem are two GAs: a random-key GA and a cooperative coevolutionary GA (CCGA). Experiments demonstrate the good scalability of the two algorithms. In particular, the CCGA scales well as the number of composite services involved in a problem increases, an ability no other algorithm demonstrates. The findings from the third problem resulted in a novel GA for composite service partitioning for decentralised execution. Compared with existing heuristic algorithms, the new GA is more suitable for large-scale composite web service program partitioning problems.
In addition, the GA outperforms existing heuristic algorithms, generating a better deployment topology for a composite web service for decentralised execution. These effective and scalable GAs can be integrated into QoS-based management tools to facilitate the delivery of feasible, reliable and high-quality composite web services.
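To make the penalty-based idea concrete, the sketch below encodes a chromosome as one candidate choice per abstract task, scores it by summed QoS utility, and subtracts a penalty for each violated inter-service dependence or conflict constraint; the utilities, constraints and GA operators are simplified toy choices, not the thesis's algorithms.

```python
# Sketch of a penalty-based GA for QoS-driven service selection: a chromosome
# picks one candidate service per abstract task; fitness sums a QoS utility
# and subtracts a penalty per violated dependence/conflict constraint.
import random

N_TASKS, N_CANDIDATES = 5, 4
utility = [[random.random() for _ in range(N_CANDIDATES)] for _ in range(N_TASKS)]
# ("dep", (t1, c1), (t2, c2)): choosing c1 for t1 requires c2 for t2;
# ("conflict", ...): the two choices must not co-occur.
constraints = [("dep", (0, 1), (2, 3)), ("conflict", (1, 0), (3, 2))]
PENALTY = 1.0

def fitness(chrom):
    score = sum(utility[t][c] for t, c in enumerate(chrom))
    for kind, (t1, c1), (t2, c2) in constraints:
        if kind == "dep" and chrom[t1] == c1 and chrom[t2] != c2:
            score -= PENALTY
        if kind == "conflict" and chrom[t1] == c1 and chrom[t2] == c2:
            score -= PENALTY
    return score

def evolve(pop_size=30, generations=100):
    pop = [[random.randrange(N_CANDIDATES) for _ in range(N_TASKS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]                      # elitist truncation
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_TASKS)                # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                         # mutation
                child[random.randrange(N_TASKS)] = random.randrange(N_CANDIDATES)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("best selection:", best, "fitness:", round(fitness(best), 3))
```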