42 results for Analysis task

in Deakin Research Online - Australia


Relevance: 40.00%

Abstract:

Mediums claim to be able to report accurate and specific information about the deceased loved ones (termed discarnates) of living people (termed sitters), even without any prior knowledge about the sitters or the discarnates and in the complete absence of any sensory feedback. Despite recent proof-focused experimental research investigating this phenomenon (e.g., Beischel & Schwartz, 2007), no published studies have attempted to quantify the phenomenological effects of discarnate readings. The aim of the present study was thus to investigate experimentally the phenomenological differences that arose psychologically in accordance with the demands of a discarnate reading task versus a control task. Seven mediums completed counterbalanced sequences of a discarnate reading condition and a control condition. The discarnate reading condition consisted of a phone reading, including questions about a discarnate, in which only a blinded medium and a blinded experimenter were on the phone. The control condition consisted of a phone conversation between the medium and the same experimenter in which the medium was asked similar questions regarding a living person s/he (i.e., the medium) knew. Mediums' phenomenology during each condition was retrospectively assessed using the Phenomenology of Consciousness Inventory (PCI). Phenomenology associated with the discarnate reading condition appeared to be significantly different from phenomenology associated with the control condition. Future research might use the PCI to address whether the phenomenology reported by mediums during discarnate readings is quantitatively different from their experiences during psychic telepathy readings for the living.

Relevance: 40.00%

Abstract:

Meta-analysis and meta-regression were used to evaluate whether evidence to date demonstrates deficits in procedural memory in individuals with specific language impairment (SLI), and to examine reasons for inconsistencies of findings across studies. The Procedural Deficit Hypothesis (PDH) proposes that SLI is largely explained by abnormal functioning of the frontal-basal ganglia circuits that support procedural memory. It has also been suggested that declarative memory can compensate for at least some of the problems observed in individuals with SLI. A number of studies have used Serial Reaction Time (SRT) tasks to investigate procedural learning in SLI. In this report, results from eight studies that collectively examined 186 participants with SLI and 203 typically developing peers were submitted to a meta-analysis. The mean effect size was .328 (95% CI: .071, .584) and was significant. This suggests that SLI is associated with impairments of procedural learning as measured by the SRT task. Differences among individual study effect sizes, examined with meta-regression, indicated that smaller effect sizes were found in studies with older participants and in studies with a larger number of trials on the SRT task. The contributions of age and SRT task characteristics to learning are discussed with respect to impaired and compensatory neural mechanisms in SLI.
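
As a rough illustration of the pooling step, the sketch below computes an inverse-variance-weighted mean effect size and its 95% confidence interval in Python; the per-study values are hypothetical placeholders, not the eight studies' actual data.

    import numpy as np

    # Hypothetical per-study effect sizes (d) and standard errors; the eight
    # SRT studies' actual values are not reproduced here.
    d = np.array([0.45, 0.20, 0.60, 0.15, 0.35, 0.50, 0.10, 0.30])
    se = np.array([0.20, 0.15, 0.25, 0.18, 0.22, 0.30, 0.16, 0.21])

    # Fixed-effect (inverse-variance) pooling.
    w = 1.0 / se**2
    d_bar = np.sum(w * d) / np.sum(w)
    se_bar = np.sqrt(1.0 / np.sum(w))

    # 95% confidence interval for the pooled effect.
    lo, hi = d_bar - 1.96 * se_bar, d_bar + 1.96 * se_bar
    print(f"pooled d = {d_bar:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")

A random-effects model, which the moderator analysis implies, would additionally estimate the between-study variance (e.g., via the DerSimonian-Laird method) and adjust the weights accordingly.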

Relevance: 30.00%

Abstract:

Predicting which consumers will be amongst the first to adopt an innovative product is a difficult task, but doing so is valuable in allowing effective and efficient use of marketing resources. This paper examines the accuracy of predictions made about likely first adopters based on the most widely accepted theory and compares them to predictions made by examining the relevant past behavior of consumers. A survey of over 1000 consumers examined adoption of an innovative technology: compact fluorescent light globes. The results show that variables derived from a utility and awareness perspective were more accurate and managerially useful predictors than the demographic variables derived from the widely accepted theory based on the work of Rogers. It is suggested that these alternative variables could be used more readily by marketing managers in many circumstances.
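
The comparison of predictor sets can be illustrated with a simple classification sketch; the variables and data below are hypothetical stand-ins for the survey measures, not the study's data.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 1000

    # Hypothetical predictor sets: demographics vs. utility/awareness measures.
    demographics = rng.normal(size=(n, 3))   # e.g., age, income, education
    utility = rng.normal(size=(n, 2))        # e.g., perceived savings, awareness

    # Synthetic adoption outcome driven mainly by utility/awareness (illustrative only).
    adopted = (1.5 * utility[:, 0] + utility[:, 1]
               + 0.2 * demographics[:, 0] + rng.logistic(size=n)) > 0

    # Compare cross-validated predictive accuracy of the two variable sets.
    for name, X in [("demographics", demographics), ("utility/awareness", utility)]:
        acc = cross_val_score(LogisticRegression(), X, adopted, cv=5).mean()
        print(f"{name}: mean CV accuracy = {acc:.2f}")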

Relevance: 30.00%

Abstract:

Determining the causal relations among attributes in a domain is a key task in data mining and knowledge discovery. In this paper, we applied a causal discovery algorithm to business traveler expenditure survey data [1]. A general class of causal models is adopted to discover the causal relationships among continuous and discrete variables. All factors that have a direct effect on travelers' expenditure patterns could be detected. Our discovery results reinforced some conclusions of the rough set analysis and yielded new conclusions that may significantly improve the understanding of business travelers' expenditure behavior.
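
The paper's algorithm is not spelled out in this abstract, but a minimal conditional-independence-based skeleton search in the PC style, run on hypothetical continuous variables, conveys the idea of detecting which factors directly affect expenditure.

    import numpy as np
    from itertools import combinations
    from scipy import stats

    def partial_corr(x, y, z):
        """First-order partial correlation of x and y given z (residual method)."""
        rx = x - np.polyval(np.polyfit(z, x, 1), z)
        ry = y - np.polyval(np.polyfit(z, y, 1), z)
        return stats.pearsonr(rx, ry)[0]

    # Hypothetical continuous survey variables; the actual travel data are not used here.
    rng = np.random.default_rng(1)
    n = 500
    income = rng.normal(size=n)
    trip_length = 0.8 * income + rng.normal(scale=0.5, size=n)
    expense = 1.2 * trip_length + rng.normal(scale=0.5, size=n)
    data = {"income": income, "trip_length": trip_length, "expense": expense}

    # PC-style skeleton search: drop an edge when a pair is (conditionally) independent.
    edges = set(combinations(data, 2))
    for a, b in list(edges):
        if abs(stats.pearsonr(data[a], data[b])[0]) < 0.1:   # marginal independence
            edges.discard((a, b))
            continue
        for c in set(data) - {a, b}:                         # first-order independence
            if abs(partial_corr(data[a], data[b], data[c])) < 0.1:
                edges.discard((a, b))
                break
    print("skeleton edges:", edges)

With these synthetic dependencies, the income-expense edge is removed because that pair is independent given trip_length, leaving only the two direct links.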

Relevance: 30.00%

Abstract:

Purpose: E. Bialystok and E. B. Ryan (1985) have outlined two operations, analysis and control, which are required for grammaticality judgments. In this model, analysis is involved in determining the grammaticality of a sentence, and control is required so that irrelevant information is ignored. This study examined these processes in specific language impairment (SLI).

Method: Sixteen children with SLI and 20 typically developing (TD) children between 8;6 (years;months) and 10;6 were presented with a grammaticality judgment task. Analysis was measured by recording children's decision times in determining grammaticality. Control was assessed by examining the accuracy of judgments made for semantically odd sentences.

Results: Relative to the TD group, children with SLI took longer to judge sentences, the measure associated with the process of analysis. Children with SLI were also found to have more difficulty, in terms of accuracy, with items requiring control (e.g., semantically odd sentences) than did the TD group.

Conclusion: It is argued that the longer time required for children with SLI to respond to semantically normal sentences reflects a degree of difficulty with completing analysis. The SLI group's lower level of accuracy on semantically odd sentences reflects a problem with control and is consistent with previously reported problems with cognitive inhibition in SLI.
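
A minimal sketch of the kind of group comparison the study reports (decision times for children with SLI versus TD children), using hypothetical data rather than the study's measurements:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Hypothetical decision times (ms) for judging semantically normal sentences;
    # the study's raw data are not reproduced here.
    sli_rt = rng.normal(loc=2400, scale=400, size=16)   # 16 children with SLI
    td_rt = rng.normal(loc=2000, scale=400, size=20)    # 20 TD children

    # Welch's independent-samples t-test (no equal-variance assumption).
    t, p = stats.ttest_ind(sli_rt, td_rt, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.3f}")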

Relevance: 30.00%

Abstract:

Protocol analysis is an empirical method applied by researchers in cognitive psychology and behavioural analysis. Protocol analysis can be used to collect, document and analyse the thought processes of an individual problem solver. In general, research subjects are asked to think aloud when performing a given task. Their verbal reports are transcribed and represent a sequence of their thoughts and cognitive activities. These verbal reports are analysed to identify relevant segments of cognitive behaviours by the research subjects. The analysis results may be cross-examined or validated through retrospective interviews with the research subjects. This paper offers a critical analysis of this research method, its approaches to data collection and analysis, and its strengths and limitations, and discusses its use in information systems research. The aim is to explore the use of protocol analysis in studying the creative requirements engineering process.

Relevance: 30.00%

Abstract:

This paper addresses the problem of performance modeling for large-scale heterogeneous distributed systems, with emphasis on multi-cluster computing systems. Since the overall performance of distributed systems often depends on the effectiveness of the communication network, the study of the interconnection networks for these systems is very important. Performance modeling is required to avoid poorly chosen components and architectures, and to avoid discovering a serious shortfall during system testing just prior to deployment. However, the multiplicity of components and the associated complexity make performance analysis of distributed computing systems a challenging task. To this end, we present an analytical performance model for the interconnection networks of heterogeneous multi-cluster systems. The analysis is based on a parametric family of fat-trees, the m-port n-tree, and a deterministic routing algorithm, which is proposed in this paper. The model is validated through comprehensive simulation, which demonstrates that the proposed model exhibits a good degree of accuracy for various system organizations and under different working conditions.
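
For a sense of topology scale, the sketch below counts processing nodes and switches in an m-port n-tree, assuming the standard construction (2(m/2)^n processing nodes and (2n-1)(m/2)^(n-1) switches); the paper's latency equations themselves are not reproduced.

    def m_port_n_tree(m: int, n: int):
        """Node and switch counts for an m-port n-tree fat-tree.

        Assumes the standard definition for even m: 2*(m/2)**n processing
        nodes and (2*n - 1)*(m/2)**(n - 1) switches.
        """
        half = m // 2
        return 2 * half**n, (2 * n - 1) * half**(n - 1)

    # Example: 8-port switches, 3 levels.
    nodes, switches = m_port_n_tree(8, 3)
    print(f"m=8, n=3 -> {nodes} processing nodes, {switches} switches")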

Relevance: 30.00%

Abstract:

This article provides an overview of the development and validation of a Counsellor Task Analysis (Problem Gambling) [CTA (PG)] instrument, undertaken to document the activities of counsellors in problem gambling services. The CTA (PG) aims to provide a broad overview of the complexity of the counsellor's role; specify the range of tasks they perform; and document the relationship between the frequency of task performance and the counsellor's beliefs about the importance of the tasks performed.

The CTA (PG) instrument addresses nine dimensions of practice activity through nine subscales, all of which demonstrate internal consistency. Its measurement-error characteristics appear strong enough to recommend its use with counsellors working in the gambling field. The CTA (PG) gives counsellors the opportunity to document their practice and theories in use when dealing with a problem gambler, a member of the problem gambler's family and the community at large. The psychometric findings reported in this article should be viewed as the preliminary results of an ongoing research effort, and further psychometric testing is anticipated.
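
Internal consistency of subscales such as these is conventionally assessed with Cronbach's alpha; the sketch below computes it for hypothetical Likert-scale responses (the article's actual items and coefficients are not reproduced).

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Hypothetical responses to one five-item subscale (1-5 Likert ratings).
    rng = np.random.default_rng(3)
    latent = rng.normal(size=(100, 1))   # shared trait per respondent
    scores = np.clip(np.round(3 + latent + rng.normal(scale=0.7, size=(100, 5))), 1, 5)
    print(f"alpha = {cronbach_alpha(scores):.2f}")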

Relevance: 30.00%

Abstract:

When building a cost-effective high-performance parallel processing system, a performance model is a useful tool for exploring the design space and examining various parameters. However, performance analysis in such systems has proven to be a challenging task that requires innovative performance analysis tools and methods to keep up with the rapid evolution and ever-increasing complexity of such systems. To this end, we propose an analytical model for heterogeneous multi-cluster systems. The model takes into account stochastic quantities as well as network heterogeneity in bandwidth and latency in each cluster. Blocking and non-blocking network architecture models are also proposed and used in the performance analysis of the system. Message latency is used as the primary performance metric. The model is validated by constructing a set of simulators to simulate different types of clusters, and by comparing the modeled results with the simulated ones.
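
As a hedged illustration of the quantity being modeled, the sketch below estimates message latency from a generic startup + per-hop + transmission decomposition with heterogeneous intra- and inter-cluster parameters; it is not the paper's actual model.

    def message_latency(msg_bytes: float, bandwidth_bps: float, startup_s: float,
                        hops: int, per_hop_s: float) -> float:
        """Generic latency estimate: startup + per-hop switching + transmission."""
        transmission = msg_bytes * 8 / bandwidth_bps
        return startup_s + hops * per_hop_s + transmission

    # Example: a 64 KiB message on a fast intra-cluster path vs. a slower
    # inter-cluster path (hypothetical parameters).
    intra = message_latency(64 * 1024, 10e9, 5e-6, hops=3, per_hop_s=0.5e-6)
    inter = message_latency(64 * 1024, 1e9, 20e-6, hops=5, per_hop_s=1e-6)
    print(f"intra-cluster: {intra * 1e6:.1f} us, inter-cluster: {inter * 1e6:.1f} us")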

Relevance: 30.00%

Abstract:

Aims. To explore and explain nurses' use of readily available clinical information when deciding whether a patient is at risk of a critical event.

Background. Half of inpatients who suffer a cardiac arrest have documented, but unacted-upon, clinical signs of deterioration in the 24 hours prior to the event. Nurses appear to be both misinterpreting and mismanaging the nursing-knowledge 'basics' such as heart rate, respiratory rate and oxygenation. Whilst many medical interventions originate from nurses, up to 26% of nurses' responses to abnormal signs result in delays of between one and three hours.

Methods. A double system judgement analysis using Brunswik's lens model of cognition was undertaken with 245 Dutch, UK, Canadian and Australian acute care nurses. Nurses were asked to judge the likelihood of a critical event, 'at-risk' status, and whether they would intervene in response to 50 computer-presented clinical scenarios in which data on heart rate, systolic blood pressure, urine output, oxygen saturation, conscious level and oxygenation support were varied. Nurses were also presented with a protocol recommendation and placed under time pressure for some of the scenarios. The ecological criterion was the predicted level of risk from the Modified Early Warning Score assessments of 232 UK acute care inpatients.

Results. Despite receiving identical information, nurses varied considerably in their risk assessments. The differences can be partly explained by variability in the weightings given to information. Time pressure and protocol recommendations were given more weight than clinical information for key dichotomous choices such as classifying a patient as 'at risk' and deciding to intervene. Nurses' weighting of cues did not mirror the same information's contribution to risk in real patients. Nurses synthesized information in non-linear ways that contributed little to decisional accuracy. The low-to-moderate achievement (Ra) statistics suggest that nurses' assessments of risk were largely inaccurate, although these assessments were applied consistently across 'patients' (scenarios). Critical care experience was statistically associated with estimates of risk, but not with the decision to intervene.
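
In lens-model terms, achievement (Ra) is the correlation between a judge's ratings and the ecological criterion across scenarios, with cue weights estimated by regression on each side of the lens; the sketch below illustrates the computation on hypothetical data.

    import numpy as np

    rng = np.random.default_rng(4)
    n_scenarios = 50

    # Hypothetical standardized cue values for 50 scenarios
    # (e.g., heart rate, systolic BP, oxygen saturation).
    cues = rng.normal(size=(n_scenarios, 3))
    criterion = cues @ np.array([0.6, 0.3, 0.4]) + rng.normal(scale=0.5, size=n_scenarios)
    judgment = cues @ np.array([0.2, 0.7, 0.1]) + rng.normal(scale=0.8, size=n_scenarios)

    # Achievement (Ra): correlation between the judge's ratings and the criterion.
    ra = np.corrcoef(judgment, criterion)[0, 1]

    # Cue weights on each side of the lens, via least squares.
    env_weights, *_ = np.linalg.lstsq(cues, criterion, rcond=None)
    judge_weights, *_ = np.linalg.lstsq(cues, judgment, rcond=None)
    print(f"Ra = {ra:.2f}")
    print("environment weights:", np.round(env_weights, 2))
    print("judge weights:      ", np.round(judge_weights, 2))

A mismatch between the two weight vectors, as in these mock weights, is the pattern the study reports: nurses' cue weighting did not mirror the cues' contribution to risk in real patients.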

Conclusion. Nurses overestimated the risk and the need to intervene in simulated paper patients at risk of a critical event. This average response masked considerable variation in risk predictions, in the perceived need for action, and in the weighting afforded to the available information. Nurses did not make use of the linear reasoning required for accurate risk predictions in this task. They also failed to employ any unique knowledge that could be shown to make them more accurate. The influence of time pressure and protocol recommendations depended on the kind of judgement faced, suggesting that knowing more about the types of decisions nurses face may influence information use.

Relevance to clinical practice. Practice developers and educators need to pay attention to the quality of nurses' clinical experience, as well as its quantity, when developing judgement expertise in nurses. Intuitive, unaided decision making in the assessment of risk may not be as accurate as supported decision making. Practice developers and educators should consider teaching nurses normative rules for revising probabilities (even subjective ones), such as Bayes' rule, for diagnostic or assessment judgements; linear ways of thinking, in which decision support may help, may also be useful for many of the choices that nurses face. Nursing needs to separate the rhetoric of 'holism' and 'expertise' from the science of predictive validity, accuracy and competence in judgement and decision making.

Relevance: 30.00%

Abstract:

This thesis reviews progress toward an understanding of the processes involved in the solution of spatial problems. Previous work employing factor analysis and information processing analysis is reviewed, and the emphasis on variations in speed and accuracy as the major contributors to individual differences is noted. It is argued that the strategy used by individuals is a preferable explanatory concept for identifying the cognitive substratum necessary for problem solving. Using the protocols obtained from subjects solving the Minnesota Paper Form Board (Revised), a test commonly regarded as measuring skill in spatial visualization, a number of different strategies are isolated. Assumptions as to the task variants which undergird these strategies are made and tested experimentally. The results suggest that task variants such as the size of the stimulus and the shape of the pieces interact with subject variables to produce the operating strategy. Skill in problem solving is revealed in the ability to structure the array, to hold a structured image and to reduce the number of answers requiring intensive processing. The interaction between task and subject variables results in appropriate or inappropriate strategies which in turn affect speed and accuracy. Results suggest that strategy formation and usage are the keys to explaining individual differences, and a heuristic model is presented to explain the performance of individual subjects on the problems involved in the Minnesota Paper Form Board. The model can be used to predict performance on other tests and as an aid to teaching subjects experiencing difficulties. The model presented incorporates strategy variation and is consequently more complex than previously suggested models. It is argued that such complexity is necessary to explain the nature of a subject's performance and to perform diagnostic evaluation. Certain structural features of the Minnesota Paper Form Board are questioned and suggestions for improvement included. The essential explanatory function of the strategy in use makes the prevalent group administration approach suspect in the prediction of future performance in spatial or vocational activity.

Relevance: 30.00%

Abstract:

This thesis is about using appropriate tools in functional analysis and classical analysis to tackle the problem of existence and uniqueness of solutions to nonlinear partial differential equations. There being no unified strategy to deal with these equations, one approaches each equation with an appropriate method, depending on the characteristics of the equation. The correct setting of the problem in appropriate function spaces is the first important step on the road to the solution. Here, we choose the setting of Sobolev spaces. The second essential part is to choose the correct tool for each equation. In the first part of this thesis (Chapters 3 and 4) we consider a variety of nonlinear hyperbolic partial differential equations with mixed boundary and initial conditions. The methods of compactness and monotonicity are used to prove existence and uniqueness of the solution (Chapter 3). Finding a priori estimates is the main task in this analysis. For some types of nonlinearity, these estimates cannot be easily obtained, and so these two methods cannot be applied directly. In this case, we first linearise the equation, using linear recurrence (Chapter 4). In the second part of the thesis (Chapter 5), by using an appropriate tool in functional analysis (the Sobolev Imbedding Theorem), we are able to improve previous results on a posteriori error estimates for the finite element method of lines applied to nonlinear parabolic equations. These estimates are crucial in the design of adaptive algorithms for the method, and previous analysis relies on what we show to be unnecessary assumptions which limit the application of the algorithms. Our analysis does not require these assumptions. In the last part of the thesis (Chapter 6), staying with the theme of choosing the most suitable tools, we show that using classical analysis in a proper way is in some cases sufficient to obtain considerable results. We study in this chapter the nonexistence of positive solutions to Laplace's equation with a nonlinear Neumann boundary condition. This problem arises when one wants to study the blow-up at finite time of the solution of the corresponding parabolic problem, which models the heating of a substance by radiation. We generalise known results which were obtained using more abstract methods.
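
A representative form of the Chapter 6 problem (the exact nonlinearity and assumptions may differ in the thesis) is the question of positive solutions to

    \[
    \begin{aligned}
      \Delta u &= 0 && \text{in } \Omega, \\
      \frac{\partial u}{\partial \nu} &= f(u) && \text{on } \partial\Omega,
    \end{aligned}
    \]

where \nu is the outward unit normal and f is a nonlinearity such as f(u) = u^p; finite-time blow-up of the corresponding parabolic problem models the heating of a substance by radiation.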

Relevance: 30.00%

Abstract:

Modeling network traffic has been a critical task in the development of the Internet, where attack and defense behaviors are prevalent. Traditional network models, such as Poisson-related models, do not consider the competition behaviors between attack and defense parties. In this paper, we present a microscopic competition model to analyze the dynamics among the nodes, benign or malicious, connected to a router, which compete for bandwidth. The dynamics analysis demonstrates that the model describes the competition behavior between normal users and attackers well. Based on this model, an anomaly attack detection method is presented. The method is based on adaptive resonance theory, which is used to learn the model from normal traffic data. The evaluation shows that it can effectively detect network attacks.
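
The paper's microscopic model is not reproduced here, but a generic two-population competition sketch (logistic growth under a shared bandwidth cap, with hypothetical rate parameters) shows the flavor of such dynamics:

    # Generic two-population competition for a router's bandwidth C
    # (a Lotka-Volterra-style sketch, not the paper's exact model).
    def step(b, a, C, rb=0.5, ra=0.9, dt=0.01):
        """One Euler step: benign (b) and attack (a) traffic share capacity C."""
        load = b + a
        db = rb * b * (1 - load / C)   # benign traffic growth, capped by capacity
        da = ra * a * (1 - load / C)   # attack traffic grows faster
        return b + dt * db, a + dt * da

    b, a, C = 10.0, 1.0, 100.0
    for _ in range(5000):
        b, a = step(b, a, C)
    print(f"steady state: benign = {b:.1f}, attack = {a:.1f} (capacity = {C})")

A detector in the spirit of the paper would then learn the normal-traffic regime of such a model and flag deviations from it.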