562 results for brain, computer, interface


Relevance: 20.00%

Abstract:

Computer forensics is the process of gathering and analysing evidence from computer systems to aid in the investigation of a crime. Typically, such investigations are undertaken by human forensic examiners using purpose-built software to discover evidence from a computer disk. This process is a manual one, and the time it takes for a forensic examiner to conduct such an investigation is proportional to the storage capacity of the computer's disk drives. The heterogeneity and complexity of the various data formats stored on modern computer systems compound the problems posed by the sheer volume of data. The decision to undertake a computer forensic examination of a computer system is a decision to commit significant quantities of a human examiner's time. Where there is no prior knowledge of the information contained on a computer system, this commitment of time and energy occurs with little idea of the potential benefit to the investigation. The key contribution of this research is the design and development of an automated process to describe a computer system and its activity for the purposes of a computer forensic investigation. The term proposed for this process is computer profiling. A model of a computer system and its activity has been developed over the course of this research. Using this model, a computer system which is the subject of investigation can be automatically described in terms useful to a forensic investigator. The computer profiling process is resilient to attempts to disguise malicious computer activity. This resilience is achieved by detecting inconsistencies in the information used to infer the apparent activity of the computer. The practicality of the computer profiling process has been demonstrated by a proof-of-concept software implementation. The model and the prototype implementation utilising the model were tested with data from real computer systems. The resilience of the process to attempts to disguise malicious activity has also been demonstrated with practical experiments conducted with the same prototype software implementation.
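The abstract does not reproduce the thesis's actual inconsistency rules, but the core idea of detecting disguised activity by cross-checking independent information sources can be sketched. The check below (claimed OS install date versus file modification times) and all its data are hypothetical illustrations, not the thesis's rule set:

```python
from datetime import datetime

def find_inconsistencies(claimed_install, file_mtimes):
    """Flag file timestamps that predate the claimed OS install date.

    A file modified before the system supposedly existed suggests clock
    tampering or forged metadata. Hypothetical check, for illustration only.
    """
    return [path for path, mtime in file_mtimes.items()
            if mtime < claimed_install]

# Hypothetical example data
install = datetime(2020, 5, 1)
files = {
    "C:/Users/a/report.doc": datetime(2020, 6, 3),
    "C:/Windows/system.log": datetime(2019, 1, 9),  # predates install
}
print(find_inconsistencies(install, files))  # → ['C:/Windows/system.log']
```

A real profiler would compare many such sources (registry, logs, filesystem metadata) pairwise; a single contradiction is enough to flag the inferred activity as unreliable.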

Relevance: 20.00%

Abstract:

Digital forensics investigations aim to find evidence that helps confirm or disprove a hypothesis about an alleged computer-based crime. However, the ease with which computer-literate criminals can falsify computer event logs makes the prosecutor's job highly challenging. Given a log which is suspected to have been falsified or tampered with, a prosecutor is obliged to provide a convincing explanation for how the log may have been created. Here we focus on showing how a suspect computer event log can be transformed into a hypothesised actual sequence of events, consistent with independent, trusted sources of event orderings. We present two algorithms which allow the effort involved in falsifying logs to be quantified, as a function of the number of 'moves' required to transform the suspect log into the hypothesised one, thus allowing a prosecutor to assess the likelihood of a particular falsification scenario. The first algorithm always produces an optimal solution but, for reasons of efficiency, is suitable for short event logs only. To deal with the massive amount of data typically found in computer event logs, we also present a second heuristic algorithm which is considerably more efficient but may not always generate an optimal outcome.

Relevance: 20.00%

Abstract:

The aggregate structure which occurs in aqueous smectitic suspensions is responsible for poor water clarification, difficulties in sludge dewatering and the unusual rheological behaviour of smectite-rich soils. These macroscopic properties are dictated by the 3-D structural arrangement of the finest smectite fraction within flocculated aggregates. Here, we report results from a relatively new technique, Transmission X-ray Microscopy (TXM), which makes it possible to investigate the internal structure and 3-D tomographic reconstruction of smectite clay aggregates modified by the Al13 Keggin macro-molecule [Al13O4(OH)24(H2O)12]7+. Three different treatment methods were shown to result in three different micro-structural environments in the resulting flocculated aggregates.

Relevance: 20.00%

Abstract:

This paper discusses the use of models in automatic computer forensic analysis, and proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgements as to the probable usage and evidentiary value of a computer system. The computer profiling object model can be implemented so as to support automated analysis to provide an investigator with the information needed to decide whether manual analysis is required.
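The abstract describes the computer profiling object model only at the level of "objects with various attributes and inter-relationships". A minimal sketch of that general idea, with entirely hypothetical class and relation names (the paper's actual object taxonomy is not given here), might look like:

```python
from dataclasses import dataclass, field

@dataclass
class ModelObject:
    """Hypothetical rendering: one entity on the profiled computer."""
    kind: str                                      # e.g. "user", "application", "file"
    attributes: dict = field(default_factory=dict)
    related: list = field(default_factory=list)    # (relation, other-object) links

    def link(self, other, relation):
        self.related.append((relation, other))

# Hypothetical example: a user object linked to an application it ran.
user = ModelObject("user", {"name": "alice"})
app = ModelObject("application", {"name": "browser", "last_run": "2011-03-02"})
user.link(app, "executed")

# An automated reasoner can traverse such links to infer probable usage.
for relation, obj in user.related:
    print(f"{user.attributes['name']} {relation} {obj.attributes['name']}")
```

The point of such an information model is that both a human investigator and an automated reasoning engine can walk the same attribute/relationship graph when judging probable usage and evidentiary value.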

Relevance: 20.00%

Abstract:

The Mobile Learning Kit (MiLK) is a new digital learning application that allows students and teachers to compose, publish, discuss and evaluate their own mobile learning games and events. The research field was interaction design in the context of mobile learning. The research methodology was primarily design-based, supported by collaboration between the participating disciplines of game design, education and information technology. As such, the resulting MiLK application is a synthesis of current pedagogical models and experimental interaction design techniques and technologies. MiLK is a dynamic learning resource that incorporates both formal and informal teaching and learning practices while exploiting mobile phones and contemporary digital social tools in innovative ways. MiLK explicitly addresses other predominant themes in educational scholarship that relate to current education innovation and reform, such as personalised learning, life-long learning and new learning spaces. The success of this project is indicated by rigorous trials and actual uptake of MiLK by international participants in Australia, the UK, the US and South Africa. MiLK was recognised for excellence in the use of emerging technologies for improved learning and teaching as a finalist (top 3) in the Handheld Learning and Innovation Awards in the UK in 2008. MiLK received funding from the Australasian CRC for Interaction Design (ACID) in 2008 to prepare the application for development, and has been awarded over $230,000 from ACID since 2006. The resulting application and research materials are now being commercialised by a new company, 'ACID Services'.

Relevance: 20.00%

Abstract:

This paper presents a retrospective view of a game design practice that recently switched from the development of complex learning games to the development of simple authoring tools with which students design their own learning games for each other. We introduce how our '10% Rule', the premise that only 10% of what is learnt during a game design process is ultimately appreciated by the player, became a major contributor to the evolving practice. We use this rule primarily as an analytical and illustrative tool to discuss the learning involved in designing and playing learning games, rather than as a scientifically and empirically proven rule. The 10% rule was prompted by our experience as designers and allows us to explore the often overlooked and valuable learning processes involved in designing learning games, and mobile games in particular. This discussion highlights that in designing mobile learning games, students are not only reflecting on their own learning processes by setting up structures for others to enquire and investigate, they are also engaging in high levels of independent inquiry and critical analysis in authentic learning settings. We conclude the paper with a discussion of the importance of these types of learning processes and skills of enquiry in 21st-century learning.

Relevance: 20.00%

Abstract:

Farm It Right is an innovative creative work that simulates sustainable farming techniques using ecological models prepared by academics at Bradford University (School of Life Sciences). This interactive work simulates the farming conditions and options of our ancestors and demonstrates the direct impact their actions had on their environment and on the 'future of their cultures' (Schmidt 2008). Specifically, the simulation allows users to explore and experiment with the complex relationships between environmental factors and human decision making within the harsh conditions of an early (9th-century) Nordic farm. The simulation interface displays both statistical and graphical feedback in response to the user's selections regarding animal reproduction rates, shelter provisions, food supplies and so on, as well as demonstrating the resulting impacts on soil erosion, water supply, animal population sizes and the like.

'Farm It Right' is now used at Bradford University (School of Life Sciences) as a dynamic e-learning resource for integrating environmental archaeology with sustainable development education, improving engagement with complex data and the appreciation of human impacts on the environment and the future of their cultures. 'Farm It Right' is also demonstrated as an exemplar case study for interaction design students at Queensland University of Technology.

Relevance: 20.00%

Abstract:

What does it mean when we design for accessibility, inclusivity and "dissolving boundaries" -- particularly those boundaries between the design philosophy, the software/interface actuality and the stated goals? This paper is about the principles underlying a research project called 'The Little Grey Cat engine' or greyCat. GreyCat has grown out of our experience in using commercial game engines as production environments for the transmission of culture and experience through the telling of individual stories. The key to this endeavour is the potential of the greyCat software to visualize worlds and the manner in which non-formal stories are intertwined with place. The apparently simple dictum of "show, don't tell" and the use of 3D game engines as a medium disguise an interesting nexus of problematic issues and questions, particularly in the ramifications for cultural dimensions and participatory interaction design. The engine is currently in alpha and the following paper is its background story. In this paper we discuss the problematic, thrown into sharp relief by a particular project, and we continue to unpack concepts and early designs behind the greyCat itself.

Relevance: 20.00%

Abstract:

The brain-derived neurotrophic factor (BDNF) has been suggested to play a pivotal role in the aetiology of affective disorders. In order to further clarify the impact of BDNF gene variation on major depression as well as antidepressant treatment response, association of three BDNF polymorphisms [rs7103411, Val66Met (rs6265) and rs7124442] with major depression and antidepressant treatment response was investigated in an overall sample of 268 German patients with major depression and 424 healthy controls. False discovery rate (FDR) was applied to control for multiple testing. Additionally, ten markers in BDNF were tested for association with citalopram outcome in the STAR*D sample. While BDNF was not associated with major depression as a categorical diagnosis, the BDNF rs7124442 TT genotype was significantly related to worse treatment outcome over 6 wk in major depression (p=0.01) particularly in anxious depression (p=0.003) in the German sample. However, BDNF rs7103411 and rs6265 similarly predicted worse treatment response over 6 wk in clinical subtypes of depression such as melancholic depression only (rs7103411: TT
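The abstract states that the false discovery rate (FDR) was applied to control for multiple testing but does not spell out the procedure. The standard Benjamini–Hochberg step-up procedure is sketched below with invented p-values, purely to illustrate how FDR control works; these are not the study's data:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Return indices of hypotheses rejected under Benjamini-Hochberg FDR control.

    Sort p-values, find the largest rank k with p_(k) <= k * alpha / m,
    and reject the k hypotheses with the smallest p-values.
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0  # largest rank whose p-value clears its step-up threshold
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])

# Hypothetical p-values for several marker association tests
pvals = [0.003, 0.01, 0.04, 0.30, 0.70]
print(benjamini_hochberg(pvals))  # → [0, 1]
```

With five tests at alpha = 0.05, the step-up thresholds are 0.01, 0.02, 0.03, 0.04 and 0.05; only the two smallest p-values clear their thresholds, so only those hypotheses are rejected.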

Relevance: 20.00%

Abstract:

This thesis maps the author's journey from a music composition practice to a composition and performance practice. The work involves the development of a software library for the purpose of encapsulating compositional ideas in software, and realising these ideas in performance through a live coding computer music practice. The thesis examines what artistic practice emerges through live coding and software development, and whether this permits a blurring between the activities of music composition and performance. The role that software design plays in affecting musical outcomes is considered, to gain insight into how software development contributes to artistic development. The relationship between music composition and performance is also examined, to identify the means by which engaging in live coding and software development can bring these activities together. The thesis, situated within the discourse of practice-led research, documents a journey which uses the experience of software development and performance to guide the direction of the research. The journey serves as an experiment for the author in engaging with a hitherto unfamiliar musical practice, and as a roadmap for others seeking to modify or broaden their artistic practice.

Relevance: 20.00%

Abstract:

Virtual 3D models of long bones are increasingly being used for implant design and research applications. The current gold standard for the acquisition of such data is Computed Tomography (CT) scanning. Due to radiation exposure, CT is generally limited to the imaging of clinical cases and cadaver specimens. Magnetic Resonance Imaging (MRI) does not involve ionising radiation and can therefore be used to image selected healthy human volunteers for research purposes. The feasibility of MRI as an alternative to CT for the acquisition of morphological bone data of the lower extremity has been demonstrated in recent studies [1, 2]. Current limitations of MRI include long scanning times and difficulties with image segmentation in certain anatomical regions due to poor contrast between bone and surrounding muscle tissue. Higher field strength scanners promise faster imaging times or better image quality. In this study, image quality at 1.5T is quantitatively compared with images acquired at 3T.

The femora of five human volunteers were scanned using 1.5T and 3T MRI scanners from the same manufacturer (Siemens) with similar imaging protocols. A 3D FLASH sequence was used with TE = 4.66 ms, flip angle = 15° and voxel size = 0.5 × 0.5 × 1 mm. PA matrix and body matrix coils were used to cover the lower limb and pelvis, respectively. The signal-to-noise ratio (SNR) [3] and contrast-to-noise ratio (CNR) [3] of axial images from the proximal, shaft and distal regions were used to assess the quality of images from the 1.5T and 3T scanners. The SNR was calculated for the muscle and bone marrow in the axial images. The CNR was calculated for the muscle-to-cortex and cortex-to-bone-marrow interfaces, respectively.

Preliminary results (one volunteer) show that the SNR of muscle for the shaft and distal regions was higher in 3T images (11.65 and 17.60) than in 1.5T images (8.12 and 8.11). For the proximal region, the SNR of muscle was higher in 1.5T images (7.52) than in 3T images (6.78). The SNR of bone marrow was slightly higher in 1.5T images for both the proximal and shaft regions, while it was lower in the distal region compared with 3T images. The CNR between muscle and bone for all three regions was higher in 3T images (4.14, 6.55 and 12.99) than in 1.5T images (2.49, 3.25 and 9.89). The CNR between bone marrow and bone was slightly higher in 1.5T images (4.87, 12.89 and 10.07) compared with 3T images (3.74, 10.83 and 10.15). These results show that the 3T images generated higher contrast between bone and muscle tissue than the 1.5T images. It is expected that this improvement in image contrast will significantly reduce the time required for the mainly manual segmentation of the MR images. Future work will focus on optimising the 3T imaging protocol to reduce chemical shift and susceptibility artefacts.
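The abstract cites reference [3] for its SNR and CNR definitions without stating them. A common formulation, used here only as a plausible sketch with invented pixel intensities (the study's ROI data are not reproduced), takes SNR as the mean tissue intensity over the spread of background noise, and CNR as the intensity difference between two tissues over the same noise spread:

```python
from statistics import mean, stdev

def snr(signal_roi, noise_roi):
    """Signal-to-noise ratio: mean tissue intensity over background noise spread."""
    return mean(signal_roi) / stdev(noise_roi)

def cnr(roi_a, roi_b, noise_roi):
    """Contrast-to-noise ratio: intensity difference between two tissues
    relative to background noise spread."""
    return abs(mean(roi_a) - mean(roi_b)) / stdev(noise_roi)

# Hypothetical pixel intensities sampled from regions of interest
muscle = [110, 112, 108, 111]
cortex = [30, 28, 32, 29]       # cortical bone appears dark on FLASH images
background = [4, 6, 5, 7, 3]    # air outside the limb

print(round(snr(muscle, background), 2))
print(round(cnr(muscle, cortex, background), 2))
```

A higher muscle-to-cortex CNR, as reported for the 3T images, directly eases segmentation: the intensity gap to be thresholded or traced is larger relative to the noise.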

Relevance: 20.00%

Abstract:

Background: Despite being the leading cause of death and disability in the paediatric population, traumatic brain injury (TBI) in this group is largely understudied. Clinical practice within the paediatric intensive care unit (PICU) has been based upon adult guidelines; however, children differ significantly in terms of the mechanism, pathophysiology and consequences of injury.

Aim: To review TBI management in the PICU and gain insight into potential management strategies.

Method: To conduct this review, a literature search was performed using MEDLINE, PubMed and The Cochrane Library with the following key words: traumatic brain injury; paediatric; hypothermia. No date restrictions were applied, to ensure that past studies whose principles remain current were not excluded.

Results: Three areas were identified from the literature search and are discussed against currently acknowledged treatment strategies: prophylactic hypothermia, brain tissue oxygen tension monitoring and decompressive craniectomy.

Conclusion: Previous literature has failed to fully address paediatric-specific management protocols, and we therefore have little evidence-based guidance. This review has shown an emerging and ongoing trend towards paediatric-specific TBI research, in particular in the area of moderate prophylactic hypothermia (MPH).