953 results for computer forensics tools
Abstract:
It is important to promote a sustainable development approach to ensure that economic, environmental and social developments are maintained in balance. Sustainable development and its implications are not just a global concern; they also affect Australia. In particular, rural Australian communities are facing various economic, environmental and social challenges. Thus, the need for sustainable development in rural regions is becoming increasingly important. To promote sustainable development, proper frameworks, along with the associated tools optimised for the specific regions, need to be developed. This will ensure that decisions made for sustainable development are evidence based, rather than resting on subjective opinion. To address these issues, Queensland University of Technology (QUT), through an Australian Research Council (ARC) linkage grant, has initiated research into the development of a Rural Statistical Sustainability Framework (RSSF) to aid sustainable decision making in rural Queensland. This particular branch of the research developed a decision support tool that will become the integrating component of the RSSF. The tool is developed on a web-based platform to allow easy dissemination and quick maintenance and to minimise compatibility issues. It is built on MapGuide Open Source and follows a three-tier architecture: Client tier, Web tier and Server tier. The developed tool is interactive and behaves similarly to a familiar desktop-based application. It has the capability to handle and display vector-based spatial data and can give further visual outputs using charts and tables. The data used in this tool were obtained from the QUT research team. Overall the tool implements four tasks to help in the decision-making process: Locality Classification, Trend Display, Impact Assessment, and Data Entry and Update.
The developed tool utilises open source and freely available software and is designed for easy extensibility and long-term sustainability.
Abstract:
To address issues of divisive ideologies in the Mathematics Education community and to subsequently advance educational practice, an alternative theoretical framework and operational model is proposed which represents a consilience of constructivist learning theories whilst acknowledging the objective but improvable nature of domain knowledge. Based upon Popper’s three-world model of knowledge, the proposed theory supports the differentiation and explicit modelling of both shared domain knowledge and idiosyncratic personal understanding using a visual nomenclature. The visual nomenclature embodies Piaget’s notion of reflective abstraction and so may support an individual’s experience-based transformation of personal understanding with regard to shared domain knowledge. Using the operational model and visual nomenclature, seminal literature regarding early-number counting and addition was analysed and described. Exemplars of the resultant visual artefacts demonstrate the proposed theory’s viability as a tool with which to characterise the reflective abstraction-based organisation of a domain’s shared knowledge. Utilising such a description of knowledge, future research needs to consider the refinement of the operational model and visual nomenclature to include the analysis, description and scaffolded transformation of personal understanding. A detailed model of knowledge and understanding may then underpin the future development of educational software tools such as computer-mediated teaching and learning environments.
Abstract:
Visual modes of representation have always been very important in science and science education. Interactive computer-based animations and simulations offer new visual resources for chemistry education. Many studies have shown that students enjoy learning with visualisations but few have explored how learning outcomes compare when teaching with or without visualisations. This study employs a quasi-experimental crossover research design and quantitative methods to measure the educational effectiveness - defined as level of conceptual development on the part of students - of using computer-based scientific visualisations versus teaching without visualisations in teaching chemistry. In addition to finding that teaching with visualisations offered outcomes that were not significantly different from teaching without visualisations, the study also explored differences in outcomes for male and female students, students with different learning styles (visual, aural, kinesthetic) and students of differing levels of academic ability.
Abstract:
Video games have shown great potential as tools that both engage and motivate players to achieve tasks and build communities in fantasy worlds. We propose that the application of game elements to real world activities can aid in delivering contextual information in interesting ways and help young people to engage in everyday events. Our research will explore how we can unite utility and fun to enhance information delivery, encourage participation, build communities and engage users with utilitarian events situated in the real world. This research aims to identify key game elements that work effectively to engage young digital natives, and provide guidelines to influence the design of interactions and interfaces for event applications in the future. This research will primarily contribute to areas of user experience and pervasive gaming.
Abstract:
This position paper provides an overview of a proposed study that seeks to design and develop tools, methods and applications of urban informatics to promote an innovation culture and knowledge economy in regional Queensland. The National Broadband Network has the potential to leapfrog regional Queensland into the knowledge economy, but effective applications and content strategies are required. The Edge is the Queensland Government’s Digital Culture Centre to engage young people in the technology/culture nexus. The proposed study will set up Living Labs at The Edge and in a new precinct in rural Queensland (Goondiwindi) as sites to trial strategies and applications that engage people in entrepreneurial thinking, sustainability initiatives, and new creative practices across the urban and rural boundaries.
Abstract:
The management of models over time in many domains requires different constraints to apply to some parts of the model as it evolves. Using EMF and its meta-language Ecore, the development of model management code and tools usually relies on the metamodel having some constraints, such as attribute and reference cardinalities and changeability, set in the least constrained way that any model user will require. Stronger versions of these constraints can then be enforced in code, or by attaching additional constraint expressions, and their evaluation engines, to the generated model code. We propose a mechanism that allows for variations to the constraining meta-attributes of metamodels, to allow enforcement of different constraints at different lifecycle stages of a model. We then discuss the implementation choices within EMF to support the validation of a state-specific metamodel on model graphs when changing states, as well as the enforcement of state-specific constraints when executing model change operations.
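The lifecycle-dependent constraint idea described above can be sketched outside EMF. The following Python toy (all names hypothetical, not the paper's implementation) keeps per-state variants of one feature's cardinality and changeability meta-attributes, re-validates when the model changes state, and enforces changeability on change operations:

```python
# Toy illustration (not EMF/Ecore) of state-specific meta-attribute
# constraints: the same feature is optional and mutable in "draft",
# but mandatory and frozen in "published".
from dataclasses import dataclass
from typing import Optional

@dataclass
class AttrConstraint:
    lower_bound: int = 0      # minimum number of values required
    changeable: bool = True   # may the value still be modified?

# Hypothetical per-lifecycle-state constraint variants for feature "name".
STATE_CONSTRAINTS = {
    "draft":     AttrConstraint(lower_bound=0, changeable=True),
    "published": AttrConstraint(lower_bound=1, changeable=False),
}

@dataclass
class Element:
    state: str = "draft"
    name: Optional[str] = None

def validate(el: Element) -> list:
    """Check the element against its current state's constraints."""
    c = STATE_CONSTRAINTS[el.state]
    values = [] if el.name is None else [el.name]
    errors = []
    if len(values) < c.lower_bound:
        errors.append("name: below lower bound for state " + el.state)
    return errors

def set_name(el: Element, value: str) -> None:
    """Enforce state-specific changeability on a change operation."""
    if not STATE_CONSTRAINTS[el.state].changeable and el.name is not None:
        raise ValueError("name is not changeable in state " + el.state)
    el.name = value

def change_state(el: Element, new_state: str) -> None:
    """Validate against the target state's constraints before switching."""
    old = el.state
    el.state = new_state
    problems = validate(el)
    if problems:
        el.state = old
        raise ValueError("; ".join(problems))
```

In EMF terms, the per-state table would correspond to variant values of the `lowerBound` and `changeable` meta-attributes of an `EStructuralFeature`, selected by the model's lifecycle stage.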
Abstract:
Emerging from the challenge to reduce energy consumption in buildings is the need for energy simulation to be used more effectively to support integrated decision making in early design. As a critical response to a Green Star case study, we present DEEPA, a parametric modeling framework that enables architects and engineers to work at the same semantic level to generate shared models for energy simulation. A cloud-based toolkit provides web and data services for parametric design software that automate the process of simulating and tracking design alternatives, by linking building geometry more directly to analysis inputs. Data, semantics, models and simulation results can be shared on the fly. This allows the complex relationships between architecture, building services and energy consumption to be explored in an integrated manner, and decisions to be made collaboratively.
Abstract:
Everything is Political Ben Eltham, Kieran Lord, Jeff Brand, Truna. Chair: Daniel Golding Videogames don’t exist in isolation. They are part of artistic, cultural, and political spheres – even if some would much rather they weren’t. This panel takes a look at the way videogames are used as political tools and how we as developers and critics can better engage with that, and perhaps wrestle some of the conversation back into our hands.
Abstract:
Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research has produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within the circuit since this depends on the software they run. We have developed a static analysis tool that produces SIFA-compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study which shows how this new capability supports combined hardware and software dataflow analyses of a security-critical communications device.
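The core of such an analysis can be illustrated with a deliberately tiny sketch: extract a dataflow graph from straight-line assignment statements (a drastically simplified stand-in for C) and answer reachability queries over it. This is only an illustration of the analysis style, not the SIFA tool or the paper's implementation:

```python
# Build a dataflow graph from simple "dst = expr;" statements and
# compute which variables data from a given source can reach.
import re
from collections import defaultdict

ASSIGN = re.compile(r"^\s*(\w+)\s*=\s*(.+?);\s*$")

def dataflow_graph(source: str) -> dict:
    """Map each variable to the set of variables its data flows into."""
    flows_into = defaultdict(set)
    for line in source.splitlines():
        m = ASSIGN.match(line)
        if not m:
            continue
        dst, rhs = m.group(1), m.group(2)
        # Every identifier on the right-hand side flows into dst.
        for src in re.findall(r"[A-Za-z_]\w*", rhs):
            flows_into[src].add(dst)
    return flows_into

def reachable(graph: dict, start: str) -> set:
    """All variables transitively reachable from `start` (depth-first)."""
    seen, stack = set(), [start]
    while stack:
        v = stack.pop()
        for nxt in graph.get(v, ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen
```

For instance, given `key = input_port; tmp = key + 1; out_port = tmp;`, a query from `input_port` reports that its data can reach `key`, `tmp` and `out_port`, which is the kind of path an evaluator would otherwise trace by hand.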
Abstract:
Significant research has demonstrated direct and indirect associations between substance use and sexual behaviour. Substance use is related to sexual risk-taking and HIV seroconversion among some substance-using men who have sex with men (MSM). It remains unclear what factors mediate or underlie this relationship, and which substances are associated with greater harm. Substance-related expectancies are hypothesised as potential mechanisms. A conceptual model based on social-cognitive theory was tested, which explores the role of demographic factors, substance use, substance-related expectancies and novelty-seeking personality characteristics in predicting unprotected anal intercourse (UAI) while under the influence, across four commonly used substance types. Phase 1, a qualitative study (N = 20), explored how MSM perceive the effects of substance use on their thoughts, feelings and behaviours, including sexual behaviours. Information was obtained through discussion and interviews, resulting in the establishment of key themes. Results indicated MSM experience a wide range of reinforcing aspects associated with substance use. General and specific effects were evident across substance types, and were associated with sexual behaviour and sexual risk-taking. Phase 2 consisted of developing a comprehensive profile of substance-related expectancies for MSM (SEP-MSM) regarding alcohol, cannabis, amyl nitrite and stimulants that possessed sound psychometric properties and was appropriate for use among this group. A cross-sectional questionnaire with 249 participants recruited through gay community networks was used to validate these measures, and involved online data collection, participants rating expectancy items and subsequent factor analysis. Results indicated expectancies can be reliably assessed, and predicted substance use patterns.
Phase 3 examined demographic factors, substance use, substance-related expectancies, and novelty-seeking traits among another community sample of MSM (N = 277) throughout Australia, in predicting UAI while under the influence. Using a cross-sectional design, participants were recruited through gay community networks and completed online questionnaires. The SEP-MSM, and associated substance use, predicted UAI. This research extends social-cognitive theory regarding sexual behaviour, and advances understanding of the role of expectancies associated with substance use and sexual risk-taking. Future applications of the SEP-MSM in health promotion, prevention, clinical interventions and research are likely to contribute to reducing harm associated with substance-using MSM (e.g., HIV transmission).
Abstract:
Living City 2010 was a three-day place-based urban design immersion workshop program held at Logan Road Conference Centre, Stones Corner, for 30 self-selected Year 11 Visual Art Students and 4 Teachers drawn from 11 state and private Brisbane Secondary Schools, that focused on the active Brisbane City Council redevelopment site of Stones Corner, specifically Logan Road, public spaces at Stones Corner Library and rehabilitation of the nearby creek corridor. The workshop, framed within notions of ecological, economic, social and cultural sustainability, aimed to raise awareness of the layered complexity and perspectives involved in the design of shared city spaces and to encourage young people to voice their own concerns as future citizens about the shape and direction of their city. On Day 1, Brisbane City Council Public Art Officers Brendan Doherty and Genevieve Searle, local landscape architect Peter Boyle (Verge) and artists Malcolm Enright and Barbara Heath provided students with an overview of the historic and future context of the area including proposed design and public art interventions, followed by a site walk. The afternoon session, led by Natalie Wright and QUT design staff and students, focused on design tools to assist in the tackling of the redesign of the Stones Corner library precinct, where students worked on ideas. On Day 2, students were mentored by artist Liam Key to participate in a computer animation activity using the built environment as a canvas, and by artist Sebastian Moody to participate in an activity using red helium balloons as a playful catalyst for interaction to activate and create new public space. Later, students worked in teams on their ideas for redevelopment of the site in preparation for their Day 3 presentations. The workshop culminated in an exchange of planning ideas with Georgina Aitchison from Brisbane City Council's Urban Renewal Division. 
Students were introduced to design methodology, team thinking strategies, the scope of design practices and professions, presentation skills and post-secondary pathways, while participating teachers acquired content and design learning strategies transferable in many other contexts.
Abstract:
The use of Cellular Automata (CA) for musical purposes has a rich history. In general the mapping of CA states to note-level music representations has focused on pitch mapping and downplayed rhythm. This paper reports experiments in the application of one-dimensional cellular automata to the generation and evolution of rhythmic patterns. A selection of CA tendencies are identified that can be used as compositional tools to control the rhythmic coherence of monophonic passages and the polyphonic texture of musical works in broad-brush, rather than precisely deterministic, ways. This will provide the composer and researcher with a clearer understanding of the useful application of CAs for generative music.
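The generative scheme described above can be sketched concretely. The toy below (rule number, mapping and notation are illustrative choices, not necessarily those used in the paper) evolves a one-dimensional elementary CA and reads each generation as a rhythmic bar, with live cells as note onsets and dead cells as rests:

```python
# One-dimensional elementary CA as a rhythm generator: each generation
# of cells is interpreted as an onset pattern (1 = onset, 0 = rest).

def step(cells, rule=90):
    """One generation of an elementary CA with wrap-around edges."""
    n = len(cells)
    out = []
    for i in range(n):
        # Pack the 3-cell neighbourhood into a value 0..7 and look up
        # the corresponding bit of the Wolfram rule number.
        hood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> hood) & 1)
    return out

def rhythm_patterns(seed, generations, rule=90):
    """Evolve the CA, returning one onset pattern per bar."""
    rows, cells = [], list(seed)
    for _ in range(generations):
        rows.append(cells)
        cells = step(cells, rule)
    return rows

def as_notation(pattern):
    """Render a pattern as a simple x/. rhythm string."""
    return "".join("x" if c else "." for c in pattern)
```

Seeding a 16-cell row with a single live cell under rule 90 (where each cell becomes the XOR of its neighbours) yields the familiar self-similar triangle patterns, read bar by bar as progressively denser onset grids, which is one of the "tendencies" a composer could exploit.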
Abstract:
Increasingly, studies are reported that examine how conceptual modeling is conducted in practice. Yet, typically the studies to date have examined in isolation how modeling grammars can be, or are, used to develop models of information systems or organizational processes, without considering that such modeling is typically done by means of a modeling tool that extends the modeling functionality offered by a grammar through complementary features. This paper extends the literature by examining how the use of seven different features of modeling tools affects usage beliefs users develop when using modeling grammars for process modeling. We show that five distinct tool features positively affect usefulness, ease of use and satisfaction beliefs of users. We offer a number of interpretations about the findings. We also describe how the results inform decisions of relevance to developers of modeling tools as well as managers in charge of making modeling-related investment decisions.
Abstract:
Proteases regulate a spectrum of diverse physiological processes, and dysregulation of proteolytic activity drives a plethora of pathological conditions. Understanding protease function is essential to appreciating many aspects of normal physiology and progression of disease. Consequently, development of potent and specific inhibitors of proteolytic enzymes is vital to provide tools for the dissection of protease function in biological systems and for the treatment of diseases linked to aberrant proteolytic activity. The studies in this thesis describe the rational design of potent inhibitors of three proteases that are implicated in disease development. Additionally, key features of the interaction of proteases and their cognate inhibitors or substrates are analysed and a series of rational inhibitor design principles are expounded and tested. Rational design of protease inhibitors relies on a comprehensive understanding of protease structure and biochemistry. Analysis of known protease cleavage sites in proteins and peptides is a commonly used source of such information. However, model peptide substrate and protein sequences have widely differing levels of backbone constraint and hence can adopt highly divergent structures when binding to a protease’s active site. This may result in identical sequences in peptides and proteins having different conformations and diverse spatial distribution of amino acid functionalities. Regardless of this, protein and peptide cleavage sites are often regarded as being equivalent. One of the key findings in the following studies is a definitive demonstration of the lack of equivalence between these two classes of substrate and invalidation of the common practice of using the sequences of model peptide substrates to predict cleavage of proteins in vivo. Another important feature for protease substrate recognition is subsite cooperativity. 
This type of cooperativity is commonly referred to as protease or substrate binding subsite cooperativity and is distinct from allosteric cooperativity, where binding of a molecule distant from the protease active site affects the binding affinity of a substrate. Subsite cooperativity may be intramolecular where neighbouring residues in substrates are interacting, affecting the scissile bond’s susceptibility to protease cleavage. Subsite cooperativity can also be intermolecular where a particular residue’s contribution to binding affinity changes depending on the identity of neighbouring amino acids. Although numerous studies have identified subsite cooperativity effects, these findings are frequently ignored in investigations probing subsite selectivity by screening against diverse combinatorial libraries of peptides (positional scanning synthetic combinatorial library; PS-SCL). This strategy for determining cleavage specificity relies on the averaged rates of hydrolysis for an uncharacterised ensemble of peptide sequences, as opposed to the defined rate of hydrolysis of a known specific substrate. Further, since PS-SCL screens probe the preference of the various protease subsites independently, this method is inherently unable to detect subsite cooperativity. However, mean hydrolysis rates from PS-SCL screens are often interpreted as being comparable to those produced by single peptide cleavages. Before this study no large systematic evaluation had been made to determine the level of correlation between protease selectivity as predicted by screening against a library of combinatorial peptides and cleavage of individual peptides. This subject is specifically explored in the studies described here. In order to establish whether PS-SCL screens could accurately determine the substrate preferences of proteases, a systematic comparison of data from PS-SCLs with libraries containing individually synthesised peptides (sparse matrix library; SML) was carried out. 
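The sampling argument above, that averaged PS-SCL readouts are inherently blind to intermolecular cooperativity, can be made concrete with a toy calculation. The rates below are invented for illustration only; the point is that a sequence which is fast only because two subsite residues cooperate is invisible when each position is averaged over the other:

```python
# Toy two-subsite protease with invented hydrolysis rates. K at P2 and
# F at P1 cooperate strongly, but K looks mediocre when averaged over
# the randomised P1 position, so positional (PS-SCL-style) averaging
# picks a different, slower substrate than per-sequence (SML-style)
# measurement does.
from itertools import product
from statistics import mean

rates = {
    ("K", "F"): 100.0,  # cooperative pair: the true best substrate
    ("K", "R"): 0.0,
    ("A", "F"): 60.0,
    ("A", "R"): 60.0,
}
residues_p2 = ["K", "A"]
residues_p1 = ["F", "R"]

# PS-SCL-style readout: pick each position's best residue by its mean
# rate over the other (randomised) position.
best_p2 = max(residues_p2, key=lambda r: mean(rates[(r, p1)] for p1 in residues_p1))
best_p1 = max(residues_p1, key=lambda r: mean(rates[(p2, r)] for p2 in residues_p2))
pscl_prediction = (best_p2, best_p1)

# SML-style readout: measure every individually defined sequence.
sml_best = max(product(residues_p2, residues_p1), key=rates.get)
```

Here the positional averages nominate A at P2 (mean 60 versus 50 for K), so the PS-SCL-style prediction is the 60-unit substrate KF never surfaces, while exhaustive per-sequence screening recovers it. This is the effect the SML comparison in the study was designed to expose.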
These SML libraries were designed to include all possible sequence combinations of the residues that were suggested to be preferred by a protease using the PS-SCL method. SML screening against the three serine proteases kallikrein 4 (KLK4), kallikrein 14 (KLK14) and plasmin revealed highly preferred peptide substrates that could not have been deduced by PS-SCL screening alone. Comparing protease subsite preference profiles from screens of the two types of peptide libraries showed that the most preferred substrates were not detected by PS-SCL screening as a consequence of intermolecular cooperativity being negated by the very nature of PS-SCL screening. Sequences that are highly favoured as a result of intermolecular cooperativity achieve optimal protease subsite occupancy, and thereby interact with very specific determinants of the protease. Identifying these substrate sequences is important since they may be used to produce potent and selective inhibitors of proteolytic enzymes. This study found that highly favoured substrate sequences that relied on intermolecular cooperativity allowed for the production of potent inhibitors of KLK4, KLK14 and plasmin. Peptide aldehydes based on preferred plasmin sequences produced high-affinity transition state analogue inhibitors for this protease. The most potent of these maintained specificity over plasma kallikrein (known to have a very similar substrate preference to plasmin). Furthermore, the efficiency of this inhibitor in blocking fibrinolysis in vitro was comparable to aprotinin, which previously saw clinical use to reduce perioperative bleeding. One substrate sequence particularly favoured by KLK4 was substituted into the 14 amino acid, circular sunflower trypsin inhibitor (SFTI). This resulted in a highly potent and selective inhibitor (SFTI-FCQR) which attenuated protease activated receptor signalling by KLK4 in vitro.
Moreover, SFTI-FCQR and paclitaxel synergistically reduced growth of ovarian cancer cells in vitro, making this inhibitor a lead compound for further therapeutic development. Similar incorporation of a preferred KLK14 amino acid sequence into the SFTI scaffold produced a potent inhibitor for this protease. However, the conformationally constrained SFTI backbone enforced a different intramolecular cooperativity, which masked a KLK14-specific determinant. As a consequence, the level of selectivity achievable was lower than that found for the KLK4 inhibitor. Standard mechanism inhibitors such as SFTI rely on a stable acyl-enzyme intermediate for high affinity binding. This is achieved by a conformationally constrained canonical binding loop that allows for reformation of the scissile peptide bond after cleavage. Amino acid substitutions within the inhibitor to target a particular protease may compromise structural determinants that support the rigidity of the binding loop and thereby prevent the engineered inhibitor reaching its full potential. An in silico analysis was carried out to examine the potential for further improvements to the potency and selectivity of the SFTI-based KLK4 and KLK14 inhibitors. Molecular dynamics simulations suggested that the substitutions within SFTI required to target KLK4 and KLK14 had compromised the intramolecular hydrogen bond network of the inhibitor and caused a concomitant loss of binding loop stability. Furthermore, in silico amino acid substitution revealed that a higher frequency of formation and number of internal hydrogen bonds in SFTI variants correlated consistently with lower inhibition constants. These predictions allowed for the production of second generation inhibitors with enhanced binding affinity toward both targets and highlight the importance of considering intramolecular cooperativity effects when engineering proteins or circular peptides to target proteases.
The findings from this study show that although PS-SCLs are a useful tool for high throughput screening of approximate protease preference, later refinement by SML screening is needed to reveal optimal subsite occupancy due to cooperativity in substrate recognition. This investigation has also demonstrated the importance of maintaining structural determinants of backbone constraint and conformation when engineering standard mechanism inhibitors for new targets. Combined, these results show that backbone conformation and amino acid cooperativity have more prominent roles than previously appreciated in determining substrate/inhibitor specificity and binding affinity. The three key inhibitors designed during this investigation are now being developed as lead compounds for cancer chemotherapy, control of fibrinolysis and cosmeceutical applications. These compounds form the basis of a portfolio of intellectual property which will be further developed in the coming years.
Abstract:
Much has been said and documented about the key role that reflection can play in the ongoing development of e-portfolios, particularly e-portfolios utilised for teaching and learning. A review of e-portfolio platforms reveals that a designated space for documenting and collating personal reflections is a typical design feature of both open source and commercial off-the-shelf software. Further investigation of tools within e-portfolio systems for facilitating reflection reveals that, apart from enabling personal journalism through blogs or other writing, scaffolding tools that encourage the actual process of reflection are under-developed. Investigation of a number of prominent e-portfolio projects also reveals that reflection, while presented as critically important, is often viewed as an activity that takes place after a learning activity or experience and not intrinsic to it. This paper assumes an alternative, richer conception of reflection: a process integral to a wide range of activities associated with learning, such as inquiry, communication, editing, analysis and evaluation. Such a conception is consistent with the literature associated with ‘communities of practice’, which is replete with insight into ‘learning through doing’, and with a ‘whole minded’ approach to inquiry. Thus, graduates who are ‘reflective practitioners’, integrating reflection into their learning, will have more to offer a prospective employer than graduates who have adopted an episodic approach to reflection. So, what kinds of tools might facilitate integrated reflection? This paper outlines a number of possibilities for consideration and development. Such tools do not have to be embedded within e-portfolio systems, although there are benefits in doing so.
In order to inform future design of e-portfolio systems this paper presents a faceted model of knowledge creation that depicts an ‘ecology of knowing’ in which interaction with, and the production of, learning content is deepened through the construction of well-formed questions of that content. In particular, questions that are initiated by ‘why’ are explored because they are distinguished from the other ‘journalist’ questions (who, what, when, where, and how) in that answers to them demand explanative, as opposed to descriptive, content. They require a rationale. Although why questions do not belong to any one genre and are not simple to classify — responses can contain motivational, conditional, causal, and/or existential content — they do make a difference in the acquisition of understanding. The development of scaffolding that builds on why-questioning to enrich learning is the motivation behind the research that has informed this paper.