914 results for Instrumental reason
Abstract:
Pacific people have their own unique ways of knowing that shape how they learn, and this should be taken into account in curriculum planning and teaching. Pacific people are more likely to want to learn by doing, seeing and collaborating in a concrete environment, whereas for Western students learning becomes formal quickly and depends more on words and theories. This assumed difference in learning preferences could present a problem for formal learning, with a need to bridge the gap, psychologically and epistemologically, between concrete and formal modes of learning. It could be the reason why some students in the Pacific, even at the tertiary level, rely heavily on rote learning. This chapter discusses learning and assessment practices that help to foster understanding as they might apply to university teaching in the South Pacific.
Abstract:
The hippocampus is an anatomically distinct region of the medial temporal lobe that plays a critical role in the formation of declarative memories. Here we show that a computer simulation of simple compartmental cells organized with basic hippocampal connectivity is capable of producing stimulus-intensity-sensitive wide-band fluctuations of spectral power similar to those seen in real EEG. While previous computational models have been designed to assess the viability of the putative mechanisms of memory storage and retrieval, they have generally been too abstract to allow comparison with empirical data. Furthermore, while the anatomical connectivity and organization of the hippocampus is well defined, many questions regarding the mechanisms that mediate large-scale synaptic integration remain unanswered. For this reason we focus less on the specifics of changing synaptic weights and more on the population dynamics. Spectral power in four distinct frequency bands was derived from simulated field potentials of the computational model and was found to depend on the intensity of a random input. The majority of power occurred in the lowest frequency band (3-6 Hz) and was greatest for the lowest-intensity stimulus condition (1% of the maximal stimulus). In contrast, higher frequency bands ranging from 7-45 Hz showed an increase in power directly related to stimulus intensity. This trend continues up to a stimulus level of 15% to 20% of the maximal input, above which power falls dramatically. These results suggest that the relative power of intrinsic network oscillations is dependent upon the level of activation and that above threshold levels all frequencies are damped, perhaps due to overactivation of inhibitory interneurons.
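As a rough illustration of the band-power measure described above, the sketch below integrates a Welch power spectral density estimate of a simulated field potential over fixed frequency bands. The 3-6 Hz band follows the abstract; the split of the 7-45 Hz range into sub-bands, the sampling rate and the surrogate signal are placeholder assumptions, not details of the authors' model.

```python
# Minimal sketch: band-limited power from a simulated field potential.
# The 3-6 Hz band is taken from the abstract; the remaining band edges,
# sampling rate and surrogate signal are placeholder assumptions.
import numpy as np
from scipy.signal import welch

def band_powers(lfp, fs, bands):
    """Integrate a Welch PSD estimate over each (lo, hi) frequency band."""
    freqs, psd = welch(lfp, fs=fs, nperseg=int(2 * fs))
    df = freqs[1] - freqs[0]
    return {(lo, hi): psd[(freqs >= lo) & (freqs <= hi)].sum() * df
            for lo, hi in bands}

if __name__ == "__main__":
    fs = 1000.0                                    # sampling rate (Hz), assumed
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)
    # Surrogate "field potential": a 4.5 Hz oscillation plus broadband noise.
    lfp = np.sin(2 * np.pi * 4.5 * t) + 0.3 * rng.standard_normal(t.size)
    bands = [(3, 6), (7, 12), (13, 30), (31, 45)]  # placeholder band edges
    for band, power in band_powers(lfp, fs, bands).items():
        print(band, round(float(power), 4))
```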
Abstract:
For the past few years, research on the secure outsourcing of cryptographic computations has drawn significant attention from academics in the security and cryptology disciplines as well as from information security practitioners. One main reason for this interest is its application to resource-constrained devices such as RFID tags. While there has been significant progress in this domain since Hohenberger and Lysyanskaya provided formal security notions for secure computation delegation, some interesting challenges remain whose solutions would support a wider deployment of cryptographic protocols that enable secure outsourcing of cryptographic computations. This position paper sets out these challenging problems, with RFID technology as the use case, together with our ideas, where applicable, that can provide a direction towards solving them.
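To make the notion of outsourcing a cryptographic computation concrete, the toy sketch below delegates a modular exponentiation by additively splitting the secret exponent between two servers assumed not to collude. This is only an illustration of the general idea, not the Hohenberger-Lysyanskaya construction or any scheme from the paper, and the group parameters are demo values.

```python
# Toy sketch: outsourcing g^a mod p by splitting the secret exponent a into
# two random-looking shares sent to two non-colluding servers. Either server
# alone learns nothing about a; the two results are multiplied to recover g^a.
# Demo parameters only; this is not a scheme proposed in the paper.
import secrets

def delegate_exponentiation(g, a, p, server1, server2):
    """Client-side delegation: hide exponent a, recombine the servers' answers."""
    q = p - 1                              # exponent arithmetic modulo p - 1
    a1 = secrets.randbelow(q)              # uniformly random share
    a2 = (a - a1) % q                      # complementary share
    y1 = server1(g, a1, p)                 # heavy computations done remotely
    y2 = server2(g, a2, p)
    return (y1 * y2) % p                   # g^a1 * g^a2 = g^(a1+a2) = g^a

if __name__ == "__main__":
    p = (1 << 64) - 59                     # 64-bit prime, for the demo only
    g = 5
    a = secrets.randbelow(p - 1)
    honest_server = lambda base, exp, mod: pow(base, exp, mod)
    assert delegate_exponentiation(g, a, p, honest_server, honest_server) == pow(g, a, p)
    print("delegated result matches local computation")
```

A real protocol must additionally let the resource-constrained client verify the servers' answers cheaply and tolerate malicious servers, which is where formal delegation notions such as Hohenberger and Lysyanskaya's come in.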
Abstract:
High-throughput plasmid DNA (pDNA) manufacture is obstructed predominantly by the performance of conventional stationary phases. For this reason, the search for new materials for fast chromatographic separation of pDNA is ongoing. A poly(glycidyl methacrylate-co-ethylene glycol dimethacrylate) (GMA-EGDMA) monolithic material was synthesised via a thermal free-radical reaction and functionalised with different amino groups from urea, 2-chloro-N,N-diethylethylamine hydrochloride (DEAE-Cl) and ammonia in order to investigate their plasmid adsorption capacities. Physical characterisation of the monolithic polymer showed a macroporous polymer with a unimodal pore size distribution centred at 600 nm. Chromatographic characterisation of the functionalised polymers using pUC19 plasmid isolated from E. coli DH5α-pUC19 showed a maximum plasmid adsorption capacity of 18.73 mg pDNA/mL with a dissociation constant (KD) of 0.11 mg/mL for the GMA-EGDMA/DEAE-Cl polymer. Studies on ligand leaching and degradation demonstrated the stability of GMA-EGDMA/DEAE-Cl after the functionalised polymers were contacted with 1.0 M NaOH, a model reagent for most 'cleaning in place' (CIP) systems. However, it is the economic advantage of an adsorbent material that makes it so attractive for commercial purification purposes. Economic evaluation of the performance of the functionalised polymers on the basis of polymer cost (PC) per mg pDNA retained endorsed the suitability of the GMA-EGDMA/DEAE-Cl polymer.
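The quoted maximum capacity and dissociation constant are the two parameters of a saturable binding model; the sketch below evaluates a Langmuir-type isotherm with those figures to show how bound pDNA varies with liquid-phase concentration. The Langmuir functional form itself is our assumption, since the abstract only reports qmax and KD.

```python
# Sketch of a Langmuir-type adsorption isotherm evaluated with the capacity and
# dissociation constant reported for the GMA-EGDMA/DEAE-Cl polymer. The Langmuir
# form is assumed here; the study only quotes the fitted qmax and KD values.

Q_MAX = 18.73   # mg pDNA per mL adsorbent (reported maximum capacity)
K_D = 0.11      # mg/mL (reported dissociation constant)

def langmuir(c, q_max=Q_MAX, k_d=K_D):
    """Bound pDNA (mg per mL adsorbent) at liquid-phase concentration c (mg/mL)."""
    return q_max * c / (k_d + c)

if __name__ == "__main__":
    for c in (0.05, 0.11, 0.5, 1.0):
        print(f"c = {c:4.2f} mg/mL  ->  q = {langmuir(c):5.2f} mg pDNA/mL")
    # Sanity check: at c = K_D the adsorbent is half saturated (q = Q_MAX / 2).
```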
Abstract:
Current developments in gene medicine and vaccination studies are utilizing plasmid DNA (pDNA) as the vector. For this reason, there has been an increasing trend towards larger and larger doses of pDNA utilized in human trials: from 100-1000 μg in 2002 to 500-5000 μg in 2005. The increasing demand for pDNA has created the need to revolutionise current production levels under optimum economy. In this work, different standard media (LB, TB and SOC) for culturing recombinant Escherichia coli DH5α harbouring pUC19 were compared to a medium optimised for pDNA production. Lab-scale fermentations using the standard media showed that the highest pDNA volumetric and specific yields were for TB (11.4 μg/ml and 6.3 μg/mg dry cell mass respectively) and the lowest were for LB (2.8 μg/ml and 3.3 μg/mg dry cell mass respectively). A fourth medium, PDMR, designed by modifying a stoichiometrically-formulated medium with an optimised carbon source concentration and carbon-to-nitrogen ratio, displayed pDNA volumetric and specific yields of 23.8 μg/ml and 11.2 μg/mg dry cell mass respectively. However, it is the economic advantages of the optimised medium that make it so attractive. Keeping all variables constant except the medium and using LB as a base scenario (100 medium cost [MC] units/mg pDNA), the optimised PDMR medium yielded pDNA at a cost of only 27 MC units/mg pDNA. These results show that greater amounts of pDNA can be obtained more economically, with minimal extra effort, simply by using a medium optimised for pDNA production.
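The MC-unit comparison is simply cost per milligram of pDNA normalised so that LB equals 100; the sketch below reproduces that normalisation using the reported volumetric yields. The per-litre medium prices are hypothetical placeholders (the abstract does not give them), so the printed figures will not match the reported 27 MC units/mg exactly.

```python
# Sketch of the medium-cost (MC) comparison: cost per mg pDNA, normalised to LB = 100.
# Volumetric yields are from the abstract; the per-litre medium prices below are
# purely hypothetical placeholders used only to show the calculation.

yields_ug_per_ml = {"LB": 2.8, "TB": 11.4, "PDMR": 23.8}   # reported (ug/ml == mg/L)
price_per_litre = {"LB": 1.0, "TB": 2.5, "PDMR": 1.8}      # hypothetical prices

def mc_units(medium):
    """Relative cost per mg pDNA, with LB fixed at 100 MC units/mg."""
    cost_per_mg = price_per_litre[medium] / yields_ug_per_ml[medium]
    base = price_per_litre["LB"] / yields_ug_per_ml["LB"]
    return 100 * cost_per_mg / base

if __name__ == "__main__":
    for medium in yields_ug_per_ml:
        print(f"{medium:5s}: {mc_units(medium):6.1f} MC units/mg pDNA")
```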
Abstract:
During the critical neurobiological and social developmental period of adolescence, binge drinking of alcohol increases the risk of mental health problems, school exclusion, convictions, and fatal and non-fatal accidents. The present research utilizes a simple cluster randomized controlled trial design to evaluate a social marketing program, Game On: Know Alcohol (GOKA), employing innovative online edutainment games to target binge drinking. Pre- and post-intervention data were collected for seven program schools (942 students, mean age 14.6 years) and five control schools (578 students, mean age 14.4 years). Significant improvements in alcohol knowledge and affective attitude toward binge drinking were observed for adolescents who participated in GOKA compared to the control group, with maintenance of desirable subjective norms, instrumental attitudes and intentions. Given considerable external competition from messages promoting the benefits of alcohol use, a one-off program that corrects inaccurate knowledge and alters perceptions of binge drinking as a fun, recreational activity represents an important step. This research contributes to current understanding of social marketing’s capacity to change drivers and maintain inhibitors of adolescents’ binge drinking intentions and provides an important basis for future research in the domain.
Abstract:
The philosophical promise of community development to “resource and empower people so that they can collectively control their own destinies” (Kenny 1996:104) is no doubt alluring to Indigenous Australia. Given the historical and contemporary experiences of colonial control and surveillance of Aboriginal bodies, alongside the continuing experiences of socio-economic disadvantage, community development reaffirms the aspirational goal of Indigenous Australians for self-determination. Self-determination as a national policy agenda for Indigenous Australians emerged in the 1970s and saw the establishment of a wide range of Aboriginal community-controlled services (Tsey et al 2012). Sullivan (2010:4) argues that the Aboriginal community-controlled service sector established during this time has been, and continues to be, instrumental in advancing the plight of Indigenous Australians both materially and politically. Yet community development and self-determination remain highly problematic and contested in how they manifest in Indigenous social policy agendas and in practice (Hollinsworth 1996; Martin 2003; McCausland 2005; Moreton-Robinson 2009). Moreton-Robinson (2009:68) argues that a central theme underpinning these tensions is a reading of Indigeneity in which Aboriginal and Torres Strait Islander people, behaviours, cultures, and communities are pathologised as “dysfunctional”, thus enabling assertions that Indigenous people are incapable of managing their own affairs. This discourse distracts us from the “strategies and tactics of patriarchal white sovereignty” that inhibit the “state’s earlier policy of self-determination” (Moreton-Robinson 2009:68). We acknowledge the irony of community development noted by Ramirez (1990) above, that the least resourced are expected to be most resourceful; however, we wish to interrogate the processes that inhibit Indigenous participation and control of our own affairs rather than further interrogate Aboriginal minds as uneducated, incapable and/or impaired...
Abstract:
This article argues that the secular liberal and positivist foundations of the modern Western legal system render it violent. In particular, the liberal exclusion of faith and subjectivity in favour of abstract and universal reason in conjunction with its privileging of individual autonomy at the expense of the community leads to alienation of the individual from the community. Similarly, the positivist exclusion of faith and theology from law, with its enforced conformity to the posited law, also results in this violence of alienation. In response, this article proposes a new foundation for law, a natural law based in the truth of Trinitarian theology articulated by John Milbank. In the Trinity, the members exist as a perfect unity in diversity, providing a model for the reconciliation of the legal individual and community: the law of love. Through the law of love as the basic norm, individuals love their neighbours as themselves, reconciling the particular and the universal, and providing a community of peace rather than violence.
Abstract:
There is consensus among practitioners and academics that culture is a critical factor that can determine the success or failure of BPM initiatives. Yet culture is a topic that seems difficult to grasp and manage. This may be the reason for the overall lack of guidance on how to address this topic in practice. We have conducted in-depth research for more than three years to examine why and how culture is relevant to BPM. In this chapter, we introduce a framework that explains the role of culture in BPM. We also present the relevant cultural values that compose a BPM culture, and we introduce a tool for examining how supportive organizational cultures are of BPM. Our research results provide the basis for further empirical analyses on the topic and support practitioners in the management of culture as an important factor in BPM initiatives.
Abstract:
Structural Health Monitoring (SHM) schemes are useful for proper management of the performance of structures and for preventing their catastrophic failure. Vibration-based SHM schemes have gained popularity during the past two decades, resulting in significant research. It is hence inevitable that future SHM schemes will include robust and automated vibration-based damage assessment techniques (VBDAT) to detect, localize and quantify damage. In this context, the Damage Index (DI) method, which is classified as a non-model or output-based VBDAT, has the ability to automate the damage assessment process without requiring a computer or numerical model alongside the actual measurements. Although damage assessment using DI methods has achieved reasonable success for structures made of homogeneous materials such as steel, the same level of success has not been reported for Reinforced Concrete (RC) structures. The complexity of flexural cracks is claimed to be the main reason hindering the applicability of existing DI methods to RC structures. Past research also indicates that the use of a constant baseline throughout the damage assessment process undermines the potential of the Modal Strain Energy based Damage Index (MSEDI). To address this situation, this paper presents a novel method that has been developed as part of a comprehensive research project carried out at Queensland University of Technology, Brisbane, Australia. This novel process, referred to as the baseline updating method, continuously updates the baseline and systematically tracks both crack formation and propagation, with the ability to automate the damage assessment process using output-only data. The proposed method is illustrated through examples and the results demonstrate the capability of the method to achieve the desired outcomes.
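For readers unfamiliar with modal-strain-energy damage indices, the sketch below computes one widely used Stubbs-type formulation from the mode-shape curvatures of a healthy baseline and a damaged state, with a z-score normalisation to flag likely damage locations. It is a generic illustration under our own assumptions; it does not reproduce the paper's MSEDI or the baseline updating method.

```python
# Generic sketch of a modal-strain-energy damage index for a beam, computed from
# mode-shape curvatures (a Stubbs-type formulation with a fixed healthy baseline).
# Illustration only; the paper's MSEDI and baseline-updating scheme differ.
import numpy as np

def strain_energy_index(phi_healthy, phi_damaged, dx):
    """Return a normalised damage index per node from one mode-shape pair."""
    # Mode-shape curvature via second-order central differences.
    k_h = np.gradient(np.gradient(phi_healthy, dx), dx)
    k_d = np.gradient(np.gradient(phi_damaged, dx), dx)
    e_h, e_d = k_h**2 * dx, k_d**2 * dx          # elemental strain energy terms
    U_h, U_d = e_h.sum(), e_d.sum()              # total modal strain energies
    # Stubbs-type ratio of fractional strain energies (healthy state as baseline).
    beta = ((e_d + U_d) * U_h) / ((e_h + U_h) * U_d)
    # Normalise so large positive values flag likely damage locations.
    return (beta - beta.mean()) / beta.std()

if __name__ == "__main__":
    x = np.linspace(0, 1, 101)
    dx = x[1] - x[0]
    phi_h = np.sin(np.pi * x)                    # first bending mode, pinned beam
    phi_d = phi_h.copy()
    phi_d[45:55] *= 1.02                         # crude local curvature perturbation
    z = strain_energy_index(phi_h, phi_d, dx)
    print("peak index near x =", round(float(x[np.argmax(z)]), 2))
```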
Abstract:
More than ever, research is playing an important part in supporting proposed tax reforms and finding solutions for Australia’s tax system. For tax academics, too, quality research is critical in an increasingly competitive tertiary environment. However, life for an academic can be an isolating experience at times, especially if one’s expertise is in an area in which many of their immediate colleagues do not share an interest. Collegiality and the ability to discuss research are seen as critical in fostering the next generation of academics. It is with this in mind that on the 5th of July 2010 the Inaugural Queensland Tax Teachers’ Symposium was hosted by Griffith University at its Southbank campus. The aim was to bring together for one day tax academics in Queensland, and further afield, to present their current research projects and encourage independent tax research. It was for this reason that the symposium was later re-named the Queensland Tax Researchers’ Symposium (QTRS) to reflect its emphasis. The Symposium has been held annually mid-year on four occasions, with in excess of 120 attendees over this period. The fifth QTRS is planned for June 2014, to be hosted by James Cook University.
Abstract:
The standard method for deciding bit-vector constraints is via eager reduction to propositional logic. This is usually done after first applying powerful rewrite techniques. While often efficient in practice, this method does not scale on problems for which top-level rewrites cannot reduce the problem size sufficiently. A lazy solver can target such problems by doing many satisfiability checks, each of which only reasons about a small subset of the problem. In addition, the lazy approach enables a wide range of optimization techniques that are not available to the eager approach. In this paper we describe the architecture and features of our lazy solver (LBV). We provide a comparative analysis of the eager and lazy approaches, and show how they are complementary in terms of the types of problems they can efficiently solve. For this reason, we propose a portfolio approach that runs a lazy and eager solver in parallel. Our empirical evaluation shows that the lazy solver can solve problems none of the eager solvers can and that the portfolio solver outperforms other solvers both in terms of total number of problems solved and the time taken to solve them.
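As a rough sketch of the portfolio idea, the snippet below launches an eager and a lazy solver on the same input in parallel and reports whichever answers first. The command lines are placeholders, not the actual eager solvers or the LBV tool, and a production portfolio would also kill the slower process rather than merely ignore it.

```python
# Minimal portfolio sketch: run an eager and a lazy bit-vector solver in parallel
# on the same benchmark and take the first verdict. The solver commands below are
# placeholders, not the actual tools evaluated in the paper.
import subprocess
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait

COMMANDS = {
    "eager": ["eager-bv-solver", "problem.smt2"],   # placeholder command line
    "lazy":  ["lazy-bv-solver", "problem.smt2"],    # placeholder command line
}

def run(name, cmd):
    out = subprocess.run(cmd, capture_output=True, text=True, timeout=600)
    return name, out.stdout.strip()                 # expected: "sat" or "unsat"

def portfolio():
    with ThreadPoolExecutor(max_workers=len(COMMANDS)) as pool:
        futures = [pool.submit(run, name, cmd) for name, cmd in COMMANDS.items()]
        done, _ = wait(futures, return_when=FIRST_COMPLETED)
        # The slower solver keeps running until the pool shuts down; a real
        # portfolio would terminate its process at this point.
        return next(iter(done)).result()

if __name__ == "__main__":
    winner, verdict = portfolio()
    print(f"{winner} solver answered first: {verdict}")
```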
Abstract:
There is an increasing need in biology and clinical medicine to robustly and reliably measure tens-to-hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma, and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and 7 control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to sub-nanogram/mL sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and inter-laboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy isotope labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an inter-laboratory clinical study of patient samples. Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible and quantitative measurements of proteins and peptides in complex biological matrices such as plasma.
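As a reminder of the arithmetic behind stable-isotope-dilution quantification and the reproducibility figures quoted above, the sketch below converts light/heavy peak-area ratios into concentrations and computes an inter-laboratory coefficient of variation. The peak areas and spike level are invented example values, not data from the study.

```python
# Sketch of stable-isotope-dilution quantification and a reproducibility (CV) check,
# the general calculation used in MRM assays. Peak areas and the spiked
# heavy-standard concentration below are made-up example values.
import statistics

def concentration(light_area, heavy_area, heavy_spike_ng_per_ml):
    """Analyte concentration from the light/heavy peak-area ratio."""
    return (light_area / heavy_area) * heavy_spike_ng_per_ml

def percent_cv(values):
    """Coefficient of variation (%) across replicate or cross-laboratory measurements."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

if __name__ == "__main__":
    heavy_spike = 5.0                                   # ng/mL, hypothetical spike level
    # (light, heavy) peak areas reported by three hypothetical laboratories.
    lab_areas = [(1.20e6, 2.35e6), (1.31e6, 2.50e6), (1.18e6, 2.41e6)]
    results = [concentration(l, h, heavy_spike) for l, h in lab_areas]
    print("per-lab ng/mL:", [round(r, 2) for r in results])
    print("inter-lab CV: %.1f%%" % percent_cv(results))
```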
Abstract:
Introduction Patients with dysphagia (PWDs) have been shown to be four times more likely to suffer medication administration errors (MAEs).1 2 Individualised medication administration guides (I-MAGs), which outline how each formulation should be administered, have been developed to standardise medication administration by nurses on the ward and reduce the likelihood of errors. This pilot study aimed to determine recruitment rates, estimate the effect on errors and develop the intervention in order to design a future full-scale randomised controlled trial to determine the costs and effects of I-MAG implementation. Ethical approval was granted by the local ethics committee. Method Software was developed to enable I-MAG production (based on current best practice)3 4 for all PWDs admitted to two care-of-the-older-person wards during a six-month period from January to July 2011. I-MAGs were attached to the medication administration record charts to be utilised by nurses when administering medicines. Training was provided for all staff on the intervention wards. Two care-of-the-older-person wards in the same hospital were used for control purposes. All patients with dysphagia were recruited for follow-up purposes at discharge. Four ward rounds at each intervention and control ward were observed pre and post I-MAG implementation to determine the level of medication administration errors. NHS ethical approval for the study was obtained. Results 164 I-MAGs were provided for 75 patients with dysphagia (PWDs) in the two intervention wards. At discharge, 23 patients in the intervention wards and 7 patients in the control wards were approached for recruitment, of whom 17 (74%) and 5 (71.5%) respectively consented. Discussion Recruitment rates were low at discharge due to the dysphagia remitting during hospitalisation. The introduction of the I-MAG demonstrated no effect on the quality of administration on the intervention ward and, interestingly, practice improved on the control ward. The observation of medication rounds at least one month post I-MAG removal may have identified a reversal to normal practice; ideally, observations should have been undertaken with I-MAGs in place. Identification of the reason for the improvement in the control ward is warranted.
Abstract:
The Australian water sector needs to adapt to effectively deal with the impacts of climate change on its systems. Challenges resulting from climate change include increasingly extreme weather events, including flooding and droughts (Pittock, 2011). In response to such challenges, the National Water Commission in Australia has identified the need for the water sector to transition towards being readily adaptable and able to respond to complex needs under a variety of supply and demand scenarios (National Water Commission, 2013). To successfully make this transition, the sector will need to move away from business as usual and proactively pursue and adopt innovative approaches and technologies as a means of addressing the impacts of climate change on the Australian water sector. In order to respond effectively to specific innovation challenges related to the sector, including climate change, it is first necessary to possess a foundational understanding of the key elements of innovation in the sector. This paper presents that base-level understanding, identifying the key barriers, drivers, enablers and other elements of innovative practice in the water sector. After initially reviewing the literature on the challenges the sector faces from climate change, the paper examines the findings from the initial two rounds of a modified Delphi study conducted with experts from the Australian water sector, including participants from research, government and industry backgrounds. The key barriers, drivers and enablers for innovation in the sector identified during the initial phase of the study formed the basis for the remainder of the investigation. Key elements investigated were: barriers – scepticism, regulation systems, inconsistent policy; drivers – influence of policy, resource scarcity, thought leadership; enablers – framing the problem, effective regulations, community acceptance. There is a convincing argument for the water sector transitioning to a more flexible, adaptive and responsive system in the face of challenges resulting from climate change. However, without first understanding the challenges and opportunities around making this transition, the likelihood of success is limited. For that reason, this paper takes the first step in understanding the elements surrounding innovation in the Australian water sector.