936 results for static random access memory
Abstract:
Modelling how a word is activated in human memory is an important requirement for determining the probability of recall of a word in an extra-list cueing experiment. The spreading activation, spooky-action-at-a-distance and entanglement models have all been used to model the activation of a word. Recently a hypothesis was put forward that the mean activation levels of the respective models are ordered as follows: Spreading ≤ Entanglement ≤ Spooky-action-at-a-distance. This article investigates this hypothesis by means of a substantial empirical analysis of each model using the University of South Florida word association, rhyme and word norms.
Abstract:
Current knowledge about the relationship between transport disadvantage and activity space size is limited to urban areas; as a result, very little is known about this link in a rural context. In addition, although research has identified transport disadvantaged groups based on the size of their activity spaces, these studies have not empirically explained such differences, and the result is often a poor identification of the problems facing disadvantaged groups. Research has shown that transport disadvantage varies over time, yet the static nature of activity space analysis in previous studies has limited its ability to identify transport disadvantage over time. Activity space is a dynamic concept, and it therefore has great potential to capture temporal variations in behaviour and access to opportunities. This research derives measures of the size and fullness of activity spaces for 157 individuals for weekdays, weekends, and a full week, using weekly activity-travel diary data from three case study areas located in rural Northern Ireland. Four focus groups were also conducted to triangulate the quantitative findings and to explain the differences between socio-spatial groups. The findings show that despite having smaller activity spaces, individuals were not disadvantaged, because they were able to access their required activities locally. Car ownership was found to be an important lifeline in rural areas; temporal disaggregation of the data reveals that this is true only on weekends, due to a lack of public transport services. In addition, despite activity spaces being of similar size, the fullness of the activity spaces of low-income individuals was found to be significantly lower than that of their high-income counterparts.
Focus group data show that financial constraints and poor connections, both between public transport services and between transport routes and activity opportunities, forced individuals to participate in activities located along the main transport corridors.
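The "size" of an activity space is commonly proxied by the area of the convex hull of a person's visited locations. The following is a minimal sketch of that standard proxy (the study above may use a different exact measure, and "fullness" is not computed here):

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); positive for a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def activity_space_area(points):
    """Shoelace area of the convex hull of visited (x, y) locations."""
    hull = convex_hull(points)
    return 0.5 * abs(sum(hull[i][0] * hull[(i + 1) % len(hull)][1]
                         - hull[(i + 1) % len(hull)][0] * hull[i][1]
                         for i in range(len(hull))))
```

For example, four locations at the corners of a unit square (plus any interior visits) yield an activity-space area of 1.0.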
Abstract:
Researchers are increasingly involved in data-intensive research projects that cut across geographic and disciplinary borders. Quality research now often involves virtual communities of researchers participating in large-scale web-based collaborations, opening their early-stage research to the research community in order to encourage broader participation and accelerate discoveries. The result of such large-scale collaborations has been the production of ever-increasing amounts of data. In short, we are in the midst of a data deluge. Accompanying these developments has been a growing recognition that if the benefits of enhanced access to research are to be realised, it will be necessary to develop the systems and services that enable data to be managed and secured. It has also become apparent that to achieve seamless access to data it is necessary not only to adopt appropriate technical standards, practices and architecture, but also to develop legal frameworks that facilitate access to and use of research data. This chapter provides an overview of the current research landscape in Australia as it relates to the collection, management and sharing of research data. The chapter then explains the Australian legal regimes relevant to data, including copyright, patent, privacy, confidentiality and contract law. Finally, this chapter proposes the infrastructure elements that are required for the proper management of legal interests, ownership rights and rights to access and use data collected or generated by research projects.
Abstract:
In 2001, amendments to the Migration Act 1958 (Cth) made possible the offshore processing of protection claims. The same amendments also foreshadowed the processing of claims by ‘offshore entry persons’ in Australia according to non-statutory procedures. After disbanding offshore processing, the then Rudd Labor Government commenced processing of protection claims by ‘offshore entry persons’ in Australia under the Refugee Status Assessment (RSA) process. The RSA process sought to substitute well-established legislative criteria for the grant of a protection visa, as interpreted by the courts, with administrative guidelines and decision-making immune from judicial review. This approach was rejected by the High Court in the cases M61 and M69. This article analyses these developments in light of Australia’s international protection obligations, as well as considering the practical obstacles that continue to confront offshore entry persons as they pursue judicial review of adverse refugee status determinations after the High Court’s decision.
Abstract:
In Viet Nam, standards of nursing care fail to meet international competency standards. This increases risks to patient safety (e.g. hospital-acquired infection); consequently, the Ministry of Health identified the need to strengthen nurse education in Viet Nam. This paper presents experiences of a piloted clinical teaching model developed in Ha Noi to strengthen nurse-led institutional capacity for in-service education and clinical teaching. Historically, 90% of nursing education was conducted by physicians, and professional development in hospitals for nurses was limited. There was minimal communication between hospitals and nursing schools about expectations of students, assessment, and the quality of the learning experience. As a result, when students came to the clinical sites, no one understood how to plan their learning objectives or utilise teaching and learning approaches appropriate to their level. Student learning outcomes were therefore variable: students focussed on procedures and techniques, on “learning how to do”, rather than learning how to plan, implement and evaluate patient care. This project is part of a multi-component capacity building program designed to improve nurse education in Viet Nam. The project was funded jointly by Queensland University of Technology (QUT) and the Australian Agency for International Development. Its aim was to develop a collaborative, clinically based model of teaching to create an environment that encourages evidence-based, student-centred clinical learning. Accordingly, the strategies introduced promoted clinical teaching of competency-based nursing practice utilising the regionally endorsed nurse core competency standards. Thirty nurse teachers from Viet Duc University Hospital and Hanoi Medical College participated in the program. These nurses and nurse teachers undertook face-to-face education in three workshops and completed three assessment items.
Assessment was applied: participants integrated the concepts learned in each workshop and completed assessment tasks related to planning, implementing and evaluating teaching in the clinical area. Twenty of these participants were then selected to undertake a two-week study tour in Brisbane, Australia, where the clinical teaching model was refined and an action plan developed for integration into both organisations, with possible implementation across Viet Nam. Participants on this study tour also experienced clinical teaching and learning at QUT by attending classes held at the university, and visited selected hospitals to experience clinical teaching in those settings as well. The effectiveness of the project was measured throughout the implementation phase and in follow-up visits to the clinical site. To date, changes have been noted at both the individual and organisational levels. Significant planning is also underway to incorporate the clinical teaching model across the organisation and to implement it in other regions. Two participants have also been involved in disseminating aspects of this approach to clinical teaching in Ho Chi Minh City, with further plans for more in-depth dissemination throughout the country.
Abstract:
The Malaysian National Innovation Model blueprint states that there is an urgent need to pursue an innovation-oriented economy to improve the nation’s capacity for knowledge, creativity and innovation. In nurturing a pervasive innovation culture, the Malaysian government declared 2010 an Innovative Year, in which creativity among its population is highly celebrated. However, while Malaysian citizens are encouraged to be creative and innovative, scientific data and information generated from publicly funded research in Malaysia are locked up because of rigid intellectual property licensing regimes and traditional publishing models. Reflecting on these circumstances, this paper examines, and argues why, scientific data and information should be made freely available, accessible and re-usable to promote grassroots innovation in Malaysia. Using innovation theory as its platform of argument, the paper calls for an open access policy for publicly funded research output to be adopted and implemented in Malaysia. Simultaneously, a normative analytic approach is used to determine the types of open access policy that ought to be adopted to spur greater innovation among Malaysians.
Abstract:
Computer resource allocation represents a significant challenge, particularly for multiprocessor systems, which consist of shared computing resources to be allocated among co-runner processes and threads. While efficient resource allocation results in a highly efficient and stable overall multiprocessor system and good individual thread performance, poor resource allocation causes significant performance bottlenecks even in a system with ample computing resources. This thesis proposes a cache-aware adaptive closed loop scheduling framework as an efficient resource allocation strategy for this highly dynamic resource management problem, which requires instant estimation of highly uncertain and unpredictable resource patterns. Many different approaches to this problem have been developed, but neither the dynamic nature nor the time-varying and uncertain characteristics of the resource allocation problem are well considered. These approaches employ either static or dynamic optimization methods, or advanced scheduling algorithms such as the Proportional Fair (PFair) scheduling algorithm. Some of the approaches that do consider the dynamic nature of multiprocessor systems apply only a basic closed loop system; hence, they fail to take the time-varying and uncertain nature of the system into account. Therefore, further research into multiprocessor resource allocation is required. Our closed loop cache-aware adaptive scheduling framework takes resource availability and resource usage patterns into account by measuring time-varying factors such as cache miss counts, stalls and instruction counts. More specifically, the cache usage pattern of a thread is identified using the QR recursive least squares (RLS) algorithm and cache miss count time series statistics. For the identified cache resource dynamics, our closed loop cache-aware adaptive scheduling framework enforces instruction fairness for the threads.
Fairness, in the context of our research project, is defined as resource allocation equity, which reduces co-runner thread dependence in a shared resource environment. In this way, instruction count degradation due to shared cache resource conflicts is overcome. In this respect, our closed loop cache-aware adaptive scheduling framework contributes to the research field in two major and three minor aspects. The two major contributions lead to the cache-aware scheduling system. The first major contribution is the development of the execution fairness algorithm, which mitigates the co-runner cache impact on thread performance. The second is the development of the relevant mathematical models, such as the thread execution pattern and cache access pattern models, which formulate the execution fairness algorithm in terms of mathematical quantities. Following the development of the cache-aware scheduling system, our adaptive self-tuning control framework is constructed to add an adaptive closed loop aspect to it. This control framework consists of two main components: the parameter estimator and the controller design module. The first minor contribution is the development of the parameter estimator; the QR recursive least squares (RLS) algorithm is applied within our framework to estimate the highly uncertain and time-varying cache resource patterns of threads. The second minor contribution is the design of the controller module; the algebraic controller design algorithm, pole placement, is utilized to design the relevant controller, which is able to provide the desired time-varying control action. The adaptive self-tuning control framework and the cache-aware scheduling system together constitute our final framework, the closed loop cache-aware adaptive scheduling framework.
The third minor contribution is the validation of the efficiency of this cache-aware adaptive closed loop scheduling framework in overcoming co-runner cache dependency. Time-series statistical counters were developed for the M-Sim Multi-Core Simulator, and the theoretical findings and mathematical formulations were implemented as MATLAB m-file code. In this way, the overall framework was tested and the experiment outcomes analyzed. According to our experiment outcomes, we conclude that our closed loop cache-aware adaptive scheduling framework successfully drives the co-runner cache-dependent thread instruction count to the co-runner-independent instruction count, with an error margin of up to 25% when the cache is highly utilized. In addition, the thread cache access pattern is estimated with 75% accuracy.
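The parameter estimator above is built on recursive least squares. As a rough illustration of that underlying technique (not the thesis code, which targets the M-Sim simulator and MATLAB), the sketch below runs a standard RLS update fitting an AR(2) model to a synthetic "cache miss count" series; the forgetting factor `lam` is what lets the estimate track time-varying patterns:

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.99):
    """One recursive least squares step with forgetting factor lam.

    theta: current parameter estimate (n x 1); P: covariance (n x n);
    x: regressor vector (n,); y: new observation (scalar)."""
    x = x.reshape(-1, 1)
    Px = P @ x
    k = Px / (lam + (x.T @ Px).item())   # gain vector
    e = y - (x.T @ theta).item()         # one-step prediction error
    theta = theta + k * e
    P = (P - k @ Px.T) / lam             # covariance update
    return theta, P

# Synthetic "cache miss" series following y_t = 0.6*y_{t-1} + 0.3*y_{t-2} + noise.
rng = np.random.default_rng(0)
series = [1.0, 1.2]
for _ in range(500):
    series.append(0.6 * series[-1] + 0.3 * series[-2] + 0.01 * rng.standard_normal())

theta = np.zeros((2, 1))       # initial coefficient estimates
P = np.eye(2) * 1000.0         # large initial covariance = weak prior
for t in range(2, len(series)):
    x = np.array([series[t - 1], series[t - 2]])
    theta, P = rls_update(theta, P, x, series[t])
# theta now approximates the true AR coefficients [0.6, 0.3]
```

In the thesis setting, the recovered coefficients would feed the pole-placement controller design rather than be inspected directly.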
Abstract:
This project involved the complete refurbishment and extension of a 1980s two-storey domestic brick building, previously used as a Boarding House (Class 3), into Middle School facilities (Class 9b) on a heritage-listed site at Nudgee College secondary school, Brisbane. The building now accommodates 12 technologically advanced classrooms, a computer lab and learning support rooms, a tuckshop, an art room, a mini library/reading/stage area, dedicated work areas for science and large projects with access to water on both floors, staff facilities, and an undercover play area suitable for assemblies and presentations. The project was based on a Reggio Emilia approach, in which the organisation of the physical environment is referred to as the child’s third teacher, creating opportunities for complex, varied, sustained and changing relationships between people and ideas. Classrooms open to a communal centre piazza and are integrated with the rest of the school, and the school with the surrounding community. To achieve this linkage of the building with the overall masterplan of the site, a key strategy of the internal planning was to orientate teaching areas around a well-defined active circulation space that breaks out of the building form to legibly define the new access points to the building and connect to the pathway network of the campus. The width of the building allowed for classrooms and a generous corridor that has become ‘breakout’ teaching areas for art, IT, and small-group activities. Large sliding glass walls allow teachers to maintain supervision of students across all areas and allow maximum light penetration through small domestic window openings into the deep, low-height spaces. The building was also designed with an effort to uphold cultural characteristics from the Edmund Rice Education Charter (2004).
Coherent planning is accompanied by a quality fit-out, creating a vibrant and memorable environment in which to deliver the upper primary curriculum. Consistent with the Reggio Emilia approach, materials expressive of the school’s colours are used in a contemporary, adventurous manner to create panels of colour useful for massing and for defining the ‘breakout’ teaching areas and paths of travel, and storage elements are detailed and arranged to draw attention to their aesthetic features. Modifications were difficult due to the random placement of load-bearing walls, minimum ceiling heights, the general standard of finishes, and new fire and energy requirements; however, the reuse of this building was assessed to be up to 30% cheaper than an equivalent new building. The fit-out integrates information technology and services at a level not usually found in primary school facilities. This has been achieved within the existing building fabric through thoughtful detailing and co-ordination with allied disciplines.
Abstract:
Various time-memory tradeoff attacks on stream ciphers have been proposed over the years. However, the claimed success of these attacks assumes that the initialisation process of the stream cipher is one-to-one, and some stream cipher proposals do not have a one-to-one initialisation process. In this paper, we examine the impact of this on the success of time-memory-data tradeoff attacks. Under these circumstances, some attacks are more successful than previously claimed while others are less so. The conditions for both cases are established.
Abstract:
Security and privacy concerns in electronic health record systems have been hindering the growth of e-health systems since their emergence. The development of policies that satisfy the security and privacy requirements of the different stakeholders in healthcare has proven to be difficult, but these requirements must be met if the systems developed are to achieve their intended goals. Access control is a fundamental security mechanism for securing data in healthcare information systems. In this paper we present an access control model for electronic health records (EHRs). We address patient privacy requirements, confidentiality of private information, and the need for flexible access to EHRs for health professionals. We carefully combine three existing access control models to produce a novel access control model for EHRs that satisfies these requirements.
Abstract:
Finite Element Modeling (FEM) has become a vital tool in automotive design and development processes. FEM of the human body is a technique capable of estimating parameters that are difficult to measure in experimental studies, with the human body segments modeled as complex and dynamic entities. Several studies have been dedicated to attaining close-to-real FEMs of the human body (Pankoke and Siefert 2007; Amann, Huschenbeth et al. 2009; ESI 2010). The aim of this paper is to identify and appraise the state-of-the-art models of the human body which incorporate detailed pelvis and/or lower extremity models. Six databases and search engines were used to obtain literature, and the search was limited to studies published in English since 2000. The initial search identified 636 pelvis-related papers, 834 buttocks-related papers, 505 thigh-related papers, 927 femur-related papers, 2039 knee-related papers, 655 shank-related papers, 292 tibia-related papers, 110 fibula-related papers, 644 ankle-related papers, and 5660 foot-related papers. A refined search returned 100 pelvis-related papers, 45 buttocks-related papers, 65 thigh-related papers, 162 femur-related papers, 195 knee-related papers, 37 shank-related papers, 80 tibia-related papers, 30 fibula-related papers, 102 ankle-related papers, and 246 foot-related papers. The refined literature list was further restricted by appraisal against modified LOW appraisal criteria. Studies with unclear methodologies, with a focus on populations with pathology, or with sport-related dynamic motion modeling were excluded.
The final literature list included fifteen models, and each was assessed against the percentile the model represents, the gender the model was based on, the human body segment or segments included in the model, the sample size used to develop the model, the source of the geometric/anthropometric values used, the posture the model represents, and the finite element solver used. The results of this literature review provide an indication of bias in the available models towards 50th-percentile male modeling, with a notable concentration on the pelvis, femur and buttocks segments.
Abstract:
Background: Access to cardiac services is essential for the appropriate implementation of evidence-based therapies to improve outcomes. The Cardiac Accessibility and Remoteness Index for Australia (Cardiac ARIA) aimed to derive an objective, geographic measure reflecting access to cardiac services. Methods: An expert panel defined an evidence-based clinical pathway. Using Geographic Information Systems (GIS), a numeric/alpha index was developed at two points along the continuum of care. The acute category (numeric) measured the time from the emergency call to arrival at an appropriate medical facility via road ambulance. The aftercare category (alpha) measured access to four basic services (family doctor, pharmacy, cardiac rehabilitation, and pathology services) when a patient returned to their community. Results: The numeric index ranged from 1 (access to a principal referral centre with a cardiac catheterization service within 1 hour) to 8 (no ambulance service, more than 3 hours to a medical facility, air transport required). The alphabetic index ranged from A (all 4 services available within a 1-hour drive-time) to E (no services available within 1 hour). 13.9 million Australians (71%) resided within Cardiac ARIA 1A locations (hospital with cardiac catheterization laboratory and all aftercare services within 1 hour). People aged over 65 years (32%) and Indigenous people (60%) were over-represented outside Cardiac ARIA 1A locations. Conclusion: The Cardiac ARIA index demonstrated substantial inequity in access to cardiac services in Australia. This methodology can be used to inform cardiology health service planning, and it could be applied to other common disease states in other regions of the world.
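The alpha (aftercare) component of such an index can be illustrated with a small classifier. Only the endpoints are given in the abstract (A = all four services within a 1-hour drive, E = none); the intermediate letters B-D below, and the service keys, are illustrative assumptions, not the published Cardiac ARIA definitions:

```python
# Hypothetical sketch of an aftercare (alpha) index. The service names and the
# B-D cutpoints are illustrative assumptions, not the published Cardiac ARIA rules.
SERVICES = ("family_doctor", "pharmacy", "cardiac_rehab", "pathology")

def aftercare_index(drive_times_hours):
    """Map drive times (hours; missing/None = unavailable) to a letter A-E."""
    reachable = sum(
        1 for s in SERVICES
        if drive_times_hours.get(s) is not None and drive_times_hours[s] <= 1.0
    )
    # 4 reachable -> 'A', 3 -> 'B', ..., 0 -> 'E' (assumed linear mapping)
    return "ABCDE"[len(SERVICES) - reachable]
```

For example, a community with all four services within an hour's drive classifies as "A", while one with no reachable services classifies as "E".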
Abstract:
Introduction: The suitability of video conferencing (VC) technology for clinical purposes relevant to geriatric medicine is still being established. This project aimed to determine the validity of diagnosing dementia via VC. Methods: This was a multisite, noninferiority, prospective cohort study. Patients aged 50 years and older, referred by their primary care physician for cognitive assessment, were assessed at 4 memory disorder clinics. All patients were assessed independently by 2 specialist physicians. They were allocated one face-to-face (FTF) assessment (the reference standard, usual clinical practice) and an additional assessment (either a usual FTF assessment or a VC assessment) on the same day. Each specialist physician had access to the patient chart and the results of a battery of standardized cognitive assessments administered FTF by the clinic nurse. Percentage agreement (P0) and the weighted kappa statistic with linear weights (Kw) were used to assess inter-rater reliability across the 2 study groups on the diagnosis of dementia (cognition normal, impaired, or demented). Results: The 205 patients were allocated to groups: videoconference (n = 100) or standard practice (n = 105); 106 were men. The average age was 76 years (SD 9, range 51–95) and the average Standardized Mini-Mental State Examination score was 23.9 (SD 4.7, range 9–30). Agreement for the videoconference group (P0 = 0.71; Kw = 0.52; P < .0001) and for the standard practice group (P0 = 0.70; Kw = 0.50; P < .0001) were both statistically significant (P < .05). The summary kappa statistic of 0.51 (P = .84) indicated that VC was not inferior to FTF assessment. Conclusions: Previous studies have shown that preliminary standardized assessment tools can be reliably administered and scored via VC.
This study focused on the geriatric assessment component of the interview (interpretation of standardized assessments, taking a history, and formulating a diagnosis by a medical specialist) and identified high levels of agreement for diagnosing dementia. A model of service incorporating either locally or remotely administered standardized assessments, together with remote specialist assessment, is a reliable process for enabling the diagnosis of dementia for isolated older adults.
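The linearly weighted kappa (Kw) used in the study can be computed directly from the two raters' category assignments. A minimal sketch follows, with the three ordered outcomes coded 0 = normal, 1 = impaired, 2 = demented (the numeric coding is an assumption for illustration):

```python
import numpy as np

def linear_weighted_kappa(rater_a, rater_b, k):
    """Cohen's kappa with linear weights for k ordered categories coded 0..k-1."""
    O = np.zeros((k, k))                  # observed cross-classification counts
    for a, b in zip(rater_a, rater_b):
        O[a, b] += 1
    n = O.sum()
    # expected counts under rater independence (product of marginals / n)
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / n
    # linear disagreement weights: |i - j| / (k - 1)
    W = np.abs(np.subtract.outer(np.arange(k), np.arange(k))) / (k - 1)
    return 1.0 - (W * O).sum() / (W * E).sum()
```

Perfect agreement yields 1.0; adjacent-category disagreements (e.g. normal vs. impaired) are penalised half as much as normal vs. demented, which is why the weighted form suits ordered diagnostic categories.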
Abstract:
As computers approach the physical limits of the information storable in memory, new methods will be needed to further improve information storage and retrieval. We propose a quantum-inspired, vector-based approach, which offers a contextually dependent mapping from subsymbolic to symbolic representations of information. If implemented computationally, this approach would provide an exceptionally high density of information storage without the traditionally required physical increase in storage capacity. The approach is inspired by the structure of human memory and incorporates elements of Gardenfors’ Conceptual Space approach and Humphreys et al.’s matrix model of memory.