Abstract:
Given identified synergies between information use and health status, greater understanding is needed of how people use information to learn about their health. This article presents the findings of preliminary research into health information literacy, which sought to explore how this phenomenon is experienced among ageing Australians. Analysis of data from semi-structured interviews revealed six different ways ageing Australians experience using information to learn about their health within one aspect of community life. Health information literacy is new terrain for information literacy research, and one which warrants further attention from the profession, to be fostered and promoted within the community.
Abstract:
The effects of sintering on several properties of FTO and ITO substrates used in dye-sensitised solar cells (DSCs) have been investigated. FTO and ITO substrates were prepared in a range of sizes and aspect ratios, emulating laboratory-style test cells through to prototype modules. The time and temperature of the sintering profiles were varied, and sheet resistance and flatness were measured. Electrical properties of the substrates were then further characterised by electrochemical impedance spectroscopy; module-sized devices were assembled, and thickness variations over the device area were determined and related to performance.
Abstract:
Biomechanical and biophysical principles can be applied to study biological structures in their modern or fossil form. Bone is an important tissue in paleontological studies, as it is a commonly preserved element in most fossil vertebrates and often allows microstructures such as lacunae and canaliculi to be studied in detail. In this context, the principles of fluid mechanics and scaling laws have previously been applied to enhance the understanding of bone microarchitecture and its implications for the evolution of hydraulic structures to transport fluid. It has been shown that the microstructure of bone has evolved to maintain efficient transport between the nutrient supply and cells, the living components of the tissue. Application of the principle of minimal expenditure of energy to this analysis shows that a path distance comprising five or six lamellar regions represents an effective limit for fluid and solute transport between the nutrient supply and cells; beyond this threshold, hydraulic resistance in the network increases and additional energy expenditure is necessary for further transport. This suggests an optimisation of the size of bone's building blocks (such as osteon or trabecular thickness) to meet metabolic demand concomitant with minimal energy expenditure. This biomechanical aspect of bone microstructure is corroborated by the ratio of osteon to Haversian canal diameters and the scaling constants of the several mammals considered in this study. This aspect of vertebrate bone microstructure and physiology may provide a basis for understanding the form and function relationship in both extinct and extant taxa.
Abstract:
Globalisation and societal change suggest that the language and literacy skills needed to make meaning in our lives are expanding and changing radically. Multiliteracies are influencing the future of literacy teaching. One aspect of the pedagogy of multiliteracies is recruiting learners' previous and current experiences as an integral part of the learning experience. This paper examines the implications of results from a project that investigated student responses to a postmodern picture book, in particular, ways teachers might develop students' self-knowledge about reading. It draws on Freebody and Luke's Four Resources Model of Reading and recently developed models for teaching multiliteracies.
Abstract:
Aim. This paper is a report of a study conducted to explore the impact of preidentified contextual themes (related to work environment and socialisation) on nursing medication practice. Background. Medication administration is a complex aspect of paediatric nursing and an important component of day-to-day nursing practice. Many attempts are being made to improve patient safety, but many errors remain. Identifying and understanding factors that influence medication administration errors are of utmost importance. Method. A cross-sectional survey was conducted with a sample of 278 paediatric nurses from the emergency department, intensive care unit and medical and surgical wards of an Australian tertiary paediatric hospital in 2004. Results. Completed questionnaires were returned by 185 nurses (a response rate of 67%). Contextual influences were important in determining how closely medication policy was followed. Younger nurses aged <34 years thought that their medication administration practice could be influenced by the person with whom they checked the drugs (P = 0.001), and that there were daily circumstances when it was acceptable not to adhere strictly to medication policy (P < 0.001), including choosing between following policy and acting in the best interests of the child (P = 0.002). Senior nurses agreed that senior staff dictate acceptable levels of medication policy adherence through role modelling (P = 0.01). Less experienced nurses reported greater confidence with computer literacy (P < 0.001). Conclusions. Organisations need to employ multidisciplinary education programmes to promote universal understanding of, and adherence to, medication policies. Skill mix should be closely monitored to ensure adequate support for new and junior staff.
Abstract:
Researching administrative history is problematic. A trail of authoritative documents is often hard to find, and useful summaries can be difficult to organise, especially if source material is in paper formats in geographically dispersed locations. In the absence of documents, the reasons for particular decisions and the rationale underpinning particular policies can be confounded as key personnel advance in their professions and retire. The rationale for past decisions may be lost for practical purposes; and if an organisation's memory of events is diminished, its learning through experience is also diminished. This document is published to help other researchers avoid unnecessary duplication of effort when investigating how policies of charging for public sector information have been justified. The author compiled this work within a somewhat limited time period, and it does not pretend to be a complete or comprehensive analysis of the issues.

A significant part of the role of government is to provide a framework of legally-enforceable rights and obligations that can support individuals and non-government organisations in their lawful activities. Accordingly, claims that governments should be more 'business-like' need careful scrutiny. A significant supply of goods and services occurs as non-market activity, where neither benefits nor costs are quantified within conventional accounting systems or in terms of money. Where a government decides to provide information as a service (information from land registries is archetypical), the transactions occur as a political decision made under a direct or clearly delegated authority of a parliament with the requisite constitutional powers. This is not a market transaction, and the language of the market confuses attempts to describe how governments allocate resources.

Cost recovery can be construed as an aspect of taxation, which is a sole prerogative of a parliament. The issues are fundamental to political constitutions, but they become more complicated where states cede some taxing powers to a central government as part of a federal system. Nor should the absence of markets necessarily be construed as 'market failure' or even 'government failure'. The absence is often attributable to particular technical, economic and political constraints that preclude the operation of markets. Arguably, greater care is needed in distinguishing between the polity and markets when raising revenues and allocating resources, and that needs to start by removing unhelpful references to 'business' in the context of government decision-making.
Abstract:
Ecological problems are typically multifaceted and need to be addressed from both a scientific and a management perspective. There is a wealth of modelling and simulation software available, each tool designed to address a particular aspect of the issue of concern. Choosing the appropriate tool, making sense of the disparate outputs, and taking decisions when little or no empirical data are available are everyday challenges facing the ecologist and environmental manager. Bayesian Networks (BNs) provide a statistical modelling framework that enables analysis and integration of information in its own right, as well as integration of a variety of models addressing different aspects of a common overall problem. There has been increased interest in the use of BNs to model environmental systems and issues of concern. However, the development of more sophisticated BNs, utilising dynamic and object-oriented (OO) features, is still at the frontier of ecological research. Such features are particularly appealing in an ecological context, since the underlying facts are often spatial and temporal in nature. This thesis focuses on an integrated BN approach which facilitates OO modelling. Our research devises a new heuristic method, the Iterative Bayesian Network Development Cycle (IBNDC), for the development of BN models within a multi-field and multi-expert context. Expert elicitation is a popular method used to quantify BNs when data are sparse but expert knowledge is abundant. The resulting BNs need to be substantiated and validated taking this uncertainty into account, and our research demonstrates the application of the IBNDC approach to support these aspects of BN modelling. The complex nature of environmental issues makes them ideal case studies for the proposed integrated approach to modelling.

Moreover, such issues lend themselves to a series of integrated sub-networks describing different scientific components, combining scientific and management perspectives, or pooling similar contributions developed in different locations by different research groups. In southern Africa, the two largest free-ranging cheetah (Acinonyx jubatus) populations are in Namibia and Botswana, where the majority of cheetahs are located outside protected areas. Consequently, cheetah conservation in these two countries is focussed primarily on the free-ranging populations and on mitigating conflict between humans and cheetahs. In contrast, in neighbouring South Africa the majority of cheetahs are found in fenced reserves; nonetheless, conflict between humans and cheetahs remains an issue there. Conservation effort in South Africa is also focussed on managing the geographically isolated cheetah populations as one large meta-population. Relocation is one option among a suite of tools used to resolve human-cheetah conflict in southern Africa. Successfully relocating captured problem cheetahs, and maintaining a viable free-ranging cheetah population, are the two environmental issues in cheetah conservation forming the first case study in this thesis. The second case study involves the initiation of blooms of Lyngbya majuscula, a blue-green alga, in Deception Bay, Australia. L. majuscula is a toxic alga whose blooms have severe health, ecological and economic impacts on the communities located in their vicinity. Deception Bay is an important tourist destination given its proximity to Brisbane, Australia's third largest city. Lyngbya is one of several algae that form Harmful Algal Blooms (HABs), a group that includes other widespread blooms such as red tides. The occurrence of Lyngbya blooms is not a local phenomenon; blooms of this toxic weed occur in coastal waters worldwide.

With the increase in frequency and extent of these HABs, it is important to gain a better understanding of the underlying factors contributing to their initiation and persistence. This knowledge will contribute to better management practices and the identification of management actions which could prevent or diminish the severity of these blooms.
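The kind of reasoning a BN encodes can be illustrated with the smallest possible example: a two-node network with one parent and one child, queried by enumeration and Bayes' rule. The variables and all probabilities below are invented for illustration, not taken from the thesis models:

```python
# Two-node discrete Bayesian network: Nutrients -> Bloom.
# All probabilities are hypothetical illustration values.
p_nutrients_high = 0.3
p_bloom_given = {True: 0.6, False: 0.1}   # P(bloom | nutrients high/low)

# Marginal P(bloom) by enumeration over the parent's states,
# the basic inference step any BN engine performs.
p_bloom = (p_nutrients_high * p_bloom_given[True]
           + (1 - p_nutrients_high) * p_bloom_given[False])

# Posterior P(nutrients high | bloom observed), by Bayes' rule:
# observing the child updates belief about the parent.
posterior = p_nutrients_high * p_bloom_given[True] / p_bloom
print(round(p_bloom, 2), round(posterior, 2))  # 0.25 0.72
```

Real environmental BNs of the kind the thesis develops chain many such nodes, with conditional probability tables elicited from experts where data are sparse; the arithmetic of inference remains this same sum-and-normalise pattern.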
Abstract:
Automatic recognition of people is an active field of research with important forensic and security applications. In these applications, it is not always possible for the subject to be in close proximity to the system. Voice represents a human behavioural trait which can be used to recognise people in such situations. Automatic Speaker Verification (ASV) is the process of verifying a person's identity through the analysis of their speech, and enables recognition of a subject at a distance over a telephone channel, whether wired or wireless. A significant amount of research has focussed on the application of Gaussian mixture model (GMM) techniques to speaker verification systems, providing state-of-the-art performance. GMMs are a type of generative classifier trained to model the probability distribution of the features used to represent a speaker. Recently introduced to the field of ASV research is the support vector machine (SVM). An SVM is a discriminative classifier requiring examples from both positive and negative classes to train a speaker model. The SVM is based on margin maximisation, whereby a hyperplane attempts to separate classes in a high-dimensional space. SVMs applied to the task of speaker verification have shown high potential, particularly when used to complement current GMM-based techniques in hybrid systems. This work aims to improve the performance of ASV systems using novel and innovative SVM-based techniques. Research was divided into three main themes: session variability compensation for SVMs; unsupervised model adaptation; and impostor dataset selection. The first theme investigated the differences between the GMM and SVM domains for the modelling of session variability, an aspect crucial for robust speaker verification. Techniques developed to improve the robustness of GMM-based classification were shown to bring about similar benefits to discriminative SVM classification through their integration in the hybrid GMM mean supervector SVM classifier.

Further, the domains for the modelling of session variation were contrasted and a number of common factors found; however, the SVM domain consistently provided marginally better session variation compensation. Minimal complementary information was found between the techniques, owing to the similarities in how they achieve their objectives. The second theme saw the proposal of a novel model for session variation compensation in ASV systems. Continuous progressive model adaptation attempts to improve speaker models by retraining them after exploiting all test utterances encountered during normal use of the system. The introduction of the weight-based factor analysis model provided significant performance improvements of over 60% in an unsupervised scenario. SVM-based classification was then integrated into the progressive system, providing further performance benefits over the GMM counterpart. Analysis demonstrated that SVMs also hold several characteristics beneficial to the task of unsupervised model adaptation, prompting further research in the area. In pursuing the final theme, an innovative background dataset selection technique was developed. This technique selects the most appropriate subset of examples from a large and diverse set of candidate impostor observations for use as the SVM background, by exploiting the SVM training process. The selection is performed on a per-observation basis so as to overcome the shortcomings of the traditional heuristic-based approach to dataset selection. Results demonstrate that the approach provides performance improvements over both the use of the complete candidate dataset and the best heuristically-selected dataset, while being only a fraction of the size. The refined dataset was also shown to generalise well to unseen corpora and to be highly applicable to the selection of impostor cohorts required in alternative techniques for speaker verification.
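The generative GMM side of the hybrid approach can be sketched in a few lines: the model scores a feature by its likelihood under a weighted sum of Gaussians, and verification compares a speaker model's score against a background model's. All parameters and the feature value below are invented toy values, not those of the systems in the thesis:

```python
import math

def gmm_log_likelihood(x, weights, means, variances):
    """Log-likelihood of a scalar feature under a 1-D Gaussian mixture:
    log( sum_k w_k * N(x; mu_k, sigma_k^2) )."""
    total = 0.0
    for w, mu, var in zip(weights, means, variances):
        total += w * math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
    return math.log(total)

# Toy speaker model vs. universal background model (invented parameters):
speaker = ([0.5, 0.5], [1.0, 3.0], [0.5, 0.5])
background = ([0.5, 0.5], [0.0, 5.0], [1.0, 1.0])

x = 1.2  # one cepstral-like feature value (assumed)
# Verification score: log-likelihood ratio of speaker vs. background.
score = gmm_log_likelihood(x, *speaker) - gmm_log_likelihood(x, *background)
print(score > 0)  # accept when the score exceeds a tuned threshold (here 0)
```

In the hybrid GMM mean supervector SVM classifier the abstract describes, the per-speaker Gaussian means are stacked into a high-dimensional supervector that becomes the input to a discriminative SVM rather than being scored by a likelihood ratio.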
Abstract:
Objective: To investigate the acute effects of isolated eccentric and concentric calf muscle exercise on Achilles tendon sagittal thickness.
Design: Within-subject, counterbalanced, mixed design.
Setting: Institutional.
Participants: 11 healthy, recreationally active male adults.
Interventions: Participants performed an exercise protocol involving isolated eccentric loading of the Achilles tendon of a single limb and isolated concentric loading of the contralateral limb, both with the addition of 20% bodyweight.
Main outcome measurements: Sagittal sonograms were acquired prior to, immediately following, and 3, 6, 12 and 24 h after exercise. Tendon thickness was measured 2 cm proximal to the superior aspect of the calcaneus.
Results: Both loading conditions resulted in an immediate decrease in normalised Achilles tendon thickness. Eccentric loading induced a significantly greater decrease than concentric loading despite a similar impulse (−0.21 vs −0.05, p<0.05). Post-exercise, eccentrically loaded tendons recovered exponentially, with a recovery time constant of 2.5 h. The same exponential function did not adequately model changes in tendon thickness resulting from concentric loading; even so, recovery pathways subsequent to the 3 h time point were comparable. Regardless of the exercise protocol, full tendon thickness recovery was not observed until 24 h.
Conclusions: Eccentric loading invokes a greater reduction in Achilles tendon thickness immediately after exercise, but tendons appear to recover fully in a similar time frame to concentric loading.
Abstract:
While my PhD is practice-led research, it is my contention that such an inquiry cannot develop as long as it tries to emulate other models of research. I assert that practice-led research needs to account for an epistemological unknown or uncertainty central to the practice of art. By focusing on what I call the artist's 'voice', I show how this 'voice' comprises a dual motivation ('articulate' representation and 'inarticulate' affect) which does not even necessarily derive from the artist. Through an analysis of art-historical precedents, critical literature (the work of Jean-François Lyotard and Andrew Benjamin, and the critical methods of philosophy, phenomenology and psychoanalysis) as well as of my own painting and digital arts practice, I aim to demonstrate how this unknown or uncertain aspect of artistic inquiry can be mapped. It is my contention that practice-led research needs to address and account for this dualistic 'voice' in order to more comprehensively articulate its unique contribution to research culture.
Abstract:
Principal Topic: Nascent entrepreneurship has drawn the attention of scholars in the last few years (Davidsson, 2006; Wagner, 2004). However, most studies have asked why firms are created, focussing on questions such as what are the characteristics (Delmar and Davidsson, 2000) and motivations (Carter, Gartner, Shaver & Reynolds, 2004) of nascent entrepreneurs, or what are the success factors in venture creation (Davidsson & Honig, 2003; Delmar and Shane, 2004). In contrast, the question of how companies emerge is still in its infancy. On the theoretical side, effectuation, developed by Sarasvathy (2001), offers one view of the strategies that may be at work during the venture creation process. Causation, the theorised inverse of effectuation, may be described as a rational reasoning method for creating a company: after a comprehensive market analysis to discover opportunities, the entrepreneur selects the alternative with the highest expected return and implements it through the use of a business plan. In contrast, effectuation suggests that the future entrepreneur will develop her new venture in a more iterative way, selecting possibilities through flexibility and interaction with the market, affordability of loss of the resources and time invested, and development of pre-commitments and alliances with stakeholders. Another contrasting point is that causation is 'goal driven' while an effectual approach is 'means driven' (Sarasvathy, 2001). One of the predictions of effectuation theory is that effectuation is more likely to be used by entrepreneurs early in the venture creation process (Sarasvathy, 2001). However, this temporal aspect and the impact of an effectuation strategy on venture outcomes have so far not been systematically and empirically tested on large samples. The reason behind this research gap is twofold. First, few studies collect longitudinal data on emerging ventures at an early enough stage of development to avoid severe survivor bias. Second, the studies that collect such data have not included validated measures of effectuation. The research we are conducting attempts to partially fill this gap by combining an empirical investigation of a large sample of nascent and young firms with the effectuation/causation continuum as a basis (Sarasvathy, 2001). The objectives are to understand the strategies used by firms during the creation process and to measure their impacts on firm outcomes.

Methodology/Key Propositions: This study draws its data from the first wave of the CAUSEE project, in which 28,383 Australian households were randomly contacted by phone using a specific methodology to capture emerging firms (Davidsson, Steffens, Gordon, Reynolds, 2008). This screening led to the identification of 594 nascent ventures (i.e., firms that are not yet operating) and 514 young firms (i.e., firms that have been operating since 2004) that were willing to participate in the study. Comprehensive phone interviews were conducted with these 1,108 ventures. In a likewise comprehensive follow-up 12 months later, 80% of the eligible cases completed the interview. The questionnaire contains specific sections designed to distinguish effectual and causal processes, innovation, gestation activities, business idea changes and venture outcomes. The effectuation questions are based on the components of effectuation strategy as described by Sarasvathy (2001), namely flexibility, affordable loss and pre-commitment from stakeholders. Results from two rounds of pre-testing informed the design of the instrument included in the main survey. The first two waves of data will be used to test and compare the use of effectuation in the venture creation process. To increase the robustness of the results, temporal use of effectuation will be tested both directly and indirectly:
1. By comparing the use of effectuation in nascent and young firms from wave 1 to wave 2, we will be able to find out how effectuation is affected by time over a 12-month period and whether the stage of venture development has an impact on its use.
2. By comparing nascent ventures early in the creation process with nascent ventures late in the creation process. Early versus late can be determined with the help of the time-stamped gestation activity questions included in the survey. This will help us determine change on a small time scale during the creation phase of the venture.
3. By comparing nascent firms with young (already operational) firms.
4. By comparing young firms becoming operational in 2006 with those first becoming operational in 2004.

Results and Implications: Wave 1 and 2 data collection has been completed, and wave 2 data are currently being checked and cleaned. Analysis work will commence in September 2009. This paper is expected to contribute to the body of knowledge on effectuation by quantitatively measuring its use and its impact on nascent and young firms' activities at different stages of their development. In addition, this study will increase understanding of the venture creation process by comparing nascent and young firms over time in a large sample of randomly selected ventures. We acknowledge that the results from this study will be preliminary and will have to be interpreted with caution, as the changes identified may be due to several factors and may not be attributable solely to the use or non-use of effectuation. Nevertheless, we believe that this study is important to the field of entrepreneurship, as it provides some much-needed insights into the processes used by nascent and young firms during their creation and early operating stages.
Abstract:
The evaluation of satisfaction levels related to performance is an important aspect of increasing market share, improving profitability and enlarging opportunities for repeat business, and can lead to the identification of areas to be improved, more harmonious working relationships and conflict avoidance. In the construction industry, it can also result in improved project quality, enhanced reputation and increased competitiveness. Many conceptual models have been developed to measure satisfaction levels, typically to gauge client satisfaction, customer satisfaction and home buyer satisfaction, but limited empirical research has been carried out, especially in investigating the satisfaction of construction contractors. To address this, this paper provides a conceptual model or framework for contractor satisfaction based on attributes identified through interviews with practitioners in Malaysia. In addition to progressing research in this topic and being of potential benefit to Malaysian contractors, it is anticipated that the framework will also be useful for other parties, such as clients, designers, subcontractors and suppliers, in enhancing the quality of products and/or services generally.
Abstract:
Asset management in local government is an emerging discipline and, over the past decade, has become a crucial aspect of building a more efficient and effective organisation. One crucial feature of public asset management is performance measurement of public real estate. Such measurement looks critically at this important component of public wealth and seeks to apply a standard of economic efficiency and effective organisational management, especially under global financial crisis conditions. This paper aims to identify the effects of the global economic crisis and proposes an alternative means for local governments to soften the impact of the crisis on their organisations. The study found that the most suitable way for local governments in Indonesia to respond to the global economic crisis is the application of performance measurement in their asset management; it is therefore important to develop a performance measurement system within the local government asset management process. The study draws its suggestions from published documents and literature, and also discusses the elements of public real estate performance measurement. The measurement of performance has become an essential component of the strategic thinking of asset owners and managers: without a formal performance measurement system, it is difficult to plan, control and improve a local government real estate management system. A close look at best practice in the public sector reveals that, in most cases, these practices were transferred from private sector real estate management under the direction of real estate experts retained by government. One of the most significant advances in government property performance measurement resulted from the recognition that the methodology used by private sector, non-real-estate corporations for managing their real property offered a valuable prototype for local governments.

In general, two approaches are most frequently used to measure the performance of public organisations: subjective and objective measures. Finally, findings from this study provide useful input for local government policy makers, scholars and asset management practitioners in establishing a public real estate performance measurement system, working towards local governments that are more efficient and effective in managing their assets and that deliver better public services, in order to soften the impact of the global financial crisis.
Abstract:
Structural changes in intercalated kaolinite after wet ball-milling were examined by scanning electron microscopy (SEM), X-ray diffraction (XRD), specific surface area (SSA) measurements and Fourier transform infrared spectroscopy (FTIR). The X-ray diffraction pattern at room temperature indicated that the intercalation of potassium acetate into kaolinite increases the basal spacing from 0.718 to 1.42 nm. With the reduction in particle size, the surface area increased sharply upon intercalation and delamination by ball-milling. Wet ball-milling of the intercalated kaolinite did not change the structural order, and SEM images show that the particulates have a high aspect ratio.
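The reported change in basal spacing maps directly onto the position of the basal reflection in the XRD pattern through Bragg's law. A short check, assuming Cu K-alpha radiation (wavelength 0.15406 nm; the anode is not stated in the abstract, so this is an assumption):

```python
import math

def two_theta_deg(d_nm, wavelength_nm=0.15406):
    """Bragg's law for first order (n = 1): lambda = 2 * d * sin(theta).
    Returns the diffraction angle 2-theta in degrees for spacing d.
    Cu K-alpha wavelength is assumed, not stated in the abstract."""
    return 2 * math.degrees(math.asin(wavelength_nm / (2 * d_nm)))

# Basal spacings reported before and after potassium acetate intercalation.
print(round(two_theta_deg(0.718), 1))  # ~12.3 degrees (pristine kaolinite)
print(round(two_theta_deg(1.42), 1))   # ~6.2 degrees (intercalated)
```

The doubling of the basal spacing thus shows up as the (001) reflection shifting to roughly half the diffraction angle, which is how such intercalation is identified in practice.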
Abstract:
Environmental impacts caused during Australia's comparatively recent settlement by Europeans are evident. Governments (both Commonwealth and States) have been largely responsible for requiring landholders – through leasehold development conditions and taxation concessions – to conduct clearing that is now perceived as damage. Most governments are now demanding resource protection. There is a measure of bewilderment (if not resentment) among landholders because of this change. The more populous States, where most overall damage has been done (i.e. Victoria and New South Wales), provide most support for attempts to stop development in other regions where there has been less damage. Queensland, i.e. the north-eastern quarter of the continent, has been relatively slow to develop. It also holds the largest and most diverse natural environments. Tree clearing is an unavoidable element of land development, whether to access and enhance native grasses for livestock or to allow for urban developments (with exotic tree plantings). The consequences in terms of regulations are particularly complex because of the dynamic nature of vegetation. The regulatory terms used in current legislation – such as 'Endangered' and 'Of concern' – depend on legally-defined, static baselines. Regrowth and fire damage are two obvious causes of change. A less obvious aspect is succession, where ecosystems change naturally over long timeframes. In the recent past, the Queensland Government encouraged extensive tree-clearing e.g. through the State Brigalow Development Scheme (mostly 1962 to 1975) which resulted in the removal of some 97% of the wide-ranging mature forests of Acacia harpophylla. At the same time, this government controls National Parks and other reservations (occupying some 4% of the State's 1.7 million km2 area) and also holds major World Heritage Areas (such as the Great Barrier Reef and the Wet Tropics Rainforest) promulgated under Commonwealth legislation. 
This is a highly prescriptive approach, where the community is directed on the one hand to develop (largely through lease conditions) and on the other to avoid development (largely by unusable reserves). Another approach to development and conservation is still possible in Queensland. For this to occur, however, a more workable and equitable solution than has been employed to date is needed, especially for the remote lands of this State. This must involve resident landholders, who have the capacity (through local knowledge, infrastructure and daily presence) to undertake sustainable land-use management most cost-effectively (with suitable attention to ecosystems requiring special conservation effort), provided they have the necessary direction, encouragement and incentive to do so.