882 results for Update


Relevance: 10.00%

Abstract:

Many cities worldwide face the prospect of major transformation as the world moves towards a global information order. In this new era, urban economies are being radically altered by dynamic processes of economic and spatial restructuring. The result is the creation of ‘informational cities’ or, by their newer and more popular name, ‘knowledge cities’. For the last two centuries, social production had been primarily understood and shaped by neo-classical economic thought that recognized only three factors of production: land, labor and capital. Knowledge, education, and intellectual capacity were secondary, if not incidental, factors. Human capital was assumed to be either embedded in labor or just one of numerous categories of capital. In the last decades, it has become apparent that knowledge is sufficiently important to deserve recognition as a fourth factor of production. Knowledge and information, and the social and technological settings for their production and communication, are now seen as keys to development and economic prosperity. The rise of knowledge-based opportunity has, in many cases, been accompanied by a concomitant decline in traditional industrial activity. The replacement of physical commodity production by more abstract forms of production (e.g. information, ideas, and knowledge) has, paradoxically, reinforced the importance of central places and led to the formation of knowledge cities. Knowledge is produced, marketed and exchanged mainly in cities. Therefore, knowledge cities aim to assist decision-makers in making their cities compatible with the knowledge economy and thus able to compete with other cities. Knowledge cities enable their citizens to foster knowledge creation, knowledge exchange and innovation. They also encourage the continuous creation, sharing, evaluation, renewal and updating of knowledge. To compete nationally and internationally, cities need knowledge infrastructures (e.g. universities, research and development institutes); a concentration of well-educated people; technological, mainly electronic, infrastructure; and connections to the global economy (e.g. international companies and finance institutions for trade and investment). Moreover, they must possess the people and things necessary for the production of knowledge and, as importantly, function as breeding grounds for talent and innovation. The economy of a knowledge city creates high value-added products using research, technology, and brainpower. The private and public sectors value knowledge, spend money on its discovery and dissemination and, ultimately, harness it to create goods and services. Although many cities call themselves knowledge cities, currently only a few cities around the world (e.g., Barcelona, Delft, Dublin, Montreal, Munich, and Stockholm) have earned that label. Many other cities aspire to the status of knowledge city through urban development programs that target knowledge-based urban development. Examples include Copenhagen, Dubai, Manchester, Melbourne, Monterrey, Singapore, and Shanghai.

Knowledge-Based Urban Development

To date, the development of most knowledge cities has proceeded organically as a dependent and derivative effect of global market forces. Urban and regional planning has responded slowly, and sometimes not at all, to the challenges and the opportunities of the knowledge city. That is changing, however. Knowledge-based urban development potentially brings both economic prosperity and a sustainable socio-spatial order.
Its goal is to produce and circulate abstract work. The globalization of the world in the last decades of the twentieth century was a dialectical process. On the one hand, as the tyranny of distance was eroded, economic networks of production and consumption were constituted at a global scale. At the same time, spatial proximity remained as important as ever, if not more so, for knowledge-based urban development. Mediated by information and communication technology, personal contact, and the medium of tacit knowledge, organizational and institutional interactions are still closely associated with spatial proximity. The clustering of knowledge production is essential for fostering innovation and wealth creation. The social benefits of knowledge-based urban development extend beyond aggregate economic growth. On the one hand is the possibility of a particularly resilient form of urban development secured in a network of connections anchored at local, national, and global coordinates. On the other hand, quality of place and life, defined by the level of public service (e.g. health and education) and by the conservation and development of the cultural, aesthetic and ecological values that give cities their character and attract or repel the creative class of knowledge workers, is a prerequisite for successful knowledge-based urban development. The goal is a secure economy in a human setting: in short, smart growth or sustainable urban development.

Relevance: 10.00%

Abstract:

This paper presents a framework for performing real-time recursive estimation of landmarks’ visual appearance. Imaging data in its original high-dimensional space is probabilistically mapped to a compressed low-dimensional space through the definition of likelihood functions. The likelihoods are subsequently fused with prior information using a Bayesian update. This process produces a probabilistic estimate of the low-dimensional representation of the landmark’s visual appearance. The overall filtering provides information complementary to the conventional position estimates, which is used to enhance data association. In addition to robotic observations, the filter integrates human observations into the appearance estimates. The appearance tracks computed by the filter allow landmark classification. The set of labels involved in the classification task is thought of as an observation space where human observations are made by selecting a label. The low-dimensional appearance estimates returned by the filter allow for low-cost communication in low-bandwidth sensor networks. Deployment of the filter in such a network is demonstrated in an outdoor mapping application involving a human operator, a ground vehicle, and an air vehicle.
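
The core of the framework is a recursive Bayesian fusion of an observation likelihood with the current belief over the compressed appearance space. The following minimal sketch is not the authors' implementation; the discrete appearance classes and the likelihood values are assumptions chosen only to show that fusion step, and how robotic and human observations can enter through the same update:

    import numpy as np

    def bayes_update(prior, likelihood):
        """Fuse a prior belief over appearance classes with an observation likelihood.

        prior      : 1-D array, current belief over the low-dimensional appearance space
        likelihood : 1-D array, p(observation | class) for each class
        Returns the normalised posterior belief.
        """
        posterior = prior * likelihood
        total = posterior.sum()
        if total == 0:                      # degenerate observation: keep the prior
            return prior
        return posterior / total

    # Hypothetical example: three appearance classes, two successive observations
    belief = np.array([1/3, 1/3, 1/3])                 # uninformative prior
    for obs_likelihood in [np.array([0.7, 0.2, 0.1]),  # e.g. from a camera frame
                           np.array([0.6, 0.3, 0.1])]: # e.g. from a human label
        belief = bayes_update(belief, obs_likelihood)

    print(belief)   # belief concentrates on the first appearance class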

Relevance: 10.00%

Abstract:

The aim of this paper is to demonstrate the validity of using Gaussian mixture models (GMMs) for representing probabilistic distributions in a decentralised data fusion (DDF) framework. GMMs are a powerful and compact stochastic representation allowing efficient communication of feature properties in large-scale decentralised sensor networks. It will be shown that GMMs provide a basis for analytical solutions to the update and prediction operations for general Bayesian filtering. Furthermore, a variant of the Covariance Intersection algorithm for Gaussian mixtures will be presented, ensuring a conservative update for the fusion of correlated information between two nodes in the network. In addition, purely visual sensory data will be used to show that decentralised data fusion and tracking of non-Gaussian states observed by multiple autonomous vehicles is feasible.
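
Covariance Intersection is the building block behind the conservative fusion mentioned above: it combines two estimates whose cross-correlation is unknown by convexly weighting their information matrices. The sketch below is a plain single-Gaussian version, with an assumed grid search over the weight; the paper's variant extends the idea to Gaussian mixtures:

    import numpy as np

    def covariance_intersection(xa, Pa, xb, Pb, n_grid=101):
        """Conservatively fuse two possibly-correlated Gaussian estimates (x, P).

        The weight w is chosen by a simple grid search minimising trace(P_fused);
        the paper's contribution extends this idea to Gaussian mixtures.
        """
        Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)
        best = None
        for w in np.linspace(0.0, 1.0, n_grid):
            info = w * Pa_inv + (1.0 - w) * Pb_inv
            P = np.linalg.inv(info)
            if best is None or np.trace(P) < np.trace(best[1]):
                x = P @ (w * Pa_inv @ xa + (1.0 - w) * Pb_inv @ xb)
                best = (x, P)
        return best

    # Hypothetical example: two nodes report correlated 2-D feature estimates
    xa, Pa = np.array([1.0, 2.0]), np.diag([1.0, 4.0])
    xb, Pb = np.array([1.5, 1.8]), np.diag([2.0, 1.0])
    x_fused, P_fused = covariance_intersection(xa, Pa, xb, Pb)
    print(x_fused, np.trace(P_fused))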

Relevance: 10.00%

Abstract:

In this paper we present a real-time foreground–background segmentation algorithm that exploits the following observation, very often satisfied by a static camera positioned high in its environment: if a blob moves onto a pixel p that had not changed its colour significantly for a few frames, then p was probably part of the background while its colour was static. With this information we are able to differentially update pixels believed to be background. This work is relevant to autonomous minirobots, as they often navigate in buildings where smart surveillance cameras could communicate wirelessly with them. A by-product of the proposed system is a mask of the image regions which are demonstrably background. Statistical significance tests show that the proposed method has better precision and recall rates than the state-of-the-art foreground/background segmentation algorithm of the OpenCV computer vision library.
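
A rough per-frame sketch of how that observation can be turned into a differential update rule is given below. The thresholds, learning rule and array layout are assumptions for illustration, not the paper's actual parameters:

    import numpy as np

    STABLE_FRAMES = 10      # frames a pixel must stay unchanged to be trusted (assumed value)
    COLOUR_THRESH = 15.0    # per-pixel colour-change threshold (assumed value)

    def step(frame, prev_frame, background, stable_count, blob_mask):
        """One frame of a differential background update in the spirit of the abstract.

        If a blob now covers a pixel whose colour had been static for STABLE_FRAMES
        frames, that pixel's previous (static) colour is taken as background.
        """
        changed = np.abs(frame.astype(float) - prev_frame.astype(float)).max(axis=-1) > COLOUR_THRESH

        # pixels covered by a moving blob but static until now: adopt their old colour as background
        reveal = blob_mask & (stable_count >= STABLE_FRAMES)
        background = np.where(reveal[..., None], prev_frame, background)

        # reset the stability counter wherever the colour changed or a blob sits on the pixel
        stable_count = np.where(changed | blob_mask, 0, stable_count + 1)
        return background, stable_count, reveal   # 'reveal' marks demonstrably-background pixels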

Relevance: 10.00%

Abstract:

Cell-based therapies, as they apply to tissue engineering and regenerative medicine, require cells capable of self-renewal and differentiation, and a prerequisite is to be able to prepare an effective dose of ex vivo expanded cells for autologous transplants. The in vivo identification of a source of physiologically relevant cell types suitable for cell therapies therefore figures as an integral part of tissue engineering. Stem cells serve as a reserve for biological repair, having the potential to differentiate into a number of specialised cell types within the body; they therefore represent the most useful candidates for cell-based therapies. The primary goal of stem cell research is to produce cells that are both patient specific and have properties suitable for the specific conditions they are intended to remedy. From a purely scientific perspective, stem cells allow scientists to gain a deeper understanding of developmental biology and regenerative therapies. Stem cells have acquired a number of uses in regenerative medicine, immunotherapy and gene therapy, but it is in the area of tissue engineering that they generate most excitement, primarily as a result of their capacity for self-renewal and pluripotency. A unique feature of stem cells is their ability to maintain an uncommitted quiescent state in vivo and then, once triggered by conditions such as disease, injury or natural wear and tear, serve as a reservoir and natural support system to replenish lost cells. Although these cells retain the plasticity to differentiate into various tissues, being able to control this differentiation process is still one of the biggest challenges facing stem cell research. In an effort to harness the potential of these cells, a number of studies have been conducted using both embryonic/foetal and adult stem cells. The use of embryonic stem cells (ESCs) has been hampered by strong ethical and political concerns, despite their perceived versatility due to their pluripotency. Ethical issues aside, other concerns raised with ESCs relate to the possibility of tumorigenesis, immune rejection and complications with immunosuppressive therapies, all of which add layers of complication to the application of ESCs in research and have led to the search for alternative sources of stem cells. The adult tissues of higher organisms harbour cells, termed adult stem cells, that are reminiscent of unprogrammed stem cells. A number of sources of adult stem cells have been described. Bone marrow is by far the most accessible source of two potent populations of adult stem cells, namely haematopoietic stem cells (HSCs) and bone marrow mesenchymal stem cells (BMSCs). Autologously harvested adult stem cells can, in contrast to embryonic stem cells, readily be used in autografts, since immune rejection is not an issue, and their use in scientific research has not attracted the ethical concerns raised by embryonic stem cells. The major limitation to their use, however, is the fact that adult stem cells are exceedingly rare in most tissues. This makes identifying and isolating these cells problematic, bone marrow being perhaps the only notable exception. Unlike the case of HSCs, there are as yet no rigorous criteria for characterizing MSCs. Changing perceptions of the pluripotency of MSCs in recent studies have expanded their potential applications; however, the underlying molecular pathways which impart the features distinctive to MSCs remain elusive.
Furthermore, the sparse in vivo distribution of these cells imposes a clear limitation on their study in vitro. Also, when MSCs are cultured in vitro, there is a loss of the in vivo microenvironment, resulting in a progressive decline in proliferation potential and multipotentiality. This is further exacerbated with increased passage numbers in culture, characterized by the onset of senescence-related changes. As a consequence, it is necessary to establish protocols for generating large numbers of MSCs without affecting their differentiation potential. MSCs are capable of differentiating into mesenchymal tissue lineages, including bone, cartilage, fat, tendon, muscle, and marrow stroma. Recent findings indicate that adult bone marrow may also contain cells that can differentiate into the mature, nonhematopoietic cells of a number of tissues, including cells of the liver, kidney, lung, skin, gastrointestinal tract, and myocytes of heart and skeletal muscle. MSCs can readily be expanded in vitro, can be genetically modified by viral vectors, and can be induced to differentiate into specific cell lineages by changing the microenvironment, properties which make these cells ideal vehicles for cellular gene therapy. MSCs can also exert profound immunosuppressive effects via modulation of both cellular and innate immune pathways, and this property allows them to overcome the issue of immune rejection. Despite the many attractive features associated with MSCs, there are still many hurdles to overcome before these cells are readily available for use in clinical applications. The main concern relates to the in vivo characterization and identification of MSCs. The lack of a universal biomarker, sparse in vivo distribution, and a steady age-related decline in their numbers make it an obvious need to decipher the reprogramming pathways and critical molecular players which govern the characteristics unique to MSCs. This book presents a comprehensive insight into the biology of adult stem cells and their utility in current regeneration therapies. The adult stem cell populations reviewed in this book include bone marrow derived MSCs, adipose derived stem cells (ASCs), umbilical cord blood stem cells, and placental stem cells. Features such as MSC circulation and trafficking, neuroprotective properties, and the nurturing roles and differentiation potential of multiple lineages have been discussed in detail. In terms of therapeutic applications, the strengths of MSCs have been presented, and their roles in treatments such as osteoarthritis, Huntington’s disease, periodontal regeneration, and pancreatic islet transplantation have been discussed. An analysis comparing osteoblast differentiation of umbilical cord blood stem cells and MSCs has been reviewed, as has a comparison of human placental stem cells and ASCs, in terms of isolation, identification and the therapeutic applications of ASCs in bone, cartilage and myocardial regeneration. It is my sincere hope that this book will update the reader on the research progress of MSC biology and the potential use of these cells in clinical applications. It will be the best reward to all contributors of this book if their efforts herein in some way help readers in their study, research, and career development.

Relevance: 10.00%

Abstract:

The purpose of this review is to update expected values for pedometer-determined physical activity in free-living healthy older populations. A search of the literature published since 2001 began with a keyword (pedometer, "step counter," "step activity monitor" or "accelerometer AND steps/day") search of PubMed, Cumulative Index to Nursing & Allied Health Literature (CINAHL), SportDiscus, and PsychInfo. An iterative process was then undertaken to abstract and verify studies of pedometer-determined physical activity (captured in terms of steps taken; distance alone was not accepted) in free-living adult populations described as ≥ 50 years of age (studies that included samples which spanned this threshold were not included unless they provided at least some appropriately age-stratified data) and not specifically recruited based on any chronic disease or disability. We identified 28 studies representing at least 1,343 males and 3,098 females ranging in age from 50 to 94 years. Eighteen (64%) of the studies clearly identified the use of a Yamax pedometer model. Monitoring frames ranged from 3 days to 1 year; the modal length of time was 7 days (17 studies, or 61%). Mean pedometer-determined physical activity ranged from 2,015 steps/day to 8,938 steps/day. In those studies reporting such data, consistent patterns emerged: males generally took more steps/day than similarly aged females, steps/day decreased across study-specific age groupings, and BMI-defined normal-weight individuals took more steps/day than overweight/obese older adults. The range of 2,000–9,000 steps/day likely reflects the true variability of physical activity behaviors in older populations. More explicit patterns, for example sex- and age-specific relationships, remain to be informed by future research endeavors.

Relevance: 10.00%

Abstract:

In light of the changing nature of contemporary workplaces, this chapter attempts to identify employer expectations and the associated skills required of workers to function effectively in such workplaces. Workers are required to participate in informed discussion about their specific jobs and to contribute to the overall development of organisations. This requires a deep understanding of domain-specific knowledge, which at times can be very complex. Workers are also required to take responsibility for their actions and are expected to be flexible so that they can be deployed to other related jobs depending on demand. Finally, workers are expected to be proactive, able to anticipate situations, and to continuously update their knowledge to address new situations. This chapter discusses the nature of knowledge and skills that will facilitate the above qualities.

Relevance: 10.00%

Abstract:

Video surveillance technology, based on Closed Circuit Television (CCTV) cameras, is one of the fastest growing markets in the field of security technologies. However, existing video surveillance systems are still not at a stage where they can be used for crime prevention. The systems rely heavily on human observers and are therefore limited by factors such as fatigue and monitoring capabilities over long periods of time. To overcome this limitation, it is necessary to have “intelligent” processes which are able to highlight the salient data and filter out normal conditions that do not pose a threat to security. In order to create such intelligent systems, an understanding of human behaviour, specifically suspicious behaviour, is required. One of the challenges in achieving this is that human behaviour can only be understood correctly in the context in which it appears. Although context has been exploited in the general computer vision domain, it has not been widely used in the automatic suspicious behaviour detection domain. It is therefore essential that context be formulated, stored and used by the system in order to understand human behaviour. Finally, since surveillance systems can be modeled as large-scale data stream systems, it is difficult to have a complete knowledge base. In this case, the systems need not only to continuously update their knowledge but also to be able to retrieve the extracted information which is related to the given context. To address these issues, a context-based approach for detecting suspicious behaviour is proposed. In this approach, contextual information is exploited in order to make a better detection. The proposed approach utilises a data stream clustering algorithm in order to discover the behaviour classes and their frequency of occurrence from the incoming behaviour instances. Contextual information is then used in addition to the above information to detect suspicious behaviour. The proposed approach is able to detect observed, unobserved and contextual suspicious behaviour. Two case studies using video feeds taken from the CAVIAR dataset and the Z-block building at Queensland University of Technology are presented in order to test the proposed approach. From these experiments, it is shown that by using information about context, the proposed system is able to make a more accurate detection, especially of those behaviours which are only suspicious in some contexts while being normal in others. Moreover, this information gives critical feedback to the system designers to refine the system. Finally, the proposed modified CluStream algorithm enables the system both to continuously update its knowledge and to effectively retrieve the information learned in a given context. The outcomes from this research are: (a) a context-based framework for automatically detecting suspicious behaviour which can be used by an intelligent video surveillance system in making decisions; (b) a modified CluStream data stream clustering algorithm which continuously updates the system knowledge and is able to retrieve contextually related information effectively; and (c) an update-describe approach which extends the capability of the existing human local motion features, called interest-point-based features, to the data stream environment.
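
The contextual detection idea can be illustrated with a toy online clusterer that keeps per-context frequency counts for each behaviour class. This is only a stand-in for the modified CluStream algorithm described above; the distance rule, radius and support threshold are chosen purely for illustration:

    import numpy as np

    class SimpleStreamClusterer:
        """Toy online clusterer standing in for the thesis's modified CluStream.

        Behaviour instances (feature vectors) are assigned to the nearest micro-cluster;
        per-context frequency counts support context-dependent suspicion scoring.
        """
        def __init__(self, radius=1.0):
            self.radius = radius
            self.centres = []                  # cluster centres
            self.counts = []                   # list of {context: count}

        def observe(self, x, context):
            x = np.asarray(x, dtype=float)
            if self.centres:
                d = [np.linalg.norm(x - c) for c in self.centres]
                i = int(np.argmin(d))
                if d[i] <= self.radius:
                    self.centres[i] = 0.9 * self.centres[i] + 0.1 * x   # drift the centre
                    self.counts[i][context] = self.counts[i].get(context, 0) + 1
                    return i
            self.centres.append(x)
            self.counts.append({context: 1})
            return len(self.centres) - 1

        def is_suspicious(self, cluster_id, context, min_support=5):
            """A behaviour is suspicious if its cluster is rare in this particular context."""
            return self.counts[cluster_id].get(context, 0) < min_support

    # Hypothetical usage: the same behaviour is normal by day but suspicious at night
    clusterer = SimpleStreamClusterer(radius=0.5)
    for _ in range(50):
        clusterer.observe([1.0, 0.0], context="daytime")   # frequent daytime behaviour
    cid = clusterer.observe([1.0, 0.0], context="night")   # first occurrence at night
    print(clusterer.is_suspicious(cid, "night"))            # True: rare in this context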

Relevance: 10.00%

Abstract:

A Wireless Sensor Network (WSN) is a set of sensors that are integrated with a physical environment. These sensors are small in size, and capable of sensing physical phenomena and processing them. They communicate in a multihop manner, due to a short radio range, to form an ad hoc network capable of reporting network activities to a data collection sink. Recent advances in WSNs have led to several new promising applications, including habitat monitoring, military target tracking, natural disaster relief, and health monitoring. A current sensor node, such as the MICA2, uses a 16-bit, 8 MHz Texas Instruments MSP430 micro-controller with only 10 KB RAM, 128 KB program space and 512 KB external flash memory to store measurement data, and is powered by two AA batteries. Due to these unique specifications and a lack of tamper-resistant hardware, devising security protocols for WSNs is complex. Previous studies show that data transmission consumes much more energy than computation. Data aggregation can greatly help to reduce this consumption by eliminating redundant data. However, aggregators are under the threat of various types of attacks. Among them, node compromise is usually considered one of the most challenging for the security of WSNs. In a node compromise attack, an adversary physically tampers with a node in order to extract the cryptographic secrets. This attack can be very harmful depending on the security architecture of the network. For example, when an aggregator node is compromised, it is easy for the adversary to change the aggregation result and inject false data into the WSN. The contributions of this thesis to the area of secure data aggregation are manifold. We firstly define security for data aggregation in WSNs. In contrast with existing secure data aggregation definitions, the proposed definition covers the unique characteristics that WSNs have. Secondly, we analyze the relationship between security services and adversarial models considered in existing secure data aggregation in order to provide a general framework of required security services. Thirdly, we analyze existing cryptography-based and reputation-based secure data aggregation schemes. This analysis covers the security services provided by these schemes and their robustness against attacks. Fourthly, we propose a robust reputation-based secure data aggregation scheme for WSNs. This scheme minimizes the use of heavy cryptographic mechanisms. The security advantages provided by this scheme are realized by integrating aggregation functionalities with: (i) a reputation system, (ii) an estimation theory, and (iii) a change detection mechanism. We have shown that this addition helps defend against most of the security attacks discussed in this thesis, including the On-Off attack. Finally, we propose a secure key management scheme in order to distribute essential pairwise and group keys among the sensor nodes. The design idea of the proposed scheme is to combine Lamport's reverse hash chain with the usual hash chain to provide both past and future key secrecy. The proposal avoids the delivery of the whole value of a new group key for group key updates; instead, only half of the value is transmitted from the network manager to the sensor nodes. This way, the compromise of a pairwise key alone does not lead to the compromise of the group key. The new pairwise key in our scheme is determined by Diffie-Hellman based key agreement.
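
The half-transmission idea in the key management scheme can be pictured with two hash chains: one that nodes advance locally, and one whose elements the network manager reveals in reverse order at each update. The sketch below is only one illustrative reading of that idea; the seed names, chain lengths and the key derivation rule are assumptions, not the thesis's exact protocol:

    import hashlib

    def H(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def hash_chain(seed: bytes, length: int):
        """Build a hash chain: seed, H(seed), H(H(seed)), ..."""
        chain = [seed]
        for _ in range(length - 1):
            chain.append(H(chain[-1]))
        return chain

    # Illustrative setup: the manager precomputes a Lamport-style reverse chain and
    # reveals its elements backwards, one per key update; nodes advance a forward
    # chain locally, so only "half" of each new group key's material is transmitted.
    N = 5
    forward = hash_chain(b"forward-seed", N)            # nodes can compute this locally
    reverse = hash_chain(b"reverse-seed", N)[::-1]      # manager reveals reverse[i] at epoch i

    def group_key(epoch: int, revealed_reverse_value: bytes) -> bytes:
        """Derive the epoch's group key from the local forward value and the revealed half."""
        return H(forward[epoch] + revealed_reverse_value)

    # Epoch 0 and a later update at epoch 1: the manager sends only reverse[i]
    k0 = group_key(0, reverse[0])
    k1 = group_key(1, reverse[1])
    print(k0 != k1)   # True: keys change every epoch without shipping the whole key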

Relevance: 10.00%

Abstract:

The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures such as the recent increase in environmental awareness and its likely effects on regulations are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology, and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software that streamlines business processes. However, as more and more companies today approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers, and third parties (services) in an attempt to cut down on costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring regarding organizational boundaries (Coad and Cullen, 2006).

The Information Technology (IT) revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale including logistics, production, and Research & Development became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory and Collaborative Planning, Forecasting, and Replenishment, owes to the marketing efforts of software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during the years 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).

Relevance: 10.00%

Abstract:

It is a big challenge to clearly identify the boundary between positive and negative streams for information filtering systems. Several attempts have used negative feedback to solve this challenge; however, there are two issues in using negative relevance feedback to improve the effectiveness of information filtering. The first is how to select constructive negative samples in order to reduce the space of negative documents. The second is how to decide which noisy extracted features should be updated based on the selected negative samples. This paper proposes a pattern-mining-based approach to select some offenders from the negative documents, where an offender can be used to reduce the side effects of noisy features. It also classifies extracted features (i.e., terms) into three categories: positive specific terms, general terms, and negative specific terms. In this way, multiple revising strategies can be used to update extracted features. An iterative learning algorithm is also proposed to implement this approach on the RCV1 data collection, and substantial experiments show that the proposed approach achieves encouraging performance, which is also consistent for adaptive filtering.
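
The three-way split of terms, and the category-specific revision it enables, can be illustrated with a small sketch. The occurrence-based thresholds and the boost/damp weighting rule below are assumptions chosen for illustration, not the paper's actual revision strategies:

    from collections import Counter

    def classify_terms(positive_docs, offender_docs):
        """Split extracted terms into positive-specific, general and negative-specific sets."""
        pos = Counter(t for doc in positive_docs for t in set(doc))
        neg = Counter(t for doc in offender_docs for t in set(doc))
        positive_specific, general, negative_specific = set(), set(), set()
        for term in set(pos) | set(neg):
            if pos[term] > 0 and neg[term] == 0:
                positive_specific.add(term)
            elif neg[term] > 0 and pos[term] == 0:
                negative_specific.add(term)
            else:
                general.add(term)
        return positive_specific, general, negative_specific

    def revise_weights(weights, positive_specific, general, negative_specific,
                       boost=1.5, damp=0.5):
        """Apply a different revision strategy to each category of terms."""
        revised = {}
        for term, w in weights.items():
            if term in positive_specific:
                revised[term] = w * boost       # promote terms unique to relevant documents
            elif term in negative_specific:
                revised[term] = 0.0             # drop terms that only appear in offenders
            elif term in general:
                revised[term] = w * damp        # soften ambiguous terms
            else:
                revised[term] = w
        return revised

    # Hypothetical example
    positive_docs = [["knowledge", "city", "economy"], ["knowledge", "innovation"]]
    offender_docs = [["city", "traffic"], ["traffic", "noise"]]
    ps, ge, ns = classify_terms(positive_docs, offender_docs)
    print(revise_weights({"knowledge": 1.0, "city": 1.0, "traffic": 1.0}, ps, ge, ns))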

Relevance: 10.00%

Abstract:

Many data mining techniques have been proposed for mining useful patterns in text documents. However, how to effectively use and update discovered patterns is still an open research issue, especially in the domain of text mining. Since most existing text mining methods adopt term-based approaches, they all suffer from the problems of polysemy and synonymy. Over the years, people have often held the hypothesis that pattern (or phrase) based approaches should perform better than term-based ones, but many experiments have not supported this hypothesis. This paper presents an innovative technique, effective pattern discovery, which includes the processes of pattern deploying and pattern evolving, to improve the effectiveness of using and updating discovered patterns for finding relevant and interesting information. Substantial experiments on the RCV1 data collection and TREC topics demonstrate that the proposed solution achieves encouraging performance.
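
Pattern deploying, one of the two processes named above, can be pictured as projecting the discovered patterns onto a term-weight space that is then used for ranking. The sketch below is a simplified illustration under an assumed proportional weighting scheme, not the paper's exact deployment method:

    def deploy_patterns(patterns):
        """Pattern deploying: project discovered patterns onto a term-weight vector.

        `patterns` maps each discovered pattern (a frozenset of terms) to its support;
        the proportional split below is an illustrative choice.
        """
        weights = {}
        for terms, support in patterns.items():
            share = support / len(terms)       # spread the pattern's support over its terms
            for t in terms:
                weights[t] = weights.get(t, 0.0) + share
        return weights

    def score(document_terms, weights):
        """Rank a document by the deployed term weights it contains."""
        return sum(weights.get(t, 0.0) for t in set(document_terms))

    # Hypothetical patterns mined from relevant training documents
    patterns = {frozenset({"pattern", "mining"}): 4,
                frozenset({"text", "mining"}): 3,
                frozenset({"mining"}): 6}
    w = deploy_patterns(patterns)
    print(score(["pattern", "mining", "results"], w) > score(["weather", "report"], w))  # True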

Relevance: 10.00%

Abstract:

Significant amendments to the Property Agents and Motor Dealers Act 2000 (Qld) (‘PAMDA’) and the Body Corporate and Community Management Act 1997 (Qld) (‘BCCMA’) were made by the Liquor and Other Acts Amendment Act 2005 (Qld). These amendments commenced on 1 December 2005. The purpose of this alert is to very briefly describe the amendments and to indicate certain issues that may arise. The alert is intended to signal the need for careful perusal of these amendments.

Relevance: 10.00%

Abstract:

Background: Significant ongoing learning needs for nurses have arisen as a direct result of the continuous introduction of technological innovations and research developments in the healthcare environment. Despite an increased worldwide emphasis on the importance of continuing education, there continues to be an absence of empirical evidence of program and session effectiveness. Few studies determine whether continuing education enhances or develops practice, or the relative cost benefits of health professionals’ participation in professional development. The implications for future clinical practice and associated educational approaches to meet the needs of an increasingly diverse multigenerational and multicultural workforce are also not well documented. There is minimal research confirming that continuing education programs contribute to improved patient outcomes or nurses’ earlier detection of patient deterioration, that standards of continuing competence are maintained, that evidence-based practice is demonstrated, or that international quality and safety benchmarks are adhered to. An integrated clinical learning model was developed to inform ongoing education for acute care nurses. Educational strategies included the use of integrated learning approaches, interactive teaching concepts and learner-centred pedagogies. A Respiratory Skills Update education (ReSKU) program was used as the content for the educational intervention to inform surgical nurses’ clinical practice in the area of respiratory assessment. The aim of the research was to evaluate the effectiveness of implementing the ReSKU program, using teaching and learning strategies in the context of organisational utility, on improving surgical nurses’ practice in the area of respiratory assessment. The education program aimed to facilitate better awareness, knowledge and understanding of respiratory dysfunction in the postoperative clinical environment. This research was guided by the work of Forneris (2004), who developed a theoretical framework to operationalise a critical thinking process incorporating the complexities of the clinical context. The framework used educational strategies that are learner-centred and participatory. These strategies aimed to engage the clinician in dynamic thinking processes in clinical practice situations guided by coaches and educators. Methods: A quasi-experimental pre-test, post-test non-equivalent control group design was used to evaluate the impact of the ReSKU program on the clinical practice of surgical nurses. The research tested the hypothesis that participation in the ReSKU program improves the reported beliefs and attitudes of surgical nurses, and increases their knowledge and reported use of respiratory assessment skills. The study was conducted in a 400-bed regional referral public hospital, the central hub of three smaller hospitals, in a health district servicing the coastal and hinterland areas north of Brisbane. The sample included 90 nurses working in the three surgical wards eligible for inclusion in the study. The experimental group consisted of 36 surgical nurses who had chosen to attend the ReSKU program and consented to be part of the study intervention group. The comparison group included the 39 surgical nurses who elected not to attend the ReSKU program but agreed to participate in the study. Findings: One of the most notable findings was that nurses choosing not to participate were older, more experienced and less well educated.
The data demonstrated that there was a barrier to training which impacted on educational strategies, as this mature-aged cohort was less likely to take up educational opportunities. The study demonstrated statistically significant differences between groups regarding reported use of respiratory skills three months after ReSKU program attendance. Between-group data analysis indicated that the intervention group’s reported beliefs and attitudes pertaining to subscale descriptors showed statistically significant differences in three of the six subscales following attendance at the ReSKU program. These subscales included influence on nursing care, educational preparation and clinical development. Findings suggest that the use of an integrated educational model underpinned by a robust theoretical framework is a strong factor in some perceptions of the ReSKU program relating to attitudes and behaviour. There were minimal differences in knowledge between groups across time. Conclusions: This study was consistent with contemporary educational approaches, using multi-modal, interactive teaching strategies and a robust overarching theoretical framework to support study concepts. The construct of critical thinking in the clinical context, combined with clinical reasoning and purposeful and collective reflection, was a powerful educational strategy to enhance competency and capability in clinicians.

Relevance: 10.00%

Abstract:

Given global demand for new infrastructure, governments face substantial challenges in funding new infrastructure and simultaneously delivering Value for Money (VfM). The paper begins with an update on a key development in a new early/first-order procurement decision-making model that deploys production cost/benefit theory and theories concerning transaction costs from the New Institutional Economics, in order to identify a procurement mode that is likely to deliver the best ratio of production costs and transaction costs to production benefits, and therefore deliver superior VfM relative to alternative procurement modes. In doing so, the new procurement model is also able to address the uncertainty concerning the relative merits of Public-Private Partnerships (PPPs) and non-PPP procurement approaches. The main aim of the paper is to develop competition as a dependent variable/proxy for VfM and a hypothesis (overarching proposition), as well as to develop a research method to test the new procurement model. Competition reflects both production costs and benefits (absolute level of competition) and transaction costs (level of realised competition) and is a key proxy for VfM. Using competition as a proxy for VfM, the overarching proposition is given as: when the actual procurement mode matches the predicted (theoretical) procurement mode (informed by the new procurement model), actual competition is expected to match potential competition (based on actual capacity). To collect data to test this proposition, the research method developed in this paper combines a survey and case study approach. More specifically, data collection instruments for the surveys to collect data on actual procurement, actual competition and potential competition are outlined. Finally, plans for analysing the survey data are briefly mentioned, along with the planned use of analytical pattern matching in deploying the new procurement model to develop the predicted (theoretical) procurement mode.