871 results for Ad hoc network
Abstract:
Health service managers and policy makers are increasingly concerned about the sustainability of innovations implemented in health care settings. The increasing demand on health services requires that innovations are both effective and sustainable; however, research in this field is limited, with multiple disciplines, approaches and paradigms influencing the field. These variations prevent a cohesive approach, and therefore the accumulation of research findings and the development of a body of knowledge. A theoretical framework serves to guide research, determine variables and influence data analysis, and is central to the quest for ongoing knowledge development. If left unaddressed, health services research will continue in an ad hoc manner, preventing full utilisation of outcomes, recommendations and knowledge for the effective provision of health services. The purpose of this paper is to provide an integrative review of the literature and to introduce a theoretical framework for health services innovation sustainability research, based on integration and synthesis of the literature. Finally, recommendations for operationalising and testing this theory are presented.
Abstract:
Background & Research Focus Managing knowledge for innovation and organisational benefit has been extensively investigated in studies of large firms (Smith, Collins & Clark, 2005; Zucker et al., 2007), but research on small- and medium-sized enterprises (SMEs) remains limited. There are some investigations of SMEs in the knowledge management literature, but the question of where the potential challenges lie for managing knowledge more effectively within these firms remains open. Effective knowledge management (KM) processes and systems lead to improved performance in pursuing distinct capabilities that contribute to firm-level innovation (Nassim 2009; Zucker et al. 2007; Verona and Ravasi 2003). Managing internal and external knowledge in a way that links it closely to the innovation process can assist the creation and implementation of new products and services. KM is particularly important in knowledge-intensive firms where the knowledge requirements are highly specialised, diverse and often emergent. However, the KM processes of small firms, which are often the source of new knowledge and an important element of the value networks of larger companies, have not been closely studied. To address this gap, which is of increasing importance given the growing number of small firms, we need to further investigate knowledge management processes and the ways that firms find, capture, apply and integrate knowledge from multiple sources for their innovation process. This study builds on the previous literature, applies existing frameworks, and takes the process and activity view of knowledge management as its point of departure (see among others Kraaijenbrink, Wijnhoven & Groen, 2007; Enberg, Lindkvist, & Tell, 2006; Lu, Wang & Mao, 2007).
In this paper, we attempt to develop a better understanding of the challenges of knowledge management within the innovation process in small knowledge-oriented firms. The paper aims to explore knowledge management processes and practices in firms that are engaged in new product/service development programs. Consistent with the exploratory character of the study, the research question is: How is knowledge integrated, sourced and recombined from internal and external sources for innovation and new product development? Research Method The research took an exploratory case study approach and developed a theoretical framework to investigate the knowledge situation of knowledge-intensive firms. Equipped with this conceptual foundation, the research adopted a multiple case study method, investigating four diverse Australian knowledge-intensive firms from the IT, biotechnology, nanotechnology and biochemistry industries. The multiple case study method allowed us to document in some depth the knowledge management experience of these firms. Case study data were collected through a review of company published data and semi-structured interviews with managers, using an interview guide to ensure uniform coverage of the research themes. This interview guide was developed following development of the framework and a review of the methodologies and issues covered by similar studies in other countries, and used some questions common to these studies. It was framed to gather data around knowledge management activity within the business, focusing on the identification, acquisition and utilisation of knowledge, but collecting a range of information about each subject as well. The focus of the case studies was on the use of external and internal knowledge to support knowledge-intensive products and services. Key Findings First, a conceptual and strategic knowledge management framework was developed.
The knowledge determinants are related to the nature of knowledge, the organisational context, and the mechanism of the linkages between internal and external knowledge. Overall, a number of key observations emerged from this study, demonstrating the challenges of managing knowledge and the importance of KM as a management tool for the innovation process in knowledge-oriented firms. To summarise, the findings suggest that the knowledge management process in these firms is very much project-focused, not embedded within the overall organisational routines, and based mainly on ad hoc and informal processes. Our findings highlighted a lack of formal knowledge management processes within our sampled firms. This points to the need for more specialised capabilities in knowledge management for these firms. We observed the need for an effective knowledge transfer support system to facilitate knowledge sharing, and particularly to capture and transfer tacit knowledge from one team member to another. In sum, our findings indicate that building effective and adaptive IT systems to manage and share knowledge in the firm is one of the biggest challenges for these small firms. Also, there is little explicit strategy in small knowledge-intensive firms targeted at systematic KM at either the strategic or the operational level. Therefore, a strategic approach to managing knowledge for innovation, as well as leadership and management, is essential to achieving effective KM. In particular, the research findings demonstrate that gathering tacit knowledge, internal and external to the organisation, and applying processes to ensure the availability of knowledge to innovation teams, drives down the risks and costs of innovation. KM activities and tools, such as KM systems, environmental scanning, benchmarking, intranets, firm-wide databases and communities of practice to acquire knowledge and make it accessible, were elements of KM.
Practical Implications The case study method used in this study provides practical insight into the knowledge management process within Australian knowledge-intensive firms. It also provides useful lessons which can be used by other firms to manage knowledge more effectively in the innovation process. The findings should be helpful for small firms searching for a practical method for managing and integrating their specialised knowledge. Using the results of this exploratory study, and to address the challenges of knowledge management, this study proposes five practices, discussed in the paper, for managing knowledge more efficiently to improve innovation: (1) knowledge-based firms must be strategic in knowledge management processes for innovation; (2) leadership and management should encourage various practices for knowledge management; (3) capturing and sharing tacit knowledge is critical and should be managed; (4) team knowledge integration practices should be developed; (5) knowledge management and integration through communication networks and technology systems should be encouraged and strengthened. In sum, the main managerial contribution of the paper is the recognition of knowledge determinants and processes, and their effects on effective knowledge management within the firm. This may serve as a useful benchmark in the strategic planning of the firm as it utilises new and specialised knowledge.
Abstract:
• Balancing the interests of individual autonomy and protection is an escalating challenge confronting an ageing Australian society.
• One way this is manifested is in the current ad hoc and unsatisfactory way that capacity is assessed in the context of wills, enduring powers of attorney and advance health directives.
• The absence of nationally accepted assessment guidelines results in terminological and methodological miscommunication and misunderstanding between legal and medical professionals.
• Expectations between legal and medical professionals can be clarified to provide satisfactory capacity assessments based upon the development of a sound assessment paradigm.
Abstract:
Millions of people with print disabilities are denied the right to read. While some important efforts have been made to convert standard books to accessible formats and create accessible repositories, these have so far only addressed this crisis in an ad hoc way. This article argues that universally designed ebook libraries have the potential to substantially enable persons with print disabilities. As a case study of what is possible, we analyse 12 academic ebook libraries to map their levels of accessibility. The positive results from this study indicate that universally designed ebooks are more than possible; they exist. While the results are positive, however, we also found that most ebook libraries have some features that frustrate full accessibility, and some ebook libraries present critical barriers for people with disabilities. Based on these findings, we consider that some combination of private pressure and public law is both possible and necessary to advance the right-to-read cause. With access improving and recent advances in international law, now is the time to push for universal design and equality.
Abstract:
This pilot project investigated the existing practices and processes of Proficient, Highly Accomplished and Lead teachers in the interpretation, analysis and implementation of National Assessment Program – Literacy and Numeracy (NAPLAN) data. A qualitative case study approach was the chosen methodology, with nine teachers across a variety of school sectors interviewed. Themes and sub-themes were identified from the participants’ interview responses, revealing the ways in which Queensland teachers work with NAPLAN data. The data illuminated that individual schools and teachers generally adopted their own ways of working with data, with approaches ranging from individual/ad hoc to hierarchical or whole-school. Findings also revealed that data are the responsibility of various persons within the school hierarchy; some work with the data electronically whilst others rely on manual manipulation. Manipulation of data is used for various purposes, including tracking performance, value adding and targeting programmes for specific groups of students, for example the gifted and talented. Whilst all participants had knowledge of intervention programmes and how practice could be modified, there were large inconsistencies in knowledge and skills across schools. Some see the use of data as a mechanism for accountability, whilst others mention data with regard to changing the school culture and identifying best practice. Overall, the findings showed inconsistencies in approach to focus area 5.4. Recommendations therefore include a more national approach to the use of educational data.
Abstract:
Determining the key variables of transportation disadvantage remains a great challenge, as the variables are commonly selected using ad hoc techniques. In order to identify the variables, this research develops a transportation disadvantage framework by drawing on the capability approach. The developed framework is statistically analysed using partial least squares-based software to determine the framework's fit. The statistical analysis identifies mobility and socioeconomic variables that significantly influence transportation disadvantage. The research reveals the key socioeconomic variables for transportation disadvantage in the case of Brisbane, Australia, as household structure, the presence of a dependent family member, vehicle ownership, and driving licence possession.
Abstract:
The Kyoto Protocol is remarkable among global multilateral environmental agreements for its efforts to depoliticize compliance. However, attempts to create autonomous, arm’s length and rule-based compliance processes with extensive reliance on putatively neutral experts were only partially realized in practice in the first commitment period from 2008 to 2012. In particular, the procedurally constrained facilitative powers vested in the Facilitative Branch were circumvented, and expert review teams (ERTs) assumed pivotal roles in compliance facilitation. The ad hoc diplomatic and facilitative practices engaged in by these small teams of technical experts raise questions about the reliability and consistency of the compliance process. For the future operation of the Kyoto compliance system, it is suggested that ERTs should be confined to more technical and procedural roles, in line with their expertise. There would then be greater scope for the Facilitative Branch to assume a more comprehensive facilitative role, safeguarded by due process guarantees, in accordance with its mandate. However, if – as appears likely – the future compliance trajectories under the United Nations Framework Convention on Climate Change will include a significant role for ERTs without oversight by the Compliance Committee, it is important to develop appropriate procedural safeguards that reflect and shape the various technical and political roles these teams currently play.
Abstract:
The requirement of isolated relays is one of the prime obstacles to utilising sequential slotted cooperative protocols in Vehicular Ad-hoc Networks (VANETs). Significant research advancement has taken place to improve the diversity-multiplexing trade-off (DMT) of cooperative protocols in conventional mobile networks, without much attention to vehicular ad-hoc networks. We have extended the concept of sequential slotted amplify-and-forward (SAF) protocols to urban vehicular ad-hoc networks. Multiple Input Multiple Output (MIMO) reception is used at relaying vehicular nodes to isolate the relays effectively. The proposed approach adds pragmatic value to sequential slotted cooperative protocols while achieving attractive performance gains in urban VANETs. We have analysed the DMT bounds and the outage probabilities of the proposed scheme. The results suggest that the proposed scheme can achieve an optimal DMT similar to the DMT upper bound of sequential SAF. Furthermore, the proposed scheme outperforms the SAF protocol by 2.5 dB in outage at a target outage probability of 10^-4.
Abstract:
The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or by setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of surveys without detection required to minimise the net expected cost. Given that detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution using stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one given the ranges for each parameter. © 2006 Blackwell Publishing Ltd/CNRS.
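The stop/continue comparison described above can be sketched numerically. This is a minimal illustration: the function names and parameter values are ours, not the authors', and the paper's exact solution uses stochastic dynamic programming rather than this simple rule of thumb.

```python
def prob_present(prior, p_detect, years_absent):
    # Posterior probability the species persists after `years_absent`
    # consecutive surveys with no detection (Bayes' rule: each absent
    # survey multiplies the "present" hypothesis by the miss rate).
    miss = (1 - p_detect) ** years_absent
    return prior * miss / (prior * miss + (1 - prior))

def optimal_stopping_year(prior, p_detect, survey_cost, escape_cost, max_years=50):
    # Rule-of-thumb stopping time: keep surveying while the expected
    # escape damage avoided by one more survey exceeds the survey cost.
    for t in range(1, max_years + 1):
        if prob_present(prior, p_detect, t) * escape_cost < survey_cost:
            return t
    return max_years
```

With a cheap annual survey and costly escape, the rule keeps monitoring until the posterior probability of persistence is driven low enough that further surveying is no longer worth its cost.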
Abstract:
Most standard algorithms for prediction with expert advice depend on a parameter called the learning rate. This learning rate needs to be large enough to fit the data well, but small enough to prevent overfitting. For the exponential weights algorithm, a sequence of prior work has established theoretical guarantees for higher and higher data-dependent tunings of the learning rate, which allow for increasingly aggressive learning. But in practice such theoretical tunings often still perform worse (as measured by their regret) than ad hoc tuning with an even higher learning rate. To close the gap between theory and practice we introduce an approach to learn the learning rate. Up to a factor that is at most (poly)logarithmic in the number of experts and the inverse of the learning rate, our method performs as well as if we knew the empirically best learning rate from a large range that includes both conservative small values and values much higher than those for which formal guarantees were previously available. Our method employs a grid of learning rates, yet runs in linear time regardless of the size of the grid.
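The grid-of-learning-rates idea can be illustrated with a minimal exponential weights sketch. This is our simplification, not the authors' algorithm: it naively evaluates every grid point in hindsight, whereas their method runs in linear time regardless of grid size.

```python
import math

def exp_weights_loss(losses, eta):
    # Cumulative loss of the exponential weights forecaster with a
    # fixed learning rate `eta` on a sequence of per-expert loss vectors.
    n = len(losses[0])
    weights = [1.0] * n
    total = 0.0
    for round_losses in losses:
        z = sum(weights)
        probs = [w / z for w in weights]            # play the weighted mixture
        total += sum(p * l for p, l in zip(probs, round_losses))
        # Exponential multiplicative update of the expert weights.
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, round_losses)]
    return total

def best_eta_on_grid(losses, grid):
    # The hindsight-optimal grid point: the oracle the authors' method
    # competes with up to (poly)logarithmic factors.
    return min(grid, key=lambda eta: exp_weights_loss(losses, eta))
```

On data where one expert is clearly best, an aggressive (large) learning rate concentrates weight quickly and incurs less loss, which is exactly the regime where conservative theoretical tunings lag behind.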
Abstract:
It is commonly assumed that psychiatric violence is motivated by delusions, but here the concept of a reversed impetus is explored, to understand whether delusions are formed as ad hoc or post hoc rationalisations of behaviour, or in advance of the actus reus. The reflexive violence model proposes that perceptual stimuli have motivational power, which may trigger unwanted actions and hallucinations. The model is based on the theory of ecological perception, where the opportunities enabled by an object are cues to act. As an apple triggers a desire to eat, a gun triggers a desire to shoot. These affordances (as they are called) are part of the perceptual apparatus; they allow the direct recognition of objects, and in emergencies they enable the fastest possible reactions. Even under normal circumstances, the presence of a weapon will trigger inhibited violent impulses, as will the presence of a victim; but under normal circumstances these affordances don't become violent because negative action impulses are totally inhibited, whereas in psychotic illness negative action impulses are treated as emergencies and bypass frontal inhibitory circuits. What would have been object recognition becomes a blind automatic action. A range of mental illnesses can cause inhibition to be bypassed. At its most innocuous, this causes simple hallucinations (where the motivational power of an object is misattributed). But ecological perception may also have the power to trigger serious violence – a kind that is devoid of motives or planning and is often shrouded in amnesia or post-rational delusions.
Abstract:
The proliferation of the web presents an unsolved problem of automatically analyzing billions of pages of natural language. We introduce a scalable algorithm that clusters hundreds of millions of web pages into hundreds of thousands of clusters. It does this on a single mid-range machine using efficient algorithms and compressed document representations. It is applied to two web-scale crawls covering tens of terabytes: ClueWeb09 and ClueWeb12 contain 500 and 733 million web pages respectively, and were clustered into 500,000 to 700,000 clusters. To the best of our knowledge, such fine-grained clustering has not been previously demonstrated. Previous approaches clustered a sample, which limits the maximum number of discoverable clusters. The proposed EM-tree algorithm uses the entire collection in clustering and produces several orders of magnitude more clusters than existing algorithms. Fine-grained clustering is necessary for meaningful clustering in massive collections, where the number of distinct topics grows linearly with collection size. These fine-grained clusters show improved cluster quality when assessed with two novel evaluations using ad hoc search relevance judgments and spam classifications for external validation. These evaluations solve the problem of assessing the quality of clusters where categorical labeling is unavailable or infeasible.
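The tree-structured clustering idea can be sketched as recursive k-means over a branching tree. This is our simplification for illustration only: the actual EM-tree algorithm operates on compressed document representations at web scale, which this toy version does not attempt.

```python
import random

def kmeans(points, k, iters=10):
    # Plain k-means on dense vectors; returns k lists of points.
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centroid (squared Euclidean).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        centroids = [[sum(dim) / len(cl) for dim in zip(*cl)] if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return clusters

def cluster_tree(points, branch, leaf_size):
    # Recursively split the collection into a `branch`-way tree until
    # clusters are small; the leaves are the fine-grained clusters.
    if len(points) <= leaf_size or len(points) < branch:
        return [points]
    clusters = [cl for cl in kmeans(points, branch) if cl]
    if len(clusters) <= 1:       # no useful split found; stop here
        return [points]
    leaves = []
    for cl in clusters:
        leaves.extend(cluster_tree(cl, branch, leaf_size))
    return leaves
```

Because every level of the tree partitions the whole collection rather than a sample, the number of leaf clusters can grow with collection size, which is the property the abstract emphasises.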
Abstract:
‘Complexity’ is a term that is increasingly prevalent in conversations about building capacity for 21st Century professional engineers. Society is grappling with the urgent and challenging reality of accommodating seven billion people, meeting needs and innovating lifestyle improvements in ways that do not destroy the atmospheric, biological and oceanic systems critical to life. Over the last two decades in particular, engineering educators have been active in attempting to build capacity amongst professionals to deliver ‘sustainable development’ in this rapidly changing global context. However, curriculum literature clearly points to a lack of significant progress, with efforts best described as ad hoc and highly varied. Given the limited timeframes for action to curb environmental degradation proposed by scientists and intergovernmental agencies, the authors of this paper propose that it is imperative that curriculum renewal towards education for sustainable development proceeds rapidly, systemically, and in a transformational manner. Within this context, the paper discusses the need to consider a multiple-track approach to building capacity for 21st Century engineering, including priorities and timeframes for undergraduate and postgraduate curriculum renewal. The paper begins with a contextual discussion of the term complexity and how it relates to life in the 21st Century. The authors then present a whole-of-system approach for planning and implementing rapid curriculum renewal that addresses the critical roles of several generations of engineering professionals over the next three decades. The paper concludes with observations regarding engaging with this approach in the context of emerging accreditation requirements and existing curriculum renewal frameworks.
Abstract:
Background Foot dorsiflexion plays an essential role in both controlling balance and human gait. Electromyography (EMG) and sonomyography (SMG) can provide information on several aspects of muscle function. The aim was to establish the relationship between EMG and SMG variables during isotonic contractions of the foot dorsiflexors. Methods Twenty-seven healthy young adults performed the foot dorsiflexion test on a device designed ad hoc. The EMG variables were maximum peak and area under the curve. The muscle architecture variables were muscle thickness and pennation angle. Descriptive statistical analysis, inferential analysis and a multivariate linear regression model were carried out. Statistical significance was set at p < 0.05. Results The correlation between the EMG and SMG variables was r = 0.462 (p < 0.05). The linear regression model for the dependent variable “peak normalized tibialis anterior (TA)” from the independent variables “pennation angle and thickness” was significant (p = 0.002), with an explained variance of R² = 0.693 and SEE = 0.16. Conclusions There is a significant relationship and degree of contribution between EMG and SMG variables during isotonic contractions of the TA muscle. Our results suggest that EMG and SMG can be feasible tools for the monitoring and assessment of foot dorsiflexors. TA muscle parameterisation and assessment are relevant because increased strength accelerates the recovery of lower limb injuries.
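The reported model (predicting normalized TA peak from pennation angle and muscle thickness) is an ordinary two-predictor linear regression. A minimal sketch follows, using made-up numbers rather than the study's data:

```python
import numpy as np

# Hypothetical observations: pennation angle (deg), thickness (mm), peak EMG.
angle = np.array([9.5, 10.2, 11.0, 12.3, 13.1, 14.0])
thick = np.array([22.0, 23.5, 24.1, 25.8, 26.4, 27.9])
peak = np.array([0.41, 0.48, 0.52, 0.61, 0.66, 0.74])

# Design matrix with an intercept column; ordinary least squares fit.
X = np.column_stack([np.ones_like(angle), angle, thick])
coef, *_ = np.linalg.lstsq(X, peak, rcond=None)

# Explained variance (R^2), the quantity the abstract reports as 0.693.
pred = X @ coef
ss_res = np.sum((peak - pred) ** 2)
ss_tot = np.sum((peak - peak.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
```

The fitted coefficients and R² here are artefacts of the invented data; the point is only the shape of the model, not the study's values.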
Abstract:
For the first decade of its existence, the concept of citizen journalism has described an approach which was seen as a broadening of the participant base in journalistic processes, but still involved only a comparatively small subset of overall society – for the most part, citizen journalists were news enthusiasts and “political junkies” (Coleman, 2006) who, as some exasperated professional journalists put it, “wouldn’t get a job at a real newspaper” (The Australian, 2007), but nonetheless followed many of the same journalistic principles. The investment – if not of money, then at least of time and effort – involved in setting up a blog or participating in a citizen journalism Website remained substantial enough to prevent the majority of Internet users from engaging in citizen journalist activities to any significant extent; what emerged in the form of news blogs and citizen journalism sites was a new online elite which for some time challenged the hegemony of the existing journalistic elite, but gradually also merged with it. The mass adoption of next-generation social media platforms such as Facebook and Twitter, however, has led to the emergence of a new wave of quasi-journalistic user activities which now much more closely resemble the “random acts of journalism” which JD Lasica envisaged in 2003. Social media are not exclusively or even predominantly used for citizen journalism; instead, citizen journalism is now simply a by-product of user communities engaging in exchanges about the topics which interest them, or tracking emerging stories and events as they happen. 
Such platforms – and especially Twitter with its system of ad hoc hashtags that enable the rapid exchange of information about issues of interest – provide spaces for users to come together to “work the story” through a process of collaborative gatewatching (Bruns, 2005), content curation, and information evaluation which takes place in real time and brings together everyday users, domain experts, journalists, and potentially even the subjects of the story themselves. Compared to the spaces of news blogs and citizen journalism sites, but also of conventional online news Websites, which are controlled by their respective operators and inherently position user engagement as a secondary activity to content publication, these social media spaces are centred around user interaction, providing a third-party space in which everyday as well as institutional users, laypeople as well as experts converge without being able to control the exchange. Drawing on a number of recent examples, this article will argue that this results in a new dynamic of interaction and enables the emergence of a more broadly-based, decentralised, second wave of citizen engagement in journalistic processes.