860 results for Mobile Ad-hoc Networks (MANETs)


Relevance:

100.00%

Publisher:

Abstract:

The formality and informality of HRM practices in small firms, by Rowena Barrett and Susan Mayson. The nature of human resource management in small firms is understood to be characterized by ad hoc and idiosyncratic practices. The liability of smallness (Heneman and Berkley, 1999) and resource poverty (Welsh and White, 1981) present unique challenges to managing human resources in small firms. The inability to achieve economies of scale can mean that implementing formalized HRM practices is costly in terms of time and money for small firms (Sels et al., 2006a; 2006b). These constraints, combined with small firm owner–managers’ lack of strategic capabilities and awareness (Hannon and Atherton, 1998) and a lack of managerial resources and expertise in HRM (Cardon and Stevens, 2004), can lead to informal and ad hoc HRM practices. For some, this state of affairs is problematic because the normative, formalized HRM practices in the areas of recruitment, selection, appraisal, training and rewards are not present (see Marlow, 2006 and Taylor, 2006 for a critique). However, a more nuanced analysis of the small firm and its practices in their context can tell a different story (Barrett and Rainnie, 2002; Harney and Dundon, 2006). In this chapter we contribute to the understanding of small firm management practices by investigating a series of questions in relation to HRM in small firms.

Relevance:

100.00%

Publisher:

Abstract:

Health service managers and policy makers are increasingly concerned about the sustainability of innovations implemented in health care settings. The increasing demand on health services requires that innovations be both effective and sustainable; however, research in this field is limited, with multiple disciplines, approaches and paradigms influencing it. These variations prevent a cohesive approach, and therefore prevent the accumulation of research findings into a body of knowledge. A theoretical framework serves to guide research, determine variables and influence data analysis, and is central to the quest for ongoing knowledge development. If this gap is left unaddressed, health services research will continue in an ad hoc manner, preventing the full utilisation of outcomes, recommendations and knowledge for the effective provision of health services. The purpose of this paper is to provide an integrative review of the literature and to introduce a theoretical framework for health services innovation sustainability research, based on integration and synthesis of the literature. Finally, recommendations for operationalising and testing this theory are presented.

Relevance:

100.00%

Publisher:

Abstract:

• Balancing the interests of individual autonomy and protection is an escalating challenge confronting an ageing Australian society.
• One way this is manifested is in the current ad hoc and unsatisfactory way that capacity is assessed in the context of wills, enduring powers of attorney and advance health directives.
• The absence of nationally accepted assessment guidelines results in terminological and methodological miscommunication and misunderstanding between legal and medical professionals.
• Expectations between legal and medical professionals can be clarified to provide satisfactory capacity assessments based upon the development of a sound assessment paradigm.

Relevance:

100.00%

Publisher:

Abstract:

Millions of people with print disabilities are denied the right to read. While some important efforts have been made to convert standard books to accessible formats and to create accessible repositories, these have so far addressed this crisis only in an ad hoc way. This article argues that universally designed ebook libraries have the potential to substantially enable persons with print disabilities. As a case study of what is possible, we analyse 12 academic ebook libraries to map their levels of accessibility. The positive results from this study indicate that universally designed ebooks are more than possible; they exist. While the results are positive, however, we also found that most ebook libraries have some features that frustrate full accessibility, and some present critical barriers for people with disabilities. Based on these findings, we consider that some combination of private pressure and public law is both possible and necessary to advance the right-to-read cause. With access improving and recent advances in international law, now is the time to push for universal design and equality.

Relevance:

100.00%

Publisher:

Abstract:

This pilot project investigated the existing practices and processes of Proficient, Highly Accomplished and Lead teachers in the interpretation, analysis and implementation of National Assessment Program – Literacy and Numeracy (NAPLAN) data. A qualitative case study approach was the chosen methodology, with nine teachers across a variety of school sectors interviewed. Themes and sub-themes identified from the participants’ interview responses reveal the ways in which Queensland teachers work with NAPLAN data. The data showed that individual schools and teachers generally adopted their own ways of working with data, with approaches ranging from individual/ad hoc to hierarchical or whole-school. Findings also revealed that data are the responsibility of various persons within the school hierarchy; some work with the data electronically whilst others rely on manual manipulation. Manipulation of data serves various purposes, including tracking performance, value adding and targeting programmes at specific groups of students, for example the gifted and talented. Whilst all participants had knowledge of intervention programmes and of how practice could be modified, there were large inconsistencies in knowledge and skills across schools. Some see the use of data as a mechanism for accountability, whilst others mention data with regard to changing the school culture and identifying best practice. Overall, the findings showed inconsistencies in approach to focus area 5.4 (interpreting student data) of the Australian Professional Standards for Teachers. Recommendations therefore include a more national approach to the use of educational data.

Relevance:

100.00%

Publisher:

Abstract:

Determining the key variables of transportation disadvantage remains a challenge, as the variables are commonly selected using ad hoc techniques. In order to identify the variables, this research develops a transportation disadvantage framework based on the capability approach. The developed framework is statistically analysed using partial least squares-based software to determine its fit. The statistical analysis identifies the mobility and socioeconomic variables that significantly influence transportation disadvantage. For the case of Brisbane, Australia, the research reveals the key socioeconomic variables to be household structure, the presence of dependent family members, vehicle ownership, and driving licence possession.
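
The abstract does not name the software or enumerate the candidate variables, so the following is only a rough sketch of the idea of screening variables with partial least squares. It uses scikit-learn's PLS regression, a simpler relative of the PLS path modelling such framework testing typically relies on; the file name, column names and outcome score are all hypothetical.

```python
# A minimal sketch of screening candidate variables with partial least
# squares. The study used PLS-based path modelling software; this simpler
# PLS regression only illustrates the idea. All names are hypothetical.
import pandas as pd
from sklearn.cross_decomposition import PLSRegression

df = pd.read_csv("household_travel_survey.csv")  # hypothetical input file

predictors = ["household_structure", "dependent_members",
              "vehicle_ownership", "driving_licence"]
X = df[predictors].to_numpy(dtype=float)
y = df["disadvantage_score"].to_numpy(dtype=float)  # hypothetical outcome

pls = PLSRegression(n_components=2)
pls.fit(X, y)

# Variables with larger absolute coefficients contribute more to the
# latent components that explain the disadvantage score.
for name, coef in zip(predictors, pls.coef_.ravel()):
    print(f"{name:22s} {coef:+.3f}")
```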

Relevance:

100.00%

Publisher:

Abstract:

This paper describes the development and experimental evaluation of a novel vision-based Autonomous Surface Vehicle designed to perform coordinated docking manoeuvres with a target, such as an Autonomous Underwater Vehicle, on the water’s surface. The system architecture integrates two small processor units: the first performs vehicle control and implements a virtual-force obstacle avoidance and docking strategy, while the second performs vision-based target segmentation and tracking. Furthermore, the architecture utilises wireless sensor network technology, allowing the vehicle to be observed by, and even integrated within, an ad-hoc sensor network. The system's performance is demonstrated through real-world experiments.
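
The paper's control law is not reproduced in the abstract. As a hedged illustration of what a virtual-force strategy generally looks like, the sketch below combines one attractive force toward the dock with repulsive forces from nearby obstacles (the classic artificial potential field construction); all gains, ranges and positions are assumed values, not the authors'.

```python
# A minimal sketch of a virtual-force (artificial potential field) guidance
# law: the dock attracts the vehicle while obstacles repel it. Gains and
# ranges are illustrative assumptions.
import numpy as np

K_ATT = 1.0            # attractive gain toward the docking target
K_REP = 5.0            # repulsive gain away from obstacles
OBSTACLE_RANGE = 10.0  # metres beyond which obstacles exert no force

def virtual_force(pos, target, obstacles):
    """Sum of one attractive and several repulsive 2D force vectors."""
    force = K_ATT * (target - pos)  # pull straight toward the dock
    for obs in obstacles:
        offset = pos - obs
        d = np.linalg.norm(offset)
        if 1e-6 < d < OBSTACLE_RANGE:
            # Repulsion grows steeply as the vehicle closes on the obstacle.
            force += K_REP * (1.0 / d - 1.0 / OBSTACLE_RANGE) * offset / d**3
    return force

# Example: steer toward a dock at (50, 0) past one obstacle at (25, 2).
pos = np.array([0.0, 0.0])
f = virtual_force(pos, np.array([50.0, 0.0]), [np.array([25.0, 2.0])])
heading = np.arctan2(f[1], f[0])  # commanded heading from the force vector
print(f"force={f}, heading={np.degrees(heading):.1f} deg")
```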

Relevance:

100.00%

Publisher:

Abstract:

The Kyoto Protocol is remarkable among global multilateral environmental agreements for its efforts to depoliticize compliance. However, attempts to create autonomous, arm’s length and rule-based compliance processes with extensive reliance on putatively neutral experts were only partially realized in practice in the first commitment period from 2008 to 2012. In particular, the procedurally constrained facilitative powers vested in the Facilitative Branch were circumvented, and expert review teams (ERTs) assumed pivotal roles in compliance facilitation. The ad hoc diplomatic and facilitative practices engaged in by these small teams of technical experts raise questions about the reliability and consistency of the compliance process. For the future operation of the Kyoto compliance system, it is suggested that ERTs should be confined to more technical and procedural roles, in line with their expertise. There would then be greater scope for the Facilitative Branch to assume a more comprehensive facilitative role, safeguarded by due process guarantees, in accordance with its mandate. However, if – as appears likely – the future compliance trajectories under the United Nations Framework Convention on Climate Change will include a significant role for ERTs without oversight by the Compliance Committee, it is important to develop appropriate procedural safeguards that reflect and shape the various technical and political roles these teams currently play.

Relevance:

100.00%

Publisher:

Abstract:

The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or by setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of consecutive years of surveys without detection required to minimize the net expected cost. Given that detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution obtained by stochastic dynamic programming. Application of the approach to the eradication programme for Helenium amarum reveals that the actual stopping time was a precautionary one, given the ranges for each parameter.
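
A minimal sketch of the underlying trade-off, under assumed parameter values rather than the paper's model or its rule of thumb: each survey without a detection lowers, via Bayes' rule, the probability the species is still present, and the declaration time is chosen to minimise survey costs plus the expected damage of declaring too soon.

```python
# Sketch of the stopping logic: keep surveying while one more survey costs
# less than the expected escape damage it averts. Parameter values are
# illustrative assumptions, not those of the paper.
import numpy as np

P_PRESENT = 0.5     # prior probability the species survived eradication
P_DETECT = 0.6      # probability one survey detects the species, if present
C_SURVEY = 1_000    # cost of one year of surveying
C_ESCAPE = 200_000  # damage if eradication is declared but the species persists

def expected_net_cost(n_years):
    """Expected cost of declaring after n consecutive absent surveys."""
    # Bayes: probability the species is still present after n non-detections.
    miss = P_PRESENT * (1 - P_DETECT) ** n_years
    p_present = miss / (miss + (1 - P_PRESENT))
    return n_years * C_SURVEY + p_present * C_ESCAPE

costs = [expected_net_cost(n) for n in range(31)]
best = int(np.argmin(costs))
print(f"declare eradication after {best} absent surveys "
      f"(expected cost {costs[best]:,.0f})")
```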

Relevance:

100.00%

Publisher:

Abstract:

Most standard algorithms for prediction with expert advice depend on a parameter called the learning rate. This learning rate needs to be large enough to fit the data well, but small enough to prevent overfitting. For the exponential weights algorithm, a line of prior work has established theoretical guarantees for higher and higher data-dependent tunings of the learning rate, which allow for increasingly aggressive learning. But in practice such theoretical tunings often still perform worse (as measured by their regret) than ad hoc tuning with an even higher learning rate. To close the gap between theory and practice we introduce an approach to learn the learning rate. Up to a factor that is at most (poly)logarithmic in the number of experts and the inverse of the learning rate, our method performs as well as if we knew the empirically best learning rate from a large range that includes both conservative small values and values much higher than those for which formal guarantees were previously available. Our method employs a grid of learning rates, yet runs in linear time regardless of the size of the grid.
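
As a hedged sketch of the grid construction (not the paper's algorithm, which unlike this naive version runs in linear time regardless of grid size and comes with formal guarantees), one can run exponential weights once per grid point and aggregate the copies as meta-experts with a second exponential-weights layer; the losses below are synthetic.

```python
# Two-layer sketch of "learning the learning rate": one exponential-weights
# copy per grid point, aggregated by a master exponential-weights layer.
import numpy as np

def hedge_losses(losses, eta):
    """Per-round losses suffered by exponential weights with rate eta."""
    T, K = losses.shape
    w = np.ones(K) / K
    out = np.empty(T)
    for t in range(T):
        out[t] = w @ losses[t]          # expected loss of the weighted mixture
        w *= np.exp(-eta * losses[t])   # multiplicative update
        w /= w.sum()
    return out

rng = np.random.default_rng(0)
losses = rng.random((1000, 10))        # synthetic expert losses in [0, 1]

grid = 2.0 ** np.arange(-5, 6)         # geometric grid of learning rates
meta_losses = np.stack([hedge_losses(losses, eta) for eta in grid], axis=1)

# Treat each grid copy as a meta-expert; the master uses the classic
# conservative tuning sqrt(8 ln N / T) for its own learning rate.
master = hedge_losses(meta_losses,
                      eta=np.sqrt(8 * np.log(len(grid)) / len(losses)))
print(f"master loss {master.sum():.1f} vs best grid point "
      f"{meta_losses.sum(axis=0).min():.1f}")
```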

Relevance:

100.00%

Publisher:

Abstract:

It is commonly assumed that psychiatric violence is motivated by delusions, but here the concept of a reversed impetus is explored, to understand whether delusions are formed as ad hoc or post hoc rationalizations of behaviour or in advance of the actus reus. The reflexive violence model proposes that perceptual stimuli have motivational power, and that this may trigger unwanted actions and hallucinations. The model is based on the theory of ecological perception, in which the opportunities enabled by an object are cues to act. As an apple triggers a desire to eat, a gun triggers a desire to shoot. These affordances (as they are called) are part of the perceptual apparatus; they allow the direct recognition of objects, and in emergencies they enable the fastest possible reactions. Even under normal circumstances, the presence of a weapon will trigger inhibited violent impulses, as will the presence of a victim; but under normal circumstances these affordances do not lead to violence because negative action impulses are totally inhibited, whereas in psychotic illness negative action impulses are treated as emergencies and bypass frontal inhibitory circuits. What would have been object recognition becomes a blind automatic action. A range of mental illnesses can cause inhibition to be bypassed. At its most innocuous, this causes simple hallucinations, in which the motivational power of an object is misattributed. But ecological perception may also have the power to trigger serious violence: a kind that is devoid of motive or planning and is often shrouded in amnesia or post hoc rationalizing delusions.

Relevance:

100.00%

Publisher:

Abstract:

The proliferation of the web presents an unsolved problem of automatically analyzing billions of pages of natural language. We introduce a scalable algorithm that clusters hundreds of millions of web pages into hundreds of thousands of clusters. It does this on a single mid-range machine, using efficient algorithms and compressed document representations. It is applied to two web-scale crawls covering tens of terabytes: ClueWeb09 and ClueWeb12 contain 500 and 733 million web pages respectively, and were clustered into 500,000 to 700,000 clusters. To the best of our knowledge, such fine-grained clustering has not previously been demonstrated. Previous approaches clustered a sample, which limits the maximum number of discoverable clusters. The proposed EM-tree algorithm uses the entire collection in clustering and produces several orders of magnitude more clusters than existing algorithms. Fine-grained clustering is necessary for meaningful clustering in massive collections, where the number of distinct topics grows linearly with collection size. These fine-grained clusters show improved cluster quality when assessed with two novel evaluations, using ad hoc search relevance judgments and spam classifications for external validation. These evaluations solve the problem of assessing the quality of clusters where categorical labeling is unavailable and infeasible.
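
The EM-tree algorithm itself optimizes an m-way tree of cluster means over compressed document signatures; as a much simpler stand-in that conveys only the tree-shaped, fine-grained character of the approach, the sketch below recursively partitions toy document vectors with mini-batch k-means until the leaves are small. Parameters and data are illustrative assumptions, not the paper's method.

```python
# Toy stand-in for fine-grained clustering with a cluster tree: recursively
# split the collection with mini-batch k-means until leaves are small.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

def cluster_tree(data, idx, branching=8, max_leaf=100, depth=0, max_depth=6):
    """Recursively split data[idx]; return one index array per leaf cluster."""
    if len(idx) <= max_leaf or depth >= max_depth:
        return [idx]
    km = MiniBatchKMeans(n_clusters=branching, n_init=3).fit(data[idx])
    leaves = []
    for c in range(branching):
        members = idx[km.labels_ == c]
        if len(members) > 0:
            leaves.extend(cluster_tree(data, members, branching,
                                       max_leaf, depth + 1, max_depth))
    return leaves

rng = np.random.default_rng(0)
docs = rng.random((20_000, 64)).astype(np.float32)  # toy document vectors
leaves = cluster_tree(docs, np.arange(len(docs)))
print(f"{len(leaves)} leaf clusters")
```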

Relevance:

100.00%

Publisher:

Abstract:

‘Complexity’ is a term that is increasingly prevalent in conversations about building capacity for 21st Century professional engineers. Society is grappling with the urgent and challenging reality of accommodating seven billion people, meeting needs and innovating lifestyle improvements in ways that do not destroy the atmospheric, biological and oceanic systems critical to life. Over the last two decades in particular, engineering educators have been active in attempting to build capacity amongst professionals to deliver ‘sustainable development’ in this rapidly changing global context. However, the curriculum literature clearly points to a lack of significant progress, with efforts best described as ad hoc and highly varied. Given the limited timeframes for action to curb environmental degradation proposed by scientists and intergovernmental agencies, the authors of this paper propose that it is imperative for curriculum renewal towards education for sustainable development to proceed rapidly, systemically, and in a transformational manner. Within this context, the paper discusses the need to consider a multiple-track approach to building capacity for 21st Century engineering, including priorities and timeframes for undergraduate and postgraduate curriculum renewal. The paper begins with a contextual discussion of the term complexity and how it relates to life in the 21st Century. The authors then present a whole-of-system approach for planning and implementing rapid curriculum renewal that addresses the critical roles of several generations of engineering professionals over the next three decades. The paper concludes with observations regarding engaging with this approach in the context of emerging accreditation requirements and existing curriculum renewal frameworks.

Relevance:

100.00%

Publisher:

Abstract:

Background: Foot dorsiflexion plays an essential role in controlling both balance and human gait. Electromyography (EMG) and sonomyography (SMG) can provide information on several aspects of muscle function. The aim was to establish the relationship between EMG and SMG variables during isotonic contractions of the foot dorsiflexors. Methods: Twenty-seven healthy young adults performed the foot dorsiflexion test on a device designed ad hoc for the test. The EMG variables were the maximum peak and the area under the curve; the muscle architecture (SMG) variables were muscle thickness and pennation angle. Descriptive statistical analysis, inferential analysis and a multivariate linear regression model were carried out, with statistical significance set at p < 0.05. Results: The correlation between the EMG and SMG variables was r = 0.462 (p < 0.05). The linear regression model predicting the dependent variable “normalized tibialis anterior (TA) peak” from the independent variables “pennation angle” and “muscle thickness” was significant (p = 0.002), with an explained variance of R² = 0.693 and SEE = 0.16. Conclusions: There is a significant relationship, and a quantifiable degree of contribution, between EMG and SMG variables during isotonic contractions of the TA muscle. Our results suggest that EMG and SMG can be feasible tools for the monitoring and assessment of the foot dorsiflexors. Parameterization and assessment of the TA muscle are relevant because increased strength accelerates the recovery of lower limb injuries.
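
The underlying data are not public, so the following only sketches the reported model form: an ordinary least squares regression of the normalized TA peak on pennation angle and muscle thickness, using statsmodels. The file and column names are hypothetical.

```python
# Sketch of the reported multivariate linear model: normalized TA peak
# regressed on pennation angle and muscle thickness. Names are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("dorsiflexion_trials.csv")  # hypothetical input file

X = sm.add_constant(df[["pennation_angle", "muscle_thickness"]])
y = df["peak_norm_ta"]

model = sm.OLS(y, X).fit()
print(model.summary())                 # coefficients, p-values, fit statistics
print(f"R^2 = {model.rsquared:.3f}")   # compare with the reported 0.693
```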

Relevance:

100.00%

Publisher:

Abstract:

For the first decade of its existence, the concept of citizen journalism has described an approach which was seen as a broadening of the participant base in journalistic processes, but still involved only a comparatively small subset of overall society – for the most part, citizen journalists were news enthusiasts and “political junkies” (Coleman, 2006) who, as some exasperated professional journalists put it, “wouldn’t get a job at a real newspaper” (The Australian, 2007), but nonetheless followed many of the same journalistic principles. The investment – if not of money, then at least of time and effort – involved in setting up a blog or participating in a citizen journalism Website remained substantial enough to prevent the majority of Internet users from engaging in citizen journalist activities to any significant extent; what emerged in the form of news blogs and citizen journalism sites was a new online elite which for some time challenged the hegemony of the existing journalistic elite, but gradually also merged with it. The mass adoption of next-generation social media platforms such as Facebook and Twitter, however, has led to the emergence of a new wave of quasi-journalistic user activities which now much more closely resemble the “random acts of journalism” which JD Lasica envisaged in 2003. Social media are not exclusively or even predominantly used for citizen journalism; instead, citizen journalism is now simply a by-product of user communities engaging in exchanges about the topics which interest them, or tracking emerging stories and events as they happen. Such platforms – and especially Twitter with its system of ad hoc hashtags that enable the rapid exchange of information about issues of interest – provide spaces for users to come together to “work the story” through a process of collaborative gatewatching (Bruns, 2005), content curation, and information evaluation which takes place in real time and brings together everyday users, domain experts, journalists, and potentially even the subjects of the story themselves. Compared to the spaces of news blogs and citizen journalism sites, but also of conventional online news Websites, which are controlled by their respective operators and inherently position user engagement as a secondary activity to content publication, these social media spaces are centred around user interaction, providing a third-party space in which everyday as well as institutional users, laypeople as well as experts converge without being able to control the exchange. Drawing on a number of recent examples, this article will argue that this results in a new dynamic of interaction and enables the emergence of a more broadly-based, decentralised, second wave of citizen engagement in journalistic processes.