879 results for "false acceptance"
Abstract:
Despite a wide acceptance that primary producers in Australia subscribe to a stewardship ethic, land and water degradation remains an ongoing problem. Recent calculations suggest that the economic cost of Australia's environmental degradation amounts to more than A$3.5 billion a year, with the estimated cost of managing (not overcoming) problems of salinity, acidification and soil erosion totalling A$60 billion over the next decade. This paper argues that stewardship itself is an unsatisfactory concept when looking to landholders to respond to environmental problems, for rarely does an attitude of stewardship translate into improved natural resource management practices on private land. Whilst there is some acceptance of the environmental problem among primary producers, a number of external constraints may also impede the uptake of conservation-orientated practices. In light of the prevailing accounts of poor adoption of sustainable practices, a number of policy options are reviewed in this paper, including formal regional partnerships, regulatory frameworks and market-based measures. It is concluded that the contentious nature of some of these new opportunities for change means that any moves aimed at reversing environmental degradation in Australia will be slow.
Abstract:
The international tax system, designed a century ago, has not kept pace with the modern multinational entity, rendering it ineffective in taxing many modern businesses according to economic activity. One of those modern multinational entities is the multinational financial institution (MNFI). The recent global financial crisis provides a particularly relevant and significant example of the failure of the current system on a global scale. The modern MNFI is increasingly undertaking more globalised and complex trading operations. A primary reason for the globalisation of financial institutions is that they typically ‘follow-the-customer’ into jurisdictions where international capital and international investors are required. The International Monetary Fund (IMF) recently reported that from 1995 to 2009, foreign bank presence in developing countries grew by 122 per cent. The same study indicates that foreign banks have a 20 per cent market share in OECD countries and 50 per cent in emerging markets and developing countries. Hence, most significant is the fact that MNFIs are increasingly undertaking an intermediary role in developing economies, where they are financing core business activities such as mining and tourism. IMF analysis also suggests that in the future, foreign bank expansion will be greatest in emerging economies. The difficulties for developing countries in applying current international tax rules, especially the current traditional transfer pricing regime, are particularly acute in relation to MNFIs, which are the biggest users of tax havens and offshore finance. This paper investigates whether a unitary taxation approach which reflects economic reality would more easily and effectively ensure that the profits of MNFIs are taxed in the jurisdictions which give rise to those profits.
It has previously been argued that the uniqueness of MNFIs results in a failure of the current system to accurately allocate profits and that unitary tax as an alternative could provide a sounder allocation model for international tax purposes. This paper goes a step further and examines the practicalities of the implementation of unitary taxation for MNFIs in terms of the key components of such a regime, along with their implications. This paper adopts a two-step approach in considering the implications of unitary taxation as a means of improved corporate tax coordination, which requires international acceptance and agreement. First, the definitional issues of the unitary MNFI are examined and, second, an appropriate allocation formula for this sector is investigated. To achieve this, the paper asks, first, how the financial sector should be defined for the purposes of unitary taxation and what should constitute a unitary business for that sector and, second, what is the ‘best practice’ model of an allocation formula for the purposes of the apportionment of the profits of the unitary business of a financial institution.
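The allocation-formula question in the second step above can be made concrete with a classic formulary-apportionment calculation of the kind long used in US state taxation (an equal-weighted three-factor formula over property, payroll and sales). This is a generic illustration, not the ‘best practice’ formula the paper investigates; the factor names and figures below are hypothetical.

```python
def apportion_profit(global_profit, factors, weights=None):
    """Allocate a unitary group's global profit to one jurisdiction.

    factors: factor name -> the jurisdiction's share of that factor
             (e.g. 0.20 means 20% of group payroll is located there).
    weights: factor name -> weight; defaults to equal weights.
    """
    if weights is None:
        weights = {name: 1.0 / len(factors) for name in factors}
    share = sum(weights[name] * factors[name] for name in factors)
    return global_profit * share

# Hypothetical example: A$300m of group profit, equal-weighted
# three-factor formula; the jurisdiction hosts 10% of property,
# 20% of payroll and 30% of sales.
allocation = apportion_profit(
    300.0,
    {"property": 0.10, "payroll": 0.20, "sales": 0.30},
)
# share = (0.10 + 0.20 + 0.30) / 3 = 0.20, so 60.0 is allocated
```

Sector-specific formulas for financial institutions typically swap in different factors (for example, deposits or assets in place of property), which is precisely the design question the paper examines.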
Abstract:
Using Gray and McNaughton’s revised Reinforcement Sensitivity Theory (RST), this study investigated the extent to which the Behavioural Approach System (BAS) and the Fight-Flight-Freeze System (FFFS) influence the processing of gain-framed and loss-framed road safety messages and subsequent message acceptance. It was predicted that stronger BAS sensitivity and FFFS sensitivity would be associated with greater processing and acceptance of the gain-framed messages and loss-framed messages, respectively. Young drivers (N = 80, aged 17–25 years) viewed one of four road safety messages and completed a lexical decision task to assess message processing. Both self-report (e.g., Corr-Cooper RST-PQ) and behavioural measures (i.e., CARROT and Q-Task) were used to assess BAS and FFFS traits. Message acceptance was measured via self-report ratings of message effectiveness, behavioural intentions, attitudes and subsequent driving behaviour. The results are discussed in the context of the effect that differences in reward and punishment sensitivities may have on message processing and message acceptance.
Abstract:
Aim The aim of this paper was to explore the concept of expertise in nursing from the perspective of how it relates to current driving forces in health care, and to discuss the potential barriers to acceptance of nursing expertise in a climate in which quantification of value and cost containment run high on agendas. Background Expert nursing practice can be argued to be central to high quality, holistic, individualized patient care. However, changes in government policy which have led to the inception of comprehensive guidelines or protocols of care are in danger of relegating the ‘expert nurse’ to being an icon of the past. Indeed, it could be argued that expert nurses are an expensive commodity within the nursing workforce. Consequently, this change to the use of clinical guidelines calls into question how expert nursing practice will develop within this framework of care. Method The article critically reviews the evidence related to the role of the expert nurse in an attempt to identify the key concepts and ideas, and how the inception of care protocols has implications for their role. Conclusion Nursing expertise which focuses on the provision of individualized, holistic care and is based largely on intuitive decision making cannot, and should not, be reduced to being articulated in positivist terms. However, the dominant power and decision-making focus in health care means that nurses must be confident in articulating the value of a concept which may be outside the scope of knowledge of those with whom they are debating. Relevance to clinical practice The principles of abduction or fuzzy logic may be useful in assisting nurses to explain, in terms which others can comprehend, the value of nursing expertise.
Abstract:
This paper explores the concept of expertise in intensive care nursing practice from the perspective of its relationship to the current driving forces in healthcare. It discusses the potential barriers to acceptance of nursing expertise in a climate in which quantification of value and cost containment run high on agendas. It argues that nursing expertise which focuses on the provision of individualised, holistic care and which is based largely on intuitive decision-making cannot and should not be reduced to being articulated in positivist terms. The principles of abduction or fuzzy logic, derived from computer science, may be useful in assisting nurses to explain, in terms which others can comprehend, the value of nursing expertise.
Abstract:
For clinical use, in electrocardiogram (ECG) signal analysis it is important to detect not only the centre of the P wave, the QRS complex and the T wave, but also the time intervals, such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, or the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms to detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and from beat to beat. To address this variability we propose the use of Markov-chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study to investigate the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal, where details such as wave morphology and noise levels are variable.
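The Markov-chain Monte Carlo idea above can be sketched in miniature: a random-walk Metropolis-Hastings chain fitting the amplitude and centre of a single Gaussian "R wave" to noisy synthetic data. This is a minimal illustration under assumed step sizes and a fixed wave width, not the authors' actual model.

```python
import math
import random

def gaussian_wave(t, amp, center, width=0.02):
    """A single ECG wave modelled as a Gaussian bump (an assumption)."""
    return amp * math.exp(-((t - center) ** 2) / (2 * width ** 2))

def log_likelihood(params, ts, ys, sigma=0.05):
    """Gaussian-noise log likelihood (up to an additive constant)."""
    amp, center = params
    return -sum((y - gaussian_wave(t, amp, center)) ** 2
                for t, y in zip(ts, ys)) / (2 * sigma ** 2)

def metropolis(ts, ys, start, n_steps=5000, seed=1):
    """Random-walk Metropolis-Hastings over (amplitude, centre)."""
    rng = random.Random(seed)
    current = list(start)
    ll = log_likelihood(current, ts, ys)
    for _ in range(n_steps):
        proposal = [current[0] + rng.gauss(0, 0.02),
                    current[1] + rng.gauss(0, 0.005)]
        ll_new = log_likelihood(proposal, ts, ys)
        if math.log(rng.random()) < ll_new - ll:  # accept/reject step
            current, ll = proposal, ll_new
    return current

# Synthetic "R wave": amplitude 1.0 centred at t = 0.4 s, plus noise
data_rng = random.Random(0)
ts = [i / 200 for i in range(200)]
ys = [gaussian_wave(t, 1.0, 0.4) + data_rng.gauss(0, 0.02) for t in ts]
amp, center = metropolis(ts, ys, start=(0.8, 0.42))
```

A full ECG model would fit all waves (P, QRS, T) jointly and treat noise level and morphology as unknowns; the chain structure, however, is the same.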
Abstract:
User-generated content plays a pivotal role in current social media. The main focus, however, has been on explicitly generated user content such as photos, videos and status updates on different social networking sites. In this paper, we explore the potential of implicitly generated user content, based on users’ online consumption behaviors. It is technically feasible to record users’ consumption behaviors on mobile devices and share that with relevant people. Mobile devices with such capabilities could enrich social interactions around the consumed content, but they may also threaten users’ privacy. To understand the potential of this design direction we created and evaluated a low-fidelity prototype intended for photo sharing within private groups. Our prototype incorporates two design concepts, namely FingerPrint and MoodPhotos, that leverage users’ consumption history and emotional responses. In this paper, we report user values and user acceptance of this prototype from three participatory design workshops.
Abstract:
This paper presents a method for the continuous segmentation of dynamic objects using only a vehicle-mounted monocular camera without any prior knowledge of the object’s appearance. Prior work in online static/dynamic segmentation is extended to identify multiple instances of dynamic objects by introducing an unsupervised motion clustering step. These clusters are then used to update a multi-class classifier within a self-supervised framework. In contrast to many tracking-by-detection based methods, our system is able to detect dynamic objects without any prior knowledge of their visual appearance, shape or location. Furthermore, the classifier is used to propagate labels of the same object in previous frames, which facilitates the continuous tracking of individual objects based on motion. The proposed system is evaluated using recall and false alarm metrics, in addition to a new multi-instance labelled dataset, to evaluate the performance of segmenting multiple instances of objects.
Abstract:
Purpose The repair, maintenance, minor alteration and addition (RMAA) sector has been expanding in many developed cities. Safety problems of the RMAA sector have attracted the attention of many governments. This study has the objectives of comparing the level of safety climate of workers, supervisors and managers in the RMAA sector, and explaining/predicting the impact of safety climate on injury occurrence of workers, supervisors and managers. Design/methodology/approach A questionnaire survey was administered to RMAA contracting companies in Hong Kong. Findings When comparing the safety climate perception of workers, supervisors and managers in the RMAA sector, the supervisors group had the lowest mean safety climate score. Results showed that a positive workforce safety attitude and acceptance of safety rules and regulations reduced the workers’ likelihood of having injuries. A reasonable production schedule led to a lower probability of supervisors being injured. Management commitment and effective safety management reduced the probability of managers being injured. Originality/value This study revealed variations of safety climate at the different levels in the organizational hierarchy and their varying influence on safety performance of the RMAA sector. Safety of RMAA works could be improved by promulgating specific safety measures at the different hierarchy levels.
Abstract:
E-mail spam remains a scourge and a menacing nuisance for users and for internet and network service operators and providers, in spite of the anti-spam techniques available; spammers relentlessly circumvent the anti-spam software installed on both the client and server sides of fixed and mobile devices. This continuous evasion degrades the capabilities of these anti-spam techniques, none of which provides a comprehensive, reliable solution to the problem posed by spam and spammers. A major problem arises, for instance, when these techniques misclassify legitimate e-mail as spam (a false positive), or fail to block spam on the originating SMTP server (a false negative): the spam passes on to the receiver, while the originating server has no alert mechanism to indicate that the spam it was designed to prevent has slipped through; the receiver’s SMTP server may in turn fail to stop the spam from reaching the user’s device, again with no alert mechanism to record this failure, causing a staggering cost in lost time, effort and money. This paper takes a comparative overview of the literature on these anti-spam techniques, especially the filtering technologies designed to prevent spam, examining their merits and demerits with a view to enhancing their capabilities, and offers evaluative analytical recommendations that will be the subject of further research.
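The false positives and false negatives described above are straightforward to quantify once a filter's decisions can be compared against true labels. The sketch below is a generic illustration (the labels and messages are invented), not part of any particular anti-spam product.

```python
def spam_filter_error_rates(predictions, labels):
    """Compute false-positive and false-negative rates for a spam filter.

    predictions/labels: sequences of 'spam' or 'ham' (legitimate mail).
    False positive: legitimate mail misclassified as spam.
    False negative: spam that slips through as legitimate mail.
    """
    fp = sum(1 for p, y in zip(predictions, labels)
             if p == 'spam' and y == 'ham')
    fn = sum(1 for p, y in zip(predictions, labels)
             if p == 'ham' and y == 'spam')
    n_ham = sum(1 for y in labels if y == 'ham')
    n_spam = sum(1 for y in labels if y == 'spam')
    fpr = fp / n_ham if n_ham else 0.0
    fnr = fn / n_spam if n_spam else 0.0
    return fpr, fnr

# Invented example: five messages, two legitimate, three spam
labels      = ['spam', 'ham',  'spam', 'ham', 'spam']
predictions = ['spam', 'spam', 'ham',  'ham', 'spam']
fpr, fnr = spam_filter_error_rates(predictions, labels)
# One of two ham messages blocked (fpr = 0.5);
# one of three spam messages delivered (fnr = 1/3)
```

The asymmetry matters in practice: a false positive silently loses legitimate mail, which is usually considered far more costly than letting one spam message through.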
Abstract:
Background Cancer monitoring and prevention relies on the timely notification of cancer cases. However, the abstraction and classification of cancer from the free text of pathology reports and other relevant documents, such as death certificates, are complex and time-consuming activities. Aims In this paper, approaches for the automatic detection of notifiable cancer cases as the cause of death from free-text death certificates supplied to Cancer Registries are investigated. Method A number of machine learning classifiers were studied. Features were extracted using natural language processing techniques and the Medtex toolkit, and included stemmed words, bi-grams, and concepts from the SNOMED CT medical terminology. The baseline was a keyword spotter using keywords extracted from the long descriptions of ICD-10 cancer-related codes. Results Death certificates with notifiable cancer listed as the cause of death can be effectively identified with the methods studied in this paper. A Support Vector Machine (SVM) classifier achieved the best performance, with an overall F-measure of 0.9866 when evaluated on a set of 5,000 free-text death certificates using the token stem feature set. The SNOMED CT concept plus token stem feature set reached the lowest variance (0.0032) and false negative rate (0.0297) while achieving an F-measure of 0.9864. The SVM classifier accounts for the first 18 of the top 40 evaluated runs and is the most robust classifier, with a variance of 0.001141, half that of the other classifiers. Conclusion The choice of features had the greatest influence on classifier performance, although the type of classifier employed also affects performance. In contrast, the feature weighting scheme had a negligible effect on performance.
Specifically, it is found that stemmed tokens, with or without SNOMED CT concepts, make the most effective features when combined with an SVM classifier.
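The token stem and bigram features described above can be sketched with standard-library code alone. The crude suffix stripper below merely stands in for a real stemmer, and the example text is invented; this is an illustrative assumption, not the Medtex pipeline used in the study.

```python
import re

def crude_stem(word):
    """Very crude suffix stripper standing in for a real stemmer."""
    for suffix in ('ation', 'ing', 'oma', 'ous', 'es', 's'):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

def extract_features(text):
    """Stemmed unigram and bigram features from free text."""
    tokens = [crude_stem(t) for t in re.findall(r"[a-z]+", text.lower())]
    features = set(tokens)                                   # unigrams
    features.update(f"{a}_{b}" for a, b in zip(tokens, tokens[1:]))  # bigrams
    return features

# Invented death-certificate fragment
feats = extract_features("Metastatic carcinoma of the lung")
# yields stems such as "carcin" and bigrams such as "metastatic_carcin"
```

In the study's setting, such sparse binary feature sets (optionally augmented with SNOMED CT concept identifiers) are what the SVM classifier consumes.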
Abstract:
There is no doubt that place branding is a powerful and ubiquitous practice deployed around the globe. Parallel to its acceptance and development as a distinct discipline is an understanding that place branding as responsible practice offers the means to achieve widespread economic, social and cultural benefits. Drawing on work around place and identity in cultural geography and cultural studies, this paper engages critically with this vision. Specifically, it challenges the widely-held assumption that the relationship between place branding and place identity is fundamentally reflective, arguing instead that this relationship is inherently generative. This shift in perspective, explored in relation to current responsible place branding practice, is central to the realisation of place branding as a force for good.
Abstract:
L-Amino acid oxidases (LAAOs) are useful catalysts for the deracemisation of racemic amino acid substrates when combined with abiotic reductants. The gene nadB encoding the L-aspartate amino acid oxidase from Pseudomonas putida (PpLASPO) has been cloned and expressed in E. coli. The purified PpLASPO enzyme displayed a K_M for L-aspartic acid of 2.26 mM and a k_cat of 10.6 s⁻¹, with lower activity also displayed towards L-asparagine, for which pronounced substrate inhibition was also observed. The pH optimum of the enzyme was recorded at pH 7.4. The enzyme was stable for 60 min at up to 40 °C, but rapid losses in activity were observed at 50 °C. A mutational analysis of the enzyme, based on its sequence homology with the LASPO from E. coli of known structure, appeared to confirm roles in substrate binding or catalysis for residues His244, His351, Arg386 and Arg290, and also for Thr259 and Gln242. The high activity of the enzyme, and its promiscuous acceptance of both L-asparagine and L-glutamate as substrates, albeit with low activity, suggests that PpLASPO may provide a good model enzyme for future evolution studies towards AAOs with altered or improved properties.
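The kinetic constants reported above can be read through the standard Michaelis-Menten rate law; the snippet below only illustrates what K_M and k_cat mean (at [S] = K_M the rate is half of V_max) and assumes a hypothetical enzyme concentration.

```python
def michaelis_menten_rate(s, k_cat, k_m, enzyme_conc):
    """Initial reaction rate v = k_cat * [E] * [S] / (K_M + [S]).

    Units are consistent as long as s and k_m share units (here mM)
    and k_cat is per second; substrate inhibition (seen for
    L-asparagine in the abstract) is deliberately not modelled.
    """
    return k_cat * enzyme_conc * s / (k_m + s)

# Constants for L-aspartic acid from the abstract
K_M, K_CAT = 2.26, 10.6  # mM, s^-1
E = 1.0                  # hypothetical enzyme concentration

# At [S] = K_M the rate is exactly half of V_max = k_cat * [E]
v_half = michaelis_menten_rate(K_M, K_CAT, K_M, enzyme_conc=E)
v_max = K_CAT * E
```

Fitting this hyperbola (or an inhibition-extended variant) to measured initial rates is how K_M and k_cat values such as those above are typically obtained.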
Abstract:
Background Some apple (Malus × domestica Borkh.) varieties have attractive striping patterns, a quality attribute that is important for determining apple fruit market acceptance. Most apple cultivars (e.g. 'Royal Gala') produce fruit with a defined fruit pigment pattern, but in the case of 'Honeycrisp' apple, trees can produce fruits of two different kinds: striped and blushed. The causes of this phenomenon are unknown. Results Here we show that striped areas of 'Honeycrisp' and 'Royal Gala' are due to sectorial increases in anthocyanin concentration. Transcript levels of the major biosynthetic genes and MYB10, a transcription factor that upregulates apple anthocyanin production, correlated with increased anthocyanin concentration in stripes. However, nucleotide changes in the promoter and coding sequence of MYB10 do not correlate with skin pattern in 'Honeycrisp' and other cultivars differing in peel pigmentation patterns. A survey of methylation levels throughout the coding region of MYB10 and a 2.5 kb region 5' of the ATG translation start site indicated that an area 900 bp long, starting 1400 bp upstream of the translation start site, is highly methylated. Cytosine methylation was present in all three contexts, with higher methylation levels observed for CHH and CHG (where H is A, C or T) than for CG. Comparisons of methylation levels of the MYB10 promoter in 'Honeycrisp' red and green stripes indicated that they correlate with peel phenotypes, with an enrichment of methylation observed in green stripes. Conclusions Differences in anthocyanin levels between red and green stripes can be explained by differential transcript accumulation of MYB10. Different levels of MYB10 transcript in red versus green stripes are inversely associated with methylation levels in the promoter region. Although observed methylation differences are modest, trends are consistent across years and differences are statistically significant.
Methylation may be associated with the presence of a TRIM retrotransposon within the promoter region, but the presence of the TRIM element alone cannot explain the phenotypic variability observed in 'Honeycrisp'. We suggest that methylation in the MYB10 promoter is more variable in 'Honeycrisp' than in 'Royal Gala', leading to more variable color patterns in the peel of this cultivar.
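The three cytosine contexts mentioned above (CG, CHG and CHH, where H is A, C or T) can be classified mechanically from a DNA sequence. The short sketch below is a generic illustration with an invented sequence, not the bisulfite-analysis pipeline used in the study.

```python
def cytosine_contexts(seq):
    """Classify each cytosine in a DNA sequence as CG, CHG or CHH.

    H denotes A, C or T. Returns (position, context) pairs;
    cytosines too close to the sequence end get context None.
    """
    contexts = []
    seq = seq.upper()
    for i, base in enumerate(seq):
        if base != 'C':
            continue
        nxt = seq[i + 1:i + 3]          # up to two downstream bases
        if nxt.startswith('G'):
            contexts.append((i, 'CG'))
        elif len(nxt) == 2 and nxt[1] == 'G':
            contexts.append((i, 'CHG'))
        elif len(nxt) == 2:
            contexts.append((i, 'CHH'))
        else:
            contexts.append((i, None))  # truncated context at the end
    return contexts

# Invented 10-bp example sequence
ctx = cytosine_contexts("ACGTCAGCTT")
# positions 1, 4 and 7 fall in CG, CHG and CHH contexts respectively
```

Tallying methylated versus unmethylated calls per context is what yields the per-context methylation levels compared between red and green stripes.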
Abstract:
The planning of IMRT treatments requires a compromise between dose conformity (complexity) and deliverability. This study investigates established and novel treatment complexity metrics for 122 IMRT beams from prostate treatment plans. The Treatment and Dose Assessor software was used to extract the necessary data from exported treatment plan files and calculate the metrics. For most of the metrics, there was strong overlap between the calculated values for plans that passed and failed their quality assurance (QA) tests. However, statistically significant variation between plans that passed and failed QA measurements was found for the established modulation index and for a novel metric describing the proportion of small apertures in each beam. The ‘small aperture score’ provided threshold values which successfully distinguished deliverable treatment plans from plans that did not pass QA, with a low false negative rate.
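As a rough illustration of a metric of this kind, the sketch below computes the fraction of open leaf-pair apertures narrower than a threshold. The exact definition and threshold of the study's 'small aperture score' are not reproduced here, so the function and its 10 mm cut-off are assumptions for illustration only.

```python
def small_aperture_score(leaf_gaps_mm, threshold_mm=10.0):
    """Fraction of open leaf pairs whose gap is below a threshold.

    leaf_gaps_mm: aperture widths (mm) for leaf pairs across all
    control points of a beam; zero means the pair is closed and is
    excluded. The 10 mm threshold is an illustrative assumption,
    not the value established in the study.
    """
    open_gaps = [g for g in leaf_gaps_mm if g > 0]
    if not open_gaps:
        return 0.0
    return sum(1 for g in open_gaps if g < threshold_mm) / len(open_gaps)

# Invented beam: four open leaf pairs, one closed
score = small_aperture_score([2.0, 5.0, 12.0, 30.0, 0.0])
# two of the four open gaps are under 10 mm, so the score is 0.5
```

A higher score flags a more heavily modulated beam with many narrow apertures, which is the kind of plan the study found more likely to fail QA measurement.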