935 results for point-to-segment algorithm
Abstract:
Person tracking systems depend on being able to locate a person accurately across a series of frames. Optical flow can be used to segment a moving object from a scene, provided the expected velocity of the moving object is known, but successful detection also relies on being able to segment the background. A problem with existing optical flow techniques is that they do not discriminate the foreground from the background, and so often detect motion (and thus the object) in the background. To overcome this problem, we propose a new optical flow technique based upon adaptive background segmentation, which determines optical flow only in regions of motion. The technique has been developed with a view to being used in surveillance systems, and our testing shows that for this application it is more effective than other standard optical flow techniques.
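As a rough illustration of the idea in the abstract above, the sketch below restricts dense optical flow to regions flagged by an adaptive background model. It is not the authors' technique: OpenCV's MOG2 subtractor, Farnebäck flow and the file name surveillance.avi are stand-in assumptions.

    # Minimal sketch: compute dense optical flow only where an adaptive
    # background model reports motion (illustrative, not the paper's method).
    import cv2

    cap = cv2.VideoCapture("surveillance.avi")            # hypothetical input file
    bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mask = bg.apply(frame)                             # adaptive foreground mask
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        flow[mask == 0] = 0                                # keep flow only in motion regions
        prev_gray = gray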
Abstract:
Purpose: Waiting for service by customers is an important problem for many financial service marketers. Two new approaches are proposed. First, customer evaluation of the service is increased with an ambient scent. Second, a cognitive variable is identified which differentiates customers by the way they value time, so that they can be segmented. Methodology: Pretests, which included focus groups highlighting financial services and a pilot test, were followed by a main sample of 607 subjects. Structural equation modelling and multivariate analysis of covariance were used for analysis. Findings: A cognitive variable, the need for time management, can be used, together with demographic and customer net worth data, to segment a customer base. Two environmental interventions, music and scent, can increase customer satisfaction among customers kept waiting in a line. Research implications: Two original approaches to a rapidly growing service marketing problem are identified. Practical implications: Service contact points can reduce the incidence of "queue rage" and enhance customer satisfaction by either or both of two simple modifications to the service environment, or by a preventive strategy of offering targeted customers an alternative. Originality: A new method of segmentation and a new environmental intervention are proposed.
Abstract:
The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders – academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that these variations may point to a dangerous disconnect between research projects and practical demands.
Abstract:
China has a reputation as an economy based on utility: the large-scale manufacture of low-priced goods. But useful values like functionality, fitness for purpose and efficiency are only part of the story. More important are what Veblen called ‘honorific’ values, arguably the driving force of development, change and value in any economy. To understand the Chinese economy therefore, it is not sufficient to point to its utilitarian aspect. Honorific status-competition is a more fundamental driver than utilitarian cost-competition. We argue that ‘social network markets’ are the expression of these honorific values, relationships and connections that structure and coordinate individual choices. This paper explores how such markets are developing in China in the area of fashion and fashion media. These, we argue, are an expression of ‘risk culture’ for high-end entrepreneurial consumers and producers alike, providing a stimulus to dynamic innovation in the arena of personal taste and comportment, as part of an international cultural system based on constant change. We examine the launch of Vogue China in 2005, and China’s reception as a fashion player among the international editions of Vogue, as an expression of a ‘decisive moment’ in the integration of China into an international social network market based on honorific values.
Abstract:
The issue of health professionals facing criminal charges of manslaughter or criminal negligence causing death or grievous bodily harm as a result of alleged negligence in their professional practice was thrown into stark relief by the recent acquittal of four physicians accused of mismanaging Canada’s blood system in the early 1980s. Stories like these, as well as international reports detailing an increase in the numbers of physicians being charged with (and in some cases convicted of) serious criminal offences as the result of alleged negligence in their professional practice, have resulted in some anxiety about the apparent increase in the incidence of such charges and their appropriateness in the healthcare context. While research has focused on the incidence, nature and appropriateness of criminal charges against health professionals, particularly physicians, for alleged negligence in their professional practice in the United Kingdom, the United States, Japan, and New Zealand, the Canadian context has yet to be examined. This article examines the Canadian context and how the criminal law is used to regulate the negligent acts or omissions of health care professionals in the course of their professional practice, and it assesses the appropriateness of such use. It is important to state that the analysis in this article does not focus on those, fortunately few, cases where a health professional has intentionally killed his or her patients, but rather on cases where patients’ deaths or grievous injuries allegedly resulted from that health professional’s negligent acts or omissions when providing health services.
Abstract:
The main objective of the thesis is to seek insights into the theory of rebound effects and to provide empirical evidence of them. Rebound effects reduce the environmental benefits of environmental policies and household behaviour changes. In particular, win-win demand-side measures, in the form of energy efficiency and household consumption pattern changes, are seen as ways for households and businesses to save money and the environment. However, these savings have environmental impacts when spent, which are known as rebound effects. This is an area that has been widely neglected by policy makers. This work extends the rebound effect literature in three important ways: (1) it incorporates the potential for variation of rebound effects with household income level; (2) it enables the isolation of direct and indirect effects for cases of energy-efficient technology adoption, and examines the relationship between these two component effects; and (3) it expands the scope of rebound effect analysis to include government taxes and subsidies. Using a case study approach, it is found that the rebound effect from household consumption pattern changes targeted at electricity is between 5 and 10%. For consumption pattern changes with reduced vehicle fuel use, the rebound effect is in the order of 20 to 30%. Higher income households in general are found to have a lower total rebound effect; however, the indirect effect becomes relatively more significant at higher household income levels. In the win-lose case of domestic photovoltaic electricity generation, it is demonstrated that negative rebound effects can occur, which can potentially amplify the environmental benefits of this action. The rebound effect from a carbon tax, which occurs due to the re-spending of raised revenues, was found to be in the range of 11-32%. Taxes and transfers between households of different income levels also have environmental implications. For example, a more progressive tax structure, with increased low-income welfare payments, is likely to increase greenhouse gas emissions. Subsidies aimed at encouraging environmentally friendly consumption habits are also subject to rebound effects, as they constitute a substitution of government expenditure for household expenditure. For policy makers, these findings point to the need to incorporate rebound effects in the environmental policy evaluation process.
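For readers unfamiliar with how a re-spending rebound effect is quantified, the following sketch works through the calculation with hypothetical numbers; none of these figures come from the thesis.

    # Hypothetical worked example of a re-spending (indirect) rebound effect.
    # All numbers are illustrative assumptions, not results from the thesis.
    direct_saving_kwh = 1000        # electricity saved by an efficiency measure
    grid_intensity = 0.9            # kg CO2e per kWh (assumed)
    direct_abatement = direct_saving_kwh * grid_intensity     # 900 kg CO2e avoided

    money_saved = 250.0             # dollars saved on electricity bills (assumed)
    respend_intensity = 0.25        # kg CO2e per dollar of general consumption (assumed)
    indirect_emissions = money_saved * respend_intensity      # 62.5 kg CO2e caused by re-spending

    rebound_pct = 100 * indirect_emissions / direct_abatement
    print(f"Rebound effect: {rebound_pct:.1f}%")               # about 7% of the expected benefit is eroded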
Abstract:
Introduction and Aims: Since the 1990s, death rates from illicit drug use in Australia have increased markedly. There is a notable gap in knowledge about changing socio-economic inequalities in drug use death rates. Some limited Australian and overseas data point to higher rates of drug death in the lowest socio-economic groups, but the paucity of available studies and their sometimes conflicting findings need to be addressed. Design and Methods: This paper uses data obtained from the Australian Bureau of Statistics (ABS) to examine changes in age-standardised drug-induced mortality rates for Australian males over the period 1981–2002. Socio-economic status was categorised as manual or non-manual work status. Results: With the rapid increase in drug-induced mortality rates in the 1990s, there was a parallel increase in socio-economic inequalities in drug-induced deaths. The decline in drug death rates from 2000 onwards was associated with a decline in socio-economic inequalities. By 2002, manual workers had drug death rates well over twice the rate of non-manual workers. Discussion: Three factors are identified which contribute to these socio-economic inequalities in mortality. First, there has been an age shift in deaths evident only for manual workers. Secondly, there was an increase in availability until 1999 and a relative decline in the cost of the drug that most often leads to drug death (heroin). Thirdly, there has been a shift to amphetamine use, which may lead to significant levels of morbidity but few deaths. [Najman JM, Toloo G, Williams GM. Increasing socio-economic inequalities in drug-induced deaths in Australia: 1981–2002.
Abstract:
The Perth Declaration on Science and Technology Education of 2007 expresses strong concern about the state of science and technology education worldwide and calls on governments to respond to a number of suggestions for establishing the structural conditions for their improved practice. The quality of school education in science and technology has never before been of such critical importance to governments, and there are three imperatives behind this. The first relates to the traditional role of science in schooling, namely the identification, motivation and initial preparation of those students who will go on to further studies for careers in all those professional fields that directly involve science and technology. A sufficient supply of these professionals is vital to the economy of all countries and to the health of their citizens. In the 21st century they are recognised everywhere as key players in ensuring that industrial and economic development occurs in a socially and environmentally sustainable way. In many countries this supply is now falling seriously short and urgently needs to be addressed. The second imperative is that sustainable technological development and many other possible societal applications of science require the support of scientifically and technologically informed citizens. Without the support and understanding of citizens, technological development can all too easily serve short-term and sectional interests: the longer-term progress of the whole society is overlooked, citizens will be confused about what should, and what should not, be supported, and the environment will continue to be destroyed rather than sustained. Sustainable development, and the potential that science and technology increasingly offer, can interact strongly with traditional values, and hence making decisions about them involves major moral choices. All students need to be prepared through their science and technology education to participate actively, as persons and as responsible citizens, in these essential and exciting possibilities. This goal is far from being generally achieved at present, but pathways to it are now more clearly understood. The third imperative derives from the changes resulting from the application of digital technologies, which are the most rapid, the most widespread, and probably the most pervasive influence that science has ever had on human society. We all, wherever we live, are part of a global communication society. Information exchange, and access to it, which have hitherto been the realm of the few, are now literally in the hands of individuals. This is leading to profound changes in the World of Work and in what is known as the Knowledge Society. Schooling is now being challenged to contribute to the development in students of an active repertoire of generic and subject-based competencies. This contrasts very strongly with existing priorities in subjects like the sciences, which have seen the size of a student’s store of established knowledge as the key measure of success. Science and technology education needs to be a key component in developing these competencies. When you add to these imperatives the possibility that a more effective education in science and technology will enable more and more citizens to delight in, and feel a share in, the great human enterprise we call Science, the case for new policy decisions is compellingly urgent.
What follows are the recommendations (and some supplementary notes) for policy makers to consider about the more operational aspects of improving science and technology education. They are listed under headings that point to the issues within each of these aspects. In the full document, a background is provided to each set of issues, including the current state of science and technology education as commonly found. Associated with each recommendation for consideration are the positive Prospects that could follow from such decision making, and the necessary Prerequisites if such bold policy decisions are to flow, as intended, into practice in science and technology classrooms.
Abstract:
Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector, there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful revisitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive. This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multi-dimensional and multi-level; this hierarchical approach to SQ measurement better reflects human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content and nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision.
Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as: • Is SQ an antecedent or consequence of the IS-Impact model, or both? • Has SQ already been addressed by existing measures of the IS-Impact model? • Is SQ a separate, new dimension of the IS-Impact model? • Is SQ an alternative conception of the IS? Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level, which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension as ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. web sites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define them. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively. That is, customers evaluate each primary dimension (or each higher level of SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach depends on the objective(s) of the study. Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overheads. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it can identify the areas that need improvement.
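A small sketch of the two measurement approaches compared at the end of the abstract, with hypothetical dimension names and ratings rather than data from the study: perceptions-only yields a single overall score, while disconfirmation computes per-dimension gaps that flag shortfalls.

    # Perceptions-only vs (measured) disconfirmation scoring, illustrative only.
    from statistics import mean

    expectations = {"reliability": 6.5, "assurance": 6.0, "tangibles": 5.5}   # assumed ratings
    perceptions  = {"reliability": 5.8, "assurance": 6.2, "tangibles": 5.0}   # assumed ratings

    # Perceptions-only: overall SQ is simply the mean perception score.
    overall_sq = mean(perceptions.values())

    # Disconfirmation: gap = perception - expectation per dimension;
    # negative gaps identify areas needing improvement.
    gaps = {d: round(perceptions[d] - expectations[d], 2) for d in perceptions}

    print(overall_sq)   # ~5.67
    print(gaps)         # {'reliability': -0.7, 'assurance': 0.2, 'tangibles': -0.5}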
Abstract:
It is important to detect and treat malnutrition in hospital patients so as to improve clinical outcomes and reduce hospital stay. The aim of this study was to develop and validate a nutrition screening tool with a simple and quick scoring system for acute hospital patients in Singapore. In this study, 818 newly admitted patients aged over 18 years were screened using five parameters that contribute to the risk of malnutrition. A dietitian blinded to the nutrition screening score assessed the same patients using the reference standard, Subjective Global Assessment (SGA), within 48 hours. Sensitivity and specificity were established using the Receiver Operating Characteristic (ROC) curve and the best cutoff scores determined. The combination of nutrition parameters with the largest Area Under the ROC Curve (AUC) was chosen as the final screening tool, which was named 3-Minute Nutrition Screening (3-MinNS). The combination of the parameters weight loss, intake and muscle wastage (3-MinNS) gave the largest AUC when compared with SGA. Using 3-MinNS, the best cutoff point to identify malnourished patients is three (sensitivity 86%, specificity 83%), and the cutoff score to identify subjects at risk of severe malnutrition is five (sensitivity 93%, specificity 86%). 3-Minute Nutrition Screening is a valid, simple and rapid tool to identify risk of malnutrition among acute hospital patients in Singapore, and it is able to differentiate patients at risk of moderate malnutrition from those at risk of severe malnutrition for prioritization and management purposes.
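The cutoff selection described above can be illustrated with a short sketch using scikit-learn's ROC utilities and Youden's index; the screening scores and SGA labels below are synthetic placeholders, not the study's data.

    # Choosing a screening cutoff from a ROC curve (synthetic data only).
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    rng = np.random.default_rng(0)
    # Synthetic screening scores (0-9) and SGA-derived malnutrition labels.
    scores = np.concatenate([rng.integers(0, 6, 300), rng.integers(3, 10, 100)])
    labels = np.concatenate([np.zeros(300), np.ones(100)])

    fpr, tpr, thresholds = roc_curve(labels, scores)
    print("AUC:", auc(fpr, tpr))

    # Youden's J picks the cutoff maximising sensitivity + specificity - 1.
    best_idx = np.argmax(tpr - fpr)
    print("cutoff:", thresholds[best_idx],
          "sensitivity:", tpr[best_idx],
          "specificity:", 1 - fpr[best_idx])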
Abstract:
As the paper’s subtitle suggests, broadband has had a remarkably checkered trajectory in Australia. It was synonymous with the early 1990s information superhighway and seemed to presage a moment in which “content is [to be] king”. It disappeared almost entirely as a public priority in the mid to late 1990s, as infrastructure and content were disconnected in services frameworks focused on information and communication technologies. And it came back in the 2000s as a critical infrastructure for innovation and the knowledge economy. But this time content was not king but rather an intermediate input at the service of innovating industries and processes. Broadband was a critical infrastructure for the digitally-based creative industries. Today the quality of the broadband infrastructure in Australia—itself an outcome of these different policy frameworks—is identified as “fraudband”, holding back business, creativity and consumer uptake. In this paper I use the checkered trajectory of broadband on Australian political and policy horizons as a stepping-off point to reflect on the ideas governing these changing governmental and public settings. This history enables me to explore how content and infrastructure are simultaneously connected and disconnected in our thinking. And, finally, I want to make some remarks about the way communication, particularly media communication, has been marginally positioned after initially being so apparently central.
Abstract:
In this paper, we propose an unsupervised segmentation approach, named "n-gram mutual information" (NGMI), which is used to segment Chinese documents into n-character words or phrases, using language statistics drawn from the Chinese Wikipedia corpus. The approach alleviates the tremendous effort required in preparing and maintaining manually segmented Chinese text for training purposes, and in manually maintaining ever-expanding lexicons. Previously, mutual information was used to achieve automated segmentation into 2-character words; NGMI extends this approach to handle longer n-character words. Experiments with heterogeneous documents from the Chinese Wikipedia collection show good results.
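As a rough sketch of the underlying statistic, the snippet below computes pointwise mutual information for adjacent characters from raw text. The tiny corpus string is a placeholder, not the Chinese Wikipedia collection, and the snippet covers only the 2-character case that NGMI generalises.

    # Pointwise mutual information of adjacent characters (2-character case only).
    import math
    from collections import Counter

    corpus = "北京大学生前来应聘"        # placeholder text, not the Wikipedia corpus

    unigrams = Counter(corpus)
    bigrams = Counter(corpus[i:i + 2] for i in range(len(corpus) - 1))
    n_uni = sum(unigrams.values())
    n_bi = sum(bigrams.values())

    def pmi(a: str, b: str) -> float:
        """High PMI suggests a and b belong to the same word;
        low PMI suggests a segmentation boundary between them."""
        p_ab = bigrams[a + b] / n_bi
        if p_ab == 0:
            return float("-inf")
        return math.log2(p_ab / ((unigrams[a] / n_uni) * (unigrams[b] / n_uni)))

    print(pmi("北", "京"))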
Abstract:
The Thai written language is one of the languages that do not have word boundaries. In order to discover the meaning of a document, all text must be separated into syllables, words, sentences, and paragraphs. This paper develops a novel method to segment Thai text by combining a non-dictionary-based technique with a dictionary-based technique. The method first applies Thai language grammar rules to the text to identify syllables. A hidden Markov model is then used to merge possible syllables into words. The identified words are verified against a lexical dictionary, and a decision tree is employed to discover the words not identified by the lexical dictionary. Documents used in the litigation process of Thai court proceedings have been used in experiments. The segmented words obtained by the proposed method outperform the results obtained by other existing methods.
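A skeleton of the hybrid pipeline described above is sketched below. The grammar-rule syllable splitter, the HMM merger and the decision-tree classifier are left as stubs, and the dictionary entries are placeholders; only the overall flow, dictionary verification with a classifier fallback for out-of-vocabulary candidates, is shown.

    # Skeleton of the hybrid Thai segmentation pipeline (stages stubbed out).
    from typing import List

    LEXICON = {"ศาล", "คดี", "พยาน"}    # placeholder dictionary entries

    def rule_based_syllables(text: str) -> List[str]:
        # Stand-in for the Thai grammar rules that identify syllables.
        raise NotImplementedError

    def hmm_merge(syllables: List[str]) -> List[str]:
        # Stand-in for the hidden Markov model that merges syllables into words.
        raise NotImplementedError

    def decision_tree_accepts(candidate: str) -> bool:
        # Stand-in for the decision tree that judges words absent from the dictionary.
        raise NotImplementedError

    def segment(text: str) -> List[str]:
        candidates = hmm_merge(rule_based_syllables(text))
        # Keep candidates verified by the dictionary; defer unknown ones
        # to the decision tree.
        return [w for w in candidates
                if w in LEXICON or decision_tree_accepts(w)]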
Abstract:
Over the last decade, system integration has grown in popularity as it allows organisations to streamline business processes. Traditionally, system integration has been conducted through point-to-point solutions: as a new integration scenario requirement arises, a custom solution is built between the relevant systems. Bus-based solutions are now preferred, whereby all systems communicate via an intermediary system, such as an enterprise service bus, using a common data exchange model. This research investigates the use of a common data exchange model based on open standards, specifically MIMOSA OSA-EAI, for asset management system integration. A case study is conducted that involves the integration of processes between a SCADA system, a maintenance decision support system and a work management system. A diverse range of software platforms is employed in developing the final solution, all tied together through MIMOSA OSA-EAI-based XML web services. The lessons learned from the exercise are presented throughout the paper.
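The contrast between point-to-point links and bus-based integration around a common exchange model can be sketched generically as below. This is not the MIMOSA OSA-EAI schema or the case study's systems, just an illustration of the architectural idea with made-up names.

    # Generic bus-based integration sketch around a shared event model
    # (illustrative only; not MIMOSA OSA-EAI).
    from collections import defaultdict
    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class CommonEvent:
        """Minimal stand-in for a record in a common data exchange model."""
        asset_id: str
        measurement: str
        value: float

    class ServiceBus:
        def __init__(self) -> None:
            self._subs: Dict[str, List[Callable[[CommonEvent], None]]] = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable[[CommonEvent], None]) -> None:
            self._subs[topic].append(handler)

        def publish(self, topic: str, event: CommonEvent) -> None:
            # Each producer maps its native data into CommonEvent once,
            # so no pairwise point-to-point translations are needed.
            for handler in self._subs[topic]:
                handler(event)

    bus = ServiceBus()
    bus.subscribe("condition", lambda e: print("decision support received", e))
    bus.subscribe("condition", lambda e: print("work management received", e))
    # A SCADA-side adapter publishes a reading translated into the common model.
    bus.publish("condition", CommonEvent("pump-7", "vibration_mm_s", 4.2))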
Abstract:
The current argument is that there exist no indigenous people in Africa because all Africans are indigenous. The obverse view considers indigenous those Africans who have not been touched by colonialism and have not lost their traditional cultures, with the attendant attachments to the land or a distinguishable traditional lifestyle. This paper argues in favor of the latter. Modernism, materialism, ex-colonial socio-cultural impacts (such as the remnants of European legal structures and cultural scarring), globalization, and technology are international social homogenizers. People who live within this telos and do not participate in a distinct traditional culture that has been attached to the land for centuries are not indigenous. It is argued that this cultural divergence between modern and traditional is the major identifying point for settling the indigenous-non-indigenous African debate. Finally, the paper looks at inclusive development, how it helps to distinguish African indigeneity, and provides a new political analysis model for quantifying inclusivity.