959 results for The Impossible Is Possible


Relevance: 100.00%

Abstract:

Broad, early definitions of sustainable development have caused confusion and hesitation among local authorities and planning professionals. This confusion has arisen because loosely defined principles of sustainable development have been employed when setting policies and planning projects, and when gauging the efficiency of these policies against designated sustainability goals. How this theory-rhetoric-practice gap can be filled is the main focus of this chapter. It examines the triple bottom line approach, one of the sustainability accounting approaches widely employed by governmental organisations, and its applicability to sustainable urban development. The chapter introduces the ‘Integrated Land Use and Transportation Indexing Model’, which incorporates triple bottom line considerations with environmental impact assessment techniques via a geographic information systems-based decision support system. This model helps decision-makers select policy options according to their economic, environmental and social impacts. Its main purpose is to provide valuable knowledge about the spatial dimensions of sustainable development, and to provide fine-detail outputs on the possible impacts of urban development proposals on sustainability levels. In order to embrace sustainable urban development policy considerations, the model is sensitive to the relationship between urban form, travel patterns and socio-economic attributes. Finally, the model is useful in picturing the holistic state of urban settings in terms of their sustainability levels, and in assessing the degree of compatibility of selected scenarios with the desired sustainable urban future.
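The indexing idea behind such a model can be illustrated with a minimal sketch: normalised indicator scores for the three bottom lines are combined into a single per-cell index. The indicator names, weights, and equal-weight aggregation below are illustrative assumptions, not the chapter's actual model.

```python
# Hypothetical sketch: combining triple-bottom-line indicator scores into a
# per-grid-cell sustainability index. Names and weights are invented.

def composite_index(scores, weights):
    """Weighted average of normalised indicator scores (each in [0, 1])."""
    total_w = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_w

# One grid cell's normalised scores on the three bottom lines.
cell = {"economic": 0.7, "environmental": 0.4, "social": 0.6}
weights = {"economic": 1.0, "environmental": 1.0, "social": 1.0}

print(round(composite_index(cell, weights), 3))  # equal-weight mean
```

A real indexing model would derive the cell scores from land-use, transport and socio-economic layers in the GIS rather than assigning them by hand.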

Relevance: 100.00%

Abstract:

Purpose – The purpose of this paper is to provide a summary description of a doctoral thesis investigating the field of project management (PM) deployment. Researchers will be informed of the current contributions within this topic and of possible further investigations and research. Decision makers and practitioners will become aware of a set of tools addressing PM deployment from new perspectives. Design/methodology/approach – The research undertaken in the thesis is based on quantitative methods, using time series statistics (time distance analysis) together with comparative and correlation analysis, aimed at better defining and understanding PM deployment within and between countries or groups. Findings – The results suggest a project management deployment index (PMDI) to objectively measure PM deployment based on the concept of certification. A framework is proposed to empirically benchmark PM deployment between countries by integrating the PMDI time series with Sicherl's two-dimensional comparative analysis. Correlation analysis within Hofstede's cultural framework shows the impact of national culture dimensions on PM deployment. A forecasting model shows a general continual growth trend in PM deployment, with a continual increase in the time distance between countries. Research limitations/implications – PM researchers are offered an empirical quantification on which they can construct further investigations and understanding of this phenomenon. The number of possible units that can be studied offers wide possibilities to replicate the thesis work. New research can be undertaken to investigate further the contribution of other social or economic indicators, or to refine and enrich the definition of the PMDI indicator. Practical implications – These results have important implications for PM deployment approaches. The PMDI measurements and time series comparisons considerably facilitate measurement and benchmarking between units (e.g. countries) and against targets, while the readiness of the studied unit (in terms of development and cultural levels) impacts PM deployment within that country. Originality/value – This paper provides a summary of cutting-edge research work in the studied field of PM deployment and a link to the published works that researchers can use to help them understand the thesis research, as well as how it can be extended.
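Sicherl's time distance idea that underlies the benchmarking framework can be sketched as follows: for a chosen index level, compare the (interpolated) years at which two series reach that level. The PMDI figures below are invented for illustration.

```python
# Illustrative sketch of a Sicherl-style time distance between two PMDI
# series: how many years earlier did one country reach a given level?
# The data points are invented, not taken from the thesis.

def year_level_reached(years, values, level):
    """First year (linear interpolation) a rising series reaches `level`."""
    for (y0, v0), (y1, v1) in zip(zip(years, values), zip(years[1:], values[1:])):
        if v0 <= level <= v1:
            return y0 + (level - v0) * (y1 - y0) / (v1 - v0)
    return None

years = [2000, 2005, 2010, 2015]
country_a = [10, 30, 60, 90]   # hypothetical PMDI values
country_b = [5, 15, 30, 60]

level = 30
ta = year_level_reached(years, country_a, level)
tb = year_level_reached(years, country_b, level)
print(tb - ta)  # time distance in years at PMDI level 30
```

Unlike a static gap at a fixed year, this horizontal comparison expresses the lag between units in units of time, which is what makes the benchmarking two-dimensional.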

Relevance: 100.00%

Abstract:

Neither an international tax nor an international taxing body exists. Rather, there are domestic taxing rules adopted by jurisdictions which, coupled with double tax treaties, apply to cross-border transactions and international taxation issues. International bodies such as the OECD and UN, which provide guidance on tax issues, often steer and supplement these domestic adoptions but have no binding international taxing powers. These pragmatic realities, together with the specific use of the word ‘regime’ within the tax community, lead many to argue that an international tax regime does not exist. However, an international tax regime should be defined no differently from any other area of international law, and when we step outside the confines of tax law to consider the definition of a ‘regime’ within international relations, it is possible to demonstrate that such a regime is very real. The first part of this article reveals the existence of that regime by defining an international tax regime in a broader and more traditional context, and by outlining both the tax policy and the principles which frame it. Once it is accepted that an international tax regime exists, it is possible to consider its adoption by jurisdictions and the subsequent constraints it places on them. Using the proposed changes to transfer pricing laws as the impetus for assessing Australia’s adoption of the international tax regime, the constraints on sovereignty are assessed through a taxonomy of the level of adoption. This reveals the constraints which flow from the broad acceptance of an international tax regime through to the specific adoption of technical detail. By undertaking this analysis, the second part of this article demonstrates that Australia has inherently adopted an international tax regime, with a move towards explicit adoption and a clear embedding of its principles within domestic tax legislation.

Relevance: 100.00%

Abstract:

Circoviruses lack an autonomous DNA polymerase and are dependent on the replication machinery of the host cell for de novo DNA synthesis. Accordingly, the viral DNA needs to cross both the plasma membrane and the nuclear envelope before replication can occur. Here we report on the subcellular distribution of the beak and feather disease virus (BFDV) capsid protein (CP) and replication-associated protein (Rep) expressed via recombinant baculoviruses in an insect cell system and test the hypothesis that the CP is responsible for transporting the viral genome, as well as Rep, across the nuclear envelope. The intracellular localization of the BFDV CP was found to be directed by three partially overlapping bipartite nuclear localization signals (NLSs) situated between residues 16 and 56 at the N terminus of the protein. Moreover, a DNA binding region was also mapped to the N terminus of the protein and falls within the region containing the three putative NLSs. The ability of CP to bind DNA, coupled with the karyophilic nature of this protein, strongly suggests that it may be responsible for nuclear targeting of the viral genome. Interestingly, whereas Rep expressed on its own in insect cells is restricted to the cytoplasm, coexpression with CP alters the subcellular localization of Rep to the nucleus, strongly suggesting that an interaction with CP facilitates movement of Rep into the nucleus. Copyright © 2006, American Society for Microbiology. All Rights Reserved.
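As a loose illustration of what mapping bipartite NLSs in an N-terminal region involves computationally, the sketch below scans a protein sequence for two basic-residue clusters separated by a short linker. The regular expression is a common heuristic, the sequence is invented, and neither is taken from the paper.

```python
# Hedged sketch: scan a protein sequence for a bipartite-NLS-like motif,
# i.e. two clusters of basic residues (K/R) separated by a short linker.
# The pattern and the toy sequence are illustrative assumptions; real NLS
# prediction uses curated profiles, not a single regex.

import re

# Two or more basic residues, a 9-12 residue linker, then three or more
# basic residues; a common bipartite NLS heuristic.
BIPARTITE = re.compile(r"[KR]{2}.{9,12}[KR]{3}")

def find_bipartite_nls(seq):
    """Return 1-based (start, end) spans of candidate bipartite NLS motifs."""
    return [(m.start() + 1, m.end()) for m in BIPARTITE.finditer(seq)]

toy_seq = "MAKRSSTQAAGLDNAKRKRTWLPQ"
print(find_bipartite_nls(toy_seq))
```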

Relevance: 100.00%

Abstract:

The rank transform is a non-parametric transform which has been applied to the stereo matching problem. The advantages of this transform include its invariance to radiometric distortion and its amenability to hardware implementation. This paper describes the derivation of the rank constraint for matching using the rank transform. Previous work has shown that this constraint is capable of resolving ambiguous matches, thereby improving match reliability, and a new matching algorithm incorporating this constraint was also proposed. This paper extends that previous work by proposing a matching algorithm which uses a multi-dimensional match surface in which the match score is computed for every possible template and match window combination. The principal advantage of this algorithm is that the match surface enforces the left-right consistency and uniqueness constraints, thus improving the algorithm's ability to remove invalid matches. Experimental results for a number of test stereo pairs show that the new algorithm is capable of identifying and removing a large number of incorrect matches, particularly in the case of occlusions.
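The left-right consistency constraint can be sketched independently of the match surface: a disparity assigned from left to right is kept only if the right-to-left disparity at the corresponding pixel agrees. The single-scanline disparity values below are toy data, not from the paper.

```python
# Sketch of a left-right consistency check on disparity maps, one of the
# constraints the match-surface approach enforces. Toy single-scanline data.

def lr_consistent(disp_left, disp_right, tol=1):
    """Mark pixels where left->right and right->left disparities agree."""
    valid = []
    for x, d in enumerate(disp_left):
        xr = x - d                      # matching column in the right image
        ok = 0 <= xr < len(disp_right) and abs(disp_right[xr] - d) <= tol
        valid.append(ok)
    return valid

disp_left = [0, 1, 1, 2, 4]            # disparity per column, left image
disp_right = [0, 2, 2, 0, 0]           # disparity per column, right image
print(lr_consistent(disp_left, disp_right))
```

Pixels failing the check (the last column here) are typically occluded in one view and are removed as invalid matches.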

Relevance: 100.00%

Abstract:

A fundamental problem faced by stereo vision algorithms is that of determining correspondences between two images which comprise a stereo pair. This paper presents work towards the development of a new matching algorithm, based on the rank transform. This algorithm makes use of both area-based and edge-based information, and is therefore referred to as a hybrid algorithm. In addition, this algorithm uses a number of matching constraints, including the novel rank constraint. Results obtained using a number of test pairs show that the matching algorithm is capable of removing most invalid matches. The accuracy of matching in the vicinity of edges is also improved.

Relevance: 100.00%

Abstract:

The rank transform is a non-parametric technique which has been recently proposed for the stereo matching problem. The motivation behind its application to the matching problem is its invariance to certain types of image distortion and noise, as well as its amenability to real-time implementation. This paper derives an analytic expression for the process of matching using the rank transform, and then goes on to derive one constraint which must be satisfied for a correct match. This has been dubbed the rank order constraint or simply the rank constraint. Experimental work has shown that this constraint is capable of resolving ambiguous matches, thereby improving matching reliability. This constraint was incorporated into a new algorithm for matching using the rank transform. This modified algorithm resulted in an increased proportion of correct matches, for all test imagery used.
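The rank transform itself is simple to state: each pixel is replaced by the number of neighbouring pixels whose intensity is less than the centre pixel's. A minimal sketch on a toy 3x3 window (real systems use larger windows and full images):

```python
# Minimal sketch of the rank transform: each pixel becomes the count of
# pixels in its neighbourhood whose intensity is below the centre value.
# Border pixels are left at 0 for simplicity.

def rank_transform(img, r=1):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(r, h - r):
        for x in range(r, w - r):
            centre = img[y][x]
            out[y][x] = sum(
                img[y + dy][x + dx] < centre
                for dy in range(-r, r + 1)
                for dx in range(-r, r + 1)
            )
    return out

img = [
    [10, 20, 30],
    [40, 50, 60],
    [70, 80, 90],
]
print(rank_transform(img)[1][1])  # 4 neighbours are darker than 50
```

Because only the ordering of intensities matters, any monotonic (radiometric) distortion of the image leaves the transform unchanged, which is the invariance the paper exploits.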

Relevance: 100.00%

Abstract:

Sfinks is a shift-register-based stream cipher designed for hardware implementation and submitted to the eSTREAM project. In this paper, we analyse the initialisation process of Sfinks. We demonstrate a slid property of the loaded state of the Sfinks cipher, whereby multiple key-IV pairs may produce phase-shifted keystream sequences. The state update functions of both the initialisation process and keystream generation, together with the pattern of the padding, affect the generation of slid pairs.
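The notion of phase-shifted keystreams from slid states can be illustrated on a toy register far simpler than Sfinks: in a plain LFSR, a loaded state that is k steps ahead of another produces the same keystream shifted by k. The register size and taps below are arbitrary, and none of Sfinks's nonlinear filtering or padding is modelled.

```python
# Toy illustration of a slid property on a plain Fibonacci LFSR (much
# simpler than Sfinks): a state k steps ahead yields the keystream
# phase-shifted by k. Taps and register size are arbitrary choices.

def step(state, taps):
    """One LFSR update: XOR the tapped bits, shift in the feedback bit."""
    fb = 0
    for t in taps:
        fb ^= state[t]
    return [fb] + state[:-1]

def lfsr_stream(state, taps, n):
    """Return n output bits (the last register bit at each step)."""
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[-1])
        state = step(state, taps)
    return out

taps = [0, 3]                 # illustrative feedback taps
s1 = [1, 0, 0, 1]
ks1 = lfsr_stream(s1, taps, 20)

s2 = s1                       # "slid" state: s1 advanced by 3 steps
for _ in range(3):
    s2 = step(s2, taps)
ks2 = lfsr_stream(s2, taps, 20)

print(ks1[3:13] == ks2[:10])  # True: ks2 is ks1 phase-shifted by 3
```

In the Sfinks attack setting the interesting question is which distinct key-IV pairs load states related in this way, since those pairs leak the phase relationship between their keystreams.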

Relevance: 100.00%

Abstract:

A body of research in conversation analysis has identified a range of structurally-provided positions in which sources of trouble in talk-in-interaction can be addressed using repair. These practices are contained within what Schegloff (1992) calls the repair space. In this paper, I examine a rare instance in which a source of trouble is not resolved within the repair space and comes to be addressed outside of it. The practice by which this occurs is a post-completion account; that is, an account that is produced after the possible completion of the sequence containing a source of trouble. Unlike fourth position repair, the final repair position available within the repair space, this account is not made in preparation for a revised response to the trouble-source turn. Its more restrictive aim, rather, is to circumvent an ongoing difference between the parties involved. I argue that because the trouble is addressed in this manner, and in this particular position, the repair space can be considered as being limited to the sequence in which a source of trouble originates.

Relevance: 100.00%

Abstract:

Organizations from every industry sector seek to enhance their business performance and competitiveness through the deployment of contemporary information systems (IS), such as Enterprise Systems (ERP). Investments in ERP are complex and costly, attracting scrutiny and pressure to justify their cost. Thus, IS researchers highlight the need for systematic evaluation of information system success, or impact, which has resulted in the introduction of varied models for evaluating information systems. One of these systematic measurement approaches is the IS-Impact Model introduced by a team of researchers at Queensland University of Technology (QUT) (Gable, Sedera, & Chan, 2008). The IS-Impact Model is conceptualized as a formative, multidimensional index that consists of four dimensions. Gable et al. (2008) define IS-Impact as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (p. 381). The IT Evaluation Research Program (ITE-Program) at QUT has grown the IS-Impact Research Track with the central goal of conducting further studies to enhance and extend the IS-Impact Model. The overall goal of the IS-Impact research track at QUT is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable, 2009). To achieve this, the IS-Impact research track advocates programmatic research guided by the principles of tenacity, holism, and generalizability through extension research strategies. This study was conducted within the IS-Impact Research Track to further generalize the IS-Impact Model by extending it to the Saudi Arabian context. According to Hofstede (2012), the national culture of Saudi Arabia is significantly different from the Australian national culture, making Saudi Arabia an interesting context for testing the external validity of the IS-Impact Model.
The study re-visits the IS-Impact Model from the ground up. Rather than assume the existing instrument is valid in the new context, or simply assess its validity through quantitative data collection, the study takes a qualitative, inductive approach to re-assessing the necessity and completeness of the existing dimensions and measures. This is done in two phases: an Exploratory Phase and a Confirmatory Phase. The Exploratory Phase addresses the first research question of the study: "Is the IS-Impact Model complete and able to capture the impact of information systems in Saudi Arabian organizations?". The content analysis used to analyze the Identification Survey data indicated that 2 of the 37 measures of the IS-Impact Model are not applicable to the Saudi Arabian context. Moreover, no new measures or dimensions were identified, evidencing the completeness and content validity of the IS-Impact Model. In addition, the Identification Survey data suggested several concepts related to IS-Impact, the most prominent of which was "Computer Network Quality" (CNQ). The literature supported the existence of a theoretical link between IS-Impact and CNQ (CNQ is viewed as an antecedent of IS-Impact). With the primary goal of validating the IS-Impact Model within its extended nomological network, CNQ was introduced to the research model. The Confirmatory Phase addresses the second research question of the study: "Is the Extended IS-Impact Model valid as a hierarchical multidimensional formative measurement model?". The objective of the Confirmatory Phase was to test the validity of the IS-Impact Model and the CNQ Model. To achieve this, IS-Impact, CNQ, and IS-Satisfaction were operationalized in a survey instrument, and the research model was then assessed by employing the Partial Least Squares (PLS) approach. The CNQ Model was validated as a formative model. Similarly, the IS-Impact Model was validated as a hierarchical multidimensional formative construct.
However, the analysis indicated that one of the IS-Impact Model indicators was insignificant and could be removed from the model. Thus, the resulting Extended IS-Impact Model consists of 4 dimensions and 34 measures. Finally, the structural model was also assessed against two aspects: explanatory and predictive power. The analysis revealed that the path coefficient between CNQ and IS-Impact is significant (t-value = 4.826) and relatively strong (β = 0.426), with CNQ explaining 18% of the variance in IS-Impact. These results supported the hypothesis that CNQ is an antecedent of IS-Impact. The study demonstrates that the quality of the computer network affects the quality of the Enterprise System (ERP) and consequently the impacts of the system. Therefore, practitioners should pay attention to computer network quality. Similarly, the path coefficient between IS-Impact and IS-Satisfaction was significant (t-value = 17.79) and strong (β = 0.744), with IS-Impact alone explaining 55% of the variance in Satisfaction, consistent with the results of the original IS-Impact study (Gable et al., 2008). The research contributions include: (a) supporting the completeness and validity of the IS-Impact Model as a hierarchical multidimensional formative measurement model in the Saudi Arabian context, (b) operationalizing Computer Network Quality as conceptualized in ITU-T Recommendation E.800 (ITU-T, 1993), (c) validating CNQ as a formative measurement model and as an antecedent of IS-Impact, and (d) conceptualizing and validating IS-Satisfaction as a reflective measurement model and as an immediate consequence of IS-Impact. The CNQ model provides a framework for perceptually measuring computer network quality from multiple perspectives, and features an easy-to-understand, easy-to-use, and economical survey instrument.
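The reported variance-explained figures are consistent with the path coefficients: for a single standardised predictor, R-squared equals the squared path coefficient, which a quick check confirms.

```python
# Arithmetic check of the reported effect sizes: with one standardised
# predictor, R^2 is the squared path coefficient, matching the
# variance-explained figures quoted in the abstract.

beta_cnq = 0.426      # CNQ -> IS-Impact path coefficient
beta_impact = 0.744   # IS-Impact -> IS-Satisfaction path coefficient

print(round(beta_cnq ** 2, 2))     # ~0.18 of IS-Impact variance
print(round(beta_impact ** 2, 2))  # ~0.55 of Satisfaction variance
```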

Relevance: 100.00%

Abstract:

Over the past decade there have been a number of families who have utilised assisted reproductive technologies (ARTs) to create a tissue-matched child, with the purpose of using the child’s tissue to cure an existing sick child. This inevitably brings such families a sense of hope as the ultimate aim is to overcome a family health crisis. However, this specific use of reproductive technologies has been the subject of significant criticism, most of which is levelled against the potential harm to the ‘saviour’ child. In Australia, families seeking to access reproductive technologies in this context are therefore required to justify their motives to an ethics committee in order to establish, amongst other things, whether the child will suffer harm once born. This paper explores the concept of harm in the context of conception, focusing on whether it is possible to ‘harm’ a healthy child who has been conceived to save another. To achieve this, the paper will evaluate the impact of the ‘non-identity’ principle in the ‘saviour sibling’ context, and assess the existing body of literature which addresses ‘harm’ in the context of conception. As will be established, the majority of such literature has focused on ‘wrongful life’ cases which seek to address whether an existing child who has been born with a disability, has been harmed. Finally, this paper will distinguish the harm arguments in the ‘saviour sibling’ context based on the fact that the harm evaluation concerns the ‘future-life’ assessment of a healthy child.

Relevance: 100.00%

Abstract:

Australian universities are currently engaging with new governmental policies and regulations that require them to demonstrate enhanced quality and accountability in teaching and research. The development of national academic standards for learning outcomes in higher education is one such instance of this drive for excellence. These discipline-specific standards articulate the minimum, or Threshold Learning Outcomes, to be addressed by higher education institutions so that graduating students can demonstrate their achievement to their institutions, accreditation agencies, and industry recruiters. This impacts not only on the design of Engineering courses (with particular emphasis on pedagogy and assessment), but also on the preparation of academics to engage with these standards and implement them in their day-to-day teaching practice on a micro level. This imperative for enhanced quality and accountability in teaching is also significant at a meso level, for according to the Australian Bureau of Statistics, about 25 per cent of teachers in Australian universities are aged 55 and above and more than 54 per cent are aged 45 and above (ABS, 2006). A number of institutions have undertaken recruitment drives to regenerate and enrich their academic workforce by appointing capacity-building research professors and increasing the numbers of early- and mid-career academics. This nationally driven agenda for quality and accountability in teaching permeates also the micro level of engineering education, since the demand for enhanced academic standards and learning outcomes requires both a strong advocacy for a shift to an authentic, collaborative, outcomes-focused education and the mechanisms to support academics in transforming their professional thinking and practice. 
Outcomes-focused education means giving greater attention to the ways in which curriculum design, pedagogy, assessment approaches and teaching activities can most effectively make a positive, verifiable difference to students’ learning. Such education is authentic when it is couched firmly in the realities of learning environments, student and academic staff characteristics, and trustworthy educational research. That education will be richer and more efficient when staff work collaboratively, contributing their knowledge, experience and skills to achieve learning outcomes based on agreed objectives. We know that the school or departmental levels of universities are the most effective loci of changes in approaches to teaching and learning practices in higher education (Knight & Trowler, 2000). Heads of Schools are increasingly being entrusted with more responsibilities: in addition to setting strategic directions and managing the operational and sometimes financial aspects of their school, they are also expected to lead the development and delivery of teaching, research and other academic activities. Guiding and mentoring individuals and groups of academics is one critical aspect of the Head of School’s role, yet Heads do not always have the resources or support to help them mentor staff, especially the more junior academics. In summary, the international trend in undergraduate engineering course accreditation towards the demonstration of attainment of graduate attributes poses new challenges in addressing academic staff development needs and the assessment of learning. This paper gives some insights into the conceptual design, implementation and empirical effectiveness to date of a Fellow-In-Residence Engagement (FIRE) program. The program is proposed as a model for achieving better engagement of academics with contemporary issues and for effectively enhancing their teaching and assessment practices. It also reports on the program’s collaborative approach to working with Heads of Schools to better support academics, especially early-career ones, by utilizing formal and informal mentoring. Further, the paper discusses possible factors that may assist the achievement of the intended outcomes of such a model, and examines its contributions to engendering outcomes-focused thinking in engineering education.

Relevance: 100.00%

Abstract:

With the overwhelming increase in the amount of text on the web, it is almost impossible for people to keep abreast of up-to-date information. Text mining is a process by which interesting information is derived from text through the discovery of patterns and trends, and text mining algorithms are intended to guarantee the quality of the extracted knowledge. However, the patterns extracted by text or data mining algorithms can be noisy and inconsistent. Thus, different challenges arise, such as how to understand these patterns, whether the model that has been used is suitable, and whether all the patterns that have been extracted are relevant. Furthermore, the research raises the question of how to give a correct weight to the extracted knowledge. To address these issues, this paper presents a text post-processing method which uses a pattern co-occurrence matrix to find the relations between extracted patterns in order to reduce noisy patterns. The main objective of this paper is not only to reduce the number of closed sequential patterns, but also to improve the performance of pattern mining. The experimental results on the Reuters Corpus Volume 1 data collection and TREC filtering topics show that the proposed method is promising.
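A pattern co-occurrence matrix of the kind described can be sketched as follows: count how often each pair of extracted patterns appears in the same document, then flag patterns that never co-occur with any other as candidate noise. The patterns, documents, and the zero-co-occurrence filter below are illustrative assumptions, not the paper's exact method.

```python
# Hedged sketch of a pattern co-occurrence matrix for post-processing
# extracted patterns. Patterns and documents are invented; a real system
# would use the closed sequential patterns mined from the corpus.

from itertools import combinations

docs = [
    {"p1", "p2"},
    {"p1", "p2", "p3"},
    {"p2", "p3"},
    {"p4"},            # p4 never co-occurs with any other pattern
]
patterns = ["p1", "p2", "p3", "p4"]

# Symmetric co-occurrence counts over documents.
cooc = {p: {q: 0 for q in patterns} for p in patterns}
for doc in docs:
    for p, q in combinations(sorted(doc), 2):
        cooc[p][q] += 1
        cooc[q][p] += 1

# Patterns with no co-occurrence support are flagged as candidate noise.
noisy = [p for p in patterns if sum(cooc[p].values()) == 0]
print(noisy)  # ['p4']
```

In practice the matrix entries would feed a weighting scheme rather than a hard zero-support cutoff, so that weakly related patterns are down-weighted instead of discarded outright.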

Relevance: 100.00%

Abstract:

This paper presents a method for investigating ship emissions, the plume capture and analysis system (PCAS), and its application in measuring airborne pollutant emission factors (EFs) and particle size distributions. The current investigation was conducted in situ, aboard two dredgers (Amity, a cutter suction dredger, and Brisbane, a hopper suction dredger), but the PCAS is also capable of performing such measurements remotely at a distant point within the plume. EFs were measured relative to the fuel consumption using the fuel-combustion-derived plume CO2. All plume measurements were corrected by subtracting background concentrations sampled regularly from upwind of the stacks. Each measurement typically took 6 minutes to complete, and during one day 40 to 50 measurements were possible. The relationship between the EFs and plume sample dilution was examined to determine the plume dilution range over which the technique could deliver consistent results when measuring EFs for particle number (PN), NOx, SO2, and PM2.5, within a targeted dilution factor range of 50-1000 suitable for remote sampling. The EFs for NOx, SO2, and PM2.5 were found to be independent of dilution for dilution factors within that range. The EF measurement for PN was corrected for coagulation losses by applying a time-dependent particle loss correction to the particle number concentration data. For the Amity, the EF ranges were PN: 2.2-9.6 × 10^15 (kg-fuel)^-1; NOx: 35-72 g(NO2).(kg-fuel)^-1; SO2: 0.6-1.1 g(SO2).(kg-fuel)^-1; and PM2.5: 0.7-6.1 g(PM2.5).(kg-fuel)^-1. For the Brisbane they were PN: 1.0-1.5 × 10^16 (kg-fuel)^-1; NOx: 3.4-8.0 g(NO2).(kg-fuel)^-1; SO2: 1.3-1.7 g(SO2).(kg-fuel)^-1; and PM2.5: 1.2-5.6 g(PM2.5).(kg-fuel)^-1. The results are discussed in terms of the operating conditions of the vessels’ engines. Particle number emission factors as a function of size, as well as the count median diameter (CMD) and geometric standard deviation of the size distributions, are provided.
The size distributions were found to be consistently uni-modal in the range below 500 nm, and this mode was within the accumulation mode range for both vessels. The representative CMDs for the various activities performed by the dredgers ranged from 94-131 nm in the case of the Amity, and 58-80 nm for the Brisbane. A strong inverse relationship between CMD and EF(PN) was observed.
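The fuel-based EF calculation implied by the use of plume CO2 can be sketched as follows: the background-corrected pollutant rise is divided by the CO2 rise and scaled by the CO2 emitted per kilogram of fuel. The carbon-content figure is a typical value for marine diesel assumed here, not taken from the paper.

```python
# Sketch of a carbon-balance emission factor calculation. The CO2-per-kg-fuel
# figure assumes roughly 87% fuel carbon content (typical marine diesel),
# an assumption for illustration, not the paper's value.

EF_CO2 = 3.2e3   # g CO2 emitted per kg fuel (assumed)

def emission_factor(delta_pollutant, delta_co2, ef_co2=EF_CO2):
    """EF per kg fuel from background-corrected concentration rises."""
    return delta_pollutant / delta_co2 * ef_co2

# Toy plume sample: NOx rise of 1.5 mg/m^3 against a CO2 rise of 80 mg/m^3.
ef_nox = emission_factor(1.5, 80.0)
print(round(ef_nox, 1))  # g NOx per kg fuel
```

Because both concentration rises are measured in the same diluted sample, the dilution factor cancels out of the ratio, which is why the EFs can be independent of dilution over a wide range.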

Relevance: 100.00%

Abstract:

Transient hyperopic refractive shifts occur on a timescale of weeks in some patients after initiation of therapy for hyperglycemia, and are usually followed by recovery to the original refraction. Possible lenticular origin of these changes is considered in terms of a paraxial gradient index model. Assuming that the lens thickness and curvatures remain unchanged, as observed in practice, it appears possible to account for initial hyperopic refractive shifts of up to a few diopters by reduction in refractive index near the lens center and alteration in the rate of change between center and surface, so that most of the index change occurs closer to the lens surface. Restoration of the original refraction depends on further change in the refractive index distribution with more gradual changes in refractive index from the lens center to its surface. Modeling limitations are discussed.
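The size of the effect can be checked with a crude paraxial stand-in: treating the lens as a homogeneous thin lens in aqueous, a small drop in its effective index produces a hyperopic shift of a few diopters, in line with the abstract. All values below are illustrative round numbers, not the paper's gradient index model.

```python
# Rough paraxial check that a small refractive index drop yields a hyperopic
# shift of a few diopters. Homogeneous thin-lens stand-in for the gradient
# index lens; radii and indices are illustrative assumptions.

n_aq = 1.336               # surrounding aqueous/vitreous index
r1, r2 = 0.0102, -0.0060   # anterior/posterior radii of curvature, metres

def thin_lens_power(n_lens):
    """Lensmaker's equation for a thin lens immersed in index n_aq."""
    return (n_lens - n_aq) * (1 / r1 - 1 / r2)

# Power change for a 0.010 drop in effective lens index.
shift = thin_lens_power(1.420 - 0.010) - thin_lens_power(1.420)
print(round(shift, 2))  # diopters; negative = loss of power (hyperopic shift)
```

The gradient index model in the paper is subtler: the surfaces and thickness stay fixed, and the shift comes from redistributing the index between centre and surface, but the order of magnitude of the index change needed is similar.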