974 results for approach speed
Abstract:
This paper seeks to identify and quantify the sources of the lagging productivity in Singapore’s retail sector reported by the Economic Strategies Committee in its 2010 report. A two-stage analysis is adopted. In the first stage, the Malmquist productivity index is employed, which provides measures of productivity change, technological change and efficiency change. In the second stage, technical efficiency estimates are regressed against explanatory variables using a truncated regression model. Sources of technical efficiency were attributed to worker quality, while product assortment and competition negatively affected efficiency.
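For readers unfamiliar with the first-stage index, the standard Malmquist decomposition (the textbook Färe et al. formulation, not an equation reproduced from the paper) expresses productivity change between periods t and t+1 in terms of distance functions D, and factors it into the efficiency-change and technical-change components the abstract refers to:

```latex
M\!\left(x^{t+1},y^{t+1},x^{t},y^{t}\right)
  = \underbrace{\frac{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D^{t}\!\left(x^{t},y^{t}\right)}}_{\text{efficiency change}}
    \times
    \underbrace{\left[
      \frac{D^{t}\!\left(x^{t+1},y^{t+1}\right)}{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}
      \cdot
      \frac{D^{t}\!\left(x^{t},y^{t}\right)}{D^{t+1}\!\left(x^{t},y^{t}\right)}
    \right]^{1/2}}_{\text{technical change}}
```

Values of M greater than one indicate productivity growth between the two periods.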
Abstract:
Background: When observers are asked to identify two targets in rapid sequence, they often suffer profound performance deficits for the second target, even when the spatial location of the targets is known. This attentional blink (AB) is usually attributed to the time required to process a previous target, implying that a link should exist between individual differences in information-processing speed and the AB. Methodology/Principal Findings: The present work investigated this question by examining the relationship between a rapid automatized naming task typically used to assess information-processing speed and the magnitude of the AB. The results indicated that faster processing actually resulted in a greater AB, but only when targets were presented amongst high-similarity distractors. When target-distractor similarity was minimal, processing speed was unrelated to the AB. Conclusions/Significance: Our findings indicate that information-processing speed is unrelated to target processing efficiency per se, but rather to individual differences in observers' ability to suppress distractors. This is consistent with evidence that individuals who are able to avoid distraction are more efficient at deploying temporal attention, but argues against a direct link between general processing speed and efficient information selection.
Abstract:
As computers approach the physical limits of the information storable in memory, new methods will be needed to further improve information storage and retrieval. We propose a quantum-inspired, vector-based approach, which offers a contextually dependent mapping from subsymbolic to symbolic representations of information. If implemented computationally, this approach would provide an exceptionally high density of information storage without the traditionally required physical increase in storage capacity. The approach is inspired by the structure of human memory and incorporates elements of Gardenfors’ Conceptual Space approach and Humphreys et al.’s matrix model of memory.
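The flavour of contextually dependent, matrix-style retrieval mentioned above can be illustrated with a toy sketch (my own illustration of the classic outer-product matrix memory, not the authors' proposed model): item-context associations are superposed in a single matrix, and probing with a context vector approximately recovers the item bound to that context.

```python
# Toy outer-product matrix memory (illustrative assumption, not the paper's model).
import numpy as np

rng = np.random.default_rng(0)
d = 256  # dimensionality of the distributed representations

# Random, approximately unit-norm context and item vectors.
context_a, context_b = rng.standard_normal((2, d)) / np.sqrt(d)
item_x, item_y = rng.standard_normal((2, d)) / np.sqrt(d)

# Superpose two context-item associations in one memory matrix.
memory = np.outer(context_a, item_x) + np.outer(context_b, item_y)

# Probing with context_a approximately retrieves item_x: contextually dependent recall.
retrieved = context_a @ memory
print(np.dot(retrieved, item_x), np.dot(retrieved, item_y))  # first value is much larger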
Abstract:
While requiring students to think reflectively is a desirable teaching goal, it is often fraught with complexity and is sometimes poorly implemented in higher education. In this paper, we describe an approach to academic reflective practices that fitted a design subject in fashion education and was perceived as effective in enhancing student learning outcomes. In many design-based disciplines, it is essential to evaluate, through a reflective lens, the quality of tangible design outcomes - referred to here as artefacts. Fashion studio-based practice (unlike many other theory-based disciplines) requires an artefact to be viewed in order to initiate the reflective process. This reflection is not limited to reflective writing; it also happens through sight, touch and other non-traditional approaches. Fashion students were asked to reflect before, during and after the development of an artefact. A review of the first garment prototype - called a Sample Review - was then conducted through a variety of media. The reflective practices of students during the Sample Review provided valuable insight into their own learning, as well as a valid assessment indicator for the lecturer. It also mirrored industry practices for design evaluation. We believe that this deliberative approach, characterised by artefact-prompted reflection, has wide applicability across undergraduate courses in a variety of discipline areas.
Abstract:
A Multimodal Seaport Container Terminal (MSCT) is a complex system which requires careful planning and control in order to operate efficiently. It consists of a number of subsystems that require optimisation of the operations within them, as well as synchronisation of machines and containers between the various subsystems. Inefficiency in the terminal can delay ships from their scheduled timetables, as well as cause delays in delivering containers to their inland destinations, both of which can be very costly to their operators. The purpose of this PhD thesis is to use Operations Research methodologies to optimise and synchronise these subsystems as an integrated application. An initial model is developed for the overall MSCT; however, due to the large number of assumptions that had to be made, as well as other issues, it is found to be too inaccurate and infeasible for practical use. Instead, a method is proposed in which models are developed for each subsystem and then integrated with one another. Mathematical models are developed for the Storage Area System (SAS) and Intra-terminal Transportation System (ITTS). The SAS deals with the movement and assignment of containers to stacks within the storage area, both when they arrive and when they are rehandled to retrieve containers below them. The ITTS deals with scheduling the movement of containers and machines between the storage areas and other sections of the terminal, such as the berth and road/rail terminals. Various constructive heuristics are explored and compared for these models to produce good initial solutions for large-sized problems, which are otherwise impractical to solve by exact methods. These initial solutions are further improved through the use of an innovative hyper-heuristic algorithm that integrates the SAS and ITTS solutions and optimises them through meta-heuristic techniques. The method by which the two models can interact with each other as an integrated system will be discussed, as well as how this method can be extended to the other subsystems of the MSCT.
Abstract:
A wireless sensor network system must be able to tolerate harsh environmental conditions and reduce communication failures. In a typical outdoor situation, the presence of wind can introduce movement in the foliage. This motion of vegetation structures causes large and rapid signal fading in the communication link and must be accounted for when deploying a wireless sensor network system in such conditions. This thesis examines the fading characteristics experienced by wireless sensor nodes due to the effect of varying wind speed in a foliage-obstructed transmission path. It presents extensive measurement campaigns at two locations using a typical wireless sensor network configuration. The significance of this research lies in the varied approaches of its experiments, involving a variety of vegetation types and scenarios and the use of different polarisations (vertical and horizontal). Non-line-of-sight (NLoS) scenarios investigate the wind effect for different vegetation densities, including an Acacia tree, a Dogbane tree and tall grass, whereas the line-of-sight (LoS) scenario investigates the effect of wind when the grass is swaying and affecting the ground-reflected component of the signal. The vegetation types and scenarios are chosen to simulate the real-life working conditions of wireless sensor network systems in outdoor foliated environments. The results of the measurements are presented as statistical models involving first- and second-order statistics. We found that in most cases the fading amplitude could be approximated by both the Lognormal and the Nakagami distribution, whose m parameter was found to depend on received power fluctuations. The Lognormal distribution is known to result from slow fading characteristics due to shadowing. This study concludes that the fading caused by wind-induced variations in received power in wireless sensor network systems is insignificant: there is no notable difference in Nakagami m values between the calm, low and windy wind-speed categories. The second-order analysis also shows that the duration of deep fades is very short: 0.1 seconds for 10 dB attenuation below the RMS level with vertical polarisation, and 0.01 seconds for 10 dB attenuation below the RMS level with horizontal polarisation. Another key finding is that the received signal strength for horizontal polarisation performs more than 3 dB better than vertical polarisation in LoS and near-LoS (thin vegetation) conditions, and up to 10 dB better in denser vegetation conditions.
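As a rough illustration of how the first- and second-order statistics named above can be extracted from a measured fading envelope (a sketch under assumed file names and sampling interval, not the thesis's actual processing chain):

```python
# Sketch: fit the Nakagami and Lognormal models to a measured envelope and
# estimate the average fade duration below a threshold (assumptions throughout).
import numpy as np
from scipy import stats

# Hypothetical file holding the received-signal envelope samples.
amplitude = np.abs(np.loadtxt("rssi_envelope.txt"))

# Nakagami fit: the first returned parameter is the m (shape) parameter.
m, _, scale_n = stats.nakagami.fit(amplitude, floc=0)

# Lognormal fit for the slow (shadowing) component.
sigma, _, scale_l = stats.lognorm.fit(amplitude, floc=0)
print(f"Nakagami m = {m:.2f}, Lognormal sigma = {sigma:.2f}")

# Second-order statistic: average fade duration 10 dB below the RMS level.
sample_period = 0.001                      # assumed sampling interval in seconds
rms = np.sqrt(np.mean(amplitude ** 2))
threshold = rms * 10 ** (-10 / 20)         # 10 dB below RMS
below = amplitude < threshold
crossings = np.count_nonzero(np.diff(below.astype(int)) == 1)
afd = below.sum() * sample_period / max(crossings, 1)
print(f"Average fade duration: {afd:.3f} s")
```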
Abstract:
Effective enterprise information security policy management requires review and assessment activities to ensure information security policies are aligned with business goals and objectives. Because security policy management involves both the policy development process and the security policy as its output, security policy assessment requires goal-based metrics for these two elements. However, current security management assessment methods only provide checklist-style assessments predefined by industry best practices and do not allow specific goal-based metrics to be developed. Drawing on theories from the literature, this paper proposes the Enterprise Information Security Policy Assessment approach, which expands on the Goal-Question-Metric (GQM) approach. The proposed assessment approach is then applied in a case scenario example to illustrate a practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows process-based and product-based assessment to be undertaken concurrently. Recommendations for further research include empirical studies to validate the propositions and practical application of the proposed assessment approach in case studies, which would provide opportunities to introduce further enhancements to the approach.
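To make the GQM expansion concrete, the following is a purely hypothetical goal-question-metric tree (my own illustrative example, not taken from the paper or its case scenario) showing the shape such a goal-based assessment instrument might take:

```python
# Hypothetical GQM tree for one security-policy assessment goal (illustration only).
gqm_example = {
    "goal": "Assess whether the password-policy development process involves the right stakeholders",
    "questions": {
        "Who reviewed the current policy draft?": [
            "number of business-unit representatives participating in the review",
            "proportion of review comments resolved before sign-off",
        ],
        "How often is the policy revisited?": [
            "months since the last formal review",
            "number of reviews completed in the last 24 months",
        ],
    },
}
```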
Abstract:
Information Technology and its relationship to organisational performance has long been of interest to researchers. While there is agreement that IT does contribute to performance, and we are steadily expanding our knowledge of which factors lead to better leveraging of IT resources in organisations, we have done little to understand how these factors interact with technology to produce improved performance. Using a structurational lens that recognises the recursive interaction between technology and people in the presence of social practices, and the norms that inform their ongoing practices, we propose an ethnographic approach to understanding the interaction between technology and resources, aiming to provide richer insight into the nature of the environment that promotes better use of IT resources. Such insights could give IT users at least an initial conception of the IT usage platform they could promote in their organisations to leverage the most from their IT resources.
Abstract:
We examine the relationship between the effectiveness of IT steering committee-driven IT governance initiatives and firms’ IT management and IT infrastructure-related capabilities. We test these relationships empirically using a field survey of 216 firms. The results suggest that a firm’s effectiveness of IT steering committee-driven IT governance initiatives is positively related to the level of its IT-related capabilities. We also found positive relationships between IT-related capabilities and internal process-level performance, which in turn positively relates to improvements in customer service and firm-level performance. For researchers, we demonstrate that resource-based theory provides a more robust explanation of the determinants of firms’ IT governance initiatives, and it would be well suited to evaluating the effectiveness of other IT governance initiatives in terms of how they contribute to building performance-differentiating IT-related capabilities. For decision makers, we hope our study has reiterated the notion that IT governance is truly a coordinated effort, embracing all levels of human resources.
Abstract:
This article explores power within legal education scholarship. It suggests that power relations are not effectively reflected on within this scholarship, and it provokes legal educators to consider power more explicitly and effectively. It then outlines, in depth, a conceptual and methodological approach based on Michel Foucault’s concept of ‘governmentality’ to assist in such an analysis. By detailing the conceptual moves required to research power in legal education more effectively, this article seeks to stimulate new reflection and thought about the practice and scholarship of legal education, and to allow political interventions to become more ethically sensitive and potentially more effective.
Abstract:
This article examines the current transfer pricing regime to consider whether it is a sound model to be applied to modern multinational entities. The arm's length price methodology is examined to enable a discussion of the arguments in favour of such a regime. The article then refutes these arguments, concluding that, contrary to the very reason multinational entities exist, applying arm's length rules involves a legal fiction of imagining transactions between unrelated parties. Multinational entities exist to operate in ways that independent entities would not, which the arm's length rules fail to take into account. As such, there is clearly an air of artificiality in applying the arm's length standard. To demonstrate this artificiality with respect to modern multinational entities, multinational banks are used as an example. The article concludes that the separate entity paradigm adopted by the traditional transfer pricing regime is incongruous with the economic theory of modern multinational enterprises.
Abstract:
Arts managers play a critical role in creating a strong, sustainable arts and cultural sector. They operate as brokers, creating programs and, more critically, coordinating the relationships between artists, audiences, communities, governments and sponsors required to make these programs a success. Based on a study of a model developed for a subject in the Master of Creative Industries (Creative Production & Arts Management) at Queensland University of Technology (QUT), this paper examines the pros and cons of a “community of practice” approach to training arts management students to act as cultural brokers. It provides data on the effectiveness of a range of activities – including Position Papers, Case Studies, Masterclasses, and offline and online conversations – that can be used to facilitate the peer-to-peer engagement by which students work together to build their cultural brokering skills in a community of practice. The data demonstrate that, whilst students appreciate this approach, educators must provide enough access to voices of authority – that is, to arts professionals – to establish a well-functioning community of practice, and must ensure that more expert students do not become frustrated when they are unwittingly and unwillingly thrust into this role by less expert classmates. This is especially important in arts management, where classes are always diverse, because most dedicated programs in Australia, as in the US, UK and Europe, are taught as small-scale programs at graduate level which accept applicants from a wide variety of arts and non-arts backgrounds.
Abstract:
In response to developments in international trade and an increased focus on international transfer-pricing issues, Canada’s minister of finance announced in the 1997 budget that the Department of Finance would undertake a review of the transfer-pricing provisions in the Income Tax Act. On September 11, 1997, the Department of Finance released draft transfer-pricing legislation and Revenue Canada released revised draft Information Circular 87-2R. The legislation was subsequently amended and included in Bill C-28, which received first reading on December 10, 1997. The new rules are intended to update Canada’s international transfer-pricing practices. In particular, they attempt to harmonize the standards in the Income Tax Act with the arm’s-length principle established in the OECD’s transfer pricing guidelines. The new rules also set out contemporaneous documentation requirements in respect of cross-border related-party transactions, facilitate administration of the law by Revenue Canada, and provide for a penalty where transfer prices do not comply with the arm’s-length principle. The Australian tax authorities have similarly reviewed and updated their transfer-pricing practices. Since 1992, the Australian commissioner of taxation has issued three rulings and seven draft rulings directly relating to international transfer pricing. These rulings outline the selection and application of transfer pricing methodologies, documentation requirements, and penalties for non-compliance. The Australian Taxation Office supports the use of advance pricing agreements (APAs) and has expanded its audit strategy by conducting transfer-pricing risk assessment reviews. This article presents a detailed review of Australia’s transfer-pricing policy and practices, which address essentially the same concerns as those at which the new Canadian rules are directed. This review provides a framework for comparison of the approaches adopted in the two jurisdictions. The author concludes that although these approaches differ in some respects, ultimately they produce a similar result. Both regimes set a clear standard to be met by multinational enterprises in establishing transfer prices. Both provide for audits and penalties in the event of noncompliance. And both offer the alternative of an APA as a means of avoiding transfer-pricing disputes with Australian and Canadian tax authorities.
Abstract:
This paper seeks to explain the lagging productivity in Singapore’s manufacturing sector noted in the Economic Strategies Committee Report 2010. Two methods are employed: the Malmquist productivity index, to measure total factor productivity change, and Simar and Wilson’s (J Econ, 136:31–64, 2007) bootstrapped truncated regression approach. In the first stage, nonparametric data envelopment analysis is used to measure technical efficiency. To quantify the economic drivers underlying inefficiencies, the second stage employs a bootstrapped truncated regression in which bias-corrected efficiency estimates are regressed against explanatory variables. The findings reveal that growth in total factor productivity was attributable to efficiency change, with no technical progress. Most industries were technically inefficient throughout the period, except for ‘Pharmaceutical Products’. Sources of efficiency were attributed to worker quality and flexible work arrangements, while the incessant use of foreign workers lowered efficiency.
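As one concrete illustration of the first-stage efficiency measurement (a minimal sketch under my own assumptions, not the paper's implementation; the bootstrap bias correction and second-stage truncated regression are omitted), an input-oriented, constant-returns-to-scale DEA score for each industry can be computed as a small linear program per decision-making unit:

```python
# Sketch: input-oriented CCR DEA efficiency scores via scipy.optimize.linprog.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). Returns efficiency scores theta."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
        c = np.zeros(1 + n)
        c[0] = 1.0
        A_ub = np.zeros((m + s, 1 + n))
        b_ub = np.zeros(m + s)
        # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
        A_ub[:m, 0] = -X[o]
        A_ub[:m, 1:] = X.T
        # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
        A_ub[m:, 1:] = -Y.T
        b_ub[m:] = -Y[o]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (1 + n), method="highs")
        scores[o] = res.x[0]
    return scores
```

Running this on input and output panels for two periods would give the distance-function values from which the Malmquist index and its components are built.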
Abstract:
A practical approach for identifying solution robustness is proposed for situations where parameters are uncertain. The approach is based on the interpretation of a probability density function (pdf) and the definition of three parameters that describe how significant changes in the performance of a solution are deemed to be. The pdf is constructed by interpreting the results of simulations. The number of simulations is kept to a minimum by updating the mean, variance, skewness and kurtosis of the sample using computationally efficient recursive equations; once these criteria have converged, no further simulations are needed. A case study involving several no-intermediate-storage flow-shop scheduling problems demonstrates the effectiveness of the approach.
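A minimal sketch of one common formulation of such recursive moment updates (an assumption on my part; the paper's exact equations and convergence parameters are not reproduced here) is the standard one-pass update of the central moments, from which the four statistics follow:

```python
# Sketch: one-pass (recursive) updates of mean, variance, skewness and kurtosis.
import math

class RunningMoments:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.M2 = 0.0  # sums of powers of deviations from the running mean
        self.M3 = 0.0
        self.M4 = 0.0

    def update(self, x):
        n1 = self.n
        self.n += 1
        delta = x - self.mean
        delta_n = delta / self.n
        delta_n2 = delta_n * delta_n
        term1 = delta * delta_n * n1
        self.mean += delta_n
        self.M4 += (term1 * delta_n2 * (self.n * self.n - 3 * self.n + 3)
                    + 6 * delta_n2 * self.M2 - 4 * delta_n * self.M3)
        self.M3 += term1 * delta_n * (self.n - 2) - 3 * delta_n * self.M2
        self.M2 += term1

    @property
    def variance(self):
        return self.M2 / (self.n - 1) if self.n > 1 else 0.0

    @property
    def skewness(self):
        return math.sqrt(self.n) * self.M3 / self.M2 ** 1.5 if self.M2 > 0 else 0.0

    @property
    def kurtosis(self):
        return self.n * self.M4 / (self.M2 * self.M2) - 3.0 if self.M2 > 0 else 0.0
```

A simulation driver would call update() after each run and stop sampling once successive values of the mean, variance, skewness and kurtosis change by less than a chosen tolerance, which is the convergence-based stopping rule the abstract describes.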