922 results for Traditional enrichment method
Abstract:
Contends that South African universities must find admissions criteria, other than high school grades, that are both fair and valid for Black applicants severely disadvantaged by an inferior school education. The use of traditional intellectual assessments and aptitude tests with disadvantaged and minority students remains controversial as a fair form of assessment, since such tests take no account of a student's potential for change. In this study, therefore, a measure of students' cognitive modifiability, assessed by means of an interactive assessment model, was added as a moderator of traditional intellectual assessment in predicting first-year university success. Cognitive modifiability significantly moderated the predictive validity of the traditional intellectual assessment for 52 disadvantaged Black students: the higher the level of cognitive modifiability, the less effective traditional methods were at predicting academic success, and vice versa.
Abstract:
Purpose: The purpose of this paper is to clarify how end-users' tacit knowledge can be captured and integrated into an overall business process management (BPM) approach. Current approaches to supporting stakeholder collaboration in the modelling of business processes envision an egalitarian environment in which stakeholders interact in the same context, use the same languages and share the same perspectives on the business process. As a result, stakeholders have to collaborate on process modelling using a language that some of them do not master, while integrating their various perspectives. Design/methodology/approach: The paper applies the SECI knowledge management process to analyse the problems of traditional top-down BPM approaches and of BPM collaborative modelling tools. The SECI model is also applied to Wikipedia, a successful Web 2.0-based knowledge management environment, to identify how tacit knowledge is captured in a bottom-up approach. Findings: The paper identifies a set of requirements for a hybrid BPM approach, both top-down and bottom-up, and describes a new BPM method based on a stepwise discovery of knowledge. Originality/value: The new approach, Processpedia, enhances collaborative modelling among stakeholders without enforcing egalitarianism. In Processpedia, tacit knowledge is captured and standardised into the organisation's business processes by fostering an ecological participation of all stakeholders and capitalising on their distinctive characteristics.
Abstract:
Despite the proliferation of online stores, earlier expectations that retailing would move completely online have not been fulfilled. Previous research on consumers' preferences among retailing channels suggested that online sales are driven by the convenience of online shopping, or arise as a natural extension of online searches. This paradigm has changed over the years. Changes in consumer behaviour indicate that while consumers search online, using various information sources to learn about products, when it comes to purchasing they shift between online and offline retailing channels depending on a range of factors. Online shopping is still considered a convenient way to purchase goods, but convenience is no longer the key factor. This qualitative research is based on 22 in-depth interviews with shoppers in Australia.
Abstract:
An external change agent (ECA) was recently employed in three Queensland schools to align the school curriculum with the requirements of the state's high-stakes test, the Queensland Core Skills test (QCS). This paper reports on teachers' perceptions of a change process led by an ECA. With the ever-increasing implementation of high-stakes testing in Australian schools, teachers are under mounting pressure to produce 'results'. Therefore, in order to maximise their students' success in these tests, schools are altering their curricula to incorporate the test requirements. Rather than the traditional approach of managing such curriculum change processes internally, there is a growing trend for principals to source external expertise in the form of ECAs. Although some academics, teachers, and much of the relevant literature would regard such a practice as problematic, this study found that teachers were in fact quite open to externally led curriculum change, especially if they perceived the leader to be knowledgeable and credible in this area.
Abstract:
Increasing global competitiveness has forced manufacturing organizations to produce high-quality products more quickly and at a competitive cost, which demands continuous improvement techniques. In this paper, we propose a fuzzy-based performance evaluation method for the lean supply chain. To understand the overall performance of a cost-competitive supply chain, we investigate the alignment of market strategy and the position of the supply chain. Competitive strategies can be accommodated by using different weight calculations for different supply chain situations. By identifying optimal performance metrics and applying performance evaluation methods, managers can predict overall supply chain performance under a lean strategy.
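The abstract does not spell out the method's details; purely as an illustration of the general idea, the sketch below combines triangular fuzzy ratings (defuzzified by a centroid) with strategy-dependent weights. The metrics, ratings and weight profiles are hypothetical, not taken from the paper:

```python
def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number (low, mid, high)."""
    low, mid, high = tfn
    return (low + mid + high) / 3.0

def overall_performance(ratings, weights):
    """Weighted sum of defuzzified metric ratings; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[m] * defuzzify(r) for m, r in ratings.items())

# Hypothetical performance metrics rated on a 0-10 triangular fuzzy scale.
ratings = {"cost": (6, 7, 8), "quality": (7, 8, 9), "delivery": (5, 6, 8)}

# Different weight profiles for different competitive strategies (also hypothetical).
lean_weights = {"cost": 0.5, "quality": 0.3, "delivery": 0.2}
agile_weights = {"cost": 0.2, "quality": 0.3, "delivery": 0.5}

lean_score = overall_performance(ratings, lean_weights)
agile_score = overall_performance(ratings, agile_weights)
print(round(lean_score, 3), round(agile_score, 3))
```

With these made-up numbers, the lean weighting scores the chain higher than the agile one, showing how the same fuzzy ratings can rank differently once the weights encode a competitive strategy.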
Abstract:
This article argues that multinational banks have characteristics which are unique and distinguishable from those of traditional multinational entities. The first distinguishing feature is the unique nature of the services, and consequent products, supplied by multinational banks, which are aimed at meeting client global demand. The second distinguishing feature is the non-traditional organisational structure that is adopted. This structure, also designed to meet client global demand, introduces issues not previously recognised in the traditional taxation system, which is designed for the structure of traditional multinational entities. The unique differences between traditional multinational entities and multinational banks mean there may be a need for a distinct international tax regime. It is argued that there are “outmoded economic assumptions” upon which the present tax laws relating to multinational banks are based. An examination of the unique nature of multinational banks leads to the conclusion that the appropriate tax treatment of these banks differs from the appropriate tax treatment of multinational entities generally.
Abstract:
House dust is a heterogeneous matrix which contains a number of biological materials and particulate matter gathered from several sources. It is the accumulation of a number of semi-volatile and non-volatile contaminants, which are trapped and preserved; house dust can therefore be viewed as an archive of both indoor and outdoor air pollution. There is evidence that, on average, people tend to stay indoors most of the time, and this increases exposure to house dust. The aims of this investigation were to: (i) assess the levels of Polycyclic Aromatic Hydrocarbons (PAHs), elements and pesticides in the indoor environment of the Brisbane area; (ii) identify and characterise the possible sources of elemental constituents (inorganic elements), PAHs and pesticides by means of Positive Matrix Factorisation (PMF); and (iii) establish correlations between the levels of indoor air pollutants (PAHs, elements and pesticides) and the external and internal characteristics or attributes of the buildings and indoor activities, by means of multivariate data analysis techniques. The dust samples were collected during the period 2005-2007 from homes located in different suburbs of Brisbane, Ipswich and Toowoomba, in South East Queensland, Australia. A vacuum cleaner fitted with a paper bag was used as the sampler for collecting the house dust. A survey questionnaire, containing information about the indoor and outdoor characteristics of the residences, was completed by the house residents. House dust samples were analysed for three different classes of pollutants: pesticides, elements and PAHs. The analyses were carried out on samples of particle size less than 250 µm. The chemical analyses for both pesticides and PAHs were performed using Gas Chromatography-Mass Spectrometry (GC-MS), while elemental analysis was carried out using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS).
The data were subjected to multivariate data analysis techniques, namely the multi-criteria decision-making procedure Preference Ranking Organisation Method for Enrichment Evaluations (PROMETHEE), coupled with Geometrical Analysis for Interactive Aid (GAIA), in order to rank the samples and to display and examine the data. This study showed that, compared with the results of previous work carried out in Australia and overseas, the concentrations of pollutants in house dust in Brisbane and the surrounding areas were relatively high. The results also showed significant correlations between some of the physical parameters (type of building material, floor level, distance from industrial areas and major roads, and smoking) and the concentrations of pollutants. The type of building material and the age of a house were found to be two of the primary factors affecting the concentrations of pesticides and elements in house dust. The concentrations of these two classes of pollutant appear to be higher in old (timber) houses than in brick ones; in contrast, the concentrations of PAHs were higher in brick houses than in timber ones. Other factors, such as floor level and distance from the main street and industrial areas, also affected the concentrations of pollutants in the house dust samples. To apportion the sources of the pollutants and to understand their mechanisms, the Positive Matrix Factorisation (PMF) receptor model was applied. The results showed significant correlations between the concentrations of contaminants in house dust and the physical characteristics of the houses, such as their age and type, the distance from the main road and industrial areas, and smoking. Sources of pollutants were identified.
For PAHs, the sources were cooking activities, vehicle emissions, smoking, oil fumes, natural gas combustion and traces of diesel exhaust emissions; for pesticides the sources were application of pesticides for controlling termites in buildings and fences, treating indoor furniture and in gardens for controlling pests attacking horticultural and ornamental plants; for elements the sources were soil, cooking, smoking, paints, pesticides, combustion of motor fuels, residual fuel oil, motor vehicle emissions, wearing down of brake linings and industrial activities.
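PROMETHEE, the ranking method named above, works by pairwise comparison of alternatives on weighted criteria. As an illustration only (this is not the study's data or implementation), the sketch below applies a minimal PROMETHEE II with the "usual" preference function to three hypothetical dust samples, where lower concentrations are preferred:

```python
def promethee_ii(matrix, weights, maximise):
    """Return PROMETHEE II net outranking flows.
    matrix[i][j] is the score of alternative i on criterion j;
    maximise[j] says whether criterion j is to be maximised."""
    n = len(matrix)
    flows = [0.0] * n
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pi_ab = 0.0
            for j, w in enumerate(weights):
                diff = matrix[a][j] - matrix[b][j]
                if not maximise[j]:
                    diff = -diff
                if diff > 0:          # "usual" criterion: full preference on any improvement
                    pi_ab += w
            flows[a] += pi_ab
            flows[b] -= pi_ab
    return [f / (n - 1) for f in flows]

# Three hypothetical dust samples scored on PAHs, elements, pesticides;
# lower concentrations are better, so all criteria are minimised.
samples = [[12.0, 5.0, 0.8],
           [ 7.0, 9.0, 0.3],
           [15.0, 4.0, 1.1]]
weights = [0.4, 0.3, 0.3]
flows = promethee_ii(samples, weights, maximise=[False, False, False])
ranking = sorted(range(3), key=lambda i: -flows[i])
print(flows, ranking)
```

The net flows order the samples from least to most polluted under the chosen weights; GAIA would then project the same pairwise information onto a plane for visual inspection.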
Abstract:
The character of James Bond is, for many people, intrinsically linked with particular brands – Aston Martin, Bollinger, Omega, Smirnoff vodka, and so on. This direct association between character and brand highlights the central role of product placement in the film industry, and in the James Bond films in particular. Selling James Bond: Product Placement in the James Bond Films provides a comprehensive overview of the history of product placement in the James Bond series, charting the progression of the practice and drawing direct correlations to significant cultural and historical events that affected the number and types of products incorporated into the series. While primarily a financial arrangement, the practice of product placement must also be examined and understood in relation to these cultural contexts, an area of research so far largely ignored by academic study. Through extensive content analysis of the official James Bond film series, as well as directors' commentary and industry reports, this book illustrates the strong impact specific cultural and historical events have had on the practice of product placement in the series. In doing so, it provides an exciting and in-depth "behind the scenes" look at the James Bond film series and its complicated, sometimes contentious, history of product placement. In the process, it charts the gradual movement of product placement from the traditional background shot to being so embedded in the film narrative that placement has become simply another method for filmmakers to produce cultural meaning.
Abstract:
Monodisperse silica nanoparticles were synthesised by the well-known Stöber protocol, then dispersed in acetonitrile (ACN) and subsequently added to a bis(acetonitrile)gold(I) coordination complex ([Au(MeCN)2]+) in ACN. The silica hydroxyl groups were deprotonated in the presence of ACN, generating a formal negative charge on the siloxy groups. This allowed the [Au(MeCN)2]+ complex to undergo ligand exchange with the silica nanoparticles and form a surface coordination complex, with reduction to metallic gold (Au0) proceeding by an inner-sphere mechanism. The residual [Au(MeCN)2]+ complex was allowed to react with water, disproportionating into Au0 and Au(III), with the Au0 adding to the reduced gold already bound on the silica surface. The metallic gold seed surface so formed was found to be suitable for the conventional reduction of Au(III) to Au0 by ascorbic acid (ASC). This process generated a thin and uniform gold coating on the silica nanoparticles. The silica NP batches synthesised ranged in size from 45 to 460 nm; of these, the batches in the size range from 400 to 480 nm were used for the gold-coating experiments.
Abstract:
This article examines the current transfer pricing regime to consider whether it is a sound model to apply to modern multinational entities. The arm's length pricing methodology is examined to enable a discussion of the arguments in favour of such a regime. The article then refutes these arguments, concluding that, contrary to the very reason multinational entities exist, applying arm's length rules involves the legal fiction of imagining transactions between unrelated parties. Multinational entities exist to operate in ways that independent entities would not, which the arm's length rules fail to take into account. As such, there is clearly an air of artificiality in applying the arm's length standard. To demonstrate this artificiality with respect to modern multinational entities, multinational banks are used as an example. The article concludes that the separate entity paradigm adopted by the traditional transfer pricing regime is incongruous with the economic theory of modern multinational enterprises.
Creativity in policing: building the necessary skills to solve complex and protracted investigations
Abstract:
Despite an increased focus on proactive policing in recent years, criminal investigation is still perhaps the most important task of any law enforcement agency. As a result, the skills required to carry out a successful investigation, or to be an ‘effective detective’, have been subjected to much attention and debate (Smith and Flanagan, 2000; Dean, 2000; Fahsing and Gottschalk, 2008:652). Stelfox (2008:303) states that “The service’s capacity to carry out investigations comprises almost entirely the expertise of investigators.” In this respect, Dean (2000) highlighted the need to profile criminal investigators in order to promote further understanding of the cognitive approaches they take to the process of criminal investigation. As a result of his research, Dean (2000) produced a theoretical framework of criminal investigation comprising four distinct cognitive or ‘thinking’ styles: ‘Method’, ‘Challenge’, ‘Skill’ and ‘Risk’. While the Method and Challenge styles deal with adherence to Standard Operating Procedures (SOPs) and the internal ‘drive’ that keeps an investigator going, the Skill and Risk styles both tap into the concept of creativity in policing. It is these two latter styles that provide the focus for this paper. The paper presents a brief discussion of Dean’s (2000) Skill and Risk styles before giving an overview of the broader literature on creativity in policing. The potential benefits of a creative approach, as well as some hurdles which need to be overcome when proposing the integration of creativity within the policing sector, are then discussed. Finally, the paper concludes by proposing further research into Dean’s (2000) Skill and Risk styles, and by stressing the need for significant changes to the structure and approach of the traditional policing organisation before creativity in policing is given the status it deserves.
Abstract:
In phylogenetics, the unrooted model of phylogeny and the strict molecular clock model are two extremes of a continuum. Despite their dominance in phylogenetic inference, it is evident that both are biologically unrealistic and that the real evolutionary process lies between these two extremes. Fortunately, intermediate models employing relaxed molecular clocks have been described. These models open the gate to a new field of “relaxed phylogenetics.” Here we introduce a new approach to performing relaxed phylogenetic analysis. We describe how it can be used to estimate phylogenies and divergence times in the face of uncertainty in evolutionary rates and calibration times. Our approach also provides a means for measuring the clocklikeness of datasets and comparing this measure between different genes and phylogenies. We find no significant rate autocorrelation among branches in three large datasets, suggesting that autocorrelated models are not necessarily suitable for these data. In addition, we place these datasets on the continuum of clocklikeness between a strict molecular clock and the alternative unrooted extreme. Finally, we present analyses of 102 bacterial, 106 yeast, 61 plant, 99 metazoan, and 500 primate alignments. From these we conclude that our method is phylogenetically more accurate and precise than the traditional unrooted model while adding the ability to infer a timescale to evolution.
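The continuum between the strict and relaxed clock can be made concrete with a toy simulation: under an uncorrelated lognormal relaxed clock, each branch draws its own substitution rate independently, and the strict clock is the zero-variance special case. The sketch below is illustrative only, not the authors' implementation; the branch durations and rate parameters are invented:

```python
import math
import random

def branch_lengths(durations, mean_rate, stdev_log, seed=1):
    """Expected substitutions per site for each branch under an uncorrelated
    lognormal relaxed clock; stdev_log = 0 recovers the strict molecular clock."""
    rng = random.Random(seed)
    # Parameterise the lognormal so its mean equals mean_rate.
    mu = math.log(mean_rate) - 0.5 * stdev_log ** 2
    rates = [math.exp(rng.gauss(mu, stdev_log)) if stdev_log > 0 else mean_rate
             for _ in durations]
    return [r * t for r, t in zip(rates, durations)]

durations = [10.0, 4.0, 6.0, 2.0]   # hypothetical branch durations (Myr)
strict = branch_lengths(durations, mean_rate=0.01, stdev_log=0.0)
relaxed = branch_lengths(durations, mean_rate=0.01, stdev_log=0.5)
print(strict)    # strictly proportional to the durations
print(relaxed)   # proportionality broken by branch-specific rates
```

Inference runs this logic in reverse: given observed branch lengths, the method jointly estimates the durations (a timescale) and the per-branch rates, and the estimated rate variance measures how clocklike the dataset is.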
Abstract:
A standard method for the numerical solution of partial differential equations (PDEs) is the method of lines. In this approach the PDE is discretised in space using finite differences or similar techniques, and the resulting semidiscrete problem in time is integrated using an initial value problem solver. A significant challenge when applying the method of lines to fractional PDEs is that the non-local nature of the fractional derivatives results in a discretised system where each equation involves contributions from many (possibly all) spatial nodes. This has important consequences for the efficiency of the numerical solver. First, since the cost of evaluating the discrete equations is high, it is essential to minimise the number of evaluations required to advance the solution in time. Second, since the Jacobian matrix of the system is dense (partially or fully), methods that avoid the need to form and factorise this matrix are preferred. In this paper, we consider a nonlinear two-sided space-fractional diffusion equation in one spatial dimension. A key contribution of this paper is to demonstrate how an effective preconditioner is crucial for improving the efficiency of the method of lines for solving this equation. In particular, we show how to construct suitable banded approximations to the system Jacobian for preconditioning purposes that permit high orders and large stepsizes to be used in the temporal integration, without requiring dense matrices to be formed. The results of numerical experiments are presented that demonstrate the effectiveness of this approach.
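The idea of a banded approximation to a dense fractional Jacobian can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it takes a single implicit Euler step of a one-sided (Grünwald-Letnikov) fractional diffusion discretisation and solves the resulting dense linear system with GMRES, preconditioned by just the tridiagonal band of the iteration matrix. The grid size, fractional order alpha and stepsize are arbitrary choices for the demonstration:

```python
import numpy as np
from scipy.linalg import solve_banded
from scipy.sparse.linalg import LinearOperator, gmres

n, alpha, dt = 200, 1.8, 1e-3
h = 1.0 / (n + 1)

# Grunwald-Letnikov weights g_k for a fractional derivative of order alpha.
g = np.zeros(n + 1)
g[0] = 1.0
for k in range(1, n + 1):
    g[k] = g[k - 1] * (k - 1 - alpha) / k

# Dense discretisation matrix: row i couples node i to every node to its left,
# which is the non-locality that fills in the Jacobian.
A = np.zeros((n, n))
for i in range(n):
    for j in range(min(i + 2, n)):
        A[i, j] = g[i + 1 - j] / h ** alpha

J = np.eye(n) - dt * A            # implicit Euler iteration matrix (dense)

# Extract only the tridiagonal band of J, in solve_banded's storage layout.
ab = np.zeros((3, n))
ab[0, 1:] = np.diag(J, 1)         # superdiagonal
ab[1, :] = np.diag(J)             # main diagonal
ab[2, :-1] = np.diag(J, -1)       # subdiagonal

# Preconditioner: apply the inverse of the banded approximation (cheap O(n) solve).
M = LinearOperator((n, n), matvec=lambda v: solve_banded((1, 1), ab, v),
                   dtype=np.float64)

b = np.ones(n)
x, info = gmres(J, b, M=M, atol=0.0)
residual = np.linalg.norm(b - J @ x)
print(info, residual)
```

The point of the sketch is that the banded solve costs O(n) per iteration while capturing most of the Jacobian's mass near the diagonal, so the preconditioned Krylov solver converges in few iterations without ever factorising the dense matrix.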