983 results for Not ludic way of playing


Relevance:

100.00%

Publisher:

Abstract:

Counselling is an unregulated activity in Australia. No statutory regulation currently exists. As a result, different counselling organizations are promoting different voluntary standards for the practice of counselling. This has led to a credentialing dilemma in which counsellors and the public are confronted with a number of counselling qualification choices. This dilemma poses a number of questions: Should counselling become more regulated in Australia? At what level should counselling be regulated? Should there be various levels of counsellor regulation? This article provides an overview of the credentialing dilemma facing counselling in Australia, compares and contrasts the two main Australian accreditation efforts, and proposes cooperation as a way of navigating the dilemma. The implications for counselling as a profession are discussed along with suggestions for its development as a profession. This includes a discussion of the relative advantages and disadvantages of greater regulation of counselling as a professional activity in Australia. Specifically, what is and is not generally considered a profession is reviewed, different forms of credentialing are outlined, and general arguments for and against accreditation efforts are presented. The efforts of the Australian Counselling Association (ACA) and the Psychotherapy and Counselling Federation of Australia (PACFA) are compared and are shown to have common ground. Consequently, ways in which the main counselling organizations may best work in conjunction to promote counselling as a profession in Australia are proposed. These suggestions include good communication, collaboration, and the avoidance of turf wars. Specifically, it is proposed that the ACA and PACFA collaborate on developing a combined independent registration list supported by both organizations or, at a minimum, that each organization grant mutual recognition to the other's register.

Relevance:

100.00%

Publisher:

Abstract:

While others have attempted to determine, by way of mathematical formulae, optimal resource duplication strategies for random walk protocols, this paper is concerned with studying the emergent effects of dynamic resource propagation and replication. In particular, we show, via modelling and experimentation, that under any given decay (purge) rate the number of nodes that have knowledge of a particular resource converges to a fixed point or a limit cycle. We also show that even for high rates of decay - that is, when few nodes have knowledge of a particular resource - the number of hops required to find that resource is small.
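
The convergence behaviour described above can be illustrated with a toy simulation. This is a minimal sketch under assumed rules (random-contact replication and an independent per-copy purge probability) with invented parameter values, not the paper's model:

```python
import random

def simulate(n_nodes=200, p_replicate=0.3, p_decay=0.2, steps=300, seed=7):
    """Each step, every holder replicates the resource to a random node with
    probability p_replicate; every copy is then purged with probability
    p_decay. Returns the number of holders after each step."""
    rng = random.Random(seed)
    holders = set(range(10))                 # seed a few initial copies
    history = []
    for _ in range(steps):
        spread = set(holders)
        for _holder in holders:
            if rng.random() < p_replicate:
                spread.add(rng.randrange(n_nodes))   # random-contact replication
        holders = {n for n in spread if rng.random() >= p_decay}  # decay/purge
        history.append(len(holders))
    return history

print(simulate()[-10:])   # typically settles around a fixed level, not 0 or n_nodes
```

For these rates the holder count drifts to an equilibrium where replication onto nodes that already hold a copy balances the purge rate; raising p_decay lowers the equilibrium level without, in this toy model, destroying it.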

Relevance:

100.00%

Publisher:

Abstract:

The majority of ‘service’ literature has focused on the production side of service work (i.e. employees and management), while treating the role of the customer and/or consumer as secondary (Korczynski and Ott, 2004). Those authors who have addressed the role consumption plays in shaping and maintaining individuals' self-identity have tended to overemphasize the dominance of consumer culture in shaping ‘our consciousness’ (Ritzer, 1999), with little in the way of empirical evidence to support these assertions. This paper develops the conceptualization of service work and consumer culture literature by placing more emphasis on the customer in the service encounter. Using an ethnographic study of a ‘high class’ department store, this paper addresses employee and customer identity and the nature of managerial, employee and customer control within this ‘exclusive’ context. Of particular interest is how employees and customers ‘embody’ this control. Using Bourdieu’s (1986) conception of class and habitus, the concept of exclusivity goes beyond the management/service-worker dyad by providing a means of investigating identity control by the organization over both customers and service workers. However, an organization’s exclusivity is not a closed, normative pursuit of control; the study shows this enterprise to be part of a contested terrain, revealing the ambiguity and ‘openness’ of control practices and pursuits. In order to uphold the ideal of exclusivity, management, service workers and customers must all engage in a precarious quest for establishing and maintaining a sense of control and/or identity. This paper demonstrates the continuing contradiction between bureaucratic practices of control and consumer culture, and highlights the need for research that investigates the context-dependent nature of control in service-related and consumer studies.

Relevance:

100.00%

Publisher:

Abstract:

A plethora of process modeling techniques has been proposed over the years. One way of evaluating and comparing the scope and completeness of techniques is by way of representational analysis. The purpose of this paper is to examine how process modeling techniques have developed over the last four decades. The basis of the comparison is the Bunge-Wand-Weber representation model, a benchmark used for the analysis of grammars that purport to model the real world and the interactions within it. This paper presents a comparison of representational analyses of several popular process modeling techniques and has two main outcomes. First, it provides insights, within the boundaries of a representational analysis, into the extent to which process modeling techniques have developed over time. Second, the findings also indicate areas in which the underlying theory seems to be over-engineered or lacking in specialization.
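
The bookkeeping behind such a representational analysis can be made concrete with a small sketch. The mini-mapping below is invented for illustration and is not taken from any published BWW analysis; it simply shows how construct deficit and construct redundancy fall out of a mapping from ontological constructs to grammar constructs:

```python
# Hypothetical mapping from BWW ontological constructs to the constructs of
# some modelling grammar (invented for illustration only).
mapping = {
    "thing":          ["entity"],
    "property":       ["attribute", "association"],   # represented twice
    "state":          [],                              # not represented
    "transformation": ["activity"],
    "event":          ["event", "message"],            # represented twice
}

deficit = [c for c, reps in mapping.items() if not reps]       # incompleteness
redundancy = [c for c, reps in mapping.items() if len(reps) > 1]

print("construct deficit:   ", deficit)
print("construct redundancy:", redundancy)
print(f"ontological completeness: {1 - len(deficit) / len(mapping):.0%}")
```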

Relevance:

100.00%

Publisher:

Abstract:

Email has been used for some years as a low-cost telemedicine medium to provide support for developing countries. However, all operations have been relatively small scale and fairly labour intensive to administer. A scalable, automatic message-routing system was constructed which automates many of the tasks. During a four-month study period in 2002, 485 messages were processed automatically. There were 31 referrals from eight hospitals in three countries. These referrals were handled by 25 volunteer specialists from a panel of 42. Two system operators, located 10 time zones apart, managed the system. The median time from receipt of a new referral to its allocation to a specialist was 1.0 days (interquartile range 0.7-2.4). The median interval between allocation and first reply was 0.7 days (interquartile range 0.3-2.3). Automatic message handling solves many of the problems of manual email telemedicine systems and represents a potentially scalable way of doing low-cost telemedicine in the developing world.
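
As a rough illustration of what 'automatic message-routing' can mean in this setting, the sketch below rotates incoming referrals through a panel of volunteers. The allocation rule, field names and data are hypothetical; the actual system's logic is not described here in enough detail to reproduce:

```python
from collections import deque
from datetime import datetime, timezone

class ReferralRouter:
    """Toy round-robin allocator: each referral goes to the specialist in the
    relevant panel who was allocated to least recently (hypothetical logic)."""

    def __init__(self, panels):
        # panels: dict mapping specialty -> list of volunteer specialists
        self.queues = {s: deque(names) for s, names in panels.items()}

    def allocate(self, referral):
        queue = self.queues[referral["specialty"]]
        specialist = queue.popleft()   # least recently allocated volunteer
        queue.append(specialist)       # rotate to the back of the panel
        return {"referral_id": referral["id"], "assigned_to": specialist,
                "allocated_at": datetime.now(timezone.utc).isoformat()}

router = ReferralRouter({"dermatology": ["dr_a", "dr_b"], "paediatrics": ["dr_c"]})
print(router.allocate({"id": 1, "specialty": "dermatology"}))
```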

Relevance:

100.00%

Publisher:

Abstract:

Recent discussion of the knowledge-based economy draws increasing attention to the role that the creation and management of knowledge plays in economic development. Development of human capital, the principal mechanism for knowledge creation and management, becomes a central issue for policy-makers and practitioners at the regional, as well as national, level. Facing competition both within and across nations, regional policy-makers view human capital development as a key to strengthening the positions of their economies in the global market. Against this background, the aim of this study is to go some way towards answering the question of whether, and how, investment in education and vocational training at regional level provides these territorial units with comparative advantages. The study reviews literature in economics and economic geography on economic growth (Chapter 2). In the growth model literature, human capital has gained increased recognition as a key production factor along with physical capital and labour. Although they leave technical progress as an exogenous factor, neoclassical Solow-Swan models have improved their estimates through the inclusion of human capital. In contrast, endogenous growth models place investment in research at centre stage in accounting for technical progress. As a result, they often focus upon research workers, who embody high-order human capital, as a key variable in their framework. An issue of discussion is how human capital facilitates economic growth: is it the level of its stock or its accumulation that influences the rate of growth? In addition, these economic models are criticised in the economic geography literature for their failure to consider spatial aspects of economic development, and particularly for their lack of attention to tacit knowledge and the urban environments that facilitate the exchange of such knowledge. Our empirical analysis of European regions (Chapter 3) shows that investment by individuals in human capital formation has distinct patterns. Those regions with a higher level of investment in tertiary education tend to have a larger concentration of information and communication technology (ICT) sectors (including provision of ICT services and manufacture of ICT devices and equipment) and research functions. Not surprisingly, regions with major metropolitan areas where higher education institutions are located show a high enrolment rate for tertiary education, suggesting a possible link to the demand from high-order corporate functions located there. Furthermore, the rate of human capital development (at the level of vocational upper secondary education) appears to have a significant association with the level of entrepreneurship in emerging industries such as ICT-related services and ICT manufacturing, whereas no such association is found with traditional manufacturing industries. In general, a high level of investment by individuals in tertiary education is found in those regions that accommodate high-tech industries and high-order corporate functions such as research and development (R&D). These functions are supported through the urban infrastructure and public science base, facilitating the exchange of tacit knowledge. Such regions also enjoy a low unemployment rate. However, the existing stock of human and physical capital in those regions with a high level of urban infrastructure does not lead to a high rate of economic growth.
Our empirical analysis demonstrates that the rate of economic growth is determined by the accumulation of human and physical capital, not by the level of their existing stocks. We found no significant effects of scale that would favour those regions with a larger stock of human capital. The primary policy implication of our study is that, in order to facilitate economic growth, education and training need to supply human capital at a faster pace than simply replenishing it as it disappears from the labour market. Given the significant impact of high-order human capital (such as business R&D staff in our case study), as well as the increasingly fast pace of technological change that makes human capital obsolete, a concerted effort needs to be made to facilitate its continuous development.
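
For orientation, the augmented neoclassical production function of Mankiw, Romer and Weil (1992) is the standard way of adding human capital to the Solow-Swan framework discussed above; it is given here as background, not as this study's own specification:

```latex
% Augmented Solow-Swan (Mankiw-Romer-Weil) production function, shown for
% orientation only; the study's estimated specification may differ.
\[
  Y_t = K_t^{\alpha}\, H_t^{\beta}\, (A_t L_t)^{1-\alpha-\beta},
  \qquad \alpha + \beta < 1,
\]
% where $K$ is physical capital, $H$ human capital, $L$ labour, and $A$
% exogenous labour-augmenting technical progress. The stock-versus-
% accumulation question above is then whether growth depends on the levels
% of $K$ and $H$ or on their rates of change.
```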

Relevance:

100.00%

Publisher:

Abstract:

Residual current-operated circuit-breakers (RCCBs) have proved useful devices for the protection of both human beings against ventricular fibrillation and installations against fire. Although they work well with sinusoidal waveforms, there is little published information on their characteristics. Due to shunt-connected non-linear devices, not the least of which is power electronic equipment, the supply is distorted. Consequently, RCCBs, as well as other protection relays, are subject to non-sinusoidal current waveforms. Recent studies showed that RCCBs are greatly affected by harmonics; however, the reasons for this are not clear. A literature search has also shown that there are inconsistencies in the analysis of the effect of harmonics on protection relays. In this work, the way RCCBs operate is examined, and a model is then built with the aim of assessing the effect of non-sinusoidal current on RCCBs. Tests are then carried out on a number of RCCBs and these, when compared with the results from the model, show good correlation. In addition, the model also enables us to explain the RCCBs' characteristics for pure sinusoidal current. In the model developed, various parameters are evaluated, but special attention is paid to the instantaneous value of the current and the tripping mechanism movement. A similar assessment method is then used to assess the effect of harmonics on two types of protection relay: the electromechanical instantaneous relay and the time overcurrent relay. A model is built for each of them, which is then simulated on the computer. Test results compare well with the simulation results, and thus the models developed can be used to explain the relays' behaviour in a harmonics environment. The author's models, analysis and tests show that RCCBs and protection relays are affected by harmonics in a way determined by the waveform and the relay constants. The method developed provides a useful tool and the basic methodology to analyse the behaviour of RCCBs and protection relays in a harmonics environment. These results have many implications, especially for the way RCCBs and relays should be tested if harmonics are to be taken into account.
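
Why waveform shape matters to a device that responds to instantaneous current can be seen from a short numerical sketch. The amplitudes and phase below are arbitrary example values, and the comparison is illustrative rather than a reproduction of the author's model:

```python
import numpy as np

f = 50.0                                          # fundamental frequency, Hz
t = np.linspace(0.0, 1.0 / f, 2000)               # one fundamental cycle
i1 = np.sin(2 * np.pi * f * t)                    # fundamental, 1.0 p.u.
i3 = 0.4 * np.sin(2 * np.pi * 3 * f * t + np.pi)  # 40% third harmonic

i = i1 + i3
print(f"RMS:  {np.sqrt(np.mean(i ** 2)):.3f} p.u.")
print(f"Peak: {np.abs(i).max():.3f} p.u.")
# Changing only the harmonic's phase changes the peak (and the wave shape
# seen by the tripping mechanism) while leaving the RMS value unchanged --
# one reason RMS-calibrated test results can mislead under distortion.
```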

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE. A methodology for noninvasively characterizing the three-dimensional (3-D) shape of the complete human eye is not currently available for research into ocular diseases that have a structural substrate, such as myopia. A novel application of a magnetic resonance imaging (MRI) acquisition and analysis technique is presented that, for the first time, allows the 3-D shape of the eye to be investigated fully. METHODS. The technique involves the acquisition of a T2-weighted MRI, which is optimized to reveal the fluid-filled chambers of the eye. Automatic segmentation and meshing algorithms generate a 3-D surface model, which can be shaded with morphologic parameters such as distance from the posterior corneal pole and deviation from sphericity. Full details of the method are illustrated with data from 14 eyes of seven individuals. The spatial accuracy of the calculated models is demonstrated by comparing the MRI-derived axial lengths with values measured in the same eyes using interferometry. RESULTS. The color-coded eye models showed substantial variation in the absolute size of the 14 eyes. Variations in the sphericity of the eyes were also evident, with some appearing approximately spherical, others clearly oblate, and one slightly prolate. Nasal-temporal asymmetries were noted in some subjects. CONCLUSIONS. The MRI acquisition and analysis technique provides a novel way of examining 3-D ocular shape. The ability to stratify and analyze eye shape, ocular volume, and sphericity will further extend the understanding of which specific biometric parameters predispose emmetropic children subsequently to develop myopia. Copyright © Association for Research in Vision and Ophthalmology.
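
One of the morphologic parameters mentioned above, deviation from sphericity, can be quantified by fitting a sphere to the surface model and measuring radial residuals. The sketch below uses a standard linear least-squares sphere fit on synthetic points; it is an illustrative method, not the authors' pipeline, and `points` stands in for vertices of the MRI-derived mesh:

```python
import numpy as np

def fit_sphere(points):
    """Least-squares fit of x^2 + y^2 + z^2 = 2ax + 2by + 2cz + d.
    Returns the centre (a, b, c) and the radius."""
    A = np.c_[2 * points, np.ones(len(points))]
    b = (points ** 2).sum(axis=1)
    (cx, cy, cz, d), *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = np.array([cx, cy, cz])
    return centre, np.sqrt(d + centre @ centre)

# Synthetic stand-in for mesh vertices: a slightly oblate globe
# (12 mm equatorial, 11 mm polar semi-axes).
rng = np.random.default_rng(0)
u, v = rng.uniform(0, np.pi, 500), rng.uniform(0, 2 * np.pi, 500)
points = np.c_[12 * np.sin(u) * np.cos(v),
               12 * np.sin(u) * np.sin(v),
               11 * np.cos(u)]

centre, radius = fit_sphere(points)
deviation = np.linalg.norm(points - centre, axis=1) - radius  # signed, mm
print(f"best-fit radius {radius:.2f} mm, deviation sd {deviation.std():.2f} mm")
```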

Relevance:

100.00%

Publisher:

Abstract:

Gestalt grouping rules imply a process or mechanism for grouping together local features of an object into a perceptual whole. Several psychophysical experiments have been interpreted as evidence for constrained interactions between nearby spatial filter elements and this has led to the hypothesis that element linking might be mediated by these interactions. A common tacit assumption is that these interactions result in response modulation which disturbs a local contrast code. We addressed this possibility by performing contrast discrimination experiments using two-dimensional arrays of multiple Gabor patches arranged either (i) vertically, (ii) in circles (coherent conditions), or (iii) randomly (incoherent condition), as well as for a single Gabor patch. In each condition, contrast increments were applied to either the entire test stimulus (experiment 1) or a single patch whose position was cued (experiment 2). In experiment 3, the texture stimuli were reduced to a single contour by displaying only the central vertical strip. Performance was better for the multiple-patch conditions than for the single-patch condition, but whether the multiple-patch stimulus was coherent or not had no systematic effect on the results in any of the experiments. We conclude that constrained local interactions do not interfere with a local contrast code for our suprathreshold stimuli, suggesting that, in general, this is not the way in which element linking is achieved. The possibility that interactions are involved in enhancing the detectability of contour elements at threshold remains unchallenged by our experiments.
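
The basic element of these stimuli, a Gabor patch, is a sinusoidal grating under a Gaussian envelope. A minimal generator follows; the parameter values are illustrative and are not those used in the experiments:

```python
import numpy as np

def gabor(size=64, wavelength=8.0, sigma=8.0, theta=0.0, phase=0.0,
          contrast=0.5):
    """Luminance modulation of a Gabor patch: cosine carrier at orientation
    theta (radians) under an isotropic Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half, -half:half].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)        # carrier axis
    carrier = np.cos(2 * np.pi * xr / wavelength + phase)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return contrast * carrier * envelope

patch = gabor()   # vertically oriented grating with the default theta
print(patch.shape, round(patch.min(), 2), round(patch.max(), 2))
# A contrast increment, as in the experiments above, simply scales `contrast`.
```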

Relevance:

100.00%

Publisher:

Abstract:

Increased awareness of the crucial role of leadership as a competitive advantage for organisations (McCall, 1998; Petrick, Scherer, Brodzinski, Quinn, & Ainina, 1999) has led to billions spent on leadership development programmes and training (Avolio & Hannah, 2008). However, research reports confusing and contradictory evidence regarding return on investment and developmental outcomes, and substantial variance has been observed across studies (Avolio, Reichard, Hannah, Walumbwa, & Chan, 2009). The purpose of this thesis is to understand the mechanisms underlying this variability in leadership development. Of the many factors at play in the process, such as programme design and delivery, organisational support, and perceptions of relevance (Mabey, 2002; Day, Harrison, & Halpin, 2009), individual differences and characteristics stand out. One way in which individuals differ is in their Developmental Readiness (DR), a concept recently introduced in the literature that may well explain this variance and that has been proposed to accelerate development (Avolio & Hannah, 2008, 2009). Building on previous work, DR is introduced and conceptualised somewhat differently here. In this study, DR is construed as comprising self-awareness, self-regulation, and self-motivation, proposed by Day (2000) to be the backbone of leadership development. DR is suggested to moderate the developmental process. Furthermore, personality dispositions and individual values are proposed to be precursors of DR. The empirical research conducted uses a pre-test post-test quasi-experimental design. Before the main study, both a measure of Developmental Readiness and a competency profiling measure are tested in two pilot studies. Results do not find evidence of a direct effect of leadership development programmes on development, but do support an interactive effect between DR and leadership development programmes. The personality dispositions Agreeableness, Conscientiousness, and Openness to Experience and the value orientations Conservation, Open Orientation, and Closed Orientation are found to significantly predict DR. Finally, the theoretical and practical implications of the findings are discussed.
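
The moderation claim, an interactive effect of DR with programme participation, corresponds to an interaction term in a regression on the pre-to-post gain. The sketch below uses simulated data and hypothetical variable names, not the thesis dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # programme vs. comparison group
    "dr": rng.normal(0, 1, n),          # Developmental Readiness (z-scored)
})
# Simulate the reported pattern: no direct programme effect, but a
# programme x DR interaction driving the developmental gain.
df["gain"] = 0.5 * df["treated"] * df["dr"] + rng.normal(0, 1, n)

model = smf.ols("gain ~ treated * dr", data=df).fit()
print(model.params)   # the treated:dr coefficient carries the moderation effect
```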

Relevance:

100.00%

Publisher:

Abstract:

The aim of this project was to carry out an investigation into suitable alternatives to gasoline for use in modern automobiles. The fuel would provide the western world with a means of extending the natural gasoline resources, and the third world a way of cutting down its dependence on the oil-producing countries for its energy supply. Alcohols, namely methanol and ethanol, provide this solution. They can be used as gasoline extenders or as fuels in their own right.

In order to fulfil the aims of the project, a literature study was carried out to investigate methods and costs of producing these fuels. An experimental programme was then set up in which the performance of the alcohols was studied on a conventional engine. The engine used for this purpose was the Fiat 127 930cc four-cylinder engine, chosen because of its popularity in the European countries. The Weber fixed-jet carburettor, since it was designed to be used with gasoline, was adapted so that the alcohol fuels and the blends could be used in the most efficient way. This was mainly to take account of the lower heat content of the alcohols. The adaptation of the carburettor took the form of enlarging the main metering jet. Allowances for the alcohols' lower specific gravity were made during fuel metering.

Owing to the low front-end volatility of methanol and ethanol, it was expected that start-up problems would occur. An experimental programme was set up to determine the temperature range for a minimum required percentage 'take-off' that would ease start-up, since it was determined that a take-off of about 5% v/v liquid in the vapour phase would be sufficient for starting. Additives such as iso-pentane and n-pentane were used to improve the front-end volatility. This proved to be successful.

The lower heat content of the alcohol fuels also meant that a greater charge of fuel would be required. This was seen to pose further problems with fuel distribution from the carburettor to the individual cylinders on a multi-cylinder engine. Since it was not possible to modify the existing manifold on the Fiat 127 engine, experimental tests on manifold geometry were carried out using the Ricardo E6 single-cylinder variable-compression engine. Results from these tests showed that the length, shape and cross-sectional area of the manifold play an important part in the distribution of the fuel entering the cylinder (i.e. vapour phase; vapour/small liquid droplet/liquid film phase; vapour/large liquid droplet/liquid film phase; etc.).

The solvent properties of the alcohols and their greater electrical conductivity suggested that the materials used in the engine would be prone to chemical attack. In order to determine the type and rate of chemical attack, an experimental programme was set up whereby carburettor and other components were immersed in the alcohols and in blends of alcohol with gasoline. The test fuels were aerated and in some instances kept at temperatures ranging from 50°C to 90°C. Results from these tests suggest that not all materials used in the conventional engine are equally suitable for use with alcohols and alcohol/gasoline blends. Aluminium, for instance, was severely attacked by methanol, causing pitting and pin-holing in the surface.

In general, this whole experimental programme gave valuable information on the acceptability of substitute fuels. While the long-term effects of alcohol use merit further study, it is clear that methanol and ethanol will be increasingly used in place of gasoline.
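
The carburettor adaptation described above follows from a simple energy balance: to deliver the same energy per intake charge, fuel mass flow must rise in inverse proportion to the fuel's lower heating value, with a specific-gravity correction because the jet meters by volume. The figures below are typical handbook values, not measurements from the thesis, and the orifice-area scaling is only a first approximation:

```python
# Rough jet-sizing arithmetic for alcohol fuels (illustrative handbook
# values, not data from the thesis).
LHV = {"gasoline": 44.0, "methanol": 19.9, "ethanol": 26.8}  # MJ/kg, approx.
SG  = {"gasoline": 0.74, "methanol": 0.79, "ethanol": 0.79}  # specific gravity

for fuel in ("methanol", "ethanol"):
    mass_ratio = LHV["gasoline"] / LHV[fuel]            # same energy per charge
    vol_ratio = mass_ratio * SG["gasoline"] / SG[fuel]  # the jet meters by volume
    dia_ratio = vol_ratio ** 0.5   # orifice flow scales roughly with area ~ d^2
    print(f"{fuel}: ~{mass_ratio:.2f}x fuel mass, ~{vol_ratio:.2f}x volume, "
          f"jet diameter ~{dia_ratio:.2f}x")
```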

Relevance:

100.00%

Publisher:

Abstract:

Conventional structured methods of software engineering are often based on the use of functional decomposition coupled with the Waterfall development process model. This approach is argued to be inadequate for coping with the evolutionary nature of large software systems. Alternative development paradigms, including the operational paradigm and the transformational paradigm, have been proposed to address the inadequacies of this conventional view of software development, and these are reviewed. JSD is presented as an example of an operational approach to software engineering, and is contrasted with other well-documented examples. The thesis shows how aspects of JSD can be characterised with reference to formal language theory and automata theory. In particular, it is noted that Jackson structure diagrams are equivalent to regular expressions and can be thought of as specifying corresponding finite automata. The thesis discusses the automatic transformation of structure diagrams into finite automata using an algorithm adapted from compiler theory, and then extends the technique to deal with areas of JSD which are not strictly formalisable in terms of regular languages. In particular, an elegant and novel method for dealing with so-called recognition (or parsing) difficulties is described. Various applications of the extended technique are described. They include a new method of automatically implementing the dismemberment transformation; an efficient way of implementing inversion in languages lacking a goto statement; and a new in-the-large implementation strategy.
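
The equivalence noted above between Jackson structure diagrams and regular expressions can be made concrete with a small sketch. The tuple encoding is an illustrative assumption, not JSD notation: sequence, selection and iteration nodes map directly onto concatenation, alternation and the Kleene star, after which standard compiler-theory constructions yield the corresponding finite automaton:

```python
def to_regex(node):
    """Translate a structure-diagram node into a regular expression.
    Encoding (assumed for illustration): a string is a primitive action;
    ("seq", ...) is sequence, ("sel", ...) selection, ("itr", x) iteration."""
    if isinstance(node, str):
        return node
    kind, *children = node
    if kind == "seq":
        return "".join(to_regex(c) for c in children)
    if kind == "sel":
        return "(" + "|".join(to_regex(c) for c in children) + ")"
    if kind == "itr":
        return "(" + to_regex(children[0]) + ")*"
    raise ValueError(f"unknown node kind: {kind}")

# A file: a header, then zero or more records, each a credit or a debit.
diagram = ("seq", "h", ("itr", ("sel", "c", "d")))
print(to_regex(diagram))   # -> h((c|d))*  (compilable to a finite automaton)
```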

Relevance:

100.00%

Publisher:

Abstract:

This research concerns the development of coordination and co-governance within three different regeneration programmes within one Midlands city over the period from 1999 to 2002. The New Labour government, in office since 1997, had an agenda for ‘joining-up’ government, part of which has had considerable impact in the area of regeneration policy. Joining-up government encompasses a set of related activities which can include the coordination of policy-making and service delivery. In regeneration, it also includes a commitment to operate through co-governance. Central government and local and regional organisations have sought to put this idea into practice by using what may be referred to as network management processes. Many characteristics of new policies are designed to address the management of networks. Network management is not new in this area; it has developed at least since the early 1990s with the City Challenge and Single Regeneration Budget (SRB) programmes as a way of encouraging more inclusive and effective regeneration interventions. Network management theory suggests that better management can improve decision-making outcomes in complex networks. The theories and concepts are utilised in three case studies as a way of understanding how and why regeneration attempts demonstrate real advances in inter-organisational working at certain times whilst faltering at others. Current cases are compared to the historical case of the original SRB programme as a method of assessing change. The findings suggest that: the use of network management can be identified at all levels of governance; as previous literature has highlighted, central government is the most important actor regarding network structuring, although network structuring and game management are both practised by central and local actors; all three of the theoretical perspectives within network management (instrumental, institutional and interactive) can be identified within UK regeneration networks, all may have a role to play with no single perspective likely to succeed on its own, and all could therefore make an important contribution to the understanding of how groups can be brought together to work jointly; the findings support Klijn’s (1997) assertion that the institutional perspective is dominant for understanding network management processes; instrumentalism continues on all sides, as the acquisition of resources remains the major driver for partnership activity; and the level of interaction appears to be low despite the intentions for interactive decision-making. Overall, network management remains partial. Little attention is paid to the issues of accountability or to the institutional structures which can prevent networks from implementing the policies designed by central government and/or the regional tier.

Relevance:

100.00%

Publisher:

Abstract:

One way of describing this thesis is to state that it attempts to explicate the context within which an application of Stafford Beer's Viable System Model (VSM) makes cybernetic sense. The thesis will attempt to explain how such a context is presently not clearly enunciated, and why such a lack hinders communication of the model together with its consequent effective take-up by the student or practitioner. The epistemological grounding of the VSM will be described as concerning the ontology of the individuals who apply it and give witness to its application. In describing a particular grounding for the Viable System Model, I am instantiating a methodology which I call a ‘hermeneutics of distinction’. The final two chapters explicate such a methodology, and consider the implications for the design of a computer system. This thesis is grounded in contemporary insights into the nervous system, and in research into the biology of language and cognition. Its conclusions emerge from a synthesis of the twin discourses of Stafford Beer and Humberto Maturana.

Relevance:

100.00%

Publisher:

Abstract:

The first clinically proven nicotine replacement product to obtain regulatory approval was Nicorette® gum. It provides a convenient way of delivering nicotine directly to the buccal cavity, thus circumventing 'first-pass' elimination following gastrointestinal absorption. Since launch, Nicorette® gum has been investigated in numerous clinical studies, which are often difficult to compare due to large variations in study design and degree of sophistication. In order to standardise testing, in 2000 the European Pharmacopoeia introduced an apparatus to investigate the in vitro release of drug substances from medicated chewing gum. With use of the chewing machine, the main aims of this project were to determine factors that could affect release from Nicorette® gum, to develop an in vitro in vivo correlation (IVIVC) and to investigate the effect of formulation variables on the release of nicotine from gums. A standard in vitro test method was developed. The gum was placed in the chewing chamber with 40 mL of artificial saliva at 37°C and chewed at 60 chews per minute. The chew rate, the type of dissolution medium used, and the pH, volume, temperature and ionic strength of the dissolution medium were altered to investigate the effects on release in vitro. It was found that increasing the temperature of the dissolution medium and the rate at which the gums were chewed resulted in a greater release of nicotine, whilst increasing the ionic strength of the dissolution medium to 80 mM resulted in a lower release. The addition of 0.1% sodium lauryl sulphate to the artificial saliva was found to double the release of nicotine compared to the use of artificial saliva or water alone, although altering the dissolution volume and the starting pH did not affect the release. The increase in pH may be insufficient to provide optimal conditions for nicotine absorption, since the rate at which nicotine is transported through the buccal membrane was found to be higher at pH values greater than 8.6, where nicotine is predominately unionised. Using a time-mapping function, it was also possible to establish a Level A in vitro in vivo correlation. The 4 mg Nicorette® gum was chewed at various chew rates in vitro and correlated to an in vivo chew-out study. All chew rates used in vitro could be successfully used for IVIVC purposes; however, statistically, chew rates of 10 and 20 chews per minute performed better than all other chew rates. Finally, a series of nicotine gums was made to investigate the effect of formulation variables on the release of nicotine from the gum. Using a directly compressible gum base, the gums crumbled when chewed in vitro in comparison to Nicorette®, resulting in a faster release of nicotine. To investigate the effect of altering the gum base, the concentration of sodium salts, the sugar syrup, the form of the active drug, the addition sequence and the incorporation of surfactant into the gum, the traditional manufacturing method was used to make a series of gum formulations. Results showed that the time of addition of the active drug, the incorporation of surfactants and the use of a different gum base all increased the release of nicotine from the gum. In contrast, reducing the concentration of sodium carbonate resulted in a lower release. Using a stronger nicotine ion-exchange resin delayed the release of nicotine from the gum, whilst altering the concentration of sugar syrup had little effect on the release but altered the texture of the gum.
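
The time-mapping step behind a Level A correlation can be sketched as follows: rescale in vitro time by a fitted factor, interpolate both profiles onto a common grid, and check that the fraction released predicts the fraction absorbed point-to-point. The profiles and scaling factor below are invented for illustration and are not the study's data:

```python
import numpy as np

t_vitro = np.array([0, 5, 10, 20, 30, 45, 60])          # min, chewing machine
f_vitro = np.array([0, 20, 35, 55, 70, 85, 95]) / 100   # fraction released

t_vivo = np.array([0, 10, 20, 40, 60, 90, 120])         # min, chew-out study
f_vivo = np.array([0, 18, 33, 52, 68, 84, 93]) / 100    # fraction absorbed

scale = 2.0                                   # assumed time-mapping factor
grid = np.linspace(0, 60, 13)                 # common (in vitro) time grid
fv = np.interp(grid, t_vitro, f_vitro)
fa = np.interp(grid, t_vivo / scale, f_vivo)  # map in vivo time onto the grid

slope, intercept = np.polyfit(fv, fa, 1)
r = np.corrcoef(fv, fa)[0, 1]
print(f"f_vivo ~ {slope:.2f} * f_vitro + {intercept:.2f}, r = {r:.3f}")
# A Level A IVIVC requires this point-to-point relationship to hold across
# the whole profile, ideally with slope near 1 and intercept near 0.
```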