886 results for scale development
Abstract:
Two forms of small-scale forestry are developing in Australia, each with different impacts on rural communities. One is based on growing short-rotation Eucalyptus globulus (blue gum) for pulp and the other on production of higher-value products from longer-rotation native hardwoods. Several impediments exist to further development of small-scale forestry, including the lack of a small-scale forestry culture, concerns over harvest rights, lack of market development, the long wait for returns, and satisfaction with current land uses. Nevertheless, the rapid increase in farm woodlot establishment in the past five years has paralleled the strong increase in the private industrial plantation estate. As markets develop and hindrances are overcome, landholders not previously interested in small-scale forestry may consider it a worthwhile land use.
Abstract:
Prior research demonstrates that understanding theory of mind (ToM) is seriously and similarly delayed in late-signing deaf children and children with autism. Are these children simply delayed in timing relative to typical children, or do they demonstrate different patterns of development? The current research addressed this question by testing 145 children (ranging from 3 to 13 years) with deafness, autism, or typical development using a ToM scale. Results indicate that all groups followed the same sequence of steps, up to a point, but that children with autism showed an importantly different sequence of understandings (in the later steps of the progression) relative to all other groups.
Abstract:
This paper provides information on the experimental set-up, data collection methods and results to date for the project "Large scale modelling of coarse grained beaches", undertaken at the Large Wave Channel (GWK) of FZK in Hannover by an international group of researchers in Spring 2002. The main objective of the experiments was to provide full-scale measurements of cross-shore processes on gravel and mixed beaches for the verification and further development of cross-shore numerical models of gravel and mixed sediment beaches. Identical random and regular wave tests were undertaken for a gravel beach and a mixed sand/gravel beach set up in the flume. Measurements included profile development, water surface elevation along the flume, internal pressures in the swash zone, piezometric head levels within the beach, run-up, flow velocities in the surf zone and sediment size distributions. The purpose of the paper is to present to the scientific community the experimental procedure, a summary of the data collected, some initial results, as well as a brief outline of the ongoing research being carried out with the data by different research groups. The experimental data are available to the entire scientific community following submission of a statement of objectives, specification of data requirements and an agreement to abide by the GWK and EU protocols. (C) 2005 Elsevier B.V. All rights reserved.
Abstract:
Community-based treatment and care of people with psychiatric disabilities has meant that they are now more likely to engage in the parenting role. This has led to the development of programs designed to enhance the parenting skills of people with psychiatric disabilities. Evaluation of these programs has been hampered by a paucity of evaluation tools. This study's aim was to develop and trial a tool that examined the parent-child interaction within a group setting, was functional and easy to use, required minimum training and equipment, and had acceptable levels of reliability and validity. The revised tool yielded a single scale with acceptable reliability. It had discriminative validity and concurrent validity with non-independent global ratings of parenting. Sensitivity to change was not investigated. The findings suggest that this method of evaluating parenting is likely to have both clinical and research utility and further investigation of the psychometric properties of the tool is warranted.
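Internal-consistency reliability of a single summated scale, of the kind reported above, is typically indexed by Cronbach's alpha. The following is a minimal sketch of that computation; the item scores are simulated for illustration and are not data from the study:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Simulated 5-item scale in which items share a common underlying trait
rng = np.random.default_rng(0)
trait = rng.normal(size=(300, 1))
items = trait + rng.normal(scale=0.8, size=(300, 5))
print(round(cronbach_alpha(items), 2))  # well above the common 0.7 threshold
```

Values of alpha around 0.7 or higher are conventionally read as "acceptable reliability" for a unidimensional scale such as the one described.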
Abstract:
Objective: To devise more effective physical activity interventions, the mediating mechanisms yielding behavioral change need to be identified. The Baron-Kenny method is most commonly used, but has low statistical power and may not identify mechanisms of behavioral change in small-to-medium size studies. More powerful statistical tests are available. Study Design and Setting: Inactive adults (N = 52) were randomized to either a print or a print-plus-telephone intervention. Walking and exercise-related social support were assessed at baseline, after the intervention, and 4 weeks later. The Baron-Kenny and three alternative methods of mediational analysis (Freedman-Schatzkin; MacKinnon et al.; bootstrap method) were used to examine the effects of social support on initial behavior change and maintenance. Results: A significant mediational effect of social support on initial behavior change was indicated by the MacKinnon et al., bootstrap, and, marginally, Freedman-Schatzkin methods, but not by the Baron-Kenny method. No significant mediational effect of social support on maintenance of walking was found. Conclusions: Methodologically rigorous intervention studies to identify mediators of change in physical activity are costly and labor intensive, and may not be feasible with large samples. The use of statistically powerful tests of mediational effects in small-scale studies can inform the development of more effective interventions. (C) 2006 Elsevier Inc. All rights reserved.
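The bootstrap test of mediation referred to above resamples cases with replacement and recomputes the indirect effect a*b (a: effect of X on the mediator M; b: effect of M on the outcome Y controlling for X); a confidence interval that excludes zero indicates mediation. The following is a minimal sketch with synthetic data; the variables and effect sizes are illustrative assumptions, not the study's data:

```python
import numpy as np

def bootstrap_mediation(x, m, y, n_boot=2000, seed=0):
    """Percentile bootstrap CI for the indirect effect a*b in simple mediation."""
    rng = np.random.default_rng(seed)
    n = len(x)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)              # resample cases with replacement
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]             # M = a*X + c1
        X2 = np.column_stack([mb, xb, np.ones(n)])
        b = np.linalg.lstsq(X2, yb, rcond=None)[0][0]  # Y = b*M + c'*X + c2
        est.append(a * b)
    lo, hi = np.percentile(est, [2.5, 97.5])
    return lo, hi

# Synthetic data with a genuine indirect effect X -> M -> Y
rng = np.random.default_rng(42)
x = rng.normal(size=200)
m = 0.6 * x + rng.normal(scale=0.5, size=200)
y = 0.7 * m + rng.normal(scale=0.5, size=200)
lo, hi = bootstrap_mediation(x, m, y)
print(lo, hi)  # a CI excluding zero supports mediation
```

Because the sampling distribution of a*b is skewed, the percentile bootstrap retains power in small samples where the Baron-Kenny causal-steps approach often fails to detect mediation.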
Abstract:
Background. We describe the development, reliability and applications of the Diagnostic Interview for Psychoses (DIP), a comprehensive interview schedule for psychotic disorders. Method. The DIP is intended for use by interviewers with a clinical background and was designed to occupy the middle ground between fully structured, lay-administered schedules and semi-structured, psychiatrist-administered interviews. It encompasses four main domains: (a) demographic data; (b) social functioning and disability; (c) a diagnostic module comprising symptoms, signs and past history ratings; and (d) patterns of service utilization and patient-perceived need for services. It generates diagnoses according to several sets of criteria using the OPCRIT computerized diagnostic algorithm and can be administered either on-screen or in a hard-copy format. Results. The DIP proved easy to use and was well accepted in the field. For the diagnostic module, inter-rater reliability was assessed on 20 cases rated by 24 clinicians: good reliability was demonstrated for both ICD-10 and DSM-III-R diagnoses. Seven cases were interviewed 2-11 weeks apart to determine test-retest reliability, with pairwise agreement of 0.8-1.0 for most items. Diagnostic validity was assessed in 10 cases, interviewed with the DIP and using the SCAN as 'gold standard': in nine cases clinical diagnoses were in agreement. Conclusions. The DIP is suitable for use in large-scale epidemiological studies of psychotic disorders, as well as in smaller studies where time is at a premium. While the diagnostic module stands on its own, the full DIP schedule, covering demography, social functioning and service utilization, makes it a versatile multi-purpose tool.
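Inter-rater agreement of the kind reported for the DIP diagnostic module is often summarized not only as raw pairwise agreement but also with a chance-corrected statistic such as Cohen's kappa. A minimal sketch follows; the two raters' binary item ratings are invented for illustration:

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    po = np.mean(r1 == r2)                                        # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)   # chance agreement
    return (po - pe) / (1 - pe)

# Two raters scoring the same 10 cases on a present/absent item
r1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
r2 = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(round(cohen_kappa(r1, r2), 2))  # → 0.52
```

Here raw agreement is 0.8, but kappa is lower because much of that agreement is expected by chance given the raters' marginal rates; this is why chance-corrected indices are preferred when judging diagnostic reliability.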
Abstract:
Because organizations are making large investments in information systems (IS), efficient IS project management has been found critical to success. This study examines how the use of incentives can improve project success. Agency theory is used to identify motivational factors of project success and to help IS owners understand to what extent management incentives can improve IS development and implementation (ISD/I). The outcomes will help practitioners and researchers to build a theoretical model of the project management elements that lead to project success. Given the principal-agent nature of most large-scale IS development, insights that allow for greater alignment of the agent's goals with those of the principal through incentive contracts will serve to make ISD/I both more efficient and more effective, leading to more successful IS projects.
Abstract:
The main objective of food safety policy is to safeguard consumer health through specific safety rules and protocols. In order to meet the requirements of food safety and quality standardization, in 2002 the European Parliament and the Council of the EU (Regulation (EC) 178/2002 (EC, 2002)) sought to harmonize concepts, principles and procedures so as to provide a common basis for the regulation of food and feed originating from Member States at Community level. The formalization of standardization rules and protocols should, however, proceed through a more detailed and accurate understanding and harmonization of the global (macroscopic), pseudo-local (mesoscopic) and, possibly, local (microscopic) properties of food products. The main objective of this doctoral thesis is to illustrate how computational techniques can provide valuable support for such analysis, through (i) the application of established protocols and (ii) the improvement of widely applied techniques. A direct demonstration of the potential already offered by computational approaches is given in the first study, in which a docking-based virtual screening was applied to assess the preliminary xeno-androgenicity of selected food contaminants. The second and third studies concern the development and validation of new physico-chemical descriptors in a 3D-QSAR context. Named HyPhar (Hydrophobic Pharmacophore), the new methodology was used to explore the issue of selectivity between structurally related molecular targets, and thereby demonstrated the applicability and adaptability required in a food context.
Overall, the results give us confidence in the potential impact that in silico techniques can have in identifying and clarifying the molecular events implicated in the toxicological and nutritional aspects of foods.
Abstract:
Recent discussion of the knowledge-based economy draws increasing attention to the role that the creation and management of knowledge plays in economic development. Development of human capital, the principal mechanism for knowledge creation and management, becomes a central issue for policy-makers and practitioners at the regional, as well as national, level. Facing competition both within and across nations, regional policy-makers view human capital development as a key to strengthening the positions of their economies in the global market. Against this background, the aim of this study is to go some way towards answering the question of whether, and how, investment in education and vocational training at regional level provides these territorial units with comparative advantages. The study reviews literature in economics and economic geography on economic growth (Chapter 2). In growth model literature, human capital has gained increased recognition as a key production factor along with physical capital and labour. Although leaving technical progress as an exogenous factor, neoclassical Solow-Swan models have improved their estimates through the inclusion of human capital. In contrast, endogenous growth models place investment in research at centre stage in accounting for technical progress. As a result, they often focus upon research workers, who embody high-order human capital, as a key variable in their framework. An issue of discussion is how human capital facilitates economic growth: is it the level of its stock or its accumulation that influences the rate of growth? In addition, these economic models are criticised in economic geography literature for their failure to consider spatial aspects of economic development, and particularly for their lack of attention to tacit knowledge and urban environments that facilitate the exchange of such knowledge.
Our empirical analysis of European regions (Chapter 3) shows that investment by individuals in human capital formation has distinct patterns. Those regions with a higher level of investment in tertiary education tend to have a larger concentration of information and communication technology (ICT) sectors (including provision of ICT services and manufacture of ICT devices and equipment) and research functions. Not surprisingly, regions with major metropolitan areas where higher education institutions are located show a high enrolment rate for tertiary education, suggesting a possible link to the demand from high-order corporate functions located there. Furthermore, the rate of human capital development (at the level of vocational type of upper secondary education) appears to have significant association with the level of entrepreneurship in emerging industries such as ICT-related services and ICT manufacturing, whereas such association is not found with traditional manufacturing industries. In general, a high level of investment by individuals in tertiary education is found in those regions that accommodate high-tech industries and high-order corporate functions such as research and development (R&D). These functions are supported through the urban infrastructure and public science base, facilitating exchange of tacit knowledge. They also enjoy a low unemployment rate. However, the existing stock of human and physical capital in those regions with a high level of urban infrastructure does not lead to a high rate of economic growth. Our empirical analysis demonstrates that the rate of economic growth is determined by the accumulation of human and physical capital, not by level of their existing stocks. We found no significant effects of scale that would favour those regions with a larger stock of human capital. 
The primary policy implication of our study is that, in order to facilitate economic growth, education and training need to supply human capital at a faster pace than simply replenishing it as it disappears from the labour market. Given the significant impact of high-order human capital (such as business R&D staff in our case study) as well as the increasingly fast pace of technological change that makes human capital obsolete, a concerted effort needs to be made to facilitate its continuous development.
Abstract:
Purpose - Despite the increasing sophistication of new product development (NPD) research, the reliance on traditional approaches to studying NPD has left several areas in need of further research. The authors propose addressing some of these gaps, especially the limited focus on consumer brands, evaluation criteria used across different project-review points in the NPD process, and the distinction between "kills", "successes", and "failures". Moreover, they propose investigating how screening criteria change across project-review points, using real-time NPD projects. Design/methodology/approach - A postal survey generated 172 usable questionnaires from a sample of European, North American, Far Eastern and Australian consumer packaged-goods firms, providing data on 314 new product projects covering different development and post-commercialization review points. Findings - The results confirm that acceptance-rejection criteria vary through the NPD process. However, financial criteria dominate across all the project-review points. Initial screening is coarse, focusing predominantly on financial criteria. Fit with organizational, product, brand, promotional, and market requirements dominates in the detailed screen and pre-development evaluation points. At pre-launch, decision-makers focus on product, brand, and promotional criteria. Commercial fit, production synergies, and reliability of the firm's market intelligence are significant discriminators in the post-launch review. Moreover, the importance of marketing and channel issues makes the criteria for screening consumer brands different from those of industrial markets. Originality/value - The study, although largely descriptive and based on a relatively small sample of consumer goods firms, offers new insights into NPD project evaluation behavior.
Future larger-scale investigations covering a broader spectrum of consumer product sectors are needed to validate our results and to explain the reasons behind managers' decisions. © Emerald Group Publishing Limited.
Abstract:
A great number of strategy tools are being taught in strategic management modules. These tools are available to managers for use in facilitating strategic decision-making and enhancing the strategy development process in their organisations. A number of studies have been published examining which are the most popular tools; however, there is little empirical evidence on how their utilisation influences the strategy process. This paper is based on a large-scale international survey on the strategy development process, and seeks to examine the impact of a particular strategy tool, the Balanced Scorecard, upon the strategy process. The Balanced Scorecard is one of the most popular strategy tools, whose use has evolved since its introduction in the 1990s. Recently, it has been suggested that, as a strategy tool, the Balanced Scorecard can influence all elements of the strategy process. The results of this study indicate that although there are significant differences in some elements of the strategy process between the organisations that have implemented the Balanced Scorecard and those that have not, the impact is not comprehensive.
Abstract:
Computer-Based Learning systems of one sort or another have been in existence for almost 20 years, but they have yet to achieve real credibility within Commerce, Industry or Education. A variety of reasons could be postulated for this, typically:
- cost
- complexity
- inefficiency
- inflexibility
- tedium
Obviously different systems deserve different levels and types of criticism, but it still remains true that Computer-Based Learning (CBL) is falling significantly short of its potential. Experience of a small, but highly successful CBL system within a large, geographically distributed industry (the National Coal Board) prompted an investigation into currently available packages, the original intention being to purchase the most suitable software and run it on existing computer hardware, alongside existing software systems. It became apparent that none of the available CBL packages were suitable, and a decision was taken to develop an in-house Computer-Assisted Instruction system according to the following criteria:
- cheap to run;
- easy to author course material;
- easy to use;
- requires no computing knowledge to use (as either an author or student);
- efficient in the use of computer resources;
- has a comprehensive range of facilities at all levels.
This thesis describes the initial investigation, resultant observations and the design, development and implementation of the SCHOOL system. One of the principal characteristics of SCHOOL is that it uses a hierarchical database structure for the storage of course material, thereby providing inherently a great deal of the power, flexibility and efficiency originally required. Trials using the SCHOOL system on IBM 303X series equipment are also detailed, along with proposed and current development work on what is essentially an operational CBL system within a large-scale industrial environment.
The impact of brand owner on consumers' brand perceptions : a development of Heider's Balance Theory
Abstract:
Studies have shown that the brand "owner" is very influential in positioning the brand, and when the brand "owner" ceases his or her active role the brand will be perceived differently by consumers. Heider's Balance Theory (HBT), a cognitive psychological theory, studies the triadic relationships between two persons and an entity and predicts that when a person's original perception of the relationship is disturbed, the person restructures to a new balanced perception. Consequently, this research was undertaken to: conceptualize the brand owner's impact on consumers' brand perception; test the applicability of both the static and dynamic predictions of Heider's Balance Theory in the brand owner-consumer-brand relation (OCB); construct and test a model of the brand owner-consumer-brand relation; and examine whether personality has an influence on OCB. A discovery-oriented approach was taken to understand the selected market segment, the ready-to-wear and diffusion lines of international designer labels. Chinese Brand Personality Scale, fashion proneness, and hedonic and utilitarian shopping scales were developed and validated. A total of 51 customers were surveyed. Both traditional and extended methods used in the Balance Theory were employed in this study. Responses to a liked brand were used to test and develop the model, while those for a disliked brand were used for testing and confirmation. A "what if" experimental approach was employed to test the applicability of dynamic HBT theory in the OCB Model. The hypothesized OCB Model has been tested and validated. Consumers have been found to hold separate views of the brand and the brand owner, and their responses to contrasting ethical and non-ethical news about the brand owner differ. Personality has been found to have an influence, and two personality-adapted models have been tested and validated. The actual results go beyond the prediction of the Balance Theory.
Dominant triple positive balance mode, dominant negative balance mode, and mode of extreme antipathy have been found. It has been found that not all balanced modes are good for the brand. Contrary to Heider’s findings, simply liking may not necessarily lead to unit relation in the OCB Model.
Abstract:
A description of the background to testing friction materials for automotive brakes explains the need for a rapid, inexpensive means of assessing their behaviour in a way which is both accurate and meaningful. Various methods of controlling inertia dynamometers to simulate road vehicles are rejected in favour of programming by means of a commercially available XY plotter. Investigation of brake service conditions is used to set up test schedules, and a dynamometer programming unit built to enable service conditions on vehicles to be simulated on a full-scale dynamometer. A technique is developed by which accelerated testing can be achieved without operating under overload conditions, saving time and cost without sacrificing validity. The development of programming by XY plotter is described, with a method of operating one XY plotter to programme the machine, monitor its own behaviour, and plot its own results in logical sequence. Commissioning trials are described and the generation of reproducible results in frictional behaviour and material durability is discussed. Techniques are developed to cross-check the operation of the machine in retrospect, and retrospectively correct results in the event of malfunctions. Sensitivity errors in the measuring circuits are displayed between calibrations, whilst leaving the recorded results almost unaffected by error. Typical results of brake lining tests are used to demonstrate the range of performance parameters which can be studied by use of the machine. Successful test investigations completed on the machine are reported, including comments on behaviour of cast iron drums and discs. The machine shows that materials can repeat their complex friction/temperature/speed/pressure relationships at a reproducibility of the order of ±0.003 μ and ±0.0002 in. thickness loss during wear tests. Discussion of practical and academic implications completes the report with recommendations for further work in both fields.
Abstract:
The thesis examines and explains the development of occupational exposure limits (OELs) as a means of preventing work-related disease and ill health. The research focuses on the USA and UK and sets the work within a certain historical and social context. A subsidiary aim of the thesis is to identify any shortcomings in OELs and the methods by which they are set, and to suggest alternatives. The research framework uses Thomas Kuhn's idea of science progressing by means of paradigms, which he describes at one point as "... universally recognised scientific achievements that for a time provide model problems and solutions to a community of practitioners" (KUHN, 1970). Once learned, individuals in the community "... are committed to the same rules and standards for scientific practice" (ibid.). Kuhn's ideas are adapted by combining them with a view of industrial hygiene as an applied science-based profession having many of the qualities of non-scientific professions. The great advantage of this approach to OELs is that it keeps the analysis grounded in the behaviour and priorities of the groups which have forged, propounded, used, benefited from, and defended them. The development and use of OELs on a larger scale is shown to be connected to the growth of a new profession in the USA, industrial hygiene, with the assistance of another new profession, industrial toxicology. The origins of these professions, particularly industrial hygiene, are traced. By examining the growth of the professions and the writings of key individuals it is possible to show how technical, economic and social factors became embedded in the OEL paradigm which industrial hygienists and toxicologists forged. The origin, mission and needs of these professions and their clients made such influences almost inevitable.
The use of the OEL paradigm in practice is examined by an analysis of the process of the American Conference of Governmental Industrial Hygienists, Threshold Limit Value (ACGIH, TLV) Committee via the Minutes from 1962-1984. A similar approach is taken with the development of OELs in the UK. Although the form and definition of TLVs have encouraged the belief that they are health-based OELs, the conclusion is that they, and most other OELs, are, and always have been, reasonably practicable limits: the degree of risk posed by a substance is weighed against the feasibility and cost of controlling exposure to that substance. The confusion over the status of TLVs and other OELs is seen to be a confusion at the heart of the OEL paradigm, and the historical perspective explains why this should be. The paradigm has prevented the creation of truly health-based and, conversely, truly reasonably practicable OELs. In the final part of the thesis the analysis of the development of OELs is set in a contemporary context and a proposal for a two-stage, two-committee procedure for producing sets of OELs is put forward. This approach is set within an alternative OEL paradigm. The advantages, benefits and likely obstacles to these proposals are discussed.