Abstract:
Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector, there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive. This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as multidimensional and multi-level, a hierarchical approach to SQ measurement that better reflects human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision.
Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension as ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions; however, the candidate acknowledges that more work has to be done to better define the sub-dimensions. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively. The latter statement suggests that customers evaluate each primary dimension (or each higher level of SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and the choice of approach depends on the objective(s) of the study. Should the objective(s) be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overheads. However, should the objective(s) be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it is able to identify areas that need improvement.
Abstract:
It is easy to take many of the practices that constitute the contemporary school for granted. Timetables, academic records, rows of desks, playgrounds, guidance counsellors now all seem a natural and inevitable part of an optimal learning environment. However, the evidence suggests that they did not appear by chance. Instead, they were put in place, albeit often in a piecemeal and haphazard way, as part of the process by which a new type of institution was constructed. By understanding the school as a disciplinary society, constituted through a variety of diverse practices, it becomes possible to re-interpret the way we have come to educate ourselves. No longer is the modern school some kind of pedagogic inevitability—simply the best and most obvious way to educate, the end result of two thousand years of trying to finally get it right. Rather, mass schooling, as we know it, is an historical by-product of changes in the way society was organised. It is a contingent collection of particular forms of government, deployed at different historical moments, often for quite different administrative and educational reasons.
Abstract:
Rather than passing judgment on the content of young women’s magazines, it will be argued that such texts actually exist as manuals of self-formation, manuals which enroll young women to do specific kinds of work on themselves. In doing so, they form an effective link between the governmental imperatives aimed at constructing particular personas – such as the sexually responsible young girl – and the actual practices whereby these imperatives are operationalised.
Abstract:
Context: The School of Information Technology at QUT has recently undertaken a major restructuring of its Bachelor of Information Technology (BIT) course. The aims of this restructuring include reducing first year attrition and providing an attractive degree course that meets both student and industry expectations. Emphasis has been placed on the first semester in the context of retaining students, by introducing a set of four units that complement one another and provide introductory material on technology, programming and related skills, and generic skills that will aid the students throughout their undergraduate course and in their careers. This discussion relates to one of these four first semester units, namely Building IT Systems. The aim of this unit is for students to create small Information Technology (IT) systems that use programming or scripting and databases, as either standalone applications or web applications. In the prior history of teaching introductory computer programming at QUT, programming has been taught as a stand-alone subject, and integration of computer applications with other systems such as databases and networks was not undertaken until students had been given a thorough grounding in those topics as well. Feedback has indicated that students do not believe that working with a database requires programming skills. In fact, the teaching of the building blocks of computer applications has been compartmentalized, with each taught in isolation from the others. The teaching of introductory computer programming has been an industry requirement of IT degree courses, as many jobs require at least some knowledge of the topic. Yet computer programming is not a skill that all students have equal capabilities of learning (Bruce et al., 2004), and this is clearly shown by the volume of publications dedicated to this topic in the literature over a broad period of time (Eckerdal & Berglund, 2005; Mayer, 1981; Winslow, 1996). The teaching of this introductory material has been done in much the same way over the past thirty years. During the period in which introductory computer programming courses have been taught at QUT, a number of different programming languages and programming paradigms have been used, and different approaches to teaching and learning have been attempted in an effort to find the golden thread that would allow students to learn this complex topic. Unfortunately, computer programming is not a skill that can be learnt in one semester. Some basics can be learnt, but it can take many years to master (Norvig, 2001). Faculty data has typically shown a bimodal distribution of results for students undertaking introductory programming courses, with a high proportion of students receiving a high mark and a high proportion receiving a low or failing mark. This indicates that there are students who understand and excel with the introductory material, while there is another group who struggle to understand the concepts and practices required to translate a specification or problem statement into a computer program that achieves what is being requested. The consequence of a large group of students failing the introductory programming course has been a high level of attrition amongst first year students. This attrition level does not provide good continuity in student numbers in later years of the degree program, and the current approach is not seen as sustainable.
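The unit described above combines scripting with a database rather than teaching each in isolation. As a purely illustrative sketch of that kind of small, integrated exercise (the table, data and function names are hypothetical and are not taken from the Building IT Systems unit materials), a few lines of Python using the standard sqlite3 module might look like this:

```python
# Minimal sketch of a small "IT system": a script plus a database,
# illustrating the integrated approach described above. All names and
# data are hypothetical examples.
import sqlite3


def build_catalogue(db_path=":memory:"):
    """Create a tiny catalogue database and return the connection."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT, price REAL)"
    )
    conn.executemany(
        "INSERT INTO items (name, price) VALUES (?, ?)",
        [("keyboard", 45.0), ("mouse", 25.0), ("monitor", 320.0)],
    )
    conn.commit()
    return conn


def items_under(conn, limit):
    """Simple query wrapper: scripting and SQL used together."""
    cur = conn.execute(
        "SELECT name, price FROM items WHERE price < ? ORDER BY price", (limit,)
    )
    return cur.fetchall()


if __name__ == "__main__":
    conn = build_catalogue()
    for name, price in items_under(conn, 100):
        print(f"{name}: ${price:.2f}")
```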
Networks in the shadow of markets and hierarchies: calling the shots in the visual effects industry
Abstract:
The nature and organisation of creative industries and the creative economy has received increased attention in recent academic and policy literatures (Florida 2002; Grabher 2002; Scott 2006a). Constituted as one variant on new economy narratives, creativity, alongside knowledge, has been presented as a key competitive asset. Such industries – ranging from advertising, to film and new media – are seen not merely as expanding their scale and scope, but as leading edge proponents of a more general trend towards new forms of organization and economic coordination (Davis and Scase 2000). The idea of network forms (and the consequent displacement of markets and hierarchies) has been at the heart of attempts to differentiate the field economically and spatially. Across both the discussion of production models and work/employment relations is the assertion of the enhanced importance of trust and non-market relations in coordinating structures and practices. This reflects an influential view in sociological, management, geography and other literatures that social life is ‘intrinsically networked’ (Sunley 2008: 12) and that we can confidently use the term ‘network society’ to describe contemporary structures and practices (Castells 1996). Our paper is sceptical of the conceptual and empirical foundations of such arguments. We draw on a number of theoretical resources, including institutional theory, global value chain analysis and labour process theory (see Smith and McKinlay 2009), to explore how a more realistic and grounded analysis of the nature of, and limits to, networks can be articulated. Given space constraints, we cannot address all the dimensions of network arguments or evidence. Our focus is on inter- and intra-firm relations and draws on research into a particular creative industry – visual effects – that is a relatively new though increasingly important global production network. Through this examination a different model of the creative industries and creative work emerges – one in which market rules and patterns of hierarchical interaction structure the behaviour of economic actors and remain a central focus of analysis. The next section outlines and unpacks in more detail arguments concerning the role and significance of networks, markets and hierarchies in production models and work organisation in creative industries and the ‘creative economy’.
Abstract:
This study directly measured the load acting on the abutment of the osseointegrated implant system of transfemoral amputees during level walking, and studied the variability of the load within and among amputees. Twelve active transfemoral amputees (age: 54±12 years, mass: 84.3±16.3 kg, height: 1.78±0.10 m) fitted with an osseointegrated implant for over 1 year participated in the study. The load applied on the abutment was measured during unimpeded, level walking in a straight line using a commercial six-channel transducer mounted between the abutment and the prosthetic knee. The pattern and the magnitude of the three-dimensional forces and moments were revealed. Results showed a low step-to-step variability within each subject, but a high subject-to-subject variability in local extrema of body-weight normalized forces and moments and in impulse data. The high subject-to-subject variability suggests that the mechanical design of the implant system should be customized for each individual, or that a fit-all design should take into consideration the highest values of load within a broad range of amputees. It also suggests that specific loading regimes in rehabilitation training are necessary for a given subject. Thus, the loading magnitude and variability demonstrated here should be useful in designing an osseointegrated implant system better able to resist mechanical failure and in refining the rehabilitation protocol.
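The study reports body-weight normalized forces and moments and contrasts step-to-step with subject-to-subject variability. A minimal sketch of that kind of post-processing is shown below; the normalization convention (dividing force by mass × g), the data layout and the per-step peak values are illustrative assumptions, not the study’s actual protocol or data.

```python
# Sketch of body-weight normalization and a simple step-to-step variability
# summary for abutment load data. Subject data and the normalization
# convention (force / (mass * g)) are assumptions for illustration only.
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2


def normalize_force(force_n, mass_kg):
    """Express a force in units of body weight (%BW)."""
    return 100.0 * force_n / (mass_kg * G)


def step_to_step_cv(peaks):
    """Coefficient of variation (%) of per-step peak loads for one subject."""
    peaks = np.asarray(peaks, dtype=float)
    return 100.0 * peaks.std(ddof=1) / peaks.mean()


if __name__ == "__main__":
    # Hypothetical per-step peak vertical forces (N) for two subjects.
    subjects = {
        "S01": (80.0, [850.0, 860.0, 845.0, 855.0]),
        "S02": (95.0, [1010.0, 1025.0, 1000.0, 1030.0]),
    }
    for sid, (mass_kg, peaks) in subjects.items():
        norm = [normalize_force(f, mass_kg) for f in peaks]
        print(
            f"{sid}: mean peak {np.mean(norm):.1f} %BW, "
            f"step-to-step CV {step_to_step_cv(peaks):.1f} %"
        )
```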
Abstract:
The purpose of this study is to investigate how secondary school media educators might best meet the needs of students who prefer practical production work to ‘theory’ work in media studies classrooms. This is a significant problem for a curriculum area that claims to develop students’ media literacies by providing them with critical frameworks and a metalanguage for thinking about the media. It is a problem that seems to have become more urgent with the availability of new media technologies and forms like video games. The study is located in the field of media education, which tends to draw on structuralist understandings of the relationships between young people and media and suggests that students can be empowered to resist media’s persuasive discourses. Recent theoretical developments suggest too little emphasis has been placed on the participatory aspects of young people playing with, creating and gaining pleasure from media. This study contributes to this ‘participatory’ approach by bringing post structuralist perspectives to the field, which have been absent from studies of secondary school media education. I suggest theories of media learning must take account of the ongoing formation of students’ subjectivities as they negotiate social, cultural and educational norms. Michel Foucault’s theory of ‘technologies of the self’ and Judith Butler’s theories of performativity and recognition are used to develop an argument that media learning occurs in the context of students negotiating various ‘ethical systems’ as they establish their social viability through achieving recognition within communities of practice. The concept of ‘ethical systems’ has been developed for this study by drawing on Foucault’s theories of discourse and ‘truth regimes’ and Butler’s updating of Althusser’s theory of interpellation. This post structuralist approach makes it possible to investigate the ways in which students productively repeat and vary norms to creatively ‘do’ and ‘undo’ the various media learning activities with which they are required to engage. The study focuses on a group of year ten students in an all-boys Catholic urban school in Australia who undertook learning about video games in a three-week intensive ‘immersion’ program. The analysis examines the ethical systems operating in the classroom, including formal systems of schooling, informal systems of popular cultural practice and systems of masculinity. It also examines the students’ use of semiotic resources to repeat and/or vary norms while reflecting on, discussing, designing and producing video games. The key findings of the study are that students are motivated to learn technology skills and production processes rather than ‘theory’ work. This motivation stems from the students’ desire to become recognisable in communities of technological and masculine practice. However, student agency is not only possible through critical responses to media, but also through performative variation of norms via creative ethical practices as students participate with new media technologies. Therefore, opportunities exist for media educators to create the conditions for variation of norms through production activities.
The study offers several implications for media education theory and practice including: the productive possibilities of post structuralism for informing ways of doing media education; the importance of media teachers having the autonomy to creatively plan curriculum; the advantages of media and technology teachers collaborating to draw on a broad range of resources to develop curriculum; the benefits of placing more emphasis on students’ creative uses of media; and the advantages of blending formal classroom approaches to media education with less formal out of school experiences.
Abstract:
Biotribology, the study of lubrication, wear and friction within the body, has become a topic of high importance in recent times as we continue to encounter debilitating diseases and trauma that destroy function of the joints. A highly successful surgical procedure to replace the joint with an artificial equivalent alleviates dysfunction and pain. However, the wear of the bearing surfaces in prosthetic joints is a significant clinical problem, and more patients are surviving longer than the life expectancy of the joint replacement. Revision surgery is associated with increased morbidity and mortality and has a far less successful outcome than primary joint replacement. As such, it is essential to ensure that everything possible is done to limit the rate of revision surgery. Past experience indicates that the survival rate of the implant will be influenced by many parameters, of primary importance the material properties of the implant, the composition of the synovial fluid and the method of lubrication. In prosthetic joints, effective boundary lubrication is known to take place. The interaction of the boundary lubricant and the bearing material is of utmost importance. The identity of the vital active ingredient within synovial fluid (SF), to which we owe the near frictionless performance of our articulating joints, has been the quest of researchers for many years. Once this ingredient is identified, tribo tests can determine which materials and, more importantly, which surfaces this fraction of SF can function most optimally with. Surface-Active Phospholipids (SAPL) have been implicated as the body’s natural load bearing lubricant. Studies in this thesis are the first to fully characterise the adsorbed SAPL detected on the surface of retrieved prostheses and the first to verify the presence of SAPL on knee prostheses. Rinsings from the bearing surfaces of both hip and knee prostheses removed during revision operations were analysed using High Performance Liquid Chromatography (HPLC) to determine the presence and profile of SAPL. Several common prosthetic materials, along with a novel biomaterial, were investigated to determine their tribological interaction with various SAPLs. A pin-on-flat tribometer was used to make comparative friction measurements between the various tribo-pairs. A novel material, Pyrolytic Carbon (PyC), was screened as a potential candidate as a load bearing prosthetic material. Friction measurements were also performed on explanted prostheses. SAPL was detected on all retrieved implant bearing surfaces. As a result of the study, eight different species of phosphatidylcholine were identified. The relative concentrations of each species were also determined, indicating that the unsaturated species are dominant. Initial tribo tests employed a saturated phosphatidylcholine (SPC), and subsequent tests added the newly identified major constituents of SAPL, unsaturated phosphatidylcholines (USPC), as the test lubricant. All tribo tests showed a dramatic reduction in friction when synthetic SAPL was used as the lubricant under boundary lubrication conditions. Some tribo-pairs showed more of an affinity to SAPL than others. PyC performed better than the other prosthetic materials. Friction measurements with explanted prostheses verified the presence and performance of SAPL. SAPL, in particular phosphatidylcholine, plays an essential role in the lubrication of prosthetic joints.
Of particular interest was the ability of SAPLs to reduce friction and ultimately wear of the bearing materials. The identification and knowledge of the lubricating constituents of SF is invaluable for not only the future development of artificial joints but also in developing effective cures for several disease processes where lubrication may play a role. The tribological interaction of the various tribo-pairs and SAPL is extremely favourable in the context of reducing friction at the bearing interface. PyC is highly recommended as a future candidate material for use in load bearing prosthetic joints considering its impressive tribological performance.
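Comparative friction measurements with a pin-on-flat tribometer ultimately reduce to a friction coefficient, the ratio of measured friction force to applied normal load. The sketch below illustrates that calculation and a simple comparison across lubricant conditions; the loads, forces and condition labels are hypothetical and are not the thesis’s measured data.

```python
# Minimal sketch of a comparative friction analysis: compute the Coulomb
# friction coefficient (mu = friction force / normal load) for each run and
# summarise per lubricant condition. Numbers and labels are hypothetical.
import statistics


def friction_coefficient(friction_force_n, normal_load_n):
    """Classic Coulomb friction coefficient."""
    return friction_force_n / normal_load_n


def summarise(condition, friction_forces_n, normal_load_n):
    """Mean and standard deviation of mu over repeated runs."""
    mus = [friction_coefficient(f, normal_load_n) for f in friction_forces_n]
    return condition, statistics.mean(mus), statistics.stdev(mus)


if __name__ == "__main__":
    normal_load_n = 10.0  # constant applied normal load (assumed), N
    runs = {
        "PyC + saline": [3.10, 3.00, 3.20],          # measured friction forces, N
        "PyC + synthetic SAPL": [0.60, 0.55, 0.65],  # hypothetical values
    }
    for condition, forces in runs.items():
        name, mean_mu, sd_mu = summarise(condition, forces, normal_load_n)
        print(f"{name}: mu = {mean_mu:.3f} ± {sd_mu:.3f}")
```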
Abstract:
Emissions from airport operations are of significant concern because of their potential impact on local air quality and human health. The currently limited scientific knowledge of aircraft emissions is an important issue worldwide when considering air pollution associated with airport operation, and this is especially so for ultrafine particles. This limited knowledge is due to scientific complexities associated with measuring aircraft emissions during normal operations on the ground. In particular, this type of research has required the development of novel sampling techniques which must take into account aircraft plume dispersion and dilution as well as the various particle dynamics that can affect the measurements of the aircraft engine plume from an operational aircraft. In order to address this scientific problem, a novel mobile emission measurement method called the Plume Capture and Analysis System (PCAS) was developed and tested. The PCAS permits the capture and analysis of aircraft exhaust during ground level operations including landing, taxiing, takeoff and idle. The PCAS uses a sampling bag to temporarily store a sample, providing sufficient time for sensitive but slow instrumental techniques to be employed to measure gas and particle emissions simultaneously and to record detailed particle size distributions. The challenges in relation to the development of the technique include complexities associated with the assessment of the various particle loss and deposition mechanisms which are active during storage in the PCAS. Laboratory based assessment of the method showed that the bag sampling technique can be used to accurately measure particle emissions (e.g. particle number, mass and size distribution) from a moving aircraft or vehicle. Further assessment of the sensitivity of PCAS results to distance from the source and plume concentration was conducted in the airfield with taxiing aircraft. The results showed that the PCAS is a robust method capable of capturing the plume in only 10 seconds. The PCAS is able to account for aircraft plume dispersion and dilution at distances of 60 to 180 meters downwind of a moving aircraft, along with particle deposition loss mechanisms during the measurements. Characterization of the plume in terms of particle number, mass (PM2.5), gaseous emissions and particle size distribution takes only 5 minutes, allowing large numbers of tests to be completed in a short time. The results were broadly consistent and compared well with the available data. Comprehensive measurements and analyses of the aircraft plumes during various modes of the landing and takeoff (LTO) cycle (e.g. idle, taxi, landing and takeoff) were conducted at Brisbane Airport (BNE). Gaseous (NOx, CO2) emission factors, particle number and mass (PM2.5) emission factors and size distributions were determined for a range of Boeing and Airbus aircraft, as a function of aircraft type and engine thrust level. The scientific complexities, including the analysis of the often multimodal particle size distributions to describe the contributions of different particle source processes during the various stages of aircraft operation, were addressed through comprehensive data analysis and interpretation. The measurement results were used to develop an inventory of aircraft emissions at BNE, including all modes of the aircraft LTO cycle and ground running procedures (GRP).
Measuring the actual duration of aircraft activity in each mode of operation (time-in-mode) and compiling a comprehensive matrix of gas and particle emission rates as a function of aircraft type and engine thrust level for real world situations were crucial for developing the inventory. The significance of the resulting matrix of emission rates in this study lies in the estimate it provides of the annual particle emissions due to aircraft operations, especially in terms of particle number. In summary, this PhD thesis presents for the first time a comprehensive study of the particle and NOx emission factors and rates, along with the particle size distributions, from aircraft operations, and provides a basis for estimating such emissions at other airports. This is a significant addition to the scientific knowledge of particle emissions from aircraft operations, since standard particle number emission rates are not currently available for aircraft activities.
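Plume measurements are commonly converted to fuel-based emission factors by referencing the above-background pollutant signal to the above-background CO2 (a carbon-balance calculation). The sketch below illustrates that general approach only; it is not claimed to be the exact method used in this thesis, and the plume values, CO2 emission index and standard conditions are illustrative assumptions.

```python
# Hedged sketch of a carbon-balance emission-factor calculation:
# EF_X (per kg fuel) = (delta_X / delta_CO2_mass) * EI_CO2.
# Constants and example plume values are assumptions for illustration.
MW_CO2_G_PER_MOL = 44.01
MOLAR_VOLUME_L_PER_MOL = 24.45      # ~25 C, 1 atm (assumed)
EI_CO2_KG_PER_KG_FUEL = 3.16        # approximate CO2 emission index of jet fuel


def co2_ppm_to_kg_per_m3(ppm):
    """Convert an excess CO2 mixing ratio (ppm) to a mass concentration (kg/m^3)."""
    g_per_m3 = ppm * 1e-6 * (MW_CO2_G_PER_MOL / MOLAR_VOLUME_L_PER_MOL) * 1000.0
    return g_per_m3 / 1000.0


def emission_factor(delta_x_per_m3, delta_co2_ppm):
    """
    Emission factor of pollutant X per kg of fuel burned.
    delta_x_per_m3 is the above-background amount of X per m^3 of plume
    (e.g. particles/m^3 or micrograms/m^3); the result carries the same unit
    of X, per kg of fuel.
    """
    delta_co2_kg_per_m3 = co2_ppm_to_kg_per_m3(delta_co2_ppm)
    return delta_x_per_m3 / delta_co2_kg_per_m3 * EI_CO2_KG_PER_KG_FUEL


if __name__ == "__main__":
    # Hypothetical taxi-plume sample: 1e7 particles/cm^3 and 100 ppm CO2
    # above background.
    ef_pn = emission_factor(1e7 * 1e6, 100.0)
    print(f"Particle number emission factor ~ {ef_pn:.2e} particles/kg fuel")
```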
Abstract:
Business Process Management (BPM) has increased in popularity and maturity in recent years. Large enterprises use process management approaches to model, manage and refine repositories of process models that detail the whole enterprise. These process models can number in the thousands, and may contain large hierarchies of tasks and control structures that become cumbersome to maintain. Tools are therefore needed to traverse this process model space effectively and efficiently; otherwise the repositories remain hard to use and their effectiveness is diminished. In this paper we analyse a range of BPM tools for their effectiveness in handling large process models. We establish that the present set of commercial tools is lacking in key areas regarding visualisation of, and interaction with, large process models. We then present six tool functionalities for the development of advanced business process visualisation and interaction, presenting a design for a tool that will exploit the latest advances in 2D and 3D computer graphics to enable fast and efficient search, traversal and modification of process models.
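The large hierarchies of tasks and control structures described above are, at their simplest, trees that a tool must be able to traverse and search quickly. The sketch below shows a tiny hierarchical process-model structure with a depth-first traversal and label search; the data structure and names are hypothetical and are not the tool design proposed in the paper.

```python
# Illustrative sketch of a hierarchical process-model structure with a
# depth-first traversal and a label search. Structure and names are
# hypothetical, not the paper's proposed design.
from dataclasses import dataclass, field
from typing import Iterator, List


@dataclass
class ProcessNode:
    label: str
    kind: str = "task"            # e.g. "task", "subprocess", "gateway"
    children: List["ProcessNode"] = field(default_factory=list)

    def walk(self) -> Iterator["ProcessNode"]:
        """Depth-first traversal of the process hierarchy."""
        yield self
        for child in self.children:
            yield from child.walk()


def find(root: ProcessNode, text: str) -> List[ProcessNode]:
    """Return all nodes whose label contains the search text (case-insensitive)."""
    return [n for n in root.walk() if text.lower() in n.label.lower()]


if __name__ == "__main__":
    model = ProcessNode("Order to cash", "subprocess", [
        ProcessNode("Receive order"),
        ProcessNode("Check credit", "subprocess", [
            ProcessNode("Retrieve credit rating"),
            ProcessNode("Approve or reject"),
        ]),
        ProcessNode("Ship goods"),
    ])
    for node in find(model, "credit"):
        print(node.kind, "-", node.label)
```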
Abstract:
The School Based Youth Health Nurse Program was established in 1999 by the Queensland Government to fund school nurse positions in Queensland state high schools. Schools were required to apply for a School Based Youth Health Nurse during a five-phase recruitment process, managed by the health districts, and rolled out over four years. The only mandatory selection criterion for the position of School Based Youth Health Nurse was registration as a General Nurse, and most School Based Youth Health Nurses are allocated to two state high schools. Currently, there are approximately 115 Full Time Equivalent School Based Youth Health Nurse positions across all Queensland state high schools. The literature review revealed an abundance of information about school nursing. Most of the literature came from the United Kingdom and the United States, which have a different model of school nursing from school based youth health nursing. However, there is literature to suggest school nursing is gradually moving from a disease-focused approach to a social view of health. The noticeable number of articles about, for example, drugs and alcohol, mental health, and contemporary sexual health issues is evidence of this change. Additionally, there is a significant volume of literature about partnerships and collaboration, much of which is about health education, team teaching and how school nurses and schools do health business together. The surfacing of this literature is a good indication that school nursing is aligning with the broader national health priority areas. More particularly, the literature exposed a small but relevant and current body of research, predominantly from Queensland, about school based youth health nursing. However, there remain significant gaps in the knowledge about school based youth health nursing. In particular, there is a deficit in knowledge about how School Based Youth Health Nurses understand the experience of school based youth health nursing. This research aimed to reveal the meaning of the experience of school based youth health nursing. The research question was: How do School Based Youth Health Nurses understand the experience of school based youth health nursing? This enquiry was instigated because the researcher, who had a positive experience of school based youth health nursing, considered it important to validate other School Based Youth Health Nurses’ experiences. Consequently, a qualitative approach was considered the most appropriate way to explore this research question. Within this qualitative paradigm, the research framework consists of the epistemology of social constructionism, the theoretical perspective of interpretivism and the approach of phenomenography. After ethical approval was gained, purposeful and snowball sampling were used to recruit a sample of 16 participants. In-depth interviews, which were voluntary, confidential and anonymous, were mostly conducted in public venues and lasted from 40 to 75 minutes. The researcher also kept a researcher’s journal as another form of data collection. Data analysis was guided by Dahlgren and Fallsberg’s (1991, p. 152) seven phases of data analysis, which include familiarization, condensation, comparison, grouping, articulating, labelling and contrasting. The most important finding in this research is the outcome space, which represents the entirety of the experience of school based youth health nursing.
The outcome space consists of two components: inside the school environment and outside the school environment. Metaphorically, and considered as wholes in themselves, these two components are not discrete but intertwined with each other. The outcome space consists of eight categories. Each category of description comprises several sub-categories of description but, as a whole, is a conception of school based youth health nursing. The eight conceptions of school based youth health nursing are:
1. The conception of school based youth health nursing as out there all by yourself.
2. The conception of school based youth health nursing as no real backup.
3. The conception of school based youth health nursing as confronted by many barriers.
4. The conception of school based youth health nursing as hectic and full-on.
5. The conception of school based youth health nursing as working together.
6. The conception of school based youth health nursing as belonging to school.
7. The conception of school based youth health nursing as treated the same as others.
8. The conception of school based youth health nursing as the reason it’s all worthwhile.
These eight conceptions of school based youth health nursing are logically related and form a staged hierarchical relationship because they are not equally dependent on each other. The conceptions of school based youth health nursing are grouped into negative, both negative and positive, and positive conceptions of school based youth health nursing. The conceptions of school based youth health nursing build on each other, from the bottom upwards, to reach the authorized, or the most desired, conception of school based youth health nursing. This research adds to the knowledge about school nursing in general and about school based youth health nursing specifically. Furthermore, this research has operational and strategic implications, highlighted in the negative conceptions of school based youth health nursing, for the School Based Youth Health Nurse Program. The researcher suggests the School Based Youth Health Nurse Program address the operational issues as a priority. The researcher recommends a range of actions to tackle issues and problems associated with accommodation and information, consultations and referral pathways, confidentiality, health promotion and education, professional development, line management and School Based Youth Health Nurse Program support, and school management and community. Strategically, the researcher proposes a variety of actions to address strategic issues, such as the School Based Youth Health Nurse Program vision, model and policy and practice framework, recruitment and retention rates, and evaluation. Additionally, the researcher believes the findings of this research have the capacity to spawn a myriad of future research projects. The researcher has identified the most important areas for future research as confidentiality, information, qualifications and health outcomes.
Abstract:
The social tags in Web 2.0 are becoming another important information source for profiling users’ interests and preferences to make personalized recommendations. To solve the problem of low information sharing caused by the free-style vocabulary of tags and the long tails of the distributions of tags and items, this paper proposes an approach that integrates the social tags given by users with an item taxonomy, which has a standard vocabulary and hierarchical structure provided by experts, to make personalized recommendations. The experimental results show that the proposed approach can effectively improve information sharing and recommendation accuracy.
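One simple way to integrate free-form tags with an expert taxonomy, sketched below, is to map each tag to a taxonomy topic, aggregate a user’s tags into a topic-level profile, and rank items by topic overlap. The mapping and scoring scheme here are illustrative assumptions, not the algorithm proposed in the paper.

```python
# Hedged sketch: map free-form tags to taxonomy topics, build a topic-level
# user profile, and rank items by topic overlap. Mapping, items and scoring
# are hypothetical examples, not the paper's method.
from collections import Counter

# Hypothetical tag -> taxonomy-topic mapping (normally curated or learned).
TAG_TO_TOPIC = {
    "scifi": "Fiction/ScienceFiction",
    "space opera": "Fiction/ScienceFiction",
    "whodunit": "Fiction/Crime",
    "noir": "Fiction/Crime",
}


def topic_profile(user_tags):
    """Aggregate a user's tags into taxonomy-topic counts."""
    return Counter(TAG_TO_TOPIC[t] for t in user_tags if t in TAG_TO_TOPIC)


def rank_items(profile, item_topics):
    """Score each item by how strongly its topics match the user profile."""
    scores = {
        item: sum(profile.get(topic, 0) for topic in topics)
        for item, topics in item_topics.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)


if __name__ == "__main__":
    user_tags = ["scifi", "space opera", "noir"]
    items = {
        "Book A": ["Fiction/ScienceFiction"],
        "Book B": ["Fiction/Crime"],
        "Book C": ["NonFiction/History"],
    }
    for item, score in rank_items(topic_profile(user_tags), items):
        print(item, score)
```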
Abstract:
Purpose – Several major infrastructure projects in the Hong Kong Special Administrative Region (HKSAR) have been delivered by the build-operate-transfer (BOT) model since the 1960s. Although the benefits of using BOT have been reported abundantly in the contemporary literature, some BOT projects were less successful than others. This paper aims to find out why this is so and to explore whether BOT is the best financing model to procure major infrastructure projects.
Design/methodology/approach – The benefits of BOT will first be reviewed. Some completed BOT projects in Hong Kong will be examined to ascertain how far the perceived benefits of BOT have been materialized in these projects. A high-profile project, the Hong Kong-Zhuhai-Macau Bridge – for which the governments of the People's Republic of China, the Macau Special Administrative Region and the HKSAR had long promoted BOT as the preferred financing model, but which suddenly reverted to the traditional financing model, funded primarily by the three governments with public money – will be studied to explore the true value of the BOT financial model.
Findings – Six main reasons for this radical change are derived from the analysis: shorter take-off time for the project; difference in legal systems causing difficulties in drafting BOT agreements; more government control on tolls; private sector uninterested due to an unattractive economic package; avoiding allegations of collusion between business and the governments; and a comfortable financial reserve possessed by the host governments.
Originality/value – The findings from this paper are believed to provide a better understanding of the real benefits of BOT and the governments' main decision criteria in delivering major infrastructure projects.
Abstract:
Work-integrated learning in the form of internships is increasingly important for universities as they seek to compete for students and to build links with industry. Yet there is surprisingly little empirical research on the details of internships: (1) what should they accomplish? (2) how should they be structured? (3) how should student performance be assessed? There is also surprisingly little conceptual analysis of these key issues, either for business internships in general or for marketing internships in particular. Furthermore, the "answers" to these issues may differ depending upon the perspective of the three stakeholders: students, business managers and university academics. There is no study in the marketing literature which surveys all three groups on these important aspects of internships. To fill these gaps, this paper discusses and analyses internship goals, internship structure and internship assessment for undergraduate marketing internships, and then reports on a survey of the views of all three stakeholder groups on these issues. There is a considerable variety of approaches to internships, but generally there is consensus among the stakeholder groups, with some notable differences. Managerial implications include recognition of the importance of the academic aspects of internships; mutual understanding concerning needs and constraints; and the requirement that companies, students and academics take a long-term view of internship programs to achieve mutually beneficial outcomes.
Abstract:
Are the Academy Awards heading towards an identity crisis? This year's Academy Awards have been characterised by a major disconnect between the most popular films at the box office and the socially important films deemed the 'best pictures' by the Academy. Will the popularity of a film always remain inferior to whether or not it tackles serious social issues? Can popularity in its own right ever become indicative of a film's worth? Or should the awards retain their artistic integrity, accepting declining audiences and criticism in order to maintain the respect they garner within the film industry? Whatever the answers may be, the winner at this year's Academy Awards was art over commerce, but this may not always be the case.