Abstract:
Purpose: Given the emergent nature of i-branding as an academic field of study and a lack of applied research output, the aim of this paper is to explain how businesses manage i-branding to create brand equity.
Design/methodology/approach: Within a case-study approach, seven cases were developed from an initial sample of 20 food businesses. Additionally, utilising secondary data, the analysis of findings introduces relevant case examples from other industrial sectors.
Findings: Specific internet tools and their application are discussed within opportunities to create brand equity for products classified by experience, credence and search characteristics. An understanding of target customers will be critical in underpinning the selection and deployment of relevant i-branding tools. Tools facilitating interactivity – machine and personal – are particularly significant.
Research limitations/implications: Future research positioned within classification of goods constructs could provide further contributions that recognise potential moderating effects of product/service characteristics on the development of brand equity online. Future studies could also employ the i-branding conceptual framework to test its validity and develop it further as a means of explaining how i-branding can be managed to create brand equity.
Originality/value: While previous research has focused on specific aspects of i-branding, this paper utilises a conceptual framework to explain how diverse i-branding tools combine to create brand equity. The literature review integrates fragmented literature around a conceptual framework to produce a more coherent understanding of extant thinking. The location of this study within a classification of goods context proved critical to explaining how i-branding can be managed.
Abstract:
The purpose of this study was to explore the care processes experienced by community-dwelling adults dying from advanced heart failure, their family caregivers, and their health-care providers. A descriptive qualitative design was used to guide data collection, analysis, and interpretation. The sample comprised 8 patients, 10 informal caregivers, 11 nurses, 3 physicians, and 3 pharmacists. Data analysis revealed that palliative care was influenced by unique contextual factors (i.e., cancer model of palliative care, limited access to resources, prognostication challenges). Patients described choosing interventions and living with fatigue, pain, shortness of breath, and functional decline. Family caregivers described surviving caregiver burden and drawing on their faith. Health professionals described their role as trying to coordinate care, building expertise, managing medications, and optimizing interprofessional collaboration. Participants strove towards 3 outcomes: effective symptom management, satisfaction with care, and a peaceful death. © McGill University School of Nursing.
Abstract:
Scytalidium thermophilum plays an important role in determining the selectivity of compost produced for growing Agaricus bisporus. The objective of this study was to characterise S. thermophilum isolates by random amplified polymorphic DNA (RAPD) analysis and sequence analysis of the internal transcribed spacer (ITS) regions of the rDNA, to assess the genetic variation exhibited by this species complex and to compare it with existing morphological and thermogravimetric data. RAPD analysis of 34 isolates from various parts of the world revealed two distinct groups, which could be separated on the basis of differences in the banding patterns produced with five random primers. Nucleotide sequence analysis of the ITS region, which was ca. 536 bp in length, revealed only very minor variation among the S. thermophilum isolates examined, with variation confined to several nucleotide base changes within this region. Genetic distance values between type 1 and type 2 S. thermophilum isolates, as determined by ITS sequence analysis, differed by 0.005%. The molecular analyses carried out in the present study suggest that isolates within this species complex exhibit genetic differences that correlate well with previously determined morphological and thermogravimetric data.
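As an illustrative aside (not taken from the study), the small ITS divergence reported above is the kind of value produced by a simple pairwise distance calculation over aligned sequences. The sketch below assumes two already-aligned placeholder ITS fragments and computes an uncorrected p-distance; the p_distance helper and the sequences are hypothetical.

```python
# Minimal sketch: uncorrected pairwise distance (p-distance) between two
# aligned ITS sequences, as one simple way such divergence values are derived.
# The sequences below are placeholders, not the isolates from the study.

def p_distance(seq_a: str, seq_b: str) -> float:
    """Proportion of differing sites between two aligned sequences,
    ignoring positions where either sequence has a gap."""
    if len(seq_a) != len(seq_b):
        raise ValueError("Sequences must be aligned to equal length")
    compared = differing = 0
    for a, b in zip(seq_a.upper(), seq_b.upper()):
        if a == "-" or b == "-":
            continue  # skip alignment gaps
        compared += 1
        if a != b:
            differing += 1
    return differing / compared if compared else 0.0

if __name__ == "__main__":
    # Hypothetical 20 bp fragments of an aligned ITS region (illustration only).
    type1 = "ACGTACGTTGCAGGCTAACT"
    type2 = "ACGTACGTTGCAGGCTAACC"
    print(f"p-distance: {p_distance(type1, type2):.4f}")  # 1 difference / 20 sites = 0.05
```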
Abstract:
Cancer registries must provide complete and reliable incidence information with the shortest possible delay, for use in studies of comparability, clustering, cancer in the elderly and the adequacy of cancer surveillance. Methods of varying complexity are available to registries for monitoring completeness and timeliness. We wished to know which methods are currently in use among cancer registries, and to compare our findings with those of a survey carried out in 2006.
Methods
In the framework of the EUROCOURSE project, and to prepare cancer registries for participation in the ERA-net scheme, we launched a survey on the methods used to assess completeness, and also on the timeliness and methods of dissemination of results by registries. We sent the questionnaire to all general cancer registries (GCRs) and specialised cancer registries (SCRs) active in Europe and within the European Network of Cancer Registries (ENCR).
Results
With a response rate of 66% among GCRs and 59% among SCRs, we obtained data for analysis from 116 registries with a population coverage of ∼280 million. The most common methods used were comparison of trends (79%) and mortality/incidence ratios (more than 60%). More complex methods were used less commonly: capture–recapture by 30%, flow method by 18% and death certificate notification (DCN) methods with the Ajiki formula by 9%.
The median latency for completion of ascertainment of incidence was 18 months. Additional time required for dissemination was of the order of 3–6 months, depending on the method: print or electronic. One fifth (21%) did not publish results for their own registry but only as a contribution to larger national or international data repositories and publications; this introduced a further delay in the availability of data.
Conclusions
Cancer registries should improve the practice of measuring their completeness regularly and should move from traditional to more quantitative methods. This could also have implications for the timeliness of data publication.
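As an illustrative aside rather than part of the survey, two of the simpler completeness indicators mentioned in the Results above can be sketched in a few lines: the mortality/incidence (M/I) ratio and a two-source capture-recapture estimate (here the Chapman estimator). The function names and all counts below are invented placeholders.

```python
# Minimal sketch of two completeness indicators used by cancer registries:
#   1) the mortality/incidence (M/I) ratio, and
#   2) a two-source capture-recapture estimate (Chapman estimator).
# All numbers here are invented for illustration only.

def mi_ratio(deaths: int, incident_cases: int) -> float:
    """Mortality/incidence ratio; values far above the expected level for a
    given cancer site can signal under-ascertainment of incident cases."""
    return deaths / incident_cases

def chapman_estimate(n_source1: int, n_source2: int, n_both: int) -> float:
    """Chapman's nearly unbiased two-source capture-recapture estimate of the
    true number of cases, given counts from two notification sources and
    their overlap."""
    return (n_source1 + 1) * (n_source2 + 1) / (n_both + 1) - 1

if __name__ == "__main__":
    print(f"M/I ratio: {mi_ratio(deaths=450, incident_cases=1000):.2f}")

    # e.g. pathology reports vs hospital discharge records (hypothetical counts)
    estimated_total = chapman_estimate(n_source1=950, n_source2=900, n_both=870)
    registered = 980  # cases in the registry (here, the union of both sources)
    print(f"Estimated completeness: {registered / estimated_total:.1%}")
```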
Abstract:
The future European power system will have a hierarchical structure created by layers of system control, from a Supergrid via regional high-voltage transmission through to medium- and low-voltage distribution. Each level will have generation sources, such as large-scale offshore wind, wave, solar thermal and nuclear connected directly to the Supergrid, and high levels of embedded generation connected to the medium-voltage distribution system. It is expected that the fuel portfolio will be dominated by offshore wind in Northern Europe and PV in Southern Europe. The strategies required to manage the coordination of supply-side variability with demand-side variability will include large-scale interconnection, demand-side management, load aggregation and storage in the context of the Supergrid combined with the Smart Grid. The associated design challenge will include not only control topology, data acquisition, analysis and communications technologies, but also the selection of the fuel portfolio at a macro level. This paper quantifies the amount of demand-side management, storage and so-called 'back-up generation' needed to support an 80% renewable energy portfolio in Europe by 2050. © 2013 IEEE.
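As a hedged illustration, not the authors' model, the kind of quantification described above can be pictured as an hourly energy balance: given renewable generation and demand time series, the residual load determines how much must be stored or met by back-up plant. The toy profiles, capacities and the greedy dispatch rule below are assumptions for illustration only.

```python
import numpy as np

# Toy hourly energy balance illustrating how storage and 'back-up generation'
# requirements can be quantified for a given renewable portfolio.
# The profiles below are synthetic placeholders, not the paper's 2050 scenario data.

rng = np.random.default_rng(0)
hours = 24 * 7  # one illustrative week

demand = 400 + 80 * np.sin(np.arange(hours) * 2 * np.pi / 24)                   # GW, daily cycle
wind = 250 * np.clip(rng.normal(1.0, 0.4, hours), 0, None)                      # GW, variable
pv = 150 * np.clip(np.sin((np.arange(hours) % 24 - 6) * np.pi / 12), 0, None)   # GW, daylight only

residual = demand - (wind + pv)  # positive: shortfall, negative: surplus

# Greedy dispatch: charge a finite store with surpluses, discharge on shortfalls,
# and count whatever remains as back-up generation.
store_cap = 2000.0   # GWh, assumed storage capacity
store = 0.0
backup_energy = 0.0
curtailed = 0.0
for r in residual:
    if r < 0:                                   # surplus hour
        charge = min(-r, store_cap - store)
        store += charge
        curtailed += -r - charge
    else:                                       # shortfall hour
        discharge = min(r, store)
        store -= discharge
        backup_energy += r - discharge

print(f"Back-up energy needed: {backup_energy:.0f} GWh over the week")
print(f"Curtailed surplus:     {curtailed:.0f} GWh over the week")
```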
Abstract:
The continued use of traditional lecturing across Higher Education as the main teaching and learning approach in many disciplines must be challenged. An increasing number of studies suggest that this approach, compared to more active learning methods, is the least effective. In counterargument, the use of traditional lectures is often justified as necessary given a large student population. By analysing the implementation of a web-based broadcasting approach which replaced the traditional lecture within a programming-based module, and thereby removed the student-population rationale, it was hoped that the student learning experience would become more active and ultimately enhance learning on the module. The implemented model replaces the traditional approach of students attending an on-campus lecture theatre with a web-based live broadcast approach that focuses on students being active learners rather than passive recipients. Students 'attend' by viewing a live broadcast of the lecturer, presented as a talking head, and the lecturer's desktop, via a web browser. Video and audio communication is primarily from tutor to students, with text-based comments used to provide communication from students to tutor. This approach promotes active learning by allowing students to perform activities on their own computers rather than the passive viewing and listening commonly encountered in large lecture classes. Analysis of this approach over two years (n = 234 students) indicates that 89.6% of students rated it as offering a highly positive learning experience. Comparing student performance across three academic years also indicates a positive change. A small data-analytic study of student participation levels suggests that the student cohort's willingness to engage with the broadcast lecture material is high.
Abstract:
We have carried out a 129-level close-coupling Dirac-Coulomb R-matrix calculation for the electron-impact excitation of Ni-like Xe. We have utilized these data to generate the spectral signature of Xe26+ in terms of feature photon-emissivity coefficients (F-PεCs). We have compared these F-PεCs with those generated using semi-relativistic plane-wave Born excitation data, which form the heavy-species baseline for the Atomic Data and Analysis Structure (ADAS). We find that the Born-based F-PεCs give a reasonable qualitative description of the spectral signature but that, quantitatively, the R-matrix-based F-PεCs differ by up to a factor of 2. The spectral signature of heavy species is key to diagnosing hot plasmas such as those that will be found in the International Thermonuclear Experimental Reactor.
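As a rough, hedged illustration rather than the ADAS implementation, a feature photon-emissivity coefficient can be pictured as line emissivities grouped into coarse wavelength intervals. The sketch below bins two invented line lists, one 'R-matrix-like' and one 'Born-like', and reports the largest feature-by-feature ratio; the wavelengths, PEC values and feature_pec helper are all hypothetical.

```python
import numpy as np

# Illustrative sketch only: build 'feature' photon-emissivity coefficients by
# summing line photon-emissivity coefficients into coarse wavelength bins,
# then compare two datasets. Values are invented, not Xe26+ atomic data.

def feature_pec(wavelengths_nm, line_pecs, bin_edges_nm):
    """Sum line PECs falling in each wavelength bin to form feature PECs."""
    feature, _ = np.histogram(wavelengths_nm, bins=bin_edges_nm, weights=line_pecs)
    return feature

bin_edges = np.linspace(10.0, 14.0, 9)  # nm, 8 coarse spectral features

# Hypothetical line lists (wavelength, PEC) for the same transitions under
# two excitation datasets.
wavelengths = np.array([10.4, 10.9, 11.3, 11.8, 12.1, 12.6, 13.0, 13.5])
pec_rmatrix = np.array([2.0, 1.4, 3.1, 0.9, 2.6, 1.1, 0.8, 1.7])   # arbitrary units
pec_born = np.array([1.1, 1.5, 1.8, 1.0, 1.4, 1.2, 0.9, 1.6])

f_rm = feature_pec(wavelengths, pec_rmatrix, bin_edges)
f_born = feature_pec(wavelengths, pec_born, bin_edges)

mask = (f_rm > 0) & (f_born > 0)
ratios = f_rm[mask] / f_born[mask]
print("Feature-by-feature ratio (R-matrix / Born):", np.round(ratios, 2))
print(f"Largest discrepancy: factor of {ratios.max():.1f}")
```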
Abstract:
We analyze how a set of 22 European countries was affected by increased Chinese export competition between 1995 and 2008. Employing product-group-level data, we observe a reduction in the export volumes of European countries due to increased Chinese export competition. This deceleration in the export sector induces changes within the manufacturing industries, especially a decline in employment. When using more aggregated, regional-level data, our analysis shows that the industry sector as a whole declines, resulting in an increased unemployment rate. The importance of Chinese export competition for Europe is attributable to Europe's high export intensity.
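As a hedged sketch of the general estimation idea, with the variable construction assumed rather than taken from the paper, one could regress the change in regional manufacturing employment on an exposure-to-Chinese-export-competition measure; the simulated data and OLS specification below are illustrative only.

```python
import numpy as np

# Illustrative OLS sketch: regress the change in regional manufacturing
# employment share on an (assumed) exposure-to-Chinese-export-competition
# measure. Data are simulated; this is not the paper's specification.

rng = np.random.default_rng(42)
n_regions = 200

exposure = rng.gamma(shape=2.0, scale=1.0, size=n_regions)   # exposure index (hypothetical)
noise = rng.normal(0.0, 1.0, size=n_regions)
d_emp_share = 0.5 - 0.8 * exposure + noise                   # simulated outcome (percentage points)

X = np.column_stack([np.ones(n_regions), exposure])          # constant + exposure
beta, *_ = np.linalg.lstsq(X, d_emp_share, rcond=None)

# Homoskedastic standard errors for the coefficient estimates.
resid = d_emp_share - X @ beta
sigma2 = resid @ resid / (n_regions - X.shape[1])
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

print(f"Estimated effect of exposure: {beta[1]:.2f} (se {se[1]:.2f})")
```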