811 results for Data-driven analysis


Relevance: 80.00%

Abstract:

Positrons can attach to molecules via vibrational Feshbach resonances, leading to very large annihilation rates. The predictions of a recent theory for this process are validated for deuterated methyl halides where all modes are dipole coupled to the incident positron. Data and analysis are presented for methanol and ethylene, demonstrating the importance of combination and overtone resonances and the ability of the theory to account for these features. The mechanism for these resonances and criteria for their occurrence as well as outstanding questions are discussed.
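The resonance condition behind this mechanism can be illustrated with a short sketch: in the standard picture, a positron bound to the molecule with binding energy ε_b produces a Feshbach resonance downshifted from each vibrational energy (fundamental, overtone, or combination band) by ε_b. The mode energies and binding energy below are invented for illustration and are not values from the abstract.

```python
# Sketch of the vibrational-Feshbach-resonance condition: a positron with
# binding energy eps_b on the molecule yields a resonance downshifted from
# each vibrational energy by eps_b. All numbers are illustrative.

def resonance_energies(mode_energies_meV, eps_b_meV):
    """Downshifted resonance positions for fundamentals, first overtones,
    and pairwise combination bands (energies in meV)."""
    fundamentals = [e - eps_b_meV for e in mode_energies_meV]
    overtones = [2 * e - eps_b_meV for e in mode_energies_meV]
    combinations = [
        mode_energies_meV[i] + mode_energies_meV[j] - eps_b_meV
        for i in range(len(mode_energies_meV))
        for j in range(i + 1, len(mode_energies_meV))
    ]
    # Only resonances at positive incident-positron energy are observable.
    keep = lambda es: [e for e in es if e > 0]
    return keep(fundamentals), keep(overtones), keep(combinations)

# Hypothetical molecule: two dipole-coupled modes at 140 and 370 meV,
# positron binding energy 30 meV.
fund, over, comb = resonance_energies([140, 370], 30)
print(fund)  # [110, 340]
print(over)  # [250, 710]
print(comb)  # [480]
```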

Relevance: 80.00%

Abstract:

The purpose of this paper, which builds on an earlier paper published in this Journal (Vol. 20, No. 6), is to develop the discussion around how English has been taught, used and perceived in Kenya, using data gathered from a small second-level English-medium school in Kenya. The complex relationships between language and identity are at work in the everyday routines of both staff and pupils within such a context. The paper seeks to set out a clear methodology for gathering data which could help describe these relationships with more clarity while also subjecting the data to analysis informed by the growing body of research and theory that focuses on language policy in post-colonial and neo-colonial settings. Finally, these pieces of data are used as the basis of a further exploration of the implications for classroom practice in teaching English in this environment.

Relevance: 80.00%

Abstract:

The article investigates the relationships between technological regimes and firm-level productivity performance, and it explores how such a relationship differs in different Schumpeterian patterns of innovation. The analysis makes use of a rich dataset containing data on innovation and other economic characteristics of a large representative sample of Norwegian firms in manufacturing and service industries for the period 1998–2004. First, we decompose TFP growth into technical progress and efficiency changes by means of data envelopment analysis. We then estimate an empirical model that relates these two productivity components to the characteristics of technological regimes and a set of other firm-specific factors. The results indicate that: (i) TFP growth has mainly been achieved through technical progress, while technical efficiency has on average decreased; (ii) the characteristics of technological regimes are important determinants of firm-level productivity growth, but their impacts on technical progress are different from the effects on efficiency change; (iii) the estimated model works differently in the two Schumpeterian regimes. Technical progress has been more dynamic in Schumpeter Mark II industries, while efficiency change has been more important in Schumpeter Mark I markets.
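The efficiency component of such a decomposition can be illustrated with a minimal sketch. Full data envelopment analysis solves a linear program per firm over multiple inputs and outputs; the toy version below handles only the one-input/one-output case, where efficiency reduces to each firm's output/input ratio relative to the best-practice frontier. Firm names and data are hypothetical.

```python
# Minimal sketch of the efficiency side of a DEA-style decomposition,
# restricted to one input and one output per firm. Real DEA solves a
# linear program per firm; here efficiency is each firm's output/input
# ratio relative to the best-practice frontier. Firm data are made up.

def dea_efficiency(firms):
    """firms: dict name -> (input, output). Returns name -> efficiency in (0, 1]."""
    ratios = {name: out / inp for name, (inp, out) in firms.items()}
    frontier = max(ratios.values())
    return {name: r / frontier for name, r in ratios.items()}

firms = {"A": (10, 20), "B": (5, 15), "C": (8, 8)}
eff = dea_efficiency(firms)
print(eff["B"])  # 1.0  (B defines the frontier: 15/5 = 3)
```

Comparing such efficiency scores across two periods, together with the shift of the frontier itself, is what separates efficiency change from technical progress in a Malmquist-style TFP decomposition.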

Relevance: 80.00%

Abstract:

How best to predict the effects of perturbations to ecological communities has been a long-standing goal for both applied and basic ecology. This quest has recently been revived by new empirical data, new analysis methods, and increased computing speed, with the promise that ecologically important insights may be obtainable from a limited knowledge of community interactions. We use empirically based and simulated networks of varying size and connectance to assess two limitations to predicting perturbation responses in multispecies communities: (1) the inaccuracy by which species interaction strengths are empirically quantified and (2) the indeterminacy of species responses due to indirect effects associated with network size and structure. We find that even modest levels of species richness and connectance (∼25 pairwise interactions) impose high requirements for interaction strength estimates because system indeterminacy rapidly overwhelms predictive insights. Nevertheless, even poorly estimated interaction strengths provide greater average predictive certainty than an approach that uses only the sign of each interaction. Our simulations provide guidance in dealing with the trade-offs involved in maximizing the utility of network approaches for predicting dynamics in multispecies communities.
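The prediction problem described here rests on the standard press-perturbation result: for a community with interaction (Jacobian) matrix A, the long-term response of each species to a sustained perturbation of species k is given by the k-th column of -A⁻¹. A minimal two-species sketch, with invented interaction strengths:

```python
# Standard press-perturbation prediction: the net-effects matrix is
# -A^{-1}, where A is the community interaction (Jacobian) matrix.
# Interaction strengths below are illustrative; real studies estimate
# A with error, which drives the indeterminacy discussed in the abstract.

def inverse_2x2(A):
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def press_response(A):
    """Net-effects matrix -A^{-1} for a 2-species community."""
    inv = inverse_2x2(A)
    return [[-x for x in row] for row in inv]

# Prey (index 0) and predator (index 1): prey is self-limited and harmed
# by the predator; the predator benefits from the prey.
A = [[-1.0, -0.5],
     [0.5, -0.2]]
R = press_response(A)
# R[i][k]: long-term response of species i to a sustained press on species k.
print(R[0][1] < 0)  # True: prey declines when the predator is pressed upward
print(R[1][0] > 0)  # True: predator increases when the prey is pressed upward
```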

Relevance: 80.00%

Abstract:

Purpose – This paper seeks to present findings from the first all-Ireland study that consulted older people on their perceptions of interventions and services to support people experiencing abuse.

Design/methodology/approach – Utilising a grounded theory approach, 58 people aged 65 years and over took part in focus groups across Ireland. Four peer-researchers were also trained to assist in recruitment, data collection, analysis, and dissemination.

Findings – Participants identified preventative community-based approaches and peer supports as important mechanisms to support people experiencing, and being at risk of, elder abuse. Choices regarding care provision and housing, as well as opportunities for engagement in community activities where they can discuss issues with others, were identified as ways to prevent abuse.

Originality/value – The development of elder abuse services has traditionally been defined from the perspective of policy makers and professionals. This study looked at the perspective of the end-users of such services for the first time. The research also gave an active role to older people in the research process. The policy implication of the findings from this research is that enhanced attention and resources should be directed to community activities that enable older people to share their concerns informally thereby gaining confidence to seek more formal interventions when necessary.

Relevance: 80.00%

Abstract:

Purpose: Given the emergent nature of i-branding as an academic field of study and a lack of applied research output, the aim of this paper is to explain how businesses manage i-branding to create brand equity.

Design/methodology/approach: Within a case-study approach, seven cases were developed from an initial sample of 20 food businesses. Additionally, utilising secondary data, the analysis of findings introduces relevant case examples from other industrial sectors.

Findings: Specific internet tools and their application are discussed within opportunities to create brand equity for products classified by experience, credence and search characteristics. An understanding of target customers will be critical in underpinning the selection and deployment of relevant i-branding tools. Tools facilitating interactivity – machine and personal – are particularly significant.

Research limitations/implications: Future research positioned within classification of goods constructs could provide further contributions that recognise potential moderating effects of product/service characteristics on the development of brand equity online. Future studies could also employ the i-branding conceptual framework to test its validity and develop it further as a means of explaining how i-branding can be managed to create brand equity.

Originality/value: While previous research has focused on specific aspects of i-branding, this paper utilises a conceptual framework to explain how diverse i-branding tools combine to create brand equity. The literature review integrates fragmented literature around a conceptual framework to produce a more coherent understanding of extant thinking. The location of this study within a classification of goods context proved critical to explaining how i-branding can be managed.

Relevance: 80.00%

Abstract:

The purpose of this study was to explore the care processes experienced by community-dwelling adults dying from advanced heart failure, their family caregivers, and their health-care providers. A descriptive qualitative design was used to guide data collection, analysis, and interpretation. The sample comprised 8 patients, 10 informal caregivers, 11 nurses, 3 physicians, and 3 pharmacists. Data analysis revealed that palliative care was influenced by unique contextual factors (i.e., cancer model of palliative care, limited access to resources, prognostication challenges). Patients described choosing interventions and living with fatigue, pain, shortness of breath, and functional decline. Family caregivers described surviving caregiver burden and drawing on their faith. Health professionals described their role as trying to coordinate care, building expertise, managing medications, and optimizing interprofessional collaboration. Participants strove towards 3 outcomes: effective symptom management, satisfaction with care, and a peaceful death. © McGill University School of Nursing.

Relevance: 80.00%

Abstract:

Scytalidium thermophilum plays an important role in determining selectivity of compost produced for growing Agaricus bisporus. The objective of this study was to characterise S. thermophilum isolates by random amplified polymorphic DNA (RAPD) analysis and sequence analysis of internally transcribed spacer (ITS) regions of the rDNA, to assess the genetic variation exhibited by this species complex and to compare this with existing morphological and thermogravimetric data. RAPD analysis of 34 isolates from various parts of the world revealed two distinct groups, which could be separated on the basis of the differences in the banding patterns produced with five random primers. Nucleotide sequence analysis of the ITS region, which was ca 536 bp in length, revealed only very minor variation among the S. thermophilum isolates examined, confined to a small number of nucleotide base changes. Genetic distance values between type 1 and type 2 S. thermophilum isolates, as determined by ITS sequence analysis, varied by only 0.005%. The molecular analyses carried out in the present study suggest that isolates within this species complex exhibit genetic differences which correlate well with the morphological variation and thermogravimetric data previously determined.
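The distance computation underlying such ITS comparisons can be sketched simply: the uncorrected p-distance between two aligned sequences is the proportion of differing sites. The fragments below are made up, not real ITS data:

```python
# Uncorrected p-distance: the proportion of mismatched positions between
# two aligned sequences. Sequences are short invented fragments.

def p_distance(seq1, seq2):
    """Proportion of mismatched positions between two aligned sequences."""
    assert len(seq1) == len(seq2), "sequences must be aligned to equal length"
    diffs = sum(1 for a, b in zip(seq1, seq2) if a != b)
    return diffs / len(seq1)

type1 = "ACGTACGTACGTACGTACGT"
type2 = "ACGTACGTTCGTACGTACGA"  # two substitutions out of 20 sites
print(p_distance(type1, type2))  # 0.1
```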

Relevance: 80.00%

Abstract:

Within the last few years, the field of personalized medicine has entered the stage. Accompanied by great hopes and expectations, it is believed that this field may have the potential to revolutionize medical and clinical care by utilizing genomic information about individual patients themselves. In this paper, we reconstruct the early footprints of personalized medicine as reflected by information retrieved from PubMed and Google Scholar. That is, we provide a data-driven perspective on this field to estimate its current status and potential problems.

Relevance: 80.00%

Abstract:

This paper presents generalized Laplacian eigenmaps, a novel dimensionality reduction approach designed to address stylistic variations in time series. It generates compact and coherent continuous spaces whose geometry is data-driven. This paper also introduces graph-based particle filter, a novel methodology conceived for efficient tracking in low dimensional space derived from a spectral dimensionality reduction method. Its strengths are a propagation scheme, which facilitates the prediction in time and style, and a noise model coherent with the manifold, which prevents divergence, and increases robustness. Experiments show that a combination of both techniques achieves state-of-the-art performance for human pose tracking in underconstrained scenarios.
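The graph construction at the heart of Laplacian-eigenmaps-style methods can be sketched as follows: build a weighted adjacency matrix with a Gaussian kernel over nearest neighbours, then form the graph Laplacian L = D − W; the low-dimensional embedding (not computed here) comes from the eigenvectors of L with the smallest non-zero eigenvalues. The data points, neighbourhood size, and kernel width below are arbitrary:

```python
# Sketch of the graph-Laplacian construction used by Laplacian-eigenmaps-
# style dimensionality reduction. The embedding itself would come from an
# eigendecomposition of L, omitted here for brevity.
import math

def graph_laplacian(points, k=2, sigma=1.0):
    n = len(points)
    dist = [[math.dist(p, q) for q in points] for p in points]
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        # k nearest neighbours of i (excluding i itself)
        nbrs = sorted(range(n), key=lambda j: dist[i][j])[1:k + 1]
        for j in nbrs:
            w = math.exp(-dist[i][j] ** 2 / (2 * sigma ** 2))
            W[i][j] = W[j][i] = w  # symmetrise the neighbourhood graph
    # L = D - W, with D the diagonal matrix of row sums of W
    L = [[(sum(W[i]) if i == j else 0.0) - W[i][j] for j in range(n)]
         for i in range(n)]
    return L

L = graph_laplacian([(0, 0), (1, 0), (0, 1), (5, 5)])
print(all(abs(sum(row)) < 1e-12 for row in L))  # True: each row of L sums to 0
```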

Relevance: 80.00%

Abstract:

Cancer registries must provide complete and reliable incidence information with the shortest possible delay, for use in studies of comparability, clustering, cancer in the elderly, and the adequacy of cancer surveillance. Methods of varying complexity are available to registries for monitoring completeness and timeliness. We wished to know which methods are currently in use among cancer registries, and to compare our findings with those of a survey carried out in 2006.

Methods
In the framework of the EUROCOURSE project, and to prepare cancer registries for participation in the ERA-net scheme, we launched a survey on the methods used to assess completeness, and also on the timeliness and methods of dissemination of results by registries. We sent the questionnaire to all general registries (GCRs) and specialised registries (SCRs) active in Europe and within the European Network of Cancer Registries (ENCR).

Results
With a response rate of 66% among GCRs and 59% among SCRs, we obtained data for analysis from 116 registries with a population coverage of ∼280 million. The most common methods used were comparison of trends (79%) and mortality/incidence ratios (more than 60%). More complex methods were used less commonly: capture–recapture by 30%, flow method by 18% and death certificate notification (DCN) methods with the Ajiki formula by 9%.

The median latency for completion of ascertainment of incidence was 18 months. Additional time required for dissemination was of the order of 3–6 months, depending on the method: print or electronic. One fifth (21%) did not publish results for their own registry but only as a contribution to larger national or international data repositories and publications; this introduced a further delay in the availability of data.
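The mortality/incidence ratio method named in the results can be sketched with a toy calculation: a registry compares its observed M/I ratio with an expected ratio (for example, one derived from survival in a reference population); an observed ratio well above the expected one suggests incomplete incidence ascertainment. All counts and the expected ratio below are invented:

```python
# Toy sketch of the mortality/incidence (M/I) ratio completeness check.
# Counts and the expected reference ratio are hypothetical.

def mi_ratio(deaths, incident_cases):
    return deaths / incident_cases

def completeness_estimate(observed_mi, expected_mi):
    """Crude completeness estimate: expected M/I over observed M/I, capped at 1."""
    return min(1.0, expected_mi / observed_mi)

deaths, cases = 450, 900              # hypothetical one-year registry counts
observed = mi_ratio(deaths, cases)    # 0.5
expected = 0.45                       # hypothetical reference M/I ratio
print(round(completeness_estimate(observed, expected), 2))  # 0.9
```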

Conclusions
Cancer registries should improve the practice of measuring their completeness regularly and should move from traditional to more quantitative methods. This could also have implications for the timeliness of data publication.

Relevance: 80.00%

Abstract:

Polymer extrusion, in which a polymer is melted and conveyed to a mould or die, forms the basis of most polymer processing techniques. Extruders frequently run at non-optimised conditions and can account for 15–20% of overall process energy losses. In times of increasing emphasis on energy efficiency, such losses are a major concern for the industry. Product quality, which depends on the homogeneity and stability of the melt flow, which in turn depends on melt temperature and screw speed, is also an issue of concern to processors. Gear pumps can be used to improve the stability of the production line, but their cost is usually high. Likewise, it is possible to introduce energy meters, but they also add to the capital cost of the machine. Advanced control incorporating soft-sensing capabilities offers this industry opportunities to improve both quality and energy efficiency. Owing to strong correlations between the critical variables, such as melt temperature and melt pressure, traditional decentralised PID (Proportional–Integral–Derivative) control is incapable of handling such processes if stricter product specifications are imposed or the material is changed from one batch to another. In this paper, new real-time energy monitoring methods are introduced without the need to install power meters or develop data-driven models. The effects of process settings on energy efficiency and melt quality are then studied using the developed monitoring methods. Process variables include barrel heating temperature, water cooling temperature, and screw speed. Finally, a fuzzy logic controller is developed for a single-screw extruder to achieve high melt quality. The performance of the developed controller has shown it to be a satisfactory alternative to the expensive gear pump. Energy efficiency of the extruder can be further improved by optimising the temperature settings. Experimental results from open-loop control and fuzzy control on a Killion 25 mm single-screw extruder are presented to confirm the efficacy of the proposed approach.
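The flavour of the fuzzy logic control described can be conveyed by a minimal single-input controller sketch: melt-temperature error in, screw-speed adjustment out, with triangular membership functions and weighted-average defuzzification. The membership ranges and rule outputs are illustrative assumptions, not the paper's tuned controller:

```python
# Minimal fuzzy controller sketch: one input (melt-temperature error),
# one output (screw-speed change), triangular membership functions and
# weighted-average defuzzification. All ranges and outputs are invented.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed_adjustment(temp_error):
    """temp_error = measured melt temperature - setpoint (deg C).
    Returns a screw-speed change in rpm (negative = slow down)."""
    # Rule base: melt too cold -> speed up (more shear heating),
    # on target -> hold, melt too hot -> slow down.
    rules = [
        (tri(temp_error, -20, -10, 0), +5.0),   # negative error: speed up
        (tri(temp_error, -5, 0, 5), 0.0),       # near zero: hold
        (tri(temp_error, 0, 10, 20), -5.0),     # positive error: slow down
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(fuzzy_speed_adjustment(-10))  # 5.0  (melt too cold: speed up)
print(fuzzy_speed_adjustment(0))    # 0.0  (on target: hold)
```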

Relevance: 80.00%

Abstract:

The future European power system will have a hierarchical structure created by layers of system control from a Supergrid via regional high-voltage transmission through to medium and low-voltage distribution. Each level will have generation sources such as large-scale offshore wind, wave, solar thermal, nuclear directly connected to this Supergrid and high levels of embedded generation, connected to the medium-voltage distribution system. It is expected that the fuel portfolio will be dominated by offshore wind in Northern Europe and PV in Southern Europe. The strategies required to manage the coordination of supply-side variability with demand-side variability will include large scale interconnection, demand side management, load aggregation and storage in the context of the Supergrid combined with the Smart Grid. The design challenge associated with this will not only include control topology, data acquisition, analysis and communications technologies, but also the selection of fuel portfolio at a macro level. This paper quantifies the amount of demand side management, storage and so-called 'back-up generation' needed to support an 80% renewable energy portfolio in Europe by 2050. © 2013 IEEE.
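The kind of quantification the paper describes can be illustrated with a toy hourly dispatch: renewables plus storage cover demand where they can, and whatever deficit remains is metered as back-up generation. The demand and renewable profiles and the storage size below are invented single-day figures, not the paper's 2050 scenario data:

```python
# Toy residual-load dispatch: charge storage on renewable surplus,
# discharge on deficit, and count any remaining deficit as back-up
# generation. All profiles are invented illustrations.

def backup_needed(demand, renewables, storage_capacity):
    """Greedy hourly dispatch returning total back-up energy (MWh)."""
    stored, backup = 0.0, 0.0
    for d, r in zip(demand, renewables):
        surplus = r - d
        if surplus >= 0:
            stored = min(storage_capacity, stored + surplus)  # charge, spill excess
        else:
            deficit = -surplus
            discharge = min(stored, deficit)
            stored -= discharge
            backup += deficit - discharge  # back-up covers what storage cannot
    return backup

demand = [100, 120, 110, 130]       # MW per hour (hypothetical)
renewables = [140, 90, 150, 60]     # MW per hour (hypothetical)
print(backup_needed(demand, renewables, storage_capacity=30))  # 40.0
```

Scaling this kind of balance over a year, and trading storage size against demand-side shifting and interconnection, is the design question the paper addresses at the European level.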

Relevance: 80.00%

Abstract:

The power system of the future will have a hierarchical structure created by layers of system control from a Supergrid via regional high-voltage transmission through to medium and low-voltage distribution. Each level will have generation sources such as large-scale offshore wind, wave, solar thermal, and nuclear directly connected to this Supergrid and high levels of embedded generation connected to the medium-voltage distribution system. It is expected that the fuel portfolio will be dominated by offshore wind in Northern Europe and PV in Southern Europe. The strategies required to manage the coordination of supply-side variability with demand-side variability will include large-scale interconnection, demand side management, load aggregation and storage in the context of the Supergrid combined with the Smart Grid. The design challenge associated with this will not only include control topology, data acquisition, analysis and communications technologies, but also the selection of the fuel portfolio at a macro level. This paper quantifies the amount of demand side management, storage and so-called ‘back-up generation’ needed to support an 80% renewable energy portfolio in Europe by 2050.

Relevance: 80.00%

Abstract:

The continued use of traditional lecturing across Higher Education as the main teaching and learning approach in many disciplines must be challenged. An increasing number of studies suggest that this approach, compared to more active learning methods, is the least effective. In counterargument, the use of traditional lectures is often justified as necessary given a large student population. By analysing the implementation of a web-based broadcasting approach which replaced the traditional lecture within a programming-based module, and thereby removed the student-population rationale, it was hoped that the student learning experience would become more active and ultimately enhance learning on the module. The implemented model replaces the traditional approach of students attending an on-campus lecture theatre with a web-based live broadcast approach that focuses on students being active learners rather than passive recipients. Students ‘attend’ by viewing a live broadcast of the lecturer, presented as a talking head, and the lecturer’s desktop, via a web browser. Video and audio communication is primarily from tutor to students, with text-based comments used to provide communication from students to tutor. This approach promotes active learning by allowing students to perform activities on their own computers rather than the passive viewing and listening commonly encountered in large lecture classes. By analysing this approach over two years (n = 234 students), results indicate that 89.6% of students rated the approach as offering a highly positive learning experience. Comparing student performance across three academic years also indicates a positive change. A small data-analytic study of student participation levels suggests that the student cohort’s willingness to engage with the broadcast lecture material is high.