58 results for Real-life Projects
Abstract:
Bioenergy is a renewable energy source and a solution to depleting fossil fuels. Bioenergy in the form of heat, power and biofuel is generated by conversion technologies using biomass such as domestic waste, root crops, forest residue and animal slurry. Pyrolysis, anaerobic digestion and combined heat and power engines are some examples of these technologies. Depending on its nature, a biomass can be treated with various technologies yielding intermediate products, which can be further treated with other technologies and eventually converted into the final bioenergy products. The pathway followed by the biomass, technologies, intermediate products and bioenergy in the conversion process is referred to as a bioenergy pathway. Identification of appropriate pathways optimizes the conversion process. Although there are various approaches to generating such pathways, there is still a need for a semantic approach that allows checking the consistency of the knowledge and sharing and extending the knowledge efficiently. This paper presents an ontology-based approach to the automatic generation of pathways for biomass-to-bioenergy conversion, which exploits the definitions and hierarchical structure of the biomass and technologies, their relationships and associated properties, and infers appropriate pathways. A case study has been carried out in a real-life scenario, the bioenergy project for the North West of Europe (Bioen NW), which showed promising results.
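To make the idea of pathway generation concrete, the following is a minimal sketch in Python. The abstract's approach is ontology-based (definitions, hierarchies and inference), whereas this toy example simply hard-codes a few hypothetical conversion relations and enumerates biomass-to-bioenergy pathways with a depth-first search; all feedstock, technology and product names below are illustrative assumptions, not taken from the paper's ontology.

# Illustrative sketch only: a hand-written conversion graph and a depth-first
# enumeration of pathways from a feedstock to a final bioenergy product.

# feedstock/intermediate -> list of (technology, output) pairs (hypothetical values)
CONVERSIONS = {
    "animal_slurry":   [("anaerobic_digestion", "biogas")],
    "forest_residue":  [("pyrolysis", "bio_oil"), ("combustion", "heat")],
    "domestic_waste":  [("anaerobic_digestion", "biogas")],
    "biogas":          [("chp_engine", "heat_and_power")],
    "bio_oil":         [("chp_engine", "heat_and_power")],
}

FINAL_PRODUCTS = {"heat", "heat_and_power", "biofuel"}

def pathways(feedstock, path=()):
    """Enumerate every chain of (material, technology) steps ending in a final product."""
    results = []
    for technology, product in CONVERSIONS.get(feedstock, []):
        step = path + ((feedstock, technology),)
        if product in FINAL_PRODUCTS:
            results.append(step + ((product, None),))
        else:
            results.extend(pathways(product, step))
    return results

if __name__ == "__main__":
    for p in pathways("forest_residue"):
        print(" -> ".join(material if tech is None else f"{material} [{tech}]"
                          for material, tech in p))

An ontology-based implementation would instead derive the CONVERSIONS relations from class definitions and properties and let a reasoner check their consistency; the search over admissible conversion steps, however, follows the same pattern.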
Abstract:
The distribution of the secret key is the weakest link of many data encryption systems. Quantum key distribution (QKD) schemes provide attractive solutions [1]; however, their implementation remains challenging and their range and bit rate are limited. Moreover, practical QKD systems employ real-life components and are therefore vulnerable to diverse attack schemes [2]. Ultra-long fiber lasers (UFLs) have been drawing much attention recently because of their fundamentally different properties compared to conventional lasers, as well as their unique applications [3]. Here, we demonstrate a 100 bps, practically secure key distribution over a 500 km link, employing a Raman gain UFL. Fig. 1(a) depicts a schematic of the UFL system. Each user has an identical set of two wavelength-selective mirrors centered at λ0 and λ1. In order to exchange a key bit, each user independently chooses one of these mirrors and introduces it as a laser reflector at their end. If both users choose identical mirrors, a clear signal develops and the bits in these cases are discarded. However, if they choose complementary mirrors (the 1,0 or 0,1 states), the UFL remains below lasing threshold and no signal evolves. In these cases, an eavesdropper can only detect noise and is unable to determine the mirror choice of the users, where the choice of mirrors represents a single key bit (e.g. Alice's choice of mirror is the key bit). These bits are kept and added to the key. The absence of signal in the secure states facilitates fast measurements to distinguish between the non-secure and the secure states and to determine the key bit in the latter case. Sequentially repeating the single-bit exchange protocol generates a key of any desired length. © 2013 IEEE.
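The bit-exchange rule described above can be illustrated with a short, purely software simulation. This is only a sketch of the protocol logic (random mirror choices, discard on identical choices, keep Alice's choice on complementary choices); the actual system is an optical ultra-long fiber laser, and the function and parameter names below are assumptions for illustration.

import secrets

def exchange_key(num_bits):
    """Toy simulation of the UFL single-bit exchange rule described in the abstract.

    Each user independently picks one of two mirrors (0 or 1). Identical choices
    produce a lasing signal and the round is discarded; complementary choices stay
    below threshold and Alice's choice is kept as the key bit.
    """
    key = []
    rounds = 0
    while len(key) < num_bits:
        rounds += 1
        alice = secrets.randbelow(2)
        bob = secrets.randbelow(2)
        if alice != bob:          # complementary mirrors: secure state, keep the bit
            key.append(alice)
        # identical mirrors: clear signal develops, round discarded
    return key, rounds

if __name__ == "__main__":
    key, rounds = exchange_key(128)
    print(f"128-bit key after {rounds} rounds:", "".join(map(str, key)))

On average, half of the rounds produce identical mirror choices and are discarded, so roughly 2N exchanges are needed for an N-bit key.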
Abstract:
This paper presents a case study that reveals how stakeholders in the research process, by recommending specific data collection and analytical techniques, exert significant ‘hidden’ influence on the decisions made on the basis of market research findings. While disagreements amongst stakeholders regarding research design are likely, the possibility that strategies adopted by companies are dependent upon stakeholder research preferences has not been adequately addressed in the literature. Two widely used quantitative customer satisfaction evaluation approaches, involving stated and derived importance, are compared within a real-life market research setting at an international bank. The comparative analysis informs an ongoing debate surrounding the applicability of explicit and implicit importance measures and demonstrates how recommendations are dependent upon the methodological and analytical techniques selected. The findings therefore have significant implications for importance-based satisfaction market research planning and highlight the need to consider the impact of stakeholder preferences on research outcomes.
Abstract:
The Semantic Web has come a long way since its inception in 2001, especially in terms of technical development and research progress. However, adoption by non-technical practitioners is still an ongoing process, and in some areas this process is just now starting. Emergency response is an area where reliability and timeliness of information and technologies are of the essence. Therefore it is quite natural that more widespread adoption in this area has not been seen until now, when Semantic Web technologies are mature enough to support the high requirements of the application area. Nevertheless, to leverage the full potential of Semantic Web research results for this application area, there is a need for an arena where practitioners and researchers can meet and exchange ideas and results. Our intention is for this workshop, and hopefully coming workshops in the same series, to be such an arena for discussion. The Extended Semantic Web Conference (ESWC - formerly the European Semantic Web Conference) is one of the major research conferences in the Semantic Web field, and is therefore a suitable venue for a workshop discussing the application of Semantic Web technology to our specific application area. Hence, we chose to arrange our first SMILE workshop at ESWC 2013. However, this workshop does not focus solely on semantic technologies for emergency response, but rather on Semantic Web technologies in combination with technologies and principles for what is sometimes called the "social web". Social media has already been used successfully in many cases as a tool for supporting emergency response. The aim of this workshop is therefore to take this to the next level and answer questions like: "how can we make sense of, and furthermore make use of, all the data that is produced by different kinds of social media platforms in an emergency situation?" For the first edition of this workshop the chairs collected the following main topics of interest:
• Semantic Annotation for understanding the content and context of social media streams.
• Integration of Social Media with Linked Data.
• Interactive Interfaces and visual analytics methodologies for managing multiple large-scale, dynamic, evolving datasets.
• Stream reasoning and event detection.
• Social Data Mining.
• Collaborative tools and services for Citizens, Organisations, Communities.
• Privacy, ethics, trustworthiness and legal issues in the Social Semantic Web.
• Use case analysis, with specific interest in use cases that involve the application of Social Media and Linked Data methodologies in real-life scenarios.
All of these, applied in the context of:
• Crisis and Disaster Management
• Emergency Response
• Security and Citizen Journalism
The workshop received six high-quality paper submissions; based on a thorough review process, and thanks to our program committee, four of these papers were accepted for the workshop (a 67% acceptance rate). These four papers can be found later in this proceedings volume. Three of the four papers discuss the integration and analysis of social media data using Semantic Web technologies, e.g. for detecting complex events in social media streams, for visualizing and analysing sentiments with respect to certain topics in social media, or for detecting small-scale incidents entirely through the use of social media information. Finally, the fourth paper presents an architecture for using Semantic Web technologies in resource management during a disaster.
Additionally, the workshop featured an invited keynote speech by Dr. Tomi Kauppinen from Aalto University. Dr. Kauppinen shared experiences from his work on applying Semantic Web technologies to application fields such as geoinformatics and scientific research, i.e. so-called Linked Science, as well as recent ideas and applications in the emergency response field. His input was also highly valuable for the roadmapping discussion, which was held at the end of the workshop. A separate summary of the roadmapping session can be found at the end of these proceedings. Finally, we would like to thank our invited speaker Dr. Tomi Kauppinen, all our program committee members, as well as the workshop chair of ESWC 2013, Johanna Völker (University of Mannheim), for helping us to make this first SMILE workshop a highly interesting and successful event!
Abstract:
Every year throughout the world, individuals' health is damaged by their exposure to toxic chemicals at work. In most cases these problems will resolve, but many individuals will sustain permanent damage. Whilst any justified claim for compensation requires medical and legal evidence, a crucial and often controversial component of this process is the establishment of a causal link between the individual's condition and exposure to a specific chemical or substance. Causation, in terms of how a substance or substances led the claimant to his or her current plight, can be difficult to establish, and the main purpose of this book is to provide the aspiring expert report writer with a concise, practical guide that uses case histories to illuminate the process of establishing causation in occupational toxicity proceedings. In summary: a practical, accessible guide to the preparation of balanced, scientifically sound expert reports in the context of occupational toxicology. It focuses on the scientist's role in establishing a causal link between exposure to toxins and an individual's ill health, and includes real-life case histories drawn from the author's 15 years' experience in this area to illustrate the principles involved. Expert Report Writing in Toxicology: Forensic, Scientific and Legal Aspects will prove invaluable to scientists across a range of disciplines needing guidance as to what is expected of them in terms of the best use of their expertise and how to present their findings in a manner that is authoritative, balanced and informative.
Abstract:
Firms worldwide are taking major initiatives to reduce the carbon footprint of their supply chains in response to growing governmental and consumer pressures. In real life, these supply chains face stochastic and non-stationary demand, but most studies on the inventory lot-sizing problem with emission concerns consider deterministic demand. In this paper, we study the inventory lot-sizing problem under non-stationary stochastic demand with emission and cycle service level constraints, considering a carbon cap-and-trade regulatory mechanism. Using a mixed integer linear programming model, this paper aims to investigate the effects of emission parameters and product- and system-related features on supply chain performance through extensive computational experiments covering general business settings rather than a specific scenario. Results show that the cycle service level and the demand coefficient of variation have significant impacts on total cost and emissions irrespective of the level of demand variability, while the impact of the product's demand pattern is significant only at lower levels of demand variability. Finally, results also show that an increasing carbon price reduces total cost, total emissions and total inventory, and that the scope for emission reduction by increasing the carbon price is greater at higher levels of cycle service level and demand coefficient of variation. The analysis of the results helps supply chain managers make the right decisions in different demand and service level situations.
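The paper's exact formulation is not reproduced here, so the following is a deliberately simplified sketch of a lot-sizing MILP with a carbon cap-and-trade term, written with the PuLP library in Python. It is deterministic and single-item, omits the non-stationary stochastic demand and cycle service level constraints of the paper, and all parameter values and the cost/emission structure are made-up assumptions for illustration.

# Highly simplified illustration only: single-item lot sizing with setup, holding
# and emission terms, and carbon allowances that can be bought or sold at a price.
import pulp

T = 4                                  # planning periods
demand = [80, 120, 60, 100]            # assumed demand per period
setup_cost, hold_cost = 200.0, 1.0     # assumed cost parameters
setup_em, unit_em, hold_em = 50.0, 0.5, 0.1    # assumed emission parameters
carbon_cap, carbon_price = 250.0, 2.0  # cap-and-trade: cap per horizon, price per unit
M = sum(demand)                        # big-M linking setup and production

m = pulp.LpProblem("lot_sizing_cap_and_trade", pulp.LpMinimize)
q = [pulp.LpVariable(f"q_{t}", lowBound=0) for t in range(T)]        # production
inv = [pulp.LpVariable(f"inv_{t}", lowBound=0) for t in range(T)]    # end inventory
y = [pulp.LpVariable(f"y_{t}", cat="Binary") for t in range(T)]      # setup indicator
e_buy = pulp.LpVariable("carbon_bought", lowBound=0)                 # allowances bought
e_sell = pulp.LpVariable("carbon_sold", lowBound=0)                  # allowances sold

# inventory balance and setup linking
for t in range(T):
    prev = inv[t - 1] if t > 0 else 0
    m += prev + q[t] - demand[t] == inv[t]
    m += q[t] <= M * y[t]

# total emissions must stay within the cap plus net traded allowances
total_emission = pulp.lpSum(setup_em * y[t] + unit_em * q[t] + hold_em * inv[t]
                            for t in range(T))
m += total_emission <= carbon_cap + e_buy - e_sell

# minimize operational cost plus net cost of carbon trading
m += (pulp.lpSum(setup_cost * y[t] + hold_cost * inv[t] for t in range(T))
      + carbon_price * (e_buy - e_sell))

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("production plan:", [q[t].value() for t in range(T)])
print("emissions:", pulp.value(total_emission), "allowances bought:", e_buy.value())

Raising carbon_price in such a model makes emission-heavy plans relatively more expensive, which is the mechanism behind the price effects reported in the abstract; the paper's stochastic, service-level-constrained model adds further decision variables on top of this basic structure.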
Abstract:
One of the most pressing demands on electrophysiology applied to the diagnosis of epilepsy is the non-invasive localization of the neuronal generators responsible for brain electrical and magnetic fields (the so-called inverse problem). These neuronal generators produce primary currents in the brain, which together with passive currents give rise to the EEG signal. Unfortunately, the signal we measure on the scalp surface does not directly indicate the location of the active neuronal assemblies. This is the expression of the ambiguity of the underlying static electromagnetic inverse problem, partly due to the relatively limited number of independent measures available: a given electric potential distribution recorded at the scalp can be explained by the activity of infinitely many different configurations of intracranial sources. In contrast, the forward problem, which consists of computing the potential field at the scalp from known source locations and strengths, given the geometry and conductivity properties of the brain and its layers (CSF/meninges, skin and skull), i.e. the head model, has a unique solution. Head models vary from computationally simpler spherical models (three or four concentric spheres) to realistic models based on the segmentation of anatomical images obtained using magnetic resonance imaging (MRI). Realistic models, although computationally intensive and difficult to implement, can separate different tissues of the head and account for the convoluted geometry of the brain and the significant inter-individual variability. In real-life applications, if the assumptions about the statistical, anatomical or functional properties of the signal and the volume in which it is generated are meaningful, a true three-dimensional tomographic representation of the sources of brain electrical activity is possible in spite of the ‘ill-posed’ nature of the inverse problem (Michel et al., 2004). The techniques used to achieve this are now referred to as electrical source imaging (ESI) or magnetic source imaging (MSI). The first issue influencing reconstruction accuracy is spatial sampling, i.e. the number of EEG electrodes. It has been shown that this relationship is not linear, reaching a plateau at about 128 electrodes, provided the spatial distribution is uniform. The second factor is related to the different properties of the source localization strategies used with respect to the hypothesized source configuration.
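The contrast between the well-posed forward problem and the ill-posed inverse problem can be illustrated with the standard linear formulation, scalp potentials = lead-field × source amplitudes + noise, and one common regularized inverse solution (a Tikhonov/minimum-norm estimate). The NumPy sketch below is not any specific ESI method from the abstract; the lead-field matrix is random, standing in for one that would in practice be computed from a spherical or MRI-based realistic head model.

# Minimal sketch of the linear EEG forward/inverse formulation: v = L @ j + noise,
# followed by a Tikhonov-regularized minimum-norm estimate of the sources.
import numpy as np

rng = np.random.default_rng(0)
n_electrodes, n_sources = 128, 5000          # far more sources than sensors: ill-posed
L = rng.standard_normal((n_electrodes, n_sources))   # placeholder lead-field matrix

# simulate one focal source and the scalp potentials it generates (forward problem)
j_true = np.zeros(n_sources)
j_true[1234] = 1.0
v = L @ j_true + 0.01 * rng.standard_normal(n_electrodes)

# minimum-norm estimate: j_hat = L^T (L L^T + lambda I)^{-1} v   (inverse problem)
lam = 1e-2
j_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_electrodes), v)

print("true source index:", 1234,
      "estimated peak index:", int(np.argmax(np.abs(j_hat))))

Because there are many more unknown sources than electrodes, the inverse estimate is necessarily a regularized, smeared reconstruction; the choice of prior (minimum norm here, other constraints in other ESI methods) is exactly the "source localization strategy" the abstract refers to.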
Abstract:
Online writing plays a complex and increasingly prominent role in the life of organizations. From newsletters to press releases, social media marketing and advertising, to virtual presentations and interactions via e-mail and instant messaging, digital writing intertwines and affects the day-to-day running of the company - yet we rarely pay enough attention to it. Typing on the screen can become particularly problematic because digital text-based communication increases the opportunities for misunderstanding: it lacks the direct audio-visual contact and the norms and conventions that would normally help people to understand each other. Providing a clear, convincing and approachable discussion, this book addresses arenas of online writing: virtual teamwork, instant messaging, emails, corporate communication channels, and social media. Instead of offering do and don’t lists, however, it teaches the reader to develop a practice that is observant, reflective, and grounded in the understanding of the basic principles of language and communication. Through real-life examples and case studies, it helps the reader to notice previously unnoticed small details, question previously unchallenged assumptions and practices, and become a competent digital communicator in a wide range of professional contexts.
Abstract:
Identity influences the practice of English language teachers and supervisors, their professional development and their ability to incorporate innovation and change. Talk during post observation feedback meetings provides participants with opportunities to articulate, construct, verify, contest and negotiate identities, processes which often engender issues of face. This study examines the construction and negotiation of identity and face in post observation feedback meetings between in-service English language teachers and supervisors at a tertiary institution in the United Arab Emirates. Within a linguistic ethnography framework, this study combined linguistic microanalysis of audio recorded feedback meetings with ethnographic data gathered from participant researcher knowledge, pre-analysis interviews and post-analysis participant interpretation interviews. Through a detailed, empirical description of situated ‘real life’ institutional talk, this study shows that supervisors construct identities involving authority, power, expertise, knowledge and experience while teachers index identities involving experience, knowledge and reflection. As well as these positive valued identities, other negative, disvalued identities are constructed. Identities are shown to be discursively claimed, verified, contested and negotiated through linguistic actions. This study also shows a link between identity and face. Analysis demonstrates that identity claims verified by an interactional partner can lead to face maintenance or support. However, a contested identity claim can lead to face threat which is usually managed by facework. Face, like identity, is found to be interactionally achieved and endogenous to situated discourse. Teachers and supervisors frequently risk face threat to protect their own identities, to contest their interactional partner’s identities or to achieve the feedback meeting goal i.e. improved teaching. Both identity and face are found to be consequential to feedback talk and therefore influence teacher development, teacher/supervisor relationships and the acceptance of feedback. Analysis highlights the evaluative and conforming nature of feedback in this context which may be hindering opportunities for teacher development.
Abstract:
This paper deals with a very important issue in any knowledge engineering discipline: the accurate representation and modelling of real-life data and its processing by human experts. The work is applied to the GRiST Mental Health Risk Screening Tool for assessing risks associated with mental-health problems. The complexity of risk data and the wide variation in clinicians' expert opinions make it difficult to elicit representations of uncertainty that are an accurate and meaningful consensus. It requires integrating each expert's estimation of a continuous distribution of uncertainty across a range of values. This paper describes an algorithm that generates a consensual distribution while simultaneously measuring the consistency of the inputs. Hence it provides a measure of confidence in a particular data item's risk contribution at the input stage and can help indicate the quality of subsequent risk predictions. © 2010 IEEE.
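The abstract does not reproduce the algorithm itself, so the sketch below is only a hypothetical stand-in for the general idea: each expert supplies a discretized probability distribution over a range of risk values, the consensus is their pointwise mean, and consistency is scored from the mean pairwise Jensen-Shannon divergence. This is explicitly not the GRiST algorithm; every function name and number here is an assumption for illustration.

# Hypothetical stand-in: average expert distributions into a consensus and score
# how consistent the inputs are (1 = identical inputs, lower = more disagreement).
import itertools
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions (base 2, in [0, 1])."""
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def consensus(expert_dists):
    """Return (consensus distribution, consistency score in [0, 1])."""
    dists = [np.asarray(d, dtype=float) / np.sum(d) for d in expert_dists]
    agg = np.mean(dists, axis=0)
    divs = [js_divergence(p, q) for p, q in itertools.combinations(dists, 2)]
    consistency = 1.0 - float(np.mean(divs)) if divs else 1.0
    return agg, consistency

if __name__ == "__main__":
    # three hypothetical experts rating a risk contribution on a 5-bin scale
    experts = [[0.1, 0.2, 0.4, 0.2, 0.1],
               [0.0, 0.3, 0.5, 0.2, 0.0],
               [0.2, 0.3, 0.3, 0.1, 0.1]]
    agg, score = consensus(experts)
    print("consensus:", np.round(agg, 3), "consistency:", round(score, 3))

A consistency score of this kind is one way to flag, at the input stage, data items where expert opinion diverges and subsequent risk predictions should be treated with more caution, which is the role the abstract describes for the confidence measure.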
Abstract:
The requirement that primary school children fully appreciate the pivotal role played by engineering in the sustainable development of future society is reflected in the literature, with much attention being paid to the need to spark children's engineering imagination early on in their school careers. Moreover, UK policy documents highlight the value of embedding engineering into the school curriculum, arguing that programmes aimed at inspiring children through a process of real-life learning experiences are vital pedagogical tools in promoting engineering to future generations. Despite such attention, engineering education at school level remains sporadic, often reliant on individual engineering-entrepreneurs such as teachers who, through personal interest, get children involved in what are usually extra-curricular, time-limited, engineering-focused programmes and competitions. This paper briefly discusses an exploratory study aimed at investigating the issues surrounding embedding engineering into the primary school curriculum. It gives some insight into the perceptions of various stakeholders in respect of the viability and value of introducing engineering education into the primary school curriculum from the age of 6 or 7. A conceptual framework of primary-level engineering education, bringing together the theoretical, pedagogical and policy-related phenomena influencing the development of engineering education, is proposed. The paper concludes by arguing that in order to avert future societal disaster, children's engineering imagination needs to be ignited from an early age, and that to do this, primary engineering education needs to be given far more educational, social and political attention. © 2009 Authors.
Abstract:
Despite being frequently misrepresented as outdated or old-fashioned (IMechE, 2009, p1), engineering is increasingly called upon to deal with some of society's biggest challenges, including those associated with climate, infrastructure and security. In order to meet such challenges there needs to be a supply of engineering talent able to turn its collective mind to what is required. Yet at a time when the demand for engineers able to provide innovative solutions to contemporary problems is possibly at its highest, the profession is plagued by shortages and an inability to attract young people (DIUS, 2008; RAE, 2008; NSF, 2009). Although the current situation appears critical, potential future shortages of engineers mean that unless action is taken urgently, matters will get worse during the next 20 to 30 years. For higher education, the challenge is how to change young people's perceptions of engineering in such a manner that it is seen as a worthwhile and rewarding career. This paper considers this challenge, looking in detail at why young people fail to view engineering positively. A theoretical framework outlining the various real-life barriers and drivers is proposed. A critical analysis of current policy and practice suggests that in order to promote engineering as a profession that young people want to enter, both pedagogic and policy-grounded solutions need to be found. By bringing together pedagogy and policy within an engineering framework, the paper adds to current debates in engineering education whilst providing a distinctive look at what seems to be a recurring problem. © 2009 Authors.
Abstract:
Background: Although numerous studies and meta-analyses have shown the beneficial effect of statin therapy in CVD secondary prevention, there is still controversy regarding the use of statins for primary CVD prevention in patients with DM. The purpose of this study was to evaluate the occurrence of total major adverse cardiovascular events (MACE) in a cohort of patients with type 2 diabetes complicated by nephropathy treated with statins, in order to verify the real-life effect of statins on CVD primary prevention. Methods: We conducted an observational prospective multicenter study on 564 patients with type 2 diabetic nephropathy free of cardiovascular disease attending 21 national outpatient diabetes clinics and followed them up for 8 years. 169 of them were treated with statins (group A) while 395 were not on statins (group B). Results: Notably, none of the patients was treated with high-intensity statin therapy according to the latest ADA position statement. Total MACE occurred in 32 patients from group A and in 68 patients from group B. Fatal MACE occurred in 13 patients from group A and in 30 from group B; nonfatal MACE occurred in 19 patients from group A and in 38 patients from group B. The analysis of the Kaplan-Meier survival curves showed no statistically significant difference in the incidence of total (p = 0.758), fatal (p = 0.474) and nonfatal (p = 0.812) MACE between the two groups. Only HbA1c showed a significant association with the incidence of MACE (HR 1.201, CI 1.041-1.387, p = 0.012). Conclusions: These findings suggest that, in a real clinical setting, moderate-intensity statin treatment is ineffective for cardiovascular primary prevention in patients with diabetic nephropathy.