26 results for Application area
in Aston University Research Archive
Abstract:
Respiration is a complex activity. If the relationship between all neurological and skeletomuscular interactions were perfectly understood, an accurate dynamic model of the respiratory system could be developed and the interaction between different inputs and outputs could be investigated in a straightforward fashion. Unfortunately, this is not the case and does not appear to be viable at this time. In addition, the provision of appropriate sensor signals for such a model would be a considerably invasive task. Useful quantitative information on respiratory performance can be gained from non-invasive monitoring of chest and abdomen motion. Currently available devices are not well suited to spirometric measurement for ambulatory monitoring. A sensor matrix measurement technique is investigated to identify suitable sensing elements on which to base an upper-body surface measurement device that monitors respiration. This thesis is divided into two main areas of investigation: model-based and geometry-based surface plethysmography. In the first instance, chapter 2 deals with an array of tactile sensors used as a progression of existing, previously investigated volumetric measurement schemes based on models of respiration. Chapter 3 details a non-model-based geometrical approach to surface (and hence volumetric) profile measurement. Later sections of the thesis concentrate upon the development of a functioning prototype sensor array. To broaden the application area, the study has been conducted as it would be for a generically configured sensor array. In experimental form, the system's group estimation compares favourably with existing systems on volumetric performance. In addition, it provides continuous transient measurement of respiratory motion within an acceptable accuracy using approximately 20 sensing elements.
Because of the potential size and complexity of the system, it is possible to deploy it as a fully mobile ambulatory monitoring device, which may be used outside the laboratory. It provides a means by which to isolate coupled physiological functions and thus allows individual contributions to be analysed separately, facilitating greater understanding of respiratory physiology and improved diagnostic capabilities. The outcome of the study is the basis for a three-dimensional surface contour sensing system that is suitable for respiratory function monitoring and has the prospect, with future development, of being incorporated into a garment-based clinical tool.
Abstract:
RFID is one of the contemporary technologies that has the potential to enable improved data gathering and cross-company integration, and thus achieve cost efficiency. However, RFID has not yet become the primary approach to collecting data from supply chain activities. This is partly due to the (relatively) high cost of implementation, partly due to technical deficiencies, as well as cross-company systems integration issues. This paper discusses a potential application area for RFID technology: environmentally sustainable supply chain management. The paper discusses current practices in green supply chain management and proposes possible applications of RFID to enable it. The paper also proposes the idea of ad hoc RFID systems, which are rapidly deployable and require minimal, if any, pre-existing infrastructure. © 2010 IEEE.
Abstract:
This research concerns information systems and information systems development. The thesis describes an approach to information systems development called Multiview. This is a methodology which seeks to combine the strengths of a number of different existing approaches in a coherent manner. Many of these approaches are radically different in terms of concepts, philosophy, assumptions, methods, techniques and tools. Three case studies are described, presenting Multiview 'in action'. The first is used mainly to expose the strengths and weaknesses of an early version of the approach discussed in the thesis. Tools and techniques which aim to strengthen the approach are then described. Two further case studies are presented to illustrate the use of this second version of Multiview. Multiview is not put forward as an 'ideal methodology', and the case studies expose some of the difficulties and practical problems of information systems work and of the use of the methodology. A more contingency-based approach to information systems development is advocated, using Multiview as a framework rather than a prescriptive tool. Each information systems project, and each use of the framework, is unique, contingent on the particular problem situation. The skills of different analysts, the backgrounds of users and the situations in which they are constrained to work must always be taken into account in any project. The realities of the situation will cause departures from the 'ideal methodology' in order to allow for the exigencies of the real world. Multiview can therefore be said to be an approach used to explore the application area in order to develop an information system.
Abstract:
This paper summarizes the scientific work presented at the 32nd European Conference on Information Retrieval. It demonstrates that information retrieval (IR) as a research area continues to thrive, with progress being made in three complementary sub-fields: IR theory and formal methods, together with indexing and query representation issues; Web IR as a primary application area; and research into evaluation methods and metrics. It is the combination of these areas that gives IR its solid scientific foundations. The paper also illustrates that significant progress has been made in other areas of IR. The keynote speakers addressed three such subject fields: social search engines using personalization and recommendation technologies, the renewed interest in applying natural language processing to IR, and multimedia IR as another fast-growing area.
Abstract:
Bioenergy schemes are multi-faceted and complex by nature, with many available raw material supplies and technical options and a diverse set of stakeholders holding a raft of conflicting opinions. To develop and operate a successful scheme, many requirements should be considered and satisfied. This paper provides a review of academic works attempting to deal with problems arising within the bioenergy sector using multi-criteria decision-making (MCDM) methods. These methods are particularly suitable for bioenergy given its multi-faceted nature, but could be equally relevant to other energy conversion technologies. Related articles appearing in international journals from 2000 to 2010 are gathered and analysed so that the following two questions can be answered: (i) Which methods are the most popular? (ii) Which problems attract the most attention? The review finds that optimisation methods are the most popular, with methods choosing between few alternatives used in 44% of reviewed papers and methods choosing between many alternatives used in 28%. The most popular application area was technology selection, with 27% of reviewed papers, followed by policy decisions with 18%. © 2012 Elsevier Ltd.
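To make the MCDM idea concrete, the sketch below shows one of the simplest methods in the family reviewed here: weighted-sum scoring for technology selection. The alternatives, criteria, scores and weights are invented for illustration and are not taken from any of the reviewed papers.

```python
# Hypothetical weighted-sum MCDM example for bioenergy technology
# selection. All names and numbers below are illustrative assumptions.

def weighted_sum_rank(scores, weights):
    """Rank alternatives by weighted-sum score (higher is better).

    scores  -- dict mapping alternative name to a list of criterion scores
    weights -- list of criterion weights (assumed to sum to 1)
    """
    totals = {
        name: sum(w * s for w, s in zip(weights, vals))
        for name, vals in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Criteria (0-10 scale): cost efficiency, CO2 reduction, feedstock availability
alternatives = {
    "anaerobic digestion": [7, 8, 6],
    "gasification":        [5, 7, 8],
    "direct combustion":   [8, 5, 9],
}
weights = [0.5, 0.3, 0.2]

ranking = weighted_sum_rank(alternatives, weights)
# e.g. direct combustion scores 0.5*8 + 0.3*5 + 0.2*9 = 7.3 and ranks first
```

Methods "choosing between many alternatives" in the review's terminology (e.g. mathematical programming) optimise over a continuous or combinatorial space instead of scoring a short list like this.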
Abstract:
The Semantic Web has come a long way since its inception in 2001, especially in terms of technical development and research progress. However, adoption by non-technical practitioners is still an ongoing process, and in some areas this process is only now starting. Emergency response is an area where the reliability and timeliness of information and technologies is of the essence. Therefore it is quite natural that more widespread adoption in this area has not been seen until now, when Semantic Web technologies are mature enough to support the high requirements of the application area. Nevertheless, to leverage the full potential of Semantic Web research results for this application area, there is a need for an arena where practitioners and researchers can meet to exchange ideas and results. Our intention is for this workshop, and hopefully coming workshops in the same series, to be such an arena for discussion. The Extended Semantic Web Conference (ESWC - formerly the European Semantic Web Conference) is one of the major research conferences in the Semantic Web field, which makes it a suitable location for discussing the application of Semantic Web technology to our specific area. Hence, we chose to arrange our first SMILE workshop at ESWC 2013. However, this workshop does not focus solely on semantic technologies for emergency response, but rather on Semantic Web technologies in combination with technologies and principles for what is sometimes called the "social web". Social media has already been used successfully in many cases as a tool for supporting emergency response. The aim of this workshop is therefore to take this to the next level and answer questions like: "how can we make sense of, and furthermore make use of, all the data that is produced by different kinds of social media platforms in an emergency situation?"
For the first edition of this workshop the chairs collected the following main topics of interest:
• Semantic Annotation for understanding the content and context of social media streams.
• Integration of Social Media with Linked Data.
• Interactive Interfaces and visual analytics methodologies for managing multiple large-scale, dynamic, evolving datasets.
• Stream reasoning and event detection.
• Social Data Mining.
• Collaborative tools and services for Citizens, Organisations, Communities.
• Privacy, ethics, trustworthiness and legal issues in the Social Semantic Web.
• Use case analysis, with specific interest in use cases that involve the application of Social Media and Linked Data methodologies in real-life scenarios.
All of these, applied in the context of:
• Crisis and Disaster Management
• Emergency Response
• Security and Citizen Journalism
The workshop received 6 high-quality paper submissions and, based on a thorough review process by our program committee, four of these papers were accepted for the workshop (67% acceptance rate). These four papers can be found later in this proceedings volume. Three of the four particularly discuss the integration and analysis of social media data using Semantic Web technologies, e.g. for detecting complex events in social media streams, for visualizing and analysing sentiments with respect to certain topics in social media, or for detecting small-scale incidents entirely through the use of social media information. Finally, the fourth paper presents an architecture for using Semantic Web technologies in resource management during a disaster. Additionally, the workshop featured an invited keynote speech by Dr. Tomi Kauppinen from Aalto University. Dr. Kauppinen shared experiences from his work on applying Semantic Web technologies to application fields such as geoinformatics and scientific research, i.e.
so-called Linked Science, but also recent ideas and applications in the emergency response field. His input was also highly valuable for the roadmapping discussion, which was held at the end of the workshop. A separate summary of the roadmapping session can be found at the end of these proceedings. Finally, we would like to thank our invited speaker Dr. Tomi Kauppinen, all our program committee members, as well as the workshop chair of ESWC2013, Johanna Völker (University of Mannheim), for helping us to make this first SMILE workshop a highly interesting and successful event!
Abstract:
We evaluate the performance of composite leading indicators of turning points of inflation in the Euro area, constructed by combining the techniques of Fourier analysis and Kalman filters with the National Bureau of Economic Research methodology. In addition, the study compares the empirical performance of Euro Simple Sum and Divisia monetary aggregates and provides a tentative answer to the question of whether or not the UK should join the Euro area. Our findings suggest that, first, the cyclical patterns of the different composite leading indicators very closely reflect that of the inflation cycle for the Euro area; second, the empirical performance of the Euro Divisia is better than that of its Simple Sum counterpart; and third, the UK is better off outside the Euro area. © 2005 Taylor & Francis Group Ltd.
Abstract:
We use the Fleissig and Whitney (2003) weak separability test to determine admissible levels of monetary aggregation for the Euro area. We find that the Euro area monetary assets in M2 and M3 are weakly separable and construct admissible Divisia monetary aggregates for these assets. We evaluate the Divisia aggregates as indicator variables, building on Nelson (2002), Reimers (2002), and Stracca (2004). Specifically, we show that real growth of the admissible Divisia aggregates enters the Euro area IS curve positively and significantly for the period from 1980 to 2005. Out of sample, we show that Divisia M2 and M3 appear to contain useful information for forecasting Euro area inflation.
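For readers unfamiliar with the construction, a Divisia monetary aggregate weights each component's growth by its expenditure share rather than summing quantities one-for-one. The sketch below implements the standard discrete-time (Törnqvist-Theil) approximation; the quantities and user costs are invented for illustration, not Euro-area data.

```python
# Minimal sketch of the Törnqvist-Theil (discrete-time Divisia) index:
# the log growth of the aggregate is the expenditure-share-weighted
# average of the components' log growth rates. All numbers are
# illustrative assumptions.
import math

def divisia_growth(q0, q1, u0, u1):
    """Log growth of the Divisia index between periods 0 and 1.

    q0, q1 -- component quantities in each period
    u0, u1 -- corresponding user costs (opportunity costs of holding)
    """
    e0 = [q * u for q, u in zip(q0, u0)]   # expenditures, period 0
    e1 = [q * u for q, u in zip(q1, u1)]   # expenditures, period 1
    s0 = [e / sum(e0) for e in e0]         # expenditure shares, period 0
    s1 = [e / sum(e1) for e in e1]         # expenditure shares, period 1
    return sum(
        0.5 * (a + b) * (math.log(x1) - math.log(x0))
        for a, b, x0, x1 in zip(s0, s1, q0, q1)
    )

# Two assets: a currency-like component (low user cost) and a
# savings-like component (higher user cost), both growing 5%
g = divisia_growth(q0=[100.0, 200.0], q1=[105.0, 210.0],
                   u0=[0.01, 0.03], u1=[0.01, 0.03])
```

When every component grows at the same rate, the Divisia index grows at that common rate (here log 1.05); the aggregates diverge from Simple Sum precisely when components with different user costs grow at different rates.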
Abstract:
Purpose - The purpose of this paper is to examine consumer emotions and the social science and observation measures that can be utilised to capture the emotional experiences of consumers. The paper does not set out to solve the theoretical debate surrounding emotion research, but rather to provide an assessment of the methodological options available to researchers to aid their investigation into both the structure and content of the consumer emotional experience, acknowledging both the conscious and subconscious elements of that experience. Design/methodology/approach - A review of a wide range of prior research from the fields of marketing, consumer behaviour, psychology and neuroscience is examined to identify the different observation methods available to marketing researchers in the study of consumer emotion. This review also considers the self-report measures available to researchers and identifies the main theoretical debates concerning emotion, to provide a comprehensive overview of the issues surrounding the capture of emotional responses in a marketing context and to highlight the benefits that observation methods offer this area of research. Findings - This paper evaluates three observation methods and four widely used self-report measures of emotion used in a marketing context. Whilst it is recognised that marketers have shown a preference for self-report measures in prior research, mainly due to ease of implementation, it is posited that the benefits of observation methodology, and the wealth of data that can be obtained using such methods, can complement prior research. In addition, the use of observation methods can not only enhance our understanding of the consumer emotion experience but also enable us to collaborate with researchers from other fields in order to make progress in understanding emotion. Originality/value - This paper brings perspectives and methods together to provide an up-to-date consideration of emotion research for marketers.
In order to generate valuable research in this area there is an identified need for discussion and implementation of the observation techniques available to marketing researchers working in this field. An evaluation of a variety of methods is undertaken as a starting point for discussion and consideration of different observation techniques and how they can be utilised.
Abstract:
The initial image-processing stages of visual cortex are well suited to a local (patchwise) analysis of the viewed scene. But the world's structures extend over space as textures and surfaces, suggesting the need for spatial integration. Most models of contrast vision fall shy of this process because (i) the weak area summation at detection threshold is attributed to probability summation (PS) and (ii) there is little or no advantage of area well above threshold. Both of these views are challenged here. First, it is shown that results at threshold are consistent with linear summation of contrast following retinal inhomogeneity, spatial filtering, nonlinear contrast transduction and multiple sources of additive Gaussian noise. We suggest that the suprathreshold loss of the area advantage in previous studies is due to a concomitant increase in suppression from the pedestal. To overcome this confound, a novel stimulus class is designed where: (i) the observer operates on a constant retinal area, (ii) the target area is controlled within this summation field, and (iii) the pedestal is fixed in size. Using this arrangement, substantial summation is found along the entire masking function, including the region of facilitation. Our analysis shows that PS and uncertainty cannot account for the results, and that suprathreshold summation of contrast extends over at least seven target cycles of grating. © 2007 The Royal Society.
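The contrast between the two pooling rules at issue here can be sketched numerically. The snippet below is not the authors' model; it is a hedged illustration using the well-known Minkowski (Quick) pooling formula, under the simplifying assumption of N equally sensitive, independent mechanisms: an exponent of 1 corresponds to linear summation of contrast, while a high exponent approximates probability summation, which predicts a much weaker advantage of stimulus area.

```python
# Illustrative comparison (not the authors' full model) of linear
# summation vs. probability summation via Minkowski (Quick) pooling:
#   S = (sum_i s_i^beta)^(1/beta)
# for n equally sensitive mechanisms. beta = 1 is linear summation;
# beta ~ 3-4 is a common stand-in for probability summation.

def pooled_sensitivity(s, n, beta):
    """Aggregate sensitivity of n mechanisms, each with sensitivity s."""
    return (n * s ** beta) ** (1.0 / beta)

s = 1.0
linear = pooled_sensitivity(s, 4, beta=1.0)    # 4x area -> 4x sensitivity
prob_sum = pooled_sensitivity(s, 4, beta=4.0)  # 4x area -> 4**(1/4) ~ 1.41x
```

The empirical question addressed in the paper is which regime the data fall into; the finding of substantial summation along the entire masking function argues against the probability-summation account.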
Abstract:
The best way of finding "natural groups" in management research remains subject to debate, with no accepted consensus in the literature. The principal motivation behind this study is to explore the effect of choices of method upon strategic group research, an area that has suffered enduring criticism, as we believe that these method choices are still not fully exploited. Our study is novel in its use of a variety of more robust clustering and validation techniques, rarely used in management research, some borrowed from the natural sciences, which may provide a useful and more robust base for this type of research. Our results confirm that methods do exist to address the concerns over strategic group research, and that adoption of our chosen methods will improve the quality of management research.
Abstract:
B-ISDN is a universal network which supports diverse mixes of services, applications and traffic. ATM has been accepted worldwide as the transport technique for future use in B-ISDN. ATM, being a simple packet-oriented transfer technique, provides a flexible means of supporting a continuum of transport rates and is efficient due to the possible statistical sharing of network resources by multiple users. In order to fully exploit the potential statistical gain while at the same time supporting diverse service and traffic mixes, an efficient traffic control must be designed. Traffic controls, which include congestion and flow control, are a fundamental necessity for the success and viability of future B-ISDN. Congestion and flow control is difficult in the broadband environment due to the high-speed links, the wide-area distances, diverse service requirements and diverse traffic characteristics. Most congestion and flow control approaches in conventional packet-switched networks are reactive in nature and are not applicable in the B-ISDN environment. In this research, traffic control procedures based mainly on preventive measures for a private ATM-based network are proposed and their performance evaluated. The various traffic controls include connection admission control (CAC), traffic flow enforcement, priority control and an explicit feedback mechanism. These functions operate at the call level and the cell level. They are carried out distributively by the end terminals, the network access points and the internal elements of the network. During the connection set-up phase, the CAC decides the acceptance or denial of a connection request and allocates bandwidth to the new connection according to one of three schemes: peak bit rate, statistical rate and average bit rate. The statistical multiplexing rate is based on a 'bufferless fluid flow model', which is simple and robust. The allocation of an average bit rate to data traffic, at the expense of delay, improves network bandwidth utilisation.
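The statistical-rate scheme can be illustrated with a toy admission test. The sketch below is not the thesis's bufferless fluid-flow model itself; it uses a Gaussian approximation of the aggregate rate of on-off sources (mean plus a multiple of the standard deviation), one common realisation of bufferless statistical multiplexing, and all parameter values are invented.

```python
# Simplified CAC sketch under a bufferless statistical-multiplexing
# assumption, using a Gaussian approximation of the aggregate rate of
# on-off sources. This is an illustrative stand-in, not the thesis's
# exact model; all parameter values are assumptions.
import math

def equivalent_bandwidth(sources, alpha=3.0):
    """Bandwidth needed by on-off sources under a Gaussian approximation.

    sources -- list of (peak_rate, activity_probability) pairs
    alpha   -- safety factor set by the target cell-loss probability
    """
    mean = sum(p * a for p, a in sources)                 # mean aggregate rate
    var = sum((p ** 2) * a * (1 - a) for p, a in sources)  # rate variance
    return mean + alpha * math.sqrt(var)

def admit(existing, new_source, link_capacity):
    """Accept the new connection only if the link can carry the mix."""
    return equivalent_bandwidth(existing + [new_source]) <= link_capacity

# Ten bursty 10 Mbit/s sources, each active 20% of the time, on an 80
# Mbit/s link: an eleventh source is admitted statistically even though
# peak-rate allocation (11 x 10 = 110 Mbit/s) would reject it.
calls = [(10.0, 0.2)] * 10
ok = admit(calls, (10.0, 0.2), link_capacity=80.0)
```

This captures the statistical gain the abstract refers to: admitting on activity statistics rather than peak rates lets many more bursty connections share the link, at the cost of a small, controlled overload probability.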
Abstract:
This doctoral thesis originates from an observed incongruence between the perennial aims and aspirations of economic endeavour and actually recorded outcomes, which frequently seem contrary to those intended and of a recurrent, cyclical type. The research hypothesizes parallel movement between unstable business environments through time, as expressed by periodically fluctuating levels of economic activity, and the precipitation rates of industrial production companies. A major problem arose from the need to provide theoretical and empirical cohesion from the conflicting, partial and fragmented interpretations of several hundred historians and economists, without which the research question would remain unanswerable. An attempt to discover a master cycle, or superimposition theorem, failed, but was replaced by minute analysis of both the concept of cycles and their underlying databases. A novel technique of congregational analysis emerged, resulting in an integrated matrix of numerical history. Two centuries of industrial revolution history in England and Wales was then explored and recomposed for the first time in a single account of change, thereby providing a factual basis for the matrix. The accompanying history of the Birmingham area provided the context for research into the failure rates and longevities of firms in the city's staple metal industries. Sample-specific results are obtained for company longevities in the Birmingham area. Some novel presentational forms are deployed for the results of a postal questionnaire to surviving firms. A practical demonstration of the new index of national economic activity (INEA) in relation to company insolvencies leads to conclusions and suggestions for further applications of research into the tempo of change. Substantial appendices support the thesis and provide a compendium of information covering immediately contiguous domains.
Abstract:
Previous work has indicated the presence of collapsing and structured soils in the surface layers underlying Sana'a, the capital of the Republic of Yemen. This study set out initially to define and, ultimately, to alleviate the problem by investigating the deformation behaviour of these soils through both field and laboratory programmes. The field programme was carried out in Sana'a, while the laboratory work consisted of two parts: an initial phase at Sana'a University, carried out in parallel with the field programme on natural and treated soils, and the major phase at Aston University, carried out on natural, destructured and selected treated soils. The initial phase of the laboratory programme included classification, permeability, and single (collapsing) and double oedometer tests, while the major phase, at Aston, was extended to also include extensive single and double oedometer tests, Scanning Electron Microscopy and Energy Dispersive Spectrum analysis. The mechanical tests were carried out on natural and destructured samples at both the in situ and soaked moisture conditions. The engineering characteristics of the natural intact, field-treated and laboratory-destructured soils are reported, including their collapsing potentials, which show them to be weakly bonded with nil to severe collapsing susceptibility. Flooding had no beneficial effect, with limited to moderate improvement being achieved by preloading and roller compaction, while major benefits were achieved from deep compaction. From these results, a comparison between the soil responses to the different treatments and general field remarks are presented. Laboratory destructuring reduced the stiffness of the soils while increasing their compressibility. The collapsing and destructuring mechanisms have been examined by studying the changes in structure accompanying these phenomena.
Based on the test results for the intact and the laboratory-destructured soils, a simplified framework has been developed to represent the collapsing and deformation behaviour at both the partially saturated and soaked states, and comments are given on its general applicability and limitations. It has been used to evaluate all the locations subjected to field treatment, and it provided satisfactory results for the deformation behaviour of the soils destructured by field treatment. Finally, attention is drawn to design considerations, together with recommendations for the selection of potential improvement techniques to be used for foundation construction on the particular soils of the Sana'a region.