922 results for High-level Design Specification


Relevance: 100.00%

Abstract:

The Internet of Things (IoT) consists of a worldwide “network of networks,” composed of billions of interconnected heterogeneous devices denoted as things or “Smart Objects” (SOs). Significant research efforts have been dedicated to porting the experience gained in the design of the Internet to the IoT, with the goal of maximizing interoperability, using the Internet Protocol (IP) and designing specific protocols like the Constrained Application Protocol (CoAP), which have been widely accepted as drivers for the effective evolution of the IoT. This first wave of standardization can be considered successfully concluded, and we can assume that communication with and between SOs is no longer an issue. At this point, to favor the widespread adoption of the IoT, it is crucial to provide mechanisms that facilitate IoT data management and the development of services enabling a real interaction with things. Several reference IoT scenarios have real-time or predictable-latency requirements and deal with billions of devices collecting and sending enormous quantities of data. These features create a need for architectures specifically designed to handle this scenario, here denoted as “Big Stream”. In this thesis, a new Big Stream Listener-based Graph architecture is proposed. Another important step is to build more applications around the Web model, bringing about the Web of Things (WoT). As several IoT testbeds have focused on evaluating lower-layer communication aspects, this thesis also proposes a new WoT testbed that allows developers to work at a high level of abstraction, without worrying about low-level details. Finally, an innovative SO-driven User Interface (UI) generation paradigm for mobile applications in heterogeneous IoT networks is proposed to simplify interactions between users and things.
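
The Listener-based Graph idea lends itself to a brief illustration. The following Python sketch is my own minimal interpretation, not the thesis's actual implementation; the class, method and topic names (ListenerGraph, publish, "temperature/raw") are hypothetical. It shows processing nodes subscribing as listeners to streams and republishing derived streams, so data flows through a graph instead of being polled.

```python
from collections import defaultdict

class ListenerGraph:
    """Minimal listener-based graph: nodes register interest in topics,
    and each incoming datum is pushed only to the listeners of its topic."""

    def __init__(self):
        self._listeners = defaultdict(list)   # topic -> list of callables

    def listen(self, topic, callback):
        """Register a processing node (callback) as a listener on a topic."""
        self._listeners[topic].append(callback)

    def publish(self, topic, datum):
        """Push a datum produced by a Smart Object to all listeners of the topic."""
        for callback in self._listeners[topic]:
            callback(datum)

# Usage sketch: an averaging node listens to raw temperature readings
# and republishes a derived stream that a second listener consumes.
graph = ListenerGraph()
window = []

def average_temperature(reading):
    window.append(reading)
    if len(window) == 10:                       # emit a derived datum every 10 readings
        graph.publish("temperature/avg", sum(window) / len(window))
        window.clear()

graph.listen("temperature/raw", average_temperature)
graph.listen("temperature/avg", lambda avg: print(f"10-sample average: {avg:.2f} C"))

for i in range(20):
    graph.publish("temperature/raw", 20.0 + 0.1 * i)
```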

Relevance: 100.00%

Abstract:

Rapid economic development has occurred during the past few decades in China, with the Yangtze River Delta (YRD) area as one of the most progressive regions. Urbanization, industrialization, agricultural and aquaculture activities result in extensive production and application of chemicals. Organohalogen contaminants (OHCs) have been widely used, for example, as pesticides, flame retardants and plasticizers. They are persistent and bioaccumulative, and pose a potential threat to ecosystems and human health. However, limited research has been conducted in the YRD with respect to environmental exposure to chemicals. The main objective of this thesis is to investigate the contamination levels, distribution patterns and sources of OHCs in the YRD. Wildlife from different habitats is used to indicate the environmental pollution situation and to evaluate selected matrices for use in long-term biomonitoring, in order to determine the environmental stress the contamination may cause. In addition, a method is developed for dicofol analysis. Moreover, a specific effort is made to introduce statistical power analysis to assist in optimal sampling design. The results show extensive contamination of wildlife in the YRD by OHCs. High concentrations of chlorinated paraffins (CPs) are reported in wildlife, in particular in terrestrial species (e.g. the short-tailed mamushi snake and the peregrine falcon). Impurities and byproducts of pentachlorophenol products, i.e. polychlorinated diphenyl ethers (PCDEs) and hydroxylated polychlorinated diphenyl ethers (OH-PCDEs), are identified and reported for the first time in eggs from black-crowned night heron and whiskered tern. High concentrations of octachlorodibenzo-p-dioxin (OCDD) are determined in these samples. The toxic equivalents (TEQs) of polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs) are at mean levels of 300 and 520 pg TEQ g-1 lw (WHO 2005 TEQ) in eggs from the two bird species, respectively, two orders of magnitude higher than the European Union (EU) regulatory limit for chicken eggs. Also, a novel pattern of polychlorinated biphenyls (PCBs), with octa- to deca-CBs contributing as much as 20% of total PCBs, is reported in birds. The legacy POPs show a common characteristic, with relatively high levels of organochlorine pesticides (e.g. DDT, hexachlorocyclohexanes (HCHs) and Mirex), indicating historic applications. In contrast, rather low concentrations of industrial chemicals such as PCBs and polybrominated diphenyl ethers (PBDEs) are found. A refined and improved analytical method is developed to separate dicofol from its major decomposition product, 4,4’-dichlorobenzophenone, so that dicofol can be assessed as such. Statistical power analysis demonstrates that sampling of sedentary species should be consistently spread over a larger area to monitor temporal trends of contaminants in a robust manner. Overall, the thesis shows high CP and OCDD concentrations in wildlife, and the levels and patterns of OHCs in the YRD differ from those in other well-studied areas of the world, most likely because of the extensive production and use of chemicals in the region. The results strongly signal the need for biomonitoring research programs that reflect the current situation of the YRD. Such programs will contribute to the management of chemicals and the environment in the YRD, with the potential to grow into the human health sector and to expand to China as a whole.
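
For context, the WHO toxic equivalents quoted above are obtained by weighting each dioxin-like congener's concentration by its toxic equivalency factor (TEF); the standard formulation is stated here for the reader and is not quoted from the thesis:

```latex
% TEQ of a mixture of n dioxin-like congeners (WHO 2005 scheme)
% C_i   : concentration of congener i (e.g. pg g^{-1} lipid weight)
% TEF_i : toxic equivalency factor of congener i relative to 2,3,7,8-TCDD
\mathrm{TEQ} \;=\; \sum_{i=1}^{n} C_i \cdot \mathrm{TEF}_i
```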

Relevance: 100.00%

Abstract:

Since the advent of high-level programming languages (HLPLs) in the early 1950s, researchers have sought ways to automate the construction of HLPL compilers. To this end, a variety of Translator Writing Tools (TWTs) have been developed over the last three decades. However, only a very few of these tools have gained significant commercial acceptance. This thesis re-examines traditional compiler construction techniques, along with a number of previous TWTs, and proposes a new, improved tool for automated compiler construction called the Aston Compiler Constructor (ACC). This new tool allows the specification of complete compilation systems using a high-level, compiler-oriented specification notation called the Compiler Construction Language (CCL). This specification notation is based on a modern variant of Backus Naur Form (BNF) and an extended variant of Attribute Grammars (AGs). The implementation and processing of the CCL are discussed, along with an extensive CCL example. The CCL is shown to have extensive expressive power, to be convenient to use and highly readable, and thus to be a superior alternative both to earlier TWTs and to traditional compiler construction techniques. The execution performance of CCL specifications is evaluated and shown to be acceptable. A number of related areas are also addressed, including tools for the rapid construction of individual compiler components, and tools for the construction of compilation systems for multiprocessor operating systems and hardware. This latter area is expected to become of particular interest in future years due to the anticipated increased use of multiprocessor architectures.
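
To give a flavour of the kind of specification a BNF-with-attributes tool works from (a generic sketch of my own, not actual CCL syntax), the Python fragment below evaluates the toy grammar expr ::= term ('+' term)* by computing a synthesized value attribute for each production, in the way an attribute-grammar rule would describe it.

```python
# Minimal attributed-grammar sketch (not CCL):
#   expr ::= term ('+' term)*   with synthesized attribute expr.value
#   term ::= DIGIT              with attribute equation term.value = int(DIGIT)
# implemented as a recursive-descent evaluator.

def parse_expr(tokens, pos=0):
    """Returns (value, next_pos); value is the synthesized attribute of expr."""
    value, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "+":
        rhs, pos = parse_term(tokens, pos + 1)
        value += rhs          # attribute equation: expr.value = expr1.value + term.value
    return value, pos

def parse_term(tokens, pos):
    token = tokens[pos]
    if not token.isdigit():
        raise SyntaxError(f"expected digit, got {token!r}")
    return int(token), pos + 1  # attribute equation: term.value = int(DIGIT)

value, _ = parse_expr(list("1+2+3"))
print(value)                    # -> 6
```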

Relevance: 100.00%

Abstract:

There has been a resurgence of interest in the neural networks field in recent years, provoked in part by the discovery of the properties of multi-layer networks. This interest has in turn raised questions about the possibility of making neural network behaviour more adaptive by automating some of the processes involved. Prior to these particular questions, the process of determining the parameters and network architecture required to solve a given problem had been a time-consuming activity. A number of researchers have attempted to address these issues by automating these processes, concentrating in particular on the dynamic selection of an appropriate network architecture. The work presented here specifically explores the area of automatic architecture selection; it focuses upon the design and implementation of a dynamic algorithm based on the Back-Propagation learning algorithm. The algorithm constructs a single hidden layer as the learning process proceeds, using individual pattern error as the basis for unit insertion. This algorithm is applied to several problems of differing type and complexity and is found to produce near-minimal architectures that are shown to have a high level of generalisation ability.
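
A minimal sketch of the general idea of pattern-error-driven unit insertion follows. It is my own simplified illustration under assumed conditions (a numpy environment, no bias terms, arbitrary thresholds), not the thesis's algorithm: train with plain back-propagation, and whenever the worst per-pattern error remains above a threshold, insert another hidden unit and continue training.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_constructive(X, y, lr=0.5, epochs=2000, err_threshold=0.1, max_hidden=10):
    """Grow a single hidden layer while any individual pattern error stays high."""
    n_in = X.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, 1))      # start with one hidden unit
    W2 = rng.normal(scale=0.5, size=(1, 1))
    while True:
        for _ in range(epochs):                     # plain back-propagation (biases omitted)
            h = sigmoid(X @ W1)
            out = sigmoid(h @ W2)
            d_out = (out - y) * out * (1 - out)
            d_h = (d_out @ W2.T) * h * (1 - h)
            W2 -= lr * h.T @ d_out
            W1 -= lr * X.T @ d_h
        pattern_err = np.abs(sigmoid(sigmoid(X @ W1) @ W2) - y).max()
        if pattern_err < err_threshold or W1.shape[1] >= max_hidden:
            return W1, W2
        # worst individual pattern error still high: insert a new hidden unit
        W1 = np.hstack([W1, rng.normal(scale=0.5, size=(n_in, 1))])
        W2 = np.vstack([W2, rng.normal(scale=0.5, size=(1, 1))])

# XOR-style example: a single hidden unit is not enough, so units get inserted.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train_constructive(X, y)
print("hidden units used:", W1.shape[1])
```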

Relevance: 100.00%

Abstract:

The work described in this thesis is directed towards the reduction of noise levels in the Hoover Turbopower upright vacuum cleaner. The experimental work embodies a study of such factors as the application of noise source identification techniques, investigation of the noise-generating principles for each major source, and evaluation of noise-reducing treatments. It was found that the design of the vacuum cleaner had not been optimised from the standpoint of noise emission: important factors such as noise 'windows', isolation of vibration at the source, panel rattle, resonances and critical speeds had not been considered. Therefore, a number of experimentally validated treatments are proposed, and their noise reduction benefit, together with material and tooling costs, is presented. The solutions to the noise problems were evaluated on a standard Turbopower, and the sound power level of the cleaner was reduced from 87.5 dB(A) to 80.4 dB(A) at a cost of 93.6 pence per cleaner. The designers' lack of experience in noise reduction was identified as one of the factors behind the low priority given to noise during the design of the cleaner. Consequently, the fundamentals of acoustics, principles of noise prediction and absorption, and guidelines for good acoustical design were collated into a handbook and circulated at Hoover plc. Mechanical variations during production of the motor and the cleaner were also found to be important, causing a wide spread in the noise levels of the cleaners. The manufacturing processes were therefore briefly studied to identify the source of this variation, and recommendations for improvement are made. The noise of a product is quality-related, and a high level of noise is considered a bad feature. This project suggested that the noise level be used constructively, both as a production-line test to identify cleaners above a certain noise level and to promote the product by 'designing' the characteristics of the sound so that the appliance is pleasant to the user. The project showed that good noise control principles should be implemented early in the design stage. As yet there are no mandatory noise limits or noise-labelling requirements for household appliances. However, the literature suggests that noise-labelling is likely in the near future and that the requirement will be to display the A-weighted sound power level. The 'noys' scale of perceived noisiness was nevertheless found to be more appropriate for rating appliance noise: it is linear, so a sound that seems twice as noisy has twice the value in noys, and it also takes into account the presence of pure tones, which can cause annoyance even in the absence of a high overall noise level.
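
For context (my own back-of-the-envelope arithmetic, not a calculation from the thesis), the reported drop from 87.5 dB(A) to 80.4 dB(A) corresponds to roughly a five-fold reduction in A-weighted sound power, because the sound power level scale is logarithmic:

```latex
% Sound power level (re W_0 = 10^{-12} W) and the power ratio implied by a 7.1 dB reduction
L_W = 10 \log_{10}\!\left(\frac{W}{W_0}\right)\ \mathrm{dB}
\quad\Rightarrow\quad
\frac{W_{\text{before}}}{W_{\text{after}}} = 10^{(87.5 - 80.4)/10} = 10^{0.71} \approx 5.1
```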

Relevance: 100.00%

Abstract:

Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been related to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes in response to variations in the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to the implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts the focus away from implementation details to the whole view of the system and its runtime changes, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
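
To make the generation step concrete, here is a small sketch of my own (not the paper's toolchain; the component names, variation points and condition strings are hypothetical) showing how an architectural model annotated with variability points could be resolved into a concrete configuration, and re-resolved when the environment fluctuates.

```python
from dataclasses import dataclass

@dataclass
class ComponentVariant:
    name: str
    condition: str          # environment condition under which this variant is active

# Architectural model: each variation point lists its variants and activation conditions.
MODEL = {
    "VideoStreamer": [
        ComponentVariant("HighResStreamer", "bandwidth=high"),
        ComponentVariant("LowResStreamer", "bandwidth=low"),
    ],
    "Cache": [
        ComponentVariant("InMemoryCache", "memory=large"),
        ComponentVariant("DiskCache", "memory=small"),
    ],
}

def generate_configuration(model, environment):
    """Derive a concrete configuration (one variant per variation point)
    from the architectural model and the current environment state."""
    config = {}
    for point, variants in model.items():
        for variant in variants:
            key, value = variant.condition.split("=")
            if environment.get(key) == value:
                config[point] = variant.name
                break
        else:
            raise ValueError(f"no variant of {point} matches {environment}")
    return config

# Reconfiguration: regenerate whenever the environment changes.
print(generate_configuration(MODEL, {"bandwidth": "high", "memory": "small"}))
print(generate_configuration(MODEL, {"bandwidth": "low", "memory": "large"}))
```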

Relevance: 100.00%

Abstract:

With the demand for engineering graduates at what may be described as an unprecedented high, many universities find themselves facing significant levels of student attrition, with high "drop-out" rates being a major issue in engineering education. In order to address this, Aston University in the UK has radically changed its undergraduate engineering education curriculum, introducing capstone CDIO (Conceive, Design, Implement, Operate) modules for all first-year students studying Mechanical Engineering and Design. The introduction of CDIO is aimed at making project- and problem-based learning the norm. With this approach, learning and teaching in engineering purposefully aim to promote innovative thinking, equipping students with high-level problem-solving skills in a way that builds on theory whilst enhancing practical competencies and abilities. This chapter provides an overview of an Action Research study undertaken contemporaneously with the development, introduction, and administration of the first two semesters of CDIO. It identifies the challenges and benefits of the approach and concludes by arguing that, whilst CDIO is hard work for staff, it can make a real difference to students' learning experiences, thereby positively impacting retention. © 2012, IGI Global.

Relevance: 100.00%

Abstract:

Background and Objective: Medication non-compliance is a considerable obstacle to achieving a therapeutic goal, and can result in poorer healthcare outcomes, increased expenditure, wastage and potential for medication resistance. The UK Government’s Audit Commission’s publication ‘A Spoonful of Sugar’ [1] addresses these issues and promotes self-medication systems as a possible solution. The self-medication system within the Liver Transplant Unit (LTU) was implemented to induct patients onto new post-transplantation medication regimes ready for discharge. The system involves initial consultations with both the Liver Transplant Pharmacist and the Transplant Co-ordinator, supported with additional advice as and when necessary. Design: Following ethical approval, evaluation of the self-medication system for liver transplant patients was conducted between January and March 2004 via two methods: audit and structured post-transplantation interview. The audit enabled any discrepancies between current Hospital guidelines and LTU practices to be highlighted. Patient interviews generated a retrospective insight into patient acceptance of the self-medication system. Setting: LTU, Queen Elizabeth Hospital, Birmingham, England. Main Outcome Measures: LTU compliance with Hospital self-medication guidelines and patient insight into the self-medication system. Results: A total of seven patients were audited. Findings illustrated that self-medication by transplant patients is a complex process which was not fully addressed by the current Hospital self-medication guidelines. Twenty-three patients were interviewed, showing an overwhelmingly positive attitude towards participating in their own care and a high level of understanding of their individual medication regimes. Following a drug counselling session, 100% of patients understood why they were taking their medication and their doses, 95% understood how to take their medication and 85% were aware of potential side effects. Conclusions: From this pilot evaluation it can be stated that the LTU self-medication system is appreciated by patients and assists them in fully understanding their medication regimes. There appear to be no major defects in the system; however, areas such as communication barriers and ongoing internet education were identified as areas for possible future investigation. References: 1. Audit Commission. A spoonful of sugar – medicines management in NHS hospitals. London: Audit Commission; 2001.

Relevance: 100.00%

Abstract:

An expert system (ES) is a class of computer program developed by researchers in artificial intelligence. In essence, expert systems are programs made up of a set of rules that analyze information about a specific class of problems, provide an analysis of the problem and, depending upon their design, recommend a course of user action in order to implement corrections. ES are computerized tools designed to enhance the quality and availability of knowledge required by decision makers in a wide range of industries. Decision-making is important for the financial institutions involved due to the high level of risk associated with wrong decisions. The process of making decisions is complex and unstructured, and existing models for decision-making do not capture the learned knowledge well enough. In this study, we analyze the beneficial aspects of using ES in the decision-making process.
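
As a toy illustration of the rule structure described above (a generic sketch of my own, not a system from the study; the thresholds and field names are hypothetical), a small rule base can encode lending knowledge as condition/recommendation pairs and report every rule that fires for a given case.

```python
# Minimal rule-based sketch for a hypothetical credit decision.
RULES = [
    {"name": "low_score",
     "condition": lambda case: case["credit_score"] < 600,
     "advice": "High risk: decline or require collateral."},
    {"name": "high_debt_ratio",
     "condition": lambda case: case["debt_to_income"] > 0.4,
     "advice": "Elevated risk: reduce the offered amount."},
    {"name": "solid_applicant",
     "condition": lambda case: case["credit_score"] >= 700 and case["debt_to_income"] <= 0.3,
     "advice": "Low risk: approve at standard rate."},
]

def evaluate(case):
    """Return the advice of every rule whose condition holds for this case."""
    return [rule["advice"] for rule in RULES if rule["condition"](case)]

print(evaluate({"credit_score": 580, "debt_to_income": 0.45}))
print(evaluate({"credit_score": 720, "debt_to_income": 0.25}))
```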

Relevance: 100.00%

Abstract:

A new method is proposed for implementing dialogs based on graphical static scenes, using an ontology-based approach to user interface development. The main idea of the approach is to capture the information necessary for user interface development and implementation in ontologies, and then to generate the user interface from this high-level specification.
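
A hedged sketch of the general idea follows (my own illustration; the ontology fragment, property names and widget mapping are hypothetical, not the paper's notation): a small ontology describing the concept behind a dialog can be mapped mechanically to concrete UI widgets.

```python
# A toy "ontology" fragment describing a dialog about a Person concept.
ONTOLOGY = {
    "Person": {
        "name":     {"type": "string",  "label": "Full name"},
        "age":      {"type": "integer", "label": "Age", "min": 0, "max": 130},
        "employed": {"type": "boolean", "label": "Currently employed"},
    }
}

# High-level specification -> concrete widgets (here, plain text descriptions).
WIDGET_FOR_TYPE = {
    "string": "text field",
    "integer": "numeric spinner",
    "boolean": "checkbox",
}

def generate_ui(ontology, concept):
    """Generate a widget description for every property of the concept."""
    widgets = []
    for prop, spec in ontology[concept].items():
        widget = WIDGET_FOR_TYPE[spec["type"]]
        bounds = ""
        if "min" in spec and "max" in spec:
            bounds = f" ({spec['min']}..{spec['max']})"
        widgets.append(f"{spec['label']}: {widget}{bounds}   # bound to {concept}.{prop}")
    return widgets

for line in generate_ui(ONTOLOGY, "Person"):
    print(line)
```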

Relevance: 100.00%

Abstract:

This paper considers the neural-like growing networks used in an intelligent image-recognition system. All operations performed on the image at the pre-design stage, as well as the classification and storage of information about the images and their subsequent identification, are carried out exclusively by mechanisms of neural-like networks, without the use of complex algorithms requiring considerable volumes of computation. With appropriate hardware support, the neural network methods make it possible to increase considerably the effectiveness of solving this class of problems, while preserving high accuracy of results and a high level of responsiveness, both in training mode and in identification mode.

Relevance: 100.00%

Abstract:

For the facilities of a large-scale gas-transport company (GTC), a complex unified evolutionary approach is suggested, covering the basic building concepts, up-to-date technologies, models, methods and means used in the phases of design, adoption, maintenance and development of multilevel automated distributed control systems (ADCS). As a single methodological basis for the suggested approach, three basic concepts were worked out, containing the basic methodological principles and conceptual provisions for the creation of distributed control systems: systems of the lower level (automated control of technological processes based on up-to-date SCADA), of the middle level (operative-dispatch production control based on MES systems), and of the high level (business process control on the basis of complex automated ERP systems).
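
To visualise the three control levels named above, here is a small sketch of my own (the station names, readings and units are hypothetical, not from the paper) showing how data might be aggregated upward from SCADA-level readings through an MES-style shift report to an ERP-level total.

```python
from statistics import mean

# Lower level (SCADA): raw sensor readings per compressor station, e.g. flow in Mm3/day.
scada_readings = {
    "station_A": [41.8, 42.1, 41.9],
    "station_B": [38.5, 38.7, 38.6],
}

# Middle level (MES): operative-dispatch view, one aggregate per station per shift.
def mes_shift_report(readings):
    return {station: round(mean(values), 2) for station, values in readings.items()}

# High level (ERP): business view, total transported volume across the network.
def erp_daily_total(shift_report):
    return round(sum(shift_report.values()), 2)

shift_report = mes_shift_report(scada_readings)
print("MES shift report:", shift_report)
print("ERP daily total :", erp_daily_total(shift_report))
```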

Relevance: 100.00%

Abstract:

In this article, we review the means for visualizing syntax, semantics and source code for programming languages that support the procedural and/or object-oriented paradigm. We examine how the structure of source code in the structured and object-oriented programming styles has influenced different approaches to teaching them. We maintain a thesis, valid for the object-oriented programming paradigm, that the activities of designing and programming classes are carried out by the same specialist, and that the training of this specialist should therefore include design as well as programming skills and knowledge of modeling abstract data structures. We pose the question of how the high level of abstraction in the object-oriented paradigm should be presented in a simple model at the design stage, so that the complexity at the programming stage stays low and is easily learnable. We answer this question by building models in UML notation, taking a concrete example from teaching practice that includes programming techniques for inheritance and polymorphism.
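
As a classroom-style illustration of the two techniques named above (a minimal example of my own, not one of the article's UML models), a base class declares an operation that subclasses specialise through inheritance, while client code relies on polymorphic dispatch.

```python
from abc import ABC, abstractmethod
from math import pi

class Shape(ABC):
    """Abstract base class: declares the interface the subclasses specialise."""
    @abstractmethod
    def area(self) -> float: ...

class Rectangle(Shape):
    def __init__(self, width: float, height: float):
        self.width, self.height = width, height
    def area(self) -> float:            # inheritance: overrides the abstract operation
        return self.width * self.height

class Circle(Shape):
    def __init__(self, radius: float):
        self.radius = radius
    def area(self) -> float:
        return pi * self.radius ** 2

# Polymorphism: the client iterates over Shape objects without knowing their concrete class.
shapes: list[Shape] = [Rectangle(3, 4), Circle(1)]
for shape in shapes:
    print(type(shape).__name__, round(shape.area(), 2))
```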

Relevance: 100.00%

Abstract:

Purpose – The purpose of this paper is to examine the challenges and potential of big data in heterogeneous business networks and to relate these to an implemented logistics solution. Design/methodology/approach – The paper establishes an overview of challenges and opportunities of current significance in the area of big data, specifically in the context of transparency and processes in heterogeneous enterprise networks. Within this context, the paper presents how existing components and purpose-driven research were combined for a solution implemented in a nationwide network for less-than-truckload consignments. Findings – Aside from providing an extended overview of today’s big data situation, the findings have shown that technical means and methods available today can comprise a feasible process transparency solution in a large heterogeneous network where legacy practices, reporting lags and incomplete data exist, yet processes are sensitive to inadequate policy changes. Practical implications – The means introduced in the paper were found to be of utility value in improving process efficiency, transparency and planning in logistics networks. The particular system design choices in the presented solution allow an incremental introduction or evolution of resource handling practices, incorporating existing fragmentary, unstructured or tacit knowledge of experienced personnel into the theoretically founded overall concept. Originality/value – The paper extends the previous high-level view of the potential of big data and presents new applied research and development results in a logistics application.

Relevance: 100.00%

Abstract:

The most fundamental and challenging function of government is the effective and efficient delivery of services to local taxpayers and businesses. Counties, once known as the "dark continent" of American government, have recently become a major player in the provision of services. Population growth and suburbanization have increased service demands, while the counties' role as service provider to incorporated residents has also expanded due to additional federal and state mandates. County governments are under unprecedented pressure and scrutiny to meet citizens' and elected officials' demands for high-quality, equitable delivery of services at the lowest possible cost, while contending with anti-tax sentiments, greatly decreased state and federal support, and exceptionally costly and complex health and public safety problems. This study tested the reform government theory proposition that reformed structures of county government positively correlate with efficient service delivery. A county government reform index was developed for this dissertation, comprising form of government, home-rule status, method of election, number of government jurisdictions, and number of elected officials. The county government reform index and a measure of relative structural fragmentation were used to assess their impact on two measures of service output: mean county road pavement condition and county road maintenance expenditures. The study's multi-level design triangulated results from different data sources and methods of analysis. Data were collected from semi-structured interviews of county officials, secondary archival sources, and a survey of 544 elected and appointed officials from Florida's 67 counties. The results from the three sources of data converged in finding that reformed Florida counties are more likely than unreformed counties to provide better road service and to spend less on road maintenance; the same was found for unfragmented Florida counties. Because both the county government reform index and the fragmentation variables were specified acknowledging reform theory as well as elements of the public-choice model, the results help explain contradictory findings in the urban service research. Therefore, as the corroborated findings of this dissertation suggest, reformed as well as unfragmented counties are better providers of road maintenance service and provide it in a less costly manner. These findings hold even though the variables were specified to capture theoretical arguments from both the consolidated and the public-choice theories, suggesting a way to move the debate beyond the consolidated-fragmented dichotomy of urban governance.