961 results for system implementation


Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

This analysis estimates several economic benefits derived from national implementation of the National Oceanic and Atmospheric Administration's Physical Oceanographic Real-Time System (PORTS®) at the 175 largest ports in the United States. Significant benefits were observed owing to: (1) lower commercial marine accident rates and resultant reductions in morbidity, mortality and property damage; (2) reduced pollution remediation costs; and (3) increased productivity associated with operation of more fully loaded commercial vessels. Evidence also suggested additional benefits from heightened commercial and recreational fish catch and diminished recreational boating accidents. Annual gross benefits from 58 current PORTS® locations exceeded $217 million, with an additional $83 million possible if installed at the largest remaining 117 ports in the United States. Over the ten-year economic life of PORTS® instruments, the present value for installation at all 175 ports could approach $2.5 billion.
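As a rough check on that headline figure, a minimal sketch of the arithmetic follows. The level $300 million annual benefit stream ($217M + $83M) comes from the abstract; the 3.5% discount rate is an assumption for illustration, not a figure the abstract states.

```python
# Present value of a level annuity: PV = A * (1 - (1 + r)**-n) / r
annual_benefits = 217e6 + 83e6  # USD/yr: current 58 sites plus remaining 117 ports
rate = 0.035                    # assumed discount rate (not stated in the abstract)
years = 10                      # stated economic life of PORTS instruments

pv = annual_benefits * (1 - (1 + rate) ** -years) / rate
print(f"PV = ${pv / 1e9:.2f} billion")  # about $2.49 billion, consistent with the $2.5B cited
```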

Relevance:

30.00%

Publisher:

Abstract:

China's water pollution control law stipulates the Water Pollution Discharge Permit (WPDP) institution and authorizes the State Council to draft the regulations for its implementation and enforcement. However, to date, national regulations have not been established, and the permitting system has operated according to provincial regulations. In contrast, the effluent permit system in the USA has operated for more than 40 years with relatively successful results. The CWA/NPDES experience therefore offers a valuable reference for China's water permit system.

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT

Title of Document: AN ANALYSIS OF THE IMPLEMENTATION AND PERCEIVED EFFECTIVENESS OF THE SCHOOLMAX FAMILY PORTAL. Warren Wesley Watts, Doctor of Education, 2015. Directed By: Margaret J. McLaughlin, Ph.D., Department of Counseling, Higher Education and Special Education.

School districts have spent millions of dollars implementing student information systems that offer family portals with web-based access to parents and students. One of the main purposes of these systems is to improve school-to-home communication. Research has shown that when school-to-home communication is implemented effectively, parent involvement improves and student achievement increases (Epstein, 2001).

The purpose of the study was to (a) understand why parents used or refrained from using the family portal and (b) determine what barriers to use might exist. To this end, this descriptive study identified the information parent users accessed in the SchoolMAX family portal, determined how frequently parents accessed the portal, and ascertained whether parents perceived an increase in communication with their children about academic matters after they began accessing the portal. Finally, the study sought to identify whether barriers existed that prevented parents from using the family portal.

The inquiry employed three data sources: (a) a survey sent electronically to 19,108 parents who registered online for the SchoolMAX family portal; (b) SchoolMAX portal usage data from the student information system for the period from January 1, 2015 to June 30, 2015; and (c) a paper survey sent to 691 parents of students who had never used the SchoolMAX family portal, drawn from one elementary school, one middle school and one high school that were representative of other schools in the district.

Survey results indicated that parents at all grade levels used the family portal. Usage data confirmed that approximately 19% of students had parents who monitored their progress through the family portal: approximately 25% of students in secondary schools (6th-12th grade) and 16% of students in elementary schools. Of the wide menu of resources available through the SchoolMAX family portal, parents used three areas most frequently: attendance, daily grades, and report cards. Approximately 70% of parents responded that communication with their children about academic matters had improved since they started using the SchoolMAX family portal, and 90% of parents responded that the portal was an effective or somewhat effective tool. Ninety-two percent of parents also reported speaking with their children about academics at least two to three times per week. Parents expressed interest in the addition of further information to the portal; the top three requested additions were homework assignments, high-stakes test scores, and graduation requirements.

Due to the low response rate of the parent non-user survey, potential barriers to using the SchoolMAX family portal could not be addressed in this study; this issue may be a useful topic for future research.

Keywords: school-to-home communication, student information systems, family portal, parent portal

Relevance:

30.00%

Publisher:

Abstract:

Image processing offers unparalleled potential for traffic monitoring and control. For many years engineers have attempted to perfect the art of automatic data abstraction from sequences of video images. This paper outlines a research project undertaken at Napier University by the authors in the field of image processing for automatic traffic analysis. A software-based system implementing TRIP algorithms to count cars and measure vehicle speed has been developed by members of the Transport Engineering Research Unit (TERU) at the University. The TRIP algorithm has been ported and evaluated on an IBM PC platform with a view to hardware implementation of the pre-processing routines required for vehicle detection. Results show that a software-based traffic counting system is realisable for single-window processing. Because of the high volume of data that must be processed for full frames or multiple lanes, real-time operation is limited, and dedicated hardware must therefore be designed. The paper outlines a hardware design for implementation of inter-frame and background differencing, background updating and shadow removal techniques. Preliminary results showing the processing time and counting accuracy for the routines implemented in software are presented, and a real-time hardware pre-processing architecture is described.
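The abstract names background differencing and background updating as the core pre-processing steps. The sketch below illustrates those two steps in their generic form; it is not the TRIP algorithm itself, and the learning rate and threshold are assumed values.

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average background update (alpha is an assumed learning rate)."""
    return (1 - alpha) * background + alpha * frame

def detect_vehicles(frame, background, threshold=30):
    """Background differencing: mark pixels that differ strongly from the background."""
    diff = np.abs(frame.astype(np.float32) - background)
    return diff > threshold  # boolean foreground mask

# Toy usage: a static 'road' scene with a bright 'vehicle' block in the window.
background = np.full((48, 64), 100, dtype=np.float32)
frame = background.copy()
frame[20:30, 10:25] = 200  # simulated vehicle
mask = detect_vehicles(frame, background)
print("foreground pixels:", int(mask.sum()))
background = update_background(background, frame)
```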

Relevance:

30.00%

Publisher:

Abstract:

There is a long history of debate around mathematics standards, reform efforts, and accountability. This research identified ways that national expectations and context drive local implementation of mathematics reform efforts, and identified the external and internal factors that shape teachers' acceptance of, or resistance to, policy implementation at the local level, thereby adding to the body of knowledge about acceptance and resistance to policy implementation efforts. This case study involved the analysis of documents to provide a chronological perspective, assess the current state of the District's mathematics reform, and determine the District's readiness to implement the Common Core Curriculum. The school system in question has continued to struggle with meeting the needs of all students in Algebra 1. The results of this case study should therefore be useful to the District's leaders, as they include the compilation and analysis of a decade's worth of data specific to Algebra 1.

Relevance:

30.00%

Publisher:

Abstract:

Artificial Immune Systems have been used successfully to build recommender systems for film databases. In this research, an attempt is made to extend this idea to web site recommendation. A collection of more than 1000 individuals' web profiles (alternatively called preferences, favourites or bookmarks files) will be used. URLs will be classified using the DMOZ (Directory Mozilla) database of the Open Directory Project as our ontology. This classification, rather than the actual addresses, will then be used as the data for the Artificial Immune System. The first attempt will involve using a simple classification code number coupled with the number of pages within that classification code. However, this implementation does not make use of the hierarchical tree-like structure of DMOZ. Consideration will then be given to the construction of a similarity measure for web profiles that makes use of this hierarchical information to build a better-informed Artificial Immune System.
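The hierarchical similarity measure the authors propose is not specified in the abstract. One common way to exploit a tree-structured ontology like DMOZ is to score two categories by the depth of their deepest shared ancestor; the sketch below assumes that approach, and the category paths and profile format are invented for illustration.

```python
def shared_depth(path_a, path_b):
    """Depth of the deepest common ancestor of two category paths."""
    depth = 0
    for a, b in zip(path_a, path_b):
        if a != b:
            break
        depth += 1
    return depth

def profile_similarity(profile_a, profile_b):
    """Average best-match similarity from profile_a's categories to profile_b's."""
    scores = []
    for pa in profile_a:
        best = max(shared_depth(pa, pb) / max(len(pa), len(pb)) for pb in profile_b)
        scores.append(best)
    return sum(scores) / len(scores)

# Toy profiles as lists of DMOZ-style category paths (hypothetical examples).
user1 = [("Computers", "AI", "Machine_Learning"), ("Recreation", "Travel")]
user2 = [("Computers", "AI", "Agents"), ("Recreation", "Travel", "Budget")]
print(f"similarity: {profile_similarity(user1, user2):.2f}")
```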

Relevance:

30.00%

Publisher:

Abstract:

The use of artificial immune systems in intrusion detection is an appealing concept for two reasons. Firstly, the human immune system provides the human body with a high level of protection from invading pathogens, in a robust, self-organised and distributed manner. Secondly, current techniques used in computer security are not able to cope with the dynamic and increasingly complex nature of computer systems and their security. It is hoped that biologically inspired approaches in this area, including the use of immune-based systems, will be able to meet this challenge. Here we collate the algorithms used, the development of the systems and the outcome of their implementation. The paper provides an introduction and review of the key developments within this field, in addition to making suggestions for future research.
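The abstract does not name particular algorithms, but negative selection is one of the classic immune-inspired detection schemes covered in this literature. The following toy sketch illustrates the general idea; the bit-string encoding, the r-contiguous matching rule, and all parameters are assumptions for illustration.

```python
import random

def matches(detector, sample, r=7):
    """r-contiguous-bits rule: detector fires if r consecutive positions agree."""
    run = 0
    for d, s in zip(detector, sample):
        run = run + 1 if d == s else 0
        if run >= r:
            return True
    return False

def generate_detectors(self_set, n_bits=16, n_detectors=30):
    """Negative selection: keep only random detectors that match no 'self' sample."""
    detectors = []
    while len(detectors) < n_detectors:
        candidate = tuple(random.randint(0, 1) for _ in range(n_bits))
        if not any(matches(candidate, s) for s in self_set):
            detectors.append(candidate)
    return detectors

random.seed(0)
self_set = [tuple(random.randint(0, 1) for _ in range(16)) for _ in range(10)]
detectors = generate_detectors(self_set)
probe = tuple(random.randint(0, 1) for _ in range(16))  # stands in for observed traffic
print("flagged as anomalous:", any(matches(d, probe) for d in detectors))
```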

Relevance:

30.00%

Publisher:

Abstract:

For various reasons, many Algol 68 compilers do not directly implement the parallel processing operations defined in the Revised Algol 68 Report. It is still possible, however, to perform parallel processing, multitasking and simulation, provided that the implementation permits the creation of a master routine for the coordination and initiation of processes under its control. The package described here is intended for real-time applications and runs in conjunction with the Algol 68R system; it extends and develops the original Algol 68RT package, which was designed for use with multiplexers at the Royal Radar Establishment, Malvern. The facilities provided, in addition to the synchronising operations, include an interface to an ICL Communications Processor, enabling the abstract processes to be realised as the interaction of several teletypes or visual display units with a real-time program providing a useful service.
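The key structural idea here, a master routine that initiates and coordinates subordinate processes, translates readily into modern terms. The sketch below is a present-day analogy in a different language, not the Algol 68RT interface; the queue-based coordination and all names are illustrative assumptions.

```python
import queue
import threading

def master(tasks, n_workers=3):
    """Master routine: initiates workers and coordinates them through a shared queue
    (a modern analogy to the package's master/process structure, not its actual API)."""
    work = queue.Queue()
    for t in tasks:
        work.put(t)

    def worker(name):
        while True:
            try:
                t = work.get_nowait()
            except queue.Empty:
                return
            print(f"{name} handling {t}")  # stands in for a teletype/VDU interaction
            work.task_done()

    threads = [threading.Thread(target=worker, args=(f"proc-{i}",)) for i in range(n_workers)]
    for th in threads:
        th.start()
    work.join()  # master waits until all coordinated work completes

master(["console-1 input", "console-2 input", "display update"])
```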

Relevance:

30.00%

Publisher:

Abstract:

This portfolio thesis describes work undertaken by the author under the Engineering Doctorate program of the Institute for System Level Integration, carried out in conjunction with the sponsor company Teledyne Defence Limited. A radar warning receiver is a device used to detect and identify radar emissions. Such receivers were originally developed during the Second World War and are found today on a variety of military platforms as part of their defensive systems. Teledyne Defence has designed and built components and electronic subsystems for the defence industry since the 1970s. This thesis documents part of the work carried out to create Phobos, Teledyne Defence's first complete radar warning receiver. Phobos was designed to be the first low-cost radar warning receiver, made possible by the reuse of existing Teledyne Defence products, commercial off-the-shelf hardware and advanced UK government algorithms. The challenges of this integration are described and discussed, with detail given of the software architecture and the development of the embedded application. Performance of the embedded system as a whole is described and qualified within the context of a low-cost system.

Relevance:

30.00%

Publisher:

Abstract:

Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and to the database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation.

Modern object-oriented databases provide features that help programmers deal with object persistence, as well as related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns; however, none of this research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible.

Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. It supports multiple versions of a class structure by applying a class versioning strategy, enabling bidirectional application compatibility among versions of each class structure: the database structure can be updated while earlier applications continue to work, as do later applications that know only the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and have motivated our research.

To test the feasibility of the approach, a prototype was developed. It is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect, in the aspect-oriented sense. Objects do not require the extension of any superclass, the implementation of an interface, or a particular annotation. Parametric type classes are also correctly handled by the framework; however, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment that supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence: programmers can update an application's class structure because the framework will produce a new version for it at the database metadata layer. Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism is extended, keeping the framework oblivious to this problem as well.

The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at both the application and database levels. The meta-model itself was also benchmarked in terms of complexity and agility; compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other tests were carried out to validate the robustness of the prototype and meta-model, using an OO7 small-size database chosen for its data model complexity. Since the developed prototype offers some features not observed in other known systems, performance benchmarks were not possible; however, the developed benchmark is now available for future performance comparisons with equivalent systems.

To test our approach in a real-world scenario, we developed a proof-of-concept application without any persistence mechanisms, added those mechanisms using our framework and minor changes to the application's source code, and then tested the application in a schema evolution scenario. This real-world experience showed that applications remain oblivious to persistence and database evolution, and in this case study our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single Java Virtual Machine concurrency model are the major limitations found in the framework.
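The abstract describes class versioning with bidirectional compatibility but leaves the mechanism abstract. A minimal sketch of the general idea follows: stored objects carry a version tag and are adapted, step by step, to the version the running application expects. All names and the record format are invented for illustration; this is not the thesis framework's API.

```python
# Registry of converters between adjacent versions of a class's stored form.
ADAPTERS = {}

def adapter(cls_name, from_v, to_v):
    """Register a converter between two versions of a class's stored form."""
    def register(fn):
        ADAPTERS[(cls_name, from_v, to_v)] = fn
        return fn
    return register

@adapter("Person", 1, 2)
def person_v1_to_v2(data):
    # Hypothetical change: v2 splits 'name' into 'first'/'last';
    # the old field is kept so v1 readers still work (bidirectional compatibility).
    first, _, last = data["name"].partition(" ")
    return {**data, "first": first, "last": last, "_version": 2}

def load(record, target_version):
    """Adapt a stored record, one version step at a time, to what the app expects."""
    data = dict(record)
    while data["_version"] < target_version:
        step = ADAPTERS[(data["_class"], data["_version"], data["_version"] + 1)]
        data = step(data)
    return data

stored = {"_class": "Person", "_version": 1, "name": "Ada Lovelace"}
print(load(stored, target_version=2))
```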

Relevance:

30.00%

Publisher:

Abstract:

Development of no-tillage (NT) farming has revolutionized agricultural systems by allowing growers to manage greater areas of land with reduced energy, labour and machinery inputs, while controlling erosion, improving soil health and reducing greenhouse gas emissions. However, NT farming systems have resulted in a build-up of herbicide-resistant weeds, an increased incidence of soil- and stubble-borne diseases, and enrichment of nutrients and carbon near the soil surface. Consequently, there is increased interest in the use of occasional tillage (termed strategic tillage, ST) to address such emerging constraints in otherwise-NT farming systems. Decisions around ST use will depend upon the specific issues present on the individual field or farm, and on the profitability and effectiveness of the available management options. This paper explores some of the issues with the implementation of ST in NT farming systems. The impact of contrasting soil properties, the timing of the tillage and the prevailing climate exert a strong influence on the success of ST. Decisions around the timing of tillage are complex and depend on the interactions between soil water content and the purpose for which the ST is intended: the soil needs to be at the right water content before executing any tillage, while the objective of the ST will influence the frequency and type of tillage implement used. The use of ST in long-term NT systems will depend on factors associated with system costs and profitability, soil health and environmental impacts. For many farmers, maintaining farm profitability is a priority, so economic considerations are likely to be the primary factor dictating adoption. However, impacts on soil health and the environment, especially the risk of erosion and the loss of soil carbon, will also influence a grower's choice to adopt ST, as will the impact on soil moisture reserves in rainfed cropping systems.

Relevance:

30.00%

Publisher:

Abstract:

The use of artificial immune systems in intrusion detection is an appealing concept for two reasons. Firstly, the human immune system provides the human body with a high level of protection from invading pathogens, in a robust, self-organised and distributed manner. Secondly, current techniques used in computer security are not able to cope with the dynamic and increasingly complex nature of computer systems and their security. It is hoped that biologically inspired approaches in this area, including the use of immune-based systems, will be able to meet this challenge. Here we review the algorithms used, the development of the systems and the outcome of their implementation. We provide an introduction and analysis of the key developments within this field, in addition to making suggestions for future research.

Relevance:

30.00%

Publisher:

Abstract:

Dyslipidaemia is one of the major cardiovascular risk factors. It can be due to primary causes (i.e. monogenic, characterized by a single gene mutation, or dyslipidaemia of polygenic/environmental causes), or secondary to specific disorders such as obesity, diabetes mellitus or hypothyroidism. Monogenic patients present the most severe phenotype and so need to be identified at an early age, so that pharmacological treatment can be implemented to decrease cardiovascular risk. However, the majority of hyperlipidaemic patients most likely have a polygenic disease that can largely be controlled simply by the implementation of a healthy lifestyle. Thus, the distinction between monogenic and polygenic dyslipidaemia is important for prompt diagnosis, cardiovascular risk assessment, counselling and treatment. Besides the established biomarkers such as LDL, apoB and the apoB/apoA-I ratio, other promising biomarkers for clinical differentiation between dyslipidaemias, though still requiring further research, are apoE, sdLDL, apoC-2 and apoC-3. However, none of these biomarkers can explain the complex lipid profile of the majority of these patients.