876 results for Computing and software systems


Relevance:

100.00%

Publisher:

Abstract:

This report was developed to provide summary information that allows state agency staff, practitioners, and juvenile justice system officials to access specific sections of Iowa's Three-Year Plan. It includes the "Service Network" section of Iowa's 2006 Juvenile Justice and Delinquency Prevention Act formula grant Three-Year Plan. The complete Three-Year Plan serves as Iowa's application for Juvenile Justice and Delinquency Prevention Act formula grant funding. The information in this report gives an overview of some of the systems and services that relate to Iowa's delinquency and CINA systems, including substance abuse, mental health, alternative or special education, and job training services.


The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used for both industrial automation and business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control, and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA, and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advance of Moore's law, and these developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, RISC processor architecture design is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface, and application market, which gives customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-per-click marketing has changed the way application development is compensated: freeware, ad-based, or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the SoC- and software-platform-based disruptions in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, since their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels previously occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research into cost-competitiveness efforts in the captive outsourcing of engineering, research and development, and second through research into process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents, and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions.
The rise of de facto standards such as the IBM PC, Unix, Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung, and others, has created new markets for personal computers, smartphones, and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related control point and business model changes. This trend will inevitably continue, but the transition to commoditized industrial automation will not happen in the near future.


This project analyzes the characteristics and spatial distributions of motor vehicle crash types in order to evaluate the degree and scale of their spatial clustering. Crashes result from a variety of vehicle, roadway, and human factors and thus vary in their clustering behavior. Some crash types cluster at a variety of scales, from the intersection level to the corridor level to the area level; conversely, other crash types are less linked to geographic factors and are more spatially "random." The degree and scale of clustering have implications for strategies to promote transportation safety. In this project, Iowa's crash database, geographic information systems, and recent advances in spatial statistics methodologies and software tools were used to analyze the degree and spatial scale of clustering for several crash types within the counties of the Iowa Northland Regional Council of Governments. A statistical measure called the K function was used to analyze the clustering behavior of crashes. Several methodological issues related to applying this spatial statistical technique to motor vehicle crashes on a road network were identified and addressed. These methods facilitated the identification of crash clusters at appropriate scales of analysis for each crash type. This clustering information is useful for improving transportation safety, both through focused countermeasures directly linked to crash causes and the spatial extent of identified problem locations, and through the identification of less location-based crash types better suited to non-spatial countermeasures. The results of the K function analysis point to its usefulness in identifying the degree and scale at which crashes cluster, or do not cluster, relative to each other. Moreover, for many individual crash types, different patterns and processes, and potentially different countermeasures, appeared at different scales of analysis. This finding highlights the importance of scale considerations in problem identification and countermeasure formulation.
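The K function referred to above is Ripley's K, which compares the number of crash pairs observed within each distance t against what complete spatial randomness (CSR) would produce. A minimal planar sketch, with no edge correction and without the network-constrained variant a road-network study like this one would actually require; the crash coordinates and study area below are hypothetical:

```python
import math

def ripleys_k(points, t, area):
    """Ripley's K at distance t for 2-D points in a region of given area:
    K(t) = (area / n^2) * #{ordered pairs (i, j), i != j : dist(i, j) <= t}.
    Under complete spatial randomness K(t) ~ pi * t^2; values well above
    that indicate clustering at scale t."""
    n = len(points)
    close_pairs = 0
    for i, (xi, yi) in enumerate(points):
        for j, (xj, yj) in enumerate(points):
            if i != j and math.hypot(xi - xj, yi - yj) <= t:
                close_pairs += 1
    return area * close_pairs / (n * n)

# Hypothetical example: three crashes near one intersection plus one far away.
crashes = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (10.0, 10.0)]
k = ripleys_k(crashes, t=1.5, area=100.0)   # 37.5
csr = math.pi * 1.5 ** 2                    # ~7.07 expected under CSR
```

Evaluating K over a range of t values is what reveals the scale of clustering: a crash type may exceed the CSR baseline only at intersection-scale distances, only at corridor scale, or not at all.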


The objective of this work was to evaluate the change in soil C and N mineralization due to successive pig slurry applications under conventional tillage (CT) and no-tillage (NT) systems. The experiment was carried out on a clayey Latossolo Vermelho eutrófico (Rhodic Eutrudox) in Palotina, PR, Brazil. Increasing doses of pig slurry (0, 30, 60 and 120 m³ ha-1 per year) were applied in both tillage systems, with three replicates. Half of the pig slurry was applied before summer soil preparation, and the other half before the winter crop season. The areas were cultivated with soybean (Glycine max L.) and maize (Zea mays L.) in the summers of 1998 and 1999, respectively, and with wheat (Triticum sativum Lam.) in the winters of both years. Soil samples were collected at 0-5, 5-10, and 10-20 cm depths. Under both the CT and NT systems, pig slurry application increased C and N mineralization; however, increasing pig slurry additions decreased the C-to-N mineralization ratio. C and N mineralization was greater under the NT system than under the CT system.


Nomadic workers travel often between different work sites, work mainly outside their regular workplace, and often require access to information stored electronically in corporate information systems. While working in field conditions, communication with an information system can be achieved using mobile technology, i.e. mobile devices and wireless communication. This master's thesis investigates the use of mobile technology to assist nomadic field workers in their tasks. First, different mobile technologies are compared and the constraints that characterize mobile computing are explained. In the practical part of the thesis, client software is developed for a mobile device. The software allows a nomadic construction worker to identify concrete elements and to retrieve and update information about them. The characteristics of mobile computing and their effect on usability are taken into account in implementing the client software, which is designed to be as easy to use as possible.


The software development industry is constantly evolving. The rise of agile methodologies in the late 1990s, and new development tools and technologies, demand growing attention from everybody working in this industry. Organizations have, however, had a mixture of various processes and different process languages, since no standard software development process language has been available. A promising process meta-model, the Software & Systems Process Engineering Meta-Model (SPEM) 2.0, has recently been released. It is applied by tools such as Eclipse Process Framework Composer, which is designed for implementing and maintaining processes and method content and aims to support a broad variety of project types and development styles. This thesis presents the concepts of software processes, models, traditional and agile approaches, method engineering, and software process improvement. Some of the best-known methodologies (RUP, OpenUP, OpenMethod, XP and Scrum) are also introduced, with a comparison between them. The main focus is on the Eclipse Process Framework and SPEM 2.0: their capabilities, usage, and modeling. As a proof of concept, I present a case study of modeling OpenMethod with EPF Composer and SPEM 2.0. The results show that the new meta-model and tool make it possible to easily manage method content, publish versions with customized content, and connect project tools (such as MS Project) with the process content. The software process modeling also acts as a process improvement activity.


We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment, socio-economic interactions and co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems.


Physical and psychological stress cause different patterns of change in the fluorescence intensity of nigral and tuberoinfundibular DA neurons, which point to changes in neuronal activity. To investigate possible interactions between alpha-MSH (alpha-melanotropin) and DA systems in stress, systemic and intraventricular injections of antiserum against alpha-MSH were made. The functional state of DA neurons was assessed by histochemical microfluorimetry, and hormone levels were measured by radioimmunoassay. Antiserum against alpha-MSH was found to affect the functional state of DA neurons, but only through the intravenous route. Under physical stress, i.v. injection of antiserum against alpha-MSH was accompanied by elevated activity in the DA neurons of the substantia nigra, whereas an intraventricular injection of the same antiserum was ineffective. In psychological stress, an effect was again seen only after intravenous injection of antiserum against alpha-MSH; in this situation, activity in the DA cell groups of the substantia nigra, ventral tegmental area, and tubero-infundibular system was increased after antiserum injection. Possible influences of the manipulations themselves were checked, and certain effects that depended on the experimental situation were noted. Our data suggest a modulatory influence of circulating alpha-MSH on the functional state of central DA systems.


This paper presents the first results of an ongoing research project on human-environment interactions in the Montseny Massif. Our work sets out to integrate two research lines in the study area: archaeological and archaeo-morphological surveys in the lower part of the mountains, in order to characterize the evolution of the settlements and field systems; and the geological and geomorphological characterization of the slope and terrace deposits in relation to the field systems and archaeological data. The first results point to intensive occupation of these inland areas during the Iberian and Roman periods, while post-Roman sediments show different erosion processes.


This thesis surveys temporal and stochastic software reliability models and examines a few of the models in practice. The theoretical part covers the key definitions and metrics used to describe and assess software reliability, together with descriptions of the models themselves. Two groups of software reliability models are presented: the first group consists of risk-based (hazard-rate) models, and the second comprises models based on fault "seeding" and fault significance. The empirical part contains the descriptions and results of the experiments, which were carried out with three models from the first group: the Jelinski-Moranda model, the first geometric model, and a simple exponential model. The purpose of the experiments was to study how the distribution of the input data affects the performance of the models, and how sensitive the models are to changes in the amount of input data. The Jelinski-Moranda model proved the most sensitive to the distribution because of convergence problems, and the first geometric model the most sensitive to changes in the amount of data.
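The Jelinski-Moranda model mentioned above assumes a fixed initial number of faults N, each contributing equally to the failure rate, so the hazard after the (i-1)-th fix is phi * (N - i + 1). A minimal sketch of the maximum-likelihood estimate of N from inter-failure times, solving the standard MLE equation by bisection; the inter-failure times below are made up, and a finite estimate exists only when the data show reliability growth, which is one source of the convergence problems noted above:

```python
def jm_total_faults(times, hi=1e7, iters=200):
    """Maximum-likelihood estimate of the total fault count N in the
    Jelinski-Moranda model, given inter-failure times t_1..t_n.
    N solves  sum_{i=1..n} 1/(N - i + 1) = n*T / (N*T - S),
    where T = sum t_i and S = sum (i-1)*t_i.  Returns None when no
    finite solution exists (the data show no reliability growth)."""
    n = len(times)
    T = sum(times)
    S = sum(i * t for i, t in enumerate(times))  # (i-1)*t_i with 1-based i

    def g(N):
        return sum(1.0 / (N - i) for i in range(n)) - n * T / (N * T - S)

    lo = n + 1e-9  # N must exceed the number of faults already observed
    if g(lo) <= 0 or g(hi) >= 0:
        return None  # root not bracketed: no finite MLE
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        lo, hi = (mid, hi) if g(mid) > 0 else (lo, mid)
    return (lo + hi) / 2.0

# Made-up data with growing inter-failure times (reliability growth):
n_hat = jm_total_faults([10.0, 20.0, 30.0, 40.0, 50.0])
```

With the estimate of N in hand, phi follows as n / (N*T - S), and the expected time to the next failure is 1 / (phi * (N - n)), which is how such models are used to decide when testing can stop.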


This paper describes an evaluation framework that allows a standardized and quantitative comparison of IVUS lumen and media segmentation algorithms. The framework was introduced at the MICCAI 2011 Computing and Visualization for (Intra)Vascular Imaging (CVII) workshop, comparing the results of the eight teams that participated. We describe the available database, comprising multi-center, multi-vendor, and multi-frequency IVUS datasets, their acquisition, the creation of the reference standard, and the evaluation measures. The approaches address segmentation of the lumen, the media, or both borders; semi- or fully-automatic operation; and 2-D versus 3-D methodology. Three performance measures for quantitative analysis are proposed. The results of the evaluation indicate that, with semi-automatic methods, the vessel lumen and media can be segmented with an accuracy comparable to manual annotation, and that encouraging results can also be obtained with fully-automatic segmentation. The analysis performed in this paper also highlights the challenges in IVUS segmentation that remain to be solved.
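Overlap-based measures of the kind used in such evaluations can be computed directly from the binary masks enclosed by the automatic and reference contours. A minimal sketch of two common choices, the Jaccard index and the relative area difference (the exact trio of measures is the one defined in the paper; the pixel masks here are hypothetical):

```python
def jaccard(auto_px, ref_px):
    """Jaccard index |A ∩ R| / |A ∪ R| between two binary masks given as
    sets of (row, col) pixels inside a contour; 1.0 is a perfect match."""
    a, r = set(auto_px), set(ref_px)
    union = a | r
    return len(a & r) / len(union) if union else 1.0

def area_difference(auto_px, ref_px):
    """Unsigned area difference relative to the reference contour's area."""
    a, r = set(auto_px), set(ref_px)
    return abs(len(a) - len(r)) / len(r)

# Hypothetical masks: the contours agree on 2 of the 4 covered pixels.
auto = [(0, 0), (0, 1), (1, 0)]
ref = [(0, 0), (0, 1), (1, 1)]
jm = jaccard(auto, ref)          # 2/4 = 0.5
ad = area_difference(auto, ref)  # |3 - 3| / 3 = 0.0
```

The two measures are complementary: a contour can enclose exactly the right area (area difference 0) while being displaced from the reference, which the overlap-based Jaccard index penalizes.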



A cluster formed by one HEAD node machine plus 19 compute nodes from the SGI Altix XE Servers and Clusters range, connected in a master-subordinate topology, with a total of 40 Dual-Core processors and approximately 160 GB of RAM.