882 results for Project 2002-005-C: Decision Support Tools for Concrete Infrastructure Rehabilitation
Abstract:
The study area comprises the La Colacha sub-basins of the Arroyos Menores basins, natural areas to the west and south of Río Cuarto in the Province of Córdoba, Argentina. These lands are fertile, with loess soils and a temperate monsoon climate, but suffer from soil erosion, including regressive gullies that degrade them progressively. They have been cultivated for roughly one hundred and sixty years, and coordinated action planning has become necessary to conserve the land while maintaining good agricultural production. The authors improved the soil and hydrology data for the study area, evaluated systems of soil use and candidate actions, and applied Decision Support System (DSS) tools; this led them to use discrete multi-criteria decision models (MCDM) to obtain a more global view of soil conservation and hydraulic management actions and of the main types of land use. To this end they applied the weighted PROMETHEE, ELECTRE, and AHP methods, with criteria grouped as environmental, economic, and social, and with criterion values derived from their data on the effects of the alternatives. The resulting rankings of alternatives provide guidance for planning that depends to some extent on the sub-basin and on the choice of weights, but soil conservation actions and water management measures are recommended to preserve the condition of the basins, which are currently degrading appreciably, while largely keeping the present land uses.
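The weighted outranking step at the heart of methods such as PROMETHEE II can be illustrated with a minimal sketch. The alternatives, criteria, scores, and weights below are hypothetical placeholders rather than the study's data; the sketch only shows how weighted pairwise preferences are aggregated into net flows and a ranking.

```python
# Illustrative PROMETHEE II ranking of hypothetical conservation alternatives.
# All names, scores and weights are placeholders, not values from the study.
import numpy as np

alternatives = ["keep current use", "terracing + gully control", "rotation + buffer strips"]
criteria = ["erosion reduction", "net farm income", "social acceptance"]  # all to be maximised
weights = np.array([0.5, 0.3, 0.2])  # environmental, economic, social (hypothetical)

# Performance table: rows = alternatives, columns = criteria (hypothetical scores in [0, 1]).
scores = np.array([
    [0.2, 0.8, 0.9],
    [0.9, 0.5, 0.6],
    [0.7, 0.6, 0.7],
])

n = len(alternatives)
pref = np.zeros((n, n))  # aggregated preference index pi(a, b)
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        # "Usual" preference function: 1 if a beats b on a criterion, else 0.
        better = (scores[a] > scores[b]).astype(float)
        pref[a, b] = np.dot(weights, better)

phi_plus = pref.sum(axis=1) / (n - 1)   # how strongly each alternative outranks the others
phi_minus = pref.sum(axis=0) / (n - 1)  # how strongly it is outranked
net_flow = phi_plus - phi_minus

for idx in np.argsort(-net_flow):
    print(f"{alternatives[idx]}: net flow = {net_flow[idx]:+.3f}")
```

Changing the weight vector shifts the ranking, which is exactly the sensitivity to weight selection noted in the abstract.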
Abstract:
Nowadays, numerous systems (financial, industrial production, basic services infrastructure, etc.) depend on software. According to the definition given by I. Sommerville, "Software engineering is an engineering discipline that is concerned with all aspects of software production from the early stages of system specification through to maintaining the system after it has gone into use." "Software engineering is not just concerned with the technical processes of software development. It also includes activities such as software project management and the development of tools, methods, and theories to support software production." Software development process models set out guidelines for carrying out a software development project successfully. Since these process models emerged, new ways of managing projects and producing quality software have been investigated. First came the so-called heavyweight or traditional methodologies; later, with time and technological progress, the so-called agile methodologies emerged. Among other practices, agile methodologies promote continuous integration. This practice was introduced by Martin Fowler and aims to make teamwork easier and to automate integration tasks: projects are built automatically and frequently, so that errors are detected early and fixing them can be prioritised. Nevertheless, one of the keys to success in any software project is to use a working environment that facilitates, systematises, and helps to apply a development process efficiently. This Final Degree Project (FDP) aims to analyse different tools in order to configure a working environment that allows projects to be developed with agile methodologies and continuous integration in an easy and efficient way. Once these tools had been analysed, a working environment was proposed and configured for deployment and use. A notable feature of this FDP is that the tools analysed share a common and highly valued characteristic: they are open-source. The proposed working environment has a client-server architecture, since most software projects are developed by teams; the server gives the clients/developers access to the set of tools that make up the environment. The server side supports continuous integration through tools for version control, user story management, software metrics analysis, and build automation. The client configuration only requires an integrated development environment (IDE) that supports the Java programming language and a network connection to the server.
Abstract:
This dissertation presents two projects that point to possibilities for building new knowledge through viable and meaningful practices for undergraduate Business Administration students. The objective is to foster discussion about the role of the lecturer and the training of teachers for quality higher education, based on studies conducted at a college in the East Zone of the city of São Paulo. The work shows that the bachelor's degree in Business Administration can go beyond merely technical content, which can be humanized through the use of innovative teaching resources. The analysis of the projects leads to the conclusion that differentiated educational actions, inspired by Freirean theory, proved to be a way of confronting difficulties in the relationship between teaching and learning. With the application of these projects, potential learning difficulties were turned into opportunities and now contribute to educational quality.
Abstract:
The recent massive growth of online media and the rise of user-generated content (e.g., weblogs, Twitter, Facebook) pose challenges for accessing and interpreting multilingual data in an efficient, fast, and affordable manner. The objective of the TredMiner project is to develop innovative, portable, open-source methods that work in real time for large-scale summarization and cross-lingual mining of social media. The results are being validated in three use cases: decision support in the financial domain (with analysts, entrepreneurs, regulators, and economists), political monitoring and analysis (with journalists, economists, and politicians), and monitoring of health-related social media in order to detect information about adverse drug reactions.
Abstract:
Outliers are objects that show abnormal behavior with respect to their context or that have unexpected values in some of their parameters. In decision-making processes, information quality is of the utmost importance. In specific applications, an outlying data element may represent an important deviation in a production process or a damaged sensor. Therefore, the ability to detect these elements could make the difference between making a correct and an incorrect decision. This task is complicated by the large sizes of typical databases. Due to their importance in search processes in large volumes of data, researchers pay special attention to the development of efficient outlier detection techniques. This article presents a computationally efficient algorithm for the detection of outliers in large volumes of information. This proposal is based on an extension of the mathematical framework upon which the basic theory of detection of outliers, founded on Rough Set Theory, has been constructed. From this starting point, current problems are analyzed; a detection method is proposed, along with a computational algorithm that allows the performance of outlier detection tasks with an almost-linear complexity. To illustrate its viability, the results of the application of the outlier-detection algorithm to the concrete example of a large database are presented.
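As a side note, the rough-set intuition that objects falling into very small indiscernibility (equivalence) classes are natural outlier candidates can be sketched in a few lines. This is a simplified illustration, not the article's algorithm; attribute names and data are hypothetical, and the single hashing pass hints at why near-linear complexity is achievable.

```python
# Simplified indiscernibility-based outlier screening (illustrative only).
from collections import defaultdict

def indiscernibility_outliers(records, attributes, min_class_size=2):
    """Return records whose equivalence class (under the chosen condition
    attributes) has fewer than min_class_size members."""
    classes = defaultdict(list)
    for rec in records:                                  # one pass: ~linear in |records|
        key = tuple(rec[a] for a in attributes)
        classes[key].append(rec)
    return [rec for members in classes.values()
            if len(members) < min_class_size
            for rec in members]

# Hypothetical usage with discretised sensor readings.
data = [
    {"temp": "high", "pressure": "normal"},
    {"temp": "high", "pressure": "normal"},
    {"temp": "low",  "pressure": "critical"},            # isolated combination -> flagged
]
print(indiscernibility_outliers(data, ["temp", "pressure"]))
```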
Abstract:
Background and objective: In this paper, we have tested the suitability of different artificial intelligence-based algorithms for decision support when classifying the risk of congenital heart surgery. Classifying surgical risk provides considerable benefits, as it gives an a priori estimate of surgical outcome as a function of the type of disease, the type of repair, and other elements that influence the final result. This preventive estimation may help to avoid future complications, or even death. Methods: We evaluated four machine learning algorithms: multilayer perceptron, self-organizing map, radial basis function networks, and decision trees. The implemented architectures aim to classify cases into three levels of surgical risk: low, medium, and high complexity. Results: The accuracies achieved range between 80% and 99%, with the multilayer perceptron offering the highest hit ratio. Conclusions: According to the results, it is feasible to develop a clinical decision support system based on the evaluated algorithms. Such a system would help cardiology specialists, paediatricians, and surgeons forecast the level of risk associated with congenital heart disease surgery.
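For illustration only, a minimal sketch of the multilayer-perceptron pipeline on synthetic stand-in data (not the clinical dataset used in the paper) could look as follows; with random labels the score is near chance, so only the workflow, not the number, is meaningful.

```python
# Three-class risk classification with an MLP on synthetic placeholder data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))        # 10 hypothetical pre-operative features
y = rng.integers(0, 3, size=300)      # 0 = low, 1 = medium, 2 = high complexity

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```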
Abstract:
The construction industry has long been considered a highly fragmented and non-collaborative industry. This fragmentation stems from the complex and unstructured traditional coordination processes and information exchanges amongst all parties involved in a construction project. This nature, coupled with risk and uncertainty, has pushed clients and their supply chains to search for new ways of improving their business processes to deliver better quality and higher performing products. This research will closely investigate the need to implement a Digital Nervous System (DNS), analogous to a biological nervous system, for the flow and management of digital information across the project lifecycle. It will do so through direct examination of the key processes and information produced in a construction project and of how a DNS can provide a well-integrated flow of digital information throughout the project lifecycle. This research will also investigate how a DNS can create a tight digital feedback loop that enables the organisation to sense, react and adapt to changing project conditions. A Digital Nervous System is a digital infrastructure that provides a well-integrated flow of digital information to the right part of the organisation at the right time. It provides the organisation with the relevant and up-to-date information it needs on critical project issues to aid near real-time decision-making. A review of previous literature and survey questionnaires were used in this research to collect and analyse data about the industry's information management problems, e.g. disruption and discontinuity of digital information flow due to interoperability issues, disintegration/fragmentation of the adopted digital solutions, and paper-based transactions. Analysis of the results revealed that efficient and effective information management requires the creation and implementation of a DNS.
Abstract:
Stroke is a leading cause of death and permanent disability worldwide, affecting millions of individuals. Traditional clinical scores for assessment of stroke-related impairments are inherently subjective and limited by inter-rater and intra-rater reliability, as well as floor and ceiling effects. In contrast, robotic technologies provide objective, highly repeatable tools for quantification of neurological impairments following stroke. KINARM is an exoskeleton robotic device that provides objective, reliable tools for assessment of sensorimotor, proprioceptive and cognitive brain function by means of a battery of behavioral tasks. As such, KINARM is particularly useful for assessment of neurological impairments following stroke. This thesis introduces a computational framework for assessment of neurological impairments using the data provided by KINARM. This is done by pursuing two main objectives. The first is to investigate how robotic measurements can be used to estimate current and future abilities to perform daily activities for subjects with stroke. We are able to predict clinical scores related to activities of daily living, at present and future time points, using a set of robotic biomarkers. The findings of this analysis provide a proof of principle that robotic evaluation can be an effective tool for clinical decision support and target-based rehabilitation therapy. The second main objective of this thesis is to address the emerging problem of long assessment times, which can potentially lead to fatigue when assessing subjects with stroke. To address this issue, we examine two time-reduction strategies. The first strategy focuses on task selection, whereby KINARM tasks are arranged in a hierarchical structure so that an earlier task in the assessment procedure can be used to decide whether or not subsequent tasks should be performed. The second strategy focuses on time reduction in the two longest individual KINARM tasks. Both reduction strategies are shown to provide significant time savings, ranging from 30% to 90% using task selection and 50% using individual task reductions, thereby establishing a framework for reducing assessment time on a broader set of KINARM tasks. Overall, the findings of this thesis establish an improved platform for diagnosis and prognosis of stroke using robot-based biomarkers.
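As an illustration of the first objective, predicting a clinical score from a set of robotic biomarkers can be framed as a regularised regression problem. The sketch below uses synthetic placeholder data, not KINARM measurements, and is not the thesis's model; it only shows the general shape of such an analysis.

```python
# Predicting a surrogate clinical score from synthetic "robotic biomarker" features.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_subjects, n_biomarkers = 120, 8
X = rng.normal(size=(n_subjects, n_biomarkers))           # e.g. reaction time, movement error, ...
true_w = rng.normal(size=n_biomarkers)
y = X @ true_w + rng.normal(scale=0.5, size=n_subjects)   # surrogate clinical score

model = Ridge(alpha=1.0)
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R^2:", round(float(r2.mean()), 2))
```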
Abstract:
National Highway Traffic Safety Administration, Office of Alcohol Countermeasures, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
"January 1991."
Abstract:
Cover title: Deliberation Support Division (DSD) products and services.
Abstract:
Shipping list no.: 92-370-P.
Abstract:
This paper investigates how demographic (socioeconomic) and land-use (physical and environmental) data can be integrated within a decision support framework to formulate and evaluate land-use planning scenarios. A case-study approach is undertaken with land-use planning scenarios for a rapidly growing coastal area in Australia, the Shire of Hervey Bay. The town and surrounding area require careful planning of future urban growth among competing land uses. Three potential urban growth scenarios are put forward to address this issue. Scenario A ('continued growth') is based on existing socioeconomic trends. Scenario B ('maximising rates base') is derived using optimisation modelling of land-valuation data. Scenario C ('sustainable development') is derived using a number of social, economic, and environmental factors, with weightings of importance assigned to each factor using a multiple criteria analysis approach. The land-use planning scenarios are presented through maps and tables within a geographical information system, which delineate possible future land-use allocations up to 2021. The planning scenarios are evaluated using a goal-achievement matrix approach. The matrix is constructed with a number of criteria derived from key policy objectives outlined in the regional growth management framework and town planning schemes. The authors examine the final efficiency scores calculated for each of the three planning scenarios and discuss the advantages and disadvantages of the three land-use modelling approaches used to formulate the final scenarios.
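In its simplest weighted form, a goal-achievement matrix of the kind described here reduces to scoring each scenario against weighted policy criteria. The criteria, weights, and achievement scores below are hypothetical placeholders, not the study's figures; the sketch only shows how final efficiency scores can be aggregated.

```python
# Toy goal-achievement matrix for the three planning scenarios (placeholder numbers).
import numpy as np

scenarios = ["A: continued growth", "B: maximising rates base", "C: sustainable development"]
criteria = ["urban consolidation", "rates revenue", "habitat protection"]
weights = np.array([0.4, 0.3, 0.3])          # importance of each policy objective (hypothetical)

# Achievement scores in [0, 1]: rows = scenarios, columns = criteria.
achievement = np.array([
    [0.5, 0.6, 0.3],
    [0.4, 0.9, 0.2],
    [0.7, 0.5, 0.8],
])

efficiency = achievement @ weights           # weighted goal-achievement score per scenario
for name, score in sorted(zip(scenarios, efficiency), key=lambda t: -t[1]):
    print(f"{name}: {score:.2f}")
```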