974 results for Software -- Evaluation
Abstract:
The aim of this thesis is to study how to develop a computer application that implements numerical algorithms for evaluating the hydrodynamic characteristics of geometric models representing ship hulls. The work specifies the requirements that a program aimed at solving a particular hydrodynamic problem must satisfy, namely simulating the rolling behaviour of a ship subjected to following or head seas. Once the application has been specified, a design of the program is produced; alternatives for implementing the application are studied; the process to be followed to obtain a working application is explained; and the results obtained are checked as far as possible. The intention is to systematize and synthesize the whole software development process, oriented towards simulating the hydrodynamic behaviour of a ship, into a methodology that will be made available to the academic and scientific community in whatever form is considered most appropriate. It is therefore a matter of proposing a software development methodology to obtain an application that makes it easy to evaluate different study alternatives by varying parameters of the problem under study and that is able to provide results for analysis. The thesis also addresses how the process should be conducted so that the application can grow, incorporating existing solutions not yet implemented or new solutions that emerge in this field of knowledge. As a concrete use case, the algorithms needed to evaluate the onset of parametric rolling in a ship have been chosen for implementation. In the analysis of this problem, the geometric representation of the ship's hull is of particular interest. Besides the hull, other elements have a decisive influence on this study, such as the sea state and the loading conditions. Ideally, the problem would be solved if the roll angle produced when a ship faces different sea conditions could be determined. The program is to be built using the object-oriented paradigm, which I consider the most suitable way to modularize the program so that different models of the same hull can be used and the results of the parametric roll evaluation compared with one another. At a later stage the results could be compared with others obtained empirically. I speak of a new methodology because I intend to indicate how to build a software application that is usable and on which development can continue; this justifies the choice of the C++ programming language. A geometric software kernel will be selected that allows the different software components making up the program to be coupled in a versatile way. This work aims to apply software development to a specific aspect of the field of hydrodynamics. It is not intended to contribute new algorithms for solving hydrodynamic problems, but rather to design a set of software objects that implement known numerical solutions to those problems. It is fundamentally a software work rather than a hydrodynamics one.
What it contributes as a novelty is a new way of building a program for the hydrodynamic calculations involved in determining parametric rolling, one that can grow and incorporate any development that may arise later. This will be possible thanks to the modular programming used and the objects that represent each of the elements involved in determining parametric roll. The choice of applying the methodology to the prediction of parametric rolling is due to the fact that this concept is one of the elements involved in the evaluation of the second-generation intact stability criteria currently under study for future application in shipbuilding. It is therefore a study of interest because of its forthcoming usefulness. ABSTRACT The aim of this thesis is to study how to develop a computer application implementing numerical algorithms to assess hydrodynamic features of geometrical models of vessels. The thesis therefore proposes a methodology for software development applied to a hydrodynamic problem, in order to evaluate different study alternatives by varying parameters related to the problem and to be capable of providing results for analysis. As a concrete application of the program, the algorithms necessary for evaluating the onset of parametric rolling in a vessel have been implemented. In the analysis of this problem, the geometrical representation of the ship's hull is of particular interest, together with other elements that have a decisive influence on this phenomenon, such as the sea state and the loading condition. Ideally, the application would determine the roll angle that occurs when a ship sails in waves of different characteristics. The program is prepared using the object-oriented programming paradigm, which I consider the best methodology for modularizing the program. My intention is to show how to face the whole process of developing an application, from the initial specification to the final release of the program, keeping in mind the specific objectives of usability and the possibility of growing the scope of the software. This work intends to apply software development to a particular aspect of the field of hydrodynamics. It is not intended to provide new algorithms for solving hydrodynamic problems, but to design a set of software objects that implement existing solutions to these problems. This is essentially a software work rather than a hydrodynamic one. The novelty of this thesis lies in its focus on describing how to apply the whole software engineering process to hydrodynamics problems. The prediction of parametric rolling was chosen as the main application because this concept is one of the elements involved in the evaluation of the second-generation intact stability criteria. Therefore, I consider this study relevant and useful for future application in the field of shipbuilding.
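As a rough illustration of the kind of modular, object-oriented decomposition the abstract describes (hull model, sea state, loading condition and a roll evaluator wired together), the sketch below uses hypothetical class names and interfaces that do not come from the thesis; the thesis itself targets C++ on top of a geometric kernel, while this sketch is written in Python purely for brevity.

```python
# Illustrative sketch only: hypothetical object decomposition for parametric
# roll evaluation. None of these names or interfaces come from the thesis.
from dataclasses import dataclass


@dataclass
class LoadingCondition:
    displacement_t: float   # displacement in tonnes (hypothetical field)
    gm_m: float             # transverse metacentric height in metres


@dataclass
class SeaState:
    wave_height_m: float    # significant wave height
    wave_period_s: float    # wave period
    heading_deg: float      # 0 = following seas, 180 = head seas


class HullModel:
    """Wraps one geometric representation of the hull (e.g. a mesh or a NURBS model)."""
    def __init__(self, name: str):
        self.name = name


class ParametricRollEvaluator:
    """Combines hull, loading condition and sea state and returns a roll estimate."""
    def __init__(self, hull: HullModel, load: LoadingCondition, sea: SeaState):
        self.hull, self.load, self.sea = hull, load, sea

    def max_roll_angle_deg(self) -> float:
        # Placeholder: a real implementation would integrate a roll equation
        # whose restoring term varies as waves pass along the hull.
        raise NotImplementedError


# Different geometric models of the same hull can then be plugged in and the
# parametric roll results compared against one another.
```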
Abstract:
In the last decades accumulated clinical evidence has proven that intra-operative radiation therapy (IORT) is a very valuable technique. In spite of that, planning technology has not evolved since its conception, remaining outdated in comparison with the current state of the art in other radiotherapy techniques and therefore slowing down the adoption of IORT. RADIANCE is an IORT planning system, CE and FDA certified, developed by a consortium of companies, hospitals and universities to overcome such technological backwardness. RADIANCE provides all basic radiotherapy planning tools, specifically adapted to IORT. These include, but are not limited to, image visualization, contouring, dose calculation algorithms (Pencil Beam (PB) and Monte Carlo (MC)), DVH calculation and reporting. Other new tools, such as surgical simulation tools, have been developed to deal with the specific conditions of the technique. Planning with preoperative images (preplanning) has been evaluated, and the validity of the system has been proven in terms of documentation, treatment preparation and learning, as well as improvement of the communication process between surgeons and radiation oncologists (ROs). Preliminary studies on navigation systems envisage benefits in helping the specialist accurately and safely apply the pre-plan to the treatment, updating the plan as needed. Improvements in the usability and workflow of this kind of system are needed to make it more practical. Preliminary studies on intraoperative imaging suggest it could provide an improved anatomy for the dose computation, to be compared with the previous pre-plan, although not all devices on the market provide good enough characteristics to do so. The DICOM.RT standard for radiotherapy information exchange has been updated to cover IORT particularities and to enable dose summation with external radiotherapy. The effect of this planning technology on the global risk of the IORT technique has been assessed and documented as part of a failure mode and effects analysis (FMEA). Given these technological innovations and their clinical evaluation (including risk analysis), we consider RADIANCE a very valuable tool for the specialist, covering the demands from professional societies (AAPM, ICRU, EURATOM) for current radiotherapy procedures.
Abstract:
Bone quality, as well as the initial stability of implants, is directly related to the success of rehabilitations in implant dentistry. The present study aimed to analyse the correlation between radiomorphometric bone density indices obtained from panoramic radiographs, the bone quality profile assessed with Cone Beam Computed Tomography (CBCT) using the OsiriX imaging software, Resonance Frequency Analysis (RFA) and implant insertion torque. A total of 160 implants in 72 individuals were evaluated, with a mean age of 55.5 (±10.5) years. The IM, IPM and ICM indices were obtained from the panoramic radiographs, and the pixel values and the cortical thickness of the alveolar crest were obtained from the cone beam computed tomography scans, in addition to primary stability measured by insertion torque and resonance frequency analysis. The results were analysed with Spearman's correlation coefficient. At p<=0.01, correlations were obtained between insertion torque and pixel values (0.330), insertion torque and alveolar crest cortical thickness (0.339), insertion torque and buccolingual ISQ (0.193), pixel values and alveolar crest cortical thickness (0.377), the buccolingual and mesiodistal ISQ directions (0.674), and buccolingual ISQ and alveolar crest cortical thickness (0.270); the radiomorphometric indices were correlated with one another. At p<=0.05, correlations were obtained between insertion torque and mesiodistal ISQ (0.131), between buccolingual ISQ and pixel values (0.156), and between mesiodistal ISQ and left IPMI (0.149) and mesiodistal ISQ and left IPMS (0.145). There is a correlation between CBCT, insertion torque and RFA in the assessment of bone quality. Given the correlations obtained in this study, CBCT examinations can be used pre-surgically to assess bone quality and quantity.
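As a minimal sketch of the type of correlation analysis reported above (Spearman's rho between insertion torque and the imaging-derived measures), assuming the measurements are available as numeric arrays; the values generated here are hypothetical, not the study's data.

```python
# Minimal sketch of a Spearman correlation analysis; data are hypothetical.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
insertion_torque = rng.uniform(15, 60, size=160)                    # N·cm, hypothetical
pixel_values = insertion_torque * 8 + rng.normal(0, 40, size=160)   # loosely correlated, hypothetical
cortical_thickness = rng.uniform(0.5, 3.0, size=160)                # mm, hypothetical

for name, series in [("pixel values", pixel_values),
                     ("cortical thickness", cortical_thickness)]:
    rho, p = spearmanr(insertion_torque, series)
    print(f"insertion torque vs {name}: rho={rho:.3f}, p={p:.4f}")
```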
Abstract:
The very purpose of a recruiting software program is to help the management of organizations, primarily the HR department, keep track of job applications. An applicant tracking system can reduce an organization's overall recruitment cost, increase productivity, and raise the level of satisfaction due to faster and better completion of transactions and services. This project analyzes four software providers to discover the applicant tracking system that best suits an organization's recruiting needs. The capstone also highlights the great success an organization can achieve by significantly improving the delivery of its recruiting services to employees, managers and applicants. The adoption of a well-managed applicant tracking system can support this goal.
Abstract:
Background: The Clinical Learning Environment, Supervision and Nurse Teacher scale is a reliable and valid instrument to evaluate the quality of the clinical learning process in international nursing education contexts. Objectives: This paper reports the development and psychometric testing of the Spanish version of the Clinical Learning Environment, Supervision and Nurse Teacher scale. Design: Cross-sectional validation study of the scale. Setting: 10 public and private hospitals in the Alicante area, and the Faculty of Health Sciences (University of Alicante, Spain). Participants: 370 student nurses on clinical placement (January 2011–March 2012). Methods: The Clinical Learning Environment, Supervision and Nurse Teacher scale was translated using the modified direct translation method. Statistical analyses were performed using PASW Statistics 18 and AMOS 18.0.0 software. A multivariate analysis was conducted in order to assess construct validity. Cronbach's alpha coefficient was used to evaluate instrument reliability. Results: An exploratory factor analysis identified the five dimensions of the original version and explained 66.4% of the variance. Confirmatory factor analysis supported the factor structure of the Spanish version of the instrument. Cronbach's alpha coefficient for the scale was .95, ranging from .80 to .97 for the subscales. Conclusion: This version of the Clinical Learning Environment, Supervision and Nurse Teacher scale showed acceptable psychometric properties for use as an assessment scale in Spanish-speaking countries.
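As a minimal sketch of one of the reliability computations mentioned above (Cronbach's alpha for a scale or subscale), assuming item responses are rows of a numeric array; the responses shown are hypothetical, not the study's data.

```python
# Minimal sketch of Cronbach's alpha for a set of scale items,
# assuming `items` is an (n_respondents x n_items) array of numeric responses.
import numpy as np


def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Hypothetical example: 5 respondents answering 4 Likert items
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(round(cronbach_alpha(responses), 2))
```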
Abstract:
Paraconsistent logic admits that a contradiction can be true. Let p be the truth value and P a proposition. In paraconsistent logic the truth value of the contradiction is . This equation has no real roots but admits complex roots . This result leads to the development of a multivalued logic with complex truth values. The sum of truth values, being isomorphic to a vector of the plane, makes it natural to relate the valuation function V to the metric of the vector space R2. We adopt the norms of the vectors as valuations. The main objective of this paper is to establish a theory of truth-value evaluation for paraconsistent logics, with the goal of using it to analyze ideological, mythical, religious and mystic belief systems.
Abstract:
Hardware/software partitioning is a key stage in the co-design process of embedded systems. At this stage it is decided which components will be implemented as hardware co-processors and which components will be implemented on a general-purpose processor. The decision is made by exploring the design space, evaluating a set of candidate solutions to establish which of them achieves the best balance among all the design metrics. To explore the solution space, most proposals use metaheuristic algorithms, notably Genetic Algorithms and Simulated Annealing. In many cases this choice is not based on comparative analyses involving several algorithms on the same problem. This work presents the application of the Stochastic Hill Climbing and Stochastic Hill Climbing with Restart algorithms to the hardware/software partitioning problem. To validate the use of these algorithms, their application to a case study is presented, namely the hardware/software partitioning of a JPEG encoder. In all experiments it can be seen that both algorithms reach solutions comparable to those obtained by the most frequently used algorithms.
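As a minimal sketch of the approach described, a stochastic hill climber with restart over binary partition vectors (1 = implement the block in hardware, 0 = in software); the cost model and weights below are hypothetical placeholders, not the metrics of the JPEG encoder case study.

```python
# Minimal sketch: stochastic hill climbing with restart for HW/SW partitioning.
# Each solution is a bit vector (1 = hardware co-processor, 0 = software).
# The cost model is a hypothetical placeholder, not the work's actual metric.
import random

N_BLOCKS = 16
HW_AREA = [random.uniform(1, 10) for _ in range(N_BLOCKS)]   # area if mapped to hardware
SW_TIME = [random.uniform(1, 10) for _ in range(N_BLOCKS)]   # time if mapped to software


def cost(partition):
    area = sum(a for a, bit in zip(HW_AREA, partition) if bit)
    time = sum(t for t, bit in zip(SW_TIME, partition) if not bit)
    return 0.5 * area + 0.5 * time            # hypothetical weighted balance of metrics


def stochastic_hill_climb(iters=2000):
    current = [random.randint(0, 1) for _ in range(N_BLOCKS)]
    for _ in range(iters):
        neighbour = current[:]
        neighbour[random.randrange(N_BLOCKS)] ^= 1   # flip one randomly chosen block
        if cost(neighbour) <= cost(current):
            current = neighbour
    return current


def hill_climb_with_restart(restarts=10):
    best = stochastic_hill_climb()
    for _ in range(restarts - 1):             # restart from fresh random solutions
        candidate = stochastic_hill_climb()
        if cost(candidate) < cost(best):
            best = candidate
    return best


best = hill_climb_with_restart()
print("best cost:", round(cost(best), 2))
```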
Abstract:
This paper describes a study and analysis of surface normal-based descriptors for 3D object recognition. Specifically, we evaluate the behaviour of the descriptors in the recognition process using virtual models of objects created with CAD software. Later, we test them in real scenes using synthetic objects created with a 3D printer from the virtual models. In both cases, the same virtual models are used in the matching process to find similarity. The difference between the two experiments lies in the type of views used in the tests. Our analysis evaluates three aspects: the effectiveness of the 3D descriptors depending on the camera viewpoint and the geometric complexity of the model, the runtime of the recognition process, and the success rate in recognizing a view of an object among the models stored in the database.
Abstract:
Background. Health care professionals, especially those working in primary health-care services, can play a key role in preventing and responding to intimate partner violence. However, there are huge variations in the way health care professionals and primary health care teams respond to intimate partner violence. In this study we tested a previously developed programme theory on 15 primary health care center teams located in four different Spanish regions: Murcia, C Valenciana, Castilla-León and Cantabria. The aim was to identify the key combinations of contextual factors and mechanisms that trigger a good primary health care center team response to intimate partner violence. Methods. A multiple case-study design was used. Qualitative and quantitative information was collected from each of the 15 centers (cases). In order to handle the large amount of information without losing familiarity with each case, qualitative comparative analysis was undertaken. Conditions (context and mechanisms) and outcomes were identified and assessed for each of the 15 cases, and solution formulae were calculated using qualitative comparative analysis software. Results. The emerging programme theory highlighted the importance of the combination of each team's self-efficacy, perceived preparation and women-centredness in generating a good team response to intimate partner violence. The use of the protocol and accumulated experience in primary health care were the most relevant contextual/intervention conditions to trigger a good response. However, in order to achieve this, they must be combined with other conditions, such as an enabling team climate, having a champion social worker and having staff with training in intimate partner violence. Conclusions. Interventions to improve primary health care teams' response to intimate partner violence should focus on strengthening teams' self-efficacy, perceived preparation and the implementation of a woman-centred approach. The use of the protocol combined with extensive working experience in primary health care, and other factors such as training, a good team climate, and having a champion social worker on the team, also played a key role. Measures to sustain such interventions and promote these contextual factors should be encouraged.
Abstract:
Since the beginning of 3D computer vision, it has been necessary to use techniques that reduce the data to make it treatable while preserving the important aspects of the scene. Currently, with the new low-cost RGB-D sensors, which provide a stream of color and 3D data at approximately 30 frames per second, this is becoming even more relevant. Many applications make use of these sensors and need a preprocessing step to downsample the data in order to either reduce the processing time or improve the data (e.g., reducing noise or enhancing the important features). In this paper, we present a comparison of different downsampling techniques based on different principles. Concretely, five different downsampling methods are included: a bilinear-based method, a normal-based one, a color-based one, a combination of the normal- and color-based samplings, and a growing neural gas (GNG)-based approach. For the comparison, two different models acquired with the Blensor software have been used. Moreover, to evaluate the effect of the downsampling in a real application, a 3D non-rigid registration is performed with the sampled data. From the experimentation we can conclude that, depending on the purpose of the application, some kernels of the sampling methods can drastically improve the results. The bilinear- and GNG-based methods provide homogeneous point clouds, whereas the color-based and normal-based ones provide datasets with a higher density of points in areas with specific features. In the non-rigid application, if a color-based sampled point cloud is used, it is possible to properly register two datasets in cases where intensity data are relevant in the model, outperforming the results obtained when only a homogeneous sampling is used.
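As a rough sketch of the contrast drawn above between homogeneous and feature-driven sampling, the following compares uniform random downsampling with a simple color-variance-weighted sampling on a synthetic point cloud; it is illustrative only, and the weighting scheme is an assumption rather than one of the kernels evaluated in the paper.

```python
# Illustrative sketch: uniform random downsampling vs. a simple color-driven
# sampling that keeps more points where local color variation is high.
# Synthetic data; not the sampling kernels evaluated in the paper.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
points = rng.uniform(0, 1, size=(5000, 3))   # synthetic XYZ coordinates
colors = rng.uniform(0, 1, size=(5000, 3))   # synthetic RGB values


def uniform_sample(n_out):
    # Homogeneous sampling: every point is equally likely to be kept.
    return rng.choice(len(points), size=n_out, replace=False)


def color_weighted_sample(n_out, k=20):
    # Feature-driven sampling: weight each point by the color variance
    # among its k nearest neighbours, then sample without replacement.
    _, nn = cKDTree(points).query(points, k=k)
    weights = colors[nn].var(axis=(1, 2)) + 1e-9
    weights /= weights.sum()
    return rng.choice(len(points), size=n_out, replace=False, p=weights)


print(len(uniform_sample(500)), len(color_weighted_sample(500)))
```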
Abstract:
Project work presented to the Escola Superior de Tecnologia of the Instituto Politécnico de Castelo Branco in fulfilment of the requirements for the degree of Master in Software Development and Interactive Systems, carried out under the scientific supervision of Professor Doutor José Carlos Metrôlho of the Instituto Politécnico de Castelo Branco.
Abstract:
Costs and environmental impacts are key elements in forest logistics and must be integrated into forest decision-making. The evaluation of transportation fuel costs and carbon emissions depends on spatial and non-spatial data, but in many cases the former type of data is difficult to obtain. On the other hand, the availability of software tools to evaluate transportation fuel consumption as well as costs and emissions of carbon dioxide is limited. We developed a software tool that combines two empirically validated models of truck transportation using Digital Elevation Model (DEM) data and an open spatial data tool, specifically OpenStreetMap©. The tool generates tabular data and spatial outputs (maps) with information regarding fuel consumption, cost and CO2 emissions for four types of trucks. It also generates maps of the distribution of transport performance indicators (the relation between beeline and real road distances). These outputs can be easily included in forest decision-making support systems. Finally, in this work we applied the tool to a particular case of forest logistics in north-eastern Portugal.
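As a small sketch of the kind of per-route indicators described above (fuel, cost, CO2 and the beeline-to-road distance ratio), using placeholder consumption, cost and emission figures rather than the empirically validated truck models of the paper; the coordinates are arbitrary illustrative values.

```python
# Minimal sketch of per-route transport indicators. Consumption, price and
# emission figures are placeholders, not the paper's validated truck models.
from math import radians, sin, cos, asin, sqrt


def haversine_km(lat1, lon1, lat2, lon2):
    # Beeline (great-circle) distance between two points, in kilometres.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))


def route_indicators(road_km, lat1, lon1, lat2, lon2,
                     l_per_100km=35.0, eur_per_l=1.5, kg_co2_per_l=2.6):
    fuel_l = road_km * l_per_100km / 100.0
    beeline = haversine_km(lat1, lon1, lat2, lon2)
    return {
        "fuel_l": fuel_l,
        "cost_eur": fuel_l * eur_per_l,
        "co2_kg": fuel_l * kg_co2_per_l,       # roughly 2.6 kg CO2 per litre of diesel
        "beeline_to_road": beeline / road_km,  # transport performance indicator
    }


print(route_indicators(42.0, 41.80, -6.76, 41.50, -6.95))
```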
Abstract:
Software architecture erodes over time and needs to be constantly monitored to be kept consistent with its original intended design. Consistency is rarely monitored using automated techniques, as the cost associated with such an activity is typically not considered proportional to its benefits. To improve this situation, we propose Dicto, a uniform DSL for specifying architectural invariants. This language is designed to reduce the cost of consistency checking by offering a framework in which existing validation tools can be matched to newly-defined language constructs. In this paper we discuss how such a DSL can be qualitatively and quantitatively evaluated in practice.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Enterprise systems interoperability (ESI) is currently an important topic for business. This situation is evidenced, at least in part, by the number and extent of potential candidate protocols for such process interoperation, viz., ebXML, BPML, BPEL, and WSCI. Wide-ranging support for each of these candidate standards already exists. However, despite broad acceptance, a sound theoretical evaluation of these approaches has not yet been provided. We use the Bunge-Wand-Weber (BWW) models, in particular the representation model, to provide the basis for such a theoretical evaluation. We, and other researchers, have shown the usefulness of the representation model for analyzing, evaluating, and engineering techniques in the areas of traditional and structured systems analysis, object-oriented modeling, and process modeling. In this work, we address the question: what are the potential semantic weaknesses of using ebXML alone for process interoperation between enterprise systems? We find that users will lack important implementation information because of representational deficiencies; that, due to ontological redundancy, the complexity of the specification is unnecessarily increased; and that users of the specification will have to bring in extra-model knowledge to understand constructs in the specification, due to instances of ontological excess.