935 results for accessibility analysis tools


Relevance:

80.00%

Publisher:

Abstract:

The Arctic environment is changing constantly, and several factors contribute to the rate and magnitude of this development. The region differs from the surrounding markets that most countries in the region are used to. The purpose of the study was therefore to understand how the political environment affects Finnish companies' strategies and business operations. The issues analyzed were the political environment in the region, the business environment and economic development, and the opportunities and threats that Finnish companies face in the Arctic. The main theories were drawn from strategic management and market analysis tools, and the different theories and definitions were reviewed in order to understand the context of the study. This is a qualitative study that uses content analysis as its main method of analyzing the data. The data was therefore gathered from existing material and analyzed until the saturation point was reached; this was done in order to minimize the risks related to using secondary data. The collected data was then categorized into themes accordingly. First, the general political environment in the Arctic was studied, especially the Arctic Council and its work as the main political entity. From there the focus shifted to the business environment and the general opportunities and threats found in Arctic economic development. China offered another point of view, as it represents a non-Arctic state with a keen interest in the region. Lastly, the two previous objectives were combined and examined from a Finnish perspective. Finnish companies have a great starting point for Arctic business, and the operational business environment gives them the framework within which they have to operate. In conclusion, it can be said that three main factors lead Arctic economic development: climate change, the development of technology, and the political environment.
These set the framework with which companies operating in the region must comply. The industry most likely to lead the development is the marine industry. Furthermore, it became evident that Finnish companies operating in the Arctic face many opportunities as well as threats, which can be utilized, taken advantage of, or controlled through effective strategic management. The key characteristics needed in the region are openness, understanding of the challenging environment, and the ability to face and manage the arising challenges.

Relevance:

80.00%

Publisher:

Abstract:

This study presents an agile tool set for business modeling in companies entering a turbulent environment. The study's aim is to explore business modeling techniques and their tooling through a case study of a Finnish media monitoring company expanding to the Russian market. This work proposes a tailored "two-approach" to business modeling development that analyzes both the past and future conditions of the two key factors of business modeling for companies: the internal and external environments. The study investigates the benefits and disadvantages of the case company's present business modeling tools, develops new tooling, and applies it to the case company. In addition to primary data, such as interviews with media monitoring industry analysts, representatives of competing companies, and academic experts, the study draws on a comprehensive analysis of the Russian media monitoring niche and its players. This study benefits the business modeling research area by combining traditional analysis tools, such as market, PESTLE, and competitor analyses, in a systemic manner with business modeling techniques. This transformation proceeds by applying integrated scenario, heat map, and critical design issues analyses in the societal, industrial, and competitive context of turbulent environments. The practical outcome of this approach is an agile business modeling tool set, customizable to a company's requirements.

Relevance:

80.00%

Publisher:

Abstract:

Discussions of conflict of interest (COI) in the university have tended to focus on financial interests in the context of medical research; much less attention has been given to COI in general or to the policies that seek to manage COI. Are university COI policies accessible and understandable? To whom are these policies addressed (faculty, staff, students)? Is COI clearly defined in these policies, and are procedures laid out for avoiding or remedying such situations? To begin tackling these important ethical and governance questions, our study examines the COI policies at the Group of Thirteen (G13) leading Canadian research universities. Using automated readability analysis tools and an ethical content analysis, we begin the task of comparing the strengths and weaknesses of these documents, paying particular attention to their clarity, readability, and utility in explaining and managing COI.
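The abstract does not specify which readability metrics its automated tools compute; as an illustrative sketch (not the study's actual method), one widely used automated measure, the Flesch Reading Ease score, can be approximated with a simple syllable heuristic:

```python
import re

def count_syllables(word):
    # Heuristic: count groups of consecutive vowels, drop a trailing silent 'e'
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1 and not word.endswith(("le", "ee")):
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text):
    # Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) \
                   - 84.6 * (syllables / len(words))

simple = "The cat sat on the mat."
dense = ("Institutional conflict-of-interest regulations necessitate "
         "comprehensive administrative interpretation.")
print(flesch_reading_ease(simple) > flesch_reading_ease(dense))  # simpler text scores higher
```

Higher scores indicate easier text; a policy document scoring very low on such a measure would be hard for its intended audience to use.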

Relevance:

80.00%

Publisher:

Abstract:

Analyzing code makes it possible to verify its functionality, detect bugs, or improve its performance. Code analysis can be static or dynamic. Approaches combining the two are better suited to industrial-size applications, where using either approach alone cannot provide the desired results. Combined approaches apply dynamic analysis to determine the problematic portions of the code and then perform a static analysis focused on the identified parts. However, existing dynamic analysis tools generate imprecise or incomplete data, or incur an unacceptable slowdown in execution time. In this work, we focus on generating complete dynamic call graphs, as well as other information needed to detect problematic portions of code. To do so, we use dynamic instrumentation of Java bytecode to extract information about call sites and object-creation sites and to build the program's dynamic call graph. We show that it is possible to dynamically profile a complete execution of an application with non-trivial running time and to extract all of this information at a reasonable cost. Performance measurements of our profiler on three benchmark suites with diverse workloads show that the average profiling overhead factor lies between 2.01 and 6.42. Our tool for generating complete dynamic call graphs, named dyko, is also an extensible platform for adding new instrumentation approaches. We tested a new technique for instrumenting object-creation sites that adapts the modifications made by the instrumentation to the bytecode of each method.
We also tested the impact of call-site resolution on the overall performance of the profiler.
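The technique described above instruments Java bytecode; as a language-agnostic illustration of the general idea (not the dyko implementation), the same effect, intercepting every call and recording caller-to-callee edges together with their call sites, can be sketched with Python's built-in profiling hook:

```python
import sys
from collections import defaultdict

call_graph = defaultdict(set)  # caller name -> set of (callee name, call-site line)

def tracer(frame, event, arg):
    # 'call' fires whenever a new Python function frame is entered
    if event == "call":
        caller = frame.f_back
        if caller is not None:
            site = caller.f_lineno  # line number of the call site in the caller
            call_graph[caller.f_code.co_name].add((frame.f_code.co_name, site))

def helper():
    return 42

def work():
    return helper() + helper()

sys.setprofile(tracer)   # install the profiling hook
work()
sys.setprofile(None)     # remove it before inspecting the results

edges = {(caller, callee)
         for caller, targets in call_graph.items()
         for callee, _ in targets}
print(("work", "helper") in edges)  # the dynamic edge work -> helper was recorded
```

A real bytecode-level profiler must also handle object-creation sites, native calls, and threads, which is where the overhead factors reported above come from.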

Relevance:

80.00%

Publisher:

Abstract:

Astronomical and cosmological observations strongly suggest the presence of an exotic, non-relativistic, non-baryonic form of matter that would account for 26% of the mass-energy content of the present Universe. This so-called cold dark matter would be composed of neutral, massive particles interacting weakly with ordinary matter (WIMPs: Weakly Interacting Massive Particles). The PICASSO project (Projet d'Identification des CAndidats Supersymétriques de la matière SOmbre) is one of the experiments installed in the SNOLAB underground site in Sudbury, Ontario, which attempts to directly detect one of the dark matter candidates proposed in the framework of supersymmetric extensions of the Standard Model: the neutralino. To this end, PICASSO uses superheated C4F10 droplet detectors based on the bubble chamber principle. Phase transitions in superheated liquids can be triggered by the recoil of 19F caused by an elastic collision with neutralinos. The nucleation of the droplet generates an acoustic wave recorded by piezoelectric sensors. This thesis presents recent progress in the PICASSO experiment that has led to a substantial increase in its sensitivity in the search for the neutralino. New fabrication and purification procedures have reduced by a factor of 10 the detectors' main contamination, caused by alpha emitters. The study of this contamination in the detectors made it possible to locate the source of these emitters. Efforts in data analysis have improved the discrimination between events generated by alpha particles and those generated by nuclear recoils. New analysis tools have also been implemented to discriminate events generated by particles from those generated by electronic or acoustic backgrounds.
In addition, an important mechanism for suppressing unwanted background at high temperature has made the PICASSO experiment sensitive to low-mass WIMPs.

Relevance:

80.00%

Publisher:

Abstract:

The full version of this thesis is available for individual consultation only at the Université de Montréal Library (www.bib.umontreal.ca/MU).

Relevance:

80.00%

Publisher:

Abstract:

The most recent cosmological measurements have shown the presence of a type of exotic matter constituting 85% of the mass of the universe. This non-baryonic matter would be made of neutral, non-relativistic, massive particles interacting weakly with baryonic matter. The candidates are grouped under the generic name WIMP (Weakly Interacting Massive Particles). The PICASSO experiment (Projet d'Identification des CAndidats Supersymétriques de la matière SOmbre) uses energy-threshold detectors containing superheated droplets of C4F10. This detection technique is based on the bubble chamber principle. The PICASSO project aims to detect a dark matter particle directly. The detection principle is that a dark matter particle interacting with the active liquid induces a nuclear recoil of 19F. The recoil energy would be sufficient to trigger a phase transition accompanied by an acoustic signal recorded by piezoelectric sensors. In this thesis, a simulation of the counting rate for the calibration of PICASSO detectors exposed to monoenergetic neutrons was performed using the Seitz theory, which describes the criteria for a phase transition to occur in a superheated liquid. In addition, a model computing the acoustic signal emitted during a phase transition induced by different types of radiation was created, making it possible to characterize the discrimination between different backgrounds as a function of threshold energy. Finally, an analysis tool, event localization, was used to apply volume cuts in order to improve alpha-neutron discrimination.

Relevance:

80.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis could significantly improve software quality and is still a challenging field.
This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making early detection of otherwise hard-to-detect software bugs more effective through static analysis of machine code. The focus of this work is to develop methods that automatically localize faults and optimize the code, thus improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually along all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control-flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software.
A relation matrix and a state transition diagram, built for the active-memory-bank state transitions corresponding to each bank-selection instruction, are used for the detection of redundant code. Instances of code redundancy are identified based on the stipulated rules for the target processor. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces the state space, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices toward correct use of difficult microcontroller features in developing embedded systems.
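The core idea behind the redundancy detection, tracking the active-bank state across a code sequence and flagging bank-select instructions that are self-loops in the state transition diagram, can be sketched as follows. This is a minimal illustration, not the dissertation's implementation: the mnemonics are invented, and real PIC16F87X analysis works on actual machine code and full control-flow graphs rather than a single straight-line block.

```python
def find_redundant_bank_selects(instructions):
    """Flag bank-select instructions that re-select the already-active bank.

    `instructions` is a straight-line sequence (one basic block) of
    (mnemonic, operand) pairs; ("SELBANK", n) models a bank-select instruction.
    """
    active_bank = None          # bank state unknown on block entry
    redundant = []
    for idx, (op, arg) in enumerate(instructions):
        if op == "SELBANK":
            if arg == active_bank:   # self-loop in the bank state transition diagram
                redundant.append(idx)
            active_bank = arg
    return redundant

block = [
    ("SELBANK", 1),
    ("MOVWF", "PORTB"),
    ("SELBANK", 1),   # redundant: bank 1 is already active
    ("MOVWF", "TRISB"),
    ("SELBANK", 0),
]
print(find_redundant_bank_selects(block))  # [2]
```

Across basic-block boundaries the entry state must be taken as the meet of all predecessors' exit states, which is where the relation matrix described above comes in.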

Relevance:

80.00%

Publisher:

Abstract:

Knowledge discovery support environments include, in addition to classical data analysis tools, data mining tools. To support both kinds of tools, a unified knowledge representation is needed. We show that concept lattices, which are used as the knowledge representation in Conceptual Information Systems, can also be used for structuring the results of mining association rules. Vice versa, we use ideas from association rules to reduce the complexity of the visualization of Conceptual Information Systems.
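As a toy illustration of the shared representation (not the paper's algorithm), the formal concepts of a small object-attribute context can be enumerated by closing attribute subsets: each closure B'' = intent(extent(B)) yields one node of the concept lattice, and association rules read off the same extents.

```python
from itertools import combinations

# Toy formal context: objects mapped to the attributes they have
context = {
    "o1": {"a", "b"},
    "o2": {"a", "c"},
    "o3": {"a", "b", "c"},
}
attributes = {"a", "b", "c"}

def extent(attrs):
    # objects possessing every attribute in `attrs`
    return {o for o, has in context.items() if attrs <= has}

def intent(objs):
    # attributes common to all objects in `objs`
    common = set(attributes)
    for o in objs:
        common &= context[o]
    return common

def concepts():
    # Naive enumeration, fine for tiny contexts: close every attribute subset
    found = set()
    for r in range(len(attributes) + 1):
        for combo in combinations(sorted(attributes), r):
            e = extent(set(combo))
            found.add((frozenset(e), frozenset(intent(e))))
    return found

for ext, inte in sorted(concepts(), key=lambda c: (-len(c[0]), sorted(c[1]))):
    print(sorted(ext), sorted(inte))
```

For this context the lattice has four concepts; for example ({o1, o3}, {a, b}) corresponds to the exact association rule b ⇒ a.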

Relevance:

80.00%

Publisher:

Abstract:

We propose to analyze shapes as "compositions" of distances in Aitchison geometry, as an alternative and complementary tool to classical shape analysis, especially when size is non-informative. Shapes are typically described by the locations of user-chosen landmarks. However, the shape, considered as invariant under scaling, translation, mirroring, and rotation, does not uniquely define the locations of the landmarks. A simple approach is to use the distances between landmarks instead of the landmark locations themselves. Distances are positive numbers defined up to joint scaling, a mathematical structure quite similar to that of compositions. The shape fixes only the ratios of distances. Perturbations correspond to relative changes in the size of subshapes and in aspect ratios. The power transform increases the expression of the shape by increasing distance ratios. In analogy to subcompositional consistency, results should not depend too strongly on the choice of distances, because different subsets of the pairwise distances between landmarks uniquely define the shape. Various compositional analysis tools can be applied to sets of distances, directly or after minor modifications concerning the singularity of the covariance matrix, and yield results with direct interpretations in terms of shape changes. The remaining problem is that not all sets of distances correspond to a valid shape. Nevertheless, interpolated or predicted shapes can be back-transformed by multidimensional scaling (when all pairwise distances are used) or free geodetic adjustment (when sufficiently many distances are used).
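The perturbation and power operations mentioned above can be sketched numerically on a "composition" of pairwise distances; the triangle and its values are invented for illustration, and the closure to unit sum simply removes the non-informative size:

```python
def closure(x):
    # rescale positive parts to sum to 1 (size is non-informative)
    s = sum(x)
    return [v / s for v in x]

def perturb(x, y):
    # Aitchison perturbation: component-wise product, then closure
    return closure([a * b for a, b in zip(x, y)])

def power(x, t):
    # Aitchison powering: component-wise power, then closure;
    # t > 1 exaggerates distance ratios, i.e. sharpens the shape
    return closure([a ** t for a in x])

# the three pairwise distances of a triangle of landmarks (only ratios matter)
d = closure([3.0, 4.0, 5.0])
sharp = power(d, 2.0)

ratio_before = d[2] / d[0]          # 5/3
ratio_after = sharp[2] / sharp[0]   # (5/3)^2
print(round(ratio_before, 4), round(ratio_after, 4))
```

The ratios are unchanged by closure and squared by the power transform, exactly the "increased expression of the shape" described in the abstract.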

Relevance:

80.00%

Publisher:

Abstract:

“What is value in product development?” is the key question of this paper. The answer is critical to the creation of lean in product development. By knowing how much value is added by product development (PD) activities, decisions can be more rationally made about how to allocate resources, such as time and money. In order to apply the principles of Lean Thinking and remove waste from the product development system, value must be precisely defined. Unfortunately, value is a complex entity that is composed of many dimensions and has thus far eluded definition on a local level. For this reason, research has been initiated on “Measuring Value in Product Development.” This paper serves as an introduction to this research. It presents the current understanding of value in PD, the critical questions involved, and a specific research design to guide the development of a methodology for measuring value. Work in PD value currently focuses on either high-level perspectives on value, or detailed looks at the attributes that value might have locally in the PD process. Models that attempt to capture value in PD are reviewed. These methods, however, do not capture the depth necessary to allow for application. A methodology is needed to evaluate activities on a local level to determine the amount of value they add and their sensitivity with respect to performance, cost, time, and risk. Two conceptual tools are proposed. The first is a conceptual framework for value creation in PD, referred to here as the Value Creation Model. The second tool is the Value-Activity Map, which shows the relationships between specific activities and value attributes. These maps will allow a better understanding of the development of value in PD, will facilitate comparison of value development between separate projects, and will provide the information necessary to adapt process analysis tools (such as DSM) to consider value. 
The key questions that this research entails are:
· What are the primary attributes of lifecycle value within PD?
· How can one model the creation of value in a specific PD process?
· Can a useful methodology be developed to quantify value in PD processes?
· What are the tools necessary for application?
· What PD metrics will be integrated with the necessary tools?
The research milestones are:
· Collection of value attributes and activities (September 2000)
· Development of a methodology for value-activity association (October 2000)
· Testing and refinement of the methodology (January 2001)
· Tool development (March 2001)
· Present findings at the July INCOSE conference (April 2001)
· Deliver a thesis that captures a formalized methodology for defining value in PD (including LEM data sheets) (June 2001)
The research design aims for the development of two primary deliverables: a methodology to guide the incorporation of value, and a product development tool that will allow direct application.

Relevance:

80.00%

Publisher:

Abstract:

In recent years we have seen an increase in the use of GIS technologies as analysis tools in the field of historical research. The study of landscape, and how it has influenced the development of history, is a focal point of research fields like archaeology and battlefield analysis, and its use is now spreading. (...)

Relevance:

80.00%

Publisher:

Abstract:

First Conference on Digital Libraries and Repositories: Knowledge Management, Open Access and Latin American Visibility (BIREDIAL). May 9-11, 2011. Bogotá, Colombia.

Relevance:

80.00%

Publisher:

Abstract:

Organizations have been perceived and described by scholars as a product of rationality, and therefore their management should be carried out rationally. This paper vindicates the role of intuition in strategic decision-making processes in organizations. To that end, after describing the "rational organization" and its implications, the separation between rationality and intuition is refuted with the help of psychology and sociology. Through fieldwork, and using as analysis tools the dual models of cognition and behavior developed over the last decade, it is shown that intuition plays an important and relevant role in strategic decision-making processes in organizations, a role that has been deliberately neglected by academia and that should be studied in order to gain a comprehensive understanding of organizations.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this undergraduate thesis is to identify the routes and methods followed by an organization to emerge from a crisis. The case study is the company Laboratorios Vogue, which was under a restructuring agreement (Law 550/1999) and, through the decisions made by its business leaders, survived and remains in the market. For this reason, the School of Management of the Universidad del Rosario awarded it the 2009 Ave Fénix Prize, which recognizes and highlights the collective effort of the managers and workers who rebuild companies, identifying characteristics that allow them to be sustainable, enduring, and successful. Against this backdrop, the thesis seeks to highlight the management of entrepreneurs who overcome their crises and face increasingly competitive markets. For the research, several analysis instruments were used, such as interviews, surveys, and visits to the company, in which the strategies adopted and their effectiveness in overcoming the crisis and fulfilling the restructuring agreement were analyzed. The work was organized under the guidelines of the School of Management, also analyzing variables such as Leadership and Management in order to identify the company's administrative management and Laboratorios Vogue's route to success in overcoming the crisis.