935 results for accessibility analysis tools


Relevance:

30.00%

Publisher:

Abstract:

SKI-1/S1P protease is a member of the proprotein convertase family, with several functions in cellular metabolism and homeostasis. It is responsible for the processing of several cellular substrates, including ATF6, SREBPs, and GlcNAc-1-phosphotransferase. SKI-1/S1P is also responsible for the maturation of the arenavirus surface glycoprotein into GP1 and GP2 subunits, a strict requirement for fully mature and fusion-competent virions. Furthermore, SKI-1/S1P itself is synthesized as an inactive zymogen that requires sequential autocatalytic processing at several sites (B'/B and C) in its prodomain in order to mature and become fully active. Our project focused on the role of the SKI-1/S1P prodomain in the biogenesis of the active enzyme. In this context, we additionally developed and characterized a novel cell-based sensor for assessing the cellular activity of the enzyme, with a potential application in screening for novel SKI-1/S1P inhibitors. In the first aim, we analysed the relevance of the cleavage motifs found in the enzyme prodomain. Using molecular and biochemical tools, we identified and characterized a novel C' maturation site. Furthermore, we found that SKI-1/S1P autoprocessing yields intermediates whose catalytic domain remains associated with prodomain fragments of different lengths. In contrast to other proprotein convertases, incompletely matured intermediates of SKI-1/S1P exhibit full catalytic activity toward selected substrates. In the second aim, we turned our attention to the structural basis of SKI-1/S1P N-terminus-assisted folding. By studying the folding and activity of prodomain-truncated forms of the enzyme, we found that a minimal folding unit is contained in the AB region. Deletion of the BC sequence affected auto-maturation but not folding, and partial activity was retained; however, the BC region appeared necessary for full activity.
Phylogenetic analyses showed that the AB sequence is highly conserved, while the BC fragment is variable in sequence and length. Specifically, replacement of the human prodomain with that of Drosophila resulted in a fully mature and active chimeric enzyme, suggesting an evolution of the SKI-1/S1P prodomain towards a more complex arrangement and more steps of activation. Overall, the data we have produced may provide fundamental knowledge for the development of novel SKI-1/S1P inhibitors, while also yielding new SKI-1/S1P variants with potential use for crystallization purposes.


Currently available molecular biology tools allow forensic scientists to characterize DNA evidence found at crime scenes for a large variety of samples, including those of limited quantity and quality, and achieve high levels of individualization. Yet, standard forensic markers provide limited or no results when applied to mixed DNA samples where the contributors are present in very different proportions (unbalanced DNA mixtures). This becomes an issue mostly for the analysis of trace samples collected on the victim or from touched objects. To this end, we recently proposed an innovative type of genetic marker, named DIP-STR that relies on pairing deletion/insertion polymorphisms (DIP) with standard short tandem repeats (STR). This novel compound marker allows detection of the minor DNA contributor in a DNA mixture of any gender and cellular origin with unprecedented resolution (beyond a DNA ratio of 1:1000). To provide a novel analytical tool useful in practice to common forensic laboratories, this article describes the first set of 10 DIP-STR markers selected according to forensic technical standards. The novel DIP-STR regions are short (between 146 and 271 bp), include only highly polymorphic tri-, tetra- and pentanucleotide tandem repeats and are located on different chromosomes or chromosomal arms to provide statistically independent results. This novel set of DIP-STR can target the amplification of 0.03-0.1 ng of DNA when mixed with a 1000-fold excess of major DNA. DIP-STR relative allele frequencies are estimated based on a survey of 103 Swiss individuals. Finally, this study provides an estimate of the occurrence of informative alleles and a calculation of the corresponding random match probability of the detected minor DIP-STR genotype assessed across 10,506 pairwise conceptual mixtures.
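The final step the abstract describes, computing a random match probability (RMP) for the detected minor DIP-STR genotype from population allele frequencies, can be sketched as follows. This is a minimal illustration assuming Hardy-Weinberg equilibrium and statistically independent markers (the abstract's rationale for placing markers on different chromosomes or arms); the marker count and frequencies are hypothetical, not the published Swiss values.

```python
# Sketch: random match probability for a compound genotype under
# Hardy-Weinberg equilibrium, with hypothetical allele frequencies.

def genotype_probability(p, q=None):
    """Genotype probability: homozygote p^2, heterozygote 2pq."""
    return p * p if q is None else 2 * p * q

def random_match_probability(profile):
    """Multiply per-marker genotype probabilities, assuming
    statistically independent markers."""
    rmp = 1.0
    for freqs in profile:
        rmp *= genotype_probability(*freqs)
    return rmp

# Hypothetical minor-contributor profile over three markers:
# two heterozygous genotypes and one homozygous genotype.
profile = [(0.10, 0.25), (0.30, 0.15), (0.20,)]
rmp = random_match_probability(profile)
print(f"RMP = {rmp:.2e}")  # 0.05 * 0.09 * 0.04 = 1.80e-04
```

With more markers and rarer alleles the product shrinks rapidly, which is why even a 10-marker set can be highly discriminating.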


The number of qualitative research methods has grown substantially over the last twenty years, both in the social sciences and, more recently, in the health sciences. This growth came with questions about the quality criteria needed to evaluate such work, and numerous guidelines were published. These guidelines, however, contain many discrepancies, both in their vocabulary and in their construction, and many expert evaluators decry the absence of consensual and reliable evaluation tools. The authors present the results of an evaluation of 58 existing guidelines in 4 major health science fields (medicine and epidemiology; nursing and health education; social sciences and public health; psychology/psychiatry, research methods and organization) by expert users (article reviewers, experts allocating funds, editors, etc.). The results propose a toolbox of 12 consensual criteria, with the definitions given by expert users, and indicate in which disciplinary field each criterion is considered more or less essential. Nevertheless, the authors highlight the limits of criteria comparability as soon as one focuses on their specific definitions. They conclude that each criterion in the toolbox must be explicated in order to reach a broader consensus and to identify definitions that are consensual across all the fields examined and easily operationalized.


This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of spatial, temporal and spatio-temporal clusters of environmental point data. The clustering methods developed were applied both to simulated datasets and to real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Environmental phenomena can typically be modelled as stochastic point processes where each event, e.g. a forest fire ignition point, is characterised by its spatial location and its occurrence in time. Additional information such as burned area, ignition cause, land use, and topographic, climatic and meteorological features can also be used to characterise the studied phenomenon. Space-time pattern characterisation thereby represents a powerful tool for understanding the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures for both global (e.g. the Morisita index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local (e.g. scan statistics) analysis. Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are global in character and do not consider complex spatial constraints, high variability and the multivariate nature of the events.
Therefore, we proposed a statistical framework that takes into account the complexities of the geographical space in which phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data with differently constrained geographical spaces, thereby assessing the relative degree of clustering of the real distribution. Moreover, specifically for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructure, and one for predicting fire ignition susceptibility. The main objective of this thesis was thus to carry out basic statistical/geospatial research with a strong applied component, to analyse and describe complex phenomena, and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. The thesis thereby responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in 5 scientific journals. National and international collaborations were also established and successfully accomplished.
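The global clustering measures named above can be illustrated with the simplest of them, the Morisita index, which compares quadrat counts against complete spatial randomness (index ≈ 1 for random patterns, > 1 for clustered, < 1 for regular). The sketch below uses synthetic points on a unit square; it is a toy illustration of the index, not the thesis' validity-domain-constrained version.

```python
import random

def morisita_index(points, n_cells, extent=1.0):
    """Morisita index of dispersion on a square region divided into
    n_cells x n_cells quadrats: Q * sum(n_i*(n_i-1)) / (N*(N-1))."""
    counts = {}
    for x, y in points:
        i = min(int(x / extent * n_cells), n_cells - 1)
        j = min(int(y / extent * n_cells), n_cells - 1)
        counts[(i, j)] = counts.get((i, j), 0) + 1
    q = n_cells * n_cells
    n = len(points)
    s = sum(c * (c - 1) for c in counts.values())
    return q * s / (n * (n - 1))

random.seed(42)
# Complete spatial randomness: index close to 1.
uniform = [(random.random(), random.random()) for _ in range(2000)]
# Clustered pattern: all points confined to one quarter of the region,
# so roughly 4x the expected co-occurrence within occupied quadrats.
clustered = [(random.random() / 2, random.random() / 2) for _ in range(2000)]
print(round(morisita_index(uniform, 8), 2))    # ~ 1.0
print(round(morisita_index(clustered, 8), 2))  # ~ 4.0
```

The index is scale-dependent: computing it over a range of `n_cells` values is what connects it to the box-counting and multifractal analyses also used in the thesis.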


Coating and filler pigments strongly influence the properties of paper: filler content can exceed 30%, and the pigment content of a coating is about 85-95 weight percent. The physical and chemical properties of pigments differ, and knowledge of these properties is important for optimising the optical and printing properties of paper. The size and shape of pigment particles can be measured by different analysers based on sedimentation, laser diffraction, changes in an electric field, etc. In this master's thesis, particle properties were investigated primarily by scanning electron microscopy (SEM) and image analysis programs. The research covered nine pigments with different particle sizes and shapes. The pigments were analysed by two image analysis programs (INCA Feature and Poikki), a Coulter LS230 (laser diffraction) and a SediGraph 5100 (sedimentation), and the results were compared to assess the effect of particle shape on the performance of the analysers. Only the image analysis programs provided parameters describing particle shape. One part of the research was also sample preparation for SEM: in an ideal sample, individual particles are separated and distinct. The methods gave different results, but the results from the image analysis programs corresponded to either sedimentation or laser diffraction depending on the particle shape. Detailed analysis of particle shape required high magnification in SEM, but the measured parameters described the shape of the particles very well. Large particles (ecd ~1 µm) could also be used in 3D modelling, which enabled measurement of particle thickness. Scanning electron microscopy and image analysis programs proved to be effective and versatile tools for particle analysis; further development and experience will determine the usability of the method in routine use.
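Two of the particle parameters the abstract relies on, the equivalent circular diameter (ecd) and a shape factor, are simple functions of the projected area and perimeter measured by an image analysis program. The sketch below shows standard definitions; the particle measurements are hypothetical stand-ins for segmented SEM data, and the thesis' programs may use additional descriptors.

```python
import math

def ecd(area):
    """Equivalent circular diameter: diameter of a circle with the
    same projected area as the particle."""
    return 2.0 * math.sqrt(area / math.pi)

def circularity(area, perimeter):
    """Shape factor 4*pi*A/P^2: 1.0 for a circle, well below 1 for
    elongated or platy particles."""
    return 4.0 * math.pi * area / (perimeter ** 2)

# Hypothetical measurements (in pixels; a known scale converts to µm).
circle_area, circle_perim = math.pi * 25.0, 2 * math.pi * 5.0  # r = 5
plate_area, plate_perim = 50.0, 42.0  # thin elongated plate

print(round(ecd(circle_area), 2))                        # 10.0
print(round(circularity(circle_area, circle_perim), 2))  # 1.0
print(round(circularity(plate_area, plate_perim), 2))    # ~0.36
```

Descriptors like these explain why image analysis can agree with sedimentation for platy particles but with laser diffraction for rounder ones: the sizing techniques weight shape differently.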


The agricultural sector has always been characterized by a predominance of small firms. International competition and the consequent need to restrain costs are permanent challenges for farms. This paper performs an empirical investigation of cost behavior in agriculture using panel data analysis. Our results show that transactions caused by complexity influence farm costs, with opposite effects on specific and indirect costs: while transactions allow economies of scale in specific costs, they significantly increase indirect costs. However, the main driver of farm costs is volume. In addition, important differences exist between small and big farms, since transactional variables significantly influence the former but not the latter. Since sophisticated management tools such as activity-based costing (ABC) could provide only limited complementary information and no essential allocation bases for farms, they seem inappropriate for small farms.



Over the last decades, calibration techniques have been widely used to improve the accuracy of robots and machine tools, since they involve only software modification instead of changes to the design and manufacture of the hardware. Traditionally, four steps are required for a calibration: error modeling, measurement, parameter identification and compensation. The objective of this thesis is to propose a method for the kinematics analysis and error modeling of a newly developed hybrid redundant robot, the IWR (Intersector Welding Robot), which possesses ten degrees of freedom (DOF): 6 DOF in parallel and an additional 4 DOF in series. The problems of kinematics modeling and error modeling of the proposed IWR robot are discussed. Based on the vector arithmetic method, the kinematics model and the sensitivity model of the end-effector with respect to the structural parameters are derived and analyzed. The relations between pose (position and orientation) accuracy and manufacturing tolerances, actuation errors and connection errors are formulated. Computer simulation is performed to examine the validity and effectiveness of the proposed method.
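The sensitivity model described above, how the end-effector pose varies with structural parameters, can be sketched on a toy mechanism. The example below uses a planar 2R arm (not the 10-DOF IWR model) and a numerical Jacobian with respect to the link lengths; the tolerance values are hypothetical. The first-order pose error is the Jacobian applied to the parameter errors, which is the core relation an error model formulates.

```python
import math

def forward_kinematics(params, q1, q2):
    """End-effector position of a planar 2R arm; params = (L1, L2)
    link lengths. A toy stand-in for the IWR kinematic model."""
    L1, L2 = params
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return (x, y)

def sensitivity(params, q1, q2, eps=1e-6):
    """Numerical Jacobian of the pose w.r.t. structural parameters:
    column k is d(pose)/d(param_k) by finite differences."""
    base = forward_kinematics(params, q1, q2)
    cols = []
    for k in range(len(params)):
        p = list(params)
        p[k] += eps
        pert = forward_kinematics(p, q1, q2)
        cols.append(tuple((a - b) / eps for a, b in zip(pert, base)))
    return cols

params = (0.5, 0.3)                 # nominal link lengths [m]
J = sensitivity(params, math.radians(30), math.radians(45))
dp = (0.001, -0.002)                # assumed manufacturing tolerances [m]
# First-order pose error: Jacobian columns weighted by parameter errors.
dx = sum(J[k][0] * dp[k] for k in range(2))
dy = sum(J[k][1] * dp[k] for k in range(2))
print(f"pose error ~ ({dx * 1000:.2f} mm, {dy * 1000:.2f} mm)")
```

For the real robot the parameter vector also covers actuation and connection errors, and the Jacobian is derived analytically from the vector-arithmetic model rather than by finite differences.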


Herein, we report the formation of organized mesoporous silica materials prepared from a novel nonionic gemini surfactant, a myristoyl end-capped Jeffamine, synthesized from a polyoxyalkyleneamine (ED900). The behavior of the modified Jeffamine in water was first investigated: a direct micellar phase (L1) and a hexagonal (H1) liquid crystal were found. The structure of the micelles was investigated by SAXS and analysis by Generalized Indirect Fourier Transformation (GIFT), which show that the particles are globular and of core-shell type. The myristoyl chains, located at the ends of the amphiphile molecule, assemble to form the core of the micelles and, as a consequence, the molecules are folded over on themselves. Mesoporous materials were then synthesized via the self-assembly mechanism. The recovered materials were characterized by SAXS measurements, nitrogen adsorption-desorption analysis, and transmission and scanning electron microscopy. The results clearly show that by modifying the synthesis parameters, such as the surfactant/silica precursor molar ratio and the hydrothermal conditions, one can control the size and nanostructuring of the resulting material. It was observed that the lower the temperature of the hydrothermal treatment, the better the mesopore ordering.


When laboratory intercomparison exercises are conducted, there is no a priori dependence of the concentration of a certain compound determined in one laboratory on that determined by another. The same applies when comparing different methodologies. An existing data set of total mercury readings in fish muscle samples from a Brazilian intercomparison exercise was used to show that correlation analysis is the most effective statistical tool for this kind of experiment. Problems associated with alternative statistical tools, such as comparison of means, the paired t-test and regression analysis, are discussed.
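The point about correlation analysis can be made concrete with a small sketch: Pearson correlation measures whether two laboratories' determinations track each other, independently of any constant bias that a comparison of means or a paired t-test would flag. The mercury readings below are hypothetical, not the Brazilian data set.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between paired determinations from two
    laboratories (or two analytical methods)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical total-mercury readings (mg/kg) for the same fish
# muscle samples; lab B shows a constant positive bias of 0.05.
lab_a = [0.12, 0.30, 0.45, 0.80, 1.10, 1.60]
lab_b = [a + 0.05 for a in lab_a]

# r = 1: the determinations agree perfectly up to a constant offset,
# which a mean or paired comparison would report as disagreement.
print(round(pearson_r(lab_a, lab_b), 3))  # 1.0
```

Conversely, two labs whose means happen to coincide can show a low r if their readings do not track sample by sample, which is exactly the failure mode the mean-comparison tools miss.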


Analyzing the state of the art in a given field in order to tackle a new problem is always a mandatory task. The literature provides surveys based on summaries of previous studies, which are often based on theoretical descriptions of the methods. An engineer, however, requires evidence from experimental evaluations in order to make an appropriate decision when selecting a technique for a problem. This is what we have done in this paper: experimentally analyzed a set of representative state-of-the-art techniques for the problem we are dealing with, namely the road passenger transportation problem. This is an optimization problem in which drivers must be assigned to transport services, fulfilling a set of constraints and minimizing a cost function. The experimental results have provided us with good knowledge of the properties of several methods, such as modeling expressiveness, anytime behavior, computational time, memory requirements, parameters, and freely downloadable tools. Based on this experience, we are able to choose a technique to solve our problem. We hope that this analysis is also helpful for other engineers facing a similar problem.
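The underlying optimization problem, assigning drivers to services under constraints while minimizing total cost, can be stated in a few lines. The sketch below solves a tiny instance exactly by enumeration; the cost matrix and feasibility constraints are invented for illustration, and the techniques surveyed in the paper (which scale to real instances) replace this brute force.

```python
from itertools import permutations

def assign_drivers(cost, feasible):
    """Exhaustively assign one driver per service, minimizing total
    cost subject to feasibility constraints (e.g. licences, rest
    times). cost[d][s] is the cost of driver d on service s;
    feasible[d][s] says whether the pairing is allowed. Exact but
    only viable for small instances."""
    n = len(cost)
    best, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        if not all(feasible[d][s] for d, s in enumerate(perm)):
            continue
        c = sum(cost[d][s] for d, s in enumerate(perm))
        if c < best_cost:
            best, best_cost = perm, c
    return best, best_cost

cost = [[4, 2, 8],
        [4, 3, 7],
        [3, 1, 6]]
feasible = [[True, True, True],
            [True, False, True],  # driver 1 cannot take service 1
            [True, True, True]]
assignment, total = assign_drivers(cost, feasible)
print(assignment, total)
```

Constraint programming, integer programming or local search formulations of the same model differ mainly in how they prune this exponential search space, which is one axis on which the paper compares them.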


IT outsourcing refers to the way companies focus on their core competencies and buy supporting functions from other companies specialized in that area. A service is the total outcome of numerous activities by employees and other resources aimed at providing solutions to customers' problems. Outsourcing and the service business each have unique characteristics. Service Level Agreements (SLAs) quantify the minimum acceptable service to the user; service quality must be objectively quantified so that its achievement or non-achievement can be monitored. Offshoring usually refers to transferring tasks to low-cost countries; it presents many challenges that require special attention and thorough assessment. IT infrastructure management refers to the installation and basic usability assistance of operating systems, networks, and server tools and utilities. ITIL defines industry best practices for organizing IT processes. This thesis analyses a server operations service and the customers' perception of the quality of daily operations. The agreed workflows and processes should be followed more consistently: the service provider's processes are thoroughly defined, but both the customer and the service provider may deviate from them. The service provider should review the workflows of customer-facing functions, which require persistent skill development, as they communicate quality to the customer. The service provider also needs to provide better organized communication and knowledge exchange between specialists in different geographical locations.
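The abstract's point that SLA targets must be objectively quantified before compliance can be monitored is easy to illustrate: once the target is a number, each period is mechanically "met" or "breached". The availability target and downtime figures below are hypothetical, not taken from the thesis.

```python
# Sketch: monitoring a quantified SLA target (monthly availability)
# against an agreed minimum. Figures are illustrative assumptions.

def availability(minutes_down, minutes_total=30 * 24 * 60):
    """Fraction of a 30-day month the service was available."""
    return 1.0 - minutes_down / minutes_total

SLA_TARGET = 0.995  # assumed minimum acceptable availability

for month, down in [("May", 120), ("June", 300)]:
    a = availability(down)
    status = "met" if a >= SLA_TARGET else "breached"
    print(f"{month}: {a:.4%} -> SLA {status}")
```

Without the agreed numeric target, the same downtime figures would only support subjective judgments about service quality, which is the gap the abstract describes.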


Electricity distribution network operation (NO) models are being challenged, as they are expected to continue to undergo changes over the coming decades in the fairly developed and regulated Nordic electricity market. Network asset managers must adapt to competitive techno-economic business models for the operation of increasingly intelligent distribution networks. Factors driving the changes towards new business models within network operation include increased investments in distributed automation (DA), regulatory frameworks for annual profit limits and quality through outage cost, increasing end-customer demands, climatic change, and the increasing use of data system tools such as the Distribution Management System (DMS). The doctoral thesis addresses the questions of a) whether conditions and qualifications for competitive markets exist within electricity distribution network operation and b) if so, the identification of limitations and required business mechanisms. The thesis aims to provide an analytical business framework, primarily for electric utilities, for evaluating and developing dedicated network operation models to meet future market dynamics within network operation. In the thesis, the generic build-up of a business model is addressed through the strategic business hierarchy levels of mission, vision and strategy, which define the strategic direction of the business, followed by the planning, management and process execution levels of enterprise strategy execution. Research questions within electricity distribution network operation are addressed at the specified hierarchy levels. The results represent interdisciplinary findings in the areas of electrical engineering and production economics.
The main scientific contributions include the further development of extended transaction cost economics (TCE) for governance decisions within electricity networks and validation of the usability of the methodology for the electricity distribution industry. Moreover, the DMS benefit evaluations in the thesis, based on outage cost calculations, propose theoretical maximum benefits of DMS applications equalling roughly 25% of the annual outage costs and 10% of the respective operative costs in the case-study electric utility. Hence, the annual measurable theoretical benefits from the use of DMS applications are considerable. The theoretical results of the thesis are broadly validated by surveys and questionnaires.
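The DMS benefit figure above is a simple combination of the two stated percentages; the sketch below just makes the arithmetic explicit. The euro cost figures are hypothetical placeholders, not the case utility's numbers.

```python
# Theoretical maximum annual DMS benefit per the thesis' proportions:
# ~25% of annual outage costs plus ~10% of operative costs.
annual_outage_cost = 2_000_000     # EUR/year, assumed
annual_operative_cost = 5_000_000  # EUR/year, assumed

dms_benefit = 0.25 * annual_outage_cost + 0.10 * annual_operative_cost
print(f"theoretical annual DMS benefit: {dms_benefit:,.0f} EUR")
```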


Vibration is a major problem in long-tool turning and milling, and current solutions for minimizing vibrations offered by different tool suppliers are very expensive. This Master's Thesis presents a new type of vibration-free machining tool produced by Konepaja ASTEX Gear Oy, with lower production costs than competitors' products. Vibration problems in machining and their current solutions are analyzed in this work, and the new vibration-damping invention is presented and described. Moreover, the production, laboratory experimental modal analysis and practical testing of the new vibration-free prototypes are reported and analyzed. Based on the test results, the new invention is deemed successful and approved for further study and development.


Systems biology is a new, emerging and rapidly developing multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology: comprehending the function of complex biological systems. Systems biology combines methods that originate from scientific disciplines such as molecular biology, chemistry, the engineering sciences, mathematics, computer science and systems theory. Unlike "traditional" biology, systems biology focuses on high-level concepts such as network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization and concurrency. The very terminology of systems biology is "foreign" to "traditional" biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods originating in computer science and mathematics for the construction and analysis of computational models in systems biology. In particular, the research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton. The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques and model analysis methodologies constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussions concerning their potentials and limitations point to the difficulties and challenges one encounters in the computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.
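The basic tool the abstract names, mathematical modelling of life processes, typically means mass-action ODE models integrated numerically. The sketch below is a toy misfolding/refolding scheme in that style (not the thesis' heat shock response model), integrated with explicit Euler; species and rate constants are illustrative assumptions. At steady state the two reaction rates balance, so the misfolded/folded ratio equals k_mis / (k_fold * C).

```python
# Toy mass-action ODE model, explicit Euler integration:
#   P -> M        at rate k_mis * P       (heat-induced misfolding)
#   M + C -> P+C  at rate k_fold * M * C  (chaperone-assisted refolding)
def simulate(k_mis=0.02, k_fold=0.5, dt=0.01, steps=10_000):
    P, M, C = 1.0, 0.0, 0.2   # folded, misfolded, chaperone levels
    for _ in range(steps):
        v_mis = k_mis * P
        v_fold = k_fold * M * C
        P += dt * (v_fold - v_mis)   # P + M is conserved exactly
        M += dt * (v_mis - v_fold)
    return P, M

P, M = simulate()
# Predicted steady state: M/P = k_mis / (k_fold * C) = 0.02 / 0.1 = 0.2
print(round(M / P, 3))
```

Model identifiability and refinement, two of the problems the thesis lists, show up even here: different (k_mis, k_fold * C) pairs with the same ratio produce the same steady state, so equilibrium data alone cannot identify both parameters.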