914 results for Using Lean tools


Relevance:

30.00%

Publisher:

Abstract:

Author identification is the problem of identifying the author of an anonymous text, or of a text whose authorship is in doubt, from a given set of candidate authors. Works by different authors can be strongly distinguished by quantifiable features of the text. This paper describes our attempts to identify the most likely author of a text in Malayalam from a list of authors. Malayalam is a Dravidian language of an agglutinative nature, and few successful tools have been developed to extract syntactic and semantic features from texts in this language. We carried out a detailed study of the various stylometric features that can be used to form an author's profile, and found that the frequencies of word collocations can clearly distinguish an author in a highly inflectional language such as Malayalam. In our work we extract the word-level and character-level features present in the text to characterise an author's style. Our first step was to create a profile for each of the candidate authors whose texts were available to us, first from word n-gram frequencies and then from variable-length character n-gram frequencies. The profiles formed for the set of authors under consideration were then compared with the features extracted from the anonymous text to suggest the most likely author.
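A rough sketch of the profile-based comparison described above, using character n-gram frequency profiles and a Kešelj-style dissimilarity; the distance measure, profile size and helper names are illustrative assumptions, not the authors' exact feature set:

from collections import Counter

def char_ngram_profile(text, n=3, top_k=500):
    """Normalised frequency profile of the top_k most frequent character n-grams."""
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    total = sum(grams.values()) or 1
    return {g: c / total for g, c in grams.most_common(top_k)}

def profile_dissimilarity(p1, p2):
    """Relative-difference dissimilarity over the union of n-grams in both profiles."""
    total = 0.0
    for g in set(p1) | set(p2):
        a, b = p1.get(g, 0.0), p2.get(g, 0.0)
        total += ((a - b) / ((a + b) / 2)) ** 2
    return total

def most_likely_author(candidate_texts, unknown_text, n=3):
    """Pick the candidate whose profile is closest to the profile of the unknown text."""
    unknown = char_ngram_profile(unknown_text, n)
    return min(candidate_texts, key=lambda name: profile_dissimilarity(
        char_ngram_profile(candidate_texts[name], n), unknown))

# Usage: most_likely_author({"A": text_a, "B": text_b}, disputed_text) returns "A" or "B".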

Relevance:

30.00%

Publisher:

Abstract:

Recently, research projects such as PADLR and SWAP have developed tools like Edutella or Bibster, which are targeted at establishing peer-to-peer knowledge management (P2PKM) systems. In such a system, it is necessary to provide brief semantic descriptions of peers, so that routing algorithms or matchmaking processes can make decisions about which communities peers should belong to, or to which peers a given query should be forwarded. This paper proposes the use of graph clustering techniques on knowledge bases for that purpose. Using this clustering, we show that our strategy requires up to 58% fewer queries than the baselines to yield full recall in a bibliographic P2PKM scenario.
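As a minimal illustration of how graph clustering over a peer's knowledge base can yield a brief semantic self-description (the toy data, the clustering routine and the "summarise by best-connected node" step are assumptions for illustration, not the method evaluated in the paper):

import networkx as nx
from networkx.algorithms import community

# Toy bibliographic knowledge base: edges link publications to the topics they mention.
kb_edges = [
    ("paper1", "p2p"), ("paper1", "routing"),
    ("paper2", "p2p"), ("paper2", "semantic web"),
    ("paper3", "ontologies"), ("paper3", "semantic web"),
    ("paper4", "clustering"), ("paper4", "ontologies"),
]
G = nx.Graph(kb_edges)

# Cluster the knowledge-base graph into topic groups.
clusters = community.greedy_modularity_communities(G)

# Summarise each cluster by its best-connected node; the resulting short list of
# terms is the kind of compact peer description a routing algorithm could use.
peer_description = [max(cluster, key=G.degree) for cluster in clusters]
print(peer_description)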

Relevance:

30.00%

Publisher:

Abstract:

In early stages of architectural design, as in other design domains, the language used is often very abstract. In architectural design, for example, architects and their clients use experiential terms such as "private" or "open" to describe spaces. If we are to build programs that can help designers during this early-stage design, we must give those programs the capability to deal with concepts on the level of such abstractions. The work reported in this thesis sought to do that, focusing on two key questions: How are abstract terms such as "private" and "open" translated into physical form? How might one build a tool to assist designers with this process? The Architect's Collaborator (TAC) was built to explore these issues. It is a design assistant that supports iterative design refinement, and that represents and reasons about how experiential qualities are manifested in physical form. Given a starting design and a set of design goals, TAC explores the space of possible designs in search of solutions that satisfy the goals. It employs a strategy we've called dependency-directed redesign: it evaluates a design with respect to a set of goals, then uses an explanation of the evaluation to guide proposal and refinement of repair suggestions; it then carries out the repair suggestions to create new designs. A series of experiments was run to study TAC's behavior. Issues of control structure, goal set size, goal order, and modification operator capabilities were explored. In addition, TAC's use as a design assistant was studied in an experiment using a house in the process of being redesigned. TAC's use as an analysis tool was studied in an experiment using Frank Lloyd Wright's Prairie houses.
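A schematic sketch of the dependency-directed redesign loop described above, in which the explanation of a failed evaluation selects which repair operators to try; the Failure record, the evaluate/repairs interfaces and the breadth-first control structure are simplifying assumptions, not TAC's actual implementation:

from dataclasses import dataclass

@dataclass
class Failure:
    goal: str         # which goal is unsatisfied
    explanation: str  # why it failed (used to pick a targeted repair)

def dependency_directed_redesign(design, goals, evaluate, repairs, max_iters=50):
    """Iterative refinement in the spirit of TAC's dependency-directed redesign.

    evaluate(design, goals) -> list[Failure]; an empty list means all goals hold.
    repairs: dict mapping a goal name to modification operators; each operator
    takes (design, failure) and returns a new candidate design.
    """
    frontier = [design]
    for _ in range(max_iters):
        next_frontier = []
        for candidate in frontier:
            failures = evaluate(candidate, goals)
            if not failures:
                return candidate                      # all goals satisfied
            for failure in failures:
                # Dependency-directed step: only operators indexed by the failed
                # goal are applied, guided by the explanation of that failure.
                for operator in repairs.get(failure.goal, []):
                    next_frontier.append(operator(candidate, failure))
        if not next_frontier:
            break                                     # no applicable repairs left
        frontier = next_frontier
    return None                                       # no satisfying design found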

Relevance:

30.00%

Publisher:

Abstract:

The Lean Transition of Emerging Industrial Capability (LeanTEC) program was a cooperative agreement between the Boeing Company and AFRL, conducted from January 1998 to January 2002. The results of this program are documented in the Manual for Effective Technology Transition Processes, included as an attachment to this report. This manual provides processes, procedures, and tools for greatly improving technology transition in the aerospace industry. A methodology for implementing these improvements is given, along with methods for customizing the various processes, procedures, and tools for a given company or business unit. The methodology was tested by the LeanTEC team and the results are documented in the report.

Relevance:

30.00%

Publisher:

Abstract:

Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and of the possible processes associated with compositional data sets, in many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop, through standard methodology such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests, but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained, together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
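For reference, the standard simplex-geometry definitions that the hypotheses and the staying-in-the-simplex approach above rely on (textbook Aitchison-geometry notation, not reproduced from the paper):

\[
  \mathcal{S}^D = \Big\{ (x_1,\dots,x_D) : x_i > 0,\ \textstyle\sum_{i=1}^{D} x_i = 1 \Big\}
  \quad \text{(the unit simplex, the natural sample space)}
\]
\[
  x \oplus p = \mathcal{C}(x_1 p_1,\ \dots,\ x_D p_D)
  \quad \text{(perturbation of a composition } x \text{ by } p \in \mathcal{S}^D\text{, with } \mathcal{C} \text{ the closure to unit sum)}
\]
\[
  \operatorname{clr}(x) = \Big( \ln\frac{x_1}{g(x)}, \dots, \ln\frac{x_D}{g(x)} \Big), \qquad
  g(x) = \Big( \prod_{i=1}^{D} x_i \Big)^{1/D}
\]

In these coordinates, the compositional singular value decomposition mentioned above amounts to an ordinary singular value decomposition of the centred clr-transformed data, which is what keeps the analysis "in the simplex".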

Relevance:

30.00%

Publisher:

Abstract:

This collection of videos shows you how to use a range of time-saving tools when writing a thesis in MS Word 2010/2013. See the full Support Guide at http://www.go.soton.ac.uk/thesispc. There are videos on using styles; creating tables of contents and tables of figures; using the Navigation Pane; using the Browse Object tool; and many more. There is an equivalent collection for Word 2011, for use on Apple computers.

Relevance:

30.00%

Publisher:

Abstract:

In a country that has faced an internal armed conflict for decades, where locations are geographically difficult to reach and control points are remote, it becomes necessary to create an optimal and effective distribution and storage system: a system in which response times are immediate, the quantities required exactly match the quantities shipped, and quality is optimal, so that officers, non-commissioned officers and marines can carry out their duties. During the presidency of Álvaro Uribe Vélez (2002-2010), joint commands were created, which led the armed forces to incorporate new personnel year after year and, in turn, required greater supplies of weapons, bases, quartermaster stores and food. This study focuses on the quartermaster supply of the Armada Nacional, analysing the storage warehouse, the distribution process and the optimisation of the physical resources of the Navy's Dirección de Abastecimiento. It also analyses the optimisation of warehouse space in order to increase inventory turnover, the control of issued equipment and distribution within the warehouse. Finally, an improvement plan for the supply chain is proposed, using tools such as cause-and-effect flowcharts, layout analysis and route diagrams.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to evaluate the performance of the Dimensional Clinical Personality Inventory (IDCP) using Rasch-based person and item analysis. A total of 1,281 participants were recruited, aged between 18 and 90 years (M = 26.64; SD = 8.94), including 431 men (33.6%) and 127 patients (9.9%) diagnosed with Axis I and/or Axis II disorders according to the DSM-IV-TR. The results indicate that the IDCP scales perform reasonably well, and the analyses presented demonstrate the applicability of the Rasch model in clinical settings. Among the important tools offered by the Rasch model, we explore the use of the person-item map, which visually presents the intuitively understandable psychological construct along the dimensional scale of the instrument.
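For readers unfamiliar with the person-item map, the underlying idea is that persons and items are located on a common logit scale; in the simplest (dichotomous) member of the Rasch family, the endorsement probability depends only on the difference between the person location and the item difficulty (a clinical inventory would typically use a polytomous variant of the same model):

\[
  P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}
\]

Because person locations \(\theta_n\) and item difficulties \(b_i\) live on the same scale, both can be plotted along a single axis, which is exactly what the person-item (Wright) map visualises.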

Relevance:

30.00%

Publisher:

Abstract:

Agriculture and industrialisation have caused a significant increase in the number of ammonium-rich environments. The presence of nitrogen compounds reduces water quality, causing toxicity problems, degrading the environment and even affecting human health. Consequently, nitrification has become a global process affecting the nitrogen cycle of the biosphere. Ammonia-oxidising bacteria (AOB) are responsible for the oxidation of ammonium to nitrite and play an essential role in the nitrogen cycle. The first ammonia oxidisers were isolated at the end of the 19th century, but their slow growth and the difficulty of culturing them meant that a thorough understanding of this bacterial group was not achieved until the 1980s, with the first studies based on the 16S rDNA gene. Databases now contain a multitude of entries with sequences corresponding to AOB. The aim of this work was to find, develop and evaluate useful and reliable tools for the study of AOB in environmental samples. In this work we first describe the use of fluorescence in situ hybridisation (FISH), applying probes targeting the 16S rRNA of AOB. FISH allowed us to detect and count this bacterial group; however, this method could not detect new sequences, so a new tool was needed. To that end we used the sequence of the Nso1225 probe in a PCR. Specifically amplifying a fragment of the AOB 16S rDNA provided a new molecular tool for detecting the presence and diversity of these bacteria in natural environments. Nevertheless, some sequences belonging to non-ammonia-oxidising bacteria of the β-subgroup of the Proteobacteria were also obtained with this technique. Moreover, one drawback of using 16S rDNA as a marker is the impossibility of simultaneously detecting AOB belonging to the β- and γ-subgroups of the Proteobacteria. The amoA gene, which encodes subunit A of the enzyme ammonia monooxygenase (AMO), was by then widely used as a marker for the detection of AOB. In this work we also describe the use of this marker on samples from an SBR reactor. The marker allowed us to identify AOB sequences in the sample, but the need to detect amoA through cloning makes it too time-consuming to use as a tool in microbial ecology studies involving many samples. Furthermore, some authors have reported obtaining non-AOB sequences when using amoA in a PCR-DGGE protocol. In order to obtain a fast and rigorous tool for detecting and identifying AOB, we developed a new primer set targeting the amoB gene, which encodes the transmembrane subunit of the AMO enzyme. This gene has proved to be a good molecular marker for AOB, offering high specificity, sensitivity and reliability regardless of phylogenetic affiliation. In this work we also present an RT-PCR assay based on detection of the amoB gene for quantification of the genus Nitrosococcus. The newly designed primer set allows a highly specific and sensitive enumeration of all known γ-Nitrosococcus. Finally, we carried out a polygenic study comparing and evaluating the amoA, amoB and 16S rDNA markers, and constructed a combined phylogenetic tree. As a result, we conclude that amoB is a suitable marker for the detection and identification of AOB in environmental samples, while also providing consistent groupings when making phylogenetic inferences. On the other hand, the full-length 16S rDNA gene sequence is recommended as a marker for studies with taxonomic and phylogenetic purposes when working with pure cultures of AOB.

Relevance:

30.00%

Publisher:

Abstract:

The proposal presented in this thesis is to provide designers of knowledge-based supervisory systems for dynamic systems with a framework that facilitates their tasks, avoiding interface problems between tools, data flow and management. The approach is intended to be useful to both control and process engineers in assisting their tasks. The use of AI technologies to diagnose and perform control loops and, of course, to assist process supervisory tasks such as fault detection and diagnosis, is within the scope of this work. Special effort has been put into the integration of tools for assisting the design of expert supervisory systems. With this aim, the experience of Computer Aided Control Systems Design (CACSD) frameworks has been analysed and used to design a Computer Aided Supervisory Systems Design (CASSD) framework. In this sense, some basic facilities are required to be available in this proposed framework.

Relevance:

30.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al. 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine; this is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid Service.
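To make step (3) concrete, the sketch below shows the kind of one-line substitution a workflow script would need; only the command names "mpirun" and "GRexRun" come from the abstract, while the executable name, process count and service URL are hypothetical placeholders:

import subprocess

# Launch command as it might appear in the original (local) workflow script:
local_launch = ["mpirun", "-np", "40", "./nemo_model"]          # hypothetical example

# The same step after grid-enabling the script: the local launcher is swapped for
# the G-Rex client.  The service URL and argument layout are assumptions made for
# illustration; the real GRexRun client options are not documented in the text above.
grid_launch = ["GRexRun", "http://cluster.example.org/grex/nemo",
               "-np", "40", "./nemo_model"]

subprocess.run(grid_launch, check=True)   # output streams back to the user during the run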

Relevance:

30.00%

Publisher:

Abstract:

The research record on the quantification of sediment transport processes in periglacial mountain environments in Scandinavia dates back to the 1950s. A wide range of measurements is available, especially from the Karkevagge region of northern Sweden. Within this paper, satellite image analysis and tools provided by geographic information systems (GIS) are exploited in order to extend and improve this research and to complement geophysical methods. The processes of interest include mass movements such as solifluction, slope wash, dirty avalanches, and rock- and boulder-falls. Geomorphic process units have been derived in order to allow quantification via GIS techniques at a catchment scale. Mass movement rates based on existing field measurements are employed in the budget calculation. In the Karkevagge catchment, 80% of the area can be identified either as a source area for sediments or as a zone where sediments are deposited. The overall budget for the slopes beneath the rockwalls in the Karkevagge is approximately 680 t a(-1), whilst about 150 t a(-1) are transported into the fluvial system.

Relevance:

30.00%

Publisher:

Abstract:

RothC and Century are two of the most widely used soil organic matter (SOM) models. However, there are few examples of specific parameterisation of these models for environmental conditions in East Africa. The aim of this study was, therefore, to evaluate the ability of RothC and Century to estimate changes in soil organic carbon (SOC) resulting from varying land use/management practices under the climate and soil conditions found in Kenya. The study used climate, soils and crop data from a long-term experiment (1976-2001) carried out at the Kabete site at the Kenya National Agricultural Research Laboratories (NARL, located in a semi-humid region) and data from a 13-year experiment carried out in Machang'a (Embu District, located in a semi-arid region). The NARL experiment included various fertiliser (0, 60 and 120 kg of N and P2O5 ha(-1)), farmyard manure (FYM - 5 and 10 t ha(-1)) and plant residue treatments, in a variety of combinations. The Machang'a experiment involved a fertiliser (51 kg N ha(-1)) and a FYM (0, 5 and 10 t ha(-1)) treatment with both monocropping and intercropping. At Kabete both models showed a fair to good fit to measured data, although Century simulations for treatments with high levels of FYM were better than those without. At the Machang'a site with monocrops, both models showed a fair to good fit to measured data for all treatments. However, the fit of both models (especially RothC) to measured data for intercropping treatments at Machang'a was much poorer. Further model development for intercrop systems is recommended. Both models can be useful tools in soil C predictions, provided time series of measured soil C and crop production data are available for validating model performance against local or regional agricultural crops. (C) 2007 Elsevier B.V. All rights reserved.
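Both RothC and Century belong to the same family of multi-pool, first-order decomposition models, which is why site-specific evaluation of their rate modifiers is worthwhile; in generic form (not either model's exact parameterisation):

\[
  \frac{dC_i}{dt} = -\,k_i \, f(T)\, g(W)\, C_i ,
\]

where \(C_i\) is the carbon stock of pool \(i\), \(k_i\) its intrinsic decomposition rate constant, and \(f(T)\) and \(g(W)\) are dimensionless temperature and moisture rate modifiers derived from the climate data.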

Relevance:

30.00%

Publisher:

Abstract:

Indicators are commonly recommended as tools for assessing the attainment of development, and the current vogue is for aggregating a number of indicators together into a single index. It is claimed that such indices of development help achieve maximum impact in policy terms by appealing to those who may not necessarily have technical expertise in data collection, analysis and interpretation. In order to help counter criticisms of over-simplification, those advocating such indices also suggest that the raw data be provided so as to allow disaggregation into component parts and hence facilitate a more subtle interpretation if a reader so desires. This paper examines the problems involved in interpreting indices of development by focusing on the United Nations Development Programme's (UNDP) Human Development Index (HDI), published each year in the Human Development Reports (HDRs). The HDI was intended to provide an alternative to the more economically based indices, such as GDP, commonly used within neo-liberal development agendas. The paper explores the use of the HDI as a gauge of human development by making comparisons between two major political and economic communities in Africa (ECOWAS and SADC). While the HDI did help highlight important changes in human development over the 10-year period, it is concluded that the HDI and its components are difficult to interpret, as methodologies have changed significantly and the 'averaging' nature of the HDI could hide information unless care is taken. The paper discusses the applicability of alternative models to the HDI, such as the more neo-populist-centred methods commonly advocated for indicators of sustainable development. (C) 2003 Elsevier Ltd. All rights reserved.
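For context, the HDI as computed in the Human Development Reports of the period reviewed was an unweighted average of three normalised dimension indices (the aggregation was later changed to a geometric mean in 2010):

\[
  \mathrm{HDI} = \tfrac{1}{3}\big( I_{\text{life expectancy}} + I_{\text{education}} + I_{\text{GDP}} \big),
  \qquad
  I = \frac{\text{actual value} - \text{minimum value}}{\text{maximum value} - \text{minimum value}}
\]

The 'averaging' problem discussed above is visible in this form: a decline in one component can be offset by a rise in another without any change in the overall index.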

Relevance:

30.00%

Publisher:

Abstract:

Mediterranean ecosystems rival tropical ecosystems in terms of plant biodiversity. The Mediterranean Basin (MB) itself hosts 25,000 plant species, half of which are endemic. This rich biodiversity and the complex biogeographical and political issues make conservation a difficult task in the region. Species, habitat, ecosystem and landscape approaches have been used to identify conservation targets at various scales: European, national, regional and local. Conservation decisions require adequate information at the species, community and habitat level. Nevertheless, and despite recent efforts, this information is still incomplete, fragmented and varies from one country to another. This paper reviews the biogeographic data, the problems arising from current conservation efforts, and methods for conservation assessment and prioritization using GIS. GIS has an important role to play in managing spatial and attribute information on the ecosystems of the MB and in facilitating interaction with existing databases. Where limited information is available, it can be used for prediction when directly or indirectly linked to externally built models. As well as being a predictive tool, today's GIS incorporates spatial techniques that can improve the level of information, such as fuzzy logic and geostatistics, or provide insight into landscape changes, such as 3D visualization. Where resources are limited, it can assist in identifying sites of conservation priority or in the resolution of environmental conflicts (scenario building). Although not a panacea, GIS is an invaluable tool for improving the understanding of Mediterranean ecosystems and their dynamics and for practical management in a region that is under increasing pressure from human impact.